Sample records for reliable quantitative analysis

  1. Inter-rater reliability of motor unit number estimates and quantitative motor unit analysis in the tibialis anterior muscle.

    PubMed

    Boe, S G; Dalton, B H; Harwood, B; Doherty, T J; Rice, C L

    2009-05-01

    To establish the inter-rater reliability of decomposition-based quantitative electromyography (DQEMG) derived motor unit number estimates (MUNEs) and quantitative motor unit (MU) analysis. Using DQEMG, two examiners independently obtained a sample of needle and surface-detected motor unit potentials (MUPs) from the tibialis anterior muscle of 10 subjects. Coupled with a maximal M wave, surface-detected MUPs were used to derive a MUNE for each subject and each examiner. Additionally, size-related parameters of the individual MUs were obtained following quantitative MUP analysis. Test-retest MUNE values were similar, with high reliability observed between examiners (ICC=0.87). Additionally, MUNE variability from test-retest as quantified by a 95% confidence interval was relatively low (+/-28 MUs). Lastly, quantitative data pertaining to MU size, complexity and firing rate were similar between examiners. MUNEs and quantitative MU data can be obtained with high reliability by two independent examiners using DQEMG. Establishing the inter-rater reliability of MUNEs and quantitative MU analysis using DQEMG is central to the clinical applicability of the technique. In addition to assessing response to treatments over time, multiple clinicians may be involved in the longitudinal assessment of the MU pool of individuals with disorders of the central or peripheral nervous system.
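
    A minimal sketch of the agreement statistic reported above, in Python (not the authors' code; the two-examiner MUNE values below are hypothetical):

      import numpy as np

      def icc_2_1(scores):
          """ICC(2,1): two-way random effects, absolute agreement, single rater.

          scores: array of shape (n_subjects, k_raters).
          """
          n, k = scores.shape
          grand = scores.mean()
          ss_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum()
          ss_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum()
          ss_err = ((scores - grand) ** 2).sum() - ss_rows - ss_cols
          ms_rows = ss_rows / (n - 1)
          ms_cols = ss_cols / (k - 1)
          ms_err = ss_err / ((n - 1) * (k - 1))
          return (ms_rows - ms_err) / (
              ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

      # Hypothetical MUNEs from two examiners for the same five subjects.
      munes = np.array([[120, 131], [98, 102], [210, 188], [152, 160], [87, 95]])
      print(round(icc_2_1(munes), 2))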

  2. 76 FR 28819 - NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ... NUCLEAR REGULATORY COMMISSION [NRC-2011-0109] NUREG/CR-XXXX, Development of Quantitative Software..., ``Development of Quantitative Software Reliability Models for Digital Protection Systems of Nuclear Power Plants... of Risk Analysis, Office of Nuclear Regulatory Research, U.S. Nuclear Regulatory Commission...

  3. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
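
    The effect of coverage is easy to see in a minimal sketch (a simplified two-unit model with per-unit failure rate lambda and coverage c, not the paper's digraph model): an uncovered first fault fails the system immediately, giving R(t) = exp(-2*lambda*t) + 2c*(exp(-lambda*t) - exp(-2*lambda*t)).

      import numpy as np

      def duplex_reliability(lam, c, t):
          """Reliability of a two-unit parallel system with coverage c:
          the first fault is detected and isolated with probability c;
          an uncovered fault fails the system immediately."""
          both_up = np.exp(-2 * lam * t)
          one_up = 2 * c * (np.exp(-lam * t) - np.exp(-2 * lam * t))
          return both_up + one_up

      # Small coverage shortfalls visibly inflate the probability of failure.
      for c in (1.0, 0.99, 0.90):
          print(c, duplex_reliability(lam=1e-4, c=c, t=1000.0))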

  4. Reliability and safety, and the risk of construction damage in mining areas

    NASA Astrophysics Data System (ADS)

    Skrzypczak, Izabela; Kogut, Janusz P.; Kokoszka, Wanda; Oleniacz, Grzegorz

    2018-04-01

    This article concerns the reliability and safety of building structures in mining areas, with particular emphasis on the quantitative risk analysis of buildings. The issues of threat assessment and risk estimation in the design of facilities in mining exploitation areas are presented, indicating the difficulties and ambiguities associated with their quantification and quantitative analysis. This article presents the concept of quantitative risk assessment of the impact of mining exploitation, in accordance with ISO 13824 [1]. The risk analysis is illustrated through the example of a building located within an area affected by mining exploitation.

  5. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
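
    The handbook's own tools are not reproduced here, but the Goel-Okumoto model is a common example of the kind of reliability estimation described, with expected cumulative failures m(t) = a*(1 - exp(-b*t)); the failure counts below are hypothetical:

      import numpy as np
      from scipy.optimize import curve_fit

      def m(t, a, b):
          """Goel-Okumoto mean value function: expected cumulative failures by time t."""
          return a * (1.0 - np.exp(-b * t))

      # Hypothetical cumulative failure counts logged at the end of each test week.
      weeks = np.arange(1, 11, dtype=float)
      failures = np.array([12, 21, 28, 33, 37, 40, 42, 44, 45, 46], dtype=float)

      (a, b), _ = curve_fit(m, weeks, failures, p0=(50.0, 0.3))

      # Probability of failure-free operation for one more week after week 10.
      r = np.exp(-(m(11.0, a, b) - m(10.0, a, b)))
      print(f"expected total faults a={a:.1f}, detection rate b={b:.3f}, R={r:.3f}")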

  6. Smile line assessment comparing quantitative measurement and visual estimation.

    PubMed

    Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie

    2011-02-01

    Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
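
    The kappa statistic referred to above is straightforward to compute; a minimal Python sketch with hypothetical 3-grade ratings from two raters (not the study's data):

      from collections import Counter

      def cohens_kappa(rater_a, rater_b):
          """Cohen's kappa: chance-corrected agreement between two raters."""
          n = len(rater_a)
          observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
          ca, cb = Counter(rater_a), Counter(rater_b)
          expected = sum(ca[c] * cb[c] for c in ca) / n ** 2
          return (observed - expected) / (1.0 - expected)

      a = ["low", "avg", "high", "avg", "avg", "high", "low", "avg"]
      b = ["low", "avg", "high", "avg", "high", "high", "low", "low"]
      print(round(cohens_kappa(a, b), 2))  # 0.64 for these made-up ratings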

  7. Psychometric Inferences from a Meta-Analysis of Reliability and Internal Consistency Coefficients

    ERIC Educational Resources Information Center

    Botella, Juan; Suero, Manuel; Gambara, Hilda

    2010-01-01

    A meta-analysis of the reliability of the scores from a specific test, also called reliability generalization, allows the quantitative synthesis of its properties from a set of studies. It is usually assumed that part of the variation in the reliability coefficients is due to some unknown and implicit mechanism that restricts and biases the…

  8. Identification and evaluation of reliable reference genes for quantitative real-time PCR analysis in tea plant (Camellia sinensis (L.) O. Kuntze)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...

  9. Constellation Ground Systems Launch Availability Analysis: Enhancing Highly Reliable Launch Systems Design

    NASA Technical Reports Server (NTRS)

    Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.

    2010-01-01

    Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, in a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used for legacy subsystems or comparative components that will support Constellation. The results of the verification analysis will be used to verify compliance with requirements and to highlight design or performance shortcomings for further decision-making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability and maintainability analysis, and present findings and observations based on analysis leading to the Ground Systems Preliminary Design Review milestone.

  10. The influence of biological and technical factors on quantitative analysis of amyloid PET: Points to consider and recommendations for controlling variability in longitudinal data.

    PubMed

    Schmidt, Mark E; Chiao, Ping; Klein, Gregory; Matthews, Dawn; Thurfjell, Lennart; Cole, Patricia E; Margolin, Richard; Landau, Susan; Foster, Norman L; Mason, N Scott; De Santi, Susan; Suhy, Joyce; Koeppe, Robert A; Jagust, William

    2015-09-01

    In vivo imaging of amyloid burden with positron emission tomography (PET) provides a means for studying the pathophysiology of Alzheimer's and related diseases. Measurement of subtle changes in amyloid burden requires quantitative analysis of image data. Reliable quantitative analysis of amyloid PET scans acquired at multiple sites and over time requires rigorous standardization of acquisition protocols, subject management, tracer administration, image quality control, and image processing and analysis methods. We review critical points in the acquisition and analysis of amyloid PET, identify ways in which technical factors can contribute to measurement variability, and suggest methods for mitigating these sources of noise. Improved quantitative accuracy could reduce the sample size necessary to detect intervention effects when amyloid PET is used as a treatment end point and allow more reliable interpretation of change in amyloid burden and its relationship to clinical course. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  11. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based upon reliability theory was introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. Risk events, including both cause and effect events, were derived in the framework as nodes with a Bayesian network analysis approach. The framework thus transforms the risk analysis results from failure mode and effect analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNIe and AgenaRisk, we are able to find the nodes that are most critical to system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce their influences and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a user case, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve the product quality.
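
    A toy illustration of the node-importance idea (all probabilities hypothetical; the study used GeNIe and AgenaRisk on a much larger network): a small two-cause risk network evaluated by direct enumeration.

      from itertools import product

      # Hypothetical fragment: two cause nodes feeding one effect node.
      p_cause = {"contamination": 0.02, "line_fault": 0.05}
      p_fail_given = {  # P(batch failure | contamination, line_fault)
          (True, True): 0.95, (True, False): 0.60,
          (False, True): 0.30, (False, False): 0.01,
      }

      def p_failure(p_cause, cpt):
          total = 0.0
          for c, f in product((True, False), repeat=2):
              w = (p_cause["contamination"] if c else 1 - p_cause["contamination"])
              w *= (p_cause["line_fault"] if f else 1 - p_cause["line_fault"])
              total += w * cpt[(c, f)]
          return total

      base = p_failure(p_cause, p_fail_given)
      # Importance of a node ~ risk reduction when that cause is eliminated.
      for node in p_cause:
          print(node, base - p_failure(dict(p_cause, **{node: 0.0}), p_fail_given))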

  12. Constellation Ground Systems Launch Availability Analysis: Enhancing Highly Reliable Launch Systems Design

    NASA Technical Reports Server (NTRS)

    Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.

    2010-01-01

    Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, within a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used to calculate failure rates for legacy subsystems or comparative components that will support Constellation. The results of the verification analysis will be used to assess compliance with requirements and to highlight design or performance shortcomings for further decision making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and present findings and observations based on analysis leading to the Ground Operations Project Preliminary Design Review milestone.

  13. A Meta-Analysis of Reliability Coefficients in Second Language Research

    ERIC Educational Resources Information Center

    Plonsky, Luke; Derrick, Deirdre J.

    2016-01-01

    Ensuring internal validity in quantitative research requires, among other conditions, reliable instrumentation. Unfortunately, however, second language (L2) researchers often fail to report and even more often fail to interpret reliability estimates beyond generic benchmarks for acceptability. As a means to guide interpretations of such estimates,…

  14. Evaluation of patients with painful total hip arthroplasty using combined single photon emission tomography and conventional computerized tomography (SPECT/CT) - a comparison of semi-quantitative versus 3D volumetric quantitative measurements.

    PubMed

    Barthassat, Emilienne; Afifi, Faik; Konala, Praveen; Rasch, Helmut; Hirschmann, Michael T

    2017-05-08

    It was the primary purpose of our study to evaluate the inter- and intra-observer reliability of a standardized SPECT/CT algorithm for evaluating patients with painful primary total hip arthroplasty (THA). The secondary purpose was a comparison of semi-quantitative and 3D volumetric quantification methods for assessment of bone tracer uptake (BTU) in those patients. A novel SPECT/CT localization scheme consisting of 14 femoral and 4 acetabular regions on standardized axial and coronal slices was introduced and evaluated in terms of inter- and intra-observer reliability in 37 consecutive patients with hip pain after THA. BTU for each anatomical region was assessed semi-quantitatively using a color-coded Likert-type scale (0-10) and volumetrically quantified using validated software. Two observers interpreted the SPECT/CT findings in all patients twice, with a six-week interval between interpretations, in random order. Semi-quantitative and quantitative measurements were compared in terms of reliability. In addition, the values were correlated using Pearson's correlation. A factorial cluster analysis of BTU was performed to identify clinically relevant regions that should be grouped and analysed together. The localization scheme showed high inter- and intra-observer reliability for all femoral and acetabular regions independent of the measurement method used (semi-quantitative versus 3D volumetric quantitative measurements). A high to moderate correlation between the two measurement methods was shown for the distal femur, the proximal femur and the acetabular cup. The factorial cluster analysis showed that the anatomical regions might be grouped into three distinct regions: the proximal femur, the distal femur and the acetabular cup. The SPECT/CT algorithm for assessment of patients with pain after THA is highly reliable independent of the measurement method used. Three clinically relevant anatomical regions (proximal femoral, distal femoral, acetabular) were identified.

  15. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    PubMed

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

    Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, they are more challenging, as they involve almost no sample preparation and are more susceptible to ion suppression/enhancement. Herein, based on our previously developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilized a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) from animals administered CAT were then analyzed. The quantitative MSI results were cross-validated against LC-MS/MS data from the same tissues. Their consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. [Development and application of morphological analysis method in Aspergillus niger fermentation].

    PubMed

    Tang, Wenjun; Xia, Jianye; Chu, Ju; Zhuang, Yingping; Zhang, Siliang

    2015-02-01

    Filamentous fungi are widely used in industrial fermentation, and particular fungal morphology acts as a critical index for a successful fermentation. To break the bottleneck of morphological analysis, we have developed a reliable method for fungal morphological analysis. With this method, we can prepare hundreds of pellet samples simultaneously and quickly obtain quantitative morphological information at large scale. The method greatly increases the accuracy and reliability of morphological analysis results. On this basis, studies of Aspergillus niger morphology under different oxygen supply and shear rate conditions were carried out. As a result, the response patterns of A. niger morphology to these conditions were quantitatively demonstrated, laying a solid foundation for further scale-up.

  17. Reliability on intra-laboratory and inter-laboratory data of hair mineral analysis comparing with blood analysis.

    PubMed

    Namkoong, Sun; Hong, Seung Phil; Kim, Myung Hwa; Park, Byung Cheol

    2013-02-01

    Nowadays, although its clinical value remains controversial, institutions utilize hair mineral analysis. Arguments about the reliability of hair mineral analysis persist, and there have been evaluations of the commercial laboratories performing it. The objective of this study was to assess the reliability of intra-laboratory and inter-laboratory data at three commercial laboratories conducting hair mineral analysis, compared to serum mineral analysis. Two divided hair samples taken from near the scalp of one healthy volunteer were submitted for analysis to all laboratories at the same time. Each laboratory sent a report consisting of quantitative results and their interpretation of health implications. Differences among intra-laboratory and inter-laboratory data were analyzed using SPSS version 12.0 (SPSS Inc., USA). All the laboratories used identical methods for quantitative analysis, and they generated consistent numerical results according to Friedman analysis of variance. However, the normal reference ranges of each laboratory varied, and as a result each laboratory interpreted the patient's health differently. On intra-laboratory data, Wilcoxon analysis suggested the laboratories generated relatively coherent data, but laboratory B did not for one element, so its reliability was doubtful. In comparison with the blood test, laboratory C generated identical results, but laboratories A and B did not. Hair mineral analysis thus has limitations in inter-laboratory and intra-laboratory reliability compared with blood analysis, and clinicians should be cautious when applying it as an ancillary tool. Each laboratory included in this study requires continuous refinement to establish standardized normal reference ranges.

  18. Integrated Approach To Design And Analysis Of Systems

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Iverson, David L.

    1993-01-01

    Object-oriented fault-tree representation unifies evaluation of reliability and diagnosis of faults. Programming/fault tree described more fully in "Object-Oriented Algorithm For Evaluation Of Fault Trees" (ARC-12731). Augmented fault tree object contains more information than fault tree object used in quantitative analysis of reliability. Additional information needed to diagnose faults in system represented by fault tree.

  19. Soft error evaluation and vulnerability analysis in Xilinx Zynq-7010 system-on-chip

    NASA Astrophysics Data System (ADS)

    Du, Xuecheng; He, Chaohui; Liu, Shuhuan; Zhang, Yao; Li, Yonghong; Xiong, Ceng; Tan, Pengkang

    2016-09-01

    Radiation-induced soft errors are an increasingly important threat to the reliability of modern electronic systems. In order to evaluate soft errors and the reliability of a system-on-chip, the fault tree analysis method was used in this work. The system fault tree was constructed for the Xilinx Zynq-7010 All Programmable SoC. Moreover, the soft error rates of different components in the Zynq-7010 SoC were measured using an americium-241 alpha radiation source. Furthermore, parameters used to evaluate the system's reliability and safety, such as failure rate, unavailability and mean time to failure (MTTF), were calculated using Isograph Reliability Workbench 11.0. Based on the fault tree analysis of the system-on-chip, the critical blocks and the overall system reliability were evaluated through qualitative and quantitative analysis.
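
    The gate arithmetic behind such a quantification is compact enough to sketch (component failure rates below are hypothetical; the study used Isograph Reliability Workbench): for a non-repairable component with constant failure rate lambda, unavailability at time t is 1 - exp(-lambda*t) and MTTF is 1/lambda, while AND and OR gates combine input probabilities multiplicatively.

      import math

      def p_or(probs):   # OR gate: top event occurs if any input occurs
          q = 1.0
          for p in probs:
              q *= 1.0 - p
          return 1.0 - q

      def p_and(probs):  # AND gate: top event occurs only if all inputs occur
          q = 1.0
          for p in probs:
              q *= p
          return q

      t = 1000.0  # hours
      unavail = lambda lam: 1.0 - math.exp(-lam * t)  # non-repairable; MTTF = 1/lam

      # Hypothetical SoC tree: processor OR memory OR (both redundant power rails).
      top = p_or([unavail(2e-6), unavail(5e-6),
                  p_and([unavail(1e-5), unavail(1e-5)])])
      print(f"P(system failure within {t:.0f} h) = {top:.2e}")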

  20. Reliability Impacts in Life Support Architecture and Technology Selection

    NASA Technical Reports Server (NTRS)

    Lange, Kevin E.; Anderson, Molly S.

    2012-01-01

    Quantitative assessments of system reliability and equivalent system mass (ESM) were made for different life support architectures based primarily on International Space Station technologies. The analysis was applied to a one-year deep-space mission. System reliability was increased by adding redundancy and spares, which added to the ESM. Results were thus obtained allowing a comparison of the ESM for each architecture at equivalent levels of reliability. Although the analysis contains numerous simplifications and uncertainties, the results suggest that achieving necessary reliabilities for deep-space missions will add substantially to the life support ESM and could influence the optimal degree of life support closure. Approaches for reducing reliability impacts were investigated and are discussed.
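
    A minimal sketch of the spares-versus-reliability trade described above (simplified assumptions: one active unit, cold spares, instant switchover, constant failure rate; not the study's model):

      import math

      def reliability_with_spares(lam, t, spares):
          """One active unit plus cold spares with instant switchover:
          failures over time t are Poisson(lam * t); the system survives
          as long as no more than `spares` replacements are needed."""
          x = lam * t
          return math.exp(-x) * sum(x ** n / math.factorial(n)
                                    for n in range(spares + 1))

      lam = 1.0 / 8760.0   # one failure per year, on average
      t = 8760.0           # one-year deep-space mission
      for k in range(4):   # each extra spare adds mass but raises reliability
          print(k, round(reliability_with_spares(lam, t, k), 4))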

  21. Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome.

    PubMed

    Lee, Hyewon; Jee, Sungju; Park, Soo Ho; Ahn, Seung-Chan; Im, Juneho; Sohn, Min Kyun

    2016-12-01

    To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US.

  22. Objectivity and reliability in qualitative analysis: realist, contextualist and radical constructionist epistemologies.

    PubMed

    Madill, A; Jordan, A; Shirley, C

    2000-02-01

    The effect of the individual analyst on research findings can create a credibility problem for qualitative approaches from the perspective of evaluative criteria utilized in quantitative psychology. This paper explicates the ways in which objectivity and reliability are understood in qualitative analysis conducted from within three distinct epistemological frameworks: realism, contextual constructionism, and radical constructionism. It is argued that quality criteria utilized in quantitative psychology are appropriate to the evaluation of qualitative analysis only to the extent that it is conducted within a naive or scientific realist framework. The discussion is illustrated with reference to the comparison of two independent grounded theory analyses of identical material. An implication of this illustration is to identify the potential to develop a radical constructionist strand of grounded theory.

  23. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

    This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion systems mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  24. Targeted methods for quantitative analysis of protein glycosylation

    PubMed Central

    Goldman, Radoslav; Sanda, Miloslav

    2018-01-01

    Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218

  25. Tannin structural elucidation and quantitative ³¹P NMR analysis. 2. Hydrolyzable tannins and proanthocyanidins.

    PubMed

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    An unprecedented analytical method that allows simultaneous structural and quantitative characterization of all functional groups present in tannins is reported. In situ labeling of all labile H groups (aliphatic and phenolic hydroxyls and carboxylic acids) with a phosphorus-containing reagent (Cl-TMDP) followed by quantitative ³¹P NMR acquisition constitutes a novel fast and reliable analytical tool for the analysis of tannins and proanthocyanidins with significant implications for the fields of food and feed analyses, tannery, and the development of natural polyphenolics containing products.

  26. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  27. [The application of stereology in radiology imaging and cell biology fields].

    PubMed

    Hu, Na; Wang, Yan; Feng, Yuanming; Lin, Wang

    2012-08-01

    Stereology is an interdisciplinary method for 3D morphological study developed from mathematics and morphology, and it is widely used in medical image analysis and cell biology studies. Because it is unbiased, simple, fast, reliable and non-invasive, stereology has been widely applied in biomedical areas for quantitative analysis and statistics, such as histology, pathology and medical imaging. Because stereological parameters differ distinctly across pathologies, in recent years many scholars have used stereological methods for quantitative analysis in their studies, for example of the condition of cancer cells, tumor grade, disease development and patient prognosis. This paper describes the stereological concept and estimation methods, illustrates the applications of stereology in the fields of CT images, MRI images and cell biology, and finally reflects on the universality, superiority and reliability of stereology.

  28. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision making process. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses several reliability engineering applications used by NASA in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of these case studies are reliability-based life limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbo-pump development, the impact of ET foam reliability on the Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.

  29. Automated Quantitative Analysis of Retinal Microvasculature in Normal Eyes on Optical Coherence Tomography Angiography.

    PubMed

    Lupidi, Marco; Coscas, Florence; Cagini, Carlo; Fiore, Tito; Spaccini, Elisa; Fruttini, Daniela; Coscas, Gabriel

    2016-09-01

    To describe a new automated quantitative technique for displaying and analyzing macular vascular perfusion using optical coherence tomography angiography (OCT-A) and to determine a normative data set, which might be used as reference in identifying progressive changes due to different retinal vascular diseases. Reliability study. A retrospective review of 47 eyes of 47 consecutive healthy subjects imaged with a spectral-domain OCT-A device was performed in a single institution. Full-spectrum amplitude-decorrelation angiography generated OCT angiograms of the retinal superficial and deep capillary plexuses. Fully automated custom-built software was used to provide quantitative data on the foveal avascular zone (FAZ) features and the total vascular and avascular surfaces. A comparative analysis between central macular thickness (and volume) and FAZ metrics was performed. Repeatability and reproducibility were also assessed in order to establish the feasibility and reliability of the method. The comparative analysis between the superficial capillary plexus and the deep capillary plexus revealed a statistically significant difference (P < .05) in terms of FAZ perimeter, surface, and major axis and a not statistically significant difference (P > .05) when considering total vascular and avascular surfaces. A linear correlation was demonstrated between central macular thickness (and volume) and the FAZ surface. Coefficients of repeatability and reproducibility were less than 0.4, thus demonstrating high intraobserver repeatability and interobserver reproducibility for all the examined data. A quantitative approach on retinal vascular perfusion, which is visible on Spectralis OCT angiography, may offer an objective and reliable method for monitoring disease progression in several retinal vascular diseases. Copyright © 2016 Elsevier Inc. All rights reserved.

  30. Detection of sex chromosome aneuploidies using quantitative fluorescent PCR in the Hungarian population.

    PubMed

    Nagy, Balint; Nagy, Richard Gyula; Lazar, Levente; Schonleber, Julianna; Papp, Csaba; Rigo, Janos

    2015-05-20

    Aneuploidies are the most frequent chromosomal abnormalities at birth. Autosomal aneuploidies cause serious malformations, such as trisomy 21, trisomy 18 and trisomy 13, whereas sex chromosome aneuploidies cause less severe syndromes. For the detection of these aneuploidies, the "gold standard" method is the cytogenetic analysis of fetal cells; karyograms show all numerical and structural abnormalities, but it takes 2-4 weeks to get the reports. Molecular biological methods were developed to overcome the long culture time; thus, FISH and quantitative fluorescent PCR were introduced. In this work we report our experience with a commercial kit for the detection of sex chromosome aneuploidies. We analyzed 20,173 amniotic fluid samples over the period 2006-2013 in our department. A conventional cytogenetic analysis was performed on the samples. We checked the reliability of quantitative fluorescent PCR and DNA fragment analysis on those samples where a sex chromosomal aneuploidy was diagnosed. From the 20,173 amniotic fluid samples we found 50 samples with sex chromosome aneuploidy: 19 samples showing 45, XO, 17 samples with 47, XXY, 9 samples with 47, XXX and 5 samples with 47, XYY karyotypes. The applied quantitative fluorescent PCR and DNA fragment analysis methods were able to detect all abnormal sex chromosome aneuploidies. Quantitative fluorescent PCR is a fast and reliable method for detection of sex chromosome aneuploidies. Copyright © 2015. Published by Elsevier B.V.

  31. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    PubMed

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on digital image processing of three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As region-based shape features of a grayscale image, Zernike moments with their inherent invariance properties were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R(2)) for training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
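
    A rough outline of such a pipeline in Python (an assumption-laden sketch, not the authors' code: it uses the mahotas library's zernike_moments for the image descriptors and plain least squares in place of the paper's stepwise-selected model):

      import numpy as np
      import mahotas  # provides mahotas.features.zernike_moments

      def zernike_features(img, radius=64, degree=8):
          """Region-based shape descriptors of a grayscale spectrum image."""
          return mahotas.features.zernike_moments(img, radius, degree=degree)

      def fit_linear_model(X, y):
          """Ordinary least squares: y ~ X plus an intercept column."""
          A = np.column_stack([X, np.ones(len(X))])
          coef, *_ = np.linalg.lstsq(A, y, rcond=None)
          return coef

      def predict(coef, X):
          return np.column_stack([X, np.ones(len(X))]) @ coef

      # Usage (hypothetical names): render each 3D HPLC-DAD spectrum as a
      # grayscale image, then
      #   X = np.array([zernike_features(img) for img in calibration_images])
      #   coef = fit_linear_model(X, known_concentrations)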

  32. A Quantitative Risk Analysis Framework for Evaluating and Monitoring Operational Reliability of Cloud Computing

    ERIC Educational Resources Information Center

    Islam, Muhammad Faysal

    2013-01-01

    Cloud computing offers the advantage of on-demand, reliable and cost efficient computing solutions without the capital investment and management resources to build and maintain in-house data centers and network infrastructures. Scalability of cloud solutions enable consumers to upgrade or downsize their services as needed. In a cloud environment,…

  33. The Effect of Different Cultural Lenses on Reliability and Validity in Observational Data: The Example of Chinese Immigrant Parent-Toddler Dinner Interactions

    ERIC Educational Resources Information Center

    Wang, Yan Z.; Wiley, Angela R.; Zhou, Xiaobin

    2007-01-01

    This study used a mixed methodology to investigate reliability, validity, and analysis level with Chinese immigrant observational data. European-American and Chinese coders quantitatively rated 755 minutes of Chinese immigrant parent-toddler dinner interactions on parental sensitivity, intrusiveness, detachment, negative affect, positive affect,…

  34. SuperSegger: robust image segmentation, analysis and lineage tracking of bacterial cells.

    PubMed

    Stylianidou, Stella; Brennan, Connor; Nissen, Silas B; Kuwada, Nathan J; Wiggins, Paul A

    2016-11-01

    Many quantitative cell biology questions require fast yet reliable automated image segmentation to identify and link cells from frame-to-frame, and characterize the cell morphology and fluorescence. We present SuperSegger, an automated MATLAB-based image processing package well-suited to quantitative analysis of high-throughput live-cell fluorescence microscopy of bacterial cells. SuperSegger incorporates machine-learning algorithms to optimize cellular boundaries and automated error resolution to reliably link cells from frame-to-frame. Unlike existing packages, it can reliably segment microcolonies with many cells, facilitating the analysis of cell-cycle dynamics in bacteria as well as cell-contact mediated phenomena. This package has a range of built-in capabilities for characterizing bacterial cells, including the identification of cell division events, mother, daughter and neighbouring cells, and computing statistics on cellular fluorescence, the location and intensity of fluorescent foci. SuperSegger provides a variety of postprocessing data visualization tools for single cell and population level analysis, such as histograms, kymographs, frame mosaics, movies and consensus images. Finally, we demonstrate the power of the package by analyzing lag phase growth with single cell resolution. © 2016 John Wiley & Sons Ltd.

  35. User-perceived reliability of unrepairable shared protection systems with functionally identical units

    NASA Astrophysics Data System (ADS)

    Ozaki, Hirokazu; Kara, Atsushi; Cheng, Zixue

    2012-05-01

    In this article, we investigate the reliability of M-for-N (M:N) shared protection systems. We focus on the reliability that is perceived by an end user of one of N units. We assume that any failed unit is instantly replaced by one of the M units (if available). We describe the effectiveness of such a protection system in a quantitative manner under the condition that the failed units are not repairable. Mathematical analysis gives the closed-form solution of the reliability and mean time to failure (MTTF). We also analyse several numerical examples of the reliability and MTTF. This result can be applied, for example, to the analysis and design of an integrated circuit consisting of redundant backup components. In such a device, repairing a failed component is unrealistic. The analysis provides useful information for the design of general shared protection systems in which the failed units are not repaired.
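
    The closed forms themselves are not restated in this record, but the MTTF of any such non-repairable system follows from a standard absorbing-Markov-chain solve; the sketch below uses a simplified system-level M-for-N model with hypothetical rates, not the article's user-perceived formulation:

      import numpy as np

      def mttf(Q):
          """Mean time to absorption from each transient state of a CTMC.

          Q: generator matrix restricted to the transient states (row sums
          need not be zero; the deficit is the absorption rate). Solves
          Q @ tau = -1.
          """
          return np.linalg.solve(Q, -np.ones(Q.shape[0]))

      # Simplified M-for-N protection, no repair: N active units with failure
      # rate lam each; every failure consumes one of M spares; the system is
      # absorbed when a failure occurs with no spare left.
      N, M, lam = 4, 2, 1e-3
      k = M + 1                      # transient states: M, M-1, ..., 0 spares
      Q = np.zeros((k, k))
      for i in range(k):
          Q[i, i] = -N * lam         # total failure rate in every state
          if i + 1 < k:
              Q[i, i + 1] = N * lam  # a failure consumes one spare
      print(mttf(Q)[0])              # from full spares: (M + 1) / (N * lam)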

  36. A Bayesian approach to reliability and confidence

    NASA Technical Reports Server (NTRS)

    Barnes, Ron

    1989-01-01

    The historical evolution of NASA's interest in quantitative measures of reliability assessment is outlined. The introduction of some quantitative methodologies into the Vehicle Reliability Branch of the Safety, Reliability and Quality Assurance (SR and QA) Division at Johnson Space Center (JSC) was noted, along with the development of the Extended Orbiter Duration--Weakest Link study, which will utilize quantitative tools for a Bayesian statistical analysis. Extending the earlier work of their NASA sponsor, Richard Heydorn, researchers were able to produce a consistent Bayesian estimate for the reliability of a component and hence, by a simple extension, for a system of components, in some cases where the rate of failure is not constant but varies over time. Mechanical systems in general have this property since the reliability usually decreases markedly as the parts degrade over time. While they have been able to reduce the Bayesian estimator to a simple closed form for a large class of such systems, the form for the most general case needs to be attacked by the computer. Once a table is generated for this form, researchers will have a numerical form for the general solution. With this, the corresponding probability statements about the reliability of a system can be made in the most general setting. Note that the utilization of uniform Bayesian priors represents a worst case scenario in the sense that as researchers incorporate more expert opinion into the model, they will be able to improve the strength of the probability calculations.
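
    For the simplest constant-reliability case, the Bayesian update has the familiar Beta-Binomial closed form; a sketch with hypothetical test outcomes (the time-varying failure-rate case described above is beyond this illustration):

      from scipy import stats

      # Uniform Beta(1, 1) prior on reliability R (the worst-case prior),
      # updated with s failure-free outcomes in n demands (hypothetical data).
      n, s = 50, 48
      posterior = stats.beta(1 + s, 1 + (n - s))

      print("posterior mean R:", posterior.mean())       # (s + 1) / (n + 2)
      print("95% one-sided lower bound:", posterior.ppf(0.05))
      print("P(R > 0.9):", posterior.sf(0.9))            # a direct probability statement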

  37. Statistical methodology: II. Reliability and validity assessment in study design, Part B.

    PubMed

    Karras, D J

    1997-02-01

    Validity measures the correspondence between a test and other purported measures of the same or similar qualities. When a reference standard exists, a criterion-based validity coefficient can be calculated. If no such standard is available, the concepts of content and construct validity may be used, but quantitative analysis may not be possible. The Pearson and Spearman tests of correlation are often used to assess the correspondence between tests, but do not account for measurement biases and may yield misleading results. Techniques that measure intertest differences may be more meaningful in validity assessment, and the kappa statistic is useful for analyzing categorical variables. Questionnaires often can be designed to allow quantitative assessment of reliability and validity, although this may be difficult. Inclusion of homogeneous questions is necessary to assess reliability. Analysis is enhanced by using Likert scales or similar techniques that yield ordinal data. Validity assessment of questionnaires requires careful definition of the scope of the test and comparison with previously validated tools.

  38. The reliability analysis of a separated, dual fail operational redundant strapdown IMU. [inertial measurement unit

    NASA Technical Reports Server (NTRS)

    Motyka, P.

    1983-01-01

    A methodology for quantitatively analyzing the reliability of redundant avionics systems, in general, and the dual, separated Redundant Strapdown Inertial Measurement Unit (RSDIMU), in particular, is presented. The RSDIMU is described and a candidate failure detection and isolation system presented. A Markov reliability model is employed. The operational states of the system are defined and the single-step state transition diagrams discussed. Graphical results, showing the impact of major system parameters on the reliability of the RSDIMU system, are presented and discussed.

  39. Reliable LC-MS quantitative glycomics using iGlycoMab stable isotope labeled glycans as internal standards.

    PubMed

    Zhou, Shiyue; Tello, Nadia; Harvey, Alex; Boyes, Barry; Orlando, Ron; Mechref, Yehia

    2016-06-01

    Glycans have numerous functions in biological processes and participate in the progress of diseases. Reliable quantitative glycomic profiling techniques could contribute to the understanding of the biological functions of glycans and lead to the discovery of potential glycan biomarkers for diseases. Although LC-MS is a powerful analytical tool for quantitative glycomics, variation in ionization efficiency and MS intensity bias undermine the reliability of quantitation. Internal standards can be utilized for glycomic quantitation by MS-based methods to reduce variability. In this study, we used a stable isotope-labeled IgG2b monoclonal antibody, iGlycoMab, as an internal standard to reduce the potential for errors and the variability due to sample digestion, derivatization, and fluctuation of nanoESI efficiency in the LC-MS analysis of permethylated N-glycans released from model glycoproteins, human blood serum, and a breast cancer cell line. We observed an unanticipated degradation of isotope-labeled glycans, tracked the source of this degradation, and optimized a sample preparation protocol to minimize degradation of the internal standard glycans. All results indicated the effectiveness of using iGlycoMab to minimize errors originating from sample handling and instruments. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
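
    The arithmetic that an isotope-labeled internal standard enables is simple; a generic single-point quantitation sketch (hypothetical peak areas, not the iGlycoMab protocol):

      def analyte_amount(area_analyte, area_is, amount_is, rrf=1.0):
          """Single-point internal-standard quantitation:

          amount = (analyte area / IS area) * spiked IS amount / RRF

          With an isotope-labeled IS that co-elutes with the analyte, the
          relative response factor (RRF) is close to 1, and ionization
          suppression affects both species equally.
          """
          return (area_analyte / area_is) * amount_is / rrf

      # Hypothetical glycan peak areas from one LC-MS run, IS spiked at 2.0 pmol.
      print(analyte_amount(area_analyte=3.4e6, area_is=1.7e6, amount_is=2.0))  # 4.0 pmol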

  40. Tackling reliability and construct validity: the systematic development of a qualitative protocol for skill and incident analysis.

    PubMed

    Savage, Trevor Nicholas; McIntosh, Andrew Stuart

    2017-03-01

    It is important to understand factors contributing to and directly causing sports injuries in order to improve the effectiveness and safety of sports skills. The characteristics of injury events must be evaluated and described meaningfully and reliably. However, many complex skills cannot be effectively investigated quantitatively because of ethical, technological and validity considerations. Increasingly, qualitative methods are being used to investigate human movement for research purposes, but there are concerns about the reliability and measurement bias of such methods. Using the tackle in Rugby union as an example, we outline a systematic approach for developing a skill analysis protocol with a focus on improving objectivity, validity and reliability. Characteristics for analysis were selected using qualitative analysis, biomechanical theoretical models, and the epidemiological and coaching literature. An expert panel of subject matter experts provided feedback, and the inter-rater reliability of the protocol was assessed using ten trained raters. The inter-rater reliability results were reviewed by the expert panel, and the protocol was revised and assessed in a second inter-rater reliability study. Mean agreement in the second study improved and was comparable with other studies that have reported inter-rater reliability of qualitative analysis of human movement (52-90% agreement; ICC between 0.6 and 0.9).

  41. Differential reliability : probabilistic engineering applied to wood members in bending-tension

    Treesearch

    Stanley K. Suddarth; Frank E. Woeste; William L. Galligan

    1978-01-01

    Reliability analysis is a mathematical technique for appraising the design and materials of engineered structures to provide a quantitative estimate of probability of failure. Two or more cases which are similar in all respects but one may be analyzed by this method; the contrast between the probabilities of failure for these cases allows strong analytical focus on the...

  42. Insights from Industry: A Quantitative Analysis of Engineers' Perceptions of Empathy and Care within Their Practice

    ERIC Educational Resources Information Center

    Hess, Justin L.; Strobel, Johannes; Pan, Rui; Wachter Morris, Carrie A.

    2017-01-01

    This study focuses on two seldom-investigated skills or dispositions aligned with engineering habits of mind--empathy and care. In order to conduct quantitative research, we designed, explored the underlying structure of, validated, and tested the reliability of the Empathy and Care Questionnaire (ECQ), a new psychometric instrument. In the second…

  43. Reliability of fully automated versus visually controlled pre- and post-processing of resting-state EEG.

    PubMed

    Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P

    2015-02-01

    To compare the reliability of a newly developed Matlab® toolbox for the fully automated pre- and post-processing of resting-state EEG (automated analysis, AA) with the reliability of analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (median age 38.2 years, range 20-49; 82% female) underwent three consecutive 256-channel resting-state EEGs at one-year intervals. Results of frequency analysis by AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07; the mean ICC was 0.83±0.05 for AA and 0.84±0.07 for VA. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  44. Reliability of a novel, semi-quantitative scale for classification of structural brain magnetic resonance imaging in children with cerebral palsy.

    PubMed

    Fiori, Simona; Cioni, Giovanni; Klingels, Katrjin; Ortibus, Els; Van Gestel, Leen; Rose, Stephen; Boyd, Roslyn N; Feys, Hilde; Guzzetta, Andrea

    2014-09-01

    To describe the development of a novel rating scale for classification of brain structural magnetic resonance imaging (MRI) in children with cerebral palsy (CP) and to assess its interrater and intrarater reliability. The scale consists of three sections. Section 1 contains descriptive information about the patient and MRI. Section 2 contains the graphical template of brain hemispheres onto which the lesion is transposed. Section 3 contains the scoring system for the quantitative analysis of the lesion characteristics, grouped into different global scores and subscores that assess separately side, regions, and depth. A larger interrater and intrarater reliability study was performed in 34 children with CP (22 males, 12 females; mean age at scan of 9 y 5 mo [SD 3 y 3 mo], range 4 y-16 y 11 mo; Gross Motor Function Classification System level I [n=22], level II [n=10], and level III [n=2]). Very high interrater and intrarater reliability of the total score was found with indices above 0.87. Reliability coefficients of the lobar and hemispheric subscores ranged between 0.53 and 0.95. Global scores for hemispheres, basal ganglia, brain stem, and corpus callosum showed reliability coefficients above 0.65. This study presents the first visual, semi-quantitative scale for classification of brain structural MRI in children with CP. The high degree of reliability of the scale supports its potential application for investigating the relationship between brain structure and function and examining treatment response according to brain lesion severity in children with CP. © 2014 Mac Keith Press.

  45. Reliability and validity evidence of the Assessment of Language Use in Social Contexts for Adults (ALUSCA).

    PubMed

    Valente, Ana Rita S; Hall, Andreia; Alvelos, Helena; Leahy, Margaret; Jesus, Luis M T

    2018-04-12

    The appropriate use of language in context depends on the speaker's pragmatic language competencies. A coding system was used to develop a specific, adult-focused self-administered questionnaire for adults who stutter and adults who do not stutter, the Assessment of Language Use in Social Contexts for Adults (ALUSCA), with three categories: precursors, basic exchanges, and extended literal/non-literal discourse. This paper presents the content validity, item analysis, reliability coefficients and evidence of construct validity of the instrument. Content validity analysis was based on a two-stage process: first, 11 pragmatic questionnaires were assessed to identify items that probe each pragmatic competency and to create the first version of the instrument; second, items were assessed qualitatively by an expert panel composed of adults who stutter and controls, and quantitatively and qualitatively by an expert panel composed of clinicians. A pilot study was conducted with five adults who stutter and five controls to analyse the items and calculate reliability. Evidence of construct validity was obtained using the hypothesized relationships method and factor analysis with 28 adults who stutter and 28 controls. Concerning content validity, the questionnaires assessed up to 13 pragmatic competencies. Qualitative and quantitative analysis revealed ambiguities in item construction; disagreement between experts was resolved through item modification. The pilot study showed that the instrument presented internal consistency and temporal stability. Significant differences between adults who stutter and controls, and different response profiles, revealed the instrument's underlying construct. The instrument is reliable and presents evidence of construct validity.

  6. New approaches for the analysis of confluent cell layers with quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Pohl, L.; Kaiser, M.; Ketelhut, S.; Pereira, S.; Goycoolea, F.; Kemper, Björn

    2016-03-01

    Digital holographic microscopy (DHM) enables high resolution non-destructive inspection of technical surfaces and minimally-invasive label-free live cell imaging. However, the analysis of confluent cell layers represents a challenge as quantitative DHM phase images in this case do not provide sufficient information for image segmentation, determination of the cellular dry mass or calculation of the cell thickness. We present novel strategies for the analysis of confluent cell layers with quantitative DHM phase contrast utilizing a histogram-based evaluation procedure. The applicability of our approach is illustrated by quantification of drug-induced cell morphology changes, and the method is shown to reliably quantify global morphology changes of confluent cell layers.

  7. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

    Software quality is not only vital to the successful operation of the space station; it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancies, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to represent the number and types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
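
    The abstract does not give the model's functional form. As an illustration only, the sketch below fits a Goel-Okumoto reliability-growth curve, m(t) = a(1 - e^(-bt)), to hypothetical cumulative failure counts and estimates the number of fixes still outstanding; both the data and the choice of model are assumptions, not the report's.

        # Illustrative sketch, not the report's model: fit a Goel-Okumoto
        # NHPP curve m(t) = a * (1 - exp(-b*t)) to cumulative failure counts.
        import numpy as np
        from scipy.optimize import curve_fit

        def mean_failures(t, a, b):
            return a * (1.0 - np.exp(-b * t))

        weeks = np.arange(1, 13)                    # hypothetical test weeks
        cum_failures = np.array([5, 9, 14, 17, 20, 22, 24, 25, 26, 27, 27, 28])

        (a, b), _ = curve_fit(mean_failures, weeks, cum_failures, p0=(30.0, 0.1))
        remaining = a - cum_failures[-1]            # expected fixes still required
        print(f"expected total failures a = {a:.1f}, remaining = {remaining:.1f}")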

  8. Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.

    PubMed

    Kendall, Katherine A

    2017-10-01

    Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, because of concerns about the time required to make measurements and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual-perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence of potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. Level of Evidence: 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  9. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    PubMed

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and the ICA resolution results from spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries, generally between 95% and 105%, were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on the analysis of vitamins and caffeine in energy drinks and of aromatic hydrocarbons in motor fuel, with errors within 10%. The results demonstrate that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in cases of spectral overlap and the absence or inaccessibility of reference materials.
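
    A minimal sketch of the ICA step, under stated assumptions: mixed spectra sit in a matrix X of shape (mixtures x wavelengths) obeying the Beer-Lambert law, and scikit-learn's FastICA stands in for whatever ICA implementation the authors used. The scale and sign of ICA components are arbitrary until anchored by the calibration samples.

        # Sketch of ICA-based spectral unmixing (illustrative, synthetic data).
        # Beer-Lambert: X ≈ C @ S, with C concentrations and S pure spectra.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        S_true = np.abs(rng.normal(size=(3, 200)))   # 3 pure-component spectra
        C_true = np.abs(rng.normal(size=(20, 3)))    # 20 calibration mixtures
        X = C_true @ S_true

        ica = FastICA(n_components=3, random_state=0)
        conc_scores = ica.fit_transform(X)   # (20, 3), proportional to concentrations
        spectra_est = ica.mixing_.T          # (3, 200), extracted component spectra
        # Regressing the known calibration concentrations on conc_scores fixes
        # scale and sign, after which "unknown" mixtures can be quantified.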

  10. Reference genes for reverse transcription quantitative PCR in canine brain tissue.

    PubMed

    Stassen, Quirine E M; Riemers, Frank M; Reijmerink, Hannah; Leegwater, Peter A J; Penning, Louis C

    2015-12-09

    In the last decade canine models have been used extensively to study genetic causes of neurological disorders such as epilepsy and Alzheimer's disease and to unravel their pathophysiological pathways. Reverse transcription quantitative polymerase chain reaction is a sensitive and inexpensive method to study expression levels of genes involved in disease processes. Accurate normalisation with stably expressed so-called reference genes is crucial for reliable expression analysis. Following the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines, the expression of ten frequently used reference genes, namely YWHAZ, HMBS, B2M, SDHA, GAPDH, HPRT, RPL13A, RPS5, RPS19 and GUSB, was evaluated in seven brain regions (frontal lobe, parietal lobe, occipital lobe, temporal lobe, thalamus, hippocampus and cerebellum) and whole brain of healthy dogs. The stability of expression varied between different brain areas. Using the geNorm and NormFinder software, HMBS, GAPDH and HPRT were the most reliable reference genes for whole brain. Furthermore, based on geNorm calculations it was concluded that as few as two to three reference genes are sufficient to obtain reliable normalisation, irrespective of the brain area. Our results amend and extend the limited previously published data on canine brain reference genes. Despite the excellent expression stability of HMBS, GAPDH and HPRT, the evaluation of expression stability of reference genes must be a standard and integral part of experimental design and subsequent data analysis.
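
    As a sketch of what the geNorm calculation does: for each candidate gene, the stability measure M is the mean standard deviation of its pairwise log2 expression ratios against every other candidate, with lower M indicating a more stable reference. The sketch below assumes a matrix `expr` of relative expression values; it illustrates the published algorithm, not the tools' source code.

        # Sketch of the geNorm gene-stability measure M (Vandesompele et al.),
        # assuming `expr` holds relative expression values (n_samples, n_genes).
        import numpy as np

        def genorm_m(expr):
            logx = np.log2(expr)
            n = logx.shape[1]
            m = np.empty(n)
            for j in range(n):
                others = [k for k in range(n) if k != j]
                # SD across samples of the pairwise log-ratio of gene j vs gene k
                sds = [np.std(logx[:, j] - logx[:, k], ddof=1) for k in others]
                m[j] = np.mean(sds)
            return m  # lower M = more stably expressed reference gene

        expr = np.abs(np.random.default_rng(3).normal(1.0, 0.2, size=(8, 10)))
        print(np.round(genorm_m(expr), 3))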

  11. Influence of sample preparation and reliability of automated numerical refocusing in stain-free analysis of dissected tissues with quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Lenz, Philipp; Bettenworth, Dominik; Krausewitz, Philipp; Domagk, Dirk; Ketelhut, Steffi

    2015-05-01

    Digital holographic microscopy (DHM) has been demonstrated to be a versatile tool for high resolution non-destructive quantitative phase imaging of surfaces and multi-modal minimally-invasive monitoring of living cell cultures in-vitro. DHM provides quantitative monitoring of physiological processes through functional imaging and structural analysis which, for example, gives new insight into the signalling of cellular water permeability and cell morphology changes due to toxins and infections. Quantitative DHM phase contrast also opens prospective application fields in the analysis of dissected tissues, through stain-free imaging and the quantification of tissue density changes. We show that DHM allows imaging of different tissue layers with high contrast in unstained tissue sections. As the investigation of fixed samples represents a very important application field in pathology, we also analyzed the influence of the sample preparation. The retrieved data demonstrate that the quality of quantitative DHM phase images of dissected tissues depends strongly on the fixing method and common staining agents. As the reconstruction in DHM is performed numerically, multi-focus imaging is achieved from a single digital hologram. We therefore evaluated the automated refocusing feature of DHM on different types of dissected tissues and found that highly reproducible holographic autofocusing can be achieved on moderately stained samples. Finally, it is demonstrated that alterations of the spatial refractive index distribution in murine and human tissue samples represent a reliable absolute parameter that is related to different degrees of inflammation in experimental colitis and Crohn's disease. This paves the way towards the usage of DHM in digital pathology for automated histological examinations and further studies to elucidate the translational potential of quantitative phase microscopy for the clinical management of patients, e.g., with inflammatory bowel disease.

  12. Hyperspectral Imaging and SPA-LDA Quantitative Analysis for Detection of Colon Cancer Tissue

    NASA Astrophysics Data System (ADS)

    Yuan, X.; Zhang, D.; Wang, Ch.; Dai, B.; Zhao, M.; Li, B.

    2018-05-01

    Hyperspectral imaging (HSI) has been demonstrated to provide a rapid, precise, and noninvasive method for cancer detection. However, because HSI produces large volumes of data, quantitative analysis is often necessary to distill information useful for distinguishing cancerous from normal tissue. To demonstrate that HSI with our proposed algorithm can make this distinction, we built a Vis-NIR HSI setup and acquired many spectral images of colon tissues, and then used a successive projections algorithm (SPA) to analyze the hyperspectral image data of the tissues. This was used to build an identification model based on linear discriminant analysis (LDA) using the relative reflectance values at the effective wavelengths. Other tissues were used as a prediction set to verify the reliability of the identification model. The results suggest that Vis-NIR hyperspectral images, together with the spectroscopic classification method, provide a new approach for reliable and safe diagnosis of colon cancer and could lead to advances in cancer diagnosis generally.
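
    A compact sketch of an SPA-then-LDA pipeline, under stated assumptions: a simplified successive-projections variant greedily keeps the bands least collinear with those already chosen, and scikit-learn's LinearDiscriminantAnalysis classifies spectra on the selected bands. The shapes, data, and band budget are hypothetical, not the authors' settings.

        # Sketch of SPA wavelength selection followed by LDA (not the authors' code).
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def spa_select(X, n_select, start=0):
            """Greedy successive projections: pick the band with the largest
            residual norm after projecting out the bands already selected."""
            Xc = X - X.mean(axis=0)
            selected = [start]
            for _ in range(n_select - 1):
                basis = Xc[:, selected]
                # residual of every band after least-squares projection onto basis
                coef, *_ = np.linalg.lstsq(basis, Xc, rcond=None)
                resid = Xc - basis @ coef
                resid[:, selected] = 0.0
                selected.append(int(np.argmax(np.linalg.norm(resid, axis=0))))
            return selected

        X = np.random.rand(300, 120)            # hypothetical pixel spectra
        y = np.random.randint(0, 2, 300)        # hypothetical tissue labels
        bands = spa_select(X, n_select=8)
        model = LinearDiscriminantAnalysis().fit(X[:, bands], y)
        print(bands, model.score(X[:, bands], y))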

  13. Quantitative analysis of fungicide azoxystrobin in agricultural samples with rapid, simple and reliable monoclonal immunoassay.

    PubMed

    Watanabe, Eiki; Miyake, Shiro

    2013-01-15

    This work presents the analytical performance of a kit-based direct competitive enzyme-linked immunosorbent assay (dc-ELISA) for azoxystrobin detection in agricultural products. The dc-ELISA was sufficiently sensitive for analysis of residue levels close to the maximum residue limits and showed no cross-reactivity to other strobilurin analogues. Absorbance decreased as the methanol concentration in the sample solution increased from 2% to 40%, and the standard curve was most linear when the sample solution contained 10% methanol. Agricultural samples were therefore extracted with methanol, and the extracts were diluted with water to a suitable methanol concentration of 10%. No significant matrix interference was observed. Satisfactory recoveries were found for all spiked samples, and the results agreed well with those obtained by liquid chromatography. These results clearly indicate that the kit-based dc-ELISA is suitable for rapid, simple, quantitative and reliable determination of the fungicide. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. New horizons in mouse immunoinformatics: reliable in silico prediction of mouse class I histocompatibility major complex peptide binding affinity.

    PubMed

    Hattotuwagama, Channa K; Guan, Pingping; Doytchinova, Irini A; Flower, Darren R

    2004-11-21

    Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatics disciplines. Predictive computational models of peptide-major histocompatibility complex (MHC) binding affinity, based on QSAR technology, have now become a vital component of modern computational immunovaccinology. Historically, such approaches have been built around semi-qualitative classification methods, but these are now giving way to quantitative regression methods. The additive method, an established immunoinformatics technique for the quantitative prediction of peptide-protein affinity, was used here to identify the sequence dependence of peptide binding specificity for three mouse class I MHC alleles: H2-D(b), H2-K(b) and H2-K(k). As we show, the resulting models represent a significant advance on existing methods in terms of reliability. They can be used for the accurate prediction of T-cell epitopes and are freely available online ( http://www.jenner.ac.uk/MHCPred).
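
    The additive method models affinity as a sum of independent position-specific amino acid contributions. The sketch below illustrates that idea with one-hot-encoded 9-mers; the peptides, affinities, and the use of ridge regression (standing in for the authors' actual fitting procedure) are assumptions for illustration only.

        # Additive-method sketch (illustrative): affinity = sum of independent
        # position-specific amino acid contributions, fitted by ridge regression
        # on one-hot-encoded 9-mer peptides (hypothetical training data).
        import numpy as np
        from sklearn.linear_model import Ridge

        AA = "ACDEFGHIKLMNPQRSTVWY"

        def one_hot(peptide):
            v = np.zeros(len(peptide) * 20)
            for pos, aa in enumerate(peptide):
                v[pos * 20 + AA.index(aa)] = 1.0
            return v

        peptides = ["ASNENMETM", "FAPGNYPAL", "SGPSNTPPE"]   # hypothetical 9-mers
        pic50 = np.array([7.2, 6.1, 5.4])                    # hypothetical affinities

        X = np.vstack([one_hot(p) for p in peptides])
        model = Ridge(alpha=1.0).fit(X, pic50)
        print(model.predict(one_hot("ASNENMETM")[None, :]))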

  15. Factor Structure and Reliability of the 2008 and 2009 SERU/UCUES Questionnaire Core. SERU Project Technical Report

    ERIC Educational Resources Information Center

    Chatman, Steve

    2009-01-01

    This technical report summarizes the third independent factor analysis of the SERU/UCUES questionnaire responses of students with majors. The 2009 solution employed the same quantitative analysis used in the prior solutions--varimax orthogonal rotation to determine principal components followed by promax oblique rotation to identify…

  16. Improved FTA methodology and application to subsea pipeline reliability design.

    PubMed

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, the Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different way of thinking about risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, FET significantly reduces the rather subjective nature of FTA node discovery and greatly simplifies the resulting mathematical calculations for quantitative analysis. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. The resulting improvements are summarized in a comparison table.

  17. Improved FTA Methodology and Application to Subsea Pipeline Reliability Design

    PubMed Central

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, the Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different way of thinking about risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, FET significantly reduces the rather subjective nature of FTA node discovery and greatly simplifies the resulting mathematical calculations for quantitative analysis. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. The resulting improvements are summarized in a comparison table. PMID:24667681

  18. A New Algorithm Using Cross-Assignment for Label-Free Quantitation with LC/LTQ-FT MS

    PubMed Central

    Andreev, Victor P.; Li, Lingyun; Cao, Lei; Gu, Ye; Rejtar, Tomas; Wu, Shiaw-Lin; Karger, Barry L.

    2008-01-01

    A new algorithm is described for label-free quantitation of relative protein abundances across multiple complex proteomic samples. Q-MEND is based on the denoising and peak picking algorithm, MEND, previously developed in our laboratory. Q-MEND takes advantage of the high resolution and mass accuracy of the hybrid LTQ-FT mass spectrometer (or other high-resolution mass spectrometers, such as a Q-TOF MS). The strategy, termed “cross-assignment”, is introduced to increase substantially the number of quantitated proteins. In this approach, all MS/MS identifications for the set of analyzed samples are combined into a master ID list, and then each LC/MS run is searched for the features that can be assigned to a specific identification from that master list. The reliability of quantitation is enhanced by quantitating separately all peptide charge states, along with a scoring procedure to filter out less reliable peptide abundance measurements. The effectiveness of Q-MEND is illustrated in the relative quantitative analysis of E. coli samples spiked with known amounts of non-E. coli protein digests. A mean quantitation accuracy of 7% and mean precision of 15% is demonstrated. Q-MEND can perform relative quantitation of a set of LC/MS datasets without manual intervention and can generate files compatible with the Guidelines for Proteomic Data Publication. PMID:17441747

  19. A new algorithm using cross-assignment for label-free quantitation with LC-LTQ-FT MS.

    PubMed

    Andreev, Victor P; Li, Lingyun; Cao, Lei; Gu, Ye; Rejtar, Tomas; Wu, Shiaw-Lin; Karger, Barry L

    2007-06-01

    A new algorithm is described for label-free quantitation of relative protein abundances across multiple complex proteomic samples. Q-MEND is based on the denoising and peak picking algorithm, MEND, previously developed in our laboratory. Q-MEND takes advantage of the high resolution and mass accuracy of the hybrid LTQ-FT mass spectrometer (or other high-resolution mass spectrometers, such as a Q-TOF MS). The strategy, termed "cross-assignment", is introduced to increase substantially the number of quantitated proteins. In this approach, all MS/MS identifications for the set of analyzed samples are combined into a master ID list, and then each LC-MS run is searched for the features that can be assigned to a specific identification from that master list. The reliability of quantitation is enhanced by quantitating separately all peptide charge states, along with a scoring procedure to filter out less reliable peptide abundance measurements. The effectiveness of Q-MEND is illustrated in the relative quantitative analysis of Escherichia coli samples spiked with known amounts of non-E. coli protein digests. A mean quantitation accuracy of 7% and mean precision of 15% is demonstrated. Q-MEND can perform relative quantitation of a set of LC-MS data sets without manual intervention and can generate files compatible with the Guidelines for Proteomic Data Publication.
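
    A minimal sketch of the cross-assignment idea as described above (not the Q-MEND implementation): MS/MS identifications from all runs form a master list, and each run's LC-MS features are matched back to it within m/z and retention-time tolerances. The peptide, tolerances, and feature values below are hypothetical.

        # Illustrative cross-assignment: match each run's features to a master
        # ID list so peptides identified in one run can be quantified in others.
        from dataclasses import dataclass

        @dataclass
        class Feature:
            mz: float
            rt: float        # retention time, minutes
            intensity: float

        def cross_assign(master_ids, run_features, mz_ppm=10.0, rt_tol=1.0):
            assignments = {}
            for pep, (mz, rt) in master_ids.items():
                for f in run_features:
                    if abs(f.mz - mz) / mz * 1e6 <= mz_ppm and abs(f.rt - rt) <= rt_tol:
                        # keep the most intense matching feature
                        if pep not in assignments or f.intensity > assignments[pep].intensity:
                            assignments[pep] = f
            return assignments

        master = {"LVNELTEFAK": (575.31, 42.3)}          # hypothetical peptide ID
        run = [Feature(575.312, 42.1, 3.2e6), Feature(575.40, 42.2, 1.0e6)]
        print(cross_assign(master, run))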

  20. Concurrent validation and reliability of digital image analysis of granulation tissue color for clinical pressure ulcers.

    PubMed

    Iizaka, Shinji; Sugama, Junko; Nakagami, Gojiro; Kaitani, Toshiko; Naito, Ayumi; Koyanagi, Hiroe; Matsuo, Junko; Kadono, Takafumi; Konya, Chizuko; Sanada, Hiromi

    2011-01-01

    Granulation tissue color is one indicator for pressure ulcer (PU) assessment. However, it entails a subjective evaluation only, and quantitative methods have not been established. We developed color indicators from digital image analysis and investigated their concurrent validity and reliability for clinical PUs. A cross-sectional study was conducted on 47 patients with 55 full-thickness PUs. After color calibration, a wound photograph was converted into three images representing red color: the erythema index (EI), a modified erythema index with additional color calibration (the granulation red index [GRI]), and the a* image, which represents the artificially created red-green axis of the L*a*b* color space. The mean intensity of the granulation tissue region and the percentage of pixels exceeding the optimal cutoff intensity (% intensity) were calculated. Mean GRI (ρ=0.39, p=0.007) and a* (ρ=0.55, p<0.001), as well as their % intensity indicators, showed positive correlations with a* measured by a tristimulus colorimeter, but the erythema index did not. They were also correlated with the hydroxyproline concentration in wound fluid, healthy granulation tissue area, and blood hemoglobin level. Intra- and interrater reliability of the indicator calculation using both GRI and a* had an intraclass correlation coefficient >0.9. GRI and a* from digital image analysis can quantitatively evaluate the granulation tissue color of clinical PUs. © 2011 by the Wound Healing Society.
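
    As an illustration of this kind of analysis (not the authors' code), the sketch below extracts the a* (red-green) channel from a calibrated wound photograph and computes a mean-intensity and a %-intensity style indicator; the file name, mask rule, and cutoff are hypothetical.

        # Sketch of a*-channel redness indicators from a wound photograph.
        import numpy as np
        from skimage import io, color

        img = io.imread("wound_photo.png")[:, :, :3] / 255.0   # hypothetical file
        lab = color.rgb2lab(img)
        a_star = lab[:, :, 1]                                  # red-green axis

        mask = a_star > 20            # hypothetical cutoff for "red" pixels
        mean_redness = a_star[mask].mean() if mask.any() else float("nan")
        pct_intensity = 100.0 * mask.mean()   # % pixels above the cutoff
        print(mean_redness, pct_intensity)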

  1. A diameter-sensitive flow entropy method for reliability consideration in water distribution system design

    NASA Astrophysics Data System (ADS)

    Liu, Haixing; Savić, Dragan; Kapelan, Zoran; Zhao, Ming; Yuan, Yixing; Zhao, Hongbin

    2014-07-01

    Flow entropy is a measure of the uniformity of pipe flows in water distribution systems (WDSs). By maximizing flow entropy one can identify reliable layouts or connectivity in networks. To overcome the disadvantage that the common definition of flow entropy does not consider the impact of pipe diameter on reliability, an extended definition, termed diameter-sensitive flow entropy, is proposed. This new methodology is then assessed against other reliability methods, including Monte Carlo simulation, a pipe failure probability model, and a surrogate measure (the resilience index) integrated with water demand and pipe failure uncertainty. The reliability assessment is based on a sample of WDS designs derived from an optimization process for each of two benchmark networks. Correlation analysis is used to evaluate quantitatively the relationship between entropy and reliability, and a comparative analysis between simple flow entropy and the new method is conducted. The results demonstrate that diameter-sensitive flow entropy shows a consistently much stronger correlation with the three reliability measures than simple flow entropy. Therefore, the new flow entropy method can be taken as a better surrogate measure for reliability and could potentially be integrated into the optimal design problem of WDSs. Sensitivity analysis results show that the velocity parameters used in the new flow entropy have no significant impact on the relationship between diameter-sensitive flow entropy and reliability.
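
    For intuition, flow entropy is a Shannon-type entropy over pipe-flow fractions that peaks when flow is spread uniformly across pipes. The sketch below is a simplified illustration only; full formulations such as the Tanyimboh-Templeman definition include nodal terms, and the diameter-sensitive variant proposed here additionally accounts for pipe diameters.

        # Simplified illustration of flow entropy (hypothetical flows in L/s).
        import numpy as np

        def flow_entropy(flows):
            q = np.asarray(flows, dtype=float)
            p = q / q.sum()                   # fraction of total flow per pipe
            return -np.sum(p * np.log(p))     # maximal when flows are uniform

        print(flow_entropy([10, 10, 10, 10]))   # uniform -> highest entropy
        print(flow_entropy([37, 1, 1, 1]))      # skewed  -> lower entropy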

  2. The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Latimer, John A.

    2009-01-01

    This presentation describes the Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodologies of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.

  3. Regional reliability of quantitative signal targeting with alternating radiofrequency (STAR) labeling of arterial regions (QUASAR).

    PubMed

    Tatewaki, Yasuko; Higano, Shuichi; Taki, Yasuyuki; Thyreau, Benjamin; Murata, Takaki; Mugikura, Shunji; Ito, Daisuke; Takase, Kei; Takahashi, Shoki

    2014-01-01

    Quantitative signal targeting with alternating radiofrequency labeling of arterial regions (QUASAR) is a recent spin labeling technique that could improve the reliability of brain perfusion measurements. Although it is considered reliable for measuring gray matter as a whole, it has never been evaluated regionally. Here we assessed this regional reliability. Using a 3-Tesla Philips Achieva whole-body system, we scanned 10 healthy volunteers four times each, in two sessions 2 weeks apart, to obtain QUASAR images. We computed perfusion images and ran a voxel-based analysis within all brain structures. We also calculated mean regional cerebral blood flow (rCBF) within regions of interest configured for each arterial territory distribution. The mean CBF over whole gray matter was 37.74 with an intraclass correlation coefficient (ICC) of .70. In white matter, it was 13.94 with an ICC of .30. Voxel-wise ICC and coefficient-of-variation maps showed relatively lower reliability in watershed areas and in white matter, especially deeper white matter. The absolute mean rCBF values were consistent with those reported from PET, as was the relatively low variability between different feeding arteries. Thus, QUASAR reliability for regional perfusion is high within gray matter, but uncertain within white matter. © 2014 The Authors. Journal of Neuroimaging published by the American Society of Neuroimaging.

  4. Regional Reliability of Quantitative Signal Targeting with Alternating Radiofrequency (STAR) Labeling of Arterial Regions (QUASAR)

    PubMed Central

    Tatewaki, Yasuko; Higano, Shuichi; Taki, Yasuyuki; Thyreau, Benjamin; Murata, Takaki; Mugikura, Shunji; Ito, Daisuke; Takase, Kei; Takahashi, Shoki

    2014-01-01

    BACKGROUND AND PURPOSE Quantitative signal targeting with alternating radiofrequency labeling of arterial regions (QUASAR) is a recent spin labeling technique that could improve the reliability of brain perfusion measurements. Although it is considered reliable for measuring gray matter as a whole, it has never been evaluated regionally. Here we assessed this regional reliability. METHODS Using a 3-Tesla Philips Achieva whole-body system, we scanned 10 healthy volunteers four times each, in two sessions 2 weeks apart, to obtain QUASAR images. We computed perfusion images and ran a voxel-based analysis within all brain structures. We also calculated mean regional cerebral blood flow (rCBF) within regions of interest configured for each arterial territory distribution. RESULTS The mean CBF over whole gray matter was 37.74 with an intraclass correlation coefficient (ICC) of .70. In white matter, it was 13.94 with an ICC of .30. Voxel-wise ICC and coefficient-of-variation maps showed relatively lower reliability in watershed areas and in white matter, especially deeper white matter. The absolute mean rCBF values were consistent with those reported from PET, as was the relatively low variability between different feeding arteries. CONCLUSIONS QUASAR reliability for regional perfusion is high within gray matter, but uncertain within white matter. PMID:25370338

  5. Reliability of Fault Tolerant Control Systems. Part 1

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva

    2001-01-01

    This paper reports Part I of a two-part effort intended to delineate the relationship between reliability and fault tolerant control in a quantitative manner. Reliability analysis of fault-tolerant control systems is performed using Markov models, and reliability properties peculiar to fault-tolerant control systems are emphasized. One consequence is that coverage of failures through redundancy management can be severely limited. It is shown that in the early life of a system composed of highly reliable subsystems, the reliability of the overall system is affine with respect to coverage, and inadequate coverage induces dominant single-point failures. The utility of some existing software tools for assessing the reliability of fault tolerant control systems is also discussed. Coverage modeling is attempted in Part II in a way that captures its dependence on the control performance and on the diagnostic resolution.
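
    To see why early-life reliability is affine in coverage, consider a hypothetical duplex system (not the paper's example): two redundant units with failure rate λ, where a unit failure is successfully detected and isolated with probability c. A standard duplex model gives R(t) = e^(-2λt) + 2c e^(-λt)(1 - e^(-λt)), which is linear in c for fixed t; the sketch below evaluates it.

        # Illustrative duplex-system reliability, affine in the coverage c.
        import numpy as np

        def duplex_reliability(t, lam, c):
            s = np.exp(-lam * t)
            return s**2 + 2.0 * c * s * (1.0 - s)

        t, lam = 100.0, 1e-4   # hours, failures/hour (hypothetical values)
        for c in (0.90, 0.95, 0.99, 1.0):
            print(c, duplex_reliability(t, lam, c))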

  6. Quantitative nondestructive evaluation: Requirements for tomorrow's reliability

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1991-01-01

    Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has impact on everyday life from the cars we drive, the planes we fly, the buildings we work or live in, literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost-effective tomorrow. Specific technologies that are discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements result in enabling measurement technologies, which in turn change the boundaries of design/practice.

  7. 78 FR 63036 - Transmission Planning Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-23

    ... blend of specific quantitative and qualitative parameters for the permissible use of planned non... circumstances, Reliability Standard TPL-001-4 provides a blend of specific quantitative and qualitative... considerations, such as costs and alternatives, guards against a determination based solely on a quantitative...

  8. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine: An overview

    PubMed Central

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon; Prasad, Ramjee

    2013-01-01

    Recently, a need has emerged to develop supportive new scientific evidence for contemporary Ayurveda. One research objective is the assessment of the reliability of diagnoses and treatment. Reliability is a quantitative measure of consistency. It is a crucial issue in classification (such as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment, and the conduct of clinical studies. Several reliability studies have been conducted in Western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine diagnoses is in the formative stage, while reliability studies in Ayurveda are in the preliminary stage. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implications for practice, education, and training. An introduction to reliability estimates and different study designs and statistical analyses is given for future studies in Ayurveda. PMID:23930037

  9. Reliability of digital reactor protection system based on extenics.

    PubMed

    Zhao, Jing; He, Ya-Nan; Gu, Peng-Fei; Chen, Wei-Hua; Gao, Feng

    2016-01-01

    After the Fukushima nuclear accident, the safety of nuclear power plants (NPPs) has attracted widespread concern. The reliability of the reactor protection system (RPS) is directly related to the safety of NPPs; however, it is difficult to evaluate the reliability of a digital RPS accurately. Methods based on probability estimation involve uncertainties and cannot reflect the reliability status of the RPS dynamically or support maintenance and troubleshooting. In this paper, a quantitative reliability analysis method based on extenics is proposed for the (safety-critical) digital RPS, by which the relationship between the reliability and the response time of the RPS is constructed. As an example, the reliability of the RPS for a CPR1000 NPP is modeled and analyzed by the proposed method. The results show that the proposed method is capable of estimating the RPS reliability effectively and providing support for the maintenance and troubleshooting of a digital RPS.

  10. Determination of Aspartame, Caffeine, Saccharin, and Benzoic Acid in Beverages by High Performance Liquid Chromatography.

    ERIC Educational Resources Information Center

    Delaney, Michael F.; And Others

    1985-01-01

    Describes a simple and reliable new quantitative analysis experiment using liquid chromatography for the determination of caffeine, saccharin, and sodium benzoate in beverages. Background information, procedures used, and typical results obtained are provided. (JN)

  11. NASA trend analysis procedures

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair-turn-around time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.

  12. Quality and rigor of the concept mapping methodology: a pooled study analysis.

    PubMed

    Rosas, Scott R; Kane, Mary

    2012-05-01

    The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative characteristics and estimates of quality and rigor that may guide future studies are lacking. To address this gap, we conducted a pooled analysis of 69 concept mapping studies to describe characteristics across study phases, generate specific indicators of validity and reliability, and examine the relationship between select study characteristics and quality indicators. Individual study characteristics and estimates were pooled and quantitatively summarized, describing the distribution, variation and parameters of each. In addition, variation in concept mapping data collection in relation to these characteristics and estimates was examined. Overall, the results suggest concept mapping yields strong internal representational validity and very strong sorting and rating reliability estimates. Validity and reliability were consistently high despite variation in participation and task completion percentages across data collection modes. The implications of these findings as a practical reference for assessing the quality and rigor of future concept mapping studies are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Quantitative analysis of virgin coconut oil in cream cosmetics preparations using fourier transform infrared (FTIR) spectroscopy.

    PubMed

    Rohman, A; Man, Yb Che; Sismindari

    2009-10-01

    Today, virgin coconut oil (VCO) is becoming a valuable oil and an attractive topic for researchers because of its several biological activities. In the cosmetics industry, VCO is an excellent material that functions as a skin moisturizer and softener. It is therefore important to develop a quantitative analytical method offering a fast and reliable technique. Fourier transform infrared (FTIR) spectroscopy with the attenuated total reflectance (ATR) sampling technique can be used successfully to analyze VCO quantitatively in cream cosmetic preparations. A multivariate calibration using a partial least squares (PLS) model revealed a good relationship between the actual and FTIR-predicted values of VCO, with a coefficient of determination (R2) of 0.998.
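
    A minimal sketch of such a PLS calibration, with synthetic data standing in for real ATR-FTIR spectra: rows of X are spectra of cream preparations and y is the known VCO content of each calibration sample. Everything below is hypothetical, not the authors' data or code.

        # PLS calibration sketch (illustrative, synthetic spectra).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(1)
        y = rng.uniform(0, 30, size=40)            # % VCO in 40 calibration samples
        pure = rng.normal(size=500)                # synthetic VCO band shape
        X = np.outer(y, pure) + rng.normal(scale=0.5, size=(40, 500))

        pls = PLSRegression(n_components=3).fit(X, y)
        print("R2 =", r2_score(y, pls.predict(X)))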

  14. The Effectiveness of Second Language Strategy Instruction: A Meta-Analysis

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2011-01-01

    Research on the effects of second language strategy instruction (SI) has been extensive yet inconclusive. This meta-analysis, therefore, aims to provide a reliable, quantitative measure of the effect of SI as well as a description of the relationship between SI and the variables that moderate its effectiveness (i.e., different learning contexts,…

  15. Writing Across the Curriculum: Reliability Testing of a Standardized Rubric.

    PubMed

    Minnich, Margo; Kirkpatrick, Amanda J; Goodman, Joely T; Whittaker, Ali; Stanton Chapple, Helen; Schoening, Anne M; Khanna, Maya M

    2018-06-01

    Rubrics positively affect student academic performance; however, accuracy and consistency of the rubric and its use is imperative. The researchers in this study developed a standardized rubric for use across an undergraduate nursing curriculum, then evaluated the interrater reliability and general usability of the tool. Faculty raters graded papers using the standardized rubric, submitted their independent scoring for interrater reliability analyses, then participated in a focus group discussion regarding rubric use experience. Quantitative analysis of the data showed a high interrater reliability (α = .998). Content analysis of transcription revealed several positive themes: Consistency, Emphasis on Writing Ability, and Ability to Use the Rubric as a Teaching Tool. Areas for improvement included use of value words and difficulty with point allocation. Investigators recommend effective faculty orientation for rubric use and future work in developing a rubric to assess reflective writing. [J Nurs Educ. 2018;57(6):366-370.]. Copyright 2018, SLACK Incorporated.

  16. The Importance of Human Reliability Analysis in Human Space Flight: Understanding the Risks

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.

    2010-01-01

    HRA is a method used to describe, qualitatively and quantitatively, the occurrence of human failures in the operation of complex systems that affect availability and reliability. Modeling human actions with their corresponding failure in a PRA (Probabilistic Risk Assessment) provides a more complete picture of the risk and risk contributions. A high quality HRA can provide valuable information on potential areas for improvement, including training, procedural, equipment design and need for automation.

  17. Stable isotope dimethyl labelling for quantitative proteomics and beyond

    PubMed Central

    Hsu, Jue-Liang; Chen, Shu-Hui

    2016-01-01

    Stable-isotope reductive dimethylation, a cost-effective, simple, robust, reliable and easy-to-multiplex labelling method, is widely applied to quantitative proteomics using liquid chromatography-mass spectrometry. This review focuses on biological applications of stable-isotope dimethyl labelling for large-scale comparative analysis of protein expression and post-translational modifications, based on the unique properties of the labelling chemistry. Some other applications of the labelling method for sample preparation and mass spectrometry-based protein identification and characterization are also summarized. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644970

  18. Quantitative mass spectrometry methods for pharmaceutical analysis

    PubMed Central

    Loos, Glenn; Van Schepdael, Ann

    2016-01-01

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With a focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods are generally the aim, so as to provide automated multi-residue analysis, (less sensitive) miniaturized set-ups have great potential owing to their suitability for in-field use. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644982

  19. Targeted quantitative analysis of Streptococcus pyogenes virulence factors by multiple reaction monitoring.

    PubMed

    Lange, Vinzenz; Malmström, Johan A; Didion, John; King, Nichole L; Johansson, Björn P; Schäfer, Juliane; Rameseder, Jonathan; Wong, Chee-Hong; Deutsch, Eric W; Brusniak, Mi-Youn; Bühlmann, Peter; Björck, Lars; Domon, Bruno; Aebersold, Ruedi

    2008-08-01

    In many studies, particularly in the field of systems biology, it is essential that identical protein sets are precisely quantified in multiple samples such as those representing differentially perturbed cell states. The high degree of reproducibility required for such experiments has not been achieved by classical mass spectrometry-based proteomics methods. In this study we describe the implementation of a targeted quantitative approach by which predetermined protein sets are first identified and subsequently reliably quantified at high sensitivity in multiple samples. This approach consists of three steps. First, the proteome is extensively mapped out by multidimensional fractionation and tandem mass spectrometry, and the data generated are assembled in the PeptideAtlas database. Second, based on this proteome map, peptides uniquely identifying the proteins of interest (proteotypic peptides) are selected, and multiple reaction monitoring (MRM) transitions are established and validated by MS2 spectrum acquisition. This process of peptide selection, transition selection, and validation is supported by a suite of software tools, TIQAM (Targeted Identification for Quantitative Analysis by MRM), described in this study. Third, the selected target protein set is quantified in multiple samples by MRM. Applying this approach, we were able to reliably quantify low-abundance virulence factors from cultures of the human pathogen Streptococcus pyogenes exposed to increasing amounts of plasma. The resulting quantitative protein patterns enabled us to clearly define the subset of virulence proteins that is regulated upon plasma exposure.

  20. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    PubMed

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints, and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.
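
    As an illustration of these chemometric steps (not the authors' code), the sketch below computes a correlation-based similarity for each fingerprint against the mean reference fingerprint, then runs HCA and PCA on a hypothetical peak-area matrix.

        # Fingerprint similarity + HCA + PCA sketch (hypothetical peak areas).
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.decomposition import PCA

        F = np.random.rand(19, 18)                     # 19 batches x 18 peaks

        ref = F.mean(axis=0)                           # reference fingerprint
        sim = [np.corrcoef(f, ref)[0, 1] for f in F]   # similarity per batch

        clusters = fcluster(linkage(F, method="ward"), t=3, criterion="maxclust")
        scores = PCA(n_components=2).fit_transform(F)  # for a scores plot
        print(np.round(sim, 3), clusters)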

  1. Identification and quantitation of semi-crystalline microplastics using image analysis and differential scanning calorimetry.

    PubMed

    Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura

    2018-06-01

    There are several techniques used to analyze microplastics, often based on a combination of visual and spectroscopic methods. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low- and high-density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastics mass based on the DSC signal. We describe the potential and limitations of these techniques for increasing the reliability of microplastic analysis. Particle size proved to have a particular influence on the qualitative and quantitative performance of the DSC signals: both identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) were affected by particle size. As a result, a proper sample treatment that includes sieving of suspended particles is particularly important for this analytical approach.
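
    A minimal sketch of the image-analysis step with open-source tools (scikit-image is an assumption; the abstract names no specific package): threshold a micrograph, label connected particles, and report counts and equivalent diameters. The file name and the dark-particles assumption are hypothetical.

        # Illustrative particle counting/sizing with scikit-image.
        import numpy as np
        from skimage import io, filters, measure

        img = io.imread("particles.png", as_gray=True)      # hypothetical micrograph
        binary = img < filters.threshold_otsu(img)          # dark particles assumed
        labels = measure.label(binary)
        props = measure.regionprops(labels)

        sizes_px = [p.equivalent_diameter for p in props]
        print(len(props), "particles; mean diameter (px) =", np.mean(sizes_px))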

  2. Comparative Application of PLS and PCR Methods to Simultaneous Quantitative Estimation and Simultaneous Dissolution Test of Zidovudine - Lamivudine Tablets.

    PubMed

    Üstündağ, Özgür; Dinç, Erdal; Özdemir, Nurten; Tilkan, M Günseli

    2015-01-01

    In the development of new drug products and generic drug products, the simultaneous in-vitro dissolution behavior of oral dosage formulations is the most important indication for the quantitative estimation of the efficiency and biopharmaceutical characteristics of drug substances. This compels scientists in the field to develop powerful analytical methods that give more reliable, precise and accurate results in the quantitative analysis and dissolution testing of drug formulations. In this context, two chemometric tools, partial least squares (PLS) and principal component regression (PCR), were developed for the simultaneous quantitative estimation and dissolution testing of zidovudine (ZID) and lamivudine (LAM) in a tablet dosage form. The results obtained in this study strongly encourage their use for quality control, routine analysis and dissolution testing of marketed tablets containing the ZID and LAM drugs.
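
    A minimal sketch of the PCR side of such a calibration, on synthetic data standing in for the measured spectra: PCA compresses the spectra and the component scores are regressed on the known ZID and LAM concentrations. All names and values are hypothetical.

        # Principal component regression sketch for a two-analyte mixture.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(2)
        C = rng.uniform(5, 50, size=(25, 2))           # ZID, LAM concentrations
        S = np.abs(rng.normal(size=(2, 300)))          # synthetic pure spectra
        X = C @ S + rng.normal(scale=0.1, size=(25, 300))

        pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(X, C)
        print(pcr.predict(X[:3]))                      # predicted concentrations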

  3. [Analysis of active components of evidence materials secured in the cases of drugs abuse associated with amphetamines and cannabis products].

    PubMed

    Wachowiak, Roman; Strach, Bogna

    2006-01-01

    The study takes advantage of presently available and effective physicochemical methods (isolation, crystallization, determination of melting point, TLC, GLC and UV spectrophotometry) for an objective and reliable qualitative and quantitative analysis of frequently abused drugs. The authors determined the conditions for qualitative and quantitative analysis of the active components of secured evidence materials containing amphetamine sulphate, methylamphetamine hydrochloride, and 3,4-methylenedioxymethamphetamine hydrochloride (MDMA, Ecstasy), as well as delta(9)-tetrahydrocannabinol (delta(9)-THC) as an active component of cannabis (marihuana, hashish). The usefulness of physicochemical tests of evidence materials for expert-opinion purposes is subject to detailed forensic toxicological interpretation.

  4. Reliability analysis for the smart grid : from cyber control and communication to physical manifestations of failure.

    DOT National Transportation Integrated Search

    2010-01-01

    The Smart Grid is a cyber-physical system comprised of physical components, such as transmission lines and generators, and a : network of embedded systems deployed for their cyber control. Our objective is to qualitatively and quantitatively analyze ...

  5. Characterization and quantitative analysis of phenylpropanoid amides in eggplant (Solanum melongena L.) by high performance liquid chromatography coupled with diode array detection and hybrid ion trap time-of-flight mass spectrometry.

    PubMed

    Sun, Jing; Song, Yue-Lin; Zhang, Jing; Huang, Zheng; Huo, Hui-Xia; Zheng, Jiao; Zhang, Qian; Zhao, Yun-Fang; Li, Jun; Tu, Peng-Fei

    2015-04-08

    Eggplant (Solanum melongena L.) is a famous edible and medicinal plant. Although it is widely cultivated and used, data on parts other than the fruit are limited. The present study focused on the qualitative and quantitative analysis of the chemical constituents, particularly phenylpropanoid amides (PAs), in eggplant. The mass fragmentation patterns of PAs were proposed using seven authentic compounds with the assistance of a hybrid ion trap time-of-flight mass spectrometer. Thirty-seven compounds (27 PAs and 10 others) were detected and plausibly assigned in the different parts of eggplant. Afterward, a reliable method based on liquid chromatography coupled with diode array detection was developed, validated, and applied to the simultaneous determination of seven PAs and three caffeoylquinic acids in 17 batches of eggplant roots with satisfactory accuracy, precision, and reproducibility. This could not only provide global chemical insight into eggplant but also offer a reliable tool for quality control.

  6. Reproducibility of sonographic measurement of thickness and echogenicity of the plantar fascia.

    PubMed

    Cheng, Ju-Wen; Tsai, Wen-Chung; Yu, Tung-Yang; Huang, Kuo-Yao

    2012-01-01

    To evaluate the intra- and interrater reliability of ultrasonographic measurements of the thickness and echogenicity of the plantar fascia. Eleven patients (20 feet), who complained of inferior heel pain, and 26 volunteers (52 feet) were enrolled. Two sonographers independently imaged the plantar fascia in both longitudinal and transverse planes. Volunteers were assessed twice to evaluate intrarater reliability. Quantitative evaluation of the echogenicity of the plantar fascia was performed by measuring the mean gray level of the region of interest using Digital Imaging and Communications in Medicine viewer software. Sonographic evaluation of the thickness of the plantar fascia showed high reliability. Sonographic evaluations of the presence or absence of hypoechoic change in the plantar fascia showed surprisingly low agreement. The reliability of gray-scale evaluations appears to be much better than subjective judgments in the evaluation of echogenicity. Transverse scanning did not show any advantage in sonographic evaluation of the plantar fascia. The reliability of sonographic examination of the thickness of the plantar fascia is high. Mean gray-level analysis of quantitative sonography can be used for the evaluation of echogenicity, which could reduce discrepancies in the interpretation of echogenicity by different sonographers. Longitudinal instead of transverse scanning is recommended for imaging the plantar fascia. Copyright © 2011 Wiley Periodicals, Inc.

  7. Renal geology (quantitative renal stone analysis) by 'Fourier transform infrared spectroscopy'.

    PubMed

    Singh, Iqbal

    2008-01-01

    To prospectively determine the precise stone composition (quantitative analysis) by infrared spectroscopy in patients with urinary stone disease presenting to our clinic, and to determine an ideal method for stone analysis suitable for use in a clinical setting. After a routine and detailed metabolic workup, stone samples from 50 patients with urolithiasis satisfying the entry criteria were subjected to Fourier transform infrared spectroscopic analysis after adequate sample homogenization at a single testing center. A mixture of calcium oxalate monohydrate and dihydrate was most commonly encountered, in 35 cases (71%), followed by calcium phosphate, carbonate apatite, magnesium ammonium phosphate hexahydrate and xanthine stones. Fourier transform infrared spectroscopy provides an accurate, reliable quantitative method of stone analysis. It also helps in maintaining a large computerized reference library. Knowledge of precise stone composition may allow the institution of appropriate prophylactic therapy despite the absence of any detectable metabolic abnormalities, which may prevent or delay stone recurrence.

  8. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    PubMed

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis-centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents, including triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches, and an understanding of both is useful in critically appraising the psychiatric literature.

  9. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating the reliability of systems. Although single failure modes can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis, whereas in practice, due to the lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for each single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority number (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS; compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error this introduces into the reliability analysis. Furthermore, for the purpose of quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority. This generalizes the definitions of probability weight and FRPN, resulting in a more accurate estimation than that of traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The results provide important insights on fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.
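
    A minimal sketch of the weighted-geometric-mean step, under stated assumptions: severity, occurrence and detection ratings are triangular fuzzy numbers (l, m, u), the weights sum to one, and the weighted geometric mean is taken component-wise. This is a reading of the general technique, not the paper's exact formulation.

        # Fuzzy weighted geometric mean RPN sketch (illustrative).
        import numpy as np

        def fwgm_rpn(factors, weights):
            """factors: list of (l, m, u) fuzzy ratings; weights sum to 1."""
            tri = np.array(factors, dtype=float)          # shape (3 factors, 3)
            w = np.asarray(weights, dtype=float)[:, None]
            return np.prod(tri ** w, axis=0)              # fuzzy RPN (l, m, u)

        severity, occurrence, detection = (6, 7, 8), (3, 4, 5), (5, 6, 7)
        rpn = fwgm_rpn([severity, occurrence, detection], [0.4, 0.35, 0.25])
        print(np.round(rpn, 2))   # defuzzify e.g. by taking the middle value rpn[1]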

  10. Validating a tool to measure auxiliary nurse midwife and nurse motivation in rural Nepal.

    PubMed

    Morrison, Joanna; Batura, Neha; Thapa, Rita; Basnyat, Regina; Skordis-Worrall, Jolene

    2015-05-12

    A global shortage of health workers in rural areas increases the salience of motivating and supporting existing health workers. Understandings of motivation may vary in different settings, and it is important to use measurement methods that are contextually appropriate. We identified a measurement tool, previously used in Kenya, and explored its validity and reliability to measure the motivation of auxiliary nurse midwives (ANM) and staff nurses (SN) in rural Nepal. Qualitative and quantitative methods were used to assess the content validity, the construct validity, the internal consistency and the reliability of the tool. We translated the tool into Nepali and it was administered to 137 ANMs and SNs in three districts. We collected qualitative data from 78 nursing personnel and district- and central-level stakeholders using interviews and focus group discussions. We calculated motivation scores for ANMs and SNs using the quantitative data and conducted statistical tests for validity and reliability. Motivation scores were compared with qualitative data. Descriptive exploratory analysis compared mean motivation scores by ANM and SN sociodemographic characteristics. The concept of self-efficacy was added to the tool before data collection. Motivation was revealed through conscientiousness. Teamwork and the exertion of extra effort were not adequately captured by the tool, but important in illustrating motivation. The statement on punctuality was problematic in quantitative analysis, and attendance was more expressive of motivation. The calculated motivation scores usually reflected ANM and SN interview data, with some variation in other stakeholder responses. The tool scored within acceptable limits in validity and reliability testing and was able to distinguish motivation of nursing personnel with different sociodemographic characteristics. We found that with minor modifications, the tool provided valid and internally consistent measures of motivation among ANMs and SNs in this context. We recommend the use of this tool in similar contexts, with the addition of statements about self-efficacy, teamwork and exertion of extra effort. Absenteeism should replace the punctuality statement, and statements should be worded both positively and negatively to mitigate positive response bias. Collection of qualitative data on motivation creates a more nuanced understanding of quantitative scores.
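
    Internal consistency in validation studies of this kind is commonly summarized with Cronbach's alpha. The sketch below is my own illustration of that standard computation, not the authors' analysis code; the response matrix is invented.

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
        x = np.asarray(scores, dtype=float)
        k = x.shape[1]
        item_var = x.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = x.sum(axis=1).var(ddof=1)    # variance of total scores
        return k / (k - 1) * (1 - item_var / total_var)

    # Hypothetical responses from five nurses to four motivation items.
    print(round(cronbach_alpha([[4, 5, 4, 4],
                                [3, 3, 4, 3],
                                [5, 5, 5, 4],
                                [2, 3, 2, 3],
                                [4, 4, 5, 4]]), 2))
    ```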

  11. The Reliability and Validity of Discrete and Continuous Measures of Psychopathology: A Quantitative Review

    ERIC Educational Resources Information Center

    Markon, Kristian E.; Chmielewski, Michael; Miller, Christopher J.

    2011-01-01

    In 2 meta-analyses involving 58 studies and 59,575 participants, we quantitatively summarized the relative reliability and validity of continuous (i.e., dimensional) and discrete (i.e., categorical) measures of psychopathology. Overall, results suggest an expected 15% increase in reliability and 37% increase in validity through adoption of a…

  12. Relating design and environmental variables to reliability

    NASA Astrophysics Data System (ADS)

    Kolarik, William J.; Landers, Thomas L.

    The combination of space application and a nuclear power source demands high-reliability hardware. The possibilities of failure, either an inability to provide power or a catastrophic accident, must be minimized. Nuclear power experience on the ground has led to highly sophisticated probabilistic risk assessment procedures, most of which require quantitative information to adequately assess such risks. In the area of hardware risk analysis, reliability information plays a key role. One of the lessons learned from the Three Mile Island experience is that thorough analyses of critical components are essential. Nuclear-grade equipment shows some reliability advantages over commercial equipment; however, no statistically significant difference has been found. A recent study pertaining to spacecraft electronics reliability examined some 2500 malfunctions on more than 300 aircraft. The study classified the equipment failures into seven general categories. Design deficiencies and lack of environmental protection accounted for about half of all failures. Within each class, limited reliability modeling was performed using a Weibull failure model.
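
    The Weibull failure model mentioned above has a simple closed form for the reliability function; a minimal sketch with invented parameter values:

    ```python
    import math

    def weibull_reliability(t, beta, eta):
        """R(t) = exp(-(t / eta) ** beta) for shape beta and characteristic life eta."""
        return math.exp(-((t / eta) ** beta))

    # Hypothetical wear-out behaviour (beta > 1), with eta in operating hours.
    print(round(weibull_reliability(5000.0, beta=1.8, eta=20000.0), 4))
    ```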

  13. Subject-level reliability analysis of fast fMRI with application to epilepsy.

    PubMed

    Hao, Yongfu; Khoo, Hui Ming; von Ellenrieder, Nicolas; Gotman, Jean

    2017-07-01

    Recent studies have applied the new magnetic resonance encephalography (MREG) sequence to the study of interictal epileptic discharges (IEDs) in the electroencephalogram (EEG) of epileptic patients. However, there are no criteria for quantitatively evaluating different processing methods in order to use the new sequence properly. We evaluated different processing steps of this new sequence under the common generalized linear model (GLM) framework by assessing the reliability of results. A bootstrap sampling technique was first used to generate multiple replicated data sets; a GLM with different processing steps was then applied to obtain activation maps, and the reliability of these maps was assessed. We applied our analysis in an event-related GLM related to IEDs. A higher reliability was achieved by using a GLM with a head motion confound regressor with 24 components rather than the usual 6, with an autoregressive model of order 5, and with a canonical hemodynamic response function (HRF) rather than variable-latency or patient-specific HRFs. Comparison of activation with the IED field also favored the canonical HRF, consistent with the reliability analysis. The reliability analysis helps to optimize the processing methods for this fast fMRI sequence, in a context in which we do not know the ground truth of activation areas. Magn Reson Med 78:370-382, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
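
    The bootstrap step can be summarized in a few lines. The sketch below assumes a caller-supplied fit_glm function that maps a resampled set of runs to an activation map; both names are hypothetical placeholders, not the study's pipeline.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def bootstrap_activation_maps(runs, fit_glm, n_boot=100):
        """Resample runs with replacement, refit the GLM each time, and
        return the stack of maps for downstream reliability scoring."""
        maps = []
        for _ in range(n_boot):
            idx = rng.integers(0, len(runs), size=len(runs))
            maps.append(fit_glm([runs[i] for i in idx]))
        return np.stack(maps)
    ```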

  14. Advancing the Fork detector for quantitative spent nuclear fuel verification

    DOE PAGES

    Vaccaro, S.; Gauld, I. C.; Hu, J.; ...

    2018-01-31

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. Finally, the results are summarized, sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.

  15. Advancing the Fork detector for quantitative spent nuclear fuel verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaccaro, S.; Gauld, I. C.; Hu, J.

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. Finally, the results are summarized, sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.

  16. Advancing the Fork detector for quantitative spent nuclear fuel verification

    NASA Astrophysics Data System (ADS)

    Vaccaro, S.; Gauld, I. C.; Hu, J.; De Baere, P.; Peterson, J.; Schwalbach, P.; Smejkal, A.; Tomanin, A.; Sjöland, A.; Tobin, S.; Wiarda, D.

    2018-04-01

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This paper describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. The results are summarized, sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.
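
    A go/no-go criterion of the kind described can be reduced to a simple ratio test between measured and predicted count rates. The tolerance and coverage factor below are invented placeholders for illustration, not EURATOM or IAEA values.

    ```python
    def declaration_consistent(measured_rate, predicted_rate, rel_sigma=0.05, k=3.0):
        """Accept the operator declaration if the measured/predicted neutron
        count-rate ratio stays within k combined relative standard deviations."""
        ratio = measured_rate / predicted_rate
        return abs(ratio - 1.0) <= k * rel_sigma

    print(declaration_consistent(1.04e4, 1.00e4))  # True: within tolerance
    ```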

  17. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    NASA Astrophysics Data System (ADS)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered by the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions, this approach proactively measures collision risk in port waters. The proposed method is illustrated for the quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. This method is an ethically appealing alternative to collision-based analysis for fast, reliable and effective safety assessment, and thus possesses great potential for managing collision risks in port waters.

  18. A fast and reliable readout method for quantitative analysis of surface-enhanced Raman scattering nanoprobes on chip surface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Hyejin; Jeong, Sinyoung; Ko, Eunbyeol

    2015-05-15

    Surface-enhanced Raman scattering techniques have been widely used for bioanalysis due to their high sensitivity and multiplex capacity. However, the point-scanning method using a micro-Raman system, which is the most common method in the literature, has the disadvantage of extremely long measurement times for on-chip immunoassays adopting a large chip area of approximately 1-mm scale and a confocal beam point of ca. 1-μm size. Alternative methods such as a sampled spot scan with high confocality and a large-area scan with enlarged field of view and low confocality have been utilized to minimize the measurement time in practice. In this study, we analyzed the two methods in terms of signal-to-noise ratio and sampling-led signal fluctuations to obtain insights into a fast and reliable readout strategy. On this basis, we proposed a methodology for fast and reliable quantitative measurement of the whole chip area. The proposed method adopted a raster scan covering a full area of 100 μm × 100 μm as a proof-of-concept experiment while accumulating signals in the CCD detector for a single spectrum per frame. One single scan of 10 s over the 100 μm × 100 μm area yielded much higher sensitivity than sampled spot scanning measurements, with none of the signal fluctuations attributed to a sampled spot scan. This readout method can serve as one of the key technologies that will bring quantitative multiplexed detection and analysis into practice.

  19. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy.

    PubMed

    Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine

    2015-10-27

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.(1) This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique.
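
    For orientation only, a thresholded capillary count can be sketched in a few lines. This is a hypothetical illustration, not the published algorithm; it assumes scikit-image is available and that capillaries appear dark against the background.

    ```python
    # Hypothetical sketch: Otsu threshold, small-object removal, then a
    # connected-component count as a crude capillary density proxy.
    from skimage import filters, measure, morphology

    def count_capillaries(gray, min_area=30):
        """Count connected capillary regions in a grayscale nailfold image."""
        mask = gray < filters.threshold_otsu(gray)              # dark structures
        mask = morphology.remove_small_objects(mask, min_size=min_area)
        labels = measure.label(mask)
        return int(labels.max())
    ```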

  20. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy

    PubMed Central

    Daskalakis, Constantine

    2015-01-01

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient’s microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.1 This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique. PMID:26554744

  1. Validation of an instrument to assess toddler feeding practices of Latino mothers.

    PubMed

    Chaidez, Virginia; Kaiser, Lucia L

    2011-08-01

    This paper describes qualitative and quantitative aspects of testing a 34-item Toddler-Feeding Questionnaire (TFQ), designed for use in Latino families, and the associations between feeding practices and toddler dietary outcomes. Qualitative methods included review by an expert panel for content validity and cognitive testing of the tool to assess face validity. Quantitative analyses included use of exploratory factor analysis for construct validity; Pearson's correlations for test-retest reliability; Cronbach's alpha (α) for internal reliability; and multivariate regression for investigating relationships between feeding practices and toddler diet and anthropometry. Interviews were conducted using a convenience sample of 94 Latino mother and toddler dyads obtained largely through the Supplemental Nutrition Program for Women, Infants and Children (WIC). Data collection included household characteristics, self-reported early-infant feeding practices, the toddler's dietary intake, and anthropometric measurements. Factor analysis suggests the TFQ contains three subscales: indulgent; authoritative; and environmental influences. The TFQ demonstrated acceptable reliability for most measures. As hypothesized, indulgent practices in Latino toddlers were associated with increased energy consumption and higher intakes of total fat, saturated fat, and sweetened beverages. This tool may be useful in future research exploring the relationship of toddler feeding practices to nutritional outcomes in Latino families. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Computer-aided analysis with Image J for quantitatively assessing psoriatic lesion area.

    PubMed

    Sun, Z; Wang, Y; Ji, S; Wang, K; Zhao, Y

    2015-11-01

    Body surface area is important in determining the severity of psoriasis. However, an objective, reliable, and practical method is still needed for this purpose. We performed a computer image analysis (CIA) of the psoriatic area using the ImageJ freeware to determine whether this method could be used for objective evaluation of the psoriatic area. Fifteen psoriasis patients were randomized to be treated with adalimumab or placebo in a clinical trial. At each visit, the psoriasis area of each body site was estimated by two physicians (E-method), and standard photographs were taken. The psoriasis area in the pictures was assessed with CIA using semi-automatic threshold selection (T-method) or manual selection (M-method, gold standard). The results assessed by the three methods were analyzed, with reliability and affecting factors evaluated. Both the T- and E-methods correlated strongly with the M-method, with the T-method having a slightly stronger correlation. Both the T- and E-methods had good consistency between the evaluators. All three methods were able to detect the change in the psoriatic area after treatment, while the E-method tended to overestimate it. CIA with the ImageJ freeware is reliable and practicable for quantitatively assessing the lesional area of psoriasis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. User-Perceived Reliability of M-for-N (M: N) Shared Protection Systems

    NASA Astrophysics Data System (ADS)

    Ozaki, Hirokazu; Kara, Atsushi; Cheng, Zixue

    In this paper we investigate the reliability of general shared protection systems, i.e. M-for-N (M:N), that can typically be applied to various telecommunication network devices. We focus on the reliability perceived by an end user of one of the N units. We assume that any failed unit is instantly replaced by one of the M units (if available). We describe the effectiveness of such a protection system in a quantitative manner. The mathematical analysis gives the closed-form solution of the availability and the recursive computing algorithm of the MTTFF (Mean Time to First Failure) and the MTTF (Mean Time to Failure) perceived by an arbitrary end user. We also show that, under a certain condition, the probability distribution of the TTFF (Time to First Failure) can be approximated by a simple exponential distribution. The analysis provides useful information for the analysis and design of not only telecommunication network devices but also other general shared protection systems that are subject to service level agreements (SLAs) involving user-perceived reliability measures.
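
    The paper's closed-form results are not reproduced here, but a minimal birth-death sketch conveys the idea: track the number of failed units and report the steady-state probability that no spare is free, which is when a user-perceived outage can occur. All parameter values and the model simplifications (exponential failures and repairs, per-unit repair) are my own assumptions.

    ```python
    def no_spare_probability(N, M, lam, mu):
        """Steady-state probability that all M spares are in use, for N units
        in service, per-unit failure rate lam and per-unit repair rate mu."""
        up_rates = [N * lam if k < M else (N + M - k) * lam for k in range(N + M)]
        pi = [1.0]
        for k, rate in enumerate(up_rates):
            pi.append(pi[-1] * rate / ((k + 1) * mu))  # detailed balance
        total = sum(pi)
        return sum(p / total for p in pi[M:])

    print(round(no_spare_probability(N=8, M=2, lam=0.001, mu=0.05), 6))
    ```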

  4. Quantitative metal magnetic memory reliability modeling for welded joints

    NASA Astrophysics Data System (ADS)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to detect welded joints. However, load levels, the environmental magnetic field, and measurement noise make MMM data dispersive and complicate quantitative evaluation. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens were tested along longitudinal and horizontal lines with a TSC-2M-8 instrument in tensile fatigue experiments. X-ray testing was carried out synchronously to verify the MMM results. It was found that MMM testing can detect a hidden crack earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs was investigated, showing that K_vs obeys a Gaussian distribution. K_vs is therefore a suitable MMM parameter for establishing a reliability model of welded joints. Finally, an original quantitative MMM reliability model is presented based on improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases with decreasing residual life ratio T, and the maximal error between the predicted reliability degree R1 and the verified reliability degree R2 is 9.15%. The presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
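
    The classical stress-strength interference formula underlying such models has a closed form under Gaussian assumptions. The sketch below shows that textbook relation only (the paper uses an improved variant), with invented values.

    ```python
    from math import erf, sqrt

    def interference_reliability(mu_s, sig_s, mu_l, sig_l):
        """R = P(strength > stress) for independent Gaussian strength
        (mu_s, sig_s) and stress (mu_l, sig_l), via the standard normal CDF."""
        z = (mu_s - mu_l) / sqrt(sig_s ** 2 + sig_l ** 2)
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))

    # Invented values in MPa for a welded joint.
    print(round(interference_reliability(480.0, 30.0, 350.0, 40.0), 4))
    ```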

  5. Intra- and inter-observer reliability of quantitative analysis of the infra-patellar fat pad and comparison between fat- and non-fat-suppressed imaging--Data from the osteoarthritis initiative.

    PubMed

    Steidle-Kloc, E; Wirth, W; Ruhdorfer, A; Dannhauer, T; Eckstein, F

    2016-03-01

    The infra-patellar fat pad (IPFP), as intra-articular adipose tissue, represents a potential source of pro-inflammatory cytokines, and its size has been suggested to be associated with osteoarthritis (OA) of the knee. This study examines the inter- and intra-observer reliability of fat-suppressed (fs) and non-fat-suppressed (nfs) MR imaging for determination of IPFP morphological measurements as novel biomarkers. The IPFP of nine right knees of healthy Osteoarthritis Initiative participants was segmented by five readers, using fs and nfs baseline sagittal MRIs. The intra-observer reliability was determined from baseline and 1-year follow-up images. All segmentations were quality controlled (QC) by an expert reader. Reliability was expressed as the root mean square coefficient of variation (RMS CV%). After QC, the inter-observer reliability for fs (nfs) imaging was 2.0% (1.1%) for IPFP volume, 2.1%/2.5% (1.6%/1.8%) for anterior/posterior surface areas, 1.8% (1.8%) for depth, and 2.1% (2.4%) for maximum sagittal area. The intra-observer reliability was 3.1% (5.0%) for volume, 2.3%/2.8% (2.5%/2.9%) for anterior/posterior surfaces, 1.9% (3.5%) for depth, and 3.3% (4.5%) for maximum sagittal area. IPFP volume from nfs images was systematically greater (+7.3%) than from fs images, but highly correlated (r=0.98). The results suggest that quantitative measurements of IPFP morphology can be performed with satisfactory reliability when expert QC is implemented. The IPFP is more clearly depicted in nfs images, and there is a small systematic offset versus analysis from fs images. However, the high linear relationship between fs and nfs imaging suggests that fs images can be used to analyze IPFP morphology when nfs images are not available. Copyright © 2015 Elsevier GmbH. All rights reserved.
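
    Reliability expressed as RMS CV% follows directly from the standard definition; a small sketch of that computation, with invented readings:

    ```python
    import numpy as np

    def rms_cv_percent(measurements):
        """RMS coefficient of variation in percent for an array of shape
        (subjects, repeated_readings): per-subject CVs aggregated as an RMS."""
        x = np.asarray(measurements, dtype=float)
        cv = x.std(axis=1, ddof=1) / x.mean(axis=1)
        return 100.0 * np.sqrt(np.mean(cv ** 2))

    # Invented IPFP volumes (ml), each knee read twice.
    print(round(rms_cv_percent([[28.1, 28.6], [31.4, 30.9], [25.7, 26.2]]), 2))
    ```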

  6. Intra- and inter-observer reliability of quantitative analysis of the infra-patellar fat pad and comparison between fat- and non-fat-suppressed imaging—Data from the osteoarthritis initiative

    PubMed Central

    Steidle-Kloc, E.; Wirth, W.; Ruhdorfer, A.; Dannhauer, T.; Eckstein, F.

    2015-01-01

    The infra-patellar fat pad (IPFP), as intra-articular adipose tissue, represents a potential source of pro-inflammatory cytokines, and its size has been suggested to be associated with osteoarthritis (OA) of the knee. This study examines the inter- and intra-observer reliability of fat-suppressed (fs) and non-fat-suppressed (nfs) MR imaging for determination of IPFP morphological measurements as novel biomarkers. The IPFP of nine right knees of healthy Osteoarthritis Initiative participants was segmented by five readers, using fs and nfs baseline sagittal MRIs. The intra-observer reliability was determined from baseline and 1-year follow-up images. All segmentations were quality controlled (QC) by an expert reader. Reliability was expressed as the root mean square coefficient of variation (RMS CV%). After QC, the inter-observer reliability for fs (nfs) imaging was 2.0% (1.1%) for IPFP volume, 2.1%/2.5% (1.6%/1.8%) for anterior/posterior surface areas, 1.8% (1.8%) for depth, and 2.1% (2.4%) for maximum sagittal area. The intra-observer reliability was 3.1% (5.0%) for volume, 2.3%/2.8% (2.5%/2.9%) for anterior/posterior surfaces, 1.9% (3.5%) for depth, and 3.3% (4.5%) for maximum sagittal area. IPFP volume from nfs images was systematically greater (+7.3%) than from fs images, but highly correlated (r = 0.98). The results suggest that quantitative measurements of IPFP morphology can be performed with satisfactory reliability when expert QC is implemented. The IPFP is more clearly depicted in nfs images, and there is a small systematic offset versus analysis from fs images. However, the high linear relationship between fs and nfs imaging suggests that fs images can be used to analyze IPFP morphology when nfs images are not available. PMID:26569532

  7. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  8. Collaboration Indices for Monitoring Potential Problems in Online Small Groups

    ERIC Educational Resources Information Center

    Jahng, Namsook

    2013-01-01

    The purpose of this study is to test the validity and reliability of three collaboration indices ("quantity", "equality", "and shareness") proposed by Jahng et al. (2010). The present study repeated the quantitative assessment of Jahng et al., and performed a further qualitative analysis to identify possible factors…

  9. Quantitative analysis of Al-Si alloy using calibration free laser induced breakdown spectroscopy (CF-LIBS)

    NASA Astrophysics Data System (ADS)

    Shakeel, Hira; Haq, S. U.; Aisha, Ghulam; Nadeem, Ali

    2017-06-01

    The quantitative analysis of a standard aluminum-silicon alloy has been performed using calibration-free laser induced breakdown spectroscopy (CF-LIBS). The plasma was produced using the fundamental harmonic (1064 nm) of an Nd:YAG laser and the emission spectra were recorded at 3.5 μs detector gate delay. The qualitative analysis of the emission spectra confirms the presence of Mg, Al, Si, Ti, Mn, Fe, Ni, Cu, Zn, Sn, and Pb in the alloy. The background-subtracted and self-absorption-corrected emission spectra were used to estimate the plasma temperature as 10,100 ± 300 K. The plasma temperature and the self-absorption-corrected emission lines of each element were then used to determine the concentration of each species present in the alloy. The use of corrected emission intensities and an accurate evaluation of the plasma temperature yield a reliable quantitative analysis, with a maximum deviation of 2.2% from the reference sample concentrations.
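
    In CF-LIBS the plasma temperature is conventionally obtained from a Boltzmann plot, where the slope of ln(I·λ/(g·A)) against upper-level energy equals -1/(k_B·T). The sketch below shows that standard step only, not the authors' exact pipeline; inputs are arrays of line data the caller supplies.

    ```python
    import numpy as np

    K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

    def boltzmann_temperature(intensity, wavelength_nm, g_upper, A, E_upper_eV):
        """Fit ln(I * lambda / (g * A)) vs E_upper; T = -1 / (slope * k_B)."""
        y = np.log(np.asarray(intensity) * np.asarray(wavelength_nm)
                   / (np.asarray(g_upper) * np.asarray(A)))
        slope, _ = np.polyfit(np.asarray(E_upper_eV), y, 1)
        return -1.0 / (slope * K_B_EV)
    ```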

  10. Computer-Aided Diagnosis of Solid Breast Lesions Using an Ultrasonic Multi-Feature Analysis Procedure

    DTIC Science & Technology

    2011-01-01

    areas. We quantified morphometric features by geometric and fractal analysis of traced lesion boundaries. Although no single parameter can reliably...These include acoustic descriptors (“echogenicity,” “heterogeneity,” “shadowing”) and morphometric descriptors (“area,” “aspect ratio,” “border...quantitative descriptors; some morphometric features (such as border irregularity) also were particularly effective in lesion classification. Our

  11. Developing safety performance functions incorporating reliability-based risk measures.

    PubMed

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design, where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implications of deviating from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure, the probability of non-compliance (P(nc)), in safety performance functions (SPFs). Establishing this link will allow reliability-based design to be admitted into traditional benefit-cost analysis and should lead to wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two negative binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measure. It was found that models incorporating P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F), and property damage only (PDO) collisions. Copyright © 2011 Elsevier Ltd. All rights reserved.
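
    Incorporating P(nc) into an SPF amounts to adding one covariate to the usual negative binomial mean function. The coefficients below are invented for illustration only and are not the fitted values from the study.

    ```python
    import math

    def spf_mean_collisions(aadt, length_km, p_nc, b0=-7.2, b1=0.85, b2=1.4):
        """Hypothetical NB safety performance function: expected collision
        frequency exp(b0 + b1*ln(AADT) + b2*P_nc), scaled by segment length."""
        return length_km * math.exp(b0 + b1 * math.log(aadt) + b2 * p_nc)

    print(round(spf_mean_collisions(aadt=8000, length_km=2.0, p_nc=0.15), 3))
    ```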

  12. A Quantitative Analysis of Pulsed Signals Emitted by Wild Bottlenose Dolphins.

    PubMed

    Luís, Ana Rita; Couchinho, Miguel N; Dos Santos, Manuel E

    2016-01-01

    Common bottlenose dolphins (Tursiops truncatus) produce a wide variety of vocal emissions for communication and echolocation, of which the pulsed repertoire has been the most difficult to categorize. Packets of high-repetition, broadband pulses are still largely reported under the general designation of burst-pulses, and traditional attempts to classify these emissions rely mainly on their aural characteristics and on graphical aspects of spectrograms. Here, we present a quantitative analysis of pulsed signals emitted by wild bottlenose dolphins in the Sado estuary, Portugal (2011-2014), and test the reliability of a traditional classification approach. Acoustic parameters (minimum frequency, maximum frequency, peak frequency, duration, repetition rate and inter-click interval) were extracted from 930 pulsed signals previously categorized using a traditional approach. Discriminant function analysis revealed a high reliability of the traditional classification approach (93.5% of pulsed signals were consistently assigned to their aurally based categories). According to the discriminant function analysis (Wilks' Λ = 0.11, F(3, 2.41) = 282.75, P < 0.001), repetition rate is the feature that best enables the discrimination of different pulsed signals (structure coefficient = 0.98). Classification using hierarchical cluster analysis led to a similar categorization pattern: two main signal types with distinct magnitudes of repetition rate were clustered into five groups. The pulsed signals described here present significant differences in their time-frequency features, especially repetition rate (P < 0.001), inter-click interval (P < 0.001) and duration (P < 0.001). We document the occurrence of a distinct signal type, short burst-pulses, and highlight the existence of a diverse repertoire of pulsed vocalizations emitted in graded sequences. The use of quantitative analysis of pulsed signals is essential to improve classifications and to better assess the contexts of emission, geographic variation and the functional significance of pulsed signals.

  13. Test-retest reliability of computer-based video analysis of general movements in healthy term-born infants.

    PubMed

    Valle, Susanne Collier; Støen, Ragnhild; Sæther, Rannei; Jensenius, Alexander Refsum; Adde, Lars

    2015-10-01

    A computer-based video analysis has recently been presented for quantitative assessment of general movements (GMs). This method's test-retest reliability, however, has not yet been evaluated. The aim of the current study was to evaluate the test-retest reliability of computer-based video analysis of GMs, and to explore the association between computer-based video analysis and the temporal organization of fidgety movements (FMs). Test-retest reliability study. 75 healthy, term-born infants were recorded twice on the same day during the FMs period using a standardized video set-up. The computer-based movement variables "quantity of motion mean" (Qmean), "quantity of motion standard deviation" (QSD) and "centroid of motion standard deviation" (CSD) were analyzed, reflecting the amount of motion (Qmean, QSD) and the variability of the spatial center of motion of the infant (CSD). In addition, the association between the variable CSD and the temporal organization of FMs was explored. Intraclass correlation coefficients (ICC 1.1 and ICC 3.1) were calculated to assess test-retest reliability. The ICC values for the variables CSD, Qmean and QSD were 0.80, 0.80 and 0.86 for ICC (1.1), respectively, and 0.80, 0.86 and 0.90 for ICC (3.1), respectively. There were significantly lower CSD values in the recordings with continual FMs compared to the recordings with intermittent FMs (p<0.05). This study showed high test-retest reliability of computer-based video analysis of GMs, and a significant association between our computer-based video analysis and the temporal organization of FMs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
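
    ICC(3,1) as reported here follows the Shrout and Fleiss conventions; a compact sketch of that standard computation from a subjects-by-sessions matrix (my illustration, not the study's code):

    ```python
    import numpy as np

    def icc_3_1(scores):
        """Two-way mixed, single-measure consistency ICC(3,1);
        scores has shape (subjects, sessions)."""
        x = np.asarray(scores, dtype=float)
        n, k = x.shape
        grand = x.mean()
        bms = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between-subject MS
        resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0) + grand
        ems = (resid ** 2).sum() / ((n - 1) * (k - 1))              # residual MS
        return (bms - ems) / (bms + (k - 1) * ems)

    # Invented test-retest values for four infants.
    print(round(icc_3_1([[4.1, 4.3], [5.0, 4.8], [3.2, 3.4], [4.6, 4.7]]), 3))
    ```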

  14. Accurate reliability analysis method for quantum-dot cellular automata circuits

    NASA Astrophysics Data System (ADS)

    Cui, Huanqing; Cai, Li; Wang, Sen; Liu, Xiaoqiang; Yang, Xiaokuo

    2015-10-01

    The probabilistic transfer matrix (PTM) is a widely used model in circuit reliability research. However, the PTM model cannot reflect the impact of input signals on reliability, so it does not completely conform to the mechanism of the novel field-coupled nanoelectronic device known as quantum-dot cellular automata (QCA). It is therefore difficult to obtain accurate results when the PTM model is used to analyze the reliability of QCA circuits. To solve this problem, we present fault tree models of fundamental QCA devices according to different input signals. After that, a binary decision diagram (BDD) is used to quantitatively investigate the reliability of two QCA XOR gates based on the presented models. By employing the fault tree models, the impact of input signals on reliability can be identified clearly, and the crucial components of a circuit can be found precisely based on the importance values (IVs) of components. This method thus contributes to the construction of reliable QCA circuits.

  15. Reference gene identification for reliable normalisation of quantitative RT-PCR data in Setaria viridis.

    PubMed

    Nguyen, Duc Quan; Eamens, Andrew L; Grof, Christopher P L

    2018-01-01

    Quantitative real-time polymerase chain reaction (RT-qPCR) is the key platform for the quantitative analysis of gene expression in a wide range of experimental systems and conditions. However, the accuracy and reproducibility of gene expression quantification via RT-qPCR is entirely dependent on the identification of reliable reference genes for data normalisation. Green foxtail (Setaria viridis) has recently been proposed as a potential experimental model for the study of C4 photosynthesis and is closely related to many economically important crop species of the Panicoideae subfamily of grasses, including Zea mays (maize), Sorghum bicolor (sorghum) and Saccharum officinarum (sugarcane). Setaria viridis (Accession 10) possesses a number of key traits as an experimental model, namely: (i) a small, sequenced and well annotated genome; (ii) short stature and generation time; (iii) prolific seed production; and (iv) amenability to Agrobacterium tumefaciens-mediated transformation. There is currently, however, a lack of reference gene expression information for Setaria viridis (S. viridis). We therefore aimed to identify a cohort of suitable S. viridis reference genes for accurate and reliable normalisation of S. viridis RT-qPCR expression data. Eleven putative candidate reference genes were identified and examined across thirteen different S. viridis tissues. Of these, the geNorm and NormFinder analysis software identified SERINE/THREONINE-PROTEIN PHOSPHATASE 2A (PP2A), 5'-ADENYLYLSULFATE REDUCTASE 6 (ASPR6) and DUAL SPECIFICITY PHOSPHATASE (DUSP) as the most suitable combination of reference genes for the accurate and reliable normalisation of S. viridis RT-qPCR expression data. To demonstrate the suitability of the three selected reference genes, PP2A, ASPR6 and DUSP were used to normalise the expression of CINNAMYL ALCOHOL DEHYDROGENASE (CAD) genes across the same tissues. This approach readily demonstrated the suitability of the three selected reference genes for the accurate and reliable normalisation of S. viridis RT-qPCR expression data. Further, the work reported here forms a highly useful platform for future gene expression quantification in S. viridis and is potentially directly translatable to other closely related and agronomically important C4 crop species.
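
    Normalizing a target gene to several reference genes is usually done against the geometric mean of their quantities, which in Ct space reduces to an arithmetic mean of Ct values. A hedged sketch of that common scheme, assuming ideal two-fold amplification efficiency and invented Ct values:

    ```python
    import numpy as np

    def relative_expression(ct_target, ct_reference_genes, efficiency=2.0):
        """Target expression normalized to the geometric mean of several
        reference genes. Averaging Ct values equals taking the geometric
        mean of the corresponding quantities (quantity ~ E ** -Ct)."""
        ct_ref = np.mean(ct_reference_genes)
        return efficiency ** (ct_ref - ct_target)

    # Invented Cts: a CAD target vs the PP2A, ASPR6 and DUSP references.
    print(round(relative_expression(24.8, [21.3, 22.1, 21.7]), 3))
    ```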

  16. Meta-analysis is not an exact science: Call for guidance on quantitative synthesis decisions.

    PubMed

    Haddaway, Neal R; Rytwinski, Trina

    2018-05-01

    Meta-analysis is becoming increasingly popular in the field of ecology and environmental management. It increases the effective power of analyses relative to single studies, and allows researchers to investigate effect modifiers and sources of heterogeneity that could not be easily examined within single studies. Many systematic reviewers will set out to conduct a meta-analysis as part of their synthesis, but meta-analysis requires a niche set of skills that are not widely held by the environmental research community. Each step in the process of carrying out a meta-analysis requires decisions that have both scientific and statistical implications. Reviewers are likely to be faced with a plethora of decisions over which effect size to choose, how to calculate variances, and how to build statistical models. Some of these decisions may be simple based on the appropriateness of the options. At other times, reviewers must choose between equally valid approaches given the information available to them. This presents a significant problem when reviewers are attempting to conduct a reliable synthesis, such as a systematic review, where subjectivity is minimised and all decisions are documented and justified transparently. We propose three urgent, necessary developments within the evidence synthesis community. Firstly, we call on quantitative synthesis experts to improve guidance on how to prepare data for quantitative synthesis, providing explicit detail to support systematic reviewers. Secondly, we call on journal editors and evidence synthesis coordinating bodies (e.g. CEE) to ensure that quantitative synthesis methods are adequately reported in a transparent and repeatable manner in published systematic reviews. Finally, where faced with two or more broadly equally valid alternative methods or actions, reviewers should conduct multiple analyses, presenting all options, and discussing the implications of the different analytical approaches. We believe it is vital to tackle the possible subjectivity in quantitative synthesis described herein to ensure that the extensive efforts expended in producing systematic reviews and other evidence synthesis products are not wasted because of a lack of rigour or reliability in the final synthesis step. Copyright © 2018 Elsevier Ltd. All rights reserved.
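
    One of the effect-size decisions discussed above is the standardized mean difference; a minimal sketch of Hedges' g with its small-sample correction (the standard formula, not tied to any particular review; the inputs are invented group summaries):

    ```python
    from math import sqrt

    def hedges_g(m1, sd1, n1, m2, sd2, n2):
        """Standardized mean difference with Hedges' small-sample correction."""
        sp = sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
        d = (m1 - m2) / sp
        return (1 - 3 / (4 * (n1 + n2) - 9)) * d  # correction factor J

    print(round(hedges_g(5.2, 1.1, 24, 4.6, 1.3, 22), 3))
    ```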

  17. High performance liquid chromatographic assay for the quantitation of total glutathione in plasma

    NASA Technical Reports Server (NTRS)

    Abukhalaf, Imad K.; Silvestrov, Natalia A.; Menter, Julian M.; von Deutsch, Daniel A.; Bayorh, Mohamed A.; Socci, Robin R.; Ganafa, Agaba A.

    2002-01-01

    A simple and widely used homocysteine HPLC procedure was applied to the HPLC identification and quantitation of glutathione in plasma. The method, which utilizes SBDF as a derivatizing agent, requires only 50 microl of sample volume. A linear quantitative response curve was generated for glutathione over a concentration range of 0.3125-62.50 micromol/l. Linear regression analysis of the standard curve exhibited a correlation coefficient of 0.999. Limit of detection (LOD) and limit of quantitation (LOQ) values were 5.0 and 15 pmol, respectively. Glutathione recovery using this method was nearly complete (above 96%). Intra-assay and inter-assay precision studies reflected a high level of reliability and reproducibility of the method. The applicability of the method for the quantitation of glutathione was demonstrated successfully using human and rat plasma samples.
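
    The calibration statistics reported (correlation coefficient, LOD, LOQ) come from an ordinary least-squares standard curve. The sketch below uses the common 3.3σ/slope and 10σ/slope estimates; the peak areas are invented, and only the concentration points match the stated range.

    ```python
    import numpy as np

    conc = np.array([0.3125, 0.625, 1.25, 2.5, 5.0, 10.0])  # micromol/l
    area = np.array([0.41, 0.83, 1.64, 3.30, 6.55, 13.20])  # invented signals

    slope, intercept = np.polyfit(conc, area, 1)
    resid_sd = np.std(area - (slope * conc + intercept), ddof=2)
    r = np.corrcoef(conc, area)[0, 1]

    lod = 3.3 * resid_sd / slope   # ICH-style limit of detection
    loq = 10.0 * resid_sd / slope  # limit of quantitation
    print(f"r={r:.4f}  LOD={lod:.3f}  LOQ={loq:.3f} micromol/l")
    ```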

  18. The long-term reliability of static and dynamic quantitative sensory testing in healthy individuals.

    PubMed

    Marcuzzi, Anna; Wrigley, Paul J; Dean, Catherine M; Adams, Roger; Hush, Julia M

    2017-07-01

    Quantitative sensory tests (QSTs) have been increasingly used to investigate alterations in somatosensory function in a wide range of painful conditions. The interpretation of these findings is based on the assumption that the measures are stable and reproducible. To date, the reliability of QST has been investigated only for short test-retest intervals. The aim of this study was to investigate the long-term reliability of a multimodal QST assessment in healthy people, with testing conducted on 3 occasions over 4 months. Forty-two healthy people were enrolled in the study. Static and dynamic tests were performed, including cold and heat pain thresholds (CPT, HPT), mechanical wind-up [wind-up ratio (WUR)], pressure pain threshold (PPT), 2-point discrimination (TPD), and conditioned pain modulation (CPM). Systematic bias, relative reliability, and agreement were analysed using repeated-measures analysis of variance, intraclass correlation coefficients (ICCs3,1), and the standard error of measurement (SEM), respectively. Static QST (CPT, HPT, PPT, and TPD) showed good-to-excellent reliability (ICCs: 0.68-0.90). Dynamic QST (WUR and CPM) showed poor-to-good reliability (ICCs: 0.35-0.61). A significant linear decrease over time was observed for mechanical QST at the back (PPT and TPD) and for CPM (P < 0.01). Static QST measures were stable over a period of 4 months; however, a small systematic decrease over time was observed for mechanical QST. Dynamic QST showed considerable variability over time; in particular, CPM using PPT as the test stimulus did not show adequate reliability, suggesting that this test paradigm may be less useful for monitoring individuals over time.

  19. A systematic review of reliability and objective criterion-related validity of physical activity questionnaires.

    PubMed

    Helmerhorst, Hendrik J F; Brage, Søren; Warren, Janet; Besson, Herve; Ekelund, Ulf

    2012-08-31

    Physical inactivity is one of the four leading risk factors for global mortality. Accurate measurement of physical activity (PA) and in particular by physical activity questionnaires (PAQs) remains a challenge. The aim of this paper is to provide an updated systematic review of the reliability and validity characteristics of existing and more recently developed PAQs and to quantitatively compare the performance between existing and newly developed PAQs. A literature search of electronic databases was performed for studies assessing reliability and validity data of PAQs using an objective criterion measurement of PA between January 1997 and December 2011. Articles meeting the inclusion criteria were screened and data were extracted to provide a systematic overview of measurement properties. Due to differences in reported outcomes and criterion methods a quantitative meta-analysis was not possible. In total, 31 studies testing 34 newly developed PAQs, and 65 studies examining 96 existing PAQs were included. Very few PAQs showed good results on both reliability and validity. Median reliability correlation coefficients were 0.62-0.71 for existing, and 0.74-0.76 for new PAQs. Median validity coefficients ranged from 0.30-0.39 for existing, and from 0.25-0.41 for new PAQs. Although the majority of PAQs appear to have acceptable reliability, the validity is moderate at best. Newly developed PAQs do not appear to perform substantially better than existing PAQs in terms of reliability and validity. Future PAQ studies should include measures of absolute validity and the error structure of the instrument.

  20. A systematic review of reliability and objective criterion-related validity of physical activity questionnaires

    PubMed Central

    2012-01-01

    Physical inactivity is one of the four leading risk factors for global mortality. Accurate measurement of physical activity (PA) and in particular by physical activity questionnaires (PAQs) remains a challenge. The aim of this paper is to provide an updated systematic review of the reliability and validity characteristics of existing and more recently developed PAQs and to quantitatively compare the performance between existing and newly developed PAQs. A literature search of electronic databases was performed for studies assessing reliability and validity data of PAQs using an objective criterion measurement of PA between January 1997 and December 2011. Articles meeting the inclusion criteria were screened and data were extracted to provide a systematic overview of measurement properties. Due to differences in reported outcomes and criterion methods a quantitative meta-analysis was not possible. In total, 31 studies testing 34 newly developed PAQs, and 65 studies examining 96 existing PAQs were included. Very few PAQs showed good results on both reliability and validity. Median reliability correlation coefficients were 0.62–0.71 for existing, and 0.74–0.76 for new PAQs. Median validity coefficients ranged from 0.30–0.39 for existing, and from 0.25–0.41 for new PAQs. Although the majority of PAQs appear to have acceptable reliability, the validity is moderate at best. Newly developed PAQs do not appear to perform substantially better than existing PAQs in terms of reliability and validity. Future PAQ studies should include measures of absolute validity and the error structure of the instrument. PMID:22938557

  1. Metrological approach to quantitative analysis of clinical samples by LA-ICP-MS: A critical review of recent studies.

    PubMed

    Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta

    2018-05-15

    Analysis of clinical specimens by imaging techniques makes it possible to determine the content and distribution of trace elements on the surface of the examined sample. In order to obtain reliable results, the developed procedure should rest not only on properly prepared samples and a properly performed calibration. It is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, establishing the traceability of the measurement results, and the estimation of uncertainty. This review paper discusses aspects related to sampling, preparation and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result and the uncertainty of the result. This work promotes the introduction of metrology principles into chemical measurement, with emphasis on LA-ICP-MS, which is a comparative method that requires a rigorous approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method can accommodate various metazoan phenotypes with performance comparable to that of the methods used in single-cell growth studies. Compared with qualitative observations, this quantitative epistasis method enabled the detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in the DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
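
    Under the common multiplicative model used in single-cell epistasis work, an interaction score compares the double-perturbation phenotype with the product of the singles. A minimal sketch of that standard definition follows; the paper's statistics are more elaborate, and the values below are invented.

    ```python
    def epistasis_score(w_xy, w_x, w_y, w_wt=1.0):
        """Multiplicative-model epistasis: eps = f(xy) - f(x) * f(y), with
        trait values normalized to wild type; eps != 0 suggests interaction."""
        f_xy, f_x, f_y = w_xy / w_wt, w_x / w_wt, w_y / w_wt
        return f_xy - f_x * f_y

    # Invented body-length ratios for single and double perturbations.
    print(round(epistasis_score(w_xy=0.55, w_x=0.90, w_y=0.80), 3))
    ```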

  3. Lift truck safety review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cadwallader, L.C.

    1997-03-01

    This report presents safety information about powered industrial trucks. The basic lift truck, the counterbalanced sit-down rider truck, is the primary focus of the report. Lift truck engineering is briefly described, then a hazard analysis is performed on the lift truck. Case histories and accident statistics are also given. Rules and regulations governing lift trucks, such as the US Occupational Safety and Health Administration laws and the Underwriters Laboratories standards, are discussed. Safety issues with lift trucks are reviewed, and lift truck safety and reliability are discussed. Some quantitative reliability values are given.

  4. Impact of Oriented Clay Particles on X-Ray Spectroscopy Analysis

    NASA Astrophysics Data System (ADS)

    Lim, A. J. M. S.; Syazwani, R. N.; Wijeyesekera, D. C.

    2016-07-01

    The mineralogy and microfabric of clayey soils are complex, which makes characterizing their engineering properties difficult. Soil micromechanics recognizes that the microstructure and mineralogy of clay have a significant influence on its engineering behaviour. To achieve a more reliable quantitative evaluation of clay mineralogy, a proper sample preparation technique for quantitative clay mineral analysis is necessary. This paper presents the quantitative elemental analysis and chemical characterization of oriented and randomly oriented clay particles using X-ray spectroscopy. Three different types of clay, namely marine clay, bentonite and kaolin, were studied. The oriented samples were prepared by dispersing the clay in water and letting it settle on porous ceramic tiles while a relatively weak suction was applied through a vacuum pump. Images from a scanning electron microscope (SEM) were also used to compare the particle orientation patterns produced by the two sample preparation techniques. The quantitative X-ray spectroscopy analysis showed that the oriented sampling method identified mineral deposits more accurately, because it produced better peak intensities on the spectrum and allowed more mineral content to be identified than randomly oriented samples.

  5. Quantitative analysis of sitagliptin using the (19)F-NMR method: a universal technique for fluorinated compound detection.

    PubMed

    Zhang, Fen-Fen; Jiang, Meng-Hong; Sun, Lin-Lin; Zheng, Feng; Dong, Lei; Shah, Vishva; Shen, Wen-Bin; Ding, Ya

    2015-01-07

    To expand the application scope of nuclear magnetic resonance (NMR) technology in the quantitative analysis of pharmaceutical ingredients, (19)F nuclear magnetic resonance ((19)F-NMR) spectroscopy has been employed as a simple, rapid, and reproducible approach for the detection of a fluorine-containing model drug, sitagliptin phosphate monohydrate (STG), with ciprofloxacin (Cipro) as the internal standard (IS). Influential factors impacting the accuracy and precision of the spectral data, including the relaxation delay time (d1) and the pulse angle, were systematically optimized. Method validation was carried out in terms of precision and intermediate precision, linearity, limit of detection (LOD) and limit of quantification (LOQ), robustness, and stability. To validate the reliability and feasibility of the (19)F-NMR technology in the quantitative analysis of pharmaceutical analytes, the assay result was compared with that of (1)H-NMR. The statistical F-test and Student's t-test at the 95% confidence level indicate that there is no significant difference between the two methods. Owing to the advantages of (19)F-NMR, such as higher resolution and suitability for biological samples, it can be used as a universal technology for the quantitative analysis of other fluorine-containing pharmaceuticals and analytes.
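
    The quantitation underlying internal-standard qNMR assays of this kind reduces to a ratio of peak integrals corrected for the number of contributing nuclei, the molar masses and the weighed masses. The sketch below implements that textbook relationship; all symbols and example values are hypothetical, not the paper's data.

    ```python
    def qnmr_purity(i_a, i_std, n_a, n_std, m_a, m_std, w_a, w_std, p_std):
        """Internal-standard qNMR assay:
        P_a = (I_a/I_std) * (N_std/N_a) * (M_a/M_std) * (W_std/W_a) * P_std
        I: integrated peak area, N: nuclei contributing to the peak,
        M: molar mass (g/mol), W: weighed mass (mg), P: purity (fraction)."""
        return (i_a / i_std) * (n_std / n_a) * (m_a / m_std) * (w_std / w_a) * p_std

    # Hypothetical single-fluorine analyte vs. a single-fluorine IS
    print(qnmr_purity(i_a=1.52, i_std=1.00, n_a=1, n_std=1,
                      m_a=523.3, m_std=331.3, w_a=25.1, w_std=10.3, p_std=0.998))
    ```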

  6. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine the small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques can measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the greatest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, following the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the steps of the method procedure with the greatest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  7. Global Relative Quantification with Liquid Chromatography–Matrix-assisted Laser Desorption Ionization Time-of-flight (LC-MALDI-TOF)—Cross–validation with LTQ-Orbitrap Proves Reliability and Reveals Complementary Ionization Preferences*

    PubMed Central

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-01-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC–electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments. PMID:23788530

  8. Global relative quantification with liquid chromatography-matrix-assisted laser desorption ionization time-of-flight (LC-MALDI-TOF)--cross-validation with LTQ-Orbitrap proves reliability and reveals complementary ionization preferences.

    PubMed

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-10-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC-electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments.

  9. Using multiple PCR and CE with chemiluminescence detection for simultaneous qualitative and quantitative analysis of genetically modified organism.

    PubMed

    Guo, Longhua; Qiu, Bin; Chi, Yuwu; Chen, Guonan

    2008-09-01

    In this paper, an ultrasensitive CE-CL detection system coupled with a novel double-on-column coaxial flow detection interface was developed for the detection of PCR products. A reliable procedure based on this system is demonstrated for the qualitative and quantitative analysis of genetically modified organisms; the detection of Roundup Ready Soy (RRS) samples is presented as an example. The promoter, terminator, functional gene and two reference genes of RRS were amplified simultaneously by multiplex PCR. The multiplex PCR products were then labeled with acridinium ester at the 5'-terminus through an amino modification and analyzed by the proposed CE-CL system. Reproducibility of analysis times and peak heights for the CE-CL analysis was better than 0.91 and 3.07% (RSD, n=15), respectively, over three consecutive days. It was shown that this method could accurately and qualitatively detect RRS standards and simulated samples. The quantitative results for RRS provided by this new method were confirmed by comparison with standard real-time quantitative PCR (RT-QPCR) using SYBR Green I dye, and the two methods showed good coherence. This approach demonstrates the possibility of accurate qualitative and quantitative detection of GM plants in a single run.

  10. A Quantitative Socio-hydrological Characterization of Water Security in Large-Scale Irrigation Systems

    NASA Astrophysics Data System (ADS)

    Siddiqi, A.; Muhammad, A.; Wescoat, J. L., Jr.

    2017-12-01

    Large-scale legacy canal systems, such as the irrigation infrastructure in the Indus Basin in Punjab, Pakistan, have primarily been conceived, constructed, and operated with a techno-centric approach. Emerging socio-hydrological approaches provide a new lens for studying such systems and potentially identifying fresh insights for addressing contemporary challenges of water security. In this work, using the partial definition of water security as "the reliable availability of an acceptable quantity and quality of water", supply reliability is construed as a partial measure of water security in irrigation systems. A set of metrics is used to quantitatively study the reliability of surface supply in the canal systems of Punjab, Pakistan, using an extensive dataset of 10-daily surface water deliveries over a decade (2007-2016) and high-frequency (10-minute) flow measurements over one year. The reliability quantification is based on a comparison of actual deliveries and entitlements, which are a combination of hydrological and social constructs. The socio-hydrological lens highlights critical issues of how flows are measured, monitored, perceived, and experienced from the perspectives of operators (government officials) and users (farmers). The analysis reveals varying levels of reliability (and by extension security) of supply when the data are examined across multiple temporal and spatial scales. The results shed new light on the evolution of water security (as partially measured by supply reliability) for surface irrigation in the Punjab province of Pakistan and demonstrate that "information security" (defined as the reliable availability of sufficiently detailed data) is vital for enabling water security. Forecasting and management (which are social processes) lead to differences between entitlements and actual deliveries, and there is significant potential to improve supply reliability through interventions in the social realm.
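
    As an illustration of the kind of metric described here, the sketch below computes supply reliability as the fraction of periods in which the actual delivery falls within a tolerance of the entitlement. This is one plausible formulation under stated assumptions, not necessarily the exact set of metrics used in the study.

    ```python
    import numpy as np

    def supply_reliability(deliveries, entitlements, tolerance=0.10):
        """Fraction of periods in which the delivery is within `tolerance`
        (relative) of the entitlement for that period."""
        deliveries = np.asarray(deliveries, dtype=float)
        entitlements = np.asarray(entitlements, dtype=float)
        ok = np.abs(deliveries - entitlements) <= tolerance * entitlements
        return ok.mean()

    # Illustrative 10-daily deliveries vs. entitlements (arbitrary units)
    print(supply_reliability([95, 80, 110, 60], [100, 100, 100, 100]))  # 0.5
    ```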

  11. Systematic review of methods for quantifying teamwork in the operating theatre

    PubMed Central

    Marshall, D.; Sykes, M.; McCulloch, P.; Shalhoub, J.; Maruthappu, M.

    2018-01-01

    Background Teamwork in the operating theatre is becoming increasingly recognized as a major factor in clinical outcomes. Many tools have been developed to measure teamwork. Most fall into two categories: self-assessment by theatre staff and assessment by observers. A critical and comparative analysis of the validity and reliability of these tools is lacking. Methods MEDLINE and Embase databases were searched following PRISMA guidelines. Content validity was assessed using measurements of inter-rater agreement, predictive validity and multisite reliability, and interobserver reliability using statistical measures of inter-rater agreement and reliability. Quantitative meta-analysis was deemed unsuitable. Results Forty-eight articles were selected for final inclusion; self-assessment tools were used in 18 and observational tools in 28, and there were two qualitative studies. Self-assessment of teamwork by profession varied with the profession of the assessor. The most robust self-assessment tool was the Safety Attitudes Questionnaire (SAQ), although this failed to demonstrate multisite reliability. The most robust observational tool was the Non-Technical Skills (NOTECHS) system, which demonstrated both test-retest reliability (P > 0.09) and interobserver reliability (Rwg = 0.96). Conclusion Self-assessment of teamwork by the theatre team was influenced by professional differences. Observational tools, when used by trained observers, circumvented this.

  12. A specialized plug-in software module for computer-aided quantitative measurement of medical images.

    PubMed

    Wang, Q; Zeng, Y J; Huo, P; Hu, J L; Zhang, J H

    2003-12-01

    This paper presents a specialized system for the quantitative measurement of medical images. Using Visual C++, we developed computer-aided software based on Image-Pro Plus (IPP), a software development platform. Once transferred to the hard disk of a computer by an MVPCI-V3A frame grabber, medical images can be automatically processed by our own IPP plug-in for immunohistochemical analysis, cytomorphological measurement and blood vessel segmentation. In 34 clinical studies, the system has shown high stability, reliability and ease of use.

  13. Determination of exposure multiples of human metabolites for MIST assessment in preclinical safety species without using reference standards or radiolabeled compounds.

    PubMed

    Ma, Shuguang; Li, Zhiling; Lee, Keun-Joong; Chowdhury, Swapan K

    2010-12-20

    A simple, reliable, and accurate method was developed for the quantitative assessment of metabolite coverage in preclinical safety species by mixing equal volumes of human plasma with blank plasma of the animal species and vice versa, followed by analysis using high-resolution full-scan accurate-mass spectrometry. This approach provided results comparable (within ±15%) to those obtained from regulated bioanalysis and did not require synthetic standards or radiolabeled compounds. In addition, both qualitative and quantitative data were obtained on all metabolites from a single LC-MS analysis; therefore, the coverage of any metabolite of interest can be obtained.

  14. Fine phenotyping of pod and seed traits in Arachis germplasm accessions using digital image analysis

    USDA-ARS?s Scientific Manuscript database

    Reliable and objective phenotyping of peanut pod and seed traits is important for cultivar selection and genetic mapping of yield components. To develop useful and efficient methods to quantitatively define peanut pod and seed traits, a group of peanut germplasm with high levels of phenotypic varia...

  15. Quantitation of permethylated N-glycans through multiple-reaction monitoring (MRM) LC-MS/MS.

    PubMed

    Zhou, Shiyue; Hu, Yunli; DeSantos-Garcia, Janie L; Mechref, Yehia

    2015-04-01

    The important biological roles of glycans and their implications in disease development and progression have created a demand for the development of sensitive quantitative glycomics methods. Quantitation of glycans existing at low abundance is still analytically challenging. In this study, an N-linked glycan quantitation method using multiple-reaction monitoring (MRM) on a triple quadrupole instrument was developed. The optimum normalized collision energy (CE) for N-glycans that are both sialylated and fucosylated was determined to be 30%, whereas it was found to be 35% for N-glycans that are either fucosylated or sialylated. The optimum CE for mannose- and complex-type N-glycans was determined to be 35%. Additionally, the use of three transitions was shown to facilitate reliable quantitation. A total of 88 N-glycan compositions in human blood serum were quantified using this MRM approach. Reliable detection and quantitation of these glycans were achieved when the equivalent of 0.005 μL of blood serum was analyzed. Accordingly, N-glycans can be reliably quantified from a hundredth of a μL of pooled human blood serum, spanning a dynamic concentration range of three orders of magnitude. MRM was also effectively utilized to quantitatively compare the expression of N-glycans derived from brain-targeting breast carcinoma cells (MDA-MB-231BR) and metastatic breast cancer cells (MDA-MB-231). Thus, the described MRM method for permethylated N-glycans enables rapid and reliable identification and quantitation of glycans derived from glycoproteins purified or present in complex biological samples.
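
    A hedged sketch of transition-based quantitation: each glycan is quantified from the summed areas of its monitored transitions, relative to an internal standard of known amount. The function and the numbers are illustrative only; the paper's actual data processing is not reproduced here.

    ```python
    import numpy as np

    def mrm_quantity(transition_areas, is_transition_areas, is_amount):
        """Quantify one glycan from the summed areas of its MRM transitions
        relative to an internal standard (IS) of known amount (e.g. pmol)."""
        ratio = np.sum(transition_areas) / np.sum(is_transition_areas)
        return ratio * is_amount

    # Hypothetical areas for three transitions of one N-glycan and of the IS
    print(mrm_quantity([1.2e5, 8.4e4, 6.1e4], [2.0e5, 1.5e5, 1.1e5], 10.0))
    ```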

  16. Quantitation of Permethylated N-Glycans through Multiple-Reaction Monitoring (MRM) LC-MS/MS

    NASA Astrophysics Data System (ADS)

    Zhou, Shiyue; Hu, Yunli; DeSantos-Garcia, Janie L.; Mechref, Yehia

    2015-04-01

    The important biological roles of glycans and their implications in disease development and progression have created a demand for the development of sensitive quantitative glycomics methods. Quantitation of glycans existing at low abundance is still analytically challenging. In this study, an N-linked glycan quantitation method using multiple-reaction monitoring (MRM) on a triple quadrupole instrument was developed. The optimum normalized collision energy (CE) for N-glycans that are both sialylated and fucosylated was determined to be 30%, whereas it was found to be 35% for N-glycans that are either fucosylated or sialylated. The optimum CE for mannose- and complex-type N-glycans was determined to be 35%. Additionally, the use of three transitions was shown to facilitate reliable quantitation. A total of 88 N-glycan compositions in human blood serum were quantified using this MRM approach. Reliable detection and quantitation of these glycans were achieved when the equivalent of 0.005 μL of blood serum was analyzed. Accordingly, N-glycans can be reliably quantified from a hundredth of a μL of pooled human blood serum, spanning a dynamic concentration range of three orders of magnitude. MRM was also effectively utilized to quantitatively compare the expression of N-glycans derived from brain-targeting breast carcinoma cells (MDA-MB-231BR) and metastatic breast cancer cells (MDA-MB-231). Thus, the described MRM method for permethylated N-glycans enables rapid and reliable identification and quantitation of glycans derived from glycoproteins purified or present in complex biological samples.

  17. Measuring professional satisfaction in Greek nurses: combination of qualitative and quantitative investigation to evaluate the validity and reliability of the Index of Work Satisfaction.

    PubMed

    Karanikola, Maria N K; Papathanassoglou, Elizabeth D E

    2015-02-01

    The Index of Work Satisfaction (IWS) is a comprehensive scale assessing nurses' professional satisfaction. The aim of the present study was to explore: a) the applicability, reliability and validity of the Greek version of the IWS and b) contrasts among the factors addressed by IWS against the main themes emerging from a qualitative phenomenological investigation of nurses' professional experiences. A descriptive correlational design was applied using a sample of 246 emergency and critical care nurses. Internal consistency and test-retest reliability were tested. Construct and content validity were assessed by factor analysis, and through qualitative phenomenological analysis with a purposive sample of 12 nurses. Scale factors were contrasted to qualitative themes to assure that IWS embraces all aspects of Greek nurses' professional satisfaction. The internal consistency (α = 0.81) and test-retest (tau = 1, p < 0.0001) reliability were adequate. Following appropriate modifications, factor analysis confirmed the construct validity of the scale and subscales. The qualitative data partially clarified the low reliability of one subscale. The Greek version of the IWS scale is supported for use in acute care. The mixed methods approach constitutes a powerful tool for transferring scales to different cultures and healthcare systems. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Developing the Quantitative Histopathology Image Ontology (QHIO): A case study using the hot spot detection problem.

    PubMed

    Gurcan, Metin N; Tomaszewski, John; Overton, James A; Doyle, Scott; Ruttenberg, Alan; Smith, Barry

    2017-02-01

    Interoperability across data sets is a key challenge for quantitative histopathological imaging. There is a need for an ontology that can support effective merging of pathological image data with associated clinical and demographic data. To foster organized, cross-disciplinary, information-driven collaborations in the pathological imaging field, we propose to develop an ontology to represent imaging data and methods used in pathological imaging and analysis, and call it Quantitative Histopathological Imaging Ontology - QHIO. We apply QHIO to breast cancer hot-spot detection with the goal of enhancing reliability of detection by promoting the sharing of data between image analysts. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Graph Theoretical Analysis of Functional Brain Networks: Test-Retest Evaluation on Short- and Long-Term Resting-State Functional MRI Data

    PubMed Central

    Wang, Jin-Hui; Zuo, Xi-Nian; Gohel, Suril; Milham, Michael P.; Biswal, Bharat B.; He, Yong

    2011-01-01

    Graph-based computational network analysis has proven a powerful tool to quantitatively characterize functional architectures of the brain. However, the test-retest (TRT) reliability of graph metrics of functional networks has not been systematically examined. Here, we investigated the TRT reliability of topological metrics of functional brain networks derived from resting-state functional magnetic resonance imaging data. Specifically, we evaluated both short-term (<1 hour apart) and long-term (>5 months apart) TRT reliability for 12 global and 6 local nodal network metrics. We found that the reliability of global network metrics was overall low, threshold-sensitive and dependent on several factors: scanning time interval (TI, long-term > short-term), network membership (NM, networks excluding negative correlations > networks including negative correlations) and network type (NT, binarized networks > weighted networks). This dependence was modulated by a further factor, the node definition (ND) strategy. Local nodal reliability exhibited large variability across nodal metrics and a spatially heterogeneous distribution. Nodal degree was the most reliable metric and varied the least across the factors above. Hub regions in association and limbic/paralimbic cortices showed moderate TRT reliability. Importantly, nodal reliability was robust to the four factors mentioned above. Simulation analysis revealed that global network metrics were extremely sensitive (though to varying degrees) to noise in functional connectivity, and that weighted networks generated numerically more reliable results compared with binarized networks. Nodal network metrics showed high resistance to noise in functional connectivity, and no NT-related differences in resistance were found. These findings provide important implications for choosing reliable analytical schemes and network metrics of interest. PMID:21818285
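
    Test-retest reliability of a network metric is conventionally summarized with an intraclass correlation coefficient. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single measurement) from its ANOVA decomposition; the data are synthetic, and this is a generic illustration rather than the paper's exact pipeline.

    ```python
    import numpy as np

    def icc_2_1(x):
        """ICC(2,1) for an (n_subjects, k_sessions) matrix of one metric."""
        x = np.asarray(x, dtype=float)
        n, k = x.shape
        grand = x.mean()
        ms_r = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)  # subjects
        ms_c = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)  # sessions
        resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0) + grand
        ms_e = np.sum(resid ** 2) / ((n - 1) * (k - 1))
        return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

    # Synthetic nodal degree for 5 subjects scanned in 2 sessions
    deg = np.array([[14, 15], [20, 19], [11, 12], [17, 18], [9, 10]])
    print(round(icc_2_1(deg), 3))  # ~0.97: highly reliable in this toy case
    ```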

  20. An image analysis system for near-infrared (NIR) fluorescence lymph imaging

    NASA Astrophysics Data System (ADS)

    Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.

    2011-03-01

    Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of an NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software is underdeveloped, requiring extensive time and effort to analyze an NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove motion artifacts, an image representation named the flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into one system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.

  1. A novel multi-walled carbon nanotube-based antibody conjugate for quantitative and semi-quantitative lateral flow assays.

    PubMed

    Sun, Wenjuan; Hu, Xiaolong; Liu, Jia; Zhang, Yurong; Lu, Jianzhong; Zeng, Libo

    2017-10-01

    In this study, multi-walled carbon nanotubes (MWCNTs) were applied in lateral flow strips (LFS) for semi-quantitative and quantitative assays. Firstly, the solubility of the MWCNTs was improved using various surfactants to enhance their biocompatibility for practical application. The dispersed MWCNTs were conjugated with the methamphetamine (MET) antibody in a non-covalent manner and then manufactured into LFS for the quantitative detection of MET. The MWCNT-based lateral flow assay (MWCNTs-LFA) exhibited an excellent linear relationship between the test-line signal and the MET concentration in the range of 62.5 to 1500 ng/mL. The sensitivity of the LFS was evaluated by conjugating MWCNTs with the HCG antibody, and the MWCNT-based conjugate proved 10 times more sensitive than one based on classical colloidal gold nanoparticles. Taken together, our data demonstrate that MWCNTs-LFA is a more sensitive and reliable assay for semi-quantitative and quantitative detection, which can be used in forensic analysis.

  2. Development and psychometric evaluation of a quantitative measure of "fat talk".

    PubMed

    MacDonald Clarke, Paige; Murnen, Sarah K; Smolak, Linda

    2010-01-01

    Based on her anthropological research, Nichter (2000) concluded that it is normative for many American girls to engage in body self-disparagement in the form of "fat talk." The purpose of the present two studies was to develop a quantitative measure of fat talk. A series of 17 scenarios was created in which "Naomi" is talking with one or more female friends and expresses fat talk. College women respondents rated the frequency with which they would behave in a similar way to the women in each scenario. A nine-item, one-factor scale was derived through principal components analysis, and its scores yielded evidence of internal consistency reliability, test-retest reliability over a five-week period, construct validity, discriminant validity, and incremental validity, in that it predicted unique variance in body shame and eating disorder symptoms above and beyond other measures of self-objectification. Copyright 2009 Elsevier Ltd. All rights reserved.

  3. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be considered to find the optimum scheme. To address this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and combines engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Using a practical case, the paper also presents the detailed computing process step by step: selecting the indexes of each order, establishing the index matrix, computing the score values of all indexes, computing the synthesis score, ranking the candidate schemes, and carrying out the analysis and decision. The presented method offers a valuable reference for risk computation in building construction projects.
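
    For readers unfamiliar with the information entropy ingredient, the sketch below shows the generic entropy weight method, which derives index weights from the dispersion of a normalized index matrix; the matrix values are invented, and the paper's own scoring details are not reproduced.

    ```python
    import numpy as np

    def entropy_weights(X):
        """Entropy weight method for an (m alternatives x n indexes) matrix
        of positive, benefit-type scores. Returns one weight per index."""
        X = np.asarray(X, dtype=float)
        m = X.shape[0]
        P = X / X.sum(axis=0)                          # column proportions
        E = -(P * np.log(P)).sum(axis=0) / np.log(m)   # entropy per index
        d = 1.0 - E                                    # diversification degree
        return d / d.sum()

    # Four schemes scored on cost, progress, quality, safety (illustrative)
    X = np.array([[0.8, 0.6, 0.9, 0.7],
                  [0.6, 0.9, 0.7, 0.8],
                  [0.9, 0.7, 0.6, 0.9],
                  [0.7, 0.8, 0.8, 0.6]])
    print(entropy_weights(X).round(3))
    ```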

  4. Recent advances in computational structural reliability analysis methods

    NASA Astrophysics Data System (ADS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given that many different modes of failure are usually possible, achieving this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by applying safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of it directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
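
    As a minimal numerical illustration of the probabilistic notion of reliability described above, the sketch below estimates the failure probability of a simple limit state g = R - S by plain Monte Carlo sampling; the normal distribution parameters are invented, and the advanced methods surveyed in the paper are not shown here.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Limit state g = R - S: failure whenever load effect S exceeds resistance R.
    n = 1_000_000
    R = rng.normal(120.0, 12.0, n)   # resistance (illustrative units)
    S = rng.normal(80.0, 15.0, n)    # load effect
    pf = np.mean(R <= S)             # Monte Carlo failure probability
    beta = -stats.norm.ppf(pf)       # equivalent reliability index
    print(f"pf ~ {pf:.4f}, beta ~ {beta:.2f}")
    ```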

  5. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given that many different modes of failure are usually possible, achieving this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by applying safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of it directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  6. Spatially Regularized Machine Learning for Task and Resting-state fMRI

    PubMed Central

    Song, Xiaomu; Panych, Lawrence P.; Chen, Nan-kuei

    2015-01-01

    Background Reliable mapping of brain function across sessions and/or subjects in task- and resting-state has been a critical challenge for quantitative fMRI studies although it has been intensively addressed in the past decades. New Method A spatially regularized support vector machine (SVM) technique was developed for the reliable brain mapping in task- and resting-state. Unlike most existing SVM-based brain mapping techniques, which implement supervised classifications of specific brain functional states or disorders, the proposed method performs a semi-supervised classification for the general brain function mapping where spatial correlation of fMRI is integrated into the SVM learning. The method can adapt to intra- and inter-subject variations induced by fMRI nonstationarity, and identify a true boundary between active and inactive voxels, or between functionally connected and unconnected voxels in a feature space. Results The method was evaluated using synthetic and experimental data at the individual and group level. Multiple features were evaluated in terms of their contributions to the spatially regularized SVM learning. Reliable mapping results in both task- and resting-state were obtained from individual subjects and at the group level. Comparison with Existing Methods A comparison study was performed with independent component analysis, general linear model, and correlation analysis methods. Experimental results indicate that the proposed method can provide a better or comparable mapping performance at the individual and group level. Conclusions The proposed method can provide accurate and reliable mapping of brain function in task- and resting-state, and is applicable to a variety of quantitative fMRI studies. PMID:26470627

  7. Comparison of 13C Nuclear Magnetic Resonance and Fourier Transform Infrared spectroscopy for estimating humification and aromatization of soil organic matter

    NASA Astrophysics Data System (ADS)

    Rogers, K.; Cooper, W. T.; Hodgkins, S. B.; Verbeke, B. A.; Chanton, J.

    2017-12-01

    Solid-state direct polarization 13C NMR spectroscopy (DP-NMR) is generally considered the most quantitatively reliable method for soil organic matter (SOM) characterization, including determination of the relative abundances of carbon functional groups. These functional abundances can then be used to calculate important soil parameters, such as the degree of humification and extent of aromaticity, that reveal differences in reactivity or compositional changes along gradients (e.g. a thaw chronosequence in permafrost). Unfortunately, the DP-NMR experiment is time-consuming, with a single sample often requiring over 24 hours of instrument time. Alternatively, solid-state cross polarization 13C NMR (CP-NMR) can circumvent this problem, reducing analysis times to 4-6 hours, but with some loss of quantitative reliability. Attenuated Total Reflectance Fourier Transform Infrared spectroscopy (ATR-FTIR) is a quick and relatively inexpensive method for characterizing solid materials, and has been suggested as an alternative to NMR for the analysis of soil organic matter and the determination of humification (HI) and aromatization (AI) indices. However, the quantitative reliability of ATR-FTIR for SOM analyses has never been verified, nor have any ATR-FTIR data been compared to similar measurements by NMR. In this work we focused on FTIR vibrational bands that correspond to the three functional groups used to calculate HI and AI values: carbohydrates (1030 cm-1), aromatics (1510, 1630 cm-1), and aliphatics (2850, 2920 cm-1). Data from ATR-FTIR measurements were compared to analogous quantitation by DP- and CP-NMR using peat samples from Sweden, Minnesota, and North Carolina. DP- and CP-NMR correlate very strongly, although the correlations are not always 1:1. Direct comparison of the relative abundances of the three functional groups determined by NMR and ATR-FTIR yielded satisfactory results for carbohydrates (r2 = 0.78) and aliphatics (r2 = 0.58), but less so for aromatics (r2 = 0.395). ATR-FTIR has to this point been used primarily for relative abundance analyses (e.g. calculating HI and AI values), but these results suggest FTIR can provide quantitative reliability approaching that of NMR.
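
    As a sketch of how such indices can be formed from the named bands, the snippet below takes simple band-intensity ratios: aromatic over carbohydrate for humification, and aromatic over aliphatic for aromatization. These particular ratio definitions are simplifying assumptions for illustration; the paper's exact formulas are not reproduced.

    ```python
    def humification_index(a1630, a1030):
        """Assumed HI: aromatic band (1630 cm-1) over carbohydrate band
        (1030 cm-1), using baseline-corrected absorbances."""
        return a1630 / a1030

    def aromatization_index(a1510, a2850, a2920):
        """Assumed AI: aromatic band (1510 cm-1) over the summed aliphatic
        bands (2850 + 2920 cm-1)."""
        return a1510 / (a2850 + a2920)

    # Hypothetical absorbances for one peat sample
    print(humification_index(0.42, 0.88))         # ~0.48
    print(aromatization_index(0.15, 0.21, 0.33))  # ~0.28
    ```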

  8. [Doppler echocardiography of tricuspid insufficiency. Methods of quantification].

    PubMed

    Loubeyre, C; Tribouilloy, C; Adam, M C; Mirode, A; Trojette, F; Lesbre, J P

    1994-01-01

    The evaluation of tricuspid incompetence has benefitted considerably from the development of Doppler ultrasound. In addition to direct analysis of the valves, which provides information about the mechanism involved, this method provides an accurate evaluation of severity, mainly through the Doppler mode. Alongside newer criteria still under evaluation (mainly the flow convergence zone of the regurgitant jet), several indices are recognised as good quantitative parameters: the extension of the regurgitant jet into the right atrium, the anterograde tricuspid flow, the laminar nature of the regurgitant flow, and the analysis of flow in the suprahepatic veins. The evaluation nevertheless remains semi-quantitative, since calculation of the regurgitant fraction from pulsed Doppler does not appear to be reliable. An accurate semi-quantitative evaluation is made possible by careful and consistent use of all the available criteria. The authors discuss the value of the various evaluation criteria mentioned in the literature and attempt to define a practical approach.

  9. Enabling More than Moore: Accelerated Reliability Testing and Risk Analysis for Advanced Electronics Packaging

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza; Evans, John W.

    2014-01-01

    For five decades, the semiconductor industry has distinguished itself by the rapid pace of improvement in the miniaturization of electronics products: Moore's Law. Now scaling is hitting a brick wall, forcing a paradigm shift. Industry roadmaps recognize this scaling limitation and project that packaging technologies will meet further miniaturization needs, a.k.a. "More than Moore". This paper presents packaging technology trends and the accelerated reliability testing methods currently being practiced. It then presents the industry status of key advanced electronic packages, the factors affecting the accelerated solder joint reliability of area array packages, and the IPC/JEDEC/Mil specifications for characterizing assemblies under accelerated thermal and mechanical loading. Finally, it presents an example demonstrating how accelerated testing and analysis have been effectively employed in the development of complex spacecraft, thereby reducing risk. Quantitative assessments necessarily involve the mathematics of probability and statistics. In addition, accelerated tests must be designed with the desired risk posture and schedule of the particular project in mind. Such assessments relieve risk without imposing additional costs and constraints that are not value-added for a particular mission. Furthermore, in the course of developing complex systems, variances and defects will inevitably present themselves and require a decision concerning their disposition, necessitating quantitative assessments. In summary, this paper presents a comprehensive viewpoint, from technology to systems, including the benefits and impact of accelerated testing in offsetting risk.

  10. Black phosphorus-assisted laser desorption ionization mass spectrometry for the determination of low-molecular-weight compounds in biofluids.

    PubMed

    He, Xiao-Mei; Ding, Jun; Yu, Lei; Hussain, Dilshad; Feng, Yu-Qi

    2016-09-01

    Quantitative analysis of small molecules by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) has been a challenging task due to matrix-derived interferences in the low m/z region and the poor reproducibility of MS signal response. In this study, we developed an approach applying black phosphorus (BP) as a matrix-assisted laser desorption ionization (MALDI) matrix for the quantitative analysis of small molecules for the first time. Black phosphorus-assisted laser desorption/ionization mass spectrometry (BP/ALDI-MS) showed a clear background and exhibited superior detection sensitivity toward quaternary ammonium compounds compared to carbon-based materials. By combining a stable isotope labeling (SIL) strategy with BP/ALDI-MS (SIL-BP/ALDI-MS), a variety of analytes labeled with a quaternary ammonium group were sensitively detected. Moreover, the isotope-labeled forms of the analytes also served as internal standards, which broadened the analyte coverage of BP/ALDI-MS and improved the reproducibility of the MS signals. Based on these advantages, a reliable method for the quantitative analysis of aldehydes in complex biological samples (saliva, urine, and serum) was successfully established. Good linearity was obtained for five aldehydes in the range of 0.1-20.0 μM, with correlation coefficients (R2) larger than 0.9928. The LODs were found to be 20 to 100 nM. The reproducibility of the method was confirmed by intra-day and inter-day relative standard deviations (RSDs) of less than 10.4%, and recoveries in saliva samples ranged from 91.4 to 117.1%. Taken together, the proposed SIL-BP/ALDI-MS strategy has proved to be a reliable tool for the quantitative analysis of aldehydes in complex samples. Graphical Abstract: An approach for the determination of small molecules was developed by using black phosphorus (BP) as a matrix-assisted laser desorption ionization (MALDI) matrix.
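
    The linearity and LOD figures reported here follow from an ordinary calibration exercise. The sketch below fits a line to hypothetical concentration-response data and applies the common 3.3*sigma/slope convention for the LOD; all numbers are invented.

    ```python
    import numpy as np

    # Hypothetical calibration of one aldehyde: concentration (uM) vs. the
    # isotope-ratio MS response; LOD from the 3.3*sigma/slope convention.
    conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 20.0])
    resp = np.array([0.021, 0.098, 0.205, 1.010, 2.030, 4.020])
    slope, intercept = np.polyfit(conc, resp, 1)
    resid = resp - (slope * conc + intercept)
    sigma = resid.std(ddof=2)                 # residual standard deviation
    lod = 3.3 * sigma / slope
    r2 = np.corrcoef(conc, resp)[0, 1] ** 2
    print(f"slope={slope:.4f}, R2={r2:.4f}, LOD ~ {lod * 1000:.0f} nM")
    ```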

  11. Genome-Wide Identification and Evaluation of Reference Genes for Quantitative RT-PCR Analysis during Tomato Fruit Development.

    PubMed

    Cheng, Yuan; Bian, Wuying; Pang, Xin; Yu, Jiahong; Ahammed, Golam J; Zhou, Guozhi; Wang, Rongqing; Ruan, Meiying; Li, Zhimiao; Ye, Qingjing; Yao, Zhuping; Yang, Yuejian; Wan, Hongjian

    2017-01-01

    Gene expression analysis in tomato fruit has drawn increasing attention. Quantitative real-time PCR (qPCR) is a routine technique for gene expression analysis, and the reliability of qPCR results largely depends on the choice of appropriate reference genes (RGs). Although tomato is a model for fruit biology, few RGs for qPCR analysis in tomato fruit have yet been developed. In this study, we initially identified the 38 most stably expressed genes based on a tomato transcriptome data set, and their expression stabilities were further determined in a set of tomato fruit samples from four developmental stages (immature, mature green, breaker, mature red) using qPCR analysis. Two statistical algorithms, geNorm and NormFinder, concordantly determined the suitability of these putative RGs. Notably, SlFRG05 (Solyc01g104170), SlFRG12 (Solyc04g009770), SlFRG16 (Solyc10g081190), SlFRG27 (Solyc06g007510), and SlFRG37 (Solyc11g005330) proved to be suitable RGs for tomato fruit development studies. Further analysis using geNorm indicates that the combined use of SlFRG03 (Solyc02g063070) and SlFRG27 would provide more reliable normalization in qPCR experiments. The RGs identified in this study will be beneficial for future qPCR analyses of tomato fruit development, as well as for identifying optimal normalization controls in other plant species.
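
    The geNorm stability measure M used here is, for each candidate gene, the average standard deviation of its pairwise log-expression ratios with every other candidate across all samples; the most stable genes have the lowest M. A generic sketch on synthetic expression values:

    ```python
    import numpy as np

    def genorm_m(expr):
        """geNorm M for an (n_samples, n_genes) matrix of relative
        expression quantities. Lower M indicates a more stable gene."""
        log_expr = np.log2(expr)
        n_genes = log_expr.shape[1]
        m = np.empty(n_genes)
        for j in range(n_genes):
            sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
                   for k in range(n_genes) if k != j]
            m[j] = np.mean(sds)
        return m

    # Synthetic expression of 3 candidate RGs over 12 fruit samples
    rng = np.random.default_rng(1)
    expr = rng.lognormal(mean=0.0, sigma=0.1, size=(12, 3))
    print(genorm_m(expr).round(3))
    ```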

  12. Genome-Wide Identification and Evaluation of Reference Genes for Quantitative RT-PCR Analysis during Tomato Fruit Development

    PubMed Central

    Cheng, Yuan; Bian, Wuying; Pang, Xin; Yu, Jiahong; Ahammed, Golam J.; Zhou, Guozhi; Wang, Rongqing; Ruan, Meiying; Li, Zhimiao; Ye, Qingjing; Yao, Zhuping; Yang, Yuejian; Wan, Hongjian

    2017-01-01

    Gene expression analysis in tomato fruit has drawn increasing attention. Quantitative real-time PCR (qPCR) is a routine technique for gene expression analysis, and the reliability of qPCR results largely depends on the choice of appropriate reference genes (RGs). Although tomato is a model for fruit biology, few RGs for qPCR analysis in tomato fruit have yet been developed. In this study, we initially identified the 38 most stably expressed genes based on a tomato transcriptome data set, and their expression stabilities were further determined in a set of tomato fruit samples from four developmental stages (immature, mature green, breaker, mature red) using qPCR analysis. Two statistical algorithms, geNorm and NormFinder, concordantly determined the suitability of these putative RGs. Notably, SlFRG05 (Solyc01g104170), SlFRG12 (Solyc04g009770), SlFRG16 (Solyc10g081190), SlFRG27 (Solyc06g007510), and SlFRG37 (Solyc11g005330) proved to be suitable RGs for tomato fruit development studies. Further analysis using geNorm indicates that the combined use of SlFRG03 (Solyc02g063070) and SlFRG27 would provide more reliable normalization in qPCR experiments. The RGs identified in this study will be beneficial for future qPCR analyses of tomato fruit development, as well as for identifying optimal normalization controls in other plant species. PMID:28900431

  13. 40 CFR 799.6756 - TSCA partition coefficient (n-octanol/water), generator column method.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... method, or any other reliable quantitative procedure must be used for those compounds that do not absorb... any other reliable quantitative method, aqueous solutions from the generator column enter a collecting... Solubilities and Octanol-Water Partition Coefficients of Hydrophobic Substances,” Journal of Research of the...

  14. 40 CFR 799.6756 - TSCA partition coefficient (n-octanol/water), generator column method.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... method, or any other reliable quantitative procedure must be used for those compounds that do not absorb... any other reliable quantitative method, aqueous solutions from the generator column enter a collecting... Solubilities and Octanol-Water Partition Coefficients of Hydrophobic Substances,” Journal of Research of the...

  15. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    PubMed

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
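
    The 'traditional' calibration the paper critiques amounts to fitting a standard curve on calibration standards and inverting it for unknowns. The sketch below shows that baseline procedure on synthetic data; the Bayesian mixed-model alternative advocated in the paper would additionally pool replicate assays and let precision vary with density.

    ```python
    import numpy as np

    # Synthetic standard curve: assay signal vs. log10 gametocyte density.
    log_density = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # log10 parasites/mL
    signal = np.array([5.1, 9.8, 15.2, 20.1, 24.8])     # assay readout
    slope, intercept = np.polyfit(log_density, signal, 1)

    # Inverse prediction for an unknown sample
    unknown_signal = 17.0
    est_density = 10 ** ((unknown_signal - intercept) / slope)
    print(f"estimated density ~ {est_density:.0f} parasites/mL")
    ```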

  16. Selection and Reporting of Statistical Methods to Assess Reliability of a Diagnostic Test: Conformity to Recommended Methods in a Peer-Reviewed Journal

    PubMed Central

    Park, Ji Eun; Han, Kyunghwa; Sung, Yu Sub; Chung, Mi Sun; Koo, Hyun Jung; Yoon, Hee Mang; Choi, Young Jun; Lee, Seung Soo; Kim, Kyung Won; Shin, Youngbin; An, Suah; Cho, Hyo-Min

    2017-01-01

    Objective To evaluate the frequency and adequacy of statistical analyses in a general radiology journal when reporting a reliability analysis for a diagnostic test. Materials and Methods Sixty-three studies of diagnostic test accuracy (DTA) and 36 studies reporting reliability analyses published in the Korean Journal of Radiology between 2012 and 2016 were analyzed. Studies were judged using the methodological guidelines of the Radiological Society of North America-Quantitative Imaging Biomarkers Alliance (RSNA-QIBA), and COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) initiative. DTA studies were evaluated by nine editorial board members of the journal. Reliability studies were evaluated by study reviewers experienced with reliability analysis. Results Thirty-one (49.2%) of the 63 DTA studies did not include a reliability analysis when deemed necessary. Among the 36 reliability studies, proper statistical methods were used in all (5/5) studies dealing with dichotomous/nominal data, 46.7% (7/15) of studies dealing with ordinal data, and 95.2% (20/21) of studies dealing with continuous data. Statistical methods were described in sufficient detail regarding weighted kappa in 28.6% (2/7) of studies and regarding the model and assumptions of intraclass correlation coefficient in 35.3% (6/17) and 29.4% (5/17) of studies, respectively. Reliability parameters were used as if they were agreement parameters in 23.1% (3/13) of studies. Reproducibility and repeatability were used incorrectly in 20% (3/15) of studies. Conclusion Greater attention to the importance of reporting reliability, thorough description of the related statistical methods, efforts not to neglect agreement parameters, and better use of relevant terminology is necessary. PMID:29089821
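
    As a pointer to one of the recommended statistics, the snippet below computes a quadratically weighted Cohen's kappa for two readers' ordinal ratings with scikit-learn; the ratings are synthetic, and, as the review stresses, the chosen weighting scheme should always be reported.

    ```python
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    # Two readers grade the same ten lesions on an ordinal 1-4 scale.
    # Quadratic weights penalize large disagreements more heavily.
    reader1 = np.array([1, 2, 2, 3, 3, 3, 4, 4, 2, 1])
    reader2 = np.array([1, 2, 3, 3, 3, 4, 4, 4, 2, 2])
    print(cohen_kappa_score(reader1, reader2, weights="quadratic"))
    ```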

  17. Multivariate calibration in Laser-Induced Breakdown Spectroscopy quantitative analysis: The dangers of a 'black box' approach and how to avoid them

    NASA Astrophysics Data System (ADS)

    Safi, A.; Campanella, B.; Grifoni, E.; Legnaioli, S.; Lorenzetti, G.; Pagnotta, S.; Poggialini, F.; Ripoll-Seguer, L.; Hidalgo, M.; Palleschi, V.

    2018-06-01

    The introduction of the multivariate calibration curve approach in Laser-Induced Breakdown Spectroscopy (LIBS) quantitative analysis has led to a general improvement in LIBS analytical performance, since a multivariate approach makes it possible to exploit the redundancy of elemental information that is typically present in a LIBS spectrum. Software packages implementing multivariate methods are available in the most widely used commercial and open-source analytical programs; in most cases, the multivariate algorithms are robust against noise and operate in unsupervised mode. The flip side of the availability and ease of use of such packages is the (perceived) difficulty of assessing the reliability of the results obtained, which often leads to the multivariate algorithms being treated as 'black boxes' whose inner mechanism is supposed to remain hidden from the user. In this paper, we discuss the dangers of a 'black box' approach in LIBS multivariate analysis and show how to overcome them using the chemical-physical knowledge that is at the base of any LIBS quantitative analysis.
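
    A small demonstration of opening the 'black box': fit a partial least squares (PLS) calibration, used here as a representative multivariate method rather than the paper's specific workflow, to simulated spectra, then inspect the model coefficients to check that the channels driving the prediction correspond to real emission lines.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    n_samples, n_channels = 40, 200
    conc = rng.uniform(0.0, 10.0, n_samples)          # element concentration
    spectra = rng.normal(0.0, 0.2, (n_samples, n_channels))
    spectra[:, 50] += conc                            # synthetic emission line
    spectra[:, 120] += 0.5 * conc                     # weaker second line

    pls = PLSRegression(n_components=2)
    pred = cross_val_predict(pls, spectra, conc, cv=5).ravel()
    print("RMSECV:", np.sqrt(np.mean((pred - conc) ** 2)))

    pls.fit(spectra, conc)
    # The largest coefficients should sit on channels 50 and 120
    print("top channels:", np.argsort(np.abs(pls.coef_.ravel()))[-2:])
    ```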

  18. Histomorphometric, fractal and lacunarity comparative analysis of sheep (Ovis aries), goat (Capra hircus) and roe deer (Capreolus capreolus) compact bone samples.

    PubMed

    Gudea, A I; Stefan, A C

    2013-08-01

    Quantitative and qualitative studies dealing with the histomorphometry of bone tissue are playing a new role in modern legal/forensic medicine and archaeozoology. This study deals with the differences found in the humerus and metapodial bones of recent sheep (Ovis aries), goat (Capra hircus) and roe deer (Capreolus capreolus) specimens, from a qualitative point of view but mainly from a quantitative perspective. A novel perspective is provided by fractal analysis performed on the digital histological images. The study shows that qualitative assessment may not be reliable owing to the close resemblance of the structures. From the quantitative perspective (several measurements performed on osteonal units, followed by statistical processing of the data), some of the measured elements show significant differences among the three species (the primary osteonal diameter, etc.). The fractal analysis and the lacunarity of the images show a great deal of potential, indicating that this type of analysis can be of great help in separating the material from this perspective.
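
    A minimal box-counting sketch of the fractal dimension used in such analyses: occupied boxes are counted at several scales, and the dimension is the slope of log(count) against log(1/size). The binary test image is synthetic; a real analysis would start from a segmented histological image.

    ```python
    import numpy as np

    def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
        """Box-counting dimension of a binary image (True = tissue pixels)."""
        counts = []
        for s in sizes:
            h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
            blocks = img[:h, :w].reshape(h // s, s, w // s, s)
            counts.append(blocks.any(axis=(1, 3)).sum())
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
        return slope

    # A filled square should give a dimension close to 2
    img = np.zeros((128, 128), dtype=bool)
    img[32:96, 32:96] = True
    print(round(box_counting_dimension(img), 2))  # ~2.0
    ```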

  19. Quantitative molecular analysis in mantle cell lymphoma.

    PubMed

    Brízová, H; Hilská, I; Mrhalová, M; Kodet, R

    2011-07-01

    A molecular analysis has three major roles in modern oncopathology--as an aid in the differential diagnosis, in molecular monitoring of disease, and in estimation of the potential prognosis. In this report we review the application of molecular analysis in a group of patients with mantle cell lymphoma (MCL). We demonstrate that the cyclin D1 mRNA level is a molecular marker in 98% of patients with MCL. Quantitative monitoring of cyclin D1 is specific and sensitive both for the differential diagnosis and for molecular monitoring of the disease in the bone marrow. Moreover, the dynamics of cyclin D1 in bone marrow reflect the development of the disease and predict its clinical course. We employed the molecular analysis for a precise quantitative detection of the proliferation markers Ki-67, topoisomerase IIalpha, and TPX2, which are described as effective prognostic factors. Using the molecular approach it is possible to measure the proliferation rate in a reproducible, standard way, which is an essential prerequisite for using proliferation activity as a routine clinical tool. Compared with immunophenotyping, we may conclude that the quantitative PCR-based analysis is a useful, reliable, rapid, reproducible, sensitive and specific method broadening our diagnostic tools in hematopathology. In comparison to interphase FISH in paraffin sections, quantitative PCR is less technically demanding and less time-consuming; furthermore, it is more sensitive in detecting small changes in the mRNA level. Moreover, quantitative PCR is the only technology that provides precise and reproducible quantitative information about the expression level. Therefore it may be used to demonstrate the decrease or increase of a tumor-specific marker in bone marrow in comparison with a previously aspirated specimen. Thus, it has a powerful potential to monitor the course of the disease in correlation with clinical data.

  20. Quantitation of Permethylated N-Glycans through Multiple-Reaction Monitoring (MRM) LC-MS/MS

    PubMed Central

    Zhou, Shiyue; Hu, Yunli; DeSantos-Garcia, Janie L.; Mechref, Yehia

    2015-01-01

    The important biological roles of glycans and their implications in disease development and progression have created a demand for the development of sensitive quantitative glycomics methods. Quantitation of glycans existing at low abundance is still analytically challenging. In this study, an N-linked glycan quantitation method using multiple reaction monitoring (MRM) on a triple quadrupole instrument was developed. The optimum normalized collision energy (CE) was determined to be 30% for N-glycan structures that are both sialylated and fucosylated, and 35% for structures that are either fucosylated or sialylated. The optimum CE for mannose- and complex-type N-glycan structures was determined to be 35%. Additionally, the use of three transitions was shown to facilitate reliable quantitation. A total of 88 N-glycan structures in human blood serum were quantified using this MRM approach. Reliable detection and quantitation of these structures was achieved when the equivalent of 0.005 μL of blood serum was analyzed. Accordingly, N-glycans can be reliably quantified in pooled human blood serum at the hundredth-of-a-microliter level, spanning a dynamic concentration range of three orders of magnitude. MRM was also effectively utilized to quantitatively compare the expression of N-glycans derived from brain-targeting breast carcinoma cells (MDA-MB-231BR) and metastatic breast cancer cells (MDA-MB-231). Thus, the described MRM method for permethylated N-glycan structures enables rapid and reliable identification and quantitation of glycans derived from glycoproteins purified or present in complex biological samples. PMID:25698222
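
    The abstract gives the ingredients (three transitions per glycan, an internal standard, a calibration range); a minimal sketch of how such pieces usually combine into a concentration estimate follows. All peak areas and calibration values below are hypothetical, not data from the study.

        import numpy as np

        # Areas of the three MRM transitions for one glycan and for the spiked
        # internal standard (hypothetical numbers); summing transitions adds robustness
        analyte_areas = np.array([8.2e4, 5.1e4, 3.3e4])
        istd_areas = np.array([9.0e4, 6.2e4, 4.1e4])
        response = analyte_areas.sum() / istd_areas.sum()   # area ratio vs. internal standard

        # Hypothetical calibration: known concentrations (ng/mL) vs. measured ratios
        cal_conc = np.array([300.0, 600.0, 1200.0, 2500.0])
        cal_ratio = np.array([0.21, 0.44, 0.90, 1.85])
        slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

        print(f"estimated concentration: {(response - intercept) / slope:.0f} ng/mL")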

  1. Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.

    PubMed

    Rehbein, Peter; Schwalbe, Harald

    2015-06-01

    Quantitative analysis of electrophoresis gels is an important part of molecular cloning, as well as of protein expression and purification. Parallel quantifications of yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable and simple protocol for gel quantification and documentation, applicable to single samples and with special features for protein expression screens. As a major component of the protocol, the fully annotated code of an in-house open-source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented in the C-based macro language of the widespread integrated development environment IGOR Pro. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. A reliable methodology for quantitative extraction of fruit and vegetable physiological amino acids and their subsequent analysis with commonly available HPLC systems

    USDA-ARS's Scientific Manuscript database

    High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiological amino acids in selected fruits and vegetables. This method was found to be particularly useful because the dabsyl derivatives of glutamine and citrulline were sufficiently se...

  3. Selection of reference genes for RT-qPCR analysis in the monarch butterfly, Danaus plexippus (L.), a migrating bio-indicator

    USDA-ARS's Scientific Manuscript database

    Quantitative real-time PCR (qRT-PCR) is a reliable and reproducible technique for measuring and evaluating changes in gene expression. To facilitate gene expression studies and obtain more accurate qRT-PCR data, normalization relative to stable housekeeping genes is required. In this study, expres...

  4. Further Evidence of Complex Motor Dysfunction in Drug Naive Children with Autism Using Automatic Motion Analysis of Gait

    ERIC Educational Resources Information Center

    Nobile, Maria; Perego, Paolo; Piccinini, Luigi; Mani, Elisa; Rossi, Agnese; Bellina, Monica; Molteni, Massimo

    2011-01-01

    In order to increase the knowledge of locomotor disturbances in children with autism, and of the mechanism underlying them, the objective of this exploratory study was to reliably and quantitatively evaluate linear gait parameters (spatio-temporal and kinematic parameters), upper body kinematic parameters, walk orientation and smoothness using an…

  5. Teaching Effectiveness in Private Higher Education Institutions in Botswana: Analysis of Students' Perceptions

    ERIC Educational Resources Information Center

    Baliyan, Som Pal; Moorad, Fazlur Rehman

    2018-01-01

    This quantitative study analyzed the perceptions of students on teaching effectiveness in private higher education institutions in Botswana. An exploratory and descriptive survey research design was adopted in this study. A valid and reliable questionnaire was used to collect data through a survey of 560 stratified randomly sampled students in…

  6. quantGenius: implementation of a decision support system for qPCR-based gene quantification.

    PubMed

    Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina

    2017-05-25

    Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Because several factors can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of a user-guided, QC-based decision support system, built on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays, as well as control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of a proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support; it will help scientists to obtain reliable results, which are the basis for biologically meaningful data interpretation.
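
    The standard curve approach the application relies on can be summarized in a few lines: fit Cq against log10 quantity for a dilution series, derive the amplification efficiency from the slope, interpolate unknowns, and normalize to a reference gene. A minimal sketch with hypothetical Cq values (not the application's actual code):

        import numpy as np

        log10_qty = np.array([5.0, 4.0, 3.0, 2.0, 1.0])   # log10 copies in the standards
        cq_std = np.array([14.8, 18.2, 21.6, 25.1, 28.4])
        slope, intercept = np.polyfit(log10_qty, cq_std, 1)
        efficiency = 10 ** (-1.0 / slope) - 1             # ~1.0 means 100% efficiency
        print(f"assay efficiency: {efficiency:.1%}")

        def quantify(cq):
            """Interpolate copy number from the standard curve."""
            return 10 ** ((cq - intercept) / slope)

        # In practice each assay gets its own curve; one is reused here for brevity
        normalized = quantify(24.3) / quantify(19.7)      # target vs. reference gene
        print(f"normalized expression: {normalized:.3f}")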

  7. Simultaneous qualitative and quantitative analysis of flavonoids and alkaloids from the leaves of Nelumbo nucifera Gaertn. using high-performance liquid chromatography with quadrupole time-of-flight mass spectrometry.

    PubMed

    Guo, Yujie; Chen, Xi; Qi, Jin; Yu, Boyang

    2016-07-01

    A reliable method, combining qualitative analysis by high-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry and quantitative assessment by high-performance liquid chromatography with photodiode array detection, has been developed to simultaneously analyze flavonoids and alkaloids in lotus leaf extracts. In the qualitative analysis, a total of 30 compounds, including 12 flavonoids, 16 alkaloids, and two proanthocyanidins, were identified. The fragmentation behaviors of four types of flavone glycoside and three types of alkaloid are summarized. The mass spectra of four representative components, quercetin 3-O-glucuronide, norcoclaurine, nuciferine, and neferine, are shown to illustrate their fragmentation pathways. Five pairs of isomers were detected, and three of them were distinguished by comparing their elution order with reference substances and their mass spectrometry data with reported data. In the quantitative analysis, 30 lotus leaf samples from different regions were analyzed to investigate the proportions of eight representative compounds. Quercetin 3-O-glucuronide was found to be the predominant constituent of lotus leaf extracts. For further discrimination among the samples, hierarchical cluster analysis and principal component analysis, based on the areas of the eight quantified peaks, were carried out. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Integration of RAMS in LCC analysis for linear transport infrastructures. A case study for railways.

    NASA Astrophysics Data System (ADS)

    Calle-Cordón, Álvaro; Jiménez-Redondo, Noemi; Morales-Gámiz, F. J.; García-Villena, F. A.; Garmabaki, Amir H. S.; Odelius, Johan

    2017-09-01

    Life-cycle cost (LCC) analysis is an economic technique used to assess the total costs associated with the lifetime of a system in order to support decision making in long-term strategic planning. For complex systems, such as railway and road infrastructures, the cost of maintenance plays an important role in the LCC analysis. Costs associated with maintenance interventions can be estimated more reliably by integrating the probabilistic nature of the failures behind these interventions into the LCC models. Reliability, Availability, Maintainability and Safety (RAMS) parameters describe the maintenance needs of an asset in a quantitative way, using probabilistic information extracted from registered maintenance activities. Therefore, integrating RAMS into the LCC analysis makes it possible to obtain reliable predictions of system maintenance costs and, through sensitivity analyses, of the dependence of these costs on specific cost drivers. This paper presents an innovative approach for a combined RAMS & LCC methodology for railway and road transport infrastructures being developed under the on-going H2020 project INFRALERT. Such a RAMS & LCC analysis provides relevant probabilistic information to be used for condition- and risk-based planning of maintenance activities, as well as for decision support in long-term strategic investment planning.
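
    As a toy illustration of the integration, RAMS data supply an expected intervention rate, and the LCC model discounts the resulting costs over the asset life. A sketch under stated assumptions (constant failure rate, fixed cost per intervention, all figures invented, not INFRALERT's model):

        # Expected discounted maintenance cost from a constant failure rate
        failure_rate = 0.8              # expected corrective interventions per year (from RAMS data)
        cost_per_intervention = 12_000.0
        discount_rate = 0.04
        horizon_years = 30

        npv_maintenance = sum(
            failure_rate * cost_per_intervention / (1 + discount_rate) ** t
            for t in range(1, horizon_years + 1)
        )
        print(f"NPV of expected maintenance: {npv_maintenance:,.0f}")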

  9. GAPDH, β-actin and β2-microglobulin, as three common reference genes, are not reliable for gene expression studies in equine adipose- and marrow-derived mesenchymal stem cells.

    PubMed

    Nazari, Fatemeh; Parham, Abbas; Maleki, Adham Fani

    2015-01-01

    Quantitative real-time reverse transcription PCR (qRT-PCR) is one of the most important techniques for gene expression analysis in molecular studies. Selecting a proper internal control gene for normalizing data is a crucial step in gene expression analysis with this method. The expression levels of reference genes should remain constant among cells in different tissues; however, the tissue of origin of the cells might influence their expression. The purpose of this study was to determine whether the source of mesenchymal stem cells (MSCs) has any effect on the expression level of three common reference genes (GAPDH, β-actin and β2-microglobulin) in equine marrow- and adipose-derived undifferentiated MSCs, and consequently on their reliability for comparative qRT-PCR. Adipose tissue (AT) and bone marrow (BM) samples were harvested from 3 mares. MSCs were isolated and cultured until passage 3 (P3). Total RNA of P3 cells was extracted for cDNA synthesis. The generated cDNAs were analyzed by quantitative real-time PCR. The PCR reactions ended with a melting curve analysis to verify the specificity of the amplicon. The expression levels of GAPDH were significantly different between AT- and BM-derived MSCs (P < 0.05). Differences in the expression levels of β-actin (P < 0.001) and B2M (P < 0.006) between MSCs derived from AT and BM were substantially larger than for GAPDH. In addition, the fold changes in expression levels of GAPDH, β-actin and B2M in AT-derived MSCs compared to BM-derived MSCs were 2.38, 6.76 and 7.76, respectively. This study demonstrated that GAPDH, and especially β-actin and B2M, are expressed at different levels in equine AT- and BM-derived MSCs. Thus, they cannot be considered reliable reference genes for comparative quantitative gene expression analysis in MSCs derived from equine bone marrow and adipose tissue.
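
    The practical consequence of an unstable reference gene is easy to see in the widely used 2^-ddCt fold-change calculation, which assumes the reference is expressed equally in both cell sources; if it is not, the bias propagates directly into the result. A sketch with hypothetical Cq values:

        # Fold change via the 2^-ddCt method (hypothetical Cq values)
        cq = {
            "AT": {"target": 24.1, "reference": 18.9},
            "BM": {"target": 25.6, "reference": 18.8},
        }
        d_ct_at = cq["AT"]["target"] - cq["AT"]["reference"]
        d_ct_bm = cq["BM"]["target"] - cq["BM"]["reference"]
        fold_change = 2 ** (-(d_ct_at - d_ct_bm))
        # Any AT-vs-BM difference in the reference gene itself shifts this value
        print(f"AT vs BM fold change: {fold_change:.2f}")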

  10. Novel Threat-risk Index Using Probabilistic Risk Assessment and Human Reliability Analysis - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George A. Beitel

    2004-02-01

    In support of a national need to improve the current state-of-the-art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as the Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index on a system level, as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant or basis for either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.

  11. Intersession reliability of fMRI activation for heat pain and motor tasks

    PubMed Central

    Quiton, Raimi L.; Keaser, Michael L.; Zhuo, Jiachen; Gullapalli, Rao P.; Greenspan, Joel D.

    2014-01-01

    As the practice of conducting longitudinal fMRI studies to assess mechanisms of pain-reducing interventions becomes more common, there is a great need to assess the test–retest reliability of the pain-related BOLD fMRI signal across repeated sessions. This study quantitatively evaluated the reliability of heat pain-related BOLD fMRI brain responses in healthy volunteers across 3 sessions conducted on separate days using two measures: (1) intraclass correlation coefficients (ICC) calculated based on signal amplitude and (2) spatial overlap. The ICC analysis of pain-related BOLD fMRI responses showed fair-to-moderate intersession reliability in brain areas regarded as part of the cortical pain network. Areas with the highest intersession reliability based on the ICC analysis included the anterior midcingulate cortex, anterior insula, and second somatosensory cortex. Areas with the lowest intersession reliability based on the ICC analysis also showed low spatial reliability; these regions included pregenual anterior cingulate cortex, primary somatosensory cortex, and posterior insula. Thus, this study found regional differences in pain-related BOLD fMRI response reliability, which may provide useful information to guide longitudinal pain studies. A simple motor task (finger-thumb opposition) was performed by the same subjects in the same sessions as the painful heat stimuli were delivered. Intersession reliability of fMRI activation in cortical motor areas was comparable to previously published findings for both spatial overlap and ICC measures, providing support for the validity of the analytical approach used to assess intersession reliability of pain-related fMRI activation. A secondary finding of this study is that the use of standard ICC alone as a measure of reliability may not be sufficient, as the underlying variance structure of an fMRI dataset can result in inappropriately high ICC values; a method to eliminate these false positive results was used in this study and is recommended for future studies of test–retest reliability. PMID:25161897
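
    The spatial-overlap measure is not detailed in the abstract; a common implementation is the Dice coefficient between suprathreshold activation masks from two sessions. A hedged sketch with synthetic volumes standing in for z-statistic maps (the threshold value is an assumption):

        import numpy as np

        def dice_overlap(map_a, map_b, threshold=2.3):
            """Dice coefficient of two thresholded statistical maps."""
            a, b = np.asarray(map_a) > threshold, np.asarray(map_b) > threshold
            denom = a.sum() + b.sum()
            return 2.0 * (a & b).sum() / denom if denom else np.nan

        rng = np.random.default_rng(1)
        session1 = rng.normal(size=(40, 40, 30))            # synthetic "session 1" volume
        session2 = 0.7 * session1 + 0.3 * rng.normal(size=(40, 40, 30))
        print(round(dice_overlap(session1, session2), 2))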

  12. Advancing Usability Evaluation through Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; David I. Gertman

    2005-07-01

    This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety-critical scenarios, the method also holds promise for commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis for heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.

  13. Quantitative analysis of three chiral pesticide enantiomers by high-performance column liquid chromatography.

    PubMed

    Wang, Peng; Liu, Donghui; Gu, Xu; Jiang, Shuren; Zhou, Zhiqiang

    2008-01-01

    Methods for the enantiomeric quantitative determination of 3 chiral pesticides--paclobutrazol, myclobutanil, and uniconazole--and their residues in soil and water are reported. An effective chiral high-performance liquid chromatography (HPLC)-UV method using an amylose tris(3,5-dimethylphenylcarbamate) (AD) column was developed for resolving the enantiomers and for their quantitative determination. The enantiomers were identified by a circular dichroism detector. Validation involved complete resolution of each pair of enantiomers, plus determination of linearity, precision, and limit of detection (LOD). The pesticide enantiomers were isolated by solvent extraction from soil and by C18 solid-phase extraction from water. The 2 enantiomers of each of the 3 pesticides could be completely separated on the AD column using an n-hexane/isopropanol mobile phase. The linearity and precision results indicated that the method was reliable for the quantitative analysis of the enantiomers. LODs were 0.025, 0.05, and 0.05 mg/kg for each enantiomer of paclobutrazol, myclobutanil, and uniconazole, respectively. Recovery and precision data showed that the pretreatment procedures were satisfactory for enantiomer extraction and cleanup. This method can be used for optical purity determination of technical material and for analysis of environmental residues.

  14. A new approach for quantitative analysis of L-phenylalanine using a novel semi-sandwich immunometric assay.

    PubMed

    Kubota, Kazuyuki; Mizukoshi, Toshimi; Miyano, Hiroshi

    2013-10-01

    Here, we describe a novel method for L-phenylalanine analysis using a sandwich-type immunometric assay approach, intended as a new format for amino acid analysis. To overcome the difficulty of preparing high-affinity, high-selectivity monoclonal antibodies against L-phenylalanine, and the fact that the small molecular weight of amino acids normally precludes sandwich-type immunometric assays, three procedures were examined. First, the amino group of L-phenylalanine was modified with an "N-Fmoc-L-cysteine" (FC) residue and the derivative (FC-Phe) was used as a hapten. Immunization of mice with a bovine serum albumin/FC-Phe conjugate successfully yielded specific monoclonal anti-FC-Phe antibodies. Second, a new derivatization reagent, "biotin linker conjugate of FC-Phe N-succinimidyl ester" (FC(Biotin)-NHS), was synthesized to convert L-phenylalanine to FC(Biotin)-Phe as a hapten structure. The biotin moiety linked to the thiol group of cysteine formed a second binding site for streptavidin/horseradish peroxidase (HRP) conjugates for optical detection. Third, a new semi-sandwich-type immunometric assay was established using pre-derivatized L-phenylalanine, the monoclonal anti-FC-Phe antibody, and the streptavidin/HRP conjugate (without a second antibody). Using the new "semi-sandwich" immunometric assay system, a detection limit of 35 nM (60 amol per analysis) and a detection range of 0.1-20 μM were attained with a standard L-phenylalanine solution. Rat plasma samples were analyzed to test reliability. Intra-day assay precision was within a 6% coefficient of variation; inter-day variation was 0.1%. Recovery rates ranged from 92.4 to 123.7%. This is the first report of the quantitative determination of L-phenylalanine using a reliable semi-sandwich immunometric assay approach, and the approach should be applicable to the quantitative determination of other amino acids.

  15. General Staining and Segmentation Procedures for High Content Imaging and Analysis.

    PubMed

    Chambers, Kevin M; Mandavilli, Bhaskar S; Dolman, Nick J; Janes, Michael S

    2018-01-01

    Automated quantitative fluorescence microscopy, also known as high content imaging (HCI), is a rapidly growing analytical approach in cell biology. Because automated image analysis relies heavily on robust demarcation of cells and subcellular regions, reliable methods for labeling cells are a critical component of the HCI workflow. Labeling of cells for image segmentation is typically performed with fluorescent probes that bind DNA, for nuclear-based cell demarcation, or with probes that react with proteins, for image analysis based on whole cell staining. These reagents, along with instrument and software settings, play an important role in the successful segmentation of cells in a population for automated and quantitative image analysis. In this chapter, we describe standard procedures for labeling and image segmentation in both live and fixed cell samples. The chapter also provides troubleshooting guidelines for some of the common problems associated with these aspects of HCI.

  16. Biomarkers and Surrogate Endpoints in Uveitis: The Impact of Quantitative Imaging.

    PubMed

    Denniston, Alastair K; Keane, Pearse A; Srivastava, Sunil K

    2017-05-01

    Uveitis is a major cause of sight loss across the world. The reliable assessment of intraocular inflammation in uveitis ('disease activity') is essential in order to score disease severity and response to treatment. In this review, we describe how 'quantitative imaging', the approach of using automated analysis and measurement algorithms across both standard and emerging imaging modalities, can be used to develop objective, instrument-based measures of disease activity. This is a narrative review based on searches of the current world literature using terms related to quantitative imaging techniques in uveitis, supplemented by clinical trial registry data and expert knowledge of surrogate endpoints and outcome measures in ophthalmology. Current measures of disease activity are largely based on subjective clinical estimation and are relatively insensitive, with poor discrimination and reliability. The development of quantitative imaging in uveitis is most established in the use of optical coherence tomographic (OCT) measurement of central macular thickness (CMT) to measure the severity of macular edema (ME). The transformative effect of CMT on the clinical assessment of patients with ME provides a paradigm for the development and impact of other forms of quantitative imaging. Quantitative imaging approaches are now being developed and validated for other key inflammatory parameters such as anterior chamber cells, vitreous haze, retinovascular leakage, and chorioretinal infiltrates. As new forms of quantitative imaging in uveitis are proposed, the uveitis community will need to evaluate these tools against the current subjective clinical estimates and reach a new consensus on how disease activity in uveitis should be measured. The development, validation, and adoption of sensitive and discriminatory measures of disease activity remain an unmet need with the potential to transform both drug development and routine clinical care for the patient with uveitis.

  17. The effect of leverage and/or influential on structure-activity relationships.

    PubMed

    Bolboacă, Sorana D; Jäntschi, Lorentz

    2013-05-01

    In the spirit of reporting valid and reliable Quantitative Structure-Activity Relationship (QSAR) models, the aim of our research was to assess how the leverage (analysis with the hat matrix, h(i)) and the influence (analysis with Cook's distance, D(i)) of compounds in QSAR models may reflect the models' reliability and their characteristics. The datasets included in this research were collected from previously published papers. Seven datasets that met the imposed inclusion criteria were analyzed. Three models were obtained for each dataset (full model, h(i)-model and D(i)-model) and several statistical validation criteria were applied to the models. In 5 out of 7 sets the correlation coefficient increased when compounds with either h(i) or D(i) higher than the threshold were removed. The number of withdrawn compounds varied from 2 to 4 for h(i)-models and from 1 to 13 for D(i)-models. Validation statistics showed that D(i)-models possess systematically better agreement than both full models and h(i)-models. Removal of influential compounds from the training set significantly improves the model and is recommended in the process of developing quantitative structure-activity relationships. The Cook's distance approach should be combined with hat matrix analysis in order to identify the compounds that are candidates for removal.
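
    Both diagnostics come straight from ordinary least squares: leverages are the diagonal of the hat matrix H = X(X'X)^{-1}X', and Cook's distance combines each residual with its leverage. The sketch below uses common rule-of-thumb cutoffs (3p/n for h(i), 4/n for D(i)), which may differ from the thresholds used in the paper; the data are synthetic.

        import numpy as np

        def leverage_and_cooks(X, y):
            """Hat-matrix leverages h_i and Cook's distances D_i for OLS with intercept."""
            X = np.column_stack([np.ones(len(X)), X])
            h = np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)
            beta = np.linalg.lstsq(X, y, rcond=None)[0]
            resid = y - X @ beta
            p = X.shape[1]
            mse = resid @ resid / (len(y) - p)
            d = resid**2 / (p * mse) * h / (1 - h) ** 2
            return h, d

        rng = np.random.default_rng(2)
        X = rng.normal(size=(30, 3))                        # 3 synthetic descriptors
        y = X @ np.array([1.5, -0.8, 0.3]) + rng.normal(scale=0.2, size=30)
        h, d = leverage_and_cooks(X, y)
        p, n = 4, 30
        print(np.where((h > 3 * p / n) | (d > 4 / n))[0])   # candidate compounds for removal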

  18. Toward improved peptide feature detection in quantitative proteomics using stable isotope labeling.

    PubMed

    Nilse, Lars; Sigloch, Florian Christoph; Biniossek, Martin L; Schilling, Oliver

    2015-08-01

    Reliable detection of peptides in LC-MS data is a key algorithmic step in the analysis of quantitative proteomics experiments. While highly abundant peptides can be detected reliably by most modern software tools, there is much less agreement on medium- and low-intensity peptides in a sample. The choice of software tools can have a big impact on the quantification of proteins, especially proteins present at lower concentrations. However, in many experiments it is precisely this region of less abundant but substantially regulated proteins that holds the biggest potential for discoveries. This is particularly true for discovery proteomics in the pharmacological sector, with its specific interest in key regulatory proteins. In this viewpoint article, we discuss how the development of novel software algorithms allows us to study this region of the proteome with increased confidence. Reliable results are one of many aspects to be considered when deciding on a bioinformatics software platform. Deployment into existing IT infrastructures, compatibility with other software packages, scalability, automation, flexibility, and support also need to be considered and are briefly addressed in this viewpoint article. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Quantitative analysis of doped/undoped ZnO nanomaterials using laser assisted atom probe tomography: Influence of the analysis parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amirifar, Nooshin; Lardé, Rodrigue, E-mail: rodrigue.larde@univ-rouen.fr; Talbot, Etienne

    2015-12-07

    In the last decade, atom probe tomography has become a powerful tool to investigate semiconductor and insulator nanomaterials in microelectronics, spintronics, and optoelectronics. In this paper, we report an investigation of zinc oxide nanostructures using atom probe tomography. We observed that the measured chemical composition of zinc oxide is strongly dependent on the analysis parameters used in atom probe experiments. At high laser pulse energies, the electric field at the specimen surface depends strongly on the crystallographic directions. This dependence leads to inhomogeneous field evaporation of the surface atoms, resulting in unreliable measurements. We show that the laser pulse energy has to be well tuned to obtain reliable quantitative chemical composition measurements of undoped and doped ZnO nanomaterials.

  20. A methodological analysis of chaplaincy research: 2000-2009.

    PubMed

    Galek, Kathleen; Flannelly, Kevin J; Jankowski, Katherine R B; Handzo, George F

    2011-01-01

    This article presents a comprehensive review and analysis of quantitative research on chaplaincy and closely related topics conducted in the United States and published between 2000 and 2009. A combined search strategy identified 49 quantitative studies in 13 journals. The analysis focuses on the methodological sophistication of the studies compared to earlier research on chaplaincy and pastoral care. Cross-sectional surveys of convenience samples still dominate the field, but sample sizes have increased somewhat over the past three decades. Reporting of the validity and reliability of measures continues to be low, although reporting of response rates has improved. Improvements in the use of inferential statistics and statistical controls were also observed, compared to previous research. The authors conclude that more experimental research is needed on chaplaincy, along with an increased use of hypothesis testing, regardless of the research designs used.

  1. Reliable gene expression analysis by reverse transcription-quantitative PCR: reporting and minimizing the uncertainty in data accuracy.

    PubMed

    Remans, Tony; Keunen, Els; Bex, Geert Jan; Smeets, Karen; Vangronsveld, Jaco; Cuypers, Ann

    2014-10-01

    Reverse transcription-quantitative PCR (RT-qPCR) has been widely adopted to measure differences in mRNA levels; however, biological and technical variation strongly affects the accuracy of the reported differences. RT-qPCR specialists have warned that, unless researchers minimize this variability, they may report inaccurate differences and draw incorrect biological conclusions. The Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines describe procedures for conducting and reporting RT-qPCR experiments. The MIQE guidelines enable others to judge the reliability of reported results; however, a recent literature survey found low adherence to these guidelines. Additionally, even experiments that use appropriate procedures remain subject to individual variation that statistical methods cannot correct. For example, since ideal reference genes do not exist, the widely used method of normalizing RT-qPCR data to reference genes generates background noise that affects the accuracy of measured changes in mRNA levels. However, current RT-qPCR data reporting styles ignore this source of variation. In this commentary, we direct researchers to appropriate procedures, outline a method to present the remaining uncertainty in data accuracy, and propose an intuitive way to select reference genes to minimize uncertainty. Reporting the uncertainty in data accuracy also serves for quality assessment, enabling researchers and peer reviewers to confidently evaluate the reliability of gene expression data. © 2014 American Society of Plant Biologists. All rights reserved.

  2. Detecting Autophagy and Autophagy Flux in Chronic Myeloid Leukemia Cells Using a Cyto-ID Fluorescence Spectrophotometric Assay.

    PubMed

    Guo, Sujuan; Pridham, Kevin J; Sheng, Zhi

    2016-01-01

    Autophagy is a catabolic process whereby cellular components are degraded to fuel cells for longer survival during stress. Hence, autophagy plays a vital role in determining cell fate and is central to homeostasis and the pathogenesis of many human diseases, including chronic myeloid leukemia (CML). It is well established that autophagy is important for leukemogenesis as well as drug resistance in CML. Thus, autophagy is an intriguing therapeutic target. However, current approaches to detecting autophagy lack reliability and often fail to provide quantitative measurements. To overcome this hurdle and facilitate the development of autophagy-related therapies, we have recently developed an autophagy assay termed the Cyto-ID fluorescence spectrophotometric assay. This method uses a cationic fluorescent dye, Cyto-ID, which specifically labels autophagic compartments and is detected by a spectrophotometer to permit large-scale, quantitative analysis. As such, it allows rapid, reliable, and quantitative detection of autophagy and estimation of autophagy flux. In this chapter, we provide technical details of this method and step-by-step protocols for measuring autophagy or autophagy flux in CML cell lines as well as primary hematopoietic cells.

  3. Three-Dimensional Photography for Quantitative Assessment of Penile Volume-Loss Deformities in Peyronie's Disease.

    PubMed

    Margolin, Ezra J; Mlynarczyk, Carrie M; Mulhall, John P; Stember, Doron S; Stahl, Peter J

    2017-06-01

    Non-curvature penile deformities are prevalent and bothersome manifestations of Peyronie's disease (PD), but the quantitative metrics currently used to describe these deformities are inadequate and non-standardized, presenting a barrier to clinical research and patient care. To introduce erect penile volume (EPV) and percentage of erect penile volume loss (percent EPVL) as novel metrics that provide detailed quantitative information about non-curvature penile deformities, and to study the feasibility and reliability of three-dimensional (3D) photography for measurement of quantitative penile parameters. We constructed seven penis models simulating deformities found in PD. The 3D photographs of each model were captured in triplicate by four observers using a 3D camera. Computer software was used to generate automated measurements of EPV, percent EPVL, penile length, minimum circumference, maximum circumference, and angle of curvature. The automated measurements were statistically compared with measurements obtained using water-displacement experiments, a tape measure, and a goniometer. Accuracy of 3D photography for average measurements of all parameters was assessed against the manual measurements; inter-test, intra-observer, and inter-observer reliabilities of EPV and percent EPVL measurements were assessed by the intraclass correlation coefficient. The 3D images were captured in a median of 52 seconds (interquartile range = 45-61). On average, 3D photography was accurate to within 0.3% for measurement of penile length. It overestimated maximum and minimum circumferences by averages of 4.2% and 1.6%, respectively; overestimated EPV by an average of 7.1%; and underestimated percent EPVL by an average of 1.9%. All inter-test, inter-observer, and intra-observer intraclass correlation coefficients for EPV and percent EPVL measurements were greater than 0.75, reflective of excellent methodologic reliability. By providing highly descriptive and reliable measurements of penile parameters, 3D photography can empower researchers to better study volume-loss deformities in PD and enable clinicians to offer improved clinical assessment, communication, and documentation. This is the first study to apply 3D photography to the assessment of PD and to accurately measure the novel parameters of EPV and percent EPVL. This proof-of-concept study is limited by the lack of data in human subjects, which could present additional challenges in obtaining reliable measurements. EPV and percent EPVL are novel metrics that can be quickly, accurately, and reliably measured using computational analysis of 3D photographs and can be useful in describing non-curvature volume-loss deformities resulting from PD. Margolin EJ, Mlynarczyk CM, Mulhall JP, et al. Three-Dimensional Photography for Quantitative Assessment of Penile Volume-Loss Deformities in Peyronie's Disease. J Sex Med 2017;14:829-833. Copyright © 2017 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  4. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography.

    PubMed

    Turmezei, Tom D; Treece, Graham M; Gee, Andrew H; Fotiadou, Anastasia F; Poole, Kenneth E S

    2016-07-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K&L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K&L grade, cortical thickness increased by up to 25% in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7% increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. • CT is being increasingly used to assess bony involvement in osteoarthritis • CBM provides accurate and reliable quantitative analysis of cortical bone thickness • Cortical bone is thicker at the superior femoral head-neck with worse osteoarthritis • Regions of increased thickness co-locate with impingement and osteophyte formation • Quantitative 3D bone analysis could enable clinical disease prediction and therapy development.

  5. Reliability and Productivity Modeling for the Optimization of Separated Spacecraft Interferometers

    NASA Technical Reports Server (NTRS)

    Kenny, Sean (Technical Monitor); Wertz, Julie

    2002-01-01

    As technological systems grow in capability, they also grow in complexity. Due to this complexity, it is no longer possible for a designer to use engineering judgement to identify the components that have the largest impact on system life cycle metrics, such as reliability, productivity, cost, and cost effectiveness. One way of identifying these key components is to build quantitative models and analysis tools that can aid the designer in making high-level architecture decisions. Once these key components have been identified, two main approaches to improving a system through them exist: add redundancy or improve the reliability of the component. In reality, the most effective approach for almost any system will be some combination of these two approaches, in varying proportions for each component. Therefore, this research tries to answer the question of how to divide funds between adding redundancy and improving the reliability of components so as to most cost-effectively improve the life cycle metrics of a system. While this question is relevant to any complex system, this research focuses on one type of system in particular: Separated Spacecraft Interferometers (SSI). Quantitative models are developed to analyze the key life cycle metrics of different SSI system architectures. Next, tools are developed to compare a given set of architectures in terms of total performance, by coupling different life cycle metrics together into one performance metric. Optimization tools, such as simulated annealing and genetic algorithms, are then used to search the entire design space to find the "optimal" architecture design. Sensitivity analysis tools have been developed to determine how sensitive the results of these analyses are to uncertain user-defined parameters. Finally, several possibilities for future work in this area of research are presented.
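
    The funds-division question can be posed concretely as a small search problem: for each component of a series system, either keep the base part, add a redundant copy, or buy a more reliable variant, subject to a budget. A toy sketch of this framing (all reliabilities and costs invented, and exhaustive search standing in for the thesis's simulated annealing or genetic algorithms):

        from itertools import product

        # (base reliability, improved reliability, cost of a redundant copy, cost of improvement)
        components = [
            (0.90, 0.97, 2.0, 3.0),
            (0.95, 0.99, 1.5, 2.5),
            (0.85, 0.93, 2.5, 4.0),
        ]
        BUDGET = 8.0

        best = (0.0, None)
        for choice in product(range(3), repeat=len(components)):  # 0=base, 1=redundant pair, 2=improved
            cost, rel = 0.0, 1.0
            for (r0, r1, c_red, c_imp), c in zip(components, choice):
                if c == 1:
                    cost, rel = cost + c_red, rel * (1 - (1 - r0) ** 2)  # two base units in parallel
                elif c == 2:
                    cost, rel = cost + c_imp, rel * r1
                else:
                    rel *= r0
            if cost <= BUDGET and rel > best[0]:
                best = (rel, choice)
        print(best)   # highest series reliability achievable within budget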

  6. The relationship between quantitative measures of disc height and disc signal intensity with Pfirrmann score of disc degeneration.

    PubMed

    Salamat, Sara; Hutchings, John; Kwong, Clemens; Magnussen, John; Hancock, Mark J

    2016-01-01

    To assess the relationship between quantitative measures of disc height and signal intensity and the Pfirrmann disc degeneration scoring system, and to test the inter-rater reliability of the quantitative measures. Participants were 76 people who had recently recovered from their last episode of acute low back pain and underwent an MRI scan on a single 3T machine. For all 380 lumbar discs, quantitative measures of disc height and signal intensity were made by 2 independent raters and compared to Pfirrmann scores from a single radiologist. For the quantitative measures of disc height and signal intensity, a "raw" score and 2 adjusted ratios were calculated, and the relationship with Pfirrmann scores was assessed. The inter-rater reliability of the quantitative measures was also investigated. There was a strong linear relationship between quantitative disc signal intensity and Pfirrmann scores for grades 1-4, but not for grades 4 and 5. For disc height, only Pfirrmann grade 5 had significantly reduced disc height compared to all other grades. Results were similar regardless of whether raw or adjusted scores were used. Inter-rater reliability for the quantitative measures was excellent (ICC > 0.97). Quantitative measures of disc signal intensity were strongly related to Pfirrmann scores from grade 1 to 4; however, disc height only differentiated between grade 4 and 5 Pfirrmann scores. Using adjusted ratios for quantitative measures of disc height or signal intensity did not significantly alter the relationship with Pfirrmann scores.

  7. STAMPS: development and verification of swallowing kinematic analysis software.

    PubMed

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop a novel swallowing kinematic analysis software package, called the spatio-temporal analyzer for motion and physiologic study (STAMPS), and to verify its validity and reliability. STAMPS was developed in MATLAB, one of the most popular platforms for biomedical analysis. The software was constructed to acquire, process, and analyze swallowing motion data. The target swallowing structures include bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and the food bolus. Numerous functions are available for computing the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and with an instrumental swallowing model designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00 (P < 0.001) for displacement and velocity. The Bland-Altman plots showed good agreement between the measurements and the reference values. STAMPS provides precise and reliable kinematic measurements and multiple practical functionalities for spatiotemporal analysis. The software is expected to be useful for researchers interested in swallowing motion analysis.

  8. Comparing the MRI-based Goutallier Classification to an experimental quantitative MR spectroscopic fat measurement of the supraspinatus muscle.

    PubMed

    Gilbert, Fabian; Böhm, Dirk; Eden, Lars; Schmalzl, Jonas; Meffert, Rainer H; Köstler, Herbert; Weng, Andreas M; Ziegler, Dirk

    2016-08-22

    The Goutallier Classification is a semi-quantitative classification system for determining the amount of fatty degeneration in rotator cuff muscles. Although initially proposed for axial computed tomography scans, it is currently applied to magnetic resonance imaging (MRI) scans. Its role in clinical use is controversial, as the reliability of the classification has been shown to be inconsistent. The purpose of this study was to compare the semi-quantitative MRI-based Goutallier Classification, applied by 5 different raters, to experimental quantitative MR spectroscopic fat measurement in order to determine the correlation between this classification system and the true extent of fatty degeneration shown by spectroscopy. MRI scans of 42 patients with rotator cuff tears were examined by 5 shoulder surgeons and graded according to the MRI-based Goutallier Classification proposed by Fuchs et al. Additionally, the fat/water ratio was measured with MR spectroscopy using the experimental SPLASH technique. The semi-quantitative grading according to the Goutallier Classification was statistically correlated with the quantitatively measured fat/water ratio using Spearman's rank correlation. Statistical analysis of the data revealed only fair correlation between the Goutallier Classification system and the quantitative fat/water ratio, with R = 0.35 (p < 0.05). By dichotomizing the scale, the correlation rose to 0.72. The interobserver and intraobserver reliabilities were substantial, with R = 0.62 and R = 0.74 (p < 0.01). The correlation between the semi-quantitative MRI-based Goutallier Classification system and MR spectroscopic fat measurement is weak. As an adequate estimation of fatty degeneration based on standard MRI may not be possible, quantitative methods need to be considered in order to increase diagnostic safety and thus provide patients with ideal care with regard to the amount of fatty degeneration. Spectroscopic MR measurement may increase the accuracy of the Goutallier Classification and thus improve the prediction of clinical results after rotator cuff repair. However, these techniques are currently only available in an experimental setting.

  9. Propensity to Leave versus Probability of Leaving: The Relationship between Intrinsic and Extrinsic Satisfaction in the Voluntary Leaving Behavior of IT Professionals

    ERIC Educational Resources Information Center

    King, Christopher S.

    2013-01-01

    This dissertation presents a quantitative analysis of the relationship between intrinsic and extrinsic job satisfaction and the voluntary leaving behavior of IT professionals. In addition, the study adds to the validity and reliability of the Udechukwu and Mujtaba Mathematical Turnover Model. Surveyed within the study for their intrinsic and…

  10. An Examination of the Predictive Relationships of Self-Evaluation Capacity and Staff Competency on Strategic Planning in Hong Kong Aided Secondary Schools

    ERIC Educational Resources Information Center

    Cheng, Eric C. K.

    2011-01-01

    This article aims to examine the predictive relationships of self-evaluation capacity and staff competency on the effect of strategic planning in aided secondary schools in Hong Kong. A quantitative questionnaire survey was compiled to collect data from principals of the participating schools. Confirmatory factor analysis and reliability tests…

  11. A quantitative analysis of the quality and content of the health advice in popular Australian magazines.

    PubMed

    Wilson, Amanda; Smith, David; Peel, Roseanne; Robertson, Jane; Kypri, Kypros

    2017-06-01

    To examine how health advice is provided in popular magazines and the quality of that advice. A prospective quantitative analysis of the quality of health advice provided in Australian magazines between July and December 2011 was conducted. A rating instrument was adapted from the Media Doctor Australia rating tool used to assess the quality of health news reporting. Criteria included: recommends seeing a doctor; advice based on reliable evidence; advice clear and easily applied; benefits presented meaningfully; potential harms mentioned; evidence of disease mongering; availability and cost of treatments; obvious advertising; vested interests; and anecdotal evidence. 163 health advice articles were rated, showing wide variation in the quality of advice presented across magazines. Magazines with 'health' in the title rated most poorly, with only 36% (26/73) of these articles presenting clear and meaningful advice and 52% (38/73) giving advice based on reliable evidence. Australian magazines, especially those with 'health' in the title, generally presented poor-quality, unreliable health advice. The teen magazine Dolly provided the highest quality advice. Consumers need to be aware of this when making health choices. © 2016 Public Health Association of Australia.

  12. 40 CFR 799.6756 - TSCA partition coefficient (n-octanol/water), generator column method.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... method, or any other reliable quantitative procedure must be used for those compounds that do not absorb... any other reliable quantitative method, aqueous solutions from the generator column enter a collecting... Research of the National Bureau of Standards, 86:361-366 (1981). (7) Fujita, T. et al. “A New Substituent...

  13. 40 CFR 799.6756 - TSCA partition coefficient (n-octanol/water), generator column method.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... method, or any other reliable quantitative procedure must be used for those compounds that do not absorb... any other reliable quantitative method, aqueous solutions from the generator column enter a collecting... Research of the National Bureau of Standards, 86:361-366 (1981). (7) Fujita, T. et al. “A New Substituent...

  14. 40 CFR 799.6756 - TSCA partition coefficient (n-octanol/water), generator column method.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... method, or any other reliable quantitative procedure must be used for those compounds that do not absorb... any other reliable quantitative method, aqueous solutions from the generator column enter a collecting... Research of the National Bureau of Standards, 86:361-366 (1981). (7) Fujita, T. et al. “A New Substituent...

  15. Quantitative trait loci markers derived from whole genome sequence data increases the reliability of genomic prediction.

    PubMed

    Brøndum, R F; Su, G; Janss, L; Sahana, G; Guldbrandtsen, B; Boichard, D; Lund, M S

    2015-06-01

    This study investigated the effect on the reliability of genomic prediction when a small number of significant variants from single-marker analysis based on whole genome sequence data were added to the regular 54k single nucleotide polymorphism (SNP) array data. The extra markers were selected with the aim of augmenting the custom low-density Illumina BovineLD SNP chip (San Diego, CA) used in the Nordic countries. The single-marker analysis was done breed-wise on all 16 index traits included in the breeding goals for Nordic Holstein, Danish Jersey, and Nordic Red cattle, plus the total merit index itself. Depending on the trait's economic weight, 15, 10, or 5 quantitative trait loci (QTL) were selected per trait per breed, and 3 to 5 markers were selected to tag each QTL. After removing duplicate markers (the same marker selected for more than one trait or breed) and filtering for high pairwise linkage disequilibrium and assaying performance on the array, a total of 1,623 QTL markers were selected for inclusion on the custom chip. Genomic prediction analyses were performed for Nordic and French Holstein and Nordic Red animals using either a genomic BLUP or a Bayesian variable selection model. When using the genomic BLUP model including the QTL markers in the analysis, reliability was increased by up to 4 percentage points for production traits in Nordic Holstein animals, up to 3 percentage points for Nordic Reds, and up to 5 percentage points for French Holstein. Smaller gains of up to 1 percentage point were observed for mastitis, but only a 0.5 percentage point increase was seen for fertility. When using a Bayesian model, accuracies were generally higher with only the 54k data than with the genomic BLUP approach, but increases in reliability were relatively smaller when QTL markers were included. Results from this study indicate that the reliability of genomic prediction can be increased by including markers significant in genome-wide association studies on whole genome sequence data alongside the 54k SNP set. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
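
    For readers unfamiliar with the genomic BLUP model referenced here, a minimal sketch may help: build a VanRaden-style genomic relationship matrix from centered genotypes, then shrink phenotypes toward it with a variance ratio set by the heritability. This is a bare-bones illustration on simulated data, not the Nordic evaluation pipeline.

        import numpy as np

        def gblup(M, y, h2=0.3):
            """Minimal GBLUP: VanRaden genomic relationship matrix + BLUP of genomic values."""
            p = M.mean(axis=0) / 2.0                        # allele frequencies
            Z = M - 2.0 * p                                 # center genotypes by 2p
            G = Z @ Z.T / (2.0 * (p * (1.0 - p)).sum())     # genomic relationship matrix
            lam = (1.0 - h2) / h2                           # sigma_e^2 / sigma_g^2
            n = len(y)
            A = G + lam * np.eye(n) + 1e-8 * np.eye(n)      # small jitter for invertibility
            return G @ np.linalg.solve(A, y - y.mean())     # g_hat = G (G + lam I)^(-1) (y - mu)

        rng = np.random.default_rng(3)
        M = rng.integers(0, 3, size=(50, 500)).astype(float)  # 50 animals, 500 SNPs coded 0/1/2
        true_g = (M - M.mean(axis=0)) @ rng.normal(scale=0.05, size=500)
        y = true_g + rng.normal(scale=1.0, size=50)
        print(round(np.corrcoef(gblup(M, y), true_g)[0, 1], 2))  # prediction accuracy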

  16. [Study of the reliability in one dimensional size measurement with digital slit lamp microscope].

    PubMed

    Wang, Tao; Qi, Chaoxiu; Li, Qigen; Dong, Lijie; Yang, Jiezheng

    2010-11-01

    To study the reliability of the digital slit lamp microscope as a tool for quantitative analysis of one-dimensional size measurements. Three single-blinded observers acquired and repeatedly measured images of 4.00 mm and 10.00 mm targets on a vernier caliper, simulating the human pupil and corneal diameter, under a China-made digital slit lamp microscope at objective magnifications of 4x, 10x, 16x, 25x and 40x for the 4.00 mm target and 4x, 10x and 16x for the 10.00 mm target. The correctness and precision of the measurements were compared. For the 4.00 mm images, the average values measured by the three investigators ranged from 3.98 to 4.06 mm; for the 10.00 mm images, the average values fell within 10.00 to 10.04 mm. For the 4.00 mm images, except for A4, B25, C16 and C25, significant differences were noted between the measured values and the true value. For the 10.00 mm images, except for A10, statistically significant differences were found between the measured values and the true value. When the same investigator measured the same size at different magnifications, the results differed significantly across magnifications, except for investigator A's measurements of the 10.00 mm target. When different investigators measured the same size at the same magnification, the 4.00 mm measurements at 4-fold magnification showed no significant difference among investigators; all remaining comparisons were statistically significant. The coefficients of variation of all measurement results were less than 5% and decreased as magnification increased. One-dimensional size measurement with the digital slit lamp microscope has good reliability, but a reliability analysis should be performed before it is used for quantitative analysis in order to reduce systematic errors.

  17. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures

    NASA Astrophysics Data System (ADS)

    Boes, Kelsey S.; Roberts, Michael S.; Vinueza, Nelson R.

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R² = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R² = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.
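
    The internal-standard quantitation strategy described above reduces to calibrating the analyte-to-internal-standard response ratio against known concentrations and checking linearity. A hedged sketch with invented peak-area ratios (the real study quantified eugenol in diesel over 300-2500 ng/mL):

        import numpy as np

        conc = np.array([300, 600, 1200, 1800, 2500], dtype=float)   # ng/mL calibration standards
        ratio = np.array([0.31, 0.63, 1.19, 1.84, 2.52])             # analyte/IS peak-area ratios (hypothetical)

        slope, intercept = np.polyfit(conc, ratio, 1)                # linear calibration
        pred = slope * conc + intercept
        r2 = 1 - np.sum((ratio - pred) ** 2) / np.sum((ratio - ratio.mean()) ** 2)

        unknown_ratio = 1.02                                         # measured in a diesel sample
        est = (unknown_ratio - intercept) / slope
        print(f"R^2 = {r2:.3f}, estimated concentration = {est:.0f} ng/mL")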

  18. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures.

    PubMed

    Boes, Kelsey S; Roberts, Michael S; Vinueza, Nelson R

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R² = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R² = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.

  19. Rocketdyne PSAM: In-house enhancement/application

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ohara, K.

    1991-01-01

    Development of the Probabilistic Design Analysis (PDA) process for rocket engines was initiated. This will give engineers a quantitative assessment of calculated reliability during the design process. The PDA will help choose better designs, make them more robust, and help decide on the critical tests needed to demonstrate key reliability issues, thereby improving confidence in the engine's capabilities. Rocketdyne's involvement with the Composite Loads Spectra (CLS) and Probabilistic Structural Analysis Methodology (PSAM) contracts started this effort, and these are key elements in the ongoing developments. Internal development efforts and hardware applications complement and extend the CLS and PSAM efforts. The completion of the CLS option work and the follow-on PSAM developments will also be integral parts of this methodology. A brief summary of these efforts is presented.
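
    Quantitative reliability assessment during design, of the kind the PDA process pursues, is often introduced through a stress-strength interference calculation: the probability that load exceeds capability. A toy Monte Carlo sketch (the distributions and parameters are invented, not Rocketdyne data):

        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000

        stress = rng.normal(300.0, 30.0, n)      # hypothetical load distribution, e.g. MPa
        strength = rng.normal(400.0, 25.0, n)    # hypothetical component capability

        p_fail = np.mean(stress > strength)      # probability that load exceeds capability
        print(f"estimated failure probability: {p_fail:.2e}, reliability: {1 - p_fail:.6f}")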

  1. An effective approach to quantitative analysis of ternary amino acids in foxtail millet substrate based on terahertz spectroscopy.

    PubMed

    Lu, Shao Hua; Li, Bao Qiong; Zhai, Hong Lin; Zhang, Xin; Zhang, Zhuo Yong

    2018-04-25

    Terahertz time-domain spectroscopy (THz-TDS) has been applied in many fields; however, it still encounters difficulties in the analysis of multicomponent mixtures because of severe spectral overlap. Here, an effective approach to quantitative analysis is proposed and applied to the determination of ternary amino acids in a foxtail millet substrate. Images were constructed from three parameters derived from the THz-TDS spectra, and Tchebichef image moments were used to extract the information of the target components. Quantitative models were then obtained by stepwise regression. The correlation coefficients of leave-one-out cross-validation (R²loo-cv) were greater than 0.9595. For the external test set, the predictive correlation coefficients (R²p) were greater than 0.8026 and the root mean square errors of prediction (RMSEp) were less than 1.2601. Compared with the traditional methods (PLS and N-PLS), our approach is more accurate, robust and reliable, and can be an excellent approach for quantifying multiple components with THz-TDS spectroscopy. Copyright © 2017 Elsevier Ltd. All rights reserved.
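
    The leave-one-out cross-validated R² reported above can be computed as below; the linear model and random features here are placeholders for the paper's Tchebichef-moment features and stepwise-selected terms:

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        rng = np.random.default_rng(1)
        X = rng.normal(size=(30, 4))                 # stand-in for image-moment features
        y = X @ np.array([1.5, -2.0, 0.7, 0.3]) + rng.normal(scale=0.1, size=30)

        pred = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
        r2_loocv = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
        print(f"R^2 (LOO-CV) = {r2_loocv:.4f}")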

  2. Use of a deuterated internal standard with pyrolysis-GC/MS dimeric marker analysis to quantify tire tread particles in the environment.

    PubMed

    Unice, Kenneth M; Kreider, Marisa L; Panko, Julie M

    2012-11-08

    Pyrolysis (pyr)-GC/MS analysis of characteristic thermal decomposition fragments has previously been used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r² ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified: use of an internal standard, and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct for the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories.

  3. Anthropometric and quantitative EMG status of femoral quadriceps before and after conventional kinesitherapy with and without magnetotherapy.

    PubMed

    Graberski Matasović, M; Matasović, T; Markovac, Z

    1997-06-01

    The frequency of femoral quadriceps muscle hypotrophy has made it a significant therapeutic problem. Efforts are being made to improve the standard kinesitherapeutic treatment scheme with additional, more effective therapeutic methods. Besides kinesitherapy, the authors used magnetotherapy in 30 of the 60 patients. A total of 60 patients of both sexes, of similar age and intensity of hypotrophy, were included in the study. They were divided into an experimental group A and a control group B (30 patients each). The treatment was scheduled for the usual 5-6 weeks. Quantitative electromyographic analysis was used to check the treatment results after 5 and 6 weeks of the treatment period. Analysis of the results confirmed the assumption that magnetotherapy may yield better and faster treatment results, disappearance of pain, and a decreased risk of complications: the same results were obtained in the experimental group one week earlier than in the control group. However, quantitative EMG analysis did not prove to be a sufficiently reliable and objective method for assessing the real condition of the muscle and the effects of treatment.

  4. Quantitative transmission Raman spectroscopy of pharmaceutical tablets and capsules.

    PubMed

    Johansson, Jonas; Sparén, Anders; Svensson, Olof; Folestad, Staffan; Claybourn, Mike

    2007-11-01

    Quantitative analysis of pharmaceutical formulations using the new approach of transmission Raman spectroscopy has been investigated. For comparison, measurements were also made in conventional backscatter mode. The experimental setup consisted of a Raman probe-based spectrometer with 785 nm excitation for measurements in backscatter mode. In transmission mode the same system was used to detect the Raman scattered light, while an external diode laser of the same type was used as the excitation source. Quantitative partial least squares (PLS) models were developed for both measurement modes. The results for tablets show that the prediction error for an independent test set was lower for the transmission measurements, with a relative root mean square error of about 2.2% as compared with 2.9% for the backscatter mode. Furthermore, the models were simpler in the transmission case, for which only a single PLS component was required to explain the variation. The main reason for the improvement using the transmission mode is a more representative sampling of the tablets compared with the backscatter mode. Capsules containing mixtures of pharmaceutical powders were also assessed, by transmission only. The quantitative results for the capsules' contents were good, with a prediction error of 3.6% w/w for an independent test set. The advantage of transmission Raman over backscatter Raman spectroscopy has been demonstrated for quantitative analysis of pharmaceutical formulations, and the prospects for reliable, lean calibrations for pharmaceutical analysis are discussed.
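
    A single-component PLS calibration of the kind described can be sketched with scikit-learn; the spectra and reference concentrations below are simulated stand-ins for the Raman measurements, not the study's data:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(7)
        n_tablets, n_wavenumbers = 40, 500

        conc = rng.uniform(80, 120, n_tablets)                     # % of nominal API content
        peak = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 250) / 10) ** 2)
        spectra = np.outer(conc, peak) + rng.normal(scale=0.5, size=(n_tablets, n_wavenumbers))

        pls = PLSRegression(n_components=1).fit(spectra[:30], conc[:30])   # one PLS component
        pred = pls.predict(spectra[30:]).ravel()
        rmsep = np.sqrt(np.mean((pred - conc[30:]) ** 2))
        print(f"relative RMSEP = {rmsep / conc[30:].mean() * 100:.1f}%")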

  5. European Workshop on Industrial Computer Systems approach to design for safety

    NASA Technical Reports Server (NTRS)

    Zalewski, Janusz

    1992-01-01

    This paper presents guidelines on designing systems for safety, developed by the Technical Committee 7 on Reliability and Safety of the European Workshop on Industrial Computer Systems. The focus is on complementing the traditional development process by adding the following four steps: (1) overall safety analysis; (2) analysis of the functional specifications; (3) designing for safety; (4) validation of design. Quantitative assessment of safety is possible by means of a modular questionnaire covering various aspects of the major stages of system development.

  6. [Reconsidering evaluation criteria regarding health care research: toward an integrative framework of quantitative and qualitative criteria].

    PubMed

    Miyata, Hiroaki; Kai, Ichiro

    2006-05-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confused, and the clutter of terms and arguments has made the concepts obscure and unrecognizable. It is therefore very important to reconsider evaluation criteria regarding rigor in social science. As Lincoln & Guba have already compared the quantitative paradigms (validity, reliability, neutrality, generalizability) with the qualitative paradigms (credibility, dependability, confirmability, transferability), we discuss the use of evaluation criteria from a pragmatic perspective. Validity/credibility concerns the observational framework; reliability/dependability refers to the range of stability in observations; neutrality/confirmability reflects influences between observers and subjects; and generalizability/transferability differ epistemologically in the way findings are applied. Qualitative studies, however, do not always choose the qualitative paradigms. If stability can be assumed to some extent, it is better to use the quantitative paradigm (reliability). Moreover, as a quantitative study cannot always guarantee a perfect observational framework with stability in all phases of observation, it is useful to apply the qualitative paradigms to enhance the rigor of a study.

  7. Molecular design of anticancer drug leads based on three-dimensional quantitative structure-activity relationship.

    PubMed

    Huang, Xiao Yan; Shan, Zhi Jie; Zhai, Hong Lin; Li, Li Na; Zhang, Xiao Yun

    2011-08-22

    Heat shock protein 90 (Hsp90) takes part in the development of several cancers. Novobiocin, a typical C-terminal inhibitor of Hsp90, will probably be used as an important anticancer drug in the future. In this work, we extracted valuable information and designed new novobiocin derivatives based on a three-dimensional quantitative structure-activity relationship (3D QSAR). Comparative molecular field analysis and comparative molecular similarity indices analysis models with high predictive capability were established, and their reliability is supported by the statistical parameters. Based on several important influence factors obtained from these models, six new novobiocin derivatives with higher predicted inhibitory activities were designed and confirmed by molecular simulation with our models, providing potential anticancer drug leads for further research.

  8. LC-MS Data Processing with MAVEN: A Metabolomic Analysis and Visualization Engine

    PubMed Central

    Clasquin, Michelle F.; Melamud, Eugene; Rabinowitz, Joshua D.

    2014-01-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis. PMID:22389014

  9. LC-MS data processing with MAVEN: a metabolomic analysis and visualization engine.

    PubMed

    Clasquin, Michelle F; Melamud, Eugene; Rabinowitz, Joshua D

    2012-03-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis.

  10. Quantitation of sweet steviol glycosides by means of a HILIC-MS/MS-SIDA approach.

    PubMed

    Well, Caroline; Frank, Oliver; Hofmann, Thomas

    2013-11-27

    Meeting the rising consumer demand for natural food ingredients, steviol glycosides, the sweet principle of Stevia rebaudiana Bertoni (Bertoni), have recently been approved as food additives in the European Union. As regulatory constraints require sensitive methods to analyze the sweet-tasting steviol glycosides in foods and beverages, a HILIC-MS/MS method was developed enabling the accurate and reliable quantitation of the major steviol glycosides stevioside, rebaudiosides A-F, steviolbioside, rubusoside, and dulcoside A by using the corresponding deuterated 16,17-dihydrosteviol glycosides as suitable internal standards. This quantitation not only enables the analysis of the individual steviol glycosides in foods and beverages but also can support the optimization of breeding and postharvest downstream processing of Stevia plants to produce preferentially sweet and least bitter tasting Stevia extracts.

  11. Recent advances in mass spectrometry-based proteomics of gastric cancer.

    PubMed

    Kang, Changwon; Lee, Yejin; Lee, J Eugene

    2016-10-07

    The last decade has witnessed remarkable technological advances in mass spectrometry-based proteomics. The development of proteomics techniques has enabled the reliable analysis of complex proteomes, leading to the identification and quantification of thousands of proteins in gastric cancer cells, tissues, and sera. This quantitative information has been used to profile the anomalies in gastric cancer and provide insights into the pathogenic mechanism of the disease. In this review, we mainly focus on the advances in mass spectrometry and quantitative proteomics that were achieved in the last five years and how these up-and-coming technologies are employed to track biochemical changes in gastric cancer cells. We conclude by presenting a perspective on quantitative proteomics and its future applications in the clinic and translational gastric cancer research.

  12. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD.

    PubMed

    Mansur, Sanawar; Abdulla, Rahima; Ayupbec, Amatjan; Aisa, Haji Akbar

    2016-12-21

    A method based on ultra performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the UPLC fingerprint were verified for similarity evaluation by systematically comparing chromatograms with professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In the quantitative analysis, the five compounds showed good regression (R² = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%-103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were all more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and identify the authenticity of R. rugosa.
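
    Fingerprint similarity values such as the >0.981 reported above are typically cosine- or correlation-type coefficients computed between whole chromatograms. A minimal sketch with synthetic chromatograms (the peak positions and noise levels are invented):

        import numpy as np

        rng = np.random.default_rng(3)
        t = np.linspace(0, 30, 600)                          # retention time, min
        # Reference fingerprint: three Gaussian peaks (placeholder chromatogram).
        reference = np.exp(-0.5 * ((t[:, None] - [5, 12, 21]) / 0.3) ** 2).sum(axis=1)
        # A sample batch: scaled reference plus noise.
        batch = reference * rng.uniform(0.9, 1.1) + rng.normal(scale=0.01, size=t.size)

        cosine = batch @ reference / (np.linalg.norm(batch) * np.linalg.norm(reference))
        print(f"fingerprint similarity = {cosine:.4f}")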

  13. Reliable enumeration of malaria parasites in thick blood films using digital image analysis.

    PubMed

    Frean, John A

    2009-09-23

    Quantitation of malaria parasite density is an important component of laboratory diagnosis of malaria. Microscopy of Giemsa-stained thick blood films is the conventional method for parasite enumeration. Accurate and reproducible parasite counts are difficult to achieve because of inherent technical limitations and human inconsistency. Inaccurate parasite density estimation may have adverse clinical and therapeutic implications for patients, and for endpoints of clinical trials of anti-malarial vaccines or drugs. Digital image analysis provides an opportunity to improve the performance of parasite density quantitation. Accurate manual parasite counts were done on 497 images of a range of thick blood films with varying densities of malaria parasites, to establish a uniformly reliable standard against which to assess the digital technique. By utilizing descriptive statistical parameters of parasite size frequency distributions, the particle counting algorithms of the digital image analysis programme were semi-automatically adapted to variations in parasite size, shape and staining characteristics, to produce optimum signal/noise ratios. A reliable counting process was developed that requires no operator decisions that might bias the outcome. Digital counts were highly correlated with manual counts for medium to high parasite densities, and slightly less well correlated with conventional counts. At low densities (fewer than 6 parasites per analysed image), signal/noise ratios were compromised and correlation between digital and manual counts was poor. Conventional counts were consistently lower than both digital and manual counts. Using open-access software and avoiding custom programming or any special operator intervention, accurate digital counts were obtained, particularly at high parasite densities that are difficult to count conventionally. The technique is potentially useful for laboratories that routinely perform malaria parasite enumeration. The requirements (a digital microscope camera, a personal computer and good-quality staining of slides) are reasonably easy to meet.
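
    The core of such digital counting is thresholding followed by connected-component labelling with a size filter. A schematic version using scipy on a synthetic image; the threshold and size window are placeholders for the parasite-size statistics the authors derived, not their actual settings:

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(5)
        image = rng.normal(50, 5, size=(512, 512))           # synthetic background
        for _ in range(40):                                  # paint 40 bright "parasites"
            r, c = rng.integers(20, 492, size=2)
            image[r - 2:r + 3, c - 2:c + 3] += 60

        binary = image > 80                                  # intensity threshold (assumed)
        labels, n = ndimage.label(binary)                    # connected components
        sizes = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
        count = int(np.sum((sizes >= 9) & (sizes <= 100)))   # keep parasite-sized objects only
        print(f"objects counted: {count}")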

  14. Two-dimensional digital photography for child body posture evaluation: standardized technique, reliable parameters and normative data for age 7-10 years.

    PubMed

    Stolinski, L; Kozinoga, M; Czaprowski, D; Tyrakowski, M; Cerny, P; Suzuki, N; Kotwicki, T

    2017-01-01

    Digital photogrammetry provides measurements of body angles or distances which allow quantitative posture assessment with or without the use of external markers. It is becoming an increasingly popular tool for the assessment of the musculoskeletal system. The aim of this paper is to present a structured method for the analysis of posture and its changes using a standardized digital photography technique. The study was twofold. The first part comprised 91 children (44 girls and 47 boys) aged 7-10 (8.2 ± 1.0), i.e., primary school students, and its aim was to develop the photographic method, choose the quantitative parameters, and determine the intraobserver reliability (repeatability) and the interobserver reliability (reproducibility) of sagittal-plane measurements using digital photography, as well as to compare Rippstein plurimeter and digital photography measurements. The second part involved 7,782 children (3,804 girls, 3,978 boys) aged 7-10 (8.4 ± 0.5), who underwent digital photography postural screening. The methods consisted of measuring and calculating selected parameters, establishing the normal ranges of photographic parameters, presenting percentile charts, and noting common pitfalls and possible sources of error in digital photography. A standardized procedure for the photographic evaluation of child body posture was presented. The photographic measurements showed very good intra- and inter-rater reliability for the five sagittal parameters and good reliability against Rippstein plurimeter measurements. The parameters displayed insignificant variability over time. Normative data were calculated based on the photographic assessment, and percentile charts were provided to serve as reference values. The technical errors observed during photogrammetry are carefully discussed in this article. Technical developments allow for the regular use of digital photogrammetry in body posture assessment. Specific child positioning (described above) makes it possible to avoid incidentally modified posture. Image registration is simple, quick, harmless, and cost-effective. The semi-automatic image analysis, together with the normal values and percentile charts, makes the technique reliable for documenting a child's posture and monitoring the effects of corrective therapy.

  15. A New Green Method for the Quantitative Analysis of Enrofloxacin by Fourier-Transform Infrared Spectroscopy.

    PubMed

    Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes

    2018-05-18

    Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.
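
    Linearity, LOD, and LOQ figures of the kind validated here are commonly derived from a calibration line, with LOD = 3.3·sigma/S and LOQ = 10·sigma/S (sigma = residual standard deviation, S = slope), one of the approaches recognized in the ICH guidelines the study follows. A sketch with invented IR band intensities:

        import numpy as np

        mass = np.array([1.0, 1.5, 2.0, 2.5, 3.0])          # mg, calibration levels
        signal = np.array([0.21, 0.31, 0.40, 0.51, 0.60])   # hypothetical IR band intensities

        slope, intercept = np.polyfit(mass, signal, 1)
        resid = signal - (slope * mass + intercept)
        sigma = resid.std(ddof=2)                            # residual standard deviation

        lod = 3.3 * sigma / slope
        loq = 10.0 * sigma / slope
        print(f"LOD = {lod:.2f} mg, LOQ = {loq:.2f} mg")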

  16. Differentiation of five body fluids from forensic samples by expression analysis of four microRNAs using quantitative PCR.

    PubMed

    Sauer, Eva; Reinke, Ann-Kathrin; Courts, Cornelius

    2016-05-01

    Applying molecular genetic approaches for the identification of forensically relevant body fluids, which often yields crucial information for the reconstruction of a potential crime, is a current topic of forensic research. Due to their body-fluid-specific expression patterns and stability against degradation, microRNAs (miRNA) have emerged as a promising molecular species, with a range of candidate markers published. The analysis of miRNA via quantitative real-time PCR, however, should be based on a sound strategy for normalizing non-biological variance in order to deliver reliable and biologically meaningful results. The work presented here is the most comprehensive study to date of forensic body fluid identification via miRNA expression analysis, based on a thoroughly validated qPCR procedure and unbiased statistical decision making to identify single-source samples. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
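
    One widely used normalization scheme for qPCR expression data of this kind (not necessarily the exact strategy of this study) is relative quantification against a reference RNA via 2^-ddCt. A compact sketch with hypothetical Ct values:

        # Hypothetical Ct values: target miRNA and a stable reference, in a stain vs. a control sample.
        ct_target_sample, ct_ref_sample = 24.1, 18.0
        ct_target_control, ct_ref_control = 28.7, 18.2

        delta_sample = ct_target_sample - ct_ref_sample      # normalise target to reference
        delta_control = ct_target_control - ct_ref_control
        ddct = delta_sample - delta_control

        fold_change = 2.0 ** (-ddct)                         # relative expression
        print(f"relative expression (2^-ddCt): {fold_change:.1f}-fold")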

  17. Developing a database for pedestrians' earthquake emergency evacuation in indoor scenarios.

    PubMed

    Zhou, Junxue; Li, Sha; Nie, Gaozhong; Fan, Xiwei; Tan, Jinxian; Li, Huayue; Pang, Xiaoke

    2018-01-01

    With the booming development of evacuation simulation software, developing an extensive database in indoor scenarios for evacuation models is imperative. In this paper, we conduct a qualitative and quantitative analysis of the collected videotapes and aim to provide a complete and unitary database of pedestrians' earthquake emergency response behaviors in indoor scenarios, including human-environment interactions. Using the qualitative analysis method, we extract keyword groups and keywords that code the response modes of pedestrians and construct a general decision flowchart using chronological organization. Using the quantitative analysis method, we analyze data on the delay time, evacuation speed, evacuation route and emergency exit choices. Furthermore, we study the effect of classroom layout on emergency evacuation. The database for indoor scenarios provides reliable input parameters and allows the construction of real and effective constraints for use in software and mathematical models. The database can also be used to validate the accuracy of evacuation models.

  18. Tissue-based quantitative proteome analysis of human hepatocellular carcinoma using tandem mass tags.

    PubMed

    Megger, Dominik Andre; Rosowski, Kristin; Ahrens, Maike; Bracht, Thilo; Eisenacher, Martin; Schlaak, Jörg F; Weber, Frank; Hoffmann, Andreas-Claudius; Meyer, Helmut E; Baba, Hideo A; Sitek, Barbara

    2017-03-01

    Human hepatocellular carcinoma (HCC) is a severe malignant disease, and accurate and reliable diagnostic markers are still needed. This study was aimed for the discovery of novel marker candidates by quantitative proteomics. Proteomic differences between HCC and nontumorous liver tissue were studied by mass spectrometry. Among several significantly upregulated proteins, translocator protein 18 (TSPO) and Ras-related protein Rab-1A (RAB1A) were selected for verification by immunohistochemistry in an independent cohort. For RAB1A, a high accuracy for the discrimination of HCC and nontumorous liver tissue was observed. RAB1A was verified to be a potent biomarker candidate for HCC.

  19. Quantitative analysis of tympanic membrane perforation: a simple and reliable method.

    PubMed

    Ibekwe, T S; Adeosun, A A; Nwaorgu, O G

    2009-01-01

    Accurate assessment of the features of tympanic membrane perforation, especially size, site, duration and aetiology, is important, as it enables optimum management. To describe a simple, cheap and effective method of quantitatively analysing tympanic membrane perforations. The system described comprises a video-otoscope (capable of generating still and video images of the tympanic membrane), adapted via a universal serial bus box to a computer screen, with images analysed using the Image J geometrical analysis software package. The reproducibility of results and their correlation with conventional otoscopic methods of estimation were tested statistically with the paired t-test and correlational tests, using the Statistical Package for the Social Sciences version 11 software. The following equation was generated: P/T × 100% = percentage perforation, where P is the area (in pixels²) of the tympanic membrane perforation and T is the total area (in pixels²) of the entire tympanic membrane (including the perforation). Illustrations are shown. Comparison of blinded data on tympanic membrane perforation area obtained independently from assessments by two trained otologists, of comparative years of experience, using the video-otoscopy system described, showed similar findings, with strong correlations devoid of inter-observer error (p = 0.000, r = 1). Comparison with conventional otoscopic assessment also indicated significant correlation, comparing results for two trained otologists, but some inter-observer variation was present (p = 0.000, r = 0.896). Correlation between the two methods for each of the otologists was also highly significant (p = 0.000). A computer-adapted video-otoscope, with images analysed by Image J software, represents a cheap, reliable, technology-driven, clinical method of quantitative analysis of tympanic membrane perforations and injuries.
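
    The quoted equation (P/T × 100% = percentage perforation) is a straightforward pixel-area ratio. A minimal sketch, with boolean masks standing in for the regions traced in Image J:

        import numpy as np

        # Hypothetical masks traced on the video-otoscope image:
        membrane = np.zeros((400, 400), dtype=bool)
        membrane[50:350, 50:350] = True                    # whole tympanic membrane, incl. perforation
        perforation = np.zeros_like(membrane)
        perforation[150:220, 160:240] = True               # traced perforation

        P = perforation.sum()                              # perforation area in pixels^2
        T = membrane.sum()                                 # total membrane area in pixels^2
        print(f"percentage perforation = {P / T * 100:.1f}%")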

  20. Rapid 2,2'-bicinchoninic-based xylanase assay compatible with high throughput screening

    Treesearch

    William R. Kenealy; Thomas W. Jeffries

    2003-01-01

    High-throughput screening requires simple assays that give reliable quantitative results. A microplate assay was developed for reducing sugar analysis that uses a 2,2'-bicinchoninic-based protein reagent. Endo-1,4-β-D-xylanase activity against oat spelt xylan was detected at activities of 0.002 to 0.011 IU ml⁻¹. The assay is linear for sugar...

  1. Quantitative Analysis of the Rubric as an Assessment Tool: An Empirical Study of Student Peer-Group Rating

    ERIC Educational Resources Information Center

    Hafner, John C.; Hafner, Patti M.

    2003-01-01

    Although the rubric has emerged as one of the most popular assessment tools in progressive educational programs, there is an unfortunate dearth of information in the literature quantifying the actual effectiveness of the rubric as an assessment tool "in the hands of the students." This study focuses on the validity and reliability of the rubric as…

  2. Effect of Sensors on the Reliability and Control Performance of Power Circuits in the Web of Things (WoT)

    PubMed Central

    Bae, Sungwoo; Kim, Myungchin

    2016-01-01

    In order to realize a true WoT environment, a reliable power circuit is required to ensure interconnections among a range of WoT devices. This paper presents research on sensors and their effects on the reliability and response characteristics of power circuits in WoT devices. The presented research can be used in various power circuit applications, such as energy harvesting interfaces, photovoltaic systems, and battery management systems for the WoT devices. As power circuits rely on the feedback from voltage/current sensors, the system performance is likely to be affected by the sensor failure rates, sensor dynamic characteristics, and their interface circuits. This study investigated how the operational availability of the power circuits is affected by the sensor failure rates by performing a quantitative reliability analysis. In the analysis process, this paper also includes the effects of various reconstruction and estimation techniques used in power processing circuits (e.g., energy harvesting circuits and photovoltaic systems). This paper also reports how the transient control performance of power circuits is affected by sensor interface circuits. With the frequency domain stability analysis and circuit simulation, it was verified that the interface circuit dynamics may affect the transient response characteristics of power circuits. The verification results in this paper showed that the reliability and control performance of the power circuits can be affected by the sensor types, fault tolerant approaches against sensor failures, and the response characteristics of the sensor interfaces. The analysis results were also verified by experiments using a power circuit prototype. PMID:27608020

  3. Quantification of dopamine transporters in the mouse brain using ultra-high resolution single-photon emission tomography.

    PubMed

    Acton, Paul D; Choi, Seok-Rye; Plössl, Karl; Kung, Hank F

    2002-05-01

    Functional imaging of small animals, such as mice and rats, using ultra-high resolution positron emission tomography (PET) and single-photon emission tomography (SPET), is becoming a valuable tool for studying animal models of human disease. While several studies have shown the utility of PET imaging in small animals, few have used SPET in real research applications. In this study we aimed to demonstrate the feasibility of using ultra-high resolution SPET in quantitative studies of dopamine transporters (DAT) in the mouse brain. Four healthy ICR male mice were injected with (mean ± SD) 704 ± 154 MBq [99mTc]TRODAT-1, and scanned using an ultra-high resolution SPET system equipped with pinhole collimators (spatial resolution 0.83 mm at 3 cm radius of rotation). Each mouse had two studies, to provide an indication of test-retest reliability. Reference tissue kinetic modeling analysis of the time-activity data in the striatum and cerebellum was used to quantitate the availability of DAT. A simple equilibrium ratio of striatum to cerebellum provided another measure of DAT binding. The SPET imaging results were compared against ex vivo biodistribution data from the striatum and cerebellum. The mean distribution volume ratio (DVR) from the reference tissue kinetic model was 2.17 ± 0.34, with a test-retest reliability of 2.63% ± 1.67%. The ratio technique gave similar results (DVR = 2.03 ± 0.38, test-retest reliability = 6.64% ± 3.86%), and the ex vivo analysis gave DVR = 2.32 ± 0.20. Correlations between the kinetic model and the ratio technique (R² = 0.86, p < 0.001) and the ex vivo data (R² = 0.92, p = 0.04) were both excellent. This study demonstrated clearly that ultra-high resolution SPET of small animals is capable of accurate, repeatable, and quantitative measures of DAT binding, and should open up the possibility of further studies of cerebral binding sites in mice using pinhole SPET.
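
    The simple equilibrium measure described (striatum-to-cerebellum ratio as a distribution volume ratio) and the test-retest variability can be expressed compactly; the region-of-interest activities below are invented, not the TRODAT-1 data:

        import numpy as np

        # Hypothetical ROI activities (kBq/mL) from two scans of the same mouse.
        striatum = np.array([41.2, 39.8])       # test, retest
        cerebellum = np.array([19.9, 20.3])     # reference region essentially devoid of DAT

        dvr = striatum / cerebellum             # equilibrium-ratio estimate of the DVR
        test_retest = abs(dvr[0] - dvr[1]) / dvr.mean() * 100
        print(f"DVR = {dvr.round(2)}, test-retest variability = {test_retest:.1f}%")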

  4. Quantitative Determination of Bioactive Constituents in Noni Juice by High-performance Liquid Chromatography with Electrospray Ionization Triple Quadrupole Mass Spectrometry.

    PubMed

    Yan, Yongqiu; Lu, Yu; Jiang, Shiping; Jiang, Yu; Tong, Yingpeng; Zuo, Limin; Yang, Jun; Gong, Feng; Zhang, Ling; Wang, Ping

    2018-01-01

    Noni juice has been extensively used as a folk medicine by Polynesians for many years for the treatment of arthritis, infections, colds, cancers, and diabetes, and as an analgesic. Due to the lack of standard scientific evaluation methods, commercial Noni juices of varying quality and price are available on the market. To establish a sensitive, reliable, and accurate high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry (HPLC-ESI-MS/MS) method for separation, identification, and simultaneous quantitative analysis of bioactive constituents in Noni juice. The analytes and eight batches of commercially available samples from different origins were separated and analyzed by the HPLC-ESI-MS/MS method on an Agilent ZORBAX SB-C18 (150 mm × 4.6 mm i.d., 5 μm) column using a gradient elution of acetonitrile-methanol-0.05% glacial acetic acid in water (v/v) at a constant flow rate of 0.5 mL/min. Seven components were identified, and all of the assay parameters were within the required limits. The components showed good linearity (correlation coefficients R² ≥ 0.9993) over the concentration ranges tested. The precision of the assay method was <0.91%, and the repeatability was between 1.36% and 3.31%. The accuracy varied from 96.40% to 103.02%, and the relative standard deviations of stability were <3.91%. Samples from the same origin showed similar contents, while samples from different origins showed significantly different results. The developed method provides a reliable basis and will be useful in the establishment of a rational quality-control standard for Noni juice. A separation, identification, and simultaneous quantitative analysis method for seven bioactive constituents in Noni juice was developed for the first time by HPLC-ESI-MS/MS. The presented method was successfully applied to the quality control of eight batches of commercially available samples of Noni juice. The method is simple, sensitive, reliable, accurate, and efficient, with strong specificity, good precision, and a high recovery rate, and provides a reliable basis for the quality control of Noni juice. Abbreviations used: HPLC-ESI-MS/MS: high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry; LOD: limit of detection; LOQ: limit of quantitation; S/N: signal-to-noise ratio; RSD: relative standard deviation; DP: declustering potential; CE: collision energy; MRM: multiple reaction monitoring; RT: retention time.

  5. Quantitative Determination of Bioactive Constituents in Noni Juice by High-performance Liquid Chromatography with Electrospray Ionization Triple Quadrupole Mass Spectrometry

    PubMed Central

    Yan, Yongqiu; Lu, Yu; Jiang, Shiping; Jiang, Yu; Tong, Yingpeng; Zuo, Limin; Yang, Jun; Gong, Feng; Zhang, Ling; Wang, Ping

    2018-01-01

    Background: Noni juice has been extensively used as a folk medicine by Polynesians for many years for the treatment of arthritis, infections, colds, cancers, and diabetes, and as an analgesic. Due to the lack of standard scientific evaluation methods, commercial Noni juices of varying quality and price are available on the market. Objective: To establish a sensitive, reliable, and accurate high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry (HPLC-ESI-MS/MS) method for separation, identification, and simultaneous quantitative analysis of bioactive constituents in Noni juice. Materials and Methods: The analytes and eight batches of commercially available samples from different origins were separated and analyzed by the HPLC-ESI-MS/MS method on an Agilent ZORBAX SB-C18 (150 mm × 4.6 mm i.d., 5 μm) column using a gradient elution of acetonitrile-methanol-0.05% glacial acetic acid in water (v/v) at a constant flow rate of 0.5 mL/min. Results: Seven components were identified, and all of the assay parameters were within the required limits. The components showed good linearity (correlation coefficients R² ≥ 0.9993) over the concentration ranges tested. The precision of the assay method was <0.91%, and the repeatability was between 1.36% and 3.31%. The accuracy varied from 96.40% to 103.02%, and the relative standard deviations of stability were <3.91%. Samples from the same origin showed similar contents, while samples from different origins showed significantly different results. Conclusions: The developed method provides a reliable basis and will be useful in the establishment of a rational quality-control standard for Noni juice. SUMMARY: A separation, identification, and simultaneous quantitative analysis method for seven bioactive constituents in Noni juice was developed for the first time by HPLC-ESI-MS/MS. The presented method was successfully applied to the quality control of eight batches of commercially available samples of Noni juice. The method is simple, sensitive, reliable, accurate, and efficient, with strong specificity, good precision, and a high recovery rate, and provides a reliable basis for the quality control of Noni juice. Abbreviations used: HPLC-ESI-MS/MS: high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry; LOD: limit of detection; LOQ: limit of quantitation; S/N: signal-to-noise ratio; RSD: relative standard deviation; DP: declustering potential; CE: collision energy; MRM: multiple reaction monitoring; RT: retention time. PMID:29576704

  6. Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network

    NASA Astrophysics Data System (ADS)

    Li, Zhiqiang; Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu

    2018-04-01

    This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely the conditional probability tables, can be calculated by referring to the conditional degradation probabilities. Finally, the power of a control unit in a failure mode is used as an example. A dynamic fault tree (DFT) is translated into a Bayesian network model and subsequently extended to a DBN. The results show the state probabilities of an element and of the system without repair, with perfect and imperfect repair, and under CBM; the probabilities of the absorbing set, obtained from the differential equations, are plotted and verified. Through forward inference, the reliability of the control unit is determined under the different kinds of modes, and weak nodes in the control unit are identified.
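
    The Markov-process backbone of such a model propagates state probabilities through a transition matrix, and an absorbing (failed) state yields reliability directly as the probability of never having been absorbed. A minimal discrete-time sketch (the three-state element and its transition probabilities are invented for illustration, not taken from the paper):

        import numpy as np

        # Per-step transition probabilities for a 3-state element (rows sum to 1).
        # State 2 (failed) is absorbing: once entered, it is never left.
        P = np.array([[0.95, 0.04, 0.01],
                      [0.00, 0.90, 0.10],
                      [0.00, 0.00, 1.00]])

        state = np.array([1.0, 0.0, 0.0])       # start in the healthy state
        for step in range(1, 51):
            state = state @ P                   # propagate one time step
            if step % 10 == 0:
                print(f"t={step:2d}  P(states)={state.round(4)}  reliability={1 - state[2]:.4f}")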

  7. Intra-laboratory validation of chronic bee paralysis virus quantitation using an accredited standardised real-time quantitative RT-PCR method.

    PubMed

    Blanchard, Philippe; Regnault, Julie; Schurr, Frank; Dubois, Eric; Ribière, Magali

    2012-03-01

    Chronic bee paralysis virus (CBPV) is responsible for chronic bee paralysis, an infectious and contagious disease in adult honey bees (Apis mellifera L.). A real-time RT-PCR assay to quantitate the CBPV load is now available. To propose this assay as a reference method, it was characterised further in an intra-laboratory study during which the reliability and the repeatability of results and the performance of the assay were confirmed. The qPCR assay alone and the whole quantitation method (from sample RNA extraction to analysis) were both assessed following the ISO/IEC 17025 standard and the recent XP U47-600 standard issued by the French Standards Institute. The performance of the qPCR assay and of the overall CBPV quantitation method were validated over a 6 log range from 10(2) to 10(8) with a detection limit of 50 and 100 CBPV RNA copies, respectively, and the protocol of the real-time RT-qPCR assay for CBPV quantitation was approved by the French Accreditation Committee. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. Quantification of the methylation status of the PWS/AS imprinted region: comparison of two approaches based on bisulfite sequencing and methylation-sensitive MLPA.

    PubMed

    Dikow, Nicola; Nygren, Anders Oh; Schouten, Jan P; Hartmann, Carolin; Krämer, Nikola; Janssen, Bart; Zschocke, Johannes

    2007-06-01

    Standard methods used for genomic methylation analysis allow the detection of complete absence of either methylated or non-methylated alleles but are usually unable to detect changes in the proportion of methylated and unmethylated alleles. We compare two methods for quantitative methylation analysis, using the chromosome 15q11-q13 imprinted region as model. Absence of the non-methylated paternal allele in this region leads to Prader-Willi syndrome (PWS) whilst absence of the methylated maternal allele results in Angelman syndrome (AS). A proportion of AS is caused by mosaic imprinting defects which may be missed with standard methods and require quantitative analysis for their detection. Sequence-based quantitative methylation analysis (SeQMA) involves quantitative comparison of peaks generated through sequencing reactions after bisulfite treatment. It is simple, cost-effective and can be easily established for a large number of genes. However, our results support previous suggestions that methods based on bisulfite treatment may be problematic for exact quantification of methylation status. Methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA) avoids bisulfite treatment. It detects changes in both CpG methylation as well as copy number of up to 40 chromosomal sequences in one simple reaction. Once established in a laboratory setting, the method is more accurate, reliable and less time consuming.

  9. Fatty degeneration of the rotator cuff muscles on pre- and postoperative CT arthrography (CTA): is the Goutallier grading system reliable?

    PubMed

    Lee, Eugene; Choi, Jung-Ah; Oh, Joo Han; Ahn, Soyeon; Hong, Sung Hwan; Chai, Jee Won; Kang, Heung Sik

    2013-09-01

    To retrospectively evaluate fatty degeneration (FD) of the rotator cuff muscles on CTA using Goutallier's grading system and quantitative measurements, with comparison between pre- and postoperative states. IRB approval was obtained for this study. Two radiologists independently reviewed pre- and postoperative CTAs of 43 patients (24 males and 19 females; mean age, 58.1 years) with 46 shoulders confirmed as full-thickness tears, reviewed in random order. FD of the supraspinatus, infraspinatus/teres minor, and subscapularis was assessed using Goutallier's system and by quantitative measurement of Hounsfield units (HUs) on sagittal images. Changes in FD grades and HUs were compared between pre- and postoperative CTAs and analyzed with respect to preoperative tear size and postoperative cuff integrity. The correlations between qualitative grades and quantitative measurements and their inter-observer reliabilities were also assessed. There was a statistically significant correlation between FD grades and HU measurements of all muscles on pre- and postoperative CTA (p < 0.05). Inter-observer reliability of the FD grades was excellent to substantial on both pre- and postoperative CTA for the supraspinatus (0.8685 and 0.8535) and subscapularis muscles (0.7777 and 0.7972), but fair for the infraspinatus/teres minor muscles (0.5791 and 0.5740); quantitative HU measurements, however, showed excellent reliability for all muscles (ICC: 0.7950 and 0.9346 for SST, 0.7922 and 0.8492 for SSC, and 0.9254 and 0.9052 for IST/TM). No muscle showed improvement of fatty degeneration after surgical repair on qualitative or quantitative assessment, and there was no difference in the changes of fatty degeneration after surgical repair according to preoperative tear size and postoperative cuff integrity (p > 0.05). The average dose-length product (DLP) was 365.2 mGy · cm (range, 323.8-417.2 mGy · cm) and the estimated average effective dose was 5.1 mSv. Goutallier grades correlated well with the HUs of the rotator cuff muscles. Reliability was excellent for both systems, except for the FD grade of the IST/TM muscles, which may be more reliably assessed using quantitative measurements.
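
    The reported agreement between ordinal Goutallier grades and continuous Hounsfield-unit measurements is the kind of association a rank correlation captures. A sketch with fabricated grade/HU pairs (higher grade should mean lower, i.e. fattier, mean HU):

        import numpy as np
        from scipy.stats import spearmanr

        # Hypothetical paired observations for one muscle across ten shoulders.
        grades = np.array([0, 0, 1, 1, 2, 2, 3, 3, 4, 4])        # Goutallier grades
        mean_hu = np.array([62, 58, 51, 49, 38, 41, 25, 22, 8, 12])  # mean HU on sagittal CTA

        rho, p = spearmanr(grades, mean_hu)
        print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")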

  10. Quantitative estimation of the high-intensity zone in the lumbar spine: comparison between the symptomatic and asymptomatic population.

    PubMed

    Liu, Chao; Cai, Hong-Xin; Zhang, Jian-Feng; Ma, Jian-Jun; Lu, Yin-Jiang; Fan, Shun-Wu

    2014-03-01

    The high-intensity zone (HIZ) on magnetic resonance imaging (MRI) has been studied for more than 20 years, but its diagnostic value in low back pain (LBP) is limited by its high incidence in asymptomatic subjects. Little effort has been made to improve the objective assessment of HIZ. To develop quantitative measurements for HIZ, estimate their intra- and interobserver reliability, and clarify differences in the signal intensity of HIZ between patients with and without LBP. A measurement reliability and prospective comparative study. A consecutive series of patients with LBP between June 2010 and May 2011 (group A) and a series of asymptomatic controls recruited during the same period (group B). Incidence of HIZ; quantitative measures, including area of disc, area and signal intensity of HIZ, and MRI index; and intraclass correlation coefficients (ICCs) for intra- and interobserver reliability. On the basis of the HIZ criteria, a series of quantitative dimension and signal-intensity measures was developed for assessing HIZ. Two experienced spine surgeons traced the region of interest twice within 4 weeks for assessment of intra- and interobserver reliability. The quantitative variables were compared between groups A and B. There were 72 patients with LBP and 79 asymptomatic controls enrolled in this study. The prevalence of HIZ in group A and group B was 45.8% and 20.2%, respectively. Intraobserver agreement was excellent for the quantitative measures (ICC = 0.838-0.977), as was interobserver reliability (ICC = 0.809-0.935). The mean signal of HIZ in group A was significantly brighter than in group B (57.55 ± 14.04% vs. 45.61 ± 7.22%, p = .000). There was no statistical difference in the areas of disc and HIZ between the two groups. The MRI index was higher in group A than in group B (3.94 ± 1.71 vs. 3.06 ± 1.50), but with a p value of .050. A series of quantitative measurements for HIZ was established and demonstrated excellent intra- and interobserver reliability. The signal intensity of HIZ differed between patients with and without LBP, with a significantly brighter signal observed in symptomatic subjects. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Isolation and quantification of Quillaja saponaria Molina saponins and lipids in iscom-matrix and iscoms.

    PubMed

    Behboudi, S; Morein, B; Rönnberg, B

    1995-12-01

    In the iscom, multiple copies of antigen are attached by hydrophobic interaction to a matrix which is built up of Quillaja triterpenoid saponins and lipids. Thus, the iscom presents antigen in multimeric form in a small particle with a built-in adjuvant, resulting in a highly immunogenic antigen formulation. We have designed a chloroform-methanol-water extraction procedure to isolate the triterpenoid saponins and lipids incorporated into iscom-matrix and iscoms. The triterpenoids in the triterpenoid phase were quantitated by HPLC and by the orcinol-sulfuric acid method, which detects their carbohydrate chains. The cholesterol and phosphatidylcholine in the lipid phase were quantitated by HPLC, and the cholesterol additionally by a commercial colorimetric method. The quantitative methods showed an almost total separation and recovery of triterpenoids and lipids in their respective phases, while protein was detected in all phases after extraction. The protein content was determined by the method of Lowry and by amino acid analysis; amino acid analysis was shown to be the more reliable of the two methods for quantitating proteins in iscoms. In conclusion, simple, reproducible and efficient procedures have been designed to isolate and quantitate the triterpenoids and lipids added during preparation of iscom-matrix and iscoms. The procedures described should also be useful for adequately defining the constituents of prospective vaccines.

  12. Characterization of shape and deformation of MEMS by quantitative optoelectronic metrology techniques

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2002-06-01

    Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe optoelectronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS structures. With the described optoelectronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as images with high spatial and digital resolution, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of optoelectronic techniques are illustrated with representative applications, demonstrating their ability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.

  13. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework.

    PubMed

    Anguera, M Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2018-01-01

    Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.

  14. Quantitative measurement of carbon nanotubes released from their composites by thermal carbon analysis

    NASA Astrophysics Data System (ADS)

    Ogura, I.; Kotake, M.; Ata, S.; Honda, K.

    2017-06-01

    The release of free carbon nanotubes (CNTs) and CNTs partly embedded in matrix debris into the air may occur during mechanical and abrasion processes involving CNT composites. Since the harmful effects of CNT-matrix mixtures have not yet been fully evaluated, any exposure to CNTs, including CNT-matrix mixtures, should be measured and controlled. Thermal carbon analysis, such as Method 5040 of the National Institute for Occupational Safety and Health, is one of the most reliable quantitative methods for measuring CNTs in the air. However, when CNTs are released together with polymer matrices, this technique may be inapplicable. In this study, we evaluated the potential for using thermal carbon analysis to determine CNTs in the presence of polymer matrices. Our results showed that thermal carbon analysis was potentially capable of distinguishing CNTs from polyamide 12, polybutylene terephthalate, polypropylene, and polyoxymethylene. However, it was difficult to determine CNTs in the presence of polyethylene terephthalate, polycarbonate, polyetheretherketone, or polyamide 6.

  15. Use of FTA® classic cards for epigenetic analysis of sperm DNA.

    PubMed

    Serra, Olga; Frazzi, Raffaele; Perotti, Alessio; Barusi, Lorenzo; Buschini, Annamaria

    2018-02-01

    FTA® technologies provide the most reliable method for DNA extraction. Although FTA technologies have been widely used for genetic analysis, there is no literature on their use for epigenetic analysis yet. We present for the first time, a simple method for quantitative methylation assessment based on sperm cells stored on Whatman FTA classic cards. Specifically, elution of seminal DNA from FTA classic cards was successfully tested with an elution buffer and an incubation step in a thermocycler. The eluted DNA was bisulfite converted, amplified by PCR, and a region of interest was pyrosequenced.

  16. An overview of technical considerations when using quantitative real-time PCR analysis of gene expression in human exercise research

    PubMed Central

    Yan, Xu; Bishop, David J.

    2018-01-01

    Gene expression analysis by quantitative PCR in skeletal muscle is routine in exercise studies. The reproducibility and reliability of the data fundamentally depend on how the experiments are performed and interpreted. Despite the popularity of the assay, there is considerable variation in experimental protocols and data analyses across laboratories, and proper quality control steps are applied inconsistently throughout the assay. In this study, we present a number of experiments on various steps of the quantitative PCR workflow and demonstrate how to perform a quantitative PCR experiment with human skeletal muscle samples in an exercise study. We also tested some common mistakes in performing qPCR. Interestingly, we found that mishandling of muscle for a short time span (10 min) before RNA extraction did not affect RNA quality, and that isolated total RNA was preserved for up to one week at room temperature. As our data demonstrate, the use of unstable reference genes leads to substantial differences in the final results. Alternatively, cDNA content can be used for data normalisation; however, complete removal of RNA from cDNA samples is essential for obtaining accurate cDNA content. PMID:29746477
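
    The reference-gene normalisation discussed above is commonly implemented via the 2^-ΔΔCt calculation. The sketch below illustrates that general method under invented Ct values; it is a minimal illustration, not the authors' actual analysis pipeline.

    ```python
    # Minimal sketch of reference-gene (2^-ΔΔCt) normalisation for qPCR data.
    # All Ct values below are illustrative, not from the study.

    def delta_delta_ct(ct_target_sample, ct_ref_sample,
                       ct_target_control, ct_ref_control):
        """Return relative expression (fold change) via the 2^-ΔΔCt method."""
        d_ct_sample = ct_target_sample - ct_ref_sample     # normalise to reference gene
        d_ct_control = ct_target_control - ct_ref_control
        dd_ct = d_ct_sample - d_ct_control                 # normalise to control condition
        return 2 ** (-dd_ct)

    # Example: target gene amplifies 2 cycles earlier after exercise -> ~4-fold up.
    print(delta_delta_ct(22.0, 18.0, 24.0, 18.0))  # 4.0
    ```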

  17. PIQMIe: a web server for semi-quantitative proteomics data management and analysis

    PubMed Central

    Kuzniar, Arnold; Kanaar, Roland

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. PMID:24861615

  18. Relative quantitation of glycosylation variants by stable isotope labeling of enzymatically released N-glycans using [12C]/[13C] aniline and ZIC-HILIC-ESI-TOF-MS.

    PubMed

    Giménez, Estela; Sanz-Nebot, Victòria; Rizzi, Andreas

    2013-09-01

    Glycan reductive isotope labeling (GRIL) using [(12)C]- and [(13)C]-coded aniline was used for relative quantitation of N-glycans. In a first step, the labeling method by reductive amination was optimized for this reagent. It could be demonstrated that selecting aniline as the limiting reactant and using the reductant in excess is critical for achieving high derivatization yields (over 95%) and good reproducibility (relative standard deviations ∼1-5% for major and ∼5-10% for minor N-glycans). In a second step, zwitterionic hydrophilic interaction liquid chromatography in capillary columns coupled to electrospray mass spectrometry with a time-of-flight analyzer (μZIC-HILIC-ESI-TOF-MS) was applied for the analysis of labeled N-glycans released from intact glycoproteins. Ovalbumin, bovine α1-acid-glycoprotein and bovine fetuin were used as test glycoproteins to establish and evaluate the methodology. Excellent separation of isomeric N-glycans and reproducible quantitation via the extracted ion chromatograms indicate the great potential of the proposed methodology for glycoproteomic analysis and for reliable relative quantitation of glycosylation variants in biological samples.

  19. Quantification of EEG reactivity in comatose patients

    PubMed Central

    Hermans, Mathilde C.; Westover, M. Brandon; van Putten, Michel J.A.M.; Hirsch, Lawrence J.; Gaspard, Nicolas

    2016-01-01

    Objective EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. Methods In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. Results The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed the best accuracy (median AUC: 0.95) and was accompanied by substantial agreement with the individual opinion of experts (Gwet’s AC1: 65–70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-experts’ agreement regarding reactivity for each individual case. Conclusion Automated quantitative EEG approaches based on a probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Significance Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. PMID:26183757
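
    The spectral temporal symmetry idea above amounts to comparing band power before and after stimulation onset. The sketch below, using synthetic signals and the standard Welch estimator from SciPy, shows one minimal way to compute such a per-band change score; the paper's probabilistic classifier built on these parameters is more involved.

    ```python
    # Sketch: compare spectral band power before vs. after stimulation onset.
    # Uses synthetic data; all parameters (fs, segment length) are assumptions.
    import numpy as np
    from scipy.signal import welch

    fs = 250  # sampling rate in Hz, assumed
    rng = np.random.default_rng(0)
    pre = rng.standard_normal(60 * fs)    # 1-minute pre-stimulus segment
    post = rng.standard_normal(60 * fs)   # 1-minute post-stimulus segment

    bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_powers(x):
        f, pxx = welch(x, fs=fs, nperseg=4 * fs)
        return {name: pxx[(f >= lo) & (f < hi)].sum() for name, (lo, hi) in bands.items()}

    p_pre, p_post = band_powers(pre), band_powers(post)
    # A simple per-band symmetry index: log power ratio (0 = no change).
    for name in bands:
        print(name, np.log(p_post[name] / p_pre[name]))
    ```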

  20. PIQMIe: a web server for semi-quantitative proteomics data management and analysis.

    PubMed

    Kuzniar, Arnold; Kanaar, Roland

    2014-07-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. Comparative study between quantitative digital image analysis and fluorescence in situ hybridization of breast cancer equivocal human epidermal growth factor receptors 2 score 2(+) cases.

    PubMed

    Ayad, Essam; Mansy, Mina; Elwi, Dalal; Salem, Mostafa; Salama, Mohamed; Kayser, Klaus

    2015-01-01

    Optimization of workflow for breast cancer samples with equivocal human epidermal growth factor receptor 2 (HER2)/neu score 2(+) results in routine practice remains a central focus of the ongoing efforts to assess HER2 status. According to the College of American Pathologists/American Society of Clinical Oncology guidelines, equivocal HER2/neu score 2(+) cases are subject to further testing, usually by fluorescence in situ hybridization (FISH) investigations. It remains an open question whether quantitative digital image analysis of HER2 immunohistochemistry (IHC)-stained slides can assist in further refining the HER2 score 2(+). To assess the utility of quantitative digital analysis of IHC-stained slides and compare its performance to FISH in cases of breast cancer with equivocal HER2 score 2(+). Fifteen specimens (previously diagnosed as breast cancer and evaluated as HER2 score 2(+)) represented the study population. New cuts were prepared for re-evaluation of HER2 immunohistochemical studies and for FISH examination. All the cases were digitally scanned by iScan (produced by BioImagene [now Roche-Ventana]). The IHC signals of HER2 were measured using an automated image analyzing system (MECES, www.Diagnomx.eu/meces). Finally, a comparative study was done between the results of the FISH and the quantitative analysis of the virtual slides. Three out of the 15 cases with equivocal HER2 score 2(+) turned out to be positive (3(+)) by quantitative digital analysis, and the remaining 12 were found to be negative by FISH as well. Two of these three positive cases proved to be positive with FISH, and only one was negative. Quantitative digital analysis is highly sensitive and relatively specific when compared to FISH in detecting HER2/neu overexpression. Therefore, it represents a potentially reliable substitute for FISH in breast cancer cases that require further refinement of equivocal IHC results.

  2. Validation of the Mass-Extraction-Window for Quantitative Methods Using Liquid Chromatography High Resolution Mass Spectrometry.

    PubMed

    Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand

    2016-03-15

    A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
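
    The core arithmetic behind a MEW is a ppm tolerance converted to an absolute m/z window around the target mass. The sketch below illustrates that conversion; the width and example m/z are invented, and whether a MEW is quoted as a full width or a ± half-width varies between platforms, so the full-width convention used here is an assumption.

    ```python
    # Sketch: build an extracted-ion-chromatogram mass window from a ppm
    # tolerance (MEW), as one might for LC-HRMS quantitation. Values illustrative.

    def mass_extraction_window(mz, mew_ppm):
        """Return (low, high) m/z bounds for a symmetric window of mew_ppm total width."""
        half = mz * (mew_ppm / 2) * 1e-6   # half-width in Da
        return mz - half, mz + half

    # e.g. a 10 ppm window around a hypothetical analyte at m/z 524.2648
    lo, hi = mass_extraction_window(524.2648, 10.0)
    print(f"{lo:.4f} - {hi:.4f}")
    ```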

  3. Reliability analysis and fault-tolerant system development for a redundant strapdown inertial measurement unit. [inertial platforms

    NASA Technical Reports Server (NTRS)

    Motyka, P.

    1983-01-01

    A methodology is developed and applied for quantitatively analyzing the reliability of a dual, fail-operational redundant strapdown inertial measurement unit (RSDIMU). A Markov evaluation model is defined in terms of the operational states of the RSDIMU to predict system reliability. A 27-state model is defined based upon a candidate redundancy management system which can detect and isolate a spectrum of failure magnitudes. The results of parametric studies are presented which show the effect on reliability of the gyro failure rate; the gyro and accelerometer failure rates together; false alarms; the probabilities of failure detection, failure isolation, and damage effects; and mission time. A technique is developed and evaluated for generating dynamic thresholds for detecting and isolating failures of the dual, separated IMU. Special emphasis is given to the detection of multiple, nonconcurrent failures. Digital simulation time histories are presented which show the thresholds obtained and their effectiveness in detecting and isolating sensor failures.
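
    A Markov reliability evaluation of this kind reduces to propagating state probabilities through a generator matrix. The sketch below shows a minimal three-state fail-operational model with an imperfect-coverage split; the 27-state RSDIMU model follows the same mechanics at larger scale. All rates, the coverage value, and the mission time are illustrative assumptions.

    ```python
    # Sketch: a 3-state Markov reliability model in the spirit of the RSDIMU
    # analysis (dual, fail-operational, with coverage). Numbers are invented.
    import numpy as np
    from scipy.linalg import expm

    lam = 1e-4   # per-unit failure rate (1/h), assumed
    c = 0.98     # coverage: probability a first failure is detected/isolated

    # States: 0 = both units good, 1 = one covered failure (still operational),
    #         2 = system failed (absorbing). Q is the generator matrix.
    Q = np.array([
        [-2 * lam,  2 * lam * c,  2 * lam * (1 - c)],
        [0.0,       -lam,         lam],
        [0.0,       0.0,          0.0],
    ])

    t = 1000.0  # mission time in hours
    p = np.array([1.0, 0.0, 0.0]) @ expm(Q * t)   # state probabilities at time t
    print("reliability =", p[0] + p[1])           # probability still operational
    ```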

  4. Quantitative analysis of the rubric as an assessment tool: an empirical study of student peer-group rating

    NASA Astrophysics Data System (ADS)

    Hafner, John C.; Hafner, Patti M.

    2003-12-01

    Although the rubric has emerged as one of the most popular assessment tools in progressive educational programs, there is an unfortunate dearth of information in the literature quantifying the actual effectiveness of the rubric as an assessment tool in the hands of the students. This study focuses on the validity and reliability of the rubric as an assessment tool for student peer-group evaluation in an effort to further explore the use and effectiveness of the rubric. A total of 1577 peer-group ratings using a rubric for an oral presentation were used in this 3-year study involving 107 college biology students. A quantitative analysis of the rubric used in this study shows that it is used consistently by both students and the instructor across the study years. Moreover, the rubric appears to be 'gender neutral', and the students' academic strength has no significant bearing on the way that they employ the rubric. A significant, one-to-one relationship (slope = 1.0) between the instructor's assessment and the students' rating is seen across all years using the rubric. A generalizability study yields moderate estimates of inter-rater reliability across all years and allows for the estimation of variance components. Taken together, these data indicate that the general form and evaluative criteria of the rubric are clear and that the rubric is a useful assessment tool for peer-group (and self-) assessment by students. To our knowledge, these data provide the first statistical documentation of the validity and reliability of the rubric for student peer-group assessment.

  5. Analytical aspects of hydrogen exchange mass spectrometry

    PubMed Central

    Engen, John R.; Wales, Thomas E.

    2016-01-01

    The analytical aspects of measuring hydrogen exchange by mass spectrometry are reviewed. The nature of analytical selectivity in hydrogen exchange is described followed by review of the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in hydrogen exchange mass spectrometry depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that could be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics. PMID:26048552
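
    Quantitation by mass change, as described, is typically computed as the difference between intensity-weighted centroids of the deuterated and undeuterated isotope envelopes. A minimal sketch, with invented envelope data:

    ```python
    # Sketch: deuterium uptake as a centroid-mass difference relative to an
    # undeuterated control, the quantitation scheme described above.
    import numpy as np

    def centroid(mz, intensity):
        """Intensity-weighted mean m/z of an isotope envelope."""
        mz, intensity = np.asarray(mz), np.asarray(intensity)
        return (mz * intensity).sum() / intensity.sum()

    # Illustrative isotope envelopes for one peptide (values are made up).
    mz = np.array([800.40, 800.90, 801.40, 801.90, 802.40])
    undeut = np.array([100, 80, 40, 10, 2])
    deut = np.array([10, 50, 90, 70, 30])

    # For a singly charged species the m/z shift equals the mass shift in Da.
    uptake = centroid(mz, deut) - centroid(mz, undeut)
    print(f"deuterium uptake ≈ {uptake:.2f} Da")
    ```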

  6. Quantitative performance of a polarization diffraction grating polarimeter encoded onto two liquid-crystal-on-silicon displays

    NASA Astrophysics Data System (ADS)

    Cofré, Aarón; Vargas, Asticio; Torres-Ruiz, Fabián A.; Campos, Juan; Lizana, Angel; del Mar Sánchez-López, María; Moreno, Ignacio

    2017-11-01

    We present a quantitative analysis of the performance of a complete snapshot polarimeter based on a polarization diffraction grating (PDGr). The PDGr is generated in a common-path polarization interferometer with a Z optical architecture that uses two liquid-crystal-on-silicon (LCoS) displays to imprint two different phase-only diffraction gratings onto two orthogonal linear states of polarization. As a result, we obtain a programmable PDGr capable of acting as a simultaneous polarization state generator (PSG), yielding diffraction orders with different states of polarization. The same system is also shown to operate as a polarization state analyzer (PSA), and is therefore useful for the realization of a snapshot polarimeter. We analyze its performance using quantitative metrics such as the condition number, and verify its reliability for the detection of states of polarization.
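
    The condition number metric mentioned above gauges how well-conditioned a polarimeter's measurement matrix is for inverting measured intensities into Stokes parameters. A minimal sketch for an assumed tetrahedral analyzer set, a common near-optimal choice and not necessarily the configuration used in the paper:

    ```python
    # Sketch: condition number of a polarization state analyzer. Each row of W
    # is an analyzer Stokes vector; the set below is an illustrative assumption.
    import numpy as np

    # Four analyzer states forming a regular tetrahedron on the Poincaré sphere.
    s = 1 / np.sqrt(3)
    W = np.array([
        [1,  s,  s,  s],
        [1,  s, -s, -s],
        [1, -s,  s, -s],
        [1, -s, -s,  s],
    ]) / 2

    # Lower is better; this optimal tetrahedral set gives sqrt(3) ≈ 1.73.
    print("condition number:", np.linalg.cond(W))
    ```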

  7. An Online Risk Monitor System (ORMS) to Increase Safety and Security Levels in Industry

    NASA Astrophysics Data System (ADS)

    Zubair, M.; Rahman, Khalil Ur; Hassan, Mehmood Ul

    2013-12-01

    The main idea of this research is to develop an Online Risk Monitor System (ORMS) based on Living Probabilistic Safety Assessment (LPSA). The article highlights the essential features and functions of ORMS. The basic models and modules, such as the Reliability Data Update Model (RDUM), running-time update, redundant-system unavailability update, Engineered Safety Features (ESF) unavailability update, and general system update, are described in this study. ORMS not only provides quantitative analysis but also highlights qualitative aspects of risk measures. ORMS is capable of automatically updating the online risk models and reliability parameters of equipment. ORMS can support the decision-making process of operators and managers in nuclear power plants.
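
    A reliability-parameter update of the kind RDUM performs is often implemented as a conjugate Bayesian update of the failure rate. The sketch below shows a gamma-Poisson update; the prior and operating data are invented, and the abstract does not specify the actual RDUM formulation, so this is an assumed standard scheme.

    ```python
    # Sketch: Bayesian failure-rate update of the kind an online risk monitor
    # might perform. Gamma prior on the rate, Poisson evidence; numbers made up.

    def update_failure_rate(alpha, beta, failures, exposure_hours):
        """Conjugate gamma-Poisson update; returns posterior (alpha, beta)."""
        return alpha + failures, beta + exposure_hours

    alpha0, beta0 = 2.0, 1.0e5            # prior: mean rate 2e-5 per hour
    a, b = update_failure_rate(alpha0, beta0, failures=1, exposure_hours=5.0e4)
    print("posterior mean rate:", a / b)  # updated point estimate for the model
    ```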

  8. Characterizing ceramics and the interfacial adhesion to resin: I - The relationship of microstructure, composition, properties and fractography.

    PubMed

    Della Bona, Alvaro

    2005-03-01

    The appeal of ceramics as structural dental materials is based on their light weight, high hardness values, chemical inertness, and anticipated unique tribological characteristics. A major goal of current ceramic research and development is to produce tough, strong ceramics that can provide reliable performance in dental applications. Quantifying microstructural parameters is important to develop structure/property relationships. Quantitative microstructural analysis provides an association among the constitution, physical properties, and structural characteristics of materials. Structural reliability of dental ceramics is a major factor in the clinical success of ceramic restorations. Complex stress distributions are present in most practical conditions and strength data alone cannot be directly extrapolated to predict structural performance.

  9. Functional quantitative susceptibility mapping (fQSM).

    PubMed

    Balla, Dávid Z; Sanchez-Panchuelo, Rosa M; Wharton, Samuel J; Hagberg, Gisela E; Scheffler, Klaus; Francis, Susan T; Bowtell, Richard

    2014-10-15

    Blood oxygenation level dependent (BOLD) functional magnetic resonance imaging (fMRI) is a powerful technique, typically based on the statistical analysis of the magnitude component of the complex time-series. Here, we additionally interrogated the phase data of the fMRI time-series and used quantitative susceptibility mapping (QSM) in order to investigate the potential of functional QSM (fQSM) relative to standard magnitude BOLD fMRI. High-spatial-resolution data (1 mm isotropic) were acquired every 3 seconds using zoomed multi-slice gradient-echo EPI collected at 7 T in single orientation (SO) and multiple orientation (MO) experiments, the latter involving 4 repetitions with the subject's head rotated relative to B0. Statistical parametric maps (SPMs) were reconstructed for the magnitude, phase and QSM time-series, and each was subjected to detailed analysis. Several fQSM pipelines were evaluated and compared based on the relative number of voxels that were coincidentally found to be significant in QSM and magnitude SPMs (common voxels). We found that the sensitivity and spatial reliability of fQSM relative to the magnitude data depended strongly on the arbitrary significance threshold defining "activated" voxels in SPMs, and on the efficiency of spatio-temporal filtering of the phase time-series. Sensitivity and spatial reliability depended slightly on whether MO or SO fQSM was performed and on the QSM calculation approach used for SO data. Our results demonstrate the potential of fQSM as a quantitative method of mapping BOLD changes. We also critically discuss the technical challenges and issues linked to this intriguing new technique. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Quantitative outcome measures for systemic sclerosis-related Microangiopathy - Reliability of image acquisition in Nailfold Capillaroscopy.

    PubMed

    Dinsdale, Graham; Moore, Tonia; O'Leary, Neil; Berks, Michael; Roberts, Christopher; Manning, Joanne; Allen, John; Anderson, Marina; Cutolo, Maurizio; Hesselstrand, Roger; Howell, Kevin; Pizzorni, Carmen; Smith, Vanessa; Sulli, Alberto; Wildt, Marie; Taylor, Christopher; Murray, Andrea; Herrick, Ariane L

    2017-09-01

    Nailfold capillaroscopic parameters hold increasing promise as outcome measures for clinical trials in systemic sclerosis (SSc). Their inclusion as outcomes would often naturally require capillaroscopy images to be captured at several time points during any one study. Our objective was to assess the repeatability of image acquisition (which has been little studied), as well as of measurement. 41 patients (26 with SSc, 15 with primary Raynaud's phenomenon) and 10 healthy controls returned for repeat high-magnification (300×) videocapillaroscopy mosaic imaging of 10 digits one week after initial imaging (as part of a larger study of reliability). Images were assessed in a random order by an expert blinded observer and four outcome measures were extracted: (1) overall image grade; then (where possible) distal vessel locations were marked, allowing (2) vessel density across the whole nailfold to be calculated, (3) apex width to be measured, and (4) giant vessels to be counted. Intra-rater intra-visit and intra-rater inter-visit (baseline vs. 1 week) reliability were examined in 475 and 392 images, respectively. A linear mixed-effects model was used to estimate variance components, from which intra-class correlation coefficients (ICCs) were determined. Intra-visit and inter-visit reliability estimates (ICCs) were (respectively): overall image grade, 0.97 and 0.90; vessel density, 0.92 and 0.65; mean vessel width, 0.91 and 0.79; presence of giant capillary, 0.68 and 0.56. These estimates were conditional on each parameter being measurable. Within-operator image analysis and acquisition are reproducible. Quantitative nailfold capillaroscopy, at least with a single observer, provides reliable outcome measures for clinical studies including randomised controlled trials. Copyright © 2017 Elsevier Inc. All rights reserved.
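
    The ICCs reported above come from variance components estimated by a linear mixed-effects model. Given such components, the ICC itself is a simple ratio; the sketch below uses invented variance components to show why inter-visit ICCs, which carry the extra visit-to-visit component in the denominator, come out lower than intra-visit ICCs, as observed.

    ```python
    # Sketch: ICCs from variance components (all component values invented).

    def icc(var_subject, var_visit, var_residual, include_visit=True):
        """ICC = subject variance over total variance; inter-visit ICCs also
        include the visit-to-visit component in the denominator."""
        denom = var_subject + var_residual + (var_visit if include_visit else 0.0)
        return var_subject / denom

    print(round(icc(0.9, 0.3, 0.2), 2))          # inter-visit ICC ≈ 0.64
    print(round(icc(0.9, 0.3, 0.2, False), 2))   # intra-visit ICC ≈ 0.82
    ```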

  11. Quantitation of TGF-beta1 mRNA in porcine mesangial cells by comparative kinetic RT/PCR: comparison with ribonuclease protection assay and in situ hybridization.

    PubMed

    Ceol, M; Forino, M; Gambaro, G; Sauer, U; Schleicher, E D; D'Angelo, A; Anglani, F

    2001-01-01

    Gene expression can be examined with different techniques including ribonuclease protection assay (RPA), in situ hybridisation (ISH), and quantitative reverse transcription-polymerase chain reaction (RT/PCR). These methods differ considerably in their sensitivity and precision in detecting and quantifying low-abundance mRNA. Although there is evidence that RT/PCR can be performed in a quantitative manner, the quantitative capacity of this method is generally underestimated. To demonstrate that the comparative kinetic RT/PCR strategy, which uses a housekeeping gene as an internal standard, is a quantitative method for detecting significant differences in mRNA levels between different samples, the inhibitory effect of heparin on phorbol 12-myristate 13-acetate (PMA)-induced TGF-beta1 mRNA expression was evaluated by RT/PCR and by RPA, the standard method of mRNA quantification, and the results were compared. The reproducibility of RT/PCR amplification was calculated by comparing the quantities of G3PDH and TGF-beta1 PCR products, generated during the exponential phases, estimated from two different RT/PCR runs (G3PDH, r = 0.968, P = 0.0000; TGF-beta1, r = 0.966, P = 0.0000). The quantitative capacity of comparative kinetic RT/PCR was demonstrated by comparing the results obtained from RPA and RT/PCR using linear regression analysis. Starting from the same RNA extraction, but using only 1% of the RNA for RT/PCR compared to RPA, a significant correlation was observed (r = 0.984, P = 0.0004). Moreover, morphometric analysis of the ISH signal was applied for the semi-quantitative evaluation of the expression and localisation of TGF-beta1 mRNA in the entire cell population. Our results demonstrate the close similarity of the RT/PCR and RPA methods in giving quantitative information on mRNA expression and indicate the possibility of adopting comparative kinetic RT/PCR as a reliable quantitative method of mRNA analysis. Copyright 2001 Wiley-Liss, Inc.

  12. In-depth quantitative analysis of the microstructures produced by Surface Mechanical Attrition Treatment (SMAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samih, Y., E-mail: youssef.samih@univ-lorraine.fr; Université de Lorraine, Laboratory of Excellence on Design of Alloy Metals for low-mAss Structures; Beausir, B.

    2013-09-15

    Electron BackScattered Diffraction (EBSD) maps are used to characterize quantitatively the graded microstructure formed by Surface Mechanical Attrition Treatment (SMAT), applied here to the 316L stainless steel. In particular, the analysis of GNDs – coupled with relevant and reliable criteria – was used to depict the thickness of each zone identified in the SMAT-affected layers: (i) the “ultrafine grain” (UFG) zone present at the extreme top surface, (ii) the “transition zone” where grains were fragmented under the heavy plastic deformation and, finally, (iii) the “deformed zone” where initial grains are simply deformed. The interest of this procedure is illustrated through the comparative analysis of the effect of some SMAT processing parameters (amplitude of vibration and treatment duration). The UFG and transition zones are more significantly modified than the overall affected thickness under our tested conditions. - Highlights: • EBSD maps are used to characterize quantitatively the microstructure of SMAT-treated samples. • Calculation of the GND density to quantify strain gradients • A new method to depict the different zone thicknesses in the SMAT-affected layer • Effects of SMAT processing parameters on the surface microstructure evolution.

  13. Use of a Deuterated Internal Standard with Pyrolysis-GC/MS Dimeric Marker Analysis to Quantify Tire Tread Particles in the Environment

    PubMed Central

    Unice, Kenneth M.; Kreider, Marisa L.; Panko, Julie M.

    2012-01-01

    Pyrolysis (pyr)-GC/MS analysis of characteristic thermal decomposition fragments has previously been used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r2 ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified, including the use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct for the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories. PMID:23202830

  14. 40 CFR 795.225 - Dermal pharmacokinetics of DGBE and DGBA.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... this section because they will facilitate the work and improve the reliability of quantitative... for this purpose. (ii) Biotransformation after dermal dosing. Appropriate qualitative and quantitative... tabular form. (2) Evaluation of results. All observed results, quantitative or incidental, shall be...

  15. Quantitative analysis of dengue-2 virus RNA during the extrinsic incubation period in individual Aedes aegypti.

    PubMed

    Richardson, Jason; Molina-Cruz, Alvaro; Salazar, Ma Isabel; Black, William

    2006-01-01

    Dengue virus-2 (DENV-2) RNA was quantified from the midgut and legs of individual Aedes aegypti on each of 14 days postinfectious blood meal (dpi) in a DENV-2-susceptible strain from Chetumal, Mexico. A SYBR Green I-based, strand-specific, quantitative real-time reverse transcription-polymerase chain reaction (RT-PCR) assay was developed. The lower detection and quantitation limits were 20 and 200 copies per reaction, respectively. Amounts of positive- and negative-strand viral RNA were correlated. Numbers of plaque-forming units (PFU) were correlated with DENV-2 RNA copy number in both C6/36 cell cultures and mosquitoes. PFU were consistently lower than RNA copy number by 2-3 log10. Midgut levels of DENV-2 RNA peaked at 8 dpi and fluctuated erratically between 6 and 9 dpi. Copies of DENV-2 RNA varied significantly among infected mosquitoes at each time point. Quantitative real-time RT-PCR is a convenient and reliable method that provides new insights into virus-vector interactions.
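
    Absolute quantitation in such assays typically runs unknowns against a standard curve of Ct versus log copy number built from a dilution series. A minimal sketch with invented calibration data:

    ```python
    # Sketch: absolute quantitation from a real-time RT-PCR standard curve.
    # The dilution series and Ct values below are illustrative, not the study's.
    import numpy as np

    log_copies = np.array([7.0, 6.0, 5.0, 4.0, 3.0])   # log10 copies/reaction
    ct = np.array([14.1, 17.6, 21.0, 24.4, 27.9])      # measured threshold cycles

    m, b = np.polyfit(log_copies, ct, 1)   # Ct = m*log10(copies) + b
    efficiency = 10 ** (-1 / m) - 1        # ~1.0 means perfect doubling per cycle

    def copies_from_ct(ct_unknown):
        """Invert the calibration line to estimate copy number."""
        return 10 ** ((ct_unknown - b) / m)

    print(f"efficiency ≈ {efficiency:.2f}")
    print(f"unknown at Ct 19.3 ≈ {copies_from_ct(19.3):.2e} copies")
    ```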

  16. Further assessment of a method to estimate reliability and validity of qualitative research findings.

    PubMed

    Hinds, P S; Scandrett-Hibden, S; McAulay, L S

    1990-04-01

    The reliability and validity of qualitative research findings are viewed with scepticism by some scientists. This scepticism is derived from the belief that qualitative researchers give insufficient attention to estimating reliability and validity of data, and the differences between quantitative and qualitative methods in assessing data. The danger of this scepticism is that relevant and applicable research findings will not be used. Our purpose is to describe an evaluative strategy for use with qualitative data, a strategy that is a synthesis of quantitative and qualitative assessment methods. Results of the strategy and factors that influence its use are also described.

  17. Translation into Brazilian Portuguese and validation of the "Quantitative Global Scarring Grading System for Post-acne Scarring" *

    PubMed Central

    Cachafeiro, Thais Hofmann; Escobar, Gabriela Fortes; Maldonado, Gabriela; Cestari, Tania Ferreira

    2014-01-01

    The "Quantitative Global Scarring Grading System for Postacne Scarring" was developed in English for acne scar grading, based on the number and severity of each type of scar. The aims of this study were to translate this scale into Brazilian Portuguese and verify its reliability and validity. The study followed five steps: Translation, Expert Panel, Back Translation, Approval of authors and Validation. The translated scale showed high internal consistency and high test-retest reliability, confirming its reproducibility. Therefore, it has been validated for our population and can be recommended as a reliable instrument to assess acne scarring. PMID:25184939

  18. A human reliability based usability evaluation method for safety-critical software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, R. L.; Tran, T. Q.; Gertman, D. I.

    2006-07-01

    Boring and Gertman (2005) introduced a novel method that augments heuristic usability evaluation methods with the human reliability analysis method of SPAR-H. By assigning probabilistic modifiers to individual heuristics, it is possible to arrive at the usability error probability (UEP). Although this UEP is not a literal probability of error, it nonetheless provides a quantitative basis for heuristic evaluation. This method allows one to seamlessly prioritize and identify usability issues (i.e., a higher UEP requires more immediate fixes). However, the original version of this method required the usability evaluator to assign priority weights to the final UEP, thus allowing the priority of a usability issue to differ among usability evaluators. The purpose of this paper is to explore an alternative approach to standardize the priority weighting of the UEP in an effort to improve the method's reliability. (authors)
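
    On the stated description, a UEP is a nominal error probability scaled by modifiers assigned to violated heuristics. The sketch below is a loose illustration of that arithmetic only; the actual SPAR-H-derived modifiers and weighting scheme are not given in the abstract, so every value and the capping rule here are invented.

    ```python
    # Sketch: a usability error probability in the SPAR-H spirit — a nominal
    # error probability scaled by per-heuristic multipliers. Values invented.

    def usability_error_probability(nominal, multipliers):
        """Scale a nominal error probability by violated-heuristic multipliers,
        capping at 1.0 so the result stays a probability-like score."""
        p = nominal
        for m in multipliers:
            p *= m
        return min(p, 1.0)

    # e.g. two violated heuristics judged to triple and double the nominal rate
    print(usability_error_probability(0.01, [3.0, 2.0]))  # 0.06
    ```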

  19. Quantification of Hepcidin-related Iron Accumulation in the Rat Liver.

    PubMed

    Böser, Preethne; Mordashova, Yulia; Maasland, Mark; Trommer, Isabel; Lorenz, Helga; Hafner, Mathias; Seemann, Dietmar; Mueller, Bernhard K; Popp, Andreas

    2016-02-01

    Hepcidin was originally detected as a liver peptide with antimicrobial activity, and it functions as a central regulator of systemic iron metabolism. Consequently, suppression of hepcidin leads to iron accumulation in the liver. AbbVie developed a monoclonal antibody ([mAb]; repulsive guidance molecule [RGMa/c] mAb) that downregulates hepcidin expression by influencing the RGMc/bone morphogenetic protein (BMP)/neogenin receptor complex and causes iron deposition in the liver. In a dose-range-finding study with the RGMa/c mAb, rats were treated at different dose levels for a total of 4 weekly doses. The results of the morphometric analysis in the liver showed that iron accumulation is not homogeneous between liver lobes, the left lateral lobe being the most responsive in the rat. Quantitative hepcidin messenger RNA analysis likewise showed that the left lateral lobe was the most responsive, with hepcidin downregulation increasing with antibody dose. In addition, the morphometric analysis had higher sensitivity than chemical iron extraction and quantification using a colorimetric assay. In conclusion, the Prussian blue stain in combination with semi-quantitative and quantitative morphometric analysis is a more reliable method to demonstrate iron accumulation in the liver than direct measurement of iron in unfixed tissue using a colorimetric assay. © The Author(s) 2016.

  20. Changes in monosaccharides, organic acids and amino acids during Cabernet Sauvignon wine ageing based on a simultaneous analysis using gas chromatography-mass spectrometry.

    PubMed

    Zhang, Xin-Ke; Lan, Yi-Bin; Zhu, Bao-Qing; Xiang, Xiao-Feng; Duan, Chang-Qing; Shi, Ying

    2018-01-01

    Monosaccharides, organic acids and amino acids are important flavour-related components in wines. The aim of this article is to develop and validate a method that can simultaneously analyse these compounds in wine based on silylation derivatisation and gas chromatography-mass spectrometry (GC-MS), and to apply this method to investigating the changes of these compounds and their likely influence on Cabernet Sauvignon wine flavour during ageing. This work presents a new approach for wine analysis and provides more information concerning red wine ageing. The method can simultaneously quantify 2 monosaccharides, 8 organic acids and 13 amino acids in wine. A validation experiment showed good linearity, sensitivity, reproducibility and recovery. Multiple derivatives of five amino acids were found, but their effects on quantitative analysis were negligible, except for methionine. The evolution pattern of each category was different, and we speculate that the corresponding mechanisms, involving microorganism activities, physical interactions and chemical reactions, are closely linked to red wine flavour during ageing. Simultaneous quantitative analysis of monosaccharides, organic acids and amino acids in wine is feasible and reliable, and this method has extensive application prospects. © 2017 Society of Chemical Industry.

  1. Recent Achievements in Characterizing the Histone Code and Approaches to Integrating Epigenomics and Systems Biology.

    PubMed

    Janssen, K A; Sidoli, S; Garcia, B A

    2017-01-01

    Functional epigenetic regulation occurs by dynamic modification of chromatin, including genetic material (i.e., DNA methylation), histone proteins, and other nuclear proteins. Due to the highly complex nature of the histone code, mass spectrometry (MS) has become the leading technique for the identification of single and combinatorial histone modifications. MS has now overtaken antibody-based strategies owing to its automation, high resolution, and accurate quantitation. Moreover, multiple approaches to analysis have been developed for global quantitation of posttranslational modifications (PTMs), including large-scale characterization of modification coexistence (middle-down and top-down proteomics), which is not currently possible with any other biochemical strategy. Recently, our group and others have simplified and increased the effectiveness of analyzing histone PTMs by improving multiple MS methods and data analysis tools. This review provides an overview of the major achievements in the analysis of histone PTMs using MS, with a focus on the most recent improvements. We speculate that the state-of-the-art workflow for histone analysis is highly reliable in terms of identification and quantitation accuracy, and that it has the potential to become a routine method for systems biology thanks to the possibility of integrating histone MS results with genomics and proteomics datasets. © 2017 Elsevier Inc. All rights reserved.

  2. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but they also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  3. Data-Independent Acquisition-Based Quantitative Proteomic Analysis Reveals Potential Biomarkers of Kidney Cancer.

    PubMed

    Song, Yimeng; Zhong, Lijun; Zhou, Juntuo; Lu, Min; Xing, Tianying; Ma, Lulin; Shen, Jing

    2017-12-01

    Renal cell carcinoma (RCC) is a malignant and metastatic cancer with 95% mortality, and clear cell RCC (ccRCC) is the most commonly observed among the five major subtypes of RCC. Specific biomarkers that can distinguish cancer tissues from adjacent normal tissues should be developed to diagnose this disease in early stages and to conduct a reliable prognostic evaluation. The data-independent acquisition (DIA) strategy has been widely employed in proteomic analysis because of various advantages, including enhanced protein coverage and reliable data acquisition. In this study, a DIA workflow is constructed on a quadrupole-Orbitrap LC-MS platform to reveal dysregulated proteins between ccRCC and adjacent normal tissues. More than 4000 proteins are identified, 436 of which are dysregulated in ccRCC tissues. Bioinformatic analysis reveals that multiple pathways and Gene Ontology items are strongly associated with ccRCC. The expression levels of L-lactate dehydrogenase A chain, annexin A4, nicotinamide N-methyltransferase, and perilipin-2, examined through RT-qPCR, Western blot, and immunohistochemistry, confirm the validity of the proteomic analysis results. The proposed DIA workflow yields optimum time efficiency and data reliability and provides a good choice for proteomic analysis in biological and clinical studies, and these dysregulated proteins might be potential biomarkers for ccRCC diagnosis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Reliability analysis of repairable systems using Petri nets and vague Lambda-Tau methodology.

    PubMed

    Garg, Harish

    2013-01-01

    The main objective of the paper is to develop a methodology, named vague Lambda-Tau, for the reliability analysis of repairable systems. A Petri net tool is applied to represent the asynchronous and concurrent processing of the system instead of fault tree analysis. To enhance the relevance of the reliability study, vague set theory is used for representing the failure rate and repair times instead of classical (crisp) or fuzzy set theory, because vague sets are characterized by a truth membership function and a false (non-) membership function whose sum is less than 1. The proposed methodology involves qualitative modeling using PN and quantitative analysis using the Lambda-Tau method of solution, with the basic events represented by intuitionistic fuzzy numbers with triangular membership functions. Sensitivity analysis has also been performed, and the effects on system MTBF are addressed. The methodology improves on the shortcomings of existing probabilistic approaches and gives a better understanding of the system behavior through its graphical representation. The washing unit of a paper mill situated in the northern part of India, producing approximately 200 tons of paper per day, has been considered to demonstrate the proposed approach. The results may be helpful for plant personnel in analyzing system behavior and improving performance by adopting suitable maintenance strategies. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
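
    Quantitative analysis with triangular fuzzy or vague numbers is usually carried out on alpha-cut intervals. The sketch below shows alpha-cut interval arithmetic for an OR-gate failure rate (the sum of component rates under Lambda-Tau); the membership-function parameters are invented, and the dual truth/false membership of vague sets is not modeled here.

    ```python
    # Sketch: alpha-cut interval arithmetic on triangular failure-rate numbers,
    # the kind of operation Lambda-Tau methods perform at an OR gate
    # (lambda_OR = lambda1 + lambda2). Spreads and rates are illustrative.

    def alpha_cut(tri, alpha):
        """Interval of a triangular number (a, m, b) at a given alpha level."""
        a, m, b = tri
        return (a + alpha * (m - a), b - alpha * (b - m))

    lam1 = (1.0e-4, 1.5e-4, 2.0e-4)   # fuzzy failure rate of component 1
    lam2 = (0.5e-4, 0.8e-4, 1.2e-4)   # fuzzy failure rate of component 2

    for alpha in (0.0, 0.5, 1.0):
        lo1, hi1 = alpha_cut(lam1, alpha)
        lo2, hi2 = alpha_cut(lam2, alpha)
        print(alpha, (lo1 + lo2, hi1 + hi2))  # OR-gate rate interval at this cut
    ```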

  5. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety-margin approaches to dimension components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
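
    The recommended Monte Carlo-on-load-resistance step is straightforward to prototype: sample load and resistance, then count the fraction of draws where load exceeds resistance. A minimal sketch with illustrative normal distributions, chosen because a closed-form check exists for them:

    ```python
    # Sketch: Monte Carlo failure-probability estimate on a load-resistance
    # model. Failure when load > resistance; distributions are illustrative.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 1_000_000

    load = rng.normal(300.0, 40.0, n)          # e.g. stress in MPa
    resistance = rng.normal(450.0, 50.0, n)    # e.g. strength in MPa

    p_fail = np.mean(load > resistance)
    print(f"estimated failure probability ≈ {p_fail:.2e}")
    # Closed-form check for normal load/resistance:
    # beta = (450 - 300) / sqrt(40**2 + 50**2) ≈ 2.34, Pf = Phi(-beta) ≈ 9.6e-3.
    ```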

  6. Developing a postal screening tool for frailty in primary care: a secondary data analysis.

    PubMed

    Kydd, Lauren

    2016-07-01

    The purpose of this secondary data analysis (SDA) was to review a subset of quantitative and qualitative paired data sets from a returned postal screening tool (PST) completed by patients and compare them to the clinical letters composed by elderly care community nurses (ECCN) following patient assessment to ascertain the tool's reliability and validity. The aim was to understand to what extent the problems identified by patients in PSTs aligned with actual or potential problems identified by the ECCNs. The researcher examined this connection to establish whether the PST was a valid, reliable approach to proactive care. The findings of this SDA indicated that patients did understand the PST. Many appropriate referrals were made as a result of the ECCN visit that would not have occurred if the PST had not been sent. This article focuses specifically upon the physiotherapy section as this was the area where the most red flags were identified.

  7. The redoubtable ecological periodic table

    EPA Science Inventory

    Ecological periodic tables are repositories of reliable information on quantitative, predictably recurring (periodic) habitat–community patterns and their uncertainty, scaling and transferability. Their reliability derives from their grounding in sound ecological principle...

  8. Development of the Chinese version of the Hospital Autonomy Questionnaire: a cross-sectional study in Guangdong Province

    PubMed Central

    Liu, Zifeng; Yuan, Lianxiong; Huang, Yixiang; Zhang, Lingling; Luo, Futian

    2016-01-01

    Objective We aimed to develop a questionnaire for quantitative evaluation of the autonomy of public hospitals in China. Method An extensive literature review was conducted to select possible items for inclusion in the questionnaire, which was then reviewed by 5 experts. After a two-round Delphi method, we distributed the questionnaire to 404 secondary and tertiary hospitals in Guangdong Province, China, and 379 completed questionnaires were collected. The final questionnaire was then developed on the basis of the results of exploratory and confirmatory factor analysis. Results Analysis suggested that all internal consistency reliabilities exceeded the minimum reliability standard of 0.70 for the α coefficient. The overall scale coefficient was 0.87, and the 6 subscale coefficients were 0.92 (strategic management), 0.81 (budget and expenditure), 0.85 (financing), 0.75 (medical management), 0.86 (human resources) and 0.86 (accountability). Correlation coefficients between items and their hypothesised subscales were higher than those with other subscales. The value of average variance extracted (AVE) was higher than 0.5, the value of construct reliability (CR) was higher than 0.7, and the square roots of the AVE of each subscale were larger than the correlation of the specific subscale with the other subscales, supporting the convergent and discriminant validity of the Chinese version of the Hospital Autonomy Questionnaire (CVHAQ). The model fit indices were all acceptable: χ2/df=1.73, Goodness of Fit Index (GFI) = 0.93, Adjusted Goodness of Fit Index (AGFI) = 0.91, Non-Normed Fit Index (NNFI) = 0.96, Comparative Fit Index (CFI) = 0.97, Root Mean Square Error of Approximation (RMSEA) = 0.04, Standardised Root Mean Square Residual (SRMR) = 0.07. Conclusions This study demonstrated the reliability and validity of the CVHAQ and provides a quantitative method for the assessment of hospital autonomy. PMID:26911587
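
    The AVE and CR statistics cited above are computed directly from standardized factor loadings. The sketch below applies the standard formulas to invented loadings for a single subscale, alongside the AVE > 0.5 and CR > 0.7 thresholds used in the paper.

    ```python
    # Sketch: average variance extracted (AVE) and construct reliability (CR)
    # from standardized factor loadings. Loadings are invented, not the CVHAQ's.
    import numpy as np

    loadings = np.array([0.78, 0.81, 0.74, 0.69, 0.83])  # one subscale's items
    errors = 1 - loadings**2                             # residual variances

    ave = np.mean(loadings**2)
    cr = loadings.sum()**2 / (loadings.sum()**2 + errors.sum())
    print(f"AVE = {ave:.2f}, CR = {cr:.2f}")  # want AVE > 0.5 and CR > 0.7
    ```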

  9. Contextual and perceptual brain processes underlying moral cognition: a quantitative meta-analysis of moral reasoning and moral emotions.

    PubMed

    Sevinc, Gunes; Spreng, R Nathan

    2014-01-01

    Human morality has been investigated using a variety of tasks ranging from judgments of hypothetical dilemmas to viewing morally salient stimuli. These experiments have provided insight into neural correlates of moral judgments and emotions, yet these approaches reveal important differences in moral cognition. Moral reasoning tasks require active deliberation while moral emotion tasks involve the perception of stimuli with moral implications. We examined convergent and divergent brain activity associated with these experimental paradigms taking a quantitative meta-analytic approach. A systematic search of the literature yielded 40 studies. Studies involving explicit decisions in a moral situation were categorized as active (n = 22); studies evoking moral emotions were categorized as passive (n = 18). We conducted a coordinate-based meta-analysis using the Activation Likelihood Estimation to determine reliable patterns of brain activity. Results revealed a convergent pattern of reliable brain activity for both task categories in regions of the default network, consistent with the social and contextual information processes supported by this brain network. Active tasks revealed more reliable activity in the temporoparietal junction, angular gyrus and temporal pole. Active tasks demand deliberative reasoning and may disproportionately involve the retrieval of social knowledge from memory, mental state attribution, and construction of the context through associative processes. In contrast, passive tasks reliably engaged regions associated with visual and emotional information processing, including lingual gyrus and the amygdala. A laterality effect was observed in dorsomedial prefrontal cortex, with active tasks engaging the left, and passive tasks engaging the right. While overlapping activity patterns suggest a shared neural network for both tasks, differential activity suggests that processing of moral input is affected by task demands. The results provide novel insight into distinct features of moral cognition, including the generation of moral context through associative processes and the perceptual detection of moral salience.

  10. Contextual and Perceptual Brain Processes Underlying Moral Cognition: A Quantitative Meta-Analysis of Moral Reasoning and Moral Emotions

    PubMed Central

    Sevinc, Gunes; Spreng, R. Nathan

    2014-01-01

    Background and Objectives Human morality has been investigated using a variety of tasks ranging from judgments of hypothetical dilemmas to viewing morally salient stimuli. These experiments have provided insight into neural correlates of moral judgments and emotions, yet these approaches reveal important differences in moral cognition. Moral reasoning tasks require active deliberation while moral emotion tasks involve the perception of stimuli with moral implications. We examined convergent and divergent brain activity associated with these experimental paradigms, taking a quantitative meta-analytic approach. Data Source A systematic search of the literature yielded 40 studies. Studies involving explicit decisions in a moral situation were categorized as active (n = 22); studies evoking moral emotions were categorized as passive (n = 18). We conducted a coordinate-based meta-analysis using Activation Likelihood Estimation to determine reliable patterns of brain activity. Results & Conclusions Results revealed a convergent pattern of reliable brain activity for both task categories in regions of the default network, consistent with the social and contextual information processes supported by this brain network. Active tasks revealed more reliable activity in the temporoparietal junction, angular gyrus and temporal pole. Active tasks demand deliberative reasoning and may disproportionately involve the retrieval of social knowledge from memory, mental state attribution, and construction of the context through associative processes. In contrast, passive tasks reliably engaged regions associated with visual and emotional information processing, including lingual gyrus and the amygdala. A laterality effect was observed in dorsomedial prefrontal cortex, with active tasks engaging the left, and passive tasks engaging the right. While overlapping activity patterns suggest a shared neural network for both tasks, differential activity suggests that processing of moral input is affected by task demands. The results provide novel insight into distinct features of moral cognition, including the generation of moral context through associative processes and the perceptual detection of moral salience. PMID:24503959

  11. Quantitative Confocal Microscopy Analysis as a Basis for Search and Study of Potassium Kv1.x Channel Blockers

    NASA Astrophysics Data System (ADS)

    Feofanov, Alexey V.; Kudryashova, Kseniya S.; Nekrasova, Oksana V.; Vassilevski, Alexander A.; Kuzmenkov, Alexey I.; Korolkova, Yuliya V.; Grishin, Eugene V.; Kirpichnikov, Mikhail P.

    Artificial KcsA-Kv1.x (x = 1, 3) receptors were recently designed by transferring the ligand-binding site from human Kv1.x voltage-gated potassium channels into the corresponding domain of the bacterial KcsA channel. We found that KcsA-Kv1.x receptors expressed in E. coli cells are embedded into the cell membrane and bind ligands when the cells are converted to spheroplasts. We proposed that E. coli spheroplasts with membrane-embedded KcsA-Kv1.x and the fluorescently labeled ligand agitoxin-2 (R-AgTx2) can be used as elements of an advanced analytical system for the search and study of Kv1-channel blockers. To realize this idea, special procedures were developed for the measurement and quantitative treatment of fluorescence signals obtained from the spheroplast membrane using confocal laser scanning microscopy (CLSM). The resulting "mix and read" analytical systems, supported by quantitative CLSM analysis, were demonstrated to be a reliable alternative to radioligand and electrophysiological techniques in the search for and study of selective Kv1.x channel blockers of high scientific and medical importance.

  12. Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network

    PubMed Central

    Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu

    2018-01-01

    This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely the conditional probability tables, can be calculated by referring to the conditional degradation probabilities. Finally, the power supply of a control unit is used as an example failure model. A dynamic fault tree (DFT) is translated into a Bayesian network model and subsequently extended to a DBN. The results show the state probabilities of an element and of the system without repair, with perfect and imperfect repair, and under CBM; the absorbing-set solutions obtained from the differential equations are plotted and verified. Through forward inference, the reliability of the control unit is determined under the different maintenance modes. Finally, the weak nodes of the control unit are identified. PMID:29765629
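
    The transient state probabilities referred to here solve the Kolmogorov forward equations dP(t)/dt = P(t)Q for a Markov generator Q. A minimal sketch of the no-repair case, with an absorbing failed state and assumed (not the paper's) degradation rates:

```python
import numpy as np
from scipy.linalg import expm

# Generator matrix Q for a 3-state element: 0 = good, 1 = degraded, 2 = failed.
# Without repair the failed state is absorbing (its row is all zeros).
lam01, lam12 = 0.02, 0.05          # assumed degradation rates (per hour)
Q = np.array([[-lam01,  lam01,   0.0],
              [  0.0,  -lam12,  lam12],
              [  0.0,    0.0,    0.0]])

p0 = np.array([1.0, 0.0, 0.0])     # element starts in the good state
for t in (10, 100, 500):
    p_t = p0 @ expm(Q * t)         # transient solution of dP/dt = P(t) Q
    print(f"t={t:4d} h  P(states) = {np.round(p_t, 4)}  "
          f"reliability = {p_t[:2].sum():.4f}")   # P(not yet absorbed)
```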

  13. THERMAL AND THERMO-MECHANICAL CHARACTERISTICS OF CRYOGENIC MICROCOOLER FOR OPTIMUM PERFORMANCE AND RELIABILITY

    DTIC Science & Technology

    2017-10-19

    consequently, important to obtain relevant experimental data for such short, pin fin channels before finalizing the design of the LN2 microcooler. In the next...must be taken in designing the LD micro pin-fin cooler to reflect these experimental trends. Figure 8: Base Heat Transfer Coefficient vs... Experimental Hybrid Approach Based on Spectral Power Distribution for Quantitative Degradation Analysis of Phosphor Converted LED," Ieee Transactions on

  14. [Development and evaluation of the reliability and validity of an empowerment scale for health promotion volunteers].

    PubMed

    Koyama, Utako; Murayama, Nobuko

    2011-08-01

    This qualitative and quantitative research was conducted to develop an empowerment scale for health promotion volunteers (hereinafter referred to as the ESFHPV), key persons responsible for creating healthy communities. A focus group interview was conducted with four groups of health promotion volunteers from two cities in S Public Health Center of N Prefecture. A qualitative analysis was employed and a 32-item draft scale was created. The reliability and validity of this scale were then evaluated using quantitative methods. A questionnaire survey was conducted in 2009 for all 660 health promotion volunteers across the 2 cities. Of 401 respondents (response rate, 60.8%), 356 (53.9%) provided valid responses and were thus included in the analysis. 1) Internal consistency was confirmed by item-total correlation analysis (I-T analysis), assessment of Cronbach's coefficient alpha for all except one item, and good-poor analysis (G-P analysis). Four items were excluded from the 32-item draft scale because of correlation coefficients greater than 0.7, leaving 28 items for analysis. 2) Based on the results obtained from the factor analysis performed on the 28 provisional empowerment questions, 28 items were chosen for inclusion in the ESFHPV. These items consisted of four sub-scales, namely 'activity for healthy community' (10 items), 'intention for solving health problems of the community' (10 items), 'democratic organization activity' (four items) and 'growth as individual health promotion volunteers' (four items). 3) The Cronbach's alpha coefficients for the ESFHPV and its four sub-scales were 0.93, 0.88, 0.89, 0.84 and 0.79, respectively. The coefficients of I-T analysis were between 0.33 and 0.69. 4) The health promotion volunteers who attended other community activities demonstrated significantly higher scores for the ESFHPV and the four sub-scales. Persons who were above 60 years, had a longer duration of activity as a health promotion volunteer and were housewives showed significantly higher scores on the sub-scale 'growth as individual health promotion volunteers'. To measure the empowerment levels of health promotion volunteers, a 28-item scale was developed and its reliability and validity were confirmed. Health promotion volunteers as well as the public health nurses who assist them can use this scale to assess the empowerment levels of other health promotion volunteers.

  15. 76 FR 13018 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-09

    ... statistical surveys that yield quantitative results that can be generalized to the population of study. This... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. Total Burden Estimate for the...

  16. Precise Estimation of Allele Frequencies of Single-Nucleotide Polymorphisms by a Quantitative SSCP Analysis of Pooled DNA

    PubMed Central

    Sasaki, Tomonari; Tahira, Tomoko; Suzuki, Akari; Higasa, Koichiro; Kukita, Yoji; Baba, Shingo; Hayashi, Kenshi

    2001-01-01

    We show that single-nucleotide polymorphisms (SNPs) of moderate to high heterozygosity (minor allele frequencies >10%) can be efficiently detected, and their allele frequencies accurately estimated, by pooling the DNA samples and applying a capillary-based SSCP analysis. In this method, alleles are separated into peaks, and their frequencies can be reliably and accurately quantified from their peak heights (SD <1.8%). We found that as many as 40% of publicly available SNPs that were analyzed by this method have widely differing allele frequency distributions among groups of different ethnicity (parents of Centre d'Etude Polymorphisme Humaine families vs. Japanese individuals). These results demonstrate the effectiveness of the present pooling method in the reevaluation of candidate SNPs that have been collected by examination of limited numbers of individuals. The method should also serve as a robust quantitative technique for studies in which a precise estimate of SNP allele frequencies is essential—for example, in linkage disequilibrium analysis. PMID:11083945
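
    The peak-height quantification itself is simple arithmetic: each allele frequency is estimated as that allele's peak height over the summed heights. A toy sketch with hypothetical electropherogram peaks:

```python
def allele_frequencies(peak_heights):
    """Relative allele frequencies from SSCP peak heights of a pooled sample."""
    total = sum(peak_heights.values())
    return {allele: height / total for allele, height in peak_heights.items()}

# Hypothetical two-allele SNP measured on a pooled-DNA electropherogram
print(allele_frequencies({"A": 182.0, "G": 58.0}))   # minor allele ~24%
```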

  17. Project risk management in the construction of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya

    2018-03-01

    This paper presents project risk management methods that make it possible to identify risks in the construction of high-rise buildings more effectively and to manage them throughout the life cycle of the project. One of the project risk management processes is the quantitative analysis of risks, which usually includes assessing the potential impact of project risks and their probabilities. This paper reviews the most popular methods of risk probability assessment and indicates the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, the robust approach of P. Huber is applied and extended to the regression analysis of project data. The suggested algorithms for estimating the parameters of the statistical models yield reliable estimates. The theoretical problems of developing robust models based on the minimax-estimate methodology are reviewed, and an algorithm for the situation of asymmetric "contamination" is developed.
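
    As a sketch of the robust-regression idea (not the authors' specific algorithm), a Huber M-estimator down-weights the large residuals that drag ordinary least squares toward outliers; here via statsmodels on synthetic "contaminated" project data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 40)
y = 1.5 * x + 2 + rng.normal(0, 0.5, x.size)   # true slope 1.5
y[-6:] += 8.0                                  # gross outliers at the high end

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                               # classical least squares
rlm = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()   # Huber M-estimator
print("OLS slope:  ", round(ols.params[1], 3))   # pulled upward by the outliers
print("Huber slope:", round(rlm.params[1], 3))   # stays close to 1.5
```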

  18. Advantages and limitations of classic and 3D QSAR approaches in nano-QSAR studies based on biological activity of fullerene derivatives

    DOE PAGES

    Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta; ...

    2016-08-29

    In this contribution, the advantages and limitations of two computational techniques that can be used for the investigation of nanoparticle activity and toxicity are briefly summarized: classic nano-QSAR (Quantitative Structure–Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure–Activity Relationships, such as Comparative Molecular Field Analysis, CoMFA, and Comparative Molecular Similarity Indices Analysis, CoMSIA, employed for nanomaterials). Both approaches were compared according to selected criteria, including efficiency, type of experimental data, class of nanomaterials, time required for calculations, computational cost, and difficulty of interpretation. Taking into account the advantages and limitations of each method, we provide recommendations for nano-QSAR modellers and QSAR model users to be able to determine a proper and efficient methodology to investigate the biological activity of nanoparticles in order to describe the underlying interactions in the most reliable and useful manner.

  19. Advantages and limitations of classic and 3D QSAR approaches in nano-QSAR studies based on biological activity of fullerene derivatives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta

    In this contribution, the advantages and limitations of two computational techniques that can be used for the investigation of nanoparticle activity and toxicity are briefly summarized: classic nano-QSAR (Quantitative Structure–Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure–Activity Relationships, such as Comparative Molecular Field Analysis, CoMFA, and Comparative Molecular Similarity Indices Analysis, CoMSIA, employed for nanomaterials). Both approaches were compared according to selected criteria, including efficiency, type of experimental data, class of nanomaterials, time required for calculations, computational cost, and difficulty of interpretation. Taking into account the advantages and limitations of each method, we provide recommendations for nano-QSAR modellers and QSAR model users to be able to determine a proper and efficient methodology to investigate the biological activity of nanoparticles in order to describe the underlying interactions in the most reliable and useful manner.

  20. Calibration with MCNP of NaI detector for the determination of natural radioactivity levels in the field.

    PubMed

    Cinelli, Giorgia; Tositti, Laura; Mostacci, Domiziano; Baré, Jonathan

    2016-05-01

    In view of assessing natural radioactivity with on-site quantitative gamma spectrometry, efficiency calibration of NaI(Tl) detectors is investigated. A calibration based on Monte Carlo simulation of detector response is proposed, to render reliable quantitative analysis practicable in field campaigns. The method is developed with reference to contact geometry, in which measurements are taken placing the NaI(Tl) probe directly against the solid source to be analyzed. The Monte Carlo code used for the simulations was MCNP. Experimental verification of the calibration goodness is obtained by comparison with appropriate standards, as reported. On-site measurements yield a quick quantitative assessment of natural radioactivity levels present (⁴⁰K, ²³⁸U and ²³²Th). On-site gamma spectrometry can prove particularly useful insofar as it provides information on materials from which samples cannot be taken. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
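
    A real calibration of this kind requires MCNP's full photon transport, but the underlying Monte Carlo logic can be shown with a purely geometric toy: sample isotropic emission directions and count which photons reach the detector face. The 3.8 cm crystal radius and near-contact source distance below are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

def geometric_efficiency(n=1_000_000, det_radius=3.8, source_dist=0.1):
    """Toy Monte Carlo: fraction of isotropically emitted photons from an
    on-axis point source that cross a circular detector face of radius
    det_radius (cm) at distance source_dist (cm). A field calibration, as in
    the paper, would instead use MCNP with full photon transport."""
    cos_theta = rng.uniform(-1.0, 1.0, n)            # isotropic directions
    toward = cos_theta > 0.0                         # heading to the detector side
    tan_theta = np.sqrt(1.0 - cos_theta[toward] ** 2) / cos_theta[toward]
    r_cross = source_dist * tan_theta                # radius where the plane is crossed
    return np.count_nonzero(r_cross <= det_radius) / n

print(f"geometric efficiency ~ {geometric_efficiency():.3f}")   # ~0.49 here
```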

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lincoln, D.E.

    Preliminary analysis of two populations of Artemisia tridentata compared leaf chemical and physiological characteristics which influence herbivores. The proportion of sixteen of the volatile compounds differed significantly between the two populations; however, total yield of volatiles did not. This initial survey established the reliability of the procedure to quantitatively monitor plant responses to CO₂ enrichment and suggests that test samples be restricted to a single population. Four sesquiterpene lactones have been selected for the experimental quantitative HPLC analysis; all peaks have been assigned identities and have demonstrated a high degree of reproducibility. Growth of Artemisia under high and low light at three CO₂ levels demonstrated that this species also undergoes a "dilution" of the leaf carbon content and is useful as a test species for herbivory response to CO₂-induced effects. The initial experiment also showed that high irradiance is a necessary growth condition. 10 refs.

  2. Quantitative analysis of spatial variability of geotechnical parameters

    NASA Astrophysics Data System (ADS)

    Fang, Xing

    2018-04-01

    Geotechnical parameters are the basic parameters of geotechnical engineering design, yet they have strong regional characteristics. At the same time, the spatial variability of geotechnical parameters has been recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on geostatistical theory, the spatial variability of geotechnical parameters is quantitatively analyzed, and the correlation coefficients between geotechnical parameters are evaluated. A residential district of the Tianjin Survey Institute was selected as the research object. There are 68 boreholes in this area and 9 layers of mechanical stratification. The parameters are water content, natural gravity, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion and SP index. The correlation coefficients of the geotechnical parameters are calculated according to the principles of statistical correlation, and the spatial variation law of the parameters is obtained from them.

  3. Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advancing Molecular Imaging Tools.

    PubMed

    Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe

    2018-01-01

    Processing and interpretation of biological images may provide invaluable insights on complex, living systems because images capture the overall dynamics as a "whole." Therefore, "extraction" of key, quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and different instrumentations focused on molecular image processing and analysis. Quantitative data analysis through various open source software and algorithmic protocols will provide a novel approach for modeling the experimental research program. Besides this, we also highlight predictable future trends in methods for automatically analyzing biological data. Such tools will be very useful for understanding the detailed biological and mathematical representations underlying in-silico systems biology models.

  4. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    PubMed Central

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C.

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson’s disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020
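
    A typical building block of such assessment is extracting tremor-band power from the spectrum of the inertial signal. A minimal sketch on synthetic accelerometer data (the 100 Hz sampling rate and 3.5-7.5 Hz band are assumptions, not the authors' exact settings):

```python
import numpy as np
from scipy.signal import welch

fs = 100.0                                    # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)
# Synthetic rest tremor: a 5 Hz oscillation buried in sensor noise
acc = 0.3 * np.sin(2 * np.pi * 5.0 * t) + 0.05 * rng.normal(size=t.size)

f, psd = welch(acc, fs=fs, nperseg=512)
band = (f >= 3.5) & (f <= 7.5)                # assumed parkinsonian tremor band
band_power = psd[band].sum() * (f[1] - f[0])  # integrate PSD over the band
peak_freq = f[band][np.argmax(psd[band])]
print(f"tremor band power = {band_power:.4f}, spectral peak at {peak_freq:.1f} Hz")
```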

  5. A comparative uncertainty study of the calibration of macrolide antibiotic reference standards using quantitative nuclear magnetic resonance and mass balance methods.

    PubMed

    Liu, Shu-Yu; Hu, Chang-Qin

    2007-10-17

    This study introduces the general method of quantitative nuclear magnetic resonance (qNMR) for the calibration of reference standards of macrolide antibiotics. Several qNMR experimental conditions were optimized, including the relaxation delay, an important parameter for quantification. Three kinds of macrolide antibiotics were used to validate the accuracy of the qNMR method by comparison with the results obtained by the high performance liquid chromatography (HPLC) method. The purities of five common reference standards of macrolide antibiotics were measured by the 1H qNMR method and the mass balance method, respectively, and the results of the two methods were compared. qNMR is quick and simple to use. In new-drug research and development, qNMR provides a new and reliable method for purity analysis of reference standards.
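
    The purity relation behind 1H qNMR compares a normalized analyte signal with that of an internal standard of known purity. A sketch of the standard formula with hypothetical inputs (signal areas I, proton counts N, molar masses M, weighed masses m):

```python
def qnmr_purity(I_x, I_std, N_x, N_std, M_x, M_std, m_x, m_std, P_std):
    """Analyte purity by 1H qNMR against an internal standard:
    P_x = (I_x/I_std) * (N_std/N_x) * (M_x/M_std) * (m_std/m_x) * P_std."""
    return (I_x / I_std) * (N_std / N_x) * (M_x / M_std) * (m_std / m_x) * P_std

# Hypothetical numbers, for illustration only
p = qnmr_purity(I_x=1.02, I_std=1.00, N_x=4, N_std=4,
                M_x=733.9, M_std=204.2, m_x=20.0, m_std=5.4, P_std=0.9999)
print(f"estimated purity = {p:.3f}")   # ~0.990 with these inputs
```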

  6. HPAEC-PAD quantification of Haemophilus influenzae type b polysaccharide in upstream and downstream samples.

    PubMed

    van der Put, Robert M F; de Haan, Alex; van den IJssel, Jan G M; Hamidi, Ahd; Beurret, Michel

    2015-11-27

    Due to the rapidly increasing introduction of Haemophilus influenzae type b (Hib) and other conjugate vaccines worldwide during the last decade, reliable and robust analytical methods are needed for the quantitative monitoring of intermediate samples generated during fermentation (upstream processing, USP) and purification (downstream processing, DSP) of polysaccharide vaccine components. This study describes the quantitative characterization of in-process control (IPC) samples generated during the fermentation and purification of the capsular polysaccharide (CPS), polyribosyl-ribitol-phosphate (PRP), derived from Hib. Reliable quantitative methods are necessary for all stages of production; otherwise accurate process monitoring and validation is not possible. Prior to the availability of high performance anion exchange chromatography methods, this polysaccharide was predominantly quantified either with immunochemical methods, or with the colorimetric orcinol method, which shows interference from fermentation medium components and reagents used during purification. Next to an improved high performance anion exchange chromatography-pulsed amperometric detection (HPAEC-PAD) method, using a modified gradient elution, both the orcinol assay and high performance size exclusion chromatography (HPSEC) analyses were evaluated. For DSP samples, it was found that the correlation between the results obtained by HPAEC-PAD specific quantification of the PRP monomeric repeat unit released by alkaline hydrolysis, and those from the orcinol method was high (R² = 0.8762), and that it was lower between HPAEC-PAD and HPSEC results. Additionally, HPSEC analysis of USP samples yielded surprisingly comparable results to those obtained by HPAEC-PAD. In the early part of the fermentation, medium components interfered with the different types of analysis, but quantitative HPSEC data could still be obtained, although lacking the specificity of the HPAEC-PAD method. Thus, the HPAEC-PAD method has the advantage of giving a specific response compared to the orcinol assay and HPSEC, and does not show interference from various components that can be present in intermediate and purified PRP samples. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Improved sample preparation of glyphosate and methylphosphonic acid by EPA method 6800A and time-of-flight mass spectrometry using novel solid-phase extraction.

    PubMed

    Wagner, Rebecca; Wetzel, Stephanie J; Kern, John; Kingston, H M Skip

    2012-02-01

    The employment of chemical weapons by rogue states and/or terrorist organizations is an ongoing concern in the United States. The quantitative analysis of nerve agents must be rapid and reliable for use in the private and public sectors. Current methods describe a tedious and time-consuming derivatization for gas chromatography-mass spectrometry and liquid chromatography in tandem with mass spectrometry. Two solid-phase extraction (SPE) techniques for the analysis of glyphosate and methylphosphonic acid are described with the utilization of isotopically enriched analytes for quantitation via atmospheric pressure chemical ionization-quadrupole time-of-flight mass spectrometry (APCI-Q-TOF-MS) that does not require derivatization. Solid-phase extraction-isotope dilution mass spectrometry (SPE-IDMS) involves pre-equilibration of a naturally occurring sample with an isotopically enriched standard. The second extraction method, i-Spike, involves loading an isotopically enriched standard onto the SPE column before the naturally occurring sample. The sample and the spike are then co-eluted from the column enabling precise and accurate quantitation via IDMS. The SPE methods in conjunction with IDMS eliminate concerns of incomplete elution, matrix and sorbent effects, and MS drift. For accurate quantitation with IDMS, the isotopic contribution of all atoms in the target molecule must be statistically taken into account. This paper describes two newly developed sample preparation techniques for the analysis of nerve agent surrogates in drinking water as well as statistical probability analysis for proper molecular IDMS. The methods described in this paper demonstrate accurate molecular IDMS using APCI-Q-TOF-MS with limits of quantitation as low as 0.400 mg/kg for glyphosate and 0.031 mg/kg for methylphosphonic acid. Copyright © 2012 John Wiley & Sons, Ltd.
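
    The quantitation step in IDMS is a mass balance over the light and heavy isotopologues of the equilibrated blend. A sketch of that algebra (all abundances and the measured ratio below are hypothetical, not values from the paper):

```python
def idms_amount(n_spike, r_measured, a_nat, b_nat, a_spk, b_spk):
    """Moles of natural analyte in a spike/sample blend by isotope dilution.
    a_*: light-isotopologue abundance, b_*: heavy-isotopologue abundance,
    r_measured: measured light/heavy signal ratio of the blend. Derived from
    r = (n_x*a_nat + n_s*a_spk) / (n_x*b_nat + n_s*b_spk), solved for n_x."""
    return n_spike * (r_measured * b_spk - a_spk) / (a_nat - r_measured * b_nat)

# Hypothetical blend: 1.00 umol of 98%-enriched spike, measured ratio 0.52
n_x = idms_amount(n_spike=1.00, r_measured=0.52,
                  a_nat=0.97, b_nat=0.03,    # natural analyte, mostly light
                  a_spk=0.02, b_spk=0.98)    # enriched spike, mostly heavy
print(f"natural analyte = {n_x:.3f} umol")
```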

  8. Development of a Fourier transform infrared spectroscopy coupled to UV-Visible analysis technique for aminosides and glycopeptides quantitation in antibiotic locks.

    PubMed

    Sayet, G; Sinegre, M; Ben Reguiga, M

    2014-01-01

    The antibiotic lock technique maintains catheters' sterility in high-risk patients on long-term parenteral nutrition. In our institution, vancomycin, teicoplanin, amikacin and gentamicin locks are prepared in the pharmaceutical department. In order to ensure patient safety and to comply with regulatory requirements, antibiotic locks are submitted to qualitative and quantitative assays prior to their release. The aim of this study was to develop an alternative quantitation technique for each of these 4 antibiotics, using Fourier transform infrared (FTIR) spectroscopy coupled to UV-Visible spectroscopy, and to compare results to HPLC or immunochemistry assays. Prevalidation studies established the spectroscopic conditions used for antibiotic lock quantitation: FTIR/UV combinations were used for amikacin (1091-1115 cm⁻¹ and 208-224 nm), vancomycin (1222-1240 cm⁻¹ and 276-280 nm), and teicoplanin (1226-1230 cm⁻¹ and 278-282 nm). Gentamicin was quantified with FTIR only (1045-1169 cm⁻¹ and 2715-2850 cm⁻¹) due to interference in the UV domain from parabens, preservatives present in the commercial brand used to prepare the locks. For all antibiotic locks, the method was linear (R² = 0.996 to 0.999), accurate, repeatable (intra-day RSD: 2.9 to 7.1%; inter-day RSD: 2.9 to 5.1%) and precise. Compared to the reference methods, the FTIR/UV method was tightly correlated (Pearson factor: 97.4 to 99.9%) and did not show significant differences in recovery determinations. We developed a new, simple, reliable analysis technique for the quantitation of antibiotics in locks using an original combination of FTIR and UV analysis, allowing rapid identification and quantification of the studied antibiotics. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  9. Reliability of Soft Tissue Model Based Implant Surgical Guides; A Methodological Mistake.

    PubMed

    Sabour, Siamak; Dastjerdi, Elahe Vahid

    2012-08-20

    Abstract We were interested to read the paper by Maney P and colleagues published in the July 2012 issue of J Oral Implantol. The authors, aiming to assess the reliability of soft tissue model based implant surgical guides, reported that the accuracy was evaluated using software.1 I found the manuscript title of Maney P, et al. incorrect and misleading. Moreover, they reported that twenty-two sites (46.81%) were considered accurate (13 of 24 maxillary and 9 of 23 mandibular sites). As the authors point out in their conclusion, soft tissue models do not always provide sufficient accuracy for implant surgical guide fabrication. Reliability (precision) and validity (accuracy) are two different methodological issues in research. Sensitivity, specificity, PPV, NPV, likelihood ratio positive (sensitivity/(1 − specificity)) and likelihood ratio negative ((1 − sensitivity)/specificity), as well as the odds ratio (the ratio of true to false results, preferably more than 50), are among the tests used to evaluate the validity (accuracy) of a single test compared to a gold standard.2-4 It is not clear to which of the above-mentioned validity estimates the reported twenty-two accurate sites (46.81%) correspond. Reliability (repeatability or reproducibility) is assessed by various statistical tests; Pearson r, least squares and the paired t test are commonly applied but are all among the common mistakes in reliability analysis.5 Briefly, for quantitative variables the Intra Class Correlation Coefficient (ICC) should be used, and for qualitative variables weighted kappa, with caution, because kappa has its own limitations too. Regarding reliability or agreement, it is good to know that when computing the kappa value only the concordant cells are considered, whereas the discordant cells should also be taken into account in order to reach a correct estimate of agreement (weighted kappa).2-4 As a take-home message, for reliability and validity analysis, appropriate tests should be applied.
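
    For concreteness, the validity and agreement estimates named in this letter can all be computed directly from a 2x2 table against a gold standard. A sketch with hypothetical counts:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Validity estimates for a binary test against a gold standard."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "LR+": sens / (1 - spec),        # likelihood ratio positive
        "LR-": (1 - sens) / spec,        # likelihood ratio negative
    }

def cohens_kappa(tp, fp, fn, tn):
    """Chance-corrected agreement (unweighted kappa) from a 2x2 table."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n                                            # observed
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2   # expected
    return (po - pe) / (1 - pe)

print(diagnostic_metrics(tp=40, fp=10, fn=5, tn=45))
print(f"kappa = {cohens_kappa(tp=40, fp=10, fn=5, tn=45):.2f}")   # 0.70
```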

  10. Comment on Hall et al. (2017), "How to Choose Between Measures of Tinnitus Loudness for Clinical Research? A Report on the Reliability and Validity of an Investigator-Administered Test and a Patient-Reported Measure Using Baseline Data Collected in a Phase IIa Drug Trial".

    PubMed

    Sabour, Siamak

    2018-03-08

    The purpose of this letter, in response to Hall, Mehta, and Fackrell (2017), is to provide important knowledge about methodology and statistical issues in assessing the reliability and validity of an audiologist-administered tinnitus loudness matching test and a patient-reported tinnitus loudness rating. The author uses reference textbooks and published articles regarding scientific assessment of the validity and reliability of a clinical test to discuss the statistical tests and the methodological approach in assessing validity and reliability in clinical research. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess reliability and validity. For qualitative variables, sensitivity, specificity, positive predictive value, negative predictive value, false positive and false negative rates, likelihood ratio positive and likelihood ratio negative, as well as the odds ratio (i.e., the ratio of true to false results), are the most appropriate estimates for evaluating the validity of a test compared to a gold standard. In the case of quantitative variables, depending on the distribution of the variable, Pearson r or Spearman rho can be applied. Diagnostic accuracy (validity) and diagnostic precision (reliability or agreement) are two completely different methodological issues. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess validity.

  11. Quantitative Computerized Two-Point Correlation Analysis of Lung CT Scans Correlates With Pulmonary Function in Pulmonary Sarcoidosis

    PubMed Central

    Erdal, Barbaros Selnur; Yildiz, Vedat; King, Mark A.; Patterson, Andrew T.; Knopp, Michael V.; Clymer, Bradley D.

    2012-01-01

    Background: Chest CT scans are commonly used to clinically assess disease severity in patients presenting with pulmonary sarcoidosis. Despite their ability to reliably detect subtle changes in lung disease, the utility of chest CT scans for guiding therapy is limited by the fact that image interpretation by radiologists is qualitative and highly variable. We sought to create a computerized CT image analysis tool that would provide quantitative and clinically relevant information. Methods: We established that a two-point correlation analysis approach reduced the background signal attendant to normal lung structures, such as blood vessels, airways, and lymphatics while highlighting diseased tissue. This approach was applied to multiple lung fields to generate an overall lung texture score (LTS) representing the quantity of diseased lung parenchyma. Using deidentified lung CT scan and pulmonary function test (PFT) data from The Ohio State University Medical Center’s Information Warehouse, we analyzed 71 consecutive CT scans from patients with sarcoidosis for whom simultaneous matching PFTs were available to determine whether the LTS correlated with standard PFT results. Results: We found a high correlation between LTS and FVC, total lung capacity, and diffusing capacity of the lung for carbon monoxide (P < .0001 for all comparisons). Moreover, LTS was equivalent to PFTs for the detection of active lung disease. The image analysis protocol was conducted quickly (< 1 min per study) on a standard laptop computer connected to a publicly available National Institutes of Health ImageJ toolkit. Conclusions: The two-point image analysis tool is highly practical and appears to reliably assess lung disease severity. We predict that this tool will be useful for clinical and research applications. PMID:22628487

  12. 76 FR 12072 - Guidance for Agency Information Collection Activities: Proposed Collection; Comment Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-04

    ... not statistical surveys that yield quantitative results that can be generalized to the population of... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. No comments were received in response...

  13. Practical no-gold-standard evaluation framework for quantitative imaging methods: application to lesion segmentation in positron emission tomography

    PubMed Central

    Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.

    2017-01-01

    Abstract. Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from ¹⁸F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
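
    The patient-sampling uncertainty treated here is the textbook use case for a percentile bootstrap: resample lesions with replacement and recompute the figure of merit each time. A sketch on invented data (RMS volume error is used as an illustrative FoM, not necessarily the paper's):

```python
import numpy as np

rng = np.random.default_rng(3)

def bootstrap_ci(per_lesion_errors, n_boot=5000, alpha=0.05):
    """Percentile bootstrap confidence interval for a figure of merit
    (here: RMS error), resampling lesions to capture sampling uncertainty."""
    errors = np.asarray(per_lesion_errors)
    foms = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(errors, size=errors.size, replace=True)
        foms[b] = np.sqrt(np.mean(sample ** 2))
    return tuple(np.percentile(foms, [100 * alpha / 2, 100 * (1 - alpha / 2)]))

# Hypothetical per-lesion volume errors (mL) from one segmentation method
errs = rng.normal(0.0, 1.2, size=90)
lo, hi = bootstrap_ci(errs)
print(f"95% CI on RMS error: ({lo:.2f}, {hi:.2f})")
```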

  14. Application of real-time PCR for total airborne bacterial assessment: Comparison with epifluorescence microscopy and culture-dependent methods

    NASA Astrophysics Data System (ADS)

    Rinsoz, Thomas; Duquenne, Philippe; Greff-Mirguet, Guylaine; Oppliger, Anne

    Traditional culture-dependent methods to quantify and identify airborne microorganisms are limited by factors such as short-duration sampling times and inability to count non-culturable or non-viable bacteria. Consequently, the quantitative assessment of bioaerosols is often underestimated. Use of the real-time quantitative polymerase chain reaction (Q-PCR) to quantify bacteria in environmental samples presents an alternative method, which should overcome this problem. The aim of this study was to evaluate the performance of a real-time Q-PCR assay as a simple and reliable way to quantify the airborne bacterial load within poultry houses and sewage treatment plants, in comparison with epifluorescence microscopy and culture-dependent methods. The estimates of bacterial load that we obtained from real-time PCR and epifluorescence methods are comparable; however, our analyses of sewage treatment plants indicate these methods give values 270-290-fold greater than those obtained by the "impaction on nutrient agar" method. The culture-dependent method of air impaction on nutrient agar was also inadequate in poultry houses, as was the impinger-culture method, which gave a bacterial load estimate 32-fold lower than that obtained by Q-PCR. Real-time quantitative PCR thus proves to be a reliable, discerning, and simple method that could be used to estimate airborne bacterial load in a broad variety of other environments expected to carry high numbers of airborne bacteria.

  15. A newly developed spinal simulator.

    PubMed

    Chester, R; Watson, M J

    2000-11-01

    A number of studies indicate poor intra-therapist and inter-therapist reliability in the performance of graded, passive oscillatory movements to the lumbar spine. However, it has been suggested that therapists can be trained to be more consistent in their performance of these techniques if given reliable quantitative feedback. The intention of this study was to develop equipment, analogous to the lumbar spine, that could be used for both teaching and research purposes. The equipment has been updated and connected to an IBM-compatible personal computer. Custom-designed software allows concurrent and accurate feedback to students on their performance, in a form suitable for advanced data analysis using statistical packages. The uses and implications of this equipment are discussed. Copyright 2000 Harcourt Publishers Ltd.

  16. Quantitative O-glycomics based on improvement of the one-pot method for nonreductive O-glycan release and simultaneous stable isotope labeling with 1-(d0/d5)phenyl-3-methyl-5-pyrazolone followed by mass spectrometric analysis.

    PubMed

    Wang, Chengjian; Zhang, Ping; Jin, Wanjun; Li, Lingmei; Qiang, Shan; Zhang, Ying; Huang, Linjuan; Wang, Zhongfu

    2017-01-06

    Rapid, simple and versatile methods for quantitative analysis of glycoprotein O-glycans are urgently required for current studies on protein O-glycosylation patterns and the search for disease O-glycan biomarkers. Relative quantitation of O-glycans using stable isotope labeling followed by mass spectrometric analysis represents an ideal and promising technique. However, it is hindered by the shortage of reliable nonreductive O-glycan release methods, as well as by mass differences between the light- and heavy-isotope derivatives of O-glycans that are too large, too small or inconstant, which results in difficulties during the recognition and quantitative analysis of O-glycans by mass spectrometry. Herein we report a facile and versatile O-glycan relative quantification strategy, based on an improved one-pot method that can quantitatively achieve nonreductive release and in situ chromophoric labeling of intact mucin-type O-glycans in one step. In this study, the one-pot method is optimized and applied for quantitative O-glycan release and tagging with either non-deuterated (d0-) or deuterated (d5-) 1-phenyl-3-methyl-5-pyrazolone (PMP). The obtained O-glycan derivatives feature a permanent 10-Da mass difference between the d0- and d5-PMP forms, allowing complete discrimination and comparative quantification of these isotopically labeled O-glycans by mass spectrometric techniques. Moreover, the d0- and d5-PMP derivatives of O-glycans also have a relatively high hydrophobicity as well as strong UV absorption, especially suitable for high-resolution separation and high-sensitivity detection by RP-HPLC-UV. We have refined the conditions for the one-pot reaction as well as the corresponding sample purification approach. The good quantitation feasibility, reliability and linearity of this strategy have been verified using bovine fetuin and porcine stomach mucin as model O-glycoproteins. Additionally, we have also successfully applied this method to the quantitative O-glycomic comparison between perch and salmon eggs by ESI-MS, MS/MS and online RP-HPLC-UV-ESI-MS/MS, demonstrating its excellent applicability to various complex biological samples. O-Linked glycoproteins, generated via a widely existing glycosylation modification process on serine (Ser) or threonine (Thr) residues of nascent proteins, play essential roles in a series of biological processes. As a type of informational molecule, the O-glycans of these glycoproteins participate directly in these biological mechanisms. Thus, the characteristic differences or changes of O-glycans in expression level usually relate to pathologies of many diseases and represent an important opportunity to uncover the functional mechanisms of various glycoprotein O-glycans. The novel strategy introduced here provides a simple and versatile analytical method for the precise quantitation of glycoprotein O-glycans by mass spectrometry, enabling rapid evaluation of the differences or changes of O-glycans in expression level. It is attractive for the field of quantitative/comparative O-glycomics, which has great significance for exploring the complex structure-function relationship of O-glycans, as well as for the search of O-glycan biomarkers of some major diseases and O-glycan related targets of some drugs. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Quantification of EEG reactivity in comatose patients.

    PubMed

    Hermans, Mathilde C; Westover, M Brandon; van Putten, Michel J A M; Hirsch, Lawrence J; Gaspard, Nicolas

    2016-01-01

    EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed best accuracy (Median AUC: 0.95) and was accompanied by substantial agreement with the individual opinion of experts (Gwet's AC1: 65-70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-experts' agreement regarding reactivity for each individual case. Automated quantitative EEG approaches based on probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  18. Tissue-Specific Analysis of Secondary Metabolites Creates a Reliable Morphological Criterion for Quality Grading of Polygoni Multiflori Radix.

    PubMed

    Liang, Li; Xu, Jun; Liang, Zhi-Tao; Dong, Xiao-Ping; Chen, Hu-Biao; Zhao, Zhong-Zhen

    2018-05-08

    In commercial herbal markets, Polygoni Multiflori Radix (PMR, the tuberous roots of Polygonum multiflorum Thunb.), a commonly-used Chinese medicinal material, is divided into different grades based on morphological features of size and weight. While more weight and larger size command a higher price, there is no scientific data confirming that the more expensive roots are in fact of better quality. To assess the inherent quality of various grades and of various tissues in PMR and to find reliable morphological indicators of quality, a method combining laser microdissection (LMD) and ultra-performance liquid chromatography triple-quadrupole mass spectrometry (UPLC-QqQ-MS/MS) was applied. Twelve major chemical components were quantitatively determined in both whole material and different tissues of PMR. Determination of the whole material revealed that traditional commercial grades based on size and weight of PRM did not correspond to any significant differences in chemical content. Instead, tissue-specific analysis indicated that the morphological features could be linked with quality in a new way. That is, PMR with broader cork and phloem, as seen in a transverse section, were typically of better quality as these parts are where the bioactive components accumulate. The tissue-specific analysis of secondary metabolites creates a reliable morphological criterion for quality grading of PMR.

  19. Evaluating the Reliability of Emergency Response Systems for Large-Scale Incident Operations

    PubMed Central

    Jackson, Brian A.; Faith, Kay Sullivan; Willis, Henry H.

    2012-01-01

    Abstract The ability to measure emergency preparedness—to predict the likely performance of emergency response systems in future events—is critical for policy analysis in homeland security. Yet it remains difficult to know how prepared a response system is to deal with large-scale incidents, whether it be a natural disaster, terrorist attack, or industrial or transportation accident. This research draws on the fields of systems analysis and engineering to apply the concept of system reliability to the evaluation of emergency response systems. The authors describe a method for modeling an emergency response system; identifying how individual parts of the system might fail; and assessing the likelihood of each failure and the severity of its effects on the overall response effort. The authors walk the reader through two applications of this method: a simplified example in which responders must deliver medical treatment to a certain number of people in a specified time window, and a more complex scenario involving the release of chlorine gas. The authors also describe an exploratory analysis in which they parsed a set of after-action reports describing real-world incidents, to demonstrate how this method can be used to quantitatively analyze data on past response performance. The authors conclude with a discussion of how this method of measuring emergency response system reliability could inform policy discussion of emergency preparedness, how system reliability might be improved, and the costs of doing so. PMID:28083267
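
    In its simplest form, the failure-probability bookkeeping described here reduces to series/parallel reliability algebra over the modeled response chain. A toy sketch (the chain and the probabilities are invented for illustration):

```python
def series(*ps):
    """Reliability of steps that must all succeed."""
    r = 1.0
    for p in ps:
        r *= p
    return r

def parallel(*ps):
    """Reliability with redundancy: fails only if every branch fails."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

# Hypothetical chain: detection -> dispatch -> two redundant treatment teams
p_detect, p_dispatch, p_team = 0.99, 0.95, 0.80
p_response = series(p_detect, p_dispatch, parallel(p_team, p_team))
print(f"P(successful response) = {p_response:.3f}")   # ~0.903
```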

  20. Reliability of Various Measurement Stations for Determining Plantar Fascia Thickness and Echogenicity.

    PubMed

    Bisi-Balogun, Adebisi; Cassel, Michael; Mayer, Frank

    2016-04-13

    This study aimed to determine the relative and absolute reliability of ultrasound (US) measurements of the thickness and echogenicity of the plantar fascia (PF) at different measurement stations along its length using a standardized protocol. Twelve healthy subjects (24 feet) were enrolled. The PF was imaged in the longitudinal plane. Subjects were assessed twice to evaluate the intra-rater reliability. A quantitative evaluation of the thickness and echogenicity of the plantar fascia was performed using ImageJ, digital image analysis and viewer software. Sonographic evaluation of the thickness and echogenicity of the PF showed high relative reliability, with an intraclass correlation coefficient (ICC) of ≥0.88 at all measurement stations. However, the measurement stations for both the PF thickness and echogenicity which showed the highest ICCs did not have the highest absolute reliability. Compared to other measurement stations, measuring the PF thickness at 3 cm distal and the echogenicity at a region of interest 1 cm to 2 cm distal from its insertion at the medial calcaneal tubercle showed the highest absolute reliability with the least systematic bias and random error. Also, the reliability was higher using a mean of three measurements compared to one measurement. To reduce discrepancies in the interpretation of the thickness and echogenicity measurements of the PF, the absolute reliability of the different measurement stations should be considered in clinical practice and research rather than the relative reliability with the ICC.
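
    The ICC used here treats feet and measurement sessions as random effects. A sketch of the single-measure, absolute-agreement form ICC(2,1), computed from the two-way ANOVA decomposition (the thickness values are hypothetical):

```python
import numpy as np

def icc_2_1(y):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.
    y: (n_subjects x k_raters_or_sessions) matrix of scores."""
    y = np.asarray(y, dtype=float)
    n, k = y.shape
    grand = y.mean()
    ms_rows = k * ((y.mean(1) - grand) ** 2).sum() / (n - 1)   # subjects
    ms_cols = n * ((y.mean(0) - grand) ** 2).sum() / (k - 1)   # sessions
    sse = ((y - y.mean(1, keepdims=True) - y.mean(0) + grand) ** 2).sum()
    ms_err = sse / ((n - 1) * (k - 1))                         # residual
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                 + k * (ms_cols - ms_err) / n)

# Hypothetical PF thickness (mm) measured in two sessions for 5 feet
y = [[3.1, 3.2], [4.0, 3.9], [2.8, 2.9], [3.5, 3.5], [4.2, 4.1]]
print(f"ICC(2,1) = {icc_2_1(y):.2f}")   # close to 1 for these repeated values
```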

  1. Reliability of Various Measurement Stations for Determining Plantar Fascia Thickness and Echogenicity

    PubMed Central

    Bisi-Balogun, Adebisi; Cassel, Michael; Mayer, Frank

    2016-01-01

    This study aimed to determine the relative and absolute reliability of ultrasound (US) measurements of the thickness and echogenicity of the plantar fascia (PF) at different measurement stations along its length using a standardized protocol. Twelve healthy subjects (24 feet) were enrolled. The PF was imaged in the longitudinal plane. Subjects were assessed twice to evaluate the intra-rater reliability. A quantitative evaluation of the thickness and echogenicity of the plantar fascia was performed using ImageJ, digital image analysis and viewer software. Sonographic evaluation of the thickness and echogenicity of the PF showed high relative reliability, with an intraclass correlation coefficient (ICC) of ≥0.88 at all measurement stations. However, the measurement stations for both the PF thickness and echogenicity which showed the highest ICCs did not have the highest absolute reliability. Compared to other measurement stations, measuring the PF thickness at 3 cm distal and the echogenicity at a region of interest 1 cm to 2 cm distal from its insertion at the medial calcaneal tubercle showed the highest absolute reliability with the least systematic bias and random error. Also, the reliability was higher using a mean of three measurements compared to one measurement. To reduce discrepancies in the interpretation of the thickness and echogenicity measurements of the PF, the absolute reliability of the different measurement stations should be considered in clinical practice and research rather than the relative reliability with the ICC. PMID:27089369

  2. Reliability and precision of pellet-group counts for estimating landscape-level deer density

    Treesearch

    David S. deCalesta

    2013-01-01

    This study provides hitherto unavailable methodology for reliably and precisely estimating deer density within forested landscapes, enabling quantitative rather than qualitative deer management. Reliability and precision of the deer pellet-group technique were evaluated in 1 small and 2 large forested landscapes. Density estimates, adjusted to reflect deer harvest and...

  3. A brief update on physical and optical disector applications and sectioning-staining methods in neuroscience.

    PubMed

    Yurt, Kıymet Kübra; Kivrak, Elfide Gizem; Altun, Gamze; Mohamed, Hamza; Ali, Fathelrahman; Gasmalla, Hosam Eldeen; Kaplan, Suleyman

    2018-02-26

    A quantitative description of a three-dimensional (3D) object based on two-dimensional images can be made using stereological methods. These methods involve unbiased approaches and provide reliable results with quantitative data. The quantitative morphology of the nervous system has been thoroughly researched in this context. In particular, various novel methods such as design-based stereological approaches have been applied in neuromorphological studies. The main foundations of these methods are systematic random sampling and a 3D approach to structures such as tissues and organs. One key point in these methods is that selected samples should represent the entire structure. Quantification of neurons, i.e. particles, is important for revealing degrees of neurodegeneration and regeneration in an organ or system. One of the most crucial morphometric parameters in biological studies is thus the "number". The disector counting method introduced by Sterio in 1984 is an efficient and reliable solution for particle number estimation. In order to obtain precise results by means of stereological analysis, the items to be counted must be clearly visible in the tissue; if an item in the tissue cannot be seen, it cannot be analyzed, even with unbiased stereological techniques. Staining and sectioning processes therefore play a critical role in stereological analysis. The purpose of this review is to evaluate current neuroscientific studies using optical and physical disector counting methods and to discuss their definitions and methodological characteristics. Although the efficiency of the optical disector method in light microscopic studies has been revealed in recent years, the physical disector method is more easily performed in electron microscopic studies. In this review we also offer readers summaries of some common basic staining and sectioning methods that can be used with stereological techniques. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Assessing Psychodynamic Conflict.

    PubMed

    Simmonds, Joshua; Constantinides, Prometheas; Perry, J Christopher; Drapeau, Martin; Sheptycki, Amanda R

    2015-09-01

    Psychodynamic psychotherapies suggest that symptomatic relief is provided, in part, with the resolution of psychic conflicts. Clinical researchers have used innovative methods to investigate such phenomenon. This article aims to review the literature on quantitative psychodynamic conflict rating scales. An electronic search of the literature was conducted to retrieve quantitative observer-rated scales used to assess conflict noting each measure's theoretical model, information source, and training and clinical experience required. Scales were also examined for levels of reliability and validity. Five quantitative observer-rated conflict scales were identified. Reliability varied from poor to excellent with each measure demonstrating good validity. However a small number of studies and limited links to current conflict theory suggest further clinical research is needed.

  5. A Targeted LC-MS/MS Method for the Simultaneous Detection and Quantitation of Egg, Milk, and Peanut Allergens in Sugar Cookies.

    PubMed

    Boo, Chelsea C; Parker, Christine H; Jackson, Lauren S

    2018-01-01

    Food allergy is a growing public health concern, with many individuals reporting allergies to multiple food sources. Compliance with food labeling regulations and prevention of inadvertent cross-contact in manufacturing requires the use of reliable methods for the detection and quantitation of allergens in processed foods. In this work, a novel liquid chromatography-tandem mass spectrometry multiple-reaction monitoring method for multiallergen detection and quantitation of egg, milk, and peanut was developed and evaluated in an allergen-incurred baked sugar cookie matrix. A systematic evaluation of method parameters, including sample extraction, concentration, and digestion, were optimized for candidate allergen peptide markers. The optimized method enabled the reliable detection and quantitation of egg, milk, and peanut allergens in sugar cookies, with allergen concentrations as low as 5 ppm allergen-incurred ingredient.

  6. Multivariate pattern analysis for MEG: A comparison of dissimilarity measures.

    PubMed

    Guggenmos, Matthias; Sterzer, Philipp; Cichy, Radoslaw Martin

    2018-06-01

    Multivariate pattern analysis (MVPA) methods such as decoding and representational similarity analysis (RSA) are growing rapidly in popularity for the analysis of magnetoencephalography (MEG) data. However, little is known about the relative performance and characteristics of the specific dissimilarity measures used to describe differences between evoked activation patterns. Here we used a multisession MEG data set to qualitatively characterize a range of dissimilarity measures and to quantitatively compare them with respect to decoding accuracy (for decoding) and between-session reliability of representational dissimilarity matrices (for RSA). We tested dissimilarity measures from a range of classifiers (Linear Discriminant Analysis - LDA, Support Vector Machine - SVM, Weighted Robust Distance - WeiRD, Gaussian Naïve Bayes - GNB) and distances (Euclidean distance, Pearson correlation). In addition, we evaluated three key processing choices: 1) preprocessing (noise normalization, removal of the pattern mean), 2) weighting decoding accuracies by decision values, and 3) computing distances in three different partitioning schemes (non-cross-validated, cross-validated, within-class-corrected). Four main conclusions emerged from our results. First, appropriate multivariate noise normalization substantially improved decoding accuracies and the reliability of dissimilarity measures. Second, LDA, SVM and WeiRD yielded high peak decoding accuracies and nearly identical time courses. Third, while using decoding accuracies for RSA was markedly less reliable than using continuous distances, this disadvantage was ameliorated by decision-value weighting of decoding accuracies. Fourth, the cross-validated Euclidean distance provided unbiased distance estimates and highly replicable representational dissimilarity matrices. Overall, we strongly advise the use of multivariate noise normalization as a general preprocessing step, recommend LDA, SVM and WeiRD as classifiers for decoding, and highlight the cross-validated Euclidean distance as a reliable and unbiased default choice for RSA. Copyright © 2018 Elsevier Inc. All rights reserved.
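
    The cross-validated Euclidean distance recommended above is simple to state in code: the condition-difference pattern is estimated in two independent data partitions and their dot product is taken, so noise does not inflate the distance (its expectation is zero for identical conditions). A minimal sketch on simulated trial-by-channel data, not the authors' MEG set:

        import numpy as np

        rng = np.random.default_rng(0)
        n_trials, n_channels = 40, 64
        A = rng.normal(0.0, 1.0, (n_trials, n_channels)) + 0.3   # condition A trials
        B = rng.normal(0.0, 1.0, (n_trials, n_channels))         # condition B trials

        def cv_euclidean(A, B):
            half = A.shape[0] // 2
            dA1 = A[:half].mean(0) - B[:half].mean(0)   # difference, partition 1
            dA2 = A[half:].mean(0) - B[half:].mean(0)   # difference, partition 2
            return dA1 @ dA2                            # unbiased squared distance

        print(f"cross-validated squared distance: {cv_euclidean(A, B):.3f}")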

  7. Evaluating the Performance of the IEEE Standard 1366 Method for Identifying Major Event Days

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eto, Joseph H.; LaCommare, Kristina Hamachi; Sohn, Michael D.

    IEEE Standard 1366 offers a method for segmenting reliability performance data to isolate the effects of major events from the underlying year-to-year trends in reliability. Recent analysis by the IEEE Distribution Reliability Working Group (DRWG) has found that reliability performance of some utilities differs from the expectations that helped guide the development of the Standard 1366 method. This paper proposes quantitative metrics to evaluate the performance of the Standard 1366 method in identifying major events and in reducing year-to-year variability in utility reliability. The metrics are applied to a large sample of utility-reported reliability data to assess performance of the method with alternative specifications that have been considered by the DRWG. We find that none of the alternatives perform uniformly 'better' than the current Standard 1366 method. That is, none of the modifications uniformly lowers the year-to-year variability in System Average Interruption Duration Index without major events. Instead, for any given alternative, while it may lower the value of this metric for some utilities, it also increases it for other utilities (sometimes dramatically). Thus, we illustrate some of the trade-offs that must be considered in using the Standard 1366 method and highlight the usefulness of the metrics we have proposed in conducting these evaluations.
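
    For readers unfamiliar with the procedure under evaluation, a minimal sketch of the Standard 1366 "2.5 beta" classification: fit a log-normal distribution to five years of daily SAIDI values and flag days above T_MED = exp(alpha + 2.5*beta) as major event days. The daily SAIDI series below is simulated, not utility data.

        import numpy as np

        rng = np.random.default_rng(1)
        daily_saidi = rng.lognormal(mean=-1.0, sigma=1.1, size=5 * 365)

        logs = np.log(daily_saidi[daily_saidi > 0])   # zero days are excluded
        alpha, beta = logs.mean(), logs.std(ddof=1)
        t_med = np.exp(alpha + 2.5 * beta)

        major_event_days = daily_saidi > t_med
        print(f"T_MED = {t_med:.3f} min/day, {major_event_days.sum()} major event days")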

  8. Application of high-resolution melting analysis for authenticity testing of valuable Dendrobium commercial products.

    PubMed

    Dong, Xiaoman; Jiang, Chao; Yuan, Yuan; Peng, Daiyin; Luo, Yuqin; Zhao, Yuyang; Huang, Luqi

    2018-01-01

    The accurate identification of botanical origin in commercial products is important to ensure food authenticity and safety for consumers. The Dendrobium species have long been commercialised as functional food supplements and herbal medicines in Asia. Three valuable Dendrobium species, namely Dendrobium officinale, D. huoshanense and D. moniliforme, are often mutually adulterated in trade products in pursuit of higher profit. In this paper, a rapid and reliable semi-quantitative method for identifying the botanical origin of Dendrobium products in terminal markets was developed using high-resolution melting (HRM) analysis with specific primer pairs to target the trnL-F region. The HRM analysis method detected amounts of D. moniliforme adulterants as low as 1% in D. huoshanense or D. officinale products. The results have demonstrated that HRM analysis is a fast and effective tool for the differentiation of these Dendrobium species both for their authenticity as well as for the semi-quantitative determination of the purity of their processed products. © 2017 Society of Chemical Industry.

  9. Computerized EEG analysis for studying the effect of drugs on the central nervous system.

    PubMed

    Rosadini, G; Cavazza, B; Rodriguez, G; Sannita, W G; Siccardi, A

    1977-11-01

    Samples of our experience in quantitative pharmaco-EEG are reviewed to discuss and define its applicability and limits. Simple processing systems, such as the computation of Hjorth's descriptors, are useful for on-line monitoring of drug-induced EEG modifications that are also evident on visual analysis. Power spectral analysis is suitable for identifying and quantifying EEG effects not evident on visual inspection. It has demonstrated how the EEG effects of compounds in a long-acting formulation vary according to the sampling time and the explored cerebral area. EEG modifications not detected by power spectral analysis can be defined by statistically comparing (F test) the spectral values of the EEG from a single lead at the different samples (longitudinal comparison), or the spectral values from different leads at any sample (intrahemispheric comparison). The presently available procedures of quantitative pharmaco-EEG are effective when applied to the study of multilead EEG recordings in a statistically significant population sample. They do not seem reliable for monitoring or directing neuropsychiatric therapies in single patients, owing to individual variability of drug effects.
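
    Hjorth's descriptors mentioned above reduce to three variance ratios of the signal and its derivatives. A minimal sketch on a synthetic signal, not the authors' pipeline:

        import numpy as np

        def hjorth(x):
            """Hjorth's activity, mobility, and complexity of a 1D signal."""
            dx = np.diff(x)
            ddx = np.diff(dx)
            activity = np.var(x)
            mobility = np.sqrt(np.var(dx) / np.var(x))
            complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
            return activity, mobility, complexity

        fs = 256.0                                   # sampling rate, Hz
        t = np.arange(0, 10, 1 / fs)
        # toy "EEG": 10 Hz alpha rhythm plus noise
        eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(2).normal(size=t.size)
        print("activity=%.3f mobility=%.3f complexity=%.3f" % hjorth(eeg))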

  10. Failure mode and effects analysis: too little for too much?

    PubMed

    Dean Franklin, Bryony; Shebl, Nada Atef; Barber, Nick

    2012-07-01

    Failure mode and effects analysis (FMEA) is a structured prospective risk assessment method that is widely used within healthcare. FMEA involves a multidisciplinary team mapping out a high-risk process of care, identifying the failures that can occur, and then characterising each of these in terms of probability of occurrence, severity of effects and detectability, to give a risk priority number used to identify failures most in need of attention. One might assume that such a widely used tool would have an established evidence base. This paper considers whether or not this is the case, examining the evidence for the reliability and validity of its outputs, the mathematical principles behind the calculation of a risk priority number, and variation in how it is used in practice. We also consider the likely advantages of this approach, together with the disadvantages in terms of the healthcare professionals' time involved. We conclude that although FMEA is popular and many published studies have reported its use within healthcare, there is little evidence to support its use for the quantitative prioritisation of process failures. It lacks both reliability and validity, and is very time consuming. We would not recommend its use as a quantitative technique to prioritise, promote or study patient safety interventions. However, the stage of FMEA involving multidisciplinary mapping of the process seems valuable, and work is now needed to identify the best way of converting this into plans for action.
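
    The risk priority number arithmetic under critique is easy to reproduce. The sketch below uses invented failure modes and scores; note that multiplying ordinal 1-10 scales in this way is precisely the mathematical weakness the paper discusses.

        # Toy illustration of RPN = occurrence x severity x detectability,
        # each scored on an ordinal 1-10 scale. Failure modes are invented.

        failure_modes = [
            # (description, occurrence, severity, detectability)
            ("wrong drug dispensed",     3, 9, 4),
            ("dose calculation error",   5, 7, 3),
            ("label misread at bedside", 6, 6, 7),
        ]

        ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
        for desc, o, s, d in ranked:
            print(f"RPN {o*s*d:3d}  {desc}")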

  11. Composing, Analyzing and Validating Software Models

    NASA Astrophysics Data System (ADS)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  12. Composing, Analyzing and Validating Software Models

    NASA Technical Reports Server (NTRS)

    Sheldon, Frederick T.

    1998-01-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  13. Practical considerations for obtaining high quality quantitative computed tomography data of the skeletal system.

    PubMed

    Troy, Karen L; Edwards, W Brent

    2018-05-01

    Quantitative CT (QCT) analysis involves the calculation of specific parameters such as bone volume and density from CT image data, and can be a powerful tool for understanding bone quality and quantity. However, without careful attention to detail during all steps of the acquisition and analysis process, data can be of poor to unusable quality. Good-quality QCT for research requires meticulous attention to detail and standardization of all aspects of data collection and analysis to a degree that is uncommon in a clinical setting. Here, we review the literature to summarize practical and technical considerations for obtaining high-quality QCT data, and provide examples of how each recommendation affects calculated variables. We also provide an overview of the QCT analysis technique to illustrate additional opportunities to improve data reproducibility and reliability. Key recommendations include: standardizing the scanner and data acquisition settings, minimizing image artifacts, selecting an appropriate reconstruction algorithm, and maximizing repeatability and objectivity during QCT analysis. The goal of the recommendations is to reduce potential sources of error throughout the analysis, from scan acquisition to the interpretation of results. Copyright © 2018 Elsevier Inc. All rights reserved.
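
    One standardization step implied above is calibrating Hounsfield units to bone-equivalent density using a phantom scanned alongside the subject. A hedged sketch of that linear calibration, with invented phantom values:

        import numpy as np

        # Hypothetical calibration phantom: measured HU vs. known
        # bone-equivalent densities (g/cm^3). Values are illustrative.
        phantom_hu = np.array([-15.0, 80.0, 210.0, 390.0])
        phantom_density = np.array([0.0, 0.075, 0.175, 0.35])

        slope, intercept = np.polyfit(phantom_hu, phantom_density, 1)

        def hu_to_density(hu):
            """Convert a HU value to bone-equivalent density (g/cm^3)."""
            return slope * hu + intercept

        print(f"400 HU -> {hu_to_density(400.0):.3f} g/cm^3")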

  14. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework

    PubMed Central

    Anguera, M. Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2018-01-01

    Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts. PMID:29441028

  15. Accuracy and Precision of Radioactivity Quantification in Nuclear Medicine Images

    PubMed Central

    Frey, Eric C.; Humm, John L.; Ljungberg, Michael

    2012-01-01

    The ability to reliably quantify activity in nuclear medicine has a number of increasingly important applications. Dosimetry for targeted therapy treatment planning or for approval of new imaging agents requires accurate estimation of the activity in organs, tumors, or voxels at several imaging time points. Another important application is the use of quantitative metrics derived from images, such as the standard uptake value commonly used in positron emission tomography (PET), to diagnose and follow treatment of tumors. These measures require quantification of organ or tumor activities in nuclear medicine images. However, there are a number of physical, patient, and technical factors that limit the quantitative reliability of nuclear medicine images. There have been a large number of improvements in instrumentation, including the development of hybrid single-photon emission computed tomography/computed tomography and PET/computed tomography systems, and reconstruction methods, including the use of statistical iterative reconstruction methods, which have substantially improved the ability to obtain reliable quantitative information from planar, single-photon emission computed tomography, and PET images. PMID:22475429
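
    As a hedged illustration of the standardized uptake value mentioned above: SUV is the tissue activity concentration divided by injected activity per unit body weight, decay-corrected to scan time. All numbers below are illustrative.

        import math

        def suv(c_tissue_bq_per_ml, injected_mbq, weight_kg,
                minutes_since_injection, half_life_min=109.77):  # F-18 half-life
            # decay-correct the injected dose to the imaging time point
            decayed_mbq = injected_mbq * math.exp(
                -math.log(2) * minutes_since_injection / half_life_min)
            dose_bq_per_g = decayed_mbq * 1e6 / (weight_kg * 1000.0)
            return c_tissue_bq_per_ml / dose_bq_per_g   # assumes 1 g/ml tissue density

        print(f"SUV = {suv(12000.0, 370.0, 75.0, 60.0):.2f}")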

  16. 76 FR 55725 - Agency Information Collection Activities: Request for Comments for a New Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-08

    ... statistical surveys that yield quantitative results that can be generalized to the population of study. This... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. The FHWA received no comments in...

  17. Characterization and quantitation of polyolefin microplastics in personal-care products using high-temperature gel-permeation chromatography.

    PubMed

    Hintersteiner, Ingrid; Himmelsbach, Markus; Buchberger, Wolfgang W

    2015-02-01

    In recent years, the development of reliable methods for the quantitation of microplastics in different samples, needed among other things to evaluate the particles' adverse effects in the marine environment, has become a matter of great concern. Because polyolefins are the most prevalent type of polymer in personal-care products containing microplastics, this study presents a novel approach for their quantitation. The method is suitable for aqueous and hydrocarbon-based products, and includes a rapid sample clean-up involving twofold density separation and subsequent quantitation by high-temperature gel-permeation chromatography. In contrast with previous procedures, both errors caused by weighing after insufficient separation of plastics and matrix and time-consuming visual sorting are avoided. In addition to reliable quantitative results, this investigation provides a comprehensive characterization of the polymer particles isolated from the product matrix, covering size, shape, molecular weight distribution and stabilization. Results for seven different personal-care products are presented. Recoveries of this method were in the range of 92-96%.

  18. Reliability and validity of Edinburgh visual gait score as an evaluation tool for children with cerebral palsy.

    PubMed

    Del Pilar Duque Orozco, Maria; Abousamra, Oussama; Church, Chris; Lennon, Nancy; Henley, John; Rogers, Kenneth J; Sees, Julieanne P; Connor, Justin; Miller, Freeman

    2016-09-01

    Assessment of gait abnormalities in cerebral palsy (CP) is challenging, and access to instrumented gait analysis is not always feasible. Therefore, many observational gait analysis scales have been devised. This study aimed to evaluate the interobserver reliability, intraobserver reliability, and validity of the Edinburgh visual gait score (EVGS). Videos of 30 children with spastic CP were reviewed by 7 raters (10 children each in GMFCS levels I, II, and III, age 6-12 years). Three observers had a high level of experience in gait analysis (10+ years), two had a medium level (2-5 years), and two had no previous experience (orthopedic fellows). Interobserver reliability was evaluated using the percentage of complete agreement and kappa values. Criterion validity was evaluated by comparing EVGS scores with 3DGA data taken from the same video visit. Interobserver agreement was 60-90% and kappa values were 0.18-0.85 for the 17 items in the EVGS. Reliability was higher for distal segments (foot/ankle/knee 63-90%; trunk/pelvis/hip 60-76%), with greater experience (high 66-91%, medium 62-90%, no experience 41-87%), with more EVGS practice (first 10 videos 52-88%, last 10 videos 64-97%), and when used with higher-functioning children (GMFCS I 65-96%, II 58-90%, III 35-65%). Intraobserver agreement was 64-92%. Agreement between EVGS and 3DGA was 52-73%. We believe that having the EVGS as part of a standardized gait evaluation is helpful in optimizing visual scoring. The EVGS can be a supportive tool that adds quantitative data, instead of only qualitative assessment, to a video-only gait evaluation. Copyright © 2016 Elsevier B.V. All rights reserved.
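
    The agreement statistics reported above can be reproduced as follows. The sketch uses invented ratings and the usual definitions of complete agreement and Cohen's kappa, which may differ in detail from the authors' computation.

        import numpy as np

        # Two raters scoring one EVGS item on the same 12 videos (invented).
        rater1 = np.array([0, 1, 2, 1, 0, 2, 1, 1, 0, 2, 2, 1])
        rater2 = np.array([0, 1, 2, 0, 0, 2, 1, 2, 0, 2, 1, 1])

        p_observed = np.mean(rater1 == rater2)

        # chance agreement from the raters' marginal score frequencies
        cats = np.union1d(rater1, rater2)
        p1 = np.array([np.mean(rater1 == c) for c in cats])
        p2 = np.array([np.mean(rater2 == c) for c in cats])
        p_chance = np.sum(p1 * p2)

        kappa = (p_observed - p_chance) / (1.0 - p_chance)
        print(f"agreement = {p_observed:.0%}, kappa = {kappa:.2f}")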

  19. Rapid quantitative analysis of 8-iso-prostaglandin-F(2alpha) using liquid chromatography-tandem mass spectrometry and comparison with an enzyme immunoassay method.

    PubMed

    Dahl, Jeffrey H; van Breemen, Richard B

    2010-09-15

    A rapid liquid chromatography-tandem mass spectrometry (LC-MS/MS) assay was developed for the measurement of urinary 8-iso-prostaglandin F(2alpha) (8-iso-PGF(2alpha)), a biomarker of lipid peroxidation. Because urine contains numerous F(2) prostaglandin isomers, each with identical mass and similar mass spectrometric fragmentation patterns, chromatographic separation of 8-iso-PGF(2alpha) from its isomers is necessary for its quantitative analysis using MS/MS. We were able to achieve this separation using an isocratic LC method with a run time of less than 9 min, which is at least threefold faster than previous methods, while maintaining sensitivity, accuracy, precision, and reliability. The limits of detection and quantitation were 53 and 178 pg/ml urine, respectively. We compared our method with a commercially available affinity purification and enzyme immunoassay kit and found both assays to be in agreement. Despite the high sensitivity of the enzyme immunoassay method, it is more expensive and has a narrower dynamic range than LC-MS/MS. Our method was optimized for rapid measurement of 8-iso-PGF(2alpha) in urine, and it is ideally suited for clinical sample analysis. Copyright © 2010 Elsevier Inc. All rights reserved.
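
    The paper reports limits of detection and quantitation; one common convention derives them from calibration-curve residuals as LOD = 3.3*sigma/slope and LOQ = 10*sigma/slope. A sketch under that assumption (the paper does not state which convention it used), with invented calibration data:

        import numpy as np

        conc = np.array([0.1, 0.25, 0.5, 1.0, 2.5, 5.0])                    # ng/ml standards
        peak_ratio = np.array([0.021, 0.049, 0.102, 0.198, 0.501, 1.012])   # analyte/IS area

        slope, intercept = np.polyfit(conc, peak_ratio, 1)
        residuals = peak_ratio - (slope * conc + intercept)
        sigma = residuals.std(ddof=2)    # ddof=2: two fitted parameters

        print(f"LOD ~ {3.3 * sigma / slope:.3f} ng/ml, LOQ ~ {10 * sigma / slope:.3f} ng/ml")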

  20. Projecting technology change to improve space technology planning and systems management

    NASA Astrophysics Data System (ADS)

    Walk, Steven Robert

    2011-04-01

    Projecting technology performance evolution has been improving over the years. Reliable quantitative forecasting methods have been developed that project the growth, diffusion, and performance of technology in time, including projecting technology substitutions, saturation levels, and performance improvements. These forecasts can be applied at the early stages of space technology planning to better predict available future technology performance, assure the successful selection of technology, and improve technology systems management strategy. Often what is published as a technology forecast is simply scenario planning, usually made by extrapolating current trends into the future, with perhaps some subjective insight added. Typically, the accuracy of such predictions falls rapidly with distance in time. Quantitative technology forecasting (QTF), on the other hand, includes the study of historic data to identify one of or a combination of several recognized universal technology diffusion or substitution patterns. In the same manner that quantitative models of physical phenomena provide excellent predictions of system behavior, so do QTF models provide reliable technological performance trajectories. In practice, a quantitative technology forecast is completed to ascertain with confidence when the projected performance of a technology or system of technologies will occur. Such projections provide reliable time-referenced information when considering cost and performance trade-offs in maintaining, replacing, or migrating a technology, component, or system. This paper introduces various quantitative technology forecasting techniques and illustrates their practical application in space technology and technology systems management.
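
    A classic QTF model of the kind described is the Fisher-Pry substitution curve, in which the new technology's market share follows a logistic, so the log-odds of the share are linear in time and can be fitted to history and extrapolated. A minimal sketch with invented adoption data:

        import numpy as np

        years = np.array([2000, 2002, 2004, 2006, 2008, 2010])
        fraction = np.array([0.05, 0.09, 0.18, 0.33, 0.52, 0.70])   # new-tech share

        logit = np.log(fraction / (1.0 - fraction))   # linearize the logistic
        b, a = np.polyfit(years, logit, 1)            # slope b, intercept a

        def forecast(year):
            z = a + b * year
            return 1.0 / (1.0 + np.exp(-z))

        print(f"projected share in 2015: {forecast(2015):.0%}")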

  1. An assessment of the reliability of quantitative genetics estimates in study systems with high rate of extra-pair reproduction and low recruitment.

    PubMed

    Bourret, A; Garant, D

    2017-03-01

    Quantitative genetics approaches, and particularly animal models, are widely used to assess the genetic (co)variance of key fitness-related traits and infer the adaptive potential of wild populations. Despite the importance of precision and accuracy of genetic variance estimates and their potential sensitivity to various ecological and population-specific factors, their reliability is rarely tested explicitly. Here, we used simulations and empirical data collected from an 11-year study of the tree swallow (Tachycineta bicolor), a species showing a high rate of extra-pair paternity and a low recruitment rate, to assess the importance of identity errors, structure and size of the pedigree on quantitative genetic estimates in our dataset. Our simulations revealed an important lack of precision in heritability and genetic-correlation estimates for most traits, a low power to detect significant effects and important identifiability problems. We also observed a large bias in heritability estimates when using the social pedigree instead of the genetic one (deflated heritabilities) or when not accounting for an important cause of resemblance among individuals (for example, permanent environment or brood effect) in model parameterizations for some traits (inflated heritabilities). We discuss the causes underlying the low reliability observed here and why they are also likely to occur in other study systems. Altogether, our results re-emphasize the difficulties of generalizing quantitative genetic estimates reliably from one study system to another and the importance of reporting simulation analyses to evaluate these important issues.

  2. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls. These sources often dominate component-level reliability or failure probability. While the consequences of failure are often understood in assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Due to the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision-making purposes. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful in arriving at a more realistic quantification of risk prior to acceptance by a program.
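
    As a hedged illustration of how predicted failure rates propagate into a risk number: under a constant-rate (exponential) model, mission failure probability is P = 1 - exp(-lambda*t), and an underestimated rate understates risk proportionally. The rates below are placeholders, not handbook values.

        import math

        components = {               # predicted failure rates, failures/hour
            "avionics box":   2.0e-6,
            "valve actuator": 5.0e-7,
            "sensor string":  1.2e-6,
        }
        mission_hours = 10.0

        lam_total = sum(components.values())        # series system: rates add
        p_fail = 1.0 - math.exp(-lam_total * mission_hours)
        print(f"P(failure over {mission_hours} h) = {p_fail:.2e}")

        # If manufacturing/process escapes effectively double the true rate,
        # the predicted-data estimate understates risk by the same factor:
        print(f"with 2x rate: {1.0 - math.exp(-2 * lam_total * mission_hours):.2e}")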

  3. Visualizing variations in organizational safety culture across an inter-hospital multifaceted workforce.

    PubMed

    Kobuse, Hiroe; Morishima, Toshitaka; Tanaka, Masayuki; Murakami, Genki; Hirose, Masahiro; Imanaka, Yuichi

    2014-06-01

    To develop a reliable and valid questionnaire that can distinguish features of organizational culture for patient safety across subgroups such as hospitals, professions, management/non-management positions and units/wards. We developed a Hospital Organizational Culture Questionnaire based on a conceptual framework incorporating items from a review of existing literature. The questionnaire was administered to hospital staff including doctors, nurses, allied health personnel, and administrative staff at six public hospitals in Japan. Reliability and validity were assessed through exploratory factor analysis, multitrait scaling analysis, Cronbach's alpha coefficient and multiple regression analysis using staff-perceived achievement of safety as the response variable. Discriminative power across subgroups was assessed with radar chart profiling. Of the 3304 hospital staff surveyed, 2924 (88.5%) responded. After exploratory factor analysis and multitrait analysis, the finalized questionnaire was composed of 24 items in the following eight dimensions: improvement orientation, passion for mission, professional growth, resource allocation prioritization, inter-sectional collaboration, responsibility and authority, teamwork, and information sharing. Construct validity and internal consistency of dimensions were confirmed with multitrait analysis and Cronbach's alpha coefficients, respectively. Multiple regression analysis showed that improvement orientation, passion for mission, resource allocation prioritization and information sharing were significantly associated with higher achievement in safety practices. Our questionnaire tool was able to distinguish features of safety culture among different subgroups. Our questionnaire demonstrated excellent validity and reliability, and revealed distinct cultural patterns among different subgroups. Quantitative assessment of organizational safety culture with this tool may further the understanding of associated characteristics of each subgroup and provide insight into organizational readiness for patient safety improvement. © 2014 John Wiley & Sons, Ltd.
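
    The internal-consistency statistic used above, Cronbach's alpha, is straightforward to compute: alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale). A minimal sketch on simulated questionnaire responses:

        import numpy as np

        rng = np.random.default_rng(3)
        latent = rng.normal(size=(200, 1))                      # 200 respondents
        items = latent + rng.normal(scale=0.8, size=(200, 6))   # 6 items, 1 dimension

        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        alpha = k / (k - 1) * (1.0 - item_vars / total_var)
        print(f"Cronbach's alpha = {alpha:.2f}")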

  4. Quantitative Microbial Community Analysis of Three Different Sulfidic Mine Tailing Dumps Generating Acid Mine Drainage

    PubMed Central

    Kock, Dagmar; Schippers, Axel

    2008-01-01

    The microbial communities of three different sulfidic and acidic mine waste tailing dumps located in Botswana, Germany, and Sweden were quantitatively analyzed using quantitative real-time PCR (Q-PCR), fluorescence in situ hybridization (FISH), catalyzed reporter deposition-FISH (CARD-FISH), Sybr green II direct counting, and the most probable number (MPN) cultivation technique. Depth profiles of cell numbers showed that the compositions of the microbial communities are greatly different at the three sites and also strongly varied between zones of oxidized and unoxidized tailings. Maximum cell numbers of up to 10(9) cells g(-1) dry weight were determined in the pyrite or pyrrhotite oxidation zones, whereas cell numbers in unoxidized tailings were significantly lower. Bacteria dominated over Archaea and Eukarya at all tailing sites. The acidophilic Fe(II)- and/or sulfur-oxidizing Acidithiobacillus spp. dominated over the acidophilic Fe(II)-oxidizing Leptospirillum spp. among the Bacteria at two sites. The two genera were equally abundant at the third site. The acidophilic Fe(II)- and sulfur-oxidizing Sulfobacillus spp. were generally less abundant. The acidophilic Fe(III)-reducing Acidiphilium spp. could be found at only one site. The neutrophilic Fe(III)-reducing Geobacteraceae as well as the dsrA gene of sulfate reducers were quantifiable at all three sites. FISH analysis provided reliable data only for tailing zones with high microbial activity, whereas CARD-FISH, Q-PCR, Sybr green II staining, and MPN were suitable methods for a quantitative microbial community analysis of tailings in general. PMID:18586975

  5. Quantitative microbial community analysis of three different sulfidic mine tailing dumps generating acid mine drainage.

    PubMed

    Kock, Dagmar; Schippers, Axel

    2008-08-01

    The microbial communities of three different sulfidic and acidic mine waste tailing dumps located in Botswana, Germany, and Sweden were quantitatively analyzed using quantitative real-time PCR (Q-PCR), fluorescence in situ hybridization (FISH), catalyzed reporter deposition-FISH (CARD-FISH), Sybr green II direct counting, and the most probable number (MPN) cultivation technique. Depth profiles of cell numbers showed that the compositions of the microbial communities are greatly different at the three sites and also strongly varied between zones of oxidized and unoxidized tailings. Maximum cell numbers of up to 10(9) cells g(-1) dry weight were determined in the pyrite or pyrrhotite oxidation zones, whereas cell numbers in unoxidized tailings were significantly lower. Bacteria dominated over Archaea and Eukarya at all tailing sites. The acidophilic Fe(II)- and/or sulfur-oxidizing Acidithiobacillus spp. dominated over the acidophilic Fe(II)-oxidizing Leptospirillum spp. among the Bacteria at two sites. The two genera were equally abundant at the third site. The acidophilic Fe(II)- and sulfur-oxidizing Sulfobacillus spp. were generally less abundant. The acidophilic Fe(III)-reducing Acidiphilium spp. could be found at only one site. The neutrophilic Fe(III)-reducing Geobacteraceae as well as the dsrA gene of sulfate reducers were quantifiable at all three sites. FISH analysis provided reliable data only for tailing zones with high microbial activity, whereas CARD-FISH, Q-PCR, Sybr green II staining, and MPN were suitable methods for a quantitative microbial community analysis of tailings in general.

  6. Fast and simultaneous determination of 12 polyphenols in apple peel and pulp by using chemometrics-assisted high-performance liquid chromatography with diode array detection.

    PubMed

    Wang, Tong; Wu, Hai-Long; Xie, Li-Xia; Zhu, Li; Liu, Zhi; Sun, Xiao-Dong; Xiao, Rong; Yu, Ru-Qin

    2017-04-01

    In this work, a smart chemometrics-enhanced strategy, high-performance liquid chromatography with diode array detection coupled with a second-order calibration method based on the alternating trilinear decomposition algorithm, was proposed to simultaneously quantify 12 polyphenols in different kinds of apple peel and pulp samples. The proposed strategy proved to be a powerful tool for solving the problems of coelution, unknown interferences, and chromatographic shifts in high-performance liquid chromatography analysis, making it possible to determine 12 polyphenols in complex apple matrices within 10 min under simple elution conditions. The average recoveries with standard deviations, and figures of merit including sensitivity, selectivity, limit of detection, and limit of quantitation, were calculated to validate the accuracy of the proposed method. Compared with the quantitative results of the classic high-performance liquid chromatography method, statistical and graphical analysis showed that our proposed strategy obtained more reliable results. All results indicated that our proposed method for the quantitative analysis of apple polyphenols was accurate, fast, universal, simple, and green, and it is expected to develop into an attractive alternative for the simultaneous determination of multitargeted analytes in complex matrices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Qualification Testing Versus Quantitative Reliability Testing of PV - Gaining Confidence in a Rapidly Changing Technology: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah; Repins, Ingrid L; Hacke, Peter L

    Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet, business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, consequence of a possible failure, variability in the manufacturing, installation, and operation, as well as uncertainty in the measured acceleration factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and prioritization of research to provide a technical basis both for the standards and the analysis related to the application of those standards. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.

  8. Estimation of reliability of predictions and model applicability domain evaluation in the analysis of acute toxicity (LD50).

    PubMed

    Sazonovas, A; Japertas, P; Didziapetris, R

    2010-01-01

    This study presents a new type of acute toxicity (LD(50)) prediction that enables automated assessment of the reliability of predictions (which is synonymous with the assessment of the Model Applicability Domain as defined by the Organization for Economic Cooperation and Development). Analysis involved nearly 75,000 compounds from six animal systems (acute rat toxicity after oral and intraperitoneal administration; acute mouse toxicity after oral, intraperitoneal, intravenous, and subcutaneous administration). Fragmental Partial Least Squares (PLS) with 100 bootstraps yielded baseline predictions that were automatically corrected for non-linear effects in local chemical spaces--a combination called Global, Adjusted Locally According to Similarity (GALAS) modelling methodology. Each prediction obtained in this manner is provided with a reliability index value that depends on both compound's similarity to the training set (that accounts for similar trends in LD(50) variations within multiple bootstraps) and consistency of experimental results with regard to the baseline model in the local chemical environment. The actual performance of the Reliability Index (RI) was proven by its good (and uniform) correlations with Root Mean Square Error (RMSE) in all validation sets, thus providing quantitative assessment of the Model Applicability Domain. The obtained models can be used for compound screening in the early stages of drug development and prioritization for experimental in vitro testing or later in vivo animal acute toxicity studies.

  9. Quantitative determination of low-Z elements in single atmospheric particles on boron substrates by automated scanning electron microscopy-energy-dispersive X-ray spectrometry.

    PubMed

    Choël, Marie; Deboudt, Karine; Osán, János; Flament, Pascal; Van Grieken, René

    2005-09-01

    Atmospheric aerosols consist of a complex heterogeneous mixture of particles. Single-particle analysis techniques are known to provide unique information on the size-resolved chemical composition of aerosols. A scanning electron microscope (SEM) combined with a thin-window energy-dispersive X-ray (EDX) detector enables the morphological and elemental analysis of single particles down to 0.1 microm with a detection limit of 1-10 wt %, low-Z elements included. To obtain data statistically representative of the air masses sampled, a computer-controlled procedure can be implemented in order to run hundreds of single-particle analyses (typically 1000-2000) automatically in a relatively short period of time (generally 4-8 h, depending on the setup and on the particle loading). However, automated particle analysis by SEM-EDX raises two practical challenges: the accuracy of the particle recognition and the reliability of the quantitative analysis, especially for micrometer-sized particles with low atomic number contents. Since low-Z analysis is hampered by the use of traditional polycarbonate membranes, an alternate choice of substrate is a prerequisite. In this work, boron is being studied as a promising material for particle microanalysis. As EDX is generally said to probe a volume of approximately 1 microm3, geometry effects arise from the finite size of microparticles. These particle geometry effects must be corrected by means of a robust concentration calculation procedure. Conventional quantitative methods developed for bulk samples generate elemental concentrations considerably in error when applied to microparticles. A new methodology for particle microanalysis, combining the use of boron as the substrate material and a reverse Monte Carlo quantitative program, was tested on standard particles ranging from 0.25 to 10 microm. We demonstrate that the quantitative determination of low-Z elements in microparticles is achievable and that highly accurate results can be obtained using the automatic data processing described here compared to conventional methods.

  10. Comparison among Reconstruction Algorithms for Quantitative Analysis of 11C-Acetate Cardiac PET Imaging.

    PubMed

    Shi, Ximin; Li, Nan; Ding, Haiyan; Dang, Yonghong; Hu, Guilan; Liu, Shuai; Cui, Jie; Zhang, Yue; Li, Fang; Zhang, Hui; Huo, Li

    2018-01-01

    Kinetic modeling of dynamic 11C-acetate PET imaging provides quantitative information for myocardium assessment. The quality and quantitation of PET images are known to be dependent on PET reconstruction methods. This study aims to investigate the impacts of reconstruction algorithms on the quantitative analysis of dynamic 11C-acetate cardiac PET imaging. Suspected alcoholic cardiomyopathy patients (N = 24) underwent 11C-acetate dynamic PET imaging after a low-dose CT scan. PET images were reconstructed using four algorithms: filtered backprojection (FBP), ordered subsets expectation maximization (OSEM), OSEM with time-of-flight (TOF), and OSEM with both time-of-flight and point-spread-function (TPSF). Standardized uptake values (SUVs) at different time points were compared among images reconstructed using the four algorithms. Time-activity curves (TACs) in myocardium and blood pools of ventricles were generated from the dynamic image series. Kinetic parameters K1 and k2 were derived using a 1-tissue-compartment model for kinetic modeling of cardiac flow from 11C-acetate PET images. Significant image quality improvement was found in the images reconstructed using iterative OSEM-type algorithms (OSEM, TOF, and TPSF) compared with FBP. However, no statistical differences in SUVs were observed among the four reconstruction methods at the selected time points. Kinetic parameters K1 and k2 also exhibited no statistical difference among the four reconstruction algorithms in terms of mean value and standard deviation. However, in the correlation analysis, OSEM reconstruction presented relatively higher residuals in correlation with FBP reconstruction compared with TOF and TPSF reconstruction, and TOF and TPSF reconstruction were highly correlated with each other. All the tested reconstruction algorithms performed similarly for quantitative analysis of 11C-acetate cardiac PET imaging. TOF and TPSF yielded highly consistent kinetic parameter results with superior image quality compared with FBP. OSEM was relatively less reliable. Both TOF and TPSF are recommended for cardiac 11C-acetate kinetic analysis.
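
    The 1-tissue-compartment model named above has a closed form: dCt/dt = K1*Cp - k2*Ct, so tissue activity is K1 times the arterial input function convolved with exp(-k2*t). A toy sketch with an invented input function and rate constants, not the study's data:

        import numpy as np

        dt = 1.0                                  # s
        t = np.arange(0, 600, dt)
        Cp = np.exp(-t / 60.0) * (t / 30.0)       # toy arterial input function
        K1, k2 = 0.6, 0.02                        # toy rate constants

        # Ct = K1 * (Cp convolved with exp(-k2*t)); dt scales the discrete sum
        tissue = K1 * np.convolve(Cp, np.exp(-k2 * t))[: t.size] * dt
        print(f"peak tissue activity: {tissue.max():.3f} (arb. units)")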

  11. Quantitative analysis of in situ optical diagnostics for inferring particle/aggregate parameters in flames: Implications for soot surface growth and total emissivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koeylue, U.O.

    1997-05-01

    An in situ particulate diagnostic/analysis technique is outlined based on the Rayleigh-Debye-Gans polydisperse fractal aggregate (RDG/PFA) scattering interpretation of absolute angular light scattering and extinction measurements. Using proper particle refractive index, the proposed data analysis method can quantitatively yield all aggregate parameters (particle volume fraction, f_v, fractal dimension, D_f, primary particle diameter, d_p, particle number density, n_p, and aggregate size distribution, pdf(N)) without any prior knowledge about the particle-laden environment. The present optical diagnostic/interpretation technique was applied to two different soot-containing laminar and turbulent ethylene/air nonpremixed flames in order to assess its reliability. The aggregate interpretation of optical measurements yielded D_f, d_p, and pdf(N) that are in excellent agreement with ex situ thermophoretic sampling/transmission electron microscope (TS/TEM) observations within experimental uncertainties. However, volume-equivalent single particle models (Rayleigh/Mie) overestimated d_p by about a factor of 3, causing an order of magnitude underestimation in n_p. Consequently, soot surface areas and growth rates were in error by a factor of 3, emphasizing that aggregation effects need to be taken into account when using optical diagnostics for a reliable understanding of the soot formation/evolution mechanism in flames. The results also indicated that total soot emissivities were generally underestimated using Rayleigh analysis (up to 50%), mainly due to the uncertainties in soot refractive indices at infrared wavelengths. This suggests that aggregate considerations may not be essential for reasonable radiation heat transfer predictions from luminous flames because of fortuitous error cancellation, resulting in typically a 10 to 30% net effect.

  12. Development and psychometric properties of a new social support scale for self-care in middle-aged patients with type II diabetes (S4-MAD)

    PubMed Central

    2012-01-01

    Background: Social support has proved to be one of the most effective factors on the success of diabetic self-care. This study aimed to develop a scale for evaluating social support for self-care in middle-aged patients (30-60 years old) with type II diabetes. Methods: This was a two-phase qualitative and quantitative study. The study was conducted during 2009 to 2011 in Tehran, Iran. In the qualitative part, a sample of diabetic patients participated in four focus group discussions in order to develop a preliminary item pool. Consequently, content and face validity were performed to provide a pre-final version of the questionnaire. Then, in a quantitative study, reliability (internal consistency and test-retest analysis), validity and factor analysis (both exploratory and confirmatory) were performed to assess psychometric properties of the scale. Results: A 38-item questionnaire was developed through the qualitative phase. It was reduced to 33 items after content validity. Exploratory factor analysis loaded 30 items with a five-factor solution (nutrition, physical activity, self-monitoring of blood glucose, foot care and smoking) that jointly accounted for 72.3% of observed variance. The confirmatory factor analysis indicated a good fit to the data. The Cronbach's alpha coefficient showed excellent internal consistency (alpha=0.94), and test-retest of the scale with 2-week intervals indicated an appropriate stability for the scale (ICC=0.87). Conclusion: The findings showed that the designed questionnaire was a valid and reliable instrument for measuring social support for self-care in middle-aged patients with type II diabetes. It is an easy-to-use questionnaire and contains the most significant diabetes-related behaviors that need continuous support for self-care. PMID:23190685

  13. Development of a multidimensional labour satisfaction questionnaire: dimensions, validity, and internal reliability

    PubMed Central

    Smith, L

    2001-01-01

    Background: No published quantitative instrument exists to measure maternal satisfaction with the quality of different models of labour care in the UK. Methods: A quantitative psychometric multidimensional maternal satisfaction questionnaire, the Women's Views of Birth Labour Satisfaction Questionnaire (WOMBLSQ), was developed using principal components analysis with varimax rotation of successive versions. Internal reliability and content and construct validity were assessed. Results: Of 300 women sent the first version (WOMBLSQ1), 120 (40%) replied; of 300 sent WOMBLSQ2, 188 (62.7%) replied; of 500 women sent WOMBLSQ3, 319 (63.8%) replied; and of 2400 women sent WOMBLSQ4, 1683 (70.1%) replied. The latter two versions consisted of 10 dimensions in addition to general satisfaction. These were (Cronbach's alpha): professional support in labour (0.91), expectations of labour (0.90), home assessment in early labour (0.90), holding the baby (0.87), support from husband/partner (0.83), pain relief in labour (0.83), pain relief immediately after labour (0.65), knowing labour carers (0.82), labour environment (0.80), and control in labour (0.62). There were moderate correlations (range 0.16-0.73) between individual dimensions and the general satisfaction scale (0.75). Scores on individual dimensions were significantly related to a range of clinical and demographic variables. Conclusion: This multidimensional labour satisfaction instrument has good validity and internal reliability. It could be used to assess care in labour across different models of maternity care, or as a prelude to in-depth exploration of specific areas of concern. Its external reliability and transferability to care outside the South West region need further evaluation, particularly in terms of ethnicity and social class. Key Words: Women's Views of Birth Labour Satisfaction Questionnaire (WOMBLSQ); labour; questionnaire PMID:11239139

  14. Reliability Analysis of a Glacier Lake Warning System Using a Bayesian Net

    NASA Astrophysics Data System (ADS)

    Sturny, Rouven A.; Bründl, Michael

    2013-04-01

    Besides structural mitigation measures like avalanche defense structures, dams and galleries, warning and alarm systems have become important measures for dealing with Alpine natural hazards. Integrating them into risk mitigation strategies and comparing their effectiveness with structural measures requires quantification of the reliability of these systems. However, little is known about how the reliability of warning systems can be quantified and which methods are suitable for comparing their contribution to risk reduction with that of structural mitigation measures. We present a reliability analysis of a warning system located in Grindelwald, Switzerland. The warning system was built for warning and protecting residents and tourists from glacier outburst floods occurring as a consequence of a rapid drain of the glacier lake. We set up a Bayesian Net (BN) that allowed for a qualitative and quantitative reliability analysis. The Conditional Probability Tables (CPTs) of the BN were determined according to manufacturer's reliability data for each component of the system, as well as by assigning weights to specific BN nodes accounting for information flows and decision-making processes of the local safety service. The presented results focus on the two alerting units 'visual acoustic signal' (VAS) and 'alerting of the intervention entities' (AIE). For the summer of 2009, the reliability was determined to be 94% for the VAS and 83% for the AIE. The probability of occurrence of a major event was calculated as 0.55% per day, resulting in an overall reliability of 99.967% for the VAS and 99.906% for the AIE. We concluded that a failure of the VAS alerting unit would be the consequence of a simultaneous failure of the four probes located in the lake and the gorge. Similarly, we deduced that the AIE would fail either if there were a simultaneous connectivity loss of the mobile and fixed networks in Grindelwald, an Internet access loss, or a failure of the regional operations centre. However, the probability of a common failure of these components was assumed to be low. Overall, it can be stated that due to numerous redundancies the investigated warning system is highly reliable and its influence on risk reduction is very high. Comparable studies are needed in the future to put these results into context and to gain more experience in how the reliability of warning systems can be determined in practice.
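
    The redundancy argument above follows from elementary parallel-system reliability: a unit served by n independent channels fails only if all channels fail, so R = 1 - prod(1 - r_i). A back-of-envelope sketch with invented channel reliabilities, not the Grindelwald system's data:

        def redundant_reliability(channel_reliabilities):
            """Reliability of a unit that works if any one channel works."""
            p_all_fail = 1.0
            for r in channel_reliabilities:
                p_all_fail *= (1.0 - r)
            return 1.0 - p_all_fail

        # e.g. four independent probes feeding one alerting unit
        print(f"R = {redundant_reliability([0.94, 0.94, 0.94, 0.94]):.6f}")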

  15. Establishment of quality assurance for respiratory-gated radiotherapy using a respiration-simulating phantom and gamma index: Evaluation of accuracy taking into account tumor motion and respiratory cycle

    NASA Astrophysics Data System (ADS)

    Lee, Jae-Seung; Im, In-Chul; Kang, Su-Man; Goo, Eun-Hoe; Baek, Seong-Min

    2013-11-01

    The purpose of this study is to present a new method of quality assurance (QA) in order to ensure effective evaluation of the accuracy of respiratory-gated radiotherapy (RGR). This would help in quantitatively analyzing the patient's respiratory cycle and respiration-induced tumor motion and in performing a subsequent comparative analysis of dose distributions, using the gamma-index method, as reproduced in our in-house developed respiration-simulating phantom. Therefore, we designed a respiration-simulating phantom capable of reproducing the patient's respiratory cycle and respiration-induced tumor motion and evaluated the accuracy of RGR by estimating its pass rates. We applied the gamma index passing criteria of accepted error ranges of 3% and 3 mm for the dose distribution calculated by using the treatment planning system (TPS) and the actual dose distribution of RGR. The pass rate clearly increased inversely to the gating width chosen. When respiration-induced tumor motion was 12 mm or less, pass rates of 85% and above were achieved for the 30-70% respiratory phase, and pass rates of 90% and above were achieved for the 40-60% respiratory phase. However, a respiratory cycle with a very small fluctuation range of pass rates failed to prove reliable in evaluating the accuracy of RGR. Therefore, accurate and reliable outcomes of radiotherapy will be obtainable only by establishing a novel QA system using the respiration-simulating phantom, the gamma-index analysis, and a quantitative analysis of diaphragmatic motion, enabling an indirect measurement of tumor motion.
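
    For reference, a simplified one-dimensional version of the gamma-index test (3%/3 mm) applied above, run on synthetic dose profiles; clinical implementations operate on 2D/3D dose grids, so this is only an illustrative sketch.

        import numpy as np

        def gamma_pass_rate(x, d_ref, d_eval, dose_crit=0.03, dta_mm=3.0):
            """Fraction of reference points with gamma <= 1 (global, 1D)."""
            d_max = d_ref.max()
            pass_flags = []
            for xi, di in zip(x, d_ref):
                gamma2 = ((x - xi) / dta_mm) ** 2 \
                       + ((d_eval - di) / (dose_crit * d_max)) ** 2
                pass_flags.append(gamma2.min() <= 1.0)
            return np.mean(pass_flags)

        x = np.linspace(-50, 50, 201)                        # mm
        d_ref = np.exp(-(x / 20.0) ** 2)                     # planned profile
        d_eval = np.exp(-((x - 1.5) / 20.0) ** 2) * 1.02     # shifted, scaled delivery
        print(f"pass rate: {gamma_pass_rate(x, d_ref, d_eval):.1%}")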

  16. The Focinator v2-0 - Graphical Interface, Four Channels, Colocalization Analysis and Cell Phase Identification.

    PubMed

    Oeck, Sebastian; Malewicz, Nathalie M; Hurst, Sebastian; Al-Refae, Klaudia; Krysztofiak, Adam; Jendrossek, Verena

    2017-07-01

    The quantitative analysis of foci plays an important role in various cell biological methods. In the fields of radiation biology and experimental oncology, the effect of ionizing radiation, chemotherapy or molecularly targeted drugs on DNA damage induction and repair is frequently assessed by the analysis of protein clusters or phosphorylated proteins recruited to so-called repair foci at DNA damage sites, involving for example γ-H2A.X, 53BP1 or RAD51. We recently developed "The Focinator" as a reliable and fast tool for automated quantitative and qualitative analysis of nuclei and DNA damage foci. The refined software is now even more user-friendly due to a graphical interface and further features. Thus, we included an R-script-based mode for automated image opening, file naming, progress monitoring and error reporting. Consequently, the evaluation no longer requires the attendance of the operator after initial parameter definition. Moreover, the Focinator v2-0 is now able to perform multi-channel analysis of four channels and evaluation of protein-protein colocalization by comparison of up to three foci channels. This enables, for example, the quantification of foci in cells of a specific cell cycle phase.

  17. Quantitative analysis of cardiovascular MR images.

    PubMed

    van der Geest, R J; de Roos, A; van der Wall, E E; Reiber, J H

    1997-06-01

    The diagnosis of cardiovascular disease requires the precise assessment of both morphology and function. Nearly all aspects of cardiovascular function and flow can be quantified nowadays with fast magnetic resonance (MR) imaging techniques. Conventional and breath-hold cine MR imaging allow the precise and highly reproducible assessment of global and regional left ventricular function. During the same examination, velocity encoded cine (VEC) MR imaging provides measurements of blood flow in the heart and great vessels. Quantitative image analysis often still relies on manual tracing of contours in the images. Reliable automated or semi-automated image analysis software would be very helpful to overcome the limitations associated with the manual and tedious processing of the images. Recent progress in MR imaging of the coronary arteries and myocardial perfusion imaging with contrast media, along with the further development of faster imaging sequences, suggest that MR imaging could evolve into a single technique ('one stop shop') for the evaluation of many aspects of heart disease. As a result, it is very likely that the need for automated image segmentation and analysis software algorithms will further increase. In this paper the developments directed towards the automated image analysis and semi-automated contour detection for cardiovascular MR imaging are presented.
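
    As a hedged illustration of the global-function quantification discussed above: left ventricular volumes from traced short-axis contour areas (Simpson-style summation), from which stroke volume and ejection fraction follow. Contour areas below are invented.

        def lv_volume_ml(slice_areas_mm2, slice_thickness_mm):
            """Simpson-style volume: sum of (area x thickness) over slices."""
            return sum(a * slice_thickness_mm for a in slice_areas_mm2) / 1000.0

        edv = lv_volume_ml([2100, 2400, 2500, 2300, 1900, 1200], 10.0)  # end-diastole
        esv = lv_volume_ml([1300, 1500, 1600, 1400, 1000, 600], 10.0)   # end-systole
        sv = edv - esv
        print(f"EDV={edv:.0f} ml, ESV={esv:.0f} ml, SV={sv:.0f} ml, EF={sv/edv:.0%}")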

  18. Diagnostics based on nucleic acid sequence variant profiling: PCR, hybridization, and NGS approaches.

    PubMed

    Khodakov, Dmitriy; Wang, Chunyan; Zhang, David Yu

    2016-10-01

    Nucleic acid sequence variations have been implicated in many diseases, and reliable detection and quantitation of DNA/RNA biomarkers can inform effective therapeutic action, enabling precision medicine. Nucleic acid analysis technologies being translated into the clinic can broadly be classified into hybridization, PCR, and sequencing, as well as their combinations. Here we review the molecular mechanisms of popular commercial assays, and their progress in translation into in vitro diagnostics. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  19. Quantitative Determination of Citric and Ascorbic Acid in Powdered Drink Mixes

    ERIC Educational Resources Information Center

    Sigmann, Samuella B.; Wheeler, Dale E.

    2004-01-01

    A procedure is described in which these reactions are used to quantitatively determine the total acid, total ascorbic acid, and citric acid content of a given sample of powdered drink mix. A safe, reliable, and low-cost quantitative method for analyzing a consumer product for acid content is provided.
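
    The arithmetic behind such a determination can be sketched as follows; all volumes, concentrations, and the assumption that ascorbic acid is quantified by a separate 1:1 redox titration are illustrative, not taken from the article:

        # Hypothetical worked example (all numbers invented for illustration).
        M_NAOH, V_NAOH = 0.100, 0.01530      # mol/L and L: acid-base titration
        total_acid_eq = M_NAOH * V_NAOH      # total moles of acidic protons

        M_I2, V_I2 = 0.0050, 0.00820         # mol/L and L: iodine titration
        n_ascorbic = M_I2 * V_I2             # ascorbic acid oxidized 1:1

        # Assume ascorbic acid contributes one acidic proton at this endpoint;
        # citric acid is triprotic, so the remaining equivalents are divided by 3.
        n_citric = (total_acid_eq - n_ascorbic) / 3
        print(f"ascorbic acid: {n_ascorbic * 176.12 * 1000:.1f} mg")
        print(f"citric acid:   {n_citric * 192.12 * 1000:.1f} mg")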

  20. Quantitative spectroscopy for the analysis of GOME data

    NASA Technical Reports Server (NTRS)

    Chance, K.

    1997-01-01

    Accurate analysis of the global ozone monitoring experiment (GOME) data to obtain atmospheric constituents requires reliable, traceable spectroscopic parameters for atmospheric absorption and scattering. Results are summarized for research that includes: the re-determination of Rayleigh scattering cross sections and phase functions for the 200 nm to 1000 nm range; the analysis of solar spectra to obtain a high-resolution reference spectrum with excellent absolute vacuum wavelength calibration; Ring effect cross sections and phase functions determined directly from accurate molecular parameters of N2 and O2; O2 A band line intensities and pressure broadening coefficients; and the analysis of absolute accuracies for ultraviolet and visible absorption cross sections of O3 and other trace species measurable by GOME.
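
    For reference, re-determinations of the Rayleigh scattering cross section of this kind start from the standard form (with n the refractive index of air, N the molecular number density, and F_K the King correction factor for depolarization); the exact dispersion relations used in the study are not reproduced here:

        \sigma(\lambda) = \frac{24\pi^{3}}{\lambda^{4} N^{2}}
                          \left( \frac{n^{2}-1}{n^{2}+2} \right)^{2} F_{K}(\lambda)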

  1. Quantitative analysis of fatty-acid-based biofuels produced by wild-type and genetically engineered cyanobacteria by gas chromatography-mass spectrometry.

    PubMed

    Guan, Wenna; Zhao, Hui; Lu, Xuefeng; Wang, Cong; Yang, Menglong; Bai, Fali

    2011-11-11

    Simple and rapid quantitative determination of fatty-acid-based biofuels is of great importance for studying the progress of genetic engineering for biofuel production by microalgae. Ideal biofuels produced from biological systems should be chemically similar to petroleum, that is, fatty-acid-based molecules including free fatty acids, fatty acid methyl esters, fatty acid ethyl esters, fatty alcohols, and fatty alkanes. This study established a gas chromatography-mass spectrometry (GC-MS) method for the simultaneous quantification of seven free fatty acids, nine fatty acid methyl esters, five fatty acid ethyl esters, five fatty alcohols, and three fatty alkanes produced by wild-type Synechocystis PCC 6803 and its genetically engineered strain. Data obtained from the GC-MS analyses were quantified using internal standard peak area comparisons. The linearity, limit of detection (LOD), and precision (RSD) of the method were evaluated. The results demonstrated that fatty-acid-based biofuels can be determined directly by GC-MS without derivatization. Therefore, rapid and reliable quantitative analysis of fatty-acid-based biofuels produced by wild-type and genetically engineered cyanobacteria can be achieved using the GC-MS method established in this work. Copyright © 2011 Elsevier B.V. All rights reserved.
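
    Internal-standard peak-area quantification of the kind described reduces to a simple ratio; a minimal sketch (function and variable names are ours, and a pre-calibrated relative response factor is assumed):

        def amount_from_internal_standard(area_analyte, area_is, mass_is, rrf):
            # rrf: relative response factor, determined beforehand from
            # standards as (A_analyte / m_analyte) / (A_IS / m_IS).
            # Assumes a linear detector response over the working range.
            return (area_analyte / area_is) * mass_is / rrf

        # e.g., amount_from_internal_standard(1.8e6, 2.4e6, 10.0, 0.92)
        # -> ~8.2 (same mass units as the spiked internal standard)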

  2. Quantitative surface topography assessment of directly compressed and roller compacted tablet cores using photometric stereo image analysis.

    PubMed

    Allesø, Morten; Holm, Per; Carstensen, Jens Michael; Holm, René

    2016-05-25

    Surface topography, in the context of surface smoothness/roughness, was investigated using an image analysis technique, MultiRay™, related to photometric stereo, on different tablet batches manufactured either by direct compression or by roller compaction. In the present study, oblique illumination of the tablet (darkfield) was used, and the area of cracks and pores in the surface served as the measure of tablet surface topography: the higher the value, the rougher the surface. The investigations demonstrated the high precision of the proposed technique, which was able to measure the surface topography of the produced tablets rapidly (within milliseconds) and quantitatively. Compaction history, in the form of applied roll force and tablet punch pressure, was also reflected in the measured smoothness of the tablet surfaces. Generally, it was found that a higher degree of plastic deformation of the microcrystalline cellulose resulted in a smoother tablet surface. Altogether, this demonstrated that the technique provides the pharmaceutical developer with a reliable, quantitative response parameter for the visual appearance of solid dosage forms, which may be used for process and, ultimately, product optimization. Copyright © 2015 Elsevier B.V. All rights reserved.
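
    As a sketch of the darkfield area measure described (a fixed global threshold is our simplifying assumption; MultiRay™'s actual processing is proprietary):

        import numpy as np

        def crack_area_fraction(darkfield_img, threshold):
            # Under oblique illumination, cracks and pores scatter light toward
            # the camera, so bright pixels mark surface defects (assumption).
            defects = np.asarray(darkfield_img) > threshold
            return defects.mean()   # higher value -> rougher surface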

  3. Next-generation sequencing coupled with a cell-free display technology for high-throughput production of reliable interactome data

    PubMed Central

    Fujimori, Shigeo; Hirai, Naoya; Ohashi, Hiroyuki; Masuoka, Kazuyo; Nishikimi, Akihiko; Fukui, Yoshinori; Washio, Takanori; Oshikubo, Tomohiro; Yamashita, Tatsuhiro; Miyamoto-Sato, Etsuko

    2012-01-01

    Next-generation sequencing (NGS) has been applied to various kinds of omics studies, resulting in many biological and medical discoveries. However, high-throughput protein-protein interactome datasets derived from detection by sequencing are scarce, because protein-protein interaction analysis requires many cell manipulations to examine the interactions. The low reliability of the high-throughput data is also a problem. Here, we describe a cell-free display technology combined with NGS that can improve both the coverage and reliability of interactome datasets. The completely cell-free method gives a high-throughput and a large detection space, testing the interactions without using clones. The quantitative information provided by NGS reduces the number of false positives. The method is suitable for the in vitro detection of proteins that interact not only with the bait protein, but also with DNA, RNA and chemical compounds. Thus, it could become a universal approach for exploring the large space of protein sequences and interactome networks. PMID:23056904

  4. Quantitative PCR for Genetic Markers of Human Fecal Pollution

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for quantification of two recently described human-...
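
    Quantification in such qPCR assays conventionally runs through a standard curve; a minimal sketch (the dilution-series values and function names are illustrative assumptions):

        import numpy as np

        def fit_standard_curve(log10_copies, ct_values):
            # Fit Ct = slope * log10(copies) + intercept from a dilution series.
            slope, intercept = np.polyfit(log10_copies, ct_values, 1)
            efficiency = 10.0 ** (-1.0 / slope) - 1.0   # ~1.0 means 100%
            return slope, intercept, efficiency

        def copies_from_ct(ct, slope, intercept):
            # Invert the curve to estimate marker copies in an unknown sample.
            return 10.0 ** ((ct - intercept) / slope)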

  5. A novel integrated assessment methodology of urban water reuse.

    PubMed

    Listowski, A; Ngo, H H; Guo, W S; Vigneswaran, S

    2011-01-01

    Wastewater is no longer considered a waste product, and water reuse needs to play a stronger part in securing urban water supply. Although treatment technologies for water reclamation have significantly improved, a question that deserves further analysis is how the selection of a particular wastewater treatment technology relates to performance and sustainability. The proposed assessment model integrates: (i) technology, characterised by selected quantity and quality performance parameters; (ii) productivity, efficiency and reliability criteria; (iii) quantitative performance indicators; (iv) development of an evaluation model. The challenges related to the hierarchy and selection of performance indicators have been resolved through the case study analysis. The goal of this study is to validate a new assessment methodology in relation to the performance of microfiltration (MF) technology, a key element of the treatment process. Specific performance data and measurements were obtained at dedicated Control and Data Acquisition Points (CP) to satisfy the input-output inventory in relation to water resources, products, material flows, energy requirements, chemicals use, etc. The performance assessment process links the analysis across the important parametric functions, leading to reliable outcomes and results.

  6. Development and validation of the irritable bowel syndrome scale under the system of quality of life instruments for chronic diseases QLICD-IBS: combinations of classical test theory and generalizability theory.

    PubMed

    Lei, Pingguang; Lei, Guanghe; Tian, Jianjun; Zhou, Zengfen; Zhao, Miao; Wan, Chonghua

    2014-10-01

    This paper aims to develop the irritable bowel syndrome (IBS) scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-IBS) by the modular approach and to validate it by both classical test theory and generalizability theory. The QLICD-IBS was developed on the basis of programmed decision procedures, with multiple nominal and focus group discussions, in-depth interviews, and quantitative statistical procedures. One hundred twelve inpatients with IBS provided data measuring QOL three times before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability, and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t tests, and also G studies and D studies of generalizability theory analysis. Multi-trait scaling analysis, correlation, and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. Test-retest reliability coefficients (Pearson r and intra-class correlation (ICC)) for the overall score and all domains were higher than 0.80; the internal consistency α for all domains at the two measurements was higher than 0.70, except for the social domain (0.55 and 0.67, respectively). The overall score and the scores for all domains/facets showed statistically significant changes after treatment, with moderate or higher effect sizes (standardized response mean, SRM) ranging from 0.72 to 1.02 at the domain level. G coefficients and indices of dependability (Ф coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-IBS has good validity, reliability, and responsiveness, offers several strengths, and can be used as a quality of life instrument for patients with IBS.
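
    The reliability and responsiveness statistics reported here are standard; minimal sketches of two of them, Cronbach's alpha and the standardized response mean, follow (the data matrices are assumed inputs):

        import numpy as np

        def cronbach_alpha(items):
            # items: (n_subjects, n_items) matrix of item scores.
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var_sum = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_var_sum / total_var)

        def standardized_response_mean(pre, post):
            # SRM = mean change / SD of change (a responsiveness effect size).
            change = np.asarray(post, float) - np.asarray(pre, float)
            return change.mean() / change.std(ddof=1)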

  7. New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks

    NASA Astrophysics Data System (ADS)

    Kurtz, Nolan Scot

    The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazards, e.g., seismicity, storm surge, or hurricane winds, but also how those components participate at the network level, using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people who depend on the infrastructure. Several network analyses (e.g., S-RDA, LP-bounds, and crude MCS) and performance metrics (e.g., disconnection bounds and component importance) are available for such purposes. Because these networks already exist, their state in time is also important: if networks are close to chloride sources, deterioration may be a major issue, and information from field inspections may have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies for deteriorating networks subjected to seismicity, i.e., earthquakes, have been created analytically. A bridge component model has been constructed for these methodologies; the bridge fragilities, which were constructed from data, required a deeper level of analysis because they pertain to specific structures. Furthermore, the network-level effects of chloride-induced deterioration were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems; the method handles many kinds of system and component problems with singular or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to network size: special network topologies may be more or less computationally difficult, and the resolution of the network also has large effects. To take advantage of some types of topologies, network hierarchical structures with super-link representation have been used in the literature to increase computational efficiency by analyzing smaller, densely connected networks; however, such structures were based on user input and at times subjective. To address this, the algorithms must be automated and reliable; such hierarchical structures may also reveal the structure of the network itself. The risk analysis methodology has been expanded to larger networks using these automated hierarchical structures. Component importance is the most important output of such network analysis; however, it may only indicate which bridges to inspect or repair earliest, and little else. High correlations influence component importance measures negatively, and a purely component-based view does not appropriately model regional effects. To investigate a more regional view, group importance measures based on hierarchical structures have been created; such structures may also be used to devise regional inspection/repair approaches. Using these analytical, quantitative risk approaches, the next generation of decision makers may make both component-based and region-based optimal decisions using information from both network function and the further effects of infrastructure deterioration.
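
    Of the network analyses named above, crude Monte Carlo simulation (crude-MCS) is the simplest to sketch; the following estimates a two-terminal disconnection probability for a small graph (the graph, failure probabilities, and sample count are illustrative assumptions):

        import random

        def connected(n_nodes, up_edges, source, sink):
            # Depth-first search over the surviving edges.
            adj = {i: [] for i in range(n_nodes)}
            for u, v in up_edges:
                adj[u].append(v)
                adj[v].append(u)
            seen, stack = {source}, [source]
            while stack:
                u = stack.pop()
                if u == sink:
                    return True
                for w in adj[u]:
                    if w not in seen:
                        seen.add(w)
                        stack.append(w)
            return False

        def crude_mcs(n_nodes, edges, fail_probs, source, sink, n_samples=100_000):
            # Sample component states and count disconnection events.
            failures = 0
            for _ in range(n_samples):
                up = [e for e, p in zip(edges, fail_probs) if random.random() >= p]
                failures += not connected(n_nodes, up, source, sink)
            return failures / n_samples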

  8. Development and validation of the coronary heart disease scale under the system of quality of life instruments for chronic diseases QLICD-CHD: combinations of classical test theory and Generalizability Theory.

    PubMed

    Wan, Chonghua; Li, Hezhan; Fan, Xuejin; Yang, Ruixue; Pan, Jiahua; Chen, Wenru; Zhao, Rong

    2014-06-04

    Quality of life (QOL) for patients with coronary heart disease (CHD) now attracts worldwide concern, yet disease-specific instruments are rare and none has been developed by the modular approach. This paper aims to develop the CHD scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-CHD) by the modular approach and to validate it by both classical test theory and generalizability theory. The QLICD-CHD was developed on the basis of programmed decision procedures, with multiple nominal and focus group discussions, in-depth interviews, pre-testing, and quantitative statistical procedures. One hundred forty-six inpatients with CHD provided data measuring QOL three times before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability, and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t-tests, and also G studies and D studies of generalizability theory analysis. Multi-trait scaling analysis, correlation, and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. The internal consistency α and the test-retest reliability coefficients (Pearson r and intra-class correlations, ICC) for the overall instrument and all domains were higher than 0.70 and 0.80, respectively. The overall instrument and all domains except the social domain showed statistically significant changes after treatment, with moderate effect sizes (standardized response mean, SRM) ranging from 0.32 to 0.67. G coefficients and indices of dependability (Ф coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-CHD has good validity and reliability, moderate responsiveness, and several strengths, and can be used as a quality of life instrument for patients with CHD. However, in order to obtain better reliability, the number of items in the social domain should be increased or the quality, not the quantity, of the items should be improved.

  9. SDAR 1.0: A New Quantitative Toolkit for Analyzing Stratigraphic Data

    NASA Astrophysics Data System (ADS)

    Ortiz, John; Moreno, Carlos; Cardenas, Andres; Jaramillo, Carlos

    2015-04-01

    Since the foundation of stratigraphy, geoscientists have recognized that data obtained from stratigraphic columns (SCs), two-dimensional schemes recording descriptions of both geological and paleontological features (e.g., thickness of rock packages, grain size, fossil and lithological components, and sedimentary structures), are key elements for establishing reliable hypotheses about the distribution in space and time of rock sequences, and about ancient sedimentary-environmental and paleobiological dynamics. Despite the tremendous advances in the ways geoscientists store, plot, and quantitatively analyze sedimentological and paleontological data (e.g., Macrostrat [http://www.macrostrat.org/] and the Paleobiology Database [http://www.paleodb.org/], respectively), there is still a lack of computational methodologies designed to quantitatively examine data from highly detailed SCs. Moreover, the stratigraphic information is frequently plotted "manually" using vector graphics editors (e.g., Corel Draw, Illustrator); although this information is stored in a digital format, it cannot readily be used for quantitative analysis, so any attempt to examine the stratigraphic data in an analytical fashion necessarily takes further steps. Given these issues, we have developed the software 'Stratigraphic Data Analysis in R' (SDAR), which stores in a database all sedimentological, stratigraphic, and paleontological information collected from an SC, allowing users to generate high-quality graphic plots (including one or multiple features stored in the database). SDAR also encompasses quantitative analyses that help users quantify stratigraphic information (e.g., grain size, sorting and rounding, proportion of sand/shale). Finally, given that the SDAR analysis module has been written in the open-source R graphics/statistics language [R Development Core Team, 2014], it is already loaded with many of the crucial features required to accomplish basic and complex tasks of statistical analysis (R provides more than a hundred spatial libraries that allow users to explore geostatistics and spatial analysis). Consequently, SDAR allows a deeper exploration of the stratigraphic data collected in the field and will allow the geoscientific community in the near future to develop complex analyses related to the distribution in space and time of rock sequences, such as lithofacies correlations, through multivariate comparison of empirical SCs with quantitative lithofacies models established from modern sedimentary environments.

  10. Non-invasive prenatal detection of achondroplasia using circulating fetal DNA in maternal plasma.

    PubMed

    Lim, Ji Hyae; Kim, Mee Jin; Kim, Shin Young; Kim, Hye Ok; Song, Mee Jin; Kim, Min Hyoung; Park, So Yeon; Yang, Jae Hyug; Ryu, Hyun Mee

    2011-02-01

    To perform reliable non-invasive detection of fetal achondroplasia using maternal plasma. We developed a quantitative fluorescent polymerase chain reaction (QF-PCR) method suitable for detection of the FGFR3 mutation (G1138A) causing achondroplasia. This method was applied to non-invasive detection of fetal achondroplasia using circulating fetal DNA (cf-DNA) in maternal plasma. Maternal plasma was obtained at 27 weeks of gestational age from women carrying an achondroplasia fetus or a normal fetus. Two percent or less achondroplasia DNA was reliably detected by QF-PCR. In the woman carrying a normal fetus, analysis of cf-DNA showed only one peak, that of the wild-type G allele. In the woman carrying an achondroplasia fetus, analysis of cf-DNA showed two peaks, the wild-type G allele and the mutant A allele, and accurately detected the fetal achondroplasia. The non-invasive method using maternal plasma and QF-PCR may be useful for diagnosis of fetal achondroplasia.

  11. Reliability and validity of quantifying absolute muscle hardness using ultrasound elastography.

    PubMed

    Chino, Kentaro; Akagi, Ryota; Dohi, Michiko; Fukashiro, Senshi; Takahashi, Hideyuki

    2012-01-01

    Muscle hardness is a mechanical property that represents transverse muscle stiffness. A quantitative method that uses ultrasound elastography for quantifying absolute human muscle hardness has been previously devised; however, its reliability and validity have not been completely verified. This study aimed to verify the reliability and validity of this quantitative method. The Young's moduli of seven tissue-mimicking materials (in vitro; Young's modulus range, 20-80 kPa; increments of 10 kPa) and the human medial gastrocnemius muscle (in vivo) were quantified using ultrasound elastography. On the basis of the strain/Young's modulus ratio of two reference materials, one soft and one hard (Young's moduli of 7 and 30 kPa, respectively), the Young's moduli of the tissue-mimicking materials and medial gastrocnemius muscle were calculated. The intra- and inter-investigator reliability of the method was confirmed on the basis of acceptably low coefficients of variation (≤6.9%) and substantially high intraclass correlation coefficients (≥0.77) obtained from all measurements. The correlation coefficient between the Young's moduli of the tissue-mimicking materials obtained using a mechanical method and ultrasound elastography was 0.996, which was equivalent to values previously obtained using magnetic resonance elastography. The Young's moduli of the medial gastrocnemius muscle obtained using ultrasound elastography were within the range of values previously obtained using magnetic resonance elastography. The reliability and validity of the quantitative method for measuring absolute muscle hardness using ultrasound elastography were thus verified.
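
    A minimal sketch of the two-reference calibration idea, under the simplifying assumption that the coupled materials share a common stress so that strain scales as 1/E (the published procedure may differ in detail; the strain values are hypothetical placeholders):

        def youngs_modulus(strain_target, strain_refs=(0.03, 0.007), e_refs=(7.0, 30.0)):
            # Uniform stress implies E_target = strain_ref * E_ref / strain_target.
            # The two reference materials (7 and 30 kPa) give two estimates;
            # average them for robustness.
            estimates = [s * e / strain_target for s, e in zip(strain_refs, e_refs)]
            return sum(estimates) / len(estimates)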

  12. Reliability and Validity of Quantifying Absolute Muscle Hardness Using Ultrasound Elastography

    PubMed Central

    Chino, Kentaro; Akagi, Ryota; Dohi, Michiko; Fukashiro, Senshi; Takahashi, Hideyuki

    2012-01-01

    Muscle hardness is a mechanical property that represents transverse muscle stiffness. A quantitative method that uses ultrasound elastography for quantifying absolute human muscle hardness has been previously devised; however, its reliability and validity have not been completely verified. This study aimed to verify the reliability and validity of this quantitative method. The Young’s moduli of seven tissue-mimicking materials (in vitro; Young’s modulus range, 20–80 kPa; increments of 10 kPa) and the human medial gastrocnemius muscle (in vivo) were quantified using ultrasound elastography. On the basis of the strain/Young’s modulus ratio of two reference materials, one soft and one hard (Young’s moduli of 7 and 30 kPa, respectively), the Young’s moduli of the tissue-mimicking materials and medial gastrocnemius muscle were calculated. The intra- and inter-investigator reliability of the method was confirmed on the basis of acceptably low coefficients of variation (≤6.9%) and substantially high intraclass correlation coefficients (≥0.77) obtained from all measurements. The correlation coefficient between the Young’s moduli of the tissue-mimicking materials obtained using a mechanical method and ultrasound elastography was 0.996, which was equivalent to values previously obtained using magnetic resonance elastography. The Young’s moduli of the medial gastrocnemius muscle obtained using ultrasound elastography were within the range of values previously obtained using magnetic resonance elastography. The reliability and validity of the quantitative method for measuring absolute muscle hardness using ultrasound elastography were thus verified. PMID:23029231

  13. Midwifery education and technology enhanced learning: Evaluating online story telling in preregistration midwifery education.

    PubMed

    Scamell, Mandie; Hanley, Thomas

    2018-03-01

    A major issue regarding the implementation of blended learning for preregistration health programmes is the analysis of students' perceptions of and attitudes towards their learning. It is the extent of the embedding of Technology Enhanced Learning (TEL) in the higher education curriculum that makes this analysis so vital. This paper reports the quantitative results of a UK-based study set up to respond to the apparent disconnect between technology enhanced education provision and reliable student evaluation of this mode of learning. Employing a mixed methods research design, the research described here was carried out to develop a reliable and valid evaluation tool to measure acceptability of and satisfaction with a blended learning approach, specifically designed for a preregistration midwifery module offered at level 4. Feasibility testing of 46 completed blended learning evaluation questionnaires - the Student Midwife Evaluation of Online Learning Effectiveness (SMEOLE) - used descriptive statistics and reliability and internal consistency tests. Standard deviations and mean scores all followed the predicted pattern. Results from the reliability and internal consistency testing confirm the feasibility of SMEOLE as an effective tool for measuring student satisfaction with a blended learning approach to preregistration learning. The analysis presented in this paper suggests that we have been successful in our aim of producing an evaluation tool capable of assessing the quality of technology enhanced, university-level learning in midwifery. This work can provide a future benchmark against which blended learning curriculum planning in midwifery, and other health disciplines, could be structured and evaluated. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Initial description of a quantitative, cross-species (chimpanzee-human) social responsiveness measure

    PubMed Central

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve; Constantino, John; Povinelli, Daniel; Pruett, John R.

    2011-01-01

    Objective Comparative studies of social responsiveness, an ability that is impaired in autistic spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species (human-chimpanzee) social responsiveness measure. Method We translated the Social Responsiveness Scale (SRS), an instrument that quantifies human social responsiveness, into an analogous instrument for chimpanzees. We then retranslated this "Chimp SRS" into a human "Cross-Species SRS" (XSRS). We evaluated three groups of chimpanzees (n=29) with the Chimp SRS and typical and autistic spectrum disorder (ASD) human children (n=20) with the XSRS. Results The Chimp SRS demonstrated strong inter-rater reliability at the three sites (ranges for individual ICCs: .534–.866 and mean ICCs: .851–.970). As has been observed in humans, exploratory principal components analysis of Chimp SRS scores supports a single factor underlying chimpanzee social responsiveness. Human subjects' XSRS scores were fully concordant with their SRS scores (r=.976, p=.001) and distinguished appropriately between typical and ASD subjects. One chimpanzee known for inappropriate social behavior displayed a significantly higher score than all other chimpanzees at its site, demonstrating the scale's ability to detect impaired social responsiveness in chimpanzees. Conclusion Our initial cross-species social responsiveness scale proved reliable and discriminated differences in social responsiveness across (in a relative sense) and within (in a more objectively quantifiable manner) humans and chimpanzees. PMID:21515200

  15. Development of an Electromechanical Grade to Assess Human Knee Articular Cartilage Quality.

    PubMed

    Sim, Sotcheadt; Hadjab, Insaf; Garon, Martin; Quenneville, Eric; Lavigne, Patrick; Buschmann, Michael D

    2017-10-01

    Quantitative assessments of articular cartilage function are needed to aid clinical decision making. Our objectives were to develop a new electromechanical grade to quantitatively assess cartilage quality and to test its reliability. Electromechanical properties were measured using a hand-held electromechanical probe on 200 human articular surfaces from cadaveric donors and osteoarthritic patients. These data were used to create a reference electromechanical property database and to compare with visual arthroscopic International Cartilage Repair Society (ICRS) grading of cartilage degradation. The effect of patient-specific and location-specific characteristics on electromechanical properties was investigated to construct a continuous, quantitative electromechanical grade analogous to the ICRS grade. The reliability of this novel grade was assessed by comparing it with ICRS grades on 37 human articular surfaces. Electromechanical properties were not affected by patient-specific characteristics for each ICRS grade, but were significantly different across the articular surface. Electromechanical properties varied linearly with ICRS grade, leading to a simple linear transformation from one scale to the other. The electromechanical grade correlated strongly with the ICRS grade (r = 0.92, p < 0.0001). Additionally, the electromechanical grade detected lesions that were not found visually. This novel grade can assist the surgeon in assessing human knee cartilage by providing a quantitative and reliable grading system.

  16. Absolute Quantification of Middle- to High-Abundant Plasma Proteins via Targeted Proteomics.

    PubMed

    Dittrich, Julia; Ceglarek, Uta

    2017-01-01

    The increasing number of peptide and protein biomarker candidates requires expeditious and reliable quantification strategies. The utilization of liquid chromatography coupled to quadrupole tandem mass spectrometry (LC-MS/MS) for the absolute quantitation of plasma proteins and peptides facilitates the multiplexed verification of tens to hundreds of biomarkers from smallest sample quantities. Targeted proteomics assays derived from bottom-up proteomics principles rely on the identification and analysis of proteotypic peptides formed in an enzymatic digestion of the target protein. This protocol proposes a procedure for the establishment of a targeted absolute quantitation method for middle- to high-abundant plasma proteins waiving depletion or enrichment steps. Essential topics as proteotypic peptide identification and LC-MS/MS method development as well as sample preparation and calibration strategies are described in detail.

  17. Freddie Mercury-acoustic analysis of speaking fundamental frequency, vibrato, and subharmonics.

    PubMed

    Herbst, Christian T; Hertegard, Stellan; Zangger-Borch, Daniel; Lindestad, Per-Åke

    2017-04-01

    Freddie Mercury was one of the twentieth century's best-known singers of commercial contemporary music. This study presents an acoustical analysis of his voice production and singing style, based on perceptual and quantitative analysis of publicly available sound recordings. Analysis of six interviews revealed a median speaking fundamental frequency of 117.3 Hz, which is typically found for a baritone voice. Analysis of voice tracks isolated from full band recordings suggested that the singing voice range was 37 semitones within the pitch range of F#2 (about 92.2 Hz) to G5 (about 784 Hz). Evidence for higher phonations up to a fundamental frequency of 1,347 Hz was not deemed reliable. Analysis of 240 sustained notes from 21 a-cappella recordings revealed a surprisingly high mean fundamental frequency modulation rate (vibrato) of 7.0 Hz, reaching the range of vocal tremor. Quantitative analysis utilizing a newly introduced parameter to assess the regularity of vocal vibrato corroborated its perceptually irregular nature, suggesting that vibrato (ir)regularity is a distinctive feature of the singing voice. Imitation of subharmonic phonation samples by a professional rock singer, documented by endoscopic high-speed video at 4,132 frames per second, revealed a 3:1 frequency locked vibratory pattern of vocal folds and ventricular folds.
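
    The vibrato rate reported here is the dominant modulation frequency of the F0 contour of a sustained note; a minimal estimation sketch (the F0 track and frame rate are assumed inputs from a separate pitch tracker, and the 3-12 Hz search band is our assumption):

        import numpy as np

        def vibrato_rate(f0_track, frame_rate):
            # Remove the carrier pitch, window the contour, and locate the
            # strongest modulation component in the 3-12 Hz band.
            f0 = np.asarray(f0_track, float)
            f0 = f0 - f0.mean()
            spectrum = np.abs(np.fft.rfft(f0 * np.hanning(f0.size)))
            freqs = np.fft.rfftfreq(f0.size, d=1.0 / frame_rate)
            band = (freqs >= 3.0) & (freqs <= 12.0)
            return freqs[band][np.argmax(spectrum[band])]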

  18. Misinformation on vaccination: A quantitative analysis of YouTube videos.

    PubMed

    Donzelli, Gabriele; Palomba, Giacomo; Federigi, Ileana; Aquino, Francesco; Cioni, Lorenzo; Verani, Marco; Carducci, Annalaura; Lopalco, Pierluigi

    2018-03-19

    In Italy, the phenomenon of vaccine hesitancy has increased over time and represents a complex problem that requires continuous monitoring. Misinformation in the media and on social media seems to be one of the determinants of vaccine hesitancy since, for instance, 42.8 percent of Italian citizens used the internet to obtain vaccine information in 2016. This article reports a quantitative analysis of 560 YouTube videos related to the link between vaccines and autism or other serious side effects in children. The analysis revealed that most of the videos were negative in tone and that the annual number of uploaded videos increased over the period considered, which runs from 27 December 2007 to 31 July 2017, with a peak of 224 videos in the first seven months of 2017. These findings suggest that public institutions should be more engaged in establishing a web presence in order to provide reliable information, answers, stories, and videos that respond to the public's questions about vaccination. These actions could help citizens make informed decisions about vaccines and comply with vaccination regulations.

  19. Combining FT-IR spectroscopy and multivariate analysis for qualitative and quantitative analysis of the cell wall composition changes during apples development.

    PubMed

    Szymanska-Chargot, M; Chylinska, M; Kruk, B; Zdunek, A

    2015-01-22

    The aim of this work was to quantitatively and qualitatively determine the composition of the cell wall material from apples during development by means of Fourier transform infrared (FT-IR) spectroscopy. The FT-IR region of 1500-800 cm(-1), containing characteristic bands for galacturonic acid, hemicellulose and cellulose, was examined using principal component analysis (PCA), k-means clustering and partial least squares (PLS). The samples were differentiated by development stage and cultivar using PCA and k-means clustering. PLS calibration models for galacturonic acid, hemicellulose and cellulose content from FT-IR spectra were developed and validated against the reference data. The PLS models were tested using the root-mean-square errors of cross-validation for the contents of galacturonic acid, hemicellulose and cellulose, which were 8.30 mg/g, 4.08% and 1.74%, respectively. It was shown that FT-IR spectroscopy combined with chemometric methods has potential for fast and reliable determination of the main constituents of fruit cell walls. Copyright © 2014 Elsevier Ltd. All rights reserved.
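
    A PLS calibration validated by cross-validated root-mean-square error, as described, can be sketched with scikit-learn (the spectra matrix, reference values, component count, and fold count are assumptions, not the study's settings):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        def pls_rmsecv(X, y, n_components=8, cv=10):
            # X: FT-IR absorbances in the 1500-800 cm-1 region, one row per sample.
            # y: reference constituent content (e.g., galacturonic acid, mg/g).
            model = PLSRegression(n_components=n_components)
            y_pred = cross_val_predict(model, X, y, cv=cv)
            return np.sqrt(np.mean((np.ravel(y_pred) - np.ravel(y)) ** 2))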

  20. Quantitative PCR for genetic markers of human fecal pollution

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for enumeration of two recently described hum...

  1. Quantification of mitral valve morphology with three-dimensional echocardiography--can measurement lead to better management?

    PubMed

    Lee, Alex Pui-Wai; Fang, Fang; Jin, Chun-Na; Kam, Kevin Ka-Ho; Tsui, Gary K W; Wong, Kenneth K Y; Looi, Jen-Li; Wong, Randolph H L; Wan, Song; Sun, Jing Ping; Underwood, Malcolm J; Yu, Cheuk-Man

    2014-01-01

    The mitral valve (MV) has complex 3-dimensional (3D) morphology and motion. Advances in real-time 3D echocardiography (RT3DE) have revolutionized clinical imaging of the MV by providing clinicians with realistic visualization of the valve. Thus far, RT3DE of MV structure and dynamics has adopted an approach that depends largely on subjective and qualitative interpretation of the 3D images of the valve, rather than objective and reproducible measurement. RT3DE combined with image-processing techniques provides precise segmentation and reliable quantification of the complex 3D morphology and rapid motion of the MV. This new approach to imaging may provide additional quantitative descriptions that are useful in diagnostic and therapeutic decision-making. Quantitative analysis of the MV using RT3DE has increased our understanding of the pathologic mechanisms of degenerative, ischemic, functional, and rheumatic MV disease. Most recently, 3D morphologic quantification has entered clinical use to provide more accurate diagnosis of MV disease and for planning surgery and transcatheter interventions. Current limitations of this quantitative approach to MV imaging include the labor-intensiveness of image segmentation and the lack of a clear definition of the clinical significance of many of the morphologic parameters. This review summarizes the current development and applications of quantitative analysis of MV morphology using RT3DE.

  2. Using Perturbation Theory to Reduce Noise in Diffusion Tensor Fields

    PubMed Central

    Bansal, Ravi; Staib, Lawrence H.; Xu, Dongrong; Laine, Andrew F.; Liu, Jun; Peterson, Bradley S.

    2009-01-01

    We propose the use of Perturbation theory to reduce noise in Diffusion Tensor (DT) fields. Diffusion Tensor Imaging (DTI) encodes the diffusion of water molecules along different spatial directions in a positive-definite, 3 × 3 symmetric tensor. Eigenvectors and eigenvalues of DTs allow the in vivo visualization and quantitative analysis of white matter fiber bundles across the brain. The validity and reliability of these analyses are limited, however, by the low spatial resolution and low Signal-to-Noise Ratio (SNR) in DTI datasets. Our procedures can be applied to improve the validity and reliability of these quantitative analyses by reducing noise in the tensor fields. We model a tensor field as a three-dimensional Markov Random Field and then compute the likelihood and the prior terms of this model using Perturbation theory. The prior term constrains the tensor field to be smooth, whereas the likelihood term constrains the smoothed tensor field to be similar to the original field. Thus, the proposed method generates a smoothed field that is close in structure to the original tensor field. We evaluate the performance of our method both visually and quantitatively using synthetic and real-world datasets. We quantitatively assess the performance of our method by computing the SNR for eigenvalues and the coherence measures for eigenvectors of DTs across tensor fields. In addition, we quantitatively compare the performance of our procedures with the performance of one method that uses a Riemannian distance to compute the similarity between two tensors, and with another method that reduces noise in tensor fields by anisotropically filtering the diffusion weighted images that are used to estimate diffusion tensors. These experiments demonstrate that our method significantly increases the coherence of the eigenvectors and the SNR of the eigenvalues, while simultaneously preserving the fine structure and boundaries between homogeneous regions, in the smoothed tensor field. PMID:19540791
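
    The likelihood-plus-prior trade-off described can be illustrated on a scalar field; the full method operates on tensor fields with perturbation-theory terms, so this gradient-descent sketch (with periodic boundaries via np.roll, and all parameter values assumed) is only a simplified stand-in:

        import numpy as np

        def mrf_smooth(field, lam=1.0, n_iters=200, step=0.05):
            # Minimize ||x - y||^2 (likelihood: stay close to the data)
            # + lam * sum over neighbor pairs ||x_i - x_j||^2 (prior: be smooth).
            y = np.asarray(field, float)
            x = y.copy()
            for _ in range(n_iters):
                grad = 2.0 * (x - y)
                for axis in range(x.ndim):
                    grad += 2.0 * lam * (2.0 * x
                                         - np.roll(x, 1, axis)
                                         - np.roll(x, -1, axis))
                x -= step * grad
            return x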

  3. Assessing the Reliability of Material Flow Analysis Results: The Cases of Rhenium, Gallium, and Germanium in the United States Economy.

    PubMed

    Meylan, Grégoire; Reck, Barbara K; Rechberger, Helmut; Graedel, Thomas E; Schwab, Oliver

    2017-10-17

    Decision-makers traditionally expect "hard facts" from scientific inquiry, an expectation that the results of material flow analyses (MFAs) can hardly meet. MFA limitations are attributable to the incompleteness of flowcharts, limited data quality, and model assumptions. Moreover, MFA results are, for the most part, based less on empirical observation than on social knowledge construction processes. Developing, applying, and improving the means of evaluating and communicating the reliability of MFA results is therefore imperative. We apply two recently proposed approaches for making quantitative statements about MFA reliability to national minor-metal systems: rhenium, gallium, and germanium in the United States in 2012, and we discuss the reliability of the results in policy and management contexts. The first approach assesses data quality based on a systematic characterization of MFA data and the associated meta-information, quantifying the "information content" of the MFA. The second quantifies data inconsistencies as indicated by the "degree of data reconciliation" between the data and the model. A high information content and a low degree of reconciliation indicate reliable, or certain, MFA results. This article contributes to the reliability and uncertainty discourses in MFA, exemplifying the usefulness of the approaches in policy and management, and to raw material supply discussions by providing country-level information on three important minor metals often considered critical.

  4. Analysis Testing of Sociocultural Factors Influence on Human Reliability within Sociotechnical Systems: The Algerian Oil Companies.

    PubMed

    Laidoune, Abdelbaki; Rahal Gharbi, Med El Hadi

    2016-09-01

    The influence of sociocultural factors on human reliability within open sociotechnical systems is highlighted. The design of such systems is enhanced by experience feedback. The study centred on a survey based on the observation of working cases, the processing of incident/accident statistics, and, in the qualitative part, semistructured interviews. In order to consolidate the study approach, we used a questionnaire schedule for the purpose of standard statistical measurements. We tried to be unbiased by covering an exhaustive list of all worker categories, including age, sex, educational level, prescribed task, accountability level, etc. The survey was reinforced by a schedule distributed to 300 workers belonging to two oil companies; this schedule comprises 30 items related to six main factors that influence human reliability. Qualitative observations and processing of the schedule data showed that sociocultural factors can influence operator behaviors both negatively and positively, and that they influence human reliability in both qualitative and quantitative terms. The proposed model shows how reliability can be enhanced by measures such as experience feedback based on, for example, safety improvements, training, and information, together with continuous system improvements that strengthen the sociocultural context and reduce negative behaviors.

  5. Validity and reliability of the Paprosky acetabular defect classification.

    PubMed

    Yu, Raymond; Hofstaetter, Jochen G; Sullivan, Thomas; Costi, Kerry; Howie, Donald W; Solomon, Lucian B

    2013-07-01

    The Paprosky acetabular defect classification is widely used but has not been appropriately validated. Reliability of the Paprosky system has not been evaluated in combination with standardized techniques of measurement and scoring. This study evaluated the reliability, teachability, and validity of the Paprosky acetabular defect classification. Preoperative radiographs from a random sample of 83 patients undergoing 85 acetabular revisions were classified by four observers, and their classifications were compared with quantitative intraoperative measurements. Teachability of the classification scheme was tested by dividing the four observers into two groups. The observers in Group 1 underwent three teaching sessions; those in Group 2 underwent one session and the influence of teaching on the accuracy of their classifications was ascertained. Radiographic evaluation showed statistically significant relationships with intraoperative measurements of anterior, medial, and superior acetabular defect sizes. Interobserver reliability improved substantially after teaching and did not improve without it. The weighted kappa coefficient went from 0.56 at Occasion 1 to 0.79 after three teaching sessions in Group 1 observers, and from 0.49 to 0.65 after one teaching session in Group 2 observers. The Paprosky system is valid and shows good reliability when combined with standardized definitions of radiographic landmarks and a structured analysis. Level II, diagnostic study. See the Guidelines for Authors for a complete description of levels of evidence.

  6. Wii Balance Board: Reliability and Clinical Use in Assessment of Balance in Healthy Elderly Women.

    PubMed

    Monteiro-Junior, Renato Sobral; Ferreira, Arthur Sá; Puell, Vivian Neiva; Lattari, Eduardo; Machado, Sérgio; Otero Vaghetti, César Augusto; da Silva, Elirez Bezerra

    2015-01-01

    The force plate is considered the gold standard tool for assessing body balance, but the Wii Balance Board (WBB) platform is trustworthy equipment for assessing stabilometric components in young people. We therefore aimed to examine the reliability of center of pressure measures obtained with the WBB in healthy elderly women. Twenty-one healthy and physically active women were enrolled in the study (age: 64 ± 7 years; body mass index: 29 ± 5 kg/m²). The WBB was used to assess the center of pressure measures in these individuals, and pressure was applied linearly to different points to test the platform's precision. Three assessments were performed, two of them held on the same day at a 5- to 10-minute interval and the third performed 48 h later. A linear regression analysis was used to assess linearity, while the intraclass correlation coefficient was used to assess reliability. The platform precision was adequate (R² = 0.997, P = 0.01). Center of pressure measures showed excellent reliability (all intraclass correlation coefficient values were > 0.90; p < 0.01). The WBB is a precise and reliable tool for the quantitative measurement of body stability in healthy, active elderly women, and its use should be encouraged in clinical settings.

  7. High-Performance Liquid Chromatography (HPLC)-Based Detection and Quantitation of Cellular c-di-GMP.

    PubMed

    Petrova, Olga E; Sauer, Karin

    2017-01-01

    The modulation of c-di-GMP levels plays a vital role in the regulation of various processes in a wide array of bacterial species. Thus, investigation of c-di-GMP regulation requires reliable methods for the assessment of c-di-GMP levels and turnover. Reversed-phase high-performance liquid chromatography (RP-HPLC) analysis has become a commonly used approach to accomplish these goals. The following describes the extraction and HPLC-based detection and quantification of c-di-GMP from Pseudomonas aeruginosa samples, a procedure that is amenable to modifications for the analysis of c-di-GMP in other bacterial species.

  8. Reliability and Validity of the Professional Counseling Performance Evaluation

    ERIC Educational Resources Information Center

    Shepherd, J. Brad; Britton, Paula J.; Kress, Victoria E.

    2008-01-01

    The definition and measurement of counsellor trainee competency is an issue that has received increased attention yet lacks quantitative study. This research evaluates item responses, scale reliability and intercorrelations, interrater agreement, and criterion-related validity of the Professional Performance Fitness Evaluation/Professional…

  9. Development of Islamic Spiritual Health Scale (ISHS).

    PubMed

    Khorashadizadeh, Fatemeh; Heydari, Abbas; Nabavi, Fatemeh Heshmati; Mazlom, Seyed Reza; Ebrahimi, Mahdi; Esmaili, Habibollah

    2017-03-01

    To develop and psychometrically assess a spiritual health scale based on the Islamic view in Iran. The cross-sectional study was conducted at Imam Ali and Quem hospitals in Mashhad and Imam Ali and Imam Reza hospitals in Bojnurd, Iran, from 2015 to 2016. In the first stage, an 81-item Likert-type scale was developed using a qualitative approach. The second stage comprised the quantitative component. The scale's impact factor, content validity ratio, content validity index, face validity, and exploratory factor analysis were calculated. Test-retest and internal consistency measures were used to examine the reliability of the instrument. Data analysis was done using SPSS 11. Of the 81 items in the scale, those with an impact factor above 1.5, a content validity ratio above 0.62, and a content validity index above 0.79 were considered valid and the rest were discarded, resulting in a 61-item scale. Exploratory factor analysis reduced the list of items to 30, divided into seven groups with a minimum eigenvalue of 1 for each factor. According to the scatter plot, however, the attributes of the concept of spiritual health comprised love of the Creator, duty-based life, religious rationality, psychological balance, and attention to the afterlife. The internal reliability of the scale, calculated as Cronbach's alpha coefficient, was 0.91. There was solid evidence of the strong factor structure and reliability of the Islamic Spiritual Health Scale, which provides a unique way to assess the spiritual health of Muslims.
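
    The content validity ratio and index thresholds cited follow Lawshe-style panel arithmetic; a minimal sketch (function names are ours, and the item-level CVI definition shown is one common convention):

        def content_validity_ratio(n_essential, n_panelists):
            # Lawshe's CVR = (n_e - N/2) / (N/2); ranges from -1 to +1.
            return (n_essential - n_panelists / 2.0) / (n_panelists / 2.0)

        def item_cvi(n_relevant, n_panelists):
            # Item-level CVI: proportion of experts rating the item relevant.
            return n_relevant / n_panelists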

  10. Rapid Trace Detection and Isomer Quantitation of Pesticide Residues via Matrix-Assisted Laser Desorption/Ionization Fourier Transform Ion Cyclotron Resonance Mass Spectrometry.

    PubMed

    Wu, Xinzhou; Li, Weifeng; Guo, Pengran; Zhang, Zhixiang; Xu, Hanhong

    2018-04-18

    Matrix-assisted laser desorption/ionization Fourier transform ion cyclotron resonance mass spectrometry (MALDI-FTICR-MS) has been applied to the rapid, sensitive, unambiguous, and quantitative detection of pesticide residues on fresh leaves with little sample pretreatment. Various pesticides (insecticides, bactericides, herbicides, and acaricides) are detected directly in the complex matrix with excellent limits of detection, down to 4 μg/L. FTICR-MS could unambiguously identify pesticides with tiny mass differences (∼0.01775 Da), thereby avoiding false-positive results. Remarkably, pesticide isomers can be fully discriminated by use of diagnostic fragments, and quantitative analysis of pesticide isomers is demonstrated. The present results expand the horizons of the MALDI-FTICR-MS platform in the reliable determination of pesticides, with the integrated advantages of ultrahigh mass resolution and accuracy. This method can provide evidence on the detrimental effects of pesticide residues, expediting the identification and evaluation of innovative pesticides.

  11. Portable paper-based device for quantitative colorimetric assays relying on light reflectance principle.

    PubMed

    Li, Bowei; Fu, Longwen; Zhang, Wei; Feng, Weiwei; Chen, Lingxin

    2014-04-01

    This paper presents a novel paper-based analytical device that reads colorimetric paper assays through their light reflectance. The device is portable, low cost (<20 dollars), and lightweight (only 176 g), making it well suited to cost-effective point-of-care health care or on-site detection. Based on the light reflectance principle, the signal can be obtained directly, stably, and in a user-friendly manner with our device. We demonstrated the utility and broad applicability of this technique with measurements of different biological and pollution target samples (BSA, glucose, Fe, and nitrite). Moreover, real samples of Fe(II) and nitrite in local tap water were successfully analyzed; compared with the standard UV absorption method, the quantitative results showed good performance, reproducibility, and reliability. This device can provide quantitative information very conveniently and shows great potential for the broad fields of resource-limited analysis, medical diagnostics, and on-site environmental detection. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. The diagnostic capability of laser induced fluorescence in the characterization of excised breast tissues

    NASA Astrophysics Data System (ADS)

    Galmed, A. H.; Elshemey, Wael M.

    2017-08-01

    Differentiating between normal, benign, and malignant excised breast tissues is a major worldwide challenge that needs a quantitative, fast, and reliable technique in order to avoid subjective errors in diagnosis. Laser induced fluorescence (LIF) is a promising technique that has been applied to the characterization of biological tissues, including breast tissue. Unfortunately, only a few studies have adopted a quantitative approach that can be directly applied to breast tissue characterization. This work provides a quantitative means for such characterization by introducing several LIF characterization parameters and determining the diagnostic accuracy of each parameter in differentiating between normal, benign, and malignant excised breast tissues. Extensive analysis of 41 lyophilized breast samples using scatter diagrams, cut-off values, diagnostic indices, and receiver operating characteristic (ROC) curves shows that some spectral parameters (peak height and area under the peak) are superior for the characterization of normal, benign, and malignant breast tissues, with high sensitivity (up to 0.91), specificity (up to 0.91), and accuracy ranking (highly accurate).
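
    The cut-off analysis described amounts to sweeping a threshold over a spectral parameter and reading off sensitivity and specificity at each candidate cut-off; a minimal sketch (the labels and values are assumed inputs, not the study's data):

        import numpy as np

        def sensitivity_specificity(values, labels, cutoff):
            # labels: 1 = malignant, 0 = benign/normal; positive if value >= cutoff.
            values = np.asarray(values, float)
            labels = np.asarray(labels, int)
            pred = values >= cutoff
            tp = np.sum(pred & (labels == 1))
            fn = np.sum(~pred & (labels == 1))
            tn = np.sum(~pred & (labels == 0))
            fp = np.sum(pred & (labels == 0))
            return tp / (tp + fn), tn / (tn + fp)

        def roc_points(values, labels):
            # One (1 - specificity, sensitivity) point per candidate cutoff.
            return [(1.0 - sp, se)
                    for c in np.unique(values)
                    for se, sp in [sensitivity_specificity(values, labels, c)]]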

  13. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  14. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  15. Task Decomposition in Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald Laurids; Joe, Jeffrey Clark

    2014-06-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  16. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    PubMed

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results represent the pinnacle assessment of quality in an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of the scientific method: it is empirical, inductive, deductive and systematic, relies on data, and is fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and represents both a management discipline and a standardised approach to problem solving and process optimisation.
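
    As a small illustration of the kind of repeatability statistics such a controlled experiment yields, the sketch below computes the mean, standard deviation, coefficient of variation and 3-sigma control limits for one sample across replicated runs; the values are invented for illustration, not the WCPVL data.

        import numpy as np

        # Hypothetical selenium results (µg/L) for one sample analysed in 13 runs.
        runs = np.array([102.1, 98.7, 105.3, 99.8, 101.4, 97.2, 103.6,
                         100.9, 96.8, 104.2, 99.1, 102.8, 98.3])

        mean, sd = runs.mean(), runs.std(ddof=1)
        print(f"mean = {mean:.1f}, SD = {sd:.1f}, CV = {100 * sd / mean:.1f}%")

        # Shewhart-style control limits often used in the Control phase of DMAIC.
        print(f"control limits: {mean - 3 * sd:.1f} .. {mean + 3 * sd:.1f}")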

  17. Fuzzy method of recognition of high molecular substances in evidence-based biology

    NASA Astrophysics Data System (ADS)

    Olevskyi, V. I.; Smetanin, V. T.; Olevska, Yu. B.

    2017-10-01

    Modern requirements for reliable, high-quality results put mathematical methods of data analysis at the forefront. Because of this, evidence-based methods of processing experimental data have become increasingly popular in the biological sciences and medicine. Their basis is meta-analysis, a method of quantitative generalization of a large number of randomized trials addressing the same specific problem, which are often contradictory and performed by different authors. It allows identification of the most important trends and quantitative indicators of the data, verification of advanced hypotheses and discovery of new effects in the population genotype. The existing methods of recognizing high-molecular substances by gel electrophoresis of proteins under denaturing conditions are based on approximate comparison of the contrast of electrophoregrams with a standard solution of known substances. We propose a fuzzy method for modeling the experimental data to increase the accuracy and validity of findings in the detection of new proteins.
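
    The abstract does not give the fuzzy model itself, but a minimal sketch of fuzzy matching of electrophoretic bands might use triangular membership functions; the mobilities and class boundaries below are purely hypothetical.

        def triangular(x, a, b, c):
            """Triangular fuzzy membership: 0 at a and c, 1 at the peak b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        # Hypothetical electrophoretic mobility of an unknown band, compared
        # with fuzzy classes built around reference-protein mobilities.
        references = {"66 kDa": (0.40, 0.45, 0.50), "45 kDa": (0.55, 0.60, 0.65)}
        unknown = 0.58
        for name, (a, b, c) in references.items():
            print(name, round(triangular(unknown, a, b, c), 2))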

  18. Improving the efficiency of selection to Core Medical Training: a study of the use of multiple assessment stations.

    PubMed

    Atkinson, J M; Tullo, E; Mitchison, H; Pearce, M S; Kumar, N

    2012-06-01

    To compare three separate assessment stations used for selection to Core Medical Training (CMT) and to determine the effect of reducing the number from three to two. Quantitative analysis of candidates' assessment station scores, financial analysis of costs of the selection process and quantitative and qualitative surveys of candidates and assessors. The assessment stations used for selection to CMT were reliable and valid for assessing suitability for employment as a CMT trainee. There was no significant difference in candidate ranking if only two assessment stations were used rather than three, i.e. there was no change in the likelihood of receiving a job offer. All of the assessment stations were perceived to have face validity by candidates and assessors. The efficiency of the selection process could be improved without loss of quality if two stations were used rather than three. Using two assessment stations rather than three would appear to improve the efficiency and maintain the quality of the CMT selection process while reducing costs.

  19. Validating Quantitative Measurement Using Qualitative Data: Combining Rasch Scaling and Latent Semantic Analysis in Psychiatry

    NASA Astrophysics Data System (ADS)

    Lange, Rense

    2015-02-01

    An extension of concurrent validity is proposed that uses qualitative data for the purpose of validating quantitative measures. The approach relies on Latent Semantic Analysis (LSA), which places verbal (written) statements in a high-dimensional semantic space. Using data from a medical/psychiatric domain as a case study - Near Death Experiences, or NDE - we established concurrent validity by connecting NDErs' qualitative (written) experiential accounts with their locations on a Rasch-scalable measure of NDE intensity. Concurrent validity received strong empirical support, since the variance in the Rasch measures could be predicted reliably from the coordinates of the accounts in the LSA-derived semantic space (R2 = 0.33). These coordinates also predicted NDErs' age with considerable precision (R2 = 0.25). Both estimates are probably artificially low due to the small available data samples (n = 588). It appears that Rasch scalability of NDE intensity is a prerequisite for these findings, as each intensity level is associated (at least probabilistically) with a well-defined pattern of item endorsements.
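
    A minimal sketch of the LSA-plus-regression pipeline described above, using scikit-learn's TF-IDF and truncated SVD as one common realization of LSA; the accounts, Rasch measures and component count are invented for illustration.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.linear_model import LinearRegression

        # Hypothetical written accounts paired with Rasch intensity measures.
        accounts = ["I saw a bright light and felt at peace",
                    "I floated above my body watching the room",
                    "everything went dark and quiet",
                    "a tunnel opened and voices called my name"]
        rasch_measures = [1.8, 2.4, 0.6, 2.9]

        # LSA: TF-IDF weighting followed by truncated SVD into a semantic space.
        X = TfidfVectorizer().fit_transform(accounts)
        coords = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

        # Concurrent validity: predict the quantitative measure from the coordinates.
        model = LinearRegression().fit(coords, rasch_measures)
        print(f"R^2 = {model.score(coords, rasch_measures):.2f}")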

  20. Systems analysis of the single photon response in invertebrate photoreceptors.

    PubMed

    Pumir, Alain; Graves, Jennifer; Ranganathan, Rama; Shraiman, Boris I

    2008-07-29

    Photoreceptors of the Drosophila compound eye employ a G protein-mediated signaling pathway that transduces single photons into transient electrical responses called "quantum bumps" (QBs). Although most of the molecular components of this pathway are already known, a system-level understanding of the mechanism of QB generation has remained elusive. Here, we present a quantitative model explaining how QBs emerge from the stochastic nonlinear dynamics of the signaling cascade. The model shows that the cascade acts as an "integrate and fire" device and explains how photoreceptors achieve reliable responses to light while keeping background low in the dark. The model predicts the nontrivial behavior of mutants that enhance or suppress signaling and explains the dependence on external calcium, which controls feedback regulation. The results provide insight into physiological questions such as single-photon response efficiency and the adaptation of the response to high incident-light levels. The system-level analysis enabled by modeling phototransduction provides a foundation for understanding G protein signaling pathways less amenable to quantitative approaches.
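
    A toy rendering of the "integrate and fire" idea, assuming stochastic accumulation of activation events that triggers a transient response once a threshold is crossed; all rates, thresholds and time constants below are invented and are not the paper's fitted model.

        import numpy as np

        rng = np.random.default_rng(1)
        dt, T = 1e-3, 0.5                 # time step and duration in seconds
        t = np.arange(0, T, dt)
        rate = 800.0                      # activation events per second after a photon
        decay = 25.0                      # 1/s, removal of the internal messenger
        threshold = 20.0

        x, response = 0.0, np.zeros_like(t)
        for i, _ in enumerate(t):
            # Integrate stochastic activation events, with first-order decay.
            x += rng.poisson(rate * dt) - decay * x * dt
            if x > threshold:             # regenerative phase: fire the bump
                response[i:] = np.exp(-(t[i:] - t[i]) / 0.03)
                break

        latency = t[np.argmax(response > 0)] if response.any() else None
        print("bump latency (ms):", None if latency is None else round(1000 * latency, 1))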

  1. Current perspectives of CASA applications in diverse mammalian spermatozoa.

    PubMed

    van der Horst, Gerhard; Maree, Liana; du Plessis, Stefan S

    2018-03-26

    Since the advent of computer-aided sperm analysis (CASA) some four decades ago, advances in computer technology and software algorithms have helped establish it as a research and diagnostic instrument for the analysis of spermatozoa. Despite mammalian spermatozoa being the most diverse cell type known, CASA is a great tool that has the capacity to provide rapid, reliable and objective quantitative assessment of sperm quality. This paper provides contemporary research findings illustrating the scientific and commercial applications of CASA and its ability to evaluate diverse mammalian spermatozoa (human, primates, rodents, domestic mammals, wildlife species) at both structural and functional levels. The potential of CASA to quantitatively measure essential aspects related to sperm subpopulations, hyperactivation, morphology and morphometry is also demonstrated. Furthermore, applications of CASA are provided for improved mammalian sperm quality assessment, evaluation of sperm functionality and the effect of different chemical substances or pathologies on sperm fertilising ability. It is clear that CASA has evolved significantly and is currently superior to many manual techniques in the research and clinical setting.

  2. Using a Smart Phone as a Standalone Platform for Detection and Monitoring of Pathological Tremors

    PubMed Central

    Daneault, Jean-François; Carignan, Benoit; Codère, Carl Éric; Sadikot, Abbas F.; Duval, Christian

    2013-01-01

    Introduction: Smart phones are becoming ubiquitous and their computing capabilities are ever increasing. Consequently, more attention is geared toward their potential use in research and medical settings. For instance, their built-in hardware can provide quantitative data for different movements. Therefore, the goal of the current study was to evaluate the capabilities of a standalone smart phone platform to characterize tremor. Methods: A smart phone application for tremor quantification and online analysis was developed. Then, smart phone results were compared to those obtained simultaneously with a laboratory accelerometer. Finally, results from the smart phone were compared to clinical tremor assessments. Results: Algorithms for tremor recording and online analysis can be implemented within a smart phone. The smart phone provides reliable time- and frequency-domain tremor characteristics. The smart phone can also provide medically relevant tremor assessments. Discussion: Smart phones have the potential to provide researchers and clinicians with quantitative short- and long-term tremor assessments that are currently not easily available. PMID:23346053
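
    A minimal sketch of the frequency-domain analysis such an application might perform on accelerometer data, assuming a Welch periodogram and a 4-12 Hz tremor band; the sampling rate, simulated 5 Hz tremor and band limits are assumptions, not the study's implementation.

        import numpy as np
        from scipy.signal import welch

        fs = 100.0                          # Hz, a plausible phone accelerometer rate
        t = np.arange(0, 30, 1 / fs)
        rng = np.random.default_rng(2)
        # Hypothetical 5 Hz pathological tremor buried in sensor noise.
        acc = 0.3 * np.sin(2 * np.pi * 5.0 * t) + 0.1 * rng.standard_normal(t.size)

        f, pxx = welch(acc, fs=fs, nperseg=1024)
        peak = f[np.argmax(pxx)]
        mask = (f >= 4) & (f <= 12)
        band_power = pxx[mask].sum() * (f[1] - f[0])   # amplitude-related index
        print(f"dominant frequency = {peak:.1f} Hz, 4-12 Hz band power = {band_power:.4f}")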

  3. Validated reversed phase LC method for quantitative analysis of polymethoxyflavones in citrus peel extracts.

    PubMed

    Wang, Zhenyu; Li, Shiming; Ferguson, Stephen; Goodnow, Robert; Ho, Chi-Tang

    2008-01-01

    Polymethoxyflavones (PMFs), which exist exclusively in the citrus genus, have biological activities including anti-inflammatory, anticarcinogenic, and antiatherogenic properties. A validated RPLC method was developed for quantitative analysis of six major PMFs, namely nobiletin, tangeretin, sinensetin, 5,6,7,4'-tetramethoxyflavone, 3,5,6,7,3',4'-hexamethoxyflavone, and 3,5,6,7,8,3',4'-heptamethoxyflavone. The polar-embedded LC stationary phase was able to fully resolve the six analogues. The developed method was fully validated in terms of linearity, accuracy, precision, sensitivity, and system suitability. The LOD of the method was calculated as 0.15 µg/mL and the recovery rate was between 97.0 and 105.1%. This analytical method was successfully applied to quantify the individual PMFs in four commercially available citrus peel extracts (CPEs). Each extract shows significant difference in the PMF composition and concentration. This method may provide a simple, rapid, and reliable tool to help reveal the correlation between the bioactivity of the PMF extracts and the individual PMF content.
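
    A minimal sketch of the validation arithmetic behind linearity, LOD and recovery figures like those reported above; the calibration points and spike results are invented, and the 3.3*sigma/slope rule is one common LOD convention rather than necessarily the one used in the paper.

        import numpy as np

        # Hypothetical calibration data for one analyte (µg/mL vs detector response).
        conc = np.array([0.5, 1, 5, 10, 25, 50])
        resp = np.array([10.2, 20.5, 101.8, 203.9, 508.2, 1019.5])

        slope, intercept = np.polyfit(conc, resp, 1)
        fit = slope * conc + intercept
        r2 = 1 - ((resp - fit) ** 2).sum() / ((resp - resp.mean()) ** 2).sum()

        # LOD from the residual standard error via the common 3.3*sigma/slope rule.
        sigma = np.sqrt(((resp - fit) ** 2).sum() / (len(conc) - 2))
        lod = 3.3 * sigma / slope

        # Recovery: amount found vs amount spiked (hypothetical numbers).
        recovery = 100 * 9.85 / 10.0
        print(f"R2 = {r2:.4f}, LOD = {lod:.2f} µg/mL, recovery = {recovery:.1f}%")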

  4. Normalization of Reverse Transcription Quantitative PCR Data During Ageing in Distinct Cerebral Structures.

    PubMed

    Bruckert, G; Vivien, D; Docagne, F; Roussel, B D

    2016-04-01

    Reverse transcription quantitative polymerase chain reaction (RT-qPCR) has become a routine method in many laboratories. Normalization of data from experimental conditions is critical for data processing and is usually achieved by the use of a single reference gene. Nevertheless, as pointed out by the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines, several reference genes should be used for reliable normalization. Ageing is a physiological process that results in a decline in the expression of many genes. Reliable normalization of RT-qPCR data becomes crucial when studying ageing. Here, we propose an RT-qPCR study of four mouse brain regions (cortex, hippocampus, striatum and cerebellum) at different ages (from 8 weeks to 22 months) in which we studied the expression of nine commonly used reference genes. With the use of two different algorithms, we found that all brain structures need at least two genes for a good normalization step. We propose specific pairs of genes for efficient data normalization in the four brain regions studied. These results underline the importance of reliable reference genes for specific brain regions in ageing.
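
    One of the commonly used algorithms, geNorm, ranks reference genes by a pairwise stability measure M; a compact sketch under invented relative quantities is shown below (the gene names and data are hypothetical).

        import numpy as np

        rng = np.random.default_rng(3)
        # Hypothetical relative quantities (linear scale) for 4 candidate reference
        # genes across 12 samples spanning ages and brain regions.
        genes = ["Actb", "Gapdh", "Ppia", "Hprt"]
        q = rng.lognormal(mean=0.0, sigma=0.2, size=(4, 12))

        def genorm_m(q):
            """Mean pairwise-variation stability M per gene (lower = more stable)."""
            logq = np.log2(q)
            m = []
            for j in range(q.shape[0]):
                sds = [np.std(logq[j] - logq[k], ddof=1)
                       for k in range(q.shape[0]) if k != j]
                m.append(np.mean(sds))
            return m

        for g, m in sorted(zip(genes, genorm_m(q)), key=lambda x: x[1]):
            print(f"{g}: M = {m:.3f}")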

  5. Development of a multi-variate calibration approach for quantitative analysis of oxidation resistant Mo-Si-B coatings using laser ablation inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Cakara, Anja; Bonta, Maximilian; Riedl, Helmut; Mayrhofer, Paul H.; Limbeck, Andreas

    2016-06-01

    Alloys of Mo-Si-B are nowadays employed for the production of oxidation-protection coatings in ultrahigh-temperature environments. The properties of the material, mainly its oxidation resistance, are strongly influenced by the Si to B ratio; thus, reliable analytical methods are needed to assure exact determination of the material composition for the respective applications. For the analysis of such coatings, laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been reported as a versatile method with no specific requirements on the nature of the sample. However, matrix effects represent the main limitation of laser-based solid sampling techniques, and usually the use of matrix-matched standards is required for quantitative analysis. In this work, LA-ICP-MS analysis of samples with known composition and varying Mo, Si and B content was carried out. No linear correlation could be found between the known analyte concentrations and the derived LA-ICP-MS signal intensities. In order to allow quantitative analysis independent of matrix effects, a multiple linear regression model was developed. Besides the three target analytes, the signals of possible argides (40Ar36Ar and 98Mo40Ar) as well as detected impurities of the Mo-Si-B coatings (108Pd) were also considered. The applicability of the model to unknown samples was confirmed using external validation. Relative deviations of 5 to 10% from the values determined using conventional liquid analysis after sample digestion were observed for the main components Mo and Si.
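
    A minimal sketch of a multiple linear regression calibration of this kind, with the argide and impurity signals entering as extra predictors; all intensities and compositions below are invented, and the scikit-learn implementation is an assumption, not the authors' code.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Hypothetical training set: LA-ICP-MS intensities for reference coatings
        # of known Si content. Columns: Si, Mo and B signals plus the argide and
        # impurity signals the abstract lists as additional predictors.
        X = np.array([
            [1.00, 5.2, 0.30, 0.12, 0.05],
            [1.45, 4.8, 0.41, 0.10, 0.06],
            [2.10, 4.1, 0.55, 0.14, 0.04],
            [2.90, 3.6, 0.72, 0.11, 0.07],
            [3.60, 3.0, 0.88, 0.13, 0.05],
        ])
        si_wt_pct = np.array([8.0, 11.5, 16.0, 21.5, 26.0])   # known compositions

        model = LinearRegression().fit(X, si_wt_pct)
        unknown = np.array([[2.50, 3.9, 0.63, 0.12, 0.06]])
        print(f"predicted Si content: {model.predict(unknown)[0]:.1f} wt.%")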

  6. Statistical image quantification toward optimal scan fusion and change quantification

    NASA Astrophysics Data System (ADS)

    Potesil, Vaclav; Zhou, Xiang Sean

    2007-03-01

    Recent advances in imaging technology have brought new challenges and opportunities for automatic and quantitative analysis of medical images. With broader accessibility of more imaging modalities for more patients, fusion of modalities/scans from one time point and longitudinal analysis of changes across time points have become the two most critical differentiators to support more informed, more reliable and more reproducible diagnosis and therapy decisions. Unfortunately, scan fusion and longitudinal analysis are both inherently plagued with increased levels of statistical errors. A lack of comprehensive analysis by imaging scientists and a lack of full awareness by physicians pose potential risks in clinical practice. In this paper, we discuss several key error factors affecting imaging quantification, study their interactions, and introduce a simulation strategy to establish general error bounds for change quantification across time. We quantitatively show that image resolution, voxel anisotropy, lesion size, eccentricity, and orientation are all contributing factors to quantification error, and that there is an intricate relationship between voxel anisotropy and lesion shape in affecting quantification error. Specifically, when two or more scans are to be fused at the feature level, optimal linear fusion analysis reveals that scans with voxel anisotropy aligned with lesion elongation should receive a higher weight than other scans. As a result of such optimal linear fusion, we achieve a lower variance than naïve averaging. Simulated experiments are used to validate the theoretical predictions. Future work based on the proposed simulation methods may lead to general guidelines and error lower bounds for quantitative image analysis and change detection.
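
    The "higher weight for better-aligned scans" conclusion is the familiar inverse-variance weighting result; the sketch below contrasts optimal linear fusion with naive averaging for two hypothetical per-scan estimates (the numbers are invented).

        import numpy as np

        # Hypothetical lesion-size estimates from two scans whose voxel anisotropy
        # differs, hence different measurement variances.
        est = np.array([12.4, 13.1])        # mm, per-scan estimates
        var = np.array([0.9, 0.2])          # scan 2 is better aligned with the lesion

        w = (1 / var) / (1 / var).sum()     # optimal (inverse-variance) weights
        fused = (w * est).sum()
        fused_var = 1 / (1 / var).sum()

        naive = est.mean()
        naive_var = var.sum() / 4           # variance of the unweighted mean
        print(f"weights = {np.round(w, 2)}, fused = {fused:.2f} (var {fused_var:.3f})")
        print(f"naive average = {naive:.2f} (var {naive_var:.3f})")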

  7. Airborne particulate matter (PM) filter analysis and modeling by total reflection X-ray fluorescence (TXRF) and X-ray standing wave (XSW).

    PubMed

    Borgese, L; Salmistraro, M; Gianoncelli, A; Zacco, A; Lucchini, R; Zimmerman, N; Pisani, L; Siviero, G; Depero, L E; Bontempi, E

    2012-01-30

    This work is presented as an improvement of a recently introduced method for airborne particulate matter (PM) filter analysis [1]. X-ray standing wave (XSW) and total reflection X-ray fluorescence (TXRF) measurements were performed with new dedicated laboratory instrumentation. The main advantage of performing both XSW and TXRF is the possibility to distinguish the nature of the sample: whether it is the dry residue of a small droplet, a thin-film-like sample or a bulk sample. Another advantage is the possibility to select the angle of total reflection for the TXRF measurements. Finally, the possibility to switch the X-ray source (a change of X-ray anode, for example from Mo to Cu) allows lighter and heavier elements to be measured more accurately. The aim of the present study is to lay the theoretical foundation of the newly proposed method for quantitative analysis of airborne PM filters, improving the accuracy and efficiency of quantification by means of an external standard. The theoretical model presented and discussed demonstrates that airborne PM filters can be considered as thin layers. A set of reference samples was prepared in the laboratory and used to obtain a calibration curve. Our results demonstrate that the proposed method for quantitative analysis of airborne PM filters is affordable and reliable, without the need to digest filters to obtain quantitative chemical analysis, and that the use of XSW improves the accuracy of TXRF analysis. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. HIFI-C: a robust and fast method for determining NMR couplings from adaptive 3D to 2D projections.

    PubMed

    Cornilescu, Gabriel; Bahrami, Arash; Tonelli, Marco; Markley, John L; Eghbalnia, Hamid R

    2007-08-01

    We describe a novel method for the robust, rapid, and reliable determination of J couplings in multi-dimensional NMR coupling data, including small couplings from larger proteins. The method, "High-resolution Iterative Frequency Identification of Couplings" (HIFI-C), is an extension of the adaptive and intelligent data collection approach introduced earlier in HIFI-NMR. HIFI-C collects one or more optimally tilted two-dimensional (2D) planes of a 3D experiment, identifies peaks, and determines couplings with high resolution and precision. The HIFI-C approach, demonstrated here for the 3D quantitative J method, offers vital features that advance the goal of rapid and robust collection of NMR coupling data. (1) Tilted plane residual dipolar coupling (RDC) data are collected adaptively in order to offer an intelligent trade-off between data collection time and accuracy. (2) Data from independent planes can provide a statistical measure of reliability for each measured coupling. (3) Fast data collection enables measurements in cases where sample stability is a limiting factor (for example, in the presence of an orienting medium required for residual dipolar coupling measurements). (4) For samples that are stable, or in experiments involving relatively stronger couplings, robust data collection enables more reliable determination of couplings in a shorter time, particularly for larger biomolecules. As a proof of principle, we have applied the HIFI-C approach to the 3D quantitative J experiment to determine N-C' RDC values for three proteins ranging from 56 to 159 residues (including a homodimer with 111 residues in each subunit). A number of factors influence the robustness and speed of data collection, including the size of the protein, the experimental setup, and the coupling being measured. To exhibit a lower bound on robustness and the potential for time saving, the measurement of dipolar couplings for the N-C' vector represents a realistic "worst case analysis": these couplings are among the smallest currently measured, and their determination in both isotropic and anisotropic media demands the highest measurement precision. The new approach yielded excellent quantitative agreement with values determined independently by the conventional 3D quantitative J NMR method (in cases where sample stability in oriented media permitted these measurements), but with a factor of 2-5 in time savings. The statistical measure of reliability, measuring the quality of each RDC value, offers valuable adjunct information even in cases where modest time savings may be realized.

  9. A Complete Color Normalization Approach to Histopathology Images Using Color Cues Computed From Saturation-Weighted Statistics.

    PubMed

    Li, Xingyu; Plataniotis, Konstantinos N

    2015-07-01

    In digital histopathology, tasks of segmentation and disease diagnosis are achieved by quantitative analysis of image content. However, color variation in image samples makes it challenging to produce reliable results. This paper introduces a complete normalization scheme to address the problem of color variation in histopathology images jointly caused by inconsistent biopsy staining and nonstandard imaging conditions. Different from existing normalization methods that either address a partial cause of color variation or lump the causes together, our method identifies the causes of color variation based on a microscopic imaging model and addresses inconsistency in biopsy imaging and staining by an illuminant normalization module and a spectral normalization module, respectively. In evaluation, we use two public datasets that are representative of histopathology images commonly received in clinics to examine the proposed method with respect to robustness to system settings, performance consistency against achromatic pixels, and normalization effectiveness in terms of histological information preservation. As the saturation-weighted statistics proposed in this study generate stable and reliable color cues for stain normalization, our scheme is robust to system parameters and insensitive to image content and achromatic colors. Extensive experimentation suggests that our approach outperforms state-of-the-art normalization methods, as the proposed method is the only approach that succeeds in preserving histological information after normalization. The proposed color normalization solution would be useful to mitigate the effects of color variation in pathology images on subsequent quantitative analysis.

  10. Selection of reliable reference genes for quantitative real-time PCR gene expression analysis in Jute (Corchorus capsularis) under stress treatments

    PubMed Central

    Niu, Xiaoping; Qi, Jianmin; Zhang, Gaoyang; Xu, Jiantang; Tao, Aifen; Fang, Pingping; Su, Jianguang

    2015-01-01

    To accurately measure gene expression using quantitative reverse transcription PCR (qRT-PCR), reliable reference gene(s) are required for data normalization. Corchorus capsularis, an annual herbaceous fiber crop with predominant biodegradability and renewability, has not been investigated for the stability of reference genes with qRT-PCR. In this study, 11 candidate reference genes were selected and their expression levels were assessed using qRT-PCR. To account for the influence of experimental approach and tissue type, 22 different jute samples were selected from abiotic and biotic stress conditions as well as three different tissue types. The stability of the candidate reference genes was evaluated using geNorm, NormFinder, and BestKeeper programs, and the comprehensive rankings of gene stability were generated by aggregate analysis. For the biotic stress and NaCl stress subsets, ACT7 and RAN were suitable as stable reference genes for gene expression normalization. For the PEG stress subset, UBC, and DnaJ were sufficient for accurate normalization. For the tissues subset, four reference genes TUBβ, UBI, EF1α, and RAN were sufficient for accurate normalization. The selected genes were further validated by comparing expression profiles of WRKY15 in various samples, and two stable reference genes were recommended for accurate normalization of qRT-PCR data. Our results provide researchers with appropriate reference genes for qRT-PCR in C. capsularis, and will facilitate gene expression study under these conditions. PMID:26528312

  11. ANTONIA perfusion and stroke. A software tool for the multi-purpose analysis of MR perfusion-weighted datasets and quantitative ischemic stroke assessment.

    PubMed

    Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J

    2014-01-01

    The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.

  12. Intramolecular carbon and nitrogen isotope analysis by quantitative dry fragmentation of the phenylurea herbicide isoproturon in a combined injector/capillary reactor prior to GC separation.

    PubMed

    Penning, Holger; Elsner, Martin

    2007-11-01

    Potentially, compound-specific isotope analysis may provide unique information on the source and fate of pesticides in natural systems. Yet LC-based methods, which rely on organic solvents, often cannot be used for isotope analysis, and GC-based analysis is frequently not possible due to the thermolability of the analyte. A typical example of a compound with such properties is isoproturon (3-(4-isopropylphenyl)-1,1-dimethylurea), which belongs to the worldwide extensively used phenylurea herbicides. To make isoproturon accessible to carbon and nitrogen isotope analysis, we developed a GC-based method during which isoproturon was quantitatively fragmented to dimethylamine and 4-isopropylphenylisocyanate. Fragmentation occurred only partially in the injector and was mainly achieved on a heated capillary column. The fragments were then chromatographically separated and individually measured by isotope ratio mass spectrometry. The reliability of the method was tested in hydrolysis experiments with three isotopically different batches of isoproturon. For all three products, the same isotope fractionation factors were observed during conversion, and the difference in isotope composition between the batches was preserved. This study demonstrates that fragmentation of phenylurea herbicides not only makes them accessible to isotope analysis but also enables determination of intramolecular isotope fractionation.

  13. Segmental Musculoskeletal Examinations using Dual-Energy X-Ray Absorptiometry (DXA): Positioning and Analysis Considerations

    PubMed Central

    Hart, Nicolas H.; Nimphius, Sophia; Spiteri, Tania; Cochrane, Jodie L.; Newton, Robert U.

    2015-01-01

    Musculoskeletal examinations provide informative and valuable quantitative insight into muscle and bone health. DXA is one mainstream tool used to accurately and reliably determine body composition components and bone mass characteristics in vivo. Presently, whole-body scan models separate the body into axial and appendicular regions; however, there is a need for localised appendicular segmentation models to further examine regions of interest within the upper and lower extremities. Similarly, inconsistencies pertaining to patient positioning exist in the literature that influence measurement precision and analysis outcomes, highlighting a need for a standardised procedure. This paper provides standardised and reproducible 1) positioning and analysis procedures using DXA and 2) reliable segmental examinations through descriptive appendicular boundaries. Whole-body scans were performed on forty-six (n = 46) football athletes (age: 22.9 ± 4.3 yrs; height: 1.85 ± 0.07 m; weight: 87.4 ± 10.3 kg; body fat: 11.4 ± 4.5 %) using DXA. All segments across all scans were analysed three times by the main investigator on three separate days, and by three independent investigators a week following the original analysis. To examine intra-rater and inter-rater reliability between days and researchers, coefficients of variation (CV) and intraclass correlation coefficients (ICC) were determined. Positioning and segmental analysis procedures presented in this study produced very high, nearly perfect intra-tester (CV ≤ 2.0%; ICC ≥ 0.988) and inter-tester (CV ≤ 2.4%; ICC ≥ 0.980) reliability, demonstrating excellent reproducibility within and between practitioners. Standardised examinations of axial and appendicular segments are necessary. Future studies aiming to quantify and report segmental analyses of the upper- and lower-body musculoskeletal properties using whole-body DXA scans are encouraged to use the patient positioning and image analysis procedures outlined in this paper. Key points: Musculoskeletal examinations using DXA technology require highly standardised and reproducible patient positioning and image analysis procedures to accurately measure and monitor axial, appendicular and segmental regions of interest. Internal rotation and fixation of the lower limbs is strongly recommended during whole-body DXA scans to prevent undesired movement, improve frontal mass accessibility and enhance ankle joint visibility during scan performance and analysis. Appendicular segmental analyses using whole-body DXA scans are highly reliable for all regional upper-body and lower-body segmentations, with hard-tissue (CV ≤ 1.5%; R ≥ 0.990) achieving greater reliability and lower error than soft-tissue (CV ≤ 2.4%; R ≥ 0.980) masses when using our appendicular segmental boundaries. PMID:26336349
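
    A compact sketch of the reliability statistics used here, computing a consistency-type ICC(3,1) and the mean within-subject CV from a repeated-measures matrix; the lean-mass values are invented, and ICC(3,1) is one common choice rather than necessarily the exact ICC model used in the paper.

        import numpy as np

        def icc_3_1(x):
            """Two-way mixed, consistency, single-measure ICC(3,1)."""
            n, k = x.shape
            grand = x.mean()
            ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            resid = (x - x.mean(axis=1, keepdims=True)
                       - x.mean(axis=0, keepdims=True) + grand)
            ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

        # Hypothetical arm lean-mass results (kg): 6 athletes x 3 repeated analyses.
        x = np.array([[3.61, 3.64, 3.60], [4.02, 4.05, 4.01], [3.25, 3.22, 3.27],
                      [4.40, 4.38, 4.43], [3.88, 3.90, 3.86], [3.50, 3.47, 3.52]])
        cv = 100 * (x.std(axis=1, ddof=1) / x.mean(axis=1)).mean()
        print(f"ICC(3,1) = {icc_3_1(x):.3f}, mean within-subject CV = {cv:.2f}%")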

  14. Systematic review of quantitative clinical gait analysis in patients with dementia.

    PubMed

    van Iersel, M B; Hoefsloot, W; Munneke, M; Bloem, B R; Olde Rikkert, M G M

    2004-02-01

    Diminished mobility often accompanies dementia and has a great impact on independence and quality of life. New treatment strategies for dementia are emerging, but the effects on gait remain to be studied objectively. In this review we address the general effects of dementia on gait as revealed by quantitative gait analysis. A systematic literature search with the MeSH terms 'dementia' and 'gait disorders' was performed in Medline, CC, Psychlit and CinaHL between 1980 and 2002. Main inclusion criteria: controlled studies; patients with dementia; quantitative gait data. Seven publications met the inclusion criteria. All compared gait in Alzheimer's Disease (AD) with healthy elderly controls; one also assessed gait in Vascular Dementia (VaD). The methodology used was inconsistent and often had many shortcomings. However, there were several consistent findings: walking velocity decreased in dementia compared to healthy controls and decreased further with progressing severity of dementia. VaD was associated with a significant decrease in walking velocity compared to AD subjects. Dementia was associated with a shortened step length, an increased double-support time and increased step-to-step variability. Gait in dementia has rarely been analyzed in a well-designed manner. Despite this, the literature suggests that quantitative gait analysis can be sufficiently reliable and responsive to measure differences in walking velocity between subjects with and without dementia. More research is required to assess, both on an individual and a group level, how the minimal clinically relevant changes in gait in elderly demented patients should be defined and what would be the most responsive method to measure these changes.

  15. The use of experimental design for the development of a capillary zone electrophoresis method for the quantitation of captopril.

    PubMed

    Mukozhiwa, S Y; Khamanga, S M M; Walker, R B

    2017-09-01

    A capillary zone electrophoresis (CZE) method for the quantitation of captopril (CPT) using UV detection was developed. The influence of electrolyte concentration and system variables on electrophoretic separation was evaluated, and a central composite design (CCD) was used to optimize the method. The variables investigated were pH, molarity, applied voltage and capillary length. The influence of sodium metabisulphite on the stability of test solutions was also investigated; the use of sodium metabisulphite prevented degradation of CPT over 24 hours. A fused, uncoated silica capillary of 67.5 cm total and 57.5 cm effective length was used for analysis. The applied voltage and capillary length affected the migration time of CPT significantly. A 20 mM phosphate buffer adjusted to pH 7.0 was used as running buffer, and an applied voltage of 23.90 kV was suitable to effect a separation. The optimized electrophoretic conditions produced sharp, well-resolved peaks for CPT and sodium metabisulphite. Linear regression analysis of the response for CPT standards revealed the method was linear (R2 = 0.9995) over the range 5-70 μg/mL. The limits of quantitation and detection were 5 and 1.5 μg/mL, respectively. A simple, rapid and reliable CZE method has been developed and successfully applied to the analysis of commercially available CPT products.

  16. Development and application of a multi-targeting reference plasmid as calibrator for analysis of five genetically modified soybean events.

    PubMed

    Pi, Liqun; Li, Xiang; Cao, Yiwei; Wang, Canhua; Pan, Liangwen; Yang, Litao

    2015-04-01

    Reference materials are important for accurate analysis of genetically modified organism (GMO) contents in foods/feeds, and the development of novel reference plasmids is a new trend in the research on GMO reference materials. Herein, we constructed a novel multi-targeting plasmid, pSOY, which contained seven event-specific sequences of five GM soybeans (MON89788-5', A2704-12-3', A5547-127-3', DP356043-5', DP305423-3', A2704-12-5', and A5547-127-5') and the sequence of the soybean endogenous reference gene Lectin. We evaluated the specificity, limits of detection and quantification, and applicability of pSOY in both qualitative and quantitative PCR analyses. The limit of detection (LOD) was as low as 20 copies in qualitative PCR, and the limit of quantification (LOQ) in quantitative PCR was 10 copies. In quantitative real-time PCR analysis, the PCR efficiencies of all event-specific and Lectin assays were higher than 90%, and the squared regression coefficients (R^2) were more than 0.999. The quantification bias varied from 0.21% to 19.29%, and the relative standard deviations were from 1.08% to 9.84% in simulated sample analysis. All the results demonstrated that the developed multi-targeting plasmid, pSOY, was a credible substitute for matrix reference materials, and could be used as a reliable reference calibrator in the identification and quantification of multiple GM soybean events.

  17. 76 FR 63301 - Agency Information Collection Activities: Proposed Collection; Comment Request; Generic Clearance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-12

    ... INFORMATION: Title: Comparative Effectiveness Research Inventory. Abstract: The information collection... will not be used for quantitative information collections that are designed to yield reliably... mechanisms that are designed to yield quantitative results. The Agency received no comments in response to...

  18. Validity and reliability of the "German Utilization Questionnaire-Dissemination and Use of Research" to measure attitude, availability, and support toward implementation of research in nursing practice.

    PubMed

    Haslinger-Baumann, Elisabeth; Lang, Gert; Müller, Gerhard

    2014-01-01

    In nursing practice, research results have to undergo a systematic process of transformation. Currently in Austria, there is no empirical data available concerning the actual implementation of research results. A validated English questionnaire was translated into German and tested for validity and reliability. A survey of 178 registered nurses (n = 178) was conducted in a multicenter, quantitative, cross-sectional study in Austria in 2011. Cronbach's alpha values (.82-.92) were calculated for 4 variables ("use," "attitude," "availability," "support") after the reduction of 7 irrelevant items. Exploratory factor analysis was performed, with Kaiser-Meyer-Olkin (KMO) values ranging from .78 to .92; the total variance explained ranged from 46% to 56%. A validated German questionnaire concerning the implementation of research results is now available for nursing practice.
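
    A minimal sketch of the internal-consistency statistic reported above; the item responses are simulated, so the resulting alpha is illustrative only.

        import numpy as np

        def cronbach_alpha(items):
            """items: respondents x items matrix of scores."""
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        rng = np.random.default_rng(4)
        # Hypothetical 5-point Likert answers: 178 nurses x 6 items of one subscale.
        latent = rng.normal(3, 0.8, (178, 1))
        answers = np.clip(np.rint(latent + rng.normal(0, 0.5, (178, 6))), 1, 5)
        print(f"alpha = {cronbach_alpha(answers):.2f}")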

  19. Competing risk models in reliability systems, a weibull distribution model with bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described and explained as simply functioning or failed, whereas in many real situations the failures may stem from many causes, depending on the age and the environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation analyses allow us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation analyses to competing-risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimation methods are better than those of maximum likelihood. The sensitivity analyses show some sensitivity to shifts of the prior locations; they also show the robustness of the Bayesian analysis within the range between the true value and the maximum likelihood estimate.
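
    A toy simulation in the spirit of the study: two independent Weibull failure causes, an observed time that is their minimum, and a maximum likelihood fit. The parameters are invented, the fit shown ignores censoring by the competing cause (a full analysis would not), and the Bayesian side is omitted for brevity.

        import numpy as np
        from scipy.stats import weibull_min

        rng = np.random.default_rng(5)
        # Two independent Weibull failure causes; the observed time is the minimum.
        t1 = weibull_min.rvs(1.8, scale=1000, size=300, random_state=rng)
        t2 = weibull_min.rvs(3.0, scale=1500, size=300, random_state=rng)
        time = np.minimum(t1, t2)
        cause = np.where(t1 < t2, 1, 2)

        # Simplified MLE on the cause-1 subset only; a full competing-risks
        # analysis would treat failures from the other cause as censored.
        shape, loc, scale = weibull_min.fit(time[cause == 1], floc=0)
        print(f"cause 1: shape = {shape:.2f}, scale = {scale:.0f}")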

  20. Real-time quantitative PCR of Staphylococcus aureus and application in restaurant meals.

    PubMed

    Berrada, H; Soriano, J M; Mañes, J; Picó, Y

    2006-01-01

    Staphylococcus aureus is considered the second most common pathogen to cause outbreaks of food poisoning, exceeded only by Campylobacter. Consumption of foods containing this microorganism is often identified as the cause of illness. In this study, a rapid, reliable, and sensitive real-time quantitative PCR was developed and compared with conventional culture methods. Real-time quantitative PCR was carried out by purifying DNA extracts of S. aureus with a Staphylococcus sample preparation kit and quantifying them in the LightCycler system with hybridization probes. The assay was linear over the range of 10 to 10^6 S. aureus cells (r^2 > 0.997). The PCR presented an efficiency of >85%. Accuracy of the PCR-based assay, expressed as percent bias, was around 13%, and the precision, expressed as the coefficient of variation, was 7 to 10%. Intraday and interday variability, studied at 10^2 CFU/g, were 12 and 14%, respectively. The proposed method was applied to the analysis of 77 samples of restaurant meals in Valencia (Spain). In 11.6% of samples, S. aureus was detected by real-time quantitative PCR as well as by the conventional microbiological method. An excellent correspondence between real-time quantitative PCR and microbiological numbers (CFU/g) was observed, with deviations of < 28%.
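
    A minimal sketch of how amplification efficiency and unknown loads are derived from a qPCR standard curve (efficiency = 10^(-1/slope) - 1); the Cq values below are invented, not the study's calibration data.

        import numpy as np

        # Hypothetical standard curve: Cq values for a 10-fold dilution series.
        copies = np.array([10, 1e2, 1e3, 1e4, 1e5, 1e6])
        cq = np.array([33.4, 30.1, 26.7, 23.4, 20.0, 16.7])

        slope, intercept = np.polyfit(np.log10(copies), cq, 1)
        efficiency = 10 ** (-1 / slope) - 1
        print(f"slope = {slope:.2f}, PCR efficiency = {100 * efficiency:.0f}%")

        # Quantify an unknown sample from its Cq using the fitted curve.
        unknown_cq = 24.8
        print(f"estimated load: {10 ** ((unknown_cq - intercept) / slope):.0f} cells")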

  1. rpb2 is a reliable reference gene for quantitative gene expression analysis in the dermatophyte Trichophyton rubrum.

    PubMed

    Jacob, Tiago R; Peres, Nalu T A; Persinoti, Gabriela F; Silva, Larissa G; Mazucato, Mendelson; Rossi, Antonio; Martinez-Rossi, Nilce M

    2012-05-01

    The selection of the reference genes used for data normalization when quantifying gene expression by real-time PCR amplification (qRT-PCR) is crucial for the accuracy of this technique. In spite of this, little information regarding such genes is available for qRT-PCR gene expression analyses in pathogenic fungi. Thus, we investigated the suitability of eight candidate reference genes in isolates of the human dermatophyte Trichophyton rubrum subjected to several environmental challenges, such as drug exposure, interaction with human nail and skin, and heat stress. The stability of these genes was determined by the geNorm, NormFinder and BestKeeper programs. The gene with the most stable expression in the majority of the conditions tested was rpb2 (DNA-dependent RNA polymerase II), which was validated in three T. rubrum strains. Moreover, the combination of the rpb2 and chs1 (chitin synthase) genes provided the most reliable qRT-PCR data normalization in T. rubrum under a broad range of biological conditions. To the best of our knowledge, this is the first report on the selection of reference genes for qRT-PCR data normalization in dermatophytes, and the results of these studies should permit further analysis of gene expression under several experimental conditions, with improved accuracy and reliability.

  2. Reliable oligonucleotide conformational ensemble generation in explicit solvent for force field assessment using reservoir replica exchange molecular dynamics simulations

    PubMed Central

    Henriksen, Niel M.; Roe, Daniel R.; Cheatham, Thomas E.

    2013-01-01

    Molecular dynamics force field development and assessment requires a reliable means for obtaining a well-converged conformational ensemble of a molecule in both a time-efficient and cost-effective manner. This remains a challenge for RNA because its rugged energy landscape results in slow conformational sampling and accurate results typically require explicit solvent which increases computational cost. To address this, we performed both traditional and modified replica exchange molecular dynamics simulations on a test system (alanine dipeptide) and an RNA tetramer known to populate A-form-like conformations in solution (single-stranded rGACC). A key focus is on providing the means to demonstrate that convergence is obtained, for example by investigating replica RMSD profiles and/or detailed ensemble analysis through clustering. We found that traditional replica exchange simulations still require prohibitive time and resource expenditures, even when using GPU accelerated hardware, and our results are not well converged even at 2 microseconds of simulation time per replica. In contrast, a modified version of replica exchange, reservoir replica exchange in explicit solvent, showed much better convergence and proved to be both a cost-effective and reliable alternative to the traditional approach. We expect this method will be attractive for future research that requires quantitative conformational analysis from explicitly solvated simulations. PMID:23477537

  3. Reliable oligonucleotide conformational ensemble generation in explicit solvent for force field assessment using reservoir replica exchange molecular dynamics simulations.

    PubMed

    Henriksen, Niel M; Roe, Daniel R; Cheatham, Thomas E

    2013-04-18

    Molecular dynamics force field development and assessment requires a reliable means for obtaining a well-converged conformational ensemble of a molecule in both a time-efficient and cost-effective manner. This remains a challenge for RNA because its rugged energy landscape results in slow conformational sampling and accurate results typically require explicit solvent which increases computational cost. To address this, we performed both traditional and modified replica exchange molecular dynamics simulations on a test system (alanine dipeptide) and an RNA tetramer known to populate A-form-like conformations in solution (single-stranded rGACC). A key focus is on providing the means to demonstrate that convergence is obtained, for example, by investigating replica RMSD profiles and/or detailed ensemble analysis through clustering. We found that traditional replica exchange simulations still require prohibitive time and resource expenditures, even when using GPU accelerated hardware, and our results are not well converged even at 2 μs of simulation time per replica. In contrast, a modified version of replica exchange, reservoir replica exchange in explicit solvent, showed much better convergence and proved to be both a cost-effective and reliable alternative to the traditional approach. We expect this method will be attractive for future research that requires quantitative conformational analysis from explicitly solvated simulations.

  4. Quantitative background parenchymal uptake on molecular breast imaging and breast cancer risk: a case-control study.

    PubMed

    Hruska, Carrie B; Geske, Jennifer R; Swanson, Tiffinee N; Mammel, Alyssa N; Lake, David S; Manduca, Armando; Conners, Amy Lynn; Whaley, Dana H; Scott, Christopher G; Carter, Rickey E; Rhodes, Deborah J; O'Connor, Michael K; Vachon, Celine M

    2018-06-05

    Background parenchymal uptake (BPU), which refers to the level of Tc-99m sestamibi uptake within normal fibroglandular tissue on molecular breast imaging (MBI), has been identified as a breast cancer risk factor, independent of mammographic density. Prior analyses have used subjective categories to describe BPU. We evaluate a new quantitative method for assessing BPU by testing its reproducibility, comparing quantitative results with previously established subjective BPU categories, and determining the association of quantitative BPU with breast cancer risk. Two nonradiologist operators independently performed region-of-interest analysis on MBI images viewed in conjunction with corresponding digital mammograms. Quantitative BPU was defined as a unitless ratio of the average pixel intensity (counts/pixel) within the fibroglandular tissue versus the average pixel intensity in fat. Operator agreement and the correlation of quantitative BPU measures with subjective BPU categories assessed by expert radiologists were determined. Percent density on mammograms was estimated using Cumulus. The association of quantitative BPU with breast cancer (per one-unit increase in BPU) was examined within an established case-control study of 62 incident breast cancer cases and 177 matched controls. Quantitative BPU ranged from 0.4 to 3.2 across all subjects and was on average higher in cases than in controls (1.4 versus 1.2, p < 0.007 for both operators). Quantitative BPU was strongly correlated with subjective BPU categories (Spearman's r = 0.59 to 0.69, p < 0.0001, for each paired combination of two operators and two radiologists). Interoperator and intraoperator agreement in the quantitative BPU measure, assessed by intraclass correlation, was 0.92 and 0.98, respectively. Quantitative BPU measures showed either no correlation or weak negative correlation with mammographic percent density. In a model adjusted for body mass index and percent density, higher quantitative BPU was associated with increased risk of breast cancer for both operators (OR = 4.0, 95% confidence interval (CI) 1.6-10.1, and 2.4, 95% CI 1.2-4.7). Quantitative measurement of BPU, defined as the ratio of average counts in fibroglandular tissue relative to that in fat, can be reliably performed by nonradiologist operators with a simple region-of-interest analysis tool. Similar to results obtained with subjective BPU categories, quantitative BPU is a functional imaging biomarker of breast cancer risk, independent of mammographic density and hormonal factors.
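
    A minimal sketch of the region-of-interest arithmetic behind the quantitative BPU measure, i.e. mean counts/pixel in fibroglandular tissue divided by mean counts/pixel in fat; the pixel counts are simulated, not MBI data.

        import numpy as np

        rng = np.random.default_rng(6)
        # Hypothetical MBI pixel counts inside operator-drawn regions of interest.
        fibroglandular_roi = rng.poisson(140, size=500)   # counts/pixel, dense tissue
        fat_roi = rng.poisson(100, size=500)              # counts/pixel, fat

        bpu = fibroglandular_roi.mean() / fat_roi.mean()  # unitless uptake ratio
        print(f"quantitative BPU = {bpu:.2f}")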

  5. A practical approach to spectral calibration of short wavelength infrared hyper-spectral imaging systems

    NASA Astrophysics Data System (ADS)

    Bürmen, Miran; Pernuš, Franjo; Likar, Boštjan

    2010-02-01

    Near-infrared spectroscopy is a promising, rapidly developing, reliable and noninvasive technique, used extensively in biomedicine and in the pharmaceutical industry. With the introduction of acousto-optic tunable filters (AOTF) and highly sensitive InGaAs focal plane sensor arrays, real-time high-resolution hyper-spectral imaging has become feasible for a number of new biomedical in vivo applications. However, due to the specificity of the AOTF technology and the lack of spectral calibration standardization, maintaining long-term stability and compatibility of the acquired hyper-spectral images across different systems is still a challenging problem. Efficiently solving both is essential, as the majority of methods for analysis of hyper-spectral images rely on a priori knowledge extracted from large spectral databases, which serve as the basis for reliable qualitative or quantitative analysis of various biological samples. In this study, we propose and evaluate fast and reliable spectral calibration of hyper-spectral imaging systems in the short wavelength infrared spectral region. The proposed spectral calibration method is based on light sources or materials exhibiting distinct spectral features, which enable robust non-rigid registration of the acquired spectra. The calibration accounts for all components of a typical hyper-spectral imaging system, such as the AOTF, light source, lens and optical fibers. The obtained results indicate that practical, fast and reliable spectral calibration of hyper-spectral imaging systems is possible, thereby assuring long-term stability and inter-system compatibility of the acquired hyper-spectral images.

  6. The use and reliability of SymNose for quantitative measurement of the nose and lip in unilateral cleft lip and palate patients.

    PubMed

    Mosmuller, David; Tan, Robin; Mulder, Frans; Bachour, Yara; de Vet, Henrica; Don Griot, Peter

    2016-10-01

    It is essential to have a reliable assessment method in order to compare the results of cleft lip and palate surgery. In this study, the computer-based program SymNose, a method for quantitative assessment of the nose and lip, is assessed for usability and reliability. The symmetry of the nose and lip was measured twice in 50 six-year-old complete and incomplete unilateral cleft lip and palate patients by four observers. For the frontal view, the asymmetry levels of the nose and upper lip were evaluated; for the basal view, the asymmetry levels of the nose and nostrils were evaluated. The mean inter-observer reliability when tracing each image once or twice was 0.70 and 0.75, respectively. Tracing the photographs with 2 observers and 4 observers gave mean inter-observer scores of 0.86 and 0.92, respectively. The mean intra-observer reliability varied between 0.80 and 0.84. SymNose is a practical and reliable tool for the retrospective assessment of large caseloads of 2D photographs of cleft patients for research purposes. Moderate to high single inter-observer reliability was found. For future research with SymNose, reliable outcomes can be achieved by using the average of single tracings by two observers. Copyright © 2016 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  7. Reliability of intra-oral quantitative sensory testing (QST) in patients with atypical odontalgia and healthy controls - a multicentre study.

    PubMed

    Baad-Hansen, L; Pigg, M; Yang, G; List, T; Svensson, P; Drangsholt, M

    2015-02-01

    The reliability of a comprehensive intra-oral quantitative sensory testing (QST) protocol has not been examined systematically in patients with chronic orofacial pain. The aim of the present multicentre study was to examine the test-retest and interexaminer reliability of intra-oral QST measures in terms of absolute values and z-scores, as well as within-session coefficients of variation (CV), in patients with atypical odontalgia (AO) and healthy pain-free controls. Forty-five patients with AO and 68 healthy controls were subjected to bilateral intra-oral gingival QST and unilateral extratrigeminal QST (thenar) on three occasions (twice on 1 day by two different examiners and once approximately 1 week later by one of the examiners). Intraclass correlation coefficients and kappa values for interexaminer and test-retest reliability were computed. Most of the standardised intra-oral QST measures showed fair to excellent interexaminer (9-12 of 13 measures) and test-retest (7-11 of 13 measures) reliability. Furthermore, no robust differences in reliability measures or within-session variability (CV) were detected between patients with AO and the healthy reference group. These reliability results in chronic orofacial pain patients support earlier suggestions, based on data from healthy subjects, that intra-oral QST is sufficiently reliable for use as part of a comprehensive evaluation of patients with somatosensory disturbances or neuropathic pain in the trigeminal region. © 2014 John Wiley & Sons Ltd.

  8. Evaluation and Selection of Candidate Reference Genes for Normalization of Quantitative RT-PCR in Withania somnifera (L.) Dunal

    PubMed Central

    Singh, Varinder; Kaul, Sunil C.; Wadhwa, Renu; Pati, Pratap Kumar

    2015-01-01

    Quantitative real-time PCR (qRT-PCR) is now globally used for accurate analysis of transcript levels in plants. For reliable quantification of transcripts, identification of the best reference genes is a prerequisite in qRT-PCR analysis. Recently, Withania somnifera has attracted a lot of attention due to its immense therapeutic potential, and biotechnological intervention for the improvement of this plant is being seriously pursued. Against this background, it is important to have comprehensive studies on finding suitable reference genes for this high-value medicinal plant. In the present study, 11 candidate genes were evaluated for their expression stability under biotic (fungal disease) and abiotic (wounding, salt, drought, heat and cold) stresses, in different plant tissues, and in response to various plant growth regulators (methyl jasmonate, salicylic acid, abscisic acid). The data, as analyzed by various software packages (geNorm, NormFinder, BestKeeper and the ΔCt method), suggested that cyclophilin (CYP) is the most stable gene under wounding, heat and methyl jasmonate treatments, across different tissues, and over all stress conditions combined. T-SAND was found to be the best reference gene for salt- and salicylic acid (SA)-treated samples, while 26S ribosomal RNA (26S), ubiquitin (UBQ) and beta-tubulin (TUB) were the most stably expressed genes under drought, biotic and cold treatments, respectively. For abscisic acid (ABA)-treated samples, 18S-rRNA was found to be the most stably expressed gene. Finally, the relative expression levels of three genes involved in the withanolide biosynthetic pathway were determined to validate the selection of reliable reference genes. The present work will significantly contribute to gene analysis studies in W. somnifera and facilitate improvements in the quality of gene expression data in this plant as well as in other related plant species. PMID:25769035
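
    As a simple illustration of one of the algorithms named above, the sketch below implements the comparative ΔCt method: for each candidate reference gene it takes the standard deviation of its pairwise Ct differences against every other candidate and ranks genes by the mean of those SDs (lower = more stable). The gene panel and Ct values are invented.

      # Comparative delta-Ct reference-gene stability (lower score = more stable).
      import numpy as np

      genes = ["CYP", "T-SAND", "26S", "UBQ", "TUB"]   # hypothetical panel
      ct = np.array([                                   # samples x genes (toy Ct values)
          [22.1, 24.3, 15.2, 21.0, 23.5],
          [22.4, 24.9, 15.8, 21.6, 24.4],
          [22.2, 24.1, 15.1, 21.2, 23.2],
          [22.6, 25.2, 16.0, 21.9, 24.8],
      ])

      scores = []
      for i in range(ct.shape[1]):
          sds = [np.std(ct[:, i] - ct[:, j], ddof=1)    # SD of pairwise delta-Ct
                 for j in range(ct.shape[1]) if j != i]
          scores.append(np.mean(sds))

      for g, s in sorted(zip(genes, scores), key=lambda t: t[1]):
          print(f"{g}: mean SD of delta-Ct = {s:.2f}")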

  9. GeLC-MRM quantitation of mutant KRAS oncoprotein in complex biological samples.

    PubMed

    Halvey, Patrick J; Ferrone, Cristina R; Liebler, Daniel C

    2012-07-06

    Tumor-derived mutant KRAS (v-Ki-ras-2 Kirsten rat sarcoma viral oncogene) oncoprotein is a critical driver of cancer phenotypes and a potential biomarker for many epithelial cancers. Targeted mass spectrometry analysis by multiple reaction monitoring (MRM) enables selective detection and quantitation of wild-type and mutant KRAS proteins in complex biological samples. A recently described immunoprecipitation approach (Proc. Natl. Acad. Sci. 2011, 108, 2444-2449) can be used to enrich KRAS for MRM analysis, but requires large protein inputs (2-4 mg). Here, we describe sodium dodecyl sulfate-polyacrylamide gel electrophoresis-based enrichment of KRAS in a low molecular weight (20-25 kDa) protein fraction prior to MRM analysis (GeLC-MRM). This approach reduces background proteome complexity, thus allowing mutant KRAS to be reliably quantified in low protein inputs (5-50 μg). GeLC-MRM detected KRAS mutant variants (G12D, G13D, G12V, G12S) in a panel of cancer cell lines. GeLC-MRM analysis of wild-type and mutant KRAS was linear with respect to protein input and showed low variability across process replicates (CV = 14%). Concomitant analysis of a peptide from the highly similar HRAS and NRAS proteins enabled correction of KRAS-targeted measurements for contributions from these other proteins. KRAS peptides were also quantified in fluid from benign pancreatic cysts and pancreatic cancers at concentrations from 0.08 to 1.1 fmol/μg protein. GeLC-MRM provides a robust, sensitive approach to the quantitation of mutant proteins in complex biological samples.

  10. Quantitative Analysis of Science and Chemistry Textbooks for Indicators of Reform: A complementary perspective

    NASA Astrophysics Data System (ADS)

    Kahveci, Ajda

    2010-07-01

    In this study, multiple thematically based and quantitative analysis procedures were utilized to explore the effectiveness of Turkish chemistry and science textbooks in terms of their reflection of reform. The themes of gender equity, questioning level, science vocabulary load, and readability level provided the conceptual framework for the analyses. An unobtrusive research method, content analysis, was used by coding the manifest content and counting the frequency of words, photographs, drawings, and questions by cognitive level. The context was an undergraduate chemistry teacher preparation program at a large public university in a metropolitan area in northwestern Turkey. Forty preservice chemistry teachers were guided to analyze 10 middle school science and 10 high school chemistry textbooks. Overall, the textbooks included unfair gender representations, a considerably higher number of input- and processing-level than output-level questions, and a high load of science terminology. The textbooks failed to provide sufficient empirical evidence to be considered gender-equitable and inquiry-based. The quantitative approach employed for evaluation contrasts with a more interpretive approach, and has the potential to depict textbook profiles in a more reliable way, complementing the commonly employed qualitative procedures. Implications suggest that further work in this line is needed on calibrating the analysis procedures with science textbooks used in different international settings. The procedures could be modified and improved to meet specific evaluation needs. In the Turkish context, next-step research may concern the analysis of science textbooks being rewritten for the reform-based curricula, to make cross-comparisons and evaluate possible progression.

  11. Dose limited reliability of quantitative annular dark field scanning transmission electron microscopy for nano-particle atom-counting.

    PubMed

    De Backer, A; Martinez, G T; MacArthur, K E; Jones, L; Béché, A; Nellist, P D; Van Aert, S

    2015-04-01

    Quantitative annular dark field scanning transmission electron microscopy (ADF STEM) has become a powerful technique for characterising nano-particles on an atomic scale. Because of the limited size and beam sensitivity of such particles, their atomic structure may be extremely challenging to determine, and keeping the incoming electron dose to a minimum is therefore important. However, this may reduce the reliability of quantitative ADF STEM, which is demonstrated here for nano-particle atom-counting. Based on experimental ADF STEM images of a real industrial catalyst, we discuss the limits for counting the number of atoms in a projected atomic column with single-atom sensitivity. We diagnose these limits by combining a thorough statistical method and detailed image simulations. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Quantitative PCR for Detection and Enumeration of Genetic Markers of Bovine Fecal Pollution

    EPA Science Inventory

    Accurate assessment of health risks associated with bovine (cattle) fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for the detection of two recently described cow feces-spec...

  13. Impedance Analysis of Colloidal Gold Nanoparticles in Chromatography Paper for Quantitation of an Immunochromatographic Assay.

    PubMed

    Hori, Fumitaka; Harada, Yuji; Kuretake, Tatsumi; Uno, Shigeyasu

    2016-01-01

    A method for detecting gold nanoparticles in chromatography paper has been developed for simple, cost-effective and reliable quantitation of immunochromatographic strip tests. The time courses of the solution resistance in chromatography paper containing the gold nanoparticle solution are measured electrochemically by chrono-impedimetry. The dependence of the solution resistance on the concentration of gold nanoparticles has been successfully observed. The main factor increasing the solution resistance may be obstruction of ion transport due to the presence of gold nanoparticles. The existence of gold nanoparticles at 1.92 × 10⁹ particles/mL in an indistinctly colored chromatography paper is also identified by a solution resistance measurement. This indicates that the solution resistance assay has the potential to lower the detection limit of the conventional qualitative assay.

  14. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  15. Characterization of potassium dichromate solutions for spectrophotometer calibration

    NASA Astrophysics Data System (ADS)

    Conceição, F. C.; Silva, E. M.; Gomes, J. F. S.; Borges, P. P.

    2018-03-01

    Spectrophotometric analysis in the ultraviolet (UV) region is used in the determination of several quantitative and qualitative parameters. To ensure the reliability of the analyses performed on spectrophotometers, verification/calibration of the equipment must be performed periodically using certified reference materials (CRMs). This work presents the characterization stage needed for producing such a CRM, a potassium dichromate solution. The property value characterized was the absorbance at wavelengths in the UV spectral region. This CRM will contribute to guaranteeing the accuracy and linearity of the absorbance scale of spectrophotometers, through which analytical measurement results will be provided with metrological traceability.

  16. Correlative light-electron fractography for fatigue striations characterization in metallic alloys.

    PubMed

    Hein, Luis Rogerio de Oliveira; de Oliveira, José Alberto; de Campos, Kamila Amato

    2013-09-01

    The correlative light-electron fractography technique combines correlative microscopy concepts with the extended depth-from-focus reconstruction method, associating the reliable topographic information of 3-D maps from ordered light-microscopy Z-stacks with the fine lateral resolution and large depth of focus of scanning electron microscopy. Fatigue striation spacing can be precisely measured by correcting for mean surface tilt using the local elevation data from the elevation maps. This new technique aims to improve the accuracy of quantitative fractography in fatigue fracture investigations. Copyright © 2013 Wiley Periodicals, Inc.

  17. A Validity and Reliability Study of the Attitudes toward Sustainable Development Scale

    ERIC Educational Resources Information Center

    Biasutti, Michele; Frate, Sara

    2017-01-01

    This article describes the development and validation of the Attitudes toward Sustainable Development scale, a quantitative 20-item scale that measures Italian university students' attitudes toward sustainable development. A total of 484 undergraduate students completed the questionnaire. The validity and reliability of the scale was statistically…

  18. Identification and apportionment of hazardous elements in the sediments in the Yangtze River estuary.

    PubMed

    Wang, Jiawei; Liu, Ruimin; Wang, Haotian; Yu, Wenwen; Xu, Fei; Shen, Zhenyao

    2015-12-01

    In this study, positive matrix factorization (PMF) and principal components analysis (PCA) were combined to identify and apportion pollution-based sources of hazardous elements in the surface sediments in the Yangtze River estuary (YRE). Source identification analysis indicated that PC1, including Al, Fe, Mn, Cr, Ni, As, Cu, and Zn, can be defined as a sewage component; PC2, including Pb and Sb, can be considered an atmospheric deposition component; and PC3, containing Cd and Hg, can be considered an agricultural nonpoint component. To better identify the sources and quantitatively apportion the concentrations to their sources, eight sources were identified with PMF: agricultural/industrial sewage mixed (18.6%), mining wastewater (15.9%), agricultural fertilizer (14.5%), atmospheric deposition (12.8%), agricultural nonpoint (10.6%), industrial wastewater (9.8%), marine activity (9.0%), and nickel plating industry (8.8%). Overall, the hazardous element content appears to be connected more to anthropogenic activity than to natural sources. The PCA results laid the foundation for the PMF analysis by providing a general classification of sources, while PMF resolved more factors with a higher explained variance than PCA and provided both the source analysis and the quantitative apportionment. The combination of the two methods can provide more reasonable and reliable results.
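
    PMF factorizes the samples-by-elements concentration matrix into non-negative source contributions and source profiles. As a rough stand-in, the sketch below uses scikit-learn's NMF to show the shape of the computation; true PMF additionally weights each data point by its measurement uncertainty, which plain NMF does not, and the matrix here is random toy data.

      # Non-negative factorization of a concentrations matrix (PMF-like sketch).
      # Real PMF weights residuals by per-point uncertainties; sklearn NMF does not.
      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(0)
      X = rng.uniform(0.1, 5.0, size=(60, 12))   # 60 sediment samples x 12 elements (toy)

      model = NMF(n_components=8, init="nndsvda", max_iter=500, random_state=0)
      W = model.fit_transform(X)                 # sample-wise source contributions
      H = model.components_                      # source profiles (elements per factor)

      # Percent contribution of each factor to the total reconstructed mass.
      factor_mass = (W[:, :, None] * H[None, :, :]).sum(axis=(0, 2))
      print(np.round(100 * factor_mass / factor_mass.sum(), 1))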

  19. [Quality evaluation of Artemisiae Argyi Folium based on fingerprint analysis and quantitative analysis of multicomponents].

    PubMed

    Guo, Long; Jiao, Qian; Zhang, Dan; Liu, Ai-Peng; Wang, Qian; Zheng, Yu-Guang

    2018-03-01

    Artemisiae Argyi Folium, the dried leaves of Artemisia argyi, has been widely used in traditional Chinese and folk medicines for the treatment of hemorrhage, pain, and skin itch. Phytochemical studies indicate that volatile oils, organic acids and flavonoids are the main bioactive components of Artemisiae Argyi Folium. Compared with the volatile compounds, research on the nonvolatile compounds in Artemisiae Argyi Folium is limited. In the present study, an accurate and reliable fingerprint approach was developed using HPLC for the quality control of Artemisiae Argyi Folium. A total of 10 common peaks were marked, and the similarity of all the Artemisiae Argyi Folium samples was above 0.940. The established fingerprint method could be used for quality control of Artemisiae Argyi Folium. Furthermore, an HPLC method was applied for the simultaneous determination of seven bioactive compounds, including five organic acids and two flavonoids, in Artemisiae Argyi Folium and Artemisiae Lavandulaefoliae Folium samples. Moreover, chemometric methods such as hierarchical clustering analysis and principal component analysis were performed to compare and discriminate the Artemisiae Argyi Folium and Artemisiae Lavandulaefoliae Folium samples based on the quantitative data of the analytes. The results indicated that simultaneous quantification of multiple components coupled with chemometric analysis could be a well-suited strategy for identifying and evaluating the quality of Artemisiae Argyi Folium. Copyright© by the Chinese Pharmaceutical Association.
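
    The similarity values quoted above (all samples above 0.940) are typically congruence (cosine) coefficients between each sample's chromatographic fingerprint and a reference fingerprint. A minimal sketch with invented peak-area vectors for the 10 common peaks:

      # Cosine similarity between a sample fingerprint and a reference fingerprint.
      import numpy as np

      def fingerprint_similarity(sample: np.ndarray, reference: np.ndarray) -> float:
          return float(np.dot(sample, reference) /
                       (np.linalg.norm(sample) * np.linalg.norm(reference)))

      # Toy peak areas for the 10 common peaks (invented values).
      reference = np.array([1.0, 0.4, 2.2, 0.9, 1.5, 0.3, 0.8, 1.1, 0.6, 0.2])
      sample = np.array([1.1, 0.5, 2.0, 0.8, 1.6, 0.3, 0.7, 1.2, 0.5, 0.2])

      print(f"similarity = {fingerprint_similarity(sample, reference):.3f}")  # ~0.99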

  20. [HPLC fingerprint of flavonoids in Sophora flavescens and determination of five components].

    PubMed

    Ma, Hong-Yan; Zhou, Wan-Shan; Chu, Fu-Jiang; Wang, Dong; Liang, Sheng-Wang; Li, Shao

    2013-08-01

    A simple and reliable method using high-performance liquid chromatography with photodiode array detection (HPLC-DAD) was developed to evaluate the quality of the traditional Chinese medicine Sophora flavescens by establishing a chromatographic fingerprint and simultaneously determining five flavonoids: trifolirhizin, maackiain, kushenol I, kurarinone and sophoraflavanone G. The optimal conditions for separation and detection were achieved on an ULTIMATE XB-C18 column (4.6 mm x 250 mm, 5 microm) with a gradient of acetonitrile and water, detected at 295 nm. In the chromatographic fingerprint, 13 peaks were selected as characteristic peaks to assess the similarities of samples collected from different origins in China, using the similarity evaluation system for chromatographic fingerprints of traditional Chinese medicine (2004AB); principal component analysis (PCA) was used in the data analysis. There were significant differences in the fingerprint chromatograms between S. flavescens and S. tonkinensis. Principal component analysis showed that kurarinone and sophoraflavanone G were the most important components. In the quantitative analysis, the five components showed good linearity (R > 0.999) within the test ranges, and their recoveries were in the range of 96.3%-102.3%. This study indicates that the combination of quantitative analysis and chromatographic fingerprinting can be readily utilized as a quality control method for S. flavescens and its related traditional Chinese medicinal preparations.

  1. Utilization of wireless structural health monitoring as decision making tools for a condition and reliability-based assessment of railroad bridges

    NASA Astrophysics Data System (ADS)

    Flanigan, Katherine A.; Johnson, Nephi R.; Hou, Rui; Ettouney, Mohammed; Lynch, Jerome P.

    2017-04-01

    The ability to quantitatively assess the condition of railroad bridges facilitates objective evaluation of their robustness in the face of hazard events. Of particular importance is the need to assess the condition of railroad bridges in networks that are exposed to multiple hazards. Data collected from structural health monitoring (SHM) can be used to better maintain a structure by prompting preventative (rather than reactive) maintenance strategies and supplying quantitative information to aid in recovery. To that end, a wireless monitoring system is validated and installed on the Harahan Bridge, a hundred-year-old long-span railroad truss bridge that crosses the Mississippi River near Memphis, TN. This bridge is exposed to multiple hazards including scour, vehicle/barge impact, seismic activity, and aging. The instrumented sensing system targets non-redundant structural components and areas of the truss and floor system that bridge managers are most concerned about, based on previous inspections and structural analysis. This paper details the monitoring system and the analytical method for the assessment of bridge condition based on automated data-driven analyses. Two primary objectives of monitoring the system performance are discussed: 1) monitoring fatigue accumulation in critical tensile truss elements; and 2) monitoring the reliability index values associated with sub-system limit states of these members. Moreover, since the reliability index is a scalar indicator of the safety of components, quantifiable condition assessment can be used as an objective metric by which bridge owners can devise informed damage-mitigation strategies and optimize resource management at the single-bridge or network level.
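
    The reliability index referred to above is commonly the first-order index beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2) for a linear limit state g = R - S (resistance minus load effect), with failure probability P_f = Phi(-beta) under normality. A sketch with invented member statistics (the record does not give the bridge's actual limit-state formulations or distributions):

      # First-order reliability index for a linear limit state g = R - S.
      import numpy as np
      from scipy.stats import norm

      mu_R, sigma_R = 520.0, 45.0   # member resistance, e.g. kN (hypothetical)
      mu_S, sigma_S = 310.0, 60.0   # load effect from monitoring data (hypothetical)

      beta = (mu_R - mu_S) / np.hypot(sigma_R, sigma_S)
      p_f = norm.cdf(-beta)          # failure probability if R and S are normal

      print(f"beta = {beta:.2f}, Pf = {p_f:.2e}")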

  2. Reliability and Probabilistic Risk Assessment - How They Play Together

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal; Stutts, Richard; Huang, Zhaofeng

    2015-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has extensively used probabilistic analysis methods to assess, understand, and communicate the risk of space launch vehicles. Probabilistic Risk Assessment (PRA), used in the nuclear industry, is one of the probabilistic analysis methods NASA utilizes to assess Loss of Mission (LOM) and Loss of Crew (LOC) risk for launch vehicles. PRA is a system-scenario-based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability distributions to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: 1) what can go wrong that would lead to loss or degraded performance (i.e., scenarios involving undesired consequences of interest), 2) how likely is it (probabilities), and 3) what is the severity of the degradation (consequences). Since the Challenger accident, PRA has been used in supporting decisions regarding safety upgrades for launch vehicles. Another area that was given a lot of emphasis at NASA after the Challenger accident is reliability engineering. Reliability engineering has been a critical design function at NASA since the early Apollo days. However, after the Challenger accident, quantitative reliability analysis and reliability predictions were given more scrutiny because of their importance in understanding failure mechanisms and quantifying the probability of failure, which are key elements in resolving technical issues, performing design trades, and implementing design improvements. Although PRA and reliability are both probabilistic in nature and, in some cases, use the same tools, they are two different activities. Specifically, reliability engineering is a broad design discipline that deals with loss of function and helps in understanding failure mechanisms and improving component and system design. PRA is a system-scenario-based risk assessment process intended to assess the risk scenarios that could lead to a major/top undesirable system event, and to identify those scenarios that are high-risk drivers. PRA output is critical to support risk-informed decisions concerning system design. This paper describes the PRA process and the reliability engineering discipline in detail, discusses their differences and similarities, and shows how they work together as complementary analyses to support the design and risk assessment processes. Lessons learned, applications, and case studies in both areas are also discussed in the paper to demonstrate and explain these differences and similarities.
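
    As a toy illustration of the fault-tree arithmetic that PRA builds on, the sketch below combines independent basic-event probabilities through AND/OR gates into a top-event probability; the events and probabilities are invented and do not represent any NASA analysis.

      # Top-event probability for a small fault tree with independent basic events.
      from math import prod

      def and_gate(probs):                   # all inputs must fail
          return prod(probs)

      def or_gate(probs):                    # at least one input fails
          return 1.0 - prod(1.0 - p for p in probs)

      # Hypothetical: loss of thrust if (pump A AND pump B fail) OR controller fails.
      p_pump_a, p_pump_b, p_ctrl = 1e-3, 1e-3, 5e-5
      p_top = or_gate([and_gate([p_pump_a, p_pump_b]), p_ctrl])
      print(f"P(top event) = {p_top:.2e}")   # ~5.1e-05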

  3. A Case Study on Improving Intensive Care Unit (ICU) Services Reliability: By Using Process Failure Mode and Effects Analysis (PFMEA)

    PubMed Central

    Yousefinezhadi, Taraneh; Jannesar Nobari, Farnaz Attar; Goodari, Faranak Behzadi; Arab, Mohammad

    2016-01-01

    Introduction: In any complex human system, human error is inevitable and cannot be eliminated simply by blaming wrongdoers. With the aim of improving Intensive Care Unit (ICU) reliability in hospitals, this research therefore seeks to identify and analyze ICU process failure modes from the standpoint of a systematic approach to errors. Methods: In this descriptive study, data were gathered qualitatively by observations, document reviews, and Focus Group Discussions (FGDs) with the process owners in two selected ICUs in Tehran in 2014. Data analysis, however, was quantitative, based on each failure's Risk Priority Number (RPN) as defined by the Failure Modes and Effects Analysis (FMEA) method. In addition, some causes of failures were analyzed using the qualitative Eindhoven Classification Model (ECM). Results: Through the FMEA methodology, 378 potential failure modes from 180 ICU activities in hospital A and 184 potential failure modes from 99 ICU activities in hospital B were identified and evaluated. Using RPN ≥ 100 (corresponding to 90% reliability) as the cut-off, 18 failures in hospital A and 42 in hospital B were identified as non-acceptable risks, and their causes were then analyzed with ECM. Conclusions: Applying the modified PFMEA to improve the reliability of the processes of two selected ICUs in two different kinds of hospitals shows that this method empowers staff to identify, evaluate, prioritize and analyze all potential failure modes, and also makes them eager to identify causes, recommend corrective actions and even participate in improving the process without feeling blamed by top management. Moreover, by combining FMEA and ECM, team members can readily identify failure causes from a health care perspective. PMID:27157162
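
    The RPN used above is the product of severity, occurrence, and detection scores, each conventionally rated 1-10, with RPN ≥ 100 taken in this study as the cut-off for non-acceptable risk. A minimal sketch with invented failure modes:

      # Risk Priority Number screening for FMEA (failure modes and scores invented).
      failure_modes = [
          # (description, severity, occurrence, detection), each scored 1-10
          ("IV pump mis-programmed", 8, 4, 5),
          ("ventilator alarm silenced", 9, 2, 3),
          ("chart entry delayed", 3, 6, 2),
      ]

      THRESHOLD = 100  # study's non-acceptance cut-off

      for name, sev, occ, det in failure_modes:
          rpn = sev * occ * det
          flag = "NON-ACCEPTABLE" if rpn >= THRESHOLD else "acceptable"
          print(f"{name}: RPN = {rpn} ({flag})")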

  4. Inference for Stochastic Chemical Kinetics Using Moment Equations and System Size Expansion.

    PubMed

    Fröhlich, Fabian; Thomas, Philipp; Kazeroonian, Atefeh; Theis, Fabian J; Grima, Ramon; Hasenauer, Jan

    2016-07-01

    Quantitative mechanistic models are valuable tools for disentangling biochemical pathways and for achieving a comprehensive understanding of biological systems. However, to be quantitative the parameters of these models have to be estimated from experimental data. In the presence of significant stochastic fluctuations this is a challenging task as stochastic simulations are usually too time-consuming and a macroscopic description using reaction rate equations (RREs) is no longer accurate. In this manuscript, we therefore consider moment-closure approximation (MA) and the system size expansion (SSE), which approximate the statistical moments of stochastic processes and tend to be more precise than macroscopic descriptions. We introduce gradient-based parameter optimization methods and uncertainty analysis methods for MA and SSE. Efficiency and reliability of the methods are assessed using simulation examples as well as by an application to data for Epo-induced JAK/STAT signaling. The application revealed that even if merely population-average data are available, MA and SSE improve parameter identifiability in comparison to RRE. Furthermore, the simulation examples revealed that the resulting estimates are more reliable for an intermediate volume regime. In this regime the estimation error is reduced and we propose methods to determine the regime boundaries. These results illustrate that inference using MA and SSE is feasible and possesses a high sensitivity.

  5. Inference for Stochastic Chemical Kinetics Using Moment Equations and System Size Expansion

    PubMed Central

    Thomas, Philipp; Kazeroonian, Atefeh; Theis, Fabian J.; Grima, Ramon; Hasenauer, Jan

    2016-01-01

    Quantitative mechanistic models are valuable tools for disentangling biochemical pathways and for achieving a comprehensive understanding of biological systems. However, to be quantitative the parameters of these models have to be estimated from experimental data. In the presence of significant stochastic fluctuations this is a challenging task as stochastic simulations are usually too time-consuming and a macroscopic description using reaction rate equations (RREs) is no longer accurate. In this manuscript, we therefore consider moment-closure approximation (MA) and the system size expansion (SSE), which approximate the statistical moments of stochastic processes and tend to be more precise than macroscopic descriptions. We introduce gradient-based parameter optimization methods and uncertainty analysis methods for MA and SSE. Efficiency and reliability of the methods are assessed using simulation examples as well as by an application to data for Epo-induced JAK/STAT signaling. The application revealed that even if merely population-average data are available, MA and SSE improve parameter identifiability in comparison to RRE. Furthermore, the simulation examples revealed that the resulting estimates are more reliable for an intermediate volume regime. In this regime the estimation error is reduced and we propose methods to determine the regime boundaries. These results illustrate that inference using MA and SSE is feasible and possesses a high sensitivity. PMID:27447730

  6. Smartphone based visual and quantitative assays on upconversional paper sensor.

    PubMed

    Mei, Qingsong; Jing, Huarong; Li, You; Yisibashaer, Wuerzha; Chen, Jian; Nan Li, Bing; Zhang, Yong

    2016-01-15

    The integration of smartphones with paper sensors has recently gained increasing attention because it enables rapid, quantitative analysis. However, smartphone-based upconversional paper sensors have been restricted by the lack of effective methods for acquiring luminescence signals on test paper. Herein, by virtue of 3D printing technology, we developed an auxiliary reusable device, which assembles a 980 nm mini-laser, an optical filter and a mini-cavity together, for digitally imaging the luminescence variations on test paper and quantitatively analyzing the pesticide thiram with a smartphone. In detail, copper-ion-decorated NaYF4:Yb/Tm upconversion nanoparticles were fixed onto filter paper to form the test paper, and the blue luminescence was quenched upon addition of thiram through a luminescence resonance energy transfer mechanism. These variations could be monitored by the smartphone camera, and the blue-channel intensities of the obtained color images were then calculated to quantify the amount of thiram through a self-written Android program installed on the smartphone, offering a reliable and accurate detection limit of 0.1 μM for the system. This work provides an initial demonstration of integrating upconversion nanosensors with smartphone digital imaging for point-of-care analysis on a paper-based platform. Copyright © 2015 Elsevier B.V. All rights reserved.
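
    The read-out step reduces to extracting the mean blue-channel intensity of the spot region and mapping it to concentration via a calibration curve. A minimal sketch (the image array, spot coordinates, and calibration constants are synthetic stand-ins; the study used a self-written Android program rather than Python):

      # Mean blue-channel intensity of a test-spot region (synthetic stand-in
      # for a phone photo; in practice the RGB array comes from the camera).
      import numpy as np

      rng = np.random.default_rng(1)
      img = rng.normal(180.0, 5.0, size=(240, 320, 3))   # H x W x RGB (toy)
      spot_blue = img[120:180, 140:200, 2]               # blue channel of spot ROI
      blue_mean = spot_blue.mean()

      # Hypothetical linear quenching calibration: intensity vs. thiram (uM).
      I0, slope = 215.0, 180.0                           # blank intensity, counts/uM
      conc_um = (I0 - blue_mean) / slope
      print(f"blue = {blue_mean:.1f}, thiram ~ {conc_um:.2f} uM")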

  7. Integrating Genomic Analysis with the Genetic Basis of Gene Expression: Preliminary Evidence of the Identification of Causal Genes for Cardiovascular and Metabolic Traits Related to Nutrition in Mexicans

    PubMed Central

    Bastarrachea, Raúl A.; Gallegos-Cabriales, Esther C.; Nava-González, Edna J.; Haack, Karin; Voruganti, V. Saroja; Charlesworth, Jac; Laviada-Molina, Hugo A.; Veloz-Garza, Rosa A.; Cardenas-Villarreal, Velia Margarita; Valdovinos-Chavez, Salvador B.; Gomez-Aguilar, Patricia; Meléndez, Guillermo; López-Alvarenga, Juan Carlos; Göring, Harald H. H.; Cole, Shelley A.; Blangero, John; Comuzzie, Anthony G.; Kent, Jack W.

    2012-01-01

    Whole-transcriptome expression profiling provides novel phenotypes for the analysis of complex traits. Gene expression measurements reflect quantitative variation in transcript-specific messenger RNA levels and represent phenotypes lying close to the action of genes. Understanding the genetic basis of gene expression will provide insight into the processes that connect genotype to clinically significant traits, representing a central tenet of systems biology. Synchronous in vivo expression profiles of lymphocytes, muscle, and subcutaneous fat were obtained from healthy Mexican men. Most genes were expressed at detectable levels in multiple tissues, and RNA levels were correlated between tissue types. A subset of transcripts with high reliability of expression across tissues (estimated by intraclass correlation coefficients) was enriched for cis-regulated genes, suggesting that proximal sequence variants may influence expression similarly in different cellular environments. This integrative global gene expression profiling approach is proving extremely useful for identifying genes and pathways that contribute to complex clinical traits. Clearly, the coincidence of clinical trait quantitative trait loci and expression quantitative trait loci can help in the prioritization of positional candidate genes. Such data will be crucial for the formal integration of positional and transcriptomic information, characterized as genetical genomics. PMID:22797999

  8. Simultaneous determination of eight major steroids from Polyporus umbellatus by high-performance liquid chromatography coupled with mass spectrometry detections.

    PubMed

    Zhao, Ying-yong; Cheng, Xian-long; Zhang, Yongmin; Zhao, Ye; Lin, Rui-chao; Sun, Wen-ji

    2010-02-01

    Polyporus umbellatus is a widely used diuretic herbal medicine. In this study, a high-performance liquid chromatography method coupled with atmospheric pressure chemical ionization mass spectrometric detection (HPLC-APCI-MS) was developed for the qualitative and quantitative analysis of steroids, as well as for the quality control of Polyporus umbellatus. The selectivity, reproducibility and sensitivity were compared with those of HPLC with photodiode array detection and with evaporative light scattering detection (ELSD). Selected ion monitoring in positive mode was used for qualitative and quantitative analysis of the eight major components, with beta-ecdysterone as the internal standard. Limits of detection and quantification fell in the ranges 7-21 and 18-63 ng/mL, respectively, for the eight analytes with a 10 μL injection, and all calibration curves showed good linear regression (r² > 0.9919) within the test range. The quantitative results demonstrated that samples from different localities differed in quality. The advantages over conventional HPLC-diode array detection and HPLC-ELSD are that reliable identification of target compounds can be achieved by accurate mass measurements along with characteristic retention times, and that the great enhancement in selectivity and sensitivity allows identification and quantification of low levels of constituents in complex Polyporus umbellatus matrices. (c) 2009 John Wiley & Sons, Ltd.

  9. Comparison of detection limits in environmental analysis--is it possible? An approach on quality assurance in the lower working range by verification.

    PubMed

    Geiss, S; Einax, J W

    2001-07-01

    Detection limit, reporting limit and limit of quantitation are analytical parameters that describe the power of analytical methods. These parameters are used internally for quality assurance and externally for comparison between laboratories, especially in the case of trace analysis in environmental compartments. The wide variety of possibilities for computing or obtaining these measures in the literature and in legislative rules makes any comparison difficult. Additionally, a host of terms have been used within the analytical community to describe detection and quantitation capabilities. Without trying to impose an order on this variety of terms, this paper aims to provide a practical proposal for answering the main questions analysts face concerning the quality measures above. These main questions and the related parameters are explained and graphically demonstrated. Estimation and verification of these parameters are the two steps required to obtain realistic measures. A rule for practical verification is given in a table, from which the analyst can read what to measure, what to estimate and which criteria have to be fulfilled. Verified in this manner, the parameters detection limit, reporting limit and limit of quantitation become comparable, and the analyst is responsible for the unambiguity and reliability of these measures.
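
    For orientation, one widely used way to estimate these parameters derives them from a calibration line (the ICH-style 3.3σ and 10σ convention); the paper itself argues that such estimates should then be verified. The concentrations and signals below are invented:

      # Calibration-based LOD/LOQ estimate (3.3*sigma/slope and 10*sigma/slope).
      import numpy as np

      conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # standards, e.g. ug/L (toy)
      signal = np.array([0.9, 2.1, 4.0, 8.3, 16.1])     # instrument response (toy)

      slope, intercept = np.polyfit(conc, signal, 1)
      residuals = signal - (slope * conc + intercept)
      sigma = np.std(residuals, ddof=2)                  # SD of regression residuals

      lod = 3.3 * sigma / slope    # detection limit estimate
      loq = 10.0 * sigma / slope   # limit of quantitation estimate
      print(f"LOD ~ {lod:.2f}, LOQ ~ {loq:.2f} (same units as conc)")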

  10. Evaluation of different derivatisation approaches for gas chromatographic-mass spectrometric analysis of carbohydrates in complex matrices of biological and synthetic origin.

    PubMed

    Becker, M; Zweckmair, T; Forneck, A; Rosenau, T; Potthast, A; Liebner, F

    2013-03-15

    Gas chromatographic analysis of complex carbohydrate mixtures requires highly effective and reliable derivatisation strategies for successful separation, identification, and quantitation of all constituents. Different single-step (per-trimethylsilylation, isopropylidenation) and two-step approaches (ethoximation-trimethylsilylation, ethoximation-trifluoroacetylation, benzoximation-trimethylsilylation, benzoximation-trifluoroacetylation) have been comprehensively studied with regard to chromatographic characteristics, informational value of mass spectra, ease of peak assignment, robustness toward matrix effects, and quantitation, using a set of reference compounds comprising eight monosaccharides (C₅-C₆), glycolaldehyde, and dihydroxyacetone. It has been shown that isopropylidenation and the two oximation-trifluoroacetylation approaches are least suitable for complex carbohydrate matrices. Whereas the former is limited to compounds that contain vicinal dihydroxy moieties in cis configuration, the latter two methods are sensitive to traces of trifluoroacetic acid, which strongly promotes decomposition of ketohexoses. It has been demonstrated for two "real" carbohydrate-rich matrices, of biological and synthetic origin respectively, that two-step ethoximation-trimethylsilylation is superior to the other approaches due to the low number of peaks obtained per carbohydrate, good peak separation performance, the structural information of the mass spectra, low limits of detection and quantitation, minor relative standard deviations, and low sensitivity toward matrix effects. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Preserved pontine glucose metabolism in Alzheimer disease: A reference region for functional brain image (PET) analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minoshima, Satoshi; Frey, K.A.; Foster, N.L.

    1995-07-01

    Our goal was to examine regional preservation of energy metabolism in Alzheimer disease (AD) and to evaluate effects of PET data normalization to reference regions. Regional metabolic rates in the pons, thalamus, putamen, sensorimotor cortex, visual cortex, and cerebellum (reference regions) were determined stereotaxically and examined in 37 patients with probable AD and 22 normal controls based on quantitative ¹⁸FDG-PET measurements. Following normalization of metabolic rates of the parietotemporal association cortex and whole brain to each reference region, distinctions between the two groups were assessed. The pons showed the best preservation of glucose metabolism in AD. Other reference regions showed relatively preserved metabolism compared with the parietotemporal association cortex and whole brain, but had significant metabolic reduction. Data normalization to the pons not only enhanced statistical significance of metabolic reduction in the parietotemporal association cortex, but also preserved the presence of global cerebral metabolic reduction indicated in analysis of the quantitative data. Energy metabolism in the pons in probable AD is well preserved. The pons is a reliable reference for data normalization and will enhance diagnostic accuracy and efficiency of quantitative and nonquantitative functional brain imaging. 39 refs., 2 figs., 3 tabs.

  12. Reliability techniques in the petroleum industry

    NASA Technical Reports Server (NTRS)

    Williams, H. L.

    1971-01-01

    Quantitative reliability evaluation methods used in the Apollo Spacecraft Program are translated into petroleum industry requirements with emphasis on offsetting reliability demonstration costs and limited production runs. Described are the qualitative disciplines applicable, the definitions and criteria that accompany the disciplines, and the generic application of these disciplines to the chemical industry. The disciplines are then translated into proposed definitions and criteria for the industry, into a base-line reliability plan that includes these disciplines, and into application notes to aid in adapting the base-line plan to a specific operation.

  13. Study on evaluation of construction reliability for engineering project based on fuzzy language operator

    NASA Astrophysics Data System (ADS)

    Shi, Yu-Fang; Ma, Yi-Yi; Song, Ping-Ping

    2018-03-01

    System reliability theory has been a research hotspot in management science and systems engineering in recent years, and construction reliability is useful for quantitatively evaluating the level of project management. Based on reliability theory and the target system of engineering project management, a definition of construction reliability is given. Drawing on fuzzy mathematics and linguistic operators, the value space of construction reliability is divided into seven fuzzy subsets; correspondingly, seven membership functions and fuzzy evaluation intervals are obtained through the operation of the linguistic operators, which provides the method and parameters for evaluating construction reliability. The method is shown to be scientific and reasonable for construction conditions and is a useful contribution to the theory and methodology of engineering project system reliability.
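
    As an illustration of dividing a [0, 1] value space into linguistic fuzzy subsets, the sketch below builds seven overlapping triangular membership functions and evaluates the memberships of a computed reliability value; the paper's actual membership functions are derived from linguistic operators and need not be triangular, and the labels below are invented.

      # Seven triangular fuzzy subsets over the [0, 1] reliability value space.
      import numpy as np

      def tri_mf(x: float, a: float, b: float, c: float) -> float:
          """Triangular membership with feet a, c and peak b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      labels = ["very low", "low", "fairly low", "medium",
                "fairly high", "high", "very high"]    # hypothetical linguistic levels
      peaks = np.linspace(0.0, 1.0, 7)                 # evenly spaced peaks
      width = peaks[1] - peaks[0]

      reliability = 0.78                               # hypothetical computed value
      for label, b in zip(labels, peaks):
          mu = tri_mf(reliability, b - width, b, b + width)
          if mu > 0:
              print(f"{label}: membership = {mu:.2f}")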

  14. Electric system restructuring and system reliability

    NASA Astrophysics Data System (ADS)

    Horiuchi, Catherine Miller

    In 1996 the California legislature passed AB 1890, explicitly defining economic benefits and detailing specific mechanisms for initiating a partial restructuring of the state's electric system. Critics have since sought re-regulation and proponents have asked for patience as the new institutions and markets take shape. Other states' electric system restructuring activities have been tempered by real and perceived problems in the California model. This study examines the reduced regulatory controls and new constraints introduced in California's limited restructuring model, using utility and regulatory agency records from the 1990s to investigate the effects of the new institutions and practices on system reliability for the state's five largest public and private utilities. Logit and negative binomial regressions indicate a negative impact of the California model of restructuring on system reliability as measured by customer interruptions. Time series analysis of outage data could not predict the wholesale power market collapse and the subsequent rolling blackouts in early 2001; inclusion of near-outage reliability disturbances (load shedding and energy emergencies) provided a measure of forewarning. Analysis of system disruptions, generation capacity and demand, and the role of purchased power challenges conventional wisdom on the causes of California's power problems. The quantitative analysis was supplemented by a targeted survey of electric system restructuring participants. Findings suggest each utility and the organization controlling the state's electric grid provided protection from power outages comparable to pre-restructuring operations through 2000; however, this reliability has come at an inflated cost, resulting in reduced system purchases and decreased marginal protection. The historic margin of operating safety has fully eroded, increasing mandatory load shedding and emergency declarations for voluntary and mandatory conservation. Proposed remedies focused on state-funded contracts and government-managed power authorities may not help, as the findings suggest pricing models, market uncertainty, interjurisdictional conflict and an inability to respond to market perturbations are more significant contributors to reduced regional generation availability than the particular contract mechanisms and funding sources used for power purchases.

  15. A Comparison of Protein Extraction Methods Suitable for Gel-Based Proteomic Studies of Aphid Proteins

    PubMed Central

    Cilia, M.; Fish, T.; Yang, X.; Mclaughlin, M.; Thannhauser, T. W.

    2009-01-01

    Protein extraction methods can vary widely in reproducibility and in representation of the total proteome, yet there are limited data comparing protein isolation methods. The methodical comparison of protein isolation methods is the first critical step for proteomic studies. To address this, we compared three methods for isolation, purification, and solubilization of insect proteins. The aphid Schizaphis graminum, an agricultural pest, was the source of insect tissue. Proteins were extracted using TCA in acetone (TCA-acetone), phenol, or multi-detergents in a chaotrope solution. Extracted proteins were solubilized in a multiple chaotrope solution and examined using 1-D and 2-D electrophoresis and compared directly using 2-D Difference Gel Electrophoresis (2-D DIGE). Mass spectrometry was used to identify proteins from each extraction type. We were unable to ascribe the differences in the proteins extracted to particular physical characteristics, cell location, or biological function. The TCA-acetone extraction yielded the greatest amount of protein from aphid tissues. Each extraction method isolated a unique subset of the aphid proteome. The TCA-acetone method was explored further for its quantitative reliability using 2-D DIGE. Principal component analysis showed that little of the variation in the data was a result of technical issues, thus demonstrating that the TCA-acetone extraction is a reliable method for preparing aphid proteins for a quantitative proteomics experiment. These data suggest that although the TCA-acetone method is a suitable method for quantitative aphid proteomics, a combination of extraction approaches is recommended for increasing proteome coverage when using gel-based separation techniques. PMID:19721822

  16. Initial description of a quantitative, cross-species (chimpanzee-human) social responsiveness measure.

    PubMed

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E; Constantino, John N; Povinelli, Daniel J; Pruett, John R

    2011-05-01

    Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species (human-chimpanzee) social responsiveness measure. We translated the Social Responsiveness Scale (SRS), an instrument that quantifies human social responsiveness, into an analogous instrument for chimpanzees. We then retranslated this "Chimpanzee SRS" into a human "Cross-Species SRS" (XSRS). We evaluated three groups of chimpanzees (n = 29) with the Chimpanzee SRS, and typical children and children with autism spectrum disorder (ASD; n = 20) with the XSRS. The Chimpanzee SRS demonstrated strong interrater reliability at the three sites (ranges for individual ICCs: 0.534 to 0.866; mean ICCs: 0.851 to 0.970). As has been observed in human beings, exploratory principal components analysis of Chimpanzee SRS scores supports a single factor underlying chimpanzee social responsiveness. Human subjects' XSRS scores were fully concordant with their SRS scores (r = 0.976, p = .001) and distinguished appropriately between typical and ASD subjects. One chimpanzee known for inappropriate social behavior displayed a significantly higher score than all other chimpanzees at its site, demonstrating the scale's ability to detect impaired social responsiveness in chimpanzees. Our initial cross-species social responsiveness scale proved reliable and discriminated differences in social responsiveness across (in a relative sense) and within (in a more objectively quantifiable manner) human beings and chimpanzees. Copyright © 2011 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  17. Quantitative Decision Making.

    ERIC Educational Resources Information Center

    Baldwin, Grover H.

    The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…

  18. 76 FR 12140 - Agency Information Collection Activities: Proposed Collection; Comment Request; Generic Clearance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-04

    ... provides useful insights on perceptions and opinions, but are not statistical surveys that yield quantitative results that can be generalized to the population of study. This feedback will provide insights... used for quantitative information collections that are designed to yield reliably actionable results...

  19. 76 FR 37620 - Risk-Based Capital Standards: Advanced Capital Adequacy Framework-Basel II; Establishment of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-28

    ... systems. E. Quantitative Methods for Comparing Capital Frameworks The NPR sought comment on how the... industry while assessing levels of capital. This commenter points out maintaining reliable comparative data over time could make quantitative methods for this purpose difficult. For example, evaluating asset...

  20. Influence of echo time in quantitative proton MR spectroscopy using LCModel.

    PubMed

    Yamamoto, Tetsuya; Isobe, Tomonori; Akutsu, Hiroyoshi; Masumoto, Tomohiko; Ando, Hiroki; Sato, Eisuke; Takada, Kenta; Anno, Izumi; Matsumura, Akira

    2015-06-01

    The objective of this study was to elucidate the influence on quantitative analysis using LCModel when the echo time (TE) is longer than the values recommended in the spectrum-acquisition specifications. A 3T magnetic resonance system was used to perform proton magnetic resonance spectroscopy. The participants were 5 healthy volunteers and 11 patients with glioma. Data were collected at TEs of 72, 144 and 288 ms. LCModel was used to quantify several metabolites (N-acetylaspartate, creatine and phosphocreatine, and choline-containing compounds). The results were compared with quantitative values obtained using the T2-corrected internal reference method. In healthy volunteers, when TE was long, the quantitative values obtained using LCModel were up to 6.8-fold larger (p<0.05) than those obtained using the T2-corrected internal reference method. The ratios of the quantitative values obtained by the two methods differed between metabolites (p<0.05). In patients with glioma, the ratios of the quantitative values obtained by the two methods tended to be larger at longer TE, as in the healthy volunteers, and large between-individual variation in the ratios was observed. In clinical practice, TE is sometimes set longer than the value recommended for LCModel. If TE is long, LCModel overestimates the quantitative value, since it cannot compensate for signal attenuation, and this effect differs for each metabolite and condition. Therefore, if TE is longer than recommended, it is necessary to account for the possibly reduced reliability of quantitative values calculated using LCModel. Copyright © 2015 Elsevier Inc. All rights reserved.
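
    The effect described above arises because the measured signal decays as S(TE) = S₀·exp(−TE/T₂) and LCModel does not apply this correction itself. A sketch of the T₂ correction and the attenuation incurred at a given TE (the T₂ value and concentrations below are placeholders, not the study's measurements):

      # T2 relaxation correction for an MRS metabolite signal (illustrative values).
      import numpy as np

      TE_ms = 144.0                 # echo time used for acquisition
      T2_met_ms = 250.0             # assumed metabolite T2 (placeholder)

      measured = 4.2                # apparent value from the fit (toy number)
      corrected = measured * np.exp(TE_ms / T2_met_ms)   # undo exp(-TE/T2) decay

      attenuation = np.exp(-TE_ms / T2_met_ms)
      print(f"signal retained at TE: {attenuation:.2f}")
      print(f"T2-corrected value: {corrected:.2f} (uncorrected: {measured})")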

  1. Augmenting Amyloid PET Interpretations With Quantitative Information Improves Consistency of Early Amyloid Detection.

    PubMed

    Harn, Nicholas R; Hunt, Suzanne L; Hill, Jacqueline; Vidoni, Eric; Perry, Mark; Burns, Jeffrey M

    2017-08-01

    Establishing reliable methods for interpreting elevated cerebral amyloid-β plaque on PET scans is increasingly important for radiologists as the availability of PET imaging in clinical practice increases. We examined a 3-step method to detect plaque in cognitively normal older adults, focusing on the additive value of quantitative information during the PET scan interpretation process. Fifty-five ¹⁸F-florbetapir PET scans were evaluated by 3 experienced raters. Scans were first visually interpreted as having "elevated" or "nonelevated" plaque burden ("Visual Read"). Images were then processed using standardized quantitative analysis software (MIMneuro) to generate whole-brain and region-of-interest SUV ratios. This "Quantitative Read" was considered elevated if at least 2 of 6 regions of interest had an SUV ratio of more than 1.1. The final interpretation combined both visual and quantitative data together ("VisQ Read"). Cohen kappa values were assessed as a measure of interpretation agreement. Plaque was elevated in 25.5% to 29.1% of the 165 total Visual Reads. Interrater agreement was strong (kappa = 0.73-0.82) and consistent with reported values. Quantitative Reads were elevated in 45.5% of participants. Final VisQ Reads changed from the initial Visual Reads in 16 interpretations (9.7%), with most changing from "nonelevated" Visual Reads to "elevated." These changed interpretations showed lower plaque quantification than interpretations initially read as "elevated" that remained unchanged. Interrater variability improved for VisQ Reads with the addition of quantitative information (kappa = 0.88-0.96). Inclusion of quantitative information increases the consistency of PET scan interpretations for early detection of cerebral amyloid-β plaque accumulation.
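
    The quantitative step above reduces to a simple decision rule over regional SUV ratios. A sketch (the region names and values are invented; the study generated its SUV ratios with MIMneuro):

      # "Quantitative Read": elevated if >= 2 of 6 ROI SUV ratios exceed 1.1.
      ROI_SUVR = {                      # hypothetical regional SUV ratios
          "anterior cingulate": 1.18,
          "posterior cingulate": 1.14,
          "precuneus": 1.09,
          "frontal": 1.05,
          "parietal": 1.02,
          "temporal": 0.98,
      }

      n_elevated = sum(v > 1.1 for v in ROI_SUVR.values())
      quant_read = "elevated" if n_elevated >= 2 else "nonelevated"
      print(f"{n_elevated} region(s) > 1.1 -> Quantitative Read: {quant_read}")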

  2. Nuclear electric propulsion operational reliability and crew safety study: NEP systems/modeling report

    NASA Technical Reports Server (NTRS)

    Karns, James

    1993-01-01

    The objective of this study was to establish the initial quantitative reliability bounds for nuclear electric propulsion systems in a manned Mars mission required to ensure crew safety and mission success. Finding the reliability bounds involves balancing top-down (mission driven) requirements and bottom-up (technology driven) capabilities. In seeking this balance we hope to accomplish the following: (1) provide design insights into the achievability of the baseline design in terms of reliability requirements, given the existing technology base; (2) suggest alternative design approaches which might enhance reliability and crew safety; and (3) indicate what technology areas require significant research and development to achieve the reliability objectives.

  3. Measuring competence in endoscopic sinus surgery.

    PubMed

    Syme-Grant, J; White, P S; McAleer, J P G

    2008-02-01

    Competence-based education is currently being introduced into higher surgical training in the UK. Valid and reliable performance assessment tools are essential to ensure competencies are achieved, but no such tools have yet been reported in the UK literature. We therefore sought to develop and pilot-test an Endoscopic Sinus Surgery Competence Assessment Tool (ESSCAT), designed for in-theatre assessment of higher surgical trainees in the UK. The ESSCAT rating matrix was developed through task analysis of ESS procedures. All otolaryngology consultants and specialist registrars in Scotland were given the opportunity to contribute to its refinement. Two cycles of in-theatre testing were used to ensure utility and to gather quantitative data on validity and reliability. Videos of trainees performing surgery were used in establishing inter-rater reliability. National consultation, the consensus-derived minimum standard of performance, Cronbach's alpha = 0.89, and demonstration of trainee learning (p = 0.027) during the in vivo application of the ESSCAT suggest a high level of validity. Inter-rater reliability was moderate for competence decisions (Cohen's kappa = 0.5) and good for total scores (intra-class correlation coefficient = 0.63). Intra-rater reliability was good for both competence decisions (kappa = 0.67) and total scores (Kendall's tau-b = 0.73). The ESSCAT generates a valid and reliable assessment of trainees' in-theatre performance of endoscopic sinus surgery. In conjunction with ongoing evaluation of the instrument, we recommend the use of the ESSCAT in higher specialist training in otolaryngology in the UK.

  4. Multiplexed Analysis of Serum Breast and Ovarian Cancer Markers by Means of Suspension Bead-quantum Dot Microarrays

    NASA Astrophysics Data System (ADS)

    Brazhnik, Kristina; Sokolova, Zinaida; Baryshnikova, Maria; Bilan, Regina; Nabiev, Igor; Sukhanova, Alyona

    Multiplexed analysis of cancer markers is crucial for early tumor diagnosis and screening. We have designed a lab-on-a-bead microarray for quantitative detection of three cancer markers in human serum. Quantum dots were used as bead-bound fluorescent tags for identifying each marker by means of flow cytometry. The antigen-specific beads reliably detected CA 15-3, CEA, and CA 125 in serum samples, providing clear discrimination between the samples with respect to antigen levels. The novel microarray is advantageous over routine single-analyte assays owing to the simultaneous detection of multiple markers. The developed microarray is therefore a promising tool for serum tumor marker profiling.

  5. Selenium contents in tobacco and main stream cigarette smoke determined using neutron activation analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sorak-Pokrajac, M.; Dermelj, M.; Slejkovec, Z.

    In the domain of the essential trace elements, the role of selenium is extremely important. As one of the volatile elements, it can be partly absorbed through the pulmonary system during smoking and transported to different organs of the body. Thus a knowledge of its concentration levels in various sorts of tobacco and in the smoke of commercial cigarettes, as well as in the same type of cigarettes from plants treated with selenium, is of interest for various research fields. The purpose of this contribution is to present reliable quantitative data on selenium contents in tobacco, soil, and main stream cigarette smoke, obtained by destructive neutron activation analysis.

  6. Nanostructured Drugs Embedded into a Polymeric Matrix: Vinpocetine/PVP Hybrids Investigated by Debye Function Analysis.

    PubMed

    Hasa, Dritan; Giacobbe, Carlotta; Perissutti, Beatrice; Voinovich, Dario; Grassi, Mario; Cervellino, Antonio; Masciocchi, Norberto; Guagliardi, Antonietta

    2016-09-06

    Microcrystalline vinpocetine, coground with cross-linked polyvinylpyrrolidone, affords hybrids containing nanosized drug crystals, the size and size distributions of which depend on milling times and drug-to-polymer weight ratios. Using an innovative approach to microstructural characterization, we analyzed wide-angle X-ray total scattering data by Debye function analysis and demonstrated the possibility of characterizing pharmaceutical solid dispersions, obtaining a reliable quantitative view of the physicochemical status of the drug dispersed in an amorphous carrier. The microstructural properties derived therefrom have been successfully employed in reconciling the enigmatic difference in behavior between in vitro and in vivo solubility tests performed on nanosized vinpocetine embedded in a polymeric matrix.

  7. 3D-QSAR study and design of 4-hydroxyamino α-pyranone carboxamide analogues as potential anti-HCV agents

    NASA Astrophysics Data System (ADS)

    Li, Wenlian; Xiao, Faqi; Zhou, Mingming; Jiang, Xuejin; Liu, Jun; Si, Hongzong; Xie, Meng; Ma, Xiuting; Duan, Yunbo; Zhai, Honglin

    2016-09-01

    A three-dimensional quantitative structure-activity relationship (3D-QSAR) study was performed on a series of 4-hydroxyamino α-pyranone carboxamide analogues using comparative molecular similarity indices analysis (CoMSIA). The purpose of the present study was to develop a satisfactory model providing reliable predictions for 4-hydroxyamino α-pyranone carboxamide analogues as anti-HCV (hepatitis C virus) inhibitors. The statistical results and the validation of this optimum CoMSIA model were satisfactory. Furthermore, analysis of the contour maps helped to identify structural requirements. The results of this study may therefore provide useful guidelines for the development of anti-HCV inhibitors.

  8. Paper Capillary Enables Effective Sampling for Microfluidic Paper Analytical Devices.

    PubMed

    Shangguan, Jin-Wen; Liu, Yu; Wang, Sha; Hou, Yun-Xuan; Xu, Bi-Yi; Xu, Jing-Juan; Chen, Hong-Yuan

    2018-06-06

    A paper capillary is introduced to enable effective sampling on microfluidic paper analytical devices. By coupling the macroscale capillary force of the paper capillary with the microscale capillary forces of native paper, fluid transport can be flexibly tailored with proper design. A hybrid-fluid-mode paper capillary device is then proposed, which enables fast and reliable sampling in an arrayed format, with less surface adsorption and less bias between components. The resulting device supports high-throughput, quantitative, and repeatable assays operated entirely by hand. With these merits, multiplexed analysis of ions, proteins, and microbes has been realized on this platform, paving the way to higher-level analysis on μPADs.

  9. A study on reliability of power customer in distribution network

    NASA Astrophysics Data System (ADS)

    Liu, Liyuan; Ouyang, Sen; Chen, Danling; Ma, Shaohua; Wang, Xin

    2017-05-01

    The existing power supply reliability index system is oriented to the power system and does not consider the actual availability of electricity on the customer side. In addition, it cannot reflect outages or customer equipment shutdowns caused by instantaneous interruptions and power quality problems. This paper therefore makes a systematic study of the reliability of power customers. Reliability of power customers is defined by comparison with power supply reliability, and its evaluation requirements are extracted. An index system, consisting of seven customer indexes and two contrast indexes, is designed to describe the reliability of power customers in terms of continuity and availability. In order to comprehensively and quantitatively evaluate the reliability of power customers in distribution networks, an evaluation method is proposed based on an improved entropy method and the punishment weighting principle. Practical application has shown that the reliability index system and evaluation method are reasonable and effective.
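
    The classical entropy weight method is a plausible core of the "improved entropy method" mentioned above; the sketch below shows the standard calculation (the paper's specific improvement and punishment weighting are not reproduced). The indicator matrix is hypothetical.

```python
import numpy as np

def entropy_weights(X):
    """X: (m_alternatives, n_indicators) matrix of strictly positive values."""
    P = X / X.sum(axis=0)                  # column-wise proportions
    k = 1.0 / np.log(X.shape[0])
    E = -k * (P * np.log(P)).sum(axis=0)   # entropy of each indicator
    d = 1.0 - E                            # degree of diversification
    return d / d.sum()                     # normalized weights

X = np.array([[0.95, 120.0, 3.0],          # hypothetical customer indexes
              [0.90, 200.0, 1.0],
              [0.99,  80.0, 2.0]])
print(np.round(entropy_weights(X), 3))
```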

  10. Development and validation of technique for in-vivo 3D analysis of cranial bone graft survival

    NASA Astrophysics Data System (ADS)

    Bernstein, Mark P.; Caldwell, Curtis B.; Antonyshyn, Oleh M.; Ma, Karen; Cooper, Perry W.; Ehrlich, Lisa E.

    1997-05-01

    Bone autografts are routinely employed in the reconstruction of facial deformities resulting from trauma, tumor ablation or congenital malformations. The combined use of post-operative 3D CT and SPECT imaging provides a means for quantitative in vivo evaluation of bone graft volume and osteoblastic activity. The specific objectives of this study were to: (1) determine the reliability and accuracy of interactive computer-assisted analysis of bone graft volumes based on 3D CT scans; (2) determine the error in CT/SPECT multimodality image registration; (3) determine the error in SPECT/SPECT image registration; and (4) determine the reliability and accuracy of CT-guided SPECT uptake measurements in cranial bone grafts. Five human cadaver heads served as anthropomorphic models for all experiments. Four cranial defects were created in each specimen with inlay and onlay split-skull bone grafts and reconstructed at skull and malar recipient sites. To acquire all images, each specimen was CT scanned and coated with technetium-doped paint. For purposes of validation, skulls were landmarked with 1/16-inch ball bearings and indium. This study provides a new technique relating anatomy and physiology for the analysis of cranial bone graft survival.
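
    One common way to quantify multimodality registration error from fiducial landmarks (such as the ball bearings above) is a least-squares rigid fit followed by the residual error; the sketch below uses the Kabsch algorithm under that assumption, with hypothetical coordinates.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src -> dst, both (n, 3)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

rng = np.random.default_rng(0)
ct = rng.uniform(0.0, 100.0, size=(6, 3))    # hypothetical CT landmarks (mm)
theta = np.deg2rad(5.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
spect = ct @ R_true.T + np.array([2.0, -1.0, 0.5]) + rng.normal(0.0, 0.3, ct.shape)

R, t = rigid_register(ct, spect)
fre = np.sqrt(np.mean(np.sum((ct @ R.T + t - spect) ** 2, axis=1)))
print(f"fiducial registration error: {fre:.2f} mm")
```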

  11. Kinematics of mechanical and adhesional micromanipulation under a scanning electron microscope

    NASA Astrophysics Data System (ADS)

    Saito, Shigeki; Miyazaki, Hideki T.; Sato, Tomomasa; Takahashi, Kunio

    2002-11-01

    In this paper, the kinematics of mechanical and adhesional micromanipulation using a needle-shaped tool under a scanning electron microscope is analyzed. A mode diagram is derived to indicate the possible micro-object behavior under specified operational conditions. Based on the diagram, a reasonable method for pick-and-place operation is proposed. The keys to successful analysis are to introduce adhesion and rolling-resistance factors into the kinematic system consisting of a sphere, a needle-shaped tool, and a substrate, and to consider the time dependence of these factors under electron-beam (EB) irradiation. The adhesion force and the lower limit of the maximum rolling resistance are evaluated quantitatively, both theoretically and experimentally. The analysis shows that the fracture of either the tool-sphere or the substrate-sphere interface can be controlled selectively through the tool-loading angle, and that such selective fracture of the interfaces enables reliable pick or place operations even under EB irradiation. Whereas conventional micromanipulation was not repeatable because it relied on empirically effective methods, this analysis provides a guideline for reliable micromanipulation.
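
    For a rough sense of the adhesion forces at play in such systems, the JKR contact model gives a standard sphere-on-flat pull-off force; this is offered as background, not as the authors' formulation, and the values are hypothetical.

```python
import math

def jkr_pull_off_force(radius_m, work_of_adhesion_j_m2):
    """JKR pull-off force for a sphere on a flat: F = (3/2) * pi * R * W."""
    return 1.5 * math.pi * radius_m * work_of_adhesion_j_m2

R = 5e-6    # 5 um sphere radius (hypothetical)
W = 0.05    # work of adhesion, J/m^2 (hypothetical)
print(f"pull-off force ~ {jkr_pull_off_force(R, W) * 1e9:.0f} nN")
```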

  12. Semi-quantitative proteomics of mammalian cells upon short-term exposure to non-ionizing electromagnetic fields.

    PubMed

    Kuzniar, Arnold; Laffeber, Charlie; Eppink, Berina; Bezstarosti, Karel; Dekkers, Dick; Woelders, Henri; Zwamborn, A Peter M; Demmers, Jeroen; Lebbink, Joyce H G; Kanaar, Roland

    2017-01-01

    The potential effects of non-ionizing electromagnetic fields (EMFs), such as those emitted by power lines (in the extremely low frequency range) and by mobile cellular systems and wireless networking devices (in the radio frequency range), on human health have been intensively researched and debated. However, how exposure to these EMFs may lead to biological changes underlying possible health effects is still unclear. To reveal EMF-induced molecular changes, unbiased experiments (without a priori focusing on specific biological processes) with sensitive readouts are required. We present the first proteome-wide semi-quantitative mass spectrometry analysis of human fibroblasts, osteosarcoma cells and mouse embryonic stem cells exposed to three types of non-ionizing EMFs (ELF 50 Hz, UMTS 2.1 GHz and WiFi 5.8 GHz). We performed controlled in vitro EMF exposures of metabolically labeled mammalian cells followed by reliable statistical analyses of differential protein- and pathway-level regulation using an array of established bioinformatics methods. Our results indicate that less than 1% of the quantitated human or mouse proteome responds to the EMFs by small changes in protein abundance. Further network-based analysis of the differentially regulated proteins did not detect significantly perturbed cellular processes or pathways in human and mouse cells in response to ELF, UMTS or WiFi exposure. In conclusion, our extensive bioinformatics analyses of semi-quantitative mass spectrometry data do not support the notion that short-term exposure to non-ionizing EMFs has a consistent, biologically significant bearing on mammalian cells in culture.

  13. Semi-quantitative proteomics of mammalian cells upon short-term exposure to non-ionizing electromagnetic fields

    PubMed Central

    Laffeber, Charlie; Eppink, Berina; Bezstarosti, Karel; Dekkers, Dick; Woelders, Henri; Zwamborn, A. Peter M.; Demmers, Jeroen; Lebbink, Joyce H. G.; Kanaar, Roland

    2017-01-01

    The potential effects of non-ionizing electromagnetic fields (EMFs), such as those emitted by power lines (in the extremely low frequency range) and by mobile cellular systems and wireless networking devices (in the radio frequency range), on human health have been intensively researched and debated. However, how exposure to these EMFs may lead to biological changes underlying possible health effects is still unclear. To reveal EMF-induced molecular changes, unbiased experiments (without a priori focusing on specific biological processes) with sensitive readouts are required. We present the first proteome-wide semi-quantitative mass spectrometry analysis of human fibroblasts, osteosarcoma cells and mouse embryonic stem cells exposed to three types of non-ionizing EMFs (ELF 50 Hz, UMTS 2.1 GHz and WiFi 5.8 GHz). We performed controlled in vitro EMF exposures of metabolically labeled mammalian cells followed by reliable statistical analyses of differential protein- and pathway-level regulation using an array of established bioinformatics methods. Our results indicate that less than 1% of the quantitated human or mouse proteome responds to the EMFs by small changes in protein abundance. Further network-based analysis of the differentially regulated proteins did not detect significantly perturbed cellular processes or pathways in human and mouse cells in response to ELF, UMTS or WiFi exposure. In conclusion, our extensive bioinformatics analyses of semi-quantitative mass spectrometry data do not support the notion that short-term exposure to non-ionizing EMFs has a consistent, biologically significant bearing on mammalian cells in culture. PMID:28234898

  14. Quantitative analysis for peripheral vascularity assessment based on clinical photoacoustic and ultrasound images

    NASA Astrophysics Data System (ADS)

    Murakoshi, Dai; Hirota, Kazuhiro; Ishii, Hiroyasu; Hashimoto, Atsushi; Ebata, Tetsurou; Irisawa, Kaku; Wada, Takatsugu; Hayakawa, Toshiro; Itoh, Kenji; Ishihara, Miya

    2018-02-01

    Photoacoustic (PA) imaging technology is expected to be applied to the clinical assessment of peripheral vascularity. We started a clinical evaluation with the prototype PA imaging system we recently developed. The prototype system is composed of an in-house Q-switched Alexandrite laser that emits short pulses at a wavelength of 750 nm, a handheld ultrasound transducer with integrated illumination optics, and signal processing for PA image reconstruction implemented in a clinical ultrasound (US) system. For the purpose of quantitative assessment of PA images, an image analysis function was developed and applied to clinical PA images. In this function, vascularity, derived from the PA signal intensity within a prescribed threshold range, is defined as a numerical index of vessel fullness and is calculated over a prescribed region of interest (ROI). The skin surface is detected automatically from the B-mode image acquired simultaneously with the PA image. The skin-surface position is used to place the ROI objectively while avoiding unwanted signals, such as artifacts caused by melanin in the epidermal layer, which absorbs the laser emission and generates strong PA signals. Multiple images were available to support the scanned image set for 3D viewing. PA images of several fingers of patients with systemic sclerosis (SSc) were quantitatively assessed. Since the artifact region is trimmed off the PA images, the visibility of vessels with rather low PA signal intensity in the 3D projection image was enhanced and the reliability of the quantitative analysis was improved.
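
    A minimal sketch of the vascularity index described above, assuming it reduces to the fraction of ROI pixels whose PA intensity falls in a prescribed threshold range; array names and thresholds are hypothetical.

```python
import numpy as np

def vascularity_index(pa_image, roi_mask, lo, hi):
    """Percentage of ROI pixels whose PA intensity lies in [lo, hi]."""
    roi = pa_image[roi_mask]
    return 100.0 * ((roi >= lo) & (roi <= hi)).mean()

rng = np.random.default_rng(1)
pa = rng.random((256, 256))            # hypothetical PA frame
mask = np.zeros_like(pa, dtype=bool)
mask[60:180, 40:200] = True            # ROI placed below the detected skin line
print(f"vascularity: {vascularity_index(pa, mask, 0.6, 1.0):.1f}%")
```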

  15. Fault trees for decision making in systems analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Howard E.

    1975-10-09

    The application of fault tree analysis (FTA) to system safety and reliability is presented within the framework of system safety analysis. The concepts and techniques involved in manual and automated fault tree construction are described and their differences noted. The theory of mathematical reliability pertinent to FTA is presented with emphasis on engineering applications. An outline of the quantitative reliability techniques of the Reactor Safety Study is given. Concepts of probabilistic importance are presented within the fault tree framework and applied to the areas of system design, diagnosis and simulation. The computer code IMPORTANCE ranks basic events and cut sets according to a sensitivity analysis. A useful feature of the IMPORTANCE code is that it can accept relative failure data as input. The output of the IMPORTANCE code can assist an analyst in finding weaknesses in system design and operation, suggest the optimal course of system upgrade, and determine the optimal location of sensors within a system. A general simulation model of system failure in terms of fault tree logic is described. The model is intended for efficient diagnosis of the causes of system failure in the event of a system breakdown. It can also be used to assist an operator in making decisions under a time constraint regarding the future course of operations. The model is well suited for computer implementation. New results incorporated in the simulation model include an algorithm to generate repair checklists on the basis of fault tree logic and a one-step-ahead optimization procedure that minimizes the expected time to diagnose system failure.
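
    For readers unfamiliar with quantitative FTA, the sketch below shows the rare-event approximation of the top-event probability from minimal cut sets, together with a Birnbaum importance ranking, the kind of sensitivity measure a code like IMPORTANCE reports; the event probabilities and cut sets are hypothetical.

```python
import math

# Hypothetical basic-event probabilities and minimal cut sets.
basic = {"pump_fails": 1e-3, "valve_sticks": 5e-4,
         "sensor_fails": 2e-3, "operator_error": 1e-2}
cut_sets = [{"pump_fails", "valve_sticks"},
            {"sensor_fails", "operator_error"}]

def top_probability(prob):
    """Rare-event approximation: sum over minimal cut sets of the
    product of their basic-event probabilities."""
    return sum(math.prod(prob[e] for e in cs) for cs in cut_sets)

def birnbaum(event):
    """Birnbaum importance: P(top | event certain) - P(top | event impossible)."""
    return (top_probability({**basic, event: 1.0})
            - top_probability({**basic, event: 0.0}))

print(f"P(top) ~ {top_probability(basic):.2e}")
for e in sorted(basic, key=birnbaum, reverse=True):
    print(f"  {e}: Birnbaum importance {birnbaum(e):.2e}")
```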

  16. Quantitative FE-EPMA measurement of formation and inhibition of carbon contamination on Fe for trace carbon analysis.

    PubMed

    Tanaka, Yuji; Yamashita, Takako; Nagoshi, Masayasu

    2017-04-01

    Hydrocarbon contamination introduced during point, line and map analyses in field emission electron probe microanalysis (FE-EPMA) was investigated to enable reliable quantitative analysis of trace amounts of carbon in steels. The increment of contamination on pure iron in point analysis is proportional to the number of iterations of beam irradiation, but not to the accumulated irradiation time. A combination of a longer dwell time and a single measurement with a liquid nitrogen (LN2) trap as an anti-contamination device (ACD) is sufficient for quantitative point analysis. However, in line and map analyses, contamination increases with irradiation time in addition to the number of iterations, even when the LN2 trap and a plasma cleaner are used as ACDs. Thus, a shorter dwell time and a single measurement are preferred for line and map analyses, although it is difficult to eliminate the influence of contamination. While ring-like contamination around the irradiation point grows during electron-beam irradiation, contamination at the irradiation point increases during the blanking time after irradiation. This can explain the increment of contamination in iterative point analysis as well as in line and map analyses. Among the ACDs tested in this study, specimen heating at 373 K has a significant contamination-inhibiting effect. This technique makes it possible to obtain line and map analysis data with minimal influence of contamination. The above-mentioned FE-EPMA data are presented and discussed in terms of the contamination-formation mechanisms and the preferable experimental conditions for the quantification of trace carbon in steels. © The Author 2016. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. Human figure drawings in the evaluation of severe adolescent suicidal behavior.

    PubMed

    Zalsman, G; Netanel, R; Fischel, T; Freudenstein, O; Landau, E; Orbach, I; Weizman, A; Pfeffer, C R; Apter, A

    2000-08-01

    To evaluate the reliability of using certain indicators derived from human figure drawings to distinguish between suicidal and nonsuicidal adolescents. Ninety consecutive admissions to an adolescent inpatient unit were assessed. Thirty-nine patients were admitted because of suicidal behavior and 51 for other reasons. All subjects were given the Human Figure Drawing (HFD) test. HFDs were evaluated according to the method of Pfeffer and Richman, and the degree of suicidal behavior was rated by the Child Suicide Potential Scale. The internal reliability was satisfactory. HFD indicators correlated significantly with quantitative measures of suicidal behavior; of these indicators, the evaluator's overall impression in particular enabled the prediction of suicidal behavior and the distinction between suicidal and nonsuicidal inpatients (p < .001). A group of graphic indicators derived from a discriminant analysis formed a function that correctly identified 84.6% of the suicidal and 76.6% of the nonsuicidal adolescents. Many of the items had a regressive quality. The HFD is an example of a simple projective test that may have empirical reliability. It may be useful for the assessment of severe suicidal behavior in adolescents.

  18. A novel approach on accelerated ageing towards reliability optimization of high concentration photovoltaic cells

    NASA Astrophysics Data System (ADS)

    Tsanakas, John A.; Jaffre, Damien; Sicre, Mathieu; Elouamari, Rachid; Vossier, Alexis; de Salins, Jean-Edouard; Bechou, Laurent; Levrier, Bruno; Perona, Arnaud; Dollet, Alain

    2014-09-01

    This paper presents a preliminary study of a novel approach proposed for highly accelerated ageing and reliability optimization of high-concentration photovoltaic (HCPV) cells and assemblies. The approach aims to overcome several limitations of the accelerated ageing tests (AAT) adopted to date by using an alternative experimental set-up that performs faster and more realistic thermal cycles under real sun, without an environmental chamber. The study also includes specific characterization techniques applied before and after each AAT sequence, which respectively provide the initial and final diagnosis of the condition of the tested sample. The data acquired from these diagnostic/characterization methods are then used as indices to determine, both quantitatively and qualitatively, the severity of degradation and thus the ageing level of each tested HCPV assembly or cell sample. The ultimate goal of such "initial diagnosis - AAT - final diagnosis" sequences is to provide the basis for future work on the reliability analysis of the main degradation mechanisms and confident prediction of failure propagation in HCPV cells, by means of acceleration factor (AF) and mean-time-to-failure (MTTF) estimations.
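
    The acceleration factor mentioned above is commonly computed from an Arrhenius model; the sketch below shows that standard calculation with a hypothetical activation energy, temperatures, and stress-test time (the paper's actual AF model is not specified here).

```python
import math

K_B = 8.617e-5   # Boltzmann constant (eV/K)

def arrhenius_af(ea_ev, t_use_k, t_stress_k):
    """AF = exp[(Ea / k) * (1/T_use - 1/T_stress)]."""
    return math.exp(ea_ev / K_B * (1.0 / t_use_k - 1.0 / t_stress_k))

af = arrhenius_af(ea_ev=0.7, t_use_k=328.0, t_stress_k=398.0)  # 55 C vs 125 C
mttf_stress_h = 2000.0   # hypothetical mean time to failure under stress
print(f"AF = {af:.0f}; projected use-condition MTTF = {af * mttf_stress_h:.0f} h")
```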

  19. Reliable Quantitative Mineral Abundances of the Martian Surface using THEMIS

    NASA Astrophysics Data System (ADS)

    Smith, R. J.; Huang, J.; Ryan, A. J.; Christensen, P. R.

    2013-12-01

    This work presents a proof of concept that, given quality data, Thermal Emission Imaging System (THEMIS) data can be used to derive reliable quantitative mineral abundances of the Martian surface using a limited mineral library. The THEMIS instrument aboard the Mars Odyssey spacecraft is a multispectral thermal infrared imager with a spatial resolution of 100 m/pixel. The relatively high spatial resolution along with global coverage makes THEMIS datasets powerful tools for comprehensive fine-scale petrologic analyses. However, the spectral resolution of THEMIS is limited to 8 surface-sensitive bands between 6.8 and 14.0 μm with an average bandwidth of ~1 μm, which complicates atmosphere-surface separation and spectral analysis. This study combines the atmospheric correction methods of Bandfield et al. [2004] and Ryan et al. [2013] with the iterative linear deconvolution technique of Huang et al. [in review] to derive fine-scale quantitative mineral abundances of the Martian surface. In general, it can be assumed that surface emissivity combines linearly at thermal infrared (TIR) wavelengths, such that the emitted energy is proportional to the areal percentage of the minerals present. TIR spectra are unmixed using a set of linear equations involving an endmember library of laboratory-measured mineral spectra. The number of endmembers allowed in a spectral library is restricted to n-1 (where n is the number of spectral bands of the instrument), preserving one band for blackbody; spectral analysis of THEMIS data thus allows only seven endmembers. This study attempts to show that this limitation does not prevent the derivation of meaningful spectral analyses from THEMIS data. THEMIS stamps were selected from a region of Mars that is well characterized in the TIR by the higher-spectral-resolution, lower-spatial-resolution Thermal Emission Spectrometer (TES) instrument (143 bands at 10 cm⁻¹ sampling and 3 x 5 km pixels). Multiple atmospheric corrections were performed for one image using the methods of Bandfield et al. [2004] and Ryan et al. [2013]; 7 x 7 pixel areas were selected, averaged, and compared across the atmospherically corrected images to ensure consistency, and corrections that provided reliable data were then used for spectral analyses. Linear deconvolution is performed using an iterative spectral analysis method [Huang et al., in review] that takes an endmember spectral library and creates mineral combinations based on prescribed mineral group selections. The script then performs a spectral mixture analysis on each surface spectrum using all possible mineral combinations and reports the best modeled fit to the measured spectrum. We present initial results from Syrtis Planum, where multiple atmospherically corrected THEMIS images were deconvolved to produce similar spectral analysis results within the detection limit of the instrument. THEMIS mineral abundances are comparable to TES-derived abundances. References: Bandfield, J.L., et al. [2004], JGR 109, E10008; Huang, J., et al., JGR, in review; Ryan, A.J., et al. [2013], AGU Fall Meeting.
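
    The core of linear deconvolution is a constrained least-squares unmixing of the measured spectrum against the endmember library; the sketch below illustrates it with non-negative least squares on synthetic spectra (the iterative combination search of Huang et al. is not reproduced).

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
n_bands = 8                                          # THEMIS surface-sensitive bands
library = rng.uniform(0.7, 1.0, size=(n_bands, 4))   # 4 hypothetical endmembers
true_frac = np.array([0.5, 0.3, 0.2, 0.0])
measured = library @ true_frac + rng.normal(0.0, 0.002, n_bands)

frac, _ = nnls(library, measured)    # non-negative abundance solution
frac /= frac.sum()                   # report areal percentages
print("modeled abundances (%):", np.round(100 * frac, 1))
```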

  20. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses, including the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique is demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
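
    As background, many Bayesian rare-event rate analyses rest on the conjugate Gamma-Poisson update; the one-level sketch below illustrates it (the paper's hierarchical model additionally places hyperpriors on the Gamma parameters to pool across data sources). All numbers are hypothetical.

```python
from scipy.stats import gamma

alpha0, beta0 = 0.5, 1.0e4          # Gamma prior on the event rate (per hour)
events, exposure_h = 2, 5.0e4       # observed rare events and exposure

alpha_post, beta_post = alpha0 + events, beta0 + exposure_h
mean_rate = alpha_post / beta_post
upper95 = gamma.ppf(0.95, a=alpha_post, scale=1.0 / beta_post)
print(f"posterior mean rate {mean_rate:.2e}/h, 95th percentile {upper95:.2e}/h")
```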

  1. A scoring system for appraising mixed methods research, and concomitantly appraising qualitative, quantitative and mixed methods primary studies in Mixed Studies Reviews.

    PubMed

    Pluye, Pierre; Gagnon, Marie-Pierre; Griffiths, Frances; Johnson-Lafleur, Janique

    2009-04-01

    A new form of literature review has emerged, Mixed Studies Review (MSR). These reviews include qualitative, quantitative and mixed methods studies. In the present paper, we examine MSRs in health sciences, and provide guidance on processes that should be included and reported. However, there are no valid and usable criteria for concomitantly appraising the methodological quality of the qualitative, quantitative and mixed methods studies. To propose criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies or study components. A three-step critical review was conducted. 2322 references were identified in MEDLINE, and their titles and abstracts were screened; 149 potentially relevant references were selected and the full-text papers were examined; 59 MSRs were retained and scrutinized using a deductive-inductive qualitative thematic data analysis. This revealed three types of MSR: convenience, reproducible, and systematic. Guided by a proposal, we conducted a qualitative thematic data analysis of the quality appraisal procedures used in the 17 systematic MSRs (SMSRs). Of 17 SMSRs, 12 showed clear quality appraisal procedures with explicit criteria but no SMSR used valid checklists to concomitantly appraise qualitative, quantitative and mixed methods studies. In two SMSRs, criteria were developed following a specific procedure. Checklists usually contained more criteria than needed. In four SMSRs, a reliability assessment was described or mentioned. While criteria for quality appraisal were usually based on descriptors that require specific methodological expertise (e.g., appropriateness), no SMSR described the fit between reviewers' expertise and appraised studies. Quality appraisal usually resulted in studies being ranked by methodological quality. A scoring system is proposed for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies for SMSRs. This scoring system may also be used to appraise the methodological quality of qualitative, quantitative and mixed methods components of mixed methods research.

  2. Quantitative Accelerated Life Testing of MEMS Accelerometers

    PubMed Central

    Bâzu, Marius; Gălăţeanu, Lucian; Ilian, Virgil Emil; Loicq, Jerome; Habraken, Serge; Collette, Jean-Paul

    2007-01-01

    Quantitative Accelerated Life Testing (QALT) is a solution for assessing the reliability of Micro Electro Mechanical Systems (MEMS). A procedure for QALT is shown in this paper and an attempt to assess the reliability level for a batch of MEMS accelerometers is reported. The testing plan is application-driven and contains combined tests: thermal (high temperature) and mechanical stress. Two variants of mechanical stress are used: vibration (at a fixed frequency) and tilting. Purpose-built equipment for tilt and high-temperature testing is used. Tilting is appropriate as an application-driven stress because the tilt movement is a natural environment for devices used in automotive and aerospace applications; tilting is also used by MEMS accelerometers in anti-theft systems. The test results demonstrated the excellent reliability of the studied devices, the failure rate in the “worst case” being smaller than 10⁻⁷ h⁻¹. PMID:28903265
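
    Failure-rate claims of this kind are often backed by a chi-square upper confidence bound computed from the cumulative device-hours and failure count; the sketch below shows that standard calculation with hypothetical test totals, not the paper's data.

```python
from scipy.stats import chi2

def lambda_upper(failures, device_hours, conf=0.60):
    """Chi-square upper confidence bound on a constant (exponential) failure rate."""
    return chi2.ppf(conf, 2 * (failures + 1)) / (2.0 * device_hours)

# Hypothetical: zero failures observed over 3e7 cumulative device-hours.
print(f"lambda_upper = {lambda_upper(0, 3.0e7):.1e} /h")
```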

  3. Disaster Metrics: Evaluation of de Boer's Disaster Severity Scale (DSS) Applied to Earthquakes.

    PubMed

    Bayram, Jamil D; Zuabi, Shawki; McCord, Caitlin M; Sherak, Raphael A G; Hsu, Edberdt B; Kelen, Gabor D

    2015-02-01

    Quantitative measurement of the medical severity following multiple-casualty events (MCEs) is an important goal in disaster medicine. In 1990, de Boer proposed a 13-point, 7-parameter scale called the Disaster Severity Scale (DSS). Parameters include cause, duration, radius, number of casualties, nature of injuries, rescue time, and effect on the surrounding community. Hypothesis: This study aimed to examine the reliability and dimensionality (number of salient themes) of de Boer's DSS through its application to 144 discrete earthquake events. A search for earthquake events was conducted via the National Oceanic and Atmospheric Administration (NOAA) and US Geological Survey (USGS) databases. Two experts in the field of disaster medicine independently reviewed and assigned scores for parameters that had no data readily available (nature of injuries, rescue time, and effect on surrounding community), and differences were reconciled via consensus. Principal Component Analysis was performed using SPSS Statistics for Windows Version 22.0 (IBM Corp; Armonk, New York USA) to evaluate the reliability and dimensionality of the DSS. A total of 144 individual earthquakes from 2003 through 2013 were identified and scored. Of 13 points possible, the mean score was 6.04, the mode = 5, minimum = 4, maximum = 11, and standard deviation = 2.23. Three parameters in the DSS had zero variance (ie, the parameter received the same score in all 144 earthquakes). Because of their zero contribution to variance, these three parameters (cause, duration, and radius) were removed before running the statistical analysis. Cronbach's alpha, a coefficient of internal consistency, for the remaining four parameters was found to be robust at 0.89. Principal Component Analysis showed uni-dimensional characteristics, with only one component having an eigenvalue greater than one (3.17). The 4-parameter DSS, however, suffered from restriction of scoring range at both the parameter and scale levels. De Boer's DSS in its 7-parameter format fails to hold statistically in a dataset of 144 earthquakes subjected to analysis. A modified 4-parameter scale was found to quantitatively assess medical severity more directly, but remains flawed due to range restriction at both individual parameter and scale levels. Further research is needed in the field of disaster metrics to develop a scale that is reliable in its complete set of parameters, capable of finer discrimination, and uni-dimensional in measurement of the medical severity of MCEs.
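
    The eigenvalue-greater-than-one retention rule used above (the Kaiser criterion) is easy to reproduce; the sketch below applies it to a hypothetical 144 x 4 score matrix driven by a single underlying severity factor.

```python
import numpy as np

rng = np.random.default_rng(4)
latent = rng.normal(size=(144, 1))                 # one underlying severity factor
scores = latent + 0.5 * rng.normal(size=(144, 4))  # four observed parameters

corr = np.corrcoef(scores, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]           # descending order
print("eigenvalues:", np.round(eigvals, 2))
print("components retained (eigenvalue > 1):", int((eigvals > 1).sum()))
```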

  4. Factor Analytic Validation of the Ford, Wolvin, and Chung Listening Competence Scale

    ERIC Educational Resources Information Center

    Mickelson, William T.; Welch, S. A.

    2012-01-01

    This research begins to independently and quantitatively validate the Ford, Wolvin, and Chung (2000) Listening Competency Scale. Reliability and Confirmatory Factor analyses were conducted on two independent samples. The reliability estimates were found to be below those reported by Ford, Wolvin, and Chung (2000) and below acceptable levels for…

  5. Reliability of quantitative EEG (qEEG) measures and LORETA current source density at 30 days.

    PubMed

    Cannon, Rex L; Baldwin, Debora R; Shaw, Tiffany L; Diloreto, Dominic J; Phillips, Sherman M; Scruggs, Annie M; Riehl, Timothy C

    2012-06-14

    There is growing interest in using quantitative EEG and LORETA current source density in clinical and research settings. Importantly, if these indices are to be employed in clinical settings, then the reliability of these measures is of great concern. Neuroguide (Applied Neurosciences) is sophisticated software developed for the analysis of power and connectivity measures of the EEG as well as LORETA current source density. To date, relatively few data evaluate topographical EEG reliability contrasts for all 19 channels, and no studies have evaluated the reliability of LORETA calculations. We obtained 4-min eyes-closed and eyes-opened EEG recordings at 30-day intervals. The EEG was analyzed in Neuroguide; FFT power, coherence and phase were computed for the traditional frequency bands (delta, theta, alpha and beta), and LORETA current source density was calculated in 1 Hz increments and summed for total power in eight regions of interest (ROIs). In order to obtain a robust measure of reliability we utilized a random effects model with an absolute agreement definition. The results show very good reproducibility for total absolute power and coherence; phase shows lower reliability coefficients. LORETA current source density shows very good reliability, averaging 0.81 for eyes-closed and 0.82 for eyes-opened recordings. Similarly, the eight regions of interest show good to very good agreement across time. Implications for future directions and the use of qEEG and LORETA in clinical populations are discussed. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
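
    As an illustration of the FFT band-power measures evaluated above, the sketch below computes absolute power in the traditional bands from a synthetic EEG channel using Welch's method.

```python
import numpy as np
from scipy.signal import welch

fs = 256                                   # hypothetical sampling rate (Hz)
t = np.arange(0, 240, 1 / fs)              # 4-minute recording
rng = np.random.default_rng(5)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)  # strong alpha

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 12), "beta": (12, 25)}
freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
for name, (lo, hi) in bands.items():
    sel = (freqs >= lo) & (freqs < hi)
    power = np.trapz(psd[sel], freqs[sel])  # integrate PSD over the band
    print(f"{name}: {power:.3f} (arbitrary units)")
```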

  6. Test-retest reliability of quantitative sensory testing for mechanical somatosensory and pain modulation assessment of masticatory structures.

    PubMed

    Costa, Y M; Morita-Neto, O; de Araújo-Júnior, E N S; Sampaio, F A; Conti, P C R; Bonjardim, L R

    2017-03-01

    Assessing the reliability of medical measurements is a crucial step towards the elaboration of an applicable clinical instrument. Few studies have evaluated the reliability of somatosensory assessment and pain modulation of masticatory structures. This study estimated the test-retest reliability, that is, reliability over time, of the mechanical somatosensory assessment of the anterior temporalis, masseter and temporomandibular joint (TMJ), and of conditioned pain modulation (CPM) using the anterior temporalis as the test site. Twenty healthy women were evaluated in two sessions (1 week apart) by the same examiner. Mechanical detection threshold (MDT), mechanical pain threshold (MPT), wind-up ratio (WUR) and pressure pain threshold (PPT) were assessed on the skin overlying the anterior temporalis, masseter and TMJ of the dominant side. CPM was tested by comparing PPT before and during hand immersion in a hot water bath. ANOVA and intra-class correlation coefficients (ICCs) were applied to the data (α = 5%). The overall ICCs showed acceptable values for the test-retest reliability of mechanical somatosensory assessment of masticatory structures. The ICC values of 75% of all quantitative sensory measurements were considered fair to excellent (fair = 8.4%, good = 33.3% and excellent = 33.3%). However, the CPM paradigm presented poor reliability (ICC = 0.25). The mechanical somatosensory assessment of the masticatory structures, but not the proposed CPM protocol, can be considered sufficiently reliable over time to evaluate trigeminal sensory function. © 2016 John Wiley & Sons Ltd.

  7. Wearable inertial sensors in swimming motion analysis: a systematic review.

    PubMed

    de Magalhaes, Fabricio Anicio; Vannozzi, Giuseppe; Gatta, Giorgio; Fantozzi, Silvia

    2015-01-01

    The use of contemporary technology is widely recognised as a key tool for enhancing competitive performance in swimming. Video analysis is traditionally used by coaches to acquire reliable biomechanical data about swimming performance; however, this approach requires a huge computational effort, thus introducing a delay in providing quantitative information. Inertial and magnetic sensors, including accelerometers, gyroscopes and magnetometers, have been recently introduced to assess the biomechanics of swimming performance. Research in this field has attracted a great deal of interest in the last decade due to the gradual improvement of the performance of sensors and the decreasing cost of miniaturised wearable devices. With the aim of describing the state of the art of current developments in this area, a systematic review of the existing methods was performed using the following databases: PubMed, ISI Web of Knowledge, IEEE Xplore, Google Scholar, Scopus and Science Direct. Twenty-seven articles published in indexed journals and conference proceedings, focusing on the biomechanical analysis of swimming by means of inertial sensors were reviewed. The articles were categorised according to sensor's specification, anatomical sites where the sensors were attached, experimental design and applications for the analysis of swimming performance. Results indicate that inertial sensors are reliable tools for swimming biomechanical analyses.

  8. The impact of Lean bundles on hospital performance: does size matter?

    PubMed

    Al-Hyari, Khalil; Abu Hammour, Sewar; Abu Zaid, Mohammad Khair Saleem; Haffar, Mohamed

    2016-10-10

    Purpose: The purpose of this paper is to study the effect of the implementation of Lean bundles on hospital performance in private hospitals in Jordan and to evaluate how the size of the organization affects the relationship between Lean bundle implementation and hospital performance. Design/methodology/approach: The research uses quantitative methods (descriptive and hypothesis testing). Three statistical techniques were adopted to analyse the data: structural equation modeling and multi-group analysis were used to examine the research hypotheses and to perform the required statistical analysis of the survey data, and reliability analysis and confirmatory factor analysis were used to test construct validity, reliability and measurement loadings. Findings: Lean bundles have been identified as an effective approach that can dramatically improve the organizational performance of private hospitals in Jordan. The main Lean bundles (just-in-time, human resource management, and total quality management) are applicable to large, small and medium hospitals without significant size-dependent differences in benefit. Originality/value: To the researchers' best knowledge, this is the first research to study the impact of Lean bundle implementation in the healthcare sector in Jordan. This research also makes a significant contribution for decision makers in healthcare by increasing their awareness of Lean bundles.

  9. Top-down and bottom-up definitions of human failure events in human reliability analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald Laurids

    2014-10-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditional human factors driven approaches tend to look first for opportunities for human error in a task analysis and then identify which of those errors are risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down, defined as a subset of the PRA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  10. Inertial Sensor-Based Motion Analysis of Lower Limbs for Rehabilitation Treatments

    PubMed Central

    Sun, Tongyang; Duan, Lihong; Wang, Yulong

    2017-01-01

    Diagnosis of the hemiplegic rehabilitation state by therapists can be biased by their subjective experience, which may deteriorate the rehabilitation effect. To improve this situation, a quantitative evaluation is proposed. Though many motion analysis systems are available, they are too complicated for practical application by therapists. In this paper, a method for detecting the motion of human lower limbs, including all degrees of freedom (DOFs), via inertial sensors is proposed, which permits analyzing the patient's motion ability. This method is applicable to arbitrary walking directions and tracks of the persons under study, and its results are unbiased compared with therapists' qualitative estimates. Using a simplified mathematical model of the human body, the rotation angles for each lower limb joint are calculated from the input signals acquired by the inertial sensors. Finally, rotation angle versus joint displacement curves are constructed, and estimated values of joint motion angle and motion ability are obtained. Experimental verification of the proposed motion detection and analysis method was performed, proving that it can efficiently detect the differences between the motion behaviors of disabled and healthy persons and provide a reliable quantitative evaluation of the rehabilitation state. PMID:29065575
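
    One common way to derive joint rotation angles from inertial sensors is a complementary filter fusing gyroscope and accelerometer data; the sketch below shows a single-angle version under that assumption (the paper's full multi-DOF model is more elaborate). Inputs are hypothetical streams in SI units.

```python
import math

def complementary_filter(gyro_rate, acc_x, acc_z, dt=0.01, alpha=0.98):
    """Fuse gyro angular rate (rad/s) with a gravity-referenced angle from
    the accelerometer axes; returns the filtered angle trace (rad)."""
    angle, trace = 0.0, []
    for w, ax, az in zip(gyro_rate, acc_x, acc_z):
        acc_angle = math.atan2(ax, az)   # tilt angle inferred from gravity
        angle = alpha * (angle + w * dt) + (1 - alpha) * acc_angle
        trace.append(angle)
    return trace

# Hypothetical 1-second stream: slow rotation on the gyro, level accelerometer.
angles = complementary_filter([0.1] * 100, [0.0] * 100, [9.81] * 100)
print(f"final angle: {math.degrees(angles[-1]):.2f} deg")
```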

  11. Quantitative high-performance liquid chromatography of nucleosides in biological materials.

    PubMed

    Gehrke, C W; Kuo, K C; Davis, G E; Suits, R D; Waalkes, T P; Borek, E

    1978-03-21

    A rigorous, comprehensive, and reliable reversed-phase high-performance liquid chromatographic (HPLC) method has been developed for the analysis of ribonucleosides in urine (psi, m1A, m1I, m2G, A, m2(2)G). An initial isolation of ribonucleosides with an affinity gel containing an immobilized phenylboronic acid was used to improve selectivity and sensitivity. Response for all nucleosides was linear from 0.1 to 50 nmoles injected, and good quantitation was obtained for 25 microliters or less of sample placed on the HPLC column. Excellent precision of analysis for urinary nucleosides was achieved on matrix-dependent and matrix-independent samples, and the high resolution of the reversed-phase column allowed the complete separation of 9 nucleosides from other unidentified UV-absorbing components at the 1-ng level. Supporting experimental data are presented on precision, recovery, chromatographic methods, minimum detection limit, retention time, relative molar response, sample clean-up, stability of nucleosides, boronate gel capacity, and application to the analysis of urine from patients with leukemia and breast cancer. This method is now being used routinely for the determination of the concentration and ratios of nucleosides in urine from patients with different types of cancer and in chemotherapy response studies.
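
    Linearity claims like the one above rest on an external calibration fit; the sketch below shows the routine calculation with hypothetical peak areas.

```python
import numpy as np

amount_nmol = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])
peak_area = np.array([0.21, 1.02, 2.05, 10.3, 20.1, 101.0])  # detector response

slope, intercept = np.polyfit(amount_nmol, peak_area, deg=1)
r = np.corrcoef(amount_nmol, peak_area)[0, 1]
print(f"response = {slope:.3f} * nmol + {intercept:.3f}, r^2 = {r**2:.4f}")

unknown_area = 7.4                                           # hypothetical sample
print(f"unknown: {(unknown_area - intercept) / slope:.2f} nmol")
```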

  12. Scanning transmission ion microscopy mass measurements for quantitative trace element analysis within biological samples and validation using atomic force microscopy thickness measurements

    NASA Astrophysics Data System (ADS)

    Devès, Guillaume; Cohen-Bouhacina, Touria; Ortega, Richard

    2004-10-01

    We used the nuclear microprobe techniques micro-PIXE (particle-induced X-ray emission), micro-RBS (Rutherford backscattering spectrometry) and scanning transmission ion microscopy (STIM) to characterize trace element content and spatial distribution within biological samples (dehydrated cultured cells, tissues). PIXE results are usually normalized to sample dry mass as determined by micro-RBS recorded simultaneously with micro-PIXE. However, the main limitation of RBS mass measurement is the sample mass loss occurring during irradiation, which can amount to 30% of the initial sample mass. We present here a new methodology for PIXE normalization and quantitative analysis of trace elements within biological samples, based on dry mass measurement performed by means of STIM. STIM cell mass measurements were validated by comparison with AFM sample thickness measurements. The results indicate the reliability of STIM mass measurement performed on biological samples and suggest that STIM should be performed for PIXE normalization. Further information could also be obtained from the direct combination of AFM and STIM analyses, such as in situ measurement of cell specific gravity within cell compartments (nucleolus and cytoplasm).

  13. Quantitation by Portable Gas Chromatography: Mass Spectrometry of VOCs Associated with Vapor Intrusion

    PubMed Central

    Fair, Justin D.; Bailey, William F.; Felty, Robert A.; Gifford, Amy E.; Shultes, Benjamin; Volles, Leslie H.

    2010-01-01

    Development of a robust, reliable technique that permits the rapid quantitation of volatile organic chemicals is an important first step toward remediation associated with vapor intrusion. This paper describes the development of an analytical method that allows the rapid and precise identification and quantitation of halogenated and nonhalogenated contaminants commonly found at the ppbv level at sites where vapor intrusion is a concern. PMID:20885969

  14. Planning Robot-Control Parameters With Qualitative Reasoning

    NASA Technical Reports Server (NTRS)

    Peters, Stephen F.

    1993-01-01

    Qualitative-reasoning planning algorithm helps to determine quantitative parameters controlling motion of robot. Algorithm regarded as performing search in multidimensional space of control parameters from starting point to goal region in which desired result of robotic manipulation achieved. Makes use of directed graph representing qualitative physical equations describing task, and interacts, at each sampling period, with history of quantitative control parameters and sensory data, to narrow search for reliable values of quantitative control parameters.

  15. [Development of an evaluation instrument for service quality in nursing homes].

    PubMed

    Lee, Jia; Ji, Eun Sun

    2011-08-01

    The purposes of this study were to identify the factors influencing service quality in nursing homes and to develop an evaluation instrument for service quality. A three-phase process was employed: 1) the factors important for evaluating service quality in nursing homes were identified through a literature review, panel discussions and focus group interviews; 2) the evaluation instrument was developed; and 3) the validity and reliability of the instrument were tested by factor analysis, Pearson correlation coefficient, Cronbach's α and Cohen's Kappa. Factor analysis showed that the factors influencing service quality in nursing homes were healthcare, diet/assistance, therapy, environment and staff. To improve the objectivity of the instrument, quantitative as well as qualitative evaluation approaches were adopted. The instrument was developed with 30 items and showed acceptable construct validity. The criterion-related validity was a Pearson correlation coefficient of .85 in 151 care facilities. The internal consistency was Cronbach's α = .95. The instrument has acceptable validity and a high degree of reliability. Staff in nursing homes can continuously improve and manage their services using the results of this evaluation instrument.

  16. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks, such as those attributable to manufacturing and assembly, which often dominate component-level risk. While the consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to decide whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood due to the absence of system-level test data or operational data. This paper establishes a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. The approach provides a set of guidelines that may be useful for arriving at a more realistic quantification of risk prior to acceptance by a program.
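
    The parts-count style of prediction whose pitfalls the paper examines amounts to summing component failure rates into a series-system rate; the sketch below shows that roll-up (in the spirit of, but not quoting, MIL-HDBK-217F) with hypothetical rates.

```python
import math

component_rates = {          # failures per million hours (hypothetical)
    "controller": 2.5,
    "actuator": 4.0,
    "sensor": 1.2,
    "harness": 0.8,
}
lam = sum(component_rates.values()) / 1.0e6   # series-system rate, per hour
mission_h = 0.15                              # ~9-minute ascent (hypothetical)
print(f"lambda = {lam:.2e}/h, R(mission) = {math.exp(-lam * mission_h):.9f}")
```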

  17. Development of the Sri Lankan early teenagers' violence inventory: an instrument to measure peer violence in schools.

    PubMed

    Wijeratne, Monika; Seneviratne, Rohini; Gunawardena, Nalika; Østbye, Truls; Lynch, Catherine; Sandøy, Ingvild Fossgard

    2014-01-01

    This study was designed to develop an inventory to measure peer violence among early teens (13-15 years of age) in schools in Sri Lanka. Development of the Sri Lankan Early Teenagers' Violence Inventory (SLETVI) was carried out in two phases. In phase I, an operational definition of peer violence was developed and violent acts for the inventory were identified and finalized through a combination of qualitative methods: a comprehensive literature review, focus group discussions among 13-15-year-old adolescents, their teachers and parents, and consultative meetings with experts in the field. The inventory was then pretested. In phase II, the SLETVI was elaborated by administering it to a sample of 1700 adolescents (13-15 years old). Exploratory factor analysis using principal component analysis was performed separately for experiences of victimization and perpetration. Test-retest reliability of the SLETVI was assessed. The SLETVI includes 37 items in three factors: "less severe violence," "severe physical," and "severe relational" violence. The combined use of qualitative and quantitative methods enabled development of a culturally valid and reliable operational inventory to assess early teenagers' peer violence in Sri Lankan and other South Asian schools.

  18. Stable isotopic labeling-based quantitative targeted glycomics (i-QTaG).

    PubMed

    Kim, Kyoung-Jin; Kim, Yoon-Woo; Kim, Yun-Gon; Park, Hae-Min; Jin, Jang Mi; Hwan Kim, Young; Yang, Yung-Hun; Kyu Lee, Jun; Chung, Junho; Lee, Sun-Gu; Saghatelian, Alan

    2015-01-01

    Mass spectrometry (MS) analysis combined with stable isotopic labeling is a promising method for the relative quantification of aberrant glycosylation in diseases and disorders. We developed a stable isotopic labeling-based quantitative targeted glycomics (i-QTaG) technique for the comparative and quantitative analysis of total N-glycans using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). We established the analytical procedure with the chemical derivatizations (i.e., sialic acid neutralization and stable isotopic labeling) of N-glycans using a model glycoprotein (bovine fetuin). Moreover, the i-QTaG method using MALDI-TOF MS was evaluated with various molar ratios (1:1, 1:2, 1:5) of ¹³C₆/¹²C₆-2-aminobenzoic acid (2-AA)-labeled glycans from normal human serum. Finally, this method was applied to a direct comparison of the total N-glycan profiles between normal human sera (n = 8) and prostate cancer patient sera (n = 17). The intensities of the N-glycan peaks from the i-QTaG method showed good linearity (R² > 0.99) with the amount of the bovine fetuin glycoproteins. The ratios of relative intensity between the isotopically 2-AA-labeled N-glycans were close to the theoretical molar ratios (1:1, 1:2, 1:5). We also demonstrated up-regulation of the Lewis antigen (~82%) in sera from prostate cancer patients. In this proof-of-concept study, we demonstrated that the i-QTaG method, which enables reliable comparative quantitation of total N-glycans via MALDI-TOF MS analysis, has the potential to diagnose and monitor alterations in glycosylation associated with disease states or biotherapeutics. © 2015 American Institute of Chemical Engineers.

  19. Leading for the long haul: a mixed-method evaluation of the Sustainment Leadership Scale (SLS).

    PubMed

    Ehrhart, Mark G; Torres, Elisa M; Green, Amy E; Trott, Elise M; Willging, Cathleen E; Moullin, Joanna C; Aarons, Gregory A

    2018-01-19

    Despite progress in understanding the organizational context for implementation, and specifically the role of leadership in implementation, the role of leadership in sustainment has received little attention. This paper took a mixed-method approach to examine leadership during the sustainment phase of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Using the Implementation Leadership Scale as a foundation, we sought to develop a short, practical measure of sustainment leadership that can be used for both applied and research purposes. Data for this study were collected as part of a larger mixed-method study of the sustainment of an evidence-based intervention, SafeCare®. Quantitative data were collected from 157 providers using web-based surveys. Confirmatory factor analysis was used to examine the factor structure of the Sustainment Leadership Scale (SLS). Qualitative data were collected from 95 providers who participated in one of 15 focus groups. A framework approach guided qualitative data analysis. Mixed-method integration was also utilized to examine the convergence of quantitative and qualitative findings. Confirmatory factor analysis supported the a priori higher-order factor structure of the SLS, with subscales indicating a single higher-order sustainment leadership factor. The SLS demonstrated excellent internal consistency reliability. Qualitative analyses offered support for the dimensions of sustainment leadership captured by the quantitative measure, in addition to uncovering a fifth possible factor, available leadership. This study found qualitative and quantitative support for the pragmatic SLS measure. The SLS can be used to assess first-level leaders, to understand how staff perceive leadership during sustainment, and to suggest areas where leaders could direct more attention in order to increase the likelihood that evidence-based interventions (EBIs) are institutionalized into the normal functioning of the organization.

  20. Assessment of antibody library diversity through next generation sequencing and technical error compensation

    PubMed Central

    Lisi, Simonetta; Chirichella, Michele; Arisi, Ivan; Goracci, Martina; Cremisi, Federico; Cattaneo, Antonino

    2017-01-01

    Antibody libraries are important resources for deriving antibodies for a wide range of applications, from structural and functional studies to intracellular protein interference studies to the development of new diagnostics and therapeutics. Whatever the goal, the key parameter of an antibody library is its complexity (also known as diversity), i.e. the number of distinct elements in the collection, which directly reflects the probability of finding in the library an antibody of sufficiently high affinity against a given antigen. Quantitative evaluation of antibody library complexity and quality has long been inadequately addressed, due to the high similarity and length of the sequences in the library. Complexity was usually inferred from the transformation efficiency and tested by fingerprinting and/or sequencing of a few hundred random library elements. Inferring complexity from such a small sample is, however, very rudimentary and gives limited information about the real diversity, because complexity does not scale linearly with sample size. Next-generation sequencing (NGS) has opened new ways to tackle antibody library complexity assessment, but much remains to be done to fully exploit the potential of NGS for the quantitative analysis of antibody repertoires and to overcome current limitations. To obtain a more reliable antibody library complexity estimate, we show here a new, PCR-free NGS approach to sequencing antibody libraries on the Illumina platform, coupled to a new bioinformatic analysis and software (Diversity Estimator of Antibody Library, DEAL) that allows the complexity to be reliably estimated, taking sequencing error into consideration. PMID:28505201
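
    As an illustration of why complexity cannot be read off a small sample, the sketch below applies the Chao1 lower-bound richness estimator to hypothetical clone counts; this is a generic diversity estimator offered for intuition, not the DEAL algorithm.

```python
from collections import Counter

def chao1(counts):
    """S_chao1 = S_obs + F1^2 / (2 * F2); F1, F2 = singleton/doubleton counts."""
    s_obs = len(counts)
    f1 = sum(1 for c in counts.values() if c == 1)
    f2 = sum(1 for c in counts.values() if c == 2)
    if f2 == 0:                          # bias-corrected fallback
        return s_obs + f1 * (f1 - 1) / 2.0
    return s_obs + (f1 * f1) / (2.0 * f2)

reads = ["ab1", "ab2", "ab1", "ab3", "ab4", "ab2", "ab5"]   # hypothetical clone IDs
print(f"Chao1 richness estimate: {chao1(Counter(reads)):.1f}")
```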

  1. Characterization of reference genes for qPCR analysis in various tissues of the Fujian oyster Crassostrea angulata

    NASA Astrophysics Data System (ADS)

    Pu, Fei; Yang, Bingye; Ke, Caihuan

    2015-07-01

    Accurate quantification of transcripts using quantitative real-time polymerase chain reaction (qPCR) depends on the identification of reliable reference genes for normalization. This study aimed to identify and validate seven candidate reference genes, including actin-2 (ACT-2), elongation factor 1 alpha (EF-1α), elongation factor 1 beta (EF-1β), glyceraldehyde-3-phosphate dehydrogenase (GAPDH), ubiquitin (UBQ), β-tubulin (β-TUB), and 18S ribosomal RNA (18S rRNA), from Crassostrea angulata, a valuable marine bivalve cultured worldwide. Transcript levels of the candidate reference genes were examined using qPCR analysis and showed differential expression patterns in the mantle, gill, adductor muscle, labial palp, visceral mass, hemolymph and gonad tissues. Quantitative data were analyzed using the geNorm software to assess the expression stability of the candidate reference genes, revealing that β-TUB and UBQ were the most stable genes. The commonly used GAPDH and 18S rRNA showed low stability, making them unsuitable candidates in this system. The expression pattern of the G protein β-subunit gene (Gβ) across tissue types was also examined and normalized to the expression of UBQ, β-TUB, or both as internal controls. This revealed consistent trends with all three normalization approaches, validating UBQ and β-TUB as optimal internal controls. The study provides the first validated reference genes for accurate data normalization in transcript profiling in Crassostrea angulata, which will be indispensable for further functional genomics studies in this economically valuable marine bivalve.
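
    The geNorm stability measure M used above follows a published formula that is easy to reproduce; a compact sketch (my re-implementation of the formula, not the geNorm software):

    ```python
    # geNorm gene-stability measure M: the mean, over all other genes, of the
    # SD across samples of the pairwise log2 expression ratios.
    # Lower M = more stable reference gene.
    import numpy as np

    def genorm_m(expr):
        """expr: (n_samples, n_genes) array of relative expression quantities."""
        log_expr = np.log2(expr)
        n_genes = expr.shape[1]
        m = np.empty(n_genes)
        for j in range(n_genes):
            ratios = log_expr[:, [j]] - log_expr          # log2(a_j / a_k)
            sds = ratios.std(axis=0, ddof=1)
            m[j] = np.delete(sds, j).mean()               # skip k == j
        return m

    expr = np.random.default_rng(0).lognormal(size=(7, 5))  # 7 tissues, 5 genes
    print(genorm_m(expr))
    ```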

  2. Assessment of antibody library diversity through next generation sequencing and technical error compensation.

    PubMed

    Fantini, Marco; Pandolfini, Luca; Lisi, Simonetta; Chirichella, Michele; Arisi, Ivan; Terrigno, Marco; Goracci, Martina; Cremisi, Federico; Cattaneo, Antonino

    2017-01-01

    Antibody libraries are important resources for deriving antibodies for a wide range of applications, from structural and functional studies to intracellular protein interference studies and the development of new diagnostics and therapeutics. Whatever the goal, the key parameter of an antibody library is its complexity (also known as diversity), i.e. the number of distinct elements in the collection, which directly reflects the probability of finding in the library an antibody of sufficiently high affinity against a given antigen. Quantitative evaluation of antibody library complexity and quality has long been inadequately addressed, due to the high similarity and length of the library sequences. Complexity was usually inferred from the transformation efficiency and tested by fingerprinting and/or sequencing of a few hundred random library elements. Inferring complexity from such a small sample is, however, very rudimentary and gives limited information about the real diversity, because complexity does not scale linearly with sample size. Next-generation sequencing (NGS) has opened new ways to tackle the assessment of antibody library complexity and quality. However, much remains to be done to fully exploit the potential of NGS for the quantitative analysis of antibody repertoires and to overcome current limitations. To obtain a more reliable estimate of antibody library complexity, we show here a new, PCR-free NGS approach for sequencing antibody libraries on the Illumina platform, coupled to a new bioinformatic analysis and software tool (Diversity Estimator of Antibody Library, DEAL) that allows the complexity to be estimated reliably, taking the sequencing error into consideration.

  3. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    PubMed

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-02

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, no consensus about best practices is apparent. In the work described here, we compared popular statistical methods for detecting differential protein expression from quantitative MS data, using both controlled experiments with known quantitative differences for specific proteins used as standards and "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteomics data and is straightforward to apply.

  4. [A comparative analysis of the brain activity during the traditional and intensive forms of learning foreign languages].

    PubMed

    Bykova, L G; Bazylev, V N

    1994-01-01

    Dichotic listening tests were used to compare brain activity over time in 84 adult students learning foreign languages by traditional (36 subjects) or intensive (48 subjects) methods. No reliable difference in hemispheric asymmetry was detected between the two teaching methods. With both methods, a reliable majority of students showed activation of the hemisphere opposite to the one initially dominant. A correlation was found between the maximum quantitative shift in the right-ear coefficient and the level of success in colloquial practice, given the same initial language level and comparable memory capacity. The authors discuss the possibility of compiling an individual profile for each student from serial dichotic test results to assist the teacher.
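
    For reference, the right-ear coefficient tracked in such dichotic studies is conventionally defined as follows (a standard formula; the abstract itself does not state the exact definition used):

    ```latex
    % Right-ear coefficient: N_R and N_L are the numbers of correctly reported
    % right- and left-ear stimuli; positive values indicate a right-ear
    % (left-hemisphere) advantage.
    k_{RE} = \frac{N_R - N_L}{N_R + N_L} \times 100\%
    ```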

  5. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis.

    PubMed

    Gilhodes, Jean-Claude; Julé, Yvon; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

    Current literature on pulmonary fibrosis induced in animal models highlights the need for accurate, reliable and reproducible histological quantitative analysis. One of the major limits of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between laboratories. To achieve reliable and observer-independent quantification of lung fibrosis, we developed automated software for histological image analysis of digital images of entire lung sections. This automated analysis was compared with standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Fibrosis was expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that the tissue density indexes provided very accurate and reliable quantification of the morphological changes induced by BLM, even at the lowest concentration used (0.25 mg/kg). A 2D image of the entire lung section was reconstructed at high resolution (3.6 μm/pixel) from the tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p<0.0001) was found between the automated analysis and the standard evaluation methods above. This correlation establishes automated analysis as a novel end-point measure of BLM-induced lung fibrosis in mice, which will be very valuable for future preclinical drug explorations.
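
    A schematic sketch of the two density indexes described above (the tile size and the "high density" cutoff here are illustrative assumptions, not the study's calibrated values):

    ```python
    # Mean pulmonary tissue density and high-density tile frequency from a
    # binarized lung section (1 = stained tissue, bronchi/vessels excluded).
    import numpy as np

    def tissue_density_indexes(binary_tissue, tile=32, high_cutoff=0.5):
        h, w = binary_tissue.shape
        tiles = (binary_tissue[r:r + tile, c:c + tile].mean()
                 for r in range(0, h - tile + 1, tile)
                 for c in range(0, w - tile + 1, tile))
        densities = np.fromiter(tiles, dtype=float)
        return densities.mean(), (densities > high_cutoff).mean()

    section = (np.random.default_rng(0).random((512, 512)) > 0.6).astype(int)
    mean_density, high_freq = tissue_density_indexes(section)
    ```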

  6. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis

    PubMed Central

    Gilhodes, Jean-Claude; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

    Current literature on pulmonary fibrosis induced in animal models highlights the need for accurate, reliable and reproducible histological quantitative analysis. One of the major limits of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between laboratories. To achieve reliable and observer-independent quantification of lung fibrosis, we developed automated software for histological image analysis of digital images of entire lung sections. This automated analysis was compared with standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Fibrosis was expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that the tissue density indexes provided very accurate and reliable quantification of the morphological changes induced by BLM, even at the lowest concentration used (0.25 mg/kg). A 2D image of the entire lung section was reconstructed at high resolution (3.6 μm/pixel) from the tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p<0.0001) was found between the automated analysis and the standard evaluation methods above. This correlation establishes automated analysis as a novel end-point measure of BLM-induced lung fibrosis in mice, which will be very valuable for future preclinical drug explorations. PMID:28107543

  7. Impaired limb position sense after stroke: a quantitative test for clinical use.

    PubMed

    Carey, L M; Oke, L E; Matyas, T A

    1996-12-01

    A quantitative measure of wrist position sense was developed to advance clinical measurement of proprioceptive limb sensibility after stroke. Test-retest reliability, normative standards, and ability to discriminate impaired and unimpaired performance were investigated. Retest reliability was assessed over three sessions, and a matched-pairs study compared stroke and unimpaired subjects. Both wrists were tested, in counterbalanced order. Patients were tested in hospital-based rehabilitation units. Reliability was investigated on a consecutive sample of 35 adult stroke patients with a range of proprioceptive discrimination abilities and no evidence of neglect. A consecutive sample of 50 stroke patients and convenience sample of 50 healthy volunteers, matched for age, sex, and hand dominance, were tested in the normative-discriminative study. Age and sex were representative of the adult stroke population. The test required matching of imposed wrist positions using a pointer aligned with the axis of movement and a protractor scale. The test was reliable (r = .88 and .92) and observed changes of 8 degrees can be interpreted, with 95% confidence, as genuine. Scores of healthy volunteers ranged from 3.1 degrees to 10.9 degrees average error. The criterion of impairment was conservatively defined as 11 degrees (+/-4.8 degrees) average error. Impaired and unimpaired performance were well differentiated. Clinicians can confidently and quantitatively sample one aspect of proprioceptive sensibility in stroke patients using the wrist position sense test. Development of tests on other joints using the present approach is supported by our findings.

  8. Validation of HPLC and UV spectrophotometric methods for the determination of meropenem in pharmaceutical dosage form.

    PubMed

    Mendez, Andreas S L; Steppe, Martin; Schapoval, Elfrides E S

    2003-12-04

    A high-performance liquid chromatographic method and a UV spectrophotometric method for the quantitative determination of meropenem, a highly active carbapenem antibiotic, in powder for injection were developed in the present work. The parameters linearity, precision, accuracy, specificity, robustness, limit of detection and limit of quantitation were studied according to International Conference on Harmonization guidelines. Chromatography was carried out by a reversed-phase technique on an RP-18 column with a mobile phase composed of 30 mM monobasic phosphate buffer and acetonitrile (90:10, v/v), adjusted to pH 3.0 with orthophosphoric acid. The UV spectrophotometric method was performed at 298 nm. The samples were prepared in water, and the stability of meropenem in aqueous solution at 4 and 25 degrees C was studied. The results were satisfactory, with good stability after 24 h at 4 degrees C. Statistical analysis by Student's t-test showed no significant difference between the results obtained by the two methods. The proposed methods are highly sensitive, precise and accurate and can be used for the reliable quantitation of meropenem in pharmaceutical dosage form.

  9. Quantitative characterization of surface topography using spectral analysis

    NASA Astrophysics Data System (ADS)

    Jacobs, Tevis D. B.; Junge, Till; Pastewka, Lars

    2017-03-01

    Roughness determines many functional properties of surfaces, such as adhesion, friction, and (thermal and electrical) contact conductance. Recent analytical models and simulations enable quantitative prediction of these properties from knowledge of the power spectral density (PSD) of the surface topography. The utility of the PSD is that it contains statistical information that is unbiased by the particular scan size and pixel resolution chosen by the researcher. In this article, we first review the mathematical definition of the PSD, including the one- and two-dimensional cases, and common variations of each. We then discuss strategies for reconstructing an accurate PSD of a surface using topography measurements at different size scales. Finally, we discuss detecting and mitigating artifacts at the smallest scales, and computing upper/lower bounds on functional properties obtained from models. We accompany our discussion with virtual measurements on computer-generated surfaces. This discussion summarizes how to analyze topography measurements to reconstruct a reliable PSD. Analytical models demonstrate the potential for tuning functional properties by rationally tailoring surface topography—however, this potential can only be achieved through the accurate, quantitative reconstruction of the PSDs of real-world surfaces.
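
    A minimal sketch of a one-dimensional PSD estimate from a sampled line profile (a standard FFT-based computation; normalization conventions vary across the literature, so treat the prefactor as one common choice):

    ```python
    # One-sided 1D PSD of a height profile h(x) sampled at uniform spacing dx,
    # normalized so that integrating the PSD over q recovers the variance.
    import numpy as np

    def psd_1d(heights, dx):
        """Return angular wavevectors q and PSD C(q) of a 1D height profile."""
        n = len(heights)
        h = heights - np.mean(heights)            # remove the mean height
        H = np.fft.rfft(h)
        q = 2 * np.pi * np.fft.rfftfreq(n, d=dx)
        psd = (dx / n) * np.abs(H) ** 2
        return q[1:], psd[1:]                     # drop the q = 0 bin

    rng = np.random.default_rng(0)
    profile = 0.1 * np.cumsum(rng.normal(size=4096))  # random-walk "roughness"
    q, C = psd_1d(profile, dx=1.0)
    ```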

  10. Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.

    PubMed

    He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan

    2009-07-01

    Facial paralysis is the loss of voluntary muscle movement on one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. Motion information in the horizontal and vertical directions and appearance features on the apex frames are extracted based on local binary patterns (LBPs) in the temporal-spatial domain of each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of the uniform LBP is proposed to efficiently combine micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. A support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrate its accuracy and efficiency.
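
    A simplified sketch of the symmetry measurement: uniform-LBP histograms from the two facial halves compared with the resistor-average distance (my illustration of the ideas named above, not the authors' implementation):

    ```python
    # Uniform LBP histograms and the resistor-average distance (RAD), i.e. the
    # "parallel resistance" combination of the two directed KL divergences.
    import numpy as np
    from scipy.stats import entropy              # entropy(p, q) = KL(p || q)
    from skimage.feature import local_binary_pattern

    def lbp_hist(gray, P=8, R=1):
        lbp = local_binary_pattern(gray, P, R, method="uniform")
        hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
        return hist + 1e-12                      # avoid zero bins in the KL

    def resistor_average_distance(p, q):
        kl_pq = max(entropy(p, q), 1e-12)
        kl_qp = max(entropy(q, p), 1e-12)
        return 1.0 / (1.0 / kl_pq + 1.0 / kl_qp)

    rng = np.random.default_rng(0)
    left = rng.integers(0, 256, (64, 64))        # stand-ins for the two halves
    right = rng.integers(0, 256, (64, 64))
    print(resistor_average_distance(lbp_hist(left), lbp_hist(right)))
    ```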

  11. A preamplification approach to GMO detection in processed foods.

    PubMed

    Del Gaudio, S; Cirillo, A; Di Bernardo, G; Galderisi, U; Cipollaro, M

    2010-03-01

    DNA is widely used as a target for GMO analysis because of its stability and high detectability. Real-time PCR is the method routinely used in most analytical laboratories due to its quantitative performance and great sensitivity. Accurate DNA detection and quantification depend on the specificity and sensitivity of the amplification protocol as well as on the quality and quantity of the DNA used in the PCR. To enhance the sensitivity of real-time PCR and consequently expand the number of analyzable target genes, we applied a preamplification technique to processed foods, in which DNA can be present in low amounts and/or in degraded form, affecting the reliability of qualitative and quantitative results. The preamplification procedure utilizes a pool of primers targeting the genes of interest and is followed by real-time PCR reactions specific for each gene. An improvement in Ct values was found when comparing preamplified with non-preamplified DNA. The strategy reported in the present study will also be applicable to other fields requiring quantitative DNA testing by real-time PCR.
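
    As a worked example of what a Ct improvement means quantitatively (the values are hypothetical, and ~100% amplification efficiency is assumed):

    ```python
    # Each PCR cycle roughly doubles the template, so a Ct shift of d cycles
    # corresponds to about 2**d more effective starting template.
    ct_without_preamp = 34.0   # hypothetical Ct for non-preamplified DNA
    ct_with_preamp = 28.0      # hypothetical Ct after preamplification
    fold_gain = 2 ** (ct_without_preamp - ct_with_preamp)
    print(f"~{fold_gain:.0f}-fold more effective template")   # ~64-fold
    ```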

  12. [A novel quantitative approach to study dynamic anaerobic process at micro scale].

    PubMed

    Zhang, Zhong-Liang; Wu, Jing; Jiang, Jian-Kai; Jiang, Jie; Li, Huai-Zhi

    2012-11-01

    Anaerobic digestion is attracting more and more interest because of advantages such as low cost and the recovery of clean energy. To overcome the drawbacks of existing methods for studying the dynamic anaerobic process, a novel quantitative microscale approach at the granule level was developed, combining a microdevice with quantitative image analysis techniques. The experiment displayed the process and characteristics of gas production under static conditions for the first time, and the results indicated that the method has satisfactory repeatability. The gas production process under static conditions could be divided into three stages: a rapid linear increase, a decelerating increase, and a slow linear increase. The rapid linear stage was long and the biogas rate was high under a high initial organic loading rate. The results showed that it is feasible to carry out the anaerobic process in the microdevice; furthermore, this novel method is reliable and can clearly display the dynamic anaerobic reaction at the micro scale. The results are helpful for understanding the anaerobic process.

  13. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    PubMed

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    Correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of available approaches allowing quantitative analysis of the performance of the human movement control system, the clinical assessment and diagnostic approach to fall risk still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits of these scales, i.e., limited granularity and limited inter-/intra-examiner reliability, and to obtain objective scores and more detailed information allowing fall risk to be predicted. We used the Microsoft Kinect to record subjects' movements while they performed challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform classification based on the clinical score. We obtained good accuracy (~82%) and especially high sensitivity (~83%).
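
    A schematic of the classification step (the features and labels are simulated stand-ins; the abstract does not disclose the actual feature set or classifier parameters):

    ```python
    # Cross-validated SVM classification of balance-exercise features,
    # reporting accuracy and sensitivity as in the study.
    import numpy as np
    from sklearn.metrics import accuracy_score, recall_score
    from sklearn.model_selection import cross_val_predict
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    X = rng.normal(size=(120, 6))        # e.g. sway, trunk velocity (stand-ins)
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.8, 120) > 0).astype(int)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    pred = cross_val_predict(clf, X, y, cv=5)
    print("accuracy:", accuracy_score(y, pred))
    print("sensitivity:", recall_score(y, pred))  # recall of the at-risk class
    ```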

  14. Isobaric Tags for Relative and Absolute Quantification (iTRAQ)-Based Untargeted Quantitative Proteomic Approach To Identify Change of the Plasma Proteins by Salbutamol Abuse in Beef Cattle.

    PubMed

    Zhang, Kai; Tang, Chaohua; Liang, Xiaowei; Zhao, Qingyu; Zhang, Junmin

    2018-01-10

    Salbutamol, a selective β2-agonist, endangers the safety of animal products as a result of illegal use in food animals. In this study, an iTRAQ-based untargeted quantitative proteomic approach was applied to screen potential protein biomarkers in plasma of cattle before and after treatment with salbutamol for 21 days. A total of 62 plasma proteins were significantly affected by salbutamol treatment, which can be used as potential biomarkers to screen for the illegal use of salbutamol in beef cattle. Enzyme-linked immunosorbent assay measurements of five selected proteins demonstrated the reliability of iTRAQ-based proteomics in screening of candidate biomarkers among the plasma proteins. The plasma samples collected before and after salbutamol treatment were well-separated by principal component analysis (PCA) using the differentially expressed proteins. These results suggested that an iTRAQ-based untargeted quantitative proteomic strategy combined with PCA pattern recognition methods can discriminate differences in plasma protein profiles collected before and after salbutamol treatment.
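
    A hedged sketch of the PCA pattern-recognition step on simulated abundances (not the study's pipeline or data):

    ```python
    # Projecting before/after-treatment samples onto the first two principal
    # components of the differentially expressed protein abundances.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(7)
    before = rng.normal(0.0, 1.0, size=(10, 62))   # 62 differential proteins
    after = rng.normal(1.0, 1.0, size=(10, 62))    # shifted mean abundances
    scores = PCA(n_components=2).fit_transform(np.vstack([before, after]))
    print(scores[:10, 0].mean(), scores[10:, 0].mean())  # groups split on PC1
    ```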

  15. Quality evaluation of Shenmaidihuang Pills based on the chromatographic fingerprints and simultaneous determination of seven bioactive constituents.

    PubMed

    Liu, Sifei; Zhang, Guangrui; Qiu, Ying; Wang, Xiaobo; Guo, Lihan; Zhao, Yanxin; Tong, Meng; Wei, Lan; Sun, Lixin

    2016-12-01

    In this study, we aimed to establish a comprehensive and practical quality evaluation system for Shenmaidihuang pills. A simple and reliable high-performance liquid chromatography method coupled with photodiode array detection was developed for both fingerprint analysis and quantitative determination. In the fingerprint analysis, relative retention time and relative peak area were used to identify the common peaks in the 18 samples investigated. Twenty-one peaks were selected as common peaks to evaluate the similarities of 18 Shenmaidihuang pills samples with different manufacturing dates. Similarity analysis was applied to evaluate the similarity of the samples, and hierarchical cluster analysis and principal component analysis were also performed to evaluate the variation among Shenmaidihuang pills. In the quantitative analysis, linear regression, injection precision, recovery, repeatability and sample stability were all tested, and good results were obtained for the simultaneous determination of the seven identified compounds, namely 5-hydroxymethylfurfural, morroniside, loganin, paeonol, paeoniflorin, psoralen and isopsoralen. The contents of some analytes differed significantly between batches, especially 5-hydroxymethylfurfural. It was therefore concluded that the chromatographic fingerprint method obtained by high-performance liquid chromatography coupled with photodiode array detection, associated with multiple-compound determination, is a powerful and meaningful tool for comprehensive quality control of Shenmaidihuang pills. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
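
    The similarity analysis referred to above is typically the cosine (congruence) coefficient between a sample's common-peak areas and a reference fingerprint; a minimal sketch (the exact metric is an assumption, as the abstract does not specify it):

    ```python
    # Cosine similarity between two vectors of common-peak areas.
    import numpy as np

    def fingerprint_similarity(sample, reference):
        s = np.asarray(sample, dtype=float)
        r = np.asarray(reference, dtype=float)
        return float(s @ r / (np.linalg.norm(s) * np.linalg.norm(r)))

    reference = np.array([12.1, 8.4, 5.0, 3.3, 2.2])   # hypothetical peak areas
    batch = np.array([11.7, 8.9, 4.6, 3.5, 2.0])
    print(round(fingerprint_similarity(batch, reference), 4))
    ```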

  16. Development of life story experience (LSE) scales for migrant dentists in Australia: a sequential qualitative-quantitative study.

    PubMed

    Balasubramanian, M; Spencer, A J; Short, S D; Watkins, K; Chrisopoulos, S; Brennan, D S

    2016-09-01

    The integration of qualitative and quantitative approaches introduces new avenues to bridge the strengths and address the weaknesses of both methods. The aim was to develop measure(s) of migrant dentist experiences in Australia through a mixed methods approach. The sequential qualitative-quantitative design involved first the harvesting of data items from a qualitative study, followed by a national survey of migrant dentists in Australia. Statements representing unique experiences in migrant dentists' life stories were deployed in the survey questionnaire using a five-point Likert scale, and factor analysis was used to examine component factors. Eighty-two statements from 51 participants were harvested from the qualitative analysis. A total of 1,022 of 1,977 migrant dentists (response rate 54.5%) returned completed questionnaires. Factor analysis supported an initial eight-factor solution; further scale development and reliability analysis led to five scales with a final list of 38 life story experience (LSE) items. Three scales were based on home country events: health system and general lifestyle concerns (LSE1; 10 items), society and culture (LSE4; 4 items) and career development (LSE5; 4 items). Two scales covered migrant experiences in Australia: appreciation of the Australian way of life (LSE2; 13 items) and settlement concerns (LSE3; 7 items). The five life story experience scales provide the conceptual clarity and empirical grounding needed to explore migrant dentist experiences in Australia. Being based on original migrant dentist narrations, these scales have the potential to offer in-depth insights for policy makers and to support future research on dentist migration. Copyright © 2016 Dennis Barber Ltd
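
    A rough sketch of the factor-extraction step on simulated Likert data (illustrative only; the block structure below merely mimics the five-scale, 38-item result reported above):

    ```python
    # Extracting a five-factor solution with varimax rotation from simulated
    # 38-item responses (sklearn >= 0.24 for the rotation argument).
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(3)
    latent = rng.normal(size=(1022, 5))          # 5 underlying factors
    sizes = [10, 13, 7, 4, 4]                    # item counts of the 5 scales
    loadings = np.zeros((5, 38))
    start = 0
    for f, size in enumerate(sizes):             # each factor loads one block
        loadings[f, start:start + size] = rng.uniform(0.6, 0.9, size)
        start += size
    items = latent @ loadings + rng.normal(scale=0.7, size=(1022, 38))

    fa = FactorAnalysis(n_components=5, rotation="varimax").fit(items)
    print(np.round(fa.components_[:, :6], 2))    # loadings of the first items
    ```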

  17. Detection, monitoring, and quantitative analysis of wildfires with the BIRD satellite

    NASA Astrophysics Data System (ADS)

    Oertel, Dieter A.; Briess, Klaus; Lorenz, Eckehard; Skrbek, Wolfgang; Zhukov, Boris

    2004-02-01

    Increasing concern about the environment and interest in avoiding losses have led to growing demands on spaceborne fire detection, monitoring and quantitative parameter estimation of wildfires. The global change research community intends to quantify the amount of gaseous and particulate matter emitted by vegetation fires, peat fires and coal seam fires. The DLR Institute of Space Sensor Technology and Planetary Exploration (Berlin-Adlershof) developed a small satellite called BIRD (Bi-spectral Infrared Detection), which carries a sensor package specially designed for fire detection. BIRD was launched as a piggy-back satellite on October 22, 2001 on ISRO's Polar Satellite Launch Vehicle (PSLV). It circles the Earth in a polar, sun-synchronous orbit at an altitude of 572 km, providing unique data for detailed analysis of high-temperature events on the Earth's surface. The BIRD sensor package is dedicated to high-resolution and reliable fire recognition, and active fire analysis is possible in the sub-pixel domain. The leading channel for fire detection and monitoring is the MIR channel at 3.8 μm. The rejection of false alarms is based on procedures using MIR/NIR (Middle Infrared/Near Infrared) and MIR/TIR (Middle Infrared/Thermal Infrared) radiance ratio thresholds. Unique results of BIRD wildfire detection and analysis over fire-prone regions in Australia and Asia are presented. BIRD successfully demonstrates innovative fire recognition technology for small satellites, permitting retrieval of quantitative characteristics of actively burning wildfires, such as the equivalent fire temperature, fire area, radiative energy release, fire front length and fire front strength.
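
    A toy sketch of the ratio-threshold rejection logic described above (the threshold values are illustrative assumptions, not BIRD's operational parameters):

    ```python
    # Flag candidate fire pixels whose MIR radiance dominates both the NIR and
    # TIR radiances, suppressing sun-glint and warm-surface false alarms.
    import numpy as np

    def candidate_fire_mask(mir, nir, tir, r_mir_nir=2.0, r_mir_tir=1.5):
        mir, nir, tir = (np.asarray(a, dtype=float) for a in (mir, nir, tir))
        return (mir / nir > r_mir_nir) & (mir / tir > r_mir_tir)
    ```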

  18. The Test of Masticating and Swallowing Solids (TOMASS): Reliability, Validity and International Normative Data

    ERIC Educational Resources Information Center

    Huckabee, Maggie-Lee; McIntosh, Theresa; Fuller, Laura; Curry, Morgan; Thomas, Paige; Walshe, Margaret; McCague, Ellen; Battel, Irene; Nogueira, Dalia; Frank, Ulrike; van den Engel-Hoek, Lenie; Sella-Weiss, Oshrat

    2018-01-01

    Background: Clinical swallowing assessment is largely limited to qualitative assessment of behavioural observations. There are limited quantitative data that can be compared with a healthy population for identification of impairment. The Test of Masticating and Swallowing Solids (TOMASS) was developed as a quantitative assessment of solid bolus…

  19. A Methodological Self-Study of Quantitizing: Negotiating Meaning and Revealing Multiplicity

    ERIC Educational Resources Information Center

    Seltzer-Kelly, Deborah; Westwood, Sean J.; Pena-Guzman, David M.

    2012-01-01

    This inquiry developed during the process of "quantitizing" qualitative data the authors had gathered for a mixed methods curriculum efficacy study. Rather than providing the intended rigor to their data coding process, their use of an intercoder reliability metric prompted their investigation of the multiplicity and messiness that, as they…

  20. Comprehensive Comparison of Self-Administered Questionnaires for Measuring Quantitative Autistic Traits in Adults

    ERIC Educational Resources Information Center

    Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M.

    2014-01-01

    We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype…
