Sample records for the query "provide quantitative tests"

  1. 40 CFR 796.2750 - Sediment and soil adsorption isotherm.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... are highly reproducible. The test provides excellent quantitative data readily amenable to statistical... combination of methods suitable for the identification and quantitative detection of the parent test chemical... quantitative analysis of the parent chemical. (3) Amount of parent test chemical applied, the amount recovered...

  2. 40 CFR 796.2750 - Sediment and soil adsorption isotherm.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... highly reproducible. The test provides excellent quantitative data readily amenable to statistical... combination of methods suitable for the identification and quantitative detection of the parent test chemical... quantitative analysis of the parent chemical. (3) Amount of parent test chemical applied, the amount recovered...

  3. 40 CFR 796.2750 - Sediment and soil adsorption isotherm.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... highly reproducible. The test provides excellent quantitative data readily amenable to statistical... combination of methods suitable for the identification and quantitative detection of the parent test chemical... quantitative analysis of the parent chemical. (3) Amount of parent test chemical applied, the amount recovered...

  4. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  5. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  6. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  7. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  8. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  9. Generalized likelihood ratios for quantitative diagnostic test scores.

    PubMed

    Tandberg, D; Deely, J J; O'Malley, A J

    1997-11-01

    The reduction of quantitative diagnostic test scores to the dichotomous case is a wasteful and unnecessary simplification in the era of high-speed computing. Physicians could make better use of the information embedded in quantitative test results if modern generalized curve estimation techniques were applied to the likelihood functions of Bayes' theorem. Hand calculations could be completely avoided and computed graphical summaries provided instead. Graphs showing posttest probability of disease as a function of pretest probability with confidence intervals (POD plots) would enhance acceptance of these techniques if they were immediately available at the computer terminal when test results were retrieved. Such constructs would also provide immediate feedback to physicians when a valueless test had been ordered.
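    The posttest-vs-pretest relationship this abstract describes follows directly from Bayes' theorem in odds form. A minimal sketch (the function name and the example likelihood ratio are illustrative, not from the paper):

```python
def posttest_probability(pretest_prob, likelihood_ratio):
    """Bayes' theorem in odds form:
    posttest odds = pretest odds * likelihood ratio."""
    pretest_odds = pretest_prob / (1.0 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1.0 + posttest_odds)

# A test score carrying a likelihood ratio of 4.0 moves a 30%
# pretest probability of disease to roughly 63%.
print(round(posttest_probability(0.30, 4.0), 2))
```

    A POD plot as the authors describe it would simply graph this function over pretest probabilities from 0 to 1, using the likelihood ratio attached to an observed test score.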

  10. Cost and Efficacy Assessment of an Alternative Medication Compliance Urine Drug Testing Strategy.

    PubMed

    Doyle, Kelly; Strathmann, Frederick G

    2017-02-01

    This study investigates the frequency at which quantitative results provide additional clinical benefit compared to qualitative results alone. A comparison between alternative urine drug screens and conventional screens including the assessment of cost-to-payer differences, accuracy of prescription compliance or polypharmacy/substance abuse was also included. In a reference laboratory evaluation of urine specimens from across the United States, 213 urine specimens with provided prescription medication information (302 prescriptions) were analyzed by two testing algorithms: 1) conventional immunoassay screen with subsequent reflexive testing of positive results by quantitative mass spectrometry; and 2) a combined immunoassay/qualitative mass-spectrometry screen that substantially reduced the need for subsequent testing. The qualitative screen was superior to immunoassay with reflex to mass spectrometry in confirming compliance per prescription (226/302 vs 205/302), and identifying non-prescription abuse (97 vs 71). Pharmaceutical impurities and inconsistent drug metabolite patterns were detected in only 3.8% of specimens, suggesting that quantitative results have limited benefit. The percentage difference between the conventional testing algorithm and the alternative screen was projected to be 55%, and a 2-year evaluation of test utilization as a measure of test order volume follows an exponential trend for alternative screen test orders over conventional immunoassay screens that require subsequent confirmation testing. Alternative, qualitative urine drug screens provide a less expensive, faster, and more comprehensive evaluation of patient medication compliance and drug abuse. The vast majority of results were interpretable with qualitative results alone indicating a reduced need to automatically reflex to quantitation or provide quantitation for the majority of patients. 
This highlights a successful way for both the laboratory and the physician to align clinical needs while remaining mindful of costs.

  11. Sweat testing to evaluate autonomic function

    PubMed Central

    Illigens, Ben M.W.; Gibbons, Christopher H.

    2011-01-01

    Sudomotor dysfunction is one of the earliest detectable neurophysiologic abnormalities in distal small fiber neuropathy. Traditional neurophysiologic measurements of sudomotor function include thermoregulatory sweat testing (TST), quantitative sudomotor axon reflex testing (QSART), silicone impressions, the sympathetic skin response (SSR), and the recent addition of quantitative direct and indirect axon reflex testing (QDIRT). These testing techniques, when used in combination, can detect and localize pre- and postganglionic lesions, provide early diagnosis of sudomotor dysfunction, and monitor disease progression or recovery. In this article, we review the common tests available for assessment of sudomotor function, detail the testing methodology, review the limitations, and provide examples of test results. PMID:18989618

  12. The other half of the story: effect size analysis in quantitative research.

    PubMed

    Maher, Jessica Middlemis; Markey, Jonathan C; Ebert-May, Diane

    2013-01-01

    Statistical significance testing is the cornerstone of quantitative research, but studies that fail to report measures of effect size are potentially missing a robust part of the analysis. We provide a rationale for why effect size measures should be included in quantitative discipline-based education research. Examples from both biological and educational research demonstrate the utility of effect size for evaluating practical significance. We also provide details about some effect size indices that are paired with common statistical significance tests used in educational research and offer general suggestions for interpreting effect size measures. Finally, we discuss some inherent limitations of effect size measures and provide further recommendations about reporting confidence intervals.
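    The pairing of a significance test with an effect size index that this paper recommends can be illustrated with Cohen's d for two independent samples. A minimal sketch with made-up data (the helper name and the numbers are mine, not the authors'):

```python
import math

def cohens_d(sample_a, sample_b):
    """Cohen's d: standardized mean difference using the pooled SD."""
    na, nb = len(sample_a), len(sample_b)
    mean_a = sum(sample_a) / na
    mean_b = sum(sample_b) / nb
    var_a = sum((x - mean_a) ** 2 for x in sample_a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in sample_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (mean_a - mean_b) / pooled_sd

treated = [78, 80, 76, 82, 79]   # hypothetical exam scores
control = [70, 72, 68, 75, 71]
print(round(cohens_d(treated, control), 2))
```

    Reported alongside a t-test's P value, d conveys the practical magnitude of the difference, which the P value alone does not.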

  13. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  14. Evaluation of a rapid quantitative determination method of PSA concentration with gold immunochromatographic strips.

    PubMed

    Wu, Cheng-Ching; Lin, Hung-Yu; Wang, Chao-Ping; Lu, Li-Fen; Yu, Teng-Hung; Hung, Wei-Chin; Houng, Jer-Yiing; Chung, Fu-Mei; Lee, Yau-Jiunn; Hu, Jin-Jia

    2015-11-03

    Prostate cancer remains the most common cancer in men. Qualitative or semi-quantitative immunochromatographic measurements of prostate specific antigen (PSA) have been shown to be simple, noninvasive and feasible. The aim of this study was to evaluate an optimized gold immunochromatographic strip device for the detection of PSA, in which the results can be analysed using a Chromogenic Rapid Test Reader to quantitatively assess the test results. This reader measures the reflectance of the signal line via a charge-coupled device camera. For quantitative analysis, PSA concentration was computed via a calibration equation. Capillary blood samples from 305 men were evaluated, and two independent observers interpreted the test results after 12 min. Blood samples were also collected and tested with a conventional quantitative assay. Sensitivity, specificity, positive and negative predictive values, and accuracy of the PSA rapid quantitative test system were 100, 96.6, 89.5, 100, and 97.4 %, respectively. Reproducibility of the test was 99.2 %, interobserver variation was 8 %, and the false-positive rate was 3.4 %. The correlation coefficient between the ordinary quantitative assay and the rapid quantitative test was 0.960. The PSA rapid quantitative test system provided results quickly and was easy to use, so that tests using this system can be easily performed at outpatient clinics or elsewhere. This system may also be useful for initial cancer screening and for point-of-care testing, because results can be obtained within 12 min and at a cost lower than that of conventional quantitative assays.
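    The performance figures quoted above follow from the standard confusion-matrix definitions. A small sketch; the counts below are illustrative values chosen to be consistent with the percentages reported in the abstract, not the study's raw data:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard test-performance measures from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

metrics = diagnostic_metrics(tp=68, fp=8, tn=229, fn=0)
print({name: round(value, 3) for name, value in metrics.items()})
```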

  15. Providing Quantitative Information and a Nudge to Undergo Stool Testing in a Colorectal Cancer Screening Decision Aid: A Randomized Clinical Trial.

    PubMed

    Schwartz, Peter H; Perkins, Susan M; Schmidt, Karen K; Muriello, Paul F; Althouse, Sandra; Rawl, Susan M

    2017-08-01

    Guidelines recommend that patient decision aids should provide quantitative information about probabilities of potential outcomes, but the impact of this information is unknown. Behavioral economics suggests that patients confused by quantitative information could benefit from a "nudge" towards one option. We conducted a pilot randomized trial to estimate the effect sizes of presenting quantitative information and a nudge. Primary care patients (n = 213) eligible for colorectal cancer screening viewed basic screening information and were randomized to view (a) quantitative information (quantitative module), (b) a nudge towards stool testing with the fecal immunochemical test (FIT) (nudge module), (c) neither a nor b, or (d) both a and b. Outcome measures were perceived colorectal cancer risk, screening intent, preferred test, and decision conflict, measured before and after viewing the decision aid, and screening behavior at 6 months. Patients viewing the quantitative module were more likely to be screened than those who did not ( P = 0.012). Patients viewing the nudge module had a greater increase in perceived colorectal cancer risk than those who did not ( P = 0.041). Those viewing the quantitative module had a smaller increase in perceived risk than those who did not ( P = 0.046), and the effect was moderated by numeracy. Among patients with high numeracy who did not view the nudge module, those who viewed the quantitative module had a greater increase in intent to undergo FIT ( P = 0.028) than did those who did not. The limitations of this study were the limited sample size and single healthcare system. Adding quantitative information to a decision aid increased uptake of colorectal cancer screening, while adding a nudge to undergo FIT did not increase uptake. Further research on quantitative information in decision aids is warranted.

  16. Establish an Agent-Simulant Technology Relationship (ASTR)

    DTIC Science & Technology

    2017-04-14

    for quantitative measures that characterize simulant performance in testing, such as the ability to be removed from surfaces. Component-level ASTRs... Overall Test and Agent-Simulant Technology Relationship (ASTR) process. 1.2 Background. a. Historically, many tests did not develop quantitative... methodology report. Report provides a VX-TPP ASTR for post-decon contact hazard and off-gassing. In the Stryker production verification test (PVT

  17. QTest: Quantitative Testing of Theories of Binary Choice

    PubMed Central

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  18. The Quantitative-MFG Test: A Linear Mixed Effect Model to Detect Maternal-Offspring Gene Interactions.

    PubMed

    Clark, Michelle M; Blangero, John; Dyer, Thomas D; Sobel, Eric M; Sinsheimer, Janet S

    2016-01-01

    Maternal-offspring gene interactions, also known as maternal-fetal genotype (MFG) incompatibilities, are neglected in complex disease and quantitative trait studies. They are implicated in diseases with onset from birth through adulthood, but there are limited ways to investigate their influence on quantitative traits. We present the quantitative-MFG (QMFG) test, a linear mixed model where maternal and offspring genotypes are fixed effects and residual correlations between family members are random effects. The QMFG handles families of any size, common or general scenarios of MFG incompatibility, and additional covariates. We develop likelihood ratio tests (LRTs) and rapid score tests and show that they provide correct inference. In addition, the LRT's alternative model provides unbiased parameter estimates. We show that testing the association of SNPs by fitting a standard model, which only considers the offspring genotypes, has very low power or can lead to incorrect conclusions. We also show that offspring genetic effects are missed if the MFG modeling assumptions are too restrictive. With genome-wide association study data from the San Antonio Family Heart Study, we demonstrate that the QMFG score test is an effective and rapid screening tool. The QMFG test therefore has important potential to identify pathways of complex diseases for which the genetic etiology remains to be discovered. © 2015 John Wiley & Sons Ltd/University College London.

  19. 21 CFR 809.10 - Labeling for in vitro diagnostic products.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... procedure, e.g., qualitative or quantitative. (3) Summary and explanation of the test. Include a short... provides other than quantitative results, provide an adequate description of expected results. (10... are met: (i) For a product in the laboratory research phase of development, and not represented as an...

  20. 21 CFR 809.10 - Labeling for in vitro diagnostic products.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... procedure, e.g., qualitative or quantitative. (3) Summary and explanation of the test. Include a short... provides other than quantitative results, provide an adequate description of expected results. (10... are met: (i) For a product in the laboratory research phase of development, and not represented as an...

  21. 21 CFR 809.10 - Labeling for in vitro diagnostic products.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... procedure, e.g., qualitative or quantitative. (3) Summary and explanation of the test. Include a short... provides other than quantitative results, provide an adequate description of expected results. (10... are met: (i) For a product in the laboratory research phase of development, and not represented as an...

  22. 21 CFR 809.10 - Labeling for in vitro diagnostic products.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... procedure, e.g., qualitative or quantitative. (3) Summary and explanation of the test. Include a short... provides other than quantitative results, provide an adequate description of expected results. (10... are met: (i) For a product in the laboratory research phase of development, and not represented as an...

  23. Prototype ultrasonic instrument for quantitative testing

    NASA Technical Reports Server (NTRS)

    Lynnworth, L. C.; Dubois, J. L.; Kranz, P. R.

    1972-01-01

    A prototype ultrasonic instrument has been designed and developed for quantitative testing. The complete delivered instrument consists of a pulser/receiver which plugs into a standard oscilloscope, an rf power amplifier, a standard decade oscillator, and a set of broadband transducers for typical use at 1, 2, 5 and 10 MHz. The system provides for its own calibration, and on the oscilloscope, presents a quantitative (digital) indication of time base and sensitivity scale factors and some measurement data.

  24. Metrologies for quantitative nanomechanical testing and quality control in semiconductor manufacturing

    NASA Astrophysics Data System (ADS)

    Pratt, Jon R.; Kramar, John A.; Newell, David B.; Smith, Douglas T.

    2005-05-01

    If nanomechanical testing is to evolve into a tool for process and quality control in semiconductor fabrication, great advances in throughput, repeatability, and accuracy of the associated instruments and measurements will be required. A recent grant awarded by the NIST Advanced Technology Program seeks to address the throughput issue by developing a high-speed AFM-based platform for quantitative nanomechanical measurements. The following paper speaks to the issue of quantitative accuracy by presenting an overview of various standards and techniques under development at NIST and other national metrology institutes (NMIs) that can provide a metrological basis for nanomechanical testing. The infrastructure we describe places firm emphasis on traceability to the International System of Units, paving the way for truly quantitative, rather than qualitative, physical property testing.

  25. Bence-Jones protein - quantitative

    MedlinePlus

    ... this page: //medlineplus.gov/ency/article/003597.htm Quantitative Bence-Jones protein test ... Todd Gersten, MD, Hematology/Oncology, Florida Cancer Specialists & Research Institute, Wellington, FL. Review provided by VeriMed Healthcare ...

  26. Characterization of breast lesion using T1-perfusion magnetic resonance imaging: Qualitative vs. quantitative analysis.

    PubMed

    Thakran, S; Gupta, P K; Kabra, V; Saha, I; Jain, P; Gupta, R K; Singh, A

    2018-06-14

    The objective of this study was to quantify the hemodynamic parameters using first-pass analysis of T1-perfusion magnetic resonance imaging (MRI) data of the human breast, and to compare these parameters with the existing tracer kinetic parameters and with semi-quantitative and qualitative T1-perfusion analysis in terms of lesion characterization. MRI of the breast was performed in 50 women (mean age, 44 ± 11 [SD] years; range: 26-75 years) with a total of 15 benign and 35 malignant breast lesions. After pre-processing, T1-perfusion MRI data were analyzed using a qualitative approach by two radiologists (visual inspection of the kinetic curve into types I, II or III), a semi-quantitative approach (characterization of kinetic curve types using empirical parameters), a generalized tracer kinetic model (tracer kinetic parameters), and first-pass analysis (hemodynamic parameters). The chi-squared test, t-test, one-way analysis of variance (ANOVA) with Bonferroni post-hoc test, and receiver operating characteristic (ROC) curves were used for statistical analysis. All quantitative parameters except leakage volume (Ve), qualitative curves (types I and III), and semi-quantitative curves (types I and III) provided significant differences (P<0.05) between benign and malignant lesions. Kinetic parameters, particularly the volume transfer coefficient (Ktrans), provided a significant difference (P<0.05) between all grades except grade II vs. III. The hemodynamic parameter relative leakage-corrected breast blood volume (rBBVcorr) provided a statistically significant difference (P<0.05) between all grades. It also provided the highest sensitivity and specificity among all parameters in differentiating between grades of malignant breast lesions. Quantitative parameters, particularly rBBVcorr and Ktrans, provided similar sensitivity and specificity in differentiating benign from malignant breast lesions for this cohort. Moreover, rBBVcorr provided the best differentiation between grades of malignant breast lesions among all parameters. Copyright © 2018. Published by Elsevier Masson SAS.

  27. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during design, development, production, and operational phases are assessed. It will help determine what useful structural quantitative and qualitative data may be provided from raw materials to vehicle refurbishment. This assessment considers metal alloys systems and bonded composites presently applied in active NASA programs or strong contenders for future use. Quantitative and qualitative data has been summarized from recent literature, and in-house information, and presented along with a description of those structures or standards where the information was obtained. Examples, in tabular form, of NDT technique capabilities and limitations have been provided. NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic. Quantitative data is sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end item structure, and during refurbishment operations.

  28. Highlights from High Energy Neutrino Experiments at CERN

    NASA Astrophysics Data System (ADS)

    Schlatter, W.-D.

    2015-07-01

    Experiments with high energy neutrino beams at CERN provided early quantitative tests of the Standard Model. This article describes results from studies of the nucleon quark structure and of the weak current, together with the precise measurement of the weak mixing angle. These results have established a new quality for tests of the electroweak model. In addition, the measurements of the nucleon structure functions in deep inelastic neutrino scattering allowed first quantitative tests of QCD.

  29. Four-point bending as a method for quantitatively evaluating spinal arthrodesis in a rat model.

    PubMed

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-02-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague-Dawley rat spines after single-level posterolateral fusion procedures at L4-L5. Segments were classified as 'not fused,' 'restricted motion,' or 'fused' by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4-L5 motion segment, and stiffness was measured as the slope of the moment-displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery.
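    Stiffness as the slope of the moment-displacement curve, as measured here, is an ordinary least-squares slope. A minimal sketch with synthetic data (all values hypothetical, not from the study):

```python
def stiffness_slope(displacements, moments):
    """Least-squares slope of moment vs. displacement, i.e. the
    bending stiffness of the tested motion segment."""
    n = len(displacements)
    mean_x = sum(displacements) / n
    mean_y = sum(moments) / n
    num = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(displacements, moments))
    den = sum((x - mean_x) ** 2 for x in displacements)
    return num / den

displacement_mm = [0.0, 0.1, 0.2, 0.3, 0.4]   # synthetic 4-point-bend data
moment_nmm = [0.0, 4.1, 8.0, 12.2, 15.9]
print(round(stiffness_slope(displacement_mm, moment_nmm), 1))  # N*mm/mm
```

    Comparing this slope across treatment groups gives the quantitative ranking that manual palpation's fused/not-fused call cannot.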

  30. Testing the Protestant Ethic Thesis with Quantitative Historical Data: A Research Note

    ERIC Educational Resources Information Center

    Sanderson, Stephen K.; Abrutyn, Seth B.; Proctor, Kristopher R.

    2011-01-01

    We provide a test of the thesis that Protestantism influenced the development of modern capitalism by using quantitative data from 1500 through 1870. Results show that during this period the percentage of a country's population that is Protestant is unrelated to both its level of per capita GDP and the average rate of its annual growth in per…

  31. Selecting the most appropriate inferential statistical test for your quantitative research study.

    PubMed

    Bettany-Saltikov, Josette; Whittaker, Victoria Jane

    2014-06-01

    To discuss the issues and processes relating to the selection of the most appropriate statistical test. A review of the basic research concepts together with a number of clinical scenarios is used to illustrate this. Quantitative nursing research generally features the use of empirical data which necessitates the selection of both descriptive and statistical tests. Different types of research questions can be answered by different types of research designs, which in turn need to be matched to a specific statistical test(s). Discursive paper. This paper discusses the issues relating to the selection of the most appropriate statistical test and makes some recommendations as to how these might be dealt with. When conducting empirical quantitative studies, a number of key issues need to be considered. Considerations for selecting the most appropriate statistical tests are discussed and flow charts provided to facilitate this process. When nursing clinicians and researchers conduct quantitative research studies, it is crucial that the most appropriate statistical test is selected to enable valid conclusions to be made. © 2013 John Wiley & Sons Ltd.
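    The flow-chart logic the paper describes, matching the research question and data type to a test, can be caricatured as a small rule table. This is a deliberately crude sketch covering a few common scenarios, not the authors' actual charts:

```python
def suggest_test(outcome, groups, paired):
    """Very rough test-selection rules.
    outcome: 'continuous' or 'categorical'; groups: number of groups;
    paired: whether observations are paired/repeated measures."""
    if outcome == "continuous":
        if groups == 2:
            return "paired t-test" if paired else "independent t-test"
        return "one-way ANOVA"
    if outcome == "categorical":
        return "McNemar test" if paired else "chi-squared test"
    raise ValueError("unrecognized outcome type")

print(suggest_test("continuous", 2, paired=False))
```

    Real selection also depends on distributional assumptions and sample size, which is exactly why the paper's fuller flow charts are worth consulting.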

  32. Qualification Testing Versus Quantitative Reliability Testing of PV - Gaining Confidence in a Rapidly Changing Technology: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah; Repins, Ingrid L; Hacke, Peter L

    Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet, business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, consequence of a possible failure, variability in the manufacturing, installation, and operation, as well as uncertainty in the measured acceleration factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and prioritization of research to provide a technical basis both for the standards and the analysis related to the application of those standards. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.

  33. A Multicenter Trial of the Proficiency of Smart Quantitative Sensation Tests

    PubMed Central

    Dyck, Peter J.; Argyros, Barbara; Russell, James W.; Gahnstrom, Linde E.; Nalepa, Susan; Albers, James W.; Lodermeier, Karen A.; Zafft, Andrew J.; Dyck, P. James B.; Klein, Christopher J.; Litchy, William J.; Davies, Jenny L.; Carter, Rickey E.; Melton, L. Joseph

    2014-01-01

    Introduction: We assessed proficiency (accuracy and intra- and inter-test reproducibility) of smart quantitative sensation tests (smart QSTs) in subjects without and with diabetic polyneuropathy (DSPN). Methods: Technologists from 3 medical centers using different but identical QSTs independently assessed 6 modalities of sensation of the foot (or leg) twice in patients without (n = 6) and with (n = 6) DSPN using smart computer-assisted QSTs. Results: Low rates of test abnormalities were observed in health and high rates in DSPN. Very high intra-class correlations were obtained between continuous measures of QSTs and neuropathy signs, symptoms, or nerve conductions (NCs). No significant intra- or inter-test differences were observed. Discussion: These results provide proof of concept that smart QSTs provide accurate assessment of sensation loss without intra- or inter-test differences, useful for multicenter trials. Smart technology makes possible efficient testing of body-surface-area sensation loss in symmetric length-dependent sensorimotor polyneuropathies. PMID:23929701

  14. Advancing the study of violence against women using mixed methods: integrating qualitative methods into a quantitative research program.

    PubMed

    Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol

    2011-02-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.

  15. A test for selection employing quantitative trait locus and mutation accumulation data.

    PubMed

    Rice, Daniel P; Townsend, Jeffrey P

    2012-04-01

    Evolutionary biologists attribute much of the phenotypic diversity observed in nature to the action of natural selection. However, for many phenotypic traits, especially quantitative phenotypic traits, it has been challenging to test for the historical action of selection. An important challenge for biologists studying quantitative traits, therefore, is to distinguish between traits that have evolved under the influence of strong selection and those that have evolved neutrally. Most existing tests for selection employ molecular data, but selection also leaves a mark on the genetic architecture underlying a trait. In particular, the distribution of quantitative trait locus (QTL) effect sizes and the distribution of mutational effects together provide information regarding the history of selection. Despite the increasing availability of QTL and mutation accumulation data, such data have not yet been effectively exploited for this purpose. We present a model of the evolution of QTL and employ it to formulate a test for historical selection. To provide a baseline for neutral evolution of the trait, we estimate the distribution of mutational effects from mutation accumulation experiments. We then apply a maximum-likelihood-based method of inference to estimate the range of selection strengths under which such a distribution of mutations could generate the observed QTL. Our test thus represents the first integration of population genetic theory and QTL data to measure the historical influence of selection.

  16. Condenser: a statistical aggregation tool for multi-sample quantitative proteomic data from Matrix Science Mascot Distiller™.

    PubMed

    Knudsen, Anders Dahl; Bennike, Tue; Kjeldal, Henrik; Birkelund, Svend; Otzen, Daniel Erik; Stensballe, Allan

    2014-05-30

    We describe Condenser, a freely available, comprehensive open-source tool for merging multidimensional quantitative proteomics data from the Matrix Science Mascot Distiller Quantitation Toolbox into a common format ready for subsequent bioinformatic analysis. A number of different relative quantitation technologies, such as metabolic (15)N and amino acid stable isotope incorporation, label-free and chemical-label quantitation are supported. The program features multiple options for curative filtering of the quantified peptides, allowing the user to choose data quality thresholds appropriate for the current dataset, and ensure the quality of the calculated relative protein abundances. Condenser also features optional global normalization, peptide outlier removal, multiple testing and calculation of t-test statistics for highlighting and evaluating proteins with significantly altered relative protein abundances. Condenser provides an attractive addition to the gold-standard quantitative workflow of Mascot Distiller, allowing easy handling of larger multi-dimensional experiments. Source code, binaries, test data set and documentation are available at http://condenser.googlecode.com/. Copyright © 2014 Elsevier B.V. All rights reserved.
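    The per-protein significance step described above (t statistics plus multiple-testing control) can be sketched in a few lines. This is a hedged illustration, not Condenser's actual code: it uses Welch's t statistic with a normal approximation for the two-sided p-value, and Benjamini-Hochberg control of the false discovery rate; function names and the FDR level are assumptions.

```python
import math
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic and an approximate two-sided p-value
    (normal approximation; rough for very small samples)."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    t = (mean(a) - mean(b)) / math.sqrt(va / len(a) + vb / len(b))
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))
    return t, p

def benjamini_hochberg(pvals, q=0.05):
    """Step-up BH procedure: return the set of indices rejected at FDR q."""
    order = sorted(range(len(pvals)), key=lambda i: pvals[i])
    m = len(pvals)
    cutoff = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= q * rank / m:
            cutoff = rank  # largest rank satisfying the BH condition
    return set(order[:cutoff])
```

In a workflow like the one described, `a` and `b` would be the relative abundances of one protein across replicates of two conditions, and the BH step would be applied across all quantified proteins.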

  17. Predictors of provider- initiated HIV testing and counseling refusal by outpatient department clients in Wolaita zone, Southern Ethiopia: a case control study.

    PubMed

    Facha, Wolde; Kassahun, Wondewosen; Workicho, Abdulhalik

    2016-08-12

    Despite various strategies designed to rapidly identify HIV-infected individuals, the majority of HIV-infected people in developing countries are unaware of their sero-status. The objective of this study was to assess predictors of provider-initiated HIV testing and counseling (PITC) refusal by outpatient department (OPD) clients in Wolaita zone, Southern Ethiopia. A facility-based unmatched case-control study was conducted on outpatient department clients in seven randomly selected health facilities in Wolaita zone, Southern Ethiopia, in February 2012. A total of 291 participants (97 cases and 194 controls) were included in our study. Cases were patients who refused HIV testing, while controls were patients who tested for HIV after PITC recommendation by OPD clinicians. We used both quantitative and qualitative methods of data collection. Pretested, interviewer-administered questionnaires were used by trained nurses to collect the quantitative data, and the principal investigator conducted in-depth interviews with 14 OPD clinicians to supplement the quantitative findings. Bivariate and multivariate analyses were done to identify independent predictors of PITC refusal by OPD clients. Study participants who had a stigmatizing attitude [AOR = 6.09, (95 % CI: 1.70, 21.76)], who had a perceived risk of HIV infection [AOR = 5.23, (95 % CI: 2.22, 12.32)], who did not perceive the benefits of PITC [AOR = 4.64, (95 % CI: 1.79, 12.01)], who did not get the minimum recommended pretest information from their providers [AOR = 2.98, (95 % CI: 1.06, 8.35)], who had never heard of the PITC service [AOR = 2.41, (95 % CI: 1.14, 5.09)], and who were from urban areas [AOR = 2.40, (95 % CI = 1.26, 4.57)] were more likely to refuse PITC than their counterparts.
    Knowledge of HIV/AIDS, attitudes towards people living with HIV/AIDS, and clients' perceived risk of HIV infection were the major barriers to acceptance of provider-initiated HIV testing and counseling. Health professionals working in outpatient departments should give due attention to overcoming these barriers so as to enhance HIV testing acceptance by their clients.

  18. Quantitative determination and validation of octreotide acetate using 1H-NMR spectroscopy with internal standard method.

    PubMed

    Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang

    2018-01-01

    Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (proton exchange), isolating the quantitative signals in the crowded spectrum of the peptide and ensuring precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including relaxation delay time, number of scans, and pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high performance liquid chromatography, the method provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicate that there was no significant difference between the two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards. It has potential for the quantitative analysis of other peptide drugs and for the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
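    The internal-standard calculation behind such an assay follows the standard qNMR relation; the sketch below is illustrative, and the symbols and example values are assumptions, not the paper's data.

```python
def qnmr_purity(i_x, n_x, m_x, w_x, i_std, n_std, m_std, w_std, p_std):
    """Purity of analyte x relative to an internal standard via the
    standard qNMR relation:
        P_x = (I_x/I_std) * (N_std/N_x) * (M_x/M_std) * (w_std/w_x) * P_std
    I: integrated signal area, N: protons under the signal,
    M: molar mass (g/mol), w: weighed mass (mg), P: purity fraction."""
    return (i_x / i_std) * (n_std / n_x) * (m_x / m_std) * (w_std / w_x) * p_std
```

For example (hypothetical numbers), a 2-proton analyte signal integrating to twice a 1-proton standard signal, with equal molar masses and weighed masses and a 100% pure standard, gives a purity of 1.0.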

  19. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    PubMed Central

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  20. Google glass based immunochromatographic diagnostic test analysis

    NASA Astrophysics Data System (ADS)

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2015-03-01

    Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.

  1. QUANTITATIVE GENETIC ACTIVITY GRAPHICAL PROFILES FOR USE IN CHEMICAL EVALUATION

    EPA Science Inventory

    A graphic approach termed a Genetic Activity Profile (GAP) has been developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each...

  2. Are quantitative cultures useful in the diagnosis of hospital-acquired pneumonia?

    PubMed

    San Pedro, G

    2001-02-01

    Noninvasive and invasive tests have been developed and studied for their utility in diagnosing and guiding the treatment of hospital-acquired pneumonia, a condition with an inherently high mortality. Early empiric antibiotic treatment has been shown to reduce mortality, so delaying this treatment until test results are available is not justifiable. Furthermore, tailoring therapy based on results of either noninvasive or invasive tests has not been clearly shown to affect morbidity and mortality. This may be related to quantitative limitations of these tests or possibly to a high false-negative rate in patients who receive early antibiotic treatment and may therefore have suppressed bacterial counts. Results of these tests, however, do influence treatment. It is therefore hoped that they may ultimately provide a rational basis for making therapeutic decisions. In the future, outcomes research should be a part of large-scale clinical trials, and noninvasive and invasive tests should be incorporated into the design in an attempt to provide a better understanding of the value of such tests.

  3. Test/QA Plan for Verification of Ozone Indicator Cards

    EPA Science Inventory

    This verification test will address ozone indicator cards (OICs) that provide short-term semi-quantitative measures of ozone concentration in ambient air. Testing will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Tec...

  4. Overview of T.E.S.T. (Toxicity Estimation Software Tool)

    EPA Science Inventory

    This talk provides an overview of T.E.S.T. (Toxicity Estimation Software Tool). T.E.S.T. predicts toxicity values and physical properties using a variety of different QSAR (quantitative structure activity relationship) approaches including hierarchical clustering, group contribut...

  5. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  6. Immunochromatographic diagnostic test analysis using Google Glass.

    PubMed

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2014-03-25

    We demonstrate a Google Glass-based rapid diagnostic test (RDT) reader platform capable of qualitative and quantitative measurements of various lateral flow immunochromatographic assays and similar biomedical diagnostics tests. Using a custom-written Glass application and without any external hardware attachments, one or more RDTs labeled with Quick Response (QR) code identifiers are simultaneously imaged using the built-in camera of the Google Glass that is based on a hands-free and voice-controlled interface and digitally transmitted to a server for digital processing. The acquired JPEG images are automatically processed to locate all the RDTs and, for each RDT, to produce a quantitative diagnostic result, which is returned to the Google Glass (i.e., the user) and also stored on a central server along with the RDT image, QR code, and other related information (e.g., demographic data). The same server also provides a dynamic spatiotemporal map and real-time statistics for uploaded RDT results accessible through Internet browsers. We tested this Google Glass-based diagnostic platform using qualitative (i.e., yes/no) human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) tests. For the quantitative RDTs, we measured activated tests at various concentrations ranging from 0 to 200 ng/mL for free and total PSA. This wearable RDT reader platform running on Google Glass combines a hands-free sensing and image capture interface with powerful servers running our custom image processing codes, and it can be quite useful for real-time spatiotemporal tracking of various diseases and personal medical conditions, providing a valuable tool for epidemiology and mobile health.
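    The calibration step described above (relating RDT line intensity to PSA concentration) reduces to fitting a response curve and inverting it to read unknowns. A minimal linear sketch with made-up intensity values follows; this is not the authors' image-processing pipeline, and a real lateral-flow response is typically nonlinear over this range.

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def concentration_from_intensity(intensity, slope, intercept):
    """Invert the calibration line to estimate a concentration (ng/mL)."""
    return (intensity - intercept) / slope
```

Usage: fit on (concentration, intensity) pairs from activated calibration tests, then apply `concentration_from_intensity` to the measured line intensity of an unknown sample.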

  7. Immunochromatographic Diagnostic Test Analysis Using Google Glass

    PubMed Central

    2014-01-01

    We demonstrate a Google Glass-based rapid diagnostic test (RDT) reader platform capable of qualitative and quantitative measurements of various lateral flow immunochromatographic assays and similar biomedical diagnostics tests. Using a custom-written Glass application and without any external hardware attachments, one or more RDTs labeled with Quick Response (QR) code identifiers are simultaneously imaged using the built-in camera of the Google Glass that is based on a hands-free and voice-controlled interface and digitally transmitted to a server for digital processing. The acquired JPEG images are automatically processed to locate all the RDTs and, for each RDT, to produce a quantitative diagnostic result, which is returned to the Google Glass (i.e., the user) and also stored on a central server along with the RDT image, QR code, and other related information (e.g., demographic data). The same server also provides a dynamic spatiotemporal map and real-time statistics for uploaded RDT results accessible through Internet browsers. We tested this Google Glass-based diagnostic platform using qualitative (i.e., yes/no) human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) tests. For the quantitative RDTs, we measured activated tests at various concentrations ranging from 0 to 200 ng/mL for free and total PSA. This wearable RDT reader platform running on Google Glass combines a hands-free sensing and image capture interface with powerful servers running our custom image processing codes, and it can be quite useful for real-time spatiotemporal tracking of various diseases and personal medical conditions, providing a valuable tool for epidemiology and mobile health. PMID:24571349

  8. Semi-Quantitative Scoring of an Immunochromatographic Test for Circulating Filarial Antigen

    PubMed Central

    Chesnais, Cédric B.; Missamou, François; Pion, Sébastien D. S.; Bopda, Jean; Louya, Frédéric; Majewski, Andrew C.; Weil, Gary J.; Boussinesq, Michel

    2013-01-01

    The value of a semi-quantitative scoring of the filarial antigen test (Binax Now Filariasis card test, ICT) results was evaluated during a field survey in the Republic of Congo. One hundred and thirty-four (134) of 774 tests (17.3%) were clearly positive and were scored 1, 2, or 3; and 11 (1.4%) had questionable results. Wuchereria bancrofti microfilariae (mf) were detected in 41 of those 133 individuals with an ICT test score ≥ 1 who also had a night blood smear; none of the 11 individuals with questionable ICT results harbored night mf. Cuzick's test showed a significant trend for higher microfilarial densities in groups with higher ICT scores (P < 0.001). The ICT scores were also significantly correlated with blood mf counts. Because filarial antigen levels provide an indication of adult worm infection intensity, our results suggest that semi-quantitative reading of the ICT may be useful for grading the intensity of filarial infections in individuals and populations. PMID:24019435
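    Cuzick's test for trend, used above to relate ICT scores to microfilarial densities, has a compact rank-based form. A minimal sketch follows: it uses average ranks for ties but omits the tie correction in the variance, so treat it as illustrative rather than a reference implementation.

```python
import math

def cuzick_trend(groups, scores):
    """Cuzick's nonparametric test for trend across ordered groups.
    groups: list of lists of measurements; scores: ordinal group scores.
    Returns the standardized z statistic (tie correction omitted)."""
    flat = [(v, s) for g, s in zip(groups, scores) for v in g]
    n = len(flat)
    # assign average ranks over the pooled sample
    svals = sorted(v for v, _ in flat)
    rank_of = {}
    i = 0
    while i < n:
        j = i
        while j < n and svals[j] == svals[i]:
            j += 1
        rank_of[svals[i]] = (i + 1 + j) / 2  # average of ranks i+1..j
        i = j
    t = sum(s * rank_of[v] for v, s in flat)
    sum_nl = sum(len(g) * s for g, s in zip(groups, scores))
    sum_nl2 = sum(len(g) * s * s for g, s in zip(groups, scores))
    e_t = (n + 1) / 2 * sum_nl
    var_t = (n + 1) / 12 * (n * sum_nl2 - sum_nl ** 2)
    return (t - e_t) / math.sqrt(var_t)
```

A positive z indicates values increasing with the group score (here, higher mf densities at higher ICT scores); |z| > 1.96 corresponds to P < 0.05 two-sided.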

  9. Proficiency testing as a basis for estimating uncertainty of measurement: application to forensic alcohol and toxicology quantitations.

    PubMed

    Wallace, Jack

    2010-05-01

    While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
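    Approach (ii) above, estimating uncertainty from a laboratory's deviations from consensus means over many proficiency rounds, can be sketched as follows. This is a simplified illustration that ignores uncertainty in the consensus values themselves; the coverage factor k = 2 gives roughly 95% coverage.

```python
from statistics import mean, stdev

def uncertainty_from_proficiency(lab_results, consensus_means, coverage_k=2):
    """Estimate (bias, expanded uncertainty) from a laboratory's
    differences from proficiency-test consensus means."""
    diffs = [lab - ref for lab, ref in zip(lab_results, consensus_means)]
    return mean(diffs), coverage_k * stdev(diffs)
```

For example, a blood-alcohol lab with results of 0.101, 0.099, 0.102, and 0.098 g/dL against consensus means of 0.100 g/dL would report zero bias and an expanded uncertainty of about 0.004 g/dL.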

  10. Tree Testing of Hierarchical Menu Structures for Health Applications

    PubMed Central

    Le, Thai; Chaudhuri, Shomir; Chung, Jane; Thompson, Hilaire J; Demiris, George

    2014-01-01

    To address the need for greater evidence-based evaluation of Health Information Technology (HIT) systems, we introduce a method of usability testing termed tree testing. In a tree test, participants are presented with an abstract hierarchical tree of the system taxonomy and asked to navigate through the tree in completing representative tasks. We apply tree testing to a commercially available health application, demonstrating a use case and providing a comparison with more traditional in-person usability testing methods. Online tree tests (N=54) and in-person usability tests (N=15) were conducted from August to September 2013. Tree testing provided a method to quantitatively evaluate the information structure of a system using various navigational metrics including completion time, task accuracy, and path length. The results of the analyses compared favorably to the results seen from the traditional usability test. Tree testing provides a flexible, evidence-based approach for researchers to evaluate the information structure of HITs. In addition, remote tree testing provides a quick, flexible, and high-volume method of acquiring feedback in a structured format that allows for quantitative comparisons. With the diverse nature and often large quantities of health information available, addressing issues of terminology and concept classifications during the early development process of a health information system will improve navigation through the system and save future resources. Tree testing is a usability method that can be used to quickly and easily assess the information hierarchy of health information systems. PMID:24582924
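    The navigational metrics named above (completion time, task accuracy, path length) are straightforward to aggregate from trial logs. A minimal sketch with hypothetical trial records; the record format is an assumption, not the study's actual data schema.

```python
def tree_test_metrics(trials):
    """Aggregate navigational metrics from tree-test trials.
    Each trial: (seconds, path, correct_leaf). Path length counts
    nodes visited; a trial is accurate if it ends at the correct leaf."""
    n = len(trials)
    accuracy = sum(path[-1] == leaf for _, path, leaf in trials) / n
    mean_time = sum(t for t, _, _ in trials) / n
    mean_path = sum(len(path) for _, path, _ in trials) / n
    return {"accuracy": accuracy, "mean_time_s": mean_time,
            "mean_path_len": mean_path}
```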

  11. Multicenter trial of the proficiency of smart quantitative sensation tests.

    PubMed

    Dyck, Peter J; Argyros, Barbara; Russell, James W; Gahnstrom, Linde E; Nalepa, Susan; Albers, James W; Lodermeier, Karen A; Zafft, Andrew J; Dyck, P James B; Klein, Christopher J; Litchy, William J; Davies, Jenny L; Carter, Rickey E; Melton, L Joseph

    2014-05-01

    We assessed proficiency (accuracy and intra- and intertest reproducibility) of smart quantitative sensation tests (smart QSTs) in subjects without and with diabetic sensorimotor polyneuropathy (DSPN). Technologists from 3 medical centers using different but identical QSTs independently assessed 6 modalities of sensation of the foot (or leg) twice in patients without (n = 6) and with (n = 6) DSPN using smart computer assisted QSTs. Low rates of test abnormalities were observed in health and high rates in DSPN. Very high intraclass correlations were obtained between continuous measures of QSTs and neuropathy signs, symptoms, or nerve conductions (NCs). No significant intra- or intertest differences were observed. These results provide proof of concept that smart QSTs provide accurate assessment of sensation loss without intra- or intertest differences useful for multicenter trials. Smart technology makes possible efficient testing of body surface area sensation loss in symmetric length-dependent sensorimotor polyneuropathies. Copyright © 2013 Wiley Periodicals, Inc.

  12. Measures of fish behavior as indicators of sublethal toxicosis during standard toxicity tests

    USGS Publications Warehouse

    Little, E.E.; DeLonay, A.J.

    1996-01-01

    Behavioral functions essential for growth and survival can be dramatically altered by sublethal exposure to toxicants. Measures of these behavioral responses are effective in detecting adverse effects of sublethal contaminant exposure. Behavioral responses of fishes can be qualitatively and quantitatively evaluated during routine toxicity tests. At selected intervals of exposure, qualitative evaluations are accomplished through direct observations, whereas video recordings are used for quantitative evaluations. Standardized procedures for behavioral evaluation are readily applicable to different fish species and provide rapid, sensitive, and ecologically relevant assessments of sublethal exposure. The methods are readily applied to standardized test protocols.

  13. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

    The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods, and the interpretation of their results, should be connected to the educational background. In this connecting process, the issues of educational models are often raised. Many widely used statistical methods do not make assumptions on the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are also other methods that consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumption and parameter estimation, and are complicated mathematically. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and the pure mathematical methods. The comparison is based on the performance of the two types of methods in physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments. 
The dissertation includes three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. The two theories are both applied to Force Concept Inventory data obtained from students enrolled in The Ohio State University. Effort was made to examine the similarity and difference between the two theories, and possible explanations for the difference. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on the measure of association for binary data. In quantitative assessment, binary data is often encountered because of its simplicity. The current popular measures of association fail under some extremely unbalanced conditions. However, the occurrence of these conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages. Special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed. Typical misunderstanding and misusage of EFA are explored. 
The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment on scientific reasoning skills. The reasoning ability structures for U.S. and Chinese students at different educational levels are given by the analysis. A final discussion on the advanced quantitative assessment methodology and the pure mathematical methodology is presented at the end.
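    The classical-test-theory item statistics compared in the first part of the dissertation can be sketched directly; the function name and data below are hypothetical, and a fuller treatment would also fit IRT item-response curves for the comparison.

```python
from statistics import mean, stdev

def ctt_item_stats(responses, item):
    """Classical test theory statistics for one dichotomous item.
    responses: one 0/1 answer vector per student.
    Difficulty is the proportion correct; discrimination is the
    corrected point-biserial correlation (item score vs. total
    score excluding the item, to avoid self-correlation)."""
    n = len(responses)
    item_scores = [r[item] for r in responses]
    rest_scores = [sum(r) - r[item] for r in responses]
    mi, mr = mean(item_scores), mean(rest_scores)
    cov = sum((i - mi) * (t - mr)
              for i, t in zip(item_scores, rest_scores)) / (n - 1)
    difficulty = mi
    discrimination = cov / (stdev(item_scores) * stdev(rest_scores))
    return difficulty, discrimination
```

Here a higher discrimination means the item separates high- and low-scoring students more sharply, which is one of the item features the IRT/CTT comparison examines.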

  14. Colorimetric analysis of saliva–alcohol test strips by smartphone-based instruments using machine-learning algorithms

    USDA-ARS?s Scientific Manuscript database

    Strip lateral flow assays, similar to a home pregnancy test, are used widely in food safety applications to provide rapid and accurate tests for the presence of specific foodborne pathogens or other contaminants. Though these tests are very rapid, they are not very sensitive, are not quantitative, a...

  15. Characteristics of quantitative nursing research from 1990 to 2010.

    PubMed

    Yarcheski, Adela; Mahon, Noreen E

    2013-12-01

    To assess author credentials of quantitative research in nursing, the composition of the research teams, and the disciplinary focus of the theories tested. Nursing Research, Western Journal of Nursing Research, and Journal of Advanced Nursing were selected for this descriptive study; 1990, 1995, 2000, 2005, and 2010 were included. The final sample consisted of 484 quantitative research articles. From 1990 to 2010, there was an increase in first authors holding doctoral degrees, research from other countries, and funding. Solo authorship decreased; multi-authorship and multidisciplinary teams increased. Theories tested were mostly from psychology; the testing of nursing theory was modest. Multidisciplinary research far outdistanced interdisciplinary research. Quantitative nursing research can be characterized as multidisciplinary (distinct theories from different disciplines) rather than discipline-specific to nursing. Interdisciplinary (theories synthesized from different disciplines) research has been conducted minimally. This study provides information about the growth of the scientific knowledge base of nursing, which has implications for practice. © 2013 Sigma Theta Tau International.

  16. Parallel and serial computing tools for testing single-locus and epistatic SNP effects of quantitative traits in genome-wide association studies

    PubMed Central

    Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang

    2008-01-01

    Background Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large scale GWAS. Results The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large scale GWAS and achieved excellent scalability for large scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large scale GWAS, and the epiSNP serial computing programs are convenient tools for epistasis analysis in small scale GWAS using commonly available computer hardware. PMID:18644146
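    Of the five epistasis effects listed above, the additive x additive contrast is the simplest to illustrate. The sketch below is a simplified stand-in for the extended Kempthorne model that EPISNP implements, computed from the homozygote cells of the 3x3 two-locus genotype table; a real analysis would also need a variance estimate and an F or t test for the contrast.

```python
from statistics import mean

def additive_by_additive(genos_a, genos_b, phenos):
    """Additive x additive epistasis contrast from the four homozygote
    cells of the two-locus genotype table:
        [(m22 - m20) - (m02 - m00)] / 4
    where m_ij is the mean phenotype of individuals carrying i copies
    of the minor allele at SNP A and j copies at SNP B (0 or 2 here)."""
    cells = {}
    for a, b, y in zip(genos_a, genos_b, phenos):
        cells.setdefault((a, b), []).append(y)
    m = {k: mean(v) for k, v in cells.items()}
    return (m[(2, 2)] - m[(2, 0)] - m[(0, 2)] + m[(0, 0)]) / 4
```

A purely additive trait gives a contrast of zero; a nonzero value indicates that the effect of one locus depends on the genotype at the other.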

  17. Accelerated life assessment of coating on the radar structure components in coastal environment.

    PubMed

    Liu, Zhe; Ming, ZhiMao

    2016-07-04

    This paper builds an accelerated life test scheme and carries out a quantitative comparison between laboratory accelerated life testing and actual service for a coating, composed of epoxy primer and polyurethane paint, on structural components of a radar serving in the coastal environment of the South China Sea. The accelerated life test scheme was based on the service environment and on failure analysis of the coating. The quantitative comparison between accelerated testing and actual service was made using the gloss loss, discoloration, chalking, blistering, cracking, and electrochemical impedance spectroscopy of the coating. The main factors leading to coating failure were ultraviolet radiation, temperature, moisture, salt fog, and loads, so the accelerated life test included ultraviolet radiation, damp heat, thermal shock, fatigue, and salt spray. It was established that one cycle of the accelerated life test was equivalent to one year of actual service, providing the manufacturer a precise way to predict the actual service life of newly developed coatings.

  18. Quantitative Estimate of the Relation Between Rolling Resistance on Fuel Consumption of Class 8 Tractor Trailers Using Both New and Retreaded Tires (SAE Paper 2014-01-2425)

    EPA Science Inventory

    Road tests of class 8 tractor trailers were conducted by the US Environmental Protection Agency on new and retreaded tires of varying rolling resistance in order to provide estimates of the quantitative relationship between rolling resistance and fuel consumption.

  19. Dynamic feature analysis of vector-based images for neuropsychological testing

    NASA Astrophysics Data System (ADS)

    Smith, Stephen L.; Cervantes, Basilio R.

    1998-07-01

    The dynamic properties of human motor activities, such as those observed while drawing simple geometric shapes, are considerably more complex and often more informative than the goal to be achieved, in this case a static line drawing. This paper demonstrates how these dynamic properties may be used to assess a patient's visuo-spatial ability, an important component of neuropsychological testing. The work described here provides a quantitative assessment of visuo-spatial ability while preserving the conventional test environment. Results are presented for a clinical population of long-term haemodialysis patients and for a test population comprising three groups of children, aged (1) 7-8 years, (2) 9-10 years, and (3) 11-12 years, none with known neurological dysfunction. Ten new dynamic measurements extracted from patient responses, in conjunction with one static feature from earlier work, describe a patient's visuo-spatial ability quantitatively with a sensitivity not previously attainable. The dynamic feature measurements in isolation provide a unique means of tracking a patient's approach to motor activities and could prove useful in monitoring a child's visuo-motor development.

  20. Pleiotropy Analysis of Quantitative Traits at Gene Level by Multivariate Functional Linear Models

    PubMed Central

    Wang, Yifan; Liu, Aiyi; Mills, James L.; Boehnke, Michael; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao; Wu, Colin O.; Fan, Ruzong

    2015-01-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai–Bartlett trace, Hotelling–Lawley trace, and Wilks’s Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. PMID:25809955

  1. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    PubMed

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.
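The three approximate F-tests named in the abstract are all functions of the eigenvalues of E⁻¹H, where H and E are the hypothesis and error sum-of-squares matrices of a MANOVA. A minimal sketch with illustrative matrices (not derived from the paper's functional linear models):

```python
import numpy as np

# Illustrative hypothesis (H) and error (E) sum-of-squares matrices;
# in a real MANOVA these come from the fitted multivariate model.
H = np.array([[4.0, 1.0], [1.0, 3.0]])
E = np.array([[10.0, 2.0], [2.0, 8.0]])

eig = np.linalg.eigvals(np.linalg.solve(E, H)).real  # eigenvalues of E^-1 H

pillai = np.sum(eig / (1 + eig))       # Pillai-Bartlett trace
hotelling = np.sum(eig)                # Hotelling-Lawley trace
wilks = np.prod(1 / (1 + eig))         # Wilks's Lambda
print(pillai, hotelling, wilks)
```

Each statistic is then referred to an approximate F distribution whose degrees of freedom depend on the numbers of traits, variants, and samples; those mappings are standard MANOVA results and are not reproduced here.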

  2. Multivariate Qst–Fst Comparisons: A Neutrality Test for the Evolution of the G Matrix in Structured Populations

    PubMed Central

    Martin, Guillaume; Chapuis, Elodie; Goudet, Jérôme

    2008-01-01

    Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst–Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2Fst/(1 − Fst)G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 − Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst–Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. 
We discuss practical requirements for the proper application of our test in empirical studies and potential extensions. PMID:18245845
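The neutral expectation D = [2Fst/(1 − Fst)]G can be checked numerically by estimating the proportionality coefficient between the two matrices and comparing it with the neutral value; a minimal sketch with illustrative matrices (the paper's actual test uses CPC analysis with a Bartlett adjustment, not this least-squares shortcut):

```python
import numpy as np

Fst = 0.15
neutral_c = 2 * Fst / (1 - Fst)   # proportionality coefficient under neutrality

# Illustrative within-population (G) and among-population (D) covariance
# matrices; real estimates would come from a MANOVA on the trait data.
G = np.array([[1.0, 0.3], [0.3, 0.8]])
D = neutral_c * G * 1.05          # constructed 5% away from the expectation

# Least-squares estimate of c in D = c * G.
c_hat = np.sum(D * G) / np.sum(G * G)
print(neutral_c, c_hat)
```

A c_hat well above the neutral value suggests divergent selection on the traits, and a value well below it suggests stabilizing selection, mirroring the univariate Qst-Fst logic.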

  3. 40 CFR 122.21 - Application for a permit (applicable to State programs, see § 123.25).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... under the Clean Air Act. (vii) Ocean dumping permits under the Marine Protection Research and... is to be provided as specified in § 122.26). When “quantitative data” for a pollutant are required... Director may allow the applicant to test only one outfall and report that quantitative data as applying to...

  4. 40 CFR 122.21 - Application for a permit (applicable to State programs, see § 123.25).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... under the Clean Air Act. (vii) Ocean dumping permits under the Marine Protection Research and... is to be provided as specified in § 122.26). When “quantitative data” for a pollutant are required... Director may allow the applicant to test only one outfall and report that quantitative data as applying to...

  5. 40 CFR 122.21 - Application for a permit (applicable to State programs, see § 123.25).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... under the Clean Air Act. (vii) Ocean dumping permits under the Marine Protection Research and... is to be provided as specified in § 122.26). When “quantitative data” for a pollutant are required... Director may allow the applicant to test only one outfall and report that quantitative data as applying to...

  6. 40 CFR 122.21 - Application for a permit (applicable to State programs, see § 123.25).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... under the Clean Air Act. (vii) Ocean dumping permits under the Marine Protection Research and... is to be provided as specified in § 122.26). When “quantitative data” for a pollutant are required... Director may allow the applicant to test only one outfall and report that quantitative data as applying to...

  7. Quantitative sensory testing of neuropathic pain patients: potential mechanistic and therapeutic implications.

    PubMed

    Pfau, Doreen B; Geber, Christian; Birklein, Frank; Treede, Rolf-Detlef

    2012-06-01

    Quantitative sensory testing (QST) is a widely accepted tool to investigate somatosensory changes in pain patients. Many different protocols have been developed in clinical pain research within recent years. In this review, we provide an overview of QST and tested neuroanatomical pathways, including peripheral and central structures. Based on research studies using animal and human surrogate models of neuropathic pain, possible underlying mechanisms of chronic pain are discussed. Clinically, QST may be useful for 1) the identification of subgroups of patients with different underlying pain mechanisms; 2) prediction of therapeutic outcomes; and 3) quantification of therapeutic interventions in pain therapy. Combined with sensory mapping, QST may provide useful information on the site of neural damage and on mechanisms of positive and negative somatosensory abnormalities. The use of QST in individual patients for diagnostic purposes leading to individualized therapy is an interesting concept, but needs further validation.

  8. Fatigue crack identification method based on strain amplitude changing

    NASA Astrophysics Data System (ADS)

    Guo, Tiancai; Gao, Jun; Wang, Yonghong; Xu, Youliang

    2017-09-01

    Identifying the location and time of crack initiation in castings of a helicopter transmission system is difficult during fatigue tests. By introducing classification diagnostic criteria for similar failure modes to capture the similarity of fatigue crack initiation among castings, an engineering method and a quantitative criterion for detecting fatigue cracks based on strain amplitude change are proposed. The method was applied to the fatigue test of a gearbox housing: the system alarmed when the SC strain gauge reached the quantitative criterion, and subsequent inspection found a fatigue crack less than 5 mm long at the corresponding location. The test result shows that the method can provide accurate test data for strength and life analysis.
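The abstract does not state the numerical criterion, but the monitoring logic reduces to flagging a gauge whose cyclic strain amplitude drifts from its baseline by more than a threshold. A hypothetical sketch (the threshold and values are illustrative, not the paper's criterion):

```python
# Hypothetical crack-alarm sketch: flag a strain gauge when its cyclic
# amplitude drifts from the baseline by more than a chosen fraction.
def amplitude_alarm(baseline_amp, current_amp, threshold=0.10):
    """Return True when the relative amplitude change exceeds the threshold."""
    change = abs(current_amp - baseline_amp) / baseline_amp
    return change > threshold

# Example: a gauge whose amplitude dropped 15% from its baseline.
print(amplitude_alarm(200.0, 170.0))   # → True  (relative change 0.15)
print(amplitude_alarm(200.0, 195.0))   # → False (relative change 0.025)
```

In practice the baseline would be established over early, crack-free cycles, and the threshold calibrated against the classification criteria mentioned in the abstract.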

  9. Quantitative investigation of ligament strains during physical tests for sacroiliac joint pain using finite element analysis.

    PubMed

    Kim, Yoon Hyuk; Yao, Zhidong; Kim, Kyungsoo; Park, Won Man

    2014-06-01

    It may be assumed that joint stability is affected when some ligaments are injured or loosened, and that this instability causes sacroiliac joint pain. Several physical examinations are used to diagnose sacroiliac pain and to isolate the source of the pain. However, more quantitative and objective information may be necessary to identify unstable or injured ligaments during these tests, because the quantitative relationship between the physical tests and the biomechanical parameters related to pain in the sacroiliac joint and the surrounding ligaments is poorly understood. In this study, a three-dimensional finite element model of the sacroiliac joint was developed and the biomechanical conditions of six typical physical tests, the compression test, distraction test, sacral apex pressure test, thigh thrust test, Patrick's test, and Gaenslen's test, were modelled. The sacroiliac joint contact pressure and ligament strain were investigated for each test. The values of contact pressure and the combination of most highly strained ligaments differed markedly among the tests. These findings, in combination with the physical tests, should therefore help identify the pain source and clarify the pain mechanism. Moreover, the technology provided in this study might be a useful tool for evaluating the physical tests, improving the present test protocols, or developing a new physical test protocol. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    PubMed

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  11. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282
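The statistical comparison in both versions of this study is a paired t-test on determinations of the same samples by the two methods; a minimal sketch with illustrative data (the study's measurements are not reproduced here):

```python
import numpy as np
from scipy import stats

# Illustrative gamma-oryzanol determinations (mg/g) of the same six
# samples by the two methods; values are made up for the sketch.
densitometric = np.array([2.41, 2.55, 2.38, 2.60, 2.47, 2.52])
image_based   = np.array([2.43, 2.52, 2.40, 2.58, 2.46, 2.54])

t, p = stats.ttest_rel(densitometric, image_based)
# A large p-value is consistent with no systematic difference
# between the two methods, which is the study's conclusion.
print(t, p)
```

The paired design is the right choice here because each sample is measured by both methods, so between-sample variation cancels out of the test.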

  12. Normalized Temperature Contrast Processing in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

    The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods are provided for computing normalized image (pixel-intensity) contrast and normalized temperature contrast, and for converting between the two. Normalized contrast processing is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computing normalized temperature contrast requires a flash thermography acquisition set-up with a high-reflectivity foil and a high-emissivity tape such that the foil, tape, and test object are imaged simultaneously. Methods are also provided for assessing other quantitative parameters such as object emissivity, afterglow heat flux, reflection temperature change, and surface temperature during flash thermography. Temperature imaging and normalized temperature contrast processing offer certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, thereby providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.
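One common definition of normalized contrast compares a pixel's post-flash rise with the rise of a sound reference region; a minimal sketch of that definition (the paper's exact formulation may differ, and the values are illustrative):

```python
def normalized_contrast(pixel, reference, pre_flash_pixel, pre_flash_ref):
    """Post-flash rise of a pixel, normalized by the rise of a sound
    reference region; one common definition, used here illustratively."""
    rise_pixel = pixel - pre_flash_pixel
    rise_ref = reference - pre_flash_ref
    return (rise_pixel - rise_ref) / rise_ref

# A pixel over a flaw cools more slowly than the sound reference region,
# so its rise stays higher and the contrast is positive.
c = normalized_contrast(pixel=34.0, reference=32.0,
                        pre_flash_pixel=20.0, pre_flash_ref=20.0)
print(round(c, 3))  # → 0.167
```

Normalizing by the reference rise cancels common multiplicative factors such as flash intensity, which is what makes the contrast comparable across acquisitions.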

  13. Quantitative metal magnetic memory reliability modeling for welded joints

    NASA Astrophysics Data System (ADS)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to inspect welded joints. However, load levels, the environmental magnetic field, and measurement noise make MMM data dispersive and complicate quantitative evaluation. To promote quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Welded specimens of steel Q235 were tested along longitudinal and horizontal lines with a TSC-2M-8 instrument during tensile fatigue experiments, and X-ray testing was carried out synchronously to verify the MMM results. MMM testing detected hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs was investigated: K_vs obeys a Gaussian distribution, making it a suitable MMM parameter for a reliability model of welded joints. Finally, an original quantitative MMM reliability model is presented based on improved stress-strength interference theory. The reliability degree R gradually decreases with decreasing residual life ratio T, and the maximal error between the predicted reliability degree R1 and the verified reliability degree R2 is 9.15%. The method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
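The classic stress-strength interference result underlying such models is that, for Gaussian stress and strength, the reliability is R = Φ((μ_strength − μ_stress)/√(σ_strength² + σ_stress²)). A minimal sketch (values illustrative; the paper's improved model adds refinements not reproduced here):

```python
from math import sqrt
from scipy.stats import norm

# Classic stress-strength interference for Gaussian variables:
# R = P(strength > stress).
mu_strength, sd_strength = 520.0, 30.0   # illustrative strength stats (MPa)
mu_stress, sd_stress = 430.0, 25.0       # illustrative stress stats (MPa)

z = (mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
R = norm.cdf(z)
print(round(R, 4))
```

Because the difference of two independent Gaussians is Gaussian, the interference probability collapses to a single standard-normal evaluation, which is why the Gaussian fit of K_vs matters for the model.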

  14. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Many processing packages are available from spectrometer manufacturers and third-party developers, and most are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing one or a few spectra at a time, and they become slow and awkward when large numbers of spectra and signals are analyzed, even with pre-saved integration areas or custom scripting features. This article presents novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to handling large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis with spreadsheet programs or general analysis programs such as Matlab. The software is written in Java and should therefore run on any platform providing Java Runtime Environment 1.6 or newer, although it has so far been tested only on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use and is provided with source code upon request.
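The core batch operation, integrating the same ppm windows across many spectra and exporting a CSV table, can be sketched as follows (synthetic spectra and hypothetical window names; this is not ImatraNMR's own code or file format):

```python
import csv
import io

import numpy as np

# Synthetic 1D spectra: a shared descending ppm axis and two spectra
# carrying the same peak at different intensities.
ppm = np.linspace(10, 0, 1000)
spectra = {"sample_a": np.exp(-((ppm - 3.5) ** 2) / 0.01),
           "sample_b": 2 * np.exp(-((ppm - 3.5) ** 2) / 0.01)}

windows = {"peak_3p5": (3.3, 3.7)}   # integration regions in ppm

rows = []
for name, y in spectra.items():
    for region, (lo, hi) in windows.items():
        mask = (ppm >= lo) & (ppm <= hi)
        ys, xs = y[mask], ppm[mask]
        # Trapezoidal integral; abs() because the ppm axis is descending.
        area = abs(np.sum((ys[1:] + ys[:-1]) / 2 * np.diff(xs)))
        rows.append((name, region, area))

buf = io.StringIO()
csv.writer(buf).writerows([("spectrum", "region", "area"), *rows])
print(buf.getvalue().strip())
```

The CSV output is the point: once areas land in a flat table, downstream analysis in a spreadsheet or Matlab, as the abstract describes, needs no NMR-specific tooling.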

  15. Optimization of dual-energy xenon-computed tomography for quantitative assessment of regional pulmonary ventilation.

    PubMed

    Fuld, Matthew K; Halaweish, Ahmed F; Newell, John D; Krauss, Bernhard; Hoffman, Eric A

    2013-09-01

    Dual-energy x-ray computed tomography (DECT) offers visualization of the airways and quantitation of regional pulmonary ventilation using a single breath of inhaled xenon gas. In this study, we sought to optimize scanning protocols for DECT xenon gas ventilation imaging of the airways and lung parenchyma and to characterize the quantitative nature of the developed protocols through a series of test-object and animal studies. The Institutional Animal Care and Use Committee approved all animal studies reported here. A range of xenon/oxygen gas mixtures (0%, 20%, 25%, 33%, 50%, 66%, 100%; balance oxygen) were scanned in syringes and balloon test-objects to optimize the delivered gas mixture for assessment of regional ventilation while allowing for the development of improved 3-material decomposition calibration parameters. In addition, to alleviate gravitational effects on xenon gas distribution, we replaced a portion of the oxygen in the xenon/oxygen gas mixture with helium and compared gas distributions in a rapid-prototyped human central-airway test-object. Additional syringe tests were performed to determine if the introduction of helium had any effect on xenon quantitation. Xenon gas mixtures were delivered to anesthetized swine to assess airway and lung parenchymal opacification while evaluating various DECT scan acquisition settings. Attenuation curves for xenon were obtained from the syringe test-objects and were used to develop improved 3-material decomposition parameters (Hounsfield unit enhancement per percentage xenon: within the chest phantom, 2.25 at 80 kVp, 1.7 at 100 kVp, and 0.76 at 140 kVp with tin filtration; in open air, 2.5 at 80 kVp, 1.95 at 100 kVp, and 0.81 at 140 kVp with tin filtration). The addition of helium improved the distribution of xenon gas to the gravitationally nondependent portion of the airway tree test-object, while not affecting the quantitation of xenon in the 3-material decomposition DECT. 
The mixture 40% Xe/40% He/20% O2 provided a good signal-to-noise ratio (SNR), greater than the Rose criterion (SNR > 5), while avoiding the gravitational effects seen with similar concentrations of xenon in a 60% O2 mixture. Compared with 100/140 Sn kVp, 80/140 Sn kVp (Sn = tin filtered) provided improved SNR in a swine with a thoracic transverse density equivalent to that of a human subject with a body mass index of 33 kg/m². Airways were brighter in the 80/140 Sn kVp scan (80/140 Sn, 31.6%; 100/140 Sn, 25.1%) with considerably lower noise (80/140 Sn, coefficient of variation of 0.140; 100/140 Sn, coefficient of variation of 0.216). To provide a truly quantitative measure of regional lung function with xenon DECT, the basic protocols and parameter calibrations need to be better understood and quantified. It is critically important to understand the fundamentals of new techniques to allow proper implementation and interpretation of their results before widespread use. Using an in-house xenon calibration curve for the three-material decomposition, rather than the scanner-supplied calibration, and a xenon/helium/oxygen mixture, we demonstrate highly accurate quantitation of xenon gas volumes and avoid gravitational effects on gas distribution. This study provides a foundation for other researchers to use and test these methods with the goal of clinical translation.
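Per voxel, the calibration reported in the abstract is a linear HU-enhancement-per-percent-xenon slope at each tube voltage; inverting it gives an estimated xenon concentration. A simplified sketch using the chest-phantom slopes quoted above (the full three-material decomposition combines both energies jointly, so this single-energy inversion is only illustrative):

```python
# HU enhancement per percent xenon, chest-phantom values from the abstract.
SLOPE_HU_PER_PCT_XE = {80: 2.25, 100: 1.70, 140: 0.76}  # kVp -> slope

def xenon_percent(hu_enhancement, kvp):
    """Estimate xenon concentration (%) from a measured HU enhancement
    at a given tube voltage; illustrative single-energy simplification."""
    return hu_enhancement / SLOPE_HU_PER_PCT_XE[kvp]

# A voxel enhancing by 45 HU at 80 kVp corresponds to about 20% xenon.
print(xenon_percent(45.0, 80))   # → 20.0
```

The falling slope with increasing kVp also shows why the lower-energy pair (80/140 Sn) yields brighter airways: each percent of xenon produces more enhancement at 80 kVp than at 100 kVp.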

  16. Optimization of Dual-Energy Xenon-CT for Quantitative Assessment of Regional Pulmonary Ventilation

    PubMed Central

    Fuld, Matthew K.; Halaweish, Ahmed; Newell, John D.; Krauss, Bernhard; Hoffman, Eric A.

    2013-01-01

    Objective Dual-energy X-ray computed tomography (DECT) offers visualization of the airways and quantitation of regional pulmonary ventilation using a single breath of inhaled xenon gas. In this study we seek to optimize scanning protocols for DECT xenon gas ventilation imaging of the airways and lung parenchyma and to characterize the quantitative nature of the developed protocols through a series of test-object and animal studies. Materials and Methods The Institutional Animal Care and Use Committee approved all animal studies reported here. A range of xenon-oxygen gas mixtures (0, 20, 25, 33, 50, 66, 100%; balance oxygen) were scanned in syringes and balloon test-objects to optimize the delivered gas mixture for assessment of regional ventilation while allowing for the development of improved three-material decomposition calibration parameters. Additionally, to alleviate gravitational effects on xenon gas distribution, we replaced a portion of the oxygen in the xenon/oxygen gas mixture with helium and compared gas distributions in a rapid-prototyped human central-airway test-object. Additional syringe tests were performed to determine if the introduction of helium had any effect on xenon quantitation. Xenon gas mixtures were delivered to anesthetized swine in order to assess airway and lung parenchymal opacification while evaluating various DECT scan acquisition settings. Results Attenuation curves for xenon were obtained from the syringe test objects and were used to develop improved three-material decomposition parameters (HU enhancement per percent xenon: Within the chest phantom: 2.25 at 80kVp, 1.7 at 100 kVp, and 0.76 at 140 kVp with tin filtration; In open air: 2.5 at 80kVp, 1.95 at 100 kVp, and 0.81 at 140 kVp with tin filtration). The addition of helium improved the distribution of xenon gas to the gravitationally non-dependent portion of the airway tree test-object, while not affecting quantitation of xenon in the three-material decomposition DECT. 
A 40%Xe/40%He/20%O2 mixture provided good signal-to-noise, greater than the Rose Criterion (SNR > 5), while avoiding the gravitational effects of similar concentrations of xenon in a 60%O2 mixture. 80/140-kVp (tin-filtered) provided improved SNR compared with 100/140-kVp in a swine with an equivalent thoracic transverse density to a human subject with a body mass index of 33 kg/m². Airways were brighter in the 80/140 kVp scan (80/140Sn, 31.6%; 100/140Sn, 25.1%) with considerably lower noise (80/140Sn, CV of 0.140; 100/140Sn, CV of 0.216). Conclusion In order to provide a truly quantitative measure of regional lung function with xenon-DECT, the basic protocols and parameter calibrations needed to be better understood and quantified. It is critically important to understand the fundamentals of new techniques in order to allow for proper implementation and interpretation of their results prior to widespread usage. With the use of an in-house derived xenon calibration curve for three-material decomposition, rather than the scanner-supplied calibration, and a xenon/helium/oxygen mixture, we demonstrate highly accurate quantitation of xenon gas volumes and avoid gravitational effects on gas distribution. This study provides a foundation for other researchers to use and test these methods with the goal of clinical translation. PMID:23571834

  17. A Test Method for Monitoring Modulus Changes during Durability Tests on Building Joint Sealants

    Treesearch

    Christopher C. White; Donald L. Hunston; Kar Tean Tan; Gregory T. Schueneman

    2012-01-01

    The durability of building joint sealants is generally assessed using a descriptive methodology involving visual inspection of exposed specimens for defects. It is widely known that this methodology has inherent limitations, including that the results are qualitative. A new test method is proposed that provides more fundamental and quantitative information about...

  18. Integrated System Test of the Advanced Instructional System (AIS). Final Report.

    ERIC Educational Resources Information Center

    Lintz, Larry M.; And Others

    The integrated system test for the Advanced Instructional System (AIS) was designed to provide quantitative information regarding training time reductions resulting from certain computer managed instruction features. The reliabilities of these features and of support systems were also investigated. Basic computer managed instruction reduced…

  19. Identifying Metabolically Active Chemicals Using a Consensus Quantitative Structure Activity Relationship Model for Estrogen Receptor Binding

    EPA Science Inventory

    Traditional toxicity testing provides insight into the mechanisms underlying toxicological responses but requires a high investment in a large number of resources. The new paradigm of testing approaches involves rapid screening studies able to evaluate thousands of chemicals acro...

  20. Minority Performance on the Naglieri Nonverbal Ability Test, Second Edition, versus the Cognitive Abilities Test, Form 6: One Gifted Program's Experience

    ERIC Educational Resources Information Center

    Giessman, Jacob A.; Gambrell, James L.; Stebbins, Molly S.

    2013-01-01

    The Naglieri Nonverbal Ability Test, Second Edition (NNAT2), is used widely to screen students for possible inclusion in talent development programs. The NNAT2 claims to provide a more culturally neutral evaluation of general ability than tests such as Form 6 of the Cognitive Abilities Test (CogAT6), which has Verbal and Quantitative batteries in…

  1. General Model Study of Scour at Proposed Pier Extensions - Santa Ana River at BNSF Bridge, Corona, California

    DTIC Science & Technology

    2017-11-01

model of the bridge piers, other related structures, and the adjacent channel. Data from the model provided a qualitative and quantitative evaluation of... The report's figures include pre- minus post-test lidar surveys for Test 1 (30,000 cfs, existing conditions) and Test 7 (15,000 cfs, original proposed conditions).

  2. 42 CFR 84.303 - General testing conditions and requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Closed... quantitative evaluations and human subjects on a treadmill to provide qualitative evaluations. Information on...

  3. 42 CFR 84.303 - General testing conditions and requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Closed... quantitative evaluations and human subjects on a treadmill to provide qualitative evaluations. Information on...

  4. 42 CFR 84.303 - General testing conditions and requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Closed... quantitative evaluations and human subjects on a treadmill to provide qualitative evaluations. Information on...

  5. Accelerated Stress-Corrosion Testing

    NASA Technical Reports Server (NTRS)

    1986-01-01

Test procedures for accelerated stress-corrosion testing of high-strength aluminum alloys are faster and provide more quantitative information than traditional pass/fail tests. Method uses data from tests on specimen sets exposed to a corrosive environment at several levels of applied static tensile stress for selected exposure times, then subsequently tensile tested to failure. Method potentially applicable to other degrading phenomena (such as fatigue, corrosion fatigue, fretting, wear, and creep) that promote development and growth of cracklike flaws within material.

  6. Bilingual health literacy assessment using the Talking Touchscreen/la Pantalla Parlanchina: Development and pilot testing.

    PubMed

    Yost, Kathleen J; Webster, Kimberly; Baker, David W; Choi, Seung W; Bode, Rita K; Hahn, Elizabeth A

    2009-06-01

    Current health literacy measures are too long, imprecise, or have questionable equivalence of English and Spanish versions. The purpose of this paper is to describe the development and pilot testing of a new bilingual computer-based health literacy assessment tool. We analyzed literacy data from three large studies. Using a working definition of health literacy, we developed new prose, document and quantitative items in English and Spanish. Items were pilot tested on 97 English- and 134 Spanish-speaking participants to assess item difficulty. Items covered topics relevant to primary care patients and providers. English- and Spanish-speaking participants understood the tasks involved in answering each type of question. The English Talking Touchscreen was easy to use and the English and Spanish items provided good coverage of the difficulty continuum. Qualitative and quantitative results provided useful information on computer acceptability and initial item difficulty. After the items have been administered on the Talking Touchscreen (la Pantalla Parlanchina) to 600 English-speaking (and 600 Spanish-speaking) primary care patients, we will develop a computer adaptive test. This health literacy tool will enable clinicians and researchers to more precisely determine the level at which low health literacy adversely affects health and healthcare utilization.

  7. [Clinical exercise testing and the Fick equation: strategic thinking for optimizing diagnosis].

    PubMed

    Perrault, H; Richard, R

    2012-04-01

This article examines the expected exercise-induced changes in the components of the oxygen transport system as described by the Fick equation, with a view to enabling a critical analysis of a standard incremental exercise test, identifying normal and abnormal response patterns, and generating hypotheses as to potential physiological and/or pathophysiological causes. The text reviews basic physiological principles and provides useful reminders of standard equations that serve to integrate circulatory, respiratory and skeletal muscle functions. More specifically, the article provides a conceptual and quantitative framework linking the exercise-induced increase in whole-body oxygen uptake to central and peripheral circulatory factors, with a view to establishing the normalcy of the response. Thus, the article reviews the exercise response of cardiac output determinants and provides qualitative and quantitative bases for making assumptions about peripheral circulatory factors and oxygen use. Finally, the article demonstrates the usefulness of exercise testing as an effective integrative physiological approach to develop clinical reasoning or verify pathophysiological outcomes. Copyright © 2012 SPLF. Published by Elsevier Masson SAS. All rights reserved.
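The Fick relationship at the heart of the article can be sketched numerically. The helper names and the illustrative resting values below are assumptions for demonstration, not taken from the paper:

```python
def o2_content(hb_g_dl, sat_frac, po2_mmhg):
    # Blood O2 content (ml O2/dl): hemoglobin-bound O2
    # (1.34 ml O2 per g Hb) plus the dissolved fraction
    return 1.34 * hb_g_dl * sat_frac + 0.003 * po2_mmhg

def fick_vo2(cardiac_output_l_min, cao2_ml_dl, cvo2_ml_dl):
    # Fick equation: VO2 = Q x (CaO2 - CvO2); factor 10 converts dl to l
    return cardiac_output_l_min * (cao2_ml_dl - cvo2_ml_dl) * 10

# Illustrative resting values: Hb 15 g/dl, SaO2 98%, SvO2 75%, Q 5 l/min
cao2 = o2_content(15.0, 0.98, 100.0)
cvo2 = o2_content(15.0, 0.75, 40.0)
vo2 = fick_vo2(5.0, cao2, cvo2)  # roughly 240 ml O2/min at rest
```

During incremental exercise, each term of this product can be tracked to decide whether a limitation is central (cardiac output) or peripheral (O2 extraction), which is the diagnostic reasoning the article develops.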

  8. Next Generation Programmable Bio-Nano-Chip System for On-Site Detection in Oral Fluids.

    PubMed

    Christodoulides, Nicolaos; De La Garza, Richard; Simmons, Glennon W; McRae, Michael P; Wong, Jorge; Newton, Thomas F; Kosten, Thomas R; Haque, Ahmed; McDevitt, John T

    2015-11-23

Current on-site drug-of-abuse detection methods involve invasive sampling of blood and urine specimens, or collection of oral fluid, followed by qualitative screening tests using immunochromatographic cartridges. Test confirmation and quantitative assessment of a presumptive positive are then provided by remote laboratories, an inefficient and costly process decoupled from the initial sampling. Recently, a new noninvasive oral fluid sampling approach integrated with the chip-based Programmable Bio-Nano-Chip (p-BNC) platform has been developed for the rapid (~10 minutes), sensitive (~ng/ml) detection and quantitation of 12 drugs of abuse. Furthermore, the system can provide the time course of select drug and metabolite profiles in oral fluids. For cocaine, we observed that three slope components correlated with cocaine-induced impairment using this chip-based p-BNC detection modality. Thus, the p-BNC has significant potential for roadside drug testing by law enforcement officers. Initial work reported on chip-based drug detection was completed using 'macro' or "chip in the lab" prototypes that included metal-encased "flow cells", external peristaltic pumps and bench-top analyzer instrumentation. We now describe the next-generation miniaturized analyzer instrumentation along with customized disposables and sampling devices. These tools will offer real-time oral fluid drug monitoring capabilities, to be used for roadside drug testing as well as testing in clinical settings as a non-invasive, quantitative, accurate and sensitive tool to verify patient adherence to treatment.

  9. Analysis of mathematical literacy ability based on self-efficacy in model eliciting activities using metaphorical thinking approach

    NASA Astrophysics Data System (ADS)

    Setiani, C.; Waluya, S. B.; Wardono

    2018-03-01

The purposes of this research are: (1) to identify learning quality in Model Eliciting Activities (MEAs) using a Metaphorical Thinking (MT) approach, both qualitatively and quantitatively; and (2) to analyze students' mathematical literacy based on Self-Efficacy (SE). This research is a mixed methods concurrent embedded design with qualitative research as the primary method. The quantitative component used a quasi-experiment with a non-equivalent control group design. The population is VIII grade students of SMP Negeri 3 Semarang, Indonesia. Quantitative data are examined by conducting a completeness mean test, standard completeness test, mean differentiation test and proportional differentiation test. Qualitative data are analyzed descriptively. The results show that MEAs learning using the MT approach meets good criteria both quantitatively and qualitatively. Students with low self-efficacy can identify problems, but they lack the ability to arrange a problem-solving strategy for mathematical literacy questions. Students with medium self-efficacy can identify the information provided in problems, but they find it difficult to use mathematical symbols in making a representation. Students with high self-efficacy are excellent at representing problems as mathematical models and figures using appropriate symbols and tools, so they can easily arrange a strategy to solve mathematical literacy questions.

  10. Quantitative Field Testing Rotylenchulus reniformis DNA from Metagenomic Samples Isolated Directly from Soil

    PubMed Central

    Showmaker, Kurt; Lawrence, Gary W.; Lu, Shien; Balbalian, Clarissa; Klink, Vincent P.

    2011-01-01

    A quantitative PCR procedure targeting the β-tubulin gene determined the number of Rotylenchulus reniformis Linford & Oliveira 1940 in metagenomic DNA samples isolated from soil. Of note, this outcome was in the presence of other soil-dwelling plant parasitic nematodes including its sister genus Helicotylenchus Steiner, 1945. The methodology provides a framework for molecular diagnostics of nematodes from metagenomic DNA isolated directly from soil. PMID:22194958
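Quantification in such a qPCR assay is conventionally done against a standard curve relating threshold cycle (Ct) to log copy number built from serial dilutions. The sketch below illustrates that generic calculation; it is not the authors' code, and all names and values are hypothetical:

```python
def fit_standard_curve(log10_copies, ct_values):
    # Least-squares fit of Ct = slope * log10(copies) + intercept,
    # built from a serial dilution series of a known standard
    n = len(ct_values)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
    slope = sxy / sxx
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    # Invert the standard curve to estimate copy number in an unknown
    return 10 ** ((ct - intercept) / slope)

# Ideal 10-fold dilution series at 100% PCR efficiency (slope ~ -3.32)
standards = [2, 3, 4, 5, 6]                  # log10 copies per reaction
cts = [-3.32 * x + 38.0 for x in standards]  # simulated Ct values
slope, intercept = fit_standard_curve(standards, cts)
estimate = copies_from_ct(24.72, slope, intercept)  # ~1e4 copies
```

A slope near -3.32 corresponds to a doubling of product each cycle; deviations indicate reduced amplification efficiency, e.g. from soil-derived inhibitors in metagenomic extracts.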

  11. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform providing Java Runtime Environment version 1.6 or newer; however, it has currently been tested only with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. An Automated System for Chromosome Analysis

    NASA Technical Reports Server (NTRS)

    Castleman, K. R.; Melnyk, J. H.

    1976-01-01

    The design, construction, and testing of a complete system to produce karyotypes and chromosome measurement data from human blood samples, and to provide a basis for statistical analysis of quantitative chromosome measurement data are described.

  13. Investigation of Kevlar fabric-based materials for use with inflatable structures

    NASA Technical Reports Server (NTRS)

    Niccum, R. J.; Munson, J. B.; Rueter, L. L.

    1977-01-01

    Design, manufacture and testing of laminated and coated composite materials incorporating a structural matrix of Kevlar are reported. The practicality of using Kevlar in aerostat materials is demonstrated, and data are provided on practical weaves, lamination and coating particulars, rigidity, strength, weight, elastic coefficients, abrasion resistance, crease effects, peel strength, blocking tendencies, helium permeability, and fabrication techniques. Properties of the Kevlar-based materials are compared with conventional Dacron-reinforced counterparts. A comprehensive test and qualification program is discussed, and considerable quantitative biaxial tensile and shear test data are provided.

  14. Louis Guttman's Contributions to Classical Test Theory

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.; Williams, Richard H.; Zumbo, Bruno D.; Ross, Donald

    2005-01-01

    This article focuses on Louis Guttman's contributions to the classical theory of educational and psychological tests, one of the lesser known of his many contributions to quantitative methods in the social sciences. Guttman's work in this field provided a rigorous mathematical basis for ideas that, for many decades after Spearman's initial work,…

  15. A Method to Measure and Estimate Normalized Contrast in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, as are methods of converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves a flash thermography data acquisition set-up with a high-reflectivity foil and high-emissivity tape such that the foil, tape and test object are imaged simultaneously. Methods of assessing other quantitative parameters such as the emissivity of the object, afterglow heat flux, reflection temperature change and surface temperature during flash thermography are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.
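A common way to define normalized contrast in flash thermography, consistent with the idea described above though not necessarily this paper's exact formulation, divides the defect-region signal rise by the sound-region (reference) signal rise, both measured relative to the pre-flash frame. The frame data below are hypothetical:

```python
def normalized_contrast(defect_signal, sound_signal):
    # Per-frame normalized contrast: rise of the defect-region signal
    # over rise of the sound-region signal, relative to frame 0
    d0, s0 = defect_signal[0], sound_signal[0]
    return [(d - d0) / (s - s0) if s != s0 else 0.0
            for d, s in zip(defect_signal, sound_signal)]

# Hypothetical post-flash signals (arbitrary units) for three frames
defect = [20.0, 25.0, 30.0]   # region above a subsurface flaw
sound = [20.0, 24.0, 28.0]    # flaw-free reference region
contrast = normalized_contrast(defect, sound)  # [0.0, 1.25, 1.25]
```

Normalizing by a reference region cancels common factors such as flash nonuniformity, which is why contrast curves rather than raw intensities are used for flaw characterization.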

  16. Inferring pterosaur diets through quantitative 3D textural analysis of tooth microwear in extant analogues

    NASA Astrophysics Data System (ADS)

    Bestwick, Jordan; Unwin, David; Butler, Richard; Henderson, Don; Purnell, Mark

    2017-04-01

    Pterosaurs (Pterosauria) were a successful group of Mesozoic flying reptiles. For 150 million years they were integral components of terrestrial and coastal ecosystems, yet their feeding ecology remains poorly constrained. Postulated pterosaur diets include insectivory, piscivory and/or carnivory, but many dietary hypotheses are speculative and/or based on little evidence, highlighting the need for alternative approaches to provide robust data. One method involves quantitative analysis of the micron-scale 3D textures of worn pterosaur tooth surfaces - dental microwear texture analysis. Microwear is produced as scratches and chips generated by food items create characteristic tooth surface textures. Microwear analysis has never been applied to pterosaurs, but we might expect microwear textures to differ between pterosaurs with different diets. An important step in investigating pterosaur microwear is to examine microwear from extant organisms with known diets to provide a comparative data set. This has been achieved through analysis of non-occlusal microwear textures in extant bats, crocodilians and monitor lizards, clades within which species exhibit insectivorous, piscivorous and carnivorous diets. The results - the first test of the hypothesis that non-occlusal microwear textures in these extant clades vary with diet - provide the context for the first robust quantitative tests of pterosaur diets.

  17. Leachate Testing of Hamlet City Lake, North Carolina, Sediment

    DTIC Science & Technology

    1992-11-01

Sediment leaching studies of Hamlet City Lake, Hamlet, NC, were conducted in...laboratories at the U.S. Army Engineer Waterways Experiment Station. The purpose of these studies was to provide quantitative information on the...conditions similar to landfarming. The study involved three elements: batch leach tests, column leach tests, and simulations using the Hydrologic

  18. Practicing Accounting Profession Criterial Skills in the Classroom: A Study of Collaborative Testing and the Impact on Final Exam Scores

    ERIC Educational Resources Information Center

    VanderLaan, Ski R.

    2010-01-01

    This mixed methods study (Creswell, 2008) was designed to test the influence of collaborative testing on learning using a quasi-experimental approach. This study used a modified embedded mixed method design in which the qualitative and quantitative data, associated with the secondary questions, provided a supportive role in a study based primarily…

  19. Direct agglutination test for serologic diagnosis of Neospora caninum infection.

    PubMed

    Romand, S; Thulliez, P; Dubey, J P

    1998-01-01

A direct agglutination test was evaluated for the detection and quantitation of IgG antibodies to Neospora caninum in both experimental and natural infections in various animal species. Compared with results obtained by the indirect fluorescent antibody test, the direct agglutination test appeared reliable for the serologic diagnosis of neosporosis in a variety of animal species. The direct agglutination test should provide an easily available and inexpensive tool for serologic testing for antibodies to N. caninum in many host species.

  20. Simultaneous Quantitative Detection of Helicobacter Pylori Based on a Rapid and Sensitive Testing Platform using Quantum Dots-Labeled Immunochromatiographic Test Strips

    NASA Astrophysics Data System (ADS)

    Zheng, Yu; Wang, Kan; Zhang, Jingjing; Qin, Weijian; Yan, Xinyu; Shen, Guangxia; Gao, Guo; Pan, Fei; Cui, Daxiang

    2016-02-01

Quantum dots-labeled urea-enzyme antibody-based rapid immunochromatographic test strips have been developed as quantitative fluorescence point-of-care tests (POCTs) to detect Helicobacter pylori. Presented in this study is a new test strip reader designed to run on tablet personal computers (PCs), which is portable for outdoor detection even without an alternating current (AC) power supply. A Wi-Fi module was integrated into the reader to improve its portability. Patient information was loaded by a barcode scanner, and an application designed to run on tablet PCs was developed to handle the acquired images. A vision algorithm called Kmeans was used for picture processing. Different concentrations of various human blood samples were tested to evaluate the stability and accuracy of the fabricated device. Results demonstrate that the reader can provide easy, rapid, simultaneous, quantitative detection of H. pylori. The proposed test strip reader is lighter than existing detection readers, and it can run for long durations without an AC power supply, thus verifying its advantages for outdoor detection. Given its fast detection speed and high accuracy, the proposed reader combined with quantum dots-labeled test strips is suitable for POCTs and has great potential for applications such as screening patients for H. pylori infection in the near future.
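The "Kmeans" picture-processing step mentioned above refers to k-means clustering. As a generic illustration (not the authors' implementation), a one-dimensional k-means can separate bright test-line pixels from the dim background by intensity:

```python
def kmeans_1d(values, k=2, iters=50):
    # Minimal 1-D k-means on pixel intensities: spread initial centers
    # across the intensity range, then alternate assign/update steps
    lo, hi = min(values), max(values)
    centers = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical intensities: dim background pixels vs. a bright test line
pixels = [10, 12, 11, 9, 200, 210, 205]
background, line = kmeans_1d(pixels)  # cluster means, low then high
```

The mean fluorescence of the test-line cluster, relative to a control line, is what a strip reader would map to analyte concentration.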

1. Confocal Microscopy and Flow Cytometry System Performance: Assessment of QA Parameters that Affect Data Quantification

    EPA Science Inventory

    Flow and image cytometers can provide useful quantitative fluorescence data. We have devised QA tests to be used on both a flow cytometer and a confocal microscope to assure that the data is accurate, reproducible and precise. Flow Cytometry: We have provided two simple perform...

  2. Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials

    PubMed Central

    Dalecki, Diane; Mercado, Karla P.; Hocking, Denise C.

    2015-01-01

    Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering. PMID:26581347

  3. On-chip quantitative detection of pathogen genes by autonomous microfluidic PCR platform.

    PubMed

    Tachibana, Hiroaki; Saito, Masato; Shibuya, Shogo; Tsuji, Koji; Miyagawa, Nobuyuki; Yamanaka, Keiichiro; Tamiya, Eiichi

    2015-12-15

Polymerase chain reaction (PCR)-based genetic testing has become a routine part of clinical diagnosis and food testing. In these fields, rapid, easy-to-use, and cost-efficient PCR chips are expected to appear to provide such testing on-site. In this study, a new autonomous disposable plastic microfluidic PCR chip was created and utilized for quantitative detection of pathogenic microorganisms. To control the capillary flow of the following solution in the PCR microchannel, a driving microchannel was newly designed behind the PCR microchannel. This allowed effective PCR by simply dropping the PCR solution onto the inlet without any external pumps. In order to achieve disposability, injection-molded cyclo-olefin polymer (COP), a cost-competitive plastic, was used for the PCR chip. We discovered that coating the microchannel walls with non-ionic surfactant produced a suitable hydrophilic surface for driving the capillary flow through the 1250-mm-long microchannel. As a result, quantitative real-time PCR with lowest initial concentrations of human, Escherichia coli (E. coli), and pathogenic E. coli O157 genomic DNA of 4, 0.0019, and 0.031 pg/μl, respectively, was successfully achieved in less than 18 min. Our results indicate that the platform presented in this study provides a rapid, easy-to-use, and low-cost real-time PCR system that could potentially be used for on-site gene testing. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. A quantitative dynamic systems model of health-related quality of life among older adults

    PubMed Central

    Roppolo, Mattia; Kunnen, E Saskia; van Geert, Paul L; Mulasso, Anna; Rabaglietti, Emanuela

    2015-01-01

    Health-related quality of life (HRQOL) is a person-centered concept. The analysis of HRQOL is highly relevant in the aged population, which is generally suffering from health decline. Starting from a conceptual dynamic systems model that describes the development of HRQOL in individuals over time, this study aims to develop and test a quantitative dynamic systems model, in order to reveal the possible dynamic trends of HRQOL among older adults. The model is tested in different ways: first, with a calibration procedure to test whether the model produces theoretically plausible results, and second, with a preliminary validation procedure using empirical data of 194 older adults. This first validation tested the prediction that given a particular starting point (first empirical data point), the model will generate dynamic trajectories that lead to the observed endpoint (second empirical data point). The analyses reveal that the quantitative model produces theoretically plausible trajectories, thus providing support for the calibration procedure. Furthermore, the analyses of validation show a good fit between empirical and simulated data. In fact, no differences were found in the comparison between empirical and simulated final data for the same subgroup of participants, whereas the comparison between different subgroups of people resulted in significant differences. These data provide an initial basis of evidence for the dynamic nature of HRQOL during the aging process. Therefore, these data may give new theoretical and applied insights into the study of HRQOL and its development with time in the aging population. PMID:26604722

  5. Design of an impact evaluation using a mixed methods model--an explanatory assessment of the effects of results-based financing mechanisms on maternal healthcare services in Malawi.

    PubMed

    Brenner, Stephan; Muula, Adamson S; Robyn, Paul Jacob; Bärnighausen, Till; Sarker, Malabika; Mathanga, Don P; Bossert, Thomas; De Allegri, Manuela

    2014-04-22

    In this article we present a study design to evaluate the causal impact of providing supply-side performance-based financing incentives in combination with a demand-side cash transfer component on equitable access to and quality of maternal and neonatal healthcare services. This intervention is introduced to selected emergency obstetric care facilities and catchment area populations in four districts in Malawi. We here describe and discuss our study protocol with regard to the research aims, the local implementation context, and our rationale for selecting a mixed methods explanatory design with a quasi-experimental quantitative component. The quantitative research component consists of a controlled pre- and post-test design with multiple post-test measurements. This allows us to quantitatively measure 'equitable access to healthcare services' at the community level and 'healthcare quality' at the health facility level. Guided by a theoretical framework of causal relationships, we determined a number of input, process, and output indicators to evaluate both intended and unintended effects of the intervention. Overall causal impact estimates will result from a difference-in-difference analysis comparing selected indicators across intervention and control facilities/catchment populations over time.To further explain heterogeneity of quantitatively observed effects and to understand the experiential dimensions of financial incentives on clients and providers, we designed a qualitative component in line with the overall explanatory mixed methods approach. This component consists of in-depth interviews and focus group discussions with providers, service user, non-users, and policy stakeholders. 
In this explanatory design, comprehensive understanding of expected and unexpected effects of the intervention on both access and quality will emerge through careful triangulation at two levels: across multiple quantitative elements and across quantitative and qualitative elements. Combining a traditional quasi-experimental controlled pre- and post-test design with an explanatory mixed methods model permits an additional assessment of organizational and behavioral changes affecting complex processes. Through this impact evaluation approach, our design will not only create robust evidence measures for the outcome of interest, but also generate insights into how and why the investigated interventions produce certain intended and unintended effects, and will allow for a more in-depth evaluation approach.
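The difference-in-difference comparison that the protocol relies on reduces to simple arithmetic on group means. The sketch below shows that calculation with hypothetical numbers, purely to illustrate the estimator:

```python
def did_estimate(pre_treat, post_treat, pre_ctrl, post_ctrl):
    # Difference-in-differences: change in the intervention group
    # minus change in the control group over the same period
    return (post_treat - pre_treat) - (post_ctrl - pre_ctrl)

# Hypothetical indicator: % of births attended by skilled personnel
effect = did_estimate(pre_treat=50.0, post_treat=70.0,
                      pre_ctrl=48.0, post_ctrl=55.0)  # 13.0 points
```

Subtracting the control group's change removes secular trends common to both groups, so the remainder is attributable to the intervention under the parallel-trends assumption.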

  6. Quantitative methods used in Australian health promotion research: a review of publications from 1992-2002.

    PubMed

    Smith, Ben J; Zehle, Katharina; Bauman, Adrian E; Chau, Josephine; Hawkshaw, Barbara; Frost, Steven; Thomas, Margaret

    2006-04-01

This study examined the use of quantitative methods in Australian health promotion research in order to identify methodological trends and priorities for strengthening the evidence base for health promotion. Australian health promotion articles were identified by hand-searching publications from 1992-2002 in six journals: Health Promotion Journal of Australia, Australian and New Zealand Journal of Public Health, Health Promotion International, Health Education Research, Health Education and Behavior, and the American Journal of Health Promotion. The study designs and statistical methods used in articles presenting quantitative research were recorded. Of the 1,025 articles, 591 (57.7%) used quantitative methods. Cross-sectional designs were used in the majority (54.3%) of studies, with pre- and post-test (14.6%) and post-test only (9.5%) the next most common designs. Bivariate statistical methods were used in 45.9% of papers, multivariate methods in 27.1%, and simple numbers and proportions in 25.4%. Few studies used higher-level statistical techniques. While most studies used quantitative methods, the majority were descriptive in nature. The study designs and statistical methods used provided limited scope for demonstrating intervention effects or understanding the determinants of change.

  7. Investigation of PACE™ software and VeriFax's Impairoscope device for quantitatively measuring the effects of stress

    NASA Astrophysics Data System (ADS)

    Morgenthaler, George W.; Nuñez, German R.; Botello, Aaron M.; Soto, Jose; Shrairman, Ruth; Landau, Alexander

    1998-01-01

Many reaction time experiments have been conducted over the years to observe human responses. However, most of the experiments performed did not have quantitatively accurate instruments for measuring changes in reaction time under stress. There is a great need for quantitative instruments to measure neuromuscular reaction responses under stressful conditions such as distraction, disorientation, disease, alcohol, drugs, etc. The two instruments used in the experiments reported in this paper are such devices. Their accuracy, portability, ease of use, and biometric character are what make them very special. PACE™ is a software model used to measure reaction time. VeriFax's Impairoscope measures the deterioration of neuromuscular responses. During the 1997 Summer Semester, various reaction time experiments were conducted on University of Colorado faculty, staff, and students using the PACE™ system. The tests included both two-eye and one-eye unstressed trials and trials with various stresses, such as fatigue, distraction (subjects were asked to perform simple arithmetic during the PACE™ tests), and stress due to rotating-chair dizziness. Various VeriFax Impairoscope tests, both stressed and unstressed, were conducted to determine the Impairoscope's ability to quantitatively measure this impairment. In the 1997 Fall Semester, a Phase II effort was undertaken to increase test sample sizes in order to provide statistical precision and stability. More sophisticated statistical methods remain to be applied to better interpret the data.

  8. Development of a novel nanoscratch technique for quantitative measurement of ice adhesion strength

    NASA Astrophysics Data System (ADS)

    Loho, T.; Dickinson, M.

    2018-04-01

    The mechanism by which ice adheres to surfaces is still not well understood. Currently there is no standard method to quantitatively measure how ice adheres to surfaces, which makes ice surface studies difficult to compare. A novel quantitative lateral force adhesion measurement at the micro-nano scale for ice was created which shears micro-nano sized ice droplets (less than 3 μm in diameter and 100 nm in height) using a nanoindenter. By using small ice droplets, the variables associated with bulk ice measurements were minimised, which increased data repeatability compared to bulk testing. The technique provided post-testing surface scans to confirm that the ice had been removed and that measurements were of ice adhesion strength. Results show that the ice adhesion strength of a material is greatly affected by the nano-scale surface roughness of the material, with rougher surfaces having higher ice adhesion strength.

  9. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    Formation of cognitive schemes of plant anatomy concepts is achieved by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task in which students analyzed quantitative data produced by quantitative measurement of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy was assessed with the rubric from the Association of American Colleges and Universities, and complex thinking in plant anatomy with a test constructed according to Marzano. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of the biology education students was better than that of the biology students.

  10. Applications of FT-IR spectrophotometry in cancer diagnostics.

    PubMed

    Bunaciu, Andrei A; Hoang, Vu Dang; Aboul-Enein, Hassan Y

    2015-01-01

    This review provides a brief background to the application of infrared spectroscopy, including Fourier transform-infrared spectroscopy, in biological fluids. It is not meant to be complete or exhaustive but to provide the reader with sufficient background for selected applications in cancer diagnostics. Fourier transform-infrared spectroscopy (FT-IR) is a fast and nondestructive analytical method. The infrared spectrum of a mixture serves as the basis to quantitate its constituents, and a number of common clinical chemistry tests have proven to be feasible using this approach. This review focuses on biomedical FT-IR applications, published in the period 2009-2013, used for early detection of cancer through qualitative and quantitative analysis.

  11. An automated system for pulmonary function testing

    NASA Technical Reports Server (NTRS)

    Mauldin, D. G.

    1974-01-01

    An experiment to quantitate pulmonary function was accepted for the space shuttle concept verification test. The single-breath maneuver and the nitrogen washout are combined to reduce the test time. Parameters are defined from the forced vital capacity maneuvers. A spirometer measures the breath volume, and a magnetic sector mass spectrometer provides definition of gas composition. Mass spectrometer and spirometer data are analyzed by a PDP-8/I digital computer.

  12. Respiratory protection against Mycobacterium tuberculosis: quantitative fit test outcomes for five type N95 filtering-facepiece respirators.

    PubMed

    Lee, Kiyoung; Slavcev, Andrea; Nicas, Mark

    2004-01-01

    In preparing to fit test a large workforce, a respirator program manager needs to initially choose respirators that will fit the greatest proportion of employees and achieve the best fits. This article discusses our strategy in selecting respirators from an initial array of seven NIOSH-certified Type N95 filtering-facepiece devices for a respiratory protection program against Mycobacterium tuberculosis (M. tb) aerosol. The seven respirators were screened based on manufacturer-provided fit test data, comfort, and cost. From these 7 devices, 5 were chosen for quantitative fit testing on 40 subjects who were a convenience sample from a cohort of approximately 30,000 workers scheduled to undergo fit testing. Across the five brands, medium/regular-size respirators fit from 8% to 95% of the subjects; providing another size of the same brand improved the pass rates slightly. Gender was not found to significantly affect fit test pass rates for any respirator brand. Among test panel members, an Aearo Corporation respirator (TC 84A-2630) and a 3M Company respirator (TC 84A-0006) provided the highest overall pass rates of 98% and 90%, respectively. We selected these two brands for fit testing in the larger worker cohort. To date, these two respirators have provided overall pass rates of 98% (1793/1830) and 88% (50/57), respectively, which are similar to the test panel results. Among 1850 individuals who have been fit tested, 1843 (99.6%) have been successfully fitted with one or the other brand. In a separate analysis, we used the test panel pass rates to estimate the reduction in M. tb infection risk afforded by the medium/regular-size of five filtering-facepiece respirators. We posed a low-exposure versus a high-exposure scenario for health care workers and assumed that respirators could be assigned without conducting fit testing, as proposed by many hospital infection control practitioners. 
Among those who would pass versus fail the fit test, we assumed an average respirator penetration (primarily due to faceseal leakage) of 0.04 and 0.3, respectively. The respirator with the highest overall pass rate (95%) reduced M. tb infection risk by 95%, while the respirator with the lowest pass rate (8%) reduced M. tb infection risk by only 70%. To promote the marketing of respirators that will successfully fit the highest proportion of wearers, and to increase protection for workers who might use respirators without the benefit of being fit tested, we recommend that fit testing be part of the NIOSH certification process for negative-pressure air-purifying respirators with tightly fitting facepieces. At a minimum, we recommend that respirator manufacturers generate and provide pass rate data to assist in selecting candidate respirators. In any event, program managers can initially select candidate respirators by comparing quantitative fit tests for a representative sample of their employee population.
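The risk comparison in this abstract can be reproduced with simple arithmetic, under the simplifying assumption (ours, not necessarily the authors' dose-response model) that infection risk scales linearly with mean facepiece penetration; the pass/fail penetration values of 0.04 and 0.3 and the 95%/8% pass rates are taken from the abstract.

```python
# Sketch: expected M. tb risk reduction when respirators are assigned
# without fit testing. Simplification: risk is taken to be proportional
# to mean facepiece penetration; penetration values are from the abstract.

def risk_reduction(pass_rate, pen_pass=0.04, pen_fail=0.3):
    """Fraction by which infection risk is reduced versus no respirator."""
    mean_penetration = pass_rate * pen_pass + (1 - pass_rate) * pen_fail
    return 1.0 - mean_penetration

# Respirator with the highest pass rate (95%) vs. the lowest (8%):
best = risk_reduction(0.95)   # 0.947, matching the reported ~95% reduction
worst = risk_reduction(0.08)  # 0.721, close to the reported ~70% reduction
print(f"best: {best:.1%}, worst: {worst:.1%}")
```

Under this linear model the lowest-pass-rate respirator comes out at about 72% risk reduction rather than the abstract's 70%; the study's own exposure scenarios are more detailed than this sketch.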

  13. Standardized pivot shift test improves measurement accuracy.

    PubMed

    Hoshino, Yuichi; Araujo, Paulo; Ahlden, Mattias; Moore, Charity G; Kuroda, Ryosuke; Zaffagnini, Stefano; Karlsson, Jon; Fu, Freddie H; Musahl, Volker

    2012-04-01

    The variability of the pivot shift test techniques greatly interferes with achieving a quantitative and generally comparable measurement. The purpose of this study was to compare the variation of the quantitative pivot shift measurements with different surgeons' preferred techniques to a standardized technique. The hypothesis was that standardizing the pivot shift test would improve consistency in the quantitative evaluation when compared with surgeon-specific techniques. A whole lower body cadaveric specimen was prepared to have a low-grade pivot shift on one side and high-grade pivot shift on the other side. Twelve expert surgeons performed the pivot shift test using (1) their preferred technique and (2) a standardized technique. Electromagnetic tracking was utilized to measure anterior tibial translation and acceleration of the reduction during the pivot shift test. The variation of the measurement was compared between the surgeons' preferred technique and the standardized technique. The anterior tibial translation during pivot shift test was similar between using surgeons' preferred technique (left 24.0 ± 4.3 mm; right 15.5 ± 3.8 mm) and using standardized technique (left 25.1 ± 3.2 mm; right 15.6 ± 4.0 mm; n.s.). However, the variation in acceleration was significantly smaller with the standardized technique (left 3.0 ± 1.3 mm/s²; right 2.5 ± 0.7 mm/s²) compared with the surgeons' preferred technique (left 4.3 ± 3.3 mm/s²; right 3.4 ± 2.3 mm/s²; both P < 0.01). Standardizing the pivot shift test maneuver provides a more consistent quantitative evaluation and may be helpful in designing future multicenter clinical outcome trials. Diagnostic study, Level I.

  14. Quantitative Detection and Biological Propagation of Scrapie Seeding Activity In Vitro Facilitate Use of Prions as Model Pathogens for Disinfection

    PubMed Central

    Pritzkow, Sandra; Wagenführ, Katja; Daus, Martin L.; Boerner, Susann; Lemmer, Karin; Thomzig, Achim; Mielke, Martin; Beekes, Michael

    2011-01-01

    Prions are pathogens with an unusually high tolerance to inactivation and constitute a complex challenge to the re-processing of surgical instruments. On the other hand, however, they provide an informative paradigm which has been exploited successfully for the development of novel broad-range disinfectants simultaneously active also against bacteria, viruses and fungi. Here we report on the development of a methodological platform that further facilitates the use of scrapie prions as model pathogens for disinfection. We used specifically adapted serial protein misfolding cyclic amplification (PMCA) for the quantitative detection, on steel wires providing model carriers for decontamination, of 263K scrapie seeding activity converting normal protease-sensitive into abnormal protease-resistant prion protein. Reference steel wires carrying defined amounts of scrapie infectivity were used for assay calibration, while scrapie-contaminated test steel wires were subjected to fifteen different procedures for disinfection that yielded scrapie titre reductions of ≤10^1- to ≥10^5.5-fold. As confirmed by titration in hamsters the residual scrapie infectivity on test wires could be reliably deduced for all examined disinfection procedures, from our quantitative seeding activity assay. Furthermore, we found that scrapie seeding activity present in 263K hamster brain homogenate or multiplied by PMCA of scrapie-contaminated steel wires both triggered accumulation of protease-resistant prion protein and was further propagated in a novel cell assay for 263K scrapie prions, i.e., cerebral glial cell cultures from hamsters. The findings from our PMCA- and glial cell culture assays revealed scrapie seeding activity as a biochemically and biologically replicative principle in vitro, with the former being quantitatively linked to prion infectivity detected on steel wires in vivo. 
When combined, our in vitro assays provide an alternative to titrations of biological scrapie infectivity in animals that substantially facilitates the use of prions as potentially highly indicative test agents in the search for novel broad-range disinfectants. PMID:21647368

  15. Rationalising the 'irrational': a think aloud study of discrete choice experiment responses.

    PubMed

    Ryan, Mandy; Watson, Verity; Entwistle, Vikki

    2009-03-01

    Stated preference methods assume respondents' preferences are consistent with utility theory, but many empirical studies report evidence of preferences that violate utility theory. This evidence is often derived from quantitative tests that occur naturally within, or are added to, stated preference tasks. In this study, we use qualitative methods to explore three axioms of utility theory: completeness, monotonicity, and continuity. We take a novel approach, adopting a 'think aloud' technique to identify violations of the axioms of utility theory and to consider how well the quantitative tests incorporated within a discrete choice experiment are able to detect these. Results indicate that quantitative tests classify respondents as being 'irrational' when qualitative statements would indicate they are 'rational'. In particular, 'non-monotonic' responses can often be explained by respondents inferring additional information beyond what is presented in the task, and individuals who appear to adopt non-compensatory decision-making strategies do so because they rate particular attributes very highly (they are not attempting to simplify the task). The results also provide evidence of 'cost-based responses': respondents assumed tests with higher costs would be of higher quality. The value of including in-depth qualitative validation techniques in the development of stated preference tasks is shown.

  16. Correlation between quantitative whole-body muscle magnetic resonance imaging and clinical muscle weakness in Pompe disease.

    PubMed

    Horvath, Jeffrey J; Austin, Stephanie L; Case, Laura E; Greene, Karla B; Jones, Harrison N; Soher, Brian J; Kishnani, Priya S; Bashir, Mustafa R

    2015-05-01

    Previous examination of whole-body muscle involvement in Pompe disease has been limited to physical examination and/or qualitative magnetic resonance imaging (MRI). In this study we assess the feasibility of quantitative proton-density fat-fraction (PDFF) whole-body MRI in late-onset Pompe disease (LOPD) and compare the results with manual muscle testing. Seven LOPD patients and 11 disease-free controls underwent whole-body PDFF MRI. Quantitative MR muscle group assessments were compared with physical testing of muscle groups. The 95% upper limits of confidence intervals for muscle groups were 4.9-12.6% in controls and 6.8-76.4% in LOPD patients. LOPD patients showed severe and consistent tongue and axial muscle group involvement, with less marked involvement of peripheral musculature. MRI was more sensitive than physical examination for detection of abnormality in multiple muscle groups. This integrated, quantitative approach to muscle assessment provides more detailed data than physical examination and may have clinical utility for monitoring disease progression and treatment response. © 2014 Wiley Periodicals, Inc.

  17. Parametric evaluation of the cost effectiveness of Shuttle payload vibroacoustic test plans

    NASA Technical Reports Server (NTRS)

    Stahle, C. V.; Gongloff, H. R.; Keegan, W. B.; Young, J. P.

    1978-01-01

    Consideration is given to alternate vibroacoustic test plans for sortie and free flyer Shuttle payloads. Statistical decision models for nine test plans provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology is a major step toward the development of a useful tool for the quantitative tailoring of vibroacoustic test programs to sortie and free flyer payloads. A broader application of the methodology is now possible by the use of the OCTAVE computer code.

  18. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.

  19. Automated Chemical Warfare Respirator Quantitative Fit Test Instrument

    DTIC Science & Technology

    1985-04-01

    …requisite to assessment of the level of protection provided by the respirator. Quantitative measurement of the variability of fit of the facepiece… (The remainder of the scanned abstract is illegible; legible fragments reference a corn oil aerosol photometer and an HP 3497A control and data acquisition unit.)

  20. The FY 1980 Department of Defense Program for Research, Development, and Acquisition

    DTIC Science & Technology

    1979-02-01

    materiel. Up to a point, superior performance is an offset to this quantitative disadvantage. Lanchester's theory of warfare derived simplified relations...intermediate ranges. Underground Test. The next scheduled underground test (UGT), MINERS IRON, in FY 1980, will provide engineering and design data on...methods of discriminating between UGTs and earthquakes, and address U.S. capabilities to monitor both the existing Threshold Test Ban Treaty and the

  1. Supporting the Integration of HIV Testing Into Primary Care Settings

    PubMed Central

    Bradley-Springer, Lucy; Kang Dufour, Mi-Suk; Koester, Kimberly A.; Beane, Stephanie; Warren, Nancy; Beal, Jeffrey; Frank, Linda Rose

    2012-01-01

    Objectives. We examined the efforts of the US network of AIDS Education and Training Centers (AETCs) to increase HIV testing capacity across a variety of clinical settings. Methods. We used quantitative process data from 8 regional AETCs for July 1, 2008, to June 30, 2009, and qualitative program descriptions to demonstrate how AETC education helped providers integrate HIV testing into routine clinical care with the goals of early diagnosis and treatment. Results. Compared with other AETC training, HIV testing training was longer and used a broader variety of strategies to educate more providers per training. During education, providers were able to understand their primary care responsibility to address public health concerns through HIV testing. Conclusions. AETC efforts illustrate how integration of the principles of primary care and public health can be promoted through professional training. PMID:22515867

  2. Mobile Phone Sensing of Cocaine in a Lateral Flow Assay Combined with a Biomimetic Material.

    PubMed

    Guler, Emine; Yilmaz Sengel, Tulay; Gumus, Z Pinar; Arslan, Mustafa; Coskunol, Hakan; Timur, Suna; Yagci, Yusuf

    2017-09-19

    Lateral flow assays (LFAs) are an ideal choice for drug abuse testing, favored for their practicability, portability, and rapidity. LFA-based on-site rapid screening devices provide a positive/negative judgment in a short response time. The conventionally applied competitive assay format used for small-molecule analysis, such as abused drugs, restricts the quantitation ability of LFA strips. We report herein, for the first time, a new strategy using the noncompetitive assay format via a biomimetic material, namely, poly(p-phenylene) β-cyclodextrin poly(ethylene glycol) (PPP-CD-g-PEG), combined with gold nanoparticle (AuNP) conjugates as the labeling agent to recognize the target cocaine molecule in the test zone. The intensities of the visualized red color in the test line, indicating the cocaine concentration, were analyzed via a smartphone application. Significantly, a combination of this platform with a smartphone application provides quantitative data on the cocaine amount, making it a very inventive and attractive approach especially for on-site applications at critical points such as traffic stops and the workplace.
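The smartphone readout described here amounts to mapping a measured test-line colour intensity onto a concentration via a pre-recorded calibration curve. The sketch below illustrates that mapping with piecewise-linear interpolation; the calibration points and the interpolation scheme are our own illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: quantifying an LFA test line from a measured
# red-channel intensity using a pre-recorded calibration curve.
# The calibration values below are invented for illustration.
import bisect

# (mean test-line intensity, cocaine concentration in ng/mL), assumed data
CALIBRATION = [(10, 0.0), (40, 25.0), (90, 100.0), (160, 250.0)]

def intensity_to_concentration(intensity):
    """Piecewise-linear interpolation over the calibration points."""
    xs = [x for x, _ in CALIBRATION]
    ys = [y for _, y in CALIBRATION]
    if intensity <= xs[0]:
        return ys[0]
    if intensity >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, intensity)
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (intensity - x0) / (x1 - x0)

print(intensity_to_concentration(65))  # midway between 40 and 90 -> 62.5
```

In practice the paper reports the analysis through a smartphone application; this sketch only shows the calibration-curve step such an application would need.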

  3. Structural fatigue test results for large wind turbine blade sections

    NASA Technical Reports Server (NTRS)

    Faddoul, J. R.; Sullivan, T. L.

    1982-01-01

    In order to provide quantitative information on the operating life capabilities of wind turbine rotor blade concepts for root-end load transfer, a series of cantilever beam fatigue tests was conducted. Fatigue tests were conducted on a laminated wood blade with bonded steel studs, a low cost steel spar (utility pole) with a welded flange, a utility pole with additional root-end thickness provided by a swaged collar, fiberglass spars with both bonded and nonbonded fittings, and, finally, an aluminum blade with a bolted steel fitting (Lockheed Mod-0 blade). Photographs, data, and conclusions for each of these tests are presented. In addition, the aluminum blade test results are compared to field failure information; these results provide evidence that the cantilever beam type of fatigue test is a satisfactory method for obtaining qualitative data on blade life expectancy and for identifying structurally underdesigned areas (hot spots).

  4. Quantitative Assessment of Neurite Outgrowth in PC12 Cells

    EPA Science Inventory

    In vitro test methods can provide a rapid approach for the screening of large numbers of chemicals for their potential to produce toxicity. In order to identify potential developmental neurotoxicants, assessment of critical neurodevelopmental processes such as neuronal differenti...

  5. Chronic Obstructive Pulmonary Disease: Lobe-based Visual Assessment of Volumetric CT by Using Standard Images—Comparison with Quantitative CT and Pulmonary Function Test in the COPDGene Study

    PubMed Central

    Kim, Song Soo; Lee, Ho Yun; Nevrekar, Dipti V.; Forssen, Anna V.; Crapo, James D.; Schroeder, Joyce D.; Lynch, David A.

    2013-01-01

    Purpose: To provide a new detailed visual assessment scheme of computed tomography (CT) for chronic obstructive pulmonary disease (COPD) by using standard reference images and to compare this visual assessment method with quantitative CT and several physiologic parameters. Materials and Methods: This research was approved by the institutional review board of each institution. CT images of 200 participants in the COPDGene study were evaluated. Four thoracic radiologists performed independent, lobar analysis of volumetric CT images for type (centrilobular, panlobular, and mixed) and extent (on a six-point scale) of emphysema, the presence of bronchiectasis, airway wall thickening, and tracheal abnormalities. Standard images for each finding, generated by two radiologists, were used for reference. The extent of emphysema, airway wall thickening, and luminal area were quantified at the lobar level by using commercial software. Spearman rank test and simple and multiple regression analyses were performed to compare the results of visual assessment with physiologic and quantitative parameters. Results: The type of emphysema, determined by four readers, showed good agreement (κ = 0.63). The extent of the emphysema in each lobe showed good agreement (mean weighted κ = 0.70) and correlated with findings at quantitative CT (r = 0.75), forced expiratory volume in 1 second (FEV1) (r = −0.68), FEV1/forced vital capacity (FVC) ratio (r = −0.74) (P < .001). Agreement for airway wall thickening was fair (mean κ = 0.41), and the number of lobes with thickened bronchial walls correlated with FEV1 (r = −0.60) and FEV1/FVC ratio (r = −0.60) (P < .001). Conclusion: Visual assessment of emphysema and airways disease in individuals with COPD can provide reproducible, physiologically substantial information that may complement that provided by quantitative CT assessment. 
© RSNA, 2012 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.12120385/-/DC1 PMID:23220894
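Inter-reader agreement of the kind reported in this record (κ and mean weighted κ across four radiologists) can be computed as sketched below. This is a generic linear-weighted Cohen's kappa for two readers, with an invented confusion matrix; it is not the study's data or its exact statistical procedure.

```python
# Sketch: linear-weighted Cohen's kappa for two readers grading the same
# cases on an ordinal scale (e.g., a six-point emphysema-extent score).
# The confusion matrices below are invented for illustration.

def weighted_kappa(confusion):
    """confusion[i][j] = cases reader A scored i and reader B scored j."""
    k = len(confusion)
    n = sum(sum(row) for row in confusion)
    row_tot = [sum(row) for row in confusion]
    col_tot = [sum(confusion[i][j] for i in range(k)) for j in range(k)]
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = abs(i - j) / (k - 1)                      # linear disagreement weight
            num += w * confusion[i][j] / n                # observed weighted disagreement
            den += w * row_tot[i] * col_tot[j] / n ** 2   # chance-expected disagreement
    return 1.0 - num / den

# Perfect agreement yields kappa = 1:
print(weighted_kappa([[5, 0, 0], [0, 5, 0], [0, 0, 5]]))  # 1.0
```

With unweighted kappa all disagreements count equally; the linear weights here penalize a one-grade discrepancy less than a five-grade one, which is why weighted κ is the usual choice for ordinal scores like emphysema extent.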

  6. Development of combinatorial chemistry methods for coatings: high-throughput adhesion evaluation and scale-up of combinatorial leads.

    PubMed

    Potyrailo, Radislav A; Chisholm, Bret J; Morris, William G; Cawse, James N; Flanagan, William P; Hassib, Lamyaa; Molaison, Chris A; Ezbiansky, Karin; Medford, George; Reitz, Hariklia

    2003-01-01

    Coupling of combinatorial chemistry methods with high-throughput (HT) performance testing and measurements of resulting properties has provided a powerful set of tools for the 10-fold accelerated discovery of new high-performance coating materials for automotive applications. Our approach replaces labor-intensive steps with automated systems for evaluation of adhesion of 8 x 6 arrays of coating elements that are discretely deposited on a single 9 x 12 cm plastic substrate. Performance of coatings is evaluated with respect to their resistance to adhesion loss, because this parameter is one of the primary considerations in end-use automotive applications. Our HT adhesion evaluation provides previously unavailable capabilities of high speed and reproducibility of testing by using a robotic automation, an expanded range of types of tested coatings by using the coating tagging strategy, and an improved quantitation by using high signal-to-noise automatic imaging. Upon testing, the coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Using our HT methodology, we have developed several coatings leads. These HT screening results for the best coating compositions have been validated on the traditional scales of coating formulation and adhesion loss testing. These validation results have confirmed the superb performance of combinatorially developed coatings over conventional coatings on the traditional scale.

  7. Quantitative template for subtyping primary progressive aphasia.

    PubMed

    Mesulam, Marsel; Wieneke, Christina; Rogalski, Emily; Cobia, Derin; Thompson, Cynthia; Weintraub, Sandra

    2009-12-01

    The syndrome of primary progressive aphasia (PPA) is diagnosed when a gradual failure of word usage or comprehension emerges as the principal feature of a neurodegenerative disease. To provide a quantitative algorithm for classifying PPA into agrammatic (PPA-G), semantic (PPA-S), and logopenic (PPA-L) variants, each of which is known to have a different probability of association with Alzheimer disease vs frontotemporal lobar degeneration. Prospective study. University medical center. Sixteen consecutively enrolled patients with PPA who underwent neuropsychological testing and magnetic resonance imaging recruited nationally in the United States as part of a longitudinal study. A 2-dimensional template that reflects performance on tests of syntax (Northwestern Anagram Test) and lexical semantics (Peabody Picture Vocabulary Test-Fourth Edition) classified all 16 patients in concordance with a clinical diagnosis that had been made before the administration of quantitative tests. All 3 PPA subtypes had distinctly asymmetrical atrophy of the left perisylvian language network. Each subtype also had distinctive peak atrophy sites: PPA-G in the inferior frontal gyrus (Broca area), PPA-S in the anterior temporal lobe, and PPA-L in Brodmann area 37. Once an accurate root diagnosis of PPA is made, subtyping can be quantitatively guided using a 2-dimensional template based on orthogonal tasks of grammatical competence and word comprehension. Although the choice of tasks and the precise cutoff levels may need to be adjusted to fit linguistic and educational backgrounds, these 16 patients demonstrate the feasibility of using a simple algorithm for clinicoanatomical classification in PPA. Prospective studies will show whether this subtyping can improve clinical prediction of the underlying neuropathologic condition.
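The two-dimensional template described here amounts to classifying each patient by where the syntax score (Northwestern Anagram Test) and the lexical-semantics score (Peabody Picture Vocabulary Test) fall relative to cutoffs. The sketch below illustrates that decision structure; the cutoff values, the score scale, and the treatment of the both-impaired region are our own simplifying assumptions, not the paper's derived thresholds.

```python
# Hypothetical sketch of a two-dimensional PPA subtyping template:
# classify by whether syntax and lexical-semantics scores fall below
# cutoffs. Cutoffs and scale (fraction correct) are invented; the paper
# derives its own from quantitative testing.

SYNTAX_CUTOFF = 0.7     # assumed fraction-correct threshold
SEMANTICS_CUTOFF = 0.7  # assumed fraction-correct threshold

def classify_ppa(syntax_score, semantics_score):
    impaired_syntax = syntax_score < SYNTAX_CUTOFF
    impaired_semantics = semantics_score < SEMANTICS_CUTOFF
    if impaired_syntax and not impaired_semantics:
        return "PPA-G"  # agrammatic: syntax impaired, comprehension spared
    if impaired_semantics and not impaired_syntax:
        return "PPA-S"  # semantic: comprehension impaired, syntax spared
    # Remaining region (neither, or both, selectively impaired) is
    # treated as PPA-L in this simplification.
    return "PPA-L"

print(classify_ppa(0.5, 0.9))  # PPA-G
```

The point of the template is that the two tasks are orthogonal, so a patient's position in the plane, rather than any single score, determines the subtype.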

  8. QUANTITATIVE TEMPLATE FOR SUBTYPING PRIMARY PROGRESSIVE APHASIA

    PubMed Central

    Mesulam, Marsel; Wieneke, Christina; Rogalski, Emily; Cobia, Derin; Thompson, Cynthia; Weintraub, Sandra

    2009-01-01

    Objective To provide a quantitative algorithm for classifying primary progressive aphasia (PPA) into agrammatic (PPA-G), semantic (PPA-S) and logopenic (PPA-L) variants, each of which is known to have a different probability of association with Alzheimer’s disease (AD) versus frontotemporal lobar degeneration (FTLD). Design Sixteen prospectively and consecutively enrolled PPA patients were tested with neuropsychological instruments and magnetic resonance imaging (MRI). Setting University medical center. Participants PPA patients recruited nationally in the USA as part of a longitudinal study. Results A two-dimensional template, reflecting performance on tests of syntax (Northwestern Anagram Test) and lexical semantics (Peabody Picture Vocabulary Test), classified all 16 patients in concordance with a clinical diagnosis that had been made prior to the administration of the quantitative tests. All three subtypes had distinctly asymmetrical atrophy of the left perisylvian language network. Each subtype also had distinctive peak atrophy sites. Only PPA-G had peak atrophy in the IFG (Broca’s area), only PPA-S had peak atrophy in the anterior temporal lobe, and only PPA-L had peak atrophy in area 37. Conclusions Once an accurate root diagnosis of PPA is made, subtyping can be quantitatively guided using a two-dimensional template based on orthogonal tasks of grammatical competence and word comprehension. Although the choice of tasks and precise cut-off levels may evolve in time, this set of 16 patients demonstrates the feasibility of using a simple algorithm for clinico-anatomical classification in PPA. Prospective studies will show whether this subtyping can improve the clinical prediction of underlying neuropathology. PMID:20008661

  9. The use of digital PCR to improve the application of quantitative molecular diagnostic methods for tuberculosis.

    PubMed

    Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F

    2016-08-03

    Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration, however there was a marked difference in the measured magnitude. TB is a disease where the quantification of the pathogen could lead to better patient management and qPCR methods offer the potential to rapidly perform such analysis. 
However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification of Mycobacterium tuberculosis would provide a clinically useful readout. The methods described in this study provide a means by which the technical performance of quantitative molecular methods can be evaluated independently of clinical variability to improve accuracy of measurement results. These will assist in ultimately increasing the likelihood that such approaches could be used to improve patient management of TB.
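The absolute quantification that makes dPCR attractive as a reference method rests on standard partition-counting arithmetic. A minimal sketch of that Poisson correction follows; the counts and partition volume are hypothetical illustrations, not values from this study:

```python
import math

def dpcr_copies_per_ul(positive, total, partition_volume_ul):
    """Estimate target concentration from digital PCR partition counts.

    Uses the standard Poisson correction: the mean number of copies per
    partition is lambda = -ln(1 - p), where p is the fraction of
    positive partitions; dividing by partition volume gives copies/uL.
    """
    p = positive / total
    lam = -math.log(1.0 - p)          # mean copies per partition
    return lam / partition_volume_ul  # copies per microliter

# Hypothetical run: 4000 of 20000 partitions positive, 0.85 nL partitions
conc = dpcr_copies_per_ul(4000, 20000, 0.85e-3)
```

Because the estimate depends only on counting partitions and knowing their volume, no external calibrant is needed, which is why dPCR can agree across laboratories and instruments without calibration.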

  10. NecroQuant: quantitative assessment of radiological necrosis

    NASA Astrophysics Data System (ADS)

    Hwang, Darryl H.; Mohamed, Passant; Varghese, Bino A.; Cen, Steven Y.; Duddalwar, Vinay

    2017-11-01

    Clinicians can now objectively quantify tumor necrosis by Hounsfield units and enhancement characteristics from multiphase contrast-enhanced CT imaging. NecroQuant has been designed to work as part of a radiomics pipeline. The software is a departure from the conventional qualitative assessment of tumor necrosis, as it provides the user (radiologists and researchers) a simple interface to precisely and interactively define and measure necrosis in contrast-enhanced CT images. Although the software is tested here on renal masses, it can be re-configured to assess tumor necrosis across a variety of tumors from different body sites, providing a generalized, open, portable, and extensible quantitative analysis platform that is widely applicable across cancer types.

  11. A comparison of manual and quantitative elbow strength testing.

    PubMed

    Shahgholi, Leili; Bengtson, Keith A; Bishop, Allen T; Shin, Alexander Y; Spinner, Robert J; Basford, Jeffrey R; Kaufman, Kenton R

    2012-10-01

    The aim of this study was to compare the clinical ratings of elbow strength obtained by skilled clinicians with objective strength measurement obtained through quantitative testing. A retrospective comparison of subject clinical records with quantitative strength testing results in a motion analysis laboratory was conducted. A total of 110 individuals between the ages of 8 and 65 yrs with traumatic brachial plexus injuries were identified. Patients underwent manual muscle strength testing as assessed on the 5-point British Medical Research Council Scale (5/5, normal; 0/5, absent) and quantitative elbow flexion and extension strength measurements. A total of 92 subjects had elbow flexion testing. Half of the subjects clinically assessed as having normal (5/5) elbow flexion strength on manual muscle testing exhibited less than 42% of their age-expected strength on quantitative testing. Eighty-four subjects had elbow extension strength testing. Similarly, half of those displaying normal elbow extension strength on manual muscle testing were found to have less than 62% of their age-expected values on quantitative testing. Significant differences between manual muscle testing and quantitative findings were not detected for the lesser (0-4) strength grades. Manual muscle testing, even when performed by experienced clinicians, may be more misleading than expected for subjects graded as having normal (5/5) strength. Manual muscle testing estimates for the lesser strength grades (1-4/5) seem reasonably accurate.

  12. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  13. Detection of gas leakage

    DOEpatents

    Thornberg, Steven M; Brown, Jason

    2015-02-17

    A method of detecting leaks and measuring volumes, together with a device, the Power-free Pump Module (PPM), provides a self-contained leak test and volume measurement apparatus that requires no external sources of electrical power during leak testing or volume measurement. The PPM is a portable, pneumatically-controlled instrument capable of generating a vacuum, calibrating volumes, and performing quantitative leak tests on a closed test system or device, all without the use of alternating current (AC) power. Capabilities include the ability to provide a modest vacuum (less than 10 Torr) using a venturi pump, perform a pressure-rise leak test, measure the gas's absolute pressure, and perform volume measurements. All operations are performed through a simple rotary control valve which controls pneumatically-operated manifold valves.
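A pressure-rise leak test of the kind described reduces to the standard throughput relation Q = V · ΔP/Δt. A minimal sketch with hypothetical numbers (not PPM specifications):

```python
def leak_rate_torr_l_per_s(volume_l, p_start_torr, p_end_torr, elapsed_s):
    """Leak rate from a pressure-rise test: Q = V * dP/dt.

    The closed test system is evacuated and isolated, and the pressure
    rise over a known interval is converted to a throughput in Torr*L/s.
    """
    return volume_l * (p_end_torr - p_start_torr) / elapsed_s

# Hypothetical: a 2.5 L system rises from 8.0 to 9.2 Torr over 600 s
q = leak_rate_torr_l_per_s(2.5, 8.0, 9.2, 600.0)
```

Note the calculation requires the system volume, which is why the PPM's volume-measurement capability is paired with its leak-test capability.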

  14. Trends in hypothesis testing and related variables in nursing research: a retrospective exploratory study.

    PubMed

    Lash, Ayhan Aytekin; Plonczynski, Donna J; Sehdev, Amikar

    2011-01-01

    To compare the inclusion and the influences of selected variables on hypothesis testing during the 1980s and 1990s. In spite of the emphasis on conducting inquiry consistent with the tenets of logical positivism, there have been no studies investigating the frequency and patterns of hypothesis testing in nursing research. The sample was obtained from the journal Nursing Research, which was the research journal with the highest circulation during the period under study. All quantitative studies published during the two decades, including briefs and historical studies, were included in the analyses. A retrospective design was used to select the sample: five years each from the 1980s and 1990s were randomly selected from the journal. Of the 582 studies, 517 met inclusion criteria. Findings suggest that there was a decline in the use of hypothesis testing in the last decades of the 20th century. Further research is needed to identify the factors that influence the conduct of research with hypothesis testing. Hypothesis testing in nursing research showed a steady decline from the 1980s to the 1990s. Research purposes of explanation and prediction/control increased the likelihood of hypothesis testing. Hypothesis testing strengthens the quality of quantitative studies, increases the generality of findings, and provides dependable knowledge. This is particularly true for quantitative studies that aim to explore, explain, and predict/control phenomena and/or test theories. The findings also have implications for doctoral programmes, the research preparation of nurse-investigators, and theory testing.

  15. Convergent and sequential synthesis designs: implications for conducting and reporting systematic reviews of qualitative and quantitative evidence.

    PubMed

    Hong, Quan Nha; Pluye, Pierre; Bujold, Mathieu; Wassef, Maggy

    2017-03-23

    Systematic reviews of qualitative and quantitative evidence can provide a rich understanding of complex phenomena. This type of review is increasingly popular, has been used to provide a landscape of existing knowledge, and addresses the types of questions not usually covered in reviews relying solely on either quantitative or qualitative evidence. Although several typologies of synthesis designs have been developed, none have been tested on a large sample of reviews. The aim of this review of reviews was to identify and develop a typology of synthesis designs and methods that have been used and to propose strategies for synthesizing qualitative and quantitative evidence. A review of systematic reviews combining qualitative and quantitative evidence was performed. Six databases were searched from inception to December 2014. Reviews were included if they were systematic reviews combining qualitative and quantitative evidence. The included reviews were analyzed according to three concepts of synthesis processes: (a) synthesis methods, (b) sequence of data synthesis, and (c) integration of data and synthesis results. A total of 459 reviews were included. The analysis of this literature highlighted a lack of transparency in reporting how evidence was synthesized and a lack of consistency in the terminology used. Two main types of synthesis designs were identified: convergent and sequential synthesis designs. 
Within the convergent synthesis design, three subtypes were found: (a) data-based convergent synthesis design, where qualitative and quantitative evidence is analyzed together using the same synthesis method, (b) results-based convergent synthesis design, where qualitative and quantitative evidence is analyzed separately using different synthesis methods and results of both syntheses are integrated during a final synthesis, and (c) parallel-results convergent synthesis design consisting of independent syntheses of qualitative and quantitative evidence and an interpretation of the results in the discussion. Performing systematic reviews of qualitative and quantitative evidence is challenging because of the multiple synthesis options. The findings provide guidance on how to combine qualitative and quantitative evidence. Also, recommendations are made to improve the conducting and reporting of this type of review.

  16. [Nutritional self-care promotion in community-dwelling older people: a protocol of mixed method research].

    PubMed

    Raffaele, Barbara; Matarese, Maria; Piredda, Michela; De Marinis, Maria Grazia

    2016-01-01

    To describe a research protocol designed to promote nutritional self-care in older people. The aims of the research are: a) to evaluate the effectiveness of a nutritional education intervention in changing knowledge, attitudes, and behaviors; b) to describe nutritional self-care ability and activities; c) to identify the promoting factors and barriers that influence changes in nutritional knowledge, behaviors, and attitudes in home-dwelling older people. Sequential explanatory mixed method design. The study will enroll 50 people aged 65 years and over. In the first, quantitative phase, a pre-test and post-test design will be used to deliver a nutritional intervention aimed at changing knowledge, behaviors, and attitudes toward nutrition. Using the quantitative study results, the qualitative study phase will be conducted through interviews with sub-groups of older people. In a third phase, the quantitative and qualitative study results will be integrated. Quantitative data will be analyzed using descriptive and inferential statistics, and qualitative data will be analyzed through content analysis. The study will provide new knowledge on nutritional self-care in home-dwelling older adults and the factors promoting nutritional self-care. Nutritional self-care promotion is of pivotal importance for the nursing care provided to home-dwelling older people. Educational programs aimed at the maintenance of proper nutrition in older adults may reduce malnutrition and related diseases. Nutrition educational programs should be based on knowledge derived from research in order to tailor individualized nutritional interventions and deliver effective educational programs.

  17. Generalizability and Validity of a Mathematics Performance Assessment.

    ERIC Educational Resources Information Center

    Lane, Suzanne; And Others

    1996-01-01

    Evidence from test results of 3,604 sixth and seventh graders is provided for the generalizability and validity of the Quantitative Understanding: Amplifying Student Achievement and Reasoning (QUASAR) Cognitive Assessment Instrument, which is designed to measure program outcomes and growth in mathematics. (SLD)

  18. Further Reflections

    ERIC Educational Resources Information Center

    McGaw, Barry

    2008-01-01

    In their reactions to my paper, the four authors provide comments that are illuminating and helpful for continuing discussions of the nature and utility of quantitative, comparative, international studies of educational achievement. In this response, I comment further on the issues of test characteristics, sample design, culture and causation.

  19. 40 CFR 761.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... screening tests do not usually provide: an identity record generated by an instrument; a quantitative... accordance with subpart D of this part. Research and development (R&D) for PCB disposal means demonstrations... not been approved, development of new disposal technologies, and research on chemical transformation...

  20. Characterization of Microporous Insulation, Microsil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, R.

    Microsil microporous insulation has been characterized by Lawrence Livermore National Laboratory for possible use in structural and thermal applications in the DPP-1 design. Qualitative test results have provided mechanical behavioral characteristics for DPP-1 design studies and focused on the material behavioral response to being crushed, cyclically loaded, and subjected to vibration for a confined material with an interference fit or a radial gap. Quantitative test results have provided data to support the DPP-1 FEA model analysis and verification and were used to determine mechanical property values for the material under a compression load. The test results are documented within this report.

  1. Properties of young massive clusters obtained with different massive-star evolutionary models

    NASA Astrophysics Data System (ADS)

    Wofford, Aida; Charlot, Stéphane

    We undertake a comprehensive comparative test of seven widely used spectral synthesis models using multi-band HST photometry of a sample of eight YMCs in two galaxies. We provide a first quantitative estimate of the accuracies and uncertainties of new models, show the progress models have made in fitting high-quality observations, and highlight the need for further comprehensive comparative tests.

  2. Comment on Hall et al. (2017), "How to Choose Between Measures of Tinnitus Loudness for Clinical Research? A Report on the Reliability and Validity of an Investigator-Administered Test and a Patient-Reported Measure Using Baseline Data Collected in a Phase IIa Drug Trial".

    PubMed

    Sabour, Siamak

    2018-03-08

    The purpose of this letter, in response to Hall, Mehta, and Fackrell (2017), is to provide important knowledge about methodological and statistical issues in assessing the reliability and validity of an audiologist-administered tinnitus loudness matching test and a patient-reported tinnitus loudness rating. The author uses reference textbooks and published articles on the scientific assessment of the validity and reliability of clinical tests to discuss the statistical tests and the methodological approach used in assessing validity and reliability in clinical research. Depending on the type of variable (qualitative or quantitative), well-known statistical tests can be applied to assess reliability and validity. For qualitative variables, sensitivity, specificity, positive predictive value, negative predictive value, false-positive and false-negative rates, likelihood ratio positive and likelihood ratio negative, as well as the odds ratio (i.e., the ratio of true to false results), are the most appropriate estimates for evaluating the validity of a test compared to a gold standard. For quantitative variables, depending on the distribution of the variable, the Pearson r or Spearman rho can be applied. Diagnostic accuracy (validity) and diagnostic precision (reliability or agreement) are two completely different methodological issues. Depending on the type of variable (qualitative or quantitative), well-known statistical tests can be applied to assess validity.
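The validity estimates listed for qualitative variables all derive from the familiar 2×2 table against a gold standard. A small illustrative sketch with hypothetical counts:

```python
def diagnostic_validity(tp, fp, fn, tn):
    """Validity estimates for a qualitative (binary) test compared to a
    gold standard, computed from the 2x2 table counts:
    tp/fp/fn/tn = true positive / false positive / false negative / true negative.
    """
    sens = tp / (tp + fn)   # sensitivity
    spec = tn / (tn + fp)   # specificity
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "lr_pos": sens / (1 - spec),    # likelihood ratio positive
        "lr_neg": (1 - sens) / spec,    # likelihood ratio negative
    }

# Hypothetical counts: 100 diseased, 100 disease-free subjects
stats = diagnostic_validity(tp=80, fp=10, fn=20, tn=90)
```

Note that these quantities address accuracy (validity); agreement statistics for precision (reliability), such as correlation between repeated measurements, are a separate question, as the letter emphasizes.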

  3. Office procedures for quantitative assessment of olfactory function.

    PubMed

    Doty, Richard L

    2007-01-01

    Despite the importance of the sense of smell for establishing the flavor of foods and beverages, as well as protecting against environmental dangers, this primary sensory system is commonly ignored by the rhinologist. In this article basic issues related to practical measurement of olfactory function in the clinic are described and examples of the application of the two most common paradigms for such measurement--odor identification and detection--are presented. A listing is made of the 27 olfactory tests currently used clinically, along with their strengths and weaknesses. A brief review of common nasosinus-related disorders for which quantitative olfactory testing has been performed is provided. Although many psychophysical tests are available for quantifying olfactory loss, it is apparent that a number are limited in terms of practicality, sensitivity, and reliability. In general, sensitivity and reliability are positively correlated with test length. Given the strengths of the more reliable forced-choice psychophysical tests and the limitations of electrophysiological tests, the common distinction between "subjective" and "objective" tests is misleading and should not be used. Complete recovery of olfactory function, as measured quantitatively, rarely follows surgical or medical interventions in patients with rhinosinusitis. Given the availability of practical clinical olfactory tests, the modern rhinologist can easily quantify cranial nerve (CN) I function. The application of such tests has led to a new understanding of the effects of nasal disease on olfactory function. Except in cases of total or near-total nasal obstruction, olfactory and airway patency measures usually are unrelated, in accord with the concept that rhinosinusitis primarily influences olfactory function by apoptotic pathological changes within the olfactory neuroepithelium.

  4. Using the iPhone as a device for a rapid quantitative analysis of trinitrotoluene in soil.

    PubMed

    Choodum, Aree; Kanatharana, Proespichaya; Wongniramaikul, Worawit; Daeid, Niamh Nic

    2013-10-15

    Mobile 'smart' phones have become almost ubiquitous in society and are typically equipped with a high-resolution digital camera which can be used to produce an image very conveniently. In this study, the built-in digital camera of a smart phone (iPhone) was used to capture the results from a rapid quantitative colorimetric test for trinitrotoluene (TNT) in soil. The results were compared to those from a digital single-lens reflex (DSLR) camera. The colored product from the selective test for TNT was quantified using an innovative application of photography in which the relationships between the Red Green Blue (RGB) values and the concentrations of the colorimetric product were exploited. The iPhone proved more convenient to use than the DSLR while providing similar analytical results with increased sensitivity. The wide linear range and low detection limits achieved were comparable with those from spectrophotometric quantification methods. Low relative errors in the range of 0.4 to 6.3% were achieved in the analysis of control samples and 0.4-6.2% for spiked soil extracts, with good precision (2.09-7.43% RSD) for the analysis over 4 days. The results demonstrate that the iPhone has the potential to serve as a platform for the development of a rapid, on-site, semi-quantitative field test for the analysis of explosives. © 2013 Elsevier B.V. All rights reserved.
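The RGB-based quantification described above amounts to fitting a calibration curve between channel intensity and analyte concentration, then inverting it for unknowns. A minimal sketch with hypothetical calibration values; the study's actual calibration data and channel choice are not reproduced here:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b (calibration curve)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical calibration: TNT standards (mg/L) vs. mean red-channel
# intensity of the colored product, extracted from the photographs.
conc = [0.0, 5.0, 10.0, 20.0]
red = [250.0, 220.0, 190.0, 130.0]
a, b = fit_line(conc, red)

def predict_conc(red_value):
    """Invert the calibration curve to estimate an unknown concentration."""
    return (red_value - b) / a

unknown = predict_conc(160.0)
```

The same fit residuals also give the linear range and detection limit that the study reports for its photographic method.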

  5. Ground Testing of a 10 K Sorption Cryocooler Flight Experiment (BETSCE)

    NASA Technical Reports Server (NTRS)

    Bard, S.; Wu, J.; Karlmann, P.; Cowgill, P.; Mirate, C.; Rodriguez, J.

    1994-01-01

    The Brilliant Eyes Ten-Kelvin Sorption Cryocooler Experiment (BETSCE) is a Space Shuttle side-wall-mounted flight experiment designed to demonstrate 10 K sorption cryocooler technology in a space environment. The BETSCE objectives are to: (1) provide a thorough end-to-end characterization and space performance validation of a complete, multistage, automated, closed-cycle hydride sorption cryocooler in the 10 to 30 K temperature range, (2) acquire the quantitative microgravity database required to provide confident engineering design, scaling, and optimization, (3) advance the enabling technologies and resolve integration issues, and (4) provide hardware qualification and safety verification heritage. BETSCE ground tests were the first-ever demonstration of a complete closed-cycle 10 K sorption cryocooler. Test results exceeded functional requirements. This paper summarizes functional and environmental ground test results, planned characterization tests, important development challenges that were overcome, and valuable lessons-learned.

  6. Quantitative analysis of comparative genomic hybridization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manoir, S. du; Bentz, M.; Joos, S.

    1995-01-01

    Comparative genomic hybridization (CGH) is a new molecular cytogenetic method for the detection of chromosomal imbalances. Following cohybridization of DNA prepared from a sample to be studied and control DNA to normal metaphase spreads, probes are detected via different fluorochromes. The ratio of the test and control fluorescence intensities along a chromosome reflects the relative copy number of segments of a chromosome in the test genome. Quantitative evaluation of CGH experiments is required for the determination of low copy changes, e.g., monosomy or trisomy, and for the definition of the breakpoints involved in unbalanced rearrangements. In this study, a program for quantitation of CGH preparations is presented. This program is based on the extraction of the fluorescence ratio profile along each chromosome, followed by averaging of individual profiles from several metaphase spreads. Objective parameters critical for quantitative evaluations were tested, and the criteria for selection of suitable CGH preparations are described. The granularity of the chromosome painting and the regional inhomogeneity of fluorescence intensities in metaphase spreads proved to be crucial parameters. The coefficient of variation of the ratio value for chromosomes in balanced state (CVBS) provides a general quality criterion for CGH experiments. Different cutoff levels (thresholds) of average fluorescence ratio values were compared for their specificity and sensitivity with regard to the detection of chromosomal imbalances. 27 refs., 15 figs., 1 tab.
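The core quantitation step described — averaging per-chromosome fluorescence ratio profiles across metaphase spreads, screening quality via a coefficient of variation, and applying a ratio threshold — can be sketched as follows. The profiles and the gain threshold are hypothetical; the study's exact CVBS definition and cutoff levels may differ in detail:

```python
def mean_profile_and_cv(profiles):
    """Average fluorescence ratio profiles (test/control intensity) from
    several metaphase spreads, and report the coefficient of variation
    at each chromosomal position as a quality measure."""
    n = len(profiles)
    means, cvs = [], []
    for position in zip(*profiles):       # iterate positions along the chromosome
        m = sum(position) / n
        var = sum((v - m) ** 2 for v in position) / n
        means.append(m)
        cvs.append((var ** 0.5) / m)      # CV = sd / mean
    return means, cvs

# Three hypothetical ratio profiles along one chromosome
profiles = [
    [1.0, 1.1, 1.5, 1.4],
    [1.0, 0.9, 1.6, 1.5],
    [1.1, 1.0, 1.4, 1.6],
]
means, cvs = mean_profile_and_cv(profiles)
gained = [m > 1.25 for m in means]  # illustrative gain cutoff on the averaged ratio
```

In this toy profile the last two positions exceed the cutoff, flagging a candidate copy-number gain; the trade-off between sensitivity and specificity of such cutoffs is exactly what the study evaluates.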

  7. Quantitative systems toxicology

    PubMed Central

    Bloomingdale, Peter; Housand, Conrad; Apgar, Joshua F.; Millard, Bjorn L.; Mager, Donald E.; Burke, John M.; Shah, Dhaval K.

    2017-01-01

    The overarching goal of modern drug development is to optimize therapeutic benefits while minimizing adverse effects. However, inadequate efficacy and safety concerns remain the major causes of drug attrition in clinical development. For the past 80 years, toxicity testing has consisted of evaluating the adverse effects of drugs in animals to predict human health risks. The U.S. Environmental Protection Agency recognized the need to develop innovative toxicity testing strategies and asked the National Research Council to develop a long-range vision and strategy for toxicity testing in the 21st century. The vision aims to reduce the use of animals and drug development costs through the integration of computational modeling and in vitro experimental methods that evaluate the perturbation of toxicity-related pathways. Towards this vision, collaborative quantitative systems pharmacology and toxicology (QSP/QST) modeling endeavors have been initiated amongst numerous organizations worldwide. In this article, we discuss how quantitative structure-activity relationship (QSAR), network-based, and pharmacokinetic/pharmacodynamic modeling approaches can be integrated into the framework of QST models. Additionally, we review the application of QST models to predict cardiotoxicity and hepatotoxicity of drugs throughout their development. Cell- and organ-specific QST models are likely to become an essential component of modern toxicity testing and provide a solid foundation for determining individualized therapeutic windows to improve patient safety. PMID:29308440

  8. High content screening of ToxCast compounds using Vala Sciences’ complex cell culturing systems (SOT)

    EPA Science Inventory

    US EPA’s ToxCast research program evaluates bioactivity for thousands of chemicals utilizing high-throughput screening assays to inform chemical testing decisions. Vala Sciences provides high content, multiplexed assays that utilize quantitative cell-based digital image analysis....

  9. Null Hypothesis Significance Testing and "p" Values

    ERIC Educational Resources Information Center

    Travers, Jason C.; Cook, Bryan G.; Cook, Lysandra

    2017-01-01

    "p" values are commonly reported in quantitative research, but are often misunderstood and misinterpreted by research consumers. Our aim in this article is to provide special educators with guidance for appropriately interpreting "p" values, with the broader goal of improving research consumers' understanding and interpretation…

  10. Simulation of UV atomic radiation for application in exhaust plume spectrometry

    NASA Astrophysics Data System (ADS)

    Wallace, T. L.; Powers, W. T.; Cooper, A. E.

    1993-06-01

    Quantitative analysis of exhaust plume spectral data has long been a goal of developers of advanced engine health monitoring systems which incorporate optical measurements of rocket exhaust constituents. Discussed herein is the status of present efforts to model and predict atomic radiation spectra and infer free-atom densities from emission/absorption measurements as part of the Optical Plume Anomaly Detection (OPAD) program at Marshall Space Flight Center (MSFC). A brief examination of the mathematical formalism is provided in the context of predicting radiation from the Mach disk region of the SSME exhaust flow at nominal conditions during ground level testing at MSFC. Computational results are provided for Chromium and Copper at selected transitions which indicate a strong dependence upon broadening parameter values determining the absorption-emission line shape. Representative plots of recent spectral data from the Stennis Space Center (SSC) Diagnostic Test Facility (DTF) rocket engine are presented and compared to numerical results from the present self-absorbing model; a comprehensive quantitative analysis will be reported at a later date.

  11. A Quantitative Method for Comparing the Brightness of Antibody-dye Reagents and Estimating Antibodies Bound per Cell.

    PubMed

    Kantor, Aaron B; Moore, Wayne A; Meehan, Stephen; Parks, David R

    2016-07-01

    We present a quantitative method for comparing the brightness of antibody-dye reagents and estimating antibodies bound per cell. The method is based on complementary binding of test and fill reagents to antibody capture microspheres. Several aliquots of antibody capture beads are stained with varying amounts of the test conjugate. The remaining binding sites on the beads are then filled with a second conjugate containing a different fluorophore. Finally, the fluorescence of the test conjugate compared to the fill conjugate is used to measure the relative brightness of the test conjugate. The fundamental assumption of the test-fill method is that if it takes X molecules of one test antibody to lower the fill signal by Y units, it will take the same X molecules of any other test antibody to give the same effect. We apply a quadratic fit to evaluate the test-fill signal relationship across different amounts of test reagent. If the fit is close to linear, we consider the test reagent to be suitable for quantitative evaluation of antibody binding. To calibrate the antibodies bound per bead, a PE conjugate with 1 PE molecule per antibody is used as a test reagent and the fluorescence scale is calibrated with Quantibrite PE beads. When the fluorescence per antibody molecule has been determined for a particular conjugate, that conjugate can be used for measurement of antibodies bound per cell. This provides comparisons of the brightness of different conjugates when conducted on an instrument whose statistical photoelectron (Spe) scales are known. © 2016 by John Wiley & Sons, Inc.
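The evaluation step described — fitting a quadratic to the test-fill signal relationship and checking whether it is close to linear — can be sketched as below. The bead data and the curvature tolerance are illustrative assumptions, not values from the paper:

```python
def polyfit2(xs, ys):
    """Least-squares quadratic fit y = c0 + c1*x + c2*x^2 via the
    normal equations (tiny Gaussian elimination, no external libraries)."""
    s = [sum(x ** k for x in xs) for k in range(5)]       # power sums
    A = [[s[i + j] for j in range(3)] for i in range(3)]  # normal matrix
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    for i in range(3):                                    # forward elimination
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            b[j] -= f * b[i]
    c = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                                   # back substitution
        c[i] = (b[i] - sum(A[i][j] * c[j] for j in range(i + 1, 3))) / A[i][i]
    return c

# Hypothetical bead aliquots: amount of test conjugate vs. fill signal
test_amount = [0.0, 1.0, 2.0, 3.0, 4.0]
fill_signal = [100.0, 80.0, 60.0, 40.0, 20.0]   # an ideal linear decline
c0, c1, c2 = polyfit2(test_amount, fill_signal)
near_linear = abs(c2) < 0.05 * abs(c1)  # small curvature => suitable reagent
```

A reagent whose quadratic term is negligible relative to the linear term behaves as the method's fundamental assumption requires: each added test molecule displaces a fixed amount of fill signal.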

  12. Fire Suppression Properties of Very Fine Water Mist

    DTIC Science & Technology

    2005-01-01

    with the University of Heidelberg, developed an in situ oxygen sensor based on tunable diode laser absorption spectroscopy (TDLAS) to provide absolute... oxygen number densities in the presence of mist. The TDLAS oxygen sensor provides real-time, calibration-free, quantitative oxygen ... “Determination of Molecular Oxygen Concentrations in Full-Scale Fire Suppression Tests Using TDLAS,” Proc. Combust. Inst. 29, 353-360 (2002).

  13. Patient-Centered Communication and Health Assessment with Youth

    PubMed Central

    Munro, Michelle L.; Darling-Fisher, Cynthia S.; Ronis, David L.; Villarruel, Antonia M.; Pardee, Michelle; Faleer, Hannah; Fava, Nicole M.

    2014-01-01

    Background Patient-centered communication is the hallmark of care that incorporates the perspective of patients to provide tailored care that meets their needs and desires. However, at this time there has been limited evaluation of patient-provider communication involving youth. Objectives This manuscript will report on results from secondary analysis of data obtained during a participatory research-based randomized controlled trial designed to test a sexual risk event history calendar intervention with youth to address the following research questions: (a) Based on the event history calendar’s (EHC) inclusion of contextual factors, does the EHC demonstrate improved communication outcomes (i.e., amount, satisfaction, mutuality, client involvement, client satisfaction, patient-provider interaction, and patient-centeredness) when compared to the Guidelines for Adolescent Preventive Services (GAPS) tool? and (b) How do patients and providers describe the characteristics of each tool in regards to patient-centered communication? Method This report will utilize a sequential explanatory mixed methods approach to evaluate communication. A split-plot design with one between factor (i.e., communication structure between EHC and GAPS) and one within factor (i.e., time between pretest and posttest) was used for analyses of data collected from male and female youth (n=186) and providers (n=9). Quantitative analysis of survey data evaluated changes in communication from pre-test to post-test. Qualitative data collected from open-ended questions, audio-taped visits, and exit interviews was employed to enhance interpretation of quantitative findings. Results Patient-centered communication using assessment tools (EHC and GAPS) with youth demonstrated improved communication outcomes both quantitatively and qualitatively. 
Additional analyses with subgroups of males and Arab-Americans demonstrated better post-intervention scores among the EHC group in certain aspects of communication. Qualitative results revealed that the EHC demonstrated improved outcomes in the four components of patient-centered communication: validation of the patient’s perspective; viewing the patient within context; reaching a shared understanding of needs and preferences; and helping the patient share power in the healthcare interaction. Discussion Though both tools provided a framework from which to conduct a clinical visit, the integrated time-linked assessment captured by the EHC enhanced patient-centered communication in select groups compared to GAPS. PMID:24165214

  14. Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface

    DTIC Science & Technology

    2017-02-01

COMPARATIVE EVALUATION OF QUANTITATIVE TEST METHODS FOR GASES ON A HARD SURFACE ECBC-TR-1426 Vipin Rastogi...1. INTRODUCTION Members of the U.S. Environmental...Generator 4 2.4 Experimental Design Each quantitative method was performed three times on three consecutive days. For the CD runs, three

  15. Development and evaluation of a thermochemistry concept inventory for college-level general chemistry

    NASA Astrophysics Data System (ADS)

    Wren, David A.

The research presented in this dissertation culminated in a 10-item Thermochemistry Concept Inventory (TCI). The development of the TCI can be divided into two main phases: qualitative studies and quantitative studies. Both phases focused on the primary stakeholders of the TCI, college-level general chemistry instructors and students. Each phase was designed to collect evidence for the validity of the interpretations and uses of TCI testing data. A central use of TCI testing data is to identify student conceptual misunderstandings, which are represented as incorrect options of multiple-choice TCI items. Therefore, quantitative and qualitative studies focused heavily on collecting evidence at the item-level, where important interpretations may be made by TCI users. Qualitative studies included student interviews (N = 28) and online expert surveys (N = 30). Think-aloud student interviews (N = 12) were used to identify conceptual misunderstandings held by students. Novice response process validity interviews (N = 16) helped provide information on how students interpreted and answered TCI items and were the basis of item revisions. Practicing general chemistry instructors (N = 18), or experts, defined boundaries of thermochemistry content included on the TCI. Once TCI items were in the later stages of development, an online version of the TCI was used in an expert response process validity survey (N = 12) to provide expert feedback on item content, format, and consensus on the correct answer for each item. Quantitative studies included three phases: beta testing of TCI items (N = 280), pilot testing of a 12-item TCI (N = 485), and a large data collection using a 10-item TCI (N = 1331). In addition to traditional classical test theory analysis, Rasch model analysis was also used for evaluation of testing data at the test and item level. 
The TCI was administered in both formative assessment (beta and pilot testing) and summative assessment (large data collection), with items performing well in both. One item, item K, did not have acceptable psychometric properties when the TCI was used as a quiz (summative assessment), but was retained in the final version of the TCI based on the acceptable psychometric properties displayed in pilot testing (formative assessment).
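
Item-level evaluation of the kind described above typically begins with classical test theory statistics. The sketch below is illustrative only (the response matrix and function name are hypothetical, not from the TCI study): it computes each item's difficulty (proportion correct) and discrimination (correlation of the item with the rest-score).

```python
import numpy as np

def item_statistics(responses):
    """Classical test theory item statistics for a 0/1 response matrix.

    responses: (n_students, n_items) array of 0/1 item scores.
    Returns per-item difficulty (proportion correct) and discrimination
    (correlation between the item score and the rest-score, i.e. the
    total score excluding that item).
    """
    responses = np.asarray(responses, dtype=float)
    difficulty = responses.mean(axis=0)
    discrimination = []
    for j in range(responses.shape[1]):
        # rest-score: total score with item j removed, to avoid
        # inflating the correlation with the item's own contribution
        rest = responses.sum(axis=1) - responses[:, j]
        discrimination.append(np.corrcoef(responses[:, j], rest)[0, 1])
    return difficulty, np.array(discrimination)
```

Items with very low discrimination (near zero or negative) are the usual candidates for revision or removal, analogous to item K's fate being decided on psychometric grounds.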

  16. The use of a battery of tracking tests in the quantitative evaluation of neurological function

    NASA Technical Reports Server (NTRS)

    Repa, B. S.; Albers, J. W.; Potvin, A. R.; Tourtellotte, W. W.

    1972-01-01

A tracking test battery has been applied in a drug trial designed to compare the efficacy of L-DOPA and amantadine to that of L-DOPA and placebo in the treatment of 28 patients with Parkinson's disease. The drug trial provided an ideal opportunity for objectively evaluating the usefulness of tracking tests in assessing changes in neurologic function. Evaluating changes in patient performance resulting from disease progression and controlled clinical trials is of great importance in establishing effective treatment programs.

  17. Benefits of Exercise for the Quality of Life of Drug-Dependent Patients.

    PubMed

    Giménez-Meseguer, Jorge; Tortosa-Martínez, Juan; de los Remedios Fernández-Valenciano, María

    2015-01-01

    This study combined quantitative and qualitative research methods to evaluate quality-of-life changes in drug-dependent patients after participation in a group-based exercise program. Quality of life (SF-36) and physical fitness (six-minute Walk Test, Timed Get Up and Go Test, and Chair Stand Test) were quantitatively determined in a group (n=37) of drug-dependent patients before and after a 12-week group exercise program (n=18) or routine care (n=19). Additionally, in-depth interviews were conducted at the end of the program with a subsample of 11 participants from the exercise group. Quantitative results showed improvements in fitness and different aspects of quality of life, such as physical function, mental health, vitality, social function, and general health perception. Qualitative results showed specific physical benefits (decreased injuries and muscle pain, decreased weight, and increased vitality with improvement in activities of daily living), psychological benefits (forgetting about everyday problems, improved mood, decreased stress and anxiety), social benefits, and a reduction in craving. The results of this study provide insight into the importance of exercise for the quality of life and recovery process of drug-dependent patients.

  18. On-line analysis capabilities developed to support the AFW wind-tunnel tests

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D.; Hoadley, Sherwood T.; Mcgraw, Sandra M.

    1992-01-01

    A variety of on-line analysis tools were developed to support two active flexible wing (AFW) wind-tunnel tests. These tools were developed to verify control law execution, to satisfy analysis requirements of the control law designers, to provide measures of system stability in a real-time environment, and to provide project managers with a quantitative measure of controller performance. Descriptions and purposes of the developed capabilities are presented along with examples. Procedures for saving and transferring data for near real-time analysis, and descriptions of the corresponding data interface programs are also presented. The on-line analysis tools worked well before, during, and after the wind tunnel test and proved to be a vital and important part of the entire test effort.

  19. Quantitative nephelometry

    MedlinePlus

Quantitative nephelometry is a lab test to quickly and ...

  20. A versatile quantitation platform based on platinum nanoparticles incorporated volumetric bar-chart chip for highly sensitive assays.

    PubMed

    Wang, Yuzhen; Zhu, Guixian; Qi, Wenjin; Li, Ying; Song, Yujun

    2016-11-15

The platinum nanoparticles incorporated volumetric bar-chart chip (PtNPs-V-Chip) can be used for point-of-care tests by providing a quantitative, visual readout without any assistance from instruments, data processing, or graphic plotting. To improve the sensitivity of the PtNPs-V-Chip, hybridization chain reaction was employed in this quantitation platform for highly sensitive assays that can detect as little as 16 pM Ebola virus DNA, 0.01 ng/mL carcinoembryonic antigen (CEA), and 10 HER2-expressing cancer cells. With this amplification strategy, a 100-fold decrease in detection limit was achieved for DNA by increasing the number of platinum nanoparticle catalysts per captured analyte. The quantitation platform can also distinguish a single-base mismatch in DNA hybridization and resolve the concentration threshold of CEA. The new strategy lays the foundation for this quantitation platform to be applied in forensic analysis, biothreat detection, clinical diagnostics, and drug screening. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. A comparison of the Sensititre® MYCOTB panel and the agar proportion method for the susceptibility testing of Mycobacterium tuberculosis.

    PubMed

    Abuali, M M; Katariwala, R; LaBombardi, V J

    2012-05-01

The agar proportion method (APM) for determining Mycobacterium tuberculosis susceptibilities is a qualitative method that requires 21 days to produce results. The Sensititre method allows for a quantitative assessment. Our objective was to compare the accuracy, time to results, and ease of use of the Sensititre method to the APM. 7H10 plates in the APM and 96-well microtiter dry MYCOTB panels containing 12 antibiotics at full dilution ranges in the Sensititre method were inoculated with M. tuberculosis and read for colony growth. Thirty-seven clinical isolates were tested using both methods, and 26 challenge strains of blinded susceptibilities were tested using the Sensititre method only. The Sensititre method displayed 99.3% concordance with the APM. The APM provided reliable results on day 21, whereas the Sensititre method displayed consistent results by day 10. The Sensititre method provides a more rapid, quantitative, and efficient means of testing both first- and second-line drugs when compared to the gold standard. It will give clinicians a sense of the degree of susceptibility, thus guiding the therapeutic decision-making process. Furthermore, the microwell plate format, which requires no instrumentation, will allow its use in resource-poor settings.

  2. Quantitative imaging test approval and biomarker qualification: interrelated but distinct activities.

    PubMed

    Buckler, Andrew J; Bresolin, Linda; Dunnick, N Reed; Sullivan, Daniel C; Aerts, Hugo J W L; Bendriem, Bernard; Bendtsen, Claus; Boellaard, Ronald; Boone, John M; Cole, Patricia E; Conklin, James J; Dorfman, Gary S; Douglas, Pamela S; Eidsaunet, Willy; Elsinger, Cathy; Frank, Richard A; Gatsonis, Constantine; Giger, Maryellen L; Gupta, Sandeep N; Gustafson, David; Hoekstra, Otto S; Jackson, Edward F; Karam, Lisa; Kelloff, Gary J; Kinahan, Paul E; McLennan, Geoffrey; Miller, Colin G; Mozley, P David; Muller, Keith E; Patt, Rick; Raunig, David; Rosen, Mark; Rupani, Haren; Schwartz, Lawrence H; Siegel, Barry A; Sorensen, A Gregory; Wahl, Richard L; Waterton, John C; Wolf, Walter; Zahlmann, Gudrun; Zimmerman, Brian

    2011-06-01

    Quantitative imaging biomarkers could speed the development of new treatments for unmet medical needs and improve routine clinical care. However, it is not clear how the various regulatory and nonregulatory (eg, reimbursement) processes (often referred to as pathways) relate, nor is it clear which data need to be collected to support these different pathways most efficiently, given the time- and cost-intensive nature of doing so. The purpose of this article is to describe current thinking regarding these pathways emerging from diverse stakeholders interested and active in the definition, validation, and qualification of quantitative imaging biomarkers and to propose processes to facilitate the development and use of quantitative imaging biomarkers. A flexible framework is described that may be adapted for each imaging application, providing mechanisms that can be used to develop, assess, and evaluate relevant biomarkers. From this framework, processes can be mapped that would be applicable to both imaging product development and to quantitative imaging biomarker development aimed at increasing the effectiveness and availability of quantitative imaging. http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.10100800/-/DC1. RSNA, 2011

  3. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    PubMed

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on the technology of digital image processing using three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As the region-based shape features of a grayscale image, Zernike moments with inherently invariance property were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R(2)) for training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of established models. The analytical results suggest that the Zernike moment selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
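
As a rough illustration of the region-based shape features the paper employs, the sketch below computes the magnitude of one Zernike moment of a grayscale image over the unit disk; the magnitude is rotation-invariant, which is the "inherent invariance property" mentioned. The function name and sample usage are hypothetical; in the paper's application the "image" would be a 3D HPLC-DAD spectrum rendered as a grayscale array.

```python
import numpy as np
from math import factorial

def zernike_moment(image, n, m):
    """Magnitude of the Zernike moment A_nm of a square grayscale image.

    Requires n >= |m| with n - |m| even. The image is mapped onto the
    unit disk; |A_nm| is invariant under rotation of the image, which is
    why Zernike moments are useful shape features.
    """
    assert n >= abs(m) and (n - abs(m)) % 2 == 0
    N = image.shape[0]
    # map pixel-centre coordinates onto [-1, 1] x [-1, 1]
    coords = (np.arange(N) - (N - 1) / 2) / ((N - 1) / 2)
    x, y = np.meshgrid(coords, coords)
    rho = np.hypot(x, y)
    theta = np.arctan2(y, x)
    inside = rho <= 1.0  # restrict the sum to the unit disk
    # radial polynomial R_n^|m|(rho)
    R = np.zeros_like(rho)
    for s in range((n - abs(m)) // 2 + 1):
        c = ((-1) ** s * factorial(n - s)
             / (factorial(s)
                * factorial((n + abs(m)) // 2 - s)
                * factorial((n - abs(m)) // 2 - s)))
        R += c * rho ** (n - 2 * s)
    V = R * np.exp(-1j * m * theta)  # complex Zernike basis function
    A = (n + 1) / np.pi * np.sum(image[inside] * V[inside])
    return abs(A)
```

Because only the magnitude is kept, rotating the input image leaves the feature value unchanged, so the moments can feed directly into the linear quantitative models described above.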

  4. 42 CFR 493.923 - Syphilis serology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... a laboratory's response for qualitative and quantitative syphilis tests, the program must compare... under paragraphs (b)(2) and (b)(3) of this section. (2) For quantitative syphilis tests, the program... quantitative syphilis serology tests is the target value ±1 dilution. (3) The criterion for acceptable...

  5. 42 CFR 493.923 - Syphilis serology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... a laboratory's response for qualitative and quantitative syphilis tests, the program must compare... under paragraphs (b)(2) and (b)(3) of this section. (2) For quantitative syphilis tests, the program... quantitative syphilis serology tests is the target value ±1 dilution. (3) The criterion for acceptable...

  6. 42 CFR 493.923 - Syphilis serology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... a laboratory's response for qualitative and quantitative syphilis tests, the program must compare... under paragraphs (b)(2) and (b)(3) of this section. (2) For quantitative syphilis tests, the program... quantitative syphilis serology tests is the target value ±1 dilution. (3) The criterion for acceptable...

  7. 42 CFR 493.923 - Syphilis serology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... a laboratory's response for qualitative and quantitative syphilis tests, the program must compare... under paragraphs (b)(2) and (b)(3) of this section. (2) For quantitative syphilis tests, the program... quantitative syphilis serology tests is the target value ±1 dilution. (3) The criterion for acceptable...

  8. 42 CFR 493.923 - Syphilis serology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... a laboratory's response for qualitative and quantitative syphilis tests, the program must compare... under paragraphs (b)(2) and (b)(3) of this section. (2) For quantitative syphilis tests, the program... quantitative syphilis serology tests is the target value ±1 dilution. (3) The criterion for acceptable...

  9. Test of Achievement in Quantitative Economics for Secondary Schools: Construction and Validation Using Item Response Theory

    ERIC Educational Resources Information Center

    Eleje, Lydia I.; Esomonu, Nkechi P. M.

    2018-01-01

A test to measure achievement in quantitative economics among secondary school students was developed and validated in this study. The test is made up of 20 multiple-choice items constructed based on quantitative economics sub-skills. Six research questions guided the study. Preliminary validation was done by two experienced teachers in…
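
Item response theory validation of this kind rests on a logistic item model. Below is a minimal sketch of the two-parameter logistic (2PL) item characteristic curve with hypothetical parameters; the abstract does not state which IRT model the authors used, so this is illustrative only.

```python
import math

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic (2PL) IRT model.

    Probability that an examinee of ability theta answers correctly an
    item with discrimination a and difficulty b. At theta == b the
    probability is exactly 0.5; larger a makes the curve steeper.
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```

Fitting a and b for each of the 20 items (and checking fit) is what "construction and validation using item response theory" amounts to in practice.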

  10. Preliminary research on eddy current bobbin quantitative test for heat exchange tube in nuclear power plant

    NASA Astrophysics Data System (ADS)

    Qi, Pan; Shao, Wenbin; Liao, Shusheng

    2016-02-01

For research on quantitative defect detection in the heat transfer tubes of nuclear power plants (NPPs), two lines of work were carried out, with cracks as the main research object. (1) Optimization of calibration tube production. First, ASME, RSEM, and in-house crack calibration tubes were used to quantitatively estimate defect depths on other designed crack test tubes, and the calibration tube yielding the more accurate quantitative results was identified. On that basis, a weighting analysis of the factors influencing quantitative crack depth measurement, such as crack orientation, length, and volume, can be undertaken to optimize the manufacturing technology of calibration tubes. (2) Optimization of crack depth quantification. A neural network model with multiple calibration curves, adopted to refine the measured depths of natural cracks in in-service tubes, showed a preliminary ability to improve quantitative accuracy.
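
In eddy current bobbin testing, depth quantification typically works by reading an unknown crack's signal (for example, its phase angle) against a calibration curve built from notches of known depth in a calibration tube. The sketch below is purely illustrative; the calibration numbers and names are hypothetical, not taken from this paper.

```python
import numpy as np

# Hypothetical calibration data from notches of known depth in a
# calibration tube: eddy-current signal phase angle (degrees) vs.
# notch depth (% of tube wall thickness).
phase_deg = np.array([35.0, 70.0, 110.0, 150.0])   # measured phase
depth_pct = np.array([100.0, 80.0, 60.0, 40.0])    # known notch depth

def depth_from_phase(phase):
    """Estimate defect depth (% wall) from signal phase by linear
    interpolation along the calibration curve. A sketch only: real
    analysis software fits a smooth curve and applies probe- and
    frequency-specific corrections."""
    # np.interp requires ascending x, so sort the calibration points
    order = np.argsort(phase_deg)
    return float(np.interp(phase, phase_deg[order], depth_pct[order]))
```

The paper's neural-network approach can be read as replacing this single interpolated curve with a model trained on several calibration curves at once.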

  11. Noninvasive identification of the total peripheral resistance baroreflex

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ramakrishna; Toska, Karin; Cohen, Richard J.

    2003-01-01

    We propose two identification algorithms for quantitating the total peripheral resistance (TPR) baroreflex, an important contributor to short-term arterial blood pressure (ABP) regulation. Each algorithm analyzes beat-to-beat fluctuations in ABP and cardiac output, which may both be obtained noninvasively in humans. For a theoretical evaluation, we applied both algorithms to a realistic cardiovascular model. The results contrasted with only one of the algorithms proving to be reliable. This algorithm was able to track changes in the static gains of both the arterial and cardiopulmonary TPR baroreflex. We then applied both algorithms to a preliminary set of human data and obtained contrasting results much like those obtained from the cardiovascular model, thereby making the theoretical evaluation results more meaningful. This study suggests that, with experimental testing, the reliable identification algorithm may provide a powerful, noninvasive means for quantitating the TPR baroreflex. This study also provides an example of the role that models can play in the development and initial evaluation of algorithms aimed at quantitating important physiological mechanisms.
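
For orientation, the quantity being identified derives from the steady-flow relation TPR ≈ MAP / CO (venous pressure neglected, as is common in noninvasive estimates). The minimal sketch below computes beat-to-beat TPR from hypothetical data; the paper's algorithms identify baroreflex dynamics on top of such fluctuations, which this sketch does not attempt.

```python
def tpr_series(map_mmhg, co_lpm):
    """Beat-to-beat total peripheral resistance estimates.

    map_mmhg: mean arterial pressure per beat (mmHg)
    co_lpm:   cardiac output per beat (L/min)
    Returns TPR in mmHg*min/L using TPR ~ MAP / CO, neglecting
    venous pressure.
    """
    return [p / q for p, q in zip(map_mmhg, co_lpm)]
```

Baroreflex identification then asks how these beat-to-beat TPR values respond to preceding pressure fluctuations.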

  12. A dual memory theory of the testing effect.

    PubMed

    Rickard, Timothy C; Pan, Steven C

    2017-06-05

    A new theoretical framework for the testing effect-the finding that retrieval practice is usually more effective for learning than are other strategies-is proposed, the empirically supported tenet of which is that separate memories form as a consequence of study and test events. A simplest case quantitative model is derived from that framework for the case of cued recall. With no free parameters, that model predicts both proportion correct in the test condition and the magnitude of the testing effect across 10 experiments conducted in our laboratory, experiments that varied with respect to material type, retention interval, and performance in the restudy condition. The model also provides the first quantitative accounts of (a) the testing effect as a function of performance in the restudy condition, (b) the upper bound magnitude of the testing effect, (c) the effect of correct answer feedback, (d) the testing effect as a function of retention interval for the cases of feedback and no feedback, and (e) the effect of prior learning method on subsequent learning through testing. Candidate accounts of several other core phenomena in the literature, including test-potentiated learning, recognition versus cued recall training effects, cued versus free recall final test effects, and other select transfer effects, are also proposed. Future prospects and relations to other theories are discussed.

  13. Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?

    PubMed

    Gizak, Agnieszka; Rakus, Dariusz

    2016-01-11

    Molecular and cellular biology methodology is traditionally based on the reasoning called "the mechanistic explanation". In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge about all parameters of a studied system. However, in practice, due to the systems' complexity, this requirement is rarely, if ever, accomplished. Typically, it is limited to a quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of expression/post-translational modifications changes within selected proteins. A quantitative proteomics approach gives a possibility of quantitative characterization of the entire proteome of a biological system, in the context of the titer of proteins as well as their post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.

  14. ACL Return to Sport Guidelines and Criteria.

    PubMed

    Davies, George J; McCarty, Eric; Provencher, Matthew; Manske, Robert C

    2017-09-01

    Because of the epidemiological incidence of anterior cruciate ligament (ACL) injuries, the high reinjury rates that occur when returning back to sports, the actual number of patients that return to the same premorbid level of competition, the high incidence of osteoarthritis at 5-10-year follow-ups, and the effects on the long-term health of the knee and the quality of life for the patient, individualizing the return to sports after ACL reconstruction (ACL-R) is critical. However, one of the challenging but unsolved dilemmas is what criteria and clinical decision making should be used to return an athlete back to sports following an ACL-R. This article describes an example of a functional testing algorithm (FTA) as one method for clinical decision making based on quantitative and qualitative testing and assessment utilized to make informed decisions to return an athlete to their sports safely and without compromised performance. The methods were a review of the best current evidence to support a FTA. In order to evaluate all the complicated domains of the clinical decision making for individualizing the return to sports after ACL-R, numerous assessments need to be performed including the biopsychosocial concepts, impairment testing, strength and power testing, functional testing, and patient-reported outcomes (PROs). The optimum criteria to use for individualizing the return to sports after ACL-R remain elusive. However, since this decision needs to be made on a regular basis with the safety and performance factors of the patient involved, this FTA provides one method of quantitatively and qualitatively making the decisions. Admittedly, there is no predictive validity of this system, but it does provide practical guidelines to facilitate the clinical decision making process for return to sports. 
The clinical decision to return an athlete back into competition has significant implications ranging from the safety of the athlete, to performance factors and actual litigation issues. By using a multifactorial FTA, such as the one described, provides quantitative and qualitatively criteria to make an informed decision in the best interests of the athlete.

  15. QR-STEM: Energy and Environment as a Context for Improving QR and STEM Understandings of 6-12 Grade Teachers II. The Quantitative Reasoning

    NASA Astrophysics Data System (ADS)

    Mayes, R.; Lyford, M. E.; Myers, J. D.

    2009-12-01

    The Quantitative Reasoning in STEM (QR STEM) project is a state level Mathematics and Science Partnership Project (MSP) with a focus on the mathematics and statistics that underlies the understanding of complex global scientific issues. This session is a companion session to the QR STEM: The Science presentation. The focus of this session is the quantitative reasoning aspects of the project. As students move from understandings that range from local to global in perspective on issues of energy and environment, there is a significant increase in the need for mathematical and statistical conceptual understanding. These understandings must be accessible to the students within the scientific context, requiring the special understandings that are endemic within quantitative reasoning. The QR STEM project brings together interdisciplinary teams of higher education faculty and middle/high school teachers to explore complex problems in energy and environment. The disciplines include life sciences, physics, chemistry, earth science, statistics, and mathematics. These interdisciplinary teams develop open ended performance tasks to implement in the classroom, based on scientific concepts that underpin energy and environment. Quantitative reasoning is broken down into three components: Quantitative Literacy, Quantitative Interpretation, and Quantitative Modeling. Quantitative Literacy is composed of arithmetic concepts such as proportional reasoning, numeracy, and descriptive statistics. Quantitative Interpretation includes algebraic and geometric concepts that underlie the ability to interpret a model of natural phenomena which is provided for the student. This model may be a table, graph, or equation from which the student is to make predictions or identify trends, or from which they would use statistics to explore correlations or patterns in data. 
Quantitative modeling is the ability to develop the model from data, including the ability to test hypothesis using statistical procedures. We use the term model very broadly, so it includes visual models such as box models, as well as best fit equation models and hypothesis testing. One of the powerful outcomes of the project is the conversation which takes place between science teachers and mathematics teachers. First they realize that though they are teaching concepts that cross their disciplines, the barrier of scientific language within their subjects restricts students from applying the concepts across subjects. Second the mathematics teachers discover the context of science as a means of providing real world situations that engage students in the utility of mathematics as a tool for solving problems. Third the science teachers discover the barrier to understanding science that is presented by poor quantitative reasoning ability. Finally the students are engaged in exploring energy and environment in a manner which exposes the importance of seeing a problem from multiple interdisciplinary perspectives. The outcome is a democratic citizen capable of making informed decisions, and perhaps a future scientist.

  16. Steady-state and transient operation of a heat-pipe radiator system

    NASA Technical Reports Server (NTRS)

    Sellers, J. P.

    1974-01-01

    Data obtained on a VCHP heat-pipe radiator system tested in a vacuum environment were studied. Analyses and interpretation of the steady-state results are presented along with an initial analysis of some of the transient data. Particular emphasis was placed on quantitative comparisons of the experimental data with computer model simulations. The results of the study provide a better understanding of the system but do not provide a complete explanation for the observed low VCHP performance and the relatively flat radiator panel temperature distribution. The results of the study also suggest hardware, software, and testing improvements.

  17. Quantitative cleaning characterization of a lithium-fluoride ion diode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menge, P.R.; Cuneo, M.E.

    An ion source cleaning testbed was created to test plasma-cleaning techniques, and to provide quantitative data on plasma-cleaning protocols prior to implementation on the SABRE accelerator. The testbed was designed to resolve issues regarding the quantity of contaminants absorbed by the anode source (LiF), and the best cleaning methodology. A test chamber was devised containing a duplicate of the SABRE diode. Radio-frequency (RF) power was fed to the anode, which was isolated from ground and thus served as the plasma discharge electrode. RF plasma discharges in 1--3 mtorr of Ar with 10% O{sub 2} were found to provide the bestmore » cleaning of the LiF surface. X-ray photoelectron spectroscopy (XPS) showed that the LiF could accrue dozens of monolayers of carbon just by sitting in a 2 {times} 10{sup {minus}5} vacuum for 24 h. Tests of various discharge cleaning protocols indicated that 15 min of an Ar/O{sub 2} discharge was sufficient to reduce this initial 13--45 monolayers of carbon impurities to 2--4 monolayers. Rapid recontamination of the LiF was also observed. Up to ten monolayers of carbon returned in 2 min after termination of the plasma discharge and subsequent pumping back to the 10{sup {minus}5} torr range. Heating of the LiF also was found to provide anode cleaning. Application of heating combined with plasma cleaning provided the highest cleaning rates.« less

  18. Measuring landscape esthetics: the scenic beauty estimation method

    Treesearch

    Terry C. Daniel; Ron S. Boster

    1976-01-01

    The Scenic Beauty Estimation Method (SBE) provides quantitative measures of esthetic preferences for alternative wildland management systems. Extensive experimentation and testing with user, interest, and professional groups validated the method. SBE shows promise as an efficient and objective means for assessing the scenic beauty of public forests and wildlands, and...

  19. Tri-Squared Mean Cross Comparative Analysis: An Advanced Post Hoc Qualitative and Quantitative Metric for a More In-Depth Examination of the Initial Research Outcomes of the Tri-Square Test

    ERIC Educational Resources Information Center

    Osler, James Edward

    2013-01-01

    This monograph provides an epistemological rational for the design of an advanced novel analysis metric. The metric is designed to analyze the outcomes of the Tri-Squared Test. This methodology is referred to as: "Tri-Squared Mean Cross Comparative Analysis" (given the acronym TSMCCA). Tri-Squared Mean Cross Comparative Analysis involves…

  20. TH-A-207B-02: QIBA Ultrasound Elasticity Imaging System Biomarker Qualification and User Testing of Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garra, B.

    Imaging of tissue elastic properties is a relatively new and powerful approach to one of the oldest and most important diagnostic tools. Imaging of shear wave speed with ultrasound is has been added to most high-end ultrasound systems. Understanding this exciting imaging mode aiding its most effective use in medicine can be a rewarding effort for medical physicists and other medical imaging and treatment professionals. Assuring consistent, quantitative measurements across the many ultrasound systems in a typical imaging department will constitute a major step toward realizing the great potential of this technique and other quantitative imaging. This session will targetmore » these two goals with two presentations. A. Basics and Current Implementations of Ultrasound Imaging of Shear Wave Speed and Elasticity - Shigao Chen, Ph.D. Learning objectives-To understand: Introduction: Importance of tissue elasticity measurement Strain vs. shear wave elastography (SWE), beneficial features of SWE The link between shear wave speed and material properties, influence of viscosity Generation of shear waves External vibration (Fibroscan) ultrasound radiation force Point push Supersonic push (Aixplorer) Comb push (GE Logiq E9) Detection of shear waves Motion detection from pulse-echo ultrasound Importance of frame rate for shear wave imaging Plane wave imaging detection How to achieve high effective frame rate using line-by-line scanners Shear wave speed calculation Time to peak Random sample consensus (RANSAC) Cross correlation Sources of bias and variation in SWE Tissue viscosity Transducer compression or internal pressure of organ Reflection of shear waves at boundaries B. Elasticity Imaging System Biomarker Qualification and User Testing of Systems – Brian Garra, M.D. 
Learning objectives-To understand: Goals Review the need for quantitative medical imaging Provide examples of quantitative imaging biomarkers Acquaint the participant with the purpose of the RSNA Quantitative Imaging Biomarker Alliance and the need for such an organization Review the QIBA process for creating a quantitative biomarker Summarize steps needed to verify adherence of site, operators, and imaging systems to a QIBA profile Underlying Premise and Assumptions Objective, quantifiable results are needed to enhance the value of diagnostic imaging in clinical practice Reasons for quantification Evidence based medicine requires objective, not subjective observer data Computerized decision support tools (eg CAD) generally require quantitative input. Quantitative, reproducible measures are more easily used to develop personalized molecular medical diagnostic and treatment systems What is quantitative imaging? Definition from Imaging Metrology Workshop The Quantitative Imaging Biomarker Alliance Formation 2008 Mission Structure Example Imaging Biomarkers Being Explored Biomarker Selection Groundwork Draft Protocol for imaging and data evaluation QIBA Profile Drafting Equipment and Site Validation Technical Clinical Site and Equipment QA and Compliance Checking Ultrasound Elasticity Estimation Biomarker US Elasticity Estimation Background Current Status and Problems Biomarker Selection-process and outcome US SWS for Liver Fibrosis Biomarker Work Groundwork Literature search and analysis results Phase I phantom testing-Elastic phantoms Phase II phantom testing-Viscoelastic phantoms Digital Simulated Data Protocol and Profile Drafting Protocol: based on UPICT and existing literature and standards bodies protocols Profile-Current claims, Manufacturer specific appendices What comes after the profile Profile Validation Technical validation Clinical validation QA and Compliance Possible approaches Site Operator testing Site protocol re-evaluation Imaging system 
Manufacturer testing and attestation User acceptance testing and periodic QA Phantom Tests Digital Phantom Based Testing Standard QA Testing Remediation Schemes Profile Evolution Towards additional applications Towards higher accuracy and precision Supported in part by NIH contract HHSN268201300071C from NIBIB. Collaboration with GE Global Research, no personal support.; S. Chen, Some technologies described in this presentation have been licensed. Mayo Clinic and Dr. Chen have financial interests these technologies.« less

  1. Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advancing Molecular Imaging Tools.

    PubMed

    Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe

    2018-01-01

    Processing and interpretation of biological images may provide invaluable insights on complex, living systems because images capture the overall dynamics as a "whole." Therefore, "extraction" of key, quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach in understanding living objects. Molecular imaging tools for system biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview on advances in the computational technology and different instrumentations focused on molecular image processing and analysis. Quantitative data analysis through various open source software and algorithmic protocols will provide a novel approach for modeling the experimental research program. Besides this, we also highlight the predictable future trends regarding methods for automatically analyzing biological data. Such tools will be very useful to understand the detailed biological and mathematical expressions under in-silico system biology processes with modeling properties.

  2. Technique for quantitative RT-PCR analysis directly from single muscle fibers.

    PubMed

    Wacker, Michael J; Tehel, Michelle M; Gallagher, Philip M

    2008-07-01

    The use of single-cell quantitative RT-PCR has greatly aided the study of gene expression in fields such as muscle physiology. For this study, we hypothesized that single muscle fibers from a biopsy can be placed directly into the reverse transcription buffer and that gene expression data can be obtained without having to first extract the RNA. To test this hypothesis, biopsies were taken from the vastus lateralis of five male subjects. Single muscle fibers were isolated and underwent RNA isolation (technique 1) or placed directly into reverse transcription buffer (technique 2). After cDNA conversion, individual fiber cDNA was pooled and quantitative PCR was performed using primer-probes for beta(2)-microglobulin, glyceraldehyde-3-phosphate dehydrogenase, insulin-like growth factor I receptor, and glucose transporter subtype 4. The no RNA extraction method provided similar quantitative PCR data as that of the RNA extraction method. A third technique was also tested in which we used one-quarter of an individual fiber's cDNA for PCR (not pooled) and the average coefficient of variation between fibers was <8% (cycle threshold value) for all genes studied. The no RNA extraction technique was tested on isolated muscle fibers using a gene known to increase after exercise (pyruvate dehydrogenase kinase 4). We observed a 13.9-fold change in expression after resistance exercise, which is consistent with what has been previously observed. These results demonstrate a successful method for gene expression analysis directly from single muscle fibers.

  3. Quantitative Protein Topography Analysis and High-Resolution Structure Prediction Using Hydroxyl Radical Labeling and Tandem-Ion Mass Spectrometry (MS)*

    PubMed Central

    Kaur, Parminder; Kiselar, Janna; Yang, Sichun; Chance, Mark R.

    2015-01-01

    Hydroxyl radical footprinting based MS for protein structure assessment has the goal of understanding ligand induced conformational changes and macromolecular interactions, for example, protein tertiary and quaternary structure, but the structural resolution provided by typical peptide-level quantification is limiting. In this work, we present experimental strategies using tandem-MS fragmentation to increase the spatial resolution of the technique to the single residue level to provide a high precision tool for molecular biophysics research. Overall, in this study we demonstrated an eightfold increase in structural resolution compared with peptide level assessments. In addition, to provide a quantitative analysis of residue based solvent accessibility and protein topography as a basis for high-resolution structure prediction; we illustrate strategies of data transformation using the relative reactivity of side chains as a normalization strategy and predict side-chain surface area from the footprinting data. We tested the methods by examination of Ca+2-calmodulin showing highly significant correlations between surface area and side-chain contact predictions for individual side chains and the crystal structure. Tandem ion based hydroxyl radical footprinting-MS provides quantitative high-resolution protein topology information in solution that can fill existing gaps in structure determination for large proteins and macromolecular complexes. PMID:25687570

  4. A life prediction methodology for encapsulated solar cells

    NASA Technical Reports Server (NTRS)

    Coulbert, C. D.

    1978-01-01

    This paper presents an approach to the development of a life prediction methodology for encapsulated solar cells which are intended to operate for twenty years or more in a terrestrial environment. Such a methodology, or solar cell life prediction model, requires the development of quantitative intermediate relationships between local environmental stress parameters and the basic chemical mechanisms of encapsulant aging leading to solar cell failures. The use of accelerated/abbreviated testing to develop these intermediate relationships and in revealing failure modes is discussed. Current field and demonstration tests of solar cell arrays and the present laboratory tests to qualify solar module designs provide very little data applicable to predicting the long-term performance of encapsulated solar cells. An approach to enhancing the value of such field tests to provide data for life prediction is described.

  5. A novel quantified bitterness evaluation model for traditional Chinese herbs based on an animal ethology principle.

    PubMed

    Han, Xue; Jiang, Hong; Han, Li; Xiong, Xi; He, Yanan; Fu, Chaomei; Xu, Runchun; Zhang, Dingkun; Lin, Junzhi; Yang, Ming

    2018-03-01

    Traditional Chinese herbs (TCH) are currently gaining attention in disease prevention and health care plans. However, their general bitter taste hinders their use. Despite the development of a variety of taste evaluation methods, it is still a major challenge to establish a quantitative detection technique that is objective, authentic and sensitive. Based on the two-bottle preference test (TBP), we proposed a novel quantitative strategy using a standardized animal test and a unified quantitative benchmark. To reduce the difference of results, the methodology of TBP was optimized. The relationship between the concentration of quinine and animal preference index (PI) was obtained. Then the PI of TCH was measured through TBP, and bitterness results were converted into a unified numerical system using the relationship of concentration and PI. To verify the authenticity and sensitivity of quantified results, human sensory testing and electronic tongue testing were applied. The quantified results showed a good discrimination ability. For example, the bitterness of Coptidis Rhizoma was equal to 0.0579 mg/mL quinine, and Nelumbinis Folium was equal to 0.0001 mg/mL. The validation results proved that the new assessment method for TCH was objective and reliable. In conclusion, this study provides an option for the quantification of bitterness and the evaluation of taste masking effects.

  6. Methods for the field evaluation of quantitative G6PD diagnostics: a review.

    PubMed

    Ley, Benedikt; Bancone, Germana; von Seidlein, Lorenz; Thriemer, Kamala; Richards, Jack S; Domingo, Gonzalo J; Price, Ric N

    2017-09-11

    Individuals with glucose-6-phosphate dehydrogenase (G6PD) deficiency are at risk of severe haemolysis following the administration of 8-aminoquinoline compounds. Primaquine is the only widely available 8-aminoquinoline for the radical cure of Plasmodium vivax. Tafenoquine is under development with the potential to simplify treatment regimens, but point-of-care (PoC) tests will be needed to provide quantitative measurement of G6PD activity prior to its administration. There is currently a lack of appropriate G6PD PoC tests, but a number of new tests are in development and are likely to enter the market in the coming years. As these are implemented, they will need to be validated in field studies. This article outlines the technical details for the field evaluation of novel quantitative G6PD diagnostics such as sample handling, reference testing and statistical analysis. Field evaluation is based on the comparison of paired samples, including one sample tested by the new assay at point of care and one sample tested by the gold-standard reference method, UV spectrophotometry in an established laboratory. Samples can be collected as capillary or venous blood; the existing literature suggests that potential differences in capillary or venous blood are unlikely to affect results substantially. The collection and storage of samples is critical to ensure preservation of enzyme activity, it is recommended that samples are stored at 4 °C and testing occurs within 4 days of collection. Test results can be visually presented as scatter plot, Bland-Altman plot, and a histogram of the G6PD activity distribution of the study population. Calculating the adjusted male median allows categorizing results according to G6PD activity to calculate standard performance indicators and to perform receiver operating characteristic (ROC) analysis.

  7. Vibroacoustic test plan evaluation: Parameter variation study

    NASA Technical Reports Server (NTRS)

    Stahle, C. V.; Gongloef, H. R.

    1976-01-01

    Statistical decision models are shown to provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology developed provides a major step toward the development of a realistic tool to quantitatively tailor test programs to specific payloads. Testing is considered at the no test, component, subassembly, or system level of assembly. Component redundancy and partial loss of flight data are considered. Most and probabilistic costs are considered, and incipient failures resulting from ground tests are treated. Optimums defining both component and assembly test levels are indicated for the modified test plans considered. modeling simplifications must be considered in interpreting the results relative to a particular payload. New parameters introduced were a no test option, flight by flight failure probabilities, and a cost to design components for higher vibration requirements. Parameters varied were the shuttle payload bay internal acoustic environment, the STS launch cost, the component retest/repair cost, and the amount of redundancy in the housekeeping section of the payload reliability model.

  8. Quantitative PCR for HTLV-1 provirus in adult T-cell leukemia/lymphoma using paraffin tumor sections.

    PubMed

    Kato, Junki; Masaki, Ayako; Fujii, Keiichiro; Takino, Hisashi; Murase, Takayuki; Yonekura, Kentaro; Utsunomiya, Atae; Ishida, Takashi; Iida, Shinsuke; Inagaki, Hiroshi

    2016-11-01

    Detection of HTLV-1 provirus using paraffin tumor sections may assist the diagnosis of adult T-cell leukemia/lymphoma (ATLL). For the detection, non-quantitative PCR assay has been reported, but its usefulness and limitations remain unclear. To our knowledge, quantitative PCR assay using paraffin tumor sections has not been reported. Using paraffin sections from ATLLs and non-ATLL T-cell lymphomas, we first performed non-quantitative PCR for HTLV-1 provirus. Next, we determined tumor ratios and carried out quantitative PCR to obtain provirus copy numbers. The results were analyzed with a simple regression model and a novel criterion, cut-off using 95 % rejection limits. Our quantitative PCR assay showed an excellent association between tumor ratios and the copy numbers (r = 0.89, P < 0.0001). The 95 % rejection limits provided a statistical basis for the range for the determination of HTLV-1 involvement. Its application suggested that results of non-quantitative PCR assay should be interpreted very carefully and that our quantitative PCR assay is useful to estimate the status of HTLV-1 involvement in the tumor cases. In conclusion, our quantitative PCR assay using paraffin tumor sections may be useful for the screening of ATLL cases, especially in HTLV-1 non-endemic areas where easy access to serological testing for HTLV-1 infection is limited. © 2016 Japanese Society of Pathology and John Wiley & Sons Australia, Ltd.

  9. EvolQG - An R package for evolutionary quantitative genetics

    PubMed Central

    Melo, Diogo; Garcia, Guilherme; Hubbe, Alex; Assis, Ana Paula; Marroig, Gabriel

    2016-01-01

    We present an open source package for performing evolutionary quantitative genetics analyses in the R environment for statistical computing. Evolutionary theory shows that evolution depends critically on the available variation in a given population. When dealing with many quantitative traits this variation is expressed in the form of a covariance matrix, particularly the additive genetic covariance matrix or sometimes the phenotypic matrix, when the genetic matrix is unavailable and there is evidence the phenotypic matrix is sufficiently similar to the genetic matrix. Given this mathematical representation of available variation, the \\textbf{EvolQG} package provides functions for calculation of relevant evolutionary statistics; estimation of sampling error; corrections for this error; matrix comparison via correlations, distances and matrix decomposition; analysis of modularity patterns; and functions for testing evolutionary hypotheses on taxa diversification. PMID:27785352

  10. In vivo measurements of proton relaxation times in human brain, liver, and skeletal muscle: a multicenter MRI study.

    PubMed

    de Certaines, J D; Henriksen, O; Spisni, A; Cortsen, M; Ring, P B

    1993-01-01

    Quantitative magnetic resonance imaging may offer unique potential for tissue characterization in vivo. In this connection texture analysis of quantitative MR images may be of special importance. Because evaluation of texture analysis needs large data material, multicenter approaches become mandatory. Within the frame of BME Concerted Action on Tissue Characterization by MRI and MRS, a pilot multicenter study was launched in order to evaluate the technical problems including comparability of relaxation time measurements carried out in the individual sites. Human brain, skeletal muscle, and liver were used as models. A total of 218 healthy volunteers were studied. Fifteen MRI scanners with field strength ranging from 0.08 T to 1.5 T were induced. Measurement accuracy was tested on the Eurospin relaxation time test object (TO5) and the obtained calibration curve was used for correction of the in vivo data. The results established that, by following a standardized procedure, comparable quantitative measurements can be obtained in vivo from a number of MR sites. The overall variation coefficient in vivo was in the same order of magnitude as ex vivo relaxometry. Thus, it is possible to carry out international multicenter studies on quantitative imaging, provided that quality control with respect to measurement accuracy and calibration of the MR equipments are performed.

  11. Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Daniel, Jennifer C., E-mail: jennifer.odaniel@duke.edu; Yin, Fang-Fang

    Purpose: To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. Methods and Materials: A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Results: Although the failure severity was greatest for daily imaging QA (imaging vsmore » treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and lasers tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Conclusions: Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number.« less

  12. The emotional coaching model: quantitative and qualitative research into relationships, communication and decisions in physical and sports rehabilitation

    PubMed Central

    RESPIZZI, STEFANO; COVELLI, ELISABETTA

    2015-01-01

    The emotional coaching model uses quantitative and qualitative elements to demonstrate some assumptions relevant to new methods of treatment in physical rehabilitation, considering emotional, cognitive and behavioral aspects in patients, whether or not they are sportsmen. Through quantitative tools (Tampa Kinesiophobia Scale, Emotional Interview Test, Previous Re-Injury Test, and reports on test scores) and qualitative tools (training contracts and relationships of emotional alliance or “contagion”), we investigate initial assumptions regarding: the presence of a cognitive and emotional mental state of impasse in patients at the beginning of the rehabilitation pathway; the curative value of the emotional alliance or “emotional contagion” relationship between healthcare provider and patient; the link between the patient’s pathology and type of contact with his own body and emotions; analysis of the psychosocial variables for the prediction of possible cases of re-injury for patients who have undergone or are afraid to undergo reconstruction of the anterior cruciate ligament (ACL). Although this approach is still in the experimental stage, the scores of the administered tests show the possibility of integrating quantitative and qualitative tools to investigate and develop a patient’s physical, mental and emotional resources during the course of his rehabilitation. Furthermore, it seems possible to identify many elements characterizing patients likely to undergo episodes of re-injury or to withdraw totally from sporting activity. In particular, such patients are competitive athletes, who fear or have previously undergone ACL reconstruction. 
The theories referred to (the transactional analysis theory, self-determination theory) and the tools used demonstrate the usefulness of continuing this research in order to build a shared coaching model treatment aimed at all patients, sportspeople or otherwise, which is not only physical but also emotional, cognitive and behavioral. PMID:26904525

  13. The Effect of Radiation on Selected Photographic Film

    NASA Technical Reports Server (NTRS)

    Slater, Richard; Kinard, John; Firsov, Ivan

    2000-01-01

    We conducted this film test to evaluate several manufacturers' photographic films for their ability to acquire imagery on the International Space Station. We selected 25 motion picture, photographic slide, and negative films from three different film manufacturers. We based this selection on the fact that their films ranked highest in other similar film tests, and on their general acceptance by the international community. This test differed from previous tests because the entire evaluation process leading up to the final selection was based on information derived after the original flight film was scanned to a digital file. Previously conducted tests were evaluated entirely based on 8 x 10s that were produced from the film either directly or through the internegative process. This new evaluation procedure provided accurate quantitative data on granularity and contrast from the digital data. This test did not try to define which film was best visually. This is too often based on personal preference. However, the test results did group the films by good, marginal, and unacceptable. We developed, and included in this report, a template containing quantitative, graphical, and visual information for each film. These templates should be sufficient for comparing the different films tested and subsequently selecting a film or films to be used for experiments and general documentation on the International Space Station.

  14. Towards SDS (Strategic Defense System) Testing and Evaluation: A collection of Relevant Topics

    DTIC Science & Technology

    1989-07-01

    the proof of the next. 89 The Piton project is the first instance of stacking.two verified components. In 1985 Warren...Accelerated? In the long term, a vast amount of work needs to be done. Below are some miscellaneous, fairly near term projects which would seem to provide...and predictions for the current project . It provides a quantitative analysis of the environment and a model of the

  15. TH-A-207B-00: Shear-Wave Imaging and a QIBA US Biomarker Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Imaging of tissue elastic properties is a relatively new and powerful approach to one of the oldest and most important diagnostic tools. Imaging of shear wave speed with ultrasound is has been added to most high-end ultrasound systems. Understanding this exciting imaging mode aiding its most effective use in medicine can be a rewarding effort for medical physicists and other medical imaging and treatment professionals. Assuring consistent, quantitative measurements across the many ultrasound systems in a typical imaging department will constitute a major step toward realizing the great potential of this technique and other quantitative imaging. This session will targetmore » these two goals with two presentations. A. Basics and Current Implementations of Ultrasound Imaging of Shear Wave Speed and Elasticity - Shigao Chen, Ph.D. Learning objectives-To understand: Introduction: Importance of tissue elasticity measurement Strain vs. shear wave elastography (SWE), beneficial features of SWE The link between shear wave speed and material properties, influence of viscosity Generation of shear waves External vibration (Fibroscan) ultrasound radiation force Point push Supersonic push (Aixplorer) Comb push (GE Logiq E9) Detection of shear waves Motion detection from pulse-echo ultrasound Importance of frame rate for shear wave imaging Plane wave imaging detection How to achieve high effective frame rate using line-by-line scanners Shear wave speed calculation Time to peak Random sample consensus (RANSAC) Cross correlation Sources of bias and variation in SWE Tissue viscosity Transducer compression or internal pressure of organ Reflection of shear waves at boundaries B. Elasticity Imaging System Biomarker Qualification and User Testing of Systems – Brian Garra, M.D. 
Learning objectives-To understand: Goals Review the need for quantitative medical imaging Provide examples of quantitative imaging biomarkers Acquaint the participant with the purpose of the RSNA Quantitative Imaging Biomarker Alliance and the need for such an organization Review the QIBA process for creating a quantitative biomarker Summarize steps needed to verify adherence of site, operators, and imaging systems to a QIBA profile Underlying Premise and Assumptions Objective, quantifiable results are needed to enhance the value of diagnostic imaging in clinical practice Reasons for quantification Evidence based medicine requires objective, not subjective observer data Computerized decision support tools (eg CAD) generally require quantitative input. Quantitative, reproducible measures are more easily used to develop personalized molecular medical diagnostic and treatment systems What is quantitative imaging? Definition from Imaging Metrology Workshop The Quantitative Imaging Biomarker Alliance Formation 2008 Mission Structure Example Imaging Biomarkers Being Explored Biomarker Selection Groundwork Draft Protocol for imaging and data evaluation QIBA Profile Drafting Equipment and Site Validation Technical Clinical Site and Equipment QA and Compliance Checking Ultrasound Elasticity Estimation Biomarker US Elasticity Estimation Background Current Status and Problems Biomarker Selection-process and outcome US SWS for Liver Fibrosis Biomarker Work Groundwork Literature search and analysis results Phase I phantom testing-Elastic phantoms Phase II phantom testing-Viscoelastic phantoms Digital Simulated Data Protocol and Profile Drafting Protocol: based on UPICT and existing literature and standards bodies protocols Profile-Current claims, Manufacturer specific appendices What comes after the profile Profile Validation Technical validation Clinical validation QA and Compliance Possible approaches Site Operator testing Site protocol re-evaluation Imaging system 
Manufacturer testing and attestation User acceptance testing and periodic QA Phantom Tests Digital Phantom Based Testing Standard QA Testing Remediation Schemes Profile Evolution Towards additional applications Towards higher accuracy and precision Supported in part by NIH contract HHSN268201300071C from NIBIB. Collaboration with GE Global Research, no personal support.; S. Chen, Some technologies described in this presentation have been licensed. Mayo Clinic and Dr. Chen have financial interests these technologies.« less

  16. TH-A-207B-01: Basics and Current Implementations of Ultrasound Imaging of Shear Wave Speed and Elasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, S.

    Imaging of tissue elastic properties is a relatively new and powerful approach to one of the oldest and most important diagnostic tools. Imaging of shear wave speed with ultrasound has been added to most high-end ultrasound systems. Understanding this exciting imaging mode and aiding its most effective use in medicine can be a rewarding effort for medical physicists and other medical imaging and treatment professionals. Assuring consistent, quantitative measurements across the many ultrasound systems in a typical imaging department will constitute a major step toward realizing the great potential of this technique and other quantitative imaging. This session will target these two goals with two presentations.
    A. Basics and Current Implementations of Ultrasound Imaging of Shear Wave Speed and Elasticity - Shigao Chen, Ph.D. Learning objectives - to understand: Introduction (importance of tissue elasticity measurement; strain vs. shear wave elastography (SWE) and the beneficial features of SWE; the link between shear wave speed and material properties, and the influence of viscosity); Generation of shear waves (external vibration (Fibroscan); ultrasound radiation force: point push, supersonic push (Aixplorer), comb push (GE Logiq E9)); Detection of shear waves (motion detection from pulse-echo ultrasound; importance of frame rate for shear wave imaging; plane wave imaging detection; how to achieve a high effective frame rate using line-by-line scanners); Shear wave speed calculation (time to peak; random sample consensus (RANSAC); cross correlation); Sources of bias and variation in SWE (tissue viscosity; transducer compression or internal organ pressure; reflection of shear waves at boundaries).
    B. Elasticity Imaging System Biomarker Qualification and User Testing of Systems - Brian Garra, M.D. Learning objectives - to understand: Goals (review the need for quantitative medical imaging; provide examples of quantitative imaging biomarkers; acquaint the participant with the purpose of the RSNA Quantitative Imaging Biomarker Alliance (QIBA) and the need for such an organization; review the QIBA process for creating a quantitative biomarker; summarize the steps needed to verify adherence of site, operators, and imaging systems to a QIBA profile); Underlying premise and assumptions (objective, quantifiable results are needed to enhance the value of diagnostic imaging in clinical practice; evidence-based medicine requires objective rather than subjective observer data; computerized decision support tools (e.g., CAD) generally require quantitative input; quantitative, reproducible measures are more easily used to develop personalized molecular medical diagnostic and treatment systems); What is quantitative imaging? (definition from the Imaging Metrology Workshop); The Quantitative Imaging Biomarker Alliance (formation in 2008; mission; structure; example imaging biomarkers being explored); Biomarker selection (groundwork; draft protocol for imaging and data evaluation; QIBA profile drafting; equipment and site validation, technical and clinical; site and equipment QA and compliance checking); Ultrasound elasticity estimation biomarker (US elasticity estimation background; current status and problems; biomarker selection process and outcome); US SWS for liver fibrosis biomarker work (groundwork; literature search and analysis results; phase I phantom testing with elastic phantoms; phase II phantom testing with viscoelastic phantoms; digital simulated data); Protocol and profile drafting (protocol based on UPICT and on existing literature and standards-body protocols; profile with current claims and manufacturer-specific appendices); What comes after the profile (profile validation, technical and clinical); QA and compliance (possible approaches; site operator testing; site protocol re-evaluation; imaging system manufacturer testing and attestation; user acceptance testing and periodic QA; phantom tests; digital-phantom-based testing; standard QA testing; remediation schemes); Profile evolution (towards additional applications; towards higher accuracy and precision). Supported in part by NIH contract HHSN268201300071C from NIBIB. Collaboration with GE Global Research, no personal support. S. Chen: some technologies described in this presentation have been licensed; Mayo Clinic and Dr. Chen have financial interests in these technologies.
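The time-to-peak method named in the session outline can be illustrated with a short sketch: the arrival time of the shear wave displacement peak is found at each lateral position, and the inverse slope of arrival time versus distance gives the wave speed. The code below is a minimal illustration on synthetic data (the Gaussian pulse and all names are invented for the example), not any vendor's implementation.

```python
import numpy as np

def shear_wave_speed_ttp(displacement, lateral_mm, t_ms):
    """Time-to-peak shear wave speed estimate: find the arrival time
    of the displacement peak at each lateral position, fit arrival
    time vs. distance, and invert the slope (mm/ms equals m/s)."""
    ttp = t_ms[np.argmax(displacement, axis=1)]         # peak arrival per position
    slope, _intercept = np.polyfit(lateral_mm, ttp, 1)  # ms per mm
    return 1.0 / slope

# Synthetic data: a Gaussian displacement pulse propagating at 2 m/s.
t = np.linspace(0.0, 10.0, 1000)        # time axis, ms
x = np.linspace(2.0, 8.0, 13)           # lateral positions, mm
true_speed = 2.0                        # mm/ms = m/s
disp = np.exp(-((t[None, :] - x[:, None] / true_speed) ** 2) / 0.5)
estimated = shear_wave_speed_ttp(disp, x, t)   # ≈ 2.0 m/s
```

In practice RANSAC or cross-correlation variants (also listed in the outline) are used to make this fit robust to outlier arrival times near boundaries.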

  17. Accuracy of commercially available C-reactive protein rapid tests in the context of undifferentiated fevers in rural Laos.

    PubMed

    Phommasone, Koukeo; Althaus, Thomas; Souvanthong, Phonesavanh; Phakhounthong, Khansoudaphone; Soyvienvong, Laxoy; Malapheth, Phatthaphone; Mayxay, Mayfong; Pavlicek, Rebecca L; Paris, Daniel H; Dance, David; Newton, Paul; Lubell, Yoel

    2016-02-04

    C-Reactive Protein (CRP) has been shown to be an accurate biomarker for discriminating bacterial from viral infections in febrile patients in Southeast Asia. Here we investigate the accuracy of existing rapid qualitative and semi-quantitative tests as compared with a quantitative reference test to assess their potential for use in remote tropical settings. Blood samples were obtained from consecutive patients recruited to a prospective fever study at three sites in rural Laos. At each site, one of three rapid qualitative or semi-quantitative tests was performed, as well as the quantitative NycoCard Reader II as a reference test. We estimated the sensitivity and specificity of the three tests against a threshold of 10 mg/L, and kappa values for the agreement of the two semi-quantitative tests with the results of the reference test. All three tests showed high sensitivity, specificity and kappa values as compared with the NycoCard Reader II. With a threshold of 10 mg/L, the sensitivity of the tests ranged from 87% to 98% and the specificity from 91% to 98%. The weighted kappa values for the semi-quantitative tests were 0.7 and 0.8. The use of CRP rapid tests could offer an inexpensive and effective approach to improve the targeting of antibiotics in remote settings where health facilities are basic and laboratories are absent. This study demonstrates that accurate CRP rapid tests are commercially available; evaluations of their clinical impact and cost-effectiveness at the point of care are warranted.
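The accuracy measures reported above (sensitivity and specificity against a 10 mg/L threshold, and weighted kappa for the semi-quantitative tests) reduce to simple computations, sketched below; the toy data and function names are illustrative, not from the study.

```python
import numpy as np

def sens_spec(rapid_pos, ref_mg_l, threshold=10.0):
    """Sensitivity/specificity of a binary rapid test against a
    quantitative reference dichotomized at `threshold` (mg/L)."""
    truth = np.asarray(ref_mg_l) >= threshold
    pred = np.asarray(rapid_pos, dtype=bool)
    sensitivity = (pred & truth).sum() / truth.sum()
    specificity = (~pred & ~truth).sum() / (~truth).sum()
    return sensitivity, specificity

def weighted_kappa(a, b, n_cat, weights="linear"):
    """Cohen's weighted kappa for two ordinal raters coded 0..n_cat-1."""
    a, b = np.asarray(a), np.asarray(b)
    obs = np.zeros((n_cat, n_cat))
    for i, j in zip(a, b):
        obs[i, j] += 1                  # observed agreement matrix
    obs /= obs.sum()
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    idx = np.arange(n_cat)
    w = np.abs(idx[:, None] - idx[None, :]).astype(float)
    if weights == "quadratic":
        w = w ** 2
    return 1.0 - (w * obs).sum() / (w * expected).sum()
```

Linear weights penalize disagreements in proportion to how many categories apart the two readings fall, which suits ordered semi-quantitative CRP bands.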

  18. Overview of Classical Test Theory and Item Response Theory for Quantitative Assessment of Items in Developing Patient-Reported Outcome Measures

    PubMed Central

    Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.

    2014-01-01

    Introduction The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods We review classical test theory and item response theory approaches to evaluating PRO measures including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753
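Several of the classical test theory item checks the review describes (the relationship between item responses and the total score, floor and ceiling effects) reduce to simple computations. Below is an illustrative sketch; the 0-4 item score range is an assumed example, not from the paper.

```python
import numpy as np

def ctt_item_stats(responses, item_min=0, item_max=4):
    """Classical-test-theory screening of a multi-item scale.
    `responses`: (n_subjects, n_items) item scores. Returns corrected
    item-total correlations (each item vs. the total of the remaining
    items) plus floor/ceiling percentages of the scale score against
    the possible range (item_min..item_max per item)."""
    r = np.asarray(responses, dtype=float)
    n_items = r.shape[1]
    total = r.sum(axis=1)
    item_total = np.array([np.corrcoef(r[:, i], total - r[:, i])[0, 1]
                           for i in range(n_items)])
    floor = np.mean(total == n_items * item_min) * 100    # % at minimum score
    ceiling = np.mean(total == n_items * item_max) * 100  # % at maximum score
    return item_total, floor, ceiling
```

Correlating each item against the total of the *remaining* items avoids the inflation that comes from including the item in its own criterion.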

  19. Smartphone based visual and quantitative assays on upconversional paper sensor.

    PubMed

    Mei, Qingsong; Jing, Huarong; Li, You; Yisibashaer, Wuerzha; Chen, Jian; Nan Li, Bing; Zhang, Yong

    2016-01-15

    The integration of smartphones with paper sensors has recently gained increasing attention because it enables quantitative and rapid analysis. However, smartphone-based upconversional paper sensors have been restricted by the lack of effective methods to acquire luminescence signals on test paper. Herein, by virtue of 3D printing technology, we developed an auxiliary reusable device, which assembles a 980 nm mini-laser, an optical filter, and a mini-cavity together, for digitally imaging the luminescence variations on test paper and quantitatively analyzing the pesticide thiram with a smartphone. In detail, copper-ion-decorated NaYF4:Yb/Tm upconversion nanoparticles were fixed onto filter paper to form the test paper, and the blue luminescence on it was quenched after addition of thiram through a luminescence resonance energy transfer mechanism. These variations could be monitored by the smartphone camera, and the blue-channel intensities of the resulting color images were calculated to quantify the amount of thiram through a self-written Android program installed on the smartphone, offering a reliable and accurate detection limit of 0.1 μM for the system. This work provides an initial demonstration of integrating upconversion nanosensors with smartphone digital imaging for point-of-care analysis on a paper-based platform. Copyright © 2015 Elsevier B.V. All rights reserved.
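The blue-channel quantification step described above can be sketched as follows. The ROI handling and the linear calibration inversion are hypothetical illustrations of the general approach, not the authors' code; a real calibration curve would be fit to standards.

```python
import numpy as np

def blue_channel_intensity(rgb_image, roi=None):
    """Mean blue-channel intensity of a smartphone image of the test
    paper; quenching of the upconversion luminescence by thiram lowers
    this value. `rgb_image`: (H, W, 3) array in RGB channel order;
    `roi`: optional (row0, row1, col0, col1) window over the test spot."""
    img = np.asarray(rgb_image, dtype=float)
    if roi is not None:
        r0, r1, c0, c1 = roi
        img = img[r0:r1, c0:c1]
    return img[..., 2].mean()   # channel index 2 = blue

def thiram_from_calibration(intensity, i0, slope):
    """Invert a hypothetical linear calibration I = i0 - slope * c
    (unquenched intensity i0, quenching slope per μM) to estimate
    the thiram concentration c."""
    return (i0 - intensity) / slope
```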

  20. Smartphone-Based Dual-Modality Imaging System for Quantitative Detection of Color or Fluorescent Lateral Flow Immunochromatographic Strips

    NASA Astrophysics Data System (ADS)

    Hou, Yafei; Wang, Kan; Xiao, Kun; Qin, Weijian; Lu, Wenting; Tao, Wei; Cui, Daxiang

    2017-04-01

    Nowadays, lateral flow immunochromatographic assays are increasingly popular as a diagnostic tool for point-of-care (POC) testing based on their simplicity, specificity, and sensitivity. Hence, quantitative detection and broader application are urgently needed in medical examination. In this study, a smartphone-based dual-modality imaging system was developed for quantitative detection of color or fluorescent lateral flow test strips, which can be operated anywhere at any time. In this system, the optical device was designed with both white and ultraviolet (UV) light sources, tunable for different strips, and the Sobel operator algorithm was used in the software to enhance recognition of the test area from background boundary information. Moreover, extracting the components of the RGB format (red, green, and blue) for color strips, or only the red component for fluorescent strips, markedly improves signal intensity and sensitivity. Fifty samples were used to evaluate the accuracy of the system, and detection limits were calculated separately for human chorionic gonadotropin (HCG) and carcinoembryonic antigen (CEA). The results indicated that the smartphone-controlled dual-modality imaging system could support various POC diagnoses, making it a promising technology for the next generation of portable diagnostic systems.
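The Sobel operator mentioned above computes a gradient magnitude that highlights the boundary between the test area and the strip background. A plain-NumPy sketch of the operator (a production system would use an optimized image-processing library rather than this explicit loop):

```python
import numpy as np

def sobel_magnitude(gray):
    """Sobel gradient magnitude of a 2-D grayscale image; high values
    mark intensity edges such as a test line against the strip
    background. Edge pixels are handled by replicate padding."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T                                   # vertical-gradient kernel
    g = np.asarray(gray, dtype=float)
    pad = np.pad(g, 1, mode="edge")
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    for i in range(g.shape[0]):
        for j in range(g.shape[1]):
            win = pad[i:i + 3, j:j + 3]         # 3x3 neighborhood
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return np.hypot(gx, gy)
```

A vertical step edge produces a band of high magnitude along the step and zero response in the flat regions, which is exactly what makes the test-area boundary separable from the background.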

  1. Quantitative application of the primary progressive aphasia consensus criteria.

    PubMed

    Wicklund, Meredith R; Duffy, Joseph R; Strand, Edythe A; Machulda, Mary M; Whitwell, Jennifer L; Josephs, Keith A

    2014-04-01

    To determine how well the consensus criteria could classify subjects with primary progressive aphasia (PPA) using a quantitative speech and language battery that matches the test descriptions provided by the consensus criteria. A total of 105 participants with a neurodegenerative speech and language disorder were prospectively recruited and underwent neurologic, neuropsychological, and speech and language testing and MRI in this case-control study. Twenty-one participants with apraxia of speech without aphasia served as controls. Select tests from the speech and language battery were chosen for application of consensus criteria and cutoffs were employed to determine syndromic classification. Hierarchical cluster analysis was used to examine participants who could not be classified. Of the 84 participants, 58 (69%) could be classified as agrammatic (27%), semantic (7%), or logopenic (35%) variants of PPA. The remaining 31% of participants could not be classified. Of the unclassifiable participants, 2 clusters were identified. The speech and language profile of the first cluster resembled mild logopenic PPA and the second cluster semantic PPA. Gray matter patterns of loss of these 2 clusters of unclassified participants also resembled mild logopenic and semantic variants. Quantitative application of consensus PPA criteria yields the 3 syndromic variants but leaves a large proportion unclassified. Therefore, the current consensus criteria need to be modified in order to improve sensitivity.

  2. Radiomic analysis in prediction of Human Papilloma Virus status.

    PubMed

    Yu, Kaixian; Zhang, Youyi; Yu, Yang; Huang, Chao; Liu, Rongjie; Li, Tengfei; Yang, Liuqing; Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Zhu, Hongtu

    2017-12-01

    Human Papilloma Virus (HPV) has been associated with oropharyngeal cancer prognosis. Traditionally, HPV status is determined through an invasive lab test. Recently, the rapid development of statistical image analysis techniques has enabled precise quantitative analysis of medical images. Quantitative analysis of Computed Tomography (CT) provides a non-invasive way to assess HPV status for oropharynx cancer patients. We designed a statistical radiomics approach that analyzes CT images to predict HPV status. Various radiomics features were extracted from CT scans and analyzed using statistical feature selection and prediction methods. Our approach ranked highest in the 2016 Medical Image Computing and Computer Assisted Intervention (MICCAI) grand challenge: Oropharynx Cancer (OPC) Radiomics Challenge, Human Papilloma Virus (HPV) Status Prediction. Further analysis of the most relevant radiomic features distinguishing HPV-positive and HPV-negative subjects suggested that HPV-positive patients usually have smaller and simpler tumors.

  3. How and Why of User Studies: RLG's RedLightGreen as a Case Study

    ERIC Educational Resources Information Center

    Proffitt, Merrilee

    2006-01-01

    This article documents a lifecycle approach to employing user-centered design, covering both qualitative and quantitative data gathering methods in support of using this approach for product design, usability testing, and market research. The author provides specific case studies of usability studies, focus groups, interviews, ethnographic…

  4. KSC01pp0799

    NASA Image and Video Library

    2001-04-12

    Workers at Astrotech, Titusville, Fla., work on the GOES-M satellite. The GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking and meteorological research. The satellite is undergoing testing at Astrotech before its scheduled launch July 12 on an Atlas IIA booster with a Centaur upper stage from Cape Canaveral Air Force Station.

  5. Investigation of Kevlar fabric based materials for use with inflatable structures

    NASA Technical Reports Server (NTRS)

    Niccum, R. J.; Munson, J. B.

    1974-01-01

    Design, manufacture and testing of laminated and coated composite materials incorporating a structural matrix of Kevlar are reported in detail. The practicality of using Kevlar in aerostat materials is demonstrated and data are provided on practical weaves, lamination and coating particulars, rigidity, strength, weight, elastic coefficients, abrasion resistance, crease effects, peel strength, blocking tendencies, helium permeability, and fabrication techniques. Properties of the Kevlar based materials are compared with conventional, Dacron reinforced counterparts. A comprehensive test and qualification program is discussed and quantitative biaxial tensile and shear test data are provided. The investigation shows that single ply laminates of Kevlar and plastic films offer significant strength to weight improvements, are less permeable than two ply coated materials, but have a lower flex life.

  6. Oxygen Concentration Flammability Thresholds of Selected Aerospace Materials Considered for the Constellation Program

    NASA Technical Reports Server (NTRS)

    Hirsch, David B.; Williams, James H.; Harper, Susan A.; Beeson, Harold; Pedley, Michael D.

    2007-01-01

    Materials selection for spacecraft is based on an upward flammability test conducted in a quiescent environment at the highest expected oxygen concentration. The test conditions and pass/fail logic do not provide sufficient quantitative materials flammability information for an advanced space exploration program. A modified approach has been suggested: determination of materials' self-extinguishment limits. The flammability threshold information will allow NASA to identify materials with increased flammability risk from oxygen concentration and total pressure changes, minimize potential impacts, and allow for development of sound requirements for new spacecraft and extraterrestrial landers and habitats. This paper provides data on oxygen concentration self-extinguishment limits under quiescent conditions for selected materials considered for the Constellation Program.

  7. The Validity of Scores from the "GRE"® revised General Test for Forecasting Performance in Business Schools: Phase One. ETS GRE® Board Research Report. ETS GRE®-14-01. ETS Research Report. RR-14-17

    ERIC Educational Resources Information Center

    Young, John W.; Klieger, David; Bochenek, Jennifer; Li, Chen; Cline, Fred

    2014-01-01

    Scores from the "GRE"® revised General Test provide important information regarding the verbal and quantitative reasoning abilities and analytical writing skills of applicants to graduate programs. The validity and utility of these scores depend upon the degree to which the scores predict success in graduate and business school in…

  8. Aerothermal modeling program, phase 1

    NASA Technical Reports Server (NTRS)

    Sturgess, G. J.

    1983-01-01

    The physical modeling embodied in the computational fluid dynamics codes is discussed. The objectives were to identify shortcomings in the models and to provide a program plan to improve the quantitative accuracy. The physical models studied were for: turbulent mass and momentum transport, heat release, liquid fuel spray, and gaseous radiation. The approach adopted was to test the models against appropriate benchmark-quality test cases from experiments in the literature for the constituent flows that together make up the combustor real flow.

  9. Photothermal technique in cell microscopy studies

    NASA Astrophysics Data System (ADS)

    Lapotko, Dmitry; Chebot'ko, Igor; Kutchinsky, Georgy; Cherenkevitch, Sergey

    1995-01-01

    The photothermal (PT) method is applied to cell imaging and quantitative studies. Techniques for cell monitoring, imaging, and a cell viability test are developed. The method and experimental setup for optical and PT image acquisition and analysis are described. A dual-pulsed laser setup combined with phase-contrast illumination of a sample provides visualization of the temperature field or absorption structure of a sample with a spatial resolution of 0.5 μm. The experimental optics, hardware, and software are designed on a modular principle, so the whole setup can be adjusted for various experiments: PT-response monitoring or photothermal spectroscopy studies. The sensitivity of the PT method enables imaging of the structural elements of live (non-stained) white blood cells. The results of experiments with normal and subnormal blood cells (red blood cells, lymphocytes, neutrophils, and lymphoblasts) are reported. The PT images obtained differ from their optical analogs and deliver additional information about cell structure. Quantitative analysis of the images was used for comparative diagnostics of cell populations. A viability test for red blood cell differentiation is described. In a study of neutrophils in normal subjects and in sarcoidosis, differences in the PT images of cells were found.

  10. Supply-side dimensions and dynamics of integrating HIV testing and counselling into routine antenatal care: a facility assessment from Morogoro Region, Tanzania.

    PubMed

    An, Selena J; George, Asha S; LeFevre, Amnesty E; Mpembeni, Rose; Mosha, Idda; Mohan, Diwakar; Yang, Ann; Chebet, Joy; Lipingu, Chrisostom; Baqui, Abdullah H; Killewo, Japhet; Winch, Peter J; Kilewo, Charles

    2015-10-04

    Integration of HIV into RMNCH (reproductive, maternal, newborn and child health) services is an important process addressing the disproportionate burden of HIV among mothers and children in sub-Saharan Africa. We assess the structural inputs and processes of care that support HIV testing and counselling in routine antenatal care to understand supply-side dynamics critical to scaling up further integration of HIV into RMNCH services prior to recent changes in HIV policy in Tanzania. This study, as a part of a maternal and newborn health program evaluation in Morogoro Region, Tanzania, drew from an assessment of health centers with 18 facility checklists, 65 quantitative and 57 qualitative provider interviews, and 203 antenatal care observations. Descriptive analyses were performed with quantitative data using Stata 12.0, and qualitative data were analyzed thematically with data managed by Atlas.ti. Limitations in structural inputs, such as infrastructure, supplies, and staffing, constrain the potential for integration of HIV testing and counselling into routine antenatal care services. While assessment of infrastructure, including waiting areas, appeared adequate, long queues and small rooms made private and confidential HIV testing and counselling difficult for individual women. Unreliable stocks of HIV test kits, essential medicines, and infection prevention equipment also had implications for provider-patient relationships, with reported decreases in women's care seeking at health centers. In addition, low staffing levels were reported to increase workloads and lower motivation for health workers. Despite adequate knowledge of counselling messages, antenatal counselling sessions were brief with incomplete messages conveyed to pregnant women. In addition, coping mechanisms, such as scheduling of clinical activities on different days, limited service availability. 
Antenatal care is a strategic entry point for the delivery of critical tests and counselling messages and the framing of patient-provider relations, which together underpin care seeking for the remaining continuum of care. Supply-side deficiencies in structural inputs and processes of delivering HIV testing and counselling during antenatal care indicate critical shortcomings in the quality of care provided. These must be addressed if integrating HIV testing and counselling into antenatal care is to result in improved maternal and newborn health outcomes.

  11. Quantitation of a slide test (Monotest) for infectious mononucleosis

    PubMed Central

    Carter, P. Kenneth; Schoen, Irwin; Miyahira, Teru

    1970-01-01

    A slide test for infectious mononucleosis using formalinized horse erythrocytes (Monotest2) was quantitated and compared with standard differential heterophile (Davidsohn) titres performed on the same specimens. The Monotest titre parallels the standard presumptive heterophile (antisheep cell) titre in the degree of elevation, with a ratio of Monotest to heterophile titre of approximately 1 to 56. The simplicity of the quantitative slide test recommends it as a routine test for infectious mononucleosis. PMID:5530641

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turkington, T.

    This education session will cover the physics and operation principles of gamma cameras and PET scanners. The first talk will focus on PET imaging. An overview of the principles of PET imaging will be provided, including positron decay physics and the transition from 2D to 3D imaging. More recent advances in hardware and software will be discussed, such as time-of-flight imaging and improvements in reconstruction algorithms that provide options such as depth-of-interaction corrections. Quantitative applications of PET will be discussed, as well as the requirements for accurate quantitation. Relevant performance tests will also be described. Learning Objectives: Be able to describe the basic physics principles of PET and the operation of PET scanners. Learn about recent advances in PET scanner hardware technology. Be able to describe advances and improvements in reconstruction techniques. Be able to list relevant performance tests. The second talk will focus on gamma cameras. The Nuclear Medicine subcommittee has charged a task group (TG177) to develop a report on the current state of physics testing of gamma cameras, SPECT, and SPECT/CT systems. The report makes recommendations for performance tests to be done for routine quality assurance, annual physics testing, and acceptance tests, and identifies those needed to satisfy the ACR accreditation program and The Joint Commission imaging standards. The report is also intended to be used as a manual with detailed instructions on how to perform tests under widely varying conditions. Learning Objectives: At the end of the presentation members of the audience will: Be familiar with the tests recommended for routine quality assurance, annual physics testing, and acceptance tests of gamma cameras for planar imaging. Be familiar with the tests recommended for routine quality assurance, annual physics testing, and acceptance tests of SPECT systems. 
Be familiar with the tests of a SPECT/CT system that include the CT images used for SPECT reconstructions. Become knowledgeable about the items to be included in annual and acceptance testing reports, including CT dosimetry and PACS monitor measurements. T. Turkington, GE Healthcare.

  13. 77 FR 18793 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... customary compensation for their participation. For the quantitative research, the Bureau plans to contract with a consumer research firm to formulate a quantitative testing plan, recruit respondents, as well as... soliciting comments concerning the information collection efforts relating to Quantitative Testing of...

  14. Remote Imaging of Exploration Flight Test-1 (EFT-1) Entry Heating Risk Reduction

    NASA Technical Reports Server (NTRS)

    Schuster, David M.; Horvath, Thomas J.; Schwartz, Richard J.

    2016-01-01

    A Measure of Performance (MOP) identified with an Exploration Flight Test-1 (EFT-1) Multi-Purpose Crew Vehicle (MPCV) Program Flight Test Objective (FTO) (OFT1.091) specified an observation during reentry through external ground-based or airborne assets with thermal detection capabilities. The objective of this FTO was to be met with onboard Developmental Flight Instrumentation (DFI), but the MOP for external observation was intended to provide complementary quantitative data and serve as a risk reduction in the event of anomalous DFI behavior (or failure). Mr. Gavin Mendeck, the Entry, Descent, and Landing (EDL) Phase Engineer for the MPCV Program (Vehicle Integration Office/Systems & Mission Integration) requested a risk-reduction assessment from the NASA Engineering and Safety Center (NESC) to determine whether quantitative imagery could be obtained from remote aerial assets to support the external observation MOP. If so, then a viable path forward was to be determined, risks identified, and an observation pursued. If not, then the MOP for external observation was to be eliminated.

  15. From community-based pilot testing to region-wide systems change: lessons from a local quality improvement collaborative.

    PubMed

    Keyser, Donna J; Pincus, Harold Alan

    2010-01-01

    A community-based collaborative conducted a 2-year pilot study to inform efforts for improving maternal and child health care practice and policy in Allegheny County, Pennsylvania. (1) To test whether three small-scale versions of an evidence-based, systems improvement approach would be workable in local community settings and (2) to identify specific policy/infrastructure reforms for sustaining improvements. A mixed methods approach was used, including quantitative performance measurement supplemented with qualitative data about factors related to outcomes of interest, as well as key stakeholder interviews and a literature review/Internet search. Quantitative performance results varied; qualitative data revealed critical factors for the success and failure of the practices tested. Policy/infrastructure recommendations were developed to address specific practice barriers. This information was important for designing a region-wide quality improvement initiative focused on maternal depression. The processes and outcomes provide valuable insights for other communities interested in conducting similar quality improvement initiatives.

  16. Correcting power and p-value calculations for bias in diffusion tensor imaging.

    PubMed

    Lauzon, Carolyn B; Landman, Bennett A

    2013-07-01

    Diffusion tensor imaging (DTI) provides quantitative parametric maps sensitive to tissue microarchitecture (e.g., fractional anisotropy, FA). These maps are estimated through computational processes and subject to random distortions including variance and bias. Traditional statistical procedures commonly used for study planning (including power analyses and p-value/alpha-rate thresholds) specifically model variability, but neglect potential impacts of bias. Herein, we quantitatively investigate the impacts of bias in DTI on hypothesis test properties (power and alpha rate) using a two-sided hypothesis testing framework. We present a theoretical evaluation of bias on hypothesis test properties, evaluate the bias estimation technique SIMEX for DTI hypothesis testing using simulated data, and evaluate the impacts of bias on spatially varying power and alpha rates in an empirical study of 21 subjects. Bias is shown to inflate alpha rates, distort the power curve, and cause significant power loss even in empirical settings where the expected difference in bias between groups is zero. These adverse effects can be attenuated by properly accounting for bias in the calculation of power and p-values. Copyright © 2013 Elsevier Inc. All rights reserved.
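The core effect described, bias shifting the mean of the test statistic and thereby distorting alpha rates and power, can be illustrated for a simple two-sided z-test. This is a simplified one-sample stand-in for the paper's DTI setting, not its actual framework:

```python
from statistics import NormalDist

ND = NormalDist()

def two_sided_reject_prob(effect, bias, se, alpha=0.05):
    """Probability that a two-sided z-test rejects when the estimator
    is Normal with mean effect + bias and standard error se. With
    effect = 0 this is the realized alpha rate (inflated by bias);
    with effect != 0 it is the realized power."""
    z = ND.inv_cdf(1 - alpha / 2)      # two-sided critical value
    mu = (effect + bias) / se          # standardized shift of the statistic
    return ND.cdf(-z - mu) + ND.cdf(-z + mu)
```

With no effect and no bias the function returns the nominal 0.05; a bias of one standard error more than triples the realized alpha rate, and a bias that cancels the true effect (effect = -bias) collapses power down to the alpha level, mirroring the power loss the paper reports.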

  17. [Development and effects of emotional intelligence program for undergraduate nursing students: mixed methods research].

    PubMed

    Lee, Oi Sun; Gu, Mee Ock

    2014-12-01

    This study was conducted to develop and test the effects of an emotional intelligence program for undergraduate nursing students. The study used a mixed methods design. Participants were 36 nursing students (intervention group: 17, control group: 19). The emotional intelligence program was provided for 4 weeks (8 sessions, 20 hours). Data were collected between August 6 and October 4, 2013. Quantitative data were analyzed using Chi-square, Fisher's exact test, t-test, repeated measure ANOVA, and paired t-test with SPSS/WIN 18.0. Qualitative data were analyzed using content analysis. Quantitative results showed that emotional intelligence, communication skills, resilience, stress coping strategy, and clinical competence were significantly better in the experimental group compared to the control group. According to the qualitative results, the nursing students experienced improvement in emotional intelligence, interpersonal relationships, and empowerment, as well as a reduction in clinical practice stress after participation in the emotional intelligence program. Study findings indicate that the emotional intelligence program for undergraduate nursing students is effective and can be recommended as an intervention for improving the clinical competence of undergraduate students in a nursing curriculum.

  18. Olfactory dysfunction and its measurement in the clinic and workplace.

    PubMed

    Doty, Richard L

    2006-04-01

    To provide an overview of practical means for quantitatively assessing the sense of smell in both the clinic and workplace. To address basic measurement issues, including those of test sensitivity, specificity, and reliability. To describe and discuss factors that influence olfactory function, including airborne toxins commonly found in industrial settings. Selective review and discussion. A number of well-validated practical threshold and suprathreshold tests are available for assessing smell function. The reliability, sensitivity, and specificity of such techniques vary, being influenced by such factors as test length and type. Numerous subject factors, including age, sex, health, medications, and exposure to environmental toxins, particularly heavy metals, influence the ability to smell. Modern advances in technology, in conjunction with better occupational medicine practices, now make it possible to reliably monitor and limit occupational exposures to hazardous chemicals and their potential adverse influences on the sense of smell. Quantitative olfactory testing is critical to establish the presence or absence of such adverse influences, as well as to (a) detect malingering, (b) establish disability compensation, and (c) monitor function over time.

  19. Joint analysis of binary and quantitative traits with data sharing and outcome-dependent sampling.

    PubMed

    Zheng, Gang; Wu, Colin O; Kwak, Minjung; Jiang, Wenhua; Joo, Jungnam; Lima, Joao A C

    2012-04-01

    We study the joint association between a genetic marker and both binary (case-control) and quantitative (continuous) traits, where the quantitative trait values are only available for the cases due to data sharing and outcome-dependent sampling. Data sharing is becoming common in genetic association studies, and outcome-dependent sampling is a consequence of data sharing, under which a phenotype of interest is not measured for some subgroup. The trend test (or Pearson's test) and the F-test are often used to analyze the binary and quantitative traits, respectively. Because of the outcome-dependent sampling, the usual F-test can be applied using the subgroup with observed quantitative traits. We propose a modified F-test that also incorporates the genotype frequencies of the subgroup whose traits are not observed. Further, a combination of this modified F-test and Pearson's test is proposed via Fisher's combination of their P-values as a joint analysis. Because the two analyses are correlated, we propose using a Gamma (scaled chi-squared) distribution to fit the asymptotic null distribution of the joint analysis. The proposed modified F-test and the joint analysis can also be applied to test single-trait association (either binary or quantitative). Through simulations, we identify the situations under which the proposed tests are more powerful than the existing ones. Application to a real dataset of rheumatoid arthritis is presented. © 2012 Wiley Periodicals, Inc.
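Fisher's combination of the two P-values, and a moment-matched Gamma null of the kind the abstract describes, can be sketched as follows. The simulation under independence is only a sanity check recovering the chi-square(4) = Gamma(shape 2, scale 2) reference; it is not the paper's actual fitting procedure for correlated tests.

```python
import math
import random

def fisher_combined(p1, p2):
    """Fisher's combination statistic for two P-values; under
    independent nulls it follows chi-square(4), i.e. Gamma(2, 2)."""
    return -2.0 * (math.log(p1) + math.log(p2))

def gamma_moment_fit(draws):
    """Moment-match a Gamma(shape k, scale theta) to null draws of the
    combined statistic: theta = var/mean, k = mean/theta. Correlated
    component tests give an over-dispersed fit relative to Gamma(2, 2)."""
    n = len(draws)
    mean = sum(draws) / n
    var = sum((t - mean) ** 2 for t in draws) / (n - 1)
    theta = var / mean
    return mean / theta, theta

# Sanity check under independence: both parameters should be near 2.
random.seed(1)
null_draws = [fisher_combined(random.random() or 1e-12,
                              random.random() or 1e-12)  # guard log(0)
              for _ in range(20000)]
k_hat, theta_hat = gamma_moment_fit(null_draws)
```

In the correlated setting one would generate the null draws by permutation or simulation of the actual pair of tests, then compute joint P-values from the fitted Gamma's survival function.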

  20. Stereotype threat? Effects of inquiring about test takers' gender on conceptual test performance in physics

    NASA Astrophysics Data System (ADS)

    Maries, Alexandru; Singh, Chandralekha

    2015-12-01

    It has been found that activation of a stereotype, for example by indicating one's gender before a test, typically alters performance in a way consistent with the stereotype, an effect called "stereotype threat." On a standardized conceptual physics assessment, we found that asking test takers to indicate their gender right before taking the test did not deteriorate performance compared to an equivalent group who did not provide gender information. Although a statistically significant gender gap was present on the standardized test whether or not students indicated their gender, no gender gap was observed on the multiple-choice final exam students took, which included both quantitative and conceptual questions on similar topics.

  1. GProX, a user-friendly platform for bioinformatics analysis and visualization of quantitative proteomics data.

    PubMed

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-08-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.

  2. GProX, a User-Friendly Platform for Bioinformatics Analysis and Visualization of Quantitative Proteomics Data*

    PubMed Central

    Rigbolt, Kristoffer T. G.; Vanselow, Jens T.; Blagoev, Blagoy

    2011-01-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net. PMID:21602510

  3. Hepatitis C Test

    MedlinePlus

    ... monitor treatment: HCV RNA tests: HCV RNA test, Quantitative (HCV viral load) detects and measures the number ...

  4. Situation analysis of prenatal diagnosis technology utilization in China: current situation, main issues, and policy implications.

    PubMed

    Chen, Yingyao; Qian, Xu; Tang, Zhiliu; Banta, H David; Hu, Fangfang; Cao, Jianwen; Huang, Jiayan; Wang, Qian; Lv, Jun; Ying, Xianghua; Chen, Jie

    2004-01-01

    The purpose of this study is to describe the distribution and utilization of prenatal diagnosis technology in China, to identify some important barriers to prenatal diagnosis use, and to suggest changes to improve the present situation. The study uses cross-sectional surveys to capture quantitative data from both providers and consumers. Qualitative information based on focus group discussions is also presented. A mail survey of the provincial Bureaus of Health (BOHs) reveals that sixteen provincial prenatal diagnosis centers and twelve city-level centers had been accredited by the BOHs by July 2001. These centers were located in thirteen of China's thirty provinces. Of 147 selected institutions surveyed separately, 90.5 percent offer ultrasound examination, 72.1 percent provide pathogen tests (mainly Toxoplasma, rubella virus, cytomegalovirus, and herpes simplex, or TORCH), 57.1 percent do biochemical tests, 21.8 percent have genetic counseling, 13.6 percent do karyotype testing, 7.5 percent do enzymology testing, and 5.4 percent carry out molecular genetic testing. Chromosomal diseases, congenital diseases, and several genetic diseases are the target diseases. According to the qualitative data, macromanagement of prenatal diagnosis, supplier provision of tests, and population demand are the main influences on prenatal diagnosis use. From the quantitative and qualitative analysis, it is clear that the technology of prenatal diagnosis is not diffusing well throughout China and is apparently not appropriately used. The situation of prenatal diagnosis has implications for policy-makers, including identification of priorities, regulation of prenatal diagnosis, strategic planning, development of guidelines based on health technology assessment, and consumer orientation.

  5. Three-dimensional modeling and quantitative analysis of gap junction distributions in cardiac tissue.

    PubMed

    Lackey, Daniel P; Carruth, Eric D; Lasher, Richard A; Boenisch, Jan; Sachse, Frank B; Hitchcock, Robert W

    2011-11-01

    Gap junctions play a fundamental role in intercellular communication in cardiac tissue. Various types of heart disease, including hypertrophy and ischemia, are associated with alterations of the spatial arrangement of gap junctions. Previous studies applied two-dimensional optical and electron microscopy to visualize gap junction arrangements. In normal cardiomyocytes, gap junctions were primarily found at cell ends, but could also be found in more central regions. In this study, we extended these approaches toward three-dimensional reconstruction of gap junction distributions based on high-resolution scanning confocal microscopy and image processing. We developed methods for quantitative characterization of gap junction distributions based on analysis of intensity profiles along the principal axes of myocytes. The analyses characterized gap junction polarization at cell ends and higher-order statistical image moments of intensity profiles. The methodology was tested in rat ventricular myocardium. Our analysis yielded novel quantitative data on gap junction distributions. In particular, the analysis demonstrated that the distributions exhibit significant variability with respect to polarization, skewness, and kurtosis. We suggest that this methodology provides a quantitative alternative to current approaches based on visual inspection, with applications in particular in the characterization of engineered and diseased myocardium. Furthermore, we propose that these data provide improved input for computational modeling of cardiac conduction.

  6. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    PubMed

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

    Many factors influence the precision and accuracy of quantitative analysis with LIBS. In-depth analysis shows that the background spectrum and the characteristic line spectrum follow approximately the same trend as temperature changes, so signal-to-background ratio (S/B) measurement combined with regression analysis can compensate for spectral-line intensity changes caused by system parameters such as laser power and the spectral efficiency of the receiving optics. Because the measurement data were limited and nonlinear, we used support vector machine (SVM) regression. The experimental results showed that the method could improve the stability and accuracy of quantitative LIBS analysis; the relative standard deviation and average relative error of the test set were 4.7% and 9.5%, respectively. Data fitting based on the signal-to-background ratio (S/B) is less susceptible to matrix elements, the background spectrum, etc., and provides a data-processing reference for real-time online quantitative LIBS analysis.
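    The signal-to-background idea can be illustrated with a small synthetic sketch. The window widths, line shape, and spectra below are invented for illustration, and ordinary least squares is substituted for the SVM regression the authors used.

```python
import numpy as np

def signal_to_background(wl, intensity, line_wl, line_halfwidth=0.2,
                         bg_offsets=(0.5, 1.0)):
    """S/B for one emission line: peak intensity in a window around the line
    divided by the mean continuum intensity in two flanking windows.
    (All window widths are illustrative, not taken from the paper.)"""
    d = np.abs(wl - line_wl)
    in_line = d <= line_halfwidth
    in_bg = (d > bg_offsets[0]) & (d <= bg_offsets[1])
    return intensity[in_line].max() / intensity[in_bg].mean()

rng = np.random.default_rng(1)
wl = np.linspace(400.0, 410.0, 2000)

def spectrum(conc, gain):
    # Gaussian emission line at 405 nm on a gently sloping continuum; `gain`
    # mimics shot-to-shot drift in laser power / collection efficiency.
    line = conc * np.exp(-0.5 * ((wl - 405.0) / 0.1) ** 2)
    background = 1.0 + 0.01 * (wl - 400.0)
    return gain * (line + background)

concs = np.array([0.5, 1.0, 2.0, 4.0])
gains = rng.uniform(0.5, 2.0, size=concs.size)
sb = np.array([signal_to_background(wl, spectrum(c, g), 405.0)
               for c, g in zip(concs, gains)])

# Because gain multiplies line and continuum alike, it cancels in S/B, so a
# calibration of concentration against S/B is insensitive to that drift.
# (Ordinary least squares replaces the authors' SVM regression here.)
slope, intercept = np.polyfit(sb, concs, 1)
```

    The design point is that the raw peak intensity varies with the random `gain`, while S/B does not, which is why the calibration against S/B stays linear despite the simulated drift.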

  7. Quantitative PCR detection of Batrachochytrium dendrobatidis DNA from sediments and water

    USGS Publications Warehouse

    Kirshtein, Julie D.; Anderson, Chauncey W.; Wood, J.S.; Longcore, Joyce E.; Voytek, Mary A.

    2007-01-01

    The fungal pathogen Batrachochytrium dendrobatidis (Bd) causes chytridiomycosis, a disease implicated in amphibian declines on 5 continents. Polymerase chain reaction (PCR) primer sets exist with which amphibians can be tested for this disease, and advances in sampling techniques allow non-invasive testing of animals. We developed filtering and PCR-based quantitative methods by modifying existing PCR assays to detect Bd DNA in water and sediments, without the need for testing amphibians; we tested the methods at 4 field sites. The SYBR-based assay using Boyle primers (SYBR/Boyle assay) and the Taqman-based assay using Wood primers performed similarly with samples generated in the laboratory (Bd-spiked filters), but the SYBR/Boyle assay detected Bd DNA in more field samples. We detected Bd DNA in water from 3 of 4 sites tested, including one pond historically negative for chytridiomycosis. Zoospore equivalents in sampled water ranged from 19 to 454 per liter (nominal detection limit: 10 DNA copies, or about 0.06 zoospore). We did not detect Bd DNA in sediments collected at any site. Our filtering and amplification methods provide a new tool to investigate critical aspects of Bd in the environment. © Inter-Research 2007.
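    The copy-to-zoospore conversion implied by the abstract's numbers (10 DNA copies ≈ 0.06 zoospore) can be expressed as a one-line helper; the constant is inferred from those two figures rather than stated directly by the authors.

```python
# Implied by the abstract: 10 DNA copies ~ 0.06 zoospore, i.e. roughly
# 167 rDNA copies per zoospore. This constant is an inference from those
# two numbers, not a value the paper states.
COPIES_PER_ZOOSPORE = 10 / 0.06

def zoospore_equivalents_per_liter(copies_detected, liters_filtered):
    """qPCR copy count from a filtered water sample -> zoospore equivalents
    per liter, the unit reported in the abstract."""
    return copies_detected / COPIES_PER_ZOOSPORE / liters_filtered

# e.g. 7,500 copies recovered from 1 L of filtered pond water:
density = zoospore_equivalents_per_liter(7500, 1.0)  # ~45 zoospores per liter
```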

  8. G6PD: The Test

    MedlinePlus

    ... or her cells. Confirmation testing will involve a quantitative test, with which the actual amount of enzyme ...

  9. Rapid and quantitative detection of zoonotic influenza A virus infection utilizing coumarin-derived dendrimer-based fluorescent immunochromatographic strip test (FICT).

    PubMed

    Yeo, Seon-Ju; Huong, Dinh Thi; Hong, Nguyen Ngoc; Li, Chun-Ying; Choi, Kyunghan; Yu, Kyoungsik; Choi, Du-Young; Chong, Chom-Kyu; Choi, Hak Soo; Mallik, Shyam Kumar; Kim, Hak Sung; Sung, Haan Woo; Park, Hyun

    2014-01-01

    Great efforts have been made to develop robust signal-generating fluorescence materials that help improve the rapid diagnostic test (RDT) in terms of sensitivity and quantification. In this study, we developed a coumarin-derived dendrimer-based fluorescent immunochromatographic strip test (FICT) assay with enhanced sensitivity as a quantitative diagnostic tool for typical RDT environments. The accuracy of the proposed FICT was compared with that of dot blot immunoassay techniques and conventional RDTs. Through conjugation of coumarin-derived dendrimers with latex beads, fluorescent emission covering broad output spectral ranges was obtained, which provided the distinct advantage of easy discrimination of the fluorescent emission of the latex beads with a simple insertion of a long-pass optical filter away from the excitation wavelength. The newly developed FICT assay was able to detect 100 ng/10 μL of influenza A nucleoprotein (NP) antigen within 5 minutes, corresponding to 2.5-fold higher sensitivity than that of the dot blot immunoassay or conventional RDTs. Moreover, the FICT assay was confirmed to detect at least four avian influenza A subtypes (H5N3, H7N1, H7N7, and H9N2). On applying the FICT to clinical swab samples infected with respiratory viruses, our assay was confirmed to differentiate influenza H1N1 infection from other respiratory viral diseases. These data demonstrate that the proposed FICT assay detects zoonotic influenza A viruses with high sensitivity and enables quantitation of infection intensity by providing numerical diagnostic values, thus demonstrating enhanced detectability of influenza A viruses.

  10. Qualitative and quantitative evaluation of human dental enamel after bracket debonding: a noncontact three-dimensional optical profilometry analysis.

    PubMed

    Ferreira, Fabiano G; Nouer, Darcy F; Silva, Nelson P; Garbui, Ivana U; Correr-Sobrinho, Lourenço; Nouer, Paulo R A

    2014-09-01

    The aim of this study was to undertake a qualitative and quantitative evaluation of changes on enamel surfaces after debonding of brackets followed by finishing procedures, using a high-resolution three-dimensional optical profiler and to investigate the accuracy of the technique. The labial surfaces of 36 extracted upper central incisors were examined. Before bonding, the enamel surfaces were subjected to profilometry, recording four amplitude parameters. Brackets were then bonded using two types of light-cured orthodontic adhesive: composite resin and resin-modified glass ionomer cement. Finishing was performed by three different methods: pumice on a rubber cup, fine and ultrafine aluminum oxide discs, and microfine diamond cups followed by silicon carbide brushes. The samples were subsequently re-analyzed by profilometry. Wilcoxon signed-rank test, Kruskal-Wallis test (p < 0.05) and a posteriori Mann-Whitney U test with Bonferroni correction (p < 0.0167) revealed a significant reduction of enamel roughness when diamond cups followed by silicon carbide brushes were used to finish surfaces that had remnants of resin-modified glass ionomer adhesive and when pumice was used to finish surfaces that had traces of composite resin. Enamel loss was minimal. The 3D optical profilometry technique was able to provide accurate qualitative and quantitative assessment of changes on the enamel surface after debonding. Morphological changes in the topography of dental surfaces, especially if related to enamel loss and roughness, are of considerable clinical importance. The quantitative evaluation method used herein enables a more comprehensive understanding of the effects of orthodontic bonding on teeth.

  11. Quantitative Evaluation of Electrodes for External Urethral Sphincter Electromyography during Bladder-to-Urethral Guarding Reflex

    PubMed Central

    Steward, James E.; Clemons, Jessica D.; Zaszczurynski, Paul J.; Butler, Robert S.; Damaser, Margot S.; Jiang, Hai-Hong

    2009-01-01

    Purpose Accuracy in the recording of external urethral sphincter (EUS) electromyography (EMG) is an important goal in the quantitative evaluation of urethral function. The aim of this study was to quantitatively compare electrode recordings taken during tonic activity and leak point pressure (LPP) testing. Methods Several electrodes, including a surface electrode (SE), concentric electrode (CE), and wire electrode (WE), were placed on the EUS singly and simultaneously in six female Sprague-Dawley rats under urethane anesthesia. The bladder was filled via a retropubic catheter while LPP testing and EUS EMG recording were done. Quantitative baseline correction of the EUS EMG signal was performed to reduce baseline variation. Amplitude and frequency of one-second samples of the EUS EMG signal were measured before LPP (tonic activity) and during peak LPP activity. Results The SE, CE, and WE signals demonstrated tonic activity before LPP and an increase in activity during LPP, suggesting that the electrodes accurately recorded EUS activity during tonic activity and during the bladder-to-EUS guarding reflex, regardless of the size or location of detection areas. SE recordings required significantly less baseline correction than both CE and WE recordings. The activity in CE-recorded EMG was significantly higher than that of the SE and WE in both single and simultaneous recordings. Conclusions These electrodes may be suitable for testing EUS EMG activity. The SE signal had significantly less baseline variation, and the CE detected local activity more sensitively than the other electrodes, which may provide insight into choosing an appropriate electrode for EUS EMG recording. PMID:19680661

  12. Label-free and amplified quantitation of proteins in complex mixtures using diffractive optics technology.

    PubMed

    Cleverley, Steve; Chen, Irene; Houle, Jean-François

    2010-01-15

    Immunoaffinity approaches remain invaluable tools for the characterization and quantitation of biopolymers. Their application in separation science is often limited by the challenges of immunoassay development. Typical end-point immunoassays require time-consuming and labor-intensive approaches for optimization. Real-time label-free analysis using diffractive optics technology (dot) helps guide a very effective iterative process for rapid immunoassay development. Both label-free and amplified approaches can be used throughout feasibility testing and ultimately in the final assay, providing a robust platform for biopolymer analysis over a very broad dynamic range. We demonstrate the use of dot in rapidly developing assays for quantitating (1) human IgG in complex media, (2) a fusion protein in production media, and (3) protein A contamination in purified immunoglobulin preparations. © 2009 Elsevier B.V. All rights reserved.

  13. Investigation of a dual modal method for bone pathologies using quantitative ultrasound and photoacoustics

    NASA Astrophysics Data System (ADS)

    Steinberg, Idan; Gannot, Israel; Eyal, Avishay

    2015-03-01

    Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and overwhelming related healthcare costs. In recent work, we have developed a multi-spectral, frequency-domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over pure ultrasonic or optical methods, as it provides both molecular information from the bone absorption spectrum and bone mechanical status from the characteristics of the ultrasound propagation. These characteristics include both the speed of sound (SOS) and broadband ultrasonic attenuation (BUA). To test the method's quantitative predictions, we constructed a combined ultrasound and photoacoustic setup. Here, we experimentally present a dual-modality system and compare the two methods on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into bone structure and functional status.

  14. Quantitative assessment of skin, hair, and iris variation in a diverse sample of individuals and associated genetic variation.

    PubMed

    Norton, Heather L; Edwards, Melissa; Krithika, S; Johnson, Monique; Werren, Elizabeth A; Parra, Esteban J

    2016-08-01

    The main goals of this study are to 1) quantitatively measure skin, hair, and iris pigmentation in a diverse sample of individuals, 2) describe variation within and between these samples, and 3) demonstrate how quantitative measures can facilitate genotype-phenotype association tests. We quantitatively characterize skin, hair, and iris pigmentation using the Melanin (M) Index (skin) and CIELab values (hair) in 1,450 individuals who self-identify as African American, East Asian, European, Hispanic, or South Asian. We also quantify iris pigmentation in a subset of these individuals using CIELab values from high-resolution iris photographs. We compare mean skin M index and hair and iris CIELab values among populations using ANOVA and MANOVA, respectively, and test for genotype-phenotype associations in the European sample. All five populations are significantly different for skin (P < 2 × 10⁻¹⁶) and hair color (P < 2 × 10⁻¹⁶). Our quantitative analysis of iris and hair pigmentation reinforces the continuous, rather than discrete, nature of these traits. We confirm the association of three loci (rs16891982, rs12203592, and rs12913832) with skin pigmentation and four loci (rs12913832, rs12203592, rs12896399, and rs16891982) with hair pigmentation. Interestingly, the derived rs12203592 T allele located within the IRF4 gene is associated with lighter skin but darker hair color. The quantitative methods used here provide a fine-scale assessment of pigmentation phenotype and facilitate genotype-phenotype associations, even with relatively small sample sizes. This represents an important expansion of current investigations into pigmentation phenotype and associated genetic variation by including non-European and admixed populations. Am J Phys Anthropol 160:570-581, 2016. © 2015 Wiley Periodicals, Inc.
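    The ANOVA comparison of mean M index across groups can be sketched as follows; the genotype groups, means, and spreads below are entirely invented to illustrate the test design, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)

def f_statistic(groups):
    """One-way ANOVA F statistic for a list of 1-D sample arrays."""
    all_vals = np.concatenate(groups)
    grand = all_vals.mean()
    k, N = len(groups), all_vals.size
    ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (N - k))

# Hypothetical melanin-index readings grouped by genotype at a pigmentation
# locus; group means and standard deviations are purely illustrative.
gg = rng.normal(30.0, 3.0, 200)
gc = rng.normal(33.0, 3.0, 150)
cc = rng.normal(36.0, 3.0, 100)
F = f_statistic([gg, gc, cc])
```

    A quantitative phenotype like the M index makes this kind of per-genotype mean comparison possible, which is the point the abstract makes about facilitating genotype-phenotype association tests.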

  15. The efficacy of semi-quantitative urine protein-to-creatinine (P/C) ratio for the detection of significant proteinuria in urine specimens in health screening settings.

    PubMed

    Chang, Chih-Chun; Su, Ming-Jang; Ho, Jung-Li; Tsai, Yu-Hui; Tsai, Wei-Ting; Lee, Shu-Jene; Yen, Tzung-Hai; Chu, Fang-Yeh

    2016-01-01

    Urine protein detection can be underestimated with the conventional dipstick method because of variations in urine aliquots. This study aimed to assess the efficacy of the semi-quantitative urine protein-to-creatinine (P/C) ratio compared with other laboratory methods. Random urine samples were requested from patients undergoing chronic kidney disease (CKD) screening. Significant proteinuria was defined as a quantitative P/C ratio of at least 150 mg protein/g creatinine. The semi-quantitative P/C ratio, dipstick protein, and quantitative protein concentrations were compared and analyzed. Of the 2932 urine aliquots, 156 (5.3%) were considered diluted, and 60 (39.2%) were found to have significant proteinuria. The semi-quantitative P/C ratio test had the best sensitivity (70.0%) and specificity (95.9%) as well as the lowest underestimation rate (0.37%) compared with the other laboratory methods in the study. In the semi-quantitative P/C ratio test, 19 (12.2%) had positive, 52 (33.3%) had diluted, and 85 (54.5%) had negative results. Of those with positive results, 7 (36.8%) were detected as positive by the traditional dipstick urine protein test and 9 (47.4%) by the quantitative urine protein test. Additionally, of those with diluted results, 25 (48.1%) had significant proteinuria, all of which had been classified as non-significant by both other tests. The semi-quantitative urine P/C ratio is clinically applicable based on its better sensitivity and screening ability for significant proteinuria than the other laboratory methods, particularly in diluted urine samples. To establish an effective strategy for CKD prevention, urine protein screening with the semi-quantitative P/C ratio could be considered.
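    The P/C ratio threshold and the screening metrics reported above can be sketched as follows; the 2x2 counts are hypothetical, chosen only so the sensitivity matches the reported 70.0%, and are not the paper's raw data.

```python
def pc_ratio_mg_per_g(protein_mg_dl, creatinine_mg_dl):
    # Both analytes are measured in mg/dL, so the volume cancels; the factor
    # of 1000 converts mg of creatinine to g.
    return 1000.0 * protein_mg_dl / creatinine_mg_dl

# 30 mg/dL protein with 100 mg/dL creatinine -> 300 mg/g: significant
significant = pc_ratio_mg_per_g(30.0, 100.0) >= 150.0

# Screening performance from a 2x2 table (hypothetical counts):
tp, fn = 42, 18        # reference-positive samples (42 + 18 = 60)
tn, fp = 2800, 120     # reference-negative samples
sensitivity = tp / (tp + fn)           # 0.70
specificity = tn / (tn + fp)           # ~0.96
```

    Dividing by creatinine is what removes the dilution dependence that defeats the dipstick: a dilute sample lowers protein and creatinine concentrations together, leaving the ratio unchanged.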

  16. Test-Analysis Correlation for Space Shuttle External Tank Foam Impacting RCC Wing Leading Edge Component Panels

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.

    2008-01-01

    The Space Shuttle Columbia Accident Investigation Board recommended that NASA develop, validate, and maintain a modeling tool capable of predicting the damage threshold for debris impacts on the Space Shuttle Reinforced Carbon-Carbon (RCC) wing leading edge and nosecap assembly. The results presented in this paper are one part of a multi-level approach that supported the development of the predictive tool used to recertify the shuttle for flight following the Columbia Accident. The assessment of predictive capability was largely based on test analysis comparisons for simpler component structures. This paper provides comparisons of finite element simulations with test data for external tank foam debris impacts onto 6-in. square RCC flat panels. Both quantitative displacement and qualitative damage assessment correlations are provided. The comparisons show good agreement and provided the Space Shuttle Program with confidence in the predictive tool.

  17. Testing new approaches to carbonate system simulation at the reef scale: the ReefSam model first results, application to a question in reef morphology and future challenges.

    NASA Astrophysics Data System (ADS)

    Barrett, Samuel; Webster, Jody

    2016-04-01

    Numerical simulation of the stratigraphy and sedimentology of carbonate systems (carbonate forward stratigraphic modelling - CFSM) provides significant insight into both the physical nature of these systems and the processes which control their development. It also provides the opportunity to quantitatively test conceptual models concerning stratigraphy, sedimentology or geomorphology, and allows us to extend our knowledge either spatially (e.g. between bore holes) or temporally (forwards or backwards in time). The latter is especially important in determining the likely future development of carbonate systems, particularly regarding the effects of climate change. This application, by its nature, requires successful simulation of carbonate systems on short time scales and at high spatial resolutions. Previous modelling attempts have typically focused on the scales of kilometers and kilo-years or greater (the scale of entire carbonate platforms), rather than the scale of centuries or decades and tens to hundreds of meters (the scale of individual reefs). Previous work has identified limitations in common approaches to simulating important reef processes. We present a new CFSM, the Reef Sedimentary Accretion Model (ReefSAM), which is designed to test new approaches to simulating reef-scale processes, with the aim of better simulating the past and future development of coral reefs. Four major features have been tested: 1. A simulation of wave-based hydrodynamic energy with multiple simultaneous directions and intensities, including wave refraction, interaction, and lateral sheltering. 2. Sediment transport simulated as sediment being moved from cell to cell in an iterative fashion until complete deposition. 3. A coral growth model including consideration of local wave energy and composition of the basement substrate (as well as depth). 4. A highly quantitative model-testing approach in which dozens of output parameters describing the reef morphology and development are compared with observational data. Despite being a test-bed and a work in progress, ReefSAM was able to simulate the Holocene development of One Tree Reef in the Southern Great Barrier Reef (Australia) and improved upon previous modelling attempts in terms of both quantitative measures and qualitative outputs, such as the presence of previously un-simulated reef features. Given the success of the model in simulating the Holocene development of OTR, we used it to quantitatively explore the effect of basement substrate depth and morphology on reef maturity/lagoonal filling (as discussed by Purdy and Gischer 2005). Initial results show a number of non-linear relationships between basement substrate depth, lagoonal filling, and the volume of sand produced on the reef rims and deposited in the lagoon. Lastly, further testing of the model has revealed new challenges which are likely to manifest in any attempt at reef-scale simulation. Subtly different sets of energy direction and magnitude input parameters (different in each time step but with identical probability distributions across the entire model run) resulted in a wide range of quantitative model outputs. Time step length is a likely contributing factor, and the results of further testing to address this challenge will be presented.

  18. Least squares QR-based decomposition provides an efficient way of computing optimal regularization parameter in photoacoustic tomography.

    PubMed

    Shaw, Calvin B; Prakash, Jaya; Pramanik, Manojit; Yalavarthy, Phaneendra K

    2013-08-01

    A computationally efficient approach that computes the optimal regularization parameter for the Tikhonov-minimization scheme is developed for photoacoustic imaging. This approach is based on the least squares QR (LSQR) decomposition, a well-known dimensionality reduction technique for large systems of equations. It is shown that the proposed framework is effective in terms of quantitative and qualitative reconstruction of the initial pressure distribution, enabled by finding an optimal regularization parameter. The computational efficiency and performance of the proposed method are shown using a test case of a numerical blood vessel phantom, where the initial pressure is exactly known for quantitative comparison.
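    The core identity involved here, that LSQR's damping term solves the Tikhonov-regularized problem without forming the normal equations, can be sketched with scipy. The paper's actual parameter-selection scheme is more involved; the toy forward model below is invented and far simpler than a photoacoustic system matrix.

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(2)

# Invented ill-conditioned forward model A and noisy data b, standing in for
# the photoacoustic system matrix and measured pressure signals.
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.logspace(0, -6, n)                 # rapidly decaying singular values
A = U @ np.diag(s) @ V.T
x_true = rng.standard_normal(n)
b = A @ x_true + 1e-4 * rng.standard_normal(n)

lam = 1e-3                                # one candidate regularization value
# LSQR's `damp` argument solves min ||Ax - b||^2 + lam^2 ||x||^2, i.e. the
# Tikhonov problem, without ever forming the dense normal equations.
x_lsqr = lsqr(A, b, damp=lam, atol=1e-12, btol=1e-12, iter_lim=10_000)[0]

# Direct Tikhonov solution for comparison
x_direct = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)
```

    Avoiding the normal equations matters for imaging problems because A.T @ A is dense and squares the condition number, whereas LSQR only needs products with A and A.T.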

  19. The Triangulation Algorithmic: A Transformative Function for Designing and Deploying Effective Educational Technology Assessment Instruments

    ERIC Educational Resources Information Center

    Osler, James Edward

    2013-01-01

    This paper discusses the implementation of the Tri-Squared Test as an advanced statistical measure used to verify and validate the research outcomes of Educational Technology software. A mathematical and epistemological rationale is provided for the transformative process of qualitative data into quantitative outcomes through the Tri-Squared Test…

  20. An Overview of Addiction Research Center Inventory Scales (ARCI): An Appendix and Manual of Scales.

    ERIC Educational Resources Information Center

    Haertzen, C.A.

    The Addiction Research Center Inventory is a 550-item multipurpose test measuring the broad range of physical, emotive, cognitive, and subjective effects of drugs. This manual provides technical information concerning the 38 most valid scales, a quantitative method for characterizing the similarity of a profile of scores for the subject, group, or…

  1. The Inevitable Corruption of Indicators and Educators through High-Stakes Testing

    ERIC Educational Resources Information Center

    Nichols, Sharon L.; Berliner, David C.

    2005-01-01

    This research provides lengthy proof of a principle of social science known as Campbell's law: "The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor." Applying…

  2. A Spectral Method for Color Quantitation of a Protein Drug Solution.

    PubMed

    Swartz, Trevor E; Yin, Jian; Patapoff, Thomas W; Horst, Travis; Skieresz, Susan M; Leggett, Gordon; Morgan, Charles J; Rahimi, Kimia; Marhoul, Joseph; Kabakoff, Bruce

    2016-01-01

    Color is an important quality attribute for biotherapeutics. In the biotechnology industry, a visual method is most commonly utilized for color characterization of liquid protein drug solutions. The color testing method is used for both batch release and stability testing for quality control. Using that method, an analyst visually determines the color of the sample by choosing the closest matching European Pharmacopeia reference color solution. The requirement to judge the best match makes it a subjective method. Furthermore, the visual method does not capture data on hue or chroma that would allow for improved product characterization and the ability to detect subtle differences between samples. To overcome these challenges, we describe a quantitative method for color determination that greatly reduces the variability in measuring color and allows for a more precise understanding of color differences. Following color industry standards established by the International Commission on Illumination, this method converts a protein solution's visible absorption spectrum to L*a*b* color space. Color matching is achieved within the L*a*b* color space, a practice that is already widely used in other industries. The work performed here is to facilitate the adoption of a quantitative spectral method and the transition from the traditional visual assessment method. We describe the algorithm used such that the quantitative spectral method correlates with the currently used visual method. In addition, we provide the L*a*b* values for the European Pharmacopeia reference color solutions required for the quantitative method. We determined these L*a*b* values by gravimetrically preparing and measuring multiple lots of the reference color solutions. We demonstrate that the visual assessment and the quantitative spectral method are comparable using both low- and high-concentration antibody solutions and solutions with varying turbidity.
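The spectral-to-color step such a method relies on, converting CIE XYZ tristimulus values (obtained by weighting the measured spectrum with the CIE standard observer) into L*a*b*, can be sketched with the standard CIE formulas. This is a minimal illustration assuming a D65/2° white point, not the authors' validated implementation:

```python
def xyz_to_lab(X, Y, Z, white=(95.047, 100.0, 108.883)):
    """Convert CIE XYZ tristimulus values to CIE L*a*b*.

    Uses the standard CIE transformation; the default white point
    is D65 for the 2-degree standard observer (an assumption here).
    """
    def f(t):
        # Piecewise cube-root function from the CIELAB definition.
        d = 6.0 / 29.0
        return t ** (1.0 / 3.0) if t > d ** 3 else t / (3.0 * d * d) + 4.0 / 29.0

    Xn, Yn, Zn = white
    fx, fy, fz = f(X / Xn), f(Y / Yn), f(Z / Zn)
    L = 116.0 * fy - 16.0          # lightness
    a = 500.0 * (fx - fy)          # green-red axis
    b = 200.0 * (fy - fz)          # blue-yellow axis
    return L, a, b
```

By construction, the white point itself maps to L* = 100 with a* = b* = 0, so a perfectly colorless solution sits at the top of the L* axis with zero chroma.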

  3. Quality of Life and Psychological State in Chinese Breast Cancer Patients Who Received BRCA1/2 Genetic Testing

    PubMed Central

    Qiu, Jiajia; Guan, Jiaqin; Yang, Xiaochen; Wu, Jiong; Liu, Guangyu; Di, Genhong; Chen, Canming; Hou, Yifeng; Han, Qixia; Shen, Zhenzhou; Shao, Zhimin; Hu, Zhen

    2016-01-01

    Background This study aims to understand the quality of life (QOL) and psychological state (PS) of Chinese breast cancer patients who received BRCA1/2 genetic testing; to examine the psychological differences between BRCA1/2 mutation carriers and non-carriers; and to further explore the psychological experience of BRCA1/2 mutation carriers. Methods This study combined quantitative and qualitative designs. First, we performed a quantitative investigation using the FACT-B (Chinese version) and the Irritability, Depression and Anxiety scale (IDA) to assess QOL and PS in breast cancer patients who received BRCA1/2 genetic testing. Then, semi-structured in-depth qualitative interviews with 13 mutation carriers were conducted in hospital. Results The quantitative study showed that QOL scores were relatively high and IDA scores were relatively low among the patients, with no significant difference in QOL or IDA scores between non-carriers and carriers. From the qualitative analysis, four main themes emerged: (1) finding the reason for having breast cancer; (2) negative emotions; (3) behavioral changes; (4) lack of information. Conclusions The present study showed that QOL and PS are good among breast cancer patients who received genetic testing. Genetic testing itself does not cause long-term psychosocial effects. BRCA1/2 mutation carriers may experience certain negative emotions when they first learn their test results and may initiate behavioral and lifestyle changes. Patients with a BRCA1/2 mutation in mainland China want more information on the genetic aspects of their disease. Professional information and advice can be provided to relieve patients' negative emotions when they are informed of a gene defect. PMID:27428375

  4. Overview of classical test theory and item response theory for the quantitative assessment of items in developing patient-reported outcomes measures.

    PubMed

    Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D

    2014-05-01

    The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which the hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has limited qualitative data and wants preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, classical test theory and/or IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.

  5. Calculator Use on the "GRE"® Revised General Test Quantitative Reasoning Measure. ETS GRE® Board Research Report. ETS GRE®-14-02. ETS Research Report. RR-14-25

    ERIC Educational Resources Information Center

    Attali, Yigal

    2014-01-01

    Previous research on calculator use in standardized assessments of quantitative ability focused on the effect of calculator availability on item difficulty and on whether test developers can predict these effects. With the introduction of an on-screen calculator on the Quantitative Reasoning measure of the "GRE"® revised General Test, it…

  6. Quantitative imaging biomarkers: Effect of sample size and bias on confidence interval coverage.

    PubMed

    Obuchowski, Nancy A; Bullen, Jennifer

    2017-01-01

    Introduction Quantitative imaging biomarkers (QIBs) are being increasingly used in medical practice and clinical trials. An essential first step in the adoption of a quantitative imaging biomarker is the characterization of its technical performance, i.e. precision and bias, through one or more performance studies. Then, given the technical performance, a confidence interval for a new patient's true biomarker value can be constructed. Estimating bias and precision can be problematic because rarely are both estimated in the same study, precision studies are usually quite small, and bias cannot be measured when there is no reference standard. Methods A Monte Carlo simulation study was conducted to assess factors affecting nominal coverage of confidence intervals for a new patient's quantitative imaging biomarker measurement and for change in the quantitative imaging biomarker over time. Factors considered include sample size for estimating bias and precision, effect of fixed and non-proportional bias, clustered data, and absence of a reference standard. Results Technical performance studies of a quantitative imaging biomarker should include at least 35 test-retest subjects to estimate precision and 65 cases to estimate bias. Confidence intervals for a new patient's quantitative imaging biomarker measurement constructed under the no-bias assumption provide nominal coverage as long as the fixed bias is <12%. For confidence intervals of the true change over time, linearity must hold and the slope of the regression of the measurements vs. true values should be between 0.95 and 1.05. The regression slope can be assessed adequately as long as fixed multiples of the measurand can be generated. Even small non-proportional bias greatly reduces confidence interval coverage. Multiple lesions in the same subject can be treated as independent when estimating precision. 
Conclusion Technical performance studies of quantitative imaging biomarkers require moderate sample sizes in order to provide robust estimates of bias and precision for constructing confidence intervals for new patients. Assumptions of linearity and non-proportional bias should be assessed thoroughly.
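The confidence-interval construction discussed above can be sketched under the simplest model the abstract implies: normally distributed measurement error with a fixed bias and a within-subject SD estimated from technical performance studies. The function name and interface below are illustrative, not the authors' code:

```python
def qib_confidence_interval(y, wsd, bias=0.0, z=1.96):
    """Confidence interval for a patient's true biomarker value.

    y:    the patient's measured quantitative imaging biomarker value
    wsd:  within-subject SD (precision) from a test-retest study
    bias: estimated fixed bias from a bias study (0.0 = no-bias assumption)
    z:    normal quantile (1.96 gives a nominal 95% interval)

    Assumes normal errors and ignores uncertainty in the bias and
    precision estimates themselves (hence the sample-size requirements
    discussed in the abstract).
    """
    center = y - bias            # bias-correct the measurement
    half = z * wsd               # half-width of the interval
    return center - half, center + half
```

Under the no-bias assumption (`bias=0.0`), this reduces to y ± 1.96·wSD, the interval whose coverage the simulation study evaluates as fixed bias grows.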

  7. 34 CFR 668.145 - Test approval procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...

  8. 34 CFR 668.145 - Test approval procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...

  9. 34 CFR 668.145 - Test approval procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...

  10. 34 CFR 668.145 - Test approval procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...

  11. The breaking load method - Results and statistical modification from the ASTM interlaboratory test program

    NASA Technical Reports Server (NTRS)

    Colvin, E. L.; Emptage, M. R.

    1992-01-01

    The breaking load test provides quantitative stress corrosion cracking data by determining the residual strength of tension specimens that have been exposed to corrosive environments. Eight laboratories have participated in a cooperative test program under the auspices of ASTM Committee G-1 to evaluate the new test method. All eight laboratories were able to distinguish between three tempers of aluminum alloy 7075. The statistical analysis procedures that were used in the test program do not work well in all situations. An alternative procedure using Box-Cox transformations shows a great deal of promise. An ASTM standard method has been drafted which incorporates the Box-Cox procedure.

  12. El uso de las simulaciones educativas en la ensenanza de conceptos de ciencias y su importancia desde la perspectiva de los estudiantes candidatos a maestros

    NASA Astrophysics Data System (ADS)

    Crespo Ramos, Edwin O.

    This research aimed to establish the differences, if any, between traditional direct teaching and constructivist teaching through the use of computer simulations, and their effect on pre-service teachers. It was also intended to gather feedback from the users of these simulations as providers of constructive teaching and learning experiences. The experimental framework used a quantitative method with a descriptive focus. The research was guided by two hypotheses and five inquiries. The data were obtained from a group of twenty-nine students, all elementary school pre-service teachers, at a private metropolitan university in Puerto Rico. They were divided into two sub-groups: experimental and control. Two instruments were used to collect data: tests and surveys. Quantitative data were analyzed with the paired-samples t test and the non-parametric Wilcoxon test. The results of the pre- and post-tests do not provide enough evidence to conclude that using the simulations as learning tools was more effective than traditional teaching; however, the quantitative results were not sufficient to reject hypothesis Ho1. On the other hand, the surveys showed an overall positive attitude toward these simulations. The importance of including hands-on activities in daily lesson planning was well recognized among the pre-service teachers. After participating in and working with these simulations, the pre-service teachers said they were convinced that they would use them as teaching tools in the classroom. On the basis of these results, hypothesis Ho2 was rejected. The evidence also showed that pre-service teachers need further professional development to improve their skills in applying these simulations in the classroom.

  13. Paper-Based Electrochemical Biosensors: From Test Strips to Paper-Based Microfluidics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Bingwen; Du, Dan; Hua, Xin

    2014-05-08

    Paper-based biosensors such as lateral flow test strips and paper-based microfluidic devices (or paperfluidics) are inexpensive, rapid, flexible, and easy-to-use analytical tools. An apparent trend is to move their readout from qualitative assessment to quantitative determination, and electrochemical detection plays an important role in this quantification. This review focuses on biosensors enabled by electrochemical (EC) detection. The first part provides detailed examples of paper test strips; the second gives an overview of paperfluidics employing EC detection. The review ends with an outlook and recommendations on future directions for EC-enabled biosensors.

  14. Reinventing the Ames test as a quantitative lab that connects classical and molecular genetics.

    PubMed

    Goodson-Gregg, Nathan; De Stasio, Elizabeth A

    2009-01-01

    While many institutions use a version of the Ames test in the undergraduate genetics laboratory, students typically are not exposed to techniques or procedures beyond qualitative analysis of phenotypic reversion, thereby seriously limiting the scope of learning. We have extended the Ames test to include both quantitative analysis of reversion frequency and molecular analysis of revertant gene sequences. By giving students a role in designing their quantitative methods and analyses, students practice and apply quantitative skills. To help students connect classical and molecular genetic concepts and techniques, we report here procedures for characterizing the molecular lesions that confer a revertant phenotype. We suggest undertaking reversion of both missense and frameshift mutants to allow a more sophisticated molecular genetic analysis. These modifications and additions broaden the educational content of the traditional Ames test teaching laboratory, while simultaneously enhancing students' skills in experimental design, quantitative analysis, and data interpretation.
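The quantitative step described above, reversion frequency, is the number of revertant colonies normalized by the number of viable cells plated. A minimal sketch (the function name and dilution handling are illustrative, not part of the published protocol):

```python
def reversion_frequency(revertant_colonies, viable_count, dilution_factor=1.0):
    """Reversion frequency: revertant colonies per viable cell plated.

    revertant_colonies: colony count on the selective plate
    viable_count:       colony count on the (diluted) viability plate
    dilution_factor:    scale factor relating the viability-plate count
                        to the total viable cells plated (assumed input)
    """
    viable_cells = viable_count * dilution_factor
    if viable_cells <= 0:
        raise ValueError("viable cell estimate must be positive")
    return revertant_colonies / viable_cells
```

For example, 30 revertant colonies over roughly 1.5 × 10⁸ viable cells gives a reversion frequency of 2 × 10⁻⁷, which students can then compare across missense and frameshift strains.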

  15. DGT Passive Sampling for Quantitative in Situ Measurements of Compounds from Household and Personal Care Products in Waters.

    PubMed

    Chen, Wei; Li, Yanying; Chen, Chang-Er; Sweetman, Andrew J; Zhang, Hao; Jones, Kevin C

    2017-11-21

    Widespread use of organic chemicals in household and personal care products (HPCPs) and their discharge into aquatic systems mean that reliable, robust techniques to monitor environmental concentrations are needed. The passive sampling approach of diffusive gradients in thin-films (DGT) is developed here and demonstrated to provide in situ quantitative, time-weighted average (TWA) measurements of these chemicals in waters. The technique is developed for HPCPs, including preservatives, antioxidants, and disinfectants, by evaluating the performance of different binding agents. Ultrasonic extraction of the binding gels in acetonitrile gave good, consistent recoveries for all test chemicals. Uptake by DGT with HLB (hydrophilic-lipophilic-balanced) as the binding agent was relatively independent of pH (3.5-9.5), ionic strength (0.001-0.1 M), and dissolved organic matter (0-20 mg L⁻¹), making it suitable for applications across a wide range of environments. Experiments varying deployment time and diffusion layer thickness confirmed that the chemical masses accumulated by DGT are consistent with theoretical predictions. The technique was further tested and applied in the influent and effluent of a wastewater treatment plant, and results were compared with conventional grab sampling and 24-h composited samples from autosamplers. DGT provided TWA concentrations over deployments of up to 18 days, with minimal effects from biofouling or the diffusive boundary layer. The field application demonstrated the advantages of the DGT technique: it provides in situ analyte preconcentration in a simple matrix, with more quantitative measurement of the HPCP analytes.
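The TWA concentration in DGT follows from the standard DGT equation, C = M·Δg/(D·A·t), where M is the analyte mass accumulated on the binding gel, Δg the diffusion layer thickness, D the diffusion coefficient in the gel, A the exposure area, and t the deployment time. A minimal sketch under those standard definitions (function name and units are illustrative):

```python
def dgt_twa_concentration(mass, dg, D, area, t):
    """Time-weighted average concentration from the standard DGT equation
    C = M * dg / (D * A * t).

    mass: accumulated analyte mass on the binding gel (ng)
    dg:   diffusion layer thickness (cm)
    D:    diffusion coefficient in the gel (cm^2 s^-1)
    area: exposure window area (cm^2)
    t:    deployment time (s)
    Returns the TWA concentration in ng cm^-3.
    """
    if min(dg, D, area, t) <= 0:
        raise ValueError("dg, D, area, and t must all be positive")
    return mass * dg / (D * area * t)
```

Because M integrates uptake over the whole deployment, the result is inherently a time-weighted average, which is why long deployments (up to 18 days here) smooth over short-term concentration spikes that grab samples can miss.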

  16. Implementing online quantitative support modules in an intermediate-level course

    NASA Astrophysics Data System (ADS)

    Daly, J.

    2011-12-01

    While instructors typically anticipate that students in introductory geology courses enter a class with a wide range of quantitative ability, we often overlook the fact that this may also be true in upper-level courses. Some students are drawn to the subject and experience success in early courses with an emphasis on descriptive geology, then experience frustration and disappointment in mid- and upper-level courses that are more quantitative. To bolster student confidence in quantitative skills and enhance their performance in an upper-level course, I implemented several modules from The Math You Need (TMYN) online resource with a 200-level geomorphology class. Student facility with basic quantitative skills (rearranging equations, manipulating units, and graphing) was assessed with an online pre- and post-test. During the semester, modules were assigned to complement existing course activities (for example, the module on manipulating units was assigned prior to a lab on measurement of channel area and water velocity, then calculation of discharge). The implementation was designed to be a concise review of relevant skills for students with higher confidence in their quantitative abilities, and to provide a self-paced opportunity for students with less quantitative facility to build skills. This course already includes a strong emphasis on quantitative data collection, analysis, and presentation; in the past, student performance in the course has been strongly influenced by their individual quantitative ability. I anticipate that giving students the opportunity to improve mastery of fundamental quantitative skills will improve their performance on higher-stakes assignments and exams, and will enhance their sense of accomplishment in the course.

  17. hCG Test (Pregnancy Test)

    MedlinePlus

    Also known as: Pregnancy Test, Qualitative hCG, Quantitative hCG, Beta hCG, Total hCG, Total beta hCG.

  18. Allergy Blood Test

    MedlinePlus

    Other names: IgE allergy test, Quantitative IgE, Immunoglobulin E, Total IgE, Specific IgE.

  19. Quantitative Imaging of Young's Modulus of Soft Tissues from Ultrasound Water Jet Indentation: A Finite Element Study

    PubMed Central

    Lu, Min-Hua; Mao, Rui; Lu, Yin; Liu, Zheng; Wang, Tian-Fu; Chen, Si-Ping

    2012-01-01

    Indentation testing is a widely used approach to quantitatively evaluate the mechanical characteristics of soft tissues. Young's modulus of soft tissue can be calculated from force-deformation data with known tissue thickness and Poisson's ratio using Hayes' equation. Our group previously developed a noncontact indentation system using a water jet as a soft indenter and as the coupling medium for the propagation of high-frequency ultrasound. The system has shown its ability to detect the early degeneration of articular cartilage. However, there is still a lack of a quantitative method to extract the intrinsic mechanical properties of soft tissue from water jet indentation. The purpose of this study is to investigate the relationship between the loading-unloading curves and the mechanical properties of soft tissues in order to provide an imaging technique for tissue mechanical properties. A 3D finite element model of water jet indentation was developed with consideration of finite deformation effects. An improved Hayes' equation has been derived by introducing a new scaling factor that depends on Poisson's ratio ν, the aspect ratio a/h (the radius of the indenter / the thickness of the test tissue), and the deformation ratio d/h. With this model, the Young's modulus of soft tissue can be quantitatively evaluated and imaged with an error of no more than 2%. PMID:22927890
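In its usual rearranged form, Hayes' indentation solution gives E = F(1 − ν²)/(2·a·κ·w), where κ is the geometry-dependent scaling factor (tabulated by Hayes, or the improved factor derived in this paper). A minimal sketch of that rearrangement; the κ value used below is a placeholder, not one from the paper:

```python
def youngs_modulus_hayes(force, a, w, nu, kappa):
    """Young's modulus from Hayes' indentation solution,
    E = F * (1 - nu^2) / (2 * a * kappa * w).

    force: indentation force (N)
    a:     indenter radius (m)
    w:     indentation depth (m)
    nu:    Poisson's ratio of the tissue
    kappa: scaling factor depending on a/h and nu (from Hayes' tables
           or the improved factor in the paper; supplied by the caller)
    """
    if a <= 0 or w <= 0 or kappa <= 0:
        raise ValueError("a, w, and kappa must be positive")
    return force * (1.0 - nu ** 2) / (2.0 * a * kappa * w)
```

The paper's contribution is precisely that κ is not constant: it must also track the deformation ratio d/h once finite deformations matter, which is what the improved scaling factor captures.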

  20. Multiwavelength UV/visible spectroscopy for the quantitative investigation of platelet quality

    NASA Astrophysics Data System (ADS)

    Mattley, Yvette D.; Leparc, German F.; Potter, Robert L.; Garcia-Rubio, Luis H.

    1998-04-01

    The quality of platelets transfused is vital to the effectiveness of the transfusion. Freshly prepared, discoid platelets are the most effective treatment for preventing spontaneous hemorrhage or for stopping an abnormal bleeding event. Current methodology for the routine testing of platelet quality involves random pH testing of platelet-rich plasma and visual inspection of platelet-rich plasma for a swirling pattern indicative of the discoid shape of the cells. The drawback to these methods is that they do not provide a quantitative, objective assay for platelet functionality that can be used on each platelet unit prior to transfusion. As part of a larger project aimed at characterizing whole blood and blood components with multiwavelength UV/vis spectroscopy, isolated platelets and platelets in platelet-rich plasma have been investigated. Models based on Mie theory have been developed which allow for the extraction of quantitative information on platelet size, number, and quality from multiwavelength UV/vis spectra. These models have been used to quantify changes in platelet-rich plasma during storage. The overall goal of this work is to develop a simple, rapid, quantitative assay for platelet quality that can be used prior to platelet transfusion to ensure the effectiveness of the treatment. As a result of this work, the optical properties for isolated platelets, platelet-rich plasma, and leukodepleted platelet-rich plasma have been determined.

  1. Quantitative optical scanning tests of complex microcircuits

    NASA Technical Reports Server (NTRS)

    Erickson, J. J.

    1980-01-01

    An approach for the development of the optical scanner as a screening inspection instrument for microcircuits involves comparing the quantitative differences in photoresponse images and then correlating them with electrical parameter differences in test devices. The existing optical scanner was modified so that the photoresponse data could be recorded and subsequently digitized. A method was devised for applying digital image processing techniques to the digitized photoresponse data in order to quantitatively compare the data. Electrical tests were performed and photoresponse images were recorded before and after life test intervals on two groups of test devices. Correlations were made between changes in the photoresponse images and differences or changes in the electrical parameters of the test devices.

  2. Runway Incursion Prevention System: Demonstration and Testing at the Dallas/Fort Worth International Airport

    NASA Technical Reports Server (NTRS)

    Jones, Denise R.; Quach, Cuong C.; Young, Steven D.

    2007-01-01

    A Runway Incursion Prevention System (RIPS) was tested at the Dallas-Ft. Worth International Airport (DFW) in October 2000. The system integrated airborne and ground components to provide both pilots and controllers with enhanced situational awareness, supplemental guidance cues, a real-time display of traffic information, and warning of runway incursions in order to prevent runway incidents while also improving operational capability. A series of test runs was conducted using NASA's Boeing 757 research aircraft and a test van equipped to emulate an incurring aircraft. The system was also demonstrated to over 100 visitors from the aviation community. This paper gives an overview of the RIPS, DFW flight test activities, and quantitative and qualitative results of the testing.

  3. A reliability as an independent variable (RAIV) methodology for optimizing test planning for liquid rocket engines

    NASA Astrophysics Data System (ADS)

    Strunz, Richard; Herrmann, Jeffrey W.

    2011-12-01

    The hot fire test strategy for liquid rocket engines has always been a concern of the space industry and agencies alike because no recognized standard exists. Previous hot fire test plans focused on the verification of performance requirements but did not explicitly include reliability as a dimensioning variable. The stakeholders are, however, concerned about a hot fire test strategy that balances reliability, schedule, and affordability. A multiple-criteria test planning model is presented that provides a framework to optimize the hot fire test strategy with respect to stakeholder concerns. The Staged Combustion Rocket Engine Demonstrator, a program of the European Space Agency, is used as an example to provide a quantitative answer to the claim that a reduced-thrust-scale demonstrator is cost beneficial for a subsequent flight engine development. Scalability aspects of major subsystems are considered in the prior information definition inside the Bayesian framework. The model is also applied to assess the impact of an increase of the demonstrated reliability level on schedule and affordability.

  4. Evaluating the dynamic response of in-flight thrust calculation techniques during throttle transients

    NASA Technical Reports Server (NTRS)

    Ray, Ronald J.

    1994-01-01

    New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.

  5. The influence of copper precipitation and plastic deformation hardening on the impact-transition temperature of rolled structural steels

    NASA Astrophysics Data System (ADS)

    Aróztegui, Juan J.; Urcola, José J.; Fuentes, Manuel

    1989-09-01

    Commercial electric arc melted low-carbon steels, provided as I beams, were characterized both microstructurally and mechanically in the as-rolled, copper precipitation, and plastically pre-deformed conditions. Inclusion size distribution, ferrite grain size, pearlite volume fraction, precipitated volume fraction of copper, and size distribution of these precipitates were determined by conventional quantitative optical and electron metallographic techniques. From the tensile tests conducted at a strain rate of 10-3 s-1 and impact Charpy V-notched tests carried out, stress/strain curves, yield stress, and impact-transition temperature were obtained. The specific fractographic features of the fracture surfaces also were quantitatively characterized. The increases in yield stress and transition temperature experienced upon either aging or work hardening were related through empirical relationships. These dependences were analyzed semiquantitatively by combining microscopic and macroscopic fracture criteria based on measured fundamental properties (fracture stress and yield stress) and observed fractographic parameters (crack nucleation distance and nuclei size). The rationale developed from these fracture criteria allows the semiquantitative prediction of the temperature transition shifts produced upon aging and work hardening. The values obtained are of the right order of magnitude.

  6. Integrated Smartphone-App-Chip System for On-Site Parts-Per-Billion-Level Colorimetric Quantitation of Aflatoxins.

    PubMed

    Li, Xiaochun; Yang, Fan; Wong, Jessica X H; Yu, Hua-Zhong

    2017-09-05

    We demonstrate herein an integrated, smartphone-app-chip (SPAC) system for on-site quantitation of food toxins, as demonstrated with aflatoxin B1 (AFB1), at parts-per-billion (ppb) level in food products. The detection is based on an indirect competitive immunoassay fabricated on a transparent plastic chip with the assistance of a microfluidic channel plate. A 3D-printed optical accessory attached to a smartphone is adapted to align the assay chip and to provide uniform illumination for imaging, with which high-quality images of the assay chip are captured by the smartphone camera and directly processed using a custom-developed Android app. The performance of this smartphone-based detection system was tested using both spiked and moldy corn samples; consistent results with conventional enzyme-linked immunosorbent assay (ELISA) kits were obtained. The achieved detection limit (3 ± 1 ppb, equivalent to μg/kg) and dynamic response range (0.5-250 ppb) meet the requested testing standards set by authorities in China and North America. We envision that the integrated SPAC system promises to be a simple and accurate method of food toxin quantitation, bringing much benefit for rapid on-site screening.

  7. Physiological frailty index (PFI): quantitative in-life estimate of individual biological age in mice.

    PubMed

    Antoch, Marina P; Wrobel, Michelle; Kuropatwinski, Karen K; Gitlin, Ilya; Leonova, Katerina I; Toshkov, Ilia; Gleiberman, Anatoli S; Hutson, Alan D; Chernova, Olga B; Gudkov, Andrei V

    2017-03-19

    The development of healthspan-extending pharmaceuticals requires quantitative estimation of age-related progressive physiological decline. In humans, individual health status can be quantitatively assessed by means of a frailty index (FI), a parameter which reflects the scale of accumulation of age-related deficits. However, adaptation of this methodology to animal models is a challenging task since it includes multiple subjective parameters. Here we report the development of a quantitative, non-invasive procedure to estimate the biological age of an individual animal by creating a physiological frailty index (PFI). We demonstrated the dynamics of PFI increase during chronological aging of male and female NIH Swiss mice. We also demonstrated accelerated growth of the PFI in animals placed on a high-fat diet, reflecting the acceleration of aging by obesity, and we provide a tool for its quantitative assessment. Additionally, we showed that the PFI could reveal the anti-aging effect of the mTOR inhibitor rapatar (a bioavailable formulation of rapamycin) prior to registration of its effects on longevity. The PFI revealed substantial sex-related differences in normal chronological aging and in the efficacy of detrimental (high-fat diet) or beneficial (rapatar) aging modulatory factors. Together, these data introduce the PFI as a reliable, non-invasive, quantitative tool suitable for testing potential anti-aging pharmaceuticals in pre-clinical studies.
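A frailty index of this kind is, at its core, the fraction of assessed health deficits an individual exhibits, with each deficit scored on a bounded scale. A minimal sketch (the scoring scheme and function name are illustrative, not the authors' published parameter set):

```python
def frailty_index(deficit_scores):
    """Frailty index: mean of deficit scores, each scored in [0, 1]
    (0 = deficit absent, 1 = deficit fully present).

    The individual parameters making up deficit_scores (e.g. grip
    strength decline, coat condition) are chosen by the study design.
    """
    if not deficit_scores:
        raise ValueError("at least one deficit must be assessed")
    if any(s < 0 or s > 1 for s in deficit_scores):
        raise ValueError("deficit scores must lie in [0, 1]")
    return sum(deficit_scores) / len(deficit_scores)
```

Because the index is a mean over many items, it rises smoothly with age, which is what makes it usable as a continuous readout for intervention studies rather than a binary frail/non-frail label.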

  8. Quantitative application of the primary progressive aphasia consensus criteria

    PubMed Central

    Wicklund, Meredith R.; Duffy, Joseph R.; Strand, Edythe A.; Machulda, Mary M.; Whitwell, Jennifer L.

    2014-01-01

    Objective: To determine how well the consensus criteria could classify subjects with primary progressive aphasia (PPA) using a quantitative speech and language battery that matches the test descriptions provided by the consensus criteria. Methods: A total of 105 participants with a neurodegenerative speech and language disorder were prospectively recruited and underwent neurologic, neuropsychological, and speech and language testing and MRI in this case-control study. Twenty-one participants with apraxia of speech without aphasia served as controls. Select tests from the speech and language battery were chosen for application of consensus criteria and cutoffs were employed to determine syndromic classification. Hierarchical cluster analysis was used to examine participants who could not be classified. Results: Of the 84 participants, 58 (69%) could be classified as agrammatic (27%), semantic (7%), or logopenic (35%) variants of PPA. The remaining 31% of participants could not be classified. Of the unclassifiable participants, 2 clusters were identified. The speech and language profile of the first cluster resembled mild logopenic PPA and the second cluster semantic PPA. Gray matter patterns of loss of these 2 clusters of unclassified participants also resembled mild logopenic and semantic variants. Conclusions: Quantitative application of consensus PPA criteria yields the 3 syndromic variants but leaves a large proportion unclassified. Therefore, the current consensus criteria need to be modified in order to improve sensitivity. PMID:24598709

  9. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods, combining different principles of detection, identification and quantification, were applied by the participants. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay, which is currently considered the “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724
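The scatter in the quantitative PT data would typically be evaluated by scoring each laboratory's reported concentration against the assigned value. The abstract does not give the scoring scheme actually used, so the following is a generic z-score sketch in the ISO 13528 style; the assigned value, standard deviation and reported results are illustrative only:

```python
# Illustrative z-score scoring for a quantitative proficiency test (PT).
# Common convention (ISO 13528 style): |z| <= 2 satisfactory,
# 2 < |z| < 3 questionable, |z| >= 3 unsatisfactory.

def z_score(reported, assigned, sigma_pt):
    """Score a reported concentration against the assigned value."""
    return (reported - assigned) / sigma_pt

def rating(z):
    az = abs(z)
    if az <= 2:
        return "satisfactory"
    if az < 3:
        return "questionable"
    return "unsatisfactory"

# Hypothetical results (ng/mL) for one spiked sample: assigned value 10,
# standard deviation for proficiency assessment 2.
reports = [9.1, 15.0, 16.6]
ratings = [rating(z_score(x, 10.0, 2.0)) for x in reports]
```

A wide spread of z-scores across laboratories, as reported for the mouse bioassay, is exactly what such a scheme makes visible.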

  10. Manned Versus Unmanned Risk and Complexity Considerations for Future Midsized X-Planes

    NASA Technical Reports Server (NTRS)

    Lechniak, Jason A.; Melton, John E.

    2017-01-01

    The objective of this work was to identify and estimate the complexity and risks associated with the development and testing of new low-cost medium-scale X-plane aircraft primarily focused on air transport operations. The piloting modes evaluated for this task were manned, remotely piloted, and unmanned flight research programs. This analysis was conducted early in the data collection period for X-plane concept vehicles, before preliminary designs were complete. Over 50 different aircraft and system topics were used to evaluate the three piloting control modes. Expert group evaluations from a diverse set of pilots, engineers, and other experts at Aeronautics Research Mission Directorate centers within the National Aeronautics and Space Administration provided qualitative reasoning on the many issues surrounding the decisions regarding piloting modes. The group evaluations were numerically rated to evaluate each topic quantitatively and were used to provide independent criteria for vehicle complexity and risk. An Edwards Air Force Base instruction document was identified as a likely source of the effects observed in our qualitative and quantitative data. The study showed that a manned aircraft was the best choice to align with test activities for transport aircraft flight research from a low-complexity and low-risk perspective. The study concluded that a manned aircraft option would minimize risk and complexity, improve flight-test efficiency, and bound the cost of the flight-test portion of the program. Several key findings and discriminators between the three modes are discussed in detail.

  11. Synthesising quantitative and qualitative research in evidence-based patient information.

    PubMed

    Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan

    2007-03-01

    Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence-based practice independently of other research methodologies, but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non-quantitative synthesis was conducted and a tabular evidence profile for each important outcome (e.g. "explain what the test involves") was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. A practical example is provided of how quantitative and qualitative data sources might successfully be brought together and considered in one review.

  12. 34 CFR 668.145 - Test approval procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... possess appropriate advanced degrees and experience in test development or psychometric research, to... quantitative domains, the Secretary reviews only those subtests covering verbal and quantitative domains...

  13. Functional assessment of the ex vivo vocal folds through biomechanical testing: A review

    PubMed Central

    Dion, Gregory R.; Jeswani, Seema; Roof, Scott; Fritz, Mark; Coelho, Paulo; Sobieraj, Michael; Amin, Milan R.; Branski, Ryan C.

    2016-01-01

    The human vocal folds are complex structures made up of distinct layers that vary in cellular and extracellular composition. The mechanical properties of vocal fold tissue are fundamental to the study of both the acoustics and biomechanics of voice production. To date, quantitative methods have been applied to characterize vocal fold tissue in both normal and pathologic conditions. This review describes, summarizes, and discusses the most commonly employed methods for vocal fold biomechanical testing. Force-elongation, torsional parallel plate rheometry, simple-shear parallel plate rheometry, linear skin rheometry, and indentation are the most frequently employed biomechanical tests for vocal fold tissues, and each provides material-property data that can be used to compare native tissue versus diseased or treated tissue. Force-elongation testing is clinically useful, as it allows for functional unit testing, while rheometry provides physiologically relevant shear data, and nanoindentation permits micrometer-scale testing across different areas of the vocal fold as well as whole organ testing. Thoughtful selection of the testing technique during experimental design is important for optimizing biomechanical testing of vocal fold tissues. PMID:27127075

  14. Assessment and Mission Planning Capability For Quantitative Aerothermodynamic Flight Measurements Using Remote Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas; Splinter, Scott; Daryabeigi, Kamran; Wood, William; Schwartz, Richard; Ross, Martin

    2008-01-01

    High resolution calibrated infrared imagery of vehicles during hypervelocity atmospheric entry or sustained hypersonic cruise has the potential to provide flight data on the distribution of surface temperature and the state of the airflow over the vehicle. In the early 1980s NASA sought to obtain high spatial resolution infrared imagery of the Shuttle during entry. Despite mission execution with a technically rigorous pre-planning capability, the single airborne optical system for this attempt was considered developmental and the scientific return was marginal. In 2005 the Space Shuttle Program again sponsored an effort to obtain imagery of the Orbiter. Imaging requirements were targeted towards Shuttle ascent; companion requirements for entry did not exist. The engineering community was allowed to define observation goals and incrementally demonstrate key elements of a quantitative spatially resolved measurement capability over a series of flights. These imaging opportunities were extremely beneficial and clearly demonstrated the capability to capture infrared imagery with mature and operational assets of the US Navy and the Missile Defense Agency. While successful, the usefulness of the imagery was, from an engineering perspective, limited. These limitations were mainly associated with uncertainties regarding operational aspects of data acquisition. These uncertainties, in turn, came about because of limited pre-flight mission planning capability and a poor understanding of several factors, including the infrared signature of the Shuttle, optical hardware limitations, atmospheric effects and detector response characteristics. Operational details of sensor configuration, such as detector integration time and tracking system algorithms, were carried out ad hoc (best practices), which led to a low probability of target acquisition and detector saturation.
Leveraging the qualified success during Return-to-Flight, the NASA Engineering and Safety Center sponsored an assessment study focused on increasing the probability of returning spatially resolved scientific/engineering thermal imagery. This paper provides an overview of the assessment task and the systematic approach designed to establish confidence in the ability of existing assets to reliably acquire, track and return global quantitative surface temperatures of the Shuttle during entry. A discussion of capability demonstration in support of a potential Shuttle boundary layer transition flight test is presented. Successful demonstration of a quantitative, spatially resolved, global temperature measurement on the proposed Shuttle boundary layer transition flight test could lead to potential future applications with hypersonic flight test programs within the USAF and DARPA, along with flight test opportunities supporting NASA's Project Constellation.

  15. Quantitative measures of walking and strength provide insight into brain corticospinal tract pathology in multiple sclerosis.

    PubMed

    Fritz, Nora E; Keller, Jennifer; Calabresi, Peter A; Zackowski, Kathleen M

    2017-01-01

    At least 85% of individuals with multiple sclerosis report walking dysfunction as their primary complaint. Walking and strength measures are common clinical measures to mark increasing disability or improvement with rehabilitation. Previous studies have shown an association between strength or walking ability and spinal cord MRI measures, and between strength measures and brainstem corticospinal tract magnetization transfer ratio. However, the relationship between walking performance and brain corticospinal tract magnetization transfer imaging measures, and the contribution of clinical measurements of walking and strength to the underlying integrity of the corticospinal tract, has not been explored in multiple sclerosis. The objectives of this study were to explore the relationship of quantitative measures of walking and strength to whole-brain corticospinal tract-specific MRI measures and to determine the contribution of quantitative measures of function, in addition to basic clinical measures (age, gender, symptom duration and Expanded Disability Status Scale), to structural imaging measures of the corticospinal tract. We hypothesized that quantitative walking and strength measures would be related to brain corticospinal tract-specific measures and would provide insight into the heterogeneity of brain pathology. Twenty-nine individuals with relapsing-remitting multiple sclerosis (mean (SD) age 48.7 (11.5) years; symptom duration 11.9 (8.7) years; 17 females; median [range] Expanded Disability Status Scale 4.0 [1.0-6.5]) and 29 age- and gender-matched healthy controls (age 50.8 (11.6) years; 20 females) participated in clinical tests of strength and walking (Timed Up and Go, Timed 25 Foot Walk, Two Minute Walk Test) as well as 3 T imaging, including diffusion tensor imaging and magnetization transfer imaging. Individuals with multiple sclerosis were weaker (p = 0.0024) and walked more slowly (p = 0.0013) compared to controls.
Quantitative measures of walking and strength were significantly related to corticospinal tract fractional anisotropy (r > 0.26; p < 0.04) and magnetization transfer ratio (r > 0.29; p < 0.03) measures. Although the Expanded Disability Status Scale was highly correlated with walking measures, it was not significantly related to either corticospinal tract fractional anisotropy or magnetization transfer ratio (p > 0.05). Walk velocity was a significant contributor to magnetization transfer ratio (p = 0.006) and fractional anisotropy (p = 0.011) in regression modeling that included both quantitative measures of function and basic clinical information. Quantitative measures of strength and walking are associated with brain corticospinal tract pathology. The addition of these quantitative measures to basic clinical information explains more of the variance in corticospinal tract fractional anisotropy and magnetization transfer ratio than the basic clinical information alone. Outcome measurement for multiple sclerosis clinical trials has been notoriously challenging; the use of quantitative measures of strength and walking along with tract-specific imaging methods may improve our ability to monitor disease change over time, with intervention, and provide needed guidelines for developing more effective targeted rehabilitation strategies.
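The reported associations between function and imaging (r > 0.26 with fractional anisotropy, r > 0.29 with magnetization transfer ratio) are Pearson-type correlations. As a generic illustration of how such a coefficient is computed, here is a pure-Python sketch; the walk-velocity and FA values below are invented for illustration, not the study's measurements:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: walk velocity (m/s) vs. corticospinal tract FA.
velocity = [1.6, 1.4, 1.1, 0.9, 0.7, 1.3]
fa = [0.58, 0.55, 0.49, 0.47, 0.44, 0.53]
r = pearson_r(velocity, fa)
```

In practice one would also compute a p-value and, as in the regression models described above, test whether the functional measure explains variance beyond the basic clinical covariates.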

  16. Automated classification of cell morphology by coherence-controlled holographic microscopy

    NASA Astrophysics Data System (ADS)

    Strbkova, Lenka; Zicha, Daniel; Vesely, Pavel; Chmelik, Radim

    2017-08-01

    In the last few years, classification of cells by machine learning has become frequently used in biology. However, most of the approaches are based on morphometric (MO) features, which are not quantitative in terms of cell mass. This may result in poor classification accuracy. Here, we study the potential contribution of coherence-controlled holographic microscopy, which enables quantitative phase imaging, to the classification of cell morphologies. We compare our approach with the commonly used method based on MO features. We tested both classification approaches in an experiment with nutritionally deprived cancer tissue cells, while employing several supervised machine learning algorithms. Most of the classifiers provided higher performance when quantitative phase features were employed. Based on the results, it can be concluded that the quantitative phase features played an important role in improving the performance of the classification. The methodology could be a valuable aid in refining the monitoring of live cells in an automated fashion. We believe that coherence-controlled holographic microscopy, as a tool for quantitative phase imaging, offers all the preconditions for accurate automated analysis of live cell behavior, while enabling noninvasive label-free imaging with sufficient contrast and high spatiotemporal phase sensitivity.
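The core comparison, training the same classifier on morphometric versus quantitative-phase feature sets and comparing accuracies, can be illustrated with a toy nearest-centroid classifier. The class names and feature values below are synthetic placeholders; the study itself used several supervised algorithms on real cell data:

```python
def centroid(rows):
    """Mean feature vector of a list of samples."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def nearest_centroid_predict(x, centroids):
    """Return the label of the closest class centroid (squared Euclidean)."""
    def d2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: d2(x, centroids[label]))

def accuracy(train, test):
    """Fit centroids on train data, score fraction correct on test data."""
    cents = {label: centroid(rows) for label, rows in train.items()}
    correct = sum(nearest_centroid_predict(x, cents) == label
                  for label, rows in test.items() for x in rows)
    total = sum(len(rows) for rows in test.values())
    return correct / total

# Synthetic example standing in for quantitative-phase features
# (e.g. dry-mass-related values) of two hypothetical cell classes.
train = {"control": [[1.0, 2.0], [1.2, 1.8]],
         "deprived": [[3.0, 4.0], [2.8, 4.2]]}
test = {"control": [[1.1, 1.9]], "deprived": [[2.9, 4.1]]}
acc = accuracy(train, test)
```

Running the same pipeline on a morphometric feature set and comparing the two accuracies mirrors the paper's evaluation design.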

  17. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage analysis is a family-based method used to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best-fitting statistical model for the trait.
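The statistic behind two-point programs such as LODLINK is the LOD score, which compares the likelihood of the meiotic data at a recombination fraction theta with the likelihood under free recombination (theta = 0.5). A minimal phase-known sketch follows; the meiosis counts are hypothetical, and a real model-based analysis also folds in the trait model estimated by SEGREG:

```python
import math

def lod(recombinants, nonrecombinants, theta):
    """Phase-known two-point LOD score: log10[L(theta) / L(0.5)]
    for r recombinant and n non-recombinant informative meioses."""
    r, n = recombinants, nonrecombinants
    l_theta = (theta ** r) * ((1 - theta) ** n)
    l_null = 0.5 ** (r + n)
    return math.log10(l_theta / l_null)

def max_lod(r, n, grid=None):
    """Maximize the LOD over a theta grid (the MLE is r / (r + n))."""
    grid = grid or [i / 100 for i in range(1, 50)]
    return max((lod(r, n, t), t) for t in grid)

# Hypothetical family data: 2 recombinants out of 20 informative meioses.
best, theta_hat = max_lod(2, 18)
```

A maximum LOD above 3 is the traditional threshold for declaring linkage; here the hypothetical counts give a peak near theta = 0.1.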

  18. NAIMA: target amplification strategy allowing quantitative on-chip detection of GMOs.

    PubMed

    Morisset, Dany; Dobnik, David; Hamels, Sandrine; Zel, Jana; Gruden, Kristina

    2008-10-01

    We have developed a novel multiplex quantitative DNA-based target amplification method suitable for sensitive, specific and quantitative detection on microarray. This new method, named NASBA Implemented Microarray Analysis (NAIMA), was applied to GMO detection in food and feed, but its application can be extended to all fields of biology requiring simultaneous detection of low copy number DNA targets. In a first step, the use of tailed primers allows the multiplex synthesis of template DNAs in a primer extension reaction. A second step of the procedure consists of transcription-based amplification using universal primers. The cRNA product is then directly ligated to fluorescent dye-labelled 3DNA dendrimers, allowing signal amplification, and hybridized without further purification on an oligonucleotide probe-based microarray for multiplex detection. Two triplex systems have been applied to test maize samples containing several transgenic lines, and NAIMA has been shown to be sensitive down to two target copies and to provide quantitative data on the transgenic contents in a range of 0.1-25%. The performance of NAIMA is comparable to that of singleplex quantitative real-time PCR. In addition, NAIMA amplification is faster, since 20 min are sufficient to achieve full amplification.

  19. Automated classification of cell morphology by coherence-controlled holographic microscopy.

    PubMed

    Strbkova, Lenka; Zicha, Daniel; Vesely, Pavel; Chmelik, Radim

    2017-08-01

    In the last few years, classification of cells by machine learning has become frequently used in biology. However, most of the approaches are based on morphometric (MO) features, which are not quantitative in terms of cell mass. This may result in poor classification accuracy. Here, we study the potential contribution of coherence-controlled holographic microscopy, which enables quantitative phase imaging, to the classification of cell morphologies. We compare our approach with the commonly used method based on MO features. We tested both classification approaches in an experiment with nutritionally deprived cancer tissue cells, while employing several supervised machine learning algorithms. Most of the classifiers provided higher performance when quantitative phase features were employed. Based on the results, it can be concluded that the quantitative phase features played an important role in improving the performance of the classification. The methodology could be a valuable aid in refining the monitoring of live cells in an automated fashion. We believe that coherence-controlled holographic microscopy, as a tool for quantitative phase imaging, offers all the preconditions for accurate automated analysis of live cell behavior, while enabling noninvasive label-free imaging with sufficient contrast and high spatiotemporal phase sensitivity. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  20. NAIMA: target amplification strategy allowing quantitative on-chip detection of GMOs

    PubMed Central

    Morisset, Dany; Dobnik, David; Hamels, Sandrine; Žel, Jana; Gruden, Kristina

    2008-01-01

    We have developed a novel multiplex quantitative DNA-based target amplification method suitable for sensitive, specific and quantitative detection on microarray. This new method, named NASBA Implemented Microarray Analysis (NAIMA), was applied to GMO detection in food and feed, but its application can be extended to all fields of biology requiring simultaneous detection of low copy number DNA targets. In a first step, the use of tailed primers allows the multiplex synthesis of template DNAs in a primer extension reaction. A second step of the procedure consists of transcription-based amplification using universal primers. The cRNA product is then directly ligated to fluorescent dye-labelled 3DNA dendrimers, allowing signal amplification, and hybridized without further purification on an oligonucleotide probe-based microarray for multiplex detection. Two triplex systems have been applied to test maize samples containing several transgenic lines, and NAIMA has been shown to be sensitive down to two target copies and to provide quantitative data on the transgenic contents in a range of 0.1–25%. The performance of NAIMA is comparable to that of singleplex quantitative real-time PCR. In addition, NAIMA amplification is faster, since 20 min are sufficient to achieve full amplification. PMID:18710880

  1. PANDA-view: An easy-to-use tool for statistical analysis and visualization of quantitative proteomics data.

    PubMed

    Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping

    2018-05-22

    Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meanings in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly-used data visualization methods including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. Supplementary data are available at Bioinformatics online.

  2. Confirmatory and quantitative analysis of beta-lactam antibiotics in bovine kidney tissue by dispersive solid-phase extraction and liquid chromatography-tandem mass spectrometry.

    PubMed

    Fagerquist, Clifton K; Lightfield, Alan R; Lehotay, Steven J

    2005-03-01

    A simple, rapid, rugged, sensitive, and specific method for the confirmation and quantitation of 10 beta-lactam antibiotics in fortified and incurred bovine kidney tissue has been developed. The method uses a simple solvent extraction, dispersive solid-phase extraction (dispersive-SPE) cleanup, and liquid chromatography-tandem mass spectrometry (LC/MS/MS) for confirmation and quantitation. Dispersive-SPE greatly simplifies and accelerates sample cleanup and improves overall recoveries compared with conventional SPE cleanup. The beta-lactam antibiotics tested were as follows: deacetylcephapirin (an antimicrobial metabolite of cephapirin), amoxicillin, desfuroylceftiofur cysteine disulfide (DCCD, an antimicrobial metabolite of ceftiofur), ampicillin, cefazolin, penicillin G, oxacillin, cloxacillin, nafcillin, and dicloxacillin. Average recoveries of fortified samples were 70% or better for all beta-lactams except DCCD, which had an average recovery of 58%. The LC/MS/MS method was able to demonstrate quantitative recoveries at established tolerance levels and provide confirmatory data for unambiguous analyte identification. The method was also tested on 30 incurred bovine kidney samples obtained from the USDA Food Safety and Inspection Service, which had previously tested the samples using the approved semiquantitative microbial assay. The results from the quantitative LC/MS/MS analysis were in general agreement with the microbial assay for 23 samples, although the LC/MS/MS method was superior in that it could specifically identify which beta-lactam was present and quantitate its concentration, whereas the microbial assay could only identify the type of beta-lactam present and report a concentration with respect to the microbial inhibition of a penicillin G standard. In addition, for 6 of the 23 samples, LC/MS/MS analysis detected a penicillin and a cephalosporin beta-lactam, whereas the microbial assay detected only a penicillin beta-lactam.
For samples that do not fall into the "general agreement" category, the most serious discrepancy involves two samples where the LC/MS/MS method detected a violative level of a cephalosporin beta-lactam (deacetylcephapirin) in the first sample and a possibly violative level of desfuroylceftiofur in the second, whereas the microbial assay identified the two samples as having only violative levels of a penicillin beta-lactam.

  3. Analysis of Within-Test Variability of Non-Destructive Test Methods to Evaluate Compressive Strength of Normal Vibrated and Self-Compacting Concretes

    NASA Astrophysics Data System (ADS)

    Nepomuceno, Miguel C. S.; Lopes, Sérgio M. R.

    2017-10-01

    Non-destructive tests (NDT) have been used in recent decades for the assessment of the in-situ quality and integrity of concrete elements. An important step in the application of NDT methods concerns the interpretation and validation of the test results. In general, interpretation of NDT results should involve three distinct phases leading to the development of conclusions: processing of the collected data, analysis of within-test variability, and quantitative evaluation of the property under investigation. The analysis of within-test variability can provide valuable information, since it can be compared with the within-test variability normally associated with the NDT method in use, either to provide a measure of quality control or to detect the presence of abnormal circumstances during the in-situ application. This paper reports the analysis of the experimental results of within-test variability of NDT obtained for normal vibrated concrete and self-compacting concrete. The NDT methods reported include the surface hardness test, ultrasonic pulse velocity test, penetration resistance test, pull-off test, pull-out test and maturity test. The obtained results are discussed and conclusions are presented.
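Within-test variability is commonly summarized as the coefficient of variation of repeat readings taken at a single location, which can then be compared against the variation typical for the NDT method in use. A minimal sketch, with hypothetical surface-hardness (rebound) readings rather than the paper's data:

```python
import math

def within_test_cov(readings):
    """Coefficient of variation (%) of repeat NDT readings at one location."""
    n = len(readings)
    mean = sum(readings) / n
    var = sum((x - mean) ** 2 for x in readings) / (n - 1)  # sample variance
    return 100.0 * math.sqrt(var) / mean

# Hypothetical rebound-number readings at a single test location.
readings = [32, 34, 33, 31, 35, 33]
cov = within_test_cov(readings)
```

A location whose coefficient of variation greatly exceeds the value expected for the method would be flagged as an abnormal circumstance, in the sense described above.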

  4. The Effect of Peer Review on Student Learning Outcomes in a Research Methods Course

    ERIC Educational Resources Information Center

    Crowe, Jessica A.; Silva, Tony; Ceresola, Ryan

    2015-01-01

    In this study, we test the effect of in-class student peer review on student learning outcomes using a quasiexperimental design. We provide an assessment of peer review in a quantitative research methods course, which is a traditionally difficult and technical course. Data were collected from 170 students enrolled in four sections of a…

  5. Historical fire and vegetation dynamics in dry forests of the interior Pacific Northwest, USA, and relationships to northern spotted owl (Strix occidentalis caurina) habitat conservation

    Treesearch

    Rebecca S.H. Kennedy; Michael C. Wimberly

    2009-01-01

    Regional conservation planning frequently relies on general assumptions about historical disturbance regimes to inform decisions about landscape restoration, reserve allocations, and landscape management. Spatially explicit simulations of landscape dynamics provide quantitative estimates of landscape structure and allow for the testing of alternative scenarios. We used...

  6. Commentary on factors affecting transverse vibration using an idealized theoretical equation

    Treesearch

    Joseph F. Murphy

    2000-01-01

    An idealized theoretical equation to calculate flexural stiffness using transverse vibration of a simply end-supported beam is being considered by the American Society for Testing and Materials (ASTM) Wood Committee D07 to determine lumber modulus of elasticity. This commentary provides the user a quantitative view of six factors that affect the accuracy of using the…

  7. Title I ESEA, High School; English as a Second Language: 1979-1980. OEE Evaluation Report.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Office of Educational Evaluation.

    The report is an evaluation of the 1979-80 High School Title I English as a Second Language Program. Two types of information are presented: (1) a narrative description of the program which provides qualitative data regarding the program, and (2) a statistical analysis of test results which consists of quantitative, city-wide data. By integrating…

  8. The Impact of a Curricular Innovation on Prospective EFL Teachers' Attitudes towards ICT Integration into Language Instruction

    ERIC Educational Resources Information Center

    Hismanoglu, Murat

    2012-01-01

    This paper investigates whether the new EFL teacher training curriculum provides an efficient ICT training or not through both a quantitative and a partially qualitative research methodology. One hundred twenty-four prospective EFL teachers participated in this study and the results of a series of Independent Samples T-tests highlight that the…

  9. Current capabilities and limitations of the stable isotope technologies and applied mathematical equations in determining whole body vitamin A status

    USDA-ARS?s Scientific Manuscript database

    Vitamin A (VA) stable isotope dilution methodology provides a quantitative estimate of total body VA stores and is the best method currently available for assessing VA status in adults and children. The methodology has also been used to test the efficacy of VA interventions in a number of low-incom...

  10. KSC01pp0808

    NASA Image and Video Library

    2001-04-17

    Workers at Astrotech, Titusville, Fla., begin deploying the magnetometer boom on the GOES-M satellite. The satellite is undergoing testing at Astrotech. The GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking and meteorological research. The satellite is scheduled to launch July 12 on an Atlas-IIA booster, Centaur upper stage from Cape Canaveral Air Force Station

  11. KSC01pp0795

    NASA Image and Video Library

    2001-04-12

    At Astrotech, Titusville, Fla., an overhead crane lifts the GOES-M (Geostationary Operational Environmental Satellite) from the transporter. The GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking and meteorological research. The satellite will undergo testing at Astrotech before its scheduled launch July 12 on an Atlas-IIA booster, Centaur upper stage from Cape Canaveral Air Force Station

  12. KSC01pp0798

    NASA Image and Video Library

    2001-04-12

    At Astrotech, Titusville, Fla., workers look over the GOES-M satellite after removal of its protective cover. The GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking and meteorological research. The satellite will undergo testing at Astrotech before its scheduled launch July 12 on an Atlas-IIA booster, Centaur upper stage from Cape Canaveral Air Force Station

  13. KSC01pp0800

    NASA Image and Video Library

    2001-04-12

    While an overhead crane lifts the GOES-M satellite at Astrotech, Titusville, Fla., workers check the underside. The GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking and meteorological research. The satellite is undergoing testing at Astrotech before its scheduled launch July 12 on an Atlas-IIA booster, Centaur upper stage from Cape Canaveral Air Force Station

  14. KSC01pp0803

    NASA Image and Video Library

    2001-04-12

    At Astrotech, Titusville, Fla., the GOES-M satellite is lifted at an angle on a workstand. The satellite is undergoing testing at Astrotech. The GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking and meteorological research. The satellite is scheduled to launch July 12 on an Atlas-IIA booster, Centaur upper stage from Cape Canaveral Air Force Station

  15. KSC01pp0802

    NASA Image and Video Library

    2001-04-12

    At Astrotech, Titusville, Fla., a worker (right) turns the GOES-M satellite, bringing its side into view. The GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking and meteorological research. The satellite is undergoing testing at Astrotech before its scheduled launch July 12 on an Atlas-IIA booster, Centaur upper stage from Cape Canaveral Air Force Station

  16. KSC01pp0809

    NASA Image and Video Library

    2001-04-17

    Workers at Astrotech, Titusville, Fla., begin deploying the magnetometer boom on the GOES-M satellite. The satellite is undergoing testing at Astrotech. The GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking and meteorological research. The satellite is scheduled to launch July 12 on an Atlas-IIA booster, Centaur upper stage from Cape Canaveral Air Force Station

  17. KSC01pp0810

    NASA Image and Video Library

    2001-04-13

    Workers at Astrotech, Titusville, Fla., deploy the magnetometer boom on the GOES-M satellite. The satellite is undergoing testing at Astrotech. The GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking and meteorological research. The satellite is scheduled to launch July 12 on an Atlas-IIA booster, Centaur upper stage from Cape Canaveral Air Force Station

  18. The Inevitable Corruption of Indicators and Educators through High-Stakes Testing. Executive Summary

    ERIC Educational Resources Information Center

    Nichols, Sharon L.; Berliner, David C.

    2005-01-01

    This research provides lengthy proof of a principle of social science known as Campbell's law: "The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor." Applying…

  19. Computational Psychometrics Meets Hollywood: The Complexity in Emotional Storytelling.

    PubMed

    Cipresso, Pietro; Riva, Giuseppe

    2016-01-01

Expressions of emotions are pervasive in media, especially in movies. In this article, we focus on the emotional relationships of movie characters in narrative thought and emotional storytelling. Several studies examine emotion elicitation through movies, but there is a gap in the scientific literature, and in practice, when it comes to quantitatively considering the emotions among the characters of a movie story, which in turn provide the basis for spectator emotion elicitation. Some might argue that the ultimate purpose of a movie is to elicit emotions in the viewers; however, we highlight that the path to emotional stimulation entails the emotions among the characters composing a narrative, which can be manipulated to enable the effective elicitation of viewers' emotions. Here we provide and test an effective quantitative method for analyzing these relationships as emotional networks, which allows for a clear understanding of the effects of story changes on movie perceptions and pleasantness.

  20. Computational Psychometrics Meets Hollywood: The Complexity in Emotional Storytelling

    PubMed Central

    Cipresso, Pietro; Riva, Giuseppe

    2016-01-01

Expressions of emotions are pervasive in media, especially in movies. In this article, we focus on the emotional relationships of movie characters in narrative thought and emotional storytelling. Several studies examine emotion elicitation through movies, but there is a gap in the scientific literature, and in practice, when it comes to quantitatively considering the emotions among the characters of a movie story, which in turn provide the basis for spectator emotion elicitation. Some might argue that the ultimate purpose of a movie is to elicit emotions in the viewers; however, we highlight that the path to emotional stimulation entails the emotions among the characters composing a narrative, which can be manipulated to enable the effective elicitation of viewers' emotions. Here we provide and test an effective quantitative method for analyzing these relationships as emotional networks, which allows for a clear understanding of the effects of story changes on movie perceptions and pleasantness. PMID:27877153

  1. Investigation of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams

    NASA Technical Reports Server (NTRS)

    Davis, Brian A.

    2005-01-01

    Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical model. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. Excellent agreement is achieved between the predicted and measured results, thereby quantitatively validating the numerical tool.

  2. Simple, Rapid, and Highly Sensitive Detection of Diphosgene and Triphosgene by Spectrophotometric Methods

    PubMed Central

    Joy, Abraham; Anim-Danso, Emmanuel; Kohn, Joachim

    2009-01-01

    Methods for the detection and estimation of diphosgene and triphosgene are described. These compounds are widely used phosgene precursors which produce an intensely colored purple pentamethine oxonol dye when reacted with 1,3-dimethylbarbituric acid (DBA) and pyridine (or a pyridine derivative). Two quantitative methods are described, based on either UV absorbance or fluorescence of the oxonol dye. Detection limits are ~ 4 µmol/L by UV and <0.4 µmol/L by fluorescence. The third method is a test strip for the simple and rapid detection and semi-quantitative estimation of diphosgene and triphosgene, using a filter paper embedded with dimethylbarbituric acid and poly(4-vinylpyridine). Addition of a test solution to the paper causes a color change from white to light blue at low concentrations and to pink at higher concentrations of triphosgene. The test strip is useful for quick on-site detection of triphosgene and diphosgene in reaction mixtures. The test strip is easy to perform and provides clear signal readouts indicative of the presence of phosgene precursors. The utility of this method was demonstrated by the qualitative determination of residual triphosgene during the production of poly(Bisphenol A carbonate). PMID:19782219
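The UV and fluorescence methods above are calibration-based quantitative assays. The abstract does not spell out the calibration procedure, but a common way to turn such readings into concentrations and detection limits is an ordinary least-squares fit plus the conventional 3.3·σ(blank)/slope rule; the sketch below is illustrative only, using hypothetical absorbance values and blank noise rather than data from the paper.

```python
from statistics import mean

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    mx, my = mean(x), mean(y)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

# Hypothetical calibration: concentration (umol/L) vs. absorbance (AU).
conc = [0.0, 5.0, 10.0, 20.0, 40.0]
absorbance = [0.00, 0.11, 0.20, 0.41, 0.80]
slope, intercept = linear_fit(conc, absorbance)

# Conventional detection-limit estimate: 3.3 x blank SD / slope.
blank_sd = 0.004  # hypothetical standard deviation of blank readings
lod = 3.3 * blank_sd / slope
print(f"slope = {slope:.5f} AU per umol/L, LOD ~ {lod:.2f} umol/L")
```

With a steeper calibration slope (as in fluorescence detection of the oxonol dye), the same rule yields a proportionally lower detection limit, matching the order-of-magnitude gap reported between the UV and fluorescence limits.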

  3. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    PubMed

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
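The hierarchical Bayesian machinery used by the authors is beyond a short example, but the core idea of formally testing one toolbox strategy against another is comparing how likely each strategy makes the observed choices. The following is a deliberately minimal sketch, not the authors' method: it assumes a toy error model in which a strategy predicts each choice correctly with probability 1 − ε, and the prediction vectors are hypothetical.

```python
from math import prod

def likelihood(predictions, choices, epsilon):
    """Probability of the observed choices under a strategy that predicts
    each choice correctly with probability 1 - epsilon (a toy error model)."""
    return prod((1 - epsilon) if p == c else epsilon
                for p, c in zip(predictions, choices))

observed = [1, 1, 0, 1, 0, 1]
strategy_a = [1, 1, 0, 1, 0, 0]   # hypothetical strategy-A predictions
strategy_b = [1, 0, 0, 0, 1, 1]   # hypothetical strategy-B predictions
la = likelihood(strategy_a, observed, 0.1)
lb = likelihood(strategy_b, observed, 0.1)
print(f"Bayes factor A vs B = {la / lb:.1f}")
```

The hierarchical extension in the paper effectively performs this comparison jointly across participants, with priors constraining how strategy use varies in the group.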

  4. Distributions of Mutational Effects and the Estimation of Directional Selection in Divergent Lineages of Arabidopsis thaliana.

    PubMed

    Park, Briton; Rutter, Matthew T; Fenster, Charles B; Symonds, V Vaughan; Ungerer, Mark C; Townsend, Jeffrey P

    2017-08-01

Mutations are crucial to evolution, providing the ultimate source of variation on which natural selection acts. Due to their key role, the distribution of mutational effects on quantitative traits is a key component to any inference regarding historical selection on phenotypic traits. In this paper, we expand on a previously developed test for selection that could be conducted assuming a Gaussian mutation effect distribution by developing approaches to also incorporate any of a family of heavy-tailed Laplace distributions of mutational effects. We apply the test to detect directional natural selection on five traits along the divergence of Columbia and Landsberg lineages of Arabidopsis thaliana, constituting the first test for natural selection in any organism using quantitative trait locus and mutation accumulation data to quantify the intensity of directional selection on a phenotypic trait. We demonstrate that the results of the test for selection can depend on the mutation effect distribution specified. Using the distributions exhibiting the best fit to mutation accumulation data, we infer that natural directional selection caused divergence in the rosette diameter and trichome density traits of the Columbia and Landsberg lineages. Copyright © 2017 by the Genetics Society of America.

  5. The perennial problem of variability in adenosine triphosphate (ATP) tests for hygiene monitoring within healthcare settings.

    PubMed

    Whiteley, Greg S; Derry, Chris; Glasbey, Trevor; Fahey, Paul

    2015-06-01

    To investigate the reliability of commercial ATP bioluminometers and to document precision and variability measurements using known and quantitated standard materials. Four commercially branded ATP bioluminometers and their consumables were subjected to a series of controlled studies with quantitated materials in multiple repetitions of dilution series. The individual dilutions were applied directly to ATP swabs. To assess precision and reproducibility, each dilution step was tested in triplicate or quadruplicate and the RLU reading from each test point was recorded. Results across the multiple dilution series were normalized using the coefficient of variation. The results for pure ATP and bacterial ATP from suspensions of Staphylococcus epidermidis and Pseudomonas aeruginosa are presented graphically. The data indicate that precision and reproducibility are poor across all brands tested. Standard deviation was as high as 50% of the mean for all brands, and in the field users are not provided any indication of this level of imprecision. The variability of commercial ATP bioluminometers and their consumables is unacceptably high with the current technical configuration. The advantage of speed of response is undermined by instrument imprecision expressed in the numerical scale of relative light units (RLU).
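The study normalizes results across dilution series using the coefficient of variation. A minimal sketch of that statistic on a hypothetical quadruplicate of RLU readings (the values are illustrative, not from the paper):

```python
from statistics import mean, stdev

def coefficient_of_variation(readings):
    """Sample CV, expressed as a percentage of the mean."""
    return 100.0 * stdev(readings) / mean(readings)

# Hypothetical RLU readings for one dilution step, tested in quadruplicate.
rlu = [1200, 1850, 900, 1500]
cv = coefficient_of_variation(rlu)
print(f"CV = {cv:.1f}% of the mean")
```

A CV near the 50%-of-mean standard deviation reported above would mean that two swabs of the same surface could plausibly return RLU values differing by a factor of two or more.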

  6. Behavioral momentum and resurgence: Effects of time in extinction and repeated resurgence tests

    PubMed Central

    Shahan, Timothy A.

    2014-01-01

    Resurgence is an increase in a previously extinguished operant response that occurs if an alternative reinforcement introduced during extinction is removed. Shahan and Sweeney (2011) developed a quantitative model of resurgence based on behavioral momentum theory that captures existing data well and predicts that resurgence should decrease as time in extinction and exposure to the alternative reinforcement increases. Two experiments tested this prediction. The data from Experiment 1 suggested that without a return to baseline, resurgence decreases with increased exposure to alternative reinforcement and to extinction of the target response. Experiment 2 tested the predictions of the model across two conditions, one with constant alternative reinforcement for five sessions, and the other with alternative reinforcement removed three times. In both conditions, the alternative reinforcement was removed for the final test session. Experiment 2 again demonstrated a decrease in relapse across repeated resurgence tests. Furthermore, comparably little resurgence was observed at the same time point in extinction in the final test, despite dissimilar previous exposures to alternative reinforcement removal. The quantitative model provided a good description of the observed data in both experiments. More broadly, these data suggest that increased exposure to extinction may be a successful strategy to reduce resurgence. The relationship between these data and existing tests of the effect of time in extinction on resurgence is discussed. PMID:23982985

  7. MO-AB-206-02: Testing Gamma Cameras Based On TG177 WG Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halama, J.

    2016-06-15

This education session will cover the physics and operation principles of gamma cameras and PET scanners. The first talk will focus on PET imaging. An overview of the principles of PET imaging will be provided, including positron decay physics and the transition from 2D to 3D imaging. More recent advances in hardware and software will be discussed, such as time-of-flight imaging and improvements in reconstruction algorithms that provide for options such as depth-of-interaction corrections. Quantitative applications of PET will be discussed, as well as the requirements for doing accurate quantitation. Relevant performance tests will also be described. Learning Objectives: Be able to describe basic physics principles of PET and operation of PET scanners. Learn about recent advances in PET scanner hardware technology. Be able to describe advances in reconstruction techniques and improvements. Be able to list relevant performance tests. The second talk will focus on gamma cameras. The Nuclear Medicine subcommittee has charged a task group (TG177) to develop a report on the current state of physics testing of gamma cameras, SPECT, and SPECT/CT systems. The report makes recommendations for performance tests to be done for routine quality assurance, annual physics testing, and acceptance tests, and identifies those needed to satisfy the ACR accreditation program and The Joint Commission imaging standards. The report is also intended to be used as a manual with detailed instructions on how to perform tests under widely varying conditions. Learning Objectives: At the end of the presentation, members of the audience will: Be familiar with the tests recommended for routine quality assurance, annual physics testing, and acceptance tests of gamma cameras for planar imaging. Be familiar with the tests recommended for routine quality assurance, annual physics testing, and acceptance tests of SPECT systems. Be familiar with the tests of a SPECT/CT system that include the CT images for SPECT reconstructions. Become knowledgeable of items to be included in annual acceptance testing reports, including CT dosimetry and PACS monitor measurements. T. Turkington, GE Healthcare.

  8. MO-AB-206-00: Nuclear Medicine Physics and Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

This education session will cover the physics and operation principles of gamma cameras and PET scanners. The first talk will focus on PET imaging. An overview of the principles of PET imaging will be provided, including positron decay physics and the transition from 2D to 3D imaging. More recent advances in hardware and software will be discussed, such as time-of-flight imaging and improvements in reconstruction algorithms that provide for options such as depth-of-interaction corrections. Quantitative applications of PET will be discussed, as well as the requirements for doing accurate quantitation. Relevant performance tests will also be described. Learning Objectives: Be able to describe basic physics principles of PET and operation of PET scanners. Learn about recent advances in PET scanner hardware technology. Be able to describe advances in reconstruction techniques and improvements. Be able to list relevant performance tests. The second talk will focus on gamma cameras. The Nuclear Medicine subcommittee has charged a task group (TG177) to develop a report on the current state of physics testing of gamma cameras, SPECT, and SPECT/CT systems. The report makes recommendations for performance tests to be done for routine quality assurance, annual physics testing, and acceptance tests, and identifies those needed to satisfy the ACR accreditation program and The Joint Commission imaging standards. The report is also intended to be used as a manual with detailed instructions on how to perform tests under widely varying conditions. Learning Objectives: At the end of the presentation, members of the audience will: Be familiar with the tests recommended for routine quality assurance, annual physics testing, and acceptance tests of gamma cameras for planar imaging. Be familiar with the tests recommended for routine quality assurance, annual physics testing, and acceptance tests of SPECT systems. Be familiar with the tests of a SPECT/CT system that include the CT images for SPECT reconstructions. Become knowledgeable of items to be included in annual acceptance testing reports, including CT dosimetry and PACS monitor measurements. T. Turkington, GE Healthcare.

  9. Test Plan: WIPP bin-scale CH TRU waste tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molecke, M.A.

    1990-08-01

This WIPP Bin-Scale CH TRU Waste Test program described herein will provide relevant composition and kinetic rate data on gas generation and consumption resulting from TRU waste degradation, as impacted by synergistic interactions due to multiple degradation modes, waste form preparation, long-term repository environmental effects, engineered barrier materials, and, possibly, engineered modifications to be developed. Similar data on waste-brine leachate compositions and potentially hazardous volatile organic compounds released by the wastes will also be provided. The quantitative data output from these tests and associated technical expertise are required by the WIPP Performance Assessment (PA) program studies, and for the scientific benefit of the overall WIPP project. This Test Plan describes the necessary scientific and technical aspects, justifications, and rationale for successfully initiating and conducting the WIPP Bin-Scale CH TRU Waste Test program. This Test Plan is the controlling scientific design definition and overall requirements document for this WIPP in situ test, as defined by Sandia National Laboratories (SNL), scientific advisor to the US Department of Energy, WIPP Project Office (DOE/WPO). 55 refs., 16 figs., 19 tabs.

  10. The predictive power of Japanese candlestick charting in Chinese stock market

    NASA Astrophysics Data System (ADS)

    Chen, Shi; Bao, Si; Zhou, Yu

    2016-09-01

This paper studies the predictive power of four popular pairs of two-day bullish and bearish Japanese candlestick patterns in the Chinese stock market. Based on Morris' study, we give the quantitative details of the definition of a long candlestick, which is important in two-day candlestick pattern recognition but has been overlooked in several previous studies, and we further give quantitative definitions of these four pairs of two-day candlestick patterns. To test the predictive power of candlestick patterns on short-term price movement, we propose the definition of daily average return to alleviate the impact of correlation among stocks' overlap-time returns in statistical tests. To show the robustness of our results, two methods of trend definition are used for both the medium-market-value and large-market-value sample sets. We use the Step-SPA test to correct for data snooping bias. Statistical results show that the predictive power differs from pattern to pattern: three of the eight patterns provide both short-term and relatively long-term prediction, another pair provides significant forecasting power only within a very short-term period, while the remaining three patterns present contradictory results for different market-value groups. For all four pairs, the predictive power drops as the predicting time increases, and forecasting power is stronger for stocks with medium market value than for those with large market value.
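The paper's exact pattern definitions are not reproduced here, but a two-day pattern check of the kind being quantified can be sketched. This assumes the textbook real-body definition of a bullish engulfing pattern (a down day whose real body is engulfed by the following up day's real body; shadows ignored), with hypothetical open/close prices:

```python
def is_bullish_engulfing(day1, day2):
    """Textbook two-day bullish engulfing check on real bodies only.
    Each day is an (open, close) tuple; shadows (high/low) are ignored
    in this illustrative version."""
    o1, c1 = day1
    o2, c2 = day2
    return (c1 < o1          # day 1 is a down day
            and c2 > o2      # day 2 is an up day
            and o2 < c1      # day 2 opens below day 1's close
            and c2 > o1)     # day 2 closes above day 1's open

print(is_bullish_engulfing((10.0, 9.5), (9.3, 10.2)))  # engulfed -> True
print(is_bullish_engulfing((10.0, 9.5), (9.6, 9.9)))   # not engulfed -> False
```

A full implementation along the paper's lines would additionally apply its quantitative long-candlestick criterion before accepting a match.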

  11. Comparative Performance of Reagents and Platforms for Quantitation of Cytomegalovirus DNA by Digital PCR

    PubMed Central

    Gu, Z.; Sam, S. S.; Sun, Y.; Tang, L.; Pounds, S.; Caliendo, A. M.

    2016-01-01

A potential benefit of digital PCR is a reduction in result variability across assays and platforms. Three sets of PCR reagents were tested on two digital PCR systems (Bio-Rad and RainDance) for quantitation of cytomegalovirus (CMV). Both commercial quantitative viral standards and patient samples (n = 16) were tested. Quantitative accuracy (compared to nominal values) and variability were determined from the viral standard testing results. Quantitative correlation and variability were assessed with pairwise comparisons across all reagent-platform combinations for the clinical plasma sample results. The three reagent sets, when used to assay quantitative standards on the Bio-Rad system, all showed a high degree of accuracy, low variability, and close agreement with one another. When used on the RainDance system, one of the three reagent sets correlated much better with nominal values than did the other two. Quantitative results for patient samples showed good correlation in most pairwise comparisons, with poorer correlations for samples with low viral loads. Digital PCR is a robust method for measuring CMV viral load. Some degree of result variation may be seen, depending on the platform and reagents used; this variation appears to be greater in samples with low viral load values. PMID:27535685
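Digital PCR quantitation on both platforms rests on the same Poisson statistics: the target concentration is inferred from the fraction of partitions that stay negative. A minimal sketch of that standard calculation, with hypothetical droplet counts and partition volume (not figures from this study):

```python
import math

def dpcr_copies_per_partition(positive, total):
    """Poisson-corrected mean copies per partition:
    lambda = -ln(fraction of negative partitions)."""
    negative_fraction = (total - positive) / total
    return -math.log(negative_fraction)

def concentration(positive, total, partition_volume_ul):
    """Copies per microliter of reaction, given partition volume in uL."""
    return dpcr_copies_per_partition(positive, total) / partition_volume_ul

# Hypothetical run: 4,000 of 20,000 droplets positive, 0.00085 uL/droplet.
lam = dpcr_copies_per_partition(4000, 20000)
print(f"{lam:.4f} copies/partition, "
      f"{concentration(4000, 20000, 0.00085):.0f} copies/uL")
```

The Poisson correction matters most at high occupancy; at low viral loads the statistics are instead dominated by the small number of positive partitions, consistent with the greater variability the study observed in low-viral-load samples.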

  12. Think Pair Share with Formative Assessment for Junior High School Student

    NASA Astrophysics Data System (ADS)

    Pradana, O. R. Y.; Sujadi, I.; Pramudya, I.

    2017-09-01

Geometry requires abstract thinking, and many students struggle to understand the material well. In this case, the learning model plays a crucial role in improving student achievement, since a poorly chosen learning model will cause difficulties for students. Therefore, this study provides a quantitative evaluation of the Think Pair Share learning model combined with formative assessment, testing it on junior high school students. This research uses a quantitative pretest-posttest design with a control group and an experiment group. An ANOVA test and the Scheffe test were used to analyze the effectiveness of this learning model. The finding of this study is that student achievement in geometry with Think Pair Share using formative assessment increased significantly, probably because this learning model makes students more active during learning. We hope that in the future Think Pair Share with formative assessment will be a useful learning model for teachers and will be applied by teachers around the world, especially for geometry.
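The study's effectiveness analysis rests on a one-way ANOVA F test. A minimal stdlib sketch of the F statistic on hypothetical post-test scores (two groups standing in for the experiment and control groups; the values are illustrative, not the study's data):

```python
from statistics import mean

def one_way_anova_f(*groups):
    """F = (between-group mean square) / (within-group mean square)."""
    all_values = [x for g in groups for x in g]
    grand_mean = mean(all_values)
    k, n = len(groups), len(all_values)
    ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical post-test scores: experiment group vs. control group.
f = one_way_anova_f([80, 85, 90, 88], [70, 72, 75, 78])
print(f"F = {f:.2f}")
```

The Scheffe test would then be applied post hoc to identify which group means differ, which matters once more than two groups are compared.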

  13. Ultrastructural effects of pharmaceuticals (carbamazepine, clofibric acid, metoprolol, diclofenac) in rainbow trout (Oncorhynchus mykiss) and common carp (Cyprinus carpio).

    PubMed

    Triebskorn, R; Casper, H; Scheil, V; Schwaiger, J

    2007-02-01

    In order to assess potential effects of human pharmaceuticals in aquatic wildlife, laboratory experiments were conducted with carbamazepine, clofibric acid, metoprolol, and diclofenac using fish as test organisms. For each substance, at least one environmentally relevant concentration was tested. In liver, kidney, and gills of trout and carp exposed to carbamazepine, clofibric acid, and metoprolol, ultrastructural effects were qualitatively described and semi-quantitatively assessed. The obtained assessment values were compared with previously published data for diclofenac-induced effects in rainbow trout tissues. Quantitative analyses of protein accumulated in kidneys of diclofenac-exposed trout corroborated previously published data which indicated that diclofenac induced a severe glomerulonephritis resulting in a hyaline droplet degeneration of proximal kidney tubules. The investigations provided information on the general health status of the pharmaceutical-exposed fish, and allowed a differential diagnosis of harmful effects caused by these human pharmaceuticals in non-target species. For the different cytological effects observed, lowest observed effect concentration (LOECs) for at least three of the test substances (diclofenac, carbamazepine, metoprolol) were in the range of environmentally relevant concentrations (1 microg/L).

  14. Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.

    PubMed

    Counsell, Alyssa; Harlow, Lisa L

    2017-05-01

With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and fewer than a third of the articles reported on data complications such as missing data and violations of statistical assumptions. Strengths of, and areas needing improvement in, the reporting of quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.
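The reporting practices discussed above center on pairing significance tests with standardized effect sizes. As an illustration (not taken from the article), the most common standardized mean difference, Cohen's d with a pooled standard deviation, computed on hypothetical scores:

```python
from statistics import mean, stdev
from math import sqrt

def cohens_d(group1, group2):
    """Standardized mean difference using the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    s1, s2 = stdev(group1), stdev(group2)
    pooled = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (mean(group1) - mean(group2)) / pooled

# Hypothetical scores from two conditions.
d = cohens_d([5, 6, 7, 8, 9], [3, 4, 5, 6, 7])
print(f"d = {d:.2f}")
```

Reporting d alongside the t-test result, ideally with a confidence interval, conveys the magnitude of an effect that a p-value alone cannot.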

  15. Semi-quantitative MALDI-TOF for antimicrobial susceptibility testing in Staphylococcus aureus.

    PubMed

    Maxson, Tucker; Taylor-Howell, Cheryl L; Minogue, Timothy D

    2017-01-01

    Antibiotic resistant bacterial infections are a significant problem in the healthcare setting, in many cases requiring the rapid administration of appropriate and effective antibiotic therapy. Diagnostic assays capable of quickly and accurately determining the pathogen resistance profile are therefore crucial to initiate or modify care. Matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS) is a standard method for species identification in many clinical microbiology laboratories and is well positioned to be applied towards antimicrobial susceptibility testing. One recently reported approach utilizes semi-quantitative MALDI-TOF MS for growth rate analysis to provide a resistance profile independent of resistance mechanism. This method was previously successfully applied to Gram-negative pathogens and mycobacteria; here, we evaluated this method with the Gram-positive pathogen Staphylococcus aureus. Specifically, we used 35 strains of S. aureus and four antibiotics to optimize and test the assay, resulting in an overall accuracy rate of 95%. Application of the optimized assay also successfully determined susceptibility from mock blood cultures, allowing both species identification and resistance determination for all four antibiotics within 3 hours of blood culture positivity.
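The semi-quantitative approach described above reads susceptibility off relative growth: a strain that keeps growing in the presence of an antibiotic, as measured by MS signal, is resistant. The decision rule and threshold below are hypothetical, chosen only to illustrate the shape of such an assay readout, not the thresholds optimized in the study.

```python
def susceptibility_call(intensity_with_drug, intensity_no_drug, threshold=0.5):
    """Hypothetical decision rule: call a strain susceptible when its
    MS-derived growth signal with the antibiotic falls below a fixed
    fraction of the drug-free control. The 0.5 threshold is illustrative."""
    ratio = intensity_with_drug / intensity_no_drug
    call = "susceptible" if ratio < threshold else "resistant"
    return call, ratio

call, ratio = susceptibility_call(120.0, 1000.0)
print(call, round(ratio, 2))
```

Because this compares growth rather than detecting a specific resistance determinant, the readout is independent of resistance mechanism, which is the property the abstract emphasizes.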

  16. WE-AB-BRB-10: Filmless QA of CyberKnife MLC-Collimated and Iris-Collimated Fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gersh, J; Spectrum Medical Physics, LLC, Greenville, SC

    Purpose: Current methods of CK field-shape QA are based on the use of radiochromic film. Though accurate results can be attained, these methods are prone to error, time-consuming, and expensive. The techniques described herein perform similar QA using the FOIL Detector (Field, Output, and Image Localization). A key feature of this in-house QA solution, and central to this study, is an aSi flat-panel detector which provides the user with the means to perform accurate, immediate, and quantitative field analysis. Methods: The FOIL detector is automatically aligned in the CK beam using fiducial markers implanted within the detector case. Once the system is aligned, a treatment plan is delivered which irradiates the flat-panel imager using the field being tested. The current study tests each of the clinically-used fields shaped using the Iris variable-aperture collimation system using a plan which takes 6 minutes to deliver. The user is immediately provided with field diameter and beam profile, as well as a comparison to baseline values. Additionally, the detector is used to acquire and analyze leaf positions of the InCise multi-leaf collimation system. Results: Using a 6-minute plan consisting of 11 beams of 25 MU per beam, the FOIL detector provided the user with a quantitative analysis of all clinically-used field shapes. The FOIL detector was also able to clearly resolve field edge junctions in a picket fence test, including slight over-travel of individual leaves as well as inter-leaf leakage. Conclusion: The FOIL system provided field diameter and profile data comparable to film-based methods, delivering results much faster and with 5% of the MU used for film. When used with the MLC system, the FOIL detector provided the means for immediate quantification of the performance of the system through analysis of leaf positions in a picket fence test field.
Author is the President/Owner of Spectrum Medical Physics, LLC, a company which maintains contracts with Siemens Healthcare and Standard Imaging, Inc.

  17. Quantitative experiments to explain the change of seasons

    NASA Astrophysics Data System (ADS)

    Testa, Italo; Busarello, Gianni; Puddu, Emanuella; Leccia, Silvio; Merluzzi, Paola; Colantonio, Arturo; Moretti, Maria Ida; Galano, Silvia; Zappia, Alessandro

    2015-03-01

    The science education literature shows that students have difficulty understanding what causes the seasons. Incorrect explanations are often due to a lack of knowledge about the physical mechanisms underlying this phenomenon. To address this, we present a module in which the students engage in quantitative measurements with a photovoltaic panel to explain changes to the sunray flow on Earth’s surface over the year. The activities also provide examples of energy transfers between the incoming radiation and the environment to introduce basic features of Earth’s climate. The module was evaluated with 45 secondary school students (aged 17-18) and a pre-/post-test research design. Analysis of students’ learning outcomes supports the effectiveness of the proposed activities.
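
    The physical mechanism the module targets is that the flux of sunlight on a horizontal surface scales with the cosine of the solar zenith angle, which varies with the seasonal solar declination. A minimal sketch of that calculation is below; the declination formula is Cooper's common approximation, the calculation is for solar noon only, and the 1000 W/m² clear-sky flux is an assumed illustrative value.

```python
import math

def declination_deg(day_of_year):
    # Approximate solar declination in degrees (Cooper's formula).
    return 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))

def noon_flux(latitude_deg, day_of_year, clear_sky_flux=1000.0):
    # Flux on a horizontal surface at solar noon: S * cos(zenith angle),
    # where the noon zenith angle is |latitude - declination|.
    zenith = abs(latitude_deg - declination_deg(day_of_year))
    return clear_sky_flux * math.cos(math.radians(zenith))

# At ~41 N (e.g. southern Italy): summer solstice (~day 172) vs. winter (~day 355)
summer = noon_flux(41.0, 172)
winter = noon_flux(41.0, 355)
print(summer, winter)  # summer flux is roughly twice the winter flux
```

    Students can compare such predictions with the photovoltaic-panel measurements the module calls for.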

  18. Comparative analysis of quantitative methodologies for Vibrionaceae biofilms.

    PubMed

    Chavez-Dozal, Alba A; Nourabadi, Neda; Erken, Martina; McDougald, Diane; Nishiguchi, Michele K

    2016-11-01

    Multiple symbiotic and free-living Vibrio spp. grow as a form of microbial community known as a biofilm. In the laboratory, methods to quantify Vibrio biofilm mass include crystal violet staining, direct colony-forming unit (CFU) counting, dry biofilm cell mass measurement, and observation of development of wrinkled colonies. Another approach for bacterial biofilms also involves the use of tetrazolium (XTT) assays (used widely in studies of fungi) that are an appropriate measure of metabolic activity and vitality of cells within the biofilm matrix. This study systematically tested five techniques, among which the XTT assay and wrinkled colony measurement provided the most reproducible, accurate, and efficient methods for the quantitative estimation of Vibrionaceae biofilms.

  19. Simultaneous Measurements of Temperature and Major Species Concentration in a Hydrocarbon-Fueled Dual Mode Scramjet Using WIDECARS

    NASA Astrophysics Data System (ADS)

    Gallo, Emanuela Carolina Angela

    Width increased dual-pump enhanced coherent anti-Stokes Raman spectroscopy (WIDECARS) measurements were conducted in a McKenna air-ethylene premixed burner, at nominal equivalence ratio range between 0.55 and 2.50 to provide quantitative measurements of six major combustion species (C2H 4, N2, O2, H2, CO, CO2) concentration and temperature simultaneously. The purpose of this test was to investigate the uncertainties in the experimental and spectral modeling methods in preparation for an subsequent scramjet C2H4/air combustion test at the University of Virginia-Aerospace Research Laboratory. A broadband Pyrromethene (PM) PM597 and PM650 dye laser mixture and optical cavity were studied and optimized to excite the Raman shift of all the target species. Two hundred single shot recorded spectra were processed, theoretically fitted and then compared to computational models, to verify where chemical equilibrium or adiabatic condition occurred, providing experimental flame location and formation, species concentrations, temperature, and heat losses inputs to computational kinetic models. The Stark effect, temperature, and concentration errors are discussed. Subsequently, WIDECARS measurements of a premixed air-ethylene flame were successfully acquired in a direct connect small-scale dual-mode scramjet combustor, at University of Virginia Supersonic Combustion Facility (UVaSCF). A nominal Mach 5 flight condition was simulated (stagnation pressure p0 = 300 kPa, temperature T0 = 1200 K, equivalence ratio range ER = 0.3 -- 0.4). The purpose of this test was to provide quantitative measurements of the six major combustion species concentration and temperature. Point-wise measurements were taken by mapping four two-dimensional orthogonal planes (before, within, and two planes after the cavity flame holder) with respect to the combustor freestream direction. Two hundred single shot recorded spectra were processed and theoretically fitted. 
Mean flow and standard deviation are provided for each investigated case. Within the flame limits tested, WIDECARS data were analyzed and compared with CFD simulations and OH-PLIF measurements.

  20. Comparison of intradermal dilutional testing, skin prick testing, and modified quantitative testing for common allergens.

    PubMed

    Peltier, Jacques; Ryan, Matthew W

    2007-08-01

    To compare and correlate wheal size using the Multi-Test II applicator with the endpoint obtained by intradermal dilutional testing (IDT) for 5 common allergens. To examine the safety of modified quantitative testing (MQT) for determining immunotherapy starting doses. Prospective comparative clinical study. A total of 134 subjects were simultaneously skin tested for immediate hypersensitivity using the Multi-Test II device and IDT. There was a 77% concordance between results from IDT and results from MQT. When there was a difference, MQT predicted a safer endpoint for starting immunotherapy in all but 2 cases. Wheal size by SPT is predictive of endpoint by IDT. MQT is nearly as effective as formal IDT in determining endpoint. Modified quantitative testing appears to be a safe alternative to IDT for determining starting doses for immunotherapy.

  1. Detection of gas leakage

    DOEpatents

    Thornberg, Steven [Peralta, NM; Brown, Jason [Albuquerque, NM

    2012-06-19

    A method of detecting leaks and measuring volumes, together with an apparatus, the Power-free Pump Module (PPM): a self-contained leak-test and volume-measurement instrument that requires no external source of electrical power during leak testing or volume measurement. The invention is a portable, pneumatically controlled instrument capable of generating a vacuum, calibrating volumes, and performing quantitative leak tests on a closed test system or device, all without the use of alternating current (AC) power. Capabilities include the ability to provide a modest vacuum (less than 10 Torr), perform a pressure rise leak test, measure the gas's absolute pressure, and perform volume measurements. All operations are performed through a simple rotary control valve which controls pneumatically operated manifold valves.
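
    The pressure rise leak test mentioned above reduces to a simple relation: for a closed, evacuated volume V, the leak rate is Q = V · dP/dt. A minimal sketch, with illustrative numbers and function names of my own choosing:

```python
def pressure_rise_leak_rate(volume_l, p_start_torr, p_end_torr, elapsed_s):
    """Leak rate Q = V * dP/dt, in Torr·L/s, for a closed evacuated volume.

    Assumes an ideal gas at constant temperature and a constant leak rate
    over the measurement interval.
    """
    if elapsed_s <= 0:
        raise ValueError("elapsed time must be positive")
    return volume_l * (p_end_torr - p_start_torr) / elapsed_s

# Illustrative: a 2 L test volume rising from 5 Torr to 8 Torr over 600 s
q = pressure_rise_leak_rate(2.0, 5.0, 8.0, 600.0)
print(q)  # 0.01 Torr·L/s
```

    Since the PPM also measures absolute pressure and calibrates volumes, both inputs to this relation are available from the instrument itself.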

  2. 42 CFR 493.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... throughout the reportable range for patient test results. Challenge means, for quantitative tests, an... requirement. Target value for quantitative tests means either the mean of all participant responses after... scientific protocol, a comparative method or a method group (“peer” group) may be used. If the method group...

  3. 42 CFR 493.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... throughout the reportable range for patient test results. Challenge means, for quantitative tests, an... requirement. Target value for quantitative tests means either the mean of all participant responses after... scientific protocol, a comparative method or a method group (“peer” group) may be used. If the method group...

  4. 42 CFR 493.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... throughout the reportable range for patient test results. Challenge means, for quantitative tests, an... requirement. Target value for quantitative tests means either the mean of all participant responses after... scientific protocol, a comparative method or a method group (“peer” group) may be used. If the method group...

  5. 42 CFR 493.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... throughout the reportable range for patient test results. Challenge means, for quantitative tests, an... requirement. Target value for quantitative tests means either the mean of all participant responses after... scientific protocol, a comparative method or a method group (“peer” group) may be used. If the method group...

  6. 42 CFR 493.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... throughout the reportable range for patient test results. Challenge means, for quantitative tests, an... requirement. Target value for quantitative tests means either the mean of all participant responses after... scientific protocol, a comparative method or a method group (“peer” group) may be used. If the method group...

  7. Applications of multivariate modeling to neuroimaging group analysis: A comprehensive alternative to univariate general linear model

    PubMed Central

    Chen, Gang; Adleman, Nancy E.; Saad, Ziad S.; Leibenluft, Ellen; Cox, Robert W.

    2014-01-01

    All neuroimaging packages can handle group analysis with t-tests or general linear modeling (GLM). However, they are quite hamstrung when there are multiple within-subject factors or when quantitative covariates are involved in the presence of a within-subject factor. In addition, sphericity is typically assumed for the variance–covariance structure when there are more than two levels in a within-subject factor. To overcome such limitations in the traditional AN(C)OVA and GLM, we adopt a multivariate modeling (MVM) approach to analyzing neuroimaging data at the group level with the following advantages: a) there is no limit on the number of factors as long as sample sizes are deemed appropriate; b) quantitative covariates can be analyzed together with within-subject factors; c) when a within-subject factor is involved, three testing methodologies are provided: traditional univariate testing (UVT) with sphericity assumption (UVT-UC) and with correction when the assumption is violated (UVT-SC), and within-subject multivariate testing (MVT-WS); d) to correct for sphericity violation at the voxel level, we propose a hybrid testing (HT) approach that achieves equal or higher power via combining traditional sphericity correction methods (Greenhouse–Geisser and Huynh–Feldt) with MVT-WS. PMID:24954281

  8. Conducting field studies for testing pesticide leaching models

    USGS Publications Warehouse

    Smith, Charles N.; Parrish, Rudolph S.; Brown, David S.

    1990-01-01

    A variety of predictive models are being applied to evaluate the transport and transformation of pesticides in the environment. These include well known models such as the Pesticide Root Zone Model (PRZM), the Risk of Unsaturated-Saturated Transport and Transformation Interactions for Chemical Concentrations Model (RUSTIC) and the Groundwater Loading Effects of Agricultural Management Systems Model (GLEAMS). The potentially large impacts of using these models as tools for developing pesticide management strategies and regulatory decisions necessitate development of sound model validation protocols. This paper offers guidance on many of the theoretical and practical problems encountered in the design and implementation of field-scale model validation studies. Recommendations are provided for site selection and characterization, test compound selection, data needs, measurement techniques, statistical design considerations and sampling techniques. A strategy is provided for quantitatively testing models using field measurements.

  9. On the stability and instantaneous velocity of grasped frictionless objects

    NASA Technical Reports Server (NTRS)

    Trinkle, Jeffrey C.

    1992-01-01

    A quantitative test for form closure valid for any number of contact points is formulated as a linear program, the optimal objective value of which provides a measure of how far a grasp is from losing form closure. Another contribution of the study is the formulation of a linear program whose solution yields the same information as the classical approach. The benefit of the formulation is that explicit testing of all possible combinations of contact interactions can be avoided by the algorithm used to solve the linear program.
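
    The idea of a form-closure margin as a linear program can be illustrated in a simplified planar, force-only setting (an illustration in the spirit of the abstract, not the paper's exact formulation): maximize the smallest contact-force coefficient d subject to wrench balance and a normalization. If the optimum d is positive, the contact wrenches positively span the space and the grasp has closure; the size of d measures how far the grasp is from losing it.

```python
import numpy as np
from scipy.optimize import linprog

# Columns of W are unit contact force directions (three normals 120° apart).
# LP: maximize d subject to  W @ c = 0,  sum(c) = 1,  c_i >= d >= 0.
W = np.array([[1.0, -0.5, -0.5],
              [0.0, np.sqrt(3) / 2, -np.sqrt(3) / 2]])

k = W.shape[1]
c_obj = np.zeros(k + 1)
c_obj[-1] = -1.0                      # linprog minimizes, so minimize -d
A_eq = np.vstack([np.hstack([W, np.zeros((2, 1))]),      # wrench balance
                  np.hstack([np.ones((1, k)), np.zeros((1, 1))])])  # sum(c) = 1
b_eq = np.array([0.0, 0.0, 1.0])
A_ub = np.hstack([-np.eye(k), np.ones((k, 1))])          # d - c_i <= 0
b_ub = np.zeros(k)

res = linprog(c_obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * (k + 1))
print(res.x[-1])  # closure margin d; 1/3 for this symmetric grasp
```

    Solving one LP replaces the explicit enumeration of contact-interaction combinations required by the classical approach.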

  10. Scratching as a Fracture Process: From Butter to Steel

    NASA Astrophysics Data System (ADS)

    Akono, A.-T.; Reis, P. M.; Ulm, F.-J.

    2011-05-01

    We present results of a hybrid experimental and theoretical investigation of the fracture scaling in scratch tests and show that scratching is a fracture dominated process. Validated for paraffin wax, cement paste, Jurassic limestone and steel, we derive a model that provides a quantitative means to relate quantities measured in scratch tests to fracture properties of materials at multiple scales. The scalability of scratching for different probes and depths opens new avenues toward miniaturization of our technique, to extract fracture properties of materials at even smaller length scales.

  11. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    PubMed

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable.
Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. © Health Research and Educational Trust.
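
    The comparison reported here is a chi-square test on a 2x2 table of methodological components present versus absent across the two article types. A minimal sketch follows; the counts are illustrative only, chosen to roughly match the reported proportions (21.94% vs. 47.07%), not the study's raw data.

```python
from scipy.stats import chi2_contingency

# Illustrative 2x2 table: rows are article types, columns are
# methodological components present vs. absent. Counts are made up to
# approximate the reported proportions, not taken from the study.
table = [[ 90, 320],   # mixed methods: ~22% of components present
         [470, 530]]   # quantitative: ~47% of components present

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.2g}, dof={dof}")
```

    Note that `chi2_contingency` applies Yates' continuity correction by default for 2x2 tables, which slightly reduces the chi-square statistic relative to the uncorrected test.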

  12. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable.
Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. PMID:22092040

  13. Cytomegalovirus replication kinetics in solid organ transplant recipients managed by preemptive therapy.

    PubMed

    Atabani, S F; Smith, C; Atkinson, C; Aldridge, R W; Rodriguez-Perálvarez, M; Rolando, N; Harber, M; Jones, G; O'Riordan, A; Burroughs, A K; Thorburn, D; O'Beirne, J; Milne, R S B; Emery, V C; Griffiths, P D

    2012-09-01

    After allotransplantation, cytomegalovirus (CMV) may be transmitted from the donor organ, giving rise to primary infection in a CMV negative recipient or reinfection in one who is CMV positive. In addition, latent CMV may reactivate in a CMV positive recipient. In this study, serial blood samples from 689 kidney or liver transplant recipients were tested for CMV DNA by quantitative PCR. CMV was managed using preemptive antiviral therapy and no patient received antiviral prophylaxis. Dynamic and quantitative measures of viremia and treatment were assessed. Median peak viral load, duration of viremia and duration of treatment were highest during primary infection, followed by reinfection then reactivation. In patients who experienced a second episode of viremia, the viral replication rate was significantly slower than in the first episode. Our data provide a clear demonstration of the immune control of CMV in immunosuppressed patients and emphasize the effectiveness of the preemptive approach for prevention of CMV syndrome and end organ disease. Overall, our findings provide quantitative biomarkers which can be used in pharmacodynamic assessments of the ability of novel CMV vaccines or antiviral drugs to reduce or even interrupt such transmission. © Copyright 2012 The American Society of Transplantation and the American Society of Transplant Surgeons.
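
    The replication-rate comparison above rests on a standard calculation: under exponential growth, two serial viral-load measurements give a growth rate r = ln(V2/V1)/Δt and a doubling time ln(2)/r. A minimal sketch with illustrative numbers (not values from the study):

```python
import math

def replication_rate(v1, v2, days):
    """Exponential growth rate r per day, from V2 = V1 * e^(r * t)."""
    return math.log(v2 / v1) / days

def doubling_time(v1, v2, days):
    """Viral doubling time in days; a negative rate would indicate decline."""
    return math.log(2) / replication_rate(v1, v2, days)

# Illustrative: a load rising from 1,000 to 8,000 copies/mL over 3 days
# doubles three times, so the doubling time is 1 day.
print(doubling_time(1_000, 8_000, 3))  # 1.0
```

    A slower rate (longer doubling time) in a second viremia episode, as the study reports, is the quantitative signature of immune control.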

  14. A quantitative evaluation of spurious results in the infrared spectroscopic measurement of CO2 isotope ratios

    NASA Astrophysics Data System (ADS)

    Mansfield, C. D.; Rutt, H. N.

    2002-02-01

    The possible generation of spurious results, arising from the application of infrared spectroscopic techniques to the measurement of carbon isotope ratios in breath, due to coincident absorption bands has been re-examined. An earlier investigation, which approached the problem qualitatively, fulfilled its aspirations in providing an unambiguous assurance that 13C16O2/12C16O2 ratios can be confidently measured for isotopic breath tests using instruments based on infrared absorption. Although this conclusion still stands, subsequent quantitative investigation has revealed an important exception that necessitates a strict adherence to sample collection protocol. The results show that concentrations and decay rates of the coincident breath trace compounds acetonitrile and carbon monoxide, found in the breath sample of a heavy smoker, can produce spurious results. Hence, findings from this investigation justify the concern that breath trace compounds present a risk to the accurate measurement of carbon isotope ratios in breath when using broadband, non-dispersive, ground state absorption infrared spectroscopy. It provides recommendations on the length of smoking abstention required to avoid generation of spurious results and also reaffirms, through quantitative argument, the validity of using infrared absorption spectroscopy to measure CO2 isotope ratios in breath.
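
    For context, isotopic breath tests conventionally report the 13C/12C ratio in delta notation relative to the VPDB standard: δ13C (per mil) = (R_sample/R_VPDB − 1) × 1000. A minimal sketch; the VPDB ratio below is the commonly quoted value for the standard, and the sample ratio is illustrative.

```python
# Delta notation for 13C/12C measurements, relative to the VPDB standard.
# R_VPDB is the commonly quoted 13C/12C ratio of the standard.
R_VPDB = 0.0111802

def delta13c(r_sample, r_std=R_VPDB):
    """Delta-13C in per mil (‰) for a measured 13C/12C ratio."""
    return (r_sample / r_std - 1.0) * 1000.0

print(delta13c(R_VPDB))   # 0.0 by definition
print(delta13c(0.0112))   # a slightly 13C-enriched sample, in per mil
```

    Coincident absorbers such as acetonitrile and CO inflate the apparent 13C16O2 signal and hence bias this delta value, which is why the abstract's sample-collection recommendations matter.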

  15. 42 CFR 493.941 - Hematology (including routine hematology and coagulation).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... of a laboratory's responses for qualitative and quantitative hematology tests or analytes, the...) of this section. (2) For quantitative hematology tests or analytes, the program must determine the...

  16. 42 CFR 493.941 - Hematology (including routine hematology and coagulation).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... of a laboratory's responses for qualitative and quantitative hematology tests or analytes, the...) of this section. (2) For quantitative hematology tests or analytes, the program must determine the...

  17. 42 CFR 493.941 - Hematology (including routine hematology and coagulation).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... of a laboratory's responses for qualitative and quantitative hematology tests or analytes, the...) of this section. (2) For quantitative hematology tests or analytes, the program must determine the...

  18. 42 CFR 493.941 - Hematology (including routine hematology and coagulation).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... of a laboratory's responses for qualitative and quantitative hematology tests or analytes, the...) of this section. (2) For quantitative hematology tests or analytes, the program must determine the...

  19. 42 CFR 493.941 - Hematology (including routine hematology and coagulation).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... of a laboratory's responses for qualitative and quantitative hematology tests or analytes, the...) of this section. (2) For quantitative hematology tests or analytes, the program must determine the...

  20. Rapid, Standardized Method for Determination of Mycobacterium tuberculosis Drug Susceptibility by Use of Mycolic Acid Analysis▿

    PubMed Central

    Parrish, Nicole; Osterhout, Gerard; Dionne, Kim; Sweeney, Amy; Kwiatkowski, Nicole; Carroll, Karen; Jost, Kenneth C.; Dick, James

    2007-01-01

    Multidrug-resistant (MDR) Mycobacterium tuberculosis and extensively drug-resistant (XDR) M. tuberculosis are emerging public health threats, compounded by the fact that current techniques for testing the susceptibility of M. tuberculosis require several days to weeks to complete. We investigated the use of high-performance liquid chromatography (HPLC)-based quantitation of mycolic acids as a means of rapidly determining drug resistance and susceptibility in M. tuberculosis. Standard susceptibility testing and determination of the MICs of drug-susceptible (n = 26) and drug-resistant M. tuberculosis strains, including MDR M. tuberculosis strains (n = 34), were performed by using the Bactec radiometric growth system as the reference method. The HPLC-based susceptibilities of the current first-line drugs, isoniazid (INH), rifampin (RIF), ethambutol (EMB), and pyrazinamide (PZA), were determined. The vials were incubated for 72 h, and aliquots were removed for HPLC analysis by using the Sherlock mycobacterial identification system. HPLC quantitation of total mycolic acid peaks (TMAPs) was performed with treated and untreated cultures. At 72 h, the levels of agreement of the HPLC method with the reference method were 99.5% for INH, EMB, and PZA and 98.7% for RIF. The inter- and intra-assay reproducibilities varied by drug, with an average precision of 13.4%. In summary, quantitation of TMAPs is a rapid, sensitive, and accurate method for antibiotic susceptibility testing of all first-line drugs currently used against M. tuberculosis and offers the potential of providing susceptibility testing results within hours, rather than days or weeks, for clinical M. tuberculosis isolates. PMID:17913928

  1. Genetics Home Reference: prostate cancer

    MedlinePlus

    ... prostate cancer Genetic Testing Registry: Prostate cancer aggressiveness quantitative trait locus on chromosome 19 Genetic Testing Registry: ... OMIM (25 links) PROSTATE CANCER PROSTATE CANCER AGGRESSIVENESS QUANTITATIVE TRAIT LOCUS ON CHROMOSOME 19 PROSTATE CANCER ANTIGEN ...

  2. Barriers to workplace HIV testing in South Africa: a systematic review of the literature.

    PubMed

    Weihs, Martin; Meyer-Weitz, Anna

    2016-01-01

    Low workplace HIV testing uptake makes effective management of HIV and AIDS difficult for South African organisations. Identifying barriers to workplace HIV testing is therefore crucial to inform urgently needed interventions aimed at increasing workplace HIV testing. This study reviewed literature on workplace HIV testing barriers in South Africa. Pubmed, ScienceDirect, PsycInfo and SA Publications were systematically searched. Studies needed to include measures to assess perceived or real barriers to participate in HIV Counselling and Testing (HCT) at the workplace or discuss perceived or real barriers of HIV testing at the workplace based on collected data, provide qualitative or quantitative evidence related to the research topic and needed to refer to workplaces in South Africa. Barriers were defined as any factor on economic, social, personal, environmental or organisational level preventing employees from participating in workplace HIV testing. Four peer-reviewed studies were included, two with quantitative and two with qualitative study designs. The overarching barriers across the studies were fear of compromised confidentiality, being stigmatised or discriminated in the event of testing HIV positive or being observed participating in HIV testing, and a low personal risk perception. Furthermore, it appeared that an awareness of an HIV-positive status hindered HIV testing at the workplace. Further research evidence of South African workplace barriers to HIV testing will enhance related interventions. This systematic review only found very little and contextualised evidence about workplace HCT barriers in South Africa, making it difficult to generalise, and not really sufficient to inform new interventions aimed at increasing workplace HCT uptake.

  3. Comparison of two quantitative fit-test methods using N95 filtering facepiece respirators.

    PubMed

    Sietsema, Margaret; Brosseau, Lisa M

    2016-08-01

    Current regulations require annual fit testing before an employee can wear a respirator during work activities. The goal of this research is to determine whether respirator fit measured with two TSI Portacount instruments simultaneously sampling ambient particle concentrations inside and outside of the respirator facepiece is similar to fit measured during an ambient aerosol condensation nuclei counter quantitative fit test. Sixteen subjects (ten female; six male) were recruited for a range of facial sizes. Each subject donned an N95 filtering facepiece respirator, completed two fit tests in random order (ambient aerosol condensation nuclei counter quantitative fit test and two-instrument real-time fit test) without removing or adjusting the respirator between tests. Fit tests were compared using Spearman's rank correlation coefficients. The real-time two-instrument method fit factors were similar to those measured with the single-instrument quantitative fit test. The first four exercises were highly correlated (r > 0.7) between the two protocols. Respirator fit was altered during the talking or grimace exercise, both of which involve facial movements that could dislodge the facepiece. Our analyses suggest that the new real-time two-instrument methodology can be used in future studies to evaluate fit before and during work activities.
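
    Both protocols compared above reduce each exercise to a fit factor, the ratio of ambient to in-facepiece particle concentration, and combine exercises with a harmonic mean, the standard practice in quantitative fit testing. A minimal sketch with illustrative concentrations:

```python
def fit_factor(c_ambient, c_in_mask):
    """Per-exercise fit factor: ambient over in-facepiece concentration."""
    return c_ambient / c_in_mask

def overall_fit_factor(per_exercise_ffs):
    """Overall fit factor: harmonic mean of the per-exercise fit factors."""
    n = len(per_exercise_ffs)
    return n / sum(1.0 / ff for ff in per_exercise_ffs)

# Illustrative: four exercises at 10,000 particles/cc ambient, with
# in-mask counts of 100, 50, 50, and 100 particles/cc.
ffs = [fit_factor(10_000, 100), fit_factor(10_000, 50),
       fit_factor(10_000, 50), fit_factor(10_000, 100)]
print(overall_fit_factor(ffs))  # ~133.3
```

    The harmonic mean is used because it is dominated by the worst (lowest) exercise, matching the protective intent of the test.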

  4. Barriers and facilitators to HIV and sexually transmitted infections testing for gay, bisexual, and other transgender men who have sex with men.

    PubMed

    Scheim, Ayden I; Travers, Robb

    2017-08-01

    Transgender men who have sex with men (trans MSM) may be at elevated risk for HIV and other sexually transmitted infections (STI), and therefore require access to HIV and STI testing services. However, trans people often face stigma, discrimination, and gaps in provider competence when attempting to access health care and may therefore postpone, avoid, or be refused care. In this context, quantitative data have indicated low access to, and uptake of, HIV testing among trans MSM. The present manuscript aimed to identify trans MSM's perspectives on barriers and facilitators to HIV and STI testing. As part of a community-based research project investigating HIV risk and resilience among trans MSM, 40 trans MSM aged 18 and above and living in Ontario, Canada participated in one-on-one qualitative interviews in 2013. Participants described a number of barriers to HIV and other STI testing. These included both trans-specific and general difficulties in accessing sexual health services, lack of trans health knowledge among testing providers, limited clinical capacity to meet STI testing needs, and a perceived gap between trans-inclusive policies and their implementation in practice. Two major facilitators were identified: access to trusted and flexible testing providers, and integration of testing with ongoing monitoring for hormone therapy. Based on these findings, we provide recommendations for enhancing access to HIV and STI testing for this key population.

  5. Summarizing systematic reviews: methodological development, conduct and reporting of an umbrella review approach.

    PubMed

    Aromataris, Edoardo; Fernandez, Ritin; Godfrey, Christina M; Holly, Cheryl; Khalil, Hanan; Tungpunkom, Patraporn

    2015-09-01

    With the increase in the number of systematic reviews available, a logical next step to provide decision makers in healthcare with the evidence they require has been the conduct of reviews of existing systematic reviews. Syntheses of existing systematic reviews are referred to by many different names, one of which is an umbrella review. An umbrella review allows the findings of reviews relevant to a review question to be compared and contrasted. An umbrella review's most characteristic feature is that this type of evidence synthesis only considers for inclusion the highest level of evidence, namely other systematic reviews and meta-analyses. A methodology working group was formed by the Joanna Briggs Institute to develop methodological guidance for the conduct of an umbrella review, including diverse types of evidence, both quantitative and qualitative. The aim of this study is to describe the development and guidance for the conduct of an umbrella review. Discussion and testing of the elements of methods for the conduct of an umbrella review were held over a 6-month period by members of a methodology working group. The working group comprised six participants who corresponded via teleconference, e-mail and face-to-face meeting during this development period. In October 2013, the methodology was presented in a workshop at the Joanna Briggs Institute Convention. Workshop participants, review authors and methodologists provided further testing, critique and feedback on the proposed methodology. This study describes the methodology and methods developed for the conduct of an umbrella review that includes published systematic reviews and meta-analyses as the analytical unit of the review. Details are provided regarding the essential elements of an umbrella review, including presentation of the review question in a Population, Intervention, Comparator, Outcome format, nuances of the inclusion criteria and search strategy. A critical appraisal tool with 10 questions to help assess risk of bias in systematic reviews and meta-analyses was also developed and tested. Relevant details to extract from included reviews and how to best present the findings of both quantitative and qualitative systematic reviews in a reader-friendly format are provided. Umbrella reviews provide a ready means for decision makers in healthcare to gain a clear understanding of a broad topic area. The umbrella review methodology described here is the first to consider reviews that report other than quantitative evidence derived from randomized controlled trials. The methodology includes an easy-to-use and informative summary of evidence table to readily provide decision makers with the available, highest level of evidence relevant to the question posed.

  6. An autoanalyzer test for the quantitation of platelet-associated IgG

    NASA Technical Reports Server (NTRS)

    Levitan, Nathan; Teno, Richard A.; Szymanski, Irma O.

    1986-01-01

    A new quantitative antiglobulin consumption (QAC) test for the measurement of platelet-associated IgG is described. In this test washed platelets are incubated with anti-IgG at a final dilution of 1:2 million. The unneutralized fraction of anti-IgG remaining in solution is then measured with an Autoanalyzer and soluble IgG is used for calibration. The dose-response curves depicting the percent neutralization of anti-IgG by platelets and by soluble IgG were compared in detail and found to be nearly identical, indicating that platelet-associated IgG can be accurately quantitated by this method. The mean IgG values were 2,287 molecules/platelet for normal adults and 38,112 molecules/platelet for ITP patients. The Autoanalyzer QAC test is a sensitive and reproducible assay for the quantitation of platelet-associated IgG.

  7. Application of programmable bio-nano-chip system for the quantitative detection of drugs of abuse in oral fluids.

    PubMed

    Christodoulides, Nicolaos; De La Garza, Richard; Simmons, Glennon W; McRae, Michael P; Wong, Jorge; Newton, Thomas F; Smith, Regina; Mahoney, James J; Hohenstein, Justin; Gomez, Sobeyda; Floriano, Pierre N; Talavera, Humberto; Sloan, Daniel J; Moody, David E; Andrenyak, David M; Kosten, Thomas R; Haque, Ahmed; McDevitt, John T

    2015-08-01

    There is currently a gap in on-site drug of abuse monitoring. Current detection methods involve invasive sampling of blood and urine specimens, or collection of oral fluid, followed by qualitative screening tests using immunochromatographic cartridges. While remote laboratories then may provide confirmation and quantitative assessment of a presumptive positive, this instrumentation is expensive and decoupled from the initial sampling, making the current drug-screening program inefficient and costly. The authors applied a noninvasive oral fluid sampling approach integrated with the in-development chip-based Programmable bio-nano-chip (p-BNC) platform for the detection of drugs of abuse. The p-BNC assay methodology was applied for the detection of tetrahydrocannabinol, morphine, amphetamine, methamphetamine, cocaine, methadone and benzodiazepines, initially using spiked buffered samples and, ultimately, using oral fluid specimens collected from consented volunteers. Rapid (∼10 min), sensitive detection (∼ng/mL) and quantitation of 12 drugs of abuse were demonstrated on the p-BNC platform. Furthermore, the system provided visibility into the time-course of select drug and metabolite profiles in oral fluids; for the drug cocaine, three regions of slope were observed that, when combined with concentration measurements from this and prior impairment studies, may reveal information about cocaine-induced impairment. This chip-based p-BNC detection modality has significant potential to be used in the future by law enforcement officers for roadside drug testing and to serve a variety of other settings, including outpatient and inpatient drug rehabilitation centers, emergency rooms, prisons, schools, and in the workplace. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Application of Programmable Bio-Nano-Chip System for the Quantitative Detection of Drugs of Abuse in Oral Fluids*

    PubMed Central

    Christodoulides, Nicolaos; De La Garza, Richard; Simmons, Glennon W.; McRae, Michael P.; Wong, Jorge; Newton, Thomas F.; Smith, Regina; Mahoney, James J.; Hohenstein, Justin; Gomez, Sobeyda; Floriano, Pierre N.; Talavera, Humberto; Sloan, Daniel J.; Moody, David E.; Andrenyak, David M.; Kosten, Thomas R.; Haque, Ahmed; McDevitt, John T.

    2015-01-01

    Objective There is currently a gap in on-site drug of abuse monitoring. Current detection methods involve invasive sampling of blood and urine specimens, or collection of oral fluid, followed by qualitative screening tests using immunochromatographic cartridges. While remote laboratories then may provide confirmation and quantitative assessment of a presumptive positive, this instrumentation is expensive and decoupled from the initial sampling, making the current drug-screening program inefficient and costly. The authors applied a noninvasive oral fluid sampling approach integrated with the in-development chip-based Programmable Bio-Nano-Chip (p-BNC) platform for the detection of drugs of abuse. Method The p-BNC assay methodology was applied for the detection of tetrahydrocannabinol, morphine, amphetamine, methamphetamine, cocaine, methadone and benzodiazepines, initially using spiked buffered samples and, ultimately, using oral fluid specimens collected from consented volunteers. Results Rapid (~10 minutes), sensitive detection (~ng/ml) and quantitation of 12 drugs of abuse were demonstrated on the p-BNC platform. Furthermore, the system provided visibility into the time-course of select drug and metabolite profiles in oral fluids; for the drug cocaine, three regions of slope were observed that, when combined with concentration measurements from this and prior impairment studies, may reveal information about cocaine-induced impairment. Conclusions This chip-based p-BNC detection modality has significant potential to be used in the future by law enforcement officers for roadside drug testing and to serve a variety of other settings, including outpatient and inpatient drug rehabilitation centers, emergency rooms, prisons, schools, and in the workplace. PMID:26048639

  9. Ion Counting from Explicit-Solvent Simulations and 3D-RISM

    PubMed Central

    Giambaşu, George M.; Luchko, Tyler; Herschlag, Daniel; York, Darrin M.; Case, David A.

    2014-01-01

    The ionic atmosphere around nucleic acids remains only partially understood at atomic-level detail. Ion counting (IC) experiments provide a quantitative measure of the ionic atmosphere around nucleic acids and, as such, are a natural route for testing quantitative theoretical approaches. In this article, we replicate IC experiments involving duplex DNA in NaCl(aq) using molecular dynamics (MD) simulation, the three-dimensional reference interaction site model (3D-RISM), and nonlinear Poisson-Boltzmann (NLPB) calculations and test against recent buffer-equilibration atomic emission spectroscopy measurements. Further, we outline the statistical mechanical basis for interpreting IC experiments and clarify the use of specific concentration scales. Near physiological concentrations, MD simulation and 3D-RISM estimates are close to experimental results, but at higher concentrations (>0.7 M), both methods underestimate the number of condensed cations and overestimate the number of excluded anions. The effect of DNA charge on ion and water atmosphere extends 20–25 Å from its surface, yielding layered density profiles. Overall, ion distributions from 3D-RISMs are relatively close to those from corresponding MD simulations, but with less Na+ binding in grooves and tighter binding to phosphates. NLPB calculations, on the other hand, systematically underestimate the number of condensed cations at almost all concentrations and yield nearly structureless ion distributions that are qualitatively distinct from those generated by both MD simulation and 3D-RISM. These results suggest that MD simulation and 3D-RISM may be further developed to provide quantitative insight into the characterization of the ion atmosphere around nucleic acids and their effect on structure and stability. PMID:24559991

  10. Assessment of the clinical relevance of quantitative sensory testing with Von Frey monofilaments in patients with allodynia and neuropathic pain. A pilot study.

    PubMed

    Keizer, D; van Wijhe, M; Post, W J; Uges, D R A; Wierda, J M K H

    2007-08-01

    Allodynia is a common and disabling symptom in many patients with neuropathic pain. Whereas quantification of pain mostly depends on subjective pain reports, allodynia can also be measured objectively with quantitative sensory testing. In this pilot study, we investigated the clinical relevance of quantitative sensory testing with Von Frey monofilaments in patients with allodynia as a consequence of a neuropathic pain syndrome, by means of correlating subjective pain scores with pain thresholds obtained with quantitative sensory testing. During a 4-week trial, we administered a cannabis extract to 17 patients with allodynia. We quantified the severity of the allodynia with Von Frey monofilaments before, during and after the patients finished the trial. We also asked the patients to rate their pain on a numeric rating scale at these three moments. We found that most of the effect of the cannabis occurred in the last 2 weeks of the trial. In this phase, we observed that the pain thresholds, as measured with Von Frey monofilaments, were inversely correlated with a decrease of the perceived pain intensity. These preliminary findings indicate clinical relevance of quantitative sensory testing with Von Frey monofilaments in the quantification of allodynia in patients with neuropathic pain, although confirmation of our data is still required in further studies to position this method of quantitative sensory testing as a valuable tool, for example, in the evaluation of therapeutic interventions for neuropathic pain.

  11. Skype Synchronous Interaction Effectiveness in a Quantitative Management Science Course

    ERIC Educational Resources Information Center

    Strang, Kenneth David

    2012-01-01

    An experiment compared asynchronous versus synchronous instruction in an online quantitative course. Mann-Whitney U-tests, correlation, analysis of variance, t tests, and multivariate analysis of covariance (MANCOVA) were utilized to test the hypothesis that more high-quality online experiential learning interactions would increase grade.…

  12. Accelerated Colorimetric Micro-assay for Screening Mold Inhibitors

    Treesearch

    Carol A. Clausen; Vina W. Yang

    2014-01-01

    Rapid quantitative laboratory test methods are needed to screen potential antifungal agents. Existing laboratory test methods are relatively time consuming, may require specialized test equipment and rely on subjective visual ratings. A quantitative, colorimetric micro-assay has been developed that uses XTT tetrazolium salt to metabolically assess mold spore...

  13. The Appropriateness of Consolidation in Illinois: A Study of the Impact of Poverty, District Type, and Size on Expenditures and Achievement

    ERIC Educational Resources Information Center

    Dunlap, James A.

    2013-01-01

    This study examined whether or not enrollment, poverty rate, and district type could be used to predict cost and achievement, as measured on the Illinois Standards Achievement Test and Prairie State Achievement Exam, at the building and district levels within the state of Illinois. This study provides quantitative data that will aid educational…

  14. Contributions to the study of inductive transducers for measuring the amplitude of vibrations in solid media

    NASA Technical Reports Server (NTRS)

    Dragan, O.; Galan, N.; Sirbu, A.; Ghita, C.

    1974-01-01

    The design and construction of inductive transducers for measuring the vibrations in metal bars at ultrasonic frequencies are discussed. Illustrations of the inductive transducers are provided. The quantitative relations that are useful in designing the transducers are analyzed. Mathematical models are developed to substantiate the theoretical considerations. Results obtained with laboratory equipment in testing specified metal samples are included.

  15. Gender in Motion: Developmental Changes in Students' Conceptualizations of Gender through Participation in a First-Year Seminar Course

    ERIC Educational Resources Information Center

    Walters, Andrew S.; Sylaska, Kateryna M.

    2012-01-01

    Students enrolled in a first-year seminar course focused on gender provided attitudinal and experiential responses at two points during the course: during the first week of class and during the last week of class. A qualitative-quantitative method using concurrent triangulation was used to investigate pre- and post-test responses to core concepts…

  16. Relative-Error-Covariance Algorithms

    NASA Technical Reports Server (NTRS)

    Bierman, Gerald J.; Wolff, Peter J.

    1991-01-01

    Two algorithms compute error covariance of difference between optimal estimates, based on data acquired during overlapping or disjoint intervals, of state of discrete linear system. Provides quantitative measure of mutual consistency or inconsistency of estimates of states. Relative-error-covariance concept applied to determine degree of correlation between trajectories calculated from two overlapping sets of measurements and construct real-time test of consistency of state estimates based upon recently acquired data.

  17. Testing Our Assumptions: The Role of First Course Grade and Course Level in Mathematics and English

    ERIC Educational Resources Information Center

    Callahan, Janet; Belcheir, Marcia

    2017-01-01

    Methods that provide an early indicator of factors that affect student persistence are important to colleges and universities. This quantitative research focused on the role of level of entry mathematics and English and also on grades earned in those classes, as they relate to persistence after 1 year. The research showed that by far, the variable…

  18. Software for Automated Image-to-Image Co-registration

    NASA Technical Reports Server (NTRS)

    Benkelman, Cody A.; Hughes, Heidi

    2007-01-01

    The project objectives are: a) develop software to fine-tune image-to-image co-registration, presuming images are orthorectified prior to input; b) create a reusable software development kit (SDK) to enable incorporation of these tools into other software; c) provide automated testing for quantitative analysis; and d) develop software that applies multiple techniques to achieve subpixel precision in the co-registration of image pairs.

  19. NASA/BLM Applications Pilot Test (APT), phase 2. Volume 3: Technology transfer

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Techniques used and materials presented at a planning session and two workshops held to provide hands-on training in the integration of quantitatively based remote sensing data are described as well as methods used to enhance understanding of approaches to inventories that integrate multiple data sources given various resource information objectives. Significant results from each of the technology transfer sessions are examined.

  20. KSC01pp0794

    NASA Image and Video Library

    2001-04-12

    KENNEDY SPACE CENTER, FLA. -- After arrival at Astrotech, Titusville, Fla., the GOES-M (Geostationary Operational Environmental Satellite) is attached to an overhead crane. The GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking and meteorological research. The satellite will undergo testing at Astrotech before its scheduled launch July 12 on an Atlas-IIA booster with a Centaur upper stage from Cape Canaveral Air Force Station.

  1. KSC01pp0801

    NASA Image and Video Library

    2001-04-12

    With the GOES-M satellite tilted on a workstand at Astrotech, Titusville, Fla., workers check out a part of the underside. The GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking and meteorological research. The satellite is undergoing testing at Astrotech before its scheduled launch July 12 on an Atlas-IIA booster with a Centaur upper stage from Cape Canaveral Air Force Station.

  2. Practical no-gold-standard evaluation framework for quantitative imaging methods: application to lesion segmentation in positron emission tomography

    PubMed Central

    Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.

    2017-01-01

    Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
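The patient-resampling idea the abstract describes can be illustrated with a generic percentile-bootstrap sketch: resample patient-level values with replacement, recompute the figure of merit each time, and read off percentile confidence bounds. This is an assumption-laden illustration, not the authors' implementation, and the example data are invented.

```python
import random

def bootstrap_ci(values, fom, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a figure of merit.

    values: one entry per patient (the resampling unit)
    fom:    function mapping a list of values to a scalar figure of merit
    """
    rng = random.Random(seed)
    n = len(values)
    stats = sorted(
        fom([values[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    )
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Example: spread of a mean segmentation-volume error (hypothetical data)
mean = lambda xs: sum(xs) / len(xs)
print(bootstrap_ci([2.1, 1.8, 2.5, 2.2, 1.9, 2.4, 2.0, 2.3], mean))
```

Resampling whole patients (rather than individual measurements) is what lets the interval reflect patient-sampling uncertainty, the issue the abstract highlights.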

  3. DISQOVER the Landcover - R based tools for quantitative vegetation reconstruction

    NASA Astrophysics Data System (ADS)

    Theuerkauf, Martin; Couwenberg, John; Kuparinen, Anna; Liebscher, Volkmar

    2016-04-01

    Quantitative methods have gained increasing attention in the field of vegetation reconstruction over the past decade. The DISQOVER package implements key tools in the R programming environment for statistical computing. This implementation has three main goals: 1) provide a user-friendly, transparent, and open implementation of the methods; 2) provide full flexibility in all parameters (including the underlying pollen dispersal model); and 3) provide a sandbox for testing the sensitivity of the methods. We illustrate the possibilities of the package with tests of the REVEALS model and of the extended downscaling approach (EDA). REVEALS (Sugita 2007) is designed to translate pollen data from large lakes into regional vegetation composition. We applied REVEALSinR on pollen data from Lake Tiefer See (NE Germany) and validated the results with historic landcover data. The results clearly show that REVEALS is sensitive to the underlying pollen dispersal model; REVEALS performs best when applied with the state-of-the-art Lagrangian stochastic dispersal model. REVEALS applications with the conventional Gauss model can produce realistic results, but only if unrealistic pollen productivity estimates are used. The EDA (Theuerkauf et al. 2014) employs pollen data from many sites across a landscape to explore whether species distributions in the past were related to known stable patterns in the landscape, e.g. the distribution of soil types. The approach had so far only been implemented in simple settings with few taxa. Tests with EDAinR show that it produces sharp results in complex settings with many taxa as well. The DISQOVER package is open source software, available from disqover.uni-greifswald.de. This website can be used as a platform to discuss and improve quantitative methods in vegetation reconstruction. To introduce the tool we plan a short course in autumn of this year.
This study is a contribution to the Virtual Institute of Integrated Climate and Landscape Evolution (ICLEA; www.iclea.de) of the Helmholtz Association (Grant Number VH-VI-415) and is supported by Helmholtz infrastructure of the Terrestrial Environmental Observatory (TERENO) North-eastern Germany.

  4. Teleradiology Via The Naval Remote Medical Diagnosis System (RMDS)

    NASA Astrophysics Data System (ADS)

    Rasmussen, Will; Stevens, Ilya; Gerber, F. H.; Kuhlman, Jayne A.

    1982-01-01

    Testing was conducted to obtain qualitative and quantitative (statistical) data on radiology performance using the Remote Medical Diagnosis System (RMDS) Advanced Development Models (ADMs). Based upon data collected during testing with professional radiologists, this analysis addresses the clinical utility of radiographic images transferred through six possible RMDS transmission modes. These radiographs were also viewed under closed-circuit television (CCTV) and lightbox conditions to provide a basis for comparison. The analysis indicates that the RMDS ADM terminals (with a system video resolution of 525 x 256 x 6) would provide satisfactory radiographic images for radiology consultations in emergency cases with gross pathological disorders. However, in cases involving more subtle findings, a system video resolution of 525 x 512 x 8 would be preferable.

  5. Progress in Quantitative Viral Load Testing: Variability and Impact of the WHO Quantitative International Standards

    PubMed Central

    Sun, Y.; Tang, L.; Procop, G. W.; Hillyard, D. R.; Young, S. A.; Caliendo, A. M.

    2016-01-01

    It has been hoped that the recent availability of WHO quantitative standards would improve interlaboratory agreement for viral load testing; however, insufficient data are available to evaluate whether this has been the case. Results from 554 laboratories participating in proficiency testing surveys for quantitative PCR assays of cytomegalovirus (CMV), Epstein-Barr virus (EBV), BK virus (BKV), adenovirus (ADV), and human herpesvirus 6 (HHV6) were evaluated to determine overall result variability and then were stratified by assay manufacturer. The impact of calibration to international units/ml (CMV and EBV) on variability was also determined. Viral loads showed a high degree of interlaboratory variability for all tested viruses, with interquartile ranges as high as 1.46 log10 copies/ml and the overall range for a given sample up to 5.66 log10 copies/ml. Some improvement in result variability was seen when international units were adopted. This was particularly the case for EBV viral load results. Variability in viral load results remains a challenge across all viruses tested here; introduction of international quantitative standards may help reduce variability and does so more or less markedly for certain viruses. PMID:27852673
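The variability figures quoted above (interquartile range and overall range in log10 copies/ml) can be computed as in this sketch; the per-laboratory viral-load values are invented for illustration, not survey data.

```python
import math

def log10_spread(copies_per_ml):
    """Return (IQR, overall range) of viral loads in log10 copies/ml.

    Quartiles use linear interpolation between sorted values.
    """
    logs = sorted(math.log10(v) for v in copies_per_ml)
    n = len(logs)

    def quantile(q):
        pos = q * (n - 1)
        lo = int(pos)
        frac = pos - lo
        hi = min(lo + 1, n - 1)
        return logs[lo] + frac * (logs[hi] - logs[lo])

    iqr = quantile(0.75) - quantile(0.25)
    overall = logs[-1] - logs[0]
    return iqr, overall

# Hypothetical results reported by different laboratories for one CMV sample
labs = [3.2e3, 8.1e3, 2.4e4, 5.0e4, 1.3e5, 6.6e5, 2.0e6]
iqr, rng = log10_spread(labs)
print(round(iqr, 2), round(rng, 2))
```

Working in log10 units is standard for viral loads because assay results span several orders of magnitude; a range of 5.66 log10 copies/ml means the highest and lowest reported values differ by a factor of almost half a million.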

  6. Timed function tests, motor function measure, and quantitative thigh muscle MRI in ambulant children with Duchenne muscular dystrophy: A cross-sectional analysis.

    PubMed

    Schmidt, Simone; Hafner, Patricia; Klein, Andrea; Rubino-Nacht, Daniela; Gocheva, Vanya; Schroeder, Jonas; Naduvilekoot Devasia, Arjith; Zuesli, Stephanie; Bernert, Guenther; Laugel, Vincent; Bloetzer, Clemens; Steinlin, Maja; Capone, Andrea; Gloor, Monika; Tobler, Patrick; Haas, Tanja; Bieri, Oliver; Zumbrunn, Thomas; Fischer, Dirk; Bonati, Ulrike

    2018-01-01

    The development of new therapeutic agents for the treatment of Duchenne muscular dystrophy has put a focus on defining outcome measures most sensitive to capture treatment effects. This cross-sectional analysis investigates the relation between validated clinical assessments such as the 6-minute walk test, motor function measure and quantitative muscle MRI of thigh muscles in ambulant Duchenne muscular dystrophy patients, aged 6.5 to 10.8 years (mean 8.2, SD 1.1). Quantitative muscle MRI included the mean fat fraction using a 2-point Dixon technique, and transverse relaxation time (T2) measurements. All clinical assessments were highly significantly inter-correlated with p < 0.001. The strongest correlation with the motor function measure and its D1-subscore was shown by the 6-minute walk test. Clinical assessments showed no correlation with age. Importantly, quantitative muscle MRI values significantly correlated with all clinical assessments with the extensors showing the strongest correlation. In contrast to the clinical assessments, quantitative muscle MRI values were highly significantly correlated with age. In conclusion, the motor function measure and timed function tests measure disease severity in a highly comparable fashion and all tests correlated with quantitative muscle MRI values quantifying fatty muscle degeneration. Copyright © 2017 Elsevier B.V. All rights reserved.
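As a rough illustration of the 2-point Dixon quantity the abstract refers to: water and fat signals can be estimated from in-phase (IP = W + F) and opposed-phase (OP = W - F) magnitudes, and the fat fraction is F / (W + F). The voxel values below are hypothetical, not study data, and real pipelines add corrections (e.g. for T2* decay) that are omitted here.

```python
def dixon_fat_fraction(in_phase, opposed_phase):
    """Two-point Dixon fat fraction from in-phase/opposed-phase magnitudes.

    water = (IP + OP) / 2, fat = (IP - OP) / 2, FF = fat / (water + fat).
    """
    water = (in_phase + opposed_phase) / 2.0
    fat = (in_phase - opposed_phase) / 2.0
    return fat / (water + fat)

# Hypothetical voxel: IP = 100, OP = 60  ->  W = 80, F = 20
print(dixon_fat_fraction(100.0, 60.0))
```

Averaging this voxel-wise fraction over a thigh-muscle region of interest yields the kind of mean fat fraction the study correlates with the timed function tests.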

  7. Exhaust-System Leak Test : Quantitative Procedure

    DOT National Transportation Integrated Search

    1974-01-01

    A quantitative, periodic motor vehicle safety-inspection test for determining the leakage rate of engine exhaust from an automotive exhaust system was investigated. Two technical approaches were evaluated, and the better one was selected for developm...

  8. Quantitative imaging biomarker ontology (QIBO) for knowledge representation of biomedical imaging biomarkers.

    PubMed

    Buckler, Andrew J; Liu, Tiffany Ting; Savig, Erica; Suzek, Baris E; Ouellette, M; Danagoulian, J; Wernsing, G; Rubin, Daniel L; Paik, David

    2013-08-01

    A widening array of novel imaging biomarkers is being developed using ever more powerful clinical and preclinical imaging modalities. These biomarkers have demonstrated effectiveness in quantifying biological processes as they occur in vivo and in the early prediction of therapeutic outcomes. However, quantitative imaging biomarker data and knowledge are not standardized, representing a critical barrier to accumulating medical knowledge based on quantitative imaging data. We use an ontology to represent, integrate, and harmonize heterogeneous knowledge across the domain of imaging biomarkers. This advances the goal of developing applications to (1) improve precision and recall of storage and retrieval of quantitative imaging-related data using standardized terminology; (2) streamline the discovery and development of novel imaging biomarkers by normalizing knowledge across heterogeneous resources; (3) effectively annotate imaging experiments thus aiding comprehension, re-use, and reproducibility; and (4) provide validation frameworks through rigorous specification as a basis for testable hypotheses and compliance tests. We have developed the Quantitative Imaging Biomarker Ontology (QIBO), which currently consists of 488 terms spanning the following upper classes: experimental subject, biological intervention, imaging agent, imaging instrument, image post-processing algorithm, biological target, indicated biology, and biomarker application. We have demonstrated that QIBO can be used to annotate imaging experiments with standardized terms in the ontology and to generate hypotheses for novel imaging biomarker-disease associations. Our results established the utility of QIBO in enabling integrated analysis of quantitative imaging data.

  9. Testing a Preliminary Live with Love Conceptual Framework for cancer couple dyads: A mixed-methods study.

    PubMed

    Li, Qiuping; Xu, Yinghua; Zhou, Huiya; Loke, Alice Yuen

    2015-12-01

    The purpose of this study was to test the previously proposed Preliminary Live with Love Conceptual Framework (P-LLCF) that focuses on spousal caregiver-patient couples in their journey of coping with cancer as dyads. A mixed-methods study that included qualitative and quantitative approaches was conducted. Methods of concept and theory analysis, and structural equation modeling (SEM), were applied in testing the P-LLCF. In the qualitative approach to testing the concepts included in the P-LLCF, a comparison was made between the P-LLCF and a preliminary conceptual framework derived from focus group interviews among Chinese couples coping with cancer. The comparison showed that the concepts identified in the P-LLCF are relevant to the phenomenon under scrutiny, and attributes of the concepts are consistent with those identified among Chinese cancer couple dyads. In the quantitative study, 117 cancer couples were recruited. The findings showed that inter-relationships exist among the components included in the P-LLCF: event situation, dyadic mediators, dyadic appraisal, dyadic coping, and dyadic outcomes, in that the event situation impacts the dyadic outcomes directly or indirectly through the dyadic mediators. The dyadic mediators, dyadic appraisal, and dyadic coping are interrelated and work together to benefit the dyadic outcomes. This study provides evidence that supports the interlinked components and the relationships included in the P-LLCF. The findings of this study are important in that they provide healthcare professionals with guidance and directions, according to the P-LLCF, on how to plan supportive programs for couples coping with cancer. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Integrating toxicogenomics into human health risk assessment: lessons learned from the benzo[a]pyrene case study.

    PubMed

    Chepelev, Nikolai L; Moffat, Ivy D; Labib, Sarah; Bourdon-Lacombe, Julie; Kuo, Byron; Buick, Julie K; Lemieux, France; Malik, Amal I; Halappanavar, Sabina; Williams, Andrew; Yauk, Carole L

    2015-01-01

    The use of short-term toxicogenomic tests to predict cancer (or other health effects) offers considerable advantages relative to traditional toxicity testing methods. The advantages include increased throughput, increased mechanistic data, and significantly reduced costs. However, precisely how toxicogenomics data can be used to support human health risk assessment (RA) is unclear. In a companion paper (Moffat et al. 2014), we present a case study evaluating the utility of toxicogenomics in the RA of benzo[a]pyrene (BaP), a known human carcinogen. The case study is meant as a proof-of-principle exercise using a well-established mode of action (MOA) that impacts multiple tissues, which should provide a best case example. We found that toxicogenomics provided rich mechanistic data applicable to hazard identification, dose-response analysis, and quantitative RA of BaP. Based on this work, here we share some useful lessons for both research and RA, and outline our perspective on how toxicogenomics can benefit RA in the short- and long-term. Specifically, we focus on (1) obtaining biologically relevant data that are readily suitable for establishing an MOA for toxicants, (2) examining the human relevance of an MOA from animal testing, and (3) proposing appropriate quantitative values for RA. We describe our envisioned strategy on how toxicogenomics can become a tool in RA, especially when anchored to other short-term toxicity tests (apical endpoints) to increase confidence in the proposed MOA, and emphasize the need for additional studies on other MOAs to define the best practices in the application of toxicogenomics in RA.

  11. Justify Your Answer: The Role of Written Think Aloud in Script Concordance Testing.

    PubMed

    Power, Alyssa; Lemay, Jean-Francois; Cooke, Suzette

    2017-01-01

    Construct: Clinical reasoning assessment is a growing area of interest in the medical education literature. Script concordance testing (SCT) evaluates clinical reasoning in conditions of uncertainty and has emerged as an innovative tool in the domain of clinical reasoning assessment. SCT quantifies the degree of concordance between a learner and an experienced clinician and attempts to capture the breadth of responses of expert clinicians, acknowledging the significant yet acceptable variation in practice under situations of uncertainty. SCT has been shown to be a valid and reliable clinical reasoning assessment tool. However, as SCT provides only quantitative information, it may not provide a complete assessment of clinical reasoning. Think aloud (TA) is a qualitative research tool used in clinical reasoning assessment in which learners verbalize their thought process around an assigned task. This study explores the use of TA, in the form of written reflection, in SCT to assess resident clinical reasoning, hypothesizing that the information obtained from the written TA would enrich the quantitative data obtained through SCT. Ninety-one pediatric postgraduate trainees and 21 pediatricians from 4 Canadian training centers completed an online test consisting of 24 SCT cases immediately followed by retrospective written TA. Six of 24 cases were selected to gather TA data. These cases were chosen to allow all phases of clinical decision making (diagnosis, investigation, and treatment) to be represented in the TA data. Inductive thematic analysis was employed when systematically reviewing TA responses. Three main benefits of adding written TA to SCT were identified: (a) uncovering instances of incorrect clinical reasoning despite a correct SCT response, (b) revealing sound clinical reasoning in the context of a suboptimal SCT response, and (c) detecting question misinterpretation. Written TA can optimize SCT by demonstrating when correct examinee responses are based on guessing or uncertainty rather than robust clinical rationale. TA can also enhance SCT by allowing examinees to provide justification for responses that otherwise would have been considered incorrect and by identifying questions that are frequently misinterpreted to avoid including them in future examinations. TA also has significant value in differentiating between acceptable variations in expert clinician responses and deviance associated with faulty rationale or question misinterpretation; this could improve SCT reliability. A written TA protocol appears to be a valuable tool to assess trainees' clinical reasoning and can strengthen the quantitative assessment provided by SCT.

  12. Personal Narratives of Genetic Testing: Expectations, Emotions, and Impact on Self and Family.

    PubMed

    Anderson, Emily E; Wasson, Katherine

    2015-01-01

    The stories in this volume shed light on the potential of narrative inquiry to fill gaps in knowledge, particularly given the mixed results of quantitative research on patient views of and experiences with genetic and genomic testing. Published studies investigate predictors of testing (particularly risk perceptions and worry); psychological and behavioral responses to testing; and potential impact on the health care system (e.g., when patients bring DTC genetic test results to their primary care provider). Interestingly, these themes did not dominate the narratives published in this issue. Rather, these narratives included consistent themes of expectations and looking for answers; complex emotions; areas of contradiction and conflict; and family impact. More narrative research on patient experiences with genetic testing may fill gaps in knowledge regarding how patients define the benefits of testing, changes in psychological and emotional reactions to test results over time, and the impact of testing on families.

  13. Decoupled form and function in disparate herbivorous dinosaur clades

    NASA Astrophysics Data System (ADS)

    Lautenschlager, Stephan; Brassey, Charlotte A.; Button, David J.; Barrett, Paul M.

    2016-05-01

    Convergent evolution, the acquisition of morphologically similar traits in unrelated taxa due to similar functional demands or environmental factors, is a common phenomenon in the animal kingdom. Consequently, the occurrence of similar form is used routinely to address fundamental questions in morphofunctional research and to infer function in fossils. However, such qualitative assessments can be misleading and it is essential to test form/function relationships quantitatively. The parallel occurrence of a suite of morphologically convergent craniodental characteristics in three herbivorous, phylogenetically disparate dinosaur clades (Sauropodomorpha, Ornithischia, Theropoda) provides an ideal test case. A combination of computational biomechanical models (Finite Element Analysis, Multibody Dynamics Analysis) demonstrates that despite a high degree of morphological similarity between representative taxa (Plateosaurus engelhardti, Stegosaurus stenops, Erlikosaurus andrewsi) from these clades, their biomechanical behaviours are notably different and difficult to predict on the basis of form alone. These functional differences likely reflect dietary specialisations, demonstrating the value of quantitative biomechanical approaches when evaluating form/function relationships in extinct taxa.

  14. Decoupled form and function in disparate herbivorous dinosaur clades.

    PubMed

    Lautenschlager, Stephan; Brassey, Charlotte A; Button, David J; Barrett, Paul M

    2016-05-20

    Convergent evolution, the acquisition of morphologically similar traits in unrelated taxa due to similar functional demands or environmental factors, is a common phenomenon in the animal kingdom. Consequently, the occurrence of similar form is used routinely to address fundamental questions in morphofunctional research and to infer function in fossils. However, such qualitative assessments can be misleading and it is essential to test form/function relationships quantitatively. The parallel occurrence of a suite of morphologically convergent craniodental characteristics in three herbivorous, phylogenetically disparate dinosaur clades (Sauropodomorpha, Ornithischia, Theropoda) provides an ideal test case. A combination of computational biomechanical models (Finite Element Analysis, Multibody Dynamics Analysis) demonstrates that despite a high degree of morphological similarity between representative taxa (Plateosaurus engelhardti, Stegosaurus stenops, Erlikosaurus andrewsi) from these clades, their biomechanical behaviours are notably different and difficult to predict on the basis of form alone. These functional differences likely reflect dietary specialisations, demonstrating the value of quantitative biomechanical approaches when evaluating form/function relationships in extinct taxa.

  15. Species identification of Cannabis sativa using real-time quantitative PCR (qPCR).

    PubMed

    Johnson, Christopher E; Premasuthan, Amritha; Satkoski Trask, Jessica; Kanthaswamy, Sree

    2013-03-01

    Most narcotics-related cases in the United States involve Cannabis sativa. Material is typically identified based on the cystolithic hairs on the leaves and with chemical tests to identify the presence of cannabinoids. Suspect seeds are germinated into a viable plant so that morphological and chemical tests can be conducted. Seed germination, however, causes undue analytical delays. DNA analyses that involve the chloroplast and nuclear genomes have been developed for identification of C. sativa materials, but they require several nanograms of template DNA. Using the trnL 3' exon-trnF intergenic spacer region within the C. sativa chloroplast, we have developed a real-time quantitative PCR assay that is capable of identifying picogram amounts of chloroplast DNA for species determination of suspected C. sativa material. This assay provides forensic science laboratories with a quick and reliable method to identify an unknown sample as C. sativa. © 2013 American Academy of Forensic Sciences.

  16. How to Measure Physical Motion and the Impact of Individualized Feedback in the Field of Rehabilitation of Geriatric Trauma Patients.

    PubMed

    Altenbuchner, Amelie; Haug, Sonja; Kretschmer, Rainer; Weber, Karsten

    2018-01-01

    This preparatory study lays the groundwork for implementing individualized monitoring and feedback of physical motion, using conventional motion trackers, in the rehabilitation of geriatric trauma patients. Regaining mobility is accompanied by improved quality of life in persons of very advanced age recovering from fragility fractures. A quantitative survey of regained physical mobility yields recommendations on how to use motion trackers effectively in a clinical geriatric setting. The study combines quantitative and qualitative, interdisciplinary and mutually complementary research approaches (sociology, health research, philosophy/ethics, medical informatics, nursing science, gerontology, and physical therapy). While motion tracker use is validated in geriatric traumatology, preliminary data are used to develop target-group-oriented motion feedback. In addition, the measurement accuracy of a questionnaire on the quality of life of multimorbid geriatric patients (FLQM) is tested. Implementing a new technology in a complex clinical setting must be based on a strong theoretical background, but will not succeed without careful field testing.

  17. High speed digital holographic interferometry for hypersonic flow visualization

    NASA Astrophysics Data System (ADS)

    Hegde, G. M.; Jagdeesh, G.; Reddy, K. P. J.

    2013-06-01

    Optical imaging techniques have played a major role in understanding the dynamics of a wide variety of fluid flows, particularly in the study of hypersonic flows. Schlieren and shadowgraph techniques have been the flow diagnostic tools for the investigation of compressible flows for more than a century. However, these techniques provide only qualitative information about the flow field. Other optical techniques, such as holographic interferometry and laser-induced fluorescence (LIF), have been used extensively for extracting quantitative information about high-speed flows. In this paper we present the application of the digital holographic interferometry (DHI) technique, integrated with a short-duration hypersonic shock tunnel facility having a 1 ms test time, for quantitative flow visualization. The dynamics of the flow fields at hypersonic/supersonic speeds around different test models is visualized with DHI using a high-speed digital camera (0.2 million fps). These visualization results are compared with schlieren visualization and CFD simulation results. Fringe analysis is carried out to estimate the density of the flow field.
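    Fringe analysis of the kind described above typically converts a fringe count to a density change via the Gladstone-Dale relation n - 1 = K·ρ: a shift of N fringes over optical path length L at wavelength λ corresponds to Δρ = N·λ/(K·L). The sketch below is illustrative, not the authors' analysis code, and the numerical values are hypothetical.

```python
# Illustrative Gladstone-Dale fringe-to-density conversion.
# Assumes a two-dimensional flow with uniform density along the path.

def density_change(n_fringes, wavelength_m, gladstone_dale_m3_per_kg, path_m):
    """Density change (kg/m^3) implied by a fringe shift of n_fringes."""
    return n_fringes * wavelength_m / (gladstone_dale_m3_per_kg * path_m)

# Hypothetical example: 10 fringes at 532 nm, K for air ~ 2.26e-4 m^3/kg,
# 0.1 m optical path through the test section.
print(round(density_change(10, 532e-9, 2.26e-4, 0.1), 4))  # 0.2354
```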

  18. Surface temperature/heat transfer measurement using a quantitative phosphor thermography system

    NASA Technical Reports Server (NTRS)

    Buck, G. M.

    1991-01-01

    A relative-intensity phosphor thermography technique developed for surface heating studies in hypersonic wind tunnels is described. A direct relationship between relative emission intensity and phosphor temperature is used for quantitative surface temperature measurements in time. The technique provides global surface temperature-time histories using a 3-CCD (Charge Coupled Device) video camera and digital recording system. A current history of technique development at Langley is discussed. Latest developments include a phosphor mixture for a greater range of temperature sensitivity and use of castable ceramics for inexpensive test models. A method of calculating surface heat-transfer from thermal image data in blowdown wind tunnels is included in an appendix, with an analysis of material thermal heat-transfer properties. Results from tests in the Langley 31-Inch Mach 10 Tunnel are presented for a ceramic orbiter configuration and a four-inch diameter hemisphere model. Data include windward heating for bow-shock/wing-shock interactions on the orbiter wing surface, and a comparison with prediction for hemisphere heating distribution.

  19. QSAR models for predicting octanol/water and organic carbon/water partition coefficients of polychlorinated biphenyls.

    PubMed

    Yu, S; Gao, S; Gan, Y; Zhang, Y; Ruan, X; Wang, Y; Yang, L; Shi, J

    2016-04-01

    Quantitative structure-property relationship modelling can be a valuable alternative method to replace or reduce experimental testing. In particular, some endpoints such as the octanol-water (KOW) and organic carbon-water (KOC) partition coefficients of polychlorinated biphenyls (PCBs) are easier to predict, and various models have already been developed. In this paper, two different methods, multiple linear regression based on descriptors generated using Dragon software and hologram quantitative structure-activity relationships, were employed to predict the suspended particulate matter (SPM)-derived log KOC and the generator column, shake flask and slow stirring method-derived log KOW values of 209 PCBs. The predictive ability of the derived models was validated using a test set. The performances of all these models were compared with the EPI Suite™ software. The results indicated that the proposed models were robust and satisfactory, and could provide feasible and promising tools for the rapid assessment of the SPM-derived log KOC and the generator column, shake flask and slow stirring method-derived log KOW values of PCBs.
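    A minimal sketch of the multiple-linear-regression step described above: ordinary least squares maps molecular descriptors to log KOW, and fit quality is summarized with R². The descriptor matrix and response values below are hypothetical placeholders, not the Dragon descriptors or PCB data used in the study.

```python
# Illustrative QSPR-style multiple linear regression (hypothetical data).
import numpy as np

# Rows: compounds; columns: two hypothetical molecular descriptors.
X = np.array([[1, 0.5], [2, 1.1], [3, 0.9], [4, 1.6], [5, 2.0]], dtype=float)
y = np.array([4.6, 5.3, 5.9, 6.8, 7.4])  # hypothetical log KOW values

A = np.column_stack([np.ones(len(X)), X])     # prepend intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # ordinary least squares fit
pred = A @ coef

# Coefficient of determination on the training data.
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(round(r2, 3))
```

In practice the paper validates on a held-out test set; the same `A @ coef` prediction step applies to new descriptor rows.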

  20. [Development and Effects of a Cognitive-behavioral Therapy Based Program in Reducing Internalized Stigma in Patients with Schizophrenia].

    PubMed

    Kim, Mi Young; Jun, Seong Sook

    2016-06-01

    This study was done to develop an internalized stigma-reducing program based on cognitive-behavioral therapy, appropriate for patients with schizophrenia, and to evaluate its effectiveness. The study used a mixed-methods design. For the qualitative study, 13 patients with schizophrenia who had experience in overcoming stigma were purposively chosen for interviews, and data were analyzed using the Giorgi method. For the quantitative study, 64 patients with schizophrenia (experimental group=32, control group=32) were recruited. The cognitive-behavioral therapy-based program for reducing internalized stigma in patients with schizophrenia was provided for 8 weeks (12 sessions). Data were collected from June 20, 2013 to February 14, 2014. Quantitative data were analyzed using the χ²-test, t-test, and repeated measures ANOVA with the SPSS program. From the qualitative results on the experience of coping with stigma in patients with schizophrenia, seventeen themes and five theme clusters were drawn up. The quantitative results showed that internalized stigma, self-esteem, mental health recovery, and quality of life were significantly better in the experimental group compared to the control group. Study findings indicate that this program for reducing internalized stigma in patients with schizophrenia is effective and can be recommended as a rehabilitation program intervention to help patients with schizophrenia cope with internalized stigma.

  1. Improving quantitative skills in introductory geoscience courses at a four-year public institution using online math modules

    NASA Astrophysics Data System (ADS)

    Gordon, E. S.

    2011-12-01

    Fitchburg State University has a diverse student population composed largely of students traditionally underrepresented in higher education, including first-generation and low-income students and students with disabilities. Approximately half of our incoming students require developmental math coursework, but often enroll in science classes prior to completing those courses. Since our introductory geoscience courses (Oceanography, Meteorology, Geology, Earth Systems Science) do not have prerequisites, many students who take them lack basic math skills, but are taking these courses alongside science majors. In order to provide supplemental math instruction without sacrificing time for content, "The Math You Need, When You Need It" (TMYN), a set of online math tutorials placed in a geoscience context, will be implemented in three of our introductory courses (Oceanography, Meteorology, and Earth Systems Science) during Fall 2011. Students will complete 5-6 modules asynchronously, the topics of which include graphing skills, calculating rates, unit conversions, and rearranging equations. Improvement in quantitative skills will be tracked with students' pre- and post-test results, as well as individual module quiz scores. In addition, student assessment results from Oceanography will be compared to student data from Academic Year 2010-11, during which quantitative skills were evaluated with pre- and post-test questions but students did not receive online supplemental instruction.
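    One common way to summarize the pre-/post-test comparison described above is the normalized gain, g = (post - pre)/(100 - pre). This metric is an editorial assumption for illustration only (the abstract does not specify the comparison statistic), and the scores below are hypothetical percentages.

```python
# Normalized (Hake-style) gain for pre/post assessment scores (0-100 scale).
# Hypothetical data; not from the study described above.

def normalized_gain(pre, post):
    """Fraction of the available improvement actually achieved."""
    if pre >= 100:           # already at ceiling; gain is undefined
        return 0.0
    return (post - pre) / (100.0 - pre)

pre_scores  = [40, 55, 62, 70]
post_scores = [70, 73, 81, 88]
gains = [round(normalized_gain(p, q), 2) for p, q in zip(pre_scores, post_scores)]
print(gains)  # [0.5, 0.4, 0.5, 0.6]
```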

  2. Comparative Application of PLS and PCR Methods to Simultaneous Quantitative Estimation and Simultaneous Dissolution Test of Zidovudine - Lamivudine Tablets.

    PubMed

    Üstündağ, Özgür; Dinç, Erdal; Özdemir, Nurten; Tilkan, M Günseli

    2015-01-01

    In the development of new drug products and generic drug products, the simultaneous in-vitro dissolution behavior of oral dosage formulations is the most important indicator for the quantitative estimation of the efficiency and biopharmaceutical characteristics of drug substances. This compels scientists in the field to develop powerful analytical methods that yield more reliable, precise, and accurate results in the quantitative analysis and dissolution testing of drug formulations. In this context, two chemometric tools, partial least squares (PLS) and principal component regression (PCR), were developed for the simultaneous quantitative estimation and dissolution testing of zidovudine (ZID) and lamivudine (LAM) in a tablet dosage form. The results obtained in this study strongly encourage us to use these methods for the quality control, routine analysis, and dissolution testing of marketed tablets containing the drugs ZID and LAM.

  3. Qualitative versus Quantitative Results: An Experimental Introduction to Data Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Eric R.; Alter, Paula

    1989-01-01

    Described is an experiment in which the student can ascertain the meaning of a negative result from a qualitative test by performing a more sensitive quantitative test on the same sample. Methodology for testing urinary glucose with a spectrophotometer at 630 nm and with commercial assaying glucose strips is presented. (MVL)

  4. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
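    The lesion-frequency calculation underlying this family of gel methods can be sketched as follows: for randomly distributed lesions converted to strand breaks, the average lesion frequency is the difference of the reciprocal number-average lengths (Ln) of treated and control DNA populations. This is a sketch of the standard calculation, not the authors' own code, and the Ln values are hypothetical.

```python
# Lesion frequency from number-average DNA lengths (hypothetical values).

def lesions_per_mb(ln_treated_kb, ln_control_kb):
    """Average lesions per Mb: 1/Ln(treated) - 1/Ln(control), converted kb -> Mb."""
    return (1.0 / ln_treated_kb - 1.0 / ln_control_kb) * 1000.0

# Hypothetical gel result: treated DNA averages 200 kb, control 1000 kb.
print(lesions_per_mb(200.0, 1000.0))  # 4.0
```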

  5. Analytical control test plan and microbiological methods for the water recovery test

    NASA Technical Reports Server (NTRS)

    Traweek, M. S. (Editor); Tatara, J. D. (Editor)

    1994-01-01

    Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that the handling of laboratory samples and the analytical operations employed are performed with a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). It addresses the process which will be used to verify analytical data generated throughout the test period; identifies the responsibilities of key personnel and participating laboratories and the chains of communication to be followed; and ensures that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.

  6. qFlow Cytometry-Based Receptoromic Screening: A High-Throughput Quantification Approach Informing Biomarker Selection and Nanosensor Development.

    PubMed

    Chen, Si; Weddell, Jared; Gupta, Pavan; Conard, Grace; Parkin, James; Imoukhuede, Princess I

    2017-01-01

    Nanosensor-based detection of biomarkers can improve medical diagnosis; however, a critical factor in nanosensor development is deciding which biomarker to target, as most diseases present several biomarkers. Biomarker-targeting decisions can be informed via an understanding of biomarker expression. Currently, immunohistochemistry (IHC) is the accepted standard for profiling biomarker expression. While IHC provides a relative mapping of biomarker expression, it does not provide cell-by-cell readouts of biomarker expression or absolute biomarker quantification. Flow cytometry overcomes both these IHC challenges by offering biomarker expression on a cell-by-cell basis, and when combined with calibration standards, providing quantitation of biomarker concentrations: this is known as qFlow cytometry. Here, we outline the key components for applying qFlow cytometry to detect biomarkers within the angiogenic vascular endothelial growth factor receptor family. The key aspects of the qFlow cytometry methodology include: antibody specificity testing, immunofluorescent cell labeling, saturation analysis, fluorescent microsphere calibration, and quantitative analysis of both ensemble and cell-by-cell data. Together, these methods enable high-throughput quantification of biomarker expression.

  7. A Standard Platform for Testing and Comparison of MDAO Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Moore, Kenneth T.; Hearn, Tristan A.; Naylor, Bret A.

    2012-01-01

    The Multidisciplinary Design Analysis and Optimization (MDAO) community has developed a multitude of algorithms and techniques, called architectures, for performing optimizations on complex engineering systems which involve coupling between multiple discipline analyses. These architectures seek to efficiently handle optimizations with computationally expensive analyses including multiple disciplines. We propose a new testing procedure that can provide a quantitative and qualitative means of comparison among architectures. The proposed test procedure is implemented within the open source framework, OpenMDAO, and comparative results are presented for five well-known architectures: MDF, IDF, CO, BLISS, and BLISS-2000. We also demonstrate how using open source software development methods can allow the MDAO community to submit new problems and architectures to keep the test suite relevant.

  8. Quantification of dopamine transporters in the mouse brain using ultra-high resolution single-photon emission tomography.

    PubMed

    Acton, Paul D; Choi, Seok-Rye; Plössl, Karl; Kung, Hank F

    2002-05-01

    Functional imaging of small animals, such as mice and rats, using ultra-high resolution positron emission tomography (PET) and single-photon emission tomography (SPET), is becoming a valuable tool for studying animal models of human disease. While several studies have shown the utility of PET imaging in small animals, few have used SPET in real research applications. In this study we aimed to demonstrate the feasibility of using ultra-high resolution SPET in quantitative studies of dopamine transporters (DAT) in the mouse brain. Four healthy ICR male mice were injected with (mean+/-SD) 704+/-154 MBq [(99m)Tc]TRODAT-1, and scanned using an ultra-high resolution SPET system equipped with pinhole collimators (spatial resolution 0.83 mm at 3 cm radius of rotation). Each mouse had two studies, to provide an indication of test-retest reliability. Reference tissue kinetic modeling analysis of the time-activity data in the striatum and cerebellum was used to quantitate the availability of DAT. A simple equilibrium ratio of striatum to cerebellum provided another measure of DAT binding. The SPET imaging results were compared against ex vivo biodistribution data from the striatum and cerebellum. The mean distribution volume ratio (DVR) from the reference tissue kinetic model was 2.17+/-0.34, with a test-retest reliability of 2.63%+/-1.67%. The ratio technique gave similar results (DVR=2.03+/-0.38, test-retest reliability=6.64%+/-3.86%), and the ex vivo analysis gave DVR=2.32+/-0.20. Correlations between the kinetic model and the ratio technique (R(2)=0.86, P<0.001) and the ex vivo data (R(2)=0.92, P=0.04) were both excellent. This study demonstrated clearly that ultra-high resolution SPET of small animals is capable of accurate, repeatable, and quantitative measures of DAT binding, and should open up the possibility of further studies of cerebral binding sites in mice using pinhole SPET.

  9. Strength of SiCf-SiCm composite tube under uniaxial and multiaxial loading

    NASA Astrophysics Data System (ADS)

    Shapovalov, Kirill; Jacobsen, George M.; Alva, Luis; Truesdale, Nathaniel; Deck, Christian P.; Huang, Xinyu

    2018-03-01

    The authors report the mechanical strength of nuclear-grade silicon carbide fiber-reinforced silicon carbide matrix composite (SiCf-SiCm) tubing under several different stress states. The composite tubing was fabricated via a Chemical Vapor Infiltration (CVI) process, and is being evaluated for accident-tolerant nuclear fuel cladding. Several experimental techniques were applied, including uniaxial tension, elastomer-insert burst testing, open- and closed-end hydraulic bladder burst testing, and torsion testing. These tests provided critical stress and strain values at the proportional limit and at ultimate failure. Full-field strain measurements using digital image correlation (DIC) were obtained in order to acquire quantitative information on localized deformation during the application of stress. Based on the test results, a failure map was constructed for the SiCf-SiCm composites.
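    Burst pressures from open- and closed-end tests of the kind described above are commonly reduced to stresses with thin-wall formulas: for internal pressure P, mean radius r, and wall thickness t, the hoop stress is P·r/t and the closed-end axial stress is P·r/(2t). This reduction is an illustrative assumption (the authors' data reduction may differ), and the tube dimensions below are hypothetical.

```python
# Thin-wall stress estimates for a pressurized composite tube
# (hypothetical dimensions; valid only when t << r).

def hoop_stress(p_mpa, r_mm, t_mm):
    """Hoop (circumferential) stress in MPa."""
    return p_mpa * r_mm / t_mm

def closed_end_axial_stress(p_mpa, r_mm, t_mm):
    """Axial stress in MPa for a closed-end (biaxial) burst test."""
    return p_mpa * r_mm / (2.0 * t_mm)

# Hypothetical cladding-like tube: 40 MPa pressure, 4.75 mm mean radius, 0.75 mm wall.
print(round(hoop_stress(40.0, 4.75, 0.75), 1),
      round(closed_end_axial_stress(40.0, 4.75, 0.75), 1))  # 253.3 126.7
```

The 2:1 hoop-to-axial ratio is why open-end (pure hoop) and closed-end tests probe different points on the failure map.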

  10. Detection test of wireless network signal strength and GPS positioning signal in underground pipeline

    NASA Astrophysics Data System (ADS)

    Li, Li; Zhang, Yunwei; Chen, Ling

    2018-03-01

    In order to address the problem of selecting a positioning technology for an inspection robot in an underground pipeline environment, wireless network signal strength and GPS positioning signals were tested in an actual underground pipeline environment. First, the strength variation of the 3G and Wi-Fi wireless signals provided by China Telecom and China Unicom ground base stations was tested, and the attenuation of these wireless signals along the pipeline was analyzed and described quantitatively. Then, the reception of the GPS satellite signal in the pipeline was tested, and the attenuation of the GPS satellite signal in the underground pipeline was analyzed. The test results may serve as a reference for related research that needs to consider positioning in pipelines.

  11. Hybrid statistical testing for nuclear material accounting data and/or process monitoring data in nuclear safeguards

    DOE PAGES

    Burr, Tom; Hamada, Michael S.; Ticknor, Larry; ...

    2015-01-01

    The aim of nuclear safeguards is to ensure that special nuclear material is used for peaceful purposes. Historically, nuclear material accounting (NMA) has provided the quantitative basis for monitoring for nuclear material loss or diversion, and process monitoring (PM) data is collected by the operator to monitor the process. PM data typically support NMA in various ways, often by providing a basis to estimate some of the in-process nuclear material inventory. We develop options for combining PM residuals and NMA residuals (residual = measurement - prediction), using a hybrid of period-driven and data-driven hypothesis testing. The modified statistical tests can be used on time series of NMA residuals (the NMA residual is the familiar material balance), or on a combination of PM and NMA residuals. The PM residuals can be generated on a fixed time schedule or as events occur.
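    As an illustration of hypothesis testing on a material-balance residual series (residual = measurement - prediction), a one-sided CUSUM is a standard sequential test for detecting a sustained loss; the paper's hybrid period- and data-driven tests are more elaborate. The slack and threshold parameters and all residual values below are hypothetical.

```python
# One-sided CUSUM on a residual time series (hypothetical data).

def cusum_alarm(residuals, k=0.5, h=4.0):
    """Return the first index where the CUSUM statistic exceeds h, else None.
    k is the slack (in residual units), h the decision threshold."""
    s = 0.0
    for i, r in enumerate(residuals):
        s = max(0.0, s + r - k)   # accumulate only excesses above the slack
        if s > h:
            return i
    return None

quiet = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2]      # no loss: never alarms
loss  = [0.1, 0.0, 1.5, 1.8, 2.0, 1.7, 1.9]   # sustained loss begins at index 2
print(cusum_alarm(quiet), cusum_alarm(loss))  # None 5
```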

  12. Development of a non-contextual model for determining the autonomy level of intelligent unmanned systems

    NASA Astrophysics Data System (ADS)

    Durst, Phillip J.; Gray, Wendell; Trentini, Michael

    2013-05-01

A simple, quantitative measure for encapsulating the autonomous capabilities of unmanned systems (UMS) has yet to be established. Current models for measuring a UMS's autonomy level require extensive, operational-level testing, and provide a means for assessing the autonomy level for a specific mission/task and operational environment. A more elegant technique for quantifying autonomy using component-level testing of the robot platform alone, outside of mission and environment contexts, is desirable. Using a high-level framework for UMS architectures, such a model for determining a level of autonomy has been developed. The model uses a combination of developmental and component-level testing for each aspect of the UMS architecture to define a non-contextual autonomous potential (NCAP). The NCAP provides an autonomy level, ranging from fully non-autonomous to fully autonomous, in the form of a single numeric parameter describing the UMS's performance capabilities when operating at that level of autonomy.

  13. Control of Flexible Structures (COFS) Flight Experiment Background and Description

    NASA Technical Reports Server (NTRS)

    Hanks, B. R.

    1985-01-01

    A fundamental problem in designing and delivering large space structures to orbit is to provide sufficient structural stiffness and static configuration precision to meet performance requirements. These requirements are directly related to control requirements and the degree of control system sophistication available to supplement the as-built structure. Background and rationale are presented for a research study in structures, structural dynamics, and controls using a relatively large, flexible beam as a focus. This experiment would address fundamental problems applicable to large, flexible space structures in general and would involve a combination of ground tests, flight behavior prediction, and instrumented orbital tests. Intended to be multidisciplinary but basic within each discipline, the experiment should provide improved understanding and confidence in making design trades between structural conservatism and control system sophistication for meeting static shape and dynamic response/stability requirements. Quantitative results should be obtained for use in improving the validity of ground tests for verifying flight performance analyses.

  14. The predictive value of quantitative fibronectin testing in combination with cervical length measurement in symptomatic women.

    PubMed

    Bruijn, Merel M C; Kamphuis, Esme I; Hoesli, Irene M; Martinez de Tejada, Begoña; Loccufier, Anne R; Kühnert, Maritta; Helmer, Hanns; Franz, Marie; Porath, Martina M; Oudijk, Martijn A; Jacquemyn, Yves; Schulzke, Sven M; Vetter, Grit; Hoste, Griet; Vis, Jolande Y; Kok, Marjolein; Mol, Ben W J; van Baaren, Gert-Jan

    2016-12-01

    The combination of the qualitative fetal fibronectin test and cervical length measurement has a high negative predictive value for preterm birth within 7 days; however, positive prediction is poor. A new bedside quantitative fetal fibronectin test showed potential additional value over the conventional qualitative test, but there is limited evidence on the combination with cervical length measurement. The purpose of this study was to compare quantitative fetal fibronectin and qualitative fetal fibronectin testing in the prediction of spontaneous preterm birth within 7 days in symptomatic women who undergo cervical length measurement. We performed a European multicenter cohort study in 10 perinatal centers in 5 countries. Women between 24 and 34 weeks of gestation with signs of active labor and intact membranes underwent quantitative fibronectin testing and cervical length measurement. We assessed the risk of preterm birth within 7 days in predefined strata based on fibronectin concentration and cervical length. Of 455 women who were included in the study, 48 women (11%) delivered within 7 days. A combination of cervical length and qualitative fibronectin resulted in the identification of 246 women who were at low risk: 164 women with a cervix between 15 and 30 mm and a negative fibronectin test (<50 ng/mL; preterm birth rate, 2%) and 82 women with a cervix at >30 mm (preterm birth rate, 2%). Use of quantitative fibronectin alone resulted in a predicted risk of preterm birth within 7 days that ranged from 2% in the group with the lowest fibronectin level (<10 ng/mL) to 38% in the group with the highest fibronectin level (>500 ng/mL), with similar accuracy as that of the combination of cervical length and qualitative fibronectin. 
Combining cervical length and quantitative fibronectin resulted in the identification of an additional 19 women at low risk (preterm birth rate, 5%), using a threshold of 10 ng/mL in women with a cervix at <15 mm, and 6 women at high risk (preterm birth rate, 33%) using a threshold of >500 ng/mL in women with a cervix at >30 mm. In women with threatened preterm birth, quantitative fibronectin testing alone performs as well as the combination of cervical length and qualitative fibronectin. The combination of quantitative fibronectin testing and cervical length may increase this predictive capacity. Cost-effectiveness analysis and the availability of these tests in a local setting should determine the final choice. Copyright © 2016 Elsevier Inc. All rights reserved.
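As a worked illustration of the stratification this abstract describes, the following sketch encodes the reported thresholds (cervical length 15 and 30 mm; fibronectin 10, 50, and 500 ng/mL); the group labels and the exact handling of boundary values are our assumptions, not the authors' protocol:

```python
def preterm_risk_group(cervical_length_mm, ffn_ng_ml):
    """Illustrative risk stratification combining cervical length (CL)
    and quantitative fetal fibronectin (fFN), using the thresholds
    reported in the study. Labels ("low", "elevated", "high") are ours."""
    if cervical_length_mm > 30:
        # CL > 30 mm is low risk unless fFN is very high (>500 ng/mL)
        return "low" if ffn_ng_ml <= 500 else "high"
    if cervical_length_mm >= 15:
        # CL 15-30 mm: a negative qualitative test (<50 ng/mL) is low risk
        return "low" if ffn_ng_ml < 50 else "elevated"
    # CL < 15 mm: only a very low quantitative fFN (<10 ng/mL)
    # reclassifies the woman as low risk
    return "low" if ffn_ng_ml < 10 else "high"

assert preterm_risk_group(35, 20) == "low"    # long cervix, negative fFN
assert preterm_risk_group(12, 5) == "low"     # short cervix rescued by low fFN
assert preterm_risk_group(35, 600) == "high"  # long cervix, very high fFN
```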

  15. A Novel Method for Quantifying Helmeted Field of View of a Spacesuit - And What It Means for Constellation

    NASA Technical Reports Server (NTRS)

    McFarland, Shane M.

    2010-01-01

    Field of view has always been a design feature paramount to helmet design, and in particular spacesuit design, where the helmet must provide an adequate field of view for a large range of activities, environments, and body positions. Historically, suited field of view has been evaluated either qualitatively in parallel with design or quantitatively using various test methods and protocols. As such, oftentimes legacy suit field of view information is either ambiguous for lack of supporting data or contradictory to other field of view tests performed with different subjects and test methods. This paper serves to document a new field of view testing method that is more reliable and repeatable than its predecessors. It borrows heavily from standard ophthalmologic field of vision tests such as the Goldmann kinetic perimetry test, but is designed specifically for evaluating field of view of a spacesuit helmet. In this test, four suits utilizing three different helmet designs were tested for field of view. Not only do these tests provide more reliable field of view data for legacy and prototype helmet designs, they also provide insight into how helmet design impacts field of view and what this means for the Constellation Project spacesuit helmet, which must meet stringent field of view requirements that are more generous to the crewmember than legacy designs.

  16. Quantitative Tester And Reconditioner For Hand And Arm

    NASA Technical Reports Server (NTRS)

    Engle, Gary; Bond, Malcolm; Naumann, Theodore

    1993-01-01

    Apparatus measures torques, forces, and motions of hand, wrist, forearm, elbow, and shoulder and aids in reconditioning muscles involved. Used to determine strengths and endurances of muscles, ranges of motion of joints, and reaction times. Provides quantitative data used to assess extent to which disuse, disease, or injury causes deterioration of muscles and of motor-coordination skills. Same apparatus serves as exercise machine to restore muscle performance by imposing electronically controlled, gradually increasing loads on muscles. Suitable for training and evaluating astronauts, field testing for workers' compensation claims, and physical therapy in hospitals. With aid of various attachments, system adapted to measure such special motions as pinching, rotation of wrist, and supination and pronation of the forearm. Attachments are gloves, wristlets, and sleeves.

  17. Nonequilibrium fluctuations in metaphase spindles: polarized light microscopy, image registration, and correlation functions

    NASA Astrophysics Data System (ADS)

    Brugués, Jan; Needleman, Daniel J.

    2010-02-01

    Metaphase spindles are highly dynamic, nonequilibrium, steady-state structures. We study the internal fluctuations of spindles by computing spatio-temporal correlation functions of movies obtained from quantitative polarized light microscopy. These correlation functions are only physically meaningful if corrections are made for the net motion of the spindle. We describe our image registration algorithm in detail and we explore its robustness. Finally, we discuss the expression used for the estimation of the correlation function in terms of the nematic order of the microtubules which make up the spindle. Ultimately, studying the form of these correlation functions will provide a quantitative test of the validity of coarse-grained models of spindle structure inspired from liquid crystal physics.
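The paper's estimator is expressed in terms of the nematic order of the microtubules; as a generic stand-in, a minimal temporal autocorrelation of intensity fluctuations in a registered movie (assumed here to be a NumPy array of shape (T, H, W)) can be sketched as:

```python
import numpy as np

def temporal_autocorrelation(movie):
    """Normalized temporal autocorrelation C(tau) of pixel-intensity
    fluctuations, averaged over the pixels of a registered movie of
    shape (T, H, W). A generic estimator, not the paper's
    nematic-order expression."""
    t = movie.shape[0]
    fluct = movie - movie.mean(axis=0)   # remove the per-pixel mean
    var = (fluct ** 2).mean()
    return np.array([
        (fluct[: t - tau] * fluct[tau:]).mean() / var
        for tau in range(t)
    ])

# Synthetic registered movie of white noise: C(0) = 1 by construction,
# and C(tau) for tau > 0 should fluctuate around zero.
rng = np.random.default_rng(1)
movie = rng.normal(size=(40, 8, 8))
c = temporal_autocorrelation(movie)
assert abs(c[0] - 1.0) < 1e-9
```

Registration matters here exactly as the abstract says: without removing the net motion of the spindle, the apparent fluctuations would be dominated by translation rather than internal dynamics.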

  18. Reduced-cost Chlamydia trachomatis-specific multiplex real-time PCR diagnostic assay evaluated for ocular swabs and use by trachoma research programmes.

    PubMed

    Butcher, Robert; Houghton, Jo; Derrick, Tamsyn; Ramadhani, Athumani; Herrera, Beatriz; Last, Anna R; Massae, Patrick A; Burton, Matthew J; Holland, Martin J; Roberts, Chrissy H

    2017-08-01

Trachoma, caused by the intracellular bacterium Chlamydia trachomatis (Ct), is the leading infectious cause of preventable blindness. Many commercial platforms are available that provide highly sensitive and specific detection of Ct DNA. However, the majority of these commercial platforms are inaccessible for population-level surveys in resource-limited settings typical of trachoma control programmes. We developed two low-cost quantitative PCR (qPCR) tests for Ct using readily available reagents on standard real-time thermocyclers. Each multiplex qPCR test targets one genomic and one plasmid Ct target in addition to an endogenous positive control for Homo sapiens DNA. The quantitative performance of the qPCR assays in clinical samples was determined by comparison to a previously evaluated droplet digital PCR (ddPCR) test. The diagnostic performance of the qPCR assays was evaluated against a commercial assay (artus C. trachomatis Plus RG PCR, Qiagen) using molecular diagnostics quality control standards and clinical samples. We examined the yield of Ct DNA prepared from five different DNA extraction kits and a cold chain-free dry-sample preservation method using swabs spiked with fixed concentrations of human and Ct DNA. The qPCR assay was highly reproducible (Ct plasmid and genomic targets mean total coefficients of variation 41.5% and 48.3%, respectively). The assay detected 8/8 core specimens upon testing of a quality control panel and performed well in comparison to the commercially marketed comparator test (sensitivity and specificity >90%). Optimal extraction and sample preservation methods for research applications were identified. We describe a pipeline from collection to diagnosis providing the most efficient sample preservation and extraction, with significant per-test cost savings over a commercial qPCR diagnostic assay. 
The assay and its evaluation should give control programmes wishing to conduct independent research within the context of trachoma control access to an affordable test with defined performance characteristics. Copyright © 2017. Published by Elsevier B.V.

  19. Influence of aging on thermal and vibratory thresholds of quantitative sensory testing.

    PubMed

    Lin, Yea-Huey; Hsieh, Song-Chou; Chao, Chi-Chao; Chang, Yang-Chyuan; Hsieh, Sung-Tsang

    2005-09-01

    Quantitative sensory testing has become a common approach to evaluate thermal and vibratory thresholds in various types of neuropathies. To understand the effect of aging on sensory perception, we measured warm, cold, and vibratory thresholds by performing quantitative sensory testing on a population of 484 normal subjects (175 males and 309 females), aged 48.61 +/- 14.10 (range 20-86) years. Sensory thresholds of the hand and foot were measured with two algorithms: the method of limits (Limits) and the method of level (Level). Thresholds measured by Limits are reaction-time-dependent, while those measured by Level are independent of reaction time. In addition, we explored (1) the correlations of thresholds between these two algorithms, (2) the effect of age on differences in thresholds between algorithms, and (3) differences in sensory thresholds between the two test sites. Age was consistently and significantly correlated with sensory thresholds of all tested modalities measured by both algorithms on multivariate regression analysis compared with other factors, including gender, body height, body weight, and body mass index. When thresholds were plotted against age, slopes differed between sensory thresholds of the hand and those of the foot: for the foot, slopes were steeper compared with those for the hand for each sensory modality. Sensory thresholds of both test sites measured by Level were highly correlated with those measured by Limits, and thresholds measured by Limits were higher than those measured by Level. Differences in sensory thresholds between the two algorithms were also correlated with age: thresholds of the foot were higher than those of the hand for each sensory modality. This difference in thresholds (measured with both Level and Limits) between the hand and foot was also correlated with age. 
These findings suggest that, compared with gender and anthropometric parameters, age is the most significant determinant of sensory thresholds, and they provide a foundation for investigating the neurobiologic significance of aging in the processing of sensory stimuli.

  20. Non-animal approaches for toxicokinetics in risk evaluations of food chemicals.

    PubMed

    Punt, Ans; Peijnenburg, Ad A C M; Hoogenboom, Ron L A P; Bouwmeester, Hans

    2017-01-01

    The objective of the present work was to review the availability and predictive value of non-animal toxicokinetic approaches and to evaluate their current use in European risk evaluations of food contaminants, additives and food contact materials, as well as pesticides and medicines. Results revealed little use of quantitative animal or human kinetic data in risk evaluations of food chemicals, compared with pesticides and medicines. Risk evaluations of medicines provided sufficient in vivo kinetic data from different species to evaluate the predictive value of animal kinetic data for humans. These data showed a relatively poor correlation between the in vivo bioavailability in rats and dogs versus that in humans. In contrast, in vitro (human) kinetic data have been demonstrated to provide adequate predictions of the fate of compounds in humans, using appropriate in vitro-in vivo scalers and by integration of in vitro kinetic data with in silico kinetic modelling. Even though in vitro kinetic data were found to be occasionally included within risk evaluations of food chemicals, particularly results from Caco-2 absorption experiments and in vitro data on gut-microbial conversions, only minor use of in vitro methods for metabolism and quantitative in vitro-in vivo extrapolation methods was identified. Yet, such quantitative predictions are essential in the development of alternatives to animal testing as well as to increase human relevance of toxicological risk evaluations. Future research should aim at further improving and validating quantitative alternative methods for kinetics, thereby increasing regulatory acceptance of non-animal kinetic data.

  1. 78 FR 52166 - Quantitative Messaging Research

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-22

    ... COMMODITY FUTURES TRADING COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures... survey will follow qualitative message testing research (for which CFTC received fast-track OMB approval... message testing research (for which CFTC received fast-track OMB approval) and is necessary to identify...

  2. Teaching quantitative biology: goals, assessments, and resources

    PubMed Central

    Aikens, Melissa L.; Dolan, Erin L.

    2014-01-01

    More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425

  3. Endpoint titration and immunotherapy.

    PubMed

    King, H C

    1985-11-01

    Inhalant allergy, or "atopy" as it is now termed, is the best understood form of allergy today. In some circles, it is the only recognized form of allergy. While an overall picture of its effects on the body and a reasonable approach to its treatment now exist, many problems remain to be solved and much improvement in its treatment will probably occur within the next several years. Many new approaches to treatment of aeroallergens are now available; however, all are compared with the skin test, which is and has been the baseline for testing and treatment. Endpoint titration provides a quantitative means for undertaking treatment of aeroallergen sensitivity. In no other way does it differ from the forms of skin testing that have been widely used for generations. The practitioners of endpoint titration feel that this difference is highly significant in simplifying, validating, and shortening the necessary period of therapy. While the concept of endpoint titration is not difficult, it is by definition a quantitative form of testing and requires a degree of expertise in performing it correctly. While a good understanding of the method may be gained from the literature, adequate hands-on experience should be obtained by any physician prior to instituting the technique as a treatment modality. Once mastered, it becomes a reliable baseline for all forms of inhalant allergy care.

  4. Bench-top validation testing of selected immunological and molecular Renibacterium salmoninarum diagnostic assays by comparison with quantitative bacteriological culture

    USGS Publications Warehouse

    Elliott, D.G.; Applegate, L.J.; Murray, A.L.; Purcell, M.K.; McKibben, C.L.

    2013-01-01

    No gold standard assay exhibiting error-free classification of results has been identified for detection of Renibacterium salmoninarum, the causative agent of salmonid bacterial kidney disease. Validation of diagnostic assays for R. salmoninarum has been hindered by its unique characteristics and biology, and difficulties in locating suitable populations of reference test animals. Infection status of fish in test populations is often unknown, and it is commonly assumed that the assay yielding the most positive results has the highest diagnostic accuracy, without consideration of misclassification of results. In this research, quantification of R. salmoninarum in samples by bacteriological culture provided a standardized measure of viable bacteria to evaluate analytical performance characteristics (sensitivity, specificity and repeatability) of non-culture assays in three matrices (phosphate-buffered saline, ovarian fluid and kidney tissue). Non-culture assays included polyclonal enzyme-linked immunosorbent assay (ELISA), direct smear fluorescent antibody technique (FAT), membrane-filtration FAT, nested polymerase chain reaction (nested PCR) and three real-time quantitative PCR assays. Injection challenge of specific pathogen-free Chinook salmon, Oncorhynchus tshawytscha (Walbaum), with R. salmoninarum was used to estimate diagnostic sensitivity and specificity. Results did not identify a single assay demonstrating the highest analytical and diagnostic performance characteristics, but revealed strengths and weaknesses of each test.

  5. Assessing the cost of implementing the 2011 Society of Obstetricians and Gynecologists of Canada and Canadian College of Medical Genetics practice guidelines on the detection of fetal aneuploidies.

    PubMed

    Lilley, Margaret; Hume, Stacey; Karpoff, Nina; Maire, Georges; Taylor, Sherry; Tomaszewski, Robert; Yoshimoto, Maisa; Christian, Susan

    2017-09-01

The Society of Obstetricians and Gynecologists of Canada and the Canadian College of Medical Genetics published guidelines, in 2011, recommending replacement of karyotype with quantitative fluorescent polymerase chain reaction when prenatal testing is performed because of an increased risk of a common aneuploidy. This study's objective is to perform a cost analysis following the implementation of quantitative fluorescent polymerase chain reaction as a stand-alone test. A total of 658 samples were received between 1 April 2014 and 31 August 2015: 576 amniocentesis samples and 82 chorionic villus samples. A chromosome abnormality was identified in 14% (93/658) of the prenatal samples tested. The implementation of the 2011 Society of Obstetricians and Gynecologists of Canada and the Canadian College of Medical Genetics guidelines in Edmonton and Northern Alberta resulted in a cost savings of $46 295.80. The replacement of karyotype with chromosomal microarray for some indications would be associated with additional costs. The implementation of new test methods may provide cost savings or added costs. Cost analysis is important to consider during the implementation of new guidelines or technologies. © 2017 John Wiley & Sons, Ltd.

  6. Dual Nozzle Aerodynamic and Cooling Analysis Study.

    DTIC Science & Technology

    1981-02-27

program and to the aerodynamic model computer program. This procedure was used to define two secondary nozzle contours for the baseline configuration...both the dual-throat and dual-expander concepts. Advanced analytical techniques were utilized to provide quantitative estimates of the bleed flow...preliminary heat transfer analysis of both concepts, and (5) engineering analysis of data from the NASA/MSFC hot-fire testing of a dual-throat

  7. KSC01pp0796

    NASA Image and Video Library

    2001-04-12

At Astrotech, Titusville, Fla., the GOES-M (Geostationary Operational Environmental Satellite) satellite is tilted on a workstand so that workers can remove part of the protective cover. GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking, and meteorological research. The satellite will undergo testing at Astrotech before its scheduled launch July 12 on an Atlas-IIA booster with a Centaur upper stage from Cape Canaveral Air Force Station.

  8. Exploration of Use of Copulas in Analysing the Relationship between Precipitation and Meteorological Drought in Beijing, China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, Linlin; Wang, Hongrui; Wang, Cheng

Drought risk analysis is essential for regional water resource management. In this study, the probabilistic relationship between precipitation and meteorological drought in Beijing, China, was calculated under three different precipitation conditions (precipitation equal to, greater than, or less than a threshold) based on copulas. The Standardized Precipitation Evapotranspiration Index (SPEI) was calculated based on monthly total precipitation and monthly mean temperature data. The trends and variations in the SPEI were analysed using Hilbert-Huang Transform (HHT) and Mann-Kendall (MK) trend tests with a running approach. The results of the HHT and MK test indicated a significant decreasing trend in the SPEI. The copula-based conditional probability indicated that the probability of meteorological drought decreased as monthly precipitation increased and that 10 mm can be regarded as the threshold for triggering extreme drought. From a quantitative perspective, when R ≤ 10 mm, the probabilities of moderate drought, severe drought, and extreme drought were 22.1%, 18%, and 13.6%, respectively. This conditional probability distribution not only revealed the occurrence of meteorological drought in Beijing but also provided a quantitative way to analyse the probability of drought under different precipitation conditions. Furthermore, the results provide a useful reference for future drought prediction.

  9. Something to talk about: Topics of conversation between romantic partners during military deployments.

    PubMed

    Carter, Sarah P; Osborne, Laura J; Renshaw, Keith D; Allen, Elizabeth S; Loew, Benjamin A; Markman, Howard J; Stanley, Scott M

    2018-02-01

    Long-distance communication has been frequently identified as essential to military couples trying to maintain their relationship during a deployment. Little quantitative research, however, has assessed the types of topics discussed during such communication and how those topics relate to overall relationship satisfaction. The current study draws on a sample of 56 Army couples who provided data through online surveys while the service member was actively deployed. These couples provided information on current marital satisfaction, topics discussed during deployment (problem talk, friendship talk, love talk), and how they communicated via synchronous media (e.g., phone calls, video calls) and letters during deployment. Nonparametric Friedman tests followed by paired t tests revealed that synchronous communication was primarily utilized for friendship talk, whereas letters included friendship talk and love talk in similar amounts. Both synchronous communication and letters included less problem talk than other topics. In mixed-level modeling, only topics of communication for synchronous media (not for letters) were related to relationship satisfaction. Love talk via synchronous media was related to higher relationship satisfaction, whereas problem talk via synchronous media was related to less relationship satisfaction. The current study offers the first quantitative assessment of topics within deployment communication media and associations with relationship satisfaction. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. Exploration of Use of Copulas in Analysing the Relationship between Precipitation and Meteorological Drought in Beijing, China

    DOE PAGES

    Fan, Linlin; Wang, Hongrui; Wang, Cheng; ...

    2017-05-16

Drought risk analysis is essential for regional water resource management. In this study, the probabilistic relationship between precipitation and meteorological drought in Beijing, China, was calculated under three different precipitation conditions (precipitation equal to, greater than, or less than a threshold) based on copulas. The Standardized Precipitation Evapotranspiration Index (SPEI) was calculated based on monthly total precipitation and monthly mean temperature data. The trends and variations in the SPEI were analysed using Hilbert-Huang Transform (HHT) and Mann-Kendall (MK) trend tests with a running approach. The results of the HHT and MK test indicated a significant decreasing trend in the SPEI. The copula-based conditional probability indicated that the probability of meteorological drought decreased as monthly precipitation increased and that 10 mm can be regarded as the threshold for triggering extreme drought. From a quantitative perspective, when R ≤ 10 mm, the probabilities of moderate drought, severe drought, and extreme drought were 22.1%, 18%, and 13.6%, respectively. This conditional probability distribution not only revealed the occurrence of meteorological drought in Beijing but also provided a quantitative way to analyse the probability of drought under different precipitation conditions. Furthermore, the results provide a useful reference for future drought prediction.
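The copula family and marginals in the study are fitted to the Beijing data; the conditional-probability calculation itself can be sketched generically. A minimal example, assuming a Gaussian copula with a given correlation rho and known marginal CDF values at the two thresholds (all illustrative inputs, not the paper's fitted values):

```python
from scipy.stats import norm, multivariate_normal

def p_drought_given_low_precip(u_precip, u_drought, rho):
    """P(SPEI below a drought threshold | precipitation below its
    threshold) under a Gaussian copula with correlation rho.

    u_precip  = F_R(r): marginal CDF value at the precipitation threshold
    u_drought = F_S(s): marginal CDF value at the SPEI drought threshold
    Uses P(V <= v | U <= u) = C(u, v) / u with the Gaussian copula
    C(u, v) = Phi2(Phi^-1(u), Phi^-1(v); rho)."""
    z = [norm.ppf(u_precip), norm.ppf(u_drought)]
    joint = multivariate_normal(mean=[0.0, 0.0],
                                cov=[[1.0, rho], [rho, 1.0]]).cdf(z)
    return joint / u_precip

# With positive dependence, conditioning on low precipitation raises the
# drought probability above its unconditional level u_drought = 0.2.
p = p_drought_given_low_precip(u_precip=0.3, u_drought=0.2, rho=0.6)
assert 0.2 < p < 0.67
```

The bounds in the assertion follow from the Fréchet-Hoeffding inequalities: for positive rho, C(u, v) lies strictly between u·v and min(u, v).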

  11. Portable, one-step, and rapid GMR biosensor platform with smartphone interface.

    PubMed

    Choi, Joohong; Gani, Adi Wijaya; Bechstein, Daniel J B; Lee, Jung-Rok; Utz, Paul J; Wang, Shan X

    2016-11-15

Quantitative immunoassay tests in clinical laboratories require trained technicians, take hours to complete with multiple steps, and the instruments used are generally immobile; patient samples have to be sent in to the labs for analysis. This prevents quantitative immunoassay tests from being performed outside laboratory settings. A portable, quantitative immunoassay device would be valuable in rural and resource-limited areas, where access to healthcare is scarce or far away. We have invented the Eigen Diagnosis Platform (EDP), a portable quantitative immunoassay platform based on Giant Magnetoresistance (GMR) biosensor technology. The platform does not require a trained technician to operate, and requires only one-step user involvement. It displays quantitative results in less than 15 min after sample insertion, and each test costs less than US$4. The GMR biosensor employed in EDP is capable of detecting multiple biomarkers in one test, enabling a wide array of immune diagnostics to be performed simultaneously. In this paper, we describe the design of EDP and demonstrate its capability. Multiplexed assay of human immunoglobulin G and M (IgG and IgM) antibodies with EDP achieves sensitivities down to 0.07 and 0.33 nanomolar, respectively. The platform will allow lab testing to be performed in remote areas, and open up applications of immunoassay testing in other non-clinical settings, such as home, school, and office. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. In vivo study of the effectiveness of quantitative percussion diagnostics as an indicator of the level of the structural pathology of teeth.

    PubMed

    Sheets, Cherilyn G; Wu, Jean C; Rashad, Samer; Phelan, Michael; Earthman, James C

    2016-08-01

    Conventional dental diagnostic aids based upon imagery and patient symptoms are at best only partially effective for the detection of fine structural defects such as cracks in teeth. The purpose of this clinical study was to determine whether quantitative percussion diagnostics (QPD) provided knowledge of the structural instability of teeth before restorative work begins. QPD is a mechanics-based methodology that tests the structural integrity of teeth noninvasively. Eight human participants with 60 sites needing restoration were enrolled in an institutional review board-approved clinical study. Comprehensive examinations were performed in each human participant, including QPD testing. Each site was disassembled and microscopically video documented, and the results were recorded on a defect assessment sheet. Each restored site was then tested using QPD. The normal fit error (NFE), which corresponds to the localized defect severity, was correlated with any pretreatment structural pathology. QPD agreed with clinical disassembly in 55 of 60 comparisons (92% agreement). Moreover, the method achieved 98% specificity and 100% sensitivity for detecting structural pathologies found later upon clinical disassembly. Overall, the NFE was found to be highly predictive of advanced structural pathology. The data from the present in vivo study support the hypothesis that QPD can provide the clinician with advance knowledge of the structural instability of teeth before restorative work begins. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  13. Development and validation of InnoQuant™, a sensitive human DNA quantitation and degradation assessment method for forensic samples using high copy number mobile elements Alu and SVA.

    PubMed

    Pineda, Gina M; Montgomery, Anne H; Thompson, Robyn; Indest, Brooke; Carroll, Marion; Sinha, Sudhir K

    2014-11-01

    There is a constant need in forensic casework laboratories for an improved way to increase the first-pass success rate of forensic samples. Recent advances in mini-STR analysis, SNP, and Alu marker systems have made it possible to analyze highly compromised samples, yet few tools are available that can simultaneously provide an assessment of quantity, inhibition, and degradation in a sample prior to genotyping. Currently there are several different approaches used for fluorescence-based quantification assays, which provide a measure of quantity and inhibition. However, a system that can also assess the extent of degradation in a forensic sample would be a useful tool for DNA analysts. Possessing this information prior to genotyping allows an analyst to make more informed downstream decisions for the successful typing of a forensic sample without unnecessarily consuming DNA extract. Real-time PCR provides a reliable method for determining the amount and quality of amplifiable DNA in a biological sample. Alu elements are Short Interspersed Elements (SINEs), approximately 300 bp insertions distributed throughout the human genome in large copy number. The use of an internal primer to amplify a segment of an Alu element allows for human specificity as well as high sensitivity when compared to a single-copy target. The advantage of an Alu system is the presence of a large number (>1000) of fixed insertions in every human genome, which minimizes the individual-specific variation possible when using a multi-copy target quantification system. This study utilizes two independent retrotransposon genomic targets to obtain quantification of an 80 bp "short" DNA fragment and a 207 bp "long" DNA fragment in a degraded DNA sample in the multiplex system InnoQuant™. The ratio of the two quantitation values provides a "Degradation Index", or a qualitative measure of a sample's extent of degradation. The Degradation Index was found to be predictive of the observed loss of STR markers and alleles as degradation increases. Use of a synthetic target as an internal positive control (IPC) provides an additional assessment for the presence of PCR inhibitors in the test sample. In conclusion, a DNA-based qualitative/quantitative/inhibition assessment system that accurately predicts the status of a biological sample will be a valuable tool for deciding which DNA test kit to utilize and how much target DNA to use when processing compromised forensic samples for DNA testing. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
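    The Degradation Index described above is simply the ratio of the short-target to the long-target quantitation value. A minimal sketch (function and variable names are ours, not the kit's):

```python
def degradation_index(short_qty, long_qty):
    """Ratio of the 80 bp short-target concentration to the 207 bp
    long-target concentration (both in ng/uL), as described for
    InnoQuant. Intact DNA amplifies both targets about equally
    (index ~1); degraded DNA loses the longer template first, so the
    index grows with the extent of degradation."""
    if long_qty <= 0:
        raise ValueError("long-target quantity must be positive")
    return short_qty / long_qty

print(degradation_index(2.0, 0.25))  # 8.0 -> substantially degraded
print(degradation_index(1.0, 1.0))   # 1.0 -> essentially intact
```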

  14. Quantitative comparison of the pivot shift test results before and after anterior cruciate ligament reconstruction by using the three-dimensional electromagnetic measurement system.

    PubMed

    Nagai, Kanto; Hoshino, Yuichi; Nishizawa, Yuichiro; Araki, Daisuke; Matsushita, Takehiko; Matsumoto, Tomoyuki; Takayama, Koji; Nagamune, Kouki; Kurosaka, Masahiro; Kuroda, Ryosuke

    2015-10-01

    Tibial acceleration during the pivot shift test is a potential quantitative parameter to evaluate rotational laxity in anterior cruciate ligament (ACL) insufficiency. However, clinical application of this measurement has not been fully examined. This study aimed to measure and compare tibial acceleration before and after ACL reconstruction (ACLR) in ACL-injured patients. We hypothesized that tibial acceleration would be reduced by ACLR and would be consistent in the same knee at different time points. Seventy ACL-injured patients who underwent ACLR were enrolled. Tibial acceleration during the pivot shift test was measured using an electromagnetic measurement system before ACLR and at second-look arthroscopy 1 year post-operatively. Tibial acceleration was compared to clinical grading and between ACL-injured/ACL-reconstructed and contralateral knees. Pre-operative tibial acceleration increased stepwise with increasing clinical grade (P < 0.01). Tibial acceleration in the ACL-injured knee (1.9 ± 1.2 m/s²) was larger than that in the contralateral knee (0.8 ± 0.3 m/s², P < 0.01), and was reduced to 0.9 ± 0.3 m/s² post-operatively (P < 0.01). There was no difference between the ACL-reconstructed and contralateral knee (n.s.). Tibial acceleration in contralateral knees was consistent pre- and post-operatively (n.s.). Tibial acceleration measurement demonstrated increased rotational laxity in ACL-injured knees and its reduction by ACLR. Additionally, consistent measurements were obtained in ACL-intact knees at different time points. Therefore, tibial acceleration during the pivot shift test could provide quantitative evaluation of rotational stability before and after ACL reconstruction. Level of evidence: III.

  15. QDIRT: Quantitative Direct and Indirect Testing of Sudomotor Function

    PubMed Central

    Gibbons, Christopher H.; Illigens, Ben MW; Centi, Justin; Freeman, Roy

    2011-01-01

    Objective To develop a novel assessment of sudomotor function. Background Post-ganglionic sudomotor function is currently evaluated using quantitative sudomotor axon reflex testing (QSART) or silicone impressions. We hypothesize that high-resolution digital photography has advanced sufficiently to allow quantitative direct and indirect testing of sudomotor function (QDIRT) with spatial and temporal resolution comparable to these techniques. Methods Sweating was stimulated on both forearms of 10 human subjects by iontophoresis of 10% acetylcholine. Silicone impressions were made, and topical indicator dyes were digitally photographed every 15 seconds for 7 minutes after iontophoresis. Sweat droplets were quantified by size, location, and percent surface area. Each test was repeated 8 times in each subject on alternating arms over 2 months. Another 10 subjects had silicone impressions, QDIRT, and QSART performed on the dorsum of the right foot. Results The percent area of sweat photographically imaged correlated with silicone impressions at 5 minutes on the forearm (r = 0.92, p < 0.01) and dorsal foot (r = 0.85, p < 0.01). The number of sweat droplets assessed with QDIRT correlated with the silicone impressions, although the droplet number was lower (162 ± 28 vs. 341 ± 56, p < 0.01; r = 0.83, p < 0.01). QDIRT and QSART sudomotor assessments measured at the dorsum of the foot correlated for both sweat response (r = 0.63, p < 0.05) and sweat onset latency (r = 0.52, p < 0.05). Conclusions QDIRT measured both the direct and indirect sudomotor response with spatial resolution similar to silicone impressions and temporal resolution similar to QSART. QDIRT provides a novel tool for the evaluation of post-ganglionic sudomotor function. PMID:18541883
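    The percent-surface-area measure above reduces to counting dye-stained pixels in each photograph. A toy sketch under assumed names; the actual QDIRT image-processing pipeline and threshold calibration are not described in the record:

```python
def percent_sweat_area(gray_frame, threshold):
    # Fraction (in %) of pixels darkened by the topical indicator dye in
    # one grayscale photograph. Pixels below the (assumed) intensity
    # threshold are treated as sweat-stained.
    pixels = [p for row in gray_frame for p in row]
    stained = sum(1 for p in pixels if p < threshold)
    return 100.0 * stained / len(pixels)

# 2x2 toy image: two dark (stained) and two bright (dry) pixels.
print(percent_sweat_area([[10, 200], [30, 220]], threshold=128))  # 50.0
```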

  16. Three pedagogical approaches to introductory physics labs and their effects on student learning outcomes

    NASA Astrophysics Data System (ADS)

    Chambers, Timothy

    This dissertation presents the results of an experiment that measured the learning outcomes associated with three different pedagogical approaches to introductory physics labs. These three pedagogical approaches presented students with the same apparatus and covered the same physics content, but used different lab manuals to guide students through distinct cognitive processes in conducting their laboratory investigations. We administered post-tests containing multiple-choice conceptual questions and free-response quantitative problems one week after students completed these laboratory investigations. In addition, we collected data from the laboratory practical exam taken by students at the end of the semester. Using these data sets, we compared the learning outcomes for the three curricula in three dimensions of ability: conceptual understanding, quantitative problem-solving skill, and laboratory skills. Our three pedagogical approaches are as follows. Guided labs lead students through their investigations via a combination of Socratic-style questioning and direct instruction, while students record their data and answers to written questions in the manual during the experiment. Traditional labs provide detailed written instructions, which students follow to complete the lab objectives. Open labs provide students with a set of apparatus and a question to be answered, and leave students to devise and execute an experiment to answer the question. In general, we find that students performing Guided labs perform better on some conceptual assessment items, and that students performing Open labs perform significantly better on experimental tasks. Combining a classical test theory analysis of post-test results with in-lab classroom observations allows us to identify individual components of the laboratory manuals and investigations that are likely to have influenced the observed differences in learning outcomes associated with the different pedagogical approaches. 
Due to the novel nature of this research and the large number of item-level results we produced, we recommend additional research to determine the reproducibility of our results. Analyzing the data with item response theory yields additional information about the performance of our students on both conceptual questions and quantitative problems. We find that performing lab activities on a topic does lead to better-than-expected performance on some conceptual questions regardless of pedagogical approach, but that this acquired conceptual understanding is strongly context-dependent. The results also suggest that a single "Newtonian reasoning ability" is inadequate to explain student response patterns to items from the Force Concept Inventory. We develop a framework for applying polytomous item response theory to the analysis of quantitative free-response problems and for analyzing how features of student solutions are influenced by problem-solving ability. Patterns in how students at different abilities approach our post-test problems are revealed, and we find hints as to how features of a free-response problem influence its item parameters. The item-response theory framework we develop provides a foundation for future development of quantitative free-response research instruments. Chapter 1 of the dissertation presents a brief history of physics education research and motivates the present study. Chapter 2 describes our experimental methodology and discusses the treatments applied to students and the instruments used to measure their learning. Chapter 3 provides an introduction to the statistical and analytical methods used in our data analysis. Chapter 4 presents the full data set, analyzed using both classical test theory and item response theory. Chapter 5 contains a discussion of the implications of our results and a data-driven analysis of our experimental methods. 
Chapter 6 describes the importance of this work to the field and discusses the relevance of our research to curriculum development and to future work in physics education research.
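    The item response theory referred to above models the probability of a correct answer as a function of student ability and item parameters. A sketch of the standard two-parameter logistic (2PL) model for dichotomous items (the dissertation also applies polytomous models, not shown here):

```python
import math

def p_correct_2pl(theta, a, b):
    # 2PL item response function: probability that a student of ability
    # theta answers correctly an item with discrimination a and
    # difficulty b (all on the usual logit scale).
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A student whose ability equals the item difficulty succeeds half the
# time, regardless of how discriminating the item is:
print(p_correct_2pl(0.0, 1.5, 0.0))  # 0.5
```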

  17. Concentric Coplanar Capacitive Sensor System with Quantitative Model

    NASA Technical Reports Server (NTRS)

    Chen, Tianming (Inventor); Bowler, Nicola (Inventor)

    2014-01-01

    A concentric coplanar capacitive sensor includes a charged central disc forming a first electrode, an outer annular ring coplanar with and outer to the charged central disc, the outer annular ring forming a second electrode, and a gap between the charged central disc and the outer annular ring. The first electrode and the second electrode may be attached to an insulative film. A method provides for determining transcapacitance between the first electrode and the second electrode and using the transcapacitance in a model that accounts for a dielectric test piece to determine inversely the properties of the dielectric test piece.

  18. Electrical Characterization of 4H-SiC JFET Wafer: DC Parameter Variations for Extreme Temperature IC Design

    NASA Technical Reports Server (NTRS)

    Neudeck, Philip G.; Chen, Liangyu; Spry, David J.; Beheim, Glenn M.; Chang, Carl W.

    2014-01-01

    This work reports DC electrical characterization of a 76 mm diameter 4H-SiC JFET test wafer fabricated as part of NASA's on-going efforts to realize medium-scale ICs with prolonged and stable circuit operation at temperatures as high as 500 °C. In particular, these measurements provide quantitative parameter ranges for use in JFET IC design and simulation. Larger than expected parameter variations were observed both as a function of position across the wafer as well as a function of ambient testing temperature from 23 °C to 500 °C.

  19. Evaluation of Aution Max AX-4030 and 9UB Uriflet, 10PA Aution Sticks urine dipsticks in the automated urine test strip analysis.

    PubMed

    Rota, Cristina; Biondi, Marco; Trenti, Tommaso

    2011-09-26

    Aution Max AX-4030, a test strip analyzer recently introduced to the market, represents an upgrade of the Aution Max AX-4280 widely employed for urinalysis. This new instrument model can accommodate two different test strips at the same time. In the present study, the two instruments were compared using Uriflet 9UB and the recently introduced Aution Sticks 10PA urine strips, the latter presenting an additional test area for the measurement of urinary creatinine. Imprecision and correlation between instruments and strips were evaluated for chemical-physical parameters. Accuracy was evaluated for protein, glucose, and creatinine by comparing the semi-quantitative results to those obtained by quantitative methods. The well-known interference effect of high ascorbic acid levels on urine glucose test strip determination was evaluated; ascorbic acid influence on protein and creatinine determination was also assessed. The two instruments demonstrated comparable performance: precision and correlation between instruments and strips, evaluated for chemical-physical parameters, were always good. Furthermore, accuracy was very good: semi-quantitative protein and glucose measurements were highly correlated with those obtained by quantitative methods. Moreover, the semi-quantitative measurements of creatinine, employing Aution Sticks 10PA urine strips, were highly comparable with quantitative results. 10PA urine strips are therefore suitable for urine creatinine determination, making it possible to correct urinalysis results for urinary creatinine concentration whenever necessary and to calculate the protein/creatinine ratio. Further studies should be carried out to evaluate the effectiveness and appropriateness of creatinine semi-quantitative analysis.

  20. CONFOCAL MICROSCOPY SYSTEM PERFORMANCE: QA TESTS, QUANTITATION AND SPECTROSCOPY

    EPA Science Inventory

    Confocal Microscopy System Performance: QA tests, Quantitation and Spectroscopy.

    Robert M. Zucker¹ and Jeremy M. Lerner²
    ¹Reproductive Toxicology Division, National Health and Environmental Effects Research Laboratory, Office of Research Development, U.S. Environmen...

  1. 34 CFR 668.146 - Criteria for approving tests.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... approved under this subpart, a test shall— (1) Assess secondary school level basic verbal and quantitative... verbal and quantitative skills with sufficient numbers of questions to— (i) Adequately represent each... the American Educational Research Association, the American Psychological Association, and the...

  2. Diagnostic value of "dysphagia limit" for neurogenic dysphagia: 17 years of experience in 1278 adults.

    PubMed

    Aydogdu, Ibrahim; Kiylioglu, Nefati; Tarlaci, Sultan; Tanriverdi, Zeynep; Alpaydin, Sezin; Acarer, Ahmet; Baysal, Leyla; Arpaci, Esra; Yuceyar, Nur; Secil, Yaprak; Ozdemirkiran, Tolga; Ertekin, Cumhur

    2015-03-01

    Neurogenic dysphagia (ND) is a prevalent condition that accounts for significant mortality and morbidity worldwide. Screening and follow-up are critical for early diagnosis and management, which can mitigate its complications and be cost-saving. The aims of this study are to provide a comprehensive investigation of the dysphagia limit (DL) in a large, diverse cohort and a longitudinal assessment of dysphagia in a subset of subjects. We developed a quantitative and noninvasive method for objective assessment of dysphagia using a laryngeal sensor and submental electromyography. The DL is the volume at which a second or subsequent swallow becomes necessary to swallow the whole amount of a bolus. This study represents 17 years' experience with the DL approach in assessing ND in a cohort of 1278 adult subjects consisting of 292 healthy controls, 784 patients with dysphagia, and 202 patients without dysphagia. A total of 192 patients were also reevaluated longitudinally over a period of 1-19 months. The DL has 92% sensitivity, 91% specificity, 94% positive predictive value, and 88% negative predictive value, with an accuracy of 0.92. Patients with ALS, stroke, and movement disorders have the highest sensitivity (85-97%) and positive predictive value (90-99%). The clinical severity of dysphagia has a significant negative correlation with DL (r = -0.67, p < 0.0001). We propose the DL as a reliable, quick, noninvasive, quantitative test to detect and follow both clinical and subclinical dysphagia; it can be performed in an EMG laboratory. Our study provides specific quantitative features of the DL test that can be readily utilized by the neurologic community and nominates the DL as an objective and robust method to evaluate dysphagia in a wide range of neurologic conditions. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
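    The reported screening statistics (sensitivity, specificity, PPV, NPV, accuracy) all follow from a 2x2 table of test results against the clinical reference. A generic sketch with illustrative counts (not the study's raw data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    # Standard screening-test metrics from true/false positives and
    # true/false negatives.
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Illustrative counts only:
m = diagnostic_metrics(tp=90, fp=9, fn=10, tn=91)
print(m["sensitivity"], m["specificity"])  # 0.9 0.91
```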

  3. Quantitative determination of dimethicone in commercial tablets and capsules by Fourier transform infrared spectroscopy and antifoaming activity test.

    PubMed

    Torrado, G; García-Arieta, A; de los Ríos, F; Menéndez, J C; Torrado, S

    1999-03-01

    Fourier transform infrared (FTIR) spectroscopy and an antifoaming activity test have been employed for the quantitative analysis of dimethicone. Linearity, accuracy, and precision are presented for both methods. These methods have also been used to compare different dimethicone-containing proprietary medicines. FTIR spectroscopy has been shown to be adequate for quantitation of dimethicone in commercial tablets and capsules in order to comply with USP requirements. The antifoaming activity test is able to detect incompatibilities between dimethicone and other constituents. The presence of certain enzymes in some medicinal products increases the defoaming properties of these formulations.

  4. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829
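    One of the precision assessments discussed for designs without a reference standard can be sketched as a within-subject coefficient of variation over repeat measurements (an illustrative aggregate summary; the review discusses several alternatives):

```python
import statistics

def within_subject_cv(replicates_per_case):
    # Pool the within-case variance across cases measured repeatedly by
    # the same QIB algorithm, then express the pooled within-case SD
    # relative to the grand mean (wCV).
    within_vars = [statistics.variance(r) for r in replicates_per_case]
    all_values = [v for case in replicates_per_case for v in case]
    grand_mean = statistics.mean(all_values)
    return (statistics.mean(within_vars) ** 0.5) / grand_mean

# Two repeat measurements (e.g. test-retest scans) for three cases:
scans = [[10.1, 9.9], [20.3, 19.7], [15.2, 14.8]]
print(round(within_subject_cv(scans), 3))  # 0.02 -> ~2% wCV
```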

  5. Quantitative ESD Guidelines for Charged Spacecraft Derived from the Physics of Discharges

    NASA Technical Reports Server (NTRS)

    Frederickson, A. R.

    1992-01-01

    Quantitative guidelines are proposed for Electrostatic Discharge (ESD) pulse shape on charged spacecraft. The guidelines are based on existing ground test data, and on a physical description of the pulsed discharge process. The guidelines are designed to predict pulse shape for surface charging and internal charging on a wide variety of spacecraft structures. The pulses depend on the area of the sample, its capacitance to ground, and the strength of the electric field in the vacuum adjacent to the charged surface. By knowing the pulse shape, current vs. time, one can determine if nearby circuits are threatened by the pulse. The quantitative guidelines might be used to estimate the level of threat to an existing spacecraft, or to redesign a spacecraft to reduce its pulses to a known safe level. The experiments which provide the data and the physics that allow one to interpret the data will be discussed, culminating in examples of how to predict pulse shape/size. This method has been used, but not confirmed, on several spacecraft.

  6. Establishing the credibility of qualitative research findings: the plot thickens.

    PubMed

    Cutcliffe, J R; McKenna, H P

    1999-08-01

    Qualitative research is increasingly recognized and valued, and its unique place in nursing research is highlighted by many. Despite this, some nurse researchers continue to raise epistemological issues about the problems of objectivity and the validity of qualitative research findings. This paper explores the issues relating to the representativeness or credibility of qualitative research findings. It therefore critiques the existing distinct philosophical and methodological positions concerning the trustworthiness of qualitative research findings, which are described as follows: qualitative studies should be judged using the same criteria and terminology as quantitative studies; it is impossible, in a meaningful way, for any criteria to be used to judge qualitative studies; qualitative studies should be judged using criteria that are developed for and fit the qualitative paradigm; and the credibility of qualitative research findings could be established by testing out the emerging theory by means of conducting a deductive quantitative study. The authors conclude by providing some guidelines for establishing the credibility of qualitative research findings.

  7. High-coverage quantitative proteomics using amine-specific isotopic labeling.

    PubMed

    Melanson, Jeremy E; Avery, Steven L; Pinto, Devanand M

    2006-08-01

    Peptide dimethylation with isotopically coded formaldehydes was evaluated as a potential alternative to techniques such as the iTRAQ method for comparative proteomics. The isotopic labeling strategy and custom-designed protein quantitation software were tested using protein standards and then applied to measure protein levels associated with Alzheimer's disease (AD). The method provided high accuracy (10% error), precision (14% RSD), and coverage (70%) when applied to the analysis of a standard solution of BSA by LC-MS/MS. The technique was then applied to measure protein abundance levels in brain tissue afflicted with AD relative to normal brain tissue. 2-D LC-MS analysis identified 548 unique proteins (p<0.05). Of these, 349 were quantified with two or more peptides that met the statistical criteria used in this study. Several classes of proteins exhibited significant changes in abundance. For example, elevated levels of antioxidant proteins and decreased levels of mitochondrial electron transport proteins were observed. The results demonstrate the utility of the labeling method for high-throughput quantitative analysis.

  8. Ratiometric spectral imaging for fast tumor detection and chemotherapy monitoring in vivo

    PubMed Central

    Hwang, Jae Youn; Gross, Zeev; Gray, Harry B.; Medina-Kauwe, Lali K.; Farkas, Daniel L.

    2011-01-01

    We report a novel in vivo spectral imaging approach to cancer detection and chemotherapy assessment. We describe and characterize a ratiometric spectral imaging and analysis method and evaluate its performance for tumor detection and delineation by quantitatively monitoring the specific accumulation of targeted gallium corrole (HerGa) into HER2-positive (HER2+) breast tumors. HerGa temporal accumulation in nude mice bearing HER2+ breast tumors was monitored comparatively by (a) this new ratiometric imaging and analysis method; (b) established (reflectance and fluorescence) spectral imaging; and (c) more commonly used fluorescence intensity imaging. We also tested the feasibility of HerGa imaging in vivo using the ratiometric spectral imaging method for tumor detection and delineation. Our results show that the new method not only provides better quantitative information than typical spectral imaging, but also better specificity than standard fluorescence intensity imaging, thus allowing enhanced in vivo outlining of tumors and dynamic, quantitative monitoring of targeted chemotherapy agent accumulation into them. PMID:21721808

  9. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  10. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
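    In a benchmark of this kind, accuracy and precision can be summarized from the deviation of observed protein ratios from the known sample-mixing ratio. A simplified sketch in Python (LFQbench itself is an R package; the names and exact metric definitions here are illustrative):

```python
import math
import statistics

def ratio_metrics(measured_ratios, expected_ratio):
    # Accuracy: median deviation of observed log2 ratios from the known
    # mixing ratio. Precision: spread (SD) of the observed log2 ratios.
    logs = [math.log2(r) for r in measured_ratios]
    accuracy = statistics.median(logs) - math.log2(expected_ratio)
    precision = statistics.stdev(logs)
    return accuracy, precision

# Hybrid-proteome idea: proteins spiked at a known 2:1 ratio between samples.
acc, prec = ratio_metrics([2.1, 1.9, 2.0, 2.2], expected_ratio=2.0)
print(round(acc, 3), round(prec, 3))
```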

  11. Semi Quantitative MALDI TOF for Antimicrobial Susceptibility Testing in Staphylococcus aureus

    DTIC Science & Technology

    2017-08-31

    Semi-quantitative MALDI-TOF for antimicrobial susceptibility testing in Staphylococcus aureus. Tucker Maxson, Cheryl L. Taylor-Howell, ... Timothy D. Minogue. Diagnostic Systems Division, United States Army Medical Research Institute of Infectious Disease, Fort Detrick, MD ... USA. Running Title: Quantitative MALDI for AST in S. aureus. Address correspondence to Timothy D. Minogue, timothy.d.minogue.civ@mail.mil

  12. Quantitative magnetic resonance (MR) neurography for evaluation of peripheral nerves and plexus injuries

    PubMed Central

    Barousse, Rafael; Socolovsky, Mariano; Luna, Antonio

    2017-01-01

    Traumatic conditions of peripheral nerves and plexus have classically been evaluated by morphological imaging techniques and electrophysiological tests. New magnetic resonance imaging (MRI) studies based on 3D fat-suppressed techniques are providing high accuracy for peripheral nerve injury evaluation from a qualitative point of view. However, these techniques do not provide quantitative information. Diffusion weighted imaging (DWI) and diffusion tensor imaging (DTI) are functional MRI techniques that are able to evaluate and quantify the movement of water molecules within different biological structures. These techniques have been successfully applied in other anatomical areas, especially in the assessment of the central nervous system, and are now being adapted, with promising results, to peripheral nerve and plexus evaluation. DWI and DTI allow qualitative and quantitative peripheral nerve analysis, providing valuable pathophysiological information about the functional integrity of these structures. In the field of trauma and peripheral nerve or plexus injury, several parameters derived from DWI and DTI studies, such as the apparent diffusion coefficient (ADC) or fractional anisotropy (FA), can be used as potential biomarkers of neural damage, providing information about fiber organization, axonal flow or myelin integrity. A proper knowledge of the physical basis of these techniques and their limitations is important for an optimal interpretation of the imaging findings and derived data. In this paper, a comprehensive review of the potential applications of DWI and DTI neurographic studies is performed with a focus on traumatic conditions, including the main nerve entrapment syndromes of both peripheral nerves and the brachial or lumbar plexus. PMID:28932698
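    The two scalar biomarkers named above, ADC (mean diffusivity) and FA, are computed from the eigenvalues of the fitted diffusion tensor. A minimal sketch of the standard formulas:

```python
import math

def adc_and_fa(l1, l2, l3):
    # l1..l3: eigenvalues of the diffusion tensor at one voxel.
    # ADC (mean diffusivity) is their average; FA measures how far the
    # diffusion ellipsoid departs from a sphere (0 = isotropic,
    # approaching 1 = strongly directional, e.g. along a nerve fiber).
    md = (l1 + l2 + l3) / 3.0
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = math.sqrt(1.5 * num / den)
    return md, fa

print(adc_and_fa(1.0, 1.0, 1.0))  # (1.0, 0.0): isotropic diffusion
```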

  13. Procedures to evaluate the efficiency of protective clothing worn by operators applying pesticide.

    PubMed

    Espanhol-Soares, Melina; Nociti, Leticia A S; Machado-Neto, Joaquim Gonçalves

    2013-10-01

    The evaluation of the efficiency of whole-body protective clothing against pesticides has already been carried out through field tests and procedures defined by international standards, but there is a need to determine the useful life of these garments to ensure worker safety. The aim of this article is to compare the procedures for evaluating efficiency of two whole-body protective garments, both new and previously used by applicators of herbicides, using a laboratory test with a mannequin and in the field with the operator. The evaluation of the efficiency of protective clothing used both quantitative and qualitative methodologies, leading to a proposal for classification according to efficiency, and determination of the useful life of protective clothing for use against pesticides, based on a quantitative assessment. The procedures used were in accordance with the standards of the modified American Society for Testing and Materials (ASTM) F 1359:2007 and International Organization for Standardization 17491-4. The protocol used in the field was World Health Organization Vector Biology and Control (VBC)/82.1. Clothing tested was personal water repellent and pesticide protective. Two varieties of fabric were tested: Beige (100% cotton) and Camouflaged (31% polyester and 69% cotton). The efficiency in exposure control of the personal protective clothing was measured before use and after 5, 10, 20, and 30 uses and washes under field conditions. Personal protective clothing was worn by workers in the field during the application of the herbicide glyphosate on weed species in mature sugar cane plantations using a knapsack sprayer. The modified ASTM 1359:2007 procedure was chosen as the most appropriate due to its greater repeatability (lower coefficient of variation). 
This procedure provides the quantitative evaluation needed to determine the efficiency and useful life of individual protective clothing, not just at specific points of failure but according to dermal protection as a whole. The qualitative assessment, which is suitable for verifying garment design and stitching flaws, does not aid in determining useful life, but it does complement the quantitative evaluation. The proposed classification is appropriate and accurate for determining the useful life of personal protective clothing against pesticides, relative to the number of uses and washes after each use. For example, the Beige garment had a useful life of 30 uses and washes, while the Camouflaged garment had a useful life of 5 uses and washes.
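The repeatability criterion used to choose between the two procedures is the coefficient of variation. A minimal sketch of that comparison follows; the replicate values are hypothetical illustrations, not data from the study.

```python
def coefficient_of_variation(values):
    """CV = sample standard deviation / mean, expressed as a percent."""
    n = len(values)
    mean = sum(values) / n
    # Sample variance (n - 1 denominator).
    var = sum((x - mean) ** 2 for x in values) / (n - 1)
    return (var ** 0.5 / mean) * 100.0

# Hypothetical replicate efficiency measurements (%) from two procedures.
procedure_a = [95.1, 94.8, 95.4, 95.0]
procedure_b = [93.0, 96.5, 91.8, 95.9]
# The procedure with the lower CV is the more repeatable one.
print(coefficient_of_variation(procedure_a) < coefficient_of_variation(procedure_b))  # True
```

Under this criterion, procedure A above would be preferred, exactly as the modified ASTM procedure was preferred in the study.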

  14. Synthesising quantitative and qualitative research in evidence‐based patient information

    PubMed Central

    Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan

    2007-01-01

    Background Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence‐based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. Aims This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Methods Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non‐quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg “explain what the test involves”) was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. Results 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. 
Conclusions A practical example is provided of how quantitative and qualitative data sources might successfully be brought together and considered in one review. PMID:17325406

  15. A QUANTITATIVE TEST OF THE NO-HAIR THEOREM WITH Sgr A* USING STARS, PULSARS, AND THE EVENT HORIZON TELESCOPE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Psaltis, Dimitrios; Wex, Norbert; Kramer, Michael

The black hole in the center of the Milky Way, Sgr A*, has the largest mass-to-distance ratio among all known black holes in the universe. This property makes Sgr A* the optimal target for testing the gravitational no-hair theorem. In the near future, major developments in instrumentation will provide the tools for high-precision studies of its spacetime via observations of relativistic effects in stellar orbits, in the timing of pulsars, and in horizon-scale images of its accretion flow. We explore here the prospect of measuring the properties of the black hole spacetime using all of these three types of observations. We show that the correlated uncertainties in the measurements of the black hole spin and quadrupole moment using the orbits of stars and pulsars are nearly orthogonal to those obtained from measuring the shape and size of the shadow the black hole casts on the surrounding emission. Combining these three types of observations will therefore allow us to assess and quantify systematic biases and uncertainties in each measurement and lead to a highly accurate, quantitative test of the gravitational no-hair theorem.

  16. Overcoming pain thresholds with multilevel models-an example using quantitative sensory testing (QST) data.

    PubMed

    Hirschfeld, Gerrit; Blankenburg, Markus R; Süß, Moritz; Zernikow, Boris

    2015-01-01

The assessment of somatosensory function is a cornerstone of research and clinical practice in neurology. Recent initiatives have developed novel protocols for quantitative sensory testing (QST). Application of these methods led to intriguing findings, such as the presence of lower pain thresholds in healthy children compared to healthy adolescents. In this article, we (re-)introduce the basic concepts of signal detection theory (SDT) as a method to investigate such differences in somatosensory function in detail. SDT describes participants' responses according to two parameters, sensitivity and response-bias. Sensitivity refers to individuals' ability to discriminate between painful and non-painful stimulations. Response-bias refers to individuals' criterion for giving a "painful" response. We describe how multilevel models can be used to estimate these parameters and to overcome central critiques of these methods. To provide an example, we apply these methods to data from the mechanical pain sensitivity test of the QST protocol. The results show that adolescents are more sensitive to mechanical pain and contradict the idea that younger children simply use more lenient criteria to report pain. Overall, we hope that the wider use of multilevel modeling to describe somatosensory functioning may advance neurology research and practice.
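The two SDT parameters named above can be recovered from hit and false-alarm rates under the standard equal-variance model. A minimal sketch follows; the rates are hypothetical, and the multilevel estimation the article actually uses is not reproduced here.

```python
from statistics import NormalDist

def sdt_parameters(hit_rate, false_alarm_rate):
    """Equal-variance SDT: sensitivity d' = z(H) - z(FA);
    response bias (criterion) c = -(z(H) + z(FA)) / 2."""
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(false_alarm_rate)
    criterion = -(z(hit_rate) + z(false_alarm_rate)) / 2
    return d_prime, criterion

# Hypothetical rates of "painful" responses to painful stimuli (hits)
# and to non-painful stimuli (false alarms).
d_prime, criterion = sdt_parameters(hit_rate=0.85, false_alarm_rate=0.20)
print(round(d_prime, 2), round(criterion, 2))
```

A higher d' means better discrimination of painful from non-painful stimulation; a more negative c means a more lenient criterion for reporting pain, which is precisely the alternative explanation the article's analysis rules out for children.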

  17. Quantitative multi-modal NDT data analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics, and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR, and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  18. In Vivo Quantitative Ultrasound Imaging and Scatter Assessments.

    NASA Astrophysics Data System (ADS)

    Lu, Zheng Feng

    There is evidence that "instrument independent" measurements of ultrasonic scattering properties would provide useful diagnostic information that is not available with conventional ultrasound imaging. This dissertation is a continuing effort to test the above hypothesis and to incorporate quantitative ultrasound methods into clinical examinations for early detection of diffuse liver disease. A well-established reference phantom method was employed to construct quantitative ultrasound images of tissue in vivo. The method was verified by extensive phantom tests. A new method was developed to measure the effective attenuation coefficient of the body wall. The method relates the slope of the difference between the echo signal power spectrum from a uniform region distal to the body wall and the echo signal power spectrum from a reference phantom to the body wall attenuation. The accuracy obtained from phantom tests suggests further studies with animal experiments. Clinically, thirty-five healthy subjects and sixteen patients with diffuse liver disease were studied by these quantitative ultrasound methods. The average attenuation coefficient in normals agreed with previous investigators' results; in vivo backscatter coefficients agreed with the results from normals measured by O'Donnell. Strong discriminating power (p < 0.001) was found for both attenuation and backscatter coefficients between fatty livers and normals; a significant difference (p < 0.01) was observed in the backscatter coefficient but not in the attenuation coefficient between cirrhotic livers and normals. An in vivo animal model of steroid hepatopathy was used to investigate the system sensitivity in detecting early changes in canine liver resulting from corticosteroid administration. 
The average attenuation coefficient slope increased from 0.7 dB/cm/MHz in controls to 0.82 dB/cm/MHz (at 6 MHz) in treated animals on day 14 into the treatment, and the backscatter coefficient was 26 × 10^-4 cm^-1 sr^-1 in controls compared with 74 × 10^-4 cm^-1 sr^-1 (at 6 MHz) in treated animals. A simplified quantitative approach using video image signals was developed. Results derived both from the r.f. signal analysis and from the video signal analysis are sensitive to the changes in the liver in this animal model.

  19. The attentional drift-diffusion model extends to simple purchasing decisions.

    PubMed

    Krajbich, Ian; Lu, Dingchao; Camerer, Colin; Rangel, Antonio

    2012-01-01

    How do we make simple purchasing decisions (e.g., whether or not to buy a product at a given price)? Previous work has shown that the attentional drift-diffusion model (aDDM) can provide accurate quantitative descriptions of the psychometric data for binary and trinary value-based choices, and of how the choice process is guided by visual attention. Here we extend the aDDM to the case of purchasing decisions, and test it using an eye-tracking experiment. We find that the model also provides a reasonably accurate quantitative description of the relationship between choice, reaction time, and visual fixations using parameters that are very similar to those that best fit the previous data. The only critical difference is that the choice biases induced by the fixations are about half as big in purchasing decisions as in binary choices. This suggests that a similar computational process is used to make binary choices, trinary choices, and simple purchasing decisions.
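The aDDM's core mechanism is a noisy evidence accumulator whose drift is discounted toward the currently unattended item. The following simulation is a minimal sketch of that mechanism only; the parameter values, deterministic fixation schedule, and item values are illustrative, not the fitted values from this study.

```python
import random

def addm_trial(v_left, v_right, theta=0.3, d=0.002, sigma=0.02,
               barrier=1.0, switch_every=400, seed=None):
    """Simulate one attentional drift-diffusion trial.

    theta < 1 discounts the value of the unattended item, biasing
    choice toward the fixated option. Returns (choice, steps)."""
    rng = random.Random(seed)
    rdv = 0.0            # relative decision value (left minus right)
    steps = 0
    looking_left = True  # simplified deterministic fixation schedule
    while abs(rdv) < barrier:
        steps += 1
        if steps % switch_every == 0:
            looking_left = not looking_left
        if looking_left:
            drift = d * (v_left - theta * v_right)
        else:
            drift = d * (theta * v_left - v_right)
        rdv += drift + rng.gauss(0.0, sigma)
    return ("left" if rdv > 0 else "right"), steps

choice, rt = addm_trial(v_left=3, v_right=1, seed=1)
print(choice, rt)
```

For purchasing decisions, the same accumulator can be run on (item value minus price) against zero; the record's finding is that the fixation-induced bias (the effect of theta) is roughly halved in that setting.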

  20. Semi-automated discrimination of retinal pigmented epithelial cells in two-photon fluorescence images of mouse retinas.

    PubMed

    Alexander, Nathan S; Palczewska, Grazyna; Palczewski, Krzysztof

    2015-08-01

    Automated image segmentation is a critical step toward achieving a quantitative evaluation of disease states with imaging techniques. Two-photon fluorescence microscopy (TPM) has been employed to visualize the retinal pigmented epithelium (RPE) and provide images indicating the health of the retina. However, segmentation of RPE cells within TPM images is difficult due to small differences in fluorescence intensity between cell borders and cell bodies. Here we present a semi-automated method for segmenting RPE cells that relies upon multiple weak features that differentiate cell borders from the remaining image. These features were scored by a search optimization procedure that built up the cell border in segments around a nucleus of interest. With six images used as a test, our method correctly identified cell borders for 69% of nuclei on average. Performance was strongly dependent upon increasing retinosome content in the RPE. TPM image analysis has the potential of providing improved early quantitative assessments of diseases affecting the RPE.

  1. Field Assessment of Energy Audit Tools for Retrofit Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, J.; Bohac, D.; Nelson, C.

    2013-07-01

This project focused on the use of home energy ratings as a tool to promote energy retrofits in existing homes. A home energy rating provides a quantitative appraisal of a home’s energy performance, usually compared to a benchmark such as the average energy use of similar homes in the same region. Rating systems based on energy performance models, the focus of this report, can establish a home’s achievable energy efficiency potential and provide a quantitative assessment of energy savings after retrofits are completed, although their accuracy needs to be verified by actual measurement or billing data. Ratings can also show homeowners where they stand compared to their neighbors, thus creating social pressure to conform to or surpass others. This project field-tested three different building performance models of varying complexity, in order to assess their value as rating systems in the context of a residential retrofit program: Home Energy Score, SIMPLE, and REM/Rate.
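A benchmark-relative rating of the kind described here can be sketched in a few lines. This is one plausible scheme for illustration only, not the actual scoring formula of Home Energy Score, SIMPLE, or REM/Rate.

```python
def energy_rating(modeled_kwh, benchmark_kwh):
    """Score a home against the regional benchmark: 100 means average
    energy use; lower scores mean better predicted performance."""
    return 100.0 * modeled_kwh / benchmark_kwh

def retrofit_savings_fraction(pre_kwh, post_kwh):
    """Quantitative assessment of savings after retrofits are completed."""
    return (pre_kwh - post_kwh) / pre_kwh

# Hypothetical modeled annual use vs. a regional benchmark.
print(energy_rating(9000, 12000))                        # 75.0: better than average
print(round(retrofit_savings_fraction(12000, 9000), 2))  # 0.25
```

As the record notes, a model-based score like this should still be verified against measured or billing data before savings claims are made.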

  2. The Attentional Drift-Diffusion Model Extends to Simple Purchasing Decisions

    PubMed Central

    Krajbich, Ian; Lu, Dingchao; Camerer, Colin; Rangel, Antonio

    2012-01-01

    How do we make simple purchasing decisions (e.g., whether or not to buy a product at a given price)? Previous work has shown that the attentional drift-diffusion model (aDDM) can provide accurate quantitative descriptions of the psychometric data for binary and trinary value-based choices, and of how the choice process is guided by visual attention. Here we extend the aDDM to the case of purchasing decisions, and test it using an eye-tracking experiment. We find that the model also provides a reasonably accurate quantitative description of the relationship between choice, reaction time, and visual fixations using parameters that are very similar to those that best fit the previous data. The only critical difference is that the choice biases induced by the fixations are about half as big in purchasing decisions as in binary choices. This suggests that a similar computational process is used to make binary choices, trinary choices, and simple purchasing decisions. PMID:22707945

  3. Quantitative in vivo assessment of lung microstructure at the alveolar level with hyperpolarized 3He diffusion MRI

    NASA Astrophysics Data System (ADS)

    Yablonskiy, Dmitriy A.; Sukstanskii, Alexander L.; Leawoods, Jason C.; Gierada, David S.; Bretthorst, G. Larry; Lefrak, Stephen S.; Cooper, Joel D.; Conradi, Mark S.

    2002-03-01

    The study of lung emphysema dates back to the beginning of the 17th century. Nevertheless, a number of important questions remain unanswered because a quantitative localized characterization of emphysema requires knowledge of lung structure at the alveolar level in the intact living lung. This information is not available from traditional imaging modalities and pulmonary function tests. Herein, we report the first in vivo measurements of lung geometrical parameters at the alveolar level obtained with 3He diffusion MRI in healthy human subjects and patients with severe emphysema. We also provide the first experimental data demonstrating that 3He gas diffusivity in the acinus of human lung is highly anisotropic. A theory of anisotropic diffusion is presented. Our results clearly demonstrate substantial differences between healthy and emphysematous lung at the acinar level and may provide new insights into emphysema progression. The technique offers promise as a clinical tool for early diagnosis of emphysema.

  4. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
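The reliability half of the handbook rests on estimating a failure rate from test data and converting it into a probability of failure-free operation. A minimal sketch under a constant-failure-rate (exponential) assumption follows; the inter-failure times are hypothetical, not Space Station data, and the handbook's actual models may differ.

```python
import math

def failure_rate(inter_failure_times):
    """Maximum-likelihood constant failure rate:
    failures observed divided by total execution time."""
    return len(inter_failure_times) / sum(inter_failure_times)

def reliability(mission_time, lam):
    """Probability of failure-free operation for the mission duration:
    R(t) = exp(-lambda * t)."""
    return math.exp(-lam * mission_time)

# Hypothetical inter-failure times (hours of execution) from testing.
lam = failure_rate([12.0, 20.0, 8.0, 40.0])   # 4 failures / 80 h = 0.05 per hour
print(round(reliability(10.0, lam), 3))       # exp(-0.5) ≈ 0.607
```

Varying the assumed mission time or the target reliability makes the design trade-offs between reliability, cost, and schedule mentioned above explicit.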

  5. Improved assay to detect Plasmodium falciparum using an uninterrupted, semi-nested PCR and quantitative lateral flow analysis

    PubMed Central

    2013-01-01

    Background A rapid, non-invasive, and inexpensive point-of-care (POC) diagnostic for malaria followed by therapeutic intervention would improve the ability to control infection in endemic areas. Methods A semi-nested PCR amplification protocol is described for quantitative detection of Plasmodium falciparum and is compared to a traditional nested PCR. The approach uses primers that target the P. falciparum dihydrofolate reductase gene. Results This study demonstrates that it is possible to perform an uninterrupted, asymmetric, semi-nested PCR assay with reduced assay time to detect P. falciparum without compromising the sensitivity and specificity of the assay using saliva as a testing matrix. Conclusions The development of this PCR allows nucleic acid amplification without the need to transfer amplicon from the first PCR step to a second reaction tube with nested primers, thus reducing both the chance of contamination and the time for analysis to less than two hours. Analysis of the PCR amplicon yield was adapted to lateral flow detection using the quantitative up-converting phosphor (UCP) reporter technology. This approach provides a basis for migration of the assay to a POC microfluidic format. In addition, the assay was successfully evaluated with oral samples. Oral fluid collection provides a simple non-invasive method to collect clinical samples. PMID:23433252

  6. Quantitative Assessment of RNA-Protein Interactions with High Throughput Sequencing - RNA Affinity Profiling (HiTS-RAP)

    PubMed Central

    Ozer, Abdullah; Tome, Jacob M.; Friedman, Robin C.; Gheba, Dan; Schroth, Gary P.; Lis, John T.

    2016-01-01

    Because RNA-protein interactions play a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the High Throughput Sequencing-RNA Affinity Profiling (HiTS-RAP) assay, which couples sequencing on an Illumina GAIIx with the quantitative assessment of one or several proteins’ interactions with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of EGFP and NELF-E proteins with their corresponding canonical and mutant RNA aptamers. Here, we provide a detailed protocol for HiTS-RAP, which can be completed in about a month (8 days of hands-on time), including the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, high-throughput sequencing and protein binding with GAIIx, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, RNA-MaP and RBNS. A successful HiTS-RAP experiment provides the sequence and binding curves for approximately 200 million RNAs in a single experiment. PMID:26182240

  7. Quantitative analysis of visible surface defect risk in tablets during film coating using terahertz pulsed imaging.

    PubMed

    Niwa, Masahiro; Hiraishi, Yasuhiro

    2014-01-30

    Tablets are the most common form of solid oral dosage produced by pharmaceutical industries. There are several challenges to successful and consistent tablet manufacturing. One well-known quality issue is visible surface defects, which generally occur due to insufficient physical strength, causing breakage or abrasion during processing, packaging, or shipping. Techniques that allow quantitative evaluation of surface strength and the risk of surface defect would greatly aid in quality control. Here terahertz pulsed imaging (TPI) was employed to evaluate the surface properties of core tablets with visible surface defects of varying severity after film coating. Other analytical methods, such as tensile strength measurements, friability testing, and scanning electron microscopy (SEM), were used to validate TPI results. Tensile strength and friability provided no information on visible surface defect risk, whereas the TPI-derived unique parameter terahertz electric field peak strength (TEFPS) provided spatial distribution of surface density/roughness information on core tablets, which helped in estimating tablet abrasion risk prior to film coating and predicting the location of the defects. TPI also revealed the relationship between surface strength and blending condition and is a nondestructive, quantitative approach to aid formulation development and quality control that can reduce visible surface defect risk in tablets. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Quantitative real-time in vivo detection of magnetic nanoparticles by their nonlinear magnetization

    NASA Astrophysics Data System (ADS)

    Nikitin, M. P.; Torno, M.; Chen, H.; Rosengart, A.; Nikitin, P. I.

    2008-04-01

    A novel method of highly sensitive quantitative detection of magnetic nanoparticles (MP) in biological tissues and the blood system has been realized and tested in real-time in vivo experiments. The detection method is based on the nonlinear magnetic properties of MP, and the related device can record relative variations of nonlinear magnetic susceptibility as small as 10^-8 at room temperature, providing a sensitivity of several nanograms of MP in a 0.1 ml volume. Real-time quantitative in vivo measurements of the dynamics of MP concentration in blood flow have been performed. A catheter that carried the blood flow of a rat passed through the measuring device. After an MP injection, the quantity of MP in the circulating blood was continuously recorded. The method has also been used to evaluate the MP distribution between the rat's organs. Its sensitivity was compared with detection of radioactive MP based on the Fe59 isotope. The comparison of magnetic and radioactive signals in the rat's blood and organ samples demonstrated similar sensitivity for both methods. However, the proposed magnetic method is much more convenient as it is safe, less expensive, and provides real-time measurements in vivo. Moreover, the sensitivity of the method can be further improved by optimization of the device geometry.

  9. An expert system/ion trap mass spectrometry approach for life support systems monitoring

    NASA Technical Reports Server (NTRS)

    Palmer, Peter T.; Wong, Carla M.; Yost, Richard A.; Johnson, Jodie V.; Yates, Nathan A.; Story, Michael

    1992-01-01

    Efforts to develop sensor and control system technology to monitor air quality for life support have resulted in the development and preliminary testing of a concept based on expert systems and ion trap mass spectrometry (ITMS). An ITMS instrument provides the capability to identify and quantitate a large number of suspected contaminants at trace levels through the use of a variety of multidimensional experiments. An expert system provides specialized knowledge for control, analysis, and decision making. The system is intended for real-time, on-line, autonomous monitoring of air quality. The key characteristics of the system, performance data and analytical capabilities of the ITMS instrument, the design and operation of the expert system, and results from preliminary testing of the system for trace contaminant monitoring are described.

  10. Digital video timing analyzer for the evaluation of PC-based real-time simulation systems

    NASA Astrophysics Data System (ADS)

    Jones, Shawn R.; Crosby, Jay L.; Terry, John E., Jr.

    2009-05-01

    Due to the rapid acceleration in technology and the drop in costs, the use of commercial off-the-shelf (COTS) PC-based hardware and software components for digital and hardware-in-the-loop (HWIL) simulations has increased. However, the increase in PC-based components creates new challenges for HWIL test facilities such as cost-effective hardware and software selection, system configuration and integration, performance testing, and simulation verification/validation. This paper will discuss how the Digital Video Timing Analyzer (DiViTA) installed in the Aviation and Missile Research, Development and Engineering Center (AMRDEC) provides quantitative characterization data for PC-based real-time scene generation systems. An overview of the DiViTA is provided followed by details on measurement techniques, applications, and real-world examples of system benefits.

  11. Kinetics of Poliovirus Shedding following Oral Vaccination as Measured by Quantitative Reverse Transcription-PCR versus Culture

    PubMed Central

    Begum, Sharmin; Uddin, Md Jashim; Platts-Mills, James A.; Liu, Jie; Kirkpatrick, Beth D.; Chowdhury, Anwarul H.; Jamil, Khondoker M.; Haque, Rashidul; Petri, William A.; Houpt, Eric R.

    2014-01-01

    Amid polio eradication efforts, detection of oral polio vaccine (OPV) virus in stool samples can provide information about rates of mucosal immunity and allow estimation of the poliovirus reservoir. We developed a multiplex one-step quantitative reverse transcription-PCR (qRT-PCR) assay for detection of OPV Sabin strains 1, 2, and 3 directly in stool samples with an external control to normalize samples for viral quantity and compared its performance with that of viral culture. We applied the assay to samples from infants in Dhaka, Bangladesh, after the administration of trivalent OPV (tOPV) at weeks 14 and 52 of life (on days 0 [pre-OPV], +4, +11, +18, and +25 relative to vaccination). When 1,350 stool samples were tested, the sensitivity and specificity of the quantitative PCR (qPCR) assay were 89 and 91% compared with culture. A quantitative relationship between culture+/qPCR+ and culture−/qPCR+ stool samples was observed. The kinetics of shedding revealed by qPCR and culture were similar. qPCR quantitative cutoffs based on the day +11 or +18 stool samples could be used to identify the culture-positive shedders, as well as the long-duration or high-frequency shedders. Interestingly, qPCR revealed that a small minority (7%) of infants contributed the vast majority (93 to 100%) of the total estimated viral excretion across all subtypes at each time point. This qPCR assay for OPV can simply and quantitatively detect all three Sabin strains directly in stool samples to approximate shedding both qualitatively and quantitatively. PMID:25378579

  12. Evaluative procedures to detect, characterize, and assess the severity of diabetic neuropathy.

    PubMed

    Dyck, P J

    1991-01-01

    Minimal criteria for diabetic neuropathy need to be defined and universally applied. Standardized evaluative procedures need to be agreed and normal ranges determined from healthy volunteers. Types and stages of neuropathy should be established and assessments performed on representative populations of both Type 1 and Type 2 diabetic patients. Potential minimal criteria include absent ankle reflexes and vibratory sensation, and abnormalities of nerve conduction. However, the preferred criterion is the identification of more than two statistically defined abnormalities among symptoms and deficits, nerve conduction, quantitative sensory examination or quantitative autonomic examination. Various evaluative procedures are available. Symptoms should be assessed and scores can be assigned to neurological deficits. However, assessments of nerve conduction provide the most specific, objective, sensitive, and repeatable procedures, although these may be the least meaningful. Many techniques are available for quantitative sensory examination, but are poorly standardized and normal values are not available. For quantitative autonomic examination, tests are available for the adequacy of cardiovascular and peripheral vascular reflexes and increasingly for other autonomic functions. In any assessment of nerve function the conditions should be optimized and standardized, and stimuli defined. Specific instructions should be given and normal ranges established in healthy volunteers.

  13. Quantitative Collection and Enzymatic Activity of Glucose Oxidase Nanotubes Fabricated by Templated Layer-by-Layer Assembly.

    PubMed

    Zhang, Shouwei; Demoustier-Champagne, Sophie; Jonas, Alain M

    2015-08-10

    We report on the fabrication of enzyme nanotubes in nanoporous polycarbonate membranes via the layer-by-layer (LbL) alternate assembly of polyethylenimine (PEI) and glucose oxidase (GOX), followed by dissolution of the sacrificial template in CH2Cl2, collection, and final dispersion in water. An adjuvant-assisted filtration methodology is exploited to quantitatively extract the nanotubes without loss of activity or morphology. Different water-soluble, CH2Cl2-insoluble adjuvants are tested for maximal enzyme activity and nanotube stability; whereas NaCl disrupts the tubes by screening electrostatic interactions, the high osmotic pressure created by fructose also contributes to loosening the nanotubular structures. These issues are solved when using neutral, high molar mass dextran. The enzymatic activity of intact free nanotubes in water is then quantitatively compared to membrane-embedded nanotubes, showing that the liberated nanotubes have a higher catalytic activity in proportion to their larger exposed surface. Our study thus discloses a robust and general methodology for the fabrication and quantitative collection of enzymatic nanotubes and shows that LbL assembly provides access to efficient enzyme carriers for use as catalytic swarming agents.

  14. [Experimental studies of using real-time fluorescence quantitative PCR and RT-PCR to detect E6 and E7 genes of human papillomavirus type 16 in cervical carcinoma cell lines].

    PubMed

    Chen, Yue-yue; Peng, Zhi-lan; Liu, Shan-ling; He, Bing; Hu, Min

    2007-06-01

    To establish a method of using real-time fluorescence quantitative PCR and RT-PCR to detect the E6 and E7 genes of human papillomavirus type 16 (HPV-16). Plasmids containing HPV-16 E6 or E7 were used to generate absolute standard curves. Three cervical carcinoma cell lines, CaSki, SiHa, and HeLa, were tested by real-time fluorescence quantitative PCR and RT-PCR analyses for the expression of HPV-16 E6 and E7. The correlation coefficients of the standard curves were greater than 0.99, and the PCR efficiency was more than 90%. The relative levels of HPV-16 E6 and E7 DNA and RNA were CaSki > SiHa > HeLa cells. Quantitation of HPV-16 E6 and E7 by real-time fluorescence quantitative PCR and RT-PCR may serve as a reliable and sensitive tool. This study opens the possibility of further research on the relationship between HPV-16 E6 or E7 copy number and cervical carcinoma.
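The standard-curve figures quoted in this record (correlation above 0.99, efficiency above 90%) come from the slope of threshold cycle (Ct) versus log template amount. A minimal sketch of that calculation follows, using a hypothetical dilution series rather than the study's data.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def pcr_efficiency(slope):
    """Amplification efficiency from a standard-curve slope:
    E = 10**(-1/slope) - 1 (E = 1.0 is perfect doubling, slope ≈ -3.32)."""
    return 10 ** (-1.0 / slope) - 1.0

# Hypothetical dilution series: log10(template copies) vs. Ct.
log_copies = [7, 6, 5, 4, 3]
ct = [14.2, 17.6, 21.0, 24.4, 27.8]   # exactly linear, slope -3.4
slope, _ = fit_line(log_copies, ct)
print(round(pcr_efficiency(slope), 3))  # ≈ 0.968, i.e. > 90%
```

A slope near -3.32 (and hence efficiency near 100%) together with a high correlation coefficient is what qualifies a standard curve for absolute quantitation.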

  15. Reading Out Single-Molecule Digital RNA and DNA Isothermal Amplification in Nanoliter Volumes with Unmodified Camera Phones

    PubMed Central

    2016-01-01

    Digital single-molecule technologies are expanding diagnostic capabilities, enabling the ultrasensitive quantification of targets, such as viral load in HIV and hepatitis C infections, by directly counting single molecules. Replacing fluorescent readout with a robust visual readout that can be captured by any unmodified cell phone camera will facilitate the global distribution of diagnostic tests, including in limited-resource settings where the need is greatest. This paper describes a methodology for developing a visual readout system for digital single-molecule amplification of RNA and DNA by (i) selecting colorimetric amplification-indicator dyes that are compatible with the spectral sensitivity of standard mobile phones, and (ii) identifying an optimal ratiometric image-processing scheme for a selected dye to achieve a readout that is robust to lighting conditions and camera hardware and provides unambiguous quantitative results, even for colorblind users. We also include an analysis of the limitations of this methodology, and provide a microfluidic approach that can be applied to expand dynamic range and improve reaction performance, allowing ultrasensitive, quantitative measurements at volumes as low as 5 nL. We validate this methodology using SlipChip-based digital single-molecule isothermal amplification with λDNA as a model and hepatitis C viral RNA as a clinically relevant target. The innovative combination of isothermal amplification chemistry in the presence of a judiciously chosen indicator dye and ratiometric image processing with SlipChip technology allowed the sequence-specific visual readout of single nucleic acid molecules in nanoliter volumes with an unmodified cell phone camera. When paired with devices that integrate sample preparation and nucleic acid amplification, this hardware-agnostic approach will increase the affordability and the distribution of quantitative diagnostic and environmental tests. PMID:26900709
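    Digital assays of this kind count positive wells and convert the count to a concentration with a Poisson correction for wells that received more than one molecule. A minimal sketch under assumed well counts and volumes (the numbers are illustrative, not from the paper):

    ```python
    import math

    def digital_concentration(positive, total, well_volume_nl):
        """Most-probable concentration from a digital assay, applying
        the Poisson correction lambda = -ln(1 - p) for multiply
        occupied wells."""
        p = positive / total
        lam = -math.log(1.0 - p)              # mean molecules per well
        copies_total = lam * total            # corrected molecule count
        conc_per_ul = lam / (well_volume_nl * 1e-3)  # copies per microliter
        return copies_total, conc_per_ul

    # Hypothetical run: 380 of 1280 wells of 5 nL each turn positive
    copies, conc = digital_concentration(380, 1280, 5.0)
    ```

    The corrected count exceeds the raw positive-well count because some wells contain more than one molecule; this correction is what keeps the readout quantitative near the top of the dynamic range.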

  16. Quantitative comparison of the in situ microbial communities in different biomes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, D.C.; Ringelberg, D.B.; Palmer, R.J.

    1995-12-31

    A system to define microbial communities in different biomes requires the application of non-traditional methodology. Classical microbiological methods have severe limitations for the analysis of environmental samples. Pure-culture isolation, biochemical testing, and/or enumeration by direct microscopic counting are not well suited for the estimation of total biomass or the assessment of community composition within environmental samples. Such methods provide little insight into the in situ phenotypic activity of the extant microbiota, since these techniques depend on microbial growth and thus select against the many environmental microorganisms that are non-culturable under a wide range of conditions. It has been repeatedly documented in the literature that viable counts or direct counts of bacteria attached to sediment grains are difficult to quantify and may grossly underestimate the extent of the existing community. The traditional tests also provide little indication of the in situ nutritional status of, or evidence of toxicity within, the microbial community. A more recent development, the MIDI Microbial Identification System, measures free and ester-linked fatty acids from isolated microorganisms. Bacterial isolates are identified by comparing their fatty acid profiles to the MIDI database, which contains over 8000 entries. The application of the MIDI system to the analysis of environmental samples, however, has significant drawbacks. The MIDI system was developed to identify clinical microorganisms and requires their isolation and culture on trypticase soy agar at 27°C. Since many isolates are unable to grow under these restrictive conditions, the system does not lend itself to identification of some environmental organisms. A more applicable methodology for environmental microbial analysis is based on the liquid extraction and separation of microbial lipids from environmental samples, followed by quantitative analysis using gas chromatography.

  17. Improved Methods for Capture, Extraction, and Quantitative Assay of Environmental DNA from Asian Bigheaded Carp (Hypophthalmichthys spp.)

    PubMed Central

    Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207
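    The detection probabilities reported above (94.8% for qPCR vs. 4.2% for endpoint PCR) can be compared with simple binomial confidence intervals. A sketch using the Wilson score interval and hypothetical replicate counts chosen to mimic those proportions (the counts are illustrative, not from the study):

    ```python
    import math

    def wilson_interval(successes, trials, z=1.96):
        """95% Wilson score interval for a detection probability."""
        p = successes / trials
        denom = 1.0 + z ** 2 / trials
        centre = (p + z ** 2 / (2 * trials)) / denom
        half = z * math.sqrt(p * (1 - p) / trials
                             + z ** 2 / (4 * trials ** 2)) / denom
        return centre - half, centre + half

    # Hypothetical replicate counts approximating the reported rates
    qpcr = wilson_interval(91, 96)      # ~94.8% detection
    endpoint = wilson_interval(4, 96)   # ~4.2% detection
    ```

    Non-overlapping intervals (the qPCR lower bound exceeding the endpoint upper bound) support the claim that the two assays differ substantially in sensitivity.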

  18. Improved methods for capture, extraction, and quantitative assay of environmental DNA from Asian bigheaded carp (Hypophthalmichthys spp.).

    PubMed

    Turner, Cameron R; Miller, Derryl J; Coyne, Kathryn J; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species.

  19. Single Cell Assay for Analyzing Single Cell Exosome and Endocrine Secretion and Cancer Markers

    NASA Astrophysics Data System (ADS)

    Chiu, Yu-Jui

    To understand the inhomogeneity of cells in biological systems, there is a growing demand for the capability to characterize the properties of individual single cells. Since single-cell studies require continuous monitoring of cell behaviors instead of a snapshot test at a single time point, an effective single-cell assay that can support time-lapse studies in a high-throughput manner is desired. Most currently available single-cell technologies cannot provide environments that sustain cell growth, support proliferation of single cells for appropriate cell types, or allow convenient, non-invasive tests of single-cell behaviors from molecular markers. In this dissertation, I present a highly versatile single-cell assay that can accommodate different cell types, enable easy and efficient single-cell loading and culturing, and be suitable for studying the effects of in-vitro environmental factors in combination with drug screening. The salient features of the assay are the non-invasive collection and surveying of single-cell secretions at different time points and the massively parallel translocation of single cells by user-defined criteria, producing very high compatibility with downstream processes such as single-cell qPCR and sequencing. Above all, the acquired information is quantitative -- for example, one study measures the number of exosomes each single cell secretes over a given time period. Therefore, our single-cell assay provides a convenient, low-cost, and enabling tool for quantitative, time-lapse studies of single-cell properties.

  20. Electronic patient portals: evidence on health outcomes, satisfaction, efficiency, and attitudes: a systematic review.

    PubMed

    Goldzweig, Caroline Lubick; Orshansky, Greg; Paige, Neil M; Towfigh, Ali Alexander; Haggstrom, David A; Miake-Lye, Isomi; Beroes, Jessica M; Shekelle, Paul G

    2013-11-19

    Patient portals tied to provider electronic health record (EHR) systems are increasingly popular. To systematically review the literature reporting the effect of patient portals on clinical care. PubMed and Web of Science searches from 1 January 1990 to 24 January 2013. Hypothesis-testing or quantitative studies of patient portals tethered to a provider EHR that addressed patient outcomes, satisfaction, adherence, efficiency, utilization, attitudes, and patient characteristics, as well as qualitative studies of barriers or facilitators, were included. Two reviewers independently extracted data and addressed discrepancies through consensus discussion. From 6508 titles, 14 randomized, controlled trials; 21 observational, hypothesis-testing studies; 5 quantitative, descriptive studies; and 6 qualitative studies were included. Evidence is mixed about the effect of portals on patient outcomes and satisfaction, although they may be more effective when used with case management. The effect of portals on utilization and efficiency is unclear, although patient race and ethnicity, education level or literacy, and degree of comorbid conditions may influence use. Limited data for most outcomes and an absence of reporting on organizational and provider context and implementation processes. Evidence that patient portals improve health outcomes, cost, or utilization is insufficient. Patient attitudes are generally positive, but more widespread use may require efforts to overcome racial, ethnic, and literacy barriers. Portals represent a new technology with benefits that are still unclear. Better understanding requires studies that include details about context, implementation factors, and cost.

  1. Quantifying Morphological Features of α-U3O8 with Image Analysis for Nuclear Forensics.

    PubMed

    Olsen, Adam M; Richards, Bryony; Schwerdt, Ian; Heffernan, Sean; Lusk, Robert; Smith, Braxton; Jurrus, Elizabeth; Ruggiero, Christy; McDonald, Luther W

    2017-03-07

    Morphological changes in U3O8 based on calcination temperature have been quantified, enabling a morphological feature to serve as a signature of processing history in nuclear forensics. Five separate calcination temperatures were used to synthesize α-U3O8, and each sample was characterized using powder X-ray diffraction (p-XRD) and scanning electron microscopy (SEM). The p-XRD spectra were used to evaluate the purity of the synthesized U-oxide; the morphological analysis for materials (MAMA) software was utilized to quantitatively characterize particle shape and size from the SEM images. Analysis comparing particle attributes, such as particle area, across the temperatures was completed using the Kolmogorov-Smirnov two-sample test (K-S test). The results show a distinct statistical difference between each pair of calcination temperatures. To provide a framework for forensic analysis of an unknown sample, the sample distributions at each temperature were compared to randomly selected distributions (100, 250, 500, and 750 particles) from each synthesized temperature to determine whether they were statistically different. It was found that 750 particles were required to differentiate between all of the synthesized temperatures with a confidence interval of 99.0%. Results from this study provide the first quantitative morphological study of U-oxides and reveal the potential strength of morphological particle analysis in nuclear forensics by providing a framework for more rapid characterization of interdicted uranium oxide samples.
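    The K-S two-sample test used here reduces to the maximum vertical distance between two empirical CDFs. A self-contained sketch on hypothetical particle-area samples (the values are illustrative; the actual analysis used MAMA measurements from SEM images):

    ```python
    def ks_two_sample(a, b):
        """Two-sample Kolmogorov-Smirnov statistic: the maximum
        distance between the empirical CDFs of two samples."""
        a, b = sorted(a), sorted(b)
        pooled = sorted(a + b)
        d = 0.0
        for x in pooled:
            fa = sum(v <= x for v in a) / len(a)
            fb = sum(v <= x for v in b) / len(b)
            d = max(d, abs(fa - fb))
        return d

    # Hypothetical particle areas (um^2) for two calcination temperatures
    low_t  = [0.8, 1.1, 1.3, 1.6, 1.9, 2.2, 2.4, 2.7]
    high_t = [2.0, 2.5, 2.9, 3.3, 3.6, 4.0, 4.4, 4.8]
    d_stat = ks_two_sample(low_t, high_t)
    ```

    A large D statistic relative to its critical value indicates that the two temperature conditions produce statistically distinguishable size distributions, which is the basis of the 750-particle requirement reported in the abstract.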

  2. Measurement and prediction of the thermomechanical response of shape memory alloy hybrid composite beams

    NASA Astrophysics Data System (ADS)

    Davis, Brian; Turner, Travis L.; Seelecke, Stefan

    2005-05-01

    Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons within the beam structures. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical analysis/design tool. The experimental investigation is refined by a more thorough test procedure and incorporation of higher fidelity measurements such as infrared thermography and projection moire interferometry. The numerical results are produced by a recently commercialized version of the constitutive model as implemented in ABAQUS and are refined by incorporation of additional measured parameters such as geometric imperfection. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. The results demonstrate the effectiveness of SMAHC structures in controlling static and dynamic responses by adaptive stiffening. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.

  3. Measurement and Prediction of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams

    NASA Technical Reports Server (NTRS)

    Davis, Brian; Turner, Travis L.; Seelecke, Stefan

    2005-01-01

    Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons within the beam structures. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical analysis/design tool. The experimental investigation is refined by a more thorough test procedure and incorporation of higher fidelity measurements such as infrared thermography and projection moire interferometry. The numerical results are produced by a recently commercialized version of the constitutive model as implemented in ABAQUS and are refined by incorporation of additional measured parameters such as geometric imperfection. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. The results demonstrate the effectiveness of SMAHC structures in controlling static and dynamic responses by adaptive stiffening. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.

  4. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C.; Kuske, Cheryl R.; Mullen, Kenneth I.

    2002-01-01

    A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.
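    The dilution procedure in the claim can be read as a loop that walks down a serial dilution until fluorescence falls in strict proportion to the dilution factor, i.e., attenuation by the contaminant has ceased. A rough sketch of that stopping rule with hypothetical intensities for a 2-fold series (the function, tolerance, and data are illustrative assumptions, not the patent's procedure verbatim):

    ```python
    def threshold_dilution(intensities, dilution_factor=2.0, tol=0.05):
        """Return the index of the first dilution whose fluorescence
        drops in proportion to the dilution factor, indicating that
        attenuation has ceased; None if no such dilution is found."""
        for i in range(len(intensities) - 1):
            ratio = intensities[i] / intensities[i + 1]
            if abs(ratio - dilution_factor) / dilution_factor < tol:
                return i
        return None

    # Hypothetical 2-fold series: attenuation suppresses the first
    # readings, then intensity halves cleanly with each dilution
    series = [120.0, 90.0, 60.0, 30.0, 15.0]
    idx = threshold_dilution(series)
    ```

    The absorbance of the solution found this way is the "threshold absorbance" against which unknown samples from a similar environment are matched.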

  5. Using confidence intervals to evaluate the focus alignment of spectrograph detector arrays.

    PubMed

    Sawyer, Travis W; Hawkins, Kyle S; Damento, Michael

    2017-06-20

    High-resolution spectrographs extract detailed spectral information of a sample and are frequently used in astronomy, laser-induced breakdown spectroscopy, and Raman spectroscopy. These instruments employ dispersive elements such as prisms and diffraction gratings to spatially separate different wavelengths of light, which are then detected by a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) detector array. Precise alignment along the optical axis (focus position) of the detector array is critical to maximize the instrumental resolution; however, traditional approaches of scanning the detector through focus lack a quantitative measure of precision, limiting the repeatability and relying on one's experience. Here we propose a method to evaluate the focus alignment of spectrograph detector arrays by establishing confidence intervals to measure the alignment precision. We show that propagation of uncertainty can be used to estimate the variance in an alignment, thus providing a quantitative and repeatable means to evaluate the precision and confidence of an alignment. We test the approach by aligning the detector array of a prototype miniature echelle spectrograph. The results indicate that the procedure effectively quantifies alignment precision, enabling one to objectively determine when an alignment has reached an acceptable level. This quantitative approach also provides a foundation for further optimization, including automated alignment. Furthermore, the procedure introduced here can be extended to other alignment techniques that rely on numerically fitting data to a model, providing a general framework for evaluating the precision of alignment methods.
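    One concrete way to obtain such confidence intervals is to fit a parabola to a through-focus sharpness scan and propagate the fit covariance into the vertex position, which is the best-focus estimate. This sketch uses NumPy's polynomial fit with hypothetical scan data (the metric values, z range, and function name are illustrative assumptions, not the authors' implementation):

    ```python
    import numpy as np

    def focus_confidence(z, metric, z_crit=1.96):
        """Fit a parabola to a through-focus sharpness scan and
        propagate the fit covariance into a confidence half-width
        on the best-focus position z* = -b / (2a)."""
        coeffs, cov = np.polyfit(z, metric, 2, cov=True)
        a, b, _ = coeffs
        z_star = -b / (2 * a)
        # Jacobian of z* with respect to the fit parameters (a, b)
        grad = np.array([b / (2 * a ** 2), -1 / (2 * a)])
        var = grad @ cov[:2, :2] @ grad
        return z_star, z_crit * np.sqrt(var)

    # Hypothetical detector scan: position (mm) vs. sharpness metric
    z = np.array([-0.3, -0.2, -0.1, 0.0, 0.1, 0.2, 0.3])
    m = np.array([0.52, 0.71, 0.84, 0.90, 0.86, 0.70, 0.50])
    best, half_width = focus_confidence(z, m)
    ```

    A narrow half-width relative to the instrument's depth of focus gives an objective criterion for declaring the alignment finished, which is the paper's central point.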

  6. 76 FR 38719 - Interim Notice of Funding Availability for the Department of Transportation's National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-01

    ... emissions, (applicants are encouraged to provide quantitative information regarding expected reductions in...). Applicants are encouraged to provide quantitative information that validates the existence of substantial... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...

  7. Predictive values of semi-quantitative procalcitonin test and common biomarkers for the clinical outcomes of community-acquired pneumonia.

    PubMed

    Ugajin, Motoi; Yamaki, Kenichi; Hirasawa, Natsuko; Yagi, Takeo

    2014-04-01

    The semi-quantitative serum procalcitonin test (Brahms PCT-Q) is available conveniently in clinical practice. However, there are few data on the relationship between results for this semi-quantitative procalcitonin test and clinical outcomes of community-acquired pneumonia (CAP). We investigated the usefulness of this procalcitonin test for predicting the clinical outcomes of CAP in comparison with severity scoring systems and the blood urea nitrogen/serum albumin (B/A) ratio, which has been reported to be a simple but reliable prognostic indicator in our prior CAP study. This retrospective study included data from subjects who were hospitalized for CAP from August 2010 through October 2012 and who were administered the semi-quantitative serum procalcitonin test on admission. The demographic characteristics; laboratory biomarkers; microbiological test results; Pneumonia Severity Index scores; confusion, urea nitrogen, breathing frequency, blood pressure, ≥ 65 years of age (CURB-65) scale scores; and age, dehydration, respiratory failure, orientation disturbance, pressure (A-DROP) scale scores on hospital admission were retrieved from their medical charts. The outcomes were mortality within 28 days of hospital admission and the need for intensive care. Of the 213 subjects with CAP who were enrolled in the study, 20 died within 28 days of hospital admission, and 32 required intensive care. Mortality did not differ significantly among subjects with different semi-quantitative serum procalcitonin levels; however, subjects with serum procalcitonin levels ≥ 10.0 ng/mL were more likely to require intensive care than those with lower levels (P < .001). The elevation of semi-quantitative serum procalcitonin levels was more frequently observed in subjects with proven etiology, especially pneumococcal pneumonia. 
Using the receiver operating characteristic curves for mortality, the area under the curve was 0.86 for Pneumonia Severity Index class, 0.81 for B/A ratio, 0.81 for A-DROP, 0.80 for CURB-65, and 0.57 for semi-quantitative procalcitonin test. The semi-quantitative serum procalcitonin level on hospital admission was less predictive of mortality from CAP compared with the B/A ratio. However, the subjects with serum procalcitonin levels ≥ 10.0 ng/mL were more likely to require intensive care than those with lower levels.
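    The areas under the ROC curves reported above can be computed without constructing the curve, via the Mann-Whitney interpretation of the AUC. A sketch with hypothetical B/A ratios for non-survivors and survivors (the values are invented for illustration):

    ```python
    def auc(scores_pos, scores_neg):
        """Area under the ROC curve via the Mann-Whitney statistic:
        the probability that a randomly chosen positive case scores
        higher than a randomly chosen negative case (ties count 0.5)."""
        wins = 0.0
        for p in scores_pos:
            for n in scores_neg:
                if p > n:
                    wins += 1.0
                elif p == n:
                    wins += 0.5
        return wins / (len(scores_pos) * len(scores_neg))

    # Hypothetical B/A ratios for non-survivors vs. survivors
    died = [9.8, 12.4, 7.1, 15.0, 8.3]
    lived = [3.2, 5.0, 4.1, 6.8, 2.9, 7.1, 3.8]
    score = auc(died, lived)
    ```

    An AUC near 0.5 (as reported for the semi-quantitative procalcitonin test) means the marker barely separates the outcome groups, whereas values near 0.8 or above indicate useful discrimination.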

  8. 34 CFR 668.146 - Criteria for approving tests.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... basic verbal and quantitative skills and general learned abilities; (2) Sample the major content domains of secondary school level verbal and quantitative skills with sufficient numbers of questions to— (i... Educational and Psychological Testing, prepared by a joint committee of the American Educational Research...

  9. 34 CFR 668.146 - Criteria for approving tests.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... basic verbal and quantitative skills and general learned abilities; (2) Sample the major content domains of secondary school level verbal and quantitative skills with sufficient numbers of questions to— (i... Educational and Psychological Testing, prepared by a joint committee of the American Educational Research...

  10. 34 CFR 668.146 - Criteria for approving tests.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... basic verbal and quantitative skills and general learned abilities; (2) Sample the major content domains of secondary school level verbal and quantitative skills with sufficient numbers of questions to— (i... Educational and Psychological Testing, prepared by a joint committee of the American Educational Research...

  11. 34 CFR 668.146 - Criteria for approving tests.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... basic verbal and quantitative skills and general learned abilities; (2) Sample the major content domains of secondary school level verbal and quantitative skills with sufficient numbers of questions to— (i... Educational and Psychological Testing, prepared by a joint committee of the American Educational Research...

  12. Quantitative genetic analysis of the bTB diagnostic single intradermal comparative cervical test (SICCT).

    PubMed

    Tsairidou, Smaragda; Brotherstone, Susan; Coffey, Mike; Bishop, Stephen C; Woolliams, John A

    2016-11-24

    Bovine tuberculosis (bTB) is a disease of significant economic importance and is a persistent animal health problem with implications for public health worldwide. Control of bTB in the UK has relied on diagnosis through the single intradermal comparative cervical test (SICCT). However, limitations in the sensitivity of this test hinder successful eradication and the control of bTB remains a major challenge. Genetic selection for cattle that are more resistant to bTB infection can assist in bTB control. The aim of this study was to conduct a quantitative genetic analysis of SICCT measurements collected during bTB herd testing. Genetic selection for bTB resistance will be partially informed by SICCT-based diagnosis; therefore it is important to know whether, in addition to increasing bTB resistance, this might also alter genetically the epidemiological characteristics of SICCT. Our main findings are that: (1) the SICCT test is robust at the genetic level, since its hierarchy and comparative nature provide substantial protection against random genetic changes that arise from genetic drift and from correlated responses among its components due to either natural or artificial selection; (2) the comparative nature of SICCT provides effective control for initial skin thickness and age-dependent differences; and (3) continuous variation in SICCT is only lowly heritable and has a weak correlation with SICCT positivity among healthy animals which was not significantly different from zero (P > 0.05). These emerging results demonstrate that genetic selection for bTB resistance is unlikely to change the probability of correctly identifying non-infected animals, i.e. the test's specificity, while reducing the overall number of cases. 
This study cannot exclude all theoretical risks of selection for resistance to bTB infection, but the role of SICCT in disease control is unlikely to be rapidly undermined; any adverse correlated responses are expected to be weak and slow, allowing them to be monitored and managed.

  13. TRAC-PF1/MOD1 pretest predictions of MIST experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyack, B.E.; Steiner, J.L.; Siebe, D.A.

    Los Alamos National Laboratory is a participant in the Integral System Test (IST) program initiated in June 1983 to provide integral system test data on specific issues and phenomena relevant to post small-break loss-of-coolant accidents (SBLOCAs) in Babcock and Wilcox plant designs. The Multi-Loop Integral System Test (MIST) facility is the largest single component in the IST program. During Fiscal Year 1986, Los Alamos performed five MIST pretest analyses. The five experiments were chosen on the basis of their potential either to approach the facility limits or to challenge the predictive capability of the TRAC-PF1/MOD1 code. Three SBLOCA tests were examined, which included nominal test conditions, throttled auxiliary feedwater and asymmetric steam-generator cooldown, and reduced high-pressure-injection (HPI) capacity, respectively. Also analyzed were two "feed-and-bleed" cooling tests with reduced HPI and delayed HPI initiation. Results of the tests showed that the MIST facility limits would not be approached in the five tests considered. Early comparisons with preliminary test data indicate that the TRAC-PF1/MOD1 code is correctly calculating the dominant phenomena occurring in the MIST facility during the tests. Posttest analyses are planned to provide a quantitative assessment of the code's ability to predict MIST transients.

  14. Comparative evaluation of two quantitative test methods for determining the efficacy of liquid sporicides and sterilants on a hard surface: a precollaborative study.

    PubMed

    Tomasino, Stephen F; Hamilton, Martin A

    2007-01-01

    Two quantitative carrier-based test methods for determining the efficacy of liquid sporicides and sterilants on a hard surface, the Standard Quantitative Carrier Test Method-ASTM E 2111-00 and an adaptation of a quantitative micro-method as reported by Sagripanti and Bonifacino, were compared in this study. The methods were selected based on their desirable characteristics (e.g., well-developed protocol, previous use with spores, fully quantitative, and use of readily available equipment) for testing liquid sporicides and sterilants on a hard surface. In this paper, the Sagripanti-Bonifacino procedure is referred to as the Three Step Method (TSM). AOAC Official Method 966.04 was included in this study as a reference method. Three laboratories participated in the evaluation. Three chemical treatments were tested: (1) 3000 ppm sodium hypochlorite with pH adjusted to 7.0, (2) a hydrogen peroxide/peroxyacetic acid product, and (3) 3000 ppm sodium hypochlorite with pH unadjusted (pH of approximately 10.0). A fourth treatment, 6000 ppm sodium hypochlorite solution with pH adjusted to 7.0, was included only for Method 966.04 as a positive control (high level of efficacy). The contact time was 10 min for all chemical treatments except the 6000 ppm sodium hypochlorite treatment which was tested at 30 min. Each chemical treatment was tested 3 times using each of the methods. Only 2 of the laboratories performed the AOAC method. Method performance was assessed by the within-laboratory variance, between-laboratory variance, and total variance associated with the log reduction (LR) estimates generated by each quantitative method. The quantitative methods performed similarly, and the LR values generated by each method were not statistically different for the 3 treatments evaluated. Based on feedback from the participating laboratories, compared to the TSM, ASTM E 2111-00 was more resource demanding and required more set-up time. 
The logistical and resource concerns identified for ASTM E 2111-00 were largely associated with the filtration process and counting bacterial colonies on filters. Thus, the TSM was determined to be the most suitable method.
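    The log reduction (LR) estimates and the within-/between-laboratory variance split used to compare the methods above can be sketched as follows. This is a minimal illustration assuming a balanced design (equal replicates per laboratory) and hypothetical CFU counts; it is not the participating laboratories' actual analysis code.

```python
import math
from statistics import mean, variance

def log_reduction(control_cfu, treated_cfu):
    """Log10 reduction: log10(control count) - log10(treated count)."""
    return math.log10(control_cfu) - math.log10(treated_cfu)

def variance_components(lr_by_lab):
    """One-way ANOVA-style split of LR variability into within- and
    between-laboratory components (balanced design assumed).
    lr_by_lab: one list of replicate LR values per laboratory."""
    lab_means = [mean(lrs) for lrs in lr_by_lab]
    within = mean(variance(lrs) for lrs in lr_by_lab)  # avg within-lab variance
    n = len(lr_by_lab[0])                              # replicates per lab
    between = max(variance(lab_means) - within / n, 0.0)
    return within, between, within + between
```

    A treatment reducing 10^6 CFU to 10^3 CFU gives an LR of 3; the total variance (within plus between) is the reproducibility measure the study reports.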

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waters, M.D.; Stack, H.F.; Garrett, N.E.

    A graphic approach, termed a Genetic Activity Profile (GAP), was developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each chemical. Either the lowest effective dose or highest ineffective dose is recorded for each agent and bioassay. Up to 200 different test systems are represented across the GAP. Bioassay systems are organized according to the phylogeny of the test organisms and the end points of genetic activity. The methodology for producing and evaluating genetic activity profiles was developed in collaboration with the International Agency for Research on Cancer (IARC). Data on individual chemicals were compiled by IARC and by the US Environmental Protection Agency (EPA). Data are available on 343 compounds selected from volumes 1-53 of the IARC Monographs and on 115 compounds identified as Superfund Priority Substances. Software to display the GAPs on an IBM-compatible personal computer is available from the authors. Structurally similar compounds frequently display qualitatively and quantitatively similar profiles of genetic activity. Through examination of the patterns of GAPs of pairs and groups of chemicals, it is possible to make more informed decisions regarding the selection of test batteries to be used in evaluation of chemical analogs. GAPs provided useful data for development of weight-of-evidence hazard ranking schemes. Also, some knowledge of the potential genetic activity of complex environmental mixtures may be gained from an assessment of the genetic activity profiles of component chemicals. The fundamental techniques and computer programs devised for the GAP database may be used to develop similar databases in other disciplines. 36 refs., 2 figs.
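    The lowest-effective-dose / highest-ineffective-dose bookkeeping behind each GAP cell can be sketched as below. The function name `gap_entry` and the sign convention for marking inactive entries are illustrative assumptions, not the IARC/EPA software's actual interface.

```python
def gap_entry(results):
    """One Genetic Activity Profile cell for a single chemical/assay pair.

    results: list of (dose, was_positive) tuples from that bioassay.
    Returns the lowest effective dose (LED) if any result was positive;
    otherwise the highest ineffective dose (HID), sign-flagged negative
    here to mark inactivity (an illustrative convention).
    """
    effective = [dose for dose, positive in results if positive]
    if effective:
        return min(effective)                       # LED
    return -max(dose for dose, _ in results)        # HID, flagged inactive
```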

  16. Shear Strength and Interfacial Toughness Characterization of Sapphire-Epoxy Interfaces for Nacre-Inspired Composites.

    PubMed

    Behr, Sebastian; Jungblut, Laura; Swain, Michael V; Schneider, Gerold A

    2016-10-12

    The common tensile lap-shear test for adhesive joints is inappropriate for brittle substrates such as glasses or ceramics, where stress intensifications due to clamping and additional bending moments invalidate results. Nevertheless, bonding of glasses and ceramics is still important in display applications for electronics, in safety glass and ballistic armor, for dental braces and restoratives, and in recently developed bioinspired composites. To mechanically characterize adhesive bondings in these fields nonetheless, a novel approach based on the so-called Schwickerath test for dental sintered joints is used. This new method not only matches data from conventional analysis but also uniquely combines the accurate determination of interfacial shear strength and toughness in one simple test. The approach is verified for sapphire-epoxy joints that are of interest for bioinspired composites. For these, the procedure not only provides quantitative interfacial properties for the first time, it also suggests annealing of sapphire at 1000 °C for 10 h as a mechanically and economically effective improvement of the interfacial bond strength and toughness. With increases of strength and toughness from approximately 8 to 29 MPa and from 2.6 to 35 J/m², respectively, this thermal modification drastically enhances the properties of unmodified sapphire-epoxy interfaces. At the same time, it is much more convenient than wet-chemical approaches such as silanization. Hence, besides the introduction of a new testing procedure for adhesive joints of brittle or expensive substrates, a new and facile annealing process for improving the adhesive properties of sapphire is suggested, and quantitative data for the mechanical properties of sapphire-epoxy interfaces common in synthetic nacre-inspired composites are provided for the first time.

  17. Pill testing or drug checking in Australia: Acceptability of service design features.

    PubMed

    Barratt, Monica J; Bruno, Raimondo; Ezard, Nadine; Ritter, Alison

    2018-02-01

    This study aimed to determine design features of a drug-checking service that would be feasible, attractive and likely to be used by Australian festival and nightlife attendees. Web survey of 851 Australians reporting use of psychostimulants and/or hallucinogens and attendance at licensed venues past midnight and/or festivals in the past year (70% male; median age 23 years). A drug-checking service located at festivals or clubs would be used by 94%; a fixed-site service external to such events by 85%. Most (80%) were willing to wait an hour for their result. Almost all (94%) would not use a service if there was a possibility of arrest, and a majority (64%) would not use a service that did not provide individual feedback of results. Drug-checking results were only slightly more attractive if they provided comprehensive quantitative results compared with qualitative results of key ingredients. Most (93%) were willing to pay up to $5, and 68% up to $10, per test. One-third (33%) reported willingness to donate a whole dose for testing: they were more likely to be male, younger, less experienced, use drugs more frequently and attend venues/festivals less frequently. In this sample, festival- or club-based drug-checking services with low wait times and low cost appear broadly attractive under conditions of legal amnesty and individualised feedback. Quantitative analysis of ecstasy pills requiring surrender of a whole pill may appeal to a minority in Australia where pills are more expensive than elsewhere. [Barratt MJ, Bruno R, Ezard N, Ritter A. Pill testing or drug checking in Australia: Acceptability of service design features. Drug Alcohol Rev 2017;00:000-000]. © 2017 Australasian Professional Society on Alcohol and other Drugs.

  18. The Assay Guidance Manual: Quantitative Biology and Pharmacology in Preclinical Drug Discovery.

    PubMed

    Coussens, Nathan P; Sittampalam, G Sitta; Guha, Rajarshi; Brimacombe, Kyle; Grossman, Abigail; Chung, Thomas D Y; Weidner, Jeffrey R; Riss, Terry; Trask, O Joseph; Auld, Douglas; Dahlin, Jayme L; Devanaryan, Viswanath; Foley, Timothy L; McGee, James; Kahl, Steven D; Kales, Stephen C; Arkin, Michelle; Baell, Jonathan; Bejcek, Bruce; Gal-Edd, Neely; Glicksman, Marcie; Haas, Joseph V; Iversen, Philip W; Hoeppner, Marilu; Lathrop, Stacy; Sayers, Eric; Liu, Hanguan; Trawick, Bart; McVey, Julie; Lemmon, Vance P; Li, Zhuyin; McManus, Owen; Minor, Lisa; Napper, Andrew; Wildey, Mary Jo; Pacifici, Robert; Chin, William W; Xia, Menghang; Xu, Xin; Lal-Nag, Madhu; Hall, Matthew D; Michael, Sam; Inglese, James; Simeonov, Anton; Austin, Christopher P

    2018-06-07

    The Assay Guidance Manual (AGM) is an eBook of best practices for the design, development, and implementation of robust assays for early drug discovery. Initiated by pharmaceutical company scientists, the manual provides guidance for designing a "testing funnel" of assays to identify genuine hits using high-throughput screening (HTS) and advancing them through pre-clinical development. Combined with a workshop/tutorial component, the overall goal of the AGM is to provide a valuable resource for training translational scientists. This article is protected by copyright. All rights reserved.

  19. Space-Based Gravitational-Wave Observations as Tools for Testing General Relativity

    NASA Technical Reports Server (NTRS)

    Will, Clifford M.

    2004-01-01

    We continued a project to analyse the ways in which detection and study of gravitational waves could provide quantitative tests of general relativity, with particular emphasis on waves that would be detectable by space-based observatories such as LISA. This work had three foci: 1) Tests of scalar-tensor theories of gravity that could be done by analyzing gravitational waves from neutron stars inspiralling into massive black holes, as detectable by LISA; 2) Study of alternative theories of gravity in which the graviton could be massive, and of how gravitational-wave observations by space-based detectors, solar-system tests, and cosmological observations could constrain such theories; and 3) Study of gravitational-radiation back reaction of particles orbiting black holes in general relativity, with emphasis on the effects of spin.

  20. Detecting molecular forms of antithrombin by LC-MRM-MS: defining the measurands.

    PubMed

    Ruhaak, L Renee; Romijn, Fred P H T M; Smit, Nico P M; van der Laarse, Arnoud; Pieterse, Mervin M; de Maat, Moniek P M; Haas, Fred J L M; Kluft, Cornelis; Amiral, Jean; Meijer, Piet; Cobbaert, Christa M

    2018-05-01

    Antithrombin (AT) is a critical regulator of coagulation, and its overall activity is typically measured using functional tests. A large number of molecular forms of AT have been identified and each individual carries multiple molecular proteoforms representing variable activities. Conventional functional tests are completely blind for these proteoforms. A method that ensures properly defined measurands for AT is therefore needed. We here assess whether mass spectrometry technology, in particular multiple reaction monitoring (MRM), is suitable for the quantification of AT and the qualitative detection of its molecular proteoforms. Plasma proteins were denatured, reduced and alkylated prior to enzymatic digestion. MRM transitions were developed towards tryptic peptides and glycopeptides using AT purified from human plasma. For each peptide, three transitions were measured, and stable isotope-labeled peptides were used for quantitation. Completeness of digestion was assessed using digestion time curves. MRM transitions were developed for 19 tryptic peptides and 4 glycopeptides. Two peptides, FDTISEK and FATTFYQHLADSK, were used for quantitation, and using a calibration curve of isolated AT in 40 g/L human serum albumin, CVs below 3.5% were obtained for FDTISEK, whereas CVs below 8% were obtained for FATTFYQHLADSK. Of the 26 important AT mutations, 20 can be identified using this method, while altered glycosylation profiles can also be detected. We here show the feasibility of the liquid chromatography multiple reaction monitoring mass spectrometry (LC-MRM-MS) technique for the quantitation of AT and the qualitative analysis of most of its molecular proteoforms. Knowing the measurands will enable standardization of AT tests by providing in-depth information on the molecular proteoforms of AT.
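    The replicate CVs reported above (below 3.5% for FDTISEK, below 8% for FATTFYQHLADSK) are coefficients of variation of the usual form; a minimal sketch, assuming the sample standard deviation is used:

```python
from statistics import mean, stdev

def cv_percent(replicates):
    """Coefficient of variation (%) across replicate peptide measurements:
    100 * (sample standard deviation) / mean."""
    return 100.0 * stdev(replicates) / mean(replicates)
```

    For example, replicate peak areas of 9, 10, and 11 give a CV of 10%.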

  1. The effects of morphine on the temporal structure of Wistar rat behavioral response to pain in hot-plate.

    PubMed

    Casarrubea, Maurizio; Faulisi, Fabiana; Magnusson, Magnus S; Crescimanno, Giuseppe

    2016-08-01

    Most research on the hot-plate test has been carried out using quantitative assessments. However, the evaluation of the relationships among the different elements that compose the behavioral response to pain requires different approaches. Although previous studies have provided clear information on the behavioral structure of the response, no data are available on its temporal structure. The objective of this study was to investigate the temporal structure of the behavioral response to pain in Wistar rats tested in hot-plate and how this structure was influenced by morphine-induced analgesia. The behavior of four groups of subjects tested in hot-plate, one administered saline and three administered different doses (3, 6, 12 mg/kg) of morphine IP, was analyzed by means of quantitative and t-pattern analyses. The latter is a multivariate technique able to detect the existence of statistically significant temporal relationships among behavioral events in time. A clear-cut influence of morphine on quantitative parameters of the response to the noxious stimulation was observed. T-pattern analysis evidenced profound structural changes of behavior. Twenty-four different t-patterns were identified following saline, whereas a dose-dependent reduction was observed following morphine. Such a reduction was accompanied by a decrease of the total amount of t-patterns detected. Morphine, by reducing the effects of the noxious stimulation, orients animal behavior prevalently toward exploratory t-patterns. In addition, it is suggested that the temporal structure of the response is very quickly organized and adapted to environmental noxious cues.

  2. Screening and prevention of neonatal glucose 6-phosphate dehydrogenase deficiency in Guangzhou, China.

    PubMed

    Jiang, J; Li, B; Cao, W; Jiang, X; Jia, X; Chen, Q; Wu, J

    2014-06-09

    We aimed to summarize the results of the screening protocol and prevention of neonatal glucose 6-phosphate dehydrogenase (G6PD) deficiency during a 22-year-long period to provide a basis of reference for the screening of this disease. A total of 1,705,569 newborn subjects in Guangzhou City were screened for this deficiency. Specimens were collected according to the conventional method of specimen acquisition for "newborn dried bloodspot screening", preserved, and inspected. The specimens were studied with the fluorescent spot test and a quantitative fluorescence assay. Diagnosis was performed using the modified NBT G6PD/6PGD ratio method. Bloodspot filter paper specimens were sent to the laboratory within 24 h via EMS Express, and the G6PD test was performed on the same day. The G6PD deficiency-positive rate was 4.2% in the samples screened using the fluorescent spot test, while it was 5% with the quantitative fluorescence assay. Neonatal screening for G6PD deficiency in 11,437 cases (6117 boys and 5320 girls) showed positive results in 481 cases. A total of 420 cases (318 boys and 102 girls) of G6PD deficiency were confirmed with the modified Duchenne NBT ratio method. The total detection rate was 3.7% (5.2% for boys and 1.9% for girls). Quantitative fluorescence assay improved the sensitivity and detection rate. Accelerating the speed of sample delivery by using Internet network systems and ensuring online availability of screening results can aid the screening and diagnosis of this deficiency within 1 week of birth.

  3. Education in a Devolved Scotland: A Quantitative Analysis. Report to the Economic and Social Research Council. CEP Special Paper No. 30

    ERIC Educational Resources Information Center

    Machin, Stephen; McNally, Sandra; Wyness, Gill

    2013-01-01

    Education is an area that is highly devolved in the UK, and the fact that all four constituent countries have pursued very different policies in the recent past provides a good testing ground to undertake a comparative review of the merits or otherwise of the education reforms that have taken place. There is, of course, an important policy context…

  4. Modulating Wnt Signaling Pathway to Enhance Allograft Integration in Orthopedic Trauma Treatment

    DTIC Science & Technology

    2013-10-01

    presented below. Quantitative output provides an extensive set of data but we have chosen to present the most relevant parameters that are reflected in...multiple parameters .  Most samples have been mechanically tested and data extracted for multiple parameters .  Histological evaluation of subset of...Sumner, D. R. Saline Irrigation Does Not Affect Bone Formation or Fixation Strength of Hydroxyapatite /Tricalcium Phosphate-Coated Implants in a Rat Model

  5. Effects of Pulsed and CW (Continuous Wave) 2450 MHz Radiation on Transformation and Chromosomes of Human Lymphocytes in vitro

    DTIC Science & Technology

    1989-12-15

    conditions of these experiments. In order to provide reliable quantitative data on exposure, a system with automated dosimetry was developed, and tested...exposure system and dosimetry, and (2) studies on lymphocyte cultures, and (3) conclusions. EXPOSURE SYSTEM AND DOSIMETRY Description of the Exposure... System The experiments planned in this project necessitated the design and assembly of an exposure system that would meet several engineering

  6. KSC01pp0797

    NASA Image and Video Library

    2001-04-12

    At Astrotech, Titusville, Fla., the GOES-M (Geostationary Operational Environmental Satellite) satellite is tilted on a workstand so that workers can remove the rest of the protective cover. The GOES-M provides weather imagery and quantitative sounding data used to support weather forecasting, severe storm tracking and meteorological research. The satellite will undergo testing at Astrotech before its scheduled launch July 12 on an Atlas-IIA booster with a Centaur upper stage from Cape Canaveral Air Force Station.

  7. Clinical evaluation of tuberculosis viability microscopy for assessing treatment response.

    PubMed

    Datta, Sumona; Sherman, Jonathan M; Bravard, Marjory A; Valencia, Teresa; Gilman, Robert H; Evans, Carlton A

    2015-04-15

    It is difficult to determine whether early tuberculosis treatment is effective in reducing the infectiousness of patients' sputum, because culture takes weeks and conventional acid-fast sputum microscopy and molecular tests cannot differentiate live from dead tuberculosis. To assess treatment response, sputum samples (n=124) from unselected patients (n=35) with sputum microscopy-positive tuberculosis were tested pretreatment and after 3, 6, and 9 days of empiric first-line therapy. Tuberculosis quantitative viability microscopy with fluorescein diacetate, quantitative culture, and acid-fast auramine microscopy were all performed in triplicate. Tuberculosis quantitative viability microscopy predicted quantitative culture results such that 76% of results agreed within ±1 logarithm (rS=0.85; P<.0001). In 31 patients with non-multidrug-resistant (MDR) tuberculosis, viability and quantitative culture results approximately halved (both 0.27 log reduction, P<.001) daily. For patients with non-MDR tuberculosis and available data, by treatment day 9 there was a >10-fold reduction in viability in 100% (24/24) of cases and quantitative culture in 95% (19/20) of cases. Four other patients subsequently found to have MDR tuberculosis had no significant changes in viability (P=.4) or quantitative culture (P=.6) results during early treatment. The change in viability and quantitative culture results during early treatment differed significantly between patients with non-MDR tuberculosis and those with MDR tuberculosis (both P<.001). Acid-fast microscopy results changed little during early treatment, and this change was similar for non-MDR tuberculosis vs MDR tuberculosis (P=.6). Tuberculosis quantitative viability microscopy is a simple test that within 1 hour predicted quantitative culture results that became available weeks later, rapidly indicating whether patients were responding to tuberculosis therapy. © The Author 2014. 
Published by Oxford University Press on behalf of the Infectious Diseases Society of America.
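    The headline comparison above, the percentage of paired results agreeing within ±1 log10 and a Spearman rank correlation, can be sketched as follows. The helper names and data values are hypothetical; the study's actual statistical code is not part of this record.

```python
def ranks(xs):
    """Rank values (1-based), averaging ranks for ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def within_one_log(micro_log10, culture_log10):
    """Percent of paired log10 results agreeing within +/- 1 logarithm."""
    agree = sum(1 for m, c in zip(micro_log10, culture_log10)
                if abs(m - c) <= 1.0)
    return 100.0 * agree / len(micro_log10)
```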

  8. Quantitative thermal sensory testing -- value of testing for both cold and warm sensation detection in evaluation of small fiber neuropathy.

    PubMed

    Shukla, Garima; Bhatia, Manvir; Behari, Madhuri

    2005-10-01

    Small fiber neuropathy is a common neurological disorder, often missed or ignored by physicians, since examination and routine nerve conduction studies are usually normal in this condition. Many methods including quantitative thermal sensory testing are currently being used for early detection of this condition, so as to enable timely investigation and treatment. This study was conducted to assess the yield of quantitative thermal sensory testing in the diagnosis of small fiber neuropathy. We included patients presenting with history suggestive of positive and/or negative sensory symptoms, with normal examination findings, clinically suggestive of small fiber neuropathy, with normal or minimally abnormal routine nerve conduction studies. These patients were subjected to quantitative thermal sensory testing using a Medoc TSA-II Neurosensory analyser at two sites and for two modalities. QST data were compared with those in 120 normal healthy controls. Twenty-five patients (16 males, 9 females) with mean age 46.8 ± 16.6 years (range: 21-75 years) were included in the study. The mean duration of symptoms was 1.6 ± 1.6 years (range: 3 months-6 years). Eighteen patients (72%) had abnormal thresholds in at least one modality. Thermal thresholds were normal in 7 out of the 25 patients. This study demonstrates that quantitative thermal sensory testing is a fairly sensitive method for detection of small fiber neuropathy, especially in patients with normal routine nerve conduction studies.
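    A common way to flag a QST threshold as abnormal is to compare it against normative control data; a minimal sketch assuming a simple z-score criterion (the 2.5 SD cutoff here is an illustrative assumption, not the study's stated method):

```python
from statistics import mean, stdev

def is_abnormal(threshold, control_values, z_cut=2.5):
    """Flag a detection threshold as abnormal when it lies more than
    z_cut standard deviations above the control mean (a higher threshold
    means worse detection)."""
    m, s = mean(control_values), stdev(control_values)
    return (threshold - m) / s > z_cut
```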

  9. 75 FR 30460 - Notice of Funding Availability for the Department of Transportation's National Infrastructure...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-01

    ... provide quantitative information regarding expected reductions in emissions of CO 2 or fuel consumption as... provide quantitative information that validates the existence of substantial transportation-related costs... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...

  10. The Oral Minimal Model Method

    PubMed Central

    Cobelli, Claudio; Dalla Man, Chiara; Toffolo, Gianna; Basu, Rita; Vella, Adrian; Rizza, Robert

    2014-01-01

    The simultaneous assessment of insulin action, secretion, and hepatic extraction is key to understanding postprandial glucose metabolism in nondiabetic and diabetic humans. We review the oral minimal model method (i.e., models that allow the estimation of insulin sensitivity, β-cell responsivity, and hepatic insulin extraction from a mixed-meal or an oral glucose tolerance test). Both of these oral tests are more physiologic and simpler to administer than those based on an intravenous test (e.g., a glucose clamp or an intravenous glucose tolerance test). The focus of this review is on indices provided by physiological-based models and their validation against the glucose clamp technique. We discuss first the oral minimal model method rationale, data, and protocols. Then we present the three minimal models and the indices they provide. The disposition index paradigm, a widely used β-cell function metric, is revisited in the context of individual versus population modeling. Adding a glucose tracer to the oral dose significantly enhances the assessment of insulin action by segregating insulin sensitivity into its glucose disposal and hepatic components. The oral minimal model method, by quantitatively portraying the complex relationships between the major players of glucose metabolism, is able to provide novel insights regarding the regulation of postprandial metabolism. PMID:24651807

  11. Cryogenic Pressure Control Modeling for Ellipsoidal Space Tanks

    NASA Technical Reports Server (NTRS)

    Lopez, Alfredo; Grayson, Gary D.; Chandler, Frank O.; Hastings, Leon J.; Heyadat, Ali

    2007-01-01

    A computational fluid dynamics (CFD) model is developed to simulate pressure control of an ellipsoidal-shaped liquid hydrogen tank under external heating in normal gravity. Pressure control is provided by an axial jet thermodynamic vent system (TVS) centered within the vessel that injects cooler liquid into the tank, mixing the contents and reducing tank pressure. The two-phase cryogenic tank model considers liquid hydrogen in its own vapor with liquid density varying with temperature only and a fully compressible ullage. The axisymmetric model is developed using a custom version of the commercially available FLOW-3D software. Quantitative model validation is provided by engineering checkout tests performed at Marshall Space Flight Center in 1999 in support of the Solar Thermal Upper Stage Technology Demonstrator (STUSTD) program. The engineering checkout tests provide cryogenic tank self-pressurization test data at various heat leaks and tank fill levels. The predicted self-pressurization rates and ullage and liquid temperatures at discrete locations within the STUSTD tank are in good agreement with test data. The work presented here advances current CFD modeling capabilities for cryogenic pressure control and helps develop a low-cost CFD-based design process for space hardware.

  12. International Standards and Reference Materials for Quantitative Molecular Infectious Disease Testing

    PubMed Central

    Madej, Roberta M.; Davis, Jack; Holden, Marcia J.; Kwang, Stan; Labourier, Emmanuel; Schneider, George J.

    2010-01-01

    The utility of quantitative molecular diagnostics for patient management depends on the ability to relate patient results to prior results or to absolute values in clinical practice guidelines. To do this, those results need to be comparable across time and methods, either by producing the same value across methods and test versions or by using reliable and stable conversions. Universally available standards and reference materials specific to quantitative molecular technologies are critical to this process but are few in number. This review describes recent history in the establishment of international standards for nucleic acid test development, organizations involved in current efforts, and future issues and initiatives. PMID:20075208

  13. Stroke onset time estimation from multispectral quantitative magnetic resonance imaging in a rat model of focal permanent cerebral ischemia.

    PubMed

    McGarry, Bryony L; Rogers, Harriet J; Knight, Michael J; Jokivarsi, Kimmo T; Sierra, Alejandra; Gröhn, Olli Hj; Kauppinen, Risto A

    2016-08-01

    Quantitative T2 relaxation magnetic resonance imaging allows estimation of stroke onset time. We aimed to examine the accuracy of quantitative T1 and quantitative T2 relaxation times, alone and in combination, in estimating stroke onset time in a rat model of permanent focal cerebral ischemia, and to map the spatial distribution of elevated quantitative T1 and quantitative T2 to assess tissue status. Permanent middle cerebral artery occlusion was induced in Wistar rats. Animals were scanned at 9.4T for quantitative T1, quantitative T2, and Trace of Diffusion Tensor (Dav) up to 4 h post-middle cerebral artery occlusion. Time courses of differentials of quantitative T1 and quantitative T2 in ischemic and non-ischemic contralateral brain tissue (ΔT1, ΔT2) and volumes of tissue with elevated T1 and T2 relaxation times (f1, f2) were determined. TTC staining was used to highlight permanent ischemic damage. ΔT1, ΔT2, f1, f2, and the volume of tissue with both elevated quantitative T1 and quantitative T2 (V(Overlap)) increased with time post-middle cerebral artery occlusion, allowing stroke onset time to be estimated. V(Overlap) provided the most accurate estimate, with an uncertainty of ±25 min. At all time-points, regions with elevated relaxation times were smaller than areas with Dav-defined ischemia. Stroke onset time can be determined from quantitative T1 and quantitative T2 relaxation times and tissue volumes. Combining quantitative T1 and quantitative T2 provides the most accurate estimate and potentially identifies irreversibly damaged brain tissue. © 2016 World Stroke Organization.
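    The onset-time estimation rests on the growth of ΔT2 (and ΔT1) with time post-occlusion: fit a calibration curve on animals with known occlusion times, then invert it for a new case. A minimal linear sketch with hypothetical numbers (the record does not state that the relationship is linear, so treat this as illustrative only):

```python
def fit_line(ts, dT2):
    """Least-squares fit dT2 = a*t + b on calibration animals with
    known times t (minutes) post-occlusion."""
    n = len(ts)
    mt, md = sum(ts) / n, sum(dT2) / n
    a = (sum((t - mt) * (d - md) for t, d in zip(ts, dT2)) /
         sum((t - mt) ** 2 for t in ts))
    b = md - a * mt
    return a, b

def estimate_onset_minutes(dT2_now, a, b):
    """Invert the calibration line to estimate elapsed time since onset."""
    return (dT2_now - b) / a
```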

  14. 76 FR 6796 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-08

    ... perform the ``Quantitative Survey of Physician Practices in Laboratory Test Ordering and Interpretation... technology. Written comments should be received within 60 days of this notice. Proposed Project Quantitative Survey of Physician Practices in Laboratory Test Ordering and Interpretation-NEW-the Office of...

  15. Assessing signal-to-noise in quantitative proteomics: multivariate statistical analysis in DIGE experiments.

    PubMed

    Friedman, David B

    2012-01-01

    All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable the distinction between a relevant biological signal from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
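    A PCA of the sample-by-feature abundance matrix, as used above to expose outliers and technical variation, can be sketched via the singular value decomposition. This is a minimal version for illustration, not the DIGE analysis software's own implementation:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples (rows) onto the top principal components.
    X: samples x features matrix of, e.g., spot abundances."""
    Xc = X - X.mean(axis=0)                  # centre each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    explained = S ** 2 / (S ** 2).sum()      # variance explained per PC
    return scores, explained[:n_components]
```

    Plotting the first two score columns typically shows biological replicates clustering by condition; a sample far from its group flags an outlier or a fouled sample.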

  16. Determination of selected azaarenes in water by bonded-phase extraction and liquid chromatography

    USGS Publications Warehouse

    Steinheimer, T.R.; Ondrus, M.G.

    1986-01-01

    A method for the rapid and simple quantitative determination of quinoline, isoquinoline, and five selected three-ring azaarenes in water has been developed. The azaarene fraction is separated from its carbon analogues on n-octadecyl packing material by elution with acidified water/acetonitrile. Concentration as great as 1000-fold is achieved readily. Instrumental analysis involves high-speed liquid chromatography on flexible-walled, wide-bore columns with fluorescence and ultraviolet detection at several wavelengths employing filter photometers in series. Method-validation data are provided as azaarene recovery efficiency from fortified samples. Distilled water, river water, contaminated ground water, and secondary-treatment effluent have been tested. Recoveries at part-per-billion levels are nearly quantitative for the three-ring compounds, but they decrease for quinoline and isoquinoline. © 1986 American Chemical Society.

  17. From differences in means between cases and controls to risk stratification: a business plan for biomarker development.

    PubMed

    Wentzensen, Nicolas; Wacholder, Sholom

    2013-02-01

    Researchers developing biomarkers for early detection can determine the potential for clinical benefit at early stages of development. We provide the theoretical background showing the quantitative connection between biomarker levels in cases and controls and clinically meaningful risk measures, as well as a spreadsheet for researchers to use in their own research. We provide researchers with tools to decide whether a test is useful, whether it needs technical improvement, whether it may work only in specific populations, or whether any further development is futile. The methods described here apply to any method that aims to estimate risk of disease based on biomarkers, clinical tests, genetics, environment, or behavior. Many efforts go into futile biomarker development and premature clinical testing. In many instances, predictions for translational success or failure can be made early, simply based on critical analysis of case–control data. Our article presents well-established theory in a form that can be appreciated by biomarker researchers. Furthermore, we provide an interactive spreadsheet that links biomarker performance with specific disease characteristics to evaluate the promise of biomarker candidates at an early stage.
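    The quantitative connection the authors describe, from case/control biomarker distributions to absolute risk, follows from Bayes' theorem. A minimal sketch assuming Gaussian biomarker distributions in cases and controls (an illustrative assumption; the authors' spreadsheet may model this differently):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

def posterior_risk(x, prevalence, mu_case, sd_case, mu_ctrl, sd_ctrl):
    """Bayes: disease risk at biomarker level x, given the disease
    prevalence and the biomarker distributions in cases and controls."""
    fc = normal_pdf(x, mu_case, sd_case)
    f0 = normal_pdf(x, mu_ctrl, sd_ctrl)
    num = prevalence * fc
    return num / (num + (1 - prevalence) * f0)
```

    This makes the article's point concrete: a biomarker with well-separated case/control means can still yield low absolute risk when prevalence is low, which is visible directly from the formula.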

  18. Quantitative assessment of RNA-protein interactions with high-throughput sequencing-RNA affinity profiling.

    PubMed

    Ozer, Abdullah; Tome, Jacob M; Friedman, Robin C; Gheba, Dan; Schroth, Gary P; Lis, John T

    2015-08-01

    Because RNA-protein interactions have a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay that couples sequencing on an Illumina GAIIx genome analyzer with the quantitative assessment of protein-RNA interactions. This assay is able to analyze interactions between one or possibly several proteins with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of the EGFP and negative elongation factor subunit E (NELF-E) proteins with their corresponding canonical and mutant RNA aptamers. Here we provide a detailed protocol for HiTS-RAP that can be completed in about a month (8 d hands-on time). This includes the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, HiTS and protein binding with a GAIIx instrument, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, quantitative analysis of RNA on a massively parallel array (RNA-MaP) and RNA Bind-n-Seq (RBNS), for quantitative analysis of RNA-protein interactions.
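    The kind of quantitative readout such assays produce is typically an equilibrium binding curve, from which a dissociation constant (Kd) is fit. A generic, stdlib-only illustration of that step (a simple one-site binding model with a grid-search fit; not the HiTS-RAP analysis pipeline, and all concentrations hypothetical):

    ```python
    def fraction_bound(conc: float, kd: float) -> float:
        """One-site binding: fraction of RNA bound at protein concentration `conc`."""
        return conc / (conc + kd)

    def fit_kd(concs, fractions, kd_grid):
        """Least-squares fit of Kd over a candidate grid (stdlib-only sketch)."""
        def sse(kd):
            return sum((fraction_bound(c, kd) - f) ** 2
                       for c, f in zip(concs, fractions))
        return min(kd_grid, key=sse)

    # Hypothetical titration generated from a true Kd of 50 nM
    # (noise-free for clarity).
    concs = [5, 10, 25, 50, 100, 250, 500]              # nM
    obs = [fraction_bound(c, 50.0) for c in concs]
    grid = [k / 10 for k in range(10, 2001)]            # 1.0 .. 200.0 nM
    print(fit_kd(concs, obs, grid))  # → 50.0
    ```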

  19. Quantitative multi-pinhole small-animal SPECT: uniform versus non-uniform Chang attenuation correction.

    PubMed

    Wu, C; de Jong, J R; Gratama van Andel, H A; van der Have, F; Vastenhouw, B; Laverman, P; Boerman, O C; Dierckx, R A J O; Beekman, F J

    2011-09-21

    Attenuation of photon flux on trajectories between the source and pinhole apertures affects the quantitative accuracy of reconstructed single-photon emission computed tomography (SPECT) images. We propose a Chang-based non-uniform attenuation correction (NUA-CT) for small-animal SPECT/CT with focusing pinhole collimation, and compare the quantitative accuracy with uniform Chang correction based on (i) body outlines extracted from x-ray CT (UA-CT) and (ii) hand-drawn body contours on the images obtained with three integrated optical cameras (UA-BC). Measurements in phantoms and rats containing known activities of isotopes were conducted for evaluation. In (125)I, (201)Tl, (99m)Tc and (111)In phantom experiments, average relative errors compared with the gold standards measured in a dose calibrator were reduced to 5.5%, 6.8%, 4.9% and 2.8%, respectively, with NUA-CT. In animal studies, these errors were 2.1%, 3.3%, 2.0% and 2.0%, respectively. Differences in accuracy on average between results of NUA-CT, UA-CT and UA-BC were less than 2.3% in phantom studies and 3.1% in animal studies, except for (125)I (3.6% and 5.1%, respectively). All methods tested provide reasonable attenuation correction and result in high quantitative accuracy. NUA-CT shows superior accuracy except for (125)I, where other factors may have more impact on the quantitative accuracy than the selected attenuation correction.
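    The uniform variants build on Chang's classic first-order correction: each voxel is scaled by the reciprocal of its mean transmission over all emission directions. A minimal 2D sketch for a point inside a uniform disk attenuator (an idealized body outline; the paper's implementation is more involved):

    ```python
    import math

    def chang_factor(x, y, radius, mu, n_angles=360):
        """First-order Chang correction factor for a point (x, y) inside a
        uniform disk of the given radius and linear attenuation coefficient
        mu (per cm): the reciprocal of the mean transmission exp(-mu * L)
        over rays leaving the point in all directions."""
        total = 0.0
        for i in range(n_angles):
            theta = 2.0 * math.pi * i / n_angles
            dx, dy = math.cos(theta), math.sin(theta)
            b = x * dx + y * dy
            # Distance along (dx, dy) from (x, y) to the disk boundary.
            path = -b + math.sqrt(b * b - (x * x + y * y - radius * radius))
            total += math.exp(-mu * path)
        return n_angles / total

    # At the center every path length equals the radius, so the factor
    # reduces to exp(mu * radius); points near the edge need less correction.
    print(round(chang_factor(0.0, 0.0, 5.0, 0.15), 4))  # → 2.117 (= exp(0.75))
    ```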

  20. Porous Silicon Antibody Microarrays for Quantitative Analysis: Measurement of Free and Total PSA in Clinical Plasma Samples

    PubMed Central

    Tojo, Axel; Malm, Johan; Marko-Varga, György; Lilja, Hans; Laurell, Thomas

    2014-01-01

    Antibody microarrays have become widespread, but their use for quantitative analyses in clinical samples has not yet been established. We investigated an immunoassay based on nanoporous silicon antibody microarrays for quantification of total prostate-specific antigen (PSA) in 80 clinical plasma samples, and provide quantitative data from a duplex microarray assay that simultaneously quantifies free and total PSA in plasma. To further develop the assay, the porous silicon chips were placed into a standard 96-well microtiter plate for higher-throughput analysis. The samples analyzed by this quantitative microarray were 80 plasma samples obtained from men undergoing clinical PSA testing (dynamic range: 0.14-44 ng/ml, LOD: 0.14 ng/ml). The second dataset, measuring free PSA (dynamic range: 0.40-74.9 ng/ml, LOD: 0.47 ng/ml) and total PSA (dynamic range: 0.87-295 ng/ml, LOD: 0.76 ng/ml), was also obtained from the clinical routine. The reference for the quantification was a commercially available assay, the ProStatus PSA Free/Total DELFIA. In an analysis of 80 plasma samples, the microarray platform performs well across the range of total PSA levels. This assay might have the potential to substitute for the large-scale microtiter plate format in diagnostic applications. The duplex assay paves the way for a future quantitative multiplex assay that analyses several prostate cancer biomarkers simultaneously. PMID:22921878
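    Limits of detection like those quoted above are commonly estimated by one standard convention: the concentration whose signal lies three blank standard deviations above the mean blank signal, converted via the calibration slope. A stdlib-only sketch with hypothetical readings (not values from the study):

    ```python
    import statistics

    def limit_of_detection(blank_signals, calibration_slope):
        """LOD by the common 3-sigma-of-the-blank convention.

        Returns (lod_concentration, cutoff_signal), where the cutoff is the
        signal three blank SDs above the mean blank."""
        mean_blank = statistics.mean(blank_signals)
        sd_blank = statistics.stdev(blank_signals)
        return 3.0 * sd_blank / calibration_slope, mean_blank + 3.0 * sd_blank

    # Hypothetical blank-spot readings and calibration slope
    # (signal units per ng/ml).
    blanks = [102.0, 98.5, 101.2, 99.8, 100.5]
    lod_conc, cutoff_signal = limit_of_detection(blanks, calibration_slope=30.0)
    ```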

Top