Sample records for benefits quantification methodology

  1. Precision and accuracy of clinical quantification of myocardial blood flow by dynamic PET: A technical perspective.

    PubMed

    Moody, Jonathan B; Lee, Benjamin C; Corbett, James R; Ficaro, Edward P; Murthy, Venkatesh L

    2015-10-01

    A number of exciting advances in PET/CT technology and improvements in methodology have recently converged to enhance the feasibility of routine clinical quantification of myocardial blood flow and flow reserve. Recent promising clinical results are pointing toward an important role for myocardial blood flow in the care of patients. Absolute blood flow quantification can be a powerful clinical tool, but its utility will depend on maintaining precision and accuracy in the face of numerous potential sources of methodological errors. Here we review recent data and highlight the impact of PET instrumentation, image reconstruction, and quantification methods, and we emphasize ⁸²Rb cardiac PET, which currently has the widest clinical application. It will be apparent that more data are needed, particularly in relation to newer PET technologies, as well as clinical standardization of PET protocols and methods. We provide recommendations for the methodological factors considered here. At present, myocardial flow reserve appears to be remarkably robust to various methodological errors; however, with greater attention to and more detailed understanding of these sources of error, the clinical benefits of stress-only blood flow measurement may eventually be more fully realized.

  2. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    PubMed Central

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

    This article reviews novel quantification concepts where elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC–ICP-MS) and employed for quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. In the first sections, an overview of general aspects of biomolecule quantification, as well as of labelling, will be presented, emphasizing the potential that lies in such methodological approaches. In this context, ICP-MS as a detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). The fundamental methodology of elemental labelling will be highlighted, and analytical as well as biomedical applications will be presented. A special focus will lie on established applications, underlining the benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research made in this field will be summarized and a perspective for future developments, including sophisticated and innovative applications, will be given. PMID:23062431

  3. Quantification of pelvic floor muscle strength in female urinary incontinence: A systematic review and comparison of contemporary methodologies.

    PubMed

    Deegan, Emily G; Stothers, Lynn; Kavanagh, Alex; Macnab, Andrew J

    2018-01-01

    There remains no gold standard for quantification of voluntary pelvic floor muscle (PFM) strength, despite international guidelines that recommend PFM assessment in females with urinary incontinence (UI). Methods currently reported for quantification of skeletal muscle strength across disciplines are systematically reviewed and their relevance for clinical and academic use related to the pelvic floor is described. A systematic review was conducted via Medline, PubMed, CINAHL, and the Cochrane database; key terms for pelvic floor anatomy and function were cross-referenced with skeletal muscle strength quantification from 1946 to 2016. Full-text peer-reviewed articles in English with female subjects with incontinence were identified. Each study was analyzed for use of controls, type of methodology as direct or indirect measures, and the benefits and limitations of the technique. A total of 1586 articles were identified, of which 50 met the inclusion criteria. Nine methodologies of determining PFM strength were described, including: digital palpation, perineometry, dynamometry, EMG, vaginal cones, ultrasonography, magnetic resonance imaging, the urine stream interruption test, and the Colpexin pull test. Thirty-two percent lacked a control group. Technical refinements in both direct and indirect instrumentation for PFM strength measurement are improving sensitivity. However, the most common methods of quantification remain digital palpation and perineometry; techniques that pose limitations and yield subjective or indirect measures of muscular strength. Dynamometry has potential as an accurate and sensitive tool, but is limited by an inability to assess PFM strength during dynamic movements. © 2017 Wiley Periodicals, Inc.

  4. Methodological problems in the method used by IQWiG within early benefit assessment of new pharmaceuticals in Germany.

    PubMed

    Herpers, Matthias; Dintsios, Charalabos-Markos

    2018-04-25

    The decision matrix applied by the Institute for Quality and Efficiency in Health Care (IQWiG) for the quantification of added benefit within the early benefit assessment of new pharmaceuticals in Germany, with its nine fields, is quite complex and could be simplified. Furthermore, the method used by IQWiG is subject to manifold criticism: (1) it implicitly weights endpoints differently in its assessments, favoring overall survival and, thereby, drug interventions in fatal diseases; (2) it assumes that two pivotal trials are available when assessing the dossiers submitted by the pharmaceutical manufacturers, leading to far-reaching implications with respect to the quantification of added benefit; and (3) it bases the evaluation primarily on dichotomous endpoints, leading to a loss of usable evidence. The aim was to investigate whether this criticism is justified and to propose methodological adaptations, based on an analysis of the dossiers available up to the end of 2016 using statistical tests, multinomial logistic regression, and simulations. It was shown that, due to power losses, the method does not ensure that results are statistically valid and outcomes of the early benefit assessment may be compromised, though the evidence on favoring overall survival remains unclear. Modifications of the IQWiG method are possible, however, to address the identified problems. By converging with the approach of approval authorities for confirmatory endpoints, the decision matrix could be simplified and the analysis method could be improved, to put the results on a more valid statistical basis.

  5. Quantification of health benefits related with reduction of atmospheric PM₁₀ levels: implementation of population mobility approach.

    PubMed

    Tchepel, Oxana; Dias, Daniela

    2011-06-01

    This study is focused on the assessment of potential health benefits of meeting the air quality limit values (2008/50/EC) for short-term PM₁₀ exposure. For this purpose, the WHO methodology for Health Impact Assessment and the APHEIS guidelines for data collection were applied to the Porto Metropolitan Area, Portugal. Additionally, an improved methodology using population mobility data is proposed in this work to analyse the number of persons exposed. In order to obtain representative background concentrations, an innovative approach to processing air quality time series was implemented. The results provide the number of attributable cases prevented annually by reducing PM₁₀ concentrations. An intercomparison of two approaches to processing input data for the health risk analysis provides information on the sensitivity of the applied methodology. The findings highlight the importance of taking into account the spatial variability of air pollution levels and population mobility in health impact assessment.
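
The attributable-cases arithmetic used in WHO-style health impact assessments like this one can be sketched in a few lines; the relative risk, concentration change, baseline rate, and population below are placeholder values for illustration only, not figures from the Porto study.

```python
import math

def attributable_cases(rr_per_10ug, delta_c, baseline_rate, population):
    """Annual cases attributable to a PM10 increment delta_c (ug/m3).

    rr_per_10ug  : relative risk per 10 ug/m3 increment (from epidemiology)
    delta_c      : concentration reduction achievable by meeting the limit value
    baseline_rate: baseline incidence of the health outcome (cases per person-year)
    population   : number of exposed persons
    """
    # Scale the relative risk to the actual concentration change (log-linear CRF)
    rr = math.exp(math.log(rr_per_10ug) / 10.0 * delta_c)
    attributable_fraction = (rr - 1.0) / rr
    return attributable_fraction * baseline_rate * population

# Placeholder numbers for illustration only
print(attributable_cases(rr_per_10ug=1.006, delta_c=15.0,
                         baseline_rate=0.008, population=500_000))
```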

  6. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    NASA Astrophysics Data System (ADS)

    Seebauer, Matthias

    2014-03-01

    The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate how well existing GHG quantification tools comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After conducting a cluster analysis to identify different farm typologies, GHG quantification was carried out using the VCS SALM methodology complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation both in the magnitude of the estimated GHG emissions per ha between different smallholder farm typologies and in the emissions estimated by applying two different accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals, and that the mitigation benefits range between 4 and 6.5 tCO₂ ha⁻¹ yr⁻¹, with significantly different mitigation benefits depending on the typologies of the crop-livestock systems, their different agricultural practices, and the adoption rates of improved practices. However, the inherent uncertainty related to the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty related to activity data, the assessment confirms the high variability within different farm types as well as between different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms.
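
At its core, farm-scale GHG accounting of this kind multiplies activity data by emission factors and sums over sources; the sketch below illustrates the idea with invented emission factors and activity data, not values from the VCS SALM methodology or the Cool Farm Tool.

```python
# Minimal activity-data x emission-factor sketch (placeholder values only)
EMISSION_FACTORS = {           # tCO2e per unit of activity, illustrative
    "dairy_cow": 1.9,          # per head per year (enteric + manure)
    "fertilizer_n_kg": 0.0057, # per kg N applied
    "residue_burning_ha": 0.5, # per hectare burned
}

def farm_emissions(activity_data):
    """Sum emissions for one farm from an activity-data dictionary."""
    return sum(EMISSION_FACTORS[k] * v for k, v in activity_data.items())

baseline_2009 = {"dairy_cow": 3, "fertilizer_n_kg": 40, "residue_burning_ha": 1.0}
salm_2011     = {"dairy_cow": 3, "fertilizer_n_kg": 25, "residue_burning_ha": 0.0}

mitigation = farm_emissions(baseline_2009) - farm_emissions(salm_2011)
print(f"Estimated mitigation benefit: {mitigation:.2f} tCO2e per farm per year")
```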

  7. PET/MRI for neurologic applications.

    PubMed

    Catana, Ciprian; Drzezga, Alexander; Heiss, Wolf-Dieter; Rosen, Bruce R

    2012-12-01

    PET and MRI provide complementary information in the study of the human brain. Simultaneous PET/MRI data acquisition allows the spatial and temporal correlation of the measured signals, creating opportunities impossible to realize using stand-alone instruments. This paper reviews the methodologic improvements and potential neurologic and psychiatric applications of this novel technology. We first present methods for improving the performance and information content of each modality by using the information provided by the other technique. On the PET side, we discuss methods that use the simultaneously acquired MRI data to improve the PET data quantification. On the MRI side, we present how improved PET quantification can be used to validate several MRI techniques. Finally, we describe promising research, translational, and clinical applications that can benefit from these advanced tools.

  8. PET/MRI for Neurological Applications

    PubMed Central

    Catana, Ciprian; Drzezga, Alexander; Heiss, Wolf-Dieter; Rosen, Bruce R.

    2013-01-01

    PET and MRI provide complementary information in the study of the human brain. Simultaneous PET/MR data acquisition allows the spatial and temporal correlation of the measured signals, opening up opportunities impossible to realize using stand-alone instruments. This paper reviews the methodological improvements and potential neurological and psychiatric applications of this novel technology. We first present methods for improving the performance and information content of each modality by using the information provided by the other technique. On the PET side, we discuss methods that use the simultaneously acquired MR data to improve the PET data quantification. On the MR side, we present how improved PET quantification could be used to validate a number of MR techniques. Finally, we describe promising research, translational and clinical applications that could benefit from these advanced tools. PMID:23143086

  9. Advanced Technologies and Methodology for Automated Ultrasonic Testing Systems Quantification

    DOT National Transportation Integrated Search

    2011-04-29

    For automated ultrasonic testing (AUT) detection and sizing accuracy, this program developed a methodology for quantification of AUT systems, advancing and quantifying AUT systems' image-capture capabilities, quantifying the performance of multiple AUT...

  10. A systematic methodology for the robust quantification of energy efficiency at wastewater treatment plants featuring Data Envelopment Analysis.

    PubMed

    Longo, S; Hospido, A; Lema, J M; Mauricio-Iglesias, M

    2018-05-10

    This article examines the potential benefits of using Data Envelopment Analysis (DEA) for conducting energy-efficiency assessments of wastewater treatment plants (WWTPs). WWTPs are characteristically heterogeneous (in size, technology, climate, function, etc.), which limits the correct application of DEA. This paper proposes and describes, in its various stages, the Robust Energy Efficiency DEA (REED), a systematic state-of-the-art methodology aimed at including exogenous variables in nonparametric frontier models and especially designed for WWTP operation. In particular, the methodology systematizes the modelling process by presenting an integrated framework for selecting the correct variables and appropriate models, possibly tackling the effect of exogenous factors. As a result, the application of REED improves the quality of the efficiency estimates and hence the significance of benchmarking. For the reader's convenience, this article is presented as a step-by-step guideline to guide the user in the determination of WWTP energy efficiency from beginning to end. The application and benefits of the developed methodology are demonstrated by a case study comparing the energy efficiency of a set of 399 WWTPs operating in different countries and under heterogeneous environmental conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
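
As a rough illustration of the DEA building block underlying REED, the sketch below solves the standard input-oriented CCR efficiency linear program with SciPy; the plant inputs and outputs are toy numbers, and the full REED methodology adds variable selection and treatment of exogenous factors that are not shown here.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o`.

    X : (n_dmu, n_inputs)  e.g. specific energy consumption, staff, ...
    Y : (n_dmu, n_outputs) e.g. pollutant load removed, flow treated, ...
    Returns theta in (0, 1]; theta == 1 means the plant lies on the frontier.
    """
    n_dmu, n_in = X.shape
    n_out = Y.shape[1]
    # Decision variables: z = [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n_dmu + 1)
    c[0] = 1.0                                   # minimise theta
    # Input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
    b_in = np.zeros(n_in)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((n_out, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n_dmu,
                  method="highs")
    return res.x[0]

# Toy data: 4 plants, inputs = kWh/m3 and staff, output = kg COD removed per m3
X = np.array([[0.45, 5], [0.30, 4], [0.60, 6], [0.35, 3]])
Y = np.array([[0.40], [0.42], [0.38], [0.41]])
for plant in range(len(X)):
    print(plant, round(ccr_input_efficiency(X, Y, plant), 3))
```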

  11. Quantifying construction and demolition waste: An analytical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin

    2014-09-15

    Highlights: • Prevailing C and D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C and D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C and D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria: waste generation activity, estimation level and quantification methodology. Six categories of existing C and D waste quantification methodologies are identified, including the site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.

  12. Electrochemical sensors and biosensors for the analysis of antineoplastic drugs.

    PubMed

    Lima, Handerson Rodrigues Silva; da Silva, Josany Saibrosa; de Oliveira Farias, Emanuel Airton; Teixeira, Paulo Ronaldo Sousa; Eiras, Carla; Nunes, Lívio César Cunha

    2018-06-15

    Cancer is a leading cause of death worldwide, often being treated with antineoplastic drugs that have high potential for toxicity to humans and the environment, even at very low concentrations. Therefore, monitoring these drugs is of utmost importance. Among the techniques used to detect substances at low concentrations, electrochemical sensors and biosensors have been noted for their practicality and low cost. This review brings, for the first time, a simplified outline of the main electrochemical sensors and biosensors developed for the analysis of antineoplastic drugs. The drugs analyzed and the methodology used for electrochemical sensing are described, as are the techniques used for drug quantification and the analytical performance of each sensor, highlighting the limit of detection (LOD), as well as the linear range of quantification (LR) for each system. Finally, we present a technological prospection on the development and use of electrochemical sensors and biosensors in the quantification of antineoplastic drugs. A search of international patent databases revealed no patents currently submitted under this topic, suggesting this is an area to be further explored. We also show that the use of these systems has been gaining prominence in recent years, and that the quantification of antineoplastic drugs using electrochemical techniques could bring great financial and health benefits. Copyright © 2018. Published by Elsevier B.V.

  13. A DDDAS Framework for Volcanic Ash Propagation and Hazard Analysis

    DTIC Science & Technology

    2012-01-01

    Polynomial bases are chosen according to the probability distribution of the input variables (for example, Hermite polynomials for normally distributed parameters, or Legendre polynomials for uniformly distributed parameters); these parameters and windfields will drive our simulations. We will use an uncertainty quantification methodology, polynomial chaos quadrature in combination with data integration, to complete the DDDAS loop.

  14. Immunohistochemistry as an Important Tool in Biomarkers Detection and Clinical Practice

    PubMed Central

    de Matos, Leandro Luongo; Trufelli, Damila Cristina; de Matos, Maria Graciela Luongo; da Silva Pinhal, Maria Aparecida

    2010-01-01

    The immunohistochemistry technique is used in the search for cell or tissue antigens that range from amino acids and proteins to infectious agents and specific cellular populations. The technique comprises two phases: (1) slide preparation and the stages involved in the reaction; (2) interpretation and quantification of the obtained expression. Immunohistochemistry is an important tool for scientific research and also a complementary technique for the elucidation of differential diagnoses which are not determinable by conventional analysis with hematoxylin and eosin. In the last couple of decades there has been an exponential increase in publications on immunohistochemistry and immunocytochemistry techniques. This review covers the immunohistochemistry technique; its history, applications, importance, limitations, difficulties, problems and some aspects related to the interpretation and quantification of results. Future developments of the immunohistochemistry technique and its expression quantification should not be disseminated in two languages, that of the pathologist and that of the clinician or surgeon. The scientific, diagnostic and prognostic applications of this methodology must be explored in a bid to benefit the patient. In order to achieve this goal, a collaboration and pooling of knowledge from both of these valuable medical areas is vital. PMID:20212918

  15. Quantification of the benefits of access management for Kentucky : final report.

    DOT National Transportation Integrated Search

    2006-07-01

    This report describes the benefits quantification performed for the proposed access management plan for Kentucky. This study evaluates the capacity, safety and economic impacts associated with access management programs. The proposed Kentucky access ...

  16. Absolute quantification of olive oil DNA by droplet digital-PCR (ddPCR): Comparison of isolation and amplification methodologies.

    PubMed

    Scollo, Francesco; Egea, Leticia A; Gentile, Alessandra; La Malfa, Stefano; Dorado, Gabriel; Hernandez, Pilar

    2016-12-15

    Olive oil is considered a premium product for its nutritional value and health benefits, and the ability to define its origin and varietal composition is a key step towards ensuring the traceability of the product. However, isolating the DNA from such a matrix is a difficult task. In this study, the quality and quantity of olive oil DNA, isolated using four different DNA isolation protocols, was evaluated using the qRT-PCR and ddPCR techniques. The results indicate that CTAB-based extraction methods were the best for unfiltered oil, while Nucleo Spin-based extraction protocols showed greater overall reproducibility. The use of both qRT-PCR and ddPCR led to the absolute quantification of the DNA copy number. The results clearly demonstrate the importance of the choice of DNA-isolation protocol, which should take into consideration the qualitative aspects of DNA and the evaluation of the amplified DNA copy number. Copyright © 2016 Elsevier Ltd. All rights reserved.
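
Absolute quantification by ddPCR rests on Poisson statistics over the droplet counts; a minimal sketch of that calculation is given below, with an assumed droplet volume that is a typical instrument value rather than one reported in the paper.

```python
import math

def ddpcr_copies_per_ul(n_negative, n_total, droplet_volume_nl=0.85):
    """Absolute target concentration from droplet counts (Poisson correction).

    lambda = -ln(fraction of negative droplets) is the mean copies per droplet;
    dividing by the droplet volume gives copies per microlitre of reaction.
    The 0.85 nl droplet volume is a typical value, not taken from the paper.
    """
    fraction_negative = n_negative / n_total
    lam = -math.log(fraction_negative)          # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)     # copies per microlitre

# Example: 15000 accepted droplets, 14200 of them negative
print(round(ddpcr_copies_per_ul(14_200, 15_000), 1))
```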

  17. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
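
One common choice in this family of loss functions is the inverted normal loss function, which rises smoothly from zero at the process target and saturates at a maximum loss as the monitored variable deviates; the sketch below uses that generic form with placeholder parameters and is not the authors' calibrated model.

```python
import numpy as np

def inverted_normal_loss(y, target, gamma, max_loss):
    """Generic inverted normal loss function (INLF).

    `gamma` controls how quickly the economic loss saturates towards
    `max_loss` as the process variable y deviates from `target`.
    All parameter values in this sketch are placeholders.
    """
    return max_loss * (1.0 - np.exp(-(y - target) ** 2 / (2.0 * gamma ** 2)))

# Map a simulated deviation profile of a process variable to economic loss
deviations = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # e.g. bar above set-point
losses = inverted_normal_loss(deviations, target=0.0, gamma=1.5, max_loss=250_000.0)
total_scenario_loss = np.trapz(losses, dx=1.0)      # crude integration over time steps
print(losses.round(0), round(total_scenario_loss, 0))
```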

  18. Stochastic Analysis and Design of Heterogeneous Microstructural Materials System

    NASA Astrophysics Data System (ADS)

    Xu, Hongyi

    Advanced materials systems refer to new materials that are composed of multiple traditional constituents but have complex microstructure morphologies, which lead to superior properties over conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructural materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of the microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVEs) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling and optimization to enable parametric optimization of the microstructure for achieving the desired material properties. Material informatics is studied to efficiently reduce the dimension of the microstructure design space. This dissertation develops a machine learning-based methodology to identify the key microstructure descriptors that highly impact the properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials systems.

  19. Adapting a standardised international 24 h dietary recall methodology (GloboDiet software) for research and dietary surveillance in Korea.

    PubMed

    Park, Min Kyung; Park, Jin Young; Nicolas, Geneviève; Paik, Hee Young; Kim, Jeongseon; Slimani, Nadia

    2015-06-14

    During the past decades, a rapid nutritional transition has been observed along with economic growth in the Republic of Korea. Since this dramatic change in diet has been frequently associated with cancer and other non-communicable diseases, dietary monitoring is essential to understand the association. Benefiting from pre-existing standardised dietary methodologies, the present study aimed to evaluate the feasibility and describe the development of a Korean version of the international computerised 24 h dietary recall method (GloboDiet software) and its complementary tools, developed at the International Agency for Research on Cancer (IARC), WHO. Following established international Standard Operating Procedures and guidelines, about seventy common and country-specific databases on foods, recipes, dietary supplements, quantification methods and coefficients were customised and translated. The main results of the present study highlight the specific adaptations made to adapt the GloboDiet software for research and dietary surveillance in Korea. New (sub-) subgroups were added into the existing common food classification, and new descriptors were added to the facets to classify and describe specific Korean foods. Quantification methods were critically evaluated and adapted considering the foods and food packages available in the Korean market. Furthermore, a picture book of foods/dishes was prepared including new pictures and food portion sizes relevant to Korean diet. The development of the Korean version of GloboDiet demonstrated that it was possible to adapt the IARC-WHO international dietary tool to an Asian context without compromising its concept of standardisation and software structure. It, thus, confirms that this international dietary methodology, used so far only in Europe, is flexible and robust enough to be customised for other regions worldwide.

  20. Analytical methodologies based on LC-MS/MS for monitoring selected emerging compounds in liquid and solid phases of the sewage sludge.

    PubMed

    Boix, C; Ibáñez, M; Fabregat-Safont, D; Morales, E; Pastor, L; Sancho, J V; Sánchez-Ramírez, J E; Hernández, F

    2016-01-01

    In this work, two analytical methodologies based on liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) were developed for quantification of emerging pollutants identified in sewage sludge after a previous wide-scope screening. The target list included 13 emerging contaminants (EC): thiabendazole, acesulfame, fenofibric acid, valsartan, irbesartan, salicylic acid, diclofenac, carbamazepine, 4-aminoantipyrine (4-AA), 4-acetyl aminoantipyrine (4-AAA), 4-formyl aminoantipyrine (4-FAA), venlafaxine and benzoylecgonine. The aqueous and solid phases of the sewage sludge were analyzed making use of Solid-Phase Extraction (SPE) and UltraSonic Extraction (USE) for sample treatment, respectively. The methods were validated at three concentration levels: 0.2, 2 and 20 μg L⁻¹ for the aqueous phase, and 50, 500 and 2000 μg kg⁻¹ for the solid phase of the sludge. In general, the method was satisfactorily validated, showing good recoveries (70-120%) and precision (RSD < 20%). Regarding the limit of quantification (LOQ), it was below 0.1 μg L⁻¹ in the aqueous phase and below 50 μg kg⁻¹ in the solid phase for the majority of the analytes. The method applicability was tested by analysis of samples from a wider study on degradation of emerging pollutants in sewage sludge under anaerobic digestion. The key benefits of these methodologies are: • SPE and USE are appropriate sample procedures to extract the selected emerging contaminants from the aqueous phase of the sewage sludge and the solid residue. • LC-MS/MS is highly suitable for determining emerging contaminants in both sludge phases. • To our knowledge, the main metabolites of dipyrone had not been studied before in sewage sludge.

  1. Uncertainty quantification for complex systems with very high dimensional response using Grassmann manifold variations

    NASA Astrophysics Data System (ADS)

    Giovanis, D. G.; Shields, M. D.

    2018-07-01

    This paper addresses uncertainty quantification (UQ) for problems where scalar (or low-dimensional vector) response quantities are insufficient and, instead, full-field (very high-dimensional) responses are of interest. To do so, an adaptive stochastic simulation-based methodology is introduced that refines the probability space based on Grassmann manifold variations. The proposed method has a multi-element character discretizing the probability space into simplex elements using a Delaunay triangulation. For every simplex, the high-dimensional solutions corresponding to its vertices (sample points) are projected onto the Grassmann manifold. The pairwise distances between these points are calculated using appropriately defined metrics and the elements with large total distance are sub-sampled and refined. As a result, regions of the probability space that produce significant changes in the full-field solution are accurately resolved. An added benefit is that an approximation of the solution within each element can be obtained by interpolation on the Grassmann manifold. The method is applied to study the probability of shear band formation in a bulk metallic glass using the shear transformation zone theory.
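
The projection and distance computations at the heart of the refinement criterion can be sketched compactly: a thin SVD maps each full-field snapshot matrix to a point on the Grassmann manifold, and the principal angles between two such subspaces give the distance used to rank simplices for refinement. The example below uses random toy data in place of real simulation output.

```python
import numpy as np

def grassmann_point(snapshot_matrix, rank):
    """Project a full-field solution onto the Grassmann manifold.

    The thin SVD of the (space x time) snapshot matrix gives an orthonormal
    basis U whose column span is a point on the Grassmann manifold G(rank, n).
    """
    U, _, _ = np.linalg.svd(snapshot_matrix, full_matrices=False)
    return U[:, :rank]

def grassmann_distance(U1, U2):
    """Geodesic distance from the principal angles between two subspaces."""
    s = np.linalg.svd(U1.T @ U2, compute_uv=False)
    angles = np.arccos(np.clip(s, -1.0, 1.0))
    return np.sqrt(np.sum(angles ** 2))

# Two toy "full-field" solutions at neighbouring sample points
rng = np.random.default_rng(0)
sol_a = rng.standard_normal((1000, 20))
sol_b = sol_a + 0.05 * rng.standard_normal((1000, 20))
d = grassmann_distance(grassmann_point(sol_a, 5), grassmann_point(sol_b, 5))
print(round(d, 4))   # large distances would flag a simplex for sub-sampling
```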

  2. The economic benefits of reducing the levels of nitrogen dioxide (NO2) near primary schools: The case of London.

    PubMed

    Guerriero, Carla; Chatzidiakou, Lia; Cairns, John; Mumovic, Dejan

    2016-10-01

    Providing a healthy school environment is a priority for child health. The aim of this study is to develop a methodology that allows quantification of the potential economic benefit of reducing indoor exposure to nitrogen dioxide (NO2) in children attending primary schools. Using environmental and health data collected in primary schools in London, this study estimates that, on average, 82 asthma exacerbations per school can be averted each year by reducing outdoor NO2 concentrations. The study expands upon previous analyses in two ways: first, it assesses the health benefits of reducing children's exposure to indoor NO2 while at school; second, it considers the children's perspective in the economic evaluation. Using a willingness-to-pay approach, the study quantifies that the monetary benefits of reducing children's indoor NO2 exposure while at school would range from £2.5k per school, if a child's perspective based on the child's budget is adopted, up to £60k if a parent's perspective is considered. This study highlights that designers, engineers, policymakers and stakeholders need to consider the reduction of outdoor pollution, and particularly NO2 levels, near primary schools, as there may be substantial health and monetary benefits. Copyright © 2016. Published by Elsevier Ltd.

  3. Meeting report: Estimating the benefits of reducing hazardous air pollutants--summary of 2009 workshop and future considerations.

    PubMed

    Gwinn, Maureen R; Craig, Jeneva; Axelrad, Daniel A; Cook, Rich; Dockins, Chris; Fann, Neal; Fegley, Robert; Guinnup, David E; Helfand, Gloria; Hubbell, Bryan; Mazur, Sarah L; Palma, Ted; Smith, Roy L; Vandenberg, John; Sonawane, Babasaheb

    2011-01-01

    Quantifying the benefits of reducing hazardous air pollutants (HAPs, or air toxics) has been limited by gaps in toxicological data, uncertainties in extrapolating results from high-dose animal experiments to estimate human effects at lower doses, limited ambient and personal exposure monitoring data, and insufficient economic research to support valuation of the health impacts often associated with exposure to individual air toxics. To address some of these issues, the U.S. Environmental Protection Agency held the Workshop on Estimating the Benefits of Reducing Hazardous Air Pollutants (HAPs) in Washington, DC, from 30 April to 1 May 2009. Experts from multiple disciplines discussed how best to move forward on air toxics benefits assessment, with a focus on developing near-term capability to conduct quantitative benefits assessment. Proposed methodologies involved analysis of data-rich pollutants and application of this analysis to other pollutants, using dose-response modeling of animal data for estimating benefits to humans, determining dose-equivalence relationships for different chemicals with similar health effects, and analysis similar to that used for criteria pollutants. Limitations and uncertainties in economic valuation of benefits assessment for HAPs were discussed as well. These discussions highlighted the complexities in estimating the benefits of reducing air toxics, and participants agreed that alternative methods for benefits assessment of HAPs are needed. Recommendations included clearly defining the key priorities of the Clean Air Act air toxics program to identify the most effective approaches for HAPs benefits analysis, focusing on susceptible and vulnerable populations, and improving dose-response estimation for quantification of benefits.

  4. Targeted Quantification of Isoforms of a Thylakoid-Bound Protein: MRM Method Development.

    PubMed

    Bru-Martínez, Roque; Martínez-Márquez, Ascensión; Morante-Carriel, Jaime; Sellés-Marchart, Susana; Martínez-Esteso, María José; Pineda-Lucas, José Luis; Luque, Ignacio

    2018-01-01

    Targeted mass spectrometric methods such as selected/multiple reaction monitoring (SRM/MRM) have found intense application in protein detection and quantification, competing with classical immunoaffinity techniques. This approach provides a universal procedure to develop a fast, highly specific, sensitive, accurate, and cheap methodology for targeted detection and quantification of proteins based on the direct analysis of their surrogate peptides, typically generated by tryptic digestion. This methodology can be advantageously applied in the field of plant proteomics, and particularly for non-model species, since immunoreagents are scarcely available. Here, we describe the issues to take into consideration in order to develop an MRM method to detect and quantify isoforms of the thylakoid-bound protein polyphenol oxidase from the non-model and database-underrepresented species Eriobotrya japonica Lindl.

  5. A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification

    NASA Astrophysics Data System (ADS)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.

    MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), have demonstrated good performance, but not without drawbacks, as already discussed by the authors. On the other hand, preliminary application of Genetic Algorithms (GAs) to the peak detection problem encountered in MRS quantification using the Voigt line shape model has already been reported in the literature by the authors. This paper investigates a novel constrained genetic algorithm involving a generic and adaptively defined fitness function, which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experimentation on artificial MRS signals interleaved with noise, regarding its signal-fitting capabilities. Although extensive experiments with real-world MRS signals are necessary, the performance shown herein illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.

  6. Characterization of nutraceuticals and functional foods by innovative HPLC methods.

    PubMed

    Corradini, Claudio; Galanti, Roberta; Nicoletti, Isabella

    2002-04-01

    In recent years there has been growing interest in foods and food ingredients which may provide health benefits. Foods, as well as food ingredients containing health-preserving components, are not considered conventional food, but can be defined as functional food. To characterise such foods, as well as nutraceuticals, specific, highly sensitive and reproducible analytical methodologies are needed. In light of this importance we set out to develop innovative HPLC methods employing reversed-phase narrow-bore columns and high-performance anion-exchange chromatographic methods coupled with pulsed amperometric detection (HPAEC-PAD), which are specific for carbohydrate analysis. The developed methods were applied to the separation and quantification of citrus flavonoids and to characterize fructooligosaccharides (FOS) and fructans added to functional foods and nutraceuticals.

  7. Environmental and Sustainability Education Policy Research: A Systematic Review of Methodological and Thematic Trends

    ERIC Educational Resources Information Center

    Aikens, Kathleen; McKenzie, Marcia; Vaughter, Philip

    2016-01-01

    This paper reports on a systematic literature review of policy research in the area of environmental and sustainability education. We analyzed 215 research articles, spanning four decades and representing 71 countries, and which engaged a range of methodologies. Our analysis combines quantification of geographic and methodological trends with…

  8. Validation and evaluation of an HPLC methodology for the quantification of the potent antimitotic compound (+)-discodermolide in the Caribbean marine sponge Discodermia dissoluta.

    PubMed

    Valderrama, Katherine; Castellanos, Leonardo; Zea, Sven

    2010-08-01

    The sponge Discodermia dissoluta is the source of the potent antimitotic compound (+)-discodermolide. The relatively abundant and shallow populations of this sponge in Santa Marta, Colombia, allow for studies to evaluate the natural and biotechnological supply options for (+)-discodermolide. In this work, an RP-HPLC-UV methodology for the quantification of (+)-discodermolide from sponge samples was tested and validated. Our protocol for extracting this compound from the sponge included lyophilization, exhaustive methanol extraction, partitioning using water and dichloromethane, purification of the organic fraction in RP-18 cartridges and finally retrieving the (+)-discodermolide in the methanol-water (80:20 v/v) fraction. This fraction was injected into an HPLC system with an Xterra RP-18 column and a detection wavelength of 235 nm. The calibration curve was linear, making it possible to calculate the limits of detection and quantification in these experiments. The intra-day and inter-day precision showed relative standard deviations lower than 5%. The accuracy, determined as the percentage recovery, was 99.4%. Nine samples of the sponge from the Bahamas, Bonaire, Curaçao and Santa Marta had concentrations of (+)-discodermolide ranging from 5.3 to 29.3 microg/g of wet sponge. This methodology is quick and simple, allowing for quantification in sponges from natural environments, in situ cultures or dissociated cells.
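
The validation quantities reported above (linearity, detection and quantification limits, recovery, RSD) follow standard calibration arithmetic; the sketch below shows that arithmetic on hypothetical calibration data, not the published (+)-discodermolide measurements.

```python
import numpy as np

# Hypothetical calibration data: standard concentration (ug/mL) vs. peak area
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
area = np.array([12.1, 24.5, 48.7, 122.8, 246.0])

slope, intercept = np.polyfit(conc, area, 1)
residual_sd = np.std(area - (slope * conc + intercept), ddof=2)

lod = 3.3 * residual_sd / slope     # ICH-style limit of detection
loq = 10.0 * residual_sd / slope    # limit of quantification

# Accuracy expressed as percentage recovery of a spiked sample
spiked_known = 4.0                               # ug/mL added to the matrix
spiked_measured = (98.5 - intercept) / slope     # back-calculated from its peak area
recovery_pct = 100.0 * spiked_measured / spiked_known

print(f"LOD={lod:.3f} ug/mL, LOQ={loq:.3f} ug/mL, recovery={recovery_pct:.1f}%")
```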

  9. The role of pharmacoeconomics in current Indian healthcare system.

    PubMed

    Ahmad, Akram; Patel, Isha; Parimilakrishnan, Sundararajan; Mohanta, Guru Prasad; Chung, HaeChung; Chang, Jongwha

    2013-01-01

    Pharmacoeconomics can aid policy makers and healthcare providers in decision making by evaluating the affordability of and access to rational drug use. Efficiency is a key concept of pharmacoeconomics, and various strategies are suggested for buying the greatest amount of benefit for a given resource use. Pharmacoeconomic evaluation techniques such as cost-minimization analysis, cost-effectiveness analysis, cost-benefit analysis, and cost-utility analysis, which support identification and quantification of the costs of drugs, are conducted in a similar way, but vary in the measurement of the value of health benefits and outcomes. This article provides a brief overview of pharmacoeconomics, its utility with respect to the Indian pharmaceutical industry, and the expanding insurance system in India. Pharmacoeconomic evidence can be utilized to support decisions on licensing, pricing, reimbursement, and maintenance of the formulary procedure of pharmaceuticals. For insurance companies to provide better facilities at minimum cost, India must develop a platform for pharmacoeconomics with a validated methodology and appropriate training. The role of clinical pharmacists, including PharmD graduates, is expected to be more beneficial than that of conventional pharmacists, as they will be able to apply the principles of economics in daily practice in community and hospital pharmacy.

  10. The role of pharmacoeconomics in current Indian healthcare system

    PubMed Central

    Ahmad, Akram; Patel, Isha; Parimilakrishnan, Sundararajan; Mohanta, Guru Prasad; Chung, HaeChung; Chang, Jongwha

    2013-01-01

    Pharmacoeconomics can aid policy makers and healthcare providers in decision making by evaluating the affordability of and access to rational drug use. Efficiency is a key concept of pharmacoeconomics, and various strategies are suggested for buying the greatest amount of benefit for a given resource use. Pharmacoeconomic evaluation techniques such as cost-minimization analysis, cost-effectiveness analysis, cost-benefit analysis, and cost-utility analysis, which support identification and quantification of the costs of drugs, are conducted in a similar way, but vary in the measurement of the value of health benefits and outcomes. This article provides a brief overview of pharmacoeconomics, its utility with respect to the Indian pharmaceutical industry, and the expanding insurance system in India. Pharmacoeconomic evidence can be utilized to support decisions on licensing, pricing, reimbursement, and maintenance of the formulary procedure of pharmaceuticals. For insurance companies to provide better facilities at minimum cost, India must develop a platform for pharmacoeconomics with a validated methodology and appropriate training. The role of clinical pharmacists, including PharmD graduates, is expected to be more beneficial than that of conventional pharmacists, as they will be able to apply the principles of economics in daily practice in community and hospital pharmacy. PMID:24991597

  11. Methodological aspects of multicenter studies with quantitative PET.

    PubMed

    Boellaard, Ronald

    2011-01-01

    Quantification of whole-body FDG PET studies is affected by many physiological and physical factors. Much of the variability in reported standardized uptake value (SUV) data seen in the literature results from the variability in methodology applied among these studies, i.e., due to the use of different scanners, acquisition and reconstruction settings, region of interest strategies, SUV normalization, and/or corrections methods. To date, the variability in applied methodology prohibits a proper comparison and exchange of quantitative FDG PET data. Consequently, the promising role of quantitative PET has been demonstrated in several monocentric studies, but these published results cannot be used directly as a guideline for clinical (multicenter) trials performed elsewhere. In this chapter, the main causes affecting whole-body FDG PET quantification and strategies to minimize its inter-institute variability are addressed.
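
The SUV values at the center of this variability discussion come from a simple normalization of tissue activity concentration by injected dose per body weight; a minimal sketch with placeholder numbers is shown below, assuming decay correction and scanner cross-calibration (two of the factors the chapter discusses) have already been handled upstream.

```python
def suv_body_weight(roi_activity_kbq_ml, injected_dose_mbq, body_weight_kg):
    """Body-weight-normalised standardized uptake value (SUV).

    SUV = tissue activity concentration / (injected dose / body weight),
    assuming a tissue density of 1 g/mL. All inputs below are placeholders.
    """
    activity_kbq_per_g = roi_activity_kbq_ml                      # 1 mL ~ 1 g tissue
    dose_kbq_per_g = injected_dose_mbq * 1000.0 / (body_weight_kg * 1000.0)
    return activity_kbq_per_g / dose_kbq_per_g

print(round(suv_body_weight(roi_activity_kbq_ml=12.0,
                            injected_dose_mbq=370.0,
                            body_weight_kg=75.0), 2))
```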

  12. Extraction Methodological Contributions Toward Ultra-Performance Liquid Chromatography-Time-of-Flight Mass Spectrometry: Quantification of Free GB from Various Food Matrices

    DTIC Science & Technology

    2016-02-01

    Report ECBC-TR-1351 (Sue Y. Bae and Mark D. Winemiller, Research and Technology Directorate) addresses the quantification of free GB (sarin) in various food matrices and the development of a solid-phase extraction method using a normal-phase silica gel column for...

  13. A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.

    PubMed

    Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

    2014-07-01

    Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware will be introduced that enables the applicability of this methodology for selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and for just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
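
The calibration step of such a PLS-based peak deconvolution can be sketched with scikit-learn: spectra of mixtures with known composition train a regression that later maps each detector spectrum to concentrations of the co-eluting species. The data below are synthetic stand-ins, not the published lysozyme/ribonuclease A/cytochrome c measurements.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic training data: absorbance spectra of calibration mixtures with
# known concentrations (g/L) of three co-eluting proteins.
rng = np.random.default_rng(1)
n_samples, n_wavelengths = 60, 120
pure_spectra = np.abs(rng.standard_normal((3, n_wavelengths)))   # stand-ins for the
concentrations = rng.uniform(0.0, 2.0, size=(n_samples, 3))      # three pure components
spectra = concentrations @ pure_spectra + 0.01 * rng.standard_normal((n_samples, n_wavelengths))

pls = PLSRegression(n_components=3)
pls.fit(spectra, concentrations)

# At run time, each diode-array spectrum recorded during elution is converted
# into the three protein concentrations, giving the deconvoluted chromatogram.
new_spectrum = np.array([[0.8, 0.3, 1.1]]) @ pure_spectra
print(pls.predict(new_spectrum).round(2))
```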

  14. Quantification of flood risk mitigation benefits: A building-scale damage assessment through the RASOR platform.

    PubMed

    Arrighi, Chiara; Rossi, Lauro; Trasforini, Eva; Rudari, Roberto; Ferraris, Luca; Brugioni, Marcello; Franceschini, Serena; Castelli, Fabio

    2018-02-01

    Flood risk mitigation usually requires a significant investment of public resources and cost-effectiveness should be ensured. The assessment of the benefits of hydraulic works requires the quantification of (i) flood risk in the absence of measures, (ii) risk in the presence of mitigation works, and (iii) the investments needed to achieve an acceptable residual risk. In this work a building scale is adopted to estimate direct tangible flood losses to several building classes (e.g. residential, industrial, commercial) and their respective contents, exploiting various sources of public open data in a GIS environment. The impact simulations for assigned flood hazard scenarios are computed through the RASOR platform, which allows for an extensive characterization of the properties and their vulnerability through libraries of stage-damage curves. Recovery and replacement costs are estimated based on insurance data, market values and socio-economic proxies. The methodology is applied to the case study of Florence (Italy), where a system of retention basins upstream of the city is under construction to reduce flood risk. Current flood risk in the study area (70 km²) is about 170 million euros per year, without accounting for people, infrastructure, cultural heritage and vehicles at risk. The monetary investment in the retention basins is paid off in about 5 years. However, the results show that although hydraulic works are cost-effective, a significant residual risk has to be managed and the achievement of the desired level of acceptable risk would require about 1 billion euros of investment. Copyright © 2017 Elsevier Ltd. All rights reserved.
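
Benefit quantification of this kind typically compares expected annual damage (EAD) with and without the mitigation works, where EAD integrates scenario damages over annual exceedance probability; the sketch below shows that integration with placeholder scenario losses, not the Florence figures.

```python
import numpy as np

def expected_annual_damage(return_periods, damages):
    """Integrate simulated damage over annual exceedance probability.

    `return_periods` in years, `damages` the total direct loss simulated for
    each corresponding flood scenario (same currency units).
    """
    aep = 1.0 / np.asarray(return_periods, dtype=float)   # annual exceedance probability
    dmg = np.asarray(damages, dtype=float)
    order = np.argsort(aep)                               # integrate over increasing probability
    return np.trapz(dmg[order], aep[order])

# Placeholder scenario losses (EUR) for the 30-, 100-, 200- and 500-year floods
ead = expected_annual_damage([30, 100, 200, 500],
                             [120e6, 480e6, 750e6, 1_100e6])
print(f"Expected annual damage: {ead / 1e6:.1f} M EUR")
```

Running the same integration on the damages simulated with the retention basins in place, and dividing the investment cost by the difference, gives the payback-time figure quoted in the abstract.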

  15. Application of the NUREG/CR-6850 EPRI/NRC Fire PRA Methodology to a DOE Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tom Elicson; Bentley Harwood; Richard Yorg

    2011-03-01

    The application of the NUREG/CR-6850 EPRI/NRC fire PRA methodology to a DOE facility presented several challenges. This paper documents the process and discusses several insights gained during development of the fire PRA. A brief review of the tasks performed is provided, with particular focus on the following: • Tasks 5 and 14: Fire-induced risk model and fire risk quantification. A key lesson learned was to begin model development and quantification as early as possible in the project, using screening values and simplified modeling if necessary. • Tasks 3 and 9: Fire PRA cable selection and detailed circuit failure analysis. In retrospect, it would have been beneficial to perform the model development and quantification in two phases, with detailed circuit analysis applied during phase 2. This would have allowed for development of a robust model and quantification earlier in the project and would have provided insights into where to focus the detailed circuit analysis efforts. • Tasks 8 and 11: Scoping fire modeling and detailed fire modeling. More focus should be placed on detailed fire modeling and less on scoping fire modeling. This was the approach taken for the fire PRA. • Task 14: Fire risk quantification. Typically, multiple safe shutdown (SSD) components fail during a given fire scenario. Therefore dependent failure analysis is critical to obtaining a meaningful fire risk quantification. Dependent failure analysis for the fire PRA presented several challenges, which will be discussed in the full paper.

  16. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    PubMed

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantification of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomic reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
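
Comparative resistome analyses of this kind require normalizing raw AR-gene hit counts for gene length and sequencing depth before samples can be compared; the sketch below shows one such normalization, an RPKM-like "copies per million reads" measure used here purely for illustration and not necessarily the exact metric of the paper.

```python
def normalized_ar_abundance(mapped_reads, gene_length_bp, total_reads, read_length_bp=150):
    """Length- and depth-normalised abundance of one AR gene in a metagenome.

    Reads mapped to the gene are scaled by gene length (longer genes recruit
    more reads by chance) and by sequencing depth, so that metagenomes of
    different sizes become comparable. All numbers are illustrative.
    """
    gene_copies = mapped_reads * read_length_bp / gene_length_bp
    return gene_copies / total_reads * 1e6

# The same raw hit count looks very different once depth is accounted for
print(normalized_ar_abundance(mapped_reads=240, gene_length_bp=1200,
                              total_reads=8_000_000))
print(normalized_ar_abundance(mapped_reads=240, gene_length_bp=1200,
                              total_reads=40_000_000))
```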

  17. Balancing benefit and risk of medicines: a systematic review and classification of available methodologies.

    PubMed

    Mt-Isa, Shahrul; Hallgreen, Christine E; Wang, Nan; Callréus, Torbjörn; Genov, Georgy; Hirsch, Ian; Hobbiger, Stephen F; Hockley, Kimberley S; Luciani, Davide; Phillips, Lawrence D; Quartey, George; Sarac, Sinan B; Stoeckert, Isabelle; Tzoulaki, Ioanna; Micaleff, Alain; Ashby, Deborah

    2014-07-01

    The need for formal and structured approaches for benefit-risk assessment of medicines is increasing, as is the complexity of the scientific questions addressed before making decisions on the benefit-risk balance of medicines. We systematically collected, appraised and classified available benefit-risk methodologies to facilitate and inform their future use. A systematic review of publications identified benefit-risk assessment methodologies. Methodologies were appraised on their fundamental principles, features, graphical representations, assessability and accessibility. We created a taxonomy of methodologies to facilitate understanding and choice. We identified 49 methodologies, critically appraised and classified them into four categories: frameworks, metrics, estimation techniques and utility survey techniques. Eight frameworks describe qualitative steps in benefit-risk assessment and eight quantify benefit-risk balance. Nine metric indices include threshold indices to measure either benefit or risk; health indices measure quality-of-life over time; and trade-off indices integrate benefits and risks. Six estimation techniques support benefit-risk modelling and evidence synthesis. Four utility survey techniques elicit robust value preferences from relevant stakeholders to the benefit-risk decisions. Methodologies to help benefit-risk assessments of medicines are diverse and each is associated with different limitations and strengths. There is not a 'one-size-fits-all' method, and a combination of methods may be needed for each benefit-risk assessment. The taxonomy introduced herein may guide choice of adequate methodologies. Finally, we recommend 13 of 49 methodologies for further appraisal for use in the real-life benefit-risk assessment of medicines. Copyright © 2014 John Wiley & Sons, Ltd.

  18. A STATISTICAL MODELING METHODOLOGY FOR THE DETECTION, QUANTIFICATION, AND PREDICTION OF ECOLOGICAL THRESHOLDS

    EPA Science Inventory

    This study will provide a general methodology for integrating threshold information from multiple species ecological metrics, allow for prediction of changes of alternative stable states, and provide a risk assessment tool that can be applied to adaptive management. The integr...

  19. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, S. B.; Garcia, K. M.; Sargsyan, A. E.; Hamilton, D. R.; Berggren, M. D.; Antonsen, E.; Ebert, D.

    2011-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in several astronauts. This phenomenon is known to affect some terrestrial patient populations, and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semi-quantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology for posterior globe flattening.

  20. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and the approach is then applied to the NASA Langley Uncertainty Quantification Challenge problem.
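
    The variable ranking step referred to above relies on variance-based global sensitivity analysis. The following is a minimal sketch, assuming a toy model g() with uniform inputs in place of the actual GTM subsystem models; it uses standard Saltelli/Jansen estimators of first-order and total-order Sobol indices and is not the paper's implementation.

```python
# Hedged sketch: Saltelli-style estimation of first-order and total-order Sobol
# indices (variance-based global sensitivity analysis). The model g() and the
# uniform inputs are illustrative placeholders only.
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # Toy system response used only to illustrate ranking; replace with the
    # actual system-level model.
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

def sobol_indices(model, n_vars, n=200_000):
    a = rng.uniform(size=(n, n_vars))
    b = rng.uniform(size=(n, n_vars))
    ya, yb = model(a), model(b)
    var_y = np.concatenate([ya, yb]).var()
    first, total = [], []
    for i in range(n_vars):
        ab = a.copy()
        ab[:, i] = b[:, i]                              # replace only column i
        yab = model(ab)
        first.append(np.mean(yb * (yab - ya)) / var_y)        # Saltelli (2010)
        total.append(0.5 * np.mean((ya - yab) ** 2) / var_y)  # Jansen (1999)
    return np.array(first), np.array(total)

S1, ST = sobol_indices(g, 3)
print("first-order:", S1.round(3), "total-order:", ST.round(3))
```

    In a sequential refinement loop of the kind described above, the ranking would be recomputed after each Bayesian update rather than once up front.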

  1. Quantification of key long-term risks at CO₂ sequestration sites: Latest results from US DOE's National Risk Assessment Partnership (NRAP) Project

    DOE PAGES

    Pawar, Rajesh; Bromhal, Grant; Carroll, Susan; ...

    2014-12-31

    Risk assessment for geologic CO₂ storage, including quantification of risks, is an area of active investigation. The National Risk Assessment Partnership (NRAP) is a US Department of Energy (US-DOE) effort focused on developing a defensible, science-based methodology and platform for quantifying risk profiles at geologic CO₂ sequestration sites. NRAP has been developing a methodology that centers around the development of an integrated assessment model (IAM) using a system modeling approach to quantify risks and risk profiles. The IAM has been used to calculate risk profiles for a few key potential impacts due to potential CO₂ and brine leakage. The simulation results are also used to determine long-term storage security relationships and to compare the long-term storage effectiveness to the IPCC storage permanence goal. Additionally, we demonstrate the application of the IAM for uncertainty quantification in order to determine the parameters to which the uncertainty in model results is most sensitive.

  2. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.

  3. Comparison of biochemical and microscopic methods for quantification of mycorrhizal fungi in soil and roots

    USDA-ARS?s Scientific Manuscript database

    Arbuscular mycorrhizal fungi (AMF) are well-known plant symbionts which provide enhanced phosphorus uptake as well as other benefits to their host plants. Quantification of mycorrhizal biomass and root colonization has traditionally been performed by root staining and microscopic examination methods...

  4. Phase 1 of the near term hybrid passenger vehicle development program, appendix A. Mission analysis and performance specification studies. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Traversi, M.; Barbarek, L. A. C.

    1979-01-01

    A handy reference for JPL minimum requirements and guidelines is presented as well as information on the use of the fundamental information source represented by the Nationwide Personal Transportation Survey. Data on U.S. demographic statistics and highway speeds are included along with methodology for normal parameters evaluation, synthesis of daily distance distributions, and projection of car ownership distributions. The synthesis of tentative mission quantification results, of intermediate mission quantification results, and of mission quantification parameters is considered, and 1985 in-place fleet fuel economy data are included.

  5. Accurate proteome-wide protein quantification from high-resolution 15N mass spectra

    PubMed Central

    2011-01-01

    In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234

  6. Methods for quantification of soil-transmitted helminths in environmental media: current techniques and recent advances

    PubMed Central

    Collender, Philip A.; Kirby, Amy E.; Addiss, David G.; Freeman, Matthew C.; Remais, Justin V.

    2015-01-01

    Limiting the environmental transmission of soil-transmitted helminths (STH), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost effective methods to detect and quantify STH in the environment. We review the state of the art of STH quantification in soil, biosolids, water, produce, and vegetation with respect to four major methodological issues: environmental sampling; recovery of STH from environmental matrices; quantification of recovered STH; and viability assessment of STH ova. We conclude that methods for sampling and recovering STH require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. PMID:26440788

  7. EPRI/NRC-RES fire human reliability analysis guidelines.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, Stuart R.; Cooper, Susan E.; Najafi, Bijan

    2010-03-01

    During the 1990s, the Electric Power Research Institute (EPRI) developed methods for fire risk analysis to support its utility members in the preparation of responses to Generic Letter 88-20, Supplement 4, 'Individual Plant Examination - External Events' (IPEEE). This effort produced a Fire Risk Assessment methodology for operations at power that was used by the majority of U.S. nuclear power plants (NPPs) in support of the IPEEE program and several NPPs overseas. Although these methods were acceptable for accomplishing the objectives of the IPEEE, EPRI and the U.S. Nuclear Regulatory Commission (NRC) recognized that they required upgrades to support current requirements for risk-informed, performance-based (RI/PB) applications. In 2001, EPRI and the USNRC's Office of Nuclear Regulatory Research (RES) embarked on a cooperative project to improve the state-of-the-art in fire risk assessment to support a new risk-informed environment in fire protection. This project produced a consensus document, NUREG/CR-6850 (EPRI 1011989), entitled 'Fire PRA Methodology for Nuclear Power Facilities', which addressed fire risk for at-power operations. NUREG/CR-6850 developed high-level guidance on the process for identification and inclusion of human failure events (HFEs) into the fire PRA (FPRA), and a methodology for assigning quantitative screening values to these HFEs. It outlined the initial considerations of performance shaping factors (PSFs) and related fire effects that may need to be addressed in developing best-estimate human error probabilities (HEPs). However, NUREG/CR-6850 did not describe a methodology to develop best-estimate HEPs given the PSFs and the fire-related effects. In 2007, EPRI and RES embarked on another cooperative project to develop explicit guidance for estimating HEPs for human failure events under fire-generated conditions, building upon existing human reliability analysis (HRA) methods. This document provides a methodology and guidance for conducting a fire HRA. This process includes identification and definition of post-fire human failure events, qualitative analysis, quantification, recovery, dependency, and uncertainty. This document provides three approaches to quantification: screening, scoping, and detailed HRA. Screening is based on the guidance in NUREG/CR-6850, with some additional guidance for scenarios with long time windows. Scoping is a new approach to quantification developed specifically to support the iterative nature of fire PRA quantification. Scoping is intended to provide less conservative HEPs than screening, but requires fewer resources than a detailed HRA analysis. For detailed HRA quantification, guidance has been developed on how to apply existing methods to assess post-fire HEPs.

  8. Direct PCR amplification of forensic touch and other challenging DNA samples: A review.

    PubMed

    Cavanaugh, Sarah E; Bathrick, Abigail S

    2018-01-01

    DNA evidence sample processing typically involves DNA extraction, quantification, and STR amplification; however, DNA loss can occur at both the DNA extraction and quantification steps, which is not ideal for forensic evidence containing low levels of DNA. Direct PCR amplification of forensic unknown samples has been suggested as a means to circumvent extraction and quantification, thereby retaining the DNA typically lost during those procedures. Direct PCR amplification is a method in which a sample is added directly to an amplification reaction without being subjected to prior DNA extraction, purification, or quantification. It allows for maximum quantities of DNA to be targeted, minimizes opportunities for error and contamination, and reduces the time and monetary resources required to process samples, although data analysis may take longer as the increased DNA detection sensitivity of direct PCR may lead to more instances of complex mixtures. ISO 17025 accredited laboratories have successfully implemented direct PCR for limited purposes (e.g., high-throughput databanking analysis), and recent studies indicate that direct PCR can be an effective method for processing low-yield evidence samples. Despite its benefits, direct PCR has yet to be widely implemented across laboratories for the processing of evidentiary items. While forensic DNA laboratories are always interested in new methods that will maximize the quantity and quality of genetic information obtained from evidentiary items, there is often a lag between the advent of useful methodologies and their integration into laboratories. Delayed implementation of direct PCR of evidentiary items can be attributed to a variety of factors, including regulatory guidelines that prevent laboratories from omitting the quantification step when processing forensic unknown samples, as is the case in the United States, and, more broadly, a reluctance to validate a technique that is not widely used for evidence samples. The advantages of direct PCR of forensic evidentiary samples justify a re-examination of the factors that have delayed widespread implementation of this method and of the evidence supporting its use. In this review, the current and potential future uses of direct PCR in forensic DNA laboratories are summarized. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that, by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
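
    As an illustration of tracking a regression coefficient in a control chart, the following is a minimal sketch of an individuals/moving-range (I-MR) chart; the coefficient history is synthetic, not check-standard data.

```python
# Hedged sketch: individuals (I-MR) control chart for one tracked regression
# coefficient. In practice the series would come from successive check-standard
# test regression fits; here it is synthetic.
import numpy as np

rng = np.random.default_rng(42)
coef = 0.052 + 0.0008 * rng.standard_normal(30)   # synthetic coefficient history
coef[-2:] += 0.004                                 # simulate a late process shift

moving_range = np.abs(np.diff(coef))
center = coef.mean()
sigma_hat = moving_range.mean() / 1.128            # d2 constant for subgroups of size 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

out_of_control = np.flatnonzero((coef > ucl) | (coef < lcl))
print(f"UCL={ucl:.4f}, LCL={lcl:.4f}, flagged points: {out_of_control}")
```

    Points falling outside the limits would prompt investigation of the measurement process rather than of any individual test article.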

  10. ESA space spin-offs benefits for the health sector

    NASA Astrophysics Data System (ADS)

    Szalai, Bianca; Detsis, Emmanouil; Peeters, Walter

    2012-11-01

    Humanity will be faced with a number of important future challenges, including an expansion of the lifespan, a considerable increase of the population (estimated at 9 billion by 2050) and a depletion of resources. These factors could trigger an increase in chronic diseases and various other health concerns that would weigh heavily on finances worldwide. Scientific advances can play an important role in solving a number of these problems, and space technology in general can propose a panoply of possible solutions and applications that can make life on Earth easier and better for everyone. Satellites, Earth Observation, the International Space Station (ISS) and the European Space Agency (ESA) may not be the first tools that come to mind when thinking of improving health, yet there are many ways in which ESA and its programmes contribute to the health care arena. The research focuses on quantifying two ESA spin-offs to provide an initial view on how space can contribute to worldwide health. This quantification is part of the present strategy not only to show macroeconomic return factors for space in general, but also to identify and describe samples of 'best practice' type examples close to the general public's interest. For each of the 'best practices', the methodology takes into account the cost of the space hardware/software, a number of tangible and intangible benefits, as well as some logical assumptions in order to determine the potential overall returns. Some of the factors that hinder a precise quantification are also highlighted. In conclusion, the study recommends a way in which ESA's spin-offs can be taken into account early in the development process of space programmes in order to generate higher awareness with the general public and also to provide measurable returns.

  11. Development and validation of a liquid chromatography isotope dilution mass spectrometry method for the reliable quantification of alkylphenols in environmental water samples by isotope pattern deconvolution.

    PubMed

    Fabregat-Cabello, Neus; Sancho, Juan V; Vidal, Andreu; González, Florenci V; Roig-Navarro, Antoni Francesc

    2014-02-07

    We present here a new measurement method for the rapid extraction and accurate quantification of technical nonylphenol (NP) and 4-t-octylphenol (OP) in complex-matrix water samples by UHPLC-ESI-MS/MS. The extraction of both compounds is achieved in 30 min by means of hollow fiber liquid phase microextraction (HF-LPME) using 1-octanol as acceptor phase, which provides an enrichment (preconcentration) factor of 800. In addition, we have developed a quantification method based on isotope dilution mass spectrometry (IDMS) and singly (13)C1-labeled compounds. To this end, the minimally labeled (13)C1-4-(3,6-dimethyl-3-heptyl)-phenol and (13)C1-t-octylphenol isomers were synthesized; they coelute with the natural compounds and allow compensation for matrix effects. Quantification was carried out using isotope pattern deconvolution (IPD), which provides the concentration of both compounds without the need to build any calibration graph, reducing the total analysis time. The combination of both extraction and determination techniques has allowed, for the first time, validation of an HF-LPME methodology at the levels required by legislation, achieving limits of quantification of 0.1 ng mL(-1) and recoveries within 97-109%. Due to the low cost and short total analysis time of HF-LPME, this methodology is ready for implementation in routine analytical laboratories. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. QUESP and QUEST revisited - fast and accurate quantitative CEST experiments.

    PubMed

    Zaiss, Moritz; Angelovski, Goran; Demetriou, Eleni; McMahon, Michael T; Golay, Xavier; Scheffler, Klaus

    2018-03-01

    Chemical exchange saturation transfer (CEST) NMR or MRI experiments allow detection of low concentrated molecules with enhanced sensitivity via their proton exchange with the abundant water pool. Be it endogenous metabolites or exogenous contrast agents, an exact quantification of the actual exchange rate is required to design optimal pulse sequences and/or specific sensitive agents. Refined analytical expressions allow deeper insight and improvement of accuracy for common quantification techniques. The accuracy of standard quantification methodologies, such as quantification of exchange rate using varying saturation power or varying saturation time, is improved especially for the case of nonequilibrium initial conditions and weak labeling conditions, meaning the saturation amplitude is smaller than the exchange rate (γB1 < k). The improved analytical 'quantification of exchange rate using varying saturation power/time' (QUESP/QUEST) equations allow for more accurate exchange rate determination, and provide clear insights on the general principles to execute the experiments and to perform numerical evaluation. The proposed methodology was evaluated on the large-shift regime of paramagnetic chemical-exchange-saturation-transfer agents using simulated data and data of the paramagnetic Eu(III) complex of DOTA-tetraglycineamide. The refined formulas yield improved exchange rate estimation. General convergence intervals of the methods that would apply for smaller shift agents are also discussed. Magn Reson Med 79:1708-1721, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
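
    For orientation, a minimal sketch of the commonly used two-pool steady-state QUESP/QUEST relation is given below; the refined expressions derived in the paper differ in detail, so this is stated only as the generic model the technique builds on.

```latex
\mathrm{PTR}(B_1, t_{\mathrm{sat}}) \;=\;
\frac{f_s\, k_{sw}\, \alpha}{R_{1w} + f_s\, k_{sw}}
\left[\, 1 - e^{-\left(R_{1w} + f_s k_{sw}\right) t_{\mathrm{sat}}} \,\right],
\qquad
\alpha \;=\; \frac{(\gamma B_1)^2}{(\gamma B_1)^2 + k_{sw}^2}
```

    Here f_s is the labile proton fraction, k_sw the solute-to-water exchange rate, R_1w the water longitudinal relaxation rate, and α the labeling efficiency; fitting the proton transfer ratio measured at several B1 values (QUESP) or several t_sat values (QUEST) yields k_sw, and the weak-labeling condition γB1 < k_sw corresponds to α well below unity.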

  13. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, Sarah B.; Garcia, Kathleen M.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Berggren, Michael D.; Ebert, Douglas

    2012-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in the eyes of several astronauts. This phenomenon is known to affect some terrestrial patient populations and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT) or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semiquantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology to monitor small changes in posterior globe flattening.

  14. Three-dimensional microstructural characterization of bulk plutonium and uranium metals using focused ion beam technique

    NASA Astrophysics Data System (ADS)

    Chung, Brandon W.; Erler, Robert G.; Teslich, Nick E.

    2016-05-01

    Nuclear forensics requires accurate quantification of discriminating microstructural characteristics of the bulk nuclear material to identify its process history and provenance. Conventional metallographic preparation techniques for bulk plutonium (Pu) and uranium (U) metals are limited to providing information in two dimensions (2D) and do not allow for obtaining a depth profile of the material. In this contribution, use of dual-beam focused ion-beam/scanning electron microscopy (FIB-SEM) to investigate the internal microstructure of bulk Pu and U metals is demonstrated. Our results demonstrate that the dual-beam methodology optimally elucidates microstructural features without preparation artifacts, and that three-dimensional (3D) characterization of inner microstructures can reveal salient microstructural features that cannot be observed with conventional metallographic techniques. Examples are shown to demonstrate the benefit of FIB-SEM in improving microstructural characterization of microscopic inclusions, particularly with respect to nuclear forensics.

  15. Three-dimensional microstructural characterization of bulk plutonium and uranium metals using focused ion beam technique

    DOE PAGES

    Chung, Brandon W.; Erler, Robert G.; Teslich, Nick E.

    2016-03-03

    Nuclear forensics requires accurate quantification of discriminating microstructural characteristics of the bulk nuclear material to identify its process history and provenance. Conventional metallographic preparation techniques for bulk plutonium (Pu) and uranium (U) metals are limited to providing information in two dimensions (2D) and do not allow for obtaining a depth profile of the material. In this contribution, use of dual-beam focused ion-beam/scanning electron microscopy (FIB-SEM) to investigate the internal microstructure of bulk Pu and U metals is demonstrated. Our results demonstrate that the dual-beam methodology optimally elucidates microstructural features without preparation artifacts, and that three-dimensional (3D) characterization of inner microstructures can reveal salient microstructural features that cannot be observed with conventional metallographic techniques. As a result, examples are shown to demonstrate the benefit of FIB-SEM in improving microstructural characterization of microscopic inclusions, particularly with respect to nuclear forensics.

  16. Methods for Quantification of Soil-Transmitted Helminths in Environmental Media: Current Techniques and Recent Advances.

    PubMed

    Collender, Philip A; Kirby, Amy E; Addiss, David G; Freeman, Matthew C; Remais, Justin V

    2015-12-01

    Limiting the environmental transmission of soil-transmitted helminths (STHs), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost-effective methods to detect and quantify STHs in the environment. We review the state-of-the-art of STH quantification in soil, biosolids, water, produce, and vegetation with regard to four major methodological issues: environmental sampling; recovery of STHs from environmental matrices; quantification of recovered STHs; and viability assessment of STH ova. We conclude that methods for sampling and recovering STHs require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. New solid surface fluorescence methodology for lead traces determination using rhodamine B as fluorophore and coacervation scheme: Application to lead quantification in e-cigarette refill liquids.

    PubMed

    Talio, María C; Zambrano, Karen; Kaplan, Marcos; Acosta, Mariano; Gil, Raúl A; Luconi, Marta O; Fernández, Liliana P

    2015-10-01

    A new environmentally friendly methodology based on fluorescent signal enhancement of rhodamine B dye is proposed for the quantification of Pb(II) traces, using a preconcentration step based on the coacervation phenomenon. A cationic surfactant (cetyltrimethylammonium bromide, CTAB) and potassium iodide were chosen for this purpose. The coacervate phase was collected on a filter paper disk and the solid surface fluorescence signal was measured in a spectrofluorometer. Experimental variables that influence the preconcentration step and the fluorimetric sensitivity were optimized using univariate assays. The calibration graph using zeroth-order regression was linear from 7.4×10(-4) to 3.4 μg L(-1), with a correlation coefficient of 0.999. Under the optimal conditions, a limit of detection of 2.2×10(-4) μg L(-1) and a limit of quantification of 7.4×10(-4) μg L(-1) were obtained. The method showed good sensitivity and adequate selectivity with good tolerance to foreign ions, and was applied to the determination of trace amounts of Pb(II) in refill solutions for e-cigarettes, with satisfactory results validated by ICP-MS. The proposed method represents an innovative application of coacervation processes and of paper filters to solid surface fluorescence methodology. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Application of ERTS-1 data to the protection and management of New Jersey's coastal environment

    NASA Technical Reports Server (NTRS)

    Yunghans, R. S.; Feinberg, E. B.; Wobber, F. J.; Mairs, R. L. (Principal Investigator); Macomber, R. T.; Stanczuk, D.

    1973-01-01

    The author has identified the following significant results. A Coastal Zone Surveillance Program has been developed in which systematic comparisons of early ERTS-1 images and recently acquired images are regularly made to identify areas where changes have occurred. A methodology for assessing and documenting benefits has been established. Quantification of benefits has been directed toward four candidate areas: shore protection, ocean outfalls, coastal land resources, and offshore waste disposal. A refinement in the change detection analysis procedure has led to greater accuracy in spotting developmental changes in the Coastal Zone. Preliminary conclusions drawn from the Shore Erosion case study indicate that in the northern test area (developed beach) erosion has occurred more often, is generally more severe, and the beach is slower to recover than in the southern test area (natural beach). From these data it appears that it may be possible to define areas most likely to experience further erosion. The assumption of continued erosion in areas that have at one time experienced severe erosion is supported by the simple fact that as a beach narrows, wave energy is concentrated on a narrower beach surface. The higher energy condition subsequently results in accelerated erosion.

  19. Modeling and analysis of walkability in suburban neighborhoods in Las Vegas.

    DOT National Transportation Integrated Search

    2017-05-01

    Walking has sound health benefits and can be a pleasurable experience requiring neither fuel, fare, license, nor registration. Society also benefits from the associated reduction of motorized vehicle travel. The objective of this study was to quantif...

  20. Value of Earth Observation for Risk Mitigation

    NASA Astrophysics Data System (ADS)

    Pearlman, F.; Shapiro, C. D.; Grasso, M.; Pearlman, J.; Adkins, J. E.; Pindilli, E.; Geppi, D.

    2017-12-01

    Societal benefits flowing from Earth observation are intuitively obvious as we use the information to assess natural hazards (such as storm tracks), water resources (such as flooding and droughts in coastal and riverine systems), ecosystem vitality and other dynamics that impact the health and economic well-being of our population. The most powerful confirmation of these benefits would come from quantifying the impact and showing direct quantitative links in the value chain from data to decisions. However, our ability to identify and quantify those benefits is challenging. The impact of geospatial data on these types of decisions is not well characterized and assigning a true value to the observations on a broad scale across disciplines still remains to be done in a systematic way. This presentation provides the outcomes of a workshop held in October 2017 as a side event of the GEO Plenary that addressed research on economic methodologies for quantification of impacts. To achieve practical outputs during the meeting, the workshop focused on the use and value of Earth observations in risk mitigation including: ecosystem impacts, weather events, and other natural and manmade hazards. Case studies on approaches were discussed and will be part of this presentation. The presentation will also include the exchange of lessons learned and a discussion of gaps in the current understanding of the use and value of Earth observation information for risk mitigation.

  1. Cost benefits of advanced software: A review of methodology used at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla N.

    1993-01-01

    To assist rational investments in advanced software, a formal, explicit, and multi-perspective cost-benefit analysis methodology is proposed. The methodology can be implemented through a six-stage process which is described and explained. The current practice of cost-benefit analysis at KSC is reviewed in the light of this methodology. The review finds that there is a vicious circle operating. Unsound methods lead to unreliable cost-benefit estimates. Unreliable estimates convince management that cost-benefit studies should not be taken seriously. Then, given external demands for cost-benefit estimates, management encourages software engineers to somehow come up with the numbers for their projects. Lacking the expertise needed to do a proper study, courageous software engineers with vested interests use ad hoc and unsound methods to generate some estimates. In turn, these estimates are unreliable, and the cycle continues. The proposed methodology should help KSC to break out of this cycle.
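
    As a reminder of the kind of calculation a formal cost-benefit study ultimately produces, the following is a minimal discounted net-present-value sketch; the cash flows and discount rate are illustrative placeholders, not KSC project data or the six-stage process itself.

```python
# Hedged sketch: discounted cost-benefit comparison via net present value (NPV).
def npv(cashflows, rate):
    # cashflows[t] = net benefit (benefits - costs) in year t, t = 0, 1, 2, ...
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Year 0: up-front software investment; years 1-5: recurring net benefits (illustrative).
net_benefits = [-500_000, 120_000, 150_000, 150_000, 160_000, 160_000]
print(f"NPV at a 7% discount rate: {npv(net_benefits, 0.07):,.0f}")
```

    A positive NPV under defensible assumptions, documented explicitly, is what distinguishes a sound estimate from the ad hoc numbers criticized above.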

  2. Critical methodological factors in diagnosing minimal residual disease in hematological malignancies using quantitative PCR.

    PubMed

    Nyvold, Charlotte Guldborg

    2015-05-01

    Hematological malignancies are a heterogeneous group of cancers with respect to both presentation and prognosis, and many subtypes are nowadays associated with aberrations that make up excellent molecular targets for the quantification of minimal residual disease. The quantitative PCR methodology is outstanding in terms of sensitivity, specificity and reproducibility and thus an excellent choice for minimal residual disease assessment. However, the methodology still has pitfalls that should be carefully considered when the technique is integrated in a clinical setting.

  3. 76 FR 34270 - Federal-State Extended Benefits Program-Methodology for Calculating “on” or “off” Total...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-13

    ...--Methodology for Calculating "on" or "off" Total Unemployment Rate Indicators for Purposes of Determining..." or "off" total unemployment rate (TUR) indicators to determine when extended benefit (EB) periods...-State Extended Benefits Program--Methodology for Calculating "on" or "off" Total Unemployment Rate...

  4. Ecosystem services in urban water investment.

    PubMed

    Kandulu, John M; Connor, Jeffery D; MacDonald, Darla Hatton

    2014-12-01

    Increasingly, water agencies and utilities have an obligation to consider the broad environmental impacts associated with investments. To aid in understanding water cycle interdependencies when making urban water supply investment decisions, an ecosystem services typology was augmented with the concept of integrated water resources management. This framework is applied to stormwater harvesting in a case study catchment in Adelaide, South Australia. Results show that this methodological framework can effectively facilitate systematic consideration and quantitative assessment of broad environmental impacts of water supply investments. Five ecosystem service impacts were quantified including provision of 1) urban recreational amenity, 2) regulation of coastal water quality, 3) salinity, 4) greenhouse gas emissions, and 5) support of estuarine habitats. This study shows that ignoring broad environmental impacts can underestimate ecosystem service benefits of water supply investments by up to A$1.36/kL, or three times the operating and maintenance cost of stormwater harvesting. Rigorous assessment of the public welfare impacts of water infrastructure investments is required to guide long-term optimal water supply investment decisions. Numerous challenges remain in the quantification of broad environmental impacts of a water supply investment, including a lack of peer-reviewed studies of environmental impacts, aggregation of incommensurable impacts, potential for double-counting errors, uncertainties in available impact estimates, and how to determine the most suitable quantification technique. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Quantitative Determination of Noa (Naturally Occurring Asbestos) in Rocks : Comparison Between Pcom and SEM Analysis

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Amodeo, Francesco; Giorgis, Ilaria; Vitaliti, Martina

    2017-04-01

    The quantification of NOA (Naturally Occurring Asbestos) in a rock or soil matrix is complex and subject to numerous errors. The purpose of this study is to compare two fundamental methodologies used for the analysis: the first uses a Phase Contrast Optical Microscope (PCOM), while the second uses a Scanning Electron Microscope (SEM). Although the two methods provide the same result, the ratio of asbestos mass to total mass, they have completely different characteristics, each with pros and cons. Current legislation in Italy involves the use of SEM, DRX, FTIR and PCOM (DM 6/9/94) for the quantification of asbestos in bulk materials and soils, and the threshold beyond which the material is considered hazardous waste is an asbestos fibre concentration of 1000 mg/kg (DM 161/2012). The most widely used technique is SEM, which has the best analytical sensitivity among these methods (120 mg/kg, DM 6/9/94). The fundamental differences among the analyses concern the amount of sample analyzed, the representativeness of the sample, the resolution, the analytical precision, the uncertainty of the methodology, and operator errors. Because of the quantification limits of DRX and FTIR (1%, DM 6/9/94), our asbestos laboratory (DIATI POLITO) has applied the PCOM methodology for more than twenty years and, in recent years, the SEM methodology for quantification of asbestos content. The aim of our research is to compare the results obtained from PCOM analysis with those provided by SEM analysis on more than 100 natural samples, both from cores (tunnel-boring or exploratory drilling) and from tunnelling excavation. The results show, in most cases, a good correlation between the two techniques. Of particular relevance is the fact that both techniques are reliable for very low quantities of asbestos, even below the analytical sensitivity. This work highlights the comparison between the two techniques, emphasizing the strengths and weaknesses of the two procedures, and suggests that an integrated approach, together with the skills and experience of the operator, may be the best way forward for a constructive improvement of analysis techniques.

  6. Development of Total Reflection X-ray fluorescence spectrometry quantitative methodologies for elemental characterization of building materials and their degradation products

    NASA Astrophysics Data System (ADS)

    García-Florentino, Cristina; Maguregui, Maite; Marguí, Eva; Torrent, Laura; Queralt, Ignasi; Madariaga, Juan Manuel

    2018-05-01

    In this work, a Total Reflection X-ray fluorescence (TXRF) spectrometry-based quantitative methodology is proposed for the elemental characterization of liquid extracts and solids from old building materials and their degradation products, taken from an early-20th-century building of high historic and cultural value in Getxo (Basque Country, northern Spain). This quantification strategy can be considered faster than traditional Energy or Wavelength Dispersive X-ray fluorescence (ED-XRF and WD-XRF) spectrometry based methodologies or other techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS). In particular, two kinds of liquid extracts were analysed: (i) water-soluble extracts from different mortars and (ii) acid extracts from mortars, black crusts, and calcium carbonate formations. In order to avoid the acid extraction step, the direct TXRF measurement of powdered solids suspended in water was also studied. With this aim, parameters such as the deposition volume and the measuring time were studied for each kind of sample. Depending on the quantified element, the limits of detection achieved with the TXRF quantitative methodologies were around 0.01-1.2 mg/L for liquid extracts and 2-200 mg/L for solids. The quantification of K, Ca, Ti, Mn, Fe, Zn, Rb, Sr, Sn and Pb in the liquid extracts proved to be a faster alternative to more classic quantification techniques (i.e., ICP-MS), accurate enough to obtain information about the composition of the acid-soluble part of the materials and their degradation products. For the solid samples measured as suspensions, it was difficult to obtain stable and reproducible suspensions, which affected the accuracy of the results. To cope with this problem, correction factors based on quantitative results obtained using ED-XRF were calculated to improve the accuracy of the TXRF results.
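
    As context for how TXRF turns peak intensities into concentrations, liquid deposits are commonly quantified against an added internal standard; a generic form of that relation is sketched below (assumed here for illustration, the paper's exact calibration procedure may differ).

```latex
C_i \;=\; C_{\mathrm{IS}}\;\frac{N_i}{N_{\mathrm{IS}}}\;\frac{S_{\mathrm{IS}}}{S_i}
```

    Here C_i is the analyte concentration, C_IS the internal-standard concentration, N_i and N_IS the net fluorescence line intensities, and S_i and S_IS the relative elemental sensitivities of the spectrometer.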

  7. Organizational Change Efforts: Methodologies for Assessing Organizational Effectiveness and Program Costs versus Benefits.

    ERIC Educational Resources Information Center

    Macy, Barry A.; Mirvis, Philip H.

    1982-01-01

    A standardized methodology for identifying, defining, and measuring work behavior and performance rather than production, and a methodology that estimates the costs and benefits of work innovation are presented for assessing organizational effectiveness and program costs versus benefits in organizational change programs. Factors in a cost-benefit…

  8. Multiple reaction monitoring (MRM) of plasma proteins in cardiovascular proteomics.

    PubMed

    Dardé, Verónica M; Barderas, Maria G; Vivanco, Fernando

    2013-01-01

    Different methodologies have been used over the years to discover new potential biomarkers related to cardiovascular risk. The conventional proteomic strategy involves a discovery phase that requires the use of mass spectrometry (MS) and a validation phase, usually on an alternative platform such as immunoassays that can be further implemented in clinical practice. This approach is suitable for a single biomarker, but when large panels of biomarkers must be validated, the process becomes inefficient and costly. Therefore, it is essential to find an alternative methodology to perform biomarker discovery, validation, and quantification. The capabilities of quantitative MS make it an extremely attractive alternative to antibody-based technologies. Although it has been traditionally used for quantification of small molecules in clinical chemistry, MRM is now emerging as an alternative to traditional immunoassays for candidate protein biomarker validation.

  9. Life-Cycle Cost/Benefit Assessment of Expedite Departure Path (EDP)

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Chang, Paul; Datta, Koushik

    2005-01-01

    This report presents a life-cycle cost/benefit assessment (LCCBA) of Expedite Departure Path (EDP), an air traffic control Decision Support Tool (DST) currently under development at NASA. This assessment is an update of a previous study performed by bd Systems, Inc. (bd) during FY01, with the following revisions: The life-cycle cost assessment methodology developed by bd for the previous study was refined and calibrated using Free Flight Phase 1 (FFP1) cost information for Traffic Management Advisor (TMA, or TMA-SC in the FAA's terminology). Adjustments were also made to the site selection and deployment scheduling methodology to include airspace complexity as a factor. This technique was also applied to the benefit extrapolation methodology to better estimate potential benefits for other years, and at other sites. This study employed a new benefit estimating methodology because bd's previous single-year potential benefit assessment of EDP used unrealistic assumptions that resulted in optimistic estimates. This methodology uses an air traffic simulation approach to reasonably predict the impacts from the implementation of EDP. The results of the costs and benefits analyses were then integrated into a life-cycle cost/benefit assessment.

  10. Need for a marginal methodology in assessing natural gas system methane emissions in response to incremental consumption.

    PubMed

    Mac Kinnon, Michael; Heydarzadeh, Zahra; Doan, Quy; Ngo, Cuong; Reed, Jeff; Brouwer, Jacob

    2018-05-17

    Accurate quantification of methane emissions from the natural gas system is important for establishing greenhouse gas inventories and understanding cause and effect for reducing emissions. Current carbon intensity methods generally assume methane emissions are proportional to gas throughput so that increases in gas consumption yield linear increases in emitted methane. However, emissions sources are diverse and many are not proportional to throughput. Insights into the causal drivers of system methane emissions, and how system-wide changes affect such drivers are required. The development of a novel cause-based methodology to assess marginal methane emissions per unit of fuel consumed is introduced. The carbon intensities of technologies consuming natural gas are critical metrics currently used in policy decisions for reaching environmental goals. For example, the low-carbon fuel standard in California uses carbon intensity to determine incentives provided. Current methods generally assume methane emissions from the natural gas system are completely proportional to throughput. The proposed cause-based marginal emissions method will provide a better understanding of the actual drivers of emissions to support development of more effective mitigation measures. Additionally, increasing the accuracy of carbon intensity calculations supports the development of policies that can maximize the environmental benefits of alternative fuels, including reducing greenhouse gas emissions.
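
    To make the distinction concrete, the following is a minimal sketch contrasting an average (throughput-proportional) emission factor with a cause-based marginal factor when part of system methane emissions does not scale with throughput; all numbers are illustrative placeholders, not values from the study.

```python
# Hedged sketch: average vs. marginal methane emission factors.
def average_factor(fixed_emissions, variable_rate, throughput):
    # Total emissions divided by total throughput (what proportional methods assume).
    return (fixed_emissions + variable_rate * throughput) / throughput

def marginal_factor(variable_rate):
    # Emissions added per extra unit of gas consumed when fixed sources do not scale.
    return variable_rate

fixed = 1_000.0      # t CH4/yr from sources independent of throughput (illustrative)
var_rate = 0.002     # t CH4 per unit of gas delivered (illustrative)
q = 400_000.0        # units of gas delivered per year (illustrative)
print("average:", average_factor(fixed, var_rate, q), "marginal:", marginal_factor(var_rate))
```

    Whenever fixed sources are significant, the average factor overstates the emissions caused by an incremental unit of consumption, which is the point the proposed marginal methodology addresses.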

  11. Measuring the complexity of design in real-time imaging software

    NASA Astrophysics Data System (ADS)

    Sangwan, Raghvinder S.; Vercellone-Smith, Pamela; Laplante, Phillip A.

    2007-02-01

    Due to the intricacies in the algorithms involved, the design of imaging software is considered to be more complex than non-image processing software (Sangwan et al, 2005). A recent investigation (Larsson and Laplante, 2006) examined the complexity of several image processing and non-image processing software packages along a wide variety of metrics, including those postulated by McCabe (1976), Chidamber and Kemerer (1994), and Martin (2003). This work found that it was not always possible to quantitatively compare the complexity between imaging applications and nonimage processing systems. Newer research and an accompanying tool (Structure 101, 2006), however, provides a greatly simplified approach to measuring software complexity. Therefore it may be possible to definitively quantify the complexity differences between imaging and non-imaging software, between imaging and real-time imaging software, and between software programs of the same application type. In this paper, we review prior results and describe the methodology for measuring complexity in imaging systems. We then apply a new complexity measurement methodology to several sets of imaging and non-imaging code in order to compare the complexity differences between the two types of applications. The benefit of such quantification is far reaching, for example, leading to more easily measured performance improvement and quality in real-time imaging code.

  12. Stochastic output error vibration-based damage detection and assessment in structures under earthquake excitation

    NASA Astrophysics Data System (ADS)

    Sakellariou, J. S.; Fassois, S. D.

    2006-11-01

    A stochastic output error (OE) vibration-based methodology for damage detection and assessment (localization and quantification) in structures under earthquake excitation is introduced. The methodology is intended for assessing the state of a structure following potential damage occurrence by exploiting vibration signal measurements produced by low-level earthquake excitations. It is based upon (a) stochastic OE model identification, (b) statistical hypothesis testing procedures for damage detection, and (c) a geometric method (GM) for damage assessment. The methodology's advantages include the effective use of the non-stationary and limited duration earthquake excitation, the handling of stochastic uncertainties, the tackling of the damage localization and quantification subproblems, the use of "small" size, simple and partial (in both the spatial and frequency bandwidth senses) identified OE-type models, and the use of a minimal number of measured vibration signals. Its feasibility and effectiveness are assessed via Monte Carlo experiments employing a simple simulation model of a 6 storey building. It is demonstrated that damage levels of 5% and 20% reduction in a storey's stiffness characteristics may be properly detected and assessed using noise-corrupted vibration signals.
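
    To illustrate the flavor of residual-based detection with hypothesis testing, the following is a generic sketch using a least-squares AR model identified from healthy-state data and an F-test on residual variances; it is not the paper's exact OE/geometric-method formulation, and all signals are synthetic.

```python
# Hedged sketch: residual-variance damage detection with a simple AR model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def fit_ar(y, order):
    # Least-squares AR(order) fit: y[t] = sum_k a_k * y[t-k] + e[t].
    X = np.column_stack([y[order - k - 1:len(y) - k - 1] for k in range(order)])
    a, *_ = np.linalg.lstsq(X, y[order:], rcond=None)
    return a

def residuals(y, a):
    order = len(a)
    X = np.column_stack([y[order - k - 1:len(y) - k - 1] for k in range(order)])
    return y[order:] - X @ a

# Synthetic "healthy" and "test" responses (test has slightly altered dynamics).
healthy = np.convolve(rng.standard_normal(4000), [1.0, 0.6, 0.3], mode="same")
test = np.convolve(rng.standard_normal(4000), [1.0, 0.5, 0.35], mode="same")

a_healthy = fit_ar(healthy, order=4)
s2_h = residuals(healthy, a_healthy).var(ddof=1)
s2_t = residuals(test, a_healthy).var(ddof=1)      # healthy model applied to new data

F = s2_t / s2_h
threshold = stats.f.ppf(0.99, dfn=len(test) - 5, dfd=len(healthy) - 5)
print("damage indicated" if F > threshold else "no damage indicated")
```

    The paper's methodology goes further by localizing and quantifying the damage, but the detection step rests on the same idea of testing whether new data are statistically consistent with the healthy-state model.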

  13. Development of a qPCR Method for the Identification and Quantification of Two Closely Related Tuna Species, Bigeye Tuna (Thunnus obesus) and Yellowfin Tuna (Thunnus albacares), in Canned Tuna.

    PubMed

    Bojolly, Daline; Doyen, Périne; Le Fur, Bruno; Christaki, Urania; Verrez-Bagnis, Véronique; Grard, Thierry

    2017-02-01

    Bigeye tuna (Thunnus obesus) and yellowfin tuna (Thunnus albacares) are among the most widely used tuna species for canning purposes. Not only substitution but also mixing of tuna species is prohibited by the European regulation for canned tuna products. However, as juveniles of bigeye and yellowfin tunas are very difficult to distinguish, unintentional substitutions may occur during the canning process. In this study, two mitochondrial markers from NADH dehydrogenase subunit 2 and cytochrome c oxidase subunit II genes were used to identify bigeye tuna and yellowfin tuna, respectively, utilizing TaqMan qPCR methodology. Two different qPCR-based methods were developed to quantify the percentage of flesh of each species used for can processing. The first was based on absolute quantification using standard curves constructed with these two markers; the second was based on relative quantification with the universal 12S rRNA gene as the endogenous gene. On the basis of our results, we conclude that our methodology could be applied to authenticate these two closely related tuna species when used in a binary mix in tuna cans.
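
    For readers unfamiliar with relative qPCR quantification against an endogenous reference gene, the following is a minimal sketch of the generic 2^-ΔΔCt calculation; the paper's exact calibration model and amplification efficiencies may differ, and the Ct values shown are illustrative only.

```python
# Hedged sketch: Livak 2^-ddCt relative quantification with an endogenous reference.
def relative_quantity(ct_target, ct_reference, ct_target_cal, ct_reference_cal):
    # Delta Ct normalized to the reference gene (e.g., the universal 12S rRNA target),
    # then to a calibrator sample (e.g., a 100% single-species control).
    delta_ct_sample = ct_target - ct_reference
    delta_ct_calibrator = ct_target_cal - ct_reference_cal
    return 2.0 ** -(delta_ct_sample - delta_ct_calibrator)

# Example: species-specific marker in a mixed can vs. a pure single-species calibrator.
rq = relative_quantity(ct_target=26.8, ct_reference=21.4,
                       ct_target_cal=25.2, ct_reference_cal=21.3)
print(f"Relative quantity vs. calibrator: {rq:.2f}")   # values below 1 suggest a partial share
```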

  14. Quantifying construction and demolition waste: an analytical review.

    PubMed

    Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen

    2014-09-01

    Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C&D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria: waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.
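
    Of the categories listed above, the waste generation rate method is the simplest to illustrate: waste is estimated as an activity-specific rate multiplied by gross floor area. The rates and area below are illustrative placeholders, not values from the cited studies.

```python
# Hedged sketch: per-area waste generation rate estimate for C&D waste.
generation_rates = {          # tonnes of waste per m2 of gross floor area (illustrative)
    "new_construction": 0.05,
    "demolition": 1.30,
}

def estimate_waste(activity, gross_floor_area_m2):
    return generation_rates[activity] * gross_floor_area_m2

print(estimate_waste("demolition", 2_500), "tonnes for a 2,500 m2 building")
```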

  15. Electromagnetic Modeling, Optimization and Uncertainty Quantification for Antenna and Radar Systems Surfaces Scattering and Energy Absorption

    DTIC Science & Technology

    2017-03-06

    ...problems in the field of electromagnetic propagation and scattering, with applicability to design of antenna and radar systems, energy absorption and scattering by rough surfaces. This work has led to significant new methodologies, including introduction of a certain Windowed Green Function

  16. Maximum Entropy/Optimal Projection (MEOP) control design synthesis: Optimal quantification of the major design tradeoffs

    NASA Technical Reports Server (NTRS)

    Hyland, D. C.; Bernstein, D. S.

    1987-01-01

    The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modeling and reduced control design methodology for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed. The application of the methodology to several Large Space Structures (LSS) problems of representative complexity is illustrated.

  17. A Methodology for Dynamic Security Risk Quantification and Optimal Resource Allocation of Security Assets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brigantic, Robert T.; Betzsold, Nick J.; Bakker, Craig KR

    In this presentation we overview a methodology for dynamic security risk quantification and optimal resource allocation of security assets for high profile venues. This methodology is especially applicable to venues that require security screening operations such as mass transit (e.g., train or airport terminals), critical infrastructure protection (e.g., government buildings), and large-scale public events (e.g., concerts or professional sports). The method starts by decomposing the three core components of risk -- threat, vulnerability, and consequence -- into their various subcomponents. For instance, vulnerability can be decomposed into availability, accessibility, organic security, and target hardness, and each of these can be evaluated against the potential threats of interest for the given venue. Once evaluated, these subcomponents are rolled back up to compute the specific value for the vulnerability core risk component. Likewise, the same is done for consequence and threat, and then risk is computed as the product of these three components. A key aspect of our methodology is dynamically quantifying risk. That is, we incorporate the ability to uniquely allow the subcomponents and core components, and in turn, risk, to be quantified as a continuous function of time throughout the day, week, month, or year as appropriate.
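
    The roll-up described above can be sketched in a few lines. The simple average used to combine vulnerability subcomponents is one possible roll-up assumed here for illustration, and all scores are placeholders rather than values from the presentation.

```python
# Hedged sketch: dynamic risk = threat x vulnerability x consequence, per time block.
def vulnerability(availability, accessibility, organic_security, target_hardness):
    # One possible roll-up of the subcomponents (simple average, assumed for illustration).
    return (availability + accessibility + organic_security + target_hardness) / 4.0

def risk(threat, vuln, consequence):
    return threat * vuln * consequence

# Example: the same venue scored for a morning-peak and an off-peak time block.
for label, threat, consequence, subs in [
    ("morning peak", 0.6, 0.9, (0.8, 0.7, 0.4, 0.5)),
    ("off peak",     0.3, 0.5, (0.8, 0.4, 0.4, 0.5)),
]:
    print(label, round(risk(threat, vulnerability(*subs), consequence), 3))
```

    Evaluating the same product over successive time blocks is what makes the resulting risk profile dynamic and usable for time-varying allocation of security assets.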

  18. Riparian and Related Values Associated with Flood Control Project Alternatives at Wildcat and San Pablo Creeks

    Treesearch

    Philip A. Meyer

    1989-01-01

    This analysis will consider riparian benefits from alternative project designs at Wildcat and San Pablo Creeks. Particular emphasis will be placed on quantification of riparian values and on the relationship of project benefits for each project alternative to estimated costs of implementation.

  19. Overall Key Performance Indicator to Optimizing Operation of High-Pressure Homogenizers for a Reliable Quantification of Intracellular Components in Pichia pastoris.

    PubMed

    Garcia-Ortega, Xavier; Reyes, Cecilia; Montesinos, José Luis; Valero, Francisco

    2015-01-01

    The most commonly used cell disruption procedures may present a lack of reproducibility, which introduces significant errors into the quantification of intracellular components. In this work, an approach consisting of the definition of an overall key performance indicator (KPI) was implemented for a lab-scale high-pressure homogenizer (HPH) in order to determine the disruption settings that allow the reliable quantification of a wide range of intracellular components. This innovative KPI was based on the combination of three independent reporting indicators: decrease of absorbance, release of total protein, and release of alkaline phosphatase activity. The yeast Pichia pastoris growing on methanol was selected as the model microorganism because it presents an important widening of the cell wall, requiring more severe methods and operating conditions than Escherichia coli and Saccharomyces cerevisiae. From the outcome of the reporting indicators, the cell disruption efficiency achieved using HPH was about fourfold higher than that of other standard lab-scale cell disruption methodologies, such as bead milling or cell permeabilization. The approach was also applied to a pilot plant scale HPH, validating the methodology in a scale-up of the disruption process. This innovative, non-complex approach developed to evaluate the efficacy of a disruption procedure or equipment can be easily applied to optimize the most common disruption processes, in order to achieve not only reliable quantification but also recovery of intracellular components from cell factories of interest.

  20. Overall Key Performance Indicator to Optimizing Operation of High-Pressure Homogenizers for a Reliable Quantification of Intracellular Components in Pichia pastoris

    PubMed Central

    Garcia-Ortega, Xavier; Reyes, Cecilia; Montesinos, José Luis; Valero, Francisco

    2015-01-01

    The most commonly used cell disruption procedures may present a lack of reproducibility, which introduces significant errors into the quantification of intracellular components. In this work, an approach consisting of the definition of an overall key performance indicator (KPI) was implemented for a lab-scale high-pressure homogenizer (HPH) in order to determine the disruption settings that allow the reliable quantification of a wide range of intracellular components. This innovative KPI was based on the combination of three independent reporting indicators: decrease of absorbance, release of total protein, and release of alkaline phosphatase activity. The yeast Pichia pastoris growing on methanol was selected as the model microorganism because it presents an important widening of the cell wall, requiring more severe methods and operating conditions than Escherichia coli and Saccharomyces cerevisiae. From the outcome of the reporting indicators, the cell disruption efficiency achieved using HPH was about fourfold higher than that of other standard lab-scale cell disruption methodologies, such as bead milling or cell permeabilization. The approach was also applied to a pilot plant scale HPH, validating the methodology in a scale-up of the disruption process. This innovative, non-complex approach developed to evaluate the efficacy of a disruption procedure or equipment can be easily applied to optimize the most common disruption processes, in order to achieve not only reliable quantification but also recovery of intracellular components from cell factories of interest. PMID:26284241

  1. Non-perturbative Quantification of Ionic Charge Transfer through Nm-Scale Protein Pores Using Graphene Microelectrodes

    NASA Astrophysics Data System (ADS)

    Ping, Jinglei; Johnson, A. T. Charlie; A. T. Charlie Johnson Team

    Conventional electrical methods for detecting charge transfer through protein pores perturb the electrostatic condition of the solution and the chemical reactivity of the pore, and are not suitable for complex biofluids. We developed a non-perturbative methodology (fW input power) for quantifying trans-pore electrical current and detecting the pore status (i.e., open vs. closed) via graphene microelectrodes. Ferritin was used as a model protein featuring a large interior compartment, well separated from the exterior solution, with discrete pores as charge-commuting channels. The charge flowing through the ferritin pores transfers into the graphene microelectrode and is recorded by an electrometer. In this example, our methodology enables the quantification of an inorganic nanoparticle-protein nanopore interaction in complex biofluids. The authors acknowledge the support from the Defense Advanced Research Projects Agency (DARPA) and the U.S. Army Research Office under Grant Number W911NF1010093.

  2. Monitoring of chlorsulfuron in biological fluids and water samples by molecular fluorescence using rhodamine B as fluorophore.

    PubMed

    Alesso, Magdalena; Escudero, Luis A; Talio, María Carolina; Fernández, Liliana P

    2016-11-01

    A new, simple methodology is proposed for the quantification of chlorsulfuron (CS) traces based upon enhancement of the rhodamine B (RhB) fluorescent signal. Experimental variables that influence fluorimetric sensitivity have been studied and optimized. The zeroth-order regression calibration was linear from 0.866 to 35.800 µg L(-1) CS, with a correlation coefficient of 0.99. At optimal experimental conditions, a limit of detection of 0.259 µg L(-1) and a limit of quantification of 0.866 µg L(-1) were obtained. The method showed good sensitivity and adequate selectivity and was applied to the determination of trace amounts of CS in plasma, serum and water samples, with satisfactory results as verified by ANOVA. The proposed methodology represents an alternative to traditional chromatographic techniques for CS monitoring in complex samples, using an instrument readily available in control laboratories. Copyright © 2016 Elsevier B.V. All rights reserved.
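
    The reported detection and quantification limits follow the usual calibration arithmetic (LOD ≈ 3.3·s/slope, LOQ ≈ 10·s/slope, with s the standard error of the regression). A minimal sketch of that calculation is given below; the calibration points are invented and only illustrate the procedure.

```python
import numpy as np

# Illustrative calibration data: CS standard concentrations (µg/L) vs. fluorescence signal.
conc   = np.array([1.0, 5.0, 10.0, 20.0, 35.0])
signal = np.array([10.2, 51.5, 103.0, 204.8, 359.0])

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
s_y = residuals.std(ddof=2)          # standard error of the regression

lod = 3.3 * s_y / slope              # limit of detection
loq = 10.0 * s_y / slope             # limit of quantification
r = np.corrcoef(conc, signal)[0, 1]

print(f"slope={slope:.2f}, r={r:.3f}, LOD={lod:.3f} µg/L, LOQ={loq:.3f} µg/L")
```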

  3. Shadow prices of emerging pollutants in wastewater treatment plants: Quantification of environmental externalities.

    PubMed

    Bellver-Domingo, A; Fuentes, R; Hernández-Sancho, F

    2017-12-01

    Conventional wastewater treatment plants (WWTPs) are designed mainly to remove organic matter, nitrogen and phosphorus compounds, and suspended solids from wastewater, but they are not capable of removing chemicals of human origin such as pharmaceutical and personal care products (PPCPs). The presence of PPCPs in wastewater has environmental effects on the water bodies receiving the WWTP effluents and renders the effluent unsuitable as a nonconventional water source. Considering PPCPs as non-desirable outputs, the shadow prices methodology has been implemented using the output distance function to measure the environmental benefits of removing five PPCPs (acetaminophen, ibuprofen, naproxen, carbamazepine and trimethoprim) from WWTP effluents discharged to three different ecosystems (wetland, river and sea). Acetaminophen and ibuprofen show the highest shadow prices of the sample for wetland areas, with values of 128.2 and 11.0 €/mg, respectively. These results represent a proxy in monetary terms of the environmental benefit achieved by avoiding the discharge of these PPCPs into wetlands. They also indicate which PPCPs are most urgent to remove from wastewater and which ecosystems are most vulnerable to their presence. The findings of this study will be useful to plant managers when making decisions about prioritizing the removal of different pollutants. Copyright © 2017 Elsevier Ltd. All rights reserved.
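
    Once a shadow price has been estimated, the monetized environmental benefit of removal is simply the mass of pollutant kept out of the receiving ecosystem multiplied by that price. The sketch below illustrates this arithmetic with the two wetland shadow prices quoted above; the removed loads are hypothetical.

```python
# Hypothetical annual loads of PPCPs removed by a WWTP discharging to a wetland,
# valued with the wetland shadow prices reported in the study (€/mg).
shadow_price = {"acetaminophen": 128.2, "ibuprofen": 11.0}     # €/mg (from the study)
removed_load = {"acetaminophen": 1500.0, "ibuprofen": 4200.0}  # mg/year, illustrative only

benefit = sum(shadow_price[p] * removed_load[p] for p in shadow_price)
print(f"Monetized environmental benefit: {benefit:,.0f} €/year")
```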

  4. Time-Structured and Net Intraindividual Variability: Tools for Examining the Development of Dynamic Characteristics and Processes

    PubMed Central

    Ram, Nilam; Gerstorf, Denis

    2009-01-01

    The study of intraindividual variability is the study of fluctuations, oscillations, adaptations, and "noise" in behavioral outcomes that manifest on micro-time scales. This paper provides a descriptive frame for the combined study of intraindividual variability and aging/development. At the conceptual level, we highlight that the study of intraindividual variability provides access to dynamic characteristics (construct-level descriptions of individuals' capacities for change, e.g., lability) and dynamic processes (the systematic changes individuals exhibit in response to endogenous and exogenous influences, e.g., regulation). At the methodological level, we review how quantifications of net intraindividual variability (e.g., iSD) and models of time-structured intraindividual variability (e.g., time-series) are being used to measure and describe dynamic characteristics and processes. At the research design level, we point to the benefits of measurement burst study designs, wherein data are obtained across multiple time scales, for the study of development. PMID:20025395
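
    As a concrete illustration of a net intraindividual variability metric, the sketch below computes the iSD (the standard deviation of one person's repeated scores) for two hypothetical respondents; the data and the measured variable are invented.

```python
import numpy as np

def intraindividual_sd(scores):
    """Net intraindividual variability (iSD): SD of one person's repeated scores."""
    scores = np.asarray(scores, dtype=float)
    return scores.std(ddof=1)

# Illustrative daily mood ratings for two people over one measurement burst.
person_a = [4, 5, 4, 6, 5, 4, 5]
person_b = [2, 7, 3, 8, 1, 6, 7]
print(intraindividual_sd(person_a), intraindividual_sd(person_b))  # person b is more labile
```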

  5. Methods for measuring denitrification: Diverse approaches to a difficult problem

    USGS Publications Warehouse

    Groffman, Peter M; Altabet, Mary A.; Böhlke, J.K.; Butterbach-Bahl, Klaus; David, Mary B.; Firestone, Mary K.; Giblin, Anne E.; Kana, Todd M.; Nielsen , Lars Peter; Voytek, Mary A.

    2006-01-01

    Denitrification, the reduction of the nitrogen (N) oxides, nitrate (NO3−) and nitrite (NO2−), to the gases nitric oxide (NO), nitrous oxide (N2O), and dinitrogen (N2), is important to primary production, water quality, and the chemistry and physics of the atmosphere at ecosystem, landscape, regional, and global scales. Unfortunately, this process is very difficult to measure, and existing methods are problematic for different reasons in different places at different times. In this paper, we review the major approaches that have been taken to measure denitrification in terrestrial and aquatic environments and discuss the strengths, weaknesses, and future prospects for the different methods. Methodological approaches covered include (1) acetylene-based methods, (2) 15N tracers, (3) direct N2 quantification, (4) N2:Ar ratio quantification, (5) mass balance approaches, (6) stoichiometric approaches, (7) methods based on stable isotopes, (8) in situ gradients with atmospheric environmental tracers, and (9) molecular approaches. Our review makes it clear that the prospects for improved quantification of denitrification vary greatly in different environments and at different scales. While current methodology allows for the production of accurate estimates of denitrification at scales relevant to water and air quality and ecosystem fertility questions in some systems (e.g., aquatic sediments, well-defined aquifers), methodology for other systems, especially upland terrestrial areas, still needs development. Comparison of mass balance and stoichiometric approaches that constrain estimates of denitrification at large scales with point measurements (made using multiple methods), in multiple systems, is likely to propel more improvement in denitrification methods over the next few years.

  6. Institutionalizing urban forestry as a "biotechnology" to improve environmental quality

    Treesearch

    David J. Nowak

    2006-01-01

    Urban forests can provide multiple environmental benefits. As urban areas expand, the role of urban vegetation in improving environmental quality will increase in importance. Quantification of these benefits has revealed that urban forests can significantly improve air quality. As a result, national air quality regulations are now willing to potentially credit tree...

  7. Cost-benefit analysis: HIV/AIDS prevention among migrants in Central America

    PubMed Central

    Alarid-Escudero, Fernando; Sosa-Rubí, Sandra G.; Fernández, Bertha; Galárraga, Omar

    2014-01-01

    Objective To quantify the costs and benefits of three HIV prevention interventions for migrants in Central America: voluntary counseling and testing, treatment of sexually transmitted infections, and condom distribution. Materials and methods The methods were: a) identification and quantification of costs; b) quantification of benefits, defined as the potential savings in antiretroviral treatment of HIV cases prevented; and c) estimation of the cost-benefit ratio. Results The model estimated that 9, 21 and 8 cases of HIV were prevented per 10 000 migrants by voluntary counseling and testing, treatment for sexually transmitted infections, and condom distribution, respectively. In Panama, condom distribution and treatment for sexually transmitted infections had returns of US$131 and US$69.8 per dollar invested, respectively. Returns in El Salvador were US$2.0 and US$42.3 per dollar invested for voluntary counseling and testing and condom distribution, respectively. Conclusion The potential savings from prevention vary widely between countries. Nevertheless, the cost-benefit estimates suggest that HIV prevention programs in Central America can potentially result in monetary savings in the long run. PMID:23918053
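
    Steps a) to c) reduce to simple arithmetic once cases prevented, treatment costs avoided, and program costs are known. The sketch below illustrates the calculation; all figures are placeholders, not the study's estimates.

```python
def benefit_cost_ratio(cases_prevented, art_cost_per_case, program_cost):
    """Benefit-cost ratio: ART savings from prevented HIV cases per dollar spent."""
    savings = cases_prevented * art_cost_per_case
    return savings / program_cost

# Illustrative inputs per 10,000 migrants (not the study's actual figures).
print(benefit_cost_ratio(cases_prevented=8,
                         art_cost_per_case=9000,   # US$ lifetime ART cost avoided per case
                         program_cost=5500))        # US$ cost of condom distribution
```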

  8. Quantification of the Impact of Roadway Conditions on Emissions

    DOT National Transportation Integrated Search

    2017-11-01

    The scope of this project involved developing a methodology to quantify the impact of roads condition on emissions and providing guidance to assist TxDOT in improving maintenance strategies to reduce gas emissions. The research quantified vehicle ...

  9. Nanomaterials in consumer products: a challenging analytical problem.

    PubMed

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To correctly evaluate the benefits versus the risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information requires transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of the NPs that is as complete as possible, matching complementary information about different metrics and ideally achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices.

  10. Nanomaterials in consumer products: a challenging analytical problem

    NASA Astrophysics Data System (ADS)

    Contado, Catia

    2015-08-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To correctly evaluate the benefits versus the risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information requires transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of the NPs that is as complete as possible, matching complementary information about different metrics and ideally achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices.

  11. Theobroma cacao L., the Food of the Gods: a scientific approach beyond myths and claims.

    PubMed

    Rusconi, M; Conti, A

    2010-01-01

    Cocoa beans are a rich source of polyphenols, which contribute about 10% of the dry weight of the whole bean, and their derivative chocolate, particularly dark chocolate, is considered one of the major contributors of antioxidants to the American diet after fruits and vegetables. At present, the wide variation in cocoa processing and in the content and profile of polyphenols makes it difficult to determine to what extent the positive effects reported in different studies translate into tangible clinical benefits. Moreover, before health claims can be made for a plant, natural product or food item in human subjects, a basic research project approved by scientific and ethical commissions has to be performed. Until now, the definition, composition, manufacturing specifications, packaging and labelling of cocoa and chocolate products in Europe have been regulated by "Directive 2000/36/EC of the European Parliament and of the Council". The definitions take changes in consumer tastes, chocolate composition and labelling into account, but do not consider the real potential for healthy, beneficial and nutraceutical effects. In fact, they fail to establish an official analytical methodology for the quantification of phenolic compounds in cocoa and chocolate, and the quantification of these compounds is not used in product classification. This article reviews many qualitative differences of cocoa and chocolate, in particular dark chocolate, aiming to establish their different implications for public health through the use of the analyzed concentration of polyphenols in cocoa products. Copyright 2009 Elsevier Ltd. All rights reserved.

  12. Nanomaterials in consumer products: a challenging analytical problem

    PubMed Central

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To correctly evaluate the benefits versus the risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information requires transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of the NPs that is as complete as possible, matching complementary information about different metrics and ideally achieved through validated procedures. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about NPs in such complex matrices. PMID:26301216

  13. Advances and Challenges In Uncertainty Quantification with Application to Climate Prediction, ICF design and Science Stockpile Stewardship

    NASA Astrophysics Data System (ADS)

    Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.

    2012-12-01

    Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing uncertainties to be quantified and bounded. This powerful capability will yield new insights into scientific predictions (e.g. Climate) of great impact on both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL, focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms, and (c) the use of laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with Climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. To make advancements in several of these UQ grand challenges, I will focus in this talk on the following three research areas in our Strategic Initiative: Error Estimation in multi-physics and multi-scale codes; Tackling the "Curse of High Dimensionality"; and development of an advanced UQ Computational Pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).

  14. The influence of capture-recapture methodology on the evolution of the North American Bird Banding Program

    USGS Publications Warehouse

    Tautin, J.; Lebreton, J.-D.; North, P.M.

    1993-01-01

    Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge which in turn betters the conservation of migratory birds. The advances in capture-recapture methodology have benefited gamebird studies primarily, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged, and, to maximize benefits of the methodology, work on practical applications should be increased.

  15. Analysis and methodology for aeronautical systems technology program planning

    NASA Technical Reports Server (NTRS)

    White, M. J.; Gershkoff, I.; Lamkin, S.

    1983-01-01

    A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
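
    The rank-ordering step described above can be sketched as follows: concepts are sorted by benefit-to-cost ratio and the cumulative ratio is accumulated in that order. The concept names and figures below are invented; the original program's cost and benefit models are of course far more detailed.

```python
def rank_concepts(concepts):
    """Rank system concepts by benefit-to-cost ratio and report the cumulative
    benefit-to-cost ratio in the preferred order of implementation."""
    ranked = sorted(concepts, key=lambda c: c["benefit"] / c["cost"], reverse=True)
    cum_b = cum_c = 0.0
    for c in ranked:
        cum_b += c["benefit"]
        cum_c += c["cost"]
        c["cumulative_bc"] = cum_b / cum_c
    return ranked

# Illustrative concepts (benefits and costs in consistent units, e.g., M$).
concepts = [
    {"name": "datalink upgrade", "benefit": 40.0, "cost": 10.0},
    {"name": "wingtip devices",  "benefit": 90.0, "cost": 45.0},
    {"name": "engine retrofit",  "benefit": 30.0, "cost": 25.0},
]
for c in rank_concepts(concepts):
    print(c["name"], round(c["cumulative_bc"], 2))
```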

  16. Stoichiometric and kinetic analysis of extreme halophilic Archaea on various substrates in a corrosion resistant bioreactor.

    PubMed

    Lorantfy, Bettina; Seyer, Bernhard; Herwig, Christoph

    2014-01-25

    Extreme halophilic Archaea are extremophile species which can thrive in hypersaline environments of up to 3-5 M sodium chloride concentration. Although their ecology and physiology are well characterized at the microbiological level, little emphasis has been placed on quantitative bioprocess development with extreme halophiles. The goal of this study was to establish, on the one hand, a methodological basis for quantitative bioprocess analysis of extreme halophilic Archaea, using an extreme halophilic strain as an example. Firstly, as a novel application, a corrosion-resistant bioreactor setup for extreme halophiles was implemented. Then, paying special attention to total bioprocess quantification approaches, an indirect method for biomass quantification using on-line process signals was introduced. Subsequently, robust quantitative data evaluation methods for halophiles could be developed, providing defined and controlled cultivation conditions in the bioreactor and thereby yielding on-line as well as off-line datasets of suitable quality. On the other hand, new physiological results for extreme halophiles in a bioreactor were also obtained based on these quantitative methodological tools. For the first time, quantitative data on stoichiometry and kinetics were collected and evaluated on different carbon sources. The results on various substrates were interpreted, with proposed metabolic mechanisms, by linking to the reported primary carbon metabolism of extreme halophilic Archaea. Moreover, results of chemostat cultures demonstrated that extreme halophilic organisms show Monod kinetics on different sole carbon sources. A diauxic growth pattern was described on a mixture of substrates in batch cultivations. In addition, the methodologies presented here enable one to characterize the utilized strain Haloferax mediterranei (HFX) as a potential new host organism. Thus, this study offers a strong methodological basis as well as a fundamental physiological assessment for bioreactor quantification of extreme halophiles that can serve as primary knowledge for applications of extreme halophiles in biotechnology. Copyright © 2013 Elsevier B.V. All rights reserved.
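
    The Monod relation mentioned above, μ = μmax·S/(Ks + S), can be fitted directly to chemostat data (dilution rate versus residual substrate concentration). The sketch below does this with scipy; the data points are illustrative, not measurements from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def monod(S, mu_max, Ks):
    """Monod model: specific growth rate as a function of substrate concentration."""
    return mu_max * S / (Ks + S)

# Illustrative chemostat data: residual substrate (g/L) vs. dilution rate (= mu, 1/h).
S  = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0])
mu = np.array([0.021, 0.036, 0.058, 0.072, 0.083, 0.090])

(mu_max, Ks), _ = curve_fit(monod, S, mu, p0=(0.1, 0.2))
print(f"mu_max = {mu_max:.3f} 1/h, Ks = {Ks:.3f} g/L")
```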

  17. A novel approach for the automated segmentation and volume quantification of cardiac fats on computed tomography.

    PubMed

    Rodrigues, É O; Morais, F F C; Morais, N A O S; Conci, L S; Neto, L V; Conci, A

    2016-01-01

    The deposits of fat on the surroundings of the heart are correlated with several health risk factors such as atherosclerosis, carotid stiffness, coronary artery calcification, atrial fibrillation and many others. These deposits vary independently of obesity, which reinforces the case for their direct segmentation for further quantification. However, manual segmentation of these fats has not been widely deployed in clinical practice due to the required human workload and the consequential high cost of physicians and technicians. In this work, we propose a unified method for autonomous segmentation and quantification of two types of cardiac fat. The segmented fats are termed epicardial and mediastinal, and are separated from each other by the pericardium. Much effort was devoted to achieving minimal user intervention. The proposed methodology mainly comprises registration and classification algorithms to perform the desired segmentation. We compare the performance of several classification algorithms on this task, including neural networks, probabilistic models and decision tree algorithms. Experimental results of the proposed methodology show that the mean accuracy for both epicardial and mediastinal fats is 98.5% (99.5% if the features are normalized), with a mean true positive rate of 98.0%. On average, the Dice similarity index was 97.6%. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
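
    The Dice similarity index used to report agreement is twice the overlap of the automated and manual masks divided by their total size. A minimal sketch, with toy 4x4 binary masks, is given below.

```python
import numpy as np

def dice_index(segmentation, ground_truth):
    """Dice similarity index between a binary segmentation and its ground truth."""
    seg = np.asarray(segmentation, dtype=bool)
    gt = np.asarray(ground_truth, dtype=bool)
    intersection = np.logical_and(seg, gt).sum()
    return 2.0 * intersection / (seg.sum() + gt.sum())

# Illustrative 4x4 binary masks (1 = fat voxel).
auto   = np.array([[0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 1, 0], [0, 0, 0, 0]])
manual = np.array([[0, 1, 1, 0], [0, 1, 1, 1], [0, 0, 1, 0], [0, 0, 0, 0]])
print(f"Dice = {dice_index(auto, manual):.3f}")
```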

  18. Determination of the purity of pharmaceutical reference materials by 1H NMR using the standardless PULCON methodology.

    PubMed

    Monakhova, Yulia B; Kohl-Himmelseher, Matthias; Kuballa, Thomas; Lachenmeier, Dirk W

    2014-11-01

    A fast and reliable nuclear magnetic resonance spectroscopic method for the quantitative determination (qNMR) of targeted molecules in reference materials has been established using the ERETIC2 methodology (electronic reference to access in vivo concentrations) based on the PULCON principle (pulse length based concentration determination). The developed approach was validated for the analysis of pharmaceutical samples in the context of official medicines control, including ibandronic acid, amantadine, ambroxol and lercanidipine. The PULCON recoveries were above 94.3% and the coefficients of variation (CVs) obtained by quantification of different targeted resonances ranged between 0.7% and 2.8%, demonstrating that the qNMR method is a precise tool for rapid quantification (approximately 15 min) of reference materials and medicinal products. Generally, the values were within the specifications (certified values) provided by the manufacturers. The results were in agreement with NMR quantification using an internal standard and with validated reference HPLC analysis. The PULCON method was found to be a practical alternative with competitive precision and accuracy to the classical internal reference method, and it proved to be applicable to different solvent conditions. The method can be recommended for routine use in medicines control laboratories, especially when the availability and costs of reference compounds are problematic. Copyright © 2014 Elsevier B.V. All rights reserved.
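
    In simplified form, PULCON scales a reference concentration by the ratio of signal areas, corrected for the number of contributing protons, number of scans, and 90-degree pulse lengths; receiver-gain and temperature corrections that real implementations also apply are omitted here. The function below is a hedged sketch of that relation with invented numbers, not the ERETIC2 implementation.

```python
def pulcon_concentration(I_unknown, I_ref, c_ref,
                         n_protons_unknown, n_protons_ref,
                         scans_unknown, scans_ref,
                         p90_unknown, p90_ref):
    """Simplified PULCON-style concentration estimate.

    Signal areas are compared between the unknown and an external reference
    spectrum, corrected for contributing protons, scans, and 90-degree pulse
    lengths (signal scales with protons and scans, inversely with pulse length).
    """
    return (c_ref
            * (I_unknown / I_ref)
            * (n_protons_ref / n_protons_unknown)
            * (scans_ref / scans_unknown)
            * (p90_unknown / p90_ref))

# Illustrative numbers only (mM, arbitrary integral units, microseconds).
print(pulcon_concentration(I_unknown=2.4e6, I_ref=1.1e6, c_ref=5.0,
                           n_protons_unknown=2, n_protons_ref=1,
                           scans_unknown=16, scans_ref=16,
                           p90_unknown=10.2, p90_ref=9.8))
```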

  19. The clinico-radiological paradox of cognitive function and MRI burden of white matter lesions in people with multiple sclerosis: A systematic review and meta-analysis.

    PubMed

    Mollison, Daisy; Sellar, Robin; Bastin, Mark; Mollison, Denis; Chandran, Siddharthan; Wardlaw, Joanna; Connick, Peter

    2017-01-01

    Moderate correlation exists between the imaging quantification of brain white matter lesions and cognitive performance in people with multiple sclerosis (MS). This may reflect the greater importance of other features, including subvisible pathology, or methodological limitations of the primary literature. To summarise the cognitive clinico-radiological paradox and explore the potential methodological factors that could influence the assessment of this relationship. Systematic review and meta-analysis of primary research relating cognitive function to white matter lesion burden. Fifty papers met eligibility criteria for review, and meta-analysis of overall results was possible in thirty-two (2050 participants). Aggregate correlation between cognition and T2 lesion burden was r = -0.30 (95% confidence interval: -0.34, -0.26). Wide methodological variability was seen, particularly related to key factors in the cognitive data capture and image analysis techniques. Resolving the persistent clinico-radiological paradox will likely require simultaneous evaluation of multiple components of the complex pathology using optimum measurement techniques for both cognitive and MRI feature quantification. We recommend a consensus initiative to support common standards for image analysis in MS, enabling benchmarking while also supporting ongoing innovation.
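
    Aggregate correlations of this kind are typically obtained by Fisher z-transforming the per-study coefficients, weighting by n - 3, and back-transforming. The sketch below shows a fixed-effect version of that calculation with invented study values; the published meta-analysis will have used a more complete (e.g., random-effects) model.

```python
import numpy as np

def pooled_correlation(r_values, n_values):
    """Fixed-effect pooled correlation via Fisher z-transformation,
    weighting each study by n - 3, with a 95% confidence interval."""
    r = np.asarray(r_values, dtype=float)
    n = np.asarray(n_values, dtype=float)
    z = np.arctanh(r)
    w = n - 3.0
    z_bar = np.sum(w * z) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))
    ci = np.tanh([z_bar - 1.96 * se, z_bar + 1.96 * se])
    return np.tanh(z_bar), ci

# Illustrative per-study correlations between T2 lesion load and cognition.
r_pool, (lo, hi) = pooled_correlation([-0.25, -0.35, -0.30], [60, 120, 80])
print(f"pooled r = {r_pool:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```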

  20. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples.

    PubMed

    Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S

    2016-03-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).

  1. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples

    USGS Publications Warehouse

    Verant, Michelle; Bohuski, Elizabeth A.; Lorch, Jeffrey M.; Blehert, David

    2016-01-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer–based qPCR test for P. destructans to refine quantification capabilities of this assay.

  2. An analysis of potassium iodide (KI) prophylaxis for the general public in the event of a nuclear accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Behling, H.; Behling, K.; Amarasooriya, H.

    1995-02-01

    A generic difficulty encountered in cost-benefit analyses is the quantification of major elements that define the costs and the benefits in commensurate units. In this study, the costs of making KI available for public use, and the avoidance of thyroidal health effects predicted to be realized from the availability of that KI (i.e., the benefits), are defined in the commensurate units of dollars.

  3. Effects of Special Use Airspace on Economic Benefits of Direct Flights

    NASA Technical Reports Server (NTRS)

    Datta, Koushik; Barrington, Craig; Foster, John D. (Technical Monitor)

    1996-01-01

    A methodology for estimating the economic effects of Special Use Airspace (SUA) on direct route flights is presented in this paper. The methodology is based on evaluating aircraft operating costs and analyzing the different ground-track distances traveled by flights under different air traffic scenarios. Using this methodology, the following are evaluated: the optimistic bias of studies that assume accessible SUAs, the maximum economic benefit of dynamic use of SUAs, and the marginal economic benefit of the dynamic use of individual SUAs.

  4. Methodological challenges for the evaluation of clinical effectiveness in the context of accelerated regulatory approval: an overview.

    PubMed

    Woolacott, Nerys; Corbett, Mark; Jones-Diette, Julie; Hodgson, Robert

    2017-10-01

    Regulatory authorities are approving innovative therapies with limited evidence. Although this level of data is sufficient for the regulator to establish an acceptable risk-benefit balance, it is problematic for downstream health technology assessment, where assessment of cost-effectiveness requires reliable estimates of effectiveness relative to existing clinical practice. Key issues associated with a limited evidence base include the use of data from nonrandomized studies, small single-arm trials, or single-center trials, and the use of surrogate end points. We examined these methodological challenges through a pragmatic review of the available literature. Methods to adjust nonrandomized studies for confounding are imperfect. The relative treatment effect generated from single-arm trials is uncertain and may be optimistic. Single-center trial results may not be generalizable. Surrogate end points, on average, overestimate treatment effects. Current methods for analyzing such data are limited, and effectiveness claims based on these suboptimal forms of evidence are likely to be subject to significant uncertainty. Assessments of cost-effectiveness based on the modeling of such data are likely to be subject to considerable uncertainty. This uncertainty must not be underestimated by decision makers: methods for its quantification are required, and schemes to protect payers from the cost of uncertainty should be implemented. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  5. Rapid and label-free microfluidic neutrophil purification and phenotyping in diabetes mellitus

    NASA Astrophysics Data System (ADS)

    Hou, Han Wei; Petchakup, Chayakorn; Tay, Hui Min; Tam, Zhi Yang; Dalan, Rinkoo; Chew, Daniel Ek Kwang; Li, King Ho Holden; Boehm, Bernhard O.

    2016-07-01

    Advanced management of dysmetabolic syndromes such as diabetes will benefit from a timely mechanistic insight enabling personalized medicine approaches. Herein, we present a rapid microfluidic neutrophil sorting and functional phenotyping strategy for type 2 diabetes mellitus (T2DM) patients using small blood volumes (fingerprick ~100 μL). The developed inertial microfluidics technology enables single-step neutrophil isolation (>90% purity) without immuno-labeling and sorted neutrophils are used to characterize their rolling behavior on E-selectin, a critical step in leukocyte recruitment during inflammation. The integrated microfluidics testing methodology facilitates high throughput single-cell quantification of neutrophil rolling to detect subtle differences in speed distribution. Higher rolling speed was observed in T2DM patients (P < 0.01) which strongly correlated with neutrophil activation, rolling ligand P-selectin glycoprotein ligand 1 (PSGL-1) expression, as well as established cardiovascular risk factors (cholesterol, high-sensitive C-reactive protein (CRP) and HbA1c). Rolling phenotype can be modulated by common disease risk modifiers (metformin and pravastatin). Receiver operating characteristics (ROC) and principal component analysis (PCA) revealed neutrophil rolling as an important functional phenotype in T2DM diagnostics. These results suggest a new point-of-care testing methodology, and neutrophil rolling speed as a functional biomarker for rapid profiling of dysmetabolic subjects in clinical and patient-oriented settings.

  6. Using Public Data for Comparative Proteome Analysis in Precision Medicine Programs.

    PubMed

    Hughes, Christopher S; Morin, Gregg B

    2018-03-01

    Maximizing the clinical utility of information obtained in longitudinal precision medicine programs would benefit from robust comparative analyses to known information to assess biological features of patient material toward identifying the underlying features driving their disease phenotype. Herein, the potential for utilizing publicly deposited mass-spectrometry-based proteomics data to perform inter-study comparisons of cell-line or tumor-tissue materials is investigated. To investigate the robustness of comparison between MS-based proteomics studies carried out with different methodologies, deposited data representative of label-free (MS1) and isobaric tagging (MS2 and MS3 quantification) are utilized. In-depth quantitative proteomics data acquired from analysis of ovarian cancer cell lines revealed the robust recapitulation of observable gene expression dynamics between individual studies carried out using significantly different methodologies. The observed signatures enable robust inter-study clustering of cell line samples. In addition, the ability to classify and cluster tumor samples based on observed gene expression trends when using a single patient sample is established. With this analysis, relevant gene expression dynamics are obtained from a single patient tumor, in the context of a precision medicine analysis, by leveraging a large cohort of repository data as a comparator. Together, these data establish the potential for state-of-the-art MS-based proteomics data to serve as resources for robust comparative analyses in precision medicine applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Reliable estimates of predictive uncertainty for an Alpine catchment using a non-parametric methodology

    NASA Astrophysics Data System (ADS)

    Matos, José P.; Schaefli, Bettina; Schleiss, Anton J.

    2017-04-01

    Uncertainty affects hydrological modelling efforts from the very measurements (or forecasts) that serve as inputs to the more or less inaccurate predictions that are produced. Uncertainty is truly inescapable in hydrology and yet, due to the theoretical and technical hurdles associated with its quantification, it is at times still neglected or estimated only qualitatively. In recent years the scientific community has made a significant effort towards quantifying this hydrologic prediction uncertainty. Despite this, most of the developed methodologies can be computationally demanding, are complex from a theoretical point of view, require substantial expertise to be employed, and are constrained by a number of assumptions about the model error distribution. These assumptions limit the reliability of many methods in case of errors that show particular cases of non-normality, heteroscedasticity, or autocorrelation. The present contribution builds on a non-parametric data-driven approach that was developed for uncertainty quantification in operational (real-time) forecasting settings. The approach is based on the concept of Pareto optimality and can be used as a standalone forecasting tool or as a postprocessor. By virtue of its non-parametric nature and a general operating principle, it can be applied directly and with ease to predictions of streamflow, water stage, or even accumulated runoff. Also, it is a methodology capable of coping with high heteroscedasticity and seasonal hydrological regimes (e.g. snowmelt and rainfall driven events in the same catchment). Finally, the training and operation of the model are very fast, making it a tool particularly adapted to operational use. To illustrate its practical use, the uncertainty quantification method is coupled with a process-based hydrological model to produce statistically reliable forecasts for an Alpine catchment located in Switzerland. Results are presented and discussed in terms of their reliability and resolution.
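
    The study's method is built on Pareto optimality; as a much simpler stand-in that conveys the non-parametric idea, the sketch below derives prediction bounds by adding empirical quantiles of past forecast errors to a deterministic forecast, with no distributional assumptions. The data are synthetic.

```python
import numpy as np

def empirical_prediction_bounds(point_forecast, past_errors, coverage=0.90):
    """Data-driven prediction bounds: add empirical error quantiles to a
    deterministic forecast (no distributional assumptions)."""
    lo_q, hi_q = np.quantile(past_errors, [(1 - coverage) / 2, 1 - (1 - coverage) / 2])
    return point_forecast + lo_q, point_forecast + hi_q

# Illustrative: residuals (observed - simulated) from a calibration period, in m3/s.
past_errors = np.random.default_rng(1).normal(0.0, 2.5, size=500)
print(empirical_prediction_bounds(point_forecast=18.0, past_errors=past_errors))
```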

  8. Validation of musculoskeletal ultrasound to assess and quantify muscle glycogen content. A novel approach.

    PubMed

    Hill, John C; Millán, Iñigo San

    2014-09-01

    Glycogen storage is essential for exercise performance. The ability to assess muscle glycogen levels should be an important advantage for performance. However, skeletal muscle glycogen assessment has only been available and validated through muscle biopsy. We have developed a new methodology using high-frequency ultrasound to assess skeletal muscle glycogen content in a rapid, portable, and noninvasive way using MuscleSound (MuscleSound, LCC, Denver, CO) technology. To validate the utilization of high-frequency musculoskeletal ultrasound for muscle glycogen assessment and correlate it with histochemical glycogen quantification through muscle biopsy. Twenty-two male competitive cyclists (categories: Pro, 1-4; average height, 183.7 ± 4.9 cm; average weight, 76.8 ± 7.8 kg) performed a steady-state test on a cycle ergometer for 90 minutes at a moderate to high exercise intensity, eliciting a carbohydrate oxidation of 2-3 g·min⁻¹ and a blood lactate concentration of 2 to 3 mM. Pre- and post-exercise glycogen content from the rectus femoris muscle was measured using histochemical analysis through muscle biopsy and through high-frequency ultrasound scans using MuscleSound technology. Correlations between muscle biopsy glycogen histochemical quantification (mmol·kg⁻¹) and the high-frequency ultrasound methodology through MuscleSound technology were r = 0.93 (P < 0.0001) pre-exercise and r = 0.94 (P < 0.0001) post-exercise. The correlation between muscle biopsy glycogen quantification and the high-frequency ultrasound methodology for the change in glycogen from pre- to post-exercise was r = 0.81 (P < 0.0001). These results demonstrate that skeletal muscle glycogen can be measured quickly and noninvasively through high-frequency ultrasound using MuscleSound technology.

  9. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    NASA Astrophysics Data System (ADS)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS generates a comparatively low financial and spatial footprint compared with common fluorescence based systems. Despite the advantages of SERS, it has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method and ultimately complicate existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNA concentrations commonly exist in relatively low concentrations, amplification methods (e.g. PCR) are therefore required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  10. BATSE gamma-ray burst line search. 2: Bayesian consistency methodology

    NASA Technical Reports Server (NTRS)

    Band, D. L.; Ford, L. A.; Matteson, J. L.; Briggs, M.; Paciesas, W.; Pendleton, G.; Preece, R.; Palmer, D.; Teegarden, B.; Schaefer, B.

    1994-01-01

    We describe a Bayesian methodology to evaluate the consistency between the reported Ginga and Burst and Transient Source Experiment (BATSE) detections of absorption features in gamma-ray burst spectra. Currently no features have been detected by BATSE, but this methodology will still be applicable if and when such features are discovered. The Bayesian methodology permits the comparison of hypotheses regarding the two detectors' observations and makes explicit the subjective aspects of our analysis (e.g., the quantification of our confidence in detector performance). We also present non-Bayesian consistency statistics. Based on preliminary calculations of line detectability, we find that both the Bayesian and non-Bayesian techniques show that the BATSE and Ginga observations are consistent given our understanding of these detectors.

  11. Future research needs associated with the assessment of potential human health risks from exposure to toxic ambient air pollutants.

    PubMed Central

    Möller, L; Schuetzle, D; Autrup, H

    1994-01-01

    This paper presents key conclusions and future research needs from a Workshop on the Risk Assessment of Urban Air, Emissions, Exposure, Risk Identification, and Quantification, which was held in Stockholm during June 1992 by 41 participants from 13 countries. Research is recommended in the areas of identification and quantification of toxics in source emissions and ambient air, atmospheric transport and chemistry, exposure level assessment, the development of improved in vitro bioassays, biomarker development, the development of more accurate epidemiological methodologies, and risk quantification techniques. Studies are described that will be necessary to assess and reduce the level of uncertainties associated with each step of the risk assessment process. International collaborative research efforts between industry and government organizations are recommended as the most effective way to carry out this research. PMID:7529703

  12. Lamb wave-based damage quantification and probability of detection modeling for fatigue life assessment of riveted lap joint

    NASA Astrophysics Data System (ADS)

    He, Jingjing; Wang, Dengjiang; Zhang, Weifang

    2015-03-01

    This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufactures and under different loading conditions.
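
    One common way to build a POD curve from sizing data is the "â versus a" model: regress the damage feature on crack size and evaluate the probability that the feature exceeds the decision threshold under Gaussian residuals. The sketch below illustrates that approach with invented data; it is not necessarily the POD model used in the paper.

```python
import numpy as np
from scipy.stats import norm

def pod_curve(crack_size, signal, decision_threshold, a_grid):
    """'a-hat versus a' POD model: regress the damage feature on crack size,
    then POD(a) = P(feature > decision threshold) assuming Gaussian residuals."""
    b1, b0 = np.polyfit(crack_size, signal, 1)
    resid_sd = (signal - (b0 + b1 * crack_size)).std(ddof=2)
    return norm.cdf((b0 + b1 * a_grid - decision_threshold) / resid_sd)

# Illustrative data: crack length (mm) vs. fused PZT signal feature.
a = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
y = np.array([0.08, 0.15, 0.22, 0.33, 0.38, 0.47, 0.55, 0.61])
a_grid = np.linspace(0.5, 4.0, 8)
print(np.round(pod_curve(a, y, decision_threshold=0.30, a_grid=a_grid), 2))
```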

  13. Analysis of optimal phenotypic space using elementary modes as applied to Corynebacterium glutamicum

    PubMed Central

    Gayen, Kalyan; Venkatesh, KV

    2006-01-01

    Background Quantification of the metabolic network of an organism offers insights into possible ways of developing mutant strains for better productivity of an extracellular metabolite. The first step in this quantification is the enumeration of the stoichiometries of all reactions occurring in a metabolic network. The structural details of the network in combination with experimentally observed accumulation rates of external metabolites can yield the flux distribution at steady state. One such methodology for quantification is the use of elementary modes, which are minimal sets of enzymes connecting external metabolites. Here, we have used a linear objective function subject to elementary modes as constraints to determine the fluxes in the metabolic network of Corynebacterium glutamicum. The feasible phenotypic space was evaluated at various combinations of oxygen and ammonia uptake rates. Results Quantification of the fluxes of the elementary modes in the metabolism of C. glutamicum was formulated as a linear program. The analysis demonstrated that the solution was dependent on the choice of objective function when fewer than four accumulation rates of the external metabolites were considered. The analysis yielded feasible ranges of fluxes of elementary modes that satisfy the experimental accumulation rates. In C. glutamicum, the elementary modes relating to biomass synthesis through glycolysis and the TCA cycle were predominantly operational in the initial growth phase. At a later time, the elementary modes contributing to lysine synthesis became active. The oxygen and ammonia uptake rates were shown to be bounded in the phenotypic space due to the stoichiometric constraints of the elementary modes. Conclusion We have demonstrated the use of elementary modes and linear programming to quantify a metabolic network. We have used the methodology to quantify the network of C. glutamicum, which evaluates the set of operational elementary modes at different phases of fermentation. The methodology was also used to determine the feasible solution space for a given set of substrate uptake rates under specific optimization criteria. Such an approach can be used to determine the optimality of the accumulation rates of any metabolite in a given network. PMID:17038164
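
    The quantification described above amounts to a linear program: non-negative fluxes through the elementary modes are sought such that their combined stoichiometry reproduces the measured accumulation rates, with an objective function selecting among feasible solutions when the measurements do not pin the fluxes down uniquely. The sketch below sets up a toy three-mode example with scipy; the mode matrix and rates are invented, not those of C. glutamicum.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative elementary-mode matrix: rows = external metabolites
# (glucose uptake, lysine, biomass), columns = elementary modes.
# Entries are the net yield of each external metabolite per unit mode flux.
E = np.array([
    [-1.0, -1.0, -1.0],   # glucose (consumed by every mode)
    [ 0.0,  0.6,  0.3],   # lysine produced
    [ 0.5,  0.0,  0.2],   # biomass produced
])

# Measured accumulation rates (mmol/gDCW/h), same row order as E.
rates = np.array([-10.0, 2.7, 2.6])

# With three measured rates this toy system is fully determined; with fewer,
# the objective (here, minimal total mode activity) selects among the
# feasible non-negative flux distributions.
res = linprog(c=np.ones(E.shape[1]), A_eq=E, b_eq=rates,
              bounds=[(0, None)] * E.shape[1], method="highs")
print(res.x if res.success else "no feasible flux distribution")
```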

  14. Natural Rubber Quantification in Sunflower Using an Automated Solvent Extractor

    USDA-ARS?s Scientific Manuscript database

    Leaves of sunflower (Helianthus annuus) produce a small amount of low molecular weight natural rubber (NR) and this species has potential as a rubber-producing crop plant. Quantifying NR in plant tissue has traditionally been accomplished using Soxhlet or gravimetric methodologies. Accelerated solve...

  15. Proof-of-Concept Study for Uncertainty Quantification and Sensitivity Analysis using the BRL Shaped-Charge Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Justin Matthew

    These are the slides for a graduate presentation at Mississippi State University. It covers the following: the BRL Shaped-Charge Geometry in PAGOSA, mesh refinement study, surrogate modeling using a radial basis function network (RBFN), ruling out parameters using sensitivity analysis (equation of state study), uncertainty quantification (UQ) methodology, and sensitivity analysis (SA) methodology. In summary, a mesh convergence study was used to ensure that solutions were numerically stable by comparing PDV data between simulations. A Design of Experiments (DOE) method was used to reduce the simulation space to study the effects of the Jones-Wilkins-Lee (JWL) parameters for the Composition B main charge. Uncertainty was quantified by computing the 95% data range about the median of simulation output using a brute-force Monte Carlo (MC) random sampling method. Parameter sensitivities were quantified using the Fourier Amplitude Sensitivity Test (FAST) spectral analysis method, where it was determined that detonation velocity, initial density, C1, and B1 controlled jet tip velocity.
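
    The 95% data range about the median reduces to percentiles of the Monte Carlo sample of the output. The sketch below illustrates this with a purely notional surrogate in place of the PAGOSA simulation; the input distributions and the response function are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Notional surrogate for the simulation: jet tip velocity (km/s) as a simple
# function of uncertain inputs (purely illustrative, not PAGOSA or the JWL model).
def jet_tip_velocity(det_velocity, density, c1, b1):
    return 0.8 * det_velocity + 0.4 * density + 1e-3 * c1 - 0.2 * b1

n = 10_000
samples = jet_tip_velocity(
    det_velocity=rng.normal(7.98, 0.05, n),
    density=rng.normal(1.717, 0.01, n),
    c1=rng.normal(524.0, 10.0, n),
    b1=rng.normal(7.68, 0.1, n),
)

median = np.median(samples)
lo, hi = np.percentile(samples, [2.5, 97.5])   # 95% data range about the median
print(f"median = {median:.3f}, 95% range = [{lo:.3f}, {hi:.3f}] km/s")
```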

  16. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    NASA Astrophysics Data System (ADS)

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics, a qualitatively based modelling approach, as a possible analysis and simulation tool that bridges the gap between the social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as the limiting factors of systems dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology, and measurement theory is proposed as a ready and waiting solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.

  17. Determination of polycyclic aromatic hydrocarbons in kerosene and bio-kerosene soot.

    PubMed

    Andrade-Eiroa, Auréa; Leroy, Valérie; Dagaut, Philippe; Bedjanian, Yuri

    2010-03-01

    Here we report a new, efficient and reliable analytical methodology for the sensitive and selective quantification of polycyclic aromatic hydrocarbons (PAHs) in soot samples. The methodology developed is based on ultrasonic extraction of the soot-bound PAHs into small volumes of acetonitrile, purification of the extracts through C(18) solid phase extraction (SPE) cartridges, and analysis by reverse phase liquid chromatography (RPLC) with UV and fluorimetric detection. For the first time, we report the value of adapting the SPE procedure to the nature of the soot samples: extracts containing a high percentage of nonpolar material are best cleaned with acetone, whereas extracts poor in nonpolar compounds can be efficiently cleaned with methanol. The method was satisfactorily applied to kerosene and bio-kerosene soot from atmospheric open diffusion flames (pool fires) and premixed flames, achieving quantification and detection limits in the ng mg(-1) soot range and recoveries of about 90% for most of the PAHs studied. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  18. Culture, intangibles and metrics in environmental management.

    PubMed

    Satterfield, Terre; Gregory, Robin; Klain, Sarah; Roberts, Mere; Chan, Kai M

    2013-03-15

    The demand for better representation of cultural considerations in environmental management is increasingly evident. As two cases in point, ecosystem service approaches increasingly include cultural services, and resource planners recognize indigenous constituents and the cultural knowledge they hold as key to good environmental management. Accordingly, collaborations between anthropologists, planners, decision makers and biodiversity experts about the subject of culture are increasingly common, but also commonly fraught. Those whose expertise is culture often engage in such collaborations because they worry a practitioner from 'elsewhere' will employ a 'measure of culture' that is poorly or naively conceived. Those from an economic or biophysical training must grapple with the intangible properties of culture as they intersect with economic, biological or other material measures. This paper seeks to assist those who engage in collaborations to characterize cultural benefits or impacts relevant to decision-making in three ways: (i) considering the likely mindset of would-be collaborators; (ii) providing examples of tested approaches that might enable innovation; and (iii) characterizing the kinds of obstacles that are in principle solvable through methodological alternatives. We accomplish these tasks in part by examining three cases wherein culture was a critical variable in environmental decision making: risk management in New Zealand associated with Māori concerns about genetically modified organisms; cultural services to assist marine planning in coastal British Columbia; and a decision-making process involving a local First Nation about water flows in a regulated river in western Canada. We examine how 'culture' came to be manifest in each case, drawing from ethnographic and cultural-models interviews and using subjective metrics (recommended by theories of judgment and decision making) to express cultural concerns. We conclude that the characterization of cultural benefits and impacts is least amenable to methodological solution when prevailing cultural worldviews contain elements fundamentally at odds with efforts to quantify benefits/impacts, but that even in such cases some improvements are achievable if decision-makers are flexible regarding processes for consultation with community members and how quantification is structured. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  19. Quantification of penicillin G during labor and delivery by capillary electrophoresis.

    PubMed

    Thomas, Andrea; Ukpoma, Omon K; Inman, Jennifer A; Kaul, Anil K; Beeson, James H; Roberts, Kenneth P

    2008-04-24

    In this study, a capillary electrophoresis (CE) method was developed as a means to measure levels of penicillin G (PCN G) in Group B Streptococcus (GBS) positive pregnant women during labor and delivery. Volunteers for this developmental study were administered five million units of PCN G at the onset of labor. Urine, blood, and amniotic fluid samples were collected during labor and post delivery. Samples were semi-purified by solid-phase extraction (SPE) using Waters tC18 SepPak 3cc cartridges with a sodium phosphate/methanol step gradient for elution. Capillary electrophoresis or reversed-phase high-performance liquid chromatography (RP-HPLC) with diode-array absorbance detection were used to separate the samples in less than 30 min. Quantification was accomplished by establishing a calibration curve with a linear dynamic range. The tC18 SPE methodology provided substantial sample clean-up with high recovery yields of PCN G ( approximately 90%). It was found that SPE was critical for maintaining the integrity of the separation column when using RP-HPLC, but was not necessary for sample analysis by CE where no stationary phase is present. Quantification results ranged from millimolar concentrations of PCN G in maternal urine to micromolar concentrations in amniotic fluid. Serum and cord blood levels of PCN G were below quantification limits, which is likely due to the prolonged delay in sample collection after antibiotic administration. These results show that CE can serve as a simple and effective means to characterize the pharmacokinetic distribution of PCN G from mother to unborn fetus during labor and delivery. It is anticipated that similar methodologies have the potential to provide a quick, simple, and cost-effective means of monitoring the clinical efficacy of PCN G and other drugs during pregnancy.
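
    As a concrete illustration of the calibration-curve step described above, the sketch below fits a straight line to peak areas of PCN G standards and inverts it for an unknown. It is a minimal Python example; the concentrations, peak areas and the quantify() helper are invented for illustration and are not data or code from the study.

```python
import numpy as np

# Hypothetical calibration standards: PCN G concentration (mM) vs. CE peak area (arbitrary units).
conc_std = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0])      # mM
area_std = np.array([0.9, 2.1, 10.3, 20.8, 101.5, 204.0])  # a.u.

# Least-squares fit of the linear dynamic range: area = slope * conc + intercept.
slope, intercept = np.polyfit(conc_std, area_std, 1)

# Coefficient of determination as a quick linearity check.
pred = slope * conc_std + intercept
r2 = 1 - np.sum((area_std - pred) ** 2) / np.sum((area_std - area_std.mean()) ** 2)

def quantify(peak_area):
    """Invert the calibration line to estimate PCN G concentration (mM)."""
    return (peak_area - intercept) / slope

print(f"slope = {slope:.2f} a.u./mM, intercept = {intercept:.2f} a.u., R^2 = {r2:.4f}")
print(f"Unknown with peak area 55 a.u. -> {quantify(55.0):.2f} mM PCN G")
```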

  20. Big Data Analytics Methodology in the Financial Industry

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  1. Uncertainty quantification and validation of 3D lattice scaffolds for computer-aided biomedical applications.

    PubMed

    Gorguluarslan, Recep M; Choi, Seung-Kyum; Saldana, Christopher J

    2017-07-01

    A methodology is proposed for uncertainty quantification and validation to accurately predict the mechanical response of lattice structures used in the design of scaffolds. Effective structural properties of the scaffolds are characterized using a developed multi-level stochastic upscaling process that propagates the quantified uncertainties at strut level to the lattice structure level. To obtain realistic simulation models for the stochastic upscaling process and minimize the experimental cost, high-resolution finite element models of individual struts were reconstructed from the micro-CT scan images of lattice structures which are fabricated by selective laser melting. The upscaling method facilitates the process of determining homogenized strut properties to reduce the computational cost of the detailed simulation model for the scaffold. Bayesian Information Criterion is utilized to quantify the uncertainties with parametric distributions based on the statistical data obtained from the reconstructed strut models. A systematic validation approach that can minimize the experimental cost is also developed to assess the predictive capability of the stochastic upscaling method used at the strut level and lattice structure level. In comparison with physical compression test results, the proposed methodology of linking the uncertainty quantification with the multi-level stochastic upscaling method enabled an accurate prediction of the elastic behavior of the lattice structure with minimal experimental cost by accounting for the uncertainties induced by the additive manufacturing process. Copyright © 2017 Elsevier Ltd. All rights reserved.
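
    The Bayesian Information Criterion step described above can be illustrated with a short sketch: several candidate parametric distributions are fitted to strut-level data by maximum likelihood and the one with the lowest BIC is retained. The strut diameters below are synthetic and the candidate set is an assumption; this is not the authors' code.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic strut diameters (mm), standing in for values measured from micro-CT reconstructions.
diameters = rng.lognormal(mean=np.log(0.45), sigma=0.08, size=200)

candidates = {"normal": stats.norm, "lognormal": stats.lognorm, "gamma": stats.gamma}

results = {}
for name, dist in candidates.items():
    params = dist.fit(diameters)                      # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(diameters, *params))  # log-likelihood at the fitted parameters
    k = len(params)                                   # number of fitted parameters
    bic = k * np.log(len(diameters)) - 2.0 * loglik   # Bayesian Information Criterion
    results[name] = bic

best = min(results, key=results.get)
for name, bic in sorted(results.items(), key=lambda kv: kv[1]):
    print(f"{name:10s} BIC = {bic:8.1f}")
print(f"Selected distribution (lowest BIC): {best}")
```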

  2. Implicit Social Scaling from an Institutional Perspective

    ERIC Educational Resources Information Center

    D'Epifanio, Giulio

    2009-01-01

    The methodological question concerns constructing a cardinal social index, in order to assess performances of social agents, taking into account implicit political judgments. Based on the formal structure of a Choquet's expected utility, index construction demands quantification of levels of a meaningful ordinal indicator of overall performance.…

  3. Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 6. Technical Appendices.

    DOT National Transportation Integrated Search

    1979-09-01

    This last volume, includes five technical appendices which document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...

  4. Development of an Uncertainty Quantification Predictive Chemical Reaction Model for Syngas Combustion

    DOE PAGES

    Slavinskaya, N. A.; Abbasi, M.; Starcke, J. H.; ...

    2017-01-24

    An automated data-centric infrastructure, Process Informatics Model (PrIMe), was applied to validation and optimization of a syngas combustion model. The Bound-to-Bound Data Collaboration (B2BDC) module of PrIMe was employed to discover the limits of parameter modifications based on uncertainty quantification (UQ) and consistency analysis of the model–data system and experimental data, including shock-tube ignition delay times and laminar flame speeds. Existing syngas reaction models are reviewed, and the selected kinetic data are described in detail. Empirical rules were developed and applied to evaluate the uncertainty bounds of the literature experimental data. Here, the initial H2/CO reaction model, assembled from 73 reactions and 17 species, was subjected to a B2BDC analysis. For this purpose, a dataset was constructed that included a total of 167 experimental targets and 55 active model parameters. Consistency analysis of the composed dataset revealed disagreement between models and data. Further analysis suggested that removing 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. This dataset was subjected to a correlation analysis, which highlights possible directions for parameter modification and model improvement. Additionally, several methods of parameter optimization were applied, some of them unique to the B2BDC framework. The optimized models demonstrated improved agreement with experiments compared to the initially assembled model, and their predictions for experiments not included in the initial dataset (i.e., a blind prediction) were investigated. The results demonstrate benefits of applying the B2BDC methodology for developing predictive kinetic models.

  6. Automated Control of the Organic and Inorganic Composition of Aloe vera Extracts Using (1)H NMR Spectroscopy.

    PubMed

    Monakhova, Yulia B; Randel, Gabriele; Diehl, Bernd W K

    2016-09-01

    Recent classification of Aloe vera whole-leaf extract by the International Agency for Research on Cancer as a possible carcinogen to humans, as well as the continuing adulteration of A. vera's authentic material, have generated renewed interest in controlling A. vera. The existing NMR spectroscopic method for the analysis of A. vera, which is based on a routine developed at Spectral Service, was extended. Apart from aloverose, glucose, malic acid, lactic acid, citric acid, whole-leaf material (WLM), acetic acid, fumaric acid, sodium benzoate, and potassium sorbate, the quantification of Mg(2+), Ca(2+), and fructose is possible with the addition of a Cs-EDTA solution to the sample. The proposed methodology was automated and includes phasing, baseline correction, deconvolution (based on the Lorentzian function), integration, quantification, and reporting. The NMR method was applied to 41 A. vera preparations in the form of liquid A. vera juice and solid A. vera powder. The advantages of the new NMR methodology over the previous method are discussed. Correlation between the new and standard NMR methodologies was significant for aloverose, glucose, malic acid, lactic acid, citric acid, and WLM (P < 0.0001, R(2) = 0.99). NMR was found to be suitable for the automated simultaneous quantitative determination of 13 parameters in A. vera.
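
    A minimal sketch of the deconvolution and integration steps mentioned above: a sum of Lorentzian lines is fitted to a spectral region by least squares and each line is integrated analytically (area = amplitude × π × half-width). The chemical shifts, intensities and two-peak model are illustrative assumptions, not the routine developed at Spectral Service.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amp, x0, gamma):
    """Single Lorentzian line: amp is peak height, x0 the centre (ppm), gamma the half-width at half-maximum."""
    return amp * gamma**2 / ((x - x0)**2 + gamma**2)

def two_peaks(x, a1, c1, g1, a2, c2, g2):
    return lorentzian(x, a1, c1, g1) + lorentzian(x, a2, c2, g2)

# Synthetic 1H NMR region with two overlapping resonances plus noise (illustrative only).
ppm = np.linspace(4.0, 4.4, 800)
truth = two_peaks(ppm, 1.0, 4.12, 0.008, 0.6, 4.15, 0.010)
signal = truth + np.random.default_rng(1).normal(0, 0.01, ppm.size)

# Deconvolution: least-squares fit of the two-Lorentzian model to the spectrum.
p0 = [0.8, 4.11, 0.01, 0.5, 4.16, 0.01]
popt, _ = curve_fit(two_peaks, ppm, signal, p0=p0)

# Analytical area of a Lorentzian = amp * pi * gamma; areas are proportional to molar amounts.
for i, label in enumerate(["peak 1", "peak 2"]):
    amp, centre, gamma = popt[3 * i: 3 * i + 3]
    print(f"{label}: centre {centre:.3f} ppm, area {np.pi * amp * abs(gamma):.4f}")
```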

  7. A novel functionalisation process for glucose oxidase immobilisation in poly(methyl methacrylate) microchannels in a flow system for amperometric determinations.

    PubMed

    Cerqueira, Marcos Rodrigues Facchini; Grasseschi, Daniel; Matos, Renato Camargo; Angnes, Lucio

    2014-08-01

    Different materials like glass, silicon and poly(methyl methacrylate) (PMMA) are being used to immobilise enzymes in microchannels. PMMA shows advantages such as its low price, biocompatibility and attractive mechanical and chemical properties. Despite this, the introduction of reactive functional groups on PMMA is still problematic, either because of the complex chemistry or the extended reaction time involved. In this paper, a new methodology was developed to immobilise glucose oxidase (GOx) in PMMA microchannels, with the benefit of a rapid immobilisation process and a very simple route. The new procedure involves only two steps, based on the reaction of 5.0% (w/w) polyethyleneimine (PEI) with PMMA in a dimethyl sulphoxide medium, followed by the immobilisation of glucose oxidase using a solution containing 100 U of enzyme and 1.0% (v/v) glutaraldehyde. The reactors prepared in this way were evaluated in a flow system with amperometric detection (+0.60 V) based on the oxidation of the H2O2 produced by the reactor. The microreactor proposed here was able to work with high bioconversion and a throughput of 60 samples h(-1), with detection and quantification limits of 0.50 and 1.66 µmol L(-1), respectively. Michaelis-Menten parameters (Vmax and KM) were calculated as 449 ± 47.7 nmol min(-1) and 7.79 ± 0.98 mmol. Statistical evaluations were done to validate the proposed methodology. The content of glucose in natural and commercial coconut water samples was evaluated using the developed method. Comparison with spectrophotometric measurements showed that both methodologies have a very good correlation (t calculated, 0.05, 4 = 1.35
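
    A worked sketch of how the Michaelis-Menten parameters reported above can be extracted from initial-rate data by nonlinear least squares; the substrate concentrations and rates below are invented for illustration and do not reproduce the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    """Initial rate v as a function of substrate concentration s."""
    return vmax * s / (km + s)

# Hypothetical glucose concentrations (mmol/L) and measured initial rates (nmol/min).
s = np.array([0.5, 1, 2, 5, 10, 20, 40])
v = np.array([27, 51, 90, 175, 255, 320, 375])

popt, pcov = curve_fit(michaelis_menten, s, v, p0=[400, 8])
vmax, km = popt
vmax_err, km_err = np.sqrt(np.diag(pcov))

print(f"Vmax = {vmax:.0f} ± {vmax_err:.0f} nmol/min")
print(f"KM   = {km:.2f} ± {km_err:.2f} mmol/L")
```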

  8. Quantification of free circulating tumor DNA as a diagnostic marker for breast cancer.

    PubMed

    Catarino, Raquel; Ferreira, Maria M; Rodrigues, Helena; Coelho, Ana; Nogal, Ana; Sousa, Abreu; Medeiros, Rui

    2008-08-01

    The aim was to determine whether the amount of circulating DNA could discriminate between breast cancer patients and healthy individuals using a real-time PCR quantification methodology. Our standard protocol for quantification of cell-free plasma DNA involved 175 consecutive patients with breast cancer and 80 healthy controls. We found increased levels of circulating DNA in breast cancer patients compared to control individuals (105.2 vs. 77.06 ng/mL, p < 0.001). We also found statistically significant differences in circulating DNA amounts in patients before and after breast surgery (105.2 vs. 59.0 ng/mL, p = 0.001). Increased plasma cell-free DNA concentration was a strong risk factor for breast cancer, conferring an increased risk for the presence of this disease (OR, 12.32; 95% CI, 2.09-52.28; p < 0.001). Quantification of circulating DNA by real-time PCR may be a good and simple tool for detection of breast cancer, with potential for clinical applicability alongside other current methods used for monitoring the disease.

  9. Simple and rapid quantification of brominated vegetable oil in commercial soft drinks by LC–MS

    PubMed Central

    Chitranshi, Priyanka; da Costa, Gonçalo Gamboa

    2016-01-01

    We report here a simple and rapid method for the quantification of brominated vegetable oil (BVO) in soft drinks based upon liquid chromatography–electrospray ionization mass spectrometry. Unlike previously reported methods, this novel method does not require hydrolysis, extraction or derivatization steps, but rather a simple “dilute and shoot” sample preparation. The quantification is conducted by mass spectrometry in selected ion recording mode and a single point standard addition procedure. The method was validated in the range of 5–25 μg/mL BVO, encompassing the legal limit of 15 μg/mL established by the US FDA for fruit-flavored beverages in the US market. The method was characterized by excellent intra- and inter-assay accuracy (97.3–103.4%) and very low imprecision [0.5–3.6% (RSD)]. The direct nature of the quantification, simplicity, and excellent statistical performance of this methodology constitute clear advantages in relation to previously published methods for the analysis of BVO in soft drinks. PMID:27451219
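
    The single-point standard addition used for quantification reduces to simple arithmetic, assuming a linear detector response through the origin and negligible volume change on spiking. The sketch below illustrates it with invented peak areas and spike level; it is not the published procedure's code.

```python
# Minimal sketch of single-point standard addition, assuming a linear detector
# response through the origin and negligible volume change on spiking.
# All numbers are illustrative, not data from the study.

def standard_addition(signal_sample, signal_spiked, conc_added):
    """Return the analyte concentration in the unspiked sample.

    signal_sample : SIR peak area of the diluted soft drink alone
    signal_spiked : SIR peak area of the same solution after spiking
    conc_added    : concentration added by the spike (ug/mL in the final solution)
    """
    return conc_added * signal_sample / (signal_spiked - signal_sample)

area_sample = 1.20e5      # hypothetical peak area, unspiked
area_spiked = 2.05e5      # hypothetical peak area after adding 10 ug/mL BVO
c_bvo = standard_addition(area_sample, area_spiked, conc_added=10.0)

print(f"Estimated BVO concentration: {c_bvo:.1f} ug/mL "
      f"({'above' if c_bvo > 15 else 'within'} the 15 ug/mL US FDA limit)")
```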

  10. A novel method for the quantification of fatty infiltration in skeletal muscle.

    PubMed

    Biltz, Nicole K; Meyer, Gretchen A

    2017-01-10

    Fatty infiltration of the skeletal muscle is a common but poorly understood feature of many myopathies. It is best described in human muscle, where non-invasive imaging techniques and representative histology have been optimized to view and quantify infiltrating fat. However, human studies are limited in their ability to identify cellular and molecular mechanisms regulating fatty infiltration, a likely prerequisite to developing targeted interventions. As mechanistic investigations move to small animals, studies may benefit from new or adapted imaging tools optimized for high resolution and whole muscle quantification. Here, we describe a novel method to evaluate fatty infiltration, developed for use with mouse muscle. In this methodology, muscle cellular membranes and proteins are removed via decellularization, but fatty infiltrate lipid is spared, trapped in its native distribution in a transparent extracellular matrix construct. This lipid can then be stained with visible or fluorescent dyes and imaged. We present three methods to stain and evaluate lipid in decellularized muscles which can be used individually or combined: (1) qualitative visualization of the amount and 3D spatial distribution of fatty infiltration using visible lipid soluble dye Oil Red O (ORO), (2) quantitative analysis of individual lipid droplet metrics (e.g., volume) via confocal imaging of fluorescent lipid soluble dye boron-dipyrromethene (BODIPY), and (3) quantitative analysis of total lipid content by optical density reading of extracted stained lipid. This methodology was validated by comparing glycerol-induced fatty infiltration between two commonly used mouse strains: 129S1/SvlmJ (129S1) and C57BL/6J (BL/6J). All three methods were able to detect a significant increase in fatty infiltrate volume in the 129S1 muscle compared with that in BL/6J, and methods 1 and 2 additionally described a difference in the distribution of fatty infiltrate, indicating susceptibility to glycerol-induced fatty infiltration is strain-specific. With more mechanistic studies of fatty infiltration moving to small animal models, having an alternative to expensive non-invasive imaging techniques and selective representative histology will be beneficial. In this work, we present a method that can quantify both individual adipocyte lipids and whole muscle total fatty infiltrate lipid.

  11. Recommendations for benefit-risk assessment methodologies and visual representations.

    PubMed

    Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul; Goginsky, Alesia; Chan, Edmond; Downey, Gerald F; Hallgreen, Christine E; Hockley, Kimberley S; Juhaeri, Juhaeri; Lieftucht, Alfons; Metcalf, Marilyn A; Noel, Rebecca A; Phillips, Lawrence D; Ashby, Deborah; Micaleff, Alain

    2016-03-01

    The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. A general pathway through the case studies was evident, with various classes of methodologies having roles to play at different stages. Descriptive and quantitative frameworks were widely used throughout to structure problems, with other methods such as metrics, estimation techniques and elicitation techniques providing ways to incorporate technical or numerical data from various sources. Similarly, tree diagrams and effects tables were universally adopted, with other visualisations available to suit specific methodologies or tasks as required. Every assessment was found to follow five broad stages: (i) Planning, (ii) Evidence gathering and data preparation, (iii) Analysis, (iv) Exploration and (v) Conclusion and dissemination. Adopting formal, structured approaches to benefit-risk assessment was feasible in real-world problems and facilitated clear, transparent decision-making. Prior to this work, no extensive practical application and appraisal of methodologies had been conducted using real-world case examples, leaving users with limited knowledge of their usefulness in the real world. The practical guidance provided here takes us one step closer to a harmonised approach to benefit-risk assessment from multiple perspectives. Copyright © 2016 John Wiley & Sons, Ltd.

  12. Model Uncertainty Quantification Methods In Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is in model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  13. Microwave-assisted extraction of green coffee oil and quantification of diterpenes by HPLC.

    PubMed

    Tsukui, A; Santos Júnior, H M; Oigman, S S; de Souza, R O M A; Bizzo, H R; Rezende, C M

    2014-12-01

    The microwave-assisted extraction (MAE) of oil from 13 different green coffee beans (Coffea arabica L.) was compared to Soxhlet extraction. The full factorial design applied to the MAE time and temperature parameters allowed a fast and mild methodology (10 min at 45°C) to be developed, compared to a 4 h Soxhlet extraction. The quantification of the diterpenes cafestol and kahweol present in the coffee oil was performed by HPLC/UV and showed satisfactory linearity (R(2) = 0.9979), precision (CV 3.7%), recovery (<93%), limit of detection (0.0130 mg/mL), and limit of quantification (0.0406 mg/mL). The space-time yield calculated on the diterpene content of sample AT1 (Arabica green coffee) was six times higher than that of the traditional Soxhlet method. Copyright © 2014 Elsevier Ltd. All rights reserved.
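
    The linearity, detection and quantification limits quoted above can be illustrated with a short calibration sketch using the common ICH-style estimates LOD = 3.3·σ/slope and LOQ = 10·σ/slope, where σ is the residual standard deviation of the fit. The concentrations and peak areas are invented, and the paper may have used a different convention.

```python
import numpy as np

# Hypothetical HPLC/UV calibration for a diterpene standard: concentration (mg/mL) vs. peak area.
conc = np.array([0.05, 0.10, 0.25, 0.50, 1.00, 2.00])
area = np.array([410, 830, 2010, 4050, 8120, 16230])

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                      # residual standard deviation of the fit
r2 = 1 - (residuals**2).sum() / ((area - area.mean())**2).sum()

# ICH-style estimates; the original paper may have used a different convention.
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope

print(f"R^2 = {r2:.4f}")
print(f"LOD = {lod:.4f} mg/mL, LOQ = {loq:.4f} mg/mL")
```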

  14. Ferromagnetic resonance for the quantification of superparamagnetic iron oxide nanoparticles in biological materials

    PubMed Central

    Gamarra, Lionel F; daCosta-Filho, Antonio J; Mamani, Javier B; de Cassia Ruiz, Rita; Pavon, Lorena F; Sibov, Tatiana T; Vieira, Ernanni D; Silva, André C; Pontuschka, Walter M; Amaro, Edson

    2010-01-01

    The aim of the present work is the presentation of a quantification methodology for the control of the amount of superparamagnetic iron oxide nanoparticles (SPIONs) administered in biological materials by means of the ferromagnetic resonance technique (FMR) applied to studies both in vivo and in vitro. The in vivo study consisted in the analysis of the elimination and biodistribution kinetics of SPIONs after intravenous administration in Wistar rats. The results were corroborated by X-ray fluorescence. For the in vitro study, a quantitative analysis of the concentration of SPIONs bound to the specific AC133 monoclonal antibodies was carried out in order to detect the expression of the antigenic epitopes (CD133) in stem cells from human umbilical cord blood. In both studies FMR has proven to be an efficient technique for the SPIONs quantification per volume unit (in vivo) or per labeled cell (in vitro). PMID:20463936

  15. Benefit-cost methodology study with example application of the use of wind generators

    NASA Technical Reports Server (NTRS)

    Zimmer, R. P.; Justus, C. G.; Mason, R. M.; Robinette, S. L.; Sassone, P. G.; Schaffer, W. A.

    1975-01-01

    An example application for cost-benefit methodology is presented for the use of wind generators. The approach adopted for the example application consisted of the following activities: (1) surveying of the available wind data and wind power system information, (2) developing models which quantitatively described wind distributions, wind power systems, and cost-benefit differences between conventional systems and wind power systems, and (3) applying the cost-benefit methodology to compare a conventional electrical energy generation system with systems which included wind power generators. Wind speed distribution data were obtained from sites throughout the contiguous United States and were used to compute plant factor contours shown on an annual and seasonal basis. Plant factor values (ratio of average output power to rated power) are found to be as high as 0.6 (on an annual average basis) in portions of the central U. S. and in sections of the New England coastal area. Two types of wind power systems were selected for the application of the cost-benefit methodology. A cost-benefit model was designed and implemented on a computer to establish a practical tool for studying the relative costs and benefits of wind power systems under a variety of conditions and to efficiently and effectively perform associated sensitivity analyses.
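
    The plant factor defined above (average output power divided by rated power) can be estimated from a site's wind-speed distribution and a turbine power curve. The sketch below assumes a Weibull wind climate and an idealised cubic power curve; all parameters are illustrative, not values from the study.

```python
import numpy as np

# Plant factor = (average output power) / (rated power), estimated by averaging an
# idealised turbine power curve over a Weibull wind-speed distribution.
# Turbine and Weibull parameters below are illustrative assumptions.

def power_curve(v, rated_kw=100.0, v_cut_in=3.5, v_rated=12.0, v_cut_out=25.0):
    """Idealised output (kW): cubic rise between cut-in and rated speed, constant to cut-out."""
    p = np.zeros_like(v)
    rising = (v >= v_cut_in) & (v < v_rated)
    p[rising] = rated_kw * (v[rising]**3 - v_cut_in**3) / (v_rated**3 - v_cut_in**3)
    p[(v >= v_rated) & (v <= v_cut_out)] = rated_kw
    return p

def plant_factor(weibull_k=2.0, weibull_c=7.5, rated_kw=100.0, n=200_000, seed=0):
    """Monte Carlo estimate of the plant factor for a site with the given Weibull wind climate."""
    rng = np.random.default_rng(seed)
    v = weibull_c * rng.weibull(weibull_k, size=n)   # wind-speed samples (m/s)
    return power_curve(v, rated_kw).mean() / rated_kw

print(f"Estimated plant factor: {plant_factor():.2f}")
```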

  16. Quantitative evaluation of geodiversity: development of methodological procedures with application to territorial management

    NASA Astrophysics Data System (ADS)

    Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.

    2012-04-01

    Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge into practical and effective territorial management. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment, but also for social and economic purposes. The quantification of geodiversity is an important step in this process, but few researchers have yet invested in the development of a proper methodology. The assessment methodologies published so far focus mainly on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need for further research. The current work aims at the development of an effective methodology for the assessment of as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils), based on GIS routines. The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. The goal is to establish proper procedures for assessing geodiversity at different scales and to produce maps with the spatial representation of the geodiversity index, which could be an invaluable contribution to land-use management.

  17. Cost-benefit analysis of space technology

    NASA Technical Reports Server (NTRS)

    Hein, G. F.; Stevenson, S. M.; Sivo, J. N.

    1976-01-01

    A discussion of the implications and problems associated with the use of cost-benefit techniques is presented. Knowledge of these problems is useful in the structure of a decision making process. A methodology of cost-benefit analysis is presented for the evaluation of space technology. The use of the methodology is demonstrated with an evaluation of ion thrusters for north-south stationkeeping aboard geosynchronous communication satellites. A critique of the concept of consumers surplus for measuring benefits is also presented.

  18. Nuclear Data Uncertainty Quantification: Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  19. Selected methods for quantification of community exposure to aircraft noise

    NASA Technical Reports Server (NTRS)

    Edge, P. M., Jr.; Cawthorn, J. M.

    1976-01-01

    A review of the state-of-the-art for the quantification of community exposure to aircraft noise is presented. Physical aspects, people response considerations, and practicalities of useful application of scales of measure are included. Historical background up through the current technology is briefly presented. The developments of both single-event and multiple-event scales are covered. Selective choice is made of scales currently in the forefront of interest and recommended methodology is presented for use in computer programing to translate aircraft noise data into predictions of community noise exposure. Brief consideration is given to future programing developments and to supportive research needs.

  20. Advances in the identification of adulterated cereals and cereal products

    USDA-ARS?s Scientific Manuscript database

    This book chapter addresses the most common occurrences of adulteration in the cereal grains, the regulations in place by countries (such as the United States, United Kingdom, Italy, India, and the European Union), and the methodologies by which detection and quantification of the contaminant are ma...

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curry, J J; Gallagher, D W; Modarres, M

    Appendices are presented concerning isolation condenser makeup; vapor suppression system; station air system; reactor building closed cooling water system; turbine building secondary closed water system; service water system; emergency service water system; fire protection system; emergency ac power; dc power system; event probability estimation; methodology of accident sequence quantification; and assignment of dominant sequences to release categories.

  2. An Analysis of Information Asset Valuation (IAV) Quantification Methodology for Application with Cyber Information Mission Impact Assessment (CIMIA)

    DTIC Science & Technology

    2008-03-01


  3. Inverse models: A necessary next step in ground-water modeling

    USGS Publications Warehouse

    Poeter, E.P.; Hill, M.C.

    1997-01-01

    Inverse models using, for example, nonlinear least-squares regression, provide capabilities that help modelers take full advantage of the insight available from ground-water models. However, lack of information about the requirements and benefits of inverse models is an obstacle to their widespread use. This paper presents a simple ground-water flow problem to illustrate the requirements and benefits of the nonlinear least-squares regression method of inverse modeling and discusses how these attributes apply to field problems. The benefits of inverse modeling include: (1) expedited determination of best fit parameter values; (2) quantification of the (a) quality of calibration, (b) data shortcomings and needs, and (c) confidence limits on parameter estimates and predictions; and (3) identification of issues that are easily overlooked during nonautomated calibration.
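
    A minimal illustration of nonlinear least-squares inverse modeling in a ground-water setting: transmissivity and storativity of a confined aquifer are estimated from drawdown observations using the Theis solution, with approximate confidence limits taken from the parameter covariance. The pumping-test values are synthetic, and this is not the example problem from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import exp1

Q = 0.01   # pumping rate (m^3/s), assumed
r = 50.0   # distance from pumping well to observation well (m), assumed

def theis_drawdown(t, T, S):
    """Theis drawdown s(t) = Q/(4*pi*T) * W(u), with u = r^2*S/(4*T*t)."""
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

# Synthetic observations: "true" T = 5e-3 m^2/s, S = 2e-4, plus measurement noise.
t_obs = np.logspace(2, 5, 25)                                    # seconds
rng = np.random.default_rng(7)
s_obs = theis_drawdown(t_obs, 5e-3, 2e-4) + rng.normal(0, 0.005, t_obs.size)

# Nonlinear least-squares regression, as in the inverse-modeling approach described above.
popt, pcov = curve_fit(theis_drawdown, t_obs, s_obs, p0=[1e-3, 1e-4])
perr = np.sqrt(np.diag(pcov))

for name, val, err in zip(["T (m^2/s)", "S (-)"], popt, perr):
    print(f"{name}: {val:.3e} ± {1.96 * err:.3e} (approx. 95% interval)")
```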

  4. Significant issues in proof testing: A critical appraisal

    NASA Technical Reports Server (NTRS)

    Chell, G. G.; Mcclung, R. C.; Russell, D. A.; Chang, K. J.; Donnelly, B.

    1994-01-01

    Issues which impact on the interpretation and quantification of proof test benefits are reviewed. The importance of each issue in contributing to the extra quality assurance conferred by proof testing components is discussed, particularly with respect to the application of advanced fracture mechanics concepts to enhance the flaw screening capability of a proof test analysis. Items covered include the role in proof testing of elastic-plastic fracture mechanics, ductile instability analysis, deterministic versus probabilistic analysis, single versus multiple cycle proof testing, and non-destructive examination (NDE). The effects of proof testing on subsequent service life are reviewed, particularly with regard to stress redistribution and changes in fracture behavior resulting from the overload. The importance of proof test conditions are also addressed, covering aspects related to test temperature, simulation of service environments, test media and the application of real-time NDE. The role of each issue in a proof test methodology is assessed with respect to its ability to: promote proof test practice to a state-of-the-art; aid optimization of proof test design; and increase awareness and understanding of outstanding issues.

  5. A method for the analysis of the benefits and costs for aeronautical research and technology

    NASA Technical Reports Server (NTRS)

    Williams, L. J.; Hoy, H. H.; Anderson, J. L.

    1978-01-01

    A relatively simple, consistent, and reasonable methodology for performing cost-benefit analyses which can be used to guide, justify, and explain investments in aeronautical research and technology is presented. The elements of this methodology (labeled ABC-ART for the Analysis of the Benefits and Costs of Aeronautical Research and Technology) include estimation of aircraft markets; manufacturer costs and return on investment versus aircraft price; airline costs and return on investment versus aircraft price and passenger yield; and potential system benefits--fuel savings, cost savings, and noise reduction. The application of this methodology is explained using the introduction of an advanced turboprop powered transport aircraft in the medium range market in 1978 as an example.

  6. Methodological Aspects Regarding The Organizational Stress Analysis

    NASA Astrophysics Data System (ADS)

    Irimie, Sabina; Pricope (Muntean), Luminiţa Doina; Pricope, Sorin; Irimie, Sabin Ioan

    2015-07-01

    This work presents research on the methodology of occupational stress analysis in the educational field, as part of a larger study. The objectives are to identify significant relations between stressors and effects, that is, differences between indicators of occupational stress among teaching staff in primary and middle (gymnasium) schools, taking into account each specific condition: the institution as an entity, the working community, the discipline being taught, the geographic and administrative district (urban/rural), and the quantification of the stress level.

  7. An Imager Gaussian Process Machine Learning Methodology for Cloud Thermodynamic Phase classification

    NASA Astrophysics Data System (ADS)

    Marchant, B.; Platnick, S. E.; Meyer, K.

    2017-12-01

    The determination of cloud thermodynamic phase from MODIS and VIIRS instruments is an important first step in cloud optical retrievals, since ice and liquid clouds have different optical properties. To continue improving the cloud thermodynamic phase classification algorithm, a machine-learning approach, based on Gaussian processes, has been developed. The new proposed methodology provides cloud phase uncertainty quantification and improves the algorithm portability between MODIS and VIIRS. We will present new results, through comparisons between MODIS and CALIOP v4, and for VIIRS as well.
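
    A hedged sketch of a Gaussian-process cloud-phase classifier: a scikit-learn GaussianProcessClassifier with an RBF kernel is trained on two stand-in observables and its predicted class probabilities serve as a per-pixel uncertainty measure. The features, labels and kernel choice are assumptions for illustration, not the MODIS/VIIRS algorithm itself.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

# Toy stand-in features for two imager observables (e.g. a cloud-top temperature proxy
# and a spectral ratio); labels 0 = liquid, 1 = ice. Synthetic data, for illustration only.
rng = np.random.default_rng(3)
n = 200
liquid = np.column_stack([rng.normal(275, 6, n), rng.normal(0.9, 0.05, n)])
ice = np.column_stack([rng.normal(235, 8, n), rng.normal(0.6, 0.05, n)])
X = np.vstack([liquid, ice])
y = np.concatenate([np.zeros(n), np.ones(n)])

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=[10.0, 0.1])).fit(X, y)

# The probabilistic output doubles as an uncertainty estimate for each pixel.
pixels = np.array([[250.0, 0.75], [270.0, 0.88]])
for x, p_ice in zip(pixels, gpc.predict_proba(pixels)[:, 1]):
    print(f"pixel {x}: P(ice) = {p_ice:.2f}")
```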

  8. Method validation for methanol quantification present in working places

    NASA Astrophysics Data System (ADS)

    Muna, E. D. M.; Bizarri, C. H. B.; Maciel, J. R. M.; da Rocha, G. P.; de Araújo, I. O.

    2015-01-01

    Given the widespread use of methanol in different industry sectors and the high toxicity associated with this substance, an analytical method is needed that can determine methanol levels in workplace air sensitively, precisely and accurately. Based on the methodology established by the National Institute for Occupational Safety and Health (NIOSH), a methodology for the determination of methanol collected in silica gel tubes was validated; its effectiveness was demonstrated through participation in the international collaborative program sponsored by the American Industrial Hygiene Association (AIHA).

  9. Are You Making an Impact? Evaluating the Population Health Impact of Community Benefit Programs.

    PubMed

    Rains, Catherine M; Todd, Greta; Kozma, Nicole; Goodman, Melody S

    The Patient Protection and Affordable Care Act includes a change to the IRS 990 Schedule H, requiring nonprofit hospitals to submit a community health needs assessment every 3 years. Such health care entities are challenged to evaluate the effectiveness of community benefit programs addressing the health needs identified. In an effort to determine the population health impact of community benefit programs in 1 hospital outreach department, researchers and staff conducted an impact evaluation to develop priority areas and overarching goals along with program- and department-level objectives. The longitudinal impact evaluation study design consists of retrospective and prospective secondary data analyses. As an urban pediatric hospital, St Louis Children's Hospital provides an array of community benefit programs to the surrounding community. Hospital staff and researchers came together to form an evaluation team. Data from program evaluation and administrative data for analysis were provided by hospital staff. Impact scores were calculated by scoring objectives as met or unmet and averaged across goals to create impact scores that measure how closely programs meet the overarching departmental mission and goals. Over the 4-year period, there is an increasing trend in program-specific impact scores across all programs except one, Healthy Kids Express Asthma, which had a slight decrease in year 4 only. Current work in measuring and assessing the population health impact of community benefit programs is mostly focused on quantifying dollars invested into community benefit work rather than measuring the quality and impact of services. This article provides a methodology for measuring population health impact of community benefit programs that can be used to evaluate the effort of hospitals in providing community benefit. This is particularly relevant in our changing health care climate, as hospitals are being asked to justify community benefit and make meaningful contributions to population health. The Patient Protection and Affordable Care Act includes a change to the IRS 990 Schedule H, requiring nonprofit hospitals to submit a community health needs assessment every 3 years, and requires evaluation of program effectiveness; yet, it does not require any quantification of the impact of community benefit programs. The IRS Schedule H 990 policies could be strengthened by requiring an impact evaluation such as outlined in this article. As hospitals are being asked to justify community benefit and make meaningful contributions to population health, impact evaluations can be utilized to demonstrate the cumulative community benefit of programs and assess population health impact of community benefit programs.
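
    The impact-score logic described above (objectives scored as met or unmet, averaged within goals and then across goals) can be expressed in a few lines. The sketch below uses invented goals and objective outcomes; only the Healthy Kids Express Asthma program name comes from the text.

```python
# Minimal sketch of the scoring logic: each objective is marked met (1) or unmet (0),
# objectives are averaged within each goal, and goal scores are averaged to give a
# program-level impact score. Goal names and outcomes are invented placeholders.

program_objectives = {
    "Healthy Kids Express Asthma": {
        "Reach priority ZIP codes": [1, 1, 0],       # 1 = objective met, 0 = unmet
        "Improve asthma control":   [1, 0],
        "Reduce ED utilisation":    [1],
    },
    "Dental Outreach (hypothetical)": {
        "Reach priority ZIP codes":  [1, 1],
        "Increase sealant coverage": [0, 1, 1],
    },
}

def impact_score(goals):
    """Average the met/unmet rate across goals (each goal weighted equally)."""
    goal_scores = [sum(objs) / len(objs) for objs in goals.values()]
    return sum(goal_scores) / len(goal_scores)

for program, goals in program_objectives.items():
    print(f"{program}: impact score = {impact_score(goals):.2f}")
```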

  10. Analysis of host-cell proteins in biotherapeutic proteins by comprehensive online two-dimensional liquid chromatography/mass spectrometry

    PubMed Central

    Xenopoulos, Alex; Fadgen, Keith; Murphy, Jim; Skilton, St. John; Prentice, Holly; Stapels, Martha

    2012-01-01

    Assays for identification and quantification of host-cell proteins (HCPs) in biotherapeutic proteins over 5 orders of magnitude in concentration are presented. The HCP assays consist of two types: HCP identification using comprehensive online two-dimensional liquid chromatography coupled with high resolution mass spectrometry (2D-LC/MS), followed by high-throughput HCP quantification by liquid chromatography, multiple reaction monitoring (LC-MRM). The former is described as a “discovery” assay, the latter as a “monitoring” assay. Purified biotherapeutic proteins (e.g., monoclonal antibodies) were digested with trypsin after reduction and alkylation, and the digests were fractionated using reversed-phase (RP) chromatography at high pH (pH 10) by a step gradient in the first dimension, followed by a high-resolution separation at low pH (pH 2.5) in the second dimension. As peptides eluted from the second dimension, a quadrupole time-of-flight mass spectrometer was used to detect the peptides and their fragments simultaneously by alternating the collision cell energy between a low and an elevated energy (MSE methodology). The MSE data was used to identify and quantify the proteins in the mixture using a proven label-free quantification technique (“Hi3” method). The same data set was mined to subsequently develop target peptides and transitions for monitoring the concentration of selected HCPs on a triple quadrupole mass spectrometer in a high-throughput manner (20 min LC-MRM analysis). This analytical methodology was applied to the identification and quantification of low-abundance HCPs in six samples of PTG1, a recombinant chimeric anti-phosphotyrosine monoclonal antibody (mAb). Thirty three HCPs were identified in total from the PTG1 samples among which 21 HCP isoforms were selected for MRM monitoring. The absolute quantification of three selected HCPs was undertaken on two different LC-MRM platforms after spiking isotopically labeled peptides in the samples. Finally, the MRM quantitation results were compared with TOF-based quantification based on the Hi3 peptides, and the TOF and MRM data sets correlated reasonably well. The results show that the assays provide detailed valuable information to understand the relative contributions of purification schemes to the nature and concentrations of HCP impurities in biopharmaceutical samples, and the assays can be used as generic methods for HCP analysis in the biopharmaceutical industry. PMID:22327428
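
    A minimal sketch of the "Hi3" label-free quantification referred to above: the mean signal of a protein's three most intense tryptic peptides is taken as proportional to its molar amount and calibrated against a spiked standard of known quantity. The intensities, protein labels and spike level are invented for illustration.

```python
import numpy as np

def hi3(intensities):
    """Mean of the three most intense peptide signals for one protein."""
    top3 = sorted(intensities, reverse=True)[:3]
    return float(np.mean(top3))

# Spiked standard protein of known amount (50 fmol on column, hypothetical).
standard_fmol = 50.0
standard_hi3 = hi3([8.1e5, 7.4e5, 6.9e5, 2.2e5])
response = standard_fmol / standard_hi3          # fmol per unit of Hi3 signal

# Hypothetical host-cell proteins with their peptide-level MS signals.
host_cell_proteins = {
    "HCP_A (hypothetical)": [3.0e4, 2.6e4, 2.1e4, 9.0e3],
    "HCP_B (hypothetical)": [9.5e3, 7.2e3, 5.1e3],
}

for name, peptide_signals in host_cell_proteins.items():
    amount = hi3(peptide_signals) * response
    print(f"{name}: approx. {amount:.1f} fmol on column")
```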

  11. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    The Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA's National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. The accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlation with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided unprecedented sensitivity toward GBNs and enhances conventional toxicology research by providing a direct correlation between uptake of GBNs at the single-cell level and cell viability status.

  12. Benefit-cost analysis of addiction treatment: methodological guidelines and empirical application using the DATCAP and ASI.

    PubMed

    French, Michael T; Salomé, Helena J; Sindelar, Jody L; McLellan, A Thomas

    2002-04-01

    To provide detailed methodological guidelines for using the Drug Abuse Treatment Cost Analysis Program (DATCAP) and Addiction Severity Index (ASI) in a benefit-cost analysis of addiction treatment. A representative benefit-cost analysis of three outpatient programs was conducted to demonstrate the feasibility and value of the methodological guidelines. Procedures are outlined for using resource use and cost data collected with the DATCAP. Techniques are described for converting outcome measures from the ASI to economic (dollar) benefits of treatment. Finally, principles are advanced for conducting a benefit-cost analysis and a sensitivity analysis of the estimates. The DATCAP was administered at three outpatient drug-free programs in Philadelphia, PA, for 2 consecutive fiscal years (1996 and 1997). The ASI was administered to a sample of 178 treatment clients at treatment entry and at 7-months postadmission. The DATCAP and ASI appear to have significant potential for contributing to an economic evaluation of addiction treatment. The benefit-cost analysis and subsequent sensitivity analysis all showed that total economic benefit was greater than total economic cost at the three outpatient programs, but this representative application is meant to stimulate future economic research rather than justifying treatment per se. This study used previously validated, research-proven instruments and methods to perform a practical benefit-cost analysis of real-world treatment programs. The study demonstrates one way to combine economic and clinical data and offers a methodological foundation for future economic evaluations of addiction treatment.
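
    The benefit-cost arithmetic described above can be sketched briefly: monetised per-client benefits derived from outcome measures are compared with per-client treatment costs, and the benefit estimates are varied as a simple one-way sensitivity check. All dollar figures and program labels below are invented placeholders, not results from the study.

```python
# Minimal sketch of the benefit-cost comparison: per-client economic benefits (derived from
# outcome domains) are set against the per-client cost of treatment, with a one-way
# sensitivity analysis on the benefit estimates. All values are illustrative assumptions.

programs = {
    "Outpatient A": {"cost_per_client": 2_200, "benefit_per_client": 6_800},
    "Outpatient B": {"cost_per_client": 1_900, "benefit_per_client": 4_300},
    "Outpatient C": {"cost_per_client": 2_600, "benefit_per_client": 5_100},
}

def summarise(cost, benefit):
    """Return net benefit and benefit-cost ratio."""
    return benefit - cost, benefit / cost

for name, p in programs.items():
    for scale in (0.75, 1.00, 1.25):          # sensitivity: benefits -25%, baseline, +25%
        net, ratio = summarise(p["cost_per_client"], p["benefit_per_client"] * scale)
        print(f"{name} (benefits x{scale:.2f}): net = ${net:,.0f}, B/C = {ratio:.2f}")
```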

  13. From California dreaming to California data: Challenging historic models for landfill CH4 emissions

    USDA-ARS?s Scientific Manuscript database

    Improved quantification of diverse CH4 sources at the urban scale is needed to guide local greenhouse gas (GHG) mitigation strategies in the Anthropocene. Herein, we focus on landfill CH4 emissions in California, challenging the current IPCC methodology which focuses on a climate dependency for land...

  14. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    ERIC Educational Resources Information Center

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose graph theoretic approach for quantification of quality-enabled…

  15. The "Discernment of a Human Face" in Research: A Reaction to Van Hesteren's Human Science and Counselling.

    ERIC Educational Resources Information Center

    Knowles, Don

    1986-01-01

    Responds to Van Hesteren's advocacy of a "human science" orientation, using constructs and analytical methods from phenomenology to enhance methodology. Approves of a philosophy of science and of a self-perspective. Questions the need for complex terminology, Van Hesteren's definition of what constitutes research, quantification, and…

  16. Reduced Order Modeling Methods for Turbomachinery Design

    DTIC Science & Technology

    2009-03-01


  17. Analysis of biologically active oxyprenylated phenylpropanoids in Tea tree oil using selective solid-phase extraction with UHPLC-PDA detection.

    PubMed

    Scotti, Luca; Genovese, Salvatore; Bucciarelli, Tonino; Martini, Filippo; Epifano, Francesco; Fiorito, Serena; Preziuso, Francesca; Taddeo, Vito Alessandro

    2018-05-30

    An efficient analytical strategy, based on different extraction methods for the biologically active, naturally occurring oxyprenylated umbelliferone and ferulic acid derivatives 7-isopentenyloxycoumarin, auraptene, umbelliprenin, boropinic acid, and 4'-geranyloxyferulic acid, and their quantification in Tea tree oil by UHPLC with spectrophotometric (UV/Vis) detection, is reported. Adsorption of the pure oil on Al2O3 (Brockmann activity II), followed by washing the resulting solid with MeOH and treating it with CH2Cl2, proved to be the best extraction methodology in terms of yields of oxyprenylated secondary metabolites. Among the five O-prenylphenylpropanoids under investigation, auraptene and umbelliprenin were never detected, while 4'-geranyloxyferulic acid was the most abundant compound in all three extraction methods employed. The UHPLC analytical methodology set up in the present study proved to be an effective and versatile technique for the simultaneous characterization and quantification of prenyloxyphenylpropanoids in Tea tree oil and is applicable to other complex matrices from the plant kingdom. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Caffeine as an indicator for the quantification of untreated wastewater in karst systems.

    PubMed

    Hillebrand, Olav; Nödler, Karsten; Licha, Tobias; Sauter, Martin; Geyer, Tobias

    2012-02-01

    Contamination from untreated wastewater leakage and related bacterial contamination poses a threat to drinking water quality. However, a quantification of the magnitude of leakage is difficult. The objective of this work is to provide a highly sensitive methodology for the estimation of the mass of untreated wastewater entering karst aquifers with rapid recharge. For this purpose a balance approach is adapted. It is based on the mass flow of caffeine in spring water, the load of caffeine in untreated wastewater and the daily water consumption per person in a spring catchment area. Caffeine is a source-specific indicator for wastewater, consumed and discharged in quantities allowing detection in a karst spring. The methodology was applied to estimate the amount of leaking and infiltrating wastewater to a well investigated karst aquifer on a daily basis. The calculated mean volume of untreated wastewater entering the aquifer was found to be 2.2 ± 0.5 m(3) d(-1) (undiluted wastewater). It corresponds to approximately 0.4% of the total amount of wastewater within the spring catchment. Copyright © 2011 Elsevier Ltd. All rights reserved.
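
    A back-of-the-envelope version of the balance approach described above: the caffeine mass flux at the spring is divided by the caffeine concentration expected in undiluted wastewater, itself the per-capita caffeine load divided by per-capita water use. All input values in the sketch are illustrative assumptions, not the figures from the study.

```python
# Caffeine balance for estimating untreated wastewater input to a karst spring.
# All numbers below are illustrative assumptions.

spring_discharge_l_per_s = 300.0        # karst spring discharge
caffeine_spring_ng_per_l = 15.0         # caffeine concentration measured at the spring

caffeine_load_mg_per_person_day = 16.0  # assumed caffeine load reaching the sewer per person
water_use_l_per_person_day = 130.0      # assumed daily water consumption per person

# Caffeine mass flux at the spring (mg per day).
mass_flux_mg_day = spring_discharge_l_per_s * 86_400 * caffeine_spring_ng_per_l * 1e-6

# Caffeine concentration in undiluted wastewater (mg per litre).
c_wastewater_mg_l = caffeine_load_mg_per_person_day / water_use_l_per_person_day

# Volume of undiluted wastewater required to explain the observed flux (m^3 per day).
v_wastewater_m3_day = mass_flux_mg_day / c_wastewater_mg_l / 1000.0

print(f"Estimated untreated wastewater input: {v_wastewater_m3_day:.1f} m^3/day")
```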

  19. Quantification of Anti-Aggregation Activity of Chaperones: A Test-System Based on Dithiothreitol-Induced Aggregation of Bovine Serum Albumin

    PubMed Central

    Borzova, Vera A.; Markossian, Kira A.; Kara, Dmitriy A.; Chebotareva, Natalia A.; Makeeva, Valentina F.; Poliansky, Nikolay B.; Muranov, Konstantin O.; Kurganov, Boris I.

    2013-01-01

    The methodology for quantification of the anti-aggregation activity of protein and chemical chaperones has been elaborated. The applicability of this methodology was demonstrated using a test-system based on dithiothreitol-induced aggregation of bovine serum albumin at 45°C as an example. Methods for calculating the initial rate of bovine serum albumin aggregation (v agg) have been discussed. The comparison of the dependences of v agg on concentrations of intact and cross-linked α-crystallin allowed us to make a conclusion that a non-linear character of the dependence of v agg on concentration of intact α-crystallin was due to the dynamic mobility of the quaternary structure of α-crystallin and polydispersity of the α-crystallin–target protein complexes. To characterize the anti-aggregation activity of the chemical chaperones (arginine, arginine ethyl ester, arginine amide and proline), the semi-saturation concentration [L]0.5 was used. Among the chemical chaperones studied, arginine ethyl ester and arginine amide reveal the highest anti-aggregation activity ([L]0.5 = 53 and 58 mM, respectively). PMID:24058554

  20. Computational Lipidomics and Lipid Bioinformatics: Filling In the Blanks.

    PubMed

    Pauling, Josch; Klipp, Edda

    2016-12-22

    Lipids are highly diverse metabolites of pronounced importance in health and disease. While metabolomics is a broad field under the omics umbrella that may also relate to lipids, lipidomics is an emerging field which specializes in the identification, quantification and functional interpretation of complex lipidomes. Today, it is possible to identify and distinguish lipids in a high-resolution, high-throughput manner and simultaneously with a lot of structural detail. However, doing so may produce thousands of mass spectra in a single experiment which has created a high demand for specialized computational support to analyze these spectral libraries. The computational biology and bioinformatics community has so far established methodology in genomics, transcriptomics and proteomics but there are many (combinatorial) challenges when it comes to structural diversity of lipids and their identification, quantification and interpretation. This review gives an overview and outlook on lipidomics research and illustrates ongoing computational and bioinformatics efforts. These efforts are important and necessary steps to advance the lipidomics field alongside analytic, biochemistry, biomedical and biology communities and to close the gap in available computational methodology between lipidomics and other omics sub-branches.

  1. Quantification of furanic compounds in coated deep-fried products simulating normal preparation and consumption: optimisation of HS-SPME analytical conditions by response surface methodology.

    PubMed

    Pérez-Palacios, T; Petisca, C; Melo, A; Ferreira, I M P L V O

    2012-12-01

    The validation of a method for the simultaneous quantification of furanic compounds in coated deep-fried samples processed and handled as usually consumed is presented. The deep-fried food was ground using a device that simulates mastication, and immediately analysed by headspace solid phase microextraction coupled to gas chromatography-mass spectrometry. Parameters affecting the efficiency of the HS-SPME procedure were selected by response surface methodology, using a 2(3) full-factorial central composite design. Optimal conditions were achieved using 2 g of sample, 3 g of NaCl and 40 min of absorption time at 37°C. Consistency between predicted and experimental values was observed and the quality parameters of the method were established. As a result, furan, 2-furfural, furfuryl alcohol and 2-pentylfuran were, for the first time, simultaneously detected and quantified (5.59, 0.27, 10.48 and 1.77 μg g(-1) sample, respectively) in coated deep-fried fish, contributing to a better understanding of the amounts of these compounds in food. Copyright © 2012 Elsevier Ltd. All rights reserved.
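
    The response-surface step can be illustrated by fitting a second-order model to a small central composite design; for brevity the sketch uses two coded factors rather than the three studied, and the design points and responses are synthetic, not the experimental data.

```python
import numpy as np

# Sketch of fitting a second-order response surface to a small central composite design
# in two coded factors (x1, x2), e.g. NaCl amount and extraction time. Illustrative data only.

# Coded factor settings: factorial (+/-1), axial (+/-1.414) and centre points.
x1 = np.array([-1, -1, 1, 1, -1.414, 1.414, 0, 0, 0, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, -1.414, 1.414, 0, 0, 0])
y = np.array([52, 60, 58, 70, 50, 66, 55, 68, 72, 71, 73])   # e.g. total furan response

# Second-order model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

b0, b1, b2, b11, b22, b12 = coef
# Stationary point of the fitted surface (candidate optimum in coded units).
A = np.array([[2 * b11, b12], [b12, 2 * b22]])
stationary = np.linalg.solve(A, -np.array([b1, b2]))

print("coefficients:", np.round(coef, 2))
print("stationary point (coded units):", np.round(stationary, 2))
```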

  2. Quantitative profile of lipid classes in blood by normal phase chromatography with evaporative light scattering detector: application in the detection of lipid class abnormalities in liver cirrhosis.

    PubMed

    Chamorro, Laura; García-Cano, Ana; Busto, Rebeca; Martínez-González, Javier; Albillos, Agustín; Lasunción, Miguel Ángel; Pastor, Oscar

    2013-06-05

    The lack of analytical methods specific for each lipid class, particularly for phospholipids and sphingolipids, makes their separation by preparative techniques necessary before quantification. LC-MS would be the method of choice, but it is not feasible for routine work in the clinical laboratory for both economic and time reasons. In the present work, we have optimized an HPLC method to quantify lipid classes in plasma and erythrocytes and applied it to samples from patients with cirrhosis. Lipid classes were analyzed by normal phase liquid chromatography with evaporative light scattering detection. We employed a quaternary solvent system to separate twelve lipid classes in 15 min. Interday and intraday precision and recovery for the quantification of lipid classes in plasma were excellent with our methodology. The total plasma lipid content of cirrhotic patients vs control subjects was decreased, with diminished CE (81±33 vs 160±17 mg/dL) and PC (37±16 vs 60±19 mg/dL). The composition of erythrocytes showed a decrease in acidic phospholipids: PE, PI and PS. The present methodology provides a reliable quantification of lipid classes in blood. The lipid profile of cirrhotic patients showed alterations in the plasma PC/PE ratio and in the phospholipid content of erythrocytes, which might reflect alterations in hepatocyte and erythrocyte membrane integrity. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Recommendations for Accurate Resolution of Gene and Isoform Allele-Specific Expression in RNA-Seq Data

    PubMed Central

    Wood, David L. A.; Nones, Katia; Steptoe, Anita; Christ, Angelika; Harliwong, Ivon; Newell, Felicity; Bruxner, Timothy J. C.; Miller, David; Cloonan, Nicole; Grimmond, Sean M.

    2015-01-01

    Genetic variation modulates gene expression transcriptionally or post-transcriptionally, and can profoundly alter an individual’s phenotype. Measuring allelic differential expression at heterozygous loci within an individual, a phenomenon called allele-specific expression (ASE), can assist in identifying such factors. Massively parallel DNA and RNA sequencing and advances in bioinformatic methodologies provide an outstanding opportunity to measure ASE genome-wide. In this study, matched DNA and RNA sequencing, genotyping arrays and computationally phased haplotypes were integrated to comprehensively and conservatively quantify ASE in a single human brain and liver tissue sample. We describe a methodological evaluation and assessment of common bioinformatic steps for ASE quantification, and recommend a robust approach to accurately measure SNP, gene and isoform ASE through the use of personalized haplotype genome alignment, strict alignment quality control and intragenic SNP aggregation. Our results indicate that accurate ASE quantification requires careful bioinformatic analyses and is adversely affected by sample specific alignment confounders and random sampling even at moderate sequence depths. We identified multiple known and several novel ASE genes in liver, including WDR72, DSP and UBD, as well as genes that contained ASE SNPs with imbalance direction discordant with haplotype phase, explainable by annotated transcript structure, suggesting isoform derived ASE. The methods evaluated in this study will be of use to researchers performing highly conservative quantification of ASE, and the genes and isoforms identified as ASE of interest to researchers studying those loci. PMID:25965996
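
    Illustrative note (not the study's pipeline): gene-level ASE can be scored by aggregating haplotype-phased reference/alternate read counts over intragenic heterozygous SNPs and testing the pooled counts against a balanced 0.5 expectation. The counts below are invented, and the sketch uses SciPy's binomtest (available in SciPy 1.7 and later).

    # Aggregate per-SNP allele counts within each gene and test for imbalance.
    from scipy.stats import binomtest

    gene_snp_counts = {                       # (haplotype-A reads, haplotype-B reads) per SNP
        "WDR72": [(34, 12), (41, 15), (28, 9)],
        "DSP":   [(22, 20), (19, 23)],
    }

    for gene, snps in gene_snp_counts.items():
        a = sum(ha for ha, _ in snps)
        b = sum(hb for _, hb in snps)
        result = binomtest(a, a + b, p=0.5)
        print(f"{gene}: allelic ratio = {a / (a + b):.2f}, p = {result.pvalue:.3g}")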

  4. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Xiao-Ying; Yao, Juan; He, Hua

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.

  5. Nuclear Data Uncertainty Quantification: Past, Present and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  7. The harmful chemistry behind "krokodil": Street-like synthesis and product analysis.

    PubMed

    Alves, Emanuele Amorim; Soares, José Xavier; Afonso, Carlos Manuel; Grund, Jean-Paul C; Agonia, Ana Sofia; Cravo, Sara Manuela; Netto, Annibal Duarte Pereira; Carvalho, Félix; Dinis-Oliveira, Ricardo Jorge

    2015-12-01

    "Krokodil" is the street name for a drug, which has been attracting media and researchers attention due to its increasing spread and extreme toxicity. "Krokodil" is a homemade injectable mixture being used as a cheap substitute for heroin. Its use begun in Russia and Ukraine, but it is being spread throughout other countries. The starting materials for "krokodil" synthesis are tablets containing codeine, caustic soda, gasoline, hydrochloric acid, iodine from disinfectants and red phosphorus from matchboxes, all of which are easily available in a retail market or drugstores. The resulting product is a light brown liquid that is injected without previous purification. Herein, we aimed to understand the chemistry behind "krokodil" synthesis by mimicking the steps followed by people who use this drug. The successful synthesis was assessed by the presence of desomorphine and other two morphinans. An analytical gas chromatography-electron impact/mass spectrometry (GC-EI/MS) methodology for quantification of desomorphine and codeine was also developed and validated. The methodologies presented herein provide a representative synthesis of "krokodil" street samples and the application of an effective analytical methodology for desomorphine quantification, which was the major morphinan found. Further studies are required in order to find other hypothetical by-products in "krokodil" since these may help to explain signs and symptoms presented by abusers. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Optimal design of experiments applied to headspace solid phase microextraction for the quantification of vicinal diketones in beer through gas chromatography-mass spectrometric detection.

    PubMed

    Leça, João M; Pereira, Ana C; Vieira, Ana C; Reis, Marco S; Marques, José C

    2015-08-05

    Vicinal diketones, namely diacetyl (DC) and pentanedione (PN), are compounds naturally found in beer that play a key role in the definition of its aroma. In lager beer they are responsible for off-flavors (buttery flavor), and therefore their presence and quantification are of paramount importance to beer producers. Aiming to develop an accurate quantitative monitoring scheme to follow these off-flavor compounds during beer production and in the final product, the headspace solid-phase microextraction (HS-SPME) analytical procedure was tuned through optimally planned experiments and the final settings were fully validated. Optimal design of experiments (O-DOE) is a computational, statistically oriented approach for designing experiments that are most informative according to a well-defined criterion. This methodology was applied to HS-SPME optimization, leading to the following optimal extraction conditions for the quantification of VDK: a CAR/PDMS fiber, 5 mL of sample in a 20-mL vial, 5 min of pre-incubation followed by 25 min of extraction at 30 °C, with agitation. The validation of the final analytical methodology was performed using matrix-matched calibration in order to minimize matrix effects. The following key features were obtained: linearity (R(2) > 0.999 for both diacetyl and 2,3-pentanedione), high sensitivity (LOD of 0.92 μg L(-1) and 2.80 μg L(-1), and LOQ of 3.30 μg L(-1) and 10.01 μg L(-1), for diacetyl and 2,3-pentanedione, respectively), recoveries of approximately 100% and suitable precision (repeatability and reproducibility lower than 3% and 7.5%, respectively). The applicability of the methodology was fully confirmed through an independent analysis of several beer samples, with analyte concentrations ranging from 4 to 200 μg L(-1). Copyright © 2015 Elsevier B.V. All rights reserved.
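
    Illustrative note (not the published validation data): the figures of merit reported above can be derived from a matrix-matched calibration line, with LOD and LOQ estimated as 3.3 and 10 times the residual standard deviation divided by the slope. Concentrations and peak areas below are invented.

    # Linear calibration and LOD/LOQ estimation from the residual standard deviation.
    import numpy as np

    conc = np.array([5, 10, 25, 50, 100, 200], dtype=float)   # ug/L (hypothetical standards)
    area = np.array([0.9, 2.1, 5.0, 10.2, 20.1, 40.3])        # normalised peak area

    slope, intercept = np.polyfit(conc, area, 1)
    pred = slope * conc + intercept
    s_res = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))
    r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

    print(f"R^2 = {r2:.4f}")
    print(f"LOD = {3.3 * s_res / slope:.2f} ug/L, LOQ = {10 * s_res / slope:.2f} ug/L")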

  9. The applicability of real-time PCR in the diagnostic of cutaneous leishmaniasis and parasite quantification for clinical management: Current status and perspectives.

    PubMed

    Moreira, Otacilio C; Yadon, Zaida E; Cupolillo, Elisa

    2017-09-29

    Cutaneous leishmaniasis (CL) is spread worldwide and is the most common manifestation of leishmaniasis. Diagnosis is performed by combining clinical and epidemiological features with the detection of Leishmania parasites (or DNA) in tissue specimens or through parasite isolation in culture medium. Diagnosis of CL is challenging, reflecting the pleomorphic clinical manifestations of this disease. Skin lesions vary in severity, clinical appearance and duration, and in some cases they can be indistinguishable from lesions related to other diseases. Over the past few decades, PCR-based methods, including real-time PCR assays, have been developed for Leishmania detection, quantification and species identification, improving the molecular diagnosis of CL. This review provides an overview of the many real-time PCR methods reported for the diagnostic evaluation of CL, together with recommendations for applying these methods to parasite quantification in clinical management and epidemiological studies. The use of real-time PCR for Leishmania species identification is also presented. The advantages of real-time PCR protocols are numerous, including increased sensitivity and specificity and simpler standardization of diagnostic procedures. However, despite the numerous assays described, there is still no consensus regarding the methods employed, and the analytical and clinical validation of CL molecular diagnosis has not so far followed international guidelines. A consensus methodology comprising a DNA extraction protocol with an exogenous quality control and an internal reference to normalize parasite load is still needed, and its analytical and clinical performance must be accurately assessed. This review shows that a standardization initiative is essential to guide researchers and clinical laboratories towards a robust and reproducible methodology, which will permit further evaluation of parasite load as a surrogate marker of prognosis and for monitoring of aetiological treatment, particularly in multi-centric observational studies and clinical trials. Copyright © 2017 Elsevier B.V. All rights reserved.
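
    Illustrative note (not a published assay): once a consensus protocol exists, parasite load is typically read off a standard curve (Ct versus log10 parasite equivalents) and normalised to a host internal reference to correct for sampling and extraction differences. Slopes, intercepts and Ct values below are hypothetical.

    # Convert Ct values to quantities via standard curves and normalise the load.
    def quantity_from_ct(ct, slope, intercept):
        # Standard curve assumed as: Ct = slope * log10(quantity) + intercept
        return 10 ** ((ct - intercept) / slope)

    parasite_eq = quantity_from_ct(ct=27.4, slope=-3.32, intercept=38.0)   # Leishmania target
    host_eq = quantity_from_ct(ct=24.1, slope=-3.40, intercept=36.5)       # host reference gene

    load = parasite_eq / host_eq
    print(f"normalised load = {load:.3g} parasite equivalents per host-cell equivalent")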

  10. 42 CFR 493.649 - Methodology for determining fee amount.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... fringe benefit costs to support the required number of State inspectors, management and direct support... full time equivalent employee. Included in this cost are salary and fringe benefit costs, necessary...

  11. Multiple Approaches to the Evaluation of Educational Reform: From Cost-Benefit to Power-Benefit Analysis.

    ERIC Educational Resources Information Center

    Paulston, Rolland G.

    Theories or explanations of educational evaluation are discussed and categorized under two broad methodological headings, the objectivist and subjectivist epistemological orientations. They can be seen as potentially complementary empirical approaches that offer evaluators two methodological orientations to assess educational-reform outcomes.…

  12. Critical aspects of data analysis for quantification in laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Motto-Ros, V.; Syvilay, D.; Bassel, L.; Negre, E.; Trichard, F.; Pelascini, F.; El Haddad, J.; Harhira, A.; Moncayo, S.; Picard, J.; Devismes, D.; Bousquet, B.

    2018-02-01

    In this study, a collaborative contest focused on LIBS data processing was conducted in an original way: instead of analyzing the same samples on their own LIBS instruments, the participants all worked from a set of LIBS spectra obtained from a single experiment. Each participant was asked to provide the predicted concentrations of several elements for two glass samples. The analytical contest revealed a wide diversity of results among participants, even when the same spectral lines were considered for the analysis. A parametric study was then conducted to investigate the influence of each step of the data processing. This study was based on several analytical figures of merit, such as the determination coefficient, uncertainty, limit of quantification and prediction ability (i.e., trueness). It was then possible to interpret the results provided by the participants, emphasizing that the type of data extraction, the baseline modeling and the calibration model play key roles in the quantification performance of the technique. This work provides a set of recommendations based on a systematic evaluation of the quantification procedure, with the aim of optimizing the methodological steps toward the standardization of LIBS.

  13. Cost allocation methodology applicable to the temporary assistance for needy families program. Final rule.

    PubMed

    2008-07-23

    This final rule applies to the Temporary Assistance for Needy Families (TANF) program and requires States, the District of Columbia and the Territories (hereinafter referred to as the "States") to use the "benefiting program" cost allocation methodology in U.S. Office of Management and Budget (OMB) Circular A-87 (2 CFR part 225). It is the judgment and determination of HHS/ACF that the "benefiting program" cost allocation methodology is the appropriate methodology for the proper use of Federal TANF funds. The Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) of 1996 gave federally-recognized Tribes the opportunity to operate their own Tribal TANF programs. Federally-recognized Indian tribes operating approved Tribal TANF programs have always followed the "benefiting program" cost allocation methodology in accordance with OMB Circular A-87 (2 CFR part 225) and the applicable regulatory provisions at 45 CFR 286.45(c) and (d). This final rule contains no substantive changes to the proposed rule published on September 27, 2006.

  14. Issues connected with indirect cost quantification: a focus on the transportation system

    NASA Astrophysics Data System (ADS)

    Křivánková, Zuzana; Bíl, Michal; Kubeček, Jan; Vodák, Rostislav

    2017-04-01

    Transportation and communication networks in general are vital parts of modern society. The economy relies heavily on transportation system performance. Large numbers of people commute to work regularly, and stockpiles in many companies are being reduced as just-in-time production relies on the transportation network to supply resources on time. Natural hazards have the potential to disturb transportation systems. Earthquakes, floods and landslides are examples of high-energy processes capable of causing direct losses (i.e. physical damage to the infrastructure). We have focused on quantification of the indirect costs of natural hazards, which are not easy to estimate. Indirect losses can also emerge from low-energy meteorological hazards which only seldom cause direct losses, e.g. glaze ice or snowfall. Whereas evidence of repair work and general direct costs usually exists or can be estimated, indirect costs are much more difficult to identify, particularly when they are not covered by insurance agencies. Designating alternative routes (detours) is the most frequent response to blocked road links. Indirect costs can then be related to increased fuel consumption and additional operating costs. Detours usually result in prolonged travel times, so quantification of indirect costs also has to cover the value of time; the costs from the delay are, however, a nonlinear function of travel time. The existence of an alternative transportation pattern may also result in an increased number of traffic crashes. This topic has not been studied in depth, but an increase in traffic crashes has been reported when people suddenly changed their travel modes, e.g. when air traffic was not possible. The lost user benefit from trips that were cancelled or suppressed is also difficult to quantify. Several approaches, based on post-event questionnaire surveys, have been applied to communities and companies affected by transportation accessibility cut-offs. No widely accepted methodology is available, however. In this presentation we discuss current approaches to indirect cost estimation, and their limitations, as applied to the estimation of natural hazard impacts.
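
    Illustrative note (all values assumed, not taken from the abstract): a first-order estimate of the indirect cost of a detour combines extra vehicle-operating cost with the monetised value of added travel time, before any allowance for crash risk or lost trips.

    # Back-of-the-envelope indirect cost of a blocked link with a forced detour.
    vehicles_per_day = 8000            # assumed traffic volume on the blocked link
    closure_days = 14
    extra_km_per_vehicle = 12.0        # added detour distance
    extra_minutes_per_vehicle = 18.0   # added travel time

    operating_cost_per_km = 0.35       # EUR/km, fuel plus wear (assumed)
    value_of_time_per_hour = 11.0      # EUR/h, average across trip purposes (assumed)

    trips = vehicles_per_day * closure_days
    cost_operating = trips * extra_km_per_vehicle * operating_cost_per_km
    cost_time = trips * (extra_minutes_per_vehicle / 60.0) * value_of_time_per_hour

    print(f"vehicle-operating cost: {cost_operating:,.0f} EUR")
    print(f"travel-time cost:       {cost_time:,.0f} EUR")
    print(f"total indirect cost:    {cost_operating + cost_time:,.0f} EUR")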

  15. Flood Protection Through Landscape Scale Ecosystem Restoration- Quantifying the Benefits

    NASA Astrophysics Data System (ADS)

    Pinero, E.

    2017-12-01

    Hurricane Harvey illustrated the risks associated with storm surges in coastal areas, especially during severe storms. One way to address storm surges is to utilize the natural ability of offshore coastal land to dampen their severity. In addition to helping reduce storm surge intensity and related damage, restoring the land generates numerous co-benefits such as carbon sequestration and water quality improvement. The session will discuss the analytical methodology that helps define which species are most resilient to take root, and how to calculate the quantified benefits. It will also address the quantification and monetization of benefits to make the business case for restoration. In 2005, Hurricanes Katrina and Rita damaged levees along the Gulf of Mexico, leading to major forest degradation, habitat deterioration and reduced wildlife use. As a result, this area lost an extensive amount of land, with contiguous sections of wetlands converted to open water. The Restore the Earth Foundation's North American Amazon project intends to restore one million acres of forests and forested wetlands in the lower Mississippi River Valley. The proposed area for the first phase of this project was once a historic bald cypress forested wetland, which was degraded by increased salinity levels and extreme fluctuations in hydrology. The Terrebonne and Lafourche Parishes, the "bayou parishes", communities with a combined population of over 200,000, sit on thin fingers of land protected by surrounding swamps and wetlands, beyond which lies the Gulf of Mexico. The Parishes depend on fishing, hunting, trapping, boat building, off-shore oil and gas production and support activities. Yet these communities are highly vulnerable to risks from natural hazards and future land loss: the ground is at or near sea level and is therefore easily inundated by storm surges if not protected by wetlands. While some communities are protected by a levee system, the Terrebonne and Lafourche Parishes remain vulnerable to the impacts of flooding and hurricanes.

  16. Droplet Digital™ PCR Next-Generation Sequencing Library QC Assay.

    PubMed

    Heredia, Nicholas J

    2018-01-01

    Digital PCR is a valuable tool to quantify next-generation sequencing (NGS) libraries precisely and accurately. Accurately quantifying NGS libraries enables accurate loading of the libraries onto the sequencer and thus improves sequencing performance by reducing underloading and overloading errors. Accurate quantification also benefits users by enabling uniform loading of indexed/barcoded libraries, which in turn greatly improves sequencing uniformity across the indexed/barcoded samples. The advantages gained by employing the Droplet Digital PCR (ddPCR™) library QC assay include precise and accurate quantification as well as size quality assessment, enabling users to QC their sequencing libraries with confidence.
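
    Illustrative note (not taken from the chapter): ddPCR concentration follows from Poisson statistics on the fraction of positive droplets. The droplet counts below are invented, and 0.85 nL is a typical droplet volume used here only as an assumption.

    # Convert positive/total droplet counts to library concentration.
    import math

    positive_droplets = 14500
    total_droplets = 20000
    droplet_volume_ul = 0.85e-3        # assumed droplet volume (0.85 nL) in microlitres

    p = positive_droplets / total_droplets
    copies_per_droplet = -math.log(1.0 - p)          # Poisson mean occupancy
    copies_per_ul = copies_per_droplet / droplet_volume_ul
    print(f"library concentration ~ {copies_per_ul:,.0f} copies/uL (before any dilution factor)")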

  17. Quantification of prebiotics in commercial infant formulas.

    PubMed

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays a great number of infant formulas enriched with prebiotics are available on the market; however, data about their composition are scarce. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination for the quantification of carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) of up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. A novel approach to quantify cybersecurity for electric power systems

    NASA Astrophysics Data System (ADS)

    Kaster, Paul R., Jr.

    Electric power grid cybersecurity is a topic gaining increased attention in academia, industry, and government circles, yet no method of quantifying and evaluating a system's security is commonly accepted. In order to be useful, a quantification scheme must accurately reflect the degree to which a system is secure, determine the level of security in a system simply and from real-world values, model a wide variety of attacker capabilities, be useful for planning and evaluation, allow a system owner to publish information without compromising the security of the system, and compare relative levels of security between systems. Published attempts at quantifying cybersecurity fail one or more of these criteria. This document proposes a new method of quantifying cybersecurity that meets those objectives. This dissertation evaluates the current state of cybersecurity research, discusses the criteria mentioned previously, proposes a new quantification scheme, presents an innovative method of modeling cyber attacks, demonstrates that the proposed quantification methodology meets the evaluation criteria, and proposes a line of research for future efforts.

  19. WE-AB-204-05: Harmonizing PET/CT Quantification in Multicenter Studies: A Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marques da Silva, A; Fischer, A

    2015-06-15

    Purpose: To present the implementation of a strategy to harmonize FDG PET/CT quantification (SUV) performed with different scanner models and manufacturers. Methods: The strategy was based on Boellaard (2011) and the EARL FDG-PET/CT accreditation program, which propose quality control measurements for harmonizing scanner performance. A NEMA IEC Body phantom study was performed using four different devices: PHP-1 (Gemini TF Base, Philips); PHP-2 (Gemini GXL, Philips); GEH (Discovery 600, General Electric); SMS (Biograph Hi-Rez 16, Siemens). The SUV Recovery Coefficient (RC) was calculated using the clinical protocol and other clinically relevant reconstruction parameters. The most appropriate reconstruction parameters (MARP) for SUV harmonization, in each scanner, are those which achieve the EARL harmonizing standards. They were identified using the lowest root mean square errors (RMSE). To evaluate the strategy's effectiveness, the Maximum Differences (MD) between the clinical and MARP RC values were calculated. Results: The reconstruction parameters that obtained the lowest RMSE are: FBP 5mm (PHP-1); LOR-RAMLA 2i0.008l (PHP-2); VuePointHD 2i32s10mm (GEH); and FORE+OSEM 4i8s6mm (SMS). Thus, to ensure that quantitative PET image measurements are interchangeable between these sites, images must be reconstructed with the above-mentioned parameters. However, a decoupling between the best image for PET/CT qualitative analysis and the best image for quantification studies was observed. The MD showed that the strategy was effective in reducing the variability of SUV quantification for small structures (<17mm). Conclusion: The harmonization strategy for SUV quantification implemented with these devices was effective in reducing the variability of small-structure quantification, minimizing inter-scanner and inter-institution differences in quantification. However, in addition to the harmonization of quantification, the standardization of the patient preparation methodology must be maintained in order to minimize SUV variability due to biological factors. Financial support by CAPES.
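
    Illustrative note (sphere recovery values and tolerance midpoints are invented, not EARL or study data): reconstruction settings can be ranked by computing the recovery coefficient for each phantom sphere and taking the root-mean-square error against harmonisation targets, which mirrors the RMSE selection step described above.

    # Rank reconstruction settings by RMSE of recovery coefficients against targets.
    import numpy as np

    sphere_mm = np.array([10, 13, 17, 22, 28, 37])                # sphere diameters, for reference
    rc_target = np.array([0.45, 0.60, 0.75, 0.85, 0.90, 0.95])    # assumed target RCs per sphere

    rc_by_setting = {
        "setting A (e.g. OSEM 2i32s, 10 mm filter)": np.array([0.40, 0.58, 0.74, 0.83, 0.91, 0.96]),
        "setting B (e.g. OSEM 4i8s, 6 mm filter)":   np.array([0.55, 0.72, 0.88, 0.97, 1.02, 1.05]),
    }

    for name, rc in rc_by_setting.items():
        rmse = np.sqrt(np.mean((rc - rc_target) ** 2))
        print(f"{name}: RMSE over {len(sphere_mm)} spheres = {rmse:.3f}")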

  20. New dual in-growth core isotopic technique to assess the root litter carbon input to the soil

    USDA-ARS?s Scientific Manuscript database

    The root-derived carbon (C) input to the soil, whose quantification is often neglected because of methodological difficulties, is considered a crucial C flux for soil C dynamics and net ecosystem productivity (NEP) studies. In the present study, we compared two independent methods to quantify this C...

  1. A reliable methodology for quantitative extraction of fruit and vegetable physiological amino acids and their subsequent analysis with commonly available HPLC systems

    USDA-ARS?s Scientific Manuscript database

    High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiological amino acids in selected fruits and vegetables. This method was found to be particularly useful because the dabsyl derivatives of glutamine and citrulline were sufficiently se...

  2. Economic feasibility study for new technological alternatives in wastewater treatment processes: a review.

    PubMed

    Molinos-Senante, María; Hernández-Sancho, Francesc; Sala-Garrido, Ramón

    2012-01-01

    The concept of sustainability involves the integration of economic, environmental, and social aspects and this also applies in the field of wastewater treatment. Economic feasibility studies are a key tool for selecting the most appropriate option from a set of technological proposals. Moreover, these studies are needed to assess the viability of transferring new technologies from pilot-scale to full-scale. In traditional economic feasibility studies, the benefits that have no market price, such as environmental benefits, are not considered and are therefore underestimated. To overcome this limitation, we propose a new methodology to assess the economic viability of wastewater treatment technologies that considers internal and external impacts. The estimation of the costs is based on the use of cost functions. To quantify the environmental benefits from wastewater treatment, the distance function methodology is proposed to estimate the shadow price of each pollutant removed in the wastewater treatment. The application of this methodological approach by decision makers enables the calculation of the true costs and benefits associated with each alternative technology. The proposed methodology is presented as a useful tool to support decision making.
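
    Illustrative note (all figures invented): the balance described above can be expressed as internal costs from the cost function set against market revenues plus environmental benefits valued through the shadow prices of the pollutants removed.

    # Net annual benefit of a treatment technology including shadow-priced externalities.
    removed_kg_per_year = {"N": 52000, "P": 8200, "COD": 310000}      # pollutant loads removed
    shadow_price_eur_per_kg = {"N": 16.4, "P": 30.9, "COD": 0.10}     # assumed shadow prices

    environmental_benefit = sum(removed_kg_per_year[k] * shadow_price_eur_per_kg[k]
                                for k in removed_kg_per_year)
    internal_cost = 950_000        # EUR/year, operating plus annualised capital (assumed)
    market_revenue = 120_000       # EUR/year, e.g. reclaimed water sales (assumed)

    net_benefit = market_revenue + environmental_benefit - internal_cost
    print(f"environmental benefit: {environmental_benefit:,.0f} EUR/year")
    print(f"net benefit:           {net_benefit:,.0f} EUR/year")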

  3. Methodology for evaluation of railroad technology research projects

    DOT National Transportation Integrated Search

    1981-04-01

    This Project memorandum presents a methodology for evaluating railroad research projects. The methodology includes consideration of industry and societal benefits, with special attention given to technical risks, implementation considerations, and po...

  4. Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lines, Amanda M.; Adami, Susan R.; Sinkov, Sergey I.

    Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle, where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium(IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Univariate analysis (i.e., Beer's law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of the nitric acid concentration.
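
    Illustrative note (synthetic spectra, not measured data): the multivariate step can be sketched with a partial least squares (PLS) regression that maps full absorption spectra to Pu(IV) concentration while the nitric acid level shifts the band position, something a single-wavelength Beer's-law calibration cannot absorb.

    # PLS calibration on synthetic spectra whose band position depends on acid strength.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    wavelengths = np.linspace(400, 900, 200)

    def spectrum(c_pu, c_acid):
        peak = 830 + 3 * c_acid                      # acid-dependent band position (toy model)
        band = c_pu * np.exp(-((wavelengths - peak) / 15) ** 2)
        return band + rng.normal(0, 0.01, wavelengths.size)

    pu = rng.uniform(0.5, 10.0, 40)                  # mM Pu(IV), training set
    acid = rng.uniform(0.5, 8.0, 40)                 # M HNO3, varies independently
    X = np.array([spectrum(p, a) for p, a in zip(pu, acid)])

    model = PLSRegression(n_components=4).fit(X, pu)
    rmse = np.sqrt(np.mean((model.predict(X).ravel() - pu) ** 2))
    print(f"calibration RMSE: {rmse:.3f} mM")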

  5. Methodological Challenges to Economic Evaluations of Vaccines: Is a Common Approach Still Possible?

    PubMed

    Jit, Mark; Hutubessy, Raymond

    2016-06-01

    Economic evaluation of vaccination is a key tool to inform effective spending on vaccines. However, many evaluations have been criticised for failing to capture features of vaccines which are relevant to decision makers. These include broader societal benefits (such as improved educational achievement, economic growth and political stability), reduced health disparities, medical innovation, reduced pressure on hospital beds, greater peace of mind and synergies in economic benefits with non-vaccine interventions. Also, the fiscal implications of vaccination programmes are not always made explicit. Alternative methodological frameworks have been proposed to better capture these benefits. However, any broadening of the methodology for economic evaluation must also involve evaluations of non-vaccine interventions, and hence may not always benefit vaccines given a fixed health-care budget. The scope of an economic evaluation must consider the budget from which vaccines are funded and the decision maker's stated aims for that spending.

  6. NASA Electronic Publishing System: Cost/benefit Methodology

    NASA Technical Reports Server (NTRS)

    Tuey, Richard C.

    1994-01-01

    The NASA Scientific and Technical Information Office was assigned responsibility for examining the benefits of using electronic printing and duplicating systems throughout NASA installations and Headquarters. This report documents the methodology used to justify acquisition of the most cost-beneficial solution for the printing and duplicating requirements of a facility that is contemplating an electronic printing and duplicating system. Four alternatives are presented, each costed out with its associated benefits. The methodology goes a step further than a simple cost-benefit analysis by comparing the risks associated with each alternative, the sensitivity to the number of impressions, the productivity gains of the selected alternative and, finally, its return on investment. The report can be used in conjunction with two earlier reports, NASA-TM-106242 and TM-106510, to guide others in determining the most cost-effective duplicating alternative.

  7. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.

  8. Answering Aggregation Questions in Contingency Valuation of Rural Transit Benefits

    DOT National Transportation Integrated Search

    2001-08-01

    While the qualitative benefits of transit are relatively well known, quantifying the benefits of transit is still a developing methodology. Quantifying benefits offers improved operational management and planning as well as better information for pol...

  9. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    PubMed

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits given by the national authority of the waste disposal. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential to estimate the acceptability of the waste in the final repository but also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of produced radionuclides either by experimental methods or statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful to generate random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach for the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
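
    Illustrative note (the model and all distributions are assumptions, not CERN's procedure): the Monte Carlo idea is to sample the uncertain irradiation and composition parameters, push them through an activation model, and report percentiles of the resulting activity.

    # Monte Carlo propagation of input uncertainties to a 60Co activity estimate.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    flux = rng.normal(1.0e10, 1.5e9, n)                  # particle flux, cm^-2 s^-1 (assumed)
    sigma = rng.normal(3.0e-27, 4.0e-28, n)              # production cross-section, cm^2 (assumed)
    co_fraction = rng.lognormal(np.log(5e-6), 0.5, n)    # cobalt mass fraction in copper (assumed)
    t_irr = rng.uniform(0.8, 1.2, n) * 3.15e7            # irradiation time, s (assumed)

    # Schematic production model per gram of copper (saturation and decay neglected)
    target_atoms = co_fraction * 6.022e23 / 59.0         # 59Co atoms per gram of material
    produced = flux * sigma * target_atoms * t_irr       # 60Co atoms produced per gram
    activity = produced * np.log(2) / (5.27 * 3.15e7)    # Bq/g using the 60Co half-life

    lo, med, hi = np.percentile(activity, [2.5, 50, 97.5])
    print(f"60Co activity: median {med:.3g} Bq/g, 95% interval [{lo:.3g}, {hi:.3g}] Bq/g")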

  10. An efficient assisted history matching and uncertainty quantification workflow using Gaussian processes proxy models and variogram based sensitivity analysis: GP-VARS

    NASA Astrophysics Data System (ADS)

    Rana, Sachin; Ertekin, Turgay; King, Gregory R.

    2018-05-01

    Reservoir history matching is frequently viewed as an optimization problem which involves minimizing the misfit between simulated and observed data. Many gradient- and evolutionary-strategy-based optimization algorithms have been proposed to solve this problem, which typically require a large number of numerical simulations to find feasible solutions. Therefore, a new methodology referred to as GP-VARS is proposed in this study, which uses forward and inverse Gaussian process (GP) based proxy models combined with a novel application of variogram analysis of response surface (VARS) based sensitivity analysis to efficiently solve high-dimensional history matching problems. An empirical Bayes approach is proposed to optimally train GP proxy models for any given data. The history matching solutions are found via Bayesian optimization (BO) on the forward GP models and via predictions of the inverse GP model in an iterative manner. An uncertainty quantification method using MCMC sampling in conjunction with the GP model is also presented to obtain a probabilistic estimate of reservoir properties and estimated ultimate recovery (EUR). An application of the proposed GP-VARS methodology to the PUNQ-S3 reservoir is presented, in which it is shown that GP-VARS provides history-match solutions with approximately four times fewer numerical simulations than the differential evolution (DE) algorithm. Furthermore, a comparison of uncertainty quantification results obtained by GP-VARS, EnKF and other previously published methods shows that the P50 estimate of oil EUR obtained by GP-VARS is in close agreement with the true value for the PUNQ-S3 reservoir.
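
    Illustrative note (a toy misfit function stands in for the reservoir simulator; this is not the GP-VARS implementation): the core loop trains a Gaussian-process proxy on a few simulation runs and picks the next parameter set from the proxy, here with a simple lower-confidence-bound rule.

    # One iteration of a GP-proxy search for a history-matching parameter.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    def misfit(x):
        # Stand-in for the simulated-vs-observed data misfit of a reservoir model
        return (x - 0.63) ** 2 + 0.05 * np.sin(15 * x)

    rng = np.random.default_rng(0)
    X_train = rng.uniform(0, 1, 8).reshape(-1, 1)        # 8 completed "simulation runs"
    y_train = misfit(X_train).ravel()

    gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(0.2), normalize_y=True)
    gp.fit(X_train, y_train)

    X_grid = np.linspace(0, 1, 500).reshape(-1, 1)
    mean, std = gp.predict(X_grid, return_std=True)
    candidate = X_grid[np.argmin(mean - std)]            # lower-confidence-bound acquisition
    print(f"next parameter value to simulate: {candidate[0]:.3f}")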

  11. Identification and quantification of VOCs by proton transfer reaction time of flight mass spectrometry: An experimental workflow for the optimization of specificity, sensitivity, and accuracy

    PubMed Central

    Hanna, George B.

    2018-01-01

    Proton transfer reaction time of flight mass spectrometry (PTR-ToF-MS) is a direct injection MS technique allowing for the sensitive, real-time detection, identification, and quantification of volatile organic compounds. When aiming to employ PTR-ToF-MS for targeted volatile organic compound analysis, some methodological questions must be addressed, such as the need to correctly identify product ions or to evaluate the quantitation accuracy. This work proposes a workflow for PTR-ToF-MS method development, addressing the main issues affecting the reliable identification and quantification of target compounds. We determined the fragmentation patterns of 13 selected compounds (aldehydes, fatty acids, phenols). Experiments were conducted under breath-relevant conditions (100% humid air) and within an extended range of reduced electric field values (E/N = 48-144 Td), obtained by changing the drift tube voltage. Reactivity was inspected using H3O+, NO+, and O2+ as primary ions. The results show that a relatively low (<90 Td) E/N often permits reduced fragmentation, enhancing sensitivity and identification capabilities, particularly in the case of aldehydes using NO+, where a 4-fold increase in sensitivity is obtained by means of drift voltage reduction. We developed a novel calibration methodology relying on diffusion tubes used as gravimetric standards. For each of the tested compounds, it was possible to define suitable conditions whereby the experimental error, defined as the difference between gravimetric measurements and calculated concentrations, was 8% or lower. PMID:29336521

  12. A General Uncertainty Quantification Methodology for Cloud Microphysical Property Retrievals

    NASA Astrophysics Data System (ADS)

    Tang, Q.; Xie, S.; Chen, X.; Zhao, C.

    2014-12-01

    The US Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) program provides long-term (~20 years) ground-based cloud remote sensing observations. However, there are large uncertainties in the retrieval products of cloud microphysical properties based on the active and/or passive remote-sensing measurements. To address this uncertainty issue, a DOE Atmospheric System Research scientific focus study, Quantification of Uncertainties in Cloud Retrievals (QUICR), has been formed. In addition to an overview of recent progress of QUICR, we will demonstrate the capacity of an observation-based general uncertainty quantification (UQ) methodology via the ARM Climate Research Facility baseline cloud microphysical properties (MICROBASE) product. This UQ method utilizes the Karhunen-Loève expansion (KLE) and Central Limit Theorems (CLT) to quantify the retrieval uncertainties from observations and algorithm parameters. The input perturbations are imposed on major modes to take into account the cross correlations between input data, which greatly reduces the dimension of random variables (up to a factor of 50) and quantifies vertically resolved full probability distribution functions of retrieved quantities. Moreover, this KLE/CLT approach has the capability of attributing the uncertainties in the retrieval output to individual uncertainty source and thus sheds light on improving the retrieval algorithm and observations. We will present the results of a case study for the ice water content at the Southern Great Plains during an intensive observing period on March 9, 2000. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
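
    Illustrative note (synthetic profiles, not MICROBASE data): the KLE step amounts to an eigendecomposition of the input covariance so that perturbations are drawn only along the leading modes, preserving vertical cross-correlations while cutting the number of random variables.

    # Karhunen-Loeve style perturbation of a vertically resolved input profile.
    import numpy as np

    rng = np.random.default_rng(0)
    n_levels, n_profiles = 50, 500
    z = np.linspace(0.0, 1.0, n_levels)
    base = np.exp(-((z - 0.4) / 0.15) ** 2)                           # mean profile shape
    noise = rng.standard_normal((n_profiles, n_levels)).cumsum(axis=1)
    profiles = base + 0.05 * noise / np.sqrt(np.arange(1, n_levels + 1))

    cov = np.cov(profiles, rowvar=False)                              # level-by-level covariance
    eigval, eigvec = np.linalg.eigh(cov)
    order = np.argsort(eigval)[::-1]
    eigval, eigvec = eigval[order], eigvec[:, order]

    k = 5                                                             # leading modes retained
    xi = rng.standard_normal(k)                                       # independent standard normals
    perturbed = profiles.mean(axis=0) + eigvec[:, :k] @ (np.sqrt(eigval[:k]) * xi)
    print(f"variance captured by {k} modes: {eigval[:k].sum() / eigval.sum():.2%}")
    print(f"perturbed-profile peak value: {perturbed.max():.3f}")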

  13. LC-MS-MS characterization of curry leaf flavonols and antioxidant activity

    USDA-ARS?s Scientific Manuscript database

    Curry leaf is a commonly used flavoring agent whose flavonol constituents have potential health benefits. This study characterized the curry leaf flavonol profile and antioxidant activity. Flavonols were extracted using ethanol, methanol, or acetone prior to identification and quantification using l...

  14. Linking Ecosystem Services Benefit Transfer Databases and Ecosystem Services Production Function Libraries

    EPA Science Inventory

    The quantification or estimation of the economic and non-economic values of ecosystem services can be done from a number of distinct approaches. For example, practitioners may use ecosystem services production function models (ESPFMs) for a particular location, or alternatively, ...

  15. Methodology for turbulence code validation: Quantification of simulation-experiment agreement and application to the TORPEX experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, Paolo; Theiler, C.; Fasoli, A.

    A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and, finally, how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.

  16. [Health protection for rural workers: the need to standardize techniques for quantifying dermal exposure to pesticides].

    PubMed

    Selmi, Giuliana da Fontoura Rodrigues; Trapé, Angelo Zanaga

    2014-05-01

    Quantification of dermal exposure to pesticides in rural workers, used in risk assessment, can be performed with different techniques such as patches or whole body evaluation. However, the wide variety of methods can jeopardize the process by producing disparate results, depending on the principles in sample collection. A critical review was thus performed on the main techniques for quantifying dermal exposure, calling attention to this issue and the need to establish a single methodology for quantification of dermal exposure in rural workers. Such harmonization of different techniques should help achieve safer and healthier working conditions. Techniques that can provide reliable exposure data are an essential first step towards avoiding harm to workers' health.

  17. Plasticity models of material variability based on uncertainty quantification techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Reese E.; Rizzi, Francesco; Boyce, Brad

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments, using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and in assessing the quality of calibrated physical parameters.

  18. Telomere length analysis.

    PubMed

    Canela, Andrés; Klatt, Peter; Blasco, María A

    2007-01-01

    Most somatic cells of long-lived species undergo telomere shortening throughout life. Critically short telomeres trigger loss of cell viability in tissues, which has been related to alteration of tissue function and loss of regenerative capabilities in aging and aging-related diseases. Hence, telomere length is an important biomarker for aging and can be used in the prognosis of aging diseases. These facts highlight the importance of developing methods for telomere length determination that can be employed to evaluate telomere length during the human aging process. Telomere length quantification methods have improved greatly in accuracy and sensitivity since the development of the conventional telomeric Southern blot. Here, we describe the different methodologies recently developed for telomere length quantification, as well as their potential applications for human aging studies.

  19. A methodology for direct quantification of over-ranging length in helical computed tomography with real-time dosimetry.

    PubMed

    Tien, Christopher J; Winslow, James F; Hintenlang, David E

    2011-01-31

    In helical computed tomography (CT), reconstruction information from volumes adjacent to the clinical volume of interest (VOI) is required for proper reconstruction. Previous studies have relied upon either operator console readings or indirect extrapolation of measurements in order to determine the over-ranging length of a scan. This paper presents a methodology for the direct quantification of over-ranging dose contributions using real-time dosimetry. A Siemens SOMATOM Sensation 16 multislice helical CT scanner is used with a novel real-time "point" fiber-optic dosimeter system with 10 ms temporal resolution to measure over-ranging length, which is also expressed as dose-length product (DLP). Film was used to benchmark the exact length of over-ranging. Over-ranging length varied from 4.38 cm at a pitch of 0.5 to 6.72 cm at a pitch of 1.5, which corresponds to DLPs of 131 to 202 mGy-cm. The dose-extrapolation method of Van der Molen et al. yielded results within 3%, while the console-reading method of Tzedakis et al. yielded consistently larger over-ranging lengths. From film measurements, it was determined that Tzedakis et al. overestimated over-ranging lengths by one-half of the beam collimation width. Over-ranging length measured as a function of reconstruction slice thickness produced two linear regions, similar to previous publications. Over-ranging is quantified in terms of both absolute length and DLP, and contributes about 60 mGy-cm, or about 10% of the DLP, for a routine abdominal scan. This paper presents a direct physical measurement of over-ranging length within 10% of previous methodologies, with current uncertainties of less than 1%, in comparison with 5% for other methodologies. Clinical implementation can be simplified by using only one dosimeter if codependence with console readings is acceptable, with an uncertainty of 1.1%. This methodology will be applied to different vendors, models, and post-processing methods, which have been shown to produce over-ranging lengths differing by 125%.

  20. Hyperplex-MRM: a hybrid multiple reaction monitoring method using mTRAQ/iTRAQ labeling for multiplex absolute quantification of human colorectal cancer biomarker.

    PubMed

    Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie

    2013-09-06

    Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefiting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in general MRM analysis only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combines mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and the MS/MS spectrum, the total amounts and relative ratios of target proteins/peptides in the four samples could be acquired simultaneously. Accordingly, the absolute amounts of target proteins/peptides in four different samples could be obtained in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
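
    Illustrative note (all intensities and amounts invented): the hybrid arithmetic is that the MRM-level ratio of the pooled iTRAQ-labelled analyte to the mTRAQ-labelled internal standards gives the total amount, and the iTRAQ reporter-ion ratios then split that total across the four samples.

    # Split an mTRAQ-anchored total amount across four iTRAQ channels.
    standard_amount_fmol = 100.0                 # spiked internal standard (per mTRAQ reference)
    mrm_area_pooled_analyte = 8.4e5              # MRM peak area of the pooled 4-plex analyte
    mrm_area_reference = 2.1e5                   # mean MRM peak area of the two mTRAQ references

    total_fmol = standard_amount_fmol * mrm_area_pooled_analyte / mrm_area_reference

    itraq_reporters = {"114": 1.0, "115": 1.8, "116": 0.6, "117": 1.2}   # relative intensities
    reporter_sum = sum(itraq_reporters.values())
    for channel, intensity in itraq_reporters.items():
        print(f"sample {channel}: {total_fmol * intensity / reporter_sum:.1f} fmol")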

  1. ARAMIS project: a comprehensive methodology for the identification of reference accident scenarios in process industries.

    PubMed

    Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno

    2006-03-31

    In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper presents the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the construction of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called the methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and the possible consequences of accidents. This methodology leads to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called a "risk matrix", crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage facility.

  2. Methodological Issues in Controlled Studies on Effects of Prenatal Exposure to Drug Abuse. Research Monograph 114.

    ERIC Educational Resources Information Center

    Kilbey, M. Marlyne, Ed.; Asghar, Khursheed, Ed.

    This monograph presents the proceedings of the first National Institute on Drug Abuse technical review related to the conduct of controlled studies on prenatal exposure to drugs of abuse. Papers in the monograph are categorized by session. The first session (two papers) focused on the detection and quantification of prenatal drug exposure in…

  3. Trainee and Instructor Task Quantification: Development of Quantitative Indices and a Predictive Methodology.

    ERIC Educational Resources Information Center

    Whaton, George R.; And Others

    As the first step in a program to develop quantitative techniques for prescribing the design and use of training systems, the present study attempted: to compile an initial set of quantitative indices, to determine whether these indices could be used to describe a sample of trainee tasks and differentiate among them, to develop a predictive…

  4. An approach for quantification of platinum distribution in tissues by LA-ICP-MS imaging using isotope dilution analysis.

    PubMed

    Moraleja, I; Mena, M L; Lázaro, A; Neumann, B; Tejedor, A; Jakubowski, N; Gómez-Gómez, M M; Esteban-Fernández, D

    2018-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has proven to be a convenient technique for trace elemental imaging in tissue sections, providing elemental 2D distributions at a quantitative level. For quantification purposes, several approaches have been proposed in the literature in recent years, such as the use of CRMs or matrix-matched standards. The use of isotope dilution (ID) for quantification by LA-ICP-MS has also been described, but so far it has mainly been useful for bulk analysis and not feasible for spatially resolved measurements. In this work, a quantification method based on ID analysis was developed by printing isotope-enriched inks onto kidney slices from rats treated with antitumoral Pt-based drugs using a commercial ink-jet device, in order to perform elemental quantification in different areas of the bio-images. For the ID experiments, (194)Pt-enriched platinum was used. The methodology was validated by depositing droplets of a natural Pt standard with a known amount of Pt onto the surface of a control tissue, where as little as 50 pg of Pt could be quantified, with recoveries higher than 90%. The amount of Pt present in whole kidney slices was quantified for cisplatin-, carboplatin- and oxaliplatin-treated rats, and the results obtained were in accordance with those previously reported. The distribution of Pt between the medullary and cortical areas was also quantified, revealing different behavior for the three drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
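
    Illustrative note (spike abundances and measured values are assumptions, not the paper's data): mixing natural-abundance sample Pt with a (194)Pt-enriched spike shifts the measured 194/195 isotope ratio, and the sample amount follows from the isotope dilution equation.

    # Isotope dilution: solve for the sample amount from the measured 194/195 ratio.
    A194_nat, A195_nat = 0.3286, 0.3378      # natural Pt isotopic abundances
    A194_spk, A195_spk = 0.957, 0.030        # assumed abundances of the enriched spike

    n_spike_pmol = 2.0                        # amount of spike printed onto the tissue
    R_meas = 1.65                             # measured 194Pt/195Pt ratio in the mixture

    # R_meas = (n_s*A194_nat + n_sp*A194_spk) / (n_s*A195_nat + n_sp*A195_spk), solved for n_s:
    n_sample_pmol = n_spike_pmol * (A194_spk - R_meas * A195_spk) / (R_meas * A195_nat - A194_nat)
    print(f"sample Pt in the ablated area: {n_sample_pmol:.2f} pmol")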

  5. 4D Subject-Specific Inverse Modeling of the Chick Embryonic Heart Outflow Tract Hemodynamics

    PubMed Central

    Goenezen, Sevan; Chivukula, Venkat Keshav; Midgett, Madeline; Phan, Ly; Rugonyi, Sandra

    2015-01-01

    Blood flow plays a critical role in regulating embryonic cardiac growth and development, with altered flow leading to congenital heart disease. Progress in the field, however, is hindered by a lack of quantification of hemodynamic conditions in the developing heart. In this study, we present a methodology to quantify blood flow dynamics in the embryonic heart using subject-specific computational fluid dynamics (CFD) models. While the methodology is general, we focused on a model of the chick embryonic heart outflow tract (OFT), which distally connects the heart to the arterial system, and is the region of origin of many congenital cardiac defects. Using structural and Doppler velocity data collected from optical coherence tomography (OCT), we generated 4D (3D + time) embryo-specific CFD models of the heart OFT. To replicate the blood flow dynamics over time during the cardiac cycle, we developed an iterative inverse-method optimization algorithm, which determines the CFD model boundary conditions such that differences between computed velocities and measured velocities at one point within the OFT lumen are minimized. Results from our developed CFD model agree with previously measured hemodynamics in the OFT. Further, computed velocities and measured velocities differ by less than 15% at locations that were not used in the optimization, validating the model. The presented methodology can be used to quantify embryonic cardiac hemodynamics under normal and altered blood flow conditions, enabling an in-depth quantitative study of how blood flow influences cardiac development. PMID:26361767
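
    The inverse step described above amounts to tuning boundary-condition parameters until the computed velocity at the measurement point matches the Doppler trace. The sketch below is a minimal illustration rather than the authors' algorithm: the 4D CFD solver is replaced by a hypothetical placeholder function (run_cfd_forward), the measured velocities are synthetic, and a generic optimizer minimizes the mismatch.

```python
# Minimal sketch of the inverse boundary-condition fitting idea.
# run_cfd_forward and the "measured" velocities are hypothetical placeholders;
# the actual study couples such a loop to a full 4D CFD solver.
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0.0, 1.0, 50)                    # one cardiac cycle (normalized time)
v_measured = 0.3 + 0.2 * np.sin(2 * np.pi * t)   # velocity at one lumen point (synthetic)

def run_cfd_forward(bc_params, t):
    """Placeholder forward model: maps boundary-condition parameters
    (inlet amplitude and phase) to velocity at the probe point."""
    amp, phase = bc_params
    return amp * (1.0 + 0.6 * np.sin(2 * np.pi * t + phase))

def mismatch(bc_params):
    # Sum-of-squares difference between computed and measured velocities,
    # the quantity the inverse method drives toward zero.
    v_model = run_cfd_forward(bc_params, t)
    return np.sum((v_model - v_measured) ** 2)

result = minimize(mismatch, x0=[0.2, 0.0], method="Nelder-Mead")
print("fitted boundary-condition parameters:", result.x)
```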

  6. A multifractal approach to space-filling recovery for PET quantification.

    PubMed

    Willaime, Julien M Y; Aboagye, Eric O; Tsoumpas, Charalampos; Turkheimer, Federico E

    2014-11-01

    A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without the need for a robust segmentation step, and is used to recover accurate estimates of total lesion activity (TLA). A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUV mean) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic (18)F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical (18)F-fluorothymidine PET test-retest dataset. TLA estimates were stable for a range of resolutions typical in PET oncology (4-6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUV mean or TV measurements across imaging protocols. The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.
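
    The space-filling index at the heart of this method is a fractal dimension. As a purely illustrative sketch (a 2D binary mask and plain box counting, not the multifractal estimator applied to PET intensities in the paper), the following shows how such an index is obtained from box counts at several scales.

```python
# Minimal box-counting sketch of a space-filling (fractal dimension) estimate.
# Real PET lesions are 3D intensity objects in nonzero background; this 2D binary
# example only illustrates how the index follows from box counts.
import numpy as np

def box_count_dimension(mask, box_sizes=(1, 2, 4, 8, 16)):
    counts = []
    for s in box_sizes:
        n = 0
        for i in range(0, mask.shape[0], s):
            for j in range(0, mask.shape[1], s):
                if mask[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    # Slope of log(count) vs log(1/size) gives the box-counting dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

mask = np.zeros((64, 64), dtype=bool)
yy, xx = np.ogrid[:64, :64]
mask[(yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2] = True  # filled disk, dimension ~2
print("estimated box-counting dimension:", round(box_count_dimension(mask), 2))
```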

  7. Stochastic approach for radionuclides quantification

    NASA Astrophysics Data System (ADS)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

    Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods that use empirical calibration against a standard to determine a calibration coefficient, and thereby quantify the activity of nuclear materials, are of no use on non-reproducible, complex, one-of-a-kind nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another, resulting in a high variability of objects. The current quantification process uses numerical modelling of the measured scene with the few available data, such as geometry or composition. These data are density, material, screens, geometric shape, matrix composition, and matrix and source distribution. Some of them depend strongly on knowledge of the package data and on operator background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment or knowledge of the internal package configuration. The method combines a global stochastic approach that uses, among other tools, surrogate models of the gamma attenuation behaviour; a Bayesian approach that considers conditional probability densities of the problem inputs; and Markov Chain Monte Carlo (MCMC) algorithms that solve the inverse problem, given the gamma-ray emission radionuclide spectrum and the outside dimensions of the objects of interest. The methodology is being tested on standards of known actinide masses, locations and distributions to quantify actinide activity for different kinds of matrices, compositions and source configurations. Activity uncertainties are taken into account by this adjustment methodology.
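
    A minimal sketch of the Bayesian/MCMC ingredient is given below: a toy attenuation surrogate, flat priors, a Gaussian likelihood on a single measured count rate, and a random-walk Metropolis sampler. The surrogate, prior ranges and numbers are hypothetical placeholders, not the CEA models or data.

```python
# Minimal Metropolis-Hastings sketch of a Bayesian inversion for source activity.
import numpy as np

rng = np.random.default_rng(0)

def surrogate_count_rate(activity, matrix_density):
    # Toy surrogate: emitted rate attenuated exponentially by the matrix.
    return activity * np.exp(-0.5 * matrix_density)

measured_rate, sigma = 120.0, 10.0                 # observed net count rate and its uncertainty

def log_posterior(theta):
    activity, density = theta
    if not (0 < activity < 1e4 and 0.1 < density < 5.0):   # flat priors on both unknowns
        return -np.inf
    model = surrogate_count_rate(activity, density)
    return -0.5 * ((measured_rate - model) / sigma) ** 2    # Gaussian log-likelihood

theta = np.array([300.0, 1.0])
samples = []
for _ in range(20000):
    proposal = theta + rng.normal(0, [20.0, 0.1])           # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    samples.append(theta)

activity_samples = np.array(samples)[5000:, 0]              # discard burn-in
print("posterior activity: %.0f +/- %.0f" % (activity_samples.mean(), activity_samples.std()))
```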

  8. Quantification of model uncertainty in aerosol optical thickness retrieval from Ozone Monitoring Instrument (OMI) measurements

    NASA Astrophysics Data System (ADS)

    Määttä, A.; Laine, M.; Tamminen, J.; Veefkind, J. P.

    2013-09-01

    We study uncertainty quantification in remote sensing of aerosols in the atmosphere with top-of-the-atmosphere reflectance measurements from the nadir-viewing Ozone Monitoring Instrument (OMI). The focus is on the uncertainty in selecting among pre-calculated aerosol models and on the statistical modelling of the model inadequacies. The aim is to apply statistical methodologies that improve the uncertainty estimates of the aerosol optical thickness (AOT) retrieval by propagating model-selection and model-error related uncertainties more realistically. We utilise Bayesian model selection and model averaging methods for the model selection problem and use Gaussian processes to model the smooth systematic discrepancies between the modelled and observed reflectance. The systematic model error is learned from an ensemble of operational retrievals. The operational OMI multi-wavelength aerosol retrieval algorithm OMAERO is used for cloud-free, over-land pixels of the OMI instrument with the additional Bayesian model selection and model discrepancy techniques. The method is demonstrated with four examples with different aerosol properties: weakly absorbing aerosols, forest fires over Greece and Russia, and Sahara desert dust. The presented statistical methodology is general; it is not restricted to this particular satellite retrieval application.

  9. Detection and quantification of MS lesions using fuzzy topological principles

    NASA Astrophysics Data System (ADS)

    Udupa, Jayaram K.; Wei, Luogang; Samarasekera, Supun; Miki, Yukio; van Buchem, M. A.; Grossman, Robert I.

    1996-04-01

    Quantification of the severity of the multiple sclerosis (MS) disease through estimation of lesion volume via MR imaging is vital for understanding and monitoring the disease and its treatment. This paper presents a novel methodology and a system that can be routinely used for segmenting and estimating the volume of MS lesions via dual-echo spin-echo MR imagery. An operator indicates a few points in the images by pointing to the white matter, the gray matter, and the CSF. Each of these objects is then detected as a fuzzy connected set. The holes in the union of these objects correspond to potential lesion sites which are utilized to detect each potential lesion as a fuzzy connected object. These 3D objects are presented to the operator who indicates acceptance/rejection through the click of a mouse button. The volume of accepted lesions is then computed and output. Based on several evaluation studies and over 300 3D data sets that were processed, we conclude that the methodology is highly reliable and consistent, with a coefficient of variation (due to subjective operator actions) of less than 1.0% for volume.

  10. Quantile-based bias correction and uncertainty quantification of extreme event attribution statements

    DOE PAGES

    Jeon, Soyoung; Paciorek, Christopher J.; Wehner, Michael F.

    2016-02-16

    Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
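
    The rescaling idea can be illustrated with a minimal empirical quantile-mapping sketch (synthetic data, not the model or observational products used in the study): each model value is replaced by the observed value at the same empirical quantile before exceedance probabilities, and hence risk ratios, are computed.

```python
# Minimal quantile-mapping sketch of the bias-correction step described above.
import numpy as np

rng = np.random.default_rng(1)
obs = rng.normal(30.0, 3.0, 1000)            # observed summer temperatures (synthetic)
model = rng.normal(27.0, 4.0, 1000)          # biased climate-model temperatures (synthetic)

def quantile_map(x, model_sample, obs_sample):
    """Map value x from the model distribution onto the observed distribution."""
    p = np.mean(model_sample <= x)           # empirical quantile of x in the model sample
    return np.quantile(obs_sample, p)        # value at the same quantile in the observations

threshold = 35.0                             # extreme-event threshold
corrected = np.array([quantile_map(x, model, obs) for x in model])
print("raw model exceedance prob:      ", np.mean(model > threshold))
print("bias-corrected exceedance prob: ", np.mean(corrected > threshold))
```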

  11. Extension of least squares spectral resolution algorithm to high-resolution lipidomics data.

    PubMed

    Zeng, Ying-Xu; Mjøs, Svein Are; David, Fabrice P A; Schmid, Adrien W

    2016-03-31

    Lipidomics, which focuses on the global study of molecular lipids in biological systems, has been driven tremendously by technical advances in mass spectrometry (MS) instrumentation, particularly high-resolution MS. This requires powerful computational tools that handle the high-throughput lipidomics data analysis. To address this issue, a novel computational tool has been developed for the analysis of high-resolution MS data, including the data pretreatment, visualization, automated identification, deconvolution and quantification of lipid species. The algorithm features the customized generation of a lipid compound library and mass spectral library, which covers the major lipid classes such as glycerolipids, glycerophospholipids and sphingolipids. Next, the algorithm performs least squares resolution of spectra and chromatograms based on the theoretical isotope distribution of molecular ions, which enables automated identification and quantification of molecular lipid species. Currently, this methodology supports analysis of both high and low resolution MS as well as liquid chromatography-MS (LC-MS) lipidomics data. The flexibility of the methodology allows it to be expanded to support more lipid classes and more data interpretation functions, making it a promising tool in lipidomic data analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
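
    The core numerical step, resolving an observed isotope cluster as a non-negative combination of theoretical isotope distributions by least squares, can be sketched as below. The two library patterns and the mixed spectrum are synthetic stand-ins, not the lipid library generated by the tool.

```python
# Minimal sketch of least-squares resolution of a spectrum against theoretical
# isotope distributions, using non-negative least squares.
import numpy as np
from scipy.optimize import nnls

# Theoretical isotope patterns (M, M+1, M+2 relative abundances) of two species;
# columns of `lib` are the library spectra.
lib = np.array([[1.00, 0.56, 0.17],    # "lipid A" (illustrative)
                [1.00, 0.44, 0.11]]).T  # "lipid B" (illustrative)

true_amounts = np.array([3.0, 1.5])
observed = lib @ true_amounts + np.random.default_rng(2).normal(0, 0.02, 3)

amounts, residual = nnls(lib, observed)     # resolve the mixed spectrum
print("estimated amounts:", np.round(amounts, 2), "residual:", round(residual, 3))
```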

  12. On uncertainty quantification in hydrogeology and hydrogeophysics

    NASA Astrophysics Data System (ADS)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.

  13. Validation of a fast and accurate chromatographic method for detailed quantification of vitamin E in green leafy vegetables.

    PubMed

    Cruz, Rebeca; Casal, Susana

    2013-11-15

    Vitamin E analysis in green vegetables is performed by an array of different methods, making it difficult to compare published data or choosing the adequate one for a particular sample. Aiming to achieve a consistent method with wide applicability, the current study reports the development and validation of a fast micro-method for quantification of vitamin E in green leafy vegetables. The methodology uses solid-liquid extraction based on the Folch method, with tocol as internal standard, and normal-phase HPLC with fluorescence detection. A large linear working range was confirmed, being highly reproducible, with inter-day precisions below 5% (RSD). Method sensitivity was established (below 0.02 μg/g fresh weight), and accuracy was assessed by recovery tests (>96%). The method was tested in different green leafy vegetables, evidencing diverse tocochromanol profiles, with variable ratios and amounts of α- and γ-tocopherol, and other minor compounds. The methodology is adequate for routine analyses, with a reduced chromatographic run (<7 min) and organic solvent consumption, and requires only standard chromatographic equipment available in most laboratories. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Quantification of endocrine disruptors and pesticides in water by gas chromatography-tandem mass spectrometry. Method validation using weighted linear regression schemes.

    PubMed

    Mansilha, C; Melo, A; Rebelo, H; Ferreira, I M P L V O; Pinho, O; Domingues, V; Pinho, C; Gameiro, P

    2010-10-22

    A multi-residue methodology based on a solid phase extraction followed by gas chromatography-tandem mass spectrometry was developed for trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine disrupting properties. Matrix standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample to compensate matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was done mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticides analysis and/or GC-MS methodology. As the assumption of homoscedasticity was not met for analytical data, weighted least squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the greater concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limit of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness. Copyright © 2010 Elsevier B.V. All rights reserved.
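
    Weighted least squares calibration of this kind can be sketched as follows; the calibration points and the 1/x² weighting scheme below are illustrative choices, not the exact values or weights reported in the paper.

```python
# Minimal weighted-least-squares calibration sketch: 1/x**2 weights damp the
# influence of the high-concentration standards on the fitted line.
import numpy as np

conc = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0])           # standard concentrations (ug/L)
resp = np.array([0.9, 2.1, 10.4, 19.8, 101.0, 204.0])       # detector responses (illustrative)

w = 1.0 / conc**2                                            # WLS weights (1/x^2 scheme)
slope, intercept = np.polyfit(conc, resp, 1, w=np.sqrt(w))   # np.polyfit squares its weights internally
print("weighted fit: response = %.2f * conc + %.2f" % (slope, intercept))

unknown_resp = 3.5
print("back-calculated concentration:", round((unknown_resp - intercept) / slope, 3), "ug/L")
```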

  15. Capillary liquid chromatography-ion trap-mass spectrometry methodology for the simultaneous quantification of four angiotensin-converting enzyme-inhibitory peptides in Prunus seed hydrolysates.

    PubMed

    González-García, Estefanía; García, María Concepción; Marina, María Luisa

    2018-03-09

    Prunus genus fruit seeds are sources of peptides with high angiotensin-I-converting enzyme (ACE)-inhibitory activity. The presence of the peptides IYSPH, IYTPH, IFSPR, and VAIP seems to be related to this activity, but no previous work has demonstrated a direct relationship between the concentration of these peptides and the antihypertensive activity of hydrolysates. This work describes the development of a method for the quantification of these peptides in Prunus seed hydrolysates based on capillary liquid chromatography-IT-MS/MS. The analytical characteristics of the method were evaluated through the study of linearity, LOD, LOQ, presence of matrix interferences, precision, and recovery. The developed methodology was applied to the determination of the four peptides in seed hydrolysates from different Prunus genus fruits: peaches (7 varieties), plums (2 varieties), nectarines (3 varieties), apricots (2 varieties), cherry, and paraguayo. Peach and plum seed hydrolysates yielded the highest concentrations of these peptides, while the paraguayo hydrolysate showed the lowest. A high correlation between peptide concentrations was found, suggesting that the four peptides could be released from the same seed proteins. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Contribution of plastic waste recovery to greenhouse gas (GHG) savings in Spain.

    PubMed

    Sevigné-Itoiz, Eva; Gasol, Carles M; Rieradevall, Joan; Gabarrell, Xavier

    2015-12-01

    This paper concentrates on the quantification of greenhouse gas (GHG) emissions of post-consumer plastic waste recovery (material or energy) by considering the influence of the plastic waste quality (high or low), the recycled plastic applications (virgin plastic substitution or non-plastic substitution) and the markets of recovered plastic (regional or global). The aim is to quantify the environmental consequences of different alternatives in order to evaluate opportunities and limitations to select the best and most feasible plastic waste recovery option to decrease the GHG emissions. The methodologies of material flow analysis (MFA) for a time period of thirteen years and consequential life cycle assessment (CLCA) have been integrated. The study focuses on Spain as a representative country for Europe. The results show that to improve resource efficiency and avoid more GHG emissions, the options for plastic waste management are dependent on the quality of the recovered plastic. The results also show that there is an increasing trend of exporting plastic waste for recycling, mainly to China, that reduces the GHG benefits from recycling, suggesting that a new focus should be introduced to take into account the split between local recycling and exporting. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Near-infrared spectroscopy for the detection and quantification of bacterial contaminations in pharmaceutical products.

    PubMed

    Quintelas, Cristina; Mesquita, Daniela P; Lopes, João A; Ferreira, Eugénio C; Sousa, Clara

    2015-08-15

    Accurate detection and quantification of microbiological contaminations remains an issue, mainly due to the lack of rapid and precise analytical techniques. Standard methods are expensive and time-consuming and are associated with high economic losses and public health threats. In the context of the pharmaceutical industry, the development of fast analytical techniques able to overcome these limitations is crucial, and spectroscopic techniques might constitute a reliable alternative. In this work we proved the ability of Fourier transform near-infrared spectroscopy (FT-NIRS) to detect and quantify bacteria (Bacillus subtilis, Escherichia coli, Pseudomonas fluorescens, Salmonella enterica, Staphylococcus epidermidis) from 10 to 10(8) CFU/mL in sterile saline solutions (NaCl 0.9%). Partial least squares discriminant analysis (PLSDA) models showed that FT-NIRS was able to discriminate between sterile and contaminated solutions for all bacteria as well as to identify the contaminant bacteria. Partial least squares (PLS) models allowed bacterial quantification with limits of detection ranging from 5.1 to 9 CFU/mL for E. coli and B. subtilis, respectively. This methodology was successfully validated in three pharmaceutical preparations (contact lens solution, cough syrup and topical anti-inflammatory solution), proving that this technique possesses high potential to be routinely used for the detection and quantification of bacterial contaminations. Copyright © 2015 Elsevier B.V. All rights reserved.
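
    A minimal sketch of the two chemometric steps (PLS-DA classification and PLS quantification) is shown below using scikit-learn; the "spectra" are random surrogates with a weak embedded signal, purely to illustrate the fitting calls rather than to reproduce the FT-NIRS models.

```python
# Minimal sketch of PLS-DA (sterile vs contaminated) and PLS quantification.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
n_samples, n_wavelengths = 60, 200
log_cfu = rng.uniform(1, 8, n_samples)                  # log10 CFU/mL per sample (synthetic)
spectra = rng.normal(0, 1, (n_samples, n_wavelengths))  # surrogate NIR spectra
spectra[:, 50] += 0.05 * log_cfu                        # embed a weak concentration signal

# PLS-DA: regress a 0/1 class label (sterile vs contaminated) on the spectra.
labels = (log_cfu > 1.5).astype(float)
plsda = PLSRegression(n_components=3).fit(spectra, labels)

# PLS quantification: regress log10 CFU/mL on the spectra.
pls = PLSRegression(n_components=3).fit(spectra, log_cfu)
print("predicted log10 CFU/mL (first 5):", np.round(pls.predict(spectra[:5]).ravel(), 2))
```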

  18. Adaptive polynomial chaos techniques for uncertainty quantification of a gas cooled fast reactor transient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perko, Z.; Gilli, L.; Lathouwers, D.

    2013-07-01

    Uncertainty quantification plays an increasingly important role in the nuclear community, especially with the rise of Best Estimate Plus Uncertainty methodologies. Sensitivity analysis, surrogate models, Monte Carlo sampling and several other techniques can be used to propagate input uncertainties. In recent years, however, polynomial chaos expansion has become a popular alternative, providing high accuracy at affordable computational cost. This paper presents such polynomial chaos (PC) methods using adaptive sparse grids and adaptive basis set construction, together with an application to a Gas Cooled Fast Reactor transient. A comparison is made between a new sparse grid algorithm and the traditionally used technique proposed by Gerstner. An adaptive basis construction method is also introduced and is shown to be advantageous from both an accuracy and a computational point of view. As a demonstration, the uncertainty quantification of a 50% loss-of-flow transient in the GFR2400 Gas Cooled Fast Reactor design was performed using the CATHARE code system. The results are compared to direct Monte Carlo sampling and show the superior convergence and high accuracy of the polynomial chaos expansion. Since PC techniques are easy to implement, they can offer an attractive alternative to traditional techniques for the uncertainty quantification of large-scale problems. (authors)
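
    A minimal non-intrusive polynomial chaos sketch for a single Gaussian input is given below (a toy response function rather than the CATHARE transient, and ordinary regression rather than adaptive sparse grids): the response is projected onto probabilists' Hermite polynomials, and the coefficients then yield the output mean and variance directly.

```python
# Minimal non-intrusive polynomial chaos expansion for one standardized Gaussian input.
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def model(xi):
    # Toy code response to an uncertain input xi ~ N(0, 1).
    return 600.0 + 25.0 * xi + 4.0 * xi**2

rng = np.random.default_rng(4)
xi = rng.normal(0, 1, 200)                      # sampled standardized inputs
y = model(xi)

order = 3
Psi = hermevander(xi, order)                    # He_0..He_3 evaluated at the samples
coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# For probabilists' Hermite polynomials, E[He_n^2] = n!, so mean and variance follow
# directly from the expansion coefficients.
norms = np.array([math.factorial(n) for n in range(order + 1)])
mean = coeffs[0]
variance = np.sum(coeffs[1:] ** 2 * norms[1:])
print("PCE mean %.1f, std %.1f" % (mean, np.sqrt(variance)))
```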

  19. Challenges for continuity of L-Band observations over land

    USDA-ARS?s Scientific Manuscript database

    Over land, L-band observations are primarily used for the detection of soil freeze/thaw events and the quantification of surface soil moisture content. Both products have important science, climate and decision support applications and would benefit from longer historical data records derived from s...

  20. Analysis of ISO NE Balancing Requirements: Uncertainty-based Secure Ranges for ISO New England Dynamic Interchange Adjustments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel V.; Makarov, Yuri V.; Wu, Di

    The document describes a detailed uncertainty quantification (UQ) methodology developed by PNNL to estimate secure ranges of potential dynamic intra-hour interchange adjustments in the ISO-NE system and provides a description of the dynamic interchange adjustment (DINA) tool developed under the same contract. The overall system ramping-up and ramping-down capability, spinning reserve requirements, interchange schedules, load variations, and uncertainties from various sources that are relevant to the ISO-NE system are incorporated into the methodology and the tool. The DINA tool has been tested by PNNL and ISO-NE staff engineers using ISO-NE data.

  1. New methodology for the analysis of volatile organic compounds (VOCs) in bioethanol by gas chromatography coupled to mass spectrometry

    NASA Astrophysics Data System (ADS)

    Campos, M. S. G.; Sarkis, J. E. S.

    2018-03-01

    This study presents a new analytical methodology for the determination of 11 compounds present in ethanol samples by gas chromatography coupled to mass spectrometry (GC-MS), using a medium-polarity chromatography column composed of 6% cyanopropyl-phenyl and 94% dimethyl polysiloxane. The validation parameters were determined according to NBR ISO 17025:2005. The recovery rates of the studied compounds ranged from 100.4% to 114.7%. The limits of quantification are between 2.4 mg.kg-1 and 5.8 mg.kg-1. The measurement uncertainty was estimated at approximately 8%.

  2. Quantification and micron-scale imaging of spatial distribution of trace beryllium in shrapnel fragments and metallurgic samples with correlative fluorescence detection method and secondary ion mass spectrometry (SIMS)

    PubMed Central

    Abraham, Jerrold L.; Chandra, Subhash; Agrawal, Anoop

    2014-01-01

    Recently, a report raised the possibility of shrapnel-induced chronic beryllium disease (CBD) from long-term exposure to the surface of retained aluminum shrapnel fragments in the body. Since the shrapnel fragments contained trace beryllium, methodological developments were needed for beryllium quantification and to study its spatial distribution in relation to other matrix elements, such as aluminum and iron, in metallurgic samples. In this work, we developed a methodology for quantification of trace beryllium in samples of shrapnel fragments and other metallurgic sample types with a main matrix of aluminum (aluminum cans from soda, beer, carbonated water, and aluminum foil). Sample preparation procedures were developed for dissolving beryllium for its quantification with the fluorescence detection method for homogenized measurements. The spatial distribution of trace beryllium on the sample surface and in 3D was imaged with a dynamic secondary ion mass spectrometry (SIMS) instrument, the CAMECA IMS 3f SIMS ion microscope. The beryllium content of shrapnel (~100 ppb) was the same as the trace quantities of beryllium found in aluminum cans. The beryllium content of aluminum foil (~25 ppb) was significantly lower than cans. SIMS imaging analysis revealed beryllium to be distributed in the form of low micron-sized particles and clusters distributed randomly in the X-, Y- and Z-dimensions, and often in association with iron, in the main aluminum matrix of cans. These observations indicate a plausible formation of Be-Fe or Al-Be alloy in the matrix of cans. Further observations were made on fluids (carbonated water) to understand whether trace beryllium in cans leached out and contaminated the food product. A direct comparison of carbonated water in aluminum cans and plastic bottles revealed that beryllium was below the detection limits of the fluorescence detection method (~0.01 ppb). These observations indicate that beryllium present in the aluminum matrix was either in an immobile form or its mobilization into the food product was prevented by a polymer coating on the inside of cans, a practice used in the food industry to prevent contamination of food products. The lack of such coating in retained shrapnel fragments renders their surface a possible source of contamination for long-term exposure of tissues and fluids and induction of disease, as characterized in a recent study. Methodological developments reported here can be extended to studies of beryllium in electronic devices and components. PMID:25146877

  3. Quantification and micron-scale imaging of spatial distribution of trace beryllium in shrapnel fragments and metallurgic samples with correlative fluorescence detection method and secondary ion mass spectrometry (SIMS).

    PubMed

    Abraham, J L; Chandra, S; Agrawal, A

    2014-11-01

    Recently, a report raised the possibility of shrapnel-induced chronic beryllium disease from long-term exposure to the surface of retained aluminum shrapnel fragments in the body. Since the shrapnel fragments contained trace beryllium, methodological developments were needed for beryllium quantification and to study its spatial distribution in relation to other matrix elements, such as aluminum and iron, in metallurgic samples. In this work, we developed methodology for quantification of trace beryllium in samples of shrapnel fragments and other metallurgic sample-types with main matrix of aluminum (aluminum cans from soda, beer, carbonated water and aluminum foil). Sample preparation procedures were developed for dissolving beryllium for its quantification with the fluorescence detection method for homogenized measurements. The spatial distribution of trace beryllium on the sample surface and in 3D was imaged with a dynamic secondary ion mass spectrometry instrument, CAMECA IMS 3f secondary ion mass spectrometry ion microscope. The beryllium content of shrapnel (∼100 ppb) was the same as the trace quantities of beryllium found in aluminum cans. The beryllium content of aluminum foil (∼25 ppb) was significantly lower than cans. SIMS imaging analysis revealed beryllium to be distributed in the form of low micron-sized particles and clusters distributed randomly in X-Y- and Z dimensions, and often in association with iron, in the main aluminum matrix of cans. These observations indicate a plausible formation of Be-Fe or Al-Be alloy in the matrix of cans. Further observations were made on fluids (carbonated water) for understanding if trace beryllium in cans leached out and contaminated the food product. A direct comparison of carbonated water in aluminum cans and plastic bottles revealed that beryllium was below the detection limits of the fluorescence detection method (∼0.01 ppb). These observations indicate that beryllium present in aluminum matrix was either present in an immobile form or its mobilization into the food product was prevented by a polymer coating on the inside of cans, a practice used in food industry to prevent contamination of food products. The lack of such coating in retained shrapnel fragments renders their surface a possible source of contamination for long-term exposure of tissues and fluids and induction of disease, as characterized in a recent study. Methodological developments reported here can be extended to studies of beryllium in electronics devices and components. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.

  4. Using discrete choice experiments within a cost-benefit analysis framework: some considerations.

    PubMed

    McIntosh, Emma

    2006-01-01

    A great advantage of the stated preference discrete choice experiment (SPDCE) approach to economic evaluation methodology is its immense flexibility within applied cost-benefit analyses (CBAs). However, while the use of SPDCEs in healthcare has increased markedly in recent years, there has been a distinct lack of equivalent CBAs in healthcare using such SPDCE-derived valuations. This article outlines specific issues and some practical suggestions relevant to the development of CBAs using SPDCE-derived benefits. The article shows that SPDCE-derived CBA can adopt recent developments in cost-effectiveness methodology, including the cost-effectiveness plane, appropriate consideration of uncertainty, the net-benefit framework and probabilistic sensitivity analysis methods, while maintaining the theoretical advantage of the SPDCE approach. The concept of a cost-benefit plane is no different in principle from the cost-effectiveness plane and can be a useful tool for reporting and presenting the results of CBAs. However, there are many challenging issues to address for the advancement of CBA methodology using SPDCEs within healthcare. Particular areas for development include the importance of accounting for uncertainty in SPDCE-derived willingness-to-pay values, the methodology of SPDCEs in clinical trial settings and economic models, measurement issues pertinent to using SPDCEs specifically in healthcare, and the importance of issues such as consideration of the dynamic nature of healthcare and the resulting impact this has on the validity of attribute definitions and context.

  5. Learning-by-catching: uncertain invasive-species populations and the value of information.

    PubMed

    D'Evelyn, Sean T; Tarui, Nori; Burnett, Kimberly; Roumasset, James A

    2008-12-01

    This paper develops a model of invasive species control when the species' population size is unknown. In the face of an uncertain population size, a resource manager's species-control efforts provide two potential benefits: (1) a direct benefit of possibly reducing the population of invasive species, and (2) an indirect benefit of information acquisition (due to learning about the population size, which reduces uncertainty). We provide a methodology that takes into account both of these benefits, and show how optimal management decisions are altered in the presence of the indirect benefit of learning. We then apply this methodology to the case of controlling the Brown Treesnake (Boiga irregularis) on the island of Saipan. We find that the indirect benefit--the value of information to reduce uncertainty--is likely to be quite large.

  6. Economic impact of a nationwide interoperable e-Health system using the PENG evaluation tool.

    PubMed

    Parv, L; Saluse, J; Aaviksoo, A; Tiik, M; Sepper, R; Ross, P

    2012-01-01

    The aim of this paper is to evaluate the costs and benefits of the Estonian interoperable health information exchange system. In addition, a framework will be built for follow-up monitoring and analysis of a nationwide HIE system. The PENG evaluation tool was used to map and quantify the costs and benefits arising from type II diabetic patient management for patients, providers and society. The analysis concludes with a quantification based on real costs and potential benefits identified by a panel of experts. Setting up a countrywide interoperable eHealth system incurs a large initial investment. However, if the system works seamlessly, benefits will surpass costs within three years. The results show that while society stands to benefit the most, the costs will be mainly borne by the healthcare providers. Therefore, new government policies should be devised to encourage providers to invest to ensure society-wide benefits.

  7. Cost-benefit analysis in occupational health: a comparison of intervention scenarios for occupational asthma and rhinitis among bakery workers.

    PubMed

    Meijster, Tim; van Duuren-Stuurman, Birgit; Heederik, Dick; Houba, Remko; Koningsveld, Ernst; Warren, Nicholas; Tielemans, Erik

    2011-10-01

    Use of cost-benefit analysis in occupational health increases insight into the intervention strategy that maximises the cost-benefit ratio. This study presents a methodological framework identifying the most important elements of a cost-benefit analysis for occupational health settings. One of the main aims of the methodology is to evaluate cost-benefit ratios for different stakeholders (employers, employees and society). The developed methodology was applied to two intervention strategies focused on reducing respiratory diseases. A cost-benefit framework was developed and used to set up a calculation spreadsheet containing the inputs and algorithms required to calculate the costs and benefits for all cost elements. Inputs from a large variety of sources were used to calculate total costs, total benefits, net costs and the benefit-to-cost ratio for both intervention scenarios. Implementation of a covenant intervention program resulted in a net benefit of €16 848 546 over 20 years for a population of 10 000 workers. Implementation was cost-effective for all stakeholders. For a health surveillance scenario, total benefits resulting from a decreased disease burden were estimated to be €44 659 352. The costs of the interventions could not be calculated. This study provides important insights for developing effective intervention strategies in the field of occupational medicine. Use of a model-based approach enables investigation of those parameters most likely to impact on the effectiveness and costs of interventions for work-related diseases. Our case study highlights the importance of considering different perspectives (of employers, society and employees) in assessing and sharing the costs and benefits of interventions.

  8. Impact of quality of research on patient outcomes in the Institute of Medicine 2013 report on dietary sodium.

    PubMed

    Lucko, Aaron; Doktorchik, Chelsea Ta; Campbell, Norm Rc

    2018-02-01

    The 2013 Institute of Medicine report entitled "Sodium Intake in Populations: Assessment of Evidence" found inconsistent evidence of health benefit with dietary sodium intake <2300 mg/d. Different studies reported benefit and harm of population dietary intake <2300 mg/d. The Institute of Medicine committee, however, did not assess whether the methodology used in each of the studies was appropriate to examine dietary sodium and health outcomes. This review investigates the association of methodological rigor and outcomes of studies in the Institute of Medicine report. For the 13 studies that met all methodological criteria, nine found a detrimental impact of high sodium consumption on health, one found a health benefit, and in three the effect was unclear (P = .068). For the 22 studies that failed to meet all criteria, 11 showed a detrimental impact, four a health benefit, and seven had unclear effects from increasing dietary sodium (P = .42). ©2018 Wiley Periodicals, Inc.

  9. Quantification of Human and Animal Viruses to Differentiate the Origin of the Fecal Contamination Present in Environmental Samples

    PubMed Central

    Bofill-Mas, Sílvia; Rusiñol, Marta; Fernandez-Cassi, Xavier; Carratalà, Anna; Hundesa, Ayalkibet

    2013-01-01

    Many different viruses are excreted by humans and animals and are frequently detected in fecal-contaminated waters, causing public health concerns. Classical bacterial indicators such as E. coli and enterococci can fail to predict the risk for waterborne pathogens such as viruses. Moreover, the presence and levels of bacterial indicators do not always correlate with the presence and concentration of viruses, especially when these indicators are present in low concentrations. Our research group has proposed new viral indicators and methodologies for determining the presence of fecal pollution in environmental samples as well as for tracing the origin of this fecal contamination (microbial source tracking). In this paper, we examine to what extent these indicators have been applied by the scientific community. Recently, quantitative assays for quantification of poultry and ovine viruses have also been described. Overall, quantification by qPCR of human adenoviruses and human polyomavirus JC, porcine adenoviruses, bovine polyomaviruses, chicken/turkey parvoviruses, and ovine polyomaviruses is suggested as a toolbox for the identification of human, porcine, bovine, poultry, and ovine fecal pollution in environmental samples. PMID:23762826

  10. A study of a self diagnostic platform for the detection of A2 biomarker for Leishmania donovani

    NASA Astrophysics Data System (ADS)

    Roche, Philip J. R.; Cheung, Maurice C.; Najih, Mohamed; McCall, Laura-Isobel; Fakih, Ibrahim; Chodavarapu, Vamsy P.; Ward, Brian; Ndao, Momar; Kirk, Andrew G.

    2012-03-01

    Visceral leishmaniasis (L. donovani) is a protozoan infection that attacks mononuclear phagocytes and causes liver and spleen damage that can lead to death. The investigation presented is a proof-of-concept development applying a plasmonic diagnostic platform with simple microfluidic sample delivery and optical readout. An immunoassay method is applied to the quantification of A2 protein, a highly immunogenic biomarker for the pathogen. Quantification of A2 was performed in the ng/ml range; analysis by ELISA suggested that a limit of 0.1 ng/ml of A2 corresponds to approximately 1 pathogen per ml, and the sensing system shows the potential to deliver a similar level of quantification. Assay complexity is significantly reduced because further enzyme-linked enhancement is not required when applying a plasmonic methodology to an immunoassay. The basic instrumentation required for a portable device is described, together with a potential dual optical readout in which both the plasmonic and photoluminescent responses are assessed; application of the device to testing settings where non-literate communication of results is required is also considered, and performance issues are addressed.

  11. Uncertainty Quantification and Certification Prediction of Low-Boom Supersonic Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Reuter, Bryan W.; Walker, Eric L.; Kleb, Bil; Park, Michael A.

    2014-01-01

    The primary objective of this work was to develop and demonstrate a process for accurate and efficient uncertainty quantification and certification prediction of low-boom, supersonic, transport aircraft. High-fidelity computational fluid dynamics models of multiple low-boom configurations were investigated, including the Lockheed Martin SEEB-ALR body of revolution, the NASA 69 Delta Wing, and the Lockheed Martin 1021-01 configuration. A nonintrusive polynomial chaos surrogate modeling approach was used to reduce the computational cost of propagating mixed inherent (aleatory) and model-form (epistemic) uncertainty from both the computational fluid dynamics model and the near-field-to-ground-level propagation model. A methodology has also been introduced to quantify the plausibility of a design passing certification under uncertainty. Results of this study include the analysis of each of the three configurations of interest under inviscid and fully turbulent flow assumptions. A comparison of the uncertainty outputs and sensitivity analyses between the configurations is also given. The results of this study illustrate the flexibility and robustness of the developed framework as a tool for uncertainty quantification and certification prediction of low-boom, supersonic aircraft.

  12. qFlow Cytometry-Based Receptoromic Screening: A High-Throughput Quantification Approach Informing Biomarker Selection and Nanosensor Development.

    PubMed

    Chen, Si; Weddell, Jared; Gupta, Pavan; Conard, Grace; Parkin, James; Imoukhuede, Princess I

    2017-01-01

    Nanosensor-based detection of biomarkers can improve medical diagnosis; however, a critical factor in nanosensor development is deciding which biomarker to target, as most diseases present several biomarkers. Biomarker-targeting decisions can be informed via an understanding of biomarker expression. Currently, immunohistochemistry (IHC) is the accepted standard for profiling biomarker expression. While IHC provides a relative mapping of biomarker expression, it does not provide cell-by-cell readouts of biomarker expression or absolute biomarker quantification. Flow cytometry overcomes both these IHC challenges by offering biomarker expression on a cell-by-cell basis, and when combined with calibration standards, providing quantitation of biomarker concentrations: this is known as qFlow cytometry. Here, we outline the key components for applying qFlow cytometry to detect biomarkers within the angiogenic vascular endothelial growth factor receptor family. The key aspects of the qFlow cytometry methodology include: antibody specificity testing, immunofluorescent cell labeling, saturation analysis, fluorescent microsphere calibration, and quantitative analysis of both ensemble and cell-by-cell data. Together, these methods enable high-throughput quantification of biomarker expression.
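
    The calibration step that turns fluorescence intensities into absolute receptor numbers can be sketched as a simple linear fit against microspheres carrying known numbers of fluorophores; the bead values and cell intensities below are illustrative, not data from the protocol.

```python
# Minimal sketch of microsphere-based calibration for qFlow cytometry:
# bead populations with known fluorophore numbers define a line that converts
# measured fluorescence intensity into receptors per cell.
import numpy as np

bead_fluorophores = np.array([0, 5e3, 2e4, 8e4, 2.5e5])    # known fluorophores per bead
bead_intensity = np.array([120, 900, 3300, 12800, 40100])  # measured median intensities

slope, intercept = np.polyfit(bead_intensity, bead_fluorophores, 1)

cell_intensity = np.array([2500, 5400, 15000])             # per-cell fluorescence readouts
receptors_per_cell = slope * cell_intensity + intercept
print("estimated receptors per cell:", np.round(receptors_per_cell, -1))
```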

  13. Improving microstructural quantification in FIB/SEM nanotomography.

    PubMed

    Taillon, Joshua A; Pellegrinelli, Christopher; Huang, Yi-Lin; Wachsman, Eric D; Salamanca-Riba, Lourdes G

    2018-01-01

    FIB/SEM nanotomography (FIB-nt) is a powerful technique for the determination and quantification of the three-dimensional microstructure in subsurface features. Often, the microstructure of a sample is the ultimate determiner of the overall performance of a system, and a detailed understanding of its properties is crucial in advancing the materials engineering of a resulting device. While the FIB-nt technique has developed significantly in the 15 years since its introduction, advanced nanotomographic analysis is still far from routine, and a number of challenges remain in data acquisition and post-processing. In this work, we present a number of techniques to improve the quality of the acquired data, together with easy-to-implement methods to obtain "advanced" microstructural quantifications. The techniques are applied to a solid oxide fuel cell cathode of interest to the electrochemistry community, but the methodologies are easily adaptable to a wide range of material systems. Finally, results from an analyzed sample are presented as a practical example of how these techniques can be implemented. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. All-Electronic Quantification of Neuropeptide-Receptor Interaction Using a Bias-Free Functionalized Graphene Microelectrode.

    PubMed

    Ping, Jinglei; Vishnubhotla, Ramya; Xi, Jin; Ducos, Pedro; Saven, Jeffery G; Liu, Renyu; Johnson, Alan T Charlie

    2018-05-22

    Opioid neuropeptides play a significant role in pain perception, appetite regulation, sleep, memory, and learning. Advances in understanding of opioid peptide physiology are held back by the lack of methodologies for real-time quantification of affinities and kinetics of the opioid neuropeptide-receptor interaction at levels typical of endogenous secretion (<50 pM) in biosolutions with physiological ionic strength. To address this challenge, we developed all-electronic opioid-neuropeptide biosensors based on graphene microelectrodes functionalized with a computationally redesigned water-soluble μ-opioid receptor. We used the functionalized microelectrode in a bias-free charge measurement configuration to measure the binding kinetics and equilibrium binding properties of the engineered receptor with [d-Ala2, N-MePhe4, Gly-ol]-enkephalin and β-endorphin at picomolar levels in real time.

  15. Use of a portable, automated, open-circuit gas quantification system and the sulfur hexafluoride tracer technique for measuring enteric methane emissions in Holstein cows fed ad libitum or restricted

    USDA-ARS?s Scientific Manuscript database

    The sulfur hexafluoride tracer technique (SF**6) is a commonly used method for measuring CH**4 enteric emissions in ruminants. Studies using SF**6 have shown large variation in CH**4 emissions data, inconsistencies in CH**4 emissions across studies, and potential methodological errors. Therefore, th...

  16. New approach for the quantification of processed animal proteins in feed using light microscopy.

    PubMed

    Veys, P; Baeten, V

    2010-07-01

    A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

  17. Measurements of VOC fluxes by Eddy-covariance with a PTR-Qi-TOF-MS over a mature wheat crop near Paris: Evaluation of data quality and uncertainties.

    NASA Astrophysics Data System (ADS)

    Buysse, Pauline; Loubet, Benjamin; Ciuraru, Raluca; Lafouge, Florence; Zurfluh, Olivier; Gonzaga-Gomez, Lais; Fanucci, Olivier; Gueudet, Jean-Christophe; Decuq, Céline; Gros, Valérie; Sarda, Roland; Zannoni, Nora

    2017-04-01

    The quantification of volatile organic compound (VOC) fluxes exchanged by terrestrial ecosystems is of great interest because of their influence on the chemistry and composition of the atmosphere, including aerosols and oxidants. The latest developments in the techniques for detecting, identifying and measuring VOC fluxes have considerably improved the ability to obtain reliable estimates. Among these, the eddy-covariance (EC) methodology constitutes the most direct approach, and relies both on well-established principles (Aubinet et al. 2000) and on a sound body of experience that continues to improve worldwide. The combination of the EC methodology with the latest proton-transfer-reaction mass spectrometer (PTR-MS) device, the PTR-Qi-TOF-MS, which allows the identification and quantification of more than 500 VOCs at high frequency, now provides a very powerful and precise tool for an accurate quantification of VOC fluxes over various types of terrestrial ecosystems. The complexity of the whole methodology, however, demands that several data quality requirements are fulfilled. VOC fluxes were measured by EC with a PTR-Qi-TOF-MS (national instrument within the ANAEE-France framework) for one and a half months over a mature wheat crop near Paris (FR-GRI ICOS site). The largest emissions (in descending order) were observed for compounds with mass-to-charge (m/z) ratios of 33.033 (methanol), 45.033 (acetaldehyde), 93.033 (not yet identified), 59.049 (acetone), and 63.026 (dimethyl sulfide, DMS). Emissions from higher-mass compounds, which might be due to pesticide applications at the beginning of our observation period, were also detected. Some compounds were also seen to deposit (e.g. m/z 47.013, 71.085, 75.044, 83.05) while others exhibited bidirectional fluxes (e.g. m/z 57.07, 69.07). Before analyzing VOC flux responses to meteorological and crop development drivers, a data quality check was performed which included (i) an uncertainty analysis of the mass and concentration calibration, (ii) determination of fragmentation patterns, (iii) determination of lag times and high-frequency losses for all ions that showed a flux, and (iv) determination of the flux random uncertainties and of the limit of detection.
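
    At its core, the eddy-covariance flux is the covariance of vertical wind speed and concentration fluctuations over an averaging period; the sketch below uses synthetic 10 Hz series and omits the lag correction, detrending and spectral corrections that the quality checks above address.

```python
# Minimal eddy-covariance sketch: turbulent flux as the covariance of vertical
# wind speed and VOC concentration fluctuations over a 30-minute period.
import numpy as np

rng = np.random.default_rng(5)
n = 10 * 60 * 30                                   # 30 min of 10 Hz data
w = rng.normal(0.0, 0.3, n)                        # vertical wind speed (m s-1), synthetic
c = 5.0 + 0.8 * w + rng.normal(0.0, 0.5, n)        # VOC mixing ratio (ppb), correlated with w

w_prime = w - w.mean()                             # fluctuations about the period mean
c_prime = c - c.mean()
flux = np.mean(w_prime * c_prime)                  # ppb m s-1; multiply by air density for a mass flux
print("VOC flux: %.3f ppb m s-1" % flux)
```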

  18. Hydrologic Impacts of Climate Change: Quantification of Uncertainties (Alexander von Humboldt Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Mujumdar, Pradeep P.

    2014-05-01

    Climate change results in regional hydrologic change. The three prominent signals of global climate change, viz., increase in global average temperatures, rise in sea levels and change in precipitation patterns convert into signals of regional hydrologic change in terms of modifications in water availability, evaporative water demand, hydrologic extremes of floods and droughts, water quality, salinity intrusion in coastal aquifers, groundwater recharge and other related phenomena. A major research focus in hydrologic sciences in recent years has been assessment of impacts of climate change at regional scales. An important research issue addressed in this context deals with responses of water fluxes on a catchment scale to the global climatic change. A commonly adopted methodology for assessing the regional hydrologic impacts of climate change is to use the climate projections provided by the General Circulation Models (GCMs) for specified emission scenarios in conjunction with the process-based hydrologic models to generate the corresponding hydrologic projections. The scaling problem arising because of the large spatial scales at which the GCMs operate compared to those required in distributed hydrologic models, and their inability to satisfactorily simulate the variables of interest to hydrology are addressed by downscaling the GCM simulations to hydrologic scales. Projections obtained with this procedure are burdened with a large uncertainty introduced by the choice of GCMs and emission scenarios, small samples of historical data against which the models are calibrated, downscaling methods used and other sources. Development of methodologies to quantify and reduce such uncertainties is a current area of research in hydrology. In this presentation, an overview of recent research carried out by the author's group on assessment of hydrologic impacts of climate change addressing scale issues and quantification of uncertainties is provided. Methodologies developed with conditional random fields, Dempster-Shafer theory, possibility theory, imprecise probabilities and non-stationary extreme value theory are discussed. Specific applications on uncertainty quantification in impacts on streamflows, evaporative water demands, river water quality and urban flooding are presented. A brief discussion on detection and attribution of hydrologic change at river basin scales, contribution of landuse change and likely alterations in return levels of hydrologic extremes is also provided.

  19. A multi-model assessment of the co-benefits of climate mitigation for global air quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Shilpa; Klimont, Zbigniew; Leitao, Joana

    The recent Intergovernmental Panel on Climate Change (IPCC) report identifies significant co-benefits from climate policies on near-term ambient air pollution and related human health outcomes [1]. This is increasingly relevant for policy making as the health impacts of air pollution are a major global concern: the Global Burden of Disease (GBD) study identifies outdoor air pollution as the sixth major cause of death globally [2]. Integrated assessment models (IAMs) are an effective tool to evaluate future air pollution outcomes across a wide range of assumptions on socio-economic development and policy regimes. The Representative Concentration Pathways (RCPs) [3] were the first set of long-term global scenarios developed across multiple integrated assessment models that provided detailed estimates of a number of air pollutants until 2100. However, these scenarios were primarily designed to cover a defined range of radiative forcing outcomes and thus did not specifically focus on the interactions of long-term climate goals with near-term air pollution impacts. More recently, [4] used the RCP4.5 scenario to evaluate the co-benefits of global GHG reductions on air quality and human health in 2030. [5-7] have further examined the interactions of more diverse pollution control regimes with climate policies. This paper extends the listed studies in a number of ways. Firstly, it uses multiple IAMs to look into the co-benefits of a global climate policy for ambient air pollution under harmonized assumptions on near-term air pollution control. Multi-model frameworks have been used extensively in the analysis of climate change mitigation pathways and of the structural uncertainties regarding the underlying mechanisms (see for example [8-10]). This is, to our knowledge, the first time that a multi-model evaluation has been specifically designed and applied to analyze the co-benefits of climate change policy on ambient air quality, thus enabling a better understanding at a detailed sector and region level. A second methodological advancement is a quantification of the co-benefits in terms of the associated atmospheric concentrations of fine particulate matter (PM2.5) and consequent mortality-related outcomes across different models. This is made possible by the use of a state-of-the-art simplified atmospheric model that allows, for the first time, a computationally feasible multi-model evaluation of such outcomes.

  20. Headspace solid-phase microextraction and gas chromatographic analysis of low-molecular-weight sulfur volatiles with pulsed flame photometric detection and quantification by a stable isotope dilution assay.

    PubMed

    Ullrich, Sebastian; Neef, Sylvia K; Schmarr, Hans-Georg

    2018-02-01

    Low-molecular-weight volatile sulfur compounds such as thiols, sulfides, disulfides and thioacetates cause a sulfidic off-flavor in wines even at low concentration levels. The proposed analytical method for quantification of these compounds in wine is based on headspace solid-phase microextraction, followed by gas chromatographic analysis with sulfur-specific detection using a pulsed flame photometric detector. Robust quantification was achieved via a stable isotope dilution assay using commercial and synthesized deuterated isotopic standards. The necessary chromatographic separation of analytes and isotopic standards benefits from the inverse isotope effect realized on an apolar polydimethylsiloxane stationary phase of increased film thickness. Interferences with sulfur-specific detection in wine caused by sulfur dioxide were minimized by addition of propanal. The method provides adequate validation data, with good repeatability and limits of detection and quantification. It suits the requirements of wine quality management, allowing the control of oenological treatments to counteract the formation of excessively high concentrations of such malodorous compounds. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
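
    The core arithmetic of a stable isotope dilution assay is the ratio of the analyte peak area to that of the co-analyzed labelled standard. The sketch below illustrates this calculation only; the peak areas, spike level and response factor are hypothetical values, not data from the paper.

```python
# Stable-isotope-dilution quantification sketch (illustrative values only).
# The analyte concentration is inferred from the ratio of the analyte peak
# area to the peak area of a co-analyzed deuterated standard of known amount.

def sida_concentration(area_analyte, area_istd, conc_istd, response_factor=1.0):
    """Analyte concentration from the peak-area ratio to the isotopic standard.

    response_factor corrects for any difference in detector response between
    the labelled and unlabelled species (determined from calibration mixtures).
    """
    return (area_analyte / area_istd) * conc_istd * response_factor

# Example: hypothetical peak areas for a thiol in a wine sample spiked with
# 10 ug/L of a deuterated analogue.
conc = sida_concentration(area_analyte=5.2e4, area_istd=8.7e4,
                          conc_istd=10.0, response_factor=1.05)
print(f"Estimated concentration: {conc:.2f} ug/L")
```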

  1. How to convince your manager to invest in an HIS preimplementation methodology for appraisal of material, process and human costs and benefits.

    PubMed Central

    Bossard, B.; Renard, J. M.; Capelle, P.; Paradis, P.; Beuscart, M. C.

    2000-01-01

    Investing in information technology has become a crucial process in hospital management today. Medical and administrative managers are faced with difficulties in measuring medical information technology costs and benefits due to the complexity of the domain. This paper proposes a preimplementation methodology for evaluating and appraising material, process and human costs and benefits. Based on the users' needs and organizational process analysis, the methodology provides an evaluative set of financial and non-financial indicators which can be integrated into a decision-making and investment evaluation process. We describe the first results obtained after a few months of operation for the Computer-Based Patient Record (CPR) project. Its full acceptance, in spite of some difficulties, encourages us to disseminate the method across the entire project. PMID:11079851

  2. Human health benefits and burdens of a pharmaceutical treatment: Discussion of a conceptual integrated approach.

    PubMed

    Debaveye, Sam; De Soete, Wouter; De Meester, Steven; Vandijck, Dominique; Heirman, Bert; Kavanagh, Shane; Dewulf, Jo

    2016-01-01

    The effects of a pharmaceutical treatment have until now been evaluated by the field of Health Economics in terms of patient health benefits, expressed in Quality-Adjusted Life Years (QALYs), versus the monetary costs. However, there is also a Human Health burden associated with this process, resulting from emissions that originate from the pharmaceutical production processes, Use Phase and End of Life (EoL) disposal of the medicine. This Human Health burden is evaluated by the research field of Life Cycle Assessment (LCA) and expressed in Disability-Adjusted Life Years (DALYs), a metric similar to the QALY. The need for a new framework presents itself in which both the positive and negative health effects of a pharmaceutical treatment are integrated into a net Human Health effect. To do so, this article reviews the methodologies of both Health Economics and the Human Health area of protection of the LCA methodology and proposes a conceptual framework on which to base an integration of both health effects. Methodological issues such as the inclusion of future costs and benefits, discounting and age weighting are discussed. It is suggested to use the structure of an LCA as a backbone to cover all methodological challenges involved in the integration. The possibility of monetizing both Human Health benefits and burdens is explored. The suggested approach covers the main methodological aspects that should be considered in an integrated assessment of the health effects of a pharmaceutical treatment. Copyright © 2015 Elsevier Inc. All rights reserved.
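
    The conceptual balance described above can be reduced to a very simple calculation once both effects are expressed on the same scale. The sketch below is only an illustration of that balance under assumed, hypothetical numbers (per-patient QALY gain, life-cycle DALY burden, time horizon and discount rate); it is not the framework proposed in the article.

```python
# Sketch of a net human-health balance: QALYs gained by the patient minus
# DALYs caused by life-cycle emissions, both discounted over the same horizon.
# All numbers are hypothetical.

def discounted(annual_effect, years, rate=0.03):
    """Present value of a constant annual health effect over a time horizon."""
    return sum(annual_effect / (1.0 + rate) ** t for t in range(years))

qaly_gain_per_patient_year = 0.15       # assumed treatment benefit
daly_burden_per_patient_year = 0.0005   # assumed life-cycle burden (from LCA)
horizon_years = 10

net_per_year = qaly_gain_per_patient_year - daly_burden_per_patient_year
net_health = discounted(net_per_year, horizon_years)
print(f"Net health effect per patient: {net_health:.3f} QALY-equivalents")
```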

  3. Industry survey of space system cost benefits from New Ways Of Doing Business

    NASA Technical Reports Server (NTRS)

    Rosmait, Russell L.

    1992-01-01

    The cost of designing, building and operating space system hardware has always been high. Small quantities of specialty parts escalate engineering design, production and operations cost. Funding cutbacks and shrinking revenues dictate aggressive cost saving programs. NASA's highest priority is providing economical transportation to and from space. Over the past three decades NASA has seen technological advances that provide greater efficiencies in designing, building, and operating space system hardware. As future programs such as NLS, LUTE and SEI begin, these greater efficiencies and cost savings should be reflected in the cost models. There are several New Ways Of Doing Business (NWODB) which, when fully implemented, will reduce space system costs. These philosophies and/or culture changes are integrated in five areas: (1) More Extensive Pre-Phase C/D & E, (2) Multi Year Funding Stability, (3) Improved Quality, Management and Procurement Processes, (4) Advanced Design Methods, and (5) Advanced Production Methods. Following is an overview of NWODB and the Cost Quantification Analysis results using an industry survey, one of the four quantification techniques used in the study. The NWODB Cost Quantification Analysis is a study performed at Marshall Space Flight Center by the Engineering Cost Group, Applied Research Incorporated and Pittsburg State University. This study took place over a period of four months in mid-1992. The purpose of the study was to identify potential NWODB which could lead to improved cost effectiveness within NASA and to quantify potential cost benefits that might accrue if these NWODB were implemented.

  4. A refined methodology for modeling volume quantification performance in CT

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the nonlinearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.

  5. Outline of cost-benefit analysis and a case study

    NASA Technical Reports Server (NTRS)

    Kellizy, A.

    1978-01-01

    The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.
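
    As a concrete illustration of the kind of comparison such a simplified cost-benefit study supports, the sketch below computes net present values for two hypothetical alternatives; the cash flows and discount rate are invented for illustration and are unrelated to the solar cell case study.

```python
# Minimal cost-benefit comparison of two alternatives via net present value.
# Cash flows (year 0 first) and the discount rate are illustrative only.

def npv(cash_flows, rate):
    """Net present value of a series of yearly net benefits."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

alt_a = [-1_000_000] + [250_000] * 8   # higher up-front cost, larger benefits
alt_b = [-400_000] + [110_000] * 8     # cheaper up-front, smaller benefits

rate = 0.07
for name, flows in (("A", alt_a), ("B", alt_b)):
    print(f"Alternative {name}: NPV = {npv(flows, rate):,.0f}")
```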

  6. An Agile Course-Delivery Approach

    ERIC Educational Resources Information Center

    Capellan, Mirkeya

    2009-01-01

    In the world of software development, agile methodologies have gained popularity thanks to their lightweight methodologies and flexible approach. Many advocates believe that agile methodologies can provide significant benefits if applied in the educational environment as a teaching method. The need for an approach that engages and motivates…

  7. Uncertainty Quantification given Discontinuous Climate Model Response and a Limited Number of Model Runs

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Debusschere, B.; Najm, H.

    2010-12-01

    Uncertainty quantification in complex climate models is challenged by the sparsity of available climate model predictions due to the high computational cost of model runs. Another feature that prevents classical uncertainty analysis from being readily applicable is bifurcative behavior in climate model response with respect to certain input parameters. A typical example is the Atlantic Meridional Overturning Circulation. The predicted maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We outline a methodology for uncertainty quantification given discontinuous model response and a limited number of model runs. Our approach is two-fold. First we detect the discontinuity with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve shape and location for arbitrarily distributed input parameter values. Then, we construct spectral representations of uncertainty, using Polynomial Chaos (PC) expansions on either side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification. The approach is enabled by a Rosenblatt transformation that maps each side of the discontinuity to regular domains where desirable orthogonality properties for the spectral bases hold. We obtain PC modes by either orthogonal projection or Bayesian inference, and argue for a hybrid approach that targets a balance between the accuracy provided by the orthogonal projection and the flexibility provided by the Bayesian inference - where the latter allows obtaining reasonable expansions without extra forward model runs. The model output, and its associated uncertainty at specific design points, are then computed by taking an ensemble average over PC expansions corresponding to possible realizations of the discontinuity curve. The methodology is tested on synthetic examples of discontinuous model data with adjustable sharpness and structure. This work was supported by the Sandia National Laboratories Seniors’ Council LDRD (Laboratory Directed Research and Development) program. Sandia National Laboratories is a multi-program laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Company, for the U.S. Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.
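
    A heavily simplified, one-dimensional illustration of the idea of building separate spectral surrogates on either side of a detected discontinuity is sketched below. It locates the jump from the largest difference between neighbouring samples and fits Legendre polynomial surrogates on each side; the Bayesian discontinuity inference, Rosenblatt transformation and ensemble averaging of the actual methodology are not reproduced, and all data are synthetic.

```python
import numpy as np

# 1-D illustration: estimate a discontinuity location from sparse "model runs",
# then fit separate Legendre (polynomial-chaos-style) surrogates on each side
# and evaluate the piecewise surrogate. Synthetic data; simplified on purpose.

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1, 1, 60))                     # sparse input samples
y = np.where(x < 0.2, np.sin(3 * x), 2.0 + 0.5 * x)     # discontinuous response

# Crude discontinuity estimate: largest jump between neighbouring samples.
jump = np.argmax(np.abs(np.diff(y)))
x_star = 0.5 * (x[jump] + x[jump + 1])

def fit_side(mask, degree=4):
    return np.polynomial.legendre.Legendre.fit(x[mask], y[mask], degree)

left, right = fit_side(x < x_star), fit_side(x >= x_star)

def surrogate(xq):
    xq = np.atleast_1d(xq)
    return np.where(xq < x_star, left(xq), right(xq))

print("estimated jump location:", round(x_star, 3))
print("surrogate at -0.5 and 0.8:", np.round(surrogate([-0.5, 0.8]), 3))
```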

  8. A multifractal approach to space-filling recovery for PET quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willaime, Julien M. Y., E-mail: julien.willaime@siemens.com; Aboagye, Eric O.; Tsoumpas, Charalampos

    2014-11-01

    Purpose: A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without need for a robust segmentation step and used to recover accurate estimates of total lesion activity (TLA). Methods: A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUV(mean)) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic (18)F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical (18)F-fluorothymidine PET test–retest dataset. Results: TLA estimates were stable for a range of resolutions typical in PET oncology (4–6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUV(mean) or TV measurements across imaging protocols. Conclusions: The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.
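
    For readers unfamiliar with space-filling indices, the sketch below shows the standard box-counting estimate of a fractal dimension for a binary 2-D lesion mask. This is a generic, simplified stand-in for the multifractal formulation used in the paper; the toy mask and box sizes are invented.

```python
import numpy as np

# Box-counting estimate of the space-filling (fractal) dimension of a binary
# 2-D mask: count occupied boxes at several box sizes and take the slope of
# log N(s) versus log(1/s). Toy data; a simplified stand-in for the paper's
# multifractal index.

def box_count(mask, box_size):
    """Number of box_size x box_size boxes containing at least one voxel."""
    n0 = (mask.shape[0] // box_size) * box_size
    n1 = (mask.shape[1] // box_size) * box_size
    blocks = mask[:n0, :n1].reshape(n0 // box_size, box_size,
                                    n1 // box_size, box_size)
    return int(blocks.any(axis=(1, 3)).sum())

def fractal_dimension(mask, box_sizes=(2, 4, 8, 16)):
    counts = [box_count(mask, s) for s in box_sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# Toy "lesion": a filled disc of radius 20 voxels in a 128 x 128 image.
yy, xx = np.mgrid[:128, :128]
lesion = (yy - 64) ** 2 + (xx - 64) ** 2 <= 20 ** 2
print(f"Box-counting dimension: {fractal_dimension(lesion):.2f}")  # ~2 for a disc
```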

  9. The economic value of remote sensing of earth resources from space: An ERTS overview and the value of continuity of service. Volume 9: Oceans

    NASA Technical Reports Server (NTRS)

    Lietzke, K. R.

    1974-01-01

    The impact of remote sensing upon marine activities and oceanography is presented. The present capabilities of the current Earth Resources Technology Satellite (ERTS-1), as demonstrated by the principal investigators, are discussed. Cost savings benefits are quantified in the area of nautical and hydrographic mapping and charting. Benefits are found in aiding coastal zone management and in the fields of weather (marine) prediction, fishery harvesting and management, and potential uses for ocean vegetation. Difficulties in quantification are explained, the primary factor being that remotely sensed information will be of greater benefit as input to forecasting models which have not yet been constructed.

  10. New LightCycler PCR for Rapid and Sensitive Quantification of Parvovirus B19 DNA Guides Therapeutic Decision-Making in Relapsing Infections

    PubMed Central

    Harder, Timm C.; Hufnagel, Markus; Zahn, Katrin; Beutel, Karin; Schmitt, Heinz-Josef; Ullmann, Uwe; Rautenberg, Peter

    2001-01-01

    Detection of parvovirus B19 DNA offers diagnostic advantages over serology, particularly in persistent infections of immunocompromised patients. A rapid, novel method of B19 DNA detection and quantification is introduced. This method, a quantitative PCR assay, is based on real-time glass capillary thermocycling (LightCycler [LC]) and fluorescence resonance energy transfer (FRET). The PCR assay allowed quantification over a dynamic range of over 7 logs and could quantify as little as 250 B19 genome equivalents (geq) per ml as calculated for plasmid DNA (i.e., theoretically ≥5 geq per assay). Interrater agreement analysis demonstrated equivalence of LC-FRET PCR and conventional nested PCR in the diagnosis of an active B19 infection (kappa coefficient = 0.83). The benefit of the new method was demonstrated in an immunocompromised child with a relapsing infection, who required an attenuation of the immunosuppressive therapy in addition to repeated doses of immunoglobulin to eliminate the virus. PMID:11724854
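
    Quantification over a wide dynamic range with such an assay rests on a standard curve relating crossing points to log copy number of a plasmid dilution series. The sketch below shows that generic calculation; the crossing-point values, the reaction input volume and the efficiency are invented for illustration and are not taken from the study.

```python
import numpy as np

# Standard-curve quantification for real-time PCR: crossing points (Cp) of a
# plasmid dilution series are regressed against log10 copy number, and the
# fit is inverted for unknown samples. All values are illustrative.

log10_copies_std = np.array([7, 6, 5, 4, 3, 2])           # plasmid standards
cp_std = np.array([14.1, 17.6, 21.0, 24.5, 27.9, 31.4])   # measured Cp values

slope, intercept = np.polyfit(log10_copies_std, cp_std, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                   # ideal PCR: ~1.0 (100%)

def genome_equivalents_per_ml(cp, ml_equivalent_per_reaction=0.01):
    """Genome equivalents per ml from a crossing point and sample input volume."""
    log10_copies = (cp - intercept) / slope
    return 10 ** log10_copies / ml_equivalent_per_reaction

print(f"Amplification efficiency: {efficiency:.2f}")
print(f"Sample with Cp 26.3: {genome_equivalents_per_ml(26.3):.0f} geq/ml")
```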

  11. Risk-benefit analysis and public policy: a bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, E.M.; Van Horn, A.J.

    1976-11-01

    Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these have been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.

  12. Itô-SDE MCMC method for Bayesian characterization of errors associated with data limitations in stochastic expansion methods for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Arnst, M.; Abello Álvarez, B.; Ponthot, J.-P.; Boman, R.

    2017-11-01

    This paper is concerned with the characterization and the propagation of errors associated with data limitations in polynomial-chaos-based stochastic methods for uncertainty quantification. Such an issue can arise in uncertainty quantification when only a limited amount of data is available. When the available information does not suffice to accurately determine the probability distributions that must be assigned to the uncertain variables, the Bayesian method for assigning these probability distributions becomes attractive because it allows the stochastic model to account explicitly for insufficiency of the available information. In previous work, such applications of the Bayesian method had already been implemented by using the Metropolis-Hastings and Gibbs Markov Chain Monte Carlo (MCMC) methods. In this paper, we present an alternative implementation, which uses an alternative MCMC method built around an Itô stochastic differential equation (SDE) that is ergodic for the Bayesian posterior. We draw together from the mathematics literature a number of formal properties of this Itô SDE that lend support to its use in the implementation of the Bayesian method, and we describe its discretization, including the choice of the free parameters, by using the implicit Euler method. We demonstrate the proposed methodology on a problem of uncertainty quantification in a complex nonlinear engineering application relevant to metal forming.
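
    To make the idea of MCMC built on a stochastic differential equation concrete, the sketch below discretizes overdamped Langevin dynamics, whose invariant density is the target posterior, with an explicit Euler-Maruyama step and samples a toy one-dimensional posterior. The referenced work uses an implicit Euler discretization of a different Itô SDE in a far more complex setting, so this is only a generic illustration under assumed toy inputs.

```python
import numpy as np

# Generic illustration of SDE-based MCMC: explicit Euler-Maruyama steps of
# overdamped Langevin dynamics, dX = grad(log pi)(X) dt + sqrt(2) dW, whose
# invariant density is the target posterior pi. Toy target: a standard normal.

rng = np.random.default_rng(1)

def grad_log_posterior(x):
    return -x            # gradient of log N(0, 1)

def langevin_samples(n_steps=50_000, dt=0.05, x0=3.0):
    x = x0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        x = x + grad_log_posterior(x) * dt + rng.normal(scale=np.sqrt(2.0 * dt))
        samples[i] = x
    return samples

s = langevin_samples()
print(f"posterior mean ~ {s[1000:].mean():.2f}, std ~ {s[1000:].std():.2f}")
```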

  13. Large-scale isotype-specific quantification of Serum amyloid A 1/2 by multiple reaction monitoring in crude sera.

    PubMed

    Sung, Hye-Jin; Jeon, Seon-Ae; Ahn, Jung-Mo; Seul, Kyung-Jo; Kim, Jin Young; Lee, Ju Yeon; Yoo, Jong Shin; Lee, Soo-Youn; Kim, Hojoong; Cho, Je-Yoel

    2012-04-03

    Quantification is an essential step in biomarker development. Multiple reaction monitoring (MRM) is a new modified mass spectrometry-based quantification technology that does not require antibody development. Serum amyloid A (SAA) is a positive acute-phase protein identified as a lung cancer biomarker in our previous study. Acute SAA exists in two isoforms with highly similar (92%) amino acid sequences. Until now, studies of SAA have been unable to distinguish between SAA1 and SAA2. To overcome the unavailability of a SAA2-specific antibody, we developed MRM methodology for the verification of SAA1 and SAA2 in clinical crude serum samples from 99 healthy controls and 100 lung adenocarcinoma patients. Differential measurement of SAA1 and SAA2 was made possible for the first time with the developed isotype-specific MRM method. Most healthy control samples had small or no MS/MS peaks for the targeted peptides; in contrast, higher peak areas, with 10- to 34-fold increases over controls, were detected in lung cancer samples. In addition, our SAA1 MRM data demonstrated good agreement with the SAA1 enzyme-linked immunosorbent assay (ELISA) data. Finally, successful quantification of SAA2 in crude serum by MRM, for the first time, shows that SAA2 can be a good biomarker for the detection of lung cancers. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. Methods to Detect Nitric Oxide and its Metabolites in Biological Samples

    PubMed Central

    Bryan, Nathan S.; Grisham, Matthew B.

    2007-01-01

    Nitric oxide (NO) methodology is a complex and often confusing science and the focus of many debates and discussions concerning NO biochemistry. NO is involved in many physiological processes including regulation of blood pressure, immune response and neural communication. Therefore, its accurate detection and quantification is critical to understanding health and disease. Due to the extremely short physiological half-life of this gaseous free radical, alternative strategies for the detection of reaction products of NO biochemistry have been developed. The quantification of NO metabolites in biological samples provides valuable information with regard to in vivo NO production, bioavailability and metabolism. Simply sampling a single compartment such as blood or plasma may not always provide an accurate assessment of whole body NO status, particularly in tissues. Therefore, extrapolation of plasma or blood NO status to specific tissues of interest is no longer a valid approach. As a result, methods continue to be developed and validated which allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. The methods described in this review are not an exhaustive or comprehensive discussion of all methods available for the detection of NO but rather a description of the most commonly used and practical methods which allow accurate and sensitive quantification of NO products/metabolites in multiple biological matrices under normal physiological conditions. PMID:17664129

  15. An Efficient Approach to Evaluate Reporter Ion Behavior from MALDI-MS/MS Data for Quantification Studies using Isobaric Tags

    PubMed Central

    Cologna, Stephanie M.; Crutchfield, Christopher A.; Searle, Brian C.; Blank, Paul S.; Toth, Cynthia L.; Ely, Alexa M.; Picache, Jaqueline A.; Backlund, Peter S.; Wassif, Christopher A.; Porter, Forbes D.; Yergey, Alfred L.

    2017-01-01

    Protein quantification, identification and abundance determination are important aspects of proteome characterization and are crucial in understanding biological mechanisms and human diseases. Different strategies are available to quantify proteins using mass spectrometric detection, and most are performed at the peptide level and include both targeted and un-targeted methodologies. Discovery-based or un-targeted approaches oftentimes use covalent tagging strategies (i.e., iTRAQ®, TMT™) where reporter ion signals collected in the tandem MS experiment are used for quantification. Herein we investigate the behavior of the iTRAQ 8-plex chemistry using MALDI-TOF/TOF instrumentation. The experimental design and data analysis approach described is simple and straightforward, which allows researchers to optimize data collection and proper analysis within a laboratory. iTRAQ reporter ion signals were normalized within each spectrum to remove peptide biases. An advantage of this approach is that missing reporter ion values can be accepted for purposes of protein identification and quantification with the need for ANOVA analysis. We investigate the distribution of reporter ion peak areas in an equimolar system and a mock biological system and provide recommendations for establishing fold-change cutoff values at the peptide level for iTRAQ datasets. These data provide a unique dataset available to the community for informatics training and analysis. PMID:26288259
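
    The within-spectrum normalization described above can be stated in a few lines: each spectrum's reporter peak areas are divided by their sum before channels are compared. The sketch below uses two made-up spectra (one with a missing channel) purely to illustrate the arithmetic; it is not the authors' analysis pipeline.

```python
import numpy as np

# Within-spectrum normalization of 8-plex reporter ions: divide each spectrum's
# reporter areas by their sum to remove peptide-level biases, then compare
# channels via log ratios. A zero marks a missing reporter value. Made-up data.

reporter_areas = np.array([
    [1200.0, 1350.0, 0.0,   1100.0, 1250.0, 1400.0, 1180.0, 1320.0],  # spectrum 1
    [480.0,  520.0,  450.0, 500.0,  510.0,  530.0,  470.0,  490.0],   # spectrum 2
])

normalized = reporter_areas / reporter_areas.sum(axis=1, keepdims=True)

# Missing channels stay at zero and are simply excluded (NaN) from ratios.
with np.errstate(divide="ignore"):
    log2_vs_channel1 = np.where(normalized > 0,
                                np.log2(normalized / normalized[:, [0]]),
                                np.nan)
print(np.round(log2_vs_channel1, 2))
```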

  16. MRI-based methods for quantification of the cerebral metabolic rate of oxygen

    PubMed Central

    Rodgers, Zachary B; Detre, John A

    2016-01-01

    The brain depends almost entirely on oxidative metabolism to meet its significant energy requirements. As such, the cerebral metabolic rate of oxygen (CMRO2) represents a key measure of brain function. Quantification of CMRO2 has helped elucidate brain functional physiology and holds potential as a clinical tool for evaluating neurological disorders including stroke, brain tumors, Alzheimer’s disease, and obstructive sleep apnea. In recent years, a variety of magnetic resonance imaging (MRI)-based CMRO2 quantification methods have emerged. Unlike positron emission tomography – the current “gold standard” for measurement and mapping of CMRO2 – MRI is non-invasive, relatively inexpensive, and ubiquitously available in modern medical centers. All MRI-based CMRO2 methods are based on modeling the effect of paramagnetic deoxyhemoglobin on the magnetic resonance signal. The various methods can be classified in terms of the MRI contrast mechanism used to quantify CMRO2: T2*, T2′, T2, or magnetic susceptibility. This review article provides an overview of MRI-based CMRO2 quantification techniques. After a brief historical discussion motivating the need for improved CMRO2 methodology, current state-of-the-art MRI-based methods are critically appraised in terms of their respective tradeoffs between spatial resolution, temporal resolution, and robustness, all of critical importance given the spatially heterogeneous and temporally dynamic nature of brain energy requirements. PMID:27089912
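
    All of the MRI approaches mentioned ultimately feed into Fick-principle arithmetic: CMRO2 is oxygen delivery (arterial oxygen content times cerebral blood flow) multiplied by the oxygen extraction fraction. The sketch below shows that arithmetic with typical textbook values; the numbers are assumptions for illustration, not measurements, and dissolved plasma oxygen is neglected.

```python
# Fick-principle arithmetic underlying CMRO2 estimation. Values are typical
# textbook numbers used purely for illustration; dissolved O2 is neglected.

def arterial_o2_content(hb_g_dl, sao2, o2_per_g_hb=1.34):
    """Arterial O2 content in ml O2 per dl of blood."""
    return o2_per_g_hb * hb_g_dl * sao2

def cmro2(cbf_ml_100g_min, sao2, svo2, hb_g_dl):
    """CMRO2 in ml O2 / 100 g / min via the Fick principle."""
    cao2 = arterial_o2_content(hb_g_dl, sao2)        # ml O2 / dl blood
    oef = (sao2 - svo2) / sao2                       # oxygen extraction fraction
    return cbf_ml_100g_min * (cao2 / 100.0) * oef    # dl -> ml conversion

value = cmro2(cbf_ml_100g_min=50.0, sao2=0.98, svo2=0.60, hb_g_dl=15.0)
print(f"CMRO2 ~ {value:.1f} ml O2 / 100 g / min")
```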

  17. Digital Protocol for Chemical Analysis at Ultralow Concentrations by Surface-Enhanced Raman Scattering.

    PubMed

    de Albuquerque, Carlos Diego L; Sobral-Filho, Regivaldo G; Poppi, Ronei J; Brolo, Alexandre G

    2018-01-16

    Single molecule surface-enhanced Raman spectroscopy (SM-SERS) has the potential to revolutionize quantitative analysis at ultralow concentrations (less than 1 nM). However, there are no established protocols to generalize the application of this technique in analytical chemistry. Here, a protocol for quantification at ultralow concentrations using SM-SERS is proposed. The approach aims to take advantage of the stochastic nature of the single-molecule regime to achieve lower limits of quantification (LOQ). Two emerging contaminants commonly found in aquatic environments, enrofloxacin (ENRO) and ciprofloxacin (CIPRO), were chosen as nonresonant molecular probes. The methodology involves a multivariate curve resolution fitting known as non-negative matrix factorization with an alternating least-squares algorithm (NMF-ALS) to resolve spectral overlaps. The key element of the quantification is to realize that, under SM-SERS conditions, the Raman intensity generated by a molecule adsorbed on a "hotspot" can be digitized. Therefore, the number of SERS event counts (rather than SERS intensities) was shown to be proportional to the solution concentration. This allowed the determination of both ENRO and CIPRO with high accuracy and precision even in the ultralow concentration regime. The LOQ for both ENRO and CIPRO was 2.8 pM. The digital SERS protocol, suggested here, is a roadmap for the implementation of SM-SERS as a routine tool for quantification at ultralow concentrations.
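
    The "digital" idea is that each acquisition is scored as an event or non-event and the event count, not the summed intensity, is calibrated against concentration. The sketch below simulates that counting step with entirely synthetic spectra and an assumed threshold; it does not reproduce the NMF-ALS resolution step or the authors' data.

```python
import numpy as np

# Digital SM-SERS counting sketch: score each acquisition as an event (1) or
# non-event (0) by thresholding its peak intensity, then regress the event
# count against concentration. All spectra and parameters are synthetic.

rng = np.random.default_rng(7)

def count_events(peak_intensities, threshold):
    """Number of acquisitions whose band intensity exceeds the threshold."""
    return int(np.sum(peak_intensities > threshold))

concentrations_pM = np.array([5, 10, 20, 50, 100])
counts = []
for c in concentrations_pM:
    p_event = min(1.0, c / 120.0)                 # toy hotspot-occupation probability
    intensities = np.where(rng.random(1000) < p_event,
                           rng.normal(80.0, 15.0, 1000),   # event spectra
                           rng.normal(5.0, 3.0, 1000))     # blank spectra
    counts.append(count_events(intensities, threshold=25.0))

slope, intercept = np.polyfit(concentrations_pM, counts, 1)
print("event counts per 1000 acquisitions:", counts)
print(f"calibration: counts = {slope:.1f} * c(pM) + {intercept:.1f}")
```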

  18. Installation Restoration Program. Phase 2. Confirmation/Quantification. Stage 1. Volume 2.

    DTIC Science & Technology

    1986-10-01

    [Abstract text garbled in the source record; the recoverable fragments concern health-effects evaluation based on no-observed-adverse-effect levels for non-carcinogenic endpoints, plus the following glossary entries.] HARDFILL: disposal sites receiving construction debris, wood, and miscellaneous spoil material. HARM: Hazard Assessment Rating Methodology.

  19. Short RNA indicator sequences are not completely degraded by autoclaving

    PubMed Central

    Unnithan, Veena V.; Unc, Adrian; Joe, Valerisa; Smith, Geoffrey B.

    2014-01-01

    Short indicator RNA sequences (<100 bp) persist after autoclaving and are recovered intact by molecular amplification. Primers targeting longer sequences are most likely to produce false positives due to amplification errors easily verified by melting curves analyses. If short indicator RNA sequences are used for virus identification and quantification then post autoclave RNA degradation methodology should be employed, which may include further autoclaving. PMID:24518856

  20. Detection, Localization and Quantification of Impact Events on a Stiffened Composite Panel with Embedded Fiber Bragg Grating Sensor Networks

    PubMed Central

    Lamberti, Alfredo; Luyckx, Geert; Van Paepegem, Wim; Rezayat, Ali; Vanlanduit, Steve

    2017-01-01

    Nowadays, it is possible to manufacture smart composite materials with embedded fiber optic sensors. These sensors can be exploited during the composites’ operating life to identify occurring damages such as delaminations. For composite materials adopted in the aviation and wind energy sector, delaminations are most often caused by impacts with external objects. The detection, localization and quantification of such impacts are therefore crucial for the prevention of catastrophic events. In this paper, we demonstrate the feasibility to perform impact identification in smart composite structures with embedded fiber optic sensors. For our analyses, we manufactured a carbon fiber reinforced plate in which we embedded a distributed network of fiber Bragg grating (FBG) sensors. We impacted the plate with a modal hammer and we identified the impacts by processing the FBG data with an improved fast phase correlation (FPC) algorithm in combination with a variable selective least squares (VS-LS) inverse solver approach. A total of 164 impacts distributed on 41 possible impact locations were analyzed. We compared our methodology with the traditional P-Inv based approach. In terms of impact localization, our methodology performed better in 70.7% of the cases. An improvement on the impact time domain reconstruction was achieved in 95.1% of the cases. PMID:28368319

  2. 28 CFR 104.47 - Collateral sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... determining the appropriate collateral source offset for future benefit payments, the Special Master may employ an appropriate methodology for determining the present value of such future benefits. In... compensation, including life insurance, pension funds, death benefits programs, and payments by Federal, State...

  3. A comprehensive NMR methodology to assess the composition of biobased and biodegradable polymers in contact with food.

    PubMed

    Gratia, Audrey; Merlet, Denis; Ducruet, Violette; Lyathaud, Cédric

    2015-01-01

    A nuclear magnetic resonance (NMR) methodology was assessed regarding the identification and quantification of additives in three types of polylactide (PLA) intended as food contact materials. Additives were identified using the LNE/NMR database which clusters NMR datasets on more than 130 substances authorized by European Regulation No. 10/2011. Of the 12 additives spiked in the three types of PLA pellets, 10 were rapidly identified by the database and correlated with spectral comparison. The levels of the 12 additives were estimated using quantitative NMR combined with graphical computation. A comparison with chromatographic methods tended to prove the sensitivity of NMR by demonstrating an analytical difference of less than 15%. Our results therefore demonstrated the efficiency of the proposed NMR methodology for rapid assessment of the composition of PLA. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Methodology for assessment of low level laser therapy (LLLT) irradiation parameters in muscle inflammation treatment

    NASA Astrophysics Data System (ADS)

    Mantineo, M.; Pinheiro, J. P.; Morgado, A. M.

    2013-11-01

    Several studies in human and animals show the clinical effectiveness of low level laser therapy (LLLT) in reducing some types of pain, treating inflammation and wound healing. However, more scientific evidence is required to prove the effectiveness of LLLT since many aspects of the cellular and molecular mechanisms triggered by irradiation of injured tissue with laser remain unknown. Here, we present a methodology that can be used to evaluate the effect of different LLLT irradiation parameters on the treatment of muscle inflammation on animals, through the quantification of four cytokines (TNF-α, IL-1β, IL-2 and IL-6) in systemic blood and histological analysis of muscle tissue. We have used this methodology to assess the effect of LLLT parameters (wavelength, dose, power and type of illumination) in the treatment of inflammation induced in the gastrocnemius muscle of Wistar rats. Results obtained for laser dose evaluation with continuous illumination are presented.

  5. A healthcare Lean Six Sigma System for postanesthesia care unit workflow improvement.

    PubMed

    Kuo, Alex Mu-Hsing; Borycki, Elizabeth; Kushniruk, Andre; Lee, Te-Shu

    2011-01-01

    The aim of this article is to propose a new model called Healthcare Lean Six Sigma System that integrates Lean and Six Sigma methodologies to improve workflow in a postanesthesia care unit. The methodology of the proposed model is fully described. A postanesthesia care unit case study is also used to demonstrate the benefits of using the Healthcare Lean Six Sigma System model by combining Lean and Six Sigma methodologies. The new model bridges the service gaps between health care providers and patients, balances the requirements of health care managers, and delivers health care services to patients by drawing on the speed of Lean and the high-quality principles of Six Sigma. The full benefits of the new model will be realized when it is applied at both strategic and operational levels. For further research, we will examine how the proposed model is used in different real-world case studies.

  6. Forecasting the Economic Impact of Future Space Station Operations

    NASA Technical Reports Server (NTRS)

    Summer, R. A.; Smolensky, S. M.; Muir, A. H.

    1967-01-01

    Recent manned and unmanned Earth-orbital operations have suggested great promise of improved knowledge and of substantial economic and associated benefits to be derived from services offered by a space station. Proposed application areas include agriculture, forestry, hydrology, public health, oceanography, natural disaster warning, and search/rescue operations. The need for reliable estimates of economic and related Earth-oriented benefits to be realized from Earth-orbital operations is discussed, and recent work in this area is reviewed. Emphasis is given to those services based on remote sensing. Requirements for a uniform, comprehensive and flexible methodology are discussed. A brief review of the suggested methodology is presented. This methodology will be exercised through five case studies which were chosen from a gross inventory of almost 400 user candidates. The relationship of case study results to benefits in broader application areas is discussed. Some management implications of possible future program implementation are included.

  7. Practical guide: Tools and methodologies for an oil and gas industry emission inventory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, C.C.; Killian, T.L.

    1996-12-31

    During the preparation of Title V Permit applications, the quantification and speciation of emission sources from oil and gas facilities were reevaluated to determine the "potential-to-emit." The existing emissions were primarily based on EPA emission factors such as AP-42, for tanks, combustion sources, and fugitive emissions from component leaks. Emissions from insignificant activities and routine operations that are associated with maintenance, startups and shutdowns, and releases to control devices also required quantification. To reconcile EPA emission factors with test data, process knowledge, and manufacturer's data, a careful review of other estimation options was performed. This paper presents the results of this analysis of emission sources at oil and gas facilities, including exploration and production, compressor stations and gas plants.

  8. Assessment of indirect losses and costs of emergency for project planning of alpine hazard mitigation

    NASA Astrophysics Data System (ADS)

    Amenda, Lisa; Pfurtscheller, Clemens

    2013-04-01

    Owing to increased settlement in hazardous areas and increased asset values, natural disasters such as floods, landslides and rockfalls cause high economic losses in Alpine lateral valleys. Especially in small municipalities, indirect losses, mainly stemming from a breakdown of transport networks, and costs of emergency can reach critical levels. A quantification of these losses is necessary to estimate the worthiness of mitigation measures, to determine the appropriate level of disaster assistance and to improve risk management strategies. There are comprehensive approaches available for assessing direct losses. However, indirect losses and costs of emergency are largely not assessed, and the empirical basis for estimating these costs is weak. To address the resulting uncertainties of project appraisals, a standardized methodology has been developed dealing with issues of local economic effects and the emergency efforts needed. In our approach, the cost-benefit analysis for technical mitigation of the Austrian Torrent and Avalanche Control (TAC) will be optimized and extended, using as a design event the 2005 debris flow that struck a small town in the upper Inn valley in southwest Tyrol (Austria). In that event, 84 buildings were affected and 430 people were evacuated; in response, the TAC implemented protection measures costing 3.75 million Euros. Upgrading the TAC method and analyzing to what extent the cost-benefit ratio changes are among the main objectives of this study. For estimating short-run indirect effects and costs of emergency on the local level, data were collected via questionnaires, field mapping, guided interviews and intensive literature research. On this basis, up-to-date calculation methods were developed and the cost-benefit analysis of the TAC was recalculated with the new results. The cost-benefit ratio will thus be more precise and specific, and so will the decision on which mitigation alternative to implement. Based on this, the worthiness of the mitigation measures can be determined in more detail and the proper level of emergency assistance can be calculated more adequately. This study will create a better data basis for evaluating technical and non-technical mitigation measures, which is useful for government agencies, insurance companies and research.

  9. A Subject Reference: Benefit-Cost Analysis of Toxic Substances, Hazardous Materials and Solid Waste Control (1977)

    EPA Pesticide Factsheets

    Discussion of methodological issues for conducting benefit-cost analysis and provides guidance for selecting and applying the most appropriate and useful mechanisms in benefit-cost analysis of toxic substances, hazardous materials, and solid waste control

  10. Quantification of atherosclerotic plaque activity and vascular inflammation using [18-F] fluorodeoxyglucose positron emission tomography/computed tomography (FDG-PET/CT).

    PubMed

    Mehta, Nehal N; Torigian, Drew A; Gelfand, Joel M; Saboury, Babak; Alavi, Abass

    2012-05-02

    Conventional non-invasive imaging modalities of atherosclerosis such as coronary artery calcium (CAC) and carotid intimal medial thickness (C-IMT) provide information about the burden of disease. However, despite multiple validation studies of CAC and C-IMT, these modalities do not accurately assess plaque characteristics, and the composition and inflammatory state of the plaque determine its stability and, therefore, the risk of clinical events. [(18)F]-2-fluoro-2-deoxy-D-glucose (FDG) imaging using positron-emission tomography (PET)/computed tomography (CT) has been extensively studied in oncologic metabolism. Studies using animal models and immunohistochemistry in humans show that FDG-PET/CT is exquisitely sensitive for detecting macrophage activity, an important source of cellular inflammation in vessel walls. More recently, we and others have shown that FDG-PET/CT enables highly precise, novel measurements of the inflammatory activity of atherosclerotic plaques in large and medium-sized arteries. FDG-PET/CT studies have many advantages over other imaging modalities: 1) high contrast resolution; 2) quantification of plaque volume and metabolic activity allowing for multi-modal atherosclerotic plaque quantification; 3) dynamic, real-time, in vivo imaging; 4) minimal operator dependence. Finally, vascular inflammation detected by FDG-PET/CT has been shown to predict cardiovascular (CV) events independent of traditional risk factors and is also highly associated with overall burden of atherosclerosis. Plaque activity by FDG-PET/CT is modulated by known beneficial CV interventions such as short-term (12-week) statin therapy as well as longer-term therapeutic lifestyle changes (16 months). The current methodology for quantification of FDG uptake in atherosclerotic plaque involves measurement of the standardized uptake value (SUV) of an artery of interest and of the venous blood pool in order to calculate a target-to-background ratio (TBR), which is calculated by dividing the arterial SUV by the venous blood pool SUV. This method has been shown to represent a stable, reproducible phenotype over time, has a high sensitivity for detection of vascular inflammation, and also has high inter- and intra-reader reliability. Here we present our methodology for patient preparation, image acquisition, and quantification of atherosclerotic plaque activity and vascular inflammation using SUV, TBR, and a global parameter called the metabolic volumetric product (MVP). These approaches may be applied to assess vascular inflammation in various study samples of interest in a consistent fashion as we have shown in several prior publications.
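
    The TBR computation described in the last part of the abstract is simple enough to state directly: body-weight-normalized SUVs are computed for the artery and the venous blood pool, and their ratio is the TBR. The sketch below uses invented activity concentrations, dose and body weight purely to show the arithmetic.

```python
# Target-to-background ratio (TBR) arithmetic: arterial SUV divided by venous
# blood-pool SUV, with SUV = tissue activity concentration normalized to
# injected dose per body weight (1 g of tissue ~ 1 ml). Illustrative values.

def suv(activity_kbq_ml, injected_dose_mbq, body_weight_kg):
    """Body-weight-normalized standardized uptake value (unitless)."""
    dose_kbq = injected_dose_mbq * 1000.0
    return activity_kbq_ml / (dose_kbq / (body_weight_kg * 1000.0))

arterial_suv = suv(activity_kbq_ml=8.0, injected_dose_mbq=370, body_weight_kg=80)
blood_pool_suv = suv(activity_kbq_ml=4.0, injected_dose_mbq=370, body_weight_kg=80)

tbr = arterial_suv / blood_pool_suv
print(f"arterial SUV {arterial_suv:.2f}, blood-pool SUV {blood_pool_suv:.2f}, "
      f"TBR {tbr:.2f}")
```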

  11. Advection and dispersion heat transport mechanisms in the quantification of shallow geothermal resources and associated environmental impacts.

    PubMed

    Alcaraz, Mar; García-Gil, Alejandro; Vázquez-Suñé, Enric; Velasco, Violeta

    2016-02-01

    Borehole Heat Exchangers (BHEs) are increasingly being used to exploit shallow geothermal energy. This paper presents a new methodology to provide a response to the need for a regional quantification of the geothermal potential that can be extracted by BHEs and the associated environmental impacts. A set of analytical solutions facilitates accurate calculation of the heat exchange of BHEs with the ground and its environmental impacts. For the first time, advection and dispersion heat transport mechanisms and the temporal evolution from the start of operation of the BHE are taken into account in the regional estimation of shallow geothermal resources. This methodology is integrated in a GIS environment, which facilitates the management of input and output data at a regional scale. An example of the methodology's application is presented for Barcelona, in Spain. As a result of the application, it is possible to show the strengths and improvements of this methodology in the development of potential maps of low temperature geothermal energy as well as maps of environmental impacts. The minimum and maximum energy potential values for the study site are 50 and 1800 W/m(2) for a drilled depth of 100 m, increasing in proportion to the Darcy velocity. Regarding thermal impacts, the higher the groundwater velocity and the energy potential, the larger the thermal plume after 6 months of exploitation, whose length ranges from 10 to 27 m. A sensitivity analysis was carried out in the calculation of heat exchange rate and its impacts for different scenarios and for a wide range of Darcy velocities. The results of this analysis lead to the conclusion that accounting for dispersion effects and the temporal evolution of the exploitation prevents differences of up to a factor of 2.5 in the accuracy of the heat exchange rate and up to several orders of magnitude in the impacts generated. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Cognitive remediation in schizophrenia: A methodological appraisal of systematic reviews and meta-analyses.

    PubMed

    Bryce, Shayden; Sloan, Elise; Lee, Stuart; Ponsford, Jennie; Rossell, Susan

    2016-04-01

    Systematic reviews and meta-analyses are a primary source of evidence when evaluating the benefit(s) of cognitive remediation (CR) in schizophrenia. These studies are designed to rigorously synthesize scientific literature; however, they cannot be assumed to be of high methodological quality. The aims of this report were to: 1) review the use of systematic reviews and meta-analyses regarding CR in schizophrenia; 2) conduct a systematic methodological appraisal of published reports examining the benefits of this intervention on core outcome domains; and 3) compare the correspondence between methodological and reporting quality. Electronic databases were searched for relevant articles. Twenty-one reviews met inclusion criteria and were scored according to the AMSTAR checklist, a validated scale of methodological quality. Five meta-analyses were also scored according to the PRISMA statement to compare 'quality of conduct' with 'quality of reporting'. Most systematic reviews and meta-analyses shared strengths and fell within a 'medium' level of methodological quality. Nevertheless, there were consistent areas of potential weakness that were not addressed by most reviews. These included the lack of protocol registration, uncertainty regarding independent data extraction and consensus procedures, and the minimal assessment of publication bias. Moreover, quality of conduct may not necessarily parallel quality of reporting, suggesting that consideration of these methods independently may be important. Reviews concerning CR for schizophrenia are a valuable source of evidence. However, the methodological quality of these reports may require additional consideration. Enhancing quality of conduct is essential for enabling research literature to be interpreted with confidence. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    PubMed

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
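
    A typical QI analytic tool of the kind applied in such a comparison is the individuals/moving-range (XmR) control chart, where the control limits are the mean plus or minus 2.66 times the average moving range. The sketch below uses invented monthly length-of-stay values, not the trial's data, and is not necessarily the specific chart used by the authors.

```python
import numpy as np

# Individuals / moving-range (XmR) control chart sketch: centre line = mean of
# the measure, control limits = mean +/- 2.66 x average moving range.
# The monthly mean length-of-stay values below are synthetic.

monthly_mean_los_min = np.array([182, 175, 190, 178, 169, 151, 148, 143, 150, 146, 141])

moving_range = np.abs(np.diff(monthly_mean_los_min))
centre = monthly_mean_los_min.mean()
ucl = centre + 2.66 * moving_range.mean()
lcl = centre - 2.66 * moving_range.mean()

print(f"centre line {centre:.1f} min, UCL {ucl:.1f}, LCL {lcl:.1f}")
# A signal is commonly flagged when a point falls outside the limits or when a
# run of consecutive points sits on one side of the centre line.
print("months below the centre line:", np.flatnonzero(monthly_mean_los_min < centre) + 1)
```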

  14. A methodology for estimating health benefits of electricity generation using renewable technologies.

    PubMed

    Partridge, Ian; Gamkhar, Shama

    2012-02-01

    At Copenhagen, the developed countries agreed to provide up to $100 bn per year to finance climate change mitigation and adaptation by developing countries. Projects aimed at cutting greenhouse gas (GHG) emissions will need to be evaluated against dual criteria: from the viewpoint of the developed countries they must cut emissions of GHGs at reasonable cost, while host countries will assess their contribution to development, or simply their overall economic benefits. Co-benefits of some types of project will also be of interest to host countries: for example, some projects will contribute to reducing air pollution, thus improving the health of the local population. This paper uses a simple damage function methodology to quantify some of the health co-benefits of replacing coal-fired generation with wind or small hydro in China. We estimate the monetary value of these co-benefits and find that it is probably small compared to the added costs. We have not made a full cost-benefit analysis of renewable energy in China as some likely co-benefits are omitted from our calculations. Our results are subject to considerable uncertainty; however, after careful consideration of their likely accuracy and comparisons with other studies, we believe that they provide a good first-cut estimate of co-benefits and are sufficiently robust to stand as a guide for policy makers. In addition to these empirical results, a key contribution made by the paper is to demonstrate a simple and reasonably accurate methodology for health benefits estimation that applies the most recent academic research in the field to the solution of an increasingly important problem. Copyright © 2011 Elsevier Ltd. All rights reserved.
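
    A damage-function calculation of this kind chains an avoided change in ambient PM2.5 through a concentration-response function to avoided premature deaths, and then monetizes them. The sketch below shows that chain with entirely hypothetical population, mortality, coefficient and valuation numbers; it is not the paper's parameterization.

```python
import math

# Damage-function sketch: avoided PM2.5 exposure -> avoided premature deaths
# via a log-linear concentration-response function -> monetized with a value
# of statistical life (VSL). Every number below is hypothetical.

def avoided_deaths(delta_pm25_ug_m3, population, baseline_mortality, beta=0.006):
    """Deaths avoided per year for a reduction in annual-mean PM2.5 (ug/m3)."""
    attributable_fraction = 1.0 - math.exp(-beta * delta_pm25_ug_m3)
    return population * baseline_mortality * attributable_fraction

deaths = avoided_deaths(delta_pm25_ug_m3=1.5, population=5_000_000,
                        baseline_mortality=0.007)
vsl_usd = 1.0e6   # assumed value of a statistical life
print(f"avoided deaths/yr: {deaths:.0f}; monetized benefit: ${deaths * vsl_usd:,.0f}")
```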

  15. The characterisation and management of greenhouse gas emissions from fires in northern Australian savannas

    NASA Astrophysics Data System (ADS)

    Cook, G. D.; Liedloff, A. C.; Richards, A. E.; Meyer, M.

    2016-12-01

    Australia is the only OECD country with a significant area of tropical savannas within its borders. Approximately 220 000 km2 of these savannas burn every year, releasing 2 to 4 % of Australia's accountable greenhouse gas emissions. Reduction in uncertainty in the quantification of these emissions of methane and nitrous oxide has been fundamental to improving both the national GHG inventory and developing approaches to better manage land to reduce these emissions. Projects to reduce pyrogenic emissions have been adopted across 30% of Australia's high rainfall savannas. Recent work has focussed on quantifying the additional benefit of increased carbon stocks in fine fuel and coarse woody debris (CWD) resulting from improvements in fire management. An integrated set of equations has been developed to enable seamless quantification of emissions and sequestration in these frequently burnt savannas. These show that the increase in carbon stored in fine fuel and CWD amounts to about 3 times the emissions abatement from improvements in fire management that have been achieved in a project area of 28 000 km2. Future work is focussing on improving the understanding of spatial and temporal variation in fire behaviour across Australia's savanna biome, improvements in quantification of carbon dynamics of CWD and improved quantification of the effects of fire on carbon dynamics in soils of the savannas.
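
    Pyrogenic emission accounting of this kind typically follows the standard bookkeeping of area burnt times fuel load times burning efficiency times an emission factor. The sketch below applies that generic formula with assumed parameter values (fuel load, efficiency, emission factors and global warming potentials); it is not the integrated equation set developed in the work described.

```python
# Generic pyrogenic-emission bookkeeping: emissions = area burnt x fuel load
# x burning efficiency x emission factor. All parameter values are assumed
# for illustration only.

def fire_emissions_t(area_ha, fuel_load_t_ha, burning_efficiency, emission_factor_g_kg):
    """Emissions (tonnes of a given species) from a burnt area."""
    fuel_consumed_t = area_ha * fuel_load_t_ha * burning_efficiency
    return fuel_consumed_t * emission_factor_g_kg / 1000.0

area_burnt_ha = 100_000
ch4_t = fire_emissions_t(area_burnt_ha, fuel_load_t_ha=4.0,
                         burning_efficiency=0.8, emission_factor_g_kg=2.2)
n2o_t = fire_emissions_t(area_burnt_ha, fuel_load_t_ha=4.0,
                         burning_efficiency=0.8, emission_factor_g_kg=0.2)

co2e_t = ch4_t * 28 + n2o_t * 265   # assumed 100-year global warming potentials
print(f"CH4 {ch4_t:.0f} t, N2O {n2o_t:.0f} t, ~{co2e_t:,.0f} t CO2-e")
```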

  16. Ecosystem services in changing landscapes: An introduction

    Treesearch

    Louis Iverson; Cristian Echeverria; Laura Nahuelhual; Sandra Luque

    2014-01-01

    The concept of ecosystem services from landscapes is rapidly gaining momentum as a language to communicate values and benefits to scientists and lay alike. Landscape ecology has an enormous contribution to make to this field, and one could argue, uniquely so. Tools developed or adapted for landscape ecology are being increasingly used to assist with the quantification...

  17. Update on Controls for Isolation and Quantification Methodology of Extracellular Vesicles Derived from Adipose Tissue Mesenchymal Stem Cells

    PubMed Central

    Franquesa, Marcella; Hoogduijn, Martin J.; Ripoll, Elia; Luk, Franka; Salih, Mahdi; Betjes, Michiel G. H.; Torras, Juan; Baan, Carla C.; Grinyó, Josep M.; Merino, Ana Maria

    2014-01-01

    The research field on extracellular vesicles (EV) has rapidly expanded in recent years due to the therapeutic potential of EV. Adipose tissue human mesenchymal stem cells (ASC) may be a suitable source for therapeutic EV. A major limitation in the field is the lack of standardization of the challenging techniques to isolate and characterize EV. The aim of our study was to incorporate new controls for the detection and quantification of EV derived from ASC and to analyze the applicability and limitations of the available techniques. ASC were cultured in medium supplemented with 5% vesicle-free fetal bovine serum. The EV were isolated from conditioned medium by differential centrifugation with size filtration (0.2 μm). As a control, non-conditioned culture medium was used (control medium). To detect EV, electron microscopy, conventional flow cytometry, and western blot were used. The EV were quantified by total protein quantification, ExoELISA immunoassay, and Nanosight. Cytokines and growth factors in the EV samples were measured with a multiplex bead array kit. The EV were detected by electron microscopy. Total protein measurement was not useful to quantify EV, as the control medium showed protein contents similar to the EV samples. The ExoELISA kits presented technical problems, and it was not possible to quantify the concentration of exosomes in the samples. The use of Nanosight enabled quantification and size determination of the EV. It is, however, not possible to distinguish protein aggregates from EV with this method. The technologies for quantification and characterization of the EV need to be improved. In addition, we detected protein contaminants in the EV samples, which makes it difficult to determine the real effect of EV in experimental models. It will be crucial in the future to design and optimize novel methods for the purification and characterization of EV. PMID:25374572

  18. College Students' Reactions to Participating in Relational Trauma Research: A Mixed Methodological Study.

    PubMed

    Edwards, Katie M; Neal, Angela M; Dardis, Christina M; Kelley, Erika L; Gidycz, Christine A; Ellis, Gary

    2015-08-24

    Using a mixed methodology, the present study compared men's and women's perceived benefits of and emotional reactions to participating in research that inquired about child maltreatment and intimate partner violence (IPV) victimization and perpetration. Participants consisted of 703 college students (357 women, 346 men), ages 18 to 25, who reported on their childhood maltreatment, adolescent and adult IPV victimization and perpetration, and their reactions (perceived benefits and emotional effects) to participating. Participants' reactions to participating were assessed using quantitative scales, as well as open-ended written responses that were content coded by researchers. Women reported more personal benefits from research, whereas men and women reported similar levels of emotional reactions to research participation. Furthermore, greater frequencies of child maltreatment and IPV victimization were related to higher levels of emotional reactions. Common self-identified reasons for emotional reactions (e.g., not liking to think about abuse in general, personal victimization experiences) and benefits (e.g., reflection and awareness about oneself, learning about IPV) were also presented and analyzed. These data underscore the importance of future research that examines the behavioral impact of research participation utilizing longitudinal and in-depth qualitative methodologies. Findings also highlight the potential psychoeducational value of research on understanding the reasons underlying participants' benefits and emotional effects. © The Author(s) 2015.

  19. [Development and validation of an HPLC method for the quantification of vitamin A in human milk. Its application to a rural population in Argentina].

    PubMed

    López, Laura B; Baroni, Andrea V; Rodríguez, Viviana G; Greco, Carola B; de Costa, Sara Macías; de Ferrer, Patricia Ronayne; Rodríguez de Pece, Silvia

    2005-06-01

    A methodology for the quantification of vitamin A in human milk was developed and validated. Vitamin A levels were assessed in 223 samples corresponding to the 5th, 6th and 7th postpartum months, obtained in the province of Santiago del Estero, Argentina. The samples (500 microL) were saponified with potassium hydroxide/ethanol, extracted with hexane, evaporated to dryness and reconstituted with methanol. An RP-C18 column, a methanol/water mobile phase (91:9 v/v) and a fluorescence detector (excitation wavelength 330 nm, emission wavelength 470 nm) were used for the separation and quantification of vitamin A. The analytical parameters of linearity (r2: 0.9995), detection (0.010 microg/mL) and quantification (0.025 microg/mL) limits, precision of the method (within-day relative standard deviation, RSD = 9.0%, and between-day RSD = 8.9%) and accuracy (recovery = 83.8%) demonstrate that the developed method allows the quantification of vitamin A in an efficient way. The mean values +/- standard deviation (SD) obtained for the analyzed samples were 0.60 +/- 0.32; 0.65 +/- 0.33 and 0.61 +/- 0.26 microg/mL for the 5th, 6th and 7th postpartum months, respectively. There were no significant differences among the three months studied and the values found were similar to those in the literature. Considering the whole population under study, 19.3% showed vitamin A levels less than 0.40 microg/mL, which represents a risk to the children in this group since at least 0.50 microg/mL is necessary to meet infant daily needs.

  20. Validated analytical methodology for the simultaneous determination of a wide range of pesticides in human blood using GC-MS/MS and LC-ESI/MS/MS and its application in two poisoning cases.

    PubMed

    Luzardo, Octavio P; Almeida-González, Maira; Ruiz-Suárez, Norberto; Zumbado, Manuel; Henríquez-Hernández, Luis A; Meilán, María José; Camacho, María; Boada, Luis D

    2015-09-01

    Pesticides are frequently responsible for human poisoning, and information on the substance involved is often lacking. The great variety of pesticides that could be responsible for intoxication makes it necessary to develop powerful and versatile analytical methodologies that allow identification of the unknown toxic substance. Here we developed a methodology for the simultaneous identification and quantification of 109 highly toxic pesticides in human blood. The application of this analytical scheme would help to minimize the cost of this type of chemical identification while maximizing the chances of identifying the pesticide involved. In the methodology that we present here, we use a liquid-liquid extraction, followed by a single purification step, and quantitation of analytes by a combination of liquid and gas chromatography, both coupled to triple quadrupole mass spectrometry operated in multiple reaction monitoring mode. The methodology has been fully validated, and its applicability has been demonstrated in two recent cases involving one self-poisoning fatality and one non-fatal homicidal attempt. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  1. Robotic voltammetry with carbon nanotube-based sensors: a superb blend for convenient high-quality antimicrobial trace analysis.

    PubMed

    Theanponkrang, Somjai; Suginta, Wipa; Weingart, Helge; Winterhalter, Mathias; Schulte, Albert

    2015-01-01

    A new automated pharmacoanalytical technique for convenient quantification of redox-active antibiotics has been established by combining the benefits of a carbon nanotube (CNT) sensor modification with electrocatalytic activity for analyte detection with the merits of a robotic electrochemical device that is capable of sequential nonmanual sample measurements in 24-well microtiter plates. Norfloxacin (NFX) and ciprofloxacin (CFX), two standard fluoroquinolone antibiotics, were used in automated calibration measurements by differential pulse voltammetry (DPV), and linear ranges of 1-10 μM and 2-100 μM were achieved for NFX and CFX, respectively. The lowest detectable levels were estimated to be 0.3±0.1 μM (n=7) for NFX and 1.6±0.1 μM (n=7) for CFX. In standard solutions or tablet samples of known content, both analytes could be quantified with the robotic DPV microtiter plate assay, with recoveries within ±4% of 100%. Recoveries were equally good when NFX was evaluated in human serum samples spiked with NFX. The use of simple instrumentation, convenience in execution, and high effectiveness in analyte quantitation suggest that the combination of automated microtiter plate voltammetry and CNT-supported electrochemical drug detection is a promising methodology for antibiotic testing in pharmaceutical and clinical research and quality control laboratories.

  2. Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

    DOE PAGES

    Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...

    2014-01-01

    This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameters sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.

  3. Development and Evaluation of a Parallel Reaction Monitoring Strategy for Large-Scale Targeted Metabolomics Quantification.

    PubMed

    Zhou, Juntuo; Liu, Huiying; Liu, Yang; Liu, Jia; Zhao, Xuyang; Yin, Yuxin

    2016-04-19

    Recent advances in mass spectrometers, which have yielded higher resolution and faster scanning speeds, have expanded their application in metabolomics of diverse diseases. Using a quadrupole-Orbitrap LC-MS system, we developed an efficient large-scale quantitative method targeting 237 metabolites involved in various metabolic pathways using scheduled, parallel reaction monitoring (PRM). We assessed the dynamic range, linearity, reproducibility, and system suitability of the PRM assay by measuring concentration curves, biological samples, and clinical serum samples. The quantification performance of the PRM assay was compared with that of MS1-based quantification on the Q-Exactive, and with the MRM assay on a QTRAP 6500. The PRM assay monitoring 237 polar metabolites showed greater reproducibility and quantitative accuracy than MS1-based quantification and also showed greater flexibility in postacquisition assay refinement than the MRM assay in QTRAP 6500. We present a workflow for convenient PRM data processing using Skyline software, which is free of charge. In this study we have established a reliable PRM methodology on a quadrupole-Orbitrap platform for evaluation of large-scale targeted metabolomics, which provides a new choice for basic and clinical metabolomics studies.

  4. Stable isotope labelling methods in mass spectrometry-based quantitative proteomics.

    PubMed

    Chahrour, Osama; Cobice, Diego; Malone, John

    2015-09-10

    Mass-spectrometry based proteomics has evolved as a promising technology over the last decade and is undergoing a dramatic development in a number of different areas, such as mass spectrometric instrumentation, peptide identification algorithms and bioinformatic computational data analysis. The improved methodology allows quantitative measurement of relative or absolute protein amounts, which is essential for gaining insights into their functions and dynamics in biological systems. Several different strategies involving stable isotope labels (ICAT, ICPL, IDBEST, iTRAQ, TMT, IPTL, SILAC), label-free statistical assessment approaches (MRM, SWATH) and absolute quantification methods (AQUA) are possible, each having specific strengths and weaknesses. Inductively coupled plasma mass spectrometry (ICP-MS), which is still widely recognised as an elemental detector, has recently emerged as a complementary technique to the previous methods. The new application area for ICP-MS targets the fast-growing field of proteomics-related research, allowing absolute protein quantification using suitable element-based tags. This document describes the different stable isotope labelling methods which incorporate metabolic labelling in live cells, ICP-MS based detection and post-harvest chemical label tagging for protein quantification, in addition to summarising their pros and cons. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. A simple dilute and shoot methodology for the identification and quantification of illegal insulin.

    PubMed

    Vanhee, Celine; Janvier, Steven; Moens, Goedele; Deconinck, Eric; Courselle, Patricia

    2016-10-01

    The occurrence of illegal medicines is a well-established global problem and concerns mostly small molecules. However, due to advances in genomics and recombinant expression technologies, there is increased development of polypeptide therapeutics. Insulin is one of the best-known polypeptide drugs, and illegal versions of this medicine have led to lethal incidents in the past. Therefore, it is crucial for the public health sector to develop reliable, efficient, cheap, unbiased and easily applicable active pharmaceutical ingredient (API) identification and quantification strategies for routine analysis of suspected illegal insulins. Here we demonstrate that our combined label-free full scan approach is not only able to distinguish between the different versions of insulin and insulins originating from different species, but also able to chromatographically separate human insulin and insulin lispro in conditions that are compatible with mass spectrometry (MS). Additionally, we were also able to selectively quantify the different insulins, including human insulin and insulin lispro, according to the validation criteria put forward by the United Nations (UN) for the analysis of seized illicit drugs. The proposed identification and quantification method is currently being used in our official medicines control laboratory to analyze insulins retrieved from the illegal market.

  6. Modeling of structural uncertainties in Reynolds-averaged Navier-Stokes closures

    NASA Astrophysics Data System (ADS)

    Emory, Michael; Larsson, Johan; Iaccarino, Gianluca

    2013-11-01

    Estimation of the uncertainty in numerical predictions by Reynolds-averaged Navier-Stokes closures is a vital step in building confidence in such predictions. An approach to model-form uncertainty quantification that does not assume the eddy-viscosity hypothesis to be exact is proposed. The methodology for estimation of uncertainty is demonstrated for plane channel flow, for a duct with secondary flows, and for the shock/boundary-layer interaction over a transonic bump.

  7. Modern Instrumental Methods in Forensic Toxicology*

    PubMed Central

    Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.

    2009-01-01

    This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968

  8. Environmental impacts and production performances of organic agriculture in China: A monetary valuation.

    PubMed

    Meng, Fanqiao; Qiao, Yuhui; Wu, Wenliang; Smith, Pete; Scott, Steffanie

    2017-03-01

    Organic agriculture has developed rapidly in China since the 1990s, driven by the increasing domestic and international demand for organic products. Quantification of the environmental benefits and production performances of organic agriculture on a national scale helps to develop sustainable, high-yielding agricultural production systems with minimum impacts on the environment. Data on organic production for 2013 were obtained from a national survey organized by the Certification and Accreditation Administration of China. Farming performance and environmental impact indicators were screened and indicator values were defined based on an intensive literature review and were validated by national statistics. The economic (monetary) values of farming inputs, crop production and individual environmental benefits were then quantified and integrated to compare the overall performances of organic vs. conventional agriculture. In 2013, organically managed farmland accounted for approximately 0.97% of national arable land, covering 1.158 million ha. If organic crop yields were assumed to be 10%-15% lower than conventional yields, the environmental benefits of organic agriculture (i.e., a decrease in nitrate leaching, an increase in farmland biodiversity, an increase in carbon sequestration and a decrease in greenhouse gas emissions) were valued at 1921 million RMB (320.2 million USD), or 1659 RMB (276.5 USD) per ha. By reducing farming inputs, the costs saved were 3110 million RMB (518.3 million USD), or 2686 RMB (447.7 USD) per ha. The economic loss associated with the decrease in crop yields from organic agriculture was valued at 6115 million RMB (1019.2 million USD), or 5280 RMB (880 USD) per ha. Although they were likely underestimated because of the complex relationships among farming operations, ecosystems and humans, the production costs saved and environmental benefits of organic agriculture that were quantified in our study compensated substantially for the economic losses associated with the decrease in crop production. This suggests that payment for the environmental benefits of organic agriculture should be incorporated into public policies. Most of the environmental impacts of organic farming were related to N fluxes within agroecosystems, which is a call for the better management of N fertilizer in regions or countries with low levels of N-use efficiency. Issues such as higher external inputs and the lack of integration of cropping with animal husbandry should be addressed when quantifying the change from conventional to organic agriculture, and quantifying this change remains challenging. Copyright © 2016 Elsevier Ltd. All rights reserved.
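
    The per-hectare figures quoted above follow directly from dividing the national totals by the certified organic area of 1.158 million ha (our arithmetic on the stated values, not an additional result from the paper):

        \frac{1921}{1.158} \approx 1659, \qquad \frac{3110}{1.158} \approx 2686, \qquad \frac{6115}{1.158} \approx 5280 \;\; \mathrm{RMB\ ha^{-1}}

    so the quantified environmental benefits plus input savings (1659 + 2686 = 4345 RMB per ha) offset roughly 80% of the per-hectare economic loss from lower yields, which is the sense in which they "compensated substantially" without fully closing the gap.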

  9. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.

  10. Spanish methodological approach for biosphere assessment of radioactive waste disposal.

    PubMed

    Agüero, A; Pinedo, P; Cancio, D; Simón, I; Moraleda, M; Pérez-Sánchez, D; Trueba, C

    2007-10-01

    The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS "Reference Biospheres Methodology" and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates.

  11. A new methodology capable of characterizing most volatile and less volatile minor edible oils components in a single chromatographic run without solvents or reagents. Detection of new components.

    PubMed

    Alberdi-Cedeño, Jon; Ibargoitia, María L; Cristillo, Giovanna; Sopelana, Patricia; Guillén, María D

    2017-04-15

    The possibilities offered by a new methodology to determine minor components in edible oils are described. This is based on immersion of a solid-phase microextraction fiber of PDMS/DVB into the oil matrix, followed by Gas Chromatography/Mass Spectrometry. It enables characterization and differentiation of edible oils in a simple way, without either solvents or sample modification. This methodology allows simultaneous identification and quantification of sterols, tocols, hydrocarbons of different natures, fatty acids, esters, monoglycerides, fatty amides, aldehydes, ketones, alcohols, epoxides, furans, pyrans and terpenic oxygenated derivatives. The broad information provided by this methodology is useful for different areas of interest such as nutritional value, oxidative stability, technological performance, quality, processing, safety and even the prevention of fraudulent practices. Furthermore, for the first time, certain fatty amides, gamma- and delta-lactones of high molecular weight, and other aromatic compounds such as some esters derived from cinnamic acid have been detected in edible oils. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
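
    In the standard notation of probabilistic seismic risk assessment (a generic textbook form, written in our notation rather than necessarily the authors' exact formulation), the risk integral referred to above couples the ground-motion hazard curve with the structural fragility, and in the post-mainshock setting both ingredients become time and damage dependent:

        \lambda_F(t) = \int P\left(F \mid IM = im,\; D_{ms}\right)\, \left| \frac{\mathrm{d}\lambda_{IM}(im;\, t)}{\mathrm{d}\, im} \right| \, \mathrm{d}\, im

    where \lambda_{IM}(im; t) is the aftershock-adjusted hazard curve, decaying with the time t elapsed since the mainshock, and the fragility P(F | IM, D_{ms}) is conditioned on the (possibly uncertain) damage state D_{ms} inferred after the mainshock.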

  13. Dried Blood Spot Methodology in Combination With Liquid Chromatography/Tandem Mass Spectrometry Facilitates the Monitoring of Teriflunomide

    PubMed Central

    Lunven, Catherine; Turpault, Sandrine; Beyer, Yann-Joel; O'Brien, Amy; Delfolie, Astrid; Boyanova, Neli; Sanderink, Ger-Jan; Baldinetti, Francesca

    2016-01-01

    Background: Teriflunomide, a once-daily oral immunomodulator approved for treatment of relapsing-remitting multiple sclerosis, is eliminated slowly from plasma. If necessary to rapidly lower plasma concentrations of teriflunomide, an accelerated elimination procedure using cholestyramine or activated charcoal may be used. The current bioanalytical assay for determination of plasma teriflunomide concentration requires laboratory facilities for blood centrifugation and plasma storage. An alternative method, with potential for greater convenience, is dried blood spot (DBS) methodology. Analytical and clinical validations are required to switch from plasma to DBS (finger-prick sampling) methodology. Methods: Using blood samples from healthy subjects, an LC-MS/MS assay method for quantification of teriflunomide in DBS over a range of 0.01–10 mcg/mL was developed and validated for specificity, selectivity, accuracy, precision, reproducibility, and stability. Results were compared with those from the current plasma assay for determination of plasma teriflunomide concentration. Results: Method was specific and selective relative to endogenous compounds, with process efficiency ∼88%, and no matrix effect. Inaccuracy and imprecision for intraday and interday analyses were <15% at all concentrations tested. Quantification of teriflunomide in DBS assay was not affected by blood deposit volume and punch position within spot, and hematocrit level had a limited but acceptable effect on measurement accuracy. Teriflunomide was stable for at least 4 months at room temperature, and for at least 24 hours at 37°C with and without 95% relative humidity, to cover sampling, drying, and shipment conditions in the field. The correlation between DBS and plasma concentrations (R2 = 0.97), with an average blood to plasma ratio of 0.59, was concentration independent and constant over time. Conclusions: DBS sampling is a simple and practical method for monitoring teriflunomide concentrations. PMID:27015245

  14. Quantification of multiple simultaneously occurring nitrogen flows in the euphotic ocean

    NASA Astrophysics Data System (ADS)

    Xu, Min Nina; Wu, Yanhua; Zheng, Li Wei; Zheng, Zhenzhen; Zhao, Huade; Laws, Edward A.; Kao, Shuh-Ji

    2017-03-01

    The general features of the N cycle in the sunlit region of the ocean are well known, but methodological difficulties have previously confounded simultaneous quantification of transformation rates among the many different forms of N, e.g., ammonium (NH4+), nitrite (NO2-), nitrate (NO3-), and particulate/dissolved organic nitrogen (PN/DON). However, recent advances in analytical methodology have made it possible to employ a convenient isotope labeling technique to quantify in situ fluxes among oft-measured nitrogen species within the euphotic zone. Addition of a single 15N-labeled NH4+ tracer and monitoring of the changes in the concentrations and isotopic compositions of the total dissolved nitrogen (TDN), PN, NH4+, NO2-, and NO3- pools allowed us to quantify the 15N and 14N fluxes simultaneously. The balance of 15N and 14N fluxes between the different N pools was expressed as a set of simultaneous equations, whose unique solution, obtained via matrix inversion, yielded the relevant N fluxes, including rates of NH4+, NO2-, and NO3- uptake; ammonia oxidation; nitrite oxidation; DON release; and NH4+ uptake by bacteria. The matrix inversion methodology that we used was designed specifically to analyze the results of incubations under simulated in situ conditions in the euphotic zone. By taking into consideration simultaneous fluxes among multiple N pools, we minimized potential artifacts caused by non-targeted processes in traditional source-product methods. The proposed isotope matrix method facilitates post hoc analysis of data from on-deck incubation experiments and can be used to probe effects of environmental factors (e.g., pH, temperature, and light) on multiple processes under controlled conditions.
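
    The matrix-inversion step can be illustrated with a minimal numerical sketch in Python/NumPy. The pool ordering, coefficients and rates below are invented placeholders, not the authors' data; the point is only that once the 15N/14N balance constraints are written as a linear system A·f = b in the unknown fluxes f, a single solve returns all fluxes simultaneously.

      import numpy as np

      # Hypothetical example with four unknown fluxes (nmol N L^-1 h^-1):
      #   f = [NH4+ uptake, NH4+ oxidation to NO2-, NO2- oxidation to NO3-, DON release]
      # Each row of A encodes one 15N mass-balance constraint built from the measured
      # atom fractions of the source pools; b holds the observed rates of change of the
      # 15N content of the PN, NH4+, NO2- and NO3- pools.  All numbers are invented.
      A = np.array([
          [ 0.82,  0.00,  0.00, 0.05],   # d(15N-PN)/dt
          [-0.82, -0.82,  0.00, 0.10],   # d(15N-NH4+)/dt
          [ 0.00,  0.82, -0.25, 0.00],   # d(15N-NO2-)/dt
          [ 0.00,  0.00,  0.25, 0.00],   # d(15N-NO3-)/dt
      ])
      b = np.array([4.14, -5.66, 1.265, 0.375])

      fluxes = np.linalg.solve(A, b)              # unique solution via matrix inversion
      # For an over-determined constraint set, use least squares instead:
      # fluxes, *_ = np.linalg.lstsq(A, b, rcond=None)
      for name, f in zip(["NH4+ uptake", "NH4+ oxidation", "NO2- oxidation", "DON release"], fluxes):
          print(f"{name}: {f:.2f} nmol N L^-1 h^-1")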

  15. Convenient, inexpensive quantification of elemental sulfur by simultaneous in situ reduction and colorimetric detection.

    PubMed

    Kwasniewski, Misha T; Allison, Rachel B; Wilcox, Wayne F; Sacks, Gavin L

    2011-10-03

    Rapid, inexpensive, and convenient methods for quantifying elemental sulfur (S(0)) with low or sub-μg g(-1) limits of detection would be useful for a range of applications where S(0) can act as a precursor for noxious off-aromas, e.g., S(0) in pesticide residues on winegrapes or as a contaminant in drywall. However, existing quantification methods rely on toxic reagents, expensive and cumbersome equipment, or demonstrate poor selectivity. We have developed and optimized an inexpensive, rapid method (∼15 min per sample) for quantifying S(0) in complex matrices. Following dispersion of the sample in PEG-400 and buffering, S(0) is quantitatively reduced to H(2)S in situ by dithiothreitol and simultaneously quantified by commercially available colorimetric H(2)S detection tubes. By employing multiple tubes, the method demonstrated linearity from 0.03 to 100 μg S(0) g(-1) for a 5 g sample (R(2)=0.994, mean CV=6.4%), and the methodological detection limit was 0.01 μg S(0) g(-1). Interferences from sulfite or sulfate were not observed. Mean recovery of an S(0)-containing sulfur fungicide in grape macerate was 84.7% with a mean CV of 10.4%. Mean recovery of S(0) in a colloidal sulfur preparation from a drywall matrix was 106.6% with a mean CV of 6.9%. Comparable methodological detection limits, sensitivity, and recoveries were achieved in grape juice, grape macerate and with 1 g drywall samples, indicating that the methodology should be robust across a range of complex matrices. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. Detection and quantification of microparticles from different cellular lineages using flow cytometry. Evaluation of the impact of secreted phospholipase A2 on microparticle assessment.

    PubMed

    Rousseau, Matthieu; Belleannee, Clemence; Duchez, Anne-Claire; Cloutier, Nathalie; Levesque, Tania; Jacques, Frederic; Perron, Jean; Nigrovic, Peter A; Dieude, Melanie; Hebert, Marie-Josee; Gelb, Michael H; Boilard, Eric

    2015-01-01

    Microparticles, also called microvesicles, are submicron extracellular vesicles produced by plasma membrane budding and shedding, recognized as key actors in numerous physio(patho)logical processes. Since they can be released by virtually any cell lineage and are retrieved in biological fluids, microparticles appear to be potent biomarkers. However, the small dimensions of microparticles and soluble factors present in body fluids can considerably impede their quantification. Here, flow cytometry with improved methodology for microparticle resolution was used to detect microparticles of human and mouse species generated from platelets, red blood cells, endothelial cells, apoptotic thymocytes and cells from the male reproductive tract. A family of soluble proteins, the secreted phospholipases A2 (sPLA2), comprises enzymes that are concomitantly expressed with microparticles in biological fluids and that catalyze the hydrolysis of membrane phospholipids. As sPLA2 can hydrolyze phosphatidylserine, a phospholipid frequently used to assess microparticles, and might even clear microparticles, we further considered the impact of relevant sPLA2 enzymes, sPLA2 groups IIA, V and X, on microparticle quantification. We observed that, if enriched in fluids, certain sPLA2 enzymes impair the quantification of microparticles depending on the species studied, the source of microparticles and the means of detection employed (surface phosphatidylserine or protein antigen detection). This study provides analytical considerations for appropriate interpretation of microparticle cytofluorometric measurements in biological samples containing sPLA2 enzymes.

  17. The Application of Cost-Benefit Analysis in Manpower Area.

    ERIC Educational Resources Information Center

    Barsby, Steven L.

    The relative efficiency of various manpower programs as seen through cost-benefit analysis is assessed, and the contribution that cost-benefit analysis has made in evaluating manpower programs is discussed, taking into account a variety of methodologies presented in different studies. Vocational rehabilitation appears to yield the highest…

  18. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

    This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion systems mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  19. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Crevillén-García, D.; Power, H.

    2017-08-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen-Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error.
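
    The multilevel idea can be sketched in a few lines of Python (a toy stand-in model, not the flow and transport solver used in the study): the expectation of the travel time on the finest level is written as the coarse-level expectation plus a telescoping sum of level-to-level corrections, each estimated with fewer samples as the levels become more expensive.

      import numpy as np

      rng = np.random.default_rng(0)

      def travel_time(k, level):
          """Toy stand-in for the transport solver at a given mesh level; k is a
          log-conductivity sample and the discretisation bias decays with level."""
          h = 2.0 ** (-level)
          return np.exp(-k) * (1.0 + 0.5 * h)

      def mlmc_estimate(samples_per_level):
          """E[Q_L] estimated as E[Q_0] + sum_l E[Q_l - Q_{l-1}]."""
          total = 0.0
          for level, n in enumerate(samples_per_level):
              k = rng.normal(0.0, 1.0, size=n)        # log-Gaussian conductivity samples
              fine = travel_time(k, level)
              if level == 0:
                  total += fine.mean()
              else:
                  coarse = travel_time(k, level - 1)  # same samples on both levels
                  total += (fine - coarse).mean()
          return total

      print("MLMC estimate of mean travel time:", mlmc_estimate([4000, 1000, 250]))
      print("Plain MC on the finest level     :", travel_time(rng.normal(size=250), 2).mean())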

  20. Statistical characterization of multiple-reaction monitoring mass spectrometry (MRM-MS) assays for quantitative proteomics

    PubMed Central

    2012-01-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) with stable isotope dilution (SID) is increasingly becoming a widely accepted assay for the quantification of proteins and peptides. These assays have shown great promise in relatively high throughput verification of candidate biomarkers. While the use of MRM-MS assays is well established in the small molecule realm, their introduction and use in proteomics is relatively recent. As such, statistical and computational methods for the analysis of MRM-MS data from proteins and peptides are still being developed. Based on our extensive experience with analyzing a wide range of SID-MRM-MS data, we set forth a methodology for analysis that encompasses significant aspects ranging from data quality assessment, assay characterization including calibration curves, limits of detection (LOD) and quantification (LOQ), and measurement of intra- and interlaboratory precision. We draw upon publicly available seminal datasets to illustrate our methods and algorithms. PMID:23176545

  1. Statistical characterization of multiple-reaction monitoring mass spectrometry (MRM-MS) assays for quantitative proteomics.

    PubMed

    Mani, D R; Abbatiello, Susan E; Carr, Steven A

    2012-01-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) with stable isotope dilution (SID) is increasingly becoming a widely accepted assay for the quantification of proteins and peptides. These assays have shown great promise in relatively high throughput verification of candidate biomarkers. While the use of MRM-MS assays is well established in the small molecule realm, their introduction and use in proteomics is relatively recent. As such, statistical and computational methods for the analysis of MRM-MS data from proteins and peptides are still being developed. Based on our extensive experience with analyzing a wide range of SID-MRM-MS data, we set forth a methodology for analysis that encompasses significant aspects ranging from data quality assessment, assay characterization including calibration curves, limits of detection (LOD) and quantification (LOQ), and measurement of intra- and interlaboratory precision. We draw upon publicly available seminal datasets to illustrate our methods and algorithms.
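
    The calibration-curve and LOD/LOQ characterisation described above can be sketched generically (invented numbers, not the authors' algorithms or datasets): fit a linear response of light-to-heavy peak-area ratio against spiked concentration, then convert the blank variability into concentration units through the fitted slope.

      import numpy as np

      # Invented calibration points: spiked concentration (fmol/uL) vs light/heavy
      # peak-area ratio from a stable-isotope-dilution MRM-MS run.
      conc  = np.array([0.0, 0.5, 1.0, 2.5, 5.0, 10.0, 25.0, 50.0])
      ratio = np.array([0.002, 0.051, 0.098, 0.26, 0.49, 1.02, 2.45, 5.10])

      slope, intercept = np.polyfit(conc, ratio, 1)          # linear calibration fit
      pred = slope * conc + intercept
      r2 = 1 - np.sum((ratio - pred) ** 2) / np.sum((ratio - ratio.mean()) ** 2)

      blank_sd = 0.004                     # SD of replicate blank injections (invented)
      lod = 3 * blank_sd / slope           # limit of detection, concentration units
      loq = 10 * blank_sd / slope          # limit of quantification

      replicates = np.array([0.95, 1.02, 0.99, 1.05, 0.97])        # invented repeat measurements
      cv = 100 * np.std(replicates, ddof=1) / np.mean(replicates)  # intra-lab precision, %CV

      print(f"slope={slope:.3f}, intercept={intercept:.4f}, R^2={r2:.4f}")
      print(f"LOD={lod:.3f} fmol/uL, LOQ={loq:.3f} fmol/uL, precision CV={cv:.1f}%")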

  2. An Uncertainty Quantification Framework for Remote Sensing Retrievals

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.; Hobbs, J.

    2017-12-01

    Remote sensing data sets produced by NASA and other space agencies are the result of complex algorithms that infer geophysical state from observed radiances using retrieval algorithms. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.

  3. Variation compensation and analysis on diaphragm curvature analysis for emphysema quantification on whole lung CT scans

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Barr, R. Graham; Yankelevitz, David F.; Henschke, Claudia I.

    2010-03-01

    CT scans allow for the quantitative evaluation of the anatomical bases of emphysema. Recently, a non-density based geometric measurement of lung diaphragm curvature has been proposed as a method for the quantification of emphysema from CT. This work analyzes variability of diaphragm curvature and evaluates the effectiveness of a compensation methodology for the reduction of this variability as compared to emphysema index. Using a dataset of 43 scan-pairs with less than a 100 day time-interval between scans, we find that the diaphragm curvature had a trend towards lower overall variability than emphysema index (95% CI: -9.7 to +14.7 vs. -15.8 to +12.0), and that the variation of both measures was reduced after compensation. We conclude that the variation of the new measure can be considered comparable to that of the established measure, and that the compensation can reduce the apparent variation of quantitative measures successfully.

  4. Remote sensing-aided systems for snow qualification, evapotranspiration estimation, and their application in hydrologic models

    NASA Technical Reports Server (NTRS)

    Korram, S.

    1977-01-01

    The design of general remote sensing-aided methodologies was studied to provide estimates of several important inputs to water yield forecast models. These input parameters are snow area extent, snow water content, and evapotranspiration. The study area is the Feather River Watershed (780,000 hectares), Northern California. The general approach involved a stepwise sequence of identification of the required information, sample design, measurement/estimation, and evaluation of results. All the relevant and available information types needed in the estimation process were defined. These include Landsat, meteorological satellite, and aircraft imagery, topographic and geologic data, ground truth data, and climatic data from ground stations. A cost-effective multistage sampling approach was employed in the quantification of all the required parameters. The physical and statistical models for both snow quantification and evapotranspiration estimation were developed. These models use the information obtained from aerial and ground data through appropriate statistical sampling design.

  5. Analysis of linear and cyclic oligomers in polyamide-6 without sample preparation by liquid chromatography using the sandwich injection method. II. Methods of detection and quantification and overall long-term performance.

    PubMed

    Mengerink, Y; Peters, R; Kerkhoff, M; Hellenbrand, J; Omloo, H; Andrien, J; Vestjens, M; van der Wal, S

    2000-05-05

    By separating the first six linear and cyclic oligomers of polyamide-6 on a reversed-phase high-performance liquid chromatographic system after sandwich injection, quantitative determination of these oligomers becomes feasible. Low-wavelength UV detection of the different oligomers and selective post-column reaction detection of the linear oligomers with o-phthalic dicarboxaldehyde (OPA) and 3-mercaptopropionic acid (3-MPA) are discussed. A general methodology for quantification of oligomers in polymers was developed. It is demonstrated that the empirically determined group-equivalent absorption coefficients and quench factors are a convenient way of quantifying linear and cyclic oligomers of nylon-6. The overall long-term performance of the method was studied by monitoring a reference sample and the calibration factors of the linear and cyclic oligomers.

  6. Automated quantification of pancreatic β-cell mass

    PubMed Central

    Golson, Maria L.; Bush, William S.

    2014-01-01

    β-Cell mass is a parameter commonly measured in studies of islet biology and diabetes. However, the rigorous quantification of pancreatic β-cell mass using conventional histological methods is a time-consuming process. Rapidly evolving virtual slide technology with high-resolution slide scanners and newly developed image analysis tools has the potential to transform β-cell mass measurement. To test the effectiveness and accuracy of this new approach, we assessed pancreata from normal C57Bl/6J mice and from mouse models of β-cell ablation (streptozotocin-treated mice) and β-cell hyperplasia (leptin-deficient mice), using a standardized systematic sampling of pancreatic specimens. Our data indicate that automated analysis of virtual pancreatic slides is highly reliable and yields results consistent with those obtained by conventional morphometric analysis. This new methodology will allow investigators to dramatically reduce the time required for β-cell mass measurement by automating high-resolution image capture and analysis of entire pancreatic sections. PMID:24760991

  7. On a PLIF quantification methodology in a nonlinear dye response regime

    NASA Astrophysics Data System (ADS)

    Baj, P.; Bruce, P. J. K.; Buxton, O. R. H.

    2016-06-01

    A new technique of planar laser-induced fluorescence calibration is presented in this work. It accounts for nonlinear dye response at high concentrations, attenuation of the illumination light and, in particular, the influence of secondary fluorescence. An analytical approximation of a generic solution of the Beer-Lambert law is provided and utilized for effective concentration evaluation. These features make the technique particularly well suited for high-concentration measurements, or those with a large range of concentration values, c, present (i.e. a high dynamic range of c). The method is applied to data gathered in a water flume experiment where a stream of a fluorescent dye (rhodamine 6G) was released into a grid-generated turbulent flow. Based on these results, it is shown that the illumination attenuation and the secondary fluorescence introduce significant errors into the data quantification (up to 15 and 80 %, respectively, for the case considered in this work) unless properly accounted for.
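
    A generic Beer-Lambert attenuation correction along a single laser ray illustrates the first of these effects (a sketch under simplifying assumptions: linear dye response, no secondary fluorescence, and invented constants; it is not the analytical approximation derived in the paper). The local illumination is the incident intensity reduced by the cumulative extinction of the dye upstream of each pixel, and the concentration is recovered pixel by pixel from the raw fluorescence divided by that reconstructed intensity.

      import numpy as np

      eps   = 0.12     # dye extinction coefficient (invented units)
      calib = 1.0      # fluorescence calibration constant, F = calib * I * c
      dx    = 0.001    # pixel size along the ray (m)
      I0    = 1.0      # incident laser intensity at the edge of the field of view

      c_true = np.linspace(0.0, 8.0, 200)                   # synthetic concentration profile
      upstream = np.concatenate(([0.0], np.cumsum(c_true)[:-1]))
      I = I0 * np.exp(-eps * upstream * dx)                 # Beer-Lambert attenuation of the sheet
      F = calib * I * c_true                                # "measured" fluorescence signal

      c_naive = F / (calib * I0)                            # quantification ignoring attenuation
      c_corr = np.zeros_like(F)                             # pixel-by-pixel corrected profile
      I_local = I0
      for i, f in enumerate(F):
          c_corr[i] = f / (calib * I_local)
          I_local *= np.exp(-eps * c_corr[i] * dx)          # attenuate before the next pixel

      print("max error, naive    :", np.max(np.abs(c_naive - c_true)))
      print("max error, corrected:", np.max(np.abs(c_corr - c_true)))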

  8. Uncertainty quantification in Eulerian-Lagrangian models for particle-laden flows

    NASA Astrophysics Data System (ADS)

    Fountoulakis, Vasileios; Jacobs, Gustaaf; Udaykumar, Hs

    2017-11-01

    A common approach to ameliorate the computational burden in simulations of particle-laden flows is to use a point-particle based Eulerian-Lagrangian model, which traces individual particles in their Lagrangian frame and models particles as mathematical points. The particle motion is determined by Stokes drag law, which is empirically corrected for Reynolds number, Mach number and other parameters. The empirical corrections are subject to uncertainty. Treating them as random variables renders the coupled system of PDEs and ODEs stochastic. An approach to quantify the propagation of this parametric uncertainty to the particle solution variables is proposed. The approach is based on averaging of the governing equations and allows for estimation of the first moments of the quantities of interest. We demonstrate the feasibility of our proposed methodology of uncertainty quantification of particle-laden flows on one-dimensional linear and nonlinear Eulerian-Lagrangian systems. This research is supported by AFOSR under Grant FA9550-16-1-0008.
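
    A brute-force Monte Carlo baseline for the parametric uncertainty described above can be sketched as follows (invented numbers and distribution; the paper's own approach is based on averaging the governing equations rather than sampling): the empirical drag correction is treated as a random variable, the point-particle ODE is integrated for each sample, and the first two moments of the particle velocity are estimated from the ensemble.

      import numpy as np

      rng = np.random.default_rng(1)

      def particle_velocity(f_corr, t_end=0.05, dt=1e-4, tau_p=0.01, u_gas=10.0):
          """Forward-Euler integration of dv/dt = f_corr * (u_gas - v) / tau_p,
          where f_corr is the uncertain empirical correction to Stokes drag."""
          v = 0.0
          for _ in range(int(t_end / dt)):
              v += dt * f_corr * (u_gas - v) / tau_p
          return v

      # Uncertain drag-correction factor (invented distribution).
      samples = rng.normal(loc=1.2, scale=0.15, size=2000)
      v_end = np.array([particle_velocity(f) for f in samples])

      print("mean particle velocity:", v_end.mean())
      print("std (parametric UQ)   :", v_end.std(ddof=1))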

  9. Quantitative bioanalysis of strontium in human serum by inductively coupled plasma-mass spectrometry

    PubMed Central

    Somarouthu, Srikanth; Ohh, Jayoung; Shaked, Jonathan; Cunico, Robert L; Yakatan, Gerald; Corritori, Suzana; Tami, Joe; Foehr, Erik D

    2015-01-01

    Aim: A bioanalytical method using inductively-coupled plasma-mass spectrometry to measure endogenous levels of strontium in human serum was developed and validated. Results & methodology: This article details the experimental procedures used for the method development and validation thus demonstrating the application of the inductively-coupled plasma-mass spectrometry method for quantification of strontium in human serum samples. The assay was validated for specificity, linearity, accuracy, precision, recovery and stability. Significant endogenous levels of strontium are present in human serum samples ranging from 19 to 96 ng/ml with a mean of 34.6 ± 15.2 ng/ml (SD). Discussion & conclusion: Calibration procedures and sample pretreatment were simplified for high throughput analysis. The validation demonstrates that the method was sensitive, selective for quantification of strontium (88Sr) and is suitable for routine clinical testing of strontium in human serum samples. PMID:28031925

  10. Skeletal Muscle Ultrasound in Critical Care: A Tool in Need of Translation.

    PubMed

    Mourtzakis, Marina; Parry, Selina; Connolly, Bronwen; Puthucheary, Zudin

    2017-10-01

    With the emerging interest in documenting and understanding muscle atrophy and function in critically ill patients and survivors, ultrasonography has transformational potential for measurement of muscle quantity and quality. We discuss the importance of quantifying skeletal muscle in the intensive care unit setting. We also identify the merits and limitations of various modalities that are capable of accurately and precisely measuring muscularity. Ultrasound is emerging as a potentially powerful tool for skeletal muscle quantification; however, there are key challenges that need to be addressed in future work to ensure useful interpretation and comparability of results across diverse observational and interventional studies. Ultrasound presents several methodological challenges, and ultimately muscle quantification combined with metabolic, nutritional, and functional markers will allow optimal patient assessment and prognosis. Moving forward, we recommend that publications include greater detail on landmarking, repeated measures, identification of muscle that was not assessable, and reproducible protocols to more effectively compare results across different studies.

  11. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media.

    PubMed

    Crevillén-García, D; Power, H

    2017-08-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen-Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error.

  12. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media

    PubMed Central

    Power, H.

    2017-01-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen–Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error. PMID:28878974

  13. Organophilic clays as a tracer to determine Erosion processes

    NASA Astrophysics Data System (ADS)

    Mentler, A.; Strauss, P.; Schomakers, J.; Hann, S.; Köllensberger, G.; Ottner, F.

    2009-04-01

    In recent years the use of new tracing techniques to measure soil erosion has gained attention. Besides long-established isotopic methods, the use of rare earth elements has been reported. We wanted to contribute to the effort of obtaining better methods for determining surface soil movement and tested a novel method using organophilic clays as a tracer for erosion-related studies. Tests to extract organophilic clays from soil have been performed successfully using an industrially produced organophilic bentonite (Tixogel TVZ, Süd-Chemie) treated with quaternary ammonium surfactants. A liquid extraction method with barium ions (Ba2+) and methanol was used to extract the n-alkyl ammonium compounds from the inter-crystal layers of the modified bentonite. To increase extraction efficiency, an ultrasound device was used (UW 2200 Bandelin, 10.000 cycles per second, vibration amplitude 54 µm, sonification time of one minute). This procedure led to a recovery rate of about 85% for the organophilic bentonite. This was clearly superior to alternative extraction methods such as acetonitrile in different mixing ratios. Quantification of the extracted surfactants was performed via high performance liquid chromatography - mass spectrometry (HPLC-MS, Agilent 1200 SL HPLC and 6220 time-of-flight MS). The mass spectra of this industrially produced organophilic clay mineral showed four different molecular masses (M+H+ of 304.30, 332.33, 360.36 and 388.39). The four substances could be separated by HPLC (20 x 2 mm Zorbax C18 reversed phase column, 0.5 mL/min isocratic flow with 90% acetonitrile and 0.1% formic acid in water, run time of 7 minutes). The linear working range of the method was 5 to 1000 µg/L, with a limit of quantification of 1 µg/L n-alkyl ammonium compound. All four compounds of the Tixogel were extracted with identical extraction efficiencies and are hence suitable for accurate quantification procedures. The next steps in developing the methodology are the application of the organophilic clays in an indoor rainfall simulation experiment at a small scale of 2 m². At present the methodology has been tested only for one particular soil. Future tests will be performed to see if the chosen methodology needs soil-specific treatment when applied to more soils of different textural composition.

  14. An automated construction of error models for uncertainty quantification and model calibration

    NASA Astrophysics Data System (ADS)

    Josset, L.; Lunati, I.

    2015-12-01

    To reduce the computational cost of stochastic predictions, it is common practice to rely on approximate flow solvers (or «proxy»), which provide an inexact, but computationally inexpensive response [1,2]. Error models can be constructed to correct the proxy response: based on a learning set of realizations for which both exact and proxy simulations are performed, a transformation is sought to map proxy into exact responses. Once the error model is constructed, a prediction of the exact response is obtained at the cost of a proxy simulation for any new realization. Despite its effectiveness [2,3], the methodology relies on several user-defined parameters, which impact the accuracy of the predictions. To achieve a fully automated construction, we propose a novel methodology based on an iterative scheme: we first initialize the error model with a small training set of realizations; then, at each iteration, we add a new realization both to improve the model and to evaluate its performance. More specifically, at each iteration we use the responses predicted by the updated model to identify the realizations that need to be considered to compute the quantity of interest. Another user-defined parameter is the number of dimensions of the response spaces between which the mapping is sought. To identify the space dimensions that optimally balance mapping accuracy and risk of overfitting, we follow a leave-one-out cross-validation procedure. Also, the definition of a stopping criterion is central to an automated construction. We use a stability measure based on bootstrap techniques to stop the iterative procedure when the iterative model has converged. The methodology is illustrated with two test cases in which an inverse problem has to be solved, and its performance is assessed. We show that an iterative scheme is crucial to increase the applicability of the approach. [1] Josset, L., and I. Lunati, Local and global error models for improving uncertainty quantification, Mathematical Geosciences, 2013 [2] Josset, L., D. Ginsbourger, and I. Lunati, Functional Error Modeling for uncertainty quantification in hydrogeology, Water Resources Research, 2015 [3] Josset, L., V. Demyanov, A.H. Elsheikh, and I. Lunati, Accelerating Monte Carlo Markov chains with proxy and error models, Computers & Geosciences, 2015 (In press)
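
    The iterative construction can be outlined schematically in Python (an invented proxy/exact model pair and a plain linear error model stand in for the actual solvers and mapping; the realization-selection and bootstrap stopping criteria of the paper are reduced to crude placeholders):

      import numpy as np

      rng = np.random.default_rng(2)

      def proxy(x):  return 2.0 * x + 0.5                    # cheap, inexact solver (invented)
      def exact(x):  return 2.3 * x + 0.2 + 0.05 * x ** 2    # expensive reference solver (invented)

      realizations = rng.uniform(0.0, 5.0, size=200)         # stochastic input realizations
      prox_all = proxy(realizations)
      learning = list(range(5))                              # small initial training set

      def loo_error(idx):
          """Leave-one-out cross-validation error of the proxy-to-exact error model."""
          errs = []
          for leave in idx:
              keep = [i for i in idx if i != leave]
              a, b = np.polyfit(proxy(realizations[keep]), exact(realizations[keep]), 1)
              errs.append(abs(a * prox_all[leave] + b - exact(realizations[leave])))
          return float(np.mean(errs))

      for _ in range(30):                                    # iterative enrichment of the learning set
          if loo_error(learning) < 1e-3:                     # crude stand-in for the bootstrap stopping rule
              break
          a, b = np.polyfit(proxy(realizations[learning]), exact(realizations[learning]), 1)
          corrected = a * prox_all + b                       # predicted exact response at proxy cost
          candidates = [i for i in range(len(realizations)) if i not in learning]
          # crude selection rule: add the realization with the largest predicted correction
          learning.append(max(candidates, key=lambda i: abs(corrected[i] - prox_all[i])))

      print("final learning-set size:", len(learning), "LOO error:", loo_error(learning))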

  15. Adaptation of Communicative Language Teaching Methodology to an English Textbook for English Language Learning of NIDA Students

    ERIC Educational Resources Information Center

    West, Andrew J.

    2016-01-01

    In this paper, the researcher focuses on assessing the language learning benefits for students of adapting the communicative language teaching (CLT) methodology to an English textbook, a methodology that, according to Richards (2006), Littlewood (2008) and others, is influential in shaping second language learning worldwide. This paper is intended…

  16. Real-time polymerase chain reaction-based approach for quantification of the pat gene in the T25 Zea mays event.

    PubMed

    Weighardt, Florian; Barbati, Cristina; Paoletti, Claudia; Querci, Maddalena; Kay, Simon; De Beuckeleer, Marc; Van den Eede, Guy

    2004-01-01

    In Europe, a growing interest in reliable techniques for the quantification of genetically modified component(s) of food matrixes is arising from the need to comply with the European legislative framework on novel food products. Real-time polymerase chain reaction (PCR) is currently the most powerful technique for the quantification of specific nucleic acid sequences. Several real-time PCR methodologies based on different molecular principles have been developed for this purpose. The most frequently used approach in the field of genetically modified organism (GMO) quantification in food or feed samples is based on the 5'-3'-exonuclease activity of Taq DNA polymerase on specific degradation probes (TaqMan principle). A novel approach was developed for the establishment of a TaqMan quantification system assessing GMO contents around the 1% threshold stipulated under European Union (EU) legislation for the labeling of food products. The Zea mays T25 elite event was chosen as a model for the development of the novel GMO quantification approach. The most innovative aspect of the system is represented by the use of sequences cloned in plasmids as reference standards. In the field of GMO quantification, plasmids are an easy-to-use, cheap, and reliable alternative to Certified Reference Materials (CRMs), which are only available for a few of the GMOs authorized in Europe, have a relatively high production cost, and require further processing to be suitable for analysis. Strengths and weaknesses of the use of novel plasmid-based standards are addressed in detail. In addition, the quantification system was designed to avoid the use of a reference gene (e.g., a single-copy, species-specific gene) as normalizer, i.e., to perform a GMO quantification based on an absolute instead of a relative measurement. In fact, experimental evidence shows that the use of reference genes adds variability to the measurement system because a second independent real-time PCR-based measurement must be performed. Moreover, for some reference genes insufficient information on copy number within and among genomes of different lines is available, making adequate quantification difficult. Once developed, the method was validated according to IUPAC and ISO 5725 guidelines. Thirteen laboratories from 8 EU countries participated in the trial. Eleven laboratories provided results complying with the predefined study requirements. Repeatability (RSDr) values ranged from 8.7 to 15.9%, with a mean value of 12%. Reproducibility (RSDR) values ranged from 16.3 to 25.5%, with a mean value of 21%. Following Codex Alimentarius Committee guidelines, both the limits of detection and quantitation were determined to be <0.1%.
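
    A generic TaqMan-style quantification sketch shows the standard-curve arithmetic involved (all Ct values, names and the genome-copies conversion are invented assumptions, not part of the validated T25 assay): Ct values from serial dilutions of the plasmid calibrant are regressed against log10 copy number, the fitted line converts the Ct of an unknown extract into absolute transgene copies, and a GMO percentage follows from an assumed number of haploid maize genome copies per nanogram of DNA.

      import numpy as np

      # Serial dilutions of the plasmid calibrant (copies/reaction) and measured Ct values.
      std_copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])
      std_ct     = np.array([21.1, 24.5, 27.9, 31.2, 34.6])

      slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
      efficiency = 10 ** (-1.0 / slope) - 1.0       # amplification efficiency from the slope

      def copies_from_ct(ct):
          """Absolute transgene copy number in a reaction, from its Ct value."""
          return 10 ** ((ct - intercept) / slope)

      sample_ct = 29.0                              # unknown maize DNA extract (invented)
      dna_per_reaction_ng = 50.0
      genome_copies_per_ng = 374.0                  # assumed haploid maize genomes per ng DNA

      transgene_copies = copies_from_ct(sample_ct)
      gmo_percent = 100.0 * transgene_copies / (dna_per_reaction_ng * genome_copies_per_ng)

      print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")
      print(f"transgene copies={transgene_copies:.0f}, GMO content ~ {gmo_percent:.2f}%")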

  17. A comparison of three methods for measuring local urban tree canopy cover

    Treesearch

    Kristen L. King; Dexter H. Locke

    2013-01-01

    Measurements of urban tree canopy cover are crucial for managing urban forests and required for the quantification of the benefits provided by trees. These types of data are increasingly used to secure funding and justify large-scale planting programs in urban areas. Comparisons of tree canopy measurement methods have been conducted before, but a rapidly evolving set...

  18. Metabolite ratios to assumed stable creatine level may confound the quantification of proton brain MR spectroscopy.

    PubMed

    Li, Belinda S Y; Wang, Hao; Gonen, Oded

    2003-10-01

    In localized brain proton MR spectroscopy ((1)H-MRS), metabolites' levels are often expressed as ratios, rather than as absolute concentrations. Frequently, their denominator is the creatine [Cr], whose level is explicitly assumed to be stable in normal as well as in many pathologic states. The rationale is that ratios self-correct for imager and localization method differences, gain instabilities, regional susceptibility variations and partial volume effects. The implicit assumption is that these benefits are worth their cost: the propagation of the individual variation of each of the ratio's components. To test this hypothesis, absolute levels of N-acetylaspartate [NAA], choline [Cho] and [Cr] were quantified in various regions of the brains of 8 volunteers, using 3-dimensional (3D) (1)H-MRS at 1.5 T. The results show that in over 50% of approximately 2000 voxels examined, [NAA]/[Cr] and [Cho]/[Cr] exhibited higher coefficients of variation (CVs) than [NAA] and [Cho] individually. Furthermore, in approximately 33% of these voxels, the ratios' CVs exceeded even the combined constituents' CVs. Consequently, basing metabolite quantification on ratios and assuming stable [Cr] introduces more variability into (1)H-MRS than it prevents. Therefore, its cost exceeds the benefit.
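
    To first order, for uncorrelated errors, the relative uncertainty of a ratio combines the relative uncertainties of its numerator and denominator, which illustrates why ratio-based reporting can be noisier than the individual levels. A generic error-propagation relation (a textbook approximation, not taken from the study, which reports empirical CVs):

      \mathrm{CV}\!\left(\frac{[\mathrm{NAA}]}{[\mathrm{Cr}]}\right) \approx \sqrt{\mathrm{CV}_{[\mathrm{NAA}]}^{2} + \mathrm{CV}_{[\mathrm{Cr}]}^{2}}

    For example, with CV_[NAA] = 8% and CV_[Cr] = 10%, the ratio's CV is roughly 12.8%, exceeding either constituent alone; correlated errors can push it higher or lower, consistent with the voxel-wise variability reported above.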

  19. Quantification of menadione from plasma and urine by a novel cysteamine-derivatization based UPLC-MS/MS method.

    PubMed

    Yuan, Teng-Fei; Wang, Shao-Ting; Li, Yan

    2017-09-15

    Menadione, a crucial component of the vitamin K family, has significant nutritional and clinical value. However, favourable quantification strategies for it have been lacking to date. To address this, a novel cysteamine-derivatization-based UPLC-MS/MS method is presented in this work. The derivatization reaction proved non-toxic, easy to handle and highly efficient, enabling MS detection of menadione in positive mode. Benefitting from the excellent sensitivity of the derivatization product as well as the introduction of the stable isotope dilution technique, quantification could be achieved in the range of 0.05-50.0 ng/mL for plasma and urine matrixes with satisfactory accuracy and precision. After analysis of samples from healthy volunteers following oral administration of menadione sodium bisulfite tablets, urinary free menadione was quantified for the first time. We believe the progress in this work could largely promote the exploration of the metabolic mechanism of vitamin K in vivo. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Simultaneous detection of valine and lactate using MEGA-PRESS editing in pyogenic brain abscess.

    PubMed

    Lange, Thomas; Ko, Cheng-Wen; Lai, Ping-Hong; Dacko, Michael; Tsai, Shang-Yueh; Buechert, Martin

    2016-12-01

    Valine and lactate have been recognized as important metabolic markers to diagnose brain abscess by means of MRS. However, in vivo unambiguous detection and quantification is hampered by macromolecular contamination. In this work, MEGA-PRESS difference editing of valine and lactate is proposed. The method is validated in vitro and applied for quantitative in vivo experiments in one healthy subject and two brain abscess patients. It is demonstrated that with this technique the overlapping lipid signal can be reduced by more than an order of magnitude and thus the robustness of valine and lactate detection in vivo can be enhanced. Quantification of the two abscess MEGA-PRESS spectra yielded valine/lactate concentration ratios of 0.10 and 0.27. These ratios agreed with the concentration ratios determined from concomitantly acquired short-TE PRESS data and were in line with literature values. The quantification accuracy of lactate (as measured with Cramér-Rao lower bounds in LCModel processing) was better for MEGA-PRESS than for short-TE PRESS in all acquired in vivo datasets. The Cramér-Rao lower bounds of valine were only better for MEGA-PRESS in one of the two abscess cases, while in the other case coediting of isoleucine confounded the quantification in the MEGA-PRESS analysis. MEGA-PRESS and short-TE PRESS should be combined for unambiguous quantification of amino acids in abscess measurements. Simultaneous valine/lactate MEGA-PRESS editing might benefit the distinction of brain abscesses from tumors, and further categorization of bacteria with reasonable sensitivity and specificity. Copyright © 2016 John Wiley & Sons, Ltd.

  1. Developing and Implementing the Data Mining Algorithms in RAVEN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea

    The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. Post-processing and analyzing such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is to analyze the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e. recognizing patterns in the data. This report focuses on development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.
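
    As an illustration of the kind of post-processing such APIs expose (a minimal sketch using scikit-learn, not RAVEN's actual interface; the scenario data below are invented), clustering can group a large set of sampled scenarios into a few representative outcome classes:

      import numpy as np
      from sklearn.cluster import KMeans

      # Hypothetical scenario outcomes: each row is one sampled run of the
      # system/physics code (e.g., peak temperature, time of a key event).
      rng = np.random.default_rng(42)
      scenarios = np.vstack([
          rng.normal(loc=[600.0, 120.0], scale=[15.0, 10.0], size=(200, 2)),
          rng.normal(loc=[750.0, 60.0],  scale=[20.0, 8.0],  size=(200, 2)),
      ])

      # Group the scenarios into a few representative clusters so the
      # large scenario set becomes easier to inspect.
      model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(scenarios)
      print("cluster centers:\n", model.cluster_centers_)
      print("scenarios per cluster:", np.bincount(model.labels_))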

  2. Multi-residue method for the determination of antibiotics and some of their metabolites in seafood.

    PubMed

    Serra-Compte, Albert; Álvarez-Muñoz, Diana; Rodríguez-Mozaz, Sara; Barceló, Damià

    2017-06-01

    The presence of antibiotics in seafood for human consumption may pose a risk for consumers. A methodology for the analysis of antibiotics in seafood based on QuEChERS (quick, easy, cheap, effective, rugged, and safe) extraction, followed by detection and quantification using liquid chromatography coupled to mass spectrometry, was developed. The analytical method was evaluated for the determination of 23 antibiotics (including parent compounds and some metabolites) in fish, mussels and clams. Recoveries ranged between 30% and 70% for most of the compounds, and method detection and quantification limits (MDLs and MQLs) were between 0.01 and 0.31 ng/g dry weight (dw) and 0.02-1.03 ng/g (dw), respectively. Real seafood samples were analysed using this method. Nine antibiotics were found at levels above MDLs; however, none of them exceeded the maximum residue limits (MRL) established by the authorities. Tetracycline was the most ubiquitous compound and also presented the highest concentration: 5.63 ng/g (dw) in fish from the Netherlands. In addition, an alternative technique based on microbial growth inhibition was explored as a semiquantitative detection method for antibiotics in seafood. This methodology could be applied as a fast screening technique for the detection of macrolides and β-lactams in seafood, but further research is needed for other antibiotic families. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. A risk assessment methodology to evaluate the risk failure of managed aquifer recharge in the Mediterranean Basin

    NASA Astrophysics Data System (ADS)

    Rodríguez-Escales, Paula; Canelles, Arnau; Sanchez-Vila, Xavier; Folch, Albert; Kurtzman, Daniel; Rossetto, Rudy; Fernández-Escalante, Enrique; Lobo-Ferreira, João-Paulo; Sapiano, Manuel; San-Sebastián, Jon; Schüth, Christoph

    2018-06-01

    Managed aquifer recharge (MAR) can be affected by many risks. Those risks are related to different technical and non-technical aspects of recharge, like water availability, water quality, legislation, social issues, etc. Many other works have acknowledged risks of this nature theoretically; however, their quantification and definition have not been developed. In this study, the risk definition and quantification have been performed by means of fault trees and probabilistic risk assessment (PRA). We defined a fault tree with 65 basic events applicable to the operation phase. We then applied this methodology to six different managed aquifer recharge sites located in the Mediterranean Basin (Portugal, Spain, Italy, Malta, and Israel). The probabilities of the basic events were defined by expert criteria, based on the knowledge of the different managers of the facilities. From this, we conclude that at all sites the experts perceived the non-technical aspects to be as important as, or more important than, the technical aspects. Regarding the risk results, we observe that the total risk in three of the six sites was equal to or above 0.90. That would mean that the MAR facilities have a risk of failure equal to or higher than 90% in the period of 2-6 years. The other three sites presented lower risks (75, 29, and 18% for Malta, Menashe, and Serchio, respectively).
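
    A minimal sketch of how a fault-tree top-event probability can be composed from basic-event probabilities under an independence assumption (the gates and numbers below are hypothetical, not the 65-event tree elicited in the study):

      def p_or(probs):
          """Probability that at least one independent basic event occurs."""
          p = 1.0
          for q in probs:
              p *= (1.0 - q)
          return 1.0 - p

      def p_and(probs):
          """Probability that all independent basic events occur."""
          p = 1.0
          for q in probs:
              p *= q
          return p

      # Hypothetical basic-event probabilities over the operation period,
      # elicited from expert criteria (illustrative values only).
      technical     = p_or([0.05, 0.10, 0.08])            # e.g., clogging, source water quality, equipment
      non_technical = p_or([0.20, p_and([0.30, 0.40])])   # e.g., legal issues; funding AND social acceptance
      risk_of_failure = p_or([technical, non_technical])
      print(f"risk of MAR failure ~ {risk_of_failure:.2f}")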

  4. Supercritical fluid chromatography with photodiode array detection for pesticide analysis in papaya and avocado samples.

    PubMed

    Pano-Farias, Norma S; Ceballos-Magaña, Silvia G; Gonzalez, Jorge; Jurado, José M; Muñiz-Valencia, Roberto

    2015-04-01

    To improve the analysis of pesticides in complex food matrices with economic importance, alternative chromatographic techniques, such as supercritical fluid chromatography, can be used. Supercritical fluid chromatography has barely been applied for pesticide analysis in food matrices. In this paper, an analytical method using supercritical fluid chromatography coupled to photodiode array detection has been established for the first time for the quantification of pesticides in papaya and avocado. The extraction of methyl parathion, atrazine, ametryn, carbofuran, and carbaryl was performed through the quick, easy, cheap, effective, rugged, and safe methodology. The method was validated using papaya and avocado samples. For papaya, the correlation coefficient values were higher than 0.99; limits of detection and quantification ranged from 130-380 and 220-640 μg/kg, respectively; recovery values ranged from 72.8-94.6%; precision was lower than 3%. For avocado, limit of detection values were <450 μg/kg; precision was lower than 11%; recoveries ranged from 50.0-94.2%. Method feasibility was tested for lime, banana, mango, and melon samples. Our results demonstrate that the proposed method is applicable to methyl parathion, atrazine, ametryn, and carbaryl, toxic pesticides used worldwide. The methodology presented in this work could be applicable to other fruits. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
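
    Where limits of detection and quantification are derived from a calibration line, a common ICH-style estimate uses the residual standard deviation and the slope; a hedged sketch with invented data (not the validation data of the study, which may have used a different criterion):

      import numpy as np

      def lod_loq_from_calibration(concentrations, responses):
          """ICH-style estimates: LOD = 3.3*s/slope, LOQ = 10*s/slope,
          where s is the residual standard deviation of the calibration line."""
          x = np.asarray(concentrations, dtype=float)
          y = np.asarray(responses, dtype=float)
          slope, intercept = np.polyfit(x, y, 1)
          residuals = y - (slope * x + intercept)
          s = np.std(residuals, ddof=2)  # two fitted parameters
          return 3.3 * s / slope, 10.0 * s / slope

      # Hypothetical calibration data (concentration in ug/kg vs detector response)
      conc = [100, 250, 500, 1000, 2000]
      resp = [12.1, 30.4, 60.2, 121.5, 240.8]
      lod, loq = lod_loq_from_calibration(conc, resp)
      print(f"LOD ~ {lod:.0f} ug/kg, LOQ ~ {loq:.0f} ug/kg")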

  5. Towards tributyltin quantification in natural water at the Environmental Quality Standard level required by the Water Framework Directive.

    PubMed

    Alasonati, Enrica; Fettig, Ina; Richter, Janine; Philipp, Rosemarie; Milačič, Radmila; Sčančar, Janez; Zuliani, Tea; Tunç, Murat; Bilsel, Mine; Gören, Ahmet Ceyhan; Fisicaro, Paola

    2016-11-01

    The European Union (EU) has included tributyltin (TBT) and its compounds in the list of priority water pollutants. Quality standards demanded by the EU Water Framework Directive (WFD) require determination of TBT at such a low concentration level that chemical analysis is still difficult, and further research is needed to improve the sensitivity, the accuracy and the precision of existing methodologies. Within the frame of a joint research project "Traceable measurements for monitoring critical pollutants under the European Water Framework Directive" in the European Metrology Research Programme (EMRP), four metrological and designated institutes have developed a primary method to quantify TBT in natural water using liquid-liquid extraction (LLE) and species-specific isotope dilution mass spectrometry (SSIDMS). The procedure has been validated at the Environmental Quality Standard (EQS) level (0.2 ng L(-1) as cation) and at the WFD-required limit of quantification (LOQ) (0.06 ng L(-1) as cation). The LOQ of the methodology was 0.06 ng L(-1) and the average measurement uncertainty at the LOQ was 36%, which agreed with WFD requirements. The analytical difficulties of the method, namely the presence of TBT in blanks and the sources of measurement uncertainties, as well as the interlaboratory comparison results are discussed in detail. Copyright © 2016 Elsevier B.V. All rights reserved.
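
    Species-specific isotope dilution relates the amount of analyte in the sample to the amount of isotopically enriched spike added, via the isotope ratio measured in the blend. A generic single-IDMS relation (a textbook form, not the exact working equations of the cited study) is:

      n_x = n_y \cdot \frac{a_{2,y}}{a_{2,x}} \cdot \frac{R_y - R_b}{R_b - R_x}

    where n_x and n_y are the amounts of analyte and spike, R_x, R_y and R_b are the isotope-amount ratios (isotope 1 to isotope 2) of the sample, the spike and the measured blend, and a_2 denotes the abundance of isotope 2 in each material. The TBT mass fraction then follows from n_x, the molar mass and the sample mass.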

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toro, Javier, E-mail: jjtoroca@unal.edu.co; Requena, Ignacio, E-mail: requena@decsai.ugr.es; Duarte, Oscar, E-mail: ogduartev@unal.edu.co

    In environmental impact assessment, qualitative methods are used because they are versatile and easy to apply. This methodology is based on the evaluation of the strength of the impact by grading a series of qualitative attributes that can be manipulated by the evaluator. The results thus obtained are not objective, and all too often impacts are eliminated that should be mitigated with corrective measures. However, qualitative methodology can be improved if the calculation of Impact Importance is based on the characteristics of environmental factors and project activities instead of on indicators assessed by evaluators. In this sense, this paper proposes the inclusion of the vulnerability of environmental factors and the potential environmental impact of project activities. For this purpose, the study described in this paper defined Total Impact Importance and specified a quantification procedure. The results obtained in the case study of oil drilling in Colombia reflect greater objectivity in the evaluation of impacts as well as a positive correlation between impact values, the environmental characteristics at and near the project location, and the technical characteristics of project activities. -- Highlights: • The concept of vulnerability is used to calculate impact importance. • This paper defined Total Impact Importance and specified a quantification procedure. • The method includes the characteristics of environmental factors and project activities. • The application has shown greater objectivity in the evaluation of impacts. • A better correlation between impact values, the environment and the project has been shown.

  7. Benefits estimates of highway capital improvements with uncertain parameters.

    DOT National Transportation Integrated Search

    2006-01-01

    This report warrants consideration in the development of goals, performance measures, and standard cost-benefit methodology required of transportation agencies by the Virginia 2006 Appropriations Act. The Virginia Department of Transportation has beg...

  8. Performance-cost evaluation methodology for ITS equipment deployment

    DOT National Transportation Integrated Search

    2000-09-01

    Although extensive Intelligent Transportation Systems (ITS) technology is being deployed in the field, little analysis is being performed to evaluate the benefits of implementation schemes. Benefit analysis is particularly in need for one popular ITS...

  9. Incorporating ITS into corridor planning : Seattle case study

    DOT National Transportation Integrated Search

    1999-08-01

    The goals of this study were to develop a methodology for incorporating Intelligent Transportation Systems (ITS) into the transportation planning process and apply the methodology to estimate ITS costs and benefits for one case study. A major result ...

  10. Highway User Benefit Analysis System Research Project #128

    DOT National Transportation Integrated Search

    2000-10-01

    In this research, a methodology for estimating road user costs of various competing alternatives was developed. Also, software was developed to calculate the road user cost, perform economic analysis and update cost tables. The methodology is based o...

  11. Incorporating ITS into corridor planning : Seattle case study

    DOT National Transportation Integrated Search

    1999-06-01

    The goals of this study were to develop a methodology for incorporating Intelligent Transportation Systems (ITS) into the transportation planning process and apply the methodology to estimate ITS costs and benefits for one case study. A major result ...

  12. HBsAg quantification to predict natural history and treatment outcome in chronic hepatitis B patients.

    PubMed

    Martinot-Peignoux, Michelle; Asselah, Tarik; Marcellin, Patrick

    2013-08-01

    There is a growing interest in serum HBsAg quantification (qHBsAg). HBsAg titers are negatively correlated with liver fibrosis in HBeAg(+) patients. In HBeAg(-) patients, an HBsAg level <1000 IU/ml and an HBV-DNA titer <2000 IU/ml accurately identify inactive carriers. During PEG-IFN treatment, qHBsAg identifies patients with no benefit from therapy at week 12, allowing treatment to be stopped or switched (the "week 12 stopping rule"). During nucleos(t)ide analogue therapy, the role of qHBsAg needs to be clarified. In clinical practice, qHBsAg is a simple and reproducible tool that may be used in association with HBV-DNA to classify patients during the natural history of HBV and to monitor therapy. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. A Framework for the Evaluation of the Cost and Benefits of Microgrids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, Greg Young; Abbey, Chad; Joos, Geza

    2011-07-15

    A Microgrid is recognized as an innovative technology to help integrate renewables into distribution systems and to provide additional benefits to a variety of stakeholders, such as offsetting infrastructure investments and improving the reliability of the local system. However, these systems require additional investments for control infrastructure, and as such, additional costs and the anticipated benefits need to be quantified in order to determine whether the investment is economically feasible. This paper proposes a methodology for systematizing and representing benefits and their interrelationships based on the UML Use Case paradigm, which allows complex systems to be represented in a concise, elegant format. This methodology is demonstrated by determining the economic feasibility of a Microgrid and Distributed Generation installed on a typical Canadian rural distribution system model as a case study. The study attempts to minimize the cost of energy served to the community, considering the fixed costs associated with Microgrids and Distributed Generation, and suggests benefits to a variety of stakeholders.

  14. The Monetary Rewards of Migration Within the U.S.

    ERIC Educational Resources Information Center

    Wertheimer, Richard F., II

    This study focuses on the economic benefits derived by the migrant from migration. The report presents a methodology for computing monetary benefits, an estimate of these benefits, and implications of the findings for public policy. Included are a discussion of domestic migration and public policy, an economic theory of migration, an explanation…

  15. The Service Learning Projects: Stakeholder Benefits and Potential Class Topics

    ERIC Educational Resources Information Center

    Rutti, Raina M.; LaBonte, Joanne; Helms, Marilyn Michelle; Hervani, Aref Agahei; Sarkarat, Sy

    2016-01-01

    Purpose: The purpose of this paper is to summarize the benefits of including a service learning project in college classes and focusses on benefits to all stakeholders, including students, community, and faculty. Design/methodology/approach: Using a snowball approach in academic databases as well as a nominal group technique to poll faculty, key…

  16. Rapid monitoring of intermediate states and mass balance of nitrogen during denitrification by means of cavity enhanced Raman multi-gas sensing.

    PubMed

    Keiner, Robert; Herrmann, Martina; Küsel, Kirsten; Popp, Jürgen; Frosch, Torsten

    2015-03-15

    The comprehensive investigation of changes in N cycling has been challenging so far due to difficulties with measuring gases such as N2 and N2O simultaneously. In this study we introduce cavity enhanced Raman gas spectroscopy as a new analytical methodology for tracing the stepwise reduction of (15)N-labelled nitrate by the denitrifying bacteria Pseudomonas stutzeri. The unique capabilities of Raman multi-gas analysis enabled real-time, continuous, and non-consumptive quantification of the relevant gases ((14)N2, (14)N2O, O2, and CO2) and tracing of the fate of (15)N-labeled nitrate substrate ((15)N2, (15)N2O) added to a P. stutzeri culture within a single measurement. Using this new methodology, we could quantify the kinetics of formation and degradation of all gaseous compounds (educts and products) and thus study the reaction orders. The gas quantification was complemented with the analysis of nitrate and nitrite concentrations for the online monitoring of the total nitrogen element budget. The simultaneous quantification of all gases also enabled the contactless and sterile online acquisition of the pH changes in the P. stutzeri culture via the stoichiometry of the redox reactions during denitrification and the CO2-bicarbonate equilibrium. Continuous pH monitoring - without the need to insert an electrode into solution - revealed, for example, an increase in the slope of the pH value coinciding with an accumulation of nitrite, which in turn led to a temporary accumulation of N2O, due to an inhibition of nitrous oxide reductase. Cavity enhanced Raman gas spectroscopy has a high potential for the assessment of denitrification processes and can contribute substantially to our understanding of nitrogen cycling in both natural and agricultural systems. Copyright © 2015 Elsevier B.V. All rights reserved.
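
    The contactless pH estimate exploits the CO2-bicarbonate equilibrium; below is a simplified Henderson-Hasselbalch illustration with approximate constants at 25 °C (hypothetical values, not the calibration actually used in the study):

      import math

      def ph_from_co2(p_co2_atm, bicarbonate_molar, pKa1=6.35, henry_K=0.034):
          """Henderson-Hasselbalch estimate for the carbonate system:
          pH = pKa1 + log10([HCO3-] / (K_H * pCO2)), K_H in mol/(L*atm).
          Constants are approximate values at 25 degC."""
          dissolved_co2 = henry_K * p_co2_atm
          return pKa1 + math.log10(bicarbonate_molar / dissolved_co2)

      # Hypothetical values: 2% CO2 in the headspace, 10 mM bicarbonate in the medium
      print(f"pH ~ {ph_from_co2(0.02, 0.010):.2f}")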

  17. Inorganic carbon and fossil organic carbon are source of bias for quantification of sequestered carbon in mine spoil

    NASA Astrophysics Data System (ADS)

    Vindušková, Olga; Frouz, Jan

    2016-04-01

    Carbon sequestration in mine soils has been studied as a possibility to mitigate the rising atmospheric CO2 levels and to improve mine soil quality (Vindušková and Frouz, 2013). Moreover, these soils offer a unique opportunity to study soil carbon dynamics using the chronosequence approach (using a set of sites of different age on similar parent material). However, quantification of sequestered carbon in mine soils is often complicated by fossil organic carbon (e.g., from coal or kerogen) or inorganic carbon present in the spoil. We present a methodology for quantification of both of these common constituents of mine soils. Our recommendations are based on experiments done on post-mining soils in the Sokolov basin, Czech Republic. Here, fossil organic carbon is present mainly as kerogen Type I and II and represents 2-6 wt.% C in these soils. Inorganic carbon in these soils is present mainly as siderite (FeCO3), calcite (CaCO3), and dolomite (CaMg(CO3)2). All of these carbonates are often found in the overburden of coal seams, thus being a common constituent of post-mining soils worldwide. Vindušková O, Frouz J, 2013. Soil carbon accumulation after open-cast coal and oil shale mining in Northern Hemisphere: a quantitative review. ENVIRONMENTAL EARTH SCIENCES, 69: 1685-1698. Vindušková O, Dvořáček V, Prohasková A, Frouz J. 2014. Distinguishing recent and fossil organic matter - A critical step in evaluation of post-mining soil development - using near infrared spectroscopy. ECOLOGICAL ENGINEERING, 73: 643-648. Vindušková O, Sebag D, Cailleau G, Brus J, Frouz J. 2015. Methodological comparison for quantitative analysis of fossil and recently derived carbon in mine soils with high content of aliphatic kerogen. ORGANIC GEOCHEMISTRY, 89-90: 14-22.

  18. The role of SO4(2-) surface distribution in arsenic removal by iron oxy-hydroxides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tresintsi, S.; Simeonidis, K., E-mail: ksime@physics.auth.gr; Department of Mechanical Engineering, University of Thessaly, 38334 Volos

    2014-05-01

    This study investigates the contribution of chemisorbed SO4(2-) in improving the arsenic removal properties of iron oxy-hydroxides through an ion-exchange mechanism. An analytical methodology was developed for the accurate quantification of sulfate ion (SO4(2-)) distribution onto the surface and structural compartments of iron oxy-hydroxides synthesized by FeSO4 precipitation. The procedure is based on the sequential determination of SO4(2-) presence in the diffuse and Stern layers, and the structure of these materials as defined by the sulfate-rich environments during the reaction and the variation in acidity (pH 3–12). Physically sorbed SO4(2-), extracted in distilled water, and physically/chemically adsorbed ions on the oxy-hydroxide's surface, leached by a 5 mM NaOH solution, were determined using ion chromatography. Total sulfate content was gravimetrically measured by precipitation as BaSO4. To validate the suggested method, results were verified by X-ray photoelectron and Fourier-transform infrared spectroscopy. Results showed that low precipitation pH-values favor the incorporation of sulfate ions into the structure and the inner double layer, while under alkaline conditions ions shift to the diffuse layer. - Graphical abstract: An analytical methodology for the accurate quantification of sulfate ion (SO4(2-)) distribution onto the diffuse layer, the Stern layer and the structure of iron oxy-hydroxides used as arsenic removal agents. - Highlights: • Quantification of sulfate ion presence in FeOOH surface compartments. • Preparation pH defines the distribution of sulfates. • XPS and FTIR verify the presence of SO4(2-) in the structure, the Stern layer and the diffuse layer of FeOOH. • Chemically adsorbed sulfates control the arsenic removal efficiency of iron oxy-hydroxides.
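
    The sequential scheme implies a simple mass balance once the three measurements are in hand; a hedged sketch with invented numbers (not data from the study):

      # Hypothetical sequential-extraction results for one FeOOH sample (wt.% SO4(2-))
      total_sulfate      = 4.8   # gravimetric, as BaSO4
      water_extractable  = 1.1   # physically sorbed (diffuse layer)
      naoh_extractable   = 2.9   # physically + chemically sorbed (surface, Stern layer)

      chemisorbed = naoh_extractable - water_extractable   # Stern-layer fraction
      structural  = total_sulfate - naoh_extractable       # incorporated in the structure

      print(f"chemisorbed SO4: {chemisorbed:.1f} wt.%, structural SO4: {structural:.1f} wt.%")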

  19. A Methodology to Assess the Strategic Benefits of New Production Technologies

    DTIC Science & Technology

    1990-01-01

    ...with its focus on reducing labor costs, does not capture the strategic advantages that make new technologies attractive. Our methodology integrates investments in new production technologies into the... In many cases, retaining the existing...

  20. Broadening the Study of Participation in the Life Sciences: How Critical Theoretical and Mixed-Methodological Approaches Can Enhance Efforts to Broaden Participation

    ERIC Educational Resources Information Center

    Metcalf, Heather

    2016-01-01

    This research methods Essay details the usefulness of critical theoretical frameworks and critical mixed-methodological approaches for life sciences education research on broadening participation in the life sciences. First, I draw on multidisciplinary research to discuss critical theory and methodologies. Then, I demonstrate the benefits of these…

  1. Applications of Quantum Cascade Laser Spectroscopy in the Analysis of Pharmaceutical Formulations.

    PubMed

    Galán-Freyle, Nataly J; Pacheco-Londoño, Leonardo C; Román-Ospino, Andrés D; Hernandez-Rivera, Samuel P

    2016-09-01

    Quantum cascade laser spectroscopy was used to quantify active pharmaceutical ingredient content in a model formulation. The analyses were conducted in non-contact mode by mid-infrared diffuse reflectance. Measurements were carried out at a distance of 15 cm, covering the spectral range 1000-1600 cm(-1). Calibrations were generated by applying multivariate analysis using partial least squares models. Among the figures of merit of the proposed methodology are the high analytical sensitivity equivalent to 0.05% active pharmaceutical ingredient in the formulation, high repeatability (2.7%), high reproducibility (5.4%), and low limit of detection (1%). The relatively high power of the quantum-cascade-laser-based spectroscopic system resulted in the design of detection and quantification methodologies for pharmaceutical applications with high accuracy and precision that are comparable to those of methodologies based on near-infrared spectroscopy, attenuated total reflection mid-infrared Fourier transform infrared spectroscopy, and Raman spectroscopy. © The Author(s) 2016.
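
    A minimal sketch of a partial least squares calibration of the kind described, using scikit-learn on synthetic spectra (the data and component count are hypothetical, not the study's formulation or figures of merit):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      # Hypothetical data: rows are mid-IR diffuse-reflectance spectra (1000-1600 cm-1),
      # y is the API content (%) of each calibration blend.
      rng = np.random.default_rng(0)
      n_samples, n_wavenumbers = 40, 300
      y = rng.uniform(0.0, 5.0, n_samples)                   # API content, %
      pure_component = rng.normal(size=n_wavenumbers)
      X = np.outer(y, pure_component) + rng.normal(scale=0.05, size=(n_samples, n_wavenumbers))

      pls = PLSRegression(n_components=4)
      y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
      rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
      print(f"RMSECV ~ {rmsecv:.3f} % API")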

  2. A solid phase extraction-ion chromatography with conductivity detection procedure for determining cationic surfactants in surface water samples.

    PubMed

    Olkowska, Ewa; Polkowska, Żaneta; Namieśnik, Jacek

    2013-11-15

    A new analytical procedure for the simultaneous determination of individual cationic surfactants (alkyl benzyl dimethyl ammonium chlorides) in surface water samples has been developed. We describe this methodology for the first time: it involves the application of solid phase extraction (SPE, for sample preparation) coupled with ion chromatography-conductivity detection (IC-CD, for the final determination). Mean recoveries of analytes between 79% and 93%, and overall method quantification limits in the range from 0.0018 to 0.038 μg/mL, were achieved for surface water and CRM samples. The methodology was applied to the determination of individual alkyl benzyl quaternary ammonium compounds in environmental samples (reservoir water) and enables their presence in such types of waters to be confirmed. In addition, it is a simpler, less time-consuming, less labour-intensive and significantly less expensive methodology than previously described approaches (liquid-liquid extraction coupled with liquid chromatography-mass spectrometry), and it avoids the use of toxic chloroform. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. A Practical Risk Assessment Methodology for Safety-Critical Train Control Systems

    DOT National Transportation Integrated Search

    2009-07-01

    This project proposes a Practical Risk Assessment Methodology (PRAM) for analyzing railroad accident data and assessing the risk and benefit of safety-critical train control systems. This report documents in simple steps the algorithms and data input...

  4. Improved Methodology for Benefit Estimation of Preservation Projects

    DOT National Transportation Integrated Search

    2018-04-01

    This research report presents an improved process for evaluating the benefits and economic tradeoffs associated with a variety of highway preservation projects. It includes a summary of results from a comprehensive phone survey concerning the use and...

  5. Estimation of potential safety benefits for pedestrian crash avoidance/mitigation systems.

    DOT National Transportation Integrated Search

    2017-04-01

    This report presents and exercises a methodology to estimate the effectiveness and potential safety benefits of production pedestrian crash avoidance/mitigation systems. The analysis focuses on light vehicles moving forward and striking a pedestrian ...

  6. Lifetime costing of the body-in-white: Steel vs. aluminum

    NASA Astrophysics Data System (ADS)

    Han, Helen N.; Clark, Joel P.

    1995-05-01

    In order to make informed material choice decisions and to derive the maximum benefit from the use of alternative materials, the automobile producer must understand the full range of costs and benefits for each material. It is becoming clear that the conventional cost-benefit analysis structure currently used by the automotive industry must be broadened to include nontraditional costs such as the environmental externalities associated with the use of existing and potential automotive technologies. This article develops a methodology for comparing the costs and benefits associated with the use of alternative materials in automotive applications by focusing on steel and aluminum in the unibody body-in-white. Authors' Note: This is the first of two articles documenting a methodology for evaluating the lifetime monetary and environmental costs of alternative materials in automotive applications. This article addresses the traditional money costs while a subsequent paper, which is planned for the August issue, will address the environmental externalities.

  7. Baseline Study Methodology for Future Phases of Research on Nuclear Power Plant Control Room Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le Blanc, Katya Lee; Bower, Gordon Ross; Hill, Rachael Ann

    In order to provide a basis for industry adoption of advanced technologies, the Control Room Upgrades Benefits Research Project will investigate the benefits of including advanced technologies as part of control room modernization. This report describes the background, methodology, and research plan for the first in a series of full-scale studies to test the effects of advanced technology in NPP control rooms. This study will test the effect of Advanced Overview Displays in the partner utility's control room simulator.

  8. Can Regulatory Bodies Expect Efficient Help from Formal Methods?

    NASA Technical Reports Server (NTRS)

    Lopez Ruiz, Eduardo R.; Lemoine, Michel

    2010-01-01

    In the context of EDEMOI, a French national project that proposed the use of semiformal and formal methods to infer the consistency and robustness of aeronautical regulations through the analysis of faithfully representative models, a methodology was suggested (and applied) for different (safety- and security-related) aeronautical regulations. This paper summarizes the preliminary results of this experience by stating what the methodology's expected benefits were, from a scientific point of view, and what its useful benefits are, from a regulatory body's point of view.

  9. Impact of knowledge and misconceptions on benefit and risk perception of CCS.

    PubMed

    Wallquist, Lasse; Visschers, Vivianne H M; Siegrist, Michael

    2010-09-01

    Carbon Dioxide Capture and Storage (CCS) is assumed to be one of the key technologies in the mitigation of climate change. Public acceptance may have a strong impact on the progress of this technology. Benefit perception and risk perception are known to be important determinants of public acceptance of CCS. In this study, the prevalence and effect of cognitive concepts underlying laypeople's risk perception and benefit perception of CCS were examined in a representative survey (N=654) in Switzerland. Results confirm findings from previous qualitative studies and provide a quantification of a variety of widespread intuitive concepts that laypeople hold about storage mechanisms as well as about leakage and socioeconomic issues, all of which appeared to influence risk perception and benefit perception. The perception of an overpressurized reservoir and concerns about diffuse impacts furthermore amplified risk perception. Appropriate images of storage mechanisms and climate change awareness increased the perception of benefits. Knowledge about CO2 seemed to lower both perceived benefits and perceived risks. Implications for risk communication and management are discussed.

  10. Pediatric echocardiography: new developments and applications.

    PubMed

    Ge, Shuping

    2013-04-01

    In this Special Issue of the Journal, 6 review articles that represent the new developments and applications of echocardiography for diagnosis and assessment of congenital heart disease from fetus to adult are included. The goal is to provide an updated review of the evidence for the current and potential use of some of the new methodologies, i.e. fetal echocardiography, tissue Doppler imaging, strain imaging by speckle tracking imaging, ventricular synchrony, quantification using real time three-dimensional (3D) echocardiography, and 3D echocardiography for adults with congenital heart disease. We hope this effort will provide an impetus for more investigation and ultimately clinical application of these new methodologies to improve the care of those with congenital and acquired heart diseases in the pediatric population and beyond. © 2013, Wiley Periodicals, Inc.

  11. Inequality spectra

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2017-03-01

    Inequality indices are widely applied in economics and in the social sciences as quantitative measures of the socioeconomic inequality of human societies. The application of inequality indices extends to size-distributions at large, where these indices can be used as general gauges of statistical heterogeneity. Moreover, as inequality indices are plentiful, arrays of such indices facilitate high-detail quantification of statistical heterogeneity. In this paper we elevate from arrays of inequality indices to inequality spectra: continuums of inequality indices that are parameterized by a single control parameter. We present a general methodology of constructing Lorenz-based inequality spectra, apply the general methodology to establish four sets of inequality spectra, investigate the properties of these sets, and show how these sets generalize known inequality gauges such as: the Gini index, the extended Gini index, the Rényi index, and hill curves.
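
    For reference, the Gini index, the base case that the proposed spectra generalize, can be computed directly from a sorted sample via the Lorenz curve; a small sketch using a standard discrete formula (not code from the paper):

      import numpy as np

      def gini(values):
          """Gini index of a sample of non-negative sizes (0 = equality, ->1 = maximal inequality)."""
          x = np.sort(np.asarray(values, dtype=float))
          n = x.size
          cum = np.cumsum(x)
          # Discrete Lorenz-curve formula: G = (n + 1 - 2 * sum(cum) / total) / n
          return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

      incomes = [10, 20, 30, 40, 400]
      print(f"Gini ~ {gini(incomes):.3f}")   # about 0.64 for this skewed sample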

  12. New classification of natural breeding habitats for Neotropical anophelines in the Yanomami Indian Reserve, Amazon Region, Brazil and a new larval sampling methodology.

    PubMed

    Sánchez-Ribas, Jordi; Oliveira-Ferreira, Joseli; Rosa-Freitas, Maria Goreti; Trilla, Lluís; Silva-do-Nascimento, Teresa Fernandes

    2015-09-01

    Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs.

  13. New classification of natural breeding habitats for Neotropical anophelines in the Yanomami Indian Reserve, Amazon Region, Brazil and a new larval sampling methodology

    PubMed Central

    Sánchez-Ribas, Jordi; Oliveira-Ferreira, Joseli; Rosa-Freitas, Maria Goreti; Trilla, Lluís; Silva-do-Nascimento, Teresa Fernandes

    2015-01-01

    Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs. PMID:26517655

  14. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David; Agarwal, Deborah A.; Sun, Xin

    2011-09-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  15. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.; Agarwal, D.; Sun, X.

    2011-01-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  16. Titan Science Return Quantification

    NASA Technical Reports Server (NTRS)

    Weisbin, Charles R.; Lincoln, William

    2014-01-01

    Each proposal for a NASA mission concept includes a Science Traceability Matrix (STM), intended to show that what is being proposed would contribute to satisfying one or more of the agency's top-level science goals. But the information traditionally provided cannot be used directly to quantitatively compare anticipated science return. We added numerical elements to NASA's STM and developed a software tool to process the data. We then applied this methodology to evaluate a group of competing concepts for a proposed mission to Saturn's moon, Titan.
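
    One simple way to turn an augmented STM into a comparable number is a weighted sum of objective-satisfaction scores; the sketch below is purely illustrative (hypothetical objectives, weights and concepts, not the tool or the Titan concepts evaluated in the study):

      # Hypothetical numerical augmentation of a Science Traceability Matrix:
      # each science objective carries an importance weight, and each mission
      # concept is scored on how fully it would satisfy that objective (0-1).
      objectives = {
          "objective A": 0.4,
          "objective B": 0.35,
          "objective C": 0.25,
      }
      concept_scores = {
          "concept 1": {"objective A": 0.30, "objective B": 0.90, "objective C": 0.80},
          "concept 2": {"objective A": 0.95, "objective B": 0.20, "objective C": 0.50},
      }

      for concept, scores in concept_scores.items():
          total = sum(weight * scores[obj] for obj, weight in objectives.items())
          print(f"{concept}: expected science return ~ {total:.2f}")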

  17. Regional energy planning: Some suggestions to public administration

    NASA Astrophysics Data System (ADS)

    Sozzi, R.

    A methodology is proposed to estimate the relevant data and to improve the energy efficiency in regional energy planning. The quantification of the regional energy system is subdivided into three independent parameters, which are separately estimated: energy demand, energy consumption, and transformation capacity. Definitions and estimating procedures are given. The optimization of regional planning includes the application, wherever possible, of technologies which centralize space-heating energy production or combine the production of electric energy with space-heating energy distribution.

  18. Identification and Quantification of N-Acyl Homoserine Lactones Involved in Bacterial Communication by Small-Scale Synthesis of Internal Standards and Matrix-Assisted Laser Desorption/Ionization Mass Spectrometry.

    PubMed

    Leipert, Jan; Treitz, Christian; Leippe, Matthias; Tholey, Andreas

    2017-12-01

    N-acyl homoserine lactones (AHL) are small signal molecules involved in the quorum sensing of many gram-negative bacteria, and play an important role in biofilm formation and pathogenesis. Present analytical methods for identification and quantification of AHL require time-consuming sample preparation steps and are hampered by the lack of appropriate standards. Aiming at a fast and straightforward method for AHL analytics, we investigated the applicability of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Suitable MALDI matrices, including crystalline and ionic liquid matrices, were tested and the fragmentation of different AHL in collision-induced dissociation MS/MS was studied, providing information about characteristic marker fragment ions. Employing small-scale synthesis protocols, we established a versatile and cost-efficient procedure for fast generation of isotope-labeled AHL standards, which can be used without extensive purification and yielded accurate standard curves. Quantitative analysis was possible in the low pico-molar range, with lower limits of quantification ranging from 1 to 5 pmol for different AHL. The developed methodology was successfully applied in a quantitative MALDI MS analysis of low-volume culture supernatants of Pseudomonas aeruginosa.

  19. Analytical method for the evaluation of the outdoor air contamination by emerging pollutants using tree leaves as bioindicators.

    PubMed

    Barroso, Pedro José; Martín, Julia; Santos, Juan Luis; Aparicio, Irene; Alonso, Esteban

    2018-01-01

    In this work, an analytical method, based on sonication-assisted extraction, clean-up by dispersive solid-phase extraction and determination by liquid chromatography-tandem mass spectrometry, has been developed and validated for the simultaneous determination of 15 emerging pollutants in leaves from four ornamental tree species. Target compounds include perfluorinated organic compounds, plasticizers, surfactants, brominated flame retardant, and preservatives. The method was optimized using a Box-Behnken statistical experimental design with response surface methodology and validated in terms of recovery, accuracy, precision, and method detection and quantification limits. Quantification of target compounds was carried out using matrix-matched calibration curves. The highest recoveries were achieved for the perfluorinated organic compounds (mean values up to 87%) and preservatives (up to 88%). The lowest recoveries were achieved for plasticizers (51%) and brominated flame retardant (63%). Method detection and quantification limits were in the ranges 0.01-0.09 ng/g dry matter (dm) and 0.02-0.30 ng/g dm, respectively, for most of the target compounds. The method was successfully applied to the determination of the target compounds on leaves from four tree species used as urban ornamental trees (Citrus aurantium, Celtis australis, Platanus hispanica, and Jacaranda mimosifolia). Graphical abstract: Analytical method for the biomonitoring of emerging pollutants in outdoor air.

  20. Direct and Absolute Quantification of over 1800 Yeast Proteins via Selected Reaction Monitoring*

    PubMed Central

    Lawless, Craig; Holman, Stephen W.; Brownridge, Philip; Lanthaler, Karin; Harman, Victoria M.; Watkins, Rachel; Hammond, Dean E.; Miller, Rebecca L.; Sims, Paul F. G.; Grant, Christopher M.; Eyers, Claire E.; Beynon, Robert J.

    2016-01-01

    Defining intracellular protein concentration is critical in molecular systems biology. Although strategies for determining relative protein changes are available, defining robust absolute values in copies per cell has proven significantly more challenging. Here we present a reference data set quantifying over 1800 Saccharomyces cerevisiae proteins by direct means using protein-specific stable-isotope labeled internal standards and selected reaction monitoring (SRM) mass spectrometry, far exceeding any previous study. This was achieved by careful design of over 100 QconCAT recombinant proteins as standards, defining 1167 proteins in terms of copies per cell and upper limits on a further 668, with robust CVs routinely less than 20%. The selected reaction monitoring-derived proteome is compared with existing quantitative data sets, highlighting the disparities between methodologies. Coupled with a quantification of the transcriptome by RNA-seq taken from the same cells, these data support revised estimates of several fundamental molecular parameters: a total protein count of ∼100 million molecules-per-cell, a median of ∼1000 proteins-per-transcript, and a linear model of protein translation explaining 70% of the variance in translation rate. This work contributes a “gold-standard” reference yeast proteome (including 532 values based on high quality, dual peptide quantification) that can be widely used in systems models and for other comparative studies. PMID:26750110
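
    The copies-per-cell arithmetic behind such SRM measurements is straightforward once a known amount of labeled standard is spiked in; a hedged sketch with invented numbers (not the actual QconCAT workflow software):

      AVOGADRO = 6.022e23

      def copies_per_cell(light_area, heavy_area, standard_amount_fmol, cells_analyzed):
          """Absolute quantification with a stable-isotope-labeled standard:
          analyte amount = standard amount * (light/heavy peak-area ratio)."""
          analyte_fmol = standard_amount_fmol * (light_area / heavy_area)
          molecules = analyte_fmol * 1e-15 * AVOGADRO
          return molecules / cells_analyzed

      # Hypothetical SRM measurement: 50 fmol of heavy standard peptide spiked into
      # a digest corresponding to 1e7 cells, with a light/heavy area ratio of 0.8
      print(f"{copies_per_cell(0.8, 1.0, 50.0, 1e7):.0f} copies per cell")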

  1. Spectral Analysis of Dynamic PET Studies: A Review of 20 Years of Method Developments and Applications.

    PubMed

    Veronese, Mattia; Rizzo, Gaia; Bertoldo, Alessandra; Turkheimer, Federico E

    2016-01-01

    In Positron Emission Tomography (PET), spectral analysis (SA) allows the quantification of dynamic data by relating the radioactivity measured by the scanner in time to the underlying physiological processes of the system under investigation. Among the different approaches for the quantification of PET data, SA is based on the linear solution of the Laplace transform inversion, whereby the measured arterial and tissue time-activity curves of a radiotracer are used to calculate the impulse response function of the tissue. In recent years SA has been used with a large number of PET tracers in brain and nonbrain applications, demonstrating that it is a very flexible and robust method for PET data analysis. Unlike the most common PET quantification approaches, which adopt standard nonlinear estimation of compartmental models or some linear simplifications, SA can be applied without defining any specific model configuration and has demonstrated very good sensitivity to the underlying kinetics. This characteristic makes it useful as an investigative tool, especially for the analysis of novel PET tracers. The purpose of this work is to offer an overview of SA, to discuss advantages and limitations of the methodology, and to inform about its applications in the PET field.
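
    In spectral analysis the tissue time-activity curve is commonly modeled as the convolution of the measured plasma input with a sum of decaying exponentials, with non-negative amplitudes estimated on a predefined grid of decay rates; schematically:

      C_T(t) = \sum_{j=1}^{N} \alpha_j \int_0^t C_p(\tau)\, e^{-\beta_j (t-\tau)}\, d\tau, \qquad \alpha_j \ge 0,\ \beta_j \ge 0

    with C_p the arterial input function, C_T the tissue curve, and the β_j fixed on a grid spanning the expected kinetic range so that only the α_j are estimated, by non-negative linear least squares.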

  2. Instantaneous Wavenumber Estimation for Damage Quantification in Layered Plate Structures

    NASA Technical Reports Server (NTRS)

    Mesnil, Olivier; Leckey, Cara A. C.; Ruzzene, Massimo

    2014-01-01

    This paper illustrates the application of instantaneous and local wavenumber damage quantification techniques for high frequency guided wave interrogation. The proposed methodologies can be considered as first steps towards a hybrid structural health monitoring/ nondestructive evaluation (SHM/NDE) approach for damage assessment in composites. The challenges and opportunities related to the considered type of interrogation and signal processing are explored through the analysis of numerical data obtained via EFIT simulations of damage in CFRP plates. Realistic damage configurations are modeled from x-ray CT scan data of plates subjected to actual impacts, in order to accurately predict wave-damage interactions in terms of scattering and mode conversions. Simulation data is utilized to enhance the information provided by instantaneous and local wavenumbers and mitigate the complexity related to the multi-modal content of the plate response. Signal processing strategies considered for this purpose include modal decoupling through filtering in the frequency/wavenumber domain, the combination of displacement components, and the exploitation of polarization information for the various modes as evaluated through the dispersion analysis of the considered laminate lay-up sequence. The results presented assess the effectiveness of the proposed wavefield processing techniques as a hybrid SHM/NDE technique for damage detection and quantification in composite, plate-like structures.

  3. Quantification by SEM-EDS in uncoated non-conducting samples

    NASA Astrophysics Data System (ADS)

    Galván Josa, V.; Castellano, G.; Bertolino, S. R.

    2013-07-01

    An approach to perform elemental quantitative analysis in a conventional scanning electron microscope with an energy dispersive spectrometer has been developed for non-conductive samples in which the conductive coating should be avoided. Charge accumulation effects, which basically decrease the energy of the primary beam, were taken into account by means of the Duane-Hunt limit. This value represents the maximum energy of the continuum X-ray spectrum, and is related to the effective energy of the incident electron beam. To validate the results obtained by this procedure, a non-conductive sample of known composition was quantified without conductive coating. Complementarily, changes in the X-ray spectrum due to charge accumulation effects were studied by Monte Carlo simulations, comparing relative characteristic intensities as a function of the incident energy. This methodology is exemplified here to obtain the chemical composition of white and reddish archaeological pigments belonging to the Ambato style of "Aguada" culture (Catamarca, Argentina 500-1100 AD). The results obtained in this work show that the quantification procedure taking into account the Duane-Hunt limit is suitable for this kind of samples. This approach may be recommended for the quantification of samples for which coating is not desirable, such as ancient artwork, forensic or archaeological samples, or when the coating element is also present in the sample.
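
    The correction rests on the fact that the high-energy cutoff of the bremsstrahlung continuum (the Duane-Hunt limit) equals the landing energy of the electrons, so the effective beam energy on a charging sample can be read directly from the spectrum; schematically (a hedged relation, with the charging potential an assumption of this sketch):

      E_{\mathrm{eff}} = E_{\mathrm{DH}} = E_0 - e\,\Delta V_{\mathrm{s}}

    where E_0 is the nominal accelerating energy and ΔV_s the (unknown) surface charging potential; quantification then proceeds with E_eff in place of E_0.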

  4. Identification and Quantification of N-Acyl Homoserine Lactones Involved in Bacterial Communication by Small-Scale Synthesis of Internal Standards and Matrix-Assisted Laser Desorption/Ionization Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Leipert, Jan; Treitz, Christian; Leippe, Matthias; Tholey, Andreas

    2017-12-01

    N-acyl homoserine lactones (AHL) are small signal molecules involved in the quorum sensing of many gram-negative bacteria, and play an important role in biofilm formation and pathogenesis. Present analytical methods for identification and quantification of AHL require time-consuming sample preparation steps and are hampered by the lack of appropriate standards. Aiming at a fast and straightforward method for AHL analytics, we investigated the applicability of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Suitable MALDI matrices, including crystalline and ionic liquid matrices, were tested and the fragmentation of different AHL in collision-induced dissociation MS/MS was studied, providing information about characteristic marker fragment ions. Employing small-scale synthesis protocols, we established a versatile and cost-efficient procedure for fast generation of isotope-labeled AHL standards, which can be used without extensive purification and yielded accurate standard curves. Quantitative analysis was possible in the low pico-molar range, with lower limits of quantification ranging from 1 to 5 pmol for different AHL. The developed methodology was successfully applied in a quantitative MALDI MS analysis of low-volume culture supernatants of Pseudomonas aeruginosa.

  5. Quantification of isocyanates and amines in polyurethane foams and coated products by liquid chromatography–tandem mass spectrometry

    PubMed Central

    Mutsuga, Motoh; Yamaguchi, Miku; Kawamura, Yoko

    2014-01-01

    An analytical method for the identification and quantification of 10 different isocyanates and 11 different amines in polyurethane (PUR) foam and PUR-coated products was developed and optimized. Isocyanates were extracted and derivatized with di-n-butylamine, while amines were extracted with methanol. Quantification was subsequently performed by liquid chromatography–tandem mass spectrometry. Using this methodology, residual levels of isocyanates and amines in commercial PUR products were quantified. Although the recoveries of certain isocyanates and amines were low, the main compounds used as monomers in the production of PUR products, and their decomposition species, were clearly identified at quantifiable levels. 2,4- and 2,6-toluenediisocyanate were detected in most PUR foam samples and a pastry bag in the range of 0.02–0.92 mg/kg, with their decomposition compounds, 2,4- and 2,6-toluenediamine, detected in all PUR foam samples in the range of 9.5–59 mg/kg. PUR-coated gloves are manufactured using 4,4′-methylenebisphenyl diisocyanate as the main raw material, and a large amount of this compound, together with 4,4′-methylenedianiline and dicyclohexylmethane-4,4′-diamine, was found in these samples. PMID:24804074

  6. Pressurized liquid extraction combined with capillary electrophoresis-mass spectrometry as an improved methodology for the determination of sulfonamide residues in meat.

    PubMed

    Font, Guillermina; Juan-García, Ana; Picó, Yolanda

    2007-08-03

    A new analytical method, based on capillary electrophoresis and tandem mass spectrometry (CE-MS2), is proposed and validated for the identification and simultaneous quantification of 12 sulfonamides (SAs) in pork meat. The studied SAs include sulfathiazole, sulfadiazine, sulfamethoxypyridazine, sulfaguanidine, sulfanilamide, sulfadimethoxine, sulfapyridine, sulfachloropyridazine, sulfisoxazole, sulfasalazine, sulfabenzamide and sulfadimidine. Different parameters (i.e. separation buffer, sheath liquid, electrospray conditions) were optimized to obtain an adequate CE separation and high MS sensitivity. MS2 experiments using an ion trap as analyzer, operating in the selected reaction monitoring (SRM) mode, were carried out to achieve the required number of identification points according to the 2002/657/EC European Decision. For the quantification in pork tissue samples, a pressurized liquid extraction (PLE) procedure, using hot water as extractant followed by an Oasis HLB cleanup, was developed. Linearity (r between 0.996 and 0.997), precision (RSD < 14%) and recoveries (from 76 to 98%) were satisfactory. The limits of detection and quantification (below 12.5 and 46.5 μg kg(-1), respectively) were in all cases lower than the maximum residue limits (MRLs), indicating the potential of CE-MS2 for the analysis of SAs in the food quality and safety control areas.

  7. Evaluation methodology for flood damage reduction by preliminary water release from hydroelectric dams

    NASA Astrophysics Data System (ADS)

    Ando, T.; Kawasaki, A.; Koike, T.

    2017-12-01

    IPCC AR5 (2014) reported that rainfall in the middle latitudes of the Northern Hemisphere has been increasing since 1901, and it is claimed that a warmer climate will increase the risk of floods. At the same time, world water demand is forecast to exceed sustainable supply by 40 percent by 2030. To avert this expected water shortage, securing new water resources has become an utmost challenge. However, flood risk reduction and securing water resources are conflicting objectives. To address this problem, existing hydroelectric dams can be used not only as energy resources but also for flood control. In Japan, however, hydroelectric dams bear no responsibility for flood control, and the benefits accrued by using them to control floods, namely through preliminary water release, have not been discussed. This paper therefore proposes a methodology for assessing those benefits. The methodology has three stages, as shown in Fig. 1. First, the RRI model is used to model flood events, taking account of the probability of rainfall. Second, flood damage is calculated from the assets in the inundated areas and the inundation depths generated by the RRI model. Third, the losses stemming from preliminary water release are calculated and added to the flood damage to obtain the overall losses. The benefits can then be evaluated by changing the volume of preliminary release. As a result, shown in Fig. 2, the use of hydroelectric dams to control flooding creates benefits of 20 billion yen for the assumed maximum rainfall in the Oi River, Shizuoka Prefecture, Japan, given a three-day-ahead rainfall prediction. The third priority of the Sendai Framework for Disaster Risk Reduction 2015-2030 is `investing in disaster risk reduction for resilience - public and private investment in disaster risk prevention and reduction through structural and non-structural measures'. The accuracy of rainfall prediction is the key factor in maximizing the benefits. Therefore, if the 20 billion yen in benefits accrued by adopting this evaluation methodology are invested in improving rainfall prediction, the accuracy of the forecasts will increase and so will the benefits. This positive feedback loop will benefit society. The results of this study may stimulate further discussion on the role of hydroelectric dams in flood control.
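
    A toy numerical sketch of the three-stage accounting described above (flood damage from inundation depth and exposed assets, plus the cost of the preliminary release) is given below; the depth-damage curve, asset value, and water-value figures are assumptions for illustration, not values from the study.

```python
# Illustrative only (not the authors' model): net benefit of a preliminary
# water release, following the three-stage logic described in the abstract.

def flood_damage(inundation_depth_m, assets_billion_yen):
    """Stage 2: damage = assets in the inundated area x a toy depth-damage ratio."""
    damage_ratio = min(1.0, 0.3 * inundation_depth_m)
    return assets_billion_yen * damage_ratio

def release_loss(release_volume_mcm, yen_per_mcm=0.02):
    """Stage 3: foregone hydropower value of the water released in advance."""
    return release_volume_mcm * yen_per_mcm

assets = 120.0                 # billion yen exposed in the flood plain (assumed)
depth_no_release = 1.8         # m, inundation depth without preliminary release
depth_with_release = 1.1       # m, inundation depth with release (from the flood model)
release_volume = 150.0         # million m^3 released ahead of the storm (assumed)

baseline = flood_damage(depth_no_release, assets)
with_release = flood_damage(depth_with_release, assets) + release_loss(release_volume)
print("benefit of preliminary release: %.1f billion yen" % (baseline - with_release))
```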

  8. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate Partial Differential Equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
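
    For readers unfamiliar with the Richardson Extrapolation and GCI quantities mentioned above, the standard three-grid formulas can be sketched as follows; this is a generic Python illustration (not code from VAVUQ) and the sample solution values are invented.

```python
# Standard three-grid Richardson-extrapolation / GCI formulas (generic sketch,
# not taken from VAVUQ): observed order of accuracy, extrapolated solution,
# and fine-grid Grid Convergence Index for a constant refinement ratio r.
import math

def gci(f_fine, f_med, f_coarse, r=2.0, safety=1.25):
    """Solution verification from three systematically refined grids."""
    p = math.log((f_coarse - f_med) / (f_med - f_fine)) / math.log(r)   # observed order
    f_exact = f_fine + (f_fine - f_med) / (r**p - 1)                    # RE estimate
    e21 = abs((f_med - f_fine) / f_fine)                                # relative error
    gci_fine = safety * e21 / (r**p - 1)                                # fine-grid GCI
    return p, f_exact, gci_fine

# Toy monotonically converging results (e.g., a peak water level) on coarse->fine grids
p, f_exact, g = gci(f_fine=2.015, f_med=2.060, f_coarse=2.240, r=2.0)
print("observed order p=%.2f, extrapolated value=%.3f, GCI=%.2f%%" % (p, f_exact, 100 * g))
```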

  9. Multiplex electrochemical DNA platform for femtomolar-level quantification of genetically modified soybean.

    PubMed

    Manzanares-Palenzuela, C Lorena; de-Los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2015-06-15

    Current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) with a minimum content of 0.9% would benefit from the availability of reliable and rapid methods to detect and quantify DNA sequences specific for GMOs. Different genosensors have been developed to this aim, mainly intended for GMO screening. A remaining challenge, however, is the development of genosensing platforms for GMO quantification, which should be expressed as the number of event-specific DNA sequences per taxon-specific sequences. Here we report a simple and sensitive multiplexed electrochemical approach for the quantification of Roundup-Ready Soybean (RRS). Two DNA sequences, taxon (lectin) and event-specific (RR), are targeted via hybridization onto magnetic beads. Both sequences are simultaneously detected by performing the immobilization, hybridization and labeling steps in a single tube and parallel electrochemical readout. Hybridization is performed in a sandwich format using signaling probes labeled with fluorescein isothiocyanate (FITC) or digoxigenin (Dig), followed by dual enzymatic labeling using Fab fragments of anti-Dig and anti-FITC conjugated to peroxidase or alkaline phosphatase, respectively. Electrochemical measurement of the enzyme activity is finally performed on screen-printed carbon electrodes. The assay gave a linear range of 2-250 pM for both targets, with LOD values of 650 fM (160 amol) and 190 fM (50 amol) for the event-specific and the taxon-specific targets, respectively. Results indicate that the method could be applied for GMO quantification below the European labeling threshold level (0.9%), offering a general approach for the rapid quantification of specific GMO events in foods. Copyright © 2015 Elsevier B.V. All rights reserved.
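
    To make the quantification step concrete: GMO content is expressed as the ratio of event-specific to taxon-specific target concentrations, each obtained from its own calibration curve. The sketch below assumes linear current-versus-concentration calibrations with invented parameters; it is not the published data treatment.

```python
# Assumed illustration (not the published data treatment): GMO content as the
# ratio of event-specific (RR) to taxon-specific (lectin) target concentrations,
# each read from its own electrochemical calibration curve.
import numpy as np

def concentration_from_current(i_na, slope, intercept):
    """Invert a linear calibration current = slope * conc + intercept (conc in pM)."""
    return (i_na - intercept) / slope

# Illustrative calibration parameters for the two channels (within the 2-250 pM range)
rr_conc = concentration_from_current(i_na=41.0, slope=0.80, intercept=2.0)     # event-specific
lectin_conc = concentration_from_current(i_na=152.0, slope=0.75, intercept=3.0)  # taxon-specific

gmo_percent = 100.0 * rr_conc / lectin_conc
print("RR soybean content: %.1f %% of taxon-specific sequences" % gmo_percent)
```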

  10. Feminist methodologies and engineering education research

    NASA Astrophysics Data System (ADS)

    Beddoes, Kacey

    2013-03-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory. The paper begins with a literature review that covers a broad range of topics featured in the literature on feminist methodologies. Next, data from interviews with engineering educators and researchers who have engaged with feminist methodologies are presented. The ways in which feminist methodologies shape their research topics, questions, frameworks of analysis, methods, practices and reporting are each discussed. The challenges and barriers they have faced are then discussed. Finally, the benefits of further and broader engagement with feminist methodologies within the engineering education community are identified.

  11. Intercomparison of mid latitude storm diagnostics (IMILAST)

    NASA Astrophysics Data System (ADS)

    Neu, U.

    2009-04-01

    Diagnostics of observed changes and projections of future changes in extratropical storms are a key issue, e.g. for insurance companies, risk management and adaptation planning. Storm-associated damages are amongst the highest losses due to natural disasters in the mid-latitudes. Therefore, knowledge of the future variability and change in extratropical cyclone frequency, intensity and track locations is crucial for strategic planning and minimization of disaster impacts. Future changes in the total number of storms might be small, but major signals could occur in characteristics of the cyclone life cycle such as intensity, lifetime and track location. The quantification of such trends is not independent of the methodologies for storm track detection applied to observational data and models. Differences in cyclone characteristics obtained by applying different methods to a single data set may be as large as, or even exceed, the differences between results derived from different data sets using a single methodology. Moreover, the metrics used become particularly sensitive, so that scientific studies may report seemingly contradictory results based on the same datasets. For users of storm track analyses and projections the results are very difficult to interpret. Thus, it would be very helpful if the research community provided information in a kind of "handbook" containing definitions and descriptions of the available identification and tracking schemes as well as of the parameters used for the quantification of cyclone activity. It cannot be expected that there is an optimum or standard scheme that fulfills all needs. Rather, a proper knowledge of the advantages and restrictions of different schemes must be obtained in order to provide a synthesis of results rather than puzzling the scientific community and the general public with apparently contradicting statements. The project IMILAST aims at providing a systematic intercomparison of different methodologies and a comprehensive assessment of all types of uncertainties inherent in mid-latitude storm tracking by comparing different methodologies with respect to data of different resolution (time and space) and limited areas, for both cyclone identification and cyclone tracking.

  12. Synchronous Videoconferencing in Distance Education for Pre-Licensure Nursing

    ERIC Educational Resources Information Center

    Scarbrough, John E.

    2015-01-01

    Current nursing education practices typically include methodologies for providing access to students located at a distance from the hosting institution. The majority of methodologies make use of asynchronous formatting in which communication occurs without the benefit of simultaneous, synchronous interaction. The increasing worldwide availability…

  13. Detection of Cyanotoxins During Potable Water Treatment

    USDA-ARS?s Scientific Manuscript database

    In 2007, the U.S. EPA listed three cyanobacterial toxins on the CCL3 containment priority list for potable drinking waters. This paper describes all methodologies used for detection of these toxins, and assesses each on a cost/benefit basis. Methodologies for microcystin, cylindrospermopsin, and a...

  14. Benefits of public roadside safety rest areas in Texas : technical report.

    DOT National Transportation Integrated Search

    2011-05-01

    The objective of this investigation was to develop a benefit-cost analysis methodology for safety rest areas in : Texas and to demonstrate its application in select corridors throughout the state. In addition, this project : considered novel safety r...

  15. Integrating public risk perception into formal natural hazard risk assessment

    NASA Astrophysics Data System (ADS)

    Plattner, Th.; Plapp, T.; Hebel, B.

    2006-06-01

    An urgent need to take perception into account for risk assessment has been pointed out by relevant literature; its impact in terms of risk-related behaviour by individuals is obvious. This study represents an effort to overcome the broadly discussed question of whether risk perception is quantifiable or not by proposing a still simple but applicable methodology. A novel approach is elaborated to obtain a more accurate and comprehensive quantification of risk in comparison to present formal risk evaluation practice. A consideration of relevant factors enables an explicit quantification of individual risk perception and evaluation. The model approach integrates the effective individual risk r_eff and a weighted mean of relevant perception affecting factors PAF. The relevant PAF cover voluntariness of risk-taking, individual reducibility of risk, knowledge and experience, endangerment, subjective damage rating and subjective recurrence frequency perception. The approach assigns an individual weight to each PAF to represent its impact magnitude. The quantification of these weights is target-group-dependent (e.g. experts, laypersons) and may be carried out using psychometric methods. The novel approach is subject to a plausibility check using data from an expert workshop. A first model application is conducted by means of data from an empirical risk perception study in Western Germany to deduce PAF and weight quantification as well as to confirm and evaluate model applicability and flexibility. Main fields of application will be a quantification of risk perception by individual persons in a formal and technical way, e.g. for the purpose of risk communication issues in illustrating differing perspectives of experts and non-experts. For decision making processes this model will have to be applied with caution, since it is by definition not designed to quantify risk acceptance or risk evaluation. The approach may well explain how risk perception differs, but not why it differs. The formal model generates only "snapshots" and considers neither the socio-cultural nor the historical context of risk perception, since it is a highly individualistic and non-contextual approach.
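
    One plausible reading (an assumption, since the abstract specifies the ingredients but not the exact functional form) of how the effective risk and the weighted PAF mean could be combined is:

```latex
% Assumed functional form (not stated explicitly in the abstract): perceived risk
% as the effective individual risk r_eff scaled by the weighted mean of the
% perception affecting factors PAF_i with target-group-dependent weights w_i.
r_{\mathrm{perceived}} \;=\; r_{\mathrm{eff}} \cdot
  \frac{\sum_{i=1}^{n} w_i \,\mathrm{PAF}_i}{\sum_{i=1}^{n} w_i}
```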

  16. Competitive Reporter Monitored Amplification (CMA) - Quantification of Molecular Targets by Real Time Monitoring of Competitive Reporter Hybridization

    PubMed Central

    Ullrich, Thomas; Ermantraut, Eugen; Schulz, Torsten; Steinmetzer, Katrin

    2012-01-01

    Background: State of the art molecular diagnostic tests are based on the sensitive detection and quantification of nucleic acids. However, currently established diagnostic tests are characterized by elaborate and expensive technical solutions hindering the development of simple, affordable and compact point-of-care molecular tests. Methodology and Principal Findings: The described competitive reporter monitored amplification allows the simultaneous amplification and quantification of multiple nucleic acid targets by polymerase chain reaction. Target quantification is accomplished by real-time detection of amplified nucleic acids utilizing a capture probe array and specific reporter probes. The reporter probes are fluorescently labeled oligonucleotides that are complementary to the respective capture probes on the array and to the respective sites of the target nucleic acids in solution. Capture probes and amplified target compete for reporter probes. Increasing amplicon concentration leads to a decreased fluorescence signal at the respective capture probe position on the array, which is measured after each cycle of amplification. In order to observe reporter probe hybridization in real time without any additional washing steps, we have developed a mechanical fluorescence background displacement technique. Conclusions and Significance: The system presented in this paper enables simultaneous detection and quantification of multiple targets. Moreover, the presented fluorescence background displacement technique provides a generic solution for real-time monitoring of binding events of fluorescently labelled ligands to surface-immobilized probes. With the model assay for the detection of human immunodeficiency virus type 1 and 2 (HIV 1/2), we have been able to observe the amplification kinetics of five targets simultaneously and accommodate two additional hybridization controls with a simple instrument set-up. The ability to accommodate multiple controls and targets into a single assay and to perform the assay on simple and robust instrumentation is a prerequisite for the development of novel molecular point-of-care tests. PMID:22539973

  17. Simultaneous quantification of the viral antigens hemagglutinin and neuraminidase in influenza vaccines by LC-MSE.

    PubMed

    Creskey, Marybeth C; Li, Changgui; Wang, Junzhi; Girard, Michel; Lorbetskie, Barry; Gravel, Caroline; Farnsworth, Aaron; Li, Xuguang; Smith, Daryl G S; Cyr, Terry D

    2012-07-06

    Current methods for quality control of inactivated influenza vaccines prior to regulatory approval include determining the hemagglutinin (HA) content by single radial immunodiffusion (SRID), verifying neuraminidase (NA) enzymatic activity, and demonstrating that the levels of the contaminant protein ovalbumin are below a set threshold of 1 μg/dose. The SRID assays require the availability of strain-specific reference HA antigens and antibodies, the production of which is a potential rate-limiting step in vaccine development and release, particularly during a pandemic. Immune responses induced by neuraminidase also contribute to protection from infection; however, the amounts of NA antigen in influenza vaccines are currently not quantified or standardized. Here, we report a method for vaccine analysis that yields simultaneous quantification of HA and NA levels much more rapidly than conventional HA quantification techniques, while providing additional valuable information on the total protein content. Enzymatically digested vaccine proteins were analyzed by LC-MS(E), a mass spectrometric technology that allows absolute quantification of analytes, including the HA and NA antigens, other structural influenza proteins and chicken egg proteins associated with the manufacturing process. This method has potential application for increasing the accuracy of reference antigen standards and for validating label claims for HA content in formulated vaccines. It can also be used to monitor NA and chicken egg protein content in order to monitor manufacturing consistency. While this is a useful methodology with potential for broad application, we also discuss herein some of the inherent limitations of this approach and the care and caution that must be taken in its use as a tool for absolute protein quantification. The variations in HA, NA and chicken egg protein concentrations in the vaccines analyzed in this study are indicative of the challenges associated with the current manufacturing and quality control testing procedures. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  18. Methodological Approaches for Estimating the Benefits and Costs of Smart Grid Demonstration Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Russell

    This report presents a comprehensive framework for estimating the benefits and costs of Smart Grid projects and a step-by-step approach for making these estimates. The framework identifies the basic categories of benefits, the beneficiaries of these benefits, and the Smart Grid functionalities that lead to different benefits and proposes ways to estimate these benefits, including their monetization. The report covers cost-effectiveness evaluation, uncertainty, and issues in estimating baseline conditions against which a project would be compared. The report also suggests metrics suitable for describing principal characteristics of a modern Smart Grid to which a project can contribute. This first section of the report presents background information on the motivation for the report and its purpose. Section 2 introduces the methodological framework, focusing on the definition of benefits and a sequential, logical process for estimating them. Beginning with the Smart Grid technologies and functions of a project, it maps these functions to the benefits they produce. Section 3 provides a hypothetical example to illustrate the approach. Section 4 describes each of the 10 steps in the approach. Section 5 covers issues related to estimating benefits of the Smart Grid. Section 6 summarizes the next steps. The methods developed in this study will help improve future estimates - both retrospective and prospective - of the benefits of Smart Grid investments. These benefits, including those to consumers, society in general, and utilities, can then be weighed against the investments. Such methods would be useful in total resource cost tests and in societal versions of such tests. As such, the report will be of interest not only to electric utilities, but also to a broad constituency of stakeholders. Significant aspects of the methodology were used by the U.S. Department of Energy (DOE) to develop its methods for estimating the benefits and costs of its renewable and distributed systems integration demonstration projects as well as its Smart Grid Investment Grant projects and demonstration projects funded under the American Recovery and Reinvestment Act (ARRA). The goal of this report, which was cofunded by the Electric Power Research Institute (EPRI) and DOE, is to present a comprehensive set of methods for estimating the benefits and costs of Smart Grid projects. By publishing this report, EPRI seeks to contribute to the development of methods that will establish the benefits associated with investments in Smart Grid technologies. EPRI does not endorse the contents of this report or make any representations as to the accuracy and appropriateness of its contents. The purpose of this report is to present a methodological framework that will provide a standardized approach for estimating the benefits and costs of Smart Grid demonstration projects. The framework also has broader application to larger projects, such as those funded under the ARRA. Moreover, with additional development, it will provide the means for extrapolating the results of pilots and trials to at-scale investments in Smart Grid technologies. The framework was developed by a panel whose members provided a broad range of expertise.

  19. Social and Economic Benefits of Improved Adult Literacy: Towards a Better Understanding. An Adult Literacy National Project Report

    ERIC Educational Resources Information Center

    Hartley, Robyn; Horne, Jackie

    2006-01-01

    Assessing the social and economic costs of poor adult literacy and numeracy skills, and the benefits of investing in such skills, is largely unchartered territory in Australia. This feasibility study explores the frameworks and methodologies available for determining and measuring such benefits and costs across a number of life domains, including…

  20. An Attempt to Quantify the Economic Benefits of Scientific Research, Science Policy Studies No. 4.

    ERIC Educational Resources Information Center

    Byatt, I. C. R.; Cohen, A. V.

    This paper presents a possible methodology for measuring and predicting the future course of the long-range economic benefits of "curiosity-oriented" research. The basic premise is that much pure research tends to give rise to major industries in about one generation. Each industry will have some total economic benefit which can be…

  1. Expectations for methodology and translation of animal research: a survey of health care workers.

    PubMed

    Joffe, Ari R; Bara, Meredith; Anton, Natalie; Nobis, Nathan

    2015-05-07

    Health care workers (HCW) often perform, promote, and advocate use of public funds for animal research (AR); therefore, an awareness of the empirical costs and benefits of animal research is an important issue for HCW. We aimed to determine what HCW consider acceptable standards of AR methodology and translation rates to humans. After development and validation, an e-mail survey was sent to all pediatricians and pediatric intensive care unit nurses and respiratory therapists (RTs) affiliated with a Canadian university. We presented questions about demographics, methodology of AR, and expectations from AR. Responses of pediatricians and nurses/RTs were compared using Chi-square, with P < .05 considered significant. The response rate was 44/114 (39%) for pediatricians and 69/120 (58%) for nurses/RTs. Asked about methodological quality, most respondents expect that: AR is done to high quality; costs and difficulty are not acceptable justifications for low quality; findings should be reproducible between laboratories and strains of the same species; and guidelines for AR funded with public money should be consistent with these expectations. Asked about benefits of AR, most thought that there are sometimes/often large benefits to humans from AR, and disagreed that "AR rarely produces benefit to humans." Asked about expectations of translation to humans (of toxicity, carcinogenicity, teratogenicity, and treatment findings), most: expect translation >40% of the time; thought that misleading AR results should occur <21% of the time; and that if translation was to occur <20% of the time, they would be less supportive of AR. There were few differences between pediatricians and nurses/RTs. HCW have high expectations for the methodological quality of, and the translation rate to humans of, findings from AR. These expectations are higher than what the empirical data show has been achieved. Unless these areas of AR significantly improve, HCW support of AR may be tenuous.

  2. Using a relative health indicator (RHI) metric to estimate health risk reductions in drinking water.

    PubMed

    Alfredo, Katherine A; Seidel, Chad; Ghosh, Amlan; Roberson, J Alan

    2017-03-01

    When a new drinking water regulation is being developed, the USEPA conducts a health risk reduction and cost analysis to, in part, estimate the quantifiable and non-quantifiable costs and benefits of the various regulatory alternatives. Numerous methodologies are available for cumulative risk assessment, ranging from primarily qualitative to primarily quantitative. This research developed a summary metric of relative cumulative health impacts resulting from drinking water, the relative health indicator (RHI). An intermediate level of quantification and modeling was chosen, one which retains the concept of an aggregated metric of public health impact and hence allows for comparisons to be made across "cups of water," but avoids the need for development and use of complex models that are beyond the existing state of the science. Using the USEPA Six-Year Review data and available national occurrence surveys of drinking water contaminants, the metric is used to test risk reduction as it pertains to the implementation of the arsenic and uranium maximum contaminant levels and to quantify "meaningful" risk reduction. Uranium represented the threshold risk reduction against which national non-compliance risk reduction was compared for arsenic, nitrate, and radium. Arsenic non-compliance is the most significant, and efforts focused on bringing those non-compliant utilities into compliance with the 10 μg/L maximum contaminant level would meet the threshold for meaningful risk reduction.

  3. Malaria surveys using rapid diagnostic tests and validation of results using post hoc quantification of Plasmodium falciparum histidine-rich protein 2.

    PubMed

    Plucinski, Mateusz; Dimbu, Rafael; Candrinho, Baltazar; Colborn, James; Badiane, Aida; Ndiaye, Daouda; Mace, Kimberly; Chang, Michelle; Lemoine, Jean F; Halsey, Eric S; Barnwell, John W; Udhayakumar, Venkatachalam; Aidoo, Michael; Rogier, Eric

    2017-11-07

    Rapid diagnostic test (RDT) positivity is supplanting microscopy as the standard measure of malaria burden at the population level. However, there is currently no standard for externally validating RDT results from field surveys. Individuals' blood concentrations of the Plasmodium falciparum histidine-rich protein 2 (HRP2) were compared to the results of HRP2-detecting RDTs in participants from field surveys in Angola, Mozambique, Haiti, and Senegal. A logistic regression model was used to estimate the HRP2 concentrations corresponding to the 50% and 90% levels of detection (LOD) specific to each survey. There was a sigmoidal dose-response relationship between HRP2 concentration and RDT positivity for all surveys. Variation was noted in estimates for field RDT sensitivity, with the 50% LOD ranging between 0.076 and 6.1 ng/mL and the 90% LOD ranging between 1.1 and 53 ng/mL. Surveys conducted in two different provinces of Angola using the same brand of RDT and the same study methodology showed a threefold difference in LOD. Measures of malaria prevalence estimated using population RDT positivity should be interpreted in the context of potentially large variation in RDT LODs between, and even within, surveys. Surveys based on RDT positivity would benefit from external validation of field RDT results by comparing RDT positivity and antigen concentration.
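
    A hedged sketch of the survey-specific LOD estimation described above: fit a logistic regression of RDT positivity against log-transformed HRP2 concentration and invert the fitted curve at 50% and 90% positivity. The data below are simulated, and the exact model specification is an assumption rather than the authors' code.

```python
# Assumed illustration: logistic regression of field-RDT positivity against
# log10 HRP2 concentration, inverted to report the concentrations detected
# 50% and 90% of the time (LOD50 / LOD90). Simulated data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
log_hrp2 = rng.uniform(-2, 3, 500)                   # log10 ng/mL, simulated survey values
p_true = 1 / (1 + np.exp(-3.0 * (log_hrp2 - 0.5)))   # true dose-response (LOD50 ~ 3.2 ng/mL)
rdt_positive = rng.binomial(1, p_true)

model = LogisticRegression().fit(log_hrp2.reshape(-1, 1), rdt_positive)
b0, b1 = model.intercept_[0], model.coef_[0, 0]

def lod(prob):
    """HRP2 concentration (ng/mL) at which a fraction `prob` of RDTs is positive."""
    return 10 ** ((np.log(prob / (1 - prob)) - b0) / b1)

print("LOD50 ~ %.2f ng/mL, LOD90 ~ %.2f ng/mL" % (lod(0.5), lod(0.9)))
```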

  4. Robotic voltammetry with carbon nanotube-based sensors: a superb blend for convenient high-quality antimicrobial trace analysis

    PubMed Central

    Theanponkrang, Somjai; Suginta, Wipa; Weingart, Helge; Winterhalter, Mathias; Schulte, Albert

    2015-01-01

    A new automated pharmacoanalytical technique for convenient quantification of redox-active antibiotics has been established by combining the benefits of a carbon nanotube (CNT) sensor modification with electrocatalytic activity for analyte detection with the merits of a robotic electrochemical device that is capable of sequential nonmanual sample measurements in 24-well microtiter plates. Norfloxacin (NFX) and ciprofloxacin (CFX), two standard fluoroquinolone antibiotics, were used in automated calibration measurements by differential pulse voltammetry (DPV), achieving linear ranges of 1–10 μM and 2–100 μM for NFX and CFX, respectively. The lowest detectable levels were estimated to be 0.3±0.1 μM (n=7) for NFX and 1.6±0.1 μM (n=7) for CFX. In standard solutions or tablet samples of known content, both analytes could be quantified with the robotic DPV microtiter plate assay, with recoveries within ±4% of 100%. Recoveries were equally good when NFX was evaluated in human serum samples spiked with NFX. The use of simple instrumentation, convenience in execution, and high effectiveness in analyte quantitation suggest the merger of automated microtiter plate voltammetry and CNT-supported electrochemical drug detection as a novel methodology for antibiotic testing in pharmaceutical and clinical research and quality control laboratories. PMID:25670899

  5. Asynchronous versus Synchronous Learning in Pharmacy Education

    ERIC Educational Resources Information Center

    Motycka, Carol A.; St. Onge, Erin L.; Williams, Jennifer

    2013-01-01

    Objective: To better understand the technology being used today in pharmacy education through a review of the current methodologies being employed at various institutions. Also, to discuss the benefits and difficulties of asynchronous and synchronous methodologies, which are being utilized at both traditional and distance education campuses.…

  6. Novel integrated nondestructive testing methodology for detection and evaluation of corrosion in cement-based materials.

    DOT National Transportation Integrated Search

    2014-06-01

    The objective of this project focused on the development of a hybrid nondestructive testing and evaluation (NDT&E) methodology that combines the benefits of microwave NDT and thermography into one new technique. In this way, unique features of both N...

  7. The ICA Communication Audit and Perceived Communication Effectiveness Changes in 16 Audited Organizations.

    ERIC Educational Resources Information Center

    Brooks, Keith; And Others

    1979-01-01

    Discusses the benefits of the International Communication Association Communication Audit as a methodology for evaluation of organizational communication processes and outcomes. An "after" survey of 16 audited organizations confirmed the audit as a valid diagnostic methodology and organization development intervention technique which…

  8. Quantification of localized vertebral deformities using a sparse wavelet-based shape model.

    PubMed

    Zewail, R; Elsafi, A; Durdle, N

    2008-01-01

    Medical experts often examine hundreds of spine x-ray images to determine the existence of various pathologies. Common pathologies of interest are anterior osteophytes, disc space narrowing, and wedging. By careful inspection of the outline shapes of the vertebral bodies, experts are able to identify and assess vertebral abnormalities with respect to the pathology under investigation. In this paper, we present a novel method for quantification of vertebral deformation using a sparse shape model. Using wavelets and independent component analysis (ICA), we construct a sparse shape model that benefits from the approximation power of wavelets and the capability of ICA to capture higher order statistics in wavelet space. The new model is able to capture localized pathology-related shape deformations, hence it allows for quantification of vertebral shape variations. We investigate the capability of the model to predict localized pathology-related deformations. Next, using support-vector machines, we demonstrate the diagnostic capabilities of the method through the discrimination of anterior osteophytes in lumbar vertebrae. Experiments were conducted using a set of 150 contours from digital x-ray images of the lumbar spine. Each vertebra is labeled as normal or abnormal. Results reported in this work focus on anterior osteophytes as the pathology of interest.
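
    A rough, assumed outline of such a pipeline (wavelet coefficients of the contour, ICA for a sparse representation, SVM for discrimination) is sketched below on synthetic contours; it is not the authors' implementation, and all parameters and shapes are illustrative.

```python
# Assumed sketch (not the published pipeline): wavelet coefficients of vertebral
# contours -> ICA for a sparse shape representation -> SVM discrimination of
# normal versus osteophyte-like shapes, on synthetic radial contours.
import numpy as np
import pywt
from sklearn.decomposition import FastICA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_points = 64                                   # landmarks per vertebral contour

def synthetic_contour(abnormal):
    t = np.linspace(0, 2 * np.pi, n_points, endpoint=False)
    r = 1.0 + 0.05 * rng.standard_normal(n_points)
    if abnormal:                                # local anterior bump mimicking an osteophyte
        r += 0.25 * np.exp(-((t - 0.4) ** 2) / 0.02)
    return r                                    # radial representation of the outline

labels = np.array([0] * 75 + [1] * 75)
contours = np.array([synthetic_contour(bool(y)) for y in labels])

# Wavelet coefficients capture localized deformation; ICA yields sparse modes
coeffs = np.array([np.concatenate(pywt.wavedec(c, "db4", level=3)) for c in contours])
features = FastICA(n_components=10, random_state=0).fit_transform(coeffs)

scores = cross_val_score(SVC(kernel="rbf"), features, labels, cv=5)
print("cross-validated accuracy: %.2f" % scores.mean())
```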

  9. Gebiss: an ImageJ plugin for the specification of ground truth and the performance evaluation of 3D segmentation algorithms

    PubMed Central

    2011-01-01

    Background: Image segmentation is a crucial step in quantitative microscopy that helps to define regions of tissues, cells or subcellular compartments. Depending on the degree of user interaction, segmentation methods can be divided into manual, automated or semi-automated approaches. 3D image stacks usually require automated methods due to their large number of optical sections. However, certain applications benefit from manual or semi-automated approaches. Scenarios include the quantification of 3D images with poor signal-to-noise ratios or the generation of so-called ground truth segmentations that are used to evaluate the accuracy of automated segmentation methods. Results: We have developed Gebiss, an ImageJ plugin for the interactive segmentation, visualisation and quantification of 3D microscopic image stacks. We integrated a variety of existing plugins for threshold-based segmentation and volume visualisation. Conclusions: We demonstrate the application of Gebiss to the segmentation of nuclei in live Drosophila embryos and the quantification of neurodegeneration in Drosophila larval brains. Gebiss was developed as a cross-platform ImageJ plugin and is freely available on the web at http://imaging.bii.a-star.edu.sg/projects/gebiss/. PMID:21668958

  10. Spectrometric microbiological analyzer

    NASA Astrophysics Data System (ADS)

    Schlager, Kenneth J.; Meissner, Ken E.

    1996-04-01

    Currently, there are four general approaches to microbiological analysis, i.e., the detection, identification and quantification of micro-organisms: (1) Traditional culturing and staining procedures, metabolic fermentations and visual morphological characteristics; (2) Immunological approaches employing microbe-specific antibodies; (3) Biotechnical techniques employing DNA probes and related genetic engineering methods; and (4) Physical measurement techniques based on the biophysical properties of micro-organisms. This paper describes an instrumentation development in the fourth of the above categories, physical measurement, that uses a combination of fluorometric and light scatter spectra to detect and identify micro-organisms at the species level. A major advantage of this approach is the rapid turnaround possible in medical diagnostic or water testing applications. Fluorometric spectra serve to define the biochemical characteristics of the microbe, and light scatter spectra the size and shape morphology. Together, the two spectra define a 'fingerprint' for each species of microbe for detection, identification and quantification purposes. A prototype instrument has been developed and tested under NASA sponsorship based on fluorometric spectra alone. This instrument demonstrated identification and quantification capabilities at the species level. The paper reports on test results using this instrument, and the benefits of employing a combination of fluorometric and light scatter spectra.

  11. Development of a CZE method for the quantification of pseudoephedrine and cetirizine.

    PubMed

    Alnajjar, Ahmed O; Idris, Abubakr M

    2014-10-01

    Pseudoephedrine and cetirizine have been combined in dosage forms with more therapeutic benefits when compared with single-drug treatment. The current manuscript reports the development of the first capillary zone electrophoresis (CZE) assay method for that combination. The effects of pH and buffer concentration on resolution, noise, migration time and peak area were examined employing experimental design approach. The analytes were electropherographed into a 50.2 cm-long and 50 µm i.d. fused-silica capillary column using 10 mmol/L borate at pH 8.3 with a potential of 25 kV at 25°C and UV detection at 214 nm. The method was successfully validated in order to verify its suitability for pharmaceutical analysis for the purposes of quality control. Over previous high-performance liquid chromatographic methods, the current CZE method features the benefits of the use of cost-effective electrolyte, besides high sample throughput (11 samples/h). Furthermore, other analytical results including linear dynamic ranges, recovery (96.9-98.1%), intra- and interday precision (relative standard deviation ≤ 1.70%) as well as the limits of detection and quantification (≤2.65 µg/mL) were all satisfactory for the intended purpose. © The Author [2013]. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. Valuing the Ozone-Related Health Benefits of Methane Emission Controls

    EPA Science Inventory

    A recently published paper presented a range of estimates of the monetized ozone-related mortality benefits of reducing methane emissions (Sarofim et al. 2015). This peer review regards the application of the Sarofim et al. methodology to regulatory impact assessments. Sarofim...

  13. Electromagnetomechanical elastodynamic model for Lamb wave damage quantification in composites

    NASA Astrophysics Data System (ADS)

    Borkowski, Luke; Chattopadhyay, Aditi

    2014-03-01

    Physics-based wave propagation computational models play a key role in structural health monitoring (SHM) and the development of improved damage quantification methodologies. Guided waves (GWs), such as Lamb waves, provide the capability to monitor large plate-like aerospace structures with limited actuators and sensors and are sensitive to small scale damage; however due to the complex nature of GWs, accurate and efficient computation tools are necessary to investigate the mechanisms responsible for dispersion, coupling, and interaction with damage. In this paper, the local interaction simulation approach (LISA) coupled with the sharp interface model (SIM) solution methodology is used to solve the fully coupled electro-magneto-mechanical elastodynamic equations for the piezoelectric and piezomagnetic actuation and sensing of GWs in fiber reinforced composite material systems. The final framework provides the full three-dimensional displacement as well as electrical and magnetic potential fields for arbitrary plate and transducer geometries and excitation waveform and frequency. The model is validated experimentally and proven computationally efficient for a laminated composite plate. Studies are performed with surface bonded piezoelectric and embedded piezomagnetic sensors to gain insight into the physics of experimental techniques used for SHM. The symmetric collocation of piezoelectric actuators is modeled to demonstrate mode suppression in laminated composites for the purpose of damage detection. The effect of delamination and damage (i.e., matrix cracking) on the GW propagation is demonstrated and quantified. The developed model provides a valuable tool for the improvement of SHM techniques due to its proven accuracy and computational efficiency.

  14. In vivo quantification of lead in bone with a portable x-ray fluorescence system--methodology and feasibility.

    PubMed

    Nie, L H; Sanchez, S; Newton, K; Grodzins, L; Cleveland, R O; Weisskopf, M G

    2011-02-07

    This study was conducted to investigate the methodology and feasibility of developing a portable x-ray fluorescence (XRF) technology to quantify lead (Pb) in bone in vivo. A portable XRF device was set up and optimal settings of voltage, current, and filter combination for bone lead quantification were selected to achieve the lowest detection limit. The minimum radiation dose delivered to the subject was calculated by Monte Carlo simulations. An ultrasound device was used to measure soft tissue thickness to account for signal attenuation, and an alternative method to obtain soft tissue thickness from the XRF spectrum was developed and shown to be equivalent to the ultrasound measurements (intraclass correlation coefficient, ICC = 0.82). We tested the correlation of in vivo bone lead concentrations between the standard KXRF technology and the portable XRF technology. There was a significant correlation between the bone lead concentrations obtained from the standard KXRF technology and those obtained from the portable XRF technology (ICC = 0.65). The detection limit for the portable XRF device was about 8.4 ppm with 2 mm soft tissue thickness. The entrance skin dose delivered to the human subject was about 13 mSv and the total body effective dose was about 1.5 µSv and should pose minimal radiation risk. In conclusion, portable XRF technology can be used for in vivo bone lead measurement with sensitivity comparable to the KXRF technology and good correlation with KXRF measurements.

  15. In Vivo Quantification of Lead in Bone with a Portable X-ray Fluorescence (XRF) System – Methodology and Feasibility

    PubMed Central

    Nie, LH; Sanchez, S; Newton, K; Grodzins, L; Cleveland, RO; Weisskopf, MG

    2013-01-01

    This study was conducted to investigate the methodology and feasibility of developing a portable XRF technology to quantify lead (Pb) in bone in vivo. A portable XRF device was set up and optimal setting of voltage, current, and filter combination for bone lead quantification were selected to achieve the lowest detection limit. The minimum radiation dose delivered to the subject was calculated by Monte Carlo simulations. An ultrasound device was used to measure soft tissue thickness to account for signal attenuation, and an alternative method to obtain soft tissue thickness from the XRF spectrum was developed and shown to be equivalent to the ultrasound measurements (Intraclass Correlation Coefficient, ICC=0.82). We tested the correlation of in vivo bone lead concentrations between the standard KXRF technology and the portable XRF technology. There was a significant correlation between the bone lead concentrations obtained from the standard KXRF technology and those obtained from the portable XRF technology (ICC=0.65). The detection limit for the portable XRF device was about 8.4 ppm with 2 mm soft tissue thickness. The entrance skin dose delivered to the human subject was about 13 mSv and the total body effective dose was about 1.5 μSv and should pose a minimal radiation risk. In conclusion, portable XRF technology can be used for in vivo bone lead measurement with sensitivity comparable to the KXRF technology and good correlation with KXRF measurements. PMID:21242629

  16. Determination of tocopherols and sitosterols in seeds and nuts by QuEChERS-liquid chromatography.

    PubMed

    Delgado-Zamarreño, M Milagros; Fernández-Prieto, Cristina; Bustamante-Rangel, Myriam; Pérez-Martín, Lara

    2016-02-01

    In the present work a simple, reliable and affordable sample treatment method for the simultaneous analysis of tocopherols and free phytosterols in nuts was developed. Analyte extraction was carried out using the QuEChERS methodology and analyte separation and detection were accomplished using HPLC-DAD. The use of this methodology for the extraction of natural occurring substances provides advantages such as speed, simplicity and ease of use. The parameters evaluated for the validation of the method developed included the linearity of the calibration plots, the detection and quantification limits, repeatability, reproducibility and recovery. The proposed method was successfully applied to the analysis of tocopherols and free phytosterols in samples of almonds, cashew nuts, hazelnuts, peanuts, tiger nuts, sun flower seeds and pistachios. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Identification and quantification of ethyl carbamate occurring in urea complexation processes commonly utilized for polyunsaturated fatty acid concentration.

    PubMed

    Vázquez, Luis; Prados, Isabel M; Reglero, Guillermo; Torres, Carlos F

    2017-08-15

    The concentration of polyunsaturated fatty acids by formation of urea adducts from three different sources was studied to elucidate the formation of ethyl carbamates in the course of these procedures. Two different methodologies were performed: with ethanol at high temperature and with hexane/ethanol mixtures at room temperature. It was proved that the amount of urethanes generated at high temperature was higher than at room temperature. In addition, subsequent washing steps of the PUFA fraction with water were efficient in removing the urethanes from the final products. The methodology at room temperature with 0.4 mL ethanol and 3 g urea provided a good relationship between concentration and yield of the main bioactive PUFA, with the lowest formation of ethyl carbamates in the process. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Airport emissions quantification: Impacts of electrification. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geba, V.

    1998-07-01

    Four airports were assessed to demonstrate that electrification of economically viable air- and land-side vehicles and equipment can significantly reduce total airport emissions. Assessments were made using the FAA's Emissions and Dispersion Modeling System and EPRI Airport Electrification Project data. Development and implementation of cost-effective airport emissions reduction strategies can be complex, requiring successful collaboration of local, state, and federal regulatory agencies with airport authorities. The methodology developed in this study helps to simplify this task. The objectives of this study were: to develop a methodology to quantify annual emissions at US airports from all sources--aircraft, vehicles, and infrastructure; and to demonstrate that electrification of economically viable air- and land-side vehicles and equipment can significantly reduce total airport emissions on-site, even when allowing for emissions from the generation of electricity.

  19. Determination of boron in uranium aluminum silicon alloy by spectrophotometry and estimation of expanded uncertainty in measurement

    NASA Astrophysics Data System (ADS)

    Ramanjaneyulu, P. S.; Sayi, Y. S.; Ramakumar, K. L.

    2008-08-01

    Quantification of boron in diverse materials of relevance in nuclear technology is essential in view of its high thermal neutron absorption cross section. A simple and sensitive method has been developed for the determination of boron in uranium-aluminum-silicon alloy, based on leaching of boron with 6 M HCl and H2O2, its selective separation by solvent extraction with 2-ethylhexane-1,3-diol and quantification by spectrophotometry using curcumin. The method has been evaluated by the standard addition method and validated by inductively coupled plasma-atomic emission spectroscopy. The relative standard deviation and absolute detection limit of the method are 3.0% (at the 1σ level) and 12 ng, respectively. All possible sources of uncertainties in the methodology have been individually assessed, following the International Organization for Standardization guidelines. The combined uncertainty is calculated employing uncertainty propagation formulae. The expanded uncertainty in the measurement at the 95% confidence level (coverage factor 2) is 8.840%.
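
    For reference, the combination and expansion steps referred to above take the generic ISO-GUM form below; this is the general formula, not the paper's specific uncertainty budget.

```latex
% Generic ISO-GUM combination and expansion (not the paper's specific budget):
% combined standard uncertainty by propagation of the individual contributions
% u(x_i), then expansion with coverage factor k = 2 (~95% confidence).
u_c(y) \;=\; \sqrt{\sum_{i=1}^{n} \left(\frac{\partial f}{\partial x_i}\right)^{2} u^{2}(x_i)},
\qquad U \;=\; k\, u_c(y), \quad k = 2
```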

  20. Performance of uncertainty quantification methodologies and linear solvers in cardiovascular simulations

    NASA Astrophysics Data System (ADS)

    Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison

    2017-11-01

    Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from National Institute of Health (R01 EB018302) is greatly appreciated.

  1. In vivo Magnetic Resonance Spectroscopy of cerebral glycogen metabolism in animals and humans.

    PubMed

    Khowaja, Ameer; Choi, In-Young; Seaquist, Elizabeth R; Öz, Gülin

    2015-02-01

    Glycogen serves as an important energy reservoir in the human body. Despite the abundance of glycogen in the liver and skeletal muscles, its concentration in the brain is relatively low, hence its significance has been questioned. A major challenge in studying brain glycogen metabolism has been the lack of availability of non-invasive techniques for quantification of brain glycogen in vivo. Invasive methods for brain glycogen quantification such as post mortem extraction following high energy microwave irradiation are not applicable in the human brain. With the advent of (13)C Magnetic Resonance Spectroscopy (MRS), it has been possible to measure brain glycogen concentrations and turnover in physiological conditions, as well as under the influence of stressors such as hypoglycemia and visual stimulation. This review presents an overview of the principles of the (13)C MRS methodology and its applications in both animals and humans to further our understanding of glycogen metabolism under normal physiological and pathophysiological conditions such as hypoglycemia unawareness.

  2. Simple and clean determination of tetracyclines by flow injection analysis

    NASA Astrophysics Data System (ADS)

    Rodríguez, Michael Pérez; Pezza, Helena Redigolo; Pezza, Leonardo

    2016-01-01

    An environmentally reliable analytical methodology was developed for direct quantification of tetracycline (TC) and oxytetracycline (OTC) using continuous flow injection analysis with spectrophotometric detection. The method is based on the diazo coupling reaction between the tetracyclines and diazotized sulfanilic acid in a basic medium, resulting in the formation of an intense orange azo compound that presents maximum absorption at 434 nm. Experimental design was used to optimize the analytical conditions. The proposed technique was validated over the concentration range of 1 to 40 μg mL-1, and was successfully applied to samples of commercial veterinary pharmaceuticals. The detection (LOD) and quantification (LOQ) limits were 0.40 and 1.35 μg mL-1, respectively. The samples were also analyzed by an HPLC method, and the results showed agreement with the proposed technique. The new flow injection method can be immediately used for quality control purposes in the pharmaceutical industry, facilitating monitoring in real time during the production processes of tetracycline formulations for veterinary use.

  3. Quantitative evaluation of multiple adulterants in roasted coffee by Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS) and chemometrics.

    PubMed

    Reis, Nádia; Franca, Adriana S; Oliveira, Leandro S

    2013-10-15

    The current study presents an application of Diffuse Reflectance Infrared Fourier Transform Spectroscopy for detection and quantification of fraudulent addition of commonly employed adulterants (spent coffee grounds, coffee husks, roasted corn and roasted barley) to roasted and ground coffee. Roasted coffee samples were intentionally blended with the adulterants (pure and mixed), with total adulteration levels ranging from 1% to 66% w/w. Partial Least Squares Regression (PLS) was used to relate the processed spectra to the mass fraction of adulterants and the model obtained provided reliable predictions of adulterations at levels as low as 1% w/w. A robust methodology was implemented that included the detection of outliers. High correlation coefficients (0.99 for calibration; 0.98 for validation) coupled with low degrees of error (1.23% for calibration; 2.67% for validation) confirmed that DRIFTS can be a valuable analytical tool for detection and quantification of adulteration in ground, roasted coffee. Copyright © 2013 Elsevier B.V. All rights reserved.
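
    A minimal sketch of the PLS calibration strategy described above is given below, using synthetic spectra in place of DRIFTS data; the component count, spectral shapes, and noise level are assumptions for illustration, not values from the study.

```python
# Assumed illustration (not the published chemometric model): PLS regression
# relating spectra to total adulterant mass fraction, on synthetic data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_samples, n_wavenumbers = 120, 600
adulteration = rng.uniform(0.01, 0.66, n_samples)       # mass fraction, 1-66% as in the study

pure = np.exp(-((np.arange(n_wavenumbers) - 200) ** 2) / 5000.0)    # toy coffee spectrum
adult = np.exp(-((np.arange(n_wavenumbers) - 400) ** 2) / 5000.0)   # toy adulterant spectrum
spectra = (np.outer(1 - adulteration, pure) + np.outer(adulteration, adult)
           + 0.01 * rng.standard_normal((n_samples, n_wavenumbers)))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, adulteration, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
y_pred = pls.predict(X_te).ravel()
print("RMSE of prediction: %.3f (mass fraction)" % np.sqrt(np.mean((y_pred - y_te) ** 2)))
```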

  4. Magnetic fingerprints of rolling cells for quantitative flow cytometry in whole blood

    NASA Astrophysics Data System (ADS)

    Reisbeck, Mathias; Helou, Michael Johannes; Richter, Lukas; Kappes, Barbara; Friedrich, Oliver; Hayden, Oliver

    2016-09-01

    Over the past 50 years, flow cytometry has had a profound impact on preclinical and clinical applications requiring single cell function information for counting, sub-typing and quantification of epitope expression. At the same time, the workflow complexity and high costs of such optical systems still limit flow cytometry applications to specialized laboratories. Here, we present a quantitative magnetic flow cytometer that incorporates in situ magnetophoretic cell focusing for highly accurate and reproducible rolling of the cellular targets over giant magnetoresistance sensing elements. Time-of-flight analysis is used to unveil quantitative single cell information contained in its magnetic fingerprint. Furthermore, we used erythrocytes as a biological model to validate our methodology with respect to precise analysis of the hydrodynamic cell diameter, quantification of binding capacity of immunomagnetic labels, and discrimination of cell morphology. The extracted time-of-flight information should enable point-of-care quantitative flow cytometry in whole blood for clinical applications, such as immunology and primary hemostasis.
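
    The time-of-flight idea can be illustrated with a generic peak-detection sketch: the transit time between two peaks in a (simulated) sensor trace, together with an assumed spacing between sensing elements, gives a rolling velocity. The sampling rate, spacing, and signal shape below are invented and do not describe the authors' instrument.

        import numpy as np
        from scipy.signal import find_peaks

        fs = 50_000.0                       # sampling rate, Hz (assumed)
        t = np.arange(0, 0.02, 1 / fs)
        sensor_spacing = 30e-6              # metres between sensing elements (assumed)

        # Simulated trace: two peaks as a labelled cell passes two sensing elements, plus noise
        signal = (np.exp(-((t - 0.005) / 2e-4) ** 2) + np.exp(-((t - 0.012) / 2e-4) ** 2)
                  + 0.02 * np.random.default_rng(4).normal(size=t.size))

        peaks, _ = find_peaks(signal, height=0.5, distance=int(0.002 * fs))
        time_of_flight = (peaks[1] - peaks[0]) / fs        # seconds between the two peaks
        velocity = sensor_spacing / time_of_flight         # rolling velocity of the cell
        print(f"time of flight = {time_of_flight*1e3:.2f} ms, velocity = {velocity*1e3:.2f} mm/s")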

  5. Qualitative and quantitative analysis of milk for the detection of adulteration by Laser Induced Breakdown Spectroscopy (LIBS).

    PubMed

    Moncayo, S; Manzoor, S; Rosales, J D; Anzano, J; Caceres, J O

    2017-10-01

    The present work focuses on the development of a fast and cost-effective method based on Laser Induced Breakdown Spectroscopy (LIBS) for the quality control, traceability and detection of adulteration in milk. Two adulteration cases have been studied: a qualitative analysis for the discrimination between different milk blends, and quantification of melamine in adulterated toddler milk powder. Principal Component Analysis (PCA) and neural networks (NN) have been used to analyze LIBS spectra, obtaining a correct classification rate of 98% with 100% robustness. For the quantification of melamine, two methodologies have been developed: univariate analysis using the CN emission band, and a multivariate calibration NN model, obtaining correlation coefficient (R(2)) values of 0.982 and 0.999, respectively. The results of using the LIBS technique coupled with chemometric analysis are discussed in terms of its potential use in the food industry for quality control of this dairy product. Copyright © 2017 Elsevier Ltd. All rights reserved.
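
    For the classification step, a sketch along the following lines is plausible, with simulated spectra and scikit-learn's PCA and MLPClassifier standing in for the chemometric tools actually used; because the data are random, the reported accuracy is meaningless and only the pipeline structure is of interest.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(90, 1024))      # stand-in for LIBS spectra (intensity vs. wavelength)
        y = rng.integers(0, 3, size=90)      # three hypothetical milk blends

        model = make_pipeline(
            PCA(n_components=10),
            MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=1),
        )
        scores = cross_val_score(model, X, y, cv=5)
        print(f"cross-validated classification rate: {scores.mean():.2%}")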

  6. In vivo Magnetic Resonance Spectroscopy of cerebral glycogen metabolism in animals and humans

    PubMed Central

    Khowaja, Ameer; Choi, In-Young; Seaquist, Elizabeth R.; Öz, Gülin

    2015-01-01

    Glycogen serves as an important energy reservoir in the human body. Despite the abundance of glycogen in the liver and skeletal muscles, its concentration in the brain is relatively low, hence its significance has been questioned. A major challenge in studying brain glycogen metabolism has been the lack of availability of non-invasive techniques for quantification of brain glycogen in vivo. Invasive methods for brain glycogen quantification such as post mortem extraction following high energy microwave irradiation are not applicable in the human brain. With the advent of 13C Magnetic Resonance Spectroscopy (MRS), it has been possible to measure brain glycogen concentrations and turnover in physiological conditions, as well as under the influence of stressors such as hypoglycemia and visual stimulation. This review presents an overview of the principles of the 13C MRS methodology and its applications in both animals and humans to further our understanding of glycogen metabolism under normal physiological and pathophysiological conditions such as hypoglycemia unawareness. PMID:24676563

  7. Large-scale time-lapse microscopy of Oct4 expression in human embryonic stem cell colonies.

    PubMed

    Bhadriraju, Kiran; Halter, Michael; Amelot, Julien; Bajcsy, Peter; Chalfoun, Joe; Vandecreme, Antoine; Mallon, Barbara S; Park, Kye-Yoon; Sista, Subhash; Elliott, John T; Plant, Anne L

    2016-07-01

    Identification and quantification of the characteristics of stem cell preparations are critical for understanding stem cell biology and for the development and manufacturing of stem cell-based therapies. We have developed image analysis and visualization software that allows effective use of time-lapse microscopy to provide spatial and dynamic information from large numbers of human embryonic stem cell colonies. To achieve statistically relevant sampling, we examined >680 colonies from 3 different preparations of cells over 5 days each, generating a total experimental dataset of 0.9 terabyte (TB). The 0.5-gigapixel images at each time point were represented by multi-resolution pyramids and visualized using the Deep Zoom JavaScript library, extended to support viewing gigapixel images over time and extracting data on individual colonies. We present a methodology that enables quantification of variations in nominally identical preparations and between colonies, correlation of colony characteristics with Oct4 expression, and identification of rare events. Copyright © 2016. Published by Elsevier B.V.
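
    The multi-resolution pyramid idea can be sketched as repeated 2× downsampling of each time-point image. The toy version below uses simple block averaging in NumPy and is only meant to illustrate the data structure, not the Deep Zoom tooling used by the authors.

        import numpy as np

        def build_pyramid(image, min_size=256):
            """Return a list of successively half-resolution copies of `image`."""
            levels = [image]
            while min(levels[-1].shape[:2]) > min_size:
                h, w = levels[-1].shape[:2]
                h, w = h - h % 2, w - w % 2                  # trim to even dimensions
                img = levels[-1][:h, :w]
                # 2x2 block averaging produces the next pyramid level
                smaller = img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
                levels.append(smaller)
            return levels

        frame = np.random.rand(2048, 2048)                   # stand-in for one microscopy tile
        pyramid = build_pyramid(frame)
        print([lvl.shape for lvl in pyramid])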

  8. Assessing Personality and Mood With Adjective Check List Methodology: A Review

    ERIC Educational Resources Information Center

    Craig, Robert J.

    2005-01-01

    This article addresses the benefits and problems in using adjective check list methodology to assess personality. Recent developments in this assessment method are reviewed, emphasizing seminal adjective-based personality tests (Gough's Adjective Check List), mood tests (Lubin's Depressive Adjective Test, Multiple Affect Adjective Check List),…

  9. Millennial Expectations and Constructivist Methodologies: Their Corresponding Characteristics and Alignment

    ERIC Educational Resources Information Center

    Carter, Timothy L.

    2008-01-01

    In recent years, much emphasis has been placed on constructivist methodologies and their potential benefit for learners of various ages (Brandt & Perkins, 2000; Brooks, 1990). Although certain aspects of the constructivist paradigm have replaced several aspects of the behaviorist paradigm for a large contingency of stakeholders (particularly,…

  10. Management of health care expenditure by soft computing methodology

    NASA Astrophysics Data System (ADS)

    Maksimović, Goran; Jović, Srđan; Jovanović, Radomir; Aničić, Obrad

    2017-01-01

    In this study, health care expenditure was managed using soft computing methodology. The main goal was to predict gross domestic product (GDP) from several health care expenditure factors. Soft computing methodologies were applied because GDP prediction is a very complex task. The performance of the proposed predictors was confirmed by the simulation results. According to the results, support vector regression (SVR) has better prediction accuracy than the other soft computing methodologies. The soft computing methods benefit from global optimization capabilities that help avoid local-minimum issues.
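
    A minimal sketch of the prediction step, assuming a table of health care expenditure factors as inputs and GDP as the target; scikit-learn's SVR is used as a stand-in and the data are simulated, so the fit quality says nothing about the study's results.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)
        X = rng.normal(size=(200, 5))                         # hypothetical expenditure factors
        y = X @ np.array([2.0, -1.0, 0.5, 3.0, 1.5]) + rng.normal(scale=0.5, size=200)  # stand-in GDP

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=2)
        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
        model.fit(X_tr, y_tr)
        print(f"R^2 on held-out data: {model.score(X_te, y_te):.2f}")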

  11. Benzene Case Study Final Report - Second Prospective Report Study Science Advisory Board Review, July 2009

    EPA Pesticide Factsheets

    EPA developed a methodology for estimating the health benefits of benzene reductions and has applied it in a metropolitan-scale case study of the benefits of CAA controls on benzene emissions to accompany the main 812 analysis.

  12. Draft Benzene Case Study Review - Second Prospective Report Study Science Advisory Board Review, March 2008

    EPA Pesticide Factsheets

    EPA developed a methodology for estimating the health benefits of benzene reductions and has applied it in a metropolitan-scale case study of the benefits of CAA controls on benzene emissions to accompany the main 812 analysis.

  13. Sustainable Facility Development: Perceived Benefits and Challenges

    ERIC Educational Resources Information Center

    Stinnett, Brad; Gibson, Fred

    2016-01-01

    Purpose: The purpose of this paper is to assess the perceived benefits and challenges of implementing sustainable initiatives in collegiate recreational sports facilities. Additionally, this paper intends to contribute to the evolving field of facility sustainability in higher education. Design/methodology/approach The design included qualitative…

  14. Candidate substances for space bioprocessing methodology and data specification for benefit evaluation

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Analytical and quantitative economic techniques are applied to the evaluation of the economic benefits of a wide range of substances for space bioprocessing. On the basis of expected clinical applications, as well as the size of the patient population that could be affected by the clinical applications, eight substances are recommended for further benefit evaluation. Results show that a transitional probability methodology can be used to model at least one clinical application for each of these substances. In each recommended case, the disease and its therapy are sufficiently well understood and documented, and the statistical data are available to operate the model and produce estimates of the impact of new therapy systems on the cost of treatment, morbidity, and mortality. Utilizing the morbidity and mortality information produced by the model, a standard economic technique called the Value of Human Capital is used to estimate the social welfare benefits that could be attributable to the new therapy systems.
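
    The transitional probability methodology referred to above is essentially a Markov cohort model. The sketch below, with entirely hypothetical health states, transition probabilities and costs, shows how such a model can project treatment cost and mortality for a baseline therapy versus an improved one.

        import numpy as np

        # Hypothetical yearly transition matrices over the states: well, sick, dead
        baseline = np.array([[0.85, 0.10, 0.05],
                             [0.20, 0.65, 0.15],
                             [0.00, 0.00, 1.00]])
        new_therapy = np.array([[0.90, 0.07, 0.03],
                                [0.35, 0.55, 0.10],
                                [0.00, 0.00, 1.00]])
        annual_cost = np.array([500.0, 8000.0, 0.0])          # illustrative cost per state and year

        def project(P, years=10, cohort=np.array([1000.0, 0.0, 0.0])):
            state, total_cost = cohort.copy(), 0.0
            for _ in range(years):
                state = state @ P                             # advance the cohort one year
                total_cost += state @ annual_cost
            return total_cost, state[2]                       # cumulative cost, cumulative deaths

        for name, P in [("baseline", baseline), ("new therapy", new_therapy)]:
            cost, deaths = project(P)
            print(f"{name}: 10-year cost ~{cost:,.0f}, deaths ~{deaths:,.0f} per 1000 patients")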

  15. Soil Organic Carbon for Global Benefits - assessing potential SOC increase under SLM technologies worldwide and evaluating tradeoffs and gains of upscaling SLM technologies

    NASA Astrophysics Data System (ADS)

    Wolfgramm, Bettina; Hurni, Hans; Liniger, Hanspeter; Ruppen, Sebastian; Milne, Eleanor; Bader, Hans-Peter; Scheidegger, Ruth; Amare, Tadele; Yitaferu, Birru; Nazarmavloev, Farrukh; Conder, Malgorzata; Ebneter, Laura; Qadamov, Aslam; Shokirov, Qobiljon; Hergarten, Christian; Schwilch, Gudrun

    2013-04-01

    There is a fundamental mutual interest between enhancing soil organic carbon (SOC) in the world's soils and the objectives of the major global environmental conventions (UNFCCC, UNCBD, UNCCD). While there is evidence at the case study level that sustainable land management (SLM) technologies increase SOC stocks and SOC-related benefits, there are no quantitative data available on the potential for increasing SOC benefits from different SLM technologies, especially from case studies in developing countries, and a clear understanding of the trade-offs related to SLM up-scaling is missing. This study aims at assessing the potential increase of SOC under SLM technologies worldwide and at evaluating trade-offs and gains in up-scaling SLM for case studies in Tajikistan, Ethiopia and Switzerland. It makes use of the SLM technologies documented in the online database of the World Overview of Conservation Approaches and Technologies (WOCAT). The study consists of three components: 1) identifying SOC benefits contributing to the major global environmental issues for SLM technologies worldwide as documented in the WOCAT global database; 2) validating SOC storage potentials and SOC benefit predictions for SLM technologies from the WOCAT database using results from existing comparative case studies at the plot level, soil spectral libraries, and the standardized documentation of ecosystem services in the WOCAT database; 3) understanding trade-offs and win-win scenarios of up-scaling SLM technologies from the plot to the household and landscape level using material flow analysis. The study builds on the premise that the most promising way to increase benefits from land management is to consider already existing sustainable strategies. Such SLM technologies, documented from all over the world, are accessible in a standardized way in the WOCAT online database. The study thus evaluates SLM technologies from the WOCAT database by calculating the potential SOC storage increase and related benefits, comparing SOC estimates before and after establishment of the SLM technology. These results are validated using comparative case studies of plots with and without SLM technologies (existing SLM systems versus surrounding, degrading systems). In view of up-scaling SLM technologies, it is crucial to understand the trade-offs and gains supporting or hindering their further spread. Systemic biomass management analysis using material flow analysis allows organic carbon flows and stocks to be quantified for different land management options at the household as well as the landscape level. The study shows results relevant for science, policy and practice for accounting, monitoring and evaluating SOC-related ecosystem services: a comprehensive methodology for SLM impact assessments allowing quantification of SOC storage and SOC-related benefits under different SLM technologies, and an improved understanding of up-scaling options for SLM technologies, and of trade-offs as well as win-win opportunities for biomass management, SOC content increase, and ecosystem service improvement at the plot and household level.

  16. Is law enforcement of drug-impaired driving cost-efficient? An explorative study of a methodology for cost-benefit analysis.

    PubMed

    Veisten, Knut; Houwing, Sjoerd; Mathijssen, M P M René; Akhtar, Juned

    2013-03-01

    Road users driving under the influence of psychoactive substances may be at much higher relative risk (RR) in road traffic than the average driver. Legislation banning blood alcohol concentrations above certain threshold levels, combined with roadside breath-testing for alcohol, has been in place for decades in many countries, but new legislation and testing of drivers for drug use have recently been implemented in some countries. In this article we present a methodology for cost-benefit analysis (CBA) of increased law enforcement of roadside drug screening. This is an analysis of the profitability for society, where costs of control are weighed against the reduction in injuries expected from fewer drugged drivers on the roads. We specify assumptions regarding costs and the effect of the specificity of the drug screening device, and quantify a deterrence effect related to the sensitivity of the device, which yields the benefit estimates. Three European countries with different current enforcement levels were studied, yielding benefit-cost ratios in the approximate range of 0.5-5 for a tripling of current levels of enforcement, with costs of about 4000 EUR per convicted driver and in the range of 1.5 to 13 million EUR per prevented fatality. The applied methodology for CBA has involved a simplistic behavioural response to enforcement increase and control efficiency. Although this methodology should be developed further, it is clearly indicated that the cost-efficiency of increased law enforcement of drug driving offences is dependent on the baseline situation of drug use in traffic and on the current level of enforcement, as well as the RR and prevalence of drugs in road traffic. Copyright © 2012 Elsevier B.V. All rights reserved.
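
    The core arithmetic of such a CBA reduces to a benefit-cost ratio and a cost per prevented fatality; the sketch below uses invented figures (not those of the study) simply to show the form of the calculation.

        # Illustrative (invented) figures for one enforcement scenario
        control_cost_per_year = 2_000_000.0    # EUR, cost of tripling roadside drug screening
        prevented_fatalities = 0.8             # expected per year
        prevented_injuries = 25.0              # expected per year
        value_per_fatality = 3_000_000.0       # EUR, assumed value of a statistical life
        value_per_injury = 300_000.0           # EUR, assumed value per prevented injury

        benefits = (prevented_fatalities * value_per_fatality
                    + prevented_injuries * value_per_injury)
        bcr = benefits / control_cost_per_year
        cost_per_prevented_fatality = control_cost_per_year / prevented_fatalities
        print(f"benefit-cost ratio = {bcr:.2f}")
        print(f"cost per prevented fatality = {cost_per_prevented_fatality:,.0f} EUR")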

  17. Application of Fuzzy Logic to Matrix FMECA

    NASA Astrophysics Data System (ADS)

    Shankar, N. Ravi; Prabhu, B. S.

    2001-04-01

    A methodology combining the benefits of Fuzzy Logic and Matrix FMEA is presented in this paper. The presented methodology extends risk prioritization beyond the conventional Risk Priority Number (RPN) method. Fuzzy logic is used to calculate the criticality rank. The matrix approach is also improved further to develop a pictorial representation that retains all relevant qualitative and quantitative information on the relationships among several FMEA elements. The methodology presented is demonstrated by application to an illustrative example.

  18. Return on Investment for the United States Navy’s Training with Industry Program

    DTIC Science & Technology

    2017-06-01

    methodologies, an adaptable metric was developed for NAVSUP. The net benefit of the program divided by the program costs results in an ROI of 88%. Additional intangible benefits obtained include meeting capability gaps, meeting NAVSUP's objectives, and increasing...
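
    The ROI metric described is simply net benefit divided by program cost; the figures below are invented and merely reproduce the form of the calculation.

        program_cost = 100_000.0        # illustrative cost of the training program
        program_benefit = 188_000.0     # illustrative monetized benefit

        roi = (program_benefit - program_cost) / program_cost
        print(f"ROI = {roi:.0%}")       # -> 88% with these example numbers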

  19. How to Measure Costs and Benefits of eHealth Interventions: An Overview of Methods and Frameworks.

    PubMed

    Bergmo, Trine Strand

    2015-11-09

    Information on the costs and benefits of eHealth interventions is needed, not only to document value for money and to support decision making in the field, but also to form the basis for developing business models and to facilitate payment systems to support large-scale services. In the absence of solid evidence of its effects, key decision makers may doubt the effectiveness, which, in turn, limits investment in, and the long-term integration of, eHealth services. However, it is not realistic to conduct economic evaluations of all eHealth applications and services in all situations, so we need to be able to generalize from those we do conduct. This implies that we have to select the most appropriate methodology and data collection strategy in order to increase the transferability across evaluations. This paper aims to contribute to the understanding of how to apply economic evaluation methodology in the eHealth field. It provides a brief overview of basic health economics principles and frameworks and discusses some methodological issues and challenges in conducting cost-effectiveness analysis of eHealth interventions. Issues regarding the identification, measurement, and valuation of costs and benefits are outlined. Furthermore, this work describes the established techniques of combining costs and benefits, presents the decision rules for identifying the preferred option, and outlines approaches to data collection strategies. Issues related to transferability and complexity are also discussed.

  20. A methodology to estimate uncertainty for emission projections through sensitivity analysis.

    PubMed

    Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación

    2015-04-01

    Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically over the 12 highest-emitting sectors for greenhouse gas and air pollutant emissions. Examples of methodology application for two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend under a low economic-growth perspective and against official figures for 2010, showing a very good performance. A solid understanding and quantification of uncertainties related to atmospheric emission inventories and projections provide useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
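
    The sensitivity-analysis idea can be sketched by perturbing the driving factors of a toy emission projection and taking the envelope of the resulting trajectories as a nonstatistical uncertainty band. The activity driver, emission factor and perturbation percentages below are invented.

        import numpy as np

        years = np.arange(2015, 2021)
        activity = 100 * 1.02 ** (years - 2015)     # hypothetical activity driver (e.g., energy demand)
        emission_factor = 0.5                        # hypothetical tonnes of pollutant per unit activity
        central = activity * emission_factor

        bands = []
        for activity_shift in (-0.10, 0.0, 0.10):    # +/-10% on the activity driver
            for ef_shift in (-0.05, 0.0, 0.05):      # +/-5% on the emission factor
                bands.append(activity * (1 + activity_shift) * emission_factor * (1 + ef_shift))
        bands = np.array(bands)

        lower, upper = bands.min(axis=0), bands.max(axis=0)
        for y, c, lo, hi in zip(years, central, lower, upper):
            print(f"{y}: {c:6.1f}  [{lo:6.1f}, {hi:6.1f}]")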

  1. Usefulness of real-time PCR as a complementary tool to the monitoring of Legionella spp. and Legionella pneumophila by culture in industrial cooling systems.

    PubMed

    Touron-Bodilis, A; Pougnard, C; Frenkiel-Lebossé, H; Hallier-Soulier, S

    2011-08-01

    This study was designed to evaluate the usefulness of quantification by real-time PCR as a management tool to monitor concentrations of Legionella spp. and Legionella pneumophila in industrial cooling systems, and its ability to anticipate culture trends obtained by the French standard method (AFNOR T90-431). Quantification of Legionella bacteria was achieved by both methods on samples from nine cooling systems with different water qualities. The proportion of samples positive for L. pneumophila quantified by PCR was clearly lower in deionized or river waters subjected to a biocide treatment than in raw river waters, while Legionella spp. were quantified in almost all the samples. For some samples containing PCR inhibitors, high quantification limits (up to 4.80 × 10(5) GU l(-1)) did not allow L. pneumophila to be quantified by PCR when it was quantified by culture. Finally, monitoring of L. pneumophila concentrations by both methods showed similar trends for 57-100% of the samples. These results suggest that, if methodological steps designed to reduce inhibition problems and thus decrease the quantification limits could be developed for quantifying Legionella in complex waters, real-time PCR could be a valuable complementary tool for monitoring the evolution of L. pneumophila concentrations. This study shows the possibility of using real-time PCR to monitor L. pneumophila proliferation in cooling systems and the importance of adapting nucleic acid extraction and purification protocols to raw waters. Journal of Applied Microbiology © 2011 The Society for Applied Microbiology. No claim to French Government works.
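
    Real-time PCR quantification of this kind generally rests on a standard curve relating the quantification cycle (Cq) to the logarithm of concentration. The sketch below shows that conversion with invented calibration values; it is not the AFNOR procedure or the kit-specific protocol used in the study.

        import numpy as np

        # Hypothetical standard curve: Cq measured for known Legionella concentrations (GU/L)
        std_conc = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
        std_cq = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

        slope, intercept = np.polyfit(np.log10(std_conc), std_cq, 1)
        efficiency = 10 ** (-1 / slope) - 1              # amplification efficiency implied by the slope

        def quantify(cq):
            """Convert a sample Cq to genomic units per litre via the standard curve."""
            return 10 ** ((cq - intercept) / slope)

        print(f"slope={slope:.2f}, efficiency={efficiency:.0%}, "
              f"sample at Cq 25 -> {quantify(25.0):.2e} GU/L")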

  2. Composition of complex numbers: Delineating the computational role of the left anterior temporal lobe.

    PubMed

    Blanco-Elorrieta, Esti; Pylkkänen, Liina

    2016-01-01

    What is the neurobiological basis of our ability to create complex messages with language? Results from multiple methodologies have converged on a set of brain regions as relevant for this general process, but the computational details of these areas remain to be characterized. The left anterior temporal lobe (LATL) has been a consistent node within this network, with results suggesting that although it rather systematically shows increased activation for semantically complex structured stimuli, this effect does not extend to number phrases such as 'three books.' In the present work we used magnetoencephalography to investigate whether numbers in general are an invalid input to the combinatory operations housed in the LATL or whether the lack of LATL engagement for stimuli such as 'three books' is due to the quantificational nature of such phrases. As a relevant test case, we employed complex number terms such as 'twenty-three', where one number term is not a quantifier of the other but rather, the two terms form a type of complex concept. In a number naming paradigm, participants viewed rows of numbers and depending on task instruction, named them as complex number terms ('twenty-three'), numerical quantifications ('two threes'), adjectival modifications ('blue threes') or non-combinatory lists (e.g., 'two, three'). While quantificational phrases failed to engage the LATL as compared to non-combinatory controls, both complex number terms and adjectival modifications elicited a reliable activity increase in the LATL. Our results show that while the LATL does not participate in the enumeration of tokens within a set, exemplified by the quantificational phrases, it does support conceptual combination, including the composition of complex number concepts. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Roundness variation in JPEG images affects the automated process of nuclear immunohistochemical quantification: correction with a linear regression model.

    PubMed

    López, Carlos; Jaén Martinez, Joaquín; Lejeune, Marylène; Escrivà, Patricia; Salvadó, Maria T; Pons, Lluis E; Alvaro, Tomás; Baucells, Jordi; García-Rojo, Marcial; Cugat, Xavier; Bosch, Ramón

    2009-10-01

    The volume of digital image (DI) storage continues to be an important problem in computer-assisted pathology. DI compression enables the size of files to be reduced but with the disadvantage of loss of quality. Previous results indicated that the efficiency of computer-assisted quantification of immunohistochemically stained cell nuclei may be significantly reduced when compressed DIs are used. This study attempts to show, with respect to immunohistochemically stained nuclei, which morphometric parameters may be altered by the different levels of JPEG compression, and the implications of these alterations for automated nuclear counts, and further, develops a method for correcting this discrepancy in the nuclear count. For this purpose, 47 DIs from different tissues were captured in uncompressed TIFF format and converted to 1:3, 1:23 and 1:46 compression JPEG images. Sixty-five positive objects were selected from these images, and six morphological parameters were measured and compared for each object in TIFF images and those of the different compression levels using a set of previously developed and tested macros. Roundness proved to be the only morphological parameter that was significantly affected by image compression. Factors to correct the discrepancy in the roundness estimate were derived from linear regression models for each compression level, thereby eliminating the statistically significant differences between measurements in the equivalent images. These correction factors were incorporated in the automated macros, where they reduced the nuclear quantification differences arising from image compression. Our results demonstrate that it is possible to carry out unbiased automated immunohistochemical nuclear quantification in compressed DIs with a methodology that could be easily incorporated in different systems of digital image analysis.
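
    The correction described can be sketched as fitting, for each compression level, a straight line between roundness measured on the JPEG images and roundness measured on the uncompressed TIFF images, then applying that mapping to new JPEG measurements. The data below are simulated, not the study's measurements.

        import numpy as np

        rng = np.random.default_rng(3)
        roundness_tiff = rng.uniform(0.5, 1.0, size=65)          # "true" values from TIFF images
        roundness_jpeg = (0.92 * roundness_tiff + 0.03
                          + rng.normal(scale=0.01, size=65))     # biased values from compressed images

        # Fit TIFF roundness as a linear function of JPEG roundness (one model per compression level)
        a, b = np.polyfit(roundness_jpeg, roundness_tiff, 1)

        def correct(jpeg_value):
            """Correct a roundness value measured on a compressed image."""
            return a * jpeg_value + b

        print(f"corrected roundness for a JPEG measurement of 0.80: {correct(0.80):.3f}")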

  4. Sensitivity and specificity of human brain glutathione concentrations measured using short-TE (1)H MRS at 7 T.

    PubMed

    Deelchand, Dinesh K; Marjańska, Małgorzata; Hodges, James S; Terpstra, Melissa

    2016-05-01

    Although the MR editing techniques that have traditionally been used for the measurement of glutathione (GSH) concentrations in vivo address the problem of spectral overlap, they suffer detriments associated with inherently long TEs. The purpose of this study was to characterize the sensitivity and specificity for the quantification of GSH concentrations without editing at short TE. The approach was to measure synthetically generated changes in GSH concentrations from in vivo stimulated echo acquisition mode (STEAM) spectra after in vitro GSH spectra had been added to or subtracted from them. Spectra from five test subjects were synthetically altered to mimic changes in the GSH signal. To account for different background noise between measurements, retest spectra (from the same individuals as used to generate the altered data) and spectra from five other individuals were compared with the synthetically altered spectra to investigate the reliability of the quantification of GSH concentration. Using STEAM spectroscopy at 7 T, GSH concentration differences on the order of 20% were detected between test and retest studies, as well as between differing populations in a small sample (n = 5) with high accuracy (R(2) > 0.99) and certainty (p ≤ 0.01). Both increases and decreases in GSH concentration were reliably quantified with small impact on the quantification of ascorbate and γ-aminobutyric acid. These results show the feasibility of using short-TE (1)H MRS to measure biologically relevant changes and differences in human brain GSH concentration. Although these outcomes are specific to the experimental approach used and the spectral quality achieved, this study serves as a template for the analogous scrutiny of quantification reliability for other compounds, methodologies and spectral qualities. Copyright © 2016 John Wiley & Sons, Ltd.

  5. 5 CFR 841.701 - Purpose and scope.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... adjustments (COLA's) for basic benefits under the Federal Employees Retirement System (FERS). (b) This subpart provides the methodology for— (1) Computing COLA's on each type of FERS basic benefit subject to COLA's; and (2) Computing COLA's on annuities partially computed under FERS and partially computed under the...

  6. 5 CFR 841.701 - Purpose and scope.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... adjustments (COLA's) for basic benefits under the Federal Employees Retirement System (FERS). (b) This subpart provides the methodology for— (1) Computing COLA's on each type of FERS basic benefit subject to COLA's; and (2) Computing COLA's on annuities partially computed under FERS and partially computed under the...

  7. 5 CFR 841.701 - Purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... adjustments (COLA's) for basic benefits under the Federal Employees Retirement System (FERS). (b) This subpart provides the methodology for— (1) Computing COLA's on each type of FERS basic benefit subject to COLA's; and (2) Computing COLA's on annuities partially computed under FERS and partially computed under the...

  8. 5 CFR 841.701 - Purpose and scope.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... adjustments (COLA's) for basic benefits under the Federal Employees Retirement System (FERS). (b) This subpart provides the methodology for— (1) Computing COLA's on each type of FERS basic benefit subject to COLA's; and (2) Computing COLA's on annuities partially computed under FERS and partially computed under the...

  9. 5 CFR 841.701 - Purpose and scope.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... adjustments (COLA's) for basic benefits under the Federal Employees Retirement System (FERS). (b) This subpart provides the methodology for— (1) Computing COLA's on each type of FERS basic benefit subject to COLA's; and (2) Computing COLA's on annuities partially computed under FERS and partially computed under the...

  10. 75 FR 74086 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Benefits...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-30

    ... Administration (ETA) sponsored information collection request (ICR) titled, ``Benefits Timeliness and Quality... information and analyzes data. BTQ data measure the timeliness and quality of states' administrative actions... of information, including the validity of the methodology and assumptions used; Enhance the quality...

  11. The Generalized Roy Model and the Cost-Benefit Analysis of Social Programs.

    PubMed

    Eisenhauer, Philipp; Heckman, James J; Vytlacil, Edward

    2015-04-01

    The literature on treatment effects focuses on gross benefits from program participation. We extend this literature by developing conditions under which it is possible to identify parameters measuring the cost and net surplus from program participation. Using the generalized Roy model, we nonparametrically identify the cost, benefit, and net surplus of selection into treatment without requiring the analyst to have direct information on the cost. We apply our methodology to estimate the gross benefit and net surplus of attending college.

  12. The Generalized Roy Model and the Cost-Benefit Analysis of Social Programs*

    PubMed Central

    Eisenhauer, Philipp; Heckman, James J.; Vytlacil, Edward

    2015-01-01

    The literature on treatment effects focuses on gross benefits from program participation. We extend this literature by developing conditions under which it is possible to identify parameters measuring the cost and net surplus from program participation. Using the generalized Roy model, we nonparametrically identify the cost, benefit, and net surplus of selection into treatment without requiring the analyst to have direct information on the cost. We apply our methodology to estimate the gross benefit and net surplus of attending college. PMID:26709315

  13. Practical considerations of image analysis and quantification of signal transduction IHC staining.

    PubMed

    Grunkin, Michael; Raundahl, Jakob; Foged, Niels T

    2011-01-01

    The dramatic increase in computer processing power in combination with the availability of high-quality digital cameras during the last 10 years has fertilized the grounds for quantitative microscopy based on digital image analysis. With the present introduction of robust scanners for whole slide imaging in both research and routine, the benefits of automation and objectivity in the analysis of tissue sections will be even more obvious. For in situ studies of signal transduction, the combination of tissue microarrays, immunohistochemistry, digital imaging, and quantitative image analysis will be central operations. However, immunohistochemistry is a multistep procedure including a lot of technical pitfalls leading to intra- and interlaboratory variability of its outcome. The resulting variations in staining intensity and disruption of original morphology are an extra challenge for the image analysis software, which therefore preferably should be dedicated to the detection and quantification of histomorphometrical end points.

  14. Uncertainty quantification and optimal decisions

    PubMed Central

    2017-01-01

    A mathematical model can be analysed to construct policies for action that are close to optimal for the model. If the model is accurate, such policies will be close to optimal when implemented in the real world. In this paper, the different aspects of an ideal workflow are reviewed: modelling, forecasting, evaluating forecasts, data assimilation and constructing control policies for decision-making. The example of the oil industry is used to motivate the discussion, and other examples, such as weather forecasting and precision agriculture, are used to argue that the same mathematical ideas apply in different contexts. Particular emphasis is placed on (i) uncertainty quantification in forecasting and (ii) how decisions are optimized and made robust to uncertainty in models and judgements. This necessitates full use of the relevant data and by balancing costs and benefits into the long term may suggest policies quite different from those relevant to the short term. PMID:28484343

  15. Quantifying Motor Impairment in Movement Disorders.

    PubMed

    FitzGerald, James J; Lu, Zhongjiao; Jareonsettasin, Prem; Antoniades, Chrystalina A

    2018-01-01

    Until recently the assessment of many movement disorders has relied on clinical rating scales that despite careful design are inherently subjective and non-linear. This makes accurate and truly observer-independent quantification difficult and limits the use of sensitive parametric statistical methods. At last, devices capable of measuring neurological problems quantitatively are becoming readily available. Examples include the use of oculometers to measure eye movements and accelerometers to measure tremor. Many applications are being developed for use on smartphones. The benefits include not just more accurate disease quantification, but also consistency of data for longitudinal studies, accurate stratification of patients for entry into trials, and the possibility of automated data capture for remote follow-up. In this mini review, we will look at movement disorders with a particular focus on Parkinson's disease, describe some of the limitations of existing clinical evaluation tools, and illustrate the ways in which objective metrics have already been successful.

  16. An integrated dispersion preparation, characterization and in vitro dosimetry methodology for engineered nanomaterials

    PubMed Central

    DeLoid, Glen M.; Cohen, Joel M.; Pyrgiotakis, Georgios; Demokritou, Philip

    2018-01-01

    Summary Evidence continues to grow of the importance of in vitro and in vivo dosimetry in the hazard assessment and ranking of engineered nanomaterials (ENMs). Accurate dose metrics are particularly important for in vitro cellular screening to assess the potential health risks or bioactivity of ENMs. In order to ensure meaningful and reproducible quantification of in vitro dose, with consistent measurement and reporting between laboratories, it is necessary to adopt standardized and integrated methodologies for 1) generation of stable ENM suspensions in cell culture media, 2) colloidal characterization of suspended ENMs, particularly properties that determine particle kinetics in an in vitro system (size distribution and formed agglomerate effective density), and 3) robust numerical fate and transport modeling for accurate determination of ENM dose delivered to cells over the course of the in vitro exposure. Here we present such an integrated comprehensive protocol based on such a methodology for in vitro dosimetry, including detailed standardized procedures for each of these three critical steps. The entire protocol requires approximately 6-12 hours to complete. PMID:28102836

  17. An information-theoretic approach to surrogate-marker evaluation with failure time endpoints.

    PubMed

    Pryseley, Assam; Tilahun, Abel; Alonso, Ariel; Molenberghs, Geert

    2011-04-01

    Over the last decades, the evaluation of potential surrogate endpoints in clinical trials has steadily been growing in importance, not only thanks to the availability of ever more potential markers and surrogate endpoints, also because more methodological development has become available. While early work has been devoted, to a large extent, to Gaussian, binary, and longitudinal endpoints, the case of time-to-event endpoints is in need of careful scrutiny as well, owing to the strong presence of such endpoints in oncology and beyond. While work had been done in the past, it was often cumbersome to use such tools in practice, because of the need for fitting copula or frailty models that were further embedded in a hierarchical or two-stage modeling approach. In this paper, we present a methodologically elegant and easy-to-use approach based on information theory. We resolve essential issues, including the quantification of "surrogacy" based on such an approach. Our results are put to the test in a simulation study and are applied to data from clinical trials in oncology. The methodology has been implemented in R.

  18. Non-contact versus contact-based sensing methodologies for in-home upper arm robotic rehabilitation.

    PubMed

    Howard, Ayanna; Brooks, Douglas; Brown, Edward; Gebregiorgis, Adey; Chen, Yu-Ping

    2013-06-01

    In recent years, robot-assisted rehabilitation has gained momentum as a viable means for improving outcomes for therapeutic interventions. Such therapy experiences allow controlled and repeatable trials and quantitative evaluation of mobility metrics. Typically though these robotic devices have been focused on rehabilitation within a clinical setting. In these traditional robot-assisted rehabilitation studies, participants are required to perform goal-directed movements with the robot during a therapy session. This requires physical contact between the participant and the robot to enable precise control of the task, as well as a means to collect relevant performance data. On the other hand, non-contact means of robot interaction can provide a safe methodology for extracting the control data needed for in-home rehabilitation. As such, in this paper we discuss a contact and non-contact based method for upper-arm rehabilitation exercises that enables quantification of upper-arm movements. We evaluate our methodology on upper-arm abduction/adduction movements and discuss the advantages and limitations of each approach as applied to an in-home rehabilitation scenario.

  19. LC-MS based analysis of endogenous steroid hormones in human hair.

    PubMed

    Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias

    2016-09-01

    The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Catchment-wide impacts on water quality: the use of 'snapshot' sampling during stable flow

    NASA Astrophysics Data System (ADS)

    Grayson, R. B.; Gippel, C. J.; Finlayson, B. L.; Hart, B. T.

    1997-12-01

    Water quality is usually monitored on a regular basis at only a small number of locations in a catchment, generally focused at the catchment outlet. This integrates the effect of all the point and non-point source processes occurring throughout the catchment. However, effective catchment management requires data which identify major sources and processes. As part of a wider study aimed at providing technical information for the development of integrated catchment management plans for a 5000 km(2) catchment in south-eastern Australia, a 'snapshot' of water quality was undertaken during stable summer flow conditions. These low flow conditions exist for long periods, so water quality at these flow levels is an important constraint on the health of in-stream biological communities. Over a 4-day period, a study of the low flow water quality characteristics throughout the Latrobe River catchment was undertaken. Sixty-four sites were chosen to enable a longitudinal profile of water quality to be established. All tributary junctions and sites along major tributaries, as well as all major industrial inputs, were included. Samples were analysed for a range of parameters including total suspended solids concentration, pH, dissolved oxygen, electrical conductivity, turbidity, flow rate and water temperature. Filtered and unfiltered samples were taken from 27 sites along the main stream and tributary confluences for analysis of total N, NH4, oxidised N, total P and dissolved reactive P concentrations. The data are used to illustrate the utility of this sampling methodology for establishing specific sources and estimating non-point source loads of phosphorus, total suspended solids and total dissolved solids. The methodology enabled several new insights into system behaviour, including quantification of unknown point discharges, identification of key in-stream sources of suspended material, and the extent to which biological activity (phytoplankton growth) affects water quality. The costs and benefits of the sampling exercise are reviewed.
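
    Converting snapshot concentrations into loads is simple arithmetic (load = concentration × discharge); the sketch below, with invented site data, shows how tributary loads can be compared to flag disproportionate sources.

        # Invented snapshot data: site, discharge (m3/s), total P concentration (mg/L)
        sites = [
            ("Tributary A", 0.8, 0.04),
            ("Tributary B", 2.5, 0.15),
            ("Main stem below town", 6.0, 0.09),
        ]

        for name, q_m3s, tp_mgL in sites:
            # mg/L equals g/m3, so mg/L * m3/s = g/s; convert to kg/day
            load_kg_day = tp_mgL * q_m3s * 86400 / 1000
            print(f"{name:>22}: {load_kg_day:6.1f} kg P/day")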

  1. Enlisting Ecosystem Benefits: Quantification and Valuation of Ecosystem Services to Inform Installation Management

    DTIC Science & Technology

    2015-05-27

    human development and conservation of terrestrial, freshwater, and marine ecosystems. The InVEST toolset currently includes 17 distinct InVEST... Plateau to the north and the Coastal Plain to the south, which represent distinct features of topography, geology and soils, and vegetation communities...threatened by a complex of tree diseases and pine beetles that cause declines or mortality in loblolly pine, a dominant tree across the base. When loblolly

  2. Economic assessment of climate adaptation options for urban drainage design in Odense, Denmark.

    PubMed

    Zhou, Q; Halsnæs, K; Arnbjerg-Nielsen, K

    2012-01-01

    Climate change is likely to influence the water cycle by changing precipitation patterns, in some cases leading to increased occurrences of precipitation extremes. Urban landscapes are vulnerable to such changes due to the concentrated population and socio-economic values in cities. Feasible adaptation requires better flood risk quantification and assessment of appropriate adaptation actions in terms of costs and benefits. This paper presents an economic assessment of three prevailing climate adaptation options for urban drainage design in a Danish case study, Odense. A risk-based evaluation framework is used to give detailed insight into the physical and economic feasibility of each option. Estimation of the marginal benefits of adaptation options is carried out through a step-by-step cost-benefit analysis. The results are aimed at providing important information for decision making on how best to adapt to urban pluvial flooding due to climate impacts in cities.
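
    Risk-based assessments of this kind typically integrate damages over exceedance probabilities to obtain an expected annual damage (EAD), and value an adaptation option by the reduction in EAD it buys relative to its annualized cost. The damage-probability pairs and cost below are invented for illustration.

        import numpy as np

        # Invented flood damages (million EUR) at given annual exceedance probabilities
        probs = np.array([0.5, 0.2, 0.1, 0.02, 0.01])
        damage_now = np.array([0.0, 1.0, 3.0, 12.0, 25.0])
        damage_adapted = np.array([0.0, 0.2, 1.0, 6.0, 15.0])   # e.g., after enlarging pipes

        def expected_annual_damage(p, d):
            """Trapezoidal area under the damage vs. exceedance-probability curve."""
            order = np.argsort(p)
            p, d = p[order], d[order]
            return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

        annual_benefit = (expected_annual_damage(probs, damage_now)
                          - expected_annual_damage(probs, damage_adapted))
        annualized_cost = 0.6                                    # million EUR per year (invented)
        print(f"marginal annual benefit = {annual_benefit:.2f} M EUR, "
              f"benefit/cost = {annual_benefit / annualized_cost:.2f}")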

  3. The misconception of ecosystem disservices: How a catchy term may yield the wrong messages for science and society

    USGS Publications Warehouse

    Villa, Ferdinando; Bagstad, Kenneth J.; Voigt, Brian; Johnson, Gary W.; Athanasiadis, Ioannis N; Balbi, Stefano

    2014-01-01

    In their recent article, Shapiro and Báldi (2014) build on the long-running narrative of “ecosystem services and disservices” (e.g., Zhang et al., 2007 ; Lyytimäki et al., 2008), describing how nature yields both benefits and harms to society. These harms include crop pests, floods, landslides, wildfires, and zoonotic disease transmission, among others. While we agree with their argument that calculation of these harms is commonplace and corresponding quantification of benefits is needed, we feel the use of the concept of “ecosystem disservices” hampers, rather than helps, the development of an integrative and constructive dialogue about conservation and the complex interrelationships between humans and nature. Estimation of costs and benefits and their balancing as positives or negatives is a principal activity in economics; however, we fear that in this case the term “disservice” carries the wrong message for both science and society.

  4. The relationship between return on investment and quality of study methodology in workplace health promotion programs.

    PubMed

    Baxter, Siyan; Sanderson, Kristy; Venn, Alison J; Blizzard, C Leigh; Palmer, Andrew J

    2014-01-01

    To determine the relationship between return on investment (ROI) and quality of study methodology in workplace health promotion programs. Data were obtained through a systematic literature search of the National Health Service Economic Evaluation Database (NHS EED), Database of Abstracts of Reviews of Effects (DARE), Health Technology Database (HTA), Cost Effectiveness Analysis (CEA) Registry, EconLit, PubMed, Embase, Wiley, and Scopus. Included were articles written in English or German reporting the cost(s) and benefit(s) of single- or multicomponent health promotion programs in working adults. Return-to-work and workplace injury prevention studies were excluded. Methodological quality was graded using the British Medical Journal Economic Evaluation Working Party checklist. Economic outcomes were presented as ROI, calculated as ROI = (benefits - costs of program)/costs of program. Results were weighted by study size and combined using meta-analysis techniques. Sensitivity analysis was performed using two additional methodological quality checklists. The influences of quality score and important study characteristics on ROI were explored. Fifty-one studies (61 intervention arms) published between 1984 and 2012 included 261,901 participants and 122,242 controls from nine industry types across 12 countries. Methodological quality scores were highly correlated between checklists (r = .84-.93). Methodological quality improved over time. The overall weighted ROI [mean ± standard deviation (confidence interval)] was 1.38 ± 1.97 (1.38-1.39), which indicated a 138% return on investment. When accounting for methodological quality, an inverse relationship to ROI was found. High-quality studies (n = 18) had a smaller mean ROI, 0.26 ± 1.74 (.23-.30), compared to moderate (n = 16), 0.90 ± 1.25 (.90-.91), and low-quality (n = 27), 2.32 ± 2.14 (2.30-2.33), studies. Randomized controlled trials (RCTs) (n = 12) exhibited a negative ROI, -0.22 ± 2.41 (-.27 to -.16). Financial returns became increasingly positive across quasi-experimental, nonexperimental, and modeled studies: 1.12 ± 2.16 (1.11-1.14), 1.61 ± 0.91 (1.56-1.65), and 2.05 ± 0.88 (2.04-2.06), respectively. Overall, the weighted mean ROI in workplace health promotion was positive. Higher-quality studies provided evidence of smaller financial returns. Methodological quality and study design are important determinants.
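
    The pooling step can be sketched as a study-size-weighted mean of per-study ROI values, where each ROI is (benefits - costs)/costs; the per-study figures below are invented and are not the review's data.

        import numpy as np

        # Invented per-study data: (participants, program cost, monetized benefit)
        studies = [
            (500,  120_000.0, 150_000.0),
            (2000,  80_000.0, 300_000.0),
            (150,   40_000.0,  30_000.0),
        ]

        rois = np.array([(benefit - cost) / cost for _, cost, benefit in studies])
        weights = np.array([n for n, _, _ in studies], dtype=float)

        weighted_roi = np.average(rois, weights=weights)
        print(f"per-study ROI: {np.round(rois, 2)}, size-weighted mean ROI = {weighted_roi:.2f}")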

  5. Phase 1 of the near team hybrid passenger vehicle development program. Appendix C: Preliminary design data package, volume 1

    NASA Technical Reports Server (NTRS)

    Piccolo, R.

    1979-01-01

    The methodology used for vehicle layout and component definition is described, as well as techniques for system optimization and energy evaluation. The preliminary design is examined with particular attention given to body and structure; propulsion system; crash analysis and handling; internal combustion engine; separately excited DC motor; Ni-Zn battery; transmission; control system; vehicle auxiliaries; weight breakdown; and life cycle costs. Formulas are given for the quantification of energy consumption, and results are compared with the reference vehicle.

  6. Highly sensitive and simple liquid chromatography assay with ion-pairing extraction and visible detection for quantification of gold from nanoparticles.

    PubMed

    Pallotta, Arnaud; Philippe, Valentin; Boudier, Ariane; Leroy, Pierre; Clarot, Igor

    2018-03-01

    A simple isocratic HPLC method using visible detection was developed and validated for the quantification of gold in nanoparticles (AuNP). After a first step of oxidation of the nanoparticles, an ion pair between the tetrachloroaurate anion and the cationic dye Rhodamine B was formed and extracted from the aqueous medium with the help of an organic solvent. The corresponding Rhodamine B was finally quantified by reversed-phase liquid chromatography using a Nucleosil C18 (150 mm × 4.6 mm, 3 µm) column with a mobile phase containing acetonitrile and 0.1% aqueous trifluoroacetic acid (25/75, V/V) at 1.0 mL min(-1), with detection at a wavelength of 555 nm. The method was validated using the methodology described by the International Conference on Harmonization and was shown to be specific, precise (RSD < 11%), accurate, and linear in the range of 0.1-30.0 µM, with a lower limit of quantification (LLOQ) of 0.1 µM. This method was first applied to AuNP quality control after synthesis. Subsequently, the absence of gold leakage (either as AuNP or in gold salt form) from nanostructured multilayered polyelectrolyte films under shear stress was assessed. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Liquid chromatography/electrospray ionisation-tandem mass spectrometry quantification of GM2 gangliosides in human peripheral cells and plasma.

    PubMed

    Fuller, Maria; Duplock, Stephen; Hein, Leanne K; Rigat, Brigitte A; Mahuran, Don J

    2014-08-01

    GM2 gangliosidosis is a group of inherited neurodegenerative disorders resulting primarily from the excessive accumulation of GM2 gangliosides (GM2) in neuronal cells. As biomarkers for categorising patients and monitoring the effectiveness of developing therapies are lacking for this group of disorders, we sought to develop methodology to quantify GM2 levels in more readily attainable patient samples such as plasma, leukocytes, and cultured skin fibroblasts. Following organic extraction, gangliosides were partitioned into the aqueous phase and isolated using C18 solid-phase extraction columns. Relative quantification of three species of GM2 was achieved using LC/ESI-MS/MS with d35GM1 18:1/18:0 as an internal standard. The assay was linear over the biological range, and all GM2 gangliosidosis patients were demarcated from controls by elevated GM2 in cultured skin fibroblast extracts. However, in leukocytes only some molecular species could be used for differentiation and in plasma only one was informative. A reduction in GM2 was easily detected in patient skin fibroblasts after a short treatment with media from normal cells enriched in secreted β-hexosaminidase. This method may show promise for measuring the effectiveness of experimental therapies for GM2 gangliosidosis by allowing quantification of a reduction in the primary storage burden. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Use of lignin extracted from different plant sources as standards in the spectrophotometric acetyl bromide lignin method.

    PubMed

    Fukushima, Romualdo S; Kerley, Monty S

    2011-04-27

    A nongravimetric acetyl bromide lignin (ABL) method was evaluated to quantify lignin concentration in a variety of plant materials. The traditional approach to lignin quantification required extraction of lignin with acidic dioxane and its isolation from each plant sample to construct a standard curve via spectrophotometric analysis. Lignin concentration was then measured in pre-extracted plant cell walls. However, this presented a methodological complexity because extraction and isolation procedures are lengthy and tedious, particularly if there are many samples involved. This work aimed to simplify lignin quantification. Our hypothesis was that any lignin, regardless of its botanical origin, could be used to construct a standard curve for the purpose of determining lignin concentration in a variety of plants. To test our hypothesis, lignins were isolated from a range of diverse plants and, along with three commercial lignins, were used to build standard curves that were compared among them. Slopes and intercepts derived from these standard curves were close enough to allow utilization of a mean extinction coefficient in the regression equation to estimate lignin concentration in any plant, independent of its botanical origin. Lignin quantification by use of a common regression equation obviates the steps of lignin extraction, isolation, and standard curve construction, which substantially expedites the ABL method. The acetyl bromide lignin method is a fast, convenient analytical procedure that may routinely be used to quantify lignin.
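
    The arithmetic of the shared standard curve amounts to applying a common slope and intercept to the absorbance of the acetyl bromide digest; the coefficients and dilution scheme below are invented placeholders rather than the published extinction coefficient.

        # Invented common regression coefficients: absorbance = slope * (mg lignin / mL) + intercept
        slope, intercept = 23.1, 0.005    # placeholders, not the published values

        def lignin_percent(absorbance, sample_mg, digest_volume_ml=100.0):
            """Estimate lignin (% of sample mass) from acetyl bromide digest absorbance."""
            lignin_mg_per_ml = (absorbance - intercept) / slope
            return 100.0 * lignin_mg_per_ml * digest_volume_ml / sample_mg

        print(f"{lignin_percent(absorbance=0.45, sample_mg=100.0):.1f} % lignin")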

  9. An alternative method for the analysis of melanin production in Cryptococcus neoformans sensu lato and Cryptococcus gattii sensu lato.

    PubMed

    Brilhante, Raimunda S N; España, Jaime D A; de Alencar, Lucas P; Pereira, Vandbergue S; Castelo-Branco, Débora de S C M; Pereira-Neto, Waldemiro de A; Cordeiro, Rossana de A; Sidrim, José J C; Rocha, Marcos F G

    2017-10-01

    Melanin is an important virulence factor for several microorganisms, including Cryptococcus neoformans sensu lato and Cryptococcus gattii sensu lato; thus, the assessment of melanin production and its quantification may contribute to the understanding of microbial pathogenesis. The objective of this study was to standardise an alternative method for the production and indirect quantification of melanin in C. neoformans sensu lato and C. gattii sensu lato. Eight C. neoformans sensu lato strains and three C. gattii sensu lato strains, identified through URA5 methodology, together with Candida parapsilosis ATCC 22019 (negative control) and one Hortaea werneckii strain (positive control), were inoculated on minimal medium agar with or without L-DOPA, in duplicate, and incubated at 35°C for 7 days. Pictures were taken from the third to the seventh day, under standardised conditions in a photographic chamber. The photographs were then analysed as grayscale images. All Cryptococcus spp. strains produced melanin after growth on minimal medium agar containing L-DOPA. C. parapsilosis ATCC 22019 did not produce melanin on medium containing L-DOPA, while H. werneckii presented the strongest pigmentation. This new method allows the indirect analysis of melanin production through pixel quantification in grayscale images, enabling the study of substances that can modulate melanin production. © 2017 Blackwell Verlag GmbH.
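
    Indirect quantification of pigmentation from such photographs can be sketched as averaging grayscale pixel intensity over a colony image and expressing darkness relative to the controls. The arrays below are synthetic stand-ins for the standardized photographs, so the procedure, not the numbers, is the point.

        import numpy as np

        def darkness_score(gray_image):
            """Mean darkness (0 = white, 255 = black) over a grayscale colony image."""
            return 255.0 - np.asarray(gray_image, dtype=float).mean()

        rng = np.random.default_rng(5)
        # Synthetic stand-ins for grayscale photographs of colonies grown on L-DOPA agar
        negative_control = rng.normal(230, 5, size=(200, 200))   # pale colony (C. parapsilosis)
        test_isolate = rng.normal(150, 10, size=(200, 200))      # partially melanized Cryptococcus
        positive_control = rng.normal(60, 10, size=(200, 200))   # heavily melanized H. werneckii

        for label, img in [("negative control", negative_control),
                           ("test isolate", test_isolate),
                           ("positive control", positive_control)]:
            print(f"{label}: darkness = {darkness_score(img):.0f}")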

  10. Ultimate Drivers and Proximate Correlates of Polyandry in Predatory Mites

    PubMed Central

    Schausberger, Peter; Patiño-Ruiz, J. David; Osakabe, Masahiro; Murata, Yasumasa; Sugimoto, Naoya; Uesugi, Ryuji; Walzer, Andreas

    2016-01-01

    Polyandry is more widespread than anticipated from Bateman’s principle but its ultimate (evolutionary) causes and proximate (mechanistic) correlates are more difficult to pinpoint than those of polygyny. Here, we combined mating experiments, quantification of reproductive traits and microsatellite genotyping to determine the fitness implications of polyandry in two predatory mite species, where males are highly polygynous (up to 45 fertilized females during life), whereas females range from monandry to various polyandry levels. The medium-level polyandrous (up to eight male mates possible) Neoseiulus californicus received clear direct and indirect benefits: multiply mated females produced more offspring with higher survival chances over longer times than singly mated females. In contrast, singly and multiply mated females of the low-level polyandrous (commonly two male mates at maximum) Phytoseiulus persimilis produced similar numbers of offspring having similar survival chances. In both species, multiple mating resulted in mixed offspring paternities, opening the chance for indirect fitness benefits such as enhanced genetic compatibility, complementarity and/or variability. However, the female re-mating likelihood and the paternity chance of non-first male mates were lower in P. persimilis than in N. californicus. Regarding proximate factors, in both species first mating duration and female re-mating likelihood were negatively correlated. Based on occasional fertilization failure of first male mates in P. persimilis, and mixed offspring paternities in both species, we argue that fertilization assurance and the chance to gain indirect fitness benefits are the ultimate drivers of polyandry in P. persimilis, whereas those of N. californicus are higher offspring numbers coupled with enhanced offspring viability and possibly other indirect fitness benefits. Overall, the adaptive significance and proximate events well reflected the polyandry levels. Our study provides a key example for linking behavioral experiments, quantification of reproductive traits and paternity analysis via offspring genotyping to explain the evolution of differing levels of polyandry. PMID:27100395

  11. Ultimate Drivers and Proximate Correlates of Polyandry in Predatory Mites.

    PubMed

    Schausberger, Peter; Patiño-Ruiz, J David; Osakabe, Masahiro; Murata, Yasumasa; Sugimoto, Naoya; Uesugi, Ryuji; Walzer, Andreas

    2016-01-01

    Polyandry is more widespread than anticipated from Bateman's principle but its ultimate (evolutionary) causes and proximate (mechanistic) correlates are more difficult to pinpoint than those of polygyny. Here, we combined mating experiments, quantification of reproductive traits and microsatellite genotyping to determine the fitness implications of polyandry in two predatory mite species, where males are highly polygynous (up to 45 fertilized females during life), whereas females range from monandry to various polyandry levels. The medium-level polyandrous (up to eight male mates possible) Neoseiulus californicus received clear direct and indirect benefits: multiply mated females produced more offspring with higher survival chances over longer times than singly mated females. In contrast, singly and multiply mated females of the low-level polyandrous (commonly two male mates at maximum) Phytoseiulus persimilis produced similar numbers of offspring having similar survival chances. In both species, multiple mating resulted in mixed offspring paternities, opening the chance for indirect fitness benefits such as enhanced genetic compatibility, complementarity and/or variability. However, the female re-mating likelihood and the paternity chance of non-first male mates were lower in P. persimilis than in N. californicus. Regarding proximate factors, in both species first mating duration and female re-mating likelihood were negatively correlated. Based on occasional fertilization failure of first male mates in P. persimilis, and mixed offspring paternities in both species, we argue that fertilization assurance and the chance to gain indirect fitness benefits are the ultimate drivers of polyandry in P. persimilis, whereas those of N. californicus are higher offspring numbers coupled with enhanced offspring viability and possibly other indirect fitness benefits. Overall, the adaptive significance and proximate events well reflected the polyandry levels. Our study provides a key example for linking behavioral experiments, quantification of reproductive traits and paternity analysis via offspring genotyping to explain the evolution of differing levels of polyandry.

  12. Saharan dust contribution to PM levels: The EC LIFE+ DIAPASON project

    NASA Astrophysics Data System (ADS)

    Gobbi, G. P.; Wille, H.; Sozzi, R.; Angelini, F.; Barnaba, F.; Costabile, F.; Frey, S.; Bolignano, A.; Di Giosa, A.

    2012-04-01

    The contribution of Saharan-dust advections to both daily and annual PM average values can be significant all over Southern Europe. The largest effects of dust on the number of PM exceedances are observed in polluted areas and large cities. While a wide literature exists documenting episodes of Saharan dust transport towards the Euro-Mediterranean region and Europe in general, only a limited number of studies provide statistically significant results on the impact of Saharan dust on the particulate matter loads over the continent. A four-year (2001-2004) study performed in Rome (Italy) found these events to contribute about 15±10 μg/m3 to the average ground-level PM10 on about 17% of the days in a year. Since the PM10 yearly average of many traffic stations in Rome is close to 40 μg/m3, these events can cause the PM10 concentration to exceed the air quality limit value (50 μg/m3 as a daily average) set by the EU Air Quality Directive 2008/50/EC. Although the European legislation allows Member States to subtract the contribution of natural sources before counting PM10 exceedances, the definition of an optimal methodology to quantitatively assess such a contribution is still in progress. On the basis of the current European Guidelines on the assessment of natural contributions to PM, the DIAPASON project ("Desert-dust Impact on Air quality through model-Predictions and Advanced Sensors ObservatioNs", recently funded under the EC LIFE+ program) has been formulated to provide a robust, user-oriented methodology to assess the presence of desert dust and its contribution to PM levels. To this end, in addition to satellite-based data and model forecasts, the DIAPASON methodology will employ innovative and affordable technologies, partly prototyped within the project itself, such as an operational polarization lidar-ceilometer (laser radar) capable of detecting and profiling dust clouds from the ground up to 10 km altitude. The DIAPASON project (2011-2014) will first be implemented as a network of three stations in the Rome metropolitan area. However, the DIAPASON methodology to detect and quantify the Saharan dust contribution to PM will be designed to be easily applicable by air-quality and meteorological agencies. In fact, the possibility of manufacturing affordable, operational polarization lidar-ceilometers and deploying them across the territory will also represent a breakthrough in the detection and quantification of other atmospheric aerosol layers, such as volcanic or wildfire plumes, with further benefits in terms of weather forecasting, flight safety and air quality assessment.
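
    The exceedance-accounting logic that motivates the project can be shown with a small hypothetical example: subtract an estimated dust contribution from each daily PM10 value before comparing against the 50 μg/m3 limit. The daily values and dust estimates below are invented.

```python
# Illustrative sketch: counting daily PM10 exceedances of the 50 ug/m3 limit
# before and after subtracting an estimated Saharan-dust contribution,
# following the logic of the EU guidance on natural contributions.
# The daily values and dust estimates below are hypothetical.

daily_pm10 = [38, 55, 61, 47, 52, 49, 58]        # ug/m3, measured
dust_contribution = [0, 18, 12, 0, 9, 0, 20]     # ug/m3, estimated dust per day

LIMIT = 50.0
raw_exceedances = sum(1 for v in daily_pm10 if v > LIMIT)
corrected = [v - d for v, d in zip(daily_pm10, dust_contribution)]
net_exceedances = sum(1 for v in corrected if v > LIMIT)

print(f"exceedances before dust subtraction: {raw_exceedances}")
print(f"exceedances after dust subtraction:  {net_exceedances}")
```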

  13. Possible Improvements of the ACE Diversity Interchange Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel V.; Zhou, Ning; Makarov, Yuri V.

    2010-07-26

    The North American Electric Reliability Corporation (NERC) grid is operated by about 131 balancing authorities (BAs). Within each BA, operators are responsible for managing the imbalance caused by both load and wind. As wind penetration levels increase, the challenge of managing power variation increases. Working independently, a balancing authority with limited regulating/load-following generation and high wind power penetration faces significant challenges. The benefits of BA cooperation and consolidation increase when there is significant wind energy penetration. To explore the benefits of BA cooperation, this paper investigates an ACE-sharing approach. A technology called ACE diversity interchange (ADI) is already in use in the Western Interconnection. A new methodology extending ADI is proposed in the paper. The proposed advanced ADI overcomes some limitations of conventional ADI. Simulations using real statistical data from CAISO and BPA have shown the high performance of the proposed advanced ADI methodology.
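
    The intuition behind ACE sharing can be sketched with a toy allocation in which balancing authorities with area control errors (ACE) of opposite sign offset one another, so each regulates only its share of the pooled net ACE. This is an illustrative allocation rule only, not the conventional or advanced ADI algorithm described in the paper, and the ACE values are hypothetical.

```python
# Toy ACE-sharing allocation: the pooled net ACE is redistributed in proportion
# to each BA's share of total |ACE|, so opposite-sign errors cancel and the sum
# of adjusted ACEs still equals the pool net ACE. NOT the production ADI
# algorithm; values are hypothetical (MW).

ace = {"BA1": -120.0, "BA2": 80.0, "BA3": 25.0}

net = sum(ace.values())                      # pooled net ACE
total_abs = sum(abs(v) for v in ace.values())

adjusted = {ba: net * abs(v) / total_abs for ba, v in ace.items()}

for ba in ace:
    relief = ace[ba] - adjusted[ba]
    print(f"{ba}: raw ACE {ace[ba]:+.1f} MW -> adjusted {adjusted[ba]:+.1f} MW "
          f"(relief {relief:+.1f} MW)")
print(f"pool net ACE preserved: {sum(adjusted.values()):.1f} MW")
```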

  14. Quality and Rigor of the Concept Mapping Methodology: A Pooled Study Analysis

    ERIC Educational Resources Information Center

    Rosas, Scott R.; Kane, Mary

    2012-01-01

    The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative…

  15. [Evaluation of medication risk in pregnant women: methodology of evaluation and risk management].

    PubMed

    Eléfant, E; Sainte-Croix, A

    1997-01-01

    This round table discussion was devoted to the description of the tools currently available for the evaluation of drug risks and management during pregnancy. Five topics were submitted for discussion: pre-clinical data, methodological tools, benefit/risk ratio before prescription, teratogenic or fetal risk evaluation, legal comments.

  16. A Big Data Analytics Methodology Program in the Health Sector

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  17. "Charlie: Please Respond!" Using a Participatory Methodology with Individuals on the Autism Spectrum

    ERIC Educational Resources Information Center

    MacLeod, Andrea Georgia; Lewis, Ann; Robertson, Christopher

    2014-01-01

    This paper explores a participatory research approach used with 10 higher education students on the autism spectrum (mainly diagnosed with Asperger syndrome). The methodology sought to overcome barriers to participation. Participants' views were sought on the benefits and challenges related to their participation. Most participants opted for…

  18. Toward a Methodology for Conducting Social Impact Assessments Using Quality of Social Life Indicators.

    ERIC Educational Resources Information Center

    Olsen, Marvin E.; Merwin, Donna J.

    Broadly conceived, social impacts refer to all changes in the structure and functioning of patterned social ordering that occur in conjunction with an environmental, technological, or social innovation or alteration. Departing from the usual cost-benefit analysis approach, a new methodology proposes conducting social impact assessment grounded in…

  19. ENVIRONMENTAL GOODS AND SERVICES FROM RESTORATION ALTERNATIVES: EMERGY-BASED METHOD OF BENEFIT VALUE

    EPA Science Inventory

    Although economic benefit-cost analyses of environmental regulations has been conducted since the 1970s, an effective methodology has yet to be developed for the integrated assessment of regulatory impacts on the larger system as a whole, including its social and environmental as...

  20. Analysis of National Solid Waste Recycling Programs and Development of Solid Waste Recycling Cost Functions: A Summary of the Literature (1999)

    EPA Pesticide Factsheets

    Discusses methodological issues for conducting benefit-cost analysis and provides guidance for selecting and applying the most appropriate and useful mechanisms in benefit-cost analysis of toxic substances, hazardous materials, and solid waste control.

  1. Valuing Non-CO2 GHG Emission Changes in Benefit-Cost Analysis

    EPA Science Inventory

    The climate impacts of greenhouse gas (GHG) emissions impose social costs on society. To date, EPA has not had an approach to estimate the economic benefits of reducing emissions of non-CO2 GHGs (or the costs of increasing them) that is consistent with the methodology underlying...

  2. Behavioural Insights into Benefits Claimants' Training

    ERIC Educational Resources Information Center

    Gloster, Rosie; Buzzeo, Jonathan; Cox, Annette; Bertram, Christine; Tassinari, Arianna; Schmidtke, Kelly Ann; Vlaev, Ivo

    2018-01-01

    Purpose: The purpose of this paper is to explore the behavioural determinants of work-related benefits claimants' training behaviours and to suggest ways to improve claimants' compliance with training referrals. Design/methodology/approach: Qualitative interviews were conducted with 20 Jobcentre Plus staff and training providers, and 60 claimants.…

  3. A Cost-Benefit Methodology for Summative Evaluation.

    ERIC Educational Resources Information Center

    Churchman, David

    Although there is high interest in determining whether or not an educational program provides good value for its cost, it is difficult to make this determination, since people are not generally conceptualized as products and since educational benefits are not easily translated into financial terms. Economic principles suggest that the cost of…

  4. The Bilingual Advantage: Language, Literacy and the US Labor Market

    ERIC Educational Resources Information Center

    Callahan, Rebecca M., Ed.; Gándara, Patricia C., Ed.

    2014-01-01

    The Bilingual Advantage draws together researchers from education, economics, sociology, anthropology and linguistics to examine the economic and employment benefits of bilingualism in the US labor market, countering past research that shows no such benefits exist. Collectively, the authors draw on novel methodological approaches and new data to…

  5. The Lone Wolf Threat: A Different Approach

    DTIC Science & Technology

    2016-03-24

    …with the benefit of hindsight, apply the methodology of action to each study. Chapter Six will present an analytical model used on the… benefit, and demonstrate a profound lack of remorse. It is sometimes referred to as sociopathic personality disorder, or sociopathy. Conversely, someone… prosocial or antisocial. Prosocial behavior is voluntary behavior intended to benefit another, such as assisting, sharing, and comforting others. For the…

  6. A case study using the PrOACT-URL and BRAT frameworks for structured benefit risk assessment.

    PubMed

    Nixon, Richard; Dierig, Christoph; Mt-Isa, Shahrul; Stöckert, Isabelle; Tong, Thaison; Kuhls, Silvia; Hodgson, Gemma; Pears, John; Waddingham, Ed; Hockley, Kimberley; Thomson, Andrew

    2016-01-01

    While benefit-risk assessment is a key component of the drug development and maintenance process, it is often described in a narrative. In contrast, structured benefit-risk assessment builds on established ideas from decision analysis and comprises a qualitative framework and quantitative methodology. We compare two such frameworks, applying multi-criteria decision analysis (MCDA) within the PrOACT-URL framework and weighted net clinical benefit (wNCB) within the BRAT framework. These are applied to a case study of natalizumab for the treatment of relapsing remitting multiple sclerosis. We focus on the practical considerations of applying these methods and give recommendations for visual presentation of results. In the case study, we found structured benefit-risk analysis to be a useful tool for structuring, quantifying, and communicating the relative benefit and safety profiles of drugs in a transparent, rational and consistent way. The two frameworks were similar. MCDA is a generic and flexible methodology that can be used to perform a structured benefit-risk assessment in any common context. wNCB is a special case of MCDA and is shown to be equivalent to an extension of the number needed to treat (NNT) principle. It is simpler to apply and understand than MCDA and can be applied when all outcomes are measured on a binary scale. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
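
    A minimal MCDA-style calculation, in the spirit of the frameworks compared above: each option receives a weighted sum of criterion scores on a common scale. The criteria, weights and scores below are hypothetical and do not reproduce the natalizumab case study.

```python
import numpy as np

# Minimal MCDA sketch: overall score = sum over criteria of weight * score.
# Criteria, weights, and scores are hypothetical placeholders.

criteria = ["relapse reduction", "disability progression", "PML risk", "infusion reactions"]
weights = np.array([0.4, 0.3, 0.2, 0.1])        # criterion weights, summing to 1

# Rows: treatment options; columns: criteria scored on a common 0-100 scale
# (higher = more favourable, so risk criteria are scored in reversed direction).
scores = np.array([
    [80, 70, 40, 60],   # drug
    [30, 35, 95, 90],   # comparator
])

overall = scores @ weights
for name, s in zip(["drug", "comparator"], overall):
    print(f"{name}: weighted benefit-risk score = {s:.1f}")
```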

  7. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
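
    The empirical hazard-curve construction can be sketched as follows: for each intensity threshold, the mean annual rate of exceedance is the assumed annual event rate multiplied by the fraction of simulated events whose inundation depth exceeds the threshold. The event rate and simulated depths below are synthetic placeholders, and no Bayesian fitting is attempted.

```python
import numpy as np

# Sketch of an empirical tsunami hazard curve from simulated inundation depths.
# All numbers are hypothetical.

rng = np.random.default_rng(0)
annual_event_rate = 0.01                     # tsunamigenic events per year (assumed)
simulated_depths = rng.lognormal(mean=0.0, sigma=1.0, size=5000)  # depth (m) per simulated event

thresholds = np.array([0.5, 1.0, 2.0, 5.0])  # inundation-depth thresholds (m)
for h in thresholds:
    rate = annual_event_rate * np.mean(simulated_depths > h)
    print(f"depth > {h:.1f} m: mean annual rate of exceedance = {rate:.2e} /yr")
```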

  8. ICP-MS/MS-Based Ionomics: A Validated Methodology to Investigate the Biological Variability of the Human Ionome.

    PubMed

    Konz, Tobias; Migliavacca, Eugenia; Dayon, Loïc; Bowman, Gene; Oikonomidi, Aikaterini; Popp, Julius; Rezzi, Serge

    2017-05-05

    We here describe the development, validation and application of a quantitative methodology for the simultaneous determination of 29 elements in human serum using state-of-the-art inductively coupled plasma triple quadrupole mass spectrometry (ICP-MS/MS). This new methodology offers high-throughput elemental profiling using simple dilution of minimal quantity of serum samples. We report the outcomes of the validation procedure including limits of detection/quantification, linearity of calibration curves, precision, recovery and measurement uncertainty. ICP-MS/MS-based ionomics was used to analyze human serum of 120 older adults. Following a metabolomic data mining approach, the generated ionome profiles were subjected to principal component analysis revealing gender and age-specific differences. The ionome of female individuals was marked by higher levels of calcium, phosphorus, copper and copper to zinc ratio, while iron concentration was lower with respect to male subjects. Age was associated with lower concentrations of zinc. These findings were complemented with additional readouts to interpret micronutrient status including ceruloplasmin, ferritin and inorganic phosphate. Our data supports a gender-specific compartmentalization of the ionome that may reflect different bone remodelling in female individuals. Our ICP-MS/MS methodology enriches the panel of validated "Omics" approaches to study molecular relationships between the exposome and the ionome in relation with nutrition and health.
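
    A sketch of the ionome data-mining step described above: principal component analysis of a subjects-by-elements concentration matrix after standardization. The matrix here is random placeholder data rather than the study's serum measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Placeholder data: 120 subjects x 29 elements, lognormal to mimic concentrations.
rng = np.random.default_rng(1)
X = rng.lognormal(mean=0.0, sigma=0.5, size=(120, 29))

X_std = StandardScaler().fit_transform(X)    # centre and scale each element
pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)            # subject scores on the first two PCs

print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
print("first two PC scores of subject 0:", np.round(scores[0], 2))
```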

  9. Quantitative evaluation of alternatively spliced mRNA isoforms by label-free real-time plasmonic sensing.

    PubMed

    Huertas, César S; Carrascosa, L G; Bonnal, S; Valcárcel, J; Lechuga, L M

    2016-04-15

    Alternative splicing of mRNA precursors enables cells to generate different protein outputs from the same gene depending on their developmental or homeostatic status. Its deregulation is strongly linked to disease onset and progression. Current methodologies for monitoring alternative splicing demand elaborate procedures and often present difficulties in discerning between closely related isoforms, e.g. due to cross-hybridization during their detection. Herein, we report a general methodology using a Surface Plasmon Resonance (SPR) biosensor for label-free monitoring of alternative splicing events in real-time, without any cDNA synthesis or PCR amplification requirements. We applied this methodology to RNA isolated from HeLa cells for the quantification of alternatively spliced isoforms of the Fas gene, involved in cancer progression through regulation of programmed cell death. We demonstrate that our methodology is isoform-specific, with virtually no cross-hybridization, achieving limits of detection (LODs) in the picoMolar (pM) range. Similar results were obtained for the detection of the BCL-X gene mRNA isoforms. The results were independently validated by RT-qPCR, with excellent concordance in the determination of isoform ratios. The simplicity and robustness of this biosensor technology can greatly facilitate the exploration of alternative splicing biomarkers in disease diagnosis and therapy. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Counting the cost: estimating the economic benefit of pedophile treatment programs.

    PubMed

    Shanahan, M; Donato, R

    2001-04-01

    The principal objective of this paper is to identify the economic costs and benefits of pedophile treatment programs, incorporating both the tangible and intangible costs of sexual abuse to victims. Cost estimates of cognitive behavioral therapy programs in Australian prisons are compared against the tangible and intangible costs to victims of being sexually abused. Estimates are prepared that take into account a number of problematic issues. These include the range of possible recidivism rates for treatment programs; the uncertainty surrounding the number of child sexual molestation offences committed by recidivists; and the methodological problems associated with estimating the intangible costs of sexual abuse on victims. Despite the variation in parameter estimates that impact on the cost-benefit analysis of pedophile treatment programs, it is found that the potential range of economic costs from child sexual abuse is substantial and that the economic benefits to be derived from appropriate and effective treatment programs are high. Based on a reasonable set of parameter estimates, in-prison, cognitive therapy treatment programs for pedophiles are likely to be of net benefit to society. Despite this, a critical area of future research must include further methodological developments in estimating the quantitative impact of child sexual abuse in the community.

  11. Production, characterization, and mechanical behavior of cementitious materials incorporating carbon nanofibers

    NASA Astrophysics Data System (ADS)

    Yazdanbakhsh, Ardavan

    Carbon nanotubes (CNTs) and carbon nanofibers (CNFs) have excellent properties (mechanical, electrical, magnetic, etc.), which can make them effective nanoreinforcements for improving the properties of materials. The incorporation of CNT/Fs in a wide variety of materials has been researched extensively in the past decade. However, past studies on the reinforcement of cementitious materials with these nanofilaments have been limited. The findings from those studies indicate that CNT/Fs did not significantly improve the mechanical properties of cementitious materials. Two major parameters influence the effectiveness of any discrete inclusion in a composite material: the dispersion quality of the inclusions and the interfacial bond between the inclusions and the matrix. The main focus of this dissertation is on the dispersion factor, and it consists of three main tasks: first, a novel thermodynamic-based method for dispersion quantification was developed. Second, a new method, incorporating the utilization of silica fume, was devised to improve and stabilize the dispersion of CNFs in cement paste. And third, the dispersion quantification method and mechanical testing were employed to measure, compare, and correlate the dispersion and mechanical properties of CNF-incorporated cement paste produced with the conventional and new methods. Finally, the main benefits obtained from the utilization of CNFs in cement paste, including the increase in strength and resistance to shrinkage cracking, are presented. The investigations and the corresponding results show that the novel dispersion quantification method can be implemented easily to perform a wide variety of tasks, ranging from measuring the dispersion of nanofilaments in composites using their optical/SEM micrographs as input, to measuring the effect of cement particle/clump size on the dispersion of nano inclusions in cement paste. It was found that cement particles do not significantly affect the dispersion of nano inclusions in cement paste, while the dispersion of nano inclusions can degenerate notably if the cement particles are agglomerated. The novel dispersion quantification method shows that the dispersion of CNFs in cement paste improves significantly when silica fume is utilized. However, it was found that the dispersion of the silica fume particles is an important parameter, and poorly dispersed silica fume cannot enhance the overall dispersion of nano inclusions in cementitious materials. Finally, mechanical testing and experiments showed that CNFs, in the absence of moist curing, even if poorly dispersed, can provide important benefits in terms of strength and crack resistance.

  12. A new scenario-based approach to damage detection using operational modal parameter estimates

    NASA Astrophysics Data System (ADS)

    Hansen, J. B.; Brincker, R.; López-Aenlle, M.; Overgaard, C. F.; Kloborg, K.

    2017-09-01

    In this paper a vibration-based damage localization and quantification method, based on natural frequencies and mode shapes, is presented. The proposed technique is inspired by a damage assessment methodology based solely on the sensitivity of mass-normalized, experimentally determined mode shapes. The present method differs by being based on modal data extracted by means of Operational Modal Analysis (OMA), combined with a reasonable Finite Element (FE) representation of the test structure, and implemented in a scenario-based framework. Besides a review of the basic methodology, this paper addresses fundamental theoretical as well as practical considerations which are crucial to the applicability of a given vibration-based damage assessment configuration. Lastly, the technique is demonstrated on an experimental test case using automated OMA. Both the numerical study and the experimental test case presented in this paper are restricted to perturbations concerning mass change.

  13. Quantification of DNA-associated proteins inside eukaryotic cells using single-molecule localization microscopy

    PubMed Central

    Etheridge, Thomas J.; Boulineau, Rémi L.; Herbert, Alex; Watson, Adam T.; Daigaku, Yasukazu; Tucker, Jem; George, Sophie; Jönsson, Peter; Palayret, Matthieu; Lando, David; Laue, Ernest; Osborne, Mark A.; Klenerman, David; Lee, Steven F.; Carr, Antony M.

    2014-01-01

    Development of single-molecule localization microscopy techniques has allowed nanometre scale localization accuracy inside cells, permitting the resolution of ultra-fine cell structure and the elucidation of crucial molecular mechanisms. Application of these methodologies to understanding processes underlying DNA replication and repair has been limited to defined in vitro biochemical analysis and prokaryotic cells. In order to expand these techniques to eukaryotic systems, we have further developed a photo-activated localization microscopy-based method to directly visualize DNA-associated proteins in unfixed eukaryotic cells. We demonstrate that motion blurring of fluorescence due to protein diffusivity can be used to selectively image the DNA-bound population of proteins. We designed and tested a simple methodology and show that it can be used to detect changes in DNA binding of a replicative helicase subunit, Mcm4, and the replication sliding clamp, PCNA, between different stages of the cell cycle and between distinct genetic backgrounds. PMID:25106872

  14. Standards and Methodologies for Characterizing Radiobiological Impact of High-Z Nanoparticles

    PubMed Central

    Subiel, Anna; Ashmore, Reece; Schettino, Giuseppe

    2016-01-01

    Research on the application of high-Z nanoparticles (NPs) in cancer treatment and diagnosis has recently been the subject of growing interest, with much promise being shown with regards to a potential transition into clinical practice. In spite of numerous publications related to the development and application of nanoparticles for use with ionizing radiation, the literature is lacking coherent and systematic experimental approaches to fully evaluate the radiobiological effectiveness of NPs, validate mechanistic models and allow direct comparison of the studies undertaken by various research groups. The lack of standards and established methodology is commonly recognised as a major obstacle for the transition of innovative research ideas into clinical practice. This review provides a comprehensive overview of radiobiological techniques and quantification methods used in in vitro studies on high-Z nanoparticles and aims to provide recommendations for future standardization for NP-mediated radiation research. PMID:27446499

  15. Quantitative Analysis of Motor Status in Parkinson's Disease Using Wearable Devices: From Methodological Considerations to Problems in Clinical Applications.

    PubMed

    Suzuki, Masahiko; Mitoma, Hiroshi; Yoneyama, Mitsuru

    2017-01-01

    Long-term and objective monitoring is necessary for full assessment of the condition of patients with Parkinson's disease (PD). Recent advances in biotechnology have seen the development of various types of wearable (body-worn) sensor systems. By using accelerometers and gyroscopes, these devices can quantify motor abnormalities, including decreased activity and gait disturbances, as well as nonmotor signs, such as sleep disturbances and autonomic dysfunctions in PD. This review discusses methodological problems inherent in wearable devices. Until now, analysis of the mean values of motion-induced signals on a particular day has been widely applied in the clinical management of PD patients. On the other hand, the reliability of these devices to detect various events, such as freezing of gait and dyskinesia, has been less than satisfactory. Quantification of disease-specific changes rather than nonspecific changes is necessary.

  16. Space-Time Data fusion for Remote Sensing Applications

    NASA Technical Reports Server (NTRS)

    Braverman, Amy; Nguyen, H.; Cressie, N.

    2011-01-01

    NASA has been collecting massive amounts of remote sensing data about Earth's systems for more than a decade. Missions are selected to be complementary in quantities measured, retrieval techniques, and sampling characteristics, so these datasets are highly synergistic. To fully exploit this, a rigorous methodology for combining data with heterogeneous sampling characteristics is required. For scientific purposes, the methodology must also provide quantitative measures of uncertainty that propagate input-data uncertainty appropriately. We view this as a statistical inference problem. The true but not directly observed quantities form a vector-valued field continuous in space and time. Our goal is to infer those true values or some function of them, and to provide uncertainty quantification for those inferences. We use a spatiotemporal statistical model that relates the unobserved quantities of interest at point-level to the spatially aggregated, observed data. We describe and illustrate our method using CO2 data from two NASA data sets.

  17. Analytical Methodologies for the Determination of Endocrine Disrupting Compounds in Biological and Environmental Samples

    PubMed Central

    Sosa-Ferrera, Zoraida; Mahugo-Santana, Cristina; Santana-Rodríguez, José Juan

    2013-01-01

    Endocrine-disruptor compounds (EDCs) can mimic natural hormones and produce adverse effects in the endocrine functions by interacting with estrogen receptors. EDCs include both natural and synthetic chemicals, such as hormones, personal care products, surfactants, and flame retardants, among others. EDCs are characterised by their ubiquitous presence at trace-level concentrations and their wide diversity. Since the discovery of the adverse effects of these pollutants on wildlife and human health, analytical methods have been developed for their qualitative and quantitative determination. In particular, mass-based analytical methods show excellent sensitivity and precision for their quantification. This paper reviews recently published analytical methodologies for the sample preparation and for the determination of these compounds in different environmental and biological matrices by liquid chromatography coupled with mass spectrometry. The various sample preparation techniques are compared and discussed. In addition, recent developments and advances in this field are presented. PMID:23738329

  18. Quantitative Analysis of Motor Status in Parkinson's Disease Using Wearable Devices: From Methodological Considerations to Problems in Clinical Applications

    PubMed Central

    2017-01-01

    Long-term and objective monitoring is necessary for full assessment of the condition of patients with Parkinson's disease (PD). Recent advances in biotechnology have seen the development of various types of wearable (body-worn) sensor systems. By using accelerometers and gyroscopes, these devices can quantify motor abnormalities, including decreased activity and gait disturbances, as well as nonmotor signs, such as sleep disturbances and autonomic dysfunctions in PD. This review discusses methodological problems inherent in wearable devices. Until now, analysis of the mean values of motion-induced signals on a particular day has been widely applied in the clinical management of PD patients. On the other hand, the reliability of these devices to detect various events, such as freezing of gait and dyskinesia, has been less than satisfactory. Quantification of disease-specific changes rather than nonspecific changes is necessary. PMID:28607801

  19. High resolution earth observation from geostationary orbit by optical aperture synthesis

    NASA Astrophysics Data System (ADS)

    Mesrine, M.; Thomas, E.; Garin, S.; Blanc, P.; Alis, C.; Cassaing, F.; Laubier, D.

    2017-11-01

    In this paper, we describe Optical Aperture Synthesis (OAS) imaging instrument concepts studied by Alcatel Alenia Space under a CNES R&T contract in terms of technical feasibility. First, the methodology to select the aperture configuration is proposed, based on the definition and quantification of image quality criteria adapted to an OAS instrument for direct imaging of extended objects. The following section presents, for each interferometer type (Michelson and Fizeau), the corresponding optical configurations compatible with a large field of view from GEO orbit. These optical concepts take into account the constraints imposed by the foreseen resolution and the implementation of the co-phasing functions. The fourth section is dedicated to the analysis of the co-phasing methodologies, from the configuration deployment to the fine stabilization during observation. Finally, we present a trade-off analysis allowing selection of the concept with respect to the mission specification and the constraints related to instrument accommodation under the launcher shroud and in-orbit deployment.

  20. Optimization of ultrasonic-assisted extraction of antioxidant compounds from Guava (Psidium guajava L.) leaves using response surface methodology

    PubMed Central

    Kong, Fansheng; Yu, Shujuan; Feng, Zeng; Wu, Xinlan

    2015-01-01

    Objective: To optimize the extraction of antioxidant compounds from guava (Psidium guajava L.) leaves and to show that guava leaves are a potential source of antioxidant compounds. Materials and Methods: The bioactive polysaccharide compounds of guava leaves (P. guajava L.) were obtained using ultrasonic-assisted extraction. Extraction was carried out according to a Box-Behnken central composite design, and the independent variables were temperature (20–60°C), time (20–40 min) and power (200–350 W). The extraction process was optimized using response surface methodology for the highest crude extraction yield of bioactive polysaccharide compounds. Results: The optimal conditions were identified as 55°C, 30 min, and 240 W. 1,1-Diphenyl-2-picryl-hydrazyl and hydroxyl free radical scavenging assays were conducted. Conclusion: The results of quantification showed that guava leaves are a potential source of antioxidant compounds. PMID:26246720

  1. Optimization of ultrasonic-assisted extraction of antioxidant compounds from Guava (Psidium guajava L.) leaves using response surface methodology.

    PubMed

    Kong, Fansheng; Yu, Shujuan; Feng, Zeng; Wu, Xinlan

    2015-01-01

    To optimize the extraction of antioxidant compounds from guava (Psidium guajava L.) leaves and to show that guava leaves are a potential source of antioxidant compounds. The bioactive polysaccharide compounds of guava leaves (P. guajava L.) were obtained using ultrasonic-assisted extraction. Extraction was carried out according to a Box-Behnken central composite design, and the independent variables were temperature (20-60°C), time (20-40 min) and power (200-350 W). The extraction process was optimized using response surface methodology for the highest crude extraction yield of bioactive polysaccharide compounds. The optimal conditions were identified as 55°C, 30 min, and 240 W. 1,1-Diphenyl-2-picryl-hydrazyl and hydroxyl free radical scavenging assays were conducted. The results of quantification showed that guava leaves are a potential source of antioxidant compounds.
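
    The response-surface step common to both records above can be illustrated by fitting a full quadratic model in two coded factors and locating its stationary point. The design points, yields and factor coding below are invented for illustration, not the study's Box-Behnken data.

```python
import numpy as np

# Response-surface sketch for two coded factors x1, x2 (e.g. temperature, time):
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2.
# Design points and yields are hypothetical.

x1 = np.array([-1, 1, -1, 1, 0, 0, 0, -1, 1])
x2 = np.array([-1, -1, 1, 1, 0, -1, 1, 0, 0])
y  = np.array([5.1, 6.0, 5.6, 6.2, 7.1, 6.3, 6.5, 6.1, 6.6])   # extraction yield (%)

# Design matrix for the full quadratic model, fitted by least squares
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point (candidate optimum) of the fitted quadratic surface
b1, b2, b11, b22, b12 = coef[1], coef[2], coef[3], coef[4], coef[5]
A = np.array([[2 * b11, b12], [b12, 2 * b22]])
opt = np.linalg.solve(A, -np.array([b1, b2]))

print("fitted coefficients:", np.round(coef, 3))
print("stationary point (coded units):", np.round(opt, 2))
```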

  2. Dual Contrast - Magnetic Resonance Fingerprinting (DC-MRF): A Platform for Simultaneous Quantification of Multiple MRI Contrast Agents.

    PubMed

    Anderson, Christian E; Donnola, Shannon B; Jiang, Yun; Batesole, Joshua; Darrah, Rebecca; Drumm, Mitchell L; Brady-Kalnay, Susann M; Steinmetz, Nicole F; Yu, Xin; Griswold, Mark A; Flask, Chris A

    2017-08-16

    Injectable Magnetic Resonance Imaging (MRI) contrast agents have been widely used to provide critical assessments of disease for both clinical and basic science imaging research studies. The scope of available MRI contrast agents has expanded over the years with the emergence of molecular imaging contrast agents specifically targeted to biological markers. Unfortunately, synergistic application of more than a single molecular contrast agent has been limited by MRI's ability to only dynamically measure a single agent at a time. In this study, a new Dual Contrast - Magnetic Resonance Fingerprinting (DC-MRF) methodology is described that can detect and independently quantify the local concentration of multiple MRI contrast agents following simultaneous administration. This "multi-color" MRI methodology provides the opportunity to monitor multiple molecular species simultaneously and provides a practical, quantitative imaging framework for the eventual clinical translation of molecular imaging contrast agents.

  3. The Role of Leisure Engagement for Health Benefits Among Korean Older Women.

    PubMed

    Kim, Junhyoung; Irwin, Lori; Kim, May; Chin, Seungtae; Kim, Jun

    2015-01-01

    This qualitative study was designed to examine the benefits of leisure to older Korean women. Using a constructive grounded theory methodology, in this study we identified three categories of benefits from leisure activities: (a) developing social connections, (b) enhancing psychological well-being, and (c) improving physical health. The findings of this study demonstrate that involvement in leisure activities offers substantial physical, psychological, and social benefits for older Korean women. The results also suggest that these benefits can provide an opportunity for older Korean adults to improve their health and well-being, which, in turn, may help promote successful aging.

  4. Joint distribution approaches to simultaneously quantifying benefit and risk.

    PubMed

    Shaffer, Michele L; Watterberg, Kristi L

    2006-10-12

    The benefit-risk ratio has been proposed to measure the tradeoff between benefits and risks of two therapies for a single binary measure of efficacy and a single adverse event. The ratio is calculated from the difference in risk and difference in benefit between therapies. Small sample sizes or expected differences in benefit or risk can lead to no solution or problematic solutions for confidence intervals. Alternatively, using the joint distribution of benefit and risk, confidence regions for the differences in risk and benefit can be constructed in the benefit-risk plane. The information in the joint distribution can be summarized by choosing regions of interest in this plane. Using Bayesian methodology provides a very flexible framework for summarizing information in the joint distribution. Data from a National Institute of Child Health & Human Development trial of hydrocortisone illustrate the construction of confidence regions and regions of interest in the benefit-risk plane, where benefit is survival without supplemental oxygen at 36 weeks postmenstrual age, and risk is gastrointestinal perforation. For the subgroup of infants exposed to chorioamnionitis the confidence interval based on the benefit-risk ratio is wide (Benefit-risk ratio: 1.52; 90% confidence interval: 0.23 to 5.25). Choosing regions of appreciable risk and acceptable risk in the benefit-risk plane confirms the uncertainty seen in the wide confidence interval for the benefit-risk ratio--there is a greater than 50% chance of falling into the region of acceptable risk--while visually allowing the uncertainty in risk and benefit to be shown separately. Applying Bayesian methodology, an incremental net health benefit analysis shows there is a 72% chance of having a positive incremental net benefit if hydrocortisone is used in place of placebo if one is willing to incur at most one gastrointestinal perforation for each additional infant that survives without supplemental oxygen. If the benefit-risk ratio is presented, the joint distribution of benefit and risk also should be shown. These regions avoid the ambiguity associated with collapsing benefit and risk to a single dimension. Bayesian methods allow even greater flexibility in simultaneously quantifying benefit and risk.
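
    A minimal sketch of the Bayesian approach described above: draw benefit and risk proportions from independent beta posteriors for each arm and estimate the probability of a positive incremental net benefit for a chosen benefit-risk trade-off. The counts and flat priors are assumptions for illustration, not the trial data or the authors' exact model.

```python
import numpy as np

# Beta posteriors for benefit and risk proportions in each arm, posterior
# differences, and P(incremental net benefit > 0) for a chosen trade-off.
# Counts and flat priors are hypothetical.

rng = np.random.default_rng(42)
n_draws = 100_000

# Hypothetical (events, subjects): benefit = survival without supplemental O2,
# risk = gastrointestinal perforation, for treatment and control arms.
benefit_trt, benefit_ctl = (40, 100), (30, 100)
risk_trt, risk_ctl = (6, 100), (2, 100)

def posterior(events, n):
    # Beta(1, 1) prior -> Beta(events + 1, n - events + 1) posterior
    return rng.beta(events + 1, n - events + 1, n_draws)

d_benefit = posterior(*benefit_trt) - posterior(*benefit_ctl)   # benefit difference
d_risk = posterior(*risk_trt) - posterior(*risk_ctl)            # risk difference

# Willing to accept at most one extra perforation per extra survivor -> lambda = 1
lam = 1.0
net_benefit = d_benefit - lam * d_risk

print(f"posterior mean benefit difference: {d_benefit.mean():+.3f}")
print(f"posterior mean risk difference:    {d_risk.mean():+.3f}")
print(f"P(incremental net benefit > 0):    {np.mean(net_benefit > 0):.2f}")
```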

  5. [Electronic versus paper-based patient records: a cost-benefit analysis].

    PubMed

    Neubauer, A S; Priglinger, S; Ehrt, O

    2001-11-01

    The aim of this study is to compare the costs and benefits of electronic, paperless patient records with those of conventional paper-based charts. Costs and benefits of planned electronic patient records are calculated for a University eye hospital with 140 beds. Benefit is determined by the direct costs saved by electronic records. In the example shown, the additional benefits of electronic patient records, as far as they can be quantified, total 192,000 DM per year. The costs of the necessary investments are 234,000 DM per year when using a linear depreciation over 4 years. In total, there are additional annual costs for electronic patient records of 42,000 DM. Different scenarios were analyzed. By increasing the time of depreciation to 6 years, the cost deficit reduces to only approximately 9,000 DM. Increased wages reduce the deficit further, while the deficit increases with a loss of functions of the electronic patient record. However, several benefits of electronic records regarding research, teaching, quality control and better data access cannot be easily quantified and would greatly increase the benefit-to-cost ratio. Only part of the advantages of electronic patient records can easily be quantified in terms of directly saved costs. The small cost deficit calculated in this example is overcompensated by several benefits, which can only be enumerated qualitatively due to problems in quantification.
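
    The reported figures can be reconciled with a simple depreciation calculation. Note that the split between the one-off investment and the running costs below is back-calculated from the abstract's numbers (deficit of 42,000 DM at 4-year and roughly 9,000 DM at 6-year depreciation) and is therefore an inference, not a figure stated by the authors.

```python
# Back-of-the-envelope reconciliation of the reported figures (DM per year).
# The investment/operating split is inferred from the abstract's numbers.

annual_benefit = 192_000
annual_cost_4yr = 234_000            # with linear depreciation over 4 years

# Inferred: a deficit drop of ~33,000 DM when moving from 4- to 6-year
# depreciation implies I/4 - I/6 = I/12 = 33,000 -> investment I of ~396,000 DM.
investment = 396_000
operating = annual_cost_4yr - investment / 4     # ~135,000 DM/yr running costs

for years in (4, 6):
    annual_cost = investment / years + operating
    print(f"{years}-year depreciation: annual cost {annual_cost:,.0f} DM, "
          f"deficit {annual_cost - annual_benefit:,.0f} DM")
```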

  6. Haptic Technologies for MEMS Design

    NASA Astrophysics Data System (ADS)

    Calis, Mustafa; Desmulliez, Marc P. Y.

    2006-04-01

    This paper presents for the first time a design methodology for MEMS/NEMS based on haptic sensing technologies. The software tool created as a result of this methodology will enable designers to model and interact in real time with their virtual prototype. One of the main advantages of haptic sensing is the ability to bring unusual microscopic forces back to the designer's world. Other significant benefits of developing such a methodology include productivity gains and the capability to include manufacturing costs within the design cycle.

  7. An integrated methodology to assess the benefits of urban green space.

    PubMed

    De Ridder, K; Adamec, V; Bañuelos, A; Bruse, M; Bürger, M; Damsgaard, O; Dufek, J; Hirsch, J; Lefebre, F; Pérez-Lacorzana, J M; Thierry, A; Weber, C

    2004-12-01

    The interrelated issues of urban sprawl, traffic congestion, noise, and air pollution are major socioeconomic problems faced by most European cities. A methodology is currently being developed for evaluating the role of green space and urban form in alleviating the adverse effects of urbanisation, mainly focusing on the environment but also accounting for socioeconomic aspects. The objectives and structure of the methodology are briefly outlined and illustrated with preliminary results obtained from case studies performed on several European cities.

  8. Research Plan for Fire Signatures and Detection

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Viewgraphs on the prevention, suppression, and detection of fires aboard a spacecraft are presented. The topics include: 1) Fire Prevention, Detection, and Suppression Sub-Element Products; 2) FPDS Organizing Questions; 3) FPDS Organizing Questions; 4) Signatures, Sensors, and Simulations; 5) Quantification of Fire and Pre-Fire Signatures; 6) Smoke; 7) DAFT Hardware; 8) Additional Benefits of DAFT; 9) Development and Characterization of Sensors 10) Simulation of the Transport of Smoke and Fire Precursors; and 11) FPDS Organizing Questions.

  9. Cutaneous T cell lymphoma: Current practices in blood assessment and the utility of T-cell receptor Vβ chain restriction

    PubMed Central

    Gibson, Juliet F; Huang, Jing; Liu, Kristina J; Carlson, Kacie R; Foss, Francine; Choi, Jaehyuk; Edelson, Richard; Hussong, Jerry W.; Mohl, Ramsey; Hill, Sally; Girardi, Sally

    2016-01-01

    Background Accurate quantification of malignant cells in the peripheral blood of patients with cutaneous T cell lymphoma (CTCL) is important for early detection, prognosis, and monitoring disease burden. Objective Determine the spectrum of current clinical practices; critically evaluate elements of current ISCL B1 and B2 staging criteria; and assess the potential role of TCR-Vβ analysis by flow cytometry. Methods We assessed current clinical practices by survey, and performed a retrospective analysis of 161 patients evaluated at Yale (2011-2014) to compare the sensitivity, specificity, PPV, and NPV of parameters for ISCL B2 staging. Results There was heterogeneity in clinical practices among institutions. ISCL B1 criteria did not capture five Yale cohort patients with immunophenotypic abnormalities who later progressed. TCR-Vβ testing was more specific than PCR and aided diagnosis in detecting clonality, but was of limited benefit in quantification of tumor burden. Limitations Because of limited follow-up involving a single center, further investigation will be necessary to conclude whether our proposed diagnostic algorithm is of general clinical benefit. Conclusion We propose further study of “modified B1 criteria”: CD4/CD8 ratio ≥5, %CD4+/CD26- ≥ 20%, %CD4+/CD7- ≥ 20%, with evidence of clonality. TCR-Vβ testing should be considered in future diagnostic and staging algorithms. PMID:26874819
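
    The proposed "modified B1 criteria" can be expressed as a small decision rule. Whether the three immunophenotypic thresholds are combined with a logical OR (as sketched here) or must all be met is an assumption on our part, and the example values are hypothetical.

```python
# Sketch of a decision rule for the proposed "modified B1" thresholds.
# The OR combination of the immunophenotypic thresholds is an assumption;
# clonality is assumed to be assessed separately (e.g. TCR-Vbeta or PCR).

def meets_modified_b1(cd4_cd8_ratio: float, pct_cd4_cd26_neg: float,
                      pct_cd4_cd7_neg: float, clonal: bool) -> bool:
    immunophenotype_abnormal = (cd4_cd8_ratio >= 5
                                or pct_cd4_cd26_neg >= 20
                                or pct_cd4_cd7_neg >= 20)
    return immunophenotype_abnormal and clonal

# Hypothetical flow-cytometry readouts for one patient
print(meets_modified_b1(cd4_cd8_ratio=6.2, pct_cd4_cd26_neg=25,
                        pct_cd4_cd7_neg=12, clonal=True))
```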

  10. Determination of hydroxylated polychlorinated biphenyls (HO-PCBs) in blood plasma by high-performance liquid chromatography-electrospray ionization-tandem quadrupole mass spectrometry.

    PubMed

    Letcher, R J; Li, H X; Chu, S G

    2005-01-01

    Hydroxylated metabolites of polychlorinated biphenyls (HO-PCBs) and pentachlorophenol (PCP) are halogenated phenolic compounds, and they are increasingly common as environmental contaminants, mainly in the blood of wildlife and humans. A methodology based on high-performance liquid chromatography (reversed-phase)-electrospray (negative) ionization-tandem quadrupole mass spectrometry (LC-ESI(-)-MS-MS) in the select ion monitoring or multiple reaction monitoring modes was developed for HO-PCB and PCP determination in blood plasma and serum. Among 11 environmentally relevant HO-PCB congeners and PCP spiked into fetal calf serum, quantitative assessments, including matrix effects on ESI(-) suppression/enhancement, showed process (recovery) efficiencies of 73% to 89% without internal standard (IS) correction, and 88% to 103% with IS correction, and method limits of quantification ranging from 1 to 50 pg/g (wet weight). Using the developed LC-ESI(-)-MS methodology in comparison with GC-MS and GC-ECD based approaches, similar results were found for HO-PCB identification and quantification in the plasma of polar bear (Ursus maritimus) from the Canadian arctic. LC-ESI(-)-MS identified four HO-PCB congeners [4'-HO-2,2',4,6,6'-pentachlorobiphenyl (4'-HO-CB104), 4-HO-2,3,3',4',5-pentachlorobiphenyl (4-HO-CB107), 4-HO-2,3,3',5,5',6-hexachlorobiphenyl (4-HO-CB165) and 3'-HO-2,2',3,4,4',5,5'-heptachlorobiphenyl (3'-HO-CB180)], and 14 additional tetra- to hepta-chlorinated HO-PCB isomers in the polar bear plasma.

  11. Assessing flow paths in a karst aquifer based on multiple dye tracing tests using stochastic simulation and the MODFLOW-CFP code

    NASA Astrophysics Data System (ADS)

    Assari, Amin; Mohammadi, Zargham

    2017-09-01

    Karst systems show high spatial variability of hydraulic parameters over small distances and this makes their modeling a difficult task with several uncertainties. Interconnections of fractures play a major role in the transport of groundwater, but many of the stochastic methods in use do not have the capability to reproduce these complex structures. A methodology is presented for the quantification of tortuosity using the single normal equation simulation (SNESIM) algorithm and a groundwater flow model. A training image was produced based on the statistical parameters of fractures and then used in the simulation process. The SNESIM algorithm was used to generate 75 realizations of the four classes of fractures in a karst aquifer in Iran. The results from six dye tracing tests were used to assign hydraulic conductivity values to each class of fractures. In the next step, the MODFLOW-CFP and MODPATH codes were consecutively implemented to compute the groundwater flow paths. The 9,000 flow paths obtained from the MODPATH code were further analyzed to calculate the tortuosity factor. Finally, the hydraulic conductivity values calculated from the dye tracing experiments were refined using the actual flow paths of groundwater. The key outcomes of this research are: (1) a methodology for the quantification of tortuosity; (2) hydraulic conductivities that are incorrectly estimated (biased low) with empirical equations that assume Darcian (laminar) flow with parallel rather than tortuous streamlines; and (3) an understanding of the scale-dependence and non-normal distributions of tortuosity.
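
    The tortuosity factor at the core of the methodology can be illustrated directly: for a particle-tracked flow path, tortuosity is the along-path length divided by the straight-line distance between its endpoints. The coordinates below are invented rather than MODPATH output.

```python
import numpy as np

# Tortuosity of one flow path = path length / straight-line endpoint distance.
# The coordinates below are a made-up path (x, y, z in metres).

path = np.array([[0.0, 0.0, 0.0],
                 [10.0, 2.0, -0.5],
                 [18.0, 9.0, -1.0],
                 [25.0, 12.0, -1.2],
                 [40.0, 13.0, -1.5]])

segment_lengths = np.linalg.norm(np.diff(path, axis=0), axis=1)
path_length = segment_lengths.sum()
straight = np.linalg.norm(path[-1] - path[0])

print(f"path length {path_length:.1f} m, straight-line {straight:.1f} m, "
      f"tortuosity = {path_length / straight:.2f}")
```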

  12. Pharmacokinetic Profiling of Conjugated Therapeutic Oligonucleotides: A High-Throughput Method Based Upon Serial Blood Microsampling Coupled to Peptide Nucleic Acid Hybridization Assay.

    PubMed

    Godinho, Bruno M D C; Gilbert, James W; Haraszti, Reka A; Coles, Andrew H; Biscans, Annabelle; Roux, Loic; Nikan, Mehran; Echeverria, Dimas; Hassler, Matthew; Khvorova, Anastasia

    2017-12-01

    Therapeutic oligonucleotides, such as small interfering RNAs (siRNAs), hold great promise for the treatment of incurable genetically defined disorders by targeting cognate toxic gene products for degradation. To achieve meaningful tissue distribution and efficacy in vivo, siRNAs must be conjugated or formulated. Clear understanding of the pharmacokinetic (PK)/pharmacodynamic behavior of these compounds is necessary to optimize and characterize the performance of therapeutic oligonucleotides in vivo. In this study, we describe a simple and reproducible methodology for the evaluation of in vivo blood/plasma PK profiles and tissue distribution of oligonucleotides. The method is based on serial blood microsampling from the saphenous vein, coupled to a peptide nucleic acid hybridization assay for quantification of guide strands. Performed with a minimal number of animals, this method allowed unequivocal detection and sensitive quantification without the need for amplification or further modification of the oligonucleotides. Using this methodology, we compared plasma clearances and tissue distribution profiles of two different hydrophobically modified siRNAs (hsiRNAs). Notably, cholesterol-hsiRNA presented slow plasma clearances and mainly accumulated in the liver, whereas phosphocholine-docosahexaenoic acid-hsiRNA was rapidly cleared from the plasma and preferentially accumulated in the kidney. These data suggest that the PK/biodistribution profiles of modified hsiRNAs are determined by the chemical nature of the conjugate. Importantly, the method described in this study constitutes a simple platform to conduct pilot assessments of the basic clearance and tissue distribution profiles, which can be broadly applied for evaluation of new chemical variants of siRNAs and micro-RNAs.

  13. Quantification of phototrophic biomass on rocks: optimization of chlorophyll-a extraction by response surface methodology.

    PubMed

    Fernández-Silva, I; Sanmartín, P; Silva, B; Moldes, A; Prieto, B

    2011-01-01

    Biological colonization of rock surfaces constitutes an important problem for maintenance of buildings and monuments. In this work, we aim to establish an efficient extraction protocol for chlorophyll-a specific for rock materials, as this is one of the most commonly used biomarkers for quantifying phototrophic biomass. For this purpose, rock samples were cut into blocks, and three different mechanical treatments were tested, prior to extraction in dimethyl sulfoxide (DMSO). To evaluate the influence of the experimental factors (1) extractant-to-sample ratio, (2) temperature, and (3) time of incubation, on chlorophyll-a recovery (response variable), incomplete factorial designs of experiments were followed. Temperature of incubation was the most relevant variable for chlorophyll-a extraction. The experimental data obtained were analyzed following a response surface methodology, which allowed the development of empirical models describing the interrelationship between the considered response and experimental variables. The optimal extraction conditions for chlorophyll-a were estimated, and the expected yields were calculated. Based on these results, we propose a method involving application of ultrasound directly to intact sample, followed by incubation in 0.43 ml DMSO/cm(2) sample at 63°C for 40 min. Confirmation experiments were performed at the predicted optimal conditions, allowing chlorophyll-a recovery of 84.4 ± 11.6% (90% was expected), which implies a substantial improvement with respect to the expected recovery using previous methods (68%). This method will enable detection of small amounts of photosynthetic microorganisms and quantification of the extent of biocolonization of stone surfaces.

  14. The why and how of amino acid analytics in cancer diagnostics and therapy.

    PubMed

    Manig, Friederike; Kuhne, Konstantin; von Neubeck, Cläre; Schwarzenbolz, Uwe; Yu, Zhanru; Kessler, Benedikt M; Pietzsch, Jens; Kunz-Schughart, Leoni A

    2017-01-20

    Pathological alterations in cell functions are frequently accompanied by metabolic reprogramming, including modifications in amino acid metabolism. Amino acid detection is thus integral to the diagnosis of many hereditary metabolic diseases. The development of malignant diseases as metabolic disorders comes along with a complex dysregulation of genetic and epigenetic factors affecting metabolic enzymes. Cancer cells might transiently or permanently become auxotrophic for non-essential or semi-essential amino acids such as asparagine or arginine. Also, transformed cells are often more susceptible to local shortage of essential amino acids such as methionine than normal tissues. This offers new points of attack on unique metabolic features of cancer cells. To better understand these processes, highly sensitive methods for amino acid detection and quantification are required. Our review summarizes the main methodologies for amino acid detection, with a particular focus on applications in biomedicine and cancer, and provides a historical overview of the methodological prerequisites of amino acid analytics. We compare classical and modern approaches such as the combination of gas chromatography and liquid chromatography with mass spectrometry (GC-MS/LC-MS). The latter is increasingly applied in clinical routine. We therefore illustrate an LC-MS workflow for analyzing arginine and methionine as well as their precursors and analogs in biological material. Pitfalls during protocol development are discussed, but LC-MS emerges as a reliable and sensitive tool for the detection of amino acids in biological matrices. Quantification is challenging, but of particular interest in cancer research, as targeting arginine and methionine turnover in cancer cells represents a novel treatment strategy. Copyright © 2016 Elsevier B.V. All rights reserved.
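
    A minimal sketch of the quantification step in an LC-MS amino acid workflow of the kind outlined above: a linear calibration of analyte/internal-standard peak-area ratios against known concentrations, followed by back-calculation of unknowns. Concentrations and peak areas below are hypothetical, not values from the cited study.

```python
# Hypothetical isotope-dilution-style calibration for arginine quantification by LC-MS.
import numpy as np

# Calibration standards (µM) and measured area ratios (analyte / labeled internal standard)
cal_conc = np.array([5, 10, 25, 50, 100, 200])
cal_ratio = np.array([0.11, 0.21, 0.52, 1.05, 2.08, 4.15])

slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)   # linear calibration curve

def quantify(area_ratio):
    """Back-calculate concentration (µM) from a measured area ratio."""
    return (area_ratio - intercept) / slope

unknown_ratios = np.array([0.35, 1.60])                  # e.g., two plasma samples
print("Estimated arginine (µM):", np.round(quantify(unknown_ratios), 1))
```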

  15. Advanced Crash Avoidance Technologies (ACAT) Program - Final Report of the Volvo-Ford-UMTRI Project: Safety Impact Methodology for Lane Departure Warning - Method Development and Estimation of Benefits

    DOT National Transportation Integrated Search

    2010-10-01

    The Volvo-Ford-UMTRI project: Safety Impact Methodology (SIM) for Lane Departure Warning is part of the U.S. Department of Transportation's Advanced Crash Avoidance Technologies (ACAT) program. The project developed a basic analytical framework for e...

  16. Data Management inside the Library: Assessing Electronic Resources Data Using the Data Asset Framework Methodology

    ERIC Educational Resources Information Center

    Ogier, Andi; Hall, Monena; Bailey, Annette; Stovall, Connie

    2014-01-01

    Rapidly growing within academic libraries, library data services have often been focused on assessing research trends and building partnerships outside the library. There are distinct benefits, however, to using data audit methodologies created for these external assessments of researcher practices inside the library as well. In this article, we…

  17. A Longitudinal Evaluation of a Project-Based Learning Initiative in an Engineering Undergraduate Programme

    ERIC Educational Resources Information Center

    Hall, Wayne; Palmer, Stuart; Bennett, Mitchell

    2012-01-01

    Project-based learning (PBL) is a well-known student-centred methodology for engineering design education. The methodology claims to offer a number of educational benefits. This paper evaluates the student perceptions of the initial and second offering of a first-year design unit at Griffith University in Australia. It builds on an earlier…

  18. Improved intracellular PHA determinations with novel spectrophotometric quantification methodologies based on Sudan black dye.

    PubMed

    Porras, Mauricio A; Villar, Marcelo A; Cubitto, María A

    2018-05-01

    The presence of intracellular polyhydroxyalkanoates (PHAs) is usually studied using Sudan black dye solution (SB). In previous work it was shown that PHA can be quantified directly using the absorbance of SB fixed by PHA granules in wet cell samples. In the present paper, the optimum SB amount and the optimum conditions for SB assays were determined following an experimental design based on hybrid response surface methodology and a desirability function. In addition, a new methodology was developed in which it is shown that the amount of SB fixed by PHA granules can also be determined indirectly through the absorbance of the supernatant obtained from the stained cell samples. This alternative methodology allows faster determination of the PHA content (23 min versus 42 min for the indirect and direct determinations, respectively) and can be undertaken with basic laboratory equipment and reagents. The correlation between PHA content in wet cell samples and the spectra of the SB-stained supernatant was determined by means of multivariate and linear regression analysis. The best calibration adjustment (R² = 0.91, RSE = 1.56%) and the good PHA prediction obtained (RSE = 1.81%) show that the proposed methodology constitutes a reasonably precise way to determine PHA content. Thus, this methodology can anticipate the probable results of the above-mentioned direct PHA determination. Compared with the most commonly used techniques described in the scientific literature, the combined implementation of these two methodologies seems to be one of the most economical and environmentally friendly, suitable for rapid monitoring of the intracellular PHA content. Copyright © 2018 Elsevier B.V. All rights reserved.
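
    A hedged sketch of the calibration idea behind the indirect method: relating the absorbance of the supernatant remaining after Sudan black staining to the PHA content of wet cell samples by simple linear regression, and reporting R² and a residual standard error. Absorbance and PHA values are invented examples, not the study's data.

```python
# Hypothetical linear calibration: supernatant absorbance vs. PHA content.
# More PHA fixes more dye, so supernatant absorbance decreases as PHA rises.
import numpy as np

absorbance = np.array([0.82, 0.74, 0.65, 0.55, 0.47, 0.36, 0.28])   # supernatant absorbance
pha_percent = np.array([12.0, 18.5, 26.0, 34.0, 41.5, 50.0, 57.0])  # % PHA (illustrative)

slope, intercept = np.polyfit(absorbance, pha_percent, 1)
predicted = slope * absorbance + intercept

residuals = pha_percent - predicted
rse = np.sqrt(np.sum(residuals**2) / (len(pha_percent) - 2))         # residual std. error
r2 = 1 - np.sum(residuals**2) / np.sum((pha_percent - pha_percent.mean())**2)
print(f"PHA% = {slope:.1f} * A + {intercept:.1f}; R^2 = {r2:.2f}, RSE = {rse:.2f}%")
```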

  19. Health Insurance: Comparison of Coverage for Federal and Private Sector Employees. Briefing Report to the Chairman, Subcommittee on Civil Service, Post Office, and General Services, Committee on Governmental Affairs, U.S. Senate.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Div. of Human Resources.

    This briefing report was developed to provide a Senate subcommittee with information concerning certain benefit features of the Federal Employees Health Benefits Program (FEHBP). It compares coverage for selected health benefits in the federal and private sectors for a 6-year period (1980-1985). A description of methodology states that information…

  20. How to Measure Costs and Benefits of eHealth Interventions: An Overview of Methods and Frameworks

    PubMed Central

    2015-01-01

    Information on the costs and benefits of eHealth interventions is needed, not only to document value for money and to support decision making in the field, but also to form the basis for developing business models and to facilitate payment systems to support large-scale services. In the absence of solid evidence of its effects, key decision makers may doubt the effectiveness, which, in turn, limits investment in, and the long-term integration of, eHealth services. However, it is not realistic to conduct economic evaluations of all eHealth applications and services in all situations, so we need to be able to generalize from those we do conduct. This implies that we have to select the most appropriate methodology and data collection strategy in order to increase the transferability across evaluations. This paper aims to contribute to the understanding of how to apply economic evaluation methodology in the eHealth field. It provides a brief overview of basic health economics principles and frameworks and discusses some methodological issues and challenges in conducting cost-effectiveness analysis of eHealth interventions. Issues regarding the identification, measurement, and valuation of costs and benefits are outlined. Furthermore, this work describes the established techniques of combining costs and benefits, presents the decision rules for identifying the preferred option, and outlines approaches to data collection strategies. Issues related to transferability and complexity are also discussed. PMID:26552360
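
    A toy sketch of the core decision rule discussed above: an incremental cost-effectiveness ratio (ICER) comparing an eHealth intervention with usual care against a willingness-to-pay threshold. The costs, effects, and threshold are placeholder figures, not data from the paper.

```python
# Hypothetical ICER calculation for an eHealth service vs. usual care.
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost per unit of effect (e.g., per QALY gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

ehealth = {"cost": 12000.0, "qalys": 6.3}   # assumed eHealth service
usual   = {"cost": 9000.0,  "qalys": 6.0}   # assumed usual care

ratio = icer(ehealth["cost"], ehealth["qalys"], usual["cost"], usual["qalys"])
threshold = 30000.0                          # assumed willingness-to-pay per QALY
decision = "adopt" if ratio < threshold else "reject"
print(f"ICER = {ratio:.0f} per QALY gained -> {decision} at threshold {threshold:.0f}")
```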
