Science.gov

Sample records for advanced quantitative methods

  1. Advanced quantitative magnetic nondestructive evaluation methods - Theory and experiment

    NASA Technical Reports Server (NTRS)

    Barton, J. R.; Kusenberger, F. N.; Beissner, R. E.; Matzkanin, G. A.

    1979-01-01

The paper reviews the scale of fatigue crack phenomena in relation to the size detection capabilities of nondestructive evaluation methods. An assessment of several features of fatigue in relation to the inspection of ball and roller bearings suggested the use of magnetic methods; magnetic domain phenomena, including the interaction of domains with inclusions and the influence of stress and magnetic field on domains, are discussed. Experimental results indicate that simplified calculations can predict many of their features, although predictions from analytic models based on finite element computer analysis disagree with the data in certain respects. Experimental analyses of rod-type fatigue specimens, which relate magnetic measurements to crack opening displacement, crack volume, and crack depth, should provide methods for improved crack characterization in relation to fracture mechanics and life prediction.

  2. Malignant gliomas: current perspectives in diagnosis, treatment, and early response assessment using advanced quantitative imaging methods

    PubMed Central

    Ahmed, Rafay; Oborski, Matthew J; Hwang, Misun; Lieberman, Frank S; Mountz, James M

    2014-01-01

Malignant gliomas consist of glioblastomas, anaplastic astrocytomas, anaplastic oligodendrogliomas and anaplastic oligoastrocytomas, and some less common tumors such as anaplastic ependymomas and anaplastic gangliogliomas. Malignant gliomas have high morbidity and mortality. Even with optimal treatment, median survival is only 12–15 months for glioblastomas and 2–5 years for anaplastic gliomas. However, recent advances in imaging and quantitative analysis of image data have led to earlier diagnosis of tumors and tumor response to therapy, providing oncologists with a greater time window for therapy management. In addition, improved understanding of tumor biology, genetics, and resistance mechanisms has enhanced surgical techniques, chemotherapy methods, and radiotherapy administration. After proper diagnosis and institution of appropriate therapy, there is now a vital need for quantitative methods that can sensitively detect malignant glioma response to therapy at early follow-up times, when changes in the management of nonresponders can have their greatest effect. Currently, response is largely evaluated by measuring magnetic resonance contrast and size change, but this approach does not take into account the key biologic steps that precede tumor size reduction. Molecular imaging is ideally suited to measuring early response by quantifying cellular metabolism, proliferation, and apoptosis, activities altered early in treatment. We expect that successful integration of quantitative imaging biomarker assessment into the early phase of clinical trials could provide a novel approach for testing new therapies, and importantly, for facilitating patient management, sparing patients from weeks or months of toxicity and ineffective treatment. This review will present an overview of epidemiology, molecular pathogenesis, and current advances in the diagnosis and management of malignant gliomas. PMID:24711712

  3. Response monitoring using quantitative ultrasound methods and supervised dictionary learning in locally advanced breast cancer

    NASA Astrophysics Data System (ADS)

    Gangeh, Mehrdad J.; Fung, Brandon; Tadayyon, Hadi; Tran, William T.; Czarnota, Gregory J.

    2016-03-01

A non-invasive computer-aided-theragnosis (CAT) system was developed for the early assessment of responses to neoadjuvant chemotherapy in patients with locally advanced breast cancer. The CAT system was based on quantitative ultrasound spectroscopy methods comprising several modules, including feature extraction, a metric to measure the dissimilarity between "pre-" and "mid-treatment" scans, and a supervised learning algorithm for the classification of patients as responders or non-responders. One major requirement for the successful design of a high-performance CAT system is to accurately measure the changes in parametric maps before treatment onset and during the course of treatment. To this end, a unified framework based on the Hilbert-Schmidt independence criterion (HSIC) was used to design both the feature extraction from parametric maps and the dissimilarity measure between the "pre-" and "mid-treatment" scans. For feature extraction, HSIC was used to design a supervised dictionary learning (SDL) method by maximizing the dependency between the scans taken "pre-" and "mid-treatment" and "dummy labels" given to the scans. For the dissimilarity measure, an HSIC-based metric was employed to effectively measure the changes in parametric maps as an indication of treatment effectiveness. The HSIC-based feature extraction and dissimilarity measure used a kernel function to nonlinearly transform input vectors into a higher dimensional feature space and computed the population means in the new space, where enhanced group separability was ideally obtained. Classification results using the developed CAT system indicated an improvement in performance compared to a CAT system using basic intensity-histogram features.
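The HSIC machinery above can be illustrated with a generic sketch (not the authors' actual pipeline): the biased empirical estimator tr(KHLH)/(n−1)² with Gaussian kernels, where the rows of X and Y are paired observations. The kernel width and the toy data are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian kernel matrix from pairwise squared Euclidean distances
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def hsic(X, Y, gamma=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n - 1)^2, rows of X, Y paired."""
    n = X.shape[0]
    K, L = rbf_kernel(X, gamma), rbf_kernel(Y, gamma)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return float(np.trace(K @ H @ L @ H)) / (n - 1) ** 2
```

A large HSIC between feature sets signals strong statistical dependence; as a dissimilarity measure between "pre-" and "mid-treatment" maps, the value is interpreted relative to a baseline.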

  4. Structural Analysis and Quantitative Determination of Clevidipine Butyrate Impurities Using an Advanced RP-HPLC Method.

    PubMed

    Zhou, Yuxia; Zhou, Fan; Yan, Fei; Yang, Feng; Yao, Yuxian; Zou, Qiaogen

    2016-03-01

Eleven potential impurities, including process-related compounds and degradation products, were identified through comprehensive studies of the manufacturing process of clevidipine butyrate, and possible formation mechanisms are proposed. MS and NMR techniques were used for the structural characterization of three previously unreported impurities (Imp-3, Imp-5 and Imp-11). To separate and quantify the potential impurities simultaneously, an efficient and advanced RP-HPLC method was developed; with it, four major degradation products (Imp-2, Imp-4, Imp-8 and Imp-10) can be observed under varying stress conditions. The analytical method was validated according to ICH guidelines with respect to specificity, accuracy, linearity, robustness and stability, and has been demonstrated to be applicable in routine quality control and stability evaluation studies of clevidipine butyrate. PMID:26489435

  5. RECENT ADVANCES IN QUANTITATIVE NEUROPROTEOMICS

    PubMed Central

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2014-01-01

The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification analysis, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges for quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function, to neuropsychiatric disorders including schizophrenia and drug addiction, and to neurodegenerative diseases including Parkinson’s disease and Alzheimer’s disease. While older methods such as two-dimensional polyacrylamide gel electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, the quality and depth of the more recent quantitative proteomics studies is beginning to

  6. Recent advances in quantitative neuroproteomics.

    PubMed

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2013-06-15

The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification analysis, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges for quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function, to neuropsychiatric disorders including schizophrenia and drug addiction, and to neurodegenerative diseases including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide gel electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, the quality and depth of the more recent quantitative proteomics studies is beginning to shed
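As a toy illustration of the label-based quantitation surveyed above (this is not code from the review), a minimal sketch of TMT/iTRAQ-style reporter-ion processing: median-normalize each reporter channel to correct for channel loading differences, then take log2 ratios against a chosen control channel.

```python
import numpy as np

def tmt_log2_ratios(intensities, control_channel=0):
    """intensities: proteins x channels matrix of reporter-ion intensities.
    Median-normalize each channel, then return log2 ratios vs the control."""
    I = np.asarray(intensities, dtype=float)
    # Equalize column medians to correct unequal channel loading
    norm = I * (np.median(I) / np.median(I, axis=0))
    return np.log2(norm / norm[:, [control_channel]])
```

Real pipelines add isotope-impurity correction and protein-level rollup; this shows only the normalization/ratio core.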

  7. Biological Matrix Effects in Quantitative Tandem Mass Spectrometry-Based Analytical Methods: Advancing Biomonitoring

    PubMed Central

    Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd

    2015-01-01

Accurate and precise quantification of target analytes in biological samples for biomonitoring requires highly sensitive and selective instrumentation, such as tandem mass spectrometers, together with a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization of target analytes, as well as their chromatographic response, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data can be achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies for controlling the challenges posed by matrix effects, in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585
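One common way to characterize matrix effects is the post-extraction spike comparison; a generic sketch (not the authors' exact protocol, and the ±20% acceptance window is a common convention, not a rule from this paper):

```python
def matrix_effect_percent(area_post_extraction_spike, area_neat_solvent):
    # 100% -> no matrix effect; <100% -> ion suppression; >100% -> enhancement
    return 100.0 * area_post_extraction_spike / area_neat_solvent

def flag_matrix_effect(me_percent, tolerance=20.0):
    # Flag analytes whose matrix effect falls outside 100% +/- tolerance
    return abs(me_percent - 100.0) > tolerance
```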

  8. Advanced Mass Spectrometric Methods for the Rapid and Quantitative Characterization of Proteomes

    DOE PAGES

    Smith, Richard D.

    2002-01-01

    Progress is reviewed towards the development of a global strategy that aims to extend the sensitivity, dynamic range, comprehensiveness and throughput of proteomic measurements based upon the use of high performance separations and mass spectrometry. The approach uses high accuracy mass measurements from Fourier transform ion cyclotron resonance mass spectrometry (FTICR) to validate peptide ‘accurate mass tags’ (AMTs) produced by global protein enzymatic digestions for a specific organism, tissue or cell type from ‘potential mass tags’ tentatively identified using conventional tandem mass spectrometry (MS/MS). This provides the basis for subsequent measurements without the need for MS/MS. High resolution capillary liquid chromatography separations combined with high sensitivity, high resolution, accurate FTICR measurements are shown to be capable of characterizing peptide mixtures of more than 10^5 components. The strategy has been initially demonstrated using the microorganisms Saccharomyces cerevisiae and Deinococcus radiodurans. Advantages of the approach include the high confidence of protein identification, its broad proteome coverage, high sensitivity, and the capability for stable-isotope labeling methods for precise relative protein abundance measurements. Abbreviations: LC, liquid chromatography; FTICR, Fourier transform ion cyclotron resonance; AMT, accurate mass tag; PMT, potential mass tag; MMA, mass measurement accuracy; MS, mass spectrometry; MS/MS, tandem mass spectrometry; ppm, parts per million.
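The AMT lookup step described above amounts to matching an observed accurate mass against a database within a ppm tolerance; a minimal sketch (the tolerance value and peptide masses below are illustrative, not from the paper):

```python
def ppm_error(observed, reference):
    # Mass measurement error in parts per million
    return 1e6 * (observed - reference) / reference

def match_amt(observed_mass, amt_db, tol_ppm=1.0):
    """Return AMT database entries whose monoisotopic mass lies within
    tol_ppm of the observed accurate mass (the AMT lookup step)."""
    return [tag for tag, mass in amt_db.items()
            if abs(ppm_error(observed_mass, mass)) <= tol_ppm]
```

In the full strategy, LC elution time is used as a second matching dimension; this sketch shows only the mass-tolerance filter.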

  9. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization
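The record is truncated at the mention of Monte Carlo simulation; as a generic illustration of how Monte Carlo methods support quantitative process-performance management (the phase estimates and 30-day target below are hypothetical, not from the dissertation):

```python
import random

def simulate_schedule(n_trials=10_000, seed=42):
    """Monte Carlo sketch: total process duration from three phases with
    triangular (min, mode, max) estimates; returns P(total <= 30 days)."""
    rng = random.Random(seed)
    phases = [(5, 8, 14), (10, 12, 20), (4, 6, 9)]  # hypothetical estimates
    hits = 0
    for _ in range(n_trials):
        total = sum(rng.triangular(lo, hi, mode) for lo, mode, hi in phases)
        hits += total <= 30.0
    return hits / n_trials
```

A process performance model would compare this predicted probability against the organization's quality and performance objectives.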

  10. Advanced stability indicating chemometric methods for quantitation of amlodipine and atorvastatin in their quinary mixture with acidic degradation products

    NASA Astrophysics Data System (ADS)

    Darwish, Hany W.; Hassan, Said A.; Salem, Maissa Y.; El-Zeany, Badr A.

    2016-02-01

    Two advanced, accurate and precise chemometric methods were developed for the simultaneous determination of amlodipine besylate (AML) and atorvastatin calcium (ATV) in the presence of their acidic degradation products in tablet dosage forms. The first method was Partial Least Squares (PLS-1) and the second was Artificial Neural Networks (ANN). PLS was compared to ANN models with and without a variable selection procedure (a genetic algorithm, GA). For proper analysis, a 5-factor, 5-level experimental design was established, resulting in 25 mixtures containing different ratios of the interfering species. Fifteen mixtures were used as the calibration set and the other ten as the validation set to validate the prediction ability of the suggested models. The proposed methods were successfully applied to the analysis of pharmaceutical tablets containing AML and ATV. The results indicate the ability of the models to resolve the highly overlapped spectra of the quinary mixture while using inexpensive and easy-to-handle instruments such as the UV-VIS spectrophotometer.
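The PLS-1 calibration step can be sketched with a minimal NIPALS implementation (a generic sketch, not the authors' code; the synthetic "spectra" in the usage below are illustrative stand-ins for measured absorbances):

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal NIPALS PLS-1: X is samples x wavelengths, y is samples."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                  # weight: covariance direction
        w /= np.linalg.norm(w)
        t = Xc @ w                     # score
        tt = t @ t
        p = Xc.T @ t / tt              # X loading
        q.append(t @ yc / tt)          # y loading
        Xc -= np.outer(t, p)           # deflate X
        yc = yc - t * q[-1]            # deflate y
        W.append(w)
        P.append(p)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    b = W @ np.linalg.solve(P.T @ W, q)   # regression vector
    return x_mean, y_mean, b

def pls1_predict(model, X):
    x_mean, y_mean, b = model
    return (np.asarray(X, float) - x_mean) @ b + y_mean
```

With as many components as wavelengths the fit reduces to ordinary least squares; in practice the component count is chosen by cross-validation on the calibration set.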

  11. Advanced stability indicating chemometric methods for quantitation of amlodipine and atorvastatin in their quinary mixture with acidic degradation products.

    PubMed

    Darwish, Hany W; Hassan, Said A; Salem, Maissa Y; El-Zeany, Badr A

    2016-02-01

    Two advanced, accurate and precise chemometric methods were developed for the simultaneous determination of amlodipine besylate (AML) and atorvastatin calcium (ATV) in the presence of their acidic degradation products in tablet dosage forms. The first method was Partial Least Squares (PLS-1) and the second was Artificial Neural Networks (ANN). PLS was compared to ANN models with and without a variable selection procedure (a genetic algorithm, GA). For proper analysis, a 5-factor, 5-level experimental design was established, resulting in 25 mixtures containing different ratios of the interfering species. Fifteen mixtures were used as the calibration set and the other ten as the validation set to validate the prediction ability of the suggested models. The proposed methods were successfully applied to the analysis of pharmaceutical tablets containing AML and ATV. The results indicate the ability of the models to resolve the highly overlapped spectra of the quinary mixture while using inexpensive and easy-to-handle instruments such as the UV-VIS spectrophotometer. PMID:26513228

  12. Advances in Quantitative Analyses and Reference Materials Related to Laser Ablation ICP-MS: A Look at Methods and New Directions

    NASA Astrophysics Data System (ADS)

    Koenig, A. E.; Ridley, W. I.

    2009-12-01

    The role of laser ablation ICP-MS (LA-ICP-MS) continues to expand both in the geological sciences and in other fields. As the technique gains popularity, so too does the need for good reference materials and for method development and validation. Matrix-matched reference materials (RMs) are required for calibration and quality control of LA-ICP-MS analyses. New advances in technology such as <200 nm lasers and femtosecond lasers have reduced the dependence on matrix matching to some degree, but general matrix matching is still preferred. Much work has revolved around the available RMs, such as the NIST 61x silicate glasses and several series of basaltic-composition glasses, including the USGS natural basaltic glasses (e.g., BCR-2g) and the synthetic basaltic GS series (e.g., GSD-1g). While many quantitative hurdles have been recognized in analogous techniques such as EPMA and SIMS, some of these hurdles have not been fully addressed or validated for some cases of LA-ICP-MS. Trace element mapping by LA-ICP-MS is rapidly becoming more widespread. Relative differences in raw signal can be obtained easily and rapidly; too often, however, the magnitude of these differences in raw intensity is a function of different ablation yields, sample density, or other factors. Methods of quantification for trace element mapping will be presented. The USGS has been developing microanalytical RMs intended for LA-ICP-MS for several years. The widely popular basaltic rock powders BCR-2, BIR-1 and BHVO-2 have all been successfully converted to homogeneous glasses suitable for LA-ICP-MS and have been in use by many workers. The newer synthetic basaltic glass GS series consists of four glasses of basaltic composition artificially doped with a broad suite of trace elements at nominal concentrations of 400, 40, 4 and <1 ppm. Additional developments in non-silicate or non-basaltic materials include the previously released MASS-1 Cu, Fe, Zn sulfide calibration RM (Wilson et
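Quantification in LA-ICP-MS mapping commonly follows a one-point calibration against an RM with internal-standard normalization, which corrects for the differing ablation yields noted above (a Longerich-style sketch under assumed inputs, not necessarily the exact method presented in this abstract):

```python
def quantify_la_icpms(r_sample, r_rm, conc_rm_analyte,
                      conc_is_sample, conc_is_rm):
    """One-point LA-ICP-MS calibration.
    r_sample, r_rm: analyte/internal-standard count-rate ratios in the
    unknown and in the reference material; the internal standard
    (e.g., Ca or Si known from EPMA) cancels ablation-yield differences."""
    return conc_rm_analyte * (r_sample / r_rm) * (conc_is_sample / conc_is_rm)
```

Applied pixel-by-pixel, this converts a raw-intensity map into a concentration map.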

  13. Multisite comparison of methods for the quantitation of the surface expression of CD38 on CD8(+) T lymphocytes. The ACTG Advanced Flow Cytometry Focus Group.

    PubMed

    Schmitz, J L; Czerniewski, M A; Edinger, M; Plaeger, S; Gelman, R; Wilkening, C L; Zawadzki, J A; Wormsley, S B

    2000-06-15

    We evaluated the effect of specimen processing variations and quantitation methods on quantitative determination of CD38 expression on CD8 T lymphocytes. Neither lysing reagent (ammonium chloride versus BD FACSlyse), fixation (paraformaldehyde versus no final fixation step), nor acquisition delay (acquisition within 6 h after fixation versus 24 h after fixation) had a significant effect on CD38 relative fluorescent intensity or CD38 quantitative estimates (RFI or antibodies bound per cell). The only significant difference in fluorescent intensity and CD38 antibodies bound per cell (ABC) was encountered when whole blood was held for 24 h prior to staining and fixation and then acquired after another 24-h hold. However, for all sample processing methods above, the CD4 biologic calibrator and QuantiBRITE bead methods gave significantly different estimates of CD38 intensity. In many cases these differences were relatively small, though more pronounced in certain laboratories. We conclude that there is some flexibility in sample processing methods for quantitative CD38 determination; however, it is preferable for a laboratory to employ one method of fluorescence quantitation consistently, because small differences are detected between methods. Cytometry (Comm. Clin. Cytometry) 42:174-179, 2000. PMID:10861690
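The QuantiBRITE bead method referred to above converts fluorescence intensity to antibodies bound per cell (ABC) via a log-log regression over the four bead peaks; a sketch assuming 1:1 PE:antibody conjugation (the bead values used below are synthetic, not real lot data):

```python
import math

def abc_calibration(bead_mfi, bead_pe_per_bead):
    """Fit log10(MFI) = m*log10(PE/bead) + c across the four bead peaks
    by ordinary least squares on the log-transformed values."""
    xs = [math.log10(v) for v in bead_pe_per_bead]
    ys = [math.log10(v) for v in bead_mfi]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    c = ybar - m * xbar
    return m, c

def mfi_to_abc(mfi, m, c):
    """Convert a cell population's PE MFI to antibodies bound per cell,
    assuming one PE molecule per antibody."""
    return 10 ** ((math.log10(mfi) - c) / m)
```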

  14. Labeling of virus components for advanced, quantitative imaging analyses.

    PubMed

    Sakin, Volkan; Paci, Giulia; Lemke, Edward A; Müller, Barbara

    2016-07-01

    In recent years, investigation of virus-cell interactions has moved from ensemble measurements to imaging analyses at the single-particle level. Advanced fluorescence microscopy techniques provide single-molecule sensitivity and subdiffraction spatial resolution, allowing observation of subviral details and individual replication events to obtain detailed quantitative information. To exploit the full potential of these techniques, virologists need to employ novel labeling strategies, taking into account specific constraints imposed by viruses, as well as unique requirements of microscopic methods. Here, we compare strengths and limitations of various labeling methods, exemplify virological questions that were successfully addressed, and discuss challenges and future potential of novel approaches in virus imaging. PMID:26987299

  15. Quantitative characterization of the protein contents of the exocrine pancreatic acinar cell by soft x-ray microscopy and advanced digital imaging methods

    SciTech Connect

    Loo Jr., Billy W.

    2000-06-09

    The study of the exocrine pancreatic acinar cell has been central to the development of models of many cellular processes, especially of protein transport and secretion. Traditional methods used to examine this system have provided a wealth of qualitative information from which mechanistic models have been inferred. However they have lacked the ability to make quantitative measurements, particularly of the distribution of protein in the cell, information critical for grounding of models in terms of magnitude and relative significance. This dissertation describes the development and application of new tools that were used to measure the protein content of the major intracellular compartments in the acinar cell, particularly the zymogen granule. Soft x-ray microscopy permits image formation with high resolution and contrast determined by the underlying protein content of tissue rather than staining avidity. A sample preparation method compatible with x-ray microscopy was developed and its properties evaluated. Automatic computerized methods were developed to acquire, calibrate, and analyze large volumes of x-ray microscopic images of exocrine pancreatic tissue sections. Statistics were compiled on the protein density of several organelles, and on the protein density, size, and spatial distribution of tens of thousands of zymogen granules. The results of these measurements, and how they compare to predictions of different models of protein transport, are discussed.
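The quantitative core of the x-ray measurement described above is that water-window contrast tracks protein content via Beer-Lambert attenuation; a simplified single-component sketch (the mass absorption coefficient and section thickness in the usage below are illustrative assumptions, not values from the dissertation):

```python
import math

def protein_density(transmission, mu_protein_cm2_g, thickness_cm):
    """Invert Beer-Lambert attenuation, I = I0 * exp(-mu * rho * t),
    to a mean protein density rho (g/cm^3) along the beam path.
    transmission = I/I0 through the section."""
    return -math.log(transmission) / (mu_protein_cm2_g * thickness_cm)
```

A real calibration must also account for absorption by non-protein components and for the exact photon energy used.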

  16. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadow maps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position with the observed irradiance pattern. The present work addresses the largely unexplored problem of estimating the wavefront's gradient from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the first time that a complete, mathematically grounded treatment of this optical phenomenon has been presented, complemented by an image-processing algorithm that allows a quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device just through its associated foucaultgrams. PMID:27505659
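Fourier-based integration of a measured gradient field, as mentioned above, is commonly done with the Frankot-Chellappa least-squares integrator; a sketch of that standard algorithm (not necessarily the authors' exact implementation, and it assumes periodic boundary handling):

```python
import numpy as np

def integrate_gradients(gx, gy):
    """Frankot-Chellappa: least-squares integration of a gradient field
    (gx, gy) into a surface via the FFT. Returns the surface up to an
    additive constant (the DC term is unconstrained)."""
    rows, cols = gx.shape
    wx = np.fft.fftfreq(cols) * 2.0 * np.pi
    wy = np.fft.fftfreq(rows) * 2.0 * np.pi
    WX, WY = np.meshgrid(wx, wy)
    denom = WX ** 2 + WY ** 2
    denom[0, 0] = 1.0                      # avoid division by zero at DC
    GX, GY = np.fft.fft2(gx), np.fft.fft2(gy)
    Z = (-1j * WX * GX - 1j * WY * GY) / denom
    Z[0, 0] = 0.0                          # fix the free constant
    return np.real(np.fft.ifft2(Z))
```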

  17. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

    The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based, effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation, and statistics are frequently used in such studies. The selection of statistical methods, and the interpretation of the results they produce, should be connected to the educational context; in making this connection, issues of educational models are often raised. Many widely used statistical methods make no assumptions about the mental structure of subjects, nor do they provide explanations tailored to an educational audience. Other methods do consider mental structure and are tailored to provide strong connections between statistics and education; these methods often involve model assumptions and parameter estimation, and are mathematically complicated. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to compare these advanced methods with purely mathematical methods, based on their performance in physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments.
The dissertation includes three
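Methods that "make educational/psychological model assumptions beyond the minimum mathematical model" include item response theory; a sketch of the two-parameter logistic (2PL) model with a simple grid-search ability estimate (the item parameters below are hypothetical):

```python
import math

def p_correct(theta, a, b):
    """2PL IRT model: probability that a student of ability theta answers
    an item with discrimination a and difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def estimate_ability(responses, items, grid=None):
    """Maximum-likelihood ability estimate by grid search over theta.
    responses: list of 0/1 scores; items: list of (a, b) pairs."""
    grid = grid or [g / 10.0 for g in range(-40, 41)]
    def loglik(theta):
        return sum(math.log(p_correct(theta, a, b)) if r
                   else math.log(1.0 - p_correct(theta, a, b))
                   for r, (a, b) in zip(responses, items))
    return max(grid, key=loglik)
```

Operational IRT software uses EM or Bayesian estimation for both item and person parameters; the grid search here only illustrates the likelihood principle.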

  18. Quantitative Real-Time PCR: Recent Advances.

    PubMed

    Singh, Charanjeet; Roy-Chowdhuri, Sinchita

    2016-01-01

    Quantitative real-time polymerase chain reaction is a technique for simultaneous amplification and product quantification of a target DNA as the process takes place in real time in a "closed-tube" system. Although this technique can provide an absolute quantification of the initial template copy number, quantification relative to a control sample or second sequence is typically adequate. The quantification process employs melting curve analysis and/or fluorescent detection systems and can provide amplification and genotyping in a relatively short time. Here we describe the properties and uses of various fluorescent detection systems used for quantification. PMID:26843055
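Quantification relative to a control sample, as described above, is typically computed with the Livak 2^-ΔΔCt method; a minimal sketch (a standard formula, which assumes roughly 100% amplification efficiency for both assays):

```python
def fold_change_ddct(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
    """Livak 2^-DDCt relative quantification: expression of a target gene in
    a test sample relative to a control sample, each normalized to a
    reference (housekeeping) gene via its Ct value."""
    dct_test = ct_target_test - ct_ref_test
    dct_ctrl = ct_target_ctrl - ct_ref_ctrl
    return 2.0 ** -(dct_test - dct_ctrl)
```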

  19. Advanced NDE techniques for quantitative characterization of aircraft

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.; Winfree, William P.

    1990-01-01

    Recent advances in nondestructive evaluation (NDE) at NASA Langley Research Center and their applications that have resulted in quantitative assessment of material properties based on thermal and ultrasonic measurements are reviewed. Specific applications include ultrasonic determination of bolt tension, ultrasonic and thermal characterization of bonded layered structures, characterization of composite materials, and disbonds in aircraft skins.

  20. Quantitative Methods in Psychology: Inevitable and Useless

    PubMed Central

    Toomela, Aaro

    2010-01-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering it. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, which is based on Aristotelian thinking, and the associative-quantitative, which is based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause–effect relationships between events, with no possible access to an understanding of the structures that underlie the processes. Quantitative methodology in particular, and mathematical psychology in general, are useless for answering questions about the structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments. PMID:21833199

  1. Inside Single Cells: Quantitative Analysis with Advanced Optics and Nanomaterials

    PubMed Central

    Cui, Yi; Irudayaraj, Joseph

    2014-01-01

    Single cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single cell activity. In order to obtain quantitative information (e.g. molecular quantity, kinetics and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single cell studies both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live cell analysis. Although a considerable proportion of single cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single cell analysis. PMID:25430077

  3. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
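The computation claimed here is simple enough to sketch: each adjacent antenna pair yields a finite-difference estimate of the field component along the pair's separation. A minimal Python illustration with invented voltages and positions (not data from the patent):

```python
# Hypothetical measurements: antenna potentials (V) at known positions (m)
# along one axis of the sensing region.
voltages = [0.0, 1.2, 2.5, 3.7]
positions = [0.0, 0.1, 0.2, 0.3]

field_estimates = []
for i in range(len(voltages) - 1):
    dv = voltages[i + 1] - voltages[i]   # voltage difference for this pair
    d = positions[i + 1] - positions[i]  # known separation distance
    field_estimates.append(dv / d)       # local field component (V/m)

mean_field = sum(field_estimates) / len(field_estimates)
```

The collection of `field_estimates` over the region is the quantitative field description the claim refers to; extending the array to two or three dimensions yields the other field components the same way.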

  4. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  5. A collaborative enterprise for multi-stakeholder participation in the advancement of quantitative imaging.

    PubMed

    Buckler, Andrew J; Bresolin, Linda; Dunnick, N Reed; Sullivan, Daniel C

    2011-03-01

    Medical imaging has seen substantial and rapid technical advances during the past decade, including advances in image acquisition devices, processing and analysis software, and agents to enhance specificity. Traditionally, medical imaging has defined anatomy, but increasingly newer, more advanced, imaging technologies provide biochemical and physiologic information based on both static and dynamic modalities. These advanced technologies are important not only for detecting disease but for characterizing and assessing change of disease with time or therapy. Because of the rapidity of these advances, research to determine the utility of quantitative imaging in either clinical research or clinical practice has not had time to mature. Methods to appropriately develop, assess, regulate, and reimburse must be established for these advanced technologies. Efficient and methodical processes that meet the needs of stakeholders in the biomedical research community, therapeutics developers, and health care delivery enterprises will ultimately benefit individual patients. To help address this, the authors formed a collaborative program-the Quantitative Imaging Biomarker Alliance. This program draws from the very successful precedent set by the Integrating the Healthcare Enterprise effort but is adapted to the needs of imaging science. Strategic guidance supporting the development, qualification, and deployment of quantitative imaging biomarkers will lead to improved standardization of imaging tests, proof of imaging test performance, and greater use of imaging to predict the biologic behavior of tissue and monitor therapy response. These, in turn, confer value to corporate stakeholders, providing incentives to bring new and innovative products to market. PMID:21339352

  6. Quantitative Method of Measuring Metastatic Activity

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors, such as membrane-associated urokinase, increased DNA synthesis rates, and certain receptors, can be used in the method for detection of potentially invasive tumors.

  7. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. PMID:26763302
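As a concrete illustration of the simplest strategy such reviews discuss, the sketch below applies total-sum normalization (invented peak intensities, not from the review) so that differences in total loaded sample amount do not masquerade as metabolite concentration changes:

```python
# Hypothetical peak-intensity table: one row of intensities per sample,
# one entry per metabolite. Sample s2 had ~2x more total material loaded.
samples = {
    "s1": [100.0, 50.0, 25.0, 25.0],
    "s2": [220.0, 90.0, 60.0, 30.0],
}

def total_sum_normalize(intensities):
    """Scale a sample so its intensities sum to 1 (relative abundances)."""
    total = sum(intensities)
    return [x / total for x in intensities]

normalized = {name: total_sum_normalize(vals) for name, vals in samples.items()}
```

After normalization, remaining differences between `s1` and `s2` reflect relative composition rather than total amount; median-fold-change normalization is a common, more outlier-robust alternative.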

  8. Advanced probabilistic method of development

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.

    1987-01-01

    Advanced structural reliability methods are utilized on the Probabilistic Structural Analysis Methods (PSAM) project to provide a tool for analysis and design of space propulsion system hardware. The role of the effort at the University of Arizona is to provide reliability technology support to this project. PSAM computer programs will provide a design tool for analyzing uncertainty associated with thermal and mechanical loading, material behavior, geometry, and the analysis methods used. Specifically, reliability methods are employed to perform sensitivity analyses, to establish the distribution of a critical response variable (e.g., stress, deflection), to perform reliability assessment, and ultimately to produce a design which will minimize cost and/or weight. Uncertainties in the design factors of space propulsion hardware are described by probability models constructed using statistical analysis of data. Statistical methods are employed to produce a probability model, i.e., a statistical synthesis or summary of each design variable in a format suitable for reliability analysis and ultimately, design decisions.
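A toy version of the reliability-assessment step, assuming (purely for illustration, not the PSAM models) normally distributed stress and strength: the safety margin's reliability index in closed form, checked against a crude Monte Carlo failure-probability estimate.

```python
import random

random.seed(0)

# Hypothetical stress/strength model (all numbers invented): failure occurs
# when the load-induced stress exceeds the material strength.
mu_s, sd_s = 400.0, 40.0    # stress mean / std dev (MPa)
mu_r, sd_r = 600.0, 50.0    # strength mean / std dev (MPa)

# Reliability index for the normal safety margin M = R - S.
beta = (mu_r - mu_s) / (sd_s**2 + sd_r**2) ** 0.5

# Crude Monte Carlo estimate of P(failure) = P(S > R).
n, fails = 100_000, 0
for _ in range(n):
    if random.gauss(mu_s, sd_s) > random.gauss(mu_r, sd_r):
        fails += 1
pf = fails / n
```

Here beta is about 3.12, i.e. the mean margin sits roughly three standard deviations above failure; the Monte Carlo estimate should land near the corresponding normal tail probability (~1e-3).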

  9. Quantitative statistical methods for image quality assessment.

    PubMed

    Dutta, Joyita; Ahn, Sangtae; Li, Quanzheng

    2013-01-01

    Quantitative measures of image quality and reliability are critical for both qualitative interpretation and quantitative analysis of medical images. While, in theory, it is possible to analyze reconstructed images by means of Monte Carlo simulations using a large number of noise realizations, the associated computational burden makes this approach impractical. Additionally, this approach is less meaningful in clinical scenarios, where multiple noise realizations are generally unavailable. The practical alternative is to compute closed-form analytical expressions for image quality measures. The objective of this paper is to review statistical analysis techniques that enable us to compute two key metrics: resolution (determined from the local impulse response) and covariance. The underlying methods include fixed-point approaches, which compute these metrics at a fixed point (the unique and stable solution) independent of the iterative algorithm employed, and iteration-based approaches, which yield results that are dependent on the algorithm, initialization, and number of iterations. We also explore extensions of some of these methods to a range of special contexts, including dynamic and motion-compensated image reconstruction. While most of the discussed techniques were developed for emission tomography, the general methods are extensible to other imaging modalities as well. In addition to enabling image characterization, these analysis techniques allow us to control and enhance imaging system performance. We review practical applications where performance improvement is achieved by applying these ideas to the contexts of both hardware (optimizing scanner design) and image reconstruction (designing regularization functions that produce uniform resolution or maximize task-specific figures of merit). PMID:24312148
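The closed-form-versus-Monte-Carlo contrast drawn in the abstract can be reproduced on a toy linear system. Below, a hypothetical 3x2 "imaging" matrix and quadratic penalty (all values invented) give an analytical covariance and local impulse response that a Monte Carlo loop then confirms:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy penalized least-squares reconstruction: y = A x + noise,
# xhat = (A^T A + beta R)^{-1} A^T y.  All values are invented.
A = np.array([[1.0, 0.3],
              [0.2, 1.0],
              [0.5, 0.5]])
R = np.eye(2)                         # quadratic roughness penalty
beta, sigma = 0.1, 0.05
F = A.T @ A + beta * R
Finv = np.linalg.inv(F)

# Closed-form covariance of xhat for white noise of variance sigma^2.
cov_analytic = sigma**2 * Finv @ (A.T @ A) @ Finv

# Local impulse response at pixel 0: the resolution measure in the review.
lir0 = Finv @ (A.T @ A) @ np.array([1.0, 0.0])

# Monte Carlo check over many noise realizations (the expensive route).
x_true = np.array([1.0, 2.0])
ys = x_true @ A.T + sigma * rng.standard_normal((20000, 3))
recons = ys @ (Finv @ A.T).T
cov_mc = np.cov(recons.T)
```

The analytical covariance matches the 20,000-realization sample covariance to well within Monte Carlo error, which is exactly why the closed-form route is preferred in practice.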

  11. Quantitative rotating frame relaxometry methods in MRI.

    PubMed

    Gilani, Irtiza Ali; Sepponen, Raimo

    2016-06-01

    Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. It has been shown in several studies that the rotating frame longitudinal relaxation rate constant (R1ρ) and the rotating frame transverse relaxation rate constant (R2ρ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of the tissue (i.e. R1ρ and R2ρ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during the process of spin-locking in rotating frame relaxometry are the major barriers to the establishment of these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurements of R1ρ and R2ρ. The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27100142
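Whatever the excitation scheme, the quantification step usually reduces to fitting a mono-exponential spin-lock decay, S(TSL) = S0·exp(−TSL·R1ρ). A minimal log-linear least-squares fit in Python, on noiseless synthetic data (parameter values invented):

```python
import math

# Hypothetical spin-lock experiment: signal decays as S = S0 * exp(-TSL * R1rho).
R1rho_true, S0 = 12.0, 1000.0                  # 1/s, arbitrary units (assumed)
tsl = [0.010, 0.030, 0.050, 0.080]             # spin-lock times (s)
signal = [S0 * math.exp(-R1rho_true * t) for t in tsl]

# Log-linearize: ln S = ln S0 - R1rho * TSL, then fit a straight line.
n = len(tsl)
x, y = tsl, [math.log(s) for s in signal]
xm, ym = sum(x) / n, sum(y) / n
slope = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y)) / \
        sum((xi - xm) ** 2 for xi in x)
R1rho_fit = -slope                             # recovered relaxation rate (1/s)
```

On noisy data a nonlinear fit directly in the exponential domain is more robust, since log-transforming inflates noise at long spin-lock times.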

  12. Quantitative Methods for Assessing Drug Synergism

    PubMed Central

    2011-01-01

    Two or more drugs that individually produce overtly similar effects will sometimes display greatly enhanced effects when given in combination. When the combined effect is greater than that predicted by their individual potencies, the combination is said to be synergistic. A synergistic interaction allows the use of lower doses of the combination constituents, a situation that may reduce adverse reactions. Drug combinations are quite common in the treatment of cancers, infections, pain, and many other diseases and situations. The determination of synergism is a quantitative pursuit that involves a rigorous demonstration that the combination effect is greater than that which is expected from the individual drug’s potencies. The basis of that demonstration is the concept of dose equivalence, which is discussed here and applied to an experimental design and data analysis known as isobolographic analysis. That method, and a related method of analysis that also uses dose equivalence, are presented in this brief review, which provides the mathematical basis for assessing synergy and an optimization strategy for determining the dose combination. PMID:22737266
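The dose-equivalence idea is easy to state in code: under Loewe additivity, a combination (d1, d2) whose fractional doses sum to less than 1 lies below the isobole and is synergistic. A sketch with invented potencies (not data from the review):

```python
def interaction_index(d1, d2, D1, D2):
    """Loewe additivity interaction index: <1 synergy, =1 additive, >1 antagonism.

    D1, D2 are the doses of each drug alone producing the chosen effect level;
    d1, d2 are the doses in the combination producing the same effect.
    """
    return d1 / D1 + d2 / D2

# Hypothetical potencies: alone, 10 mg of drug A or 40 mg of drug B gives the
# target effect; in combination, 3 mg A + 8 mg B suffices.
gamma = interaction_index(3.0, 8.0, 10.0, 40.0)
verdict = "synergistic" if gamma < 1 else "additive-or-antagonistic"
```

Plotting (d1, d2) against the straight line from (D1, 0) to (0, D2) gives the isobologram itself; the index above is the algebraic form of that graphical test.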

  13. Sparse methods for Quantitative Susceptibility Mapping

    NASA Astrophysics Data System (ADS)

    Bilgic, Berkin; Chatnuntawech, Itthi; Langkammer, Christian; Setsompop, Kawin

    2015-09-01

    Quantitative Susceptibility Mapping (QSM) aims to estimate the tissue susceptibility distribution that gives rise to subtle changes in the main magnetic field, which are captured by the image phase in a gradient echo (GRE) experiment. The underlying susceptibility distribution is related to the acquired tissue phase through an ill-posed linear system. To facilitate its inversion, spatial regularization that imposes sparsity or smoothness assumptions can be employed. This paper focuses on efficient algorithms for regularized QSM reconstruction. Fast solvers that enforce sparsity under Total Variation (TV) and Total Generalized Variation (TGV) constraints are developed using the Alternating Direction Method of Multipliers (ADMM). Through variable splitting that permits closed-form iterations, the computational efficiency of these solvers is dramatically improved. An alternative approach to improve the conditioning of the ill-posed inversion is to acquire multiple GRE volumes at different head orientations relative to the main magnetic field. The phase information from such multi-orientation acquisition can be combined to yield exquisite susceptibility maps and obviate the need for regularized reconstruction, albeit at the cost of increased data acquisition time.
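The variable-splitting idea behind such ADMM solvers can be shown on a 1D analogue: TV-regularized denoising of a piecewise-constant signal. (Real QSM instead inverts the 3D dipole kernel; all parameters here are invented.)

```python
import numpy as np

rng = np.random.default_rng(1)

# Piecewise-constant "susceptibility" profile plus noise.
n = 60
truth = np.zeros(n); truth[20:40] = 1.0
y = truth + 0.1 * rng.standard_normal(n)

# Solve min_x 0.5*||x - y||^2 + lam*||D x||_1 via ADMM with split z = D x.
D = np.diff(np.eye(n), axis=0)                   # finite-difference operator
lam, rho = 0.2, 1.0
x = y.copy()
z = np.zeros(n - 1); u = np.zeros(n - 1)
M = np.linalg.inv(np.eye(n) + rho * D.T @ D)     # precomputed x-update solve

def soft(v, t):
    """Soft-thresholding: the closed-form proximal step for the L1 term."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

for _ in range(200):
    x = M @ (y + rho * D.T @ (z - u))            # quadratic x-update (closed form)
    z = soft(D @ x + u, lam / rho)               # sparsity-promoting z-update
    u = u + D @ x - z                            # dual variable (scaled) update
```

Each subproblem has a closed-form solution, which is exactly the efficiency benefit of variable splitting the abstract refers to; the denoised `x` recovers the edges that a smoothness penalty would blur.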

  14. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C.; Kuske, Cheryl R.; Mullen, Kenneth I.

    2002-01-01

    A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.
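The dilution logic in the claim — dilute until further dilution no longer changes the attenuation — can be sketched as a search for the first dilution step whose intensity drop matches the dilution factor (intensities invented, not from the patent):

```python
# Hypothetical 2x dilution series: intensities are attenuated by contaminants
# at high concentration and become proportional to dsDNA amount once dilute.
dilution_factor = 2.0
intensities = [500.0, 300.0, 160.0, 80.0, 40.0]   # each entry = 2x more dilute

def first_unattenuated_index(vals, factor, tol=0.05):
    """Return the first index where one more dilution scales intensity by ~1/factor."""
    for i in range(len(vals) - 1):
        if abs(vals[i] / vals[i + 1] - factor) / factor <= tol:
            return i
    return None

idx = first_unattenuated_index(intensities, dilution_factor)
```

From index 2 onward the series halves cleanly (160 → 80 → 40), so that dilution is "sufficiently dilute" in the patent's sense and its absorbance can serve as the threshold absorbance for later samples.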

  15. Meaning in Method: The Rhetoric of Quantitative and Qualitative Research.

    ERIC Educational Resources Information Center

    Firestone, William A.

    The current debate about quantitative and qualitative research methods focuses on whether there is a necessary connection between method-type and research paradigm that makes the different approaches incompatible. This paper argues that the connection is not so much logical as rhetorical. Quantitative methods express the assumptions of a…

  16. Advances in Quantitative UV-Visible Spectroscopy for Clinical and Pre-clinical Application in Cancer

    PubMed Central

    Brown, J. Quincy; Vishwanath, Karthik; Palmer, Gregory M.; Ramanujam, Nirmala

    2009-01-01

    Summary Methods of optical spectroscopy which provide quantitative, physically or physiologically meaningful measures of tissue properties are an attractive tool for the study, diagnosis, prognosis, and treatment of various cancers. Recent development of methodologies to convert measured reflectance and fluorescence spectra from tissue to cancer-relevant parameters such as vascular volume, oxygenation, extracellular matrix extent, metabolic redox states, and cellular proliferation have significantly advanced the field of tissue optical spectroscopy. The number of publications reporting quantitative tissue spectroscopy results in the UV-visible wavelength range has increased sharply in the last 3 years, and includes new and emerging studies which correlate optically-measured parameters with independent measures such as immunohistochemistry, which should aid in increased clinical acceptance of these technologies. PMID:19268567
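The extraction of physiological parameters from measured spectra is, at its simplest, linear unmixing under the Beer-Lambert law. A sketch with invented extinction coefficients (not real hemoglobin values) and a noiseless synthetic absorbance spectrum:

```python
import numpy as np

# Illustrative Beer-Lambert unmixing: A(lambda) = eps_HbO2*C_HbO2 + eps_Hb*C_Hb.
# Extinction matrix rows = wavelengths, columns = [HbO2, Hb]; values invented.
eps = np.array([[0.9, 0.3],
                [0.5, 0.7],
                [0.2, 1.1]])
c_true = np.array([0.04, 0.02])          # chromophore concentrations (arb. units)
absorbance = eps @ c_true                # synthetic measured spectrum

# Least-squares unmixing recovers the concentrations from the spectrum.
c_fit, *_ = np.linalg.lstsq(eps, absorbance, rcond=None)
sat = c_fit[0] / c_fit.sum()             # oxygen-saturation-style ratio
```

Real tissue spectra require a light-transport model (scattering as well as absorption) before this linear step applies, which is precisely what the inverse methods surveyed in the abstract provide.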

  17. Blending Qualitative & Quantitative Research Methods in Theses and Dissertations.

    ERIC Educational Resources Information Center

    Thomas, R. Murray

    This guide discusses combining qualitative and quantitative research methods in theses and dissertations. It covers a wide array of methods, the strengths and limitations of each, and how they can be effectively interwoven into various research designs. The first chapter is "The Qualitative and the Quantitative." Part 1, "A Catalogue of…

  18. Advanced accelerator methods: The cyclotrino

    SciTech Connect

    Welch, J.J.; Bertsche, K.J.; Friedman, P.G.; Morris, D.E.; Muller, R.A.

    1987-04-01

    Several new and unusual advanced techniques used in the small cyclotron are described. The cyclotron is run at low energy, using negative ions and at high harmonics. Electrostatic focusing is used exclusively. The ion source and injection system sit in the center; the source does not yet provide enough current, but the new system design should solve this problem. An electrostatic extractor that runs at low voltage, under 5 kV, and a microchannel plate detector able to discriminate low-energy ions from the ¹⁴C are used. The resolution is sufficient for ¹⁴C dating, and a higher-intensity source should allow dating of a milligram-size sample of 30,000-year-old material with less than 10% uncertainty.

  19. Quantitative ultrasound characterization of locally advanced breast cancer by estimation of its scatterer properties

    SciTech Connect

    Tadayyon, Hadi; Sadeghi-Naini, Ali; Czarnota, Gregory; Wirtzfeld, Lauren; Wright, Frances C.

    2014-01-15

    Purpose: Tumor grading is an important part of breast cancer diagnosis and currently requires biopsy as its standard. Here, the authors investigate quantitative ultrasound parameters in locally advanced breast cancers that can potentially separate tumors from normal breast tissue and differentiate tumor grades. Methods: Ultrasound images and radiofrequency data from 42 locally advanced breast cancer patients were acquired and analyzed. Parameters related to the linear regression of the power spectrum—midband fit, slope, and 0-MHz-intercept—were determined from breast tumors and normal breast tissues. Mean scatterer spacing was estimated from the spectral autocorrelation, and the effective scatterer diameter and effective acoustic concentration were estimated from the Gaussian form factor. Parametric maps of each quantitative ultrasound parameter were constructed from the gated radiofrequency segments in tumor and normal tissue regions of interest. In addition to the mean values of the parametric maps, higher order statistical features, computed from gray-level co-occurrence matrices were also determined and used for characterization. Finally, linear and quadratic discriminant analyses were performed using combinations of quantitative ultrasound parameters to classify breast tissues. Results: Quantitative ultrasound parameters were found to be statistically different between tumor and normal tissue (p < 0.05). The combination of effective acoustic concentration and mean scatterer spacing could separate tumor from normal tissue with 82% accuracy, while the addition of effective scatterer diameter to the combination did not provide significant improvement (83% accuracy). Furthermore, the two advanced parameters, including effective scatterer diameter and mean scatterer spacing, were found to be statistically differentiating among grade I, II, and III tumors (p = 0.014 for scatterer spacing, p = 0.035 for effective scatterer diameter). The separation of the tumor
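The three linear-regression spectral parameters named in the abstract — slope, 0-MHz intercept, and midband fit — come from a straight-line fit to the calibrated power spectrum (in dB) over the analysis bandwidth. A sketch with an invented spectrum:

```python
# Hypothetical calibrated power spectrum over a 3-7 MHz analysis band.
freqs = [3.0, 4.0, 5.0, 6.0, 7.0]                 # MHz
power_db = [-42.0, -45.5, -49.0, -52.5, -56.0]    # dB, linear by construction

# Ordinary least-squares line fit: power_db ~ intercept + slope * freq.
n = len(freqs)
fm, pm = sum(freqs) / n, sum(power_db) / n
slope = sum((f - fm) * (p - pm) for f, p in zip(freqs, power_db)) / \
        sum((f - fm) ** 2 for f in freqs)          # spectral slope (dB/MHz)
intercept = pm - slope * fm                        # 0-MHz intercept (dB)
midband_fit = slope * fm + intercept               # fitted value at band center (dB)
```

Maps of these three parameters over a region of interest are what feed the texture features and discriminant analysis described in the Methods.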

  20. Advanced reliability methods - A review

    NASA Astrophysics Data System (ADS)

    Forsyth, David S.

    2016-02-01

    There are a number of challenges to the current practices for Probability of Detection (POD) assessment. Some Nondestructive Testing (NDT) methods, especially those that are image-based, may not provide a simple relationship between a scalar NDT response and a damage size. Some damage types are not easily characterized by a single scalar metric. Other sensing paradigms, such as structural health monitoring, could theoretically replace NDT but require a POD estimate. And the cost of performing large empirical studies to estimate POD can be prohibitive. The response of the research community has been to develop new methods that can be used to generate the same information, POD, in a form that can be used by engineering designers. This paper will highlight approaches to image-based data and complex defects, Model Assisted POD estimation, and Bayesian methods for combining information. This paper will also review the relationship of the POD estimate, confidence bounds, tolerance bounds, and risk assessment.
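A common parametric form behind POD curves is a logistic (or probit) function of log flaw size; the engineering quantity of interest is a90, the flaw size detected with 90% probability. A sketch with assumed parameters (illustrative, not fitted to any real inspection data):

```python
import math

# Hit/miss POD modeled as a logistic function of log flaw size.
# mu, sigma are assumed fit results, not from any real study.
mu, sigma = math.log(1.5), 0.4          # location/scale on the log-size axis

def pod(a):
    """Probability of detection for flaw size a (same units as exp(mu))."""
    return 1.0 / (1.0 + math.exp(-(math.log(a) - mu) / sigma))

# a90: the flaw size detected with 90% probability (logit(0.9) = ln 9).
a90 = math.exp(mu + sigma * math.log(9.0))
```

In practice mu and sigma come from maximum-likelihood fitting of hit/miss data, and the reported quantity is usually a90/95: a90 evaluated at the upper 95% confidence bound on the curve, which connects directly to the confidence- and tolerance-bound discussion in this paper.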

  1. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-01-01

    Prostate cancer is the leading form of malignancies among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiency of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). 
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early
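At its core, the imaging-based quantification described above reduces to segmenting labeled structures and counting them per cell. A toy 2D stand-in for that step, using thresholding and 4-connected component counting on an invented "image" (real analysis would run on 3D stacks with dedicated image-analysis tooling):

```python
# Tiny invented fluorescence image: values are pixel intensities.
image = [
    [0, 5, 5, 0, 0, 0],
    [0, 5, 0, 0, 7, 0],
    [0, 0, 0, 0, 7, 7],
    [3, 0, 0, 0, 0, 0],
]
threshold = 4

def count_puncta(img, thr):
    """Count 4-connected components of pixels brighter than thr."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if img[r][c] > thr and not seen[r][c]:
                count += 1                      # found a new punctum
                stack = [(r, c)]
                while stack:                    # flood-fill its pixels
                    i, j = stack.pop()
                    if 0 <= i < rows and 0 <= j < cols \
                            and img[i][j] > thr and not seen[i][j]:
                        seen[i][j] = True
                        stack.extend([(i + 1, j), (i - 1, j),
                                      (i, j + 1), (i, j - 1)])
    return count

n_puncta = count_puncta(image, threshold)
```

The two bright blobs survive the threshold while the dim pixel does not; counting colocalized autophagosome/lysosome puncta over time is the quantitative readout such experiments rely on.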

  2. Advances in Quantitative Proteomics of Microbes and Microbial Communities

    NASA Astrophysics Data System (ADS)

    Waldbauer, J.; Zhang, L.; Rizzo, A. I.

    2015-12-01

    Quantitative measurements of gene expression are key to developing a mechanistic, predictive understanding of how microbial metabolism drives many biogeochemical fluxes and responds to environmental change. High-throughput RNA-sequencing can afford a wealth of information about transcript-level expression patterns, but it is becoming clear that expression dynamics are often very different at the protein level where biochemistry actually occurs. These divergent dynamics between levels of biological organization necessitate quantitative proteomic measurements to address many biogeochemical questions. The protein-level expression changes that underlie shifts in the magnitude, or even the direction, of metabolic and biogeochemical fluxes can be quite subtle and test the limits of current quantitative proteomics techniques. Here we describe methodologies for high-precision, whole-proteome quantification that are applicable to both model organisms of biogeochemical interest that may not be genetically tractable, and to complex community samples from natural environments. Employing chemical derivatization of peptides with multiple isotopically-coded tags, this strategy is rapid and inexpensive, can be implemented on a wide range of mass spectrometric instrumentation, and is relatively insensitive to chromatographic variability. We demonstrate the utility of this quantitative proteomics approach in application to both isolates and natural communities of sulfur-metabolizing and photosynthetic microbes.
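After tagging, quantification reduces to ratios of light/heavy peak intensities summarized per protein; the median log-ratio is a common robust summary. A sketch with invented intensities (the tag chemistry itself is not modeled):

```python
import math

# Hypothetical (light, heavy) peak-intensity pairs for each peptide
# of one protein across two isotopically coded conditions.
peptide_ratios = {
    "proteinA": [(1000.0, 2100.0), (800.0, 1500.0), (1200.0, 2500.0)],
}

def protein_log2_ratio(pairs):
    """Median of per-peptide log2(heavy/light) ratios: robust to outlier peptides."""
    logs = sorted(math.log2(h / l) for l, h in pairs)
    mid = len(logs) // 2
    return logs[mid] if len(logs) % 2 else 0.5 * (logs[mid - 1] + logs[mid])

fold_change = 2 ** protein_log2_ratio(peptide_ratios["proteinA"])
```

Working in log space keeps up- and down-regulation symmetric, and the median damps the single-peptide quantification errors that the abstract notes can otherwise obscure subtle expression shifts.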

  3. Quantitative methods for ecological network analysis.

    PubMed

    Ulanowicz, Robert E

    2004-12-01

    The analysis of networks of ecological trophic transfers is a useful complement to simulation modeling in the quest for understanding whole-ecosystem dynamics. Trophic networks can be studied in quantitative and systematic fashion at several levels. Indirect relationships between any two individual taxa in an ecosystem, which often differ in either nature or magnitude from their direct influences, can be assayed using techniques from linear algebra. The same mathematics can also be employed to ascertain where along the trophic continuum any individual taxon is operating, or to map the web of connections into a virtual linear chain that summarizes trophodynamic performance by the system. Backtracking algorithms with pruning have been written which identify pathways for the recycle of materials and energy within the system. The pattern of such cycling often reveals modes of control or types of functions exhibited by various groups of taxa. The performance of the system as a whole at processing material and energy can be quantified using information theory. In particular, the complexity of process interactions can be parsed into separate terms that distinguish organized, efficient performance from the capacity for further development and recovery from disturbance. Finally, the sensitivities of the information-theoretic system indices appear to identify the dynamical bottlenecks in ecosystem functioning. PMID:15556474
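The information-theoretic indices mentioned at the end can be computed directly from a flow matrix. Below, average mutual information (AMI) and ascendency for a hypothetical three-compartment network (flows invented):

```python
import math

# Hypothetical trophic flow matrix: T[i][j] = flow from compartment i to j
# (units arbitrary, e.g. gC m^-2 yr^-1).
T = [[0.0, 8.0, 2.0],
     [0.0, 0.0, 6.0],
     [4.0, 0.0, 0.0]]

total = sum(sum(row) for row in T)                      # total system throughput
row_sums = [sum(row) for row in T]                      # outflows per compartment
col_sums = [sum(T[i][j] for i in range(3)) for j in range(3)]  # inflows

# Average mutual information of the flow structure (bits).
ami = sum(
    (T[i][j] / total) * math.log2(T[i][j] * total / (row_sums[i] * col_sums[j]))
    for i in range(3) for j in range(3) if T[i][j] > 0
)
ascendency = total * ami       # organized, efficient performance component
```

Higher AMI indicates more constrained, articulated flow pathways; the gap between the network's maximum possible information and its AMI quantifies the reserve capacity for development and recovery discussed in the abstract.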

  4. Advanced Fine Particulate Characterization Methods

    SciTech Connect

    Steven Benson; Lingbu Kong; Alexander Azenkeng; Jason Laumb; Robert Jensen; Edwin Olson; Jill MacKenzie; A.M. Rokanuzzaman

    2007-01-31

    The characterization and control of emissions from combustion sources are of significant importance in improving local and regional air quality. Such emissions include fine particulate matter, organic carbon compounds, and NO{sub x} and SO{sub 2} gases, along with mercury and other toxic metals. This project involved four activities: Further Development of Analytical Techniques for PM{sub 10} and PM{sub 2.5} Characterization and Source Apportionment and Management, Organic Carbonaceous Particulate and Metal Speciation for Source Apportionment Studies, Quantum Modeling, and High-Potassium Carbon Production with Biomass-Coal Blending. The key accomplishments included the development of improved automated methods to characterize the inorganic and organic components of particulate matter. The methods involved the use of scanning electron microscopy and x-ray microanalysis for the inorganic fraction, and a combination of extractive methods and near-edge x-ray absorption fine structure to characterize the organic fraction. These methods have direct application for source apportionment studies of PM because they provide detailed inorganic analysis along with total organic and elemental carbon (OC/EC) quantification. Quantum modeling using density functional theory (DFT) calculations was used to further elucidate a recently developed mechanistic model for mercury speciation in coal combustion systems and interactions on activated carbon. Reaction energies, enthalpies, free energies and binding energies of Hg species to the prototype molecules were derived from the data obtained in these calculations. Bimolecular rate constants for the various elementary steps in the mechanism have been estimated using the hard-sphere collision theory approximation, and the results seem to indicate that extremely fast kinetics could be involved in these surface reactions. Activated carbon was produced from a blend of lignite coal from the Center Mine in North Dakota and
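
    The hard-sphere collision estimate mentioned above has a closed form: k = (pi d^2) * sqrt(8 k_B T / (pi mu)), i.e. collision cross-section times mean relative speed. The sketch below uses illustrative collision diameter and reduced mass values, not parameters from the report.

```python
import math

def hard_sphere_rate_constant(d_m, mu_kg, T=298.15):
    """Collision-limited bimolecular rate constant, hard-sphere approximation.

    d_m   : collision diameter in meters (illustrative input)
    mu_kg : reduced mass of the colliding pair in kg
    Returns k in cm^3 molecule^-1 s^-1.
    """
    kB = 1.380649e-23  # Boltzmann constant, J/K
    mean_rel_speed = math.sqrt(8.0 * kB * T / (math.pi * mu_kg))  # m/s
    sigma = math.pi * d_m ** 2      # collision cross-section, m^2
    k_m3 = sigma * mean_rel_speed   # m^3 molecule^-1 s^-1
    return k_m3 * 1e6               # convert m^3 -> cm^3
```

    With typical molecular parameters (d of a few angstroms, reduced mass of tens of amu) this gives gas-kinetic values on the order of 1e-10 cm^3 molecule^-1 s^-1, which is the sense in which "extremely fast kinetics" is meant.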

  5. Quantitative Hydrocarbon Energies from the PMO Method.

    ERIC Educational Resources Information Center

    Cooper, Charles F.

    1979-01-01

    Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)

  6. Research radiometric calibration quantitative transfer methods between internal and external

    NASA Astrophysics Data System (ADS)

    Guo, Ju Guang; Ma, Yong hui; Zhang, Guang; Yang, Zhi hui

    2015-10-01

    This paper puts forward a method for transferring radiometric calibration between the internal and external optical paths of an infrared radiation characteristics quantitative measuring system, and establishes a theoretical model of the corresponding radiative transfer. In engineering practice, the method allows the relatively simple and effective calibration of the half optical path to replace the complex and difficult radiometric calibration of the whole optical path. It also provides an effective basis for further quantitative measurement of target radiation characteristics with ground-based infrared quantitative measuring systems.

  7. Recent advances in lattice Boltzmann methods

    SciTech Connect

    Chen, S.; Doolen, G.D.; He, X.; Nie, X.; Zhang, R.

    1998-12-31

    In this paper, the authors briefly present the basic principles of the lattice Boltzmann method and summarize recent advances, including applications of the method to fluid flows in MEMS and to simulation of multiphase mixing and turbulence.
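
    The collide-and-stream principle the authors summarize can be conveyed by a minimal one-dimensional (D1Q3) BGK relaxation scheme for pure diffusion; this toy solver is an illustration of the method's structure, not code from the paper.

```python
def lbm_diffusion(rho0, tau=1.0, steps=100):
    """Minimal D1Q3 lattice Boltzmann scheme for diffusion on a periodic line.

    rho0 : initial density profile (list of floats).
    Each step: BGK collision toward local equilibrium w_i * rho, then
    streaming of the +1 and -1 velocity populations. In lattice units the
    diffusivity is cs^2 * (tau - 0.5) with cs^2 = 1/3.
    """
    n = len(rho0)
    w = [2.0 / 3.0, 1.0 / 6.0, 1.0 / 6.0]   # weights for velocities 0, +1, -1
    f = [[w[i] * rho0[x] for x in range(n)] for i in range(3)]
    for _ in range(steps):
        rho = [f[0][x] + f[1][x] + f[2][x] for x in range(n)]
        for i in range(3):                   # collision (BGK relaxation)
            for x in range(n):
                f[i][x] += (w[i] * rho[x] - f[i][x]) / tau
        f[1] = [f[1][(x - 1) % n] for x in range(n)]   # stream right
        f[2] = [f[2][(x + 1) % n] for x in range(n)]   # stream left
    return [f[0][x] + f[1][x] + f[2][x] for x in range(n)]
```

    Starting from a unit spike, the scheme conserves total mass exactly while the peak spreads symmetrically, the signature behavior of a diffusion solver.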

  8. Review of Quantitative Software Reliability Methods

    SciTech Connect

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of digital systems

  9. Methods and Challenges in Quantitative Imaging Biomarker Development

    PubMed Central

    Abramson, Richard G.; Burton, Kirsteen R.; Yu, John-Paul J.; Scalzetti, Ernest M.; Yankeelov, Thomas E.; Rosenkrantz, Andrew B.; Mendiratta-Lala, Mishal; Bartholmai, Brian J.; Ganeshan, Dhakshinamoorthy; Lenchik, Leon; Subramaniam, Rathan M.

    2014-01-01

    Academic radiology is poised to play an important role in the development and implementation of quantitative imaging (QI) tools. This manuscript, drafted by the Association of University Radiologists (AUR) Radiology Research Alliance (RRA) Quantitative Imaging Task Force, reviews current issues in QI biomarker research. We discuss motivations for advancing QI, define key terms, present a framework for QI biomarker research, and outline challenges in QI biomarker development. We conclude by describing where QI research and development is currently taking place and discussing the paramount role of academic radiology in this rapidly evolving field. PMID:25481515

  10. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, F.A.

    1980-12-12

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  11. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, Frank A.

    1982-01-01

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  12. New Fluorescence Microscopy Methods for Microbiology: Sharper, Faster, and Quantitative

    PubMed Central

    Gitai, Zemer

    2009-01-01

    Summary In addition to the inherent interest stemming from their ecological and human health impacts, microbes have many advantages as model organisms, including ease of growth and manipulation and relatively simple genomes. However, the imaging of bacteria via light microscopy has been limited by their small sizes. Recent advances in fluorescence microscopy that allow imaging of structures at extremely high resolutions are thus of particular interest to the modern microbiologist. In addition, advances in high-throughput microscopy and quantitative image analysis are enabling cellular imaging to finally take advantage of the full power of bacterial numbers and ease of manipulation. These technical developments are ushering in a new era of using fluorescence microscopy to understand bacterial systems in a detailed, comprehensive, and quantitative manner. PMID:19356974

  13. Sample Collection Method Bias Effects in Quantitative Phosphoproteomics.

    PubMed

    Kanshin, Evgeny; Tyers, Michael; Thibault, Pierre

    2015-07-01

    Current advances in selective enrichment, fractionation, and MS detection of phosphorylated peptides have allowed identification and quantitation of tens of thousands of phosphosites from minute amounts of biological material. One of the major challenges in the field is preserving the in vivo phosphorylation state of the proteins throughout the sample preparation workflow. This is typically achieved by using phosphatase inhibitors and denaturing conditions during cell lysis. Here we determine whether the upstream cell collection techniques could introduce changes in protein phosphorylation. To evaluate the effect of sample collection protocols on the global phosphorylation status of the cell, we compared different sample workflows by metabolic labeling and quantitative mass spectrometry on Saccharomyces cerevisiae cell cultures. We identified highly similar phosphopeptides for cells harvested in ice cold isotonic phosphate buffer, cold ethanol, trichloroacetic acid, and liquid nitrogen. However, quantitative analyses revealed that the commonly used phosphate buffer unexpectedly activated signaling events. Such effects may introduce systematic bias in phosphoproteomics measurements and biochemical analysis. PMID:26040406

  14. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods, used in pharmaceutical analysis, consists of several components. The analysis of the most important sources of the quantitative microbiological methods variability demonstrated no effect of culture media and plate-count techniques in the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These data did not exceed 35%, which is appropriate for traditional plate count methods. PMID:26456251
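
    Combining independent relative uncertainty components "mathematically" conventionally means root-sum-of-squares (combination in quadrature). The component values below are illustrative, not the study's data; the 35% figure in the abstract is the authors' result and is not reproduced here.

```python
import math

def combined_relative_uncertainty(components):
    """Combine independent relative uncertainty components (as fractions)
    in quadrature: u_c = sqrt(u_1^2 + u_2^2 + ... + u_n^2)."""
    return math.sqrt(sum(u * u for u in components))
```

    For example, components of 20%, 15%, 18% and 10% combine to roughly 32%, below the 35% ceiling the abstract cites for plate-count methods.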

  15. ABRF-PRG07: Advanced Quantitative Proteomics Study

    PubMed Central

    Falick, Arnold M.; Lane, William S.; Lilley, Kathryn S.; MacCoss, Michael J.; Phinney, Brett S.; Sherman, Nicholas E.; Weintraub, Susan T.; Witkowska, H. Ewa; Yates, Nathan A.

    2011-01-01

    A major challenge for core facilities is determining quantitative protein differences across complex biological samples. Although there are numerous techniques in the literature for relative and absolute protein quantification, the majority is nonroutine and can be challenging to carry out effectively. There are few studies comparing these technologies in terms of their reproducibility, accuracy, and precision, and no studies to date deal with performance across multiple laboratories with varied levels of expertise. Here, we describe an Association of Biomolecular Resource Facilities (ABRF) Proteomics Research Group (PRG) study based on samples composed of a complex protein mixture into which 12 known proteins were added at varying but defined ratios. All of the proteins were present at the same concentration in each of three tubes that were provided. The primary goal of this study was to allow each laboratory to evaluate its capabilities and approaches with regard to: detection and identification of proteins spiked into samples that also contain complex mixtures of background proteins and determination of relative quantities of the spiked proteins. The results returned by 43 participants were compiled by the PRG, which also collected information about the strategies used to assess overall performance and as an aid to development of optimized protocols for the methodologies used. The most accurate results were generally reported by the most experienced laboratories. Among laboratories that used the same technique, values that were closer to the expected ratio were obtained by more experienced groups. PMID:21455478

  16. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  17. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562

  18. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    The development of parametric cost estimating methods for advanced space systems in the conceptual design phase is discussed. The process of identifying variables which drive cost and the relationship between weight and cost are discussed. A theoretical model of cost is developed and tested using a historical data base of research and development projects.
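
    A parametric weight-cost relationship of the kind described is commonly modeled as cost = a * weight^b and fit by least squares in log space. The functional form and the synthetic data below are illustrative assumptions, not the study's model or database.

```python
import math

def fit_power_cer(weights, costs):
    """Fit a cost-estimating relationship cost = a * weight^b by
    ordinary least squares on log-transformed data."""
    xs = [math.log(w) for w in weights]
    ys = [math.log(c) for c in costs]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    a = math.exp(ybar - b * xbar)   # intercept back-transformed from log space
    return a, b
```

    Fitting data generated from a known relationship recovers the coefficients, which is the basic sanity check applied before trusting a CER on a historical database.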

  19. Integrating Qualitative and Quantitative Evaluation Methods in Substance Abuse Research.

    ERIC Educational Resources Information Center

    Dennis, Michael L.; And Others

    1994-01-01

    Some specific opportunities and techniques are described for combining and integrating qualitative and quantitative methods from the design stage of a substance abuse program evaluation through implementation and reporting. The multiple problems and requirements of such an evaluation make integrated methods essential. (SLD)

  20. Applying Quantitative Genetic Methods to Primate Social Behavior

    PubMed Central

    Brent, Lauren J. N.

    2013-01-01

    Increasingly, behavioral ecologists have applied quantitative genetic methods to investigate the evolution of behaviors in wild animal populations. The promise of quantitative genetics in unmanaged populations opens the door for simultaneous analysis of inheritance, phenotypic plasticity, and patterns of selection on behavioral phenotypes all within the same study. In this article, we describe how quantitative genetic techniques provide studies of the evolution of behavior with information that is unique and valuable. We outline technical obstacles for applying quantitative genetic techniques that are of particular relevance to studies of behavior in primates, especially those living in noncaptive populations (e.g., the need for pedigree information and the prevalence of non-Gaussian phenotypes), and demonstrate how many of these barriers are now surmountable. We illustrate this by applying recent quantitative genetic methods to spatial proximity data, a simple and widely collected primate social behavior, from adult rhesus macaques on Cayo Santiago. Our analysis shows that proximity measures are consistent across repeated measurements on individuals (repeatable) and that kin have similar mean measurements (heritable). Quantitative genetics may hold lessons of considerable importance for studies of primate behavior, even those without a specific genetic focus. PMID:24659839
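
    Repeatability in such analyses is an intraclass correlation: the fraction of phenotypic variance attributable to differences among individuals. For balanced repeated measures it can be estimated from a one-way ANOVA, a simplified stand-in for the mixed "animal models" quantitative geneticists typically use; the sketch below illustrates that simpler estimator.

```python
def repeatability(groups):
    """Intraclass correlation (repeatability) from repeated measurements.

    groups : list of lists, one list of repeated measurements per individual;
             all individuals are assumed to have the same number of measures.
    """
    k = len(groups)                       # number of individuals
    n0 = len(groups[0])                   # measurements per individual
    grand = sum(sum(g) for g in groups) / (k * n0)
    means = [sum(g) / n0 for g in groups]
    ms_among = n0 * sum((m - grand) ** 2 for m in means) / (k - 1)
    ms_within = (sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
                 / (k * (n0 - 1)))
    s2_among = (ms_among - ms_within) / n0
    return s2_among / (s2_among + ms_within)
```

    When individuals differ strongly but repeat measurements on the same individual are nearly identical, the estimate approaches 1, the "consistent across repeated measurements" pattern described in the abstract.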

  1. System and methods for wide-field quantitative fluorescence imaging during neurosurgery.

    PubMed

    Valdes, Pablo A; Jacobs, Valerie L; Wilson, Brian C; Leblond, Frederic; Roberts, David W; Paulsen, Keith D

    2013-08-01

    We report an accurate, precise and sensitive method and system for quantitative fluorescence image-guided neurosurgery. With a low-noise, high-dynamic-range CMOS array, we perform rapid (integration times as low as 50 ms per wavelength) hyperspectral fluorescence and diffuse reflectance detection and apply a correction algorithm to compensate for the distorting effects of tissue absorption and scattering. Using this approach, we generated quantitative wide-field images of fluorescence in tissue-simulating phantoms for the fluorophore PpIX, having concentrations and optical absorption and scattering variations over clinically relevant ranges. The imaging system was tested in a rodent model of glioma, detecting quantitative levels down to 20 ng/ml. The resulting performance is a significant advance on existing wide-field quantitative imaging techniques, and provides performance comparable to a point-spectroscopy probe that has previously demonstrated significant potential for improved detection of malignant brain tumors during surgical resection. PMID:23903142
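
    After attenuation correction, fluorophore contributions in hyperspectral data are commonly separated by linear least-squares fitting of known basis spectra. The two-component sketch below illustrates that spectral-fitting idea only; the basis spectra and the two-fluorophore assumption are illustrative, not the authors' exact algorithm.

```python
def unmix_two(basis_a, basis_b, measured):
    """Least-squares coefficients (c_a, c_b) such that, over all wavelength
    samples, measured ~= c_a * basis_a + c_b * basis_b.

    Solves the 2x2 normal equations directly (e.g. basis_a could be a PpIX
    emission spectrum and basis_b a tissue autofluorescence spectrum)."""
    aa = sum(a * a for a in basis_a)
    bb = sum(b * b for b in basis_b)
    ab = sum(a * b for a, b in zip(basis_a, basis_b))
    am = sum(a * m for a, m in zip(basis_a, measured))
    bm = sum(b * m for b, m in zip(basis_b, measured))
    det = aa * bb - ab * ab
    return (am * bb - bm * ab) / det, (bm * aa - am * ab) / det
```

    Applied pixel-by-pixel across a wide-field hyperspectral stack, this kind of fit yields the per-pixel fluorophore abundance maps from which quantitative concentration images are derived.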

  2. Current methods and advances in bone densitometry

    NASA Technical Reports Server (NTRS)

    Guglielmi, G.; Gluer, C. C.; Majumdar, S.; Blunt, B. A.; Genant, H. K.

    1995-01-01

    Bone mass is the primary, although not the only, determinant of fracture. Over the past few years a number of noninvasive techniques have been developed to more sensitively quantitate bone mass. These include single and dual photon absorptiometry (SPA and DPA), single and dual X-ray absorptiometry (SXA and DXA) and quantitative computed tomography (QCT). While differing in anatomic sites measured and in their estimates of precision, accuracy, and fracture discrimination, all of these methods provide clinically useful measurements of skeletal status. It is the intent of this review to discuss the pros and cons of these techniques and to present the new applications of ultrasound (US) and magnetic resonance (MRI) in the detection and management of osteoporosis.

  3. Current methods and advances in bone densitometry.

    PubMed

    Guglielmi, G; Gluer, C C; Majumdar, S; Blunt, B A; Genant, H K

    1995-01-01

    Bone mass is the primary, although not the only, determinant of fracture. Over the past few years a number of noninvasive techniques have been developed to more sensitively quantitate bone mass. These include single and dual photon absorptiometry (SPA and DPA), single and dual X-ray absorptiometry (SXA and DXA) and quantitative computed tomography (QCT). While differing in anatomic sites measured and in their estimates of precision, accuracy, and fracture discrimination, all of these methods provide clinically useful measurements of skeletal status. It is the intent of this review to discuss the pros and cons of these techniques and to present the new applications of ultrasound (US) and magnetic resonance (MRI) in the detection and management of osteoporosis. PMID:11539928

  4. An Advanced Quantitative Echosound Methodology for Femoral Neck Densitometry.

    PubMed

    Casciaro, Sergio; Peccarisi, Marco; Pisani, Paola; Franchini, Roberto; Greco, Antonio; De Marco, Tommaso; Grimaldi, Antonella; Quarta, Laura; Quarta, Eugenio; Muratore, Maruizio; Conversano, Francesco

    2016-06-01

    The aim of this paper was to investigate the clinical feasibility and the accuracy in femoral neck densitometry of the Osteoporosis Score (O.S.), an ultrasound (US) parameter for osteoporosis diagnosis that has been recently introduced for lumbar spine applications. A total of 377 female patients (aged 61-70 y) underwent both a femoral dual X-ray absorptiometry (DXA) and an echographic scan of the proximal femur. Recruited patients were sub-divided into a reference database used for ultrasound spectral model construction and a study population for repeatability assessments and accuracy evaluations. Echographic images and radiofrequency signals were analyzed through a fully automatic algorithm that performed a series of combined spectral and statistical analyses, providing as a final output the O.S. value of the femoral neck. Assuming DXA as a gold standard reference, the accuracy of O.S.-based diagnoses was 94.7%, with k = 0.898 (p < 0.0001). Significant correlations were also found between O.S.-estimated bone mineral density and corresponding DXA values, with r(2) up to 0.79 and root mean square error = 5.9-7.4%. The reported accuracy levels, combined with the proven ease of use and very good measurement repeatability, provide the adopted method with a potential for clinical routine application in osteoporosis diagnosis. PMID:27033331
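
    The agreement statistic reported (k = 0.898) is Cohen's kappa, computed from the confusion matrix of US-based versus DXA-based classifications as (p_o - p_e)/(1 - p_e), where p_o is observed agreement and p_e is agreement expected by chance. The matrix in the usage example is illustrative, not the study's data.

```python
def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix
    (rows: one rater/method, columns: the reference method)."""
    n = len(confusion)
    total = sum(sum(row) for row in confusion)
    po = sum(confusion[i][i] for i in range(n)) / total   # observed agreement
    pe = sum(                                             # chance agreement
        sum(confusion[i]) * sum(row[i] for row in confusion)
        for i in range(n)
    ) / (total * total)
    return (po - pe) / (1 - pe)
```

    Perfect agreement gives kappa = 1, while agreement no better than chance gives kappa = 0, which is why kappa is reported alongside raw accuracy.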

  5. Employing quantitative and qualitative methods in one study.

    PubMed

    Mason, S A

    There is an apparent lack of epistemological rigour when quantitative and qualitative methods are combined in the same study, because they reflect opposing positivist and interpretive perspectives. When and how to use methodological pluralism is discussed in this article. PMID:8400784

  6. Robust quantitative parameter estimation by advanced CMP measurements for vadose zone hydrological studies

    NASA Astrophysics Data System (ADS)

    Koyama, C.; Wang, H.; Khuut, T.; Kawai, T.; Sato, M.

    2015-12-01

    Soil moisture plays a crucial role in the understanding of vadose zone hydrological processes. In the last two decades, ground penetrating radar (GPR) has been widely discussed as a nondestructive measurement technique for soil moisture. In particular, the common mid-point (CMP) technique, which has been used in both seismic and GPR surveys to investigate vertical velocity profiles, has very high potential for quantitative observations from the root zone to the groundwater aquifer. However, its use is still rather limited today, and algorithms for robust quantitative parameter estimation are lacking. In this study we develop an advanced processing scheme for operational soil moisture retrieval at various depths. Using improved signal processing, together with a combined semblance and non-normalized cross-correlation sum stacking approach and the Dix formula, the interval velocities for multiple soil layers are obtained from the RMS velocities, allowing for more accurate estimation of the permittivity at the reflecting point. The presence of a water-saturated layer, such as a groundwater aquifer, can be easily identified by its RMS velocity due to the high contrast compared to the unsaturated zone. By using a new semi-automated measurement technique, the acquisition time for a full CMP gather with 1 cm intervals along a 10 m profile can be reduced significantly, to under 2 minutes. The method is tested and validated under laboratory conditions in a sand pit as well as on agricultural fields and beach sand in the Sendai city area. Comparison between CMP estimates and TDR measurements yields very good agreement, with an RMSE of 1.5 Vol.-%. The accuracy of depth estimation is validated with errors smaller than 2%. Finally, we demonstrate application of the method at a test site in semi-arid Mongolia, namely the Orkhon River catchment in Bulgan, using commercial 100 MHz and 500 MHz RAMAC GPR antennas.
The results demonstrate the suitability of the proposed method for

  7. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  8. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side-effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
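
    A hierarchical weighted average can be sketched as a recursive evaluation over a criteria tree: leaf scores are rolled up through weighted branches until a single score per alternative remains. The tree structure and the weights in the example are illustrative, not the method's published criteria.

```python
def hwa(node):
    """Hierarchical weighted average over a criteria tree.

    node is either ('leaf', score) or ('branch', [(weight, child), ...]),
    where the weights at each branch are assumed to sum to 1.
    """
    kind, payload = node
    if kind == 'leaf':
        return payload
    return sum(w * hwa(child) for w, child in payload)
```

    For example, a top level weighting "technical" criteria (two equally weighted sub-scores of 8 and 6) at 0.6 against a "cost" score of 9 at 0.4 evaluates to 7.8; comparing such scores across design alternatives is the intended use.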

  9. Quantitative methods for analyzing cell-cell adhesion in development.

    PubMed

    Kashef, Jubin; Franz, Clemens M

    2015-05-01

    During development cell-cell adhesion is not only crucial to maintain tissue morphogenesis and homeostasis, it also activates signalling pathways important for the regulation of different cellular processes including cell survival, gene expression, collective cell migration and differentiation. Importantly, gene mutations of adhesion receptors can cause developmental disorders and different diseases. Quantitative methods to measure cell adhesion are therefore necessary to understand how cells regulate cell-cell adhesion during development and how aberrations in cell-cell adhesion contribute to disease. Different in vitro adhesion assays have been developed in the past, but not all of them are suitable to study developmentally-related cell-cell adhesion processes, which usually requires working with low numbers of primary cells. In this review, we provide an overview of different in vitro techniques to study cell-cell adhesion during development, including a semi-quantitative cell flipping assay, and quantitative single-cell methods based on atomic force microscopy (AFM)-based single-cell force spectroscopy (SCFS) or dual micropipette aspiration (DPA). Furthermore, we review applications of Förster resonance energy transfer (FRET)-based molecular tension sensors to visualize intracellular mechanical forces acting on cell adhesion sites. Finally, we describe a recently introduced method to quantitate cell-generated forces directly in living tissues based on the deformation of oil microdroplets functionalized with adhesion receptor ligands. Together, these techniques provide a comprehensive toolbox to characterize different cell-cell adhesion phenomena during development. PMID:25448695

  10. Advanced Bayesian Method for Planetary Surface Navigation

    NASA Technical Reports Server (NTRS)

    Center, Julian

    2015-01-01

    Autonomous Exploration, Inc., has developed an advanced Bayesian statistical inference method that leverages current computing technology to produce a highly accurate surface navigation system. The method combines dense stereo vision and high-speed optical flow to implement visual odometry (VO) to track faster rover movements. The Bayesian VO technique improves performance by using all image information rather than corner features only. The method determines what can be learned from each image pixel and weighs the information accordingly. This capability improves performance in shadowed areas that yield only low-contrast images. The error characteristics of the visual processing are complementary to those of a low-cost inertial measurement unit (IMU), so the combination of the two capabilities provides highly accurate navigation. The method increases NASA mission productivity by enabling faster rover speed and accuracy. On Earth, the technology will permit operation of robots and autonomous vehicles in areas where the Global Positioning System (GPS) is degraded or unavailable.

  11. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on some selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. PMID:26556680

  12. Advanced Analysis Methods in High Energy Physics

    SciTech Connect

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

  13. Informatics Methods to Enable Sharing of Quantitative Imaging Research Data

    PubMed Central

    Levy, Mia A.; Freymann, John B.; Kirby, Justin S.; Fedorov, Andriy; Fennessy, Fiona M.; Eschrich, Steven A.; Berglund, Anders E.; Fenstermacher, David A.; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L.; Brown, Bartley J.; Braun, Terry A.; Dekker, Andre; Roelofs, Erik; Mountz, James M.; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-01-01

Introduction: The National Cancer Institute (NCI) Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable sharing and to promote reuse of quantitative imaging data in the community. Methods: We performed a survey of the tools currently in use by the QIN member sites for representation and storage of their QIN research data, including images, image metadata and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. Results: There are a variety of tools currently used by each QIN institution. We developed a general information-system architecture to support the QIN goals. We also describe the remaining architectural gaps we are developing to enable members to share research images and image metadata across the network. Conclusions: As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. PMID:22770688

  14. Quantitative method of measuring cancer cell urokinase and metastatic potential

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  15. A quantitative analytical method to test for salt effects on giant unilamellar vesicles.

    PubMed

    Hadorn, Maik; Boenzli, Eva; Hotz, Peter Eggenberger

    2011-01-01

    Today, free-standing membranes, i.e. liposomes and vesicles, are used in a multitude of applications, e.g. as drug delivery devices and artificial cell models. Because current laboratory techniques do not allow handling of large sample sizes, systematic and quantitative studies on the impact of different effectors, e.g. electrolytes, are limited. In this work, we evaluated the Hofmeister effects of ten alkali metal halides on giant unilamellar vesicles made of palmitoyloleoylphosphatidylcholine for a large sample size by combining the highly parallel water-in-oil emulsion transfer vesicle preparation method with automatic haemocytometry. We found that this new quantitative screening method is highly reliable and consistent with previously reported results. Thus, this method may provide a significant methodological advance in analysis of effects on free-standing model membranes. PMID:22355683

  16. A quantitative method for measuring the quality of history matches

    SciTech Connect

    Shaw, T.S.; Knapp, R.M.

    1997-08-01

History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for that purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to those of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. It also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
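The abstract does not give the exact statistics used, but the three-index idea can be sketched with common proxies: Pearson correlation for shape conformity, mean signed error for bias, and RMSE for magnitude of deviation. All names and data below are illustrative assumptions, not the paper's formulas:

```python
import math

# Three illustrative match-quality indices between a simulated and a
# historical production curve:
#   shape -- Pearson correlation (1.0 = identical shape)
#   bias  -- mean signed error (systematic over/under-prediction)
#   rmse  -- root-mean-square error (magnitude of deviation)

def match_quality(simulated, historical):
    n = len(historical)
    mean_s = sum(simulated) / n
    mean_h = sum(historical) / n
    cov = sum((s - mean_s) * (h - mean_h) for s, h in zip(simulated, historical))
    var_s = sum((s - mean_s) ** 2 for s in simulated)
    var_h = sum((h - mean_h) ** 2 for h in historical)
    shape = cov / math.sqrt(var_s * var_h)
    bias = sum(s - h for s, h in zip(simulated, historical)) / n
    rmse = math.sqrt(sum((s - h) ** 2 for s, h in zip(simulated, historical)) / n)
    return shape, bias, rmse

# A simulated curve that tracks history perfectly in shape but with a
# constant +2 offset: shape is 1.0 while bias and RMSE expose the offset.
shape, bias, rmse = match_quality([12, 14, 17, 21], [10, 12, 15, 19])
```

Separating the three indices is the point: a match can have perfect shape yet still be biased, which a single error number would hide.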

  17. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    PubMed Central

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the image quantitative analysis of plant cortical microtubules so far. In this paper, Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the image preprocessing of the original microtubule image. And then Intrinsic Mode Function 1 (IMF1) image obtained by decomposition was selected to do the texture analysis based on Grey-Level Cooccurrence Matrix (GLCM) algorithm. Meanwhile, in order to further verify its reliability, the proposed texture analysis method was utilized to distinguish different images of Arabidopsis microtubules. The results showed that the effect of BEMD algorithm on edge preserving accompanied with noise reduction was positive, and the geometrical characteristic of the texture was obvious. Four texture parameters extracted by GLCM perfectly reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the image quantitative analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies references for other similar studies. PMID:24744684
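A minimal sketch of the GLCM texture step described above (BEMD preprocessing omitted; the offset, grey-level count, and test image are illustrative choices, not values from the paper):

```python
import numpy as np

# Build a grey-level co-occurrence matrix (GLCM) for horizontally adjacent
# pixels and derive two classic Haralick texture parameters from it.

def glcm(image, levels):
    m = np.zeros((levels, levels))
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols - 1):          # offset (0, 1): right neighbour
            m[image[r, c], image[r, c + 1]] += 1
    return m / m.sum()                      # normalise to joint probabilities

def texture_params(p):
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)     # local intensity variation
    energy = np.sum(p ** 2)                 # uniformity of the texture
    return contrast, energy

# A tiny 4-level test image with four uniform blocks.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
contrast, energy = texture_params(glcm(img, levels=4))
```

A production analysis would typically use a library implementation (e.g. scikit-image's `graycomatrix`/`graycoprops`) and all four parameters; the loop above just makes the definition explicit.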

  18. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, for quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable-isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. PMID:24889823

  19. Analytical methods for quantitation of prenylated flavonoids from hops

    PubMed Central

    Nikolić, Dejan; van Breemen, Richard B.

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach. PMID:24077106
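The calibration-curve quantitation that such LC-MS methods rely on can be sketched generically as follows. All concentrations and peak-area ratios below are invented for illustration, not taken from the review:

```python
import numpy as np

# Generic internal-standard calibration: peak-area ratios
# (analyte / internal standard) measured for known standards are
# regressed on concentration, and an unknown sample is back-calculated
# from the fitted line.

conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0])          # standards (ng/mL)
ratio = np.array([0.021, 0.105, 0.198, 1.02, 2.01])  # measured area ratios

slope, intercept = np.polyfit(conc, ratio, 1)        # least-squares line

def quantify(area_ratio):
    """Back-calculate concentration from an observed area ratio."""
    return (area_ratio - intercept) / slope

unknown = quantify(0.50)   # an unknown sample's area ratio
```

Real method validation adds what the review discusses: matrix-matched standards, weighting, and checks of linearity, accuracy, and precision across the calibration range.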

  1. A method for quantitative wet chemical analysis of urinary calculi.

    PubMed

    Larsson, L; Sörbo, B; Tiselius, H G; Ohman, S

    1984-06-27

    We describe a simple method for quantitative chemical analysis of urinary calculi requiring no specialized equipment. Pulverized calculi are dried over silica gel at room temperature and dissolved in nitric acid, which was the only effective agent for complete dissolution. Calcium, magnesium, ammonium, and phosphate are then determined by conventional methods. Oxalate is determined by a method based on the quenching action of oxalate on the fluorescence of a zirconium-flavonol complex. Uric acid, when treated with nitric acid, is stoichiometrically converted to alloxan, which is determined fluorimetrically with 1,2-phenylenediamine. Similarly, cystine is oxidized by nitric acid to sulfate, which is determined turbidimetrically as barium sulfate. Protein is determined spectrophotometrically as xanthoprotein. The total mass recovery of authentic calculi was 92.2 +/- 6.7 (SD) per cent. The method permits analysis of calculi as small as 1.0 mg. Internal quality control is performed with specially designed control samples. PMID:6086179

  2. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  4. Linguistic Alternatives to Quantitative Research Strategies. Part One: How Linguistic Mechanisms Advance Research Outcomes

    ERIC Educational Resources Information Center

    Yeager, Joseph; Sommer, Linda

    2007-01-01

    Combining psycholinguistic technologies and systems analysis created advances in motivational profiling and numerous new behavioral engineering applications. These advances leapfrog many mainstream statistical research methods, producing superior research results via cause-effect language mechanisms. Entire industries explore motives ranging from…

  5. [Method of quantitative determination of staphylococcal hyaluronidase activity].

    PubMed

    Generalov, I I

    1998-03-01

The proposed method for measuring the hyaluronidase activity of microorganisms is based on prevention of hyaluronic acid clot formation with rivanol under the effect of hyaluronidase. This makes possible both quantitative and qualitative detection of the hyaluronidase activities of different staphylococcus species and strains. The maximum level of the enzyme and the highest rate of its detection were typical of St. aureus. Strains producing hyaluronidase in quantities of at least 0.5 IU are significantly (p < 0.01) more often isolated from medical staff. PMID:9575732

  6. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular. PMID:23650936

  7. A comparison of ancestral state reconstruction methods for quantitative characters.

    PubMed

    Royer-Carenzi, Manuela; Didier, Gilles

    2016-09-01

    Choosing an ancestral state reconstruction method among the alternatives available for quantitative characters may be puzzling. We present here a comparison of seven of them, namely the maximum likelihood, restricted maximum likelihood, generalized least squares under Brownian, Brownian-with-trend and Ornstein-Uhlenbeck models, phylogenetic independent contrasts and squared parsimony methods. A review of the relations between these methods shows that the maximum likelihood, the restricted maximum likelihood and the generalized least squares under Brownian model infer the same ancestral states and can only be distinguished by the distributions accounting for the reconstruction uncertainty which they provide. The respective accuracy of the methods is assessed over character evolution simulated under a Brownian motion with (and without) directional or stabilizing selection. We give the general form of ancestral state distributions conditioned on leaf states under the simulation models. Ancestral distributions are used first, to give a theoretical lower bound of the expected reconstruction error, and second, to develop an original evaluation scheme which is more efficient than comparing the reconstructed and the simulated states. Our simulations show that: (i) the distributions of the reconstruction uncertainty provided by the methods generally make sense (some more than others); (ii) it is essential to detect the presence of an evolutionary trend and to choose a reconstruction method accordingly; (iii) all the methods show good performances on characters under stabilizing selection; (iv) without trend or stabilizing selection, the maximum likelihood method is generally the most accurate. PMID:27234644
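For the simplest case, a star tree (every leaf attached directly to the root), the maximum-likelihood ancestral state under Brownian motion reduces to an inverse-branch-length weighted mean of the leaf states, since each leaf value is Normal(root, sigma^2 * t_i). This is a sketch with made-up values; the methods compared in the paper handle arbitrary tree topologies:

```python
# ML root-state estimate under Brownian motion on a star tree:
# each leaf x_i ~ Normal(root, sigma^2 * t_i), so the ML root is the
# weighted mean with weights 1/t_i (shorter branch = more information).

def ml_root_state(leaf_states, branch_lengths):
    weights = [1.0 / t for t in branch_lengths]
    return sum(w * x for w, x in zip(weights, leaf_states)) / sum(weights)

# Three tips; the tip on the long branch (state 10.0) is down-weighted,
# pulling the estimate below the unweighted mean.
root = ml_root_state(leaf_states=[4.0, 6.0, 10.0],
                     branch_lengths=[1.0, 1.0, 4.0])
```

On a general tree the same logic is applied recursively (Felsenstein's pruning), which is why ML, REML and GLS under the Brownian model infer identical states, as the review notes.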

  8. Biological characteristics of crucian by quantitative inspection method

    NASA Astrophysics Data System (ADS)

    Chu, Mengqi

    2015-04-01

The biological characteristics of crucian carp were preliminarily investigated by quantitative inspection methods. Crucian carp (Carassius auratus), of the order Cypriniformes and family Cyprinidae, is a mainly plant-eating omnivorous fish that is gregarious and exhibits selection and ranking behaviour. It is widely distributed, and perennial waters across the country all support production. Indicators were measured in this experiment to characterize the growth and reproduction of crucian carp in the study area. Using the measured data (scale length, scale size, annulus diameter, etc.) and the related growth functions, growth in any given year can be calculated. Egg shape, colour, and weight were used to determine maturity, and the mean egg diameter (per 20 eggs) and the number of eggs per 0.5 g were used to calculate the relative and absolute fecundity of the fish; the measured crucian carp were females at puberty. From the relation between scale diameter and body length, a linear relationship was fitted (reported as y = 1.530 + 3.0649). The data show that fecundity is closely related to age: the older the fish, the more mature the gonad development and the greater the number of eggs; absolute fecundity also increases with pituitary development. Quantitative inspection of ingested bait organisms reveals the main, secondary, and incidental foods of crucian carp, and the degree of preference for various bait organisms. Fecundity increases with weight gain; it is characteristic of species and populations and is influenced by individual age, body length, body weight, environmental conditions (especially nutritional conditions), breeding habits, spawning frequency, and egg size. This series of studies of the biological characteristics of crucian carp provides an ecological basis for local feeding and breeding.

  9. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
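The FOSM step described above, combining input covariance with model sensitivity, is a one-line matrix product once linearised. A minimal sketch with illustrative numbers (not from the study):

```python
import numpy as np

# First-Order Second Moment (FOSM) propagation: for y = f(x) linearised
# about the mean, Var(y) ~= J * Sigma * J^T, where J is the sensitivity
# (Jacobian) of the output to the inputs and Sigma the input covariance.

def fosm_output_variance(jacobian, input_cov):
    J = np.atleast_2d(jacobian)
    return J @ input_cov @ J.T

# Two correlated inputs (think: hydraulic conductivities of two zones)
# and the head's sensitivity to each.
sensitivity = np.array([2.0, 0.5])
cov = np.array([[1.0, 0.3],
                [0.3, 0.5]])
var_head = float(fosm_output_variance(sensitivity, cov)[0, 0])
```

In the QDE setting this output variance is recomputed for candidate sample locations, and the location that reduces it most is drilled next.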

  10. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve as an initial assessment of dermal exposure, amongst others, resulting in a ranking of tasks and subsequently jobs. DREAM consists of an inventory and evaluation part. Two examples of dermal exposure of workers of a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outside clothing layer as well as on skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine who, where and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and determine most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative, dermal exposure assessment. PMID:12505908

  11. 7 CFR 27.92 - Method of payment; advance deposit.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Method of payment; advance deposit. 27.92 Section 27... Micronaire § 27.92 Method of payment; advance deposit. Any payment or advance deposit under this subpart...,” and may not be made in cash except in cases where the total payment or deposit does not exceed...

  12. NOTE: An innovative phantom for quantitative and qualitative investigation of advanced x-ray imaging technologies

    NASA Astrophysics Data System (ADS)

    Chiarot, C. B.; Siewerdsen, J. H.; Haycocks, T.; Moseley, D. J.; Jaffray, D. A.

    2005-11-01

Development, characterization, and quality assurance of advanced x-ray imaging technologies require phantoms that are quantitative and well suited to such modalities. This note reports on the design, construction, and use of an innovative phantom developed for advanced imaging technologies (e.g., multi-detector CT and the numerous applications of flat-panel detectors in dual-energy imaging, tomosynthesis, and cone-beam CT) in diagnostic and image-guided procedures. The design addresses shortcomings of existing phantoms by incorporating criteria satisfied by no other single phantom: (1) inserts are fully 3D—spherically symmetric rather than cylindrical; (2) modules are quantitative, presenting objects of known size and contrast for quality assurance and image quality investigation; (3) features are incorporated in ideal and semi-realistic (anthropomorphic) contexts; and (4) the phantom allows devices to be inserted and manipulated in an accessible module (right lung). The phantom consists of five primary modules: (1) head, featuring contrast-detail spheres approximate to brain lesions; (2) left lung, featuring contrast-detail spheres approximate to lung nodules; (3) right lung, an accessible hull in which devices may be placed and manipulated; (4) liver, featuring contrast-detail spheres approximate to metastases; and (5) abdomen/pelvis, featuring simulated kidneys, colon, rectum, bladder, and prostate. The phantom represents a two-fold evolution in design philosophy—from 2D (cylindrically symmetric) to fully 3D, and from exclusively qualitative or quantitative to a design accommodating quantitative study within an anatomical context. It has proven a valuable tool in investigations throughout our institution, including low-dose CT, dual-energy radiography, and cone-beam CT for image-guided radiation therapy and surgery.

  14. Advanced electromagnetic methods for aerospace vehicles

    NASA Astrophysics Data System (ADS)

    Balanis, Constantine A.; Sun, Weimin; El-Sharawy, El-Budawy; Aberle, James T.; Birtcher, Craig R.; Peng, Jian; Tirkas, Panayiotis A.; Kokotoff, David; Zavosh, Frank

    1993-06-01

    The Advanced Helicopter Electromagnetics (AHE) Industrial Associates Program has continuously progressed with its research effort focused on subjects identified and recommended by the Advisory Task Force of the program. The research activities in this reporting period have been steered toward practical helicopter electromagnetic problems, such as HF antenna problems and antenna efficiencies, recommended by the AHE members at the annual conference held at Arizona State University on 28-29 Oct. 1992 and the last biannual meeting held at the Boeing Helicopter on 19-20 May 1993. The main topics addressed include the following: Composite Materials and Antenna Technology. The research work on each topic is closely tied with the AHE Consortium members' interests. Significant progress in each subject is reported. Special attention in the area of Composite Materials has been given to the following: modeling of material discontinuity and their effects on towel-bar antenna patterns; guidelines for composite material modeling by using the Green's function approach in the NEC code; measurements of towel-bar antennas grounded with a partially material-coated plate; development of 3-D volume mesh generator for modeling thick and volumetric dielectrics by using FD-TD method; FDTD modeling of horn antennas with composite E-plane walls; and antenna efficiency analysis for a horn antenna loaded with composite dielectric materials.

  15. Advanced continuous cultivation methods for systems microbiology.

    PubMed

    Adamberg, Kaarel; Valgepea, Kaspar; Vilu, Raivo

    2015-09-01

    Increasing the throughput of systems biology-based experimental characterization of in silico-designed strains has great potential for accelerating the development of cell factories. For this, analysis of metabolism in the steady state is essential as only this enables the unequivocal definition of the physiological state of cells, which is needed for the complete description and in silico reconstruction of their phenotypes. In this review, we show that for a systems microbiology approach, high-resolution characterization of metabolism in the steady state--growth space analysis (GSA)--can be achieved by using advanced continuous cultivation methods termed changestats. In changestats, an environmental parameter is continuously changed at a constant rate within one experiment whilst maintaining cells in the physiological steady state similar to chemostats. This increases the resolution and throughput of GSA compared with chemostats, and, moreover, enables following of the dynamics of metabolism and detection of metabolic switch-points and optimal growth conditions. We also describe the concept, challenge and necessary criteria of the systematic analysis of steady-state metabolism. Finally, we propose that such systematic characterization of the steady-state growth space of cells using changestats has value not only for fundamental studies of metabolism, but also for systems biology-based metabolic engineering of cell factories. PMID:26220303

  16. Advanced electromagnetic methods for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; Sun, Weimin; El-Sharawy, El-Budawy; Aberle, James T.; Birtcher, Craig R.; Peng, Jian; Tirkas, Panayiotis A.; Andrew, William V.; Kokotoff, David; Zavosh, Frank

    1993-01-01

    The Advanced Helicopter Electromagnetics (AHE) Industrial Associates Program has fruitfully completed its fourth year. Under the support of the AHE members and the joint effort of the research team, new and significant progress has been achieved during the year. Following the recommendations of the Advisory Task Force, the research effort has been placed on more practical helicopter electromagnetic problems, such as HF antennas, composite materials, and antenna efficiencies. In this annual report, the main topics addressed include composite materials and antenna technology. The research work on each topic has been driven by the AHE consortium members' interests and needs. The achievements and progress in each subject are reported in individual sections of the report. The work in the area of composite materials includes: modeling of low-conductivity composite materials by using the Green's function approach; guidelines for composite material modeling by using the Green's function approach in the NEC code; development of a 3-D volume mesh generator for modeling thick and volumetric dielectrics by using the FD-TD method; modeling antenna elements mounted on a composite Comanche tail stabilizer; and antenna pattern control and efficiency estimates for a horn antenna loaded with composite dielectric materials.

  17. Advanced electromagnetic methods for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; Sun, Weimin; El-Sharawy, El-Budawy; Aberle, James T.; Birtcher, Craig R.; Peng, Jian; Tirkas, Panayiotis A.; Kokotoff, David; Zavosh, Frank

    1993-01-01

    The Advanced Helicopter Electromagnetics (AHE) Industrial Associates Program has continuously progressed with its research effort focused on subjects identified and recommended by the Advisory Task Force of the program. The research activities in this reporting period have been steered toward practical helicopter electromagnetic problems, such as HF antenna problems and antenna efficiencies, recommended by the AHE members at the annual conference held at Arizona State University on 28-29 Oct. 1992 and the last biannual meeting held at the Boeing Helicopter on 19-20 May 1993. The main topics addressed include the following: Composite Materials and Antenna Technology. The research work on each topic is closely tied with the AHE Consortium members' interests. Significant progress in each subject is reported. Special attention in the area of Composite Materials has been given to the following: modeling of material discontinuity and their effects on towel-bar antenna patterns; guidelines for composite material modeling by using the Green's function approach in the NEC code; measurements of towel-bar antennas grounded with a partially material-coated plate; development of 3-D volume mesh generator for modeling thick and volumetric dielectrics by using FD-TD method; FDTD modeling of horn antennas with composite E-plane walls; and antenna efficiency analysis for a horn antenna loaded with composite dielectric materials.

  18. A novel semi-quantitative method for measuring tissue bleeding.

    PubMed

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula: N = Bt x 100 / At where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples. PMID:24190861
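    The area-ratio formula above is simple enough to sketch directly. A minimal illustration (function name and sample values are hypothetical), with areas expressed in whatever units the AutoCAD "polylines" report:

```python
def bleeding_percentage(sample_areas, bleeding_areas):
    """N = Bt * 100 / At: percent of the total tissue surface affected
    by bleeding, where At sums the whole-sample areas (A) and Bt sums
    the traced bleeding areas (B) across all samples."""
    at = sum(sample_areas)
    bt = sum(bleeding_areas)
    return bt * 100.0 / at

# three hypothetical samples, areas in arbitrary drawing units
n = bleeding_percentage([120.0, 95.5, 110.2], [12.0, 8.3, 15.6])
```

    Because the formula is a ratio of areas measured on the same microphotographs, the result is independent of magnification and of the drawing units, as long as A and B are traced on the same image.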

  19. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, in quantitative assays as well as qualitative profiling of glycoproteins. Because aberrant glycosylation of a glycoprotein is widely recognized to be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with the protein glycosylation-targeting enrichment technologies, using label-free MS, stable isotope-labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Mass Spectrom Rev 34:148–165, 2015. PMID:24889823

  20. Quantitative methods in electroencephalography to access therapeutic response.

    PubMed

    Diniz, Roseane Costa; Fontenele, Andrea Martins Melo; Carmo, Luiza Helena Araújo do; Ribeiro, Aurea Celeste da Costa; Sales, Fábio Henrique Silva; Monteiro, Sally Cristina Moutinho; Sousa, Ana Karoline Ferreira de Castro

    2016-07-01

    Pharmacometrics or Quantitative Pharmacology aims to quantitatively analyze the interaction between drugs and patients, resting on the tripod of pharmacokinetics, pharmacodynamics and disease monitoring to identify variability in drug response. As a subject of central interest in the training of pharmacists, this work was carried out with a view to promoting this idea through methods to assess the therapeutic response to drugs with central action. This paper discusses quantitative methods (Fast Fourier Transform, Magnitude Squared Coherence, Conditional Entropy, Generalised Linear semi-canonical Correlation Analysis, Statistical Parametric Network and Mutual Information Function) used to evaluate the EEG signals obtained after administration regimens of drugs, the main findings and their clinical relevance, pointing to them as a contribution to the construction of a different pharmaceutical practice. Peter Anderer et al. in 2000 showed the effect of 20 mg of buspirone in 20 healthy subjects at 1, 2, 4, 6 and 8 h after oral ingestion of the drug. The areas of increased power of the theta frequency occurred mainly in the temporo-occipito-parietal region. Sampaio et al. (2007) showed that the use of bromazepam, which enhances the action of GABA (gamma-aminobutyric acid), an inhibitory neurotransmitter of the central nervous system, could theoretically promote dissociation of cortical functional areas, a decrease of functional connectivity, and a decrease of cognitive functions, reflected in smaller values of coherence (an electrophysiological magnitude measured from the EEG by software). Ahmad Khodayari-Rostamabad et al. in 2015 suggested that such a measure could potentially be a useful clinical tool to assess adverse effects of opioids and hence give rise to treatment guidelines: a relation was found between changes in pain intensity and brain sources (at maximum activity locations) during remifentanil infusion, despite its potent analgesic effect. The statement of mathematical and computational

  1. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
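    The contrast drawn above between safety factors and quantified reliability can be made concrete with a crude Monte Carlo estimate of a single-mode failure probability. This is an illustrative sketch only (toy limit state and distributions, not a method from the report): failure means the limit-state function g(x) = capacity − demand drops to zero or below.

```python
import random

def mc_failure_probability(limit_state, draw_inputs, n=100_000, seed=1):
    """Crude Monte Carlo: the fraction of sampled input vectors x for
    which the limit-state function g(x) <= 0 (i.e., failure)."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(draw_inputs(rng)) <= 0.0)
    return failures / n

# toy single mode: capacity R ~ N(10, 1), demand S ~ N(7, 1), g = R - S
p_f = mc_failure_probability(
    lambda x: x[0] - x[1],
    lambda rng: (rng.gauss(10.0, 1.0), rng.gauss(7.0, 1.0)),
)
```

    For a redundant system (multiple component failures, as in the truss example), g would be replaced by a system-level condition over the component limit states; advanced methods exist precisely because brute-force sampling becomes expensive at small failure probabilities.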

  2. A Quantitative Method for Weight Selection in SGDDP.

    PubMed

    Huang, Qin; Chen, Gang; Yuan, Zhilong; Zhang, Ying; Wenrich, Judy

    2015-01-01

    Ethnic factors pose a major challenge to evaluating the treatment effect of a new drug in a targeted ethnic (TE) population in emerging regions based on the results from a multiregional clinical trial (MRCT). To address this issue with statistical rigor, Huang et al. (2012) proposed a new design of a simultaneous global drug development program (SGDDP) which used weighted Z tests to combine the information collected from the nontargeted ethnic (NTE) group in the MRCT with that from the TE group in both the MRCT and a simultaneously designed local clinical trial (LCT). An important and open question in the SGDDP design was how to downweight the information collected from the NTE population to reflect the potential impact of ethnic factors and to ensure that the effect size for TE patients is clinically meaningful. In this paper, we relate the weight selection for the SGDDP to Method 1 proposed in the Japanese regulatory guidance published by the Ministry of Health, Labour and Welfare (MHLW) in 2007. Method 1 is only applicable when the true effect sizes are assumed to be equal for the TE and NTE groups. We modified the Method 1 formula for more general scenarios, and used it to develop a quantitative method of weight selection for the design of the SGDDP which, at the same time, provides sufficient power to descriptively check the consistency of the effect size for TE patients to a clinically meaningful magnitude. PMID:25365548
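    The weighted Z combination underlying the SGDDP design can be sketched generically. This is a hedged illustration of down-weighting NTE evidence (the weight parameterization here is an assumption for illustration, not the exact formula of Huang et al.): full weight on the TE statistic, weight w on the NTE statistic, renormalized so the combined statistic remains standard normal under the null for independent samples.

```python
import math

def combined_z(z_te, z_nte, w):
    """Weighted Z test statistic: Z = (Z_TE + w * Z_NTE) / sqrt(1 + w^2).
    Under H0, with independent samples and 0 <= w <= 1, the combination
    is N(0, 1); w controls how much NTE information is borrowed."""
    if not 0.0 <= w <= 1.0:
        raise ValueError("w must lie in [0, 1]")
    return (z_te + w * z_nte) / math.sqrt(1.0 + w * w)
```

    The endpoints show the role of the weight: w = 0 discards the NTE data entirely, while w = 1 pools both groups with equal weight; the paper's contribution is a principled rule for choosing w between these extremes.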

  3. A simple regression-based method to map quantitative trait loci underlying function-valued phenotypes.

    PubMed

    Kwak, Il-Youp; Moore, Candace R; Spalding, Edgar P; Broman, Karl W

    2014-08-01

    Most statistical methods for quantitative trait loci (QTL) mapping focus on a single phenotype. However, multiple phenotypes are commonly measured, and recent technological advances have greatly simplified the automated acquisition of numerous phenotypes, including function-valued phenotypes, such as growth measured over time. While methods exist for QTL mapping with function-valued phenotypes, they are generally computationally intensive and focus on single-QTL models. We propose two simple, fast methods that maintain high power and precision and are amenable to extensions with multiple-QTL models using a penalized likelihood approach. After identifying multiple QTL by these approaches, we can view the function-valued QTL effects to provide a deeper understanding of the underlying processes. Our methods have been implemented as a package for R, funqtl. PMID:24931408

  4. A Quantitative Vainberg Method for Black Box Scattering

    NASA Astrophysics Data System (ADS)

    Galkowski, Jeffrey

    2016-05-01

    We give a quantitative version of Vainberg's method relating pole free regions to propagation of singularities for black box scatterers. In particular, we show that there is a logarithmic resonance free region near the real axis of size τ with polynomial bounds on the resolvent if and only if the wave propagator gains derivatives at rate τ. Next we show that if there exist singularities in the wave trace at times tending to infinity which smooth at rate τ, then there are resonances in logarithmic strips whose width is given by τ. As our main application of these results, we give sharp bounds on the size of resonance free regions in scattering on geometrically nontrapping manifolds with conic points. Moreover, these bounds are generically optimal on exteriors of nontrapping polygonal domains.

  5. Methods for Quantitative Interpretation of Retarding Field Analyzer Data

    SciTech Connect

    Calvey, J.R.; Crittenden, J.A.; Dugan, G.F.; Palmer, M.A.; Furman, M.; Harkay, K.

    2011-03-28

    Over the course of the CesrTA program at Cornell, over 30 Retarding Field Analyzers (RFAs) have been installed in the CESR storage ring, and a great deal of data has been taken with them. These devices measure the local electron cloud density and energy distribution, and can be used to evaluate the efficacy of different cloud mitigation techniques. Obtaining a quantitative understanding of RFA data requires use of cloud simulation programs, as well as a detailed model of the detector itself. In a drift region, the RFA can be modeled by postprocessing the output of a simulation code, and one can obtain best fit values for important simulation parameters with a chi-square minimization method.
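    The chi-square fitting step described above reduces, in its simplest form, to minimizing a weighted sum of squared residuals over candidate parameter values. A brute-force sketch (toy one-parameter model and data; function names are hypothetical, not from the CesrTA codes):

```python
def chi_square(measured, predicted, sigma):
    """Sum of squared residuals, each weighted by its measurement error."""
    return sum(((m - p) / s) ** 2 for m, p, s in zip(measured, predicted, sigma))

def best_fit_parameter(measured, sigma, model, candidates):
    """Grid search: return the candidate parameter whose model
    prediction minimizes chi-square against the measured signal."""
    return min(candidates, key=lambda p: chi_square(measured, model(p), sigma))

# toy 'simulation': detector signal proportional to one cloud-density parameter
best = best_fit_parameter(
    measured=[1.1, 1.9, 3.2],
    sigma=[0.1, 0.1, 0.1],
    model=lambda p: [p, 2.0 * p, 3.0 * p],
    candidates=[0.5, 0.8, 1.0, 1.2, 1.5],
)
```

    In practice the "model" is the postprocessed output of a cloud simulation code run per candidate parameter set, so each chi-square evaluation is expensive and the grid is chosen accordingly.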

  6. A quantitative dimming method for LED based on PWM

    NASA Astrophysics Data System (ADS)

    Wang, Jiyong; Mou, Tongsheng; Wang, Jianping; Tian, Xiaoqing

    2012-10-01

    Traditional light sources were required to provide stable and uniform illumination for living or working environments, in consideration of human visual function. That requirement seemed sufficient until non-visual functions of the ganglion cells in the photosensitive layer of the retina were found. A new generation of lighting technology, however, is emerging, based on novel lighting materials such as LEDs and on photobiological effects on human physiology and behavior. To realize dynamic LED lighting whose intensity and color are adjustable to the needs of photobiological effects, a quantitative dimming method based on Pulse Width Modulation (PWM) and light-mixing technology is presented. Beginning with two-channel PWM, this paper demonstrates the determinacy and limitations of PWM dimming for realizing Expected Photometric and Colorimetric Quantities (EPCQ), based on an analysis of geometrical, photometric, colorimetric and electrodynamic constraints. A quantitative model that maps the EPCQ onto duty cycles is then established. The model shows that determinacy holds only for two- and three-channel PWM, whereas the limitation is common to all multi-channel configurations. To examine the model, a light-mixing experiment with two kinds of white LED simulated variations of illuminance and Correlated Color Temperature (CCT) from dawn to midday. Mean deviations between theoretical and measured values were 15 lx and 23 K, respectively. Results show that this method can effectively realize light with a specified EPCQ, and it provides a theoretical basis and a practical way for dynamic LED lighting.
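    Because PWM output is linear in duty cycle, the two-channel mapping from target quantities to duty cycles described above amounts to solving a 2x2 linear system. A hedged sketch (the channel contribution vectors are hypothetical calibration data, not values from the paper): each channel is characterized by its pair of photometric/colorimetric contributions at 100% duty, and the target pair is decomposed over those two vectors.

```python
def two_channel_duty_cycles(v_warm, v_cool, target):
    """Solve [v_warm v_cool] @ (d1, d2) = target for the duty cycles,
    where each v is a channel's 2-vector of contributions at 100% duty.
    The system is determinate for two channels; a physically realizable
    solution must land in [0, 1] for both duty cycles."""
    (a, c), (b, d) = v_warm, v_cool      # columns of the 2x2 mixing matrix
    det = a * d - b * c
    if det == 0.0:
        raise ValueError("channel vectors are colinear; system is singular")
    t1, t2 = target
    d1 = (t1 * d - t2 * b) / det         # Cramer's rule
    d2 = (a * t2 - c * t1) / det
    if not (0.0 <= d1 <= 1.0 and 0.0 <= d2 <= 1.0):
        raise ValueError("target outside the two-channel gamut")
    return d1, d2
```

    The out-of-range check is where the paper's "limitation" shows up in code: a target EPCQ outside the mixable gamut has no valid duty-cycle solution, however the weights are chosen.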

  7. Advanced radiochromic film methodologies for quantitative dosimetry of small and nonstandard fields

    NASA Astrophysics Data System (ADS)

    Rosen, Benjamin S.

    Radiotherapy treatments with small and nonstandard fields are increasing in use as collimation and targeting become more advanced, sparing normal tissue while increasing tumor dose. However, dosimetry of small and nonstandard fields is more difficult than that of conventional fields due to loss of lateral charged-particle equilibrium, tight measurement setup requirements, source occlusion, and the volume-averaging effect of conventional dosimeters. This work aims to create new small and nonstandard field dosimetry protocols using radiochromic film (RCF) in conjunction with novel readout and analysis methodologies. It is also the intent of this work to develop an improved understanding of RCF structure and mechanics for its quantitative use in general applications. Conventional digitization techniques employ white-light, flatbed document scanners or scanning-laser densitometers which are not optimized for RCF dosimetry. A point-by-point precision laser densitometry system (LDS) was developed for this work to overcome the film-scanning artifacts associated with the use of conventional digitizers, such as positional scan dependence, off-axis light scatter, glass bed interference, and low signal-to-noise ratios. The LDS was shown to be optically traceable to national standards and to provide highly reproducible density measurements. Use of the LDS resulted in increased agreement between RCF dose measurements and the single-hit detector model of film response, facilitating traceable RCF calibrations based on calibrated physical quantities. Gafchromic EBT3 energy response to a variety of reference x-ray and gamma-ray beam qualities was also investigated. Conventional Monte Carlo methods are not capable of predicting film intrinsic energy response to arbitrary particle spectra. Therefore, a microdosimetric model was developed to simulate the underlying physics of the radiochromic mechanism and was shown to correctly predict the intrinsic response relative to a

  8. A Quantitative Assessment Method for Ascaris Eggs on Hands

    PubMed Central

    Jeandron, Aurelie; Ensink, Jeroen H. J.; Thamsborg, Stig M.; Dalsgaard, Anders; Sengupta, Mita E.

    2014-01-01

    The importance of hands in the transmission of soil transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine recovery rates of eggs for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt] and non-ionic [Tween80 0.1% - polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed that validate the hand washing method under field conditions, e.g. including people of different age, lower levels of contamination and various levels of hand cleanliness. PMID:24802859

  9. Quantitative methods for somatosensory evaluation in atypical odontalgia.

    PubMed

    Porporatti, André Luís; Costa, Yuri Martins; Stuginski-Barbosa, Juliana; Bonjardim, Leonardo Rigoldi; Conti, Paulo César Rodrigues; Svensson, Peter

    2015-01-01

    A systematic review was conducted to identify reliable somatosensory evaluation methods for atypical odontalgia (AO) patients. The computerized search included the main databases (MEDLINE, EMBASE, and Cochrane Library). The studies included used the following quantitative sensory testing (QST) methods: mechanical detection threshold (MDT), mechanical pain threshold (MPT) (pinprick), pressure pain threshold (PPT), dynamic mechanical allodynia with a cotton swab (DMA1) or a brush (DMA2), warm detection threshold (WDT), cold detection threshold (CDT), heat pain threshold (HPT), cold pain threshold (CPT), and/or wind-up ratio (WUR). The publications meeting the inclusion criteria revealed that only mechanical allodynia tests (DMA1, DMA2, and WUR) were significantly higher and pain threshold tests to heat stimulation (HPT) were significantly lower on the affected side, compared with the contralateral side, in AO patients; however, for MDT, MPT, PPT, CDT, and WDT, the results were not significant. These data support the presence of central sensitization features, such as allodynia and temporal summation. In contrast, considerable inconsistencies between studies were found when AO patients were compared with healthy subjects. In clinical settings, the most reliable evaluation method for AO in patients with persistent idiopathic facial pain would be intraindividual assessments using HPT or mechanical allodynia tests. PMID:25627886

  10. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high-resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as in patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and with the commercial software, and the results were used to obtain corrected images. Corrected images based on the ML approach and on the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality, and assurance of performance during clinical use of endoscopic technologies.
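    The local-magnification idea lends itself to a one-line illustration: with a square grid target, ML between adjacent imaged grid lines is the imaged spacing divided by the true pitch, and its fall-off toward the periphery traces the fisheye distortion. A minimal sketch (the line positions below are hypothetical, not measured gastroscope data):

```python
def local_magnification(line_positions_px, pitch_mm):
    """M_L between successive imaged grid lines along one radius:
    imaged spacing (pixels) over true grid pitch (mm). Barrel/fisheye
    distortion appears as M_L decreasing away from the image center."""
    return [(b - a) / pitch_mm
            for a, b in zip(line_positions_px, line_positions_px[1:])]

# grid lines imaged at shrinking spacing away from the image center
ml = local_magnification([0.0, 10.0, 19.0, 27.0, 34.0], 1.0)
```

    A full 2-D implementation would detect grid intersections in the image and evaluate this ratio locally in every direction, but the per-interval quotient above is the quantity the ML approach is built on.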

  11. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING

    EPA Science Inventory

    The overall goal of the EPA-ORD NERL research program on Computational Toxicology (CompTox) is to provide the Agency with the tools of modern chemistry, biology, and computing to improve quantitative risk assessments and reduce uncertainties in the source-to-adverse outcome conti...

  12. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  13. Ongoing advances in quantitative PpIX fluorescence guided intracranial tumor resection (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Olson, Jonathan D.; Kanick, Stephen C.; Bravo, Jaime J.; Roberts, David W.; Paulsen, Keith D.

    2016-03-01

    Aminolevulinic-acid-induced protoporphyrin IX (ALA-PpIX) is being investigated as a biomarker to guide neurosurgical resection of brain tumors. ALA-PpIX fluorescence can be observed visually in the surgical field; however, raw fluorescence emissions can be distorted by factors other than the fluorophore concentration. Specifically, fluorescence emissions are mixed with autofluorescence and attenuated by the background absorption and scattering properties of the tissue. Recent work at Dartmouth has developed advanced fluorescence detection approaches that return quantitative assessments of PpIX concentration, which are independent of background optical properties. The quantitative fluorescence imaging (qFI) approach has increased sensitivity to residual disease within the resection cavity at the end of surgery that was not visible to the naked eye through the operating microscope. This presentation outlines clinical observations made during an ongoing investigation of ALA-PpIX based guidance of tumor resection. PpIX fluorescence measurements made with a wide-field hyperspectral imaging approach are co-registered with point assessments using a fiber optic probe. Data show variations in the measured PpIX accumulation among different clinical tumor grades (i.e., high grade glioma, low grade glioma), types (i.e., primary tumors, metastases) and normal structures of interest (e.g., normal cortex, hippocampus). These results highlight the contrast enhancement and underscore the potential clinical benefit offered by quantitative measurements of PpIX concentration during resection of intracranial tumors.

  14. Advances in liquid chromatography-high-resolution mass spectrometry for quantitative and qualitative environmental analysis.

    PubMed

    Aceña, Jaume; Stampachiacchiere, Serena; Pérez, Sandra; Barceló, Damià

    2015-08-01

    This review summarizes the advances in environmental analysis by liquid chromatography-high-resolution mass spectrometry (LC-HRMS) during the last decade and discusses different aspects of their application. LC-HRMS has become a powerful tool for simultaneous quantitative and qualitative analysis of organic pollutants, enabling their quantitation and the search for metabolites and transformation products or the detection of unknown compounds. LC-HRMS provides more information than low-resolution (LR) MS for each sample because it can accurately determine the mass of the molecular ion and, when MS-MS is available, of its fragment ions. Another advantage is that the data can be processed using either target analysis, suspect screening, retrospective analysis, or non-target screening. With the growing popularity and acceptance of HRMS analysis, current guidelines for compound confirmation need to be revised for quantitative and qualitative purposes. Furthermore, new commercial software and user-built libraries are required to mine data in an efficient and comprehensive way. The scope of this critical review is not to provide a comprehensive overview of the many studies performed with LC-HRMS in the field of environmental analysis, but to reveal its advantages and limitations using different workflows. PMID:26138893

  15. Combined megaplex TCR isolation and SMART-based real-time quantitation methods for quantitating antigen-specific T cell clones in mycobacterial infection

    PubMed Central

    Du, George; Qiu, Liyou; Shen, Ling; Sehgal, Probhat; Shen, Yun; Huang, Dan; Letvin, Norman L.; Chen, Zheng W.

    2010-01-01

    Despite recent advances in measuring cellular immune responses, the quantitation of antigen-specific T cell clones in infections or diseases remains challenging. Here, we employed combined megaplex TCR isolation and SMART-based real-time quantitation methods to quantitate numerous antigen-specific T cell clones using limited amounts of specimens. The megaplex TCR isolation covered the repertoire comprised of recombinants from 24 Vβ families and 13 Jβ segments, and allowed us to isolate TCR VDJ clonotypic sequences from one or many PPD-specific IFNγ-producing T cells that were purified by flow cytometry sorting. The SMART amplification technique was then validated for its capacity to proportionally enrich cellular TCR mRNA/cDNA for real-time quantitation of large numbers of T cell clones. SMART-amplified cDNA was shown to maintain relative expression levels of TCR genes when compared to unamplified cDNA. While the SMART-based real-time quantitative PCR conferred a detection limit of 10−5 to 10−6 antigen-specific T cells, the clonotypic primers specifically amplified and quantitated the target clone TCR while discriminating against clones that differed by ≥2 bases in the DJ regions. Furthermore, the combined megaplex TCR isolation and SMART-based real-time quantitation methods allowed us to quantitate large numbers of PPD-specific IFNγ-producing T cell clones using as few as 2×106 PBMC collected weekly after mycobacterial infection. This assay system may be useful for studies of antigen-specific T cell clones in tumors, autoimmune and infectious diseases. PMID:16403511

  16. Simple laboratory methods for quantitative IR measurements of CW agents

    NASA Astrophysics Data System (ADS)

    Puckrin, Eldon; Thériault, Jean-Marc; Lavoie, Hugo; Dubé, Denis; Lepage, Carmela J.; Petryk, Michael

    2005-11-01

    A simple method is presented for quantitatively measuring the absorbance of chemical warfare (CW) agents and their simulants in the vapour phase. The technique is based on a standard lab-bench FTIR spectrometer, a 10-cm gas cell, a high-accuracy Baratron pressure manometer, a vacuum pump, and simple stainless-steel hardware components. The results of this measurement technique are demonstrated for sarin (GB) and soman (GD). A second technique is also introduced for the passive IR detection of CW agents in an open-air path located in a fumehood. Using a modified open cell with a pathlength of 45 cm, open-air passive infrared measurements have been obtained for simulants and several classical CW agents. Detection, identification and quantification results based on passive infrared measurements are presented for GB and the CW agent simulant, DMMP, using the CATSI sensor which has been developed by DRDC Valcartier. The open-cell technique represents a relatively simple and feasible method for examining the detection capability of passive sensors, such as CATSI, for CW agents.

  17. Spy quantitative inspection with a machine vision light sectioning method

    NASA Astrophysics Data System (ADS)

    Tu, Da-Wei; Lin, Cai-Xing

    2000-08-01

    Machine vision light sectioning sensing is developed and expanded in this paper to the range of spy quantitative inspection for hole-like workpieces. A light beam from a semiconductor laser diode is converged into a line shape by a cylindrical lens. A special compact reflecting-refracting prism group is designed to ensure that such a sectioning light is projected axially onto the inner surface, and to make the deformed line be imaged onto a CCD sensitive area. The image is digitized and captured into a computer by a 512×512-pixel card, and machine vision image processing methods such as thresholding, line-centre detection and least-squares fitting are developed for contour feature extraction and description. Two other important problems in such an inspection system are how to orient the deep-going optical probe and how to bring the projected line into focus. A focusing criterion based on image position deviation and a four-step orientation procedure are put forward and shown to be feasible. The experimental results show that the principle is correct and the techniques are realizable, and that the method holds good promise for industrial application.
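    The processing chain in this abstract (thresholding, line-centre detection, least-squares fitting) can be sketched in a few lines. This is a minimal illustration of the generic techniques, not the authors' implementation: the centroid-based sub-pixel centre finder, the synthetic image, and the straight-line fit are all assumptions standing in for their contour-extraction code.

    ```python
    def line_centers(image, threshold):
        """Per-column, intensity-weighted centroid row of pixels above the
        threshold: a sub-pixel estimate of the projected line's centre."""
        centers = []
        for x in range(len(image[0])):
            col = [(y, image[y][x]) for y in range(len(image))
                   if image[y][x] >= threshold]
            if col:
                total = sum(v for _, v in col)
                centers.append((x, sum(y * v for y, v in col) / total))
        return centers

    def fit_line(points):
        """Least-squares fit y = a*x + b via the normal equations."""
        n = len(points)
        sx = sum(x for x, _ in points)
        sy = sum(y for _, y in points)
        sxx = sum(x * x for x, _ in points)
        sxy = sum(x * y for x, y in points)
        a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        b = (sy - a * sx) / n
        return a, b

    # Synthetic 4x4 image: a bright laser line along row y = 2
    image = [
        [0, 0, 0, 0],
        [10, 20, 10, 0],
        [90, 80, 90, 100],
        [10, 20, 10, 0],
    ]
    slope, intercept = fit_line(line_centers(image, 50))
    ```

    In the real system the fitted contour would describe the hole's inner surface profile rather than a straight line, but the thresholding and centroid steps are the same.
    
    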

  18. Breast tumour visualization using 3D quantitative ultrasound methods

    NASA Astrophysics Data System (ADS)

    Gangeh, Mehrdad J.; Raheem, Abdul; Tadayyon, Hadi; Liu, Simon; Hadizad, Farnoosh; Czarnota, Gregory J.

    2016-04-01

    Breast cancer is one of the most common cancer types, accounting for 29% of all cancer cases. Early detection and treatment have a crucial impact on improving the survival of affected patients. Ultrasound (US) is a non-ionizing, portable, inexpensive, real-time imaging modality for screening and quantifying breast cancer. Due to these attractive attributes, the last decade has witnessed many studies on using quantitative ultrasound (QUS) methods in tissue characterization. However, these studies have mainly been limited to 2-D QUS methods using hand-held US (HHUS) scanners. With the availability of automated breast ultrasound (ABUS) technology, this study is the first to develop 3-D QUS methods for the ABUS visualization of breast tumours. Using an ABUS system, unlike with a manual 2-D HHUS device, the patient's whole breast was scanned in an automated manner. The acquired frames were subsequently examined and a region of interest (ROI) was selected in each frame where tumour was identified. Standard 2-D QUS methods were used to compute spectral and backscatter coefficient (BSC) parametric maps on the selected ROIs. Next, the computed 2-D parameters were mapped to a Cartesian 3-D space, interpolated, and rendered to provide a transparent color-coded visualization of the entire breast tumour. Such 3-D visualization can potentially be used for further analysis of breast tumours in terms of their size and extension. Moreover, the 3-D volumetric scans can be used for tissue characterization and the categorization of breast tumours as benign or malignant by quantifying the computed parametric maps over the whole tumour volume.

  19. Why Video? How Technology Advances Method

    ERIC Educational Resources Information Center

    Downing, Martin J., Jr.

    2008-01-01

    This paper reports on the use of video to enhance qualitative research. Advances in technology have improved our ability to capture lived experiences through visual means. I reflect on my previous work with individuals living with HIV/AIDS, the results of which are described in another paper, to evaluate the effectiveness of video as a medium that…

  20. Overview of Student Affairs Research Methods: Qualitative and Quantitative.

    ERIC Educational Resources Information Center

    Perl, Emily J.; Noldon, Denise F.

    2000-01-01

    Reviews the strengths and weaknesses of quantitative and qualitative research in student affairs research, noting that many student affairs professionals question the value of more traditional quantitative approaches, though they typically have strong people skills that they have applied to becoming good qualitative researchers.…

  1. Quantitative Methods for Comparing Different Polyline Stream Network Models

    SciTech Connect

    Danny L. Anderson; Daniel P. Ames; Ping Yang

    2014-04-01

    Two techniques for exploring relative horizontal accuracy of complex linear spatial features are described and sample source code (pseudo code) is presented for this purpose. The first technique, relative sinuosity, is presented as a measure of the complexity or detail of a polyline network in comparison to a reference network. We term the second technique longitudinal root mean squared error (LRMSE) and present it as a means for quantitatively assessing the horizontal variance between two polyline data sets representing digitized (reference) and derived stream and river networks. Both relative sinuosity and LRMSE are shown to be suitable measures of horizontal stream network accuracy for assessing quality and variation in linear features. Both techniques have been used in two recent investigations involving the extraction of hydrographic features from LiDAR elevation data. One confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes yielded better stream network delineations, based on sinuosity and LRMSE, when using LiDAR-derived DEMs. The other demonstrated a new method of delineating stream channels directly from LiDAR point clouds, without the intermediate step of deriving a DEM, and showed that direct delineation from LiDAR point clouds yielded a much better match, as indicated by the LRMSE.
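    The sinuosity component of the comparison above can be sketched directly: sinuosity is the polyline's path length divided by the straight-line distance between its endpoints, and relative sinuosity compares a derived network against a reference. This is a minimal sketch of that definition only; the LRMSE sampling procedure and the authors' actual pseudo code are not reproduced here.

    ```python
    from math import hypot

    def path_length(poly):
        """Total length of a polyline given as a list of (x, y) vertices."""
        return sum(hypot(x2 - x1, y2 - y1)
                   for (x1, y1), (x2, y2) in zip(poly, poly[1:]))

    def sinuosity(poly):
        """Path length divided by the straight-line endpoint distance."""
        (x0, y0), (xn, yn) = poly[0], poly[-1]
        return path_length(poly) / hypot(xn - x0, yn - y0)

    def relative_sinuosity(derived, reference):
        """> 1 means the derived network is more convoluted (more detail
        or more noise) than the reference network."""
        return sinuosity(derived) / sinuosity(reference)
    ```

    A perfectly straight reach has sinuosity 1.0; a derived channel that zigzags around the digitized reference line scores higher, which is how the measure flags over- or under-detailed delineations.
    
    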

  2. Quantitative methods in the study of trypanosomes and their applications*

    PubMed Central

    Lumsden, W. H. R.

    1963-01-01

    In the first part of this paper the author summarizes and discusses previous quantitative work on trypanosomes, with particular reference to biometrical studies, in vivo and in vitro studies on numbers of trypanosomes, studies on hosts infected with trypanosomes, and physiological studies. The second part discusses recent work done at the East African Trypanosomiasis Research Organization. A method for the measurement of the infectivity of trypanosome suspensions, based on serial dilution and inoculation into test animals, is outlined, and applications likely to improve diagnostic procedures are suggested for it. Such applications might include: the establishment of experimental procedures not significantly reducing the infectivity of trypanosomes under experiment; determination of the effects on the infectivity of preserved material of some of the factors in the process of preservation, important for the preparation of standard material; comparison of the efficiency of different culture media for the isolation of trypanosomes; study of the distribution of trypanosomes in the vertebrate host; and measurement of the susceptibility of trypanosomes to drugs. The author stresses the importance of relating future experimental work with trypanosomes to preserved material for which comprehensive documentation is available. PMID:20604152

  3. Method and apparatus for advancing tethers

    DOEpatents

    Zollinger, W. Thor

    1998-01-01

    A tether puller for advancing a tether through a channel may include a bellows assembly having a leading end fixedly attached to the tether at a first position and a trailing end fixedly attached to the tether at a second position so that the leading and trailing ends of the bellows assembly are located a substantially fixed distance apart. The bellows assembly includes a plurality of independently inflatable elements each of which may be separately inflated to an extended position and deflated to a retracted position. Each of the independently inflatable elements expands radially and axially upon inflation. An inflation system connected to the independently inflatable elements inflates and deflates selected ones of the independently inflatable elements to cause the bellows assembly to apply a tractive force to the tether and advance it in the channel.

  4. Method and apparatus for advancing tethers

    DOEpatents

    Zollinger, W.T.

    1998-06-02

    A tether puller for advancing a tether through a channel may include a bellows assembly having a leading end fixedly attached to the tether at a first position and a trailing end fixedly attached to the tether at a second position so that the leading and trailing ends of the bellows assembly are located a substantially fixed distance apart. The bellows assembly includes a plurality of independently inflatable elements each of which may be separately inflated to an extended position and deflated to a retracted position. Each of the independently inflatable elements expands radially and axially upon inflation. An inflation system connected to the independently inflatable elements inflates and deflates selected ones of the independently inflatable elements to cause the bellows assembly to apply a tractive force to the tether and advance it in the channel. 9 figs.

  5. Controlling template erosion with advanced cleaning methods

    NASA Astrophysics Data System (ADS)

    Singh, SherJang; Yu, Zhaoning; Wähler, Tobias; Kurataka, Nobuo; Gauzner, Gene; Wang, Hongying; Yang, Henry; Hsu, Yautzong; Lee, Kim; Kuo, David; Dress, Peter

    2012-03-01

    We studied the erosion and feature stability of fused silica patterns under different template cleaning conditions. The conventional SPM cleaning is compared with an advanced non-acid process. Spectroscopic ellipsometry optical critical dimension (SE-OCD) measurements were used to characterize the changes in pattern profile with good sensitivity. This study confirmed the erosion of the silica patterns in the traditional acid-based SPM cleaning mixture (H2SO4+H2O2) at a rate of ~0.1 nm per cleaning cycle. The advanced non-acid clean process, however, showed a CD shift of only ~0.01 nm per clean. Contamination removal and pattern integrity of sensitive 20 nm features under MegaSonic-assisted cleaning are also demonstrated.

  6. Advanced electromagnetic methods for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; Sun, Weimin; El-Sharawy, El-Budawy; Aberle, James T.; Birtcher, Craig R.; Peng, Jian; Tirkas, Panayiotis A.

    1992-01-01

    The Advanced Helicopter Electromagnetics (AHE) Industrial Associates Program continues its research on a variety of topics identified and recommended by the program's Advisory Task Force. The research activities center on issues that advance technology related to helicopter electromagnetics. While most of the topics are a continuation of previous work, special effort has been focused on some areas in response to recommendations from the last annual conference. The main topics addressed in this report are composite materials and antenna technology. The area of composite materials continues to receive special attention in this period. The research has focused on: (1) measurements of the electrical properties of low-conductivity materials; (2) modeling of material discontinuities and their effects on scattering patterns; (3) preliminary analysis of the interaction of electromagnetic fields with multi-layered graphite-fiberglass plates; and (4) finite-difference time-domain (FDTD) modeling of field penetration through composite panels of a helicopter.

  7. NATO PILOT STUDY ON ADVANCED CANCER RISK ASSESSMENT METHODS

    EPA Science Inventory

    NCEA scientists are participating in a study of advanced cancer risk assessment methods, conducted under the auspices of NATO's Committee on the Challenges of Modern Society. The product will be a book of case studies that illustrate advanced cancer risk assessment methods, avail...

  8. Advanced methods of structural and trajectory analysis for transport aircraft

    NASA Technical Reports Server (NTRS)

    Ardema, Mark D.

    1995-01-01

    This report summarizes the efforts in two areas: (1) development of advanced methods of structural weight estimation, and (2) development of advanced methods of trajectory optimization. The majority of the effort was spent in the structural weight area. A draft of 'Analytical Fuselage and Wing Weight Estimation of Transport Aircraft', resulting from this research, is included as an appendix.

  9. Zebrafish Caudal Fin Angiogenesis Assay—Advanced Quantitative Assessment Including 3-Way Correlative Microscopy

    PubMed Central

    Correa Shokiche, Carlos; Schaad, Laura; Triet, Ramona; Jazwinska, Anna; Tschanz, Stefan A.; Djonov, Valentin

    2016-01-01

    Background Researchers evaluating angiomodulating compounds as a part of scientific projects or pre-clinical studies are often confronted with limitations of applied animal models. Rough and insufficient early-stage compound assessment without reliable quantification of the vascular response accounts, at least partially, for the low rate of transition to the clinic. Objective To establish an advanced, rapid and cost-effective angiogenesis assay for the precise and sensitive assessment of angiomodulating compounds using zebrafish caudal fin regeneration. It should provide information regarding the angiogenic mechanisms involved and should include qualitative and quantitative data of drug effects in a non-biased and time-efficient way. Approach & Results Basic vascular parameters (total regenerated area, vascular projection area, contour length, vessel area density) were extracted from in vivo fluorescence microscopy images using a stereological approach. Skeletonization of the vasculature by our custom-made software Skelios provided additional parameters including “graph energy” and “distance to farthest node”. The latter gave important insights into the complexity, connectivity and maturation status of the regenerating vascular network. The employment of a reference point (vascular parameters prior to amputation) is unique to the model and crucial for a proper assessment. Additionally, the assay provides exceptional possibilities for correlative microscopy by combining in vivo imaging and morphological investigation of the area of interest. The 3-way correlative microscopy links the dynamic changes in vivo with their structural substrate at the subcellular level. Conclusions The improved zebrafish fin regeneration model with advanced quantitative analysis and optional 3-way correlative morphology is a promising in vivo angiogenesis assay, well suited for basic research and preclinical investigations. PMID:26950851
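    The "distance to farthest node" parameter mentioned above is, in graph terms, the eccentricity of a node in the skeletonized vessel network. The sketch below computes it with breadth-first search over a toy adjacency list; the hop-count metric, the toy graph, and the choice of root are assumptions for illustration (Skelios may weight edges by physical vessel length, which the abstract does not specify).

    ```python
    from collections import deque

    def distance_to_farthest_node(adj, start):
        """BFS from `start` over an adjacency-list graph; returns the hop
        distance to the farthest reachable node (node eccentricity)."""
        dist = {start: 0}
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return max(dist.values())

    # Toy skeleton: a small Y-shaped vessel tree rooted at node 0
    vessels = {0: [1], 1: [0, 2, 3], 2: [1], 3: [1, 4], 4: [3]}
    depth = distance_to_farthest_node(vessels, 0)
    ```

    A larger eccentricity from the amputation plane indicates a longer, more extended regenerating network, which is presumably why the parameter tracks complexity and maturation status.
    
    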

  10. A Method for Quantitatively Evaluating a University Library Collection

    ERIC Educational Resources Information Center

    Golden, Barbara

    1974-01-01

    The acquisitions department of the University of Nebraska at Omaha library conducted a quantitative evaluation of the library's book collection in relation to the course offerings of the university. (Author/LS)

  11. A Method to Prioritize Quantitative Traits and Individuals for Sequencing in Family-Based Studies

    PubMed Central

    Shah, Kaanan P.; Douglas, Julie A.

    2013-01-01

    Owing to recent advances in DNA sequencing, it is now technically feasible to evaluate the contribution of rare variation to complex traits and diseases. However, it is still cost prohibitive to sequence the whole genome (or exome) of all individuals in each study. For quantitative traits, one strategy to reduce cost is to sequence individuals in the tails of the trait distribution. However, the next challenge becomes how to prioritize traits and individuals for sequencing since individuals are often characterized for dozens of medically relevant traits. In this article, we describe a new method, the Rare Variant Kinship Test (RVKT), which leverages relationship information in family-based studies to identify quantitative traits that are likely influenced by rare variants. Conditional on nuclear families and extended pedigrees, we evaluate the power of the RVKT via simulation. Not unexpectedly, the power of our method depends strongly on effect size, and to a lesser extent, on the frequency of the rare variant and the number and type of relationships in the sample. As an illustration, we also apply our method to data from two genetic studies in the Old Order Amish, a founder population with extensive genealogical records. Remarkably, we implicate the presence of a rare variant that lowers fasting triglyceride levels in the Heredity and Phenotype Intervention (HAPI) Heart study (p = 0.044), consistent with the presence of a previously identified null mutation in the APOC3 gene that lowers fasting triglyceride levels in HAPI Heart study participants. PMID:23626830

  12. Advanced electromagnetic methods for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; El-Sharawy, El-Budawy; Hashemi-Yeganeh, Shahrokh; Aberle, James T.; Birtcher, Craig R.

    1991-01-01

    The Advanced Helicopter Electromagnetics program is centered on issues that advance technology related to helicopter electromagnetics. Progress was made on three major topics: composite materials; precipitation-static corona discharge; and antenna technology. In composite materials, the research has focused on the measurement of their electrical properties, and the modeling of material discontinuities and their effect on the radiation pattern of antennas mounted on or near material surfaces. The electrical properties were used to model antenna performance when mounted on composite materials. Since helicopter platforms include several antenna systems at VHF and UHF bands, measuring techniques are being explored that can be used to measure the properties at these bands. The effort on corona discharge and precipitation static was directed toward the development of a new two-dimensional Voltage Finite Difference Time Domain computer program. Results indicate the feasibility of using potentials for simulating electromagnetic problems in cases where potentials become primary sources. In antenna technology the focus was on Polarization Diverse Conformal Microstrip Antennas, Cavity Backed Slot Antennas, and Varactor Tuned Circular Patch Antennas. Numerical codes were developed for the analysis of two probe-fed rectangular and circular microstrip patch antennas fed by resistive and reactive power divider networks.

  13. Thermography as a quantitative imaging method for assessing postoperative inflammation

    PubMed Central

    Christensen, J; Matzen, LH; Vaeth, M; Schou, S; Wenzel, A

    2012-01-01

    Objective To assess differences in skin temperature between the operated and control side of the face after mandibular third molar surgery using thermography. Methods 127 patients had 1 mandibular third molar removed. Before the surgery, standardized thermograms were taken of both sides of the patient's face using a Flir ThermaCam™ E320 (Precisions Teknik AB, Halmstad, Sweden). The imaging procedure was repeated 2 days and 7 days after surgery. A region of interest including the third molar region was marked on each image. The mean temperature within each region of interest was calculated. The differences between sides and over time were assessed using paired t-tests. Results No significant difference was found between the operated side and the control side either before or 7 days after surgery (p > 0.3). The temperature of the operated side (mean: 32.39 °C, range: 28.9–35.3 °C) was higher than that of the control side (mean: 32.06 °C, range: 28.5–35.0 °C) 2 days after surgery [0.33 °C, 95% confidence interval (CI): 0.22–0.44 °C, p < 0.001]. No significant difference was found between the pre-operative and the 7-day post-operative temperature (p > 0.1). After 2 days, the operated side was not significantly different from the temperature pre-operatively (p = 0.12), whereas the control side had a lower temperature (0.57 °C, 95% CI: 0.29–0.86 °C, p < 0.001). Conclusions Thermography seems useful for quantitative assessment of inflammation between the intervention side and the control side after surgical removal of mandibular third molars. However, thermography cannot be used to assess absolute temperature changes due to normal variations in skin temperature over time. PMID:22752326
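    The paired t-test used above compares matched ROI mean temperatures (operated vs. control side, per patient). A minimal stdlib sketch of the statistic follows; the three-patient temperature values are hypothetical, not data from the study.

    ```python
    from math import sqrt
    from statistics import mean, stdev

    def paired_t(side_a, side_b):
        """Paired t statistic and degrees of freedom for matched samples,
        e.g. per-patient ROI mean temperatures on two sides of the face."""
        diffs = [a - b for a, b in zip(side_a, side_b)]
        n = len(diffs)
        t = mean(diffs) / (stdev(diffs) / sqrt(n))
        return t, n - 1

    # Hypothetical day-2 ROI means (deg C) for three patients
    operated = [32.5, 32.4, 32.6]
    control = [32.0, 32.1, 32.2]
    t_stat, df = paired_t(operated, control)
    ```

    Pairing within each patient removes between-patient baseline variation, which is exactly why the study can detect a 0.33 °C side-to-side difference despite a several-degree range across patients.
    
    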

  14. Benefits of an Advanced Quantitative Precipitation Information System - San Francisco Bay Area Case Study

    NASA Astrophysics Data System (ADS)

    Cifelli, R.; Johnson, L. E.; White, A. B.

    2014-12-01

    Advancements in monitoring and prediction of precipitation and severe storms can provide significant benefits for water resource managers, allowing them to mitigate flood damage risks, capture additional water supplies and offset drought impacts, and enhance ecosystem services. A case study for the San Francisco Bay area provides the context for quantification of the benefits of an Advanced Quantitative Precipitation Information (AQPI) system. The AQPI builds off more than a decade of NOAA research and applications of advanced precipitation sensors, data assimilation, numerical models of storms and storm runoff, and systems integration for real-time operations. An AQPI would dovetail with current National Weather Service forecast operations to provide higher resolution monitoring of rainfall events and longer lead time forecasts. A regional resource accounting approach has been developed to quantify the incremental benefits assignable to the AQPI system; these benefits total $35M/yr in the nine-county Bay region. Depending on the jurisdiction, large flood-damage-avoidance benefits may accrue in locations with dense development in flood plains. In other locations, forecast-based reservoir operations can increase reservoir storage for water supplies. Ecosystem-services benefits for fisheries may be obtained from increased reservoir storage and downstream releases. Benefits in the transportation sector are associated with increased safety and avoided delays. Compared to AQPI system implementation and O&M costs over a 10-year operations period, a benefit-cost (B/C) ratio is computed which ranges from 2.8 to 4. It is important to acknowledge that many of the benefits are dependent on appropriate and adequate response by the hazards and water resources management agencies and citizens.
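    A benefit-cost ratio of the kind computed above is the present value of benefits divided by the present value of costs over the operations period. The sketch below shows the standard discounted-cash-flow form; the $35M/yr benefit figure comes from the abstract, but the capital cost, O&M cost, and 3% discount rate are purely illustrative assumptions, not figures from the study.

    ```python
    def present_value(annual, rate, years):
        """Present value of a constant end-of-year cash flow."""
        return sum(annual / (1.0 + rate) ** t for t in range(1, years + 1))

    def benefit_cost_ratio(annual_benefit, capital_cost, annual_om, rate, years):
        """Discounted benefits over up-front capital plus discounted O&M."""
        return present_value(annual_benefit, rate, years) / (
            capital_cost + present_value(annual_om, rate, years))

    # $35M/yr benefits over a 10-year period; cost figures and the 3%
    # discount rate below are hypothetical, chosen only for illustration.
    ratio = benefit_cost_ratio(35e6, 50e6, 5e6, 0.03, 10)
    ```

    With these assumed costs the ratio lands near 3.2, inside the 2.8 to 4 range the study reports; real analyses would vary the discount rate and cost estimates in a sensitivity analysis.
    
    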

  15. A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime

    PubMed Central

    Fitterer, Jessica L.; Nelson, Trisalyn A.

    2015-01-01

    Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection, and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the predominant modelling choice (n = 78), though many variations existed depending on the data. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks). PMID:26418016

  16. Advances in multiplexed MRM-based protein biomarker quantitation toward clinical utility.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Hardie, Darryl B; Borchers, Christoph H

    2014-05-01

    Accurate and rapid protein quantitation is essential for screening biomarkers for disease stratification and monitoring, and to validate the hundreds of putative markers in human biofluids, including blood plasma. An analytical method that utilizes stable isotope-labeled standard (SIS) peptides and selected/multiple reaction monitoring-mass spectrometry (SRM/MRM-MS) has emerged as a promising technique for determining protein concentrations. This targeted approach has analytical merit, but its true potential (in terms of sensitivity and multiplexing) has yet to be realized. Described herein is a method that extends the multiplexing ability of the MRM method to enable the quantitation of 142 high-to-moderate abundance proteins (from 31 mg/mL to 44 ng/mL) in undepleted and non-enriched human plasma in a single run. The proteins have been reported to be associated with a wide variety of non-communicable diseases (NCDs), from cardiovascular disease (CVD) to diabetes. The concentrations of these proteins in human plasma are inferred from interference-free peptides functioning as molecular surrogates (2 peptides per protein, on average). A revised data analysis strategy, involving the linear regression equation of normal control plasma, has been instituted to enable facile application to patient samples, as demonstrated in separate nutrigenomics and CVD studies. The exceptional robustness of the LC/MS platform and the quantitative method, as well as its high throughput, makes the assay suitable for application to patient samples for the verification of a condensed or complete protein panel. This article is part of a Special Issue entitled: Biomarkers: A Proteomic Challenge. PMID:23806606
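    The core of SIS-peptide quantitation is isotope dilution: the endogenous ("light") peptide concentration is inferred from its peak-area ratio to the spiked heavy-labeled standard. The sketch below shows the basic single-point form; the paper itself uses a regression against normal control plasma, and the peak areas and spike level here are hypothetical.

    ```python
    def isotope_dilution_conc(area_light, area_heavy, sis_conc):
        """Endogenous peptide concentration from the light/heavy MRM
        peak-area ratio, assuming equal MS response for the labeled and
        unlabeled forms (they co-elute and fragment identically)."""
        return (area_light / area_heavy) * sis_conc

    # Hypothetical MRM peak areas with a 50 fmol/uL SIS spike
    conc = isotope_dilution_conc(2.0e5, 1.0e5, 50.0)  # -> 100.0 fmol/uL
    ```

    Because the heavy standard experiences the same matrix suppression and digestion losses as the endogenous peptide from the point it is added, the ratio cancels much of the run-to-run variability, which underpins the robustness claimed above.
    
    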

  17. Advanced particulate matter control apparatus and methods

    DOEpatents

    Miller, Stanley J.; Zhuang, Ye; Almlie, Jay C.

    2012-01-10

    Apparatus and methods for collection and removal of particulate matter, including fine particulate matter, from a gas stream, comprising a unique combination of high collection efficiency and ultralow pressure drop across the filter. The apparatus and method utilize simultaneous electrostatic precipitation and membrane filtration of a particular pore size, wherein electrostatic collection and filtration occur on the same surface.

  18. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  19. Quantitative NDA measurements of advanced reprocessing product materials containing uranium, neptunium, plutonium, and americium

    NASA Astrophysics Data System (ADS)

    Goddard, Braden

    The ability of inspection agencies and facility operators to measure powders containing several actinides is increasingly necessary as new reprocessing techniques and fuel forms are being developed. These powders are difficult to measure with nondestructive assay (NDA) techniques because neutrons emitted from induced and spontaneous fission of different nuclides are very similar. A neutron multiplicity technique based on first-principle methods was developed to measure these powders by exploiting isotope-specific nuclear properties, such as the energy-dependent fission cross sections and the neutron-induced fission neutron multiplicity. This technique was tested through extensive simulations using the Monte Carlo N-Particle eXtended (MCNPX) code, one measurement campaign using the Active Well Coincidence Counter (AWCC), and two measurement campaigns using the Epithermal Neutron Multiplicity Counter (ENMC) with various (alpha,n) sources and actinide materials. Four potential applications of this first-principle technique have been identified: (1) quantitative measurement of uranium, neptunium, plutonium, and americium materials; (2) quantitative measurement of mixed oxide (MOX) materials; (3) quantitative measurement of uranium materials; and (4) weapons verification in arms control agreements. This technique still has several challenges which need to be overcome, the largest being obtaining high-precision active and passive measurements that produce results with acceptably small uncertainties.

  20. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature-identification-related diagnosis. Therefore, a simple, quantitative distortion evaluation method is needed by both the endoscopic industry and medical device regulatory agencies; however, no such method is yet available. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data with complex mathematical models, and are consequently difficult to understand. Commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion and, based on it, ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has clear physical meaning across the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, it can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.
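
    As a minimal illustration of the kind of metric the ML method generalizes, radial distortion can be computed as the percent deviation of a measured radial distance from its distortion-free reference value. The numbers below are illustrative, not taken from the paper.

```python
def d_rad(r_measured, r_reference):
    """Radial distortion (percent): deviation of a grid point's measured
    radial distance from its distortion-free reference distance."""
    return 100.0 * (r_measured - r_reference) / r_reference

# A point imaged at 9.0 mm that should sit at 10.0 mm from the optical
# axis: a negative value indicates barrel distortion.
print(d_rad(9.0, 10.0))  # -10.0
```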

  1. Recent Advancements in Quantitative Full-Wavefield Electromagnetic Induction and Ground Penetrating Radar Inversion for Shallow Subsurface Characterization

    NASA Astrophysics Data System (ADS)

    Van Der Kruk, J.; Yang, X.; Klotzsche, A.; von Hebel, C.; Busch, S.; Mester, A.; Huisman, J. A.; Vereecken, H.

    2014-12-01

    Ray-based or approximate forward modeling techniques have often been used to reduce the computational demands for inversion purposes. Due to increasing computational power and possible parallelization of inversion algorithms, accurate forward modeling can now be included in advanced inversion approaches such that the full-wavefield content can be exploited. Here, recent developments in large-scale quantitative electromagnetic induction (EMI) inversion and full-waveform ground penetrating radar (GPR) inversion are discussed that yield higher resolution of quantitative medium properties than conventional approaches, owing to the use of accurate modeling tools based on Maxwell's equations. For a limited number of parameters, a combined global and local search using the simplex search algorithm or the shuffled complex evolution (SCE) can be used for inversion. Examples will be shown where calibrated large-scale multi-configuration EMI data measured with new-generation multi-offset EMI systems are inverted for a layered electrical-conductivity earth, and quantitative permittivity and conductivity values of a layered subsurface are obtained using on-ground GPR full-waveform inversion that includes the estimation of the unknown source wavelet. For a large number of unknowns, gradient-based optimization methods are commonly used; these need a good starting model to prevent the inversion from being trapped in a local minimum. Examples will be shown where the non-linearity invoked by the presence of high-contrast media is tamed by using a novel combined frequency-time-domain full-waveform inversion, and a low-velocity waveguide layer is imaged by using crosshole GPR full-waveform inversion after adapting the starting model using waveguide identification in the measured data. Synthetic data calculated using the inverted permittivity and conductivity models show amplitudes and phases similar to those observed in the measured data, which indicates the reliability of the inversion results.
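
    The global-then-local search strategy mentioned above can be sketched on a toy one-parameter problem: recover a layer property by minimizing data misfit with a coarse global scan followed by local refinement. The exponential "forward model" below is purely illustrative, standing in for a Maxwell's-equations solver, and the grid search stands in for the simplex/SCE algorithms.

```python
import numpy as np

def forward(sigma, offsets):
    # Hypothetical decay-type response; NOT a Maxwell's-equations solver
    return np.exp(-sigma * offsets)

offsets = np.linspace(0.5, 4.0, 8)
observed = forward(0.8, offsets)          # synthetic "measured" data

def misfit(sigma):
    return np.sum((forward(sigma, offsets) - observed) ** 2)

# Global scan keeps the local step from starting in a poor region
coarse = np.linspace(0.1, 3.0, 30)
best = coarse[np.argmin([misfit(s) for s in coarse])]

# Local refinement around the best coarse estimate
fine = np.linspace(best - 0.1, best + 0.1, 201)
best = fine[np.argmin([misfit(s) for s in fine])]
print(round(best, 2))  # recovers the true value, 0.8
```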

  2. Advanced spectral methods for climatic time series

    USGS Publications Warehouse

    Ghil, M.; Allen, M.R.; Dettinger, M.D.; Ide, K.; Kondrashov, D.; Mann, M.E.; Robertson, A.W.; Saunders, A.; Tian, Y.; Varadi, F.; Yiou, P.

    2002-01-01

    The analysis of univariate or multivariate time series provides crucial information to describe, understand, and predict climatic variability. The discovery and implementation of a number of novel methods for extracting useful information from time series has recently revitalized this classical field of study. Considerable progress has also been made in interpreting the information so obtained in terms of dynamical systems theory. In this review we describe the connections between time series analysis and nonlinear dynamics, discuss signal- to-noise enhancement, and present some of the novel methods for spectral analysis. The various steps, as well as the advantages and disadvantages of these methods, are illustrated by their application to an important climatic time series, the Southern Oscillation Index. This index captures major features of interannual climate variability and is used extensively in its prediction. Regional and global sea surface temperature data sets are used to illustrate multivariate spectral methods. Open questions and further prospects conclude the review.
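
    Many of the spectral methods surveyed build on the classical periodogram. As a minimal illustration (not the authors' SSA or multitaper machinery), the dominant period of a noisy oscillatory series can be read off the peak of its Fourier power spectrum; the synthetic series below is hypothetical.

```python
import numpy as np

def periodogram(x, dt=1.0):
    """Estimate the power spectrum of a time series from the squared
    magnitude of its discrete Fourier transform."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                      # remove the mean (zero-frequency bias)
    n = len(x)
    spec = np.abs(np.fft.rfft(x)) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=dt)
    return freqs, spec

# Synthetic "index": a dominant 4-unit period plus noise
rng = np.random.default_rng(0)
t = np.arange(256)
x = np.sin(2 * np.pi * t / 4.0) + 0.3 * rng.standard_normal(t.size)

freqs, spec = periodogram(x)
peak_freq = freqs[np.argmax(spec)]
print(round(1.0 / peak_freq, 1))  # dominant period: 4.0
```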

  3. Advanced verification methods for OVI security ink

    NASA Astrophysics Data System (ADS)

    Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom

    2006-02-01

    OVI security ink +, incorporating OVP security pigment* microflakes, enjoys a history of effective document protection. This security feature provides not only first-line recognition by the person on the street, but also facilitates machine readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high-speed modules were fabricated and tested in a state-of-the-art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified, providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time it takes the cash drawer to be opened.

  4. Indentation Methods in Advanced Materials Research Introduction

    SciTech Connect

    Pharr, George Mathews; Cheng, Yang-Tse; Hutchings, Ian; Sakai, Mototsugu; Moody, Neville; Sundararajan, G.; Swain, Michael V.

    2009-01-01

    Since its commercialization early in the 20th century, indentation testing has played a key role in the development of new materials and the understanding of their mechanical behavior. Progress in the field has relied on a close marriage between research in the mechanical behavior of materials and contact mechanics. The seminal work of Hertz laid the foundations for bringing these two together, with his contributions still widely utilized today in examining elastic behavior and the physics of fracture. Later, the pioneering work of Tabor, as published in his classic text 'The Hardness of Metals', expanded this understanding to address the complexities of plasticity. Enormous progress in the field has been achieved in the last decade, made possible both by advances in instrumentation, for example, load- and depth-sensing indentation and scanning electron microscopy (SEM) and transmission electron microscopy (TEM) based in situ testing, and by improved modeling capabilities that use computationally intensive techniques such as finite element analysis and molecular dynamics simulation. The purpose of this special focus issue is to present recent state-of-the-art developments in the field.
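
    At its core, the hardness measured by indentation testing is load divided by contact area. For example, the standard Vickers number is computed from the indenter load and the measured diagonal of the residual indent (the input values below are illustrative):

```python
def vickers_hardness(load_kgf, diagonal_mm):
    """Vickers hardness HV = 1.8544 * F / d^2, i.e. load over the
    sloped contact area of the pyramidal indent (F in kgf, d in mm)."""
    return 1.8544 * load_kgf / diagonal_mm ** 2

# A 10 kgf load leaving a 0.25 mm mean diagonal
print(round(vickers_hardness(10.0, 0.25), 1))  # 296.7
```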

  5. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems…

  6. Analyzing the Students' Academic Integrity using Quantitative Methods

    ERIC Educational Resources Information Center

    Teodorescu, Daniel; Andrei, Tudorel; Tusa, Erika; Herteliu, Claudiu; Stancu, Stelian

    2007-01-01

    The transition period in Romania has generated a series of important changes, including the reform of Romanian tertiary education. This process has accelerated after the signing of the Bologna treaty. Important changes were recorded in many of the quantitative aspects (such as the number of students enrolled, the pupil-student ratio, etc.) as…

  7. Quantitative methods for studying hemostasis in zebrafish larvae.

    PubMed

    Rost, M S; Grzegorski, S J; Shavit, J A

    2016-01-01

    Hemostasis is a coordinated system through which blood is prevented from exiting a closed circulatory system. We have taken advantage of the zebrafish, an emerging model for the study of blood coagulation, and describe three techniques for quantitative analysis of primary and secondary hemostasis. Collectively, these three techniques comprise a toolset to aid in our understanding of hemostasis and pathological clotting. PMID:27312499

  8. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  9. Advances in methods for deepwater TLP installations

    SciTech Connect

    Wybro, P.G.

    1995-10-01

    This paper describes a method suitable for installing deepwater TLP structures in water depths beyond 3,000 ft. An overview of previous TLP installations is presented, wherein the various methods are evaluated for their suitability to deepwater applications. A novel method for installation of deepwater TLPs is then described. This method is most suitable for deepwater and/or large TLP structures, but can also be used in moderate water depths. The tendon installation method utilizes the so-called Platform Arrestor Concept (PAC), wherein tendon sections are transported to the site by barges and assembled vertically using a dynamically positioned crane vessel. The tendons are transferred to the platform, where they are hung off until there is a full complement of tendons. The hull lock-off operation is performed on all tendons simultaneously, avoiding dangerous platform resonant behavior. The installation calls for relatively simple installation equipment and also enables the use of simple tendon tie-off equipment, such as a single-piece nut.

  10. Advanced reliability method for fatigue analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Wirsching, P. H.

    1984-01-01

    When design factors are considered as random variables and the failure condition cannot be expressed by a closed form algebraic inequality, computations of risk (or probability of failure) may become extremely difficult or very inefficient. This study suggests using a simple and easily constructed second degree polynomial to approximate the complicated limit state in the neighborhood of the design point; a computer analysis relates the design variables at selected points. Then a fast probability integration technique (i.e., the Rackwitz-Fiessler algorithm) can be used to estimate risk. The capability of the proposed method is demonstrated in an example of a low cycle fatigue problem for which a computer analysis is required to perform local strain analysis to relate the design variables. A comparison of the performance of this method is made with a far more costly Monte Carlo solution. Agreement of the proposed method with Monte Carlo is considered to be good.
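
    The idea above can be sketched numerically: fit a second-degree polynomial surrogate to a few evaluations of an "expensive" limit state, then integrate the failure probability cheaply on the surrogate. The limit-state function below is hypothetical, and plain Monte Carlo stands in for the Rackwitz-Fiessler fast probability integration.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Expensive" limit state: g(x) <= 0 means failure. This closed form is a
# stand-in for a code requiring a full local strain analysis per call.
def g_true(x1, x2):
    return 3.0 - x1 - 0.5 * x2 ** 2

# Step 1: evaluate g at selected design points and fit a second-degree
# polynomial surrogate in the basis {1, x1, x2, x2^2}.
pts = np.array([(a, b) for a in (-1, 0, 1, 2) for b in (-2, -1, 0, 1, 2)])
A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1], pts[:, 1] ** 2])
coef, *_ = np.linalg.lstsq(A, g_true(pts[:, 0], pts[:, 1]), rcond=None)

def g_fit(x1, x2):
    return coef[0] + coef[1] * x1 + coef[2] * x2 + coef[3] * x2 ** 2

# Step 2: cheap probability integration on the surrogate (Monte Carlo
# here in place of the Rackwitz-Fiessler algorithm).
x1, x2 = rng.standard_normal((2, 200_000))
pf_surrogate = np.mean(g_fit(x1, x2) <= 0.0)
pf_direct = np.mean(g_true(x1, x2) <= 0.0)
print(f"{pf_surrogate:.4f} vs {pf_direct:.4f}")  # nearly identical
```

    Because the hypothetical limit state happens to lie in the fitted basis, the surrogate reproduces it exactly; in practice the fit is only local to the design point, which is why the abstract stresses fitting "in the neighborhood of the design point".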

  11. Transonic wing analysis using advanced computational methods

    NASA Technical Reports Server (NTRS)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  12. Temporality Matters: Advancing a Method for Analyzing Problem-Solving Processes in a Computer-Supported Collaborative Environment

    ERIC Educational Resources Information Center

    Kapur, Manu

    2011-01-01

    This paper argues for a need to develop methods for examining temporal patterns in computer-supported collaborative learning (CSCL) groups. It advances one such quantitative method--Lag-sequential Analysis (LsA)--and instantiates it in a study of problem-solving interactions of collaborative groups in an online, synchronous environment. LsA…

  13. Advanced method for making vitreous waste forms

    SciTech Connect

    Pope, J.M.; Harrison, D.E.

    1980-01-01

    A process is described for making waste glass that circumvents the problems of dissolving nuclear waste in molten glass at high temperatures. Because the reactive mixing process is independent of the inherent viscosity of the melt, any glass composition can be prepared with equal facility. Separation of the mixing and melting operations permits novel glass fabrication methods to be employed.

  14. Advancing-layers method for generation of unstructured viscous grids

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar

    1993-01-01

    A novel approach for generating highly stretched grids which is based on a modified advancing-front technique and benefits from the generality, flexibility, and grid quality of the conventional advancing-front-based Euler grid generators is presented. The method is self-sufficient for the insertion of grid points in the boundary layer and beyond. Since it is based on a totally unstructured grid strategy, the method alleviates the difficulties stemming from the structural limitations of the prismatic techniques.

  15. Combining qualitative and quantitative methods in assessing hospital learning environments.

    PubMed

    Chan, D S

    2001-08-01

    Clinical education is a vital component in the curricula of pre-registration nursing courses and provides student nurses with the opportunity to combine cognitive, psychomotor, and affective skills. Clinical practice enables the student to develop competencies in the application of knowledge, skills, and attitudes to clinical field situations. It is, therefore, vital that the valuable clinical time be utilised effectively and productively. Nursing students' perceptions of the hospital learning environment were assessed by combining quantitative and qualitative approaches. The Clinical Learning Environment Inventory, based on the theoretical framework of learning environment studies, was developed and validated. The quantitative and qualitative findings reinforced each other. It was found that there were significant differences in students' perceptions of the actual clinical learning environment and their preferred learning environment. Generally, students preferred a more positive and favourable clinical environment than they perceived as being actually present. PMID:11470103

  16. Advanced Electromagnetic Methods for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; Polycarpou, Anastasis; Birtcher, Craig R.; Georgakopoulos, Stavros; Han, Dong-Ho; Ballas, Gerasimos

    1999-01-01

    The destructive threat of lightning to helicopters and other airborne systems has always been a topic of great interest to this research grant. Previously, the lightning-induced currents on the surface of the fuselage and in its interior were predicted using the finite-difference time-domain (FDTD) method as well as the NEC code. The limitations of both methods, as applied to lightning, were identified and extensively discussed at the last meeting. After a thorough investigation of the capabilities of the FDTD method, it was decided to incorporate into it a subcell model to accurately represent current diffusion through conducting materials of high conductivity and finite thickness. Because of the complexity of the model, its validity will first be tested on a one-dimensional FDTD problem. Although results are not yet available, the theory and formulation of the subcell model are presented and discussed here to a certain degree. Besides lightning-induced currents in the interior of an aircraft, penetration of electromagnetic fields through apertures (e.g., windows and cracks) could also be devastating for navigation equipment, electronics, and communications systems in general. The main focus of this study is understanding and quantifying field penetration through apertures. The simulation is done using the FDTD method, and the predictions are compared with measurements and moment-method solutions obtained from the NASA Langley Research Center. Cavity-backed slot (CBS) antennas, or slot antennas in general, have many applications in aircraft-satellite communications. These can be flush-mounted on the surface of the fuselage and therefore retain the aerodynamic shape of the aircraft. In the past, input impedance and radiation patterns of CBS antennas were computed using a hybrid FEM/MoM code. The analysis is now extended to coupling between two identical slot antennas mounted on the same structure. The predictions are performed

  17. A Comparative Study on Tobacco Cessation Methods: A Quantitative Systematic Review

    PubMed Central

    Heydari, Gholamreza; Masjedi, Mohammadreza; Ahmady, Arezoo Ebn; Leischow, Scott J.; Lando, Harry A.; Shadmehr, Mohammad Behgam; Fadaizadeh, Lida

    2014-01-01

    Background: During recent years, there have been many advances in different types of pharmacological and non-pharmacological tobacco control treatments. In this study, we aimed to identify the most effective smoking cessation methods used in quitting, based upon a review of the literature. Methods: We searched PubMed, limited to English publications from 2000 to 2012. Two trained reviewers independently assessed titles, abstracts, and full texts of articles after a pilot inter-rater reliability assessment conducted by the author (GH). The total number of papers and their conclusions, either recommending the method (positive) or not supporting it (negative), was computed for each method. The number of negative papers was subtracted from the number of positive ones for each method. Inconsistencies between the two reviewers were adjudicated by the author. Results: Of the 932 articles that were critically assessed, 780 studies supported quit-smoking methods. In 90 studies the methods were neither supported nor rejected, and in 62 cases the methods were not supported. Nicotine replacement therapy (NRT), Champix, and Zyban, with 352, 117, and 71 studies respectively, were the most supported methods; e-cigarettes and non-nicotine medications, with one case each, were the least supported. Finally, NRT with a score of 39, and Champix and education with scores of 36, were the most supported methods. Conclusions: Results of this review indicate that the scientific papers in the most recent decade recommend the use of NRT and Champix in combination with educational interventions. Additional research is needed to compare qualitative and quantitative studies for smoking cessation. PMID:25013685
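
    The review's net-support scoring rule (supporting papers minus non-supporting papers per method) reduces to a small tally. The method names and counts below are entirely hypothetical, chosen only to show the computation:

```python
# Hypothetical literature tally per cessation method
counts = {
    "method_A": {"supporting": 12, "not_supporting": 3},
    "method_B": {"supporting": 9,  "not_supporting": 1},
    "method_C": {"supporting": 5,  "not_supporting": 6},
}

# Net support score: positives minus negatives, as in the review
scores = {m: c["supporting"] - c["not_supporting"] for m, c in counts.items()}
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked[0], scores[ranked[0]])  # method_A 9
```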

  18. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1988-01-01

    Parametric cost estimating methods for space systems in the conceptual design phase are developed. The approach is to identify variables that drive cost such as weight, quantity, development culture, design inheritance, and time. The relationship between weight and cost is examined in detail. A theoretical model of cost is developed and tested statistically against a historical data base of major research and development programs. It is concluded that the technique presented is sound, but that it must be refined in order to produce acceptable cost estimates.
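
    A weight-driven cost model of the kind described is often expressed as a power-law cost estimating relationship, cost = a * weight^b, fit by least squares in log space. The sketch below uses purely illustrative data points, not the historical database from the study:

```python
import numpy as np

# Illustrative (hypothetical) program data: dry weight vs. development cost
weight = np.array([100.0, 250.0, 500.0, 1200.0, 3000.0])   # kg
cost   = np.array([12.0,  24.0,  41.0,  80.0,   160.0])    # $M

# Fit log(cost) = log(a) + b*log(weight) by linear least squares
b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)
a = np.exp(log_a)

def estimate(w):
    """Parametric cost estimate for a concept of weight w (kg)."""
    return a * w ** b

print(round(b, 2), round(estimate(800.0), 1))  # exponent and a $M estimate
```

    The exponent b < 1 reflects the economy of scale typical of such relationships: cost grows more slowly than weight.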

  19. Using advanced intercross lines for high-resolution mapping of HDL cholesterol quantitative trait loci.

    PubMed

    Wang, Xiaosong; Le Roy, Isabelle; Nicodeme, Edwige; Li, Renhua; Wagner, Richard; Petros, Christina; Churchill, Gary A; Harris, Stephen; Darvasi, Ariel; Kirilovsky, Jorge; Roubertoux, Pierre L; Paigen, Beverly

    2003-07-01

    Mapping quantitative trait loci (QTLs) with high resolution facilitates identification and positional cloning of the underlying genes. The novel approach of advanced intercross lines (AILs) generates many more recombination events and thus can potentially narrow QTLs significantly more than do conventional backcrosses and F2 intercrosses. In this study, we carried out QTL analyses in (C57BL/6J x NZB/BlNJ) x C57BL/6J backcross progeny fed either chow or an atherogenic diet to detect QTLs that regulate high-density lipoprotein cholesterol (HDL) concentrations, and in (C57BL/6J x NZB/BlNJ) F11 AIL progeny to confirm and narrow those QTLs. QTLs for HDL concentrations were found on chromosomes 1, 5, and 16. AIL not only narrowed the QTLs significantly more than did a conventional backcross but also resolved a chromosome 5 QTL identified in the backcross into two QTLs, the peaks of both being outside the backcross QTL region. We tested 27 candidate genes and found significant mRNA expression differences for 12 (Nr1i3, Apoa2, Sap, Tgfb2, Fgfbp1, Prom, Ppargc1, Tcf1, Ncor2, Srb1, App, and Ifnar). Some of these underlay the same QTL, indicating that expression differences are common and not sufficient to identify QTL genes. All the major HDL QTLs in our study had homologous counterparts in humans, implying that their underlying genes regulate HDL in humans. PMID:12805272

  20. Using Advanced Intercross Lines for High-Resolution Mapping of HDL Cholesterol Quantitative Trait Loci

    PubMed Central

    Wang, Xiaosong; Le Roy, Isabelle; Nicodeme, Edwige; Li, Renhua; Wagner, Richard; Petros, Christina; Churchill, Gary A.; Harris, Stephen; Darvasi, Ariel; Kirilovsky, Jorge; Roubertoux, Pierre L.; Paigen, Beverly

    2003-01-01

    Mapping quantitative trait loci (QTLs) with high resolution facilitates identification and positional cloning of the underlying genes. The novel approach of advanced intercross lines (AILs) generates many more recombination events and thus can potentially narrow QTLs significantly more than do conventional backcrosses and F2 intercrosses. In this study, we carried out QTL analyses in (C57BL/6J × NZB/BlNJ) × C57BL/6J backcross progeny fed either chow or an atherogenic diet to detect QTLs that regulate high-density lipoprotein cholesterol (HDL) concentrations, and in (C57BL/6J × NZB/BlNJ) F11 AIL progeny to confirm and narrow those QTLs. QTLs for HDL concentrations were found on chromosomes 1, 5, and 16. AIL not only narrowed the QTLs significantly more than did a conventional backcross but also resolved a chromosome 5 QTL identified in the backcross into two QTLs, the peaks of both being outside the backcross QTL region. We tested 27 candidate genes and found significant mRNA expression differences for 12 (Nr1i3, Apoa2, Sap, Tgfb2, Fgfbp1, Prom, Ppargc1, Tcf1, Ncor2, Srb1, App, and Ifnar). Some of these underlay the same QTL, indicating that expression differences are common and not sufficient to identify QTL genes. All the major HDL QTLs in our study had homologous counterparts in humans, implying that their underlying genes regulate HDL in humans. PMID:12805272

  1. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method

    PubMed Central

    Yang, Ganglong; Xu, Zhipeng; Lu, Wei; Li, Xiang; Sun, Chengwen; Guo, Jia; Xue, Peng; Guan, Feng

    2015-01-01

    The best way to increase patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low grade nonmuscle invasive bladder cancer, NMIBC), and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography LTQ Orbitrap mass spectrometry. Among 3721 unique identified and annotated proteins in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer. PMID:26230496
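
    A screen like the one described ultimately flags proteins whose labeled-to-unlabeled ratios change beyond a threshold. The sketch below uses a simple 2-fold-change cutoff in place of the paper's 95%-confidence statistical test; the protein names and ratios are hypothetical.

```python
import math

# Hypothetical SILAC heavy/light ratios (cancer vs. normal cells)
ratios = {"PROT_A": 3.6, "PROT_B": 1.1, "PROT_C": 0.4, "PROT_D": 0.9}

# Flag proteins changing more than 2-fold in either direction
up   = sorted(p for p, r in ratios.items() if math.log2(r) > 1)
down = sorted(p for p, r in ratios.items() if math.log2(r) < -1)
print(up, down)  # ['PROT_A'] ['PROT_C']
```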

  2. Advances in organometallic synthesis with mechanochemical methods.

    PubMed

    Rightmire, Nicholas R; Hanusa, Timothy P

    2016-02-14

    Solvent-based syntheses have long been normative in all areas of chemistry, although mechanochemical methods (specifically grinding and milling) have been used to good effect for decades in organic, and to a lesser but growing extent, inorganic coordination chemistry. Organometallic synthesis, in contrast, represents a relatively underdeveloped area for mechanochemical research, and the potential benefits are considerable. From access to new classes of unsolvated complexes, to control over stoichiometries that have not been observed in solution routes, mechanochemical (or 'M-chem') approaches have much to offer the synthetic chemist. It has already become clear that removing the solvent from an organometallic reaction can change reaction pathways considerably, so that prediction of the outcome is not always straightforward. This Perspective reviews recent developments in the field, and describes equipment that can be used in organometallic synthesis. Synthetic chemists are encouraged to add mechanochemical methods to their repertoire in the search for new and highly reactive metal complexes and novel types of organometallic transformations. PMID:26763151

  3. Advancements in Research Synthesis Methods: From a Methodologically Inclusive Perspective

    ERIC Educational Resources Information Center

    Suri, Harsh; Clarke, David

    2009-01-01

    The dominant literature on research synthesis methods has positivist and neo-positivist origins. In recent years, the landscape of research synthesis methods has changed rapidly to become inclusive. This article highlights methodologically inclusive advancements in research synthesis methods. Attention is drawn to insights from interpretive,…

  4. Advances in LC: bioanalytical method transfer.

    PubMed

    Wright, Patricia; Wright, Adrian

    2016-09-01

    There are three main reasons for transferring from an existing bioanalytical assay to an alternative chromatographic method: speed, cost and sensitivity. These represent a challenge to the analyst in that there is an interplay between these three considerations, and one factor is often improved at the expense of another. These three factors act as drivers to encourage technology development and support its uptake. The more recently introduced chromatographic technologies may show significant improvements against one or more of these factors relative to conventional 4.6-mm id reversed-phase HPLC. In this article, some of these new chromatographic approaches will be considered in terms of what they can offer the bioanalyst. PMID:27491842

  5. Method for depth-resolved quantitation of optical properties in layered media using spatially modulated quantitative spectroscopy

    PubMed Central

    Saager, Rolf B.; Truong, Alex; Cuccia, David J.; Durkin, Anthony J.

    2011-01-01

    We have demonstrated that spatially modulated quantitative spectroscopy (SMoQS) is capable of extracting absolute optical properties from homogeneous tissue-simulating phantoms that span both the visible and near-infrared wavelength regimes. However, biological tissue, such as skin, is highly structured, presenting challenges to quantitative spectroscopic techniques based on homogeneous models. In order to address the challenges associated with skin more accurately, we present a method for depth-resolved optical property quantitation based on a two-layer model. Layered Monte Carlo simulations and layered tissue-simulating phantoms are used to determine the efficacy and accuracy of SMoQS in quantifying layer-specific optical properties of layered media. Initial results from both simulation and experiment show that this empirical method is capable of determining top layer thickness to within tens of microns across a physiological range for skin. Layer-specific chromophore concentration can be determined to within ±10% of the actual values, on average, whereas bulk quantitation in either the visible or near-infrared spectroscopic regime significantly underestimates the layer-specific chromophore concentration and can be confounded by top layer thickness. PMID:21806282

  6. Advanced Fuzzy Potential Field Method for Mobile Robot Obstacle Avoidance

    PubMed Central

    Park, Jong-Wook; Kwak, Hwan-Joo; Kang, Young-Chang; Kim, Dong W.

    2016-01-01

    An advanced fuzzy potential field method for mobile robot obstacle avoidance is proposed. The potential field method primarily deals with the repulsive forces surrounding obstacles, while fuzzy control logic focuses on fuzzy rules that handle linguistic variables and describe the knowledge of experts. The design of a fuzzy controller—advanced fuzzy potential field method (AFPFM)—that models and enhances the conventional potential field method is proposed and discussed. This study also examines the rule-explosion problem of conventional fuzzy logic and assesses the performance of our proposed AFPFM through simulations carried out using a mobile robot. PMID:27123001

  8. Advanced electromagnetic methods for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; Choi, Jachoon; El-Sharawy, El-Budawy; Hashemi-Yeganeh, Shahrokh; Birtcher, Craig R.

    1990-01-01

    High- and low-frequency methods were developed to analyze various radiation elements located on aerospace vehicles with combinations of conducting, nonconducting, and energy-absorbing surfaces and interfaces. The focus was on developing fundamental concepts, techniques, and algorithms to remove some of the present limitations in predicting the radiation characteristics of antennas on complex aerospace vehicles. To accomplish this, the following subjects were examined: (1) the development of techniques for rigorous analysis of surface discontinuities of metallic and nonmetallic surfaces using the equivalent surface impedance concept and Green's functions; (2) the effects of anisotropic material on antenna radiation patterns through the use of an equivalent surface impedance concept incorporated into existing numerical electromagnetics computer codes; and (3) the fundamental concepts of precipitation static (P-static), such as formulations and analytical models. A computer code was used to model the P-static process on a simple structure. Measurement techniques were also developed to characterize the electrical properties at microwave frequencies. Samples of typical materials used in airframes were tested and the results are included.

  9. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1994-01-01

    NASA is responsible for developing much of the nation's future space technology. Cost estimates for new programs are required early in the planning process so that decisions can be made accurately. Because of the long lead times required to develop space hardware, cost estimates are frequently required 10 to 15 years before the program delivers hardware. The system design in the conceptual phases of a program is usually only vaguely defined, and the technology used is often state-of-the-art or beyond. These factors combine to make cost estimating for conceptual programs very challenging. This paper describes an effort to develop parametric cost estimating methods for space systems in the conceptual design phase. The approach is to identify variables that drive cost, such as weight, quantity, development culture, design inheritance, and time. The nature of the relationships between the driver variables and cost is discussed. In particular, the relationship between weight and cost is examined in detail. A theoretical model of cost is developed and tested statistically against a historical database of major research and development projects.
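
    Weight-driven parametric cost models are classically power laws, cost = a * weight**b, fitted by least squares in log-log space. The sketch below illustrates that standard technique only; the power-law form, names, and sample numbers are assumptions for the example, not taken from the paper.

```python
import math

def fit_power_law(weights, costs):
    """Fit cost = a * weight**b by ordinary least squares on log-transformed data."""
    xs = [math.log(w) for w in weights]
    ys = [math.log(c) for c in costs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope in log-log space is the exponent b; intercept recovers the scale a
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b
```

    An exponent b below 1 would capture the economy of scale often claimed for hardware cost-to-weight relationships; testing the fitted model against a historical project database is then a residual-analysis exercise.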

  10. [Quantitative analysis of alloy steel based on laser induced breakdown spectroscopy with partial least squares method].

    PubMed

    Cong, Zhi-Bo; Sun, Lan-Xiang; Xin, Yong; Li, Yang; Qi, Li-Feng; Yang, Zhi-Jia

    2014-02-01

    In the present paper, both the partial least squares (PLS) method and the calibration curve (CC) method are used to quantitatively analyze laser-induced breakdown spectroscopy data obtained from standard alloy steel samples. Both major and trace elements were quantitatively analyzed. Comparing the results of the two calibration methods yields some useful conclusions: for major elements, the PLS method outperforms the CC method in quantitative analysis; more importantly, for trace elements, the CC method cannot give quantitative results because of the extremely weak characteristic spectral lines, whereas the PLS method retains good quantitative ability. The regression coefficients of the PLS method are also compared with the original spectral data, including background interference, to explain the advantage of the PLS method in LIBS quantitative analysis. The results prove that the PLS method applied to laser-induced breakdown spectroscopy is suitable for quantitative analysis of trace elements such as C in the metallurgical industry. PMID:24822436
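
    The core of PLS, projecting full spectra onto a latent variable that covaries with concentration instead of regressing on a single line intensity, can be illustrated with a one-component PLS1 step in plain Python. This is a minimal sketch assuming mean-centered data, not the multi-component implementation the authors used.

```python
def pls1_one_component(X, y):
    """One-latent-variable PLS1 (NIPALS-style). X: list of mean-centered spectra
    (rows = samples, columns = spectral channels); y: mean-centered concentrations.
    Returns regression coefficients in the original spectral space."""
    n, p = len(X), len(X[0])
    # weight vector w proportional to X^T y, normalized to unit length
    w = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    # scores t = X w, then regress y on t
    t = [sum(X[i][j] * w[j] for j in range(p)) for i in range(n)]
    q = sum(t[i] * y[i] for i in range(n)) / sum(v * v for v in t)
    # coefficients mapping a (centered) spectrum directly to concentration
    return [q * wj for wj in w]
```

    Because the weight vector pools evidence from every channel, weak trace-element lines still contribute to the latent score, which is the intuition behind PLS outperforming a single-line calibration curve for trace elements.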

  11. Unstructured viscous grid generation by advancing-front method

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar

    1993-01-01

    A new method of generating unstructured triangular/tetrahedral grids with high-aspect-ratio cells is proposed. The method is based on a new grid-marching strategy, referred to as 'advancing layers', for construction of highly stretched cells in the boundary layer, and on the conventional advancing-front technique for generation of regular, equilateral cells in the inviscid-flow region. Unlike existing semi-structured viscous grid generation techniques, the new procedure relies on a totally unstructured advancing-front grid strategy, resulting in substantially enhanced grid flexibility and efficiency. The method is conceptually simple but powerful, capable of producing high-quality viscous grids for complex configurations with ease. A number of two-dimensional triangular grids are presented to demonstrate the methodology. The basic elements of the method, however, have been designed primarily with three-dimensional problems in mind, making it extendible to tetrahedral viscous grid generation.

  12. Advanced Ablative Insulators and Methods of Making Them

    NASA Technical Reports Server (NTRS)

    Congdon, William M.

    2005-01-01

    Advanced ablative (more specifically, charring) materials that provide temporary protection against high temperatures, and advanced methods of designing and manufacturing insulators based on these materials, are undergoing development. These materials and methods were conceived in an effort to replace the traditional thermal-protection systems (TPSs) of re-entry spacecraft with robust, lightweight, better-performing TPSs that can be designed and manufactured more rapidly and at lower cost. These materials and methods could also be used to make improved TPSs for general aerospace, military, and industrial applications.

  13. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    SciTech Connect

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social, and economic factors may offer insight as to how prone a country (or set of countries) may be to inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict, both retrospectively and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempts at modeling conflict as a result of system-level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, along with recommendations for future attempts at modeling conflict.

  14. Methods for quantitative determination of drug localized in the skin.

    PubMed

    Touitou, E; Meidan, V M; Horwitz, E

    1998-12-01

    The quantification of drugs within the skin is essential for topical and transdermal delivery research. Over the last two decades, horizontal sectioning, consisting of both tape stripping and parallel slicing through the deeper tissues has constituted the traditional investigative technique. In recent years, this methodology has been augmented by such procedures as heat separation, qualitative autoradiography, isolation of the pilosebaceous units and the use of induced follicle-free skin. The development of skin quantitative autoradiography represents an entirely novel approach which permits quantification and visualization of the penetrant throughout a vertical cross-section of skin. Noninvasive strategies involve the application of optical measuring systems such as attenuated total reflectance Fourier transform infrared, fluorescence, remittance or photothermal spectroscopies. PMID:9801425

  15. Quantitative estimation of poikilocytosis by the coherent optical method

    NASA Astrophysics Data System (ADS)

    Safonova, Larisa P.; Samorodov, Andrey V.; Spiridonov, Igor N.

    2000-05-01

    The investigation upon the necessity and the reliability required of the determination of the poikilocytosis in hematology has shown that existing techniques suffer from grave shortcomings. To determine a deviation of the erythrocytes' form from the normal (rounded) one in blood smears it is expedient to use an integrative estimate. The algorithm which is based on the correlation between erythrocyte morphological parameters with properties of the spatial-frequency spectrum of blood smear is suggested. During analytical and experimental research an integrative form parameter (IFP) which characterizes the increase of the relative concentration of cells with the changed form over 5% and the predominating type of poikilocytes was suggested. An algorithm of statistically reliable estimation of the IFP on the standard stained blood smears has been developed. To provide the quantitative characterization of the morphological features of cells a form vector has been proposed, and its validity for poikilocytes differentiation was shown.

  16. Strategy to Promote Active Learning of an Advanced Research Method

    ERIC Educational Resources Information Center

    McDermott, Hilary J.; Dovey, Terence M.

    2013-01-01

    Research methods courses aim to equip students with the knowledge and skills required for research yet seldom include practical aspects of assessment. This reflective practitioner report describes and evaluates an innovative approach to teaching and assessing advanced qualitative research methods to final-year psychology undergraduate students. An…

  17. Advancing the sensitivity of selected reaction monitoring-based targeted quantitative proteomics

    SciTech Connect

    Shi, Tujin; Su, Dian; Liu, Tao; Tang, Keqi; Camp, David G.; Qian, Weijun; Smith, Richard D.

    2012-04-01

    Selected reaction monitoring (SRM)—also known as multiple reaction monitoring (MRM)—has emerged as a promising high-throughput targeted protein quantification technology for candidate biomarker verification and systems biology applications. A major bottleneck for current SRM technology, however, is insufficient sensitivity for e.g., detecting low-abundance biomarkers likely present at the pg/mL to low ng/mL range in human blood plasma or serum, or extremely low-abundance signaling proteins in the cells or tissues. Herein we review recent advances in methods and technologies, including front-end immunoaffinity depletion, fractionation, selective enrichment of target proteins/peptides or their posttranslational modifications (PTMs), as well as advances in MS instrumentation, which have significantly enhanced the overall sensitivity of SRM assays and enabled the detection of low-abundance proteins at low to sub- ng/mL level in human blood plasma or serum. General perspectives on the potential of achieving sufficient sensitivity for detection of pg/mL level proteins in plasma are also discussed.

  18. A Primer In Advanced Fatigue Life Prediction Methods

    NASA Technical Reports Server (NTRS)

    Halford, Gary R.

    2000-01-01

    Metal fatigue has plagued structural components for centuries, and it remains a critical durability issue in today's aerospace hardware. This is true despite vastly improved and advanced materials, increased mechanistic understanding, and development of accurate structural analysis and advanced fatigue life prediction tools. Each advance is quickly taken advantage of to produce safer, more reliable, more cost-effective, and better performing products. In other words, as the envelope is expanded, components are then designed to operate just as close to the newly expanded envelope as they were to the initial one. The problem is perennial. The economic importance of addressing structural durability issues early in the design process is emphasized. Tradeoffs with performance, cost, and legislated restrictions are pointed out. Several aspects of structural durability of advanced systems, advanced materials, and advanced fatigue life prediction methods are presented. Specific items include the basic elements of durability analysis, conventional designs, barriers to be overcome for advanced systems, high-temperature life prediction for both creep-fatigue and thermomechanical fatigue, mean stress effects, multiaxial stress-strain states, and cumulative fatigue damage accumulation assessment.

  19. Disordered Speech Assessment Using Automatic Methods Based on Quantitative Measures

    NASA Astrophysics Data System (ADS)

    Gu, Lingyun; Harris, John G.; Shrivastav, Rahul; Sapienza, Christine

    2005-12-01

    Speech quality assessment methods are necessary for evaluating and documenting treatment outcomes of patients suffering from degraded speech due to Parkinson's disease, stroke, or other disease processes. Subjective methods of speech quality assessment are more accurate and more robust than objective methods but are time-consuming and costly. We propose a novel objective measure of speech quality assessment that builds on traditional speech processing techniques such as dynamic time warping (DTW) and the Itakura-Saito (IS) distortion measure. Initial results show that our objective measure correlates well with the more expensive subjective methods.
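
    The dynamic time warping component the proposed measure builds on is a standard dynamic program over a local distance; a minimal sketch is below. The Itakura-Saito spectral distortion stage and any weighting the authors apply are omitted, and the simple absolute-difference local cost is an assumption for illustration.

```python
def dtw_distance(a, b, dist=lambda x, y: abs(x - y)):
    """Dynamic time warping distance between sequences a and b: the minimum
    cumulative local cost over all monotone alignments of the two sequences."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = dist(a[i - 1], b[j - 1])
            # extend the cheapest of the three admissible predecessor alignments
            D[i][j] = c + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

    For disordered speech, the warping absorbs timing differences between a patient utterance and a reference, so the residual distance reflects quality degradation rather than mere speaking-rate variation.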

  20. Quantitative biomechanical comparison of ankle fracture casting methods.

    PubMed

    Shipman, Alastair; Alsousou, Joseph; Keene, David J; Dyson, Igor N; Lamb, Sarah E; Willett, Keith M; Thompson, Mark S

    2015-06-01

    The incidence of ankle fractures is increasing rapidly due to the ageing demographic. In older patients with compromised distal circulation, conservative treatment of fractures may be indicated. High rates of malunion and complications due to skin fragility motivate the design of novel casting systems, but biomechanical stability requirements are poorly defined. This article presents the first quantitative study of ankle cast stability and hypothesises that a newly proposed close contact cast (CCC) system provides similar biomechanical stability to standard casts (SC). Two adult mannequin legs transected at the malleoli, one incorporating an inflatable model of tissue swelling, were stabilised with casts applied by an experienced surgeon. They were cyclically loaded in torsion, measuring applied rotation angle and resulting torque. CCC stiffness was equal to or greater than that of SC in two measures of ankle cast resistance to torsion. The effect of swelling reduction at the ankle site was significantly greater on CCC than on SC. The data support the hypothesis that CCC provides similar biomechanical stability to SC and therefore also the clinical use of CCC. They suggest that more frequent re-application of CCC is likely required to maintain stability following resolution of swelling at the injury site. PMID:25719278

  1. Quantitative analysis with advanced compensated polarized light microscopy on wavelength dependence of linear birefringence of single crystals causing arthritis

    NASA Astrophysics Data System (ADS)

    Takanabe, Akifumi; Tanaka, Masahito; Taniguchi, Atsuo; Yamanaka, Hisashi; Asahi, Toru

    2014-07-01

    To improve our ability to identify single crystals causing arthritis, we have developed a practical measurement system of polarized light microscopy called advanced compensated polarized light microscopy (A-CPLM). The A-CPLM system is constructed by employing a conventional phase retardation plate, an optical fibre and a charge-coupled device spectrometer in a polarized light microscope. We applied the A-CPLM system to measure linear birefringence (LB) in the visible region, which is an optical anisotropic property, for tiny single crystals causing arthritis, i.e. monosodium urate monohydrate (MSUM) and calcium pyrophosphate dihydrate (CPPD). The A-CPLM system performance was evaluated by comparing the obtained experimental data using the A-CPLM system with (i) literature data for a standard sample, MgF2, and (ii) experimental data obtained using an established optical method, high-accuracy universal polarimeter, for the MSUM. The A-CPLM system was found to be applicable for measuring the LB spectra of the single crystals of MSUM and CPPD, which cause arthritis, in the visible regions. We quantitatively reveal the large difference in LB between MSUM and CPPD crystals. These results demonstrate the usefulness of the A-CPLM system for distinguishing the crystals causing arthritis.

  2. Using Active Learning to Teach Concepts and Methods in Quantitative Biology.

    PubMed

    Waldrop, Lindsay D; Adolph, Stephen C; Diniz Behn, Cecilia G; Braley, Emily; Drew, Joshua A; Full, Robert J; Gross, Louis J; Jungck, John A; Kohler, Brynja; Prairie, Jennifer C; Shtylla, Blerta; Miller, Laura A

    2015-11-01

    This article provides a summary of the ideas discussed at the 2015 Annual Meeting of the Society for Integrative and Comparative Biology society-wide symposium on Leading Students and Faculty to Quantitative Biology through Active Learning. It also includes a brief review of the recent advancements in incorporating active learning approaches into quantitative biology classrooms. We begin with an overview of recent literature that shows that active learning can improve students' outcomes in Science, Technology, Engineering and Math Education disciplines. We then discuss how this approach can be particularly useful when teaching topics in quantitative biology. Next, we describe some of the recent initiatives to develop hands-on activities in quantitative biology at both the graduate and the undergraduate levels. Throughout the article we provide resources for educators who wish to integrate active learning and technology into their classrooms. PMID:26269460

  3. Human-System Safety Methods for Development of Advanced Air Traffic Management Systems

    SciTech Connect

    Nelson, W.R.

    1999-05-24

    The Idaho National Engineering and Environmental Laboratory (INEEL) is supporting the National Aeronautics and Space Administration in the development of advanced air traffic management (ATM) systems as part of the Advanced Air Transportation Technologies program. As part of this program INEEL conducted a survey of human-system safety methods that have been applied to complex technical systems, to identify lessons learned from these applications and provide recommendations for the development of advanced ATM systems. The domains that were surveyed included offshore oil and gas, commercial nuclear power, commercial aviation, and military. The survey showed that widely different approaches are used in these industries, and that the methods used range from very high-level, qualitative approaches to very detailed quantitative methods such as human reliability analysis (HRA) and probabilistic safety assessment (PSA). In addition, the industries varied widely in how effectively they incorporate human-system safety assessment in the design, development, and testing of complex technical systems. In spite of the lack of uniformity in the approaches and methods used, it was found that methods are available that can be combined and adapted to support the development of advanced air traffic management systems.

  4. Advanced surface paneling method for subsonic and supersonic flow

    NASA Technical Reports Server (NTRS)

    Erickson, L. L.; Johnson, F. T.; Ehlers, F. E.

    1976-01-01

    Numerical results illustrating the capabilities of an advanced aerodynamic surface paneling method are presented. The method is applicable to both subsonic and supersonic flow, as represented by linearized potential flow theory. The method is based on linearly varying sources and quadratically varying doublets which are distributed over flat or curved panels. These panels are applied to the true surface geometry of arbitrarily shaped three dimensional aerodynamic configurations.

  5. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, T. H. H.

    1985-01-01

    Advanced stress analysis methods applicable to turbine engine structures are investigated. The construction of special elements containing traction-free circular boundaries is investigated. New versions of the mixed variational principle and of hybrid stress elements are formulated. A method is established for suppression of kinematic deformation modes. SemiLoof plate and shell elements are constructed by the assumed-stress hybrid method. An elastic-plastic analysis is conducted by viscoplasticity theory using the mechanical subelement model.

  6. Semi-quantitative method to estimate levels of Campylobacter

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Introduction: Research projects utilizing live animals and/or systems often require reliable, accurate quantification of Campylobacter following treatments. Even with marker strains, conventional methods designed to quantify are labor and material intensive requiring either serial dilutions or MPN ...

  7. Deep neural nets as a method for quantitative structure-activity relationships.

    PubMed

    Ma, Junshui; Sheridan, Robert P; Liaw, Andy; Dahl, George E; Svetnik, Vladimir

    2015-02-23

    Neural networks were widely used for quantitative structure-activity relationships (QSAR) in the 1990s. Because of various practical issues (e.g., slow on large problems, difficult to train, prone to overfitting), they were superseded by more robust methods like support vector machines (SVM) and random forests (RF), which arose in the early 2000s. The last 10 years have witnessed a revival of neural networks in the machine learning community, thanks to new methods for preventing overfitting, more efficient training algorithms, and advances in computer hardware. In particular, deep neural nets (DNNs), i.e., neural nets with more than one hidden layer, have found great success in many applications, such as computer vision and natural language processing. Here we show that DNNs can routinely make better prospective predictions than RF on a set of large, diverse QSAR data sets taken from Merck's drug discovery effort. The number of adjustable parameters needed for DNNs is fairly large, but our results show that it is not necessary to optimize them for individual data sets, and a single set of recommended parameters can achieve better performance than RF for most of the data sets we studied. The usefulness of the parameters is demonstrated on additional data sets not used in the calibration. Although training DNNs is still computationally intensive, using graphics processing units (GPUs) can make this issue manageable. PMID:25635324

  8. Reconstruction-classification method for quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Malone, Emma; Powell, Samuel; Cox, Ben T.; Arridge, Simon

    2015-12-01

    We propose a combined reconstruction-classification method for simultaneously recovering absorption and scattering in turbid media from images of absorbed optical energy. This method exploits knowledge that optical parameters are determined by a limited number of classes to iteratively improve their estimate. Numerical experiments show that the proposed approach allows for accurate recovery of absorption and scattering in two and three dimensions, and delivers superior image quality with respect to traditional reconstruction-only approaches.

  9. A method and fortran program for quantitative sampling in paleontology

    USGS Publications Warehouse

    Tipper, J.C.

    1976-01-01

    The Unit Sampling Method is a binomial sampling method applicable to the study of fauna preserved in rocks too well cemented to be disaggregated. Preliminary estimates of the probability of detecting each group in a single sampling unit can be converted to estimates of the group's volumetric abundance by means of correction curves obtained by a computer simulation technique. This paper describes the technique and gives the FORTRAN program. ?? 1976.
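
    The binomial logic of the Unit Sampling Method can be illustrated in Python rather than FORTRAN: if a group is detected in a single sampling unit with probability p, the chance of at least one detection across n units is 1 - (1 - p)**n, and a Monte Carlo run stands in for the paper's simulation-derived correction curves. The function names and defaults here are invented for the example.

```python
import random

def detection_probability(p_unit, n_units):
    """Closed-form probability of detecting a group at least once in n_units
    independent sampling units, given per-unit detection probability p_unit."""
    return 1.0 - (1.0 - p_unit) ** n_units

def simulate_detection(p_unit, n_units, trials=10000, seed=1):
    """Monte Carlo estimate of the same quantity; a toy stand-in for the
    correction-curve simulation described in the abstract."""
    rng = random.Random(seed)
    hits = sum(
        any(rng.random() < p_unit for _ in range(n_units))
        for _ in range(trials)
    )
    return hits / trials
```

    Inverting this relationship, going from an observed detection frequency back to a per-unit probability and then to volumetric abundance, is where the paper's correction curves come in.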

  10. Quantitative assessment of susceptibility weighted imaging processing methods

    PubMed Central

    Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.

    2013-01-01

    Purpose To evaluate different susceptibility weighted imaging (SWI) phase processing methods and parameter selection, thereby improving understanding of potential artifacts, as well as facilitating choice of methodology in clinical settings. Materials and Methods Two major phase processing methods, Homodyne-filtering and phase unwrapping-high pass (HP) filtering, were investigated with various phase unwrapping approaches, filter sizes, and filter types. Magnitude and phase images were acquired from a healthy subject and brain injury patients on a 3T clinical Siemens MRI system. Results were evaluated based on image contrast to noise ratio and presence of processing artifacts. Results When using a relatively small filter size (32 pixels for the matrix size 512 × 512 pixels), all Homodyne-filtering methods were subject to phase errors leading to 2% to 3% masked brain area in lower and middle axial slices. All phase unwrapping-filtering/smoothing approaches demonstrated fewer phase errors and artifacts compared to the Homodyne-filtering approaches. For performing phase unwrapping, Fourier-based methods, although less accurate, were 2–4 orders of magnitude faster than the PRELUDE, Goldstein and Quality-guide methods. Conclusion Although Homodyne-filtering approaches are faster and more straightforward, phase unwrapping followed by HP filtering approaches perform more accurately in a wider variety of acquisition scenarios. PMID:24923594
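
    The phase unwrapping-HP filtering idea, removing the smooth background field while keeping local phase variation, reduces in one dimension to subtracting a low-pass-filtered copy of the signal from itself. The moving-average kernel below is only a 1-D stand-in for the 2-D filters compared in the paper; the window size plays the role of the filter size whose choice the study evaluates.

```python
def high_pass(signal, window):
    """High-pass a 1-D signal by subtracting a moving-average low-pass version.
    Edges use a truncated window rather than padding."""
    n = len(signal)
    half = window // 2
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        smooth = sum(signal[lo:hi]) / (hi - lo)  # local low-pass estimate
        out.append(signal[i] - smooth)
    return out
```

    A slowly varying background passes through the averaging almost unchanged and is cancelled, while sharp local features survive, which is exactly the behavior wanted from SWI phase masks; too small a window, as the paper observes for 32-pixel filters, leaves residual errors.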

  11. HPTLC Method for Quantitative Determination of Zopiclone and Its Impurity.

    PubMed

    Naguib, Ibrahim A; Abdelrahman, Maha M; El Ghobashy, Mohamed R; Ali, Nesma A

    2015-09-01

    This study was designed to establish, optimize, and validate a sensitive, selective, and accurate high-performance thin-layer chromatographic (HPTLC) method for determination of zopiclone (ZPC) and its main impurity, 2-amino-5-chloropyridine, one of its degradation products, in raw material and pharmaceutical formulation. The proposed method was applied for analysis of ZPC and its impurity over concentration ranges of 0.3-1.4 and 0.05-0.8 µg/band, with mean percentage recoveries of 99.92 ± 1.521% and 99.28 ± 2.296%, respectively. The method is based on separation of the two components followed by densitometric measurement of the separated peaks at 305 nm. The separation was carried out on silica gel HPTLC F254 plates, using chloroform-methanol-glacial acetic acid (9:1:0.1, by volume) as the developing system. The suggested method was validated according to International Conference on Harmonization guidelines and can be applied for routine analysis in quality control laboratories. The results obtained by the proposed method were statistically compared with those of the reported method, revealing high accuracy and good precision. PMID:25740427

  12. A quantitative comparative analysis of Advancement via Independent Determination (AVID) in Texas middle schools

    NASA Astrophysics Data System (ADS)

    Reed, Krystal Astra

    The "Advancement via Individual Determination (AVID) program was designed to provide resources and strategies that enable underrepresented minority students to attend 4-year colleges" (AVID Center, 2013, p. 2). These students are characterized as the forgotten middle in that they have high test scores, average-to-low grades, minority or low socioeconomic status, and will be first-generation college students (AVID, 2011). Research indicates (Huerta, Watt, & Butcher, 2013) that strict adherence to 11 program components supports success of students enrolled in AVID, and AVID certification depends on districts following those components. Several studies (AVID Center, 2013) have investigated claims about the AVID program through qualitative analyses; however, very few have addressed this program quantitatively. This researcher sought to determine whether differences existed between student achievement and attendance rates between AVID and non-AVID middle schools. To achieve this goal, the researcher compared eighth-grade science and seventh- and eighth-grade mathematics scores from the 2007 to 2011 Texas Assessment of Knowledge and Skills (TAKS) and overall attendance rates in demographically equivalent AVID and non-AVID middle schools. Academic Excellence Indicator System (AEIS) reports from the Texas Education Agency (TEA) were used to obtain 2007 to 2011 TAKS results and attendance information for the selected schools. The results indicated a statistically significant difference between AVID demonstration students and non-AVID students in schools with similar CI. No statistically significant differences were found on any component of the TAKS for AVID economically disadvantaged students. The mean scores indicated an achievement gap between non-AVID and AVID demonstration middle schools. The findings from the other three research questions indicated no statistically significant differences between AVID and non-AVID student passing rates on the seventh- and eighth

  13. Comparative evaluation of two quantitative precipitation estimation methods in Korea

    NASA Astrophysics Data System (ADS)

    Ko, H.; Nam, K.; Jung, H.

    2013-12-01

    The spatial distribution and intensity of rainfall are necessary inputs for hydrological models, particularly grid-based distributed models. Weather radar has much higher spatial resolution (1 km x 1 km) than rain gauges (~13 km), although radar measures rainfall indirectly while rain gauges observe it directly. Radar also provides areal, gridded rainfall information, whereas rain gauges provide point data. Therefore, radar rainfall data can be useful as input data for hydrological models. In this study, we compared two QPE schemes for producing radar rainfall for hydrological use. The two methods are 1) spatial adjustment and 2) real-time Z-R relationship adjustment (hereafter RAR: Radar-AWS Rain rate). We computed and analyzed statistics such as the ME (mean error), RMSE (root mean square error), and correlation using a cross-validation method (here, the leave-one-out method).
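
    The evaluation statistics named in the abstract (ME, RMSE, and correlation) are straightforward to compute from paired gauge observations and radar estimates; a sketch follows, with the leave-one-out machinery omitted. Function names are chosen for the example.

```python
import math

def mean_error(obs, est):
    """ME: average bias of estimates relative to observations (positive = overestimate)."""
    return sum(e - o for o, e in zip(obs, est)) / len(obs)

def rmse(obs, est):
    """Root mean square error between observations and estimates."""
    return math.sqrt(sum((e - o) ** 2 for o, e in zip(obs, est)) / len(obs))

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den
```

    In a leave-one-out design, each gauge is withheld in turn, the radar QPE is produced without it, and these statistics are accumulated at the withheld site, giving an honest measure of how each adjustment scheme generalizes.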

  14. A quantitative measurement method for comparison of seated postures.

    PubMed

    Hillman, Susan J; Hollington, James

    2016-05-01

    This technical note proposes a method to measure and compare seated postures. The three-dimensional locations of palpable anatomical landmarks corresponding to the anterior superior iliac spines, clavicular notch, head, shoulders and knees are measured in terms of x, y and z co-ordinates in the reference system of the measuring apparatus. These co-ordinates are then transformed onto a body-based axis system which allows comparison within-subject. The method was tested on eleven unimpaired adult participants and the resulting data used to calculate a Least Significant Difference (LSD) for the measure, which is used to determine whether two postures are significantly different from one another. The method was found to be sensitive to the four following standardised static postural perturbations: posterior pelvic tilt, pelvic obliquity, pelvic rotation, and abduction of the thighs. The resulting data could be used as an outcome measure for the postural alignment aspect of seating interventions in wheelchairs. PMID:26920073

  15. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and freezing at -20°C. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  16. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1981-02-25

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules or ions.

  17. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, Edward F.; Keller, Richard A.; Apel, Charles T.

    1983-01-01

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions.

  18. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1983-09-06

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions. 6 figs.

  19. Selection methods in forage breeding: a quantitative appraisal

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Forage breeding can be extraordinarily complex because of the number of species, perenniality, mode of reproduction, mating system, and the genetic correlation for some traits evaluated in spaced plants vs. performance under cultivation. Aiming to compare eight forage breeding methods for direct sel...

  20. METHODS ADVANCEMENT FOR MILK ANALYSIS: THE MAMA STUDY

    EPA Science Inventory

    The Methods Advancement for Milk Analysis (MAMA) study was designed by US EPA and CDC investigators to provide data to support the technological and study design needs of the proposed National Children's Study (NCS). The NCS is a multi-Agency-sponsored study, authorized under the...

  1. Advanced boundary layer transition measurement methods for flight applications

    NASA Technical Reports Server (NTRS)

    Holmes, B. J.; Croom, C. C.; Gail, P. D.; Manuel, G. S.; Carraway, D. L.

    1986-01-01

    In modern laminar flow flight research, it is important to understand the specific cause(s) of laminar to turbulent boundary-layer transition. Such information is crucial to the exploration of the limits of practical application of laminar flow for drag reduction on aircraft. The transition modes of interest in current flight investigations include the viscous Tollmien-Schlichting instability, the inflectional instability at laminar separation, and the crossflow inflectional instability, as well as others. This paper presents the results to date of research on advanced devices and methods used for the study of laminar boundary-layer transition phenomena in the flight environment. Recent advancements in the development of arrayed hot-film devices and of a new flow visualization method are discussed. Arrayed hot-film devices have been designed to detect the presence of laminar separation, and of crossflow vorticity. The advanced flow visualization method utilizes color changes in liquid-crystal coatings to detect boundary-layer transition at high altitude flight conditions. Flight and wind tunnel data are presented to illustrate the design and operation of these advanced methods. These new research tools provide information on disturbance growth and transition mode which is essential to furthering our understanding of practical design limits for applications of laminar flow technology.

  2. [Study on quantitative methods of cleistocalycis operculati cortex].

    PubMed

    Chen, Li-Si; Ou, Jia-Ju; Li, Shu-Yuan; Lu, Song-Gui

    2014-08-01

    Cleistocalycis Operculati Cortex is the dried bark of Cleistocalyx operculatus. It is the raw material of Compound Hibiscuse, an external sterilizing and antipruritic preparation. The quality standard for Cleistocalycis Operculati Cortex in the Guangdong Province "Standard for the Traditional Chinese Medicine" (second volume) contains only a TLC identification, which cannot effectively monitor and control the quality of Cleistocalycis Operculati Cortex. A reversed-phase HPLC method was established for the determination of 3,3'-O-dimethylellagic acid in Cleistocalycis Operculati Cortex, with the content calculated by the external standard method for the first time. Under the selected chromatographic conditions, the target component peaks achieved effective separation. The 3,3'-O-dimethylellagic acid standard solution showed a good linear relationship over the concentration range 1.00 - 25.0 mg x L(-1). The calibration curve was Y = 77.33X + 7.904, r = 0.9995. The average recovery was 101.0%, with an RSD of 1.3%. The HPLC method for the determination of 3,3'-O-dimethylellagic acid in Cleistocalycis Operculati Cortex is accurate and reliable, and can provide strong technical support for monitoring the quality of Cleistocalycis Operculati Cortex. PMID:25509300
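The external-standard calculation can be sketched from the reported calibration line Y = 77.33X + 7.904 (Y peak area, X concentration in mg/L). The helper converting back to sample content is hypothetical: the dilution volume and sample mass are illustrative parameters, not values from the abstract.

```python
def conc_from_area(area, slope=77.33, intercept=7.904):
    """External-standard quantification: invert the calibration
    line Y = slope * X + intercept to get concentration in mg/L."""
    return (area - intercept) / slope

def content_percent(area, dilution_ml, sample_mg):
    """Hypothetical worked example: convert the solution
    concentration (mg/L) to percent content of the weighed sample."""
    conc = conc_from_area(area)            # mg/L in the test solution
    mass_mg = conc * dilution_ml / 1000.0  # mg of analyte in solution
    return 100.0 * mass_mg / sample_mg
```

For instance, a peak area of 781.204 corresponds to 10.0 mg/L on the reported line; in 25 mL of solution from a 500 mg sample that is 0.05% analyte.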

  3. Domain Decomposition By the Advancing-Partition Method

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    2008-01-01

    A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of the domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.

  4. Advanced propulsion for LEO-Moon transport. 1: A method for evaluating advanced propulsion performance

    NASA Technical Reports Server (NTRS)

    Stern, Martin O.

    1992-01-01

    This report describes a study to evaluate the benefits of advanced propulsion technologies for transporting materials between low Earth orbit and the Moon. A relatively conventional reference transportation system, and several other systems, each of which includes one advanced technology component, are compared in terms of how well they perform a chosen mission objective. The evaluation method is based on a pairwise life-cycle cost comparison of each of the advanced systems with the reference system. Somewhat novel and economically important features of the procedure are the inclusion not only of mass payback ratios based on Earth launch costs, but also of repair and capital acquisition costs, and of adjustments in the latter to reflect the technological maturity of the advanced technologies. The required input information is developed by panels of experts. The overall scope and approach of the study are presented in the introduction. The bulk of the paper describes the evaluation method; the reference system and an advanced transportation system, including a spinning tether in an eccentric Earth orbit, are used to illustrate it.

  5. Compatibility of Qualitative and Quantitative Methods: Studying Child Sexual Abuse in America.

    ERIC Educational Resources Information Center

    Phelan, Patricia

    1987-01-01

    Illustrates how the combined use of qualitative and quantitative methods were necessary in obtaining a clearer understanding of the process of incest in American society. Argues that the exclusive use of one methodology would have obscured important information. (FMW)

  6. Qualitative and quantitative determination of ubiquinones by the method of high-efficiency liquid chromatography

    SciTech Connect

    Yanotovskii, M.T.; Mogilevskaya, M.P.; Obol'nikova, E.A.; Kogan, L.M.; Samokhvalov, G.I.

    1986-07-10

    A method has been developed for the qualitative and quantitative determination of ubiquinones CoQ/sub 6/-CoQ/sub 10/, using high-efficiency reversed-phase liquid chromatography. Tocopherol acetate was used as the internal standard.

  7. QUANTITATIVE CANCER RISK ASSESSMENT METHODOLOGY USING SHORT-TERM GENETIC BIOASSAYS: THE COMPARATIVE POTENCY METHOD

    EPA Science Inventory

    Quantitative risk assessment is fraught with many uncertainties. The validity of the assumptions underlying the methods employed are often difficult to test or validate. Cancer risk assessment has generally employed either human epidemiological data from relatively high occupatio...

  8. Advances and future directions of research on spectral methods

    NASA Technical Reports Server (NTRS)

    Patera, A. T.

    1986-01-01

    Recent advances in spectral methods are briefly reviewed and characterized with respect to their convergence and computational complexity. Classical finite element and spectral approaches are then compared, and spectral element (or p-type finite element) approximations are introduced. The method is applied to the full Navier-Stokes equations, and examples are given of the application of the technique to several transitional flows. Future directions of research in the field are outlined.

  9. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
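The prediction sub-task with k-Nearest Neighbours can be sketched as follows. The Euclidean distance metric and the choice of k are illustrative assumptions; in the paper's setting the inputs would be GA-reduced spectra rather than raw feature vectors.

```python
import math

def knn_predict(train_spectra, train_conc, query, k=3):
    """k-Nearest-Neighbours regression on (reduced) spectra: the
    predicted concentration is the mean concentration of the k
    training spectra closest to the query in Euclidean distance."""
    dists = sorted(
        (math.dist(spectrum, query), conc)
        for spectrum, conc in zip(train_spectra, train_conc)
    )
    return sum(conc for _, conc in dists[:k]) / k
```

An ensemble, as the paper suggests, would average this prediction with those of other regressors (e.g. a neural network) rather than relying on a single model.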

  10. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To chart progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained within each individual approach, whereas contradictory results were found between TBA and SBA. The TBA Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). A gray-area simulation study revealed consistent results for these two models and indicated a lower Variation Ratio (VR) (8.10%) for TBA at six simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  11. Advances in subtyping methods of foodborne disease pathogens.

    PubMed

    Boxrud, Dave

    2010-04-01

    Current subtyping methods for the detection of foodborne disease outbreaks have limitations that reduce their use by public health laboratories. Recent advances in subtyping of foodborne disease pathogens utilize techniques that identify nucleic acid polymorphisms. Recent methods of nucleic acid characterization such as microarrays and mass spectrometry (MS) may provide improvements such as increasing speed and data portability while decreasing labor compared to current methods. This article discusses multiple-locus variable-number tandem-repeat analysis, single-nucleotide polymorphisms, nucleic acid sequencing, whole genome sequencing, variable absent or present loci, microarrays and MS as potential subtyping methods to enhance our ability to detect foodborne disease outbreaks. PMID:20299203

  12. Advances in whole-embryo imaging: a quantitative transition is underway.

    PubMed

    Pantazis, Periklis; Supatto, Willy

    2014-05-01

    With the advent of imaging probes and live microscopy, developmental biologists have markedly extended our understanding of the molecular and cellular details of embryonic development. To fully comprehend the complex mechanistic framework that forms the developing organism, quantitative studies with high fidelity in space and time are now required. We discuss how integrating established, newly introduced and future imaging tools with quantitative analysis will ensure that imaging can fulfil its promise to elucidate how new life begins. PMID:24739741

  13. Spinal Cord Segmentation by One Dimensional Normalized Template Matching: A Novel, Quantitative Technique to Analyze Advanced Magnetic Resonance Imaging Data.

    PubMed

    Cadotte, Adam; Cadotte, David W; Livne, Micha; Cohen-Adad, Julien; Fleet, David; Mikulis, David; Fehlings, Michael G

    2015-01-01

    Spinal cord segmentation is a developing area of research intended to aid the processing and interpretation of advanced magnetic resonance imaging (MRI). For example, high resolution three-dimensional volumes can be segmented to provide a measurement of spinal cord atrophy. Spinal cord segmentation is difficult due to the variety of MRI contrasts and the variation in human anatomy. In this study we propose a new method of spinal cord segmentation based on one-dimensional template matching and provide several metrics that can be used to compare with other segmentation methods. A set of ground-truth data from 10 subjects was manually segmented by two different raters. These ground truth data formed the basis of the segmentation algorithm. A user was required to manually initialize the spinal cord centerline on new images, taking less than one minute. Template matching was used to segment the new cord and a refined centerline was calculated based on multiple centroids within the segmentation. Arc distances down the spinal cord and cross-sectional areas were calculated. Inter-rater validation was performed by comparing two manual raters (n = 10). Semi-automatic validation was performed by comparing the two manual raters to the semi-automatic method (n = 10). Comparing the semi-automatic method to one of the raters yielded a Dice coefficient of 0.91 +/- 0.02 for ten subjects, a mean distance between spinal cord centerlines of 0.32 +/- 0.08 mm, and a Hausdorff distance of 1.82 +/- 0.33 mm. The absolute variation in cross-sectional area was comparable for the semi-automatic method versus manual segmentation when compared to inter-rater manual segmentation. The results demonstrate that this novel segmentation method performs as well as a manual rater for most segmentation metrics. It offers a new approach to study spinal cord disease and to quantitatively track changes within the spinal cord in an individual case and across cohorts of subjects. PMID:26445367
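The overlap metrics used for validation are standard and easy to state in code. This sketch assumes binary masks flattened to 0/1 sequences for the Dice coefficient and 2-D point sets for the Hausdorff distance; it is not the paper's implementation.

```python
import math

def dice(mask_a, mask_b):
    """Dice similarity between two binary masks given as flat
    sequences of 0/1: 2|A intersect B| / (|A| + |B|)."""
    inter = sum(a and b for a, b in zip(mask_a, mask_b))
    size = sum(mask_a) + sum(mask_b)
    return 2.0 * inter / size if size else 1.0

def hausdorff(pts_a, pts_b):
    """Symmetric Hausdorff distance between two point sets:
    the largest distance from any point to the other set."""
    def directed(p, q):
        return max(min(math.dist(x, y) for y in q) for x in p)
    return max(directed(pts_a, pts_b), directed(pts_b, pts_a))
```

Dice rewards bulk overlap, while the Hausdorff distance penalizes the single worst boundary error, which is why segmentation papers typically report both.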

  14. Correlation of Quantitative Motor State Assessment Using a Kinetograph and Patient Diaries in Advanced PD: Data from an Observational Study

    PubMed Central

    Ossig, Christiana; Gandor, Florin; Fauser, Mareike; Bosredon, Cecile; Churilov, Leonid; Reichmann, Heinz; Horne, Malcolm K.; Ebersbach, Georg; Storch, Alexander

    2016-01-01

    Introduction Effective management and development of new treatment strategies for response fluctuations in advanced Parkinson’s disease (PD) largely depends on clinical rating instruments such as the PD home diary. The Parkinson’s kinetigraph (PKG) measures movement accelerations and analyzes the spectral power of the low frequencies of the accelerometer data. New algorithms convert each hour of continuous PKG data into one of the three motor categories used in the PD home diary, namely motor Off state and On state with and without dyskinesia. Objective To compare quantitative motor state assessment in fluctuating PD patients using the PKG with motor state ratings from PD home diaries. Methods Observational cohort study on 24 in-patients with documented motor fluctuations who completed diaries by rating motor Off, On without dyskinesia, On with dyskinesia, and asleep for every hour for 5 consecutive days. Simultaneously collected PKG data (recorded between 6 am and 10 pm) were analyzed and calibrated to the patient’s individual thresholds for Off and dyskinetic state by novel algorithms classifying the continuous accelerometer data into these motor states for every hour between 6 am and 10 pm. Results From a total of 2,040 hours, 1,752 hours (87.4%) were available for analyses from calibrated PKG data (7.5% sleeping time and 5.1% unclassified motor state time were excluded from analyses). Distributions of total motor state hours per day measured by PKG showed moderate-to-strong correlation to those assessed by diaries for the different motor states (Pearson’s correlation coefficients: 0.404–0.658), but inter-rating method agreement on the single-hour level was only low-to-moderate (Cohen’s κ: 0.215–0.324). Conclusion The PKG has been shown to capture motor fluctuations in patients with advanced PD. The limited correlation of hour-to-hour diary and PKG recordings should be addressed in further studies. PMID:27556806
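Cohen's κ, the hour-level agreement statistic reported above, weights observed agreement against the agreement expected by chance. A minimal sketch, assuming each hour is labeled with one of the diary's motor categories:

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two categorical ratings of the same hours
    (e.g. 'off', 'on', 'dysk'): (p_o - p_e) / (1 - p_e), where p_o
    is observed agreement and p_e is chance agreement from the two
    raters' marginal category frequencies."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    p_e = sum(
        (rater1.count(c) / n) * (rater2.count(c) / n) for c in categories
    )
    return (p_o - p_e) / (1.0 - p_e) if p_e < 1.0 else 1.0
```

κ = 1 for perfect agreement and 0 for chance-level agreement, which is why the reported 0.215-0.324 counts as only low-to-moderate even though raw hour-by-hour matches were more frequent.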

  15. Pleistocene Lake Bonneville and Eberswalde Crater of Mars: Quantitative Methods for Recognizing Poorly Developed Lacustrine Shorelines

    NASA Astrophysics Data System (ADS)

    Jewell, P. W.

    2014-12-01

    The ability to quantify shoreline features on Earth has been aided by advances in acquisition of high-resolution topography through laser imaging and photogrammetry. Well-defined and well-documented features such as the Bonneville, Provo, and Stansbury shorelines of Late Pleistocene Lake Bonneville are recognizable to the untrained eye and easily mappable on aerial photos. The continuity and correlation of lesser shorelines must rely on quantitative algorithms for processing high-resolution data in order to gain widespread scientific acceptance. Using Savitzky-Golay filters and the geomorphic methods and criteria described by Hare et al. [2001], minor, transgressive, erosional shorelines of Lake Bonneville have been identified and correlated across the basin with varying degrees of statistical confidence. Results solve one of the key paradoxes of Lake Bonneville first described by G. K. Gilbert in the late 19th century and point the way for understanding climatically driven oscillations of the Last Glacial Maximum in the Great Basin of the United States. Similar techniques have been applied to the Eberswalde Crater area of Mars using HiRISE DEMs (1 m horizontal resolution) where a paleolake is hypothesized to have existed. Results illustrate the challenges of identifying shorelines where long-term aeolian processes have degraded the shorelines and field validation is not possible. The work illustrates the promises and challenges of identifying remnants of a global ocean elsewhere on the red planet.
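The Savitzky-Golay filtering step mentioned above can be sketched without SciPy by solving the windowed least-squares polynomial fit directly; the window length and polynomial order below are illustrative choices, not the study's parameters.

```python
import numpy as np

def savgol(y, window, order):
    """Minimal Savitzky-Golay smoother: fit a polynomial of the
    given order by least squares in each sliding window and keep
    the fitted value at the window centre. Equivalent centre-point
    weights come from the pseudo-inverse of the design matrix."""
    half = window // 2
    y = np.asarray(y, dtype=float)
    x = np.arange(-half, half + 1)
    A = np.vander(x, order + 1, increasing=True)   # columns 1, x, x^2, ...
    weights = np.linalg.pinv(A)[0]                 # constant-term row
    ypad = np.pad(y, half, mode="edge")            # replicate endpoints
    return np.convolve(ypad, weights[::-1], mode="valid")
```

Because the filter fits a local polynomial rather than averaging, it smooths elevation noise in a topographic profile while preserving the slope breaks that mark shoreline benches.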

  16. Quantitative Trait Locus Mapping Methods for Diversity Outbred Mice

    PubMed Central

    Gatti, Daniel M.; Svenson, Karen L.; Shabalin, Andrey; Wu, Long-Yang; Valdar, William; Simecek, Petr; Goodwin, Neal; Cheng, Riyan; Pomp, Daniel; Palmer, Abraham; Chesler, Elissa J.; Broman, Karl W.; Churchill, Gary A.

    2014-01-01

    Genetic mapping studies in the mouse and other model organisms are used to search for genes underlying complex phenotypes. Traditional genetic mapping studies that employ single-generation crosses have poor mapping resolution and limit discovery to loci that are polymorphic between the two parental strains. Multiparent outbreeding populations address these shortcomings by increasing the density of recombination events and introducing allelic variants from multiple founder strains. However, multiparent crosses present new analytical challenges and require specialized software to take full advantage of these benefits. Each animal in an outbreeding population is genetically unique and must be genotyped using a high-density marker set; regression models for mapping must accommodate multiple founder alleles, and complex breeding designs give rise to polygenic covariance among related animals that must be accounted for in mapping analysis. The Diversity Outbred (DO) mice combine the genetic diversity of eight founder strains in a multigenerational breeding design that has been maintained for >16 generations. The large population size and randomized mating ensure the long-term genetic stability of this population. We present a complete analytical pipeline for genetic mapping in DO mice, including algorithms for probabilistic reconstruction of founder haplotypes from genotyping array intensity data, and mapping methods that accommodate multiple founder haplotypes and account for relatedness among animals. Power analysis suggests that studies with as few as 200 DO mice can detect loci with large effects, but loci that account for <5% of trait variance may require a sample size of up to 1000 animals. The methods described here are implemented in the freely available R package DOQTL. PMID:25237114

  18. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.

  19. Fracture Toughness in Advanced Monolithic Ceramics - SEPB Versus SEVNB Methods

    NASA Technical Reports Server (NTRS)

    Choi, S. R.; Gyekenyesi, J. P.

    2005-01-01

    Fracture toughness of a total of 13 advanced monolithic ceramics including silicon nitrides, silicon carbide, aluminas, and glass ceramic was determined at ambient temperature by using both single edge precracked beam (SEPB) and single edge v-notched beam (SEVNB) methods. Relatively good agreement in fracture toughness between the two methods was observed for advanced ceramics with flat R-curves; whereas, poor agreement in fracture toughness was seen for materials with rising R-curves. The discrepancy in fracture toughness between the two methods was due to stable crack growth with crack closure forces acting in the wake region of cracks even in SEVNB test specimens. The effect of discrepancy in fracture toughness was analyzed in terms of microstructural feature (grain size and shape), toughening exponent, and stable crack growth determined using back-face strain gaging.

  20. Optimization of Quantitative PCR Methods for Enteropathogen Detection

    PubMed Central

    Liu, Jie; Gratz, Jean; Amour, Caroline; Nshama, Rosemary; Walongo, Thomas; Maro, Athanasia; Mduma, Esto; Platts-Mills, James; Boisen, Nadia; Nataro, James; Haverstick, Doris M.; Kabir, Furqan; Lertsethtakarn, Paphavee; Silapong, Sasikorn; Jeamwattanalert, Pimmada; Bodhidatta, Ladaporn; Mason, Carl; Begum, Sharmin; Haque, Rashidul; Praharaj, Ira; Kang, Gagandeep; Houpt, Eric R.

    2016-01-01

    Detection and quantification of enteropathogens in stool specimens is useful for diagnosing the cause of diarrhea but is technically challenging. Here we evaluate several important determinants of quantification: specimen collection, nucleic acid extraction, and extraction and amplification efficiency. First, we evaluate the molecular detection and quantification of pathogens in rectal swabs versus stool, using paired flocked rectal swabs and whole stool collected from 129 children hospitalized with diarrhea in Tanzania. Swabs generally yielded a higher quantification cycle (Cq) (average 29.7, standard deviation 3.5 vs. 25.3 ± 2.9 from stool, P<0.001) but were still able to detect 80% of pathogens with a Cq < 30 in stool. Second, a simplified total nucleic acid (TNA) extraction procedure was compared to separate DNA and RNA extractions and showed 92% (318/344) sensitivity and 98% (951/968) specificity, with no difference in Cq value for the positive results (ΔCq(DNA+RNA-TNA) = -0.01 ± 1.17, P = 0.972, N = 318). Third, we devised a quantification scheme that adjusts pathogen quantity to the specimen’s extraction and amplification efficiency, and show that this better estimates the quantity of spiked specimens than the raw target Cq. In sum, these methods for enteropathogen quantification, stool sample collection, and nucleic acid extraction will be useful for laboratories studying enteric disease. PMID:27336160
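The efficiency-adjusted quantification idea can be sketched as follows. This is a hypothetical simplification of the paper's scheme: a control spiked at a known quantity reports the specimen's combined extraction/amplification loss as a Cq shift, which is then removed from the target Cq before converting to a relative quantity (assuming perfect doubling per cycle).

```python
def adjusted_quantity(cq_target, cq_spike_observed, cq_spike_expected,
                      efficiency=2.0):
    """Shift the target Cq by the control's deviation from its
    expected Cq, then convert to a relative quantity. A spike that
    amplifies one cycle late implies ~50% loss, so the target
    quantity is corrected upward by the same factor."""
    cq_adjusted = cq_target - (cq_spike_observed - cq_spike_expected)
    return efficiency ** (-cq_adjusted)
```

With this convention, a specimen whose spike came up one cycle late yields exactly twice the quantity estimate of one whose spike behaved as expected, at the same raw target Cq.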

  1. A practical and sensitive method of quantitating lymphangiogenesis in vivo.

    PubMed

    Majumder, Mousumi; Xin, Xiping; Lala, Peeyush K

    2013-07-01

    To address the inadequacy of current assays, we developed a directed in vivo lymphangiogenesis assay (DIVLA) by modifying an established directed in vivo angiogenesis assay. Silicon tubes (angioreactors) were implanted in the dorsal flanks of nude mice. Tubes contained either growth factor-reduced basement membrane extract (BME)-alone (negative control) or BME-containing vascular endothelial growth factor (VEGF)-D (positive control for lymphangiogenesis) or FGF-2/VEGF-A (positive control for angiogenesis) or a high VEGF-D-expressing breast cancer cell line MDA-MB-468LN (468LN), or VEGF-D-silenced 468LN. Lymphangiogenesis was detected superficially with Evans Blue dye tracing and measured in the cellular contents of angioreactors by multiple approaches: lymphatic vessel endothelial hyaluronan receptor-1 (Lyve1) protein (immunofluorescence) and mRNA (qPCR) expression and a visual scoring of lymphatic vs blood capillaries with dual Lyve1 (or Prox1 or Podoplanin)/Cd31 immunostaining in cryosections. Lymphangiogenesis was absent with BME, high with VEGF-D or VEGF-D-producing 468LN cells and low with VEGF-D-silenced 468LN. Angiogenesis was absent with BME, high with FGF-2/VEGF-A, moderate with 468LN or VEGF-D and low with VEGF-D-silenced 468LN. The method was reproduced in a syngeneic murine C3L5 tumor model in C3H/HeJ mice with dual Lyve1/Cd31 immunostaining. Thus, DIVLA presents a practical and sensitive assay of lymphangiogenesis, validated with multiple approaches and markers. It is highly suited to identifying pro- and anti-lymphangiogenic agents, as well as shared or distinct mechanisms regulating lymphangiogenesis vs angiogenesis, and is widely applicable to research in vascular/tumor biology. PMID:23711825

  2. Quantitative research on the primary process: method and findings.

    PubMed

    Holt, Robert R

    2002-01-01

    Freud always defined the primary process metapsychologically, but he described the ways it shows up in dreams, parapraxes, jokes, and symptoms with enough observational detail to make it possible to create an objective, reliable scoring system to measure its manifestations in Rorschach responses, dreams, TAT stories, free associations, and other verbal texts. That system can identify signs of the thinker's efforts, adaptive or maladaptive, to control or defend against the emergence of primary process. A prerequisite and a consequence of the research that used this system was clarification and elaboration of the psychoanalytic theory of thinking. Results of empirical tests of several propositions derived from psychoanalytic theory are summarized. Predictions concerning the method's most useful index, of adaptive vs. maladaptive regression, have been repeatedly verified: People who score high on this index (who are able to produce well-controlled "primary products" in their Rorschach responses), as compared to those who score at the maladaptive pole (producing primary-process-filled responses with poor reality testing, anxiety, and pathological defensive efforts), are better able to tolerate sensory deprivation, are more able to enter special states of consciousness comfortably (drug-induced, hypnotic, etc.), and have higher achievements in artistic creativity, while schizophrenics tend to score at the extreme of maladaptive regression. Capacity for adaptive regression also predicts success in psychotherapy, and rises with the degree of improvement after both psychotherapy and drug treatment. Some predictive failures have been theoretically interesting: Kris's hypothesis about creativity and the controlled use of primary process holds for males but usually not for females. This body of work is presented as a refutation of charges, brought by such critics as Crews, that psychoanalysis cannot become a science. PMID:12206540

  3. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    NASA Astrophysics Data System (ADS)

    Michalska, J.; Chmiela, B.

    2014-03-01

    The purpose of the research was to work out the qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. For qualitative purposes, SEM observations and EDS analyses were employed. Quantitative analysis of phases was performed by two methods: electron backscattered diffraction (EBSD) and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods, and image acquisition. The advantages and disadvantages of the applied methods are pointed out, and the accuracy of the phase analysis performed by both methods is compared.

  4. A method for quantitatively estimating diffuse and discrete hydrothermal discharge

    NASA Astrophysics Data System (ADS)

    Baker, Edward T.; Massoth, Gary J.; Walker, Sharon L.; Embley, Robert W.

    1993-07-01

    Submarine hydrothermal fluids discharge as undiluted, high-temperature jets and as diffuse, highly diluted, low-temperature percolation. Estimates of the relative contribution of each discharge type, which are important for the accurate determination of local and global hydrothermal budgets, are difficult to obtain directly. In this paper we describe a new method of using measurements of hydrothermal tracers such as Fe/Mn, Fe/heat, and Mn/heat in high-temperature fluids, low-temperature fluids, and the neutrally buoyant plume to deduce the relative contribution of each discharge type. We sampled vent fluids from the north Cleft vent field on the Juan de Fuca Ridge in 1988, 1989 and 1991, and plume samples every year from 1986 to 1991. The tracers were, on average, 3 to 90 times greater in high-temperature than in low-temperature fluids, with plume values intermediate. A mixing model calculates that high-temperature fluids contribute only ~3% of the fluid mass flux but > 90% of the hydrothermal Fe and > 60% of the hydrothermal Mn to the overlying plume. Three years of extensive camera-CTD sled tows through the vent field show that diffuse venting is restricted to a narrow fissure zone extending for 18 km along the axial strike. Linear plume theory applied to the temperature plumes detected when the sled crossed this zone yields a maximum likelihood estimate for the diffuse heat flux of 8.9 × 10^4 W/m, for a total flux of 534 MW, considering that diffuse venting is active along only one-third of the fissure system. For mean low- and high-temperature discharge of 25°C and 319°C, respectively, the discrete heat flux must be 266 MW to satisfy the mass flux partitioning. If the north Cleft vent field is globally representative, the assumption that high-temperature discharge dominates the mass flux in axial vent fields leads to an overestimation of the flux of many non-conservative hydrothermal species by about an order of magnitude.
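
    The tracer partitioning rests on a two-endmember linear mixing model: a plume tracer value intermediate between the high- and low-temperature endmember values fixes the fractional contribution of each. A minimal sketch with purely illustrative tracer values (not the paper's Fe/Mn data):

```python
def high_t_fraction(plume, high_t, low_t):
    """Mass fraction contributed by the high-temperature endmember under
    linear two-endmember mixing: plume = f*high_t + (1 - f)*low_t."""
    return (plume - low_t) / (high_t - low_t)

def tracer_share(f, high_t, low_t):
    """Share of the plume tracer budget carried by the high-T endmember,
    given its mass fraction f and the two endmember tracer values."""
    return f * high_t / (f * high_t + (1 - f) * low_t)

# Even a small high-T mass fraction dominates the tracer budget when the
# endmember values differ by a large factor (illustrative numbers only).
f = high_t_fraction(30.0, 90.0, 1.0)   # mass fraction from a plume value
share = tracer_share(0.03, 90.0, 1.0)  # tracer share of a 3% mass fraction
```

    This inversion is why a ~3% mass-flux contribution can still supply most of the plume's Fe: the tracer share scales with f times the endmember ratio, not with f alone.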

  5. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  6. The Ten Beads Method: A Novel Way to Collect Quantitative Data in Rural Uganda

    PubMed Central

    Bwambale, Francis Mulekya; Moyer, Cheryl A.; Komakech, Innocent; Wabwire-Mangen, Fred; Lori, Jody R

    2013-01-01

    This paper illustrates how locally appropriate methods can be used to collect quantitative data from illiterate respondents. This method uses local beads to represent quantities, which is a novel yet potentially valuable methodological improvement over standard Western survey methods. PMID:25170477

  7. The Quantitative Methods Boot Camp: Teaching Quantitative Thinking and Computing Skills to Graduate Students in the Life Sciences

    PubMed Central

    Stefan, Melanie I.; Gutlerner, Johanna L.; Born, Richard T.; Springer, Michael

    2015-01-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a “boot camp” in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students’ engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  8. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    PubMed

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  9. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, Theodore H. H.

    1991-01-01

    The following tasks on the study of advanced stress analysis methods applicable to turbine engine structures are described: (1) constructions of special elements which contain traction-free circular boundaries; (2) formulation of new version of mixed variational principles and new version of hybrid stress elements; (3) establishment of methods for suppression of kinematic deformation modes; (4) construction of semiLoof plate and shell elements by assumed stress hybrid method; and (5) elastic-plastic analysis by viscoplasticity theory using the mechanical subelement model.

  10. Multiparametric monitoring of chemotherapy treatment response in locally advanced breast cancer using quantitative ultrasound and diffuse optical spectroscopy

    PubMed Central

    Tran, William T.; Childs, Charmaine; Chin, Lee; Slodkowska, Elzbieta; Sannachi, Lakshmanan; Tadayyon, Hadi; Watkins, Elyse; Wong, Sharon Lemon; Curpen, Belinda; Kaffas, Ahmed El; Al-Mahrouki, Azza; Sadeghi-Naini, Ali; Czarnota, Gregory J.

    2016-01-01

    Purpose: This study evaluated pathological response to neoadjuvant chemotherapy using quantitative ultrasound (QUS) and diffuse optical spectroscopy imaging (DOSI) biomarkers in locally advanced breast cancer (LABC). Materials and Methods: The institution's ethics review board approved this study. Subjects (n = 22) gave written informed consent prior to participating. US and DOSI data were acquired, relative to the start of neoadjuvant chemotherapy, at weeks 0, 1, 4, 8 and preoperatively. QUS parameters including the mid-band fit (MBF), 0-MHz intercept (SI), and the spectral slope (SS) were determined from tumor ultrasound data using spectral analysis. In the same patients, DOSI was used to measure parameters relating to tumor hemoglobin and composition. Discriminant analysis and receiver-operating characteristic (ROC) analysis was used to classify clinical and pathological response during treatment and to estimate the area under the curve (AUC). Additionally, multivariate analysis was carried out for pairwise QUS/DOSI parameter combinations using a logistic regression model. Results: Individual QUS and DOSI parameters, including the SI, oxy-hemoglobin (HbO2), and total hemoglobin (HbT), were significant markers for response after one week of treatment (p < 0.01). Multivariate (pairwise) combinations increased the sensitivity, specificity and AUC at this time; the SI + HbO2 showed a sensitivity/specificity of 100%, and an AUC of 1.0. Conclusions: QUS and DOSI demonstrated potential as coincident markers for treatment response and may potentially facilitate response-guided therapies. Multivariate QUS and DOSI parameters increased the sensitivity and specificity of classifying LABC patients as early as one week after treatment. PMID:26942698

  11. Task 4.4 - development of supercritical fluid extraction methods for the quantitation of sulfur forms in coal

    SciTech Connect

    Timpe, R.C.

    1995-04-01

    Development of advanced fuel forms depends on having reliable quantitative methods for their analysis. Determination of the true chemical forms of sulfur in coal is necessary to develop more effective methods to reduce sulfur content. Past work at the Energy & Environmental Research Center (EERC) indicates that sulfur chemistry has broad implications in combustion, gasification, pyrolysis, liquefaction, and coal-cleaning processes. Current analytical methods are inadequate for accurately measuring sulfur forms in coal. This task was concerned with developing methods to quantitate and identify major sulfur forms in coal based on direct measurement (as opposed to present techniques based on indirect measurement and difference values). The focus was on the forms that were least understood and for which the analytical methods have been the poorest, i.e., organic and elemental sulfur. Improved measurement techniques for sulfatic and pyritic sulfur also need to be developed. A secondary goal was to understand the interconversion of sulfur forms in coal during thermal processing. EERC has developed the first reliable analytical method for extracting and quantitating elemental sulfur from coal (1). This method has demonstrated that elemental sulfur can account for very little or as much as one-third of the so-called organic sulfur fraction. This method has disproved the generally accepted idea that elemental sulfur is associated with the organic fraction. A paper reporting the results obtained on this subject entitled "Determination of Elemental Sulfur in Coal by Supercritical Fluid Extraction and Gas Chromatography with Atomic Emission Detection" was published in Fuel (A).

  12. Revisiting the Quantitative-Qualitative Debate: Implications for Mixed-Methods Research

    PubMed Central

    SALE, JOANNA E. M.; LOHFELD, LYNNE H.; BRAZIL, KEVIN

    2015-01-01

    Health care research includes many studies that combine quantitative and qualitative methods. In this paper, we revisit the quantitative-qualitative debate and review the arguments for and against using mixed-methods. In addition, we discuss the implications stemming from our view, that the paradigms upon which the methods are based have a different view of reality and therefore a different view of the phenomenon under study. Because the two paradigms do not study the same phenomena, quantitative and qualitative methods cannot be combined for cross-validation or triangulation purposes. However, they can be combined for complementary purposes. Future standards for mixed-methods research should clearly reflect this recommendation. PMID:26523073

  13. Quantitative evaluation of proteins with bicinchoninic acid (BCA): resonance Raman and surface-enhanced resonance Raman scattering-based methods.

    PubMed

    Chen, Lei; Yu, Zhi; Lee, Youngju; Wang, Xu; Zhao, Bing; Jung, Young Mee

    2012-12-21

    A rapid and highly sensitive bicinchoninic acid (BCA) reagent-based protein quantitation tool was developed using competitive resonance Raman (RR) and surface-enhanced resonance Raman scattering (SERRS) methods. A chelation reaction between BCA and Cu+, which is reduced by protein in an alkaline environment, is exploited to create a BCA-Cu+ complex that has strong RR and SERRS activities. Using these methods, protein concentrations in solutions can be quantitatively measured at concentrations as low as 50 μg/mL (RR) and 10 pg/mL (SERRS). There are many advantages of using RR and SERRS-based assays. These assays exhibit a much wider linear concentration range and provide an additional one (RR method) to four (SERRS method) orders of magnitude increase in detection limits relative to UV-based methods. Protein-to-protein variation is determined by reference to a BSA standard curve and exhibits excellent recoveries. These novel methods are extremely accurate in detecting total protein concentrations in solution. This improvement in protein detection sensitivity could yield advances in the biological sciences and medical diagnostic field and extend the applications of reagent-based protein assay techniques. PMID:23099478

  14. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  15. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to reach 30-50 target molecules, which is close to theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using Bt176 corn containing test samples and applying Bt176 specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 ± 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials. PMID:11767156
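
    The dependence of the LOQ percentage on genome size follows from copy counting: 200 ng of genomic DNA contains far fewer copies of a large genome than of a small one, so a fixed LOQ of 30-50 target molecules is a larger fraction of the total. A sketch under standard assumptions (660 g/mol per base pair; the genome sizes below are round illustrative figures, not the paper's values):

```python
AVOGADRO = 6.022e23
BP_MOLAR_MASS = 660.0  # g/mol per base pair of double-stranded DNA

def genome_copies(dna_mass_ng, genome_size_bp):
    """Haploid genome copies contained in a given mass of genomic DNA."""
    grams_per_genome = genome_size_bp * BP_MOLAR_MASS / AVOGADRO
    return dna_mass_ng * 1e-9 / grams_per_genome

def loq_percent(loq_copies, dna_mass_ng, genome_size_bp):
    """LOQ expressed as a percentage of the genome copies in the reaction."""
    return 100.0 * loq_copies / genome_copies(dna_mass_ng, genome_size_bp)

# Rice (~0.43 Gb) vs. wheat (~16 Gb) in 200 ng of DNA with a 40-copy LOQ:
rice = loq_percent(40, 200, 4.3e8)    # on the order of 0.01%
wheat = loq_percent(40, 200, 1.6e10)  # roughly 0.3-0.4%
```

    The calculation reproduces the order-of-magnitude spread reported in the abstract: the same molecular LOQ translates to a far larger percentage for large-genome crops like wheat than for rice.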

  16. Advanced preservation methods and nutrient retention in fruits and vegetables.

    PubMed

    Barrett, Diane M; Lloyd, Beate

    2012-01-15

    Despite the recommendations of international health organizations and scientific research carried out around the world, consumers do not take in sufficient quantities of healthy fruit and vegetable products. The use of new, 'advanced' preservation methods creates a unique opportunity for food manufacturers to retain nutrient content similar to that found in fresh fruits and vegetables. This review presents a summary of the published literature regarding the potential of high-pressure and microwave preservation, the most studied of the 'advanced' processes, to retain the natural vitamin A, B, C, phenolic, mineral and fiber content in fruits and vegetables at the time of harvest. Comparisons are made with more traditional preservation methods that utilize thermal processing. Case studies on specific commodities which have received the most attention are highlighted; these include apples, carrots, oranges, tomatoes and spinach. In addition to summarizing the literature, the review includes a discussion of postharvest losses in general and factors affecting nutrient losses in fruits and vegetables. Recommendations are made for future research required to evaluate these advanced process methods. PMID:22102258

  17. A simple method for the subnanomolar quantitation of seven ophthalmic drugs in the rabbit eye.

    PubMed

    Latreille, Pierre-Luc; Banquy, Xavier

    2015-05-01

    This study describes the development and validation of a new liquid chromatography-tandem mass spectrometry (MS/MS) method capable of simultaneous quantitation of seven ophthalmic drugs (pilocarpine, lidocaine, atropine, proparacaine, timolol, prednisolone, and triamcinolone acetonide) within regions of the rabbit eye. The complete validation of the method was performed using an Agilent 1100 series high-performance liquid chromatography system coupled to a 4000 QTRAP MS/MS detector in positive TurboIonSpray mode with pooled drug solutions. Method sensitivity was evaluated in two simulated matrices, yielding lower limits of quantitation of 0.25 nmol/L for most of the drugs. The precision in the low, medium, and high ranges of the calibration curves, the freeze-thaw stability over 1 month, the intraday precision, and the interday precision were all within a 15% limit. The method was used to quantitate the different drugs in the cornea, aqueous humor, vitreous humor, and remaining eye tissues of the rabbit eye. It was validated to a concentration of up to 1.36 ng/g in humors and 5.43 ng/g in tissues. The unprecedented low detection limit of the present method and its ease of implementation allow easy, robust, and reliable quantitation of multiple drugs for rapid in vitro and in vivo evaluation of the local pharmacokinetics of these compounds. PMID:25749792

  18. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    ERIC Educational Resources Information Center

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  19. Student Performance in a Quantitative Methods Course under Online and Face-to-Face Delivery

    ERIC Educational Resources Information Center

    Verhoeven, Penny; Wakeling, Victor

    2011-01-01

    In a study conducted at a large public university, the authors assessed, for an upper-division quantitative methods business core course, the impact of delivery method (online versus face-to-face) on the success rate (percentage of enrolled students earning a grade of A, B, or C in the course). The success rate of the 161 online students was 55.3%,…

  20. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    ERIC Educational Resources Information Center

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a particular…

  1. A method for the quantitative determination of crystalline phases by X-ray

    NASA Technical Reports Server (NTRS)

    Petzenhauser, I.; Jaeger, P.

    1988-01-01

    A mineral analysis method is described for rapid quantitative determination of crystalline substances in those cases in which the sample is present in pure form or in a mixture of known composition. With this method there is no need for prior chemical analysis.

  2. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    ERIC Educational Resources Information Center

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…

  3. Improved Methods for Capture, Extraction, and Quantitative Assay of Environmental DNA from Asian Bigheaded Carp (Hypophthalmichthys spp.)

    PubMed Central

    Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207

  4. Advances in surface plasmon resonance imaging enable quantitative tracking of nanoscale changes in thickness and roughness.

    PubMed

    Raegen, Adam N; Reiter, Kyle; Dion, Alexander; Clarke, Anthony J; Lipkowski, Jacek; Dutcher, John R

    2014-04-01

    To date, detailed studies of the thickness of coatings using surface plasmon resonance have been limited to samples that are very uniform in thickness, and this technique has not been applied quantitatively to samples that are inherently rough or undergo instabilities with time. Our manuscript describes a significant improvement to surface plasmon resonance imaging (SPRi) that allows this sensitive technique to be used for quantitative tracking of the thickness and roughness of surface coatings that are rough on the scale of tens of nanometers. We tested this approach by studying samples with an idealized, one-dimensional roughness: patterned channels in a thin polymer film. We find that a novel analysis of the SPRi data collected with the plane of incidence parallel to the patterned channels allows the determination of the thickness profile of the channels in the polymer film, which is in agreement with that measured using atomic force microscopy. We have further validated our approach by performing SPRi measurements perpendicular to the patterned channels, for which the measured SPR curve agrees well with the single SPR curve calculated using the average thickness determined from the thickness profile as determined using AFM. We applied this analysis technique to track the average thickness and RMS roughness of cellulose microfibrils upon exposure to cellulolytic enzymes, providing quantitative determinations of the times of action of the enzymes that are of direct interest to the cellulosic ethanol industry. PMID:24605881

  5. Immunoassay Methods and their Applications in Pharmaceutical Analysis: Basic Methodology and Recent Advances

    PubMed Central

    Darwish, Ibrahim A.

    2006-01-01

    Immunoassays are bioanalytical methods in which the quantitation of the analyte depends on the reaction of an antigen (analyte) and an antibody. Immunoassays have been widely used in many important areas of pharmaceutical analysis such as diagnosis of diseases, therapeutic drug monitoring, clinical pharmacokinetic and bioequivalence studies in drug discovery and pharmaceutical industries. The importance and widespread use of immunoassay methods in pharmaceutical analysis are attributed to their inherent specificity, high throughput, and high sensitivity for the analysis of a wide range of analytes in biological samples. Recently, marked improvements were achieved in the field of immunoassay development for the purposes of pharmaceutical analysis. These improvements involved the preparation of unique immunoanalytical reagents, analysis of new categories of compounds, methodology, and instrumentation. The basic methodologies and recent advances in immunoassay methods applied in different fields of pharmaceutical analysis are reviewed. PMID:23674985

  6. A quantitative PCR method to quantify ruminant DNA in porcine crude heparin.

    PubMed

    Concannon, Sean P; Wimberley, P Brett; Workman, Wesley E

    2011-01-01

    Heparin is a well-known glycosaminoglycan extracted from porcine intestines. Increased vigilance for transmissible spongiform encephalopathy in animal-derived pharmaceuticals requires methods to prevent the introduction of heparin from ruminants into the supply chain. The sensitivity, specificity, and precision of the quantitative polymerase chain reaction (PCR) make it a superior analytical platform for screening heparin raw material for bovine-, ovine-, and caprine-derived material. A quantitative PCR probe and primer set homologous to the ruminant Bov-A2 short interspersed nuclear element (SINE) locus (Mendoza-Romero et al. J. Food Prot. 67:550-554, 2004) demonstrated nearly equivalent affinities for bovine, ovine, and caprine DNA targets, while exhibiting no cross-reactivity with porcine DNA in the quantitative PCR method. A second PCR primer and probe set, specific for the porcine PRE1 SINE sequence, was also developed to quantify the background porcine DNA level. DNA extraction and purification was not necessary for analysis of the raw heparin samples, although digestion of the sample with heparinase was employed. The method exhibits a quantitation range of 0.3-3,000 ppm ruminant DNA in heparin. Validation parameters of the method included accuracy, repeatability, precision, specificity, range, quantitation limit, and linearity. PMID:21058016
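
    Reporting the result as ppm of ruminant DNA in heparin is a mass-fraction conversion: once the Bov-A2 assay's standard curve returns a DNA mass, 1 ng of DNA per mg of heparin equals 1 ppm. The sketch below uses hypothetical standard-curve constants and function names, not the published assay's values:

```python
def dna_ng_from_cq(cq, intercept=36.0, slope=-3.4):
    """DNA mass (ng) from a qPCR standard curve Cq = intercept + slope*log10(ng).
    The intercept and slope here are hypothetical placeholders."""
    return 10 ** ((cq - intercept) / slope)

def ppm_ruminant(dna_ng, heparin_mg):
    """Mass fraction in ppm: 1 ng per mg is 1e-9 g / 1e-3 g = 1e-6 = 1 ppm,
    so the ratio ng/mg is already the ppm value."""
    return dna_ng / heparin_mg

# 2 ng of ruminant DNA detected in 4 mg of digested heparin is 0.5 ppm,
# within the method's stated 0.3-3,000 ppm quantitation range.
level = ppm_ruminant(2.0, 4.0)
```

    The same arithmetic applied to the porcine PRE1 signal gives the background DNA level against which a ruminant contamination call is judged.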

  7. Rapid quantitative pharmacodynamic imaging by a novel method: theory, simulation testing and proof of principle.

    PubMed

    Black, Kevin J; Koller, Jonathan M; Miller, Brad D

    2013-01-01

    Pharmacological challenge imaging has mapped, but rarely quantified, the sensitivity of a biological system to a given drug. We describe a novel method called rapid quantitative pharmacodynamic imaging. This method combines pharmacokinetic-pharmacodynamic modeling, repeated small doses of a challenge drug over a short time scale, and functional imaging to rapidly provide quantitative estimates of drug sensitivity including EC50 (the concentration of drug that produces half the maximum possible effect). We first test the method with simulated data, assuming a typical sigmoidal dose-response curve and assuming imperfect imaging that includes artifactual baseline signal drift and random error. With these few assumptions, rapid quantitative pharmacodynamic imaging reliably estimates EC50 from the simulated data, except when noise overwhelms the drug effect or when the effect occurs only at high doses. In preliminary fMRI studies of primate brain using a dopamine agonist, the observed noise level is modest compared with observed drug effects, and a quantitative EC50 can be obtained from some regional time-signal curves. Taken together, these results suggest that research and clinical applications for rapid quantitative pharmacodynamic imaging are realistic. PMID:23940831
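The EC50-recovery step described above can be sketched by fitting a sigmoidal Emax model to simulated noisy response data. The model form, doses, and noise level here are illustrative assumptions, not the authors' implementation.

```python
# Sketch: recovering EC50 by least-squares fitting of a sigmoidal Emax model
# to synthetic noisy measurements. All numbers are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def emax_model(c, e0, emax, ec50):
    """Sigmoidal concentration-effect curve: E = E0 + Emax * C / (EC50 + C)."""
    return e0 + emax * c / (ec50 + c)

rng = np.random.default_rng(0)
true_ec50 = 2.0
conc = np.linspace(0.1, 10, 25)                  # cumulative small doses
signal = emax_model(conc, 1.0, 5.0, true_ec50)
noisy = signal + rng.normal(0, 0.1, conc.size)   # imaging noise

popt, _ = curve_fit(emax_model, conc, noisy, p0=[0.5, 3.0, 1.0])
print(f"estimated EC50 = {popt[2]:.2f} (true {true_ec50})")
```

When noise is large relative to the drug effect, or doses never approach EC50, the fit degrades, which matches the failure modes the abstract reports.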

  8. A rapid, sensitive, and selective method for quantitation of lamprey migratory pheromones in river water.

    PubMed

    Stewart, Michael; Baker, Cindy F; Cooney, Terry

    2011-11-01

    The methodology of using fish pheromones, or chemical signatures, as a tool to monitor or manage species of fish is rapidly gaining popularity. Unequivocal detection and accurate quantitation of extremely low concentrations of these chemicals in natural waters is paramount to using this technique as a management tool. Various species of lamprey are known to produce a mixture of three important migratory pheromones: petromyzonol sulfate (PS), petromyzonamine disulfate (PADS), and petromyzosterol disulfate (PSDS), but presently there are no established robust methods for quantitation of all three pheromones. In this study, we report a new, highly sensitive and selective method for the rapid identification and quantitation of these pheromones in river water samples. The procedure is based on pre-concentration, followed by liquid chromatography/tandem mass spectrometry (LC/MS/MS) analysis. The method is fast, with unambiguous pheromone determination. Practical quantitation limits of 0.25 ng/l were achieved for PS and PADS and 2.5 ng/l for PSDS in river water, using a 200-fold pre-concentration. However, lower quantitation limits can be achieved with greater pre-concentration. The methodology can be modified easily to include other chemicals of interest. Furthermore, the pre-concentration step can be applied easily in the field, circumventing potential stability issues of these chemicals. PMID:22076684

  9. Mixed methods in gerontological research: Do the qualitative and quantitative data “touch”?

    PubMed Central

    Happ, Mary Beth

    2010-01-01

    This paper distinguishes between parallel and integrated mixed methods research approaches. Barriers to integrated mixed methods approaches in gerontological research are discussed and critiqued. The author presents examples of mixed methods gerontological research to illustrate approaches to data integration at the levels of data analysis, interpretation, and research reporting. As a summary of the methodological literature, four basic levels of mixed methods data combination are proposed. Opportunities for mixing qualitative and quantitative data are explored using contemporary examples from published studies. Data transformation and visual display, judiciously applied, are proposed as pathways to fuller mixed methods data integration and analysis. Finally, practical strategies for mixing qualitative and quantitative data types are explicated as gerontological research moves beyond parallel mixed methods approaches to achieve data integration. PMID:20077973

  10. Advanced reactor physics methods for heterogeneous reactor cores

    NASA Astrophysics Data System (ADS)

    Thompson, Steven A.

    To maintain the economic viability of nuclear power, the industry has begun to emphasize maximizing the efficiency and output of existing nuclear power plants by using longer fuel cycles, stretch power uprates, shorter outage lengths, mixed-oxide (MOX) fuel and more aggressive operating strategies. In order to accommodate these changes, while still satisfying the peaking factor and power envelope requirements necessary to maintain safe operation, more complexity has been introduced into commercial core designs, such as an increase in the number of sub-batches and an increase in the use of both discrete and integral burnable poisons. A consequence of the increased complexity of core designs, as well as the use of MOX fuel, is an increase in the neutronic heterogeneity of the core. Such heterogeneous cores introduce challenges for the current methods that are used for reactor analysis. New methods must be developed to address these deficiencies while still maintaining the computational efficiency of existing reactor analysis methods. In this thesis, advanced core design methodologies are developed to adequately analyze the highly heterogeneous core designs which are currently in use in commercial power reactors. These methodological improvements are being pursued with the goal of not sacrificing the computational efficiency which core designers require. More specifically, the PSU nodal code NEM is being updated to include an SP3 solution option, an advanced transverse leakage option, and a semi-analytical NEM solution option.

  11. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-01

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed. PMID:26928571
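A minimal sketch of how a distance-based readout can be turned into a concentration, assuming a linear calibration of signal length against log concentration; the calibration points and units below are hypothetical.

```python
# Sketch: reading a target concentration from a distance-based signal using a
# linear calibration of bar length vs. log10(concentration). All values are
# hypothetical illustrations.
import numpy as np

# Hypothetical calibration standards: signal length (mm) per concentration.
log_c = np.log10([1, 10, 100, 1000])       # e.g. ng/mL standards
dist = np.array([5.0, 15.0, 25.0, 35.0])   # measured bar lengths (mm)
m, b = np.polyfit(log_c, dist, 1)          # dist = m * log10(C) + b

def distance_to_conc(d_mm):
    """Invert the calibration line: length measured by eye -> concentration."""
    return 10 ** ((d_mm - b) / m)

print(f"20 mm bar -> {distance_to_conc(20.0):.0f} ng/mL")
```

The same inversion works for any monotonic length-concentration relationship; a nonlinear device would simply swap in a different calibration model.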

  12. Establishment and evaluation of event-specific quantitative PCR method for genetically modified soybean MON89788.

    PubMed

    Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Kitta, Kazumi

    2010-01-01

    A novel real-time PCR-based analytical method was established for the event-specific quantification of the GM soybean event MON89788. The conversion factor (Cf), which is required to calculate the GMO amount, was experimentally determined. The quantitative method was evaluated by a single-laboratory analysis and a blind test in a multi-laboratory trial. The limit of quantitation for the method was estimated to be 0.1% or lower. Trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDR), and the determined bias and RSDR values for the method were both less than 20%. These results suggest that the established method would be suitable for practical detection and quantification of MON89788. PMID:21071908
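The conversion-factor calculation can be sketched in a few lines; the copy numbers and resulting Cf below are hypothetical examples, not values from the validation.

```python
# Sketch: event-specific GMO quantitation with a conversion factor (Cf).
# All copy numbers and the Cf value are hypothetical illustrations.

def gmo_percent(event_copies, endogenous_copies, cf):
    """GMO amount (%) = (event / endogenous copy-number ratio) / Cf * 100."""
    return (event_copies / endogenous_copies) / cf * 100.0

# Cf is the event/endogenous ratio measured in 100% GM seed, determined
# experimentally; the value here is invented for illustration.
cf = 950.0 / 2000.0            # -> 0.475
sample = gmo_percent(19.0, 2000.0, cf)
print(f"{sample:.2f}% GM soybean")
```

Dividing by Cf normalizes out the event-specific amplification ratio, so the result is expressed as a percentage of GM material rather than a raw copy ratio.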

  13. Advanced magnetic resonance imaging techniques in the preterm brain: methods and applications.

    PubMed

    Tao, Joshua D; Neil, Jeffrey J

    2014-01-01

    Brain development and brain injury in preterm infants are areas of active research. Magnetic resonance imaging (MRI), a non-invasive tool applicable to both animal models and human infants, provides a wealth of information on this process by bridging the gap between histology (available from animal studies) and developmental outcome (available from clinical studies). Moreover, MRI also offers information regarding diagnosis and prognosis in the clinical setting. Recent advances in MR methods - diffusion tensor imaging, volumetric segmentation, surface-based analysis, functional MRI, and quantitative metrics - further increase the sophistication of information available regarding both brain structure and function. In this review, we discuss the basics of these newer methods as well as their application to the study of premature infants. PMID:25055864

  14. Advanced fluorescence microscopy methods for the real-time study of transcription and chromatin dynamics

    PubMed Central

    Annibale, Paolo; Gratton, Enrico

    2014-01-01

    In this contribution we provide an overview of the recent advances allowed by the use of fluorescence microscopy methods in the study of transcriptional processes and their interplay with the chromatin architecture in living cells. Although the use of fluorophores to label nucleic acids dates back at least half a century [1], two recent breakthroughs have effectively opened the way to use fluorescence routinely for specific and quantitative probing of chromatin organization and transcriptional activity in living cells: namely, the possibility of labeling first the chromatin loci and then the mRNA synthesized from a gene using fluorescent proteins. In this contribution we focus on methods that can probe rapid dynamic processes by analyzing fast fluorescence fluctuations. PMID:25764219

  15. A MALDI-MS-based quantitative analytical method for endogenous estrone in human breast cancer cells.

    PubMed

    Kim, Kyoung-Jin; Kim, Hee-Jin; Park, Han-Gyu; Hwang, Cheol-Hwan; Sung, Changmin; Jang, Kyoung-Soon; Park, Sung-Hee; Kim, Byung-Gee; Lee, Yoo-Kyung; Yang, Yung-Hun; Jeong, Jae Hyun; Kim, Yun-Gon

    2016-01-01

    The level of endogenous estrone, one of the three major naturally occurring estrogens, has a significant correlation with the incidence of post-menopausal breast cancer. However, it is challenging to quantitatively monitor it owing to its low abundance. Here, we develop a robust and highly sensitive matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS)-based quantitative platform to identify the absolute quantities of endogenous estrones in a variety of clinical specimens. The one-step modification of endogenous estrone provided good linearity (R2 > 0.99) and significantly increased the sensitivity of the platform (limit of quantitation: 11 fmol). In addition, we could identify the absolute amount of endogenous estrones in cells of the breast cancer cell line MCF-7 (34 fmol/10^6 cells) by using a deuterated estrone as an internal standard. Finally, by applying the MALDI-MS-based quantitative method to endogenous estrones, we successfully monitored changes in the metabolic expression level of estrones (17.7 fmol/10^6 letrozole-treated cells) in MCF-7 cells resulting from treatment with an aromatase inhibitor. Taken together, these results suggest that this MALDI-MS-based quantitative approach may be a general method for the targeted metabolomics of ketone-containing metabolites, which can reflect clinical conditions and pathogenic mechanisms. PMID:27091422
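The internal-standard step can be sketched in a few lines. Peak intensities and the spiked amount below are hypothetical, and equal response for estrone and its deuterated analogue is assumed, which is the usual premise of isotope dilution.

```python
# Sketch: isotope-dilution quantitation with a deuterated internal standard,
# as in the MALDI-MS workflow above. All numbers are hypothetical.

def analyte_amount(i_analyte, i_istd, istd_amount_fmol):
    """Amount = (analyte / internal-standard intensity ratio) * spiked amount.

    Assumes the analyte and its deuterated analogue ionize with equal
    efficiency, so the intensity ratio tracks the molar ratio."""
    return (i_analyte / i_istd) * istd_amount_fmol

# Hypothetical spectrum: estrone peak at 68% of the deuterated-estrone peak,
# with 50 fmol of internal standard spiked into the cell extract.
amount = analyte_amount(3400.0, 5000.0, 50.0)
print(f"endogenous estrone ~ {amount:.0f} fmol")
```

Because analyte and standard experience the same matrix effects and losses, the ratio-based readout is far more robust than an external calibration of absolute peak heights.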

  16. A MALDI-MS-based quantitative analytical method for endogenous estrone in human breast cancer cells

    PubMed Central

    Kim, Kyoung-Jin; Kim, Hee-Jin; Park, Han-Gyu; Hwang, Cheol-Hwan; Sung, Changmin; Jang, Kyoung-Soon; Park, Sung-Hee; Kim, Byung-Gee; Lee, Yoo-Kyung; Yang, Yung-Hun; Jeong, Jae Hyun; Kim, Yun-Gon

    2016-01-01

    The level of endogenous estrone, one of the three major naturally occurring estrogens, has a significant correlation with the incidence of post-menopausal breast cancer. However, it is challenging to quantitatively monitor it owing to its low abundance. Here, we develop a robust and highly sensitive mass-assisted laser desorption/ionization mass spectrometry (MALDI-MS)-based quantitative platform to identify the absolute quantities of endogenous estrones in a variety of clinical specimens. The one-step modification of endogenous estrone provided good linearity (R2 > 0.99) and significantly increased the sensitivity of the platform (limit of quantitation: 11 fmol). In addition, we could identify the absolute amount of endogenous estrones in cells of the breast cancer cell line MCF-7 (34 fmol/10^6 cells) by using a deuterated estrone as an internal standard. Finally, by applying the MALDI-MS-based quantitative method to endogenous estrones, we successfully monitored changes in the metabolic expression level of estrones (17.7 fmol/10^6 letrozole-treated cells) in MCF-7 cells resulting from treatment with an aromatase inhibitor. Taken together, these results suggest that this MALDI-MS-based quantitative approach may be a general method for the targeted metabolomics of ketone-containing metabolites, which can reflect clinical conditions and pathogenic mechanisms. PMID:27091422

  17. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest.
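The NSR figure of merit itself is simple once the linear-model parameters have been estimated. A sketch, with hypothetical slope and noise values standing in for the output of an NGS/RWT-style estimation procedure:

```python
# Sketch: ranking quantitation methods by the noise-to-slope ratio (NSR).
# The per-method (slope, noise sd) pairs are assumed to have already been
# estimated without ground truth; the method names and numbers here are
# hypothetical illustrations.

def nsr(slope, noise_sd):
    """Noise-to-slope ratio: lower NSR means better precision."""
    return noise_sd / slope

methods = {
    "reconstruction A": (0.98, 0.05),
    "reconstruction B": (0.90, 0.09),
    "reconstruction C": (0.85, 0.15),
}
ranking = sorted(methods, key=lambda m: nsr(*methods[m]))
print("best-to-worst precision:", ranking)
```

Dividing the noise standard deviation by the slope expresses measurement noise in units of the true quantity, so methods with different calibration scales can be compared fairly.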

  18. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Caffo, Brian; Frey, Eric C.

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest.

  19. Advanced measurement and quantitative appraise of anisodamine on calcium triggered in cardiac myocyte.

    PubMed

    Tian-Nan, Wang; Hui-Juan, Yang; Gu-Ling; Jie-Yue, Li; Xiao-Xiang, Zheng

    2005-01-01

    Patch clamp recording and laser scanning confocal microscopy (LSCM) were applied to quantitatively appraise the cardioprotective effects of anisodamine and to study its mechanism. Cell viability was assessed by MTT assay, and Fluo-3/AM was used with LSCM for real-time measurement of free calcium; the L-type calcium current (ICa,L) under anisodamine exposure was measured by the whole-cell patch clamp recording technique. KCl-triggered calcium elevation was decreased by anisodamine at doses of 0.04 mmol/L and 0.08 mmol/L, by 46.6% and 54.3%, respectively. At 0.08 mmol/L, anisodamine markedly inhibited L-type Ca2+ current density in a time-dependent manner, with a decrease of (34.8±7.9)% (n=6, P<0.01), from -4.474 pA/pF to -2.882 pA/pF; the V1/2 of the current inactivation curve shifted from -14.7 mV to -28.4 mV and the V1/2 of the current activation curve from -15.6 mV to -9.51 mV. Our results suggest that anisodamine confers cardioprotection by decreasing calcium elevation, most likely through inhibition of the L-type calcium channel. PMID:17282068

  20. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  1. The Use of Quantitative Methods as an Aid to Decision Making in Educational Administration.

    ERIC Educational Resources Information Center

    Alkin, Marvin C.

    Three quantitative methods are outlined, with suggestions for application to particular problem areas of educational administration: (1) The Leontief input-output analysis, incorporating a "transaction table" for displaying relationships between economic outputs and inputs, mainly applicable to budget analysis and planning; (2) linear programming,…

  2. A simple method for quantitative diagnosis of small hive beetles, Aethina tumida, in the field

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Here we present a simple and fast method for quantitative diagnosis of small hive beetles (= SHB) in honeybee field colonies using corrugated plastic “diagnostic-strips”. In Australia, we evaluated its efficacy by comparing the number of lured SHB with the total number of beetles in the hives. The d...

  3. Examination of Quantitative Methods Used in Early Intervention Research: Linkages with Recommended Practices.

    ERIC Educational Resources Information Center

    Snyder, Patricia; Thompson, Bruce; McLean, Mary E.; Smith, Barbara J.

    2002-01-01

    Findings are reported related to the research methods and statistical techniques used in 450 group quantitative studies examined by the Council for Exceptional Children's Division for Early Childhood Recommended Practices Project. Studies were analyzed across seven dimensions including sampling procedures, variable selection, variable definition,…

  4. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  5. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  6. Qualitative and Quantitative Research Methods: Old Wine in New Bottles? On Understanding and Interpreting Educational Phenomena

    ERIC Educational Resources Information Center

    Smeyers, Paul

    2008-01-01

    Generally educational research is grounded in the empirical traditions of the social sciences (commonly called quantitative and qualitative methods) and is as such distinguished from other forms of scholarship such as theoretical, conceptual or methodological essays, critiques of research traditions and practices and those studies grounded in the…

  7. Improved GC/MS method for quantitation of n-Alkanes in plant and fecal material

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A gas chromatography-mass spectrometry (GC/MS) method for the quantitation of n-alkanes (carbon backbones ranging from 21 to 36 carbon atoms) in forage and fecal samples has been developed. Automated solid-liquid extraction using elevated temperature and pressure minimized extraction time to 30 min...

  8. Implementation of a quantitative Foucault knife-edge method by means of isophotometry

    NASA Astrophysics Data System (ADS)

    Zhevlakov, A. P.; Zatsepina, M. E.; Kirillovskii, V. K.

    2014-06-01

    A detailed description of the stages of computer processing of shadowgrams in a modern quantitative Foucault knife-edge method is presented. A map of the wave-front aberrations introduced by errors of an optical surface or system is shown, together with the calculated set of required image-quality characteristics.

  9. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    PubMed Central

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-01-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output. PMID:26430292

  10. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    ERIC Educational Resources Information Center

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  11. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  12. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  13. Paradigms Lost and Pragmatism Regained: Methodological Implications of Combining Qualitative and Quantitative Methods

    ERIC Educational Resources Information Center

    Morgan, David L.

    2007-01-01

    This article examines several methodological issues associated with combining qualitative and quantitative methods by comparing the increasing interest in this topic with the earlier renewal of interest in qualitative research during the 1980s. The first section argues for the value of Kuhn's concept of paradigm shifts as a tool for examining…

  14. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  15. The Promise of Mixed-Methods for Advancing Latino Health Research

    PubMed Central

    Apesoa-Varano, Ester Carolina; Hinton, Ladson

    2015-01-01

    Mixed-methods research in the social sciences has been conducted for quite some time. More recently, mixed-methods have become popular in health research, with the National Institutes of Health leading the impetus to fund studies that implement such an approach. The public health issues facing us today are great and they range from policy and other macro-level issues, to systems level problems to individuals' health behaviors. For Latinos, who are projected to become the largest minority group bearing a great deal of the burden of social inequality in the U.S., it is important to understand the deeply-rooted nature of these health disparities in order to close the gap in health outcomes. Mixed-methodology thus holds promise for advancing research on Latino health by tackling health disparities from a variety of standpoints and approaches. The aim of this manuscript is to provide two examples of mixed methods research, each of which addresses a health topic of considerable importance to older Latinos and their families. These two examples will illustrate a) the complementary use of qualitative and quantitative methods to advance health of older Latinos in an area that is important from a public health perspective, and b) the “translation” of findings from observational studies (informed by social science and medicine) to the development and testing of interventions. PMID:23996325

  16. Methods and Systems for Advanced Spaceport Information Management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  18. Quantitative Analysis of Diverse Lactobacillus Species Present in Advanced Dental Caries

    PubMed Central

    Byun, Roy; Nadkarni, Mangala A.; Chhour, Kim-Ly; Martin, F. Elizabeth; Jacques, Nicholas A.; Hunter, Neil

    2004-01-01

    Our previous analysis of 65 advanced dental caries lesions by traditional culture techniques indicated that lactobacilli were numerous in the advancing front of the progressive lesion. Production of organic acids by lactobacilli is considered to be important in causing decalcification of the dentinal matrix. The present study was undertaken to define more precisely the diversity of lactobacilli found in this environment and to quantify the major species and phylotypes relative to total load of lactobacilli by real-time PCR. Pooled DNA was amplified by PCR with Lactobacillus genus-specific primers for subsequent cloning, sequencing, and phylogenetic analysis. Based on 16S ribosomal DNA sequence comparisons, 18 different phylotypes of lactobacilli were detected, including strong representation of both novel and gastrointestinal phylotypes. Specific PCR primers were designed for nine prominent species, including Lactobacillus gasseri, L. ultunensis, L. salivarius, L. rhamnosus, L. casei, L. crispatus, L. delbrueckii, L. fermentum, and L. gallinarum. More than three different species were identified as being present in most of the dentine samples, confirming the widespread distribution and numerical importance of various Lactobacillus spp. in carious dentine. Quantification by real-time PCR revealed various proportions of the nine species colonizing carious dentine, with higher mean loads of L. gasseri and L. ultunensis than of the other prevalent species. The findings provide a basis for further characterization of the pathogenicity of Lactobacillus spp. in the context of extension of the carious lesion. PMID:15243071

  19. Sensitivity analysis of infectious disease models: methods, advances and their application.

    PubMed

    Wu, Jianyong; Dhingra, Radhika; Gambhir, Manoj; Remais, Justin V

    2013-09-01

    Sensitivity analysis (SA) can aid in identifying influential model parameters and optimizing model structure, yet infectious disease modelling has been slow to adopt advanced SA techniques capable of providing considerably more insight than traditional methods. We investigate five global SA methods (scatter plots, the Morris and Sobol' methods, Latin hypercube sampling-partial rank correlation coefficient and the sensitivity heat map method) and detail their relative merits and pitfalls when applied to a microparasite (cholera) and macroparasite (schistosomiasis) transmission model. The methods investigated yielded similar results with respect to identifying influential parameters, but offered specific insights that varied by method. The classical methods differed in their ability to provide information on the quantitative relationship between parameters and model output, particularly over time. The heat map approach provides information about the group sensitivity of all model state variables, and the parameter sensitivity spectrum obtained using this method reveals the sensitivity of all state variables to each parameter over the course of the simulation period, especially valuable for expressing the dynamic sensitivity of a microparasite epidemic model to its parameters. A summary comparison is presented to aid infectious disease modellers in selecting appropriate methods, with the goal of improving model performance and design. PMID:23864497
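
    As an illustration of the LHS-PRCC approach named in this abstract, the sketch below Latin-hypercube samples a toy SIR model and computes partial rank correlation coefficients. The model, parameter ranges, and sample size are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.stats import qmc, rankdata

def sir_peak(beta, gamma, steps=160):
    # crude daily Euler integration of an SIR model; returns the epidemic peak
    s, i = 0.99, 0.01
    peak = i
    for _ in range(steps):
        new_inf = beta * s * i
        new_rec = gamma * i
        s -= new_inf
        i += new_inf - new_rec
        peak = max(peak, i)
    return peak

def prcc(X, y):
    # partial rank correlation: rank-transform, regress out the other
    # parameters, then correlate the residuals
    Xr = np.column_stack([rankdata(c) for c in X.T])
    yr = rankdata(y)
    coeffs = []
    for j in range(Xr.shape[1]):
        A = np.column_stack([np.delete(Xr, j, axis=1), np.ones(len(yr))])
        rx = Xr[:, j] - A @ np.linalg.lstsq(A, Xr[:, j], rcond=None)[0]
        ry = yr - A @ np.linalg.lstsq(A, yr, rcond=None)[0]
        coeffs.append(float(np.corrcoef(rx, ry)[0, 1]))
    return coeffs

# Latin hypercube sample: beta in [0.1, 0.5], gamma in [0.05, 0.25]
sample = qmc.scale(qmc.LatinHypercube(d=2, seed=0).random(200),
                   [0.1, 0.05], [0.5, 0.25])
peaks = np.array([sir_peak(b, g) for b, g in sample])
prcc_beta, prcc_gamma = prcc(sample, peaks)
```

    Because the peak grows with the transmission rate and shrinks with the recovery rate, the PRCC for beta comes out strongly positive and the one for gamma strongly negative.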

  20. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A summary of a new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is described. Results of analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship of concentration and fluorescence units that may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean) is provided.
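
    The abstract's quantitation equation is not reproduced here; the sketch below shows a generic linear fluorescence-to-concentration calibration of the kind described, with made-up standards (all numbers are hypothetical).

```python
import numpy as np

# hypothetical calibration standards: PHE concentration (ug/L) vs fluorescence
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
fluor = np.array([2.0, 12.5, 23.0, 44.0, 86.0])   # arbitrary fluorescence units

# fit the inverse relation so a field reading maps straight to concentration
slope, intercept = np.polyfit(fluor, conc, 1)

def phe_concentration(signal):
    # estimate PHE concentration from a sample's fluorescence reading
    return slope * signal + intercept
```

    In practice, separate calibration curves would be kept for the 575-580 nm and 560 nm fluorescence bands mentioned above.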

  1. Pressure ulcer prevention algorithm content validation: a mixed-methods, quantitative study.

    PubMed

    van Rijswijk, Lia; Beitz, Janice M

    2015-04-01

    Translating pressure ulcer prevention (PUP) evidence-based recommendations into practice remains challenging for a variety of reasons, including the perceived quality, validity, and usability of the research or the guideline itself. Following the development and face validation testing of an evidence-based PUP algorithm, additional stakeholder input and testing were needed. Using convenience sampling methods, wound care experts attending a national wound care conference and a regional wound ostomy continence nursing (WOCN) conference and/or graduates of a WOCN program were invited to participate in an Institutional Review Board-approved, mixed-methods quantitative survey with qualitative components to examine algorithm content validity. After participants provided written informed consent, demographic variables were collected and participants were asked to comment on and rate the relevance and appropriateness of each of the 26 algorithm decision points/steps using standard content validation study procedures. All responses were anonymous. Descriptive summary statistics, mean relevance/appropriateness scores, and the content validity index (CVI) were calculated. Qualitative comments were transcribed and thematically analyzed. Of the 553 wound care experts invited, 79 (average age 52.9 years, SD 10.1; range 23-73) consented to participate and completed the study (a response rate of 14%). Most (67, 85%) were female, registered (49, 62%) or advanced practice (12, 15%) nurses, and had > 10 years of health care experience (88, 92%). Other health disciplines included medical doctors, physical therapists, nurse practitioners, and certified nurse specialists. Almost all had received formal wound care education (75, 95%). On a Likert-type scale of 1 (not relevant/appropriate) to 4 (very relevant and appropriate), the average score for the entire algorithm/all decision points (N = 1,912) was 3.72 with an overall CVI of 0.94 (out of 1). The only decision point/step recommendation
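
    As a sketch of the content validity index computation mentioned above: the item-level CVI is the proportion of experts rating an item 3 or 4 on the 4-point scale, and the scale-level CVI averages the item values. The ratings below are invented for illustration, not taken from the study.

```python
import numpy as np

def item_cvi(ratings):
    # item-level CVI: proportion of experts rating the item 3 or 4 on the 1-4 scale
    r = np.asarray(ratings)
    return float((r >= 3).mean())

# hypothetical ratings from five experts for three algorithm decision points
ratings = [[4, 4, 3, 4, 2],
           [3, 4, 4, 4, 4],
           [4, 3, 3, 2, 4]]
item_cvis = [item_cvi(r) for r in ratings]
scale_cvi = float(np.mean(item_cvis))   # S-CVI/Ave: average of the item CVIs
```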

  2. A quantitative evaluation of various deconvolution methods and their applications in the deconvolution of plasma spectra

    NASA Astrophysics Data System (ADS)

    Xiong, Yanwei; Shi, Yuejiang; Li, Yingying; Fu, Jia; Lu, Bo; Zhang, Hongming; Wang, Xiaoguang; Wang, Fudi; Shen, Yongcai

    2013-06-01

    A quantitative evaluation of various deconvolution methods and their applications in processing plasma emitted spectra was performed. The iterative deconvolution algorithms evaluated here include Jansson's method, the Richardson-Lucy method, the maximum a posteriori method and Gold's method. The evaluation criteria include minimization of the sum of squared errors and the sum of squared relative error of parameters, and their rate of convergence. After comparing deconvolved results using these methods, it was concluded that Jansson's and Gold's methods were able to provide good profiles that are visually close to the original spectra. Additionally, Gold's method generally gives the best results when considering all the criteria above. The applications to the actual plasma spectra obtained from the EAST tokamak with these methods are also presented in this paper. The deconvolution results with Gold's and Jansson's methods show that the effects of instruments can be satisfactorily eliminated and clear spectra are recovered.
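
    A minimal 1-D Richardson-Lucy iteration, one of the methods compared above, can be sketched as follows; the two-line test spectrum and Gaussian instrument function are illustrative, not the paper's EAST data.

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=200):
    # 1-D Richardson-Lucy update: x <- x * (psf_flipped conv (y / (psf conv x)))
    x = np.full_like(observed, observed.mean())
    flipped = psf[::-1]
    for _ in range(iterations):
        blurred = np.convolve(x, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # avoid divide-by-zero
        x = x * np.convolve(ratio, flipped, mode="same")
    return x

# toy spectrum: two lines broadened by a Gaussian instrument function
true = np.zeros(200)
true[80], true[120] = 1.0, 0.6
grid = np.arange(-25, 26)
psf = np.exp(-(grid / 5.0) ** 2)
psf /= psf.sum()
observed = np.convolve(true, psf, mode="same")
restored = richardson_lucy(observed, psf)
```

    The multiplicative update keeps the estimate non-negative, which is one reason these iterative schemes suit emission spectra.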

  3. Linking multidimensional functional diversity to quantitative methods: a graphical hypothesis-evaluation framework.

    PubMed

    Boersma, Kate S; Dee, Laura E; Miller, Steve J; Bogan, Michael T; Lytle, David A; Gitelman, Alix I

    2016-03-01

    Functional trait analysis is an appealing approach to study differences among biological communities because traits determine species' responses to the environment and their impacts on ecosystem functioning. Despite a rapidly expanding quantitative literature, it remains challenging to conceptualize concurrent changes in multiple trait dimensions ("trait space") and select quantitative functional diversity methods to test hypotheses prior to analysis. To address this need, we present a widely applicable framework for visualizing ecological phenomena in trait space to guide the selection, application, and interpretation of quantitative functional diversity methods. We describe five hypotheses that represent general patterns of responses to disturbance in functional community ecology and then apply a formal decision process to determine appropriate quantitative methods to test ecological hypotheses. As a part of this process, we devise a new statistical approach to test for functional turnover among communities. Our combination of hypotheses and metrics can be applied broadly to address ecological questions across a range of systems and study designs. We illustrate the framework with a case study of disturbance in freshwater communities. This hypothesis-driven approach will increase the rigor and transparency of applied functional trait studies. PMID:27197386
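
    One common quantitative functional diversity metric of the kind such a framework selects among is functional richness, the convex hull volume of a community's species in trait space. A sketch with hypothetical 2-D trait coordinates (the communities and trait axes are invented):

```python
import numpy as np
from scipy.spatial import ConvexHull

def functional_richness(traits):
    # volume (area in 2-D) of the convex hull spanned by a community in trait space
    return float(ConvexHull(np.asarray(traits)).volume)

# hypothetical trait coordinates (e.g. body size, dispersal ability), rescaled to [0, 1]
pre_disturbance  = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]]
post_disturbance = [[0.4, 0.4], [0.6, 0.4], [0.4, 0.6], [0.6, 0.6]]
fric_pre = functional_richness(pre_disturbance)
fric_post = functional_richness(post_disturbance)   # contracted trait space
```

    A disturbance that filters out extreme trait values shows up as a shrunken hull, one of the patterns the five hypotheses in the framework distinguish.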

  4. Development of a Quantitative SRM-Based Proteomics Method to Study Iron Metabolism of Synechocystis sp. PCC 6803.

    PubMed

    Vuorijoki, Linda; Isojärvi, Janne; Kallio, Pauli; Kouvonen, Petri; Aro, Eva-Mari; Corthals, Garry L; Jones, Patrik R; Muth-Pawlak, Dorota

    2016-01-01

    The cyanobacterium Synechocystis sp. PCC 6803 (S. 6803) is a well-established model species in oxygenic photosynthesis research and a potential host for biotechnological applications. Despite recent advances in genome sequencing and microarray techniques applied in systems biology, quantitative proteomics approaches with corresponding accuracy and depth are scarce for S. 6803. In this study, we developed a protocol to screen changes in the expression of 106 proteins representing central metabolic pathways in S. 6803 with a targeted mass spectrometry method, selected reaction monitoring (SRM). We evaluated the response to the exposure of both short- and long-term iron deprivation. The experimental setup enabled the relative quantification of 96 proteins, with 87 and 92 proteins showing adjusted p-values <0.01 under short- and long-term iron deficiency, respectively. The high sensitivity of the SRM method for S. 6803 was demonstrated by providing quantitative data for altogether 64 proteins that previously could not be detected with the classical data-dependent MS approach under similar conditions. This highlights the effectiveness of SRM for quantification and extends the analytical capability to low-abundance proteins in unfractionated samples of S. 6803. The SRM assays and other generated information are now publicly available via PASSEL and Panorama. PMID:26652789

  5. Advances in quantitative muscle ultrasonography using texture analysis of ultrasound images.

    PubMed

    Molinari, Filippo; Caresio, Cristina; Acharya, U Rajendra; Mookiah, Muthu Rama Krishnan; Minetto, Marco Alessandro

    2015-09-01

    Musculoskeletal ultrasound imaging can be used to investigate the skeletal muscle structure in terms of architecture (thickness, cross-sectional area, fascicle length and fascicle pennation angle) and texture. Gray-scale analysis is commonly used to characterize transverse scans of the muscle. Gray mean value is used to distinguish between normal and pathologic muscles, but it depends on the image acquisition system and its settings. In this study, quantitative ultrasonography was performed on five muscles (biceps brachii, vastus lateralis, rectus femoris, medial gastrocnemius and tibialis anterior) of 20 healthy subjects (10 women, 10 men) to assess the characterization performance of higher-order texture descriptors to differentiate genders and muscle types. A total of 53 features (7 first-order descriptors, 24 Haralick features, 20 Galloway features and 2 local binary pattern features) were extracted from each muscle region of interest (ROI) and were used to perform multivariate analysis of variance (MANOVA). Our results show that first-order descriptors, Haralick features (energy, entropy and correlation measured along different angles) and local binary pattern (LBP) energy and entropy were highly linked to gender, whereas Haralick entropy and symmetry, Galloway texture descriptors and LBP entropy helped to distinguish muscle types. Hence, the combination of first-order and higher-order texture descriptors (Haralick, Galloway and LBP) can be used to discriminate gender and muscle types. Therefore, multi-texture analysis may be useful to investigate muscle damage and myopathic disorders. PMID:26026375
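
    Haralick features are derived from a grey-level co-occurrence matrix (GLCM). The minimal sketch below computes GLCM energy and entropy for a hypothetical 4-level ROI; real pipelines quantize the ultrasound image, use several offsets and angles, and compute many more descriptors.

```python
import numpy as np

def glcm(img, levels=4):
    # horizontal-neighbour grey-level co-occurrence matrix, normalised to sum to 1
    m = np.zeros((levels, levels))
    for row in img:
        for a, b in zip(row[:-1], row[1:]):
            m[a, b] += 1
    return m / m.sum()

def haralick_energy_entropy(p):
    # two classic Haralick descriptors of the co-occurrence distribution
    energy = float((p ** 2).sum())
    nz = p[p > 0]
    entropy = float(-(nz * np.log2(nz)).sum())
    return energy, entropy

# hypothetical 4-level quantized ROI patch from a muscle scan
roi = np.array([[0, 0, 1, 1],
                [0, 1, 1, 2],
                [2, 2, 3, 3],
                [1, 2, 3, 3]])
energy, entropy = haralick_energy_entropy(glcm(roi))
```

    A perfectly uniform ROI gives energy 1 and entropy 0; heterogeneous texture lowers energy and raises entropy, which is what makes these descriptors discriminative.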

  6. Nanoparticle-mediated photothermal effect enables a new method for quantitative biochemical analysis using a thermometer

    NASA Astrophysics Data System (ADS)

    Fu, Guanglei; Sanjay, Sharma T.; Dou, Maowei; Li, Xiujun

    2016-03-01

    A new biomolecular quantitation method, nanoparticle-mediated photothermal bioassay, using a common thermometer as the signal reader was developed. Using an immunoassay as a proof of concept, iron oxide nanoparticles (NPs) captured in the sandwich-type assay system were transformed into a near-infrared (NIR) laser-driven photothermal agent, Prussian blue (PB) NPs, which acted as a photothermal probe to convert the assay signal into heat through the photothermal effect, thus allowing sensitive biomolecular quantitation using a thermometer. This is the first report of biomolecular quantitation using a thermometer and also serves as the first attempt to introduce the nanoparticle-mediated photothermal effect for bioassays. Electronic supplementary information (ESI) available: Additional information on FTIR characterization (Fig. S1), photothermal immunoassay of PSA in human serum samples (Table S1), and the Experimental section, including preparation of antibody-conjugated iron oxide NPs, sandwich-type immunoassay, characterization, and photothermal detection protocol. See DOI: 10.1039/c5nr09051b

  7. Research using qualitative, quantitative or mixed methods and choice based on the research.

    PubMed

    McCusker, K; Gunaydin, S

    2015-10-01

    Research is fundamental to the advancement of medicine and critical to identifying the optimal therapies unique to particular societies. This is easily observed through the dynamics associated with pharmacology, surgical technique and the medical equipment used today versus just a few years ago. Advancements in knowledge synthesis and reporting guidelines enhance the quality, scope and applicability of results, thus improving health science and clinical practice and advancing health policy. While advancements are critical to the progression of optimal health care, the high cost associated with these endeavors cannot be ignored. Research methodology itself therefore needs to be evaluated to identify the most efficient approaches. The primary objective of this paper is to examine a specific research methodology as applied to clinical research, especially extracorporeal circulation and its prognosis for the future. PMID:25378417

  8. Advanced superposition methods for high speed turbopump vibration analysis

    NASA Technical Reports Server (NTRS)

    Nielson, C. E.; Campany, A. D.

    1981-01-01

    The small, high pressure Mark 48 liquid hydrogen turbopump was analyzed and dynamically tested to determine the cause of high speed vibration at an operating speed of 92,400 rpm. This approaches the design point operating speed of 95,000 rpm. The initial dynamic analysis in the design stage and subsequent further analysis of the rotor only dynamics failed to predict the vibration characteristics found during testing. An advanced procedure for dynamics analysis was used in this investigation. The procedure involves developing accurate dynamic models of the rotor assembly and casing assembly by finite element analysis. The dynamically instrumented assemblies are independently rap tested to verify the analytical models. The verified models are then combined by modal superposition techniques to develop a completed turbopump model where dynamic characteristics are determined. The results of the dynamic testing and analysis obtained are presented and methods of moving the high speed vibration characteristics to speeds above the operating range are recommended. Recommendations for use of these advanced dynamic analysis procedures during initial design phases are given.
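
    The idea of combining independently verified component models into one coupled model can be illustrated with a toy lumped-parameter rotor and casing joined by a bearing spring. This is a sketch of the concept only, not the authors' finite element and modal superposition procedure, and all values are invented.

```python
import numpy as np
from scipy.linalg import eigh

# toy lumped-parameter component models (kg, N/m); values are illustrative only
m_rotor, k_rotor = np.diag([10.0, 10.0]), np.array([[4e6, -2e6], [-2e6, 4e6]])
m_case, k_case = np.diag([50.0, 50.0]), np.array([[8e6, -4e6], [-4e6, 8e6]])

# assemble the coupled system; a bearing spring links rotor DOF 1 to casing DOF 0
M = np.block([[m_rotor, np.zeros((2, 2))], [np.zeros((2, 2)), m_case]])
K = np.block([[k_rotor, np.zeros((2, 2))], [np.zeros((2, 2)), k_case]])
k_bearing = 1e6
K[1, 1] += k_bearing
K[2, 2] += k_bearing
K[1, 2] -= k_bearing
K[2, 1] -= k_bearing

# generalized eigenproblem K v = w^2 M v -> coupled natural frequencies in Hz
eigvals = eigh(K, M, eigvals_only=True)
freqs_hz = np.sqrt(np.abs(eigvals)) / (2 * np.pi)
```

    Shifting a critical speed out of the operating range then amounts to modifying stiffness or mass terms until the relevant coupled frequency moves above it.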

  9. Quantitative Methods for Evaluating the Efficacy of Thalamic Deep Brain Stimulation in Patients with Essential Tremor

    PubMed Central

    Wastensson, Gunilla; Holmberg, Björn; Johnels, Bo; Barregard, Lars

    2013-01-01

    Background Deep brain stimulation (DBS) of the thalamus is a safe and efficient method for treatment of disabling tremor in patients with essential tremor (ET). However, successful tremor suppression after surgery requires careful selection of stimulus parameters. Our aim was to examine the possible use of certain quantitative methods for evaluating the efficacy of thalamic DBS in ET patients in clinical practice, and to compare these methods with traditional clinical tests. Methods We examined 22 patients using the Essential Tremor Rating Scale (ETRS) and quantitative assessment of tremor with the stimulator both activated and deactivated. We used an accelerometer (CATSYS tremor Pen) for quantitative measurement of postural tremor, and a eurythmokinesimeter (EKM) to evaluate kinetic tremor in a rapid pointing task. Results The efficacy of DBS on tremor suppression was prominent irrespective of the method used. The agreement between clinical rating of postural tremor and tremor intensity as measured by the CATSYS tremor pen was relatively high (rs = 0.74). The agreement between kinetic tremor as assessed by the ETRS and the main outcome variable from the EKM test was low (rs = 0.34). The lack of agreement indicates that the EKM test is not comparable with the clinical test. Discussion Quantitative methods, such as the CATSYS tremor pen, could be a useful complement to clinical tremor assessment in evaluating the efficacy of DBS in clinical practice. Future studies should evaluate the precision of these methods and long-term impact on tremor suppression, activities of daily living (ADL) function and quality of life. PMID:24255800

  10. Integrated Geophysical Methods Applied to Geotechnical and Geohazard Engineering: From Qualitative to Quantitative Analysis and Interpretation

    NASA Astrophysics Data System (ADS)

    Hayashi, K.

    2014-12-01

    The near-surface is the region of day-to-day human activity on the earth. It is exposed to natural phenomena which sometimes cause disasters. This presentation covers a broad spectrum of geotechnical and geohazard ways of mitigating disaster and conserving the natural environment using geophysical methods, and emphasizes the contribution of geophysics to such issues. The presentation focuses on the usefulness of geophysical surveys in providing information to mitigate disasters, rather than on the theoretical details of a particular technique. Several techniques are introduced at the level of concept and application. Topics include various geohazard and geoenvironmental applications, such as earthquake disaster mitigation, preventing floods triggered by torrential rain, environmental conservation and studying the effects of global warming. Among the geophysical techniques, the active and passive surface wave, refraction and resistivity methods are mainly highlighted. Together with the geophysical techniques, several related issues, such as performance-based design, standardization or regularization, internet access and databases, are also discussed. The presentation discusses the application of geophysical methods to engineering investigations from a non-uniqueness point of view and introduces the concepts of "integrated" and "quantitative" analysis. Most geophysical analyses are essentially non-unique, and it is very difficult to obtain unique and reliable engineering solutions from only one geophysical method (Fig. 1). The only practical way to improve the reliability of an investigation is the joint use of several geophysical and geotechnical investigation methods: an integrated approach to geophysics. The result of a geophysical method is generally vague: "here is a high-velocity layer; it may be bedrock", or "this low-resistivity section may contain clayey soils". Such vague, qualitative and subjective interpretation is of little value for general engineering design work.

  11. Exploration of Advanced Probabilistic and Stochastic Design Methods

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    2003-01-01

    The primary objective of the three-year research effort was to explore advanced, non-deterministic aerospace system design methods that may have relevance to designers and analysts. The research pursued emerging areas in design methodology and leveraged current fundamental research in the area of design decision-making, probabilistic modeling, and optimization. The specific focus of the three-year investigation was oriented toward methods to identify and analyze emerging aircraft technologies in a consistent and complete manner, and to explore means to make optimal decisions based on this knowledge in a probabilistic environment. The research efforts were classified into two main areas. First, Task A of the grant had the objective of conducting research into the relative merits of possible approaches that account for both multiple criteria and uncertainty in design decision-making. In particular, in the final year of research, the focus was on comparing and contrasting the three methods researched. Specifically, these three are the Joint Probabilistic Decision-Making (JPDM) technique, Physical Programming, and Dempster-Shafer (D-S) theory. The next element of the research, as contained in Task B, was focused upon exploration of the Technology Identification, Evaluation, and Selection (TIES) methodology developed at ASDL, especially with regard to identification of research needs in the baseline method through implementation exercises. The end result of Task B was the documentation of the evolution of the method with time and a technology transfer to the sponsor regarding the method, such that an initial capability for execution could be obtained by the sponsor. Specifically, the results of year 3 efforts were the creation of a detailed tutorial for implementing the TIES method. Within the tutorial package, templates and detailed examples were created for learning and understanding the details of each step. For both research tasks, sample files and
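
    Of the three methods compared in Task A, Dempster-Shafer theory has a particularly compact core: Dempster's rule of combination, which fuses two bodies of evidence by intersecting focal elements and renormalising away the conflict. A sketch with hypothetical evidence masses over two candidate designs:

```python
from itertools import product

def dempster_combine(m1, m2):
    # Dempster's rule: multiply masses of intersecting focal elements and
    # renormalise by 1 - K, where K is the total conflicting mass
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# hypothetical evidence from two analyses over candidate designs A and B
m1 = {frozenset("A"): 0.6, frozenset("AB"): 0.4}
m2 = {frozenset("B"): 0.5, frozenset("AB"): 0.5}
m12 = dempster_combine(m1, m2)
```

    Here the 0.30 of conflicting mass (evidence for A meeting evidence for B) is discarded and the remainder rescaled, leaving belief split 3/7, 2/7, 2/7 across A, B, and the uncommitted set {A, B}.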

  12. Advanced numerical methods in mesh generation and mesh adaptation

    SciTech Connect

    Lipnikov, Konstantine; Danilov, A; Vassilevski, Y; Agonzal, A

    2010-01-01

    Numerical solution of partial differential equations requires appropriate meshes, efficient solvers and robust and reliable error estimates. Generation of high-quality meshes for complex engineering models is a non-trivial task. This task is made more difficult when the mesh has to be adapted to a problem solution. This article is focused on a synergistic approach to mesh generation and mesh adaptation, where the best properties of various mesh generation methods are combined to efficiently build simplicial meshes. First, the advancing front technique (AFT) is combined with the incremental Delaunay triangulation (DT) to build an initial mesh. Second, the metric-based mesh adaptation (MBA) method is employed to improve the quality of the generated mesh and/or to adapt it to a problem solution. We demonstrate with numerical experiments that the combination of all three methods is required for robust meshing of complex engineering models. The key to successful mesh generation is the high quality of the triangles in the initial front. We use a black-box technique to improve surface meshes exported from an inaccessible CAD system. The initial surface mesh is refined into a shape-regular triangulation which approximates the boundary with the same accuracy as the CAD mesh. The DT method adds robustness to the AFT. The resulting mesh is topologically correct but may contain a few slivers. The MBA uses seven local operations to modify the mesh topology. It significantly improves the mesh quality. The MBA method is also used to adapt the mesh to a problem solution to minimize the computational resources required for solving the problem. The MBA has a solid theoretical background. In the first two experiments, we consider the convection-diffusion and elasticity problems. We demonstrate the optimal reduction rate of the discretization error on a sequence of adaptive strongly anisotropic meshes. The key element of the MBA method is construction of a tensor metric from hierarchical edge

  13. Terahertz absorbance spectrum fitting method for quantitative detection of concealed contraband

    NASA Astrophysics Data System (ADS)

    Wang, Yingxin; Zhao, Ziran; Chen, Zhiqiang; Kang, Kejun; Feng, Bing; Zhang, Yan

    2007-12-01

    We present a quantitative method for the nondestructive detection of concealed contraband based on terahertz transmission spectroscopy. Without knowing the prior information of barrier materials, the amount of concealed contraband can be extracted by approximating the terahertz absorbance spectrum of the barrier material with a low-order polynomial and then fitting the measured absorbance spectrum of the inspected object with the polynomial and the known standard spectrum of this kind of contraband. We verify the validity of this method using a sample of explosive 1,3,5-trinitro-s-triazine (RDX) covered with several different barrier materials which are commonly encountered in actual inspection, and good agreement between the calculated and actual value of the amount of RDX is obtained for the experiments performed under both nitrogen and air atmospheres. This indicates that the presented method can achieve quantitative detection of hidden contraband, which is important for security inspection applications.
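
    The fitting step described above (a low-order polynomial for the barrier plus a scaled standard spectrum) reduces to a linear least-squares problem. A synthetic sketch with mock spectra; all line positions and the true "amount" of 0.75 are invented, not RDX data.

```python
import numpy as np

# synthetic absorbance of the inspected object = smooth barrier baseline
# + amount * standard contraband spectrum
freq = np.linspace(0.2, 2.0, 400)                      # THz
standard = (np.exp(-((freq - 0.82) / 0.05) ** 2)       # mock absorption lines
            + 0.6 * np.exp(-((freq - 1.50) / 0.07) ** 2))
barrier = 0.30 + 0.20 * freq + 0.05 * freq ** 2        # low-order polynomial
measured = barrier + 0.75 * standard                   # true amount: 0.75

# joint least-squares fit of the polynomial coefficients and the amount
A = np.column_stack([np.ones_like(freq), freq, freq ** 2, standard])
coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
amount = coeffs[3]
```

    Because the barrier term is smooth and the contraband lines are sharp, the two contributions are well separated and the amount is recovered without knowing the barrier material in advance.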

  14. Quantitative EDXS analysis of organic materials using the ζ-factor method.

    PubMed

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. PMID:24012932
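
    In the thin-film limit, the ζ-factor method relates composition and mass thickness to the products ζ_i I_i. The sketch below shows that arithmetic with invented ζ-factors and intensities; it omits the iterative absorption correction that the full method applies and that matters most for light elements.

```python
import numpy as np

# hypothetical zeta-factors and measured X-ray intensities for C, N, O
# in an organic film (arbitrary illustrative units)
zeta = np.array([1100.0, 1300.0, 1600.0])
counts = np.array([52000.0, 9000.0, 14000.0])
dose = 5.0e9   # total electron dose

products = zeta * counts
mass_fractions = products / products.sum()   # C_i = zeta_i I_i / sum_j zeta_j I_j
rho_t = products.sum() / dose                # mass thickness from the same products
```

    A single standard of known composition and thickness fixes the ζ-factors, which is what makes the method practical for elements not contained in the standard via interpolation.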

  15. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize. PMID:23470871
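
    The weight-based GMO content follows from the ratio of event-specific to endogenous-reference copy numbers divided by the conversion factor. A sketch with invented numbers; the actual Cf determined for LY038 in the multilaboratory evaluation is not reproduced here.

```python
# hypothetical copy-number estimates from a real-time PCR run
event_copies = 1.9e3        # LY038-specific target
endogenous_copies = 4.0e4   # maize endogenous reference gene
conversion_factor = 0.45    # illustrative Cf from 100% GM reference material

# weight-based GMO content in percent
gmo_percent = (event_copies / endogenous_copies) / conversion_factor * 100
```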

  16. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    SciTech Connect

    Kiefel, Denis; Stoessel, Rainer E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages include liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, the quantitative information they provide is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution together with 3-dimensional analysis and visualization capabilities, which make it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  17. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    NASA Astrophysics Data System (ADS)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-01

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages include liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, the quantitative information they provide is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution together with 3-dimensional analysis and visualization capabilities, which make it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  18. Full skin quantitative optical coherence elastography achieved by combining vibration and surface acoustic wave methods

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam

    2015-03-01

    In combination with phase-sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have each been reported to provide elastography of skin tissue. However, neither method can image the full skin depth in current systems. This paper presents a feasibility study of an optical coherence elastography method that combines vibration and SAW measurements in order to quantify the mechanical properties of skin tissue over the full depth range, including epidermis, dermis and subcutaneous fat. Experiments were carried out on layered tissue-mimicking phantoms and in vivo on human forearm and palm skin. A ring actuator generated vibration while a line actuator excited SAWs. A PhS-OCT system provided ultrahigh-sensitivity measurement of the generated waves. The experimental results demonstrate that combining the vibration and SAW methods allows the bulk mechanical properties of full skin to be measured quantitatively, yielding elastography over a sensing depth from ~0 mm to ~4 mm. This method is promising for clinical applications where the quantitative elasticity of localized skin diseases is needed to aid diagnosis and treatment.
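
SAW-based elastography of this kind typically converts a measured surface-wave phase velocity into an elastic modulus. The sketch below uses the standard Rayleigh-speed approximation, not necessarily the exact model of this paper; the near-incompressibility assumption (Poisson's ratio ~0.5) and the numbers are illustrative:

```python
def youngs_modulus_from_saw(c_saw, density, nu=0.5):
    """Young's modulus (Pa) from a measured SAW phase velocity (m/s),
    via the common Rayleigh-speed approximation
    c_R ~ (0.87 + 1.12*nu)/(1 + nu) * sqrt(E / (2*rho*(1 + nu))).
    nu=0.5 assumes nearly incompressible soft tissue."""
    a = (0.87 + 1.12 * nu) / (1.0 + nu)
    return 2.0 * density * (1.0 + nu) * (c_saw / a) ** 2

# Hypothetical skin-like values: 3 m/s SAW speed, density 1000 kg/m^3
E = youngs_modulus_from_saw(3.0, 1000.0)  # on the order of tens of kPa
```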

  19. Advanced Motion Compensation Methods for Intravital Optical Microscopy

    PubMed Central

    Vinegoni, Claudio; Lee, Sungon; Feruglio, Paolo Fumene; Weissleder, Ralph

    2013-01-01

    Intravital microscopy has emerged in the recent decade as an indispensable imaging modality for the study of the micro-dynamics of biological processes in live animals. Technical advancements in imaging techniques and hardware components, combined with the development of novel targeted probes and new mouse models, have enabled us to address long-standing questions in several areas of biology, such as oncology, cell biology, immunology and neuroscience. As instrument resolution has increased, physiological motion has become a major obstacle that prevents imaging live animals at resolutions analogous to those obtained in vitro. Motion compensation techniques aim at reducing this gap and can effectively increase the in vivo resolution. This paper provides a technical review of some of the latest developments in motion compensation methods, providing organ-specific solutions. PMID:24273405

  20. Advancing MODFLOW Applying the Derived Vector Space Method

    NASA Astrophysics Data System (ADS)

    Herrera, G. S.; Herrera, I.; Lemus-García, M.; Hernandez-Garcia, G. D.

    2015-12-01

    The most effective domain decomposition methods (DDM) are non-overlapping DDMs. Recently a new approach, the DVS-framework, based on an innovative discretization method that uses a non-overlapping system of nodes (the derived-nodes), was introduced and developed by I. Herrera et al. [1, 2]. Using the DVS-approach, a group of four algorithms, referred to as the 'DVS-algorithms', which fulfill the DDM-paradigm (i.e., the solution of global problems is obtained by the resolution of local problems exclusively) has been derived. Such procedures are applicable to any boundary-value problem, or system of such equations, for which a standard discretization method is available, and software with a high degree of parallelization can then be constructed. In a parallel talk at this AGU Fall Meeting, Ismael Herrera will introduce the general DVS methodology. The application of the DVS-algorithms has been demonstrated in the solution of several boundary-value problems of interest in geophysics. Numerical examples for a single equation, for the cases of symmetric, non-symmetric and indefinite problems, were demonstrated before [1, 2]. For these problems DVS-algorithms exhibited significantly improved numerical performance with respect to standard versions of DDM algorithms. In view of these results, our research group is applying the DVS method to a widely used simulator for the first time; here we present the advances in the application of this method to the parallelization of MODFLOW. Efficiency results for a group of tests will be presented. References [1] I. Herrera, L.M. de la Cruz and A. Rosas-Medina. Non overlapping discretization methods for partial differential equations, Numer Meth Part D E, (2013). [2] Herrera, I., & Contreras, Iván. "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity". Geofísica Internacional, 2015 (in press).

  1. Quantitative proteomics: assessing the spectrum of in-gel protein detection methods

    PubMed Central

    Gauci, Victoria J.; Wright, Elise P.

    2010-01-01

    Proteomics research relies heavily on visualization methods for detection of proteins separated by polyacrylamide gel electrophoresis. Commonly used staining approaches involve colorimetric dyes such as Coomassie Brilliant Blue, fluorescent dyes including Sypro Ruby, newly developed reactive fluorophores, as well as a plethora of others. The most desired characteristic in selecting one stain over another is sensitivity, but this is far from the only important parameter. This review evaluates protein detection methods in terms of their quantitative attributes, including limit of detection (i.e., sensitivity), linear dynamic range, inter-protein variability, capacity for spot detection after 2D gel electrophoresis, and compatibility with subsequent mass spectrometric analyses. Unfortunately, many of these quantitative criteria are not routinely or consistently addressed by most of the studies published to date. We would urge more rigorous routine characterization of stains and detection methodologies as a systematic route to improving these critically important tools for quantitative proteomics. In addition, substantial improvements in detection technology, particularly over the last decade or so, emphasize the need to consider renewed characterization of existing stains; the quantitative stains we need, or at least the chemistries required for their future development, may well already exist. PMID:21686332
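
Two of the quantitative attributes named above, limit of detection and linear dynamic range, can be estimated directly from a calibration fit. A minimal sketch with hypothetical load/signal data and an assumed blank noise level (the 3.3*sigma/slope form is the common ICH-style convention, not taken from this review):

```python
import numpy as np

# Hypothetical calibration: protein load (ng) vs. stain signal (a.u.)
load = np.array([10.0, 25.0, 50.0, 100.0, 250.0, 500.0])
signal = np.array([0.8, 2.1, 4.0, 8.3, 20.5, 41.0])

slope, intercept = np.polyfit(load, signal, 1)

sigma_blank = 0.05                 # assumed SD of the blank-lane signal
lod = 3.3 * sigma_blank / slope    # ICH-style limit of detection (ng)

# Linearity check over the fitted range (relative residuals)
rel_resid = np.abs(slope * load + intercept - signal) / signal
```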

  2. Sensitivity analysis of infectious disease models: methods, advances and their application

    PubMed Central

    Wu, Jianyong; Dhingra, Radhika; Gambhir, Manoj; Remais, Justin V.

    2013-01-01

    Sensitivity analysis (SA) can aid in identifying influential model parameters and optimizing model structure, yet infectious disease modelling has yet to adopt advanced SA techniques that are capable of providing considerable insights over traditional methods. We investigate five global SA methods—scatter plots, the Morris and Sobol’ methods, Latin hypercube sampling-partial rank correlation coefficient and the sensitivity heat map method—and detail their relative merits and pitfalls when applied to a microparasite (cholera) and macroparasite (schistosomiasis) transmission model. The methods investigated yielded similar results with respect to identifying influential parameters, but offered specific insights that vary by method. The classical methods differed in their ability to provide information on the quantitative relationship between parameters and model output, particularly over time. The heat map approach provides information about the group sensitivity of all model state variables, and the parameter sensitivity spectrum obtained using this method reveals the sensitivity of all state variables to each parameter over the course of the simulation period, especially valuable for expressing the dynamic sensitivity of a microparasite epidemic model to its parameters. A summary comparison is presented to aid infectious disease modellers in selecting appropriate methods, with the goal of improving model performance and design. PMID:23864497
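
One of the methods compared above, Latin hypercube sampling with partial rank correlation coefficients (PRCC), is straightforward to sketch: rank-transform the samples, regress out the other parameters, and correlate the residuals. The toy model and parameter names below are illustrative, and plain uniform sampling stands in for a true LHS design:

```python
import numpy as np

def _rank(a):
    """Rank transform (no tie handling; adequate for continuous samples)."""
    return np.argsort(np.argsort(a)).astype(float)

def prcc(X, y):
    """Partial rank correlation of each parameter column of X with output y."""
    n, k = X.shape
    R = np.column_stack([_rank(X[:, j]) for j in range(k)])
    ry = _rank(y)
    out = np.empty(k)
    for j in range(k):
        # regress out the ranks of the other parameters, then correlate residuals
        Z = np.column_stack([np.ones(n), np.delete(R, j, axis=1)])
        res_x = R[:, j] - Z @ np.linalg.lstsq(Z, R[:, j], rcond=None)[0]
        res_y = ry - Z @ np.linalg.lstsq(Z, ry, rcond=None)[0]
        out[j] = np.corrcoef(res_x, res_y)[0, 1]
    return out

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))                    # stand-in for an LHS design
y = 5 * X[:, 0] - 2 * X[:, 1] + 0.1 * rng.normal(size=200)
s = prcc(X, y)                                    # parameter 2 is pure noise
```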

  3. Penumbra Pattern Assessment in Acute Stroke Patients: Comparison of Quantitative and Non-Quantitative Methods in Whole Brain CT Perfusion

    PubMed Central

    Baumann, Alena B.; Meinel, Felix G.; Helck, Andreas D.; Opherk, Christian; Straube, Andreas; Reiser, Maximilian F.; Sommer, Wieland H.

    2014-01-01

    Background And Purpose While penumbra assessment has become an important part of the clinical decision making for acute stroke patients, there is a lack of studies measuring the reliability and reproducibility of defined assessment techniques in the clinical setting. Our aim was to determine reliability and reproducibility of different types of three-dimensional penumbra assessment methods in stroke patients who underwent whole brain CT perfusion imaging (WB-CTP). Materials And Methods We included 29 patients with a confirmed MCA infarction who underwent initial WB-CTP with a scan coverage of 100 mm in the z-axis. Two blinded and experienced readers assessed the flow-volume-mismatch twice and in two quantitative ways: Performing a volumetric mismatch analysis using OsiriX imaging software (MMVOL) and visual estimation of mismatch (MMEST). Complementarily, the semiquantitative Alberta Stroke Programme Early CT Score for CT perfusion was used to define mismatch (MMASPECTS). A favorable penumbral pattern was defined by a mismatch of ≥30% in combination with a cerebral blood flow deficit of ≤90 ml and an MMASPECTS score of ≥1, respectively. Inter- and intrareader agreement was determined by kappa values and intraclass correlation coefficients (ICCs). Results Overall, MMVOL showed considerably higher inter-/intrareader agreement (ICCs: 0.751/0.843) compared to MMEST (0.292/0.749). In the subgroup of large (≥50 mL) perfusion deficits, inter- and intrareader agreement of MMVOL was excellent (ICCs: 0.961/0.942), while MMEST interreader agreement was poor (0.415) and intrareader agreement was good (0.919). With respect to penumbra classification, MMVOL showed the highest agreement (interreader agreement: 25 agreements/4 non-agreements/κ: 0.595; intrareader agreement 27/2/0.833), followed by MMEST (22/7/0.471; 23/6/0.577), and MMASPECTS (18/11/0.133; 21/8/0.340). Conclusion The evaluated approach of volumetric mismatch assessment is superior to pure visual and ASPECTS penumbra pattern assessment in WB-CTP.
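
The reader-agreement statistic used above, Cohen's kappa, corrects observed agreement for the agreement expected by chance. A minimal sketch with hypothetical favorable/unfavorable penumbra calls (not the study's data):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two readers' categorical ratings (lists)."""
    assert len(a) == len(b)
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    p_exp = sum((a.count(c) / n) * (b.count(c) / n) for c in set(a) | set(b))
    return (p_obs - p_exp) / (1.0 - p_exp)  # undefined if p_exp == 1

# Hypothetical favorable(1)/unfavorable(0) calls by two readers
reader1 = [1, 1, 1, 1, 0, 0, 0, 1]
reader2 = [1, 1, 1, 0, 0, 0, 1, 1]
kappa = cohens_kappa(reader1, reader2)
```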

  4. A gas chromatography-mass spectrometry method for the quantitation of clobenzorex.

    PubMed

    Cody, J T; Valtier, S

    1999-01-01

    Drugs metabolized to amphetamine or methamphetamine are potentially significant concerns in the interpretation of amphetamine-positive urine drug-testing results. One of these compounds, clobenzorex, is an anorectic drug that is available in many countries. Clobenzorex (2-chlorobenzylamphetamine) is metabolized to amphetamine by the body and excreted in the urine. Following administration, the parent compound was detectable for a shorter time than the metabolite amphetamine, which could be detected for days. Because of the potential complication posed to the interpretation of amphetamine-positive drug tests following administration of this drug, the viability of a current amphetamine procedure using liquid-liquid extraction and conversion to the heptafluorobutyryl derivative followed by gas chromatography-mass spectrometry (GC-MS) analysis was evaluated for identification and quantitation of clobenzorex. Qualitative identification of the drug was relatively straightforward. Quantitative analysis proved to be a far more challenging process. Several compounds were evaluated for use as the internal standard in this method, including methamphetamine-d11, fenfluramine, benzphetamine, and diphenylamine. Results using these compounds proved to be less than satisfactory because of poor reproducibility of the quantitative values. Because of its chromatographic properties, which are similar to those of the parent drug, the compound 3-chlorobenzylamphetamine (3-Cl-clobenzorex) was evaluated in this study as the internal standard for the quantitation of clobenzorex. Precision studies showed 3-Cl-clobenzorex to produce accurate and reliable quantitative results (within-run relative standard deviations [RSDs] < 6.1%, between-run RSDs < 6.0%). The limits of detection and quantitation for this assay were determined to be 1 ng/mL for clobenzorex. PMID:10595847
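
Internal-standard quantitation of the kind described here fits analyte-to-internal-standard response ratios against calibrator concentrations, then back-calculates unknowns from the fit. A minimal sketch with entirely hypothetical calibration numbers:

```python
import numpy as np

# Hypothetical calibrators: clobenzorex (ng/mL) vs. peak-area ratio
# to the 3-Cl-clobenzorex internal standard
conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
ratio = np.array([0.021, 0.104, 0.209, 1.020, 2.050])

slope, intercept = np.polyfit(conc, ratio, 1)

def quantitate(sample_ratio):
    """Back-calculate concentration from a sample's area ratio."""
    return (sample_ratio - intercept) / slope
```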

  5. Radial period extraction method employing frequency measurement for quantitative collimation testing

    NASA Astrophysics Data System (ADS)

    Li, Sikun; Wang, Xiangzhao

    2016-01-01

    A radial period extraction method employing frequency measurement is proposed for quantitative collimation testing using spiral gratings. The radial period of the difference-frequency fringe is treated as a measure of the collimation condition. A frequency measurement technique based on the wavelet transform and a statistical approach is presented to extract the radial period directly from the amplitude-transmittance spiral fringe. A basic constraint for setting the parameters of the wavelet is introduced. A rigorous mathematical derivation is given. The method outperforms methods employing phase measurement in terms of precision, stability and noise immunity.
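
The underlying idea of extracting a period by frequency measurement rather than phase measurement can be illustrated with a much simpler FFT-based stand-in (not the paper's wavelet-plus-statistics technique); the fringe profile below is synthetic:

```python
import numpy as np

def dominant_period(profile, dx):
    """Estimate the dominant spatial period of a 1-D fringe profile from
    the FFT magnitude peak; a crude stand-in for wavelet-based
    frequency measurement."""
    freqs = np.fft.rfftfreq(len(profile), d=dx)
    spec = np.abs(np.fft.rfft(profile - profile.mean()))
    return 1.0 / freqs[np.argmax(spec)]

# Synthetic fringe with a known period of 2.5 (arbitrary units)
x = np.arange(0.0, 10.0, 0.01)
fringe = np.cos(2 * np.pi * x / 2.5)
period = dominant_period(fringe, 0.01)
```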

  6. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but relatively little work has focused on the quantitative performance of the technique. In order to prove the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the UHPSFC quantitative performances were evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization guarantees quality, no further robustness study is required prior to validation. Then, the methods were geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even if UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results in a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC). PMID:24513349

  7. A Vision of Quantitative Imaging Technology for Validation of Advanced Flight Technologies

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas J.; Kerns, Robert V.; Jones, Kenneth M.; Grinstead, Jay H.; Schwartz, Richard J.; Gibson, David M.; Taylor, Jeff C.; Tack, Steve; Dantowitz, Ronald F.

    2011-01-01

    Flight-testing is traditionally an expensive but critical element in the development and ultimate validation and certification of technologies destined for future operational capabilities. Measurements obtained in relevant flight environments also provide unique opportunities to observe flow phenomenon that are often beyond the capabilities of ground testing facilities and computational tools to simulate or duplicate. However, the challenges of minimizing vehicle weight and internal complexity as well as instrumentation bandwidth limitations often restrict the ability to make high-density, in-situ measurements with discrete sensors. Remote imaging offers a potential opportunity to noninvasively obtain such flight data in a complementary fashion. The NASA Hypersonic Thermodynamic Infrared Measurements Project has demonstrated such a capability to obtain calibrated thermal imagery on a hypersonic vehicle in flight. Through the application of existing and accessible technologies, the acreage surface temperature of the Shuttle lower surface was measured during reentry. Future hypersonic cruise vehicles, launcher configurations and reentry vehicles will, however, challenge current remote imaging capability. As NASA embarks on the design and deployment of a new Space Launch System architecture for access beyond earth orbit (and the commercial sector focused on low earth orbit), an opportunity exists to implement an imagery system and its supporting infrastructure that provides sufficient flexibility to incorporate changing technology to address the future needs of the flight test community. A long term vision is offered that supports the application of advanced multi-waveband sensing technology to aid in the development of future aerospace systems and critical technologies to enable highly responsive vehicle operations across the aerospace continuum, spanning launch, reusable space access and global reach. Motivations for development of an Agency level imagery

  8. Business Scenario Evaluation Method Using Monte Carlo Simulation on Qualitative and Quantitative Hybrid Model

    NASA Astrophysics Data System (ADS)

    Samejima, Masaki; Akiyoshi, Masanori; Mitsukuni, Koshichiro; Komoda, Norihisa

    We propose a business scenario evaluation method using a qualitative and quantitative hybrid model. In order to evaluate business factors with qualitative causal relations, we introduce statistical values based on the propagation and combination of the effects of business factors by Monte Carlo simulation. In propagating an effect, we divide the range of each factor by landmarks and decide the effect on a destination node based on the divided ranges. In combining effects, we decide the effect of each arc using a contribution degree and sum all effects. Through application to practical models, it is confirmed that there are no differences, at the 5% risk level, between results obtained by quantitative relations and results obtained by the proposed method.
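
The propagation-and-combination idea can be sketched as follows. This is a loose interpretation under stated assumptions, not the authors' model: landmark values divide each factor's range into qualitative levels, each level maps to an effect value, and arc contribution degrees weight the combined effect; all numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000  # Monte Carlo samples

# Two hypothetical business factors with uncertain values
demand = rng.normal(1.0, 0.2, N)
cost = rng.normal(1.0, 0.1, N)

# Landmarks divide each factor's range into qualitative levels
landmarks = [0.8, 1.2]                  # low / medium / high
effect = np.array([-1.0, 0.0, 1.0])     # effect assigned to each level

demand_eff = effect[np.digitize(demand, landmarks)]
cost_eff = effect[np.digitize(cost, landmarks)]

# Combine effects along arcs using contribution degrees, then summarize
contribution = {"demand": 0.7, "cost": -0.3}
outcome = contribution["demand"] * demand_eff + contribution["cost"] * cost_eff
mean_outcome, p5 = outcome.mean(), np.percentile(outcome, 5)
```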

  9. Qualitative and quantitative methods to determine miscibility in amorphous drug-polymer systems.

    PubMed

    Meng, Fan; Dave, Vivek; Chauhan, Harsh

    2015-09-18

    Amorphous drug-polymer systems or amorphous solid dispersions are commonly used in the pharmaceutical industry to enhance the solubility of compounds with poor aqueous solubility. The degree of miscibility between drug and polymer is important both for solubility enhancement as well as for the formation of a physically stable amorphous system. Calculation of solubility parameters, computational data mining, Tg measurements by DSC and Raman mapping are established traditional methods used to qualitatively detect drug-polymer miscibility. Calculation of the Flory-Huggins interaction parameter, computational analysis of X-Ray Diffraction (XRD) data, solid state Nuclear Magnetic Resonance (NMR) spectroscopy and Atomic Force Microscopy (AFM) have been recently developed to quantitatively determine the miscibility in amorphous drug-polymer systems. This brief review introduces and compiles these qualitative and quantitative methods employed in the evaluation of drug-polymer miscibility. Combination of these techniques can provide deeper insights into the true miscibility of drug-polymer systems. PMID:26006307
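
The Flory-Huggins interaction parameter mentioned above is commonly extracted from melting-point depression data (the Marsac-Taylor treatment); a negative χ suggests miscibility. A minimal sketch with illustrative numbers, not data from any system in the review:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def chi_from_melting_point(tm_mix, tm_pure, dh_fus, phi_drug, m):
    """Flory-Huggins chi from melting-point depression:
    (1/Tm - 1/Tm0) = -(R/dH)[ln(phi_d) + (1 - 1/m)phi_p + chi*phi_p^2],
    solved for chi. Temperatures in K, dh_fus in J/mol, m = ratio of
    polymer to drug molar volumes."""
    phi_p = 1.0 - phi_drug
    lhs = (1.0 / tm_mix - 1.0 / tm_pure) * (-dh_fus / R)
    return (lhs - math.log(phi_drug) - (1.0 - 1.0 / m) * phi_p) / phi_p ** 2

# Illustrative inputs: 10 K depression at 20% polymer loading
chi = chi_from_melting_point(430.0, 440.0, 30000.0, 0.8, 100.0)
```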

  10. A new quantitative method for gunshot residue analysis by ion beam analysis.

    PubMed

    Christopher, Matthew E; Warmenhoeven, John-William; Romolo, Francesco S; Donghi, Matteo; Webb, Roger P; Jeynes, Christopher; Ward, Neil I; Kirkby, Karen J; Bailey, Melanie J

    2013-08-21

    Imaging and analyzing gunshot residue (GSR) particles using the scanning electron microscope equipped with an energy dispersive X-ray spectrometer (SEM-EDS) is a standard technique that can provide important forensic evidence, but the discrimination power of this technique is limited due to low sensitivity to trace elements and difficulties in obtaining quantitative results from small particles. A new, faster method using a scanning proton microbeam and Particle Induced X-ray Emission (μ-PIXE), together with Elastic Backscattering Spectrometry (EBS), is presented for the non-destructive, quantitative analysis of the elemental composition of single GSR particles. In this study, the GSR particles analyzed all contained Pb, Ba and Sb. The precision of the method is assessed. The grouping behaviour of different makes of ammunition is determined using multivariate analysis. The protocol correctly groups the cartridges studied here, with a confidence >99%, irrespective of the firearm or population of particles selected. PMID:23775063

  11. A hybrid approach to advancing quantitative prediction of tissue distribution of basic drugs in human

    SciTech Connect

    Poulin, Patrick; Ekins, Sean; Theil, Frank-Peter

    2011-01-15

    A general toxicity of basic drugs is related to phospholipidosis in tissues. Therefore, it is essential to predict the tissue distribution of basic drugs to facilitate an initial estimate of that toxicity. The objective of the present study was to further assess the original prediction method that consisted of using the binding to red blood cells measured in vitro for the unbound drug (RBCu) as a surrogate for tissue distribution, by correlating it to unbound tissue:plasma partition coefficients (Kpu) of several tissues, and finally to predict volume of distribution at steady-state (V{sub ss}) in humans under in vivo conditions. This correlation method demonstrated inaccurate predictions of V{sub ss} for particular basic drugs that did not follow the original correlation principle. Therefore, the novelty of this study is to provide clarity on the actual hypotheses to identify i) the impact of pharmacological mode of action on the generic correlation of RBCu-Kpu, ii) additional mechanisms of tissue distribution for the outlier drugs, iii) molecular features and properties that differentiate compounds as outliers in the original correlation analysis in order to facilitate its applicability domain alongside the properties already used so far, and finally iv) to present a novel and refined correlation method that is superior to what has been previously published for the prediction of human V{sub ss} of basic drugs. Applying a refined correlation method after identifying outliers would facilitate the prediction of more accurate distribution parameters as key inputs used in physiologically based pharmacokinetic (PBPK) and phospholipidosis models.
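
The end use of tissue-level Kpu predictions of this kind is usually a tissue-additivity estimate of the volume of distribution. The sketch below shows a simplified relation of that family (hypothetical tissue volumes and Kpu values; not the authors' refined correlation, whose details are in the paper):

```python
# Hypothetical tissue volumes (L) and unbound tissue:plasma partition
# coefficients (Kpu) for an average human
tissues = {
    "muscle": (29.0, 4.0),
    "adipose": (18.0, 2.0),
    "liver": (1.7, 15.0),
    "lung": (0.5, 8.0),
}
v_plasma = 3.0    # plasma volume, L
fu_plasma = 0.1   # unbound fraction in plasma

# Simplified tissue-additivity relation: Vss = Vp + fu * sum(Kpu_t * Vt)
vss = v_plasma + fu_plasma * sum(v * kpu for v, kpu in tissues.values())
```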

  12. Quantitative electromechanical impedance method for nondestructive testing based on a piezoelectric bimorph cantilever

    NASA Astrophysics Data System (ADS)

    Fu, Ji; Tan, Chi; Li, Faxin

    2015-06-01

    The electromechanical impedance (EMI) method, which holds great promise in structural health monitoring (SHM), is usually treated as a qualitative method. In this work, we proposed a quantitative EMI method based on a piezoelectric bimorph cantilever using the sample’s local contact stiffness (LCS) as the identification parameter for nondestructive testing (NDT). Firstly, the equivalent circuit of the contact vibration system was established and the analytical relationship between the cantilever’s contact resonance frequency and the LCS was obtained. As the LCS is sensitive to typical defects such as voids and delamination, the proposed EMI method can then be used for NDT. To verify the equivalent circuit model, two piezoelectric bimorph cantilevers were fabricated and their free resonance frequencies were measured and compared with theoretical predictions. It was found that the stiff cantilever’s EMI can be well predicted by the equivalent circuit model while the soft cantilever’s cannot. Then, both cantilevers were assembled into a homemade NDT system using a three-axis motorized stage for LCS scanning. Testing results on a specimen with a prefabricated defect showed that the defect could be clearly reproduced in the LCS image, indicating the validity of the quantitative EMI method for NDT. It was found that the single-frequency mode of the EMI method can also be used for NDT, which is faster but not quantitative. Finally, several issues relating to the practical application of the NDT method were discussed. The proposed EMI-based NDT method offers a simple and rapid solution for damage evaluation in engineering structures and may also shed some light on EMI-based SHM.

  13. A new method for quantitating total lesion glucose metabolic changes in serial tumor FDG PET studies

    SciTech Connect

    Wu, H.M.; Hoh, C.K.; Huang, S.C.; Phelps, M.E.

    1994-05-01

    Accurate quantitative FDG PET studies have the potential for important applications in clinical oncology for monitoring therapy induced changes in tumor glycolytic rates. Due to a number of technical problems that complicate the use of quantitative PET tumor imaging, methods which can maximize the accuracy and precision of such measurements are advantageous. In this study, we developed and evaluated a method for reducing the errors caused by the conventional single plane, single ROI analysis in parametric images generated from pixel by pixel Patlak graphic analysis (PGA) in FDG PET studies of melanoma patients. We compared this new method to the conventional ROI method. The new processing method involves (1) generating the correlation coefficient (r) constrained Patlak parametric images from dynamic PET data; (2) summing up all the planes which cover the lesion; (3) defining a single ROI which covers the whole lesion in the summing image and determining the total lesion glucose metabolic index (K{sub T}, ml/min/lesion). Although only a single ROI was defined on the summing image, the glucose metabolic index obtained showed negligible difference (<1%) compared to those obtained from multiple ROIs on multiple planes of unconstrained parametric images. When the dynamic PET images were rotated and translated to simulate different patient positionings between scans at different times, the results obtained from the new method showed negligible difference (<2%). In summary, we present a simple but reliable method to quantitatively monitor the total lesion glucose metabolic changes during tumor growth. The method has several advantages over the conventional single ROI, single plane evaluation: (1) less sensitive to the ROI definition; (2) smaller intra- and inter-observer variations and (3) not requiring image registrations of serial scan data.
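
Patlak graphical analysis, the basis of the parametric images above, fits the late linear portion of C_t/C_p against the "stretched time" integral(C_p)/C_p; the slope is the uptake constant Ki. A minimal sketch on synthetic data with a known Ki (the input function and frame timing are invented for illustration):

```python
import numpy as np

def patlak_slope(t, ct, cp):
    """Patlak graphical analysis: slope Ki and intercept V0 of
    C_t/C_p versus integral(C_p)/C_p over the late, linear frames."""
    # trapezoidal running integral of the plasma input
    icp = np.concatenate(([0.0], np.cumsum(np.diff(t) * (cp[1:] + cp[:-1]) / 2)))
    x = icp / cp
    y = ct / cp
    half = len(x) // 2            # fit only the late frames
    ki, v0 = np.polyfit(x[half:], y[half:], 1)
    return ki, v0

# Synthetic irreversible-uptake data with known Ki = 0.03, V0 = 0.5
t = np.linspace(1.0, 60.0, 60)
cp = np.exp(-0.05 * t) + 0.2
icp = np.concatenate(([0.0], np.cumsum(np.diff(t) * (cp[1:] + cp[:-1]) / 2)))
ct = 0.03 * icp + 0.5 * cp
ki, v0 = patlak_slope(t, ct, cp)
```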

  14. Quantitative comparison of analysis methods for spectroscopic optical coherence tomography: reply to comment

    PubMed Central

    Bosschaart, Nienke; van Leeuwen, Ton G.; Aalders, Maurice C.G.; Faber, Dirk J.

    2014-01-01

    We reply to the comment by Kraszewski et al on “Quantitative comparison of analysis methods for spectroscopic optical coherence tomography.” We present additional simulations evaluating the proposed window function. We conclude that our simulations show good qualitative agreement with the results of Kraszewski, in support of their conclusion that SOCT optimization should include window shape, next to choice of window size and analysis algorithm. PMID:25401016

  15. Advanced methods for preparation and characterization of infrared detector materials

    NASA Technical Reports Server (NTRS)

    Broerman, J. G.; Morris, B. J.; Meschter, P. J.

    1983-01-01

    Crystals were prepared by the Bridgman-Stockbarger method with a wide range of crystal growth rates and temperature gradients adequate to prevent constitutional supercooling under diffusion-limited, steady-state growth conditions. The longitudinal compositional gradients for different growth conditions and alloy compositions were calculated and compared with experimental data to develop a quantitative model of solute redistribution during the crystal growth of the alloys. Measurements were performed to ascertain the effect of growth conditions on radial compositional gradients. The pseudobinary HgTe-CdTe constitutional phase diagram was determined by precision differential-thermal-analysis measurements and used to calculate the segregation coefficient of Cd as a function of x and interface temperature. Experiments were conducted to determine the ternary phase equilibria in selected regions of the Hg-Cd-Te constitutional phase diagram. Electron and hole mobilities as functions of temperature were analyzed to establish charge-carrier scattering probabilities. Computer algorithms specific to Hg(1-x)CdxTe were developed for calculations of the charge-carrier concentration, charge-carrier mobilities, Hall coefficient, and Fermi energy as functions of x, temperature, ionized donor and acceptor concentrations, and neutral defect concentrations.
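
Longitudinal solute redistribution under diffusion-limited directional solidification of the kind described here is classically modeled by the Tiller initial-transient expression; the sketch below uses that textbook form with illustrative numbers, not the paper's fitted model:

```python
import math

def tiller_transient(x, c0, k, growth_rate, diff_liquid):
    """Tiller initial-transient solid composition for diffusion-limited
    directional solidification:
    Cs(x) = c0 * [(1 - k)(1 - exp(-k*R*x/D)) + k],
    with segregation coefficient k, growth rate R, and liquid
    diffusion coefficient D (consistent length/time units)."""
    arg = -k * growth_rate * x / diff_liquid
    return c0 * ((1.0 - k) * (1.0 - math.exp(arg)) + k)

# Illustrative values: k = 0.3, R = 1e-4 cm/s, D = 5e-5 cm^2/s
start = tiller_transient(0.0, 0.2, 0.3, 1e-4, 5e-5)       # k*c0 at x = 0
steady = tiller_transient(1000.0, 0.2, 0.3, 1e-4, 5e-5)   # approaches c0
```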

  16. New advanced control methods for doubly salient permanent magnet motor

    SciTech Connect

    Blaabjerg, F.; Christensen, L.; Rasmussen, P.O.; Oestergaard, L.; Pedersen, P.

    1995-12-31

    High performance and high efficiency in adjustable speed drives are needed, and new motor constructions are being investigated and analyzed worldwide. This paper deals with advanced control of a recently developed Doubly Salient Permanent Magnet (DSPM) motor. The construction of the DSPM motor is shown, and a dynamical model of the motor is used for simulations. The DSPM motor is supplied by a power converter with a split capacitor, which reduces the number of devices, and a basic control method for this converter is explained. Because the motor is asymmetrical, this control method causes an unequal voltage distribution across the capacitors, resulting in decreased efficiency and poorer dynamic performance. To minimize the problems with the unequal load of the capacitors in the converter, a new charge control strategy is developed. The efficiency of the motor can also be improved by using a power-minimizing scheme based on changing the turn-on and turn-off angles of the current. The two strategies are implemented in an adjustable-speed drive, and it is concluded that both improve the performance of the drive.

  17. Quantitative Cardiac Perfusion: A Noninvasive Spin-labeling Method That Exploits Coronary Vessel Geometry

    PubMed Central

    Reeder, Scott B.; Atalay, Michael K.; McVeigh, Elliot R.; Zerhouni, Elias A.; Forder, John R.

    2007-01-01

    PURPOSE: To quantitate myocardial arterial perfusion with a noninvasive magnetic resonance (MR) imaging technique that exploits the geometry of coronary vessel anatomy. MATERIALS AND METHODS: MR imaging was performed with a spin-labeling method in six arrested rabbit hearts at 4.7 T. Selective inversion of magnetization in the short-axis imaging section along with all myocardium apical to that section produces signal enhancement from arterial perfusion. A linescan protocol was used for validation of flow enhancement. Flow was quantitated from two images and validated with spin-echo (SE) imaging. Regional perfusion defects were created by means of coronary artery ligation and delineated with gadolinium-enhanced imaging. RESULTS: Linescan estimates of T1 obtained at physiologic flows agreed with model predictions. Flow-induced signal enhancement measured on SE images also agreed with expected values. Finally, perfusion abnormalities created by means of coronary artery ligation were detected. CONCLUSION: This spin-labeling method provides quantitative estimates of myocardial arterial perfusion in this model and may hold promise for clinical applications. PMID:8657907

  18. Multipoint methods for linkage analysis of quantitative trait loci in sib pairs

    SciTech Connect

    Cardon, L.R. |; Cherny, S.S.; Fulker, D.W.

    1994-09-01

    The sib-pair method of Haseman and Elston is widely used for linkage analysis of quantitative traits. The method requires no assumptions concerning the mode of transmission of the trait, it is robust with respect to genetic heterogeneity, and it is computationally efficient. However, the practical usefulness of the method is limited by its statistical power, requiring large numbers of sib pairs and highly informative markers to detect genetic loci of only moderate effect size. We have developed a family of interval mapping procedures which dramatically increase the statistical power of the classical sib-pair approach. The methods make use of information from pairs of markers which flank a putative quantitative trait locus (QTL) in order to estimate the location and effect size of the QTL. Here we describe an extension of the interval mapping procedure which takes into account all available marker information on a chromosome simultaneously, rather than just pairs of markers. The method provides a computationally fast approximation to full multipoint analysis of sib-pair data using a modified Haseman-Elston approach. It gives very similar results to the earlier interval mapping procedure when marker information is relatively uniform and a coarse map is used. However, there is a substantial improvement over the original method when markers differ in information content and when a dense map is employed. The method is illustrated using real and simulated sib-pair data.
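
    The classical Haseman-Elston step that these multipoint methods build on can be sketched in a few lines: regress the squared sib-pair trait difference on the estimated proportion of alleles shared identical by descent (IBD) at a marker; a significantly negative slope is evidence for a linked QTL. The data below are simulated and purely illustrative, not from the study.

```python
import numpy as np

def haseman_elston(ibd, trait_a, trait_b):
    """Classic Haseman-Elston regression: regress squared sib-pair
    trait differences on the estimated proportion of alleles shared
    identical by descent (IBD) at a marker. A significantly negative
    slope suggests a linked quantitative trait locus."""
    y = (np.asarray(trait_a) - np.asarray(trait_b)) ** 2
    X = np.column_stack([np.ones(len(y)), ibd])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, slope]

# Toy simulation: squared difference shrinks as IBD sharing rises
rng = np.random.default_rng(0)
ibd = rng.choice([0.0, 0.5, 1.0], size=200)
trait_a = rng.normal(size=200)
trait_b = trait_a - np.sqrt(2.0 * (1.0 - ibd)) * np.abs(rng.normal(size=200))
intercept, slope = haseman_elston(ibd, trait_a, trait_b)
print(round(float(slope), 2))  # negative slope indicates linkage
```

    In the full multipoint extension, the single-marker IBD estimate is replaced by one computed from all markers on the chromosome, but the regression step remains the same.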

  19. Automatic segmentation method of striatum regions in quantitative susceptibility mapping images

    NASA Astrophysics Data System (ADS)

    Murakawa, Saki; Uchiyama, Yoshikazu; Hirai, Toshinori

    2015-03-01

    Abnormal accumulation of brain iron has been detected in various neurodegenerative diseases. Quantitative susceptibility mapping (QSM) is a novel contrast mechanism in magnetic resonance (MR) imaging and enables the quantitative analysis of local tissue susceptibility property. Therefore, automatic segmentation tools of brain regions on QSM images would be helpful for radiologists' quantitative analysis in various neurodegenerative diseases. The purpose of this study was to develop an automatic segmentation and classification method of striatum regions on QSM images. Our image database consisted of 22 QSM images obtained from healthy volunteers. These images were acquired on a 3.0 T MR scanner. The voxel size was 0.9×0.9×2 mm. The matrix size of each slice image was 256×256 pixels. In our computerized method, a template matching technique was first used for the detection of a slice image containing striatum regions. An image registration technique was subsequently employed for the classification of striatum regions in consideration of the anatomical knowledge. After the image registration, the voxels in the target image which correspond with striatum regions in the reference image were classified into three striatum regions, i.e., head of the caudate nucleus, putamen, and globus pallidus. The experimental results indicated that 100% (21/21) of the slice images containing striatum regions were detected accurately. The subjective evaluation of the classification results indicated that 20 (95.2%) of 21 showed good or adequate quality. Our computerized method would be useful for the quantitative analysis of Parkinson disease in QSM images.
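
    The detection stage described above (finding the slice that contains the striatum via template matching) can be sketched with a simple normalized cross-correlation score over candidate slices; the volume and template below are small synthetic stand-ins, not QSM data, and the registration/classification stages are omitted.

```python
import numpy as np

def ncc(image, template):
    """Normalized cross-correlation between a slice and a same-size
    template; 1.0 means a perfect linear match, 0 means no similarity."""
    a = image - image.mean()
    b = template - template.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def best_slice(volume, template):
    """Pick the slice that best matches the template, mirroring the
    first (slice detection) stage of the pipeline described above."""
    scores = [ncc(s, template) for s in volume]
    return int(np.argmax(scores)), max(scores)

# Toy volume: slice 2 contains the template pattern plus a little noise
rng = np.random.default_rng(1)
template = rng.normal(size=(8, 8))
volume = rng.normal(size=(5, 8, 8))
volume[2] = template + 0.1 * rng.normal(size=(8, 8))
idx, score = best_slice(volume, template)
print(idx)  # → 2
```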

  20. Advanced Signal Processing Methods Applied to Digital Mammography

    NASA Technical Reports Server (NTRS)

    Stauduhar, Richard P.

    1997-01-01

    The work reported here is on the extension of the earlier proposal of the same title, August 1994-June 1996. The report for that work is also being submitted. The work reported there forms the foundation for this work from January 1997 to September 1997. After the earlier work was completed there were a few items that needed to be completed prior to submission of a new and more comprehensive proposal for further research. Those tasks have been completed and two new proposals have been submitted, one to NASA, and one to Health & Human Services (HHS). The main purpose of this extension was to refine some of the techniques that lead to automatic large scale evaluation of full mammograms. Progress on each of the proposed tasks follows. Task 1: A multiresolution segmentation of background from breast has been developed and tested. The method is based on the different noise characteristics of the two different fields. The breast field has more power in the lower octaves and the off-breast field behaves similar to a wideband process, where more power is in the high frequency octaves. After the two fields are separated by lowpass filtering, a region labeling routine is used to find the largest contiguous region, the breast. Task 2: A wavelet expansion that can decompose the image without zero padding has been developed. The method preserves all properties of the power-of-two wavelet transform and does not add appreciably to computation time or storage. This work is essential for analysis of the full mammogram, as opposed to selecting sections from the full mammogram. Task 3: A clustering method has been developed based on a simple counting mechanism. No ROC analysis has been performed (and was not proposed), so we cannot finally evaluate this work without further support. Task 4: Further testing of the filter reveals that different wavelet bases do yield slightly different qualitative results. We cannot provide quantitative conclusions about this for all possible bases.
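
    The two-step idea of Task 1 (low-pass filter to favor the low-frequency breast field, then keep the largest contiguous region) can be sketched as follows; the mean-based threshold and the toy image are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from scipy import ndimage

def segment_breast(img, size=9):
    """Sketch of multiresolution background/breast separation: smooth
    to emphasise the low-frequency breast field, threshold, then keep
    the largest connected region as the breast mask."""
    smooth = ndimage.uniform_filter(img.astype(float), size=size)
    mask = smooth > smooth.mean()          # illustrative threshold choice
    labels, n = ndimage.label(mask)        # region labeling step
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return labels == (1 + int(np.argmax(sizes)))

# Toy image: bright disc ("breast") on a dark noisy background
yy, xx = np.mgrid[:64, :64]
img = (((yy - 32) ** 2 + (xx - 20) ** 2) < 15 ** 2) * 1.0
img += 0.05 * np.random.default_rng(2).normal(size=img.shape)
mask = segment_breast(img)
print(bool(mask[32, 20]), bool(mask[5, 60]))  # inside vs. outside the disc
```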

  1. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    ERIC Educational Resources Information Center

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  2. Development of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    PubMed

    Oguchi, Taichi; Onishi, Mari; Minegishi, Yasutaka; Kurosawa, Yasunori; Kasahara, Masaki; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2009-06-01

    A duplex real-time PCR method was developed for quantitative screening analysis of GM maize. The duplex real-time PCR simultaneously detected two GM-specific segments, namely the cauliflower mosaic virus (CaMV) 35S promoter (P35S) segment and an event-specific segment for GA21 maize which does not contain P35S. Calibration was performed with a plasmid calibrant specially designed for the duplex PCR. The result of an in-house evaluation suggested that the analytical precision of the developed method was almost equivalent to those of simplex real-time PCR methods, which have been adopted as ISO standard methods for the analysis of GMOs in foodstuffs and have also been employed for the analysis of GMOs in Japan. In addition, this method will reduce both the cost and time requirement of routine GMO analysis by half. The high analytical performance demonstrated in the current study would be useful for the quantitative screening analysis of GM maize. We believe the developed method will be useful for practical screening analysis of GM maize, although interlaboratory collaborative studies should be conducted to confirm this. PMID:19602858
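
    The calibration step common to such quantitative real-time PCR methods (fitting a standard curve from a plasmid calibrant dilution series and inverting it to quantify unknowns) can be sketched as follows; the Ct values are idealized (perfect efficiency, about -3.32 cycles per decade), and the duplex-specific details are omitted.

```python
import numpy as np

def fit_standard_curve(log10_copies, ct):
    """Fit Ct = slope * log10(copies) + intercept from a plasmid
    calibrant dilution series (the calibration step described above)."""
    slope, intercept = np.polyfit(log10_copies, ct, 1)
    return slope, intercept

def quantify(ct, slope, intercept):
    """Invert the standard curve to estimate copy number for a sample."""
    return 10 ** ((ct - intercept) / slope)

# Idealised 10-fold dilution series (illustrative numbers only)
log10_copies = np.array([5.0, 4.0, 3.0, 2.0])
ct = np.array([18.0, 21.32, 24.64, 27.96])
slope, intercept = fit_standard_curve(log10_copies, ct)
est = quantify(22.98, slope, intercept)   # Ct midway between two standards
print(round(est))  # → 3162, i.e. about 10**3.5 copies
```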

  3. Performance analysis of quantitative phase retrieval method in Zernike phase contrast X-ray microscopy

    NASA Astrophysics Data System (ADS)

    Heng, Chen; Kun, Gao; Da-Jiang, Wang; Li, Song; Zhi-Li, Wang

    2016-02-01

    Since the invention of the Zernike phase contrast method in 1930, it has been widely used in optical microscopy and, more recently, in X-ray microscopy. Because the image contrast is a mixture of absorption and phase information, we recently proposed and demonstrated a method for quantitative phase retrieval in Zernike phase contrast X-ray microscopy. In this contribution, we analyze the performance of this method at different photon energies. Intensity images of PMMA samples are simulated at 2.5 keV and 6.2 keV, respectively, and phase retrieval is performed using the proposed method. The results demonstrate that the proposed phase retrieval method is applicable over a wide energy range. For weakly absorbing features, the optimal photon energy is 2.5 keV from the point of view of image contrast and accuracy of phase retrieval. On the other hand, for strongly absorbing objects, a higher photon energy is preferred to reduce the error of phase retrieval. These results can be used as guidelines for quantitative phase retrieval in Zernike phase contrast X-ray microscopy with the proposed method. Supported by the State Key Project for Fundamental Research (2012CB825801), National Natural Science Foundation of China (11475170, 11205157 and 11179004) and Anhui Provincial Natural Science Foundation (1508085MA20).

  4. A new method for robust quantitative and qualitative analysis of real-time PCR

    PubMed Central

    Shain, Eric B.; Clemens, John M.

    2008-01-01

    An automated data analysis method for real-time PCR needs to exhibit robustness to the factors that routinely impact the measurement and analysis of real-time PCR data. Robust analysis is paramount to providing the same interpretation for results regardless of the skill of the operator performing or reviewing the work. We present a new method for analysis of real-time PCR data, the maxRatio method, which identifies a consistent point within or very near the exponential region of the PCR signal without requiring user intervention. Compared to other analytical techniques that generate only a cycle number, maxRatio generates several measurements of amplification including cycle numbers and relative measures of amplification efficiency and curve shape. By using these values, the maxRatio method can make highly reliable reactive/nonreactive determination along with quantitative evaluation. Application of the maxRatio method to the analysis of quantitative and qualitative real-time PCR assays is shown along with examples of method robustness to, and detection of, amplification response anomalies. PMID:18603594
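
    The core of the maxRatio idea (locating the cycle at which the ratio of successive fluorescence readings peaks, a point within or near the exponential region) can be sketched as below; the sigmoid amplification curve is synthetic, and the method's additional measures of efficiency and curve shape are omitted.

```python
import numpy as np

def max_ratio(fluorescence):
    """Find the cycle of maximum ratio between successive fluorescence
    readings; this lands within or near the exponential region without
    any user-chosen threshold (the core idea of maxRatio)."""
    f = np.asarray(fluorescence, dtype=float)
    ratios = f[1:] / f[:-1]
    i = int(np.argmax(ratios))
    return i + 1, float(ratios[i])  # (cycle index, peak ratio)

# Toy sigmoid amplification curve over 40 cycles (hypothetical signal)
cycles = np.arange(40)
signal = 1.0 + 100.0 / (1.0 + np.exp(-(cycles - 25) / 1.5))
cycle, ratio = max_ratio(signal)
print(cycle, round(ratio, 2))
```

    A flat (nonreactive) signal yields a peak ratio near 1, so the same quantity supports the reactive/nonreactive call mentioned above.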

  5. An Augmented Classical Least Squares Method for Quantitative Raman Spectral Analysis against Component Information Loss

    PubMed Central

    Zhou, Yan; Cao, Hui

    2013-01-01

    We propose an augmented classical least squares (ACLS) calibration method for quantitative Raman spectral analysis against component information loss. The Raman spectral signals with low analyte concentration correlations were selected and used as the substitutes for unknown quantitative component information during the CLS calibration procedure. The number of selected signals was determined by using the leave-one-out root-mean-square error of cross-validation (RMSECV) curve. An ACLS model was built based on the augmented concentration matrix and the reference spectral signal matrix. The proposed method was compared with partial least squares (PLS) and principal component regression (PCR) using one example: a data set recorded from an experiment of analyte concentration determination using Raman spectroscopy. A 2-fold cross-validation with Venetian blinds strategy was exploited to evaluate the predictive power of the proposed method. The one-way variance analysis (ANOVA) was used to access the predictive power difference between the proposed method and existing methods. Results indicated that the proposed method is effective at increasing the robust predictive power of traditional CLS model against component information loss and its predictive power is comparable to that of PLS or PCR. PMID:23956689
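
    The underlying CLS model that ACLS augments (spectra as a linear mix of pure-component spectra, A = C·K) can be sketched as follows; the augmentation of the concentration matrix with selected low-correlation signals is omitted here, and the spectra and concentrations are synthetic.

```python
import numpy as np

def cls_fit(C, A):
    """Classical least squares calibration: model measured spectra
    A (samples x channels) as A = C @ K, where C holds the known
    concentrations. Returns the fitted pure-component spectra K."""
    K, *_ = np.linalg.lstsq(C, A, rcond=None)
    return K

def cls_predict(K, a):
    """Predict concentrations for a new spectrum a by least squares."""
    c, *_ = np.linalg.lstsq(K.T, a, rcond=None)
    return c

# Two-component toy system with known pure spectra (synthetic data)
rng = np.random.default_rng(3)
K_true = rng.random((2, 50))
C = rng.random((20, 2))
A = C @ K_true + 1e-4 * rng.normal(size=(20, 50))
K = cls_fit(C, A)
c_hat = cls_predict(K, np.array([0.3, 0.7]) @ K_true)
print(np.round(c_hat, 3))
```

    ACLS extends this by appending surrogate columns to C for components whose concentrations are unknown, which is what protects the calibration against component information loss.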

  6. Quantitative assessments of traumatic axonal injury in human brain: concordance of microdialysis and advanced MRI.

    PubMed

    Magnoni, Sandra; Mac Donald, Christine L; Esparza, Thomas J; Conte, Valeria; Sorrell, James; Macrì, Mario; Bertani, Giulio; Biffi, Riccardo; Costa, Antonella; Sammons, Brian; Snyder, Abraham Z; Shimony, Joshua S; Triulzi, Fabio; Stocchetti, Nino; Brody, David L

    2015-08-01

    Axonal injury is a major contributor to adverse outcomes following brain trauma. However, the extent of axonal injury cannot currently be assessed reliably in living humans. Here, we used two experimental methods with distinct noise sources and limitations in the same cohort of 15 patients with severe traumatic brain injury to assess axonal injury. One hundred kilodalton cut-off microdialysis catheters were implanted at a median time of 17 h (13-29 h) after injury in normal appearing (on computed tomography scan) frontal white matter in all patients, and samples were collected for at least 72 h. Multiple analytes, such as the metabolic markers glucose, lactate, pyruvate, glutamate and tau and amyloid-β proteins, were measured every 1-2 h in the microdialysis samples. Diffusion tensor magnetic resonance imaging scans at 3 T were performed 2-9 weeks after injury in 11 patients. Stability of diffusion tensor imaging findings was verified by repeat scans 1-3 years later in seven patients. An additional four patients were scanned only at 1-3 years after injury. Imaging abnormalities were assessed based on comparisons with five healthy control subjects for each patient, matched by age and sex (32 controls in total). No safety concerns arose during either microdialysis or scanning. We found that acute microdialysis measurements of the axonal cytoskeletal protein tau in the brain extracellular space correlated well with diffusion tensor magnetic resonance imaging-based measurements of reduced brain white matter integrity in the 1-cm radius white matter-masked region near the microdialysis catheter insertion sites. Specifically, we found a significant inverse correlation between microdialysis measured levels of tau 13-36 h after injury and anisotropy reductions in comparison with healthy controls (Spearman's r = -0.64, P = 0.006). Anisotropy reductions near microdialysis catheter insertion sites were highly correlated with reductions in multiple additional white matter

  7. Quantitative assessments of traumatic axonal injury in human brain: concordance of microdialysis and advanced MRI

    PubMed Central

    Magnoni, Sandra; Mac Donald, Christine L.; Esparza, Thomas J.; Conte, Valeria; Sorrell, James; Macrì, Mario; Bertani, Giulio; Biffi, Riccardo; Costa, Antonella; Sammons, Brian; Snyder, Abraham Z.; Shimony, Joshua S.; Triulzi, Fabio; Stocchetti, Nino

    2015-01-01

    Axonal injury is a major contributor to adverse outcomes following brain trauma. However, the extent of axonal injury cannot currently be assessed reliably in living humans. Here, we used two experimental methods with distinct noise sources and limitations in the same cohort of 15 patients with severe traumatic brain injury to assess axonal injury. One hundred kilodalton cut-off microdialysis catheters were implanted at a median time of 17 h (13–29 h) after injury in normal appearing (on computed tomography scan) frontal white matter in all patients, and samples were collected for at least 72 h. Multiple analytes, such as the metabolic markers glucose, lactate, pyruvate, glutamate and tau and amyloid-β proteins, were measured every 1–2 h in the microdialysis samples. Diffusion tensor magnetic resonance imaging scans at 3 T were performed 2–9 weeks after injury in 11 patients. Stability of diffusion tensor imaging findings was verified by repeat scans 1–3 years later in seven patients. An additional four patients were scanned only at 1–3 years after injury. Imaging abnormalities were assessed based on comparisons with five healthy control subjects for each patient, matched by age and sex (32 controls in total). No safety concerns arose during either microdialysis or scanning. We found that acute microdialysis measurements of the axonal cytoskeletal protein tau in the brain extracellular space correlated well with diffusion tensor magnetic resonance imaging-based measurements of reduced brain white matter integrity in the 1-cm radius white matter-masked region near the microdialysis catheter insertion sites. Specifically, we found a significant inverse correlation between microdialysis measured levels of tau 13–36 h after injury and anisotropy reductions in comparison with healthy controls (Spearman’s r = −0.64, P = 0.006). Anisotropy reductions near microdialysis catheter insertion sites were highly correlated with reductions in multiple additional

  8. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era

    SciTech Connect

    Chiu, Weihsueh A.; Euling, Susan Y.; Scott, Cheryl Siegel; Subramaniam, Ravi P.

    2013-09-15

    The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA) — i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three, somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on “augmentation” of weight of evidence — using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards “integration” of these data with traditional animal-based approaches, in particular quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored are just beginning to be explored now (in Phase II). In parallel, there is a recognized need for “expansion” of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual “reorientation” of QRA towards approaches that more directly link environmental exposures to human outcomes.

  9. Quantitative analysis methods for three-dimensional microstructure of the solid-oxide fuel cell anode

    NASA Astrophysics Data System (ADS)

    Song, X.; Guan, Y.; Liu, G.; Chen, L.; Xiong, Y.; Zhang, X.; Tian, Y.

    2013-10-01

    The electrochemical performance is closely related to three-dimensional microstructure of the Ni-YSZ anode. X-ray nano-tomography combined with quantitative analysis methods has been applied to non-destructively study the internal microstructure of the porous Ni-YSZ anode. In this paper, the methods for calculating some critical structural parameters, such as phase volume fraction, connectivity and active triple phase boundary (TPB) density were demonstrated. These structural parameters help us to optimize electrodes and improve the performance.
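
    Two of the structural parameters named above, phase volume fraction and connectivity, can be computed directly from a segmented voxel array; the tiny labeled volume below is a hypothetical stand-in for a reconstructed tomogram, and TPB extraction is not attempted.

```python
import numpy as np
from scipy import ndimage

# Phase labels in a segmented tomogram: 0 = pore, 1 = Ni, 2 = YSZ
def volume_fraction(seg, phase):
    """Fraction of voxels belonging to a phase."""
    return float((seg == phase).mean())

def connectivity(seg, phase):
    """Fraction of a phase's voxels in its largest connected cluster;
    only connected networks conduct, so this matters for performance."""
    labels, n = ndimage.label(seg == phase)
    if n == 0:
        return 0.0
    sizes = ndimage.sum(seg == phase, labels, range(1, n + 1))
    return float(sizes.max() / sizes.sum())

# Toy volume: a connected Ni slab, a YSZ slab, one isolated Ni voxel
seg = np.zeros((10, 10, 10), dtype=int)
seg[:, :5, :] = 1
seg[:, 5:, :] = 2
seg[0, 9, 0] = 1     # isolated Ni voxel surrounded by YSZ
print(volume_fraction(seg, 1), round(connectivity(seg, 1), 3))
```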

  10. A method for three-dimensional quantitative observation of the microstructure of biological samples

    NASA Astrophysics Data System (ADS)

    Wang, Pengfei; Chen, Dieyan; Ma, Wanyun; Wu, Hongxin; Ji, Liang; Sun, Jialin; Lv, Danyu; Zhang, Lu; Li, Ying; Tian, Ning; Zheng, Jinggao; Zhao, Fengying

    2009-07-01

    Contemporary biology has entered the era of cell biology and molecular biology, and the mechanisms of many biological phenomena are now studied at the microscopic level. Accurate description of the microstructure of biological samples is an urgent need in many biomedical experiments. This paper introduces a method for 3-dimensional quantitative observation of the microstructure of living biological samples based on two-photon laser scanning microscopy (TPLSM). TPLSM is a fluorescence microscopy technique that offers low optical damage, high resolution, deep penetration depth, and suitability for 3-dimensional (3D) imaging. Fluorescently stained samples were observed by TPLSM, and their original shapes were then obtained through 3D image reconstruction. The spatial distribution of all objects in a sample, as well as their volumes, could be derived by image segmentation and mathematical calculation, finally yielding a 3-dimensional, quantitative depiction of the sample's microstructure. We applied this method to quantitative analysis of the spatial distribution of chromosomes in meiotic mouse oocytes at metaphase, with excellent results.

  11. Four-Point Bending as a Method for Quantitatively Evaluating Spinal Arthrodesis in a Rat Model

    PubMed Central

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-01-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague–Dawley rat spines after single-level posterolateral fusion procedures at L4–L5. Segments were classified as ‘not fused,’ ‘restricted motion,’ or ‘fused’ by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4–L5 motion segment, and stiffness was measured as the slope of the moment–displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery. PMID:25730756
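
    The stiffness measurement described above can be sketched numerically: in 4-point bending the inner span carries a constant moment M = F·a/2 (a being the outer-to-inner roller spacing), and stiffness is the slope of the moment-displacement curve. The loading-ramp numbers below are illustrative, not from the study.

```python
import numpy as np

def four_point_moment(force, a):
    """Bending moment in the inner span of a 4-point bend test:
    each inner roller carries F/2 at lever arm a, so M = F * a / 2."""
    return force * a / 2.0

def flexural_stiffness(displacement, moment):
    """Stiffness as the slope of the moment-displacement curve
    (linear fit over the loading ramp), as in the study above."""
    slope, _ = np.polyfit(displacement, moment, 1)
    return slope

# Hypothetical linear loading ramp: displacement in mm, force in N
disp = np.linspace(0.0, 1.0, 11)
force = 12.0 * disp                        # ideal elastic response
moment = four_point_moment(force, a=5.0)   # N*mm, with a = 5 mm
print(round(flexural_stiffness(disp, moment), 3))  # → 30.0 N*mm/mm
```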

  12. A rapid quantitative method of carisoprodol and meprobamate by liquid chromatography-tandem mass spectrometry.

    PubMed

    Essler, Shannon; Bruns, Kerry; Frontz, Michael; McCutcheon, J Rod

    2012-11-01

    The identification and quantitation of carisoprodol (Soma) and its chief metabolite meprobamate, which is also a clinically prescribed drug, remains a challenge for forensic toxicology laboratories. Carisoprodol and meprobamate are notable for their widespread use as muscle relaxants and their frequent identification in the blood of impaired drivers. Routine screening is possible in both an acidic/neutral pH screen and a traditional basic screen. An improvement in directed testing quantitations was desirable over the current options of an underivatized acidic/neutral extraction or a basic screen, neither of which used ideal internal standards. A new method was developed that utilized a simple protein precipitation, deuterated internal standards and a short 2-min isocratic liquid chromatography separation, followed by multiple reaction monitoring with tandem mass spectrometry. The linear quantitative range was determined to be 1-35 mg/L for carisoprodol and 0.5-50 mg/L for meprobamate. The method was validated for specificity and selectivity, matrix effects, and accuracy and precision. PMID:23040985

  13. Investigation of a dual modal method for bone pathologies using quantitative ultrasound and photoacoustics

    NASA Astrophysics Data System (ADS)

    Steinberg, Idan; Gannot, Israel; Eyal, Avishay

    2015-03-01

    Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and overwhelming related healthcare costs. In recent works, we have developed a multi-spectral, frequency-domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over pure ultrasonic or optical methods, as it provides both molecular information from the bone absorption spectrum and bone mechanical status from the characteristics of the ultrasound propagation. These characteristics include both the Speed of Sound (SOS) and Broadband Ultrasonic Attenuation (BUA). To test the method's quantitative predictions, we have constructed a combined ultrasound and photoacoustic setup. Here, we present this dual-modality system and experimentally compare the two methods on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into the bone structure and functional status.

  14. Advances in the analysis of iminocyclitols: Methods, sources and bioavailability.

    PubMed

    Amézqueta, Susana; Torres, Josep Lluís

    2016-05-01

    Iminocyclitols are chemically and metabolically stable, naturally occurring sugar mimetics. Their biological activities make them interesting and extremely promising as both drug leads and functional food ingredients. The first iminocyclitols were discovered using preparative isolation and purification methods followed by chemical characterization using nuclear magnetic resonance spectroscopy. In addition to this classical approach, gas and liquid chromatography coupled to mass spectrometry are increasingly used; they are highly sensitive techniques capable of detecting minute amounts of analytes in a broad spectrum of sources after only minimal sample preparation. These techniques have been applied to identify new iminocyclitols in plants, microorganisms and synthetic mixtures. The separation of iminocyclitol mixtures by chromatography is particularly difficult however, as the most commonly used matrices have very low selectivity for these highly hydrophilic structurally similar molecules. This review critically summarizes recent advances in the analysis of iminocyclitols from plant sources and findings regarding their quantification in dietary supplements and foodstuffs, as well as in biological fluids and organs, from bioavailability studies. PMID:26946023

  15. Underwater Photosynthesis of Submerged Plants – Recent Advances and Methods

    PubMed Central

    Pedersen, Ole; Colmer, Timothy D.; Sand-Jensen, Kaj

    2013-01-01

    We describe the general background and the recent advances in research on underwater photosynthesis of leaf segments, whole communities, and plant-dominated aquatic ecosystems, and present contemporary methods tailor-made to quantify photosynthesis and carbon fixation under water. The majority of studies of aquatic photosynthesis have been carried out with detached leaves or thalli, and this selectiveness influences the perception of the regulation of aquatic photosynthesis. We thus recommend assessing the influence of inorganic carbon and temperature on natural aquatic communities of variable density, in addition to studying detached leaves, in the scenarios of rising CO2 and temperature. Moreover, a growing number of researchers are interested in the tolerance of terrestrial plants during flooding, as torrential rains sometimes result in overland floods that inundate terrestrial plants. We propose to undertake studies to elucidate the importance of leaf acclimation of terrestrial plants to facilitate gas exchange and light utilization under water, as these acclimations influence underwater photosynthesis as well as internal aeration of plant tissues during submergence. PMID:23734154

  16. Methods for integrating optical fibers with advanced aerospace materials

    NASA Astrophysics Data System (ADS)

    Poland, Stephen H.; May, Russell G.; Murphy, Kent A.; Claus, Richard O.; Tran, Tuan A.; Miller, Mark S.

    1993-07-01

    Optical fibers are attractive candidates for sensing applications in near-term smart materials and structures, due to their inherent immunity to electromagnetic interference and ground loops, their capability for distributed and multiplexed operation, and their high sensitivity and dynamic range. These same attributes also render optical fibers attractive for avionics busses for fly-by-light systems in advanced aircraft. The integration of such optical fibers with metal and composite aircraft and aerospace materials, however, remains a limiting factor in their successful use in such applications. This paper first details methods for the practical integration of optical fiber waveguides and cable assemblies onto and into materials and structures. Physical properties of the optical fiber and coatings which affect the survivability of the fiber are then considered. Mechanisms for the transfer of the strain from matrix to fiber for sensor and data bus fibers integrated with composite structural elements are evaluated for their influence on fiber survivability, in applications where strain or impact is imparted to the assembly.

  17. PRATHAM: Parallel Thermal Hydraulics Simulations using Advanced Mesoscopic Methods

    SciTech Connect

    Joshi, Abhijit S; Jain, Prashant K; Mudrich, Jaime A; Popov, Emilian L

    2012-01-01

    At the Oak Ridge National Laboratory, efforts are under way to develop a 3D, parallel LBM code called PRATHAM (PaRAllel Thermal Hydraulic simulations using Advanced Mesoscopic Methods) to demonstrate the accuracy and scalability of LBM for turbulent flow simulations in nuclear applications. The code has been developed in FORTRAN-90 and parallelized using the Message Passing Interface (MPI) library. The Silo library is used to compactly write the data files, and the VisIt visualization software is used to post-process the simulation data in parallel. Both the single-relaxation-time (SRT) and multi-relaxation-time (MRT) LBM schemes have been implemented in PRATHAM. To capture turbulence without prohibitively increasing the grid resolution requirements, an LES approach [5] is adopted, allowing large-scale eddies to be numerically resolved while modeling the smaller (subgrid) eddies. In this work, a Smagorinsky model has been used, which augments the fluid viscosity with an additional eddy viscosity that depends on the magnitude of the rate-of-strain tensor. In LBM, this is achieved by locally varying the relaxation time of the fluid.
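    The Smagorinsky closure described above amounts to a local adjustment of the LBM relaxation time. The sketch below is illustrative only, not PRATHAM's implementation: the lattice-unit relation nu = (tau - 0.5)/3 and the constant 0.17 are common textbook choices, assumed here.

```python
def effective_relaxation_time(tau0, strain_rate_mag, cs_smag=0.17, delta=1.0):
    """Raise the LBM relaxation time by a Smagorinsky eddy viscosity.

    Assumes lattice units with nu = (tau - 0.5)/3 and a subgrid
    viscosity nu_t = (cs_smag * delta)**2 * |S|; the value 0.17 is a
    typical Smagorinsky constant, not necessarily PRATHAM's.
    """
    nu0 = (tau0 - 0.5) / 3.0                          # molecular viscosity
    nu_t = (cs_smag * delta) ** 2 * strain_rate_mag   # subgrid eddy viscosity
    return 0.5 + 3.0 * (nu0 + nu_t)                   # effective relaxation time

# with zero strain rate the relaxation time is unchanged
print(effective_relaxation_time(0.6, 0.0))
```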

  18. Quantifying hydrate solidification front advancing using method of characteristics

    NASA Astrophysics Data System (ADS)

    You, Kehua; DiCarlo, David; Flemings, Peter B.

    2015-10-01

    We develop a one-dimensional analytical solution based on the method of characteristics to explore hydrate formation from gas injection into brine-saturated sediments within the hydrate stability zone. Our solution includes fully coupled multiphase and multicomponent flow and the associated advective transport in a homogeneous system. Our solution shows that hydrate saturation is controlled by the initial thermodynamic state of the system and changed by the gas fractional flow. Hydrate saturation in gas-rich systems can be estimated by 1 - c_l0/c_le when Darcy flow dominates, where c_l0 is the initial mass fraction of salt in brine, and c_le is the mass fraction of salt in brine at three-phase (gas, liquid, and hydrate) equilibrium. Hydrate saturation is constant, gas saturation and gas flux decrease, and liquid saturation and liquid flux increase with the distance from the gas inlet to the hydrate solidification front. The total gas and liquid flux is constant from the gas inlet to the hydrate solidification front and decreases abruptly at the hydrate solidification front due to gas inclusion into the hydrate phase. The advancing velocity of the hydrate solidification front decreases with hydrate saturation at a fixed gas inflow rate. This analytical solution illuminates how hydrate is formed by gas injection (methane, CO2, ethane, propane) at both the laboratory and field scales.
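    The closed-form saturation estimate quoted above is simple enough to evaluate directly. A minimal sketch with illustrative salinities, not the paper's data:

```python
def hydrate_saturation(c_l0, c_le):
    """S_h = 1 - c_l0/c_le for a gas-rich, Darcy-flow-dominated system,
    where c_l0 is the initial salt mass fraction in brine and c_le the
    value at three-phase (gas-liquid-hydrate) equilibrium."""
    if not 0.0 < c_l0 <= c_le:
        raise ValueError("expected 0 < c_l0 <= c_le")
    return 1.0 - c_l0 / c_le

# e.g. brine freshened from 3.5 wt% salt against a 10 wt%
# three-phase equilibrium salinity (illustrative numbers)
print(hydrate_saturation(0.035, 0.10))  # ~ 0.65
```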

  19. Quantitative measurement of analyte gases in a microwave spectrometer using a dynamic sampling method

    NASA Astrophysics Data System (ADS)

    Zhu, Z.; Matthews, I. P.; Samuel, A. H.

    1996-07-01

    This article reports the quantitative measurement of water vapor (absorption line at 22.235 GHz) and ethylene oxide (absorption line at 23.123 GHz) concentrations in different gas mixtures by means of a microwave spectrometer. The problem of absorption line broadening and the gas memory problem inherent in the quantitative analysis of gases using microwave molecular rotational spectroscopy have been solved. The line broadening problem was minimized by gas dilution with nitrogen, and the gas memory problem was effectively reduced by means of a dynamic sampling method. Calibration of ethylene oxide with a dilution factor of 5 demonstrated that the standard deviations of the calibration data were less than 4.2%. A typical ethylene oxide sterilization production cycle was chosen to monitor chamber ethylene oxide concentrations in the gas dwell phase, and the repeatability of these real-time measurements was 2.7%.

  20. Development and evaluation of an improved quantitative 90Y bremsstrahlung SPECT method

    PubMed Central

    Rong, Xing; Du, Yong; Ljungberg, Michael; Rault, Erwann; Vandenberghe, Stefaan; Frey, Eric C.

    2012-01-01

    Purpose: Yttrium-90 (90Y) is one of the most commonly used radionuclides in targeted radionuclide therapy (TRT). Since it decays with essentially no gamma photon emissions, surrogate radionuclides (e.g., 111In) or imaging agents (e.g., 99mTc MAA) are typically used for treatment planning. It would, however, be useful to image 90Y directly in order to confirm that the distributions measured with these other radionuclides or agents are the same as for the 90Y labeled agents. As a result, there has been a great deal of interest in quantitative imaging of 90Y bremsstrahlung photons using single photon emission computed tomography (SPECT) imaging. The continuous and broad energy distribution of bremsstrahlung photons, however, imposes substantial challenges on accurate quantification of the activity distribution. The aim of this work was to develop and evaluate an improved quantitative 90Y bremsstrahlung SPECT reconstruction method appropriate for these imaging applications. Methods: Accurate modeling of image degrading factors such as object attenuation and scatter and the collimator-detector response is essential to obtain quantitatively accurate images. All of the image degrading factors are energy dependent. Thus, the authors separated the modeling of the bremsstrahlung photons into multiple categories and energy ranges. To improve the accuracy, the authors used a bremsstrahlung energy spectrum previously estimated from experimental measurements and incorporated a model of the distance between 90Y decay location and bremsstrahlung emission location into the SIMIND code used to generate the response functions and kernels used in the model. This improved Monte Carlo bremsstrahlung simulation was validated by comparison to experimentally measured projection data of a 90Y line source. The authors validated the accuracy of the forward projection model for photons in the various categories and energy ranges using the validated Monte Carlo (MC) simulation method. The

  1. Electrochemical test methods for advanced battery and semiconductor technology

    NASA Astrophysics Data System (ADS)

    Hsu, Chao-Hung

    This dissertation consists of two studies. The first study was the evaluation of metallic materials for advanced lithium ion batteries and the second study was the determination of the dielectric constant k for low-k materials. The advanced lithium ion battery is miniaturized for implantable medical devices and capable of being recharged from outside the body using magnetic induction without physical connections. The stability of metallic materials employed in the lithium ion battery is one of the major safety concerns. Three types of materials (Pt-Ir alloy, Ti alloys, and stainless steels) were evaluated extensively in this study. The electrochemical characteristics of Pt-Ir alloy, Ti alloys, and stainless steels were evaluated in several types of battery electrolytes in order to determine the candidate materials for long-term use in lithium ion batteries. The dissolution behavior of these materials and the decomposition behavior of the battery electrolyte were investigated using the anodic potentiodynamic polarization (APP) technique. Lifetime prediction for metal dissolution was conducted using the constant potential polarization (CPP) technique. The electrochemical impedance spectroscopy (EIS) technique was employed to investigate the metal dissolution behavior or the battery electrolyte decomposition at the open circuit potential (OCP). The scanning electron microscope (SEM) was used to observe morphology changes after these tests. The effects of experimental factors on the corrosion behaviors of the metallic materials and the stabilities of the battery electrolytes were also investigated using the 2^3 factorial design approach. Integration of materials having a low dielectric constant k as interlayer dielectrics and/or low-resistivity conductors will partially solve the RC delay problem that limits the performance of high-speed logic chips. The samples of JSR LKD 5109 material capped by several materials were evaluated by using EIS. 
The feasibility of using

  2. Development and Applications of Advanced Electronic Structure Methods

    NASA Astrophysics Data System (ADS)

    Bell, Franziska

    This dissertation contributes to three different areas in electronic structure theory. The first part of this thesis advances the fundamentals of orbital active spaces. Orbital active spaces are not only essential in multi-reference approaches, but have also become of interest in single-reference methods as they allow otherwise intractably large systems to be studied. However, despite their great importance, the optimal choice and, more importantly, their physical significance are still not fully understood. In order to address this problem, we studied the higher-order singular value decomposition (HOSVD) in the context of electronic structure methods. We were able to gain a physical understanding of the resulting orbitals and proved a connection to unrelaxed natural orbitals in the case of Møller-Plesset perturbation theory to second order (MP2). In the quest to find the optimal choice of the active space, we proposed a HOSVD for energy-weighted integrals, which yielded the fastest convergence in MP2 correlation energy for small- to medium-sized active spaces to date, and is also potentially transferable to coupled-cluster theory. In the second part, we studied monomeric and dimeric glycerol radical cations and their photo-induced dissociation in collaboration with Prof. Leone and his group. Understanding the mechanistic details involved in these processes is essential for further studies on the combustion of glycerol and carbohydrates. To our surprise, we found that in most cases, the experimentally observed appearance energies arise from the separation of product fragments from one another rather than rearrangement to products. The final chapters of this work focus on the development, assessment, and application of the spin-flip method, which is a single-reference approach but is capable of describing multi-reference problems. Systems exhibiting multi-reference character, which arises from the (near-) degeneracy of orbital energies, are amongst the most

  3. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. We evaluated this

  4. Full quantitative phase analysis of hydrated lime using the Rietveld method

    SciTech Connect

    Lassinantti Gualtieri, Magdalena

    2012-09-15

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2-15 wt.%.
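    The spiking approach lends itself to a one-line check. The sketch below uses the standard crystalline internal-standard relation (the paper validates with glass-spiked samples, a complementary approach); all numbers are made up for illustration:

```python
def amorphous_fraction(w_std, r_std):
    """Amorphous mass fraction of the original (unspiked) sample.

    w_std: weighed-in mass fraction of a crystalline spike,
    r_std: mass fraction the Rietveld refinement reports for it
    (the refinement sees crystalline phases only, so r_std >= w_std
    whenever amorphous material is present)."""
    if not 0.0 < w_std <= r_std <= 1.0:
        raise ValueError("expected 0 < w_std <= r_std <= 1")
    return (1.0 - w_std / r_std) / (1.0 - w_std)

# a 20 wt% spike refined to 1/3 of the crystalline content
print(amorphous_fraction(0.20, 1.0 / 3.0))  # ~ 0.5
```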

  5. Studies of polyester fiber as carrier for microbes in a quantitative test method for disinfectants.

    PubMed

    Miner, Norman; Harris, Valerie; Stumph, Sara; Cobb, Amanda; Ortiz, Jennifer

    2004-01-01

    Tests were conducted by a Task Force on Disinfectant Test Methods that was appointed to investigate controversies regarding the accuracy of AOAC test methods for disinfectants as presented in AOAC's Official Methods of Analysis, Chapter 6. The general principles for new and improved AOAC tests are discussed, and a disinfectant test using microbes labeled onto a polyester fiber surface is described. The quantitative test measures the survival of test microbes as a function of exposure time, as well as the exposure conditions required to kill 6 log10 of the test microbes. The kill times were similar to the kill kinetics of cylinders labeled with Bacillus subtilis as tested by the methods of AOAC Sporicidal Test 966.04. PMID:15164838
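    The 6-log10 pass criterion mentioned above reduces to a simple calculation on recovered viable counts. Illustrative counts, not the Task Force's data:

```python
import math

def log10_reduction(cfu_initial, cfu_survivors):
    """Log10 kill achieved, from viable counts before and after exposure."""
    if cfu_initial <= 0 or cfu_survivors <= 0:
        raise ValueError("counts must be positive")
    return math.log10(cfu_initial / cfu_survivors)

# 10^7 CFU on the carrier, 10 survivors after exposure
lr = log10_reduction(1.0e7, 10.0)
print(lr >= 6.0)  # True: meets the 6-log10 criterion
```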

  6. Development of a HPLC Method for the Quantitative Determination of Capsaicin in Collagen Sponge

    PubMed Central

    Guo, Chun-Lian; Chen, Hong-Ying; Cui, Bi-Ling; Chen, Yu-Huan; Zhou, Yan-Fang; Peng, Xin-Sheng; Wang, Qin

    2015-01-01

    Controlling the concentration of drugs in pharmaceutical products is essential to patient safety. In this study, a simple and sensitive HPLC method is developed to quantitatively analyze capsaicin in collagen sponge. Capsaicin was extracted from the sponge for 30 min by ultrasonic wave extraction with methanol as the solvent. Chromatography used an isocratic mobile phase of acetonitrile-water (70:30) at a flow rate of 1 mL/min, with detection at 280 nm. Capsaicin can be successfully separated with good linearity (the regression equation is A = 9.7182C + 0.8547; R^2 = 1.0) and excellent recovery (99.72%). The mean capsaicin concentration in collagen sponge was 49.32 mg/g (RSD = 1.30%; n = 3). In conclusion, the ultrasonic wave extraction method is simple and the extraction efficiency is high. The HPLC assay has excellent sensitivity and specificity and is a convenient method for capsaicin detection in collagen sponge. This paper is the first to report the quantitative analysis of capsaicin in collagen sponge. PMID:26612986
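    With the reported calibration line A = 9.7182C + 0.8547, converting a measured peak area back to a concentration is a single inversion. The helpers below are illustrative; units follow the original assay:

```python
# reported calibration line: A = SLOPE * C + INTERCEPT
SLOPE, INTERCEPT = 9.7182, 0.8547

def concentration_from_area(area):
    """Invert the calibration line to recover capsaicin concentration."""
    return (area - INTERCEPT) / SLOPE

def percent_recovery(measured, nominal):
    """Recovery as reported in method validation (e.g. 99.72%)."""
    return 100.0 * measured / nominal

# round-trip a synthetic concentration through the calibration line
area = SLOPE * 2.5 + INTERCEPT
print(round(concentration_from_area(area), 6))  # 2.5
```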

  7. Multiresidue method for the quantitation of 20 pesticides in aquatic products.

    PubMed

    Cho, Ha Ra; Park, Jun Seo; Kim, Junghyun; Han, Sang Beom; Choi, Yong Seok

    2015-12-01

    As the consumption of aquatic products increased, the need for regulation of pesticide residues in aquatic products also emerged. Thus, in this study, a scheduled multiple reaction monitoring (sMRM) method employing a novel extraction and purification step based on QuEChERS with EDTA was developed for the simultaneous quantitation of 20 pesticides (alachlor, aldicarb, carbofuran, diazinon, dimethoate, dimethomorph, ethoprophos, ferimzone, fluridone, hexaconazole, iprobenfos, malathion, methidathion, methiocarb, phenthoate, phosalone, phosmet, phosphamidon, pirimicarb, and simazine) in aquatic products. Additionally, the present method was validated in the aspects of specificity, linearity (r ≥ 0.980), sensitivity (the limit of quantitation (LOQ) ≤ 5 ng/g), relative standard deviation, RSD (1.0% ≤ RSD ≤ 19.4%), and recovery (60.1% ≤ recovery ≤ 117.9%). Finally, the validated method was applied for the determination of the 20 pesticide residues in eel and shrimp purchased from local food markets. In the present study, QuEChERS with EDTA was successfully expanded to residual pesticide analysis for the first time. The present method could contribute to the rapid and successful establishment of the positive list system in South Korea. PMID:26466578

  8. A quantitative method for the evaluation of three-dimensional structure of temporal bone pneumatization.

    PubMed

    Hill, Cheryl A; Richtsmeier, Joan T

    2008-10-01

    Temporal bone pneumatization has been included in lists of characters used in phylogenetic analyses of human evolution. While studies suggest that the extent of pneumatization has decreased over the course of human evolution, little is known about the processes underlying these changes or their significance. In short, reasons for the observed reduction and the potential reorganization within pneumatized spaces are unknown. Technological limitations have limited previous analyses of pneumatization in extant and fossil species to qualitative observations of the extent of temporal bone pneumatization. In this paper, we introduce a novel application of quantitative methods developed for the study of trabecular bone to the analysis of pneumatized spaces of the temporal bone. This method utilizes high-resolution X-ray computed tomography (HRXCT) images and quantitative software to estimate three-dimensional parameters (bone volume fractions, anisotropy, and trabecular thickness) of bone structure within defined units of pneumatized spaces. We apply this approach in an analysis of temporal bones of diverse but related primate species, Gorilla gorilla, Pan troglodytes, Homo sapiens, and Papio hamadryas anubis, to illustrate the potential of these methods. In demonstrating the utility of these methods, we show that there are interspecific differences in the bone structure of pneumatized spaces, perhaps reflecting changes in the localized growth dynamics, location of muscle attachments, encephalization, or basicranial flexion. PMID:18715622
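    Of the parameters listed, the bone volume fraction (BV/TV) is the simplest to compute from a thresholded HRXCT subvolume. A pure-Python sketch standing in for the dedicated trabecular-analysis software; the toy volume and threshold are illustrative:

```python
def bone_volume_fraction(voxels, threshold):
    """BV/TV: fraction of voxels at or above a global bone threshold.

    voxels is a nested list [z][y][x] of grayscale values."""
    values = [v for plane in voxels for row in plane for v in row]
    bone = sum(1 for v in values if v >= threshold)
    return bone / len(values)

# toy 4x4x4 volume: exactly half the voxels classified as bone
vol = ([[[1.0] * 4 for _ in range(4)] for _ in range(2)]
       + [[[0.0] * 4 for _ in range(4)] for _ in range(2)])
print(bone_volume_fraction(vol, 0.5))  # 0.5
```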

  9. The quantitative and qualitative recovery of Campylobacter from raw poultry using USDA and Health Canada methods.

    PubMed

    Sproston, E L; Carrillo, C D; Boulter-Bitzer, J

    2014-12-01

    Harmonisation of methods between Canadian government agencies is essential to accurately assess and compare the prevalence and concentrations of Campylobacter present on retail poultry intended for human consumption. The standard qualitative procedure used by Health Canada differs from that used by the USDA for both quantitative and qualitative methods. A comparison of three methods was performed on raw poultry samples obtained from an abattoir to determine whether one method is superior to the others in isolating Campylobacter from chicken carcass rinses. The average percentages of positive samples were 34.72% (95% CI, 29.2-40.2), 39.24% (95% CI, 33.6-44.9), and 39.93% (95% CI, 34.3-45.6) for the US direct-plating method, the US enrichment method, and the Health Canada enrichment method, respectively. Overall there were significant differences when comparing either of the enrichment methods to the direct plating method using McNemar's chi-squared test. On comparison of weekly data (Fisher's exact test), direct plating was only inferior to the enrichment methods on a single occasion. Direct plating is important for enumeration and establishing the concentration of Campylobacter present on raw poultry. However, enrichment methods are also vital to identify positive samples where concentrations are below the detection limit for direct plating. PMID:25084671
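    The paired-method comparison above uses McNemar's chi-squared test, which depends only on the discordant counts. A minimal sketch; the counts in the example are illustrative, not the study's:

```python
def mcnemar_chi2(b, c, continuity=True):
    """McNemar's chi-squared statistic for a paired method comparison.

    b and c are the discordant counts: samples positive by one
    method but negative by the other."""
    if b + c == 0:
        return 0.0
    num = (abs(b - c) - 1) ** 2 if continuity else (b - c) ** 2
    return num / (b + c)

# e.g. 15 rinses positive by enrichment only vs 4 by direct plating only
stat = mcnemar_chi2(15, 4)
print(stat > 3.84)  # True: significant at the 0.05 level (1 df)
```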

  10. Quantitative Methods for Reservoir Characterization and Improved Recovery: Application to Heavy Oil Sands

    SciTech Connect

    Castle, J.W.; Molz, F.J.; Brame, S.E.; Falta, R.W.

    2003-02-07

    Improved prediction of interwell reservoir heterogeneity was needed to increase productivity and to reduce recovery cost for California's heavy oil sands, which contain approximately 2.3 billion barrels of remaining reserves in the Temblor Formation and in other formations of the San Joaquin Valley. This investigation involved application of advanced analytical property-distribution methods conditioned to continuous outcrop control for improved reservoir characterization and simulation.

  11. Quantitative Methods for Reservoir Characterization and Improved Recovery: Application to Heavy Oil Sands

    SciTech Connect

    Castle, James W.; Molz, Fred J.

    2003-02-07

    Improved prediction of interwell reservoir heterogeneity is needed to increase productivity and to reduce recovery cost for California's heavy oil sands, which contain approximately 2.3 billion barrels of remaining reserves in the Temblor Formation and in other formations of the San Joaquin Valley. This investigation involved application of advanced analytical property-distribution methods conditioned to continuous outcrop control for improved reservoir characterization and simulation.

  12. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  13. Bioinformatics Methods and Tools to Advance Clinical Care

    PubMed Central

    Lecroq, T.

    2015-01-01

    Summary Objectives To summarize excellent current research in the field of Bioinformatics and Translational Informatics with application in the health domain and clinical care. Method We provide a synopsis of the articles selected for the IMIA Yearbook 2015, from which we attempt to derive a synthetic overview of current and future activities in the field. As in the previous year, a first step of selection was performed by querying MEDLINE with a list of MeSH descriptors supplemented by a list of terms adapted to the section. Each section editor separately evaluated the set of 1,594 articles, and the evaluation results were merged to retain 15 articles for peer review. Results The selection and evaluation process of this Yearbook's section on Bioinformatics and Translational Informatics yielded four excellent articles regarding data management and genome medicine that are mainly tool-based papers. In the first article, the authors present PPISURV, a tool for uncovering the role of specific genes in cancer survival outcome. The second article describes the classifier PredictSNP, which combines six prediction tools for disease-related mutations. In the third article, by presenting a high-coverage map of the human proteome using high-resolution mass spectrometry, the authors highlight the need for using mass spectrometry to complement genome annotation. The fourth article is also related to patient survival and decision support. The authors present data-mining methods for large-scale datasets of past transplants, with the objective of estimating chances of survival. Conclusions The current research activities still attest to the continuing convergence of Bioinformatics and Medical Informatics, with a focus this year on dedicated tools and methods to advance clinical care. 
Indeed, there is a need for powerful tools for managing and interpreting complex, large-scale genomic and biological datasets, but also a need for user-friendly tools developed for the clinicians in their

  14. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Atencio, James D.

    1984-01-01

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify 233U, 235U and 239Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux, additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as 240Pu, 244Cm and 252Cf, and the spontaneous alpha particle emitter 241Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors, and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.
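    The final step the patent describes, turning assayed per-isotope masses into an overall alpha activity, is a weighted sum over specific activities. The specific-activity values below are approximate published figures included for illustration only, as is the example mass:

```python
# approximate alpha specific activities, Ci per gram (illustrative values)
SPECIFIC_ACTIVITY_CI_PER_G = {
    "Pu239": 0.062,
    "Pu240": 0.227,
    "Am241": 3.43,
    "Cm244": 81.0,
}

def alpha_activity_nci_per_g(isotope_masses_g, sample_mass_g):
    """Overall alpha activity (nCi/g) from assayed isotope masses."""
    total_ci = sum(SPECIFIC_ACTIVITY_CI_PER_G[iso] * m
                   for iso, m in isotope_masses_g.items())
    return total_ci * 1.0e9 / sample_mass_g   # Ci -> nCi, per gram of waste

# 1 ng of 241Am in a 1 g sample: ~3.43 nCi/g, above a 1 nCi/g threshold
print(alpha_activity_nci_per_g({"Am241": 1.0e-9}, 1.0))
```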

  15. Quantitative Decomposition of Dynamics of Mathematical Cell Models: Method and Application to Ventricular Myocyte Models

    PubMed Central

    Shimayoshi, Takao; Cha, Chae Young; Amano, Akira

    2015-01-01

    Mathematical cell models are effective tools to understand cellular physiological functions precisely. For detailed analysis of model dynamics in order to investigate how much each component affects cellular behaviour, mathematical approaches are essential. This article presents a numerical analysis technique, which is applicable to any complicated cell model formulated as a system of ordinary differential equations, to quantitatively evaluate contributions of respective model components to the model dynamics in the intact situation. The present technique employs a novel mathematical index for decomposed dynamics with respect to each differential variable, along with a concept named instantaneous equilibrium point, which represents the trend of a model variable at some instant. This article also illustrates applications of the method to comprehensive myocardial cell models, analysing the mechanisms of action potential generation and the calcium transient. The analysis results exhibit quantitative contributions of individual channel gating mechanisms and ion exchanger activities to membrane repolarization, and of calcium fluxes and buffers to the rise and decay of the cytosolic calcium level. These analyses quantitatively explicate the principles of the model, which leads to a better understanding of cellular dynamics. PMID:26091413
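    The core idea of attributing an instantaneous rate to its components can be illustrated on a toy membrane equation. The currents and parameters below are hypothetical and do not come from the myocyte models analysed in the paper:

```python
def decompose_rate(contributions, state):
    """Evaluate each component's contribution to a rate and their sum.

    contributions maps a component name to a function of the state."""
    parts = {name: f(state) for name, f in contributions.items()}
    return parts, sum(parts.values())

# toy decomposition of a membrane current balance (hypothetical currents)
currents = {
    "I_K":    lambda s: 0.5 * (s["V"] + 90.0),
    "I_Na":   lambda s: -0.2 * (s["V"] - 60.0),
    "I_leak": lambda s: 0.1 * (s["V"] + 70.0),
}
parts, total = decompose_rate(currents, {"V": -80.0})
print(sorted(parts))  # per-component contributions at V = -80 mV
```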

  16. Quantitative validation of the 3D SAR profile of hyperthermia applicators using the gamma method.

    PubMed

    de Bruijne, Maarten; Samaras, Theodoros; Chavannes, Nicolas; van Rhoon, Gerard C

    2007-06-01

    For quality assurance of hyperthermia treatment planning systems, quantitative validation of the electromagnetic model of an applicator is essential. The objective of this study was to validate a finite-difference time-domain (FDTD) model implementation of the Lucite cone applicator (LCA) for superficial hyperthermia. The validation involved (i) the assessment of the match between the predicted and measured 3D specific absorption rate (SAR) distribution, and (ii) the assessment of the ratio between model power and real-world power. The 3D SAR distribution of seven LCAs was scanned in a phantom bath using the DASY4 dosimetric measurement system. The same set-up was modelled in SEMCAD X. The match between the predicted and the measured SAR distribution was quantified with the gamma method, which combines distance-to-agreement and dose difference criteria. Good quantitative agreement was observed: more than 95% of the measurement points met the acceptance criteria 2 mm/2% for all applicators. The ratio between measured and predicted power absorption ranged from 0.75 to 0.92 (mean 0.85). This study shows that quantitative validation of hyperthermia applicator models is feasible and is worth considering as a part of hyperthermia quality assurance procedures. PMID:17505090
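    The gamma comparison itself is easy to sketch in one dimension: each reference point searches nearby measured points for the best combined distance/dose agreement. The criteria mirror the 2 mm/2% quoted above; the profiles are illustrative, not measured SAR data:

```python
def gamma_index_1d(reference, measured, dx_mm, dta_mm=2.0, dose_frac=0.02):
    """1D gamma values for a measured profile against a reference.

    Dose differences are taken relative to the reference maximum;
    a point passes the acceptance criteria when its gamma is <= 1."""
    d_max = max(reference)
    gammas = []
    for i, r in enumerate(reference):
        best = float("inf")
        for j, m in enumerate(measured):
            dist = (i - j) * dx_mm                  # spatial offset, mm
            dose = (m - r) / d_max                  # relative dose difference
            best = min(best, (dist / dta_mm) ** 2 + (dose / dose_frac) ** 2)
        gammas.append(best ** 0.5)
    return gammas

profile = [0.2, 0.6, 1.0, 0.6, 0.2]
print(max(gamma_index_1d(profile, profile, dx_mm=1.0)))  # 0.0: identical profiles
```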

  17. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, J.T.; Kunz, W.E.; Atencio, J.D.

    1982-03-31

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify 233U, 235U and 239Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux, additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as 240Pu, 244Cm and 252Cf, and the spontaneous alpha particle emitter 241Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors, and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.

  18. Comparison of two quantitative fit-test methods using N95 filtering facepiece respirators.

    PubMed

    Sietsema, Margaret; Brosseau, Lisa M

    2016-08-01

    Current regulations require annual fit testing before an employee can wear a respirator during work activities. The goal of this research was to determine whether respirator fit measured with two TSI Portacount instruments simultaneously sampling ambient particle concentrations inside and outside of the respirator facepiece is similar to fit measured during an ambient aerosol condensation nuclei counter quantitative fit test. Sixteen subjects (ten female; six male) were recruited to span a range of facial sizes. Each subject donned an N95 filtering facepiece respirator and completed two fit tests in random order (the ambient aerosol condensation nuclei counter quantitative fit test and the two-instrument real-time fit test) without removing or adjusting the respirator between tests. Fit tests were compared using Spearman's rank correlation coefficients. The real-time two-instrument fit factors were similar to those measured with the single-instrument quantitative fit test. The first four exercises were highly correlated (r > 0.7) between the two protocols. Respirator fit was altered during the talking and grimace exercises, both of which involve facial movements that could dislodge the facepiece. Our analyses suggest that the new real-time two-instrument methodology can be used in future studies to evaluate fit before and during work activities. PMID:26963561
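A quantitative fit factor is conventionally the ratio of the particle concentration outside the facepiece to that measured inside, and the study compares protocols with Spearman's rank correlation. A small sketch under those assumptions (all concentration values are hypothetical):

```python
import numpy as np

def fit_factor(c_outside, c_inside):
    """Quantitative fit factor: ambient particle concentration divided by
    the concentration leaking inside the facepiece."""
    return np.asarray(c_outside, float) / np.asarray(c_inside, float)

def spearman_rho(a, b):
    """Spearman rank correlation (no tie handling): Pearson on ranks."""
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra * rb).sum() / np.sqrt((ra ** 2).sum() * (rb ** 2).sum()))

# hypothetical per-exercise concentration pairs for the two protocols
ff_single = fit_factor([5000, 4800, 5100, 4900], [40, 52, 38, 61])
ff_realtime = fit_factor([5050, 4750, 5200, 4850], [43, 55, 35, 66])
print(spearman_rho(ff_single, ff_realtime))  # → 1.0
```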

  19. Quantitative analysis of single amino acid variant peptides associated with pancreatic cancer in serum by an isobaric labeling quantitative method.

    PubMed

    Nie, Song; Yin, Haidi; Tan, Zhijing; Anderson, Michelle A; Ruffin, Mack T; Simeone, Diane M; Lubman, David M

    2014-12-01

    Single amino acid variations are highly associated with many human diseases. The direct detection in serum of peptides containing single amino acid variants (SAAVs) derived from nonsynonymous single nucleotide polymorphisms (SNPs) can provide unique opportunities for SAAV-associated biomarker discovery. In the present study, an isobaric labeling quantitative strategy was applied to identify and quantify variant peptides in serum samples from pancreatic cancer patients and benign controls. The largest set of SAAV peptides in serum reported to date, comprising 96 unique variant peptides, was quantified in this analysis; five of these variant peptides showed a statistically significant difference between pancreatic cancer and controls (p-value < 0.05). Significant differences in the variant peptide SDNCEDTPEAGYFAVAVVK from serotransferrin were detected between pancreatic cancer and controls, and were further validated by selected reaction monitoring (SRM) analysis. The novel biomarker panel obtained by combining α-1-antichymotrypsin (AACT), thrombospondin-1 (THBS1) and this variant peptide showed excellent diagnostic performance in discriminating pancreatic cancer from healthy controls (AUC = 0.98) and chronic pancreatitis (AUC = 0.90). These results suggest that large-scale analysis of SAAV peptides in serum may provide a new direction for biomarker discovery research. PMID:25393578

  20. ADVANCED SEISMIC BASE ISOLATION METHODS FOR MODULAR REACTORS

    SciTech Connect

    E. Blanford; E. Keldrauk; M. Laufer; M. Mieler; J. Wei; B. Stojadinovic; P.F. Peterson

    2010-09-20

    Advanced technologies for structural design and construction have the potential for major impact not only on nuclear power plant construction time and cost, but also on the design process and on the safety, security and reliability of next generation of nuclear power plants. In future Generation IV (Gen IV) reactors, structural and seismic design should be much more closely integrated with the design of nuclear and industrial safety systems, physical security systems, and international safeguards systems. Overall reliability will be increased, through the use of replaceable and modular equipment, and through design to facilitate on-line monitoring, in-service inspection, maintenance, replacement, and decommissioning. Economics will also receive high design priority, through integrated engineering efforts to optimize building arrangements to minimize building heights and footprints. Finally, the licensing approach will be transformed by becoming increasingly performance based and technology neutral, using best-estimate simulation methods with uncertainty and margin quantification. In this context, two structural engineering technologies, seismic base isolation and modular steel-plate/concrete composite structural walls, are investigated. These technologies have major potential to (1) enable standardized reactor designs to be deployed across a wider range of sites, (2) reduce the impact of uncertainties related to site-specific seismic conditions, and (3) alleviate reactor equipment qualification requirements. For Gen IV reactors the potential for deliberate crashes of large aircraft must also be considered in design. This report concludes that base-isolated structures should be decoupled from the reactor external event exclusion system. As an example, a scoping analysis is performed for a rectangular, decoupled external event shell designed as a grillage. 
This report also reviews modular construction technology, particularly steel-plate/concrete construction using

  1. A comparison of quantitative methods for clinical imaging with hyperpolarized ¹³C-pyruvate.

    PubMed

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized ¹³C-labelled molecules, such as the conversion of [1-¹³C]pyruvate to [1-¹³C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized ¹³C data. PMID:27414749
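The two model-free measures named here, lactate time-to-peak and the lactate-to-pyruvate area-under-the-curve ratio, are straightforward to compute from sampled signal curves. A sketch on synthetic curves (the curve shapes, names and parameters are illustrative, not from the study):

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal area under a sampled curve."""
    y = np.asarray(y, float)
    x = np.asarray(x, float)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def model_free_metrics(t, pyruvate, lactate):
    """Model-free summaries of the pyruvate->lactate exchange:
    lactate time-to-peak and lactate-to-pyruvate AUC ratio."""
    t = np.asarray(t, float)
    ttp = float(t[np.argmax(lactate)])
    auc_ratio = trapezoid(lactate, t) / trapezoid(pyruvate, t)
    return ttp, auc_ratio

# synthetic curves: a pyruvate bolus followed by a delayed lactate peak
t = np.linspace(0.0, 60.0, 121)
pyr = np.exp(-((t - 10.0) / 6.0) ** 2)
lac = 0.4 * np.exp(-((t - 25.0) / 10.0) ** 2)
ttp, ratio = model_free_metrics(t, pyr, lac)
print(ttp)  # → 25.0
```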

  2. A comparison of quantitative methods for clinical imaging with hyperpolarized 13C‐pyruvate

    PubMed Central

    Daniels, Charlie J.; McLean, Mary A.; Schulte, Rolf F.; Robb, Fraser J.; Gill, Andrew B.; McGlashan, Nicholas; Graves, Martin J.; Schwaiger, Markus; Lomas, David J.; Brindle, Kevin M.

    2016-01-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized 13C‐labelled molecules, such as the conversion of [1‐13C]pyruvate to [1‐13C]lactate, to be dynamically and non‐invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model‐free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two‐way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time‐to‐peak and the lactate‐to‐pyruvate area under the curve ratio were simple model‐free approaches that accurately represented the full reaction, with the time‐to‐peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized 13C data. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd. PMID:27414749

  3. Prognostic Value of Quantitative Metabolic Metrics on Baseline Pre-Sunitinib FDG PET/CT in Advanced Renal Cell Carcinoma

    PubMed Central

    Minamimoto, Ryogo; Barkhodari, Amir; Harshman, Lauren; Srinivas, Sandy; Quon, Andrew

    2016-01-01

    Purpose: The objective of this study was to prospectively evaluate various quantitative metrics on FDG PET/CT for monitoring sunitinib therapy and predicting prognosis in patients with metastatic renal cell cancer (mRCC). Methods: Seventeen patients (mean age: 59.0 ± 11.6 years) prospectively underwent a baseline FDG PET/CT and an interim PET/CT after 2 cycles (12 weeks) of sunitinib therapy. From the baseline and interim scans we measured the highest maximum standardized uptake value across all identified lesions (highest SUVmax), the sum of SUVmax over up to six lesions (sum of SUVmax), total lesion glycolysis (TLG) and metabolic tumor volume (MTV), as well as the percentage decreases in each metric between baseline and interim PET/CT (%Δ highest SUVmax, %Δ sum of SUVmax, %ΔTLG and %ΔMTV). The imaging results were validated against clinical follow-up for progression-free survival (PFS) at 12 months after completion of therapy. Results: At 12-month follow-up, 6/17 (35.3%) patients had achieved PFS, while 11/17 (64.7%) were deemed to have disease progression or recurrence within the previous 12 months. At baseline, PET/CT demonstrated metabolically active cancer in all cases. Using baseline PET/CT alone, all of the quantitative imaging metrics were predictive of PFS. Using interim PET/CT, %Δ highest SUVmax, %Δ sum of SUVmax, and %ΔTLG were also predictive of PFS; otherwise, interim PET/CT showed no significant difference between the two survival groups regardless of the quantitative metric utilized, including MTV and TLG. Conclusions: Quantitative metabolic measurements on baseline PET/CT appear to be predictive of PFS at 12 months post-therapy in patients scheduled to undergo sunitinib therapy for mRCC. The change between baseline and interim PET/CT also appeared to have prognostic value, but interim PET/CT after 12 weeks of sunitinib alone did not appear to be predictive of PFS. PMID:27123976
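For reference, TLG is conventionally the product of the metabolic tumor volume and the mean SUV within that volume, and the %Δ metrics are simple percentage decreases between scans. A sketch with invented lesion numbers:

```python
def total_lesion_glycolysis(mtv_ml, suv_mean):
    """TLG = metabolic tumor volume (mL) x mean SUV within that volume."""
    return mtv_ml * suv_mean

def percent_decrease(baseline, interim):
    """%Δ between baseline and interim PET/CT (positive = metabolic response)."""
    return 100.0 * (baseline - interim) / baseline

# hypothetical lesion: MTV 20 mL at SUVmean 5.0 at baseline,
# shrinking to 12 mL at SUVmean 4.0 after two cycles of therapy
tlg_base = total_lesion_glycolysis(20.0, 5.0)      # 100.0
tlg_interim = total_lesion_glycolysis(12.0, 4.0)   # 48.0
print(percent_decrease(tlg_base, tlg_interim))     # → 52.0
```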

  4. Tentative method for the qualitative detection and quantitative assessment of air contamination by drugs.

    PubMed

    Buogo, A; Eboli, V

    1972-06-01

    A method for detecting and measuring air contamination by drugs is described which uses an electrostatic bacterial air sampler, sprayers for micronizing drugs, and Mueller-Hinton medium seeded with a highly susceptible strain of Sarcina lutea. Three antibiotics (penicillin, tetracycline, aminosidine) and a sulfonamide (sulfapyrazine) were identified by pretreating portions of medium, showing no bacterial growth, with penicillinase or p-aminobenzoic acid solution and subsequently determining how both drug-susceptible and drug-resistant strains of Staphylococcus aureus were affected by this pretreatment. Quantitative determinations were also attempted by measuring the size of the inhibition zones. PMID:4483536

  5. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, L.J.; Cremers, D.A.

    1982-09-07

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids are described. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is shown. Significant shortening of analysis time is achieved compared with the usual chemical techniques of analysis.

  6. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, Leon J.; Cremers, David A.

    1985-01-01

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is demonstrated. Significant shortening of analysis time is achieved compared with the usual chemical techniques of analysis.

  7. Quantitative methods for the analysis of protein phosphorylation in drug development.

    PubMed

    Olive, D Michael

    2004-10-01

    Most signal transduction and cell signaling pathways are mediated by protein kinases. Protein kinases have emerged as important cellular regulatory proteins in many aspects of neoplasia. Protein kinase inhibitors offer the opportunity to target diseases such as cancer with chemotherapeutic agents specific for the causative molecular defect. In order to identify possible targets and assess kinase inhibitors, quantitative methods for analyzing protein phosphorylation have been developed. This review examines some of the current formats used for quantifying kinase function for drug development. PMID:15966829

  8. Tentative Method for the Qualitative Detection and Quantitative Assessment of Air Contamination by Drugs

    PubMed Central

    Buogo, A.; Eboli, V.

    1972-01-01

    A method for detecting and measuring air contamination by drugs is described which uses an electrostatic bacterial air sampler, sprayers for micronizing drugs, and Mueller-Hinton medium seeded with a highly susceptible strain of Sarcina lutea. Three antibiotics (penicillin, tetracycline, aminosidine) and a sulfonamide (sulfapyrazine) were identified by pretreating portions of medium, showing no bacterial growth, with penicillinase or p-aminobenzoic acid solution and subsequently determining how both drug-susceptible and drug-resistant strains of Staphylococcus aureus were affected by this pretreatment. Quantitative determinations were also attempted by measuring the size of the inhibition zones. PMID:4483536

  9. Fast discrimination of danshen from different geographical areas by NIR spectroscopy and advanced cluster analysis method

    NASA Astrophysics Data System (ADS)

    Li, Ning; Wang, Yan; Xu, Kexin

    2006-09-01

    Near infrared (NIR) diffuse reflection spectroscopy is an effective way to perform quantitative analysis without the need for sample pretreatment. In this paper, NIR Fourier transform infrared (FTIR) spectroscopy is introduced to probe the spectral features of the traditional Chinese medicine Danshen, and infrared fingerprint spectra of Danshen are established. The influence of differentiation of the spectra is also discussed. After pretreatment and derivation of the spectral data, principal component analysis (PCA), soft independent modeling of class analogy (SIMCA) and artificial neural network (ANN) methods are combined to sort the geographical origins of 53 samples by local modeling. The results show that, as a basis for the other two methods, PCA is the more efficient one for identifying the geographical origins of Danshen. Combining SIMCA with PCA, an effective model is built to analyze the data after normalization and differentiation; the correct identification rate reaches above 90%. Then 36 samples are chosen as the training set, with the other 17 samples serving as the validation set. Using an ANN-based back-propagation method, after proper training of the BP network, the origins of Danshen are completely classified. Therefore, combined with advanced mathematical analysis, NIR diffuse spectroscopy can be a novel and rapid way to accurately evaluate the origin of a Chinese medicine, and also to accelerate the modernization of Chinese drugs.
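A SIMCA-style classifier builds a separate PCA model per geographical origin and assigns a sample to the class whose model leaves the smallest unexplained residual. A self-contained sketch on synthetic "spectra" (the data, component count and class structure are illustrative assumptions, not the paper's):

```python
import numpy as np

def pca_fit(X, n_components):
    """Fit PCA by SVD on mean-centered data; return the mean and loadings."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def simca_residual(x, mean, loadings):
    """SIMCA-style distance: norm of the part of x that the class PCA
    model fails to explain."""
    centered = x - mean
    recon = loadings.T @ (loadings @ centered)
    return float(np.linalg.norm(centered - recon))

rng = np.random.default_rng(0)
ramp_up = np.linspace(0.0, 1.0, 50)
# two hypothetical "geographical origins" with distinct spectral baselines
class_a = rng.normal(0.0, 0.05, (20, 50)) + ramp_up
class_b = rng.normal(0.0, 0.05, (20, 50)) + ramp_up[::-1]
models = [pca_fit(class_a, 3), pca_fit(class_b, 3)]

sample = ramp_up + rng.normal(0.0, 0.05, 50)  # resembles origin A
label = int(np.argmin([simca_residual(sample, m, l) for m, l in models]))
print(label)  # → 0
```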

  10. A method for the quantitative evaluation of SAR distribution in deep regional hyperthermia.

    PubMed

    Baroni, C; Giri, M G; Meliadó, G; Maluta, S; Chierego, G

    2001-01-01

    The Specific Absorption Rate (SAR) distribution pattern visualization by a matrix of E-field light-emitting sensors has been demonstrated to be a useful tool for evaluating the characteristics of the applicators used in deep regional hyperthermia and for performing a quality assurance programme. A method to quantify the SAR from photographs of the sensor array--the so-called 'Power Stepping Technique'--has already been proposed. This paper presents a new approach to the quantitative determination of the SAR profiles in a liquid phantom exposed to electromagnetic fields from the Sigma-60 applicator (BSD-2000 system for deep regional hyperthermia). The method is based on the construction of a 'calibration curve' modelling the light output of an E-field sensor as a function of the supplied voltage, and on the use of a reference light source to 'normalize' the light-output readings from the photos of the sensor array, in order to minimize the errors introduced by the non-uniformity of the photographic process. Once the calibration curve is obtained, it is possible, with only one photo, to obtain the quantitative SAR distribution in the operating conditions. For this reason, this method is suitable for equipment characterization and also for controlling the repeatability of power deposition over time. PMID:11587076
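The core of the approach is fitting a calibration curve of light output versus supplied voltage, then normalizing each photographed reading against the reference source before inverting the curve. A sketch under those assumptions (all calibration numbers are invented):

```python
import numpy as np

# hypothetical calibration data: sensor light output (arbitrary units)
# recorded at known supplied voltages
volts = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
light = np.array([2.1, 8.5, 19.2, 33.9, 53.0])   # monotonic response

def light_to_voltage(reading, ref_reading, ref_expected):
    """Normalize a photographed light-output reading against the reference
    light source, then invert the (monotonic) calibration curve by
    interpolation to recover the sensor drive voltage."""
    normalized = reading * ref_expected / ref_reading
    return float(np.interp(normalized, light, volts))

print(light_to_voltage(19.2, 1.0, 1.0))   # → 30.0
# a photo whose reference source reads 10% dim is scaled back up first
print(light_to_voltage(17.28, 0.9, 1.0))  # ≈ 30.0 again
```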

  11. A Bead-Based Method for Multiplexed Identification and Quantitation of DNA Sequences Using Flow Cytometry

    PubMed Central

    Spiro, Alexander; Lowe, Mary; Brown, Drew

    2000-01-01

    A new multiplexed, bead-based method which utilizes nucleic acid hybridizations on the surface of microscopic polystyrene spheres to identify specific sequences in heterogeneous mixtures of DNA sequences is described. The method consists of three elements: beads (5.6-μm diameter) with oligomer capture probes attached to the surface, three fluorophores for multiplexed detection, and flow cytometry instrumentation. Two fluorophores are impregnated within each bead in varying amounts to create different bead types, each associated with a unique probe. The third fluorophore is a reporter. Following capture of fluorescent cDNA sequences from environmental samples, the beads are analyzed by flow cytometric techniques which yield a signal intensity for each capture probe proportional to the amount of target sequences in the analyte. In this study, a direct hybrid capture assay was developed and evaluated with regard to sequence discrimination and quantitation of abundances. The target sequences (628 to 728 bp in length) were obtained from the 16S/23S intergenic spacer region of microorganisms collected from polluted groundwater at the nuclear waste site in Hanford, Wash. A fluorescence standard consisting of beads with a known number of fluorescent DNA molecules on the surface was developed, and the resolution, sensitivity, and lower detection limit for measuring abundances were determined. The results were compared with those of a DNA microarray using the same sequences. The bead method exhibited far superior sequence discrimination and possesses features which facilitate accurate quantitation. PMID:11010868
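Bead decoding amounts to locating each flow-cytometry event in the 2D space of the two embedded classification fluorophores and assigning it to the nearest bead type; the third (reporter) channel then quantifies the captured target for that probe. An illustrative sketch (the bead-type intensities and probe names are hypothetical):

```python
# hypothetical bead-type "codes": intensities of the two embedded
# classification fluorophores (arbitrary units)
BEAD_TYPES = {
    "probe_A": (100.0, 900.0),
    "probe_B": (500.0, 500.0),
    "probe_C": (900.0, 100.0),
}

def decode_bead(fl1, fl2):
    """Assign an event to the nearest bead type in the 2D
    classification-fluorophore space (squared Euclidean distance)."""
    return min(BEAD_TYPES, key=lambda k: (BEAD_TYPES[k][0] - fl1) ** 2
                                         + (BEAD_TYPES[k][1] - fl2) ** 2)

print(decode_bead(480.0, 530.0))  # → probe_B
```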

  12. Composition and quantitation of microalgal lipids by ERETIC ¹H NMR method.

    PubMed

    Nuzzo, Genoveffa; Gallo, Carmela; d'Ippolito, Giuliana; Cutignano, Adele; Sardo, Angela; Fontana, Angelo

    2013-10-01

    Accurate characterization of biomass constituents is a crucial aspect of research in the biotechnological application of natural products. Here we report an efficient, fast and reproducible method for the identification and quantitation of fatty acids and complex lipids (triacylglycerols, glycolipids, phospholipids) in microalgae under investigation for the development of functional health products (probiotics, food ingredients, drugs, etc.) or third generation biofuels. The procedure consists of extraction of the biological matrix by modified Folch method and direct analysis of the resulting material by proton nuclear magnetic resonance (¹H NMR). The protocol uses a reference electronic signal as external standard (ERETIC method) and allows assessment of total lipid content, saturation degree and class distribution in both high throughput screening of algal collection and metabolic analysis during genetic or culturing studies. As proof of concept, the methodology was applied to the analysis of three microalgal species (Thalassiosira weissflogii, Cyclotella cryptica and Nannochloropsis salina) which drastically differ for the qualitative and quantitative composition of their fatty acid-based lipids. PMID:24084790

  13. A quantitative solid-state Raman spectroscopic method for control of fungicides.

    PubMed

    Ivanova, Bojidarka; Spiteller, Michael

    2012-07-21

    A new analytical procedure using solid-state Raman spectroscopy within the THz region for the quantitative determination of mixtures of different conformations of trifloxystrobin (EE, EZ, ZE and ZZ), tebuconazole (1), and propiconazole (2) has been developed and validated as an effective method for fungicide product quality monitoring programmes and control. The obtained quantities were controlled independently by validated hybrid HPLC electrospray ionization (ESI) tandem mass spectrometric (MS) and matrix-assisted laser desorption/ionization (MALDI) MS methods in the condensed phase. The quantitative dependences were obtained on twenty binary mixtures of the analytes and were further tested on three trade fungicide products, containing mixtures of trifloxystrobin-tebuconazole and trifloxystrobin-propiconazole, as an emulsifiable concentrate or water-soluble granules of the active ingredients. The present methods provided sufficient sensitivity, as reflected by the metrological quantities evaluated: the concentration limit of detection (LOD) and limit of quantification (LOQ), linear limit (LL), measurement accuracy and precision, true quantity value, and trueness of measurement. PMID:22679621

  14. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 tooth samples classified by International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences among the caries lesions were concentrated in the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of the 565-750 nm waveband to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems.
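The grading rule described can be applied directly to measured color components. A sketch using the thresholds reported in the abstract (the input RGB values are invented):

```python
def classify_caries(r, g, b):
    """Grade caries tissue from the R/(G+B) color-component ratio using
    the thresholds reported in the abstract: <0.66 sound, 0.66-1.06
    early decay, 1.06-1.62 established decay, >1.62 severe decay."""
    ratio = r / (g + b)
    if ratio < 0.66:
        return "sound"
    elif ratio < 1.06:
        return "early decay"
    elif ratio < 1.62:
        return "established decay"
    return "severe decay"

print(classify_caries(120, 110, 90))  # ratio 0.60 → sound
print(classify_caries(180, 70, 40))   # ratio ≈ 1.64 → severe decay
```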

  15. Generalized multiple internal standard method for quantitative liquid chromatography mass spectrometry.

    PubMed

    Hu, Yuan-Liang; Chen, Zeng-Ping; Chen, Yao; Shi, Cai-Xia; Yu, Ru-Qin

    2016-05-01

    In this contribution, a multiplicative effects model for the generalized multiple-internal-standard method (MEMGMIS) was proposed to solve the problem of LC-MS signal instability over time. The MEMGMIS model seamlessly integrates the multiple-internal-standard strategy with a multivariate calibration method, and makes full use of all the information carried by multiple internal standards during the quantification of target analytes. Unlike existing methods based on multiple internal standards, MEMGMIS does not require selecting an optimal internal standard for the quantification of a specific analyte. MEMGMIS was applied to a proof-of-concept model system: the simultaneous quantitative analysis of five edible artificial colorants in two kinds of cocktail drinks. Experimental results demonstrated that MEMGMIS models established on LC-MS data of calibration samples prepared with ultrapure water could provide quite satisfactory concentration predictions for colorants in cocktail samples from their LC-MS data measured 10 days after the LC-MS analysis of the calibration samples. The average relative prediction errors of MEMGMIS models did not exceed 6.0%, considerably better than the corresponding values of commonly used univariate calibration models combined with multiple internal standards. The advantages of good performance and simple implementation render the MEMGMIS model a promising alternative tool in quantitative LC-MS assays. PMID:27072522
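While the paper's MEMGMIS model is a full multivariate calibration, the underlying idea of pooling several internal standards can be illustrated with a simple multiplicative correction by their geometric mean (a simplified stand-in for illustration, not the authors' model; all signal values are invented):

```python
import numpy as np

def is_corrected_response(analyte_signal, is_signals):
    """Correct an analyte signal for instrument drift by dividing by the
    geometric mean of several co-measured internal standards."""
    return analyte_signal / float(np.exp(np.mean(np.log(is_signals))))

# a 20% global sensitivity drop scales the analyte and all standards
# alike, so the corrected response is unchanged across runs
day1 = is_corrected_response(1000.0, [500.0, 800.0, 650.0])
day10 = is_corrected_response(800.0, [400.0, 640.0, 520.0])
print(round(day1, 6) == round(day10, 6))  # → True
```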

  16. Quantitative method for measurement of the Goos-Hanchen effect based on source divergence considerations

    SciTech Connect

    Gray, Jeffrey F.; Puri, Ashok

    2007-06-15

    In this paper we report on a method for quantitative measurement and characterization of the Goos-Hanchen effect based upon the real world performance of optical sources. A numerical model of a nonideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hanchen shift equations to determine beam shift displacement characteristics, which provides quantitative estimates of finite shifts near critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end a line spread Green's function is defined which can be used to determine the effective transfer function of the near critical angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse square root nature of the Goos-Hanchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomena to attain refractive index sensitivities of the order of 10⁻⁶, comparable with the recent results reported in the literature.

  17. Processing of alnico permanent magnets by advanced directional solidification methods

    DOE PAGESBeta

    Zou, Min; Johnson, Francis; Zhang, Wanming; Zhao, Qi; Rutkowski, Stephen F.; Zhou, Lin; Kramer, Matthew J.

    2016-07-05

    Advanced directional solidification methods have been used to produce large (>15 cm length) castings of Alnico permanent magnets with highly oriented columnar microstructures. In combination with subsequent thermomagnetic and draw thermal treatment, this method was used to enable the high coercivity, high-Titanium Alnico composition of 39% Co, 29.5% Fe, 14% Ni, 7.5% Ti, 7% Al, 3% Cu (wt%) to have an intrinsic coercivity (Hci) of 2.0 kOe, a remanence (Br) of 10.2 kG, and an energy product (BH)max of 10.9 MGOe. These properties compare favorably to typical properties for the commercial Alnico 9. Directional solidification of higher Ti compositions yielded anisotropic columnar grained microstructures if high heat extraction rates through the mold surface of at least 200 kW/m2 were attained. This was achieved through the use of a thin walled (5 mm thick) high thermal conductivity SiC shell mold extracted from a molten Sn bath at a withdrawal rate of at least 200 mm/h. However, higher Ti compositions did not result in further increases in magnet performance. Images of the microstructures collected by scanning electron microscopy (SEM) reveal a majority α phase with inclusions of secondary αγ phase. Transmission electron microscopy (TEM) reveals that the α phase has a spinodally decomposed microstructure of FeCo-rich needles in a NiAl-rich matrix. In the 7.5% Ti composition the diameter distribution of the FeCo needles was bimodal with the majority having diameters of approximately 50 nm with a small fraction having diameters of approximately 10 nm. The needles formed a mosaic pattern and were elongated along one <001> crystal direction (parallel to the field used during magnetic annealing). Cu precipitates were observed between the needles. Regions of abnormal spinodal morphology appeared to correlate with secondary phase precipitates. The presence of these abnormalities did not prevent the material from displaying superior magnetic properties in the 7.5% Ti

  18. Processing of alnico permanent magnets by advanced directional solidification methods

    NASA Astrophysics Data System (ADS)

    Zou, Min; Johnson, Francis; Zhang, Wanming; Zhao, Qi; Rutkowski, Stephen F.; Zhou, Lin; Kramer, Matthew J.

    2016-12-01

    Advanced directional solidification methods have been used to produce large (>15 cm length) castings of Alnico permanent magnets with highly oriented columnar microstructures. In combination with subsequent thermomagnetic and draw thermal treatment, this method was used to enable the high coercivity, high-Titanium Alnico composition of 39% Co, 29.5% Fe, 14% Ni, 7.5% Ti, 7% Al, 3% Cu (wt%) to have an intrinsic coercivity (Hci) of 2.0 kOe, a remanence (Br) of 10.2 kG, and an energy product (BH)max of 10.9 MGOe. These properties compare favorably to typical properties for the commercial Alnico 9. Directional solidification of higher Ti compositions yielded anisotropic columnar grained microstructures if high heat extraction rates through the mold surface of at least 200 kW/m2 were attained. This was achieved through the use of a thin walled (5 mm thick) high thermal conductivity SiC shell mold extracted from a molten Sn bath at a withdrawal rate of at least 200 mm/h. However, higher Ti compositions did not result in further increases in magnet performance. Images of the microstructures collected by scanning electron microscopy (SEM) reveal a majority α phase with inclusions of secondary αγ phase. Transmission electron microscopy (TEM) reveals that the α phase has a spinodally decomposed microstructure of FeCo-rich needles in a NiAl-rich matrix. In the 7.5% Ti composition the diameter distribution of the FeCo needles was bimodal with the majority having diameters of approximately 50 nm with a small fraction having diameters of approximately 10 nm. The needles formed a mosaic pattern and were elongated along one <001> crystal direction (parallel to the field used during magnetic annealing). Cu precipitates were observed between the needles. Regions of abnormal spinodal morphology appeared to correlate with secondary phase precipitates. The presence of these abnormalities did not prevent the material from displaying superior magnetic properties in the 7.5% Ti

  19. New Method for the Quantitative Analysis of Smear Slides in Pelagic and Hemi-Pelagic Sediments of the Bering Sea

    NASA Astrophysics Data System (ADS)

    Drake, M. K.; Aiello, I. W.; Ravelo, A. C.

    2014-12-01

Petrographic microscopy of smear slides is the standard method for the initial investigation of marine sediments in core sediment studies (e.g., IODP expeditions). The technique is not commonly used in more complex analyses due to concerns over the subjectivity of the method and variability in operator training and experience. Two initiatives sponsored by Ocean Leadership, a sedimentology training workshop and a digital reference of smear slide components (Marsaglia et al., 2013), have been implemented to address the need for advanced training. While the influence of subjectivity on the quality of data has yet to be rigorously tested, the lack of standardization in the current method of smear slide analysis (SSA) remains a concern. The relative abundances of the three main components (total diatoms, silt-to-sand-sized siliciclastics, and clay minerals) of high- and low-density Bering Sea hemi-pelagic sediments from the ocean margin (Site U144; Site U1339) and pelagic sediments from the open ocean (Site U1340) were analyzed. Our analyses show that visual estimation is a reproducible method to quantify the relative abundance of the main sediment components. Furthermore, we present a modified method for SSA, with procedural changes objectively guided by statistical analyses, including constraints to increase randomness and precision in both the preparation and analysis of the smear slide. For example, repeated-measures ANOVAs found that a smear slide could be accurately quantified by counting three fields of view. Similarly, the use of replicate smear slides to quantify a sample was analyzed. Finally, the data produced from this modified SSA show a strong correlation to continuously logged physical parameters of sediment such as gamma-ray attenuation (Site U1339 r2 = 0.41; Site U1340 r2 = 0.36). Therefore, the modified SSA combined with other independent methods (e.g., laser particle size analysis, scanning electron microscopy, and physical properties) can be a very effective tool for the
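The correlation the authors report between smear-slide estimates and logged physical properties is an ordinary coefficient of determination. A minimal sketch of that computation; the paired abundance/density values below are hypothetical illustrations, not the actual Site U1339/U1340 data:

```python
from statistics import mean

def r_squared(x, y):
    """Coefficient of determination (r^2) for a simple linear fit of y on x."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical paired observations at the same core depths:
# visual diatom abundance (%) vs. gamma-ray attenuation bulk density (g/cm^3).
diatom_pct = [62, 55, 70, 48, 66, 51]
gra_density = [1.42, 1.51, 1.38, 1.55, 1.40, 1.53]
print(round(r_squared(diatom_pct, gra_density), 2))
```

In this toy data the anticorrelation between biogenic abundance and bulk density gives an r² near 1; real smear-slide data would scatter far more, as the abstract's r² values of 0.36-0.41 suggest.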

  20. A quantitative autoradiographic method for the measurement of local rates of brain protein synthesis

    SciTech Connect

    Dwyer, B.E.; Donatoni, P.; Wasterlain, C.G.

    1982-05-01

We have developed a new method for measuring local rates of brain protein synthesis in vivo. It combines the intraperitoneal injection of a large dose of low specific activity amino acid with quantitative autoradiography. This method has several advantages: 1) It is ideally suited for young or small animals, or where immobilizing an animal is undesirable. 2) The amino acid injection "floods" amino acid pools, so that errors in estimating precursor specific activity, which are especially important in pathological conditions, are minimized. 3) The method provides for the use of an autoradiographic internal standard in which valine incorporation is measured directly. Internal standards from experimental animals correct for tissue protein content and for self-absorption of radiation in tissue sections, which could vary under experimental conditions.

  1. Sample preparation methods for quantitative detection of DNA by molecular assays and marine biosensors.

    PubMed

    Cox, Annie M; Goodwin, Kelly D

    2013-08-15

    The need for quantitative molecular methods is growing in environmental, food, and medical fields but is hindered by low and variable DNA extraction and by co-extraction of PCR inhibitors. DNA extracts from Enterococcus faecium, seawater, and seawater spiked with E. faecium and Vibrio parahaemolyticus were tested by qPCR for target recovery and inhibition. Conventional and novel methods were tested, including Synchronous Coefficient of Drag Alteration (SCODA) and lysis and purification systems used on an automated genetic sensor (the Environmental Sample Processor, ESP). Variable qPCR target recovery and inhibition were measured, significantly affecting target quantification. An aggressive lysis method that utilized chemical, enzymatic, and mechanical disruption enhanced target recovery compared to commercial kit protocols. SCODA purification did not show marked improvement over commercial spin columns. Overall, data suggested a general need to improve sample preparation and to accurately assess and account for DNA recovery and inhibition in qPCR applications. PMID:23790450
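Recovery in qPCR comparisons like these is typically back-calculated from the shift in quantification cycle (Ct) relative to an uninhibited reference. A minimal sketch, assuming ~100% amplification efficiency (one doubling per cycle); the Ct values are hypothetical:

```python
def percent_recovery(ct_extract, ct_reference, efficiency=2.0):
    """Target recovery implied by a Ct shift against a no-loss reference."""
    return efficiency ** (ct_reference - ct_extract) * 100.0

# A one-cycle delay versus the reference implies ~50% recovery:
print(percent_recovery(ct_extract=26.0, ct_reference=25.0))  # 50.0
```

The same arithmetic, applied to a spiked internal control, is how inhibition is usually quantified: a Ct delay of the spike in the sample matrix versus clean buffer indicates loss or inhibition.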

  2. Quantitative trait locus gene mapping: a new method for locating alcohol response genes.

    PubMed

    Crabbe, J C

    1996-01-01

Alcoholism is a multigenic trait with important non-genetic determinants. Studies with genetic animal models of susceptibility to several of alcohol's effects suggest that several genes, each contributing a modest effect on susceptibility (Quantitative Trait Loci, or QTLs), are important. A new technique of QTL gene mapping has allowed the identification of the locations in the mouse genome of several such QTLs. The method is described, and the locations of QTLs affecting the acute alcohol withdrawal reaction are presented as an example. Verification of these QTLs in ancillary studies is described, and the strengths, limitations, and future directions to be pursued are discussed. QTL mapping is a promising method for identifying genes in rodents, with the hope of directly extrapolating the results to the human genome. This review is based on a paper presented at the First International Congress of the Latin American Society for Biomedical Research on Alcoholism, Santiago, Chile, November 1994. PMID:12893462

  3. Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Davis, H. B.

    2015-12-01

The AGU scientific community has a strong motivation to improve the STEM knowledge and skills of today's youth, and we are dedicating increasing amounts of our time and energy to education and outreach work. Scientists and educational project leads can benefit from a deeper connection to the value of evaluation, how to work with an evaluator, and how to effectively integrate evaluation into projects to increase their impact. This talk will introduce a method for evaluating educational activities, including public talks, professional development workshops for educators, youth engagement programs, and more. We will discuss the impetus for developing this method, the Quantitative Collaborative Impact Analysis Method; how it works; and the successes we've had with it in the NASA Astrobiology education community.

  4. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

Seven process-related impurities were identified by LC-MS in the atorvastatin calcium drug substance. The structures of these impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, and by physicochemical studies conducted using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for quantitative HPLC determination. The impurities were detected by a newly developed gradient, reverse-phase high-performance liquid chromatography (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness, and stability of analytical solutions to demonstrate the power of the newly developed HPLC method.
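ICH-style linearity validation usually includes detection and quantitation limits derived from the calibration line: LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the response and S the calibration slope. A sketch with hypothetical calibration values (not the paper's figures):

```python
def detection_limits(sigma, slope):
    """ICH Q2-style limits: returns (LOD, LOQ) = (3.3*sigma/S, 10*sigma/S)."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# sigma: residual SD of the detector response (hypothetical);
# slope: response units per (ug/mL) of impurity (hypothetical).
lod, loq = detection_limits(sigma=0.016, slope=1.6)
print(round(lod, 3), round(loq, 3))  # 0.033 0.1
```

Both limits are in the concentration units of the slope's denominator (here µg/mL), which is why the slope and σ must come from the same calibration experiment.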

  5. "Do I Need Research Skills in Working Life?": University Students' Motivation and Difficulties in Quantitative Methods Courses

    ERIC Educational Resources Information Center

    Murtonen, Mari; Olkinuora, Erkki; Tynjala, Paivi; Lehtinen, Erno

    2008-01-01

    This study explored university students' views of whether they will need research skills in their future work in relation to their approaches to learning, situational orientations on a learning situation of quantitative methods, and difficulties experienced in quantitative research courses. Education and psychology students in both Finland (N =…

  6. A method for estimating the effective number of loci affecting a quantitative character.

    PubMed

    Slatkin, Montgomery

    2013-11-01

A likelihood method is introduced that jointly estimates the number of loci and the additive effect of alleles that account for the genetic variance of a normally distributed quantitative character in a randomly mating population. The method assumes that measurements of the character are available from one or both parents and an arbitrary number of full siblings. The method uses the fact, first recognized by Karl Pearson in 1904, that the variance of a character among offspring depends on both the parental phenotypes and the number of loci. Simulations show that the method performs well provided that data from a sufficient number of families (on the order of thousands) are available. The method assumes that the loci are in Hardy-Weinberg and linkage equilibrium, but it does not assume anything about the linkage relationships. It performs equally well if all loci are on the same non-recombining chromosome, provided they are in linkage equilibrium. The method can be adapted to take account of loci already identified as being associated with the character of interest. In that case, the method estimates the number of loci not already known to affect the character. Applied to measurements of crown-rump length in 281 family trios in a captive colony of African green monkeys (Chlorocebus aethiops sabaeus), the method estimates the number of loci to be 112 and the additive effect to be 0.26 cm. A parametric bootstrap analysis shows that a rough confidence interval has a lower bound of 14 loci. PMID:23973416
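The abstract's point estimates can be tied together through the standard additive-variance identity (the moment relationship underlying estimators of this kind, not Slatkin's full likelihood): with n unlinked diallelic loci at allele frequency 1/2, each of additive effect a, the additive genetic variance is V_A = n·a²/2. A sketch using the reported n = 112 and a = 0.26 cm:

```python
import math

def additive_variance(n_loci, effect):
    """V_A = n * a^2 / 2 for n diallelic loci at allele frequency 1/2."""
    return n_loci * effect ** 2 / 2.0

def effect_from_variance(n_loci, v_a):
    """Invert the identity to recover the per-locus additive effect a."""
    return math.sqrt(2.0 * v_a / n_loci)

v_a = additive_variance(112, 0.26)       # implied additive variance, cm^2
print(round(v_a, 4))                     # 3.7856
print(round(effect_from_variance(112, v_a), 2))  # recovers 0.26
```

The identity shows why n and a are hard to estimate jointly from variance alone: many (n, a) pairs give the same V_A, which is exactly the degeneracy the parental-phenotype information in the likelihood method helps break.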

  7. Advanced Methods of Observing Surface Plasmon Polaritons and Magnons

    NASA Astrophysics Data System (ADS)

    Moghaddam, Abolghasem Mobaraki

Available from UMI in association with The British Library. Requires signed TDF. The primary objectives of this thesis are the investigation of the theoretical and experimental aspects of the design and construction of advanced techniques for the excitation of surface plasmon-polaritons, surface magneto-plasmon-polaritons, and surface magnons, together with the on-line observation of these phenomena. To accomplish these goals, analytical studies of the characteristic behaviour of these phenomena have been undertaken. For excitation of surface plasmon- and surface magneto-plasmon-polaritons, the most robust and conventional configuration, namely prism-medium-air, coupled to a novel angle-scan (prism-spinning) method was employed. The system described here can automatically measure the reflectivity of a multilayer system over a range of angles that includes the resonance angle in an Attenuated Total Reflection (ATR) experiment. The computer procedure that controls the system is quite versatile, allowing any right-angle prism of different angle or refractive index to be utilised. It also provides probes to check for optical alignment within the system. Moreover, it performs the angular scan many times and averages the results in order to reduce environmental and other possible sources of noise within the system. The mechanical side of the system is unique and could eventually be adopted as a marketable piece of equipment. It consists of a turntable for holding the prism-sample assembly and a drive motor in conjunction with a servo-potentiometer whose output not only operates the turntable but also sends a signal to a computer to measure its position accurately. The interface unit enables a computer to control automatically an angular-scan ATR experiment for measuring the resonance reflectivity spectrum of a multilayer system. 
The interface unit uses an H-bridge switch formed by four bipolar power transistors and two small-signal MOSFETs to convert

  8. Limitations of the ferrozine method for quantitative assay of mineral systems for ferrous and total iron

    NASA Astrophysics Data System (ADS)

    Anastácio, Alexandre S.; Harris, Brittany; Yoo, Hae-In; Fabris, José Domingos; Stucki, Joseph W.

    2008-10-01

The quantitative assay of clay minerals, soils, and sediments for Fe(II) and total Fe is fundamental to understanding biogeochemical cycles occurring therein. The commonly used ferrozine method was originally designed to assay extracted forms of Fe(II) from non-silicate aqueous systems. It is becoming, however, increasingly the method of choice to report the total reduced state of Fe in soils and sediments. Because Fe in soils and sediments commonly exists in the structural framework of silicates, extraction by HCl, as used in the ferrozine method, fails to dissolve all of the Fe. The phenanthroline (phen) method, on the other hand, was designed to assay silicate minerals for Fe(II) and total Fe and has proven to be highly reliable. In the present study, potential sources of error in the ferrozine method were evaluated by comparing its results to those obtained by the phen method. Both methods were used to analyze clay mineral and soil samples for Fe(II) and total Fe. Results revealed that the conventional ferrozine method under-reports total Fe in samples containing Fe in silicates and gives erratic results for Fe(II). The sources of error in the ferrozine method are: (1) HCl fails to dissolve silicates; and (2) if the analyte solution contains Fe3+, the analysis for Fe2+ will be photosensitive, and reported Fe(II) values will likely be greater than the actual amount in solution. Another difficulty with the ferrozine method is that it is tedious and much more labor intensive than the phen method. For these reasons, the phen method is preferred and recommended. Its procedure is simpler, takes less time, and avoids the errors found in the ferrozine method.
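Both colorimetric assays ultimately rest on the Beer-Lambert law, A = ε·l·c. A minimal sketch for the ferrozine-Fe(II) complex, whose molar absorptivity at 562 nm is about 27,900 M⁻¹ cm⁻¹; the absorbance reading below is hypothetical:

```python
def fe2_concentration(absorbance, eps=27900.0, path_cm=1.0):
    """Beer-Lambert: c = A / (eps * l), returned in mol/L."""
    return absorbance / (eps * path_cm)

c = fe2_concentration(0.558)      # hypothetical reading at 562 nm
print(f"{c * 1e6:.1f} uM")        # 20.0 uM
```

Note that this converts absorbance to the Fe(II) actually in solution; the paper's point is that for silicate-hosted Fe the extraction step, not this conversion, is where the ferrozine method loses analyte.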

  9. Characterization of working iron Fischer-Tropsch catalysts using quantitative diffraction methods

    NASA Astrophysics Data System (ADS)

    Mansker, Linda Denise

This study presents the results of the ex-situ characterization of working iron Fischer-Tropsch synthesis (F-TS) catalysts, reacted for hundreds of hours at elevated pressures, using a new quantitative x-ray diffraction analytical methodology. Compositions, iron phase structures, and phase particle morphologies were determined and correlated with the observed reaction kinetics. Conclusions were drawn about the character of each catalyst in its most and least active states. The identity of the active phase(s) in the Fe F-TS catalyst has been vigorously debated for more than 45 years. The highly reduced catalyst, used to convert coal-derived syngas to hydrocarbon products, is thought to form a mixture of oxides, metal, and carbides upon pretreatment and reaction. Commonly, Soxhlet extraction is used to effect catalyst-product slurry separation; however, the extraction process could be producing irreversible changes in the catalyst, contributing to the conflicting results in the literature. X-ray diffraction does not require analyte-matrix separation before analysis and can detect trace phases down to 300 ppm/2 nm; thus, working catalyst slurries could be characterized as-sampled. Data were quantitatively interpreted employing first-principles methods, including the Rietveld polycrystalline structure method. Pretreated catalysts and pure phases were examined experimentally and modeled to explore specific behavior under x-rays. Then, the working catalyst slurries were quantitatively characterized. Empirical quantitation factors were calculated from experimental data or single-crystal parameters, then validated using the Rietveld method results. In the most active form, after pretreatment in H2 or in CO at ambient pressure, well-preserved working catalysts contained significant amounts of Fe7C3 with trace alpha-Fe once reaction had commenced at elevated pressure. Amounts of Fe3O4 were constant and small, with average carbide particle diameters below 15 nm. 
Small amounts of Fe7C3 were found in unreacted
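One common form of the "empirical quantitation factors" mentioned above is the reference-intensity-ratio (RIR) approach, in which weight fractions follow w_i = (I_i/RIR_i) / Σ_j (I_j/RIR_j). A sketch with hypothetical peak intensities and RIR values, not the thesis data:

```python
def phase_fractions(peaks):
    """peaks: {phase: (integrated_intensity, RIR)} -> weight fractions."""
    scaled = {phase: i / rir for phase, (i, rir) in peaks.items()}
    total = sum(scaled.values())
    return {phase: s / total for phase, s in scaled.items()}

fracs = phase_fractions({
    "Fe7C3":    (850.0, 2.1),   # hypothetical intensity and RIR
    "alpha-Fe": (120.0, 10.0),
    "Fe3O4":    (90.0, 4.9),
})
print({phase: round(w, 3) for phase, w in fracs.items()})
```

In practice such factors are checked against a full Rietveld refinement, as the abstract describes, since RIR values depend on preferred orientation and crystallite size.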

  10. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions.

    PubMed

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary E; Geller, Jil T; Fisher, Susan J; Hall, Steven C; Hazen, Terry C; Brenner, Steven E; Butland, Gareth; Jin, Jian; Witkowska, H Ewa; Chandonia, John-Marc; Biggin, Mark D

    2016-06-01

Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification-mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of our or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high-confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR. PMID:27099342

  11. Spatial Access Priority Mapping (SAPM) with Fishers: A Quantitative GIS Method for Participatory Planning

    PubMed Central

    Yates, Katherine L.; Schoeman, David S.

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers’ spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers’ willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision

  12. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    PubMed Central

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

Objective To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes, providing 7 to 10 hematoxylin–eosin-stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
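Step (1), the defect-volume estimate, follows the Cavalieri principle used throughout design-based stereology: V = t · a_p · ΣP, with section spacing t, test-grid area per point a_p, and ΣP the total points hitting the tissue. A sketch with hypothetical counts (not the paper's 4.4 mm³ example):

```python
def cavalieri_volume(section_spacing_mm, area_per_point_mm2, point_counts):
    """Cavalieri point-counting estimator: V = t * a_p * sum(P)."""
    return section_spacing_mm * area_per_point_mm2 * sum(point_counts)

# Hypothetical: 8 parallel sections 0.5 mm apart, grid with 0.04 mm^2/point.
points_per_section = [12, 15, 18, 20, 19, 16, 13, 9]
v = cavalieri_volume(0.5, 0.04, points_per_section)
print(round(v, 2), "mm^3")  # 2.44 mm^3
```

Step (2) then needs no extra geometry: because points hit tissue in proportion to volume, each category's point count divided by ΣP directly estimates its volume fraction.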

  13. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    PubMed

    Yates, Katherine L; Schoeman, David S

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. 
This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process

  14. Quantitative 4D Transcatheter Intraarterial Perfusion MR Imaging as a Method to Standardize Angiographic Chemoembolization Endpoints

    PubMed Central

    Jin, Brian; Wang, Dingxin; Lewandowski, Robert J.; Ryu, Robert K.; Sato, Kent T.; Larson, Andrew C.; Salem, Riad; Omary, Reed A.

    2011-01-01

PURPOSE We aimed to test the hypothesis that subjective angiographic endpoints during transarterial chemoembolization (TACE) of hepatocellular carcinoma (HCC) exhibit consistency and correlate with objective intraprocedural reductions in tumor perfusion as determined by quantitative four-dimensional (4D) transcatheter intraarterial perfusion (TRIP) magnetic resonance (MR) imaging. MATERIALS AND METHODS This prospective study was approved by the institutional review board. Eighteen consecutive patients underwent TACE in a combined MR/interventional radiology (MR-IR) suite. Three board-certified interventional radiologists independently graded the angiographic endpoint of each procedure based on a previously described subjective angiographic chemoembolization endpoint (SACE) scale. A consensus SACE rating was established for each patient. Patients underwent quantitative 4D TRIP-MR imaging immediately before and after TACE, from which mean whole-tumor perfusion (Fρ) was calculated. Consistency of SACE ratings between observers was evaluated using the intraclass correlation coefficient (ICC). The relationship between SACE ratings and intraprocedural TRIP-MR imaging perfusion changes was evaluated using Spearman’s rank correlation coefficient. RESULTS The SACE rating scale demonstrated very good consistency among all observers (ICC = 0.80). The consensus SACE rating was significantly correlated with both absolute (r = 0.54, P = 0.022) and percent (r = 0.85, P < 0.001) intraprocedural perfusion reduction. CONCLUSION The SACE rating scale demonstrates very good consistency between raters and significantly correlates with objectively measured intraprocedural perfusion reductions during TACE. These results support the use of the SACE scale as a standardized alternative to quantitative 4D TRIP-MR imaging to classify patients based on embolic endpoints of TACE. PMID:22021520
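The rank correlation used here can be sketched in a few lines. For brevity this version breaks ties by input order rather than assigning midranks, and the rating/perfusion values are hypothetical, not the study's data:

```python
def _ranks(values):
    """Rank positions 1..n; ties keep input order (no midranks, for brevity)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    for rank, i in enumerate(order, start=1):
        ranks[i] = float(rank)
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the two rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

sace = [1, 2, 2, 3, 4, 4, 5]                   # hypothetical consensus ratings
perfusion_drop = [12, 30, 25, 41, 55, 60, 78]  # hypothetical % reductions
print(round(spearman(sace, perfusion_drop), 2))  # 0.96
```

Because only ranks enter the statistic, it is well suited to an ordinal scale like SACE, where the spacing between grades carries no meaning.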

  15. Advanced Extraction Methods for Actinide/Lanthanide Separations

    SciTech Connect

    Scott, M.J.

    2005-12-01

The separation of An(III) ions from chemically similar Ln(III) ions is perhaps one of the most difficult problems encountered during the processing of nuclear waste. In the 3+ oxidation state, the metal ions have an identical charge and roughly the same ionic radius. They differ strictly in the relative energies of their f- and d-orbitals, and to separate these metal ions, ligands will need to be developed that take advantage of this small but important distinction. The extraction of uranium and plutonium from nitric acid solution can be performed quantitatively by extraction with TBP (tributyl phosphate). Commercially, this process has found wide use in the PUREX (plutonium uranium extraction) reprocessing method. The TRUEX (transuranium extraction) process is further used to coextract the trivalent lanthanide and actinide ions from HLLW generated during PUREX extraction. This method uses CMPO [(N,N-diisobutylcarbamoylmethyl)octylphenylphosphine oxide] intermixed with TBP as a synergistic agent. However, the final separation of trivalent actinides from trivalent lanthanides still remains a challenging task. In TRUEX nitric acid solution, the Am(III) ion is coordinated by three CMPO molecules and three nitrate anions. Taking inspiration from these data and previous work with calix[4]arene systems, researchers on this project have developed a C3-symmetric tris-CMPO ligand system using a triphenoxymethane platform as a base. The triphenoxymethane ligand systems have many advantages for the preparation of complex ligand systems. The compounds are very easy to prepare, and their steric and solubility properties can be tuned over a wide range by the inclusion of different alkoxy and alkyl groups, such as methoxy, ethoxy, t-butoxy, methyl, octyl, or t-pentyl, at the ortho- and para-positions of the aryl rings. 
The triphenoxymethane ligand system shows promise as an improved extractant for both tetravalent and trivalent actinide recoveries from
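Extractant performance in this kind of work is reported as a distribution ratio D = [M]_org/[M]_aq, and the An/Ln selectivity as a separation factor SF = D_Am/D_Eu. A sketch of the arithmetic with hypothetical concentrations, not measurements from this project:

```python
def distribution_ratio(c_org, c_aq):
    """D = organic-phase concentration / aqueous-phase concentration."""
    return c_org / c_aq

d_am = distribution_ratio(4.2, 0.6)   # hypothetical Am(III) partitioning
d_eu = distribution_ratio(0.9, 1.8)   # hypothetical Eu(III) partitioning
print(round(d_am / d_eu, 1))          # separation factor: 14.0
```

A separation factor well above 1 is what a successful An(III)/Ln(III) ligand must deliver, since the charge and radius of the two ions are nearly identical.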

  16. Rapid method for glutathione quantitation using high-performance liquid chromatography with coulometric electrochemical detection.

    PubMed

    Bayram, Banu; Rimbach, Gerald; Frank, Jan; Esatbeyoglu, Tuba

    2014-01-15

A rapid, sensitive, and direct method (without derivatization) was developed for the detection of reduced glutathione (GSH) in cultured hepatocytes (HepG2 cells) using high-performance liquid chromatography with electrochemical detection (HPLC-ECD). The method was validated according to the guidelines of the U.S. Food and Drug Administration in terms of linearity, lower limit of quantitation (LOQ), lower limit of detection (LOD), precision, accuracy, recovery, and stabilities of GSH standards and quality control samples. The total analysis time was 5 min, and the retention time of GSH was 1.78 min. Separation was carried out isocratically using 50 mM sodium phosphate (pH 3.0) as a mobile phase with a fused-core column. The detector response was linear between 0.01 and 80 μmol/L, and the regression coefficient (R(2)) was >0.99. The LOD for GSH was 15 fmol, and the intra- and interday recoveries ranged between 100.7 and 104.6%. This method also enabled the rapid detection (in 4 min) of other compounds involved in GSH metabolism such as uric acid, ascorbic acid, and glutathione disulfide. The optimized and validated HPLC-ECD method was successfully applied for the determination of GSH levels in HepG2 cells treated with buthionine sulfoximine (BSO), an inhibitor, and α-lipoic acid (α-LA), an inducer of GSH synthesis. As expected, the amount of GSH concentration-dependently decreased with BSO and increased with α-LA treatments in HepG2 cells. This method could also be useful for the quantitation of GSH, uric acid, ascorbic acid, and glutathione disulfide in other biological matrices such as tissue homogenates and blood. PMID:24328299

  17. An Improved Flow Cytometry Method For Precise Quantitation Of Natural-Killer Cell Activity

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Nehlsen-Cannarella, Sandra; Sams, Clarence

    2006-01-01

    The ability to assess NK cell cytotoxicity using flow cytometry has been previously described and can serve as a powerful tool to evaluate effector immune function in the clinical setting. Previous methods used membrane-permeable dyes to identify target cells. The use of these dyes requires great care to achieve optimal staining and results in a broad spectral emission that can make multicolor cytometry difficult. Previous methods have also used negative staining (the elimination of target cells) to identify effector cells. This makes a precise quantitation of effector NK cells impossible due to the interfering presence of T and B lymphocytes, and leaves the data highly subject to the variable levels of NK cells normally found in human peripheral blood. In this study an improved version of the standard flow cytometry assay for NK activity is described that has several advantages over previous methods. Fluorescent antibody staining (CD45-FITC) is used to positively identify target cells in place of membrane-permeable dyes. Fluorescent antibody staining of target cells is less labor intensive and more easily reproducible than membrane dyes. NK cells (true effector lymphocytes) are also positively identified by fluorescent antibody staining (CD56-PE), allowing a simultaneous absolute count assessment of both NK cells and target cells. Dead cells are identified by membrane disruption using the DNA-intercalating dye PI. Using this method, an exact NK:target ratio may be determined for each assessment, including quantitation of NK-target complexes. Back-immunoscatter gating may be used to track live vs. dead target cells via scatter properties. If desired, NK activity may then be normalized to standardized ratios for clinical comparisons between patients, making the determination of PBMC counts or NK cell percentages prior to testing unnecessary. This method provides an exact cytometric determination of NK activity that is highly reproducible and may be suitable for routine use in the
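    Because both NK cells and target cells are counted absolutely, the per-sample read-out reduces to simple arithmetic. The sketch below is a hypothetical illustration (event counts and function name are invented, not from the study) of deriving an exact effector:target ratio and percent target lysis from the gated counts.

```python
# Hypothetical sketch: exact E:T ratio and percent cytotoxicity from absolute
# flow cytometry counts, as the positively-stained assay above permits.
# All numbers are illustrative, not data from the study.

def cytotoxicity_stats(nk_count, target_count, dead_target_count):
    """Return the measured effector:target ratio and percent target lysis."""
    if target_count == 0:
        raise ValueError("no target cells counted")
    et_ratio = nk_count / target_count
    percent_lysis = 100.0 * dead_target_count / target_count
    return et_ratio, percent_lysis

# Example: 25,000 CD56-PE+ NK events, 10,000 CD45-FITC+ target events,
# 3,200 of which are PI-positive (membrane-disrupted).
ratio, lysis = cytotoxicity_stats(25_000, 10_000, 3_200)
print(f"E:T = {ratio:.1f}:1, lysis = {lysis:.1f}%")
```

    With both populations positively identified, the measured ratio can then be rescaled to a standardized E:T for comparison between patients.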

  18. Dynamic and quantitative method of analyzing service consistency evolution based on extended hierarchical finite state automata.

    PubMed

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of the primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in a service's evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving toward large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties for traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service finite state automata (EHS-FSA), is constructed based on finite state automata (FSA), which formally depicts the overall changing processes of service consistency states. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that bad reusability (17.93% on average) is the biggest influential factor, noncomposition of atomic services (13.12%) is the second biggest, and service version confusion (1.2%) is the smallest. Compared with previous qualitative analysis, the SCEAs present good effectiveness and feasibility. This research can guide the engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA. PMID:24772033
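    A minimal sketch of the FSA machinery the model builds on (not the paper's EHS-FSA): a table-driven automaton tracking a service through the life-cycle behaviors named above, where an undefined transition flags an inconsistent evolution. States and transitions here are invented assumptions.

```python
# Toy finite state automaton for a service's consistency-relevant life cycle.
# The state names and allowed transitions are illustrative, not from the paper.

TRANSITIONS = {
    ("produced", "publish"): "published",
    ("published", "call"): "in_use",
    ("in_use", "call"): "in_use",
    ("in_use", "maintain"): "maintenance",
    ("maintenance", "publish"): "published",
}

def run(events, state="produced"):
    """Drive the automaton; an undefined transition flags an inconsistency."""
    for event in events:
        key = (state, event)
        if key not in TRANSITIONS:
            return state, False   # inconsistent evolution detected
        state = TRANSITIONS[key]
    return state, True

final, consistent = run(["publish", "call", "maintain", "publish"])
print(final, consistent)  # published True
```

    The hierarchical, extended model in the paper layers quantitative weights on such transitions; this sketch only shows the consistency-checking skeleton.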

  19. An immunochemical method for quantitation of Epinotia aporema granulovirus (EpapGV).

    PubMed

    Parola, Alejandro Daniel; Sciocco-Cap, Alicia; Glikmann, Graciela; Romanowski, Víctor

    2003-09-01

    Epinotia aporema granulovirus (EpapGV) is a baculovirus that affects E. aporema larvae and has proven to be a good candidate for the biocontrol of this important pest in South America. As part of the quality control of the production of a bioinsecticide based on EpapGV, a sensitive method was developed for the detection and quantitation of the virus. To this end, we used the major occlusion body (OB) protein (granulin) to generate polyclonal antibodies in rabbits. Purified IgG fractions from hyperimmune sera were labeled with biotin and used as detecting antibodies in a double-antibody sandwich enzyme-linked immunosorbent assay (ELISA). No cross-reactivity was detected with any of the nucleopolyhedroviruses (NPV) tested in this study, while a minor degree of reactivity was observed with the closely related Cydia pomonella granulovirus (CpGV). The performance of the ELISA was satisfactory in terms of sensitivity, detecting as little as 0.53 ng/ml of EpapGV granulin in suspensions of purified virus OB. This represented 2.0x10(4) OB/ml. Granulin was also detected in complex and highly diluted bioinsecticidal formulation mixtures. In time course experiments, the virus was detected as early as 24 h post infection (p.i.). The results of these studies demonstrate that this method is a convenient, rapid and inexpensive alternative for routine detection and quantitation of EpapGV. PMID:12951208
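    Quantitation in a sandwich ELISA like this one proceeds by reading unknowns off a standard curve of known granulin concentrations. The sketch below is a hedged illustration using linear interpolation between bracketing standards; the absorbance values are invented, and real ELISA curves are typically fit with a 4-parameter logistic model instead.

```python
# Hypothetical ELISA standard-curve read-out: interpolate an unknown's
# concentration between the two standards that bracket its absorbance.
# (concentration ng/ml, absorbance) pairs below are invented.

STANDARDS = [(0.5, 0.08), (2.0, 0.21), (8.0, 0.62), (32.0, 1.45)]

def interpolate_conc(absorbance):
    for (c_lo, a_lo), (c_hi, a_hi) in zip(STANDARDS, STANDARDS[1:]):
        if a_lo <= absorbance <= a_hi:
            frac = (absorbance - a_lo) / (a_hi - a_lo)
            return c_lo + frac * (c_hi - c_lo)
    raise ValueError("absorbance outside the standard curve")

print(round(interpolate_conc(0.415), 2))  # ng/ml granulin equivalent
```

    Samples above the top standard would be diluted and re-read, which is how highly diluted formulation mixtures remain quantifiable.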

  20. A method for the simultaneous identification and quantitation of five superwarfarin rodenticides in human serum.

    PubMed

    Kuijpers, E A; den Hartigh, J; Savelkoul, T J; de Wolff, F A

    1995-01-01

    A high-performance liquid chromatographic method with ultraviolet (UV) and fluorescence detection was developed for the analysis of one indandione and four hydroxycoumarin anticoagulant rodenticides in human serum. The superwarfarin rodenticides, chlorophacinone, bromadiolone, difenacoum, brodifacoum, and difethialone, can be identified and quantitated simultaneously with this method. After adding a buffer (pH 5.5), the anticoagulants were extracted from serum with chloroform-acetone. The organic phase was separated and evaporated to dryness, and the residue was subjected to chromatographic analysis. The anticoagulants were separated by reversed-phase chromatography and detected by UV absorption at 285 nm and by fluorescence at an excitation wavelength of 265 nm and an emission wavelength of 400 nm. Extraction efficiencies from 55 to 131% were obtained. The within-run precision ranged from 2.0 to 7.1% for UV detection and from 0.0 to 4.8% for fluorescence detection. Between-run precision ranged from 1.3 to 16.0% for UV detection and from 1.8 to 9.0% for fluorescence detection. The anticoagulants can be quantitated at serum concentrations down to 3-12 ng/mL for fluorescence detection and down to 20-75 ng/mL for UV detection. No interferences were observed with the related compounds warfarin and vitamin K1. PMID:8577178

  1. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method.

    PubMed

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high-energy fast neutrons from a radioisotope and their slowing down by light hydrogen atoms, is a useful technique for non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume and is not affected by temperature, pressure, pH value, or color. The most common choices for a radioisotope neutron source are (252)Cf and (241)Am-Be. In this study, (252)Cf with a neutron flux of 6.3x10(6) n/s was used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra were obtained by using an in-house-built radioisotopic neutron spectrometric system equipped with a (3)He detector and multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of approximately 0.947 g/cc and area of 40 cm x 25 cm) was used for the determination of hydrogen content as a function of block thickness, by using multivariate calibration models. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered better performance in quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system can be promising for real-time, online monitoring of powder processes to determine the content of any type of molecule containing hydrogen nuclei. PMID:19285419
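    As a hedged illustration of the simple linear calibration baseline that the PLSR model was compared against, the sketch below fits detector counts against polyethylene thickness by ordinary least squares. The counts are invented; the study's PLSR model regresses on the full pulse-height spectrum rather than a single feature.

```python
# Simple linear calibration baseline (not the paper's PLSR): ordinary least
# squares relating one spectral feature to sample thickness. Data invented.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

thickness_cm = [2, 4, 6, 8, 10]          # polyethylene block thickness
counts = [410, 790, 1220, 1580, 2010]    # hypothetical detector counts
m, b = fit_line(thickness_cm, counts)
print(f"slope = {m:.1f} counts/cm, intercept = {b:.1f}")
```

    PLSR generalizes this idea by projecting the whole spectrum onto a few latent variables before regression, which is what gave the better quantitative performance reported above.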

  2. A quantitative method for estimation of volume changes in arachnoid foveae with age.

    PubMed

    Duray, Stephen M; Martel, Stacie S

    2006-03-01

    Age-related changes of arachnoid foveae have been described, but objective, quantitative analyses are lacking. A new quantitative method is presented for estimation of change in total volume of arachnoid foveae with age. The pilot sample consisted of nine skulls from the Palmer Anatomy Laboratory. Arachnoid foveae were filled with sand, which was extracted using a vacuum pump. Mass was determined with an analytical balance and converted to volume. A reliability analysis was performed using intraclass correlation coefficients. The method was found to be highly reliable (intraobserver ICC = 0.9935, interobserver ICC = 0.9878). The relationship between total volume and age was then examined in a sample of 63 males of accurately known age from the Hamann-Todd collection. Linear regression analysis revealed no statistically significant relationship between total volume and age, or foveae frequency and age (alpha = 0.05). Development of arachnoid foveae may be influenced by health factors, which could limit its usefulness in age estimation. PMID:16566755

  3. An evolutionary method for synthesizing technological planning and architectural advance

    NASA Astrophysics Data System (ADS)

    Cole, Bjorn Forstrom

    In the development of systems with ever-increasing performance and/or decreasing drawbacks, there inevitably comes a point where more progress is available by shifting to a new set of principles of use. This shift marks a change in architecture, such as between the piston-driven propeller and the jet engine. The shift also often involves an abandonment of previous competencies that have been developed with great effort, and so foreknowledge of these shifts can be advantageous. A further motivation for this work is the Micro Autonomous Systems and Technology (MAST) project, which aims to develop very small (<5 cm) robots for a variety of uses. This is primarily a technology research project, and there is no baseline robot morphology to consider. This motivates an interest in the ability to automatically compose physical architectures from a series of components and quantitatively analyze them at a basic, conceptual level. The ability to do this would enable researchers to turn attention to the most promising forms. This work presents a method for using technology forecasts of components that enable future architectural shifts in order to forecast those shifts. The method consists of the use of multidimensional S-curves, genetic algorithms, and a graph-based formulation of architecture that is more flexible than other morphological techniques. Potential genetic operators are explored in depth to draft a final graph-based genetic algorithm. This algorithm is then implemented in a design code called Sindri, which leverages a commercial design tool named Pacelab. The first chapters of this thesis provide context and a philosophical background to the studies and research that were conducted. In particular, the idea that technology progresses in a fundamentally gradual way is developed and supported with previous historical research. The import of this is that the future can to some degree be predicted by the past, provided that
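    The graph-based genetic algorithm itself is beyond a short sketch, but the selection/crossover/mutation loop it builds on can be illustrated on a toy encoding. Everything below is invented for illustration (a bitstring marking which of 8 hypothetical components an architecture includes, with an arbitrary fitness); it is a simplification, not the Sindri code.

```python
import random

# Toy GA loop underlying graph-based architecture search. Genome: bitstring of
# included components; fitness rewards components 0-3 and penalizes the rest.
# All encodings and the fitness function are illustrative assumptions.

random.seed(7)

def fitness(genome):
    return sum(genome[:4]) - sum(genome[4:])

def crossover(a, b):
    cut = random.randrange(1, len(a))      # single-point crossover
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.1):
    return [bit ^ (random.random() < rate) for bit in genome]

pop = [[random.randint(0, 1) for _ in range(8)] for _ in range(20)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                     # elitist selection
    pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                     for _ in range(10)]

best = max(pop, key=fitness)
print(best, fitness(best))
```

    The thesis replaces the bitstring with a graph of components and devises genetic operators that respect graph structure, which is the harder part the abstract alludes to.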

  4. A Rapid and Quantitative Flow Cytometry Method for the Analysis of Membrane Disruptive Antimicrobial Activity

    PubMed Central

    O’Brien-Simpson, Neil M.; Pantarat, Namfon; Attard, Troy J.; Walsh, Katrina A.; Reynolds, Eric C.

    2016-01-01

    We describe a microbial flow cytometry method that quantifies antimicrobial peptide (AMP) activity within 3 hours, expressed as the Minimum Membrane Disruptive Concentration (MDC). Increasing peptide concentration positively correlates with the extent of bacterial membrane disruption, and the calculated MDC is equivalent to the MBC. The activity of AMPs representing three different membranolytic modes of action could be determined for a range of Gram-positive and Gram-negative bacteria, including the ESKAPE pathogens, E. coli and MRSA. By using the MDC50 concentration of the parent AMP, the method provides high-throughput, quantitative screening of AMP analogues. A unique feature of the MDC assay is that it directly measures peptide/bacteria interactions and lysed cell numbers rather than bacterial survival as with MIC and MBC assays. With the threat of multi-drug-resistant bacteria, this high-throughput MDC assay has the potential to aid in the development of novel antimicrobials that target bacteria with improved efficacy. PMID:26986223
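    Conceptually, the MDC read-out scans increasing peptide concentrations and reports the lowest one at which the dye-positive (membrane-disrupted) fraction reaches a chosen threshold; an MDC50-style value uses a 50% threshold. The sketch below illustrates that logic with invented concentrations and disruption fractions.

```python
# Hedged sketch of the MDC read-out concept. Dose-response data are invented.

def mdc(dose_response, threshold=0.99):
    """dose_response: (conc_uM, fraction_disrupted) pairs, sorted by conc.
    Return the lowest concentration whose disruption meets the threshold."""
    for conc, fraction in dose_response:
        if fraction >= threshold:
            return conc
    return None  # no tested concentration disrupted enough of the population

curve = [(0.5, 0.04), (1.0, 0.22), (2.0, 0.61), (4.0, 0.95), (8.0, 0.998)]
print(mdc(curve))        # lowest (near-)fully disruptive concentration
print(mdc(curve, 0.5))   # an MDC50-style read-out
```

    Because the assay counts lysed cells directly, this threshold scan replaces the overnight-growth step that MIC/MBC determinations require.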

  5. Two methods for the quantitative analysis of surface antigen expression in acute myeloid leukemia (AML).

    PubMed

    Woźniak, Jolanta

    2004-01-01

    The expression of lineage molecules (CD13 and CD33), c-Kit receptor (CD117), CD34, HLA-DR and the adhesion molecule CD49d was assessed in acute myeloid leukemia (AML) blast cells from 32 cases, using direct and indirect quantitative cytometric analysis. High correlation (r=0.8) was found between antigen expression intensity values calculated by the direct analysis method (ABC) and by the indirect analysis method (RFI). Moreover, differences in expression intensity of the CD13, CD117 and CD34 antigens were found between leukemic and normal myeloblasts. This may be helpful in the identification of leukemic cells in the diagnosis of minimal residual disease after treatment in AML patients. PMID:15493582

  6. Fast-Neutron Hodoscope at TREAT: Methods for Quantitative Determination of Fuel Dispersal

    SciTech Connect

    De Volci, A.; Fink, C. L.; Marsh, G. E.; Rhodes, E. A.; Stanford, G. S.

    1980-01-01

    Fuel-motion surveillance using the fast-neutron hodoscope in TREAT experiments has advanced from an initial role of providing time/location/velocity data to that of offering quantitative mass results. The material and radiation surroundings of the test section contribute to intrinsic and instrumental effects upon hodoscope detectors that require detailed corrections. Depending upon the experiment, count rate compensation is usually required for deadtime, power level, nonlinear response, efficiency, background, and detector calibration. Depending on their magnitude and amenability to analytical and empirical treatment, systematic corrections may be needed for self-shielding, self-multiplication, self-attenuation, flux depression, and other effects. Current verified hodoscope response (for 1- to 7-pin fuel bundles) may be parametrically characterized under optimum conditions by 1-ms time resolution; 0.25-mm lateral and 5-mm axial-motion displacement resolution; and 50-mg single-pin mass resolution. The experimental and theoretical foundation for this performance is given, with particular emphasis on the geometrical response function and the statistical limits of fuel-motion resolution. Comparisons are made with alternative diagnostic systems.
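    As a hedged illustration of one compensation in the list above, the sketch below applies the standard non-paralyzable dead-time correction n = m / (1 - m*tau), where m is the measured count rate and tau the detector dead time. The numbers are illustrative; the hodoscope's actual correction chain is considerably more involved.

```python
# Standard non-paralyzable dead-time correction for a counting detector.
# Rates and dead time below are illustrative, not TREAT hodoscope values.

def deadtime_correct(measured_rate_cps, tau_s):
    loss = measured_rate_cps * tau_s   # fraction of time the detector is blind
    if loss >= 1.0:
        raise ValueError("measured rate saturates the detector")
    return measured_rate_cps / (1.0 - loss)

# A 2 us dead time at a measured 50 kcps implies ~10% of counts were lost.
true_rate = deadtime_correct(50_000, 2e-6)
print(f"{true_rate:.0f} cps")
```

    Each of the other compensations (power level, background, efficiency) similarly multiplies or offsets the raw rate before mass is inferred from the corrected counts.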

  7. Quantitative mineralogical composition of complex mineral wastes - Contribution of the Rietveld method

    SciTech Connect

    Mahieux, P.-Y.; Aubert, J.-E.; Cyr, M.; Coutand, M.; Husson, B.

    2010-03-15

    The objective of the work presented in this paper is the quantitative determination of the mineral composition of two complex mineral wastes: a sewage sludge ash (SSA) and a municipal solid waste incineration fly ash (MSWIFA). The mineral compositions were determined by two different methods: the first based on calculation using the qualitative mineralogical composition of the waste combined with physicochemical analyses; the second the Rietveld method, which uses only X-ray diffraction patterns. The results obtained are consistent, showing that it is possible to quantify the mineral compositions of complex mineral wastes with such methods. The apparent simplicity of the Rietveld method (due principally to the availability of software packages implementing it) facilitates its use. However, care should be taken, since crystal structure analysis based on powder diffraction data requires experience and a thorough understanding of crystallography. The use of another, complementary method, such as the first one used in this study, may therefore sometimes be needed to confirm the results.

  8. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages

    PubMed Central

    Yuan, Jian; Li, Meizhen; Lin, Senjie

    2015-01-01

    Marine phytoplankton are highly diverse, with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction while preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (by up to 2-fold) in comparison with the non-bead-beating method, while also preserving DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliophora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and “naked” unicellular organisms indicates that our method could obtain intact DNA from organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance. PMID:26218575

  9. Rapid quantitative analysis of lipids using a colorimetric method in a microplate format.

    PubMed

    Cheng, Yu-Shen; Zheng, Yi; VanderGheynst, Jean S

    2011-01-01

    A colorimetric sulfo-phospho-vanillin (SPV) method was developed for high throughput analysis of total lipids. The developed method uses a reaction mixture that is maintained in a 96-well microplate throughout the entire assay. The new assay provides the following advantages over other methods of lipid measurement: (1) background absorbance can be easily corrected for each well, (2) there is less risk of handling and transferring sulfuric acid contained in reaction mixtures, (3) color develops more consistently providing more accurate measurement of absorbance, and (4) the assay can be used for quantitative measurement of lipids extracted from a wide variety of sources. Unlike other spectrophotometric approaches that use fluorescent dyes, the optimal spectra and reaction conditions for the developed assay do not vary with the sample source. The developed method was used to measure lipids in extracts from four strains of microalgae. No significant difference was found in lipid determination when lipid content was measured using the new method and compared to results obtained using a macro-gravimetric method. PMID:21069472
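    The microplate format's first listed advantage, per-well background correction, amounts to subtracting each well's pre-development absorbance before applying a standard-curve slope. The sketch below illustrates that arithmetic with invented absorbances and a hypothetical slope; it is not the paper's calibration data.

```python
# Hedged sketch of per-well background correction and standard-curve
# conversion for a colorimetric lipid assay. All numbers are invented.

def lipid_ug_per_well(a_final, a_background, slope_abs_per_ug):
    corrected = a_final - a_background   # remove each well's own background
    return corrected / slope_abs_per_ug

# Triplicate wells of one extract; slope from a hypothetical standard curve.
finals = [0.842, 0.861, 0.829]
backgrounds = [0.051, 0.049, 0.054]
slope = 0.0112  # absorbance units per ug lipid (hypothetical)
masses = [lipid_ug_per_well(f, b, slope) for f, b in zip(finals, backgrounds)]
mean = sum(masses) / len(masses)
print(f"mean lipid: {mean:.1f} ug/well")
```

    Keeping the reaction in the plate means the same wells supply both readings, which is why the background term can be corrected per well rather than per batch.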

  10. Development and validation of a LC-MS method for quantitation of ergot alkaloids in lateral saphenous vein tissue

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A liquid chromatography-mass spectrometry (LC/MS) method for simultaneous quantitation of seven ergot alkaloids (lysergic acid, ergonovine, ergovaline, ergocornine, ergotamine, ergocryptine and ergocrystine) in vascular tissue was developed and validated. Reverse-phase chromatography, coupled to an...

  11. Development of a rapid method for the quantitative determination of deoxynivalenol using Quenchbody.

    PubMed

    Yoshinari, Tomoya; Ohashi, Hiroyuki; Abe, Ryoji; Kaigome, Rena; Ohkawa, Hideo; Sugita-Konishi, Yoshiko

    2015-08-12

    Quenchbody (Q-body) is a novel fluorescent biosensor based on the antigen-dependent removal of a quenching effect on a fluorophore attached to antibody domains. In order to develop a method using Q-body for the quantitative determination of deoxynivalenol (DON), a trichothecene mycotoxin produced by some Fusarium species, anti-DON Q-body was synthesized from the sequence information of a monoclonal antibody specific to DON. When the purified anti-DON Q-body was mixed with DON, a dose-dependent increase in the fluorescence intensity was observed and the detection range was between 0.0003 and 3 mg L(-1). The coefficients of variation were 7.9% at 0.003 mg L(-1), 5.0% at 0.03 mg L(-1) and 13.7% at 0.3 mg L(-1). The limit of detection was 0.006 mg L(-1) for DON in wheat. The Q-body showed an antigen-dependent fluorescence enhancement even in the presence of wheat extracts. To validate the analytical method using Q-body, a spike-and-recovery experiment was performed using four spiked wheat samples. The recoveries were in the range of 94.9-100.2%. The concentrations of DON in twenty-one naturally contaminated wheat samples were quantitated by the Q-body method, LC-MS/MS and an immunochromatographic assay kit. The LC-MS/MS analysis showed that the levels of DON contamination in the samples were between 0.001 and 2.68 mg kg(-1). The concentrations of DON quantitated by LC-MS/MS were more strongly correlated with those obtained using the Q-body method (R(2) = 0.9760) than with the immunochromatographic assay kit (R(2) = 0.8824). These data indicate that the Q-body system for the determination of DON in wheat samples was successfully developed, and Q-body is expected to have a range of applications in the field of food safety. PMID:26320967

  12. Maillard reaction products in bread: A novel semi-quantitative method for evaluating melanoidins in bread.

    PubMed

    Helou, Cynthia; Jacolot, Philippe; Niquet-Léridon, Céline; Gadonna-Widehem, Pascale; Tessier, Frédéric J

    2016-01-01

    The aim of this study was to test the methods currently in use and to develop a new protocol for the evaluation of melanoidins in bread. Markers of the early and advanced stages of the Maillard reaction were also followed in the crumb and the crust of bread throughout baking, and in a crust model system. The crumb of the bread contained N(ε)-fructoselysine and N(ε)-carboxymethyllysine but at levels 7 and 5 times lower than the crust, respectively. 5-Hydroxymethylfurfural was detected only in the crust and its model system. The available methods for the semi-quantification of melanoidins were found to be unsuitable for their analysis in bread. Our new method based on size exclusion chromatography and fluorescence measures soluble fluorescent melanoidins in bread. These melanoidin macromolecules (1.7-5.6 kDa) were detected intact in both crust and model system. They appear to contribute to the dietary fibre in bread. PMID:26213055

  13. New Image Reconstruction Methods for Accelerated Quantitative Parameter Mapping and Magnetic Resonance Angiography

    NASA Astrophysics Data System (ADS)

    Velikina, J. V.; Samsonov, A. A.

    2016-02-01

    Advanced MRI techniques often require sampling in additional (non-spatial) dimensions, such as time or parametric dimensions, which significantly elongates scan time. Our purpose was to develop novel iterative image reconstruction methods to reduce the amount of acquired data in such applications using prior knowledge about the signal in the extra dimensions. Efforts were made to accelerate two applications, namely, time-resolved contrast-enhanced MR angiography and T1 mapping. Our results demonstrate that significant acceleration (up to 27x) may be achieved using our proposed iterative reconstruction techniques.

  14. Quantitative Methods for Reservoir Characterization and Improved Recovery: Application to Heavy Oil Sands

    SciTech Connect

    Castle, James W.; Molz, Fred W.; Bridges, Robert A.; Dinwiddie, Cynthia L.; Lorinovich, Caitlin J.; Lu, Silong

    2003-02-07

    This project involved application of advanced analytical property-distribution methods conditioned to continuous outcrop control for improved reservoir characterization and simulation. The investigation was performed in collaboration with Chevron Production Company U.S.A. as an industrial partner, and incorporates data from the Temblor Formation in Chevron's West Coalinga Field, California. Improved prediction of interwell reservoir heterogeneity was needed to increase productivity and to reduce recovery cost for California's heavy oil sands, which contained approximately 2.3 billion barrels of remaining reserves in the Temblor Formation and in other formations of the San Joaquin Valley.

  15. Application of advanced filtering methods to the determination of the interplanetary orbit of Mariner '71.

    NASA Technical Reports Server (NTRS)

    Rourke, K. H.; Jordan, J. F.

    1972-01-01

    This paper presents the results of the applications of advanced filtering methods to the determination of the interplanetary orbit of the Mariner '71 spacecraft. The advanced techniques are specific extensions of the Kalman filter. The special problems associated with applying these techniques are discussed and the particular algorithmic implementations are outlined. The advanced methods are compared against the weighted least squares filters of conventional application. The results reveal that relatively simple advanced filter configurations yield solutions superior to those of the conventional methods when applied to the Mariner '71 radio measurements.
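    As a hedged illustration of the recursive estimator that the paper's "advanced techniques" extend, the sketch below implements a minimal scalar Kalman filter tracking a constant state from noisy measurements. The measurement values and noise variances are invented, and real orbit determination filters operate on multidimensional dynamic state vectors.

```python
# Minimal scalar Kalman filter: recursive predict/update on noisy
# measurements of a constant state. All numbers are illustrative.

def kalman_1d(measurements, r=4.0, q=0.0, x0=0.0, p0=100.0):
    x, p = x0, p0                  # state estimate and its variance
    for z in measurements:
        p += q                     # predict: process noise inflates variance
        k = p / (p + r)            # Kalman gain
        x += k * (z - x)           # update with the measurement residual
        p *= (1 - k)               # updated estimate variance shrinks
    return x, p

estimate, variance = kalman_1d([9.8, 10.3, 9.9, 10.1, 10.2])
print(round(estimate, 2), round(variance, 2))
```

    Unlike a batch weighted-least-squares fit over all radio measurements, the filter refines its estimate measurement by measurement, which is what makes the sequential extensions compared in the paper possible.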

  16. Comparison of six DNA extraction methods for recovery of fungal DNA as assessed by quantitative PCR.

    PubMed

    Fredricks, David N; Smith, Caitlin; Meier, Amalia

    2005-10-01

    The detection of fungal pathogens in clinical samples by PCR requires the use of extraction methods that efficiently lyse fungal cells and recover DNA suitable for amplification. We used quantitative PCR assays to measure the recovery of DNA from two important fungal pathogens subjected to six DNA extraction methods. Aspergillus fumigatus conidia or Candida albicans yeast cells were added to bronchoalveolar lavage fluid and subjected to DNA extraction in order to assess the recovery of DNA from a defined number of fungal propagules. In order to simulate hyphal growth in tissue, Aspergillus fumigatus conidia were allowed to form mycelia in tissue culture media and then harvested for DNA extraction. Differences among the DNA yields from the six extraction methods were highly significant (P<0.0001) in each of the three experimental systems. An extraction method based on enzymatic lysis of fungal cell walls (yeast cell lysis plus the use of GNOME kits) produced high levels of fungal DNA with Candida albicans but low levels of fungal DNA with Aspergillus fumigatus conidia or hyphae. Extraction methods employing mechanical agitation with beads produced the highest yields with Aspergillus hyphae. The Master Pure yeast method produced high levels of DNA from C. albicans but only moderate yields from A. fumigatus. A reagent from one extraction method was contaminated with fungal DNA, including DNA from Aspergillus and Candida species. In conclusion, the six extraction methods produce markedly differing yields of fungal DNA and thus can significantly affect the results of fungal PCR assays. No single extraction method was optimal for all organisms. PMID:16207973
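    The quantitative PCR comparison above rests on mapping each sample's quantification cycle (Cq) to a DNA quantity through a standard curve, Cq = m*log10(quantity) + b, inverted for unknowns. The sketch below illustrates that inversion; the slope and intercept are invented (m = -3.32 corresponds to 100% amplification efficiency), not the assay's actual calibration.

```python
# Hedged sketch of qPCR standard-curve quantitation. Slope/intercept invented.

def quantity_from_cq(cq, slope=-3.32, intercept=38.0):
    """Invert Cq = slope*log10(quantity) + intercept for an unknown."""
    return 10 ** ((cq - intercept) / slope)

# Hypothetical same-input sample extracted by two methods:
qa = quantity_from_cq(24.7)
qb = quantity_from_cq(28.0)
print(f"method A recovers ~{qa / qb:.0f}x more template than method B")
```

    A roughly 3.3-cycle gap corresponds to about a 10-fold difference in recovered template, which is the scale on which the six extraction methods' yields were compared.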

  17. A small-scale method for quantitation of carotenoids in bacteria and yeasts.

    PubMed

    Kaiser, Philipp; Surmann, Peter; Vallentin, Gerald; Fuhrmann, Herbert

    2007-07-01

    Microbial carotenoids are difficult to extract because of their embedding in a compact matrix and their prominent sensitivity to degradation. Especially for carotenoid analysis of bacteria and yeasts, there is a lack of information about the capability, precision and recovery of the methods used. Accordingly, we investigated the feasibility, throughput and validity of a new small-scale method using Micrococcus luteus and Rhodotorula glutinis for testing purposes. For disintegration and extraction, we combined primarily mild techniques: enzymatically, we used combinations of lysozyme and lipase for bacteria as well as lyticase and lipase for yeasts. Additional mechanical treatment included sonication and freeze-thawing cycles. Chemical treatment with dimethylsulfoxide was applied for yeasts only. For extraction we used a methanol-chloroform mixture stabilized efficiently with butylated hydroxytoluene and alpha-tocopherol. Separation of compounds was achieved with HPLC, applying a binary methanol/tert-butyl methyl ether gradient on a polymeric reversed-phase C30 column. Substances of interest were detected and identified with a photodiode-array (PDA) detector, and carotenoids were quantitated as all-trans-beta-carotene equivalents. For evaluation of the recovery and reproducibility of the extraction method, we used beta-8'-apo-carotenal as an internal standard. The method provides a sensitive tool for the determination of carotenoids from bacteria and yeasts, and also for small changes in the carotenoid spectrum of a single species. The requisite large experiments are facilitated by the high throughput of the method. PMID:17509707

  18. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods

    NASA Astrophysics Data System (ADS)

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-01

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).
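    The two figures of merit quoted above, mean recovery and relative standard error (RSE), can be computed from predicted versus nominal content. The sketch below illustrates one common formulation with invented data; the paper's exact RSE normalization may differ.

```python
# Hedged sketch of mean recovery and relative standard error (RSE) for a
# calibration model. Nominal/predicted values are invented, not the paper's.

def mean_recovery_pct(predicted, nominal):
    return 100.0 * sum(p / n for p, n in zip(predicted, nominal)) / len(nominal)

def rse_pct(predicted, nominal):
    num = sum((p - n) ** 2 for p, n in zip(predicted, nominal))
    den = sum(n ** 2 for n in nominal)
    return 100.0 * (num / den) ** 0.5

nominal = [250.0, 500.0, 750.0]       # mg of analyte in synthetic mixtures
predicted = [249.1, 498.0, 746.9]     # hypothetical chemometric predictions
print(f"recovery {mean_recovery_pct(predicted, nominal):.1f}%, "
      f"RSE {rse_pct(predicted, nominal):.2f}%")
```

    Comparing these two statistics across PLS and GA-PLS models is how the study judged which algorithm was more reliable for each drug.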

  19. Quantitative measurement of ultrasound pressure field by optical phase contrast method and acoustic holography

    NASA Astrophysics Data System (ADS)

    Oyama, Seiji; Yasuda, Jun; Hanayama, Hiroki; Yoshizawa, Shin; Umemura, Shin-ichiro

    2016-07-01

    A fast and accurate measurement of an ultrasound field with various exposure sequences is necessary to ensure the efficacy and safety of various ultrasound applications in medicine. The most common method used to measure an ultrasound pressure field, that is, hydrophone scanning, requires a long scanning time and potentially disturbs the field. This may limit the efficiency of developing applications of ultrasound. In this study, an optical phase contrast method enabling fast and noninterfering measurements is proposed. In this method, the modulated phase of light caused by the focused ultrasound pressure field is measured. Then, a computed tomography (CT) algorithm used to quantitatively reconstruct a three-dimensional (3D) pressure field is applied. For a high-intensity focused ultrasound field, a new approach that combines the optical phase contrast method and acoustic holography was attempted. First, the optical measurement of focused ultrasound was rapidly performed over the field near a transducer. Second, the nonlinear propagation of the measured ultrasound was simulated. The result of the new approach agreed well with that of the measurement using a hydrophone and was improved from that of the phase contrast method alone with phase unwrapping.

  20. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods.

    PubMed

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-15

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%). PMID:26774813

  1. Validation procedures for quantitative gluten ELISA methods: AOAC allergen community guidance and best practices.

    PubMed

    Koerner, Terry B; Abbott, Michael; Godefroy, Samuel Benrejeb; Popping, Bert; Yeung, Jupiter M; Diaz-Amigo, Carmen; Roberts, James; Taylor, Steve L; Baumert, Joseph L; Ulberth, Franz; Wehling, Paul; Koehler, Peter

    2013-01-01

    The food allergen analytical community is endeavoring to create harmonized guidelines for the validation of food allergen ELISA methodologies to help protect food-sensitive individuals and promote consumer confidence. This document provides additional guidance to existing method validation publications for quantitative food allergen ELISA methods. The gluten-specific criterion provided in this document is divided into sections for information required by the method developer about the assay and information for the implementation of the multilaboratory validation study. Many of these recommendations and guidance are built upon the widely accepted Codex Alimentarius definitions and recommendations for gluten-free foods. The information in this document can be used as the basis of a harmonized validation protocol for any ELISA method for gluten, whether proprietary or nonproprietary, that will be submitted to AOAC andlor regulatory authorities or other bodies for status recognition. Future work is planned for the implementation of this guidance document for the validation of gluten methods and the creation of gluten reference materials. PMID:24282943

  2. A validated method for the quantitation of 1,1-difluoroethane using a gas in equilibrium method of calibration.

    PubMed

    Avella, Joseph; Lehrer, Michael; Zito, S William

    2008-10-01

1,1-Difluoroethane (DFE), also known as Freon 152A, is a member of a class of compounds known as halogenated hydrocarbons. A number of these compounds have gained notoriety because of their ability to induce rapid onset of intoxication after inhalation exposure. Abuse of DFE has necessitated development of methods for its detection and quantitation in postmortem and human performance specimens. Furthermore, methodologies applicable to research studies are required, as there have been limited toxicokinetic and toxicodynamic reports published on DFE. This paper describes a method for the quantitation of DFE using a gas chromatography-flame-ionization headspace technique that employs solventless standards for calibration. Two calibration curves using 0.5 mL whole-blood calibrators were developed, covering 0.225-1.350 mg/L (curve A) and 9.0-180.0 mg/L (curve B). These were evaluated for linearity (0.9992 and 0.9995), limit of detection (0.018 mg/L), limit of quantitation (0.099 mg/L; recovery 111.9%, CV 9.92%), and upper limit of linearity (27,000.0 mg/L). Combined-curve recovery for a 98.0 mg/L DFE control prepared using an alternate technique was 102.2%, with a CV of 3.09%. No matrix interference was observed in DFE-enriched blood, urine or brain specimens, nor did analysis of variance detect any significant differences (alpha = 0.01) in the area under the curve of blood, urine or brain specimens at three identical DFE concentrations. The method is suitable for use in forensic laboratories because validation was performed on instrumentation routinely used in forensic labs and because the calibration range can easily be adjusted. Perhaps more importantly, it is also useful for research-oriented studies, because removing solvent from standard preparation eliminates the possibility of solvent-induced changes to the gas/liquid partitioning of DFE and of chromatographic interference due to solvent in specimens. PMID:19007521
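Figures like the linearity, LOD, and LOQ quoted above fall out of regression statistics on a calibration curve. A generic sketch, with made-up calibrator responses and using the ICH-style 3.3σ/slope and 10σ/slope conventions rather than the authors' exact procedure:

```python
import numpy as np

# Hedged sketch (invented detector responses): calibration-line
# statistics of the kind reported for the low DFE curve.
conc = np.array([0.225, 0.450, 0.675, 0.900, 1.125, 1.350])   # mg/L
resp = np.array([0.051, 0.099, 0.152, 0.198, 0.251, 0.302])   # response (a.u.)

slope, intercept = np.polyfit(conc, resp, 1)
pred = slope * conc + intercept
ss_res = float(np.sum((resp - pred) ** 2))
ss_tot = float(np.sum((resp - resp.mean()) ** 2))
r_squared = 1.0 - ss_res / ss_tot

resid_sd = (ss_res / (len(conc) - 2)) ** 0.5    # residual standard deviation
lod = 3.3 * resid_sd / slope                    # ICH convention
loq = 10.0 * resid_sd / slope
```

With realistic calibrators, LOD and LOQ land below the lowest calibrator, as they should for a usable curve.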

  3. Quantitative Determination of Methylcyclohexanone Mixtures Using 13C NMR Spectroscopy: A Project for an Advanced Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Lefevre, Joseph W.; Silveira, Augustine, Jr.

    2000-01-01

    The percentage composition of mixtures of four methylcyclohexanones was determined using 13C NMR spectroscopy as a quantitative analytical method. The data were acquired using standard broadband proton decoupling and inverse-gated decoupling, the latter done both with and without the paramagnetic relaxation reagent chromium(III) acetylacetonate [Cr(acac)3]. The standard broadband decoupled spectrum resulted in percentages far from the actual values owing to the varying nuclear Overhauser enhancements (NOEs) and spin-lattice relaxation times (T1's) of the various carbon atoms. These effects were eliminated in the inverse-gated experiments, and the results were very close to the actual percentages. Before examining the mixtures, the students studied a pure sample of 2-methylcyclohexanone. They assigned the 13C spectrum and determined the T1 of the carbonyl group both with and without Cr(acac)3 using the inversion-recovery method. Then a five-times-T1 delay was inserted between pulses in all subsequent inverse-gated decoupling experiments. This project provides students with valuable experience with modern NMR techniques. These include COrrelated SpectroscopY (COSY), Distortionless Enhancement by Polarization Transfer (DEPT) spectroscopy, HETeronuclear CORrelated (HETCOR) spectroscopy, T1 determination, standard broadband versus inverse-gated decoupling, and the addition of a paramagnetic relaxation reagent to dramatically shorten both the T
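The five-times-T1 inter-pulse delay works because longitudinal magnetization recovers exponentially, and five time constants restore more than 99% of equilibrium magnetization. A small sketch of the inversion-recovery relation, with a hypothetical T1 value:

```python
import math

# Inversion-recovery sketch with a hypothetical T1 of 2.0 s:
# Mz(t) = M0 * (1 - 2 * exp(-t / T1)).
T1 = 2.0

def mz(t, t1=T1, m0=1.0):
    return m0 * (1.0 - 2.0 * math.exp(-t / t1))

# The signal nulls at t = T1 * ln(2), which is one way inversion-recovery
# experiments estimate T1.
t_null = T1 * math.log(2.0)

# Fraction of equilibrium magnetization recovered after a 5*T1 delay,
# justifying the inter-pulse delay used in the experiments above.
recovery_5t1 = 1.0 - math.exp(-5.0)
```

This is also why adding Cr(acac)3 helps: shortening T1 shrinks the required 5×T1 delay and hence the total experiment time.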

  4. Fluorometric method for the simultaneous quantitation of differently-sized nanoparticles in rodent tissue.

    PubMed

    Hussain, N

    2001-02-19

    The oral absorption and systemic translocation of particulate matter via the gastrointestinal tract has been shown by a number of laboratories using a wide variety of particles in different animal species. While there is debate on the magnitude of particle intestinal translocation, which is encumbered by the differing experimental protocols, particularly the method of quantitation of absorbed material, few have sought to examine the pharmacokinetic aspects of particle absorption. We describe in this communication the development of a simple and a rapid fluorometric assay of quantifying tissue-laden fluorescent nanoparticles that is able to isolate, detect and quantify the presence of two or more particle populations differing both in their size and fluorescent label. Six types of polystyrene nanoparticles incorporating different fluorescent markers were spiked in whole livers. The fluorophores were extracted using our previously developed method of freeze-drying the tissue and using chloroform as the extractive solvent. Only two types of particle populations, orange-labelled 40 nm and Fluoroscein-emitting 500 nm nanoparticles, were sufficiently recoverable and provided a high signal-to-noise ratio for further work. The amount of tissue and type of biological tissue type also impacted on the nanoparticle recovery and detection, reflecting, perhaps, the quenching effects of interacting tissue-derived molecules. In addition, the results also indicate that the use of nanoparticles incorporating fluorescent dyes that have emission over 500 nm overcome the tissue interfering autofluorescence for low doses of nanoparticles. The use of this fluorometric method has several advantages compared with other modes of quantitation in that it is rapid, non-radioactive and the marker is non-leaching. More importantly, it allows the simultaneous detection of multiple fluorophores such that two or more different fluorescent particle populations can be detected in the same sample

  5. A Novel Method of Quantitative Anterior Chamber Depth Estimation Using Temporal Perpendicular Digital Photography

    PubMed Central

    Zamir, Ehud; Kong, George Y.X.; Kowalski, Tanya; Coote, Michael; Ang, Ghee Soon

    2016-01-01

    Purpose We hypothesize that: (1) Anterior chamber depth (ACD) is correlated with the relative anteroposterior position of the pupillary image, as viewed from the temporal side. (2) Such a correlation may be used as a simple quantitative tool for estimation of ACD. Methods Two hundred sixty-six phakic eyes had lateral digital photographs taken from the temporal side, perpendicular to the visual axis, and underwent optical biometry (Nidek AL scanner). The relative anteroposterior position of the pupillary image was expressed using the ratio between: (1) lateral photographic temporal limbus to pupil distance (“E”) and (2) lateral photographic temporal limbus to cornea distance (“Z”). In the first chronological half of patients (Correlation Series), E:Z ratio (EZR) was correlated with optical biometric ACD. The correlation equation was then used to predict ACD in the second half of patients (Prediction Series) and compared to their biometric ACD for agreement analysis. Results A strong linear correlation was found between EZR and ACD, R = −0.91, R2 = 0.81. Bland-Altman analysis showed good agreement between predicted ACD using this method and the optical biometric ACD. The mean error was −0.013 mm (range −0.377 to 0.336 mm), standard deviation 0.166 mm. The 95% limits of agreement were ±0.33 mm. Conclusions Lateral digital photography and EZR calculation is a novel method to quantitatively estimate ACD, requiring minimal equipment and training. Translational Relevance EZ ratio may be employed in screening for angle closure glaucoma. It may also be helpful in outpatient medical clinic settings, where doctors need to judge the safety of topical or systemic pupil-dilating medications versus their risk of triggering acute angle closure glaucoma. Similarly, non ophthalmologists may use it to estimate the likelihood of acute angle closure glaucoma in emergency presentations. PMID:27540496
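The two-stage design (fit a line on a correlation series, then check agreement on a held-out prediction series) is easy to sketch. The linear E:Z-to-ACD relation and noise level below are invented to mimic the reported R ≈ -0.91; only the study design follows the abstract.

```python
import random

random.seed(1)

# Hedged sketch with synthetic eyes: assumed linear relation between the
# E:Z ratio (EZR) and anterior chamber depth (ACD), plus measurement noise.
def make_eye():
    acd = random.uniform(2.0, 4.0)                    # ACD in mm
    ezr = 1.1 - 0.18 * acd + random.gauss(0.0, 0.02)  # invented relation
    return ezr, acd

correlation_series = [make_eye() for _ in range(133)]
prediction_series = [make_eye() for _ in range(133)]

# Least-squares line ACD = a * EZR + b from the correlation series
n = len(correlation_series)
mx = sum(e for e, _ in correlation_series) / n
my = sum(a for _, a in correlation_series) / n
sxy = sum((e - mx) * (a - my) for e, a in correlation_series)
sxx = sum((e - mx) ** 2 for e, _ in correlation_series)
a = sxy / sxx
b = my - a * mx

# Agreement check on the held-out prediction series (Bland-Altman bias)
errors = [(a * e + b) - acd for e, acd in prediction_series]
mean_error = sum(errors) / len(errors)
```

As in the paper, deeper chambers give smaller E:Z ratios (negative slope), and the bias on the held-out series stays near zero.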

  6. Evaluation of residual antibacterial potency in antibiotic production wastewater using a real-time quantitative method.

    PubMed

    Zhang, Hong; Zhang, Yu; Yang, Min; Liu, Miaomiao

    2015-11-01

While antibiotic pollution has attracted considerable attention due to its potential to promote the dissemination of antibiotic resistance genes in the environment, the antibiotic activity of related substances has been neglected, which may lead to underestimation of the environmental impacts of antibiotic wastewater discharge. In this study, a real-time quantitative approach was established to evaluate the residual antibacterial potency of antibiotics and related substances in antibiotic production wastewater (APW) by comparing the growth of a standard bacterial strain (Staphylococcus aureus) in tested water samples with its growth in the presence of a standard reference substance (e.g. oxytetracycline). Antibiotic equivalent quantity (EQ) was used to express antibacterial potency, which made it possible to assess the contribution of each compound to the antibiotic activity in APW. The real-time quantitative method showed better repeatability (relative standard deviation, RSD, 1.08%) than the conventional fixed-growth-time method (RSD 5.62-11.29%). Its quantification limits ranged from 0.20 to 24.00 μg/L, depending on the antibiotic. We applied the developed method to analyze the residual potency of water samples from four APW treatment systems, and confirmed a significant contribution from antibiotic transformation products to the residual antibacterial activity. Specifically, neospiramycin, a major transformation product of spiramycin, was found to contribute 13.15-22.89% of residual potency in spiramycin production wastewater. In addition, some unknown related substances with antimicrobial activity were indicated in the effluent. The developed approach should be effective for the management of antibacterial potency discharge from antibiotic wastewater and other waste streams. PMID:26395288
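Expressing potency as an antibiotic equivalent quantity amounts to reading a sample's growth response off a standard curve for the reference antibiotic. A hedged sketch with an invented oxytetracycline curve, interpolating on log concentration:

```python
import math

# Hypothetical standard curve: oxytetracycline concentration (ug/L)
# versus S. aureus growth response relative to control. All numbers are
# invented; only the EQ-by-inverse-interpolation idea follows the text.
std_conc = [1.0, 5.0, 25.0, 125.0, 625.0]
std_resp = [0.95, 0.80, 0.55, 0.30, 0.10]   # assumed monotone decrease

def eq_from_response(resp):
    """Antibiotic equivalent quantity (EQ) by inverse interpolation
    on log10(concentration)."""
    logs = [math.log10(c) for c in std_conc]
    pts = list(zip(logs, std_resp))
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if y1 <= resp <= y0:
            frac = (y0 - resp) / (y0 - y1)
            return 10.0 ** (x0 + frac * (x1 - x0))
    raise ValueError("response outside calibration range")

eq = eq_from_response(0.55)   # sample as potent as 25 ug/L oxytetracycline
```

A transformation product with no standard of its own can still be assigned an EQ this way, which is what lets the authors apportion residual potency to compounds like neospiramycin.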

  7. A quantitative method for zoning of protected areas and its spatial ecological implications.

    PubMed

    Del Carmen Sabatini, María; Verdiell, Adriana; Rodríguez Iglesias, Ricardo M; Vidal, Marta

    2007-04-01

Zoning is a key prescriptive tool for the administration and management of protected areas. However, the lack of zoning is common for most protected areas in developing countries and, as a consequence, many protected areas are not effective in achieving the goals for which they were created. In this work, we introduce a quantitative method to expeditiously zone protected areas and evaluate its ecological implications on hypothetical zoning cases. A real-world application is reported for the Talampaya National Park, a UNESCO World Heritage Site located in Argentina. Our method is a modification of the zoning forest model developed by Bos [Bos, J., 1993. Zoning in forest management: a quadratic assignment problem solved by simulated annealing. Journal of Environmental Management 37, 127-145.]. The main innovations are a quadratic function of distance between land units, non-reciprocal weights for adjacent land uses (mathematically represented by a non-symmetric matrix), and the possibility of imposing a connectivity constraint. Due to its intrinsic spatial dimension, the zoning problem belongs to the NP-hard class, i.e., no polynomial-time solution algorithm is known [Nemhauser, G., Wolsey, L., 1988. Integer and Combinatorial Optimization. John Wiley, New York.]. We therefore applied a simulated annealing heuristic implemented as a FORTRAN routine. Our innovations were effective in achieving zoning designs more compatible with biological diversity protection. The quadratic distance term facilitated the delineation of core zones for elements of significance; the connectivity constraint minimized fragmentation; and the non-reciprocal land use weightings better represented management decisions, mainly influencing the edge and shape of zones. This quantitative method can assist the zoning process within protected areas by offering many zonation scheme alternatives with minimum cost, time and effort.
This ability provides a new tool to
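The quadratic-assignment-with-simulated-annealing core of the method can be sketched in miniature. Everything below (the weight matrix, grid size, quotas, cooling schedule) is invented; the real model works on actual land units and adds the connectivity constraint.

```python
import math
import random

# Miniature, hedged sketch of the zoning model: assign one of three uses
# (0 = core, 1 = buffer, 2 = public) to cells of a 5x5 grid under fixed
# area quotas (a quadratic assignment problem). W is a non-symmetric
# incompatibility matrix -- W[u][v] is the cost of use v sited near use
# u -- and conflicts are weighted by inverse squared distance.
random.seed(0)
W = [[0, 1, 4],
     [2, 0, 1],
     [6, 3, 0]]
N = 5
cells = [(i, j) for i in range(N) for j in range(N)]

def cost(z):
    total = 0.0
    for p in cells:
        for q in cells:
            if p != q:
                d2 = (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
                total += W[z[p]][z[q]] / d2   # nearby conflicts cost more
    return total

uses = [0] * 9 + [1] * 8 + [2] * 8            # prescribed area quotas
random.shuffle(uses)
z = dict(zip(cells, uses))
initial = cur = best = cost(z)

T = 5.0
for _ in range(2000):                          # simulated annealing
    p, q = random.sample(cells, 2)
    z[p], z[q] = z[q], z[p]                    # propose a swap
    new = cost(z)
    if new < cur or random.random() < math.exp((cur - new) / T):
        cur = new                              # accept
    else:
        z[p], z[q] = z[q], z[p]                # reject: undo
    best = min(best, cur)
    T *= 0.998                                 # cool
```

Swapping cells (rather than reassigning them freely) preserves the area quotas, which is what makes this an assignment problem rather than unconstrained labeling.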

  8. From "weight of evidence" to quantitative data integration using multicriteria decision analysis and Bayesian methods.

    PubMed

    Linkov, Igor; Massey, Olivia; Keisler, Jeff; Rusyn, Ivan; Hartung, Thomas

    2015-01-01

"Weighing" available evidence in the process of decision-making is unavoidable, yet it is one step that routinely raises suspicions: what evidence should be used, how much does it weigh, and whose thumb may be tipping the scales? This commentary aims to evaluate the current state and future roles of various types of evidence for hazard assessment as it applies to environmental health. In its recent evaluation of the US Environmental Protection Agency's Integrated Risk Information System assessment process, the National Research Council committee singled out the term "weight of evidence" (WoE) for critique, deeming the process too vague and detractive to the practice of evaluating human health risks of chemicals. Moving the methodology away from qualitative, vague and controversial methods towards generalizable, quantitative and transparent methods for appropriately managing diverse lines of evidence is paramount for both regulatory and public acceptance of the hazard assessments. The choice of terminology notwithstanding, a number of recent Bayesian WoE-based methods, the emergence of multicriteria decision analysis for WoE applications, as well as the general principles behind the foundational concepts of WoE, show promise in how to move forward and regain trust in the data integration step of the assessments. We offer our thoughts on the current state of WoE as a whole, and while we acknowledge that many WoE applications have been largely qualitative and subjective in nature, we see this as an opportunity to turn WoE in a quantitative direction that includes Bayesian and multicriteria decision analysis. PMID:25592482
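The Bayesian flavor of quantitative evidence integration can be shown in a few lines: each line of evidence is summarized as a likelihood ratio and multiplied into the prior odds. The likelihood ratios and prior below are invented, and real Bayesian WoE methods are considerably richer; this only illustrates the transparency the commentary argues for.

```python
# Hedged sketch of Bayesian evidence integration (all numbers invented):
# each line of evidence is summarized as a likelihood ratio
# LR = P(evidence | hazard) / P(evidence | no hazard), and Bayes' rule
# turns a prior probability into a posterior -- a transparent,
# quantitative alternative to qualitative "weighing".
def update(prior, likelihood_ratios):
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Assumed inputs: in vitro assay (LR = 4), animal study (LR = 6),
# weak epidemiology (LR = 1.5), starting from a 10% prior.
posterior = update(0.10, [4.0, 6.0, 1.5])
```

Because every weight is an explicit number, the "whose thumb is on the scales" question becomes auditable: changing any LR changes the posterior in a reproducible way.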

  9. Evaluation of a quantitative fit testing method for N95 filtering facepiece respirators.

    PubMed

    Janssen, Larry; Luinenburg, Michael D; Mullins, Haskell E; Danisch, Susan G; Nelson, Thomas J

    2003-01-01

A method for performing quantitative fit tests (QNFT) with N95 filtering facepiece respirators was developed by earlier investigators. The method employs a simple clamping device to allow the penetration of submicron aerosols through N95 filter media to be measured. The measured value is subtracted from total penetration, on the assumption that the remaining penetration represents faceseal leakage. The developers have used the clamp to assess respirator performance. This study evaluated the clamp's ability to measure filter penetration and determine fit factors. In Phase 1, subjects were quantitatively fit-tested with elastomeric half-facepiece respirators using both generated and ambient aerosols. QNFT were done with each aerosol with both P100 and N95 filters without disturbing the facepiece. In Phase 2 of the study, elastomeric half-facepieces were sealed to subjects' faces to eliminate faceseal leakage. Ambient aerosol QNFT were performed with P100 and N95 filters without disturbing the facepiece. In both phases the clamp was used to measure N95 filter penetration, which was then subtracted from total penetration for the N95 QNFT. It was hypothesized that N95 fit factors corrected for filter penetration would equal the P100 fit factors. Mean corrected N95 fit factors were significantly different from the P100 fit factors in each phase of the study. In addition, there was essentially no correlation between corrected N95 fit factors and P100 fit factors. It was concluded that the clamp method should not be used to fit-test N95 filtering facepieces or otherwise assess respirator performance. PMID:12908863
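The arithmetic being tested is simple: a fit factor is the reciprocal of penetration, and the clamp method assumes total penetration is filter penetration plus faceseal leakage. The numbers below are invented to show the correction the study evaluated; note the study's conclusion was that, in practice, corrected N95 fit factors did not match P100 fit factors.

```python
# Sketch of the correction tested in the study (assumed numbers): if
# total penetration = filter penetration + faceseal leakage, then
# subtracting the clamp-measured filter penetration should leave
# faceseal leakage only, giving a "corrected" fit factor.
p_total = 0.035    # 3.5% total penetration during the N95 QNFT
p_filter = 0.025   # 2.5% filter penetration measured with the clamp

ff_uncorrected = 1.0 / p_total          # ~29, dominated by the filter
p_faceseal = p_total - p_filter         # assumed faceseal leakage
ff_corrected = 1.0 / p_faceseal         # 100, faceseal-only fit factor
```

The study's data showed this additive-penetration assumption does not hold well enough for fit testing, which is why the clamp method was rejected.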

  10. Spiked proteomic standard dataset for testing label-free quantitative software and statistical methods

    PubMed Central

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Dorssaeler, Alain Van; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2015-01-01

This data article describes a controlled, spiked proteomic dataset for which the “ground truth” of variant proteins is known. It is based on the LC-MS analysis of samples composed of a fixed background of yeast lysate and different spiked amounts of the UPS1 mixture of 48 recombinant proteins. It can be used to objectively evaluate bioinformatic pipelines for label-free quantitative analysis, and their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. More specifically, it can be useful for tuning software tool parameters, but also for testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. The raw MS files can be downloaded from ProteomeXchange with identifier PXD001819. Starting from some raw files of this dataset, we also provide here some processed data obtained through various bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, to exemplify the use of such data in the context of software benchmarking, as discussed in detail in the accompanying manuscript [1]. The experimental design used here for data processing takes advantage of the different spike levels introduced in the samples composing the dataset, and processed data are merged in a single file to facilitate the evaluation and illustration of software tool results for the detection of variant proteins with different absolute expression levels and fold change values. PMID:26862574

  11. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829
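For the "agreement study without a reference standard" design mentioned above, the standard summary is a Bland-Altman bias and 95% limits of agreement on paired measurements. A hedged sketch with hypothetical paired outputs of two algorithms measuring the same QIB (e.g., tumor volume in mL):

```python
import statistics

# Hedged sketch: paired measurements of one QIB by two algorithms on the
# same cases (all values invented), summarized as Bland-Altman bias and
# 95% limits of agreement.
alg_a = [10.2, 15.1, 8.9, 22.4, 17.8, 12.5, 30.1, 25.6]
alg_b = [10.8, 14.6, 9.4, 23.5, 17.1, 13.2, 29.0, 26.4]

diffs = [a - b for a, b in zip(alg_a, alg_b)]
bias = statistics.mean(diffs)                # systematic difference
sd = statistics.stdev(diffs)                 # scatter of differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement
```

Whether the resulting limits are acceptable is a clinical judgment, not a statistical one, which is one of the framework questions the review takes up.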

  12. Autoradiographic method for quantitation of radiolabeled proteins in tissues using indium-111

    SciTech Connect

Morrell, E.M.; Tompkins, R.G.; Fischman, A.J.; Wilkinson, R.A.; Burke, J.F.; Rubin, R.H.; Strauss, H.W.; Yarmush, M.L.

    1989-09-01

A quantitative autoradiographic method was developed to measure 111In-labeled proteins in extravascular tissues with a spatial resolution sufficient to associate these proteins with tissue morphology. A linear relationship between measured grain density and isotope concentration was demonstrated with uniformly-labeled standard sources of epoxy-embedded gelatin containing (111In)albumin; the half-distance of spatial resolution was 0.6 micron. The technique was illustrated by measuring 24-hr accumulation of diethylenetriaminepentaacetic acid-coupled 111In-labeled human polyclonal IgG and human serum albumin (HSA) in a thigh infection model in the rat. Gamma camera images localized the infection and showed target-to-background ratios of 2.5 ± 0.3 for IgG and 1.4 ± 0.02 for HSA (mean ± s.d., n = 3). Using quantitative autoradiography, significantly higher average tissue concentrations were found in the infected thighs, at 4 to 4.5% of the initial plasma concentrations, as compared to 0.2 to 0.3% of initial plasma concentrations in the noninfected thigh (p < 0.05); these radiolabeled proteins were not inflammatory cell associated and localized primarily within the edematous interstitial spaces of the infection.

  13. Characterization of a method for quantitating food consumption for mutation assays in Drosophila

    SciTech Connect

Thompson, E.D.; Reeder, B.A.; Bruce, R.D.

    1991-01-01

Quantitation of food consumption is necessary when determining mutation responses to multiple chemical exposures in the sex-linked recessive lethal assay in Drosophila. One method proposed for quantitating food consumption by Drosophila is to measure the incorporation of 14C-leucine into the flies during the feeding period. Three sources of variation in the technique of Thompson and Reeder have been identified and characterized. First, the amount of food consumed by individual flies differed by almost 30% in a 24 hr feeding period. Second, the variability from vial to vial (each containing multiple flies) was around 15%. Finally, the amount of food consumed in identical feeding experiments performed over the course of 1 year varied nearly 2-fold. The use of chemical consumption values in place of exposure levels provided a better means of expressing the combined mutagenic response. In addition, the kinetics of food consumption over a 3 day feeding period were compared between lethal and non-lethal exposures to cyclophosphamide. Extensive characterization of the lethality induced by cyclophosphamide exposure demonstrates that it is most likely due to starvation, not chemical toxicity.

  14. A novel image-based quantitative method for the characterization of NETosis

    PubMed Central

    Zhao, Wenpu; Fogg, Darin K.; Kaplan, Mariana J.

    2015-01-01

    NETosis is a newly recognized mechanism of programmed neutrophil death. It is characterized by a stepwise progression of chromatin decondensation, membrane rupture, and release of bactericidal DNA-based structures called neutrophil extracellular traps (NETs). Conventional ‘suicidal’ NETosis has been described in pathogenic models of systemic autoimmune disorders. Recent in vivo studies suggest that a process of ‘vital’ NETosis also exists, in which chromatin is condensed and membrane integrity is preserved. Techniques to assess ‘suicidal’ or ‘vital’ NET formation in a specific, quantitative, rapid and semiautomated way have been lacking, hindering the characterization of this process. Here we have developed a new method to simultaneously assess both ‘suicidal’ and ‘vital’ NETosis, using high-speed multi-spectral imaging coupled to morphometric image analysis, to quantify spontaneous NET formation observed ex-vivo or stimulus-induced NET formation triggered in vitro. Use of imaging flow cytometry allows automated, quantitative and rapid analysis of subcellular morphology and texture, and introduces the potential for further investigation using NETosis as a biomarker in pre-clinical and clinical studies. PMID:26003624

  15. Three-band MRI image fusion utilizing the wavelet-based method optimized with two quantitative fusion metrics

    NASA Astrophysics Data System (ADS)

    Zheng, Yufeng; Elmaghraby, Adel S.; Frigui, Hichem

    2006-03-01

In magnetic resonance imaging (MRI), three bands of images (an "MRI triplet") are available: T1-, T2- and PD-weighted images. The three images of an MRI triplet provide complementary structure information, so it is useful for diagnosis and subsequent analysis to combine the three-band images into one. We propose an advanced discrete wavelet transform (αDWT) for three-band MRI image fusion, and the αDWT algorithm is further optimized using two quantitative fusion metrics - the image quality index (IQI) and the ratio spatial frequency error (rSFe). In the αDWT method, principal component analysis (PCA) and morphological processing are incorporated into a regular DWT fusion algorithm. Furthermore, the αDWT has two adjustable parameters - the level of DWT decomposition (Ld) and the length of the selected wavelet (Lw) - that directly affect the fusion result. The fused image quality can be quantitatively measured with the established metrics - IQI and rSFe. By varying the control parameters (Ld and Lw), an iterative fusion procedure can be implemented and run until an optimized fusion is achieved. We fused and analyzed several MRI triplets from the Visible Human Project® female dataset. From the quantitative and qualitative evaluations of fused images, we found that (1) the αDWTi-IQI algorithm produces a smoothed image whereas the αDWTi-rSFe algorithm yields a sharpened image; (2) the fused image "T1+T2" is the most informative in comparison with the other two-in-one fusions (PD+T1 and PD+T2); and (3) for three-in-one fusions, no significant difference is observed among (PD+T1)+T2, (PD+T2)+T1 and (T1+T2)+PD, so the order of fusion does not play an important role. The fused images can significantly benefit medical diagnosis as well as further image processing such as multi-modality image fusion (with CT images), visualization (colorization), segmentation, classification and computer-aided diagnosis (CAD).
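Of the two metrics used to steer the fusion, the image quality index has a compact closed form. A hedged, single-window sketch on synthetic images (the published index is computed over small sliding windows and averaged; this global version only illustrates the formula):

```python
import numpy as np

def iqi(x, y):
    """Global Wang-Bovik universal image quality index (simplified:
    one window covering the whole image instead of a sliding window)."""
    x = x.astype(float)
    y = y.astype(float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    # Combines correlation loss, luminance distortion, contrast distortion
    return 4.0 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))

rng = np.random.default_rng(0)
img = rng.uniform(50.0, 200.0, (32, 32))        # stand-in "reference" image
noisy = img + rng.normal(0.0, 10.0, img.shape)  # degraded copy
q_same = iqi(img, img)                          # identical images score 1
q_degraded = iqi(img, noisy)                    # degradation lowers the score
```

An iterative optimizer like the one described above simply re-fuses with new (Ld, Lw) and keeps the parameters that raise this score (or lower rSFe).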

  16. Advanced materials and methods for next generation spintronics

    NASA Astrophysics Data System (ADS)

    Siegel, Gene Phillip

The modern age is filled with ever-advancing electronic devices, and this dissertation continues the pursuit of faster, smaller, better electronics. Specifically, it addresses a field known as "spintronics": electronic devices based on an electron's spin, not just its charge. The field of spintronics originated in 1990, when Datta and Das first proposed a "spin transistor" that would function by passing a spin-polarized current from a magnetic electrode into a semiconductor channel. The spins in the channel could then be manipulated by applying an electrical voltage across the gate of the device. However, it has since been found that a great amount of scattering occurs at the ferromagnet/semiconductor interface due to the large impedance mismatch between the two materials. Because of this, three updated versions of the spintronic transistor were proposed to improve spin injection: one using a ferromagnetic semiconductor electrode, one adding a tunnel barrier between the ferromagnet and semiconductor, and one utilizing a ferromagnetic tunnel barrier that would act as a spin filter. It was next proposed that it may be possible to achieve a "pure spin current", that is, a spin current with no concurrent electric current (i.e., no net flow of electrons). One such method is the spin Seebeck effect, discovered in 2008 by Uchida et al., in which a thermal gradient in a magnetic material generates a spin current that can be injected into an adjacent material as a pure spin current. The first section of this dissertation addresses this spin Seebeck effect (SSE). The goal was to create such a device that both performs better than previously reported devices and is capable of operating without the aid of an external magnetic field. We were successful in this endeavor. The trick to achieving both of these goals was found to be in the roughness of the magnetic

  17. Clustering and training set selection methods for improving the accuracy of quantitative laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Anderson, Ryan B.; Bell, James F., III; Wiens, Roger C.; Morris, Richard V.; Clegg, Samuel M.

    2012-04-01

    We investigated five clustering and training set selection methods to improve the accuracy of quantitative chemical analysis of geologic samples by laser induced breakdown spectroscopy (LIBS) using partial least squares (PLS) regression. The LIBS spectra were previously acquired for 195 rock slabs and 31 pressed powder geostandards under 7 Torr CO2 at a stand-off distance of 7 m at 17 mJ per pulse to simulate the operational conditions of the ChemCam LIBS instrument on the Mars Science Laboratory Curiosity rover. The clustering and training set selection methods, which do not require prior knowledge of the chemical composition of the test-set samples, are based on grouping similar spectra and selecting appropriate training spectra for the partial least squares (PLS2) model. These methods were: (1) hierarchical clustering of the full set of training spectra and selection of a subset for use in training; (2) k-means clustering of all spectra and generation of PLS2 models based on the training samples within each cluster; (3) iterative use of PLS2 to predict sample composition and k-means clustering of the predicted compositions to subdivide the groups of spectra; (4) soft independent modeling of class analogy (SIMCA) classification of spectra, and generation of PLS2 models based on the training samples within each class; (5) use of Bayesian information criteria (BIC) to determine an optimal number of clusters and generation of PLS2 models based on the training samples within each cluster. The iterative method and the k-means method using 5 clusters showed the best performance, improving the absolute quadrature root mean squared error (RMSE) by ~ 3 wt.%. The statistical significance of these improvements was ~ 85%. Our results show that although clustering methods can modestly improve results, a large and diverse training set is the most reliable way to improve the accuracy of quantitative LIBS. 
In particular, additional sulfate standards and specifically fabricated
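
The cluster-based training-set selection in method (2) above can be sketched in a few lines. The sketch below is illustrative only: the "spectra" are random stand-ins, the cluster count mirrors the k = 5 case from the abstract, and a real pipeline would go on to fit a PLS2 model on each selected subset (here we stop at the subset-selection step the abstract describes).

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-ins for LIBS spectra: 40 training and 8 test "spectra" of 50 channels.
X_train = rng.random((40, 50))
X_test = rng.random((8, 50))

def kmeans(X, k, iters=50, seed=0):
    """Minimal Lloyd's algorithm; returns (centroids, labels)."""
    r = np.random.default_rng(seed)
    C = X[r.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - C[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if (labels == c).any():
                C[c] = X[labels == c].mean(0)
    return C, labels

# Cluster the training spectra, assign each test spectrum to its nearest
# centroid, and select only that cluster's training members for modeling.
C, train_labels = kmeans(X_train, k=5)
test_labels = np.argmin(((X_test[:, None, :] - C[None]) ** 2).sum(-1), axis=1)

subsets = {c: np.flatnonzero(train_labels == c) for c in range(5)}
for i, c in enumerate(test_labels):
    print(f"test spectrum {i}: use {len(subsets[c])} training spectra from cluster {c}")
```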

  18. Methods and Applications for Advancing Distance Education Technologies: International Issues and Solutions

    ERIC Educational Resources Information Center

    Syed, Mahbubur Rahman, Ed.

    2009-01-01

    The emerging field of advanced distance education delivers academic courses across time and distance, allowing educators and students to participate in a convenient learning method. "Methods and Applications for Advancing Distance Education Technologies: International Issues and Solutions" demonstrates communication technologies, intelligent…

  19. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    SciTech Connect

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

Here, the accumulation of bacteria in surface-attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early-stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early-stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early-stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad-spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
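
The threshold-and-count core of such an image-analysis algorithm can be sketched on a synthetic image; the intensity levels, noise, and threshold below are invented for illustration (a real implementation might pick the threshold automatically, e.g. with Otsu's method).

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a photograph of a stained coupon: background is
# bright (~0.9), "fouled" pixels are darkened by the stain (~0.4).
img = np.full((200, 200), 0.9) + 0.02 * rng.standard_normal((200, 200))
rr, cc = np.ogrid[:200, :200]
mask_true = (rr - 100) ** 2 + (cc - 100) ** 2 < 40 ** 2  # a circular "colony"
img[mask_true] = 0.4 + 0.02 * rng.standard_normal(mask_true.sum())

# Objective quantification in the spirit of the abstract: threshold the
# stain-enhanced contrast, then report areal coverage and a fouling
# "intensity" (how dark the fouled pixels are relative to background).
threshold = 0.65  # chosen between the two intensity modes
fouled = img < threshold
coverage = fouled.mean()                        # fraction of surface fouled
intensity = (0.9 - img[fouled]).clip(0).mean()  # mean darkening under the stain

print(f"coverage = {coverage:.3f}, intensity = {intensity:.3f}")
```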

  20. Quantitative analysis of eugenol in clove extract by a validated HPLC method.

    PubMed

    Yun, So-Mi; Lee, Myoung-Heon; Lee, Kwang-Jick; Ku, Hyun-Ok; Son, Seong-Wan; Joo, Yi-Seok

    2010-01-01

Clove (Eugenia caryophyllata) is a well-known medicinal plant used in Korea for diarrhea and digestive disorders, and as an antiseptic. Eugenol is the main active ingredient of clove and has been chosen as a marker compound for the chemical evaluation or QC of clove. This paper reports the development and validation of an HPLC-diode array detection (DAD) method for the determination of eugenol in clove. HPLC separation was accomplished on an XTerra RP18 column (250 x 4.6 mm id, 5 microm) with an isocratic mobile phase of 60% methanol and DAD at 280 nm. Calibration graphs were linear with very good correlation coefficients (r2 > 0.9999) from 12.5 to 1000 ng/mL. The LOD was 0.81 ng/mL and the LOQ was 2.47 ng/mL. The method showed good intraday precision (%RSD 0.08-0.27%) and interday precision (%RSD 0.32-1.19%). The method was applied to the analysis of eugenol from clove cultivated in various countries (Indonesia, Singapore, and China). Quantitative analysis of the 15 clove samples showed that the content of eugenol varied significantly, ranging from 163 to 1049 ppb. Based on these results, the HPLC method for determining eugenol is accurate enough to evaluate the quality and safety of clove. PMID:21313806
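
The reported figures of merit follow from a standard least-squares calibration: fit response versus concentration, compute r², and estimate LOD/LOQ from the residual standard deviation using the ICH-style factors 3.3σ/S and 10σ/S. The calibration data below are invented for illustration, so the resulting numbers will not match the study's.

```python
import numpy as np

# Illustrative calibration data (ng/mL vs detector response); the real
# values in the study are not reproduced here.
conc = np.array([12.5, 25, 50, 100, 250, 500, 1000])
area = 3.2 * conc + 1.5 + np.array([0.8, -0.5, 0.3, -0.9, 0.6, -0.2, 0.4])

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - ((area - pred) ** 2).sum() / ((area - area.mean()) ** 2).sum()

# ICH-style estimates from the residual standard deviation of the fit:
sigma = np.sqrt(((area - pred) ** 2).sum() / (len(conc) - 2))
lod = 3.3 * sigma / slope
loq = 10 * sigma / slope

print(f"r^2 = {r2:.5f}, LOD = {lod:.2f} ng/mL, LOQ = {loq:.2f} ng/mL")
```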

  1. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    NASA Astrophysics Data System (ADS)

    Ryan, C. G.; Laird, J. S.; Fisher, L. A.; Kirkham, R.; Moorhead, G. F.

    2015-11-01

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.

  2. Methods for the Specific Detection and Quantitation of Amyloid-β Oligomers in Cerebrospinal Fluid.

    PubMed

    Schuster, Judith; Funke, Susanne Aileen

    2016-05-01

Protein misfolding and aggregation are fundamental features of the majority of neurodegenerative diseases, like Alzheimer's disease (AD), Parkinson's disease, frontotemporal dementia, and prion diseases. Proteinaceous deposits in the brain of the patient, e.g., amyloid plaques consisting of the amyloid-β (Aβ) peptide and tangles composed of tau protein, are the hallmarks of AD. Soluble oligomers of Aβ and tau play a fundamental role in disease progression, and specific detection and quantification of the respective oligomeric proteins in cerebrospinal fluid may provide presymptomatically detectable biomarkers, paving the way for early diagnosis or even prognosis. Several studies on the development of techniques for the specific detection of Aβ oligomers have been published, but some of the existing tools do not yet seem to be satisfactory, and the study results are contradictory. The detection of oligomers is challenging due to their polymorphous and unstable nature, their low concentration, and the presence of competing proteins and Aβ monomers in body fluids. Here, we present an overview of the current state of the development of methods for Aβ oligomer-specific detection and quantitation. The methods are divided into three subgroups: (i) enzyme-linked immunosorbent assays (ELISA), (ii) methods for single oligomer detection, and (iii) others, which are mainly biosensor-based methods. PMID:27163804

  3. Combination of an enzymatic method and HPLC for the quantitation of cholesterol in cultured cells.

    PubMed

    Contreras, J A; Castro, M; Bocos, C; Herrera, E; Lasunción, M A

    1992-06-01

The study of the cellular events that lead to foam cell formation requires fast, accurate, and sensitive methods to quantify cholesterol in cultured cells. Here we describe a procedure that allows the rapid determination of free and total cholesterol in a reduced number of cells, which makes it very suitable for cholesterol determination in cell cultures. The method consists of the enzymatic conversion of cholesterol to cholest-4-ene-3-one by cholesterol oxidase, followed by analysis of the sample by high performance liquid chromatography (HPLC) to detect this oxidized product. Due to the relatively high wavelength at which cholest-4-ene-3-one has its maximum absorption (240 nm), other cellular components do not interfere with the chromatographic procedure and prior lipid extraction is not required. Moreover, each chromatogram takes about 3 min, contributing to the speed of the method. All the cholesteryl esters used (oleate, palmitate, stearate and linoleate) were quantitatively hydrolyzed by incubation with cholesterol esterase; this was observed both with pure standards and in cell homogenates. Sensitivity is sufficient to allow the determination of free and total cholesterol in fewer than 5 x 10(3) cells. We have applied this method to human monocyte-derived macrophages, and the values obtained for free and total cholesterol are in close agreement with published data. PMID:1512516

  4. Quantitative analysis and efficiency study of PSD methods for a LaBr3:Ce detector

    NASA Astrophysics Data System (ADS)

    Zeng, Ming; Cang, Jirong; Zeng, Zhi; Yue, Xiaoguang; Cheng, Jianping; Liu, Yinong; Ma, Hao; Li, Junli

    2016-03-01

The LaBr3:Ce scintillator has been widely studied for nuclear spectroscopy because of its excellent energy resolution (<3% @ 662 keV) and time resolution (~300 ps). Despite these promising properties, the intrinsic radiation background of LaBr3:Ce is a critical issue, and pulse shape discrimination (PSD) has been shown to be an efficient method to suppress the alpha background from 227Ac. In this paper, the charge comparison method (CCM) for alpha and gamma discrimination in LaBr3:Ce is quantitatively analysed and compared with two other typical PSD methods using digital pulse processing. The algorithm parameters and discrimination efficiency are calculated for each method. Moreover, for the CCM, the correlation between the CCM feature value distribution and the total charge (energy) is studied; a fitting equation for this correlation is inferred and experimentally verified. Using these equations, an energy-dependent threshold can be chosen to optimize the discrimination efficiency. Additionally, the experimental results show a potential application in low-activity, high-energy γ measurement by suppressing the alpha background.
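
The charge comparison method itself is simple: integrate the pulse tail and divide by the total integrated charge; alpha and gamma events separate because their slow-component fractions differ. The pulse shapes, time constants, gate position, and tail fractions below are invented for illustration, not LaBr3:Ce measurements.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0, 300, 1.0)  # sample times, ns

def pulse(tail_fraction):
    """Toy scintillation pulse: fast (25 ns) + slow (120 ns) exponentials."""
    fast = np.exp(-t / 25)
    slow = np.exp(-t / 120)
    p = (1 - tail_fraction) * fast / fast.sum() + tail_fraction * slow / slow.sum()
    return p + 0.0005 * rng.standard_normal(t.size)  # small electronic noise

# Assume alphas carry a larger slow-component share than gammas (illustrative).
gammas = [pulse(0.10) for _ in range(200)]
alphas = [pulse(0.25) for _ in range(200)]

def ccm(p, gate=60):
    """Charge comparison method: tail charge over total charge."""
    return p[gate:].sum() / p.sum()

g = np.array([ccm(p) for p in gammas])
a = np.array([ccm(p) for p in alphas])
cut = (g.mean() + a.mean()) / 2
eff = ((a > cut).mean() + (g < cut).mean()) / 2  # mean discrimination efficiency
print(f"gamma CCM ~ {g.mean():.3f}, alpha CCM ~ {a.mean():.3f}, efficiency ~ {eff:.2f}")
```

In the abstract's scheme the fixed `cut` would be replaced by an energy-dependent threshold derived from the fitted CCM-versus-charge correlation.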

  5. Quantitative calcium resistivity based method for accurate and scalable water vapor transmission rate measurement.

    PubMed

    Reese, Matthew O; Dameron, Arrelaine A; Kempe, Michael D

    2011-08-01

    The development of flexible organic light emitting diode displays and flexible thin film photovoltaic devices is dependent on the use of flexible, low-cost, optically transparent and durable barriers to moisture and/or oxygen. It is estimated that this will require high moisture barriers with water vapor transmission rates (WVTR) between 10(-4) and 10(-6) g/m(2)/day. Thus there is a need to develop a relatively fast, low-cost, and quantitative method to evaluate such low permeation rates. Here, we demonstrate a method where the resistance changes of patterned Ca films, upon reaction with moisture, enable one to calculate a WVTR between 10 and 10(-6) g/m(2)/day or better. Samples are configured with variable aperture size such that the sensitivity and/or measurement time of the experiment can be controlled. The samples are connected to a data acquisition system by means of individual signal cables permitting samples to be tested under a variety of conditions in multiple environmental chambers. An edge card connector is used to connect samples to the measurement wires enabling easy switching of samples in and out of test. This measurement method can be conducted with as little as 1 h of labor time per sample. Furthermore, multiple samples can be measured in parallel, making this an inexpensive and high volume method for measuring high moisture barriers. PMID:21895269
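
The conversion from Ca-film resistance change to a water vapor transmission rate can be sketched with the conductance relation commonly used for electrical Ca tests. The stoichiometry (2 H2O per Ca), material constants, and geometry below are generic textbook-style values, not the paper's, and the sketch assumes the barrier area equals the Ca film area.

```python
# Convert the rate of Ca-film conductance loss into a WVTR (illustrative values).
N = 2            # moles of H2O consumed per mole of Ca (Ca + 2H2O -> Ca(OH)2 + H2)
M_H2O = 18.0     # g/mol
M_CA = 40.1      # g/mol
RHO_CA = 1.55    # g/cm^3, calcium density
RES_CA = 3.4e-6  # Ohm*cm, calcium resistivity
L_OVER_B = 1.0   # trace length / width (square film)

def wvtr(conductance_loss_rate):
    """conductance_loss_rate: magnitude of d(1/R)/dt in 1/(Ohm*s).
    Returns WVTR in g/m^2/day."""
    g_per_cm2_s = N * (M_H2O / M_CA) * RHO_CA * RES_CA * L_OVER_B * conductance_loss_rate
    return g_per_cm2_s * 1e4 * 86400  # cm^2 -> m^2, s -> day

# Example: conductance decays by 1e-6 1/Ohm over one day.
rate = wvtr(1e-6 / 86400)
print(f"WVTR ~ {rate:.2e} g/m^2/day")
```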

  6. Spectrophotometric Method for Quantitative Determination of Cefixime in Bulk and Pharmaceutical Preparation Using Ferroin Complex

    NASA Astrophysics Data System (ADS)

    Naeem Khan, M.; Qayum, A.; Ur Rehman, U.; Gulab, H.; Idrees, M.

    2015-09-01

A method was developed for the quantitative determination of cefixime in bulk and pharmaceutical preparations using the ferroin complex. The method is based on the oxidation of cefixime with Fe(III) in acidic medium. The Fe(II) formed reacts with 1,10-phenanthroline, and the ferroin complex is measured spectrophotometrically at 510 nm against a reagent blank. Beer's law was obeyed in the concentration range 0.2-10 μg/ml with a good correlation coefficient of 0.993. The molar absorptivity was found to be 1.375×10⁵ L/(mol·cm). The limit of detection (LOD) and limit of quantification (LOQ) were found to be 0.030 and 0.101 μg/ml, respectively. The proposed method is reproducible, with a relative standard deviation of 5.28% (n = 6). The developed method was validated statistically by performing a recovery study and was successfully applied to the determination of cefixime in bulk powder and pharmaceutical formulations without interference from common excipients. Percent recoveries ranged from 98.00 to 102.05% for the pure form and from 97.83 to 102.50% for pharmaceutical preparations.
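
Given the reported molar absorptivity, Beer's law (A = εlc) links any measured absorbance back to a concentration. The sketch assumes a 1 cm path length and uses an approximate molar mass for cefixime; both are assumptions, not values from the abstract.

```python
# Beer-Lambert back-calculation using the reported molar absorptivity.
EPS = 1.375e5   # L/(mol*cm), ferroin complex at 510 nm (from the abstract)
PATH = 1.0      # cm cuvette path length (assumed)
MW = 453.4      # g/mol, approximate molar mass of cefixime (assumed)

def conc_ug_per_ml(absorbance):
    mol_per_l = absorbance / (EPS * PATH)   # A = eps * l * c  ->  c = A/(eps*l)
    return mol_per_l * MW * 1e6 / 1e3       # mol/L -> ug/mL

# An absorbance of ~0.3 lands in the low-microgram/mL part of the linear range:
c = conc_ug_per_ml(0.3)
print(f"{c:.2f} ug/mL")
```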

  7. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    PubMed

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms. PMID:26643074

  8. A Simple, Quantitative Method Using Alginate Gel to Determine Rat Colonic Tumor Volume In Vivo

    PubMed Central

    Irving, Amy A; Young, Lindsay B; Pleiman, Jennifer K; Konrath, Michael J; Marzella, Blake; Nonte, Michael; Cacciatore, Justin; Ford, Madeline R; Clipson, Linda; Amos-Landgraf, James M; Dove, William F

    2014-01-01

    Many studies of the response of colonic tumors to therapeutics use tumor multiplicity as the endpoint to determine the effectiveness of the agent. These studies can be greatly enhanced by accurate measurements of tumor volume. Here we present a quantitative method to easily and accurately determine colonic tumor volume. This approach uses a biocompatible alginate to create a negative mold of a tumor-bearing colon; this mold is then used to make positive casts of dental stone that replicate the shape of each original tumor. The weight of the dental stone cast correlates highly with the weight of the dissected tumors. After refinement of the technique, overall error in tumor volume was 16.9% ± 7.9% and includes error from both the alginate and dental stone procedures. Because this technique is limited to molding of tumors in the colon, we utilized the ApcPirc/+ rat, which has a propensity for developing colonic tumors that reflect the location of the majority of human intestinal tumors. We have successfully used the described method to determine tumor volumes ranging from 4 to 196 mm3. Alginate molding combined with dental stone casting is a facile method for determining tumor volume in vivo without costly equipment or knowledge of analytic software. This broadly accessible method creates the opportunity to objectively study colonic tumors over time in living animals in conjunction with other experiments and without transferring animals from the facility where they are maintained. PMID:24674588

  9. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGESBeta

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

Here, the accumulation of bacteria in surface-attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early-stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early-stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early-stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad-spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  10. Quantitative methods for reconstructing tissue biomechanical properties in optical coherence elastography: a comparison study

    PubMed Central

    Han, Zhaolong; Li, Jiasong; Singh, Manmohan; Wu, Chen; Liu, Chih-hao; Wang, Shang; Idugboe, Rita; Raghunathan, Raksha; Sudheendran, Narendran; Aglyamov, Salavat R.; Twa, Michael D.; Larin, Kirill V.

    2015-01-01

We present a systematic analysis of the accuracy of five different methods for extracting the biomechanical properties of soft samples using optical coherence elastography (OCE). OCE is an emerging noninvasive technique that allows assessing the biomechanical properties of tissues with micrometer spatial resolution. However, in order to accurately extract biomechanical properties from OCE measurements, a proper mechanical model must be applied. In this study, we utilize tissue-mimicking phantoms with controlled elastic properties and investigate the feasibility of four available methods for reconstructing elasticity (Young's modulus) based on OCE measurements of an air-pulse induced elastic wave. The approaches are based on the shear wave equation (SWE), the surface wave equation (SuWE), the Rayleigh-Lamb frequency equation (RLFE), and the finite element method (FEM). Elasticity values were compared with uniaxial mechanical testing. The results show that the RLFE and the FEM are more robust in quantitatively assessing elasticity than the other simplified models. This study provides a foundation and reference for reconstructing the biomechanical properties of tissues from OCE data, which is important for the further development of noninvasive elastography methods. PMID:25860076
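
The two simplest of the compared models reduce to closed-form relations between wave speed and Young's modulus: the shear wave equation E = 2ρ(1+ν)c_s², and the surface-wave (Rayleigh) approximation relating c_R to c_s. The density, Poisson ratio, and wave speeds below are illustrative phantom-like values, not the study's measurements; the RLFE and FEM approaches are too involved to sketch here.

```python
# Closed-form elasticity estimators used in OCE-style analyses (illustrative).
RHO = 1000.0  # kg/m^3, tissue-mimicking phantom density (assumed)
NU = 0.49     # near-incompressible Poisson ratio (assumed)

def young_from_shear_speed(c_s):
    """SWE: E = 2*rho*(1+nu)*c_s^2, with c_s in m/s; returns Pa."""
    return 2 * RHO * (1 + NU) * c_s ** 2

def young_from_surface_speed(c_r):
    """SuWE: invert the Rayleigh approximation c_r = c_s*(0.87+1.12nu)/(1+nu)."""
    c_s = c_r * (1 + NU) / (0.87 + 1.12 * NU)
    return young_from_shear_speed(c_s)

E1 = young_from_shear_speed(2.0)     # a ~2 m/s shear wave -> ~12 kPa
E2 = young_from_surface_speed(1.9)
print(f"E(SWE) = {E1/1e3:.1f} kPa, E(SuWE) = {E2/1e3:.1f} kPa")
```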

  11. Quantitative test method for evaluation of anti-fingerprint property of coated surfaces

    NASA Astrophysics Data System (ADS)

    Wu, Linda Y. L.; Ngian, S. K.; Chen, Z.; Xuan, D. T. T.

    2011-01-01

An artificial fingerprint liquid was formulated from artificial sweat, hydroxyl-terminated polydimethylsiloxane and a solvent for direct determination of the anti-fingerprint property of a coated surface. A range of smooth and rough surfaces with different anti-fingerprint (AF) properties were fabricated by sol-gel technology, on which AF liquid contact angles, artificial fingerprints and real human fingerprints (HF) were verified and correlated. It is shown that a surface with an AF liquid contact angle above 87° is fingerprint-free. This provides an objective and quantitative test method to determine the anti-fingerprint property of coated surfaces. It is also concluded that the AF property can be achieved on smooth and optically clear surfaces. Deep porous structures are more favorable than bumpy structures for oleophobic and AF properties.

  12. A quantitative and qualitative method to control chemotherapeutic preparations by Fourier transform infrared-ultraviolet spectrophotometry.

    PubMed

    Dziopa, Florian; Galy, Guillaume; Bauler, Stephanie; Vincent, Benoit; Crochon, Sarah; Tall, Mamadou Lamine; Pirot, Fabrice; Pivot, Christine

    2013-06-01

The preparation of chemotherapy products in hospitals includes a reconstitution step in which manufactured drugs are adapted to each patient's dosage. The administration of these highly iatrogenic drugs raises the question of patient safety and treatment efficiency. In order to reduce administration errors due to faulty preparations, we introduced a new qualitative and quantitative routine control based on Fourier Transform Infrared (FTIR) and UV-Visible spectrophotometry. This automated method enabled fast and specific control of 14 anticancer drugs. A 1.2 mL sample was used to assay and identify each preparation in less than 90 sec. Over a two-year period, 9370 controlled infusion bags showed a 1.49% nonconformity rate, based on a 15% tolerance from the theoretical concentration and a 96% minimum identification matching factor. This study evaluated the reliability of the control process, as well as its accordance with chemotherapy deliverance requirements. Thus, corrective measures were defined to improve the control process. PMID:23014899

  13. A method to optimize selection on multiple identified quantitative trait loci

    PubMed Central

    Chakraborty, Reena; Moreau, Laurence; Dekkers, Jack CM

    2002-01-01

    A mathematical approach was developed to model and optimize selection on multiple known quantitative trait loci (QTL) and polygenic estimated breeding values in order to maximize a weighted sum of responses to selection over multiple generations. The model allows for linkage between QTL with multiple alleles and arbitrary genetic effects, including dominance, epistasis, and gametic imprinting. Gametic phase disequilibrium between the QTL and between the QTL and polygenes is modeled but polygenic variance is assumed constant. Breeding programs with discrete generations, differential selection of males and females and random mating of selected parents are modeled. Polygenic EBV obtained from best linear unbiased prediction models can be accommodated. The problem was formulated as a multiple-stage optimal control problem and an iterative approach was developed for its solution. The method can be used to develop and evaluate optimal strategies for selection on multiple QTL for a wide range of situations and genetic models. PMID:12081805

  14. Methods for quantitative evaluation of dynamics of repair proteins within irradiated cells

    NASA Astrophysics Data System (ADS)

    Hable, V.; Dollinger, G.; Greubel, C.; Hauptner, A.; Krücken, R.; Dietzel, S.; Cremer, T.; Drexler, G. A.; Friedl, A. A.; Löwe, R.

    2006-04-01

Living HeLa cells are irradiated in a precisely targeted manner with single 100 MeV oxygen ions at the superconducting ion microprobe SNAKE, the Superconducting Nanoscope for Applied Nuclear (=Kern-) Physics Experiments, at the Munich 14 MV tandem accelerator. Various proteins that are directly or indirectly involved in repair processes accumulate as clusters (so-called foci) at DNA double-strand breaks (DSBs) induced by the ions. The spatiotemporal dynamics of the foci formed by the phosphorylated histone γ-H2AX are studied. For this purpose, cells are irradiated in line patterns. The γ-H2AX is made visible under the fluorescence microscope using immunofluorescence techniques. Quantitative analysis methods are developed to evaluate the microscopic image data in order to analyze the movement of the foci and their changing size.

  15. Quantitative radiochemical method for determination of major sources of natural radioactivity in ores and minerals

    USGS Publications Warehouse

    Rosholt, J.N., Jr.

    1954-01-01

When an ore sample contains radioactivity other than that attributable to the uranium series in equilibrium, a quantitative analysis of the other emitters must be made in order to determine the source of this activity. Thorium-232, radon-222, and lead-210 have been determined by isolation and subsequent activity analysis of some of their short-lived daughter products. The sulfides of bismuth and polonium are precipitated out of solutions of thorium or uranium ores, and the α-particle activity of polonium-214, polonium-212, and polonium-210 is determined by scintillation-counting techniques. Polonium-214 activity is used to determine radon-222, polonium-212 activity for thorium-232, and polonium-210 for lead-210. The development of these methods of radiochemical analysis will facilitate the rapid determination of some of the major sources of natural radioactivity.

  16. Rapid and Inexpensive Screening of Genomic Copy Number Variations Using a Novel Quantitative Fluorescent PCR Method

    PubMed Central

    Han, Joan C.; Elsea, Sarah H.; Pena, Heloísa B.; Pena, Sérgio Danilo Junho

    2013-01-01

Detection of human microdeletion and microduplication syndromes poses a significant burden on public healthcare systems in developing countries. With genome-wide diagnostic assays frequently inaccessible, targeted low-cost PCR-based approaches are preferred. However, their reproducibility depends on equally efficient amplification using a number of target and control primers. To address this, the recently described technique called Microdeletion/Microduplication Quantitative Fluorescent PCR (MQF-PCR) was shown to reliably detect four human syndromes by quantifying DNA amplification in an internally controlled PCR reaction. Here, we confirm its utility in the detection of eight human microdeletion syndromes, including the more common WAGR, Smith-Magenis, and Potocki-Lupski syndromes, with 100% sensitivity and 100% specificity. We present the selection, design, and performance evaluation of detection primers using a variety of approaches. We conclude that MQF-PCR is an easily adaptable method for the detection of human pathological chromosomal aberrations. PMID:24288428

  17. Application of bias correction methods to improve the accuracy of quantitative radar rainfall in Korea

    NASA Astrophysics Data System (ADS)

    Lee, J.-K.; Kim, J.-H.; Suk, M.-K.

    2015-11-01

There are many potential sources of bias in the radar rainfall estimation process. This study classified the biases into the reflectivity measurement bias and the rainfall estimation bias of the Quantitative Precipitation Estimation (QPE) model, and applied bias correction methods to improve the accuracy of the Radar-AWS Rainrate (RAR) calculation system operated by the Korea Meteorological Administration (KMA). To correct the reflectivity biases introduced when measuring rainfall, this study used a Z bias correction algorithm in which the reflectivity of the target single-pol radars is corrected against a reference dual-pol radar that has itself been corrected for hardware and software biases. The study then applied two post-processing methods, the Mean Field Bias Correction (MFBC) method and the Local Gauge Correction (LGC) method, to correct the rainfall estimation bias of the QPE model. Both corrections were applied to the RAR system, and its accuracy improved after the Z bias correction. By rainfall type, the accuracy for the Changma front and local torrential cases improved slightly, whereas without the Z bias correction the accuracy for the typhoon cases in particular was worse than the existing results. For the rainfall estimation bias correction, Z bias_LGC was superior to the MFBC method because the LGC method applies a different rainfall bias to each grid rainfall amount. Across rainfall types, Z bias_LGC produced more accurate rainfall estimates than the Z bias correction alone, and the typhoon cases in particular were vastly improved.
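
Of the two post-processing methods, MFBC is the simpler: a single multiplicative factor, the ratio of summed gauge rainfall to summed collocated radar rainfall, is applied to the whole field (LGC instead derives a local factor per grid cell). A minimal sketch with synthetic gauge/radar pairs:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy gauge/radar pairs: radar systematically underestimates by ~30%.
gauge = rng.gamma(2.0, 5.0, size=50)             # mm, rain at gauge sites
radar = 0.7 * gauge * rng.lognormal(0, 0.1, 50)  # collocated radar estimates

# Mean Field Bias correction: one factor for the whole field, the ratio of
# summed gauge to summed radar rainfall at collocated points.
mfb = gauge.sum() / radar.sum()
radar_corrected = radar * mfb

print(f"MFB factor = {mfb:.2f}")  # roughly 1/0.7
```

After correction, the field-total radar rainfall matches the field-total gauge rainfall by construction; LGC would instead interpolate gauge/radar ratios to each grid cell.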

  18. A new quantitative PCR method for the detection of Anaplasma platys in dogs based on the citrate synthase gene.

    PubMed

    da Silva, Claudia B; Pires, Marcus S; Vilela, Joice A R; Peckle, Maristela; da Costa, Renata L; Vitari, Gabriela L V; Santos, Leandro A; Santos, Huarrisson A; Massard, Carlos L

    2016-09-01

Anaplasma platys is an obligate intracellular bacterium that primarily affects dogs, but it can also infect humans. Our study aimed to standardize a quantitative real-time (q)PCR method using the citrate synthase gene (gltA) as a specific target for A. platys detection in naturally infected dogs. Primers (gltA84F and gltA84R) and a probe (PLATYSp) were designed to amplify an 84-bp fragment based on the gltA gene sequences of A. platys available in GenBank. A total of 186 dog blood samples originating from the Brazilian state of Rio de Janeiro were tested by qPCR. Additionally, the same samples were tested by cytology and a nested (n)PCR that targeted the 16S ribosomal DNA to determine the performance of our qPCR method compared to these existing techniques. Among the samples tested with qPCR, 17.2% were considered positive, significantly more than detected by nPCR (14.0%). Under optical microscopy, inclusions were observed in platelets of 25.3% of the samples, and among these samples, only 33.9% were identified as positive for A. platys using qPCR. The qPCR technique proved to be more specific than cytology and to have superior sensitivity to nPCR for detecting A. platys in dogs. The development of this new qPCR method contributes to the advancement of research involving A. platys. Furthermore, it can be used to quantify the presence of this bacterium to evaluate the treatment of infected animals, or even as a more sensitive and specific tool for situations indicating possible clinical disease but with negative cytology. PMID:27423737

  19. [Application of calibration curve method and partial least squares regression analysis to quantitative analysis of nephrite samples using XRF].

    PubMed

    Liu, Song; Su, Bo-min; Li, Qing-hui; Gan, Fu-xi

    2015-01-01

The authors sought a method for quantitative analysis using pXRF without solid bulk stone/jade reference samples. 24 nephrite samples were selected; 17 were calibration samples and the other 7 were test samples. All the nephrite samples were analyzed quantitatively by proton-induced X-ray emission spectroscopy (PIXE). Based on the PIXE results of the calibration samples, calibration curves were created for the components/elements of interest and used to analyze the test samples quantitatively; then, qualitative spectra of all nephrite samples were obtained by pXRF. According to the PIXE results and the qualitative spectra of the calibration samples, the partial least squares (PLS) method was used for quantitative analysis of the test samples. Finally, the results for the test samples obtained by the calibration curve method, the PLS method, and PIXE were compared to each other, and the accuracy of the calibration curve method and the PLS method was estimated. The results indicate that the PLS method is a suitable alternative for quantitative analysis of stone/jade samples. PMID:25993858
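
The two quantitative approaches compared in this record can be miniaturized: a univariate calibration curve built on one peak channel versus a multivariate PLS model using the whole spectrum. The data below are synthetic and the PLS1 implementation is a minimal NIPALS sketch; none of the numbers are the study's (which calibrated pXRF spectra against PIXE results).

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy XRF-like data: 17 calibration "spectra" (30 channels) whose intensities
# depend linearly on one element's concentration, plus noise.
c_cal = rng.uniform(0, 10, 17)
signature = rng.random(30)
X_cal = np.outer(c_cal, signature) + 0.05 * rng.standard_normal((17, 30))
c_test = np.array([2.5, 7.0])
X_test = np.outer(c_test, signature) + 0.05 * rng.standard_normal((2, 30))

# (a) Calibration curve method: regress concentration on one peak channel.
peak = signature.argmax()
slope, intercept = np.polyfit(X_cal[:, peak], c_cal, 1)
pred_curve = slope * X_test[:, peak] + intercept

# (b) Minimal PLS1 (NIPALS) using the whole spectrum.
def pls1_fit_predict(X, y, Xnew, ncomp=2):
    Xm, ym = X.mean(0), y.mean()
    X, y, Xnew = X - Xm, y - ym, Xnew - Xm
    yhat = np.full(len(Xnew), ym)
    for _ in range(ncomp):
        w = X.T @ y
        w /= np.linalg.norm(w)      # weight vector
        t = X @ w                   # scores
        p = X.T @ t / (t @ t)       # X loadings
        q = y @ t / (t @ t)         # y loading
        tnew = Xnew @ w
        yhat += q * tnew
        X = X - np.outer(t, p)      # deflate
        y = y - q * t
        Xnew = Xnew - np.outer(tnew, p)
    return yhat

pred_pls = pls1_fit_predict(X_cal, c_cal, X_test)
print(pred_curve, pred_pls)  # both should land near [2.5, 7.0]
```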

  20. Evaluation of the remineralization capacity of CPP-ACP containing fluoride varnish by different quantitative methods

    PubMed Central

    SAVAS, Selcuk; KAVRÌK, Fevzi; KUCUKYÌLMAZ, Ebru

    2016-01-01

ABSTRACT Objective The aim of this study was to evaluate the efficacy of CPP-ACP containing fluoride varnish for remineralizing white spot lesions (WSLs) with four different quantitative methods. Material and Methods Four windows (3x3 mm) were created on the enamel surfaces of bovine incisor teeth. A control window was covered with nail varnish, and WSLs were created on the other windows (after demineralization, first week and fourth week) in an acidified gel system. The test material (MI Varnish) was applied on the demineralized areas, and the treated enamel samples were stored in artificial saliva. At the fourth week, the enamel surfaces were tested by surface microhardness (SMH), quantitative light-induced fluorescence-digital (QLF-D), energy-dispersive spectroscopy (EDS) and laser fluorescence (LF pen). The data were statistically analyzed (α=0.05). Results While the LF pen measurements showed significant differences at baseline, after demineralization, and after the one-week remineralization period (p<0.05), the difference between the 1- and 4-week measurements was not significant (p>0.05). With regard to the SMH and QLF-D analyses, statistically significant differences were found among all the phases (p<0.05). After the 1- and 4-week treatment periods, the calcium (Ca) and phosphate (P) concentrations and the Ca/P ratio were higher compared to those of the demineralized surfaces (p<0.05). Conclusion CPP-ACP containing fluoride varnish provides remineralization of WSLs after a single application and seems suitable for clinical use. PMID:27383699

  1. Development of an Analytical Method for Quantitative Determination of Atmospheric Particles By Laap-TOF Instrument

    NASA Astrophysics Data System (ADS)

    Gemayel, R.; Temime-Roussel, B.; Hellebust, S.; Gligorovski, S.; Wortham, H.

    2014-12-01

    A comprehensive understanding of the chemical composition of atmospheric particles is of paramount importance for understanding their impact on health and climate. Hence, there is an imperative need for appropriate analytical methods for on-line, time-resolved measurements of atmospheric particles. Laser Ablation Aerosol Particle Time-of-Flight Mass Spectrometry (LAAP-TOF-MS) allows real-time qualitative analysis of nanoparticles of differing composition and size. LAAP-TOF-MS is aimed at on-line, continuous measurements of atmospheric particles with a time resolution on the order of milliseconds. The system uses a 193 nm excimer laser for particle ablation/ionization and a 403 nm scattering laser for sizing (and single-particle detection/triggering). The charged ions are then extracted into a bipolar time-of-flight mass spectrometer. Here we present an analytical methodology for quantitative determination of the composition and size distribution of particles by the LAAP-TOF instrument. We developed and validated the methodology for this high-time-resolution instrument by comparison with conventional analysis systems with lower time resolution (electron microscopy, optical counters…), with the final aim of rendering the methodology quantitative. This was performed with the aid of other instruments for on-line and off-line measurement, such as a Scanning Mobility Particle Sizer and electron microscopy. Validation of the analytical method was performed under laboratory conditions by detection and identification of the main targeted particle types, such as SiO2, CeO2, and TiO2.

  2. Advanced 3D inverse method for designing turbomachine blades

    SciTech Connect

    Dang, T.

    1995-10-01

    To meet the goal of 60% plant-cycle efficiency or better set in the ATS Program for baseload utility-scale power generation, several critical technologies need to be developed. One such need is the improvement of component efficiencies. This work addresses the issue of improving the performance of turbomachine components in gas turbines through the development of an advanced three-dimensional and viscous blade design system. This technology is needed to replace some elements in current design systems that are based on outdated technology.

  3. Advanced 3D inverse method for designing turbomachine blades

    SciTech Connect

    Dang, T.

    1995-12-31

    To meet the goal of 60% plant-cycle efficiency or better set in the ATS Program for baseload utility-scale power generation, several critical technologies need to be developed. One such need is the improvement of component efficiencies. This work addresses the issue of improving the performance of turbomachine components in gas turbines through the development of an advanced three-dimensional and viscous blade design system. This technology is needed to replace some elements in current design systems that are based on outdated technology.

  4. A Perspective on Implementing a Quantitative Systems Pharmacology Platform for Drug Discovery and the Advancement of Personalized Medicine

    PubMed Central

    Stern, Andrew M.; Schurdak, Mark E.; Bahar, Ivet; Berg, Jeremy M.; Taylor, D. Lansing

    2016-01-01

    Drug candidates exhibiting well-defined pharmacokinetic and pharmacodynamic profiles that are otherwise safe often fail to demonstrate proof-of-concept in phase II and III trials. Innovation in drug discovery and development has been identified as a critical need for improving the efficiency of drug discovery, especially through collaborations between academia, government agencies, and industry. To address the innovation challenge, we describe a comprehensive, unbiased, integrated, and iterative quantitative systems pharmacology (QSP)–driven drug discovery and development strategy and platform that we have implemented at the University of Pittsburgh Drug Discovery Institute. Intrinsic to QSP is its integrated use of multiscale experimental and computational methods to identify mechanisms of disease progression and to test predicted therapeutic strategies likely to achieve clinical validation for appropriate subpopulations of patients. The QSP platform can address biological heterogeneity and anticipate the evolution of resistance mechanisms, which are major challenges for drug development. The implementation of this platform is dedicated to gaining an understanding of mechanism(s) of disease progression to enable the identification of novel therapeutic strategies as well as repurposing drugs. The QSP platform will help promote the paradigm shift from reactive population-based medicine to proactive personalized medicine by focusing on the patient as the starting and the end point. PMID:26962875

  5. A Perspective on Implementing a Quantitative Systems Pharmacology Platform for Drug Discovery and the Advancement of Personalized Medicine.

    PubMed

    Stern, Andrew M; Schurdak, Mark E; Bahar, Ivet; Berg, Jeremy M; Taylor, D Lansing

    2016-07-01

    Drug candidates exhibiting well-defined pharmacokinetic and pharmacodynamic profiles that are otherwise safe often fail to demonstrate proof-of-concept in phase II and III trials. Innovation in drug discovery and development has been identified as a critical need for improving the efficiency of drug discovery, especially through collaborations between academia, government agencies, and industry. To address the innovation challenge, we describe a comprehensive, unbiased, integrated, and iterative quantitative systems pharmacology (QSP)-driven drug discovery and development strategy and platform that we have implemented at the University of Pittsburgh Drug Discovery Institute. Intrinsic to QSP is its integrated use of multiscale experimental and computational methods to identify mechanisms of disease progression and to test predicted therapeutic strategies likely to achieve clinical validation for appropriate subpopulations of patients. The QSP platform can address biological heterogeneity and anticipate the evolution of resistance mechanisms, which are major challenges for drug development. The implementation of this platform is dedicated to gaining an understanding of mechanism(s) of disease progression to enable the identification of novel therapeutic strategies as well as repurposing drugs. The QSP platform will help promote the paradigm shift from reactive population-based medicine to proactive personalized medicine by focusing on the patient as the starting and the end point. PMID:26962875

  6. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    SciTech Connect

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired.

  7. Comparison of Concentration Methods for Quantitative Detection of Sewage-Associated Viral Markers in Environmental Waters

    PubMed Central

    Harwood, V. J.; Gyawali, P.; Sidhu, J. P. S.; Toze, S.

    2015-01-01

    Pathogenic human viruses cause over half of gastroenteritis cases associated with recreational water use worldwide. They are relatively difficult to concentrate from environmental waters due to typically low concentrations and their small size. Although rapid enumeration of viruses by quantitative PCR (qPCR) has the potential to greatly improve water quality analysis and risk assessment, the upstream steps of capturing and recovering viruses from environmental water sources, along with removing PCR inhibitors from extracted nucleic acids, remain formidable barriers to routine use. Here, we compared the efficiency of virus recovery for three rapid methods of concentrating two microbial source tracking (MST) viral markers, human adenoviruses (HAdVs) and human polyomaviruses (HPyVs), from one-liter tap water and river water samples on HA membranes (90 mm in diameter). Samples were spiked with raw sewage, and viral adsorption to membranes was promoted by acidification (method A) or addition of MgCl2 (methods B and C). Viral nucleic acid was extracted directly from membranes (method A), or viruses were eluted with NaOH and concentrated by centrifugal ultrafiltration (methods B and C). No inhibition of qPCR was observed for samples processed by method A, but inhibition occurred in river samples processed by methods B and C. Recovery efficiencies of HAdVs and HPyVs were ∼10-fold greater for method A (31 to 78%) than for methods B and C (2.4 to 12%). Further analysis of membranes from method B revealed that the majority of viruses were not eluted from the membrane, resulting in poor recovery. Modification of the originally published method A to include a larger-diameter membrane and a nucleic acid extraction kit that could accommodate the membrane resulted in a rapid virus concentration method with good recovery and a lack of inhibitory compounds. The frequently used strategy of viral adsorption with added cations (Mg2+) and elution with acid was inefficient and more prone to
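
    Marker concentrations in qPCR studies like this one are read off a standard curve, and recovery efficiency is the recovered fraction of the spike. A minimal sketch; the slope, intercept, and copy numbers below are hypothetical (a slope of -3.32 corresponds to 100% amplification efficiency):

```python
# qPCR standard curve: Cq = SLOPE * log10(copies) + INTERCEPT (hypothetical values).
SLOPE = -3.32
INTERCEPT = 38.0

def copies_from_cq(cq):
    """Invert the standard curve to estimate marker copies per reaction."""
    return 10 ** ((cq - INTERCEPT) / SLOPE)

def recovery_percent(recovered, spiked):
    """Recovery efficiency used to compare concentration methods."""
    return 100.0 * recovered / spiked

c = copies_from_cq(24.72)                    # about 1e4 copies
print(round(c))                              # -> 10000
print(round(recovery_percent(c, 1.0e5), 1))  # -> 10.0 (% of the spiked copies)
```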

  8. Advances in Probes and Methods for Clinical EPR Oximetry

    PubMed Central

    Hou, Huagang; Khan, Nadeem; Jarvis, Lesley A.; Chen, Eunice Y.; Williams, Benjamin B.; Kuppusamy, Periannan

    2015-01-01

    EPR oximetry, which enables reliable, accurate, and repeated measurements of the partial pressure of oxygen in tissues, provides a unique opportunity to investigate the role of oxygen in the pathogenesis and treatment of several diseases including cancer, stroke, and heart failure. Building on significant advances in the in vivo application of EPR oximetry for small animal models of disease, we are developing suitable probes and instrumentation required for use in human subjects. Our laboratory has established the feasibility of clinical EPR oximetry in cancer patients using India ink, the only material presently approved for clinical use. We now are developing the next generation of probes, which are both superior in terms of oxygen sensitivity and biocompatibility including an excellent safety profile for use in humans. Further advances include the development of implantable oxygen sensors linked to an external coupling loop for measurements of deep-tissue oxygenations at any depth, overcoming the current limitation of 10 mm. This paper presents an overview of recent developments in our ability to make meaningful measurements of oxygen partial pressures in human subjects under clinical settings. PMID:24729217

  9. Quantitative Analysis of Single and Mix Food Antiseptics Basing on SERS Spectra with PLSR Method

    NASA Astrophysics Data System (ADS)

    Hou, Mengjing; Huang, Yu; Ma, Lingwei; Zhang, Zhengjun

    2016-06-01

    The usage and dosage of food antiseptics are of great concern owing to their decisive influence on food safety. The surface-enhanced Raman scattering (SERS) effect was employed in this research to realize trace detection of potassium sorbate (PS) and sodium benzoate (SB). An HfO2 ultrathin-film-coated Ag nanorod (NR) array was fabricated as the SERS substrate. Protected by the HfO2 film, the substrate possesses good acid resistance, which makes it applicable in the acidic environments where PS and SB work. The regression relationship between the SERS spectra of 0.3–10 mg/L PS solutions and their concentrations was calibrated by the partial least squares regression (PLSR) method, and the concentration prediction performance was quite satisfactory. Furthermore, mixed solutions of PS and SB were also quantitatively analyzed by the PLSR method. Spectral data from the characteristic peak sections corresponding to PS and SB were used to establish regression models for the two solutes, and their concentrations were determined accurately despite the overlap of their characteristic peak sections. It is likely that the modeling process of the PLSR method prevented the overlapped Raman signal from reducing model accuracy.
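
    The PLSR calibration described here can be sketched with a minimal single-response PLS (NIPALS-style) implementation. The "spectra" below are synthetic stand-ins for SERS data, not the paper's measurements:

```python
import numpy as np

def pls1(X, y, n_comp):
    """Minimal PLS1 (NIPALS) regression; returns a prediction function."""
    xmean, ymean = X.mean(axis=0), y.mean()
    Xc, yc = X - xmean, y - ymean
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w = w / np.linalg.norm(w)          # weight vector
        t = Xc @ w                         # scores
        tt = t @ t
        p = Xc.T @ t / tt                  # X loadings
        q = (yc @ t) / tt                  # y loading
        Xc = Xc - np.outer(t, p)           # deflate X and y
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.column_stack(W), np.column_stack(P), np.array(Q)
    beta = W @ np.linalg.solve(P.T @ W, Q)  # regression vector
    return lambda Xnew: (Xnew - xmean) @ beta + ymean

# Synthetic "spectra": intensity at 5 bands proportional to concentration.
rng = np.random.default_rng(0)
conc = rng.uniform(0.3, 10.0, 20)                    # mg/L
bands = np.array([1.0, 0.5, 0.2, 0.05, 0.01])
X = np.outer(conc, bands) + 0.001 * rng.normal(size=(20, 5))

predict = pls1(X, conc, n_comp=2)
unknown = np.outer([5.0], bands)                     # spectrum of a 5 mg/L sample
print(round(float(predict(unknown)[0]), 2))          # -> 5.0
```

For real spectra one would pick `n_comp` by cross-validation rather than fixing it.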

  10. Quantitative comparison of reconstruction methods for intra-voxel fiber recovery from diffusion MRI.

    PubMed

    Daducci, Alessandro; Canales-Rodríguez, Erick Jorge; Descoteaux, Maxime; Garyfallidis, Eleftherios; Gur, Yaniv; Lin, Ying-Chia; Mani, Merry; Merlet, Sylvain; Paquette, Michael; Ramirez-Manzanares, Alonso; Reisert, Marco; Reis Rodrigues, Paulo; Sepehrband, Farshid; Caruyer, Emmanuel; Choupan, Jeiran; Deriche, Rachid; Jacob, Mathews; Menegaz, Gloria; Prčkovska, Vesna; Rivera, Mariano; Wiaux, Yves; Thiran, Jean-Philippe

    2014-02-01

    Validation is arguably the bottleneck in the diffusion magnetic resonance imaging (MRI) community. This paper evaluates and compares 20 algorithms for recovering the local intra-voxel fiber structure from diffusion MRI data and is based on the results of the "HARDI reconstruction challenge" organized in the context of the "ISBI 2012" conference. Evaluated methods encompass a mixture of classical techniques well known in the literature such as diffusion tensor, Q-Ball and diffusion spectrum imaging, algorithms inspired by the recent theory of compressed sensing and also brand new approaches proposed for the first time at this contest. To quantitatively compare the methods under controlled conditions, two datasets with known ground-truth were synthetically generated and two main criteria were used to evaluate the quality of the reconstructions in every voxel: correct assessment of the number of fiber populations and angular accuracy in their orientation. This comparative study investigates the behavior of every algorithm with varying experimental conditions and highlights strengths and weaknesses of each approach. This information can be useful not only for enhancing current algorithms and developing the next generation of reconstruction methods, but also for assisting physicians in choosing the most adequate technique for their studies. PMID:24132007

  11. Quantitatively estimating defects in graphene devices using discharge current analysis method

    PubMed Central

    Jung, Ukjin; Lee, Young Gon; Kang, Chang Goo; Lee, Sangchul; Kim, Jin Ju; Hwang, Hyeon June; Lim, Sung Kwan; Ham, Moon-Ho; Lee, Byoung Hun

    2014-01-01

    Defects of graphene are the most important concern for the successful applications of graphene since they affect device performance significantly. However, once the graphene is integrated in the device structures, the quality of graphene and surrounding environment could only be assessed using indirect information such as hysteresis, mobility and drive current. Here we develop a discharge current analysis method to measure the quality of graphene integrated in a field effect transistor structure by analyzing the discharge current and examine its validity using various device structures. The density of charging sites affecting the performance of graphene field effect transistor obtained using the discharge current analysis method was on the order of 1014/cm2, which closely correlates with the intensity ratio of the D to G bands in Raman spectroscopy. The graphene FETs fabricated on poly(ethylene naphthalate) (PEN) are found to have a lower density of charging sites than those on SiO2/Si substrate, mainly due to reduced interfacial interaction between the graphene and the PEN. This method can be an indispensable means to improve the stability of devices using graphene, as it provides an accurate and quantitative way to define the quality of graphene after the device fabrication. PMID:24811431
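
    The core bookkeeping of a discharge-current analysis is integrating the transient current to obtain the released charge and converting it to an areal density of charging sites. A simplified sketch; the current waveform, device area, and the one-electron-per-site assumption are all illustrative, not the authors' procedure:

```python
import numpy as np

E_CHARGE = 1.602e-19     # elementary charge, C
AREA_CM2 = 1.0e-4        # hypothetical gated area, cm^2

t = np.linspace(0.0, 1.0e-3, 1001)         # s, 1 ms discharge transient
current = 8.01e-6 * np.exp(-t / 2.0e-4)    # A, illustrative decaying current
# amplitude chosen so the released charge is ~1.6e-9 C

# Trapezoidal integration of I(t) gives the released charge Q.
q_total = np.sum(0.5 * (current[1:] + current[:-1]) * np.diff(t))
density = q_total / (E_CHARGE * AREA_CM2)  # charging sites per cm^2
print(f"{density:.2e}")                    # on the order of 1e14, as in the paper
```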

  12. Evaluation of a rapid method for the quantitative estimation of coliforms in meat by impedimetric procedures.

    PubMed Central

    Martins, S B; Selby, M J

    1980-01-01

    A 24-h instrumental procedure is described for the quantitative estimation of coliforms in ground meat. The method is simple and rapid, and it requires but a single sample dilution and four replicates. The data are recorded automatically and can be used to estimate coliforms in the range of 100 to 10,000 organisms per g. The procedure is an impedance detection time (IDT) method using a new medium, tested against 131 stock cultures, which markedly enhances the impedance response of gram-negative organisms and is selective for coliforms. Seventy samples of ground beef were analyzed for coliforms by the IDT method and the conventional three-dilution, two-step most-probable-number test tube procedure. Seventy-nine percent of the impedimetric estimates fell within the 95% confidence limits of the most-probable-number values. This corresponds to the criteria used to evaluate other coliform tests, with the added advantage of a single dilution and more rapid results. PMID:6992712

  13. Quantitative determination of aflatoxin B1 concentration in acetonitrile by chemometric methods using terahertz spectroscopy.

    PubMed

    Ge, Hongyi; Jiang, Yuying; Lian, Feiyu; Zhang, Yuan; Xia, Shanhong

    2016-10-15

    Aflatoxins contaminate agricultural products such as grain and thereby potentially cause human liver carcinoma. Detection via conventional methods has proven to be time-consuming and complex. In this paper, the terahertz (THz) spectra of aflatoxin B1 in acetonitrile solutions with concentration ranges of 1–50 μg/ml and 1–50 μg/l are obtained and analyzed for the frequency range of 0.4–1.6 THz. Linear and nonlinear regression models are constructed to relate the absorption spectra and the concentrations of 160 samples using the partial least squares (PLS), principal component regression (PCR), support vector machine (SVM), and PCA-SVM methods. Our results indicate that the PLS and PCR models are more accurate for the concentration range of 1–50 μg/ml, whereas SVM and PCA-SVM are more accurate for the concentration range of 1–50 μg/l. Furthermore, ten samples of unknown concentration extracted from mildewed maize are analyzed quantitatively using these methods. PMID:27173565
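
    Of the chemometric models compared here, principal component regression (PCR) is straightforward to sketch directly from an SVD. The "absorption spectra" below are synthetic stand-ins for the THz data, not measurements from the study:

```python
import numpy as np

def pcr_fit(X, y, n_comp):
    """Principal component regression via SVD; returns a prediction function."""
    xmean, ymean = X.mean(axis=0), y.mean()
    U, s, Vt = np.linalg.svd(X - xmean, full_matrices=False)
    T = U[:, :n_comp] * s[:n_comp]                    # PC scores
    b_scores = T.T @ (y - ymean) / (s[:n_comp] ** 2)  # regress y on scores
    coef = Vt[:n_comp].T @ b_scores                   # back to spectral space
    return lambda Xnew: (Xnew - xmean) @ coef + ymean

# Synthetic "THz absorption spectra": 4 bands proportional to concentration.
rng = np.random.default_rng(3)
conc = rng.uniform(1.0, 50.0, 40)                     # ug/ml
bands = np.array([0.8, 0.4, 0.1, 0.02])
X = np.outer(conc, bands) + 0.01 * rng.normal(size=(40, 4))

predict = pcr_fit(X, conc, n_comp=1)
print(round(float(predict(np.outer([25.0], bands))[0]), 1))  # -> 25.0
```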

  14. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    PubMed

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with results confirmed by GC-MS. The method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness, and repeatability precision were assessed. The spiking levels for the trueness and precision experiments were 0.1, 0.5, and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has proven to be efficient for the extraction and determination of azoxystrobin residues in green beans and peas. PMID:25842334
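
    Trueness and repeatability precision in a validation like this reduce to the mean recovery and relative standard deviation of replicate spiked samples. A minimal sketch with illustrative replicate values, not the study's raw data:

```python
import statistics

# Mean recovery (trueness) and %RSD (repeatability precision) from replicate
# spike-recovery measurements at one spiking level.
def recovery_stats(measured, spiked):
    recoveries = [100.0 * m / spiked for m in measured]
    mean_rec = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(recoveries) / mean_rec
    return mean_rec, rsd

measured = [0.45, 0.47, 0.44, 0.46, 0.43]  # mg/kg found at a 0.5 mg/kg spike
mean_rec, rsd = recovery_stats(measured, 0.5)
print(round(mean_rec, 1), round(rsd, 1))   # -> 90.0 3.5
```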

  15. Quantitative evaluation of linear and nonlinear methods characterizing interdependencies between brain signals

    PubMed Central

    Ansari-Asl, Karim; Senhadji, Lotfi; Bellanger, Jean-Jacques; Wendling, Fabrice

    2006-01-01

    Brain functional connectivity can be characterized by the temporal evolution of correlation between signals recorded from spatially distributed regions. It aims to explain how different brain areas interact within networks involved during normal (as in cognitive tasks) or pathological (as in epilepsy) situations. Numerous techniques have been introduced for assessing this connectivity. Recently, some efforts have been made to compare the performance of these methods, but mainly qualitatively and for particular applications. In this paper, we go further and propose a comprehensive comparison of different classes of methods (linear and nonlinear regression, phase synchronization (PS), and generalized synchronization (GS)) based on various simulation models. For this purpose, quantitative criteria are used: in addition to the mean square error (MSE) under the null hypothesis (independence between two signals) and the mean variance (MV) computed over all values of coupling degree in each model, we introduce a new criterion for comparing performance. Results show that the performance of the compared methods depends strongly on the hypothesized underlying model for the generation of the signals. Moreover, none of the methods outperforms the others in all cases, and the performance hierarchy is model-dependent. PMID:17025676
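
    A representative member of the linear-regression class of measures is the maximum squared Pearson correlation between two signals over a set of time lags. A minimal sketch on synthetic signals; this illustrates the class of measures compared, not the paper's exact estimators:

```python
import numpy as np

# Maximum squared correlation between x and y over lags in [-max_lag, max_lag].
def r2_max(x, y, max_lag):
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag]
        else:
            a, b = x[:len(x) + lag], y[-lag:]
        r = np.corrcoef(a, b)[0, 1]
        best = max(best, r * r)
    return best

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)
y = np.roll(x, 5) + 0.1 * rng.normal(size=n)  # y is a noisy, 5-sample-delayed x

print(round(r2_max(x, y, 10), 2))                   # close to 1: strong coupling
print(round(r2_max(x, rng.normal(size=n), 10), 2))  # near 0 for independent noise
```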

  16. Quantitative Analysis of Single and Mix Food Antiseptics Basing on SERS Spectra with PLSR Method.

    PubMed

    Hou, Mengjing; Huang, Yu; Ma, Lingwei; Zhang, Zhengjun

    2016-12-01

    Usage and dosage of food antiseptics are very concerned due to their decisive influence in food safety. Surface-enhanced Raman scattering (SERS) effect was employed in this research to realize trace potassium sorbate (PS) and sodium benzoate (SB) detection. HfO2 ultrathin film-coated Ag NR array was fabricated as SERS substrate. Protected by HfO2 film, the SERS substrate possesses good acid resistance, which enables it to be applicable in acidic environment where PS and SB work. Regression relationship between SERS spectra of 0.3~10 mg/L PS solution and their concentration was calibrated by partial least squares regression (PLSR) method, and the concentration prediction performance was quite satisfactory. Furthermore, mixture solution of PS and SB was also quantitatively analyzed by PLSR method. Spectrum data of characteristic peak sections corresponding to PS and SB was used to establish the regression models of these two solutes, respectively, and their concentrations were determined accurately despite their characteristic peak sections overlapping. It is possible that the unique modeling process of PLSR method prevented the overlapped Raman signal from reducing the model accuracy. PMID:27299651

  17. Quantifying social norms: by coupling the ecosystem management concept and semi-quantitative sociological methods

    NASA Astrophysics Data System (ADS)

    Zhang, D.; Xu, H.

    2012-12-01

    Over recent decades, human-induced environmental changes have grown steadily and rapidly in intensity and impact, to the point where they now often exceed natural impacts. As an important component of human activities, social norms play a key role in environmental and natural resource management. But the lack of relevant quantitative data about social norms greatly limits our scientific understanding of the complex linkages between humans and nature, and hampers the solution of pressing environmental and social problems. In this study, we built a quantification method by coupling the ecosystem management concept, semi-quantitative sociological methods, and mathematical statistics. We derived the quantified value of a social norm from two parts: whether its content coincides with the concept of ecosystem management (content value) and how it performs once put into implementation (implementation value). First, we separately identified 12 core elements of ecosystem management and 16 indexes of social norms, and then matched them one by one; from their degree of matching we obtained the content value. Second, we selected 8 key factors that represent the performance of social norms after implementation, and obtained the implementation value by the Delphi method. Adding these two values gives the final value of each social norm. Third, we conducted a case study in the Heihe River Basin, the second largest inland river basin in China, by selecting 12 official edicts related to its ecosystem management. By doing so, we first obtained quantified data on social norms that can be applied directly to research involving observational or experimental data collection of natural processes. Second, each value is supported by specific content, so it can assist in creating a clear road map for building or revising management and policy guidelines. For example, in this case study

  18. An advanced deterministic method for spent fuel criticality safety analysis

    SciTech Connect

    DeHart, M.D.

    1998-01-01

    Over the past two decades, criticality safety analysts have come to rely to a large extent on Monte Carlo methods for criticality calculations. Monte Carlo has become popular because of its capability to model complex, non-orthogonal configurations of fissile materials, typical of real-world problems. Over the last few years, however, interest in deterministic transport methods has been revived, due to shortcomings in the stochastic nature of Monte Carlo approaches for certain types of analyses. Specifically, deterministic methods are superior to stochastic methods for calculations requiring accurate neutron density distributions or differential fluxes. Although Monte Carlo methods are well suited for eigenvalue calculations, they lack the localized detail necessary to assess uncertainties and sensitivities important in determining a range of applicability. Monte Carlo methods are also inefficient as a transport solution for multiple-pin depletion methods. Discrete ordinates methods have long been recognized as among the most rigorous and accurate approximations used to solve the transport equation. However, until recently, geometric constraints in finite-differencing schemes made discrete ordinates methods impractical for non-orthogonal configurations such as reactor fuel assemblies. The development of an extended step characteristic (ESC) technique removes the grid-structure limitations of traditional discrete ordinates methods. The NEWT computer code, a discrete ordinates code built upon the ESC formalism, is being developed as part of the SCALE code system. This paper demonstrates the power, versatility, and applicability of NEWT as a state-of-the-art solution for current computational needs.
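
    To make the contrast with Monte Carlo concrete, here is a minimal 1-D slab discrete-ordinates (S2) solver using step-characteristic cell balances and source iteration. This is a textbook-style sketch, far simpler than the 2-D arbitrary-geometry ESC method in NEWT; the cross sections and geometry are illustrative:

```python
import numpy as np

# 1-D slab, isotropic scattering, vacuum boundaries, S2 quadrature.
def solve_slab(nx=50, width=5.0, sig_t=1.0, sig_s=0.3, source=1.0):
    dx = width / nx
    mus = (0.5773503, -0.5773503)   # S2 angles
    wts = (1.0, 1.0)                # weights sum to 2
    phi = np.zeros(nx)              # scalar flux guess
    for _ in range(200):            # source iteration
        q = 0.5 * (sig_s * phi + source)   # isotropic emission density
        phi_new = np.zeros(nx)
        for mu, w in zip(mus, wts):
            cells = range(nx) if mu > 0 else range(nx - 1, -1, -1)
            att = np.exp(-sig_t * dx / abs(mu))   # step-characteristic factor
            psi_in = 0.0                          # vacuum boundary
            for i in cells:
                psi_out = psi_in * att + (q[i] / sig_t) * (1.0 - att)
                # cell balance: |mu|(out - in)/dx + sig_t * psi_avg = q
                psi_avg = (q[i] - abs(mu) * (psi_out - psi_in) / dx) / sig_t
                phi_new[i] += w * psi_avg
                psi_in = psi_out
        if np.max(np.abs(phi_new - phi)) < 1e-8:
            phi = phi_new
            break
        phi = phi_new
    return phi

phi = solve_slab()
# Mid-slab flux stays below the infinite-medium value S/(sig_t - sig_s) ~ 1.429.
print(round(float(phi[25]), 3))
```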

  19. Advanced methods of microscope control using μManager software

    PubMed Central

    Edelstein, Arthur D.; Tsuchida, Mark A.; Amodaj, Nenad; Pinkard, Henry; Vale, Ronald D.; Stuurman, Nico

    2014-01-01

    μManager is an open-source, cross-platform desktop application for controlling a wide variety of motorized microscopes, scientific cameras, stages, illuminators, and other microscope accessories. Since its inception in 2005, μManager has grown to support a wide range of microscopy hardware and is now used by thousands of researchers around the world. The application provides a mature graphical user interface and offers open programming interfaces to facilitate plugins and scripts. Here, we present a guide to using some of the recently added advanced μManager features, including hardware synchronization, simultaneous use of multiple cameras, projection of patterned light onto a specimen, live slide mapping, imaging with multi-well plates, particle localization and tracking, and high-speed imaging. PMID:25606571

  20. Comparison of Advanced Distillation Control Methods, Final Technical Report

    SciTech Connect

    Dr. James B. Riggs

    2000-11-30

    Detailed dynamic simulations of three industrial distillation columns (a propylene/propane splitter, a xylene/toluene column, and a depropanizer) have been used to evaluate configuration selections for single-ended and dual-composition control, as well as to compare conventional and advanced control approaches. In addition, a simulator of a main fractionator was used to compare the control performance of conventional and advanced control. For each case considered, the controllers were tuned by using setpoint changes and tested using feed composition upsets. Proportional Integral (PI) control performance was used to evaluate the configuration selection problem. For single-ended control, the energy balance configuration was found to yield the best performance. For dual-composition control, nine configurations were considered. It was determined that the use of dynamic simulations is required in order to identify the optimum configuration from among the nine possible choices. The optimum configurations were used to evaluate the relative control performance of conventional PI controllers, MPC (Model Predictive Control), PMBC (Process Model-Based Control), and ANN (Artificial Neural Networks) control. It was determined that MPC works best when one product is much more important than the other, while PI was superior when both products were equally important. PMBC and ANN were not found to offer significant advantages over PI and MPC. MPC was found to outperform conventional PI control for the main fractionator. MPC was applied to three industrial columns: one at Phillips Petroleum and two at Union Carbide. In each case, MPC was found to significantly outperform PI controls. The major advantage of the MPC controller is its ability to effectively handle a complex set of constraints and control objectives.
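
    The PI baseline used throughout such comparisons is only a few lines of code. A minimal sketch of a discrete PI loop closed around a first-order process; the gains and process constants are illustrative, not taken from the column simulations:

```python
# Discrete PI controller regulating a first-order process by explicit Euler.
def simulate_pi(kc, ki, setpoint, n_steps, dt=0.1):
    y, integral = 0.0, 0.0       # process output and integral state
    tau, gain = 2.0, 1.5         # process: tau*dy/dt = -y + gain*u
    for _ in range(n_steps):
        error = setpoint - y
        integral += error * dt
        u = kc * error + ki * integral    # PI control law
        y += dt * (-y + gain * u) / tau   # process update
    return y

final = simulate_pi(kc=2.0, ki=1.0, setpoint=1.0, n_steps=500)
print(round(final, 3))  # -> 1.0: integral action removes steady-state offset
```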

  1. A quantitative in vitro method to predict the adhesion lifetime of diamond-like carbon thin films on biomedical implants.

    PubMed

    Falub, Claudiu Valentin; Thorwarth, Götz; Affolter, Christian; Müller, Ulrich; Voisard, Cyril; Hauert, Roland

    2009-10-01

    A quantitative method using Rockwell C indentation was developed to study the adhesion of diamond-like carbon (DLC) protective coatings to the CoCrMo biomedical implant alloy when immersed in phosphate-buffered saline (PBS) solution at 37 °C. Two kinds of coatings with thicknesses ranging from 0.5 up to 16 μm were investigated, namely DLC and DLC/Si-DLC, where Si-DLC denotes a 90 nm thick DLC interlayer containing Si. The time-dependent delamination of the coating around the indentation was quantified by means of optical investigations of the advancing crack front and calculations of the induced stress using the finite element method (FEM). The cause of delamination for both types of coatings was revealed to be stress-corrosion cracking (SCC) of the interface material. For the DLC coating a typical SCC behavior was observed, including a threshold region (60 J m⁻²) and a "stage 1" crack propagation with a crack-growth exponent of 3.0, comparable to that found for ductile metals. The DLC/Si-DLC coating exhibits an SCC process with a crack-growth exponent of 3.3 and a threshold region at 470 J m⁻², indicating an adhesion in PBS at 37 °C that is about eight times better than that of the DLC coating. The SCC curves were fitted to the reaction-controlled model typically used to explain crack propagation in bulk soda-lime glass. As this model falls short of accurately describing all the SCC curves, limitations of its application to the interface between a brittle coating and a ductile substrate are discussed. PMID:19450711
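
    The crack-growth exponent quoted here comes from fitting a power law v = A·Gⁿ to (energy release rate, crack velocity) data, which is a log-log least-squares line. A sketch on synthetic data generated with n = 3.0, not the paper's measurements:

```python
import numpy as np

# Synthetic stage-1 SCC data obeying v = A * G**n with n = 3.0.
G = np.array([100.0, 150.0, 220.0, 330.0, 470.0])  # energy release rate, J/m^2
v = 1.0e-12 * G ** 3.0                             # crack velocity, m/s

# Fit log(v) = n*log(G) + log(A): the slope is the crack-growth exponent.
n_fit, log_a = np.polyfit(np.log(G), np.log(v), 1)
print(round(n_fit, 2))  # -> 3.0
```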

  2. A novel volumetric method for quantitation of titanium dioxide in cosmetics.

    PubMed

    Kim, Young So; Kim, Boo-Min; Park, Sang-Chul; Jeong, Hye-Jin; Chang, Ih Seop

    2006-01-01

    Nowadays many sun-protection cosmetics incorporate organic or inorganic UV filters as active ingredients. Chemically stable inorganic sunscreen agents, usually metal oxides, are widely employed in high-SPF (sun protection factor) products. Titanium dioxide is one of the most frequently used inorganic UV filters and has long been used as a pigment in cosmetics. With the development of micronization techniques, it has become possible to incorporate titanium dioxide in sunscreen formulations without the whitening effect seen previously, and hence its use in cosmetics has become an important research topic. However, very few works relate to the quantitation of titanium dioxide in sunscreen products. In this research, we analyzed the amounts of titanium dioxide in sunscreen cosmetics by adapting redox titration: reduction of Ti(IV) to Ti(III) and reoxidation to Ti(IV). After calcination of the other organic ingredients of the cosmetics, the titanium dioxide is dissolved in hot sulfuric acid. The dissolved Ti(IV) is reduced to Ti(III) by adding metallic aluminum. The reduced Ti(III) is titrated against a standard oxidizing agent, Fe(III) (ammonium iron(III) sulfate), with potassium thiocyanate as an indicator. In order to test the accuracy and applicability of the proposed method, we analyzed the amounts of titanium dioxide in four types of sunscreen cosmetics, namely cream, make-up base, foundation, and powder, after adding known amounts of titanium dioxide (1-25 w/w%). The percentages of titanium dioxide recovered in the four types of formulations were in the range between 96% and 105%. We also analyzed seven commercial cosmetic products labeled with titanium dioxide as an ingredient and compared the results with those obtained from ICP-AES (inductively coupled plasma-atomic emission spectrometry), one of the most powerful atomic analysis techniques. The results showed that the titrated amounts were well in accord with the analyzed values.
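    The titration chemistry is a 1:1 redox reaction, Ti(III) + Fe(III) -> Ti(IV) + Fe(II), so the TiO2 content follows directly from the titrant volume and molarity. A minimal sketch of the back-calculation and recovery check; the function names and the numbers are illustrative, not taken from the paper:

```python
M_TIO2 = 79.87  # g/mol, molar mass of titanium dioxide

def tio2_mass_from_titration(titrant_volume_ml, titrant_molarity):
    """Moles of TiO2 equal moles of Fe(III) consumed at the endpoint
    because the Ti(III)/Fe(III) reaction is 1:1."""
    moles_ti = titrant_volume_ml / 1000.0 * titrant_molarity
    return moles_ti * M_TIO2  # grams

def recovery_percent(measured_mg, spiked_mg):
    """Recovery of a known spike, as reported in the 96-105% range."""
    return 100.0 * measured_mg / spiked_mg

# Hypothetical run: 12.52 mL of 0.1000 M ammonium iron(III) sulfate
mass_g = tio2_mass_from_titration(12.52, 0.1000)
print(round(mass_g * 1000, 1), "mg TiO2")  # -> 100.0 mg TiO2
```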

  3. Adherence to Scientific Method while Advancing Exposure Science

    EPA Science Inventory

    Paul Lioy was simultaneously a staunch adherent to the scientific method and an innovator of new ways to conduct science, particularly related to human exposure. Current challenges to science and the application of the scientific method are presented as they relate to the approaches...

  4. A Novel Method for Relative Quantitation of N-Glycans by Isotopic Labeling Using 18O-Water

    PubMed Central

    Tao, Shujuan; Orlando, Ron

    2014-01-01

    Quantitation is an essential aspect of comprehensive glycomics study. Here, a novel isotopic-labeling method is described for N-glycan quantitation using 18O-water. The incorporation of the 18O-labeling into the reducing end of N-glycans is simply and efficiently achieved during peptide-N4-(N-acetyl-β-glucosaminyl) asparagine amidase F release. This process provides a 2-Da mass difference compared with the N-glycans released in 16O-water. A mathematical calculation method was also developed to determine the 18O/16O ratios from isotopic peaks. Application of this method to several standard glycoprotein mixtures and human serum demonstrated that this method can facilitate the relative quantitation of N-glycans over a linear dynamic range of two orders, with high accuracy and reproducibility. PMID:25365792
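    Because the 18O label shifts the reducing end by only 2 Da, the M+2 isotopologue of the light (16O) form overlaps the monoisotopic peak of the heavy form, so the ratio calculation must subtract that natural-abundance contribution. A minimal sketch of one such correction, assuming a known theoretical M+2 fraction; the paper's actual mathematical treatment may be more elaborate:

```python
def o18_o16_ratio(i_m, i_m2, a2):
    """Estimate the 18O/16O ratio from the monoisotopic peak (i_m) and
    the peak 2 Da higher (i_m2), correcting i_m2 for the light form's
    natural M+2 isotopologue. a2 is the theoretical M+2 intensity of the
    light form as a fraction of its monoisotopic peak (from composition)."""
    heavy = i_m2 - a2 * i_m
    if heavy < 0:  # noise can drive the corrected value slightly negative
        heavy = 0.0
    return heavy / i_m

# Hypothetical peaks for equal labeled/unlabeled amounts, a2 = 0.15
print(round(o18_o16_ratio(1000.0, 1150.0, 0.15), 2))  # -> 1.0
```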

  5. Quantitative (1)H NMR method for hydrolytic kinetic investigation of salvianolic acid B.

    PubMed

    Pan, Jianyang; Gong, Xingchu; Qu, Haibin

    2013-11-01

    This work presents an exploratory study for monitoring the hydrolytic process of salvianolic acid B (Sal B) in low oxygen condition using a simple quantitative (1)H NMR (Q-NMR) method. The quantity of the compounds was calculated by the relative ratio of the integral values of the target peak for each compound to the known amount of the internal standard trimethylsilyl propionic acid (TSP). Kinetic runs have been carried out on different initial concentrations of Sal B (5.00, 10.0, 20.0mg/mL) and temperatures of 70, 80, 90°C. The effect of these two factors during the transformation process of Sal B was investigated. The hydrolysis followed pseudo-first-order kinetics and the apparent degradation kinetic constant at 80°C decreased when concentration of Sal B increased. Under the given conditions, the rate constant of overall hydrolysis as a function of temperature obeyed the Arrhenius equation. Six degradation products were identified by NMR and mass spectrometric analysis. Four of these degradation products, i.e. danshensu (DSS), protocatechuic aldehyde (PRO), salvianolic acid D (Sal D) and lithospermic acid (LA) were further identified by comparing the retention times with standard compounds. The advantage of this Q-NMR method was that no reference compounds were required for calibration curves, the quantification could be directly realized on hydrolyzed samples. It was proved to be simple, convenient and accurate for hydrolytic kinetic study of Sal B. PMID:23867115
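    The pseudo-first-order and Arrhenius analysis described above can be sketched as follows; the concentrations and the resulting apparent activation energy are hypothetical illustrations, not values from the study:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def pseudo_first_order_k(c0, c, t):
    """ln(C0/C) = k*t for pseudo-first-order decay; returns k."""
    return math.log(c0 / c) / t

def arrhenius_ea(k1, t1_kelvin, k2, t2_kelvin):
    """Apparent activation energy (J/mol) from rate constants
    at two absolute temperatures, via the Arrhenius equation."""
    return R * math.log(k2 / k1) / (1.0 / t1_kelvin - 1.0 / t2_kelvin)

# Hypothetical: Sal B falls from 10.0 to 6.07 mg/mL in 1 h at 70 degrees C,
# and from 10.0 to 2.23 mg/mL in 1 h at 90 degrees C
k70 = pseudo_first_order_k(10.0, 6.07, 1.0)   # h^-1
k90 = pseudo_first_order_k(10.0, 2.23, 1.0)   # h^-1
ea = arrhenius_ea(k70, 343.15, k90, 363.15)
print(round(k70, 2), round(ea / 1000.0), "kJ/mol")  # -> 0.5 57 kJ/mol
```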

  6. Multivariate least-squares methods applied to the quantitative spectral analysis of multicomponent samples

    SciTech Connect

    Haaland, D.M.; Easterling, R.G.; Vopicka, D.A.

    1985-01-01

    In an extension of earlier work, weighted multivariate least-squares methods of quantitative FT-IR analysis have been developed. A linear least-squares approximation to nonlinearities in the Beer-Lambert law is made by allowing the reference spectra to be a set of known mixtures. The incorporation of nonzero intercepts in the relation between absorbance and concentration further improves the approximation of nonlinearities while simultaneously accounting for nonzero spectral baselines. Pathlength variations are also accommodated in the analysis, and under certain conditions, unknown sample pathlengths can be determined. All spectral data are used to improve the precision and accuracy of the estimated concentrations. During the calibration phase of the analysis, pure component spectra are estimated from the standard mixture spectra. These can be compared with the measured pure component spectra to determine which vibrations experience nonlinear behavior. In the predictive phase of the analysis, the calculated spectra are used in our previous least-squares analysis to estimate sample component concentrations. These methods were applied to the analysis of the IR spectra of binary mixtures of esters. Even with severely overlapping spectral bands and nonlinearities in the Beer-Lambert law, the average relative error in the estimated concentration was <1%.
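    The core idea, a per-wavelength calibration with nonzero intercepts followed by a prediction that uses all spectral data at once, can be sketched for a single component. This is an illustrative simplification of the weighted multivariate scheme, not the authors' implementation:

```python
def calibrate(conc, spectra):
    """Per-wavelength linear fit a_j = k_j * c + b_j over the standards:
    k_j approximates the pure-component spectrum, b_j the baseline."""
    n = len(conc)
    mean_c = sum(conc) / n
    ks, bs = [], []
    for j in range(len(spectra[0])):
        col = [s[j] for s in spectra]
        mean_a = sum(col) / n
        k = sum((c - mean_c) * (a - mean_a) for c, a in zip(conc, col)) \
            / sum((c - mean_c) ** 2 for c in conc)
        ks.append(k)
        bs.append(mean_a - k * mean_c)
    return ks, bs

def predict(spectrum, ks, bs):
    """Least-squares concentration estimate using every wavelength."""
    num = sum(k * (a - b) for k, a, b in zip(ks, spectrum, bs))
    den = sum(k * k for k in ks)
    return num / den

# Hypothetical 3-wavelength standards with a constant 0.05 baseline
standards = {0.5: [0.15, 0.25, 0.10], 1.0: [0.25, 0.45, 0.15],
             2.0: [0.45, 0.85, 0.25]}
ks, bs = calibrate(list(standards), list(standards.values()))
print(round(predict([0.35, 0.65, 0.20], ks, bs), 2))  # -> 1.5
```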

  7. Process analytical technology case study part I: feasibility studies for quantitative near-infrared method development.

    PubMed

    Cogdill, Robert P; Anderson, Carl A; Delgado-Lopez, Miriam; Molseed, David; Chisholm, Robert; Bolton, Raymond; Herkert, Thorsten; Afnán, Ali M; Drennen, James K

    2005-01-01

    This article is the first of a series of articles detailing the development of near-infrared (NIR) methods for solid-dosage form analysis. Experiments were conducted at the Duquesne University Center for Pharmaceutical Technology to qualify the capabilities of instrumentation and sample handling systems, evaluate the potential effect of one source of a process signature on calibration development, and compare the utility of reflection and transmission data collection methods. A database of 572 production-scale sample spectra was used to evaluate the interbatch spectral variability of samples produced under routine manufacturing conditions. A second database of 540 spectra from samples produced under various compression conditions was analyzed to determine the feasibility of pooling spectral data acquired from samples produced at diverse scales. Instrument qualification tests were performed, and appropriate limits for instrument performance were established. To evaluate the repeatability of the sample positioning system, multiple measurements of a single tablet were collected. With the application of appropriate spectral preprocessing techniques, sample repositioning error was found to be insignificant with respect to NIR analyses of product quality attributes. Sample shielding was demonstrated to be unnecessary for transmission analyses. A process signature was identified in the reflection data. Additional tests demonstrated that the process signature was largely orthogonal to spectral variation because of hardness. Principal component analysis of the compression sample set data demonstrated the potential for quantitative model development. For the data sets studied, reflection analysis was demonstrated to be more robust than transmission analysis. PMID:16353986

  8. 3D reconstruction and quantitative assessment method of mitral eccentric regurgitation from color Doppler echocardiography

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Ge, Yi Nan; Wang, Tian Fu; Zheng, Chang Qiong; Zheng, Yi

    2005-10-01

    In this article, based on two-dimensional color Doppler imaging, a multiplane transesophageal rotational scanning method is used to acquire original Doppler echocardiograms while the electrocardiogram is recorded synchronously. After filtering and interpolation, surface rendering and volume rendering are performed. By analyzing the color-bar information and the superposition principle of the color Doppler flow image, the grayscale mitral anatomical structure and the color-coded regurgitation velocity parameter were separated from the color Doppler flow images. Three-dimensional reconstruction of the mitral structure and of the regurgitation velocity distribution was implemented separately, and fusion visualization of the reconstructed regurgitation velocity distribution with its corresponding 3D mitral anatomical structures was realized; this can be used to observe the position, phase, and direction of the mitral regurgitation and to measure its jet length, area, volume, spatial distribution, and severity level. In addition, in patients with eccentric mitral regurgitation, this new modality overcomes the inherent limitations of the two-dimensional color Doppler flow image by depicting the full extent of the jet trajectory: the area of eccentric regurgitation on the three-dimensional image was much larger than that on the two-dimensional image, and the variation of regurgitation area and volume is shown at different angles and different systolic phases. The study shows that three-dimensional color Doppler provides quantitative measurements of eccentric mitral regurgitation that are more accurate and reproducible than conventional color Doppler.

  9. A validated LC-MS-MS method for simultaneous identification and quantitation of rodenticides in blood.

    PubMed

    Bidny, Sergei; Gago, Kim; David, Mark; Duong, Thanh; Albertyn, Desdemona; Gunja, Naren

    2015-04-01

    A rapid, highly sensitive and specific analytical method for the extraction, identification and quantification of nine rodenticides from whole blood has been developed and validated. Commercially available rodenticides in Australia include coumatetralyl, warfarin, brodifacoum, bromadiolone, difenacoum, flocoumafen, difethialone, diphacinone and chlorophacinone. A Waters ACQUITY UPLC TQD system operating in multiple reaction monitoring mode was used to conduct the analysis. Two different ionization techniques, ES+ and ES-, were examined to achieve optimal sensitivity and selectivity, resulting in detection by MS-MS using electrospray ionization in positive mode for difenacoum and brodifacoum and in negative mode for all other analytes. All analytes were extracted from 200 µL of whole blood with ethyl acetate and separated on a Waters ACQUITY UPLC BEH-C18 column using gradient elution. Ammonium acetate (10 mM, pH 7.5) and methanol were used as mobile phases with a total run time of 8 min. Recoveries were between 70 and 105% with limits of detection ranging from 0.5 to 1 ng/mL. The limit of quantitation was 2 ng/mL for all analytes. Calibration curves were linear within the range 2-200 ng/mL for all analytes with the coefficient of determination ≥0.98. The application of the proposed method using liquid-liquid extraction in a series of clinical investigations and forensic toxicological analyses was successful. PMID:25595137
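    The linearity criterion above (coefficient of determination >= 0.98 over 2-200 ng/mL) amounts to fitting a least-squares line through the calibrators and computing R². A minimal sketch with hypothetical detector responses, not the paper's data:

```python
def calibration_r2(conc, response):
    """Fit response = slope * conc + intercept by least squares,
    then return the coefficient of determination R^2."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(response) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, response))
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(conc, response))
    ss_tot = sum((y - my) ** 2 for y in response)
    return 1.0 - ss_res / ss_tot

# Hypothetical brodifacoum calibrators spanning the 2-200 ng/mL range
conc = [2.0, 5.0, 10.0, 50.0, 100.0, 200.0]     # ng/mL
resp = [410.0, 1020.0, 2050.0, 10100.0, 20300.0, 40100.0]  # peak area
print(calibration_r2(conc, resp) >= 0.98)  # -> True
```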

  10. Simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and dead reckoning

    NASA Astrophysics Data System (ADS)

    Davey, Neil S.; Godil, Haris

    2013-05-01

    This article presents a comparative study between a well-known SLAM (Simultaneous Localization and Mapping) algorithm, called Gmapping, and a standard Dead-Reckoning algorithm; the study is based on experimental results of both approaches by using a commercial skid-based turning robot, P3DX. Five main base-case scenarios are conducted to evaluate and test the effectiveness of both algorithms. The results show that SLAM outperformed the Dead Reckoning in terms of map-making accuracy in all scenarios but one, since SLAM did not work well in a rapidly changing environment. Although the main conclusion about the excellence of SLAM is not surprising, the presented test method is valuable to professionals working in this area of mobile robots, as it is highly practical, and provides solid and valuable results. The novelty of this study lies in its simplicity. The simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and Dead Reckoning and some applications using autonomous robots are being patented by the authors in U.S. Patent Application Nos. 13/400,726 and 13/584,862.
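    Dead reckoning in this setting integrates odometry velocities into a pose estimate, so heading errors accumulate without the loop-closure corrections SLAM provides. A minimal sketch of the pose update, with hypothetical velocities rather than the P3DX interface:

```python
import math

def dead_reckon(pose, v, omega, dt):
    """One dead-reckoning step for a differential/skid-steer robot:
    pose = (x, y, theta); v = linear velocity, omega = angular velocity."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

# Drive straight 1 m, turn 90 degrees in place, drive straight 1 m
pose = (0.0, 0.0, 0.0)
pose = dead_reckon(pose, 1.0, 0.0, 1.0)
pose = dead_reckon(pose, 0.0, math.pi / 2, 1.0)
pose = dead_reckon(pose, 1.0, 0.0, 1.0)
print([round(p, 2) for p in pose])  # -> [1.0, 1.0, 1.57]
```

Any bias in v or omega (wheel slip is severe on skid-steer platforms) propagates into every subsequent step, which is why the mapping comparison favors SLAM.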

  11. Laser flare photometry: a noninvasive, objective, and quantitative method to measure intraocular inflammation.

    PubMed

    Tugal-Tutkun, Ilknur; Herbort, Carl P

    2010-10-01

    Aqueous flare and cells are the two inflammatory parameters of anterior chamber inflammation resulting from disruption of the blood-ocular barriers. When examined with the slit lamp, measurement of intraocular inflammation remains subjective with considerable intra- and interobserver variations. Laser flare cell photometry is an objective quantitative method that enables accurate measurement of these parameters with very high reproducibility. Laser flare photometry allows detection of subclinical alterations in the blood-ocular barriers, identifying subtle pathological changes that could not have been recorded otherwise. With the use of this method, it has been possible to compare the effect of different surgical techniques, surgical adjuncts, and anti-inflammatory medications on intraocular inflammation. Clinical studies of uveitis patients have shown that flare measurements by laser flare photometry allowed precise monitoring of well-defined uveitic entities and prediction of disease relapse. Relationships of laser flare photometry values with complications of uveitis and visual loss further indicate that flare measurement by laser flare photometry should be included in the routine follow-up of patients with uveitis. PMID:19430730

  12. Simple saponification method for the quantitative determination of carotenoids in green vegetables.

    PubMed

    Larsen, Erik; Christensen, Lars P

    2005-08-24

    A simple, reliable, and gentle saponification method for the quantitative determination of carotenoids in green vegetables was developed. The method involves an extraction procedure with acetone and the selective removal of the chlorophylls and esterified fatty acids from the organic phase using a strongly basic resin (Ambersep 900 OH). Extracts from common green vegetables (beans, broccoli, green bell pepper, chive, lettuce, parsley, peas, and spinach) were analyzed by high-performance liquid chromatography (HPLC) for their content of major carotenoids before and after action of Ambersep 900 OH. The mean recovery percentages for most carotenoids [(all-E)-violaxanthin, (all-E)-lutein epoxide, (all-E)-lutein, neolutein A, and (all-E)-beta-carotene] after saponification of the vegetable extracts with Ambersep 900 OH were close to 100% (99-104%), while the mean recovery percentages of (9'Z)-neoxanthin increased to 119% and that of (all-E)-neoxanthin and neolutein B decreased to 90% and 72%, respectively. PMID:16104772

  13. Recommended Methods for Brain Processing and Quantitative Analysis in Rodent Developmental Neurotoxicity Studies.

    PubMed

    Garman, Robert H; Li, Abby A; Kaufmann, Wolfgang; Auer, Roland N; Bolon, Brad

    2016-01-01

    Neuropathology methods in rodent developmental neurotoxicity (DNT) studies have evolved with experience and changing regulatory guidance. This article emphasizes principles and methods to promote more standardized DNT neuropathology evaluation, particularly procurement of highly homologous brain sections and collection of the most reproducible morphometric measurements. To minimize bias, brains from all animals at all dose levels should be processed from brain weighing through paraffin embedding at one time using a counterbalanced design. Morphometric measurements should be anchored by distinct neuroanatomic landmarks that can be identified reliably on the faced block or in unstained sections and which address the region-specific circuitry of the measured area. Common test article-related qualitative changes in the developing brain include abnormal cell numbers (yielding altered regional size), displaced cells (ectopia and heterotopia), and/or aberrant differentiation (indicated by defective myelination or synaptogenesis), but rarely glial or inflammatory reactions. Inclusion of digital images in the DNT pathology raw data provides confidence that the quantitative analysis was done on anatomically matched (i.e., highly homologous) sections. Interpreting DNT neuropathology data and their presumptive correlation with neurobehavioral data requires an integrative weight-of-evidence approach including consideration of maternal toxicity, body weight, brain weight, and the pattern of findings across brain regions, doses, sexes, and ages. PMID:26296631

  14. An Evaluation of Quantitative Methods of Determining the Degree of Melting Experienced by a Chondrule

    NASA Technical Reports Server (NTRS)

    Nettles, J. W.; Lofgren, G. E.; Carlson, W. D.; McSween, H. Y., Jr.

    2004-01-01

    Many workers have considered the degree to which partial melting occurred in chondrules they have studied, and this has led to attempts to find reliable methods of determining the degree of melting. At least two quantitative methods have been used in the literature: a convolution index (CVI), which is a ratio of the perimeter of the chondrule as seen in thin section divided by the perimeter of a circle with the same area as the chondrule, and nominal grain size (NGS), which is the inverse square root of the number density of olivines and pyroxenes in a chondrule (again, as seen in thin section). We have evaluated both nominal grain size and convolution index as melting indicators. Nominal grain size was measured on the results of a set of dynamic crystallization experiments previously described, where aliquots of LEW97008(L3.4) were heated to peak temperatures of 1250, 1350, 1370, and 1450 C, representing varying degrees of partial melting of the starting material. Nominal grain size numbers should correlate with peak temperature (and therefore degree of partial melting) if it is a good melting indicator. The convolution index is not directly testable with these experiments because the experiments do not actually create chondrules (and therefore they have no outline on which to measure a CVI). Thus we had no means to directly test how well the CVI predicted different degrees of melting. Therefore, we discuss the use of the CVI measurement and support the discussion with X-ray Computed Tomography (CT) data.
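    Both indices reduce to simple formulas on thin-section measurements: CVI is the chondrule perimeter divided by the perimeter of a circle of equal area, and NGS is the inverse square root of the grain number density. A minimal sketch, with hypothetical sample values:

```python
import math

def convolution_index(perimeter, area):
    """CVI = perimeter / perimeter of an equal-area circle.
    A perfect circle gives 1.0; more convoluted outlines give more."""
    return perimeter / (2.0 * math.sqrt(math.pi * area))

def nominal_grain_size(n_grains, area):
    """NGS = inverse square root of the olivine + pyroxene
    number density (grains per unit area, as seen in thin section)."""
    return 1.0 / math.sqrt(n_grains / area)

# A unit circle has CVI exactly 1.0
print(round(convolution_index(2 * math.pi, math.pi), 2))  # -> 1.0
# 100 grains counted in a 4 mm^2 section
print(round(nominal_grain_size(100, 4.0), 1))  # -> 0.2 (mm)
```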

  15. SEM-contour shape analysis method for advanced semiconductor devices

    NASA Astrophysics Data System (ADS)

    Toyoda, Yasutaka; Shindo, Hiroyuki; Ota, Yoshihiro; Matsuoka, Ryoichi; Hojo, Yutaka; Fuchimoto, Daisuke; Hibino, Daisuke; Sakai, Hideo

    2013-04-01

    The new measuring method that we developed executes a contour shape analysis based on pattern edge information from an SEM image. This analysis yields a highly precise quantification of every circuit pattern shape by comparing the contour extracted from the SEM image, using a CD measurement algorithm, with the ideal circuit pattern. In the next phase, the developed method generates four shape indices from the mass measurement data produced by this analysis. When the shape index measured using the developed method is compared with the CD, the difference between the shape index and the CD is negligibly small for the quantification of the circuit pattern shape. In addition, when the 2D patterns on an FEM wafer are measured using the developed method, the tendency for shape deformation is precisely captured by the four shape indices. This new method and the evaluation results are presented in detail in this paper.

  16. Advances in spectroscopic methods for quantifying soil carbon

    USGS Publications Warehouse

    Liebig, Mark; Franzluebbers, Alan J.; Follett, Ronald F.; Hively, W. Dean; Reeves, James B., III; McCarty, Gregory W.; Calderon, Francisco

    2012-01-01

    The gold standard for soil C determination is combustion. However, this method requires expensive consumables, is limited to the determination of total carbon, and is restricted in the number of samples that can be processed (~100/d). With increased interest in soil C sequestration, faster methods are needed; hence the interest in methods based on diffuse reflectance spectroscopy in the visible, near-infrared, or mid-infrared ranges using either proximal or remote sensing. These methods can analyze more samples (2 to 3 times as many per day) or huge areas (imagery) and measure multiple analytes simultaneously, but they require calibrations relating spectral and reference data and have specific problems; for example, remote sensing is capable of scanning entire watersheds, thus reducing the sampling needed, but is limited to the surface layer of tilled soils and by the difficulty of obtaining proper calibration reference values. The objective of this discussion is to present the state of spectroscopic methods for soil C determination.

  17. Comparison of the scanning linear estimator (SLE) and ROI methods for quantitative SPECT imaging

    NASA Astrophysics Data System (ADS)

    Könik, Arda; Kupinski, Meredith; Hendrik Pretorius, P.; King, Michael A.; Barrett, Harrison H.

    2015-08-01

    In quantitative emission tomography, tumor activity is typically estimated from calculations on a region of interest (ROI) identified in the reconstructed slices. In these calculations, unpredictable bias arising from the null functions of the imaging system affects ROI estimates. The magnitude of this bias depends upon the tumor size and location. In prior work it has been shown that the scanning linear estimator (SLE), which operates on the raw projection data, is an unbiased estimator of activity when the size and location of the tumor are known. In this work, we performed analytic simulation of SPECT imaging with a parallel-hole medium-energy collimator. Distance-dependent system spatial resolution and non-uniform attenuation were included in the imaging simulation. We compared the task of activity estimation by the ROI and SLE methods for a range of tumor sizes (diameter: 1-3 cm) and activities (contrast ratio: 1-10) added to uniform and non-uniform liver backgrounds. Using the correct value for the tumor shape and location is an idealized approximation to how task estimation would occur clinically. Thus we determined how perturbing this idealized prior knowledge impacted the performance of both techniques. To implement the SLE for the non-uniform background, we used a novel iterative algorithm for pre-whitening stationary noise within a compact region. Estimation task performance was compared using the ensemble mean-squared error (EMSE) as the criterion. The SLE method performed substantially better than the ROI method (i.e. EMSE(SLE) was 23-174 times lower) when the background is uniform and tumor location and size are known accurately. The variance of the SLE increased when a non-uniform liver texture was introduced but the EMSE(SLE) continued to be 5-20 times lower than the ROI method. In summary, SLE outperformed ROI under almost all conditions that we tested.
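    The EMSE criterion used above is simply the ensemble average of squared estimation error over repeated noisy acquisitions. A toy sketch of the comparison; the activity values are hypothetical, and real SPECT estimates would come from projection or reconstructed data:

```python
def emse(estimates, truths):
    """Ensemble mean-squared error over repeated noisy acquisitions."""
    return sum((e - t) ** 2 for e, t in zip(estimates, truths)) / len(estimates)

# Hypothetical ensemble: SLE estimates cluster tightly around the true
# activity of 10.0, while ROI estimates carry bias from the null functions
truth = [10.0] * 5
sle_est = [9.9, 10.1, 10.0, 9.8, 10.2]
roi_est = [8.5, 8.9, 8.7, 8.6, 8.8]
print(emse(sle_est, truth) < emse(roi_est, truth))  # -> True
```

Note how the ROI bias dominates its EMSE even though its variance is comparable; this is the behavior the paper attributes to the imaging system's null functions.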

  18. Validation of a simple and inexpensive method for the quantitation of infarct in the rat brain.

    PubMed

    Schilichting, C L R; Lima, K C M; Cestari, L A; Sekiyama, J Y; Silva, F M; Milani, H

    2004-04-01

    A gravimetric method was evaluated as a simple, sensitive, reproducible, low-cost alternative to quantify the extent of brain infarct after occlusion of the medial cerebral artery in rats. In ether-anesthetized rats, the left medial cerebral artery was occluded for 1, 1.5 or 2 h by inserting a 4-0 nylon monofilament suture into the internal carotid artery. Twenty-four hours later, the brains were processed for histochemical triphenyltetrazolium chloride (TTC) staining and quantitation of the ischemic infarct. In each TTC-stained brain section, the ischemic tissue was dissected with a scalpel and fixed in 10% formalin at 0 °C until its total mass could be estimated. The mass (mg) of the ischemic tissue was weighed on an analytical balance and compared to its volume (mm(3)), estimated either by plethysmometry using platinum electrodes or by computer-assisted image analysis. Infarct size as measured by the weighing method (mg), and reported as a percent (%) of the affected (left) hemisphere, correlated closely with volume (mm(3), also reported as %) estimated by computerized image analysis (r = 0.88; P < 0.001; N = 10) or by plethysmometry (r = 0.97-0.98; P < 0.0001; N = 41). This degree of correlation was maintained between different experimenters. The method was also sensitive for detecting the effect of different ischemia durations on infarct size (P < 0.005; N = 23), and the effect of drug treatments in reducing the extent of brain damage (P < 0.005; N = 24). The data suggest that, in addition to being simple and low cost, the weighing method is a reliable alternative for quantifying brain infarct in animal models of stroke. PMID:15064814
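    The percent-of-hemisphere metric and the mass-volume correlations reported above reduce to simple calculations. A sketch with hypothetical paired measurements, not data from the study:

```python
import math

def infarct_percent(infarct_mass_mg, hemisphere_mass_mg):
    """Infarct size by weight, as a percent of the affected hemisphere."""
    return 100.0 * infarct_mass_mg / hemisphere_mass_mg

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical paired measurements: infarct mass (mg) vs volume (mm^3)
mass = [52.0, 88.0, 120.0, 161.0, 199.0]
volume = [50.0, 90.0, 118.0, 165.0, 196.0]
print(round(pearson_r(mass, volume), 2))  # -> 1.0
print(round(infarct_percent(120.0, 600.0), 1))  # -> 20.0
```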

  19. Comparison of the scanning linear estimator (SLE) and ROI methods for quantitative SPECT imaging.

    PubMed

    Könik, Arda; Kupinski, Meredith; Pretorius, P Hendrik; King, Michael A; Barrett, Harrison H

    2015-08-21

    In quantitative emission tomography, tumor activity is typically estimated from calculations on a region of interest (ROI) identified in the reconstructed slices. In these calculations, unpredictable bias arising from the null functions of the imaging system affects ROI estimates. The magnitude of this bias depends upon the tumor size and location. In prior work it has been shown that the scanning linear estimator (SLE), which operates on the raw projection data, is an unbiased estimator of activity when the size and location of the tumor are known. In this work, we performed analytic simulation of SPECT imaging with a parallel-hole medium-energy collimator. Distance-dependent system spatial resolution and non-uniform attenuation were included in the imaging simulation. We compared the task of activity estimation by the ROI and SLE methods for a range of tumor sizes (diameter: 1-3 cm) and activities (contrast ratio: 1-10) added to uniform and non-uniform liver backgrounds. Using the correct value for the tumor shape and location is an idealized approximation to how task estimation would occur clinically. Thus we determined how perturbing this idealized prior knowledge impacted the performance of both techniques. To implement the SLE for the non-uniform background, we used a novel iterative algorithm for pre-whitening stationary noise within a compact region. Estimation task performance was compared using the ensemble mean-squared error (EMSE) as the criterion. The SLE method performed substantially better than the ROI method (i.e. EMSE(SLE) was 23-174 times lower) when the background is uniform and tumor location and size are known accurately. The variance of the SLE increased when a non-uniform liver texture was introduced but the EMSE(SLE) continued to be 5-20 times lower than the ROI method. In summary, SLE outperformed ROI under almost all conditions that we tested. PMID:26247228

  20. Laboratory Evaluations of the Enterococcus qPCR Method for Recreational Water Quality Testing: Method Performance and Sources of Uncertainty in Quantitative Measurements

    EPA Science Inventory

    The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...