Science.gov

Sample records for quantitative methods results

  1. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  2. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  3. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  4. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
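    The divided-difference idea in this record is easy to sketch: each adjacent antenna pair yields a field estimate E ≈ ΔV/Δd. The positions and voltages below are hypothetical illustration values, not data from the patent.

```python
def field_estimates(positions, voltages):
    """E-field estimates (V/m) between consecutive antennas along one axis:
    each voltage difference divided by the known antenna separation."""
    estimates = []
    for i in range(len(positions) - 1):
        dv = voltages[i + 1] - voltages[i]
        dd = positions[i + 1] - positions[i]
        estimates.append(dv / dd)
    return estimates

positions = [0.0, 0.1, 0.25, 0.5]   # hypothetical antenna locations (m)
voltages = [0.0, 1.2, 3.0, 6.0]     # hypothetical measured potentials (V)
print([round(e, 6) for e in field_estimates(positions, voltages)])  # [12.0, 12.0, 12.0]
```

    A uniform field gives the same quotient for every pair; spatial variation in the field shows up as variation across the per-pair estimates.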

  5. Quantitative results from the focusing schlieren technique

    NASA Technical Reports Server (NTRS)

    Cook, S. P.; Chokani, Ndaona

    1993-01-01

    An iterative theoretical approach to obtain quantitative density data from the focusing schlieren technique is proposed. The approach is based on an approximate modeling of the focusing action in a focusing schlieren system, and an estimation of an appropriate focal plane thickness. The theoretical approach is incorporated in a computer program, and results obtained from a supersonic wind tunnel experiment are evaluated by comparison with CFD data. The density distributions compared favorably with CFD predictions. However, improvements to the system are required to reduce noise in the data, to improve specification of the depth of focus, and to refine the modeling of the focusing action.

  6. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. PMID:26763302
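    One of the simplest normalization strategies in this family is scaling each sample to its total signal ("constant sum" normalization), which removes a global difference in sample amount. A minimal sketch with hypothetical peak intensities:

```python
def total_sum_normalize(intensities):
    """Scale one sample's metabolite intensities so they sum to 1,
    removing differences in total sample amount between samples."""
    total = sum(intensities)
    return [x / total for x in intensities]

sample_a = [200.0, 300.0, 500.0]    # hypothetical peak intensities
sample_b = [400.0, 600.0, 1000.0]   # same metabolite profile, 2x total amount
print(total_sum_normalize(sample_a) == total_sum_normalize(sample_b))  # True
```

    More robust alternatives discussed in the literature (e.g., probabilistic quotient normalization) estimate a per-sample dilution factor from the median fold change against a reference spectrum rather than from the raw sum.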

  7. Quantitative Methods in Psychology: Inevitable and Useless

    PubMed Central

    Toomela, Aaro

    2010-01-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: structural-systemic that is based on Aristotelian thinking, and associative-quantitative that is based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause–effect relationships between the events with no possible access to the understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments. PMID:21833199

  8. Quantitative statistical methods for image quality assessment.

    PubMed

    Dutta, Joyita; Ahn, Sangtae; Li, Quanzheng

    2013-01-01

    Quantitative measures of image quality and reliability are critical for both qualitative interpretation and quantitative analysis of medical images. While, in theory, it is possible to analyze reconstructed images by means of Monte Carlo simulations using a large number of noise realizations, the associated computational burden makes this approach impractical. Additionally, this approach is less meaningful in clinical scenarios, where multiple noise realizations are generally unavailable. The practical alternative is to compute closed-form analytical expressions for image quality measures. The objective of this paper is to review statistical analysis techniques that enable us to compute two key metrics: resolution (determined from the local impulse response) and covariance. The underlying methods include fixed-point approaches, which compute these metrics at a fixed point (the unique and stable solution) independent of the iterative algorithm employed, and iteration-based approaches, which yield results that are dependent on the algorithm, initialization, and number of iterations. We also explore extensions of some of these methods to a range of special contexts, including dynamic and motion-compensated image reconstruction. While most of the discussed techniques were developed for emission tomography, the general methods are extensible to other imaging modalities as well. In addition to enabling image characterization, these analysis techniques allow us to control and enhance imaging system performance. We review practical applications where performance improvement is achieved by applying these ideas to the contexts of both hardware (optimizing scanner design) and image reconstruction (designing regularization functions that produce uniform resolution or maximize task-specific figures of merit). PMID:24312148

  9. Quantitative Statistical Methods for Image Quality Assessment

    PubMed Central

    Dutta, Joyita; Ahn, Sangtae; Li, Quanzheng

    2013-01-01

    Quantitative measures of image quality and reliability are critical for both qualitative interpretation and quantitative analysis of medical images. While, in theory, it is possible to analyze reconstructed images by means of Monte Carlo simulations using a large number of noise realizations, the associated computational burden makes this approach impractical. Additionally, this approach is less meaningful in clinical scenarios, where multiple noise realizations are generally unavailable. The practical alternative is to compute closed-form analytical expressions for image quality measures. The objective of this paper is to review statistical analysis techniques that enable us to compute two key metrics: resolution (determined from the local impulse response) and covariance. The underlying methods include fixed-point approaches, which compute these metrics at a fixed point (the unique and stable solution) independent of the iterative algorithm employed, and iteration-based approaches, which yield results that are dependent on the algorithm, initialization, and number of iterations. We also explore extensions of some of these methods to a range of special contexts, including dynamic and motion-compensated image reconstruction. While most of the discussed techniques were developed for emission tomography, the general methods are extensible to other imaging modalities as well. In addition to enabling image characterization, these analysis techniques allow us to control and enhance imaging system performance. We review practical applications where performance improvement is achieved by applying these ideas to the contexts of both hardware (optimizing scanner design) and image reconstruction (designing regularization functions that produce uniform resolution or maximize task-specific figures of merit). PMID:24312148

  10. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  11. Quantitative laser-induced breakdown spectroscopy data using peak area step-wise regression analysis: an alternative method for interpretation of Mars science laboratory results

    SciTech Connect

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Dyar, Melinda D; Schafer, Martha W; Tucker, Jonathan M

    2008-01-01

    The ChemCam instrument on the Mars Science Laboratory (MSL) will include a laser-induced breakdown spectrometer (LIBS) to quantify major and minor elemental compositions. The traditional analytical chemistry approach to calibration curves for these data regresses a single diagnostic peak area against concentration for each element. This approach contrasts with a new multivariate method in which elemental concentrations are predicted by step-wise multiple regression analysis based on areas of a specific set of diagnostic peaks for each element. The method is tested on LIBS data from igneous and metamorphosed rocks. Between 4 and 13 partial regression coefficients are needed to describe each elemental abundance accurately (i.e., with a regression line of R² > 0.9995 for the relationship between predicted and measured elemental concentration) for all major and minor elements studied. Validation plots suggest that the method is limited at present by the small data set, and will work best for prediction of concentration when a wide variety of compositions and rock types has been analyzed.
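    The step-wise idea can be illustrated with a toy forward-selection loop over candidate peak areas: at each step, add the peak that most reduces the residual sum of squares of a multiple regression, and stop when the improvement is negligible. Everything below (line names, areas, concentrations) is hypothetical and much smaller than a real LIBS data set.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def sse_for(columns, y):
    """Residual sum of squares of least squares with design [1, columns...]."""
    m = len(y)
    X = [[1.0] + [col[r] for col in columns] for r in range(m)]
    k = len(X[0])
    XtX = [[sum(X[r][i] * X[r][j] for r in range(m)) for j in range(k)] for i in range(k)]
    Xty = [sum(X[r][i] * y[r] for r in range(m)) for i in range(k)]
    beta = solve(XtX, Xty)
    return sum((y[r] - sum(X[r][j] * beta[j] for j in range(k))) ** 2 for r in range(m))

def forward_stepwise(peaks, y, tol=1e-6):
    """Greedily add the peak that most reduces SSE; stop at negligible gain."""
    chosen, remaining, best = [], set(peaks), sse_for([], y)
    while remaining:
        name = min(remaining, key=lambda n: sse_for([peaks[c] for c in chosen] + [peaks[n]], y))
        new = sse_for([peaks[c] for c in chosen] + [peaks[name]], y)
        if best - new < tol:
            break
        chosen.append(name)
        remaining.remove(name)
        best = new
    return chosen

peaks = {  # hypothetical diagnostic peak areas across 5 calibration targets
    "Si_288nm": [1.0, 2.0, 3.0, 4.0, 5.0],
    "Fe_372nm": [5.0, 3.0, 8.0, 1.0, 2.0],   # line unrelated to this element
    "Si_251nm": [2.0, 1.0, 4.0, 3.0, 5.0],
}
y = [3.0, 4.5, 8.0, 9.5, 12.5]  # "measured" concentrations: 2*Si_288nm + 0.5*Si_251nm
print(forward_stepwise(peaks, y))  # ['Si_288nm', 'Si_251nm']
```

    The irrelevant line is never selected because including it does not lower the residual beyond the tolerance; the paper's 4-13 coefficients per element arise the same way at realistic scale.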

  12. Quantitative Method of Measuring Metastatic Activity

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  13. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods, used in pharmaceutical analysis, consists of several components. The analysis of the most important sources of the quantitative microbiological methods variability demonstrated no effect of culture media and plate-count techniques in the estimation of microbial count, while the highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These data did not exceed 35%, appropriate for traditional plate count methods. PMID:26456251
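    In its simplest one-factor form, the ANOVA the authors rely on compares between-group to within-group variance. A minimal sketch with hypothetical plate counts (two analysts reading the same product):

```python
def one_way_anova(groups):
    """F statistic for a one-way ANOVA across replicate groups:
    (between-group mean square) / (within-group mean square)."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((v - sum(g) / len(g)) ** 2 for v in g) for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

analyst_1 = [52, 55, 49]  # hypothetical colony counts, reader 1
analyst_2 = [61, 64, 58]  # hypothetical colony counts, reader 2
print(round(one_way_anova([analyst_1, analyst_2]), 3))  # 13.5
```

    A large F relative to the critical value flags the reader as a significant source of variability, which is exactly the kind of factor effect the study quantifies (here with more factors and their interactions).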

  14. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadowmaps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position with the observed irradiance pattern. The present work addresses the quite unexplored problem of the wavefront's gradient estimation from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the very first time that a complete mathematically grounded treatment of this optical phenomenon is presented, complemented by an image-processing algorithm which allows a quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device just through its associated foucaultgrams. PMID:27505659
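    The "Fourier-based algorithm" step, integrating measured partial derivatives back into a surface, is commonly done with the classic Frankot-Chellappa least-squares scheme. The sketch below assumes a periodic square grid and may differ from the paper's exact algorithm; it verifies itself on a synthetic surface with known analytic gradients.

```python
import numpy as np

def integrate_gradients(gx, gy, dx):
    """Least-squares Fourier integration of a gradient field
    (Frankot-Chellappa), assuming a periodic n-by-n grid of spacing dx."""
    n = gx.shape[0]
    w = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
    wy, wx = np.meshgrid(w, w, indexing="ij")   # wx varies along axis 1 (x)
    Gx, Gy = np.fft.fft2(gx), np.fft.fft2(gy)
    denom = wx**2 + wy**2
    denom[0, 0] = 1.0                 # avoid 0/0 at the DC term
    Z = (-1j * wx * Gx - 1j * wy * Gy) / denom
    Z[0, 0] = 0.0                     # mean height is unrecoverable; set to 0
    return np.real(np.fft.ifft2(Z))

# Verify on a synthetic periodic surface with known analytic gradients
n = 64
dx = 2.0 * np.pi / n
x = np.arange(n) * dx
X, Y = np.meshgrid(x, x)              # X varies along axis 1
z = np.sin(X) + np.cos(Y)
z_rec = integrate_gradients(np.cos(X), -np.sin(Y), dx)
err = np.max(np.abs((z_rec - z_rec.mean()) - (z - z.mean())))
print(err)  # ~1e-13: exact up to round-off for band-limited periodic data
```

    Real foucaultgram gradients are noisy and non-periodic, so practical implementations add windowing or mirror-padding before the transform.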

  15. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  16. Quantitative rotating frame relaxometry methods in MRI.

    PubMed

    Gilani, Irtiza Ali; Sepponen, Raimo

    2016-06-01

    Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. It has been shown in several studies that the rotating frame longitudinal relaxation rate constant (R1ρ) and the rotating frame transverse relaxation rate constant (R2ρ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of the tissue (i.e. R1ρ and R2ρ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during the process of spin-locking in rotating frame relaxometry are the major barriers to the establishment of these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurements of R1ρ and R2ρ. The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27100142
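    Quantifying R1ρ typically means acquiring images at several spin-lock durations (TSL) and fitting a mono-exponential decay S(TSL) = S0·exp(-TSL·R1ρ) per voxel. A minimal noiseless sketch of that fit, using a log-linear least squares shortcut and entirely synthetic numbers:

```python
import math

def fit_r1rho(spin_lock_times, signals):
    """Estimate R1rho from S(TSL) = S0*exp(-TSL*R1rho) by linear least
    squares on log(S); valid for noiseless, strictly positive signals."""
    ys = [math.log(s) for s in signals]
    n = len(spin_lock_times)
    mx = sum(spin_lock_times) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(spin_lock_times, ys))
             / sum((x - mx) ** 2 for x in spin_lock_times))
    return -slope  # R1rho, in 1/ms if TSL is in ms

tsl = [0.0, 10.0, 20.0, 40.0, 80.0]               # spin-lock durations (ms)
sig = [100.0 * math.exp(-t * 0.02) for t in tsl]  # synthetic decay, R1rho = 0.02/ms
print(round(fit_r1rho(tsl, sig), 6))  # 0.02  (i.e. T1rho = 50 ms)
```

    With real Rician-noised magnitude data, nonlinear fitting is preferred over the log-linear shortcut; the long TSL values needed for precision are exactly what drives the acquisition-time and RF-deposition barriers the review discusses.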

  17. Quantitative Methods for Assessing Drug Synergism

    PubMed Central

    2011-01-01

    Two or more drugs that individually produce overtly similar effects will sometimes display greatly enhanced effects when given in combination. When the combined effect is greater than that predicted by their individual potencies, the combination is said to be synergistic. A synergistic interaction allows the use of lower doses of the combination constituents, a situation that may reduce adverse reactions. Drug combinations are quite common in the treatment of cancers, infections, pain, and many other diseases and situations. The determination of synergism is a quantitative pursuit that involves a rigorous demonstration that the combination effect is greater than that which is expected from the individual drug’s potencies. The basis of that demonstration is the concept of dose equivalence, which is discussed here and applied to an experimental design and data analysis known as isobolographic analysis. That method, and a related method of analysis that also uses dose equivalence, are presented in this brief review, which provides the mathematical basis for assessing synergy and an optimization strategy for determining the dose combination. PMID:22737266

  18. Qualitative versus Quantitative Results: An Experimental Introduction to Data Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Eric R.; Alter, Paula

    1989-01-01

    Described is an experiment in which the student can ascertain the meaning of a negative result from a qualitative test by performing a more sensitive quantitative test on the same sample. Methodology for testing urinary glucose with a spectrophotometer at 630 nm and with commercial assaying glucose strips is presented. (MVL)

  19. Sparse methods for Quantitative Susceptibility Mapping

    NASA Astrophysics Data System (ADS)

    Bilgic, Berkin; Chatnuntawech, Itthi; Langkammer, Christian; Setsompop, Kawin

    2015-09-01

    Quantitative Susceptibility Mapping (QSM) aims to estimate the tissue susceptibility distribution that gives rise to subtle changes in the main magnetic field, which are captured by the image phase in a gradient echo (GRE) experiment. The underlying susceptibility distribution is related to the acquired tissue phase through an ill-posed linear system. To facilitate its inversion, spatial regularization that imposes sparsity or smoothness assumptions can be employed. This paper focuses on efficient algorithms for regularized QSM reconstruction. Fast solvers that enforce sparsity under Total Variation (TV) and Total Generalized Variation (TGV) constraints are developed using the Alternating Direction Method of Multipliers (ADMM). Through variable splitting that permits closed-form iterations, the computational efficiency of these solvers is dramatically improved. An alternative approach to improve the conditioning of the ill-posed inversion is to acquire multiple GRE volumes at different head orientations relative to the main magnetic field. The phase information from such multi-orientation acquisition can be combined to yield exquisite susceptibility maps and obviate the need for regularized reconstruction, albeit at the cost of increased data acquisition time.
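    The ill-posedness that motivates the TV/TGV solvers comes from the k-space dipole kernel relating susceptibility to field perturbation, D(k) = 1/3 - kz²/|k|², which vanishes on the magic-angle cone. This sketch only builds the kernel and exhibits its zeros; it is a simplified stand-in for the forward model, not the paper's ADMM reconstruction.

```python
import numpy as np

def dipole_kernel(n):
    """Unit dipole kernel in k-space, D(k) = 1/3 - kz^2/|k|^2. Its zeros on
    the magic-angle cone make phase-to-susceptibility inversion ill-posed."""
    k = np.fft.fftfreq(n)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0                 # avoid 0/0 at the DC term
    D = 1.0 / 3.0 - kz**2 / k2
    D[0, 0, 0] = 0.0
    return D

D = dipole_kernel(32)
print(D.min(), D.max())                  # bounded in [-2/3, 1/3]
print(float(np.mean(np.abs(D) < 0.05))) # sizeable fraction of k-space is near zero
```

    Dividing the measured field by D amplifies noise wherever |D| is small, which is why regularized inversion (or multi-orientation acquisition, which rotates the cone between scans) is needed.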

  20. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side-effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
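    A hierarchical weighted average can be sketched as a criteria tree: leaf criteria are scored per alternative, and each inner node rolls its children up with weights that sum to 1. The tree, weights, and scores below are hypothetical, not from the report.

```python
def weighted_score(node, scores):
    """Roll scores up a criteria tree: a leaf is the name of a scored
    criterion; an inner node holds per-child weights summing to 1."""
    if isinstance(node, str):
        return scores[node]
    return sum(w * weighted_score(node["children"][name], scores)
               for name, w in node["weights"].items())

# Hypothetical two-level criteria tree for comparing design alternatives
tree = {
    "weights": {"performance": 0.6, "cost": 0.4},
    "children": {
        "performance": {
            "weights": {"speed": 0.7, "memory": 0.3},
            "children": {"speed": "speed", "memory": "memory"},
        },
        "cost": "cost",
    },
}

alt_a = {"speed": 9, "memory": 5, "cost": 4}   # hypothetical 0-10 ratings
alt_b = {"speed": 6, "memory": 8, "cost": 9}
print(round(weighted_score(tree, alt_a), 2),
      round(weighted_score(tree, alt_b), 2))   # 6.28 7.56
```

    Making the weights explicit is what delivers the documentation benefit the abstract mentions: the rationale for preferring one alternative is reproducible from the tree alone.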

  1. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C.; Kuske, Cheryl R.; Mullen, Kenneth I.

    2002-01-01

    A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA-complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.
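    The core control logic, diluting until further dilution changes fluorescence only in proportion to the dilution itself (i.e., contaminant attenuation has vanished), can be sketched as a loop. The attenuation model below is a hypothetical stand-in for a real fluorescence-attenuating contaminant, not the patent's chemistry.

```python
def dilute_until_unattenuated(fluorescence, dilution=2.0, tol=0.02):
    """Serially dilute until measured fluorescence falls exactly in
    proportion to the dilution factor, signalling negligible attenuation.
    fluorescence(conc_factor) -> measured intensity."""
    factor = 1.0
    while True:
        f_now = fluorescence(factor)
        f_next = fluorescence(factor / dilution)
        # unattenuated signal scales linearly with concentration
        if abs(f_next * dilution - f_now) / f_now < tol:
            return factor
        factor /= dilution

def measured(conc):
    # hypothetical inner-filter-style attenuation, stronger at high concentration
    return conc / (1.0 + 5.0 * conc)

print(dilute_until_unattenuated(measured))  # 0.0078125
```

    In the method itself, the absorbance of this sufficiently dilute solution (the "threshold absorbance") then serves as the reusable reference for samples from similar environments.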

  2. Quantitative MR imaging in fracture dating-Initial results.

    PubMed

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

    For exact age determinations of bone fractures in a forensic context (e.g. in cases of child abuse) improved knowledge of the time course of the healing process and use of non-invasive modern imaging technology is of high importance. To date, fracture dating is based on radiographic methods by determining the callus status and thereby relying on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined for this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11 ♂: 6; median age 34±15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or in the collar bone. Both qualitative and quantitative MR measurements were performed in all subjects. MR sequences for a quantitative measurement of relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area by defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7 ♂: 5) showed an initial peak in T1 values in the fractured area (T1=1895±607ms), which decreased over time to a value of 1094±182ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2=115±80ms) and decreased to 73±33ms within 21 days after the fracture event. After that time point, no significant changes…

  3. Meaning in Method: The Rhetoric of Quantitative and Qualitative Research.

    ERIC Educational Resources Information Center

    Firestone, William A.

    The current debate about quantitative and qualitative research methods focuses on whether there is a necessary connection between method-type and research paradigm that makes the different approaches incompatible. This paper argues that the connection is not so much logical as rhetorical. Quantitative methods express the assumptions of a…

  4. Informatics Methods to Enable Sharing of Quantitative Imaging Research Data

    PubMed Central

    Levy, Mia A.; Freymann, John B.; Kirby, Justin S.; Fedorov, Andriy; Fennessy, Fiona M.; Eschrich, Steven A.; Berglund, Anders E.; Fenstermacher, David A.; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L.; Brown, Bartley J.; Braun, Terry A.; Dekker, Andre; Roelofs, Erik; Mountz, James M.; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-01-01

    Introduction The National Cancer Institute (NCI) Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable sharing data and to promote reuse of quantitative imaging data in the community. Methods We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. Results There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image meta-data across the network. Conclusions As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. PMID:22770688

  5. Blending Qualitative & Quantitative Research Methods in Theses and Dissertations.

    ERIC Educational Resources Information Center

    Thomas, R. Murray

    This guide discusses combining qualitative and quantitative research methods in theses and dissertations. It covers a wide array of methods, the strengths and limitations of each, and how they can be effectively interwoven into various research designs. The first chapter is "The Qualitative and the Quantitative." Part 1, "A Catalogue of…

  6. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    PubMed Central

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the image quantitative analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the image preprocessing of the original microtubule image. And then the Intrinsic Mode Function 1 (IMF1) image obtained by decomposition was selected to do the texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. Meanwhile, in order to further verify its reliability, the proposed texture analysis method was utilized to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges effectively while reducing noise, and the geometrical characteristic of the texture was obvious. Four texture parameters extracted by GLCM perfectly reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the image quantitative analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies references for other similar studies. PMID:24744684
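    A GLCM tallies how often grey-level pairs co-occur at a fixed pixel offset; texture parameters such as contrast and energy are then sums over the normalized matrix. A minimal sketch on a hand-checkable toy image (the two features shown are standard GLCM parameters; the paper's exact four-parameter set may differ):

```python
def glcm(image, levels, offset=(0, 1)):
    """Normalized grey-level co-occurrence matrix for one pixel offset
    (default: each pixel paired with its right-hand neighbour)."""
    dr, dc = offset
    counts = [[0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    total = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                counts[image[r][c]][image[r2][c2]] += 1
                total += 1
    return [[v / total for v in row] for row in counts]

def contrast(P):
    """Sum of p(i,j)*(i-j)^2: high for locally varying texture."""
    return sum(P[i][j] * (i - j) ** 2 for i in range(len(P)) for j in range(len(P)))

def energy(P):
    """Sum of p(i,j)^2: high for uniform, orderly texture."""
    return sum(v * v for row in P for v in row)

img = [[0, 0],
       [0, 1]]  # toy 2-level image: pairs (0,0) and (0,1), once each
P = glcm(img, levels=2)
print(contrast(P), energy(P))  # 0.5 0.5
```

    On real microtubule images the matrix is computed over 8-bit (or re-quantized) grey levels and usually averaged over several offsets and directions so the texture parameters are orientation-robust.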

  7. A quantitative method for measuring the quality of history matches

    SciTech Connect

    Shaw, T.S.; Knapp, R.M.

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for that purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure the magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors ensures that model reservoir parameters have been calibrated to those of the real reservoir. Measuring the magnitude of deviation ensures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. They also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
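
The three-index scheme above can be illustrated with common statistics: a shape index (Pearson correlation between simulated and historical curves), a bias index (mean signed error), and a deviation index (RMSE). These are illustrative stand-ins, since the abstract does not give the exact index definitions used by Shaw and Knapp; the production data are made up:

```python
import math

def match_quality(simulated, observed):
    """Return (shape, bias, rmse) for a simulated vs. historical curve."""
    n = len(observed)
    ms = sum(simulated) / n
    mo = sum(observed) / n
    cov = sum((s - ms) * (o - mo) for s, o in zip(simulated, observed))
    ss = math.sqrt(sum((s - ms) ** 2 for s in simulated))
    so = math.sqrt(sum((o - mo) ** 2 for o in observed))
    shape = cov / (ss * so)   # 1.0 means identical curve shape
    bias = ms - mo            # 0.0 means no systematic offset
    rmse = math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed)) / n)
    return shape, bias, rmse

hist = [10.0, 12.0, 15.0, 14.0]   # historical production, illustrative units
sim = [10.5, 12.4, 15.6, 14.3]    # simulated response from one matching run
shape, bias, rmse = match_quality(sim, hist)
# shape ≈ 0.998 (good conformity), bias = 0.45 (systematic over-prediction), rmse ≈ 0.46
```

Ranking a series of runs by such indices gives the quantitative "good" vs. "better" comparison the abstract calls for.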

  8. Quantitative methods for ecological network analysis.

    PubMed

    Ulanowicz, Robert E

    2004-12-01

    The analysis of networks of ecological trophic transfers is a useful complement to simulation modeling in the quest for understanding whole-ecosystem dynamics. Trophic networks can be studied in quantitative and systematic fashion at several levels. Indirect relationships between any two individual taxa in an ecosystem, which often differ in either nature or magnitude from their direct influences, can be assayed using techniques from linear algebra. The same mathematics can also be employed to ascertain where along the trophic continuum any individual taxon is operating, or to map the web of connections into a virtual linear chain that summarizes trophodynamic performance by the system. Backtracking algorithms with pruning have been written which identify pathways for the recycle of materials and energy within the system. The pattern of such cycling often reveals modes of control or types of functions exhibited by various groups of taxa. The performance of the system as a whole at processing material and energy can be quantified using information theory. In particular, the complexity of process interactions can be parsed into separate terms that distinguish organized, efficient performance from the capacity for further development and recovery from disturbance. Finally, the sensitivities of the information-theoretic system indices appear to identify the dynamical bottlenecks in ecosystem functioning. PMID:15556474
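
The linear-algebra machinery mentioned above can be illustrated for one of the listed tasks: locating a taxon along the trophic continuum. A minimal sketch assuming a made-up three-compartment food web, with trophic level defined as one plus the diet-weighted mean level of the taxon's prey, solved here by fixed-point iteration rather than explicit matrix inversion:

```python
def trophic_levels(diet, iters=200):
    """diet[i][j] = fraction of taxon i's intake coming from taxon j."""
    n = len(diet)
    tl = [1.0] * n
    for _ in range(iters):  # iterate TL_i = 1 + sum_j diet[i][j] * TL_j
        tl = [1.0 + sum(diet[i][j] * tl[j] for j in range(n)) for i in range(n)]
    return tl

diet = [
    [0.0, 0.0, 0.0],   # plant: no prey (primary producer)
    [1.0, 0.0, 0.0],   # herbivore: eats plant only
    [0.2, 0.8, 0.0],   # omnivorous carnivore: 20% plant, 80% herbivore
]
print(trophic_levels(diet))  # plant 1, herbivore 2, carnivore 2.8
```

The fractional level of the omnivore (2.8) is exactly the kind of "where along the trophic continuum a taxon is operating" quantity the abstract describes.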

  9. Quantitative Hydrocarbon Energies from the PMO Method.

    ERIC Educational Resources Information Center

    Cooper, Charles F.

    1979-01-01

    Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)

  10. Research radiometric calibration quantitative transfer methods between internal and external

    NASA Astrophysics Data System (ADS)

    Guo, Ju Guang; Ma, Yong hui; Zhang, Guang; Yang, Zhi hui

    2015-10-01

    This paper puts forward a method for transferring radiometric calibration between internal and external references in a system for quantitative measurement of infrared radiation characteristics, and establishes a theoretical model of the corresponding radiative transfer. The method lends itself well to engineering application: a relatively simple and effective half-optical-path radiometric calibration replaces the complex and difficult whole-optical-path calibration. At the same time, it provides an effective basis of support for further quantitative measurement of target radiation characteristics with ground-based quantitative infrared measurement systems.

  11. Review of Quantitative Software Reliability Methods

    SciTech Connect

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of digital systems

  12. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular. PMID:23650936

  13. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, F.A.

    1980-12-12

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  14. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, Frank A.

    1982-01-01

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  15. [Quantitative analysis of alloy steel based on laser induced breakdown spectroscopy with partial least squares method].

    PubMed

    Cong, Zhi-Bo; Sun, Lan-Xiang; Xin, Yong; Li, Yang; Qi, Li-Feng; Yang, Zhi-Jia

    2014-02-01

    In the present paper, both the partial least squares (PLS) method and the calibration curve (CC) method are used to quantitatively analyze laser induced breakdown spectroscopy data obtained from standard alloy steel samples. Both major and trace elements were quantitatively analyzed. Comparing the results of the two calibration methods yielded some useful conclusions: for major elements, the PLS method is better than the CC method in quantitative analysis; more importantly, for trace elements, the CC method cannot give quantitative results because of the extremely weak characteristic spectral lines, but the PLS method still has a good ability for quantitative analysis. The regression coefficients of the PLS method are compared with the original spectral data, including the background interference, to explain the advantage of the PLS method in LIBS quantitative analysis. The results proved that the PLS method applied to laser induced breakdown spectroscopy is suitable for quantitative analysis of trace elements such as C in the metallurgical industry. PMID:24822436
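
The PLS calibration contrasted with the univariate calibration curve above can be sketched with a one-component PLS1 fit (NIPALS-style) in pure Python. The two-channel "spectra" and concentrations are made-up toy numbers, not LIBS data, chosen so a single latent component explains the response exactly:

```python
def pls1_fit(X, y):
    """One-component PLS1: returns (column means, y mean, weights, y-loading)."""
    n, p = len(X), len(X[0])
    xmean = [sum(col) / n for col in zip(*X)]
    ymean = sum(y) / n
    Xc = [[x - m for x, m in zip(row, xmean)] for row in X]   # center X
    yc = [v - ymean for v in y]                               # center y
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]  # weight vector
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]   # scores
    q = sum(yc[i] * t[i] for i in range(n)) / sum(v * v for v in t)  # y-loading
    return xmean, ymean, w, q

def pls1_predict(model, x):
    xmean, ymean, w, q = model
    t = sum((xi - m) * wi for xi, m, wi in zip(x, xmean, w))
    return ymean + t * q

X = [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0], [4.0, 8.0]]  # two "spectral" channels
y = [3.0, 6.0, 9.0, 12.0]                              # known concentrations
model = pls1_fit(X, y)
print(round(pls1_predict(model, [5.0, 10.0]), 6))  # → 15.0
```

Because PLS projects all channels onto a latent direction weighted by their covariance with concentration, weak lines still contribute, which is the advantage over a single-line calibration curve that the abstract reports for trace elements.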

  16. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve as an initial assessment of dermal exposure, amongst others, resulting in a ranking of tasks and subsequently jobs. DREAM consists of an inventory and evaluation part. Two examples of dermal exposure of workers of a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outside clothing layer as well as on skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine who, where and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and determine most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative, dermal exposure assessment. PMID:12505908

  17. Integrating Qualitative and Quantitative Evaluation Methods in Substance Abuse Research.

    ERIC Educational Resources Information Center

    Dennis, Michael L.; And Others

    1994-01-01

    Some specific opportunities and techniques are described for combining and integrating qualitative and quantitative methods from the design stage of a substance abuse program evaluation through implementation and reporting. The multiple problems and requirements of such an evaluation make integrated methods essential. (SLD)

  18. A Quantitative Vainberg Method for Black Box Scattering

    NASA Astrophysics Data System (ADS)

    Galkowski, Jeffrey

    2016-05-01

    We give a quantitative version of Vainberg's method relating pole free regions to propagation of singularities for black box scatterers. In particular, we show that there is a logarithmic resonance free region near the real axis of size τ with polynomial bounds on the resolvent if and only if the wave propagator gains derivatives at rate τ. Next we show that if there exist singularities in the wave trace at times tending to infinity which smooth at rate τ, then there are resonances in logarithmic strips whose width is given by τ. As our main application of these results, we give sharp bounds on the size of resonance free regions in scattering on geometrically nontrapping manifolds with conic points. Moreover, these bounds are generically optimal on exteriors of nontrapping polygonal domains.

  19. Applying Quantitative Genetic Methods to Primate Social Behavior

    PubMed Central

    Brent, Lauren J. N.

    2013-01-01

    Increasingly, behavioral ecologists have applied quantitative genetic methods to investigate the evolution of behaviors in wild animal populations. The promise of quantitative genetics in unmanaged populations opens the door for simultaneous analysis of inheritance, phenotypic plasticity, and patterns of selection on behavioral phenotypes all within the same study. In this article, we describe how quantitative genetic techniques provide studies of the evolution of behavior with information that is unique and valuable. We outline technical obstacles for applying quantitative genetic techniques that are of particular relevance to studies of behavior in primates, especially those living in noncaptive populations (e.g., the need for pedigree information and non-Gaussian phenotypes), and demonstrate how many of these barriers are now surmountable. We illustrate this by applying recent quantitative genetic methods to spatial proximity data, a simple and widely collected primate social behavior, from adult rhesus macaques on Cayo Santiago. Our analysis shows that proximity measures are consistent across repeated measurements on individuals (repeatable) and that kin have similar mean measurements (heritable). Quantitative genetics may hold lessons of considerable importance for studies of primate behavior, even those without a specific genetic focus. PMID:24659839

  20. A Quantitative Method for Weight Selection in SGDDP.

    PubMed

    Huang, Qin; Chen, Gang; Yuan, Zhilong; Zhang, Ying; Wenrich, Judy

    2015-01-01

    Ethnic factors pose a major challenge to evaluating the treatment effect of a new drug in a targeted ethnic (TE) population in emerging regions based on the results from a multiregional clinical trial (MRCT). To address this issue with statistical rigor, Huang et al. (2012) proposed a new design of a simultaneous global drug development program (SGDDP) which used weighted Z tests to combine the information collected from the nontargeted ethnic (NTE) group in the MRCT with that from the TE group in both the MRCT and a simultaneously designed local clinical trial (LCT). An important and open question in the SGDDP design was how to downweight the information collected from the NTE population to reflect the potential impact of ethnic factors and ensure that the effect size for TE patients is clinically meaningful. In this paper, we relate the weight selection for the SGDDP to Method 1 proposed in the Japanese regulatory guidance published by the Ministry of Health, Labour and Welfare (MHLW) in 2007. Method 1 is applicable only when the true effect sizes are assumed to be equal for the TE and NTE groups. We modified the Method 1 formula for more general scenarios, and used it to develop a quantitative method of weight selection for the design of the SGDDP which, at the same time, also provides sufficient power to descriptively check the consistency of the effect size for TE patients against a clinically meaningful magnitude. PMID:25365548
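
One common form of the weighted Z test referred to above combines the two group statistics as Z = w·Z_TE + sqrt(1 − w²)·Z_NTE, which keeps the null distribution standard normal for any weight w in [0, 1]; choosing w is exactly the weight-selection problem the paper addresses. A minimal sketch with illustrative statistics and weight (not the paper's Method 1 formula):

```python
import math

def weighted_z(z_te, z_nte, w):
    """Combine targeted- and nontargeted-ethnic Z statistics with weight w.

    The coefficients (w, sqrt(1 - w^2)) have squared sum 1, so the combined
    statistic is standard normal under the null hypothesis.
    """
    return w * z_te + math.sqrt(1.0 - w * w) * z_nte

# Illustrative values: moderate TE evidence, stronger NTE evidence, w = 0.6.
z = weighted_z(z_te=1.8, z_nte=2.2, w=0.6)
print(round(z, 3))  # → 2.84
```

A larger w leans more on the targeted-ethnic data; w = 1 ignores the NTE group entirely, which is the downweighting trade-off discussed in the abstract.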

  1. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high-resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as in patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and the commercial software, and the results were used to obtain corrected images. Corrected images based on the ML approach and the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality, and assurance of performance during clinical use of endoscopic technologies.
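
The local magnification (ML) idea above compares the imaged spacing of neighbouring grid points with their true spacing; a ratio below 1 at the edge of the field indicates barrel (fisheye) compression. A sketch under an assumed toy radial-distortion model; the distortion coefficient, coordinates, and step size are synthetic, not gastroscope data:

```python
import math

def barrel(x, y, k=-0.05):
    """Toy radial distortion model: r' = r * (1 + k * r^2), k < 0 for barrel."""
    r2 = x * x + y * y
    s = 1.0 + k * r2
    return x * s, y * s

def local_magnification(x, y, step=0.01):
    """ML = imaged distance between two nearby grid points / their true distance."""
    x1, y1 = barrel(x, y)
    x2, y2 = barrel(x + step, y)
    return math.hypot(x2 - x1, y2 - y1) / step

center = local_magnification(0.0, 0.0)
edge = local_magnification(1.5, 0.0)
print(round(center, 3), round(edge, 3))  # → 1.0 0.66
```

Mapping ML across the whole grid target yields the distortion pattern that the custom algorithm in the study characterizes and then inverts to correct images.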

  2. A quantitative dimming method for LED based on PWM

    NASA Astrophysics Data System (ADS)

    Wang, Jiyong; Mou, Tongsheng; Wang, Jianping; Tian, Xiaoqing

    2012-10-01

    Traditional light sources were required to provide stable and uniform illumination for a living or working environment, considering the visual function of human beings. That requirement was reasonable until the non-visual functions of the ganglion cells in the photosensitive layer of the retina were discovered. A new generation of lighting technology, however, is emerging, based on novel lighting materials such as LEDs and on the photobiological effects of light on human physiology and behavior. To realize dynamic LED lighting whose intensity and color are adjustable to the needs of photobiological effects, a quantitative dimming method based on Pulse Width Modulation (PWM) and light-mixing technology is presented. Beginning with two-channel PWM, this paper demonstrates the determinacy and limitation of PWM dimming for realizing Expected Photometric and Colorimetric Quantities (EPCQ), based on an analysis of the geometrical, photometric, colorimetric and electrodynamic constraints. A quantitative model which maps the EPCQ into duty cycles is finally established. The deduced model suggests that this determinacy holds only for two-channel and three-channel PWM, whereas the limitation is unavoidable for larger numbers of channels. To examine the model, a light-mixing experiment with two kinds of white LED simulated the variation of illuminance and Correlated Color Temperature (CCT) from dawn to midday. Mean deviations between theoretical and measured values were 15 lx and 23 K, respectively. The results show that this method can effectively realize a light spectrum meeting a specific EPCQ requirement, and provides a theoretical basis and a practical way for dynamic LED lighting.
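
For two channels, the EPCQ-to-duty-cycle mapping described above is a 2×2 linear system: each channel contributes its full-on quantities scaled by its duty cycle, and the "limitation" corresponds to solutions falling outside [0, 1]. A sketch with made-up channel outputs; note the flux-weighted CCT used here is a rough illustrative proxy, since CCT does not mix linearly in reality:

```python
def duty_cycles(ch1, ch2, target):
    """Solve d1*ch1 + d2*ch2 = target for duty cycles (d1, d2) by Cramer's rule."""
    a, c = ch1   # (quantity 1, quantity 2) at 100% duty for channel 1
    b, d = ch2
    det = a * d - b * c
    d1 = (target[0] * d - b * target[1]) / det
    d2 = (a * target[1] - target[0] * c) / det
    return d1, d2

# Illustrative full-on outputs: (illuminance in lx, flux-weighted CCT proxy).
warm = (500.0, 2700.0 * 500.0)   # warm-white channel
cool = (600.0, 6500.0 * 600.0)   # cool-white channel
target = (400.0, 4000.0 * 400.0) # desired 400 lx at ~4000 K
d1, d2 = duty_cycles(warm, cool, target)
print(round(d1, 3), round(d2, 3))  # → 0.526 0.228
```

If either solved duty cycle leaves [0, 1], the requested EPCQ is outside the gamut of the two channels, which mirrors the model's stated limitation.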

  3. Trojan Horse Method: Recent Results

    SciTech Connect

    Pizzone, R. G.; Spitaleri, C.

    2008-01-24

    Owing to the presence of the Coulomb barrier at astrophysically relevant kinetic energies, it is very difficult, or sometimes impossible, to measure astrophysical reaction rates in the laboratory. This is why different indirect techniques are used along with direct measurements. The THM is a unique indirect technique allowing one to measure astrophysical rearrangement reactions down to astrophysically relevant energies. The basic principle and a review of the main applications of the Trojan Horse Method are presented. The applications aiming at the extraction of the bare S_b(E) astrophysical factor and electron screening potentials U_e for several two-body processes are discussed.

  4. Quantitative methods for somatosensory evaluation in atypical odontalgia.

    PubMed

    Porporatti, André Luís; Costa, Yuri Martins; Stuginski-Barbosa, Juliana; Bonjardim, Leonardo Rigoldi; Conti, Paulo César Rodrigues; Svensson, Peter

    2015-01-01

    A systematic review was conducted to identify reliable somatosensory evaluation methods for atypical odontalgia (AO) patients. The computerized search included the main databases (MEDLINE, EMBASE, and Cochrane Library). The studies included used the following quantitative sensory testing (QST) methods: mechanical detection threshold (MDT), mechanical pain threshold (MPT) (pinprick), pressure pain threshold (PPT), dynamic mechanical allodynia with a cotton swab (DMA1) or a brush (DMA2), warm detection threshold (WDT), cold detection threshold (CDT), heat pain threshold (HPT), cold pain threshold (CPT), and/or wind-up ratio (WUR). The publications meeting the inclusion criteria revealed that only mechanical allodynia tests (DMA1, DMA2, and WUR) were significantly higher and pain threshold tests to heat stimulation (HPT) were significantly lower in the affected side, compared with the contralateral side, in AO patients; however, for MDT, MPT, PPT, CDT, and WDT, the results were not significant. These data support the presence of central sensitization features, such as allodynia and temporal summation. In contrast, considerable inconsistencies between studies were found when AO patients were compared with healthy subjects. In clinical settings, the most reliable evaluation method for AO in patients with persistent idiopathic facial pain would be intraindividual assessments using HPT or mechanical allodynia tests. PMID:25627886

  5. A Quantitative Assessment Method for Ascaris Eggs on Hands

    PubMed Central

    Jeandron, Aurelie; Ensink, Jeroen H. J.; Thamsborg, Stig M.; Dalsgaard, Anders; Sengupta, Mita E.

    2014-01-01

    The importance of hands in the transmission of soil transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine recovery rates of eggs for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt] and non-ionic [Tween80 0.1% - polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed that validate the hand washing method under field conditions, e.g. including people of different age, lower levels of contamination and various levels of hand cleanliness. PMID:24802859
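
The recovery rate reported above is a simple percentage of seeded eggs that are recovered, with a McMaster-style count scaled by its multiplication (dilution/chamber) factor. The numbers below are illustrative, chosen only to reproduce a 95.6% figure, not taken from the study's raw data:

```python
def recovery_rate(eggs_counted, multiplication_factor, eggs_seeded):
    """Percentage of seeded eggs recovered, given a scaled chamber count."""
    recovered = eggs_counted * multiplication_factor
    return 100.0 * recovered / eggs_seeded

print(recovery_rate(eggs_counted=239, multiplication_factor=2, eggs_seeded=500))  # → 95.6
```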

  6. Simple laboratory methods for quantitative IR measurements of CW agents

    NASA Astrophysics Data System (ADS)

    Puckrin, Eldon; Thériault, Jean-Marc; Lavoie, Hugo; Dubé, Denis; Lepage, Carmela J.; Petryk, Michael

    2005-11-01

    A simple method is presented for quantitatively measuring the absorbance of chemical warfare (CW) agents and their simulants in the vapour phase. The technique is based on a standard lab-bench FTIR spectrometer, 10-cm gas cell, a high accuracy Baratron pressure manometer, vacuum pump and simple stainless-steel hardware components. The results of this measurement technique are demonstrated for sarin (GB) and soman (GD). A second technique is also introduced for the passive IR detection of CW agents in an open- air path located in a fumehood. Using a modified open-cell with a pathlength of 45 cm, open-air passive infrared measurements have been obtained for simulants and several classical CW agents. Detection, identification and quantification results based on passive infrared measurements are presented for GB and the CW agent simulant, DMMP, using the CATSI sensor which has been developed by DRDC Valcartier. The open-cell technique represents a relatively simple and feasible method for examining the detection capability of passive sensors, such as CATSI, for CW agents.

  7. Employing quantitative and qualitative methods in one study.

    PubMed

    Mason, S A

    There is an apparent lack of epistemological rigour when quantitative and qualitative methods are combined in the same study, because they reflect opposing positivist and interpretive perspectives. When and how to use methodological pluralism is discussed in this article. PMID:8400784

  8. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  9. Quantitative methods for analyzing cell-cell adhesion in development.

    PubMed

    Kashef, Jubin; Franz, Clemens M

    2015-05-01

    During development cell-cell adhesion is not only crucial to maintain tissue morphogenesis and homeostasis, it also activates signalling pathways important for the regulation of different cellular processes including cell survival, gene expression, collective cell migration and differentiation. Importantly, gene mutations of adhesion receptors can cause developmental disorders and different diseases. Quantitative methods to measure cell adhesion are therefore necessary to understand how cells regulate cell-cell adhesion during development and how aberrations in cell-cell adhesion contribute to disease. Different in vitro adhesion assays have been developed in the past, but not all of them are suitable to study developmentally-related cell-cell adhesion processes, which usually requires working with low numbers of primary cells. In this review, we provide an overview of different in vitro techniques to study cell-cell adhesion during development, including a semi-quantitative cell flipping assay, and quantitative single-cell methods based on atomic force microscopy (AFM)-based single-cell force spectroscopy (SCFS) or dual micropipette aspiration (DPA). Furthermore, we review applications of Förster resonance energy transfer (FRET)-based molecular tension sensors to visualize intracellular mechanical forces acting on cell adhesion sites. Finally, we describe a recently introduced method to quantitate cell-generated forces directly in living tissues based on the deformation of oil microdroplets functionalized with adhesion receptor ligands. Together, these techniques provide a comprehensive toolbox to characterize different cell-cell adhesion phenomena during development. PMID:25448695

  10. Spy quantitative inspection with a machine vision light sectioning method

    NASA Astrophysics Data System (ADS)

    Tu, Da-Wei; Lin, Cai-Xing

    2000-08-01

    Machine vision light sectioning sensing is developed and expanded to the range of spy quantitative inspection for hole-like work pieces in this paper. A light beam from a semiconductor laser diode is converged into a line-shape by a cylindrical lens. A special compact reflecting-refracting prism group is designed to ensure that such a sectioning light is projected axially onto the inner surface, and to make the deformed line be imaged onto a CCD sensitive area. The image is digitized and captured into a computer by a 512×512 pixel card, and machine vision image processing methods such as thresholding, line centre detect and the least-squares method are developed for contour feature extraction and description. Two other important problems in such an inspection system are how to orientate the deep-going optical probe and how to bring the projected line into focus. A focusing criterion based on image position deviation and a four-step orientating procedure are put forward, and analysed to be feasible respectively. The experimental results show that the principle is correct and the techniques are realizable, and a good future for application in industry is possible.
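
The "line centre detect" step in the processing chain above can be sketched as an intensity-weighted centroid computed per column of a thresholded stripe image; subpixel centres along the deformed line are what the least-squares contour fit then consumes. The tiny image and threshold are synthetic, not light-sectioning data:

```python
def line_centres(image, threshold):
    """Per-column intensity-weighted centroid of pixels at/above threshold."""
    centres = []
    for col in zip(*image):  # iterate over image columns
        bright = [(i, v) for i, v in enumerate(col) if v >= threshold]
        if bright:
            total = sum(v for _, v in bright)
            centres.append(sum(i * v for i, v in bright) / total)
        else:
            centres.append(None)  # no stripe detected in this column
    return centres

# Synthetic 5x3 image with a bright stripe drifting downward left to right.
img = [
    [0, 0, 0],
    [10, 0, 0],
    [200, 30, 0],
    [10, 200, 220],
    [0, 30, 0],
]
print(line_centres(img, threshold=30))  # → [2.0, 3.0, 3.0]
```

The weighted centroid gives subpixel precision wherever the stripe spans several pixels, which is why centre detection precedes the least-squares fit in the pipeline described above.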

  11. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) PMID:26556680

  12. Method for depth-resolved quantitation of optical properties in layered media using spatially modulated quantitative spectroscopy

    PubMed Central

    Saager, Rolf B.; Truong, Alex; Cuccia, David J.; Durkin, Anthony J.

    2011-01-01

    We have demonstrated that spatially modulated quantitative spectroscopy (SMoQS) is capable of extracting absolute optical properties from homogeneous tissue-simulating phantoms that span both the visible and near-infrared wavelength regimes. However, biological tissue, such as skin, is highly structured, presenting challenges to quantitative spectroscopic techniques based on homogeneous models. In order to more accurately address the challenges associated with skin, we present a method for depth-resolved optical property quantitation based on a two-layer model. Layered Monte Carlo simulations and layered tissue-simulating phantoms are used to determine the efficacy and accuracy of SMoQS in quantifying layer-specific optical properties of layered media. Initial results from both simulation and experiment show that this empirical method is capable of determining top-layer thickness to within tens of microns across a physiological range for skin. Layer-specific chromophore concentration can be determined to within ±10% of the actual values, on average, whereas bulk quantitation in either the visible or near-infrared spectroscopic regime significantly underestimates the layer-specific chromophore concentration and can be confounded by top-layer thickness. PMID:21806282

  13. Quantitative method of measuring cancer cell urokinase and metastatic potential

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  14. Gap analysis: Concepts, methods, and recent results

    USGS Publications Warehouse

    Jennings, M.D.

    2000-01-01

    Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  15. Quantitative imaging of volcanic plumes - Results, needs, and future trends

    NASA Astrophysics Data System (ADS)

    Platt, Ulrich; Lübcke, Peter; Kuhn, Jonas; Bobrowski, Nicole; Prata, Fred; Burton, Mike; Kern, Christoph

    2015-07-01

    Recent technology allows two-dimensional "imaging" of trace gas distributions in plumes. In contrast to older, one-dimensional remote sensing techniques that are only capable of measuring total column densities, the new imaging methods give insight into details of transport and mixing processes as well as chemical transformation within plumes. We give an overview of gas imaging techniques already being applied at volcanoes (SO2 cameras, imaging DOAS, FT-IR imaging), present techniques for which first field experiments have been conducted (LED-LIDAR, tomographic mapping), and describe some techniques for which only theoretical studies with application to volcanology exist (e.g. Fabry-Pérot imaging, gas correlation spectroscopy, bi-static LIDAR). Finally, we discuss current needs and future trends in imaging technology.

  16. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, for quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of disease, the development of efficient analysis tools for aberrant glycoproteins is very important for understanding the pathological function of the glycoprotein and for new biomarker development. This review first describes glycosylation-targeting enrichment technologies, mainly solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable-isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized together with recently published studies. PMID:24889823

  17. Quantitative results of stellar evolution and pulsation theories.

    NASA Technical Reports Server (NTRS)

    Fricke, K.; Stobie, R. S.; Strittmatter, P. A.

    1971-01-01

    The discrepancy between the masses of Cepheid variables deduced from evolution theory and pulsation theory is examined. The effect of input physics on evolutionary tracks is first discussed; in particular, changes in the opacity are considered. The sensitivity of pulsation masses to opacity changes and to the ascribed values of luminosity and effective temperature is then analyzed. The Cepheid mass discrepancy is discussed in the light of the results already obtained. Other astronomical evidence, including the mass-luminosity relation for main sequence stars, the solar neutrino flux, and cluster ages, is also considered in an attempt to determine the most likely source of error in the event that substantial mass loss has not occurred.

  18. Analytical methods for quantitation of prenylated flavonoids from hops

    PubMed Central

    Nikolić, Dejan; van Breemen, Richard B.

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach. PMID:24077106

  19. Analytical methods for quantitation of prenylated flavonoids from hops.

    PubMed

    Nikolić, Dejan; van Breemen, Richard B

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach. PMID:24077106

  20. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    NASA Astrophysics Data System (ADS)

    Michalska, J.; Chmiela, B.

    2014-03-01

    The purpose of the research was to carry out qualitative and quantitative analysis of the phases in DSS in the as-received state and after thermal aging. SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Quantitative analysis of the phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods are pointed out, and the accuracy of the phase analysis performed by the two methods is compared.

  1. A method for quantitative wet chemical analysis of urinary calculi.

    PubMed

    Larsson, L; Sörbo, B; Tiselius, H G; Ohman, S

    1984-06-27

    We describe a simple method for quantitative chemical analysis of urinary calculi requiring no specialized equipment. Pulverized calculi are dried over silica gel at room temperature and dissolved in nitric acid, which was the only effective agent for complete dissolution. Calcium, magnesium, ammonium, and phosphate are then determined by conventional methods. Oxalate is determined by a method based on the quenching action of oxalate on the fluorescence of a zirconium-flavonol complex. Uric acid, when treated with nitric acid, is stoichiometrically converted to alloxan, which is determined fluorimetrically with 1,2-phenylenediamine. Similarly, cystine is oxidized by nitric acid to sulfate, which is determined turbidimetrically as barium sulfate. Protein is determined spectrophotometrically as xanthoprotein. The total mass recovery of authentic calculi was 92.2 +/- 6.7 (SD) per cent. The method permits analysis of calculi as small as 1.0 mg. Internal quality control is performed with specially designed control samples. PMID:6086179

  2. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  3. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  4. Thermography as a quantitative imaging method for assessing postoperative inflammation

    PubMed Central

    Christensen, J; Matzen, LH; Vaeth, M; Schou, S; Wenzel, A

    2012-01-01

    Objective To assess differences in skin temperature between the operated and control sides of the face after mandibular third molar surgery using thermography. Methods 127 patients had 1 mandibular third molar removed. Before the surgery, standardized thermograms were taken of both sides of the patient's face using a Flir ThermaCam™ E320 (Precisions Teknik AB, Halmstad, Sweden). The imaging procedure was repeated 2 days and 7 days after surgery. A region of interest including the third molar region was marked on each image, and the mean temperature within each region of interest was calculated. Differences between sides and over time were assessed using paired t-tests. Results No significant difference was found between the operated side and the control side either before or 7 days after surgery (p > 0.3). The temperature of the operated side (mean: 32.39 °C, range: 28.9–35.3 °C) was higher than that of the control side (mean: 32.06 °C, range: 28.5–35.0 °C) 2 days after surgery [0.33 °C, 95% confidence interval (CI): 0.22–0.44 °C, p < 0.001]. No significant difference was found between the pre-operative and the 7-day post-operative temperature (p > 0.1). After 2 days, the temperature of the operated side was not significantly different from the pre-operative temperature (p = 0.12), whereas the control side had a lower temperature (0.57 °C, 95% CI: 0.29–0.86 °C, p < 0.001). Conclusions Thermography seems useful for quantitative assessment of inflammation between the intervention side and the control side after surgical removal of mandibular third molars. However, thermography cannot be used to assess absolute temperature changes due to normal variations in skin temperature over time. PMID:22752326
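
    The side-to-side comparison above rests on a paired t statistic and its confidence interval, which can be sketched as follows. The temperatures are made up and the 95% critical value is assumed for roughly 126 degrees of freedom (scipy.stats would normally supply it):

```python
import math
from statistics import mean, stdev

def paired_t(side_a, side_b, t_crit=1.98):
    """Paired t statistic and confidence interval for the mean
    pairwise difference. t_crit is the assumed two-sided 95%
    critical value for ~126 degrees of freedom."""
    diffs = [a - b for a, b in zip(side_a, side_b)]
    n = len(diffs)
    d_bar = mean(diffs)
    se = stdev(diffs) / math.sqrt(n)   # standard error of the mean diff
    t = d_bar / se
    return t, (d_bar - t_crit * se, d_bar + t_crit * se)

# Illustrative mean ROI temperatures (deg C), operated vs control side.
op = [32.5, 32.7, 32.4, 32.8]
ctrl = [32.1, 32.3, 32.1, 32.4]
t_stat, (ci_lo, ci_hi) = paired_t(op, ctrl)
```

    A CI that excludes zero, as in the 2-day result above, corresponds to a significant side-to-side difference.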

  5. [Method of quantitative determination of staphylococcal hyaluronidase activity].

    PubMed

    Generalov, I I

    1998-03-01

    The proposed method for measuring the hyaluronidase activity of microorganisms is based on the prevention, by hyaluronidase, of hyaluronic acid clot formation with rivanol. This makes possible quantitative and qualitative detection of the hyaluronidase activities of different staphylococcal species and strains. The maximum enzyme level and the highest detection rate were typical of S. aureus. Strains producing hyaluronidase in quantities of at least 0.5 IU are significantly (p < 0.01) more often isolated from medical staff. PMID:9575732

  6. A comparison of ancestral state reconstruction methods for quantitative characters.

    PubMed

    Royer-Carenzi, Manuela; Didier, Gilles

    2016-09-01

    Choosing an ancestral state reconstruction method among the alternatives available for quantitative characters may be puzzling. We present here a comparison of seven of them, namely the maximum likelihood, restricted maximum likelihood, generalized least squares under Brownian, Brownian-with-trend and Ornstein-Uhlenbeck models, phylogenetic independent contrasts and squared parsimony methods. A review of the relations between these methods shows that the maximum likelihood, the restricted maximum likelihood and the generalized least squares under Brownian model infer the same ancestral states and can only be distinguished by the distributions accounting for the reconstruction uncertainty which they provide. The respective accuracy of the methods is assessed over character evolution simulated under a Brownian motion with (and without) directional or stabilizing selection. We give the general form of ancestral state distributions conditioned on leaf states under the simulation models. Ancestral distributions are used first, to give a theoretical lower bound of the expected reconstruction error, and second, to develop an original evaluation scheme which is more efficient than comparing the reconstructed and the simulated states. Our simulations show that: (i) the distributions of the reconstruction uncertainty provided by the methods generally make sense (some more than others); (ii) it is essential to detect the presence of an evolutionary trend and to choose a reconstruction method accordingly; (iii) all the methods show good performances on characters under stabilizing selection; (iv) without trend or stabilizing selection, the maximum likelihood method is generally the most accurate. PMID:27234644

  7. Biological characteristics of crucian by quantitative inspection method

    NASA Astrophysics Data System (ADS)

    Chu, Mengqi

    2015-04-01

    Through a quantitative inspection method, the biological characteristics of crucian carp were preliminarily researched. Crucian carp (Carassius auratus), of the order Cypriniformes, family Cyprinidae, is a mainly plant-eating omnivorous fish that is gregarious and exhibits selection and ranking behaviour. Crucian carp are widely distributed, and perennial waters all over the country produce them. Indicators determined in the experiment were used to understand the growth and reproduction of crucian carp in this area. The measured data (such as scale length, scale size and annulus diameter) and related functions were used to calculate the growth of crucian carp in any one year. Egg shape, colour and weight were used to determine maturity, and the mean egg diameter per 20 eggs and the number of eggs per 0.5 g were used to calculate the relative and absolute fecundity of the fish. The measured crucian carp were females at puberty. From the relation between scale diameter and body length, a linear relationship was obtained: y = 1.530 + 3.0649x. The data show that fecundity is closely related to age: the older the fish, the more mature the gonad development and the greater the number of eggs; in addition, absolute fecundity increases with pituitary activity. Quantitative checks of the bait organisms ingested by crucian carp reveal their main foods, secondary foods and chance foods, and the degree to which crucian carp prefer various bait organisms. Fish fecundity increases with weight gain; it is characteristic of species and populations, and is at the same time influenced by individual age, body length, body weight, environmental conditions (especially nutritional conditions), breeding habits, spawning times and egg size. This series of studies of the biological characteristics of crucian carp provides an ecological basis for local crucian carp feeding and breeding.
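
    The fecundity estimate described (an egg count in a weighed 0.5 g subsample scaled up to the whole fish) can be sketched generically. Both the scaling protocol and the numbers below are assumptions for illustration, not the paper's data:

```python
def fecundity(eggs_per_half_gram, ovary_weight_g, body_weight_g):
    """Gravimetric fecundity estimate: scale the egg count of a
    0.5 g ovary subsample up to the whole ovary (absolute fecundity),
    then normalise by body weight (relative fecundity)."""
    absolute = eggs_per_half_gram * (ovary_weight_g / 0.5)
    relative = absolute / body_weight_g  # eggs per gram of body weight
    return absolute, relative

# Hypothetical fish: 150 eggs in a 0.5 g subsample, 12 g ovary, 80 g body.
abs_f, rel_f = fecundity(150, 12.0, 80.0)
# abs_f = 3600.0 eggs, rel_f = 45.0 eggs/g
```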

  8. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
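
    For a scalar output such as piezometric head at one point, the FOSM combination of input covariance with model sensitivity reduces to the quadratic form sᵀCs. A minimal Python sketch with illustrative numbers (not taken from the study):

```python
def fosm_variance(sens, cov):
    """First-Order Second Moment propagation: output variance s^T C s,
    where s holds the model sensitivities (d head / d parameter) and
    C is the covariance matrix of the input parameters."""
    n = len(sens)
    cs = [sum(cov[i][j] * sens[j] for j in range(n)) for i in range(n)]
    return sum(sens[i] * cs[i] for i in range(n))

# Two correlated hydraulic-conductivity zones (illustrative numbers).
s = [0.8, -0.3]                   # head sensitivity to each zone's K
C = [[0.04, 0.01], [0.01, 0.09]]  # covariance of the K estimates
var_head = fosm_variance(s, C)    # ~0.0289
```

    Candidate sample locations can then be ranked by how much each parameter's term contributes to this output variance.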

  9. A novel semi-quantitative method for measuring tissue bleeding.

    PubMed

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula: N = Bt x 100 / At where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples. PMID:24190861
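
    The formula N = Bt × 100 / At from the abstract applies directly once the AutoCAD "polyline" areas are exported; a short Python sketch with hypothetical area measurements:

```python
def bleeding_percentage(sample_areas, bleeding_areas):
    """N = Bt * 100 / At: Bt is the summed area of the bleeding
    ("polyline") regions across all samples, At the summed area
    of the whole tissue samples."""
    bt = sum(bleeding_areas)
    at = sum(sample_areas)
    return bt * 100.0 / at

# Two microphotographs with measured areas (arbitrary drawing units).
n = bleeding_percentage([120.0, 80.0], [18.0, 12.0])
# n = 15.0 (30 of 200 area units affected by bleeding)
```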

  10. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, for quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of disease, the development of efficient analysis tools for aberrant glycoproteins is very important for understanding the pathological function of the glycoprotein and for new biomarker development. This review first describes glycosylation-targeting enrichment technologies, mainly solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable-isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized together with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews published by Wiley Periodicals, Inc. Mass Spectrom. Rev. 34:148–165, 2015. PMID:24889823

  11. Quantitative methods in electroencephalography to access therapeutic response.

    PubMed

    Diniz, Roseane Costa; Fontenele, Andrea Martins Melo; Carmo, Luiza Helena Araújo do; Ribeiro, Aurea Celeste da Costa; Sales, Fábio Henrique Silva; Monteiro, Sally Cristina Moutinho; Sousa, Ana Karoline Ferreira de Castro

    2016-07-01

    Pharmacometrics, or quantitative pharmacology, aims to analyse quantitatively the interaction between drugs and patients, resting on the tripod of pharmacokinetics, pharmacodynamics and disease monitoring to identify variability in drug response. As this is a subject of central interest in the training of pharmacists, this work was carried out with a view to promoting methods to assess the therapeutic response to drugs with central action. The paper discusses quantitative methods (fast Fourier transform, magnitude squared coherence, conditional entropy, generalised linear semi-canonical correlation analysis, statistical parametric network and mutual information function) used to evaluate EEG signals obtained after drug administration regimens, the main findings and their clinical relevance, as a contribution to the construction of a different pharmaceutical practice. Peter Anderer et al. in 2000 showed the effect of 20 mg of buspirone in 20 healthy subjects 1, 2, 4, 6 and 8 h after oral ingestion of the drug; the areas of increased theta-band power occurred mainly in the temporo-occipital-parietal region. Sampaio et al. (2007) showed that bromazepam, which facilitates the action of GABA (gamma-aminobutyric acid), an inhibitory neurotransmitter of the central nervous system, could theoretically promote dissociation of cortical functional areas, a decrease in functional connectivity, and a decrease in cognitive functions, reflected in smaller coherence values (an electrophysiological magnitude measured from the EEG by software). Ahmad Khodayari-Rostamabad et al. in 2015 suggested that such a measure could potentially be a useful clinical tool to assess adverse effects of opioids and hence give rise to treatment guidelines; a relation was found between changes in pain intensity and brain sources (at maximum-activity locations) during remifentanil infusion, despite its potent analgesic effect. The statement of mathematical and computational
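
    Of the listed methods, the fast Fourier transform is the simplest to illustrate. The sketch below estimates relative theta-band (4-8 Hz) power with a plain DFT so that it needs no external libraries (numpy.fft would be the usual choice); the signal is synthetic, not EEG data:

```python
import cmath
import math

def band_power(signal, fs, lo, hi):
    """Fraction of total spectral power falling in one frequency band,
    computed with a direct DFT over the positive-frequency bins."""
    n = len(signal)
    total = band = 0.0
    for k in range(1, n // 2):          # skip DC, keep positive freqs
        x = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        p = abs(x) ** 2
        total += p
        if lo <= k * fs / n <= hi:      # bin k corresponds to k*fs/n Hz
            band += p
    return band / total

# 1 s of a pure 6 Hz "theta" sine sampled at 128 Hz.
sig = [math.sin(2 * math.pi * 6 * t / 128) for t in range(128)]
ratio = band_power(sig, 128, 4, 8)      # ~1.0: all power in the theta band
```

    Increases in such band-power ratios over scalp regions are the kind of quantity reported in the buspirone study cited above.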

  12. System and methods for wide-field quantitative fluorescence imaging during neurosurgery.

    PubMed

    Valdes, Pablo A; Jacobs, Valerie L; Wilson, Brian C; Leblond, Frederic; Roberts, David W; Paulsen, Keith D

    2013-08-01

    We report an accurate, precise and sensitive method and system for quantitative fluorescence image-guided neurosurgery. With a low-noise, high-dynamic-range CMOS array, we perform rapid (integration times as low as 50 ms per wavelength) hyperspectral fluorescence and diffuse reflectance detection and apply a correction algorithm to compensate for the distorting effects of tissue absorption and scattering. Using this approach, we generated quantitative wide-field images of fluorescence in tissue-simulating phantoms for the fluorophore PpIX, having concentrations and optical absorption and scattering variations over clinically relevant ranges. The imaging system was tested in a rodent model of glioma, detecting quantitative levels down to 20 ng/ml. The resulting performance is a significant advance on existing wide-field quantitative imaging techniques, and provides performance comparable to a point-spectroscopy probe that has previously demonstrated significant potential for improved detection of malignant brain tumors during surgical resection. PMID:23903142

  13. Methods for Quantitative Interpretation of Retarding Field Analyzer Data

    SciTech Connect

    Calvey, J.R.; Crittenden, J.A.; Dugan, G.F.; Palmer, M.A.; Furman, M.; Harkay, K.

    2011-03-28

    Over the course of the CesrTA program at Cornell, over 30 Retarding Field Analyzers (RFAs) have been installed in the CESR storage ring, and a great deal of data has been taken with them. These devices measure the local electron cloud density and energy distribution, and can be used to evaluate the efficacy of different cloud mitigation techniques. Obtaining a quantitative understanding of RFA data requires use of cloud simulation programs, as well as a detailed model of the detector itself. In a drift region, the RFA can be modeled by postprocessing the output of a simulation code, and one can obtain best fit values for important simulation parameters with a chi-square minimization method.
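
    The chi-square minimization used to extract best-fit simulation parameters can be illustrated with a toy one-parameter model; the brute-force grid search below is an assumption standing in for whatever optimizer the CesrTA analysis actually used, and the numbers are invented:

```python
def chi_square(data, model, sigma):
    """Weighted sum of squared residuals between measurement and model."""
    return sum(((d - m) / s) ** 2 for d, m, s in zip(data, model, sigma))

def best_fit(data, sigma, model_fn, grid):
    """Brute-force 1-D chi-square minimization over a parameter grid."""
    return min(grid, key=lambda p: chi_square(data, model_fn(p), sigma))

# Toy model: RFA signal proportional to an electron-yield parameter.
template = [1.0, 2.0, 3.0, 4.0]            # simulated detector response
data = [2.1, 3.9, 6.2, 7.8]                # "measured" signal, ~2x template
sigma = [0.2] * 4                          # measurement uncertainties
grid = [i / 100 for i in range(100, 301)]  # candidate yields 1.00..3.00
yield_fit = best_fit(data, sigma, lambda k: [k * t for t in template], grid)
```

    For this linear toy model the minimum lands at the weighted least-squares solution; in the real analysis each grid point would require postprocessing a cloud-simulation run.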

  14. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  15. Implementation of a quantitative Foucault knife-edge method by means of isophotometry

    NASA Astrophysics Data System (ADS)

    Zhevlakov, A. P.; Zatsepina, M. E.; Kirillovskii, V. K.

    2014-06-01

    A detailed description of the stages of computer processing of shadowgrams in a modern quantitative Foucault knife-edge method is presented. The map of wave-front aberrations introduced by errors of an optical surface or system is shown, along with the calculated set of required image-quality characteristics.

  16. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    ERIC Educational Resources Information Center

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  17. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  18. A quantitative evaluation of various deconvolution methods and their applications in the deconvolution of plasma spectra

    NASA Astrophysics Data System (ADS)

    Xiong, Yanwei; Shi, Yuejiang; Li, Yingying; Fu, Jia; Lu, Bo; Zhang, Hongming; Wang, Xiaoguang; Wang, Fudi; Shen, Yongcai

    2013-06-01

    A quantitative evaluation of various deconvolution methods and their applications in processing plasma emission spectra was performed. The iterative deconvolution algorithms evaluated here include Jansson's method, the Richardson-Lucy method, the maximum a posteriori method and Gold's method. The evaluation criteria include minimization of the sum of squared errors, the sum of squared relative errors of the parameters, and the rate of convergence. After comparing the deconvolved results, it was concluded that Jansson's and Gold's methods were able to provide good profiles that are visually close to the original spectra; Gold's method generally gives the best results when all the criteria above are considered. Applications of these methods to actual plasma spectra obtained from the EAST tokamak are also presented. The deconvolution results with Gold's and Jansson's methods show that the effects of the instrument can be satisfactorily eliminated and clear spectra recovered.
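
    As an illustration of the iterative schemes compared above, here is a minimal sketch (not the authors' code) of Gold's ratio method for a 1-D spectrum, x_{k+1} = x_k · y / (H x_k), where H is convolution with the known instrument response; the kernel and spectrum below are toy values.

```python
def convolve(x, kernel):
    """Same-length 1-D convolution with zero padding; kernel length is odd."""
    half = len(kernel) // 2
    out = []
    for i in range(len(x)):
        s = 0.0
        for j, k in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(x):
                s += k * x[idx]
        out.append(s)
    return out

def gold_deconvolve(y, kernel, iterations=200, eps=1e-12):
    """Gold's method: multiplicative updates keep the estimate non-negative."""
    x = y[:]                          # start from the measured spectrum
    for _ in range(iterations):
        hx = convolve(x, kernel)
        x = [xi * yi / max(hxi, eps) for xi, yi, hxi in zip(x, y, hx)]
    return x

# Toy usage: a single spectral line blurred by a 3-point instrument response.
kernel = [0.25, 0.5, 0.25]
true = [0.0, 0.0, 4.0, 0.0, 0.0]
blurred = convolve(true, kernel)      # [0, 1, 2, 1, 0]
recovered = gold_deconvolve(blurred, kernel)
```

    The multiplicative update slowly sharpens the blurred line back toward the original spike, which is why Gold's method suits non-negative spectra.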

  19. Quantitative analysis of rib kinematics based on dynamic chest bone images: preliminary results

    PubMed Central

    Tanaka, Rie; Sanada, Shigeru; Sakuta, Keita; Kawashima, Hiroki

    2015-01-01

    Abstract. An image-processing technique for separating bones from soft tissue in static chest radiographs has been developed. The present study was performed to evaluate the usefulness of dynamic bone images in quantitative analysis of rib movement. Dynamic chest radiographs of 16 patients were obtained using a dynamic flat-panel detector and processed to create bone images with commercial software (Clear Read BS, Riverain Technologies). Velocity vectors were measured in local areas of the dynamic images and assembled into velocity maps. The velocity maps obtained from bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify movements of the ribs and distinguish them accurately from those of other lung structures. Limited rib movement in scoliosis patients appeared as a reduced rib velocity field, resulting in an asymmetrical distribution of rib movement. Vector maps in all normal cases exhibited left/right-symmetric distributions of the velocity field, whereas those in abnormal cases showed asymmetric distributions because of locally limited rib movements. Dynamic bone images were useful for accurate quantitative analysis of rib movement. The present method has potential as an additional functional examination in chest radiography. PMID:26158097

  20. Crystal alignment of carbonated apatite in bone and calcified tendon: results from quantitative texture analysis.

    PubMed

    Wenk, H R; Heidelbach, F

    1999-04-01

    Calcified tissue contains collagen associated with minute crystallites of carbonated apatite. In this study, methods of quantitative X-ray texture analysis were used to determine the orientation distribution and texture strength of apatite in a calcified turkey tendon and in trabecular and cortical regions of osteonal bovine ankle bone (metacarpus). To resolve local heterogeneity, a 2 or 10 microm synchrotron microfocus X-ray beam (lambda = 0.78 A) was employed. Both samples revealed a strong texture. In the case of turkey tendon, 12 times more c axes of hexagonal apatite were parallel to the fibril axis than perpendicular, and a axes had rotational freedom about the c axis. In bovine bone, the orientation density of the c axes was three times higher parallel to the surface of collagen fibrils than perpendicular to it, and there was no preferential alignment with respect to the long axis of the bone (fiber texture). Whereas half of the apatite crystallites were strongly oriented, the remaining half had a random orientation distribution. The synchrotron X-ray texture results were consistent with previous analyses of mineral orientation in calcified tissues by conventional X-ray and neutron diffraction and electron microscopy, but gave, for the first time, a quantitative description. PMID:10221548

  1. Modeling conflict: research methods, quantitative modeling, and lessons learned.

    SciTech Connect

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be to inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict, both retrospectively and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempts at modeling conflict as a result of system-level interactions. This study presents modeling efforts built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.

  2. Rapid quantitative pharmacodynamic imaging by a novel method: theory, simulation testing and proof of principle.

    PubMed

    Black, Kevin J; Koller, Jonathan M; Miller, Brad D

    2013-01-01

    Pharmacological challenge imaging has mapped, but rarely quantified, the sensitivity of a biological system to a given drug. We describe a novel method called rapid quantitative pharmacodynamic imaging. This method combines pharmacokinetic-pharmacodynamic modeling, repeated small doses of a challenge drug over a short time scale, and functional imaging to rapidly provide quantitative estimates of drug sensitivity, including EC50 (the concentration of drug that produces half the maximum possible effect). We first test the method with simulated data, assuming a typical sigmoidal dose-response curve and assuming imperfect imaging that includes artifactual baseline signal drift and random error. With these few assumptions, rapid quantitative pharmacodynamic imaging reliably estimates EC50 from the simulated data, except when noise overwhelms the drug effect or when the effect occurs only at high doses. In preliminary fMRI studies of primate brain using a dopamine agonist, the observed noise level is modest compared with observed drug effects, and a quantitative EC50 can be obtained from some regional time-signal curves. Taken together, these results suggest that research and clinical applications for rapid quantitative pharmacodynamic imaging are realistic. PMID:23940831
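
    For illustration only, the core curve-fitting idea behind an EC50 estimate can be sketched with a simple Hill (sigmoidal) model and a grid search. The authors' method additionally couples pharmacokinetic-pharmacodynamic modeling with imaging noise, which is omitted here, and all numbers below are invented.

```python
def hill(c, emax, ec50):
    """Sigmoidal dose-response: effect approaches emax as concentration grows."""
    return emax * c / (c + ec50)

def fit_ec50(concs, effects, emax, candidates):
    """Grid-search the EC50 value minimizing squared error to observed effects."""
    def sse(ec50):
        return sum((e - hill(c, emax, ec50)) ** 2 for c, e in zip(concs, effects))
    return min(candidates, key=sse)

# Toy data generated from EC50 = 2.0, Emax = 10 (noise-free for clarity).
concs = [0.5, 1.0, 2.0, 4.0, 8.0]
effects = [hill(c, 10.0, 2.0) for c in concs]
ec50_hat = fit_ec50(concs, effects, 10.0, [i / 10 for i in range(5, 51)])
```

    With noisy imaging data one would replace the grid search with a proper nonlinear least-squares fit and model the baseline drift explicitly.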

  3. Breast tumour visualization using 3D quantitative ultrasound methods

    NASA Astrophysics Data System (ADS)

    Gangeh, Mehrdad J.; Raheem, Abdul; Tadayyon, Hadi; Liu, Simon; Hadizad, Farnoosh; Czarnota, Gregory J.

    2016-04-01

    Breast cancer is one of the most common cancer types, accounting for 29% of all cancer cases. Early detection and treatment have a crucial impact on improving the survival of affected patients. Ultrasound (US) is a non-ionizing, portable, inexpensive, real-time imaging modality for screening and quantifying breast cancer. Due to these attractive attributes, the last decade has witnessed many studies on using quantitative ultrasound (QUS) methods in tissue characterization. However, these studies have mainly been limited to 2-D QUS methods using hand-held US (HHUS) scanners. With the availability of automated breast ultrasound (ABUS) technology, this study is the first to develop 3-D QUS methods for the ABUS visualization of breast tumours. Using an ABUS system, unlike the manual 2-D HHUS device, the patient's whole breast was scanned in an automated manner. The acquired frames were subsequently examined and a region of interest (ROI) was selected in each frame where tumour was identified. Standard 2-D QUS methods were used to compute spectral and backscatter coefficient (BSC) parametric maps on the selected ROIs. Next, the computed 2-D parameters were mapped to a Cartesian 3-D space, interpolated, and rendered to provide a transparent color-coded visualization of the entire breast tumour. Such 3-D visualization can potentially be used for further analysis of breast tumours in terms of their size and extension. Moreover, the 3-D volumetric scans can be used for tissue characterization and the categorization of breast tumours as benign or malignant by quantifying the computed parametric maps over the whole tumour volume.

  4. A MALDI-MS-based quantitative analytical method for endogenous estrone in human breast cancer cells.

    PubMed

    Kim, Kyoung-Jin; Kim, Hee-Jin; Park, Han-Gyu; Hwang, Cheol-Hwan; Sung, Changmin; Jang, Kyoung-Soon; Park, Sung-Hee; Kim, Byung-Gee; Lee, Yoo-Kyung; Yang, Yung-Hun; Jeong, Jae Hyun; Kim, Yun-Gon

    2016-01-01

    The level of endogenous estrone, one of the three major naturally occurring estrogens, has a significant correlation with the incidence of post-menopausal breast cancer. However, it is challenging to quantitatively monitor it owing to its low abundance. Here, we develop a robust and highly sensitive matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS)-based quantitative platform to identify the absolute quantities of endogenous estrones in a variety of clinical specimens. The one-step modification of endogenous estrone provided good linearity (R(2) > 0.99) and significantly increased the sensitivity of the platform (limit of quantitation: 11 fmol). In addition, we could identify the absolute amount of endogenous estrones in cells of the breast cancer cell line MCF-7 (34 fmol/10(6) cells) by using a deuterated estrone as an internal standard. Finally, by applying the MALDI-MS-based quantitative method to endogenous estrones, we successfully monitored changes in the metabolic expression level of estrones (17.7 fmol/10(6) letrozole-treated cells) in MCF-7 cells resulting from treatment with an aromatase inhibitor. Taken together, these results suggest that this MALDI-MS-based quantitative approach may be a general method for the targeted metabolomics of ketone-containing metabolites, which can reflect clinical conditions and pathogenic mechanisms. PMID:27091422

  5. A MALDI-MS-based quantitative analytical method for endogenous estrone in human breast cancer cells

    PubMed Central

    Kim, Kyoung-Jin; Kim, Hee-Jin; Park, Han-Gyu; Hwang, Cheol-Hwan; Sung, Changmin; Jang, Kyoung-Soon; Park, Sung-Hee; Kim, Byung-Gee; Lee, Yoo-Kyung; Yang, Yung-Hun; Jeong, Jae Hyun; Kim, Yun-Gon

    2016-01-01

    The level of endogenous estrone, one of the three major naturally occurring estrogens, has a significant correlation with the incidence of post-menopausal breast cancer. However, it is challenging to quantitatively monitor it owing to its low abundance. Here, we develop a robust and highly sensitive matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS)-based quantitative platform to identify the absolute quantities of endogenous estrones in a variety of clinical specimens. The one-step modification of endogenous estrone provided good linearity (R2 > 0.99) and significantly increased the sensitivity of the platform (limit of quantitation: 11 fmol). In addition, we could identify the absolute amount of endogenous estrones in cells of the breast cancer cell line MCF-7 (34 fmol/106 cells) by using a deuterated estrone as an internal standard. Finally, by applying the MALDI-MS-based quantitative method to endogenous estrones, we successfully monitored changes in the metabolic expression level of estrones (17.7 fmol/106 letrozole-treated cells) in MCF-7 cells resulting from treatment with an aromatase inhibitor. Taken together, these results suggest that this MALDI-MS-based quantitative approach may be a general method for the targeted metabolomics of ketone-containing metabolites, which can reflect clinical conditions and pathogenic mechanisms. PMID:27091422

  6. Establishment and evaluation of event-specific quantitative PCR method for genetically modified soybean MON89788.

    PubMed

    Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Kitta, Kazumi

    2010-01-01

    A novel real-time PCR-based analytical method was established for the event-specific quantification of the GM soybean event MON89788. The conversion factor (C(f)), which is required to calculate the GMO amount, was experimentally determined. The quantitative method was evaluated by a single-laboratory analysis and a blind test in a multi-laboratory trial. The limit of quantitation for the method was estimated to be 0.1% or lower. The trueness and precision were evaluated as the bias and the relative standard deviation of reproducibility (RSD(R)), and the determined bias and RSD(R) values for the method were both less than 20%. These results suggest that the established method would be suitable for practical detection and quantification of MON89788. PMID:21071908
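
    The validation statistics named above, bias (trueness) and the relative standard deviation of reproducibility (RSD(R)), can be sketched as follows; the inter-laboratory results are invented for illustration.

```python
def bias_percent(measured, true_value):
    """Trueness: relative deviation of the mean from the reference value."""
    mean = sum(measured) / len(measured)
    return 100.0 * (mean - true_value) / true_value

def rsd_percent(measured):
    """Precision: sample standard deviation relative to the mean, in percent."""
    n = len(measured)
    mean = sum(measured) / n
    var = sum((x - mean) ** 2 for x in measured) / (n - 1)   # sample variance
    return 100.0 * var ** 0.5 / mean

# Hypothetical inter-laboratory results for a 1.0% GMO reference sample.
results = [0.95, 1.05, 1.10, 0.90, 1.00]
b = bias_percent(results, 1.0)
r = rsd_percent(results)
```

    Both values falling below 20% would meet the acceptance criterion the abstract describes.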

  7. Overview of Student Affairs Research Methods: Qualitative and Quantitative.

    ERIC Educational Resources Information Center

    Perl, Emily J.; Noldon, Denise F.

    2000-01-01

    Reviews the strengths and weaknesses of quantitative and qualitative approaches in student affairs research, noting that many student affairs professionals question the value of more traditional quantitative methods, though their typically strong people skills have helped them become good qualitative researchers.…

  8. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    Abstract This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456
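
    The reported performance measures are the standard confusion-matrix statistics, sketched below with hypothetical counts (not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, predictive values and accuracy from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Toy example: 100 joints, 50 truly painful, with an imperfect test.
m = diagnostic_metrics(tp=40, fp=15, fn=10, tn=35)
```

    Comparing two assessment methods then reduces to comparing these dictionaries on the same set of joints.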

  9. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is summarized. Results of analyses of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship of concentration and fluorescence units is provided; it may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean).
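
    The paper's quantitation equation is not reproduced in this abstract; as a hedged illustration of the general approach, a linear calibration relating fluorescence units to PHE concentration might look like this (all values hypothetical):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Toy calibration: fluorescence at 575 nm vs known PHE concentrations.
conc = [0.0, 1.0, 2.0, 4.0]            # ug/ml standards (hypothetical)
fluor = [2.0, 12.0, 22.0, 42.0]        # fluorescence units
slope, intercept = linear_fit(conc, fluor)
unknown_conc = (30.0 - intercept) / slope   # invert calibration for a sample
```

    In practice separate calibrations would be kept for the 575-580 nm and 560 nm fluorescence regimes mentioned above.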

  10. Disordered Speech Assessment Using Automatic Methods Based on Quantitative Measures

    NASA Astrophysics Data System (ADS)

    Gu, Lingyun; Harris, John G.; Shrivastav, Rahul; Sapienza, Christine

    2005-12-01

    Speech quality assessment methods are necessary for evaluating and documenting treatment outcomes of patients suffering from degraded speech due to Parkinson's disease, stroke, or other disease processes. Subjective methods of speech quality assessment are more accurate and more robust than objective methods but are time-consuming and costly. We propose a novel objective measure of speech quality assessment that builds on traditional speech processing techniques such as dynamic time warping (DTW) and the Itakura-Saito (IS) distortion measure. Initial results show that our objective measure correlates well with the more expensive subjective methods.

  11. Quantitative Methods for Comparing Different Polyline Stream Network Models

    SciTech Connect

    Danny L. Anderson; Daniel P. Ames; Ping Yang

    2014-04-01

    Two techniques for exploring the relative horizontal accuracy of complex linear spatial features are described, and sample source code (pseudo code) is presented for this purpose. The first technique, relative sinuosity, is presented as a measure of the complexity or detail of a polyline network in comparison to a reference network. We term the second technique longitudinal root mean squared error (LRMSE) and present it as a means for quantitatively assessing the horizontal variance between two polyline data sets representing digitized (reference) and derived stream and river networks. Both relative sinuosity and LRMSE are shown to be suitable measures of horizontal stream network accuracy for assessing quality and variation in linear features. Both techniques have been used in two recent investigations involving extraction of hydrographic features from LiDAR elevation data. One confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes yielded better stream network delineations, based on sinuosity and LRMSE, when using LiDAR-derived DEMs. The other demonstrated a new method of delineating stream channels directly from LiDAR point clouds, without the intermediate step of deriving a DEM, showing that direct delineation from LiDAR point clouds yielded an excellent, much closer match, as indicated by the LRMSE.
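
    In the spirit of the pseudo code the paper presents, the two measures can be sketched as follows. Note that true LRMSE is computed longitudinally along the reference network; the point-wise pairing below is a simplification, and the coordinates are toy values.

```python
from math import dist, sqrt

def path_length(points):
    """Total length along a polyline given as (x, y) vertices."""
    return sum(dist(a, b) for a, b in zip(points, points[1:]))

def sinuosity(points):
    """Path length divided by straight-line distance between endpoints."""
    return path_length(points) / dist(points[0], points[-1])

def pointwise_rmse(line_a, line_b):
    """RMSE of distances between corresponding sample points (simplified LRMSE)."""
    assert len(line_a) == len(line_b)
    return sqrt(sum(dist(a, b) ** 2 for a, b in zip(line_a, line_b)) / len(line_a))

# Toy example: a zig-zag derived line vs the straight reference it approximates.
reference = [(0, 0), (1, 0), (2, 0), (3, 0)]
derived = [(0, 0), (1, 0.3), (2, -0.3), (3, 0)]
rel_sinuosity = sinuosity(derived) / sinuosity(reference)
rmse = pointwise_rmse(reference, derived)
```

    A relative sinuosity near 1 and a small RMSE would indicate a derived network closely matching the digitized reference.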

  12. Quantitative methods in the study of trypanosomes and their applications*

    PubMed Central

    Lumsden, W. H. R.

    1963-01-01

    In the first part of this paper the author summarizes and discusses previous quantitative work on trypanosomes, with particular reference to biometrical studies, in vivo and in vitro studies on numbers of trypanosomes, studies on hosts infected with trypanosomes, and physiological studies. The second part discusses recent work done at the East African Trypanosomiasis Research Organization. A method for the measurement of the infectivity of trypanosome suspensions, based on serial dilution and inoculation into test animals, is outlined, and applications likely to improve diagnostic procedures are suggested for it. Such applications might include: the establishment of experimental procedures not significantly reducing the infectivity of trypanosomes under experiment; determination of the effects on the infectivity of preserved material of some of the factors in the process of preservation, important for the preparation of standard material; comparison of the efficiency of different culture media for the isolation of trypanosomes; study of the distribution of trypanosomes in the vertebrate host; and measurement of the susceptibility of trypanosomes to drugs. The author stresses the importance of relating future experimental work with trypanosomes to preserved material for which comprehensive documentation is available. PMID:20604152

  13. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    PubMed Central

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-01-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output. PMID:26430292

  14. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  15. Quantitative EDXS analysis of organic materials using the ζ-factor method.

    PubMed

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. PMID:24012932
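
    The core quantification step of the ζ-factor method can be sketched as follows: under the standard ζ-factor relations, the mass fractions are C_i = ζ_i·I_i / Σ_j ζ_j·I_j and the mass thickness is ρt = Σ_j ζ_j·I_j / D_e for electron dose D_e. This is a simplified outline, not the authors' implementation, and the sensitivity factors, intensities and dose below are hypothetical.

```python
def zeta_quantify(zeta, intensity, dose):
    """Composition and mass thickness from zeta-factors and X-ray intensities."""
    terms = [z * i for z, i in zip(zeta, intensity)]
    total = sum(terms)
    concentrations = [t / total for t in terms]   # mass fractions, sum to 1
    mass_thickness = total / dose                  # rho*t in consistent units
    return concentrations, mass_thickness

# Hypothetical two-element organic film (values illustrative only).
conc, rho_t = zeta_quantify(zeta=[1.2, 0.8], intensity=[500.0, 900.0], dose=1e6)
```

    Because absorption correction enters through the mass thickness, this formulation is what makes the method attractive for light-element analysis.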

  16. A Method for Quantitatively Evaluating a University Library Collection

    ERIC Educational Resources Information Center

    Golden, Barbara

    1974-01-01

    The acquisitions department of the University of Nebraska at Omaha library conducted a quantitative evaluation of the library's book collection in relation to the course offerings of the university. (Author/LS)

  17. Quantitative biomechanical comparison of ankle fracture casting methods.

    PubMed

    Shipman, Alastair; Alsousou, Joseph; Keene, David J; Dyson, Igor N; Lamb, Sarah E; Willett, Keith M; Thompson, Mark S

    2015-06-01

    The incidence of ankle fractures is increasing rapidly due to the ageing demographic. In older patients with compromised distal circulation, conservative treatment of fractures may be indicated. High rates of malunion and complications due to skin fragility motivate the design of novel casting systems, but biomechanical stability requirements are poorly defined. This article presents the first quantitative study of ankle cast stability and hypothesises that a newly proposed close contact cast (CCC) system provides similar biomechanical stability to standard casts (SC). Two adult mannequin legs transected at the malleoli, one incorporating an inflatable model of tissue swelling, were stabilised with casts applied by an experienced surgeon. They were cyclically loaded in torsion, measuring applied rotation angle and resulting torque. CCC stiffness was equal to or greater than that of SC in two measures of ankle cast resistance to torsion. The effect of swelling reduction at the ankle site was significantly greater on CCC than on SC. The data support the hypothesis that CCC provides similar biomechanical stability to SC and therefore also the clinical use of CCC. They suggest that more frequent re-application of CCC is likely required to maintain stability following resolution of swelling at the injury site. PMID:25719278

  18. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
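
    The NSR figure of merit is straightforward once the linear-model parameters are in hand; the sketch below assumes the slope and noise standard deviation for each method have already been estimated by the NGS procedure, and the method names and values are hypothetical.

```python
def nsr(slope, noise_sd):
    """Noise-to-slope ratio: lower values indicate better precision."""
    return noise_sd / slope

def rank_methods(params):
    """params: {method_name: (slope, noise_sd)} -> names sorted best first."""
    return sorted(params, key=lambda name: nsr(*params[name]))

# Hypothetical NGS estimates for three reconstruction methods.
estimates = {"OSEM+corr": (0.98, 0.05), "OSEM": (0.90, 0.09), "FBP": (0.85, 0.15)}
order = rank_methods(estimates)
```

    Ranking on NSR rather than on raw noise accounts for methods that trade sensitivity (slope) against noise.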

  19. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Caffo, Brian; Frey, Eric C.

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  20. Business Scenario Evaluation Method Using Monte Carlo Simulation on Qualitative and Quantitative Hybrid Model

    NASA Astrophysics Data System (ADS)

    Samejima, Masaki; Akiyoshi, Masanori; Mitsukuni, Koshichiro; Komoda, Norihisa

    We propose a business scenario evaluation method that uses a qualitative and quantitative hybrid model. In order to evaluate business factors linked by qualitative causal relations, we introduce statistical values based on the propagation and combination of the effects of business factors via Monte Carlo simulation. In propagating an effect, we divide the range of each factor by landmarks and decide the effect on a destination node based on the divided sub-ranges. In combining effects, we decide the effect of each arc using its contribution degree and sum all effects. Through application to practical models, it is confirmed that there are no differences between results obtained with purely quantitative relations and results obtained by the proposed method at the 5% risk level.

  1. PRIORITIZING FUTURE RESEARCH ON OFF-LABEL PRESCRIBING: RESULTS OF A QUANTITATIVE EVALUATION

    PubMed Central

    Walton, Surrey M.; Schumock, Glen T.; Lee, Ky-Van; Alexander, G. Caleb; Meltzer, David; Stafford, Randall S.

    2015-01-01

    Background Drug use for indications not approved by the Food and Drug Administration exceeds 20% of prescribing. Available compendia indicate that a minority of off-label uses are well supported by evidence. Policy makers, however, lack information to identify where systematic reviews of the evidence or other research would be most valuable. Methods We developed a quantitative model for prioritizing individual drugs for future research on off-label uses. The base model incorporated three key factors: (1) the volume of off-label use with inadequate evidence, (2) safety, and (3) cost and market considerations. Nationally representative prescribing data were used to estimate the number of off-label drug uses by indication from 1/2005 through 6/2007 in the United States, and these indications were then categorized according to the adequacy of scientific support. Black box warnings and safety alerts were used to quantify drug safety. Drug cost, date of market entry, and marketing expenditures were used to quantify cost and market considerations. Each drug was assigned a relative value for each factor, and the factors were then weighted in the final model to produce a priority score. Sensitivity analyses were conducted by varying the weightings and model parameters. Results Drugs that were consistently ranked highly in both our base model and sensitivity analyses included quetiapine, warfarin, escitalopram, risperidone, montelukast, bupropion, sertraline, venlafaxine, celecoxib, lisinopril, duloxetine, trazodone, olanzapine, and epoetin alfa. Conclusion Future research into off-label drug use should focus on drugs used frequently with inadequate supporting evidence, particularly if further concerns are raised by known safety issues, high drug cost, recent market entry, and extensive marketing. Based on quantitative measures of these factors, we have prioritized drugs where targeted research and policy activities have high potential value. PMID:19025425
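    The weighted-factor priority score can be sketched as a simple linear combination. The factor values and weights below are illustrative assumptions, not the published model parameters.

```python
# Hedged sketch of a three-factor priority score: each drug gets relative
# values (scaled 0-1) for off-label volume with inadequate evidence, safety
# concerns, and cost/market considerations, combined with fixed weights.

def priority_score(volume, safety, cost_market, weights=(0.5, 0.25, 0.25)):
    """Weighted sum of the three relative factor values."""
    w_vol, w_saf, w_cm = weights
    return w_vol * volume + w_saf * safety + w_cm * cost_market

# Hypothetical drugs with assumed factor values (volume, safety, cost/market):
drugs = {
    "drug_X": (0.9, 0.8, 0.7),  # high off-label volume with weak evidence
    "drug_Y": (0.4, 0.2, 0.5),
}
ranked = sorted(drugs, key=lambda d: priority_score(*drugs[d]), reverse=True)
print(ranked)
```

Varying the `weights` tuple corresponds to the sensitivity analyses described in the abstract.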

  2. Quantitative assessment of susceptibility weighted imaging processing methods

    PubMed Central

    Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.

    2013-01-01

    Purpose To evaluate different susceptibility weighted imaging (SWI) phase processing methods and parameter selection, thereby improving understanding of potential artifacts, as well as facilitating choice of methodology in clinical settings. Materials and Methods Two major phase processing methods, Homodyne-filtering and phase unwrapping-high pass (HP) filtering, were investigated with various phase unwrapping approaches, filter sizes, and filter types. Magnitude and phase images were acquired from a healthy subject and brain injury patients on a 3T clinical Siemens MRI system. Results were evaluated based on image contrast to noise ratio and presence of processing artifacts. Results When using a relatively small filter size (32 pixels for the matrix size 512 × 512 pixels), all Homodyne-filtering methods were subject to phase errors leading to 2% to 3% masked brain area in lower and middle axial slices. All phase unwrapping-filtering/smoothing approaches demonstrated fewer phase errors and artifacts compared to the Homodyne-filtering approaches. For performing phase unwrapping, Fourier-based methods, although less accurate, were 2–4 orders of magnitude faster than the PRELUDE, Goldstein and Quality-guide methods. Conclusion Although Homodyne-filtering approaches are faster and more straightforward, phase unwrapping followed by HP filtering approaches perform more accurately in a wider variety of acquisition scenarios. PMID:24923594
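    The homodyne idea, one of the two phase-processing families compared above, can be shown in a 1-D miniature: divide the complex signal by a low-pass-filtered copy of itself and keep the phase of the quotient, which suppresses slowly varying background phase while preserving local phase features. The moving-average low-pass filter and the toy signal are simplifying assumptions; real SWI processing works on 2-D k-space data.

```python
import cmath

def moving_average(z, k=5):
    """Crude low-pass filter: windowed mean of a complex sequence."""
    half = k // 2
    out = []
    for i in range(len(z)):
        window = z[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

def homodyne_phase(z, k=5):
    """Phase of the original signal divided by its low-pass-filtered copy."""
    lp = moving_average(z, k)
    return [cmath.phase(zi / li) for zi, li in zip(z, lp)]

# Slow background phase ramp plus one local phase feature at index 50:
signal = [cmath.exp(1j * (0.01 * i + (0.5 if i == 50 else 0.0))) for i in range(100)]
filtered = homodyne_phase(signal)
print(round(filtered[50], 3), round(filtered[20], 3))
```

The ramp is removed everywhere, while the local feature at index 50 survives (attenuated, because the filter window includes the feature itself) — the same trade-off that makes filter size selection matter in the study above.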

  3. A quantitative measurement method for comparison of seated postures.

    PubMed

    Hillman, Susan J; Hollington, James

    2016-05-01

    This technical note proposes a method to measure and compare seated postures. The three-dimensional locations of palpable anatomical landmarks corresponding to the anterior superior iliac spines, clavicular notch, head, shoulders and knees are measured in terms of x, y and z co-ordinates in the reference system of the measuring apparatus. These co-ordinates are then transformed onto a body-based axis system which allows comparison within-subject. The method was tested on eleven unimpaired adult participants and the resulting data used to calculate a Least Significant Difference (LSD) for the measure, which is used to determine whether two postures are significantly different from one another. The method was found to be sensitive to the four following standardised static postural perturbations: posterior pelvic tilt, pelvic obliquity, pelvic rotation, and abduction of the thighs. The resulting data could be used as an outcome measure for the postural alignment aspect of seating interventions in wheelchairs. PMID:26920073
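    The core of such a method is the transform from apparatus coordinates into a body-based frame. The sketch below is illustrative, not the authors' published axis definition: it places the origin midway between the two anterior superior iliac spines (ASIS) and aligns the x-axis with the ASIS-to-ASIS line (in 2-D for brevity), so that the same posture measured with the apparatus shifted or rotated maps to the same body-frame coordinates.

```python
import math

def body_frame(asis_left, asis_right, point):
    """Express a landmark in a pelvis-based 2-D frame: origin at the ASIS
    midpoint, x-axis along the left-to-right ASIS line."""
    ox = (asis_left[0] + asis_right[0]) / 2
    oy = (asis_left[1] + asis_right[1]) / 2
    angle = math.atan2(asis_right[1] - asis_left[1], asis_right[0] - asis_left[0])
    dx, dy = point[0] - ox, point[1] - oy
    # Rotate the offset into the pelvis-aligned axes:
    return (dx * math.cos(-angle) - dy * math.sin(-angle),
            dx * math.sin(-angle) + dy * math.cos(-angle))

# The same physical point measured with the apparatus translated and rotated
# by 0.3 rad maps to the same body-frame coordinates:
p1 = body_frame((0.0, 0.0), (2.0, 0.0), (1.0, 1.0))
p2 = body_frame((1.0, 1.0),
                (1.0 + 2 * math.cos(0.3), 1.0 + 2 * math.sin(0.3)),
                (1.0 + math.cos(0.3) - math.sin(0.3), 1.0 + math.sin(0.3) + math.cos(0.3)))
print(p1, p2)
```

Once postures live in the same body frame, within-subject differences can be tested against a Least Significant Difference threshold as in the note above.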

  4. Advanced quantitative magnetic nondestructive evaluation methods - Theory and experiment

    NASA Technical Reports Server (NTRS)

    Barton, J. R.; Kusenberger, F. N.; Beissner, R. E.; Matzkanin, G. A.

    1979-01-01

    The paper reviews the scale of fatigue crack phenomena in relation to the size-detection capabilities of nondestructive evaluation methods. An assessment of several features of fatigue in relation to the inspection of ball and roller bearings suggested the use of magnetic methods; magnetic domain phenomena, including the interaction of domains and inclusions and the influence of stress and magnetic field on domains, are discussed. Experimental results indicate that simplified calculations can be used to predict many features of these results; however, the data and the predictions of analytic models based on finite element computer analysis do not agree with respect to certain features. Experimental analyses of rod-type fatigue specimens, which relate magnetic measurements to crack opening displacement, crack volume, and crack depth, should provide methods for improved crack characterization in relation to fracture mechanics and life prediction.

  5. A quantitative analytical method to test for salt effects on giant unilamellar vesicles.

    PubMed

    Hadorn, Maik; Boenzli, Eva; Hotz, Peter Eggenberger

    2011-01-01

    Today, free-standing membranes, i.e. liposomes and vesicles, are used in a multitude of applications, e.g. as drug delivery devices and artificial cell models. Because current laboratory techniques do not allow handling of large sample sizes, systematic and quantitative studies on the impact of different effectors, e.g. electrolytes, are limited. In this work, we evaluated the Hofmeister effects of ten alkali metal halides on giant unilamellar vesicles made of palmitoyloleoylphosphatidylcholine for a large sample size by combining the highly parallel water-in-oil emulsion transfer vesicle preparation method with automatic haemocytometry. We found that this new quantitative screening method is highly reliable and consistent with previously reported results. Thus, this method may provide a significant methodological advance in analysis of effects on free-standing model membranes. PMID:22355683

  6. A gas chromatography-mass spectrometry method for the quantitation of clobenzorex.

    PubMed

    Cody, J T; Valtier, S

    1999-01-01

    Drugs metabolized to amphetamine or methamphetamine are potentially significant concerns in the interpretation of amphetamine-positive urine drug-testing results. One of these compounds, clobenzorex, is an anorectic drug that is available in many countries. Clobenzorex (2-chlorobenzylamphetamine) is metabolized to amphetamine by the body and excreted in the urine. Following administration, the parent compound was detectable for a shorter time than the metabolite amphetamine, which could be detected for days. Because of the potential complication posed to the interpretation of amphetamine-positive drug tests following administration of this drug, the viability of a current amphetamine procedure using liquid-liquid extraction and conversion to the heptafluorobutyryl derivative followed by gas chromatography-mass spectrometry (GC-MS) analysis was evaluated for identification and quantitation of clobenzorex. Qualitative identification of the drug was relatively straightforward. Quantitative analysis proved to be a far more challenging process. Several compounds were evaluated for use as the internal standard in this method, including methamphetamine-d11, fenfluramine, benzphetamine, and diphenylamine. Results using these compounds proved to be less than satisfactory because of poor reproducibility of the quantitative values. Because of its similar chromatographic properties to the parent drug, the compound 3-chlorobenzylamphetamine (3-Cl-clobenzorex) was evaluated in this study as the internal standard for the quantitation of clobenzorex. Precision studies showed 3-Cl-clobenzorex to produce accurate and reliable quantitative results (within-run relative standard deviations [RSDs] < 6.1%, between-run RSDs < 6.0%). The limits of detection and quantitation for this assay were determined to be 1 ng/mL for clobenzorex. PMID:10595847
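    The two calculations behind the precision figures quoted above — internal-standard quantitation and the relative standard deviation — can be sketched briefly. The peak areas and concentration are invented values; the actual assay's ions, calibration, and response factors are not reproduced here.

```python
import statistics

def concentration(analyte_area, is_area, is_conc, response_factor=1.0):
    """Single-point internal-standard quantitation: the analyte/internal-standard
    peak-area ratio scaled by the internal standard concentration."""
    return (analyte_area / is_area) * is_conc / response_factor

def rsd_percent(values):
    """Relative standard deviation (%), the precision metric quoted above."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Three hypothetical replicate injections (analyte area, internal-standard area),
# internal standard spiked at 100 ng/mL:
replicates = [concentration(a, i, 100.0) for a, i in
              [(1050, 1000), (980, 1010), (1020, 990)]]
print(replicates, rsd_percent(replicates))
```

A stable area ratio across runs is exactly why a chromatographically similar internal standard (here, 3-Cl-clobenzorex in the study) gives reproducible quantitation.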

  7. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but scientists have not truly focused their work on the quantitative performances of this technique. In order to prove the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization guarantees quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total-error approach using accuracy profiles. Even if UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC). PMID:24513349

  8. Sample Collection Method Bias Effects in Quantitative Phosphoproteomics.

    PubMed

    Kanshin, Evgeny; Tyers, Michael; Thibault, Pierre

    2015-07-01

    Current advances in selective enrichment, fractionation, and MS detection of phosphorylated peptides have allowed the identification and quantitation of tens of thousands of phosphosites from minute amounts of biological material. One of the major challenges in the field is preserving the in vivo phosphorylation state of the proteins throughout the sample preparation workflow. This is typically achieved by using phosphatase inhibitors and denaturing conditions during cell lysis. Here we determine whether the upstream cell collection techniques could introduce changes in protein phosphorylation. To evaluate the effect of sample collection protocols on the global phosphorylation status of the cell, we compared different sample workflows by metabolic labeling and quantitative mass spectrometry on Saccharomyces cerevisiae cell cultures. We identified highly similar phosphopeptides for cells harvested in ice-cold isotonic phosphate buffer, cold ethanol, trichloroacetic acid, and liquid nitrogen. However, quantitative analyses revealed that the commonly used phosphate buffer unexpectedly activated signaling events. Such effects may introduce systematic bias into phosphoproteomics measurements and biochemical analysis. PMID:26040406

  9. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  10. HPTLC Method for Quantitative Determination of Zopiclone and Its Impurity.

    PubMed

    Naguib, Ibrahim A; Abdelrahman, Maha M; El Ghobashy, Mohamed R; Ali, Nesma A

    2015-09-01

    This study was designed to establish, optimize and validate a sensitive, selective and accurate high-performance thin layer chromatographic (HPTLC) method for determination of zopiclone (ZPC) and its main impurity, 2-amino-5-chloropyridine, one of its degradation products, in raw material and pharmaceutical formulation. The proposed method was applied for analysis of ZPC and its impurity over the concentration range of 0.3-1.4 and 0.05-0.8 µg/band with accuracy of mean percentage recovery 99.92% ± 1.521 and 99.28% ± 2.296, respectively. The method is based on the separation of two components followed by densitometric measurement of the separated peaks at 305 nm. The separation was carried out on silica gel HPTLC F254 plates, using chloroform-methanol-glacial acetic acid (9:1:0.1, by volume) as a developing system. The suggested method was validated according to International Conference on Harmonization guidelines and can be applied for routine analysis in quality control laboratories. The results obtained by the proposed method were statistically compared with the reported method revealing high accuracy and good precision. PMID:25740427

  11. Methods and Challenges in Quantitative Imaging Biomarker Development

    PubMed Central

    Abramson, Richard G.; Burton, Kirsteen R.; Yu, John-Paul J.; Scalzetti, Ernest M.; Yankeelov, Thomas E.; Rosenkrantz, Andrew B.; Mendiratta-Lala, Mishal; Bartholmai, Brian J.; Ganeshan, Dhakshinamoorthy; Lenchik, Leon; Subramaniam, Rathan M.

    2014-01-01

    Academic radiology is poised to play an important role in the development and implementation of quantitative imaging (QI) tools. This manuscript, drafted by the Association of University Radiologists (AUR) Radiology Research Alliance (RRA) Quantitative Imaging Task Force, reviews current issues in QI biomarker research. We discuss motivations for advancing QI, define key terms, present a framework for QI biomarker research, and outline challenges in QI biomarker development. We conclude by describing where QI research and development is currently taking place and discussing the paramount role of academic radiology in this rapidly evolving field. PMID:25481515

  12. Quantitative Methods for Evaluating the Efficacy of Thalamic Deep Brain Stimulation in Patients with Essential Tremor

    PubMed Central

    Wastensson, Gunilla; Holmberg, Björn; Johnels, Bo; Barregard, Lars

    2013-01-01

    Background Deep brain stimulation (DBS) of the thalamus is a safe and efficient method for treatment of disabling tremor in patients with essential tremor (ET). However, successful tremor suppression after surgery requires careful selection of stimulus parameters. Our aim was to examine the possible use of certain quantitative methods for evaluating the efficacy of thalamic DBS in ET patients in clinical practice, and to compare these methods with traditional clinical tests. Methods We examined 22 patients using the Essential Tremor Rating Scale (ETRS) and quantitative assessment of tremor with the stimulator both activated and deactivated. We used an accelerometer (CATSYS tremor Pen) for quantitative measurement of postural tremor, and a eurythmokinesimeter (EKM) to evaluate kinetic tremor in a rapid pointing task. Results The efficacy of DBS on tremor suppression was prominent irrespective of the method used. The agreement between clinical rating of postural tremor and tremor intensity as measured by the CATSYS tremor pen was relatively high (rs = 0.74). The agreement between kinetic tremor as assessed by the ETRS and the main outcome variable from the EKM test was low (rs = 0.34). The lack of agreement indicates that the EKM test is not comparable with the clinical test. Discussion Quantitative methods, such as the CATSYS tremor pen, could be a useful complement to clinical tremor assessment in evaluating the efficacy of DBS in clinical practice. Future studies should evaluate the precision of these methods and long-term impact on tremor suppression, activities of daily living (ADL) function and quality of life. PMID:24255800
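    The agreement statistic reported above, Spearman's rank correlation r_s, is simple to compute. The sketch below assumes no tied ranks, which keeps the classic formula r_s = 1 − 6·Σd²/(n(n²−1)) applicable; the example ratings are invented.

```python
def spearman_rs(x, y):
    """Spearman rank correlation for untied data."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

clinical = [1, 2, 3, 4, 5, 6]                 # e.g. clinical tremor ratings
instrument = [1.1, 2.3, 2.9, 4.5, 4.9, 6.2]   # e.g. accelerometer intensity
print(spearman_rs(clinical, instrument))      # perfectly concordant ranks -> 1.0
```

Values near 1 (like the reported 0.74 for postural tremor) indicate that the instrument and the clinical scale order patients similarly; values near 0 (like 0.34 for the EKM) indicate poor agreement.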

  13. Integrated Geophysical Methods Applied to Geotechnical and Geohazard Engineering: From Qualitative to Quantitative Analysis and Interpretation

    NASA Astrophysics Data System (ADS)

    Hayashi, K.

    2014-12-01

    The near-surface is the region of day-to-day human activity on the earth. It is exposed to natural phenomena that sometimes cause disasters. This presentation covers a broad spectrum of geotechnical and geohazard approaches to mitigating disasters and conserving the natural environment using geophysical methods, and it emphasizes the contribution of geophysics to such issues. The presentation focuses on the usefulness of geophysical surveys in providing information to mitigate disasters, rather than on the theoretical details of a particular technique. Several techniques are introduced at the level of concept and application. Topics include various geohazard and geoenvironmental applications, such as earthquake disaster mitigation, prevention of floods triggered by torrential rain, environmental conservation, and study of the effects of global warming. Among the geophysical techniques, the active and passive surface-wave, refraction, and resistivity methods are mainly highlighted. Together with the geophysical techniques, several related issues, such as performance-based design, standardization or regularization, internet access, and databases, are also discussed. The presentation discusses the application of geophysical methods to engineering investigations from a non-uniqueness point of view and introduces the concepts of integrated and quantitative interpretation. Most geophysical analyses are essentially non-unique, and it is very difficult to obtain unique and reliable engineering solutions from only one geophysical method (Fig. 1). The only practical way to improve the reliability of an investigation is the joint use of several geophysical and geotechnical investigation methods, an integrated approach to geophysics. The result of a geophysical method is generally vague: "here is a high-velocity layer; it may be bedrock," or "this low-resistivity section may contain clayey soils." Such vague, qualitative, and subjective interpretation is not worthwhile in general engineering design work

  14. Quantitative comparison of analysis methods for spectroscopic optical coherence tomography: reply to comment

    PubMed Central

    Bosschaart, Nienke; van Leeuwen, Ton G.; Aalders, Maurice C.G.; Faber, Dirk J.

    2014-01-01

    We reply to the comment by Kraszewski et al. on "Quantitative comparison of analysis methods for spectroscopic optical coherence tomography." We present additional simulations evaluating the proposed window function. We conclude that our simulations show good qualitative agreement with the results of Kraszewski, in support of their conclusion that SOCT optimization should include window shape, in addition to the choice of window size and analysis algorithm. PMID:25401016

  15. Iterative reconstruction for quantitative computed tomography analysis of emphysema: consistent results using different tube currents

    PubMed Central

    Yamashiro, Tsuneo; Miyara, Tetsuhiro; Honda, Osamu; Tomiyama, Noriyuki; Ohno, Yoshiharu; Noma, Satoshi; Murayama, Sadayuki

    2015-01-01

    Purpose To assess the advantages of iterative reconstruction for quantitative computed tomography (CT) analysis of pulmonary emphysema. Materials and methods Twenty-two patients with pulmonary emphysema underwent chest CT imaging using identical scanners with three different tube currents: 240, 120, and 60 mA. Scan data were converted to CT images using Adaptive Iterative Dose Reduction using Three Dimensional Processing (AIDR3D) and a conventional filtered back-projection mode. Thus, six scans with and without AIDR3D were generated per patient. All other scanning and reconstruction settings were fixed. The percent low attenuation area (LAA%; < −950 Hounsfield units) and the lung density 15th percentile were automatically measured using a commercial workstation. Comparisons of LAA% and 15th percentile results between scans with and without using AIDR3D were made by Wilcoxon signed-rank tests. Associations between body weight and measurement errors among these scans were evaluated by Spearman rank correlation analysis. Results Overall, scan series without AIDR3D had higher LAA% and lower 15th percentile values than those with AIDR3D at each tube current (P<0.0001). For scan series without AIDR3D, lower tube currents resulted in higher LAA% values and lower 15th percentiles. The extent of emphysema was significantly different between each pair among scans when not using AIDR3D (LAA%, P<0.0001; 15th percentile, P<0.01), but was not significantly different between each pair among scans when using AIDR3D. On scans without using AIDR3D, measurement errors between different tube current settings were significantly correlated with patients’ body weights (P<0.05), whereas these errors between scans when using AIDR3D were insignificantly or minimally correlated with body weight. Conclusion The extent of emphysema was more consistent across different tube currents when CT scans were converted to CT images using AIDR3D than using a conventional filtered back-projection mode.
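    The two emphysema indices used in this study can be sketched directly from a list of lung-voxel attenuation values in Hounsfield units (HU). The toy voxel sample below is invented; a real analysis runs over the whole segmented lung volume, and the workstation's exact percentile interpolation may differ from the nearest-rank rule assumed here.

```python
def laa_percent(hu_values, threshold=-950):
    """Percent low-attenuation area: share of lung voxels below the HU threshold."""
    low = sum(1 for v in hu_values if v < threshold)
    return 100.0 * low / len(hu_values)

def percentile_15(hu_values):
    """HU value below which 15% of lung voxels fall (nearest-rank method)."""
    ordered = sorted(hu_values)
    k = max(0, int(0.15 * len(ordered)) - 1)
    return ordered[k]

# Toy sample of lung-voxel HU values:
voxels = [-990, -970, -960, -940, -930, -920, -910, -900, -890, -880]
print(laa_percent(voxels), percentile_15(voxels))
```

More image noise pushes voxels below the −950 HU threshold, which is why the noisier low-dose filtered back-projection scans in the study overestimate LAA% while iterative reconstruction keeps the indices stable.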

  16. Group-consensus method and results

    SciTech Connect

    Meyer, M.A.; Peaslee, A.T. Jr.; Booker, J.M.

    1982-11-01

    This report focuses on the group consensus method, its application, results, and recommendations for future use. The method involves a group of qualified individuals who reach agreement on one answer after discussing the options in a face-to-face situation. The group method was used to elicit estimates on the relevance of weapon-related components to certain military threats or needs. In this study, the group consensus method was chosen from four possible methods to provide input data for a decision analysis model being tested for weapons-planning use. The major goal of the weapons-planning project was to determine the applicability of the decision analysis model, a modified linear utility model. This report examines whether the estimates (also referred to as weights) properly reflected the relationships between the components being judged. Statistical analysis (chi-square tests) indicated that the estimates were largely assigned according to the relationships between the components. Behavioral and cognitive factors could not be found to correlate with the assignment of the estimates. In sum, the group consensus method was judged suitable for situations in which a single estimate must be obtained from many estimates and stringent controls over the estimating process would be unacceptably burdensome.

  17. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize. PMID:23470871
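    The quantitation step described above can be sketched as follows. The event copy number is normalized to the endogenous reference gene and scaled by a conversion factor (Cf) to give a weight-based GMO percentage; the Cf value and copy numbers below are invented, not the validated LY038 figures.

```python
import statistics

def gmo_percent(event_copies, reference_copies, cf):
    """Weight-based GMO content: event/reference copy ratio scaled by the
    experimentally determined conversion factor."""
    return (event_copies / reference_copies) / cf * 100.0

def rsd_r_percent(lab_results):
    """Reproducibility RSD across laboratories, the precision criterion
    (values below 25% in the collaborative trial above)."""
    return 100.0 * statistics.stdev(lab_results) / statistics.mean(lab_results)

# Hypothetical (event, reference) copy-number pairs from three laboratories
# measuring the same ~1% sample, with an assumed Cf of 0.6:
labs = [gmo_percent(e, r, cf=0.6) for e, r in [(30, 5200), (33, 5100), (28, 5000)]]
print(labs, rsd_r_percent(labs))
```

In an interlaboratory trial, the RSD of reproducibility computed this way across blind samples is what decides whether the method meets the guideline's precision criterion.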

  18. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    SciTech Connect

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-31

    In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods will be compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and will be presented.

  19. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    NASA Astrophysics Data System (ADS)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-01

    In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods will be compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and will be presented.

  20. Full skin quantitative optical coherence elastography achieved by combining vibration and surface acoustic wave methods

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam

    2015-03-01

    By combining with phase-sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have each been reported to provide elastography of skin tissue. However, neither of these two methods can provide elastography over the full skin depth in current systems. This paper presents a feasibility study of an optical coherence elastography method which combines both vibration and SAW in order to give the quantitative mechanical properties of skin tissue over the full depth range, including epidermis, dermis, and subcutaneous fat. Experiments were carried out on layered tissue-mimicking phantoms and in vivo on human forearm and palm skin. A ring actuator generated the vibration, while a line actuator was used to excite the SAWs. A PhS-OCT system was employed to provide ultrahigh-sensitivity measurement of the generated waves. The experimental results demonstrate that, by combining the vibration and SAW methods, the full-skin bulk mechanical properties can be quantitatively measured, and the elastography can be obtained with a sensing depth from ~0 mm to ~4 mm. This method is promising for clinical applications where the quantitative elasticity of localized skin diseases is needed to aid diagnosis and treatment.

  1. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems…

  2. Analyzing the Students' Academic Integrity using Quantitative Methods

    ERIC Educational Resources Information Center

    Teodorescu, Daniel; Andrei, Tudorel; Tusa, Erika; Herteliu, Claudiu; Stancu, Stelian

    2007-01-01

    The transition period in Romania has generated a series of important changes, including the reform of Romanian tertiary education. This process accelerated after the signing of the Bologna treaty. Important changes were recorded in many of the quantitative aspects (such as the number of students enrolled, the pupil-student ratio, etc.) as…

  3. Quantitative methods for studying hemostasis in zebrafish larvae.

    PubMed

    Rost, M S; Grzegorski, S J; Shavit, J A

    2016-01-01

    Hemostasis is a coordinated system through which blood is prevented from exiting a closed circulatory system. We have taken advantage of the zebrafish, an emerging model for the study of blood coagulation, and describe three techniques for quantitative analysis of primary and secondary hemostasis. Collectively, these three techniques comprise a toolset to aid in our understanding of hemostasis and pathological clotting. PMID:27312499

  4. Four-Point Bending as a Method for Quantitatively Evaluating Spinal Arthrodesis in a Rat Model

    PubMed Central

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-01-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague–Dawley rat spines after single-level posterolateral fusion procedures at L4–L5. Segments were classified as ‘not fused,’ ‘restricted motion,’ or ‘fused’ by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4–L5 motion segment, and stiffness was measured as the slope of the moment–displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery. PMID:25730756
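
    The stiffness measurement described above reduces to a least-squares fit over the linear region of the moment-displacement curve; a minimal sketch with hypothetical, illustrative readings:

```python
import numpy as np

# Hypothetical four-point-bend readings for one explanted L4-L5 segment
# (displacement in mm, applied moment in N*mm); values are illustrative only.
displacement = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])
moment = np.array([0.0, 2.1, 4.0, 6.2, 8.1, 9.9])

# Stiffness = slope of the least-squares line through the linear region
slope, intercept = np.polyfit(displacement, moment, 1)
stiffness = slope   # N*mm per mm of displacement
```

    Comparing this scalar across treatment groups is what turns the binary palpation result into a graded, statistically testable outcome.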

  5. A new quantitative method for gunshot residue analysis by ion beam analysis.

    PubMed

    Christopher, Matthew E; Warmenhoeven, John-William; Romolo, Francesco S; Donghi, Matteo; Webb, Roger P; Jeynes, Christopher; Ward, Neil I; Kirkby, Karen J; Bailey, Melanie J

    2013-08-21

    Imaging and analyzing gunshot residue (GSR) particles using the scanning electron microscope equipped with an energy dispersive X-ray spectrometer (SEM-EDS) is a standard technique that can provide important forensic evidence, but the discrimination power of this technique is limited due to low sensitivity to trace elements and difficulties in obtaining quantitative results from small particles. A new, faster method using a scanning proton microbeam and Particle Induced X-ray Emission (μ-PIXE), together with Elastic Backscattering Spectrometry (EBS), is presented for the non-destructive, quantitative analysis of the elemental composition of single GSR particles. In this study, the GSR particles all contained Pb, Ba, and Sb. The precision of the method is assessed. The grouping behaviour of different makes of ammunition is determined using multivariate analysis. The protocol correctly groups the cartridges studied here, with a confidence >99%, irrespective of the firearm or population of particles selected. PMID:23775063

  6. Combining qualitative and quantitative methods in assessing hospital learning environments.

    PubMed

    Chan, D S

    2001-08-01

    Clinical education is a vital component in the curricula of pre-registration nursing courses and provides student nurses with the opportunity to combine cognitive, psychomotor, and affective skills. Clinical practice enables the student to develop competencies in the application of knowledge, skills, and attitudes to clinical field situations. It is, therefore, vital that the valuable clinical time be utilised effectively and productively. Nursing students' perceptions of the hospital learning environment were assessed by combining quantitative and qualitative approaches. The Clinical Learning Environment Inventory, based on the theoretical framework of learning environment studies, was developed and validated. The quantitative and qualitative findings reinforced each other. It was found that there were significant differences in students' perceptions of the actual clinical learning environment and their preferred learning environment. Generally, students preferred a more positive and favourable clinical environment than they perceived as being actually present. PMID:11470103

  7. New Fluorescence Microscopy Methods for Microbiology: Sharper, Faster, and Quantitative

    PubMed Central

    Gitai, Zemer

    2009-01-01

    Summary In addition to the inherent interest stemming from their ecological and human health impacts, microbes have many advantages as model organisms, including ease of growth and manipulation and relatively simple genomes. However, the imaging of bacteria via light microscopy has been limited by their small sizes. Recent advances in fluorescence microscopy that allow imaging of structures at extremely high resolutions are thus of particular interest to the modern microbiologist. In addition, advances in high-throughput microscopy and quantitative image analysis are enabling cellular imaging to finally take advantage of the full power of bacterial numbers and ease of manipulation. These technical developments are ushering in a new era of using fluorescence microscopy to understand bacterial systems in a detailed, comprehensive, and quantitative manner. PMID:19356974

  8. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
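
    As a rough illustration of the prediction sub-task, a minimal k-Nearest Neighbours regressor on synthetic spectra; the actual study pairs this with GA-based data reduction and also evaluates neural networks, so everything below (data shapes, distance metric) is an assumption for illustration.

```python
import numpy as np

def knn_predict_concentration(train_spectra, train_conc, query, k=3):
    """Predict analyte concentration for a query spectrum as the mean
    concentration of its k nearest training spectra (Euclidean distance).
    A minimal stand-in for the k-NN predictor described in the abstract."""
    d = np.linalg.norm(train_spectra - query, axis=1)
    nearest = np.argsort(d)[:k]
    return train_conc[nearest].mean()

# Toy "spectra": baseline intensity proportional to concentration plus
# small random noise, mimicking a concentration-dependent Raman response.
rng = np.random.default_rng(0)
conc = np.linspace(0.0, 1.0, 20)
spectra = np.outer(conc, np.ones(50)) + 0.01 * rng.standard_normal((20, 50))
pred = knn_predict_concentration(spectra, conc, spectra[10], k=3)
```

    With many more spectral attributes than samples, a distance-based predictor like this is exactly where the GA-selected attribute subset matters: it keeps the distances from being dominated by uninformative channels.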

  9. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To display the framework progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). The gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  10. [Application of calibration curve method and partial least squares regression analysis to quantitative analysis of nephrite samples using XRF].

    PubMed

    Liu, Song; Su, Bo-min; Li, Qing-hui; Gan, Fu-xi

    2015-01-01

    The authors tried to find a method for quantitative analysis using pXRF without solid bulk stone/jade reference samples. 24 nephrite samples were selected: 17 were calibration samples and the other 7 were test samples. All the nephrite samples were analyzed quantitatively by proton-induced X-ray emission spectroscopy (PIXE). Based on the PIXE results of the calibration samples, calibration curves were created for the components/elements of interest and used to analyze the test samples quantitatively; then, qualitative spectra of all nephrite samples were obtained by pXRF. According to the PIXE results and the qualitative spectra of the calibration samples, the partial least squares (PLS) method was used for quantitative analysis of the test samples. Finally, the results for the test samples obtained by the calibration curve method, the PLS method, and PIXE were compared to each other, and the accuracy of the calibration curve method and the PLS method was estimated. The results indicate that the PLS method is a viable alternative for quantitative analysis of stone/jade samples. PMID:25993858
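
    The calibration-curve half of the workflow can be sketched as a per-element linear fit of pXRF response against PIXE reference concentrations; all numbers below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical data: PIXE reference concentration (wt%) of one element in
# the calibration samples vs. its pXRF peak area (arbitrary counts).
pixe_conc = np.array([0.5, 1.0, 2.0, 3.5, 5.0, 7.5, 10.0])
xrf_counts = np.array([210., 400., 790., 1420., 2010., 2980., 4050.])

# Calibration curve: least-squares line mapping counts -> concentration
slope, intercept = np.polyfit(xrf_counts, pixe_conc, 1)

def quantify(counts):
    """Apply the calibration curve to a test-sample pXRF measurement."""
    return slope * counts + intercept

est = quantify(1600.0)   # concentration estimate for one test sample
```

    The PLS alternative replaces these one-element-at-a-time fits with a single multivariate regression from the full spectrum, which is why it can cope better with overlapping peaks.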

  11. Results of a Formal Methods Demonstration Project

    NASA Technical Reports Server (NTRS)

    Kelly, J.; Covington, R.; Hamilton, D.

    1994-01-01

    This paper describes the results of a cooperative study conducted by a team of researchers in formal methods at three NASA Centers to demonstrate FM techniques and to tailor them to critical NASA software systems. This pilot project applied FM to an existing critical software subsystem, the Shuttle's Jet Select subsystem (Phase I of an ongoing study). The present study shows that FM can be used successfully to uncover hidden issues in a highly critical and mature Functional Subsystem Software Requirements (FSSR) specification which are very difficult to discover by traditional means.

  12. Performance analysis of quantitative phase retrieval method in Zernike phase contrast X-ray microscopy

    NASA Astrophysics Data System (ADS)

    Heng, Chen; Kun, Gao; Da-Jiang, Wang; Li, Song; Zhi-Li, Wang

    2016-02-01

    Since the invention of the Zernike phase contrast method in 1930, it has been widely used in optical microscopy and, more recently, in X-ray microscopy. Because the image contrast is a mixture of absorption and phase information, we recently proposed and demonstrated a method for quantitative phase retrieval in Zernike phase contrast X-ray microscopy. In this contribution, we analyze the performance of this method at different photon energies. Intensity images of PMMA samples are simulated at 2.5 keV and 6.2 keV, respectively, and phase retrieval is performed using the proposed method. The results demonstrate that the proposed phase retrieval method is applicable over a wide energy range. For weakly absorbing features, the optimal photon energy is 2.5 keV, from the point of view of image contrast and accuracy of phase retrieval. On the other hand, in the case of strongly absorbing objects, a higher photon energy is preferred to reduce the error of phase retrieval. These results can be used as guidelines for quantitative phase retrieval in Zernike phase contrast X-ray microscopy with the proposed method. Supported by the State Key Project for Fundamental Research (2012CB825801), National Natural Science Foundation of China (11475170, 11205157 and 11179004) and Anhui Provincial Natural Science Foundation (1508085MA20).

  13. Quantitative electromechanical impedance method for nondestructive testing based on a piezoelectric bimorph cantilever

    NASA Astrophysics Data System (ADS)

    Fu, Ji; Tan, Chi; Li, Faxin

    2015-06-01

    The electromechanical impedance (EMI) method, which holds great promise in structural health monitoring (SHM), is usually treated as a qualitative method. In this work, we proposed a quantitative EMI method based on a piezoelectric bimorph cantilever using the sample’s local contact stiffness (LCS) as the identification parameter for nondestructive testing (NDT). Firstly, the equivalent circuit of the contact vibration system was established and the analytical relationship between the cantilever’s contact resonance frequency and the LCS was obtained. As the LCS is sensitive to typical defects such as voids and delamination, the proposed EMI method can then be used for NDT. To verify the equivalent circuit model, two piezoelectric bimorph cantilevers were fabricated and their free resonance frequencies were measured and compared with theoretical predictions. It was found that the stiff cantilever’s EMI can be well predicted by the equivalent circuit model while the soft cantilever’s cannot. Then, both cantilevers were assembled into a homemade NDT system using a three-axis motorized stage for LCS scanning. Testing results on a specimen with a prefabricated defect showed that the defect could be clearly reproduced in the LCS image, indicating the validity of the quantitative EMI method for NDT. It was found that the single-frequency mode of the EMI method can also be used for NDT, which is faster but not quantitative. Finally, several issues relating to the practical application of the NDT method were discussed. The proposed EMI-based NDT method offers a simple and rapid solution for damage evaluation in engineering structures and may also shed some light on EMI-based SHM.
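
    The idea of reading local contact stiffness (LCS) from a resonance shift can be illustrated with a lumped spring-mass model. This is a simplification of the paper's equivalent-circuit model, and all numbers below are made up for illustration.

```python
import math

def local_contact_stiffness(f_contact, f_free, k_cant):
    """Estimate local contact stiffness (N/m) from the cantilever's
    resonance shift, assuming a lumped spring-mass model:
        f_free    = (1/2pi) * sqrt(k_cant / m_eff)
        f_contact = (1/2pi) * sqrt((k_cant + k_lcs) / m_eff)
    """
    m_eff = k_cant / (2 * math.pi * f_free) ** 2      # effective mass
    return m_eff * (2 * math.pi * f_contact) ** 2 - k_cant

# Illustrative numbers: a 200 N/m bimorph whose resonance shifts
# from 1.0 kHz (free) to 1.5 kHz (in contact with the sample)
k_lcs = local_contact_stiffness(1500.0, 1000.0, 200.0)
```

    Scanning this estimate point by point over the specimen is what produces the LCS image in which voids and delamination appear as local stiffness drops.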

  14. A new method for quantitating total lesion glucose metabolic changes in serial tumor FDG PET studies

    SciTech Connect

    Wu, H.M.; Hoh, C.K.; Huang, S.C.; Phelps, M.E.

    1994-05-01

    Accurate quantitative FDG PET studies have the potential for important applications in clinical oncology for monitoring therapy-induced changes in tumor glycolytic rates. Due to a number of technical problems that complicate the use of quantitative PET tumor imaging, methods which can maximize the accuracy and precision of such measurements are advantageous. In this study, we developed and evaluated a method for reducing the errors caused by the conventional single-plane, single-ROI analysis in parametric images generated from pixel-by-pixel Patlak graphical analysis (PGA) in FDG PET studies of melanoma patients. We compared this new method to the conventional ROI method. The new processing method involves (1) generating the correlation coefficient (r) constrained Patlak parametric images from dynamic PET data; (2) summing all the planes which cover the lesion; (3) defining a single ROI which covers the whole lesion in the summed image and determining the total lesion glucose metabolic index (K_T, ml/min/lesion). Although only a single ROI was defined on the summed image, the glucose metabolic index obtained showed negligible difference (<1%) compared to those obtained from multiple ROIs on multiple planes of unconstrained parametric images. When the dynamic PET images were rotated and translated to simulate different patient positionings between scans at different times, the results obtained from the new method showed negligible difference (<2%). In summary, we present a simple but reliable method to quantitatively monitor the total lesion glucose metabolic changes during tumor growth. The method has several advantages over the conventional single-ROI, single-plane evaluation: (1) it is less sensitive to the ROI definition; (2) it has smaller intra- and inter-observer variations; and (3) it does not require image registration of serial scan data.
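
    The per-pixel Patlak analysis underlying the parametric images can be sketched on a synthetic time-activity curve; the kinetic values, input function, and frame times below are illustrative, not from the study.

```python
import numpy as np

# Build a tissue curve with a known uptake constant Ki, then recover Ki
# as the slope of the Patlak plot.
t = np.linspace(0.5, 60.0, 40)                       # frame mid-times (min)
cp = 10.0 * np.exp(-0.1 * t) + 1.0                   # plasma input (a.u.)
cum_cp = np.concatenate(([0.0],                      # running integral of cp
    np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t))))
Ki_true, V0 = 0.05, 0.3
ct = Ki_true * cum_cp + V0 * cp                      # tissue activity

# Patlak plot: ct/cp vs. cum_cp/cp is linear once equilibrium holds;
# the slope is the glucose metabolic index for this pixel.
x, y = cum_cp / cp, ct / cp
Ki_est, v_est = np.polyfit(x[20:], y[20:], 1)        # fit the late frames
```

    Summing the planes that cover the lesion and drawing one ROI simply aggregates these per-pixel slopes into the single total-lesion index K_T.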

  15. Quantitative Cardiac Perfusion: A Noninvasive Spin-labeling Method That Exploits Coronary Vessel Geometry

    PubMed Central

    Reeder, Scott B.; Atalay, Michael K.; McVeigh, Elliot R.; Zerhouni, Elias A.; Forder, John R.

    2007-01-01

    PURPOSE: To quantitate myocardial arterial perfusion with a noninvasive magnetic resonance (MR) imaging technique that exploits the geometry of coronary vessel anatomy. MATERIALS AND METHODS: MR imaging was performed with a spin-labeling method in six arrested rabbit hearts at 4.7 T. Selective inversion of magnetization in the short-axis imaging section along with all myocardium apical to that section produces signal enhancement from arterial perfusion. A linescan protocol was used for validation of flow enhancement. Flow was quantitated from two images and validated with spin-echo (SE) imaging. Regional perfusion defects were created by means of coronary artery ligation and delineated with gadolinium-enhanced imaging. RESULTS: Linescan estimates of T1 obtained at physiologic flows agreed with model predictions. Flow-induced signal enhancement measured on SE images also agreed with expected values. Finally, perfusion abnormalities created by means of coronary artery ligation were detected. CONCLUSION: This spin-labeling method provides quantitative estimates of myocardial arterial perfusion in this model and may hold promise for clinical applications. PMID:8657907

  16. Multipoint methods for linkage analysis of quantitative trait loci in sib pairs

    SciTech Connect

    Cardon, L.R.; Cherny, S.S.; Fulker, D.W.

    1994-09-01

    The sib-pair method of Haseman and Elston is widely used for linkage analysis of quantitative traits. The method requires no assumptions concerning the mode of transmission of the trait, it is robust with respect to genetic heterogeneity, and it is computationally efficient. However, the practical usefulness of the method is limited by its statistical power, requiring large numbers of sib pairs and highly informative markers to detect genetic loci of only moderate effect size. We have developed a family of interval mapping procedures which dramatically increase the statistical power of the classical sib-pair approach. The methods make use of information from pairs of markers which flank a putative quantitative trait locus (QTL) in order to estimate the location and effect size of the QTL. Here we describe an extension of the interval mapping procedure which takes into account all available marker information on a chromosome simultaneously, rather than just pairs of markers. The method provides a computationally fast approximation to full multipoint analysis of sib-pair data using a modified Haseman-Elston approach. It gives very similar results to the earlier interval mapping procedure when marker information is relatively uniform and a coarse map is used. However, there is a substantial improvement over the original method when markers differ in information content and when a dense map is employed. The method is illustrated using real and simulated sib-pair data.
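
    The classical Haseman-Elston step that these interval-mapping extensions build on can be sketched on simulated data; the IBD proportions, effect size, and noise model below are invented for illustration.

```python
import numpy as np

# Haseman-Elston regression: the squared trait difference of each sib
# pair is regressed on the proportion of alleles shared identical-by-
# descent (IBD) at a marker. A negative slope suggests linkage: pairs
# sharing more alleles near a QTL tend to be more phenotypically alike.
rng = np.random.default_rng(1)
n = 500
ibd = rng.choice([0.0, 0.5, 1.0], size=n, p=[0.25, 0.5, 0.25])

# Simulated squared trait differences that shrink with IBD sharing
sq_diff = 2.0 - 1.2 * ibd + rng.exponential(1.0, size=n)
slope, intercept = np.polyfit(ibd, sq_diff, 1)
```

    The multipoint refinement described in the abstract effectively replaces the single-marker IBD estimate with one combining all markers on the chromosome, which is where the power gain comes from.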

  17. Automatic segmentation method of striatum regions in quantitative susceptibility mapping images

    NASA Astrophysics Data System (ADS)

    Murakawa, Saki; Uchiyama, Yoshikazu; Hirai, Toshinori

    2015-03-01

    Abnormal accumulation of brain iron has been detected in various neurodegenerative diseases. Quantitative susceptibility mapping (QSM) is a novel contrast mechanism in magnetic resonance (MR) imaging and enables the quantitative analysis of local tissue susceptibility. Therefore, automatic segmentation tools for brain regions on QSM images would help radiologists perform quantitative analysis in various neurodegenerative diseases. The purpose of this study was to develop an automatic segmentation and classification method for striatum regions on QSM images. Our image database consisted of 22 QSM images obtained from healthy volunteers. These images were acquired on a 3.0 T MR scanner. The voxel size was 0.9×0.9×2 mm. The matrix size of each slice image was 256×256 pixels. In our computerized method, a template matching technique was first used to detect a slice image containing striatum regions. An image registration technique was subsequently employed to classify striatum regions in consideration of the anatomical knowledge. After the image registration, the voxels in the target image which correspond with striatum regions in the reference image were classified into three striatum regions, i.e., head of the caudate nucleus, putamen, and globus pallidus. The experimental results indicated that 100% (21/21) of the slice images containing striatum regions were detected accurately. The subjective evaluation of the classification results indicated that 20 (95.2%) of 21 showed good or adequate quality. Our computerized method would be useful for the quantitative analysis of Parkinson disease in QSM images.
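
    The slice-detection step can be illustrated with a minimal normalized cross-correlation template match. This is a stand-in for the paper's template technique, and the subsequent registration to a labeled reference is omitted; the toy volume below is random data.

```python
import numpy as np

def best_match_slice(volume, template):
    """Return the index of the slice most similar to the template,
    scored by normalized cross-correlation (z-scored dot product)."""
    tz = (template - template.mean()) / template.std()
    scores = []
    for s in volume:
        sz = (s - s.mean()) / s.std()
        scores.append((tz * sz).mean())     # 1.0 for a perfect match
    return int(np.argmax(scores))

# Toy volume: 5 random 32x32 slices; slice 3 doubles as the template,
# so the detector should pick index 3.
rng = np.random.default_rng(2)
vol = rng.standard_normal((5, 32, 32))
tmpl = vol[3].copy()
idx = best_match_slice(vol, tmpl)
```

    In the real pipeline the template is an atlas slice rather than a copy of the data, so the peak score is well below 1.0 but still clearly separated from non-striatum slices.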

  18. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. We evaluated this

  19. Development and evaluation of an improved quantitative 90Y bremsstrahlung SPECT method

    PubMed Central

    Rong, Xing; Du, Yong; Ljungberg, Michael; Rault, Erwann; Vandenberghe, Stefaan; Frey, Eric C.

    2012-01-01

    Purpose: Yttrium-90 (90Y) is one of the most commonly used radionuclides in targeted radionuclide therapy (TRT). Since it decays with essentially no gamma photon emissions, surrogate radionuclides (e.g., 111In) or imaging agents (e.g., 99mTc MAA) are typically used for treatment planning. It would, however, be useful to image 90Y directly in order to confirm that the distributions measured with these other radionuclides or agents are the same as for the 90Y labeled agents. As a result, there has been a great deal of interest in quantitative imaging of 90Y bremsstrahlung photons using single photon emission computed tomography (SPECT) imaging. The continuous and broad energy distribution of bremsstrahlung photons, however, imposes substantial challenges on accurate quantification of the activity distribution. The aim of this work was to develop and evaluate an improved quantitative 90Y bremsstrahlung SPECT reconstruction method appropriate for these imaging applications. Methods: Accurate modeling of image degrading factors such as object attenuation and scatter and the collimator-detector response is essential to obtain quantitatively accurate images. All of the image degrading factors are energy dependent. Thus, the authors separated the modeling of the bremsstrahlung photons into multiple categories and energy ranges. To improve the accuracy, the authors used a bremsstrahlung energy spectrum previously estimated from experimental measurements and incorporated a model of the distance between 90Y decay location and bremsstrahlung emission location into the SIMIND code used to generate the response functions and kernels used in the model. This improved Monte Carlo bremsstrahlung simulation was validated by comparison to experimentally measured projection data of a 90Y line source. The authors validated the accuracy of the forward projection model for photons in the various categories and energy ranges using the validated Monte Carlo (MC) simulation method. The

  20. The expected results method for data verification

    NASA Astrophysics Data System (ADS)

    Monday, Paul

    2016-05-01

    The credibility of United States Army analytical experiments using distributed simulation depends on the quality of the simulation, the pedigree of the input data, and the appropriateness of the simulation system to the problem. The second of these factors is best met by using classified performance data from the Army Materiel Systems Analysis Activity (AMSAA) for essential battlefield behaviors, like sensors, weapon fire, and damage assessment. Until recently, using classified data has been a time-consuming and expensive endeavor: it requires significant technical expertise to load, and it is difficult to verify that it works correctly. Fortunately, new capabilities, tools, and processes are available that greatly reduce these costs. This paper will discuss these developments, a new method to verify that all of the components are configured and operate properly, and the application to recent Army Capabilities Integration Center (ARCIC) experiments. Recent developments have focused on improving the process of loading the data. OneSAF has redesigned its input data file formats and structures so that they correspond exactly with the Standard File Format (SFF) defined by AMSAA, ARCIC developed a library of supporting configurations that correlate directly to the AMSAA nomenclature, and the Entity Validation Tool was designed to quickly execute the essential models with a test-jig approach to identify problems with the loaded data. The missing piece of the process is provided by the new Expected Results Method. Instead of the usual subjective assessment of quality, e.g., "It looks about right to me", this new approach compares the performance of a combat model with authoritative expectations to quickly verify that the model, data, and simulation are all working correctly. Integrated together, these developments now make it possible to use AMSAA classified performance data with minimal time and maximum assurance that the experiment's analytical results will be of the highest

  1. Development of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    PubMed

    Oguchi, Taichi; Onishi, Mari; Minegishi, Yasutaka; Kurosawa, Yasunori; Kasahara, Masaki; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2009-06-01

    A duplex real-time PCR method was developed for quantitative screening analysis of GM maize. The duplex real-time PCR simultaneously detected two GM-specific segments, namely the cauliflower mosaic virus (CaMV) 35S promoter (P35S) segment and an event-specific segment for GA21 maize, which does not contain P35S. Calibration was performed with a plasmid calibrant specially designed for the duplex PCR. The result of an in-house evaluation suggested that the analytical precision of the developed method was almost equivalent to that of simplex real-time PCR methods, which have been adopted as ISO standard methods for the analysis of GMOs in foodstuffs and have also been employed for the analysis of GMOs in Japan. In addition, this method will reduce both the cost and time requirement of routine GMO analysis by half. Given the high analytical performance demonstrated in the current study, we believe the developed method will be useful for practical screening analysis of GM maize, although interlaboratory collaborative studies should be conducted to confirm this. PMID:19602858
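
    Although the paper's duplex calibrant design is specific, the generic standard-curve quantification such methods rest on can be sketched as follows; the Cq values of the dilution series below are invented for illustration.

```python
import numpy as np

# Dilution series of a plasmid calibrant: quantification cycle (Cq)
# regressed against log10 copy number.
log_copies = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
cq = np.array([33.1, 29.8, 26.4, 23.1, 19.8])      # illustrative Cq values

slope, intercept = np.polyfit(log_copies, cq, 1)    # slope ~ -3.3 if ideal
efficiency = 10.0 ** (-1.0 / slope) - 1.0           # 1.0 means doubling/cycle

def copies_from_cq(c):
    """Invert the standard curve to estimate copies in an unknown sample."""
    return 10.0 ** ((c - intercept) / slope)
```

    In the duplex setting, one such curve per target (P35S and the GA21 event segment) is fitted from the same calibrant plasmid, which is what keeps the two quantifications on a common scale.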

  2. A new method for robust quantitative and qualitative analysis of real-time PCR

    PubMed Central

    Shain, Eric B.; Clemens, John M.

    2008-01-01

    An automated data analysis method for real-time PCR needs to exhibit robustness to the factors that routinely impact the measurement and analysis of real-time PCR data. Robust analysis is paramount to providing the same interpretation for results regardless of the skill of the operator performing or reviewing the work. We present a new method for analysis of real-time PCR data, the maxRatio method, which identifies a consistent point within or very near the exponential region of the PCR signal without requiring user intervention. Compared to other analytical techniques that generate only a cycle number, maxRatio generates several measurements of amplification including cycle numbers and relative measures of amplification efficiency and curve shape. By using these values, the maxRatio method can make highly reliable reactive/nonreactive determination along with quantitative evaluation. Application of the maxRatio method to the analysis of quantitative and qualitative real-time PCR assays is shown along with examples of method robustness to, and detection of, amplification response anomalies. PMID:18603594
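
    The core of the maxRatio idea, locating the cycle where the ratio of consecutive fluorescence readings peaks, can be sketched as below. This is a simplification: the published method also interpolates and derives additional efficiency and curve-shape metrics, and the sigmoid curve here is synthetic.

```python
import numpy as np

def max_ratio_point(fluorescence):
    """Return (cycle, ratio) at the maximum ratio of consecutive
    fluorescence readings -- a point within or near the exponential
    region of the amplification curve."""
    f = np.asarray(fluorescence, dtype=float)
    ratios = f[1:] / f[:-1]
    i = int(np.argmax(ratios))
    return i + 1, float(ratios[i])

# Synthetic sigmoid amplification curve over 40 cycles, baseline ~1.0
cycles = np.arange(40)
curve = 1.0 + 100.0 / (1.0 + np.exp(-(cycles - 25) / 2.0))
cycle_at_max, mr = max_ratio_point(curve)
```

    Because the ratio peaks in the exponential upswing regardless of plateau height, the resulting cycle number is largely insensitive to baseline and scaling, which is the robustness property the abstract emphasizes.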

  3. An Augmented Classical Least Squares Method for Quantitative Raman Spectral Analysis against Component Information Loss

    PubMed Central

    Zhou, Yan; Cao, Hui

    2013-01-01

    We propose an augmented classical least squares (ACLS) calibration method for quantitative Raman spectral analysis against component information loss. The Raman spectral signals with low analyte concentration correlations were selected and used as the substitutes for unknown quantitative component information during the CLS calibration procedure. The number of selected signals was determined by using the leave-one-out root-mean-square error of cross-validation (RMSECV) curve. An ACLS model was built based on the augmented concentration matrix and the reference spectral signal matrix. The proposed method was compared with partial least squares (PLS) and principal component regression (PCR) using one example: a data set recorded from an experiment of analyte concentration determination using Raman spectroscopy. A 2-fold cross-validation with Venetian blinds strategy was exploited to evaluate the predictive power of the proposed method. The one-way variance analysis (ANOVA) was used to access the predictive power difference between the proposed method and existing methods. Results indicated that the proposed method is effective at increasing the robust predictive power of traditional CLS model against component information loss and its predictive power is comparable to that of PLS or PCR. PMID:23956689
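
    For contrast with the augmented variant, plain classical least squares can be sketched on synthetic Beer's-law data; ACLS additionally augments the concentration matrix with selected spectral signals, which this sketch omits, and all data below are random.

```python
import numpy as np

# CLS assumes mixture spectra are linear combinations of pure-component
# spectra: S = C @ K. Calibration estimates K from known concentrations;
# prediction inverts the model for an unknown spectrum.
rng = np.random.default_rng(3)
K_true = rng.random((2, 30))                 # pure-component spectra (2 x 30)
C_train = rng.random((12, 2))                # known concentrations (12 x 2)
S_train = C_train @ K_true                   # noise-free mixture spectra

K_est = np.linalg.lstsq(C_train, S_train, rcond=None)[0]   # calibration
c_new = np.array([0.3, 0.7])
s_new = c_new @ K_true                       # spectrum of an "unknown"
c_pred = np.linalg.lstsq(K_est.T, s_new, rcond=None)[0]    # prediction
```

    The failure mode ACLS targets is visible here: if a third component contributed to `S_train` but had no column in `C_train`, `K_est` would absorb its signal and `c_pred` would be biased, and augmenting `C_train` with low-correlation spectral signals is the paper's remedy.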

  4. Methods for quantitative determination of drug localized in the skin.

    PubMed

    Touitou, E; Meidan, V M; Horwitz, E

    1998-12-01

    The quantification of drugs within the skin is essential for topical and transdermal delivery research. Over the last two decades, horizontal sectioning, consisting of both tape stripping and parallel slicing through the deeper tissues, has constituted the traditional investigative technique. In recent years, this methodology has been augmented by such procedures as heat separation, qualitative autoradiography, isolation of the pilosebaceous units and the use of induced follicle-free skin. The development of skin quantitative autoradiography represents an entirely novel approach which permits quantification and visualization of the penetrant throughout a vertical cross-section of skin. Noninvasive strategies involve the application of optical measuring systems such as attenuated total reflectance Fourier transform infrared, fluorescence, remittance or photothermal spectroscopies. PMID:9801425

  5. Quantitative estimation of poikilocytosis by the coherent optical method

    NASA Astrophysics Data System (ADS)

    Safonova, Larisa P.; Samorodov, Andrey V.; Spiridonov, Igor N.

    2000-05-01

    Investigation of the necessity and required reliability of poikilocytosis determination in hematology has shown that existing techniques suffer from serious shortcomings. To determine the deviation of erythrocyte shape from the normal (rounded) one in blood smears, it is expedient to use an integrative estimate. An algorithm is suggested that is based on the correlation between erythrocyte morphological parameters and properties of the spatial-frequency spectrum of the blood smear. Analytical and experimental research led to an integrative form parameter (IFP) that characterizes both the increase of the relative concentration of cells with changed form above 5% and the predominant type of poikilocytes. An algorithm for statistically reliable estimation of the IFP on standard stained blood smears has been developed. To provide quantitative characterization of the morphological features of cells, a form vector has been proposed, and its validity for poikilocyte differentiation was shown.

  6. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    SciTech Connect

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired.

  7. Quantitative CT for volumetric analysis of medical images: initial results for liver tumors

    NASA Astrophysics Data System (ADS)

    Behnaz, Alexander S.; Snider, James; Chibuzor, Eneh; Esposito, Giuseppe; Wilson, Emmanuel; Yaniv, Ziv; Cohen, Emil; Cleary, Kevin

    2010-03-01

    Quantitative CT for volumetric analysis of medical images is increasingly being proposed for monitoring patient response during chemotherapy trials. An integrated MATLAB GUI has been developed for an oncology trial at Georgetown University Hospital. This GUI allows for the calculation and visualization of the volume of a lesion. The GUI provides an estimate of the volume of the tumor using a semi-automatic segmentation technique. This software package features a fixed parameter adaptive filter from the ITK toolkit and a tumor segmentation algorithm to reduce inter-user variability and to facilitate rapid volume measurements. The system also displays a 3D rendering of the segmented tumor, allowing the end user to have not only a quantitative measure of the tumor volume, but a qualitative view as well. As an initial validation test, several clinical cases were hand-segmented, and then compared against the results from the tool, showing good agreement.

  8. A method for three-dimensional quantitative observation of the microstructure of biological samples

    NASA Astrophysics Data System (ADS)

    Wang, Pengfei; Chen, Dieyan; Ma, Wanyun; Wu, Hongxin; Ji, Liang; Sun, Jialin; Lv, Danyu; Zhang, Lu; Li, Ying; Tian, Ning; Zheng, Jinggao; Zhao, Fengying

    2009-07-01

    Contemporary biology has entered the era of cell biology and molecular biology, and researchers now study the mechanisms of many kinds of biological phenomena at the microscopic level. Accurate description of the microstructure of biological samples is an exigent need in many biomedical experiments. This paper introduces a method for 3-dimensional quantitative observation of the microstructure of vital biological samples based on two-photon laser scanning microscopy (TPLSM). TPLSM is a novel kind of fluorescence microscopy that excels in low optical damage, high resolution, deep penetration depth and suitability for 3-dimensional (3D) imaging. Fluorescently stained samples were observed by TPLSM, and their original shapes were then obtained through 3D image reconstruction. The spatial distribution of all objects in the samples, as well as their volumes, could be derived by image segmentation and mathematical calculation. Thus the 3-dimensionally and quantitatively depicted microstructure of the samples was finally derived. We applied this method to quantitative analysis of the spatial distribution of chromosomes in meiotic mouse oocytes at metaphase, with excellent results.

  9. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test.

    PubMed

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G

    2015-12-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay, which is currently considered the "gold standard" for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  11. Optimization of Quantitative PCR Methods for Enteropathogen Detection

    PubMed Central

    Liu, Jie; Gratz, Jean; Amour, Caroline; Nshama, Rosemary; Walongo, Thomas; Maro, Athanasia; Mduma, Esto; Platts-Mills, James; Boisen, Nadia; Nataro, James; Haverstick, Doris M.; Kabir, Furqan; Lertsethtakarn, Paphavee; Silapong, Sasikorn; Jeamwattanalert, Pimmada; Bodhidatta, Ladaporn; Mason, Carl; Begum, Sharmin; Haque, Rashidul; Praharaj, Ira; Kang, Gagandeep; Houpt, Eric R.

    2016-01-01

    Detection and quantification of enteropathogens in stool specimens is useful for diagnosing the cause of diarrhea but is technically challenging. Here we evaluate several important determinants of quantification: specimen collection, nucleic acid extraction, and extraction and amplification efficiency. First, we evaluate the molecular detection and quantification of pathogens in rectal swabs versus stool, using paired flocked rectal swabs and whole stool collected from 129 children hospitalized with diarrhea in Tanzania. Swabs generally yielded a higher quantification cycle (Cq) (29.7 ± 3.5 vs. 25.3 ± 2.9 for stool, P < 0.001) but were still able to detect 80% of pathogens with a Cq < 30 in stool. Second, a simplified total nucleic acid (TNA) extraction procedure was compared to separate DNA and RNA extractions and showed 92% (318/344) sensitivity and 98% (951/968) specificity, with no difference in Cq value for the positive results (ΔCq(DNA+RNA-TNA) = -0.01 ± 1.17, P = 0.972, N = 318). Third, we devised a quantification scheme that adjusts pathogen quantity to the specimen’s extraction and amplification efficiency, and show that this better estimates the quantity of spiked specimens than the raw target Cq. In sum, these methods for enteropathogen quantification, stool sample collection, and nucleic acid extraction will be useful for laboratories studying enteric disease. PMID:27336160
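    The efficiency-adjustment idea in the third step can be illustrated with a hedged sketch: an external spiked control that amplifies later than expected reveals a specimen-specific efficiency loss, and the same shift is removed from the target Cq. The function and the numbers are hypothetical; the published scheme may differ in detail.

```python
def adjusted_cq(target_cq, control_cq, control_expected_cq):
    """Adjust a pathogen Cq for specimen-specific extraction and
    amplification efficiency using an external spiked control.

    If the spiked control amplifies `shift` cycles later than
    expected, that shift is subtracted from the target Cq.
    Illustrative sketch only.
    """
    shift = control_cq - control_expected_cq
    return target_cq - shift

# A control expected at Cq 27 but observed at 29 indicates a 2-cycle
# efficiency loss, so a target observed at Cq 31 adjusts to 29.
```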

  12. Visual Display of Scientific Studies, Methods, and Results

    NASA Astrophysics Data System (ADS)

    Saltus, R. W.; Fedi, M.

    2015-12-01

    The need for efficient and effective communication of scientific ideas becomes more urgent each year. A growing number of societal and economic issues are tied to matters of science - e.g., climate change, natural resource availability, and public health. Societal and political debate should be grounded in a general understanding of scientific work in relevant fields. It is difficult for many participants in these debates to access science directly because the formal method for scientific documentation and dissemination is the journal paper, generally written for a highly technical and specialized audience. Journal papers are very effective and important for documentation of scientific results and are essential to the requirements of science to produce citable and repeatable results. However, journal papers are not effective at providing a quick and intuitive summary useful for public debate. Just as quantitative data are generally best viewed in graphic form, we propose that scientific studies also can benefit from visual summary and display. We explore the use of existing methods for diagramming logical connections and dependencies, such as Venn diagrams, mind maps, flow charts, etc., for rapidly and intuitively communicating the methods and results of scientific studies. We also discuss a method, introduced last year at AGU, specifically tailored to summarizing scientific papers. Our method diagrams the relative importance and connections between data, methods/models, results/ideas, and implications/importance using a single-page format with connected elements in these four categories. Within each category (e.g., data) the spatial location of individual elements (e.g., seismic, topographic, gravity) indicates relative novelty (e.g., are these new data?) and importance (e.g., how critical are these data to the results of the paper?). The goal is to find ways to rapidly and intuitively share both the results and the process of science, both for communication within the scientific community and with the broader public.

  13. MODIS Radiometric Calibration Program, Methods and Results

    NASA Technical Reports Server (NTRS)

    Xiong, Xiaoxiong; Guenther, Bruce; Angal, Amit; Barnes, William; Salomonson, Vincent; Sun, Junqiang; Wenny, Brian

    2012-01-01

    As a key instrument for NASA's Earth Observing System (EOS), the Moderate Resolution Imaging Spectroradiometer (MODIS) has made significant contributions to the remote sensing community with its unprecedented amount of data products continuously generated from its observations and freely distributed to users worldwide. MODIS observations, covering spectral regions from visible (VIS) to long-wave infrared (LWIR), have enabled a broad range of research activities and applications for studies of the Earth's interactive system of land, oceans, and atmosphere. In addition to extensive pre-launch measurements, developed to characterize sensor performance, MODIS carries a set of on-board calibrators (OBC) that can be used to track on-orbit changes of various sensor characteristics. Most importantly, dedicated and continuous calibration efforts have been made to maintain sensor data quality. This paper provides an overview of the MODIS calibration program, on-orbit calibration activities, methods, and performance. Key calibration results and lessons learned from the MODIS calibration effort are also presented in this paper.

  14. Full quantitative phase analysis of hydrated lime using the Rietveld method

    SciTech Connect

    Lassinantti Gualtieri, Magdalena

    2012-09-15

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2-15 wt.%.

  15. Quantitative research on the primary process: method and findings.

    PubMed

    Holt, Robert R

    2002-01-01

    Freud always defined the primary process metapsychologically, but he described the ways it shows up in dreams, parapraxes, jokes, and symptoms with enough observational detail to make it possible to create an objective, reliable scoring system to measure its manifestations in Rorschach responses, dreams, TAT stories, free associations, and other verbal texts. That system can identify signs of the thinker's efforts, adaptive or maladaptive, to control or defend against the emergence of primary process. A prerequisite and a consequence of the research that used this system was clarification and elaboration of the psychoanalytic theory of thinking. Results of empirical tests of several propositions derived from psychoanalytic theory are summarized. Predictions concerning the method's most useful index, of adaptive vs. maladaptive regression, have been repeatedly verified: People who score high on this index (who are able to produce well-controlled "primary products" in their Rorschach responses), as compared to those who score at the maladaptive pole (producing primary-process-filled responses with poor reality testing, anxiety, and pathological defensive efforts), are better able to tolerate sensory deprivation, are more able to enter special states of consciousness comfortably (drug-induced, hypnotic, etc.), and have higher achievements in artistic creativity, while schizophrenics tend to score at the extreme of maladaptive regression. Capacity for adaptive regression also predicts success in psychotherapy, and rises with the degree of improvement after both psychotherapy and drug treatment. Some predictive failures have been theoretically interesting: Kris's hypothesis about creativity and the controlled use of primary process holds for males but usually not for females. This body of work is presented as a refutation of charges, brought by such critics as Crews, that psychoanalysis cannot become a science. PMID:12206540

  16. Semi-quantitative method to estimate levels of Campylobacter

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Introduction: Research projects utilizing live animals and/or systems often require reliable, accurate quantification of Campylobacter following treatments. Even with marker strains, conventional methods designed to quantify are labor and material intensive requiring either serial dilutions or MPN ...

  17. Reconstruction-classification method for quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Malone, Emma; Powell, Samuel; Cox, Ben T.; Arridge, Simon

    2015-12-01

    We propose a combined reconstruction-classification method for simultaneously recovering absorption and scattering in turbid media from images of absorbed optical energy. This method exploits knowledge that optical parameters are determined by a limited number of classes to iteratively improve their estimate. Numerical experiments show that the proposed approach allows for accurate recovery of absorption and scattering in two and three dimensions, and delivers superior image quality with respect to traditional reconstruction-only approaches.

  18. A method and fortran program for quantitative sampling in paleontology

    USGS Publications Warehouse

    Tipper, J.C.

    1976-01-01

    The Unit Sampling Method is a binomial sampling method applicable to the study of fauna preserved in rocks too well cemented to be disaggregated. Preliminary estimates of the probability of detecting each group in a single sampling unit can be converted to estimates of the group's volumetric abundance by means of correction curves obtained by a computer simulation technique. This paper describes the technique and gives the FORTRAN program. © 1976.
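    A correction curve of the kind described can be illustrated by Monte Carlo simulation under a toy model: each sampling unit is treated as a fixed number of grains, each independently belonging to the group with probability equal to its volumetric abundance. Real correction curves depend on grain size and exposure geometry not modeled here, so this is only a sketch of the simulation idea, not the paper's FORTRAN program.

```python
import numpy as np

rng = np.random.default_rng(1)

def detection_prob(v, grains_per_unit=100, n_trials=10000):
    """Monte Carlo estimate of the probability that a group with
    volumetric abundance v is seen at least once in a sampling unit
    of `grains_per_unit` grains. Toy model for illustration."""
    counts = rng.binomial(grains_per_unit, v, size=n_trials)
    return (counts > 0).mean()

# Correction curve: volumetric abundance -> detection probability.
# Inverting this mapping converts observed detection rates back
# into abundance estimates.
curve = {v: detection_prob(v) for v in (0.001, 0.01, 0.05, 0.1)}
```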

  19. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences in the different caries lesions focused on the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of wavebands at 565-750 nm to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems.
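    The reported grading thresholds on the image component ratio R/(G + B) translate directly into a small classifier. The function name and the handling of values exactly at the cut points are assumptions; the thresholds themselves are those stated in the abstract.

```python
def grade_caries(r, g, b):
    """Grade tooth tissue from fluorescence image color components
    using the R/(G + B) ratio thresholds reported in the study:
    <0.66 sound, 0.66-1.06 early decay, 1.06-1.62 established
    decay, >1.62 severe decay."""
    ratio = r / (g + b)
    if ratio < 0.66:
        return "sound"
    elif ratio < 1.06:
        return "early decay"
    elif ratio <= 1.62:
        return "established decay"
    return "severe decay"
```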

  20. Limitations of the ferrozine method for quantitative assay of mineral systems for ferrous and total iron

    NASA Astrophysics Data System (ADS)

    Anastácio, Alexandre S.; Harris, Brittany; Yoo, Hae-In; Fabris, José Domingos; Stucki, Joseph W.

    2008-10-01

    The quantitative assay of clay minerals, soils, and sediments for Fe(II) and total Fe is fundamental to understanding biogeochemical cycles occurring therein. The commonly used ferrozine method was originally designed to assay extracted forms of Fe(II) from non-silicate aqueous systems. It is becoming, however, increasingly the method of choice to report the total reduced state of Fe in soils and sediments. Because Fe in soils and sediments commonly exists in the structural framework of silicates, extraction by HCl, as used in the ferrozine method, fails to dissolve all of the Fe. The phenanthroline (phen) method, on the other hand, was designed to assay silicate minerals for Fe(II) and total Fe and has been proven to be highly reliable. In the present study potential sources of error in the ferrozine method were evaluated by comparing its results to those obtained by the phen method. Both methods were used to analyze clay mineral and soil samples for Fe(II) and total Fe. Results revealed that the conventional ferrozine method under reports total Fe in samples containing Fe in silicates and gives erratic results for Fe(II). The sources of error in the ferrozine method are: (1) HCl fails to dissolve silicates and (2) if the analyte solution contains Fe 3+, the analysis for Fe 2+ will be photosensitive, and reported Fe(II) values will likely be greater than the actual amount in solution. Another difficulty with the ferrozine method is that it is tedious and much more labor intensive than the phen method. For these reasons, the phen method is preferred and recommended. Its procedure is simpler, takes less time, and avoids the errors found in the ferrozine method.

  1. Comparative evaluation of two quantitative precipitation estimation methods in Korea

    NASA Astrophysics Data System (ADS)

    Ko, H.; Nam, K.; Jung, H.

    2013-12-01

    The spatial distribution and intensity of rainfall are necessary inputs for hydrological models, particularly grid-based distributed models. Weather radar offers much higher spatial resolution (1 km x 1 km) than rain gauges (~13 km), although radar measures rainfall indirectly while rain gauges observe it directly. Radar also provides areal, gridded rainfall information, whereas rain gauges provide point data. Radar rainfall data can therefore be useful as input to hydrological models. In this study, we compared two QPE schemes for producing radar rainfall for hydrological use: 1) spatial adjustment and 2) real-time Z-R relationship adjustment (hereafter RAR; Radar-AWS Rain rate). We computed and analyzed statistics such as the ME (mean error), RMSE (root mean square error), and correlation using a cross-validation method (here, the leave-one-out method).
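    The verification statistics named above (ME, RMSE, and correlation between radar estimates and gauge observations) can be computed as in this sketch; the leave-one-out pairing of radar pixels with withheld gauges is omitted for brevity, and the function name is an assumption.

```python
import numpy as np

def verification_stats(radar, gauge):
    """Mean error, root mean square error, and Pearson correlation
    between co-located radar QPE values and rain-gauge observations."""
    radar = np.asarray(radar, dtype=float)
    gauge = np.asarray(gauge, dtype=float)
    err = radar - gauge
    me = err.mean()                       # bias of the radar estimate
    rmse = np.sqrt((err ** 2).mean())     # typical magnitude of error
    corr = np.corrcoef(radar, gauge)[0, 1]
    return me, rmse, corr

# Example: radar overestimates every gauge by exactly 1 mm
me, rmse, corr = verification_stats([2, 3, 4, 5], [1, 2, 3, 4])
```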

  2. Quantitative Decomposition of Dynamics of Mathematical Cell Models: Method and Application to Ventricular Myocyte Models

    PubMed Central

    Shimayoshi, Takao; Cha, Chae Young; Amano, Akira

    2015-01-01

    Mathematical cell models are effective tools to understand cellular physiological functions precisely. For detailed analysis of model dynamics in order to investigate how much each component affects cellular behaviour, mathematical approaches are essential. This article presents a numerical analysis technique, which is applicable to any complicated cell model formulated as a system of ordinary differential equations, to quantitatively evaluate contributions of respective model components to the model dynamics in the intact situation. The present technique employs a novel mathematical index for decomposed dynamics with respect to each differential variable, along with a concept named instantaneous equilibrium point, which represents the trend of a model variable at some instant. This article also illustrates applications of the method to comprehensive myocardial cell models for analysing insights into the mechanisms of action potential generation and calcium transient. The analysis results exhibit quantitative contributions of individual channel gating mechanisms and ion exchanger activities to membrane repolarization, and of calcium fluxes and buffers to the rise and fall of the cytosolic calcium level. These analyses quantitatively explicate the principles of the model, which leads to a better understanding of cellular dynamics. PMID:26091413

  3. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  4. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1981-02-25

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules or ions.

  5. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, Edward F.; Keller, Richard A.; Apel, Charles T.

    1983-01-01

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions.

  6. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1983-09-06

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions. 6 figs.

  7. Selection methods in forage breeding: a quantitative appraisal

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Forage breeding can be extraordinarily complex because of the number of species, perenniality, mode of reproduction, mating system, and the genetic correlation for some traits evaluated in spaced plants vs. performance under cultivation. Aiming to compare eight forage breeding methods for direct sel...

  8. Quantitative trait locus gene mapping: a new method for locating alcohol response genes.

    PubMed

    Crabbe, J C

    1996-01-01

    Alcoholism is a multigenic trait with important non-genetic determinants. Studies with genetic animal models of susceptibility to several of alcohol's effects suggest that several genes contributing modest effects on susceptibility (Quantitative Trait Loci, or QTLs) are important. A new technique of QTL gene mapping has allowed the identification of the location in the mouse genome of several such QTLs. The method is described, and the locations of QTLs affecting the acute alcohol withdrawal reaction are presented as an example. Verification of these QTLs in ancillary studies is described, and the strengths, limitations, and future directions to be pursued are discussed. QTL mapping is a promising method for identifying genes in rodents with the hope of directly extrapolating the results to the human genome. This review is based on a paper presented at the First International Congress of the Latin American Society for Biomedical Research on Alcoholism, Santiago, Chile, November 1994. PMID:12893462

  9. [Study on quantitative methods of cleistocalycis operculati cortex].

    PubMed

    Chen, Li-Si; Ou, Jia-Ju; Li, Shu-Yuan; Lu, Song-Gui

    2014-08-01

    Cleistocalycis Operculati Cortex is the dried bark of Cleistocalyx operculatus. It is the raw material of Compound Hibiscuse, an external sterilizing and antipruritic preparation. The quality standard for Cleistocalycis Operculati Cortex in the Guangdong Province "Standard for the Traditional Chinese Medicine" (second volume) contains only TLC identification, which cannot effectively monitor and control the quality of the material. A reversed-phase HPLC method was established for the first time for the determination of 3,3'-O-dimethylellagic acid from Cleistocalycis Operculati Cortex, with the content calculated by the external standard method. Under the selected chromatographic conditions, the target component peak was effectively separated from neighboring peaks. The 3,3'-O-dimethylellagic acid standard solution showed a good linear relationship at concentrations of 1.00 - 25.0 mg x L(-1); the standard curve was Y = 77.33X + 7.904, r = 0.999 5. The average recovery was 101.0%, with an RSD of 1.3%. The HPLC method for the determination of 3,3'-O-dimethylellagic acid in Cleistocalycis Operculati Cortex is accurate and reliable, and can provide strong technical support for monitoring the quality of Cleistocalycis Operculati Cortex. PMID:25509300
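    Quantitation by the external standard method amounts to inverting the reported standard curve Y = 77.33X + 7.904 (Y: peak area response, X: concentration in mg/L). A sketch, with the function name assumed:

```python
def concentration_from_area(peak_area, slope=77.33, intercept=7.904):
    """Invert the reported standard curve Y = 77.33X + 7.904 to
    quantify 3,3'-O-dimethylellagic acid by the external standard
    method. Valid only within the calibrated range of roughly
    1.00-25.0 mg/L; results outside it are extrapolations."""
    return (peak_area - intercept) / slope
```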

  10. Compatibility of Qualitative and Quantitative Methods: Studying Child Sexual Abuse in America.

    ERIC Educational Resources Information Center

    Phelan, Patricia

    1987-01-01

    Illustrates how the combined use of qualitative and quantitative methods were necessary in obtaining a clearer understanding of the process of incest in American society. Argues that the exclusive use of one methodology would have obscured important information. (FMW)

  11. Qualitative and quantitative determination of ubiquinones by the method of high-efficiency liquid chromatography

    SciTech Connect

    Yanotovskii, M.T.; Mogilevskaya, M.P.; Obol'nikova, E.A.; Kogan, L.M.; Samokhvalov, G.I.

    1986-07-10

    A method has been developed for the qualitative and quantitative determination of ubiquinones CoQ6-CoQ10, using high-efficiency reversed-phase liquid chromatography. Tocopherol acetate was used as the internal standard.
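
Quantitation against an internal standard such as tocopherol acetate is typically done via a relative response factor determined from a calibration mixture. The following is a generic sketch of that calculation, not the authors' exact procedure; all function names and values are illustrative.

```python
def response_factor(area_analyte_std, conc_analyte_std, area_is_std, conc_is_std):
    """Relative response factor from a calibration mixture:
    RF = (A_analyte / C_analyte) / (A_IS / C_IS)."""
    return (area_analyte_std / conc_analyte_std) / (area_is_std / conc_is_std)

def quantify(area_analyte, area_is, conc_is, rf):
    """Analyte concentration in the sample:
    C_analyte = (A_analyte / A_IS) * C_IS / RF."""
    return (area_analyte / area_is) * conc_is / rf
```

Because the analyte and internal standard experience the same injection and detector fluctuations, their area ratio is more stable than either area alone.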

  12. QUANTITATIVE CANCER RISK ASSESSMENT METHODOLOGY USING SHORT-TERM GENETIC BIOASSAYS: THE COMPARATIVE POTENCY METHOD

    EPA Science Inventory

    Quantitative risk assessment is fraught with many uncertainties. The validity of the assumptions underlying the methods employed are often difficult to test or validate. Cancer risk assessment has generally employed either human epidemiological data from relatively high occupatio...

  13. Characterization of working iron Fischer-Tropsch catalysts using quantitative diffraction methods

    NASA Astrophysics Data System (ADS)

    Mansker, Linda Denise

    This study presents the results of the ex-situ characterization of working iron Fischer-Tropsch synthesis (F-TS) catalysts, reacted hundreds of hours at elevated pressures, using a new quantitative x-ray diffraction analytical methodology. Compositions, iron phase structures, and phase particle morphologies were determined and correlated with the observed reaction kinetics. Conclusions were drawn about the character of each catalyst in its most and least active state. The identity of the active phase(s) in the Fe F-TS catalyst has been vigorously debated for more than 45 years. The highly-reduced catalyst, used to convert coal-derived syngas to hydrocarbon products, is thought to form a mixture of oxides, metal, and carbides upon pretreatment and reaction. Commonly, Soxhlet extraction is used to effect catalyst-product slurry separation; however, the extraction process could be producing irreversible changes in the catalyst, contributing to the conflicting results in the literature. X-ray diffraction does not require analyte-matrix separation before analysis, and can detect trace phases down to 300 ppm/2 nm; thus, working catalyst slurries could be characterized as-sampled. Data were quantitatively interpreted employing first-principles methods, including the Rietveld polycrystalline structure method. Pretreated catalysts and pure phases were examined experimentally and modeled to explore specific behavior under x-rays. Then, the working catalyst slurries were quantitatively characterized. Empirical quantitation factors were calculated from experimental data or single-crystal parameters, then validated using the Rietveld method results. In the most active form, after pretreatment in H2 or in CO at ambient pressure, well-preserved working catalysts contained significant amounts of Fe7C3 with trace alpha-Fe, once reaction had commenced at elevated pressure. Amounts of Fe3O4 were constant and small, with average carbide particle diameter below 15 nm. 
Small amounts of Fe7C3 were found in unreacted

  14. A Bead-Based Method for Multiplexed Identification and Quantitation of DNA Sequences Using Flow Cytometry

    PubMed Central

    Spiro, Alexander; Lowe, Mary; Brown, Drew

    2000-01-01

    A new multiplexed, bead-based method which utilizes nucleic acid hybridizations on the surface of microscopic polystyrene spheres to identify specific sequences in heterogeneous mixtures of DNA sequences is described. The method consists of three elements: beads (5.6-μm diameter) with oligomer capture probes attached to the surface, three fluorophores for multiplexed detection, and flow cytometry instrumentation. Two fluorophores are impregnated within each bead in varying amounts to create different bead types, each associated with a unique probe. The third fluorophore is a reporter. Following capture of fluorescent cDNA sequences from environmental samples, the beads are analyzed by flow cytometric techniques which yield a signal intensity for each capture probe proportional to the amount of target sequences in the analyte. In this study, a direct hybrid capture assay was developed and evaluated with regard to sequence discrimination and quantitation of abundances. The target sequences (628 to 728 bp in length) were obtained from the 16S/23S intergenic spacer region of microorganisms collected from polluted groundwater at the nuclear waste site in Hanford, Wash. A fluorescence standard consisting of beads with a known number of fluorescent DNA molecules on the surface was developed, and the resolution, sensitivity, and lower detection limit for measuring abundances were determined. The results were compared with those of a DNA microarray using the same sequences. The bead method exhibited far superior sequence discrimination and possesses features which facilitate accurate quantitation. PMID:11010868

  15. Composition and quantitation of microalgal lipids by ERETIC ¹H NMR method.

    PubMed

    Nuzzo, Genoveffa; Gallo, Carmela; d'Ippolito, Giuliana; Cutignano, Adele; Sardo, Angela; Fontana, Angelo

    2013-10-01

    Accurate characterization of biomass constituents is a crucial aspect of research in the biotechnological application of natural products. Here we report an efficient, fast and reproducible method for the identification and quantitation of fatty acids and complex lipids (triacylglycerols, glycolipids, phospholipids) in microalgae under investigation for the development of functional health products (probiotics, food ingredients, drugs, etc.) or third-generation biofuels. The procedure consists of extraction of the biological matrix by a modified Folch method and direct analysis of the resulting material by proton nuclear magnetic resonance (¹H NMR). The protocol uses a reference electronic signal as external standard (ERETIC method) and allows assessment of total lipid content, saturation degree and class distribution in both high-throughput screening of algal collections and metabolic analysis during genetic or culturing studies. As proof of concept, the methodology was applied to the analysis of three microalgal species (Thalassiosira weissflogii, Cyclotella cryptica and Nannochloropsis salina), which differ drastically in the qualitative and quantitative composition of their fatty acid-based lipids. PMID:24084790
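
The ERETIC approach quantifies analytes by comparing their integrals against an electronically generated reference signal with a known concentration equivalent. A minimal sketch of that ratio calculation follows; the function and parameter names are assumptions, not taken from the paper.

```python
def eretic_concentration(integral_analyte, n_protons, integral_eretic, eretic_equiv_conc):
    """Concentration from a quantitative 1H NMR spectrum with an ERETIC reference.

    integral_analyte:  integral of the analyte signal
    n_protons:         number of protons contributing to that signal
    integral_eretic:   integral of the electronic reference signal
    eretic_equiv_conc: concentration equivalent assigned to the ERETIC
                       signal during calibration (assumed known)
    """
    return (integral_analyte / n_protons) / integral_eretic * eretic_equiv_conc
```

Normalizing by the proton count lets signals with different multiplicities (e.g., a methyl vs. a methine resonance) be compared on the same molar scale.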

  16. Generalized multiple internal standard method for quantitative liquid chromatography mass spectrometry.

    PubMed

    Hu, Yuan-Liang; Chen, Zeng-Ping; Chen, Yao; Shi, Cai-Xia; Yu, Ru-Qin

    2016-05-01

    In this contribution, a multiplicative effects model for the generalized multiple-internal-standard method (MEMGMIS) was proposed to solve the signal instability problem of LC-MS over time. The MEMGMIS model seamlessly integrates the multiple-internal-standard strategy with a multivariate calibration method, and makes full use of all the information carried by multiple internal standards during the quantification of target analytes. Unlike existing methods based on multiple internal standards, MEMGMIS does not require selecting an optimal internal standard for the quantification of a specific analyte. MEMGMIS was applied to a proof-of-concept model system: the simultaneous quantitative analysis of five edible artificial colorants in two kinds of cocktail drinks. Experimental results demonstrated that MEMGMIS models established on LC-MS data of calibration samples prepared with ultrapure water could provide quite satisfactory concentration predictions for colorants in cocktail samples from their LC-MS data measured 10 days after the LC-MS analysis of the calibration samples. The average relative prediction errors of MEMGMIS models did not exceed 6.0%, considerably better than the corresponding values of commonly used univariate calibration models combined with multiple internal standards. The advantages of good performance and simple implementation render the MEMGMIS model a promising alternative tool in quantitative LC-MS assays. PMID:27072522
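
The MEMGMIS model itself is not specified in this abstract. As a generic illustration of combining multiple internal standards with multivariate calibration, one can regress known concentrations on the vector of analyte-to-internal-standard intensity ratios, in which a common multiplicative signal drift approximately cancels. The data below are simulated, not from the study.

```python
import numpy as np

# Simulated calibration set: each row holds the analyte signal divided by
# two internal-standard signals, [A/IS1, A/IS2]; drift cancels in the ratios.
X = np.array([[0.5, 0.8], [1.0, 1.6], [1.5, 2.4], [2.0, 3.2]])
y = np.array([1.0, 2.0, 3.0, 4.0])          # known concentrations

Xd = np.column_stack([X, np.ones(len(X))])  # design matrix with intercept
coef, *_ = np.linalg.lstsq(Xd, y, rcond=None)  # least-squares calibration

def predict(ratios):
    """Predict concentration from a new pair of intensity ratios."""
    return np.append(np.asarray(ratios, dtype=float), 1.0) @ coef
```

Using all internal-standard ratios jointly, rather than picking one "best" internal standard, mirrors the multivariate spirit of the published model.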

  17. Quantitative method for measurement of the Goos-Hanchen effect based on source divergence considerations

    SciTech Connect

    Gray, Jeffrey F.; Puri, Ashok

    2007-06-15

    In this paper we report on a method for quantitative measurement and characterization of the Goos-Hanchen effect based upon the real-world performance of optical sources. A numerical model of a nonideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hanchen shift equations to determine beam shift displacement characteristics, which provides quantitative estimates of finite shifts near the critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end a line spread Green's function is defined which can be used to determine the effective transfer function of the near-critical-angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse square root nature of the Goos-Hanchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomena to attain refractive index sensitivities of the order of 10^-6, comparable with the recent results reported in the literature.
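
The inverse-square-root behavior mentioned above follows from Artmann's stationary-phase expression for the lateral shift of a TE-polarized plane wave under total internal reflection, D = λ sinθ / (π n1 sqrt(sin²θ − sin²θc)). The sketch below implements that textbook plane-wave formula, not the paper's divergence-corrected model.

```python
import math

def gh_shift_te(theta_deg, n1, n2, wavelength):
    """Goos-Hanchen lateral shift for TE polarization above the critical angle.

    Artmann's result: diverges as 1/sqrt(sin^2(theta) - sin^2(theta_c))
    when theta approaches the critical angle from above."""
    theta = math.radians(theta_deg)
    sin_c = n2 / n1                        # sine of the critical angle
    arg = math.sin(theta) ** 2 - sin_c ** 2
    if arg <= 0.0:
        raise ValueError("angle must exceed the critical angle")
    return wavelength * math.sin(theta) / (math.pi * n1 * math.sqrt(arg))
```

For glass-to-air incidence (n1 = 1.5, n2 = 1.0) the critical angle is about 41.8 degrees, and the shift grows without bound as the incidence angle approaches it, which is exactly why finite source divergence matters in a real measurement.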

  18. Stress echocardiography: methods, indications and results

    PubMed Central

    Baur, L.H.B.

    2002-01-01

    Stress echocardiography has become an important clinical tool to detect cardiac ischaemia and viability in addition to single photon emission tomography. Stress echocardiography has a high positive and negative predictive value, is less expensive than the nuclear methods and has no radiation exposure. It can easily be used in an emergency room and coronary care unit. Because of its feasibility, low cost and high diagnostic accuracy, it will become a very important technique in every hospital and will soon be a real alternative to the more time-consuming and expensive nuclear techniques. The current article gives a review of the methods of stress echocardiography. PMID:25696080

  19. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions.

    PubMed

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary E; Geller, Jil T; Fisher, Susan J; Hall, Steven C; Hazen, Terry C; Brenner, Steven E; Butland, Gareth; Jin, Jian; Witkowska, H Ewa; Chandonia, John-Marc; Biggin, Mark D

    2016-06-01

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification-mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of our interactomes and others, and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high-confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR. PMID:27099342

  20. Quantitative 4D Transcatheter Intraarterial Perfusion MR Imaging as a Method to Standardize Angiographic Chemoembolization Endpoints

    PubMed Central

    Jin, Brian; Wang, Dingxin; Lewandowski, Robert J.; Ryu, Robert K.; Sato, Kent T.; Larson, Andrew C.; Salem, Riad; Omary, Reed A.

    2011-01-01

    PURPOSE We aimed to test the hypothesis that subjective angiographic endpoints during transarterial chemoembolization (TACE) of hepatocellular carcinoma (HCC) exhibit consistency and correlate with objective intraprocedural reductions in tumor perfusion as determined by quantitative four dimensional (4D) transcatheter intraarterial perfusion (TRIP) magnetic resonance (MR) imaging. MATERIALS AND METHODS This prospective study was approved by the institutional review board. Eighteen consecutive patients underwent TACE in a combined MR/interventional radiology (MR-IR) suite. Three board-certified interventional radiologists independently graded the angiographic endpoint of each procedure based on a previously described subjective angiographic chemoembolization endpoint (SACE) scale. A consensus SACE rating was established for each patient. Patients underwent quantitative 4D TRIP-MR imaging immediately before and after TACE, from which mean whole tumor perfusion (Fρ) was calculated. Consistency of SACE ratings between observers was evaluated using the intraclass correlation coefficient (ICC). The relationship between SACE ratings and intraprocedural TRIP-MR imaging perfusion changes was evaluated using Spearman’s rank correlation coefficient. RESULTS The SACE rating scale demonstrated very good consistency among all observers (ICC = 0.80). The consensus SACE rating was significantly correlated with both absolute (r = 0.54, P = 0.022) and percent (r = 0.85, P < 0.001) intraprocedural perfusion reduction. CONCLUSION The SACE rating scale demonstrates very good consistency between raters, and significantly correlates with objectively measured intraprocedural perfusion reductions during TACE. These results support the use of the SACE scale as a standardized alternative method to quantitative 4D TRIP-MR imaging to classify patients based on embolic endpoints of TACE. PMID:22021520
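
The rank correlation used above to compare ordinal SACE ratings with perfusion reductions can be reproduced with a small routine. This is a self-contained Spearman implementation with average ranks for ties; in practice one would use a statistics library such as SciPy.

```python
def rank(values):
    """Assign average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank for the tied group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

Because SACE ratings are ordinal categories, rank correlation is the appropriate choice over Pearson's coefficient here.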

  1. Penumbra Pattern Assessment in Acute Stroke Patients: Comparison of Quantitative and Non-Quantitative Methods in Whole Brain CT Perfusion

    PubMed Central

    Baumann, Alena B.; Meinel, Felix G.; Helck, Andreas D.; Opherk, Christian; Straube, Andreas; Reiser, Maximilian F.; Sommer, Wieland H.

    2014-01-01

    Background And Purpose While penumbra assessment has become an important part of the clinical decision making for acute stroke patients, there is a lack of studies measuring the reliability and reproducibility of defined assessment techniques in the clinical setting. Our aim was to determine reliability and reproducibility of different types of three-dimensional penumbra assessment methods in stroke patients who underwent whole brain CT perfusion imaging (WB-CTP). Materials And Methods We included 29 patients with a confirmed MCA infarction who underwent initial WB-CTP with a scan coverage of 100 mm in the z-axis. Two blinded and experienced readers assessed the flow-volume-mismatch twice and in two quantitative ways: Performing a volumetric mismatch analysis using OsiriX imaging software (MMVOL) and visual estimation of mismatch (MMEST). Complementarily, the semiquantitative Alberta Stroke Programme Early CT Score for CT perfusion was used to define mismatch (MMASPECTS). A favorable penumbral pattern was defined by a mismatch of ≥30% in combination with a cerebral blood flow deficit of ≤90 ml and an MMASPECTS score of ≥1, respectively. Inter- and intrareader agreement was determined by Kappa-values and ICCs. Results Overall, MMVOL showed considerably higher inter-/intrareader agreement (ICCs: 0.751/0.843) compared to MMEST (0.292/0.749). In the subgroup of large (≥50 mL) perfusion deficits, inter- and intrareader agreement of MMVOL was excellent (ICCs: 0.961/0.942), while MMEST interreader agreement was poor (0.415) and intrareader agreement was good (0.919). With respect to penumbra classification, MMVOL showed the highest agreement (interreader agreement: 25 agreements/4 non-agreements/κ: 0.595; intrareader agreement 27/2/0.833), followed by MMEST (22/7/0.471; 23/6/0.577), and MMASPECTS (18/11/0.133; 21/8/0.340). Conclusion The evaluated approach of volumetric mismatch assessment is superior to pure visual and ASPECTS penumbra pattern assessment in WB
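
The classification rule stated above (mismatch of at least 30% combined with a cerebral blood flow deficit of at most 90 ml) can be sketched as a tiny decision function. The mismatch definition used here, the relative difference between CBF and CBV deficit volumes, is an assumption about the study's exact formula.

```python
def mismatch_percent(cbf_deficit_ml, cbv_deficit_ml):
    """Flow-volume mismatch as a percentage of the CBF deficit (assumed definition)."""
    return (cbf_deficit_ml - cbv_deficit_ml) / cbf_deficit_ml * 100.0

def favorable_pattern(cbf_deficit_ml, cbv_deficit_ml):
    """Favorable penumbral pattern per the study's criteria:
    mismatch >= 30% AND CBF deficit <= 90 ml."""
    return (mismatch_percent(cbf_deficit_ml, cbv_deficit_ml) >= 30.0
            and cbf_deficit_ml <= 90.0)
```

For example, an 80 ml perfusion deficit with a 40 ml core gives a 50% mismatch and qualifies, whereas the same mismatch at a 120 ml deficit fails the volume criterion.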

  2. Quantitative Trait Locus Mapping Methods for Diversity Outbred Mice

    PubMed Central

    Gatti, Daniel M.; Svenson, Karen L.; Shabalin, Andrey; Wu, Long-Yang; Valdar, William; Simecek, Petr; Goodwin, Neal; Cheng, Riyan; Pomp, Daniel; Palmer, Abraham; Chesler, Elissa J.; Broman, Karl W.; Churchill, Gary A.

    2014-01-01

    Genetic mapping studies in the mouse and other model organisms are used to search for genes underlying complex phenotypes. Traditional genetic mapping studies that employ single-generation crosses have poor mapping resolution and limit discovery to loci that are polymorphic between the two parental strains. Multiparent outbreeding populations address these shortcomings by increasing the density of recombination events and introducing allelic variants from multiple founder strains. However, multiparent crosses present new analytical challenges and require specialized software to take full advantage of these benefits. Each animal in an outbreeding population is genetically unique and must be genotyped using a high-density marker set; regression models for mapping must accommodate multiple founder alleles, and complex breeding designs give rise to polygenic covariance among related animals that must be accounted for in mapping analysis. The Diversity Outbred (DO) mice combine the genetic diversity of eight founder strains in a multigenerational breeding design that has been maintained for >16 generations. The large population size and randomized mating ensure the long-term genetic stability of this population. We present a complete analytical pipeline for genetic mapping in DO mice, including algorithms for probabilistic reconstruction of founder haplotypes from genotyping array intensity data, and mapping methods that accommodate multiple founder haplotypes and account for relatedness among animals. Power analysis suggests that studies with as few as 200 DO mice can detect loci with large effects, but loci that account for <5% of trait variance may require a sample size of up to 1000 animals. The methods described here are implemented in the freely available R package DOQTL. PMID:25237114
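
The regression-based mapping described above can be illustrated at a single locus: regress the phenotype on founder haplotype probabilities and compare against the null (intercept-only) model via a LOD score, (n/2)·log10(RSS0/RSS1). This is a bare-bones sketch; the actual DOQTL pipeline also models kinship among related animals, which is omitted here.

```python
import numpy as np

def lod_score(genotype_probs, phenotype):
    """Single-locus LOD score from a founder-haplotype regression.

    genotype_probs: (n_animals, n_founders) matrix of haplotype
                    probabilities/dosages at one locus
    phenotype:      length-n vector of trait values
    """
    y = np.asarray(phenotype, dtype=float)
    n = len(y)
    X1 = np.column_stack([np.ones(n), genotype_probs])  # full model
    rss0 = np.sum((y - y.mean()) ** 2)                  # null model residuals
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    rss1 = np.sum((y - X1 @ beta) ** 2)                 # full model residuals
    return (n / 2.0) * np.log10(rss0 / rss1)
```

Scanning this statistic across loci and comparing peaks against a permutation-derived threshold is the usual genome-scan workflow.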

  4. Immunochemical methods for quantitation of vitamin B6. Technical report

    SciTech Connect

    Brandon, D.L.; Corse, J.W.

    1981-09-30

    A procedure is described for determining the total of all B6 vitamins in acid-hydrolyzed samples using either a radio-immunoassay (RIA) or an enzyme-immunoassay (EIA). Sample preparation is similar for both RIA and EIA. Two specific antibodies (antipyridoxine and antipyridoxamine) are employed. To determine pyridoxal, a portion of the sample is reduced with sodium borohydride, and pyridoxal is determined by the difference between pyridoxine measured before and after reduction. The results indicate that two procedures have been developed which are selective for pyridoxamine (the fluorescent enzyme immunoassay and the spin immunoassay) and one assay which is equally sensitive to pyridoxine and pyridoxamine (the radio-immunoassay).

  5. Quantitative mineralogical composition of complex mineral wastes - Contribution of the Rietveld method

    SciTech Connect

    Mahieux, P.-Y.; Aubert, J.-E.; Cyr, M.; Coutand, M.; Husson, B.

    2010-03-15

    The objective of the work presented in this paper is the quantitative determination of the mineral composition of two complex mineral wastes: a sewage sludge ash (SSA) and a municipal solid waste incineration fly ash (MSWIFA). The mineral compositions were determined by two different methods: the first based on calculation using the qualitative mineralogical composition of the waste combined with physicochemical analyses; the second, the Rietveld method, which uses only X-ray diffraction patterns. The results obtained are consistent, showing that it is possible to quantify the mineral compositions of complex mineral wastes with such methods. The apparent simplicity of the Rietveld method (due principally to the availability of software packages implementing it) facilitates its use. However, care should be taken, since crystal structure analysis based on powder diffraction data requires experience and a thorough understanding of crystallography, so the use of a complementary method, such as the first one used in this study, may sometimes be needed to confirm the results.

  6. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages

    PubMed Central

    Yuan, Jian; Li, Meizhen; Lin, Senjie

    2015-01-01

    Marine phytoplankton are highly diverse, with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction while preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (by up to 2-fold) in comparison with the non-bead-beating method, while also preserving the DNA integrity. When our method was applied to a field sample collected from a subtropical bay in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliophora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and “naked” unicellular organisms indicates that our method could obtain the intact DNA of organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance. PMID:26218575

  7. A Comparative Study on Tobacco Cessation Methods: A Quantitative Systematic Review

    PubMed Central

    Heydari, Gholamreza; Masjedi, Mohammadreza; Ahmady, Arezoo Ebn; Leischow, Scott J.; Lando, Harry A.; Shadmehr, Mohammad Behgam; Fadaizadeh, Lida

    2014-01-01

    Background: During recent years, there have been many advances in different types of pharmacological and non-pharmacological tobacco control treatments. In this study, we aimed to identify the most effective smoking cessation methods, based upon a review of the literature. Methods: We searched PubMed, limited to English publications from 2000 to 2012. Two trained reviewers independently assessed titles, abstracts and full texts of articles after a pilot inter-rater reliability assessment conducted by the author (GH). The total number of papers and their conclusions, either recommending a method (positive) or not supporting it (negative), was computed for each method, and the number of negative papers was subtracted from the number of positive ones to give each method a score. Cases of inconsistency between the two reviewers were adjudicated by the author. Results: Of the 932 articles that were critically assessed, 780 studies supported quit smoking methods; in 90 studies, the methods were neither supported nor rejected; and in 62 cases the methods were not supported. Nicotine replacement therapy (NRT), Champix and Zyban, with 352, 117 and 71 studies respectively, were the most supported methods, and e-cigarettes and non-nicotine medications, with one case each, were the least supported. Overall, NRT (score 39) and Champix and education (score 36 each) were the most supported methods. Conclusions: Results of this review indicate that the scientific papers of the most recent decade recommend the use of NRT and Champix in combination with educational interventions. Additional research is needed to compare qualitative and quantitative studies of smoking cessation. PMID:25013685

  8. A practical and sensitive method of quantitating lymphangiogenesis in vivo.

    PubMed

    Majumder, Mousumi; Xin, Xiping; Lala, Peeyush K

    2013-07-01

    To address the inadequacy of current assays, we developed a directed in vivo lymphangiogenesis assay (DIVLA) by modifying an established directed in vivo angiogenesis assay. Silicon tubes (angioreactors) were implanted in the dorsal flanks of nude mice. Tubes contained either growth factor-reduced basement membrane extract (BME) alone (negative control), BME containing vascular endothelial growth factor (VEGF)-D (positive control for lymphangiogenesis) or FGF-2/VEGF-A (positive control for angiogenesis), the high VEGF-D-expressing breast cancer cell line MDA-MB-468LN (468LN), or VEGF-D-silenced 468LN. Lymphangiogenesis was detected superficially with Evans Blue dye tracing and measured in the cellular contents of angioreactors by multiple approaches: lymphatic vessel endothelial hyaluronan receptor-1 (Lyve1) protein (immunofluorescence) and mRNA (qPCR) expression, and a visual scoring of lymphatic vs blood capillaries with dual Lyve1 (or Prox1 or podoplanin)/Cd31 immunostaining in cryosections. Lymphangiogenesis was absent with BME, high with VEGF-D or VEGF-D-producing 468LN cells and low with VEGF-D-silenced 468LN. Angiogenesis was absent with BME, high with FGF-2/VEGF-A, moderate with 468LN or VEGF-D and low with VEGF-D-silenced 468LN. The method was reproduced in a syngeneic murine C3L5 tumor model in C3H/HeJ mice with dual Lyve1/Cd31 immunostaining. Thus, DIVLA presents a practical and sensitive assay of lymphangiogenesis, validated with multiple approaches and markers. It is highly suited to identifying pro- and anti-lymphangiogenic agents, as well as shared or distinct mechanisms regulating lymphangiogenesis vs angiogenesis, and is widely applicable to research in vascular/tumor biology. PMID:23711825

  9. A quantitative method for photovoltaic encapsulation system optimization

    NASA Technical Reports Server (NTRS)

    Garcia, A., III; Minning, C. P.; Cuddihy, E. F.

    1981-01-01

    It is pointed out that the design of encapsulation systems for flat plate photovoltaic modules requires the fulfillment of conflicting design requirements. An investigation was conducted with the objective to find an approach which will make it possible to determine a system with optimum characteristics. The results of the thermal, optical, structural, and electrical isolation analyses performed in the investigation indicate the major factors in the design of terrestrial photovoltaic modules. For defect-free materials, minimum encapsulation thicknesses are determined primarily by structural considerations. Cell temperature is not strongly affected by encapsulant thickness or thermal conductivity. The emissivity of module surfaces exerts a significant influence on cell temperature. Encapsulants should be elastomeric, and ribs are required on substrate modules. Aluminum is unsuitable as a substrate material. Antireflection coating is required on cell surfaces.

  10. A method for quantitatively estimating diffuse and discrete hydrothermal discharge

    NASA Astrophysics Data System (ADS)

    Baker, Edward T.; Massoth, Gary J.; Walker, Sharon L.; Embley, Robert W.

    1993-07-01

    Submarine hydrothermal fluids discharge as undiluted, high-temperature jets and as diffuse, highly diluted, low-temperature percolation. Estimates of the relative contribution of each discharge type, which are important for the accurate determination of local and global hydrothermal budgets, are difficult to obtain directly. In this paper we describe a new method of using measurements of hydrothermal tracers such as Fe/Mn, Fe/heat, and Mn/heat in high-temperature fluids, low-temperature fluids, and the neutrally buoyant plume to deduce the relative contribution of each discharge type. We sampled vent fluids from the north Cleft vent field on the Juan de Fuca Ridge in 1988, 1989 and 1991, and plume samples every year from 1986 to 1991. The tracers were, on average, 3 to 90 times greater in high-temperature than in low-temperature fluids, with plume values intermediate. A mixing model calculates that high-temperature fluids contribute only ˜ 3% of the fluid mass flux but > 90% of the hydrothermal Fe and > 60% of the hydrothermal Mn to the overlying plume. Three years of extensive camera-CTD sled tows through the vent field show that diffuse venting is restricted to a narrow fissure zone extending for 18 km along the axial strike. Linear plume theory applied to the temperature plumes detected when the sled crossed this zone yields a maximum likelihood estimate for the diffuse heat flux of 8.9 × 10^4 W/m, for a total flux of 534 MW, considering that diffuse venting is active along only one-third of the fissure system. For mean low- and high-temperature discharge of 25°C and 319°C, respectively, the discrete heat flux must be 266 MW to satisfy the mass flux partitioning. If the north Cleft vent field is globally representative, the assumption that high-temperature discharge dominates the mass flux in axial vent fields leads to an overestimation of the flux of many non-conservative hydrothermal species by about an order of magnitude.
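
The mixing-model partitioning can be illustrated with a two-endmember calculation: if a conservative tracer has concentration c_high in high-temperature fluid and c_low in diffuse fluid, the observed plume value fixes the high-temperature mass fraction f. This is a simplified sketch of the approach, not the paper's exact tracer algebra.

```python
def high_temp_mass_fraction(c_plume, c_high, c_low):
    """Two-endmember mixing: solve c_plume = f*c_high + (1-f)*c_low for f,
    assuming a conservative tracer expressed per unit fluid mass."""
    return (c_plume - c_low) / (c_high - c_low)

def flux_contribution(f, c_high, c_plume):
    """Share of the plume tracer inventory carried by high-temperature fluid."""
    return f * c_high / c_plume
```

With a tracer ~90 times more concentrated in high-temperature fluid, even a ~3% mass fraction carries most of the plume's tracer inventory, which is the qualitative result the abstract reports for Fe.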

  11. Spatial Access Priority Mapping (SAPM) with Fishers: A Quantitative GIS Method for Participatory Planning

    PubMed Central

    Yates, Katherine L.; Schoeman, David S.

    2013-01-01

Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers’ spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers’ willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process.

  12. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    PubMed Central

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

Objective To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
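The two-step point counting described above amounts to a Cavalieri-style volume estimate followed by point-fraction arithmetic. A minimal sketch; the section spacing, area per point, and all counts below are invented illustration values, not study data:

```python
# Step 1: Cavalieri volume estimate from systematic parallel sections.
def cavalieri_volume(points_per_section, section_spacing_mm, area_per_point_mm2):
    """V = section spacing * area-per-point * total points hitting the defect."""
    return section_spacing_mm * area_per_point_mm2 * sum(points_per_section)

# Step 2: tissue composition as the fraction of points in each category.
def tissue_fractions(category_counts):
    total = sum(category_counts.values())
    return {name: count / total for name, count in category_counts.items()}

counts = [30, 45, 52, 48, 40, 28, 17]        # points on the defect, per section
vol = cavalieri_volume(counts, 0.4, 0.04)    # hypothetical spacing and grid
fracs = tissue_fractions({"hyaline": 120, "fibrocartilage": 60,
                          "fibrous": 40, "bone": 20, "scaffold": 10})
```

The fractions by construction sum to one, which is what makes the composition estimate directly comparable across defects.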

  13. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    PubMed

    Yates, Katherine L; Schoeman, David S

    2013-01-01

Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process.
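The quantitative layer SAPM produces can be thought of as a per-cell aggregation of respondents' priority scores, which is what downstream tools such as MARXAN consume. A hypothetical sketch (the scoring scheme and grid-cell IDs are assumptions, not the published protocol):

```python
from collections import defaultdict

def aggregate_priorities(responses):
    """Sum each fisher's per-cell priority scores into a single grid layer."""
    totals = defaultdict(float)
    for cell_scores in responses:
        for cell, score in cell_scores.items():
            totals[cell] += score
    return dict(totals)

# Each respondent distributes a fixed budget of points across grid cells
# (hypothetical data for two respondents).
responses = [{"A1": 60, "B2": 40},
             {"A1": 25, "C3": 75}]
priority_layer = aggregate_priorities(responses)
```

Cells scored by several fishers accumulate higher totals, giving the "most important areas" ranking the maps visualize.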

  14. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  15. The Ten Beads Method: A Novel Way to Collect Quantitative Data in Rural Uganda

    PubMed Central

Bwambale, Francis Mulekya; Moyer, Cheryl A.; Komakech, Innocent; Wabwire-Mangen, Fred; Lori, Jody R.

    2013-01-01

    This paper illustrates how locally appropriate methods can be used to collect quantitative data from illiterate respondents. This method uses local beads to represent quantities, which is a novel yet potentially valuable methodological improvement over standard Western survey methods. PMID:25170477

  16. Dynamic and quantitative method of analyzing service consistency evolution based on extended hierarchical finite state automata.

    PubMed

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

This paper is concerned with the dynamic evolution analysis and quantitative measurement of the primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in a service's evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving toward large-scale, reusable, compositional, pervasive, and flexible systems, which makes traditional analysis methods difficult to apply. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA); it formally depicts the overall changing processes of service consistency states. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that poor reusability (17.93% on average) is the most influential factor, noncomposition of atomic services (13.12%) the second, and confusion of service versions (1.2%) the least. Compared with previous qualitative analyses, the SCEAs show good effectiveness and feasibility. This research can guide engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA. PMID:24772033
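The idea of tracking consistency states with a finite state automaton can be sketched with a small transition table (the state and event names here are illustrative, not the EHS-FSA of the paper):

```python
# Transition table: (current state, event) -> next state.
# Events mirror the service behaviors named in the abstract.
TRANSITIONS = {
    ("consistent", "publish"): "consistent",
    ("consistent", "version_change"): "inconsistent",
    ("inconsistent", "maintain"): "consistent",
    ("inconsistent", "call"): "inconsistent",
}

def run(events, state="consistent"):
    """Replay a sequence of service events through the automaton."""
    for event in events:
        state = TRANSITIONS.get((state, event), state)  # unknown events: no-op
    return state

final_state = run(["publish", "version_change", "call", "maintain"])
```

Counting how often each event type drives the automaton into the inconsistent state is one simple way such a model could quantify an impact factor.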

  17. An immunochemical method for quantitation of Epinotia aporema granulovirus (EpapGV).

    PubMed

    Parola, Alejandro Daniel; Sciocco-Cap, Alicia; Glikmann, Graciela; Romanowski, Víctor

    2003-09-01

Epinotia aporema granulovirus (EpapGV) is a baculovirus that affects E. aporema larvae and has proven to be a good candidate for the biocontrol of this important pest in South America. As part of the quality control of the production of a bioinsecticide based on EpapGV, a sensitive method was developed for the detection and quantitation of the virus. To this end, we used the major occlusion body (OB) protein (granulin) to generate polyclonal antibodies in rabbits. Purified IgG fractions from hyperimmune sera were labeled with biotin and used as detecting antibodies in a double-antibody sandwich enzyme-linked immunosorbent assay (ELISA). No cross-reactivity was detected with any of the nucleopolyhedroviruses (NPV) tested in this study, while a minor degree of reactivity was observed with the closely related Cydia pomonella granulovirus (CpGV). The performance of the ELISA was satisfactory in terms of sensitivity, detecting as little as 0.53 ng/ml of EpapGV granulin in suspensions of purified virus OB, corresponding to 2.0 × 10⁴ OB/ml. Granulin was also detected in complex and highly diluted bioinsecticide formulations. In time course experiments, the virus was detected as early as 24 h post infection (p.i.). These results demonstrate that this method is a convenient, rapid, and inexpensive alternative for routine detection and quantitation of EpapGV. PMID:12951208
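The sensitivity figures imply a simple conversion between granulin concentration and occlusion-body titre. A sketch using only the equivalence reported in the abstract (0.53 ng/ml of granulin corresponding to 2.0 × 10⁴ OB/ml):

```python
# Conversion factor derived from the abstract's detection-limit equivalence.
OB_PER_NG = 2.0e4 / 0.53   # (OB/ml) per (ng/ml) of granulin

def granulin_to_ob(granulin_ng_per_ml):
    """Convert an ELISA granulin reading (ng/ml) to occlusion bodies per ml."""
    return granulin_ng_per_ml * OB_PER_NG

# At the stated limit of detection:
loq_ob = granulin_to_ob(0.53)   # back to 2.0e4 OB/ml by construction
```

In practice the granulin reading itself would come from a standard curve of absorbance against known granulin amounts; the conversion above is the final step.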

  18. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method.

    PubMed

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

Neutron spectrometry, based on the scattering of high-energy fast neutrons from a radioisotope and their slowing-down by light hydrogen atoms, is a useful technique for non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume and is not affected by temperature, pressure, pH value, or color. The most common choice for a radioisotope neutron source is (252)Cf or (241)Am-Be. In this study, (252)Cf with a neutron flux of 6.3 × 10⁶ n/s was used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra were obtained by using an in-house-built radioisotopic neutron spectrometric system equipped with a (3)He detector and a multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of approximately 0.947 g/cc and area of 40 cm × 25 cm) was used for the determination of hydrogen content by means of multivariate calibration models, depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered better performance in quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system can be promising for real-time, online monitoring of powder processes to determine the content of any type of molecule containing hydrogen nuclei. PMID:19285419
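Partial least-squares calibration of a scalar property against spectra can be illustrated with a bare-bones PLS1 (NIPALS) implementation. This is a sketch on synthetic data, not the authors' code; the "spectra" have only three channels to keep the example small:

```python
import numpy as np

def pls1(X, y, n_components=2):
    """Bare-bones PLS1 (NIPALS) regression of a scalar y on spectra X."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w = w / np.linalg.norm(w)       # weight vector
        t = Xc @ w                      # scores
        tt = t @ t
        p = Xc.T @ t / tt               # X loadings
        q = (yc @ t) / tt               # y loading
        Xc = Xc - np.outer(t, p)        # deflate X and y
        yc = yc - t * q
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)  # regression vector
    return lambda Xnew: (Xnew - x_mean) @ B + y_mean

# Synthetic "spectra": the property depends linearly on two of three channels.
rng = np.random.default_rng(0)
X = rng.normal(size=(15, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + 0.01 * rng.normal(size=15)
predict = pls1(X, y, n_components=3)
```

With as many components as spectral channels, PLS1 reduces to ordinary least squares; its advantage over simple linear calibration appears when the spectra have many collinear channels and few components suffice.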

  19. The Quantitative Methods Boot Camp: Teaching Quantitative Thinking and Computing Skills to Graduate Students in the Life Sciences

    PubMed Central

    Stefan, Melanie I.; Gutlerner, Johanna L.; Born, Richard T.; Springer, Michael

    2015-01-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a “boot camp” in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students’ engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  20. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    PubMed

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  1. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer’s Disease: Results from the DIAN Study Group

    PubMed Central

    Su, Yi; Blazey, Tyler M.; Owen, Christopher J.; Christensen, Jon J.; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C.; Ances, Beau M.; Snyder, Abraham Z.; Cash, Lisa A.; Koeppe, Robert A.; Klunk, William E.; Galasko, Douglas; Brickman, Adam M.; McDade, Eric; Ringman, John M.; Thompson, Paul M.; Saykin, Andrew J.; Ghetti, Bernardino; Sperling, Reisa A.; Johnson, Keith A.; Salloway, Stephen P.; Schofield, Peter R.; Masters, Colin L.; Villemagne, Victor L.; Fox, Nick C.; Förster, Stefan; Chen, Kewei; Reiman, Eric M.; Xiong, Chengjie; Marcus, Daniel S.; Weiner, Michael W.; Morris, John C.; Bateman, Randall J.; Benzinger, Tammie L. S.

    2016-01-01

Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer’s Network (DIAN), an autosomal dominant Alzheimer’s disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brain-stem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improved sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power, although it may add cost and time. Several technical variations to amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer’s disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959
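At their simplest, the quantification choices discussed above reduce to a ratio of target-region uptake to reference-region uptake (an SUVR), optionally corrected for partial volume. A sketch; the region names, uptake values, and recovery coefficient below are invented, and dividing by a recovery coefficient is only a crude stand-in for the regional spread function approach:

```python
def suvr(target_uptake, reference_uptake):
    """Standardized uptake value ratio: target relative to reference region."""
    return target_uptake / reference_uptake

def pvc(observed_uptake, recovery_coefficient):
    """Crude partial volume correction: rescale by a recovery coefficient."""
    return observed_uptake / recovery_coefficient

cortex, cerebellum = 2.1, 1.05       # hypothetical regional PiB uptake values
ratio = suvr(cortex, cerebellum)     # uncorrected SUVR
ratio_pvc = suvr(pvc(cortex, 0.7), cerebellum)  # corrected target region
```

The choice of denominator (cerebellar cortex, brainstem, or white matter) changes the SUVR scale, which is why the reference-region comparison in the study matters.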

  2. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran.

    PubMed

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2016-03-01

Social changes have rapidly displaced arranged marriages, and it seems this change in marriage pattern has played a role in childbearing. On the other hand, many countries face a great reduction in population, which requires a comprehensive policy to manage the considerable drop. To achieve this goal, the factors affecting fertility must first be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women aged 15-49 years, living in the north of Iran, were studied using a cluster sampling strategy. The results showed no significant differences in the reproductive behaviors of the three patterns of marriage in Babol city of Iran. There appears to be a convergence in childbearing across the different patterns of marriage, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning. PMID:26493414

  3. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran

    PubMed Central

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2016-01-01

Social changes have rapidly displaced arranged marriages, and it seems this change in marriage pattern has played a role in childbearing. On the other hand, many countries face a great reduction in population, which requires a comprehensive policy to manage the considerable drop. To achieve this goal, the factors affecting fertility must first be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women aged 15-49 years, living in the north of Iran, were studied using a cluster sampling strategy. The results showed no significant differences in the reproductive behaviors of the three patterns of marriage in Babol city of Iran. There appears to be a convergence in childbearing across the different patterns of marriage, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning. PMID:26493414

  4. Methods Used by Pre-Service Nigeria Certificate in Education Teachers in Solving Quantitative Problems in Chemistry

    ERIC Educational Resources Information Center

    Danjuma, Ibrahim Mohammed

    2011-01-01

This paper reports part of the results of research on the chemical problem-solving behavior of pre-service teachers in Plateau and northeastern states of Nigeria. Specifically, it examines and describes the methods used by 204 pre-service teachers in solving quantitative problems from four topics in chemistry, namely gas laws; electrolysis;…

  5. An Improved Flow Cytometry Method For Precise Quantitation Of Natural-Killer Cell Activity

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Nehlsen-Cannarella, Sandra; Sams, Clarence

    2006-01-01

The ability to assess NK cell cytotoxicity using flow cytometry has been previously described and can serve as a powerful tool to evaluate effector immune function in the clinical setting. Previous methods used membrane-permeable dyes to identify target cells. The use of these dyes requires great care to achieve optimal staining and results in a broad spectral emission that can make multicolor cytometry difficult. Previous methods have also used negative staining (the elimination of target cells) to identify effector cells. This makes a precise quantitation of effector NK cells impossible due to the interfering presence of T and B lymphocytes, and renders the data highly subjective to the variable levels of NK cells normally found in human peripheral blood. In this study an improved version of the standard flow cytometry assay for NK activity is described that has several advantages over previous methods. Fluorescent antibody staining (CD45-FITC) is used to positively identify target cells in place of membrane-permeable dyes. Fluorescent antibody staining of target cells is less labor intensive and more easily reproducible than membrane dyes. NK cells (true effector lymphocytes) are also positively identified by fluorescent antibody staining (CD56-PE), allowing a simultaneous absolute-count assessment of both NK cells and target cells. Dead cells are identified by membrane disruption using the DNA-intercalating dye PI. Using this method, an exact NK:target ratio may be determined for each assessment, including quantitation of NK-target complexes. Back-immunoscatter gating may be used to track live vs. dead target cells via scatter properties. If desired, NK activity may then be normalized to standardized ratios for clinical comparisons between patients, making the determination of PBMC counts or NK cell percentages prior to testing unnecessary. This method provides an exact cytometric determination of NK activity that is highly reproducible and may be suitable for routine use in the
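The absolute counts such an assay yields make the downstream arithmetic exact. A sketch of the two standard calculations with invented example numbers; the correction for spontaneous target death is a common convention, not necessarily the authors' exact formula:

```python
def effector_target_ratio(nk_count, target_count):
    """Exact E:T ratio from absolute NK and target cell counts."""
    return nk_count / target_count

def specific_lysis(pct_dead_test, pct_dead_spontaneous):
    """Percent lysis corrected for targets that die without effectors present."""
    return 100.0 * (pct_dead_test - pct_dead_spontaneous) / \
           (100.0 - pct_dead_spontaneous)

ratio = effector_target_ratio(5.0e4, 1.0e4)   # 5:1 effector:target
lysis = specific_lysis(38.0, 8.0)             # dead targets: 38% test, 8% control
```

Because both cell populations are positively identified and counted, the ratio is measured rather than assumed, which is the improvement the abstract emphasizes.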

  6. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods

    NASA Astrophysics Data System (ADS)

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-01

The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of a validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol the GA-PLS results proved more reliable than those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).
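The recovery and RSE figures used to verify such chemometric models can be computed as follows. A sketch: the found/added amounts are invented, and RSE is taken here as root-mean-square error relative to the mean added amount, one common convention:

```python
import math

def mean_recovery(found, added):
    """Mean percent recovery across spiked samples."""
    return 100.0 * sum(f / a for f, a in zip(found, added)) / len(found)

def rse(found, added):
    """Relative standard error: RMSE as a percentage of the mean added amount."""
    rmse = math.sqrt(sum((f - a) ** 2 for f, a in zip(found, added)) / len(found))
    return 100.0 * rmse / (sum(added) / len(added))

found = [49.8, 100.4, 149.1]   # hypothetical amounts found, mg
added = [50.0, 100.0, 150.0]   # amounts added to the synthetic mixture, mg
recovery_pct = mean_recovery(found, added)
rse_pct = rse(found, added)
```

A model with recovery near 100% and a small RSE is accurate and precise in the sense used by the abstract.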

  7. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods.

    PubMed

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-15

The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of a validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol the GA-PLS results proved more reliable than those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%). PMID:26774813

  8. Revisiting the Quantitative-Qualitative Debate: Implications for Mixed-Methods Research

    PubMed Central

    SALE, JOANNA E. M.; LOHFELD, LYNNE H.; BRAZIL, KEVIN

    2015-01-01

    Health care research includes many studies that combine quantitative and qualitative methods. In this paper, we revisit the quantitative-qualitative debate and review the arguments for and against using mixed-methods. In addition, we discuss the implications stemming from our view, that the paradigms upon which the methods are based have a different view of reality and therefore a different view of the phenomenon under study. Because the two paradigms do not study the same phenomena, quantitative and qualitative methods cannot be combined for cross-validation or triangulation purposes. However, they can be combined for complementary purposes. Future standards for mixed-methods research should clearly reflect this recommendation. PMID:26523073

  9. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  10. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to reach 30-50 target molecules, which is close to theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 ± 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials. PMID:11767156
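The dependence of the LOQ (in %) on genome size follows from simple copy-number arithmetic: 200 ng of DNA contains fewer genome copies for a large genome, so the same 30-50 detectable target molecules represent a larger fraction. A rough check, assuming approximate 1C genome masses (an assumption; this reproduces only the ordering and order of magnitude of the reported 0.02% and 0.7%):

```python
# Approximate 1C genome masses in picograms (assumed literature values).
GENOME_PG = {"rice": 0.5, "wheat": 17.0}
SAMPLE_PG = 200e3     # 200 ng of genomic plant DNA, expressed in pg
LOQ_COPIES = 40       # mid-range of the 30-50 detectable target molecules

def loq_percent(plant):
    """LOQ as the detectable copies over total genome copies in the sample."""
    copies_in_sample = SAMPLE_PG / GENOME_PG[plant]
    return 100.0 * LOQ_COPIES / copies_in_sample

rice_loq = loq_percent("rice")     # small genome -> many copies -> low LOQ%
wheat_loq = loq_percent("wheat")   # huge genome -> few copies -> high LOQ%
```

The wheat LOQ comes out roughly an order of magnitude above rice, matching the trend in the abstract even though the exact percentages depend on the assumed genome masses.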

  11. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of the resulting landslide susceptibility maps, with particular emphasis on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km²), which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation techniques. The heuristic evaluation of the geomorphic plausibility of the final results was supported by findings of an exploratory data analysis, an estimation of odds ratios, and an evaluation of the spatial structure of the final maps. The results showed that maps generated with different inventories, classifiers, and predictors differed in appearance, while holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful to expose spatially varying inconsistencies in the modelling results while additionally providing evidence for slightly overfitted machine learning-based models. However, the highest predictive performances were obtained for

  12. Rapid quantitative analysis of lipids using a colorimetric method in a microplate format.

    PubMed

    Cheng, Yu-Shen; Zheng, Yi; VanderGheynst, Jean S

    2011-01-01

    A colorimetric sulfo-phospho-vanillin (SPV) method was developed for high throughput analysis of total lipids. The developed method uses a reaction mixture that is maintained in a 96-well microplate throughout the entire assay. The new assay provides the following advantages over other methods of lipid measurement: (1) background absorbance can be easily corrected for each well, (2) there is less risk of handling and transferring sulfuric acid contained in reaction mixtures, (3) color develops more consistently providing more accurate measurement of absorbance, and (4) the assay can be used for quantitative measurement of lipids extracted from a wide variety of sources. Unlike other spectrophotometric approaches that use fluorescent dyes, the optimal spectra and reaction conditions for the developed assay do not vary with the sample source. The developed method was used to measure lipids in extracts from four strains of microalgae. No significant difference was found in lipid determination when lipid content was measured using the new method and compared to results obtained using a macro-gravimetric method. PMID:21069472
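    The per-well background correction in advantage (1) can be illustrated as follows. This is a minimal sketch, not the published protocol; the well names, absorbance values, and calibration constants are all invented.

```python
# Minimal sketch of per-well background correction for a microplate
# colorimetric assay. All values (wells, absorbances, calibration line)
# are hypothetical.

def correct_background(raw, background):
    """Subtract each well's own pre-reaction background absorbance."""
    return {well: raw[well] - background[well] for well in raw}

def absorbance_to_lipid(a_corrected, slope, intercept):
    """Convert corrected absorbance to lipid mass via a linear calibration
    curve (absorbance = slope * mass + intercept)."""
    return {well: (a - intercept) / slope for well, a in a_corrected.items()}

raw = {"A1": 0.82, "A2": 0.64}          # absorbance after colour development
background = {"A1": 0.05, "A2": 0.04}   # same wells before the SPV reaction
corrected = correct_background(raw, background)
lipid_ug = absorbance_to_lipid(corrected, slope=0.01, intercept=0.02)
```

    Because each well carries its own background reading, no separate blank plate is needed, which mirrors advantage (1) in the abstract.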

  13. The Impact of Acquisition Dose on Quantitative Breast Density Estimation with Digital Mammography: Results from ACRIN PA 4006.

    PubMed

    Chen, Lin; Ray, Shonket; Keller, Brad M; Pertuz, Said; McDonald, Elizabeth S; Conant, Emily F; Kontos, Despina

    2016-09-01

    Purpose To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88-0.95; weighted κ = 0.83-0.90). However, differences in breast percent density (1.04% and 3.84%, P < .05) were observed within high BI-RADS density categories, although they were significantly correlated across the different acquisition dose levels (r = 0.76-0.92, P < .05). Conclusion Precision and reproducibility of automated breast density measurements with digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density estimation
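    Agreement across dose levels is summarized above with weighted Cohen kappa values. The sketch below implements the common quadratic weighting for ordinal categories, which may differ from the exact weighting used in the trial; the rating data are invented.

```python
# Quadratic-weighted Cohen's kappa for ordinal ratings (e.g., density
# categories measured at two dose levels). A common formulation; the
# study's exact weighting scheme may differ. Data are invented.

def weighted_kappa(r1, r2, categories):
    n, k = len(r1), len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    obs = [[0.0] * k for _ in range(k)]          # observed joint proportions
    for a, b in zip(r1, r2):
        obs[idx[a]][idx[b]] += 1.0 / n
    p1 = [sum(row) for row in obs]               # marginals of protocol 1
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    w = [[(i - j) ** 2 / (k - 1) ** 2 for j in range(k)] for i in range(k)]
    po = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    pe = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1.0 - po / pe

exact = weighted_kappa([1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4])   # perfect
partial = weighted_kappa([1, 1, 2, 2], [1, 2, 2, 2], [1, 2])       # partial
```

    Weighted kappa penalizes disagreements more the further apart the ordinal categories are, which suits BI-RADS-style density grades better than unweighted agreement.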

  14. Application of bias correction methods to improve the accuracy of quantitative radar rainfall in Korea

    NASA Astrophysics Data System (ADS)

    Lee, J.-K.; Kim, J.-H.; Suk, M.-K.

    2015-11-01

    There are many potential sources of bias in the radar rainfall estimation process. This study classified the biases arising in rainfall estimation into the reflectivity measurement bias and the rainfall estimation bias of the Quantitative Precipitation Estimation (QPE) model, and applied bias correction methods to improve the accuracy of the Radar-AWS Rainrate (RAR) calculation system operated by the Korea Meteorological Administration (KMA). To correct the Z (reflectivity) bias introduced when measuring rainfall, this study utilized a bias correction algorithm in which the reflectivity of the target single-pol radars is corrected against a reference dual-pol radar whose hardware and software biases have already been corrected. The study then applied two post-processing methods, the Mean Field Bias Correction (MFBC) method and the Local Gauge Correction (LGC) method, to correct the rainfall estimation bias of the QPE model. The Z bias and rainfall estimation bias correction methods were applied to the RAR system, and the accuracy of the RAR system improved after correcting the Z bias. By rainfall type, although the accuracy for the Changma-front and local torrential cases improved slightly even without the Z bias correction, the accuracy for the typhoon cases in particular was worse than the existing results. As a result of the rainfall estimation bias correction, the Z bias_LGC combination was superior to the MFBC method because the LGC method applies a different rainfall bias to each grid rainfall amount. By rainfall type, the Z bias_LGC results showed that the rainfall estimates for all types were more accurate than with the Z bias correction alone and, especially in the typhoon cases, the outcomes were vastly superior to the others.
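    A common formulation of the mean field bias correction is a single multiplicative factor, the ratio of gauge to radar rainfall totals at gauge locations, applied uniformly to the radar field. The KMA implementation details are not given in the abstract, and the numbers below are invented.

```python
# Sketch of a mean field bias (MFB) correction: one multiplicative factor,
# estimated at rain gauge locations, applied uniformly to the radar field.
# Gauge and radar values are invented.

def mean_field_bias(gauge_mm, radar_mm):
    """Ratio of total gauge rainfall to total radar rainfall at gauge sites."""
    return sum(gauge_mm) / sum(radar_mm)

def apply_mfb(radar_field_mm, bias):
    return [r * bias for r in radar_field_mm]

gauge = [10.0, 8.0, 12.0]          # gauge accumulations (mm)
radar_at_gauges = [8.0, 7.0, 9.0]  # collocated radar estimates (mm)
bias = mean_field_bias(gauge, radar_at_gauges)   # 30 / 24 = 1.25
corrected = apply_mfb([4.0, 6.0], bias)
```

    The LGC method, by contrast, lets the correction factor vary from grid cell to grid cell, which is consistent with the abstract's finding that it handles spatially varying bias better than the single MFBC factor.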

  15. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

    In order to develop image processing that is widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron and nano-scale information which is characteristic of each rock type and its history. These could be used for mineral identification and studies in rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.
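    Rq is conventionally the root-mean-square deviation of the surface profile from its mean height. A minimal sketch with an invented profile follows; the paper's window-resizing analysis and the Hmm parameter are not reproduced.

```python
import math

# RMS roughness Rq of a height profile: the root-mean-square deviation
# from the mean height. Profile values are invented toy data.

def rq_roughness(heights):
    mean = sum(heights) / len(heights)
    return math.sqrt(sum((h - mean) ** 2 for h in heights) / len(heights))

profile = [0.0, 2.0, 0.0, -2.0]   # heights in micrometres (toy data)
rq = rq_roughness(profile)        # sqrt(2)
```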

  16. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    SciTech Connect

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision of the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  17. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    PubMed

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms. PMID:26643074
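    The image-analysis step can be illustrated by thresholding stain intensity: the fraction of pixels above a threshold approximates coverage, and their mean intensity approximates growth intensity. This is a hypothetical simplification of the published algorithm, with an invented toy image.

```python
# Toy sketch of biofilm quantification from a stained image: pixels whose
# stain intensity exceeds a threshold count as fouled. The 3x3 "image" and
# threshold are invented; the published algorithm is more involved.

def quantify_fouling(image, threshold):
    pixels = [p for row in image for p in row]
    fouled = [p for p in pixels if p > threshold]
    coverage = len(fouled) / len(pixels)            # fraction of area fouled
    mean_intensity = sum(fouled) / len(fouled) if fouled else 0.0
    return coverage, mean_intensity

image = [
    [10, 200, 180],
    [12, 190, 15],
    [11, 14, 210],
]
coverage, intensity = quantify_fouling(image, threshold=100)
```

    Working from whole photographs rather than microscope fields is what lets the method cover a large area quickly, as the abstract emphasizes.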

  18. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGESBeta

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision of the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  19. A simple method for the subnanomolar quantitation of seven ophthalmic drugs in the rabbit eye.

    PubMed

    Latreille, Pierre-Luc; Banquy, Xavier

    2015-05-01

    This study describes the development and validation of a new liquid chromatography-tandem mass spectrometry (MS/MS) method capable of simultaneous quantitation of seven ophthalmic drugs (pilocarpine, lidocaine, atropine, proparacaine, timolol, prednisolone, and triamcinolone acetonide) within regions of the rabbit eye. The complete validation of the method was performed using an Agilent 1100 series high-performance liquid chromatography system coupled to a 4000 QTRAP MS/MS detector in positive TurboIonSpray mode with pooled drug solutions. The method sensitivity, evaluated by the lower limit of quantitation in two simulated matrices, yielded lower limits of quantitation of 0.25 nmol L(-1) for most of the drugs. The precision in the low, medium, and high ranges of the calibration curves, the freeze-thaw stability over 1 month, the intraday precision, and the interday precision were all within a 15% limit. The method was used to quantitate the different drugs in the cornea, aqueous humor, vitreous humor, and remaining eye tissues of the rabbit eye. It was validated to a concentration of up to 1.36 ng/g in humors and 5.43 ng/g in tissues. The unprecedentedly low detection limit of the present method and its ease of implementation allow easy, robust, and reliable quantitation of multiple drugs for rapid in vitro and in vivo evaluation of the local pharmacokinetics of these compounds. PMID:25749792

  20. Quantitative analysis of single amino acid variant peptides associated with pancreatic cancer in serum by an isobaric labeling quantitative method.

    PubMed

    Nie, Song; Yin, Haidi; Tan, Zhijing; Anderson, Michelle A; Ruffin, Mack T; Simeone, Diane M; Lubman, David M

    2014-12-01

    Single amino acid variations are highly associated with many human diseases. The direct detection of peptides containing single amino acid variants (SAAVs) derived from nonsynonymous single nucleotide polymorphisms (SNPs) in serum can provide unique opportunities for SAAV associated biomarker discovery. In the present study, an isobaric labeling quantitative strategy was applied to identify and quantify variant peptides in serum samples of pancreatic cancer patients and other benign controls. The largest number of SAAV peptides to date in serum including 96 unique variant peptides were quantified in this quantitative analysis, of which five variant peptides showed a statistically significant difference between pancreatic cancer and other controls (p-value < 0.05). Significant differences in the variant peptide SDNCEDTPEAGYFAVAVVK from serotransferrin were detected between pancreatic cancer and controls, which was further validated by selected reaction monitoring (SRM) analysis. The novel biomarker panel obtained by combining α-1-antichymotrypsin (AACT), Thrombospondin-1 (THBS1) and this variant peptide showed an excellent diagnostic performance in discriminating pancreatic cancer from healthy controls (AUC = 0.98) and chronic pancreatitis (AUC = 0.90). These results suggest that large-scale analysis of SAAV peptides in serum may provide a new direction for biomarker discovery research. PMID:25393578

  1. Quantitative measurement of ultrasound pressure field by optical phase contrast method and acoustic holography

    NASA Astrophysics Data System (ADS)

    Oyama, Seiji; Yasuda, Jun; Hanayama, Hiroki; Yoshizawa, Shin; Umemura, Shin-ichiro

    2016-07-01

    A fast and accurate measurement of an ultrasound field with various exposure sequences is necessary to ensure the efficacy and safety of various ultrasound applications in medicine. The most common method used to measure an ultrasound pressure field, that is, hydrophone scanning, requires a long scanning time and potentially disturbs the field. This may limit the efficiency of developing applications of ultrasound. In this study, an optical phase contrast method enabling fast and noninterfering measurements is proposed. In this method, the modulated phase of light caused by the focused ultrasound pressure field is measured. Then, a computed tomography (CT) algorithm used to quantitatively reconstruct a three-dimensional (3D) pressure field is applied. For a high-intensity focused ultrasound field, a new approach that combines the optical phase contrast method and acoustic holography was attempted. First, the optical measurement of focused ultrasound was rapidly performed over the field near a transducer. Second, the nonlinear propagation of the measured ultrasound was simulated. The result of the new approach agreed well with that of the measurement using a hydrophone and was improved from that of the phase contrast method alone with phase unwrapping.

  2. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses.

    PubMed

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation detected immediately beforehand and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425
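    The abstract does not give the LQPM formulas. One plausible reading is sketched below, with the qualitative similarity taken as the Pearson correlation between sample and reference fingerprints (shape only) and the quantitative similarity as the slope of a zero-intercept least-squares fit (shape plus total content). Both the definitions and the fingerprint vectors here are assumptions for illustration only.

```python
# Hedged sketch of fingerprint similarity measures. The exact LQPM
# definitions are not in the abstract; Pearson r (shape only) and a
# zero-intercept slope (shape plus amount) illustrate why a qualitative
# measure alone cannot discriminate samples that differ only in content.

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def zero_intercept_slope(ref, sample):
    return sum(r * s for r, s in zip(ref, sample)) / sum(r * r for r in ref)

ref = [10.0, 50.0, 30.0, 5.0]        # reference fingerprint peak areas
sample = [8.0, 40.0, 24.0, 4.0]      # same profile at 80% of the content
r = pearson_r(ref, sample)                  # 1.0: shapes identical
slope = zero_intercept_slope(ref, sample)   # 0.8: content is lower
```

    In this toy case the correlation is perfect even though the sample holds only 80% of the reference content, which mirrors the abstract's point that a qualitative similarity alone is not sufficiently discriminating.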

  3. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses

    PubMed Central

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation detected immediately beforehand and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425

  4. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    ERIC Educational Resources Information Center

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  5. Student Performance in a Quantitative Methods Course under Online and Face-to-Face Delivery

    ERIC Educational Resources Information Center

    Verhoeven, Penny; Wakeling, Victor

    2011-01-01

    In a study conducted at a large public university, the authors assessed, for an upper-division quantitative methods business core course, the impact of delivery method (online versus face-to-face) on the success rate (percentage of enrolled students earning a grade of A, B, or C in the course). The success rate of the 161 online students was 55.3%,…

  6. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    ERIC Educational Resources Information Center

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a particular…

  7. A method for the quantitative determination of crystalline phases by X-ray

    NASA Technical Reports Server (NTRS)

    Petzenhauser, I.; Jaeger, P.

    1988-01-01

    A mineral analysis method is described for rapid quantitative determination of crystalline substances in those cases in which the sample is present in pure form or in a mixture of known composition. With this method there is no need for prior chemical analysis.

  8. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    ERIC Educational Resources Information Center

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…

  9. Clustering and training set selection methods for improving the accuracy of quantitative laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Anderson, Ryan B.; Bell, James F., III; Wiens, Roger C.; Morris, Richard V.; Clegg, Samuel M.

    2012-04-01

    We investigated five clustering and training set selection methods to improve the accuracy of quantitative chemical analysis of geologic samples by laser induced breakdown spectroscopy (LIBS) using partial least squares (PLS) regression. The LIBS spectra were previously acquired for 195 rock slabs and 31 pressed powder geostandards under 7 Torr CO2 at a stand-off distance of 7 m at 17 mJ per pulse to simulate the operational conditions of the ChemCam LIBS instrument on the Mars Science Laboratory Curiosity rover. The clustering and training set selection methods, which do not require prior knowledge of the chemical composition of the test-set samples, are based on grouping similar spectra and selecting appropriate training spectra for the partial least squares (PLS2) model. These methods were: (1) hierarchical clustering of the full set of training spectra and selection of a subset for use in training; (2) k-means clustering of all spectra and generation of PLS2 models based on the training samples within each cluster; (3) iterative use of PLS2 to predict sample composition and k-means clustering of the predicted compositions to subdivide the groups of spectra; (4) soft independent modeling of class analogy (SIMCA) classification of spectra, and generation of PLS2 models based on the training samples within each class; (5) use of Bayesian information criteria (BIC) to determine an optimal number of clusters and generation of PLS2 models based on the training samples within each cluster. The iterative method and the k-means method using 5 clusters showed the best performance, improving the absolute quadrature root mean squared error (RMSE) by ~ 3 wt.%. The statistical significance of these improvements was ~ 85%. Our results show that although clustering methods can modestly improve results, a large and diverse training set is the most reliable way to improve the accuracy of quantitative LIBS. 
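    Method (2) above (cluster spectra, then train a model per cluster) can be sketched as follows. A one-dimensional k-means on a single summary feature and ordinary least squares stand in for the multivariate spectra and PLS2 models used in the study; all numbers are invented toy data.

```python
# Sketch of clustering-then-regression (method 2 in the abstract): cluster
# training spectra, fit one regression per cluster, and predict each unknown
# with its nearest cluster's model. A 1-D k-means on a single summary feature
# and ordinary least squares stand in for the multivariate spectra and PLS2
# models of the study; all numbers are invented toy data.

def nearest(x, centers):
    return min(range(len(centers)), key=lambda j: abs(x - centers[j]))

def kmeans_1d(xs, k, iters=50):
    centers = sorted(xs)[:: max(1, len(xs) // k)][:k]   # spread-out seeds
    for _ in range(iters):
        groups = [[] for _ in centers]
        for x in xs:
            groups[nearest(x, centers)].append(x)
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return b, my - b * mx

# toy "spectral feature" and known composition (wt.%) of training samples
feature = [1.0, 1.2, 0.9, 5.0, 5.2, 4.8]
composition = [10.0, 12.0, 9.0, 50.0, 52.0, 48.0]

centers = kmeans_1d(feature, k=2)
models = {}
for ci in range(len(centers)):
    idx = [i for i, x in enumerate(feature) if nearest(x, centers) == ci]
    models[ci] = fit_line([feature[i] for i in idx],
                          [composition[i] for i in idx])

def predict(x):
    b, a = models[nearest(x, centers)]
    return b * x + a
```

    Restricting each regression to spectrally similar training samples is the core idea; the abstract's caveat still applies, in that a large and diverse training set helps more than any clustering refinement.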
In particular, additional sulfate standards and specifically fabricated

  10. Spiked proteomic standard dataset for testing label-free quantitative software and statistical methods

    PubMed Central

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Dorssaeler, Alain Van; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2015-01-01

    This data article describes a controlled, spiked proteomic dataset for which the “ground truth” of variant proteins is known. It is based on the LC-MS analysis of samples composed of a fixed background of yeast lysate and different spiked amounts of the UPS1 mixture of 48 recombinant proteins. It can be used to objectively evaluate bioinformatic pipelines for label-free quantitative analysis, and their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. More specifically, it can be useful for tuning software tool parameters, but also testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. The raw MS files can be downloaded from ProteomeXchange with identifier PXD001819. Starting from some raw files of this dataset, we also provide here some processed data obtained through various bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, to exemplify the use of such data in the context of software benchmarking, as discussed in detail in the accompanying manuscript [1]. The experimental design used here for data processing takes advantage of the different spike levels introduced in the samples composing the dataset, and processed data are merged in a single file to facilitate the evaluation and illustration of software tool results for the detection of variant proteins with different absolute expression levels and fold change values. PMID:26862574

  11. Comparison of six DNA extraction methods for recovery of fungal DNA as assessed by quantitative PCR.

    PubMed

    Fredricks, David N; Smith, Caitlin; Meier, Amalia

    2005-10-01

    The detection of fungal pathogens in clinical samples by PCR requires the use of extraction methods that efficiently lyse fungal cells and recover DNA suitable for amplification. We used quantitative PCR assays to measure the recovery of DNA from two important fungal pathogens subjected to six DNA extraction methods. Aspergillus fumigatus conidia or Candida albicans yeast cells were added to bronchoalveolar lavage fluid and subjected to DNA extraction in order to assess the recovery of DNA from a defined number of fungal propagules. In order to simulate hyphal growth in tissue, Aspergillus fumigatus conidia were allowed to form mycelia in tissue culture media and then harvested for DNA extraction. Differences among the DNA yields from the six extraction methods were highly significant (P<0.0001) in each of the three experimental systems. An extraction method based on enzymatic lysis of fungal cell walls (yeast cell lysis plus the use of GNOME kits) produced high levels of fungal DNA with Candida albicans but low levels of fungal DNA with Aspergillus fumigatus conidia or hyphae. Extraction methods employing mechanical agitation with beads produced the highest yields with Aspergillus hyphae. The Master Pure yeast method produced high levels of DNA from C. albicans but only moderate yields from A. fumigatus. A reagent from one extraction method was contaminated with fungal DNA, including DNA from Aspergillus and Candida species. In conclusion, the six extraction methods produce markedly differing yields of fungal DNA and thus can significantly affect the results of fungal PCR assays. No single extraction method was optimal for all organisms. PMID:16207973
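    The comparison of yields across extraction methods can be sketched with a one-way ANOVA F statistic; the abstract reports P values but does not name the exact test, and the yield numbers below are invented.

```python
# One-way ANOVA F statistic across groups of DNA yields. A standard way to
# test for differences among extraction methods; the abstract does not name
# the exact test used, and the data here are invented.

def one_way_anova_f(groups):
    all_vals = [v for g in groups for v in g]
    n, k = len(all_vals), len(groups)
    grand = sum(all_vals) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((v - sum(g) / len(g)) ** 2 for v in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical yields (arbitrary units) for two extraction methods
f_stat = one_way_anova_f([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
```

    A large F (between-group variance dominating within-group variance) corresponds to the highly significant differences among methods that the study reports.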

  12. Comparative Application of PLS and PCR Methods to Simultaneous Quantitative Estimation and Simultaneous Dissolution Test of Zidovudine - Lamivudine Tablets.

    PubMed

    Üstündağ, Özgür; Dinç, Erdal; Özdemir, Nurten; Tilkan, M Günseli

    2015-01-01

    In the development strategies of new drug products and generic drug products, the simultaneous in-vitro dissolution behavior of oral dosage formulations is the most important indicator for the quantitative estimation of the efficiency and biopharmaceutical characteristics of drug substances. This compels scientists in the related fields to develop very powerful analytical methods that yield more reliable, precise and accurate results in the quantitative analysis and dissolution testing of drug formulations. In this context, two chemometric tools, partial least squares (PLS) and principal component regression (PCR), were developed for the simultaneous quantitative estimation and dissolution testing of zidovudine (ZID) and lamivudine (LAM) in a tablet dosage form. The results obtained in this study strongly encourage their use for the quality control, routine analysis and dissolution testing of marketed tablets containing the ZID and LAM drugs. PMID:26085428

  13. Fluorometric method for the simultaneous quantitation of differently-sized nanoparticles in rodent tissue.

    PubMed

    Hussain, N

    2001-02-19

    The oral absorption and systemic translocation of particulate matter via the gastrointestinal tract has been shown by a number of laboratories using a wide variety of particles in different animal species. While there is debate on the magnitude of particle intestinal translocation, which is encumbered by differing experimental protocols, particularly the method of quantitation of absorbed material, few have sought to examine the pharmacokinetic aspects of particle absorption. We describe in this communication the development of a simple and rapid fluorometric assay for quantifying tissue-laden fluorescent nanoparticles that is able to isolate, detect and quantify the presence of two or more particle populations differing both in their size and fluorescent label. Six types of polystyrene nanoparticles incorporating different fluorescent markers were spiked into whole livers. The fluorophores were extracted using our previously developed method of freeze-drying the tissue and using chloroform as the extractive solvent. Only two particle populations, orange-labelled 40 nm and fluorescein-emitting 500 nm nanoparticles, were sufficiently recoverable and provided a high signal-to-noise ratio for further work. The amount and type of biological tissue also affected nanoparticle recovery and detection, reflecting, perhaps, the quenching effects of interacting tissue-derived molecules. In addition, the results also indicate that the use of nanoparticles incorporating fluorescent dyes with emission over 500 nm overcomes tissue-interfering autofluorescence for low doses of nanoparticles. The use of this fluorometric method has several advantages compared with other modes of quantitation in that it is rapid, non-radioactive and the marker is non-leaching. More importantly, it allows the simultaneous detection of multiple fluorophores such that two or more different fluorescent particle populations can be detected in the same sample.

  14. A Novel Method of Quantitative Anterior Chamber Depth Estimation Using Temporal Perpendicular Digital Photography

    PubMed Central

    Zamir, Ehud; Kong, George Y.X.; Kowalski, Tanya; Coote, Michael; Ang, Ghee Soon

    2016-01-01

    Purpose We hypothesize that: (1) anterior chamber depth (ACD) is correlated with the relative anteroposterior position of the pupillary image, as viewed from the temporal side; and (2) such a correlation may be used as a simple quantitative tool for estimation of ACD. Methods Two hundred sixty-six phakic eyes had lateral digital photographs taken from the temporal side, perpendicular to the visual axis, and underwent optical biometry (Nidek AL scanner). The relative anteroposterior position of the pupillary image was expressed as the ratio between: (1) the lateral photographic temporal limbus-to-pupil distance (“E”) and (2) the lateral photographic temporal limbus-to-cornea distance (“Z”). In the first chronological half of patients (Correlation Series), the E:Z ratio (EZR) was correlated with optical biometric ACD. The correlation equation was then used to predict ACD in the second half of patients (Prediction Series) and compared with their biometric ACD for agreement analysis. Results A strong linear correlation was found between EZR and ACD (R = −0.91, R² = 0.81). Bland-Altman analysis showed good agreement between the ACD predicted by this method and the optical biometric ACD: the mean error was −0.013 mm (range −0.377 to 0.336 mm), with a standard deviation of 0.166 mm, and the 95% limits of agreement were ±0.33 mm. Conclusions Lateral digital photography with EZR calculation is a novel method to quantitatively estimate ACD that requires minimal equipment and training. Translational Relevance The EZ ratio may be employed in screening for angle closure glaucoma. It may also be helpful in outpatient medical clinic settings, where doctors need to weigh the safety of topical or systemic pupil-dilating medications against their risk of triggering acute angle closure glaucoma. Similarly, non-ophthalmologists may use it to estimate the likelihood of acute angle closure glaucoma in emergency presentations. PMID:27540496
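    The prediction pipeline described in the abstract (fit a linear ACD ~ EZR model on one series, predict on the other, check agreement by Bland-Altman) can be sketched as follows. All numbers below are illustrative placeholders, not the study's data; the paper's actual fit had R = −0.91 on 266 eyes.

```python
import numpy as np

def ez_ratio(limbus_to_pupil_mm, limbus_to_cornea_mm):
    """EZR = E / Z, the relative anteroposterior pupil position
    in a temporal (lateral) photograph."""
    return limbus_to_pupil_mm / limbus_to_cornea_mm

# --- Correlation series: fit a linear model ACD ~ EZR (toy data) ---
ezr_train = np.array([0.30, 0.35, 0.40, 0.45, 0.50])
acd_train = np.array([3.6, 3.3, 3.0, 2.7, 2.4])   # mm, from optical biometry

slope, intercept = np.polyfit(ezr_train, acd_train, 1)

# --- Prediction series: estimate ACD from the photograph alone ---
def predict_acd(ezr):
    return slope * ezr + intercept

# --- Bland-Altman agreement against biometric ACD ---
ezr_test = np.array([0.32, 0.42, 0.48])
acd_biometry = np.array([3.50, 2.85, 2.55])       # mm, toy reference values
errors = predict_acd(ezr_test) - acd_biometry
mean_error = errors.mean()
limits = (mean_error - 1.96 * errors.std(ddof=1),
          mean_error + 1.96 * errors.std(ddof=1))  # 95% limits of agreement
```

The 95% limits of agreement computed this way correspond to the ±0.33 mm figure the authors report for their data.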

  15. A quantitative PCR method to quantify ruminant DNA in porcine crude heparin.

    PubMed

    Concannon, Sean P; Wimberley, P Brett; Workman, Wesley E

    2011-01-01

    Heparin is a well-known glycosaminoglycan extracted from porcine intestines. Increased vigilance for transmissible spongiform encephalopathy in animal-derived pharmaceuticals requires methods to prevent the introduction of heparin from ruminants into the supply chain. The sensitivity, specificity, and precision of the quantitative polymerase chain reaction (PCR) make it a superior analytical platform for screening heparin raw material for bovine-, ovine-, and caprine-derived material. A quantitative PCR probe and primer set homologous to the ruminant Bov-A2 short interspersed nuclear element (SINE) locus (Mendoza-Romero et al. J. Food Prot. 67:550-554, 2004) demonstrated nearly equivalent affinities for bovine, ovine, and caprine DNA targets, while exhibiting no cross-reactivity with porcine DNA in the quantitative PCR method. A second PCR primer and probe set, specific for the porcine PRE1 SINE sequence, was also developed to quantify the background porcine DNA level. DNA extraction and purification were not necessary for analysis of the raw heparin samples, although digestion of the sample with heparinase was employed. The method exhibits a quantitation range of 0.3-3,000 ppm ruminant DNA in heparin. Validation parameters of the method included accuracy, repeatability, precision, specificity, range, quantitation limit, and linearity. PMID:21058016
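    Quantitation in assays of this kind typically runs through a log-linear standard curve (Ct versus log10 of target amount). A minimal sketch of that step, with entirely hypothetical Ct values anchored to the 0.3-3,000 ppm range the abstract mentions:

```python
import numpy as np

# Hypothetical standard curve: Ct vs log10(ruminant DNA, ppm in heparin).
# A slope near -3.32 corresponds to ~100% PCR amplification efficiency.
ppm_standards = np.array([0.3, 3.0, 30.0, 300.0, 3000.0])
ct_standards  = np.array([35.2, 31.9, 28.6, 25.3, 22.0])   # illustrative

slope, intercept = np.polyfit(np.log10(ppm_standards), ct_standards, 1)
efficiency = 10 ** (-1.0 / slope) - 1        # 1.0 would mean 100% efficiency

def ppm_from_ct(ct):
    """Interpolate an unknown sample's ruminant-DNA level from its Ct."""
    return 10 ** ((ct - intercept) / slope)
```

With these toy standards, a sample at Ct 28.6 reads back as 30 ppm; the same inversion applies to any Ct inside the calibrated range.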

  16. A validated method for the quantitation of 1,1-difluoroethane using a gas in equilibrium method of calibration.

    PubMed

    Avella, Joseph; Lehrer, Michael; Zito, S William

    2008-10-01

    1,1-Difluoroethane (DFE), also known as Freon 152A, is a member of a class of compounds known as halogenated hydrocarbons. A number of these compounds have gained notoriety because of their ability to induce rapid onset of intoxication after inhalation exposure. Abuse of DFE has necessitated the development of methods for its detection and quantitation in postmortem and human performance specimens. Furthermore, methodologies applicable to research studies are required, as there have been limited toxicokinetic and toxicodynamic reports published on DFE. This paper describes a method for the quantitation of DFE using a gas chromatography-flame-ionization headspace technique that employs solventless standards for calibration. Two calibration curves using 0.5 mL whole blood calibrators were developed: curve A ranged from 0.225 to 1.350 mg/L and curve B from 9.0 to 180.0 mg/L. These were evaluated for linearity (0.9992 and 0.9995), limit of detection (0.018 mg/L), limit of quantitation (0.099 mg/L; recovery 111.9%, CV 9.92%), and upper limit of linearity (27,000.0 mg/L). The combined-curve recovery of a 98.0 mg/L DFE control prepared using an alternate technique was 102.2%, with a CV of 3.09%. No matrix interference was observed in DFE-enriched blood, urine, or brain specimens, nor did analysis of variance detect any significant differences (alpha = 0.01) in the area under the curve of blood, urine, or brain specimens at three identical DFE concentrations. The method is suitable for use in forensic laboratories because validation was performed on instrumentation routinely used in forensic labs and because the calibration range can easily be adjusted. Perhaps more importantly, it is also useful for research-oriented studies, because removing solvent from standard preparation eliminates the possibility of solvent-induced changes to the gas/liquid partitioning of DFE and of chromatographic interference from solvent in specimens. PMID:19007521
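    The linearity and detection-limit figures quoted above come from a standard calibration-curve workup. A sketch of that arithmetic, using invented peak areas and the common 3σ/10σ convention for LOD/LOQ (the paper does not state which convention it used):

```python
import numpy as np

# Hypothetical curve-A calibrators (mg/L DFE in 0.5 mL whole blood)
conc = np.array([0.225, 0.450, 0.675, 0.900, 1.125, 1.350])
area = np.array([1040, 2110, 3090, 4180, 5160, 6230])   # FID peak areas (toy)

slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]       # linearity check, e.g. r**2 > 0.999

# LOD/LOQ from the spread of low-level replicate responses
# (3*sigma and 10*sigma convention; sigma here is an assumed value)
blank_sd = 8.5                          # sd of low-level replicate areas
lod = 3 * blank_sd / slope
loq = 10 * blank_sd / slope
```

Recovery at the LOQ is then checked by running a control of known concentration through the curve, as the abstract's 111.9% figure illustrates.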

  17. Quantitative analysis of eugenol in clove extract by a validated HPLC method.

    PubMed

    Yun, So-Mi; Lee, Myoung-Heon; Lee, Kwang-Jick; Ku, Hyun-Ok; Son, Seong-Wan; Joo, Yi-Seok

    2010-01-01

    Clove (Eugenia caryophyllata) is a well-known medicinal plant used for diarrhea and digestive disorders, or in antiseptics, in Korea. Eugenol is the main active ingredient of clove and has been chosen as a marker compound for the chemical evaluation or quality control of clove. This paper reports the development and validation of an HPLC-diode array detection (DAD) method for the determination of eugenol in clove. HPLC separation was accomplished on an XTerra RP18 column (250 x 4.6 mm id, 5 microm) with an isocratic mobile phase of 60% methanol and DAD at 280 nm. Calibration graphs were linear with very good correlation coefficients (r² > 0.9999) from 12.5 to 1000 ng/mL. The LOD was 0.81 ng/mL and the LOQ was 2.47 ng/mL. The method showed good intraday precision (%RSD 0.08-0.27%) and interday precision (%RSD 0.32-1.19%). The method was applied to the analysis of eugenol from clove cultivated in various countries (Indonesia, Singapore, and China). Quantitative analysis of the 15 clove samples showed that the eugenol content varied significantly, ranging from 163 to 1049 ppb. Based on these results, determination of eugenol by HPLC is accurate enough to evaluate the quality and safety of clove. PMID:21313806

  18. Methods for the Specific Detection and Quantitation of Amyloid-β Oligomers in Cerebrospinal Fluid.

    PubMed

    Schuster, Judith; Funke, Susanne Aileen

    2016-05-01

    Protein misfolding and aggregation are fundamental features of the majority of neurodegenerative diseases, like Alzheimer's disease (AD), Parkinson's disease, frontotemporal dementia, and prion diseases. Proteinaceous deposits in the brain of the patient, e.g., amyloid plaques consisting of the amyloid-β (Aβ) peptide and tangles composed of tau protein, are the hallmarks of AD. Soluble oligomers of Aβ and tau play a fundamental role in disease progression, and specific detection and quantification of the respective oligomeric proteins in cerebrospinal fluid may provide presymptomatically detectable biomarkers, paving the way for early diagnosis or even prognosis. Several studies on the development of techniques for the specific detection of Aβ oligomers have been published, but some of the existing tools do not yet seem satisfactory, and the study results are contradictory. The detection of oligomers is challenging due to their polymorphous and unstable nature, their low concentration, and the presence of competing proteins and Aβ monomers in body fluids. Here, we present an overview of the current state of the development of methods for Aβ oligomer-specific detection and quantitation. The methods are divided into three subgroups: (i) enzyme-linked immunosorbent assays (ELISA), (ii) methods for single-oligomer detection, and (iii) others, mainly biosensor-based methods. PMID:27163804

  19. Quantitative analysis and efficiency study of PSD methods for a LaBr3:Ce detector

    NASA Astrophysics Data System (ADS)

    Zeng, Ming; Cang, Jirong; Zeng, Zhi; Yue, Xiaoguang; Cheng, Jianping; Liu, Yinong; Ma, Hao; Li, Junli

    2016-03-01

    The LaBr3:Ce scintillator has been widely studied for nuclear spectroscopy because of its optimal energy resolution (<3% @ 662 keV) and time resolution (~300 ps). Despite these promising properties, the intrinsic radiation background of LaBr3:Ce is a critical issue, and pulse shape discrimination (PSD) has been shown to be an efficient potential method to suppress the alpha background from 227Ac. In this paper, the charge comparison method (CCM) for alpha and gamma discrimination in LaBr3:Ce is quantitatively analysed and compared with two other typical PSD methods using digital pulse processing. The algorithm parameters and discrimination efficiency are calculated for each method. Moreover, for the CCM, the correlation between the CCM feature value distribution and the total charge (energy) is studied, and a fitting equation for the correlation is inferred and experimentally verified. Using these equations, an energy-dependent threshold can be chosen to optimize the discrimination efficiency. The experimental results also show a potential application in low-activity, high-energy γ measurement by suppressing the alpha background.
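    The charge comparison method reduces each digitized pulse to the fraction of its integral falling in a delayed "tail" window; alpha and gamma events separate slightly in that ratio. A minimal sketch on synthetic pulses, where the window boundaries and decay constants are invented for illustration, not taken from the paper:

```python
import numpy as np

def ccm_feature(pulse, tail_start, total_len):
    """Charge comparison: fraction of the pulse integral in the tail window."""
    total = np.sum(pulse[:total_len])
    tail = np.sum(pulse[tail_start:total_len])
    return tail / total

# Synthetic single-decay pulses (arbitrary units, 1 sample per time step)
t = np.arange(300)
gamma_like = np.exp(-t / 25.0)    # faster effective decay (toy constant)
alpha_like = np.exp(-t / 30.0)    # slightly slower tail (toy constant)

f_gamma = ccm_feature(gamma_like, tail_start=60, total_len=300)
f_alpha = ccm_feature(alpha_like, tail_start=60, total_len=300)
# alpha_like carries a larger tail fraction, so a threshold on the feature
# value (made energy-dependent, per the paper's fitting equation) can
# separate the two event classes
```

In the paper's scheme the cut threshold is not a constant but a function of total charge, which is what the fitted correlation equation provides.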

  20. Quantitative methods for reconstructing tissue biomechanical properties in optical coherence elastography: a comparison study

    PubMed Central

    Han, Zhaolong; Li, Jiasong; Singh, Manmohan; Wu, Chen; Liu, Chih-hao; Wang, Shang; Idugboe, Rita; Raghunathan, Raksha; Sudheendran, Narendran; Aglyamov, Salavat R.; Twa, Michael D.; Larin, Kirill V.

    2015-01-01

    We present a systematic analysis of the accuracy of five different methods for extracting the biomechanical properties of soft samples using optical coherence elastography (OCE). OCE is an emerging noninvasive technique that allows assessing the biomechanical properties of tissues with micrometer spatial resolution. However, in order to accurately extract biomechanical properties from OCE measurements, application of a proper mechanical model is required. In this study, we utilize tissue-mimicking phantoms with controlled elastic properties and investigate the feasibility of the available methods for reconstructing elasticity (Young's modulus) based on OCE measurements of an air-pulse-induced elastic wave. The approaches are based on the shear wave equation (SWE), the surface wave equation (SuWE), the Rayleigh-Lamb frequency equation (RLFE), and the finite element method (FEM). Elasticity values were compared with uniaxial mechanical testing. The results show that the RLFE and the FEM are more robust in quantitatively assessing elasticity than the other, simplified models. This study provides a foundation and reference for reconstructing the biomechanical properties of tissues from OCE data, which is important for the further development of noninvasive elastography methods. PMID:25860076
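    The simplest of the compared models, the shear wave equation, reduces to a one-line formula: for a nearly incompressible elastic medium, Young's modulus is E = 3ρc², with ρ the density and c the measured elastic-wave speed. A sketch of that baseline estimate (the RLFE and FEM approaches the paper favors are considerably more involved):

```python
def young_modulus_swe(rho_kg_m3, wave_speed_m_s):
    """Shear-wave-equation estimate E = 3 * rho * c**2 (Pa), valid for a
    nearly incompressible elastic medium (Poisson ratio ~ 0.5)."""
    return 3.0 * rho_kg_m3 * wave_speed_m_s ** 2

# Example: a soft phantom (rho ~ 1000 kg/m^3) with a 2.0 m/s wave speed
E = young_modulus_swe(1000.0, 2.0)   # 12000 Pa = 12 kPa
```

The comparison in the paper quantifies how far such simplified estimates drift from uniaxial mechanical testing when the sample geometry violates the model's assumptions.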

  1. Radar Based Probabilistic Quantitative Precipitation Estimation: First Results of Large Sample Data Analysis

    NASA Astrophysics Data System (ADS)

    Ciach, G. J.; Krajewski, W. F.; Villarini, G.

    2005-05-01

    Large uncertainties in the operational precipitation estimates produced by the U.S. national network of WSR-88D radars are well-acknowledged. However, quantitative information about these uncertainties is not operationally available. In an effort to fill this gap, the U.S. National Weather Service (NWS) is supporting the development of a probabilistic approach to radar precipitation estimation. The probabilistic quantitative precipitation estimation (PQPE) methodology that was selected for this development is based on empirical modeling of the functional-statistical error structure in the operational WSR-88D precipitation products under different conditions. Our first goal is to deliver a realistic parameterization of the probabilistic error model describing its dependences on the radar-estimated precipitation value, distance from the radar, season, spatiotemporal averaging scale, and the setup of the precipitation processing system (PPS). In the long-term perspective, when large samples of relevant data are available, we will extend the model to include the dependences on different types of precipitation estimates (e.g. polarimetric and multi-sensor), geographic locations, and climatic regimes. At this stage of the PQPE project, we organized a 6-year-long sample of the Level II data from the Oklahoma City radar station (KTLX) and processed it with Build 4 of the PPS that is currently used in NWS operations. This first set of operational products was generated with the standard setup of the PPS parameters. The radar estimates are complemented with the corresponding raingauge data from the Oklahoma Mesonet, the ARS Little Washita Micronet, and the EVAC PicoNet, covering different spatial scales. The raingauge data are used as a ground reference (GR) to estimate the required uncertainty characteristics in the radar precipitation products. In this presentation, we describe the first results of the large-sample uncertainty analysis of the products.

  2. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    PubMed

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with results confirmed by GC-MS. The method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness, and repeatability precision were assessed. The spiking levels for the trueness and precision experiments were 0.1, 0.5, and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method is efficient for the extraction and determination of azoxystrobin residues in green beans and peas. PMID:25842334

  3. Comparison of Concentration Methods for Quantitative Detection of Sewage-Associated Viral Markers in Environmental Waters

    PubMed Central

    Harwood, V. J.; Gyawali, P.; Sidhu, J. P. S.; Toze, S.

    2015-01-01

    Pathogenic human viruses cause over half of gastroenteritis cases associated with recreational water use worldwide. They are relatively difficult to concentrate from environmental waters due to typically low concentrations and their small size. Although rapid enumeration of viruses by quantitative PCR (qPCR) has the potential to greatly improve water quality analysis and risk assessment, the upstream steps of capturing and recovering viruses from environmental water sources, along with removing PCR inhibitors from extracted nucleic acids, remain formidable barriers to routine use. Here, we compared the efficiency of virus recovery for three rapid methods of concentrating two microbial source tracking (MST) viral markers, human adenoviruses (HAdVs) and polyomaviruses (HPyVs), from one-liter tap water and river water samples on HA membranes (90 mm in diameter). Samples were spiked with raw sewage, and viral adsorption to membranes was promoted by acidification (method A) or addition of MgCl2 (methods B and C). Viral nucleic acid was extracted directly from membranes (method A), or viruses were eluted with NaOH and concentrated by centrifugal ultrafiltration (methods B and C). No inhibition of qPCR was observed for samples processed by method A, but inhibition occurred in river samples processed by methods B and C. Recovery efficiencies of HAdVs and HPyVs were ∼10-fold greater for method A (31 to 78%) than for methods B and C (2.4 to 12%). Further analysis of membranes from method B revealed that the majority of viruses were not eluted from the membrane, resulting in poor recovery. Modifying the originally published method A to include a larger-diameter membrane and a nucleic acid extraction kit that could accommodate the membrane resulted in a rapid virus concentration method with good recovery and a lack of inhibitory compounds. The frequently used strategy of viral adsorption with added cations (Mg2+) and elution with acid was inefficient and more prone to

  4. A rapid, sensitive, and selective method for quantitation of lamprey migratory pheromones in river water.

    PubMed

    Stewart, Michael; Baker, Cindy F; Cooney, Terry

    2011-11-01

    The methodology of using fish pheromones, or chemical signatures, as a tool to monitor or manage species of fish is rapidly gaining popularity. Unequivocal detection and accurate quantitation of extremely low concentrations of these chemicals in natural waters is paramount to using this technique as a management tool. Various species of lamprey are known to produce a mixture of three important migratory pheromones, petromyzonol sulfate (PS), petromyzonamine disulfate (PADS), and petromyzosterol disulfate (PSDS), but presently there are no established robust methods for quantitation of all three pheromones. In this study, we report a new, highly sensitive and selective method for the rapid identification and quantitation of these pheromones in river water samples. The procedure is based on pre-concentration, followed by liquid chromatography/tandem mass spectrometry (LC/MS/MS) analysis. The method is fast, with unambiguous pheromone determination. Practical quantitation limits of 0.25 ng/l were achieved for PS and PADS, and 2.5 ng/l for PSDS, in river water using a 200-fold pre-concentration. However, lower quantitation limits can be achieved with greater pre-concentration. The methodology can easily be modified to include other chemicals of interest. Furthermore, the pre-concentration step can easily be applied in the field, circumventing potential stability issues with these chemicals. PMID:22076684

  5. Mixed methods in gerontological research: Do the qualitative and quantitative data “touch”?

    PubMed Central

    Happ, Mary Beth

    2010-01-01

    This paper distinguishes between parallel and integrated mixed methods research approaches. Barriers to integrated mixed methods approaches in gerontological research are discussed and critiqued. The author presents examples of mixed methods gerontological research to illustrate approaches to data integration at the levels of data analysis, interpretation, and research reporting. As a summary of the methodological literature, four basic levels of mixed methods data combination are proposed. Opportunities for mixing qualitative and quantitative data are explored using contemporary examples from published studies. Data transformation and visual display, judiciously applied, are proposed as pathways to fuller mixed methods data integration and analysis. Finally, practical strategies for mixing qualitative and quantitative data types are explicated as gerontological research moves beyond parallel mixed methods approaches to achieve data integration. PMID:20077973

  6. Simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and dead reckoning

    NASA Astrophysics Data System (ADS)

    Davey, Neil S.; Godil, Haris

    2013-05-01

    This article presents a comparative study between a well-known SLAM (Simultaneous Localization and Mapping) algorithm, called Gmapping, and a standard Dead-Reckoning algorithm; the study is based on experimental results of both approaches by using a commercial skid-based turning robot, P3DX. Five main base-case scenarios are conducted to evaluate and test the effectiveness of both algorithms. The results show that SLAM outperformed the Dead Reckoning in terms of map-making accuracy in all scenarios but one, since SLAM did not work well in a rapidly changing environment. Although the main conclusion about the excellence of SLAM is not surprising, the presented test method is valuable to professionals working in this area of mobile robots, as it is highly practical, and provides solid and valuable results. The novelty of this study lies in its simplicity. The simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and Dead Reckoning and some applications using autonomous robots are being patented by the authors in U.S. Patent Application Nos. 13/400,726 and 13/584,862.

  7. Evaluation of the remineralization capacity of CPP-ACP containing fluoride varnish by different quantitative methods

    PubMed Central

    SAVAS, Selcuk; KAVRIK, Fevzi; KUCUKYILMAZ, Ebru

    2016-01-01

    ABSTRACT Objective The aim of this study was to evaluate the efficacy of CPP-ACP containing fluoride varnish for remineralizing white spot lesions (WSLs) with four different quantitative methods. Material and Methods Four windows (3x3 mm) were created on the enamel surfaces of bovine incisor teeth. A control window was covered with nail varnish, and WSLs were created on the other windows (after demineralization, first week and fourth week) in an acidified gel system. The test material (MI Varnish) was applied on the demineralized areas, and the treated enamel samples were stored in artificial saliva. At the fourth week, the enamel surfaces were tested by surface microhardness (SMH), quantitative light-induced fluorescence-digital (QLF-D), energy-dispersive spectroscopy (EDS) and laser fluorescence (LF pen). The data were statistically analyzed (α=0.05). Results While the LF pen measurements showed significant differences at baseline, after demineralization, and after the one-week remineralization period (p<0.05), the difference between the 1- and 4-week measurements was not significant (p>0.05). With regard to the SMH and QLF-D analyses, statistically significant differences were found among all the phases (p<0.05). After the 1- and 4-week treatment periods, the calcium (Ca) and phosphate (P) concentrations and the Ca/P ratio were higher than those of the demineralized surfaces (p<0.05). Conclusion CPP-ACP containing fluoride varnish provides remineralization of WSLs after a single application and seems suitable for clinical use. PMID:27383699

  8. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-01

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed. PMID:26928571

  9. Quantitative comparison of reconstruction methods for intra-voxel fiber recovery from diffusion MRI.

    PubMed

    Daducci, Alessandro; Canales-Rodríguez, Erick Jorge; Descoteaux, Maxime; Garyfallidis, Eleftherios; Gur, Yaniv; Lin, Ying-Chia; Mani, Merry; Merlet, Sylvain; Paquette, Michael; Ramirez-Manzanares, Alonso; Reisert, Marco; Reis Rodrigues, Paulo; Sepehrband, Farshid; Caruyer, Emmanuel; Choupan, Jeiran; Deriche, Rachid; Jacob, Mathews; Menegaz, Gloria; Prčkovska, Vesna; Rivera, Mariano; Wiaux, Yves; Thiran, Jean-Philippe

    2014-02-01

    Validation is arguably the bottleneck in the diffusion magnetic resonance imaging (MRI) community. This paper evaluates and compares 20 algorithms for recovering the local intra-voxel fiber structure from diffusion MRI data and is based on the results of the "HARDI reconstruction challenge" organized in the context of the "ISBI 2012" conference. Evaluated methods encompass a mixture of classical techniques well known in the literature, such as diffusion tensor, Q-Ball, and diffusion spectrum imaging, algorithms inspired by the recent theory of compressed sensing, and also brand new approaches proposed for the first time at this contest. To quantitatively compare the methods under controlled conditions, two datasets with known ground truth were synthetically generated, and two main criteria were used to evaluate the quality of the reconstructions in every voxel: correct assessment of the number of fiber populations and angular accuracy in their orientation. This comparative study investigates the behavior of every algorithm under varying experimental conditions and highlights the strengths and weaknesses of each approach. This information can be useful not only for enhancing current algorithms and developing the next generation of reconstruction methods, but also for assisting physicians in the choice of the most adequate technique for their studies. PMID:24132007

  10. Evaluation of a rapid method for the quantitative estimation of coliforms in meat by impedimetric procedures.

    PubMed Central

    Martins, S B; Selby, M J

    1980-01-01

    A 24-h instrumental procedure is described for the quantitative estimation of coliforms in ground meat. The method is simple and rapid, and it requires but a single sample dilution and four replicates. The data are recorded automatically and can be used to estimate coliforms in the range of 100 to 10,000 organisms per g. The procedure is an impedance detection time (IDT) method using a new medium, tested against 131 stock cultures, which markedly enhances the impedance response of gram-negative organisms and is selective for coliforms. Seventy samples of ground beef were analyzed for coliforms by the IDT method and the conventional three-dilution, two-step most-probable-number test tube procedure. Seventy-nine percent of the impedimetric estimates fell within the 95% confidence limits of the most-probable-number values. This corresponds to the criteria used to evaluate other coliform tests, with the added advantage of a single dilution and more rapid results. PMID:6992712

  11. Quantitative determination of aflatoxin B1 concentration in acetonitrile by chemometric methods using terahertz spectroscopy.

    PubMed

    Ge, Hongyi; Jiang, Yuying; Lian, Feiyu; Zhang, Yuan; Xia, Shanhong

    2016-10-15

    Aflatoxins contaminate and colonize agricultural products, such as grain, and thereby potentially cause human liver carcinoma. Detection via conventional methods has proven to be time-consuming and complex. In this paper, the terahertz (THz) spectra of aflatoxin B1 in acetonitrile solutions with concentration ranges of 1-50 μg/ml and 1-50 μg/l are obtained and analyzed for the frequency range of 0.4-1.6 THz. Linear and nonlinear regression models are constructed to relate the absorption spectra and the concentrations of 160 samples using the partial least squares (PLS), principal component regression (PCR), support vector machine (SVM), and PCA-SVM methods. Our results indicate that PLS and PCR models are more accurate for the concentration range of 1-50 μg/ml, whereas SVM and PCA-SVM are more accurate for the concentration range of 1-50 μg/l. Furthermore, ten unknown concentration samples extracted from mildewed maize are analyzed quantitatively using these methods. PMID:27173565
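    Of the chemometric models compared, principal component regression is the easiest to sketch from scratch: project the (mean-centered) spectra onto their leading principal components, then fit ordinary least squares on the scores. Below is a self-contained toy version on synthetic "absorption spectra"; the spectral shape, noise level, and choice of 3 components are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "absorption spectra": 160 samples x 120 frequency points,
# absorbance scaling linearly with concentration plus noise.
n_samples, n_freq = 160, 120
conc = rng.uniform(1, 50, n_samples)                 # toy concentrations
freqs = np.linspace(0.4, 1.6, n_freq)                # THz axis
basis = np.exp(-((freqs - 1.0) ** 2) / 0.02)         # one absorption band
X = np.outer(conc, basis) + rng.normal(0, 0.05, (n_samples, n_freq))

# Principal component regression: project onto leading PCs, then fit OLS.
x_mean = X.mean(axis=0)
Xc = X - x_mean
yc = conc - conc.mean()
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T                               # keep 3 components
beta, *_ = np.linalg.lstsq(scores, yc, rcond=None)

def predict(x_new):
    """Predict concentration(s) from new spectra."""
    return (x_new - x_mean) @ Vt[:3].T @ beta + conc.mean()

pred = predict(X)
rmse = np.sqrt(np.mean((pred - conc) ** 2))
```

PLS differs in that the components are chosen to maximize covariance with the response rather than variance of the spectra alone, which is why it often needs fewer components on real data.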

  12. Quantitative evaluation of linear and nonlinear methods characterizing interdependencies between brain signals

    PubMed Central

    Ansari-Asl, Karim; Senhadji, Lotfi; Bellanger, Jean-Jacques; Wendling, Fabrice

    2006-01-01

    Brain functional connectivity can be characterized by the temporal evolution of correlation between signals recorded from spatially distributed regions. It is aimed at explaining how different brain areas interact within networks involved during normal (as in cognitive tasks) or pathological (as in epilepsy) situations. Numerous techniques have been introduced for assessing this connectivity. Recently, some efforts were made to compare the performance of these methods, but mainly qualitatively and for specific applications. In this paper, we go further and propose a comprehensive comparison of different classes of methods (linear and nonlinear regressions, phase synchronization (PS), and generalized synchronization (GS)) based on various simulation models. For this purpose, quantitative criteria are used: in addition to the mean square error (MSE) under the null hypothesis (independence between two signals) and the mean variance (MV) computed over all values of coupling degree in each model, we introduce a new criterion for comparing performance. Results show that the performance of the compared methods depends strongly on the hypothesis regarding the underlying model for the generation of the signals. Moreover, none of them outperforms the others in all cases, and the performance hierarchy is model-dependent. PMID:17025676

  13. A novel volumetric method for quantitation of titanium dioxide in cosmetics.

    PubMed

    Kim, Young So; Kim, Boo-Min; Park, Sang-Chul; Jeong, Hye-Jin; Chang, Ih Seop

    2006-01-01

    Nowadays there are many sun-protection cosmetics incorporating organic or inorganic UV filters as active ingredients. Chemically stable inorganic sunscreen agents, usually metal oxides, are widely employed in high-SPF (sun protection factor) products. Titanium dioxide is one of the most frequently used inorganic UV filters and has long served as a cosmetic pigment. With the development of micronization techniques, it has become possible to incorporate titanium dioxide in sunscreen formulations without the whitening effect seen previously, and hence its use in cosmetics has become an important research topic. However, there are very few works related to quantitation of titanium dioxide in sunscreen products. In this research, we analyzed the amounts of titanium dioxide in sunscreen cosmetics by redox titration: reduction of Ti(IV) to Ti(III) followed by reoxidation to Ti(IV). After calcination of the organic ingredients of the cosmetics, the titanium dioxide is dissolved in hot sulfuric acid. The dissolved Ti(IV) is reduced to Ti(III) by adding metallic aluminum. The reduced Ti(III) is titrated against a standard oxidizing agent, Fe(III) (ammonium iron(III) sulfate), with potassium thiocyanate as an indicator. In order to test the accuracy and applicability of the proposed method, we analyzed the amounts of titanium dioxide in four types of sunscreen cosmetics, namely cream, make-up base, foundation, and powder, after adding known amounts of titanium dioxide (approximately 1-25 w/w%). The recoveries of titanium dioxide in the four types of formulations were between 96% and 105%. We also analyzed seven commercial cosmetic products labeled with titanium dioxide as an ingredient and compared the results with those obtained from ICP-AES (inductively coupled plasma-atomic emission spectrometry), one of the most powerful atomic analysis techniques. The results showed that the titrated amounts were well in accord with the analyzed...
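The titration back-calculation described in this abstract reduces to simple 1:1 stoichiometry (Ti(III) + Fe(III) -> Ti(IV) + Fe(II)). The sketch below illustrates that arithmetic with hypothetical endpoint values; it is not the authors' procedure or data.

```python
# Hypothetical worked example of the redox-titration back-calculation:
# the Ti(III)/Fe(III) reaction is 1:1, so moles of TiO2 in the sample
# equal moles of Fe(III) titrant consumed at the endpoint.

M_TIO2 = 79.866  # g/mol, molar mass of TiO2

def tio2_content_wt_pct(titrant_molarity, titrant_volume_ml, sample_mass_g):
    """Return TiO2 content in w/w% from the titration endpoint."""
    moles_fe3 = titrant_molarity * titrant_volume_ml / 1000.0  # mol Fe(III)
    mass_tio2 = moles_fe3 * M_TIO2                             # g (1:1 stoichiometry)
    return 100.0 * mass_tio2 / sample_mass_g

# e.g. 12.52 mL of 0.1000 M Fe(III) consumed for a 2.000 g sample
pct = tio2_content_wt_pct(0.1000, 12.52, 2.000)
print(round(pct, 2))  # -> 5.0
```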

  14. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and the limit of detection. We introduce a new framework to compare the performance ...

  15. The Use of Quantitative Methods as an Aid to Decision Making in Educational Administration.

    ERIC Educational Resources Information Center

    Alkin, Marvin C.

    Three quantitative methods are outlined, with suggestions for application to particular problem areas of educational administration: (1) The Leontief input-output analysis, incorporating a "transaction table" for displaying relationships between economic outputs and inputs, mainly applicable to budget analysis and planning; (2) linear programing,…

  16. A simple method for quantitative diagnosis of small hive beetles, Aethina tumida, in the field

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Here we present a simple and fast method for quantitative diagnosis of small hive beetles (= SHB) in honeybee field colonies using corrugated plastic “diagnostic-strips”. In Australia, we evaluated its efficacy by comparing the number of lured SHB with the total number of beetles in the hives. The d...

  17. Examination of Quantitative Methods Used in Early Intervention Research: Linkages with Recommended Practices.

    ERIC Educational Resources Information Center

    Snyder, Patricia; Thompson, Bruce; McLean, Mary E.; Smith, Barbara J.

    2002-01-01

    Findings are reported related to the research methods and statistical techniques used in 450 group quantitative studies examined by the Council for Exceptional Children's Division for Early Childhood Recommended Practices Project. Studies were analyzed across seven dimensions including sampling procedures, variable selection, variable definition,…

  18. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  19. Qualitative and Quantitative Research Methods: Old Wine in New Bottles? On Understanding and Interpreting Educational Phenomena

    ERIC Educational Resources Information Center

    Smeyers, Paul

    2008-01-01

    Generally educational research is grounded in the empirical traditions of the social sciences (commonly called quantitative and qualitative methods) and is as such distinguished from other forms of scholarship such as theoretical, conceptual or methodological essays, critiques of research traditions and practices and those studies grounded in the…

  20. Improved GC/MS method for quantitation of n-Alkanes in plant and fecal material

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A gas chromatography-mass spectrometry (GC/MS) method for the quantitation of n-alkanes (carbon backbones ranging from 21 to 36 carbon atoms) in forage and fecal samples has been developed. Automated solid-liquid extraction using elevated temperature and pressure minimized extraction time to 30 min...

  1. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  2. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  3. Paradigms Lost and Pragmatism Regained: Methodological Implications of Combining Qualitative and Quantitative Methods

    ERIC Educational Resources Information Center

    Morgan, David L.

    2007-01-01

    This article examines several methodological issues associated with combining qualitative and quantitative methods by comparing the increasing interest in this topic with the earlier renewal of interest in qualitative research during the 1980s. The first section argues for the value of Kuhn's concept of paradigm shifts as a tool for examining…

  4. Quantitative measurement of acoustic pressure in the focal zone of acoustic lens-line focusing using the Schlieren method.

    PubMed

    Jiang, Xueping; Cheng, Qian; Xu, Zheng; Qian, Menglu; Han, Qingbang

    2016-04-01

    This paper proposes a theory and method for quantitative measurement of the focal spot size and acoustic pressure of the acoustic lens-line focusing ultrasonic (ALLFU) field using the Schlieren imaging technique. Using Fourier transformation, the relationship between the brightness of the Schlieren image and the acoustic pressure was derived. The ALLFU field was simulated using the finite element method and compared with the Schlieren acoustic field image. The focal spot size was measured using the Schlieren method. The acoustic pressure in the focal zone of the ALLFU field and the transducer-transmitting voltage response were quantitatively determined by measuring the diffraction light fringe intensity. The results show that the brightness of the Schlieren image is a linear function of the acoustic intensity when the acousto-optic interaction length remains constant and the acoustic field is weak. PMID:27139646
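The weak-field linearity reported above implies a simple calibration: brightness B is proportional to acoustic intensity I, and pressure scales as the square root of I, so p = k * sqrt(B) once k is fixed against a reference. The numbers below are illustrative, not the paper's measurements.

```python
import numpy as np

# Sketch of the calibration implied by the abstract: in the weak-field
# regime Schlieren brightness is linear in acoustic intensity, and
# pressure goes as sqrt(intensity), so p = k * sqrt(brightness).
# All values here are made up for illustration.

brightness = np.array([4.0, 16.0, 36.0])     # image brightness (a.u.)
pressure_ref = np.array([2.0, 4.0, 6.0])     # reference pressures (kPa)

k = np.mean(pressure_ref / np.sqrt(brightness))   # calibration constant
p_unknown = k * np.sqrt(25.0)                     # pressure at brightness 25
print(p_unknown)  # -> 5.0
```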

  5. Analysis of quantitative methods for rib seriation using the Spitalfields documented skeletal collection.

    PubMed

    Owers, Sonya K; Pastor, Robert F

    2005-06-01

    Accurate rib seriation is essential in forensic anthropology and bioarchaeology for determination of minimum numbers of individuals, sequencing trauma patterns to the chest, and identification of central ribs for use in age estimation. We investigate quantitative methods for rib seriation based on three metric variables: superior (anterior) costo-transverse crest height (SCTCH), articular facet of the tubercle-to-angle length (AFTAL), and head-to-articular facet length (HAFL). The sample consists of complete but unseriated sets of ribs from 133 individuals from the documented (known age and sex) and undocumented skeletal collections of Christ Church Spitalfields, London. This research confirms the results of an earlier study (Hoppa and Saunders [1998] J. Forensic. Sci. 43:174-177) and extends it with the application of two new metric traits and further analyses of sex differences. Analyses of variance showed that SCTCH and AFTAL are significantly associated (P < 0.001) with rib number. Tukey tests of pairwise rib comparisons revealed that for two dimensions (SCTCH and AFTAL), the central ribs (3rd-6th) are significantly distinct from each other (P < 0.05). Using simple ranking of either the SCTCH or AFTAL traits, the proportion of correctly identified ribs within +/-1 position was 80%, compared to initial seriation using morphological methods (Dudar [1993] J. Forensic. Sci. 28:788-797; Mann [1993] J. Forensic. Sci. 28:151-155). Significant sex dimorphism was also identified for these two traits. Analysis of the HAFL trait produced somewhat equivocal results, suggesting that this variable is not reliable for rib seriation. The variable SCTCH proves to be the most useful dimension for seriation, and shows that all but the 7th-9th ribs can be distinguished from others in the sequence, with important results for the 4th rib, where ranking allowed identification in 86% of cases, consistent with morphological methods for intact ribs. PMID:15503341
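The "simple ranking" approach mentioned in this abstract can be sketched in a few lines: order ribs by a single metric trait and score accuracy as the proportion placed within one position of the truth. This is an illustration with invented SCTCH values, not the study's data or code.

```python
# A minimal sketch (not the authors' code) of seriation by simple ranking:
# ribs are ordered by one metric trait (e.g. SCTCH) and accuracy is the
# proportion of ribs assigned within +/-1 of their true position.

def seriate_by_rank(measurements):
    """Assign rib positions 1..n by ascending trait value."""
    order = sorted(range(len(measurements)), key=lambda i: measurements[i])
    positions = [0] * len(measurements)
    for rank, idx in enumerate(order, start=1):
        positions[idx] = rank
    return positions

def within_one(predicted, true_positions):
    """Proportion of ribs placed within +/-1 of their true position."""
    hits = sum(abs(p - t) <= 1 for p, t in zip(predicted, true_positions))
    return hits / len(predicted)

# Hypothetical SCTCH values for five ribs whose true order is 1..5
sctch = [4.1, 5.0, 4.8, 6.2, 6.0]
pred = seriate_by_rank(sctch)
print(pred, within_one(pred, [1, 2, 3, 4, 5]))  # -> [1, 3, 2, 5, 4] 1.0
```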

  6. Reconciling incongruous qualitative and quantitative findings in mixed methods research: exemplars from research with drug using populations

    PubMed Central

    Wagner, Karla D.; Davidson, Peter J.; Pollini, Robin A.; Strathdee, Steffanie A.; Washburn, Rachel; Palinkas, Lawrence A.

    2011-01-01

    Mixed methods research is increasingly being promoted in the health sciences as a way to gain more comprehensive understandings of how social processes and individual behaviours shape human health. Mixed methods research most commonly combines qualitative and quantitative data collection and analysis strategies. Often, integrating findings from multiple methods is assumed to confirm or validate the findings from one method with the findings from another, seeking convergence or agreement between methods. Cases in which findings from different methods are congruous are generally thought of as ideal, while conflicting findings may, at first glance, appear problematic. However, the latter situation provides the opportunity for a process through which apparently discordant results are reconciled, potentially leading to new emergent understandings of complex social phenomena. This paper presents three case studies drawn from the authors’ research on HIV risk among injection drug users in which mixed methods studies yielded apparently discrepant results. We use these case studies (involving injection drug users [IDUs] using a needle/syringe exchange program in Los Angeles, California, USA; IDUs seeking to purchase needle/syringes at pharmacies in Tijuana, Mexico; and young street-based IDUs in San Francisco, CA, USA) to identify challenges associated with integrating findings from mixed methods projects, summarize lessons learned, and make recommendations for how to more successfully anticipate and manage the integration of findings. Despite the challenges inherent in reconciling apparently conflicting findings from qualitative and quantitative approaches, in keeping with others who have argued in favour of integrating mixed methods findings, we contend that such an undertaking has the potential to yield benefits that emerge only through the struggle to reconcile discrepant results and may provide a sum that is greater than the individual qualitative and quantitative...

  7. Linking multidimensional functional diversity to quantitative methods: a graphical hypothesis-evaluation framework.

    PubMed

    Boersma, Kate S; Dee, Laura E; Miller, Steve J; Bogan, Michael T; Lytle, David A; Gitelman, Alix I

    2016-03-01

    Functional trait analysis is an appealing approach to study differences among biological communities because traits determine species' responses to the environment and their impacts on ecosystem functioning. Despite a rapidly expanding quantitative literature, it remains challenging to conceptualize concurrent changes in multiple trait dimensions ("trait space") and select quantitative functional diversity methods to test hypotheses prior to analysis. To address this need, we present a widely applicable framework for visualizing ecological phenomena in trait space to guide the selection, application, and interpretation of quantitative functional diversity methods. We describe five hypotheses that represent general patterns of responses to disturbance in functional community ecology and then apply a formal decision process to determine appropriate quantitative methods to test ecological hypotheses. As a part of this process, we devise a new statistical approach to test for functional turnover among communities. Our combination of hypotheses and metrics can be applied broadly to address ecological questions across a range of systems and study designs. We illustrate the framework with a case study of disturbance in freshwater communities. This hypothesis-driven approach will increase the rigor and transparency of applied functional trait studies. PMID:27197386

  8. Nanoparticle-mediated photothermal effect enables a new method for quantitative biochemical analysis using a thermometer

    NASA Astrophysics Data System (ADS)

    Fu, Guanglei; Sanjay, Sharma T.; Dou, Maowei; Li, Xiujun

    2016-03-01

    A new biomolecular quantitation method, nanoparticle-mediated photothermal bioassay, using a common thermometer as the signal reader was developed. Using an immunoassay as a proof of concept, iron oxide nanoparticles (NPs) captured in the sandwich-type assay system were transformed into a near-infrared (NIR) laser-driven photothermal agent, Prussian blue (PB) NPs, which acted as a photothermal probe to convert the assay signal into heat through the photothermal effect, thus allowing sensitive biomolecular quantitation using a thermometer. This is the first report of biomolecular quantitation using a thermometer and also serves as the first attempt to introduce the nanoparticle-mediated photothermal effect for bioassays. Electronic supplementary information (ESI) available: Additional information on FTIR characterization (Fig. S1), photothermal immunoassay of PSA in human serum samples (Table S1), and the Experimental section, including preparation of antibody-conjugated iron oxide NPs, sandwich-type immunoassay, characterization, and photothermal detection protocol. See DOI: 10.1039/c5nr09051b

  9. A validated LC-MS-MS method for simultaneous identification and quantitation of rodenticides in blood.

    PubMed

    Bidny, Sergei; Gago, Kim; David, Mark; Duong, Thanh; Albertyn, Desdemona; Gunja, Naren

    2015-04-01

    A rapid, highly sensitive and specific analytical method for the extraction, identification and quantification of nine rodenticides from whole blood has been developed and validated. Commercially available rodenticides in Australia include coumatetralyl, warfarin, brodifacoum, bromadiolone, difenacoum, flocoumafen, difethialone, diphacinone and chlorophacinone. A Waters ACQUITY UPLC TQD system operating in multiple reaction monitoring mode was used to conduct the analysis. Two different ionization techniques, ES+ and ES-, were examined to achieve optimal sensitivity and selectivity, resulting in detection by MS-MS using electrospray ionization in positive mode for difenacoum and brodifacoum and in negative mode for all other analytes. All analytes were extracted from 200 µL of whole blood with ethyl acetate and separated on a Waters ACQUITY UPLC BEH-C18 column using gradient elution. Ammonium acetate (10 mM, pH 7.5) and methanol were used as mobile phases with a total run time of 8 min. Recoveries were between 70 and 105% with limits of detection ranging from 0.5 to 1 ng/mL. The limit of quantitation was 2 ng/mL for all analytes. Calibration curves were linear within the range 2-200 ng/mL for all analytes with the coefficient of determination ≥0.98. The application of the proposed method using liquid-liquid extraction in a series of clinical investigations and forensic toxicological analyses was successful. PMID:25595137
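The linear-calibration step this abstract describes (linearity over 2-200 ng/mL, coefficient of determination >= 0.98) can be sketched as a straight-line fit that is then inverted for an unknown. The standards and peak areas below are invented for illustration, not the paper's data.

```python
import numpy as np

# Sketch of the calibration-curve step: fit response vs concentration,
# check the coefficient of determination against the quoted acceptance
# criterion, then invert the line for an unknown sample. Synthetic data.

conc = np.array([2, 10, 50, 100, 200], dtype=float)    # ng/mL standards
resp = np.array([41, 205, 1010, 2050, 4020], float)    # peak areas (made up)

slope, intercept = np.polyfit(conc, resp, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((resp - pred) ** 2) / np.sum((resp - resp.mean()) ** 2)
assert r2 >= 0.98  # acceptance criterion quoted in the abstract

unknown_area = 820.0
unknown_conc = (unknown_area - intercept) / slope      # back-calculated ng/mL
print(round(r2, 4), round(unknown_conc, 1))
```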

  10. Laser flare photometry: a noninvasive, objective, and quantitative method to measure intraocular inflammation.

    PubMed

    Tugal-Tutkun, Ilknur; Herbort, Carl P

    2010-10-01

    Aqueous flare and cells are the two inflammatory parameters of anterior chamber inflammation resulting from disruption of the blood-ocular barriers. When examined with the slit lamp, measurement of intraocular inflammation remains subjective with considerable intra- and interobserver variations. Laser flare cell photometry is an objective quantitative method that enables accurate measurement of these parameters with very high reproducibility. Laser flare photometry allows detection of subclinical alterations in the blood-ocular barriers, identifying subtle pathological changes that could not have been recorded otherwise. With the use of this method, it has been possible to compare the effect of different surgical techniques, surgical adjuncts, and anti-inflammatory medications on intraocular inflammation. Clinical studies of uveitis patients have shown that flare measurements by laser flare photometry allowed precise monitoring of well-defined uveitic entities and prediction of disease relapse. Relationships of laser flare photometry values with complications of uveitis and visual loss further indicate that flare measurement by laser flare photometry should be included in the routine follow-up of patients with uveitis. PMID:19430730

  11. An Evaluation of Quantitative Methods of Determining the Degree of Melting Experienced by a Chondrule

    NASA Technical Reports Server (NTRS)

    Nettles, J. W.; Lofgren, G. E.; Carlson, W. D.; McSween, H. Y., Jr.

    2004-01-01

    Many workers have considered the degree to which partial melting occurred in chondrules they have studied, and this has led to attempts to find reliable methods of determining the degree of melting. At least two quantitative methods have been used in the literature: a convolution index (CVI), which is a ratio of the perimeter of the chondrule as seen in thin section divided by the perimeter of a circle with the same area as the chondrule, and nominal grain size (NGS), which is the inverse square root of the number density of olivines and pyroxenes in a chondrule (again, as seen in thin section). We have evaluated both nominal grain size and convolution index as melting indicators. Nominal grain size was measured on the results of a set of dynamic crystallization experiments previously described, where aliquots of LEW97008 (L3.4) were heated to peak temperatures of 1250, 1350, 1370, and 1450 °C, representing varying degrees of partial melting of the starting material. Nominal grain size numbers should correlate with peak temperature (and therefore degree of partial melting) if it is a good melting indicator. The convolution index is not directly testable with these experiments because the experiments do not actually create chondrules (and therefore they have no outline on which to measure a CVI). Thus we had no means to directly test how well the CVI predicted different degrees of melting. Therefore, we discuss the use of the CVI measurement and support the discussion with X-ray Computed Tomography (CT) data.

  12. Deep neural nets as a method for quantitative structure-activity relationships.

    PubMed

    Ma, Junshui; Sheridan, Robert P; Liaw, Andy; Dahl, George E; Svetnik, Vladimir

    2015-02-23

    Neural networks were widely used for quantitative structure-activity relationships (QSAR) in the 1990s. Because of various practical issues (e.g., slow on large problems, difficult to train, prone to overfitting, etc.), they were superseded by more robust methods like support vector machine (SVM) and random forest (RF), which arose in the early 2000s. The last 10 years have witnessed a revival of neural networks in the machine learning community thanks to new methods for preventing overfitting, more efficient training algorithms, and advancements in computer hardware. In particular, deep neural nets (DNNs), i.e. neural nets with more than one hidden layer, have found great successes in many applications, such as computer vision and natural language processing. Here we show that DNNs can routinely make better prospective predictions than RF on a set of large diverse QSAR data sets that are taken from Merck's drug discovery effort. The number of adjustable parameters needed for DNNs is fairly large, but our results show that it is not necessary to optimize them for individual data sets, and a single set of recommended parameters can achieve better performance than RF for most of the data sets we studied. The usefulness of the parameters is demonstrated on additional data sets not used in the calibration. Although training DNNs is still computationally intensive, using graphical processing units (GPUs) can make this issue manageable. PMID:25635324

  13. Laser-induced Breakdown spectroscopy quantitative analysis method via adaptive analytical line selection and relevance vector machine regression model

    NASA Astrophysics Data System (ADS)

    Yang, Jianhong; Yi, Cancan; Xu, Jinwu; Ma, Xianghong

    2015-05-01

    A new LIBS quantitative analysis method based on adaptive analytical line selection and a Relevance Vector Machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward in order to overcome the drawback of high dependency on a priori knowledge. The candidate analytical lines are automatically selected based on the built-in characteristics of spectral lines, such as spectral intensity, wavelength and width at half height. The analytical lines that will be used as input variables of the regression model are determined adaptively according to the samples for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of a confidence interval of a probabilistic distribution, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples have been carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modeling robustness compared with methods based on partial least squares regression, artificial neural network and standard support vector machine.
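The RVM regression step described here yields a predictive distribution, not just a point estimate. As a runnable stand-in (the RVM itself is not in common libraries), the sketch below uses scikit-learn's ARDRegression, a closely related sparse Bayesian regressor, on synthetic line intensities; everything shown is illustrative, not the paper's model or data.

```python
import numpy as np
from sklearn.linear_model import ARDRegression

# Stand-in for the paper's RVM: ARDRegression (sparse Bayesian linear
# regression) trained on synthetic analytical-line intensities, reading
# out the predictive standard deviation that yields a confidence interval.

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(40, 3))          # intensities of 3 candidate lines
true_conc = X @ np.array([2.0, 0.5, 0.0])    # third line carries no signal
y = true_conc + rng.normal(0, 0.02, size=40) # concentrations with noise

model = ARDRegression().fit(X, y)
mean, std = model.predict(X[:5], return_std=True)
for m, s in zip(mean, std):
    print(f"{m:.3f} +/- {1.96 * s:.3f}")     # ~95% predictive interval
```

ARD pruning should drive the irrelevant third line's weight toward zero, mirroring how the RVM keeps only informative inputs.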

  14. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping the ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called the "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). The bone suppression technique, based on a massive-training artificial neural network (MTANN), was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, forming a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: limited rib movements appeared as reduced velocity vectors and left-right asymmetric distributions on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
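The velocity-vector measurement in local areas can be sketched as basic block matching between consecutive frames: for a block in frame 1, search nearby offsets in frame 2 and keep the shift with the smallest squared difference. This is a generic illustration on a synthetic pattern, not the study's algorithm.

```python
import numpy as np

# Minimal block-matching sketch of a local velocity-vector estimate:
# the best (dy, dx) offset minimizes the sum of squared differences
# between a block in frame 1 and a shifted block in frame 2.

def match_block(f1, f2, top, left, size=8, search=4):
    block = f1[top:top + size, left:left + size]
    best, best_err = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = f2[top + dy:top + dy + size, left + dx:left + dx + size]
            err = np.sum((cand - block) ** 2)
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best  # (dy, dx) in pixels per frame

# Synthetic test: a bright bar moves down by 2 pixels between frames
frame1 = np.zeros((32, 32)); frame1[10:12, 12:20] = 1.0
frame2 = np.zeros((32, 32)); frame2[12:14, 12:20] = 1.0
print(match_block(frame1, frame2, top=8, left=10))  # -> (2, 0)
```

Repeating this over a grid of blocks produces the kind of velocity map the abstract describes.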

  15. Terahertz absorbance spectrum fitting method for quantitative detection of concealed contraband

    NASA Astrophysics Data System (ADS)

    Wang, Yingxin; Zhao, Ziran; Chen, Zhiqiang; Kang, Kejun; Feng, Bing; Zhang, Yan

    2007-12-01

    We present a quantitative method for the nondestructive detection of concealed contraband based on terahertz transmission spectroscopy. Without knowing the prior information of barrier materials, the amount of concealed contraband can be extracted by approximating the terahertz absorbance spectrum of the barrier material with a low-order polynomial and then fitting the measured absorbance spectrum of the inspected object with the polynomial and the known standard spectrum of this kind of contraband. We verify the validity of this method using a sample of explosive 1,3,5-trinitro-s-triazine (RDX) covered with several different barrier materials which are commonly encountered in actual inspection, and good agreement between the calculated and actual value of the amount of RDX is obtained for the experiments performed under both nitrogen and air atmospheres. This indicates that the presented method can achieve quantitative detection of hidden contraband, which is important for security inspection applications.
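The fitting idea above is linear in its unknowns: model the measured absorbance as a low-order polynomial (the barrier) plus a scale factor times the known standard spectrum, and solve by least squares. The sketch below uses synthetic stand-in spectra, not the paper's measurements.

```python
import numpy as np

# Sketch of the absorbance-spectrum fitting: measured = polynomial
# baseline (unknown barrier) + amount * standard spectrum (known
# contraband signature), solved by linear least squares. Synthetic data.

freq = np.linspace(0.2, 2.0, 200)                      # frequency, THz
standard = np.exp(-0.5 * ((freq - 0.82) / 0.05) ** 2)  # RDX-like line near 0.82 THz

amount_true = 0.37
barrier = 0.3 + 0.15 * freq + 0.05 * freq ** 2         # smooth barrier absorbance
measured = barrier + amount_true * standard

# Design matrix: polynomial terms for the barrier + the standard spectrum
A = np.column_stack([np.ones_like(freq), freq, freq ** 2, standard])
coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
amount_fit = coeffs[3]                                 # recovered amount
print(round(amount_fit, 3))  # -> 0.37
```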

  16. Quantitative Evaluation of Peptide-Material Interactions by a Force Mapping Method: Guidelines for Surface Modification.

    PubMed

    Mochizuki, Masahito; Oguchi, Masahiro; Kim, Seong-Oh; Jackman, Joshua A; Ogawa, Tetsu; Lkhamsuren, Ganchimeg; Cho, Nam-Joon; Hayashi, Tomohiro

    2015-07-28

    Peptide coatings on material surfaces have demonstrated wide application across materials science and biotechnology, facilitating the development of nanobio interfaces through surface modification. A guiding motivation in the field is to engineer peptides with a high and selective binding affinity to target materials. Herein, we introduce a quantitative force mapping method in order to evaluate the binding affinity of peptides to various hydrophilic oxide materials by atomic force microscopy (AFM). Statistical analysis of adhesion forces and probabilities obtained on substrates with a materials contrast enabled us to simultaneously compare the peptide binding affinity to different materials. On the basis of the experimental results and corresponding theoretical analysis, we discuss the role of various interfacial forces in modulating the strength of peptide attachment to hydrophilic oxide solid supports as well as to gold. The results emphasize the precision and robustness of our approach to evaluating the adhesion strength of peptides to solid supports, thereby offering guidelines to improve the design and fabrication of peptide-coated materials. PMID:26125092

  17. Pleistocene Lake Bonneville and Eberswalde Crater of Mars: Quantitative Methods for Recognizing Poorly Developed Lacustrine Shorelines

    NASA Astrophysics Data System (ADS)

    Jewell, P. W.

    2014-12-01

    The ability to quantify shoreline features on Earth has been aided by advances in acquisition of high-resolution topography through laser imaging and photogrammetry. Well-defined and well-documented features such as the Bonneville, Provo, and Stansbury shorelines of Late Pleistocene Lake Bonneville are recognizable to the untrained eye and easily mappable on aerial photos. The continuity and correlation of lesser shorelines must rely on quantitative algorithms for processing high-resolution data in order to gain widespread scientific acceptance. Using Savitzky-Golay filters and the geomorphic methods and criteria described by Hare et al. [2001], minor, transgressive, erosional shorelines of Lake Bonneville have been identified and correlated across the basin with varying degrees of statistical confidence. Results solve one of the key paradoxes of Lake Bonneville first described by G. K. Gilbert in the late 19th century and point the way toward understanding climatically driven oscillations of the Last Glacial Maximum in the Great Basin of the United States. Similar techniques have been applied to the Eberswalde Crater area of Mars using HiRISE DEMs (1 m horizontal resolution) where a paleolake is hypothesized to have existed. Results illustrate the challenges of identifying shorelines where long-term aeolian processes have degraded the shorelines and field validation is not possible. The work illustrates the promises and challenges of identifying remnants of a global ocean elsewhere on the red planet.
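The Savitzky-Golay filtering mentioned above smooths a noisy elevation profile while preserving bench-like breaks in slope; with deriv=1 the filter returns the slope directly, so shoreline benches stand out as slope anomalies. The profile below is synthetic, purely to illustrate the filter call.

```python
import numpy as np
from scipy.signal import savgol_filter

# Sketch of the smoothing step: Savitzky-Golay with deriv=1 turns a noisy
# elevation profile directly into a slope profile; a bench-like step in
# the terrain shows up as a local slope anomaly. Synthetic profile.

x = np.linspace(0, 100, 501)                        # distance along transect, m
profile = 0.2 * x + 1.5 * np.tanh((x - 50) / 3)     # ramp with a step near x=50
rng = np.random.default_rng(1)
noisy = profile + rng.normal(0, 0.05, x.size)       # add measurement noise

dx = x[1] - x[0]
slope = savgol_filter(noisy, window_length=31, polyorder=3,
                      deriv=1, delta=dx)            # smoothed first derivative
print(f"slope far from step ~ {slope[50]:.2f}, at step ~ {slope[250]:.2f}")
```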

  18. Quantitative proteomics: assessing the spectrum of in-gel protein detection methods

    PubMed Central

    Gauci, Victoria J.; Wright, Elise P.

    2010-01-01

    Proteomics research relies heavily on visualization methods for detection of proteins separated by polyacrylamide gel electrophoresis. Commonly used staining approaches involve colorimetric dyes such as Coomassie Brilliant Blue, fluorescent dyes including Sypro Ruby, newly developed reactive fluorophores, as well as a plethora of others. The most desired characteristic in selecting one stain over another is sensitivity, but this is far from the only important parameter. This review evaluates protein detection methods in terms of their quantitative attributes, including limit of detection (i.e., sensitivity), linear dynamic range, inter-protein variability, capacity for spot detection after 2D gel electrophoresis, and compatibility with subsequent mass spectrometric analyses. Unfortunately, many of these quantitative criteria are not routinely or consistently addressed by most of the studies published to date. We would urge more rigorous routine characterization of stains and detection methodologies as a critical approach to systematically improving these critically important tools for quantitative proteomics. In addition, substantial improvements in detection technology, particularly over the last decade or so, emphasize the need to consider renewed characterization of existing stains; the quantitative stains we need, or at least the chemistries required for their future development, may well already exist. PMID:21686332
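Two of the quantitative criteria this review calls for (limit of detection and linear dynamic range) reduce to simple calculations from a calibration series and blank noise. The numbers below are invented to show the arithmetic, not data from any stain study.

```python
import numpy as np

# Illustrative calculation of two stain-evaluation criteria discussed
# above: limit of detection via the common 3.3*sigma/slope rule, and
# linearity (R^2) over the calibration range. Synthetic numbers.

blank_sd = 12.0          # std. dev. of blank-lane readings (a.u.)
amounts = np.array([5, 10, 50, 100, 500, 1000], float)   # ng protein per band
signal = np.array([160, 310, 1540, 3080, 15300, 30600], float)

slope, intercept = np.polyfit(amounts, signal, 1)
lod = 3.3 * blank_sd / slope                 # limit of detection, ng
pred = slope * amounts + intercept
r2 = 1 - np.sum((signal - pred) ** 2) / np.sum((signal - signal.mean()) ** 2)
print(f"LOD ~ {lod:.2f} ng, linearity R^2 = {r2:.4f}")
```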

  19. Quantitative Analysis of Carbon Steel with Multi-Line Internal Standard Calibration Method Using Laser-Induced Breakdown Spectroscopy.

    PubMed

    Pan, Congyuan; Du, Xuewei; An, Ning; Zeng, Qiang; Wang, Shengbo; Wang, Qiuping

    2016-04-01

    A multi-line internal standard calibration method is proposed for the quantitative analysis of carbon steel using laser-induced breakdown spectroscopy (LIBS). A procedure based on the method was adopted to select the best calibration curves and the corresponding emission lines pairs automatically. Laser-induced breakdown spectroscopy experiments with carbon steel samples were performed, and C, Cr, and Mn were analyzed via the proposed method. Calibration curves of these elements were constructed via a traditional single line internal standard calibration method and a multi-line internal standard calibration method. The calibration curves obtained were evaluated with the determination coefficient, the root mean square error of cross-validation, and the average relative error of cross-validation. All of the parameters were improved significantly with the proposed method. The results show that accurate and stable calibration curves can be obtained efficiently via the multi-line internal standard calibration method. PMID:26872822
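The automatic pair-selection step described here can be sketched as an exhaustive search: for every (analyte line, internal standard line) pair, build a calibration curve from the intensity ratio and keep the pair with the highest coefficient of determination. The data below are synthetic and the selection criterion is a simplification of the paper's procedure.

```python
import numpy as np

# Sketch of multi-line internal-standard pair selection: score every
# (analyte line, reference line) intensity-ratio calibration by R^2 and
# keep the best pair. Synthetic intensities, not the paper's data.

def best_line_pair(analyte_lines, ref_lines, concentrations):
    """analyte_lines, ref_lines: arrays of shape (n_samples, n_lines)."""
    best = None
    for i in range(analyte_lines.shape[1]):
        for j in range(ref_lines.shape[1]):
            ratio = analyte_lines[:, i] / ref_lines[:, j]
            slope, intercept = np.polyfit(concentrations, ratio, 1)
            pred = slope * concentrations + intercept
            r2 = 1 - np.sum((ratio - pred) ** 2) / np.sum((ratio - ratio.mean()) ** 2)
            if best is None or r2 > best[0]:
                best = (r2, i, j)
    return best  # (R^2, analyte line index, reference line index)

rng = np.random.default_rng(3)
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])            # e.g. % Cr in standards
fe = np.abs(rng.normal(1000, 30, size=(5, 2)))        # two internal-standard lines
cr_good = conc[:, None] * fe * 0.1                    # ratio exactly linear in conc
cr_noisy = cr_good + rng.normal(0, 40, size=(5, 2))   # degraded second line
cr = np.concatenate([cr_good[:, :1], cr_noisy[:, 1:]], axis=1)
print(best_line_pair(cr, fe, conc))
```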

  20. Apparatus and method for quantitatively evaluating total fissile and total fertile nuclide content in samples

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Cates, Michael R.; Franks, Larry A.

    1985-01-01

    Simultaneous photon and neutron interrogation of samples for the quantitative determination of total fissile nuclide and total fertile nuclide material present is made possible by the use of an electron accelerator. Prompt and delayed neutrons produced from resulting induced fissions are counted using a single detection system and allow the resolution of the contributions from each interrogating flux, leading in turn to the quantitative determination sought. Detection limits for ²³⁹Pu are estimated to be about 3 mg using prompt fission neutrons and about 6 mg using delayed neutrons.

  1. A simple HPLC-MS method for the quantitative determination of the composition of bacterial medium chain-length polyhydroxyalkanoates.

    PubMed

    Grubelnik, Andreas; Wiesli, Luzia; Furrer, Patrick; Rentsch, Daniel; Hany, Roland; Meyer, Veronika R

    2008-06-01

    Bacterial poly(hydroxyalkanoates) (PHAs) vary in the composition of their monomeric units. Besides saturated side-chains, unsaturated ones can also be found. The latter lead to unwanted by-products (THF esters, secondary alcohols) during acidic cleavage of the polymer backbone in the conventional analytical assays. To prevent these problems, we developed a new method for the reductive depolymerization of medium chain-length PHAs, leading to monomeric diols that can be separated and quantified by HPLC/MS. Reduction is performed at room temperature with lithium aluminum hydride within 5-15 min. The new method is faster and simpler than the previous ones and is quantitative. The results are consistent with those obtained by quantitative (1)H NMR. PMID:18461645

  2. Multi-Window Classical Least Squares Multivariate Calibration Methods for Quantitative ICP-AES Analyses

    SciTech Connect

    CHAMBERS,WILLIAM B.; HAALAND,DAVID M.; KEENAN,MICHAEL R.; MELGAARD,DAVID K.

    1999-10-01

    The advent of inductively coupled plasma-atomic emission spectrometers (ICP-AES) equipped with charge-coupled-device (CCD) detector arrays allows the application of multivariate calibration methods to the quantitative analysis of spectral data. We have applied classical least squares (CLS) methods to the analysis of a variety of samples containing up to 12 elements plus an internal standard. The elements included in the calibration models were Ag, Al, As, Au, Cd, Cr, Cu, Fe, Ni, Pb, Pd, and Se. By performing the CLS analysis separately in each of 46 spectral windows and by pooling the CLS concentration results for each element in all windows in a statistically efficient manner, we have been able to significantly improve the accuracy and precision of the ICP-AES analyses relative to the univariate and single-window multivariate methods supplied with the spectrometer. This new multi-window CLS (MWCLS) approach simplifies the analyses by providing a single concentration determination for each element from all spectral windows. Thus, the analyst does not have to perform the tedious task of reviewing the results from each window in an attempt to decide the correct value among discrepant analyses in one or more windows for each element. Furthermore, it is not necessary to construct a spectral correction model for each window prior to calibration and analysis. When one or more interfering elements were present, the new MWCLS method was able to reduce prediction errors for a selected analyte by more than 2 orders of magnitude compared to the worst-case single-window multivariate and univariate predictions. The MWCLS detection limits in the presence of multiple interferences are 15 ng/g (i.e., 15 ppb) or better for each element. In addition, errors with the new method are only slightly inflated when only a single target element is included in the calibration (i.e., knowledge of all other elements is excluded during calibration). The MWCLS method is found to be vastly
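
    A minimal sketch of the pooling idea, with simulated pure-component spectra and noise (none of the numbers come from the paper): solve a classical least squares problem in each spectral window, then combine the per-window concentration estimates with inverse-variance weights to obtain a single value per element.

```python
import numpy as np

rng = np.random.default_rng(1)
n_el, n_win, ch = 3, 5, 40            # elements, spectral windows, channels/window

# hypothetical pure-element spectra for each window (from calibration standards)
K = [rng.random((ch, n_el)) for _ in range(n_win)]
c_true = np.array([2.0, 0.5, 1.0])    # "unknown" sample concentrations (ppm)

estimates, weights = [], []
for Kw in K:
    y = Kw @ c_true + rng.normal(0, 0.01, ch)        # measured window spectrum
    c_hat, *_ = np.linalg.lstsq(Kw, y, rcond=None)   # CLS fit in this window
    resid = y - Kw @ c_hat
    weights.append((ch - n_el) / (resid @ resid))    # inverse residual variance
    estimates.append(c_hat)

estimates, weights = np.array(estimates), np.array(weights)
c_pooled = (estimates * weights[:, None]).sum(axis=0) / weights.sum()
print(c_pooled)    # one concentration per element, pooled over all windows
```

Windows with larger residuals (e.g., from unmodeled interferences) automatically receive less weight, which is one simple reading of "statistically efficient" pooling.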

  3. Raman spectroscopy provides a rapid, non-invasive method for quantitation of starch in live, unicellular microalgae.

    PubMed

    Ji, Yuetong; He, Yuehui; Cui, Yanbin; Wang, Tingting; Wang, Yun; Li, Yuanguang; Huang, Wei E; Xu, Jian

    2014-12-01

    Conventional methods for quantitation of starch content in cells generally involve starch extraction steps and are usually labor intensive, thus a rapid and non-invasive method will be valuable. Using the starch-producing unicellular microalga Chlamydomonas reinhardtii as a model, we employed a customized Raman spectrometer to capture the Raman spectra of individual single cells under distinct culture conditions and along various growth stages. The results revealed a nearly linear correlation (R(2) = 0.9893) between the signal intensity at 478 cm(-1) and the starch content of the cells. We validated the specific correlation by showing that the starch-associated Raman peaks were eliminated in a mutant strain where the AGPase (ADP-glucose pyrophosphorylase) gene was disrupted and consequentially the biosynthesis of starch blocked. Furthermore, the method was validated in an industrial algal strain of Chlorella pyrenoidosa. This is the first demonstration of starch quantitation in individual live cells. Compared to existing cellular starch quantitation methods, this single-cell Raman spectra-based approach is rapid, label-free, non-invasive, culture-independent, low-cost, and potentially able to simultaneously track multiple metabolites in individual live cells, therefore should enable many new applications. PMID:24906189

  4. Comparison of reconstruction methods and quantitative accuracy in Siemens Inveon PET scanner

    NASA Astrophysics Data System (ADS)

    Ram Yu, A.; Kim, Jin Su; Kang, Joo Hyun; Moo Lim, Sang

    2015-04-01

    concentrations for radioactivity. Our data collectively showed that the OSEM 2D reconstruction method provides quantitatively accurate reconstructed PET data.

  5. Radial period extraction method employing frequency measurement for quantitative collimation testing

    NASA Astrophysics Data System (ADS)

    Li, Sikun; Wang, Xiangzhao

    2016-01-01

    A radial period extraction method employing frequency measurement is proposed for quantitative collimation testing using spiral gratings. The radial period of the difference-frequency fringe is treated as a measure of the collimation condition. A frequency measurement technique based on wavelet transform and a statistical approach is presented to extract the radial period directly from the amplitude-transmittance spiral fringe. A basic constraint for setting the wavelet parameters is introduced, and a rigorous mathematical demonstration is given. The method outperforms methods employing phase measurement in terms of precision, stability, and noise immunity.
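
    The core idea, recovering a fringe period from its dominant spatial frequency, can be illustrated with a plain FFT peak as a simplified stand-in for the wavelet-based technique the paper actually uses (the fringe profile and period below are synthetic).

```python
import numpy as np

# Synthetic 1-D fringe profile along the radial direction; a real analysis
# would apply a wavelet transform, but an FFT peak illustrates the principle.
n, dx = 4096, 0.01            # samples, sampling interval (mm)
period_true = 2.5             # mm, hypothetical difference-frequency fringe period
x = np.arange(n) * dx
fringe = 0.5 + 0.5 * np.cos(2 * np.pi * x / period_true)

spec = np.abs(np.fft.rfft(fringe - fringe.mean()))   # remove DC before peak search
freqs = np.fft.rfftfreq(n, d=dx)                     # cycles per mm
period_est = 1.0 / freqs[np.argmax(spec)]
print(f"estimated radial period: {period_est:.3f} mm")
```

The FFT-bin resolution limits the precision here, which is one reason a wavelet/statistical estimator can outperform this naive approach.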

  6. A field- and laboratory-based quantitative analysis of alluvium: Relating analytical results to TIMS data

    NASA Technical Reports Server (NTRS)

    Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.

    1995-01-01

    Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch and assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, what environmental variable(s), in the absence of bedrock, is/are responsible for influencing the spectral properties of the desert alluvial surface.

  7. Hepatitis C Virus RNA Real-Time Quantitative RT-PCR Method Based on a New Primer Design Strategy.

    PubMed

    Chen, Lida; Li, Wenli; Zhang, Kuo; Zhang, Rui; Lu, Tian; Hao, Mingju; Jia, Tingting; Sun, Yu; Lin, Guigao; Wang, Lunan; Li, Jinming

    2016-01-01

    Viral nucleic acids are unstable when improperly collected, handled, and stored, resulting in decreased sensitivity of currently available commercial quantitative nucleic acid testing kits. Using known unstable hepatitis C virus RNA, we developed a quantitative RT-PCR method based on a new primer design strategy to reduce the impact of nucleic acid instability on nucleic acid testing. The performance of the method was evaluated for linearity, limit of detection, precision, specificity, and agreement with commercial hepatitis C virus assays. Its clinical application was compared to that of two commercial kits--Cobas AmpliPrep/Cobas TaqMan (CAP/CTM) and Kehua. The quantitative RT-PCR method delivered a good performance, with a linearity of R(2) = 0.99, a total limit of detection (genotypes 1 to 6) of 42.6 IU/mL (95% CI, 32.84 to 67.76 IU/mL), a CV of 1.06% to 3.34%, a specificity of 100%, and a high concordance with the CAP/CTM assay (R(2) = 0.97), with a means ± SD value of -0.06 ± 1.96 log IU/mL (range, -0.38 to 0.25 log IU/mL). The method was superior to commercial assays in detecting unstable hepatitis C virus RNA (P < 0.05). This quantitative RT-PCR method can effectively eliminate the influence of RNA instability on nucleic acid testing. The principle of primer design strategy may be applied to the detection of other RNA or DNA viruses. PMID:26612712
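
    As generic background to the performance figures above: quantitative PCR assays report titers by interpolating a sample's Cq on a standard curve of Cq versus log10 concentration. The Cq values and titers below are invented for illustration, not data from this study.

```python
import numpy as np

# hypothetical standard curve: Cq values measured for known titers (IU/mL)
titers = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
cq = np.array([33.1, 29.8, 26.4, 23.1, 19.7])

slope, intercept = np.polyfit(np.log10(titers), cq, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0   # ideal PCR: slope ~ -3.32, ~100%

def quantify(sample_cq):
    """Interpolate an unknown sample's titer (IU/mL) from the standard curve."""
    return 10.0 ** ((sample_cq - intercept) / slope)

print(f"slope {slope:.2f}, efficiency {efficiency:.0%}")
print(f"Cq 25.0 -> {quantify(25.0):,.0f} IU/mL")
```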

  8. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo)

    PubMed Central

    Li, Yi; Kim, Jong-Joo

    2015-01-01

    The efficiency of genome-wide association analysis (GWAS) depends on the power of detection for quantitative trait loci (QTL) and the precision of QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in Korean cattle (Hanwoo): a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA), and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). The genotypes of the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at a false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold was set in the latter model such that the top five windows, each comprising 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods showed high concordance, found at 64.1 to 64.9 Mb on Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb on BTA14 for CWT, 0.5 to 1.5 Mb on BTA6 for BFT, and 26.3 to 33.4 Mb on BTA29 for BFT. Several candidate genes (i.e., glutamate receptor, ionotropic, ampa 1 [GRIA1]; family with sequence similarity 110, member B [FAM110B]; and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our results suggest that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions to further pinpoint DNA markers or causative genes in these regions. PMID:26104396
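
    A chromosome-wide false discovery rate threshold of the kind used for the first two methods is commonly applied with the Benjamini-Hochberg step-up procedure; a generic sketch (the p-values are invented, and this is not the authors' code):

```python
def benjamini_hochberg(pvals, q=0.01):
    """Return indices of tests significant at FDR < q (BH step-up):
    find the largest rank k with p_(k) <= q*k/m, keep the k smallest p-values."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= q * rank / m:
            k_max = rank
    return sorted(order[:k_max])

# hypothetical single-locus regression p-values for 10 SNPs on one chromosome
pvals = [0.0001, 0.3, 0.0005, 0.8, 0.02, 0.6, 0.004, 0.9, 0.5, 0.7]
print(benjamini_hochberg(pvals, q=0.01))   # indices of significant SNPs
```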

  10. Qualitative and quantitative methods to determine miscibility in amorphous drug-polymer systems.

    PubMed

    Meng, Fan; Dave, Vivek; Chauhan, Harsh

    2015-09-18

    Amorphous drug-polymer systems, or amorphous solid dispersions, are commonly used in the pharmaceutical industry to enhance the solubility of compounds with poor aqueous solubility. The degree of miscibility between drug and polymer is important both for solubility enhancement and for the formation of a physically stable amorphous system. Calculation of solubility parameters, computational data mining, Tg measurements by DSC, and Raman mapping are established traditional methods used to qualitatively detect drug-polymer miscibility. Calculation of the Flory-Huggins interaction parameter, computational analysis of X-ray diffraction (XRD) data, solid-state nuclear magnetic resonance (NMR) spectroscopy, and atomic force microscopy (AFM) have recently been developed to quantitatively determine the miscibility of amorphous drug-polymer systems. This brief review introduces and compiles these qualitative and quantitative methods employed in the evaluation of drug-polymer miscibility. Combining these techniques can provide deeper insight into the true miscibility of drug-polymer systems. PMID:26006307
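
    As a worked example of the solubility-parameter route, the Flory-Huggins interaction parameter is often first estimated as χ ≈ V(δ_drug − δ_polymer)²/RT from Hildebrand solubility parameters. The δ values and molar volume below are hypothetical, and this first-pass estimate is no substitute for the measured methods the review covers.

```python
def chi_from_solubility(delta_drug, delta_poly, v_molar=100.0, T=298.15):
    """Flory-Huggins chi estimated from Hildebrand solubility parameters
    (delta in MPa**0.5, v_molar in cm^3/mol); note MPa*cm^3 = J, so the
    numerator is already in J/mol. All example values are hypothetical."""
    R = 8.314  # J/(mol K)
    return v_molar * (delta_drug - delta_poly) ** 2 / (R * T)

# e.g. drug delta ~24 MPa^0.5, polymer delta ~21 MPa^0.5 (assumed values);
# a small chi (well below ~0.5) is usually read as favoring miscibility
print(round(chi_from_solubility(24.0, 21.0), 3))
```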

  11. Researchers’ views on return of incidental genomic research results: qualitative and quantitative findings

    PubMed Central

    Klitzman, Robert; Appelbaum, Paul S.; Fyer, Abby; Martinez, Josue; Buquez, Brigitte; Wynn, Julia; Waldman, Cameron R.; Phelan, Jo; Parens, Erik; Chung, Wendy K.

    2013-01-01

    Purpose Comprehensive genomic analysis including exome and genome sequencing is increasingly being utilized in research studies, leading to the generation of incidental genetic findings. It is unclear how researchers plan to deal with incidental genetic findings. Methods We conducted a survey of the practices and attitudes of 234 members of the US genetic research community and performed qualitative semistructured interviews with 28 genomic researchers to understand their views and experiences with incidental genetic research findings. Results We found that 12% of the researchers had returned incidental genetic findings, and an additional 28% planned to do so. A large majority of researchers (95%) believe that incidental findings for highly penetrant disorders with immediate medical implications should be offered to research participants. However, there was no consensus on returning incidental results for other conditions varying in penetrance and medical actionability. Researchers raised concerns that the return of incidental findings would impose significant burdens on research and could potentially have deleterious effects on research participants if not performed well. Researchers identified assistance needed to enable effective, accurate return of incidental findings. Conclusion The majority of the researchers believe that research participants should have the option to receive at least some incidental genetic research results. PMID:23807616

  12. A quantitative and standardized robotic method for the evaluation of arm proprioception after stroke.

    PubMed

    Simo, Lucia S; Ghez, Claude; Botzer, Lior; Scheidt, Robert A

    2011-01-01

    Stroke often results in both motor and sensory deficits, which may interact in the manifested functional impairment. Proprioception is known to play important roles in the planning and control of limb posture and movement; however, the impact of proprioceptive deficits on motor function has been difficult to elucidate due in part to the qualitative nature of available clinical tests. We present a quantitative and standardized method for evaluating proprioception in tasks directly relevant to those used to assess motor function. Using a robotic manipulandum that exerted controlled displacements of the hand, stroke participants were evaluated, and compared with a control group, in their ability to detect such displacements in a 2-alternative, forced-choice paradigm. A psychometric function parameterized the decision process underlying the detection of the hand displacements. The shape of this function was determined by a signal detection threshold and by the variability of the response about this threshold. Our automatic procedure differentiates between participants with and without proprioceptive deficits and quantifies functional proprioceptive sensation on a magnitude scale that is meaningful for ongoing studies of degraded motor function in comparable horizontal movements. PMID:22256252
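
    The psychometric function described above is typically a cumulative Gaussian whose mean is the detection threshold and whose standard deviation captures response variability. A toy fit is sketched below; the displacement amplitudes and detection proportions are hypothetical, and the crude grid search stands in for the maximum-likelihood fitting a real analysis would use.

```python
import math

def psychometric(x, mu, sigma):
    """P(detect) for hand displacement x: cumulative Gaussian with
    detection threshold mu and response variability sigma."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# hypothetical detection proportions at six displacement amplitudes (mm)
disp = [1, 2, 4, 6, 8, 12]
p_obs = [0.05, 0.10, 0.35, 0.65, 0.90, 0.99]

# crude least-squares grid search (illustration only)
candidates = ((mu * 0.1, s * 0.1) for mu in range(10, 100) for s in range(5, 60))
best = min(candidates,
           key=lambda ms: sum((psychometric(x, *ms) - p) ** 2
                              for x, p in zip(disp, p_obs)))
print(f"threshold ~ {best[0]:.1f} mm, variability ~ {best[1]:.1f} mm")
```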

  13. Integrated multiplatform method for in vitro quantitative assessment of cellular uptake for fluorescent polymer nanoparticles

    NASA Astrophysics Data System (ADS)

    Ferrari, Raffaele; Lupi, Monica; Falcetta, Francesca; Bigini, Paolo; Paolella, Katia; Fiordaliso, Fabio; Bisighini, Cinzia; Salmona, Mario; D'Incalci, Maurizio; Morbidelli, Massimo; Moscatelli, Davide; Ubezio, Paolo

    2014-01-01

    Studies of cellular internalization of nanoparticles (NPs) play a paramount role in the design of efficient drug delivery systems, but so far they lack a robust experimental technique able to quantify NP uptake in terms of the number of NPs internalized in each cell. In this work we propose a novel method which provides a quantitative evaluation of fluorescent NP uptake by combining flow cytometry and plate fluorimetry with measurements of the number of cells. Single-cell fluorescence signals measured by flow cytometry were associated with the number of internalized NPs, exploiting the observed linearity between average flow cytometric fluorescence and overall plate fluorimeter measures, and previous calibration of the microplate reader with serial dilutions of NPs. This precise calibration was made possible by using biocompatible fluorescent NPs in the range of 20-300 nm with a narrow particle size distribution, functionalized with a covalently bonded dye, Rhodamine B, and synthesized via emulsion free-radical polymerization. We report the absolute number of NPs internalized in mouse mammary tumor cells (4T1) as a function of time for different NP dimensions and surface charges and at several exposure concentrations. The obtained results indicate that 4T1 cells incorporated 10³-10⁴ polymer NPs in a short time, reaching an intracellular concentration 15 times higher than the external one.
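
    The calibration chain can be illustrated with a toy computation: a plate-reader calibration with NP serial dilutions gives fluorescence per particle, and the plate signal from a well of cells then converts to total internalized NPs and, via the cell count, NPs per cell. All numbers are invented for illustration.

```python
import numpy as np

# Step 1: plate-reader calibration with serial NP dilutions (signal = k * n_NP)
dilution_np = np.array([1.0e9, 5.0e8, 2.5e8, 1.25e8])        # NPs per well
dilution_signal = np.array([2000.0, 1000.0, 500.0, 250.0])   # fluorescence (a.u.)
k = np.linalg.lstsq(dilution_np[:, None], dilution_signal, rcond=None)[0][0]

# Step 2: a well of cells after uptake; plate signal -> total internalized NPs,
# then divide by the cell count to get the average NPs per cell
well_signal, n_cells = 800.0, 2.0e5
np_per_cell = well_signal / k / n_cells
print(f"~{np_per_cell:.0f} NPs per cell")
```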

  14. EFFECTIVE REMOVAL METHOD OF ILLEGAL PARKING BICYCLES BASED ON THE QUANTITATIVE CHANGE AFTER REMOVAL

    NASA Astrophysics Data System (ADS)

    Toi, Satoshi; Kajita, Yoshitaka; Nishikawa, Shuichirou

    This study aims to find an effective method for removing illegally parked bicycles, based on an analysis of how their numbers change after removal. We built a time-space quantitative distribution model of illegally parked bicycles after removal, accounting for the logistic increase of illegally parked bicycles, behaviors such as direct or indirect return to the original parking place, and avoidance of the original parking place, based on a survey of actual illegal bicycle parking in the TENJIN area of FUKUOKA city. We then built a simulation model incorporating the above distribution model and calculated the number of illegally parked bicycles while varying the removal frequency and the number of bicycles removed at one time. Four interesting results were obtained: (1) the speed of recovery after removal differs by zone; (2) thorough removal is effective in keeping the number of illegally parked bicycles at a low level; (3) removal in one zone increases the number of bicycles in other zones where the level of illegal parking is lower; and (4) the relationship between the effects and costs of removing illegally parked bicycles was clarified.
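
    A toy version of the simulation idea, with logistic daily growth punctuated by periodic removal (all parameters are arbitrary, not fitted to the TENJIN survey, and the return/avoidance behaviors of the full model are omitted):

```python
def simulate(days, r=0.5, K=500, n0=100, removal_every=7, removed=400):
    """Logistic build-up of illegally parked bicycles with periodic removal.
    r: daily growth rate, K: capacity, removed: bikes taken per enforcement."""
    n, history = float(n0), []
    for day in range(1, days + 1):
        n += r * n * (1 - n / K)          # logistic daily increase
        if day % removal_every == 0:
            n = max(n - removed, 0.0)     # enforcement removes up to `removed`
        history.append(n)
    return history

weekly = simulate(28, removal_every=7, removed=400)
daily = simulate(28, removal_every=1, removed=60)
print(max(weekly), max(daily))   # more frequent removal keeps the peak lower
```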

  15. Common standards for quantitative electrocardiography: goals and main results. CSE Working Party.

    PubMed

    Willems, J L; Arnaud, P; van Bemmel, J H; Degani, R; Macfarlane, P W; Zywietz, C

    1990-09-01

    Computer processing of electrocardiograms (ECGs) has increased rapidly over the last 15 years. Still, there are at present no standards for computer ECG interpretation. Different techniques are used not only for measurement and interpretation, but also for transmission and storage of data. In order to fill these gaps, a large international project, sponsored by the European Commission, was launched in 1980 to develop "Common Standards for Quantitative Electrocardiography" (CSE). The main objective of the first CSE study was to reduce the wide variation in wave measurements currently obtained by ECG computer programs. The second study, started in 1985, aimed at the assessment and improvement of diagnostic classification by ECG interpretation programs. To this end, reference libraries of well-documented ECGs have been developed and comprehensive reviewing schemes devised for the visual and computer analysis of ECGs. This task was performed by a board of cardiologists in a Delphi review process, and by 9 VCG and 10 standard 12-lead programs developed by university research groups and by industry. A third action was started in June 1989 to harmonize the acquisition, encoding, interchange, and storage of digital ECG data. The actions thus performed have become internationally recognized milestones for the standardization of quantitative electrocardiography. PMID:2233372

  16. Quantitative method of analyzing the interaction of slightly selective radioligands with multiple receptor subtypes

    SciTech Connect

    McGonigle, P.; Neve, K.A.; Molinoff, P.B.

    1986-10-01

    Subclasses of receptors exist for most neurotransmitters. Frequently, two subtypes of receptors coexist in the same tissue and, in some cases, mediate the same physiological response. In tissues with two classes of binding sites for a given hormone, an estimate of the proportion of each class of binding sites is obtained by inhibiting the binding of a single concentration of a radioligand with a selective unlabeled ligand. Accurate estimates of the density of each class of receptors will only be obtained, however, if the radioligand is entirely nonselective. Selectivity of just 2- to 3-fold can markedly influence the results of subtype analysis. The conclusion that a radioligand is nonselective is usually based on the results of a saturation binding curve: if Scatchard analysis results in a linear plot, the radioligand is judged nonselective. Scatchard analysis, however, cannot distinguish between a radioligand that is nonselective and one that is slightly selective. The use of a slightly selective radioligand can lead to errors of 50% or more, depending on the concentration of the radioligand relative to the Kd values of the two classes of sites. A new method has been developed that can quantitate 2- to 3-fold differences in the affinity of two distinct classes of binding sites for a radioligand. This approach requires that a series of inhibition experiments with a selective unlabeled ligand be performed in the presence of increasing concentrations of the radioligand. Analysis of the resulting inhibition curves, utilizing the mathematical modeling program MLAB on the PROPHET system, yields accurate estimates of the density of each class of receptor as well as the affinity of each receptor for the labeled and unlabeled ligands. This approach was used to determine whether ¹²⁵I-iodopindolol shows selectivity for beta 1- or beta 2-adrenergic receptors.
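
    The underlying model is ordinary competitive binding at two independent sites; the sketch below shows how the inhibition curve's shape depends on the radioligand concentration, which is the information the method exploits. All Kd/Ki values are hypothetical, and this is a forward simulation, not the MLAB fitting procedure.

```python
import numpy as np

def bound(L, I, frac1, Kd1, Kd2, Ki1, Ki2, Bmax=1.0):
    """Total radioligand (conc. L) bound to a two-site mixture in the
    presence of unlabeled inhibitor at concentration I (competitive model)."""
    b1 = frac1 * Bmax * L / (L + Kd1 * (1 + I / Ki1))
    b2 = (1 - frac1) * Bmax * L / (L + Kd2 * (1 + I / Ki2))
    return b1 + b2

I = np.logspace(-10, -4, 25)          # unlabeled inhibitor concentrations (M)
params = dict(frac1=0.5, Kd1=1e-9, Kd2=2.5e-9, Ki1=1e-9, Ki2=1e-6)
low_L = bound(1e-9, I, **params)      # radioligand near its Kd (2.5-fold selective)
high_L = bound(1e-8, I, **params)     # 10-fold higher radioligand concentration
# fitting such a family of curves simultaneously resolves the two sites
print(low_L[0], high_L[0])
```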

  17. A Study of the Synchrotron Laue Method for Quantitative Crystal Structure Analysis

    NASA Astrophysics Data System (ADS)

    Gomez de Anderez, Dora M.

    1990-01-01

    Quantitative crystal structure analyses have been carried out on small-molecule crystals using synchrotron radiation and the Laue method. A variety of single-crystal structure determinations and associated refinements are presented and compared with the monochromatic analyses. The new molecular structure of 7-amino-5-bromo-4-methyl-2-oxo-1,2,3,4-tetrahydro-1,6-naphthyridine-8-carbonitrile (C10H9ON4Br·H2O) has been determined, first using monochromatic Mo Kα radiation and a four-circle diffractometer, then using synchrotron Laue diffraction photography. The structure refinements gave R-factors of 4.97% and 14.0% for the Mo Kα and Laue data, respectively. The molecular structure of (S)-2-chloro-2-fluoro-N-((S)-1-phenylethyl)ethanamide (C10H11ClFNO) has been determined using the same crystal throughout for X-ray monochromatic analyses (Mo Kα and Cu Kα) followed by synchrotron Laue data collection. The Laue and monochromatic data compare favourably. The R-factors (on F) were 6.23%, 6.45%, and 8.19% for the Mo Kα, Cu Kα, and Laue data sets, respectively. The molecular structure of 3-(5-hydroxy-3-methyl-1-phenylpyrazol-4-yl)-1,3-diphenyl-prop-2-en-1-one (C25H20N2O2) has been determined using the synchrotron Laue method. The results compare very well with Mo Kα monochromatic data. The R-factors (on F) were 4.60% and 5.29% for the Mo Kα and Laue analyses, respectively. The Laue method is assessed in locating the 20 hydrogen atoms in this structure. The structure analysis of benzil ((C6H5CO)2) was carried out using the synchrotron Laue method, first at room temperature and then at low temperature (-114 °C). The structure shows an R-factor (on F) of 13.06% and 6.85% for each data set, respectively. The synchrotron Laue method was used to collect data for ergocalciferol (Vitamin D2). The same crystal was also used to record oscillation data with the synchrotron radiation monochromatic beam. A new
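
    The R-factor quoted throughout is the standard crystallographic agreement index R = Σ|Fo − Fc| / Σ|Fo| over observed and calculated structure-factor amplitudes; a minimal illustration with invented amplitudes:

```python
def r_factor(f_obs, f_calc):
    """Crystallographic R-factor (on F): sum|Fo - Fc| / sum|Fo|."""
    num = sum(abs(o - c) for o, c in zip(f_obs, f_calc))
    return num / sum(abs(o) for o in f_obs)

# hypothetical observed / calculated structure-factor amplitudes
f_obs = [100.0, 50.0, 80.0, 20.0]
f_calc = [95.0, 52.0, 76.0, 22.0]
print(f"R = {100 * r_factor(f_obs, f_calc):.2f}%")   # -> R = 5.20%
```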

  19. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Zakharov, Sergei M.

    1997-01-10

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation.
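
    For the zero-failure case, a classical lower confidence limit on the probability of failure-free operation has the closed form R_L = (1 - C)^(1/n) for n successful trials at confidence level C. This generic formula is shown for illustration only and is not taken from the paper's assessment.

```python
def lcl_reliability(n_trials, confidence=0.95):
    """Lower confidence limit on the probability of failure-free operation
    after n_trials ground tests with zero failures: R_L = (1 - C)**(1/n)."""
    return (1.0 - confidence) ** (1.0 / n_trials)

for n in (10, 30, 100):
    print(n, round(lcl_reliability(n), 3))   # bound tightens as tests accumulate
```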

  1. Development of a quantitative diagnostic method of estrogen receptor expression levels by immunohistochemistry using organic fluorescent material-assembled nanoparticles

    SciTech Connect

    Gonda, Kohsuke; Miyashita, Minoru; Watanabe, Mika; Takahashi, Yayoi; Goda, Hideki; Okada, Hisatake; Nakano, Yasushi; Tada, Hiroshi; Amari, Masakazu; Ohuchi, Noriaki

    2012-09-28

    quantitatively examine the two methods. The results demonstrated that our nanoparticle staining analyzed a wide range of ER expression levels with higher accuracy and quantitative sensitivity than DAB staining. This enhancement in the diagnostic accuracy and sensitivity for ERs using our immunostaining method will improve the prediction of responses to therapies that target ERs and progesterone receptors that are induced by a downstream ER signal.

  2. An Improved Method for Measuring Quantitative Resistance to the Wheat Pathogen Zymoseptoria tritici Using High-Throughput Automated Image Analysis.

    PubMed

    Stewart, Ethan L; Hagerty, Christina H; Mikaberidze, Alexey; Mundt, Christopher C; Zhong, Ziming; McDonald, Bruce A

    2016-07-01

    Zymoseptoria tritici causes Septoria tritici blotch (STB) on wheat. An improved method of quantifying STB symptoms was developed based on automated analysis of diseased leaf images made using a flatbed scanner. Naturally infected leaves (n = 949) sampled from fungicide-treated field plots comprising 39 wheat cultivars grown in Switzerland and 9 recombinant inbred lines (RIL) grown in Oregon were included in these analyses. Measures of quantitative resistance were percent leaf area covered by lesions, pycnidia size and gray value, and pycnidia density per leaf and lesion. These measures were obtained automatically with a batch-processing macro utilizing the image-processing software ImageJ. All phenotypes in both locations showed a continuous distribution, as expected for a quantitative trait. The trait distributions at both sites were largely overlapping even though the field and host environments were quite different. Cultivars and RILs could be assigned to two or more statistically different groups for each measured phenotype. Traditional visual assessments of field resistance were highly correlated with quantitative resistance measures based on image analysis for the Oregon RILs. These results show that automated image analysis provides a promising tool for assessing quantitative resistance to Z. tritici under field conditions. PMID:27050574
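
    The central image-analysis measurement, percent leaf area covered by lesions, reduces to thresholding within a leaf mask. The snippet below is a simplified stand-in for the ImageJ batch macro; the image, mask, and threshold value are synthetic.

```python
import numpy as np

def lesion_stats(gray, leaf_mask, lesion_thresh=120):
    """Percent leaf area covered by lesions in a grayscale scan.
    Lesions are assumed darker than healthy tissue (threshold hypothetical)."""
    lesion = (gray < lesion_thresh) & leaf_mask
    return 100.0 * lesion.sum() / leaf_mask.sum()

# toy 8-bit "leaf": healthy tissue ~200, one darker lesion patch ~80
img = np.full((100, 100), 200, dtype=np.uint8)
img[40:60, 40:60] = 80                      # 400 lesion pixels
mask = np.ones_like(img, dtype=bool)        # whole image treated as leaf
print(lesion_stats(img, mask))              # -> 4.0 (400 of 10000 pixels)
```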

  3. Adaptation of the quantitative 2-[14C]deoxyglucose method for use in freely moving rats.

    PubMed

    Crane, A M; Porrino, L J

    1989-10-01

    A procedure for venous and arterial catheterization is described which allows the quantitative 2-[14C]deoxyglucose method to be applied to freely moving animals for behavioral and pharmacological studies. The catheterization method is rapid, minimally invasive, and requires no complicated equipment. Physiological conditions and rates of cerebral glucose utilization in freely moving rats and in restrained rats have been compared. The results demonstrate that local cerebral glucose utilization can readily be measured in freely moving animals engaged in behavioral experiments. PMID:2804673

  4. Task 4.4 - development of supercritical fluid extraction methods for the quantitation of sulfur forms in coal

    SciTech Connect

    Timpe, R.C.

    1995-04-01

    Development of advanced fuel forms depends on having reliable quantitative methods for their analysis. Determination of the true chemical forms of sulfur in coal is necessary to develop more effective methods to reduce sulfur content. Past work at the Energy & Environmental Research Center (EERC) indicates that sulfur chemistry has broad implications in combustion, gasification, pyrolysis, liquefaction, and coal-cleaning processes. Current analytical methods are inadequate for accurately measuring sulfur forms in coal. This task was concerned with developing methods to quantitate and identify major sulfur forms in coal based on direct measurement (as opposed to present techniques based on indirect measurement and difference values). The focus was on the forms that were least understood and for which the analytical methods have been the poorest, i.e., organic and elemental sulfur. Improved measurement techniques for sulfatic and pyritic sulfur also need to be developed. A secondary goal was to understand the interconversion of sulfur forms in coal during thermal processing. EERC has developed the first reliable analytical method for extracting and quantitating elemental sulfur from coal (1). This method has demonstrated that elemental sulfur can account for very little or as much as one-third of the so-called organic sulfur fraction. This method has disproved the generally accepted idea that elemental sulfur is associated with the organic fraction. A paper reporting the results obtained on this subject entitled "Determination of Elemental Sulfur in Coal by Supercritical Fluid Extraction and Gas Chromatography with Atomic Emission Detection" was published in Fuel (A).

  5. QUANTITATIVE EVALUATION OF ASR DETERIORATION LEVEL BASED ON SURVEY RESULT OF EXISTING STRUCTURE

    NASA Astrophysics Data System (ADS)

    Kawashima, Yasushi; Kosa, Kenji; Matsumoto, Shigeru; Miura, Masatsugu

    The relationship between crack density and the compressive strength of core cylinders drilled from an actual structure damaged by ASR was investigated. The results showed that even as crack density increased by about 1.0 m/m2, compressive strength decreased by only 2 N/mm2. A new method is then proposed for estimating future compressive strength from the crack density accumulated to date. In addition, the decline in compressive strength was initially proportional to the ASR expansion, and the reason the curve becomes gentler afterwards was examined. As a technique, cylindrical test specimens were cut in the longitudinal direction after the loading test, and the ASR cracks that arose on the cut plane were observed in detail. As a result, it was found that the rupture line rarely overlapped the ASR cracks, and that load is resisted by interlocking between coarse aggregate and concrete across the crack plane.

  6. Path Integrals and Exotic Options:. Methods and Numerical Results

    NASA Astrophysics Data System (ADS)

    Bormetti, G.; Montagna, G.; Moreni, N.; Nicrosini, O.

    2005-09-01

    In the framework of Black-Scholes-Merton model of financial derivatives, a path integral approach to option pricing is presented. A general formula to price path dependent options on multidimensional and correlated underlying assets is obtained and implemented by means of various flexible and efficient algorithms. As an example, we detail the case of Asian call options. The numerical results are compared with those obtained with other procedures used in quantitative finance and found to be in good agreement. In particular, when pricing at the money (ATM) and out of the money (OTM) options, path integral exhibits competitive performances.
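    The paper's path integral algorithms are not reproduced here, but the quantity being priced can be illustrated with a plain Monte Carlo sketch of an arithmetic-average Asian call under Black-Scholes-Merton dynamics; all parameter values below are arbitrary examples, not the paper's benchmarks.

```python
import numpy as np

def asian_call_mc(S0, K, r, sigma, T, n_steps=50, n_paths=20000, seed=0):
    """Monte Carlo price of an arithmetic-average Asian call under
    Black-Scholes-Merton (geometric Brownian motion paths)."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    # simulate log-price increments for all paths at once
    z = rng.standard_normal((n_paths, n_steps))
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z,
                          axis=1)
    S = S0 * np.exp(log_paths)
    payoff = np.maximum(S.mean(axis=1) - K, 0.0)  # payoff on the path average
    return np.exp(-r * T) * payoff.mean()         # discounted expectation

price = asian_call_mc(S0=100, K=100, r=0.05, sigma=0.2, T=1.0)  # an ATM example
```

    A path integral evaluation targets the same expectation but computes the transition kernel analytically, which is where the performance gains cited above come from.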

  7. A New Quantitative Method for the Non-Invasive Documentation of Morphological Damage in Paintings Using RTI Surface Normals

    PubMed Central

    Manfredi, Marcello; Bearman, Greg; Williamson, Greg; Kronkright, Dale; Doehne, Eric; Jacobs, Megan; Marengo, Emilio

    2014-01-01

    In this paper we propose a reliable surface imaging method for the non-invasive detection of morphological changes in paintings. Usually, the evaluation and quantification of changes and defects result mostly from an optical and subjective assessment, through the comparison of the previous and subsequent state of conservation and by means of condition reports. Using quantitative Reflectance Transformation Imaging (RTI) we obtain detailed information on the geometry and morphology of the painting surface with a fast, precise and non-invasive method. Accurate and quantitative measurements of deterioration were acquired after the painting experienced artificial damage. Morphological changes were documented using normal vector images while the intensity map succeeded in highlighting, quantifying and describing the physical changes. We estimate that the technique can detect morphological damage slightly smaller than 0.3 mm, which would be difficult to detect with the eye, considering the painting size. This non-invasive tool could be very useful, for example, to examine paintings and artwork before they travel on loan or during a restoration. The method lends itself to automated analysis of large images and datasets. Quantitative RTI thus eases the transition of extending human vision into the realm of measuring change over time. PMID:25010699

  8. Post-Reconstruction Non-Local Means Filtering Methods using CT Side Information for Quantitative SPECT

    PubMed Central

    Chun, Se Young; Fessler, Jeffrey A.; Dewaraja, Yuni K.

    2013-01-01

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose-response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation-maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration while preserving image details so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of NLM filtering using CT information to erroneous CT. NLM CT-S and NLM CT-H yielded comparable RC values to unfiltered images while substantially reducing RMSE. NLM CT-S achieved −2.7 to 2.6% increase of RC compared to no filtering and NLM CT-H yielded up to 6% decrease in RC while other methods yielded lower RCs

  9. Post-reconstruction non-local means filtering methods using CT side information for quantitative SPECT

    NASA Astrophysics Data System (ADS)

    Chun, Se Young; Fessler, Jeffrey A.; Dewaraja, Yuni K.

    2013-09-01

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose-response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation-maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration while preserving image details so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of NLM filtering using CT information to erroneous CT. NLM CT-S and NLM CT-H yielded comparable RC values to unfiltered images while substantially reducing RMSE. NLM CT-S achieved -2.7 to 2.6% increase of RC compared to no filtering and NLM CT-H yielded up to 6% decrease in RC while other methods yielded lower RCs
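    The specific NLM CT-B/M/S/H variants are not spelled out in the abstract. As a generic illustration of the underlying idea, here is a minimal 1-D non-local means filter in which the similarity weights can be computed from a co-registered side signal, a toy stand-in for CT guidance; the window size, bandwidth h, and signals are all invented.

```python
import numpy as np

def nlm_1d(signal, side=None, h=0.5, half_window=3):
    """Minimal non-local means on a 1-D signal. If a co-registered `side`
    signal is given, similarity is computed on it instead of the noisy
    signal (a toy analogue of CT side information guiding SPECT)."""
    ref = signal if side is None else side
    n = len(signal)
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        # weights from squared differences on the reference signal
        w = np.exp(-((ref[lo:hi] - ref[i]) ** 2) / (h ** 2))
        out[i] = np.sum(w * signal[lo:hi]) / np.sum(w)
    return out

noisy = np.array([1.0, 1.2, 0.9, 1.1, 5.0, 5.2, 4.9, 5.1])
side = np.array([1.0, 1.0, 1.0, 1.0, 5.0, 5.0, 5.0, 5.0])  # clean "CT" edge
smoothed = nlm_1d(noisy, side=side, h=0.5)
```

    Because the side signal has a sharp edge, averaging never mixes the two plateaus, which is the sense in which side information preserves detail while suppressing noise.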

  10. Continuously growing rodent molars result from a predictable quantitative evolutionary change over 50 million years.

    PubMed

    Tapaltsyan, Vagan; Eronen, Jussi T; Lawing, A Michelle; Sharir, Amnon; Janis, Christine; Jernvall, Jukka; Klein, Ophir D

    2015-05-01

    The fossil record is widely informative about evolution, but fossils are not systematically used to study the evolution of stem-cell-driven renewal. Here, we examined evolution of the continuous growth (hypselodonty) of rodent molar teeth, which is fuelled by the presence of dental stem cells. We studied occurrences of 3,500 North American rodent fossils, ranging from 50 million years ago (mya) to 2 mya. We examined changes in molar height to determine whether evolution of hypselodonty shows distinct patterns in the fossil record, and we found that hypselodont taxa emerged through intermediate forms of increasing crown height. Next, we designed a Markov simulation model, which replicated molar height increases throughout the Cenozoic and, moreover, evolution of hypselodonty. Thus, by extension, the retention of the adult stem cell niche appears to be a predictable quantitative rather than a stochastic qualitative process. Our analyses predict that hypselodonty will eventually become the dominant phenotype. PMID:25921530
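    The authors' Markov simulation is not specified in the abstract; the flavor of such a model can be sketched as a toy chain over crown-height classes in which hypselodonty is absorbing. All states, rates, and step counts below are invented for illustration.

```python
import numpy as np

def simulate_crown_height(n_lineages=1000, n_steps=48, p_up=0.08,
                          p_down=0.02, seed=1):
    """Toy Markov chain over crown-height classes 0..3 (brachydont ->
    mesodont -> hypsodont -> hypselodont). State 3 is absorbing,
    mirroring permanent retention of the adult stem cell niche.
    One step loosely represents ~1 Myr; all rates are invented."""
    rng = np.random.default_rng(seed)
    state = np.zeros(n_lineages, dtype=int)
    for _ in range(n_steps):
        u = rng.random(n_lineages)
        up = (u < p_up) & (state < 3)                        # crown grows taller
        down = (u > 1.0 - p_down) & (state > 0) & (state < 3)  # rare reversal
        state += up.astype(int)
        state -= down.astype(int)
    return np.bincount(state, minlength=4) / n_lineages

fracs = simulate_crown_height()   # fraction of lineages per height class
```

    With even a small upward bias and an absorbing top state, most lineages end up hypselodont, which is the qualitative sense in which the phenotype becomes dominant.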

  11. Continuously growing rodent molars result from a predictable quantitative evolutionary change over 50 million years

    PubMed Central

    Mushegyan, Vagan; Eronen, Jussi T.; Lawing, A. Michelle; Sharir, Amnon; Janis, Christine; Jernvall, Jukka; Klein, Ophir D.

    2015-01-01

    Summary The fossil record is widely informative about evolution, but fossils are not systematically used to study the evolution of stem cell-driven renewal. Here, we examined evolution of the continuous growth (hypselodonty) of rodent molar teeth, which is fuelled by the presence of dental stem cells. We studied occurrences of 3500 North American rodent fossils, ranging from 50 million years ago (mya) to 2 mya. We examined changes in molar height to determine if evolution of hypselodonty shows distinct patterns in the fossil record, and we found that hypselodont taxa emerged through intermediate forms of increasing crown height. Next, we designed a Markov simulation model, which replicated molar height increases throughout the Cenozoic, and, moreover, evolution of hypselodonty. Thus, by extension, the retention of the adult stem-cell niche appears to be a predictable quantitative rather than a stochastic qualitative process. Our analyses predict that hypselodonty will eventually become the dominant phenotype. PMID:25921530

  12. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    ERIC Educational Resources Information Center

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  13. Response monitoring using quantitative ultrasound methods and supervised dictionary learning in locally advanced breast cancer

    NASA Astrophysics Data System (ADS)

    Gangeh, Mehrdad J.; Fung, Brandon; Tadayyon, Hadi; Tran, William T.; Czarnota, Gregory J.

    2016-03-01

    A non-invasive computer-aided-theragnosis (CAT) system was developed for the early assessment of responses to neoadjuvant chemotherapy in patients with locally advanced breast cancer. The CAT system was based on quantitative ultrasound spectroscopy methods comprising several modules including feature extraction, a metric to measure the dissimilarity between "pre-" and "mid-treatment" scans, and a supervised learning algorithm for the classification of patients to responders/non-responders. One major requirement for the successful design of a high-performance CAT system is to accurately measure the changes in parametric maps before treatment onset and during the course of treatment. To this end, a unified framework based on Hilbert-Schmidt independence criterion (HSIC) was used for the design of feature extraction from parametric maps and the dissimilarity measure between the "pre-" and "mid-treatment" scans. For the feature extraction, HSIC was used to design a supervised dictionary learning (SDL) method by maximizing the dependency between the scans taken from "pre-" and "mid-treatment" with "dummy labels" given to the scans. For the dissimilarity measure, an HSIC-based metric was employed to effectively measure the changes in parametric maps as an indication of treatment effectiveness. The HSIC-based feature extraction and dissimilarity measure used a kernel function to nonlinearly transform input vectors into a higher dimensional feature space and computed the population means in the new space, where enhanced group separability was ideally obtained. The results of the classification using the developed CAT system indicated an improvement of performance compared to a CAT system with basic features using histogram of intensity.
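    As a sketch of the dependence measure underlying the framework, a biased empirical HSIC estimator with Gaussian kernels can be written in a few lines; the kernel bandwidth and test data are illustrative, and this is not the study's SDL implementation.

```python
import numpy as np

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC with Gaussian kernels; larger values mean
    stronger statistical dependence between the paired samples."""
    n = len(X)
    def gram(A):
        sq = np.sum((A[:, None, :] - A[None, :, :]) ** 2, axis=-1)
        return np.exp(-sq / (2.0 * sigma ** 2))
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    K, L = gram(X), gram(Y)
    return np.trace(H @ K @ H @ L) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.standard_normal((100, 1))
dependent = hsic(x, x ** 2)                          # nonlinearly dependent pair
independent = hsic(x, rng.standard_normal((100, 1)))  # unrelated pair
```

    The kernel makes the measure sensitive to nonlinear dependence, which a plain correlation between x and x squared would miss.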

  14. Quantitative analysis methods for three-dimensional microstructure of the solid-oxide fuel cell anode

    NASA Astrophysics Data System (ADS)

    Song, X.; Guan, Y.; Liu, G.; Chen, L.; Xiong, Y.; Zhang, X.; Tian, Y.

    2013-10-01

    The electrochemical performance is closely related to three-dimensional microstructure of the Ni-YSZ anode. X-ray nano-tomography combined with quantitative analysis methods has been applied to non-destructively study the internal microstructure of the porous Ni-YSZ anode. In this paper, the methods for calculating some critical structural parameters, such as phase volume fraction, connectivity and active triple phase boundary (TPB) density were demonstrated. These structural parameters help us to optimize electrodes and improve the performance.
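    Of the parameters listed, phase volume fraction is the simplest to compute from a segmented tomogram: count labelled voxels per phase. The 0/1/2 labelling scheme and the toy volume below are assumptions for illustration; connectivity and active TPB density require neighborhood analysis beyond this sketch.

```python
import numpy as np

def phase_volume_fractions(labels, n_phases=3):
    """Phase volume fractions from a segmented tomogram; assumed
    labelling: 0 = pore, 1 = Ni, 2 = YSZ."""
    counts = np.bincount(labels.ravel(), minlength=n_phases)
    return counts / labels.size

# toy 2x2x2 segmented volume
vol = np.array([[[0, 1], [1, 2]],
                [[2, 2], [1, 0]]])
fracs = phase_volume_fractions(vol)   # [pore, Ni, YSZ] fractions
```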

  15. The ACCE method: an approach for obtaining quantitative or qualitative estimates of residual confounding that includes unmeasured confounding

    PubMed Central

    Smith, Eric G.

    2015-01-01

    Background:  Nonrandomized studies typically cannot account for confounding from unmeasured factors.  Method:  A method is presented that exploits the recently-identified phenomenon of  “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors.  Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure.  Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results:  Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met.  Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations:  Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions:  To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is
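    One possible arithmetic reading of the division step described above, treating confounding amplification as a multiplicative factor on an additive effect scale with no outcome association of the introduced variable, is the following toy calculation; the scaling and all numbers are assumptions, not the author's formal estimator.

```python
def residual_confounding_estimate(effect_base, effect_amplified, amplification):
    """Toy version of the division step: the shift in the treatment-effect
    estimate between the two nested propensity models, divided by the
    estimated excess amplification of confounding (amplification > 1)."""
    delta = effect_amplified - effect_base   # change caused by amplification
    return delta / (amplification - 1.0)

# a hypothetical amplification factor of 1.5 moves the estimate 0.30 -> 0.40,
# implying residual confounding of about 0.20 on this (additive) scale
bias = residual_confounding_estimate(0.30, 0.40, 1.5)
```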

  16. A rapid quantitative method of carisoprodol and meprobamate by liquid chromatography-tandem mass spectrometry.

    PubMed

    Essler, Shannon; Bruns, Kerry; Frontz, Michael; McCutcheon, J Rod

    2012-11-01

    The identification and quantitation of carisoprodol (Soma) and its chief metabolite meprobamate, which is also a clinically prescribed drug, remains a challenge for forensic toxicology laboratories. Carisoprodol and meprobamate are notable for their widespread use as muscle relaxants and their frequent identification in the blood of impaired drivers. Routine screening is possible in both an acidic/neutral pH screen and a traditional basic screen. An improvement in directed testing quantitations was desirable over the current options of an underivatized acidic/neutral extraction or a basic screen, neither of which used ideal internal standards. A new method was developed that utilized a simple protein precipitation, deuterated internal standards and a short 2-min isocratic liquid chromatography separation, followed by multiple reaction monitoring with tandem mass spectrometry. The linear quantitative range for carisoprodol was determined to be 1-35 mg/L and for meprobamate was 0.5-50 mg/L. The method was validated for specificity and selectivity, matrix effects, and accuracy and precision. PMID:23040985

  17. Investigation of a dual modal method for bone pathologies using quantitative ultrasound and photoacoustics

    NASA Astrophysics Data System (ADS)

    Steinberg, Idan; Gannot, Israel; Eyal, Avishay

    2015-03-01

    Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and overwhelming related healthcare costs. In recent works, we have developed a multi-spectral, frequency domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over pure ultrasonic or optical methods as it provides both molecular information from the bone absorption spectrum and bone mechanical status from the characteristics of the ultrasound propagation. These characteristics include both the Speed of Sound (SOS) and Broadband Ultrasonic Attenuation (BUA). To test the method's quantitative predictions, we have constructed a combined ultrasound and photoacoustic setup. Here, we present a dual-modality experimental system and compare the two methods on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into the bone structure and functional status.
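    The two ultrasonic characteristics named above have standard textbook definitions that can be sketched directly: substitution-method speed of sound from arrival times, and BUA as the slope of attenuation versus frequency. The reference speed, sample values, and function names are illustrative, not the authors' setup.

```python
import numpy as np

def speed_of_sound(thickness_m, t_ref_s, t_sample_s, c_ref=1500.0):
    """Substitution-method SOS: the sample replaces an equal path of the
    reference medium (speed c_ref), advancing the arrival time."""
    return thickness_m / (thickness_m / c_ref - (t_ref_s - t_sample_s))

def broadband_ultrasonic_attenuation(freqs_mhz, atten_db):
    """BUA: slope of a linear fit of attenuation (dB) vs frequency (MHz)."""
    slope, _intercept = np.polyfit(freqs_mhz, atten_db, 1)
    return slope

d = 0.02                                       # 2 cm sample, invented numbers
c = speed_of_sound(d, d / 1500.0, d / 3000.0)  # transit times through water/sample
bua = broadband_ultrasonic_attenuation(np.array([0.2, 0.4, 0.6]),
                                       np.array([4.0, 14.0, 24.0]))
```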

  18. Evaluation of a High Intensity Focused Ultrasound-Immobilized Trypsin Digestion and 18O-Labeling Method for Quantitative Proteomics

    SciTech Connect

    Lopez-Ferrer, Daniel; Hixson, Kim K.; Smallwood, Heather S.; Squier, Thomas C.; Petritis, Konstantinos; Smith, Richard D.

    2009-08-01

    A new method that uses immobilized trypsin concomitant with ultrasonic irradiation results in ultra-rapid digestion and thorough 18O labeling for quantitative protein comparisons. The reproducible and highly efficient method provided effective digestions in <1 min and minimized the amount of enzyme required compared to traditional methods. This method was demonstrated for digestion of both simple and complex protein mixtures, including bovine serum albumin, a global proteome extract from bacteria Shewanella oneidensis, and mouse plasma, as well as for the labeling of complex protein mixtures, which validated the application of this method for differential proteomic measurements. This approach is simple, reproducible, cost effective, and rapid, and thus well-suited for automation.

  19. Quantitative measurement of analyte gases in a microwave spectrometer using a dynamic sampling method

    NASA Astrophysics Data System (ADS)

    Zhu, Z.; Matthews, I. P.; Samuel, A. H.

    1996-07-01

    This article reports quantitative measurement of concentrations of water vapor (absorption line at 22.235 GHz) and ethylene oxide (absorption line at 23.123 GHz) in different gas mixtures by means of a microwave spectrometer. The problem of absorption line broadening and the gas memory problem inherent in the quantitative analysis of gases using microwave molecular rotational spectroscopy have been solved. The line broadening problem was minimized by gas dilution with nitrogen and the gas memory problem was effectively reduced by means of a dynamic sampling method. Calibration of ethylene oxide with a dilution factor of 5 has demonstrated that the standard deviations of the calibration data were less than 4.2%. A typical ethylene oxide sterilization production cycle was chosen to monitor chamber ethylene oxide concentrations in the gas dwell phase and the repeatability of these real time measurements was 2.7%.

  20. Quantitative analysis of gene expression in fixed colorectal carcinoma samples as a method for biomarker validation

    PubMed Central

    OSTASIEWICZ, BEATA; OSTASIEWICZ, PAWEŁ; DUŚ-SZACHNIEWICZ, KAMILA; OSTASIEWICZ, KATARZYNA; ZIÓŁKOWSKI, PIOTR

    2016-01-01

    Biomarkers have been described as the future of oncology. Modern proteomics provide an invaluable tool for the near-whole proteome screening for proteins expressed differently in neoplastic vs. healthy tissues. However, in order to select the most promising biomarkers, an independent method of validation is required. The aim of the current study was to propose a methodology for the validation of biomarkers. Due to material availability, the majority of large scale biomarker studies are performed using formalin-fixed paraffin-embedded (FFPE) tissues, therefore these were selected for use in the current study. A total of 10 genes were selected from what have been previously described as the most promising candidate biomarkers, and the expression levels were analyzed with reverse transcription-quantitative polymerase chain reaction (RT-qPCR) using calibrator normalized relative quantification with the efficiency correction. For 6/10 analyzed genes, the results were consistent with the proteomic data; for the remaining four genes, the results were inconclusive. The upregulation of karyopherin α 2 (KPNA2) and chromosome segregation 1-like (CSE1L) in colorectal carcinoma, in addition to downregulation of chloride channel accessory 1 (CLCA1), fatty acid binding protein 1 (FABP1), sodium channel, voltage gated, type VII α subunit (SCN7A) and solute carrier family 26 (anion exchanger), member 3 (SLC26A3) was confirmed. With the combined use of proteomic and genetic tools, it was reported, for the first time to the best of our knowledge, that SCN7A was downregulated in colorectal carcinoma at mRNA and protein levels. It had been previously suggested that the remaining five genes served an important role in colorectal carcinogenesis; however, the current study provided strong evidence to support their use as biomarkers. Thus, it was concluded that combination of RT-qPCR with proteomics offers a powerful methodology for biomarker identification, which can be used to analyze
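    Calibrator-normalized relative quantification with efficiency correction follows a Pfaffl-style ratio of efficiency-corrected fold changes for target and reference genes; a minimal sketch, with invented Cq values, is:

```python
def relative_expression(e_target, e_ref,
                        cq_target_calib, cq_target_sample,
                        cq_ref_calib, cq_ref_sample):
    """Efficiency-corrected, calibrator-normalized ratio (Pfaffl-style):
    E**(delta Cq) for the target gene divided by the same quantity for
    the reference gene. E is the amplification efficiency per cycle
    (2.0 = perfect doubling)."""
    ratio_target = e_target ** (cq_target_calib - cq_target_sample)
    ratio_ref = e_ref ** (cq_ref_calib - cq_ref_sample)
    return ratio_target / ratio_ref

# tumour cDNA crosses threshold 3 cycles earlier than the calibrator for the
# target gene, reference gene unchanged -> 8-fold upregulation at E = 2.0
fold = relative_expression(2.0, 2.0, 25.0, 22.0, 20.0, 20.0)
```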

  1. Using quantitative and qualitative data in health services research – what happens when mixed method findings conflict? [ISRCTN61522618

    PubMed Central

    Moffatt, Suzanne; White, Martin; Mackintosh, Joan; Howel, Denise

    2006-01-01

    Background In this methodological paper we document the interpretation of a mixed methods study and outline an approach to dealing with apparent discrepancies between qualitative and quantitative research data in a pilot study evaluating whether welfare rights advice has an impact on health and social outcomes among a population aged 60 and over. Methods Quantitative and qualitative data were collected contemporaneously. Quantitative data were collected from 126 men and women aged over 60 within a randomised controlled trial. Participants received a full welfare benefits assessment which successfully identified additional financial and non-financial resources for 60% of them. A range of demographic, health and social outcome measures were assessed at baseline, 6, 12 and 24 month follow up. Qualitative data were collected from a sub-sample of 25 participants purposively selected to take part in individual interviews to examine the perceived impact of welfare rights advice. Results Separate analysis of the quantitative and qualitative data revealed discrepant findings. The quantitative data showed little evidence of significant differences of a size that would be of practical or clinical interest, suggesting that the intervention had no impact on these outcome measures. The qualitative data suggested wide-ranging impacts, indicating that the intervention had a positive effect. Six ways of further exploring these data were considered: (i) treating the methods as fundamentally different; (ii) exploring the methodological rigour of each component; (iii) exploring dataset comparability; (iv) collecting further data and making further comparisons; (v) exploring the process of the intervention; and (vi) exploring whether the outcomes of the two components match. Conclusion The study demonstrates how using mixed methods can lead to different and sometimes conflicting accounts and, using this six step approach, how such discrepancies can be harnessed to interrogate each

  2. Combined megaplex TCR isolation and SMART-based real-time quantitation methods for quantitating antigen-specific T cell clones in mycobacterial infection

    PubMed Central

    Du, George; Qiu, Liyou; Shen, Ling; Sehgal, Probhat; Shen, Yun; Huang, Dan; Letvin, Norman L.; Chen, Zheng W.

    2010-01-01

    Despite recent advances in measuring cellular immune responses, the quantitation of antigen-specific T cell clones in infections or diseases remains challenging. Here, we employed combined megaplex TCR isolation and SMART-based real-time quantitation methods to quantitate numerous antigen-specific T cell clones using limited amounts of specimens. The megaplex TCR isolation covered the repertoire comprised of recombinants from 24 Vβ families and 13 Jβ segments, and allowed us to isolate TCR VDJ clonotypic sequences from one or many PPD-specific IFNγ-producing T cells that were purified by flow cytometry sorting. The SMART amplification technique was then validated for its capacity to proportionally enrich cellular TCR mRNA/cDNA for real-time quantitation of large numbers of T cell clones. SMART amplified cDNA was shown to maintain relative expression levels of TCR genes when compared to unamplified cDNA. While the SMART-based real-time quantitative PCR conferred a detection limit of 10^-5 to 10^-6 antigen-specific T cells, the clonotypic primers specifically amplified and quantitated the target clone TCR but discriminated other clones that differed by ≥2 bases in the DJ regions. Furthermore, the combined megaplex TCR isolation and SMART-based real-time quantitation methods allowed us to quantitate large numbers of PPD-specific IFNγ-producing T cell clones using as few as 2×10^6 PBMC collected weekly after mycobacterial infection. This assay system may be useful for studies of antigen-specific T cell clones in tumors, autoimmune and infectious diseases. PMID:16403511

  3. Improved Methods for Capture, Extraction, and Quantitative Assay of Environmental DNA from Asian Bigheaded Carp (Hypophthalmichthys spp.)

    PubMed Central

    Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207

  4. A method of quantitative characterization for the component of C/C composites based on the PLM video

    NASA Astrophysics Data System (ADS)

    Li, Y. X.; Qi, L. H.; Song, Y. S.; Li, H. J.

    2016-07-01

    PLM video is used to study the microstructure of C/C composites because it captures structural and motion information simultaneously. PLM video can therefore provide more comprehensive microstructural features of C/C composites, which can then be quantitatively characterized by image processing. However, unavoidable displacements introduced during image acquisition remain in the PLM video. An image registration method based on phase correlation was therefore proposed to correct these displacements, and quantitative characterization of the components was achieved by combining it with image fusion and threshold segmentation of the PLM video of C/C composites. Specifically, the PLM video was first decomposed into a frame sequence. A series of processing steps was then carried out, including selecting frames at equal intervals, segmenting the static and dynamic regions, and correcting the relative displacements between adjacent frames. The registration result was verified through image fusion, which indicates that the proposed method eliminates the displacements effectively. Finally, image processing operations were used to segment the components and calculate their fractions, achieving the quantitative characterization.
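
    The phase-correlation step described above has a compact FFT formulation. The sketch below recovers an integer translation between two synthetic frames; it is a minimal illustration under stated assumptions (whole-pixel shifts, periodic boundaries), not the authors' implementation:

```python
import numpy as np

def phase_correlation(ref, mov):
    """Return the integer (dy, dx) shift that realigns `mov` with `ref`,
    estimated from the normalized cross-power spectrum."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(mov))
    cross /= np.abs(cross) + 1e-12               # keep phase information only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peaks in the far half of the array to negative shifts.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

# Two synthetic "adjacent frames": the second is a circularly shifted copy.
rng = np.random.default_rng(0)
frame = rng.random((128, 128))
shifted = np.roll(frame, shift=(5, -3), axis=(0, 1))

dy, dx = phase_correlation(frame, shifted)
print(dy, dx)
```

Applying the recovered shift to each frame (here simply `np.roll` for the synthetic case) realigns the sequence before fusion and segmentation.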

  5. Studies of polyester fiber as carrier for microbes in a quantitative test method for disinfectants.

    PubMed

    Miner, Norman; Harris, Valerie; Stumph, Sara; Cobb, Amanda; Ortiz, Jennifer

    2004-01-01

    Tests were conducted by a Task Force on Disinfectant Test Methods that was appointed to investigate controversies regarding the accuracy of AOAC test methods for disinfectants as presented in AOAC's Official Methods of Analysis, Chapter 6. The general principles for new and improved AOAC tests are discussed, and a disinfectant test using microbes labeled onto a polyester fiber surface is described. The quantitative test measures the survival of test microbes as a function of exposure time as well as the exposure conditions required to kill 6 log10 of the test microbes. The time required was similar to that for the kinetics of the kill of Bacillus subtilis-labeled cylinders as tested by methods of the AOAC Sporicidal Test 966.04. PMID:15164838
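
    The 6 log10 kill criterion in such quantitative carrier tests reduces to a simple ratio of recovered counts; a stdlib-only sketch with invented CFU values (a real assay would average replicate carriers and account for neutralizer recovery):

```python
import math

def log10_reduction(initial_cfu, surviving_cfu, detection_limit=1):
    """Log10 kill achieved by one disinfectant exposure. If no survivors
    are detected, the result is censored at the detection limit and
    reported as a '>=' value."""
    censored = surviving_cfu < detection_limit
    survivors = max(surviving_cfu, detection_limit)
    reduction = math.log10(initial_cfu / survivors)
    return reduction, censored

# A carrier labeled with 1.0e6 CFU and no detectable survivors after
# exposure meets a 6-log10 kill criterion.
red, was_censored = log10_reduction(1_000_000, 0)
print((">=" if was_censored else "") + f"{red:.1f} log10 reduction")
```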

  6. A simple regression-based method to map quantitative trait loci underlying function-valued phenotypes.

    PubMed

    Kwak, Il-Youp; Moore, Candace R; Spalding, Edgar P; Broman, Karl W

    2014-08-01

    Most statistical methods for quantitative trait loci (QTL) mapping focus on a single phenotype. However, multiple phenotypes are commonly measured, and recent technological advances have greatly simplified the automated acquisition of numerous phenotypes, including function-valued phenotypes, such as growth measured over time. While methods exist for QTL mapping with function-valued phenotypes, they are generally computationally intensive and focus on single-QTL models. We propose two simple, fast methods that maintain high power and precision and are amenable to extensions with multiple-QTL models using a penalized likelihood approach. After identifying multiple QTL by these approaches, we can view the function-valued QTL effects to provide a deeper understanding of the underlying processes. Our methods have been implemented as a package for R, funqtl. PMID:24931408

  7. Development of a HPLC Method for the Quantitative Determination of Capsaicin in Collagen Sponge

    PubMed Central

    Guo, Chun-Lian; Chen, Hong-Ying; Cui, Bi-Ling; Chen, Yu-Huan; Zhou, Yan-Fang; Peng, Xin-Sheng; Wang, Qin

    2015-01-01

    Controlling the concentration of drugs in pharmaceutical products is essential to patient safety. In this study, a simple and sensitive HPLC method was developed to quantitatively analyze capsaicin in collagen sponge. Capsaicin was extracted from the sponge for 30 min by ultrasonic extraction with methanol as the solvent. Chromatography was performed with an isocratic mobile phase of acetonitrile-water (70:30) at a flow rate of 1 mL/min, with detection at 280 nm. Capsaicin was successfully separated with good linearity (regression equation A = 9.7182C + 0.8547; R² = 1.0) and excellent recovery (99.72%). The mean capsaicin concentration in collagen sponge was 49.32 mg/g (RSD = 1.30%; n = 3). In conclusion, the ultrasonic extraction method is simple and efficient, and the HPLC assay is sensitive, specific, and convenient for capsaicin detection in collagen sponge. This is the first report of the quantitative analysis of capsaicin in collagen sponge. PMID:26612986
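
    The reported calibration line can be inverted to convert a measured peak area into a concentration. Only the slope and intercept below come from the abstract; the triplicate peak areas are illustrative:

```python
# Inverting the reported calibration line A = 9.7182*C + 0.8547 to turn
# measured peak areas into concentrations.
SLOPE, INTERCEPT = 9.7182, 0.8547

def concentration_from_area(area):
    """Solve A = SLOPE*C + INTERCEPT for the concentration C."""
    return (area - INTERCEPT) / SLOPE

areas = [480.2, 478.9, 481.5]                 # hypothetical injections
concs = [concentration_from_area(a) for a in areas]
mean = sum(concs) / len(concs)
sd = (sum((c - mean) ** 2 for c in concs) / (len(concs) - 1)) ** 0.5
print(f"mean = {mean:.2f}, RSD = {sd / mean * 100:.2f}%")
```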

  8. Multiresidue method for the quantitation of 20 pesticides in aquatic products.

    PubMed

    Cho, Ha Ra; Park, Jun Seo; Kim, Junghyun; Han, Sang Beom; Choi, Yong Seok

    2015-12-01

    As the consumption of aquatic products has increased, the need to regulate pesticide residues in them has also emerged. Thus, in this study, a scheduled multiple reaction monitoring (sMRM) method employing a novel extraction and purification step based on QuEChERS with EDTA was developed for the simultaneous quantitation of 20 pesticides (alachlor, aldicarb, carbofuran, diazinon, dimethoate, dimethomorph, ethoprophos, ferimzone, fluridone, hexaconazole, iprobenfos, malathion, methidathion, methiocarb, phenthoate, phosalone, phosmet, phosphamidon, pirimicarb, and simazine) in aquatic products. The method was validated for specificity, linearity (r ≥ 0.980), sensitivity (limit of quantitation (LOQ) ≤ 5 ng/g), precision (1.0% ≤ RSD ≤ 19.4%), and recovery (60.1% ≤ recovery ≤ 117.9%). Finally, the validated method was applied to determine the 20 pesticide residues in eel and shrimp purchased from local food markets. In the present study, QuEChERS with EDTA was successfully extended to residual pesticide analysis for the first time. The present method could contribute to the rapid and successful establishment of the positive list system in South Korea. PMID:26466578

  9. A quantitative method for the evaluation of three-dimensional structure of temporal bone pneumatization.

    PubMed

    Hill, Cheryl A; Richtsmeier, Joan T

    2008-10-01

    Temporal bone pneumatization has been included in lists of characters used in phylogenetic analyses of human evolution. While studies suggest that the extent of pneumatization has decreased over the course of human evolution, little is known about the processes underlying these changes or their significance. In short, reasons for the observed reduction and the potential reorganization within pneumatized spaces are unknown. Technological constraints have limited previous analyses of pneumatization in extant and fossil species to qualitative observations of the extent of temporal bone pneumatization. In this paper, we introduce a novel application of quantitative methods developed for the study of trabecular bone to the analysis of pneumatized spaces of the temporal bone. This method utilizes high-resolution X-ray computed tomography (HRXCT) images and quantitative software to estimate three-dimensional parameters (bone volume fractions, anisotropy, and trabecular thickness) of bone structure within defined units of pneumatized spaces. We apply this approach in an analysis of temporal bones of diverse but related primate species, Gorilla gorilla, Pan troglodytes, Homo sapiens, and Papio hamadryas anubis, to illustrate the potential of these methods. In demonstrating the utility of these methods, we show that there are interspecific differences in the bone structure of pneumatized spaces, perhaps reflecting changes in the localized growth dynamics, location of muscle attachments, encephalization, or basicranial flexion. PMID:18715622
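
    Of the three parameters, the bone volume fraction (BV/TV) has the most direct definition: the proportion of voxels in a volume of interest that exceed the bone intensity threshold. A minimal sketch on a synthetic volume (geometry and threshold are invented):

```python
import numpy as np

def bone_volume_fraction(volume, threshold):
    """BV/TV: fraction of voxels in the volume of interest whose
    intensity exceeds the bone/air threshold."""
    bone = volume > threshold
    return bone.sum() / bone.size

# Synthetic HRXCT-like volume of interest: soft background noise with
# an embedded block of high-intensity "bone" voxels.
rng = np.random.default_rng(1)
voi = rng.normal(50, 10, size=(40, 40, 40))      # background, well below threshold
voi[10:20, 10:20, 10:20] = 200                   # 1000 "bone" voxels
bvtv = bone_volume_fraction(voi, threshold=120)
print(f"BV/TV = {bvtv:.4f}")
```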

  10. The quantitative and qualitative recovery of Campylobacter from raw poultry using USDA and Health Canada methods.

    PubMed

    Sproston, E L; Carrillo, C D; Boulter-Bitzer, J

    2014-12-01

    Harmonisation of methods between Canadian government agencies is essential to accurately assess and compare the prevalence and concentrations of Campylobacter present on retail poultry intended for human consumption. The standard qualitative procedure used by Health Canada differs from that used by the USDA for both quantitative and qualitative methods. A comparison of three methods was performed on raw poultry samples obtained from an abattoir to determine whether one method is superior to the others in isolating Campylobacter from chicken carcass rinses. The average percentages of positive samples were 34.72% (95% CI, 29.2-40.2), 39.24% (95% CI, 33.6-44.9), and 39.93% (95% CI, 34.3-45.6) for the US direct plating method, the US enrichment method, and the Health Canada enrichment method, respectively. Overall, there were significant differences when comparing either of the enrichment methods to the direct plating method using McNemar's chi-squared test. On comparison of weekly data (Fisher's exact test), direct plating was inferior to the enrichment methods on only a single occasion. Direct plating is important for enumeration and for establishing the concentration of Campylobacter present on raw poultry. However, enrichment methods are also vital for identifying positive samples where concentrations are below the detection limit of direct plating. PMID:25084671
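
    McNemar's chi-squared test used above depends only on the two discordant counts of the paired comparison. A stdlib-only, continuity-corrected sketch with hypothetical counts:

```python
import math

def mcnemar(b, c):
    """Continuity-corrected McNemar's chi-squared test for two paired
    detection methods; b and c are the discordant counts (method A
    positive / method B negative, and the reverse)."""
    stat = (abs(b - c) - 1) ** 2 / (b + c)
    # Survival function of chi-squared with 1 df: P(X >= stat).
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Hypothetical paired results: enrichment-positive/direct-negative on
# 21 carcass rinses, direct-positive/enrichment-negative on 6.
stat, p = mcnemar(21, 6)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
```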

  11. Comparative Evaluation of Four Real-Time PCR Methods for the Quantitative Detection of Epstein-Barr Virus from Whole Blood Specimens.

    PubMed

    Buelow, Daelynn; Sun, Yilun; Tang, Li; Gu, Zhengming; Pounds, Stanley; Hayden, Randall

    2016-07-01

    Monitoring of Epstein-Barr virus (EBV) load in immunocompromised patients has become integral to their care. An increasing number of reagents are available for quantitative detection of EBV; however, little comparative data have been published. Four real-time PCR systems (one using laboratory-developed reagents and three using analyte-specific reagents) were compared with one another for detection of EBV from whole blood. Whole blood specimens seeded with EBV were used to determine quantitative linearity, analytical measurement range, lower limit of detection, and CV for each assay. Retrospective testing of 198 clinical samples was performed in parallel with all methods; results were compared to determine relative quantitative and qualitative performance. All assays showed similar performance. No significant difference was found in limit of detection (3.12-3.49 log10 copies/mL; P = 0.37). A strong qualitative correlation was seen across all assays on clinical samples (positive detection rates of 89.5%-95.8%). Quantitative correlation of clinical samples across assays was also seen in pairwise regression analysis, with R(2) ranging from 0.83 to 0.95. Normalizing clinical sample results to IU/mL did not alter the quantitative correlation between assays. Quantitative EBV detection by real-time PCR can be performed over a wide linear dynamic range, using three different commercially available reagents and laboratory-developed methods. EBV was detected with comparable sensitivity and quantitative correlation for all assays. PMID:27157323

  12. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Atencio, James D.

    1984-01-01

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.

  13. Quantitative validation of the 3D SAR profile of hyperthermia applicators using the gamma method.

    PubMed

    de Bruijne, Maarten; Samaras, Theodoros; Chavannes, Nicolas; van Rhoon, Gerard C

    2007-06-01

    For quality assurance of hyperthermia treatment planning systems, quantitative validation of the electromagnetic model of an applicator is essential. The objective of this study was to validate a finite-difference time-domain (FDTD) model implementation of the Lucite cone applicator (LCA) for superficial hyperthermia. The validation involved (i) the assessment of the match between the predicted and measured 3D specific absorption rate (SAR) distribution, and (ii) the assessment of the ratio between model power and real-world power. The 3D SAR distribution of seven LCAs was scanned in a phantom bath using the DASY4 dosimetric measurement system. The same set-up was modelled in SEMCAD X. The match between the predicted and the measured SAR distribution was quantified with the gamma method, which combines distance-to-agreement and dose difference criteria. Good quantitative agreement was observed: more than 95% of the measurement points met the acceptance criteria 2 mm/2% for all applicators. The ratio between measured and predicted power absorption ranged from 0.75 to 0.92 (mean 0.85). This study shows that quantitative validation of hyperthermia applicator models is feasible and is worth considering as a part of hyperthermia quality assurance procedures. PMID:17505090
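
    The gamma method scores each measurement point by the minimum combined distance-to-agreement and dose-difference metric; a point passes when gamma ≤ 1. A minimal 1D sketch using the same 2 mm/2% criteria on synthetic profiles (the real comparison is 3D):

```python
import numpy as np

def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose,
                dta_mm=2.0, dose_frac=0.02):
    """1D gamma analysis: for each reference point, the minimum over all
    evaluated points of sqrt((dr/DTA)^2 + (dD/deltaD)^2); gamma <= 1
    passes. The dose criterion is global (fraction of reference maximum)."""
    delta_d = dose_frac * ref_dose.max()
    gammas = np.empty(len(ref_pos))
    for i, (r, d) in enumerate(zip(ref_pos, ref_dose)):
        dist_term = ((eval_pos - r) / dta_mm) ** 2
        dose_term = ((eval_dose - d) / delta_d) ** 2
        gammas[i] = np.sqrt((dist_term + dose_term).min())
    return gammas

# Synthetic SAR-like profiles: a Gaussian "model" and a slightly
# shifted, rescaled "measurement" standing in for FDTD vs. scan data.
x = np.linspace(-40, 40, 161)                    # mm, 0.5 mm spacing
model = np.exp(-x ** 2 / (2 * 15 ** 2))
measured = 1.01 * np.exp(-(x - 0.5) ** 2 / (2 * 15 ** 2))

g = gamma_index(x, model, x, measured)
print(f"pass rate (gamma <= 1): {np.mean(g <= 1):.1%}")
```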

  14. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, J.T.; Kunz, W.E.; Atencio, J.D.

    1982-03-31

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether permanent low-level burial is appropriate for the waste sample.

  15. Comparison of two quantitative fit-test methods using N95 filtering facepiece respirators.

    PubMed

    Sietsema, Margaret; Brosseau, Lisa M

    2016-08-01

    Current regulations require annual fit testing before an employee can wear a respirator during work activities. The goal of this research is to determine whether respirator fit measured with two TSI Portacount instruments simultaneously sampling ambient particle concentrations inside and outside of the respirator facepiece is similar to fit measured during an ambient aerosol condensation nuclei counter quantitative fit test. Sixteen subjects (ten female; six male) were recruited for a range of facial sizes. Each subject donned an N95 filtering facepiece respirator, completed two fit tests in random order (ambient aerosol condensation nuclei counter quantitative fit test and two-instrument real-time fit test) without removing or adjusting the respirator between tests. Fit tests were compared using Spearman's rank correlation coefficients. The real-time two-instrument method fit factors were similar to those measured with the single-instrument quantitative fit test. The first four exercises were highly correlated (r > 0.7) between the two protocols. Respirator fit was altered during the talking or grimace exercise, both of which involve facial movements that could dislodge the facepiece. Our analyses suggest that the new real-time two-instrument methodology can be used in future studies to evaluate fit before and during work activities. PMID:26963561

  16. Modeling Bone Surface Morphology: A Fully Quantitative Method for Age-at-Death Estimation Using the Pubic Symphysis.

    PubMed

    Slice, Dennis E; Algee-Hewitt, Bridget F B

    2015-07-01

    The pubic symphysis is widely used in age estimation for the adult skeleton. Standard practice requires the visual comparison of surface morphology against criteria representing predefined phases and the estimation of case-specific age from an age range associated with the chosen phase. Known problems of method and observer error necessitate alternative tools to quantify age-related change in pubic morphology. This paper presents an objective, fully quantitative method for estimating age-at-death from the skeleton, which exploits a variance-based score of surface complexity computed from vertices obtained from a scanner sampling the pubic symphysis. For laser scans from 41 modern American male skeletons, this method produces results that are significantly associated with known age-at-death (RMSE = 17.15 years). This robust, objective, and fully quantitative method therefore predicts chronological age as well as, if not better than, prevailing phase-aging systems. This method contributes to forensic casework by responding to medico-legal expectations for evidence standards. PMID:25929827
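
    A variance-based surface-complexity score can be illustrated as the variance of vertex residuals about a best-fit plane; this is a hypothetical stand-in for the authors' score, intended only to show the idea of reducing scanned vertices to a single number:

```python
import numpy as np

def surface_complexity(vertices):
    """A variance-based roughness score (a hypothetical stand-in for the
    authors' score): variance of vertex distances from the best-fit
    plane through the point cloud."""
    centered = vertices - vertices.mean(axis=0)
    # The plane normal is the direction of least variance (last right
    # singular vector of the centered cloud).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    residuals = centered @ vt[-1]
    return float(residuals.var())

# Flat vs. rough synthetic "symphyseal faces" sampled as point clouds.
rng = np.random.default_rng(2)
xy = rng.uniform(-1, 1, size=(500, 2))
flat = np.column_stack([xy, 0.01 * rng.normal(size=500)])
rough = np.column_stack([xy, 0.30 * rng.normal(size=500)])
print(surface_complexity(flat), surface_complexity(rough))
```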

  17. Depth determination for shallow teleseismic earthquakes: Methods and results

    SciTech Connect

    Stein, S.; Wiens, D.A.

    1986-11-01

    Contemporary methods used to determine depths of moderate-sized shallow teleseismic earthquakes are described. These include techniques based on surface wave spectra, and methods which estimate focal depth from the waveforms of body waves. The advantages of different methods and their limitations are discussed, and significant results for plate tectonics, obtained in the last five years by the application of these methods, are presented. 119 references.

  18. Development and application of quantitative methods for monitoring dermal and inhalation exposure to propiconazole.

    PubMed

    Flack, Sheila; Goktepe, Ipek; Ball, Louise M; Nylander-French, Leena A

    2008-03-01

    Quantitative methods to measure dermal and inhalation exposure to the fungicide propiconazole were developed in the laboratory and applied in the occupational exposure setting for monitoring five farm workers' exposure during pesticide preparation and application to peach crops. Dermal exposure was measured with tape-strips applied to the skin, and the amount of propiconazole was normalized to keratin content in the tape-strip. Inhalation exposure was measured with an OVS tube placed in the worker's breathing-zone during pesticide handling. Samples were analyzed by GC-MS in EI+ mode (limit of detection 6 pg μL⁻¹). Dermal exposure ranged from non-detectable to 32.1 ± 22.6 ng per μg keratin while breathing-zone concentrations varied from 0.2 to 2.2 μg m⁻³. A positive correlation was observed between breathing-zone concentrations and ambient air temperature (r² = 0.87, p < 0.01). Breathing-zone concentrations did not correlate with dermal exposure levels (r² = 0.11, p = 0.52). Propiconazole levels were below the limit of detection when rubber gloves, coveralls, and full-face mask were used. The total-body propiconazole dose, determined for each worker by summing the estimated dermal dose and inhalation dose, ranged from 0.01 to 12 μg per kg body weight per day. Our results show that tape-stripping of the skin and the OVS can be effectively utilized to measure dermal and inhalation exposure to propiconazole, respectively, and that the dermal route of exposure contributed substantially more to the total dose than the inhalation route. PMID:18392276

  19. Investigation of a diffuse optical measurements-assisted quantitative photoacoustic tomographic method in reflection geometry.

    PubMed

    Xu, Chen; Kumavor, Patrick D; Aguirre, Andres; Zhu, Quing

    2012-06-01

    Photoacoustic tomography provides the distribution of absorbed optical energy density, which is the product of optical absorption coefficient and optical fluence distribution. We report the experimental investigation of a novel fitting procedure that quantitatively determines the optical absorption coefficient of chromophores. The experimental setup consisted of a hybrid system of a 64-channel photoacoustic imaging system with a frequency-domain diffused optical measurement system. The fitting procedure included a complete photoacoustic forward model and an analytical solution of a target chromophore using the diffusion approximation. The fitting procedure combines the information from the photoacoustic image and the background information from the diffuse optical measurements to minimize the difference between the photoacoustic measurements and the forward-model data and recover the target absorption coefficient quantitatively. 1-cm-cube phantom absorbers of high and low contrasts were imaged at depths of up to 3.0 cm. The fitted absorption coefficient results were at least 80% of their true values. The sensitivities of this fitting procedure to target location, target radius, and background optical properties were also investigated. We found that this fitting procedure was most sensitive to the accurate determination of the target radius and depth. A blood sample in a thin tube of radius 0.58 mm, simulating a blood vessel, was also studied. The photoacoustic images and fitted absorption coefficients are presented. These results demonstrate the clinical potential of this fitting procedure to quantitatively characterize small lesions in breast imaging. PMID:22734743

  20. Investigation of a diffuse optical measurements-assisted quantitative photoacoustic tomographic method in reflection geometry

    NASA Astrophysics Data System (ADS)

    Xu, Chen; Kumavor, Patrick D.; Aguirre, Andres; Zhu, Quing

    2012-06-01

    Photoacoustic tomography provides the distribution of absorbed optical energy density, which is the product of optical absorption coefficient and optical fluence distribution. We report the experimental investigation of a novel fitting procedure that quantitatively determines the optical absorption coefficient of chromophores. The experimental setup consisted of a hybrid system of a 64-channel photoacoustic imaging system with a frequency-domain diffused optical measurement system. The fitting procedure included a complete photoacoustic forward model and an analytical solution of a target chromophore using the diffusion approximation. The fitting procedure combines the information from the photoacoustic image and the background information from the diffuse optical measurements to minimize the difference between the photoacoustic measurements and the forward-model data and recover the target absorption coefficient quantitatively. 1-cm-cube phantom absorbers of high and low contrasts were imaged at depths of up to 3.0 cm. The fitted absorption coefficient results were at least 80% of their true values. The sensitivities of this fitting procedure to target location, target radius, and background optical properties were also investigated. We found that this fitting procedure was most sensitive to the accurate determination of the target radius and depth. A blood sample in a thin tube of radius 0.58 mm, simulating a blood vessel, was also studied. The photoacoustic images and fitted absorption coefficients are presented. These results demonstrate the clinical potential of this fitting procedure to quantitatively characterize small lesions in breast imaging.

  1. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR are its accuracy with external calibration, the lack of any requirement for identical reference materials, high precision and accuracy when properly validated, and the ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool in natural product research and other areas. PMID:22482996

  2. A comparison of quantitative methods for clinical imaging with hyperpolarized (13)C-pyruvate.

    PubMed

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized (13)C-labelled molecules, such as the conversion of [1-(13)C]pyruvate to [1-(13)C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized (13)C data. PMID:27414749
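
    The two model-free metrics found to perform well (lactate time-to-peak, and the lactate-to-pyruvate AUC ratio) are straightforward to compute from dynamic data; a sketch on synthetic time courses (curve shapes and constants are illustrative):

```python
import numpy as np

def auc(y, dt):
    """Trapezoidal area under a uniformly sampled curve."""
    return float(np.sum((y[:-1] + y[1:]) / 2) * dt)

# Synthetic pyruvate and lactate time courses (arbitrary units).
t = np.linspace(0, 60, 121)                       # s, 0.5 s spacing
dt = t[1] - t[0]
pyruvate = t * np.exp(-t / 8.0)
lactate = t ** 2 * np.exp(-t / 10.0) / 50.0

ttp = t[np.argmax(lactate)]                       # lactate time-to-peak
auc_ratio = auc(lactate, dt) / auc(pyruvate, dt)  # lactate/pyruvate AUC

print(f"time-to-peak = {ttp:.1f} s, AUC ratio = {auc_ratio:.2f}")
```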

  3. A comparison of quantitative methods for clinical imaging with hyperpolarized 13C‐pyruvate

    PubMed Central

    Daniels, Charlie J.; McLean, Mary A.; Schulte, Rolf F.; Robb, Fraser J.; Gill, Andrew B.; McGlashan, Nicholas; Graves, Martin J.; Schwaiger, Markus; Lomas, David J.; Brindle, Kevin M.

    2016-01-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized 13C‐labelled molecules, such as the conversion of [1‐13C]pyruvate to [1‐13C]lactate, to be dynamically and non‐invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model‐free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two‐way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time‐to‐peak and the lactate‐to‐pyruvate area under the curve ratio were simple model‐free approaches that accurately represented the full reaction, with the time‐to‐peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized 13C data. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd. PMID:27414749

  4. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Pavlov, Konstantin A.

    1997-01-10

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing.

  5. Tentative method for the qualitative detection and quantitative assessment of air contamination by drugs.

    PubMed

    Buogo, A; Eboli, V

    1972-06-01

    A method for detecting and measuring air contamination by drugs is described which uses an electrostatic bacterial air sampler, sprayers for micronizing drugs, and Mueller-Hinton medium seeded with a highly susceptible strain of Sarcina lutea. Three antibiotics (penicillin, tetracycline, aminosidine) and a sulfonamide (sulfapyrazine) were identified by pretreating portions of medium, showing no bacterial growth, with penicillinase or p-aminobenzoic acid solution and subsequently determining how both drug-susceptible and drug-resistant strains of Staphylococcus aureus were affected by this pretreatment. Quantitative determinations were also attempted by measuring the size of the inhibition zones. PMID:4483536

  6. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, L.J.; Cremers, D.A.

    1982-09-07

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids are described. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is shown. Analysis time is significantly shortened compared with the usual chemical techniques of analysis.

  7. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, Leon J.; Cremers, David A.

    1985-01-01

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is demonstrated. Analysis times are significantly shorter than those of the usual chemical techniques of analysis.

  8. Quantitative methods for the analysis of protein phosphorylation in drug development.

    PubMed

    Olive, D Michael

    2004-10-01

    Most signal transduction and cell signaling pathways are mediated by protein kinases. Protein kinases have emerged as important cellular regulatory proteins in many aspects of neoplasia. Protein kinase inhibitors offer the opportunity to target diseases such as cancer with chemotherapeutic agents specific for the causative molecular defect. In order to identify possible targets and assess kinase inhibitors, quantitative methods for analyzing protein phosphorylation have been developed. This review examines some of the current formats used for quantifying kinase function for drug development. PMID:15966829

  9. Tentative Method for the Qualitative Detection and Quantitative Assessment of Air Contamination by Drugs

    PubMed Central

    Buogo, A.; Eboli, V.

    1972-01-01

    A method for detecting and measuring air contamination by drugs is described which uses an electrostatic bacterial air sampler, sprayers for micronizing drugs, and Mueller-Hinton medium seeded with a highly susceptible strain of Sarcina lutea. Three antibiotics (penicillin, tetracycline, aminosidine) and a sulfonamide (sulfapyrazine) were identified by pretreating portions of medium showing no bacterial growth with penicillinase or p-aminobenzoic acid solution and subsequently determining how both drug-susceptible and drug-resistant strains of Staphylococcus aureus were affected by this pretreatment. Quantitative determinations were also attempted by measuring the size of the inhibition zones. PMID:4483536

  10. A Method to Prioritize Quantitative Traits and Individuals for Sequencing in Family-Based Studies

    PubMed Central

    Shah, Kaanan P.; Douglas, Julie A.

    2013-01-01

    Owing to recent advances in DNA sequencing, it is now technically feasible to evaluate the contribution of rare variation to complex traits and diseases. However, it is still cost prohibitive to sequence the whole genome (or exome) of all individuals in each study. For quantitative traits, one strategy to reduce cost is to sequence individuals in the tails of the trait distribution. However, the next challenge becomes how to prioritize traits and individuals for sequencing since individuals are often characterized for dozens of medically relevant traits. In this article, we describe a new method, the Rare Variant Kinship Test (RVKT), which leverages relationship information in family-based studies to identify quantitative traits that are likely influenced by rare variants. Conditional on nuclear families and extended pedigrees, we evaluate the power of the RVKT via simulation. Not unexpectedly, the power of our method depends strongly on effect size, and to a lesser extent, on the frequency of the rare variant and the number and type of relationships in the sample. As an illustration, we also apply our method to data from two genetic studies in the Old Order Amish, a founder population with extensive genealogical records. Remarkably, we implicate the presence of a rare variant that lowers fasting triglyceride levels in the Heredity and Phenotype Intervention (HAPI) Heart study (p = 0.044), consistent with the presence of a previously identified null mutation in the APOC3 gene that lowers fasting triglyceride levels in HAPI Heart study participants. PMID:23626830

  11. A method for the quantitative evaluation of SAR distribution in deep regional hyperthermia.

    PubMed

    Baroni, C; Giri, M G; Meliadó, G; Maluta, S; Chierego, G

    2001-01-01

    Visualization of the Specific Absorption Rate (SAR) distribution pattern by a matrix of E-field light-emitting sensors has been demonstrated to be a useful tool for evaluating the characteristics of the applicators used in deep regional hyperthermia and for performing a quality assurance programme. A method to quantify the SAR from photographs of the sensor array--the so-called 'Power Stepping Technique'--has already been proposed. This paper presents a new approach to the quantitative determination of SAR profiles in a liquid phantom exposed to electromagnetic fields from the Sigma-60 applicator (BSD-2000 system for deep regional hyperthermia). The method is based on the construction of a 'calibration curve' modelling the light output of an E-field sensor as a function of the supplied voltage, and on the use of a reference light source to 'normalize' the light-output readings from the photos of the sensor array, in order to minimize the errors introduced by the non-uniformity of the photographic process. Once the calibration curve is obtained, a single photo suffices to obtain the quantitative SAR distribution under the operating conditions. For this reason, the method is suitable for equipment characterization and also for checking the repeatability of power deposition over time. PMID:11587076

  12. A quantitative solid-state Raman spectroscopic method for control of fungicides.

    PubMed

    Ivanova, Bojidarka; Spiteller, Michael

    2012-07-21

    A new analytical procedure using solid-state Raman spectroscopy within the THz-region for the quantitative determination of mixtures of different conformations of trifloxystrobin (EE, EZ, ZE and ZZ), tebuconazole (1), and propiconazole (2) as an effective method for the fungicide product quality monitoring programmes and control has been developed and validated. The obtained quantities were controlled independently by the validated hybrid HPLC electrospray ionization (ESI) tandem mass spectrometric (MS) and matrix-assisted laser desorption/ionization (MALDI) MS methods in the condensed phase. The quantitative dependences were obtained on the twenty binary mixtures of the analytes and were further tested on the three trade fungicide products, containing mixtures of trifloxystrobin-tebuconazole and trifloxystrobin-propiconazole, as an emissive concentrate or water soluble granules of the active ingredients. The present methods provided sufficient sensitivity as reflected by the metrologic quantities, evaluating the concentration limit of detection (LOD) and quantification (LOQ), linear limit (LL), measurement accuracy and precision, true quantity value, trueness of measurement and more. PMID:22679621

  13. A quantitative autoradiographic method for the measurement of local rates of brain protein synthesis

    SciTech Connect

    Dwyer, B.E.; Donatoni, P.; Wasterlain, C.G.

    1982-05-01

    We have developed a new method for measuring local rates of brain protein synthesis in vivo. It combines the intraperitoneal injection of a large dose of low specific activity amino acid with quantitative autoradiography. This method has several advantages: 1) it is ideally suited for young or small animals, or where immobilizing an animal is undesirable; 2) the amino acid injection 'floods' amino acid pools, so that errors in estimating precursor specific activity, which is especially important in pathological conditions, are minimized; 3) the method provides for the use of a radioautographic internal standard in which valine incorporation is measured directly. Internal standards from experimental animals correct for tissue protein content and for self-absorption of radiation in tissue sections, which could vary under experimental conditions.

  14. Sample preparation methods for quantitative detection of DNA by molecular assays and marine biosensors.

    PubMed

    Cox, Annie M; Goodwin, Kelly D

    2013-08-15

    The need for quantitative molecular methods is growing in environmental, food, and medical fields but is hindered by low and variable DNA extraction and by co-extraction of PCR inhibitors. DNA extracts from Enterococcus faecium, seawater, and seawater spiked with E. faecium and Vibrio parahaemolyticus were tested by qPCR for target recovery and inhibition. Conventional and novel methods were tested, including Synchronous Coefficient of Drag Alteration (SCODA) and lysis and purification systems used on an automated genetic sensor (the Environmental Sample Processor, ESP). Variable qPCR target recovery and inhibition were measured, significantly affecting target quantification. An aggressive lysis method that utilized chemical, enzymatic, and mechanical disruption enhanced target recovery compared to commercial kit protocols. SCODA purification did not show marked improvement over commercial spin columns. Overall, data suggested a general need to improve sample preparation and to accurately assess and account for DNA recovery and inhibition in qPCR applications. PMID:23790450
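
    Target recovery in spiked qPCR experiments of this kind is typically computed by converting the observed Ct to copy number via a standard curve and comparing against the amount spiked in. A minimal sketch (the standard-curve parameters and Ct values are invented for illustration, not data from the study):

```python
def copies_from_ct(ct, slope, intercept):
    """Copy number from a qPCR standard curve Ct = slope*log10(copies) + intercept.
    An ideal assay has slope near -3.32 (100 % amplification efficiency)."""
    return 10.0 ** ((ct - intercept) / slope)

def percent_recovery(ct_observed, copies_spiked, slope, intercept):
    """DNA extraction recovery: copies measured by qPCR vs. copies spiked in."""
    return 100.0 * copies_from_ct(ct_observed, slope, intercept) / copies_spiked

# Hypothetical assay: standard curve fitted on dilution series, one spiked sample
slope, intercept = -3.32, 38.0
rec = percent_recovery(ct_observed=22.0, copies_spiked=1e5,
                       slope=slope, intercept=intercept)
```

    Inhibition is assessed the same way in reverse: a delayed Ct relative to an inhibitor-free control translates into an apparent recovery below 100 %.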

  15. Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Davis, H. B.

    2015-12-01

    The AGU scientific community has a strong motivation to improve the STEM knowledge and skills of today's youth, and we are dedicating increasing amounts of our time and energy to education and outreach work. Scientists and educational project leads can benefit from a deeper connection to the value of evaluation, how to work with an evaluator, and how to effectively integrate evaluation into projects to increase their impact. This talk will introduce a method for evaluating educational activities, including public talks, professional development workshops for educators, youth engagement programs, and more. We will discuss the impetus for developing this method--the Quantitative Collaborative Impact Analysis Method--how it works, and the successes we've had with it in the NASA Astrobiology education community.

  16. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, and by physicochemical studies, using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for the quantitative HPLC determination. The impurities were detected by a newly developed gradient, reversed-phase high-performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and the stability of analytical solutions, demonstrating the power of the newly developed HPLC method.

  17. Quantitative carbide analysis using the Rietveld method for 2.25Cr-1Mo-0.25V steel

    SciTech Connect

    Zhang Yongtao; Han Haibo; Miao Lede; Zhang Hanqian; Li Jinfu

    2009-09-15

    It is usually difficult to quantitatively determine the mass fraction of each type of precipitate in steels using transmission electron microscopy and traditional X-ray powder diffraction analysis methods. In this paper, the Rietveld full-pattern fitting algorithm was employed to calculate the relative mass fractions of the precipitates in 2.25Cr-1Mo-0.25V steel. The results show that the fractions of the MC, M₇C₃ and M₂₃C₆ carbides can be evaluated precisely and relatively quickly. In addition, it was found that the fine MC phase dissolved into the matrix with prolonged tempering.
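
    In Rietveld-based quantitative phase analysis, relative mass fractions are commonly recovered from the refined scale factors via the Hill-Howard relation. A minimal sketch with hypothetical refined values (the numbers below are invented for illustration, not the paper's refinement results):

```python
def rietveld_weight_fractions(scales, zmv):
    """Relative weight fractions from Rietveld scale factors via the
    Hill-Howard relation W_i = S_i (ZMV)_i / sum_j S_j (ZMV)_j, where
    Z = formula units per cell, M = formula mass, V = unit-cell volume."""
    weights = [s * x for s, x in zip(scales, zmv)]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical refined scale factors and ZMV products for three carbide phases
fractions = rietveld_weight_fractions(
    scales=[1.2e-4, 4.0e-5, 8.0e-6],
    zmv=[5.0e4, 2.1e5, 9.8e5],
)
```

    Because only ratios of S·ZMV enter, the result is a relative fraction among the modelled crystalline phases; amorphous content needs an internal standard.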

  18. Validation of the method of quantitative phase analysis by X-ray diffraction in API: case of Tibolone

    NASA Astrophysics Data System (ADS)

    Silva, R. P.; Ambrósio, M. F. S.; Epprecht, E. K.; Avillez, R. R.; Achete, C. A.; Kuznetsov, A.; Visentin, L. C.

    2016-07-01

    In this study, different structural and microstructural models applied to the X-ray analysis of powder diffraction data from polymorphic mixtures of known concentrations of Tibolone were investigated. The X-ray data obtained on different diffraction instruments were analysed via the Rietveld method using the same analytical models. The results of the quantitative phase analysis show that, regardless of the instrument used, the calculated concentrations follow the same systematics with respect to the final errors. A strategy for selecting the specific analytical model that leads to the lowest measurement errors is presented.

  19. "Do I Need Research Skills in Working Life?": University Students' Motivation and Difficulties in Quantitative Methods Courses

    ERIC Educational Resources Information Center

    Murtonen, Mari; Olkinuora, Erkki; Tynjala, Paivi; Lehtinen, Erno

    2008-01-01

    This study explored university students' views of whether they will need research skills in their future work in relation to their approaches to learning, situational orientations on a learning situation of quantitative methods, and difficulties experienced in quantitative research courses. Education and psychology students in both Finland (N =…

  20. A method for estimating the effective number of loci affecting a quantitative character.

    PubMed

    Slatkin, Montgomery

    2013-11-01

    A likelihood method is introduced that jointly estimates the number of loci and the additive effect of alleles that account for the genetic variance of a normally distributed quantitative character in a randomly mating population. The method assumes that measurements of the character are available from one or both parents and an arbitrary number of full siblings. The method uses the fact, first recognized by Karl Pearson in 1904, that the variance of a character among offspring depends on both the parental phenotypes and the number of loci. Simulations show that the method performs well provided that data from a sufficient number of families (on the order of thousands) are available. The method assumes that the loci are in Hardy-Weinberg and linkage equilibrium but assumes nothing about the linkage relationships. It performs equally well if all loci are on the same non-recombining chromosome, provided they are in linkage equilibrium. The method can be adapted to take account of loci already identified as being associated with the character of interest; in that case, it estimates the number of loci not already known to affect the character. Applied to measurements of crown-rump length in 281 family trios in a captive colony of African green monkeys (Chlorocebus aethiops sabaeus), the method estimates the number of loci to be 112 and the additive effect to be 0.26 cm. A parametric bootstrap analysis shows that a rough confidence interval has a lower bound of 14 loci. PMID:23973416
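
    The Pearson observation the likelihood exploits, namely that within-family offspring variance carries information about the number of segregating loci when total genetic variance is held fixed, can be illustrated with a small simulation. This is a sketch of the underlying idea only, not the paper's likelihood estimator:

```python
import numpy as np

def simulate_family(n_loci, effect, n_offspring, rng):
    """One nuclear family for a purely additive trait: biallelic loci at
    allele frequency 0.5, in Hardy-Weinberg and linkage equilibrium."""
    parents = rng.binomial(2, 0.5, size=(2, n_loci))   # allele counts 0..2

    def gamete(genos):
        # one allele per locus: certain for homozygotes, coin flip for hets
        return (genos == 2) + (genos == 1) * rng.integers(0, 2, size=genos.shape)

    kids = np.array([gamete(parents[0]) + gamete(parents[1])
                     for _ in range(n_offspring)])
    return effect * kids.sum(axis=1)

rng = np.random.default_rng(1)

# With total genetic variance held fixed (n * a^2 / 2 is equal in both arms),
# within-family variances are more variable across families when few loci of
# large effect segregate than when many loci of small effect do.
few = [np.var(simulate_family(10, 1.0, 50, rng)) for _ in range(200)]
many = [np.var(simulate_family(100, np.sqrt(0.1), 50, rng)) for _ in range(200)]
spread_few, spread_many = float(np.std(few)), float(np.std(many))
```

    The estimator in the paper turns this qualitative dependence into a joint likelihood for the locus count and allele effect.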

  1. Goals of Secondary Education as Perceived by Education Consumers. Volume IV, Quantitative Results.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. Inst. for Social Research and Development.

    The results of a study to determine attitudes of parents and professional educators toward educational goals for secondary school students are analyzed in this report. The survey was conducted in two communities--Albuquerque, New Mexico, and Philadelphia, Pennsylvania. The essential nature of the results is summarized by the following categories:…

  2. Methods for quantitative detection of antibody-induced complement activation on red blood cells.

    PubMed

    Meulenbroek, Elisabeth M; Wouters, Diana; Zeerleder, Sacha

    2014-01-01

    Antibodies against red blood cells (RBCs) can lead to complement activation resulting in an accelerated clearance via complement receptors in the liver (extravascular hemolysis) or leading to intravascular lysis of RBCs. Alloantibodies (e.g. ABO) or autoantibodies to RBC antigens (as seen in autoimmune hemolytic anemia, AIHA) leading to complement activation are potentially harmful and can be - especially when leading to intravascular lysis - fatal(1). Currently, complement activation due to (auto)-antibodies on RBCs is assessed in vitro by using the Coombs test reflecting complement deposition on RBC or by a nonquantitative hemolytic assay reflecting RBC lysis(1-4). However, to assess the efficacy of complement inhibitors, it is mandatory to have quantitative techniques. Here we describe two such techniques. First, an assay to detect C3 and C4 deposition on red blood cells that is induced by antibodies in patient serum is presented. For this, FACS analysis is used with fluorescently labeled anti-C3 or anti-C4 antibodies. Next, a quantitative hemolytic assay is described. In this assay, complement-mediated hemolysis induced by patient serum is measured making use of spectrophotometric detection of the released hemoglobin. Both of these assays are very reproducible and quantitative, facilitating studies of antibody-induced complement activation. PMID:24514151
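
    Quantitative hemolytic assays of this kind conventionally normalize the released-hemoglobin absorbance between a spontaneous-lysis blank and a 100 % lysis control. A minimal sketch of that standard normalization (the absorbance values are invented, and the paper's exact readout may differ):

```python
def percent_hemolysis(a_sample, a_spontaneous, a_total):
    """Percentage complement-mediated lysis from released-haemoglobin
    absorbance, relative to a spontaneous-lysis blank and a 100 % lysis
    (e.g. water-lysed) control."""
    return 100.0 * (a_sample - a_spontaneous) / (a_total - a_spontaneous)

# Hypothetical spectrophotometer readings for one serum dilution
lysis = percent_hemolysis(a_sample=0.62, a_spontaneous=0.05, a_total=1.19)
```

    Expressing lysis on this 0-100 % scale is what makes serial dilutions and inhibitor titrations comparable across runs.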

  3. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples

    NASA Technical Reports Server (NTRS)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.

    2011-01-01

    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.

  4. A fast semi-quantitative method for Plutonium determination in an alpine firn/ice core

    NASA Astrophysics Data System (ADS)

    Gabrieli, J.; Cozzi, G.; Vallelonga, P.; Schwikowski, M.; Sigl, M.; Boutron, C.; Barbante, C.

    2009-04-01

    deposition decreased very sharply reaching a minimum in 1967. The third period (1967-1975) is characterized by irregular Pu profiles with smaller peaks (about 20-30% compared to the 1964 peak) which could be due to French and Chinese tests. Comparison with the Pu profiles obtained from the Col du Dome and Belukha ice cores by AMS (Accelerator Mass Spectrometry) shows very good agreement. Considering the semi-quantitative method and the analytical uncertainty, the results are also quantitatively comparable. However, the Pu concentrations at Colle Gnifetti are normally 2-3 times greater than in Col du Dome. This could be explained by different air mass transport or, more likely, different accumulation rates at each site.

  5. Malignant gliomas: current perspectives in diagnosis, treatment, and early response assessment using advanced quantitative imaging methods

    PubMed Central

    Ahmed, Rafay; Oborski, Matthew J; Hwang, Misun; Lieberman, Frank S; Mountz, James M

    2014-01-01

    Malignant gliomas consist of glioblastomas, anaplastic astrocytomas, anaplastic oligodendrogliomas and anaplastic oligoastrocytomas, and some less common tumors such as anaplastic ependymomas and anaplastic gangliogliomas. Malignant gliomas have high morbidity and mortality. Even with optimal treatment, median survival is only 12–15 months for glioblastomas and 2–5 years for anaplastic gliomas. However, recent advances in imaging and quantitative analysis of image data have led to earlier diagnosis of tumors and tumor response to therapy, providing oncologists with a greater time window for therapy management. In addition, improved understanding of tumor biology, genetics, and resistance mechanisms has enhanced surgical techniques, chemotherapy methods, and radiotherapy administration. After proper diagnosis and institution of appropriate therapy, there is now a vital need for quantitative methods that can sensitively detect malignant glioma response to therapy at early follow-up times, when changes in management of nonresponders can have its greatest effect. Currently, response is largely evaluated by measuring magnetic resonance contrast and size change, but this approach does not take into account the key biologic steps that precede tumor size reduction. Molecular imaging is ideally suited to measuring early response by quantifying cellular metabolism, proliferation, and apoptosis, activities altered early in treatment. We expect that successful integration of quantitative imaging biomarker assessment into the early phase of clinical trials could provide a novel approach for testing new therapies, and importantly, for facilitating patient management, sparing patients from weeks or months of toxicity and ineffective treatment. This review will present an overview of epidemiology, molecular pathogenesis and current advances in diagnoses, and management of malignant gliomas. PMID:24711712

  6. Setting health research priorities using the CHNRI method: VI. Quantitative properties of human collective opinion

    PubMed Central

    Yoshida, Sachiyo; Rudan, Igor; Cousens, Simon

    2016-01-01

    Introduction Crowdsourcing has become an increasingly important tool for addressing many problems, from government elections in democracies and stock market prices to modern online tools such as TripAdvisor or the Internet Movie Database (IMDB). The CHNRI method (the acronym for the Child Health and Nutrition Research Initiative) for setting health research priorities has crowdsourcing as its major component, which it uses to generate, assess and prioritize between many competing health research ideas. Methods We conducted a series of analyses using data from a group of 91 scorers to explore the quantitative properties of their collective opinion. We were interested in the stability of their collective opinion as the sample size increases from 15 to 90. From a pool of 91 scorers who took part in a previous CHNRI exercise, we used sampling with replacement to generate multiple random samples of different sizes. First, for each sample generated, we identified the top 20 ranked research ideas, among the 205 that were proposed and scored, and calculated the concordance with the ranking generated by the 91 original scorers. Second, we used rank correlation coefficients to compare the ranks assigned to all 205 proposed research ideas when samples of different sizes were used. We also analysed the original pool of 91 scorers to look for evidence of scoring variations based on scorers' characteristics. Results The sample sizes investigated ranged from 15 to 90. The concordance for the top 20 scored research ideas increased with sample sizes up to about 55 experts. At this point, the median level of concordance stabilized at 15/20 top ranked questions (75%), with the interquartile range also generally stable (14–16). There was little further increase in overlap when the sample size increased from 55 to 90. When analysing the ranking of all 205 ideas, the rank correlation coefficient increased as the sample size increased, with a median correlation of 0.95 reached at the sample size
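
    The subsampling experiment described (random samples of scorers drawn with replacement, concordance of each subsample's top-20 ranking with the full panel's) can be sketched as follows. The panel and idea counts match the abstract, but the scores themselves are simulated:

```python
import numpy as np

def top_k_concordance(scores, sample_size, k=20, n_rep=500, seed=7):
    """Median overlap between the top-k ideas ranked by a random subsample of
    scorers (drawn with replacement) and the top-k of the full panel."""
    rng = np.random.default_rng(seed)
    n_scorers = scores.shape[0]
    full_top = set(np.argsort(scores.mean(axis=0))[-k:])
    overlaps = []
    for _ in range(n_rep):
        idx = rng.integers(0, n_scorers, size=sample_size)
        sub_top = set(np.argsort(scores[idx].mean(axis=0))[-k:])
        overlaps.append(len(sub_top & full_top))
    return float(np.median(overlaps))

# Synthetic panel: 91 scorers x 205 ideas, score = idea quality + scorer noise
rng = np.random.default_rng(42)
quality = rng.normal(size=205)
scores = quality + rng.normal(scale=1.0, size=(91, 205))
conc_15 = top_k_concordance(scores, 15)
conc_55 = top_k_concordance(scores, 55)
```

    On simulated data the concordance rises with subsample size and then flattens, mirroring the stabilization around 55 scorers reported in the abstract.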

  7. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature identification during diagnosis. A quantitative and simple distortion evaluation method is therefore imperative for both the endoscopy industry and medical device regulatory agencies; however, no such method is yet available. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data based on complex mathematical models that are difficult to understand. Commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion. Building on this method, we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has clear physical meaning across the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, it can help bring endoscopic technologies to market and could potentially be adopted in an international endoscope standard.
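
    The conventional radial-distortion figure of merit, and a simplified notion of local magnification, can be written down directly. These are paraphrased textbook-style definitions for illustration, not the paper's exact formulas:

```python
def radial_distortion_percent(r_distorted, r_undistorted):
    """Classic radial distortion D_RAD at one field point: the percentage
    displacement of an imaged point from its paraxial (undistorted) radius.
    Negative values indicate barrel distortion, positive values pincushion."""
    return 100.0 * (r_distorted - r_undistorted) / r_undistorted

def local_magnification(dr_image, dr_object, m_center):
    """Simplified local magnification ML: the ratio of a small image-plane
    displacement to the object-plane displacement producing it, normalized by
    the magnification at the image center. ML = 1 everywhere for a
    distortion-free lens."""
    return (dr_image / dr_object) / m_center

# Barrel-distortion example: image radius compressed 10 % at the field edge
d_rad = radial_distortion_percent(r_distorted=0.9, r_undistorted=1.0)
```

    Because ML is defined pointwise, it characterizes distortion over the whole field of view rather than only along the picture diagonal, which is the limitation of single-number metrics like DPH.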

  8. Rapid method for glutathione quantitation using high-performance liquid chromatography with coulometric electrochemical detection.

    PubMed

    Bayram, Banu; Rimbach, Gerald; Frank, Jan; Esatbeyoglu, Tuba

    2014-01-15

    A rapid, sensitive, and direct method (without derivatization) was developed for the detection of reduced glutathione (GSH) in cultured hepatocytes (HepG2 cells) using high-performance liquid chromatography with electrochemical detection (HPLC-ECD). The method was validated according to the guidelines of the U.S. Food and Drug Administration in terms of linearity, lower limit of quantitation (LOQ), lower limit of detection (LOD), precision, accuracy, recovery, and the stabilities of GSH standards and quality control samples. The total analysis time was 5 min, and the retention time of GSH was 1.78 min. Separation was carried out isocratically using 50 mM sodium phosphate (pH 3.0) as the mobile phase with a fused-core column. The detector response was linear between 0.01 and 80 μmol/L, with a regression coefficient (R²) >0.99. The LOD for GSH was 15 fmol, and the intra- and interday recoveries ranged between 100.7 and 104.6%. The method also enabled the rapid detection (in 4 min) of other compounds involved in GSH metabolism such as uric acid, ascorbic acid, and glutathione disulfide. The optimized and validated HPLC-ECD method was successfully applied to the determination of GSH levels in HepG2 cells treated with buthionine sulfoximine (BSO), an inhibitor, and α-lipoic acid (α-LA), an inducer, of GSH synthesis. As expected, GSH levels decreased concentration-dependently with BSO and increased with α-LA treatment in HepG2 cells. This method could also be useful for the quantitation of GSH, uric acid, ascorbic acid, and glutathione disulfide in other biological matrices such as tissue homogenates and blood. PMID:24328299
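
    Validation quantities such as LOD and LOQ are commonly derived from the calibration line's residual standard deviation and slope (the ICH Q2 approach). A hedged sketch with hypothetical calibration points; the numbers are invented, and this study's exact LOD determination may have differed:

```python
import numpy as np

def lod_loq_from_calibration(conc, response):
    """ICH-style limits from a linear calibration: LOD = 3.3*sigma/slope,
    LOQ = 10*sigma/slope, with sigma the residual standard deviation of
    the regression line."""
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sigma = np.std(residuals, ddof=2)        # 2 fitted parameters
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical GSH calibration (µmol/L vs detector counts, invented values)
conc = np.array([0.01, 0.1, 1.0, 10.0, 40.0, 80.0])
resp = np.array([0.9, 8.2, 81.0, 800.5, 3205.0, 6402.0])
lod, loq = lod_loq_from_calibration(conc, resp)
```

    By construction the two limits share the same sigma/slope factor, so LOQ/LOD is always 10/3.3 under this convention.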

  9. Development of a High-Sensitivity Quantitation Method for Arginine Vasopressin by High-Performance Liquid Chromatography Tandem Mass Spectrometry, and Comparison with Quantitative Values by Radioimmunoassay.

    PubMed

    Tsukazaki, Yasuko; Senda, Naoto; Kubo, Kinya; Yamada, Shigeru; Kugoh, Hiroyuki; Kazuki, Yasuhiro; Oshimura, Mitsuo

    2016-01-01

    Human plasma arginine vasopressin (AVP) levels serve as a clinically relevant marker of diabetes and related syndromes. We developed a highly sensitive method for measuring human plasma AVP using high-performance liquid chromatography tandem mass spectrometry. AVP was extracted from human plasma using a weak-cation solid-phase extraction plate, and separated on a wide-bore octadecyl reverse-phase column. AVP was quantified in ion-transition experiments utilizing a product ion (m/z 328.3) derived from its parent ion (m/z 542.8). The sensitivity was enhanced using 0.02% dichloromethane as a mobile-phase additive. The lower limit of quantitation was 0.200 pmol/L. The extraction recovery ranged from 70.2 ± 7.2 to 73.3 ± 6.2% (mean ± SD), and the matrix effect ranged from 1.1 - 1.9%. Quality-testing samples revealed interday/intraday accuracy and precision ranging over 0.9 - 3% and -0.3 - 2%, respectively, which included the endogenous baseline. Our results correlated well with radioimmunoassay results using 22 human volunteer plasma samples. PMID:26860558

  10. A method for the simultaneous identification and quantitation of five superwarfarin rodenticides in human serum.

    PubMed

    Kuijpers, E A; den Hartigh, J; Savelkoul, T J; de Wolff, F A

    1995-01-01

    A high-performance liquid chromatographic method with ultraviolet (UV) and fluorescence detection was developed for the analysis of one indandione and four hydroxycoumarin anticoagulant rodenticides in human serum. The superwarfarin rodenticides, chlorophacinone, bromadiolone, difenacoum, brodifacoum, and difethialone, can be identified and quantitated simultaneously with this method. After adding a buffer (pH 5.5), the anticoagulants were extracted from serum with chloroform-acetone. The organic phase was separated and evaporated to dryness, and the residue was subjected to chromatographic analysis. The anticoagulants were separated by reversed-phase chromatography and detected by UV absorption at 285 nm and by fluorescence at an excitation wavelength of 265 nm and an emission wavelength of 400 nm. Extraction efficiencies from 55 to 131% were obtained. The within-run precision ranged from 2.0 to 7.1% for UV detection and from 0.0 to 4.8% for fluorescence detection. Between-run precision ranged from 1.3 to 16.0% for UV detection and from 1.8 to 9.0% for fluorescence detection. The anticoagulants can be quantitated at serum concentrations down to 3-12 ng/mL for fluorescence detection and down to 20-75 ng/mL for UV detection. No interferences were observed with the related compounds warfarin and vitamin K1. PMID:8577178

  11. A quantitative method for estimation of volume changes in arachnoid foveae with age.

    PubMed

    Duray, Stephen M; Martel, Stacie S

    2006-03-01

    Age-related changes of arachnoid foveae have been described, but objective, quantitative analyses are lacking. A new quantitative method is presented for estimating the change in total volume of arachnoid foveae with age. The pilot sample consisted of nine skulls from the Palmer Anatomy Laboratory. Arachnoid foveae were filled with sand, which was then extracted using a vacuum pump. Mass was determined with an analytical balance and converted to volume. A reliability analysis was performed using intraclass correlation coefficients, and the method was found to be highly reliable (intraobserver ICC = 0.9935, interobserver ICC = 0.9878). The relationship between total volume and age was then examined in a sample of 63 males of accurately known age from the Hamann-Todd collection. Linear regression analysis revealed no statistically significant relationship between total volume and age, or between foveae frequency and age (alpha = 0.05). The development of arachnoid foveae may be influenced by health factors, which could limit their usefulness in age estimation. PMID:16566755

  12. A quantitative method for measurement of HL-60 cell apoptosis based on diffraction imaging flow cytometry technique

    PubMed Central

    Yang, Xu; Feng, Yuanming; Liu, Yahui; Zhang, Ning; Lin, Wang; Sa, Yu; Hu, Xin-Hua

    2014-01-01

    A quantitative method for the measurement of apoptosis in HL-60 cells, based on a polarization diffraction imaging flow cytometry technique, is presented in this paper. Through a comparative study with existing methods and the analysis of diffraction images by a gray-level co-occurrence matrix (GLCM) algorithm, we found that four GLCM parameters, contrast (CON), cluster shade (CLS), correlation (COR) and dissimilarity (DIS), are highly sensitive to the apoptotic rate. It was further demonstrated that the CLS parameter correlates significantly (R2 = 0.899) with the degree of nuclear fragmentation, and the other three parameters also showed very good correlations (R2 ranging from 0.69 to 0.90). These results demonstrate that the new method can rapidly and accurately extract morphological features to quantify cellular apoptosis without the need for cell staining. PMID:25071957
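
    The four GLCM texture parameters named above have standard definitions; a minimal sketch on a toy gray-level image (not the diffraction images of the study) is:

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    p = [[0.0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    total = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                p[image[r][c]][image[r2][c2]] += 1
                total += 1
    return [[v / total for v in row] for row in p]

def glcm_features(p):
    """Contrast, dissimilarity, correlation and cluster shade of a GLCM."""
    idx = range(len(p))
    mu_i = sum(i * p[i][j] for i in idx for j in idx)
    mu_j = sum(j * p[i][j] for i in idx for j in idx)
    var_i = sum((i - mu_i) ** 2 * p[i][j] for i in idx for j in idx)
    var_j = sum((j - mu_j) ** 2 * p[i][j] for i in idx for j in idx)
    return {
        "CON": sum((i - j) ** 2 * p[i][j] for i in idx for j in idx),
        "DIS": sum(abs(i - j) * p[i][j] for i in idx for j in idx),
        "COR": sum((i - mu_i) * (j - mu_j) * p[i][j]
                   for i in idx for j in idx) / (var_i * var_j) ** 0.5,
        "CLS": sum((i + j - mu_i - mu_j) ** 3 * p[i][j]
                   for i in idx for j in idx),
    }

toy_image = [[0, 0, 1, 1],
             [0, 0, 1, 1],
             [0, 2, 2, 2],
             [2, 2, 3, 3]]
features = glcm_features(glcm(toy_image))
```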

  13. Quantitative analysis of gene expression in fixed colorectal carcinoma samples as a method for biomarker validation.

    PubMed

    Ostasiewicz, Beata; Ostasiewicz, Paweł; Duś-Szachniewicz, Kamila; Ostasiewicz, Katarzyna; Ziółkowski, Piotr

    2016-06-01

    Biomarkers have been described as the future of oncology. Modern proteomics provide an invaluable tool for the near‑whole proteome screening for proteins expressed differently in neoplastic vs. healthy tissues. However, in order to select the most promising biomarkers, an independent method of validation is required. The aim of the current study was to propose a methodology for the validation of biomarkers. Due to material availability, the majority of large scale biomarker studies are performed using formalin‑fixed paraffin‑embedded (FFPE) tissues, therefore these were selected for use in the current study. A total of 10 genes were selected from those previously described as the most promising candidate biomarkers, and the expression levels were analyzed with reverse transcription‑quantitative polymerase chain reaction (RT‑qPCR) using calibrator normalized relative quantification with the efficiency correction. For 6/10 analyzed genes, the results were consistent with the proteomic data; for the remaining four genes, the results were inconclusive. The upregulation of karyopherin α 2 (KPNA2) and chromosome segregation 1‑like (CSE1L) in colorectal carcinoma, in addition to downregulation of chloride channel accessory 1 (CLCA1), fatty acid binding protein 1 (FABP1), sodium channel, voltage gated, type VII α subunit (SCN7A) and solute carrier family 26 (anion exchanger), member 3 (SLC26A3) was confirmed. With the combined use of proteomic and genetic tools, it was reported, for the first time to the best of our knowledge, that SCN7A was downregulated in colorectal carcinoma at mRNA and protein levels. It had been previously suggested that the remaining five genes served an important role in colorectal carcinogenesis; however, the current study provided strong evidence to support their use as biomarkers. Thus, it was concluded that combination of RT‑qPCR with proteomics offers a powerful methodology for biomarker identification, which
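
    Calibrator-normalized relative quantification with efficiency correction is commonly computed as a Pfaffl-type ratio; the sketch below uses hypothetical amplification efficiencies and Ct values, not the study's data.

```python
def relative_quantity(e_target, e_ref,
                      ct_target_sample, ct_target_calibrator,
                      ct_ref_sample, ct_ref_calibrator):
    """Efficiency-corrected expression ratio, sample vs. calibrator.

    e_* are amplification efficiencies (2.0 = perfect doubling per cycle).
    """
    target = e_target ** (ct_target_calibrator - ct_target_sample)
    reference = e_ref ** (ct_ref_calibrator - ct_ref_sample)
    return target / reference

# hypothetical Ct values for a target gene in tumor vs. calibrator tissue,
# normalized to a reference gene
ratio = relative_quantity(e_target=1.95, e_ref=2.0,
                          ct_target_sample=24.1, ct_target_calibrator=26.8,
                          ct_ref_sample=20.3, ct_ref_calibrator=20.1)
# ratio > 1 indicates upregulation in the sample relative to the calibrator
```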

  14. A Rapid and Quantitative Flow Cytometry Method for the Analysis of Membrane Disruptive Antimicrobial Activity

    PubMed Central

    O’Brien-Simpson, Neil M.; Pantarat, Namfon; Attard, Troy J.; Walsh, Katrina A.; Reynolds, Eric C.

    2016-01-01

    We describe a microbial flow cytometry method that quantifies antimicrobial peptide (AMP) activity within 3 hours, expressed as a Minimum Membrane Disruptive Concentration (MDC). Increasing peptide concentration correlates positively with the extent of bacterial membrane disruption, and the calculated MDC is equivalent to the minimum bactericidal concentration (MBC). The activity of AMPs representing three different membranolytic modes of action could be determined for a range of Gram-positive and Gram-negative bacteria, including the ESKAPE pathogens, E. coli and MRSA. By using the MDC50 concentration of the parent AMP, the method provides high-throughput, quantitative screening of AMP analogues. A unique feature of the MDC assay is that it directly measures peptide/bacteria interactions and lysed cell numbers, rather than bacterial survival as in MIC and MBC assays. With the threat of multi-drug resistant bacteria, this high-throughput MDC assay has the potential to aid in the development of novel antimicrobials that target bacteria with improved efficacy. PMID:26986223

  15. Two methods for the quantitative analysis of surface antigen expression in acute myeloid leukemia (AML).

    PubMed

    Woźniak, Jolanta

    2004-01-01

    The expression of lineage molecules (CD13 and CD33), c-Kit receptor (CD117), CD34, HLA-DR and the adhesion molecule CD49d was assessed in acute myeloid leukemia (AML) blast cells from 32 cases, using direct and indirect quantitative cytometric analysis. High correlation (r=0.8) was found between antigen expression intensity values calculated by the direct analysis method (ABC) and by the indirect analysis method (RFI). Moreover, differences in the expression intensity of the CD13, CD117 and CD34 antigens were found between leukemic and normal myeloblasts. This may be helpful for identifying leukemic cells in the diagnosis of minimal residual disease after treatment in AML patients. PMID:15493582

  16. Quantitative imaging of volcanic plumes — Results, needs, and future trends

    USGS Publications Warehouse

    Platt, Ulrich; Lübcke, Peter; Kuhn, Jonas; Bobrowski, Nicole; Prata, Fred; Burton, Mike; Kern, Christoph

    2015-01-01

    Recent technology allows two-dimensional “imaging” of trace gas distributions in plumes. In contrast to older, one-dimensional remote sensing techniques, which are only capable of measuring total column densities, the new imaging methods give insight into details of transport and mixing processes as well as chemical transformation within plumes. We give an overview of gas imaging techniques already being applied at volcanoes (SO2 cameras, imaging DOAS, FT-IR imaging), present techniques where first field experiments were conducted (LED-LIDAR, tomographic mapping), and describe some techniques where only theoretical studies with application to volcanology exist (e.g. Fabry–Pérot imaging, gas correlation spectroscopy, bi-static LIDAR). Finally, we discuss current needs and future trends in imaging technology.

  17. Quantitative assessment of port-wine stains using chromametry: preliminary results

    NASA Astrophysics Data System (ADS)

    Beacco, Claire; Brunetaud, Jean Marc; Rotteleur, Guy; Steen, D. A.; Brunet, F.

    1996-12-01

    Objective assessment of the efficacy of different lasers for the treatment of port wine stains (PWS) remains difficult. Chromametry gives reproducible information on the color of PWS, but its raw data are of little direct use to a physician. Thus, specific software was developed to allow graphic representation of PWS characteristics. Before the first laser treatment and after every treatment, measurements were made with a chromameter on a marked zone of the PWS and on the contralateral normal zone, which serves as the reference. The software calculates and graphically represents the difference in color between PWS and normal skin using the data provided by the chromameter. Three parameters are calculated: ΔH is the difference in hue, ΔL is the difference in lightness and ΔE is the total difference in color. Each measured zone is represented by its coordinates. Calculated initial values were compared with the subjective initial color assessed by the dermatologist. The variation of the color difference was calculated from the successive values of ΔE after n treatments and was compared with the subjective classification of fading. Since January 1995, forty-three locations have been measured before laser treatment. Purple PWS tended to differentiate from the others, but red and dark pink PWS could not be differentiated. The evolution of the color after treatment was calculated in 29 PWS treated 3 or 4 times. Poor results corresponded to an increase in ΔE; fair and good results were associated with a decrease in ΔE. We did not observe excellent results during this study. These promising preliminary results need to be confirmed in a larger group of patients.
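
    The three chromametric quantities can be computed directly from CIE L*a*b* readings; the hue term below follows the usual CIELAB decomposition (ΔE² = ΔL² + ΔC² + ΔH²), and all readings are hypothetical.

```python
import math

def color_difference(lab_lesion, lab_reference):
    """CIE L*a*b* differences between lesion and reference skin.

    Returns (dL, dH, dE): lightness, hue and total color difference.
    """
    dL = lab_lesion[0] - lab_reference[0]
    da = lab_lesion[1] - lab_reference[1]
    db = lab_lesion[2] - lab_reference[2]
    dE = math.sqrt(dL ** 2 + da ** 2 + db ** 2)
    # chroma difference, then hue difference from dE^2 = dL^2 + dC^2 + dH^2
    dC = (math.hypot(lab_lesion[1], lab_lesion[2])
          - math.hypot(lab_reference[1], lab_reference[2]))
    dH = math.sqrt(max(dE ** 2 - dL ** 2 - dC ** 2, 0.0))
    return dL, dH, dE

# hypothetical readings: port-wine stain vs. contralateral normal skin
dL, dH, dE = color_difference((42.0, 28.5, 2.1), (55.0, 14.0, 10.5))
# fading after treatment would show up as a decrease in dE
```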

  18. Assessment of Riboflavin as a Tracer Substance: Comparison of a Qualitative to a Quantitative Method of Riboflavin Measurement

    PubMed Central

    Herron, Abigail J.; Mariani, John J.; Pavlicova, Martina; Parinello, Christina M.; Bold, Krysten W.; Levin, Frances R.; Nunes, Edward V.; Sullivan, Maria A.; Raby, Wilfred N.; Bisaga, Adam

    2013-01-01

    Background Noncompliance with medications may have major impacts on outcomes measured in research, potentially distorting the validity of controlled clinical trials. Riboflavin is frequently used in trials as a marker of adherence. It can be combined with study medication and is excreted in urine where it fluoresces under UV light. This study compares qualitative visual inspection of fluorescence to quantitative fluorometric analysis of riboflavin concentration in its ability to detect the presence of riboflavin in urine. Methods Twenty-four volunteers received 0 mg, 25 mg, and 50 mg doses of riboflavin under single-blind conditions, with 20 also receiving a 100 mg dose. Five serial urine samples were collected over the following 36 hours. Quantitative measurement of riboflavin by fluorometric analysis and qualitative assessment of each sample using visual inspection were performed. Results The overall false positive rate for qualitative assessment was 53%. For quantitative assessment, a riboflavin concentration of 900 ng/mL was established to classify positive samples. More than 80% of samples were positive 2 to 24 hours following ingestion of 25 mg and 50 mg, and less than 80% were positive at 36 hours. At least 95% of observations for the 100 mg dose were above 900 ng/mL at all timepoints. Conclusions Quantitative fluorometric assessment is superior to qualitative visual inspection alone in determining medication adherence. The combination of 25–50 mg of daily riboflavin and a cut-off level of 900 ng/mL keeps the risk of missing non-compliant participants acceptably low while preserving a high level of power to detect all cases of medication compliance. PMID:22921475
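
    Applying the 900 ng/mL cut-off to fluorometric readings reduces to a simple classification; the concentrations below are made-up examples, not study data.

```python
CUTOFF_NG_PER_ML = 900.0  # concentration used to classify a sample as positive

def positive_rate(concentrations, cutoff=CUTOFF_NG_PER_ML):
    """Fraction of urine samples at or above the cut-off concentration."""
    return sum(c >= cutoff for c in concentrations) / len(concentrations)

# hypothetical fluorometric readings (ng/mL) 24 h after a 25 mg dose
readings = [1450.0, 2210.0, 880.0, 1020.0, 3100.0]
rate = positive_rate(readings)  # 4 of 5 samples classified positive
```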

  19. Development and validation of a LC-MS method for quantitation of ergot alkaloids in lateral saphenous vein tissue

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A liquid chromatography-mass spectrometry (LC/MS) method for simultaneous quantitation of seven ergot alkaloids (lysergic acid, ergonovine, ergovaline, ergocornine, ergotamine, ergocryptine and ergocrystine) in vascular tissue was developed and validated. Reverse-phase chromatography, coupled to an...

  20. Development of a rapid method for the quantitative determination of deoxynivalenol using Quenchbody.

    PubMed

    Yoshinari, Tomoya; Ohashi, Hiroyuki; Abe, Ryoji; Kaigome, Rena; Ohkawa, Hideo; Sugita-Konishi, Yoshiko

    2015-08-12

    Quenchbody (Q-body) is a novel fluorescent biosensor based on the antigen-dependent removal of a quenching effect on a fluorophore attached to antibody domains. In order to develop a method using Q-body for the quantitative determination of deoxynivalenol (DON), a trichothecene mycotoxin produced by some Fusarium species, anti-DON Q-body was synthesized from the sequence information of a monoclonal antibody specific to DON. When the purified anti-DON Q-body was mixed with DON, a dose-dependent increase in the fluorescence intensity was observed and the detection range was between 0.0003 and 3 mg L(-1). The coefficients of variation were 7.9% at 0.003 mg L(-1), 5.0% at 0.03 mg L(-1) and 13.7% at 0.3 mg L(-1). The limit of detection was 0.006 mg L(-1) for DON in wheat. The Q-body showed an antigen-dependent fluorescence enhancement even in the presence of wheat extracts. To validate the analytical method using Q-body, a spike-and-recovery experiment was performed using four spiked wheat samples. The recoveries were in the range of 94.9-100.2%. The concentrations of DON in twenty-one naturally contaminated wheat samples were quantitated by the Q-body method, LC-MS/MS and an immunochromatographic assay kit. The LC-MS/MS analysis showed that the levels of DON contamination in the samples were between 0.001 and 2.68 mg kg(-1). The concentrations of DON quantitated by LC-MS/MS were more strongly correlated with those using the Q-body method (R(2) = 0.9760) than the immunochromatographic assay kit (R(2) = 0.8824). These data indicate that the Q-body system for the determination of DON in wheat samples was successfully developed and Q-body is expected to have a range of applications in the field of food safety. PMID:26320967

  1. Perception of mobbing during the study: results of a national quantitative research among Slovenian midwifery students.

    PubMed

    Došler, Anita Jug; Skubic, Metka; Mivšek, Ana Polona

    2014-09-01

    Mobbing, defined as sustained harassment among workers, in particular towards subordinates, merits investigation. This study aims to investigate the perception of mobbing among Slovenian midwifery students (2nd and 3rd year students of midwifery at the Faculty for Health Studies Ljubljana, the single educational institution for midwives in Slovenia), since acceptable behavioural interrelationships in the midwifery profession are formed already during the study, through professional socialization. A descriptive, causal-nonexperimental method with a questionnaire was used. Basic descriptive statistics and tests of statistical significance were carried out with SPSS software, version 20.0. All necessary ethical measures were taken into consideration during the study to protect participants. The results revealed that several participants experienced mobbing during the study (82.3%); 58.8% of them during their practical training and 23.5% from midwifery teachers. Students are often anxious and nervous when facing clinical settings (60.8%) or faculty commitments such as exams and presentations (41.2%). Many of them (40.4%) estimated that mobbing affected their health. They did not show effective strategies for solving relationship problems. According to the findings, everyone involved in midwifery education, but above all students, should be provided with more knowledge and skills for the successful management of conflict situations. PMID:25507371

  3. Quantitative Assessment of the CCMC's Experimental Real-time SWMF-Geospace Results

    NASA Astrophysics Data System (ADS)

    Liemohn, Michael; Ganushkina, Natalia; De Zeeuw, Darren; Welling, Daniel; Toth, Gabor; Ilie, Raluca; Gombosi, Tamas; van der Holst, Bart; Kuznetsova, Maria; Maddox, Marlo; Rastaetter, Lutz

    2016-04-01

    Experimental real-time simulations of the Space Weather Modeling Framework (SWMF) are conducted at the Community Coordinated Modeling Center (CCMC), with results available there (http://ccmc.gsfc.nasa.gov/realtime.php), through the CCMC Integrated Space Weather Analysis (iSWA) site (http://iswa.ccmc.gsfc.nasa.gov/IswaSystemWebApp/), and the Michigan SWMF site (http://csem.engin.umich.edu/realtime). Presently, two configurations of the SWMF are running in real time at CCMC, both focusing on the geospace modules, using the BATS-R-US magnetohydrodynamic model, the Ridley Ionosphere Model, and with and without the Rice Convection Model for inner magnetospheric drift physics. While both have been running for several years, nearly continuous results are available since July 2015. Dst from the model output is compared against the Kyoto real-time Dst, in particular the daily minimum value of Dst to quantify the ability of the model to capture storms. Contingency tables are presented, showing that the run with the inner magnetosphere model is much better at reproducing storm-time values. For disturbances with a minimum Dst lower than -50 nT, this version yields a probability of event detection of 0.86 and a Heidke Skill Score of 0.60. In the other version of the SWMF, without the inner magnetospheric module included, the modeled Dst never dropped below -50 nT during the examined epoch.
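
    The verification metrics quoted above (probability of detection and Heidke Skill Score) come from a 2×2 contingency table of modeled versus observed storm days; the counts below are hypothetical, chosen only to illustrate the formulas.

```python
def probability_of_detection(hits, misses):
    """Fraction of observed events that were successfully forecast."""
    return hits / (hits + misses)

def heidke_skill_score(hits, false_alarms, misses, correct_negatives):
    """Forecast accuracy relative to random chance (1 = perfect, 0 = chance)."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    expected = (a + c) * (c + d) + (a + b) * (b + d)
    return 2.0 * (a * d - b * c) / expected

# hypothetical counts of days with observed/modeled minimum Dst < -50 nT
pod = probability_of_detection(hits=18, misses=3)
hss = heidke_skill_score(hits=18, false_alarms=9, misses=3,
                         correct_negatives=170)
```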

  4. Design and Performance Considerations for the Quantitative Measurement of HEU Residues Resulting from 99Mo Production

    SciTech Connect

    McElroy, Robert Dennis; Chapman, Jeffrey Allen; Bogard, James S; Belian, Anthony P

    2011-01-01

    Molybdenum-99 is produced by the irradiation of high-enriched uranium (HEU) resulting in the accumulation of large quantities of HEU residues. In general, these residues are not recycled but are either disposed of or stored in containers with surface exposure rates as high as 100 R/h. The 235U content of these waste containers must be quantified for both accountability and waste disposal purposes. The challenges of quantifying such difficult-to-assay materials are discussed, along with performance estimates for each of several potential assay options. In particular, the design and performance of a High Activity Active Well Coincidence Counting (HA-AWCC) system designed and built specifically for these irradiated HEU waste materials are presented.

  5. Perspectives of Speech-Language Pathologists on the Use of Telepractice in Schools: Quantitative Survey Results

    PubMed Central

    Tucker, Janice K.

    2012-01-01

    This research surveyed 170 school-based speech-language pathologists (SLPs) in one northeastern state, with only 1.8% reporting telepractice use in school-settings. These results were consistent with two ASHA surveys (2002; 2011) that reported limited use of telepractice for school-based speech-language pathology. In the present study, willingness to use telepractice was inversely related to age, perhaps because younger members of the profession are more accustomed to using technology. Overall, respondents were concerned about the validity of assessments administered via telepractice; whether clinicians can adequately establish rapport with clients via telepractice; and if therapy conducted via telepractice can be as effective as in-person speech-language therapy. Most respondents indicated the need to establish procedures and guidelines for school-based telepractice programs. PMID:25945204

  6. Development of a Quantitative Decision Metric for Selecting the Most Suitable Discretization Method for SN Transport Problems

    NASA Astrophysics Data System (ADS)

    Schunert, Sebastian

    In this work we develop a quantitative decision metric for spatial discretization methods of the SN equations. The quantitative decision metric utilizes performance data from selected test problems for computing a fitness score that is used for the selection of the most suitable discretization method for a particular SN transport application. The fitness score is aggregated as a weighted geometric mean of single performance indicators representing various performance aspects relevant to the user. Thus, the fitness function can be adjusted to the particular needs of the code practitioner by adding/removing single performance indicators or changing their importance via the supplied weights. Within this work, a special, broad class of methods is considered, referred to as nodal methods. This class naturally comprises the DGFEM methods of all function space families. It is also shown that the Higher Order Diamond Difference (HODD) method is a nodal method. Building on earlier findings that the Arbitrarily High Order Method of the Nodal type (AHOTN) is also a nodal method, a generalized finite-element framework is created to yield as special cases various methods that were developed independently using profoundly different formalisms. A selection of test problems, each related to a particular performance aspect, is considered: a Method of Manufactured Solutions (MMS) test suite for assessing accuracy and execution time, Lathrop's test problem for assessing resilience against the occurrence of negative fluxes, and a simple, homogeneous cube test problem to verify whether a method possesses the thick diffusive limit. The contending methods are implemented as efficiently as possible under a common SN transport code framework to level the playing field for a fair comparison of their computational load. Numerical results are presented for all three test problems and a qualitative rating of each method's performance is provided for each aspect: accuracy
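
    The aggregation rule described above, a weighted geometric mean of single performance indicators, can be sketched as follows; the indicator names, values, and weights are illustrative, not those of the dissertation.

```python
import math

def fitness_score(indicators, weights):
    """Weighted geometric mean of normalized performance indicators."""
    total_weight = sum(weights.values())
    return math.prod(indicators[k] ** (weights[k] / total_weight)
                     for k in indicators)

# hypothetical indicators on a 0-1 scale, with accuracy weighted double
indicators = {"accuracy": 0.92, "runtime": 0.60, "negative-flux resilience": 0.85}
weights = {"accuracy": 2.0, "runtime": 1.0, "negative-flux resilience": 1.0}
score = fitness_score(indicators, weights)
```

    Because the mean is geometric, a method scoring near zero on any single indicator drags the whole fitness score toward zero, which matches the intent of penalizing methods that fail badly on any aspect the user cares about.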

  7. A small-scale method for quantitation of carotenoids in bacteria and yeasts.

    PubMed

    Kaiser, Philipp; Surmann, Peter; Vallentin, Gerald; Fuhrmann, Herbert

    2007-07-01

    Microbial carotenoids are difficult to extract because of their embedding into a compact matrix and prominent sensitivity to degradation. Especially for carotenoid analysis of bacteria and yeasts, there is a lack of information about the capability, precision and recovery of the methods used. Accordingly, we investigated the feasibility, throughput and validity of a new small-scale method using Micrococcus luteus and Rhodotorula glutinis for testing purposes. For disintegration and extraction, we combined primarily mild techniques: enzymatically we used combinations of lysozyme and lipase for bacteria as well as lyticase and lipase for yeasts. Additional mechanical treatment included sonication and freeze-thawing cycles. Chemical treatment with dimethylsulfoxide was applied for yeasts only. For extraction we used a methanol-chloroform mixture stabilized efficiently with butylated hydroxytoluene and alpha-tocopherol. Separation of compounds was achieved with HPLC, applying a binary methanol/tert-butyl methyl ether gradient on a polymer reversed C30 phase. Substances of interest were detected and identified using a photodiode-array (PDA) detector, and carotenoids were quantitated as all-trans-beta-carotene equivalents. For evaluation of the recovery and reproducibility of the extraction method, we used beta-8'-apo-carotenal as an internal standard. The method provides a sensitive tool for the determination of carotenoids from bacteria and yeasts, and also for small changes in the carotenoid spectrum of a single species. The requisite large experiments are facilitated by the method's high throughput. PMID:17509707

  8. Validation procedures for quantitative gluten ELISA methods: AOAC allergen community guidance and best practices.

    PubMed

    Koerner, Terry B; Abbott, Michael; Godefroy, Samuel Benrejeb; Popping, Bert; Yeung, Jupiter M; Diaz-Amigo, Carmen; Roberts, James; Taylor, Steve L; Baumert, Joseph L; Ulberth, Franz; Wehling, Paul; Koehler, Peter

    2013-01-01

    The food allergen analytical community is endeavoring to create harmonized guidelines for the validation of food allergen ELISA methodologies to help protect food-sensitive individuals and promote consumer confidence. This document provides additional guidance to existing method validation publications for quantitative food allergen ELISA methods. The gluten-specific criteria provided in this document are divided into sections for information required by the method developer about the assay and information for the implementation of the multilaboratory validation study. Many of these recommendations and guidance are built upon the widely accepted Codex Alimentarius definitions and recommendations for gluten-free foods. The information in this document can be used as the basis of a harmonized validation protocol for any ELISA method for gluten, whether proprietary or nonproprietary, that will be submitted to AOAC and/or regulatory authorities or other bodies for status recognition. Future work is planned for the implementation of this guidance document for the validation of gluten methods and the creation of gluten reference materials. PMID:24282943

  9. Evaluation of residual antibacterial potency in antibiotic production wastewater using a real-time quantitative method.

    PubMed

    Zhang, Hong; Zhang, Yu; Yang, Min; Liu, Miaomiao

    2015-11-01

    While antibiotic pollution has attracted considerable attention due to its potential in promoting the dissemination of antibiotic resistance genes in the environment, the antibiotic activity of their related substances has been neglected, which may underestimate the environmental impacts of antibiotic wastewater discharge. In this study, a real-time quantitative approach was established to evaluate the residual antibacterial potency of antibiotics and related substances in antibiotic production wastewater (APW) by comparing the growth of a standard bacterial strain (Staphylococcus aureus) in tested water samples with a standard reference substance (e.g. oxytetracycline). Antibiotic equivalent quantity (EQ) was used to express antibacterial potency, which made it possible to assess the contribution of each compound to the antibiotic activity in APW. The real-time quantitative method showed better repeatability (Relative Standard Deviation, RSD 1.08%) compared with the conventional fixed growth time method (RSD 5.62-11.29%). And its quantification limits ranged from 0.20 to 24.00 μg L(-1), depending on the antibiotic. We applied the developed method to analyze the residual potency of water samples from four APW treatment systems, and confirmed a significant contribution from antibiotic transformation products to potent antibacterial activity. Specifically, neospiramycin, a major transformation product of spiramycin, was found to contribute 13.15-22.89% of residual potency in spiramycin production wastewater. In addition, some unknown related substances with antimicrobial activity were indicated in the effluent. This developed approach will be effective for the management of antibacterial potency discharge from antibiotic wastewater and other waste streams. PMID:26395288

  10. A quantitative method for zoning of protected areas and its spatial ecological implications.

    PubMed

    Del Carmen Sabatini, María; Verdiell, Adriana; Rodríguez Iglesias, Ricardo M; Vidal, Marta

    2007-04-01

    Zoning is a key prescriptive tool for administration and management of protected areas. However, the lack of zoning is common for most protected areas in developing countries and, as a consequence, many protected areas are not effective in achieving the goals for which they were created. In this work, we introduce a quantitative method to expeditiously zone protected areas and we evaluate its ecological implications on hypothetical zoning cases. A real-world application is reported for the Talampaya National Park, a UNESCO World Heritage Site located in Argentina. Our method is a modification of the zoning forest model developed by Bos [Bos, J., 1993. Zoning in forest management: a quadratic assignment problem solved by simulated annealing. Journal of Environmental Management 37, 127-145.]. Main innovations involve a quadratic function of distance between land units, non-reciprocal weights for adjacent land uses (mathematically represented by a non-symmetric matrix), and the possibility of imposing a connectivity constraint. Due to its intrinsic spatial dimension, the zoning problem belongs to the NP-hard class, i.e. a solution can only be obtained in non-polynomial time [Nemhauser, G., Wolsey, L., 1988. Integer and Combinatorial Optimization. John Wiley, New York.]. For that purpose, we applied a simulated annealing heuristic implemented as a FORTRAN language routine. Our innovations were effective in achieving zoning designs more compatible with biological diversity protection. The quadratic distance term facilitated the delineation of core zones for elements of significance; the connectivity constraint minimized fragmentation; non-reciprocal land use weightings contributed to better representing management decisions, and influenced mainly the edge and shape of zones. This quantitative method can assist the zoning process within protected areas by offering many zonation scheme alternatives with minimum cost, time and effort. This ability provides a new tool to
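
    A toy version of the simulated-annealing approach can be sketched as follows; the land uses, the non-symmetric adjacency weights, and the one-dimensional strip of units are drastic simplifications of the model described above (which also uses a quadratic distance term and a connectivity constraint).

```python
import math
import random

USES = ["core", "buffer", "tourism"]
# non-reciprocal adjacency weights: W[u][v] is the penalty for use v lying
# next to use u (tourism beside a core zone is penalized more heavily than
# the reverse, mimicking the non-symmetric matrix described above)
W = {"core":    {"core": 0, "buffer": 1, "tourism": 5},
     "buffer":  {"core": 1, "buffer": 0, "tourism": 1},
     "tourism": {"core": 9, "buffer": 1, "tourism": 0}}

def cost(zoning):
    """Total adjacency penalty along a 1-D strip of land units."""
    return sum(W[a][b] for a, b in zip(zoning, zoning[1:]))

def anneal(n_units=12, steps=20000, t0=5.0, seed=1):
    """Minimize the adjacency penalty by simulated annealing."""
    rng = random.Random(seed)
    zoning = [rng.choice(USES) for _ in range(n_units)]
    current = cost(zoning)
    best, best_cost = list(zoning), current
    for step in range(steps):
        temperature = t0 * (1.0 - step / steps) + 1e-9  # linear cooling
        i = rng.randrange(n_units)
        previous = zoning[i]
        zoning[i] = rng.choice(USES)  # propose a new use for one unit
        candidate = cost(zoning)
        # accept improvements always, worse moves with Boltzmann probability
        if candidate <= current or \
                rng.random() < math.exp((current - candidate) / temperature):
            current = candidate
            if candidate < best_cost:
                best, best_cost = list(zoning), candidate
        else:
            zoning[i] = previous  # reject the move
    return best, best_cost

plan, penalty = anneal()
```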

  11. From "weight of evidence" to quantitative data integration using multicriteria decision analysis and Bayesian methods.

    PubMed

    Linkov, Igor; Massey, Olivia; Keisler, Jeff; Rusyn, Ivan; Hartung, Thomas

    2015-01-01

    "Weighing" available evidence in the process of decision-making is unavoidable, yet it is one step that routinely raises suspicions: what evidence should be used, how much does it weigh, and whose thumb may be tipping the scales? This commentary aims to evaluate the current state and future roles of various types of evidence for hazard assessment as it applies to environmental health. In its recent evaluation of the US Environmental Protection Agency's Integrated Risk Information System assessment process, the National Research Council committee singled out the term "weight of evidence" (WoE) for critique, deeming the process too vague and detractive to the practice of evaluating human health risks of chemicals. Moving the methodology away from qualitative, vague and controversial methods towards generalizable, quantitative and transparent methods for appropriately managing diverse lines of evidence is paramount for both regulatory and public acceptance of the hazard assessments. The choice of terminology notwithstanding, a number of recent Bayesian WoE-based methods, the emergence of multicriteria decision analysis for WoE applications, as well as the general principles behind the foundational concepts of WoE, show promise in how to move forward and regain trust in the data integration step of the assessments. We offer our thoughts on the current state of WoE as a whole and while we acknowledge that many WoE applications have been largely qualitative and subjective in nature, we see this as an opportunity to turn WoE towards a quantitative direction that includes Bayesian and multicriteria decision analysis. PMID:25592482

  12. Evaluation of a quantitative fit testing method for N95 filtering facepiece respirators.

    PubMed

    Janssen, Larry; Luinenburg, Michael D; Mullins, Haskell E; Danisch, Susan G; Nelson, Thomas J

    2003-01-01

    A method for performing quantitative fit tests (QNFT) with N95 filtering facepiece respirators was developed by earlier investigators. The method employs a simple clamping device to allow the penetration of submicron aerosols through N95 filter media to be measured. The measured value is subtracted from total penetration, with the assumption that the remaining penetration represents faceseal leakage. The developers have used the clamp to assess respirator performance. This study evaluated the clamp's ability to measure filter penetration and determine fit factors. In Phase 1, subjects were quantitatively fit-tested with elastomeric half-facepiece respirators using both generated and ambient aerosols. QNFT were done with each aerosol with both P100 and N95 filters without disturbing the facepiece. In Phase 2 of the study, elastomeric half-facepieces were sealed to subjects' faces to eliminate faceseal leakage. Ambient aerosol QNFT were performed with P100 and N95 filters without disturbing the facepiece. In both phases the clamp was used to measure N95 filter penetration, which was then subtracted from total penetration for the N95 QNFT. It was hypothesized that N95 fit factors corrected for filter penetration would equal the P100 fit factors. Mean corrected N95 fit factors were significantly different from the P100 fit factors in each phase of the study. In addition, there was essentially no correlation between corrected N95 fit factors and P100 fit factors. It was concluded that the clamp method should not be used to fit-test N95 filtering facepieces or otherwise assess respirator performance. PMID:12908863
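    The subtraction logic being evaluated reduces to simple arithmetic: assume any penetration not explained by the filter media is faceseal leakage, and take its reciprocal as the fit factor. The penetration values below are invented for illustration.

```python
def corrected_fit_factor(total_penetration, filter_penetration):
    # Clamp-method assumption: penetration not attributable to the filter
    # media is faceseal leakage; the fit factor is its reciprocal.
    leakage = total_penetration - filter_penetration
    if leakage <= 0.0:
        raise ValueError("filter penetration must be below total penetration")
    return 1.0 / leakage

# Hypothetical N95 test: 3% total penetration, of which 2% passes through
# the filter media itself.
ff = corrected_fit_factor(0.03, 0.02)  # 1% faceseal leakage -> fit factor ~100
```

The study's negative finding is precisely that this corrected value did not agree with (or even correlate with) P100-based fit factors.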

  13. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  14. Autoradiographic method for quantitation of radiolabeled proteins in tissues using indium-111

    SciTech Connect

    Morrell, E.M.; Tompkins, R.G.; Fischman, A.J.; Wilkinson, R.A.; Burke, J.F.; Rubin, R.H.; Strauss, H.W.; Yarmush, M.L. )

    1989-09-01

    A quantitative autoradiographic method was developed to measure 111In-labeled proteins in extravascular tissues with a spatial resolution sufficient to associate these proteins with tissue morphology. A linear relationship between measured grain density and isotope concentration was demonstrated with uniformly-labeled standard sources of epoxy-embedded gelatin containing (111In)albumin; the half-distance of spatial resolution was 0.6 micron. The technique was illustrated by measuring 24-hr accumulation of diethylenetriaminepentaacetic acid-coupled 111In-labeled human polyclonal IgG and human serum albumin (HSA) in a thigh infection model in the rat. Gamma camera images localized the infection and showed target-to-background ratios of 2.5 ± 0.3 for IgG and 1.4 ± 0.02 for human serum albumin (mean ± s.d., n = 3). Using quantitative autoradiography, significantly higher average tissue concentrations were found in the infected thighs at 4 to 4.5% of the initial plasma concentrations as compared to 0.2 to 0.3% of initial plasma concentrations in the noninfected thigh (p < 0.05); these radiolabeled proteins were not inflammatory cell associated and localized primarily within the edematous interstitial spaces of the infection.
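    The linear grain-density calibration underlying the method amounts to fitting a straight line through standards and inverting it for an unknown; the concentrations and grain densities below are invented, not the paper's data.

```python
def fit_line(xs, ys):
    # Ordinary least-squares slope and intercept.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

conc = [0.0, 1.0, 2.0, 4.0]    # standard isotope concentrations (arbitrary units)
grains = [0.1, 2.1, 4.0, 8.1]  # measured grain density for each standard

slope, intercept = fit_line(conc, grains)
sample_conc = (5.0 - intercept) / slope  # invert the line for a sample at density 5.0
```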

  15. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method

    PubMed Central

    Yang, Ganglong; Xu, Zhipeng; Lu, Wei; Li, Xiang; Sun, Chengwen; Guo, Jia; Xue, Peng; Guan, Feng

    2015-01-01

    The best way to increase the patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low grade nonmuscle invasive bladder cancer, NMIBC), and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography coupled to LTQ Orbitrap mass spectrometry. Among 3721 unique identified and annotated proteins in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer. PMID:26230496

  16. Characterization of a method for quantitating food consumption for mutation assays in Drosophila

    SciTech Connect

    Thompson, E.D.; Reeder, B.A.; Bruce, R.D. )

    1991-01-01

    Quantitation of food consumption is necessary when determining mutation responses to multiple chemical exposures in the sex-linked recessive lethal assay in Drosophila. One method proposed for quantitating food consumption by Drosophila is to measure the incorporation of 14C-leucine into the flies during the feeding period. Three sources of variation in the technique of Thompson and Reeder have been identified and characterized. First, the amount of food consumed by individual flies differed by almost 30% in a 24 hr feeding period. Second, the variability from vial to vial (each containing multiple flies) was around 15%. Finally, the amount of food consumed in identical feeding experiments performed over the course of 1 year varied nearly 2-fold. The use of chemical consumption values in place of exposure levels provided a better means of expressing the combined mutagenic response. In addition, the kinetics of food consumption over a 3 day feeding period for exposures to cyclophosphamide that produce lethality were compared to non-lethal exposures. Extensive characterization of the lethality induced by exposures to cyclophosphamide demonstrates that the lethality is most likely due to starvation, not chemical toxicity.

  17. A novel image-based quantitative method for the characterization of NETosis

    PubMed Central

    Zhao, Wenpu; Fogg, Darin K.; Kaplan, Mariana J.

    2015-01-01

    NETosis is a newly recognized mechanism of programmed neutrophil death. It is characterized by a stepwise progression of chromatin decondensation, membrane rupture, and release of bactericidal DNA-based structures called neutrophil extracellular traps (NETs). Conventional ‘suicidal’ NETosis has been described in pathogenic models of systemic autoimmune disorders. Recent in vivo studies suggest that a process of ‘vital’ NETosis also exists, in which chromatin is condensed and membrane integrity is preserved. Techniques to assess ‘suicidal’ or ‘vital’ NET formation in a specific, quantitative, rapid and semiautomated way have been lacking, hindering the characterization of this process. Here we have developed a new method to simultaneously assess both ‘suicidal’ and ‘vital’ NETosis, using high-speed multi-spectral imaging coupled to morphometric image analysis, to quantify spontaneous NET formation observed ex-vivo or stimulus-induced NET formation triggered in vitro. Use of imaging flow cytometry allows automated, quantitative and rapid analysis of subcellular morphology and texture, and introduces the potential for further investigation using NETosis as a biomarker in pre-clinical and clinical studies. PMID:26003624

  18. Quantitative Results from Shockless Compression Experiments on Solids to Multi-Megabar Pressure

    NASA Astrophysics Data System (ADS)

    Davis, Jean-Paul; Brown, Justin; Knudson, Marcus; Lemke, Raymond

    2015-03-01

    Quasi-isentropic, shockless ramp-wave experiments promise accurate equation-of-state (EOS) data in the solid phase at relatively low temperatures and multi-megabar pressures. In this range of pressure, isothermal diamond-anvil techniques have limited pressure accuracy due to reliance on theoretical EOS of calibration standards, thus accurate quasi-isentropic compression data would help immensely in constraining EOS models. Multi-megabar shockless compression experiments using the Z Machine at Sandia as a magnetic drive with stripline targets continue to be performed on a number of solids. New developments will be presented in the design and analysis of these experiments, including topics such as 2-D and magneto-hydrodynamic (MHD) effects and the use of LiF windows. Results will be presented for tantalum and/or gold metals, with comparisons to independently developed EOS. * Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  19. Parents' decision-making about the human papillomavirus vaccine for their daughters: I. Quantitative results.

    PubMed

    Krawczyk, Andrea; Knäuper, Bärbel; Gilca, Vladimir; Dubé, Eve; Perez, Samara; Joyal-Desmarais, Keven; Rosberger, Zeev

    2015-01-01

    Vaccination against the human papillomavirus (HPV) is an effective primary prevention measure for HPV-related diseases. For children and young adolescents, the uptake of the vaccine is contingent on parental consent. This study sought to identify key differences between parents who obtain (acceptors) and parents who refuse (non-acceptors) the HPV vaccine for their daughters. In the context of a free, universal, school-based HPV vaccination program in Québec, 774 parents of 9-10 year-old girls completed and returned a questionnaire by mail. The questionnaire was based on the theoretical constructs of the Health Belief Model (HBM), along with constructs from other theoretical frameworks. Of the 774 parents, 88.2% reported their daughter having received the HPV vaccine. Perceived susceptibility of daughters to HPV infection, perceived benefits of the vaccine, perceived barriers (including safety of the vaccine), and cues to action significantly distinguished between parents whose daughters had received the HPV vaccine and those whose daughters had not. Other significant factors associated with daughter vaccine uptake were parents' general vaccination attitudes, anticipated regret, adherence to other routinely recommended vaccines, social norms, and positive media influence. The results of this study identify a number of important correlates of parents' decisions to accept or refuse the HPV vaccine for their daughters. Future work may benefit from targeting such factors and incorporating other health behavior theories in the design of effective HPV vaccine uptake interventions. PMID:25692455

  1. Development and validation of a modified ultrasound-assisted extraction method and a HPLC method for the quantitative determination of two triterpenic acids in Hedyotis diffusa.

    PubMed

    Yang, Yu-Chiao; Wei, Ming-Chi; Chiu, Hui-Fen; Huang, Ting-Chia

    2013-12-01

    In the present study, the oleanolic acid (OA) and ursolic acid (UA) contents of Hedyotis diffusa and H. corymbosa were determined by a rapid, selective and accurate method combining modified ultrasound-assisted extraction (MUAE) and HPLC. Compared with traditional extraction methods, MUAE reduced the extraction time, the extraction temperature and the solvent consumption and maximized the extraction yields of OA and UA. Furthermore, the combined MUAE-HPLC method was applied to quantitate OA and UA in plant samples and exhibited good repeatability, reproducibility and stability. Mean recoveries (one extraction cycle) for OA and UA were between 91.3 and 91.7%, with RSD values less than 4.5%. The method was further applied to quantitate OA and UA in six samples of H. diffusa and five samples of H. corymbosa. The results showed that the OA and UA contents of samples from different sources differed significantly. This report is valuable for the application of H. diffusa and H. corymbosa obtained from different regions in clinical research and pharmacology. PMID:24555272

  2. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Pavlov, K.A.

    1997-01-01

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing. © 1997 American Institute of Physics.

  3. Setting health research priorities using the CHNRI method: V. Quantitative properties of human collective knowledge

    PubMed Central

    Rudan, Igor; Yoshida, Sachiyo; Wazny, Kerri; Chan, Kit Yee; Cousens, Simon

    2016-01-01

    Introduction The CHNRI method for setting health research priorities has crowdsourcing as the major component. It uses the collective opinion of a group of experts to generate, assess and prioritize between many competing health research ideas. It is difficult to compare the accuracy of human individual and collective opinions in predicting uncertain future outcomes before the outcomes are known. However, this limitation does not apply to existing knowledge, which is an important component underlying opinion. In this paper, we report several experiments to explore the quantitative properties of human collective knowledge and discuss their relevance to the CHNRI method. Methods We conducted a series of experiments in groups of about 160 (range: 122–175) undergraduate Year 2 medical students to compare their collective knowledge to their individual knowledge. We asked them to answer 10 questions on each of the following: (i) an area in which they have a degree of expertise (undergraduate Year 1 medical curriculum); (ii) an area in which they likely have some knowledge (general knowledge); and (iii) an area in which they are not expected to have any knowledge (astronomy). We also presented them with 20 pairs of well–known celebrities and asked them to identify the older person of the pair. In all these experiments our goal was to examine how the collective answer compares to the distribution of students’ individual answers. Results When answering the questions in their own area of expertise, the collective answer (the median) was in the top 20.83% of the most accurate individual responses; in general knowledge, it was in the top 11.93%; and in an area with no expertise, the group answer was in the top 7.02%. However, the collective answer based on mean values fared much worse, ranging from top 75.60% to top 95.91%. Also, when confronted with guessing the older of the two celebrities, the collective response was correct in 18/20 cases (90%), while the 8 most
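    The aggregation step behind these comparisons is easy to reproduce: take the median (robust) or the mean (outlier-sensitive) of the individual answers and ask what fraction of individuals did at least as well. The answer set below is synthetic, not the study's data.

```python
import statistics

truth = 100.0
# Synthetic individual estimates: most are close to the truth, two are wild guesses.
answers = [92, 95, 96, 98, 99, 100, 101, 101, 101,
           103, 105, 107, 110, 112, 115, 400, 500]

median_err = abs(statistics.median(answers) - truth)  # median = 101 -> error 1
mean_err = abs(statistics.fmean(answers) - truth)     # mean dragged to ~143 by outliers

# Fraction of individuals at least as accurate as the collective (median) answer:
top_fraction = sum(abs(a - truth) <= median_err for a in answers) / len(answers)
```

Here the median answer lands in the top ~29% of individual accuracies while the mean is far off, mirroring the paper's contrast between median-based and mean-based collective answers.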

  4. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    NASA Astrophysics Data System (ADS)

    Ryan, C. G.; Laird, J. S.; Fisher, L. A.; Kirkham, R.; Moorhead, G. F.

    2015-11-01

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.

  5. Combination of an enzymatic method and HPLC for the quantitation of cholesterol in cultured cells.

    PubMed

    Contreras, J A; Castro, M; Bocos, C; Herrera, E; Lasunción, M A

    1992-06-01

    The study of the cellular events that lead to foam cell formation requires the development of fast, accurate, and sensitive methods to quantify cholesterol in cultured cells. Here we describe a procedure that allows the rapid determination of free and total cholesterol in a reduced number of cells, which makes it very suitable for cholesterol determination in cell cultures. The method consists of the enzymatic conversion of cholesterol to cholest-4-ene-3-one by cholesterol oxidase followed by the analysis of the sample by high performance liquid chromatography (HPLC) to detect this oxidized product. Due to the relatively high wavelength at which cholest-4-ene-3-one has its maximum absorption (240 nm), other cellular components do not interfere with the chromatographic procedure and prior lipid extraction is not required. Moreover, the duration of each chromatogram is about 3 min, contributing to the speed of the method. All the cholesteryl esters used (oleate, palmitate, stearate and linoleate) were quantitatively hydrolyzed by incubation with cholesterol esterase; this was observed with both pure standards and cell homogenates. Sensitivity is sufficient to allow the determination of free and total cholesterol in fewer than 5×10³ cells. We have applied this method to human monocyte-derived macrophages, and the values obtained for free and total cholesterol are in close agreement with published data. PMID:1512516

  6. Quantitative calcium resistivity based method for accurate and scalable water vapor transmission rate measurement.

    PubMed

    Reese, Matthew O; Dameron, Arrelaine A; Kempe, Michael D

    2011-08-01

    The development of flexible organic light emitting diode displays and flexible thin film photovoltaic devices is dependent on the use of flexible, low-cost, optically transparent and durable barriers to moisture and/or oxygen. It is estimated that this will require high moisture barriers with water vapor transmission rates (WVTR) between 10⁻⁴ and 10⁻⁶ g/m²/day. Thus there is a need to develop a relatively fast, low-cost, and quantitative method to evaluate such low permeation rates. Here, we demonstrate a method where the resistance changes of patterned Ca films, upon reaction with moisture, enable one to calculate a WVTR between 10 and 10⁻⁶ g/m²/day or better. Samples are configured with variable aperture size such that the sensitivity and/or measurement time of the experiment can be controlled. The samples are connected to a data acquisition system by means of individual signal cables permitting samples to be tested under a variety of conditions in multiple environmental chambers. An edge card connector is used to connect samples to the measurement wires enabling easy switching of samples in and out of test. This measurement method can be conducted with as little as 1 h of labor time per sample. Furthermore, multiple samples can be measured in parallel, making this an inexpensive and high volume method for measuring high moisture barriers. PMID:21895269
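    A WVTR can be derived from the slope of the Ca trace's conductance using the standard Ca-test relation, WVTR = -n·(M_H2O/M_Ca)·ρ_Ca·δ_Ca·(l/w)·d(1/R)/dt. In the sketch below the material constants are common literature values, while the conductance slope and trace geometry are invented, so treat it as illustrative rather than as the authors' exact calculation.

```python
# Ca corrosion test: Ca + 2 H2O -> Ca(OH)2 + H2, so n = 2 water molecules
# are consumed per calcium atom.
M_H2O, M_CA = 18.015, 40.078   # molar masses, g/mol
RHO_CA = 1.55                  # Ca density, g/cm^3
DELTA_CA = 3.4e-6              # Ca resistivity, ohm*cm (literature value)
N_H2O_PER_CA = 2

def wvtr_g_per_m2_day(dG_dt, length_over_width):
    """dG_dt: slope of the conductance 1/R in 1/(ohm*s), negative as the Ca
    film corrodes; length_over_width: aspect ratio of the patterned trace."""
    flux = (-N_H2O_PER_CA * (M_H2O / M_CA) * RHO_CA * DELTA_CA
            * length_over_width * dG_dt)        # g/(cm^2 * s)
    return flux * 1.0e4 * 86400.0               # convert to g/(m^2 * day)

rate = wvtr_g_per_m2_day(dG_dt=-1.0e-9, length_over_width=10.0)
```

The example slope yields a rate of roughly 4×10⁻⁵ g/m²/day, inside the high-barrier range quoted above.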

  7. Spectrophotometric Method for Quantitative Determination of Cefixime in Bulk and Pharmaceutical Preparation Using Ferroin Complex

    NASA Astrophysics Data System (ADS)

    Naeem Khan, M.; Qayum, A.; Ur Rehman, U.; Gulab, H.; Idrees, M.

    2015-09-01

    A method was developed for the quantitative determination of cefixime in bulk and pharmaceutical preparations using the ferroin complex. The method is based on the oxidation of cefixime with Fe(III) in acidic medium. The formed Fe(II) reacts with 1,10-phenanthroline, and the ferroin complex is measured spectrophotometrically at 510 nm against a reagent blank. Beer's law was obeyed in the concentration range 0.2-10 μg/ml with a correlation coefficient of 0.993. The molar absorptivity was calculated to be 1.375×10⁵ L·mol⁻¹·cm⁻¹. The limit of detection (LOD) and limit of quantification (LOQ) were found to be 0.030 and 0.101 μg/ml, respectively. The proposed method showed good reproducibility, with a relative standard deviation of 5.28% (n = 6). The developed method was validated statistically by performing recovery studies and was successfully applied for the determination of cefixime in bulk powder and pharmaceutical formulations without interference from common excipients. Percent recoveries ranged from 98.00 to 102.05% for the pure form and from 97.83 to 102.50% for pharmaceutical preparations.
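    Working back from the reported molar absorptivity is plain Beer-Lambert arithmetic (A = εcl). In this sketch the 1 cm path length and the approximate cefixime molar mass are assumptions, not figures from the abstract.

```python
EPSILON = 1.375e5     # molar absorptivity, L / (mol * cm), as reported
PATH_CM = 1.0         # assumed cuvette path length
M_CEFIXIME = 453.4    # g/mol, approximate molar mass of cefixime

def concentration_ug_per_ml(absorbance):
    # Beer-Lambert law: A = epsilon * c * l  =>  c = A / (epsilon * l)
    mol_per_l = absorbance / (EPSILON * PATH_CM)
    return mol_per_l * M_CEFIXIME * 1000.0   # g/L -> ug/mL is a factor of 1000

c = concentration_ug_per_ml(0.5)  # a mid-scale absorbance reading
```

An absorbance of 0.5 corresponds to about 1.65 μg/ml under these assumptions, comfortably inside the reported 0.2-10 μg/ml linear range.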

  8. A Simple, Quantitative Method Using Alginate Gel to Determine Rat Colonic Tumor Volume In Vivo

    PubMed Central

    Irving, Amy A; Young, Lindsay B; Pleiman, Jennifer K; Konrath, Michael J; Marzella, Blake; Nonte, Michael; Cacciatore, Justin; Ford, Madeline R; Clipson, Linda; Amos-Landgraf, James M; Dove, William F

    2014-01-01

    Many studies of the response of colonic tumors to therapeutics use tumor multiplicity as the endpoint to determine the effectiveness of the agent. These studies can be greatly enhanced by accurate measurements of tumor volume. Here we present a quantitative method to easily and accurately determine colonic tumor volume. This approach uses a biocompatible alginate to create a negative mold of a tumor-bearing colon; this mold is then used to make positive casts of dental stone that replicate the shape of each original tumor. The weight of the dental stone cast correlates highly with the weight of the dissected tumors. After refinement of the technique, overall error in tumor volume was 16.9% ± 7.9% and includes error from both the alginate and dental stone procedures. Because this technique is limited to molding of tumors in the colon, we utilized the ApcPirc/+ rat, which has a propensity for developing colonic tumors that reflect the location of the majority of human intestinal tumors. We have successfully used the described method to determine tumor volumes ranging from 4 to 196 mm³. Alginate molding combined with dental stone casting is a facile method for determining tumor volume in vivo without costly equipment or knowledge of analytic software. This broadly accessible method creates the opportunity to objectively study colonic tumors over time in living animals in conjunction with other experiments and without transferring animals from the facility where they are maintained. PMID:24674588
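    Converting a cast's weight into a tumor volume needs only the set density of the dental stone. The density below is an assumed round value for illustration; any real use would calibrate it for the specific stone product.

```python
STONE_DENSITY_G_CM3 = 2.7  # assumed density of the set dental stone

def cast_volume_mm3(cast_mass_g):
    # mass / density gives cm^3; 1 cm^3 = 1000 mm^3
    return cast_mass_g / STONE_DENSITY_G_CM3 * 1000.0

v = cast_volume_mm3(0.27)  # a 0.27 g cast corresponds to 100 mm^3 here
```

Under this assumed density, the 4-196 mm³ range reported above would correspond to casts of roughly 0.01-0.53 g.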

  9. Quantitative test method for evaluation of anti-fingerprint property of coated surfaces

    NASA Astrophysics Data System (ADS)

    Wu, Linda Y. L.; Ngian, S. K.; Chen, Z.; Xuan, D. T. T.

    2011-01-01

    An artificial fingerprint liquid is formulated from artificial sweat, hydroxyl-terminated polydimethylsiloxane and a solvent for direct determination of the anti-fingerprint property of a coated surface. A range of smooth and rough surfaces with different anti-fingerprint (AF) properties were fabricated by sol-gel technology, on which the AF liquid contact angles, artificial fingerprints and real human fingerprints (HF) were verified and correlated. It was shown that a surface with an AF contact angle above 87° is fingerprint-free. This provides an objective and quantitative test method to determine the anti-fingerprint property of coated surfaces. It is also concluded that AF properties can be achieved on smooth and optically clear surfaces. Deep porous structures are more favorable than bumpy structures for oleophobic and AF properties.

  10. A quantitative and qualitative method to control chemotherapeutic preparations by Fourier transform infrared-ultraviolet spectrophotometry.

    PubMed

    Dziopa, Florian; Galy, Guillaume; Bauler, Stephanie; Vincent, Benoit; Crochon, Sarah; Tall, Mamadou Lamine; Pirot, Fabrice; Pivot, Christine

    2013-06-01

    Chemotherapy preparation in hospitals includes a reconstitution step in which manufactured drugs are adapted to each patient's dosage. The administration of highly iatrogenic drugs raises the question of patients' safety and treatment efficiency. In order to reduce administration errors due to faulty preparations, we introduced a new qualitative and quantitative routine control based on Fourier Transform Infrared (FTIR) and UV-Visible spectrophotometry. This automated method enabled fast and specific control of 14 anticancer drugs. A 1.2 mL sample was used to assay and identify each preparation in less than 90 sec. Over a two-year period, 9370 controlled infusion bags showed a 1.49% nonconformity rate, based on a ±15% tolerance from the theoretical concentration and a minimum identification matching factor of 96%. This study evaluated the reliability of the control process, as well as its accordance with chemotherapy dispensing requirements. Thus, corrective measures were defined to improve the control process. PMID:23014899

  11. A method to optimize selection on multiple identified quantitative trait loci

    PubMed Central

    Chakraborty, Reena; Moreau, Laurence; Dekkers, Jack CM

    2002-01-01

    A mathematical approach was developed to model and optimize selection on multiple known quantitative trait loci (QTL) and polygenic estimated breeding values in order to maximize a weighted sum of responses to selection over multiple generations. The model allows for linkage between QTL with multiple alleles and arbitrary genetic effects, including dominance, epistasis, and gametic imprinting. Gametic phase disequilibrium between the QTL and between the QTL and polygenes is modeled but polygenic variance is assumed constant. Breeding programs with discrete generations, differential selection of males and females and random mating of selected parents are modeled. Polygenic EBV obtained from best linear unbiased prediction models can be accommodated. The problem was formulated as a multiple-stage optimal control problem and an iterative approach was developed for its solution. The method can be used to develop and evaluate optimal strategies for selection on multiple QTL for a wide range of situations and genetic models. PMID:12081805

  12. Methods for quantitative evaluation of dynamics of repair proteins within irradiated cells

    NASA Astrophysics Data System (ADS)

    Hable, V.; Dollinger, G.; Greubel, C.; Hauptner, A.; Krücken, R.; Dietzel, S.; Cremer, T.; Drexler, G. A.; Friedl, A. A.; Löwe, R.

    2006-04-01

    Living HeLa cells are irradiated with precisely targeted single 100 MeV oxygen ions by the superconducting ion microprobe SNAKE, the Superconducting Nanoscope for Applied Nuclear (=Kern-) Physics Experiments, at the Munich 14 MV tandem accelerator. Various proteins, which are involved directly or indirectly in repair processes, accumulate as clusters (so-called foci) at DNA double-strand breaks (DSBs) induced by the ions. The spatiotemporal dynamics of these foci formed by the phosphorylated histone γ-H2AX are studied. For this purpose, cells are irradiated in line patterns. The γ-H2AX is made visible under the fluorescence microscope using immunofluorescence techniques. Quantitative analysis methods are developed to evaluate the microscopic image data in order to analyze the movement of the foci and their changing size.

  13. Quantitative radiochemical method for determination of major sources of natural radioactivity in ores and minerals

    USGS Publications Warehouse

    Rosholt, J.N., Jr.

    1954-01-01

    When an ore sample contains radioactivity other than that attributable to the uranium series in equilibrium, a quantitative analysis of the other emitters must be made in order to determine the source of this activity. Thorium-232, radon-222, and lead-210 have been determined by isolation and subsequent activity analysis of some of their short-lived daughter products. The sulfides of bismuth and polonium are precipitated out of solutions of thorium or uranium ores, and the α-particle activity of polonium-214, polonium-212, and polonium-210 is determined by scintillation-counting techniques. Polonium-214 activity is used to determine radon-222, polonium-212 activity for thorium-232, and polonium-210 for lead-210. The development of these methods of radiochemical analysis will facilitate the rapid determination of some of the major sources of natural radioactivity.

  14. Rapid and Inexpensive Screening of Genomic Copy Number Variations Using a Novel Quantitative Fluorescent PCR Method

    PubMed Central

    Han, Joan C.; Elsea, Sarah H.; Pena, Heloísa B.; Pena, Sérgio Danilo Junho

    2013-01-01

    Detection of human microdeletion and microduplication syndromes poses a significant burden on public healthcare systems in developing countries. With genome-wide diagnostic assays frequently inaccessible, targeted low-cost PCR-based approaches are preferred. However, their reproducibility depends on equally efficient amplification using a number of target and control primers. To address this, the recently described technique called Microdeletion/Microduplication Quantitative Fluorescent PCR (MQF-PCR) was shown to reliably detect four human syndromes by quantifying DNA amplification in an internally controlled PCR reaction. Here, we confirm its utility in the detection of eight human microdeletion syndromes, including the more common WAGR, Smith-Magenis, and Potocki-Lupski syndromes, with 100% sensitivity and 100% specificity. We present the selection, design, and performance evaluation of detection primers using a variety of approaches. We conclude that MQF-PCR is an easily adaptable method for the detection of human pathological chromosomal aberrations. PMID:24288428
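
    The reported 100% sensitivity and 100% specificity follow directly from validation counts. A minimal sketch of the two metrics, using hypothetical confusion-matrix numbers (not the paper's raw data):

```python
def sensitivity(true_pos, false_neg):
    """Fraction of affected samples correctly flagged by the assay."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of unaffected samples correctly cleared by the assay."""
    return true_neg / (true_neg + false_pos)

# Hypothetical validation counts: every carrier detected, no false positives,
# which is what 100% sensitivity and 100% specificity correspond to.
sens = sensitivity(true_pos=24, false_neg=0)
spec = specificity(true_neg=40, false_pos=0)
```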

  15. A quantitative method for defining high-arched palate using the Tcof1(+/-) mutant mouse as a model.

    PubMed

    Conley, Zachary R; Hague, Molly; Kurosaka, Hiroshi; Dixon, Jill; Dixon, Michael J; Trainor, Paul A

    2016-07-15

    The palate functions as the roof of the mouth in mammals, separating the oral and nasal cavities. Its complex embryonic development and assembly poses unique susceptibilities to intrinsic and extrinsic disruptions. Such disruptions may cause failure of the developing palatal shelves to fuse along the midline, resulting in a cleft. In other cases the palate may fuse at an arch, resulting in a vaulted oral cavity, termed high-arched palate. There are many models available for studying the pathogenesis of cleft palate but a relative paucity for high-arched palate. One condition exhibiting either cleft palate or high-arched palate is Treacher Collins syndrome, a congenital disorder characterized by numerous craniofacial anomalies. We quantitatively analyzed palatal perturbations in the Tcof1(+/-) mouse model of Treacher Collins syndrome, which phenocopies the condition in humans. We discovered that 46% of Tcof1(+/-) mutant embryos and newborn pups exhibit either soft clefts or full clefts. In addition, 17% of Tcof1(+/-) mutants were found to exhibit high-arched palate, defined as two sigma above the corresponding wild-type population mean for height and angular-based arch measurements. Furthermore, palatal shelf length and shelf width were decreased in all Tcof1(+/-) mutant embryos and pups compared to controls. Interestingly, these phenotypes were subsequently ameliorated through genetic inhibition of p53. The results of our study therefore provide a simple, reproducible and quantitative method for investigating models of high-arched palate. PMID:26772999
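
    The operational definition above ("two sigma above the corresponding wild-type population mean") amounts to a simple threshold rule. A sketch of that rule; the measurement values are hypothetical, not the study's data:

```python
import statistics

def high_arch_threshold(wild_type_values):
    """Threshold: two sample standard deviations above the wild-type mean."""
    mean = statistics.mean(wild_type_values)
    sigma = statistics.stdev(wild_type_values)
    return mean + 2 * sigma

def is_high_arched(measurement, wild_type_values):
    """Classify a palate measurement (height or arch angle) against wild-type."""
    return measurement > high_arch_threshold(wild_type_values)

# Hypothetical wild-type palatal height measurements (arbitrary units)
wt_heights = [1.00, 1.05, 0.95, 1.02, 0.98, 1.01, 0.99, 1.04]
```

The same rule would be applied separately to the height-based and angle-based arch measurements.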

  16. Comparison of two methods for obtaining quantitative mass concentrations from aerosol time-of-flight mass spectrometry measurements.

    PubMed

    Qin, Xueying; Bhave, Prakash V; Prather, Kimberly A

    2006-09-01

    Aerosol time-of-flight mass spectrometry (ATOFMS) measurements provide continuous information on the aerodynamic size and chemical composition of individual particles. In this work, we compare two approaches for converting unscaled ATOFMS measurements into quantitative particle mass concentrations using (1) reference mass concentrations from a co-located micro-orifice uniform deposit impactor (MOUDI) with an accurate estimate of instrument busy time and (2) reference number concentrations from a co-located aerodynamic particle sizer (APS). Aerodynamic-diameter-dependent scaling factors are used for both methods to account for particle transmission efficiencies through the ATOFMS inlet. Scaling with APS data retains the high-resolution characteristics of the ambient aerosol because the scaling functions are specific for each hourly time period and account for a maximum in the ATOFMS transmission efficiency curve for larger-sized particles. Scaled mass concentrations obtained from both methods are compared with co-located PM(2.5) measurements for evaluation purposes. When compared against mass concentrations from a beta attenuation monitor (BAM), the MOUDI-scaled ATOFMS mass concentrations show correlations of 0.79 at Fresno, and the APS-scaled results show correlations of 0.91 at Angiola. Applying composition-dependent density corrections leads to a slope of nearly 1 with 0 intercept between the APS-scaled absolute mass concentration values and BAM mass measurements. This paper provides details on the methodologies used to convert ATOFMS data into continuous, quantitative, and size-resolved mass concentrations that will ultimately be used to provide a quantitative estimate of the number and mass concentrations of particles from different sources. PMID:16944899
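
    The APS-based scaling described above reduces to computing, for each aerodynamic-diameter bin and time period, the ratio of the reference concentration to the unscaled ATOFMS counts. A minimal sketch with hypothetical per-bin numbers (the real method uses hourly, diameter-dependent scaling functions):

```python
# Hypothetical hourly data: unscaled ATOFMS particle counts and APS reference
# number concentrations (particles/cm^3) per aerodynamic-diameter bin.
atofms_counts = {"0.5-1.0um": 120.0, "1.0-1.8um": 300.0, "1.8-3.2um": 90.0}
aps_reference = {"0.5-1.0um": 2400.0, "1.0-1.8um": 1500.0, "1.8-3.2um": 180.0}

def scaling_factors(counts, reference):
    """Diameter-dependent factors correcting ATOFMS inlet transmission losses."""
    return {d: reference[d] / counts[d] for d in counts}

def scale_counts(counts, factors):
    """Apply the per-bin factors to recover quantitative number concentrations."""
    return {d: counts[d] * factors[d] for d in counts}

factors = scaling_factors(atofms_counts, aps_reference)
scaled = scale_counts(atofms_counts, factors)
```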

  17. A convenient method for the quantitative determination of elemental sulfur in coal by HPLC analysis of perchloroethylene extracts

    USGS Publications Warehouse

    Buchanan, D.H.; Coombs, K.J.; Murphy, P.M.; Chaven, C.

    1993-01-01

    A convenient method for the quantitative determination of elemental sulfur in coal is described. Elemental sulfur is extracted from the coal with hot perchloroethylene (PCE) (tetrachloroethene, C2Cl4) and quantitatively determined by HPLC analysis on a C18 reverse-phase column using UV detection. Calibration solutions were prepared from sublimed sulfur. Results of quantitative HPLC analyses agreed with those of a chemical/spectroscopic analysis. The HPLC method was found to be linear over the concentration range of 6 × 10⁻⁴ to 2 × 10⁻² g/L. The lower detection limit was 4 × 10⁻⁴ g/L, which for a coal sample of 20 g is equivalent to 0.0006% by weight of coal. Since elemental sulfur is known to react slowly with hydrocarbons at the temperature of boiling PCE, standard solutions of sulfur in PCE were heated with coals from the Argonne Premium Coal Sample program. Pseudo-first-order uptake of sulfur by the coals was observed over several weeks of heating. For the Illinois No. 6 premium coal, the rate constant for sulfur uptake was 9.7 × 10⁻⁷ s⁻¹, too small for retrograde reactions between solubilized sulfur and coal to cause a significant loss in elemental sulfur isolated during the analytical extraction. No elemental sulfur was produced when the following pure compounds were heated to reflux in PCE for up to 1 week: benzyl sulfide, octyl sulfide, thiane, thiophene, benzothiophene, dibenzothiophene, sulfuric acid, or ferrous sulfate. A slurry of mineral pyrite in PCE contained elemental sulfur which increased in concentration with heating time. © 1993 American Chemical Society.
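
    As a rough check of the retrograde-reaction argument, the quoted pseudo-first-order rate constant implies only a percent-level loss of dissolved sulfur during extraction. A sketch assuming a hypothetical four-hour hot-PCE extraction (the extraction time is an illustrative assumption, not a figure from the paper):

```python
import math

k = 9.7e-7               # s^-1, pseudo-first-order rate constant (Illinois No. 6)
t_extraction = 4 * 3600  # s, an assumed few-hour hot-PCE extraction

# First-order decay: fraction of dissolved elemental sulfur remaining after t
fraction_remaining = math.exp(-k * t_extraction)
fraction_lost = 1 - fraction_remaining
print(f"fraction lost during extraction: {fraction_lost:.3%}")
```

A loss of roughly one percent over several hours is consistent with the authors' conclusion that retrograde uptake is negligible on the analytical timescale, while weeks of heating would consume a substantial fraction.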

  18. Development of an Analytical Method for Quantitative Determination of Atmospheric Particles By Laap-TOF Instrument

    NASA Astrophysics Data System (ADS)

    Gemayel, R.; Temime-Roussel, B.; Hellebust, S.; Gligorovski, S.; Wortham, H.

    2014-12-01

    A comprehensive understanding of the chemical composition of atmospheric particles is of paramount importance in order to understand their impact on health and climate. Hence, there is an imperative need for the development of appropriate analytical methods for on-line, time-resolved measurements of atmospheric particles. Laser Ablation Aerosol Particle Time of Flight Mass Spectrometry (LAAP-TOF-MS) allows a real-time qualitative analysis of nanoparticles of differing composition and size. LAAP-TOF-MS is aimed at on-line and continuous measurements of atmospheric particles with a fast time resolution on the order of milliseconds. This system uses a 193 nm excimer laser for particle ablation/ionization and a 403 nm scattering laser for sizing (and single-particle detection/triggering). The charged ions are then extracted into a bi-polar time-of-flight mass spectrometer. Here we present an analytical methodology for quantitative determination of the composition and size distribution of particles by the LAAP-TOF instrument. We developed and validated an analytical methodology for this high-time-resolution instrument by comparison with conventional, lower-time-resolution analysis systems (electron microscopy, optical counters…), with the final aim of rendering the methodology quantitative. This was performed with the aid of other instruments for on-line and off-line measurement, such as a Scanning Mobility Particle Sizer and electron microscopy. Validation of the analytical method was performed under laboratory conditions by detection and identification of the targeted main particle types, such as SiO2, CeO2, and TiO2.

  19. 4D Seismic Monitoring at the Ketzin Pilot Site during five years of storage - Results and Quantitative Assessment

    NASA Astrophysics Data System (ADS)

    Lüth, Stefan; Ivanova, Alexandra; Ivandic, Monika; Götz, Julia

    2015-04-01

    The Ketzin pilot site for geological CO2 storage has been operative between June 2008 and August 2013. In this period, 67 kt of CO2 were injected (Martens et al., this conference). Repeated 3D seismic monitoring surveys were performed before and during CO2 injection. A third repeat survey, providing data from the post-injection phase, is currently being prepared for the autumn of 2015. The large-scale 3D surface seismic measurements have been complemented by other geophysical and geochemical monitoring methods, among which are high-resolution seismic surface-downhole observations. These observations have concentrated on the reservoir area in the vicinity of the injection well and provide high-resolution images as well as data for petrophysical quantification of the CO2 distribution in the reservoir. The Ketzin pilot site is a saline aquifer site in an onshore environment, which poses specific challenges for reliable monitoring of the injected CO2. Although much effort was made to ensure acquisition conditions as identical as possible, a high degree of repeatability noise was observed, mainly due to varying weather conditions and to variations in the acquisition geometries for logistical reasons. Nevertheless, time-lapse processing succeeded in generating 3D time-lapse data sets which could be interpreted in terms of CO2-storage-related amplitude variations in the depth range of the storage reservoir. The time-lapse seismic data, pulsed-neutron-gamma logging results (saturation), and petrophysical core measurements were interpreted together in order to estimate the amount of injected carbon dioxide imaged by the seismic repeat data. For the first repeat survey, the mass estimate summed to 20.5 kt, approximately 7% less than what had been injected by then. For the second repeat survey, the mass estimate summed to approximately 10-15% less than what had been injected. The deviations may be explained by several factors.

  20. Transconvolution and the virtual positron emission tomograph-A new method for cross calibration in quantitative PET/CT imaging

    SciTech Connect

    Prenosil, George A.; Weitzel, Thilo; Hentschel, Michael; Klaeser, Bernd; Krause, Thomas

    2013-06-15

    Purpose: Positron emission tomography (PET)/computed tomography (CT) measurements on small lesions are impaired by the partial volume effect, which is intrinsically tied to the point spread function of the actual imaging system, including the reconstruction algorithms. The variability resulting from different point spread functions hinders the assessment of quantitative measurements in clinical routine and especially degrades comparability within multicenter trials. To improve quantitative comparability there is a need for methods to match different PET/CT systems through elimination of this systemic variability. Consequently, a new method was developed and tested that transforms the image of an object as produced by one tomograph to another image of the same object as it would have been seen by a different tomograph. The proposed new method, termed Transconvolution, compensates for differing imaging properties of different tomographs and particularly aims at quantitative comparability of PET/CT in the context of multicenter trials. Methods: To solve the problem of image normalization, the theory of Transconvolution was mathematically established together with new methods to handle point spread functions of different PET/CT systems. Knowing the point spread functions of two different imaging systems allows determining a Transconvolution function to convert one image into the other. This function is calculated by convolving one point spread function with the inverse of the other point spread function which, when adhering to certain boundary conditions such as the use of linear acquisition and image reconstruction methods, is a numerically accessible operation. For reliable measurement of such point spread functions characterizing different PET/CT systems, a dedicated solid-state phantom incorporating ⁶⁸Ge/⁶⁸Ga filled spheres was developed. To iteratively determine and represent such point spread functions, exponential density functions in combination
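
    The core idea, convolving one point spread function with the inverse of the other, has a closed form in the idealized case of Gaussian PSFs: the transconvolution kernel is itself a Gaussian whose variance is the difference of the two variances. A minimal 1-D sketch under that Gaussian assumption (not the paper's measured PSFs or phantom data):

```python
import math

def gaussian_kernel(sigma, radius):
    """Discrete, unit-sum 1-D Gaussian kernel sampled at integer offsets."""
    vals = [math.exp(-0.5 * (x / sigma) ** 2) for x in range(-radius, radius + 1)]
    total = sum(vals)
    return [v / total for v in vals]

def convolve(a, b):
    """Full discrete convolution of two sequences."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

# Tomograph A resolves better (sigma_a) than tomograph B (sigma_b). For
# Gaussian PSFs, the transconvolution kernel that maps A's images onto B's
# has sigma_t = sqrt(sigma_b^2 - sigma_a^2).
sigma_a, sigma_b = 2.0, 3.0
sigma_t = math.sqrt(sigma_b ** 2 - sigma_a ** 2)

psf_a = gaussian_kernel(sigma_a, radius=15)
transferred = convolve(psf_a, gaussian_kernel(sigma_t, radius=15))
psf_b = gaussian_kernel(sigma_b, radius=30)  # what tomograph B would measure
```

For measured, non-Gaussian PSFs the inverse is computed numerically (e.g. by deconvolution in the frequency domain), which is where the paper's boundary conditions on linearity come in.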

  1. A quantitative method for evaluating numerical simulation accuracy of time-transient Lamb wave propagation with its applications to selecting appropriate element size and time step.

    PubMed

    Wan, Xiang; Xu, Guanghua; Zhang, Qing; Tse, Peter W; Tan, Haihui

    2016-01-01

    Lamb wave technique has been widely used in non-destructive evaluation (NDE) and structural health monitoring (SHM). However, due to the multi-mode characteristics and dispersive nature, Lamb wave propagation behavior is much more complex than that of bulk waves. Numerous numerical simulations on Lamb wave propagation have been conducted to study its physical principles. However, few quantitative studies on evaluating the accuracy of these numerical simulations have been reported. In this paper, a method based on cross correlation analysis for quantitatively evaluating the simulation accuracy of time-transient Lamb wave propagation is proposed. Two kinds of error, affecting the position and shape accuracies respectively, are first identified. Consequently, two quantitative indices, i.e., the GVE (group velocity error) and the MACCC (maximum absolute value of cross correlation coefficient), derived from cross correlation analysis between a simulated signal and a reference waveform, are proposed to assess the position and shape errors of the simulated signal. In this way, the simulation accuracy in terms of position and shape is quantitatively evaluated. In order to apply this proposed method to select appropriate element size and time step, a specialized 2D-FEM program combined with the proposed method is developed. Then, the proper element size considering different element types and the proper time step considering different time integration schemes are selected. These results prove that the proposed method is feasible and effective, and can be used as an efficient tool for quantitatively evaluating and verifying the simulation accuracy of time-transient Lamb wave propagation. PMID:26315506
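
    The two indices can be sketched directly from their definitions. This is a simplified reimplementation (whole-signal normalization, brute-force lag search), not the authors' code, and the signals below are toy waveforms:

```python
def group_velocity_error(v_simulated, v_theoretical):
    """GVE: relative error of the simulated group velocity."""
    return abs(v_simulated - v_theoretical) / v_theoretical

def maccc(sim, ref):
    """Maximum absolute value of the cross correlation coefficient between a
    simulated signal and a reference waveform, plus the lag where it occurs."""
    norm = (sum(s * s for s in sim) ** 0.5) * (sum(r * r for r in ref) ** 0.5)
    best, best_lag = 0.0, 0
    for lag in range(-(len(ref) - 1), len(sim)):
        c = sum(sim[lag + j] * ref[j]
                for j in range(len(ref)) if 0 <= lag + j < len(sim)) / norm
        if abs(c) > abs(best):
            best, best_lag = c, lag
    return abs(best), best_lag

ref = [0.0, 1.0, 2.0, 1.0, 0.0]
sim = [0.0, 0.0, 0.0, 0.0, 1.0, 2.0, 1.0, 0.0]  # same waveform, delayed
score, lag = maccc(sim, ref)
```

A MACCC near 1 indicates the simulated waveform preserves the reference shape; the best lag, combined with propagation distance, feeds the group-velocity comparison behind the GVE.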

  2. A systematic study on the influencing parameters and improvement of quantitative analysis of multi-component with single marker method using notoginseng as research subject.

    PubMed

    Wang, Chao-Qun; Jia, Xiu-Hong; Zhu, Shu; Komatsu, Katsuko; Wang, Xuan; Cai, Shao-Qing

    2015-03-01

    A new quantitative analysis of multi-component with single marker (QAMS) method for 11 saponins (ginsenosides Rg1, Rb1, Rg2, Rh1, Rf, Re and Rd; notoginsenosides R1, R4, Fa and K) in notoginseng was established, in which 6 of these saponins were individually used as internal reference substances to investigate the influences of chemical structure, concentrations of quantitative components, and purities of the standard substances on the accuracy of the QAMS method. The results showed that the concentration of the analyte in the sample solution was the major influencing parameter, whereas the other parameters had minimal influence on the accuracy of the QAMS method. A new method for calculating the relative correction factors by linear regression was established (linear regression method), which was demonstrated to decrease the standard-method differences of the QAMS method from 1.20%±0.02% - 23.29%±3.23% to 0.10%±0.09% - 8.84%±2.85% in comparison with the previous method. The differences between the external standard method and the QAMS method using relative correction factors calculated by the linear regression method were below 5% in the quantitative determination of Rg1, Re, R1, Rd and Fa in 24 notoginseng samples and of Rb1 in 21 notoginseng samples. And the differences were mostly below 10% in the quantitative determination of Rf, Rg2, R4 and N-K (the differences for these 4 constituents were bigger because their contents were lower) in all the 24 notoginseng samples. The results indicated that the contents assayed by the new QAMS method could be considered as accurate as those assayed by the external standard method. In addition, a method for determining applicable concentration ranges of the quantitative components assayed by the QAMS method was established for the first time, which could ensure its high accuracy and could be applied to QAMS methods of other TCMs. The present study demonstrated the practicability of the application of the QAMS method for the quantitative analysis of multi
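
    The linear-regression route to a relative correction factor can be sketched as the ratio of calibration slopes: the analyte is then quantified from the reference compound's calibration alone. The calibration numbers below are hypothetical, not the paper's data:

```python
def linreg_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

# Hypothetical calibration series: concentrations (mg/mL) vs. HPLC peak areas
conc = [0.05, 0.10, 0.20, 0.40]
area_ref = [10.0, 20.0, 40.0, 80.0]       # internal reference substance (e.g. Rg1)
area_analyte = [15.0, 30.0, 60.0, 120.0]  # analyte (e.g. Rb1)

k_ref = linreg_slope(conc, area_ref)
k_analyte = linreg_slope(conc, area_analyte)
rcf = k_analyte / k_ref  # relative correction factor from regression slopes

# Quantify the analyte in a sample using only the reference calibration:
sample_area = 45.0
c_analyte = sample_area / (rcf * k_ref)
```

Because `rcf * k_ref` reconstructs the analyte's own response factor, the estimate matches what an external standard calibration of the analyte would give, which is the equivalence the paper verifies.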

  3. Initial Results of an MDO Method Evaluation Study

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Kodiyalam, Srinivas

    1998-01-01

    The NASA Langley MDO method evaluation study seeks to arrive at a set of guidelines for using promising MDO methods by accumulating and analyzing computational data for such methods. The data are collected by conducting a series of reproducible experiments. In the first phase of the study, three MDO methods were implemented in the iSIGHT framework and used to solve a set of ten relatively simple problems. In this paper, we comment on the general considerations for conducting method evaluation studies and report some initial results obtained to date. In particular, although the results are not conclusive because of the small initial test set, other formulations, optimality conditions, and sensitivity of solutions to various perturbations. Optimization algorithms are used to solve a particular MDO formulation. It is then appropriate to speak of local convergence rates and of global convergence properties of an optimization algorithm applied to a specific formulation. An analogous distinction exists in the field of partial differential equations. On the one hand, equations are analyzed in terms of regularity, well-posedness, and the existence and uniqueness of solutions. On the other, one considers numerous algorithms for solving differential equations. The area of MDO methods studies MDO formulations combined with optimization algorithms, although at times the distinction is blurred. It is important to

  4. Quantitative fault analysis of roller bearings based on a novel matching pursuit method with a new step-impulse dictionary

    NASA Astrophysics Data System (ADS)

    Cui, Lingli; Wu, Na; Ma, Chunqing; Wang, Huaqing

    2016-02-01

    A novel matching pursuit method based on a new step-impulse dictionary to measure the size of a bearing's spall-like fault is presented in this study. Based on the seemingly double-impact theory and the rolling bearing fault mechanism, a theoretical model for the bearing fault with different spall-like fault sizes is developed and analyzed, and the seemingly double-impact characteristic of the bearing faults is explained. The first action is the entry of the roller element into the spall-like fault, which can be described as a step-like response. The second action is the exit of the roller element from the spall-like fault, which can be described as an impulse-like response. Based on the quantitative relationship between the time interval of the seemingly double-impact actions and the fault size, a novel matching pursuit method is proposed based on a new step-impulse dictionary. In addition, the quantitative matching pursuit algorithm is proposed for bearing fault diagnosis based on the new dictionary model. Finally, an atomic selection mechanism is proposed to improve the measurement accuracy of bearing fault size. The simulation results of this study indicate that the new matching pursuit method based on the new step-impulse dictionary can be reliably used to measure the sizes of bearing spall-like faults. The applications of this method to the fault signals of bearing outer-races measured at different speeds have shown that the proposed method can effectively measure a bearing's spall-like fault size.
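
    Once the entry/exit time interval has been extracted (here, by matching pursuit against the step-impulse dictionary), the fault size follows from bearing kinematics. A heavily simplified sketch of that last step only, assuming a stationary outer race, zero contact angle, and illustrative numbers (not the paper's test rig):

```python
import math

def outer_race_spall_width(delta_t_s, shaft_hz, pitch_d_m, roller_d_m):
    """Rough spall width: distance the roller centre travels along the outer
    race between the entry (step) and exit (impulse) events.
    Cage frequency for a stationary outer race: f_c = f_s/2 * (1 - d/D)."""
    cage_hz = shaft_hz / 2 * (1 - roller_d_m / pitch_d_m)
    v = 2 * math.pi * (pitch_d_m / 2) * cage_hz  # roller-centre velocity, m/s
    return v * delta_t_s

# Illustrative values: 25 Hz shaft, 40 mm pitch diameter, 8 mm rollers,
# 0.5 ms between the step and impulse responses
width = outer_race_spall_width(5e-4, 25.0, 0.04, 0.008)
```

This ignores the exact entry/exit contact geometry that the paper's theoretical model accounts for, so it should be read as an order-of-magnitude relation between the double-impact interval and the spall width.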

  5. Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)

    2013-01-01

    Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper presents an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, are described, and associated blind-test-case metric results are presented and compared. Lessons learned, along with recommendations for improving the public benchmarking processes, are also presented and discussed.

  6. Quantitative Analysis of Single and Mix Food Antiseptics Basing on SERS Spectra with PLSR Method

    NASA Astrophysics Data System (ADS)

    Hou, Mengjing; Huang, Yu; Ma, Lingwei; Zhang, Zhengjun

    2016-06-01

    Usage and dosage of food antiseptics are of great concern due to their decisive influence on food safety. The surface-enhanced Raman scattering (SERS) effect was employed in this research to realize trace potassium sorbate (PS) and sodium benzoate (SB) detection. An HfO2 ultrathin film-coated Ag NR array was fabricated as the SERS substrate. Protected by the HfO2 film, the SERS substrate possesses good acid resistance, which enables it to be applicable in the acidic environments where PS and SB work. The regression relationship between SERS spectra of 0.3~10 mg/L PS solutions and their concentrations was calibrated by the partial least squares regression (PLSR) method, and the concentration prediction performance was quite satisfactory. Furthermore, a mixture solution of PS and SB was also quantitatively analyzed by the PLSR method. Spectrum data of the characteristic peak sections corresponding to PS and SB were used to establish the regression models of these two solutes, respectively, and their concentrations were determined accurately despite their characteristic peak sections overlapping. It is possible that the unique modeling process of the PLSR method prevented the overlapped Raman signal from reducing the model accuracy.
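
    The regression step can be illustrated with a single-component PLS1 fit in pure Python. The paper used full multi-component PLSR on real SERS spectra; the three-channel "spectra" below are hypothetical, with two channels tracking concentration and one constant background channel:

```python
def pls1_fit(X, y):
    """Single-component PLS1: returns (x_means, y_mean, w, q) where w is the
    unit weight vector and q regresses y on the scores t = Xc @ w."""
    n, p = len(X), len(X[0])
    x_means = [sum(row[j] for row in X) / n for j in range(p)]
    y_mean = sum(y) / n
    Xc = [[row[j] - x_means[j] for j in range(p)] for row in X]
    yc = [v - y_mean for v in y]
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
    q = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    return x_means, y_mean, w, q

def pls1_predict(model, x):
    x_means, y_mean, w, q = model
    t = sum((xi - mi) * wi for xi, mi, wi in zip(x, x_means, w))
    return y_mean + q * t

# Hypothetical SERS calibration: intensities at three wavenumber channels
# versus concentration (mg/L); the third channel is constant background.
X = [[1.0, 2.0, 5.0], [2.0, 4.0, 5.0], [4.0, 8.0, 5.0], [8.0, 16.0, 5.0]]
y = [0.3, 0.6, 1.2, 2.4]
model = pls1_fit(X, y)
```

The latent-variable projection is what lets PLSR separate overlapping peak sections: each solute's model weights the channels jointly rather than relying on a single isolated peak.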

  7. A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime

    PubMed Central

    Fitterer, Jessica L.; Nelson, Trisalyn A.

    2015-01-01

    Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers, we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection, and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the prominent modelling choice (n = 78), though many variations existed depending on the data. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks). PMID:26418016

  8. Quantitatively estimating defects in graphene devices using discharge current analysis method

    PubMed Central

    Jung, Ukjin; Lee, Young Gon; Kang, Chang Goo; Lee, Sangchul; Kim, Jin Ju; Hwang, Hyeon June; Lim, Sung Kwan; Ham, Moon-Ho; Lee, Byoung Hun

    2014-01-01

    Defects of graphene are the most important concern for the successful applications of graphene since they affect device performance significantly. However, once the graphene is integrated in the device structures, the quality of graphene and the surrounding environment can only be assessed using indirect information such as hysteresis, mobility and drive current. Here we develop a discharge current analysis method to measure the quality of graphene integrated in a field-effect transistor structure by analyzing the discharge current, and examine its validity using various device structures. The density of charging sites affecting the performance of graphene field-effect transistors obtained using the discharge current analysis method was on the order of 1014/cm2, which closely correlates with the intensity ratio of the D to G bands in Raman spectroscopy. The graphene FETs fabricated on poly(ethylene naphthalate) (PEN) are found to have a lower density of charging sites than those on SiO2/Si substrate, mainly due to reduced interfacial interaction between the graphene and the PEN. This method can be an indispensable means to improve the stability of devices using graphene, as it provides an accurate and quantitative way to define the quality of graphene after the device fabrication. PMID:24811431

  9. Quantitative Analysis of Single and Mix Food Antiseptics Basing on SERS Spectra with PLSR Method.

    PubMed

    Hou, Mengjing; Huang, Yu; Ma, Lingwei; Zhang, Zhengjun

    2016-12-01

    Usage and dosage of food antiseptics are very concerned due to their decisive influence in food safety. Surface-enhanced Raman scattering (SERS) effect was employed in this research to realize trace potassium sorbate (PS) and sodium benzoate (SB) detection. HfO2 ultrathin film-coated Ag NR array was fabricated as SERS substrate. Protected by HfO2 film, the SERS substrate possesses good acid resistance, which enables it to be applicable in acidic environment where PS and SB work. Regression relationship between SERS spectra of 0.3~10 mg/L PS solution and their concentration was calibrated by partial least squares regression (PLSR) method, and the concentration prediction performance was quite satisfactory. Furthermore, mixture solution of PS and SB was also quantitatively analyzed by PLSR method. Spectrum data of characteristic peak sections corresponding to PS and SB was used to establish the regression models of these two solutes, respectively, and their concentrations were determined accurately despite their characteristic peak sections overlapping. It is possible that the unique modeling process of PLSR method prevented the overlapped Raman signal from reducing the model accuracy. PMID:27299651

  10. Quantitative ultrasound method for assessing stress-strain properties and the cross-sectional area of Achilles tendon

    NASA Astrophysics Data System (ADS)

    Du, Yi-Chun; Chen, Yung-Fu; Li, Chien-Ming; Lin, Chia-Hung; Yang, Chia-En; Wu, Jian-Xing; Chen, Tainsong

    2013-12-01

    The Achilles tendon is one of the most commonly injured tendons in the human body, with a variety of causes such as trauma, overuse and degeneration. Rupture and tendinosis are relatively common for this strong tendon. Stress-strain properties and shape change are important biomechanical properties of the tendon for assessing surgical repair or healing progress. Currently, there are rather limited non-invasive methods available for precisely quantifying the in vivo biomechanical properties of tendons. The aim of this study was to apply quantitative ultrasound (QUS) methods, including ultrasonic attenuation and speed of sound (SOS), to investigate porcine tendons in different stress-strain conditions. In order to find a reliable method to evaluate the change of tendon shape, ultrasound measurement was also utilized for measuring tendon thickness and compared with the change in tendon cross-sectional area under different stress. A total of 15 porcine tendons of hind trotters were examined. The test results show that the attenuation and broadband ultrasound attenuation decreased and the SOS increased by a smaller magnitude as the uniaxial loading of the stress-strain upon tendons increased. Furthermore, the tendon thickness measured with the ultrasound method was significantly correlated with tendon cross-sectional area (Pearson coefficient = 0.86). These results also indicate that attenuation of QUS and ultrasonic thickness measurement are reliable and potential parameters for assessing biomechanical properties of tendons. Further investigations are needed to warrant the application of the proposed method in a clinical setting.
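
    The reported agreement between thickness and cross-sectional area (Pearson coefficient = 0.86) is a plain product-moment correlation. A minimal sketch with made-up paired measurements (not the study's 15-tendon data):

```python
def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired measurements: ultrasound thickness (mm) versus
# cross-sectional area (mm^2) for a handful of tendons
thickness = [4.1, 4.8, 5.2, 5.9, 6.3]
area = [52.0, 61.0, 66.0, 75.0, 80.0]
r = pearson(thickness, area)
```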

  11. Quantifying social norms: by coupling the ecosystem management concept and semi-quantitative sociological methods

    NASA Astrophysics Data System (ADS)

    Zhang, D.; Xu, H.

    2012-12-01

    Over recent decades, human-induced environmental changes have steadily and rapidly grown in intensity and impact to where they now often exceed natural impacts. As one important component of human activities, social norms play key roles in environmental and natural resources management. But the lack of relevant quantitative data about social norms greatly limits our scientific understanding of the complex linkages between humans and nature, and hampers our solving of pressing environmental and social problems. In this study, we built a quantitative method by coupling the ecosystem management concept, semi-quantitative sociological methods and mathematical statistics. We derived the quantified value of social norms from two parts: whether the content of the social norms coincides with the concept of ecosystem management (content value), and how well the social norms performed once put into implementation (implementation value). First, we separately identified 12 core elements of ecosystem management and 16 indexes of social norms, and then matched them one by one. According to their degree of matching, we got the content value of social norms. Second, we selected 8 key factors that can represent the performance of social norms after they were put into implementation, and then obtained the implementation value by the Delphi method. Adding these two values, we got the final value of each social norm. Third, we conducted a case study in the Heihe river basin, the second largest inland river basin in China, by selecting 12 official edicts related to the ecosystem management of the Heihe River Basin. By doing so, we first obtained quantified data on social norms which can be directly applied to research that involves observational or experimental data collection of natural processes. Second, each value was supported by specific contents, so it can assist in creating a clear road map for building or revising management and policy guidelines. For example, in this case study

  12. Evaluation of a Rapid, Quantitative Real-Time PCR Method for Enumeration of Pathogenic Candida Cells in Water

    PubMed Central

    Brinkman, Nichole E.; Haugland, Richard A.; Wymer, Larry J.; Byappanahalli, Muruleedhara; Whitman, Richard L.; Vesper, Stephen J.

    2003-01-01

    Quantitative PCR (QPCR) technology, incorporating fluorigenic 5′ nuclease (TaqMan) chemistry, was utilized for the specific detection and quantification of six pathogenic species of Candida (C. albicans, C. tropicalis, C. krusei, C. parapsilosis, C. glabrata and C. lusitaniae) in water. Known numbers of target cells were added to distilled and tap water samples, filtered, and disrupted directly on the membranes for recovery of DNA for QPCR analysis. The assay's sensitivities were between one and three cells per filter. The accuracy of the cell estimates was between 50 and 200% of their true value (95% confidence level). In similar tests with surface water samples, the presence of PCR inhibitory compounds necessitated further purification and/or dilution of the DNA extracts, with resultant reductions in sensitivity but generally not in quantitative accuracy. Analyses of a series of freshwater samples collected from a recreational beach showed positive correlations between the QPCR results and colony counts of the corresponding target species. Positive correlations were also seen between the cell quantities of the target Candida species detected in these analyses and colony counts of Enterococcus organisms. With a combined sample processing and analysis time of less than 4 h, this method shows great promise as a tool for rapidly assessing potential exposures to waterborne pathogenic Candida species from drinking and recreational waters and may have applications in the detection of fecal pollution. PMID:12620869
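The cell estimates described above rest on a QPCR standard curve relating threshold cycle (Ct) to the log of the starting cell number. As a hedged illustration (the fitting details and the numbers below are hypothetical, not taken from the study), a minimal sketch in Python:

```python
def fit_standard_curve(log10_counts, ct_values):
    # Ordinary least-squares fit of the calibration line Ct = slope * log10(N) + intercept
    n = len(log10_counts)
    mx = sum(log10_counts) / n
    my = sum(ct_values) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_counts, ct_values))
    sxx = sum((x - mx) ** 2 for x in log10_counts)
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

def cells_from_ct(ct, slope, intercept):
    # Invert the calibration to estimate the starting cell number from a measured Ct
    return 10 ** ((ct - intercept) / slope)
```

A slope near -3.32 corresponds to ideal (100%) amplification efficiency, one decade of target per 3.32 cycles.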

  13. Evaluation of a rapid, quantitative real-time PCR method for enumeration of pathogenic Candida cells in water

    USGS Publications Warehouse

    Brinkman, Nichole E.; Haugland, Richard A.; Wymer, Larry J.; Byappanahalli, Muruleedhara N.; Whitman, Richard L.; Vesper, Stephen J.

    2003-01-01

    Quantitative PCR (QPCR) technology, incorporating fluorigenic 5′ nuclease (TaqMan) chemistry, was utilized for the specific detection and quantification of six pathogenic species of Candida (C. albicans, C. tropicalis, C. krusei, C. parapsilosis, C. glabrata and C. lusitaniae) in water. Known numbers of target cells were added to distilled and tap water samples, filtered, and disrupted directly on the membranes for recovery of DNA for QPCR analysis. The assay's sensitivities were between one and three cells per filter. The accuracy of the cell estimates was between 50 and 200% of their true value (95% confidence level). In similar tests with surface water samples, the presence of PCR inhibitory compounds necessitated further purification and/or dilution of the DNA extracts, with resultant reductions in sensitivity but generally not in quantitative accuracy. Analyses of a series of freshwater samples collected from a recreational beach showed positive correlations between the QPCR results and colony counts of the corresponding target species. Positive correlations were also seen between the cell quantities of the target Candida species detected in these analyses and colony counts of Enterococcus organisms. With a combined sample processing and analysis time of less than 4 h, this method shows great promise as a tool for rapidly assessing potential exposures to waterborne pathogenic Candida species from drinking and recreational waters and may have applications in the detection of fecal pollution.

  14. A composite method for mapping quantitative trait loci without interference of female achiasmatic and gender effects in silkworm, Bombyx mori.

    PubMed

    Li, C; Zuo, W; Tong, X; Hu, H; Qiao, L; Song, J; Xiong, G; Gao, R; Dai, F; Lu, C

    2015-08-01

    The silkworm, Bombyx mori, is an economically important insect that was domesticated more than 5000 years ago. The major economic traits focused on by breeders are quantitative traits, and an accurate and efficient QTL mapping method is necessary to explore their genetic architecture. However, currently widely used QTL mapping models are not well suited to silkworm because they ignore female achiasmatic meiosis and gender effects. In this study, we propose a composite method combining rational population selection and special mapping methods to map QTL in silkworm. By determining QTL for cocoon shell weight (CSW), we demonstrated the effectiveness of this method. In the CSW mapping process, only 56 markers were used and five loci or chromosomes were detected, more than in previous studies. Additionally, loci on chromosomes 1 and 11 dominated and accounted for 35.10% and 15.03% of the phenotypic variance, respectively. Unlike previous studies, epistasis was detected between loci on chromosomes 11 and 22. These mapping results demonstrate the power and convenience of this method for QTL mapping in silkworm, and this method may inspire the development of similar approaches for other species with special genetic characteristics. PMID:26059330

  15. A novel method for quantitative determination of tea polysaccharide by resonance light scattering

    NASA Astrophysics Data System (ADS)

    Wei, Xinlin; Xi, Xionggang; Wu, Muxia; Wang, Yuanfeng

    2011-09-01

    A new method for the determination of tea polysaccharide (TPS) in green tea (Camellia sinensis) leaves has been developed. The method is based on the enhancement of resonance light scattering (RLS) of TPS in the presence of a cetylpyridinium chloride (CPC)-NaOH system. Under the optimum conditions, the RLS intensity of CPC was greatly enhanced by adding TPS. The maximum peak of the enhanced RLS spectra was located at 484.02 nm. The enhanced RLS intensity was proportional to the concentration of TPS in the range of 2.0-20 μg/ml. Measurements of standard compounds showed that the new method and the phenol-sulfuric acid method give equivalent results; the recoveries of the two methods were 96.39-103.7% (novel method) and 100.15-103.65% (phenol-sulfuric acid method), respectively. However, the two methods differed in sensitivity: the new method offered a limit of detection (LOD) of 0.047 μg/ml, whereas the phenol-sulfuric acid method gave an LOD of 1.54 μg/ml. Interference experiments demonstrated that the new method was highly selective and more suitable for the determination of TPS than the phenol-sulfuric acid method. Stability tests showed that the new method had good stability. Moreover, the proposed method offers the advantages of easy operation, rapidity, and practicability, suggesting that it can be satisfactorily applied to the determination of TPS in green tea.
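The reported detection limits can be related to the conventional formula LOD ≈ k·σ/S (k commonly 3.3), where σ is the standard deviation of the blank response and S the calibration slope. A minimal sketch assuming this ICH-style convention (the abstract does not state how its LODs were actually computed, and the numbers in the example are hypothetical):

```python
def limit_of_detection(blank_sd, slope, k=3.3):
    # ICH-style estimate: LOD = k * sigma / S, with sigma the standard
    # deviation of the blank response and S the calibration slope
    return k * blank_sd / slope

def concentration_from_intensity(intensity, slope, intercept=0.0):
    # Invert a linear RLS calibration I = slope * c + intercept
    return (intensity - intercept) / slope
```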

  16. Cloned plasmid DNA fragments as calibrators for controlling GMOs: different real-time duplex quantitative PCR methods.

    PubMed

    Taverniers, Isabel; Van Bockstaele, Erik; De Loose, Marc

    2004-03-01

    Analytical real-time PCR technology is a powerful tool for implementation of the GMO labeling regulations enforced in the EU. The quality of analytical measurement data obtained by quantitative real-time PCR depends on the correct use of calibrator and reference materials (RMs). For GMO methods of analysis, the choice of appropriate RMs is currently under debate. So far, genomic DNA solutions from certified reference materials (CRMs) are most often used as calibrators for GMO quantification by means of real-time PCR. However, due to some intrinsic features of these CRMs, errors may be expected in the estimations of DNA sequence quantities. In this paper, two new real-time PCR methods are presented for Roundup Ready soybean, in which two types of plasmid DNA fragments are used as calibrators. Single-target plasmids (STPs) diluted in a background of genomic DNA were used in the first method. Multiple-target plasmids (MTPs) containing both sequences in one molecule were used as calibrators for the second method. Both methods simultaneously detect a promoter 35S sequence as GMO-specific target and a lectin gene sequence as endogenous reference target in a duplex PCR. For the estimation of relative GMO percentages both "delta C(T)" and "standard curve" approaches are tested. Delta C(T) methods are based on direct comparison of measured C(T) values of both the GMO-specific target and the endogenous target. Standard curve methods measure absolute amounts of target copies or haploid genome equivalents. A duplex delta C(T) method with STP calibrators performed at least as well as a similar method with genomic DNA calibrators from commercial CRMs. Besides this, high quality results were obtained with a standard curve method using MTP calibrators. This paper demonstrates that plasmid DNA molecules containing either one or multiple target sequences form perfect alternative calibrators for GMO quantification and are especially suitable for duplex PCR reactions. PMID:14689155
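The "delta C(T)" approach described above compares the Ct difference between the GMO-specific and endogenous targets in a sample against the same difference in a calibrator of known GMO content. A minimal sketch under the common simplifying assumption of equal, ideal amplification efficiency for both targets (the exact formula used in the paper may differ):

```python
def gmo_percentage(ct_gmo, ct_ref, ct_gmo_cal, ct_ref_cal, cal_percent, efficiency=2.0):
    # Delta-delta-Ct relative quantification against a calibrator of known
    # GMO percentage; efficiency=2.0 assumes perfect doubling per cycle
    ddct = (ct_gmo - ct_ref) - (ct_gmo_cal - ct_ref_cal)
    return cal_percent * efficiency ** (-ddct)
```

For example, a sample whose GMO target lags the calibrator by one extra cycle (relative to the endogenous gene) is estimated at half the calibrator's GMO percentage.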

  17. Multiple Frequency Contrast Source Inversion Method for Vertical Electromagnetic Profiling: 2D Simulation Results and Analyses

    NASA Astrophysics Data System (ADS)

    Li, Jinghe; Song, Linping; Liu, Qing Huo

    2016-02-01

    A simultaneous multiple frequency contrast source inversion (CSI) method is applied to reconstructing hydrocarbon reservoir targets in a complex multilayered medium in two dimensions. It simulates the effects of a salt dome sedimentary formation in the context of reservoir monitoring. In this method, the stabilized biconjugate-gradient fast Fourier transform (BCGS-FFT) algorithm is applied as a fast solver of the 2D volume integral equation for the forward computation. The inversion technique with CSI combines the efficient FFT algorithm to speed up the matrix-vector multiplication with the stable convergence of the simultaneous multiple frequency CSI in the iteration process. As a result, this method is capable of effectively reconstructing quantitative conductivity images for large-scale electromagnetic oil exploration problems, including the vertical electromagnetic profiling (VEP) survey investigated here. A number of numerical examples are presented to validate the effectiveness and capacity of the simultaneous multiple frequency CSI method for a limited array view in VEP.

  18. A Novel Method for Relative Quantitation of N-Glycans by Isotopic Labeling Using 18O-Water

    PubMed Central

    Tao, Shujuan; Orlando, Ron

    2014-01-01

    Quantitation is an essential aspect of comprehensive glycomics studies. Here, a novel isotopic-labeling method is described for N-glycan quantitation using 18O-water. The incorporation of the 18O label into the reducing end of N-glycans is simply and efficiently achieved during peptide-N4-(N-acetyl-β-glucosaminyl) asparagine amidase F release. This process provides a 2-Da mass difference compared with the N-glycans released in 16O-water. A mathematical calculation method was also developed to determine the 18O/16O ratios from isotopic peaks. Application of this method to several standard glycoprotein mixtures and human serum demonstrated that it can facilitate the relative quantitation of N-glycans over a linear dynamic range of two orders of magnitude, with high accuracy and reproducibility. PMID:25365792
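Because the labels differ by only 2 Da, the 16O form's natural M+2 isotope peak overlaps the 18O-labeled monoisotopic peak, so that contribution must be subtracted before forming the ratio. A hedged sketch of this correction (the paper's actual calculation may be more elaborate; the fraction and intensities below are illustrative):

```python
def ratio_18o_16o(intensity_m, intensity_m2, natural_m2_fraction):
    # intensity_m: monoisotopic peak of the 16O-released glycan
    # intensity_m2: peak 2 Da higher, containing the 18O-labeled glycan plus
    # the natural M+2 isotope of the 16O form
    corrected_heavy = intensity_m2 - intensity_m * natural_m2_fraction
    return corrected_heavy / intensity_m
```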

  19. Quantitative (1)H NMR method for hydrolytic kinetic investigation of salvianolic acid B.

    PubMed

    Pan, Jianyang; Gong, Xingchu; Qu, Haibin

    2013-11-01

    This work presents an exploratory study for monitoring the hydrolytic process of salvianolic acid B (Sal B) under low-oxygen conditions using a simple quantitative (1)H NMR (Q-NMR) method. The quantity of the compounds was calculated from the relative ratio of the integral values of the target peak for each compound to the known amount of the internal standard trimethylsilyl propionic acid (TSP). Kinetic runs were carried out at different initial concentrations of Sal B (5.00, 10.0, and 20.0 mg/mL) and temperatures of 70, 80, and 90°C. The effect of these two factors on the transformation process of Sal B was investigated. The hydrolysis followed pseudo-first-order kinetics, and the apparent degradation rate constant at 80°C decreased as the concentration of Sal B increased. Under the given conditions, the rate constant of overall hydrolysis as a function of temperature obeyed the Arrhenius equation. Six degradation products were identified by NMR and mass spectrometric analysis. Four of these degradation products, i.e. danshensu (DSS), protocatechuic aldehyde (PRO), salvianolic acid D (Sal D) and lithospermic acid (LA), were further identified by comparing their retention times with those of standard compounds. The advantage of this Q-NMR method is that no reference compounds were required for calibration curves; quantification could be carried out directly on hydrolyzed samples. It proved to be simple, convenient and accurate for the hydrolytic kinetic study of Sal B. PMID:23867115
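The pseudo-first-order and Arrhenius treatments named above follow standard textbook forms. A minimal sketch (the helper names and example values are illustrative, not taken from the paper):

```python
import math

def pseudo_first_order_k(c0, c_t, t):
    # Pseudo-first-order decay: ln(C0 / Ct) = k * t  ->  k = ln(C0 / Ct) / t
    return math.log(c0 / c_t) / t

def arrhenius_activation_energy(k1, t1_kelvin, k2, t2_kelvin, r=8.314):
    # Two-point Arrhenius form: ln(k2 / k1) = -(Ea / R) * (1/T2 - 1/T1)
    return -r * math.log(k2 / k1) / (1.0 / t2_kelvin - 1.0 / t1_kelvin)
```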

  20. Multivariate least-squares methods applied to the quantitative spectral analysis of multicomponent samples

    SciTech Connect

    Haaland, D.M.; Easterling, R.G.; Vopicka, D.A.

    1985-01-01

    In an extension of earlier work, weighted multivariate least-squares methods of quantitative FT-IR analysis have been developed. A linear least-squares approximation to nonlinearities in the Beer-Lambert law is made by allowing the reference spectra to be a set of known mixtures. The incorporation of nonzero intercepts in the relation between absorbance and concentration further improves the approximation of nonlinearities while simultaneously accounting for nonzero spectral baselines. Pathlength variations are also accommodated in the analysis, and under certain conditions, unknown sample pathlengths can be determined. All spectral data are used to improve the precision and accuracy of the estimated concentrations. During the calibration phase of the analysis, pure component spectra are estimated from the standard mixture spectra. These can be compared with the measured pure component spectra to determine which vibrations experience nonlinear behavior. In the predictive phase of the analysis, the calculated spectra are used in our previous least-squares analysis to estimate sample component concentrations. These methods were applied to the analysis of the IR spectra of binary mixtures of esters. Even with severely overlapping spectral bands and nonlinearities in the Beer-Lambert law, the average relative error in the estimated concentrations was <1%.
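The core idea, fitting absorbance versus concentration with a nonzero intercept at each wavelength during calibration, then using all wavelengths jointly to predict an unknown concentration, can be sketched for a single analyte. This is a simplified, hypothetical illustration of the scheme, without the weighting or pathlength handling described in the abstract:

```python
def calibrate(concentrations, absorbance_spectra):
    # At each wavelength j, fit A_j = k_j * c + b_j; the intercept b_j
    # absorbs a nonzero spectral baseline
    n = len(concentrations)
    mc = sum(concentrations) / n
    scc = sum((c - mc) ** 2 for c in concentrations)
    ks, bs = [], []
    for j in range(len(absorbance_spectra[0])):
        ma = sum(s[j] for s in absorbance_spectra) / n
        sca = sum((c - mc) * (s[j] - ma)
                  for c, s in zip(concentrations, absorbance_spectra))
        k = sca / scc
        ks.append(k)
        bs.append(ma - k * mc)
    return ks, bs

def predict(spectrum, ks, bs):
    # Least-squares concentration estimate using all wavelengths at once
    num = sum(k * (a - b) for k, a, b in zip(ks, spectrum, bs))
    den = sum(k * k for k in ks)
    return num / den
```

Using every wavelength in the prediction step, rather than a single peak, is what improves the precision of the estimated concentration.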

  1. Process analytical technology case study part I: feasibility studies for quantitative near-infrared method development.

    PubMed

    Cogdill, Robert P; Anderson, Carl A; Delgado-Lopez, Miriam; Molseed, David; Chisholm, Robert; Bolton, Raymond; Herkert, Thorsten; Afnán, Ali M; Drennen, James K

    2005-01-01

    This article is the first of a series of articles detailing the development of near-infrared (NIR) methods for solid-dosage form analysis. Experiments were conducted at the Duquesne University Center for Pharmaceutical Technology to qualify the capabilities of instrumentation and sample handling systems, evaluate the potential effect of one source of a process signature on calibration development, and compare the utility of reflection and transmission data collection methods. A database of 572 production-scale sample spectra was used to evaluate the interbatch spectral variability of samples produced under routine manufacturing conditions. A second database of 540 spectra from samples produced under various compression conditions was analyzed to determine the feasibility of pooling spectral data acquired from samples produced at diverse scales. Instrument qualification tests were performed, and appropriate limits for instrument performance were established. To evaluate the repeatability of the sample positioning system, multiple measurements of a single tablet were collected. With the application of appropriate spectral preprocessing techniques, sample repositioning error was found to be insignificant with respect to NIR analyses of product quality attributes. Sample shielding was demonstrated to be unnecessary for transmission analyses. A process signature was identified in the reflection data. Additional tests demonstrated that the process signature was largely orthogonal to spectral variation because of hardness. Principal component analysis of the compression sample set data demonstrated the potential for quantitative model development. For the data sets studied, reflection analysis was demonstrated to be more robust than transmission analysis. PMID:16353986

  2. 3D reconstruction and quantitative assessment method of mitral eccentric regurgitation from color Doppler echocardiography

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Ge, Yi Nan; Wang, Tian Fu; Zheng, Chang Qiong; Zheng, Yi

    2005-10-01

    In this article, based on two-dimensional color Doppler images, a multiplane transesophageal rotational scanning method was used to acquire original Doppler echocardiograms while the electrocardiogram was recorded synchronously. After filtering and interpolation, surface rendering and volume rendering were performed. By analyzing the color-bar information and the superposition principle of color Doppler flow images, the grayscale mitral anatomical structure and the color-coded regurgitation velocity parameter were separated from the color Doppler flow images. Three-dimensional reconstruction of the mitral structure and the regurgitation velocity distribution was implemented separately, and fusion visualization of the reconstructed regurgitation velocity distribution with its corresponding 3D mitral anatomical structure was realized. This can be used to observe the position, phase, and direction of mitral regurgitation, and to measure the jet length, area, volume, spatial distribution, and severity level. In addition, in patients with eccentric mitral regurgitation, this new modality overcomes the inherent limitations of two-dimensional color Doppler flow imaging by depicting the full extent of the jet trajectory; the area of eccentric regurgitation on the three-dimensional image was much larger than that on the two-dimensional image, and the variation tendencies of regurgitation area and volume were shown at different angles and different systolic phases. The study shows that three-dimensional color Doppler provides quantitative measurements of eccentric mitral regurgitation that are more accurate and reproducible than conventional color Doppler.

  3. Simple saponification method for the quantitative determination of carotenoids in green vegetables.

    PubMed

    Larsen, Erik; Christensen, Lars P

    2005-08-24

    A simple, reliable, and gentle saponification method for the quantitative determination of carotenoids in green vegetables was developed. The method involves an extraction procedure with acetone and the selective removal of the chlorophylls and esterified fatty acids from the organic phase using a strongly basic resin (Ambersep 900 OH). Extracts from common green vegetables (beans, broccoli, green bell pepper, chive, lettuce, parsley, peas, and spinach) were analyzed by high-performance liquid chromatography (HPLC) for their content of major carotenoids before and after treatment with Ambersep 900 OH. The mean recovery percentages for most carotenoids [(all-E)-violaxanthin, (all-E)-lutein epoxide, (all-E)-lutein, neolutein A, and (all-E)-beta-carotene] after saponification of the vegetable extracts with Ambersep 900 OH were close to 100% (99-104%), while the mean recovery percentage of (9'Z)-neoxanthin increased to 119% and those of (all-E)-neoxanthin and neolutein B decreased to 90% and 72%, respectively. PMID:16104772

  4. Recommended Methods for Brain Processing and Quantitative Analysis in Rodent Developmental Neurotoxicity Studies.

    PubMed

    Garman, Robert H; Li, Abby A; Kaufmann, Wolfgang; Auer, Roland N; Bolon, Brad

    2016-01-01

    Neuropathology methods in rodent developmental neurotoxicity (DNT) studies have evolved with experience and changing regulatory guidance. This article emphasizes principles and methods to promote more standardized DNT neuropathology evaluation, particularly procurement of highly homologous brain sections and collection of the most reproducible morphometric measurements. To minimize bias, brains from all animals at all dose levels should be processed from brain weighing through paraffin embedding at one time using a counterbalanced design. Morphometric measurements should be anchored by distinct neuroanatomic landmarks that can be identified reliably on the faced block or in unstained sections and which address the region-specific circuitry of the measured area. Common test article-related qualitative changes in the developing brain include abnormal cell numbers (yielding altered regional size), displaced cells (ectopia and heterotopia), and/or aberrant differentiation (indicated by defective myelination or synaptogenesis), but rarely glial or inflammatory reactions. Inclusion of digital images in the DNT pathology raw data provides confidence that the quantitative analysis was done on anatomically matched (i.e., highly homologous) sections. Interpreting DNT neuropathology data and their presumptive correlation with neurobehavioral data requires an integrative weight-of-evidence approach including consideration of maternal toxicity, body weight, brain weight, and the pattern of findings across brain regions, doses, sexes, and ages. PMID:26296631

  5. Comparison of the scanning linear estimator (SLE) and ROI methods for quantitative SPECT imaging

    NASA Astrophysics Data System (ADS)

    Könik, Arda; Kupinski, Meredith; Hendrik Pretorius, P.; King, Michael A.; Barrett, Harrison H.

    2015-08-01

    In quantitative emission tomography, tumor activity is typically estimated from calculations on a region of interest (ROI) identified in the reconstructed slices. In these calculations, unpredictable bias arising from the null functions of the imaging system affects ROI estimates. The magnitude of this bias depends upon the tumor size and location. In prior work it has been shown that the scanning linear estimator (SLE), which operates on the raw projection data, is an unbiased estimator of activity when the size and location of the tumor are known. In this work, we performed analytic simulation of SPECT imaging with a parallel-hole medium-energy collimator. Distance-dependent system spatial resolution and non-uniform attenuation were included in the imaging simulation. We compared the task of activity estimation by the ROI and SLE methods for a range of tumor sizes (diameter: 1-3 cm) and activities (contrast ratio: 1-10) added to uniform and non-uniform liver backgrounds. Using the correct value for the tumor shape and location is an idealized approximation to how task estimation would occur clinically. Thus we determined how perturbing this idealized prior knowledge impacted the performance of both techniques. To implement the SLE for the non-uniform background, we used a novel iterative algorithm for pre-whitening stationary noise within a compact region. Estimation task performance was compared using the ensemble mean-squared error (EMSE) as the criterion. The SLE method performed substantially better than the ROI method (i.e. EMSE(SLE) was 23-174 times lower) when the background is uniform and tumor location and size are known accurately. The variance of the SLE increased when a non-uniform liver texture was introduced but the EMSE(SLE) continued to be 5-20 times lower than the ROI method. In summary, SLE outperformed ROI under almost all conditions that we tested.
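The comparison criterion used above, ensemble mean-squared error, is the average squared deviation of an estimator's outputs from the known true activity over repeated noisy realizations; it combines bias and variance in a single figure of merit. A minimal sketch:

```python
def emse(estimates, true_value):
    # Ensemble mean-squared error over a set of repeated estimates of the
    # same true quantity; equals variance + bias**2
    n = len(estimates)
    return sum((e - true_value) ** 2 for e in estimates) / n
```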

  6. Validation of a simple and inexpensive method for the quantitation of infarct in the rat brain.

    PubMed

    Schilichting, C L R; Lima, K C M; Cestari, L A; Sekiyama, J Y; Silva, F M; Milani, H

    2004-04-01

    A gravimetric method was evaluated as a simple, sensitive, reproducible, low-cost alternative to quantify the extent of brain infarct after occlusion of the medial cerebral artery in rats. In ether-anesthetized rats, the left medial cerebral artery was occluded for 1, 1.5 or 2 h by inserting a 4-0 nylon monofilament suture into the internal carotid artery. Twenty-four hours later, the brains were processed for histochemical triphenyltetrazolium chloride (TTC) staining and quantitation of the ischemic infarct. In each TTC-stained brain section, the ischemic tissue was dissected with a scalpel and fixed in 10% formalin at 0°C until its total mass could be estimated. The mass (mg) of the ischemic tissue was weighed on an analytical balance and compared to its volume (mm(3)), estimated either by plethysmometry using platinum electrodes or by computer-assisted image analysis. Infarct size as measured by the weighing method (mg), and reported as a percent (%) of the affected (left) hemisphere, correlated closely with volume (mm(3), also reported as %) estimated by computerized image analysis (r = 0.88; P < 0.001; N = 10) or by plethysmometry (r = 0.97-0.98; P < 0.0001; N = 41). This degree of correlation was maintained between different experimenters. The method was also sensitive for detecting the effect of different ischemia durations on infarct size (P < 0.005; N = 23), and the effect of drug treatments in reducing the extent of brain damage (P < 0.005; N = 24). The data suggest that, in addition to being simple and low cost, the weighing method is a reliable alternative for quantifying brain infarct in animal models of stroke. PMID:15064814
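The r values reported above are correlation coefficients between paired mass and volume measurements, presumably the usual Pearson coefficient. A minimal sketch of that computation (illustrative data, not the study's):

```python
import math

def pearson_r(xs, ys):
    # Pearson correlation coefficient between two paired measurement series
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)
```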

  7. Comparison of the scanning linear estimator (SLE) and ROI methods for quantitative SPECT imaging.

    PubMed

    Könik, Arda; Kupinski, Meredith; Pretorius, P Hendrik; King, Michael A; Barrett, Harrison H

    2015-08-21

    In quantitative emission tomography, tumor activity is typically estimated from calculations on a region of interest (ROI) identified in the reconstructed slices. In these calculations, unpredictable bias arising from the null functions of the imaging system affects ROI estimates. The magnitude of this bias depends upon the tumor size and location. In prior work it has been shown that the scanning linear estimator (SLE), which operates on the raw projection data, is an unbiased estimator of activity when the size and location of the tumor are known. In this work, we performed analytic simulation of SPECT imaging with a parallel-hole medium-energy collimator. Distance-dependent system spatial resolution and non-uniform attenuation were included in the imaging simulation. We compared the task of activity estimation by the ROI and SLE methods for a range of tumor sizes (diameter: 1-3 cm) and activities (contrast ratio: 1-10) added to uniform and non-uniform liver backgrounds. Using the correct value for the tumor shape and location is an idealized approximation to how task estimation would occur clinically. Thus we determined how perturbing this idealized prior knowledge impacted the performance of both techniques. To implement the SLE for the non-uniform background, we used a novel iterative algorithm for pre-whitening stationary noise within a compact region. Estimation task performance was compared using the ensemble mean-squared error (EMSE) as the criterion. The SLE method performed substantially better than the ROI method (i.e. EMSE(SLE) was 23-174 times lower) when the background is uniform and tumor location and size are known accurately. The variance of the SLE increased when a non-uniform liver texture was introduced but the EMSE(SLE) continued to be 5-20 times lower than the ROI method. In summary, SLE outperformed ROI under almost all conditions that we tested. PMID:26247228

  8. Comparison of concentration methods for rapid detection of hookworm ova in wastewater matrices using quantitative PCR.

    PubMed

    Gyawali, P; Ahmed, W; Jagals, P; Sidhu, J P S; Toze, S

    2015-12-01

    Hookworm infection accounts for around 700 million infections worldwide, especially in developing nations, due in part to the increased use of wastewater for crop production. The effective recovery of hookworm ova from wastewater matrices is difficult due to their low concentrations and heterogeneous distribution. In this study, we compared the recovery rates of (i) four rapid hookworm ova concentration methods from municipal wastewater, and (ii) two concentration methods from sludge samples. Ancylostoma caninum ova were used as a surrogate for human hookworm (Ancylostoma duodenale and Necator americanus). Known concentrations of A. caninum ova were seeded into wastewater (treated and raw) and sludge samples collected from two wastewater treatment plants (WWTPs) in Brisbane and Perth, Australia. The A. caninum ova were concentrated from treated and raw wastewater samples using centrifugation (Method A), hollow fiber ultrafiltration (HFUF) (Method B), filtration (Method C) and flotation (Method D). For sludge samples, flotation (Method E) and direct DNA extraction (Method F) were used. Among the four methods tested, the filtration method (Method C) consistently recovered higher concentrations of A. caninum ova from treated wastewater (39-50%) and raw wastewater (7.1-12%) samples collected from both WWTPs. The remaining methods (Methods A, B and D) yielded variable recovery rates ranging from 0.2 to 40% for treated and raw wastewater samples. The recovery rates for sludge samples were poor (0.02-4.7%), although Method F (direct DNA extraction) provided a recovery rate 1-2 orders of magnitude higher than Method E (flotation). Based on our results it can be concluded that the recovery rates of hookworm ova from wastewater matrices, especially sludge samples, can be poor and highly variable. Therefore, the choice of concentration method is vital for the sensitive detection of hookworm ova in wastewater matrices. PMID:26358269

  9. Development of a quantitative method for the characterization of hole quality during laser trepan drilling of high-temperature alloy

    NASA Astrophysics Data System (ADS)

    Zhang, Hongyu; Zhou, Ming; Wang, Yunlong; Zhang, Xiangchao; Yan, Yu; Wang, Rong

    2016-02-01

    Short-pulsed lasers are of significant industrial relevance in laser drilling, offering an acceptable compromise between accuracy and efficiency. However, intensive research on the qualitative and quantitative characterization of hole quality has rarely been reported. In the present study, a series of through holes were fabricated in a high-temperature alloy workpiece with a thickness of 3 mm using a LASERTEC 80 PowerDrill manufacturing system, which incorporated a Nd:YAG millisecond laser into a five-axis positioning platform. The quality of the holes manufactured under different laser powers (80-140 W) and beam expanding ratios (1-6) was characterized by a scanning electron microscope associated with energy-dispersive X-ray analysis, focusing mainly on the formation of micro-cracks and the recast layer. Additionally, the conicity and circularity of the holes were quantitatively evaluated by the apparent radius, root-mean-square deviation, and maximum deviation, which were calculated based on the extraction of the hole edge through programming in MATLAB. The results showed that melted and spattered material was present at the entrance and exit ends of the holes, and micro-cracks and a recast layer (average thickness 15-30 µm) were detected along the side walls of the holes. The elemental composition of the melted and spattered material and the recast layer was similar, with an obvious increase in the contents of O, Nb, and Cr and a great reduction in the contents of Fe and Ni in comparison with the bulk material. Furthermore, the conicity and circularity evaluation of the holes indicated that a laser power of 100 W and a beam expanding ratio of 4 or 5 represented the optimal drilling parameters in this specific experimental situation. It is anticipated that the quantitative method developed in the present study can be applied to the evaluation of hole quality in laser drilling and other drilling conditions.
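The circularity metrics named above (apparent radius, root-mean-square deviation, and maximum deviation of the extracted edge) can be computed from edge-point coordinates. The sketch below is a hypothetical reconstruction of that evaluation in Python rather than the authors' MATLAB code, using the centroid of the edge points as the hole center:

```python
import math

def circularity_metrics(edge_points):
    # edge_points: (x, y) samples extracted along the hole edge
    n = len(edge_points)
    cx = sum(x for x, _ in edge_points) / n
    cy = sum(y for _, y in edge_points) / n
    radii = [math.hypot(x - cx, y - cy) for x, y in edge_points]
    r_mean = sum(radii) / n                                    # apparent radius
    rms_dev = math.sqrt(sum((r - r_mean) ** 2 for r in radii) / n)
    max_dev = max(abs(r - r_mean) for r in radii)
    return r_mean, rms_dev, max_dev
```

A perfectly circular hole gives zero RMS and maximum deviation; larger values indicate an irregular edge.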

  10. [Quantitative ultrasound].

    PubMed

    Barkmann, R; Glüer, C-C

    2006-10-01

    Methods of quantitative ultrasound (QUS) can be used to obtain knowledge about bone fragility. Comprehensive study results exist showing the power of QUS for the estimation of osteoporotic fracture risk. Nevertheless, the variety of technologies, devices, and variables, as well as the differing degrees of validation of the individual devices, has to be taken into account. Using methods to simulate ultrasound propagation, the complex interaction between ultrasound and bone can be understood and the propagation visualized. Before widespread clinical use, it has to be clarified whether patients with low QUS values will profit from therapy, as has been shown for DXA. Moreover, the introduction of quality assurance measures is essential. The user should know the limitations of the methods and be able to interpret the results correctly. Applied in an adequate manner, QUS methods could then, owing to their lower costs and the absence of ionizing radiation, become important tools in osteoporosis management. PMID:16896637

  11. Test Results for Entry Guidance Methods for Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Jones, Robert E.

    2003-01-01

    There are a number of approaches to advanced guidance and control (AG&C) that have the potential for achieving the goals of significantly increasing reusable launch vehicle (RLV) safety and reliability, and reducing the cost. This paper examines some approaches to entry guidance. An effort called Integration and Testing of Advanced Guidance and Control Technologies (ITAGCT) has recently completed a rigorous testing phase where these algorithms faced high-fidelity vehicle models and were required to perform a variety of representative tests. The algorithm developers spent substantial effort improving the algorithm performance in the testing. This paper lists the test cases used to demonstrate that the desired results are achieved, shows an automated test scoring method that greatly reduces the evaluation effort required, and displays results of the tests. Results show a significant improvement over previous guidance approaches. The two best-scoring algorithm approaches show roughly equivalent results and are ready to be applied to future reusable vehicle concepts.

  12. Test Results for Entry Guidance Methods for Space Vehicles

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Jones, Robert E.

    2004-01-01

    There are a number of approaches to advanced guidance and control that have the potential for achieving the goals of significantly increasing reusable launch vehicle (or any space vehicle that enters an atmosphere) safety and reliability, and reducing the cost. This paper examines some approaches to entry guidance. An effort called Integration and Testing of Advanced Guidance and Control Technologies has recently completed a rigorous testing phase where these algorithms faced high-fidelity vehicle models and were required to perform a variety of representative tests. The algorithm developers spent substantial effort improving the algorithm performance in the testing. This paper lists the test cases used to demonstrate that the desired results are achieved, shows an automated test scoring method that greatly reduces the evaluation effort required, and displays results of the tests. Results show a significant improvement over previous guidance approaches. The two best-scoring algorithm approaches show roughly equivalent results and are ready to be applied to future vehicle concepts.

  13. A validated micellar electrokinetic chromatography method for the quantitation of dexamethasone, ondansetron and aprepitant, antiemetic drugs, in organogel.

    PubMed

    Bourdon, Florence; Lecoeur, Marie; Duhaut, Marion; Odou, Pascal; Vaccher, Claude; Foulon, Catherine

    2013-12-01

    A micellar electrokinetic chromatography (MEKC) method was developed for the determination of three antiemetic drugs (aprepitant, dexamethasone and ondansetron) in pharmaceutical formulations. The method was optimized using a central composite design (CCD). Four main factors (borate buffer concentration, pH, methanol content and sodium dodecyl sulfate concentration) were optimized in order to obtain the best resolution and peak efficiencies in a minimum runtime. The separation was performed in a fused-silica capillary. After optimization, the background electrolyte consisted of a borate buffer (62.5 mM, pH 8.75) containing sodium dodecyl sulfate (77.5 mM) and methanol (3.75%). Under these conditions, a complete separation of each antiemetic drug and its respective internal standard was achieved in 38 min. The method was validated with trueness values from 94.9 to 107.2% and precision results (repeatability and intermediate precision) lower than 5.9%. MEKC-UV was the first method allowing the separation of aprepitant, dexamethasone and ondansetron and was suitable for the quantitation of these three antiemetic drugs in organogel formulations. The rapid sample preparation coupled with an automated separation technique makes this method convenient for quality control of extemporaneous magistral ready-to-use formulations. PMID:23978340

  14. Laboratory Evaluations of the Enterococcus qPCR Method for Recreational Water Quality Testing: Method Performance and Sources of Uncertainty in Quantitative Measurements

    EPA Science Inventory

    The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...

  15. Research and Evaluation in Education and Psychology: Integrating Diversity with Quantitative, Qualitative, and Mixed Methods. Second Edition

    ERIC Educational Resources Information Center

    Mertens, Donna M.

    2004-01-01

    In this new edition, the author explains quantitative, qualitative, and mixed methods, and incorporates the viewpoints of various research paradigms (postpositivist, constructivist, transformative, and pragmatic) into descriptions of these methods. Special emphasis is provided for conducting research in culturally complex communities. Each chapter…

  16. Understanding Variation in Treatment Effects in Education Impact Evaluations: An Overview of Quantitative Methods. NCEE 2014-4017

    ERIC Educational Resources Information Center

    Schochet, Peter Z.; Puma, Mike; Deke, John

    2014-01-01

    This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…

  17. Monochloramine disinfection kinetics of Nitrosomonas europaea by propidium monoazide quantitative PCR and Live/Dead BacLight Methods

    EPA Science Inventory

    Monochloramine disinfection kinetics were determined for the pure culture ammonia-oxidizing bacterium Nitrosomonas europaea (ATCC 19718) by two culture independent methods: (1) LIVE/DEAD® BacLight™ (LD) and (2) propidium monoazide quantitative PCR (PMA-qPCR). Both methods were f...

  18. Determination of Calcium in Cereal with Flame Atomic Absorption Spectroscopy: An Experiment for a Quantitative Methods of Analysis Course

    ERIC Educational Resources Information Center

    Bazzi, Ali; Kreuz, Bette; Fischer, Jeffrey

    2004-01-01

    An experiment for determination of calcium in cereal using two-increment standard addition method in conjunction with flame atomic absorption spectroscopy (FAAS) is demonstrated. The experiment is intended to introduce students to the principles of atomic absorption spectroscopy giving them hands on experience using quantitative methods of…

  19. A Quantitative Study of a Software Tool that Supports a Part-Complete Solution Method on Learning Outcomes

    ERIC Educational Resources Information Center

    Garner, Stuart

    2009-01-01

    This paper reports on the findings from a quantitative research study into the use of a software tool that was built to support a part-complete solution method (PCSM) for the learning of computer programming. The use of part-complete solutions to programming problems is one of the methods that can be used to reduce the cognitive load that students…

  20. A quantitative method for determining relative colonization rates of maize callus by Fusarium graminearum for resistance gene evaluations

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A quantitative PCR method was developed for detecting Fusarium graminearum growing in maize callus. Fungal DNA was found 12 hours after inoculation and was correlated with visual ratings. We demonstrated the efficacy of the method to quantify fungal growth in callus overexpressing a peroxidase gene ...

  1. Quantitative determination of fatty tissue on body surface in obese children by ultrasound method.

    PubMed

    Czinner, A; Várady, M

    1992-01-01

    The authors compared a new ultrasound method with the recognized anthropometric examination in obese children. The two methods showed no divergence in the determination of body fat. In extreme obesity, the ultrasound examination was considered the more reliable. Further advantages of the ultrasound method are that it is well documentable, reproducible, and non-invasive; in this way the therapeutic result can easily be followed. PMID:1560996

  2. Quantitatively Verifying the Results' Rationality for Farmland Quality Evaluation with Crop Yield, a Case Study in the Northwest Henan Province, China

    PubMed Central

    Huang, Junchang; Wang, Song

    2016-01-01

    Evaluating the rationality of assessment results for farmland quality (FQ) is usually qualitative and based on farmers' and experts' perceptions of soil quality and crop yield; its quantitative verification remains difficult and is often ignored. In this paper, FQ in Xiuwu County, Northwest Henan Province, China was evaluated by the gray relational analysis (GRA) method and the traditional analytic hierarchy process (AHP) method, and the consistency rate of the two results was analysed. The research focused on proposing a method of testing the rationality of FQ evaluation results based on crop yield: first, generating a grade map of crop yield and overlaying it with the FQ evaluation maps; then, analysing their consistency rate for each grade at the same spatial position; and finally, examining the consistency effects and deciding whether to adopt the results. The results showed that the consistency rates between the two methods in area and in number of matching evaluation units were 84.68% and 87.29%, respectively, and the spatial distributions were approximately equal. The area consistency rates between crop yield level and the FQ evaluation levels by GRA and AHP were 78.15% and 74.29%, respectively. Therefore, the verification results of GRA and AHP were similar, good, and acceptable, and the FQ results from both could reflect crop yield levels. The evaluation results by GRA were, as a whole, slightly more rational than those by AHP. PMID:27490247
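
    The grade-overlay consistency check described in the abstract reduces to comparing grade labels unit by unit across the two maps. A minimal sketch, with hypothetical grade arrays standing in for the rasterised evaluation maps:

```python
import numpy as np

def consistency_rate(grades_a, grades_b, area=None):
    """Area-weighted consistency rate between two farmland-quality grade maps.

    grades_a, grades_b: equal-shape arrays of grade labels per evaluation
    unit (a hypothetical rasterised overlay of the two evaluation maps).
    area: optional per-unit areas; defaults to equal-area units.
    """
    a = np.asarray(grades_a)
    b = np.asarray(grades_b)
    if area is None:
        area = np.ones(a.shape, dtype=float)
    area = np.asarray(area, dtype=float)
    match = (a == b)                     # units assigned the same grade
    return float(area[match].sum() / area.sum())

# Toy overlay: 8 of 10 equal-area units receive the same grade.
gra = np.array([1, 1, 2, 2, 3, 3, 1, 2, 3, 1])
ahp = np.array([1, 1, 2, 2, 3, 3, 2, 2, 3, 3])
rate = consistency_rate(gra, ahp)
```

The same function, applied to a crop-yield grade map against each FQ map, gives the yield-based verification rates reported in the abstract.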

  3. [CHROMATOSPECTROPHOTOMETRIC METHOD OF QUANTITATIVE ANALYSIS OF LAPPACONITINE IN THE UNDERGROUND PARTS OF ACONITUM ORIENTALE MILL, GROWING IN GEORGIA].

    PubMed

    Kintsurashvili, L

    2016-05-01

    Aconitum orientale Mill (family Helleboraceae) is a perennial herb, spread in the forests of western and eastern Georgia and in the subalpine zone. The research objects were the underground parts of Aconitum orientale Mill, picked in the fruiting phase in Borjomi in 2014. The total alkaloids were obtained from the air-dried underground parts (1.5 kg) by chloroform extraction after basification with 5% sodium carbonate. We obtained 16.5 g of total alkaloids and determined that the predominant one is the pharmacologically active diterpene alkaloid lappaconitine, the active principle of the antiarrhythmic drug "Allapinin". A chromatospectrophotometric method of quantitative analysis of lappaconitine was elaborated for determining the productivity of the underground parts of Aconitum orientale Mill. It was determined that the maximal absorption wavelength in the ultraviolet spectrum (λmax) is 308 nm, and statistical processing of the quantitative analysis results established that the relative error is within the norm (4%). The content of lappaconitine in the underground parts of Aconitum orientale Mill is 0.11-0.13% in the fruiting phase. On the basis of these experimental data, Aconitum orientale Mill is approved as a raw material for obtaining pharmacologically active lappaconitine. PMID:27348177

  4. Application of a quantitative (1)H-NMR method for the determination of paeonol in Moutan cortex, Hachimijiogan and Keishibukuryogan.

    PubMed

    Tanaka, Rie; Shibata, Hikari; Sugimoto, Naoki; Akiyama, Hiroshi; Nagatsu, Akito

    2016-10-01

    Quantitative (1)H-NMR ((1)H-qNMR) was applied to the determination of paeonol concentration in Moutan cortex, Hachimijiogan, and Keishibukuryogan. Paeonol is a major component of Moutan cortex, and its purity was calculated from the ratio of the intensity of the paeonol H-3' signal at δ 6.41 ppm in methanol-d4 or 6.40 ppm in methanol-d4 + TFA-d to that of a hexamethyldisilane (HMD) signal at 0 ppm. The concentration of HMD was corrected with SI traceability by using potassium hydrogen phthalate of certified reference material grade. As a result, the paeonol content in two lots of Moutan cortex as determined by (1)H-qNMR was found to be 1.59 % and 1.62 %, respectively, while the paeonol content in Hachimijiogan and Keishibukuryogan was 0.15 % and 0.22 %, respectively. The present study demonstrated that the (1)H-NMR method is useful for the quantitative analysis of crude drugs and Kampo formulas. PMID:27164909
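
    The qNMR quantitation follows the standard internal-standard relation: the analyte-to-standard signal ratio, scaled by proton counts, molar masses, and weighed-in masses, gives the analyte content. A sketch with illustrative numbers (not values from the study):

```python
def qnmr_content(I_a, N_a, M_a, I_std, N_std, M_std, m_std, m_sample, P_std=1.0):
    """Analyte content (mass fraction) from a 1H-qNMR signal ratio.

    Standard qNMR relation: signal-intensity ratio of analyte to internal
    standard, corrected for the number of protons behind each signal (N),
    molar masses (M), weighed-in masses (m) and standard purity (P_std).
    """
    return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * (m_std / m_sample) * P_std

# Illustrative numbers only: paeonol (M = 166.17 g/mol, 1 proton at H-3')
# against hexamethyldisilane (M = 146.38 g/mol, 18 equivalent protons).
content = qnmr_content(I_a=0.35, N_a=1, M_a=166.17,
                       I_std=1.00, N_std=18, M_std=146.38,
                       m_std=2.0, m_sample=500.0, P_std=0.999)
```

Here `content` is the mass fraction of paeonol in the 500 mg sample; the intensity values and masses are placeholders for a real integration report.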

  5. Linking Functional Connectivity and Structural Connectivity Quantitatively: A Comparison of Methods.

    PubMed

    Huang, Haiqing; Ding, Mingzhou

    2016-03-01

    Structural connectivity in the brain is the basis of functional connectivity. Quantitatively linking the two, however, remains a challenge. For a pair of regions of interest (ROIs), anatomical connections derived from diffusion-weighted imaging are often quantified by fractional anisotropy (FA) or edge weight, whereas functional connections, derived from resting-state functional magnetic resonance imaging, can be characterized by non-time-series measures such as zero-lag cross correlation and partial correlation, as well as by time-series measures such as coherence and Granger causality. In this study, we addressed the question of linking structural connectivity and functional connectivity quantitatively by considering two pairs of ROIs, one from the default mode network (DMN) and the other from the central executive network (CEN), using two different data sets. Selecting (1) posterior cingulate cortex and medial prefrontal cortex of the DMN as the first pair of ROIs and (2) left dorsal lateral prefrontal cortex and left inferior parietal lobule of the CEN as the second pair of ROIs, we show that (1) zero-lag cross correlation, partial correlation, and pairwise Granger causality were not significantly correlated with either mean FA or edge weight and (2) conditional Granger causality (CGC) was significantly correlated with edge weight but not with mean FA. These results suggest that (1) edge weight may be a more appropriate measure to quantify the strength of the anatomical connection between ROIs and (2) CGC, which statistically removes common input and the indirect influences between a given ROI pair, may be a more appropriate measure to quantify the strength of the functional interaction enabled by the fibers linking the two ROIs. PMID:26598788
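
    Of the non-time-series measures mentioned, partial correlation is easily sketched: it follows from the inverse covariance (precision) matrix of the ROI time series. A minimal example with synthetic data, where a common driver z induces a strong marginal x-y correlation that vanishes once z is controlled for:

```python
import numpy as np

def partial_correlation(ts):
    """Partial-correlation matrix of multivariate time series.

    ts: (T, n) array of T samples from n ROIs.  The partial correlation
    between two ROIs, controlling for all others, follows from the
    precision matrix P = inv(cov):  rho_ij = -P_ij / sqrt(P_ii * P_jj).
    """
    cov = np.cov(np.asarray(ts, dtype=float), rowvar=False)
    prec = np.linalg.inv(cov)
    d = np.sqrt(np.diag(prec))
    rho = -prec / np.outer(d, d)
    np.fill_diagonal(rho, 1.0)
    return rho

# Toy data: z drives both x and y.
rng = np.random.default_rng(0)
z = rng.standard_normal(5000)
x = z + 0.1 * rng.standard_normal(5000)
y = z + 0.1 * rng.standard_normal(5000)
rho = partial_correlation(np.column_stack([x, y, z]))
marginal = float(np.corrcoef(x, y)[0, 1])   # large, driven by z
```

Conditional Granger causality plays the analogous role for the time-series measures, statistically removing common input and indirect routes.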

  6. Quantitative cytochemistry of glycogen in blood cells. Methods and clinical application.

    PubMed

    Gahrton, G; Yataganas, X

    1976-01-01

    Quantitative glycogen determinations can be made in single blood and bone marrow cells, using microspectrophotometry or microfluorometry after staining with variants of the periodic acid-Schiff (PAS) reaction. These PAS variant reactions generally do not indicate the presence of non-glycogen PAS-positive substances, known to be prevalent in various hematopoietic cells, possibly due to masking of reactive groups. The specificity of the reaction in blood cells was ascertained by alpha-amylase digestion, which removed more than 95% of the PAS-positive material. Calibration of the PAS reaction was undertaken with a microdroplet model of pure leukocyte glycogen. The glycogen amounts in the droplets were determined by microinterferometry, the droplets were stained with a variant PAS reaction, and the total extinction of the reaction product in the stained droplets was determined by microspectrophotometry. The extinction coefficient k was obtained from the equation k = Etot/M, where Etot is the total extinction determined by microspectrophotometry and M is the dry glycogen amount determined by microinterferometry. The microinterferometric dry mass determinations were calibrated by X-ray absorption in order to obtain the absolute amounts of glycogen. For practical purposes a reference system was made of normal neutrophil leukocytes. The glycogen content in the reference neutrophils was first determined with the micromodel. These neutrophils, now with a known glycogen amount, were stained with the PAS reagents and measured microspectrophotometrically in parallel with cells containing an unknown glycogen amount. Alternatively, the staining was made with a fluorescent PAS reaction, and the glycogen content determined by microfluorometry. Both methods appeared suitable for determining the glycogen content of blood cells from patients with various diseases, though the microfluorometric method was preferable for measurements of small amounts of

  7. Effect of platform, reference material, and quantification model on enumeration of Enterococcus by quantitative PCR methods

    EPA Science Inventory

    Quantitative polymerase chain reaction (qPCR) is increasingly being used for the quantitative detection of fecal indicator bacteria in beach water. QPCR allows for same-day health warnings, and its application is being considered as an option for recreational water quality testi...

  8. A method for quantitative analysis of clump thickness in cervical cytology slides.

    PubMed

    Fan, Yilun; Bradley, Andrew P

    2016-01-01

    Knowledge of the spatial distribution and thickness of cytology specimens is critical to the development of digital slide acquisition techniques that minimise both scan times and image file size. In this paper, we evaluate a novel method to achieve this goal utilising an exhaustive high-resolution scan, an over-complete wavelet transform across multi-focal planes and a clump segmentation of all cellular materials on the slide. The method is demonstrated with a quantitative analysis of ten normal but difficult-to-scan Pap-stained ThinPrep cervical cytology slides. We show that with this method the top and bottom of the specimen can be estimated to an accuracy of 1 μm in 88% and 97% of the fields of view respectively. Overall, cellular material can be over 30 μm thick and the distribution of cells is skewed towards the cover-slip (top of the slide). However, the median clump thickness is 10 μm and only 31% of clumps contain more than three nuclei. Therefore, by finding a focal map of the specimen the number of 1 μm spaced focal planes that are required to be scanned to acquire 95% of the in-focus material can be reduced from 25.4 to 21.4 on average. In addition, we show that by considering the thickness of the specimen, an improved focal map can be produced which further reduces the required number of 1 μm spaced focal planes to 18.6. This has the potential to reduce scan times and raw image data by over 25%. PMID:26477005
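
    The plane-count saving from a focal map can be illustrated with a toy calculation. The per-field bottom/top values below are hypothetical, and the counting rule is one plausible reading of the abstract, not the paper's exact procedure:

```python
import numpy as np

def planes_required(bottoms, tops, spacing=1.0):
    """Average number of focal planes needed per field of view.

    Once a focal map gives the specimen bottom and top per field of view
    (in um), only the planes spanning that range need to be scanned,
    instead of a fixed full-depth stack.
    """
    bottoms = np.asarray(bottoms, dtype=float)
    tops = np.asarray(tops, dtype=float)
    # Planes at `spacing` intervals covering [bottom, top] inclusively.
    n = np.floor((tops - bottoms) / spacing).astype(int) + 1
    return float(n.mean())

# Hypothetical focal map: most fields are thin, one holds a thick clump.
bottoms = np.array([0.0, 0.0, 1.0, 2.0, 0.0])
tops = np.array([10.0, 8.0, 12.0, 30.0, 9.0])
avg = planes_required(bottoms, tops)   # compare with a fixed 25-plane stack
```

With these toy numbers the focal map cuts the average stack well below a fixed 25-plane scan, mirroring the 25.4 to 18.6 reduction reported.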

  9. Supersonic cruise research aircraft structural studies: Methods and results

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Gross, D.; Kurtze, W.; Newsom, J.; Wrenn, G.; Greene, W.

    1981-01-01

    NASA Langley Research Center SCAR in-house structural studies are reviewed. In methods development, advances include a new system of integrated computer programs called ISSYS, progress in determining aerodynamic loads and aerodynamically induced structural loads (including those due to gusts), flutter optimization for composite and metal airframe configurations using refined and simplified mathematical models, and synthesis of active controls. Results given address several aspects of various SCR configurations. These results include flutter penalties on composite wing, flutter suppression using active controls, roll control effectiveness, wing tip ground clearance, tail size effect on flutter, engine weight and mass distribution influence on flutter, and strength and flutter optimization of new configurations. The ISSYS system of integrated programs performed well in all the applications illustrated by the results, the diversity of which attests to ISSYS' versatility.

  10. A novel imaging method for quantitative Golgi localization reveals differential intra-Golgi trafficking of secretory cargoes

    PubMed Central

    Tie, Hieng Chiong; Mahajan, Divyanshu; Chen, Bing; Cheng, Li; VanDongen, Antonius M. J.; Lu, Lei

    2016-01-01

    Cellular functions of the Golgi are determined by the unique distribution of its resident proteins. Currently, electron microscopy is required for the localization of a Golgi protein at the sub-Golgi level. We developed a quantitative sub-Golgi localization method based on centers of fluorescence masses of nocodazole-induced Golgi ministacks under conventional optical microscopy. Our method is rapid, convenient, and quantitative, and it yields a practical localization resolution of ∼30 nm. The method was validated by the previous electron microscopy data. We quantitatively studied the intra-Golgi trafficking of synchronized secretory membrane cargoes and directly demonstrated the cisternal progression of cargoes from the cis- to the trans-Golgi. Our data suggest that the constitutive efflux of secretory cargoes could be restricted at the Golgi stack, and the entry of the trans-Golgi network in secretory pathway could be signal dependent. PMID:26764092

  11. Comparison of the Multiple-sample means with composite sample results for fecal indicator bacteria by quantitative PCR and culture

    EPA Science Inventory

    ABSTRACT: Few studies have addressed the efficacy of composite sampling for measurement of indicator bacteria by QPCR. In this study, composite results were compared to single sample results for culture- and QPCR-based water quality monitoring. Composite results for both methods ...

  12. A processing method and results of meteor shower radar observations

    NASA Technical Reports Server (NTRS)

    Belkovich, O. I.; Suleimanov, N. I.; Tokhtasjev, V. S.

    1987-01-01

    Studies of meteor showers permit several principal problems of meteor astronomy to be solved: obtaining the structure of a stream in cross section and along its orbit; retracing the evolution of the particle orbits of the stream, taking into account gravitational and nongravitational forces, and discovering the orbital elements of its parent body; finding the total mass of solid particles ejected from the parent body, taking into account the physical and chemical evolution of meteor bodies; and using meteor streams as natural probes for investigating the average characteristics of the meteor complex in the solar system. A simple and effective method of determining the flux density and the mass exponent parameter was worked out. This method and its results are discussed.

  13. Classical dense matter physics: some basic methods and results

    NASA Astrophysics Data System (ADS)

    Čelebonović, Vladan

    2002-07-01

    This is an introduction to the basic notions, some methods and open problems of dense matter physics and their applications in astrophysics. Experimental topics cover the range from the work of P. W. Bridgman to the discovery and basic results of use of the diamond anvil cell. On the theoretical side, the semiclassical method of P. Savić and R. Kašanin is described. The choice of these topics is conditioned by their applicability in astrophysics and the author's research experience. At the end of the paper is presented a list of some unsolved problems in dense matter physics and astrophysics, some (or all) of which could form a basis of future collaborations.

  14. Methods and preliminary measurement results of liquid Li wettability

    SciTech Connect

    Zuo, G. Z. Hu, J. S.; Ren, J.; Sun, Z.; Yang, Q. X.; Li, J. G.; Zakharov, L. E.; Mansfield, D. K.

    2014-02-15

    A test of lithium wettability was performed in high vacuum (< 3 × 10⁻⁴ Pa). High-magnification images of Li droplets on stainless steel substrates were produced and processed using the MATLAB® program to obtain clear image edge points. In contrast to the more standard “θ/2” or polynomial fitting methods, ellipse fitting of the complete Li droplet shape resulted in reliable contact angle measurements over a wide range of contact angles. Using the ellipse fitting method, it was observed that the contact angle of a liquid Li droplet on a stainless steel substrate gradually decreased with increasing substrate temperature. The critical wetting temperature of liquid Li on stainless steel was observed to be about 290 °C.
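
    The ellipse-fitting approach can be sketched for the simplified case of an axis-aligned ellipse with the substrate at y = 0, fitted by linear least squares (the paper's MATLAB implementation may differ, e.g. by fitting a general conic):

```python
import numpy as np

def contact_angle_ellipse(edge_xy):
    """Contact angle (degrees) from an ellipse fit of a droplet profile.

    Models the droplet edge as an axis-aligned ellipse
    A*x^2 + B*(y - c)^2 = 1 with the substrate at y = 0, fitted by
    linear least squares; the contact angle is the tangent angle at
    the three-phase contact point, measured through the liquid.
    """
    x, y = np.asarray(edge_xy, dtype=float).T
    # Linearised form: p*x^2 + q*y^2 + r*y = 1
    design = np.column_stack([x ** 2, y ** 2, y])
    p, q, r = np.linalg.lstsq(design, np.ones_like(x), rcond=None)[0]
    c = -r / (2.0 * q)                      # ellipse centre height
    scale = 1.0 + q * c ** 2                # undo the normalisation
    A, B = p / scale, q / scale
    x_c = np.sqrt((1.0 - B * c ** 2) / A)   # contact-point abscissa
    # Tangent angle at (x_c, 0): < 90 deg when the centre lies below
    # the substrate (c < 0), > 90 deg when above (c > 0).
    return float(np.degrees(np.arctan2(A * x_c, -B * c)))

# Synthetic droplet: semi-axes a = 2, b = 1, centre 0.5 below the substrate.
t = np.linspace(np.pi / 6, 5 * np.pi / 6, 200)
edge = np.column_stack([2.0 * np.cos(t), -0.5 + np.sin(t)])
theta = contact_angle_ellipse(edge)
```

For this synthetic profile the exact tangent angle is arctan2(A·x_c, -B·c) ≈ 40.9°, which the fit recovers.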

  15. Quantitative analysis of ecological effects for land use planning based on ecological footprint method: a case research in Nanyang City

    NASA Astrophysics Data System (ADS)

    Zhang, Jing; Liu, Yaolin; Chen, Xinming

    2008-10-01

    The research of coordinated development between land use and ecological construction is a new problem arising with the development of the national economy, whose aim is to improve economic development and protect the eco-environment so as to realize regional sustainable development. Evaluating human effects on the ecosystem by a comprehensive, scientific, and quantitative method is a critical issue in the process of general land use planning. At present, the ecological footprint methodology, as an excellent educational tool applicable to global issues, is essential for quantifying humanity's consumption of natural capital, for overall assessments of human impact on earth, and for general land use planning. However, quantitative studies on the development trends of ecological footprint (EF) time series and biological capacity (BC) time series in a given region are still rare. Taking Nanyang City as a case study, this paper presents two quantitative time-scale indices, the change rate and the scissors difference, to quantitatively analyze the trends of EF and BC over the period 1997-2004 and to evaluate the ecological effects of the general land use plan from 1997 to 2010. The results showed that: (1) in Nanyang City, the per capita EF and BC trended in opposite directions, and the ecological deficit increased from 1997 to 2010; (2) the difference between the two development trends of per capita EF and BC increased rapidly, and the conflict between EF and BC was aggravated from 1997 to 2010; (3) the general land use plan (1997-2010) of Nanyang City had produced some positive effects on the local ecosystem, but the expected biological capacity in 2010 can hardly be realized following this trend. Therefore, this paper introduces a "trinity" land use model under the guidelines of an environment-friendly land use pattern, based on the actual situation of Nanyang City, with the systemic synthesis of land
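
    The two indices are not defined in the excerpt, so the following is a hypothetical formalisation: the change rate as an average annual rate of an EF or BC series, and the scissors difference as the opening angle between the fitted EF and BC trend lines. Toy numbers, not the Nanyang data:

```python
import numpy as np

def change_rate(series, years):
    """Average annual change rate of a time series (fraction per year)."""
    series = np.asarray(series, dtype=float)
    span = years[-1] - years[0]
    return float((series[-1] / series[0]) ** (1.0 / span) - 1.0)

def scissors_difference(ef, bc, years):
    """Opening angle (degrees) between the EF and BC linear trends.

    Hypothetical formalisation: the angle between the two fitted trend
    lines, which grows as footprint and capacity diverge.
    """
    k_ef = np.polyfit(years, ef, 1)[0]   # slope of the EF trend
    k_bc = np.polyfit(years, bc, 1)[0]   # slope of the BC trend
    return float(np.degrees(np.arctan(k_ef) - np.arctan(k_bc)))

years = np.arange(1997, 2005)
ef = np.linspace(1.2, 1.8, 8)     # per-capita EF rising (toy numbers, gha)
bc = np.linspace(0.7, 0.6, 8)     # per-capita BC falling (toy numbers, gha)
rate = change_rate(ef, years)     # roughly 6% per year
gap = scissors_difference(ef, bc, years)
```

A positive, growing `gap` over successive sub-periods corresponds to the widening EF-BC "scissors" described in the abstract.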

  16. Development of a Screening Method for the Frail Elderly by Quantitative Measurement of Lower Limb Muscular Strength

    NASA Astrophysics Data System (ADS)

    Yamashita, Kazuhiko; Iwakami, Yumi; Imaizumi, Kazuya; Sato, Mitsuru; Nakajima, Sawako; Ino, Shuichi; Kawasumi, Masashi; Ifukube, Tohru

    Falling is one of the most serious problems for the elderly. The aim of this study was to develop a screening method for identifying factors that increase the risk of falling among the elderly, particularly with regard to lower limb muscular strength. Subjects were 48 elderly volunteers, including 25 classed as healthy and 23 classed as frail. All subjects underwent measurement of lower limb muscular strength via toe gap force and measurement of muscle strength of the hip joint adductor via knee gap force. In the frail group, toe gap force of the right foot was 20% lower than that in the healthy group; toe gap force of the left foot in the frail group was 23% lower than that in the healthy group, while knee gap force was 20% lower. Furthermore, we found that combining left toe gap force and knee gap force gave the highest odds ratio (6.05) with 82.6% sensitivity and 56.0% specificity when the toe gap force was 24 N and the knee gap force was 100 N. Thus, lower limb muscular strength can be used for simple and efficient screening, and approaches to prevent falls can be based on quantitative data such as lower limb muscular strength.
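
    The cutoff-based screening statistics (sensitivity, specificity, odds ratio) reported above can be sketched directly from a 2x2 table. The force values below are illustrative toy data, not the study's measurements:

```python
import numpy as np

def screening_stats(values, is_frail, cutoff):
    """Sensitivity, specificity and odds ratio of a strength cutoff.

    values: measured forces (N); subjects below the cutoff screen
    positive (at risk of falling).
    """
    values = np.asarray(values, dtype=float)
    is_frail = np.asarray(is_frail, dtype=bool)
    positive = values < cutoff
    tp = np.sum(positive & is_frail)      # frail, screened positive
    fn = np.sum(~positive & is_frail)     # frail, missed
    fp = np.sum(positive & ~is_frail)     # healthy, flagged
    tn = np.sum(~positive & ~is_frail)    # healthy, cleared
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    odds = (tp * tn) / (fp * fn)
    return float(sens), float(spec), float(odds)

# Toy knee-gap forces (N): first four subjects frail, last four healthy.
knee = np.array([80.0, 90.0, 95.0, 105.0, 110.0, 120.0, 95.0, 130.0])
frail = np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=bool)
sens, spec, odds = screening_stats(knee, frail, cutoff=100.0)
```

The study's combined criterion (left toe gap force below 24 N or knee gap force below 100 N) would be built the same way, from a combined positive indicator.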

  17. A rapid fluorescence based method for the quantitative analysis of cell culture media photo-degradation.

    PubMed

    Calvet, Amandine; Li, Boyan; Ryder, Alan G

    2014-01-01

    Cell culture media are very complex chemical mixtures and are one of the most important aspects of biopharmaceutical manufacturing. The complex composition of many media leads to materials that are inherently unstable; of particular concern is media photo-damage, which can adversely affect cell culture performance. This can be particularly significant when small-scale transparent bioreactors and media containers are used for process development or research. Chromatographic and/or mass spectrometry based analyses are often time-consuming and expensive for routine high-throughput media analysis, particularly during scale-up or development. Fluorescence excitation-emission matrix (EEM) spectroscopy combined with multi-way chemometrics is a robust methodology applicable to the analysis of raw materials, media, and bioprocess broths. Here we demonstrate how EEM spectroscopy was used for the rapid, quantitative analysis of media degradation caused by ambient visible light exposure. The primary degradation pathways involve riboflavin (leading to the formation of lumichrome, LmC), which also causes photo-sensitised degradation of tryptophan; this was validated using high-pressure liquid chromatography (HPLC) measurements. The use of PARallel FACtor analysis (PARAFAC), multivariate curve resolution (MCR), and N-way partial least squares (NPLS) enabled rapid and easy monitoring of the compositional changes in tryptophan (Trp), tyrosine (Tyr), and riboflavin (Rf) concentration caused by ambient light exposure. Excellent agreement between the HPLC and EEM methods was found for the changes in Trp, Rf, and LmC concentrations. PMID:24356227
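
    As a much-simplified stand-in for the multi-way PARAFAC/MCR analysis, a single EEM with known component excitation-emission profiles can be unmixed by ordinary least squares (the component matrices here are random placeholders, not real fluorophore spectra):

```python
import numpy as np

def unmix_eem(eem, components):
    """Least-squares scores of known component profiles in a measured EEM.

    With the component excitation-emission profiles fixed, each measured
    EEM (flattened to a vector) is a linear mixture, and the component
    scores (proportional to concentrations) follow from least squares.
    """
    C = np.column_stack([c.ravel() for c in components])
    scores, *_ = np.linalg.lstsq(C, np.asarray(eem, dtype=float).ravel(),
                                 rcond=None)
    return scores

# Placeholder "spectra" on a 4x4 excitation-emission grid.
rng = np.random.default_rng(1)
trp = rng.random((4, 4))   # stands in for a tryptophan profile
rf = rng.random((4, 4))    # stands in for a riboflavin profile
mix = 2.0 * trp + 0.5 * rf             # sample with known composition
scores = unmix_eem(mix, [trp, rf])     # recovers the mixing weights
```

PARAFAC differs in that it estimates the component profiles themselves from a stack of EEMs, rather than assuming them known.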

  18. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, R.V.

    1993-03-16

    The present invention is a quantitative method for measuring the total heat flux, and of deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infrared sensing devices.

  19. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, Robert V.

    1993-01-01

    The present invention is a quantitative method for measuring the total heat flux, and for deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infra-red sensing devices.

  20. Toward a quantitative account of pitch distribution in spontaneous narrative: Method and validation

    PubMed Central

    Matteson, Samuel E.; Streit Olness, Gloria; Caplow, Nancy J.

    2013-01-01

    Pitch is well-known both to animate human discourse and to convey meaning in communication. The study of the statistical population distributions of pitch in discourse will undoubtedly benefit from methodological improvements. The current investigation examines a method that parameterizes pitch in discourse as musical pitch interval H measured in units of cents and that disaggregates the sequence of peak word-pitches using tools employed in time-series analysis and digital signal processing. The investigators test the proposed methodology by its application to distributions in pitch interval of the peak word-pitch (collectively called the discourse gamut) that occur in simulated and actual spontaneous emotive narratives obtained from 17 middle-aged African-American adults. The analysis, in rigorous tests, not only faithfully reproduced simulated distributions embedded in realistic time series that drift and include pitch breaks, but the protocol also reveals that the empirical distributions exhibit a common hidden structure when normalized to a slowly varying mode (called the gamut root) of their respective probability density functions. Quantitative differences between narratives reveal the speakers' relative propensity for the use of pitch levels corresponding to elevated degrees of a discourse gamut (the “e-la”) superimposed upon a continuum that conforms systematically to an asymmetric Laplace distribution. PMID:23654400
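
    The parameterization of pitch as a musical interval in cents follows the standard definition H = 1200 · log2(f / f_ref); a minimal sketch (the frequencies are illustrative):

```python
import math

def pitch_interval_cents(f_hz, f_ref_hz):
    """Musical pitch interval H in cents relative to a reference frequency."""
    return 1200.0 * math.log2(f_hz / f_ref_hz)

# One octave above the reference is 1200 cents; one equal-tempered
# semitone is 100 cents.
octave = pitch_interval_cents(220.0, 110.0)
semitone = pitch_interval_cents(110.0 * 2 ** (1 / 12), 110.0)
```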

  1. A simple method for the quantitative microextraction of polychlorinated biphenyls from soils and sediments.

    SciTech Connect

    Szostek, B.; Tinklenberg, J. A.; Aldstadt, J. H., III; Environmental Research

    1999-01-01

    We demonstrate the quantitative extraction of polychlorinated biphenyls (PCBs) from environmental solids by using a microscale adaptation of pressurized fluid extraction (µPFE). The stainless steel extraction cells are filled with a solid sample and solvent and are heated at elevated temperature. After cooling the cell to room temperature, we determined PCBs in the extract by direct injection into a gas chromatograph with an electron capture detection system. This extraction method was tested on a set of PCB-spiked solid matrices and on a PCB-contaminated river sediment (NIST SRM 1939). Recoveries were measured for eight PCB congeners spiked into two soil types with hexane extraction at 100 °C (81.9 ± 5.4% to 112.5 ± 10.1%). The extraction process for SRM 1939 with hexane at 300 °C provided significantly higher recoveries for several representative PCB congeners than reported for a duplicate 16-hour Soxhlet extraction with a mixture of organic solvents (acetone/hexane).
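
    Recovery figures such as those quoted above are simple ratios of measured to spiked amounts, reported as a mean over congeners; a minimal sketch (the congener amounts are invented for illustration):

```python
def percent_recovery(measured_ng, spiked_ng):
    """Extraction recovery (%) for one spiked congener."""
    return 100.0 * measured_ng / spiked_ng

# Illustrative (measured, spiked) pairs in ng for three hypothetical congeners
pairs = [(45.2, 50.0), (51.8, 50.0), (48.9, 50.0)]
recoveries = [percent_recovery(m, s) for m, s in pairs]
mean_recovery = sum(recoveries) / len(recoveries)
```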

  2. Quantitative assessment of MS plaques and brain atrophy in multiple sclerosis using semiautomatic segmentation method

    NASA Astrophysics Data System (ADS)

    Heinonen, Tomi; Dastidar, Prasun; Ryymin, Pertti; Lahtinen, Antti J.; Eskola, Hannu; Malmivuo, Jaakko

    1997-05-01

    Quantitative magnetic resonance (MR) imaging of the brain is useful in multiple sclerosis (MS) in order to obtain reliable indices of disease progression. The goal of this project was to estimate the total volume of gliotic and non-gliotic plaques in chronic progressive multiple sclerosis with the help of a semiautomatic segmentation method developed at the Ragnar Granit Institute. The developed program, running on a PC, provides displays of the segmented data in addition to the volumetric analyses. The volumetric accuracy of the program was demonstrated by segmenting MR images of fluid-filled syringes. An anatomical atlas is to be incorporated in the segmentation system to estimate the distribution of MS plaques in various neural pathways of the brain. A total package including MS plaque volume estimation, estimation of brain atrophy and ventricular enlargement, and the distribution of MS plaques in different neural segments of the brain has been planned for the near future. Our study confirmed that total lesion volumes in chronic MS disease show a poor correlation to EDSS scores but a positive correlation to neuropsychological scores. Therefore, accurate total volume measurements of MS plaques using the developed semiautomatic segmentation technique helped us to evaluate the degree of neuropsychological impairment.

  3. Toward a quantitative account of pitch distribution in spontaneous narrative: method and validation.

    PubMed

    Matteson, Samuel E; Olness, Gloria Streit; Caplow, Nancy J

    2013-05-01

    Pitch is well-known both to animate human discourse and to convey meaning in communication. The study of the statistical population distributions of pitch in discourse will undoubtedly benefit from methodological improvements. The current investigation examines a method that parameterizes pitch in discourse as musical pitch interval H measured in units of cents and that disaggregates the sequence of peak word-pitches using tools employed in time-series analysis and digital signal processing. The investigators test the proposed methodology by its application to distributions in pitch interval of the peak word-pitch (collectively called the discourse gamut) that occur in simulated and actual spontaneous emotive narratives obtained from 17 middle-aged African-American adults. The analysis, in rigorous tests, not only faithfully reproduced simulated distributions embedded in realistic time series that drift and include pitch breaks, but the protocol also reveals that the empirical distributions exhibit a common hidden structure when normalized to a slowly varying mode (called the gamut root) of their respective probability density functions. Quantitative differences between narratives reveal the speakers' relative propensity for the use of pitch levels corresponding to elevated degrees of a discourse gamut (the "e-la") superimposed upon a continuum that conforms systematically to an asymmetric Laplace distribution. PMID:23654400

  4. Revisiting the Isobole and Related Quantitative Methods for Assessing Drug Synergism

    PubMed Central

    2012-01-01

    The isobole is well established and commonly used in the quantitative study of agonist drug combinations. This article reviews the isobole, its derivation from the concept of dose equivalence, and its usefulness in providing the predicted effect of an agonist drug combination, a topic not discussed in pharmacology textbooks. This review addresses that topic and also shows that an alternate method, called “Bliss independence,” is inconsistent with the isobolar approach and also has a less clear conceptual basis. In its simplest application the isobole is the familiar linear plot in Cartesian coordinates with intercepts representing the individual drug potencies. It is also shown that the isobole can be nonlinear, a fact recognized by its founder (Loewe) but neglected or rejected by virtually all other users. Whether its shape is linear or nonlinear the isobole is equally useful in detecting synergism and antagonism for drug combinations, and its theoretical basis leads to calculations of the expected effect of a drug combination. Numerous applications of isoboles in preclinical testing have shown that synergism or antagonism is not only a property of the two agonist drugs; the dose ratio is also important, a fact of potential importance to the design and testing of drug combinations in clinical trials. PMID:22511201

  5. Challenges of interdisciplinary research: reconciling qualitative and quantitative methods for understanding human-landscape systems.

    PubMed

    Lach, Denise

    2014-01-01

    While interdisciplinary research is increasingly practiced as a way to transcend the limitations of individual disciplines, our concepts and methods are primarily rooted in the disciplines that shape the way we think about the world and how we conduct research. While natural and social scientists may share a general understanding of how science is conducted, disciplinary differences in methodologies quickly emerge during interdisciplinary research efforts. This paper briefly introduces and reviews different philosophical underpinnings of quantitative and qualitative methodological approaches and introduces the idea that a pragmatic, realistic approach may allow natural and social scientists to work together productively. While realism assumes that there is a reality that exists independently of our perceptions, the work of scientists is to explore the mechanisms by which actions cause meaningful outcomes and the conditions under which the mechanisms can act. Our task as interdisciplinary researchers is to use the insights of our disciplines in the context of the problem to co-produce an explanation for the variables of interest. Research on qualities necessary for successful interdisciplinary researchers is also discussed along with recent efforts by funding agencies and academia to increase capacities for interdisciplinary research. PMID:23892682

  6. Challenges of Interdisciplinary Research: Reconciling Qualitative and Quantitative Methods for Understanding Human-Landscape Systems

    NASA Astrophysics Data System (ADS)

    Lach, Denise

    2014-01-01

    While interdisciplinary research is increasingly practiced as a way to transcend the limitations of individual disciplines, our concepts and methods are primarily rooted in the disciplines that shape the way we think about the world and how we conduct research. While natural and social scientists may share a general understanding of how science is conducted, disciplinary differences in methodologies quickly emerge during interdisciplinary research efforts. This paper briefly introduces and reviews different philosophical underpinnings of quantitative and qualitative methodological approaches and introduces the idea that a pragmatic, realistic approach may allow natural and social scientists to work together productively. While realism assumes that there is a reality that exists independently of our perceptions, the work of scientists is to explore the mechanisms by which actions cause meaningful outcomes and the conditions under which the mechanisms can act. Our task as interdisciplinary researchers is to use the insights of our disciplines in the context of the problem to co-produce an explanation for the variables of interest. Research on qualities necessary for successful interdisciplinary researchers is also discussed along with recent efforts by funding agencies and academia to increase capacities for interdisciplinary research.

  7. Development and validation of an event-specific quantitative PCR method for genetically modified maize MIR162.

    PubMed

    Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2014-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize event, MIR162. We first prepared a standard plasmid for MIR162 quantification. The conversion factor (Cf) required to calculate the genetically modified organism (GMO) amount was empirically determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI7900) and the Applied Biosystems 7500 (ABI7500), for which the determined Cf values were 0.697 and 0.635, respectively. To validate the developed method, a blind test was carried out in an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases were less than 25% and the RSDr values were less than 20% at all evaluated concentrations. These results suggested that the limit of quantitation of the method was 0.5%, and that the developed method would thus be suitable for practical analyses for the detection and quantification of MIR162. PMID:25743383
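
    Event-specific qPCR quantification typically converts the ratio of event-specific to taxon-specific copy numbers into a GMO percentage via the conversion factor Cf; a hedged sketch using the study's ABI7900 Cf value (the copy numbers themselves are invented for illustration):

```python
def gmo_percent(gm_copies, endogenous_copies, cf):
    """GMO amount (%) from event-specific and taxon-specific copy numbers,
    using the instrument-specific conversion factor Cf."""
    return (gm_copies / endogenous_copies) / cf * 100.0

# With the ABI7900 Cf of 0.697 reported in the study, a copy ratio of
# 0.00697 corresponds to a GMO amount of 1.0%.
pct = gmo_percent(697.0, 100000.0, 0.697)
```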

  8. A novel preclinical method to quantitatively evaluate early-stage metastatic events at the murine blood-brain barrier.

    PubMed

    Adkins, Chris E; Nounou, Mohamed I; Mittapalli, Rajendar K; Terrell-Hall, Tori B; Mohammad, Afroz S; Jagannathan, Rajaganapathi; Lockman, Paul R

    2015-01-01

    The observation that approximately 15% of women with disseminated breast cancer will develop symptomatic brain metastases combined with treatment guidelines discouraging single-agent chemotherapeutic strategies facilitates the desire for novel strategies aimed at outright brain metastasis prevention. Effective and robust preclinical methods to evaluate early-stage metastatic processes, brain metastases burden, and overall mean survival are lacking. Here, we develop a novel method to quantitate early metastatic events (arresting and extravasation) in addition to traditional end time-point parameters such as tumor burden and survival in an experimental mouse model of brain metastases of breast cancer. Using this method, a reduced number of viable brain-seeking metastatic cells (from 3,331 ± 263 cells/brain to 1,079 ± 495 cells/brain) were arrested in brain one week postinjection after TGFβ knockdown. Treatment with a TGFβ receptor inhibitor, galunisertib, reduced the number of arrested cells in brain to 808 ± 82 cells/brain. Furthermore, we observed a reduction in the percentage of extravasated cells (from 63% to 30%) compared with cells remaining intralumenal when TGFβ is knocked down or inhibited with galunisertib (40%). The observed reduction of extravasated metastatic cells in brain translated to smaller and fewer brain metastases and resulted in prolonged mean survival (from 36 days to 62 days). This method opens up potentially new avenues of metastases prevention research by providing critical data important to early brain metastasis of breast cancer events. PMID:25348853

  9. A novel preclinical method to quantitatively evaluate early-stage metastatic events at the murine blood-brain barrier

    PubMed Central

    Adkins, Chris E; Nounou, Mohamed I; Mittapalli, Rajendar K; Terrell-Hall, Tori B; Mohammad, Afroz S; Jagannathan, Rajaganapathi; Lockman, Paul R

    2014-01-01

    The observation that approximately 15% of women with disseminated breast cancer will develop symptomatic brain metastases combined with treatment guidelines discouraging single-agent chemotherapeutic strategies facilitates the desire for novel strategies aimed at outright brain metastasis prevention. Effective and robust preclinical methods to evaluate early stage metastatic processes, brain metastases burden, and overall mean survival are lacking. Here, we develop a novel method to quantitate early metastatic events (arresting and extravasation) in addition to traditional end time-point parameters such as tumor burden and survival in an experimental mouse model of brain metastases of breast cancer. Using this method, a reduced number of viable brain seeking metastatic cells (from 3331 ± 263 cells/brain to 1079 ± 495 cells/brain) arrested in brain one week post injection after TGFβ knockdown. Treatment with a TGFβ receptor inhibitor, galunisertib, reduced the number of arrested cells in brain to 808 ± 82 cells/brain. Further, we observed a reduction in the percent of extravasated cells (from 63% to 30%) compared to cells remaining intralumenal when TGFβ is knocked down or inhibited with galunisertib (40%). The observed reduction of extravasated metastatic cells in brain translated to smaller and fewer brain metastases and resulted in prolonged mean survival (from 36 days to 62 days). This method opens up potentially new avenues of metastases prevention research by providing critical data important to early brain metastasis of breast cancer events. PMID:25348853

  10. The Rapid and Sensitive Quantitative Determination of Galactose by Combined Enzymatic and Colorimetric Method: Application in Neonatal Screening.

    PubMed

    Kianmehr, Anvarsadat; Mahrooz, Abdolkarim; Ansari, Javad; Oladnabi, Morteza; Shahbazmohammadi, Hamid

    2016-05-01

    The quantitative measurement of galactose in blood is essential for the early diagnosis, treatment, and dietary monitoring of galactosemia patients. In this communication, we aimed to develop a rapid, sensitive, and cost-effective combined method for galactose determination in dried blood spots. The procedure was based on combining the enzymatic reactions of galactose dehydrogenase (GalDH), dihydrolipoyl dehydrogenase (DLD), and alkaline phosphatase with a colorimetric system. The incubation time and the concentrations of the enzymes used in the new method were also optimized. The analytical performance was studied through precision, recovery, linearity, and sensitivity parameters. Statistical analysis was applied to the method comparison experiment. The regression equation and correlation coefficient (R²) were Y = 0.0085x + 0.032 and R² = 0.998, respectively. The assay exhibited a recovery in the range of 91.7-114.3% and had a detection limit of 0.5 mg/dl for galactose. The between-run coefficient of variation (CV) was between 2.6 and 11.1%, and the within-run CV was between 4.9 and 9.2%. Our results indicated that the new and reference methods were in agreement, because no significant biases existed between them. In brief, a quick and reliable combined enzymatic and colorimetric assay is presented for application in newborn mass screening and monitoring of galactosemia patients. PMID:26821257
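
    A standard curve such as the reported Y = 0.0085x + 0.032 is inverted to read concentrations off measured signals, and precision is reported as a coefficient of variation; a minimal sketch (the signal value and replicate readings below are illustrative, not from the study):

```python
def galactose_mg_dl(signal, slope=0.0085, intercept=0.032):
    """Invert the reported standard curve Y = 0.0085x + 0.032 to estimate
    galactose concentration x (mg/dl) from a measured signal Y."""
    return (signal - intercept) / slope

def percent_cv(values):
    """Coefficient of variation (%), the precision measure used in the assay."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return 100.0 * var ** 0.5 / mean

conc = galactose_mg_dl(0.117)          # illustrative signal -> ~10 mg/dl
cv = percent_cv([9.8, 10.1, 10.0, 10.3])  # illustrative replicate readings
```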

  11. Basic investigation on acoustic velocity change imaging method for quantitative assessment of fat content in human liver

    NASA Astrophysics Data System (ADS)

    Mano, Kazune; Tanigawa, Shohei; Hori, Makoto; Yokota, Daiki; Wada, Kenji; Matsunaka, Toshiyuki; Morikawa, Hiroyasu; Horinaka, Hiromichi

    2016-07-01

    Fatty liver is a disease caused by the excess accumulation of fat in the human liver. The early diagnosis of fatty liver is very important, because fatty liver is a major marker of metabolic syndrome. We previously proposed the ultrasonic velocity change imaging method to diagnose fatty liver, using the fact that the temperature dependence of ultrasonic velocity differs between water and fat. To diagnose the stage of fatty liver, we attempted a feasibility study of the quantitative assessment of the fat content in the human liver using our ultrasonic velocity change imaging method. Experimental results showed that the fat content in a tissue-mimicking phantom containing lard was determined from its ultrasonic velocity change in the flat temperature region formed by a circular warming ultrasonic transducer with an acoustic lens of appropriate focal length. By considering the results of our simulation using a thermal diffusion equation, we determined whether this method could be applied to fatty liver assessment under the condition that the tissue exhibits the thermal relaxation effect caused by blood flow.
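
    One simple way to turn a measured velocity-temperature slope into a fat fraction is a linear two-component mixing model, exploiting the opposite signs of the coefficients for water-like and fat-like tissue; the endmember coefficients below are placeholders, not values from the study:

```python
def fat_fraction(dv_dT_measured, dv_dT_water=2.0, dv_dT_fat=-4.0):
    """Estimate fat volume fraction from a measured temperature coefficient
    of sound velocity (m/s per K), assuming a linear mixture of a water-like
    component (positive coefficient) and a fat-like component (negative
    coefficient). Both endmember coefficients here are illustrative.
    """
    return (dv_dT_water - dv_dT_measured) / (dv_dT_water - dv_dT_fat)

# A measured coefficient of +0.8 m/s/K implies a 20% fat fraction
# under these assumed endmember values.
f = fat_fraction(0.8)
```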

  12. Quantitative Pleistocene calcareous nannofossil biostratigraphy: preliminary results from the IODP Site U1385 (Exp 339), the Shackleton Site

    NASA Astrophysics Data System (ADS)

    Balestra, B.; Flores, J. A.; Acton, G.; Alvarez Zarikian, C. A.; Grunert, P.; Hernandez-Molina, F. J.; Hodell, D. A.; Li, B.; Richter, C.; Sanchez Goni, M.; Sierro, F. J.; Singh, A.; Stow, D. A.; Voelker, A.; Xuan, C.

    2013-12-01

    In order to explore the effects of Mediterranean Outflow Water (MOW) on North Atlantic circulation and climate, Integrated Ocean Drilling Program (IODP) Expedition 339 (Mediterranean Outflow) cored a series of sites in the Gulf of Cadiz slope and off West Iberia (North East Atlantic). Site U1385 (37°48′N, 10°10′W, 3146 m water depth) was selected and drilled in the lower slope of the Portuguese margin, at a location close to the so-called Shackleton Site MD95-2042 (in honor of the late Sir Nicholas Shackleton), to provide a marine reference section of Pleistocene millennial-scale climate variability. Three holes were cored at Site U1385 using the Advanced Piston Corer (APC) to a depth of ~151 meters below seafloor in order to recover a continuous stratigraphic record covering the past 1.4 Ma. Here we present preliminary results of the succession of standard and alternative calcareous nannofossil events. Our quantitative study based on calcareous nannofossils shows well-preserved and abundant assemblages throughout the core. Most conventional Pleistocene events were recognized. Moreover, our quantitative investigations provide further data on the stratigraphic distribution of some species and groups, such as the large Emiliania huxleyi (>4 μm), the small Gephyrocapsa group, and Reticulofenestra cisnerosii. A preliminary calibration of the calcareous nannofossil events with the paleomagnetic and astronomical signal, estimated by comparison with geophysical and logging parameters, is also presented. *IODP Expedition 339 Scientists: Bahr, A., Ducassou. E., Flood, R., Furota, S., Jimenez-Espejo, F., Kim, J. K., Krissek, L., Kuroda, J., Llave, E., Lofi, J., Lourens, L., Miller, M., Nanayama, F., Nishida, N., Roque, C., Sloss, C., Takashimizu, Y., Tzanova, A., Williams, T.

  13. A quantitative method to analyse an open-ended questionnaire: A case study about the Boltzmann Factor

    NASA Astrophysics Data System (ADS)

    Rosario Battaglia, Onofrio; Di Paola, Benedetto

    2016-05-01

    This paper describes a quantitative method to analyse an open-ended questionnaire. Student responses to a specially designed written questionnaire are quantitatively analysed by non-hierarchical clustering, namely the k-means method. Through this we can characterise the students' behaviour with respect to their expertise in formulating explanations for phenomena or processes and/or in using a given model in different contexts. The physics topic is the Boltzmann factor, which allows the students to have a unifying view of different phenomena in different contexts.
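
    A minimal non-hierarchical k-means clustering of coded questionnaire responses can be sketched as follows; the two-dimensional features and the deterministic farthest-point initialisation are illustrative choices, not the authors' exact protocol:

```python
import numpy as np

def kmeans(X, k=2, n_iter=20):
    """Plain (non-hierarchical) k-means with a deterministic farthest-point
    initialisation; returns (centroids, labels)."""
    centroids = [X[0]]
    for _ in range(k - 1):
        # Seed each further centroid at the point farthest from existing ones
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[d.argmax()])
    centroids = np.array(centroids, dtype=float)
    for _ in range(n_iter):
        # Assign each response vector to its nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned vectors
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

# Hypothetical coded features for 6 student answers, e.g. counts of
# "everyday" vs "model-based" explanatory terms in each response.
X = np.array([[0.0, 3.0], [0.2, 2.8], [0.1, 3.1],
              [3.0, 0.1], [2.9, 0.0], [3.1, 0.2]])
centroids, labels = kmeans(X, k=2)
```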

  14. Quantitative evaluation of automatic methods for lesions detection in breast ultrasound images

    NASA Astrophysics Data System (ADS)

    Marcomini, Karem D.; Schiabel, Homero; Carneiro, Antonio Adilton O.

    2013-02-01

    Ultrasound (US) is a useful diagnostic tool to distinguish benign from malignant breast masses, providing more detailed evaluation in dense breasts. Due to the subjectivity of image interpretation, computer-aided diagnosis (CAD) schemes have been developed, extending the mammography analysis process to include ultrasound images as complementary exams. As one of the most important tasks in the evaluation of this kind of image is mass detection and the interpretation of mass contours, automated segmentation techniques have been investigated in order to determine a suitable procedure for such an analysis. Thus, the main goal of this work is to investigate the effect of processing techniques used to identify suspicious breast lesions, as well as their accurate boundaries, in ultrasound images. In tests, 80 phantom and 50 clinical ultrasound images were preprocessed, and 5 segmentation techniques were tested. Using quantitative evaluation metrics, the results were compared to a reference image delineated by an experienced radiologist. A self-organizing map artificial neural network provided the most relevant results, demonstrating high accuracy and a low error rate in representing the lesions, and hence corresponds to the segmentation process adopted for US images in our CAD scheme under test.
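
    Quantitative evaluation of a segmentation against a radiologist's reference delineation is commonly done with overlap metrics such as the Dice coefficient and Jaccard index; a minimal sketch on toy pixel masks (the masks are illustrative):

```python
def dice_coefficient(seg, ref):
    """Dice similarity between a candidate segmentation and a reference
    delineation, each given as a set of (row, col) pixel coordinates."""
    inter = len(seg & ref)
    return 2.0 * inter / (len(seg) + len(ref))

def jaccard_index(seg, ref):
    """Intersection-over-union of two pixel sets."""
    inter = len(seg & ref)
    return inter / len(seg | ref)

# Tiny illustrative masks: 4-pixel reference, 4-pixel candidate, 3 shared.
ref = {(0, 0), (0, 1), (1, 0), (1, 1)}
seg = {(0, 1), (1, 0), (1, 1), (2, 1)}
d = dice_coefficient(seg, ref)
j = jaccard_index(seg, ref)
```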

  15. Quantitative Analysis in the General Chemistry Laboratory: Training Students to Analyze Individual Results in the Context of Collective Data

    ERIC Educational Resources Information Center

    Ling, Chris D.; Bridgeman, Adam J.

    2011-01-01

    Titration experiments are ideal for generating large data sets for use in quantitative-analysis activities that are meaningful and transparent to general chemistry students. We report the successful implementation of a sophisticated quantitative exercise in which the students identify a series of unknown acids by determining their molar masses…
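
    The arithmetic underlying such an identification is molar mass = sample mass / moles of base delivered at the endpoint; a minimal sketch (all quantities below are illustrative, not from the exercise):

```python
def molar_mass_g_per_mol(acid_mass_g, base_conc_mol_l, base_vol_ml, protons=1):
    """Molar mass of an unknown acid from a titration endpoint:
    moles of acid = (base concentration x base volume) / acidic protons."""
    mol_acid = base_conc_mol_l * base_vol_ml / 1000.0 / protons
    return acid_mass_g / mol_acid

# Example: 0.500 g of a monoprotic acid neutralised by 41.6 ml of
# 0.100 M NaOH gives roughly 120 g/mol.
m = molar_mass_g_per_mol(0.500, 0.100, 41.6)
```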

  16. Errors in quantitative T1rho imaging and the correction methods.

    PubMed

    Chen, Weitian

    2015-08-01

    The spin-lattice relaxation time constant in the rotating frame (T1rho) is useful for assessing the properties of the macromolecular environment inside tissue. Quantification of T1rho has shown promise in various clinical applications. However, T1rho imaging is prone to image artifacts and quantification errors, which remains one of the greatest challenges to adopting this technique in routine clinical practice. The conventional continuous-wave spin-lock is susceptible to B1 radiofrequency (RF) and B0 field inhomogeneity, which appears as banding artifacts in acquired images. A number of methods have been reported to modify the T1rho prep RF pulse cluster to mitigate this effect. Adiabatic RF pulses can also be used for spin-lock with insensitivity to both B1 RF and B0 field inhomogeneity. Another source of quantification error in T1rho imaging is signal evolution during imaging data acquisition. Care is needed to affirm that such error does not take place when a specific pulse sequence is used for imaging data acquisition. Another source of T1rho quantification error is insufficient signal-to-noise ratio (SNR), which is common among various quantitative imaging approaches. Measurement of T1rho within an ROI can mitigate this issue, but at the cost of reduced resolution. Noise-corrected methods have been reported to address this issue in pixel-wise quantification. For certain tissue types, T1rho quantification can be confounded by the magic angle effect and the presence of multiple tissue components. A review of these confounding factors arising from inherent tissue properties is not included in this article. PMID:26435922
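
    T1rho itself is usually estimated by fitting the mono-exponential spin-lock decay S(TSL) = S0 · exp(-TSL / T1rho) across several spin-lock times; a minimal log-linear fitting sketch on noise-free synthetic data (the spin-lock times and T1rho value are illustrative):

```python
import math

def fit_t1rho(tsl_ms, signal):
    """Estimate T1rho (ms) by a log-linear least-squares fit of the
    mono-exponential spin-lock decay S(TSL) = S0 * exp(-TSL / T1rho)."""
    n = len(tsl_ms)
    y = [math.log(s) for s in signal]
    mean_t = sum(tsl_ms) / n
    mean_y = sum(y) / n
    slope = sum((t - mean_t) * (v - mean_y) for t, v in zip(tsl_ms, y)) \
        / sum((t - mean_t) ** 2 for t in tsl_ms)
    return -1.0 / slope

# Noise-free synthetic decay with T1rho = 50 ms
tsl = [0.0, 10.0, 20.0, 40.0, 60.0]
sig = [100.0 * math.exp(-t / 50.0) for t in tsl]
t1rho = fit_t1rho(tsl, sig)
```

    In the presence of noise, a nonlinear or noise-corrected fit is preferable, as the abstract's discussion of SNR-related errors suggests.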

  17. Errors in quantitative T1rho imaging and the correction methods

    PubMed Central

    2015-01-01

    The spin-lattice relaxation time constant in the rotating frame (T1rho) is useful for assessing the properties of the macromolecular environment inside tissue. Quantification of T1rho has shown promise in various clinical applications. However, T1rho imaging is prone to image artifacts and quantification errors, which remains one of the greatest challenges to adopting this technique in routine clinical practice. The conventional continuous-wave spin-lock is susceptible to B1 radiofrequency (RF) and B0 field inhomogeneity, which appears as banding artifacts in acquired images. A number of methods have been reported to modify the T1rho prep RF pulse cluster to mitigate this effect. Adiabatic RF pulses can also be used for spin-lock with insensitivity to both B1 RF and B0 field inhomogeneity. Another source of quantification error in T1rho imaging is signal evolution during imaging data acquisition. Care is needed to affirm that such error does not take place when a specific pulse sequence is used for imaging data acquisition. Another source of T1rho quantification error is insufficient signal-to-noise ratio (SNR), which is common among various quantitative imaging approaches. Measurement of T1rho within an ROI can mitigate this issue, but at the cost of reduced resolution. Noise-corrected methods have been reported to address this issue in pixel-wise quantification. For certain tissue types, T1rho quantification can be confounded by the magic angle effect and the presence of multiple tissue components. A review of these confounding factors arising from inherent tissue properties is not included in this article. PMID:26435922

  18. A simple and rapid method to identify and quantitatively analyze triterpenoid saponins in Ardisia crenata using ultrafast liquid chromatography coupled with electrospray ionization quadrupole mass spectrometry.

    PubMed

    Ma, Ling; Li, Wei; Wang, Hanqing; Kuang, Xinzhu; Li, Qin; Wang, Yinghua; Xie, Peng; Koike, Kazuo

    2015-01-01

    Ardisia plant species have been used in traditional medicines, and their bioactive 13,28-epoxy triterpenoid saponin constituents show excellent biological activities for new drug development. In this study, a fast and simple method based on ultrafast liquid chromatography coupled to electrospray ionization mass spectrometry (UFLC-MS) was developed to simultaneously identify and quantitatively analyze triterpenoid saponins in Ardisia crenata extracts. In total, 22 triterpenoid saponins, including two new compounds, were identified from A. crenata. The method exhibited good linearity, precision, and recovery for the quantitative analysis of eight marker saponins. A relative quantitative method was also developed using one major saponin (ardisiacrispin B) as the standard, to overcome the bottleneck posed by the lack of reference standards in phytochemical analysis. The method was successfully applied to quantitatively analyze saponins in commercially available plant samples. This study describes the first systematic analysis of 13,28-epoxy-oleanane-type triterpenoid saponins in the genus Ardisia using LC-ESI-MS. The results provide chemical support for further biological studies, phytochemotaxonomic studies, and quality control of triterpenoid saponins in medicinal plants of the genus Ardisia. PMID:25459939

  19. Analysis of Perfluorinated Chemicals in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A fast, rigorous method was developed to maximize the ex