Science.gov

Sample records for quantitative methods results

  1. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for

  3. A new method for quantitative ultrasound measurements at multiple skeletal sites: first results of precision and fracture discrimination.

    PubMed

    Barkmann, R; Kantorovich, E; Singal, C; Hans, D; Genant, H K; Heller, M; Glüer, C C

    2000-01-01

    We investigated a new multisite quantitative ultrasound device that measures the acoustic velocity in axial transmission mode along the cortex. Using a prototype of the Omnisense (Sunlight Ultrasound Technologies, Rehovot, Israel), we tested the performance of this instrument at four sites of the skeleton: radius, ulna, metacarpal, and phalanx. Intraobserver (interobserver) precision errors ranged from 0.2% to 0.3% (0.3% to 0.7%) for triplicate measurements with repositioning. Fracture discrimination was tested by comparing a group of 34 women who had previously suffered a fracture of the hip, spine, ankle, or forearm to a group of 28 healthy women who had not suffered a fracture. Age-adjusted standardized odds ratios ranged from 1.6 to 4.5. Except for the ulna, all sites showed significant fracture discrimination (p < 0.01). The areas under the receiver operating characteristic (ROC) curves ranged from 0.88 to 0.89 for the radius, metacarpal, and phalanx. A combination of the results from the three sites showed a significant increase of the ROC area to 0.95 (p < 0.05). Our results show promising performance of this new device. The ability to measure a large variety of sites and the potential to combine these measurements are promising with regard to optimizing fracture risk assessment. PMID:10745297
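
    The combined-sites discrimination above rests on ROC analysis. As a minimal sketch (the grouping logic only; all velocity values below are hypothetical, not from the study), the area under the ROC curve can be computed with the rank-based Mann-Whitney estimator:

```python
def roc_auc(cases, controls):
    """Rank-based (Mann-Whitney) estimate of the area under the ROC curve.

    `cases` and `controls` are lists of measurement values (e.g. speed-of-sound
    readings); the AUC is the probability that a randomly chosen case scores
    lower than a randomly chosen control (ties count half).
    """
    wins = 0.0
    for x in cases:
        for y in controls:
            if x < y:
                wins += 1.0
            elif x == y:
                wins += 0.5
    return wins / (len(cases) * len(controls))

# Illustrative (hypothetical) velocities, m/s: fracture cases tend lower.
cases = [3850, 3900, 3880, 3790]
controls = [4010, 3990, 4050, 3960]
print(roc_auc(cases, controls))  # 1.0 here: the two groups are fully separated
```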

  4. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  5. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
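
    A minimal sketch of the divide-by-distance step for a one-dimensional array (the readings below are hypothetical; the minus sign follows the usual E = -dV/dx convention, which the patent text leaves implicit):

```python
def field_from_voltages(voltages, positions):
    """Estimate the electric field component along a 1-D antenna array.

    `voltages` holds the potential measured at each antenna and `positions`
    the known antenna coordinates (m). For each adjacent pair, E ≈ -ΔV/Δd,
    giving a piecewise field estimate over the region.
    """
    field = []
    for i in range(len(voltages) - 1):
        dv = voltages[i + 1] - voltages[i]
        dd = positions[i + 1] - positions[i]
        field.append(-dv / dd)
    return field

# Hypothetical readings: a uniform 50 V/m field sampled at 0.5 m spacings.
positions = [0.0, 0.5, 1.0, 1.5]
voltages = [0.0, -25.0, -50.0, -75.0]
print(field_from_voltages(voltages, positions))  # [50.0, 50.0, 50.0]
```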

  6. Quantitative results from the focusing schlieren technique

    NASA Technical Reports Server (NTRS)

    Cook, S. P.; Chokani, Ndaona

    1993-01-01

    An iterative theoretical approach to obtain quantitative density data from the focusing schlieren technique is proposed. The approach is based on an approximate modeling of the focusing action in a focusing schlieren system, and an estimation of an appropriate focal plane thickness. The theoretical approach is incorporated in a computer program, and results obtained from a supersonic wind tunnel experiment are evaluated by comparison with CFD data. The density distributions compared favorably with CFD predictions. However, improvements to the system are required in order to reduce noise in the data, to improve the specification of the depth of focus, and to refine the modeling of the focusing action.

  7. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. PMID:26763302
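
    As one concrete example of the normalization step (total-sum normalization is among the simplest of the methods such reviews survey; the metabolite names and intensities below are hypothetical):

```python
def total_sum_normalize(sample):
    """Scale a sample's metabolite intensities so they sum to 1.

    Total-sum ("constant sum") normalization is one of the simplest ways to
    correct for differences in total sample amount before comparing
    individual metabolite levels across samples.
    """
    total = sum(sample.values())
    return {m: v / total for m, v in sample.items()}

# Two hypothetical samples differing only in total amount (2x dilution).
a = {"glucose": 100.0, "lactate": 50.0, "alanine": 50.0}
b = {"glucose": 50.0, "lactate": 25.0, "alanine": 25.0}
print(total_sum_normalize(a) == total_sum_normalize(b))  # True
```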

  8. Quantitative Methods in Psychology: Inevitable and Useless

    PubMed Central

    Toomela, Aaro

    2010-01-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: structural-systemic, which is based on Aristotelian thinking, and associative-quantitative, which is based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause–effect relationships between the events with no possible access to the understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments. PMID:21833199

  9. Quantitative statistical methods for image quality assessment.

    PubMed

    Dutta, Joyita; Ahn, Sangtae; Li, Quanzheng

    2013-01-01

    Quantitative measures of image quality and reliability are critical for both qualitative interpretation and quantitative analysis of medical images. While, in theory, it is possible to analyze reconstructed images by means of Monte Carlo simulations using a large number of noise realizations, the associated computational burden makes this approach impractical. Additionally, this approach is less meaningful in clinical scenarios, where multiple noise realizations are generally unavailable. The practical alternative is to compute closed-form analytical expressions for image quality measures. The objective of this paper is to review statistical analysis techniques that enable us to compute two key metrics: resolution (determined from the local impulse response) and covariance. The underlying methods include fixed-point approaches, which compute these metrics at a fixed point (the unique and stable solution) independent of the iterative algorithm employed, and iteration-based approaches, which yield results that are dependent on the algorithm, initialization, and number of iterations. We also explore extensions of some of these methods to a range of special contexts, including dynamic and motion-compensated image reconstruction. While most of the discussed techniques were developed for emission tomography, the general methods are extensible to other imaging modalities as well. In addition to enabling image characterization, these analysis techniques allow us to control and enhance imaging system performance. We review practical applications where performance improvement is achieved by applying these ideas to the contexts of both hardware (optimizing scanner design) and image reconstruction (designing regularization functions that produce uniform resolution or maximize task-specific figures of merit). PMID:24312148

  11. Results from the NIST 2004 DNA Quantitation Study.

    PubMed

    Kline, Margaret C; Duewer, David L; Redman, Janette W; Butler, John M

    2005-05-01

    For optimal DNA short tandem repeat (STR) typing results, the DNA concentration ([DNA]) of the sample must be accurately determined prior to the polymerase chain reaction (PCR) amplification step in the typing process. In early 2004, the National Institute of Standards and Technology (NIST) conducted an interlaboratory study to help assess the accuracy of DNA quantitation in forensic DNA laboratories. This study was designed with four primary purposes: (1) to examine concentration effects and to probe performance at the lower DNA concentration levels that are frequently seen in forensic casework; (2) to examine consistency with various methodologies across multiple laboratories; (3) to examine single versus multiple source samples; and (4) to study DNA stability over time and through shipping in two types of storage tubes. Eight DNA samples with [DNA] ranging from 0.05 ng/µL to 1.5 ng/µL were distributed. A total of 287 independent data sets were returned from 80 participants. Results were reported for 19 different DNA quantitation methodologies. Approximately 65% of the data were obtained using traditional slot blot hybridization methods; 21% were obtained using newly available quantitative real-time PCR (Q-PCR) techniques. Information from this interlaboratory study is guiding development of a future NIST Standard Reference Material for Human DNA Quantitation, SRM 2372. PMID:15932088

  12. A quantitative method to evaluate neutralizer toxicity against Acanthamoeba castellanii.

    PubMed Central

    Buck, S L; Rosenthal, R A

    1996-01-01

    A standard methodology for quantitatively evaluating neutralizer toxicity against Acanthamoeba castellanii does not exist. The objective of this study was to provide a quantitative method for evaluating neutralizer toxicity against A. castellanii. Two methods were evaluated. A quantitative microtiter method for enumerating A. castellanii was evaluated by a 50% lethal dose endpoint method. The microtiter method was compared with the hemacytometer count method. A method for determining the toxicity of neutralizers for antimicrobial agents to A. castellanii was also evaluated. The toxicity to A. castellanii of Dey-Engley neutralizing broth was compared with that of Page's saline. The microtiter viable cell counts were lower than predicted by the hemacytometer counts. However, the microtiter method gives more reliable counts of viable cells. Dey-Engley neutralizing medium was not toxic to A. castellanii. The method presented gives consistent, reliable results and is simple compared with previous methods. PMID:8795247

  13. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  14. Quantitative Method of Measuring Metastatic Activity

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  15. Quantitative laser-induced breakdown spectroscopy data using peak area step-wise regression analysis: an alternative method for interpretation of Mars science laboratory results

    SciTech Connect

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Dyar, Melinda D; Schafer, Martha W; Tucker, Jonathan M

    2008-01-01

    The ChemCam instrument on the Mars Science Laboratory (MSL) will include a laser-induced breakdown spectrometer (LIBS) to quantify major and minor elemental compositions. The traditional analytical chemistry approach to calibration curves for these data regresses a single diagnostic peak area against concentration for each element. This approach contrasts with a new multivariate method in which elemental concentrations are predicted by step-wise multiple regression analysis based on areas of a specific set of diagnostic peaks for each element. The method is tested on LIBS data from igneous and metamorphosed rocks. Between 4 and 13 partial regression coefficients are needed to describe each elemental abundance accurately (i.e., with a regression line of R² > 0.9995 for the relationship between predicted and measured elemental concentration) for all major and minor elements studied. Validation plots suggest that the method is limited at present by the small data set, and will work best for prediction of concentration when a wide variety of compositions and rock types has been analyzed.
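
    The core of the approach is multiple regression of concentration on several peak areas. A self-contained sketch (ordinary least squares via the normal equations, with the stepwise variable selection omitted and hypothetical calibration data):

```python
def solve(A, b):
    """Solve the square linear system A x = b by Gaussian elimination."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))  # partial pivoting
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_least_squares(X, y):
    """Ordinary least squares: coefficients b minimizing ||X b - y||^2,
    via the normal equations (X^T X) b = X^T y."""
    n = len(X[0])
    XtX = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(n)]
           for i in range(n)]
    Xty = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(n)]
    return solve(XtX, Xty)

# Hypothetical calibration: concentration = 2*area1 + 0.5*area2, zero intercept.
X = [[1.0, a1, a2] for a1, a2 in [(1, 2), (2, 1), (3, 4), (4, 3), (5, 5)]]
y = [2 * row[1] + 0.5 * row[2] for row in X]
b = fit_least_squares(X, y)
print([round(v, 6) for v in b])  # recovers ≈ [0, 2.0, 0.5]
```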

  16. Protein staining methods in quantitative cytochemistry.

    PubMed

    Tas, J; van der Ploeg, M; Mitchell, J P; Cohn, N S

    1980-08-01

    The chemical action and practical application of the Naphthol Yellow S, Alkaline Fast Green, Coomassie Brilliant Blue, Dinitrofluorobenzene and some lesser known protein staining methods have been surveyed with respect to their potential for quantitative cytochemical analysis. None of the dyes can be said to bind to any specific protein or group of proteins, but each may be used to analyse the presence of one or more particular amino acid residues. For the cytophotometric measurement of the 'total protein content' of individual cells and cell organelles the covalent binding Dinitrofluorobenzene and the electrostatic binding Naphthol Yellow S can properly be used. Fast Green FCF, applied at alkaline pH, binds electrostatically to the basic amino acid side chains of strongly basic proteins only, but not in a quantitative (stoichiometric) way. Coomassie Brilliant Blue, recently introduced to protein cytochemistry, may be useful for quantitative purposes. The combined Feulgen-Pararosaniline(SO2)/Naphthol Yellow S and Dinitrofluorobenzene/Feulgen-Pararosaniline(SO2) methods enable the simultaneous cytophotometric analysis at two different wavelengths for protein and DNA within the same microscopical preparation. PMID:6157816

  17. A quantitation method for mass spectrometry imaging.

    PubMed

    Koeniger, Stormy L; Talaty, Nari; Luo, Yanping; Ready, Damien; Voorbach, Martin; Seifert, Terese; Cepa, Steve; Fagerland, Jane A; Bouska, Jennifer; Buck, Wayne; Johnson, Robert W; Spanton, Stephen

    2011-02-28

    A new quantitation method for mass spectrometry imaging (MSI) with matrix-assisted laser desorption/ionization (MALDI) has been developed. In this method, drug concentrations were determined by tissue homogenization of five 10 µm tissue sections adjacent to those analyzed by MSI. Drug levels in tissue extracts were measured by liquid chromatography coupled to tandem mass spectrometry (LC/MS/MS). The integrated MSI response was correlated to the LC/MS/MS drug concentrations to determine the amount of drug detected per MSI ion count. The study reported here evaluates olanzapine in liver tissue. Tissue samples containing a range of concentrations were created from liver harvested from rats administered a single dose of olanzapine at 0, 1, 4, 8, 16, 30, or 100 mg/kg. The liver samples were then analyzed by MALDI-MSI and LC/MS/MS. The MALDI-MSI and LC/MS/MS correlation was determined for tissue concentrations of ~300 to 60,000 ng/g and yielded a linear relationship over two orders of magnitude (R² = 0.9792). From this correlation, a conversion factor of 6.3 ± 0.23 fg/ion count was used to quantitate MSI responses at the pixel level (100 µm). The details of the method, its importance in pharmaceutical analysis, and the considerations necessary when implementing it are presented. PMID:21259359
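
    The MSI-to-LC/MS/MS correlation reduces to a slope through the origin, which then converts per-pixel ion counts to drug amounts. A sketch with hypothetical calibration points constructed to reproduce the paper's 6.3 fg/ion-count factor:

```python
def conversion_factor(ion_counts, concentrations):
    """Least-squares slope through the origin relating each sample's integrated
    MSI ion counts to its LC/MS/MS-measured drug amount (fg per ion count)."""
    num = sum(c * n for c, n in zip(concentrations, ion_counts))
    den = sum(n * n for n in ion_counts)
    return num / den

# Hypothetical calibration data: integrated counts vs measured drug (fg).
counts = [1.0e5, 4.0e5, 1.6e6, 3.2e6]
drug_fg = [6.3e5, 2.52e6, 1.008e7, 2.016e7]
f = conversion_factor(counts, drug_fg)
print(round(f, 2))       # 6.3 fg per ion count
pixel_amount = f * 250.0  # drug in one 100 µm pixel holding 250 ion counts
```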

  18. An unconventional method of quantitative microstructural analysis

    SciTech Connect

    Rastani, M.

    1995-06-01

    The experiment described here introduces a simple methodology which could be used to replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of thermal treatment effect on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples such as copper wire which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures which were expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test. The normalized elastic recovery factor was defined in terms of these deflections. Experimentally it has been shown that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor can be used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament rather than just one section through it, as in the metallographic method, it measures the degree of recrystallization more accurately. The method also requires considerably less time and expense than the conventional method.

  19. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  20. Advances in quantitative electroencephalogram analysis methods.

    PubMed

    Thakor, Nitish V; Tong, Shanbao

    2004-01-01

    Quantitative electroencephalogram (qEEG) plays a significant role in EEG-based clinical diagnosis and studies of brain function. In past decades, various qEEG methods have been extensively studied. This article provides a detailed review of the advances in this field. qEEG methods are generally classified into linear and nonlinear approaches. The traditional qEEG approach is based on spectrum analysis, which hypothesizes that the EEG is a stationary process. In practice, however, EEG signals are nonstationary and nonlinear, especially in some pathological conditions. Various time-frequency representations and time-dependent measures have been proposed to address those transient and irregular events in EEG. With regard to the nonlinearity of EEG, higher order statistics and chaotic measures have been put forward. In characterizing the interactions across the cerebral cortex, an information theory-based measure such as mutual information is applied. To improve the spatial resolution, qEEG analysis has also been combined with medical imaging technology (e.g., CT, MR, and PET). With these advances, qEEG plays a very important role in basic research and clinical studies of brain injury, neurological disorders, epilepsy, sleep studies and consciousness, and brain function. PMID:15255777
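
    To illustrate the traditional spectral branch of qEEG, a self-contained sketch of relative band power via a naive DFT (the sampling rate and test signal are hypothetical; real pipelines use FFTs, windowing, and averaging):

```python
import math

def band_power_fraction(signal, fs, f_lo, f_hi):
    """Fraction of total spectral power falling in [f_lo, f_hi] Hz,
    using a naive one-sided DFT (O(N^2), fine for short epochs)."""
    n = len(signal)
    total = 0.0
    band = 0.0
    for k in range(1, n // 2):  # skip DC; one-sided spectrum
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        p = re * re + im * im
        total += p
        if f_lo <= k * fs / n <= f_hi:
            band += p
    return band / total

# One second of a hypothetical 10 Hz (alpha-band) rhythm sampled at 128 Hz.
fs = 128
eeg = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
print(band_power_fraction(eeg, fs, 8.0, 13.0) > 0.99)  # True: power is alpha
```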

  1. Quantitative rotating frame relaxometry methods in MRI.

    PubMed

    Gilani, Irtiza Ali; Sepponen, Raimo

    2016-06-01

    Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. It has been shown in several studies that the rotating frame longitudinal relaxation rate constant (R1ρ) and the rotating frame transverse relaxation rate constant (R2ρ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of the tissue (i.e. R1ρ and R2ρ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during the process of spin-locking in rotating frame relaxometry are the major barriers to the establishment of these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurements of R1ρ and R2ρ. The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27100142

  2. Sparse methods for Quantitative Susceptibility Mapping

    NASA Astrophysics Data System (ADS)

    Bilgic, Berkin; Chatnuntawech, Itthi; Langkammer, Christian; Setsompop, Kawin

    2015-09-01

    Quantitative Susceptibility Mapping (QSM) aims to estimate the tissue susceptibility distribution that gives rise to subtle changes in the main magnetic field, which are captured by the image phase in a gradient echo (GRE) experiment. The underlying susceptibility distribution is related to the acquired tissue phase through an ill-posed linear system. To facilitate its inversion, spatial regularization that imposes sparsity or smoothness assumptions can be employed. This paper focuses on efficient algorithms for regularized QSM reconstruction. Fast solvers that enforce sparsity under Total Variation (TV) and Total Generalized Variation (TGV) constraints are developed using the Alternating Direction Method of Multipliers (ADMM). Through variable splitting that permits closed-form iterations, the computational efficiency of these solvers is dramatically improved. An alternative approach to improve the conditioning of the ill-posed inversion is to acquire multiple GRE volumes at different head orientations relative to the main magnetic field. The phase information from such multi-orientation acquisition can be combined to yield exquisite susceptibility maps and obviate the need for regularized reconstruction, albeit at the cost of increased data acquisition time.

  3. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side-effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
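
    A one-level sketch of such a weighted-average evaluation (the paper's 'hierarchical' variant nests averages of averages; the criteria, weights, and scores below are hypothetical):

```python
def weighted_average(scores, weights):
    """Weighted-average score for one alternative; weights need not sum to 1."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# Hypothetical criteria: cost, performance, reliability (weights 0.5/0.3/0.2),
# with each alternative scored 0-10 against each criterion.
weights = [0.5, 0.3, 0.2]
alternatives = {
    "design A": [8.0, 6.0, 9.0],
    "design B": [5.0, 9.0, 7.0],
}
ranked = sorted(alternatives,
                key=lambda a: weighted_average(alternatives[a], weights),
                reverse=True)
print(ranked[0])  # design A (score 7.6 vs 6.6)
```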

  4. Qualitative versus Quantitative Results: An Experimental Introduction to Data Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Eric R.; Alter, Paula

    1989-01-01

    Described is an experiment in which the student can ascertain the meaning of a negative result from a qualitative test by performing a more sensitive quantitative test on the same sample. Methodology for testing urinary glucose with a spectrophotometer at 630 nm and with commercial assaying glucose strips is presented. (MVL)

  5. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C.; Kuske, Cheryl R.; Mullen, Kenneth I.

    2002-01-01

    A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.

  6. Quantitative MR imaging in fracture dating-Initial results.

    PubMed

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

    For exact age determination of bone fractures in a forensic context (e.g. in cases of child abuse), improved knowledge of the time course of the healing process and the use of non-invasive modern imaging technology are of high importance. To date, fracture dating is based on radiographic methods that assess the callus status and thereby rely on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined for this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11, ♂: 6; median age 34±15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or the collar bone. Both qualitative and quantitative MR measurements were performed in all subjects. MR sequences for quantitative measurement of the relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of the quantitative MR parameters T1, T2 and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area using defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7, ♂: 5) showed an initial peak in T1 values in the fractured area (T1=1895±607ms), which decreased over time to a value of 1094±182ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2=115±80ms) and decreased to 73±33ms within 21 days after the fracture event.
After that time point, no significant changes could be detected for T2. MTR remained constant at 35.5±8.0% over time. The study shows that the quantitative assessment of T1 and T2 behaviour over time in the fractured region enables the generation of a novel model allowing for an objective age determination of a fracture. PMID:26890805
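
The reported decrease in T1 (from roughly 1895 ms shortly after the fracture to roughly 1094 ms at 200 days) lends itself to a simple decay-model sketch of the proposed age estimation. The mono-exponential form and the decay constant below are illustrative assumptions, not the study's actual model:

```python
import math

# Illustrative mono-exponential model of T1 relaxation time after fracture.
# T1_0 (initial peak) and T1_INF (late plateau) are taken from the abstract's
# group means; TAU is a hypothetical decay constant chosen for illustration.
T1_0 = 1895.0    # ms, mean T1 shortly after the fracture event
T1_INF = 1094.0  # ms, mean T1 ~200 days post-fracture
TAU = 60.0       # days, assumed decay constant (not from the study)

def t1_at(days):
    """Model T1 (ms) as a function of days since fracture."""
    return T1_INF + (T1_0 - T1_INF) * math.exp(-days / TAU)

def estimate_age(t1_measured):
    """Invert the model to estimate fracture age (days) from a measured T1."""
    if not T1_INF < t1_measured <= T1_0:
        raise ValueError("T1 outside the model's range")
    return -TAU * math.log((t1_measured - T1_INF) / (T1_0 - T1_INF))

# Round-trip check: a fracture 30 days old is recovered from its modelled T1.
t1 = t1_at(30.0)
print(round(estimate_age(t1), 1))  # → 30.0
```

In practice the model would be fitted to the longitudinal data, and the wide inter-subject standard deviations reported above would translate into broad confidence intervals on the estimated age.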

  7. Informatics Methods to Enable Sharing of Quantitative Imaging Research Data

    PubMed Central

    Levy, Mia A.; Freymann, John B.; Kirby, Justin S.; Fedorov, Andriy; Fennessy, Fiona M.; Eschrich, Steven A.; Berglund, Anders E.; Fenstermacher, David A.; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L.; Brown, Bartley J.; Braun, Terry A.; Dekker, Andre; Roelofs, Erik; Mountz, James M.; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-01-01

    Introduction The National Cancer Institute (NCI) Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable data sharing and to promote reuse of quantitative imaging data in the community. Methods We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. Results There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architectural gaps we are addressing to enable members to share research images and image meta-data across the network. Conclusions As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. PMID:22770688

  8. Blending Qualitative & Quantitative Research Methods in Theses and Dissertations.

    ERIC Educational Resources Information Center

    Thomas, R. Murray

    This guide discusses combining qualitative and quantitative research methods in theses and dissertations. It covers a wide array of methods, the strengths and limitations of each, and how they can be effectively interwoven into various research designs. The first chapter is "The Qualitative and the Quantitative." Part 1, "A Catalogue of…

  9. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    PubMed Central

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to quantitative image analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied to preprocess the original microtubule image, and the Intrinsic Mode Function 1 (IMF1) image obtained from the decomposition was selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. In order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm reduced noise while preserving edges, and that the geometrical character of the texture was clearly revealed. Four texture parameters extracted by GLCM reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies. PMID:24744684
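
As a sketch of the texture-analysis stage described above (the BEMD preprocessing is omitted), a GLCM and a few standard texture parameters can be computed directly. The 4-level toy image and the single pixel offset are illustrative assumptions:

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=4):
    """Grey-level co-occurrence matrix for one pixel offset (dx, dy)."""
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()  # normalise to joint probabilities

def texture_features(P):
    """Contrast, energy, homogeneity and correlation from a normalised GLCM."""
    i, j = np.indices(P.shape)
    contrast = np.sum(P * (i - j) ** 2)
    energy = np.sum(P ** 2)
    homogeneity = np.sum(P / (1.0 + np.abs(i - j)))
    mu_i, mu_j = np.sum(i * P), np.sum(j * P)
    si = np.sqrt(np.sum(P * (i - mu_i) ** 2))
    sj = np.sqrt(np.sum(P * (j - mu_j) ** 2))
    correlation = np.sum(P * (i - mu_i) * (j - mu_j)) / (si * sj)
    return contrast, energy, homogeneity, correlation

# Toy 4-level image standing in for a quantised IMF1 microtubule image.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
c, e, hom, corr = texture_features(glcm(img))
print(round(c, 3))  # → 0.333
```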

  10. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    PubMed

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) helps determine the type attributes of an object because it reveals the content of its constituents. QPA by the Rietveld method requires neither the measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in the mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with a Material Analysis Using Diffraction program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case that was successfully solved with the help of the Rietveld QPA method is introduced. This method allows forensic investigators to acquire detailed information about material evidence, which can point the way for case detection and court proceedings. PMID:25782471

  11. A quantitative method for determining the robustness of complex networks

    NASA Astrophysics Data System (ADS)

    Qin, Jun; Wu, Hongrun; Tong, Xiaonian; Zheng, Bojin

    2013-06-01

    Most current studies estimate the invulnerability of complex networks using a qualitative method that analyzes the decay rate of network performance. This method leads to confusion over the invulnerability of various types of complex networks. By normalizing network performance and defining a baseline, this paper defines the invulnerability index as the integral of the normalized network performance curve minus the baseline. This quantitative method measures network invulnerability under both edge and node attacks and provides a definition for distinguishing between the robustness and fragility of networks. To demonstrate the proposed method, three small-world networks were selected as test beds. The simulation results indicate that the proposed invulnerability index can effectively and accurately quantify network resilience and can handle both node and edge attacks. The index can provide a valuable reference for determining network invulnerability in future research.
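
The index described above can be sketched numerically: normalize the performance curve, subtract a baseline, and integrate over the attacked fraction. The synthetic decay curves and the zero baseline are assumptions for illustration:

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal-rule integral of samples y over grid x."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def invulnerability_index(fraction_removed, performance, baseline=0.0):
    perf = np.asarray(performance, dtype=float)
    perf = perf / perf[0]                  # normalise to initial performance
    return trapezoid(perf - baseline, fraction_removed)

f = np.linspace(0.0, 1.0, 11)             # fraction of nodes/edges removed
robust = 100.0 * (1 - f)                  # performance degrades gracefully
fragile = 100.0 * (1 - f) ** 4            # performance collapses early

print(round(invulnerability_index(f, robust), 3))                            # → 0.5
print(invulnerability_index(f, robust) > invulnerability_index(f, fragile))  # → True
```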

  12. A quantitative method for measuring the quality of history matches

    SciTech Connect

    Shaw, T.S.; Knapp, R.M.

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for such a purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to those of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. It also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
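
The paper's exact indices are not reproduced here, but their three roles can be illustrated with common statistics as stand-ins: correlation for shape conformity, mean error for bias, and RMSE for magnitude of deviation:

```python
import numpy as np

# Illustrative trio of match-quality indices in the spirit of the method
# above. The specific statistics are assumptions, not the paper's indices.

def match_quality(history, simulated):
    h, s = np.asarray(history, float), np.asarray(simulated, float)
    shape = np.corrcoef(h, s)[0, 1]          # 1.0 = identical shape
    bias = np.mean(s - h)                    # 0.0 = no systematic offset
    rmse = np.sqrt(np.mean((s - h) ** 2))    # 0.0 = perfect match
    return shape, bias, rmse

history = [100, 95, 88, 80, 71]              # e.g. observed reservoir pressure
good = [101, 94, 89, 79, 72]                 # small random deviations
biased = [105, 100, 93, 85, 76]              # right shape, systematic offset

s, b, r = match_quality(history, biased)
print(round(float(s), 3), round(float(b), 3), round(float(r), 3))  # → 1.0 5.0 5.0
```

Separating the three quantities is the point: the biased run has perfect shape conformity but a clear offset, which a single aggregate error number would hide.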

  13. A rapid chemiluminescent method for quantitation of human DNA.

    PubMed Central

    Walsh, P S; Varlaro, J; Reynolds, R

    1992-01-01

    A sensitive and simple method for the quantitation of human DNA is described. This method is based on probe hybridization to a human alpha satellite locus, D17Z1. The biotinylated probe is hybridized to sample DNA immobilized on nylon membrane. The subsequent binding of streptavidin-horseradish peroxidase to the bound probe allows for chemiluminescent detection using a luminol-based reagent and X-ray film. Less than 150 pg of human DNA can easily be detected with a 15 minute exposure. The entire procedure can be performed in 1.5 hours. Microgram quantities of nonhuman DNA have been tested and the results indicate very high specificity for human DNA. The data on film can be scanned into a computer and a commercially available program can be used to create a standard curve where DNA quantity is plotted against the mean density of each slot blot signal. The methods described can also be applied to the very sensitive determination of quantity and quality (size) of DNA on Southern blots. The high sensitivity of this quantitation method requires the consumption of only a fraction of sample for analysis. Determination of DNA quantity is necessary for RFLP and many PCR-based tests where optimal results are obtained only with a relatively narrow range of DNA quantities. The specificity of this quantitation method for human DNA will be useful for the analysis of samples that may also contain bacterial or other non-human DNA, for example forensic evidence samples, ancient DNA samples, or clinical samples. PMID:1408822
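
The standard-curve step described above can be sketched as a simple linear fit of DNA quantity against scanned signal density; all numbers below are invented for illustration:

```python
import numpy as np

# Hypothetical standard curve for slot-blot quantitation: known DNA amounts
# are plotted against mean signal density, a line is fitted, and an unknown
# sample's density is read back through the fit.

standards_ng = np.array([0.15, 0.5, 1.0, 2.0, 4.0])   # DNA in each standard
density = np.array([12.0, 40.0, 80.0, 160.0, 320.0])  # scanned band density

slope, intercept = np.polyfit(density, standards_ng, 1)

def quantify(sample_density):
    """Estimate DNA quantity (ng) from a slot-blot signal density."""
    return slope * sample_density + intercept

print(round(quantify(120.0), 2))  # → 1.5 (the toy data are exactly linear)
```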

  14. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    PubMed Central

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  15. Quantitative methods for ecological network analysis.

    PubMed

    Ulanowicz, Robert E

    2004-12-01

    The analysis of networks of ecological trophic transfers is a useful complement to simulation modeling in the quest for understanding whole-ecosystem dynamics. Trophic networks can be studied in quantitative and systematic fashion at several levels. Indirect relationships between any two individual taxa in an ecosystem, which often differ in either nature or magnitude from their direct influences, can be assayed using techniques from linear algebra. The same mathematics can also be employed to ascertain where along the trophic continuum any individual taxon is operating, or to map the web of connections into a virtual linear chain that summarizes trophodynamic performance by the system. Backtracking algorithms with pruning have been written which identify pathways for the recycle of materials and energy within the system. The pattern of such cycling often reveals modes of control or types of functions exhibited by various groups of taxa. The performance of the system as a whole at processing material and energy can be quantified using information theory. In particular, the complexity of process interactions can be parsed into separate terms that distinguish organized, efficient performance from the capacity for further development and recovery from disturbance. Finally, the sensitivities of the information-theoretic system indices appear to identify the dynamical bottlenecks in ecosystem functioning. PMID:15556474
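
The linear-algebra step mentioned above can be sketched with a toy flow matrix: powers of the direct-transfer matrix G give multi-step influences, and the series I + G + G² + … = (I − G)⁻¹ sums direct plus all indirect contributions. The 3-taxon web and the matrix orientation are illustrative assumptions:

```python
import numpy as np

# Hypothetical direct trophic transfers: entry [i, j] is the fraction of
# taxon j's intake supplied directly by taxon i.
G = np.array([[0.0, 0.5, 0.0],
              [0.0, 0.0, 0.4],
              [0.0, 0.0, 0.0]])

direct = G
# Everything beyond one step: (I - G)^-1 = I + G + G^2 + ..., minus I and G.
indirect = np.linalg.inv(np.eye(3) - G) - np.eye(3) - G

# Taxon 0 feeds taxon 2 only indirectly, via taxon 1 (0.5 * 0.4 = 0.2):
print(direct[0, 2], round(float(indirect[0, 2]), 3))  # → 0.0 0.2
```

This is exactly the kind of calculation that can reveal an indirect relationship differing in magnitude, or even sign, from the direct one.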

  16. Machine Learning methods for Quantitative Radiomic Biomarkers

    PubMed Central

    Parmar, Chintan; Grossmann, Patrick; Bussink, Johan; Lambin, Philippe; Aerts, Hugo J. W. L.

    2015-01-01

    Radiomics extracts and mines large numbers of medical imaging features quantifying tumor phenotypic characteristics. Highly accurate and reliable machine-learning approaches can drive the success of radiomic applications in clinical care. In this radiomic study, fourteen feature selection methods and twelve classification methods were examined in terms of their performance and stability for predicting overall survival. A total of 440 radiomic features were extracted from pre-treatment computed tomography (CT) images of 464 lung cancer patients. To ensure the unbiased evaluation of different machine-learning methods, publicly available implementations along with reported parameter configurations were used. Furthermore, we used two independent radiomic cohorts for training (n = 310 patients) and validation (n = 154 patients). We identified that the Wilcoxon test based feature selection method WLCX (stability = 0.84 ± 0.05, AUC = 0.65 ± 0.02) and the random forest classification method RF (RSD = 3.52%, AUC = 0.66 ± 0.03) had the highest prognostic performance with high stability against data perturbation. Our variability analysis indicated that the choice of classification method is the most dominant source of performance variation (34.21% of total variance). Identification of optimal machine-learning methods for radiomic applications is a crucial step towards stable and clinically relevant radiomic biomarkers, providing a non-invasive way of quantifying and monitoring tumor-phenotypic characteristics in clinical practice. PMID:26278466
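
The first stage of the best-performing pipeline, Wilcoxon rank-sum feature scoring, can be sketched with numpy alone on synthetic data. The group sizes and effect size are invented, and the classifier stage (random forest) is omitted:

```python
import numpy as np

# Score each "radiomic" feature by a Wilcoxon rank-sum statistic between
# two outcome groups and keep the strongest ones.
rng = np.random.default_rng(0)
n0, n1, n_features = 40, 40, 10
X0 = rng.normal(0.0, 1.0, (n0, n_features))
X1 = rng.normal(0.0, 1.0, (n1, n_features))
X1[:, 3] += 3.0          # feature 3 genuinely separates the groups

def ranksum_z(a, b):
    """Standardised Wilcoxon rank-sum statistic (normal approximation, no ties)."""
    n, m = len(a), len(b)
    ranks = np.empty(n + m)
    ranks[np.argsort(np.concatenate([a, b]))] = np.arange(1, n + m + 1)
    w = ranks[:n].sum()                       # rank sum of group a
    mu = n * (n + m + 1) / 2.0
    sigma = np.sqrt(n * m * (n + m + 1) / 12.0)
    return (w - mu) / sigma

scores = np.array([abs(ranksum_z(X0[:, j], X1[:, j])) for j in range(n_features)])
print(int(np.argmax(scores)))  # → 3
```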

  17. Quantitative Hydrocarbon Energies from the PMO Method.

    ERIC Educational Resources Information Center

    Cooper, Charles F.

    1979-01-01

    Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)

  18. Research on quantitative radiometric calibration transfer methods between internal and external optical paths

    NASA Astrophysics Data System (ADS)

    Guo, Ju Guang; Ma, Yong hui; Zhang, Guang; Yang, Zhi hui

    2015-10-01

    This paper puts forward a method for transferring radiometric calibration between the internal and external optical paths of an infrared radiation characteristics quantitative measuring system. A theoretical model of the corresponding radiative transfer is established. In engineering use, the method allows the complex and difficult full-optical-path radiometric calibration to be replaced by a comparatively simple and effective half-path calibration. It thereby provides effective support for further quantitative measurement of target radiation characteristics with ground-based infrared quantitative measuring systems.

  19. Review of Quantitative Software Reliability Methods

    SciTech Connect

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. 
NRC also sponsored research at the Ohio State University investigating the modeling of digital systems using dynamic PRA methods. These efforts, documented in NUREG/CR-6901, NUREG/CR-6942, and NUREG/CR-6985, included a functional representation of the system's software but did not explicitly address failure modes caused by software defects or by inadequate design requirements. An important identified research need is to establish a commonly accepted basis for incorporating the behavior of software into digital I&C system reliability models for use in PRAs. To address this need, BNL is exploring the inclusion of software failures into the reliability models of digital I&C systems, such that their contribution to the risk of the associated NPP can be assessed.

  20. Fluorometric method of quantitative cell mutagenesis

    SciTech Connect

    Dolbeare, F.A.

    1982-08-17

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  1. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, F.A.

    1980-12-12

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  2. Fluorometric method of quantitative cell mutagenesis

    DOEpatents

    Dolbeare, Frank A.

    1982-01-01

    A method for assaying a cell culture for mutagenesis is described. A cell culture is stained first with a histochemical stain, and then a fluorescent stain. Normal cells in the culture are stained by both the histochemical and fluorescent stains, while abnormal cells are stained only by the fluorescent stain. The two stains are chosen so that the histochemical stain absorbs the wavelengths that the fluorescent stain emits. After the counterstained culture is subjected to exciting light, the fluorescence from the abnormal cells is detected.

  3. [Quantitative analysis of alloy steel based on laser induced breakdown spectroscopy with partial least squares method].

    PubMed

    Cong, Zhi-Bo; Sun, Lan-Xiang; Xin, Yong; Li, Yang; Qi, Li-Feng; Yang, Zhi-Jia

    2014-02-01

    In the present paper, both the partial least squares (PLS) method and the calibration curve (CC) method are used to quantitatively analyze laser-induced breakdown spectroscopy data obtained from standard alloy steel samples. Both major and trace elements were quantitatively analyzed. Comparing the results of the two calibration methods yielded some useful conclusions: for major elements, the PLS method is better than the CC method in quantitative analysis; more importantly, for trace elements, the CC method cannot give quantitative results because of the extremely weak characteristic spectral lines, but the PLS method still has a good ability for quantitative analysis. The regression coefficients of the PLS method were also compared with the original spectral data, including background interference, to explain the advantage of the PLS method in LIBS quantitative analysis. The results proved that the PLS method is suitable for quantitative analysis of trace elements such as C in the metallurgical industry. PMID:24822436
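
A minimal one-component PLS1 regression (NIPALS-style) illustrates the multivariate calibration idea: every spectral channel contributes, weighted by its covariance with concentration, rather than a single characteristic line as in the CC method. The three-channel synthetic spectra below are an assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
conc = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])        # element concentration
signal = np.outer(conc, [5.0, 3.0, 1.0])               # 3 correlated channels
X = signal + rng.normal(0.0, 0.05, signal.shape)       # weak lines + noise
y = conc

Xm, ym = X.mean(axis=0), y.mean()
Xc, yc = X - Xm, y - ym                                # centre the data

w = Xc.T @ yc
w /= np.linalg.norm(w)                                 # PLS weight vector
t = Xc @ w                                             # scores
b = (yc @ t) / (t @ t)                                 # regression on scores

def predict(x_new):
    """One-component PLS prediction for a new spectrum."""
    return ym + b * ((x_new - Xm) @ w)

print(round(float(predict(X[2])), 2))  # close to the true value 0.4
```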

  4. Forward-backward splitting method for quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Zhang, Xue; Zhou, Weifeng; Zhang, Xiaoqun; Gao, Hao

    2014-12-01

    Quantitative photoacoustic tomography (PAT) reconstructs optical maps from ultrasonic measurements, with resolution improved over conventional optical imaging because acoustic scattering is significantly weaker than optical scattering when detecting signals at depth. In this work, formulating quantitative PAT as a nonlinear least-squares problem with l1-norm sparsity regularization, we develop an efficient gradient-based reconstruction algorithm using a forward-backward splitting method, and prove its convergence for this nonconvex problem.
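
A forward-backward (proximal gradient) iteration for an l1-regularized least-squares toy problem illustrates the splitting scheme: a gradient (forward) step on the smooth data-fit term followed by soft-thresholding (backward step) for the l1 penalty. The small random linear operator below stands in for the quantitative-PAT forward model, which in the paper is nonlinear:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.0, -0.5]               # sparse "optical map"
b = A @ x_true

lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L, L = Lipschitz constant of grad

x = np.zeros(10)
for _ in range(500):
    grad = A.T @ (A @ x - b)               # forward (gradient) step
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # backward (prox)

print(np.argsort(-np.abs(x))[:2])          # → [2 7]: the true support is recovered
```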

  5. Integrating Qualitative and Quantitative Evaluation Methods in Substance Abuse Research.

    ERIC Educational Resources Information Center

    Dennis, Michael L.; And Others

    1994-01-01

    Some specific opportunities and techniques are described for combining and integrating qualitative and quantitative methods from the design stage of a substance abuse program evaluation through implementation and reporting. The multiple problems and requirements of such an evaluation make integrated methods essential. (SLD)

  6. A Quantitative Vainberg Method for Black Box Scattering

    NASA Astrophysics Data System (ADS)

    Galkowski, Jeffrey

    2016-05-01

    We give a quantitative version of Vainberg's method relating pole-free regions to propagation of singularities for black box scatterers. In particular, we show that there is a logarithmic resonance-free region near the real axis of size τ with polynomial bounds on the resolvent if and only if the wave propagator gains derivatives at rate τ. Next we show that if there exist singularities in the wave trace at times tending to infinity which smooth at rate τ, then there are resonances in logarithmic strips whose width is given by τ. As our main application of these results, we give sharp bounds on the size of resonance-free regions in scattering on geometrically nontrapping manifolds with conic points. Moreover, these bounds are generically optimal on exteriors of nontrapping polygonal domains.

  7. [Teaching quantitative methods in public health: the EHESP experience].

    PubMed

    Grimaud, Olivier; Astagneau, Pascal; Desvarieux, Moïse; Chambaud, Laurent

    2014-01-01

    Many scientific disciplines, including epidemiology and biostatistics, are used in the field of public health. These quantitative sciences are fundamental tools necessary for the practice of future professionals. What then should be the minimum quantitative sciences training, common to all future public health professionals? By comparing the teaching models developed in Columbia University and those in the National School of Public Health in France, the authors recognize the need to adapt teaching to the specific competencies required for each profession. They insist that all public health professionals, whatever their future career, should be familiar with quantitative methods in order to ensure that decision-making is based on a reflective and critical use of quantitative analysis. PMID:25629671

  8. A Quantitative Method for Weight Selection in SGDDP.

    PubMed

    Huang, Qin; Chen, Gang; Yuan, Zhilong; Zhang, Ying; Wenrich, Judy

    2015-01-01

    Ethnic factors pose a major challenge to evaluating the treatment effect of a new drug in a targeted ethnic (TE) population in emerging regions based on the results from a multiregional clinical trial (MRCT). To address this issue with statistical rigor, Huang et al. (2012) proposed a new design for a simultaneous global drug development program (SGDDP) which used weighted Z tests to combine the information collected from the nontargeted ethnic (NTE) group in the MRCT with that from the TE group in both the MRCT and a simultaneously designed local clinical trial (LCT). An important and open question in the SGDDP design was how to downweight the information collected from the NTE population to reflect the potential impact of ethnic factors and ensure that the effect size for TE patients is clinically meaningful. In this paper, we relate the weight selection for the SGDDP to Method 1 proposed in the Japanese regulatory guidance published by the Ministry of Health, Labour and Welfare (MHLW) in 2007. Method 1 is only applicable when the true effect sizes are assumed to be equal for both the TE and NTE groups. We modified the Method 1 formula for more general scenarios, and used it to develop a quantitative method of weight selection for the design of the SGDDP which, at the same time, also provides sufficient power to descriptively check the consistency of the effect size for TE patients against a clinically meaningful magnitude. PMID:25365548
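
The weighted Z combination at the heart of the SGDDP design can be sketched directly: weights whose squares sum to one keep the combined statistic standard normal under the null. The weight value below is a placeholder, not the paper's Method 1 choice:

```python
import math

def combined_z(z_te, z_nte, w_te):
    """Weighted Z test: w_te in (0, 1] downweights the NTE contribution."""
    w_nte = math.sqrt(1.0 - w_te ** 2)
    return w_te * z_te + w_nte * z_nte

# Hypothetical test statistics from the TE and NTE groups:
z = combined_z(z_te=2.0, z_nte=1.5, w_te=0.8)
print(round(z, 3))  # → 2.5
```

With w_te = 1 the NTE information is ignored entirely; choosing w_te is exactly the weight-selection problem the paper addresses.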

  9. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high-resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as in patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and the commercial software, and the results were used to obtain corrected images. The corrected images from the two approaches were then compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality and assurance of performance during clinical use of endoscopic technologies.

  10. A Quantitative Assessment Method for Ascaris Eggs on Hands

    PubMed Central

    Jeandron, Aurelie; Ensink, Jeroen H. J.; Thamsborg, Stig M.; Dalsgaard, Anders; Sengupta, Mita E.

    2014-01-01

    The importance of hands in the transmission of soil transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine recovery rates of eggs for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfoccinate sodium salt] and non-ionic [Tween80 0.1% -polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed that validate the hand washing method under field conditions, e.g. including people of different age, lower levels of contamination and various levels of hand cleanliness. PMID:24802859

  11. A quantitative assessment method for Ascaris eggs on hands.

    PubMed

    Jeandron, Aurelie; Ensink, Jeroen H J; Thamsborg, Stig M; Dalsgaard, Anders; Sengupta, Mita E

    2014-01-01

    The importance of hands in the transmission of soil transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine recovery rates of eggs for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfoccinate sodium salt] and non-ionic [Tween80 0.1% -polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed that validate the hand washing method under field conditions, e.g. including people of different age, lower levels of contamination and various levels of hand cleanliness. PMID:24802859

  12. Simple laboratory methods for quantitative IR measurements of CW agents

    NASA Astrophysics Data System (ADS)

    Puckrin, Eldon; Thériault, Jean-Marc; Lavoie, Hugo; Dubé, Denis; Lepage, Carmela J.; Petryk, Michael

    2005-11-01

    A simple method is presented for quantitatively measuring the absorbance of chemical warfare (CW) agents and their simulants in the vapour phase. The technique is based on a standard lab-bench FTIR spectrometer, a 10-cm gas cell, a high-accuracy Baratron pressure manometer, a vacuum pump and simple stainless-steel hardware components. The results of this measurement technique are demonstrated for sarin (GB) and soman (GD). A second technique is also introduced for the passive IR detection of CW agents in an open-air path located in a fumehood. Using a modified open cell with a pathlength of 45 cm, open-air passive infrared measurements have been obtained for simulants and several classical CW agents. Detection, identification and quantification results based on passive infrared measurements are presented for GB and the CW agent simulant DMMP, using the CATSI sensor developed by DRDC Valcartier. The open-cell technique represents a relatively simple and feasible method for examining the detection capability of passive sensors, such as CATSI, for CW agents.

  13. Trojan Horse Method: Recent Results

    SciTech Connect

    Pizzone, R. G.; Spitaleri, C.

    2008-01-24

    Owing to the presence of the Coulomb barrier at astrophysically relevant kinetic energies, it is very difficult, or sometimes impossible, to measure astrophysical reaction rates in the laboratory. This is why different indirect techniques are being used along with direct measurements. The THM is a unique indirect technique that allows one to measure astrophysical rearrangement reactions down to astrophysically relevant energies. The basic principle and a review of the main applications of the Trojan Horse Method are presented. The applications aiming at the extraction of the bare S{sub b}(E) astrophysical factor and of the electron screening potentials U{sub e} for several two-body processes are discussed.

  14. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
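
    The levelized cost of energy mentioned above can be made concrete with a small sketch: discounted lifetime cost divided by discounted lifetime energy output. The cost and output figures below are invented for illustration and are not values from the paper.

    ```python
    def lcoe(costs, energies, rate):
        """Levelized cost of energy: discounted total cost divided by
        discounted total energy output over the project lifetime."""
        disc_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
        disc_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energies))
        return disc_cost / disc_energy

    # Hypothetical plant: $1000 upfront, $50/yr O&M for 3 years, 500 kWh/yr.
    costs = [1000, 50, 50, 50]
    energies = [0, 500, 500, 500]
    print(round(lcoe(costs, energies, 0.05), 4))  # → 0.8344 ($/kWh)
    ```

    The same discounting machinery yields net present values when applied to cost streams alone, which is why LCOE sits naturally alongside the engineering-economics methods the abstract surveys.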

  15. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  16. Spy quantitative inspection with a machine vision light sectioning method

    NASA Astrophysics Data System (ADS)

    Tu, Da-Wei; Lin, Cai-Xing

    2000-08-01

    Machine vision light sectioning sensing is developed and expanded in this paper to spy quantitative inspection of hole-like workpieces. A light beam from a semiconductor laser diode is converged into a line shape by a cylindrical lens. A special compact reflecting-refracting prism group is designed to ensure that the sectioning light is projected axially onto the inner surface and that the deformed line is imaged onto a CCD sensitive area. The image is digitized and captured into a computer by a 512 × 512 pixel card, and machine vision image processing methods such as thresholding, line-centre detection and least-squares fitting are developed for contour feature extraction and description. Two other important problems in such an inspection system are how to orientate the deep-going optical probe and how to bring the projected line into focus. A focusing criterion based on image position deviation and a four-step orientating procedure are put forward and shown to be feasible. The experimental results show that the principle is correct, the techniques are realizable, and the method holds good promise for industrial application.
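
    The least-squares step mentioned in the abstract amounts to fitting a straight line to the extracted line-centre points. A minimal sketch of that fit follows; the points are synthetic and only illustrate the arithmetic, not data from the paper.

    ```python
    def fit_line(points):
        """Least-squares fit y = a*x + b to line-centre points
        extracted from a thresholded light-sectioning image."""
        n = len(points)
        sx = sum(x for x, _ in points)
        sy = sum(y for _, y in points)
        sxx = sum(x * x for x, _ in points)
        sxy = sum(x * y for x, y in points)
        a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        b = (sy - a * sx) / n
        return a, b

    # Synthetic centre points lying exactly on y = 2x + 1.
    a, b = fit_line([(0, 1), (1, 3), (2, 5), (3, 7)])
    print(a, b)  # → 2.0 1.0
    ```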

  17. A Novel Targeted Learning Method for Quantitative Trait Loci Mapping

    PubMed Central

    Wang, Hui; Zhang, Zhongyang; Rose, Sherri; van der Laan, Mark

    2014-01-01

    We present a novel semiparametric method for quantitative trait loci (QTL) mapping in experimental crosses. Conventional genetic mapping methods typically assume parametric models with Gaussian errors and obtain parameter estimates through maximum-likelihood estimation. In contrast with univariate regression and interval-mapping methods, our model requires fewer assumptions and also accommodates various machine-learning algorithms. Estimation is performed with targeted maximum-likelihood learning methods. We demonstrate our semiparametric targeted learning approach in a simulation study and a well-studied barley data set. PMID:25258376

  18. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows their practical implementation in Matlab, enabling fully automated and reproducible measurement of selected parameters in thermal images. Two examples of the use of the proposed image analysis methods are shown, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) PMID:26556680

  19. Quantitative method of measuring cancer cell urokinase and metastatic potential

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  20. Method for depth-resolved quantitation of optical properties in layered media using spatially modulated quantitative spectroscopy

    PubMed Central

    Saager, Rolf B.; Truong, Alex; Cuccia, David J.; Durkin, Anthony J.

    2011-01-01

    We have demonstrated that spatially modulated quantitative spectroscopy (SMoQS) is capable of extracting absolute optical properties from homogeneous tissue-simulating phantoms that span both the visible and near-infrared wavelength regimes. However, biological tissue, such as skin, is highly structured, presenting challenges to quantitative spectroscopic techniques based on homogeneous models. In order to more accurately address the challenges associated with skin, we present a method for depth-resolved optical property quantitation based on a two-layer model. Layered Monte Carlo simulations and layered tissue-simulating phantoms are used to determine the efficacy and accuracy of SMoQS in quantifying layer-specific optical properties of layered media. Initial results from both simulation and experiment show that this empirical method is capable of determining top layer thickness to within tens of microns across a physiological range for skin. Layer-specific chromophore concentration can be determined to within ±10% of the actual values on average, whereas bulk quantitation in either the visible or near-infrared spectroscopic regime significantly underestimates the layer-specific chromophore concentration and can be confounded by top layer thickness. PMID:21806282

  1. Gap analysis: Concepts, methods, and recent results

    USGS Publications Warehouse

    Jennings, M.D.

    2000-01-01

    Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  2. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    NASA Astrophysics Data System (ADS)

    Michalska, J.; Chmiela, B.

    2014-03-01

    The purpose of the research was to work out a qualitative and quantitative analysis of the phases in DSS in the as-received state and after thermal aging. For qualitative purposes, SEM observations, EDS analyses and electron backscatter diffraction (EBSD) methods were employed. Quantitative analysis of the phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods are pointed out, and the accuracy of the phase analysis performed by the two methods is compared.

  3. A review of methods for quantitative evaluation of axial vertebral rotation

    PubMed Central

    Pernuš, Franjo; Likar, Boštjan

    2009-01-01

    Quantitative evaluation of axial vertebral rotation is essential for the determination of reference values in normal and pathological conditions and for understanding the mechanisms of the progression of spinal deformities. However, routine quantitative evaluation of axial vertebral rotation is difficult and error-prone due to the limitations of the observer, characteristics of the observed vertebral anatomy and specific imaging properties. The scope of this paper is to review the existing methods for quantitative evaluation of axial vertebral rotation from medical images along with all relevant publications, which may provide a valuable resource for studying the existing methods or developing new methods and evaluation strategies. The reviewed methods are divided into the methods for evaluation of axial vertebral rotation in 2D images and the methods for evaluation of axial vertebral rotation in 3D images. Key evaluation issues and future considerations, supported by the results of the overview, are also discussed. PMID:19242736

  4. Analytical methods for quantitation of prenylated flavonoids from hops

    PubMed Central

    Nikolić, Dejan; van Breemen, Richard B.

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach. PMID:24077106

  5. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis method, and the metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult, or not possible, to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. PMID:24919831

  6. Quantitative imaging of volcanic plumes - Results, needs, and future trends

    NASA Astrophysics Data System (ADS)

    Platt, Ulrich; Lübcke, Peter; Kuhn, Jonas; Bobrowski, Nicole; Prata, Fred; Burton, Mike; Kern, Christoph

    2015-07-01

    Recent technology allows two-dimensional "imaging" of trace gas distributions in plumes. In contrast to older, one-dimensional remote sensing techniques, that are only capable of measuring total column densities, the new imaging methods give insight into details of transport and mixing processes as well as chemical transformation within plumes. We give an overview of gas imaging techniques already being applied at volcanoes (SO2 cameras, imaging DOAS, FT-IR imaging), present techniques where first field experiments were conducted (LED-LIDAR, tomographic mapping), and describe some techniques where only theoretical studies with application to volcanology exist (e.g. Fabry-Pérot Imaging, Gas Correlation Spectroscopy, bi-static LIDAR). Finally, we discuss current needs and future trends in imaging technology.

  7. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  8. Quantitative analytical method to evaluate the metabolism of vitamin D.

    PubMed

    Mena-Bravo, A; Ferreiro-Vera, C; Priego-Capote, F; Maestro, M A; Mouriño, A; Quesada-Gómez, J M; Luque de Castro, M D

    2015-03-10

    A method for the quantitative analysis of vitamin D (both D2 and D3) and its main metabolites - the monohydroxylated forms (25-hydroxyvitamin D2 and 25-hydroxyvitamin D3) and the dihydroxylated metabolites (1,25-dihydroxyvitamin D2, 1,25-dihydroxyvitamin D3 and 24,25-dihydroxyvitamin D3) - in human serum is reported here. The method is based on direct analysis of serum by an automated platform involving on-line coupling of a solid-phase extraction workstation to a liquid chromatograph-tandem mass spectrometer. The seven analytes were detected in selected reaction monitoring (SRM) mode, and quantitative analysis was supported by the use of stable isotopically labeled internal standards (SIL-ISs). Detection limits were between 0.3 and 75 pg/mL for the target compounds, while precision (expressed as relative standard deviation) was below 13.0% for between-day variability. The method was externally validated through the vitamin D External Quality Assurance Scheme (DEQAS) by analysis of ten serum samples provided by the scheme. The analytical features of the method support its applicability in nutritional and clinical studies aimed at elucidating the role of vitamin D metabolism. PMID:25575651

  9. Thermography as a quantitative imaging method for assessing postoperative inflammation

    PubMed Central

    Christensen, J; Matzen, LH; Vaeth, M; Schou, S; Wenzel, A

    2012-01-01

    Objective To assess differences in skin temperature between the operated and control sides of the face after mandibular third molar surgery using thermography. Methods 127 patients had 1 mandibular third molar removed. Before the surgery, standardized thermograms were taken of both sides of the patient's face using a Flir ThermaCam™ E320 (Precisions Teknik AB, Halmstad, Sweden). The imaging procedure was repeated 2 days and 7 days after surgery. A region of interest including the third molar region was marked on each image. The mean temperature within each region of interest was calculated. Differences between sides and over time were assessed using paired t-tests. Results No significant difference was found between the operated side and the control side either before or 7 days after surgery (p > 0.3). The temperature of the operated side (mean: 32.39 °C, range: 28.9–35.3 °C) was higher than that of the control side (mean: 32.06 °C, range: 28.5–35.0 °C) 2 days after surgery [0.33 °C, 95% confidence interval (CI): 0.22–0.44 °C, p < 0.001]. No significant difference was found between the pre-operative and the 7-day post-operative temperature (p > 0.1). After 2 days, the operated side was not significantly different from the temperature pre-operatively (p = 0.12), whereas the control side had a lower temperature (0.57 °C, 95% CI: 0.29–0.86 °C, p < 0.001). Conclusions Thermography seems useful for quantitative assessment of inflammation between the intervention side and the control side after surgical removal of mandibular third molars. However, thermography cannot be used to assess absolute temperature changes due to normal variations in skin temperature over time. PMID:22752326
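
    The paired t-tests used in the study compare matched readings from the two sides of the same face. A minimal sketch of the statistic follows; the five patient readings are invented for demonstration and are not data from the paper.

    ```python
    import math

    def paired_t(x, y):
        """Paired t-statistic for matched readings, e.g. operated vs.
        control side of the face: mean difference over its standard error."""
        d = [a - b for a, b in zip(x, y)]
        n = len(d)
        mean = sum(d) / n
        var = sum((v - mean) ** 2 for v in d) / (n - 1)  # sample variance
        return mean / math.sqrt(var / n)

    # Hypothetical day-2 readings (degrees C) for five patients.
    operated = [32.5, 32.1, 33.0, 31.8, 32.6]
    control = [32.1, 31.9, 32.5, 31.6, 32.2]
    print(round(paired_t(operated, control), 3))  # → 5.667
    ```

    In practice a library routine such as `scipy.stats.ttest_rel` would also return the p-value; the hand-rolled version above only shows where the statistic comes from.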

  10. Quantitative results of stellar evolution and pulsation theories.

    NASA Technical Reports Server (NTRS)

    Fricke, K.; Stobie, R. S.; Strittmatter, P. A.

    1971-01-01

    The discrepancy between the masses of Cepheid variables deduced from evolution theory and pulsation theory is examined. The effect of input physics on evolutionary tracks is first discussed; in particular, changes in the opacity are considered. The sensitivity of pulsation masses to opacity changes and to the ascribed values of luminosity and effective temperature are then analyzed. The Cepheid mass discrepancy is discussed in the light of the results already obtained. Other astronomical evidence, including the mass-luminosity relation for main sequence stars, the solar neutrino flux, and cluster ages are also considered in an attempt to determine the most likely source of error in the event that substantial mass loss has not occurred.

  11. Method for a quantitative investigation of the frozen flow hypothesis

    PubMed

    Schock; Spillar

    2000-09-01

    We present a technique to test the frozen flow hypothesis quantitatively, using data from wave-front sensors such as those found in adaptive optics systems. Detailed treatments of the theoretical background of the method and of the error analysis are presented. Analyzing data from the 1.5-m and 3.5-m telescopes at the Starfire Optical Range, we find that the frozen flow hypothesis is an accurate description of the temporal development of atmospheric turbulence on time scales of the order of 1-10 ms but that significant deviations from the frozen flow behavior are present for longer time scales. PMID:10975375

  12. Biological characteristics of crucian by quantitative inspection method

    NASA Astrophysics Data System (ADS)

    Chu, Mengqi

    2015-04-01

    Through a quantitative inspection method, the biological characteristics of the crucian carp were preliminarily researched. The crucian (Carassius auratus; order Cypriniformes, family Cyprinidae) is a mainly plant-eating, omnivorous, gregarious fish; it is widely distributed, and waters all over the country support its production year-round. Indicators measured in the experiment were used to characterize the growth and reproduction of crucian in the study area. Using the measured data (scale length, scale size, annulus diameter, and so on) and the related growth functions, the growth of a crucian in any given year can be calculated. Maturity was determined from egg shape, colour, weight, etc.; using the mean egg diameter of 20 eggs and the number of eggs per 0.5 g, the relative and absolute fecundity of the fish were calculated. The measured crucian were females at puberty. From the scale diameter and body length data, the linear relationship between them was estimated as y = 1.530 + 3.0649x. The data show that fecundity is closely related to age: the older the fish, the more mature the gonad development and the greater the number of eggs; in addition, absolute fecundity increases with pituitary development. Quantitative inspection of the ingested bait organisms reveals the main foods, secondary foods and incidental foods of the crucian, and the degree to which it prefers each kind of bait organism. Fecundity increases with weight gain; it is a characteristic of species and populations and is at the same time influenced by individual age, body length, body weight, environmental conditions (especially nutritional conditions), breeding habits, number of spawnings and egg size. This series of studies of the crucian's biological characteristics provides an ecological basis for local plans for the crucian's feeding, breeding, propagation, fishing, resource protection and management.
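
    The fecundity estimate described (scaling up an egg count from a 0.5 g subsample) can be sketched as follows. The helper name `fecundity` and all the numbers are hypothetical, chosen only to illustrate the arithmetic.

    ```python
    def fecundity(eggs_per_half_gram, ovary_weight_g, body_weight_g):
        """Absolute fecundity scaled up from a 0.5 g subsample count,
        and relative fecundity per gram of body weight."""
        absolute = eggs_per_half_gram * (ovary_weight_g / 0.5)
        relative = absolute / body_weight_g
        return absolute, relative

    # Hypothetical fish: 120 eggs counted in 0.5 g of a 25 g ovary, 500 g body.
    a, r = fecundity(120, 25.0, 500.0)
    print(a, r)  # → 6000.0 12.0
    ```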

  13. A novel semi-quantitative method for measuring tissue bleeding.

    PubMed

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100× magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula N = Bt × 100 / At, where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples. PMID:24190861
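
    The abstract's formula N = Bt x 100 / At translates directly into code. The polyline areas below are hypothetical values in arbitrary units, not measurements from the study.

    ```python
    def bleeding_percentage(sample_areas, bleeding_areas):
        """N = Bt * 100 / At: total bleeding-affected area (Bt) as a
        percentage of the total tissue sample area (At)."""
        at = sum(sample_areas)  # sum of whole-sample polyline areas
        bt = sum(bleeding_areas)  # sum of bleeding-region polyline areas
        return bt * 100.0 / at

    # Hypothetical AutoCAD polyline areas for two samples.
    print(bleeding_percentage([40.0, 60.0], [4.0, 8.0]))  # → 12.0
    ```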

  14. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, for quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes enrichment technologies targeting protein glycosylation, mainly solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Mass Spectrom Rev 34:148-165, 2015. PMID:24889823

  15. Quantitative Methods in the Study of Local History

    ERIC Educational Resources Information Center

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment rolls, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  16. Reproducibility of CSF quantitative culture methods for estimating rate of clearance in cryptococcal meningitis.

    PubMed

    Dyal, Jonathan; Akampurira, Andrew; Rhein, Joshua; Morawski, Bozena M; Kiggundu, Reuben; Nabeta, Henry W; Musubire, Abdu K; Bahr, Nathan C; Williams, Darlisha A; Bicanic, Tihana; Larsen, Robert A; Meya, David B; Boulware, David R

    2016-05-01

    Quantitative cerebrospinal fluid (CSF) cultures provide a measure of disease severity in cryptococcal meningitis. The fungal clearance rate by quantitative cultures has become a primary endpoint for phase II clinical trials. This study determined the inter-assay accuracy of three different quantitative culture methodologies. Among 91 participants with meningitis symptoms in Kampala, Uganda, during August-November 2013, 305 CSF samples were prospectively collected from patients at multiple time points during treatment. Samples were simultaneously cultured by three methods: (1) the St. George's method, with a 100 mcl input volume of CSF and five 1:10 serial dilutions; (2) the AIDS Clinical Trials Group (ACTG) method, using 1000, 100, and 10 mcl input volumes and two 1:100 dilutions with 100 and 10 mcl input volume per dilution on seven agar plates; and (3) a 10 mcl calibrated loop of undiluted and 1:100 diluted CSF (loop). Quantitative culture values did not statistically differ between the St. George and ACTG methods (P = .09) but did between the St. George and 10 mcl loop methods (P < .001). Repeated-measures pairwise correlation between any of the methods was high (r ≥ 0.88). For detecting sterility, the ACTG method had the highest negative predictive value of 97% (91% St. George, 60% loop), but the ACTG method had occasional (~10%) difficulties in quantification due to colony clumping. For the CSF clearance rate, the St. George and ACTG methods did not differ overall (mean -0.05 ± 0.07 log10 CFU/ml/day; P = .14) at the group level; however, individual-level clearance varied. The St. George and ACTG quantitative CSF culture methods produced comparable but not identical results. Quantitative cultures can inform treatment management strategies. PMID:26768372
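
    The clearance rate reported in log10 CFU/ml/day is, in essence, the slope of a least-squares line through serial quantitative culture counts over time. A minimal sketch follows; the days and counts are invented, not trial data.

    ```python
    def clearance_rate(days, log10_cfu):
        """Slope of the least-squares line through (day, log10 CFU/ml)
        points: the fungal clearance rate used as a trial endpoint."""
        n = len(days)
        mx = sum(days) / n
        my = sum(log10_cfu) / n
        num = sum((x - mx) * (y - my) for x, y in zip(days, log10_cfu))
        den = sum((x - mx) ** 2 for x in days)
        return num / den

    # Hypothetical serial quantitative cultures over two weeks of therapy.
    days = [0, 3, 7, 10, 14]
    counts = [5.2, 4.9, 4.4, 4.1, 3.6]  # log10 CFU/ml
    print(round(clearance_rate(days, counts), 3))  # → -0.114 (cfu falling)
    ```

    A negative slope indicates clearance; per-patient slopes like this one are what varied at the individual level in the study.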

  17. Methods for Quantitative Interpretation of Retarding Field Analyzer Data

    SciTech Connect

    Calvey, J.R.; Crittenden, J.A.; Dugan, G.F.; Palmer, M.A.; Furman, M.; Harkay, K.

    2011-03-28

    Over the course of the CesrTA program at Cornell, over 30 Retarding Field Analyzers (RFAs) have been installed in the CESR storage ring, and a great deal of data has been taken with them. These devices measure the local electron cloud density and energy distribution, and can be used to evaluate the efficacy of different cloud mitigation techniques. Obtaining a quantitative understanding of RFA data requires use of cloud simulation programs, as well as a detailed model of the detector itself. In a drift region, the RFA can be modeled by postprocessing the output of a simulation code, and one can obtain best fit values for important simulation parameters with a chi-square minimization method.
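
    The chi-square minimization mentioned above can be sketched as a one-parameter grid search; the toy model, the density parameter `rho`, and all numbers below are illustrative assumptions, not the actual CesrTA postprocessing code:

```python
def chi_square(measured, simulated, sigma):
    """Sum of squared, uncertainty-weighted residuals between data and model."""
    return sum((m - s) ** 2 / sg ** 2 for m, s, sg in zip(measured, simulated, sigma))

def model_signal(rho, bins):
    """Toy RFA response: signal in each bin proportional to cloud density rho."""
    return [rho * b for b in bins]

bins = [1.0, 2.0, 3.0]        # detector-bin sensitivities (illustrative)
measured = [2.1, 3.9, 6.2]    # measured RFA currents (illustrative)
sigma = [0.2, 0.2, 0.2]       # measurement uncertainties

# Grid search over the density parameter, keeping the chi-square minimum
best_chi2, best_rho = min(
    (chi_square(measured, model_signal(r, bins), sigma), r)
    for r in (x / 100 for x in range(100, 300))
)
```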

  18. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    ERIC Educational Resources Information Center

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  19. Implementation of a quantitative Foucault knife-edge method by means of isophotometry

    NASA Astrophysics Data System (ADS)

    Zhevlakov, A. P.; Zatsepina, M. E.; Kirillovskii, V. K.

    2014-06-01

    A detailed description of the stages of computer processing of the shadowgrams during implementation of a modern quantitative Foucault knife-edge method is presented. A map of the wave-front aberrations introduced by errors of an optical surface or system is shown, along with the results of calculating the set of required image-quality characteristics.

  20. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  1. Complementarity as a Program Evaluation Strategy: A Focus on Qualitative and Quantitative Methods.

    ERIC Educational Resources Information Center

    Lafleur, Clay

    Use of complementarity as a deliberate and necessary program evaluation strategy is discussed. Quantitative and qualitative approaches are viewed as complementary and can be integrated into a single study. The synergy that results from using complementary methods in a single study seems to enhance understanding and interpretation. A review of the…

  2. A quantitative evaluation of various deconvolution methods and their applications in the deconvolution of plasma spectra

    NASA Astrophysics Data System (ADS)

    Xiong, Yanwei; Shi, Yuejiang; Li, Yingying; Fu, Jia; Lu, Bo; Zhang, Hongming; Wang, Xiaoguang; Wang, Fudi; Shen, Yongcai

    2013-06-01

    A quantitative evaluation of various deconvolution methods and their applications in processing plasma emission spectra was performed. The iterative deconvolution algorithms evaluated here include Jansson's method, the Richardson-Lucy method, the maximum a posteriori method, and Gold's method. The evaluation criteria include minimization of the sum of squared errors and of the sum of squared relative errors of parameters, as well as the rate of convergence. After comparing the deconvolved results of these methods, it was concluded that Jansson's and Gold's methods were able to provide good profiles that are visually close to the original spectra. Additionally, Gold's method generally gives the best results when all of the criteria above are considered. Applications of these methods to actual plasma spectra obtained from the EAST tokamak are also presented in this paper. The deconvolution results with Gold's and Jansson's methods show that instrumental effects can be satisfactorily eliminated and clear spectra recovered.
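
    Of the algorithms named above, the Richardson-Lucy iteration is perhaps the easiest to sketch. A minimal 1-D version under simple assumptions (the three-point instrument function and the single-line test spectrum are made-up examples):

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=200):
    """Richardson-Lucy iterative deconvolution for nonnegative 1-D signals.

    Each pass blurs the current estimate with the instrument function (psf),
    compares the result to the observed spectrum, and reapportions intensity
    by correlating that ratio with the mirrored psf.
    """
    estimate = np.full(len(observed), observed.mean(), dtype=float)
    psf_mirror = psf[::-1]
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        estimate *= np.convolve(ratio, psf_mirror, mode="same")
    return estimate

# A single sharp line blurred by a 3-point instrument function is recovered
psf = np.array([0.25, 0.5, 0.25])
true_line = np.zeros(21)
true_line[10] = 1.0
observed = np.convolve(true_line, psf, mode="same")
restored = richardson_lucy(observed, psf)
```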

  3. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  4. Understanding youth: using qualitative methods to verify quantitative community indicators.

    PubMed

    Makhoul, Jihad; Nakkash, Rima

    2009-01-01

    Community- and individual-level data were collected from interviews with 1,294 boys and girls, 13 to 19 years old, in three impoverished urban communities of Beirut. Univariate analyses of variables provide quantitative indicators of adolescents' lives and communities. Researchers, including the authors, interested in using these indicators to plan community interventions with youth in the Palestinian refugee camp discuss the pertinent results with youth from the camp in six focus groups. The authors find that many indicators misrepresent the situation of youth in the camp. For example, adolescents may have underreported cigarette and argileh (water pipe) smoking (8.3% and 22.4%, respectively) because of the lack of social desirability of these behaviors; other questions may have been misunderstood, such as perceived health and health compared to others. Also, issues important to them, such as drug abuse, violence, and school problems, were not asked about. Implications for intervention research are discussed. PMID:17971480

  5. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    SciTech Connect

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social, and economic factors may offer insight into how prone a country (or set of countries) may be to inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict, both retrospectively and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempts at modeling conflict as a result of system-level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, along with recommendations for future attempts at modeling conflict.

  6. A Method for Designing Instrument-Free Quantitative Immunoassays.

    PubMed

    Lathwal, Shefali; Sikes, Hadley D

    2016-03-15

    Colorimetric readouts are widely used in point-of-care diagnostic immunoassays to indicate either the presence or the absence of an analyte. For a variety of reasons, it is more difficult to quantify rather than simply detect an analyte using a colorimetric test. We report a method for designing, with minimal iteration, a quantitative immunoassay that can be interpreted objectively by a simple count of the number of spots visible to the unaided eye. We combined a method called polymerization-based amplification (PBA) with a series of microscale features containing a decreasing surface density of capture molecules, and the central focus of the study is understanding how the choice of surface densities impacts performance. Using a model pair of antibodies, we have shown that our design approach does not depend on measurement of equilibrium and kinetic binding parameters and can provide a dynamic working range of 3 orders of magnitude (70 pM to 70 nM) for visual quantification. PMID:26878154

  7. A MALDI-MS-based quantitative analytical method for endogenous estrone in human breast cancer cells

    PubMed Central

    Kim, Kyoung-Jin; Kim, Hee-Jin; Park, Han-Gyu; Hwang, Cheol-Hwan; Sung, Changmin; Jang, Kyoung-Soon; Park, Sung-Hee; Kim, Byung-Gee; Lee, Yoo-Kyung; Yang, Yung-Hun; Jeong, Jae Hyun; Kim, Yun-Gon

    2016-01-01

    The level of endogenous estrone, one of the three major naturally occurring estrogens, has a significant correlation with the incidence of post-menopausal breast cancer. However, it is challenging to quantitatively monitor it owing to its low abundance. Here, we develop a robust and highly sensitive matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS)-based quantitative platform to identify the absolute quantities of endogenous estrones in a variety of clinical specimens. The one-step modification of endogenous estrone provided good linearity (R² > 0.99) and significantly increased the sensitivity of the platform (limit of quantitation: 11 fmol). In addition, we could identify the absolute amount of endogenous estrones in cells of the breast cancer cell line MCF-7 (34 fmol/10⁶ cells) by using a deuterated estrone as an internal standard. Finally, by applying the MALDI-MS-based quantitative method to endogenous estrones, we successfully monitored changes in the metabolic expression level of estrones (17.7 fmol/10⁶ letrozole-treated cells) in MCF-7 cells resulting from treatment with an aromatase inhibitor. Taken together, these results suggest that this MALDI-MS-based quantitative approach may be a general method for the targeted metabolomics of ketone-containing metabolites, which can reflect clinical conditions and pathogenic mechanisms. PMID:27091422
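
    Internal-standard quantitation of the kind described above reduces to scaling the analyte/standard peak-intensity ratio by the spiked amount. A hedged sketch (the intensities and the unit response factor are assumptions; a real assay would calibrate the factor):

```python
def quantify_with_internal_standard(peak_analyte, peak_istd, amount_istd,
                                    response_factor=1.0):
    """Absolute analyte amount from the ratio of its peak to the internal
    standard's peak. response_factor corrects for unequal ionization
    efficiency; with a deuterated analogue of the analyte it is close to 1.
    """
    return peak_analyte / peak_istd / response_factor * amount_istd

# Hypothetical spectrum: 100 fmol of deuterated standard spiked into the sample
amount_fmol = quantify_with_internal_standard(3400.0, 10000.0, 100.0)  # -> 34.0
```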

  8. A MALDI-MS-based quantitative analytical method for endogenous estrone in human breast cancer cells.

    PubMed

    Kim, Kyoung-Jin; Kim, Hee-Jin; Park, Han-Gyu; Hwang, Cheol-Hwan; Sung, Changmin; Jang, Kyoung-Soon; Park, Sung-Hee; Kim, Byung-Gee; Lee, Yoo-Kyung; Yang, Yung-Hun; Jeong, Jae Hyun; Kim, Yun-Gon

    2016-01-01

    The level of endogenous estrone, one of the three major naturally occurring estrogens, has a significant correlation with the incidence of post-menopausal breast cancer. However, it is challenging to quantitatively monitor it owing to its low abundance. Here, we develop a robust and highly sensitive matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS)-based quantitative platform to identify the absolute quantities of endogenous estrones in a variety of clinical specimens. The one-step modification of endogenous estrone provided good linearity (R² > 0.99) and significantly increased the sensitivity of the platform (limit of quantitation: 11 fmol). In addition, we could identify the absolute amount of endogenous estrones in cells of the breast cancer cell line MCF-7 (34 fmol/10⁶ cells) by using a deuterated estrone as an internal standard. Finally, by applying the MALDI-MS-based quantitative method to endogenous estrones, we successfully monitored changes in the metabolic expression level of estrones (17.7 fmol/10⁶ letrozole-treated cells) in MCF-7 cells resulting from treatment with an aromatase inhibitor. Taken together, these results suggest that this MALDI-MS-based quantitative approach may be a general method for the targeted metabolomics of ketone-containing metabolites, which can reflect clinical conditions and pathogenic mechanisms. PMID:27091422

  9. A processing method enabling the use of peak height for accurate and precise proton NMR quantitation.

    PubMed

    Hays, Patrick A; Thompson, Robert A

    2009-10-01

    In NMR, peak area quantitation is the most common method used because the area under a peak or peak group is proportional to the number of nuclei at those frequencies. Peak height quantitation has not enjoyed as much utility because of poor precision and linearity resulting from inconsistent peak shapes and widths (measured at half height). By using a post-acquisition processing method employing a Gaussian or line-broadening (exponential decay) apodization (i.e., weighting function) to normalize the shape and width of the internal standard (ISTD) peak, the analyte peak heights of a calibration spectrum can be compared to those in a sample spectrum, yielding accurate and precise quantitative results. Peak height results compared favorably with 'clean' peak area results for several hundred illicit samples of methamphetamine HCl, cocaine HCl, and heroin HCl of varying composition and purity. Using peak height and peak area results together can enhance confidence in the reported purity value, a major advantage in high-throughput, automated quantitative analyses. PMID:19548253
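
    The apodization step described above amounts to multiplying the time-domain signal (FID) by a decaying weight before Fourier transformation. A minimal sketch of exponential line broadening (the dwell time and broadening factor below are illustrative):

```python
import math

def line_broaden(fid, dwell_time_s, lb_hz):
    """Apply exponential (line-broadening) apodization to an FID.

    Multiplying point i, acquired at t = i * dwell_time_s, by
    exp(-pi * lb_hz * t) adds lb_hz to every peak's width at half height,
    normalizing peak shapes so heights can be compared across spectra.
    """
    return [point * math.exp(-math.pi * lb_hz * i * dwell_time_s)
            for i, point in enumerate(fid)]

# Weights decay monotonically from 1.0 at the first point of the FID
weights = line_broaden([1.0, 1.0, 1.0, 1.0], 0.001, 5.0)
```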

  10. Legionella in water samples: how can you interpret the results obtained by quantitative PCR?

    PubMed

    Ditommaso, Savina; Ricciardi, Elisa; Giacomuzzi, Monica; Arauco Rivera, Susan R; Zotti, Carla M

    2015-02-01

    Evaluation of the potential risk associated with Legionella has traditionally been determined from culture-based methods. Quantitative polymerase chain reaction (qPCR) is an alternative tool that offers rapid, sensitive and specific detection of Legionella in environmental water samples. In this study we compare the results obtained by conventional qPCR (iQ-Check™ Quanti Legionella spp.; Bio-Rad) and by culture method on artificial samples prepared in Page's saline by addition of Legionella pneumophila serogroup 1 (ATCC 33152), and we analyse the selective quantification of viable Legionella cells by the qPCR-PMA method. The amount of Legionella DNA (GU) determined by qPCR was 28-fold higher than the load detected by culture (CFU). Applying qPCR combined with PMA treatment, we obtained a reduction of 98.5% of the qPCR signal from dead cells. We observed a dissimilarity in the ability of PMA to suppress the PCR signal in samples with different amounts of bacteria: the effective elimination of detection signals by PMA depended on the concentration of GU, and increasing amounts of cells resulted in higher values of reduction. Using the results from this study, we created an algorithm to facilitate the interpretation of viable-cell level estimation with qPCR-PMA. PMID:25241149
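
    The 98.5% figure above is the fraction of the qPCR signal from dead cells that PMA suppresses, computed directly from the paired measurements. A trivial sketch (the GU values below are hypothetical):

```python
def pma_signal_reduction(gu_without_pma, gu_with_pma):
    """Percent of the qPCR signal (genomic units) suppressed by PMA treatment,
    i.e. the share attributable to dead, membrane-compromised cells.
    """
    return 100.0 * (1.0 - gu_with_pma / gu_without_pma)

# Hypothetical heat-killed control: 2.0e5 GU/L untreated vs 3.0e3 GU/L with PMA
reduction = pma_signal_reduction(2.0e5, 3.0e3)  # -> 98.5
```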

  11. Quantitative analysis of rib kinematics based on dynamic chest bone images: preliminary results

    PubMed Central

    Tanaka, Rie; Sanada, Shigeru; Sakuta, Keita; Kawashima, Hiroki

    2015-01-01

    An image-processing technique for separating bones from soft tissue in static chest radiographs has been developed. The present study was performed to evaluate the usefulness of dynamic bone images in quantitative analysis of rib movement. Dynamic chest radiographs of 16 patients were obtained using a dynamic flat-panel detector and processed to create bone images with commercial software (Clear Read BS, Riverain Technologies). Velocity vectors were measured in local areas on the dynamic images to form velocity maps. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as a reduced rib velocity field, resulting in an asymmetrical distribution of rib movement. Vector maps in all normal cases exhibited left/right-symmetric distributions of the velocity field, whereas those in abnormal cases showed asymmetric distributions because of locally limited rib movements. Dynamic bone images were useful for accurate quantitative analysis of rib movements. The present method has potential as an additional functional examination in chest radiography. PMID:26158097

  12. Overview of Student Affairs Research Methods: Qualitative and Quantitative.

    ERIC Educational Resources Information Center

    Perl, Emily J.; Noldon, Denise F.

    2000-01-01

    Reviews the strengths and weaknesses of quantitative and qualitative research in student affairs research, noting that many student affairs professionals question the value of more traditional quantitative approaches to research, though they typically have very good people skills that they have applied to being good qualitative researchers.…

  13. Disordered Speech Assessment Using Automatic Methods Based on Quantitative Measures

    NASA Astrophysics Data System (ADS)

    Gu, Lingyun; Harris, John G.; Shrivastav, Rahul; Sapienza, Christine

    2005-12-01

    Speech quality assessment methods are necessary for evaluating and documenting treatment outcomes of patients suffering from degraded speech due to Parkinson's disease, stroke, or other disease processes. Subjective methods of speech quality assessment are more accurate and more robust than objective methods but are time-consuming and costly. We propose a novel objective measure of speech quality assessment that builds on traditional speech processing techniques such as dynamic time warping (DTW) and the Itakura-Saito (IS) distortion measure. Initial results show that our objective measure correlates well with the more expensive subjective methods.
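
    Of the two building blocks named above, dynamic time warping has a compact textbook form; a minimal sketch for 1-D feature sequences:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences.

    d[i][j] holds the cheapest cumulative alignment cost of a[:i] and b[:j];
    each step extends the warping path by a match, insertion, or deletion.
    """
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# A contour aligned with a time-stretched copy of itself costs nothing
zero_cost = dtw_distance([1, 2, 3, 2, 1], [1, 2, 2, 3, 3, 2, 1])  # -> 0.0
```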

  14. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is summarized. Results of the analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph is provided showing the relationship between concentration and fluorescence units that may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean).
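
    The quantitation equation referred to above is, in essence, a linear calibration from fluorescence units to pigment concentration. A hedged sketch (the standards below are invented; a real curve would be fitted per PHE type and emission wavelength):

```python
def fit_calibration(conc_standards, fluor_standards):
    """Least-squares slope through the origin relating concentration to
    fluorescence units for a set of PHE standards.
    """
    num = sum(c * f for c, f in zip(conc_standards, fluor_standards))
    den = sum(f * f for f in fluor_standards)
    return num / den  # concentration per fluorescence unit

# Hypothetical standards (concentration vs fluorescence units), then an unknown
slope = fit_calibration([1.0, 2.0, 4.0], [105.0, 210.0, 420.0])
unknown_conc = slope * 300.0  # sample reading 300 fluorescence units
```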

  15. Quantitative methods in the study of trypanosomes and their applications*

    PubMed Central

    Lumsden, W. H. R.

    1963-01-01

    In the first part of this paper the author summarizes and discusses previous quantitative work on trypanosomes, with particular reference to biometrical studies, in vivo and in vitro studies on numbers of trypanosomes, studies on hosts infected with trypanosomes, and physiological studies. The second part discusses recent work done at the East African Trypanosomiasis Research Organization. A method for the measurement of the infectivity of trypanosome suspensions, based on serial dilution and inoculation into test animals, is outlined, and applications likely to improve diagnostic procedures are suggested for it. Such applications might include: the establishment of experimental procedures not significantly reducing the infectivity of trypanosomes under experiment; determination of the effects on the infectivity of preserved material of some of the factors in the process of preservation, important for the preparation of standard material; comparison of the efficiency of different culture media for the isolation of trypanosomes; study of the distribution of trypanosomes in the vertebrate host; and measurement of the susceptibility of trypanosomes to drugs. The author stresses the importance of relating future experimental work with trypanosomes to preserved material for which comprehensive documentation is available. PMID:20604152

  16. Quantitative Methods for Comparing Different Polyline Stream Network Models

    SciTech Connect

    Danny L. Anderson; Daniel P. Ames; Ping Yang

    2014-04-01

    Two techniques for exploring the relative horizontal accuracy of complex linear spatial features are described, and sample source code (pseudocode) is presented for this purpose. The first technique, relative sinuosity, is presented as a measure of the complexity or detail of a polyline network in comparison to a reference network. We term the second technique longitudinal root mean squared error (LRMSE) and present it as a means of quantitatively assessing the horizontal variance between two polyline data sets representing digitized (reference) and derived stream and river networks. Both relative sinuosity and LRMSE are shown to be suitable measures of horizontal stream network accuracy for assessing quality and variation in linear features. Both techniques have been used in two recent investigations involving the extraction of hydrographic features from LiDAR elevation data. One confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes yielded better stream network delineations, based on sinuosity and LRMSE, when using LiDAR-derived DEMs. The other demonstrated a new method of delineating stream channels directly from LiDAR point clouds, without the intermediate step of deriving a DEM, showing that direct delineation from LiDAR point clouds yielded a much better match, as indicated by the LRMSE.
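
    Sinuosity, the first measure above, is just path length over endpoint separation, and the relative form divides the derived network's value by the reference's. A minimal sketch (LRMSE additionally requires resampling points along both lines and is omitted here):

```python
import math

def path_length(polyline):
    """Sum of segment lengths along a polyline of (x, y) vertices."""
    return sum(math.dist(p, q) for p, q in zip(polyline, polyline[1:]))

def sinuosity(polyline):
    """Path length divided by the straight-line distance between endpoints."""
    return path_length(polyline) / math.dist(polyline[0], polyline[-1])

def relative_sinuosity(derived, reference):
    """Sinuosity of a derived stream segment relative to its reference; values
    near 1 mean the derived line captures a similar level of detail.
    """
    return sinuosity(derived) / sinuosity(reference)

# A zigzag channel versus a straight reference between the same endpoints
zigzag = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.0)]
straight = [(0.0, 0.0), (2.0, 0.0)]
```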

  17. A new method for the quantitative analysis of endodontic microleakage.

    PubMed

    Haïkel, Y; Wittenmeyer, W; Bateman, G; Bentaleb, A; Allemann, C

    1999-03-01

    The aim of this in vitro study was to evaluate the apical seal obtained with three commonly used root canal sealing cements: Sealapex, AH Plus (Topseal), and Sealite, using a new method based on the quantitative analysis of ¹²⁵I-radiolabeled lysozyme penetration. One hundred thirteen teeth with straight single root canals were instrumented to master apical point #25/30. These were divided into three groups: (i) negative control (4 roots), covered with two layers of nail polish; (ii) test group (105 roots), obturated by laterally condensed gutta-percha with the three cements; and (iii) positive control (4 roots), obturated without cement. The groups were then immersed in ¹²⁵I-lysozyme solution for periods of 1, 7, 14, or 28 days. After removal, six sections of 0.8 mm length each were cut from each root with a fine diamond wire. Each section was analyzed for activity with a gamma counter, corrected for decay, and used to quantify protein penetration. Leakage was high in the positive control and almost negligible in the negative control. AH Plus (Topseal) and Sealapex showed similar leakage behavior over time, with AH Plus (Topseal) performing better. Sealite showed acceptable leakage up until day 14, after which a large increase occurred, presumably due to three-dimensional instability. PMID:10321181
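
    The decay correction mentioned above scales each measured count back to the reference time using the isotope's half-life (about 59.4 days for ¹²⁵I). A minimal sketch:

```python
import math

I125_HALF_LIFE_DAYS = 59.4  # approximate half-life of iodine-125

def decay_corrected(counts, elapsed_days, half_life_days=I125_HALF_LIFE_DAYS):
    """Scale a measured gamma count back to the reference (immersion) time,
    undoing exponential decay: N0 = N * 2**(t / T_half).
    """
    return counts * math.exp(math.log(2.0) * elapsed_days / half_life_days)

# After exactly one half-life, the correction doubles the measured count
corrected = decay_corrected(500.0, 59.4)  # -> 1000.0
```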

  18. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or a quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and a specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456
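
    The performance figures above come straight from a 2x2 confusion matrix against the final pain diagnosis. A minimal sketch with invented counts (not the study's actual tallies):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, predictive values, and accuracy from the
    counts of true/false positives and negatives.
    """
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical reading of 200 joints: 80 TP, 20 FP, 20 FN, 80 TN
metrics = diagnostic_metrics(80, 20, 20, 80)
```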

  19. [Quantitative methods of cancer risk assessment in exposure to chemicals].

    PubMed

    Szymczak, Wiesław

    2009-01-01

    This is a methodology paper: it reviews and compares different quantitative risk assessment methods. Two aspects of cancer risk modeling are discussed. 1. When there is only one effective dose. Two models were compared in this evaluation: one proposed by the Dutch Expert Committee on Occupational Standards and a classical two-stage model. Both models take into account that the animals were exposed for less than two years. The exposure period and the study period of the animals are considered in the Dutch methodology. If we use as the exposure measure the average lifespan dose estimated with different coefficients of exposure time in an experiment, we get two different dose-response models, and each of them will yield a different human risk model. There is no criterion that would let us assess which of them is better. 2. Many models are used in the Benchmark Dose (BMD) method, but there is no criterion that allows us to choose the best model objectively. In this paper a classical two-stage model and three BMD models (two-stage, Weibull, and linear) were fitted to particular data. Very small differences between all the models were noticed; the differences were insignificant because of uncertainties in the risk modeling. The possibility of choosing one model from a larger set of models is the greatest benefit of this comparison. If the examined chemical is a genotoxic carcinogen, nothing more is needed than to estimate the threshold value. PMID:19746890

  20. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    PubMed Central

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-01-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output. PMID:26430292

  1. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  2. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Caffo, Brian; Frey, Eric C.

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation.

  3. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is that they require a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision.
In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation. PMID:26982626
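
    Once the NGS technique has estimated the linear-model parameters (slope a, noise standard deviation σ) for each method, the NSR figure of merit is simply σ/a, and methods are ranked by it. A minimal sketch, with hypothetical method names and parameter values:

```python
# Illustrative sketch: ranking methods by the noise-to-slope ratio (NSR).
# The slope/noise values below are hypothetical stand-ins for the linear-model
# parameters the NGS technique estimates from patient data without ground truth.

def nsr(slope, noise_sd):
    """Noise-to-slope ratio; lower values indicate better precision."""
    return noise_sd / slope

# Hypothetical estimated parameters for three reconstruction methods
methods = {
    "method_A": {"slope": 0.95, "noise_sd": 0.10},
    "method_B": {"slope": 0.90, "noise_sd": 0.18},
    "method_C": {"slope": 0.98, "noise_sd": 0.08},
}

ranking = sorted(methods, key=lambda m: nsr(**methods[m]))
print(ranking)  # most precise (lowest NSR) first
```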

  4. Quantitative measurement of Φ140mm F/2 parabolic surface with Ronchi grating test method

    NASA Astrophysics Data System (ADS)

    Lei, Bai-ping; Wu, Fan; Zhou, Chen-bo

    2009-05-01

    The Ronchi grating test has been used widely to test optical surfaces qualitatively since it was devised, but rarely to test parabolic surfaces quantitatively. This paper discusses the quantitative application of the Ronchi grating test to optical aspheric surfaces, based on self-developed software that covers Ronchi null grating design, Ronchigram acquisition, data processing, and so on. The whole system was used to test a concave parabolic mirror with a diameter of 140 mm and an F-number of 2, and the result agrees closely with that obtained by an interferometer. The analysis software and test method lay a good foundation for the quantitative measurement of large errors of large-aperture aspheric surfaces.

  5. Quantitative biomechanical comparison of ankle fracture casting methods.

    PubMed

    Shipman, Alastair; Alsousou, Joseph; Keene, David J; Dyson, Igor N; Lamb, Sarah E; Willett, Keith M; Thompson, Mark S

    2015-06-01

    The incidence of ankle fractures is increasing rapidly due to the ageing demographic. In older patients with compromised distal circulation, conservative treatment of fractures may be indicated. High rates of malunion and complications due to skin fragility motivate the design of novel casting systems, but biomechanical stability requirements are poorly defined. This article presents the first quantitative study of ankle cast stability and hypothesises that a newly proposed close contact cast (CCC) system provides similar biomechanical stability to standard casts (SC). Two adult mannequin legs transected at the malleoli, one incorporating an inflatable model of tissue swelling, were stabilised with casts applied by an experienced surgeon. They were cyclically loaded in torsion, measuring applied rotation angle and resulting torque. CCC stiffness was equal to or greater than that of SC in two measures of ankle cast resistance to torsion. The effect of swelling reduction at the ankle site was significantly greater on CCC than on SC. The data support the hypothesis that CCC provides similar biomechanical stability to SC and therefore also the clinical use of CCC. They suggest that more frequent re-application of CCC is likely required to maintain stability following resolution of swelling at the injury site. PMID:25719278

  6. A quantitative measurement method for comparison of seated postures.

    PubMed

    Hillman, Susan J; Hollington, James

    2016-05-01

    This technical note proposes a method to measure and compare seated postures. The three-dimensional locations of palpable anatomical landmarks corresponding to the anterior superior iliac spines, clavicular notch, head, shoulders and knees are measured in terms of x, y and z co-ordinates in the reference system of the measuring apparatus. These co-ordinates are then transformed onto a body-based axis system which allows comparison within-subject. The method was tested on eleven unimpaired adult participants and the resulting data used to calculate a Least Significant Difference (LSD) for the measure, which is used to determine whether two postures are significantly different from one another. The method was found to be sensitive to the four following standardised static postural perturbations: posterior pelvic tilt, pelvic obliquity, pelvic rotation, and abduction of the thighs. The resulting data could be used as an outcome measure for the postural alignment aspect of seating interventions in wheelchairs. PMID:26920073
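
    The transformation from apparatus co-ordinates into a body-based axis system can be sketched as follows. The frame convention below (origin midway between the two ASIS landmarks, axes built from the ASIS line and the clavicular notch) is an assumed construction for illustration, not the authors' exact protocol:

```python
# Illustrative sketch: express a landmark measured in the apparatus frame in a
# body-based frame. Frame convention (an assumption): origin midway between the
# ASIS landmarks, x mediolateral, y toward the clavicular notch, z perpendicular.

def sub(a, b):
    return [ai - bi for ai, bi in zip(a, b)]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def unit(v):
    n = sum(c * c for c in v) ** 0.5
    return [c / n for c in v]

def body_frame_coords(point, asis_left, asis_right, clav_notch):
    origin = [(l + r) / 2.0 for l, r in zip(asis_left, asis_right)]
    x_axis = unit(sub(asis_right, asis_left))               # mediolateral
    z_axis = unit(cross(x_axis, sub(clav_notch, origin)))   # perpendicular
    y_axis = cross(z_axis, x_axis)                          # toward clav. notch
    d = sub(point, origin)
    return [dot(d, x_axis), dot(d, y_axis), dot(d, z_axis)]

# Hypothetical landmark positions in apparatus co-ordinates (mm)
knee = body_frame_coords([120.0, 80.0, 300.0],
                         [100.0, 0.0, 0.0],    # left ASIS
                         [300.0, 0.0, 0.0],    # right ASIS
                         [200.0, 500.0, 0.0])  # clavicular notch
print(knee)
```

    Because the returned co-ordinates are relative to the subject's own pelvis, two postures can be compared within-subject regardless of where the subject sat in the measuring apparatus.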

  7. Qualitative and quantitative PCR methods for detection of three lines of genetically modified potatoes.

    PubMed

    Rho, Jae Kyun; Lee, Theresa; Jung, Soon-Il; Kim, Tae-San; Park, Yong-Hwan; Kim, Young-Mi

    2004-06-01

    Qualitative and quantitative polymerase chain reaction (PCR) methods have been developed for the detection of genetically modified (GM) potatoes. The combination of specific primers for amplification of the promoter region of the Cry3A gene, the potato leafroll virus replicase gene, and the potato virus Y coat protein gene allows identification of each of the NewLeaf, NewLeaf Y, and NewLeaf Plus GM potato lines. A multiplex PCR method was also established for the simple and rapid detection of the three GM potato lines in a mixed sample. For quantitative detection, a real-time PCR method was developed. This method features the use of a standard plasmid as a reference molecule. The standard plasmid contains both a specific region of the Cry3A transgene and an endogenous UDP-glucose pyrophosphorylase gene of the potato. Test samples containing 0.5, 1, 3, and 5% GM potatoes were quantified by this method. At the 3.0% level of each GM potato line, the relative standard deviations ranged from 6.0 to 19.6%. These results show that the above PCR methods are applicable to the detection of GM potatoes both quantitatively and qualitatively. PMID:15161181
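
    A hedged sketch of the plasmid-calibrated real-time PCR idea (all Ct values, standard-curve parameters, and function names below are hypothetical): copy numbers of the transgene and the endogenous gene are read off a standard curve built from plasmid dilutions, and the GM content follows from their ratio.

```python
# Illustrative sketch (all values hypothetical): plasmid-calibrated real-time PCR.
# The standard plasmid carries both the Cry3A target and the endogenous
# UDP-glucose pyrophosphorylase target, so one dilution series calibrates both.

def copies_from_ct(ct, intercept, slope):
    """Read a copy number off the standard curve Ct = intercept + slope*log10(copies)."""
    return 10 ** ((ct - intercept) / slope)

def gm_percent(ct_transgene, ct_endogenous, std_curve):
    """GM content as the transgene/endogenous copy ratio, in percent."""
    transgene = copies_from_ct(ct_transgene, *std_curve)
    endogenous = copies_from_ct(ct_endogenous, *std_curve)
    return 100.0 * transgene / endogenous

curve = (40.0, -3.32)  # hypothetical: Ct = 40 - 3.32*log10(copies)
print(round(gm_percent(35.0, 30.0, curve), 1))
```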

  8. A quantitative analytical method to test for salt effects on giant unilamellar vesicles.

    PubMed

    Hadorn, Maik; Boenzli, Eva; Hotz, Peter Eggenberger

    2011-01-01

    Today, free-standing membranes, i.e. liposomes and vesicles, are used in a multitude of applications, e.g. as drug delivery devices and artificial cell models. Because current laboratory techniques do not allow handling of large sample sizes, systematic and quantitative studies on the impact of different effectors, e.g. electrolytes, are limited. In this work, we evaluated the Hofmeister effects of ten alkali metal halides on giant unilamellar vesicles made of palmitoyloleoylphosphatidylcholine for a large sample size by combining the highly parallel water-in-oil emulsion transfer vesicle preparation method with automatic haemocytometry. We found that this new quantitative screening method is highly reliable and consistent with previously reported results. Thus, this method may provide a significant methodological advance in analysis of effects on free-standing model membranes. PMID:22355683

  9. HPTLC Method for Quantitative Determination of Zopiclone and Its Impurity.

    PubMed

    Naguib, Ibrahim A; Abdelrahman, Maha M; El Ghobashy, Mohamed R; Ali, Nesma A

    2015-09-01

    This study was designed to establish, optimize and validate a sensitive, selective and accurate high-performance thin-layer chromatographic (HPTLC) method for the determination of zopiclone (ZPC) and its main impurity, 2-amino-5-chloropyridine, one of its degradation products, in raw material and pharmaceutical formulation. The proposed method was applied for analysis of ZPC and its impurity over the concentration ranges of 0.3-1.4 and 0.05-0.8 µg/band, with mean percentage recoveries of 99.92% ± 1.521 and 99.28% ± 2.296, respectively. The method is based on the separation of the two components followed by densitometric measurement of the separated peaks at 305 nm. The separation was carried out on silica gel HPTLC F254 plates, using chloroform-methanol-glacial acetic acid (9:1:0.1, by volume) as the developing system. The suggested method was validated according to International Conference on Harmonization guidelines and can be applied for routine analysis in quality control laboratories. The results obtained by the proposed method were statistically compared with those of a reported method, revealing high accuracy and good precision. PMID:25740427

  10. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  11. Integrated Geophysical Methods Applied to Geotechnical and Geohazard Engineering: From Qualitative to Quantitative Analysis and Interpretation

    NASA Astrophysics Data System (ADS)

    Hayashi, K.

    2014-12-01

    The near-surface is the region of day-to-day human activity on the earth, and it is exposed to natural phenomena that sometimes cause disasters. This presentation covers a broad spectrum of geotechnical and geohazard approaches to mitigating disaster and conserving the natural environment using geophysical methods, and emphasizes the contribution of geophysics to such issues. It focuses on the usefulness of geophysical surveys in providing information to mitigate disasters, rather than on the theoretical details of particular techniques. Several techniques are introduced at the level of concept and application. Topics include various geohazard and geoenvironmental applications, such as earthquake disaster mitigation, prevention of floods triggered by torrential rain, environmental conservation, and study of the effects of global warming. Among geophysical techniques, the active and passive surface-wave, refraction, and resistivity methods are mainly highlighted. Together with the geophysical techniques, several related issues, such as performance-based design, standardization or regularization, internet access, and databases, are also discussed. The presentation discusses the application of geophysical methods to engineering investigations from the non-uniqueness point of view and introduces the concepts of "integrated" and "quantitative". Most geophysical analyses are essentially non-unique, and it is very difficult to obtain unique and reliable engineering solutions from a single geophysical method (Fig. 1). The only practical way to improve the reliability of an investigation is the joint use of several geophysical and geotechnical investigation methods: an integrated approach to geophysics. The result of a geophysical method is generally vague: "here is a high-velocity layer, it may be bedrock"; "this low-resistivity section may contain clayey soils". Such vague, qualitative, and subjective interpretation is of little value in general engineering design work. Engineers need more quantitative information. In order to apply geophysical methods to engineering design work, quantitative interpretation is very important. The presentation introduces several case studies from different countries around the world (Fig. 2) from the integrated and quantitative points of view.

  12. Methods and challenges in quantitative imaging biomarker development.

    PubMed

    Abramson, Richard G; Burton, Kirsteen R; Yu, John-Paul J; Scalzetti, Ernest M; Yankeelov, Thomas E; Rosenkrantz, Andrew B; Mendiratta-Lala, Mishal; Bartholmai, Brian J; Ganeshan, Dhakshinamoorthy; Lenchik, Leon; Subramaniam, Rathan M

    2015-01-01

    Academic radiology is poised to play an important role in the development and implementation of quantitative imaging (QI) tools. This article, drafted by the Association of University Radiologists Radiology Research Alliance Quantitative Imaging Task Force, reviews current issues in QI biomarker research. We discuss motivations for advancing QI, define key terms, present a framework for QI biomarker research, and outline challenges in QI biomarker development. We conclude by describing where QI research and development is currently taking place and discussing the paramount role of academic radiology in this rapidly evolving field. PMID:25481515

  13. Methods and Challenges in Quantitative Imaging Biomarker Development

    PubMed Central

    Abramson, Richard G.; Burton, Kirsteen R.; Yu, John-Paul J.; Scalzetti, Ernest M.; Yankeelov, Thomas E.; Rosenkrantz, Andrew B.; Mendiratta-Lala, Mishal; Bartholmai, Brian J.; Ganeshan, Dhakshinamoorthy; Lenchik, Leon; Subramaniam, Rathan M.

    2014-01-01

    Academic radiology is poised to play an important role in the development and implementation of quantitative imaging (QI) tools. This manuscript, drafted by the Association of University Radiologists (AUR) Radiology Research Alliance (RRA) Quantitative Imaging Task Force, reviews current issues in QI biomarker research. We discuss motivations for advancing QI, define key terms, present a framework for QI biomarker research, and outline challenges in QI biomarker development. We conclude by describing where QI research and development is currently taking place and discussing the paramount role of academic radiology in this rapidly evolving field. PMID:25481515

  14. Full skin quantitative optical coherence elastography achieved by combining vibration and surface acoustic wave methods

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam

    2015-03-01

    In combination with phase-sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have each been reported to provide elastography of skin tissue. However, neither method can provide elastography over the full skin depth in current systems. This paper presents a feasibility study of an optical coherence elastography method that combines both vibration and SAW in order to give the quantitative mechanical properties of skin tissue over the full depth range, including epidermis, dermis and subcutaneous fat. Experiments were carried out on layered tissue-mimicking phantoms and in vivo on human forearm and palm skin. A ring actuator generated vibrations, while a line actuator was used to excite SAWs. A PhS-OCT system was employed to provide ultrahigh-sensitivity measurement of the generated waves. The experimental results demonstrate that, by combining the vibration and SAW methods, the bulk mechanical properties of full skin can be quantitatively measured, and elastography can be obtained with a sensing depth from ~0 mm to ~4 mm. This method is promising for clinical applications where the quantitative elasticity of localized skin diseases is needed to aid diagnosis and treatment.
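
    One common way to convert a measured SAW speed into a quantitative elasticity value is the Rayleigh-wave approximation for a nearly incompressible medium. The specific relation below is a textbook approximation used here for illustration, not necessarily the formula used by the authors, and all numbers are hypothetical:

```python
# Illustrative sketch: estimate Young's modulus from a measured surface acoustic
# wave (SAW) speed via the Rayleigh-wave approximation c_R ~ c_s*(0.87+1.12v)/(1+v).
# This relation and the input values are assumptions, not the authors' protocol.

def youngs_modulus_from_saw(c_r, density, poisson=0.49):
    """E in Pa from Rayleigh speed c_r (m/s); soft tissue is nearly incompressible."""
    c_s = c_r * (1 + poisson) / (0.87 + 1.12 * poisson)  # shear wave speed (m/s)
    shear_modulus = density * c_s ** 2                   # mu = rho * c_s^2
    return 2 * (1 + poisson) * shear_modulus             # E = 2*(1+v)*mu

# Hypothetical skin-phantom measurement: 5 m/s SAW speed, density ~1000 kg/m^3
print(round(youngs_modulus_from_saw(5.0, 1000.0) / 1e3, 1), "kPa")
```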

  15. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    SciTech Connect

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- or air-coupled ultrasonic testing (UT), phased-array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro X-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system that allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution together with 3-dimensional analysis and visualization capabilities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damage. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  16. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize. PMID:23470871
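
    The reproducibility criterion above (RSDR below 25%) can be checked as a simple relative standard deviation across laboratories. A minimal sketch with hypothetical collaborative-trial data:

```python
# Illustrative sketch (data hypothetical): precision of a collaborative trial
# expressed as the relative standard deviation of reproducibility (RSDR),
# checked against the 25% acceptance level reported above.

from statistics import mean, stdev

def rsd_percent(values):
    """Relative standard deviation in percent (sample standard deviation / mean)."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical GM content (%) reported by six laboratories for a 5.0% blind sample
lab_results = [4.6, 5.3, 4.9, 5.5, 4.7, 5.2]
rsd_r = rsd_percent(lab_results)
print(round(rsd_r, 1), rsd_r < 25.0)
```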

  17. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    NASA Astrophysics Data System (ADS)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-01

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- or air-coupled ultrasonic testing (UT), phased-array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro X-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system that allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution together with 3-dimensional analysis and visualization capabilities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damage. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  18. Quantitative lacrimal scintillography. I. Method and physiological application.

    PubMed Central

    Hurwitz, J J; Maisey, M N; Welham, R A

    1975-01-01

    Quantitative lacrimal scintillography, using 99mTc sulphur colloid, a high resolution gamma camera, and quantification using a digital computer, is a highly effective way of assessing lacrimal physiology, and of establishing normal flow and drainage values against which pathological cases may be compared. PMID:1174486

  19. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems…

  20. Quantitative Methods for Administrative Decision Making in Junior Colleges.

    ERIC Educational Resources Information Center

    Gold, Benjamin Knox

    With the rapid increase in number and size of junior colleges, administrators must take advantage of the decision-making tools already used in business and industry. This study investigated how these quantitative techniques could be applied to junior college problems. A survey of 195 California junior college administrators found that the problems…

  1. An Uneasy Alliance: Combining Qualitative and Quantitative Research Methods.

    ERIC Educational Resources Information Center

    Buchanan, David R.

    1992-01-01

    In a study of the relationship between moral reasoning and teenage drug use, problems arose in an attempt to reduce qualitative data to a quantitative format: (1) making analytic sense of singular and universal responses; (2) the mistaken logical inference that each pattern of judgment should have behavioral indicators; and (3) construction and…

  2. Iterative reconstruction for quantitative computed tomography analysis of emphysema: consistent results using different tube currents

    PubMed Central

    Yamashiro, Tsuneo; Miyara, Tetsuhiro; Honda, Osamu; Tomiyama, Noriyuki; Ohno, Yoshiharu; Noma, Satoshi; Murayama, Sadayuki

    2015-01-01

    Purpose To assess the advantages of iterative reconstruction for quantitative computed tomography (CT) analysis of pulmonary emphysema. Materials and methods Twenty-two patients with pulmonary emphysema underwent chest CT imaging using identical scanners with three different tube currents: 240, 120, and 60 mA. Scan data were converted to CT images using Adaptive Iterative Dose Reduction using Three Dimensional Processing (AIDR3D) and a conventional filtered back projection (FBP) mode. Thus, six scans with and without AIDR3D were generated per patient. All other scanning and reconstruction settings were fixed. The percent low attenuation area (LAA%; < −950 Hounsfield units) and the 15th percentile of lung density were automatically measured using a commercial workstation. Comparisons of LAA% and 15th percentile results between scans with and without AIDR3D were made by Wilcoxon signed-rank tests. Associations between body weight and measurement errors among these scans were evaluated by Spearman rank correlation analysis. Results Overall, scan series without AIDR3D had higher LAA% and lower 15th percentile values than those with AIDR3D at each tube current (P<0.0001). For scan series without AIDR3D, lower tube currents resulted in higher LAA% values and lower 15th percentiles. The extent of emphysema was significantly different between each pair of scans when not using AIDR3D (LAA%, P<0.0001; 15th percentile, P<0.01), but was not significantly different between each pair of scans when using AIDR3D. On scans without AIDR3D, measurement errors between different tube current settings were significantly correlated with patients’ body weights (P<0.05), whereas these errors between scans using AIDR3D were only minimally correlated with body weight.
Conclusion The extent of emphysema was more consistent across different tube currents when CT scans were converted to CT images using AIDR3D than when using the conventional filtered back projection method. PMID:25709426
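
    The two indices used in this study are straightforward to compute from lung-voxel attenuation values. A minimal sketch with hypothetical voxel data (the nearest-rank percentile definition is an assumption for illustration):

```python
import math

# Illustrative sketch (hypothetical voxel values): the two emphysema indices.

def laa_percent(hu_values, threshold=-950):
    """Percent low attenuation area: share of lung voxels below the HU threshold."""
    return 100.0 * sum(1 for v in hu_values if v < threshold) / len(hu_values)

def percentile_15(hu_values):
    """15th percentile of lung density (nearest-rank definition)."""
    ranked = sorted(hu_values)
    k = max(1, math.ceil(0.15 * len(ranked)))
    return ranked[k - 1]

voxels = [-990, -970, -960, -940, -930, -920, -910, -900, -880, -850]
print(laa_percent(voxels), percentile_15(voxels))
```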

  3. Four-Point Bending as a Method for Quantitatively Evaluating Spinal Arthrodesis in a Rat Model

    PubMed Central

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-01-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. Here we describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague–Dawley rat spines after single-level posterolateral fusion procedures at L4–L5. Segments were classified as ‘not fused,’ ‘restricted motion,’ or ‘fused’ by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion was then applied to the L4–L5 motion segment, and stiffness was measured as the slope of the moment–displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, consistent with the preliminary grading by manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery. PMID:25730756
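
    The stiffness measure described above can be sketched as a linear fit of moment against displacement, with the moment obtained from the four-point geometry: between the inner rollers, M = (F/2)*a for total load F and outer-to-inner roller distance a. All numbers below are hypothetical:

```python
# Illustrative sketch (geometry and data hypothetical): stiffness as the slope
# of the moment-displacement curve in four-point bending.

def moment(load_n, a_mm):
    """Constant bending moment between the inner rollers: M = (F/2) * a, in N*mm."""
    return (load_n / 2.0) * a_mm

def fit_slope(xs, ys):
    """Ordinary least-squares slope of y against x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

a_mm = 10.0                                # outer-to-inner roller distance
loads_n = [0.0, 2.0, 4.0, 6.0]             # applied total loads
displacements_mm = [0.0, 0.1, 0.2, 0.3]    # measured midspan displacements

moments = [moment(f, a_mm) for f in loads_n]
stiffness = fit_slope(displacements_mm, moments)  # N*mm per mm
print(stiffness)
```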

  4. A new quantitative method for gunshot residue analysis by ion beam analysis.

    PubMed

    Christopher, Matthew E; Warmenhoeven, John-William; Romolo, Francesco S; Donghi, Matteo; Webb, Roger P; Jeynes, Christopher; Ward, Neil I; Kirkby, Karen J; Bailey, Melanie J

    2013-08-21

    Imaging and analyzing gunshot residue (GSR) particles using a scanning electron microscope equipped with an energy-dispersive X-ray spectrometer (SEM-EDS) is a standard technique that can provide important forensic evidence, but the discrimination power of this technique is limited due to low sensitivity to trace elements and difficulties in obtaining quantitative results from small particles. A new, faster method using a scanning proton microbeam and particle-induced X-ray emission (μ-PIXE), together with elastic backscattering spectrometry (EBS), is presented for the non-destructive, quantitative analysis of the elemental composition of single GSR particles. In this study, all of the GSR particles analyzed contained Pb, Ba, and Sb. The precision of the method is assessed. The grouping behaviour of different makes of ammunition is determined using multivariate analysis. The protocol correctly groups the cartridges studied here, with a confidence >99%, irrespective of the firearm or population of particles selected. PMID:23775063

  5. [Application of calibration curve method and partial least squares regression analysis to quantitative analysis of nephrite samples using XRF].

    PubMed

    Liu, Song; Su, Bo-min; Li, Qing-hui; Gan, Fu-xi

    2015-01-01

    The authors sought a method for quantitative analysis using pXRF without solid bulk stone/jade reference samples. Twenty-four nephrite samples were selected; 17 were calibration samples and the other 7 were test samples. All nephrite samples were analyzed quantitatively by proton-induced X-ray emission spectroscopy (PIXE). Based on the PIXE results for the calibration samples, calibration curves were created for the components/elements of interest and used to analyze the test samples quantitatively; the qualitative spectra of all nephrite samples were then obtained by pXRF. Using the PIXE results and the qualitative spectra of the calibration samples, the partial least squares (PLS) method was applied for quantitative analysis of the test samples. Finally, the results for the test samples obtained by the calibration curve method, the PLS method, and PIXE were compared, and the accuracy of the calibration curve and PLS methods was estimated. The results indicate that the PLS method is a suitable alternative for quantitative analysis of stone/jade samples. PMID:25993858
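
    The calibration-curve step can be sketched as an ordinary least-squares line relating XRF intensity to the PIXE-determined concentration of the calibration samples, then applied to a test sample's intensity. All values below are hypothetical:

```python
# Illustrative sketch (numbers hypothetical): calibrate XRF intensity against
# PIXE-quantified concentrations, then predict a test sample's concentration.

def fit_line(xs, ys):
    """Least-squares slope and intercept of y against x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Calibration samples: XRF peak intensity (counts) vs. PIXE concentration (wt%)
xrf_counts = [1000.0, 2000.0, 3000.0, 4000.0]
pixe_conc = [1.1, 2.0, 3.1, 3.9]

slope, intercept = fit_line(xrf_counts, pixe_conc)
test_counts = 2500.0
predicted = slope * test_counts + intercept  # predicted wt% for the test sample
print(predicted)
```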

  6. Machine learning methods for quantitative analysis of Raman spectroscopy data

    NASA Astrophysics Data System (ADS)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
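
    The prediction sub-task can be sketched with a tiny k-nearest-neighbours regressor: given reduced spectral features of previously seen mixtures and their known concentrations, a query spectrum's concentration is estimated as the mean over its k closest neighbours. Features and concentration values below are hypothetical:

```python
# Illustrative sketch (features and values hypothetical): k-nearest-neighbours
# regression for estimating a component's concentration from reduced spectral
# features, as in the prediction sub-task described above.

def knn_predict(train_x, train_y, query, k=3):
    """Mean concentration of the k training spectra closest to the query."""
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
    nearest = sorted(zip(train_x, train_y), key=lambda p: dist(p[0], query))[:k]
    return sum(y for _, y in nearest) / k

# Reduced Raman features (e.g. selected band intensities) -> concentration (%)
train_x = [[0.1, 0.9], [0.2, 0.8], [0.4, 0.6], [0.7, 0.3], [0.9, 0.1]]
train_y = [5.0, 10.0, 20.0, 35.0, 45.0]

print(knn_predict(train_x, train_y, [0.35, 0.65]))
```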

  7. Quantitative assessment of gene expression network module-validation methods

    PubMed Central

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To chart progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). A gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  8. Quantitative Analysis and Validation of Method Using HPTLC

    NASA Astrophysics Data System (ADS)

    Dhandhukia, Pinakin C.; Thakker, Janki N.

    High-performance thin-layer chromatography is an emerging alternative to conventional column chromatography because of its simplicity, rapidity, accuracy, robustness, and cost-effectiveness. The availability of a vast array of supporting matrices and solvent systems allows separation of almost all types of analytes except volatiles. The first step in developing a robust method for routine quantification is to check the stability of the analyte during the various steps of chromatographic development, followed by preparation of calibration curves. Thereafter, various validation aspects of the analysis must be measured, namely peak purity, linearity and range, precision, limit of detection, limit of quantification, robustness, and accuracy.
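
    For the limits of detection and quantification mentioned above, a common approach is the ICH formulas based on the residual standard deviation of the calibration (sigma) and the calibration slope (S); the sigma and slope values below are illustrative only.

```python
def lod(sigma, slope):
    """Limit of detection (ICH approach): 3.3 * sigma / S."""
    return 3.3 * sigma / slope

def loq(sigma, slope):
    """Limit of quantification (ICH approach): 10 * sigma / S."""
    return 10.0 * sigma / slope

# Illustrative: residual SD of response = 0.02, calibration slope = 0.5
example_lod = lod(0.02, 0.5)
example_loq = loq(0.02, 0.5)
```

    The same sigma and slope come directly from the calibration-curve step, so these limits fall out of data the method development already produces.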

  9. A quantitative method for analysis of in vitro neurite outgrowth.

    PubMed

    Mitchell, P J; Hanson, J C; Quets-Nguyen, A T; Bergeron, M; Smith, R C

    2007-08-30

    The adult mammalian CNS is extremely limited in its ability to regenerate axons following injury. Glial scar, neuroinflammatory processes and molecules released from myelin impair axonal regrowth and contribute to the lack of neural regeneration. An in vitro assay that quantitates neurite outgrowth from cultured neurons as a model of neuronal regenerative potential is described. Specifically, the neurite outgrowth from primary neurons (rat cerebellar granule neurons; CGNs) and a neuronal cell line (NG108-15) were quantitatively measured after optimization of culture conditions. After cultures were fixed and immunostained to label neurons and nuclei, microscope images were captured and an image analysis algorithm was developed using Image-Pro Plus software to allow quantitative analysis. The algorithm allowed the determination of total neurite length, number of neurons, and number of neurons without neurites. The algorithm also allows for end-user control of thresholds for staining intensity and cell/nuclei size. This assay represents a useful tool for quantification of neurite outgrowth from a variety of neuronal sources with applications that include: (1) assessment of neurite outgrowth potential; (2) identification of molecules that can block or stimulate neurite outgrowth in conventional culture media; and (3) identification of agents that can overcome neurite outgrowth inhibition by inhibitory substrates. PMID:17570533

  10. A Quantitative Method for Estimating Probable Public Costs of Hurricanes.

    PubMed

    BOSWELL; DEYLE; SMITH; BAKER

    1999-04-01

    A method is presented for estimating probable public costs resulting from damage caused by hurricanes, measured as local government expenditures approved for reimbursement under the Stafford Act Section 406 Public Assistance Program. The method employs a multivariate model developed through multiple regression analysis of an array of independent variables that measure meteorological, socioeconomic, and physical conditions related to the landfall of hurricanes within a local government jurisdiction. From the regression analysis we chose a log-log (base 10) model that explains 74% of the variance in the expenditure data using population and wind speed as predictors. We illustrate application of the method for a local jurisdiction, Lee County, Florida, USA. The results show that potential public costs range from $4.7 million for a category 1 hurricane with winds of 137 kilometers per hour (85 miles per hour) to $130 million for a category 5 hurricane with winds of 265 kilometers per hour (165 miles per hour). Based on these figures, we estimate expected annual public costs of $2.3 million. These cost estimates: (1) provide useful guidance for anticipating the magnitude of the federal, state, and local expenditures that would be required for the array of possible hurricanes that could affect that jurisdiction; (2) allow policy makers to assess the implications of alternative federal and state policies for providing public assistance to jurisdictions that experience hurricane damage; and (3) provide information needed to develop a contingency fund or other financial mechanism for assuring that the community has sufficient funds available to meet its obligations. KEY WORDS: Hurricane; Public costs; Local government; Disaster recovery; Disaster response; Florida; Stafford Act PMID:9950698
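
    A log-log (base 10) model of the kind described above can be evaluated as follows. The coefficients here are hypothetical placeholders, since the abstract does not give the fitted values.

```python
import math

# Hypothetical coefficients for illustration only; the paper's fitted
# coefficients are not given in the abstract.
B0, B_POP, B_WIND = -3.0, 1.0, 2.0

def public_cost(population, wind_kmh):
    """Evaluate log10(cost) = B0 + B_POP*log10(population)
    + B_WIND*log10(wind), then convert back to dollars."""
    log_cost = B0 + B_POP * math.log10(population) + B_WIND * math.log10(wind_kmh)
    return 10 ** log_cost
```

    Because the model is linear in the logs, each predictor acts multiplicatively: doubling wind speed scales the cost estimate by 2 raised to the wind coefficient.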

  11. Performance analysis of quantitative phase retrieval method in Zernike phase contrast X-ray microscopy

    NASA Astrophysics Data System (ADS)

    Heng, Chen; Kun, Gao; Da-Jiang, Wang; Li, Song; Zhi-Li, Wang

    2016-02-01

    Since the invention of the Zernike phase contrast method in 1930, it has been widely used in optical microscopy and, more recently, in X-ray microscopy. Because the image contrast is a mixture of absorption and phase information, we have recently proposed and demonstrated a method for quantitative phase retrieval in Zernike phase contrast X-ray microscopy. In this contribution, we analyze the performance of this method at different photon energies. Intensity images of PMMA samples are simulated at 2.5 keV and 6.2 keV, respectively, and phase retrieval is performed using the proposed method. The results demonstrate that the proposed phase retrieval method is applicable over a wide energy range. For weakly absorbing features, the optimal photon energy is 2.5 keV, from the point of view of image contrast and accuracy of phase retrieval. On the other hand, for strongly absorbing objects, a higher photon energy is preferred to reduce the error of phase retrieval. These results can be used as guidelines for quantitative phase retrieval in Zernike phase contrast X-ray microscopy with the proposed method. Supported by the State Key Project for Fundamental Research (2012CB825801), National Natural Science Foundation of China (11475170, 11205157 and 11179004) and Anhui Provincial Natural Science Foundation (1508085MA20).

  12. Quantitative electromechanical impedance method for nondestructive testing based on a piezoelectric bimorph cantilever

    NASA Astrophysics Data System (ADS)

    Fu, Ji; Tan, Chi; Li, Faxin

    2015-06-01

    The electromechanical impedance (EMI) method, which holds great promise in structural health monitoring (SHM), is usually treated as a qualitative method. In this work, we proposed a quantitative EMI method based on a piezoelectric bimorph cantilever using the sample’s local contact stiffness (LCS) as the identification parameter for nondestructive testing (NDT). Firstly, the equivalent circuit of the contact vibration system was established and the analytical relationship between the cantilever’s contact resonance frequency and the LCS was obtained. As the LCS is sensitive to typical defects such as voids and delamination, the proposed EMI method can then be used for NDT. To verify the equivalent circuit model, two piezoelectric bimorph cantilevers were fabricated and their free resonance frequencies were measured and compared with theoretical predictions. It was found that the stiff cantilever’s EMI can be well predicted by the equivalent circuit model while the soft cantilever’s cannot. Then, both cantilevers were assembled into a homemade NDT system using a three-axis motorized stage for LCS scanning. Testing results on a specimen with a prefabricated defect showed that the defect could be clearly reproduced in the LCS image, indicating the validity of the quantitative EMI method for NDT. It was found that the single-frequency mode of the EMI method can also be used for NDT, which is faster but not quantitative. Finally, several issues relating to the practical application of the NDT method were discussed. The proposed EMI-based NDT method offers a simple and rapid solution for damage evaluation in engineering structures and may also shed some light on EMI-based SHM.
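
    As a rough illustration of how a contact resonance shift maps to local contact stiffness (LCS), a point-mass spring model, far simpler than the authors' equivalent-circuit model, gives the relation below; all numbers are illustrative.

```python
def local_contact_stiffness(f_free, f_contact, k_cantilever):
    """Point-mass spring model: contact adds a spring k_s in parallel with
    the cantilever stiffness k_c, so f_contact/f_free = sqrt((k_c+k_s)/k_c),
    giving k_s = k_c * ((f_contact/f_free)**2 - 1)."""
    return k_cantilever * ((f_contact / f_free) ** 2 - 1.0)

# Illustrative: free resonance 10 kHz shifts to 20 kHz on contact,
# cantilever stiffness 100 N/m -> LCS of 300 N/m
k_s = local_contact_stiffness(10e3, 20e3, 100.0)
```

    A void or delamination lowers the LCS under the probe, which shows up as a drop in contact resonance frequency during the raster scan.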

  13. A novel method for quantitative geosteering using azimuthal gamma-ray logging.

    PubMed

    Yuan, Chao; Zhou, Cancan; Zhang, Feng; Hu, Song; Li, Chaoliu

    2015-02-01

    A novel method for quantitative geosteering by using azimuthal gamma-ray logging is proposed. Real-time up and bottom gamma-ray logs when a logging tool travels through a boundary surface with different relative dip angles are simulated with the Monte Carlo method. Study results show that response points of up and bottom gamma-ray logs when the logging tool moves towards a highly radioactive formation can be used to predict the relative dip angle, and then the distance from the drilling bit to the boundary surface is calculated. PMID:25479436
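
    The final geometric step, converting a predicted relative dip angle into a distance from the bit to the boundary, can be sketched with simple trigonometry. This assumes a planar boundary and is not the paper's exact formulation.

```python
import math

def distance_to_boundary(along_hole_m, relative_dip_deg):
    """Perpendicular distance from the bit to a planar boundary, given
    the along-hole distance to the log response point and the relative
    dip angle between wellbore and boundary."""
    return along_hole_m * math.sin(math.radians(relative_dip_deg))
```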

  14. Multipoint methods for linkage analysis of quantitative trait loci in sib pairs

    SciTech Connect

    Cardon, L.R. |; Cherny, S.S.; Fulker, D.W.

    1994-09-01

    The sib-pair method of Haseman and Elston is widely used for linkage analysis of quantitative traits. The method requires no assumptions concerning the mode of transmission of the trait, it is robust with respect to genetic heterogeneity, and it is computationally efficient. However, the practical usefulness of the method is limited by its statistical power, requiring large numbers of sib pairs and highly informative markers to detect genetic loci of only moderate effect size. We have developed a family of interval mapping procedures which dramatically increase the statistical power of the classical sib-pair approach. The methods make use of information from pairs of markers which flank a putative quantitative trait locus (QTL) in order to estimate the location and effect size of the QTL. Here we describe an extension of the interval mapping procedure which takes into account all available marker information on a chromosome simultaneously, rather than just pairs of markers. The method provides a computationally fast approximation to full multipoint analysis of sib-pair data using a modified Haseman-Elston approach. It gives very similar results to the earlier interval mapping procedure when marker information is relatively uniform and a coarse map is used. However, there is a substantial improvement over the original method when markers differ in information content and when a dense map is employed. The method is illustrated using real and simulated sib-pair data.
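
    The classical Haseman-Elston step underlying these methods regresses squared sib-pair trait differences on marker identity-by-descent (IBD) sharing; a significantly negative slope suggests linkage. A minimal sketch with invented data:

```python
def haseman_elston_slope(ibd_share, sq_trait_diff):
    """Regress squared sib-pair trait differences on the proportion of
    alleles shared IBD at a marker. A significantly negative slope
    suggests linkage to a quantitative trait locus."""
    n = len(ibd_share)
    mx = sum(ibd_share) / n
    my = sum(sq_trait_diff) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(ibd_share, sq_trait_diff))
    sxx = sum((x - mx) ** 2 for x in ibd_share)
    return sxy / sxx

# Invented sib-pair data: IBD sharing in {0, 0.5, 1} and squared trait
# differences that shrink as sharing rises (consistent with linkage)
pi_hat = [0.0, 0.5, 1.0, 0.0, 0.5, 1.0]
y_sq = [10.0, 8.0, 6.0, 10.0, 8.0, 6.0]
slope = haseman_elston_slope(pi_hat, y_sq)
```

    The interval and multipoint extensions in the paper improve power by estimating IBD sharing at the putative QTL from flanking or chromosome-wide markers rather than a single marker.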

  15. Quantitative evaluation of peptide-extraction methods by HPLC-triple-quad MS-MS.

    PubMed

    Du, Yan; Wu, Dapeng; Wu, Qian; Guan, Yafeng

    2015-02-01

    In this study, the efficiency of five peptide-extraction methods, namely acetonitrile (ACN) precipitation, ultrafiltration, C18 solid-phase extraction (SPE), dispersed SPE with mesoporous carbon CMK-3, and dispersed SPE with mesoporous silica MCM-41, was quantitatively investigated. With 28 tryptic peptides as target analytes, these methods were evaluated on the basis of recovery and reproducibility by using high-performance liquid chromatography-triple-quad tandem mass spectrometry in selected-reaction-monitoring mode. Because of the distinct extraction mechanisms of the methods, their preferences for extracting peptides of different properties were revealed to be quite different, usually depending on the pI values or hydrophobicity of the peptides. When target peptides were spiked in bovine serum albumin (BSA) solution, the extraction efficiency of all the methods except ACN precipitation changed significantly. The binding of BSA with target peptides and nonspecific adsorption on adsorbents were believed to be the ways through which BSA affected the extraction behavior. When spiked in plasma, the performance of all five methods deteriorated substantially, with the number of peptides having recoveries exceeding 70% being 15 for ACN precipitation, and none for the other methods. Finally, the methods were evaluated in terms of the number of identified peptides for extraction of endogenous plasma peptides. Only ultrafiltration and CMK-3 dispersed SPE performed differently from the quantitative results with target peptides, and the wider distribution of the properties of endogenous peptides was believed to be the main reason. PMID:25542575
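
    The recovery-based comparison can be sketched as below; the recovery values are invented for illustration.

```python
def recovery_percent(measured, spiked):
    """Spike-recovery: measured amount as a percentage of the spiked amount."""
    return 100.0 * measured / spiked

def count_acceptable(recoveries, threshold=70.0):
    """Count peptides whose recovery exceeds the acceptance threshold,
    as used to compare the extraction methods in plasma."""
    return sum(1 for r in recoveries if r > threshold)

# Illustrative recoveries (%) for a small peptide panel
recs = [95.0, 72.3, 68.0, 40.0, 88.8]
```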

  16. Reliability and Feasibility of Methods to Quantitatively Assess Peripheral Edema

    PubMed Central

    Brodovicz, Kimberly G.; McNaughton, Kristin; Uemura, Naoto; Meininger, Gary; Girman, Cynthia J.; Yale, Steven H.

    2009-01-01

    Objective: To evaluate methods to assess peripheral edema for reliability, feasibility and correlation with the classic clinical assessment of pitting edema. Design: Cross-sectional observational study. Setting: Large primary care clinic in Marshfield, Wisconsin, USA. Participants: Convenience sample of 20 patients with type 2 diabetes and a range of edema severity, including patients without edema. Methods: Eight methods of edema assessment were evaluated: (1) clinical assessment of pit depth and recovery at three locations, (2) patient questionnaire, (3) ankle circumference, (4) figure-of-eight (ankle circumference using eight ankle/foot landmarks), (5) edema tester (plastic card with holes of varying size pressed to the ankle with a blood pressure cuff), (6) modified edema tester (edema tester with bumps), (7) indirect leg volume (by series of ankle/leg circumferences), and (8) foot/ankle volumetry by water displacement. Patients were evaluated independently by three nurse examiners. Results: Water displacement and ankle circumference had high inter-examiner agreement (intraclass correlation coefficient 0.93, 0.96 right; 0.97, 0.97 left). Agreement was inconsistent for figure-of-eight (0.64, 0.86), moderate for indirect leg volume (0.53, 0.66), and low for clinical assessments at all locations. Agreement was low for the edema testers but varied by the pressure administered. Correlation with the classic, subjective clinical assessment was good for the nurse-performed assessments and patient questionnaire. Ankle circumference and patient questionnaires each took 1 minute to complete. Other tools took >5 minutes to complete. Conclusions: Water displacement and ankle circumference showed excellent reliability; however, water displacement is a time-consuming measure and may pose implementation challenges in the clinical and clinical trial environments. 
Patient-reported level and frequency of edema, based on an unvalidated questionnaire, was generally well correlated with the physician assessment of edema severity and may prove to be another reliable and accurate method of assessing edema. Additional study is needed to evaluate the validity and responsiveness of these methods. PMID:19251582
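
    The inter-examiner agreement statistic reported above, the intraclass correlation coefficient, can be computed from a one-way random-effects ANOVA. Below is a minimal single-measure ICC(1,1) sketch; the study may have used a different ICC form.

```python
def icc_oneway(ratings):
    """One-way random-effects, single-measure ICC(1,1) from a table of
    ratings[subject][rater] (every subject rated by the same k raters)."""
    n = len(ratings)      # number of subjects
    k = len(ratings[0])   # number of raters
    grand = sum(sum(row) for row in ratings) / (n * k)
    ss_between = k * sum((sum(row) / k - grand) ** 2 for row in ratings)
    ss_within = sum((x - sum(row) / k) ** 2 for row in ratings for x in row)
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

    Perfect agreement among raters drives the within-subject mean square to zero and the ICC to 1.0, matching the near-1 values reported for water displacement and ankle circumference.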

  17. Automatic segmentation method of striatum regions in quantitative susceptibility mapping images

    NASA Astrophysics Data System (ADS)

    Murakawa, Saki; Uchiyama, Yoshikazu; Hirai, Toshinori

    2015-03-01

    Abnormal accumulation of brain iron has been detected in various neurodegenerative diseases. Quantitative susceptibility mapping (QSM) is a novel contrast mechanism in magnetic resonance (MR) imaging and enables the quantitative analysis of local tissue susceptibility properties. Therefore, automatic segmentation tools for brain regions on QSM images would be helpful for radiologists' quantitative analysis in various neurodegenerative diseases. The purpose of this study was to develop an automatic segmentation and classification method for striatum regions on QSM images. Our image database consisted of 22 QSM images obtained from healthy volunteers. These images were acquired on a 3.0 T MR scanner. The voxel size was 0.9×0.9×2 mm. The matrix size of each slice image was 256×256 pixels. In our computerized method, a template matching technique was first used for the detection of a slice image containing striatum regions. An image registration technique was subsequently employed for the classification of striatum regions in consideration of the anatomical knowledge. After the image registration, the voxels in the target image which correspond with striatum regions in the reference image were classified into three striatum regions, i.e., head of the caudate nucleus, putamen, and globus pallidus. The experimental results indicated that 100% (21/21) of the slice images containing striatum regions were detected accurately. The subjective evaluation of the classification results indicated that 20 (95.2%) of 21 showed good or adequate quality. Our computerized method would be useful for the quantitative analysis of Parkinson's disease in QSM images.
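
    The slice-detection step can be sketched with normalized cross-correlation template matching. Images are flattened to intensity lists here, and all values are illustrative.

```python
import math

def ncc(a, b):
    """Normalized cross-correlation between two equal-length intensity
    lists (flattened image patches); assumes neither patch is constant."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den

def best_slice(template, slices):
    """Return the index of the slice most similar to the template."""
    return max(range(len(slices)), key=lambda i: ncc(template, slices[i]))
```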

  18. Novel quantitative test method of laser range finder for range measurement: computerized instrument test method

    NASA Astrophysics Data System (ADS)

    Chen, Zhibin

    1996-10-01

    The maximum range-finding ability of a laser range finder (LRF) is strongly influenced by various field conditions, and these factors change constantly. In this paper, the advantages and shortcomings of the traditional test method, the "dissipated light power ratio method" using field object targets, are analyzed. A computerized instrument was developed that is unaffected by the field environment and by target characteristics; it provides a simple, effective, accurate, and quantitative test of the intrinsic range-measurement capability of the LRF system itself. This paper introduces formulas for estimating the measuring range of an LRF with the computerized instrument, studies the theory of laser range-measurement equations, and describes a breakthrough in test methodology: program-controlled time delays that simulate the spatial range between transmission and reception of the LRF under field conditions. This computerized test instrument has high practical value and theoretical significance for the calibration, checking, and acceptance of factory-produced LRFs.

  19. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. 
We evaluated this combined method in terms of quantitative accuracy using the realistic 3D NCAT phantom and an activity distribution obtained from patient studies. We compared the accuracy of organ activity estimates in images reconstructed with and without addition of downscatter compensation from projections with and without downscatter contamination. Results: We observed that the proposed method provided substantial improvements in accuracy compared to no downscatter compensation and had accuracies comparable to reconstructions from projections without downscatter contamination. Conclusions: The results demonstrate that the proposed model-based downscatter compensation method is effective and may have a role in quantitative 131I imaging. PMID:21815394
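
    For contrast with the authors' model-based approach, a much simpler window-based scatter estimate, the generic triple-energy-window (TEW) method (not the method developed in this paper), looks like this; all counts and window widths are illustrative.

```python
def tew_scatter_estimate(c_lower, c_upper, w_lower, w_upper, w_peak):
    """Triple-energy-window estimate: approximate the scatter under the
    photopeak window as a trapezoid whose heights are the count densities
    in narrow windows on either side of the peak."""
    return (c_lower / w_lower + c_upper / w_upper) * w_peak / 2.0

def scatter_corrected(c_peak, scatter):
    """Subtract the scatter estimate from the photopeak counts,
    clamped at zero."""
    return max(c_peak - scatter, 0.0)
```

    Model-based compensation, as developed in the paper, instead predicts the downscatter contribution from the activity estimate itself inside the iterative reconstruction, which avoids the noise amplification of direct subtraction.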

  20. Nonequilibrium atomistic simulations: methods and results

    SciTech Connect

    Hoover, W.G.

    1985-11-01

    Steady diffusive, viscous, plastic, and heat-conducting flows have all been successfully simulated using nonequilibrium molecular dynamics. Steady-state nonequilibrium simulations use "driving" forces to do mechanical work and "constraint" forces to extract the resulting heat. Just as in the Newtonian case, these generalized equations of motion are time-reversible and have constants of the motion associated with them. But, just as in the Newtonian case, the reversibility is illusory. The equilibrium and nonequilibrium equations both exhibit Lyapunov instability, with neighboring phase-space trajectories separating exponentially with time. We describe the methods by treating small systems of two or three particles. Results for many-body systems are discussed in terms of a generalized Principle of Corresponding States. 26 refs.

  1. Validation of quantitative and qualitative methods for detecting allergenic ingredients in processed foods in Japan.

    PubMed

    Sakai, Shinobu; Adachi, Reiko; Akiyama, Hiroshi; Teshima, Reiko

    2013-06-19

    A labeling system for food allergenic ingredients was established in Japan in April 2002. To monitor the labeling, the Japanese government announced official methods for detecting allergens in processed foods in November 2002. The official methods consist of quantitative screening tests using enzyme-linked immunosorbent assays (ELISAs) and qualitative confirmation tests using Western blotting or the polymerase chain reaction (PCR). In addition, the Japanese government designated 10 μg protein/g food (weight of soluble protein from the allergenic ingredient per weight of food), determined by ELISA, as the labeling threshold. To standardize the official methods, the criteria for the validation protocol were described in the official guidelines. This paper, which was presented at the Advances in Food Allergen Detection Symposium, ACS National Meeting and Expo, San Diego, CA, Spring 2012, describes the validation protocol outlined in the official Japanese guidelines, the results of interlaboratory studies for the quantitative detection method (ELISA for crustacean proteins) and the qualitative detection method (PCR for shrimp and crab DNAs), and the reliability of the detection methods. PMID:23039046
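
    The threshold decision described above can be sketched as follows. The linear standard curve and its parameters are illustrative; real ELISAs typically use a four-parameter logistic fit.

```python
LABEL_THRESHOLD_UG_PER_G = 10.0  # Japanese labeling threshold (from the text)

def protein_content(absorbance, slope, intercept):
    """Convert an ELISA absorbance reading to protein content (ug/g food)
    via a linear standard curve: absorbance = slope * content + intercept."""
    return (absorbance - intercept) / slope

def requires_label(content_ug_per_g):
    """True if the measured content meets or exceeds the labeling threshold."""
    return content_ug_per_g >= LABEL_THRESHOLD_UG_PER_G
```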

  2. On the quantitative method for measurement and analysis of the fine structure of Fraunhofer line profiles

    NASA Astrophysics Data System (ADS)

    Kuli-Zade, D. M.

    The methods of measurement and analysis of the fine structure of weak and moderate Fraunhofer line profiles are considered. The digital spectral data were obtained using rapidly scanning, high-dispersion, high-resolution double monochromators. The asymmetry-coefficient method, the bisector method, and a new quantitative method proposed by the author are discussed. The new physical quantities of differential, integral, residual, and relative asymmetry are introduced for the first time. These quantitative values permit investigation of the dependence of asymmetry on microscopic (atomic) and macroscopic (photospheric) quantities. It is shown that integral profile asymmetries grow appreciably with increasing line equivalent width. The average effective depths of formation in the solar photosphere of the Fraunhofer lines used are determined. It is shown that the integral and residual asymmetries of the line profiles decrease noticeably with increasing effective formation depth, in good agreement with the results on the intensity dependence of asymmetry. The above-mentioned methods are critically compared and the advantages of the author's method are shown. A computer program for calculating line-profile asymmetry parameters has been developed.

  3. A competitive quantitative polymerase chain reaction method for characterizing the population dynamics during kimchi fermentation.

    PubMed

    Ahn, Gee-Hyun; Moon, Jin Seok; Shin, So-Yeon; Min, Won Ki; Han, Nam Soo; Seo, Jin-Ho

    2015-01-01

    The aim of this study was to develop a competitive quantitative-PCR (CQ-PCR) method for rapid analysis of the population dynamics of lactic acid bacteria (LAB) in kimchi. For this, whole chromosome sequences of Leuconostoc mesenteroides, Lactobacillus plantarum, and Lb. brevis were compared and species-specific PCR primers targeting dextransucrase, 16S rRNA, and surface layer protein D (SlpD) genes, respectively, were constructed. The tested strains were quantified both in medium and kimchi by CQ-PCR and the results were compared with the data obtained using a conventional plate-counting method. As a result, the three species were successfully detected and quantified by the indicated primer sets. Our results show that the CQ-PCR method targeting species-specific genes is suitable for rapid estimation of LAB population to be used in the food fermentation industry. PMID:25475752
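
    Competitive quantitative PCR estimates the target copy number as the competitor input at which target and competitor amplify equally (product ratio = 1). A minimal log-log interpolation sketch with an invented dilution series:

```python
import math

def equivalence_point(competitor_copies, ratios):
    """Estimate target copies as the competitor input at which the
    target:competitor product ratio equals 1, by log-log interpolation
    between the two dilutions that bracket ratio = 1."""
    pts = [(math.log10(c), math.log10(r)) for c, r in zip(competitor_copies, ratios)]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if min(y0, y1) <= 0.0 <= max(y0, y1):
            x = x0 + (0.0 - y0) * (x1 - x0) / (y1 - y0)
            return 10 ** x
    raise ValueError("ratio = 1 is not bracketed by the dilution series")
```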

  4. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    SciTech Connect

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired.

  5. Quantitative investigation into methods for evaluating neocortical slice viability

    PubMed Central

    2013-01-01

    Background: In cortical and hippocampal brain slice experiments, the viability of processed tissue is usually judged by the amplitude of extracellularly recorded seizure-like event (SLE) activity. Surprisingly, the suitability of this approach for evaluating slice quality has not been objectively studied. Furthermore, a method for gauging the viability of quiescent tissue, in which SLE activity is intentionally suppressed, has not been documented. In this study we undertook to address both of these matters using the zero-magnesium SLE model in neocortical slices. Methods: Using zero-magnesium SLE activity as the output parameter, we investigated: (1) changes in the pattern (amplitude, frequency and length) of SLE activity as slice health either deteriorated or was compromised by altering the preparation methodology; and (2) in quiescent tissue, whether the triggering of high-frequency field activity following electrode insertion predicted subsequent development of SLE activity and hence slice viability. Results: SLE amplitude was the single most important variable correlating with slice viability, with a value less than 50 µV indicative of tissue unlikely to sustain population activity for more than 30-60 minutes. In quiescent slices, an increase in high-frequency field activity immediately after electrode insertion predicted the development of SLE activity in 100% of cases. Furthermore, the magnitude of the increase in spectral power correlated with the amplitude of subsequent SLE activity (R² = 40.9%).

  6. Quantitative estimation of poikilocytosis by the coherent optical method

    NASA Astrophysics Data System (ADS)

    Safonova, Larisa P.; Samorodov, Andrey V.; Spiridonov, Igor N.

    2000-05-01

    An investigation into the need for, and the reliability required of, the determination of poikilocytosis in hematology has shown that existing techniques suffer from serious shortcomings. To determine the deviation of erythrocyte shape from the normal (rounded) one in blood smears, it is expedient to use an integrative estimate. An algorithm is suggested that is based on the correlation of erythrocyte morphological parameters with properties of the spatial-frequency spectrum of the blood smear. During analytical and experimental research, an integrative form parameter (IFP) was proposed that characterizes the increase of the relative concentration of cells with changed form above 5% and the predominating type of poikilocytes. An algorithm for statistically reliable estimation of the IFP on standard stained blood smears has been developed. To provide quantitative characterization of the morphological features of cells, a form vector has been proposed, and its validity for differentiating poikilocytes was shown.

  7. Quantitative assessment of contact and non-contact lateral force calibration methods for atomic force microscopy.

    PubMed

    Tran Khac, Bien Cuong; Chung, Koo-Hyun

    2016-02-01

    Atomic Force Microscopy (AFM) has been widely used for measuring friction force at the nano-scale. However, one of the key challenges faced by AFM researchers is to calibrate an AFM system to interpret a lateral force signal as a quantifiable force. In this study, five rectangular cantilevers were used to quantitatively compare three different lateral force calibration methods to demonstrate the legitimacy and to establish confidence in the quantitative integrity of the proposed methods. The Flat-Wedge method is based on a variation of the lateral output on a surface with flat and changing slopes, the Multi-Load Pivot method is based on taking pivot measurements at several locations along the cantilever length, and the Lateral AFM Thermal-Sader method is based on determining the optical lever sensitivity from the thermal noise spectrum of the first torsional mode with a known torsional spring constant from the Sader method. The results of the calibration using the Flat-Wedge and Multi-Load Pivot methods were found to be consistent within experimental uncertainties, and the experimental uncertainties of the two methods were found to be less than 15%. However, the lateral force sensitivity determined by the Lateral AFM Thermal-Sader method was found to be 8-29% smaller than those obtained from the other two methods. This discrepancy decreased to 3-19% when the torsional mode correction factor for an ideal cantilever was used, which suggests that the torsional mode correction should be taken into account to establish confidence in Lateral AFM Thermal-Sader method. PMID:26624514

  8. Geothermal investigations in Nebraska: methods and results

    SciTech Connect

    Gosnold, W.D. Jr.; Eversoll, D.A.; Carlson, M.P.; Ruscetta, C.A.; Foley, D.

    1981-05-01

    At the inception of the geothermal resource assessment program in Nebraska there was some skepticism about the existence of any geothermal resources within the state. Now, after two years of study and collaboration with other workers in the geothermal field, it is found that about two-thirds of the state has access to a potential low-temperature resource. The resource consists of warm water in laterally extensive aquifers overlain by thick (> 1 km) sections of low-thermal-conductivity sediments. For most of the resource area the high temperatures in the aquifers result from high temperature gradients in the overlying shales. However, in the north-central and far western parts of the state there is evidence for convective heat flow due to updip water flow in the aquifers. The success of the program has resulted from the synthesis of heat flow and temperature gradient measurements with stratigraphic and lithologic data. The methods used and the results obtained during the study are described.

  9. Intra-laboratory validation of chronic bee paralysis virus quantitation using an accredited standardised real-time quantitative RT-PCR method.

    PubMed

    Blanchard, Philippe; Regnault, Julie; Schurr, Frank; Dubois, Eric; Ribière, Magali

    2012-03-01

    Chronic bee paralysis virus (CBPV) is responsible for chronic bee paralysis, an infectious and contagious disease in adult honey bees (Apis mellifera L.). A real-time RT-PCR assay to quantitate the CBPV load is now available. To propose this assay as a reference method, it was characterised further in an intra-laboratory study during which the reliability and the repeatability of results and the performance of the assay were confirmed. The qPCR assay alone and the whole quantitation method (from sample RNA extraction to analysis) were both assessed following the ISO/IEC 17025 standard and the recent XP U47-600 standard issued by the French Standards Institute. The performance of the qPCR assay and of the overall CBPV quantitation method were validated over a 6 log range from 10(2) to 10(8) with a detection limit of 50 and 100 CBPV RNA copies, respectively, and the protocol of the real-time RT-qPCR assay for CBPV quantitation was approved by the French Accreditation Committee. PMID:22207079
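
    The calibration underlying such an assay can be illustrated with a generic qPCR standard-curve computation. The sketch below is not the accredited protocol from the paper; the dilution series, Cq values, and function names are hypothetical.

```python
def fit_standard_curve(log10_copies, cq_values):
    """Least-squares fit of Cq = slope * log10(copies) + intercept."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(cq_values) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, cq_values))
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    slope = sxy / sxx
    return slope, my - slope * mx

def amplification_efficiency(slope):
    """E = 10^(-1/slope) - 1; E = 1.0 means perfect doubling per cycle."""
    return 10.0 ** (-1.0 / slope) - 1.0

def copies_from_cq(cq, slope, intercept):
    """Invert the standard curve to quantitate an unknown sample."""
    return 10.0 ** ((cq - intercept) / slope)

# Hypothetical dilution series spanning a 6 log range (10^2 .. 10^8 copies)
logs = [2, 3, 4, 5, 6, 7, 8]
cqs = [34.3, 31.0, 27.7, 24.4, 21.1, 17.8, 14.5]
slope, intercept = fit_standard_curve(logs, cqs)
```

    With these hypothetical values the slope is -3.3, corresponding to an amplification efficiency close to 100%.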

  10. Full quantitative phase analysis of hydrated lime using the Rietveld method

    SciTech Connect

    Lassinantti Gualtieri, Magdalena

    2012-09-15

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2-15 wt.%.
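
    The glass-spiking cross-check rests on the standard internal-standard relation: Rietveld refinement normalises the crystalline phases to 100%, so an over-estimated standard fraction reveals the amorphous content. A minimal sketch of that arithmetic, with illustrative numbers that are not from the paper:

```python
def amorphous_fraction(w_s, r_s):
    """Amorphous weight fraction of the original (unspiked) sample.

    w_s -- known weight fraction of crystalline standard in the spiked mixture
    r_s -- apparent standard fraction returned by the Rietveld refinement,
           in which crystalline phases are normalised to 100%
    """
    return 1.0 - w_s * (1.0 - r_s) / (r_s * (1.0 - w_s))

# A sample that is 10% amorphous, spiked with 20 wt.% standard, makes the
# standard appear as 0.2 / 0.92 of the crystalline phases:
a = amorphous_fraction(0.2, 0.2 / 0.92)  # ~0.10
```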

  11. An ECL-PCR method for quantitative detection of point mutation

    NASA Astrophysics Data System (ADS)

    Zhu, Debin; Xing, Da; Shen, Xingyan; Chen, Qun; Liu, Jinfeng

    2005-04-01

    A new method for identifying point mutations is proposed. Polymerase chain reaction (PCR) amplification of a sequence from genomic DNA was followed by digestion with a restriction enzyme that cuts only the wild-type amplicon containing its recognition site. Reaction products were detected by electrochemiluminescence (ECL) assay after adsorption of the resulting DNA duplexes to the solid phase. One strand of the PCR product carries biotin and is bound to a streptavidin-coated microbead for sample selection. The other strand carries Ru(bpy)3(2+) (TBR), which reacts with tripropylamine (TPA) to emit light for ECL detection. The method was applied to detect a specific point mutation in the H-ras oncogene in the T24 cell line. The results show that the detection limit for the H-ras amplicon is 100 fmol and the linear range spans more than 3 orders of magnitude, thus making quantitative analysis possible. The genotype can be clearly discriminated. These results suggest that ECL-PCR is a feasible quantitative method for the safe, sensitive, and rapid detection of point mutations in human genes.

  12. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  13. Quantitative research on the primary process: method and findings.

    PubMed

    Holt, Robert R

    2002-01-01

    Freud always defined the primary process metapsychologically, but he described the ways it shows up in dreams, parapraxes, jokes, and symptoms with enough observational detail to make it possible to create an objective, reliable scoring system to measure its manifestations in Rorschach responses, dreams, TAT stories, free associations, and other verbal texts. That system can identify signs of the thinker's efforts, adaptive or maladaptive, to control or defend against the emergence of primary process. A prerequisite and a consequence of the research that used this system was clarification and elaboration of the psychoanalytic theory of thinking. Results of empirical tests of several propositions derived from psychoanalytic theory are summarized. Predictions concerning the method's most useful index, of adaptive vs. maladaptive regression, have been repeatedly verified: People who score high on this index (who are able to produce well-controlled "primary products" in their Rorschach responses), as compared to those who score at the maladaptive pole (producing primary-process-filled responses with poor reality testing, anxiety, and pathological defensive efforts), are better able to tolerate sensory deprivation, are more able to enter special states of consciousness comfortably (drug-induced, hypnotic, etc.), and have higher achievements in artistic creativity, while schizophrenics tend to score at the extreme of maladaptive regression. Capacity for adaptive regression also predicts success in psychotherapy, and rises with the degree of improvement after both psychotherapy and drug treatment. Some predictive failures have been theoretically interesting: Kris's hypothesis about creativity and the controlled use of primary process holds for males but usually not for females. This body of work is presented as a refutation of charges, brought by such critics as Crews, that psychoanalysis cannot become a science. PMID:12206540

  14. Quantitative Analysis of Single Particle Trajectories: Mean Maximal Excursion Method

    PubMed Central

    Tejedor, Vincent; Bénichou, Olivier; Voituriez, Raphael; Jungmann, Ralf; Simmel, Friedrich; Selhuber-Unkel, Christine; Oddershede, Lene B.; Metzler, Ralf

    2010-01-01

    An increasing number of experimental studies employ single particle tracking to probe the physical environment in complex systems. We here propose and discuss what we believe are new methods to analyze the time series of the particle traces, in particular, for subdiffusion phenomena. We discuss the statistical properties of mean maximal excursions (MMEs), i.e., the maximal distance covered by a test particle up to time t. Compared to traditional methods focusing on the mean-squared displacement we show that the MME analysis performs better in the determination of the anomalous diffusion exponent. We also demonstrate that combination of regular moments with moments of the MME method provides additional criteria to determine the exact physical nature of the underlying stochastic subdiffusion processes. We put the methods to test using experimental data as well as simulated time series from different models for normal and anomalous dynamics such as diffusion on fractals, continuous time random walks, and fractional Brownian motion. PMID:20371337
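
    The MME statistic itself is simple to compute from tracked positions: for each trajectory take the maximal distance from the starting point up to time t, then average over the ensemble. Below is an illustrative sketch on simulated 1D random walks; the paper treats general models including diffusion on fractals, continuous time random walks, and fractional Brownian motion.

```python
import random

def mean_maximal_excursion(trajectories, t):
    """Ensemble average of max over s <= t of |x(s) - x(0)|."""
    return sum(max(abs(x - traj[0]) for x in traj[:t + 1])
               for traj in trajectories) / len(trajectories)

def mean_squared_displacement(trajectories, t):
    """Ensemble-averaged MSD at time t, for comparison."""
    return sum((traj[t] - traj[0]) ** 2
               for traj in trajectories) / len(trajectories)

def random_walk(steps, rng):
    x, path = 0, [0]
    for _ in range(steps):
        x += rng.choice((-1, 1))
        path.append(x)
    return path

rng = random.Random(0)
walks = [random_walk(200, rng) for _ in range(500)]
# For normal diffusion the MSD grows ~ t, and the MME grows monotonically.
```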

  15. Semi-quantitative method to estimate levels of Campylobacter

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Introduction: Research projects utilizing live animals and/or systems often require reliable, accurate quantification of Campylobacter following treatments. Even with marker strains, conventional methods designed to quantify are labor- and material-intensive, requiring either serial dilutions or MPN ...

  16. A method and fortran program for quantitative sampling in paleontology

    USGS Publications Warehouse

    Tipper, J.C.

    1976-01-01

    The Unit Sampling Method is a binomial sampling method applicable to the study of fauna preserved in rocks too well cemented to be disaggregated. Preliminary estimates of the probability of detecting each group in a single sampling unit can be converted to estimates of the group's volumetric abundance by means of correction curves obtained by a computer simulation technique. This paper describes the technique and gives the FORTRAN program. © 1976.
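
    The correction-curve idea can be sketched generically: simulate the probability of detecting a group in one sampling unit as a function of its volumetric abundance, then invert the curve. The detection model below (a unit examines a fixed number of independent points) is our simplification, not Tipper's, and all numbers are hypothetical.

```python
import random

def detection_probability(v, points_per_unit, trials, rng):
    """Monte Carlo chance that at least one examined point in a
    sampling unit falls on a group of volumetric abundance v."""
    hits = sum(1 for _ in range(trials)
               if any(rng.random() < v for _ in range(points_per_unit)))
    return hits / trials

def correction_curve(points_per_unit, rng, steps=50, trials=2000):
    """Simulated (detection probability, abundance) pairs."""
    curve = []
    for i in range(1, steps + 1):
        v = 0.2 * i / steps  # abundances up to 20 vol.%
        curve.append((detection_probability(v, points_per_unit, trials, rng), v))
    curve.sort()
    return curve

def abundance_from_detection(p, curve):
    """Invert the correction curve by linear interpolation."""
    if p <= curve[0][0]:
        return curve[0][1]
    for (p0, v0), (p1, v1) in zip(curve, curve[1:]):
        if p0 <= p <= p1:
            return v0 + (v1 - v0) * (p - p0) / ((p1 - p0) or 1.0)
    return curve[-1][1]

rng = random.Random(1)
curve = correction_curve(points_per_unit=20, rng=rng)
estimate = abundance_from_detection(0.64, curve)  # ~0.05 for 20 points/unit
```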

  17. Reconstruction-classification method for quantitative photoacoustic tomography.

    PubMed

    Malone, Emma; Powell, Samuel; Cox, Ben T; Arridge, Simon

    2015-12-01

    We propose a combined reconstruction-classification method for simultaneously recovering absorption and scattering in turbid media from images of absorbed optical energy. This method exploits knowledge that optical parameters are determined by a limited number of classes to iteratively improve their estimate. Numerical experiments show that the proposed approach allows for accurate recovery of absorption and scattering in two and three dimensions, and delivers superior image quality with respect to traditional reconstruction-only approaches. PMID:26662815

  18. Reconstruction-classification method for quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Malone, Emma; Powell, Samuel; Cox, Ben T.; Arridge, Simon

    2015-12-01

    We propose a combined reconstruction-classification method for simultaneously recovering absorption and scattering in turbid media from images of absorbed optical energy. This method exploits knowledge that optical parameters are determined by a limited number of classes to iteratively improve their estimate. Numerical experiments show that the proposed approach allows for accurate recovery of absorption and scattering in two and three dimensions, and delivers superior image quality with respect to traditional reconstruction-only approaches.

  19. Quantitative Diagnostic Method for Biceps Long Head Tendinitis by Using Ultrasound

    PubMed Central

    Huang, Shih-Wei; Wang, Wei-Te

    2013-01-01

    Objective. To investigate the feasibility of a grayscale quantitative diagnostic method for biceps tendinitis and determine the cut-off points of a quantitative biceps ultrasound (US) method to diagnose biceps tendinitis. Design. Prospective cross-sectional case-controlled study. Setting. Outpatient rehabilitation service. Methods. A total of 336 shoulder pain patients with suspected biceps tendinitis were recruited in this prospective observational study. The grayscale pixel data of the region of interest (ROI) were obtained for both the transverse and longitudinal views of the biceps US. Results. A total of 136 patients were classified with biceps tendinitis, and 200 patients were classified as not having biceps tendinitis based on the diagnostic criteria. Based on the Youden index, the cut-off points were determined as 26.85 for the transverse view and 21.25 for the longitudinal view of the standard deviation (StdDev) of the ROI values, respectively. When the ROI evaluation of the US surpassed the cut-off point, the sensitivity was 68% and the specificity was 90% for the StdDev of the transverse view, and the sensitivity was 81% and the specificity was 73% for the StdDev of the longitudinal view in diagnosing biceps tendinitis. Conclusion. For equivocal cases or inexperienced sonographers, our study provides a more objective method for diagnosing biceps tendinitis in shoulder pain patients. PMID:24385888
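
    The Youden-index cut-off selection used here is easy to reproduce: scan candidate cut-offs and maximise J = sensitivity + specificity - 1, assuming higher StdDev values indicate tendinitis. The sketch below uses hypothetical toy data, not the study's measurements.

```python
def youden_cutoff(values, labels):
    """Return (cutoff, J, sensitivity, specificity) maximising
    J = sensitivity + specificity - 1 over the observed cut-offs.

    values -- e.g. StdDev of ROI grey levels per patient
    labels -- 1 = tendinitis, 0 = healthy; a higher value counts as positive
    """
    pos = sum(labels)
    neg = len(labels) - pos
    best = (None, -1.0, 0.0, 0.0)
    for cut in sorted(set(values)):
        tp = sum(1 for v, l in zip(values, labels) if v > cut and l == 1)
        tn = sum(1 for v, l in zip(values, labels) if v <= cut and l == 0)
        sens, spec = tp / pos, tn / neg
        j = sens + spec - 1.0
        if j > best[1]:
            best = (cut, j, sens, spec)
    return best

# Toy example: the two groups separate perfectly at a cut-off of 3
cut, j, sens, spec = youden_cutoff([1, 2, 3, 10, 11, 12], [0, 0, 0, 1, 1, 1])
```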

  20. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences in the different caries lesions focused on the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of wavebands at 565-750 nm to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems.
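
    The reported grading of the R/(G + B) component ratio reduces to simple thresholding. A sketch using the cut-offs from the abstract; the handling of values exactly at 0.66, 1.06, and 1.62 is our assumption.

```python
def component_ratio(r, g, b):
    """Image component ratio R/(G + B) for mean channel values."""
    return r / (g + b)

def grade_caries(ratio):
    """Map the component ratio onto the four grades from the abstract."""
    if ratio < 0.66:
        return "sound"
    if ratio < 1.06:
        return "early decay"
    if ratio < 1.62:
        return "established decay"
    return "severe decay"
```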

  1. MODIS Radiometric Calibration Program, Methods and Results

    NASA Technical Reports Server (NTRS)

    Xiong, Xiaoxiong; Guenther, Bruce; Angal, Amit; Barnes, William; Salomonson, Vincent; Sun, Junqiang; Wenny, Brian

    2012-01-01

    As a key instrument for NASA's Earth Observing System (EOS), the Moderate Resolution Imaging Spectroradiometer (MODIS) has made significant contributions to the remote sensing community with its unprecedented amount of data products continuously generated from its observations and freely distributed to users worldwide. MODIS observations, covering spectral regions from visible (VIS) to long-wave infrared (LWIR), have enabled a broad range of research activities and applications for studies of the earth's interactive system of land, oceans, and atmosphere. In addition to extensive pre-launch measurements, developed to characterize sensor performance, MODIS carries a set of on-board calibrators (OBC) that can be used to track on-orbit changes of various sensor characteristics. Most importantly, dedicated and continuous calibration efforts have been made to maintain sensor data quality. This paper provides an overview of the MODIS calibration program, on-orbit calibration activities, methods, and performance. Key calibration results and lessons learned from the MODIS calibration effort are also presented in this paper.

  2. Quantifying disability: data, methods and results.

    PubMed Central

    Murray, C. J.; Lopez, A. D.

    1994-01-01

    Conventional methods for collecting, analysing and disseminating data and information on disability in populations have relied on cross-sectional censuses and surveys which measure prevalence in a given period. While this may be relevant for defining the extent and demographic pattern of disabilities in a population, and thus indicating the need for rehabilitative services, prevention requires detailed information on the underlying diseases and injuries that cause disabilities. The Global Burden of Disease methodology described in this paper provides a mechanism for quantifying the health consequences of the years of life lived with disabilities by first estimating the age-sex-specific incidence rates of underlying conditions, and then mapping these to a single disability index which collectively reflects the probability of progressing to a disability, the duration of life lived with the disability, and the approximate severity of the disability in terms of activity restriction. Detailed estimates of the number of disability-adjusted life years (DALYs) lived are provided in this paper, for eight geographical regions. The results should be useful to those concerned with planning health services for the disabled and, more particularly, with determining policies to prevent the underlying conditions which give rise to serious disabling sequelae. PMID:8062403
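
    The disability index described above combines the probability of progressing to a disability, its duration, and its severity. Below is a simplified, non-discounted sketch of the years-lived-with-disability arithmetic; the published DALY methodology additionally applies age weighting and discounting, and the figures here are hypothetical.

```python
def years_lived_with_disability(incident_cases, p_disability,
                                duration_years, severity_weight):
    """Non-discounted YLD: incident cases that progress to disability,
    times years lived with it, weighted by severity
    (0 = no restriction, 1 = equivalent to death)."""
    return incident_cases * p_disability * duration_years * severity_weight

# Hypothetical condition: 10,000 incident cases, 30% progress to a
# disability lasting 8 years on average, severity weight 0.25
yld = years_lived_with_disability(10_000, 0.30, 8.0, 0.25)  # 6000.0
```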

  3. Limitations of the ferrozine method for quantitative assay of mineral systems for ferrous and total iron

    NASA Astrophysics Data System (ADS)

    Anastácio, Alexandre S.; Harris, Brittany; Yoo, Hae-In; Fabris, José Domingos; Stucki, Joseph W.

    2008-10-01

    The quantitative assay of clay minerals, soils, and sediments for Fe(II) and total Fe is fundamental to understanding biogeochemical cycles occurring therein. The commonly used ferrozine method was originally designed to assay extracted forms of Fe(II) from non-silicate aqueous systems. It is becoming, however, increasingly the method of choice to report the total reduced state of Fe in soils and sediments. Because Fe in soils and sediments commonly exists in the structural framework of silicates, extraction by HCl, as used in the ferrozine method, fails to dissolve all of the Fe. The phenanthroline (phen) method, on the other hand, was designed to assay silicate minerals for Fe(II) and total Fe and has been proven to be highly reliable. In the present study potential sources of error in the ferrozine method were evaluated by comparing its results to those obtained by the phen method. Both methods were used to analyze clay mineral and soil samples for Fe(II) and total Fe. Results revealed that the conventional ferrozine method under-reports total Fe in samples containing Fe in silicates and gives erratic results for Fe(II). The sources of error in the ferrozine method are: (1) HCl fails to dissolve silicates and (2) if the analyte solution contains Fe3+, the analysis for Fe2+ will be photosensitive, and reported Fe(II) values will likely be greater than the actual amount in solution. Another difficulty with the ferrozine method is that it is tedious and much more labor-intensive than the phen method. For these reasons, the phen method is preferred and recommended. Its procedure is simpler, takes less time, and avoids the errors found in the ferrozine method.

  4. Quantitative interpretation of mineral hyperspectral images based on principal component analysis and independent component analysis methods.

    PubMed

    Jiang, Xiping; Jiang, Yu; Wu, Fang; Wu, Fenghuang

    2014-01-01

    Interpretation of mineral hyperspectral images provides large amounts of high-dimensional data, which is often complicated by mixed pixels. The quantitative interpretation of hyperspectral images is known to be extremely difficult when three types of information are unknown, namely, the number of pure pixels, the spectrum of pure pixels, and the mixing matrix. The problem is made even more complex by the disturbance of noise. The key to interpreting abstract mineral component information, i.e., pixel unmixing and abundance inversion, is how to effectively reduce noise, dimension, and redundancy. A three-step procedure is developed in this study for quantitative interpretation of hyperspectral images. First, the principal component analysis (PCA) method can be used to process the pixel spectrum matrix and keep characteristic vectors with larger eigenvalues. This can effectively reduce the noise and redundancy, which facilitates the abstraction of major component information. Second, the independent component analysis (ICA) method can be used to identify and unmix the pixels based on the linear mixed model. Third, the pure-pixel spectra can be normalized for abundance inversion, which gives the abundance of each pure pixel. In numerical experiments, both simulated data and actual data were used to demonstrate the performance of our three-step procedure. With simulated data, the results of our procedure were compared with theoretical values. With the actual data measured from core hyperspectral images, the results obtained through our algorithm are compared with those of similar software (Mineral Spectral Analysis 1.0, Nanjing Institute of Geology and Mineral Resources). The comparisons show that our method is effective and can provide a reference for quantitative interpretation of hyperspectral images. PMID:24694708

  5. A SVM-based Quantitative fMRI Method For Resting State Functional Network Detection

    PubMed Central

    Song, Xiaomu; Chen, Nan-kuei

    2014-01-01

    Resting state functional magnetic resonance imaging (fMRI) aims to measure baseline neuronal connectivity independent of specific functional tasks and to capture changes in the connectivity due to neurological diseases. Most existing network detection methods rely on a fixed threshold to identify functionally connected voxels under the resting state. Due to fMRI non-stationarity, the threshold cannot adapt to variation of data characteristics across sessions and subjects, and generates unreliable mapping results. In this study, a new method is presented for resting state fMRI data analysis. Specifically, the resting state network mapping is formulated as an outlier detection process that is implemented using one-class support vector machine (SVM). The results are refined by using a spatial-feature domain prototype selection method and two-class SVM reclassification. The final decision on each voxel is made by comparing its probabilities of being functionally connected and unconnected instead of applying a threshold. Multiple features for resting state analysis were extracted and examined using an SVM-based feature selection method, and the most representative features were identified. The proposed method was evaluated using synthetic and experimental fMRI data. A comparison study was also performed with independent component analysis (ICA) and correlation analysis. The experimental results show that the proposed method can provide comparable or better network detection performance than ICA and correlation analysis. The method is potentially applicable to various resting state quantitative fMRI studies. PMID:24928301

  6. A general method for the quantitative assessment of mineral pigments.

    PubMed

    Zurita Ares, M C; Fernández, J M

    2016-01-01

    A general method for estimating mineral pigment contents in different bases is proposed using a single set of calibration curves (one for each pigment) calculated for a white standard base, so that elaborating patterns for each base used is unnecessary. The method can be applied to different bases, and its validity has been proven even in strongly tinted bases. It consists of a novel procedure combining diffuse reflectance spectroscopy, second derivatives, and the Kubelka-Munk function. This technique has proved to be at least one order of magnitude more sensitive than X-ray diffraction for colored compounds, since it allowed determination of the pigment amount in colored samples containing 0.5 wt% of pigment that was not detected by X-ray diffraction. The method can be used to estimate the concentration of mineral pigments in a wide variety of either natural or artificial materials, since it does not require calculating each pigment pattern in every base. This fact could have important industrial consequences, as the proposed method is more convenient, faster, and cheaper. PMID:26695268
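
    The spectroscopic core of such a procedure, converting diffuse reflectance to the Kubelka-Munk function and taking second derivatives, can be sketched directly. This is a generic finite-difference sketch, not the paper's calibration procedure; building the calibration curves themselves is omitted.

```python
def kubelka_munk(reflectance):
    """Kubelka-Munk function F(R) = (1 - R)^2 / (2R),
    for diffuse reflectance R in (0, 1]."""
    return (1.0 - reflectance) ** 2 / (2.0 * reflectance)

def second_derivative(spectrum, step):
    """Central-difference second derivative of an evenly sampled spectrum."""
    return [(spectrum[i - 1] - 2.0 * spectrum[i] + spectrum[i + 1]) / step ** 2
            for i in range(1, len(spectrum) - 1)]
```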

  7. A Powerful and Robust Method for Mapping Quantitative Trait Loci in General Pedigrees

    PubMed Central

    Diao, G.; Lin, D. Y.

    2005-01-01

    The variance-components model is the method of choice for mapping quantitative trait loci in general human pedigrees. This model assumes normally distributed trait values and includes a major gene effect, random polygenic and environmental effects, and covariate effects. Violation of the normality assumption has detrimental effects on the type I error and power. One possible way of achieving normality is to transform trait values. The true transformation is unknown in practice, and different transformations may yield conflicting results. In addition, the commonly used transformations are ineffective in dealing with outlying trait values. We propose a novel extension of the variance-components model that allows the true transformation function to be completely unspecified. We present efficient likelihood-based procedures to estimate variance components and to test for genetic linkage. Simulation studies demonstrated that the new method is as powerful as the existing variance-components methods when the normality assumption holds; when the normality assumption fails, the new method still provides accurate control of type I error and is substantially more powerful than the existing methods. We performed a genomewide scan of monoamine oxidase B for the Collaborative Study on the Genetics of Alcoholism. In that study, the results that are based on the existing variance-components method changed dramatically when three outlying trait values were excluded from the analysis, whereas our method yielded essentially the same answers with or without those three outliers. The computer program that implements the new method is freely available. PMID:15918154

  8. The fundamentals and applications of phase field method in quantitative microstructural modeling

    NASA Astrophysics Data System (ADS)

    Shen, Chen

    The key to predicting and therefore controlling properties of materials is the knowledge of microstructure. As computer modeling and simulation is becoming an important part of materials science and engineering, there is an ever-increasing demand for quantitative models that are able to handle microstructures of realistic complexity at length and time scales of practical interest. The phase field approach has become the method of choice for modeling complicated microstructural evolutions during various phase transformations, grain growth and plastic deformation. Using gradient thermodynamics of non-uniform systems and Langevin dynamics, the method characterizes arbitrary microstructures and their spatial-temporal evolution with field variables, and is capable of simulating microstructures and their evolution under various realistic conditions. However, the adoption of the phase field method in practical applications has been slow because current phase field microstructure modeling is qualitative in nature. In this thesis, recent efforts in developing the phase field method for quantitative microstructure modeling are presented. This includes extension of the phase field method to situations where nucleation, growth and coarsening occur concurrently, incorporation of anisotropic elastic energy into the nucleation activation energy, and comparison of phase field kinetics for diffusion-controlled phase transformations with Johnson-Mehl-Avrami-Kolmogorov (JMAK) theory. The most recent extensions of the phase field method to modeling dislocation networks, dislocation core structures and partial dislocations, and dislocation interactions with gamma/gamma' microstructures in superalloys are also presented. The length scale limitations and practical approaches to increase simulation length scales for quantitative modeling are discussed for a quite general category of phase field applications. These extensions enable various new understandings of microstructure. For example, coherent precipitates are found to behave similarly to dislocations and grain boundaries, causing solute segregation, correlated nucleation, and an autocatalytic effect. The overall kinetics of diffusion-controlled precipitation agrees with the JMAK prediction only at early stages; owing to soft impingement and the Gibbs-Thomson effect, the later kinetics can deviate considerably. The new formulations of the crystalline energy and gradient energy in the phase field model of dislocations make it possible to study complex dislocation structures, including networks and dissociated nodes, in a self-consistent way. The introduction of gamma-surfaces for the constituent phases enables treating dislocation motion in a multi-phase microstructure within a single model. Finally, the discussion of length scales clarifies the applicability of the conventional approaches for increasing simulation length scales and their respective consequences for the quantitative results.

  9. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, D.A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  10. Quantitative Decomposition of Dynamics of Mathematical Cell Models: Method and Application to Ventricular Myocyte Models

    PubMed Central

    Shimayoshi, Takao; Cha, Chae Young; Amano, Akira

    2015-01-01

    Mathematical cell models are effective tools to understand cellular physiological functions precisely. For detailed analysis of model dynamics in order to investigate how much each component affects cellular behaviour, mathematical approaches are essential. This article presents a numerical analysis technique, which is applicable to any complicated cell model formulated as a system of ordinary differential equations, to quantitatively evaluate contributions of respective model components to the model dynamics in the intact situation. The present technique employs a novel mathematical index for decomposed dynamics with respect to each differential variable, along with a concept named the instantaneous equilibrium point, which represents the trend of a model variable at some instant. This article also illustrates applications of the method to comprehensive myocardial cell models for analysing insights into the mechanisms of action potential generation and the calcium transient. The analysis results exhibit quantitative contributions of individual channel gating mechanisms and ion exchanger activities to membrane repolarization, and of calcium fluxes and buffers to the rise and fall of the cytosolic calcium level. These analyses quantitatively explicate the principles of the model, which leads to a better understanding of cellular dynamics. PMID:26091413
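
    The idea of attributing the rate of change of a model variable to its individual components can be illustrated on the membrane-potential equation dV/dt = -(sum of currents)/C. The decomposition below is a minimal sketch of that bookkeeping, not the article's full index; the current names and values are hypothetical.

```python
def decompose_dvdt(currents, capacitance):
    """Split dV/dt = -(sum of currents)/C into per-current contributions
    and each current's fractional share of the total rate of change."""
    contributions = {name: -i / capacitance for name, i in currents.items()}
    total = sum(contributions.values())
    shares = ({name: c / total for name, c in contributions.items()}
              if total else {})
    return total, contributions, shares

# Hypothetical snapshot during repolarisation (currents in pA, C in pF):
total, contrib, share = decompose_dvdt(
    {"IKr": 30.0, "IK1": 10.0, "INaCa": -20.0}, 100.0)
```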

  11. Quantitative method for in vitro matrigel invasiveness measurement through image analysis software.

    PubMed

    Gallo-Oller, Gabriel; Rey, Juan A; Dotor, Javier; Castresana, Javier S

    2014-10-01

    The determination of cell invasion by matrigel assay is usually evaluated by counting cells able to pass through a porous membrane and attach themselves to the other side, or by an indirect quantification of eluted specific cell staining dye by means of optical density measurement. This paper describes a quantitative analytical imaging approach for determining the invasiveness of tumor cells using a simple method, based on image processing with the public-domain software ImageJ. Images obtained by direct capture are split into the red channel, and the generated image is used to measure the area that cells cover in the picture. To overcome the several disadvantages of classical cell invasion determinations, we propose this method because it generates more accurate and sensitive determinations, and it could be a reasonable option for improving the quality of the results. The cost-effective alternative method proposed is based on this simple and robust software, which is affordable worldwide. PMID:24990701
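
    A minimal numpy sketch of the described workflow (ImageJ performs the same steps interactively): split off the red channel, threshold it, and report the fraction of the field covered by cells. The threshold value and synthetic image are illustrative only.

```python
import numpy as np

def covered_area_fraction(rgb_image, threshold=50):
    """Fraction of pixels whose red-channel value exceeds the threshold."""
    red = rgb_image[:, :, 0].astype(float)
    mask = red > threshold
    return mask.mean()

# Synthetic 100x100 field: a 20x20 "cell cluster" on a dark background.
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[10:30, 10:30, 0] = 200  # bright red patch = stained cells
print(covered_area_fraction(img))  # 400 / 10000 = 0.04
```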

  12. Quantitative trait locus gene mapping: a new method for locating alcohol response genes.

    PubMed

    Crabbe, J C

    1996-01-01

    Alcoholism is a multigenic trait with important non-genetic determinants. Studies with genetic animal models of susceptibility to several of alcohol's effects suggest that several genes contributing modest effects on susceptibility (Quantitative Trait Loci, or QTLs) are important. A new technique of QTL gene mapping has allowed the identification of the location in the mouse genome of several such QTLs. The method is described, and the locations of QTLs affecting the acute alcohol withdrawal reaction are presented as an example. Verification of these QTLs in ancillary studies is described, and the strengths, limitations, and future directions to be pursued are discussed. QTL mapping is a promising method for identifying genes in rodents, with the hope of directly extrapolating the results to the human genome. This review is based on a paper presented at the First International Congress of the Latin American Society for Biomedical Research on Alcoholism, Santiago, Chile, November 1994. PMID:12893462
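
    Full QTL mapping scans markers genome-wide with interval-mapping statistics; the kernel of a single-marker analysis, sketched here on invented data, is simply a test of whether phenotype means differ between genotype groups at one marker.

```python
# Single-marker QTL sketch: two-sample t statistic comparing phenotype
# means between genotype groups. All phenotype values are invented.
from statistics import mean, stdev
import math

def marker_t_statistic(pheno_aa, pheno_bb):
    """Welch-style t statistic for phenotype means at one marker."""
    n1, n2 = len(pheno_aa), len(pheno_bb)
    s1, s2 = stdev(pheno_aa), stdev(pheno_bb)
    se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
    return (mean(pheno_aa) - mean(pheno_bb)) / se

# Hypothetical withdrawal severity scores by genotype at one marker:
aa = [8.1, 7.9, 8.4, 8.0, 7.8]
bb = [5.2, 5.6, 4.9, 5.3, 5.5]
t = marker_t_statistic(aa, bb)
print(round(t, 2))  # a large |t| suggests a QTL linked to this marker
```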

  13. Selection methods in forage breeding: a quantitative appraisal

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Forage breeding can be extraordinarily complex because of the number of species, perenniality, mode of reproduction, mating system, and the genetic correlation for some traits evaluated in spaced plants vs. performance under cultivation. Aiming to compare eight forage breeding methods for direct sel...

  14. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1981-02-25

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules or ions.

  15. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, Edward F.; Keller, Richard A.; Apel, Charles T.

    1983-01-01

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions.

  16. Optogalvanic intracavity quantitative detector and method for its use

    DOEpatents

    Zalewski, E.F.; Keller, R.A.; Apel, C.T.

    1983-09-06

    The disclosure relates to an optogalvanic intracavity detector and method for its use. Measurement is made of the amount of light absorbed by atoms, small molecules and ions in a laser cavity utilizing laser-produced changes in plasmas containing the same atoms, molecules, or ions. 6 figs.

  17. QUANTITATIVE METHODS FOR CROSS-SPECIES MAPPING (CSM)

    EPA Science Inventory

    Cross species extrapolation will be defined as prediction from one species to another without empirical verification. Cross species mapping (CSM) is the same except empirical verification is performed. CSM may be viewed as validation of methods for extrapolation. Algorithms for CSM...

  18. Compatibility of Qualitative and Quantitative Methods: Studying Child Sexual Abuse in America.

    ERIC Educational Resources Information Center

    Phelan, Patricia

    1987-01-01

    Illustrates how the combined use of qualitative and quantitative methods was necessary in obtaining a clearer understanding of the process of incest in American society. Argues that the exclusive use of one methodology would have obscured important information. (FMW)

  19. Generalized multiple internal standard method for quantitative liquid chromatography mass spectrometry.

    PubMed

    Hu, Yuan-Liang; Chen, Zeng-Ping; Chen, Yao; Shi, Cai-Xia; Yu, Ru-Qin

    2016-05-01

    In this contribution, a multiplicative effects model for the generalized multiple-internal-standard method (MEMGMIS) was proposed to solve the signal instability problem of LC-MS over time. The MEMGMIS model seamlessly integrates the multiple-internal-standard strategy with a multivariate calibration method, and makes full use of all the information carried by multiple internal standards during the quantification of target analytes. Unlike existing methods based on multiple internal standards, MEMGMIS does not require selecting an optimal internal standard for the quantification of a specific analyte from the multiple internal standards used. MEMGMIS was applied to a proof-of-concept model system: the simultaneous quantitative analysis of five edible artificial colorants in two kinds of cocktail drinks. Experimental results demonstrated that MEMGMIS models established on LC-MS data of calibration samples prepared with ultrapure water could provide quite satisfactory concentration predictions for colorants in cocktail samples from their LC-MS data measured 10 days after the LC-MS analysis of the calibration samples. The average relative prediction errors of MEMGMIS models did not exceed 6.0%, considerably better than the corresponding values of commonly used univariate calibration models combined with multiple internal standards. The advantages of good performance and simple implementation render the MEMGMIS model a promising alternative tool in quantitative LC-MS assays. PMID:27072522
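
    MEMGMIS itself is a multivariate model; the idea it generalizes is the classic internal-standard ratio calibration, sketched below on invented numbers: dividing the analyte signal by the internal-standard signal cancels run-to-run drift that affects both signals equally.

```python
# Classic single-internal-standard calibration (not MEMGMIS itself):
# fit ratio = k * concentration, then invert for unknowns.

def fit_ratio_calibration(concentrations, analyte_signals, is_signals):
    """Least-squares slope through the origin for ratio = k * concentration."""
    ratios = [a / s for a, s in zip(analyte_signals, is_signals)]
    return (sum(c * r for c, r in zip(concentrations, ratios))
            / sum(c * c for c in concentrations))

def predict_concentration(k, analyte_signal, is_signal):
    return (analyte_signal / is_signal) / k

# Calibration standards: raw signals drift, but ratios track concentration.
conc = [1.0, 2.0, 4.0]
analyte = [100.0, 190.0, 420.0]
istd = [50.0, 47.5, 52.5]  # internal standard spiked at a fixed amount
k = fit_ratio_calibration(conc, analyte, istd)
# An unknown measured in a later, weaker run: both signals halved,
# so the ratio (and thus the prediction) is unaffected by the drift.
print(predict_concentration(k, analyte_signal=150.0, is_signal=37.5))
```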

  20. Characterization of working iron Fischer-Tropsch catalysts using quantitative diffraction methods

    NASA Astrophysics Data System (ADS)

    Mansker, Linda Denise

    This study presents the results of the ex-situ characterization of working iron Fischer-Tropsch synthesis (F-TS) catalysts, reacted hundreds of hours at elevated pressures, using a new quantitative x-ray diffraction analytical methodology. Compositions, iron phase structures, and phase particle morphologies were determined and correlated with the observed reaction kinetics. Conclusions were drawn about the character of each catalyst in its most and least active state. The identity of the active phase(s) in the Fe F-TS catalyst has been vigorously debated for more than 45 years. The highly-reduced catalyst, used to convert coal-derived syngas to hydrocarbon products, is thought to form a mixture of oxides, metal, and carbides upon pretreatment and reaction. Commonly, Soxhlet extraction is used to effect catalyst-product slurry separation; however, the extraction process could be producing irreversible changes in the catalyst, contributing to the conflicting results in the literature. X-ray diffraction doesn't require analyte-matrix separation before analysis, and can detect trace phases down to 300 ppm/2 nm; thus, working catalyst slurries could be characterized as-sampled. Data were quantitatively interpreted employing first principles methods, including the Rietveld polycrystalline structure method. Pretreated catalysts and pure phases were examined experimentally and modeled to explore specific behavior under x-rays. Then, the working catalyst slurries were quantitatively characterized. Empirical quantitation factors were calculated from experimental data or single crystal parameters, then validated using the Rietveld method results. In the most active form, after pretreatment in H2 or in CO at ambient pressure, well-preserved working catalysts contained significant amounts of Fe7C3 with trace alpha-Fe, once reaction had commenced at elevated pressure. Amounts of Fe3O4 were constant and small, with carbide dpavg < 15 nm. Small amounts of Fe7C3 were found in unreacted catalyst pretreated in CO at elevated pressures. In the least active form, well-preserved working catalysts contained Fe5C2 amounts >65 wt%, regardless of pretreatment gas and pressure, with all dpavg 18 nm. epsilon'-Fe2.2C carbide was found to probably consist of an {Fe5C2/FexO/epsilon-Fe3C} mixture. Fe5C2 carbide exhibited wide variations in diffraction pattern, which could be correlated with sample handling events, changes in process conditions, or dpavg.

  1. Quantitative analysis method to evaluate optical clearing effect of skin using a hyperosmotic chemical agent.

    PubMed

    Yoon, Jinhee; Son, Taeyoon; Jung, Byungjo

    2007-01-01

    Light penetration depth in highly scattering tissues can be increased by using hyperosmotic chemical agents such as glycerol, PEG (polyethylene glycol), and glucose. Many previous studies have used OCT, spectrometric, and integrating methods to quantitatively evaluate the optical clearing effect of skin. In this study, we demonstrate the optical clearing effect of skin using glycerol and suggest a new quantitative analysis method to evaluate the spatial optical clearing effect. PMID:18002713

  2. Measurable impact of RNA quality on gene expression results from quantitative PCR.

    PubMed

    Vermeulen, Jolle; De Preter, Katleen; Lefever, Steve; Nuytens, Justine; De Vloed, Fanny; Derveaux, Stefaan; Hellemans, Jan; Speleman, Frank; Vandesompele, Jo

    2011-05-01

    Compromised RNA quality is suggested to lead to unreliable results in gene expression studies. Therefore, assessment of RNA integrity and purity is deemed essential prior to including samples in the analytical pipeline. This may be of particular importance when diagnostic, prognostic or therapeutic conclusions depend on such analyses. In this study, the comparative value of six RNA quality parameters was determined using a large panel of 740 primary tumour samples for which real-time quantitative PCR gene expression results were available. The tested parameters comprise the microfluidic capillary electrophoresis based 18S/28S rRNA ratio and RNA Quality Index value, the HPRT1 5'-3' difference in quantification cycle (Cq) and HPRT1 3' Cq value based on a 5'/3' ratio mRNA integrity assay, the Cq value of expressed Alu repeat sequences, and a normalization factor based on the mean expression level of four reference genes. Upon establishment of an innovative analytical framework to assess the impact of RNA quality, we observed a measurable impact of RNA quality on the variation of the reference genes, on the significance of differential expression of prognostic marker genes between two cancer patient risk groups, and on risk classification performance using a multigene signature. This study forms the basis for further rational assessment of reverse transcription quantitative PCR based results in relation to RNA quality. PMID:21317187
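
    One of the tested parameters, the HPRT1 5'-3' Cq difference, lends itself to a minimal quality-control sketch: flag a sample as degraded when the 5' amplicon's Cq trails the 3' amplicon's by more than a chosen cutoff. The cutoff and the sample data below are invented.

```python
# 5'/3' integrity check: degraded RNA under-represents the 5' end of a
# transcript, inflating the 5' amplicon's Cq relative to the 3' amplicon's.

def flag_degraded(samples, max_delta_cq=2.0):
    """Return sample IDs whose 5'-3' Cq difference exceeds the cutoff."""
    flagged = []
    for sample_id, cq_5p, cq_3p in samples:
        if (cq_5p - cq_3p) > max_delta_cq:
            flagged.append(sample_id)
    return flagged

samples = [
    ("T01", 24.1, 23.8),  # intact: 5' and 3' amplicons amplify almost equally
    ("T02", 29.6, 24.2),  # degraded: 5' end strongly under-represented
    ("T03", 25.0, 24.5),
]
print(flag_degraded(samples))  # ['T02']
```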

  3. A novel method for generating quantitative local electrochemical impedance spectroscopy

    SciTech Connect

    Lillard, R.S.; Moran, P.J.; Isaacs, H.S.

    1992-04-01

    This paper reports on a local electrochemical impedance spectroscopy (LEIS) technique for mapping the ac impedance distribution, as a function of frequency, of an electrode. In LEIS, as in traditional ac impedance methods, a sinusoidal voltage perturbation between the working and reference electrode is maintained by driving an ac current between the working electrode and a distant counterelectrode with a potentiostat. Local ac impedances are then derived from the ratio of the applied ac voltage and the local ac solution current density. The local ac current density is obtained from potential difference measurements near the electrode surface using a probe consisting of two micro-electrodes. By measuring the ac potential difference between the micro-electrodes, and knowing their separation distance and the solution conductivity, the local ac solution current density is derived. The accuracy of the local ac impedance data generated with this technique was established by investigating two model systems.
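
    The quantities described can be related in a back-of-envelope sketch (all values invented): the local ac solution current density follows from Ohm's law applied to the potential difference between the two microelectrodes, and the local impedance is the applied ac voltage divided by that current density.

```python
# LEIS arithmetic sketch: i_local = sigma * dV / dx (finite-difference
# approximation of Ohm's law in solution), Z_local = V_applied / i_local.

def local_current_density(delta_v, probe_separation_cm, conductivity_s_per_cm):
    """Local ac solution current density in A/cm^2."""
    return conductivity_s_per_cm * delta_v / probe_separation_cm

def local_impedance(applied_v, delta_v, probe_separation_cm, conductivity_s_per_cm):
    """Local impedance in ohm*cm^2."""
    i = local_current_density(delta_v, probe_separation_cm, conductivity_s_per_cm)
    return applied_v / i

# 10 mV applied perturbation, 5 uV measured between probes 0.05 cm apart
# in a 0.01 S/cm electrolyte:
z = local_impedance(10e-3, 5e-6, 0.05, 0.01)
print(z)  # local impedance, ohm*cm^2
```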

  4. Quantitative Trait Locus Mapping Methods for Diversity Outbred Mice

    PubMed Central

    Gatti, Daniel M.; Svenson, Karen L.; Shabalin, Andrey; Wu, Long-Yang; Valdar, William; Simecek, Petr; Goodwin, Neal; Cheng, Riyan; Pomp, Daniel; Palmer, Abraham; Chesler, Elissa J.; Broman, Karl W.; Churchill, Gary A.

    2014-01-01

    Genetic mapping studies in the mouse and other model organisms are used to search for genes underlying complex phenotypes. Traditional genetic mapping studies that employ single-generation crosses have poor mapping resolution and limit discovery to loci that are polymorphic between the two parental strains. Multiparent outbreeding populations address these shortcomings by increasing the density of recombination events and introducing allelic variants from multiple founder strains. However, multiparent crosses present new analytical challenges and require specialized software to take full advantage of these benefits. Each animal in an outbreeding population is genetically unique and must be genotyped using a high-density marker set; regression models for mapping must accommodate multiple founder alleles, and complex breeding designs give rise to polygenic covariance among related animals that must be accounted for in mapping analysis. The Diversity Outbred (DO) mice combine the genetic diversity of eight founder strains in a multigenerational breeding design that has been maintained for >16 generations. The large population size and randomized mating ensure the long-term genetic stability of this population. We present a complete analytical pipeline for genetic mapping in DO mice, including algorithms for probabilistic reconstruction of founder haplotypes from genotyping array intensity data, and mapping methods that accommodate multiple founder haplotypes and account for relatedness among animals. Power analysis suggests that studies with as few as 200 DO mice can detect loci with large effects, but loci that account for <5% of trait variance may require a sample size of up to 1000 animals. The methods described here are implemented in the freely available R package DOQTL. PMID:25237114

  5. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages

    PubMed Central

    Yuan, Jian; Li, Meizhen; Lin, Senjie

    2015-01-01

    Marine phytoplankton are highly diverse, with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction yet preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (by up to 2-fold) in comparison with the non-bead-beating method, while also preserving the DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliophora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and “naked” unicellular organisms indicates that our method could obtain intact DNA from organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance. PMID:26218575

  6. Quantitative mineralogical composition of complex mineral wastes - Contribution of the Rietveld method

    SciTech Connect

    Mahieux, P.-Y.; Aubert, J.-E.; Cyr, M.; Coutand, M.; Husson, B.

    2010-03-15

    The objective of the work presented in this paper is the quantitative determination of the mineral composition of two complex mineral wastes: a sewage sludge ash (SSA) and a municipal solid waste incineration fly ash (MSWIFA). The mineral compositions were determined by two different methods: the first based on calculation using the qualitative mineralogical composition of the waste combined with physicochemical analyses; the second the Rietveld method, which uses only X-ray diffraction patterns. The results obtained are coherent, showing that it is possible to quantify the mineral compositions of complex mineral waste with such methods. The apparent simplicity of the Rietveld method (due principally to the availability of software packages implementing the method) facilitates its use. However, care should be taken since the crystal structure analysis based on powder diffraction data needs experience and a thorough understanding of crystallography. So the use of another, complementary, method such as the first one used in this study, may sometimes be needed to confirm the results.
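
    The Rietveld route to quantitative phase analysis rests on the Hill-and-Howard relation W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j, where S is the refined scale factor, Z the formula units per cell, M the formula mass, and V the cell volume. A minimal sketch with invented scale factors and rounded crystal data:

```python
# Convert refined Rietveld scale factors into weight fractions.
# Phase names, scale factors, and crystal data below are illustrative only.

def rietveld_weight_fractions(phases):
    """phases: dict name -> (scale, Z, M, V). Returns weight fractions."""
    szmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
    total = sum(szmv.values())
    return {name: val / total for name, val in szmv.items()}

phases = {
    "quartz":    (2.0e-6, 3, 60.08, 113.0),
    "calcite":   (1.0e-6, 6, 100.09, 367.8),
    "anhydrite": (0.5e-6, 4, 136.14, 305.5),
}
fractions = rietveld_weight_fractions(phases)
print({name: round(f, 3) for name, f in fractions.items()})
```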

  7. Bird community as an indicator of biodiversity: results from quantitative surveys in Brazil.

    PubMed

    Vielliard, J M

    2000-09-01

    This short review presents the results obtained in several localities of Brazil on the composition of forest bird communities. Data have been collected since the late 1980s, after we introduced a new methodology of quantitative survey based on acoustic identification and unlimited-radius point census. Although these data are still scattered, they show uniquely precise and coherently comparative patterns of composition of forest bird communities. Our methodology has the advantage of being absolutely non-disturbing, highly efficient in the field, and immediately processed. Results confirm that the structure of a bird community is a good indicator of biodiversity, particularly useful where biodiversity is high. Many of these data are available only in unpublished dissertations and abstracts of congress communications, or are being analysed. A cooperative program is needed to promote new surveys and publish their results, as a contribution to measuring and monitoring biodiversity, especially in complex endangered habitats. PMID:11028097

  8. New methods for quantitative and qualitative facial studies: an overview.

    PubMed

    Thomas, I T; Hintz, R J; Frias, J L

    1989-01-01

    The clinical study of birth defects has traditionally followed the Gestalt approach, with a trend, in recent years, toward more objective delineation. Data collection, however, has been largely restricted to measurements from X-rays and anthropometry. In other fields, new techniques are being applied that capitalize on the use of modern computer technology. One such technique is that of remote sensing, of which photogrammetry is a branch. Cartographers, surveyors, and engineers, using specially designed cameras, have applied geometrical techniques to locate points on an object precisely. These techniques, in their long-range application, have become part of our industrial technology and have assumed great importance with the development of satellite-borne surveillance systems. The close-range application of similar techniques has the potential for extremely accurate clinical measurement. We are currently evaluating the application of remote sensing to facial measurement using three conventional 35 mm still cameras. The subject is photographed in front of a carefully measured grid, and digitization is then carried out on the 35-mm slides: specific landmarks on the craniofacial region are identified, along with points on the background grid and the four corners of the slide frame, and are registered as xy coordinates by a digitizer. These coordinates are then converted into precise locations in object space. The technique is capable of producing measurements to within 1/100th of an inch. We suggest that remote sensing methods such as this may well be of great value in the study of congenital malformations. PMID:2677039

  9. Immunochemical methods for quantitation of vitamin B6. Technical report

    SciTech Connect

    Brandon, D.L.; Corse, J.W.

    1981-09-30

    A procedure is described which proposes schemes for determining the total of all B6 vitamins in acid-hydrolyzed samples utilizing a radio-immunoassay (RIA) or an enzyme-immunoassay (EIA). Sample preparation is similar for both RIA and EIA. Two specific antibodies (antipyridoxine and antipyridoxamine) are employed; to determine pyridoxamine, a portion of the sample is reduced with sodium borohydride. Pyridoxal is determined by the difference in pyridoxine before and after reduction. The results indicate that two procedures have been developed which are selective for pyridoxamine (the fluorescent enzyme immunoassay and the spin immunoassay) and one assay which is equally sensitive to pyridoxine and pyridoxamine (the radio-immunoassay).

  10. Quantitative analysis of uranium in aqueous solutions using a semiconductor laser-based spectroscopic method.

    PubMed

    Cho, Hye-Ryun; Jung, Euo Chang; Cha, Wansik; Song, Kyuseok

    2013-05-01

    A simple analytical method, based on the simultaneous measurement of the luminescence of hexavalent uranium ions (U(VI)) and the Raman scattering of water, was investigated for determining the concentration of U(VI) in aqueous solutions. Both spectra were measured using a cw semiconductor laser beam at a center wavelength of 405 nm. The empirical calibration curve for the quantitative analysis of U(VI) was obtained by measuring the ratio of the luminescence intensity of U(VI) at 519 nm to the Raman scattering intensity of water at 469 nm. The limit of detection (LOD) in the parts-per-billion range and a dynamic range from the LOD up to several hundred parts per million were achieved. The concentration of uranium in groundwater determined by this method is in good agreement with the results determined by kinetic phosphorescence analysis and inductively coupled plasma mass spectrometry. PMID:23534889
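
    The calibration described reduces to a linear fit of the 519 nm / 469 nm intensity ratio against concentration, with the water Raman band acting as an internal reference for laser power and collection efficiency. A sketch with invented, idealized calibration points:

```python
# Ratio-based calibration sketch: fit ratio = a + b * concentration,
# then invert for an unknown. All numbers are invented.

def fit_linear(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    return ybar - b * xbar, b

# Standards: U(VI) concentration (ppm) vs luminescence/Raman intensity ratio.
conc = [0.0, 1.0, 5.0, 10.0]
ratio = [0.02, 0.12, 0.52, 1.02]
a, b = fit_linear(conc, ratio)

unknown_ratio = 0.42
print((unknown_ratio - a) / b)  # estimated U(VI) concentration, ppm
```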

  11. Intentional Movement Performance Ability (IMPA): a method for robot-aided quantitative assessment of motor function.

    PubMed

    Shin, Sung Yul; Kim, Jung Yoon; Lee, Sanghyeop; Lee, Junwon; Kim, Seung-Jong; Kim, ChangHwan

    2013-06-01

    The purpose of this paper is to propose a new assessment method for evaluating the motor function of patients suffering from physical weakness after stroke, incomplete spinal cord injury (iSCI), or other diseases. In this work, we use a robotic device to obtain information on the interaction that occurs between patient and robot, and use it as a measure for assessing the patient. The Intentional Movement Performance Ability (IMPA) is defined by the root mean square of the interactive torque while the subject performs a given periodic movement with the robot. IMPA is proposed to quantitatively determine the level of the subject's impaired motor function. The method is indirectly tested by asking healthy subjects to lift a barbell to disturb their motor function. The experimental result shows that the IMPA has potential for providing proper information on the subject's motor function level. PMID:24187313
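
    As defined above, IMPA reduces to a root mean square of the measured interaction torque over a movement cycle. A minimal sketch on invented torque traces:

```python
import math

def impa(torque_samples_nm):
    """Root mean square of the interaction torque samples (N*m)."""
    return math.sqrt(sum(t * t for t in torque_samples_nm)
                     / len(torque_samples_nm))

# A subject tracking the robot's periodic movement well produces small
# interaction torques; an impaired subject fights the trajectory more.
good_tracking = [0.1, -0.2, 0.15, -0.05, 0.1]
poor_tracking = [1.2, -1.5, 1.1, -0.9, 1.3]
print(impa(good_tracking) < impa(poor_tracking))  # True
```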

  12. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling

    PubMed Central

    2012-01-01

    Mass spectrometry with labeling allows multiple specimens to be analyzed simultaneously. As a result, between-experiment variability is reduced. Here we describe the use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance, along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through the use of three case studies utilizing the iTRAQ 4-plex labeling protocol. PMID:23176383
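
    Two of the steps the article formalizes, normalization and per-protein comparison, can be sketched minimally. Median normalization (shown here on invented log2 reporter-ion intensities for a 4-plex run) assumes most proteins are unchanged between conditions:

```python
from statistics import median, mean

def median_normalize(channels):
    """Subtract each channel's median log2 intensity so medians align at 0."""
    return {ch: [v - median(vals) for v in vals]
            for ch, vals in channels.items()}

# log2 reporter-ion intensities for 3 proteins in a 4-plex run
# (channels 114/115 = control, 116/117 = treated; numbers invented).
channels = {
    "114": [10.0, 12.0, 14.0],
    "115": [10.4, 12.4, 14.4],
    "116": [11.0, 14.1, 15.0],
    "117": [11.3, 14.4, 15.3],
}
norm = median_normalize(channels)
# Per-protein treated-minus-control mean difference on the normalized scale:
diffs = [mean([norm["116"][i], norm["117"][i]])
         - mean([norm["114"][i], norm["115"][i]])
         for i in range(3)]
print([round(d, 2) for d in diffs])
```

Note how normalization absorbs the apparent global shift in the treated channels: the second protein is unchanged while the others appear relatively decreased. Checking that this is sensible for the experiment is the "did it work" step that precedes any test for differential abundance.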

  13. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  14. Spatial Access Priority Mapping (SAPM) with Fishers: A Quantitative GIS Method for Participatory Planning

    PubMed Central

    Yates, Katherine L.; Schoeman, David S.

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers’ spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers’ willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. 
This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process in a transparent, quantitative way. PMID:23874623

  15. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. 
Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. PMID:22092040
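
    The group comparisons above rely on Pearson chi-square tests of proportions. The 2x2 statistic is small enough to compute from first principles; the counts below are invented, not the study's data:

```python
# Pearson chi-square for a 2x2 contingency table [[a, b], [c, d]],
# via the closed form n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic, 1 df, no continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: methodological components present/absent in
# mixed methods articles (row 1) vs quantitative articles (row 2).
chi2 = chi_square_2x2(22, 78, 47, 53)
print(round(chi2, 2))
```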

  16. Dynamic and Quantitative Method of Analyzing Service Consistency Evolution Based on Extended Hierarchical Finite State Automata

    PubMed Central

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in a service's evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties for the usage of traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA), which formally depicts the overall changing processes of service consistency states. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that bad reusability (17.93% on average) is the biggest influential factor, noncomposition of atomic services (13.12%) is the second biggest one, and service version confusion (1.2%) is the smallest one. Compared with previous qualitative analysis, SCEAs present good effectiveness and feasibility. This research can guide the engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA. PMID:24772033

  17. Emerging flow injection mass spectrometry methods for high-throughput quantitative analysis.

    PubMed

    Nanita, Sergio C; Kaldon, Laura G

    2016-01-01

    Where does flow injection analysis mass spectrometry (FIA-MS) stand relative to ambient mass spectrometry (MS) and chromatography-MS? Improvements in FIA-MS methods have resulted in fast-expanding uses of this technique. Key advantages of FIA-MS over chromatography-MS are fast analysis (typical run time <60 s) and method simplicity, and FIA-MS offers high-throughput without compromising sensitivity, precision and accuracy as much as ambient MS techniques. Consequently, FIA-MS is increasingly becoming recognized as a suitable technique for applications where quantitative screening of chemicals needs to be performed rapidly and reliably. The FIA-MS methods discussed herein have demonstrated quantitation of diverse analytes, including pharmaceuticals, pesticides, environmental contaminants, and endogenous compounds, at levels ranging from parts-per-billion (ppb) to parts-per-million (ppm) in very complex matrices (such as blood, urine, and a variety of foods of plant and animal origin), allowing successful applications of the technique in clinical diagnostics, metabolomics, environmental sciences, toxicology, and detection of adulterated/counterfeited goods. The recent boom in applications of FIA-MS for high-throughput quantitative analysis has been driven in part by (1) the continuous improvements in sensitivity and selectivity of MS instrumentation, (2) the introduction of novel sample preparation procedures compatible with standalone mass spectrometric analysis such as salting-out assisted liquid-liquid extraction (SALLE) with volatile solutes and NH4+ QuEChERS, and (3) the need to improve efficiency of laboratories to satisfy increasing analytical demand while lowering operational cost. The advantages and drawbacks of quantitative analysis by FIA-MS are discussed in comparison to chromatography-MS and ambient MS (e.g., DESI, LAESI, DART). 
Generally, FIA-MS sits 'in the middle' between ambient MS and chromatography-MS, offering a balance between analytical capability and sample analysis throughput suitable for broad applications in life sciences, agricultural chemistry, consumer safety, and beyond. Graphical abstract Position of FIA-MS relative to chromatography-MS and ambient MS in terms of analytical figures of merit and sample analysis throughput. PMID:26670771

  18. An Improved Flow Cytometry Method For Precise Quantitation Of Natural-Killer Cell Activity

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Nehlsen-Cannarella, Sandra; Sams, Clarence

    2006-01-01

    The ability to assess NK cell cytotoxicity using flow cytometry has been previously described and can serve as a powerful tool to evaluate effector immune function in the clinical setting. Previous methods used membrane-permeable dyes to identify target cells. The use of these dyes requires great care to achieve optimal staining and results in a broad spectral emission that can make multicolor cytometry difficult. Previous methods have also used negative staining (the elimination of target cells) to identify effector cells. This makes a precise quantitation of effector NK cells impossible due to the interfering presence of T and B lymphocytes, and renders the data highly subjective to the variable levels of NK cells normally found in human peripheral blood. In this study an improved version of the standard flow cytometry assay for NK activity is described that has several advantages over previous methods. Fluorescent antibody staining (CD45-FITC) is used to positively identify target cells in place of membrane-permeable dyes. Fluorescent antibody staining of target cells is less labor intensive and more easily reproducible than membrane dyes. NK cells (true effector lymphocytes) are also positively identified by fluorescent antibody staining (CD56-PE), allowing a simultaneous absolute count assessment of both NK cells and target cells. Dead cells are identified by membrane disruption using the DNA intercalating dye PI. Using this method, an exact NK:target ratio may be determined for each assessment, including quantitation of NK-target complexes. Back-immunoscatter gating may be used to track live vs. dead target cells via scatter properties. If desired, NK activity may then be normalized to standardized ratios for clinical comparisons between patients, making the determination of PBMC counts or NK cell percentages prior to testing unnecessary. 
This method provides an exact cytometric determination of NK activity that is highly reproducible and may be suitable for routine use in the clinical setting.
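    The ratio and lysis arithmetic implied by this assay can be sketched briefly. Assuming absolute counts of effectors and targets from the antibody stains, and a dead-target fraction from PI staining (all numbers illustrative), the effector:target (E:T) ratio and a spontaneous-death-corrected specific lysis would be computed as:

```python
def et_ratio(nk_count, target_count):
    """Exact effector:target ratio from absolute counts per tube."""
    return nk_count / target_count

def specific_lysis(dead_frac_test, dead_frac_spontaneous):
    """Percent specific lysis, corrected for spontaneous target death
    measured in a control tube without effectors."""
    return 100.0 * (dead_frac_test - dead_frac_spontaneous) / (1.0 - dead_frac_spontaneous)

print(et_ratio(25000, 5000))                 # 5.0 -> a 5:1 E:T ratio
print(round(specific_lysis(0.42, 0.05), 1))  # 38.9
```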

  19. Methods Used by Pre-Service Nigeria Certificate in Education Teachers in Solving Quantitative Problems in Chemistry

    ERIC Educational Resources Information Center

    Danjuma, Ibrahim Mohammed

    2011-01-01

    This paper reports part of the results of research on chemical problem solving behavior of pre-service teachers in Plateau and Northeastern states of Nigeria. Specifically, it examines and describes the methods used by 204 pre-service teachers in solving quantitative problems from four topics in chemistry. Namely, gas laws; electrolysis;…

  1. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    PubMed

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  2. The Quantitative Methods Boot Camp: Teaching Quantitative Thinking and Computing Skills to Graduate Students in the Life Sciences

    PubMed Central

    Stefan, Melanie I.; Gutlerner, Johanna L.; Born, Richard T.; Springer, Michael

    2015-01-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a “boot camp” in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students’ engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  3. Quantitative accuracy analysis of the discontinuous Galerkin method for seismic wave propagation

    NASA Astrophysics Data System (ADS)

    Käser, Martin; Hermann, Verena; Puente, Josep de la

    2008-06-01

    We present a quantitative accuracy analysis of the Discontinuous Galerkin Finite-Element method for the simulation of seismic wave propagation on tetrahedral meshes. Several parameters are responsible for the accuracy of results, such as the chosen approximation order, the spatial discretization, that is, number of elements per wavelength, and the propagation distance of the waves due to numerical dispersion and dissipation. As error norm we choose the time-frequency representation of the envelope and phase misfit of seismograms to assess the accuracy of the resulting seismograms since this provides the time evolution of the spectral content and allows for the clear separation of amplitude and phase errors obtained by the numerical method. Our results can be directly used to set up the necessary modelling parameters for practical applications, such as the minimum approximation order for a given mesh spacing to reach a desired accuracy. Finally, we apply our results to the well-acknowledged LOH.1 and LOH.3 problems of the SPICE Code Validation project, including heterogeneous material and the free surface boundary condition, and compare our solutions with those of other methods. In general, we want to stress the increasing importance of certain standard procedures to facilitate future code validations and comparisons of results in the community of numerical seismology.
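    As a rough illustration of the kind of error norm used here, the sketch below computes a single-valued envelope misfit between two seismograms via an FFT-built analytic signal. This is a simplified stand-in: the study's actual time-frequency envelope and phase misfits are more elaborate, and the signal below is synthetic:

```python
import numpy as np

def envelope(x):
    """Signal envelope via the analytic signal, built with an FFT
    (numpy only; equivalent in spirit to a Hilbert-transform envelope)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

def envelope_misfit(ref, test):
    """Relative L2 misfit of envelopes (0 = identical amplitude behavior)."""
    e_ref, e_test = envelope(ref), envelope(test)
    return np.linalg.norm(e_test - e_ref) / np.linalg.norm(e_ref)

# Synthetic decaying sine as a toy "seismogram"
t = np.linspace(0, 1, 512, endpoint=False)
s = np.sin(2 * np.pi * 10 * t) * np.exp(-4 * t)
print(envelope_misfit(s, s))               # 0.0 (identical traces)
print(round(envelope_misfit(s, 2 * s), 6)) # amplitude doubled -> misfit 1.0
```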

  4. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods

    NASA Astrophysics Data System (ADS)

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-01

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).

  5. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods.

    PubMed

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-15

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%). PMID:26774813
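    The recovery and RSE figures quoted above come down to simple arithmetic on predicted versus nominal concentrations. A hedged sketch (the RSE definition below, root-mean-square error relative to the mean nominal value, is one common convention and may differ from the paper's; the numbers are illustrative):

```python
import numpy as np

def mean_recovery(found, nominal):
    """Mean per-sample recovery, in percent."""
    found, nominal = np.asarray(found, float), np.asarray(nominal, float)
    return 100.0 * np.mean(found / nominal)

def rse(found, nominal):
    """Relative standard error: RMS prediction error over mean nominal, %."""
    found, nominal = np.asarray(found, float), np.asarray(nominal, float)
    return 100.0 * np.sqrt(np.mean((found - nominal) ** 2)) / np.mean(nominal)

nominal = [100.0, 200.0, 300.0]  # spiked concentrations (illustrative)
found = [99.0, 201.0, 298.0]     # model predictions (illustrative)
print(round(mean_recovery(found, nominal), 2))  # 99.61
print(round(rse(found, nominal), 2))            # 0.71
```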

  6. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  7. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to reach 30-50 target molecules, which is close to theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 +/- 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials. PMID:11767156
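    The stated dependence of the relative LOQ on genome size follows from simple arithmetic: a fixed absolute detection limit of roughly 30-50 target copies in a 200 ng DNA input translates into a species-specific percentage. A back-of-the-envelope sketch with approximate 1C genome masses (the genome masses are assumed literature values, and the results agree with the paper's 0.02% and 0.7% figures only in order of magnitude):

```python
def relative_loq_percent(copies, genome_pg, input_ng=200.0):
    """Smallest detectable GMO fraction, in percent of species DNA,
    for a given absolute copy-number detection limit."""
    input_pg = input_ng * 1000.0  # 1 ng = 1000 pg
    return 100.0 * copies * genome_pg / input_pg

print(relative_loq_percent(50, 0.5))   # rice, ~0.5 pg per 1C genome (assumed)
print(relative_loq_percent(50, 17.0))  # wheat, ~17 pg per 1C genome (assumed)
```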

  8. Nanoparticle-mediated photothermal effect enables a new method for quantitative biochemical analysis using a thermometer.

    PubMed

    Fu, Guanglei; Sanjay, Sharma T; Dou, Maowei; Li, XiuJun

    2016-03-01

    A new biomolecular quantitation method, nanoparticle-mediated photothermal bioassay, using a common thermometer as the signal reader was developed. Using an immunoassay as a proof of concept, iron oxide nanoparticles (NPs) captured in the sandwich-type assay system were transformed into a near-infrared (NIR) laser-driven photothermal agent, Prussian blue (PB) NPs, which acted as a photothermal probe to convert the assay signal into heat through the photothermal effect, thus allowing sensitive biomolecular quantitation using a thermometer. This is the first report of biomolecular quantitation using a thermometer and also serves as the first attempt to introduce the nanoparticle-mediated photothermal effect for bioassays. PMID:26838516

  9. Quantitative imaging mass spectrometry of renal sulfatides: validation by classical mass spectrometric methods

    PubMed Central

    Marsching, Christian; Jennemann, Richard; Heilig, Raphael; Gröne, Hermann-Josef; Hopf, Carsten; Sandhoff, Roger

    2014-01-01

    Owing to its capability of discriminating subtle mass-altering structural differences such as double bonds or elongated acyl chains, MALDI-based imaging MS (IMS) has emerged as a powerful technique for analysis of lipid distribution in tissue at moderate spatial resolution of about 50 μm. However, it is still unknown if MS1-signals and ion intensity images correlate with the corresponding apparent lipid concentrations. Analyzing renal sulfated glycosphingolipids, sulfatides, we validate for the first time IMS-signal identities using corresponding sulfatide-deficient kidneys. To evaluate the extent of signal quenching effects interfering with lipid quantification, we surgically dissected the three major renal regions (papillae, medulla, and cortex) and systematically compared MALDI IMS of renal sulfatides with quantitative analyses of corresponding lipid extracts by on-target MALDI TOF-MS and by ultra-performance LC-ESI-(triple-quadrupole)tandem MS. Our results demonstrate a generally strong correlation (R2 > 0.9) between the local relative sulfatide signal intensity in MALDI IMS and absolute sulfatide quantities determined by the other two methods. However, high concentrations of sulfatides in the papillae and medulla result in an up to 4-fold signal suppression. In conclusion, our study suggests that MALDI IMS is useful for semi-quantitative dissection of relative local changes of sulfatides and possibly other lipids in tissue. PMID:25274613

  10. Evaluation of methods for quantitating erythrocyte antibodies and description of a new method using horseradish peroxidase-labelled antiglobulin.

    PubMed

    Greenwalt, T J; Steane, E A

    1980-01-01

    Concentrates of anti-D and rabbit anti-human globulin (AHG) prepared by standard elution and ammonium sulphate precipitation methods were labelled with 125I and 131I and horseradish peroxidase (HRP), respectively. The number of anti-D molecules attached per rbc ranged from 12,500 to 21,300 for the various phenotypes studied and the label of D-negative rbc never exceeded 3.4% of these values. With 131I-AHG the uptake by control D-negative cells averaged 13% of the uptake by D-positive cells. It was also found that the average ratio of AHG molecules reacting with each IgG molecule was between 2.6 and 3.3 in free solution regardless of the label but was between 3.4 and 7.5 on rbc and ghosts with 131I-AHG and 3 or lower with HRP-AHG. A colorimetric procedure for quantitating IgG antibodies on rbc ghosts is described using HRP-labelled AHG and o-dianisidine as the hydrogen donor. The method is very sensitive and useful for the detection of coating antibodies but cannot be used for precise quantitation because about 50% of the IgG molecules on rbc are lost in preparing the ghosts. An AutoAnalyzer method for estimating the number of antibodies attached to red cells is briefly described. Direct measurements of numbers of antigen receptors with radiolabelled specific antibodies gave the most reproducible results. Labelled rabbit AHG was not as good because the ratio of AHG to IgG varied. The AutoAnalyzer method may prove useful because of its convenience. PMID:7019023

  11. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer's Disease: Results from the DIAN Study Group.

    PubMed

    Su, Yi; Blazey, Tyler M; Owen, Christopher J; Christensen, Jon J; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C; Ances, Beau M; Snyder, Abraham Z; Cash, Lisa A; Koeppe, Robert A; Klunk, William E; Galasko, Douglas; Brickman, Adam M; McDade, Eric; Ringman, John M; Thompson, Paul M; Saykin, Andrew J; Ghetti, Bernardino; Sperling, Reisa A; Johnson, Keith A; Salloway, Stephen P; Schofield, Peter R; Masters, Colin L; Villemagne, Victor L; Fox, Nick C; Förster, Stefan; Chen, Kewei; Reiman, Eric M; Xiong, Chengjie; Marcus, Daniel S; Weiner, Michael W; Morris, John C; Bateman, Randall J; Benzinger, Tammie L S

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations to the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer's Network (DIAN), an autosomal dominant Alzheimer's disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brain-stem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improves sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in the detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power although it may add cost and time. Several technical variations to amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer's disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959
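    Although the abstract does not spell out the computation, reference-region quantification of amyloid burden typically reduces to a standardized uptake value ratio (SUVR): mean tracer uptake in a target region divided by mean uptake in the reference region (e.g., cerebellar cortex, brainstem, or white matter). A minimal sketch with illustrative voxel values:

```python
import numpy as np

def suvr(target_voxels, reference_voxels):
    """Standardized uptake value ratio: mean target uptake over
    mean reference-region uptake."""
    target = np.asarray(target_voxels, float)
    reference = np.asarray(reference_voxels, float)
    return target.mean() / reference.mean()

cortex = [1.8, 2.1, 2.4, 1.9]     # hypothetical PiB uptake in a cortical ROI
brainstem = [1.0, 1.1, 0.9, 1.0]  # hypothetical reference-region uptake
print(round(suvr(cortex, brainstem), 3))  # 2.05
```

    Partial volume correction, the strategy the study found most consistently beneficial, would adjust these regional means before the ratio is taken.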

  12. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer’s Disease: Results from the DIAN Study Group

    PubMed Central

    Su, Yi; Blazey, Tyler M.; Owen, Christopher J.; Christensen, Jon J.; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C.; Ances, Beau M.; Snyder, Abraham Z.; Cash, Lisa A.; Koeppe, Robert A.; Klunk, William E.; Galasko, Douglas; Brickman, Adam M.; McDade, Eric; Ringman, John M.; Thompson, Paul M.; Saykin, Andrew J.; Ghetti, Bernardino; Sperling, Reisa A.; Johnson, Keith A.; Salloway, Stephen P.; Schofield, Peter R.; Masters, Colin L.; Villemagne, Victor L.; Fox, Nick C.; Förster, Stefan; Chen, Kewei; Reiman, Eric M.; Xiong, Chengjie; Marcus, Daniel S.; Weiner, Michael W.; Morris, John C.; Bateman, Randall J.; Benzinger, Tammie L. S.

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations to the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer’s Network (DIAN), an autosomal dominant Alzheimer’s disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brain-stem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improves sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in the detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power although it may add cost and time. Several technical variations to amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer’s disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959

  13. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran.

    PubMed

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2016-03-01

    Social changes have rapidly removed arranged marriages and it seems the change in marriage pattern has played a role in childbearing. On the other hand, there is a great reduction in population in many countries, which requires a comprehensive policy to manage the considerable drop in population. To achieve this goal, initially, the factors affecting fertility must be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women 15-49 years old, living in the north of Iran, were studied using a cluster sampling strategy. The results showed that there are no significant differences in the reproductive behaviors of the three patterns of marriage in Babol city of Iran. It seems there is a convergence in childbearing across the different patterns of marriage, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning. PMID:26493414

  14. Introducing handheld computing into a residency program: preliminary results from qualitative and quantitative inquiry.

    PubMed Central

    Manning, B.; Gadd, C. S.

    2001-01-01

    Although published reports describe specific handheld computer applications in medical training, we know very little yet about how, and how well, handheld computing fits into the spectrum of information resources available for patient care and physician training. This paper reports preliminary quantitative and qualitative results from an evaluation study designed to track changes in computer usage patterns and computer-related attitudes before and after introduction of handheld computing. Pre-implementation differences between residents' and faculty's usage patterns are interpreted in terms of a "work role" construct. We hypothesize that over time residents and faculty will adopt, adapt, or abandon handheld computing according to how, and how well, this technology supports their successful completion of work role-related tasks. This hypothesis will be tested in the second phase of this pre- and post-implementation study. PMID:11825224

  15. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran

    PubMed Central

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2016-01-01

    Social changes have rapidly removed arranged marriages and it seems the change in marriage pattern has played a role in childbearing. On the other hand, there is a great reduction in population in many countries, which requires a comprehensive policy to manage the considerable drop in population. To achieve this goal, initially, the factors affecting fertility must be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women 15-49 years old, living in the north of Iran, were studied using a cluster sampling strategy. The results showed that there are no significant differences in the reproductive behaviors of the three patterns of marriage in Babol city of Iran. It seems there is a convergence in childbearing across the different patterns of marriage, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning. PMID:26493414

  16. Application of bias correction methods to improve the accuracy of quantitative radar rainfall in Korea

    NASA Astrophysics Data System (ADS)

    Lee, J.-K.; Kim, J.-H.; Suk, M.-K.

    2015-11-01

    There are many potential sources of bias in the radar rainfall estimation process. This study classified the biases arising in the rainfall estimation process into the reflectivity measurement bias and the rainfall estimation bias of the Quantitative Precipitation Estimation (QPE) model, and conducted bias correction methods to improve the accuracy of the Radar-AWS Rainrate (RAR) calculation system operated by the Korea Meteorological Administration (KMA). To correct the reflectivity (Z) biases that occur when measuring rainfall, this study utilized a bias correction algorithm. The concept of this algorithm is that the reflectivity of the target single-pol radars is corrected based on a reference dual-pol radar corrected for hardware and software bias. This study then dealt with two post-processing methods, the Mean Field Bias Correction (MFBC) method and the Local Gauge Correction (LGC) method, to correct the rainfall estimation bias of the QPE model. The Z bias and rainfall estimation bias correction methods were applied to the RAR system. The accuracy of the RAR system was improved after correcting the Z bias. For the rainfall types, without the Z bias correction the accuracy of the Changma front and local torrential cases was only slightly improved, and the accuracy of the typhoon cases in particular was worse than the existing results. As a result of the rainfall estimation bias correction, the Z bias_LGC was especially superior to the MFBC method because different rainfall biases were applied to each grid rainfall amount in the LGC method. For the rainfall types, the results of the Z bias_LGC showed that the rainfall estimates for all types were more accurate than with the Z bias alone and, especially, the outcomes in the typhoon cases were vastly superior to the others.
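    The MFBC step described above can be sketched compactly: a single multiplicative factor, the ratio of summed gauge rainfall to summed radar rainfall at the gauge locations, is applied to the entire radar field (LGC would instead vary the factor per grid cell; only MFBC is sketched). All numbers are illustrative:

```python
import numpy as np

def mean_field_bias(gauge, radar_at_gauges):
    """One multiplicative bias factor: total gauge rain over total
    radar-estimated rain at the same locations."""
    gauge = np.asarray(gauge, float)
    radar = np.asarray(radar_at_gauges, float)
    return gauge.sum() / radar.sum()

def apply_mfbc(radar_field, bias):
    """Apply the single factor to every grid cell of the radar field."""
    return np.asarray(radar_field, float) * bias

gauge = [12.0, 8.0, 20.0]  # mm at three rain gauges
radar = [10.0, 6.0, 16.0]  # mm estimated by radar at the same points
bias = mean_field_bias(gauge, radar)
print(bias)  # 1.25 -> radar underestimates by 20%
corrected = apply_mfbc([[4.0, 8.0], [0.0, 2.0]], bias)
print(corrected.tolist())  # [[5.0, 10.0], [0.0, 2.5]]
```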

  17. Quantitative computed tomography in measurement of vertebral trabecular bone mass. A modified method.

    PubMed

    Nilsson, M; Johnell, O; Jonsson, K; Redlund-Johnell, I

    1988-01-01

    Measurement of bone mineral concentration (BMC) can be done by several modalities. Quantitative computed tomography (QCT) can be used for measurements at different sites and with different types of bone (trabecular-cortical). This study presents a modified method reducing the influence of fat. Determination of BMC was made from measurements with single-energy computed tomography (CT) of the mean Hounsfield number in the trabecular part of the L1 vertebra. The method takes into account the age-dependent composition of the trabecular part of the vertebra. As the amount of intravertebral fat increases with age, the effective atomic number for these parts decreases. This results in a non-linear calibration curve for single-energy CT. Comparison of BMC values using the non-linear calibration curve or the traditional linear calibration with those obtained with a pixel-by-pixel based electron density calculation method (theoretically better) showed results clearly in favor of the non-linear method. The material consisted of 327 patients aged 6 to 91 years, of whom 197 were considered normal. The normal data show a sharp decrease in trabecular bone after the age of 50 in women. In men a slower decrease was found. The vertebrae were larger in men than in women. PMID:3190950

  18. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    SciTech Connect

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision of the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
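A minimal sketch of the image-analysis step described above: threshold a stained-biofilm photograph so that stained (dark) pixels count as fouled, then report the fouled-area fraction. The 8-bit grayscale values and the threshold are illustrative assumptions, not the published algorithm.

```python
def fouled_fraction(image, threshold=100):
    """Fraction of pixels whose intensity falls below the stain threshold."""
    pixels = [p for row in image for p in row]
    fouled = sum(1 for p in pixels if p < threshold)
    return fouled / len(pixels)

image = [
    [240, 230,  40,  35],
    [235,  50,  45, 220],
    [250, 245, 238,  60],
]

print(fouled_fraction(image))  # 5 of the 12 pixels fall below the threshold
```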

  19. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGESBeta

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision of the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  20. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

    Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criterion to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of the resulting landslide susceptibility maps, with particular emphasis on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km²), which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation techniques. The heuristic evaluation of the geomorphic plausibility of the final results was supported by findings of an exploratory data analysis, an estimation of odds ratios, and an evaluation of the spatial structure of the final maps. The results showed that maps generated with different inventories, classifiers and predictors looked different even though holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful for exposing spatially varying inconsistencies in the modelling results, while additionally providing evidence of slightly overfitted machine learning-based models. However, the highest predictive performances were obtained for maps that expressed geomorphically implausible relationships, indicating that the predictive performance of a model can be misleading when a predictor systematically relates to a spatially consistent bias in the inventory. Furthermore, random forest-based maps displayed spatial artifacts. The most plausible susceptibility map of the study area showed smooth prediction surfaces; the underlying model had a high predictive capability and was generated with an accurate landslide inventory and predictors that did not directly describe a bias. However, none of the presented models was found to be completely unbiased. This study showed that high predictive performance cannot be equated with high plausibility and applicability of the resulting landslide susceptibility maps. We suggest that greater emphasis be placed on identifying confounding factors and biases in landslide inventories. A joint discussion in the field between modelers and decision makers of the spatial pattern of the final susceptibility maps might increase their acceptance and applicability.
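A hedged sketch of the AUROC metric used above to validate susceptibility models: the probability that a randomly chosen landslide cell receives a higher predicted susceptibility than a randomly chosen non-landslide cell (the Mann-Whitney formulation, equivalent to the area under the ROC curve). The scores below are made-up toy values, not study data.

```python
def auroc(pos_scores, neg_scores):
    """Pairwise-comparison estimate of the area under the ROC curve."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5   # ties count half
    return wins / (len(pos_scores) * len(neg_scores))

landslide = [0.9, 0.8, 0.75, 0.6]      # scores at observed landslide cells
stable    = [0.7, 0.4, 0.3, 0.2, 0.1]  # scores at non-landslide cells

print(auroc(landslide, stable))  # 19 of 20 pairs correctly ranked: 0.95
```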

  1. A method for the quantitative determination of crystalline phases by X-ray

    NASA Technical Reports Server (NTRS)

    Petzenhauser, I.; Jaeger, P.

    1988-01-01

    A mineral analysis method is described for rapid quantitative determination of crystalline substances in those cases in which the sample is present in pure form or in a mixture of known composition. With this method there is no need for prior chemical analysis.

  2. Comparison of Overlap Methods for Quantitatively Synthesizing Single-Subject Data

    ERIC Educational Resources Information Center

    Wolery, Mark; Busick, Matthew; Reichow, Brian; Barton, Erin E.

    2010-01-01

    Four overlap methods for quantitatively synthesizing single-subject data were compared to visual analysts' judgments. The overlap methods were percentage of nonoverlapping data, pairwise data overlap squared, percentage of data exceeding the median, and percentage of data exceeding a median trend. Visual analysts made judgments about 160 A-B data…

  3. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    ERIC Educational Resources Information Center

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…

  4. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    ERIC Educational Resources Information Center

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  5. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    ERIC Educational Resources Information Center

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a particular…

  6. Student Performance in a Quantitative Methods Course under Online and Face-to-Face Delivery

    ERIC Educational Resources Information Center

    Verhoeven, Penny; Wakeling, Victor

    2011-01-01

    In a study conducted at a large public university, the authors assessed, for an upper-division quantitative methods business core course, the impact of delivery method (online versus face-to-face) on the success rate (percentage of enrolled students earning a grade of A, B, or C in the course). The success rate of the 161 online students was 55.3%,…

  7. Developing Investigative Entry Points: Exploring the Use of Quantitative Methods in English Education Research

    ERIC Educational Resources Information Center

    McGraner, Kristin L.; Robbins, Daniel

    2010-01-01

    Although many research questions in English education demand the use of qualitative methods, this paper will briefly explore how English education researchers and doctoral students may use statistics and quantitative methods to inform, complement, and/or deepen their inquiries. First, the authors will provide a general overview of the survey areas…

  8. Spiked proteomic standard dataset for testing label-free quantitative software and statistical methods.

    PubMed

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Dorssaeler, Alain Van; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-03-01

    This data article describes a controlled, spiked proteomic dataset for which the "ground truth" of variant proteins is known. It is based on the LC-MS analysis of samples composed of a fixed background of yeast lysate and different spiked amounts of the UPS1 mixture of 48 recombinant proteins. It can be used to objectively evaluate bioinformatic pipelines for label-free quantitative analysis, and their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. More specifically, it can be useful for tuning software tool parameters, but also for testing new algorithms for label-free quantitative analysis, or for evaluating downstream statistical methods. The raw MS files can be downloaded from ProteomeXchange with identifier PXD001819. Starting from some raw files of this dataset, we also provide here some processed data obtained through various bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, to exemplify the use of such data in the context of software benchmarking, as discussed in detail in the accompanying manuscript [1]. The experimental design used here for data processing takes advantage of the different spike levels introduced in the samples composing the dataset, and processed data are merged in a single file to facilitate the evaluation and illustration of software tool results for the detection of variant proteins with different absolute expression levels and fold change values. PMID:26862574
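A toy sketch of the downstream task this dataset supports: flagging "variant" proteins by fold change between two spike levels against a stable yeast background. The protein names, intensities, and the 1.5-fold cutoff are invented for illustration, not taken from the benchmark itself.

```python
import math

def variant_proteins(int_a, int_b, min_fold=1.5):
    """Return protein names whose |log2 ratio| between conditions exceeds log2(min_fold)."""
    cutoff = math.log2(min_fold)
    hits = []
    for name in int_a:
        ratio = math.log2(int_b[name] / int_a[name])
        if abs(ratio) > cutoff:
            hits.append(name)
    return hits

cond_a = {"UPS1_P01": 100.0, "yeast_1": 5000.0, "UPS1_P02": 80.0}
cond_b = {"UPS1_P01": 400.0, "yeast_1": 5100.0, "UPS1_P02": 20.0}
print(variant_proteins(cond_a, cond_b))  # only the spiked UPS1 proteins stand out
```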

  9. Validation of a quantitative method for defining CAD/CAM socket modifications.

    PubMed

    Lemaire, E D; Bexiga, P; Johnson, F; Solomonidis, S E; Paul, J P

    1999-04-01

    A quantitative method was developed for defining manual socket modifications, averaging these modifications over a series of amputees, and using the average modifications as a template in commercial CAD/CAM systems. The CADVIEW programme (i.e. software for viewing and analysing CAD sockets) was rewritten to provide comparison functions for aligning sockets to a common axis, visualising the differences between sockets, generating modification outlines, assigning apex point values, and averaging the modification outlines. A CAD template generated in this manner should be the best general representation of a prosthetist's modification style. To test this hypothesis, 13 people with trans-tibial amputations were fitted with both a manual and a CAD/CAM socket. Questionnaires were completed by the subjects and by the prosthetist to obtain information on prosthetic comfort, function, and overall satisfaction. Ground reaction force and stride parameter data were also collected for each prosthesis during gait laboratory testing. No significant differences were found between the manually designed socket and the CAD/CAM designed socket for all data except the vertical peak forces on the amputated side. These results support the clinical application of this quantitative technique for making the transition from manual to CAD/CAM prosthetic modification procedures. PMID:10355641

  10. Spiked proteomic standard dataset for testing label-free quantitative software and statistical methods

    PubMed Central

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Dorssaeler, Alain Van; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2015-01-01

    This data article describes a controlled, spiked proteomic dataset for which the “ground truth” of variant proteins is known. It is based on the LC-MS analysis of samples composed of a fixed background of yeast lysate and different spiked amounts of the UPS1 mixture of 48 recombinant proteins. It can be used to objectively evaluate bioinformatic pipelines for label-free quantitative analysis, and their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. More specifically, it can be useful for tuning software tool parameters, but also for testing new algorithms for label-free quantitative analysis, or for evaluating downstream statistical methods. The raw MS files can be downloaded from ProteomeXchange with identifier PXD001819. Starting from some raw files of this dataset, we also provide here some processed data obtained through various bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, to exemplify the use of such data in the context of software benchmarking, as discussed in detail in the accompanying manuscript [1]. The experimental design used here for data processing takes advantage of the different spike levels introduced in the samples composing the dataset, and processed data are merged in a single file to facilitate the evaluation and illustration of software tool results for the detection of variant proteins with different absolute expression levels and fold change values. PMID:26862574

  11. Comparative Application of PLS and PCR Methods to Simultaneous Quantitative Estimation and Simultaneous Dissolution Test of Zidovudine - Lamivudine Tablets.

    PubMed

    Üstündağ, Özgür; Dinç, Erdal; Özdemir, Nurten; Tilkan, M Günseli

    2015-01-01

    In the development of new drug products and generic drug products, the simultaneous in-vitro dissolution behavior of oral dosage formulations is the most important indicator for the quantitative estimation of the efficiency and biopharmaceutical characteristics of drug substances. This drives scientists in the field to develop more powerful analytical methods that yield more reliable, precise and accurate results in the quantitative analysis and dissolution testing of drug formulations. In this context, two chemometric tools, partial least squares (PLS) and principal component regression (PCR), were developed for the simultaneous quantitative estimation and dissolution testing of zidovudine (ZID) and lamivudine (LAM) in a tablet dosage form. The results obtained in this study strongly encourage their use for quality control, routine analysis and dissolution testing of marketed tablets containing the ZID and LAM drugs. PMID:26085428

  12. An effective method for the quantitative detection of porcine endogenous retrovirus in pig tissues.

    PubMed

    Zhang, Peng; Yu, Ping; Wang, Wei; Zhang, Li; Li, Shengfu; Bu, Hong

    2010-05-01

    Xenotransplantation shows great promise for providing a virtually limitless supply of cells, tissues, and organs for a variety of therapeutic procedures. However, the potential of porcine endogenous retrovirus (PERV) as a human-tropic pathogen, particularly as a public health risk, is a major concern for xenotransplantation. This study focused on measuring PERV copy numbers in various tissues and organs of the Banna Minipig Inbred (BMI) line from 2006 to 2007 at West China Hospital, Sichuan University. Real-time quantitative polymerase chain reaction (SYBR Green I) was performed. The results showed that the pol gene had the highest copy number across tissues compared with gag, envA, and envB. Our experiment offers a rapid and accurate method for determining copy numbers in various tissues and is especially suitable for the selection of tissues or organs for future clinical xenotransplantation. PMID:20108128
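A sketch of absolute quantification by a real-time qPCR standard curve, the general approach behind copy-number measurements like those above: the curve Ct = slope * log10(copies) + intercept, fitted on dilutions of a known standard, is inverted for an unknown sample. The slope (-3.32 corresponds to ~100% amplification efficiency) and intercept values are illustrative assumptions.

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert the standard curve Ct = slope * log10(N) + intercept to get copies N."""
    return 10 ** ((ct - intercept) / slope)

print(copies_from_ct(31.36))  # ~100 copies for this assumed curve
```

Earlier Ct values map to exponentially more template, so a sample crossing threshold ~3.32 cycles sooner carries roughly tenfold more copies.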

  13. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

    To develop image processing methods that are widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron- and nano-scale information characteristic of each rock type and its history. These could be used for mineral identification and for studies of rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.
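The RMS roughness parameter Rq reported above is the standard deviation of surface heights about the mean plane. A one-line sketch over a height profile; the heights (in micrometres) are made-up values.

```python
import math

def rq(heights):
    """RMS roughness: root of the mean squared deviation from the mean height."""
    mean = sum(heights) / len(heights)
    return math.sqrt(sum((h - mean) ** 2 for h in heights) / len(heights))

profile = [1.0, 3.0, 1.0, 3.0]  # toy height profile (um), mean = 2.0
print(rq(profile))              # every point deviates by 1.0, so Rq = 1.0
```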

  14. Quantitative methods for reconstructing tissue biomechanical properties in optical coherence elastography: a comparison study

    NASA Astrophysics Data System (ADS)

    Han, Zhaolong; Li, Jiasong; Singh, Manmohan; Wu, Chen; Liu, Chih-hao; Wang, Shang; Idugboe, Rita; Raghunathan, Raksha; Sudheendran, Narendran; Aglyamov, Salavat R.; Twa, Michael D.; Larin, Kirill V.

    2015-05-01

    We present a systematic analysis of the accuracy of five different methods for extracting the biomechanical properties of soft samples using optical coherence elastography (OCE). OCE is an emerging noninvasive technique, which allows assessment of biomechanical properties of tissues with micrometer spatial resolution. However, in order to accurately extract biomechanical properties from OCE measurements, application of a proper mechanical model is required. In this study, we utilize tissue-mimicking phantoms with controlled elastic properties and investigate the feasibilities of four available methods for reconstructing elasticity (Young’s modulus) based on OCE measurements of an air-pulse induced elastic wave. The approaches are based on the shear wave equation (SWE), the surface wave equation (SuWE), the Rayleigh-Lamb frequency equation (RLFE), and the finite element method (FEM). Elasticity values were compared with uniaxial mechanical testing. The results show that the RLFE and the FEM are more robust in quantitatively assessing elasticity than the other simplified models. This study provides a foundation and reference for reconstructing the biomechanical properties of tissues from OCE data, which is important for the further development of noninvasive elastography methods.
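A minimal instance of the shear wave equation (SWE) model named above: for a nearly incompressible soft material (Poisson's ratio ~0.5), Young's modulus follows from the elastic wave speed as E = 3 * rho * c^2. The density and wave speed values are illustrative, not measurements from the study.

```python
def young_modulus_swe(density_kg_m3, wave_speed_m_s):
    """E = 3 * rho * c^2 for an incompressible medium, in pascals."""
    return 3.0 * density_kg_m3 * wave_speed_m_s ** 2

E = young_modulus_swe(1000.0, 2.0)  # 1000 kg/m^3 phantom, 2 m/s elastic wave
print(E / 1000.0, "kPa")            # 12.0 kPa
```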

  15. Quantitative methods for reconstructing tissue biomechanical properties in optical coherence elastography: a comparison study.

    PubMed

    Han, Zhaolong; Li, Jiasong; Singh, Manmohan; Wu, Chen; Liu, Chih-hao; Wang, Shang; Idugboe, Rita; Raghunathan, Raksha; Sudheendran, Narendran; Aglyamov, Salavat R; Twa, Michael D; Larin, Kirill V

    2015-05-01

    We present a systematic analysis of the accuracy of five different methods for extracting the biomechanical properties of soft samples using optical coherence elastography (OCE). OCE is an emerging noninvasive technique, which allows assessment of biomechanical properties of tissues with micrometer spatial resolution. However, in order to accurately extract biomechanical properties from OCE measurements, application of a proper mechanical model is required. In this study, we utilize tissue-mimicking phantoms with controlled elastic properties and investigate the feasibilities of four available methods for reconstructing elasticity (Young's modulus) based on OCE measurements of an air-pulse induced elastic wave. The approaches are based on the shear wave equation (SWE), the surface wave equation (SuWE), the Rayleigh-Lamb frequency equation (RLFE), and the finite element method (FEM). Elasticity values were compared with uniaxial mechanical testing. The results show that the RLFE and the FEM are more robust in quantitatively assessing elasticity than the other simplified models. This study provides a foundation and reference for reconstructing the biomechanical properties of tissues from OCE data, which is important for the further development of noninvasive elastography methods. PMID:25860076

  16. Quantitative methods for reconstructing tissue biomechanical properties in optical coherence elastography: a comparison study

    PubMed Central

    Han, Zhaolong; Li, Jiasong; Singh, Manmohan; Wu, Chen; Liu, Chih-hao; Wang, Shang; Idugboe, Rita; Raghunathan, Raksha; Sudheendran, Narendran; Aglyamov, Salavat R.; Twa, Michael D.; Larin, Kirill V.

    2015-01-01

    We present a systematic analysis of the accuracy of five different methods for extracting the biomechanical properties of soft samples using optical coherence elastography (OCE). OCE is an emerging noninvasive technique, which allows assessing biomechanical properties of tissues with a micrometer spatial resolution. However, in order to accurately extract biomechanical properties from OCE measurements, application of a proper mechanical model is required. In this study, we utilize tissue-mimicking phantoms with controlled elastic properties and investigate the feasibilities of four available methods for reconstructing elasticity (Young’s modulus) based on OCE measurements of an air-pulse induced elastic wave. The approaches are based on the shear wave equation (SWE), the surface wave equation (SuWE), the Rayleigh-Lamb frequency equation (RLFE), and the finite element method (FEM). Elasticity values were compared with uniaxial mechanical testing. The results show that the RLFE and the FEM are more robust in quantitatively assessing elasticity than the other simplified models. This study provides a foundation and reference for reconstructing the biomechanical properties of tissues from OCE data, which is important for the further development of noninvasive elastography methods. PMID:25860076

  17. Intracranial aneurysm segmentation in 3D CT angiography: method and quantitative validation

    NASA Astrophysics Data System (ADS)

    Firouzian, Azadeh; Manniesing, R.; Flach, Z. H.; Risselada, R.; van Kooten, F.; Sturkenboom, M. C. J. M.; van der Lugt, A.; Niessen, W. J.

    2010-03-01

    Accurately quantifying aneurysm shape parameters is of clinical importance, as it is an important factor in choosing the right treatment modality (i.e. coiling or clipping), in predicting rupture risk and operative risk and for pre-surgical planning. The first step in aneurysm quantification is to segment it from other structures that are present in the image. As manual segmentation is a tedious procedure and prone to inter- and intra-observer variability, there is a need for an automated method which is accurate and reproducible. In this paper a novel semi-automated method for segmenting aneurysms in Computed Tomography Angiography (CTA) data based on Geodesic Active Contours is presented and quantitatively evaluated. Three different image features are used to steer the level set to the boundary of the aneurysm, namely intensity, gradient magnitude and variance in intensity. The method requires minimum user interaction, i.e. clicking a single seed point inside the aneurysm which is used to estimate the vessel intensity distribution and to initialize the level set. The results show that the developed method is reproducible, and performs in the range of interobserver variability in terms of accuracy.

  18. Quantitative analysis and efficiency study of PSD methods for a LaBr3:Ce detector

    NASA Astrophysics Data System (ADS)

    Zeng, Ming; Cang, Jirong; Zeng, Zhi; Yue, Xiaoguang; Cheng, Jianping; Liu, Yinong; Ma, Hao; Li, Junli

    2016-03-01

    The LaBr3:Ce scintillator has been widely studied for nuclear spectroscopy because of its optimal energy resolution (<3% @ 662 keV) and time resolution (~300 ps). Despite these promising properties, the intrinsic radiation background of LaBr3:Ce is a critical issue, and pulse shape discrimination (PSD) has been shown to be an efficient potential method to suppress the alpha background from 227Ac. In this paper, the charge comparison method (CCM) for alpha and gamma discrimination in LaBr3:Ce is quantitatively analysed and compared with two other typical PSD methods using digital pulse processing. The algorithm parameters and discrimination efficiency are calculated for each method. Moreover, for the CCM, the correlation between the CCM feature value distribution and the total charge (energy) is studied, and a fitting equation for the correlation is inferred and experimentally verified. Using the equations, an energy-dependent threshold can be chosen to optimize the discrimination efficiency. Additionally, the experimental results show a potential application in low-activity high-energy γ measurement by suppressing the alpha background.
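A sketch of the charge comparison method (CCM) analysed above: the PSD feature is the tail-to-total charge ratio of a digitized pulse, which separates alpha from gamma events because their scintillation decay shapes differ. The sample values and tail-start index below are illustrative assumptions, not the paper's parameters.

```python
def ccm_feature(samples, tail_start):
    """Tail charge over total charge for one baseline-subtracted digitized pulse."""
    total = sum(samples)
    tail = sum(samples[tail_start:])
    return tail / total

pulse = [0, 2, 10, 30, 18, 9, 5, 3, 2, 1]   # toy ADC samples after baseline subtraction
print(ccm_feature(pulse, tail_start=4))     # 38 of 80 charge units in the tail: 0.475
```

In practice the discrimination threshold on this feature would be made energy-dependent, as the abstract describes, since the feature distribution shifts with total charge.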

  19. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    PubMed

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with results confirmed by GC-MS. The method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness and repeatability (precision) were assessed. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% for green beans and from 81.99% to 107.85% for peas. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% for green beans and from 80.77% to 100.91% for peas. According to these results, the method has proven efficient for the extraction and determination of azoxystrobin residues in green beans and peas. PMID:25842334

  20. Quantitative analysis of collagen change between normal and cancerous thyroid tissues based on SHG method

    NASA Astrophysics Data System (ADS)

    Chen, Xiwen; Huang, Zufang; Xi, Gangqin; Chen, Yongjian; Lin, Duo; Wang, Jing; Li, Zuanfang; Sun, Liqing; Chen, Jianxin; Chen, Rong

    2012-03-01

    Second-harmonic generation (SHG) has proven to be a method with high spatial resolution, large penetration depth and no photobleaching. In our study, the SHG method was used to investigate normal and cancerous thyroid tissue. For SHG imaging, system parameters were adjusted to acquire high-contrast images. Each x-y image was recorded in pseudo-color, matching the wavelength range in the visible spectrum. The acquisition time for a 512×512-pixel image was 1.57 sec; each acquired image was averaged over four frames to improve the signal-to-noise ratio. Our results indicated that collagen presence, determined as the ratio of SHG pixels to the whole pixels, was 0.48+/-0.05 for normal and 0.33+/-0.06 for cancerous thyroid tissue. In addition, to quantitatively assess collagen-related changes, we applied GLCM texture analysis to the SHG images. The corresponding results showed that the correlation fell off with distance in both the normal and cancerous groups. The calculated value of Corr50 (the distance where the correlation crossed 50% of the initial correlation) showed a significant difference between the groups. This study demonstrates that the SHG method can be used as a complementary tool in thyroid histopathology.
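A sketch of the Corr50 texture statistic used above: the distance at which the GLCM correlation falls to 50% of its initial value, found here by linear interpolation over a toy correlation-vs-distance curve. The curve values are invented for illustration.

```python
def corr50(distances, corrs):
    """Distance where correlation first crosses half its initial value, or None."""
    half = 0.5 * corrs[0]
    for i in range(1, len(corrs)):
        if corrs[i] <= half:
            # linear interpolation between the bracketing points
            d0, d1 = distances[i - 1], distances[i]
            c0, c1 = corrs[i - 1], corrs[i]
            return d0 + (half - c0) * (d1 - d0) / (c1 - c0)
    return None  # correlation never fell below half over the measured range

dist = [1, 2, 3, 4, 5]                # pixel offsets
corr = [0.9, 0.7, 0.5, 0.3, 0.2]      # GLCM correlation falls off with distance
print(corr50(dist, corr))             # crosses 0.45 between d=3 and d=4: 3.25
```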

  2. Comparison of Concentration Methods for Quantitative Detection of Sewage-Associated Viral Markers in Environmental Waters

    PubMed Central

    Harwood, V. J.; Gyawali, P.; Sidhu, J. P. S.; Toze, S.

    2015-01-01

    Pathogenic human viruses cause over half of gastroenteritis cases associated with recreational water use worldwide. They are relatively difficult to concentrate from environmental waters due to typically low concentrations and their small size. Although rapid enumeration of viruses by quantitative PCR (qPCR) has the potential to greatly improve water quality analysis and risk assessment, the upstream steps of capturing and recovering viruses from environmental water sources, along with removing PCR inhibitors from extracted nucleic acids, remain formidable barriers to routine use. Here, we compared the efficiency of virus recovery for three rapid methods of concentrating two microbial source tracking (MST) viral markers, human adenoviruses (HAdVs) and polyomaviruses (HPyVs), from one-liter tap water and river water samples on HA membranes (90 mm in diameter). Samples were spiked with raw sewage, and viral adsorption to membranes was promoted by acidification (method A) or addition of MgCl2 (methods B and C). Viral nucleic acid was extracted directly from membranes (method A), or viruses were eluted with NaOH and concentrated by centrifugal ultrafiltration (methods B and C). No inhibition of qPCR was observed for samples processed by method A, but inhibition occurred in river samples processed by methods B and C. Recovery efficiencies of HAdVs and HPyVs were ∼10-fold greater for method A (31 to 78%) than for methods B and C (2.4 to 12%). Further analysis of membranes from method B revealed that the majority of viruses were not eluted from the membrane, resulting in poor recovery. Modifying the originally published method A to include a larger-diameter membrane and a nucleic acid extraction kit that could accommodate the membrane resulted in a rapid virus concentration method with good recovery and a lack of inhibitory compounds. The frequently used strategy of viral adsorption with added cations (Mg2+) and elution with acid was inefficient and more prone to inhibition, and will result in underestimation of the prevalence and concentrations of HAdV and HPyV markers in environmental waters. PMID:25576614
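The recovery efficiencies quoted above reduce to a simple ratio; a minimal sketch, assuming recovery is computed as qPCR-measured copies in the final concentrate over copies seeded with the sewage spike (the copy numbers below are invented for illustration):

```python
# Hedged sketch: percent recovery of a spiked viral marker as
# (copies measured in the concentrate) / (copies seeded) x 100.
# The copy numbers are invented, not data from the study.

def percent_recovery(copies_recovered, copies_seeded):
    """Recovery efficiency of a spiked viral marker, in percent."""
    return 100.0 * copies_recovered / copies_seeded

print(percent_recovery(3.1e4, 1.0e5))  # 31.0, the low end reported for method A
```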

  3. Comparison of concentration methods for quantitative detection of sewage-associated viral markers in environmental waters.

    PubMed

    Ahmed, W; Harwood, V J; Gyawali, P; Sidhu, J P S; Toze, S

    2015-03-01

    Pathogenic human viruses cause over half of gastroenteritis cases associated with recreational water use worldwide. They are relatively difficult to concentrate from environmental waters due to typically low concentrations and their small size. Although rapid enumeration of viruses by quantitative PCR (qPCR) has the potential to greatly improve water quality analysis and risk assessment, the upstream steps of capturing and recovering viruses from environmental water sources, along with removing PCR inhibitors from extracted nucleic acids, remain formidable barriers to routine use. Here, we compared the efficiency of virus recovery for three rapid methods of concentrating two microbial source tracking (MST) viral markers, human adenoviruses (HAdVs) and polyomaviruses (HPyVs), from one-liter tap water and river water samples on HA membranes (90 mm in diameter). Samples were spiked with raw sewage, and viral adsorption to membranes was promoted by acidification (method A) or addition of MgCl2 (methods B and C). Viral nucleic acid was extracted directly from membranes (method A), or viruses were eluted with NaOH and concentrated by centrifugal ultrafiltration (methods B and C). No inhibition of qPCR was observed for samples processed by method A, but inhibition occurred in river samples processed by methods B and C. Recovery efficiencies of HAdVs and HPyVs were ∼10-fold greater for method A (31 to 78%) than for methods B and C (2.4 to 12%). Further analysis of membranes from method B revealed that the majority of viruses were not eluted from the membrane, resulting in poor recovery. Modifying the originally published method A to include a larger-diameter membrane and a nucleic acid extraction kit that could accommodate the membrane resulted in a rapid virus concentration method with good recovery and a lack of inhibitory compounds. The frequently used strategy of viral adsorption with added cations (Mg2+) and elution with acid was inefficient and more prone to inhibition, and will result in underestimation of the prevalence and concentrations of HAdV and HPyV markers in environmental waters. PMID:25576614

  4. Rapid method for protein quantitation by Bradford assay after elimination of the interference of polysorbate 80.

    PubMed

    Cheng, Yongfeng; Wei, Haiming; Sun, Rui; Tian, Zhigang; Zheng, Xiaodong

    2016-02-01

    The Bradford assay is one of the most common methods for measuring protein concentrations. However, some pharmaceutical excipients, such as detergents, interfere with the Bradford assay even at low concentrations. Protein precipitation can be used to overcome sample incompatibility with protein quantitation, but the rate of protein recovery after acetone precipitation is only about 70%. In this study, we found that sucrose not only increased the rate of protein recovery after a 1 h acetone precipitation but also did not interfere with the Bradford assay. We therefore developed a method for rapid protein quantitation in protein drugs, even when they contain interfering substances. PMID:26545323
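As background to the quantitation step, a Bradford measurement reads protein concentration off a standard curve; a minimal sketch of that conversion (the standards and absorbances below are idealized, invented values, not data from the paper):

```python
# Minimal sketch of standard-curve quantitation: fit absorbance vs.
# concentration of protein standards by least squares, then invert the
# line for an unknown sample. Values are invented and idealized.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return slope, mean_y - slope * mean_x

standards = [0.0, 0.25, 0.5, 1.0]      # mg/ml protein standard
absorbance = [0.00, 0.12, 0.24, 0.48]  # A595 (idealized, perfectly linear)
m, b = fit_line(standards, absorbance)

unknown_a595 = 0.30
print((unknown_a595 - b) / m)  # concentration of the unknown, 0.625 mg/ml
```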

  5. Evaluation of quantitative methods for the determination of polyphenols in algal extracts.

    PubMed

    Parys, Sabine; Rosenbaum, Anne; Kehraus, Stefan; Reher, Gerrit; Glombitza, Karl-Werner; König, Gabriele M

    2007-12-01

    Marine brown algae such as Ascophyllum nodosum and Fucus vesiculosus accumulate polyphenols composed of phloroglucinol units. These compounds are of ecological importance and, due to their antioxidative activity, of pharmacological value as well. In this study four methods for the quantitative determination of phlorotannins are compared: spectrophotometric determinations using Folin-Ciocalteu's phenol reagent or 2,4-dimethoxybenzaldehyde (DMBA), quantitative (1)H NMR spectroscopy (qHNMR), and gravimetrical measurements. On the basis of the relative standard deviation and the F-test, the determination using Folin-Ciocalteu's phenol reagent and qHNMR proved to be the most reliable and precise methods. PMID:18052031

  6. Mixed methods in gerontological research: Do the qualitative and quantitative data “touch”?

    PubMed Central

    Happ, Mary Beth

    2010-01-01

    This paper distinguishes between parallel and integrated mixed methods research approaches. Barriers to integrated mixed methods approaches in gerontological research are discussed and critiqued. The author presents examples of mixed methods gerontological research to illustrate approaches to data integration at the levels of data analysis, interpretation, and research reporting. As a summary of the methodological literature, four basic levels of mixed methods data combination are proposed. Opportunities for mixing qualitative and quantitative data are explored using contemporary examples from published studies. Data transformation and visual display, judiciously applied, are proposed as pathways to fuller mixed methods data integration and analysis. Finally, practical strategies for mixing qualitative and quantitative data types are explicated as gerontological research moves beyond parallel mixed methods approaches to achieve data integration. PMID:20077973

  7. A rapid, sensitive, and selective method for quantitation of lamprey migratory pheromones in river water.

    PubMed

    Stewart, Michael; Baker, Cindy F; Cooney, Terry

    2011-11-01

    The methodology of using fish pheromones, or chemical signatures, as a tool to monitor or manage species of fish is rapidly gaining popularity. Unequivocal detection and accurate quantitation of extremely low concentrations of these chemicals in natural waters is paramount to using this technique as a management tool. Various species of lamprey are known to produce a mixture of three important migratory pheromones: petromyzonol sulfate (PS), petromyzonamine disulfate (PADS), and petromyzosterol disulfate (PSDS), but presently there are no established robust methods for quantitation of all three pheromones. In this study, we report a new, highly sensitive and selective method for the rapid identification and quantitation of these pheromones in river water samples. The procedure is based on pre-concentration, followed by liquid chromatography/tandem mass spectrometry (LC/MS/MS) analysis. The method is fast, with unambiguous pheromone determination. Practical quantitation limits of 0.25 ng/l were achieved for PS and PADS and 2.5 ng/l for PSDS in river water, using a 200-fold pre-concentration; however, lower quantitation limits can be achieved with greater pre-concentration. The methodology can be modified easily to include other chemicals of interest. Furthermore, the pre-concentration step can be applied easily in the field, circumventing potential stability issues of these chemicals. PMID:22076684
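The relation between pre-concentration factor and practical quantitation limit stated above is a simple proportionality; a back-of-envelope sketch (the 50 ng/l instrument-level limit is an assumed figure, chosen only so that 200-fold pre-concentration reproduces the reported 0.25 ng/l):

```python
# Back-of-envelope sketch: the in-river quantitation limit is the
# instrument-level limit divided by the pre-concentration factor.
# The 50 ng/l instrument-level figure is an assumption for illustration.

def quantitation_limit(instrument_limit_ng_l, preconcentration_factor):
    """Practical quantitation limit in the original water sample."""
    return instrument_limit_ng_l / preconcentration_factor

print(quantitation_limit(50.0, 200))   # 0.25 ng/l, as reported for PS and PADS
print(quantitation_limit(50.0, 1000))  # 0.05 ng/l with greater pre-concentration
```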

  8. Simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and dead reckoning

    NASA Astrophysics Data System (ADS)

    Davey, Neil S.; Godil, Haris

    2013-05-01

    This article presents a comparative study between a well-known SLAM (Simultaneous Localization and Mapping) algorithm, called Gmapping, and a standard Dead-Reckoning algorithm; the study is based on experimental results of both approaches by using a commercial skid-based turning robot, P3DX. Five main base-case scenarios are conducted to evaluate and test the effectiveness of both algorithms. The results show that SLAM outperformed the Dead Reckoning in terms of map-making accuracy in all scenarios but one, since SLAM did not work well in a rapidly changing environment. Although the main conclusion about the excellence of SLAM is not surprising, the presented test method is valuable to professionals working in this area of mobile robots, as it is highly practical, and provides solid and valuable results. The novelty of this study lies in its simplicity. The simple but novel test method for quantitatively comparing robot mapping algorithms using SLAM and Dead Reckoning and some applications using autonomous robots are being patented by the authors in U.S. Patent Application Nos. 13/400,726 and 13/584,862.
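The comparison above rests on the fact that dead reckoning integrates odometry increments open-loop, so errors accumulate; a minimal sketch of that integration (not the study's implementation; the square-path numbers are invented):

```python
# Minimal dead-reckoning sketch, not the study's code: integrate
# odometry increments (distance travelled, heading change) into a 2-D
# pose. Open-loop integration like this accumulates drift, which is
# what SLAM corrects using map observations.

import math

def dead_reckon(pose, steps):
    """pose = (x, y, theta); steps = [(distance, dtheta), ...]."""
    x, y, theta = pose
    for distance, dtheta in steps:
        theta += dtheta
        x += distance * math.cos(theta)
        y += distance * math.sin(theta)
    return x, y, theta

# Drive a 1 m square: four 1 m legs, each preceded by a 90-degree left turn.
x, y, theta = dead_reckon((0.0, 0.0, -math.pi / 2),
                          [(1.0, math.pi / 2)] * 4)
print(round(x, 6), round(y, 6))  # returns to the origin when odometry is perfect
```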

  9. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-03-23

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed. PMID:26928571

  10. Use of fundamental parameters method for quantitative analysis of spectra acquired on spectrometer with Kumakhov lens

    NASA Astrophysics Data System (ADS)

    Afanasiev, I. B.; Danichev, V. V.; Ivanov, V. F.; Kondratenko, R. I.; Mikhin, V. A.

    2005-07-01

    A method has been developed for reconstructing a continuous x-ray spectrum by processing spectrometric information obtained from x-ray scattering on light targets. The reconstruction model takes into account the coherent and Compton components of the scattered radiation as well as detector resolution and efficiency. The suggested method is universal: it permits reconstruction of the actual shape of the spectrum falling onto the sample. It should be noted that the initial shape of the x-ray spectrum emitted by the anode of the x-ray tube is significantly distorted by various filters and collimators (including x-ray lenses) and by scattering processes in the media between the anode and the sample. A number of examples are given in which x-ray spectra were reconstructed for different spectrometer configurations. The reconstructed x-ray spectra are then used as input for quantitative XRF analysis of samples by the method of fundamental parameters (MFP). The developed calculation code implements the MFP version in the original Sherman interpretation [6]. As input, both the absolute intensities of the base lines of characteristic radiation (in the case of 100% rating) and relative intensities normalized to the base-line values of a "pure element" are used. A procedure is given for calculating the intensities of the "pure element" base lines from the analysis of samples with known chemical composition. Intensities of the base lines of characteristic radiation are determined by applying a deconvolution procedure based on the least-squares method, using Gaussian distributions as basis functions for the characteristic radiation lines and a piecewise-linear approximation for the background. The efficiency and universal nature of this comprehensive method is supported by the results of quantitative XRF analysis obtained for a number of samples with known chemical composition using different types of spectrometers.

  11. Meta-analysis of results from quantitative trait loci mapping studies on pig chromosome 4.

    PubMed

    Silva, K M; Bastiaansen, J W M; Knol, E F; Merks, J W M; Lopes, P S; Guimarães, S E F; van Arendonk, J A M

    2011-06-01

    Meta-analysis of results from multiple studies could lead to more precise quantitative trait loci (QTL) position estimates compared to the individual experiments. As the raw data from many different studies are not readily available, the use of results from published articles may be helpful. In this study, we performed a meta-analysis of QTL on chromosome 4 in pig, using data from 25 separate experiments. First, a meta-analysis was performed for individual traits: average daily gain and backfat thickness. Second, a meta-analysis was performed for the QTL of three traits affecting loin yield: loin eye area, carcass length and loin meat weight. Third, 78 QTL were selected from 20 traits that could be assigned to one of three broad categories: carcass, fatness or growth traits. For each analysis, the number of identified meta-QTL was smaller than the number of initial QTL. The reduction in the number of QTL ranged from 71% to 86% compared to the total number before the meta-analysis. In addition, the meta-analysis reduced the QTL confidence intervals by as much as 85% compared to individual QTL estimates. The reduction in the confidence interval was greater when a large number of independent QTL was included in the meta-analysis. Meta-QTL related to growth and fatness were found in the same region as the FAT1 region. Results indicate that the meta-analysis is an efficient strategy to estimate the number and refine the positions of QTL when QTL estimates are available from multiple populations and experiments. This strategy can be used to better target further studies such as the selection of candidate genes related to trait variation. PMID:21198696
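The abstract does not give the pooling formula; a common sketch of the core idea, under the assumption of inverse-variance weighting of per-study QTL position estimates (the positions in cM and variances below are invented):

```python
# Hedged sketch of the core meta-analysis idea, not the authors' model:
# pool QTL position estimates from independent experiments by
# inverse-variance weighting. The pooled variance is smaller than any
# single study's, which is why confidence intervals shrink.

def combine(estimates):
    """estimates = [(position_cM, variance), ...] -> (pooled position, pooled variance)."""
    weights = [1.0 / variance for _, variance in estimates]
    pooled = sum(w * p for (p, _), w in zip(estimates, weights)) / sum(weights)
    return pooled, 1.0 / sum(weights)

pooled, variance = combine([(62.0, 16.0), (66.0, 16.0), (64.0, 8.0)])
print(pooled, variance)  # 64.0 4.0: pooled variance well below the best single study
```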

  12. Quantitative comparison of reconstruction methods for intra-voxel fiber recovery from diffusion MRI.

    PubMed

    Daducci, Alessandro; Canales-Rodríguez, Erick Jorge; Descoteaux, Maxime; Garyfallidis, Eleftherios; Gur, Yaniv; Lin, Ying-Chia; Mani, Merry; Merlet, Sylvain; Paquette, Michael; Ramirez-Manzanares, Alonso; Reisert, Marco; Reis Rodrigues, Paulo; Sepehrband, Farshid; Caruyer, Emmanuel; Choupan, Jeiran; Deriche, Rachid; Jacob, Mathews; Menegaz, Gloria; Prčkovska, Vesna; Rivera, Mariano; Wiaux, Yves; Thiran, Jean-Philippe

    2014-02-01

    Validation is arguably the bottleneck in the diffusion magnetic resonance imaging (MRI) community. This paper evaluates and compares 20 algorithms for recovering the local intra-voxel fiber structure from diffusion MRI data and is based on the results of the "HARDI reconstruction challenge" organized in the context of the "ISBI 2012" conference. Evaluated methods encompass a mixture of classical techniques well known in the literature such as diffusion tensor, Q-Ball and diffusion spectrum imaging, algorithms inspired by the recent theory of compressed sensing and also brand new approaches proposed for the first time at this contest. To quantitatively compare the methods under controlled conditions, two datasets with known ground-truth were synthetically generated and two main criteria were used to evaluate the quality of the reconstructions in every voxel: correct assessment of the number of fiber populations and angular accuracy in their orientation. This comparative study investigates the behavior of every algorithm with varying experimental conditions and highlights strengths and weaknesses of each approach. This information can be useful not only for enhancing current algorithms and develop the next generation of reconstruction methods, but also to assist physicians in the choice of the most adequate technique for their studies. PMID:24132007
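One of the two evaluation criteria, angular accuracy, can be sketched as the angle between an estimated fiber direction and the ground truth (an illustrative implementation, not the challenge's scoring code; the vectors are invented):

```python
# Illustrative sketch of the angular-accuracy criterion: the angle
# between an estimated fiber direction and the ground truth, folding
# out the sign ambiguity of fiber orientations (v and -v are the same
# axis). Not the challenge's actual scoring code.

import math

def angular_error_deg(u, v):
    """Angle in degrees between two fiber orientations (3-vectors)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(min(1.0, abs(dot) / (norm_u * norm_v))))

print(angular_error_deg((1, 0, 0), (-1, 0, 0)))           # 0.0: same axis, flipped sign
print(round(angular_error_deg((1, 0, 0), (1, 1, 0)), 1))  # 45.0
```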

  13. Quantitative evaluation of linear and nonlinear methods characterizing interdependencies between brain signals

    NASA Astrophysics Data System (ADS)

    Ansari-Asl, Karim; Senhadji, Lotfi; Bellanger, Jean-Jacques; Wendling, Fabrice

    2006-09-01

    Brain functional connectivity can be characterized by the temporal evolution of correlation between signals recorded from spatially distributed regions. It aims to explain how different brain areas interact within networks involved in normal (as in cognitive tasks) or pathological (as in epilepsy) situations. Numerous techniques have been introduced for assessing this connectivity. Recently, some efforts have been made to compare method performances, but mainly qualitatively and for specific applications. In this paper, we go further and propose a comprehensive comparison of different classes of methods (linear and nonlinear regression, phase synchronization, and generalized synchronization) based on various simulation models. For this purpose, quantitative criteria are used: in addition to the mean square error under the null hypothesis (independence between two signals) and the mean variance computed over all values of coupling degree in each model, we provide a criterion for comparing performances. Results show that the performances of the compared methods are highly dependent on the hypothesis regarding the underlying model for the generation of the signals. Moreover, none of them outperforms the others in all cases, and the performance hierarchy is model dependent.
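As one concrete instance of the linear-regression class of measures compared above, the squared Pearson correlation between two signals can be computed as follows (a generic sketch, not the paper's implementation; the toy signals are invented):

```python
# Generic sketch of one linear coupling measure: the squared Pearson
# correlation r^2 between two signals. Not the paper's implementation;
# the signals below are toy sequences.

def r_squared(x, y):
    """Squared Pearson correlation between two equal-length signals."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov * cov / (var_x * var_y)

x = [0.0, 1.0, 2.0, 3.0]
print(r_squared(x, [2 * v + 1 for v in x]))  # 1.0 for perfect linear coupling
print(r_squared(x, [1.0, -1.0, -1.0, 1.0]))  # 0.0 for a symmetric, uncoupled pattern
```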

  14. Quantitative determination of aflatoxin B1 concentration in acetonitrile by chemometric methods using terahertz spectroscopy.

    PubMed

    Ge, Hongyi; Jiang, Yuying; Lian, Feiyu; Zhang, Yuan; Xia, Shanhong

    2016-10-15

    Aflatoxins contaminate and colonize agricultural products, such as grain, and thereby potentially cause human liver carcinoma. Detection via conventional methods has proven to be time-consuming and complex. In this paper, the terahertz (THz) spectra of aflatoxin B1 in acetonitrile solutions with concentration ranges of 1-50 μg/ml and 1-50 μg/l are obtained and analyzed for the frequency range of 0.4-1.6 THz. Linear and nonlinear regression models are constructed to relate the absorption spectra and the concentrations of 160 samples using the partial least squares (PLS), principal component regression (PCR), support vector machine (SVM), and PCA-SVM methods. Our results indicate that the PLS and PCR models are more accurate for the concentration range of 1-50 μg/ml, whereas SVM and PCA-SVM are more accurate for the concentration range of 1-50 μg/l. Furthermore, ten samples of unknown concentration extracted from mildewed maize are analyzed quantitatively using these methods. PMID:27173565

  15. Quantitative fluorescence method for continuous measurement of DNA hybridization kinetics using a fluorescent intercalator.

    PubMed

    Yguerabide, J; Ceballos, A

    1995-07-01

    We present a quantitative fluorescence method for continuous measurement of DNA or RNA hybridization (including renaturation) kinetics using a fluorescent DNA intercalator. The method has high sensitivity and can be used with reaction volumes as small as 1 microliter and amounts of DNA around 1 ng. The method is based on the observations that (i) for the usual hybridization conditions, intercalators such as ethidium bromide bind (intercalate) to double-stranded DNA (dsDNA) but not single-stranded DNA or RNA and (ii) there is a large increase in fluorescence intensity when intercalators such as ethidium bromide bind to dsDNA. In this application, the intercalator can be considered as a quantitative indicator of dsDNA concentration. When a small amount of intercalator is added to a hybridizing solution, the fluorescence intensity of the intercalators increases with increase in dsDNA. The hybridization reaction can thus be monitored by continuously recording fluorescence intensity vs time. Because the amount of intercalator bound to dsDNA is not necessarily proportional to dsDNA concentration, the time-dependent fluorescence intensity graph is not identical to the kinetic graph [dsDNA] vs t. However, the fluorescence intensity vs time graph can easily be converted to the true [dsDNA] vs t graph by means of an experimental calibration graph of fluorescence intensity vs [dsDNA]. This calibration graph is obtained in a separate experiment using samples containing known amounts of dsDNA in the ethidium bromide buffer used in the kinetic measurement. We present results of experimental tests of the intercalator technique using ethidium bromide as an intercalator and DNA from Escherichia coli and lambda-phage and Poly(I)-Poly(C) RNA hybrids. These DNA and RNA samples have Cot1/2 values that cover a range of 10(6). 
Our experimental results show that (i) the kinetics of hybridization are not significantly perturbed by the intercalator at concentrations where no more than 10% of the binding sites on DNA or RNA hybrids are occupied, (ii) the kinetic graphs obtained by the intercalator fluorescence method and corrected with the calibration graph agree with kinetic graphs obtained by optical absorbance measurements at 260 nm, and (iii) the intercalator technique can be used in the different salt environments often used to increase the velocity of the hybridization reaction and at the hybridization temperatures (35-75 degrees C) normally used to minimize nonspecific hybridization. PMID:8572297
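The calibration step described above, converting measured fluorescence intensity to [dsDNA], can be sketched as a lookup with linear interpolation on the experimental calibration graph (a simplified, assumed form; the calibration points are invented):

```python
# Simplified sketch of the calibration conversion: map a measured
# fluorescence intensity to [dsDNA] by linear interpolation on an
# experimental calibration graph. Calibration points are invented.

def to_dsdna(intensity, calibration):
    """calibration: list of (intensity, ng dsDNA) pairs, ascending in intensity."""
    for (i0, c0), (i1, c1) in zip(calibration, calibration[1:]):
        if i0 <= intensity <= i1:
            return c0 + (c1 - c0) * (intensity - i0) / (i1 - i0)
    raise ValueError("intensity outside the calibration range")

calibration = [(0.0, 0.0), (10.0, 0.8), (25.0, 2.5)]  # intensity (a.u.) vs ng dsDNA
print(to_dsdna(5.0, calibration))   # 0.4 ng
print(to_dsdna(17.5, calibration))  # 1.65 ng
```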

  16. A multiplex lectin-channel monitoring method for human serum glycoproteins by quantitative mass spectrometry.

    PubMed

    Ahn, Yeong Hee; Ji, Eun Sun; Shin, Park Min; Kim, Kwang Hoe; Kim, Yong-Sam; Ko, Jeong Heon; Yoo, Jong Shin

    2012-02-01

    A mass profiling method and a multiple reaction monitoring (MRM)-based quantitative approach were used to analyze multiple lectin-captured fractions of human serum prepared with different lectins, such as Aleuria aurantia lectin (AAL), phytohemagglutinin-L(4) (L-PHA), concanavalin A (Con A), and Datura stramonium agglutinin (DSA), to quantitatively monitor protein glycosylation diversity. Each fraction, prepared by multiple-lectin fractionation and tryptic digestion, was analyzed by 1-D LC-MS/MS. Semi-quantitative profiling showed that the list of glycoproteins identified from each lectin-captured fraction differs significantly according to the lectin used. Thus, it was confirmed that multiplex lectin-channel monitoring (LCM) using multiple lectins is useful for investigating protein glycosylation diversity in a proteome sample. Based on the semi-quantitative mass profiling, target proteins showing lectin specificity among the lectin-captured fractions were selected and analyzed by the MRM-based method in triplicate for each lectin-captured fraction (average CV 7.9%). The MRM-based results for each lectin-captured fraction were similar to those obtained in the profiling experiments. The measured abundance of each target protein varied dramatically according to lectin specificity. The multiplex LCM approach with MRM-based analyses is useful for quantitatively monitoring target protein glycoforms selectively fractionated by multiple lectins. Thus, through multiplex rather than single-lectin LCM, protein glycosylation states could be examined in detail. PMID:22158852

  17. A novel volumetric method for quantitation of titanium dioxide in cosmetics.

    PubMed

    Kim, Young So; Kim, Boo-Min; Park, Sang-Chul; Jeong, Hye-Jin; Chang, Ih Seop

    2006-01-01

    Nowadays there are many sun-protection cosmetics incorporating organic or inorganic UV filters as active ingredients. Chemically stable inorganic sunscreen agents, usually metal oxides, are widely employed in high-SPF (sun protection factor) products. Titanium dioxide is one of the most frequently used inorganic UV filters and has long been used as a pigment in cosmetics. With the development of micronization techniques, it has become possible to incorporate titanium dioxide in sunscreen formulations without the earlier whitening effect, and hence its use in cosmetics has become an important research topic. However, there are very few works related to quantitation of titanium dioxide in sunscreen products. In this research, we analyzed the amounts of titanium dioxide in sunscreen cosmetics by adapting redox titration: reduction of Ti(IV) to Ti(III), followed by reoxidation to Ti(IV). After calcination of the organic ingredients of the cosmetics, the titanium dioxide is dissolved in hot sulfuric acid. The dissolved Ti(IV) is reduced to Ti(III) by adding metallic aluminum. The reduced Ti(III) is titrated against a standard oxidizing agent, Fe(III) (ammonium iron(III) sulfate), with potassium thiocyanate as an indicator. In order to test the accuracy and applicability of the proposed method, we analyzed the amounts of titanium dioxide in four types of sunscreen cosmetics, namely cream, make-up base, foundation, and powder, after adding known amounts of titanium dioxide (1 to 25% w/w). The percentages of titanium dioxide recovered in the four types of formulations ranged from 96% to 105%. We also analyzed seven commercial cosmetic products labeled with titanium dioxide as an ingredient and compared the results with those obtained from ICP-AES (inductively coupled plasma-atomic emission spectrometry), one of the most powerful atomic analysis techniques. The results showed that the titrated amounts agreed well with the amounts of titanium dioxide determined by ICP-AES. Although instrument-based analytical methods, namely ICP-MS (inductively coupled plasma-mass spectrometry) and ICP-AES, are best for the analysis of titanium, it is difficult for small cosmetic companies to install such instruments because of their high cost. The volumetric method presented here was found to give quantitatively accurate and reliable results with routine lab-ware and chemicals. PMID:17111072
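The endpoint arithmetic implied by the one-to-one Ti(III)/Fe(III) redox titration can be sketched as follows (illustrative volumes and masses, not the paper's data):

```python
# Worked stoichiometry sketch with illustrative numbers, not the
# paper's data: Ti(III) is oxidized 1:1 by Fe(III), so moles of TiO2
# in the sample equal moles of Fe(III) delivered at the endpoint.

M_TIO2 = 79.87  # g/mol, molar mass of TiO2

def tio2_percent(titrant_molarity, titrant_ml, sample_g):
    """w/w% TiO2 in the sample from the Fe(III) titrant volume at the endpoint."""
    moles_fe = titrant_molarity * titrant_ml / 1000.0  # mol Fe(III) = mol TiO2
    return 100.0 * moles_fe * M_TIO2 / sample_g

# e.g. 25.0 ml of 0.100 M Fe(III) consumed by a 2.00 g sample:
print(round(tio2_percent(0.100, 25.0, 2.00), 2))  # 9.98 w/w%
```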

  18. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  19. A GC-FID method for quantitative analysis of N,N-carbonyldiimidazole.

    PubMed

    Lee, Claire; Mangion, Ian

    2016-03-20

    N,N-Carbonyldiimidazole (CDI), a common synthetic reagent used in commercial scale pharmaceutical synthesis, is known to be sensitive to hydrolysis from ambient moisture. This liability demands a simple, robust analytical method to quantitatively determine reagent quality to ensure reproducible performance in chemical reactions. This work describes a protocol for a rapid GC-FID based analysis of CDI. PMID:26773533

  20. Media and Communication Research Methods: An Introduction to Qualitative and Quantitative Approaches.

    ERIC Educational Resources Information Center

    Berger, Arthur Asa

    Examining both qualitative and quantitative approaches, this introductory text addresses media and communication research methods. Written for beginning research students at both the undergraduate and graduate levels, the book is clear, concise, and accompanied by many detailed examples. Attention-grabbing dialogue begins each chapter and gives…

  1. Improved GC/MS method for quantitation of n-Alkanes in plant and fecal material

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A gas chromatography-mass spectrometry (GC/MS) method for the quantitation of n-alkanes (carbon backbones ranging from 21 to 36 carbon atoms) in forage and fecal samples has been developed. Automated solid-liquid extraction using elevated temperature and pressure minimized extraction time to 30 min...

  2. Potential Guidelines for Conducting and Reporting Environmental Education Research: Quantitative Methods of Inquiry.

    ERIC Educational Resources Information Center

    Smith-Sebasto, N. J.

    2001-01-01

    Presents potential guidelines for conducting and reporting environmental education research using quantitative methods of inquiry that were developed during a 10-hour (1-1/2 day) workshop sponsored by the North American Commission on Environmental Education Research during the 1998 annual meeting of the North American Association for Environmental…

  3. Correlations between Holistic and Quantitative Methods of Evaluating Student Writing, Grades 4-12.

    ERIC Educational Resources Information Center

    Thomas, David; Donlan, Dan

    A random sample of 175 compositions on the same topic (a lost suitcase) was used in a study examining the correlations between holistic (single impression) and quantitative methods of evaluating student writing. The sample contained 25 papers from each of the following grade levels: four/five, and seven through twelve. A panel of three readers…

  4. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  5. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and the limit of detection. We introduce a new framework to compare the performance ...

  6. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  7. A simple method for quantitative diagnosis of small hive beetles, Aethina tumida, in the field

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Here we present a simple and fast method for quantitative diagnosis of small hive beetles (= SHB) in honeybee field colonies using corrugated plastic “diagnostic-strips”. In Australia, we evaluated its efficacy by comparing the number of lured SHB with the total number of beetles in the hives. The d...

  8. Qualitative and Quantitative Research Methods: Old Wine in New Bottles? On Understanding and Interpreting Educational Phenomena

    ERIC Educational Resources Information Center

    Smeyers, Paul

    2008-01-01

    Generally educational research is grounded in the empirical traditions of the social sciences (commonly called quantitative and qualitative methods) and is as such distinguished from other forms of scholarship such as theoretical, conceptual or methodological essays, critiques of research traditions and practices and those studies grounded in the…

  9. Quantitative methods for evaluating optical and frictional properties of cationic polymers.

    PubMed

    Wu, W; Alkema, J; Shay, G D; Basset, D R

    2001-01-01

    This paper presents three quantitative methods to examine gloss, opacity, and friction of cationic polymers. The adsorption of cationic polymers onto hair and skin can be regarded as a thin film coating. Therefore, optical and frictional properties of polymer films are of significant relevance to the applications of cationic polymers in hair care products. Such properties reflect the desirable hair condition attributes consumers seek in shampoo and conditioner products. Using these test methods, polyquaternium-10 and cationic guar samples of varying molecular weight and cationic substitution were compared. The effect of an anionic surfactant, sodium dodecyl sulfate (SDS), on polymer film properties was also investigated. Neat guar hydroxypropyl trimonium chloride imparts less friction than polyquaternium-10 but dulls the substrate employed in this study. The optical data show that polyquaternium-10 provides greater film clarity and gloss than cationic guars. In the presence of SDS, polyquaternium-10 also displays similar or lower friction than cationic guar. The comparative optical and frictional results are in good agreement with the visual assessment of the cationic polymer films. These results clearly demonstrate that polyquaternium-10 exhibits superior film properties in the forms of both neat polymer and polymer/surfactant complex. In addition, microscopic techniques such as scanning electron microscopy (SEM) and atomic force microscopy (AFM) provide powerful explanations for the differences noted between the two popular classes of cationic polymers. The test methods described in this paper can be utilized to differentiate the upper performance potential of cationic polymers. These objective and standardized test methods derived from the coatings industry are not affected by the variability of hair or the formulation complexity of end products. They can be useful tools in the product development process in quickly screening the relative performance of different polymers. PMID:11382843

  10. Bayesian data augmentation methods for the synthesis of qualitative and quantitative research findings

    PubMed Central

    Crandell, Jamie L.; Voils, Corrine I.; Chang, YunKyung; Sandelowski, Margarete

    2010-01-01

    The possible utility of Bayesian methods for the synthesis of qualitative and quantitative research has been repeatedly suggested but insufficiently investigated. In this project, we developed and used a Bayesian method for synthesis, with the goal of identifying factors that influence adherence to HIV medication regimens. We investigated the effect of 10 factors on adherence. Recognizing that not all factors were examined in all studies, we considered standard methods for dealing with missing data and chose a Bayesian data augmentation method. We were able to summarize, rank, and compare the effects of each of the 10 factors on medication adherence. This is a promising methodological development in the synthesis of qualitative and quantitative research. PMID:21572970

  11. The quantitative assessment of the pre- and postoperative craniosynostosis using the methods of image analysis.

    PubMed

    Fabijańska, Anna; Węgliński, Tomasz

    2015-12-01

    This paper considers the problem of CT-based quantitative assessment of craniosynostosis before and after surgery. First, a fast and efficient brain segmentation approach is proposed. The algorithm is robust to discontinuities in the skull and, as a result, can be applied in both pre- and postoperative cases. Additionally, image processing and analysis algorithms are proposed for describing the disease based on CT scans. The proposed algorithms automate the determination of the standard linear indices used for assessment of craniosynostosis (i.e., the cephalic index CI and the head circumference HC) and allow for planar and volumetric analyses which so far have not been reported. Results of applying the introduced methods to sample craniosynostotic cases before and after surgery are presented and discussed. The results show that the proposed brain segmentation algorithm is highly accurate in both pre- and postoperative craniosynostosis, while the introduced planar and volumetric indices for describing the disease may help distinguish between types of the disease. PMID:26143078
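
    The cephalic index the abstract refers to follows its standard anthropometric definition (maximum head breadth over maximum head length, as a percentage); a trivial sketch, with hypothetical CT-derived measurements:

```python
def cephalic_index(width_mm: float, length_mm: float) -> float:
    """Standard cephalic index: maximum head breadth divided by
    maximum head length, expressed as a percentage."""
    return 100.0 * width_mm / length_mm

# Hypothetical skull measurements taken from a CT scan (mm)
ci = cephalic_index(140.0, 175.0)
print(ci)  # 80.0
```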

  12. Method for estimating total attenuation from a spatial map of attenuation slope for quantitative ultrasound imaging.

    PubMed

    Pawlicki, Alexander D; O'Brien, William D

    2013-04-01

    Estimating total ultrasound attenuation from backscatter data is essential in the field of quantitative ultrasound (QUS) because of the need to compensate for attenuation when estimating the backscatter coefficient and QUS parameters. This work uses a reference phantom method of attenuation estimation to create a spatial map of attenuation slope (AS) from backscatter radio-frequency (RF) data of three phantoms and a rat mammary adenocarcinoma tumor (MAT). The attenuation maps show changes in attenuation between different regions of the phantoms and the MAT tumor. Analyses of the attenuation maps of the phantoms suggest that the AS estimates are in good quantitative agreement with the known values for the phantoms. Furthermore, estimates of total attenuation from the attenuation maps are likewise in good quantitative agreement with known values. PMID:23493614

  13. Laser-induced Breakdown spectroscopy quantitative analysis method via adaptive analytical line selection and relevance vector machine regression model

    NASA Astrophysics Data System (ADS)

    Yang, Jianhong; Yi, Cancan; Xu, Jinwu; Ma, Xianghong

    2015-05-01

    A new LIBS quantitative analysis method based on adaptive analytical line selection and Relevance Vector Machine (RVM) regression is proposed. First, a scheme for adaptively selecting analytical lines is put forward to overcome the usual heavy dependency on a priori knowledge: candidate analytical lines are selected automatically from the built-in characteristics of the spectral lines, such as spectral intensity, wavelength, and width at half height, and the lines used as input variables of the regression model are then determined adaptively from the training and testing samples. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of the analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are reported as probabilistic confidence intervals, which helps in evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples were carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modeling robustness than methods based on partial least squares regression, artificial neural networks, and standard support vector machines.

  14. An Evaluation of Quantitative Methods of Determining the Degree of Melting Experienced by a Chondrule

    NASA Technical Reports Server (NTRS)

    Nettles, J. W.; Lofgren, G. E.; Carlson, W. D.; McSween, H. Y., Jr.

    2004-01-01

    Many workers have considered the degree to which partial melting occurred in chondrules they have studied, and this has led to attempts to find reliable methods of determining the degree of melting. At least two quantitative methods have been used in the literature: a convolution index (CVI), the ratio of the perimeter of the chondrule as seen in thin section to the perimeter of a circle with the same area as the chondrule, and nominal grain size (NGS), the inverse square root of the number density of olivines and pyroxenes in a chondrule (again, as seen in thin section). We have evaluated both nominal grain size and convolution index as melting indicators. Nominal grain size was measured on the results of a set of dynamic crystallization experiments previously described, where aliquots of LEW 97008 (L3.4) were heated to peak temperatures of 1250, 1350, 1370, and 1450 °C, representing varying degrees of partial melting of the starting material. Nominal grain size should correlate with peak temperature (and therefore degree of partial melting) if it is a good melting indicator. The convolution index is not directly testable with these experiments because the experiments do not actually create chondrules (and therefore there is no outline on which to measure a CVI), so we had no means to directly test how well the CVI predicted different degrees of melting. We therefore discuss the use of the CVI measurement and support the discussion with X-ray Computed Tomography (CT) data.
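
    Both melting indicators reduce to simple geometry; a minimal sketch, with hypothetical thin-section measurements, of how the CVI and NGS could be computed:

```python
import math

def convolution_index(perimeter: float, area: float) -> float:
    """CVI: chondrule perimeter divided by the perimeter of the
    circle having the same cross-sectional area (CVI = 1 for a circle)."""
    equivalent_radius = math.sqrt(area / math.pi)
    return perimeter / (2.0 * math.pi * equivalent_radius)

def nominal_grain_size(n_grains: int, area: float) -> float:
    """NGS: inverse square root of the number density of olivine and
    pyroxene grains counted in the thin section."""
    return 1.0 / math.sqrt(n_grains / area)

# Hypothetical thin-section measurements (mm and mm^2)
cvi = convolution_index(perimeter=3.6, area=0.95)
ngs = nominal_grain_size(n_grains=40, area=0.95)
```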

  15. Quantitative analysis of carbonate sediments using powder X-ray diffraction and the Rietveld method

    SciTech Connect

    Post, J.E. (Dept. of Mineral Sciences)

    1992-01-01

    Critical to most studies of carbonate sediments is information about the relative abundances of different carbonate minerals and the Mg mole-percent of the calcite phases. Traditional powder X-ray diffraction methods for extracting these data rely upon measurement of the integrated area and d-spacing of the 104 calcite peak. This approach, however, is difficult for diffraction patterns with severely overlapping peaks and is prone to a range of systematic errors. The Rietveld analysis method makes use of the entire powder X-ray diffraction pattern and can quickly yield accurate quantitative phase analysis (without the need for standards) and unit-cell parameters for even complex mixtures. Furthermore, corrections for certain systematic errors (e.g. sample displacement) are also refined, eliminating the need to add internal standards to the samples. The d-spacings, a, and c values from the refinement can be used with standard determinative curves for Mg mole-percent. Also, in many cases, the Rietveld refinement can yield directly the Mg composition of each calcite phase. Results from several prepared two and three phase carbonate mixtures will be shown to demonstrate the accuracy of the technique. Applications to natural samples of mollusk shells, coralline algae, echinoderm spines, and carbonate muds will also be discussed.

  16. A validated LC-MS-MS method for simultaneous identification and quantitation of rodenticides in blood.

    PubMed

    Bidny, Sergei; Gago, Kim; David, Mark; Duong, Thanh; Albertyn, Desdemona; Gunja, Naren

    2015-04-01

    A rapid, highly sensitive and specific analytical method for the extraction, identification and quantification of nine rodenticides from whole blood has been developed and validated. Commercially available rodenticides in Australia include coumatetralyl, warfarin, brodifacoum, bromadiolone, difenacoum, flocoumafen, difethialone, diphacinone and chlorophacinone. A Waters ACQUITY UPLC TQD system operating in multiple reaction monitoring mode was used to conduct the analysis. Two different ionization techniques, ES+ and ES-, were examined to achieve optimal sensitivity and selectivity, resulting in detection by MS-MS using electrospray ionization in positive mode for difenacoum and brodifacoum and in negative mode for all other analytes. All analytes were extracted from 200 μL of whole blood with ethyl acetate and separated on a Waters ACQUITY UPLC BEH-C18 column using gradient elution. Ammonium acetate (10 mM, pH 7.5) and methanol were used as mobile phases with a total run time of 8 min. Recoveries were between 70 and 105% with limits of detection ranging from 0.5 to 1 ng/mL. The limit of quantitation was 2 ng/mL for all analytes. Calibration curves were linear within the range 2-200 ng/mL for all analytes with coefficients of determination ≥0.98. The application of the proposed method using liquid-liquid extraction in a series of clinical investigations and forensic toxicological analyses was successful. PMID:25595137

  17. The peak to background method in quantitative ion microprobe analysis of thick biological specimens

    NASA Astrophysics Data System (ADS)

    Koyama-Ito, Hiroko

    1991-05-01

    The use of the ratio of the characteristic intensity to the continuum background intensity (P/B ratio) of the X-ray spectrum for quantitative ion microprobe (IMP) or PIXE (particle induced X-ray emission) analysis of thin biological specimens was proposed previously. The IMP analysis of thick biological specimens is also of considerable practical use. In this paper, the possibility of using the P/B ratio to quantify minor elements in thick biological specimens is investigated. Epoxy-resin-based standards with graded concentrations of KCNS up to 0.6 mol/kg and NBS bovine liver were analyzed by a 27 MeV α-particle microprobe. The measured peak to background ratios (between 4.4 and 5.7 keV) agreed well with theoretical calculations. The calculations showed that the concentration dependence of the P/B ratios was determined mainly by the absorption of X-rays in the specimens. The results indicate that the P/B method is useful for IMP analysis of thick biological specimens as a simple approximate method of quantification. Reasonable accuracy would be expected with the aid of organic standards and computer calculations.

  18. Nanoparticle-mediated photothermal effect enables a new method for quantitative biochemical analysis using a thermometer

    NASA Astrophysics Data System (ADS)

    Fu, Guanglei; Sanjay, Sharma T.; Dou, Maowei; Li, Xiujun

    2016-03-01

    A new biomolecular quantitation method, nanoparticle-mediated photothermal bioassay, using a common thermometer as the signal reader was developed. Using an immunoassay as a proof of concept, iron oxide nanoparticles (NPs) captured in the sandwich-type assay system were transformed into a near-infrared (NIR) laser-driven photothermal agent, Prussian blue (PB) NPs, which acted as a photothermal probe to convert the assay signal into heat through the photothermal effect, thus allowing sensitive biomolecular quantitation using a thermometer. This is the first report of biomolecular quantitation using a thermometer and also serves as the first attempt to introduce the nanoparticle-mediated photothermal effect for bioassays.

  19. Method development for a quantitative analysis performed without any standard using an evaporative light-scattering detector.

    PubMed

    Heron, Sylvie; Maloumbi, Marie-Geneviève; Dreux, Michel; Verette, Eric; Tchapla, Alain

    2007-08-17

    A new method for quantitative analyses using an evaporative light-scattering detector (ELSD) is proposed. It is based on the preliminary determination of the calibration curve of an ELSD, which correlates coefficient b and log a, the two coefficients of the equation A = a·m^b that characterizes the law of the quantitative response of an ELSD. Dilutions of the mixture to be analyzed allow the determination of coefficient b for each component of the mixture. Knowing the b value and the experimental relationship correlating b with log a allows the log a value to be determined and, consequently, each compound of the mixture to be quantified. This method is an alternative to the quantitative method that uses internal normalization without response coefficients; that internal normalization method, used with an ELSD, provides inaccurate results, and the inaccuracy increases when the analytes are present in very different proportions. The relevance of the new method proposed in this work lies in the quantification of all the components present in a complex mixture when some of them are not available as standards. PMID:17583717
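
    The response law A = a·m^b linearizes as log A = log a + b·log m, so b follows from a straight-line fit to a dilution series; a minimal numpy sketch with simulated (hypothetical) data:

```python
import numpy as np

# Hypothetical peak areas for a dilution series of one analyte
masses = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # injected mass (µg)
areas = 3.0 * masses ** 1.6                     # simulated ELSD response A = a·m^b

# Fit log A = log a + b·log m to recover both coefficients
b, log_a = np.polyfit(np.log10(masses), np.log10(areas), 1)
a = 10.0 ** log_a
print(f"b = {b:.3f}, a = {a:.3f}")              # recovers b ≈ 1.6, a ≈ 3.0

# With a and b known, an unknown mass follows from a measured area
area_unknown = 20.0
m = (area_unknown / a) ** (1.0 / b)
```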

  20. A quantitative cell modeling and wound-healing analysis based on the Electric Cell-substrate Impedance Sensing (ECIS) method.

    PubMed

    Yang, Jen Ming; Chen, Szi-Wen; Yang, Jhe-Hao; Hsu, Chih-Chin; Wang, Jong-Shyan

    2016-02-01

    In this paper, a quantitative modeling and wound-healing analysis of fibroblast and human keratinocyte cells is presented. Our study was conducted using a continuous cellular impedance monitoring technique, dubbed Electric Cell-substrate Impedance Sensing (ECIS). In a previous work we constructed a mathematical model for quantitatively analyzing cultured cell growth using the time series data directly derived by ECIS. In this study, the applicability of our model to keratinocyte cell growth modeling was assessed first. In addition, an electrical "wound-healing" assay was used to evaluate the healing process of keratinocyte cells at a variety of pressures. Two innovative, newly defined indicators, dubbed cell power and cell electroactivity, were developed for quantitatively characterizing the biophysical behavior of cells. We then employed the wavelet transform method to perform a multi-scale analysis so that cell power and cell electroactivity could be captured across multiple observational time scales. Numerical results indicated that our model fits the data measured from the keratinocyte cell culture well. Also, our quantitative analysis showed that the wound healing process was fastest at a negative pressure of 125 mmHg, which agrees with the qualitative results reported in previous works. PMID:26773459

  1. Quantitative evaluation of registration methods for atlas-based diffuse optical tomography

    NASA Astrophysics Data System (ADS)

    Wu, Xue; Eggebrecht, Adam T.; Culver, Joseph P.; Zhan, Yuxuan; Basevi, Hector; Dehghani, Hamid

    2013-06-01

    In Diffuse Optical Tomography (DOT), an atlas-based model can be used as an alternative to a subject-specific anatomical model for recovery of brain activity. The main step in generating an atlas-based subject model is the registration of the atlas model to the subject head, so the accuracy of the DOT relies on the accuracy of the registration method. In this work, 11 registration methods are quantitatively evaluated. The registration method using the EEG 10/20 system with 19 landmarks and a non-iterative point-to-point algorithm provides approximately 1.4 mm surface error and is considered the most efficient registration method.
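
    The abstract does not specify the non-iterative point-to-point algorithm in detail; one standard non-iterative least-squares choice is the Kabsch/Procrustes solution, sketched here with hypothetical landmark coordinates:

```python
import numpy as np

def kabsch_align(src, dst):
    """Non-iterative point-to-point registration (Kabsch/Procrustes):
    least-squares rotation + translation mapping src landmarks onto dst."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(U @ Vt))              # guard against reflections
    R = (U * np.array([1.0, 1.0, d])) @ Vt          # rotation (row-vector convention)
    t = dst_mean - src_mean @ R
    return R, t

# Hypothetical 19-landmark sets (atlas vs. subject), in mm
rng = np.random.default_rng(0)
atlas = rng.uniform(-80.0, 80.0, size=(19, 3))
theta = np.deg2rad(10.0)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
subject = atlas @ Rz.T + np.array([3.0, -2.0, 5.0])

R, t = kabsch_align(atlas, subject)
registered = atlas @ R + t
surface_error = np.linalg.norm(registered - subject, axis=1).mean()
```

For this noise-free rigid transform the mean landmark error is essentially zero; with real head surfaces the residual (about 1.4 mm in the study) reflects anatomical mismatch rather than the solver.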

  2. Terahertz absorbance spectrum fitting method for quantitative detection of concealed contraband

    NASA Astrophysics Data System (ADS)

    Wang, Yingxin; Zhao, Ziran; Chen, Zhiqiang; Kang, Kejun; Feng, Bing; Zhang, Yan

    2007-12-01

    We present a quantitative method for the nondestructive detection of concealed contraband based on terahertz transmission spectroscopy. Without knowing the prior information of barrier materials, the amount of concealed contraband can be extracted by approximating the terahertz absorbance spectrum of the barrier material with a low-order polynomial and then fitting the measured absorbance spectrum of the inspected object with the polynomial and the known standard spectrum of this kind of contraband. We verify the validity of this method using a sample of explosive 1,3,5-trinitro-s-triazine (RDX) covered with several different barrier materials which are commonly encountered in actual inspection, and good agreement between the calculated and actual value of the amount of RDX is obtained for the experiments performed under both nitrogen and air atmospheres. This indicates that the presented method can achieve quantitative detection of hidden contraband, which is important for security inspection applications.
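
    The fitting step amounts to ordinary linear least squares: the measured absorbance is modeled as a low-order polynomial (the unknown barrier) plus a scaled standard spectrum of the contraband. A hedged numpy sketch with simulated spectra (all values hypothetical):

```python
import numpy as np

freqs = np.linspace(0.2, 2.0, 200)                  # THz axis (simulated)
standard = np.exp(-((freqs - 0.8) / 0.05) ** 2)     # stand-in for the contraband standard spectrum
barrier = 0.3 + 0.2 * freqs + 0.05 * freqs ** 2     # smooth barrier absorbance
true_amount = 0.7
measured = barrier + true_amount * standard         # simulated absorbance of the inspected object

# Design matrix: polynomial terms for the barrier plus the standard spectrum
order = 2
X = np.column_stack([freqs ** k for k in range(order + 1)] + [standard])
coeffs, *_ = np.linalg.lstsq(X, measured, rcond=None)
print(coeffs[-1])   # estimated amount of contraband
```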

  3. Evaluation of methods for qualitative and quantitative assessment of esophageal transit of liquid.

    PubMed

    Ham, H R; Georges, B; Guillaume, M; Erbsmann, F; Dobbeleir, A

    1985-01-01

    The aim of this work was to compare the advantages and the limitations of several data processing techniques for the assessment of esophageal transit. The following qualitative methods were evaluated: scintigraphic image, cine-display, regional time-activity curve, and condensed image. The quantitative methods evaluated were the pixel to pixel presentation of parameters of the time-activity curves such as time of arrival, time to maximum, and several downslope parameters, mean transit time, mean time, and a new transit parameter based on the radioactive decay. The study allowed us to conclude that for the detection and the quantitation of esophageal transit the method of choice was the combined use of the condensed image and the pixel to pixel mean time image. The parametric image using the transit parameter calculated from decay was shown as a valuable alternative if an ultra-short half-life radionuclide was used as the tracer. PMID:4043109

  4. Laboratory and field validation of a Cry1Ab protein quantitation method for water.

    PubMed

    Strain, Katherine E; Whiting, Sara A; Lydy, Michael J

    2014-10-01

    The widespread planting of crops expressing insecticidal proteins derived from the soil bacterium Bacillus thuringiensis (Bt) has given rise to concerns regarding potential exposure to non-target species. These proteins are released from the plant throughout the growing season into soil and surface runoff and may enter adjacent waterways via runoff, erosion, aerial deposition of particulates, or plant debris. It is crucial to be able to accurately quantify Bt protein concentrations in the environment to aid in risk analyses and decision making. Enzyme-linked immunosorbent assay (ELISA) is commonly used for quantitation of Bt proteins in the environment; however, there are no published methods detailing and validating the extraction and quantitation of Bt proteins in water. The objective of the current study was to optimize the extraction of a Bt protein, Cry1Ab, from three water matrices and validate the ELISA method for specificity, precision, accuracy, stability, and sensitivity. Recovery of the Cry1Ab protein was matrix-dependent and ranged from 40 to 88% in the validated matrices, with an overall method detection limit of 2.1 ng/L. Precision between two plates and within a single plate was confirmed with a coefficient of variation less than 20%. The ELISA method was verified in field and laboratory samples, demonstrating the utility of the validated method. The implementation of a validated extraction and quantitation protocol adds consistency and reliability to field-collected data regarding transgenic products. PMID:25059137
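
    The validation criteria in the abstract (percent recovery and a coefficient of variation below 20%) are straightforward to compute; a minimal sketch with hypothetical replicate data:

```python
import statistics

def percent_recovery(measured: float, spiked: float) -> float:
    """Recovery of a spiked protein standard, in percent."""
    return 100.0 * measured / spiked

def coefficient_of_variation(replicates: list[float]) -> float:
    """CV (%) across replicates; <20% was the precision criterion."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate concentrations (ng/L) measured for a 50 ng/L spike
reps = [44.0, 41.5, 46.2, 43.1]
print(percent_recovery(statistics.mean(reps), 50.0))
print(coefficient_of_variation(reps))
```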

  5. Quantitative Analysis of Carbon Steel with Multi-Line Internal Standard Calibration Method Using Laser-Induced Breakdown Spectroscopy.

    PubMed

    Pan, Congyuan; Du, Xuewei; An, Ning; Zeng, Qiang; Wang, Shengbo; Wang, Qiuping

    2016-04-01

    A multi-line internal standard calibration method is proposed for the quantitative analysis of carbon steel using laser-induced breakdown spectroscopy (LIBS). A procedure based on the method was adopted to select the best calibration curves and the corresponding emission line pairs automatically. Laser-induced breakdown spectroscopy experiments with carbon steel samples were performed, and C, Cr, and Mn were analyzed via the proposed method. Calibration curves for these elements were constructed via a traditional single-line internal standard calibration method and the multi-line internal standard calibration method. The calibration curves obtained were evaluated with the coefficient of determination, the root mean square error of cross-validation, and the average relative error of cross-validation. All of these parameters were improved significantly with the proposed method. The results show that accurate and stable calibration curves can be obtained efficiently via the multi-line internal standard calibration method. PMID:26872822
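
    A multi-line internal standard calibration can be sketched as an exhaustive search over analyte/reference line pairs, keeping the pair whose calibration curve fits best. The paper's exact selection criteria are not reproduced here; this sketch ranks pairs by R² on hypothetical intensities:

```python
import numpy as np
from itertools import product

# Hypothetical peak intensities: rows = standard samples,
# columns = candidate analyte lines and internal-standard (matrix) lines
analyte_lines = np.array([[120, 80], [240, 150], [365, 228], [480, 310]], float)
matrix_lines = np.array([[1000, 500], [1010, 505], [995, 498], [1005, 502]], float)
conc_ratio = np.array([0.01, 0.02, 0.03, 0.04])   # certified analyte/matrix ratios

best = None
for i, j in product(range(analyte_lines.shape[1]), range(matrix_lines.shape[1])):
    ratio = analyte_lines[:, i] / matrix_lines[:, j]   # internal-standard ratio
    slope, intercept = np.polyfit(conc_ratio, ratio, 1)
    fit = slope * conc_ratio + intercept
    ss_res = np.sum((ratio - fit) ** 2)
    ss_tot = np.sum((ratio - ratio.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    if best is None or r2 > best[0]:                   # keep the best-fitting line pair
        best = (r2, i, j, slope, intercept)

print(f"best line pair: analyte {best[1]}, reference {best[2]}, R^2 = {best[0]:.4f}")
```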

  6. Pleistocene Lake Bonneville and Eberswalde Crater of Mars: Quantitative Methods for Recognizing Poorly Developed Lacustrine Shorelines

    NASA Astrophysics Data System (ADS)

    Jewell, P. W.

    2014-12-01

    The ability to quantify shoreline features on Earth has been aided by advances in the acquisition of high-resolution topography through laser imaging and photogrammetry. Well-defined and well-documented features such as the Bonneville, Provo, and Stansbury shorelines of Late Pleistocene Lake Bonneville are recognizable to the untrained eye and easily mappable on aerial photos. The continuity and correlation of lesser shorelines must rely on quantitative algorithms for processing high-resolution data in order to gain widespread scientific acceptance. Using Savitzky-Golay filters and the geomorphic methods and criteria described by Hare et al. [2001], minor, transgressive, erosional shorelines of Lake Bonneville have been identified and correlated across the basin with varying degrees of statistical confidence. Results solve one of the key paradoxes of Lake Bonneville first described by G. K. Gilbert in the late 19th century and point the way toward understanding climatically driven oscillations of the Last Glacial Maximum in the Great Basin of the United States. Similar techniques have been applied to the Eberswalde Crater area of Mars using HiRISE DEMs (1 m horizontal resolution) where a paleolake is hypothesized to have existed. Results illustrate the challenges of identifying shorelines where long-term aeolian processes have degraded the shorelines and field validation is not possible. The work illustrates the promises and challenges of identifying remnants of a global ocean elsewhere on the red planet.

  7. Quantitative proteomics: assessing the spectrum of in-gel protein detection methods

    PubMed Central

    Gauci, Victoria J.; Wright, Elise P.

    2010-01-01

    Proteomics research relies heavily on visualization methods for detection of proteins separated by polyacrylamide gel electrophoresis. Commonly used staining approaches involve colorimetric dyes such as Coomassie Brilliant Blue, fluorescent dyes including Sypro Ruby, newly developed reactive fluorophores, as well as a plethora of others. The most desired characteristic in selecting one stain over another is sensitivity, but this is far from the only important parameter. This review evaluates protein detection methods in terms of their quantitative attributes, including limit of detection (i.e., sensitivity), linear dynamic range, inter-protein variability, capacity for spot detection after 2D gel electrophoresis, and compatibility with subsequent mass spectrometric analyses. Unfortunately, many of these quantitative criteria are not routinely or consistently addressed by most of the studies published to date. We would urge more rigorous routine characterization of stains and detection methodologies as a critical approach to systematically improving these critically important tools for quantitative proteomics. In addition, substantial improvements in detection technology, particularly over the last decade or so, emphasize the need to consider renewed characterization of existing stains; the quantitative stains we need, or at least the chemistries required for their future development, may well already exist. PMID:21686332

  8. Method Development and Validation of a Stability-Indicating RP-HPLC Method for the Quantitative Analysis of Dronedarone Hydrochloride in Pharmaceutical Tablets

    PubMed Central

    Dabhi, Batuk; Jadeja, Yashwantsinh; Patel, Madhavi; Jebaliya, Hetal; Karia, Denish; Shah, Anamik

    2013-01-01

    A simple, precise, and accurate HPLC method has been developed and validated for the quantitative analysis of Dronedarone Hydrochloride in tablet form. An isocratic separation was achieved using a Waters Symmetry C8 (100 × 4.6 mm), 5 μm particle size column with a flow rate of 1 ml/min and UV detection at 290 nm. The mobile phase consisted of buffer:methanol (40:60 v/v) (buffer: 50 mM KH2PO4 + 1 ml triethylamine in 1 liter of water, pH = 2.5 adjusted with ortho-phosphoric acid). The method was validated for specificity, linearity, precision, accuracy, robustness, and solution stability. The specificity of the method was determined by assessing interference from the placebo and by stress testing the drug (forced degradation). The method was linear over the concentration range 20–80 μg/ml (r2 = 0.999) with a limit of detection (LOD) and limit of quantitation (LOQ) of 0.1 and 0.3 μg/ml, respectively. The accuracy of the method was between 99.2% and 100.5%. The method was found to be robust and suitable for the quantitative analysis of Dronedarone Hydrochloride in a tablet formulation. Degradation products resulting from the stress studies did not interfere with the detection of Dronedarone Hydrochloride; the assay is thus stability-indicating. PMID:23641332
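    The linearity and LOD/LOQ figures quoted above follow the standard ICH Q2(R1)-style calibration treatment, which can be sketched as follows. The peak-area data in the test are invented for illustration, and the residual-standard-deviation route to LOD/LOQ (3.3·σ/S and 10·σ/S) is one of several ICH-sanctioned options, not necessarily the one the authors used.

```python
# Hedged sketch: least-squares calibration line (peak area vs. concentration),
# r^2, and LOD/LOQ estimated from the residual standard deviation.

def fit_line(x, y):
    """Ordinary least squares; returns (slope, intercept, r2, residual SD)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    syx = (sum(r * r for r in resid) / (n - 2)) ** 0.5  # residual SD
    r2 = 1.0 - sum(r * r for r in resid) / sum((yi - my) ** 2 for yi in y)
    return slope, intercept, r2, syx

def lod_loq(slope, syx):
    # ICH Q2(R1) convention: LOD = 3.3*sigma/S, LOQ = 10*sigma/S
    return 3.3 * syx / slope, 10.0 * syx / slope
```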

  9. A simple HPLC-MS method for the quantitative determination of the composition of bacterial medium chain-length polyhydroxyalkanoates.

    PubMed

    Grubelnik, Andreas; Wiesli, Luzia; Furrer, Patrick; Rentsch, Daniel; Hany, Roland; Meyer, Veronika R

    2008-06-01

    Bacterial poly(hydroxyalkanoates) (PHAs) vary in the composition of their monomeric units. Besides saturated side-chains, unsaturated ones can also be found. The latter lead to unwanted by-products (THF esters, secondary alcohols) during acidic cleavage of the polymer backbone in conventional analytical assays. To prevent these problems, we developed a new method for the reductive depolymerization of medium chain-length PHAs, leading to monomeric diols that can be separated and quantified by HPLC/MS. Reduction is performed at room temperature with lithium aluminum hydride within 5-15 min. The new method is faster and simpler than previous ones and is quantitative. The results are consistent with those obtained by quantitative (1)H NMR. PMID:18461645

  10. Apparatus and method for quantitatively evaluating total fissile and total fertile nuclide content in samples

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Cates, Michael R.; Franks, Larry A.

    1985-01-01

    Simultaneous photon and neutron interrogation of samples for the quantitative determination of total fissile nuclide and total fertile nuclide material present is made possible by the use of an electron accelerator. Prompt and delayed neutrons produced from the resulting induced fissions are counted using a single detection system, allowing the resolution of the contributions from each interrogating flux and leading in turn to the quantitative determination sought. Detection limits for 239Pu are estimated to be about 3 mg using prompt fission neutrons and about 6 mg using delayed neutrons.
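    Resolving the two flux contributions amounts to a two-equation unfolding: the prompt (P) and delayed (D) neutron count rates are each a linear mix of the fissile and fertile masses. The sketch below shows that algebra only; the calibration constants are hypothetical placeholders, not values from the patent, which describes the physical apparatus rather than this arithmetic.

```python
# Hedged sketch: solve [P, D]^T = A [m_fissile, m_fertile]^T for the masses,
# where A is a 2x2 matrix of (hypothetical) counts-per-mg calibration
# constants determined from standards.

def unfold_masses(prompt, delayed, cal):
    """Invert the 2x2 calibration system by Cramer's rule."""
    (a, b), (c, d) = cal
    det = a * d - b * c
    if det == 0:
        raise ValueError("degenerate calibration matrix")
    m_fissile = (d * prompt - b * delayed) / det
    m_fertile = (a * delayed - c * prompt) / det
    return m_fissile, m_fertile
```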

  11. Qualitative and quantitative event-specific PCR detection methods for oxy-235 canola based on the 3' integration flanking sequence.

    PubMed

    Yang, Litao; Guo, Jinchao; Zhang, Haibo; Liu, Jia; Zhang, Dabing

    2008-03-26

    As more genetically modified plant events are approved for commercialization worldwide, event-specific PCR has become the key method for genetically modified organism (GMO) identification and quantification. This study reveals the 3' flanking sequence of the exogenous integration of Oxy-235 canola employing thermal asymmetric interlaced PCR (TAIL-PCR). On the basis of the revealed 3' flanking sequence, PCR primers and a TaqMan probe were designed, and qualitative and quantitative PCR assays were established for Oxy-235 canola. The specificity and limits of detection (LOD) and quantification (LOQ) of these two PCR assays were validated: the relative LOD of the qualitative PCR assay was as low as 0.1%, and the absolute LOD and LOQ of the quantitative PCR assay were as low as 10 and 20 copies of canola genomic DNA, respectively. Furthermore, good quantitative results were obtained in practical canola sample detection. All of these results indicate that the developed qualitative and quantitative PCR methods based on the revealed 3' integration flanking sequence are suitable for identification and quantification of GM canola Oxy-235. PMID:18298073

  12. Comparison of reconstruction methods and quantitative accuracy in Siemens Inveon PET scanner

    NASA Astrophysics Data System (ADS)

    Ram Yu, A.; Kim, Jin Su; Kang, Joo Hyun; Moo Lim, Sang

    2015-04-01

    PET reconstruction is key to the quantification of PET data. To our knowledge, no comparative study of reconstruction methods has been performed to date. In this study, we compared reconstruction methods with various filters in terms of their spatial resolution, non-uniformities (NU), recovery coefficients (RCs), and spillover ratios (SORs). In addition, the linearity between measured and true radioactivity concentrations was assessed. A Siemens Inveon PET scanner was used in this study. Spatial resolution was measured according to the NEMA standard using a 1 mm3 18F point source. Image quality was assessed in terms of NU, RC, and SOR. To measure the effect of reconstruction algorithms and filters, data were reconstructed using FBP, a 3D reprojection algorithm (3DRP), ordered-subset expectation maximization 2D (OSEM 2D), and maximum a posteriori (MAP) with various filters or smoothing factors (β). To assess the linearity of reconstructed radioactivity, an image-quality phantom filled with 18F was imaged and reconstructed using FBP, OSEM, and MAP (β = 1.5 and 5 × 10^-5). The highest achievable volumetric resolution was 2.31 mm3, and the highest RCs were obtained when OSEM 2D was used. SOR was 4.87% for air and 3.97% for water when OSEM 2D reconstruction was used. The measured radioactivity of the reconstructed image was proportional to the injected activity below 16 MBq/ml when the FBP or OSEM 2D reconstruction methods were used. By contrast, when the MAP reconstruction method was used, the activity of the reconstructed image increased proportionally regardless of the amount of injected radioactivity. When OSEM 2D or FBP was used, the measured radioactivity concentration was reduced by 53% compared with the true injected radioactivity above 16 MBq/ml.
The OSEM 2D reconstruction method provided the highest achievable volumetric resolution and the highest RCs among all the tested methods and yielded a linear relation between measured and true concentrations below 16 MBq/ml. Our data collectively show that the OSEM 2D reconstruction method provides quantitatively accurate reconstructed PET data.
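    The image-quality figures quoted above (RC, SOR, NU) reduce to simple ROI statistics. The definitions below are simplified sketches of the NEMA NU 4-style quantities, not the exact standard formulas, whose region definitions are considerably more involved.

```python
# Hedged sketch of ROI-based PET image-quality metrics.

def recovery_coefficient(rod_roi_mean, true_activity):
    # RC: measured mean in a hot-rod ROI over the known true concentration
    return rod_roi_mean / true_activity

def spillover_ratio(cold_roi_mean, background_mean):
    # SOR: mean in a cold insert (air or water) over the uniform background
    return cold_roi_mean / background_mean

def percent_nonuniformity(uniform_roi_values):
    # One common NU definition: (max - min) / (max + min) * 100 over a
    # uniform-region profile
    hi, lo = max(uniform_roi_values), min(uniform_roi_values)
    return (hi - lo) / (hi + lo) * 100.0
```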

  13. Radial period extraction method employing frequency measurement for quantitative collimation testing

    NASA Astrophysics Data System (ADS)

    Li, Sikun; Wang, Xiangzhao

    2016-01-01

    A radial period extraction method employing frequency measurement is proposed for quantitative collimation testing using spiral gratings. The radial period of the difference-frequency fringe is treated as a measure of the collimation condition. A frequency measurement technique based on the wavelet transform and a statistical approach is presented to extract the radial period directly from the amplitude-transmittance spiral fringe. A basic constraint on setting the parameters of the wavelet is introduced, and a strict mathematical demonstration is given. The method outperforms methods employing phase measurement in terms of precision, stability, and noise immunity.

  14. A method for operative quantitative interpretation of multispectral images of biological tissues

    NASA Astrophysics Data System (ADS)

    Lisenko, S. A.; Kugeiko, M. M.

    2013-10-01

    A method for operative retrieval of spatial distributions of biophysical parameters of a biological tissue by using a multispectral image of it has been developed. The method is based on multiple regressions between linearly independent components of the diffuse reflection spectrum of the tissue and unknown parameters. Possibilities of the method are illustrated by an example of determining biophysical parameters of the skin (concentrations of melanin, hemoglobin and bilirubin, blood oxygenation, and scattering coefficient of the tissue). Examples of quantitative interpretation of the experimental data are presented.

  15. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria for pulmonary impairments. In general, rib movement is assessed by fluoroscopy. However, the shadows of lung vessels and bronchi overlapping the ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called the "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique for quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). A bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images to form a vector map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of the ribs from those of other lung structures accurately. Limited rib movements in scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movement: limited rib movements were indicated by reduced velocity vectors and left-right asymmetric distributions on the vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movement without additional radiation dose.

  16. Hepatitis C Virus RNA Real-Time Quantitative RT-PCR Method Based on a New Primer Design Strategy.

    PubMed

    Chen, Lida; Li, Wenli; Zhang, Kuo; Zhang, Rui; Lu, Tian; Hao, Mingju; Jia, Tingting; Sun, Yu; Lin, Guigao; Wang, Lunan; Li, Jinming

    2016-01-01

    Viral nucleic acids are unstable when improperly collected, handled, and stored, resulting in decreased sensitivity of currently available commercial quantitative nucleic acid testing kits. Using known unstable hepatitis C virus RNA, we developed a quantitative RT-PCR method based on a new primer design strategy to reduce the impact of nucleic acid instability on nucleic acid testing. The performance of the method was evaluated for linearity, limit of detection, precision, specificity, and agreement with commercial hepatitis C virus assays. Its clinical application was compared to that of two commercial kits, Cobas AmpliPrep/Cobas TaqMan (CAP/CTM) and Kehua. The quantitative RT-PCR method delivered a good performance, with a linearity of R(2) = 0.99, a total limit of detection (genotypes 1 to 6) of 42.6 IU/mL (95% CI, 32.84 to 67.76 IU/mL), a CV of 1.06% to 3.34%, a specificity of 100%, and a high concordance with the CAP/CTM assay (R(2) = 0.97), with a mean ± SD difference of -0.06 ± 1.96 log IU/mL (range, -0.38 to 0.25 log IU/mL). The method was superior to commercial assays in detecting unstable hepatitis C virus RNA (P < 0.05). This quantitative RT-PCR method can effectively eliminate the influence of RNA instability on nucleic acid testing. The principle of the primer design strategy may be applied to the detection of other RNA or DNA viruses. PMID:26612712
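    The agreement figure reported above (a mean ± SD of paired log10 IU/mL differences) is a Bland-Altman-style bias analysis, sketched here on made-up paired titers rather than the study's data.

```python
# Hedged sketch: Bland-Altman bias and spread of paired log10 viral loads.

import math

def bland_altman_log(assay_a, assay_b):
    """Return (mean, SD) of paired log10 differences between two assays."""
    diffs = [math.log10(x) - math.log10(y)
             for x, y in zip(assay_a, assay_b)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean, sd
```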

  17. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo)

    PubMed Central

    Li, Yi; Kim, Jong-Joo

    2015-01-01

    The efficiency of genome-wide association analysis (GWAS) depends on power of detection for quantitative trait loci (QTL) and precision for QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle, Hanwoo: a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA), and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). Also, the genotype data for the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold value was set in the latter model, such that the top five windows, each of which comprised 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods had high concordance, found in 64.1 to 64.9 Mb on Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb on BTA14 for CWT, 0.5 to 1.5 Mb on BTA6 for BFT, and 26.3 to 33.4 Mb on BTA29 for BFT. Several candidate genes (i.e. glutamate receptor, ionotropic, ampa 1 [GRIA1], family with sequence similarity 110, member B [FAM110B], and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our results suggest that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions to further pinpoint DNA markers or causative genes in these regions. PMID:26104396
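    The chromosome-wide FDR < 0.01 cut-off used for the LDRM and LDLA scans is typically implemented with the Benjamini-Hochberg step-up procedure, sketched below; the p-values in the test are illustrative, not from the study.

```python
# Hedged sketch: Benjamini-Hochberg FDR control over a list of p-values.

def benjamini_hochberg(pvals, q=0.01):
    """Return (sorted) indices of tests declared significant at FDR level q."""
    order = sorted(range(len(pvals)), key=lambda i: pvals[i])
    m = len(pvals)
    k = 0  # largest rank i with p_(i) <= (i/m) * q
    for rank, idx in enumerate(order, start=1):
        if pvals[idx] <= rank / m * q:
            k = rank
    return sorted(order[:k])
```

    Note the step-up behavior: every p-value ranked at or below the largest passing rank is declared significant, even if an intermediate one missed its own threshold.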

  18. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo).

    PubMed

    Li, Yi; Kim, Jong-Joo

    2015-07-01

    The efficiency of genome-wide association analysis (GWAS) depends on power of detection for quantitative trait loci (QTL) and precision for QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle, Hanwoo: a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA), and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). Also, the genotype data for the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold value was set in the latter model, such that the top five windows, each of which comprised 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods had high concordance, found in 64.1 to 64.9 Mb on Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb on BTA14 for CWT, 0.5 to 1.5 Mb on BTA6 for BFT, and 26.3 to 33.4 Mb on BTA29 for BFT. Several candidate genes (i.e. glutamate receptor, ionotropic, ampa 1 [GRIA1], family with sequence similarity 110, member B [FAM110B], and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our results suggest that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions to further pinpoint DNA markers or causative genes in these regions. PMID:26104396

  19. Qualitative and quantitative methods to determine miscibility in amorphous drug-polymer systems.

    PubMed

    Meng, Fan; Dave, Vivek; Chauhan, Harsh

    2015-09-18

    Amorphous drug-polymer systems, or amorphous solid dispersions, are commonly used in the pharmaceutical industry to enhance the solubility of compounds with poor aqueous solubility. The degree of miscibility between drug and polymer is important both for solubility enhancement and for the formation of a physically stable amorphous system. Calculation of solubility parameters, computational data mining, Tg measurements by DSC, and Raman mapping are established traditional methods used to qualitatively assess drug-polymer miscibility. Calculation of the Flory-Huggins interaction parameter, computational analysis of X-ray diffraction (XRD) data, solid-state nuclear magnetic resonance (NMR) spectroscopy, and atomic force microscopy (AFM) have recently been developed to quantitatively determine the miscibility of amorphous drug-polymer systems. This brief review introduces and compiles these qualitative and quantitative methods employed in the evaluation of drug-polymer miscibility. A combination of these techniques can provide deeper insight into the true miscibility of drug-polymer systems. PMID:26006307
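    One of the quantitative approaches named above, the Flory-Huggins interaction parameter, can be first-estimated from Hildebrand solubility parameters as χ = V·(δ_drug − δ_polymer)²/(R·T). This is only one of several routes to χ (melting-point depression is another), and the inputs below are illustrative, not measured values for any specific drug-polymer pair.

```python
# Hedged sketch: solubility-parameter estimate of the Flory-Huggins chi.

R = 8.314  # gas constant, J/(mol*K)

def flory_huggins_chi(v_molar, delta_drug, delta_polymer, temp_k=298.15):
    """v_molar: lattice-site molar volume in m^3/mol;
    solubility parameters in Pa^0.5 (1 MPa^0.5 = 1000 Pa^0.5).
    Returns the dimensionless interaction parameter chi."""
    return v_molar * (delta_drug - delta_polymer) ** 2 / (R * temp_k)
```

    Smaller χ (and in particular δ values within a few MPa^0.5 of each other) is the usual qualitative indicator of likely miscibility.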

  20. Novel method for quantitative assessment of physical workload of healthcare workers by a tetherless ergonomics workstation.

    PubMed

    Smith, Warren D; Alharbi, Kamal A; Dixon, Jeremy B; Reggad, Hind

    2012-01-01

    Healthcare workers are at risk of physical injury. Our laboratory has developed a tetherless ergonomics workstation that is suitable for studying physicians' and nurses' physical workloads in clinical settings. The workstation uses wearable sensors to record multiple channels of body orientation and muscle activity and wirelessly transmits them to a base station laptop computer for display, storage, and analysis. The ergonomics workstation generates long records of multi-channel data, so it is desired that the workstation automatically process these records and provide graphical and quantitative summaries of the physical workloads experienced by the healthcare workers. This paper describes a novel method of automated quantitative assessment of physical workload, termed joint cumulative amplitude-duration (JCAD) analysis, that has advantages over previous methods and illustrates its use in a comparison of the physical workloads of robotically-assisted surgery versus manual video-endoscopic surgery. PMID:23366204

  1. A Study of the Synchrotron Laue Method for Quantitative Crystal Structure Analysis

    NASA Astrophysics Data System (ADS)

    Gomez de Anderez, Dora M.

    1990-01-01

    Quantitative crystal structure analyses have been carried out on small-molecule crystals using synchrotron radiation and the Laue method. A variety of single-crystal structure determinations and associated refinements are used and compared with the monochromatic analyses. The new molecular structure of 7-amino-5-bromo-4-methyl-2-oxo-1,2,3,4-tetrahydro-1,6-naphthyridine-8-carbonitrile (C10H9ON4Br·H2O) has been determined, first using monochromatic Mo Kα radiation and a four-circle diffractometer, then using synchrotron Laue diffraction photography. The structure refinements gave R-factors of 4.97% and 14.0% for the Mo Kα and Laue data respectively. The molecular structure of (S)-2-chloro-2-fluoro-N-((S)-1-phenylethyl)ethanamide (C10H11ClFNO) has been determined using the same crystal throughout for X-ray monochromatic analyses (Mo Kα and Cu Kα) followed by synchrotron Laue data collection. The Laue and monochromatic data compare favourably. The R-factors (on F) were 6.23%, 6.45% and 8.19% for the Mo Kα, Cu Kα and Laue data sets respectively. The molecular structure of 3-(5-hydroxy-3-methyl-1-phenylpyrazol-4-yl)-1,3-diphenylprop-2-en-1-one (C25H20N2O2) has been determined using the synchrotron Laue method. The results compare very well with Mo Kα monochromatic data. The R-factors (on F) were 4.60% and 5.29% for the Mo Kα and Laue analyses respectively. The Laue method is assessed in locating the 20 hydrogen atoms in this structure. The structure analysis of benzil ((C6H5CO)2) was carried out using the synchrotron Laue method, first at room temperature and then at low temperature. The structure shows R-factors (on F) of 13.06% and 6.85% for the two data sets respectively. The synchrotron Laue method was used to collect data for ergocalciferol (Vitamin D2). The same crystal was also used to record oscillation data with the synchrotron monochromatic beam.
A new molecular structure of dinitrato-(N,N'-dimethylethylenediamine)copper(II) has been determined using Mo Kα radiation on a four-circle diffractometer. The refinement resulted in an R-factor (on F) of 4.06%.

  2. a Study of the Synchrotron Laue Method for Quantitative Crystal Structure Analysis.

    NASA Astrophysics Data System (ADS)

    Gomez de Anderez, Dora M.

    1990-01-01

    Available from UMI in association with The British Library. Quantitative crystal structure analyses have been carried out on small-molecule crystals using synchrotron radiation and the Laue method. A variety of single-crystal structure determinations and associated refinements are used and compared with the monochromatic analyses. The new molecular structure of 7-amino-5-bromo-4-methyl-2-oxo-1,2,3,4-tetrahydro-1,6-naphthyridine-8-carbonitrile (C10H9ON4Br·H2O) has been determined, first using monochromatic Mo Kα radiation and a four-circle diffractometer, then using synchrotron Laue diffraction photography. The structure refinements gave R-factors of 4.97% and 14.0% for the Mo Kα and Laue data respectively. The molecular structure of (S)-2-chloro-2-fluoro-N-((S)-1-phenylethyl)ethanamide (C10H11ClFNO) has been determined using the same crystal throughout for X-ray monochromatic analyses (Mo Kα and Cu Kα) followed by synchrotron Laue data collection. The Laue and monochromatic data compare favourably. The R-factors (on F) were 6.23%, 6.45% and 8.19% for the Mo Kα, Cu Kα and Laue data sets respectively. The molecular structure of 3-(5-hydroxy-3-methyl-1-phenylpyrazol-4-yl)-1,3-diphenylprop-2-en-1-one (C25H20N2O2) has been determined using the synchrotron Laue method. The results compare very well with Mo Kα monochromatic data. The R-factors (on F) were 4.60% and 5.29% for the Mo Kα and Laue analyses respectively. The Laue method is assessed in locating the 20 hydrogen atoms in this structure. The structure analysis of benzil ((C6H5CO)2) was carried out using the synchrotron Laue method, first at room temperature and then at low temperature (-114 °C). The structure shows R-factors (on F) of 13.06% and 6.85% for the two data sets respectively. The synchrotron Laue method was used to collect data for ergocalciferol (Vitamin D2).
The same crystal was also used to record oscillation data with the synchrotron monochromatic beam. A new molecular structure of dinitrato-(N,N'-dimethylethylenediamine)copper(II) has been determined using Mo Kα radiation on a four-circle diffractometer. The refinement resulted in an R-factor (on F) of 4.06%.

  3. Quantitative methods for genome-scale analysis of in situ hybridization and correlation with microarray data

    PubMed Central

    Lee, Chang-Kyu; Sunkin, Susan M; Kuan, Chihchau; Thompson, Carol L; Pathak, Sayan; Ng, Lydia; Lau, Chris; Fischer, Shanna; Mortrud, Marty; Slaughterbeck, Cliff; Jones, Allan; Lein, Ed; Hawrylycz, Michael

    2008-01-01

    With the emergence of genome-wide colorimetric in situ hybridization (ISH) data sets such as the Allen Brain Atlas, it is important to understand the relationship between this gene expression modality and those derived from more quantitative based technologies. This study introduces a novel method for standardized relative quantification of colorimetric ISH signal that enables a large-scale cross-platform expression level comparison of ISH with two publicly available microarray brain data sources. PMID:18234097

  4. Localization and quantitation of chloroplast enzymes and light-harvesting components using immunocytochemical methods

    SciTech Connect

    Mustardy, L.; Cunningham, F. X. Jr.; Gantt, E.

    1990-09-01

    Seven chloroplast proteins were localized in Porphyridium cruentum (ATCC 50161) by immunolabeling with colloidal gold on electron microscope sections of log phase cells grown under red, green, and white light. Ribulose bisphosphate carboxylase labeling occurred almost exclusively in the pyrenoid. The major apoproteins of photosystem I (56-64 kD) occurred mostly over the stromal thylakoid region and also appeared over the thylakoids passing through the pyrenoid. Labeling for photosystem II core components (D2 and a 45 kD Chl-binding protein), for phycobilisomes (allophycocyanin, and a 91 kD L_CM linker) and for ATP synthase (β subunit) was predominantly present in the thylakoid region but not in the pyrenoid region of the chloroplast. Red light cells had increased labeling per thylakoid length for polypeptides of photosystem II and of phycobilisomes, while photosystem I density decreased, compared to white light cells. Conversely, green light cells had a decreased density of photosystem II and phycobilisome polypeptides, while photosystem I density changed little compared with white light cells. A comparison of the immunogold labeling results with data from spectroscopic methods and from rocket immunoelectrophoresis indicates that it can provide a quantitative measure of the relative amounts of protein components as well as their localization in specific organellar compartments.

  5. Integrated multiplatform method for in vitro quantitative assessment of cellular uptake for fluorescent polymer nanoparticles.

    PubMed

    Ferrari, Raffaele; Lupi, Monica; Falcetta, Francesca; Bigini, Paolo; Paolella, Katia; Fiordaliso, Fabio; Bisighini, Cinzia; Salmona, Mario; D'Incalci, Maurizio; Morbidelli, Massimo; Moscatelli, Davide; Ubezio, Paolo

    2014-01-31

    Studies of cellular internalization of nanoparticles (NPs) play a paramount role for the design of efficient drug delivery systems, but so far they lack a robust experimental technique able to quantify the NP uptake in terms of number of NPs internalized in each cell. In this work we propose a novel method which provides a quantitative evaluation of fluorescent NP uptake by combining flow cytometry and plate fluorimetry with measurements of number of cells. Single cell fluorescence signals measured by flow cytometry were associated with the number of internalized NPs, exploiting the observed linearity between average flow cytometric fluorescence and overall plate fluorimeter measures, and previous calibration of the microplate reader with serial dilutions of NPs. This precise calibration has been made possible by using biocompatible fluorescent NPs in the range of 20-300 nm with a narrow particle size distribution, functionalized with a covalently bonded dye, Rhodamine B, and synthesized via emulsion free-radical polymerization. We report the absolute number of NPs internalized in mouse mammary tumor cells (4T1) as a function of time for different NP dimensions and surface charges and at several exposure concentrations. The obtained results indicate that 4T1 cells incorporated 10^3-10^4 polymer NPs in a short time, reaching an intracellular concentration 15 times higher than the external one. PMID:24398665
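    The cross-platform calibration described above can be sketched as follows: a plate-reader standard curve (fluorescence vs. known NP number) yields fluorescence-per-NP; total well fluorescence then gives the total number of internalized NPs, which the flow-cytometry distribution apportions across individual cells. The numbers in the test are illustrative only, and the apportioning rule (per-cell signal relative to the mean) is an assumption about how the linkage works, not the authors' exact pipeline.

```python
# Hedged sketch: plate-reader calibration plus per-cell apportionment.

def fluorescence_per_np(np_counts, fluorescences):
    """Slope of a through-origin least-squares standard curve."""
    num = sum(n * f for n, f in zip(np_counts, fluorescences))
    den = sum(n * n for n in np_counts)
    return num / den

def nps_per_cell(well_fluorescence, n_cells, slope, cell_fcm_signals):
    """Mean NPs/cell from plate data, distributed per cell in proportion
    to each cell's flow-cytometry signal."""
    total_nps = well_fluorescence / slope
    mean_fcm = sum(cell_fcm_signals) / len(cell_fcm_signals)
    return [total_nps / n_cells * s / mean_fcm for s in cell_fcm_signals]
```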

  6. Quantitative Analysis Method of Output Loss due to Restriction for Grid-connected PV Systems

    NASA Astrophysics Data System (ADS)

    Ueda, Yuzuru; Oozeki, Takashi; Kurokawa, Kosuke; Itou, Takamitsu; Kitamura, Kiyoyuki; Miyamoto, Yusuke; Yokota, Masaharu; Sugihara, Hiroyuki

    The voltage of a power distribution line increases due to reverse power flow from grid-connected PV systems. With high-density grid connection, the voltage rise is larger than for a stand-alone grid-connected system. To prevent overvoltage of the distribution line, a PV system's output is restricted when the line voltage approaches the upper limit of the control range. Because of this interaction, output losses are larger in the high-density case. This research developed a quantitative analysis method for PV system output and losses to clarify the behavior of grid-connected PV systems. All measured data are classified into loss factors using 1-minute averages of 1-second data instead of the typical 1-hour averages. The operating point on the I-V curve is estimated to quantify the loss due to output restriction, using module temperature, array output voltage, array output current, and solar irradiance. As a result, the loss due to output restriction is successfully quantified and the behavior of output restriction is clarified.

  7. Integrated multiplatform method for in vitro quantitative assessment of cellular uptake for fluorescent polymer nanoparticles

    NASA Astrophysics Data System (ADS)

    Ferrari, Raffaele; Lupi, Monica; Falcetta, Francesca; Bigini, Paolo; Paolella, Katia; Fiordaliso, Fabio; Bisighini, Cinzia; Salmona, Mario; D'Incalci, Maurizio; Morbidelli, Massimo; Moscatelli, Davide; Ubezio, Paolo

    2014-01-01

    Studies of cellular internalization of nanoparticles (NPs) play a paramount role for the design of efficient drug delivery systems, but so far they lack a robust experimental technique able to quantify the NP uptake in terms of number of NPs internalized in each cell. In this work we propose a novel method which provides a quantitative evaluation of fluorescent NP uptake by combining flow cytometry and plate fluorimetry with measurements of number of cells. Single cell fluorescence signals measured by flow cytometry were associated with the number of internalized NPs, exploiting the observed linearity between average flow cytometric fluorescence and overall plate fluorimeter measures, and previous calibration of the microplate reader with serial dilutions of NPs. This precise calibration has been made possible by using biocompatible fluorescent NPs in the range of 20-300 nm with a narrow particle size distribution, functionalized with a covalently bonded dye, Rhodamine B, and synthesized via emulsion free-radical polymerization. We report the absolute number of NPs internalized in mouse mammary tumor cells (4T1) as a function of time for different NP dimensions and surface charges and at several exposure concentrations. The obtained results indicate that 4T1 cells incorporated 10^3-10^4 polymer NPs in a short time, reaching an intracellular concentration 15 times higher than the external one.

  8. Development of a quantitative diagnostic method of estrogen receptor expression levels by immunohistochemistry using organic fluorescent material-assembled nanoparticles

    SciTech Connect

    Gonda, Kohsuke; Miyashita, Minoru; Watanabe, Mika; Takahashi, Yayoi; Goda, Hideki; Okada, Hisatake; Nakano, Yasushi; Tada, Hiroshi; Amari, Masakazu; Ohuchi, Noriaki; Department of Surgical Oncology, Graduate School of Medicine, Tohoku University, Seiryo-machi, Aoba-ku, Sendai 980-8574

    2012-09-28

    Highlights: Black-Right-Pointing-Pointer Organic fluorescent material-assembled nanoparticles for IHC were prepared. Black-Right-Pointing-Pointer New nanoparticle fluorescent intensity was 10.2-fold greater than Qdot655. Black-Right-Pointing-Pointer Nanoparticle staining analyzed a wide range of ER expression levels in tissue. Black-Right-Pointing-Pointer Nanoparticle staining enhanced the quantitative sensitivity for ER diagnosis. -- Abstract: The detection of estrogen receptors (ERs) by immunohistochemistry (IHC) using 3,3 Prime -diaminobenzidine (DAB) is slightly weak as a prognostic marker, but it is essential to the application of endocrine therapy, such as antiestrogen tamoxifen-based therapy. IHC using DAB is a poor quantitative method because horseradish peroxidase (HRP) activity depends on reaction time, temperature and substrate concentration. However, IHC using fluorescent material provides an effective method to quantitatively use IHC because the signal intensity is proportional to the intensity of the photon excitation energy. However, the high level of autofluorescence has impeded the development of quantitative IHC using fluorescence. We developed organic fluorescent material (tetramethylrhodamine)-assembled nanoparticles for IHC. Tissue autofluorescence is comparable to the fluorescence intensity of quantum dots, which are the most representative fluorescent nanoparticles. The fluorescent intensity of our novel nanoparticles was 10.2-fold greater than quantum dots, and they did not bind non-specifically to breast cancer tissues due to the polyethylene glycol chain that coated their surfaces. Therefore, the fluorescent intensity of our nanoparticles significantly exceeded autofluorescence, which produced a significantly higher signal-to-noise ratio on IHC-imaged cancer tissues than previous methods. 
Moreover, immunostaining data from our nanoparticle fluorescent IHC and from IHC with DAB were compared in the same region of adjacent tissue sections to quantitatively compare the two methods. The results demonstrated that our nanoparticle staining analyzed a wide range of ER expression levels with higher accuracy and quantitative sensitivity than DAB staining. This enhancement in diagnostic accuracy and sensitivity for ERs using our immunostaining method will improve the prediction of responses to therapies that target ERs and progesterone receptors induced by downstream ER signaling.

  9. A novel quantitative method of PTEN expression assessment in tumor tissue.

    PubMed

    Waniczek, D; Snietura, M; Kopec, A; Scieglinska, D; Piglowski, W; Lorenc, Z; Muc-Wierzgon, M; Nowakowska-Zajdel, E

    2016-01-01

    The Phosphatase and Tensin Homolog deleted on chromosome 10 (PTEN) gene is one of the most important tumor suppressor genes and is involved in the regulation of many signaling cascades (AKT/PKB and MAPK). Subtle changes in its activity lead to cancer susceptibility or aggressive tumor behaviour. Despite the diversity of mechanisms leading to PTEN inactivation, inactivation is frequently associated with a decreased or complete loss of protein expression; a decrease of only about 20% in PTEN expression can contribute to the development of cancer. There have been no objective, quantitative methods of PTEN expression assessment that allow measurement of subtle variations in protein concentration in a tissue-contextual manner. A new quantitative algorithm for immunostaining evaluation, based on a combination of color deconvolution and relative chromogen signal intensity, was used in the study. The proposed algorithm was implemented in the popular ImageJ image analysis software and positively verified in cancer cell line and tissue models as well as in tissue samples from colorectal cancer (CRC) patients. The proposed quantitative method of PTEN expression assessment creates an alternative to currently available subjective methods and forms the basis for inter-case and inter-tissue comparisons. Using the algorithm, it would be possible to identify three groups of patients with advanced colorectal cancer that differ significantly in overall survival. The research should be continued. PMID:27049078
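The abstract's core computation — color deconvolution followed by chromogen (DAB) signal quantitation — can be sketched in a few lines. This is an illustrative NumPy reimplementation of the standard Ruifrok & Johnston unmixing with published H-DAB stain vectors, not the authors' ImageJ algorithm:

```python
import numpy as np

# Ruifrok & Johnston optical-density vectors for hematoxylin and DAB;
# the third ("residual") channel is the normalized cross product.
hema = np.array([0.650, 0.704, 0.286])
dab = np.array([0.268, 0.570, 0.776])
res = np.cross(hema, dab)
stains = np.stack([hema / np.linalg.norm(hema),
                   dab / np.linalg.norm(dab),
                   res / np.linalg.norm(res)])   # rows = stain OD vectors
unmix = np.linalg.inv(stains)                    # OD -> per-stain amounts

def mean_dab_density(rgb):
    """Mean DAB optical density of an 8-bit RGB image of shape (H, W, 3)."""
    od = -np.log10(np.clip(rgb, 1, 255) / 255.0)  # Beer-Lambert absorbance
    amounts = od.reshape(-1, 3) @ unmix           # column 1 = DAB amount
    return float(np.clip(amounts[:, 1], 0, None).mean())

# Synthetic check: a patch dyed with the pure DAB vector at density 0.5
# should score ~0.5, while a blank white patch scores ~0.
white = np.full((4, 4, 3), 255.0)
dab_patch = np.broadcast_to(255.0 * 10.0 ** (-0.5 * stains[1]), (4, 4, 3))
```

The relative chromogen intensity described in the abstract would then be a comparison of such per-region densities, which is where this sketch stops and the paper's algorithm begins.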

  10. Improvements in quantitative chiral determinations using the mass spectrometric kinetic method

    NASA Astrophysics Data System (ADS)

    Young, Brandy L.; Cooks, R. Graham

    2007-11-01

    A significant shortcoming of the kinetic method for determining chiral purity from the competitive dissociation of a metal complex chelated with a chiral analyte and chiral reference is that it does not give useful quantitative data for high-purity chiral samples in comparison to standard chiral chromatography techniques. To extend the range of applications of the kinetic method to high-purity samples, a "vernier" method is reported which gives more accurate quantitative data for samples covering a narrow range of chiral purity. This is demonstrated for the case of a model amino acid in which Ni(II) is used as the metal cation, asparagine as the chiral reference, and tryptophan as the analyte. By selecting experimental conditions to give a measured peak abundance ratio near unity for the two competitive fragments (where abundance ratios are more accurately measured than at higher or lower values), samples of high (90-100%) chiral purity were more accurately assayed in the experiment using pure tryptophan. By switching the chirality of the reference compound, samples in the low (0-10%) chiral purity range could also be more precisely measured. The improvement in accuracy of the chiral measurement is shown by the average deviation of 0.51% (for the 90-100% chiral purity range of tryptophan using L-Asn as chiral reference) in comparison to the average deviation of 104.1% in the non-vernier region (the region that falls outside the high-accuracy region of interest).
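The claim that abundance ratios near unity are measured most accurately can be illustrated with a simple Poisson counting-statistics argument (our illustration, not the paper's derivation): for a fixed total ion count, the relative standard error of R = Na/Nb, (σR/R)² = 1/Na + 1/Nb, is minimized at R = 1.

```python
import math

def ratio_rel_err(R, total=10_000):
    """Relative standard error of R = Na/Nb under Poisson counting,
    assuming a fixed total ion count Na + Nb = total."""
    na = total * R / (1.0 + R)   # expected counts in each channel
    nb = total / (1.0 + R)
    return math.sqrt(1.0 / na + 1.0 / nb)

# The error is smallest for a ratio near unity and grows toward the
# extremes, which is why the "vernier" conditions tune the measured
# abundance ratio toward 1.
errs = {R: ratio_rel_err(R) for R in (0.01, 0.1, 1.0, 10.0, 100.0)}
```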

  11. Test Characteristics of Urinary Biomarkers Depend on Quantitation Method in Acute Kidney Injury

    PubMed Central

    Md Ralib, Azrina; Pickering, John W.; Shaw, Geoffrey M.; Devarajan, Prasad; Edelstein, Charles L.; Bonventre, Joseph V.

    2012-01-01

    The concentration of urine influences the concentration of urinary biomarkers of AKI. Whether normalization to urinary creatinine concentration, as commonly performed to quantitate albuminuria, is the best method to account for variations in urinary biomarker concentration among patients in the intensive care unit is unknown. Here, we compared the diagnostic and prognostic performance of three methods of biomarker quantitation: absolute concentration, biomarker normalized to urinary creatinine concentration, and biomarker excretion rate. We measured urinary concentrations of alkaline phosphatase, γ-glutamyl transpeptidase, cystatin C, neutrophil gelatinase–associated lipocalin, kidney injury molecule–1, and IL-18 in 528 patients on admission and after 12 and 24 hours. Absolute concentration best diagnosed AKI on admission, but normalized concentrations best predicted death, dialysis, or subsequent development of AKI. Excretion rate on admission did not diagnose or predict outcomes better than either absolute or normalized concentration. Estimated 24-hour biomarker excretion was associated with AKI severity and, for neutrophil gelatinase–associated lipocalin and cystatin C, with poorer survival. In summary, normalization to urinary creatinine concentration improves the prediction of incipient AKI and of outcome but provides no advantage in diagnosing established AKI. The ideal method for quantitating urinary biomarkers of AKI depends on the outcome of interest. PMID:22095948
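The three quantitation methods compared in the study are simple arithmetic on the same measurements. A hedged sketch; the units (biomarker in ng/mL, urinary creatinine in mg/dL, urine flow in mL/h) are assumptions for illustration:

```python
def quantitate(biomarker_ng_ml, creatinine_mg_dl, urine_flow_ml_h):
    """Return the three ways of expressing a urinary biomarker:
    absolute concentration (ng/mL), concentration normalized to urinary
    creatinine (ng/mg), and excretion rate (ng/h)."""
    absolute = biomarker_ng_ml
    creatinine_mg_ml = creatinine_mg_dl / 100.0       # 1 dL = 100 mL
    normalized = biomarker_ng_ml / creatinine_mg_ml   # ng per mg creatinine
    excretion_rate = biomarker_ng_ml * urine_flow_ml_h
    return absolute, normalized, excretion_rate

# e.g. 100 ng/mL biomarker, 50 mg/dL creatinine, 60 mL/h urine output
abs_c, norm_c, rate = quantitate(100.0, 50.0, 60.0)
```

The study's point is that these three numbers can rank patients differently, so the "best" expression depends on whether the goal is diagnosis or prognosis.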

  12. Quantitative HPLC-UV method for the determination of firocoxib from horse and dog plasma.

    PubMed

    Kvaternick, Valerie; Malinski, Thomas; Wortmann, Jill; Fischer, James

    2007-07-01

    A sensitive reversed-phase HPLC-UV method was developed for the determination of firocoxib, a novel and highly selective COX-2 inhibitor, in plasma. A 1.0 mL dog or horse plasma sample is mixed with water and passed through a hydrophobic-lipophilic copolymer solid-phase extraction column to isolate firocoxib. Quantitation is based on an external standard curve. The method has a validated limit of quantitation of 25 ng/mL and a limit of detection of 10 ng/mL. The validated upper limit of quantitation was 2500 ng/mL for horses and 10,000 ng/mL for dogs. Average recoveries were 88-93% for horse plasma and 96-103% for dog plasma, and the coefficient of variation was in all cases less than 10%. This method is suitable for the analysis of clinical samples from pharmacokinetic and bioequivalence studies and for drug monitoring. PMID:17537684

  13. Task 4.4 - development of supercritical fluid extraction methods for the quantitation of sulfur forms in coal

    SciTech Connect

    Timpe, R.C.

    1995-04-01

    Development of advanced fuel forms depends on having reliable quantitative methods for their analysis. Determination of the true chemical forms of sulfur in coal is necessary to develop more effective methods to reduce sulfur content. Past work at the Energy & Environmental Research Center (EERC) indicates that sulfur chemistry has broad implications in combustion, gasification, pyrolysis, liquefaction, and coal-cleaning processes. Current analytical methods are inadequate for accurately measuring sulfur forms in coal. This task was concerned with developing methods to quantitate and identify major sulfur forms in coal based on direct measurement (as opposed to present techniques based on indirect measurement and difference values). The focus was on the forms that were least understood and for which the analytical methods have been the poorest, i.e., organic and elemental sulfur. Improved measurement techniques for sulfatic and pyritic sulfur also need to be developed. A secondary goal was to understand the interconversion of sulfur forms in coal during thermal processing. EERC has developed the first reliable analytical method for extracting and quantitating elemental sulfur from coal (1). This method has demonstrated that elemental sulfur can account for very little or as much as one-third of the so-called organic sulfur fraction. This method has disproved the generally accepted idea that elemental sulfur is associated with the organic fraction. A paper reporting the results obtained on this subject, entitled "Determination of Elemental Sulfur in Coal by Supercritical Fluid Extraction and Gas Chromatography with Atomic Emission Detection," was published in Fuel (A).

  14. Quantitative ultrasound of cortical bone in the femoral neck predicts femur strength: results of a pilot study.

    PubMed

    Grimal, Quentin; Grondin, Julien; Guérard, Sandra; Barkmann, Reinhard; Engelke, Klaus; Glüer, Claus-C; Laugier, Pascal

    2013-02-01

    A significant risk of femoral neck (FN) fracture exists for men and women with an areal bone mineral density (aBMD) higher than the osteoporotic range, as measured with dual-energy X-ray absorptiometry (DXA). Separately measuring the cortical and trabecular FN compartments and combining the results would likely be a critical aspect of enhancing the diagnostic capabilities of a new technique. Because the cortical shell determines a large part of FN strength, a novel quantitative ultrasound (QUS) technique that probes the FN cortical compartment was implemented, and the sensitivity of the method to variations in FN cortical properties and FN strength was tested. Nine femurs (women, mean age 83 years) were subjected to QUS to measure the through-transmission time-of-flight (TOF) at the FN and to mechanical tests to assess strength. Quantitative computed tomography (QCT) scans were performed to enable analysis of the dependence of TOF on bone parameters. DXA was also performed for reference. An ultrasound wave propagating circumferentially in the cortical shell was measured in all specimens. Its TOF was not influenced by the properties of the trabecular compartment. Averaged TOF for nine FN measurement positions/orientations was significantly correlated with strength (R² = 0.79) and with FN cortical QCT variables: total BMD (R² = 0.54); regional BMD in the inferoanterior (R² = 0.90) and superoanterior (R² = 0.57) quadrants; and moment of inertia (R² = 0.71). The results of this study demonstrate that QUS can perform a targeted measurement of the FN cortical compartment. Because the method involves mechanical guided waves, the QUS variable is related to the geometric and material properties of the cortical shell (cortical thickness, tissue elasticity, and porosity). This work opens the way to a multimodal QUS assessment of the proximal femur, combining our approach targeting the cortical shell with the existing modality sensitive to the trabecular compartment.
In vivo feasibility of our approach has to be confirmed with experimental data in patients. PMID:22915370

  15. A New Quantitative Method for the Non-Invasive Documentation of Morphological Damage in Paintings Using RTI Surface Normals

    PubMed Central

    Manfredi, Marcello; Bearman, Greg; Williamson, Greg; Kronkright, Dale; Doehne, Eric; Jacobs, Megan; Marengo, Emilio

    2014-01-01

    In this paper we propose a reliable surface imaging method for the non-invasive detection of morphological changes in paintings. Usually, the evaluation and quantification of changes and defects results mostly from an optical and subjective assessment, through comparison of the previous and subsequent states of conservation and by means of condition reports. Using quantitative Reflectance Transformation Imaging (RTI) we obtain detailed information on the geometry and morphology of the painting surface with a fast, precise and non-invasive method. Accurate and quantitative measurements of deterioration were acquired after the painting experienced artificial damage. Morphological changes were documented using normal-vector images, while the intensity map succeeded in highlighting, quantifying and describing the physical changes. We estimate that the technique can detect morphological damage slightly smaller than 0.3 mm, which would be difficult to detect by eye, considering the painting size. This non-invasive tool could be very useful, for example, to examine paintings and artwork before they travel on loan or during a restoration. The method lends itself to automated analysis of large images and datasets. Quantitative RTI thus eases the transition of extending human vision into the realm of measuring change over time. PMID:25010699

  16. A novel method for quantitative analysis of acetylacetone and ethyl acetoacetate by fluorine-19 nuclear magnetic resonance spectroscopy.

    PubMed

    Zhou, Lulin; Li, Cheng; Weng, Xinchu

    2016-03-01

    A new method utilizing NMR spectra was developed for structural and quantitative analysis of the enol forms of acetylacetone and ethyl acetoacetate. Acetylacetone and ethyl acetoacetate were determined by ¹⁹F NMR upon derivatisation with p-fluorobenzoyl chloride. The base-catalyzed derivatives formed by reaction of acetylacetone and ethyl acetoacetate with p-fluorobenzoyl chloride were analyzed by ¹H and ¹³C NMR spectroscopy. The E and Z configurations of the acetylacetone and ethyl acetoacetate derivatives were separated and purified by thin-layer chromatography. In addition, the ability of ¹⁹F NMR to quantitatively analyze acetylacetone by integration of the appropriate signals of the derivatives was tested and compared. The results further confirmed the enol forms of acetylacetone and ethyl acetoacetate and the feasibility of the ¹⁹F NMR method. This method can potentially be used to characterize E and Z isomers and to quantitatively analyze the E/Z ratio of β-diketone and β-ketoester homologues. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26521683

  17. A field- and laboratory-based quantitative analysis of alluvium: Relating analytical results to TIMS data

    NASA Technical Reports Server (NTRS)

    Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.

    1995-01-01

    Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch and assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, what environmental variable(s), in the absence of bedrock, is/are responsible for influencing the spectral properties of the desert alluvial surface.

  18. Researchers’ views on return of incidental genomic research results: qualitative and quantitative findings

    PubMed Central

    Klitzman, Robert; Appelbaum, Paul S.; Fyer, Abby; Martinez, Josue; Buquez, Brigitte; Wynn, Julia; Waldman, Cameron R.; Phelan, Jo; Parens, Erik; Chung, Wendy K.

    2013-01-01

    Purpose Comprehensive genomic analysis including exome and genome sequencing is increasingly being utilized in research studies, leading to the generation of incidental genetic findings. It is unclear how researchers plan to deal with incidental genetic findings. Methods We conducted a survey of the practices and attitudes of 234 members of the US genetic research community and performed qualitative semistructured interviews with 28 genomic researchers to understand their views and experiences with incidental genetic research findings. Results We found that 12% of the researchers had returned incidental genetic findings, and an additional 28% planned to do so. A large majority of researchers (95%) believe that incidental findings for highly penetrant disorders with immediate medical implications should be offered to research participants. However, there was no consensus on returning incidental results for other conditions varying in penetrance and medical actionability. Researchers raised concerns that the return of incidental findings would impose significant burdens on research and could potentially have deleterious effects on research participants if not performed well. Researchers identified assistance needed to enable effective, accurate return of incidental findings. Conclusion The majority of the researchers believe that research participants should have the option to receive at least some incidental genetic research results. PMID:23807616

  19. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    ERIC Educational Resources Information Center

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  20. The ACCE method: an approach for obtaining quantitative or qualitative estimates of residual confounding that includes unmeasured confounding

    PubMed Central

    Smith, Eric G.

    2015-01-01

    Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors. Method: A method is presented that exploits the recently identified phenomenon of "confounding amplification" to produce, in principle, a quantitative estimate of the total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method's requirements and assumptions are met. Previously published data are used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, it appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method's estimates (although bootstrapping is one plausible approach). Conclusions: To this author's knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward.
The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226
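The central arithmetic described in the abstract is a division: the between-model change in the treatment-effect estimate, adjusted for any association of the introduced variable(s) with outcome, divided by the estimated degree of confounding amplification. A schematic sketch only — the amplification degree and the outcome adjustment are left as inputs, since deriving them is the substance of the paper:

```python
def residual_confounding(effect_base, effect_amplified,
                         amplification_degree, outcome_adjustment=0.0):
    """Schematic ACCE-style estimate: change in the treatment-effect
    estimate between the two nested propensity-score models, minus any
    adjustment for the introduced variable(s)' association with outcome,
    divided by the estimated degree of confounding amplification."""
    delta = effect_amplified - effect_base - outcome_adjustment
    return delta / amplification_degree
```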

  1. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Zakharov, Sergei M.

    1997-01-10

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation.

  2. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Zakharov, S.M.

    1997-01-01

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation. © 1997 American Institute of Physics.

  3. Quantitative analysis methods for three-dimensional microstructure of the solid-oxide fuel cell anode

    NASA Astrophysics Data System (ADS)

    Song, X.; Guan, Y.; Liu, G.; Chen, L.; Xiong, Y.; Zhang, X.; Tian, Y.

    2013-10-01

    The electrochemical performance is closely related to the three-dimensional microstructure of the Ni-YSZ anode. X-ray nano-tomography combined with quantitative analysis methods has been applied to non-destructively study the internal microstructure of the porous Ni-YSZ anode. In this paper, methods for calculating critical structural parameters, such as phase volume fraction, connectivity and active triple-phase boundary (TPB) density, are demonstrated. These structural parameters help to optimize electrodes and improve performance.

  4. The effect of hydraulic loading on bioclogging in porous media: Quantitative results from tomographic imaging

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Davit, Y.; Connolly, J. M.; Gerlach, R.; Wood, B. D.; Wildenschild, D.

    2013-12-01

    Biofilm growth in porous media is generally surface-attached and pore-filling. A direct result of biofilm formation is the clogging of pore space available for fluid transport, an effect that has come to be termed bioclogging. In physical experiments, bioclogging manifests as an increase in differential pressure across experimental specimens, and traditional investigations of bioclogging in 3D porous media have relied on measurements of bulk differential pressure changes to evaluate changes in permeability or hydraulic conductivity. Due to the opaque nature of most types of porous media, visualization of bioclogging has been limited to 2D or pseudo-3D micromodels. As a result, bioclogging models have relied on parameters derived from 2D visualization experiments. Results from these studies have shown that even small changes in pore morphology associated with biofilm growth can significantly alter fluid hydrodynamics. Recent advances in biofilm imaging facilitate the investigation of biofilm growth and bioclogging in porous media through x-ray computed microtomography (CMT) with a functional contrast agent. We used barium sulfate as the contrast agent, a particle suspension that fills all pore space available to fluid flow. X-ray CMT with a barium sulfate contrast agent enables examination of biofilm growth at the micron scale throughout experimental porous-media growth reactors. This method has been applied to investigate changes in macropore morphology associated with biofilm growth. Applied fluid flow rates correspond to initial Reynolds numbers ranging from 0.1 to 100. Results include direct comparison of changes in porosity and hydraulic conductivity as calculated from differential pressure measurements vs. from images.
In addition, parameters such as biofilm thickness, reactive surface area, and attachment surface area will be presented to help characterize biofilm structure at each of the investigated flow rates.
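Image-derived porosity in such a study reduces to voxel counting on the segmented tomograms. A minimal sketch; the label scheme (0 = pore, 1 = grain, 2 = biofilm) is our assumption for illustration:

```python
import numpy as np

PORE, GRAIN, BIOFILM = 0, 1, 2   # assumed segmentation labels

def porosity(volume):
    """Fraction of voxels available to fluid flow in a labeled CMT volume."""
    return np.count_nonzero(volume == PORE) / volume.size

# Biofilm growth converts pore voxels to biofilm voxels, reducing the
# image-derived porosity — the bioclogging signal measured here.
clean = np.array([[PORE, PORE], [GRAIN, PORE]])
fouled = np.array([[PORE, BIOFILM], [GRAIN, BIOFILM]])
```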

  5. Investigation of a dual modal method for bone pathologies using quantitative ultrasound and photoacoustics

    NASA Astrophysics Data System (ADS)

    Steinberg, Idan; Gannot, Israel; Eyal, Avishay

    2015-03-01

    Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and imposes overwhelming healthcare costs. In recent work, we developed a multi-spectral, frequency-domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over purely ultrasonic or optical methods, as it provides both molecular information from the bone absorption spectrum and bone mechanical status from the characteristics of ultrasound propagation. These characteristics include both the speed of sound (SOS) and broadband ultrasonic attenuation (BUA). To test the method's quantitative predictions, we constructed a combined ultrasound and photoacoustic setup. Here, we present a dual-modality system and compare the two methods on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into bone structure and functional status.

  6. Detection of human herpesvirus 8 by quantitative polymerase chain reaction: development and standardisation of methods

    PubMed Central

    2012-01-01

    Background Human herpesvirus 8 (HHV-8), the aetiological agent of Kaposi's sarcoma (KS), multicentric Castleman's disease (MCD), and primary effusion lymphoma (PEL), is rare in Australia, but endemic in Sub-Saharan Africa, parts of South-east Asia and Oceania. While the treatment of external KS lesions can be monitored by clinical observation, the internal lesions of KS, MCD and PEL require extensive and expensive internal imaging, or autopsy. In patients with MCD and PEL, if HHV-8 viraemia is not reduced quickly, ~50% die within 24 months. HHV-8 qPCR is a valuable tool for monitoring HHV-8 viraemia, but is not available in many parts of the world, including those with a high prevalence of KS and HHV-8. Methods A new molecular facility with a stringent three-phase workflow was established, adhering to NPAAC and CLSI guidelines. Three fully validated quantitative assays were developed: two for detection and quantification of HHV-8; one for GAPDH, necessary for normalisation of viral loads in tissue and peripheral blood. Results The HHV-8 ORF73 and ORF26 qPCR assays were 100% specific. All qPCR assays displayed a broad dynamic range (10² to 10¹⁰ copies/μL TE buffer), with limits of detection of 4.85×10³, 5.61×10², and 2.59×10² copies/μL TE buffer and limits of quantification of 4.85×10³, 3.01×10², and 1.38×10² copies/μL TE buffer for HHV-8 ORF73, HHV-8 ORF26, and GAPDH, respectively. The assays were tested on a panel of 35 KS biopsies from Queensland. All were HHV-8 qPCR positive, with an average viral load of 2.96×10⁵ HHV-8 copies/μL DNA extract (range: 4.37×10³ to 1.47×10⁶ copies/μL DNA extract). When normalised, these equate to an average viral load of 2.44×10⁴ HHV-8 copies/10³ cells (range: 2.20×10² to 7.38×10⁵ HHV-8 copies/10³ cells). Conclusions These are the first fully optimised, validated and MIQE-compliant HHV-8 qPCR assays established in Australia.
They worked well for qualitative detection of HHV-8 in archival tissue, and are well-suited for quantitative detection in whole blood. They are now available for research, for clinical diagnosis of HHV-8 infection, and for monitoring treatment efficacy. PMID:22963082
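The normalisation step reported in the Results (copies/μL DNA extract → copies per 10³ cells via the GAPDH assay) can be sketched as below. The assumption of two GAPDH gene copies per diploid cell is ours, for illustration:

```python
def copies_per_1000_cells(hhv8_copies_per_ul, gapdh_copies_per_ul,
                          gapdh_copies_per_cell=2):
    """Normalize an HHV-8 load (copies/uL extract) to copies per 10^3
    cells, using the GAPDH assay as a surrogate for cell number."""
    cells_per_ul = gapdh_copies_per_ul / gapdh_copies_per_cell
    return 1000.0 * hhv8_copies_per_ul / cells_per_ul

# e.g. 2000 HHV-8 copies/uL against 4000 GAPDH copies/uL (~2000 cells/uL)
load = copies_per_1000_cells(2000.0, 4000.0)
```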

  7. Evaluation of a High Intensity Focused Ultrasound-Immobilized Trypsin Digestion and 18O-Labeling Method for Quantitative Proteomics

    SciTech Connect

    Lopez-Ferrer, Daniel; Hixson, Kim K.; Smallwood, Heather S.; Squier, Thomas C.; Petritis, Konstantinos; Smith, Richard D.

    2009-08-01

    A new method that uses immobilized trypsin concomitant with ultrasonic irradiation results in ultra-rapid digestion and thorough 18O labeling for quantitative protein comparisons. The reproducible and highly efficient method provided effective digestions in <1 min and minimized the amount of enzyme required compared to traditional methods. This method was demonstrated for digestion of both simple and complex protein mixtures, including bovine serum albumin, a global proteome extract from bacteria Shewanella oneidensis, and mouse plasma, as well as for the labeling of complex protein mixtures, which validated the application of this method for differential proteomic measurements. This approach is simple, reproducible, cost effective, and rapid, and thus well-suited for automation.

  8. Path Integrals and Exotic Options:. Methods and Numerical Results

    NASA Astrophysics Data System (ADS)

    Bormetti, G.; Montagna, G.; Moreni, N.; Nicrosini, O.

    2005-09-01

    In the framework of the Black-Scholes-Merton model of financial derivatives, a path integral approach to option pricing is presented. A general formula to price path-dependent options on multidimensional and correlated underlying assets is obtained and implemented by means of various flexible and efficient algorithms. As an example, we detail the case of Asian call options. The numerical results are compared with those obtained with other procedures used in quantitative finance and found to be in good agreement. In particular, when pricing at-the-money (ATM) and out-of-the-money (OTM) options, the path integral approach exhibits competitive performance.
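As a point of comparison for such results, an arithmetic-average Asian call can also be priced by plain Monte Carlo under Black-Scholes dynamics. The sketch below is a generic baseline of that kind, not the paper's path-integral algorithm, and the parameter values are illustrative:

```python
import math
import random

def asian_call_mc(s0, strike, r, sigma, t, steps=50, paths=20_000, seed=1):
    """Monte Carlo price of an arithmetic-average Asian call under
    geometric Brownian motion with risk-neutral drift r."""
    rng = random.Random(seed)
    dt = t / steps
    drift = (r - 0.5 * sigma ** 2) * dt
    vol = sigma * math.sqrt(dt)
    payoff_sum = 0.0
    for _ in range(paths):
        s, running = s0, 0.0
        for _ in range(steps):
            s *= math.exp(drift + vol * rng.gauss(0.0, 1.0))
            running += s
        payoff_sum += max(running / steps - strike, 0.0)
    return math.exp(-r * t) * payoff_sum / paths

price = asian_call_mc(100.0, 100.0, 0.05, 0.2, 1.0)  # ATM example
```

Methods such as the path-integral algorithms in the paper aim to beat exactly this kind of brute-force sampling in speed and variance, especially for ATM and OTM strikes.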

  9. Quantitative PCR method for evaluating freshness of whiting (Merlangius merlangus) and plaice (Pleuronectes platessa).

    PubMed

    Duflos, Guillaume; Theraulaz, Laurence; Giordano, Gerard; Mejean, Vincent; Malle, Pierre

    2010-07-01

    We have developed a method for rapid quantification of fish spoilage bacteria based on quantitative PCR with degenerate oligonucleotides that hybridize to the torA gene, which codes for trimethylamine N-oxide reductase, one of the major bacterial enzymes in fish spoilage. To show the utility of this gene, we used conventional PCR with DNA extracts from whiting (Merlangius merlangus) and plaice (Pleuronectes platessa) stored in ice. Quantitative PCR showed that the number of copies of the torA gene, i.e., the number of spoilage bacteria, increases with length of storage. This approach can therefore be used to evaluate freshness for the two fish species studied (whiting and plaice). PMID:20615351

  10. Quantitative and qualitative methods in UK health research: then, now and...?

    PubMed

    McPherson, K; Leydon, G

    2002-09-01

    This paper examines the current status of qualitative and quantitative research in the context of UK (public) health research in cancer. It is proposed that barren competition between qualitative and quantitative methods is inevitable, but that effective synergy between them continues to be essential to research excellence. The perceived methodological utility, with respect to understanding residual uncertainties, can account for the status accorded various research techniques and these will help to explain shifts witnessed in recent years and contribute towards an understanding of what can be realistically expected in terms of future progress. It is argued that the methodological debate, though familiar to many, is worthy of rearticulation in the context of cancer research where the psychosocial aspects of living with a cancer and the related complexity of providing appropriate cancer care are being addressed across Europe, as evidenced in recent directions in policy and research. PMID:12296843

  11. [Study of infrared spectroscopy quantitative analysis method for methane gas based on data mining].

    PubMed

    Zhang, Ai-Ju

    2013-10-01

    Monitoring of methane gas is one of the important factors affecting coal mine safety, and online real-time monitoring of methane is used for mine safety protection. To improve the accuracy of model analysis, the author uses infrared spectroscopy to study quantitative analysis algorithms for gases. By applying data mining technology to the multi-component infrared spectroscopy quantitative analysis algorithm, it was found that a cluster-analysis partial least squares algorithm is clearly superior in accuracy to using partial least squares alone. In addition, to reduce the influence of errors in individual calibration samples on model accuracy, cluster analysis was used for data preprocessing; this denoising method was found to improve the analysis accuracy. PMID:24409709

  12. Are three generations of quantitative molecular methods sufficient in medical virology? Brief review.

    PubMed

    Clementi, Massimo; Bagnarelli, Patrizia

    2015-10-01

    In the last two decades, the development of quantitative molecular methods has characterized the evolution of clinical virology more than any other methodological advancement. Using these methods, a great number of studies have efficiently addressed in vivo the role of viral load, viral replication activity, and viral transcriptional profiles as correlates of disease outcome and progression, and have highlighted the physiopathology of important viral diseases of humans. Furthermore, these studies have contributed to a better understanding of virus-host interactions and have sharply revolutionized research strategies in basic and medical virology. In addition, and importantly from a medical point of view, quantitative methods have provided a rationale for therapeutic intervention and therapy monitoring in medically important viral diseases. Despite the advances in technology and the development of three generations of molecular methods within the last two decades (competitive PCR, real-time PCR, and digital PCR), great challenges still remain for viral testing, related not only to standardization, accuracy, and precision, but also to the selection of the best molecular targets for clinical use and to the identification of thresholds for risk stratification and therapeutic decisions. Future research directions, novel methods and technical improvements could be important to address these challenges. PMID:26485007

  13. Three-dimensional quantitative analysis of adhesive remnants and enamel loss resulting from debonding orthodontic molar tubes

    PubMed Central

    2014-01-01

    Aims To present a new method for direct, quantitative analysis of the enamel surface, and to measure adhesive remnants and enamel loss resulting from debonding molar tubes. Material and methods The buccal surfaces of fifteen extracted human molars were directly scanned with an optical blue-light 3D scanner to the nearest 2 μm. After 20 s of etching, molar tubes were bonded and, after 24 h of storage in 0.9% saline, debonded. The 3D scanning was then repeated. Superimposition and comparison were performed, and shape alterations of the entire objects were analyzed using specialized computer software. Residual adhesive heights as well as enamel loss depths were obtained for the entire buccal surfaces, and residual adhesive volume and enamel loss volume were calculated for every tooth. Results The maximum height of adhesive remaining on the enamel surface was 0.76 mm, and the volume on particular teeth ranged from 0.047 mm³ to 4.16 mm³. The median adhesive remnant volume was 0.988 mm³. Mean depths of enamel loss for particular teeth ranged from 0.0076 mm to 0.0416 mm; the highest maximum depth of enamel loss was 0.207 mm. The median volume of enamel loss was 0.104 mm³ and the maximum volume was 1.484 mm³. Conclusions Blue-light 3D scanning can provide direct, precise scans of the enamel surface, which can be superimposed in order to calculate shape alterations. Debonding molar tubes leaves a certain amount of adhesive remnants on the enamel; however, the interface fracture pattern varies for particular teeth, and areas of enamel loss are present as well. PMID:25208969

  14. Quantitative analysis of gene expression in fixed colorectal carcinoma samples as a method for biomarker validation

    PubMed Central

    OSTASIEWICZ, BEATA; OSTASIEWICZ, PAWEŁ; DUŚ-SZACHNIEWICZ, KAMILA; OSTASIEWICZ, KATARZYNA; ZIÓŁKOWSKI, PIOTR

    2016-01-01

    Biomarkers have been described as the future of oncology. Modern proteomics provide an invaluable tool for near-whole proteome screening for proteins expressed differently in neoplastic vs. healthy tissues. However, in order to select the most promising biomarkers, an independent method of validation is required. The aim of the current study was to propose a methodology for the validation of biomarkers. Due to material availability, the majority of large-scale biomarker studies are performed using formalin-fixed paraffin-embedded (FFPE) tissues; therefore, these were selected for use in the current study. A total of 10 genes were selected from those previously described as the most promising candidate biomarkers, and their expression levels were analyzed with reverse transcription-quantitative polymerase chain reaction (RT-qPCR) using calibrator-normalized relative quantification with efficiency correction. For 6/10 analyzed genes, the results were consistent with the proteomic data; for the remaining four genes, the results were inconclusive. The upregulation of karyopherin α 2 (KPNA2) and chromosome segregation 1-like (CSE1L) in colorectal carcinoma, in addition to downregulation of chloride channel accessory 1 (CLCA1), fatty acid binding protein 1 (FABP1), sodium channel, voltage gated, type VII α subunit (SCN7A) and solute carrier family 26 (anion exchanger), member 3 (SLC26A3), was confirmed. With the combined use of proteomic and genetic tools, it was reported, for the first time to the best of our knowledge, that SCN7A was downregulated in colorectal carcinoma at the mRNA and protein levels. It had been previously suggested that the remaining five genes served an important role in colorectal carcinogenesis, and the current study provided strong evidence to support their use as biomarkers. Thus, it was concluded that the combination of RT-qPCR with proteomics offers a powerful methodology for biomarker identification, which can be used to analyze FFPE samples. PMID:27121919
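    The calibrator-normalized relative quantification with efficiency correction mentioned above follows the textbook efficiency-corrected expression ratio. The sketch below is a generic illustration of that arithmetic; the Cq values and efficiencies in it are hypothetical, not taken from the study.

```python
# Hedged sketch of efficiency-corrected, calibrator-normalized relative
# quantification (Pfaffl-style ratio). All Cq values and efficiencies below
# are illustrative assumptions, not the study's data.

def relative_expression(e_target, e_ref, cq_target_cal, cq_target_sample,
                        cq_ref_cal, cq_ref_sample):
    """Expression ratio of a target gene in a sample vs. a calibrator,
    normalized to a reference gene; e = 2.0 means perfect doubling
    per PCR cycle."""
    target_fold = e_target ** (cq_target_cal - cq_target_sample)
    ref_fold = e_ref ** (cq_ref_cal - cq_ref_sample)
    return target_fold / ref_fold

# Example: target amplifies 3 cycles earlier in the tumour sample than in
# the calibrator while the reference gene is unchanged -> 8-fold
# upregulation at e = 2.0.
ratio = relative_expression(2.0, 2.0, 28.0, 25.0, 20.0, 20.0)
print(round(ratio, 2))  # 8.0
```

    At perfect efficiency (e = 2.0 for both assays) this reduces to the familiar 2^-ΔΔCq calculation; the efficiency correction matters when the two assays amplify at measurably different rates.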

  15. Continuously growing rodent molars result from a predictable quantitative evolutionary change over 50 million years.

    PubMed

    Tapaltsyan, Vagan; Eronen, Jussi T; Lawing, A Michelle; Sharir, Amnon; Janis, Christine; Jernvall, Jukka; Klein, Ophir D

    2015-05-01

    The fossil record is widely informative about evolution, but fossils are not systematically used to study the evolution of stem-cell-driven renewal. Here, we examined evolution of the continuous growth (hypselodonty) of rodent molar teeth, which is fuelled by the presence of dental stem cells. We studied occurrences of 3,500 North American rodent fossils, ranging from 50 million years ago (mya) to 2 mya. We examined changes in molar height to determine whether evolution of hypselodonty shows distinct patterns in the fossil record, and we found that hypselodont taxa emerged through intermediate forms of increasing crown height. Next, we designed a Markov simulation model, which replicated molar height increases throughout the Cenozoic and, moreover, evolution of hypselodonty. Thus, by extension, the retention of the adult stem cell niche appears to be a predictable quantitative rather than a stochastic qualitative process. Our analyses predict that hypselodonty will eventually become the dominant phenotype. PMID:25921530
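    The Markov simulation described above can be sketched in miniature as a discrete-state model of crown height in which lineages step toward higher crowns. The states, transition probability, lineage count, and step count below are illustrative assumptions, not the paper's fitted parameters.

```python
import random

# Minimal sketch of a Markov model of crown-height evolution, in the spirit
# of the simulation described above. States and the per-step transition
# probability are illustrative, not the paper's values.
STATES = ["brachydont", "mesodont", "hypsodont", "hypselodont"]
# Probability of moving one state up per time step; hypselodonty is treated
# as absorbing (no reversals).
P_UP = [0.02, 0.02, 0.02, 0.0]

def simulate(n_lineages=1000, n_steps=480, seed=1):
    """Return the fraction of lineages in each state after n_steps
    (e.g. 48 Myr at 0.1-Myr steps)."""
    rng = random.Random(seed)
    counts = [0] * len(STATES)
    for _ in range(n_lineages):
        state = 0
        for _ in range(n_steps):
            if rng.random() < P_UP[state]:
                state += 1
        counts[state] += 1
    return [c / n_lineages for c in counts]

print(dict(zip(STATES, simulate())))
```

    Because hypselodonty is modeled as an absorbing state, long runs drift toward it, mirroring the abstract's prediction that hypselodonty eventually becomes the dominant phenotype.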

  16. Continuously growing rodent molars result from a predictable quantitative evolutionary change over 50 million years

    PubMed Central

    Mushegyan, Vagan; Eronen, Jussi T.; Lawing, A. Michelle; Sharir, Amnon; Janis, Christine; Jernvall, Jukka; Klein, Ophir D.

    2015-01-01

    Summary The fossil record is widely informative about evolution, but fossils are not systematically used to study the evolution of stem cell-driven renewal. Here, we examined evolution of the continuous growth (hypselodonty) of rodent molar teeth, which is fuelled by the presence of dental stem cells. We studied occurrences of 3500 North American rodent fossils, ranging from 50 million years ago (mya) to 2 mya. We examined changes in molar height to determine if evolution of hypselodonty shows distinct patterns in the fossil record, and we found that hypselodont taxa emerged through intermediate forms of increasing crown height. Next, we designed a Markov simulation model, which replicated molar height increases throughout the Cenozoic, and, moreover, evolution of hypselodonty. Thus, by extension, the retention of the adult stem-cell niche appears to be a predictable quantitative rather than a stochastic qualitative process. Our analyses predict that hypselodonty will eventually become the dominant phenotype. PMID:25921530

  17. Improved Methods for Capture, Extraction, and Quantitative Assay of Environmental DNA from Asian Bigheaded Carp (Hypophthalmichthys spp.)

    PubMed Central

    Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207

  18. A simple regression-based method to map quantitative trait loci underlying function-valued phenotypes.

    PubMed

    Kwak, Il-Youp; Moore, Candace R; Spalding, Edgar P; Broman, Karl W

    2014-08-01

    Most statistical methods for quantitative trait loci (QTL) mapping focus on a single phenotype. However, multiple phenotypes are commonly measured, and recent technological advances have greatly simplified the automated acquisition of numerous phenotypes, including function-valued phenotypes, such as growth measured over time. While methods exist for QTL mapping with function-valued phenotypes, they are generally computationally intensive and focus on single-QTL models. We propose two simple, fast methods that maintain high power and precision and are amenable to extensions with multiple-QTL models using a penalized likelihood approach. After identifying multiple QTL by these approaches, we can view the function-valued QTL effects to provide a deeper understanding of the underlying processes. Our methods have been implemented as a package for R, funqtl. PMID:24931408

  19. Development and Application of Quantitative Detection Method for Viral Hemorrhagic Septicemia Virus (VHSV) Genogroup IVa

    PubMed Central

    Kim, Jong-Oh; Kim, Wi-Sik; Kim, Si-Woo; Han, Hyun-Ja; Kim, Jin Woo; Park, Myoung Ae; Oh, Myung-Joo

    2014-01-01

    Viral hemorrhagic septicemia virus (VHSV) is a problematic pathogen in olive flounder (Paralichthys olivaceus) aquaculture farms in Korea. Thus, it is necessary to develop a rapid and accurate diagnostic method to detect this virus. We developed a quantitative RT-PCR (qRT-PCR) method based on the nucleocapsid (N) gene sequence of a Korean VHSV isolate (genogroup IVa). The slope and R² values of the primer set developed in this study were −0.2928 (96% efficiency) and 0.9979, respectively. Comparison with viral infectivity calculated by the traditional quantification method (TCID₅₀) showed a similar pattern of kinetic changes in vitro and in vivo. The qRT-PCR method reduced detection time compared to that of TCID₅₀, making it a very useful tool for VHSV diagnosis. PMID:24859343
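    For context, a qPCR standard-curve slope maps to amplification efficiency by a textbook formula. The reported slope of −0.2928 is consistent with a curve fitted as log10(template) against Cq (the orientation is my assumption, inferred from the numbers), which works out to the stated ~96% efficiency; the common Cq-against-log10 orientation gives the equivalent slope of about −3.42.

```python
# Textbook qPCR standard-curve arithmetic; the interpretation of the
# reported slope's orientation is an assumption, not stated in the record.

def efficiency_from_cq_slope(slope):
    """Efficiency from a curve fitted as Cq = a*log10(copies) + b
    (the common orientation, slope near -3.32 at 100% efficiency)."""
    return 10 ** (-1.0 / slope) - 1.0

def efficiency_from_inverse_slope(slope):
    """Efficiency when log10(copies) = a*Cq + b was fitted instead."""
    return 10 ** (-slope) - 1.0

print(round(efficiency_from_inverse_slope(-0.2928), 3))  # ~0.96
print(round(efficiency_from_cq_slope(1 / -0.2928), 3))   # same value
```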

  20. Multiresidue method for the quantitation of 20 pesticides in aquatic products.

    PubMed

    Cho, Ha Ra; Park, Jun Seo; Kim, Junghyun; Han, Sang Beom; Choi, Yong Seok

    2015-12-01

    As the consumption of aquatic products has increased, the need to regulate pesticide residues in aquatic products has also emerged. Thus, in this study, a scheduled multiple reaction monitoring (sMRM) method employing a novel extraction and purification step based on QuEChERS with EDTA was developed for the simultaneous quantitation of 20 pesticides (alachlor, aldicarb, carbofuran, diazinon, dimethoate, dimethomorph, ethoprophos, ferimzone, fluridone, hexaconazole, iprobenfos, malathion, methidathion, methiocarb, phenthoate, phosalone, phosmet, phosphamidon, pirimicarb, and simazine) in aquatic products. Additionally, the present method was validated with respect to specificity, linearity (r ≥ 0.980), sensitivity (limit of quantitation (LOQ) ≤ 5 ng/g), relative standard deviation (1.0% ≤ RSD ≤ 19.4%), and recovery (60.1% ≤ recovery ≤ 117.9%). Finally, the validated method was applied to the determination of the 20 pesticide residues in eel and shrimp purchased from local food markets. In the present study, QuEChERS with EDTA was successfully extended to residual pesticide analysis for the first time. The present method could contribute to the rapid and successful establishment of the positive list system in South Korea. PMID:26466578

  1. Development of a HPLC Method for the Quantitative Determination of Capsaicin in Collagen Sponge

    PubMed Central

    Guo, Chun-Lian; Chen, Hong-Ying; Cui, Bi-Ling; Chen, Yu-Huan; Zhou, Yan-Fang; Peng, Xin-Sheng; Wang, Qin

    2015-01-01

    Controlling the concentration of drugs in pharmaceutical products is essential to patient safety. In this study, a simple and sensitive HPLC method was developed to quantitatively analyze capsaicin in collagen sponge. Capsaicin was extracted from the sponge for 30 min with an ultrasonic-wave extraction technique, using methanol as the solvent. The chromatographic method used an isocratic system composed of acetonitrile-water (70 : 30) at a flow rate of 1 mL/min, with detection at 280 nm. Capsaicin was successfully separated with good linearity (the regression equation is A = 9.7182C + 0.8547; R² = 1.0) and excellent recovery (99.72%). The mean capsaicin concentration in collagen sponge was 49.32 mg/g (RSD = 1.30%; n = 3). In conclusion, the ultrasonic-wave extraction method is simple and the extraction efficiency is high. The HPLC assay has excellent sensitivity and specificity and is a convenient method for capsaicin detection in collagen sponge. This paper is the first to discuss quantitative analysis of capsaicin in collagen sponge. PMID:26612986

  2. The quantitative and qualitative recovery of Campylobacter from raw poultry using USDA and Health Canada methods.

    PubMed

    Sproston, E L; Carrillo, C D; Boulter-Bitzer, J

    2014-12-01

    Harmonisation of methods between Canadian government agencies is essential to accurately assess and compare the prevalence and concentrations of Campylobacter present on retail poultry intended for human consumption. The standard qualitative procedure used by Health Canada differs from that used by the USDA for both quantitative and qualitative methods. A comparison of three methods was performed on raw poultry samples obtained from an abattoir to determine whether one method is superior to the others in isolating Campylobacter from chicken carcass rinses. The average percentages of positive samples were 34.72% (95% CI, 29.2-40.2), 39.24% (95% CI, 33.6-44.9), and 39.93% (95% CI, 34.3-45.6) for the US direct plating method, the US enrichment method, and the Health Canada enrichment method, respectively. Overall, there were significant differences when comparing either of the enrichment methods to the direct plating method using McNemar's chi-squared test. On comparison of weekly data (Fisher's exact test), direct plating was inferior to the enrichment methods on only a single occasion. Direct plating is important for enumeration and for establishing the concentration of Campylobacter present on raw poultry. However, enrichment methods are also vital for identifying positive samples where concentrations are below the detection limit of direct plating. PMID:25084671

  3. Comparison of two quantitative fit-test methods using N95 filtering facepiece respirators.

    PubMed

    Sietsema, Margaret; Brosseau, Lisa M

    2016-08-01

    Current regulations require annual fit testing before an employee can wear a respirator during work activities. The goal of this research is to determine whether respirator fit measured with two TSI Portacount instruments simultaneously sampling ambient particle concentrations inside and outside of the respirator facepiece is similar to fit measured during an ambient aerosol condensation nuclei counter quantitative fit test. Sixteen subjects (ten female; six male) were recruited for a range of facial sizes. Each subject donned an N95 filtering facepiece respirator, completed two fit tests in random order (ambient aerosol condensation nuclei counter quantitative fit test and two-instrument real-time fit test) without removing or adjusting the respirator between tests. Fit tests were compared using Spearman's rank correlation coefficients. The real-time two-instrument method fit factors were similar to those measured with the single-instrument quantitative fit test. The first four exercises were highly correlated (r > 0.7) between the two protocols. Respirator fit was altered during the talking or grimace exercise, both of which involve facial movements that could dislodge the facepiece. Our analyses suggest that the new real-time two-instrument methodology can be used in future studies to evaluate fit before and during work activities. PMID:26963561
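    For context, in ambient-aerosol quantitative fit testing a fit factor is the ratio of the particle concentration outside the facepiece to the concentration inside it, and per-exercise fit factors are combined harmonically into an overall fit factor. The sketch below illustrates that arithmetic; the concentration values are made up, not the study's measurements.

```python
# Hedged sketch of quantitative fit-factor arithmetic; all concentration
# values below are illustrative.

def fit_factor(c_outside, c_inside):
    """Fit factor for one exercise: ambient over in-facepiece
    particle concentration."""
    return c_outside / c_inside

def overall_fit_factor(per_exercise_ff):
    """Overall fit factor as the harmonic mean of the per-exercise
    fit factors, as in standard quantitative fit-test protocols."""
    return len(per_exercise_ff) / sum(1.0 / ff for ff in per_exercise_ff)

# Four exercises with illustrative (outside, inside) particle counts/cm^3.
ffs = [fit_factor(co, ci) for co, ci in
       [(2000, 10), (2000, 8), (2000, 20), (2000, 10)]]
print(round(overall_fit_factor(ffs), 1))  # 166.7
```

    The harmonic mean is deliberately dominated by the worst exercise, which is why a single poor-sealing movement (such as the grimace noted above) can pull the overall fit factor down sharply.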

  4. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, J.T.; Kunz, W.E.; Atencio, J.D.

    1982-03-31

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether permanent low-level burial is appropriate for the waste sample.

  5. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Atencio, James D.

    1984-01-01

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.

  6. Broad-spectrum detection and quantitation methods of Soil-borne cereal mosaic virus isolates.

    PubMed

    Vaïanopoulos, Céline; Legrève, Anne; Moreau, Virginie; Bragard, Claude

    2009-08-01

    A broad-spectrum reverse transcription-polymerase chain reaction (RT-PCR) protocol was developed for detecting Soil-borne cereal mosaic virus (SBCMV) isolates, responsible for mosaic diseases in Europe, using primers targeting the highly conserved 3'-untranslated region of RNA-1 and RNA-2 of SBCMV. The 3'-end region is a preferred target for the detection of a wide range of isolates because of its sequence conservation, its tRNA-like structure, its major role in viral replication, and the signal amplification afforded by the presence of numerous genomic and subgenomic RNAs. The primers were also designed for virus quantitation using real-time RT-PCR with SYBR-Green chemistry. No cross-reaction with Wheat spindle streak mosaic virus, frequently associated with SBCMV, was observed. The use of RT-PCR and real-time quantitative RT-PCR allowed more sensitive detection and quantitation of SBCMV than was possible with ELISA. The methods enabled European isolates of SBCMV from Belgium, France, Germany, Italy and the UK to be detected and quantified. Real-time RT-PCR represents a new tool for comparing soil inoculum potential as well as cultivar resistance to SBCMV. PMID:19490978

  7. QDMR: a quantitative method for identification of differentially methylated regions by entropy

    PubMed Central

    Zhang, Yan; Liu, Hongbo; Lv, Jie; Xiao, Xue; Zhu, Jiang; Liu, Xiaojuan; Su, Jianzhong; Li, Xia; Wu, Qiong; Wang, Fang; Cui, Ying

    2011-01-01

    DNA methylation plays critical roles in transcriptional regulation and chromatin remodeling. Differentially methylated regions (DMRs) have important implications for development, aging and diseases. Therefore, genome-wide mapping of DMRs across various temporal and spatial methylomes is important in revealing the impact of epigenetic modifications on heritable phenotypic variation. We present a quantitative approach, quantitative differentially methylated regions (QDMR), to quantify methylation difference and identify DMRs from genome-wide methylation profiles by adapting Shannon entropy. QDMR was applied to synthetic methylation patterns and to methylation profiles detected by methylated DNA immunoprecipitation microarray (MeDIP-chip) in human tissues/cells. This approach gives a reasonable quantitative measure of methylation difference across multiple samples. A DMR threshold was then determined from a methylation probability model. Using this threshold, QDMR identified 10 651 tissue DMRs related to genes enriched for cell differentiation, including 4740 DMRs not identified by the method developed by Rakyan et al. QDMR can also measure the sample specificity of each DMR. Finally, application to methylation profiles detected by reduced representation bisulphite sequencing (RRBS) in mouse showed the platform-free and species-free nature of QDMR. This approach provides an effective tool for the high-throughput identification of potential functional regions involved in epigenetic regulation. PMID:21306990
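    The entropy idea behind QDMR can be sketched as follows: treat a region's methylation levels across samples as a distribution, and flag regions whose Shannon entropy is low (methylation concentrated in few samples) as candidate sample-specific DMRs. This shows only the general principle, not the paper's exact weighting or threshold model, and the methylation levels are made up.

```python
import math

# Hedged sketch of entropy-based DMR scoring; values are illustrative.

def methylation_entropy(levels):
    """Shannon entropy (bits) of a region's methylation levels across
    samples; uniform methylation -> high entropy, sample-specific
    methylation -> low entropy."""
    total = sum(levels)
    probs = [m / total for m in levels if m > 0]
    return -sum(p * math.log2(p) for p in probs)

uniform = [0.5] * 10           # same methylation in all 10 samples
specific = [0.9] + [0.01] * 9  # methylated in essentially one sample

print(round(methylation_entropy(uniform), 3))   # log2(10) ≈ 3.322
print(round(methylation_entropy(specific), 3))  # much lower
```

    Ranking regions by this entropy (and thresholding it, as QDMR does with a probability model) turns "differential methylation" into a single quantitative score per region.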

  8. Development and application of quantitative methods for monitoring dermal and inhalation exposure to propiconazole.

    PubMed

    Flack, Sheila; Goktepe, Ipek; Ball, Louise M; Nylander-French, Leena A

    2008-03-01

    Quantitative methods to measure dermal and inhalation exposure to the fungicide propiconazole were developed in the laboratory and applied in the occupational exposure setting for monitoring five farm workers' exposure during pesticide preparation and application to peach crops. Dermal exposure was measured with tape-strips applied to the skin, and the amount of propiconazole was normalized to keratin content in the tape-strip. Inhalation exposure was measured with an OVS tube placed in the worker's breathing-zone during pesticide handling. Samples were analyzed by GC-MS in EI+ mode (limit of detection 6 pg μl⁻¹). Dermal exposure ranged from non-detectable to 32.1 ± 22.6 ng per μg keratin, while breathing-zone concentrations varied from 0.2 to 2.2 μg m⁻³. A positive correlation was observed between breathing-zone concentrations and ambient air temperature (r² = 0.87, p < 0.01). Breathing-zone concentrations did not correlate with dermal exposure levels (r² = 0.11, p = 0.52). Propiconazole levels were below the limit of detection when rubber gloves, coveralls, and a full-face mask were used. The total-body propiconazole dose, determined for each worker by summing the estimated dermal dose and inhalation dose, ranged from 0.01 to 12 μg per kg body weight per day. Our results show that tape-stripping of the skin and the OVS can be effectively utilized to measure dermal and inhalation exposure to propiconazole, respectively, and that the dermal route of exposure contributed substantially more to the total dose than the inhalation route. PMID:18392276

  9. Methods of experimentation with models and utilization of results

    NASA Technical Reports Server (NTRS)

    Robert,

    1924-01-01

    The present report treats the subject of testing small models in a wind tunnel and of the methods employed for rendering the results constant, accurate and comparable with one another. Detailed experimental results are given.

  10. A Dilute-and-Shoot LC-MS Method for Quantitating Opioids in Oral Fluid.

    PubMed

    Enders, Jeffrey R; McIntire, Gregory L

    2015-10-01

    Opioid testing represents a dominant share of the market in pain management clinical testing facilities. Testing of this drug class in oral fluid (OF) has begun to rise in popularity. OF analysis has traditionally required extensive clean-up protocols and sample concentration, which can be avoided. This work highlights the use of a fast, 'dilute-and-shoot' method that performs no considerable sample manipulation. A quantitative method for the determination of eight common opioids and associated metabolites (codeine, morphine, hydrocodone, hydromorphone, norhydrocodone, oxycodone, noroxycodone and oxymorphone) in OF is described herein. OF sample is diluted 10-fold in methanol/water and then analyzed using an Agilent chromatographic stack coupled with an AB SCIEX 4500. The method has a 2.2-min LC gradient and a cycle time of 2.9 min. In contrast to most published methods of this particular type, this method uses no sample clean-up or concentration and has a considerably faster LC gradient, making it ideal for very high-throughput laboratories. Importantly, the method requires only 100 μL of sample, which is diluted 10-fold prior to injection to help with instrument viability. Baseline separation of all isobaric opioids listed above was achieved on a phenyl-hexyl column. The validated calibration range for this method is 2.5-1,000 ng/mL. This 'dilute-and-shoot' method removes the unnecessary, costly and time-consuming extraction steps found in traditional methods and still surpasses all analytical requirements. PMID:26378142

  11. Exploring the use of storytelling in quantitative research fields using a multiple case study method

    NASA Astrophysics Data System (ADS)

    Matthews, Lori N. Hamlet

    The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather on how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work, and my personal experience with the subject matter was also used as a source of data, in keeping with the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks, the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight into why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five thematic findings emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five themes that emerged from this study, consistent with the existing literature, are: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts, and topics for further research are presented as well.

  12. Investigation of a diffuse optical measurements-assisted quantitative photoacoustic tomographic method in reflection geometry.

    PubMed

    Xu, Chen; Kumavor, Patrick D; Aguirre, Andres; Zhu, Quing

    2012-06-01

    Photoacoustic tomography provides the distribution of absorbed optical energy density, which is the product of the optical absorption coefficient and the optical fluence distribution. We report the experimental investigation of a novel fitting procedure that quantitatively determines the optical absorption coefficient of chromophores. The experimental setup consisted of a hybrid system combining a 64-channel photoacoustic imaging system with a frequency-domain diffuse optical measurement system. The fitting procedure included a complete photoacoustic forward model and an analytical solution for a target chromophore using the diffusion approximation. It combines the information from the photoacoustic image and the background information from the diffuse optical measurements to minimize the difference between the photoacoustic measurements and the forward model data and recover the target absorption coefficient quantitatively. 1-cm-cube phantom absorbers of high and low contrasts were imaged at depths of up to 3.0 cm. The fitted absorption coefficient results were at least 80% of their true values. The sensitivities of this fitting procedure to target location, target radius, and background optical properties were also investigated. We found that this fitting procedure was most sensitive to the accurate determination of the target radius and depth. A blood sample in a thin tube of radius 0.58 mm, simulating a blood vessel, was also studied. The photoacoustic images and fitted absorption coefficients are presented. These results demonstrate the clinical potential of this fitting procedure to quantitatively characterize small lesions in breast imaging. PMID:22734743

  13. Investigation of a diffuse optical measurements-assisted quantitative photoacoustic tomographic method in reflection geometry

    NASA Astrophysics Data System (ADS)

    Xu, Chen; Kumavor, Patrick D.; Aguirre, Andres; Zhu, Quing

    2012-06-01

    Photoacoustic tomography provides the distribution of absorbed optical energy density, which is the product of the optical absorption coefficient and the optical fluence distribution. We report the experimental investigation of a novel fitting procedure that quantitatively determines the optical absorption coefficient of chromophores. The experimental setup consisted of a hybrid system combining a 64-channel photoacoustic imaging system with a frequency-domain diffuse optical measurement system. The fitting procedure included a complete photoacoustic forward model and an analytical solution for a target chromophore using the diffusion approximation. It combines the target information from the photoacoustic image with the background information from the diffuse optical measurements, minimizing the difference between the photoacoustic measurements and the forward-model data to recover the target absorption coefficient quantitatively. Phantom absorbers (1-cm cubes) of high and low contrast were imaged at depths of up to 3.0 cm. The fitted absorption coefficients were at least 80% of their true values. The sensitivity of this fitting procedure to target location, target radius, and background optical properties was also investigated; the procedure was most sensitive to accurate determination of the target radius and depth. A blood sample in a thin tube of radius 0.58 mm, simulating a blood vessel, was also studied. The photoacoustic images and fitted absorption coefficients are presented. These results demonstrate the clinical potential of this fitting procedure to quantitatively characterize small lesions in breast imaging.

  14. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR are its accurate function with external calibration, the lack of any requirement for identical reference materials, a high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experimental evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
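
The external-calibration step that qHNMR relies on reduces to a simple proportionality: integral area per contributing proton is proportional to molar concentration. A minimal sketch with hypothetical integrals (not taken from the review), assuming identical acquisition conditions for analyte and calibrant:

```python
def qhnmr_concentration(area_analyte, n_h_analyte, area_cal, n_h_cal,
                        conc_cal):
    """External-calibration qHNMR: integral area per proton is
    proportional to molar concentration, so the analyte concentration
    follows from the ratio of proton-normalized integrals against a
    calibrant of known concentration."""
    return (area_analyte / n_h_analyte) / (area_cal / n_h_cal) * conc_cal

# Hypothetical integrals: analyte methyl singlet (3 H) with area 45.0,
# external calibrant signal (2 H) with area 60.0 at 10.0 mmol/L
conc = qhnmr_concentration(45.0, 3, 60.0, 2, 10.0)
```

Here the proton-normalized analyte integral (15.0) is half the calibrant's (30.0), so the analyte concentration comes out at half the calibrant concentration.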

  15. Tentative Method for the Qualitative Detection and Quantitative Assessment of Air Contamination by Drugs

    PubMed Central

    Buogo, A.; Eboli, V.

    1972-01-01

    A method for detecting and measuring air contamination by drugs is described which uses an electrostatic bacterial air sampler, sprayers for micronizing drugs, and Mueller-Hinton medium seeded with a highly susceptible strain of Sarcina lutea. Three antibiotics (penicillin, tetracycline, aminosidine) and a sulfonamide (sulfapyrazine) were identified by pretreating portions of medium, showing no bacterial growth, with penicillinase or p-aminobenzoic acid solution and subsequently determining how both drug-susceptible and drug-resistant strains of Staphylococcus aureus were affected by this pretreatment. Quantitative determinations were also attempted by measuring the size of the inhibition zones. PMID:4483536

  16. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, Leon J.; Cremers, David A.

    1985-01-01

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is demonstrated. Analysis time is significantly shortened compared with the usual chemical techniques of analysis.

  17. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, L.J.; Cremers, D.A.

    1982-09-07

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids are described. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is shown. Analysis time is significantly shortened compared with the usual chemical techniques of analysis.

  18. Alternative NMR method for quantitative determination of acyl positional distribution in triacylglycerols and related compounds.

    PubMed

    Simova, S; Ivanova, G; Spassov, S L

    2003-12-01

    High-resolution 13C NMR spectroscopy has been used to analyze the positional distribution of fatty acids in model triacylglycerols. A novel method for quantitative determination of the positional distribution of unsaturated chains in triacylglycerols simultaneously with the ratio of saturated/unsaturated acyl chains has been proposed, utilizing the chemical shift differences of the aliphatic atoms C4, C5, and C6. The use of HSQC-TOCSY spectra allows unequivocal proof of the position of the unsaturated chain as well as complete assignment of the 13C NMR signals in tripalmitin. PMID:14623452

  19. The strengths and weaknesses of quantitative and qualitative research: what method for nursing?

    PubMed

    Carr, L T

    1994-10-01

    The overall purpose of research for any profession is to discover the truth of the discipline. This paper examines the controversy over the methods by which truth is obtained, by examining the differences and similarities between quantitative and qualitative research. The historically negative bias against qualitative research is discussed, as well as the strengths and weaknesses of both approaches, with issues highlighted by reference to nursing research. Consideration is given to issues of sampling; the relationship between the researcher and subject; methodologies and collated data; validity; reliability; and ethical dilemmas. The author identifies that neither approach is superior to the other; qualitative research appears invaluable for the exploration of subjective experiences of patients and nurses, and quantitative methods facilitate the discovery of quantifiable information. Combining the strengths of both approaches in triangulation, if time and money permit, is also proposed as a valuable means of discovering the truth about nursing. It is argued that if nursing scholars limit themselves to one method of enquiry, restrictions will be placed on the development of nursing knowledge. PMID:7822608

  20. Depth determination for shallow teleseismic earthquakes: Methods and results

    NASA Technical Reports Server (NTRS)

    Stein, Seth; Wiens, Douglas A.

    1986-01-01

    Contemporary methods used to determine depths of moderate-sized shallow teleseismic earthquakes are described. These include techniques based on surface wave spectra, and methods which estimate focal depth from the waveforms of body waves. The advantages of different methods and their limitations are discussed, and significant results for plate tectonics, obtained in the last five years by the application of these methods, are presented.

  1. Depth determination for shallow teleseismic earthquakes: Methods and results

    SciTech Connect

    Stein, S.; Wiens, D.A.

    1986-11-01

    Contemporary methods used to determine depths of moderate-sized shallow teleseismic earthquakes are described. These include techniques based on surface wave spectra, and methods which estimate focal depth from the waveforms of body waves. The advantages of different methods and their limitations are discussed, and significant results for plate tectonics, obtained in the last five years by the application of these methods, are presented. 119 references.

  2. A quantitative autoradiographic method for the measurement of local rates of brain protein synthesis

    SciTech Connect

    Dwyer, B.E.; Donatoni, P.; Wasterlain, C.G.

    1982-05-01

    We have developed a new method for measuring local rates of brain protein synthesis in vivo. It combines the intraperitoneal injection of a large dose of low-specific-activity amino acid with quantitative autoradiography. This method has several advantages: 1) it is ideally suited for young or small animals, or where immobilizing an animal is undesirable; 2) the amino acid injection "floods" amino acid pools, so that errors in estimating precursor specific activity, which is especially important in pathological conditions, are minimized; 3) the method provides for the use of a radioautographic internal standard in which valine incorporation is measured directly. Internal standards from experimental animals correct for tissue protein content and for self-absorption of radiation in tissue sections, which could vary under experimental conditions.

  3. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, and by physicochemical studies conducted using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for quantitative HPLC determination. The impurities were detected by a newly developed gradient, reversed-phase high-performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness, and stability of analytical solutions to demonstrate the power of the newly developed HPLC method.

  4. Sample preparation methods for quantitative detection of DNA by molecular assays and marine biosensors.

    PubMed

    Cox, Annie M; Goodwin, Kelly D

    2013-08-15

    The need for quantitative molecular methods is growing in environmental, food, and medical fields but is hindered by low and variable DNA extraction and by co-extraction of PCR inhibitors. DNA extracts from Enterococcus faecium, seawater, and seawater spiked with E. faecium and Vibrio parahaemolyticus were tested by qPCR for target recovery and inhibition. Conventional and novel methods were tested, including Synchronous Coefficient of Drag Alteration (SCODA) and lysis and purification systems used on an automated genetic sensor (the Environmental Sample Processor, ESP). Variable qPCR target recovery and inhibition were measured, significantly affecting target quantification. An aggressive lysis method that utilized chemical, enzymatic, and mechanical disruption enhanced target recovery compared to commercial kit protocols. SCODA purification did not show marked improvement over commercial spin columns. Overall, data suggested a general need to improve sample preparation and to accurately assess and account for DNA recovery and inhibition in qPCR applications. PMID:23790450

  5. A method for estimating the effective number of loci affecting a quantitative character.

    PubMed

    Slatkin, Montgomery

    2013-11-01

    A likelihood method is introduced that jointly estimates the number of loci and the additive effect of alleles that account for the genetic variance of a normally distributed quantitative character in a randomly mating population. The method assumes that measurements of the character are available from one or both parents and an arbitrary number of full siblings. The method uses the fact, first recognized by Karl Pearson in 1904, that the variance of a character among offspring depends on both the parental phenotypes and the number of loci. Simulations show that the method performs well provided that data from a sufficient number of families (on the order of thousands) are available. This method assumes that the loci are in Hardy-Weinberg and linkage equilibrium but does not assume anything about the linkage relationships. It performs equally well if all loci are on the same non-recombining chromosome, provided they are in linkage equilibrium. The method can be adapted to take account of loci already identified as being associated with the character of interest. In that case, the method estimates the number of loci not already known to affect the character. The method applied to measurements of crown-rump length in 281 family trios in a captive colony of African green monkeys (Chlorocebus aethiops sabaeus) estimates the number of loci to be 112 and the additive effect to be 0.26 cm. A parametric bootstrap analysis shows that a rough confidence interval has a lower bound of 14 loci. PMID:23973416
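
Pearson's 1904 observation that the method exploits, that for fixed parental genotypes the variance among offspring grows with the number of segregating loci, can be illustrated by exact enumeration in a toy additive model. The assumption that both parents are heterozygous at every locus is made here purely for illustration:

```python
from itertools import product

def offspring_variance(n_loci, effect):
    """Exact phenotypic variance among offspring when both parents are
    heterozygous at every locus: each of the 2*n_loci transmitted
    alleles independently contributes 0 or `effect` with probability
    1/2, so the variance is n_loci * effect**2 / 2. Computed here by
    enumerating all equally likely allele combinations."""
    values = [effect * sum(alleles)
              for alleles in product((0, 1), repeat=2 * n_loci)]
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

# Offspring variance grows linearly with the number of segregating loci
v1 = offspring_variance(1, 0.26)  # one locus, additive effect 0.26
v3 = offspring_variance(3, 0.26)  # three loci: three times the variance
```

Spreading the same genetic variance over more loci of smaller effect shrinks the within-family variance, which is the signal the likelihood method fits.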

  6. "Do I Need Research Skills in Working Life?": University Students' Motivation and Difficulties in Quantitative Methods Courses

    ERIC Educational Resources Information Center

    Murtonen, Mari; Olkinuora, Erkki; Tynjala, Paivi; Lehtinen, Erno

    2008-01-01

    This study explored university students' views of whether they will need research skills in their future work in relation to their approaches to learning, situational orientations on a learning situation of quantitative methods, and difficulties experienced in quantitative research courses. Education and psychology students in both Finland (N =…

  7. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples

    NASA Technical Reports Server (NTRS)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.

    2011-01-01

    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.
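
A minimal PLS1 regression sketch (NIPALS algorithm) in the spirit of the comparison above; the 5-channel "spectra" and target abundances below are synthetic stand-ins, not ChemCam or LIBS data, and a real model would use full spectra and cross-validation:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal PLS1 via NIPALS (a sketch, not the authors' exact
    implementation). Returns the training means and regression vector B
    so that y_hat = (X_new - x_mean) @ B + y_mean."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xk, yk = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk                  # weight vector for this component
        w /= np.linalg.norm(w)
        t = Xk @ w                     # scores
        tt = t @ t
        p = Xk.T @ t / tt              # X loadings
        qk = (yk @ t) / tt             # y loading
        Xk = Xk - np.outer(t, p)       # deflate
        yk = yk - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)
    return x_mean, y_mean, B

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))               # 30 toy spectra, 5 channels
beta = np.array([0.5, -1.0, 2.0, 0.0, 0.3])
y = X @ beta                               # noise-free "abundances"
x_mean, y_mean, B = pls1_fit(X, y, n_components=5)
y_hat = (X - x_mean) @ B + y_mean
```

With the full number of components and noise-free data, PLS1 reproduces the ordinary least-squares fit exactly; with real spectra the component count is tuned to trade variance capture against overfitting.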

  8. A fast semi-quantitative method for Plutonium determination in an alpine firn/ice core

    NASA Astrophysics Data System (ADS)

    Gabrieli, J.; Cozzi, G.; Vallelonga, P.; Schwikowski, M.; Sigl, M.; Boutron, C.; Barbante, C.

    2009-04-01

    Plutonium is present in the environment as a consequence of atmospheric nuclear tests carried out in the 1960s, nuclear weapons production, and releases by the nuclear industry over the past 50 years. Plutonium, unlike uranium, is essentially anthropogenic; it was first produced and isolated in 1940 by deuteron bombardment of uranium in the cyclotron at Berkeley. It exists in five main isotopes, 238Pu, 239Pu, 240Pu, 241Pu, and 242Pu, derived from civilian and military sources (weapons production and detonation, nuclear reactors, nuclear accidents). In the environment, 239Pu is the most abundant isotope; approximately 6 tons of 239Pu have been released into the environment as a result of 541 atmospheric weapon tests. Pu fallout has been studied in various environmental archives, such as sediments, soil, and herbarium grass. Mid-latitude ice cores have been studied as well, on Mont Blanc in the Western Alps and on Belukha Glacier in the Siberian Altai. We present a Pu record obtained by analyzing 52 discrete samples of an alpine firn/ice core from Colle Gnifetti (Monte Rosa, 4450 m a.s.l.), dating from 1945 to 1991. The 239Pu signal was recorded directly, without preliminary cleaning or preconcentration steps, using an ICP-SFMS (Thermo Element2) equipped with a desolvation system (APEX). 238UH+ interferences were negligible for U concentrations lower than 50 ppt, as verified in both spiked fresh snow and pre-1940 ice samples. The shape of the 239Pu profile reflects the three main periods of atmospheric nuclear weapons testing: the earliest peak extends from 1954/55 to 1958 and includes the first testing period, which reached a maximum in 1958. Despite a temporary halt in testing in 1959/60, the Pu concentration decreased only by half with respect to the 1958 peak. In 1961/62, Pu concentrations rapidly increased, reaching a maximum in 1963 that was about 40% more intense than the 1958 peak. After the signing of the "Limited Test Ban Treaty" between the USA and the USSR, Pu deposition decreased very sharply, reaching a minimum in 1967. The third period (1967-1975) is characterized by irregular Pu profiles with smaller peaks (about 20-30% of the 1964 peak) which could be due to French and Chinese tests. Comparison with the Pu profiles obtained from the Col du Dome and Belukha ice cores by AMS (Accelerator Mass Spectrometry) shows very good agreement. Considering the semi-quantitative method and the analytical uncertainty, the results are also quantitatively comparable. However, the Pu concentrations at Colle Gnifetti are normally 2-3 times greater than at Col du Dome. This could be explained by different air mass transport or, more likely, by different accumulation rates at each site.

  9. Malignant gliomas: current perspectives in diagnosis, treatment, and early response assessment using advanced quantitative imaging methods

    PubMed Central

    Ahmed, Rafay; Oborski, Matthew J; Hwang, Misun; Lieberman, Frank S; Mountz, James M

    2014-01-01

    Malignant gliomas consist of glioblastomas, anaplastic astrocytomas, anaplastic oligodendrogliomas and anaplastic oligoastrocytomas, and some less common tumors such as anaplastic ependymomas and anaplastic gangliogliomas. Malignant gliomas have high morbidity and mortality. Even with optimal treatment, median survival is only 12–15 months for glioblastomas and 2–5 years for anaplastic gliomas. However, recent advances in imaging and quantitative analysis of image data have led to earlier diagnosis of tumors and earlier assessment of tumor response to therapy, providing oncologists with a greater time window for therapy management. In addition, improved understanding of tumor biology, genetics, and resistance mechanisms has enhanced surgical techniques, chemotherapy methods, and radiotherapy administration. After proper diagnosis and institution of appropriate therapy, there is now a vital need for quantitative methods that can sensitively detect malignant glioma response to therapy at early follow-up times, when changes in the management of nonresponders can have their greatest effect. Currently, response is largely evaluated by measuring magnetic resonance contrast and size change, but this approach does not take into account the key biologic steps that precede tumor size reduction. Molecular imaging is ideally suited to measuring early response by quantifying cellular metabolism, proliferation, and apoptosis, activities altered early in treatment. We expect that successful integration of quantitative imaging biomarker assessment into the early phase of clinical trials could provide a novel approach for testing new therapies and, importantly, for facilitating patient management, sparing patients from weeks or months of toxicity and ineffective treatment. This review will present an overview of epidemiology, molecular pathogenesis, and current advances in diagnosis and management of malignant gliomas. PMID:24711712

  10. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Pavlov, Konstantin A.

    1997-01-10

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing.

  11. Rapid method for glutathione quantitation using high-performance liquid chromatography with coulometric electrochemical detection.

    PubMed

    Bayram, Banu; Rimbach, Gerald; Frank, Jan; Esatbeyoglu, Tuba

    2014-01-15

    A rapid, sensitive, and direct method (without derivatization) was developed for the detection of reduced glutathione (GSH) in cultured hepatocytes (HepG2 cells) using high-performance liquid chromatography with electrochemical detection (HPLC-ECD). The method was validated according to the guidelines of the U.S. Food and Drug Administration in terms of linearity, lower limit of quantitation (LOQ), lower limit of detection (LOD), precision, accuracy, recovery, and stabilities of GSH standards and quality control samples. The total analysis time was 5 min, and the retention time of GSH was 1.78 min. Separation was carried out isocratically using 50 mM sodium phosphate (pH 3.0) as a mobile phase with a fused-core column. The detector response was linear between 0.01 and 80 μmol/L, and the regression coefficient (R(2)) was >0.99. The LOD for GSH was 15 fmol, and the intra- and interday recoveries ranged between 100.7 and 104.6%. This method also enabled the rapid detection (in 4 min) of other compounds involved in GSH metabolism such as uric acid, ascorbic acid, and glutathione disulfide. The optimized and validated HPLC-ECD method was successfully applied for the determination of GSH levels in HepG2 cells treated with buthionine sulfoximine (BSO), an inhibitor, and α-lipoic acid (α-LA), an inducer of GSH synthesis. As expected, the amount of GSH concentration-dependently decreased with BSO and increased with α-LA treatments in HepG2 cells. This method could also be useful for the quantitation of GSH, uric acid, ascorbic acid, and glutathione disulfide in other biological matrices such as tissue homogenates and blood. PMID:24328299
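
Linearity figures of merit such as R(2) > 0.99 come from an ordinary least-squares calibration curve of detector response against standard concentration. A sketch with hypothetical peak areas (not the paper's data), spanning the stated 0.01–80 μmol/L range:

```python
import numpy as np

def calibration_fit(conc, response):
    """Fit response = slope*conc + intercept by least squares and
    return (slope, intercept, R^2) as a linearity check."""
    slope, intercept = np.polyfit(conc, response, 1)
    fitted = slope * conc + intercept
    ss_res = np.sum((response - fitted) ** 2)
    ss_tot = np.sum((response - response.mean()) ** 2)
    return slope, intercept, 1.0 - ss_res / ss_tot

# Hypothetical GSH standards (umol/L) and detector peak areas
conc = np.array([0.01, 0.1, 1.0, 10.0, 40.0, 80.0])
area = np.array([0.8, 7.9, 81.0, 798.0, 3205.0, 6398.0])
slope, intercept, r2 = calibration_fit(conc, area)
```

Unknown samples are then quantified by inverting the fitted line, (area - intercept) / slope, provided they fall inside the calibrated range.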

  12. A quantitative method for estimation of volume changes in arachnoid foveae with age.

    PubMed

    Duray, Stephen M; Martel, Stacie S

    2006-03-01

    Age-related changes of arachnoid foveae have been described, but objective, quantitative analyses are lacking. A new quantitative method is presented for estimation of change in total volume of arachnoid foveae with age. The pilot sample consisted of nine skulls from the Palmer Anatomy Laboratory. Arachnoid foveae were filled with sand, which was extracted using a vacuum pump. Mass was determined with an analytical balance and converted to volume. A reliability analysis was performed using intraclass correlation coefficients. The method was found to be highly reliable (intraobserver ICC = 0.9935, interobserver ICC = 0.9878). The relationship between total volume and age was then examined in a sample of 63 males of accurately known age from the Hamann-Todd collection. Linear regression analysis revealed no statistically significant relationship between total volume and age, or foveae frequency and age (alpha = 0.05). Development of arachnoid foveae may be influenced by health factors, which could limit its usefulness in aging. PMID:16566755
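
The reliability analysis above rests on intraclass correlation coefficients; a one-way random-effects ICC(1,1) can be computed from a skulls-by-observers table as sketched below. The ICC form and the volume values are illustrative assumptions, since the abstract does not state which ICC variant was used:

```python
import numpy as np

def icc_one_way(ratings):
    """One-way random-effects ICC(1,1) from an (n_targets, k_raters)
    table: (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB is the
    between-target and MSW the within-target mean square."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    target_means = ratings.mean(axis=1)
    msb = k * np.sum((target_means - grand) ** 2) / (n - 1)
    msw = np.sum((ratings - target_means[:, None]) ** 2) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical fovea volumes (mm^3): two observers, four skulls, with
# highly consistent measurements, so the ICC should approach 1
vols = [[12.1, 12.0], [8.4, 8.5], [15.2, 15.1], [6.7, 6.8]]
icc = icc_one_way(vols)
```

Agreement between observers far exceeding the within-target measurement scatter is what drives reported ICCs like 0.9878 toward 1.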

  13. Development of a High-Sensitivity Quantitation Method for Arginine Vasopressin by High-Performance Liquid Chromatography Tandem Mass Spectrometry, and Comparison with Quantitative Values by Radioimmunoassay.

    PubMed

    Tsukazaki, Yasuko; Senda, Naoto; Kubo, Kinya; Yamada, Shigeru; Kugoh, Hiroyuki; Kazuki, Yasuhiro; Oshimura, Mitsuo

    2016-01-01

    Human plasma arginine vasopressin (AVP) levels serve as a clinically relevant marker of diabetes and related syndromes. We developed a highly sensitive method for measuring human plasma AVP using high-performance liquid chromatography tandem mass spectrometry. AVP was extracted from human plasma using a weak-cation solid-phase extraction plate, and separated on a wide-bore octadecyl reverse-phase column. AVP was quantified in ion-transition experiments utilizing a product ion (m/z 328.3) derived from its parent ion (m/z 542.8). The sensitivity was enhanced using 0.02% dichloromethane as a mobile-phase additive. The lower limit of quantitation was 0.200 pmol/L. The extraction recovery ranged from 70.2 ± 7.2 to 73.3 ± 6.2% (mean ± SD), and the matrix effect ranged from 1.1 - 1.9%. Quality-testing samples revealed interday/intraday accuracy and precision ranging over 0.9 - 3% and -0.3 - 2%, respectively, which included the endogenous baseline. Our results correlated well with radioimmunoassay results using 22 human volunteer plasma samples. PMID:26860558

  14. A quantitative method for measurement of HL-60 cell apoptosis based on diffraction imaging flow cytometry technique

    PubMed Central

    Yang, Xu; Feng, Yuanming; Liu, Yahui; Zhang, Ning; Lin, Wang; Sa, Yu; Hu, Xin-Hua

    2014-01-01

    A quantitative method for measurement of apoptosis in HL-60 cells based on a polarization diffraction imaging flow cytometry technique is presented in this paper. Through a comparative study with existing methods and the analysis of diffraction images by a gray-level co-occurrence matrix (GLCM) algorithm, we found that 4 GLCM parameters, contrast (CON), cluster shade (CLS), correlation (COR), and dissimilarity (DIS), exhibit high sensitivity to the apoptotic rate. It was further demonstrated that the CLS parameter correlates significantly (R2 = 0.899) with the degree of nuclear fragmentation, and the other three parameters showed very good correlations (R2 ranges from 0.69 to 0.90). These results demonstrate that the new method has the capability for rapid and accurate extraction of morphological features to quantify cellular apoptosis without the need for cell staining. PMID:25071957
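
Two of the four GLCM texture parameters named above (CON and DIS) can be computed from a normalized co-occurrence matrix as follows. The tiny 3-level image is a contrived example; a full implementation would also handle gray-level quantization, multiple offsets, and the CLS and COR statistics:

```python
import numpy as np

def glcm_features(img, levels):
    """Build the normalized gray-level co-occurrence matrix for the
    horizontal neighbor offset (0, 1), then compute the contrast (CON)
    and dissimilarity (DIS) texture parameters."""
    img = np.asarray(img)
    glcm = np.zeros((levels, levels))
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        glcm[a, b] += 1           # count co-occurring gray-level pairs
    glcm /= glcm.sum()            # normalize to joint probabilities
    i, j = np.indices(glcm.shape)
    contrast = np.sum(glcm * (i - j) ** 2)
    dissimilarity = np.sum(glcm * np.abs(i - j))
    return contrast, dissimilarity

# Tiny 3-level test image (6 horizontal pixel pairs)
img = [[0, 0, 1],
       [0, 0, 1],
       [0, 2, 2]]
contrast, dissim = glcm_features(img, levels=3)
```

On diffraction images, larger CON/DIS values indicate coarser intensity texture, which is how these parameters track apoptosis-related morphological change.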

  15. Characterization of thermal desorption instrumentation with a direct liquid deposition calibration method for trace 2,4,6-trinitrotoluene quantitation.

    PubMed

    Field, Christopher R; Giordano, Braden C; Rogers, Duane A; Lubrano, Adam L; Rose-Pehrsson, Susan L

    2012-03-01

    The use of thermal desorption systems for the analysis of trace vapors typically requires establishing a calibration curve from vapors generated with a permeation tube. The slow equilibration time of permeation tubes makes such an approach laborious when covering a wide dynamic range. Furthermore, many analytes of interest, such as explosives, are not available as permeation tubes. A method for easily and effectively establishing calibration curves for explosive vapor samples via direct deposition of standard solutions on thermal desorption tubes was investigated. The various components of the thermal desorption system were compared to a standard split/splitless inlet. Calibration curves using the direct liquid deposition method with a thermal desorption unit coupled to a cryo-focusing inlet were compared to those from a standard split/splitless inlet; a statistically significant difference was observed, but it does not preclude the use of the direct liquid deposition method for obtaining quantitative results for explosive vapors. PMID:22265176

  16. Quantitative analysis of gene expression in fixed colorectal carcinoma samples as a method for biomarker validation.

    PubMed

    Ostasiewicz, Beata; Ostasiewicz, Paweł; Duś-Szachniewicz, Kamila; Ostasiewicz, Katarzyna; Ziółkowski, Piotr

    2016-06-01

    Biomarkers have been described as the future of oncology. Modern proteomics provides an invaluable tool for near-whole proteome screening for proteins expressed differently in neoplastic vs. healthy tissues. However, in order to select the most promising biomarkers, an independent method of validation is required. The aim of the current study was to propose a methodology for the validation of biomarkers. Due to material availability, the majority of large-scale biomarker studies are performed using formalin-fixed paraffin-embedded (FFPE) tissues, therefore these were selected for use in the current study. A total of 10 genes were selected from those previously described as the most promising candidate biomarkers, and the expression levels were analyzed with reverse transcription-quantitative polymerase chain reaction (RT-qPCR) using calibrator-normalized relative quantification with efficiency correction. For 6/10 analyzed genes, the results were consistent with the proteomic data; for the remaining four genes, the results were inconclusive. The upregulation of karyopherin α 2 (KPNA2) and chromosome segregation 1-like (CSE1L) in colorectal carcinoma, in addition to downregulation of chloride channel accessory 1 (CLCA1), fatty acid binding protein 1 (FABP1), sodium channel, voltage gated, type VII α subunit (SCN7A) and solute carrier family 26 (anion exchanger), member 3 (SLC26A3), was confirmed. With the combined use of proteomic and genetic tools, it was reported, for the first time to the best of our knowledge, that SCN7A was downregulated in colorectal carcinoma at the mRNA and protein levels. It had been previously suggested that the remaining five genes serve an important role in colorectal carcinogenesis; however, the current study provided strong evidence to support their use as biomarkers. Thus, it was concluded that the combination of RT-qPCR with proteomics offers a powerful methodology for biomarker identification, which can be used to analyze FFPE samples. PMID:27121919
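
Calibrator-normalized relative quantification with efficiency correction is commonly computed as a Pfaffl-style ratio; the sketch below assumes that form (the abstract does not spell out the exact formula) and uses made-up Ct values:

```python
def relative_expression(e_target, dct_target, e_ref, dct_ref):
    """Efficiency-corrected relative quantification (Pfaffl-style):
    ratio = E_target**dCt_target / E_ref**dCt_ref, where each dCt is
    Ct(calibrator) - Ct(sample) and E is the amplification efficiency
    (2.0 means perfect doubling per cycle)."""
    return (e_target ** dct_target) / (e_ref ** dct_ref)

# Hypothetical run: the target gene amplifies 3 cycles earlier in the
# tumor sample than in the calibrator, the reference gene 1 cycle
# earlier; both assays at perfect efficiency
ratio = relative_expression(2.0, 3.0, 2.0, 1.0)
```

Dividing by the reference-gene term normalizes away differences in input cDNA amount, and using measured efficiencies instead of an assumed 2.0 is what distinguishes this from the plain delta-delta-Ct method.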

  17. Age-related changes in rat cerebellar basket cells: a quantitative study using unbiased stereological methods

    PubMed Central

    HENRIQUE, RUI M. F.; ROCHA, EDUARDO; REIS, ALCINDA; MARCOS, RICARDO; OLIVEIRA, MARIA H.; SILVA, MARIA W.; MONTEIRO, ROGÉRIO A. F.

    2001-01-01

    Cortical cerebellar basket cells are stable postmitotic cells; hence, they are liable to endure age-related changes. Since the cerebellum is a vital organ for postural control, equilibrium and motor coordination, we aimed to determine the quantitative morphological changes in those interneurons with the ageing process, using unbiased techniques. Material from the cerebellar cortex (Crus I and Crus II) was collected from female rats aged 2, 6, 9, 12, 15, 18, 21 and 24 mo (5 animals per age group), fixed by intracardiac perfusion, and processed for transmission electron microscopy, using conventional techniques. Serial semithin sections were obtained (5 blocks from each rat), enabling the determination of the number-weighted mean nuclear volume (by the nucleator method). On ultrathin sections, 25 cell profiles from each animal were photographed. The volume density of the nucleus, ground substance, mitochondria, Golgi apparatus (Golgi) and dense bodies (DB), and the mean surface density of the rough endoplasmic reticulum (RER) were determined, by point counting, using a morphometric grid. The mean total volumes of the soma and organelles and the mean total surface area of the RER [s̄N (RER)] were then calculated. The results were analysed with 1-way ANOVA; post hoc pairwise comparisons of group means were performed using the Newman-Keuls test. The relation between age and each of the parameters was studied by regression analysis. Significant age-related changes were observed for the mean volumes of the soma, ground substance, Golgi, DB, and s̄N (RER). Positive linear trends were found for the mean volumes of the ground substance, Golgi, and DB; a negative linear trend was found for the s̄N (RER). These results indicate that rat cerebellar basket cells endure important age-related changes. The significant decrease in the s̄N (RER) may be responsible for a reduction in the rate of protein synthesis. Additionally, it may be implicated in a cascade of events leading to cell damage due to the excitotoxic activity of glutamate, which could interfere with the functioning of the complex cerebellar neuronal network. PMID:11465864

  18. A method for accurate detection of genomic microdeletions using real-time quantitative PCR

    PubMed Central

    Weksberg, Rosanna; Hughes, Simon; Moldovan, Laura; Bassett, Anne S; Chow, Eva WC; Squire, Jeremy A

    2005-01-01

Background Quantitative Polymerase Chain Reaction (qPCR) is a well-established method for quantifying levels of gene expression, but has not been routinely applied to the detection of constitutional copy number alterations of human genomic DNA. Microdeletions or microduplications of the human genome are associated with a variety of genetic disorders. Although clinical laboratories routinely use fluorescence in situ hybridization (FISH) to identify such cryptic genomic alterations, there remains a significant number of individuals in whom constitutional genomic imbalance is suspected, based on clinical parameters, but cannot be readily detected using current cytogenetic techniques. Results In this study, a novel application for real-time qPCR is presented that can be used to reproducibly detect chromosomal microdeletions and microduplications. This approach was applied to DNA from a series of patient samples and controls to validate genomic copy number alteration at cytoband 22q11. The study group comprised 12 patients with clinical symptoms of chromosome 22q11 deletion syndrome (22q11DS), 1 patient trisomic for 22q11 and 4 normal controls. 6 of the patients (group 1) had known hemizygous deletions, as detected by standard diagnostic FISH, whilst the remaining 6 patients (group 2) were classified as 22q11DS negative using the clinical FISH assay. Screening of the patients and controls with a set of 10 real-time qPCR primers, spanning the 22q11.2-deleted region and flanking sequence, confirmed the FISH assay results for all patients with 100% concordance. Moreover, this qPCR enabled a refinement of the region of deletion at 22q11. Analysis of DNA from the chromosome 22 trisomic sample demonstrated genomic duplication within 22q11. Conclusion In this paper we present a qPCR approach for the detection of chromosomal microdeletions and microduplications.
The strategy uses in silico modelling for qPCR primer design to avoid regions of repetitive DNA, whilst providing a level of genomic resolution greater than that of standard cytogenetic assays. The implementation of qPCR detection in clinical laboratories will address the need to replace complex, expensive and time-consuming FISH screening to detect genomic microdeletions or duplications of clinical importance. PMID:16351727
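The copy-number call behind such a qPCR assay rests on comparative-Ct arithmetic. A minimal sketch, assuming roughly 100% amplification efficiency and using illustrative Ct values (the authors' actual primers and normalization scheme are in the paper, not reproduced here):

```python
def copy_number_ratio(ct_target_patient, ct_ref_patient,
                      ct_target_control, ct_ref_control):
    """Relative copy number at a genomic locus via the comparative-Ct
    (2^-ddCt) calculation, assuming ~100% amplification efficiency.
    The reference locus is a region of known normal (diploid) copy number."""
    d_ct_patient = ct_target_patient - ct_ref_patient
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_patient - d_ct_control)

# Hemizygous deletion: the target locus amplifies ~1 cycle later in the patient.
deleted = copy_number_ratio(26.0, 24.0, 25.0, 24.0)      # ~0.5 (one copy vs two)
# Duplication/trisomy: the target amplifies earlier relative to the reference.
duplicated = copy_number_ratio(24.42, 24.0, 25.0, 24.0)  # ~1.5 (three copies)
```

A ratio near 0.5 flags a microdeletion and a ratio near 1.5 a duplication, which is the qualitative pattern the study reports for the 22q11 samples.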

  19. Methods for a quantitative evaluation of odd-even staggering effects

    NASA Astrophysics Data System (ADS)

    Olmi, Alessandro; Piantelli, Silvia

    2015-12-01

Odd-even effects, also known as staggering effects, are a common feature observed in the yield distributions of fragments produced in different types of nuclear reactions. We review old methods, and we propose new ones, for a quantitative estimation of these effects as a function of proton or neutron number of the reaction products. All methods are compared on the basis of Monte Carlo simulations. We find that some are not well suited for the task, the most reliable ones being those based either on a non-linear fit with a properly oscillating function or on a third (or fourth) finite difference approach. In any case, high statistics are of paramount importance to prevent spurious structures from appearing merely because of statistical fluctuations in the data and strong correlations among the yields of neighboring fragments.
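One of the third-finite-difference estimators of this kind can be sketched as follows; the exponential yield shape and the 10% even-Z enhancement below are illustrative assumptions, not data from the paper:

```python
import math

def staggering_third_difference(yields):
    """Odd-even staggering from a yield distribution Y(Z) via the third
    finite difference of the log-yields, L = ln Y:
        d(Z) = (-1)^Z / 8 * [L(Z+2) - 3 L(Z+1) + 3 L(Z) - L(Z-1)]
    For a smooth distribution d(Z) is ~0; a constant odd-even enhancement
    factor shows up as a roughly constant positive d(Z)."""
    L = [math.log(y) for y in yields]
    out = {}
    for z in range(1, len(L) - 2):
        out[z] = (-1) ** z / 8.0 * (L[z + 2] - 3 * L[z + 1] + 3 * L[z] - L[z - 1])
    return out

# Smooth exponential distribution modulated by a 10% even-Z enhancement.
smooth = [1000.0 * math.exp(-0.1 * z) for z in range(12)]
staggered = [y * (1.10 if z % 2 == 0 else 0.90) for z, y in enumerate(smooth)]
d = staggering_third_difference(staggered)
```

The third difference cancels the smooth (locally polynomial) part of ln Y, so every d(Z) here comes out close to (ln 1.10 - ln 0.90)/2, about 0.10, i.e. the imposed staggering amplitude.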

  20. An Image Analysis Method for the Precise Selection and Quantitation of Fluorescently Labeled Cellular Constituents

    PubMed Central

    Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.

    2012-01-01

The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. This method can be used to overcome many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600

  1. Establishment of a new method to quantitatively evaluate hyphal fusion ability in Aspergillus oryzae.

    PubMed

    Tsukasaki, Wakako; Maruyama, Jun-Ichi; Kitamoto, Katsuhiko

    2014-01-01

Hyphal fusion is involved in the formation of an interconnected colony in filamentous fungi, and it is the first process in sexual/parasexual reproduction. However, it has been difficult to evaluate hyphal fusion efficiency in Aspergillus oryzae, in spite of its industrial significance, because of the low frequency of fusion events. Here, we established a method to quantitatively evaluate the hyphal fusion ability of A. oryzae using a mixed culture of two different auxotrophic strains, where the ratio of heterokaryotic conidia growing without the auxotrophic requirements reflects the hyphal fusion efficiency. By employing this method, it was demonstrated that AoSO and AoFus3 are required for hyphal fusion, and that the hyphal fusion efficiency of A. oryzae was increased by depleting the nitrogen source, including large amounts of carbon source, and adjusting the pH to 7.0. PMID:25229867

  2. A Rapid and Quantitative Flow Cytometry Method for the Analysis of Membrane Disruptive Antimicrobial Activity

    PubMed Central

    O’Brien-Simpson, Neil M.; Pantarat, Namfon; Attard, Troy J.; Walsh, Katrina A.; Reynolds, Eric C.

    2016-01-01

We describe a microbial flow cytometry method that quantifies antimicrobial peptide (AMP) activity within 3 hours, yielding a measure termed the Minimum Membrane Disruptive Concentration (MDC). Increasing peptide concentration positively correlates with the extent of bacterial membrane disruption, and the calculated MDC is equivalent to its MBC. The activity of AMPs representing three different membranolytic modes of action could be determined for a range of Gram-positive and Gram-negative bacteria, including the ESKAPE pathogens, E. coli and MRSA. By using the MDC50 concentration of the parent AMP, the method provides high-throughput, quantitative screening of AMP analogues. A unique feature of the MDC assay is that it directly measures peptide/bacteria interactions and lysed cell numbers rather than bacterial survival, as with MIC and MBC assays. With the threat of multi-drug resistant bacteria, this high-throughput MDC assay has the potential to aid in the development of novel antimicrobials that target bacteria with improved efficacy. PMID:26986223

  3. Goals of Secondary Education as Perceived by Education Consumers. Volume IV, Quantitative Results.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. Inst. for Social Research and Development.

    The results of a study to determine attitudes of parents and professional educators toward educational goals for secondary school students are analyzed in this report. The survey was conducted in two communities--Albuquerque, New Mexico, and Philadelphia, Pennsylvania. The essential nature of the results is summarized by the following categories:…

  4. A QUANTITATIVE, THREE-DIMENSIONAL METHOD FOR ANALYZING ROTATIONAL MOVEMENT FROM SINGLE-VIEW MOVIES

    PubMed

    Berg

    1994-06-01

    The study of animal movement is an important aspect of functional morphological research. The three-dimensional movements of (parts of) animals are usually recorded on two-dimensional film frames. For a quantitative analysis, the real movements should be reconstructed from their projections. If movements occur in one plane, their projection is distorted only if this plane is not parallel to the film plane. Provided that the parallel orientation of the movement with respect to the film plane is checked accurately, a two-dimensional method of analysis (ignoring projection errors) can be justified for quantitative analysis of planar movements. Films of movements of skeletal elements of the fish head have generally been analyzed with the two-dimensional method (e.g. Sibbing, 1982; Hoogenboezem et al. 1990; Westneat, 1990; Claes and de Vree, 1991), which is justifiable for planar movements. Unfortunately, the movements of the head bones of fish are often strongly non-planar, e.g. the movement of the pharyngeal jaws and the gill arches. The two-dimensional method is inappropriate for studying such complex movements (Sibbing, 1982; Hoogenboezem et al. 1990). For a qualitative description of movement patterns, the conditions for the use of the two-dimensional method may be somewhat relaxed. When two (or more) views of a movement are recorded simultaneously, the three-dimensional movements can readily be reconstructed using two two-dimensional images (e.g. Zarnack, 1972; Nachtigall, 1983; van Leeuwen, 1984; Drost and van den Boogaart, 1986). However, because of technical (and budget) limitations, simultaneous views of a movement cannot always be shot. In this paper, a method is presented for reconstructing the three-dimensional orientation and rotational movement of structures using single-view films and for calculating rotation in an object-bound frame. 
Ellington (1984) presented a similar method for determining three-dimensional wing movements from single-view films of flying insects. Ellington's method is based upon the bilateral symmetry of the wing movements. The present method does not depend on symmetry and can be applied to a variety of kinematic investigations. It eliminates a systematic error: the projection error. The measuring error is not discussed; it is the same in the two-dimensional and three-dimensional method of analysis. PMID:9317811

  5. Development and validation of a LC-MS method for quantitation of ergot alkaloids in lateral saphenous vein tissue

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A liquid chromatography-mass spectrometry (LC/MS) method for simultaneous quantitation of seven ergot alkaloids (lysergic acid, ergonovine, ergovaline, ergocornine, ergotamine, ergocryptine and ergocrystine) in vascular tissue was developed and validated. Reverse-phase chromatography, coupled to an...

  6. Development of a Quantitative Decision Metric for Selecting the Most Suitable Discretization Method for SN Transport Problems

    NASA Astrophysics Data System (ADS)

    Schunert, Sebastian

In this work we develop a quantitative decision metric for spatial discretization methods of the SN equations. The quantitative decision metric utilizes performance data from selected test problems for computing a fitness score that is used for the selection of the most suitable discretization method for a particular SN transport application. The fitness score is aggregated as a weighted geometric mean of single performance indicators representing various performance aspects relevant to the user. Thus, the fitness function can be adjusted to the particular needs of the code practitioner by adding/removing single performance indicators or changing their importance via the supplied weights. Within this work a special, broad class of methods is considered, referred to as nodal methods. This class is naturally comprised of the DGFEM methods of all function space families. Within this work it is also shown that the Higher Order Diamond Difference (HODD) method is a nodal method. Building on earlier findings that the Arbitrarily High Order Method of the Nodal type (AHOTN) is also a nodal method, a generalized finite-element framework is created to yield as special cases various methods that were developed independently using profoundly different formalisms. A selection of test problems, each related to a particular performance aspect, is considered: a Method of Manufactured Solutions (MMS) test suite for assessing accuracy and execution time, Lathrop's test problem for assessing resilience against the occurrence of negative fluxes, and a simple, homogeneous cube test problem to verify whether a method possesses the thick diffusive limit. The contending methods are implemented as efficiently as possible under a common SN transport code framework to level the playing field for a fair comparison of their computational load.
Numerical results are presented for all three test problems, and a qualitative rating of each method's performance is provided separately for each aspect: accuracy/efficiency, resilience against negative fluxes, and possession of the thick diffusion limit. The choice of the most efficient method depends on the utilized error norm: in Lp error norms higher order methods such as the AHOTN method of order three perform best, while for computing integral quantities the linear nodal (LN) method is most efficient. The most resilient method against the occurrence of negative fluxes is the simple corner balance (SCB) method. A validation of the quantitative decision metric is performed based on the NEA box-in-box suite of test problems. The validation exercise comprises two stages: first, prediction of the contending methods' performance via the decision metric, and second, computing the actual scores based on data obtained from the NEA benchmark problem. The comparison of predicted and actual scores via a penalty function (ratio of the predicted best performer's score to the actual best score) completes the validation exercise. It is found that the decision metric is capable of very accurate predictions (penalty < 10%) in more than 83% of the considered cases and features penalties up to 20% for the remaining cases. An exception to this rule is the third test case, NEA-III, intentionally set up to incorporate a poor match between the benchmark and the "data" problems. However, even under these worst-case conditions the decision metric's suggestions are never detrimental. Suggestions for improving the decision metric's accuracy are to increase the pool of employed data, to refine the mapping of a given configuration to a case in the database, and to better characterize the desired target quantities.
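The weighted-geometric-mean aggregation described above reduces to a few lines of arithmetic. A sketch with hypothetical indicator names, values, and weights (not those of the study):

```python
def fitness_score(indicators, weights):
    """Aggregate single performance indicators p_i (all > 0, larger = better)
    into one fitness score via a weighted geometric mean:
        F = prod(p_i ** (w_i / sum(w)))
    Adding/removing indicators or changing weights retunes the metric
    to the practitioner's priorities."""
    total_w = sum(weights.values())
    score = 1.0
    for name, w in weights.items():
        score *= indicators[name] ** (w / total_w)
    return score

# Hypothetical normalized indicators for two contending discretizations:
# accuracy, speed, and resilience against negative fluxes ("positivity").
method_a = {"accuracy": 0.9, "speed": 0.5, "positivity": 1.0}
method_b = {"accuracy": 0.6, "speed": 0.9, "positivity": 0.8}
w = {"accuracy": 2.0, "speed": 1.0, "positivity": 1.0}   # accuracy matters most
fa = fitness_score(method_a, w)
fb = fitness_score(method_b, w)
```

With accuracy weighted twice as heavily, method A scores higher despite its slower execution; halving the accuracy weight would shift the ranking toward B, which is exactly the tunability the abstract describes.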

  7. Membrane chromatographic immunoassay method for rapid quantitative analysis of specific serum antibodies.

    PubMed

    Ghosh, Raja

    2006-02-01

This paper discusses a membrane chromatographic immunoassay method for rapid detection and quantitative analysis of specific serum antibodies. A type of polyvinylidene fluoride (PVDF) microfiltration membrane was used in the method for its ability to reversibly and specifically bind IgG antibodies from antiserum samples by hydrophobic interaction. Using this form of selective antibody binding and enrichment, an affinity membrane with antigen binding ability was obtained in situ. This was done by passing a pulse of diluted antiserum sample through a stack of microporous PVDF membranes. The affinity membrane thus formed was challenged with a pulse of antigen solution and the amount of antigen bound was accurately determined using chromatographic methods. The antigen binding correlated well with the antibody loading on the membrane. This method is direct, rapid and accurate, does not involve any chemical reaction, and uses very few reagents. Moreover, the same membrane could be repeatedly used for sequential immunoassays on account of the reversible nature of the antibody binding. Proof of concept of this method is provided using human hemoglobin as the model antigen and rabbit antiserum against human hemoglobin as the antibody source. PMID:16196053

  8. Validated reversed phase LC method for quantitative analysis of polymethoxyflavones in citrus peel extracts.

    PubMed

    Wang, Zhenyu; Li, Shiming; Ferguson, Stephen; Goodnow, Robert; Ho, Chi-Tang

    2008-01-01

Polymethoxyflavones (PMFs), which exist exclusively in the citrus genus, have biological activities including anti-inflammatory, anticarcinogenic, and antiatherogenic properties. A validated RPLC method was developed for quantitative analysis of six major PMFs, namely nobiletin, tangeretin, sinensetin, 5,6,7,4'-tetramethoxyflavone, 3,5,6,7,3',4'-hexamethoxyflavone, and 3,5,6,7,8,3',4'-heptamethoxyflavone. The polar embedded LC stationary phase was able to fully resolve the six analogues. The developed method was fully validated in terms of linearity, accuracy, precision, sensitivity, and system suitability. The LOD of the method was calculated as 0.15 microg/mL and the recovery rate was between 97.0 and 105.1%. This analytical method was successfully applied to quantify the individual PMFs in four commercially available citrus peel extracts (CPEs). Each extract showed significant differences in PMF composition and concentration. This method may provide a simple, rapid, and reliable tool to help reveal the correlation between the bioactivity of PMF extracts and the individual PMF content. PMID:18095294

  9. Validation procedures for quantitative gluten ELISA methods: AOAC allergen community guidance and best practices.

    PubMed

    Koerner, Terry B; Abbott, Michael; Godefroy, Samuel Benrejeb; Popping, Bert; Yeung, Jupiter M; Diaz-Amigo, Carmen; Roberts, James; Taylor, Steve L; Baumert, Joseph L; Ulberth, Franz; Wehling, Paul; Koehler, Peter

    2013-01-01

The food allergen analytical community is endeavoring to create harmonized guidelines for the validation of food allergen ELISA methodologies to help protect food-sensitive individuals and promote consumer confidence. This document provides additional guidance to existing method validation publications for quantitative food allergen ELISA methods. The gluten-specific criteria provided in this document are divided into sections covering information required by the method developer about the assay and information for the implementation of the multilaboratory validation study. Many of these recommendations and guidelines are built upon the widely accepted Codex Alimentarius definitions and recommendations for gluten-free foods. The information in this document can be used as the basis of a harmonized validation protocol for any ELISA method for gluten, whether proprietary or nonproprietary, that will be submitted to AOAC and/or regulatory authorities or other bodies for status recognition. Future work is planned for the implementation of this guidance document for the validation of gluten methods and the creation of gluten reference materials. PMID:24282943

  10. Evaluation of a quantitative fit testing method for N95 filtering facepiece respirators.

    PubMed

    Janssen, Larry; Luinenburg, Michael D; Mullins, Haskell E; Danisch, Susan G; Nelson, Thomas J

    2003-01-01

A method for performing quantitative fit tests (QNFT) with N95 filtering facepiece respirators was developed by earlier investigators. The method employs a simple clamping device to allow the penetration of submicron aerosols through N95 filter media to be measured. The measured value is subtracted from total penetration, with the assumption that the remaining penetration represents faceseal leakage. The developers have used the clamp to assess respirator performance. This study evaluated the clamp's ability to measure filter penetration and determine fit factors. In Phase 1, subjects were quantitatively fit-tested with elastomeric half-facepiece respirators using both generated and ambient aerosols. QNFT were done with each aerosol with both P100 and N95 filters without disturbing the facepiece. In Phase 2 of the study, elastomeric half-facepieces were sealed to subjects' faces to eliminate faceseal leakage. Ambient aerosol QNFT were performed with P100 and N95 filters without disturbing the facepiece. In both phases the clamp was used to measure N95 filter penetration, which was then subtracted from total penetration for the N95 QNFT. It was hypothesized that N95 fit factors corrected for filter penetration would equal the P100 fit factors. Mean corrected N95 fit factors were significantly different from the P100 fit factors in each phase of the study. In addition, there was essentially no correlation between corrected N95 fit factors and P100 fit factors. It was concluded that the clamp method should not be used to fit-test N95 filtering facepieces or otherwise assess respirator performance. PMID:12908863
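The correction under evaluation is simple arithmetic on penetration fractions. A sketch with hypothetical numbers; note the study's conclusion is precisely that this correction did not, in practice, recover P100-equivalent fit factors:

```python
def fit_factor(total_penetration):
    """Uncorrected fit factor: the reciprocal of the total fraction of the
    challenge aerosol found inside the respirator."""
    return 1.0 / total_penetration

def corrected_fit_factor(total_penetration, filter_penetration):
    """Clamp-method correction evaluated in the study: subtract the measured
    filter penetration so that the remainder is attributed to faceseal
    leakage, then take the reciprocal of that leakage."""
    faceseal_leakage = total_penetration - filter_penetration
    if faceseal_leakage <= 0:
        raise ValueError("filter penetration must be below total penetration")
    return 1.0 / faceseal_leakage

# Hypothetical N95 test: 3% total penetration, of which 2% passed the filter.
ff_raw = fit_factor(0.03)                          # reciprocal of 0.03
ff_corrected = corrected_fit_factor(0.03, 0.02)    # reciprocal of 0.01
```

Under these invented numbers the correction triples the apparent fit factor, which illustrates why an error in the filter-penetration term propagates so strongly into the corrected value.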

  11. A quantitative method for zoning of protected areas and its spatial ecological implications.

    PubMed

    Del Carmen Sabatini, María; Verdiell, Adriana; Rodríguez Iglesias, Ricardo M; Vidal, Marta

    2007-04-01

Zoning is a key prescriptive tool for the administration and management of protected areas. However, the lack of zoning is common for most protected areas in developing countries and, as a consequence, many protected areas are not effective in achieving the goals for which they were created. In this work, we introduce a quantitative method to expeditiously zone protected areas and we evaluate its ecological implications on hypothetical zoning cases. A real-world application is reported for the Talampaya National Park, a UNESCO World Heritage Site located in Argentina. Our method is a modification of the zoning forest model developed by Bos [Bos, J., 1993. Zoning in forest management: a quadratic assignment problem solved by simulated annealing. Journal of Environmental Management 37, 127-145.]. Main innovations involve a quadratic function of distance between land units, non-reciprocal weights for adjacent land uses (mathematically represented by a non-symmetric matrix), and the possibility of imposing a connectivity constraint. Due to its intrinsic spatial dimension, the zoning problem belongs to the NP-hard class, i.e. a solution can only be obtained in non-polynomial time [Nemhauser, G., Wolsey, L., 1988. Integer and Combinatorial Optimization. John Wiley, New York.]. For that purpose, we applied a simulated annealing heuristic implemented as a FORTRAN language routine. Our innovations were effective in achieving zoning designs more compatible with biological diversity protection. The quadratic distance term facilitated the delineation of core zones for elements of significance; the connectivity constraint minimized fragmentation; non-reciprocal land use weightings contributed to better representing management decisions, and influenced mainly the edge and shape of zones. This quantitative method can assist the zoning process within protected areas by offering many zonation scheme alternatives with minimum cost, time and effort.
This ability provides a new tool to improve zoning within protected areas in developing countries. PMID:16690203
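A toy version of such a simulated-annealing zoning heuristic (a quadratic function of pairwise distance, a non-symmetric compatibility matrix, and fixed area quotas enforced by swap moves) might look as follows. The grid size, weight matrix, and quotas are invented for illustration, and the connectivity constraint is omitted:

```python
import math
import random

random.seed(1)

# Toy 4x4 grid of land units; uses: 0 = core, 1 = buffer, 2 = recreation.
N, USES = 4, 3
units = [(r, c) for r in range(N) for c in range(N)]

# Non-reciprocal adjacency weights W[a][b]: cost of use b lying near use a
# (non-symmetric matrix, as in the modified model; lower = more compatible).
W = [[0.0, 1.0, 4.0],
     [0.5, 0.0, 1.0],
     [3.0, 1.0, 0.0]]

def cost(assign):
    """Objective with a quadratic (here inverse-square) distance term, so
    nearby incompatible uses are penalized far more than distant ones."""
    total = 0.0
    for i, (ri, ci) in enumerate(units):
        for j, (rj, cj) in enumerate(units):
            if i != j:
                d2 = (ri - rj) ** 2 + (ci - cj) ** 2
                total += W[assign[i]][assign[j]] / d2
    return total

def anneal(steps=5000, t0=5.0):
    # Fixed area quota per use (6 core, 5 buffer, 5 recreation); moves are
    # swaps of two units, so the quotas are preserved throughout.
    assign = [0] * 6 + [1] * 5 + [2] * 5
    random.shuffle(assign)
    cur = cost(assign)
    best, best_cost = assign[:], cur
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-6        # linear cooling schedule
        i, j = random.sample(range(len(units)), 2)
        assign[i], assign[j] = assign[j], assign[i]
        new = cost(assign)
        if new < cur or random.random() < math.exp((cur - new) / t):
            cur = new                            # accept (always if better)
            if cur < best_cost:
                best, best_cost = assign[:], cur
        else:
            assign[i], assign[j] = assign[j], assign[i]   # reject: swap back
    return best, best_cost

zoning, zoning_cost = anneal()
```

The annealer tends to pull each land use into compact blocks with buffer cells separating core from recreation, mirroring the "core zones plus minimized fragmentation" behaviour the abstract reports for the full model.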

  12. From "weight of evidence" to quantitative data integration using multicriteria decision analysis and Bayesian methods.

    PubMed

    Linkov, Igor; Massey, Olivia; Keisler, Jeff; Rusyn, Ivan; Hartung, Thomas

    2015-01-01

"Weighing" available evidence in the process of decision-making is unavoidable, yet it is one step that routinely raises suspicions: what evidence should be used, how much does it weigh, and whose thumb may be tipping the scales? This commentary aims to evaluate the current state and future roles of various types of evidence for hazard assessment as it applies to environmental health. In its recent evaluation of the US Environmental Protection Agency's Integrated Risk Information System assessment process, the National Research Council committee singled out the term "weight of evidence" (WoE) for critique, deeming the process too vague and detractive to the practice of evaluating human health risks of chemicals. Moving the methodology away from qualitative, vague and controversial methods towards generalizable, quantitative and transparent methods for appropriately managing diverse lines of evidence is paramount for both regulatory and public acceptance of the hazard assessments. The choice of terminology notwithstanding, a number of recent Bayesian WoE-based methods, the emergence of multicriteria decision analysis for WoE applications, as well as the general principles behind the foundational concepts of WoE, show promise in how to move forward and regain trust in the data integration step of the assessments. We offer our thoughts on the current state of WoE as a whole, and while we acknowledge that many WoE applications have been largely qualitative and subjective in nature, we see this as an opportunity to turn WoE in a quantitative direction that includes Bayesian and multicriteria decision analysis. PMID:25592482

  13. Evaluation of residual antibacterial potency in antibiotic production wastewater using a real-time quantitative method.

    PubMed

    Zhang, Hong; Zhang, Yu; Yang, Min; Liu, Miaomiao

    2015-11-01

While antibiotic pollution has attracted considerable attention due to its potential in promoting the dissemination of antibiotic resistance genes in the environment, the antibiotic activity of related substances has been neglected, which may lead to underestimation of the environmental impacts of antibiotic wastewater discharge. In this study, a real-time quantitative approach was established to evaluate the residual antibacterial potency of antibiotics and related substances in antibiotic production wastewater (APW) by comparing the growth of a standard bacterial strain (Staphylococcus aureus) in tested water samples with a standard reference substance (e.g. oxytetracycline). Antibiotic equivalent quantity (EQ) was used to express antibacterial potency, which made it possible to assess the contribution of each compound to the antibiotic activity in APW. The real-time quantitative method showed better repeatability (Relative Standard Deviation, RSD 1.08%) compared with the conventional fixed growth time method (RSD 5.62-11.29%). Its quantification limits ranged from 0.20 to 24.00 μg L(-1), depending on the antibiotic. We applied the developed method to analyze the residual potency of water samples from four APW treatment systems, and confirmed a significant contribution from antibiotic transformation products to the residual antibacterial potency. Specifically, neospiramycin, a major transformation product of spiramycin, was found to contribute 13.15-22.89% of residual potency in spiramycin production wastewater. In addition, some unknown related substances with antimicrobial activity were indicated in the effluent. This developed approach will be effective for the management of antibacterial potency discharge from antibiotic wastewater and other waste streams. PMID:26395288

  14. Tensile testing as a novel method for quantitatively evaluating bioabsorbable material degradation.

    PubMed

    Bowen, Patrick K; Gelbaugh, Jesse A; Mercier, Phillip J; Goldman, Jeremy; Drelich, Jaroslaw

    2012-11-01

    Bioabsorbable metallic materials have become a topic of interest in the field of interventional cardiology because of their potential application in stents. A well-defined, quantitative method for evaluating the degradation rate of candidate materials is currently needed in this area. In this study, biodegradation of 0.25-mm iron and magnesium wires was simulated in vitro through immersion in cell-culture medium with and without a fibrin coating (meant to imitate the neointima). The immersed samples were corroded under physiological conditions (37°C, 5% CO(2)). Iron degraded in a highly localized manner, producing voluminous corrosion product while magnesium degraded more uniformly. To estimate the degradation rate in a quantitative manner, both raw and corroded samples underwent tensile testing using a protocol similar to that used on polymeric nanofibers. The effective ultimate tensile stress (tensile stress holding constant cross-sectional area) was determined to be the mechanical metric that exhibited the smallest amount of variability. When the effective tensile stress data were aggregated, a statistically significant downward, linear trend in strength was observed in both materials (Fe and Mg) with and without the fibrin coating. It was also demonstrated that tensile testing is able to distinguish between the higher degradation rate of the bare wires and the lower degradation rate of the fibrin-coated wires with confidence. PMID:22847989
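The effective-ultimate-tensile-stress metric and its linear trend reduce to a short calculation. The values below are invented for illustration; only the 0.25-mm wire diameter comes from the abstract:

```python
import math

def effective_uts(max_load_N, initial_diameter_mm):
    """Effective ultimate tensile stress (MPa): peak tensile load divided by
    the ORIGINAL cross-sectional area, held constant even as corrosion thins
    the wire, so the decline in this value tracks degradation."""
    area_mm2 = math.pi * (initial_diameter_mm / 2.0) ** 2
    return max_load_N / area_mm2          # N/mm^2 == MPa

def linear_trend(days, stresses):
    """Ordinary least-squares slope (MPa per day); a significant negative
    slope is the degradation-rate signal described in the study."""
    n = len(days)
    mx = sum(days) / n
    my = sum(stresses) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(days, stresses))
    sxx = sum((x - mx) ** 2 for x in days)
    return sxy / sxx

# Hypothetical data: a 0.25-mm Mg wire losing strength over immersion time.
days = [0, 7, 14, 21, 28]
loads_N = [9.8, 8.9, 8.1, 7.0, 6.2]       # illustrative peak tensile loads
uts = [effective_uts(f, 0.25) for f in loads_N]
slope = linear_trend(days, uts)           # negative: strength declining
```

Because the original area is used throughout, any loss of load-bearing metal shows up directly as a drop in effective stress, which is what makes the downward linear trend a usable degradation-rate estimate.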

  15. Characterization of a method for quantitating food consumption for mutation assays in Drosophila

    SciTech Connect

    Thompson, E.D.; Reeder, B.A.; Bruce, R.D. )

    1991-01-01

Quantitation of food consumption is necessary when determining mutation responses to multiple chemical exposures in the sex-linked recessive lethal assay in Drosophila. One method proposed for quantitating food consumption by Drosophila is to measure the incorporation of 14C-leucine into the flies during the feeding period. Three sources of variation in the technique of Thompson and Reeder have been identified and characterized. First, the amount of food consumed by individual flies differed by almost 30% in a 24 hr feeding period. Second, the variability from vial to vial (each containing multiple flies) was around 15%. Finally, the amount of food consumed in identical feeding experiments performed over the course of 1 year varied nearly 2-fold. The use of chemical consumption values in place of exposure levels provided a better means of expressing the combined mutagenic response. In addition, the kinetics of food consumption over a 3 day feeding period for lethal exposures to cyclophosphamide were compared with those for non-lethal exposures. Extensive characterization of the lethality induced by exposures to cyclophosphamide demonstrated that the lethality is most likely due to starvation, not chemical toxicity.

  16. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    PubMed

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  17. Simple, Rapid and Inexpensive Quantitative Fluorescent PCR Method for Detection of Microdeletion and Microduplication Syndromes

    PubMed Central

    Stofanko, Martin; Gonçalves-Dornelas, Higgor; Cunha, Pricila Silva; Pena, Heloísa B.; Vianna-Morgante, Angela M.; Pena, Sérgio Danilo Junho

    2013-01-01

    Because of economic limitations, the cost-effective diagnosis of patients affected with rare microdeletion or microduplication syndromes is a challenge in developing countries. Here we report a sensitive, rapid, and affordable detection method that we have called Microdeletion/Microduplication Quantitative Fluorescent PCR (MQF-PCR). Our procedure is based on the finding of genomic regions with high homology to segments of the critical microdeletion/microduplication region. PCR amplification of both regions using the same primer pair establishes competitive kinetics and relative quantification of amplicons, as happens in microsatellite-based quantitative fluorescence PCR. Using patients with two common microdeletion syndromes, the Williams-Beuren syndrome (7q11.23 microdeletion) and the 22q11.2 microdeletion syndrome, we found that MQF-PCR could detect both with 100% sensitivity and 100% specificity. Additionally, we demonstrated that the same principle could be reliably used for the detection of microduplication syndromes, using patients with the Lubs (MECP2 duplication) syndrome and the 17q11.2 microduplication involving the NF1 gene. We propose that MQF-PCR is a useful procedure for laboratory confirmation of the clinical diagnosis of microdeletion/microduplication syndromes, ideally suited for use in developing countries, but having general applicability as well. PMID:23620743
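In such competitive, internally controlled PCR, copy-number state is read from the ratio of the test-region and control-region fluorescent peak areas. The sketch below uses illustrative cut-offs, not values from the paper:

```python
def copy_number_call(test_area, control_area, del_cut=0.75, dup_cut=1.25):
    """Classify a locus from the ratio of test/control fluorescent peak areas.
    A heterozygous deletion ideally halves the ratio and a duplication raises
    it ~1.5-fold; the cut-offs here are illustrative assumptions."""
    ratio = test_area / control_area
    if ratio < del_cut:
        return "deletion"
    if ratio > dup_cut:
        return "duplication"
    return "normal"

print(copy_number_call(480.0, 1000.0))   # ratio ~0.48 -> deletion
```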

  18. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  19. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method

    PubMed Central

    Yang, Ganglong; Xu, Zhipeng; Lu, Wei; Li, Xiang; Sun, Chengwen; Guo, Jia; Xue, Peng; Guan, Feng

    2015-01-01

    The best way to increase patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low grade nonmuscle invasive bladder cancer, NMIBC), and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography LTQ Orbitrap mass spectrometry. Among 3721 unique identified and annotated proteins in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer. PMID:26230496

  20. Quantitative imaging of volcanic plumes — Results, needs, and future trends

    USGS Publications Warehouse

    Platt, Ulrich; Lübcke, Peter; Kuhn, Jonas; Bobrowski, Nicole; Prata, Fred; Burton, Mike; Kern, Christoph

    2015-01-01

    Recent technology allows two-dimensional “imaging” of trace gas distributions in plumes. In contrast to older, one-dimensional remote sensing techniques, which are only capable of measuring total column densities, the new imaging methods give insight into details of transport and mixing processes as well as chemical transformation within plumes. We give an overview of gas imaging techniques already being applied at volcanoes (SO2 cameras, imaging DOAS, FT-IR imaging), present techniques for which first field experiments have been conducted (LED-LIDAR, tomographic mapping), and describe some techniques for which only theoretical studies with application to volcanology exist (e.g. Fabry–Pérot imaging, gas correlation spectroscopy, bi-static LIDAR). Finally, we discuss current needs and future trends in imaging technology.

  1. Automated and Manual Methods of DNA Extraction for Aspergillus fumigatus and Rhizopus oryzae Analyzed by Quantitative Real-Time PCR

    PubMed Central

    Francesconi, Andrea; Kasai, Miki; Harrington, Susan M.; Beveridge, Mara G.; Petraitiene, Ruta; Petraitis, Vidmantas; Schaufele, Robert L.; Walsh, Thomas J.

    2008-01-01

    Quantitative real-time PCR (qPCR) may improve the detection of fungal pathogens. Extraction of DNA from fungal pathogens is fundamental to optimization of qPCR; however, the loss of fungal DNA during the extraction process is a major limitation of molecular diagnostic tools for pathogenic fungi. We therefore studied representative automated and manual extraction methods for Aspergillus fumigatus and Rhizopus oryzae. Both were analyzed by qPCR for their ability to extract DNA from propagules and germinated hyphal elements (GHE). The limit of detection of A. fumigatus and R. oryzae GHE in bronchoalveolar lavage (BAL) fluid with either extraction method was 1 GHE/ml. Both methods efficiently extracted DNA from A. fumigatus, with a limit of detection of 1 × 10² conidia. Extraction of R. oryzae by the manual method resulted in a limit of detection of 1 × 10³ sporangiospores, whereas extraction with the automated method resulted in a limit of detection of 1 × 10¹ sporangiospores. The time needed to process 24 samples by the automated method was 2.5 h prior to transferring for automation, 1.3 h of automation, and 10 min postautomation, for a total of about 4 h. The total time required for the manual method was 5.25 h. The automated and manual methods were similar in sensitivity for DNA extraction from A. fumigatus conidia and GHE. For R. oryzae, the automated method was more sensitive for DNA extraction from sporangiospores, while the manual method was more sensitive for GHE in BAL fluid. PMID:18353931

  2. CREST (Climate REconstruction SofTware): a probability density function (PDF)-based quantitative climate reconstruction method

    NASA Astrophysics Data System (ADS)

    Chevalier, M.; Cheddadi, R.; Chase, B. M.

    2014-11-01

    Several methods currently exist to quantitatively reconstruct palaeoclimatic variables from fossil botanical data. Of these, probability density function (PDF)-based methods have proven valuable, as they can be applied to a wide range of plant assemblages. Most commonly applied to fossil pollen data, their performance can, however, be limited by the taxonomic resolution of the pollen data, as many species may belong to a given pollen type. Consequently, the climate information associated with different species cannot always be precisely identified, resulting in less accurate reconstructions. This can become particularly problematic in regions of high biodiversity. In this paper, we propose a novel PDF-based method that takes into account the different climatic requirements of each species constituting the broader pollen type. PDFs are fitted in two successive steps: parametric PDFs are first fitted for each species, and those individual species PDFs are then combined into a single broader PDF representing the pollen type as a unit. A climate value for the pollen assemblage is estimated from the likelihood function obtained by multiplying the pollen-type PDFs, each weighted according to its pollen percentage. To test its performance, we have applied the method to southern Africa as a regional case study and reconstructed a suite of climatic variables (e.g. winter and summer temperature and precipitation, mean annual aridity, rainfall seasonality). The reconstructions are shown to be accurate for both temperature and precipitation. Predictable exceptions were areas that experience conditions at the extremes of the regional climatic spectra. Importantly, the accuracy of the reconstructed values is independent of the vegetation type where the method is applied or the number of species used. The method is publicly available in a software package entitled CREST (Climate REconstruction SofTware) and will provide the opportunity to reconstruct quantitative estimates of climatic variables even in areas of high geographical and botanical diversity.
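The two-step weighting scheme can be sketched as follows. All species parameters, pollen types, and percentages below are hypothetical stand-ins; the real CREST implementation differs in detail:

```python
import numpy as np

def norm_pdf(x, mu, sigma):
    """Gaussian probability density."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

temp_grid = np.linspace(0.0, 40.0, 401)   # candidate climate values (deg C)

# Step 1: parametric PDFs per species (hypothetical temperature responses)
species = {"sp_A": (16.0, 2.0), "sp_B": (22.0, 3.0)}   # (mean, sd) per species

# Step 2: combine the species PDFs into one pollen-type PDF and renormalise
type1 = np.mean([norm_pdf(temp_grid, m, s) for m, s in species.values()], axis=0)
type1 /= type1.sum() * (temp_grid[1] - temp_grid[0])
type2 = norm_pdf(temp_grid, 12.0, 4.0)    # a second, single-species pollen type

# Step 3: assemblage likelihood = product of type PDFs, each weighted by its
# pollen percentage (multiplication done in the log domain for stability)
weights = {"type1": 0.7, "type2": 0.3}
log_lik = weights["type1"] * np.log(type1) + weights["type2"] * np.log(type2)

estimate = temp_grid[np.argmax(log_lik)]  # climate value maximising the likelihood
print(round(float(estimate), 1))
```

The estimate lands between the optima of the contributing taxa, pulled toward the more abundant pollen type, which is the behaviour the method is designed to capture.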

  3. A Pyrosequencing Assay for the Quantitative Methylation Analysis of GALR1 in Endometrial Samples: Preliminary Results

    PubMed Central

    Kottaridi, Christine; Koureas, Nikolaos; Margari, Niki; Terzakis, Emmanouil; Bilirakis, Evripidis; Pappas, Asimakis; Chrelias, Charalampos; Spathis, Aris; Aga, Evangelia; Pouliakis, Abraham; Panayiotides, Ioannis; Karakitsos, Petros

    2015-01-01

    Endometrial cancer is the most common malignancy of the female genital tract while aberrant DNA methylation seems to play a critical role in endometrial carcinogenesis. Galanin's expression has been involved in many cancers. We developed a new pyrosequencing assay that quantifies DNA methylation of galanin's receptor-1 (GALR1). In this study, the preliminary results indicate that pyrosequencing methylation analysis of GALR1 promoter can be a useful ancillary marker to cytology as the histological status can successfully predict. This marker has the potential to lead towards better management of women with endometrial lesions and eventually reduce unnecessary interventions. In addition it can provide early warning for women with negative cytological result. PMID:26504828

  4. A Pyrosequencing Assay for the Quantitative Methylation Analysis of GALR1 in Endometrial Samples: Preliminary Results.

    PubMed

    Kottaridi, Christine; Koureas, Nikolaos; Margari, Niki; Terzakis, Emmanouil; Bilirakis, Evripidis; Pappas, Asimakis; Chrelias, Charalampos; Spathis, Aris; Aga, Evangelia; Pouliakis, Abraham; Panayiotides, Ioannis; Karakitsos, Petros

    2015-01-01

    Endometrial cancer is the most common malignancy of the female genital tract while aberrant DNA methylation seems to play a critical role in endometrial carcinogenesis. Galanin's expression has been involved in many cancers. We developed a new pyrosequencing assay that quantifies DNA methylation of galanin's receptor-1 (GALR1). In this study, the preliminary results indicate that pyrosequencing methylation analysis of GALR1 promoter can be a useful ancillary marker to cytology as the histological status can successfully predict. This marker has the potential to lead towards better management of women with endometrial lesions and eventually reduce unnecessary interventions. In addition it can provide early warning for women with negative cytological result. PMID:26504828

  5. Quantitative assessment of port-wine stains using chromametry: preliminary results

    NASA Astrophysics Data System (ADS)

    Beacco, Claire; Brunetaud, Jean Marc; Rotteleur, Guy; Steen, D. A.; Brunet, F.

    1996-12-01

    Objective assessment of the efficacy of different lasers for the treatment of port wine stains (PWS) remains difficult. Chromametry gives reproducible information on the color of PWS, but its raw data are of little use to a physician. A specific software package was therefore developed to allow graphic representation of PWS characteristics. Before the first laser treatment and after every treatment, measurements were made with a chromameter on a marked zone of the PWS and on the contralateral normal zone, which serves as the reference. The software calculates and graphically represents the difference in color between PWS and normal skin from the chromameter data. Three parameters are calculated: ΔH, the difference in hue; ΔL, the difference in lightness; and ΔE, the total difference in color. Each measured zone is represented by its coordinates. Calculated initial values were compared with the subjective initial color assessed by the dermatologist. The variation of the color difference was calculated using the successive values of ΔE after n treatments and was compared with the subjective classification of fading. Since January 1995, forty-three locations have been measured before laser treatment. Purple PWS tended to differentiate from the others, but red and dark pink PWS could not be differentiated. The evolution of the color after treatment was calculated in 29 PWS treated 3 or 4 times. Poor results corresponded to an increase of ΔE; fair and good results were associated with a decrease of ΔE. We did not observe excellent results during this study. These promising preliminary results need to be confirmed in a larger group of patients.
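A total color difference ΔE can be computed from CIELAB coordinates. The sketch below assumes the standard CIE76 formula and uses hypothetical L*a*b* readings; the abstract does not state which ΔE formula the software implements:

```python
import math

def delta_e_cie76(lab1, lab2):
    """Total colour difference ΔE between two CIELAB colours (CIE76 formula)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical readings: lesion vs contralateral normal skin, as (L*, a*, b*)
pws_before = (45.0, 28.0, 10.0)
normal     = (60.0, 14.0, 16.0)
pws_after  = (52.0, 20.0, 13.0)   # after n treatments

dE_before = delta_e_cie76(pws_before, normal)
dE_after  = delta_e_cie76(pws_after, normal)
# A decrease of ΔE toward the contralateral reference indicates fading
print(round(dE_before, 1), round(dE_after, 1))
```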

  6. Perception of mobbing during the study: results of a national quantitative research among Slovenian midwifery students.

    PubMed

    Došler, Anita Jug; Skubic, Metka; Mivšek, Ana Polona

    2014-09-01

    Mobbing, defined as sustained harassment among workers, in particular towards subordinates, merits investigation. This study aims to investigate the perception of mobbing among Slovenian midwifery students (2nd- and 3rd-year students of midwifery at the Faculty for Health Studies Ljubljana, the single educational institution for midwives in Slovenia), since acceptable behavioural interrelationships in the midwifery profession are already formed during the study, through professional socialization. A descriptive, causal-nonexperimental method with a questionnaire was used. Basic descriptive statistics and measures of statistical significance were calculated with SPSS software, version 20.0. All necessary ethical measures were taken into consideration during the study to protect participants. The results revealed that several participants experienced mobbing during their studies (82.3%); 58.8% of them during practical training and 23.5% from midwifery teachers. Students are often anxious and nervous when facing clinical settings (60.8%) or before faculty commitments (exams, presentations, etc.) (41.2%). Many of them (40.4%) estimate that mobbing affected their health. They did not show effective strategies for solving relationship problems. According to the findings, everyone involved in midwifery education, but above all students, should be provided with more knowledge and skills for the successful management of conflict situations. PMID:25420387

  7. Perception of mobbing during the study: results of a national quantitative research among Slovenian midwifery students.

    PubMed

    Došler, Anita Jug; Skubic, Metka; Mivšek, Ana Polona

    2014-09-01

    Mobbing, defined as sustained harassment among workers, in particular towards subordinates, merits investigation. This study aims to investigate the perception of mobbing among Slovenian midwifery students (2nd- and 3rd-year students of midwifery at the Faculty for Health Studies Ljubljana, the single educational institution for midwives in Slovenia), since acceptable behavioural interrelationships in the midwifery profession are already formed during the study, through professional socialization. A descriptive, causal-nonexperimental method with a questionnaire was used. Basic descriptive statistics and measures of statistical significance were calculated with SPSS software, version 20.0. All necessary ethical measures were taken into consideration during the study to protect participants. The results revealed that several participants experienced mobbing during their studies (82.3%); 58.8% of them during practical training and 23.5% from midwifery teachers. Students are often anxious and nervous when facing clinical settings (60.8%) or before faculty commitments (exams, presentations, etc.) (41.2%). Many of them (40.4%) estimate that mobbing affected their health. They did not show effective strategies for solving relationship problems. According to the findings, everyone involved in midwifery education, but above all students, should be provided with more knowledge and skills for the successful management of conflict situations. PMID:25507371

  8. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    NASA Astrophysics Data System (ADS)

    Ryan, C. G.; Laird, J. S.; Fisher, L. A.; Kirkham, R.; Moorhead, G. F.

    2015-11-01

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.
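In the second pass described above, each pixel's element values are obtained by mixing the linear DA operators of the end-member phases. A schematic sketch with random stand-in matrices (real DA matrices are derived from fitted reference spectra in GeoPIXE, which this does not reproduce):

```python
import numpy as np

rng = np.random.default_rng(0)

n_chan, n_el, n_end = 64, 3, 2   # spectrum channels, elements, end-member phases
# Hypothetical DA matrices: one linear spectrum-to-elements operator per phase
gamma = [rng.normal(size=(n_el, n_chan)) for _ in range(n_end)]

spectrum = rng.poisson(50.0, size=n_chan).astype(float)  # one pixel's X-ray spectrum
alpha = np.array([0.6, 0.4])     # end-member proportions for this pixel (first pass)

# Second pass: treat the pixel as an admixture of end-member terms
conc = sum(a * (G @ spectrum) for a, G in zip(alpha, gamma))
print(conc.shape)                # one projected value per element
```

Because each DA projection is linear, the admixture preserves the speed of the original method: the per-pixel cost is still a handful of matrix-vector products.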

  9. A quantitative magnetic analytical method using Curie's law for a mixture of paramagnetic and diamagnetic substances

    NASA Astrophysics Data System (ADS)

    Matsumoto, Nobuhiro; Kato, Kenji

    2012-08-01

    The principle of quantitative magnetic analytical measurement using Curie's law is proposed as a future candidate for a new potential primary direct method of measurement for amount of substance. The targeted main analyte in a mixture sample is a species—a rare-earth ion or a transition-metal ion—that has a free quantum spin. The matrix of the mixture sample is diamagnetic. The spin number of the analyte can be obtained from the temperature dependence of the sample's magnetic moment. This analytical method uses differences in the temperature dependences of Langevin paramagnetism, which obeys Curie's law, and diamagnetism, which exhibits temperature-independent behaviour. For preliminary validation of this analytical method, powder mixture samples of gadolinium oxide in silicon oxide were prepared by a gravimetric blending method. The mass fractions of the prepared mixture samples were approximately 0.2 to 0.008. The values of the samples' magnetic moments at temperatures below 340 K were measured using a superconducting quantum interference device magnetometer. Each analytical concentration of gadolinium oxide determined by the temperature dependence of the magnetic moment was found to be consistent with the gravimetric concentration within a relative difference of approximately 10%.
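The separation rests on the temperature dependences named in the abstract: the paramagnetic moment scales as C/T (Curie's law) while the diamagnetic matrix term is temperature independent, so a linear fit of the moment against 1/T isolates the Curie constant. A minimal sketch with synthetic data and hypothetical constants:

```python
import numpy as np

# Synthetic moment data: m(T) = C/T + chi_d * B, with hypothetical constants
B = 1.0                            # applied field (arb. units)
C_true, chi_d = 2.5e-4, -1.0e-7    # Curie term and diamagnetic susceptibility
T = np.linspace(50.0, 340.0, 30)   # temperatures below 340 K, as in the abstract
m = C_true / T + chi_d * B

# Linear least squares in 1/T: m = C * (1/T) + const
A = np.column_stack([1.0 / T, np.ones_like(T)])
C_fit, const_fit = np.linalg.lstsq(A, m, rcond=None)[0]
print(abs(C_fit - C_true) / C_true < 1e-6)
```

The slope recovers the Curie constant, from which the spin number (and hence the analyte amount) follows; the intercept captures the temperature-independent diamagnetic background.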

  10. Quantitative calcium resistivity based method for accurate and scalable water vapor transmission rate measurement.

    PubMed

    Reese, Matthew O; Dameron, Arrelaine A; Kempe, Michael D

    2011-08-01

    The development of flexible organic light emitting diode displays and flexible thin film photovoltaic devices is dependent on the use of flexible, low-cost, optically transparent and durable barriers to moisture and/or oxygen. It is estimated that this will require high moisture barriers with water vapor transmission rates (WVTR) between 10⁻⁴ and 10⁻⁶ g/m²/day. Thus there is a need to develop a relatively fast, low-cost, and quantitative method to evaluate such low permeation rates. Here, we demonstrate a method where the resistance changes of patterned Ca films, upon reaction with moisture, enable one to calculate a WVTR between 10 and 10⁻⁶ g/m²/day or better. Samples are configured with variable aperture size such that the sensitivity and/or measurement time of the experiment can be controlled. The samples are connected to a data acquisition system by means of individual signal cables, permitting samples to be tested under a variety of conditions in multiple environmental chambers. An edge card connector is used to connect samples to the measurement wires, enabling easy switching of samples in and out of test. This measurement method can be conducted with as little as 1 h of labor time per sample. Furthermore, multiple samples can be measured in parallel, making this an inexpensive and high volume method for measuring high moisture barriers. PMID:21895269
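A common way to convert the conductance decay of a Ca strip into a WVTR is a Paetzold-style relation: water consumes calcium stoichiometrically, thinning the film and lowering its conductance. The sketch below illustrates that general idea only; it is not the authors' calibration, and the stoichiometry and material constants are assumptions:

```python
M_H2O, M_CA = 18.02, 40.08     # molar masses, g/mol
RHO_CA = 1.55e6                # calcium density, g/m^3
SIGMA_CA = 2.9e7               # calcium conductivity, S/m (bulk value; assumption)
N_STOICH = 2                   # assuming Ca + 2 H2O -> Ca(OH)2 + H2

def wvtr(dG_dt, length, width):
    """WVTR in g/m^2/day from the conductance decay rate dG/dt (S/day) of a
    calcium strip of given length and width (metres, current along length)."""
    # Conductance G = sigma * thickness * width / length, so the thinning rate is:
    dh_dt = abs(dG_dt) * length / (SIGMA_CA * width)   # m/day
    # Water flux per unit area that consumed that much calcium:
    return N_STOICH * (M_H2O / M_CA) * RHO_CA * dh_dt

# Example: a 1 cm x 1 cm strip losing 1e-4 S of conductance per day
print(f"{wvtr(-1e-4, 0.01, 0.01):.2e}")   # falls in the ultra-barrier range
```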

  11. Spectrophotometric Method for Quantitative Determination of Cefixime in Bulk and Pharmaceutical Preparation Using Ferroin Complex

    NASA Astrophysics Data System (ADS)

    Naeem Khan, M.; Qayum, A.; Ur Rehman, U.; Gulab, H.; Idrees, M.

    2015-09-01

    A method was developed for the quantitative determination of cefixime in bulk and pharmaceutical preparations using the ferroin complex. The method is based on the oxidation of cefixime with Fe(III) in acidic medium. The Fe(II) formed reacts with 1,10-phenanthroline, and the ferroin complex is measured spectrophotometrically at 510 nm against a reagent blank. Beer's law was obeyed in the concentration range 0.2-10 μg/ml with a good correlation coefficient of 0.993. The molar absorptivity was calculated to be 1.375×10⁵ L·mol⁻¹·cm⁻¹. The limit of detection (LOD) and limit of quantification (LOQ) were found to be 0.030 and 0.101 μg/ml, respectively. The proposed method showed good reproducibility, with a relative standard deviation of 5.28% (n = 6). The developed method was validated statistically by a recovery study and successfully applied to the determination of cefixime in bulk powder and pharmaceutical formulations without interference from common excipients. Percent recoveries ranged from 98.00 to 102.05% for the pure form and from 97.83 to 102.50% for pharmaceutical preparations.
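Within the linear range, concentration follows from Beer's law, c = A/(εl), using the reported molar absorptivity. In the sketch below, the cefixime molecular weight (~453.45 g/mol) and the 1 cm path length are assumptions not stated in the abstract:

```python
def concentration_ug_per_ml(absorbance, molar_abs=1.375e5, path_cm=1.0, mw=453.45):
    """Beer's law: c = A / (eps * l), converted from mol/L to ug/mL.
    molar_abs is the abstract's reported value in L/(mol*cm); the molecular
    weight and path length are assumptions for illustration."""
    c_mol_per_l = absorbance / (molar_abs * path_cm)
    return c_mol_per_l * mw * 1000.0   # mol/L -> g/L -> ug/mL

# An absorbance of ~0.303 corresponds to roughly 1 ug/mL under these assumptions
print(round(concentration_ug_per_ml(0.303), 2))
```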

  12. Quantitative bioanalytical methods validation and implementation: best practices for chromatographic and ligand binding assays.

    PubMed

    Viswanathan, C T; Bansal, Surendra; Booth, Brian; DeStefano, Anthony J; Rose, Mark J; Sailstad, Jeffrey; Shah, Vinod P; Skelly, Jerome P; Swann, Patrick G; Weiner, Russell

    2007-10-01

    The Third AAPS/FDA Bioanalytical Workshop, entitled "Quantitative Bioanalytical Methods Validation and Implementation: Best Practices for Chromatographic and Ligand Binding Assays" was held on May 1-3, 2006 in Arlington, VA. The format of this workshop consisted of presentations on bioanalytical topics, followed by discussion sessions where these topics could be debated, with the goal of reaching consensus, or identifying subjects where additional input or clarification was required. The discussion also addressed bioanalytical validation requirements of regulatory agencies, with the purpose of clarifying expectations for regulatory submissions. The proceedings from each day were reviewed and summarized in the evening sessions among the speakers and moderators of the day. The consensus summary was presented back to the workshop on the last day and was further debated. This communication represents the distillate of the workshop proceedings and provides the summary of consensus reached and also contains the validation topics where no consensus was reached. PMID:17458684

  13. Rapid and Inexpensive Screening of Genomic Copy Number Variations Using a Novel Quantitative Fluorescent PCR Method

    PubMed Central

    Han, Joan C.; Elsea, Sarah H.; Pena, Heloísa B.; Pena, Sérgio Danilo Junho

    2013-01-01

    Detection of human microdeletion and microduplication syndromes poses a significant burden on public healthcare systems in developing countries. With genome-wide diagnostic assays frequently inaccessible, targeted low-cost PCR-based approaches are preferred. However, their reproducibility depends on equally efficient amplification using a number of target and control primers. To address this, the recently described technique called Microdeletion/Microduplication Quantitative Fluorescent PCR (MQF-PCR) was shown to reliably detect four human syndromes by quantifying DNA amplification in an internally controlled PCR reaction. Here, we confirm its utility in the detection of eight human microdeletion syndromes, including the more common WAGR, Smith-Magenis, and Potocki-Lupski syndromes, with 100% sensitivity and 100% specificity. We present the selection, design, and performance evaluation of detection primers using a variety of approaches. We conclude that MQF-PCR is an easily adaptable method for the detection of human pathological chromosomal aberrations. PMID:24288428

  14. Quantitative radiochemical method for determination of major sources of natural radioactivity in ores and minerals

    USGS Publications Warehouse

    Rosholt, J.N., Jr.

    1954-01-01

    When an ore sample contains radioactivity other than that attributable to the uranium series in equilibrium, a quantitative analysis of the other emitters must be made in order to determine the source of this activity. Thorium-232, radon-222, and lead-210 have been determined by isolation and subsequent activity analysis of some of their short-lived daughter products. The sulfides of bismuth and polonium are precipitated out of solutions of thorium or uranium ores, and the α-particle activity of polonium-214, polonium-212, and polonium-210 is determined by scintillation-counting techniques. Polonium-214 activity is used to determine radon-222, polonium-212 activity for thorium-232, and polonium-210 for lead-210. The development of these methods of radiochemical analysis will facilitate the rapid determination of some of the major sources of natural radioactivity.

  15. A quantitative and qualitative method to control chemotherapeutic preparations by Fourier transform infrared-ultraviolet spectrophotometry.

    PubMed

    Dziopa, Florian; Galy, Guillaume; Bauler, Stephanie; Vincent, Benoit; Crochon, Sarah; Tall, Mamadou Lamine; Pirot, Fabrice; Pivot, Christine

    2013-06-01

    Chemotherapy production in hospitals includes a reconstitution step in which manufactured drugs are prepared at a dosage adapted to each patient. The administration of highly iatrogenic drugs raises the question of patient safety and treatment efficiency. In order to reduce administration errors due to faulty preparations, we introduced a new qualitative and quantitative routine control based on Fourier transform infrared (FTIR) and UV-visible spectrophotometry. This automated method enabled fast and specific control of 14 anticancer drugs. A 1.2 mL sample was used to assay and identify each preparation in less than 90 sec. Over a two-year period, 9370 controlled infusion bags showed a 1.49% nonconformity rate, using a 15% tolerance on the theoretical concentration and a 96% minimum identification matching factor. This study evaluated the reliability of the control process, as well as its accordance with chemotherapy deliverance requirements. Corrective measures were then defined to improve the control process. PMID:23014899

  16. Molecular methods: chip assay and quantitative real-time PCR in detecting hepatotoxic cyanobacteria.

    PubMed

    Rantala-Ylinen, Anne; Sipari, Hanna; Sivonen, Kaarina

    2011-01-01

    Cyanobacterial mass occurrences are widespread and often contain hepatotoxic, i.e. microcystin- and nodularin-producing, species. Nowadays, detection of microcystin (mcy) and nodularin synthetase (nda) genes is widely used for the recognition of toxic cyanobacterial strains in environmental water samples. Chip assay presented here combines ligation detection reaction and hybridization on a universal microarray to detect and identify the mcyE/ndaF genes of five cyanobacterial genera specifically and sensitively. Thus, one chip assay can reveal the co-occurrence of several hepatotoxin producers. The presented quantitative real-time PCR method is used for the detection of either microcystin-producing Anabaena or Microcystis. Determination of the mcyE-gene copy numbers allows the identification of the dominant producer genus in the sample. PMID:21567319
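In quantitative real-time PCR, gene copy numbers such as the mcyE counts mentioned here are typically read off a standard curve relating the quantification cycle (Cq) to the log of the starting copy number. The calibration values below are illustrative, not from the paper:

```python
# Hypothetical standard-curve calibration: Cq = SLOPE * log10(N0) + INTERCEPT.
# A slope near -3.32 corresponds to ~100% amplification efficiency.
SLOPE, INTERCEPT = -3.32, 38.0

def copies_from_cq(cq):
    """Invert the standard curve to estimate starting gene copies N0."""
    return 10 ** ((cq - INTERCEPT) / SLOPE)

# A sample crossing threshold at cycle 28 under this calibration holds ~1000 copies
print(round(copies_from_cq(28.0)))
```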

  17. A novel method for quantitation of favism-inducing agents in legumes.

    PubMed

    Chevion, M; Navok, T

    1983-01-01

    A new method for the quantitation of the favism-inducing agents in legumes is described. The procedure involves differential extraction of the glucosides vicine and convicine with acetic acid (25%), followed by an enzymatic hydrolysis by beta-glucosidase under anaerobic conditions. Each of the aglycone moieties, isouramil and divicine, anaerobically reduces two molecules of o-ferriphenanthroline to o-ferrophenanthroline. This reaction is readily followed spectrophotometrically at 515 nm. Using this procedure, it was found that in various strains of Vicia faba, the level of these two glucosides comprises approximately 0.5% of the wet weight of the seeds. In contrast, these glucosides could not be detected in either green peas or chick peas. PMID:6846790

  18. Alkaline O → N-transacylation. A new method for the quantitative deacylation of phospholipids.

    PubMed Central

    Clarke, N G; Dawson, R M

    1981-01-01

    1. Quantitative O-deacylation of phospholipids has been achieved by incubation with a reagent containing monomethylamine, methanol and water. The reaction is primarily an O → N-transacylation, with N-methyl fatty acid amides being formed. 2. The reagent can be removed easily by volatilization, and under defined conditions no secondary decomposition of the phosphorus-containing deacylation products occurs. 3. The water-soluble phosphorus compounds derived by deacylation of mammalian tissue O-diacylated phospholipids have been completely separated by single-dimensional paper ionophoresis with a volatile pH 9 buffer. 4. The O-deacylated alkyl and alkenyl phospholipids have been examined by t.l.c. before and after catalytic hydrolysis with Hg2+. 5. A complete analysis of rat brain phospholipids by the above methods agrees closely with that obtained by other procedures. PMID:7306057

  19. A method of quantitative analysis of rapid thermal processes through vessel walls under nonisothermal liquid flow

    NASA Astrophysics Data System (ADS)

    Bolshukhin, M. A.; Znamenskaya, I. A.; Fomichev, V. I.

    2015-11-01

    It is shown that the pulsation and energy characteristics of a turbulent liquid boundary layer can be investigated by thermal imaging in the 3.7-4.8 µm spectral range, with a temporal resolution of 100 Hz, through an infrared (IR)-transparent vessel wall containing a moving liquid. Through an IR-opaque vessel wall, thermal imaging can instead probe the spatio-temporal, frequency, and spectral characteristics of the heat fluxes that the liquid boundary layer induces in the wall material. The dependence of the spectral and frequency characteristics on the thickness and material of the wall is shown. The proposed method makes it possible to measure and analyze the temperature fields of liquid boundary layers, as well as to acquire quantitative data on the wall material and thickness, including verification of the corresponding numerical-modeling programs and algorithms.

  20. Topological study on the toxicity of ionic liquids on Vibrio fischeri by the quantitative structure-activity relationship method.

    PubMed

    Yan, Fangyou; Shang, Qiaoyan; Xia, Shuqian; Wang, Qiang; Ma, Peisheng

    2015-04-01

    As environmentally friendly solvents, ionic liquids (ILs) are unlikely to act as air contaminants or inhalation toxins, owing to their negligible vapor pressure and excellent thermal stability. However, they can be potential water contaminants because of their considerable solubility in water; therefore, a proper toxicological assessment of ILs is essential. The environmental fate of ILs is studied here by the quantitative structure-activity relationship (QSAR) method. A multiple linear regression (MLR) model is obtained by a topological method using Vibrio fischeri toxicity data for 157 ILs, which are composed of 74 cations and 22 anions. The topological index developed in our research group is used for predicting V. fischeri toxicity for the first time. The MLR model is precise for estimating LogEC50 of ILs on V. fischeri, with a squared correlation coefficient (R²) of 0.908 and an average absolute error (AAE) of 0.278. PMID:25603290
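    As an illustration of the fit statistics reported in this abstract, the sketch below fits an MLR model by ordinary least squares and computes R² and AAE. The descriptor and toxicity values are invented for the example; they are not the paper's topological indices or data.

    ```python
    import numpy as np

    # Hypothetical topological descriptors (rows: ILs, cols: descriptors) and
    # invented LogEC50 toxicity values, purely for illustration
    X = np.array([[1.2, 0.5], [2.0, 1.1], [3.1, 0.9], [4.2, 2.0], [5.0, 1.5]])
    y = np.array([-0.8, -1.1, -1.9, -2.6, -3.0])

    # Fit MLR by ordinary least squares: y ≈ X·b + b0
    A = np.column_stack([X, np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    y_hat = A @ coef

    # Fit statistics in the form QSAR studies report them
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot              # squared correlation coefficient
    aae = np.mean(np.abs(y - y_hat))        # average absolute error

    print(f"R^2 = {r2:.3f}, AAE = {aae:.3f}")
    ```
    
    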

  1. Perspectives of Speech-Language Pathologists on the Use of Telepractice in Schools: Quantitative Survey Results

    PubMed Central

    Tucker, Janice K.

    2012-01-01

    This research surveyed 170 school-based speech-language pathologists (SLPs) in one northeastern state, with only 1.8% reporting telepractice use in school settings. These results were consistent with two ASHA surveys (2002; 2011) that reported limited use of telepractice for school-based speech-language pathology. In the present study, willingness to use telepractice was inversely related to age, perhaps because younger members of the profession are more accustomed to using technology. Overall, respondents were concerned about the validity of assessments administered via telepractice; whether clinicians can adequately establish rapport with clients via telepractice; and whether therapy conducted via telepractice can be as effective as in-person speech-language therapy. Most respondents indicated the need to establish procedures and guidelines for school-based telepractice programs. PMID:25945204

  2. Quantitative analysis of toxic and essential elements in human hair. Clinical validity of results.

    PubMed

    Kosanovic, Melita; Jokanovic, Milan

    2011-03-01

    Over the last three decades, there has been an increasing awareness of environmental and occupational exposures to toxic or potentially toxic trace elements. The evolution of biological monitoring includes knowledge of the kinetics of toxic and/or essential elements and of the adverse health effects related to exposure to them. The debate over whether hair is a valid sample for biomonitoring is still attracting the attention of analysts, health care professionals, and environmentalists. Although researchers have found many correlations of essential elements to diseases, metabolic disorders, environmental exposures, and nutritional status, opponents of the concept of hair analysis object that hair samples are unreliable due to the influence of external factors. This review discusses the validity of hair as a sample for biomonitoring of essential and toxic elements, with emphasis on the pre-analytical, analytical, and post-analytical factors influencing results. PMID:20490915

  3. ADvanced IMage Algebra (ADIMA): a novel method for depicting multiple sclerosis lesion heterogeneity, as demonstrated by quantitative MRI

    PubMed Central

    Tozer, Daniel J; Schmierer, Klaus; Chard, Declan T; Anderson, Valerie M; Altmann, Daniel R; Miller, David H; Wheeler-Kingshott, Claudia AM

    2013-01-01

    Background: There are modest correlations between multiple sclerosis (MS) disability and white matter lesion (WML) volumes, as measured by T2-weighted (T2w) magnetic resonance imaging (MRI) scans (T2-WML). This may partly reflect pathological heterogeneity in WMLs, which is not apparent on T2w scans. Objective: To determine if ADvanced IMage Algebra (ADIMA), a novel MRI post-processing method, can reveal WML heterogeneity from proton-density weighted (PDw) and T2w images. Methods: We obtained conventional PDw and T2w images from 10 patients with relapsing–remitting MS (RRMS) and ADIMA images were calculated from these. We classified all WML into bright (ADIMA-b) and dark (ADIMA-d) sub-regions, which were segmented. We obtained conventional T2-WML and T1-WML volumes for comparison, as well as the following quantitative magnetic resonance parameters: magnetisation transfer ratio (MTR), T1 and T2. Also, we assessed the reproducibility of the segmentation for ADIMA-b, ADIMA-d and T2-WML. Results: Our study’s ADIMA-derived volumes correlated with conventional lesion volumes (p < 0.05). ADIMA-b exhibited higher T1 and T2, and lower MTR than the T2-WML (p < 0.001). Despite the similarity in T1 values between ADIMA-b and T1-WML, these regions were only partly overlapping with each other. ADIMA-d exhibited quantitative characteristics similar to T2-WML; however, they were only partly overlapping. Mean intra- and inter-observer coefficients of variation for ADIMA-b, ADIMA-d and T2-WML volumes were all < 6 % and < 10 %, respectively. Conclusion: ADIMA enabled the simple classification of WML into two groups having different quantitative magnetic resonance properties, which can be reproducibly distinguished. PMID:23037551
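    The intra- and inter-observer reproducibility figures in this abstract are coefficients of variation. A minimal sketch of that computation, using invented repeat lesion-volume measurements rather than the study's data:

    ```python
    import numpy as np

    # Hypothetical repeated lesion-volume segmentations (ml) by one observer
    repeats = np.array([4.8, 5.1, 4.9])

    # Coefficient of variation: sample SD as a percentage of the mean
    cv = 100.0 * repeats.std(ddof=1) / repeats.mean()
    print(f"CV = {cv:.1f}%")  # → 3.1% for these invented values
    ```
    
    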

  4. A convenient method for the quantitative determination of elemental sulfur in coal by HPLC analysis of perchloroethylene extracts

    USGS Publications Warehouse

    Buchanan, D.H.; Coombs, K.J.; Murphy, P.M.; Chaven, C.

    1993-01-01

    A convenient method for the quantitative determination of elemental sulfur in coal is described. Elemental sulfur is extracted from the coal with hot perchloroethylene (PCE) (tetrachloroethene, C2Cl4) and quantitatively determined by HPLC analysis on a C18 reverse-phase column using UV detection. Calibration solutions were prepared from sublimed sulfur. Results of quantitative HPLC analyses agreed with those of a chemical/spectroscopic analysis. The HPLC method was found to be linear over the concentration range of 6 × 10⁻⁴ to 2 × 10⁻² g/L. The lower detection limit was 4 × 10⁻⁴ g/L, which for a coal sample of 20 g is equivalent to 0.0006% by weight of coal. Since elemental sulfur is known to react slowly with hydrocarbons at the temperature of boiling PCE, standard solutions of sulfur in PCE were heated with coals from the Argonne Premium Coal Sample program. Pseudo-first-order uptake of sulfur by the coals was observed over several weeks of heating. For the Illinois No. 6 premium coal, the rate constant for sulfur uptake was 9.7 × 10⁻⁷ s⁻¹, too small for retrograde reactions between solubilized sulfur and coal to cause a significant loss in elemental sulfur isolated during the analytical extraction. No elemental sulfur was produced when the following pure compounds were heated to reflux in PCE for up to 1 week: benzyl sulfide, octyl sulfide, thiane, thiophene, benzothiophene, dibenzothiophene, sulfuric acid, or ferrous sulfate. A slurry of mineral pyrite in PCE contained elemental sulfur which increased in concentration with heating time. © 1993 American Chemical Society.
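    The claim that the retrograde reaction is negligible can be checked directly from the reported pseudo-first-order rate constant. A short sketch, assuming an illustrative 2 h extraction time (the abstract does not state the extraction duration):

    ```python
    import math

    k = 9.7e-7            # s^-1, rate constant reported for Illinois No. 6 coal
    t_extract = 2 * 3600  # s, assumed 2 h analytical extraction (illustrative)

    # Pseudo-first-order decay: fraction of elemental sulfur still in solution
    remaining = math.exp(-k * t_extract)
    half_life_days = math.log(2) / k / 86400

    print(f"fraction remaining after 2 h: {remaining:.4f}")   # ~0.993, i.e. <1% loss
    print(f"half-life of sulfur uptake: {half_life_days:.0f} days")
    ```

    The half-life of roughly a week explains why uptake was only observed "over several weeks of heating" while a short extraction loses essentially no sulfur.
    
    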

  5. Transconvolution and the virtual positron emission tomograph-A new method for cross calibration in quantitative PET/CT imaging

    SciTech Connect

    Prenosil, George A.; Weitzel, Thilo; Hentschel, Michael; Klaeser, Bernd; Krause, Thomas

    2013-06-15

    Purpose: Positron emission tomography (PET)/computed tomography (CT) measurements on small lesions are impaired by the partial volume effect, which is intrinsically tied to the point spread function of the actual imaging system, including the reconstruction algorithms. The variability resulting from different point spread functions hinders the assessment of quantitative measurements in clinical routine and especially degrades comparability within multicenter trials. To improve quantitative comparability there is a need for methods to match different PET/CT systems through elimination of this systemic variability. Consequently, a new method was developed and tested that transforms the image of an object as produced by one tomograph to another image of the same object as it would have been seen by a different tomograph. The proposed new method, termed Transconvolution, compensates for differing imaging properties of different tomographs and particularly aims at quantitative comparability of PET/CT in the context of multicenter trials. Methods: To solve the problem of image normalization, the theory of Transconvolution was mathematically established together with new methods to handle point spread functions of different PET/CT systems. Knowing the point spread functions of two different imaging systems allows determining a Transconvolution function to convert one image into the other. This function is calculated by convolving one point spread function with the inverse of the other point spread function which, when adhering to certain boundary conditions such as the use of linear acquisition and image reconstruction methods, is a numerically accessible operation. For reliable measurement of such point spread functions characterizing different PET/CT systems, a dedicated solid-state phantom incorporating ⁶⁸Ge/⁶⁸Ga filled spheres was developed. 
To iteratively determine and represent such point spread functions, exponential density functions in combination with a Gaussian distribution were introduced. Furthermore, simulation of a virtual PET system provided a standard imaging system with clearly defined properties to which the real PET systems were to be matched. A Hann window served as the modulation transfer function for the virtual PET. The Hann window's apodization suppressed high spatial frequencies above a certain critical frequency, thereby fulfilling the above-mentioned boundary conditions. The determined point spread functions were subsequently used by the novel Transconvolution algorithm to match different PET/CT systems onto the virtual PET system. Finally, the theoretically elaborated Transconvolution method was validated by transforming phantom images acquired on two different PET systems to nearly identical data sets, as they would be imaged by the virtual PET system. Results: The proposed Transconvolution method matched different PET/CT systems for an improved and reproducible determination of a normalized activity concentration. The highest difference in measured activity concentration between the two different PET systems of 18.2% was found in spheres of 2 ml volume. Transconvolution reduced this difference down to 1.6%. In addition to reestablishing comparability, the new method with its parameterization of point spread functions allowed a full characterization of imaging properties of the examined tomographs. Conclusions: By matching different tomographs to a virtual standardized imaging system, Transconvolution opens a new comprehensive method for cross calibration in quantitative PET imaging. The use of a virtual PET system restores comparability between data sets from different PET systems by exerting a common, reproducible, and defined partial volume effect.
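    The core operation, dividing the target system's MTF by the source system's MTF and relying on the Hann window's cutoff to keep the division stable, can be sketched in one dimension. This is a toy illustration, not the authors' implementation: the Gaussian PSF width, cutoff frequency, and grid are all invented.

    ```python
    import numpy as np

    n = 256
    x = np.arange(n) - n // 2

    # Source system PSF: Gaussian (sigma in pixels, illustrative)
    psf = np.exp(-x**2 / (2 * 4.0**2))
    psf /= psf.sum()

    # Target "virtual PET": Hann-window MTF that vanishes above a cutoff frequency
    f = np.fft.fftfreq(n)
    f_cut = 0.1
    mtf_target = np.where(np.abs(f) < f_cut, 0.5 * (1 + np.cos(np.pi * f / f_cut)), 0.0)

    # Transconvolution filter: target MTF divided by source MTF; the Hann
    # apodization zeroes the frequencies where the division would blow up
    mtf_source = np.abs(np.fft.fft(np.fft.ifftshift(psf)))
    tc = np.where(mtf_target > 0, mtf_target / np.maximum(mtf_source, 1e-12), 0.0)

    # Apply the filter to a blurred 1-D "image" of a point source
    obj = np.zeros(n)
    obj[n // 2] = 1.0
    blurred = np.real(np.fft.ifft(np.fft.fft(obj) * np.fft.fft(np.fft.ifftshift(psf))))
    matched = np.real(np.fft.ifft(np.fft.fft(blurred) * tc))

    # Total activity is preserved because both MTFs equal 1 at zero frequency
    print(f"total activity preserved: {matched.sum():.3f}")
    ```

    After filtering, the point source appears as it would under the virtual system's Hann response, which is the sense in which two real scanners can be matched to one common partial volume effect.
    
    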

  6. Rapid method for the quantitative determination of efavirenz in human plasma.

    PubMed

    Mogatle, Seloi; Kanfer, Isadore

    2009-07-12

    A pharmacokinetic interaction study between efavirenz (EFV), a non-nucleoside reverse transcriptase inhibitor used in the treatment of HIV-1 infection, and an African traditional medicine, African potato, in human subjects was undertaken. This necessitated the development and validation of a quantitative method for the analysis of EFV in plasma. A simple mobile phase consisting of 0.1 M formic acid, acetonitrile and methanol (43:52:5) was pumped at a low flow rate of 0.3 ml/min through a reverse-phase Phenomenex Luna C18 (2) (5 µm, 150 mm × 2.0 mm i.d.) column maintained at 40 °C. Diclofenac sodium was used as an internal standard (IS), and EFV and the IS were monitored at 247 nm and 275 nm, respectively. A simple and rapid sample preparation involved the addition of mobile phase to 100 µl of plasma to precipitate plasma proteins, followed by direct injection of 10 µl of supernatant onto the column. The procedures were validated according to international standards with good reproducibility and linear response (r = 0.9990). The intra- and inter-day accuracies were between 12.3 and 17.7% at the LLOQ and between -5.8 and 9.1% for the QC samples. The intra- and inter-day precision of EFV determinations was 5.1% RSD or less and 7.2% RSD or less, respectively, across the entire QC concentration range. Mean recovery based on high, medium and low quality control standards ranged between 92.7 and 94.1%, with %RSD values better than 3%. Plasma samples were evaluated for short-term (ambient temperature for 6 h) and long-term (-10 ± 2 °C for 60 days) storage conditions and were found to be stable. The method described is cost-effective and has the necessary accuracy and precision for the rapid quantitative determination of EFV in human plasma. PMID:19375262

  7. A systematic study on the influencing parameters and improvement of quantitative analysis of multi-component with single marker method using notoginseng as research subject.

    PubMed

    Wang, Chao-Qun; Jia, Xiu-Hong; Zhu, Shu; Komatsu, Katsuko; Wang, Xuan; Cai, Shao-Qing

    2015-03-01

    A new quantitative analysis of multi-component with single marker (QAMS) method for 11 saponins (ginsenosides Rg1, Rb1, Rg2, Rh1, Rf, Re and Rd; notoginsenosides R1, R4, Fa and K) in notoginseng was established, in which 6 of these saponins were individually used as internal reference substances to investigate the influences of chemical structure, concentrations of the quantitative components, and purities of the standard substances on the accuracy of the QAMS method. The results showed that the concentration of the analyte in the sample solution was the major influencing parameter, whereas the other parameters had minimal influence on the accuracy of the QAMS method. A new method for calculating the relative correction factors by linear regression was established (the linear regression method), which decreased the differences between the QAMS method and the external standard method from 1.20% ± 0.02%-23.29% ± 3.23% to 0.10% ± 0.09%-8.84% ± 2.85% in comparison with the previous method. The differences between the external standard method and the QAMS method using relative correction factors calculated by the linear regression method were below 5% in the quantitative determination of Rg1, Re, R1, Rd and Fa in 24 notoginseng samples and of Rb1 in 21 notoginseng samples, and were mostly below 10% in the quantitative determination of Rf, Rg2, R4 and N-K in all 24 notoginseng samples (the differences for these 4 constituents were bigger because their contents were lower). The results indicated that the contents assayed by the new QAMS method could be considered as accurate as those assayed by the external standard method. In addition, a method for determining the applicable concentration ranges of the quantitative components assayed by the QAMS method was established for the first time, which could ensure its high accuracy and could be applied to QAMS methods of other TCMs. 
The present study demonstrated the practicability of the application of the QAMS method for the quantitative analysis of multi-component and the quality control of TCMs and TCM prescriptions. PMID:25618711
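    The idea of a relative correction factor derived from regression slopes can be sketched briefly: calibrate both the internal reference and the target analyte, take the ratio of slopes as the correction factor, then quantify the analyte using only the reference standard's calibration. All calibration numbers below are invented for illustration; they are not the paper's data.

    ```python
    import numpy as np

    # Hypothetical calibration data: peak areas at known concentrations (µg/ml)
    conc = np.array([10.0, 20.0, 40.0, 80.0, 160.0])
    area_ref = np.array([52.0, 103.0, 208.0, 417.0, 830.0])  # internal reference saponin
    area_i = np.array([31.0, 63.0, 124.0, 250.0, 498.0])     # target analyte

    # Relative correction factor from the regression slopes (area ≈ slope · conc)
    slope_ref = np.polyfit(conc, area_ref, 1)[0]
    slope_i = np.polyfit(conc, area_i, 1)[0]
    rcf = slope_ref / slope_i

    # QAMS quantification of the analyte in a sample, using only the
    # reference standard's slope plus the correction factor
    sample_area_i = 180.0
    conc_i = rcf * sample_area_i / slope_ref   # equivalent to sample_area_i / slope_i
    print(f"RCF = {rcf:.3f}, estimated concentration = {conc_i:.1f} µg/ml")
    ```
    
    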

  8. Application of quantitative 1H-NMR method to determination of gentiopicroside in Gentianae radix and Gentianae scabrae radix.

    PubMed

    Tanaka, Rie; Hasebe, Yuko; Nagatsu, Akito

    2014-07-01

    A quantitative ¹H-NMR method (qHNMR) was used to measure the gentiopicroside content in Gentianae radix and Gentianae scabrae radix. Gentiopicroside is a major component of both crude drugs. The purity of gentiopicroside was calculated from the ratio of the intensity of its H-3 signal at δ 7.44 ppm or its H-8 signal at δ 5.78 ppm in methanol-d4 to that of a hexamethyldisilane (HMD) signal at 0 ppm. The concentration of HMD was corrected with SI traceability by using potassium hydrogen phthalate of certified reference material (CRM) grade. As a result, the gentiopicroside content in two lots of Gentianae radix as determined by qHNMR was found to be 1.76% and 2.17%, respectively. The gentiopicroside content in two lots of Gentianae scabrae radix was 2.73% and 3.99%, respectively. We demonstrated that this method is useful for the quantitative analysis of crude drugs. PMID:24687868
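    The content calculation behind a qHNMR measurement follows the standard proton-ratio relation. A minimal sketch with the standard formula; every numeric value below (integrals, weights, purity) is invented for illustration, not taken from the paper:

    ```python
    # Illustrative qHNMR content calculation: compare the analyte's single-proton
    # H-3 integral to the 18-proton hexamethyldisilane (HMD) reference signal.
    I_a, N_a = 0.13, 1        # integral and proton count of gentiopicroside H-3
    I_s, N_s = 3.45, 18       # integral and proton count of HMD ((CH3)6Si2, 18 H)
    M_a, M_s = 356.33, 146.38 # molar masses, g/mol (gentiopicroside C16H20O9, HMD)
    m_s = 1.50                # mg HMD added to the tube
    m_sample = 120.0          # mg sample weighed in
    P_s = 0.999               # purity of HMD, corrected against the CRM

    # Standard qHNMR relation: per-proton integral ratio, scaled by molar masses,
    # weighed-in masses, and the reference purity
    content = (I_a / N_a) / (I_s / N_s) * (M_a / M_s) * (m_s / m_sample) * P_s * 100
    print(f"gentiopicroside content = {content:.2f}%")  # ≈ 2.06% for these numbers
    ```
    
    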

  9. A quantitative method for evaluating numerical simulation accuracy of time-transient Lamb wave propagation with its applications to selecting appropriate element size and time step.

    PubMed

    Wan, Xiang; Xu, Guanghua; Zhang, Qing; Tse, Peter W; Tan, Haihui

    2016-01-01

    Lamb wave techniques have been widely used in non-destructive evaluation (NDE) and structural health monitoring (SHM). However, due to its multi-mode and dispersive nature, Lamb wave propagation behavior is much more complex than that of bulk waves. Numerous numerical simulations of Lamb wave propagation have been conducted to study its physical principles. However, few quantitative studies evaluating the accuracy of these numerical simulations have been reported. In this paper, a method based on cross-correlation analysis for quantitatively evaluating the simulation accuracy of time-transient Lamb wave propagation is proposed. Two kinds of error, affecting position accuracy and shape accuracy, are first identified. Consequently, two quantitative indices, the GVE (group velocity error) and MACCC (maximum absolute value of the cross-correlation coefficient), derived from cross-correlation analysis between a simulated signal and a reference waveform, are proposed to assess the position and shape errors of the simulated signal. In this way, the simulation accuracy of the position and shape is quantitatively evaluated. To apply this proposed method to selecting an appropriate element size and time step, a specialized 2D-FEM program combined with the proposed method is developed. The proper element size considering different element types, and the proper time step considering different time integration schemes, are then selected. These results prove that the proposed method is feasible and effective, and can be used as an efficient tool for quantitatively evaluating and verifying the simulation accuracy of time-transient Lamb wave propagation. PMID:26315506
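    The two indices can be sketched from their definitions: the peak of the normalized cross-correlation measures shape agreement (MACCC), and the lag of that peak gives an arrival-time offset that maps to a group velocity error (GVE). The toneburst waveform, sampling rate, delay, and propagation distance below are all assumptions for the sketch, not values from the paper.

    ```python
    import numpy as np

    fs = 1e6                        # sampling rate, Hz (assumed)
    t = np.arange(0, 200e-6, 1 / fs)

    def toneburst(t0, f0=100e3, cycles=5):
        # Hann-windowed toneburst, a common Lamb-wave excitation shape
        tau = t - t0
        w = (0 <= tau) & (tau < cycles / f0)
        return np.where(w, np.sin(2*np.pi*f0*tau) * np.sin(np.pi*f0*tau/cycles)**2, 0.0)

    ref = toneburst(t0=50e-6)          # reference waveform
    sim = 0.9 * toneburst(t0=52e-6)    # "simulated" signal: arrives 2 µs late (assumed)

    # Normalized cross-correlation between simulated and reference signals
    cc = np.correlate(sim, ref, mode="full")
    cc /= np.sqrt(np.sum(sim**2) * np.sum(ref**2))
    lags = np.arange(-len(ref) + 1, len(ref))

    maccc = np.max(np.abs(cc))               # shape index: 1 means identical shape
    dt = lags[np.argmax(np.abs(cc))] / fs    # arrival-time offset (position error)

    # With a known propagation distance, the offset maps to a group velocity error
    d = 0.30                                 # m, assumed propagation distance
    v_ref = d / 50e-6
    v_sim = d / (50e-6 + dt)
    gve = 100 * abs(v_sim - v_ref) / v_ref

    print(f"MACCC = {maccc:.3f}, arrival offset = {dt*1e6:.1f} µs, GVE = {gve:.2f}%")
    ```

    Because the simulated signal here is a scaled, delayed copy of the reference, MACCC comes out at 1.0 while the GVE is nonzero, which is exactly the separation of shape error from position error the paper describes.
    
    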

  10. A validated RP-HPLC-UV method for quantitative determination of puerarin in Pueraria tuberosa DC tuber extract

    PubMed Central

    Maji, Amal K.; Maity, Niladri; Banerji, Pratim; Banerjee, Debdulal

    2012-01-01

    Background: Pueraria tuberosa (Fabaceae) is a well-known medicinal herb used in Indian traditional medicines. Puerarin is one of the most important bioactive constituents found in the tubers of this plant. Quantitative estimation of bioactive molecules is essential for the quality control and dose determination of herbal medicines. The study was designed to develop a validated reversed-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of puerarin in the tuber extract of P. tuberosa. Materials and Methods: The RP-HPLC system with a Luna C18 (2) 100 Å, 250 × 4.6 mm column was used in this study. The analysis was performed using the mobile phase 0.1% acetic acid in acetonitrile and 0.1% acetic acid in water (90:10, v/v) at a column temperature of 25°C. The detection wavelength was set at 254 nm with a flow rate of 1 ml/min. The method validation was performed according to the guidelines of the International Conference on Harmonisation. Results: The puerarin content of P. tuberosa extract was found to be 9.28 ± 0.09%. The calibration curve showed a good linear relationship in the range of 200-1000 µg/ml (r² > 0.99). The LOD and LOQ were 57.12 and 181.26 µg/ml, respectively, and the average recovery of puerarin was 99.73 ± 1.02%. The evaluation of system suitability, precision, robustness and ruggedness parameters also produced satisfactory results. Conclusions: The developed method is simple and rapid, with excellent specificity, accuracy and precision, and can be useful for the routine analysis and quantitative estimation of puerarin in plant extracts and formulations. PMID:23781483
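    LOD and LOQ values of the kind reported here are often derived from the calibration curve using the ICH Q2(R1) formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S the slope (the abstract does not state which approach was used). A sketch with invented calibration data:

    ```python
    import numpy as np

    # Hypothetical calibration data: puerarin concentration (µg/ml) vs peak area
    conc = np.array([200.0, 400.0, 600.0, 800.0, 1000.0])
    area = np.array([1503.0, 3006.0, 4498.0, 6011.0, 7495.0])

    # Least-squares calibration line: area = S*conc + b
    S, b = np.polyfit(conc, area, 1)
    resid = area - (S * conc + b)
    sigma = np.sqrt(np.sum(resid**2) / (len(conc) - 2))  # residual SD of the fit

    # ICH Q2(R1) calibration-curve estimates
    lod = 3.3 * sigma / S
    loq = 10.0 * sigma / S

    r = np.corrcoef(conc, area)[0, 1]
    print(f"slope = {S:.3f}, r^2 = {r**2:.4f}, LOD = {lod:.1f}, LOQ = {loq:.1f} µg/ml")
    ```
    
    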

  11. Quantitative Results from Shockless Compression Experiments on Solids to Multi-Megabar Pressure

    NASA Astrophysics Data System (ADS)

    Davis, Jean-Paul; Brown, Justin; Knudson, Marcus; Lemke, Raymond

    2015-03-01

    Quasi-isentropic, shockless ramp-wave experiments promise accurate equation-of-state (EOS) data in the solid phase at relatively low temperatures and multi-megabar pressures. In this range of pressure, isothermal diamond-anvil techniques have limited pressure accuracy due to reliance on theoretical EOS of calibration standards, thus accurate quasi-isentropic compression data would help immensely in constraining EOS models. Multi-megabar shockless compression experiments using the Z Machine at Sandia as a magnetic drive with stripline targets continue to be performed on a number of solids. New developments will be presented in the design and analysis of these experiments, including topics such as 2-D and magneto-hydrodynamic (MHD) effects and the use of LiF windows. Results will be presented for tantalum and/or gold metals, with comparisons to independently developed EOS. * Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  12. Parents' decision-making about the human papillomavirus vaccine for their daughters: I. Quantitative results.

    PubMed

    Krawczyk, Andrea; Knäuper, Bärbel; Gilca, Vladimir; Dubé, Eve; Perez, Samara; Joyal-Desmarais, Keven; Rosberger, Zeev

    2015-01-01

    Vaccination against the human papillomavirus (HPV) is an effective primary prevention measure for HPV-related diseases. For children and young adolescents, the uptake of the vaccine is contingent on parental consent. This study sought to identify key differences between parents who obtain (acceptors) and parents who refuse (non-acceptors) the HPV vaccine for their daughters. In the context of a free, universal, school-based HPV vaccination program in Québec, 774 parents of 9-10 year-old girls completed and returned a questionnaire by mail. The questionnaire was based on the theoretical constructs of the Health Belief Model (HBM), along with constructs from other theoretical frameworks. Of the 774 parents, 88.2% reported their daughter having received the HPV vaccine. Perceived susceptibility of daughters to HPV infection, perceived benefits of the vaccine, perceived barriers (including safety of the vaccine), and cues to action significantly distinguished between parents whose daughters had received the HPV vaccine and those whose daughters had not. Other significant factors associated with daughter vaccine uptake were parents' general vaccination attitudes, anticipated regret, adherence to other routinely recommended vaccines, social norms, and positive media influence. The results of this study identify a number of important correlates related to parents' decisions to accept or refuse the HPV vaccine uptake for their daughters. Future work may benefit from targeting such factors and incorporating other health behavior theories in the design of effective HPV vaccine uptake interventions. PMID:25692455

  14. Development of an Effective Method for Recovery of Viral Genomic RNA from Environmental Silty Sediments for Quantitative Molecular Detection

    PubMed Central

    Miura, Takayuki; Masago, Yoshifumi; Sano, Daisuke; Omura, Tatsuo

    2011-01-01

    Nine approaches to recover viral RNA from environmental silty sediments were newly developed and compared to quantify RNA viruses in sediments using molecular methods. Four of the nine approaches employed direct procedures for extracting RNA from sediments (direct methods), and the remaining five approaches used indirect methods wherein viral particles were recovered before RNA extraction. A direct method using an SDS buffer with EDTA to lyse viral capsids in sediments, phenol-chloroform-isoamyl alcohol to extract RNA, isopropanol to concentrate RNA, and magnetic beads to purify RNA resulted in the highest rate of recovery (geometric mean of 11%, with a geometric standard deviation of 0.02; n = 7) of poliovirus 1 (PV1) inoculated in an environmental sediment sample. The direct method exhibiting the highest rate of PV1 recovery was applied to environmental sediment samples. One hundred eight sediment samples were collected from the Takagi River, Miyagi, Japan, and its estuary from November 2007 to April 2009, and the genomic RNAs of enterovirus and human norovirus in these samples were quantified by reverse transcription (RT)-quantitative PCR (qPCR). The human norovirus genome was detected in one sample collected at the bay, although its concentration was below the quantification limit. Meanwhile, the enterovirus genome was detected in two samples at the river mouth and river at concentrations of 8.6 × 10² and 2.4 × 10² copies/g (wet weight), respectively. This is the first report to obtain quantitative data for a human pathogenic virus in a river and in estuarine sediments using RT-qPCR. PMID:21515729
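    Recovery rates such as the 11% figure above are ratios, so they are summarized with a geometric rather than arithmetic mean. A minimal sketch with invented replicate values (not the study's seven measurements):

    ```python
    import math

    # Hypothetical PV1 recovery rates from 7 spiked sediment replicates
    recoveries = [0.09, 0.13, 0.10, 0.12, 0.11, 0.10, 0.12]

    # Geometric mean: exponential of the mean log recovery
    logs = [math.log(r) for r in recoveries]
    gmean = math.exp(sum(logs) / len(logs))
    print(f"geometric mean recovery = {gmean:.1%}")  # → 10.9% for these values
    ```
    
    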

  15. Quantitative fault analysis of roller bearings based on a novel matching pursuit method with a new step-impulse dictionary

    NASA Astrophysics Data System (ADS)

    Cui, Lingli; Wu, Na; Ma, Chunqing; Wang, Huaqing

    2016-02-01

    A novel matching pursuit method based on a new step-impulse dictionary to measure the size of a bearing's spall-like fault is presented in this study. Based on the seemingly double-impact theory and the rolling bearing fault mechanism, a theoretical model for the bearing fault with different spall-like fault sizes is developed and analyzed, and the seemingly double-impact characteristic of the bearing faults is explained. The first action is the entry of the roller element into the spall-like fault, which can be described as a step-like response. The second action is the exit of the roller element from the spall-like fault, which can be described as an impulse-like response. Based on the quantitative relationship between the time interval of the seemingly double-impact actions and the fault size, a novel matching pursuit method is proposed based on a new step-impulse dictionary. In addition, the quantitative matching pursuit algorithm is proposed for bearing fault diagnosis based on the new dictionary model. Finally, an atomic selection mechanism is proposed to improve the measurement accuracy of bearing fault size. The simulation results of this study indicate that the new matching pursuit method based on the new step-impulse dictionary can be reliably used to measure the sizes of bearing spall-like faults. The applications of this method to the fault signals of bearing outer-races measured at different speeds have shown that the proposed method can effectively measure a bearing's spall-like fault size.
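    The core idea, greedy matching pursuit over a dictionary containing both step-like and impulse-like atoms so that the entry/exit time interval falls out of the selected atom positions, can be sketched as a toy example. The atom shapes, decay constants, and signal below are all invented; the paper's dictionary and atomic selection mechanism are more elaborate.

    ```python
    import numpy as np

    n = 400
    t = np.arange(n)

    def step_atom(t0):
        # Decaying step-like response (entry of the roller into the spall)
        a = (t >= t0).astype(float) * np.exp(-(t - t0) / 40.0)
        return a / np.linalg.norm(a)

    def impulse_atom(t0):
        # Oscillatory impulse-like response (exit of the roller from the spall)
        a = np.where(t >= t0, np.exp(-(t - t0) / 5.0) * np.sin(0.9 * (t - t0)), 0.0)
        return a / np.linalg.norm(a)

    # Synthetic "double-impact" fault signal: entry at sample 100, exit at 160
    sig = 0.8 * step_atom(100) + 1.2 * impulse_atom(160)

    # Step-impulse dictionary and a greedy 2-atom matching pursuit
    dictionary = [("step", t0, step_atom(t0)) for t0 in range(50, 300)] + \
                 [("impulse", t0, impulse_atom(t0)) for t0 in range(50, 300)]

    residual, picks = sig.copy(), []
    for _ in range(2):
        kind, t0, atom = max(dictionary, key=lambda d: abs(residual @ d[2]))
        residual = residual - (residual @ atom) * atom
        picks.append((kind, t0))

    # The entry-exit interval is what maps to spall size (via a speed factor
    # that would come from the bearing geometry and shaft speed)
    times = dict(picks)
    interval = abs(times["impulse"] - times["step"])
    print(f"atoms: {picks}, entry-exit interval = {interval} samples")
    ```
    
    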

  16. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Pavlov, K.A.

    1997-01-01

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing. © 1997 American Institute of Physics.

  17. Brain temperature and limits on transcranial cooling in humans: quantitative modeling results.

    PubMed

    Nelson, D A; Nunneley, S A

    1998-09-01

    Selective brain cooling (SBC) of varying strengths has been demonstrated in a number of mammals and appears to play a role in systemic thermoregulation. Although primates lack obvious specialization for SBC, the possibility of brain cooling in humans has been debated for many years. This paper reports on the use of mathematical modeling to explore whether surface cooling can control effectively the temperature of the human cerebrum. The brain was modeled as a hemisphere with a volume of 1.33 l and overlying layers of cerebrospinal fluid, skull, and scalp. Each component was assigned appropriate dimensions, physical properties and physiological characteristics that were determined from the literature. The effects of blood flow and of thermal conduction were modeled using the steady-state form of the bio-heat equation. Input parameters included core (arterial) temperature: normal (37 °C) or hyperthermic (40 °C), air temperature: warm (30 °C) or hot (40 °C), and sweat evaporation rate: 0, 0.25, or 0.50 l·m⁻²·h⁻¹. The resulting skin temperatures of the model ranged from 31.8 °C to 40.2 °C, values which are consistent with data obtained from the literature. Cerebral temperatures were generally insensitive to surface conditions (air temperature and evaporation rate), which affected only the most superficial level of the cerebrum (≤1.5 mm). The remaining parenchymal temperatures were 0.2-0.3 °C above arterial temperatures, regardless of surface conditions. This held true even for the worst-case conditions combining core hyperthermia in a hot environment with zero evaporative cooling. Modeling showed that the low surface-to-volume ratio, low tissue conductivity, and high rate of cerebral perfusion combine to minimize the potential impact of surface cooling, whether by transcranial venous flow or by conduction through intervening layers to the skin or mucosal surfaces. 
The dense capillary network in the brain assures that its temperature closely follows arterial temperature and is controlled through systemic thermoregulation independent of head surface temperature. A review of the literature reveals several independent lines of evidence which support these findings and indicate the absence of functionally significant transcranial venous flow in either direction. Given the fact that humans sometimes work under conditions which produce face and scalp temperatures that are above core temperature, a transcranial thermal link would not necessarily protect the brain, but might instead increase its vulnerability to environmentally induced thermal injury. PMID:9754976
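
The steady-state bio-heat balance described above can be sketched numerically. The following is a minimal illustration, not the authors' hemispherical model: a 1D slab version of the steady-state Pennes bio-heat equation solved by finite differences, with rough, assumed parameter values, showing how perfusion pins deep tissue near arterial temperature despite a cooled surface.

```python
import numpy as np

# Illustrative sketch only: 1D steady-state Pennes bio-heat equation,
#   k * d2T/dx2 + w_b * rho_b * c_b * (T_a - T) + q_m = 0,
# solved by finite differences for a tissue slab. All parameter values
# below are rough, assumed numbers, not those used in the paper.
k = 0.5        # tissue conductivity, W/(m K)
w_b = 0.008    # blood perfusion rate, 1/s (high, brain-like)
rho_c = 3.6e6  # rho_b * c_b, volumetric heat capacity of blood, J/(m^3 K)
q_m = 10000.0  # metabolic heat generation, W/m^3
T_a = 37.0     # arterial temperature, degrees C
T_surf = 34.0  # fixed cooled-surface temperature, degrees C

n = 201
L = 0.05                      # slab depth, m
x = np.linspace(0.0, L, n)
h = x[1] - x[0]

# Assemble the linear system A T = b for the interior nodes.
A = np.zeros((n, n))
b = np.zeros(n)
for i in range(1, n - 1):
    A[i, i - 1] = k / h**2
    A[i, i]     = -2.0 * k / h**2 - w_b * rho_c
    A[i, i + 1] = k / h**2
    b[i] = -w_b * rho_c * T_a - q_m
# Boundary conditions: cooled surface at x = 0, deep tissue held at T_a.
A[0, 0] = 1.0; b[0] = T_surf
A[-1, -1] = 1.0; b[-1] = T_a

T = np.linalg.solve(A, b)
# Perfusion keeps deep tissue near arterial temperature despite surface cooling.
print(f"T at 1.5 mm depth: {T[np.searchsorted(x, 0.0015)]:.2f} C")
print(f"T at 10 mm depth:  {T[np.searchsorted(x, 0.010)]:.2f} C")
```

With brain-like perfusion the thermal penetration depth, sqrt(k/(w_b * rho_c)), is only a few millimeters, which is consistent with the abstract's finding that surface conditions affect only the most superficial cerebrum.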

  18. Quantitatively estimating defects in graphene devices using discharge current analysis method

    PubMed Central

    Jung, Ukjin; Lee, Young Gon; Kang, Chang Goo; Lee, Sangchul; Kim, Jin Ju; Hwang, Hyeon June; Lim, Sung Kwan; Ham, Moon-Ho; Lee, Byoung Hun

    2014-01-01

    Defects in graphene are the most important concern for its successful application, since they significantly affect device performance. However, once the graphene is integrated in a device structure, the quality of the graphene and its surrounding environment can only be assessed using indirect information such as hysteresis, mobility and drive current. Here we develop a discharge current analysis method to measure the quality of graphene integrated in a field effect transistor structure by analyzing the discharge current, and examine its validity using various device structures. The density of charging sites affecting the performance of a graphene field effect transistor, obtained using the discharge current analysis method, was on the order of 10(14)/cm(2), which closely correlates with the intensity ratio of the D to G bands in Raman spectroscopy. The graphene FETs fabricated on poly(ethylene naphthalate) (PEN) are found to have a lower density of charging sites than those on a SiO2/Si substrate, mainly due to reduced interfacial interaction between the graphene and the PEN. This method can be an indispensable means to improve the stability of graphene devices, as it provides an accurate and quantitative way to assess the quality of graphene after device fabrication. PMID:24811431

  19. A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime

    PubMed Central

    Fitterer, Jessica L.; Nelson, Trisalyn A.

    2015-01-01

    Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers, we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the most prominent modelling choice (n = 78), though many variations existed depending on the data. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks). PMID:26418016

  1. Quantitative analysis of the lamellarity of giant liposomes prepared by the inverted emulsion method.

    PubMed

    Chiba, Masataka; Miyazaki, Makito; Ishiwata, Shin'ichi

    2014-07-15

    The inverted emulsion method is used to prepare giant liposomes by pushing water-in-oil droplets through the oil/water interface into an aqueous medium. Due to the high encapsulation efficiency of proteins under physiological conditions and the simplicity of the protocol, it has been widely used to prepare various cell models. However, the lamellarity of liposomes prepared by this method has not been evaluated quantitatively. Here, we prepared liposomes that were partially stained with a fluorescent dye, and analyzed their fluorescence intensity under an epifluorescence microscope. The fluorescence intensities of the membranes of individual liposomes were plotted against their diameter. The plots showed discrete distributions, which were classified into several groups. The group with the lowest fluorescence intensity was determined to be unilamellar by monitoring the exchangeability of the inner and the outer solutions of the liposomes in the presence of the pore-forming toxin α-hemolysin. Increasing the lipid concentration dissolved in oil increased the number of liposomes ∼100 times. However, almost all the liposomes were unilamellar even at saturating lipid concentrations. We also investigated the effects of lipid composition and liposome content, such as highly concentrated actin filaments and Xenopus egg extracts, on the lamellarity of the liposomes. Remarkably, over 90% of the liposomes were unilamellar under all conditions examined. We conclude that the inverted emulsion method can be used to efficiently prepare giant unilamellar liposomes and is useful for designing cell models. PMID:25028876
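
The classification step described above (discrete fluorescence-intensity groups, with the lowest group taken as unilamellar) can be illustrated with a toy calculation. All numbers below are invented; only the rounding logic reflects the kind of analysis the abstract describes.

```python
import numpy as np

# Hypothetical sketch: classifying liposome lamellarity from membrane
# fluorescence intensity, assuming intensities fall into discrete groups
# that are integer multiples of the unilamellar level. Data are simulated.
rng = np.random.default_rng(0)
unilamellar_intensity = 100.0                  # assumed unilamellar baseline
true_lamellarity = rng.integers(1, 4, size=50) # simulated "ground truth" (1-3)
intensity = true_lamellarity * unilamellar_intensity * rng.normal(1.0, 0.05, 50)

# Round each intensity to the nearest multiple of the unilamellar level.
estimated = np.clip(np.round(intensity / unilamellar_intensity), 1, None).astype(int)
fraction_unilamellar = np.mean(estimated == 1)
print(f"Fraction classified unilamellar: {fraction_unilamellar:.2f}")
```

In practice the unilamellar baseline would be anchored experimentally, as in the abstract, by verifying solution exchange through alpha-hemolysin pores.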

  2. Quantitative analysis of extracted phycobilin pigments in cyanobacteria-an assessment of spectrophotometric and spectrofluorometric methods.

    PubMed

    Sobiechowska-Sasim, Monika; Stoń-Egiert, Joanna; Kosakowska, Alicja

    2014-01-01

    Phycobilins are an important group of pigments that through complementary chromatic adaptation optimize the light-harvesting process in phytoplankton cells, exhibiting great potential as cyanobacteria species biomarkers. In their extracted form, concentrations of these water-soluble molecules are not easily determined using the chromatographic methods well suited to solvent-soluble pigments. Insights regarding the quantitative spectroscopic analysis of extracted phycobilins also remain limited. Here, we present an in-depth study of two methods that utilize the spectral properties of phycobilins in aqueous extracts. The technical work was carried out using high-purity standards of phycocyanin, phycoerythrin, and allophycocyanin. Calibration parameters for the spectrofluorometer and spectrophotometer were established. This analysis indicated the possibility of detecting pigments in concentrations ranging from 0.001 to 10 μg cm(-3). Fluorescence data revealed a reproducibility of 95 %. The differences in detection limits between the two methods enable the presence of phycobilins to be investigated and their amounts to be monitored from oligotrophic to eutrophic aquatic environments. PMID:25346572

  3. A novel method for quantitative determination of tea polysaccharide by resonance light scattering

    NASA Astrophysics Data System (ADS)

    Wei, Xinlin; Xi, Xionggang; Wu, Muxia; Wang, Yuanfeng

    2011-09-01

    A new method for the determination of tea polysaccharide (TPS) in green tea (Camellia sinensis) leaves has been developed. The method was based on the enhancement of resonance light scattering (RLS) of TPS in the presence of a cetylpyridinium chloride (CPC)-NaOH system. Under the optimum conditions, the RLS intensity of CPC was greatly enhanced by adding TPS. The maximum peak of the enhanced RLS spectra was located at 484.02 nm. The enhanced RLS intensity was proportional to the concentration of TPS in the range of 2.0-20 μg/ml. The new method and the phenol-sulfuric acid method gave equivalent results when measuring standard compounds. The recoveries of the two methods were 96.39-103.7% (novel method) and 100.15-103.65% (phenol-sulfuric acid method), respectively. However, the two methods differed to some extent: the new method offered a limit of detection (LOD) of 0.047 μg/ml, whereas the phenol-sulfuric acid method gave a LOD of 1.54 μg/ml. Interference experiments demonstrated that the new method was highly selective and more suitable for the determination of TPS than the phenol-sulfuric acid method. Stability tests showed that the new method had good stability. Moreover, the proposed method offers the advantages of easy operation, rapidity and practicability, suggesting that it could be satisfactorily applied to the determination of TPS in green tea.
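
The calibration arithmetic implied above (a linear intensity-concentration relation plus a limit of detection) can be sketched as follows. The standards, signals, and blank standard deviation are invented for illustration, and the LOD is taken as 3 x (blank SD) / slope, a common convention not stated in the abstract.

```python
import numpy as np

# Sketch of the calibration behind an RLS-type assay: fit a linear
# calibration of scattering intensity vs concentration, then estimate
# LOD = 3 * (sd of blank) / slope. All numbers are invented.
conc = np.array([2.0, 5.0, 8.0, 12.0, 16.0, 20.0])    # standards, ug/mL
signal = np.array([41.0, 102.0, 160.0, 242.0, 318.0, 401.0])
blank_sd = 0.31                                        # sd of replicate blanks (assumed)

slope, intercept = np.polyfit(conc, signal, 1)         # least-squares line
lod = 3.0 * blank_sd / slope

# Invert the calibration for an unknown sample's signal.
unknown_signal = 150.0
unknown_conc = (unknown_signal - intercept) / slope
print(f"slope={slope:.2f}, LOD={lod:.3f} ug/mL, unknown={unknown_conc:.2f} ug/mL")
```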

  4. A composite method for mapping quantitative trait loci without interference of female achiasmatic and gender effects in silkworm, Bombyx mori.

    PubMed

    Li, C; Zuo, W; Tong, X; Hu, H; Qiao, L; Song, J; Xiong, G; Gao, R; Dai, F; Lu, C

    2015-08-01

    The silkworm, Bombyx mori, is an economically important insect that was domesticated more than 5000 years ago. Its major economic traits focused on by breeders are quantitative traits, and an accurate and efficient QTL mapping method is necessary to explore their genetic architecture. However, current widely used QTL mapping models are not well suited for silkworm because they ignore female achiasmate and gender effects. In this study, we propose a composite method combining rational population selection and special mapping methods to map QTL in silkworm. By determining QTL for cocoon shell weight (CSW), we demonstrated the effectiveness of this method. In the CSW mapping process, only 56 markers were used and five loci or chromosomes were detected, more than in previous studies. Additionally, loci on chromosomes 1 and 11 dominated and accounted for 35.10% and 15.03% of the phenotypic variance respectively. Unlike previous studies, epistasis was detected between loci on chromosomes 11 and 22. These mapping results demonstrate the power and convenience of this method for QTL mapping in silkworm, and this method may inspire the development of similar approaches for other species with special genetic characteristics. PMID:26059330

  5. Evaluation of a rapid, quantitative real-time PCR method for enumeration of pathogenic Candida cells in water

    USGS Publications Warehouse

    Brinkman, Nichole E.; Haugland, Richard A.; Wymer, Larry J.; Byappanahalli, Muruleedhara N.; Whitman, Richard L.; Vesper, Stephen J.

    2003-01-01

    Quantitative PCR (QPCR) technology, incorporating fluorigenic 5′ nuclease (TaqMan) chemistry, was utilized for the specific detection and quantification of six pathogenic species of Candida (C. albicans, C. tropicalis, C. krusei, C. parapsilosis, C. glabrata and C. lusitaniae) in water. Known numbers of target cells were added to distilled and tap water samples, filtered, and disrupted directly on the membranes for recovery of DNA for QPCR analysis. The assay's sensitivities were between one and three cells per filter. The accuracy of the cell estimates was between 50 and 200% of their true value (95% confidence level). In similar tests with surface water samples, the presence of PCR inhibitory compounds necessitated further purification and/or dilution of the DNA extracts, with resultant reductions in sensitivity but generally not in quantitative accuracy. Analyses of a series of freshwater samples collected from a recreational beach showed positive correlations between the QPCR results and colony counts of the corresponding target species. Positive correlations were also seen between the cell quantities of the target Candida species detected in these analyses and colony counts of Enterococcus organisms. With a combined sample processing and analysis time of less than 4 h, this method shows great promise as a tool for rapidly assessing potential exposures to waterborne pathogenic Candida species from drinking and recreational waters and may have applications in the detection of fecal pollution.

  6. 4D Seismic Monitoring at the Ketzin Pilot Site during five years of storage - Results and Quantitative Assessment

    NASA Astrophysics Data System (ADS)

    Lüth, Stefan; Ivanova, Alexandra; Ivandic, Monika; Götz, Julia

    2015-04-01

    The Ketzin pilot site for geological CO2 storage was operative between June 2008 and August 2013. In this period, 67 kt of CO2 were injected (Martens et al., this conference). Repeated 3D seismic monitoring surveys were performed before and during CO2 injection. A third repeat survey, providing data from the post-injection phase, is currently being prepared for the autumn of 2015. The large-scale 3D surface seismic measurements have been complemented by other geophysical and geochemical monitoring methods, among which are high-resolution seismic surface-downhole observations. These observations have concentrated on the reservoir area in the vicinity of the injection well and provide high-resolution images as well as data for petrophysical quantification of the CO2 distribution in the reservoir. The Ketzin pilot site is a saline aquifer site in an onshore environment, which poses specific challenges for reliable monitoring of the injected CO2. Although much effort was made to keep acquisition conditions as nearly identical as possible, a high degree of repeatability noise was observed, mainly due to varying weather conditions and to variations in the acquisition geometries for logistical reasons. Nevertheless, time-lapse processing succeeded in generating 3D time-lapse data sets which could be interpreted in terms of storage-related amplitude variations in the depth range of the reservoir. The time-lapse seismic data, pulsed-neutron-gamma logging results (saturation), and petrophysical core measurements were interpreted together in order to estimate the amount of injected carbon dioxide imaged by the seismic repeat data. For the first repeat survey, the mass estimate summed to 20.5 kt, approximately 7% less than what had been injected by then. For the second repeat survey, the mass estimate summed to approximately 10-15% less than what had been injected.
The deviations may be explained by several factors of uncertainty, and by partial dissolution of the injected CO2, thus reducing the amount of free gas, which can be detected by seismic time-lapse observations. These quantitative assessment studies have shown that conformity between injected and estimated CO2 quantities can only be achieved with some degree of uncertainty which needs to be quantified for a realistic assessment of conformity studies.

  7. Initial Results of an MDO Method Evaluation Study

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Kodiyalam, Srinivas

    1998-01-01

    The NASA Langley MDO method evaluation study seeks to arrive at a set of guidelines for using promising MDO methods by accumulating and analyzing computational data for such methods. The data are collected by conducting a series of reproducible experiments. In the first phase of the study, three MDO methods were implemented in the SIGHT framework and used to solve a set of ten relatively simple problems. In this paper, we comment on the general considerations for conducting method evaluation studies and report some initial results obtained to date. In particular, although the results are not conclusive because of the small initial test set, we note that MDO formulations can be analyzed in terms of equivalence to other formulations, optimality conditions, and sensitivity of solutions to various perturbations. Optimization algorithms are used to solve a particular MDO formulation. It is then appropriate to speak of local convergence rates and of global convergence properties of an optimization algorithm applied to a specific formulation. An analogous distinction exists in the field of partial differential equations. On the one hand, equations are analyzed in terms of regularity, well-posedness, and the existence and uniqueness of solutions. On the other, one considers numerous algorithms for solving differential equations. The area of MDO methods studies MDO formulations combined with optimization algorithms, although at times the distinction is blurred.

  8. Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)

    2013-01-01

    Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.

  9. Development and validation of a reversed-phase liquid chromatography method for the quantitative determination of carboxylic acids in industrial reaction mixtures.

    PubMed

    Destandau, E; Vial, J; Jardy, A; Hennion, M C; Bonnet, D; Lancelin, P

    2005-09-23

    Usually, analysis of low molecular-mass carboxylic acids is performed by anion-exchange or ion-exclusion chromatographic methods. Reversed-phase liquid chromatography (RPLC) was evaluated in this work as an alternative method for the analysis of low molecular-mass aliphatic mono- and di-carboxylic acids (formic, acetic, propionic, butyric, valeric, caproic, succinic, glutaric and adipic) in aqueous media. The separation of the nine organic acids was optimised in 21 min on a high-density C18 column with an elution gradient made up of a 10(-3) mol L(-1) HClO4 aqueous solution and acetonitrile. For the quantitation, external standard and standard addition methods were compared. Both methods gave similar results, so the most convenient method, external standard, was chosen for acid quantitation. The method was then validated and applied to the semi-quantitative analysis of formic and acetic acids and to the quantitative analysis of the other compounds in industrial reaction mixtures with concentrations ranging from 20 to 570 ppm. PMID:16130732

  10. mcrA-Targeted Real-Time Quantitative PCR Method To Examine Methanogen Communities

    PubMed Central

    Steinberg, Lisa M.; Regan, John M.

    2009-01-01

    Methanogens are of great importance in carbon cycling and alternative energy production, but quantitation with culture-based methods is time-consuming and biased against methanogen groups that are difficult to cultivate in a laboratory. For these reasons, methanogens are typically studied through culture-independent molecular techniques. We developed a SYBR green I quantitative PCR (qPCR) assay to quantify total numbers of methyl coenzyme M reductase α-subunit (mcrA) genes. TaqMan probes were also designed to target nine different phylogenetic groups of methanogens in qPCR assays. Total mcrA and mcrA levels of different methanogen phylogenetic groups were determined from six samples: four samples from anaerobic digesters used to treat either primarily cow or pig manure and two aliquots from an acidic peat sample stored at 4°C or 20°C. Only members of the Methanosaetaceae, Methanosarcina, Methanobacteriaceae, and Methanocorpusculaceae and Fen cluster were detected in the environmental samples. The three samples obtained from cow manure digesters were dominated by members of the genus Methanosarcina, whereas the sample from the pig manure digester contained detectable levels of only members of the Methanobacteriaceae. The acidic peat samples were dominated by both Methanosarcina spp. and members of the Fen cluster. In two of the manure digester samples only one methanogen group was detected, but in both of the acidic peat samples and two of the manure digester samples, multiple methanogen groups were detected. The TaqMan qPCR assays were successfully able to determine the environmental abundance of different phylogenetic groups of methanogens, including several groups with few or no cultivated members. PMID:19447957
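
Standard-curve qPCR quantification of the kind used in this assay can be illustrated with a short calculation. The Ct values and copy numbers below are hypothetical; the log-linear fit and the efficiency formula E = 10^(-1/slope) - 1 are standard qPCR conventions rather than details taken from the paper.

```python
import math

# Illustrative arithmetic for SYBR green qPCR standard-curve quantification
# (not code from the paper): fit Ct vs log10(copies) from standards of
# known concentration, then invert the curve for an unknown sample.
standards = [(1e3, 30.1), (1e4, 26.8), (1e5, 23.4), (1e6, 20.0)]  # (copies, Ct)

xs = [math.log10(c) for c, _ in standards]
ys = [ct for _, ct in standards]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
intercept = ybar - slope * xbar

efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 means perfect doubling
unknown_ct = 24.9
copies = 10 ** ((unknown_ct - intercept) / slope)
print(f"slope={slope:.2f}, efficiency={efficiency:.2%}, copies={copies:.3e}")
```

A slope near -3.32 corresponds to ~100% amplification efficiency; real assays are usually accepted in roughly the 90-110% range.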

  11. Evaluating Multiple Prevention Programs: Methods, Results, and Lessons Learned

    ERIC Educational Resources Information Center

    Adler-Baeder, Francesca; Kerpelman, Jennifer; Griffin, Melody M.; Schramm, David G.

    2010-01-01

    Extension faculty and agents/educators are increasingly collaborating with local and state agencies to provide and evaluate multiple, distinct programs, yet there is limited information about measuring outcomes and combining results across similar program types. This article explicates the methods and outcomes of a state-level evaluation of…

  12. Cloned plasmid DNA fragments as calibrators for controlling GMOs: different real-time duplex quantitative PCR methods.

    PubMed

    Taverniers, Isabel; Van Bockstaele, Erik; De Loose, Marc

    2004-03-01

    Analytical real-time PCR technology is a powerful tool for implementation of the GMO labeling regulations enforced in the EU. The quality of analytical measurement data obtained by quantitative real-time PCR depends on the correct use of calibrator and reference materials (RMs). For GMO methods of analysis, the choice of appropriate RMs is currently under debate. So far, genomic DNA solutions from certified reference materials (CRMs) are most often used as calibrators for GMO quantification by means of real-time PCR. However, due to some intrinsic features of these CRMs, errors may be expected in the estimations of DNA sequence quantities. In this paper, two new real-time PCR methods are presented for Roundup Ready soybean, in which two types of plasmid DNA fragments are used as calibrators. Single-target plasmids (STPs) diluted in a background of genomic DNA were used in the first method. Multiple-target plasmids (MTPs) containing both sequences in one molecule were used as calibrators for the second method. Both methods simultaneously detect a promoter 35S sequence as GMO-specific target and a lectin gene sequence as endogenous reference target in a duplex PCR. For the estimation of relative GMO percentages both "delta C(T)" and "standard curve" approaches are tested. Delta C(T) methods are based on direct comparison of measured C(T) values of both the GMO-specific target and the endogenous target. Standard curve methods measure absolute amounts of target copies or haploid genome equivalents. A duplex delta C(T) method with STP calibrators performed at least as well as a similar method with genomic DNA calibrators from commercial CRMs. Besides this, high quality results were obtained with a standard curve method using MTP calibrators. This paper demonstrates that plasmid DNA molecules containing either one or multiple target sequences form perfect alternative calibrators for GMO quantification and are especially suitable for duplex PCR reactions. PMID:14689155
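
The "delta C(T)" comparison described above can be sketched in a few lines. Assuming roughly 100% amplification efficiency (doubling per cycle), the relative GMO content follows from the difference between the sample's and the calibrator's delta-Ct values; all Ct values below are hypothetical.

```python
# Sketch of the comparative (delta-delta Ct) arithmetic described for GMO
# quantification. Assumes ~100% PCR efficiency (doubling per cycle); the
# GMO-specific target (p35S) is compared with the endogenous reference
# (lectin) in both sample and calibrator. Numbers are invented.
def gmo_percent(ct_gmo_sample, ct_ref_sample, ct_gmo_cal, ct_ref_cal, cal_percent):
    """Relative GMO content via the comparative delta-Ct method."""
    delta_sample = ct_gmo_sample - ct_ref_sample
    delta_cal = ct_gmo_cal - ct_ref_cal
    return cal_percent * 2.0 ** -(delta_sample - delta_cal)

# Hypothetical calibrator certified at 1% GMO:
est = gmo_percent(ct_gmo_sample=29.0, ct_ref_sample=22.0,
                  ct_gmo_cal=28.0, ct_ref_cal=22.0, cal_percent=1.0)
print(f"Estimated GMO content: {est:.2f}%")   # one extra cycle -> 0.50%
```

The standard-curve approaches mentioned in the abstract instead convert each Ct to absolute copy numbers before taking the GMO/reference ratio.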

  13. Histo-cytometry: a method for highly multiplex quantitative tissue imaging analysis applied to dendritic cell subset microanatomy in lymph nodes.

    PubMed

    Gerner, Michael Y; Kastenmuller, Wolfgang; Ifrim, Ina; Kabat, Juraj; Germain, Ronald N

    2012-08-24

    Flow cytometry allows highly quantitative analysis of complex dissociated populations at the cost of neglecting their tissue localization. In contrast, conventional microscopy methods provide spatial information, but visualization and quantification of cellular subsets defined by complex phenotypic marker combinations is challenging. Here, we describe an analytical microscopy method, "histo-cytometry," for visualizing and quantifying phenotypically complex cell populations directly in tissue sections. This technology is based on multiplexed antibody staining, tiled high-resolution confocal microscopy, voxel gating, volumetric cell rendering, and quantitative analysis. We have tested this technology on various innate and adaptive immune populations in murine lymph nodes (LNs) and were able to identify complex cellular subsets and phenotypes, achieving quantitatively similar results to flow cytometry, while also gathering cellular positional information. Here, we employ histo-cytometry to describe the spatial segregation of resident and migratory dendritic cell subsets into specialized microanatomical domains, suggesting an unexpected LN demarcation into discrete functional compartments. PMID:22863836

  14. Comparison of concentration methods for rapid detection of hookworm ova in wastewater matrices using quantitative PCR.

    PubMed

    Gyawali, P; Ahmed, W; Jagals, P; Sidhu, J P S; Toze, S

    2015-12-01

    Hookworm infection accounts for around 700 million infections worldwide, especially in developing nations, due in part to the increased use of wastewater for crop production. The effective recovery of hookworm ova from wastewater matrices is difficult due to their low concentrations and heterogeneous distribution. In this study, we compared the recovery rates of (i) four rapid hookworm ova concentration methods from municipal wastewater, and (ii) two concentration methods from sludge samples. Ancylostoma caninum ova were used as a surrogate for human hookworm (Ancylostoma duodenale and Necator americanus). Known concentrations of A. caninum ova were seeded into wastewater (treated and raw) and sludge samples collected from two wastewater treatment plants (WWTPs) in Brisbane and Perth, Australia. The A. caninum ova were concentrated from treated and raw wastewater samples using centrifugation (Method A), hollow fiber ultrafiltration (HFUF) (Method B), filtration (Method C) and flotation (Method D) methods. For sludge samples, flotation (Method E) and direct DNA extraction (Method F) methods were used. Among the four methods tested, the filtration method (Method C) consistently recovered higher concentrations of A. caninum ova from treated wastewater (39-50%) and raw wastewater (7.1-12%) samples collected from both WWTPs. The remaining methods (Methods A, B and D) yielded variable recovery rates ranging from 0.2 to 40% for treated and raw wastewater samples. The recovery rates for sludge samples were poor (0.02-4.7%), although Method F (direct DNA extraction) provided a recovery rate 1-2 orders of magnitude higher than Method E (flotation). Based on our results, it can be concluded that the recovery rates of hookworm ova from wastewater matrices, especially sludge samples, can be poor and highly variable. Therefore, the choice of concentration method is vital for the sensitive detection of hookworm ova in wastewater matrices. PMID:26358269
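
The recovery rates reported above are simple seeded-versus-recovered percentages. A sketch with invented counts (the method labels follow the abstract; the numbers do not):

```python
# Percent-recovery arithmetic of the kind used to compare seeded
# concentration methods. All counts are invented for illustration.
def percent_recovery(recovered, seeded):
    return 100.0 * recovered / seeded

seeded_ova = 1000
recovered_by_method = {"A (centrifugation)": 150, "B (HFUF)": 220,
                       "C (filtration)": 450, "D (flotation)": 90}
for method, n in recovered_by_method.items():
    print(f"Method {method}: {percent_recovery(n, seeded_ova):.1f}% recovery")
```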

  15. Recommended Methods for Brain Processing and Quantitative Analysis in Rodent Developmental Neurotoxicity Studies.

    PubMed

    Garman, Robert H; Li, Abby A; Kaufmann, Wolfgang; Auer, Roland N; Bolon, Brad

    2016-01-01

    Neuropathology methods in rodent developmental neurotoxicity (DNT) studies have evolved with experience and changing regulatory guidance. This article emphasizes principles and methods to promote more standardized DNT neuropathology evaluation, particularly procurement of highly homologous brain sections and collection of the most reproducible morphometric measurements. To minimize bias, brains from all animals at all dose levels should be processed from brain weighing through paraffin embedding at one time using a counterbalanced design. Morphometric measurements should be anchored by distinct neuroanatomic landmarks that can be identified reliably on the faced block or in unstained sections and which address the region-specific circuitry of the measured area. Common test article-related qualitative changes in the developing brain include abnormal cell numbers (yielding altered regional size), displaced cells (ectopia and heterotopia), and/or aberrant differentiation (indicated by defective myelination or synaptogenesis), but rarely glial or inflammatory reactions. Inclusion of digital images in the DNT pathology raw data provides confidence that the quantitative analysis was done on anatomically matched (i.e., highly homologous) sections. Interpreting DNT neuropathology data and their presumptive correlation with neurobehavioral data requires an integrative weight-of-evidence approach including consideration of maternal toxicity, body weight, brain weight, and the pattern of findings across brain regions, doses, sexes, and ages. PMID:26296631

  16. Process analytical technology case study part I: feasibility studies for quantitative near-infrared method development.

    PubMed

    Cogdill, Robert P; Anderson, Carl A; Delgado-Lopez, Miriam; Molseed, David; Chisholm, Robert; Bolton, Raymond; Herkert, Thorsten; Afnán, Ali M; Drennen, James K

    2005-01-01

    This article is the first of a series of articles detailing the development of near-infrared (NIR) methods for solid-dosage form analysis. Experiments were conducted at the Duquesne University Center for Pharmaceutical Technology to qualify the capabilities of instrumentation and sample handling systems, evaluate the potential effect of one source of a process signature on calibration development, and compare the utility of reflection and transmission data collection methods. A database of 572 production-scale sample spectra was used to evaluate the interbatch spectral variability of samples produced under routine manufacturing conditions. A second database of 540 spectra from samples produced under various compression conditions was analyzed to determine the feasibility of pooling spectral data acquired from samples produced at diverse scales. Instrument qualification tests were performed, and appropriate limits for instrument performance were established. To evaluate the repeatability of the sample positioning system, multiple measurements of a single tablet were collected. With the application of appropriate spectral preprocessing techniques, sample repositioning error was found to be insignificant with respect to NIR analyses of product quality attributes. Sample shielding was demonstrated to be unnecessary for transmission analyses. A process signature was identified in the reflection data. Additional tests demonstrated that the process signature was largely orthogonal to spectral variation because of hardness. Principal component analysis of the compression sample set data demonstrated the potential for quantitative model development. For the data sets studied, reflection analysis was demonstrated to be more robust than transmission analysis. PMID:16353986

  17. Multivariate least-squares methods applied to the quantitative spectral analysis of multicomponent samples

    SciTech Connect

    Haaland, D.M.; Easterling, R.G.; Vopicka, D.A.

    1985-01-01

    In an extension of earlier work, weighted multivariate least-squares methods of quantitative FT-IR analysis have been developed. A linear least-squares approximation to nonlinearities in the Beer-Lambert law is made by allowing the reference spectra to be a set of known mixtures. The incorporation of nonzero intercepts in the relation between absorbance and concentration further improves the approximation of nonlinearities while simultaneously accounting for nonzero spectral baselines. Pathlength variations are also accommodated in the analysis, and under certain conditions, unknown sample pathlengths can be determined. All spectral data are used to improve the precision and accuracy of the estimated concentrations. During the calibration phase of the analysis, pure component spectra are estimated from the standard mixture spectra. These can be compared with the measured pure component spectra to determine which vibrations experience nonlinear behavior. In the predictive phase of the analysis, the calculated spectra are used in our previous least-squares analysis to estimate sample component concentrations. These methods were applied to the analysis of the IR spectra of binary mixtures of esters. Even with severely overlapping spectral bands and nonlinearities in the Beer-Lambert law, the average relative error in the estimated concentrations was <1%.
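
The role of the nonzero intercept can be sketched for a single component: fitting absorbance against concentration with an intercept lets a constant baseline offset be absorbed instead of biasing the slope. The concentrations and absorbances below are invented for illustration, not taken from the study:

```python
def fit_line(x, y):
    # Ordinary least squares with a nonzero intercept: the intercept
    # absorbs a constant baseline offset in the absorbance readings.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

conc = [0.0, 1.0, 2.0, 3.0]          # standard concentrations (arbitrary units)
absorb = [0.05, 0.25, 0.45, 0.65]    # absorbances with a 0.05 baseline offset
k, b0 = fit_line(conc, absorb)
unknown = (0.35 - b0) / k            # invert the calibration for an unknown
```

Forcing the line through the origin here would misestimate the slope; the fitted intercept recovers the 0.05 baseline and the unknown's concentration correctly.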

  18. Validation of a simple and inexpensive method for the quantitation of infarct in the rat brain.

    PubMed

    Schilichting, C L R; Lima, K C M; Cestari, L A; Sekiyama, J Y; Silva, F M; Milani, H

    2004-04-01

    A gravimetric method was evaluated as a simple, sensitive, reproducible, low-cost alternative to quantify the extent of brain infarct after occlusion of the medial cerebral artery in rats. In ether-anesthetized rats, the left medial cerebral artery was occluded for 1, 1.5 or 2 h by inserting a 4-0 nylon monofilament suture into the internal carotid artery. Twenty-four hours later, the brains were processed for histochemical triphenyltetrazolium chloride (TTC) staining and quantitation of the ischemic infarct. In each TTC-stained brain section, the ischemic tissue was dissected with a scalpel and fixed in 10% formalin at 0°C until its total mass could be estimated. The mass (mg) of the ischemic tissue was measured on an analytical balance and compared to its volume (mm³), estimated either by plethysmometry using platinum electrodes or by computer-assisted image analysis. Infarct size as measured by the weighing method (mg), and reported as a percent (%) of the affected (left) hemisphere, correlated closely with volume (mm³, also reported as %) estimated by computerized image analysis (r = 0.88; P < 0.001; N = 10) or by plethysmometry (r = 0.97-0.98; P < 0.0001; N = 41). This degree of correlation was maintained between different experimenters. The method was also sensitive for detecting the effect of different ischemia durations on infarct size (P < 0.005; N = 23), and the effect of drug treatments in reducing the extent of brain damage (P < 0.005; N = 24). The data suggest that, in addition to being simple and low cost, the weighing method is a reliable alternative for quantifying brain infarct in animal models of stroke. PMID:15064814
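
The r values quoted are ordinary Pearson correlation coefficients between the two measures of infarct size; a minimal sketch with made-up measurements (not the study's data):

```python
def pearson_r(x, y):
    # Pearson product-moment correlation between two measurement series
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# hypothetical infarct sizes: % of hemisphere by weighing vs. image analysis
mass_pct = [5.0, 12.0, 18.0, 25.0, 33.0]
image_pct = [6.0, 11.0, 20.0, 24.0, 35.0]
r = pearson_r(mass_pct, image_pct)
```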

  19. Comparison of the scanning linear estimator (SLE) and ROI methods for quantitative SPECT imaging

    NASA Astrophysics Data System (ADS)

    Könik, Arda; Kupinski, Meredith; Hendrik Pretorius, P.; King, Michael A.; Barrett, Harrison H.

    2015-08-01

    In quantitative emission tomography, tumor activity is typically estimated from calculations on a region of interest (ROI) identified in the reconstructed slices. In these calculations, unpredictable bias arising from the null functions of the imaging system affects ROI estimates. The magnitude of this bias depends upon the tumor size and location. In prior work it has been shown that the scanning linear estimator (SLE), which operates on the raw projection data, is an unbiased estimator of activity when the size and location of the tumor are known. In this work, we performed analytic simulation of SPECT imaging with a parallel-hole medium-energy collimator. Distance-dependent system spatial resolution and non-uniform attenuation were included in the imaging simulation. We compared the task of activity estimation by the ROI and SLE methods for a range of tumor sizes (diameter: 1-3 cm) and activities (contrast ratio: 1-10) added to uniform and non-uniform liver backgrounds. Using the correct value for the tumor shape and location is an idealized approximation to how task estimation would occur clinically. Thus we determined how perturbing this idealized prior knowledge impacted the performance of both techniques. To implement the SLE for the non-uniform background, we used a novel iterative algorithm for pre-whitening stationary noise within a compact region. Estimation task performance was compared using the ensemble mean-squared error (EMSE) as the criterion. The SLE method performed substantially better than the ROI method (i.e. EMSE(SLE) was 23-174 times lower) when the background was uniform and the tumor location and size were known accurately. The variance of the SLE increased when a non-uniform liver texture was introduced, but the EMSE(SLE) continued to be 5-20 times lower than for the ROI method. In summary, SLE outperformed ROI under almost all conditions that we tested.
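
EMSE, the figure of merit used for the comparison, averages squared estimation error over an ensemble of noisy trials, so it penalizes both bias and variance. A toy sketch with invented bias and noise levels, only to mimic how a biased ROI-like estimator loses to an unbiased SLE-like one:

```python
import random

def emse(estimates, truth):
    # ensemble mean-squared error: averages squared error over trials,
    # so it captures bias^2 + variance of an estimator
    return sum((e - truth) ** 2 for e in estimates) / len(estimates)

random.seed(1)
truth = 10.0
trials = 2000
unbiased = [truth + random.gauss(0, 0.5) for _ in range(trials)]      # SLE-like
biased = [0.8 * truth + random.gauss(0, 0.5) for _ in range(trials)]  # ROI-like
ratio = emse(biased, truth) / emse(unbiased, truth)
```

Even with identical noise, the 20% bias dominates the EMSE of the second estimator, which is the mechanism behind the large EMSE ratios reported above.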

  20. Development of a quantitative method for the characterization of hole quality during laser trepan drilling of high-temperature alloy

    NASA Astrophysics Data System (ADS)

    Zhang, Hongyu; Zhou, Ming; Wang, Yunlong; Zhang, Xiangchao; Yan, Yu; Wang, Rong

    2016-02-01

    Short-pulsed lasers are of significant industrial relevance in laser drilling, with an acceptable compromise between accuracy and efficiency. However, an intensive research with regard to qualitative and quantitative characterization of the hole quality has rarely been reported. In the present study, a series of through holes were fabricated on a high-temperature alloy workpiece with a thickness of 3 mm using a LASERTEC 80 PowerDrill manufacturing system, which incorporated a Nd:YAG millisecond laser into a five-axis positioning platform. The quality of the holes manufactured under different laser powers (80-140 W) and beam expanding ratios (1-6) was characterized by a scanning electron microscope associated with an energy-dispersive X-ray analysis, focusing mainly on the formation of micro-cracks and recast layer. Additionally, the conicity and circularity of the holes were quantitatively evaluated by the apparent radius, root-mean-square deviation, and maximum deviation, which were calculated based on the extraction of hole edge through programming with MATLAB. The results showed that melted and spattered material was present at the entrance and exit ends of the holes, and micro-cracks and a recast layer (average thickness 15-30 µm) were detected along the side wall of the holes. The elemental composition of the melted and spattered material and the recast layer was similar, with an obvious increase in the contents of O, Nb, and Cr and a great reduction in the contents of Fe and Ni in comparison with the bulk material. Furthermore, the conicity and circularity evaluation of the holes indicated that a laser power of 100 W and a beam expanding ratio of 4 or 5 represented the typical optimal drilling parameters in this specific experimental situation. It is anticipated that the quantitative method developed in the present study can be applied for the evaluation of hole quality in laser drilling and other drilling conditions.
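
The circularity measures described (apparent radius, root-mean-square deviation, maximum deviation) can be computed directly from extracted edge coordinates; a sketch in Python rather than the authors' MATLAB, on a synthetic hole edge with a single bulge:

```python
import math

def circularity_metrics(points):
    # apparent radius = mean distance from the centroid; RMS and maximum
    # deviation from that radius quantify departure from a perfect circle
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    r_app = sum(radii) / n
    rms = (sum((r - r_app) ** 2 for r in radii) / n) ** 0.5
    dev_max = max(abs(r - r_app) for r in radii)
    return r_app, rms, dev_max

# synthetic hole edge: unit circle with a single +0.1 bulge at 0 degrees
edge = [(math.cos(math.radians(k)), math.sin(math.radians(k))) for k in range(360)]
edge[0] = (1.1, 0.0)
r_app, rms, dev_max = circularity_metrics(edge)
```

A single localized defect barely moves the apparent radius or RMS deviation but shows up clearly in the maximum deviation, which is why the paper reports all three.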

  1. Novel Method for Automated Analysis of Retinal Images: Results in Subjects with Hypertensive Retinopathy and CADASIL

    PubMed Central

    Cavallari, Michele; Stamile, Claudio; Umeton, Renato; Calimeri, Francesco; Orzi, Francesco

    2015-01-01

    Morphological analysis of the retinal vessels by fundoscopy provides noninvasive means for detecting and staging systemic microvascular damage. However, full exploitation of fundoscopy in clinical settings is limited by paucity of quantitative, objective information obtainable through the observer-driven evaluations currently employed in routine practice. Here, we report on the development of a semiautomated, computer-based method to assess retinal vessel morphology. The method allows simultaneous and operator-independent quantitative assessment of arteriole-to-venule ratio, tortuosity index, and mean fractal dimension. The method was implemented in two conditions known for being associated with retinal vessel changes: hypertensive retinopathy and Cerebral Autosomal Dominant Arteriopathy with Subcortical Infarcts and Leukoencephalopathy (CADASIL). The results showed that our approach is effective in detecting and quantifying the retinal vessel abnormalities. Arteriole-to-venule ratio, tortuosity index, and mean fractal dimension were altered in the subjects with hypertensive retinopathy or CADASIL with respect to age- and gender-matched controls. The interrater reliability was excellent for all three indices (intraclass correlation coefficient ≥ 85%). The method represents a simple and highly reproducible means for discriminating pathological conditions characterized by morphological changes of retinal vessels. The advantages of our method include simultaneous and operator-independent assessment of different parameters and improved reliability of the measurements. PMID:26167496
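
Of the three indices, the tortuosity index is the simplest to state: the arc length of the vessel path divided by the chord between its endpoints (assuming the common definition; the paper may use a variant). A sketch on synthetic paths:

```python
import math

def tortuosity_index(points):
    # arc length of the sampled vessel path divided by its chord length;
    # 1.0 for a straight vessel, larger for a winding one
    arc = sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))
    return arc / math.dist(points[0], points[-1])

straight = [(float(x), 0.0) for x in range(11)]       # straight vessel segment
wavy = [(0.1 * k, math.sin(0.1 * k)) for k in range(101)]  # winding segment
ti_straight = tortuosity_index(straight)
ti_wavy = tortuosity_index(wavy)
```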

  2. Novel Method for Automated Analysis of Retinal Images: Results in Subjects with Hypertensive Retinopathy and CADASIL.

    PubMed

    Cavallari, Michele; Stamile, Claudio; Umeton, Renato; Calimeri, Francesco; Orzi, Francesco

    2015-01-01

    Morphological analysis of the retinal vessels by fundoscopy provides noninvasive means for detecting and staging systemic microvascular damage. However, full exploitation of fundoscopy in clinical settings is limited by paucity of quantitative, objective information obtainable through the observer-driven evaluations currently employed in routine practice. Here, we report on the development of a semiautomated, computer-based method to assess retinal vessel morphology. The method allows simultaneous and operator-independent quantitative assessment of arteriole-to-venule ratio, tortuosity index, and mean fractal dimension. The method was implemented in two conditions known for being associated with retinal vessel changes: hypertensive retinopathy and Cerebral Autosomal Dominant Arteriopathy with Subcortical Infarcts and Leukoencephalopathy (CADASIL). The results showed that our approach is effective in detecting and quantifying the retinal vessel abnormalities. Arteriole-to-venule ratio, tortuosity index, and mean fractal dimension were altered in the subjects with hypertensive retinopathy or CADASIL with respect to age- and gender-matched controls. The interrater reliability was excellent for all three indices (intraclass correlation coefficient ≥ 85%). The method represents a simple and highly reproducible means for discriminating pathological conditions characterized by morphological changes of retinal vessels. The advantages of our method include simultaneous and operator-independent assessment of different parameters and improved reliability of the measurements. PMID:26167496

  3. Laboratory Evaluations of the Enterococcus qPCR Method for Recreational Water Quality Testing: Method Performance and Sources of Uncertainty in Quantitative Measurements

    EPA Science Inventory

    The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...

  4. A quantitative method to detect explosives and other selected semivolatiles in soil samples by Fourier transform infrared spectroscopy

    SciTech Connect

    Clapper-Gowdy, M.; Demirgian, J.; Lang, K.; Robaittaille, G.

    1992-09-01

    The current methods for hazardous waste site characterization are time consuming, cumbersome, and expensive. Typically, characterization requires a preliminary site assessment and subsequent sampling of potentially contaminated soils and waters. The samples are sent to laboratories for analysis using EPA-certified methods. It is often necessary to repeat the entire sampling-analysis cycle to characterize a site completely and accurately. For these reasons, new methods of site assessment and characterization are continually being researched. This paper describes a Fourier transform infrared (FTIR) spectroscopy method that rapidly screens soil samples from potentially hazardous waste sites. Analysis of a soil sample by FTIR takes approximately 10 minutes. The method has been developed to identify and quantify explosives in the field and is directly applicable to selected volatile organics, semivolatile organics, and pesticides. The soil samples are desorbed in a CDS 122 thermal desorption unit under vacuum into a variable pathlength, long-path cell heated to 180°C. The spectral data, 128 co-added scans at 1 cm⁻¹ resolution, are collected and stored using a Nicolet 60SX FTIR spectrometer. Classical least squares (CLS) analysis has been used to obtain quantitative results.

  5. A quantitative method to detect explosives and other selected semivolatiles in soil samples by Fourier transform infrared spectroscopy

    SciTech Connect

    Clapper-Gowdy, M.; Demirgian, J.; Lang, K.; Robaittaille, G.

    1992-01-01

    The current methods for hazardous waste site characterization are time consuming, cumbersome, and expensive. Typically, characterization requires a preliminary site assessment and subsequent sampling of potentially contaminated soils and waters. The samples are sent to laboratories for analysis using EPA-certified methods. It is often necessary to repeat the entire sampling-analysis cycle to characterize a site completely and accurately. For these reasons, new methods of site assessment and characterization are continually being researched. This paper describes a Fourier transform infrared (FTIR) spectroscopy method that rapidly screens soil samples from potentially hazardous waste sites. Analysis of a soil sample by FTIR takes approximately 10 minutes. The method has been developed to identify and quantify explosives in the field and is directly applicable to selected volatile organics, semivolatile organics, and pesticides. The soil samples are desorbed in a CDS 122 thermal desorption unit under vacuum into a variable pathlength, long-path cell heated to 180°C. The spectral data, 128 co-added scans at 1 cm⁻¹ resolution, are collected and stored using a Nicolet 60SX FTIR spectrometer. Classical least squares (CLS) analysis has been used to obtain quantitative results.

  6. Multiple Frequency Contrast Source Inversion Method for Vertical Electromagnetic Profiling: 2D Simulation Results and Analyses

    NASA Astrophysics Data System (ADS)

    Li, Jinghe; Song, Linping; Liu, Qing Huo

    2016-02-01

    A simultaneous multiple frequency contrast source inversion (CSI) method is applied to reconstructing hydrocarbon reservoir targets in a complex multilayered medium in two dimensions. It simulates the effects of a salt dome sedimentary formation in the context of reservoir monitoring. In this method, the stabilized biconjugate-gradient fast Fourier transform (BCGS-FFT) algorithm is applied as a fast solver for the 2D volume integral equation for the forward computation. The inversion technique with CSI combines the efficient FFT algorithm to speed up the matrix-vector multiplication and the stable convergence of the simultaneous multiple frequency CSI in the iteration process. As a result, this method is capable of effective quantitative conductivity image reconstruction for large-scale electromagnetic oil exploration problems, including the vertical electromagnetic profiling (VEP) survey investigated here. A number of numerical examples are presented to validate the effectiveness and capacity of the simultaneous multiple frequency CSI method for a limited array view in VEP.
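
The FFT speed-up mentioned above works because convolution-type (circulant) operators, like those arising from the volume integral equation, are diagonalized by the discrete Fourier transform. A toy sketch using a plain O(n²) DFT for clarity; a true FFT would make the product O(n log n):

```python
import cmath

def dft(x, inverse=False):
    # plain O(n^2) discrete Fourier transform, for clarity only
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k in range(n)) for j in range(n)]
    return [v / n for v in out] if inverse else out

def circulant_matvec_fft(c, x):
    # A circulant matrix (first column c) is diagonalized by the DFT,
    # so A @ x = IDFT(DFT(c) * DFT(x))
    prod = [a * b for a, b in zip(dft(c), dft(x))]
    return [v.real for v in dft(prod, inverse=True)]

def circulant_matvec_direct(c, x):
    # direct O(n^2) multiplication, for comparison
    n = len(c)
    return [sum(c[(i - k) % n] * x[k] for k in range(n)) for i in range(n)]

c = [2.0, -1.0, 0.0, -1.0]
x = [1.0, 2.0, 3.0, 4.0]
fast = circulant_matvec_fft(c, x)
direct = circulant_matvec_direct(c, x)
```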

  7. Determination of Calcium in Cereal with Flame Atomic Absorption Spectroscopy: An Experiment for a Quantitative Methods of Analysis Course

    ERIC Educational Resources Information Center

    Bazzi, Ali; Kreuz, Bette; Fischer, Jeffrey

    2004-01-01

    An experiment for determination of calcium in cereal using a two-increment standard addition method in conjunction with flame atomic absorption spectroscopy (FAAS) is demonstrated. The experiment is intended to introduce students to the principles of atomic absorption spectroscopy, giving them hands-on experience using quantitative methods of…
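
The two-increment standard addition method fits absorbance against added analyte and extrapolates to the x-intercept, whose magnitude is the sample's concentration. A sketch with invented absorbances (a real experiment would also correct for dilution):

```python
def standard_addition(added, absorbance):
    # Fit absorbance vs. added concentration and extrapolate to the
    # x-intercept; its magnitude is the sample's concentration.
    n = len(added)
    mx, my = sum(added) / n, sum(absorbance) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(added, absorbance))
             / sum((x - mx) ** 2 for x in added))
    intercept = my - slope * mx
    return intercept / slope

# hypothetical two-increment data: sample alone, then two spikes (mg/L)
added = [0.0, 2.0, 4.0]
absorbance = [0.30, 0.60, 0.90]   # invented, perfectly linear response
c_sample = standard_addition(added, absorbance)
```

Standard addition is used precisely because the calibration is built in the sample's own matrix, so matrix effects cancel out of the extrapolation.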

  8. Monochloramine disinfection kinetics of Nitrosomonas europaea by propidium monoazide quantitative PCR and Live/Dead BacLight Methods

    EPA Science Inventory

    Monochloramine disinfection kinetics were determined for the pure culture ammonia-oxidizing bacterium Nitrosomonas europaea (ATCC 19718) by two culture independent methods: (1) LIVE/DEAD® BacLight™ (LD) and (2) propidium monoazide quantitative PCR (PMA-qPCR). Both methods were f...

  9. Understanding Variation in Treatment Effects in Education Impact Evaluations: An Overview of Quantitative Methods. NCEE 2014-4017

    ERIC Educational Resources Information Center

    Schochet, Peter Z.; Puma, Mike; Deke, John

    2014-01-01

    This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…

  10. Research and Evaluation in Education and Psychology: Integrating Diversity with Quantitative, Qualitative, and Mixed Methods. Second Edition

    ERIC Educational Resources Information Center

    Mertens, Donna M.

    2004-01-01

    In this new edition, the author explains quantitative, qualitative, and mixed methods, and incorporates the viewpoints of various research paradigms (postpositivist, constructivist, transformative, and pragmatic) into descriptions of these methods. Special emphasis is provided for conducting research in culturally complex communities. Each chapter…

  11. Determination of Calcium in Cereal with Flame Atomic Absorption Spectroscopy: An Experiment for a Quantitative Methods of Analysis Course

    ERIC Educational Resources Information Center

    Bazzi, Ali; Kreuz, Bette; Fischer, Jeffrey

    2004-01-01

    An experiment for determination of calcium in cereal using a two-increment standard addition method in conjunction with flame atomic absorption spectroscopy (FAAS) is demonstrated. The experiment is intended to introduce students to the principles of atomic absorption spectroscopy, giving them hands-on experience using quantitative methods of

  12. A Quantitative Study of a Software Tool that Supports a Part-Complete Solution Method on Learning Outcomes

    ERIC Educational Resources Information Center

    Garner, Stuart

    2009-01-01

    This paper reports on the findings from a quantitative research study into the use of a software tool that was built to support a part-complete solution method (PCSM) for the learning of computer programming. The use of part-complete solutions to programming problems is one of the methods that can be used to reduce the cognitive load that students…

  13. Monochloramine disinfection kinetics of Nitrosomonas europaea by propidium monoazide quantitative PCR and Live/Dead BacLight Methods

    EPA Science Inventory

    Monochloramine disinfection kinetics were determined for the pure culture ammonia-oxidizing bacterium Nitrosomonas europaea (ATCC 19718) by two culture independent methods: (1) LIVE/DEAD BacLight (LD) and (2) propidium monoazide quantitative PCR (PMA-qPCR). Both methods were f...

  14. Integrating Quantitative and Qualitative Evaluation Methods to Compare Two Teacher Inservice Training Programs.

    ERIC Educational Resources Information Center

    Lawrenz, Frances; McCreath, Heather

    1988-01-01

    Reports on the evaluation of two programs which followed the master teacher training model espoused by the National Science Foundation, but used different types of master teachers and activities. Different results were produced from the two evaluation methods; using both methods provided a clearer view of the strengths and weaknesses of the…

  15. Semi-quantitative determination of the modes of occurrence of elements in coal: Results from an International Round Robin Project

    SciTech Connect

    Willett, J.C.; Finkelman, R.B.; Mroczkowski, S.J.; Palmer, C.A.; Kolker, A.

    1999-07-01

    Quantifying the modes of occurrence of elements in coal is necessary for the development of models to predict an element's behavior during in-ground leaching, weathering, coal cleaning, and combustion. Anticipating the behavior of the trace elements is necessary for evaluating the environmental and human health impacts, technological impacts, and economic byproduct potential of coal use. To achieve the goal of quantifying element modes of occurrence, an international round robin project was initiated. Four bituminous coal samples (from the United States, England, Australia and Canada) were distributed to participating laboratories (9 labs from 5 countries) for analysis. Preliminary results indicate that there is good agreement among six laboratories for the chemical analyses. Using selective leaching, quantitative electron microprobe analyses, and semi-quantitative X-ray diffraction, the authors found that many elements have similar modes of occurrence in all four samples. For example, at least 75% of the Al, K, and Li and about 50% of Be, Sc, V, and Cr are leached by HF. Because HF dissolves silicates, the authors infer that these elements are in the clays. As, Hg, Cu, Zn, Cd, and Pb are leached primarily by HCl and HNO₃, indicating that they are associated with mono- (such as sphalerite and galena) and di-sulfides (pyrite). Leaching results indicate that small amounts of these metals may be associated with clays and organics. Iron behaves differently in each of the samples, likely due to different proportions of iron in sulfide, carbonate, and silicate phases. Results from the other laboratories (using selective leaching and density separations) appear to be consistent with these results.

  16. Linking Functional Connectivity and Structural Connectivity Quantitatively: A Comparison of Methods.

    PubMed

    Huang, Haiqing; Ding, Mingzhou

    2016-03-01

    Structural connectivity in the brain is the basis of functional connectivity. Quantitatively linking the two, however, remains a challenge. For a pair of regions of interest (ROIs), anatomical connections derived from diffusion-weighted imaging are often quantified by fractional anisotropy (FA) or edge weight, whereas functional connections, derived from resting-state functional magnetic resonance imaging, can be characterized by non-time-series measures such as zero-lag cross correlation and partial correlation, as well as by time-series measures such as coherence and Granger causality. In this study, we addressed the question of linking structural connectivity and functional connectivity quantitatively by considering two pairs of ROIs, one from the default mode network (DMN) and the other from the central executive network (CEN), using two different data sets. Selecting (1) posterior cingulate cortex and medial prefrontal cortex of the DMN as the first pair of ROIs and (2) left dorsal lateral prefrontal cortex and left inferior parietal lobule of the CEN as the second pair of ROIs, we show that (1) zero-lag cross correlation, partial correlation, and pairwise Granger causality were not significantly correlated with either mean FA or edge weight and (2) conditional Granger causality (CGC) was significantly correlated with edge weight but not with mean FA. These results suggest that (1) edge weight may be a more appropriate measure to quantify the strength of the anatomical connection between ROIs and (2) CGC, which statistically removes common input and the indirect influences between a given ROI pair, may be a more appropriate measure to quantify the strength of the functional interaction enabled by the fibers linking the two ROIs. PMID:26598788
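
The contrast drawn here between plain correlation and measures that remove common input can be illustrated with partial correlation on synthetic signals (not fMRI data): two ROI-like signals driven by a shared source correlate strongly, but the correlation largely vanishes once the source is regressed out.

```python
import random

def pearson(x, y):
    # zero-lag cross correlation between two signals
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def partial_corr(x, y, z):
    # correlation of x and y with the common influence of z removed
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / ((1 - rxz ** 2) * (1 - ryz ** 2)) ** 0.5

random.seed(0)
z = [random.gauss(0, 1) for _ in range(1000)]    # shared "common input"
x = [v + random.gauss(0, 0.5) for v in z]        # two ROI-like signals
y = [v + random.gauss(0, 0.5) for v in z]
r_plain = pearson(x, y)
r_partial = partial_corr(x, y, z)
```

Conditional Granger causality plays an analogous role for time series: it statistically removes common input and indirect routes before quantifying the directed interaction.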

  17. Quantitation of resistance training using the session rating of perceived exertion method.

    PubMed

    Sweet, Travis W; Foster, Carl; McGuigan, Michael R; Brice, Glenn

    2004-11-01

    The purpose of this study was to apply the session rating of perceived exertion (RPE) method, which is known to work with aerobic training, to resistance training. Ten men (26.1 +/- 10.2 years) and 10 women (22.2 +/- 1.8 years), habituated to both aerobic and resistance training, performed 3 × 30-minute aerobic training bouts on the cycle ergometer at intensities of 56%, 71%, and 83% VO2 peak and then rated the global intensity using the session RPE technique (e.g. 0-10) 30 minutes after the end of the session. They also performed 3 × 30-minute resistance exercise bouts with 2 sets of 6 exercises at 50% (15 repetitions), 70% (10 repetitions), and 90% (4 repetitions) of 1 repetition maximum (1RM). After each set the exercisers rated the intensity of that exercise using the RPE scale. Thirty minutes after the end of the bout they rated the intensity of the whole session and of only the lifting components of the session, using the session RPE method. The rated intensity of exercise increased with the %VO2 peak and the %1RM. There was a general correspondence between the relative intensity (%VO2 peak and %1RM) and the session RPE. Between different types of resistance exercise at the same relative intensity, the average RPE after each lift varied widely. The resistance training session RPE increased as the intensity increased despite a decrease in the total work performed (p < 0.05). Mean RPE and session RPE-lifting only also grew with increased intensity (p < 0.05). In many cases, the mean RPE, session RPE, and session RPE-lifting only measurements were different at given exercise intensities (p < 0.05). The session RPE appears to be a viable method for quantitating the intensity of resistance training, generally comparable to aerobic training. However, the session RPE may meaningfully underestimate the average intensity rated immediately after each set. PMID:15574104

  18. Effect of platform, reference material, and quantification model on enumeration of Enterococcus by quantitative PCR methods

    EPA Science Inventory

    Quantitative polymerase chain reaction (qPCR) is increasingly being used for the quantitative detection of fecal indicator bacteria in beach water. QPCR allows for same-day health warnings, and its application is being considered as an option for recreational water quality testi...

  19. Peer Effects and Peer Group Processes: Joining the Conversation on Quantitative and Qualitative Methods.

    ERIC Educational Resources Information Center

    Nash, Roy

    2002-01-01

    Discusses quantitative and qualitative approaches in research on peer effects on student attainment, using two texts to argue that the definition of "effect" cannot be restricted to "statistical effect," and that institutional properties are not the sum of individual properties. Asserts that quantitative investigators have statistical effects that…

  20. Spectral simulation methods for enhancing qualitative and quantitative analyses based on infrared spectroscopy and quantitative calibration methods for passive infrared remote sensing of volatile organic compounds

    NASA Astrophysics Data System (ADS)

    Sulub, Yusuf Ismail

    Infrared spectroscopy (IR) has over the years found a myriad of applications including passive environmental remote sensing of toxic pollutants and the development of a blood glucose sensor. In this dissertation, capabilities of both these applications are further enhanced with data analysis strategies employing digital signal processing and novel simulation approaches. Both quantitative and qualitative determinations of volatile organic compounds are investigated in the passive IR remote sensing research described in this dissertation. In the quantitative work, partial least-squares (PLS) regression analysis is used to generate multivariate calibration models for passive Fourier transform IR remote sensing measurements of open-air generated vapors of ethanol in the presence of methanol as an interfering species. A step-wise co-addition scheme coupled with a digital filtering approach is used to attenuate the effects of variation in optical path length or plume width. For the qualitative study, an IR imaging line scanner is used to acquire remote sensing data in both spatial and spectral domains. This technology is capable of not only identifying but also specifying the location of the sample under investigation. Successful implementation of this methodology is hampered by the huge costs incurred to conduct these experiments and the impracticality of acquiring large amounts of representative training data. To address this problem, a novel simulation approach is developed that generates training data based on synthetic analyte-active and measured analyte-inactive data. Subsequently, automated pattern classifiers are generated using piecewise linear discriminant analysis to predict the presence of the analyte signature in measured imaging data acquired in remote sensing applications. Near-infrared glucose determination based on the 5000-4000 cm⁻¹ region is the focus of the research in the latter part of this dissertation.
A six-component aqueous matrix of glucose in the presence of five other interferent species, all spanning physiological levels, is analyzed quantitatively. Multivariate PLS regression analysis in conjunction with samples designated into a calibration set is used to formulate models for predicting glucose concentrations. Variations in the instrumental response caused by drift and environmental factors are observed to degrade the performance of these models. As a remedy, a model updating approach based on spectral simulation is developed that is highly successful in eliminating the adverse effects of non-chemical variations.
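
The calibration step described here is PLS regression. A minimal one-latent-variable PLS (NIPALS-style) sketch on synthetic, mean-centered "spectra" built from a single latent direction, so one component suffices (real glucose calibrations use many wavelengths and several components):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def pls1_fit(X, y):
    # One-latent-variable PLS: weight vector w from the covariance of the
    # spectra with y, scores t = Xw, and inner regression coefficient b.
    w = [dot([row[j] for row in X], y) for j in range(len(X[0]))]
    norm = dot(w, w) ** 0.5
    w = [v / norm for v in w]
    t = [dot(row, w) for row in X]
    b = dot(y, t) / dot(t, t)
    return w, b

def pls1_predict(X, w, b):
    return [b * dot(row, w) for row in X]

# synthetic mean-centered "spectra": one latent direction p, scores t_true
p = [0.5, -0.2, 0.8, 0.1]
t_true = [-2.0, -1.0, 0.0, 1.0, 2.0]
X = [[ti * pj for pj in p] for ti in t_true]
y = [1.5 * ti for ti in t_true]   # glucose-like response along that direction

w, b = pls1_fit(X, y)
y_hat = pls1_predict(X, w, b)
```

Because the response lies entirely along the single latent direction, one PLS component reproduces the reference values exactly here; with noise and interferents, more components and cross-validation are needed.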

  1. Test Results for Entry Guidance Methods for Space Vehicles

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Jones, Robert E.

    2004-01-01

    There are a number of approaches to advanced guidance and control that have the potential for achieving the goals of significantly increasing reusable launch vehicle (or any space vehicle that enters an atmosphere) safety and reliability, and reducing the cost. This paper examines some approaches to entry guidance. An effort called Integration and Testing of Advanced Guidance and Control Technologies has recently completed a rigorous testing phase where these algorithms faced high-fidelity vehicle models and were required to perform a variety of representative tests. The algorithm developers spent substantial effort improving the algorithm performance in the testing. This paper lists the test cases used to demonstrate that the desired results are achieved, shows an automated test scoring method that greatly reduces the evaluation effort required, and displays results of the tests. Results show a significant improvement over previous guidance approaches. The two best-scoring algorithm approaches show roughly equivalent results and are ready to be applied to future vehicle concepts.

  2. Test Results for Entry Guidance Methods for Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Jones, Robert E.

    2003-01-01

    There are a number of approaches to advanced guidance and control (AG&C) that have the potential for achieving the goals of significantly increasing reusable launch vehicle (RLV) safety and reliability, and reducing the cost. This paper examines some approaches to entry guidance. An effort called Integration and Testing of Advanced Guidance and Control Technologies (ITAGCT) has recently completed a rigorous testing phase where these algorithms faced high-fidelity vehicle models and were required to perform a variety of representative tests. The algorithm developers spent substantial effort improving the algorithm performance in the testing. This paper lists the test cases used to demonstrate that the desired results are achieved, shows an automated test scoring method that greatly reduces the evaluation effort required, and displays results of the tests. Results show a significant improvement over previous guidance approaches. The two best-scoring algorithm approaches show roughly equivalent results and are ready to be applied to future reusable vehicle concepts.

  3. A novel imaging method for quantitative Golgi localization reveals differential intra-Golgi trafficking of secretory cargoes.

    PubMed

    Tie, Hieng Chiong; Mahajan, Divyanshu; Chen, Bing; Cheng, Li; VanDongen, Antonius M J; Lu, Lei

    2016-03-01

    Cellular functions of the Golgi are determined by the unique distribution of its resident proteins. Currently, electron microscopy is required for the localization of a Golgi protein at the sub-Golgi level. We developed a quantitative sub-Golgi localization method based on centers of fluorescence masses of nocodazole-induced Golgi ministacks under conventional optical microscopy. Our method is rapid, convenient, and quantitative, and it yields a practical localization resolution of ∼30 nm. The method was validated against previous electron microscopy data. We quantitatively studied the intra-Golgi trafficking of synchronized secretory membrane cargoes and directly demonstrated the cisternal progression of cargoes from the cis- to the trans-Golgi. Our data suggest that the constitutive efflux of secretory cargoes could be restricted at the Golgi stack, and the entry of the trans-Golgi network into the secretory pathway could be signal-dependent. PMID:26764092
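
The center-of-fluorescence-mass idea can be sketched directly: a protein's position within a ministack is summarized by the intensity-weighted centroid of its fluorescence image. The image below is a synthetic Gaussian spot, assuming background has already been subtracted.

```python
import numpy as np

def fluorescence_center_of_mass(image):
    """Intensity-weighted centroid (row, col) of a 2-D fluorescence image."""
    total = image.sum()
    rows, cols = np.indices(image.shape)
    return (rows * image).sum() / total, (cols * image).sum() / total

# A point-like marker at (10, 20) blurred by a Gaussian point-spread function
yy, xx = np.indices((64, 64))
image = np.exp(-((yy - 10) ** 2 + (xx - 20) ** 2) / (2 * 3.0 ** 2))
cy, cx = fluorescence_center_of_mass(image)
```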

  4. A novel imaging method for quantitative Golgi localization reveals differential intra-Golgi trafficking of secretory cargoes

    PubMed Central

    Tie, Hieng Chiong; Mahajan, Divyanshu; Chen, Bing; Cheng, Li; VanDongen, Antonius M. J.; Lu, Lei

    2016-01-01

    Cellular functions of the Golgi are determined by the unique distribution of its resident proteins. Currently, electron microscopy is required for the localization of a Golgi protein at the sub-Golgi level. We developed a quantitative sub-Golgi localization method based on centers of fluorescence masses of nocodazole-induced Golgi ministacks under conventional optical microscopy. Our method is rapid, convenient, and quantitative, and it yields a practical localization resolution of ∼30 nm. The method was validated against previous electron microscopy data. We quantitatively studied the intra-Golgi trafficking of synchronized secretory membrane cargoes and directly demonstrated the cisternal progression of cargoes from the cis- to the trans-Golgi. Our data suggest that the constitutive efflux of secretory cargoes could be restricted at the Golgi stack, and the entry of the trans-Golgi network into the secretory pathway could be signal-dependent. PMID:26764092

  5. Comparison of interlaboratory results for blood lead with results from a definitive method.

    PubMed

    Boone, J; Hearn, T; Lewis, S

    1979-03-01

    Results reported by 113 participants in the Blood Lead Proficiency Testing Program conducted by the Center for Disease Control were compared with those obtained by the National Bureau of Standards (NBS) with a definitive method (isotope dilution mass spectrometry) for blood lead analysis. Data were compiled from the results obtained for 12 whole-blood samples containing 1.5 g of disodium EDTA per liter. Twelve separate blood samples were obtained from cattle that had been given lead nitrate orally. Lead concentrations in the samples ranged from 0.628 to 4.93 µmol/L (130-1020 µg/L) as determined by NBS. The methods used by the laboratories were classified into six basic groups: anodic stripping voltammetry, and atomic absorption spectroscopy with either extraction, carbon rod, graphite furnace, tantalum strip, or Delves cup. For each group, a linear regression analysis of the laboratory values on the NBS values was performed. In comparison with the definitive method, most field methods for blood lead tended to overestimate the lead concentration when the actual concentration was less than 1.96 µmol/L (400 µg/L) and to underestimate it when the actual concentration was greater than 2.45 µmol/L (500 µg/L). PMID:262177
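
The per-method comparison above amounts to an ordinary least-squares regression of laboratory results on the definitive NBS values. The values below are illustrative, not the study's data; they reproduce the reported pattern of overestimation at low concentrations and underestimation at high ones.

```python
import numpy as np

nbs = np.array([0.628, 1.10, 1.55, 1.96, 2.45, 3.20, 4.93])   # µmol/L, definitive method
lab = np.array([0.75, 1.25, 1.70, 1.98, 2.30, 2.95, 4.60])    # µmol/L, field method

slope, intercept = np.polyfit(nbs, lab, 1)
# A slope below 1 with a positive intercept means the field method reads high
# at low lead levels and low at high lead levels, as reported above.
```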

  6. A non-radioisotopic quantitative competitive polymerase chain reaction method: application in measurement of human herpesvirus 7 load.

    PubMed

    Kidd, I M; Clark, D A; Emery, V C

    2000-06-01

    Quantitative-competitive polymerase chain reaction (QCPCR) is a well-optimised and objective methodology for the determination of viral load in clinical specimens. A major advantage of QCPCR is the ability to control for the differential modulation of the PCR process in the presence of potentially inhibitory material. QCPCR protocols were developed previously for CMV, HHV-6, HHV-7 and HHV-8 and relied upon radioactively labelled primers, followed by autoradiography of the separated and digested PCR products to quantify viral load. Whilst this approach offers high accuracy and dynamic range, non-radioactive approaches would be attractive. Here, an alternative detection system is reported, based on simple ethidium bromide staining and computer analysis of the separated reaction products, which enables its adoption in the analysis of a large number of samples. In calibration experiments using cloned HHV-7 DNA, the ethidium bromide detection method showed an improved correlation with known copy number over that obtained with the isotopic method. In addition, 67 HHV-7 PCR positive blood samples, derived from immunocompromised patients, were quantified using both detection techniques. The results showed a highly significant correlation with no significant difference between the two methods. The applicability of the computerised densitometry method in the routine laboratory is discussed. PMID:10856765
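
The quantitation step in competitive PCR can be sketched as follows: target and competitor templates co-amplify with the same primers, so the ratio of their band intensities (here, from densitometry of the stained gel) scales the known competitor input to an estimate of the unknown copy number. The function and numbers are illustrative, not the paper's protocol.

```python
def qcpcr_copy_number(target_intensity, competitor_intensity, competitor_copies):
    """Estimate target copies from one co-amplified reaction via the band-intensity ratio."""
    return competitor_copies * (target_intensity / competitor_intensity)

# Near-equal band intensities at 1e4 competitor copies -> roughly 1e4 target copies
load = qcpcr_copy_number(1520.0, 1490.0, 1e4)
```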

  7. Quantitative analysis of ecological effects for land use planning based on ecological footprint method: a case research in Nanyang City

    NASA Astrophysics Data System (ADS)

    Zhang, Jing; Liu, Yaolin; Chen, Xinming

    2008-10-01

    The research of coordinated development between land use and ecological construction is a new problem arising with the development of the national economy; its intention is to improve economic development and protect the eco-environment in order to realize regional sustainable development. Evaluating human effects on the ecosystem by a comprehensive, scientific and quantitative method is a critical issue in the process of general land use planning. At present, ecological footprint methodology, as an excellent educational tool applicable to global issues, is essential for quantifying humanity's consumption of natural capital, for overall assessments of human impact on earth, and for general land use planning. However, quantitative studies on the development trends of ecological footprint (EF) time series and biological capacity (BC) time series in a given region are still rare. Taking Nanyang City as a case study, this paper presents two quantitative estimate indices over the time scale, called the change rate and the scissors difference, to quantitatively analyze the trends of EF and BC over the planning period in general land use planning from 1997 to 2004 and to evaluate the ecological effects of the general land use planning from 1997 to 2010. The results showed that: (1) in Nanyang City, the trends of per capita EF and BC moved in opposite directions, and the ecological deficit grew from 1997 to 2010; (2) the difference between the two development trends of per capita EF and BC had been increasing rapidly, and the conflict between EF and BC was aggravated from 1997 to 2010; (3) the general land use planning (1997-2010) of Nanyang City had produced some positive effects on the local ecosystem, but the expected biological capacity in 2010 can hardly be realized if this trend continues. Therefore, this paper introduces a "trinity" land use model under the guidelines of an environment-friendly land use pattern and based on the actual situation of Nanyang City, with the systemic synthesis of land utilization of the cities, villages and suburbs as the principal part and land development reorganization and ecological environment construction as the key points.
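
The two trend indices can be sketched under assumed definitions (the abstract does not give formulas): the change rate as the mean annual relative change of a series, and the scissors difference as the gap between the EF and BC trend slopes. The per-capita series below are illustrative, not Nanyang data.

```python
import numpy as np

years = np.arange(1997, 2011)
ef = 1.20 + 0.03 * (years - 1997)   # per capita ecological footprint (gha), rising
bc = 0.80 - 0.01 * (years - 1997)   # per capita biological capacity (gha), falling

ef_rate = (ef[-1] - ef[0]) / ef[0] / (len(years) - 1)   # mean annual change rate
bc_rate = (bc[-1] - bc[0]) / bc[0] / (len(years) - 1)
ef_slope = np.polyfit(years, ef, 1)[0]
bc_slope = np.polyfit(years, bc, 1)[0]
scissors = ef_slope - bc_slope      # a widening gap signals a growing ecological deficit
```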

  8. Toward a quantitative account of pitch distribution in spontaneous narrative: Method and validation

    PubMed Central

    Matteson, Samuel E.; Streit Olness, Gloria; Caplow, Nancy J.

    2013-01-01

    Pitch is well-known both to animate human discourse and to convey meaning in communication. The study of the statistical population distributions of pitch in discourse will undoubtedly benefit from methodological improvements. The current investigation examines a method that parameterizes pitch in discourse as musical pitch interval H measured in units of cents and that disaggregates the sequence of peak word-pitches using tools employed in time-series analysis and digital signal processing. The investigators test the proposed methodology by its application to distributions in pitch interval of the peak word-pitch (collectively called the discourse gamut) that occur in simulated and actual spontaneous emotive narratives obtained from 17 middle-aged African-American adults. The analysis, in rigorous tests, not only faithfully reproduced simulated distributions imbedded in realistic time series that drift and include pitch breaks, but the protocol also reveals that the empirical distributions exhibit a common hidden structure when normalized to a slowly varying mode (called the gamut root) of their respective probability density functions. Quantitative differences between narratives reveal the speakers' relative propensity for the use of pitch levels corresponding to elevated degrees of a discourse gamut (the “e-la”) superimposed upon a continuum that conforms systematically to an asymmetric Laplace distribution. PMID:23654400
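
The pitch-interval parameterization above can be sketched directly: the interval H in cents between a word-pitch f and a reference f_ref is 1200 · log2(f / f_ref), so an octave is 1200 cents and an equal-tempered semitone is 100.

```python
import math

def pitch_interval_cents(f, f_ref):
    """Musical interval from f_ref to f, in cents."""
    return 1200.0 * math.log2(f / f_ref)

octave = pitch_interval_cents(880.0, 440.0)                     # one octave = 1200 cents
semitone = pitch_interval_cents(440.0 * 2 ** (1 / 12), 440.0)   # one semitone = 100 cents
```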

  9. A simple method for the quantitative microextraction of polychlorinated biphenyls from soils and sediments.

    SciTech Connect

    Szostek, B.; Tinklenberg, J. A.; Aldstadt, J. H., III; Environmental Research

    1999-01-01

    We demonstrate the quantitative extraction of polychlorinated biphenyls (PCBs) from environmental solids by using a microscale adaptation of pressurized fluid extraction (µPFE). The stainless steel extraction cells are filled with a solid sample and solvent and are heated at elevated temperature. After cooling the cell to room temperature, we determined PCBs in the extract by direct injection to a gas chromatograph with an electron capture detection system. This extraction method was tested on a set of PCB-spiked solid matrices and on a PCB-contaminated river sediment (NIST SRM 1939). Recoveries were measured for eight PCB congeners spiked into two soil types with hexane extraction at 100 °C (81.9 ± 5.4% to 112.5 ± 10.1%). The extraction process for SRM 1939 with hexane at 300 °C provided significantly higher recoveries for several representative PCB congeners than reported for a duplicate 16-hour Soxhlet extraction with a mixture of organic solvents (acetone/hexane).
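
The recovery figures above follow the usual spike-recovery calculation: the measured amount of a spiked congener relative to the amount spiked, expressed as a percentage. The values below are illustrative.

```python
def percent_recovery(measured_ng, spiked_ng):
    """Spike recovery as a percentage of the amount spiked into the matrix."""
    return 100.0 * measured_ng / spiked_ng

# 45.2 ng recovered from a 50 ng spike -> 90.4% recovery
r = percent_recovery(45.2, 50.0)
```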

  10. A gradient-based method for quantitative photoacoustic tomography using the radiative transfer equation

    NASA Astrophysics Data System (ADS)

    Saratoon, T.; Tarvainen, T.; Cox, B. T.; Arridge, S. R.

    2013-07-01

    Quantitative photoacoustic tomography (QPAT) offers the possibility of high-resolution molecular imaging by quantifying molecular concentrations in biological tissue. QPAT comprises two inverse problems: (1) the construction of a photoacoustic image from surface measurements of photoacoustic wave pulses over time, and (2) the determination of the optical properties of the imaged region. The first is a well-studied area for which a number of solution methods are available, while the second is, in general, a nonlinear, ill-posed inverse problem. Model-based inversion techniques to solve (2) are usually based on the diffusion approximation to the radiative transfer equation (RTE) and typically assume the acoustic inversion step has been solved exactly. Here, neither simplification is made: the full RTE is used to model the light propagation, and the acoustic propagation and image reconstruction are included in the simulations of measured data. Since Hessian- and Jacobian-based minimizations are computationally expensive for the large data sets typically encountered in QPAT, gradient-based minimization schemes provide a practical alternative. The acoustic pressure time series were simulated using a k-space, pseudo-spectral time domain model, and a time-reversal reconstruction algorithm was used to form a set of photoacoustic images corresponding to four illumination positions. A regularized, adjoint-assisted gradient inversion using a finite element model of the RTE was then used to determine the optical absorption and scattering coefficients.
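
The gradient-based inversion idea can be illustrated with a heavily simplified sketch: recover an absorption coefficient by gradient descent on a least-squares data misfit. The forward model here is a toy Beer-Lambert attenuation standing in for the finite-element RTE model of the paper, and all parameters are assumptions for the sketch.

```python
import numpy as np

path_lengths = np.linspace(0.5, 3.0, 20)   # cm; assumed illumination path lengths
mu_true = 0.7                              # 1/cm; absorption coefficient to recover
data = np.exp(-mu_true * path_lengths)     # noiseless synthetic "measurements"

mu = 0.1                                   # initial guess
step = 0.05                                # fixed gradient-descent step size
for _ in range(2000):
    model = np.exp(-mu * path_lengths)
    residual = model - data
    # gradient of the misfit 0.5 * sum(residual**2) with respect to mu
    gradient = np.sum(residual * (-path_lengths) * model)
    mu -= step * gradient
```

A real QPAT inversion replaces this scalar update with an adjoint-assisted gradient over a full absorption/scattering map, plus regularization, as described above.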

  11. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, R.V.

    1993-03-16

    The present invention is a quantitative method for measuring the total heat flux, and for deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infrared sensing devices.

  12. Toward a quantitative account of pitch distribution in spontaneous narrative: method and validation.

    PubMed

    Matteson, Samuel E; Olness, Gloria Streit; Caplow, Nancy J

    2013-05-01

    Pitch is well-known both to animate human discourse and to convey meaning in communication. The study of the statistical population distributions of pitch in discourse will undoubtedly benefit from methodological improvements. The current investigation examines a method that parameterizes pitch in discourse as musical pitch interval H measured in units of cents and that disaggregates the sequence of peak word-pitches using tools employed in time-series analysis and digital signal processing. The investigators test the proposed methodology by its application to distributions in pitch interval of the peak word-pitch (collectively called the discourse gamut) that occur in simulated and actual spontaneous emotive narratives obtained from 17 middle-aged African-American adults. The analysis, in rigorous tests, not only faithfully reproduced simulated distributions imbedded in realistic time series that drift and include pitch breaks, but the protocol also reveals that the empirical distributions exhibit a common hidden structure when normalized to a slowly varying mode (called the gamut root) of their respective probability density functions. Quantitative differences between narratives reveal the speakers' relative propensity for the use of pitch levels corresponding to elevated degrees of a discourse gamut (the "e-la") superimposed upon a continuum that conforms systematically to an asymmetric Laplace distribution. PMID:23654400

  13. Challenges of Interdisciplinary Research: Reconciling Qualitative and Quantitative Methods for Understanding Human-Landscape Systems

    NASA Astrophysics Data System (ADS)

    Lach, Denise

    2014-01-01

    While interdisciplinary research is increasingly practiced as a way to transcend the limitations of individual disciplines, our concepts and methods are primarily rooted in the disciplines that shape the way we think about the world and how we conduct research. While natural and social scientists may share a general understanding of how science is conducted, disciplinary differences in methodologies quickly emerge during interdisciplinary research efforts. This paper briefly introduces and reviews the different philosophical underpinnings of quantitative and qualitative methodological approaches and introduces the idea that a pragmatic, realist approach may allow natural and social scientists to work together productively. While realism assumes that there is a reality that exists independently of our perceptions, the work of scientists is to explore the mechanisms by which actions cause meaningful outcomes and the conditions under which the mechanisms can act. Our task as interdisciplinary researchers is to use the insights of our disciplines in the context of the problem to co-produce an explanation for the variables of interest. Research on the qualities necessary for successful interdisciplinary researchers is also discussed, along with recent efforts by funding agencies and academia to increase capacities for interdisciplinary research.

  14. Qualitative and Quantitative PCR-Based Detection Methods for Authorized Genetically Modified Cotton Events in India.

    PubMed

    Chhabra, Rashmi; Randhawa, Gurinder Jit; Bhoge, Rajesh K; Singh, Monika

    2014-01-01

    Qualitative diagnostics for all five commercialized genetically modified (GM) cotton events for insect resistance in India is being reported for the first time in this paper. The cost-effective and robust multiplex PCR (MPCR)-based detection assay, distinguishing the insect resistant transgenic Bt cotton events, viz., MON531, MON15985, Event 1, GFM-cry1A, and MLS-9124, has been developed. This decaplex PCR assay targets nine transgenic elements, viz., sequences of four transgenes, three transgene constructs, and two event-specific sequences along with one endogenous reference gene. The LOD of the qualitative MPCR assay was up to 0.1%. A quantitative detection method for four widely commercially cultivated GM cotton events, namely, MON531, MON15985, Event 1, and GFM-cry1A, covering 99.5% of the total area under GM cultivation in the country, is also reported. A construct-specific real-time PCR assay has been developed for quantification of these GM cotton events with LOQ <0.05% and LOD <0.025%. The developed assays will be of great use to screen for the presence/absence of authorized GM cotton events in unknown samples and to check the authenticity of GM cotton seed samples. PMID:25902979
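
Real-time PCR quantification of the kind reported above is commonly done against a standard curve: Ct is approximately linear in log10(copies), so a fit to known standards converts an unknown Ct to an estimated copy number. This is a generic sketch of that approach, not the paper's construct-specific assay; the Ct values are illustrative.

```python
import numpy as np

std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])    # known standards (copies/reaction)
std_ct = np.array([30.1, 26.8, 23.4, 20.0, 16.7])   # illustrative Ct values (~ -3.35/decade)

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)

def copies_from_ct(ct):
    """Convert a measured Ct to an estimated copy number via the standard curve."""
    return 10.0 ** ((ct - intercept) / slope)

unknown = copies_from_ct(25.0)
```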

  15. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, Robert V.

    1993-01-01

    The present invention is a quantitative method for measuring the total heat flux, and for deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infra-red sensing devices.

  16. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Marín, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-01

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. PMID:26275862

  17. Quantitative assessment of MS plaques and brain atrophy in multiple sclerosis using semiautomatic segmentation method

    NASA Astrophysics Data System (ADS)

    Heinonen, Tomi; Dastidar, Prasun; Ryymin, Pertti; Lahtinen, Antti J.; Eskola, Hannu; Malmivuo, Jaakko

    1997-05-01

    Quantitative magnetic resonance (MR) imaging of the brain is useful in multiple sclerosis (MS) in order to obtain reliable indices of disease progression. The goal of this project was to estimate the total volume of gliotic and non-gliotic plaques in chronic progressive multiple sclerosis with the help of a semiautomatic segmentation method developed at the Ragnar Granit Institute. The program, which runs on a PC, provides displays of the segmented data in addition to the volumetric analyses. The volumetric accuracy of the program was demonstrated by segmenting MR images of fluid-filled syringes. An anatomical atlas is to be incorporated in the segmentation system to estimate the distribution of MS plaques in various neural pathways of the brain. A total package including MS plaque volume estimation, estimation of brain atrophy and ventricular enlargement, and the distribution of MS plaques in different neural segments of the brain has been planned for the near future. Our study confirmed that total lesion volumes in chronic MS show a poor correlation with EDSS scores but a positive correlation with neuropsychological scores. Therefore, accurate total volume measurements of MS plaques using the developed semiautomatic segmentation technique helped us to evaluate the degree of neuropsychological impairment.
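
The volumetric step behind such a segmentation method can be sketched simply: once plaques are segmented slice by slice, total lesion volume is the labelled voxel count times the voxel volume. The voxel dimensions and lesion below are illustrative.

```python
import numpy as np

voxel_mm3 = 0.94 * 0.94 * 3.0              # in-plane resolution (mm) x slice thickness (mm)
mask = np.zeros((10, 64, 64), dtype=bool)  # segmented plaque mask: (slices, rows, cols)
mask[4:6, 20:30, 20:30] = True             # a 2-slice, 10x10-voxel lesion

lesion_volume_mm3 = mask.sum() * voxel_mm3
lesion_volume_ml = lesion_volume_mm3 / 1000.0
```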

  18. Development of a Screening Method for the Frail Elderly by Quantitative Measurement of Lower Limb Muscular Strength

    NASA Astrophysics Data System (ADS)

    Yamashita, Kazuhiko; Iwakami, Yumi; Imaizumi, Kazuya; Sato, Mitsuru; Nakajima, Sawako; Ino, Shuichi; Kawasumi, Masashi; Ifukube, Tohru

    Falling is one of the most serious problems for the elderly. The aim of this study was to develop a screening method for identifying factors that increase the risk of falling among the elderly, particularly with regard to lower limb muscular strength. Subjects were 48 elderly volunteers, including 25 classed as healthy and 23 classed as frail. All subjects underwent measurement of lower limb muscular strength via toe gap force and measurement of muscle strength of the hip joint adductor via knee gap force. In the frail group, toe gap force of the right foot was 20% lower than that in the healthy group; toe gap force of the left foot in the frail group was 23% lower than that in the healthy group, while knee gap force was 20% lower. Furthermore, we found that combining left toe gap force and knee gap force gave the highest odds ratio (6.05) with 82.6% sensitivity and 56.0% specificity when the toe gap force was 24 N and the knee gap force was 100 N. Thus, lower limb muscular strength can be used for simple and efficient screening, and approaches to prevent falls can be based on quantitative data such as lower limb muscular strength.
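
The threshold screening evaluated above can be sketched as follows, assuming the two cutoffs (left toe gap force 24 N, knee gap force 100 N) are combined with a logical AND; the participant records are illustrative, not the study's data.

```python
def screen(toe_n, knee_n, toe_cut=24.0, knee_cut=100.0):
    """Flag a participant as at-risk when both forces fall below their cutoffs."""
    return toe_n < toe_cut and knee_n < knee_cut

# (left toe gap force N, knee gap force N, is_frail)
people = [(18, 80, True), (20, 95, True), (30, 90, True),
          (28, 120, False), (26, 110, False), (22, 130, False)]

tp = sum(screen(t, k) and frail for t, k, frail in people)
fn = sum(not screen(t, k) and frail for t, k, frail in people)
tn = sum(not screen(t, k) and not frail for t, k, frail in people)
fp = sum(screen(t, k) and not frail for t, k, frail in people)
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
```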

  19. Quantitative determination of zopiclone and its impurity by four different spectrophotometric methods.

    PubMed

    Abdelrahman, Maha M; Naguib, Ibrahim A; El Ghobashy, Mohamed R; Ali, Nesma A

    2015-02-25

    Four simple, sensitive and selective spectrophotometric methods are presented for determination of Zopiclone (ZPC) and its impurity, one of its degradation products, namely; 2-amino-5-chloropyridine (ACP). Method A is a dual wavelength spectrophotometry; where two wavelengths (252 and 301 nm for ZPC, and 238 and 261 nm for ACP) were selected for each component in such a way that difference in absorbance is zero for the second one. Method B is isoabsorptive ratio method by combining the isoabsorptive point (259.8 nm) in the ratio spectrum using ACP as a divisor and the ratio difference for a single step determination of both components. Method C is third derivative (D(3)) spectrophotometric method which allows determination of both ZPC at 283.6 nm and ACP at 251.6 nm without interference of each other. Method D is based on measuring the peak amplitude of the first derivative of the ratio spectra (DD(1)) at 263.2 nm for ZPC and 252 nm for ACP. The suggested methods were validated according to ICH guidelines and can be applied for routine analysis in quality control laboratories. Statistical analysis of the results obtained from the proposed methods and those obtained from the reported method has been carried out revealing high accuracy and good precision. PMID:25244295
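
The first-derivative-of-ratio-spectra idea (method D) can be sketched numerically: dividing the mixture spectrum by the interferent's spectrum turns the interferent's contribution into a constant, which the derivative then removes, leaving a signal proportional to the analyte. The Gaussian bands below are synthetic, not measured ZPC/ACP spectra.

```python
import numpy as np

wavelength = np.linspace(220, 320, 501)                                # nm
analyte = np.exp(-0.5 * ((wavelength - 283.6) / 8.0) ** 2)             # analyte-like band
interferent = np.exp(-0.5 * ((wavelength - 252.0) / 8.0) ** 2) + 0.05  # interferent band + baseline

mixture = 0.6 * analyte + 0.3 * interferent   # two-component "sample" spectrum
ratio = mixture / interferent                 # interferent now contributes a constant (0.3)
dd1 = np.gradient(ratio, wavelength)          # first derivative removes that constant
```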

  20. Quantitative determination of zopiclone and its impurity by four different spectrophotometric methods

    NASA Astrophysics Data System (ADS)

    Abdelrahman, Maha M.; Naguib, Ibrahim A.; El Ghobashy, Mohamed R.; Ali, Nesma A.

    2015-02-01

    Four simple, sensitive and selective spectrophotometric methods are presented for determination of Zopiclone (ZPC) and its impurity, one of its degradation products, namely; 2-amino-5-chloropyridine (ACP). Method A is a dual wavelength spectrophotometry; where two wavelengths (252 and 301 nm for ZPC, and 238 and 261 nm for ACP) were selected for each component in such a way that difference in absorbance is zero for the second one. Method B is isoabsorptive ratio method by combining the isoabsorptive point (259.8 nm) in the ratio spectrum using ACP as a divisor and the ratio difference for a single step determination of both components. Method C is third derivative (D3) spectrophotometric method which allows determination of both ZPC at 283.6 nm and ACP at 251.6 nm without interference of each other. Method D is based on measuring the peak amplitude of the first derivative of the ratio spectra (DD1) at 263.2 nm for ZPC and 252 nm for ACP. The suggested methods were validated according to ICH guidelines and can be applied for routine analysis in quality control laboratories. Statistical analysis of the results obtained from the proposed methods and those obtained from the reported method has been carried out revealing high accuracy and good precision.