Sample records for objective quantitative methods

  1. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e., we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of a gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  2. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    PubMed Central

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest.
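    The noise-to-slope ratio named above follows from a linear measurement model. As a minimal sketch (not the paper's NGS estimator, which works without ground truth), the snippet below fits measured = a·true + b + noise against known simulated truth and ranks two hypothetical reconstruction methods by NSR = σ/|a|; all numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def nsr(true_vals, measured_vals):
    """Fit measured = a*true + b + noise; return sigma / |a| (lower = more precise)."""
    a, b = np.polyfit(true_vals, measured_vals, 1)
    resid = measured_vals - (a * true_vals + b)
    sigma = resid.std(ddof=2)  # two fitted parameters
    return sigma / abs(a)

# Simulated true activity concentrations (hypothetical units)
true_ac = rng.uniform(5.0, 15.0, 200)

# Two hypothetical reconstruction methods with different noise levels
method_1 = 1.1 * true_ac + 0.5 + rng.normal(0, 0.4, 200)
method_2 = 0.9 * true_ac - 0.2 + rng.normal(0, 1.2, 200)

print(nsr(true_ac, method_1) < nsr(true_ac, method_2))  # method 1 ranks better
```

The NGS technique's contribution is estimating (a, b, σ) for each method purely from the measured values; this sketch only shows what the resulting NSR figure of merit ranks.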

  3. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest.

  4. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, Robert V.

    1993-01-01

    The present invention is a quantitative method for measuring the total heat flux, and for deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infrared sensing devices.

  5. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, R.V.

    1993-03-16

    The present invention is a quantitative method for measuring the total heat flux, and for deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infrared sensing devices.

  6. Laser flare photometry: a noninvasive, objective, and quantitative method to measure intraocular inflammation.

    PubMed

    Tugal-Tutkun, Ilknur; Herbort, Carl P

    2010-10-01

    Aqueous flare and cells are the two inflammatory parameters of anterior chamber inflammation resulting from disruption of the blood-ocular barriers. When examined with the slit lamp, measurement of intraocular inflammation remains subjective with considerable intra- and interobserver variations. Laser flare cell photometry is an objective quantitative method that enables accurate measurement of these parameters with very high reproducibility. Laser flare photometry allows detection of subclinical alterations in the blood-ocular barriers, identifying subtle pathological changes that could not have been recorded otherwise. With the use of this method, it has been possible to compare the effect of different surgical techniques, surgical adjuncts, and anti-inflammatory medications on intraocular inflammation. Clinical studies of uveitis patients have shown that flare measurements by laser flare photometry allowed precise monitoring of well-defined uveitic entities and prediction of disease relapse. Relationships of laser flare photometry values with complications of uveitis and visual loss further indicate that flare measurement by laser flare photometry should be included in the routine follow-up of patients with uveitis.

  7. Characterization and Application of a Grazing Angle Objective for Quantitative Infrared Reflection Microspectroscopy

    NASA Technical Reports Server (NTRS)

    Pepper, Stephen V.

    1995-01-01

    A grazing angle objective on an infrared microspectrometer is studied for quantitative spectroscopy by considering the angular dependence of the incident intensity within the objective's angular aperture. The assumption that there is no angular dependence is tested by comparing the experimental reflectance of Si and KBr surfaces with the reflectance calculated by integrating the Fresnel reflection coefficient over the angular aperture under this assumption. Good agreement was found, indicating that the specular reflectance of surfaces can be straightforwardly integrated over the angular aperture without accounting for non-uniform incident intensity. This quantitative approach is applied to the thickness determination of dip-coated Krytox on gold. The infrared optical constants of both materials are known, allowing the integration to be carried out. The thickness obtained is in fair agreement with the value determined by ellipsometry in the visible. Therefore, this paper illustrates a method for more quantitative use of a grazing angle objective for infrared reflectance microspectroscopy.

  8. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of estimated failure rates provided quantitative data for fault analysis. Authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
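    The fault-tree step of such an analysis can be sketched with standard independence assumptions. This is not the paper's Software Efficiency Evaluation and Fault Identification Method itself; the step structure, probabilities, and the workaround failure rate below are hypothetical.

```python
# OR gate: the process fails if any input step fails.
def or_gate(probs):
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# AND gate: failure only if all inputs fail (independent events assumed).
def and_gate(probs):
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical mean nurse-estimated failure rates for three interface steps
step_failures = [0.10, 0.05, 0.02]

# A step failure somewhere in the process is fairly frequent...
p_process = or_gate(step_failures)
# ...but overall system failure also requires the user's workaround to fail
# (hypothetical 5% workaround failure rate), so it is rare.
p_system = and_gate([p_process, 0.05])

print(p_process, p_system)
```

This mirrors the paper's observation that frequent step-level failures can coexist with rare overall system failure, at the cost of extra processing time.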

  9. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  10. Objective comparison of particle tracking methods

    PubMed Central

    Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F.; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R.; Godinez, William J.; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E. G.; Jaldén, Joakim; Blau, Helen M.; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L.; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P.; Dan, Han-Wei; Tsai, Yuh-Show; de Solórzano, Carlos Ortiz; Olivo-Marin, Jean-Christophe; Meijering, Erik

    2014-01-01

    Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Since manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized, for the first time, an open competition, in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to important practical conclusions for users and developers. PMID:24441936

  11. Objective comparison of particle tracking methods.

    PubMed

    Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R; Godinez, William J; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E G; Jaldén, Joakim; Blau, Helen M; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P; Dan, Han-Wei; Tsai, Yuh-Show; Ortiz de Solórzano, Carlos; Olivo-Marin, Jean-Christophe; Meijering, Erik

    2014-03-01

    Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Because manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized an open competition in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to notable practical conclusions for users and developers.
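    The simplest association step underlying many of the compared trackers can be sketched as greedy nearest-neighbor linking between consecutive frames with a gating distance. This is a deliberately minimal stand-in, not any specific competition entry, and the coordinates are made up; the methods in the study use far richer models (probabilistic filters, multi-frame optimization, etc.).

```python
import math

def link_frames(prev_pts, next_pts, max_dist=5.0):
    """Greedily link each point in prev_pts to its nearest unclaimed point in
    next_pts, skipping candidates farther than max_dist. Returns (i, j) pairs."""
    links, taken = [], set()
    for i, p in enumerate(prev_pts):
        best_j, best_d = None, max_dist
        for j, q in enumerate(next_pts):
            if j in taken:
                continue
            d = math.dist(p, q)
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            links.append((i, best_j))
            taken.add(best_j)
    return links

frame_a = [(0.0, 0.0), (10.0, 10.0)]
frame_b = [(10.5, 9.5), (0.5, 0.2), (50.0, 50.0)]  # third point: new detection

print(link_frames(frame_a, frame_b))  # → [(0, 1), (1, 0)]
```

The gating distance is what keeps spurious long-range links (like the new detection at (50, 50)) out of the trajectories.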

  12. Quantitative Information Differences Between Object-Person Presentation Methods

    ERIC Educational Resources Information Center

    Boyd, J. Edwin; Perry, Raymond P.

    1972-01-01

    Subjects used significantly more adjectives on an adjective checklist (ACL) in giving their impressions of an object-person based on written and audiovisual presentations than on audio presentations. (SD)

  13. Objective, Quantitative, Data-Driven Assessment of Chemical Probes.

    PubMed

    Antolin, Albert A; Tym, Joseph E; Komianou, Angeliki; Collins, Ian; Workman, Paul; Al-Lazikani, Bissan

    2018-02-15

    Chemical probes are essential tools for understanding biological systems and for target validation, yet selecting probes for biomedical research is rarely based on objective assessment of all potential compounds. Here, we describe the Probe Miner: Chemical Probes Objective Assessment resource, capitalizing on the plethora of public medicinal chemistry data to empower quantitative, objective, data-driven evaluation of chemical probes. We assess >1.8 million compounds for their suitability as chemical tools against 2,220 human targets and dissect the biases and limitations encountered. Probe Miner represents a valuable resource to aid the identification of potential chemical probes, particularly when used alongside expert curation. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional benefits are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
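    A hierarchical weighted average of the kind the report names can be sketched as group weights applied to within-group weighted criterion scores. The groups, criteria, weights, and scores below are hypothetical, not taken from the report.

```python
def hierarchical_weighted_average(tree, scores):
    """tree: {group: (group_weight, {criterion: weight})};
    scores: {criterion: score on a 0-10 scale}. Weights sum to 1 at each level."""
    total = 0.0
    for group_weight, criteria in tree.values():
        group_score = sum(w * scores[c] for c, w in criteria.items())
        total += group_weight * group_score
    return total

# Hypothetical criteria hierarchy for comparing computer-system designs
design_criteria = {
    "performance": (0.5, {"throughput": 0.6, "latency": 0.4}),
    "cost":        (0.3, {"hardware": 1.0}),
    "reliability": (0.2, {"mtbf": 1.0}),
}

alternative_a = {"throughput": 8, "latency": 6, "hardware": 5, "mtbf": 9}
alternative_b = {"throughput": 6, "latency": 9, "hardware": 8, "mtbf": 4}

score_a = hierarchical_weighted_average(design_criteria, alternative_a)
score_b = hierarchical_weighted_average(design_criteria, alternative_b)
print(score_a, score_b)
```

Making the weights explicit is what gives the uniformity and documentation benefits the abstract describes: the rationale for preferring one alternative is visible in the numbers.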

  15. [Methods of quantitative proteomics].

    PubMed

    Kopylov, A T; Zgoda, V G

    2007-01-01

    In modern science, proteomic analysis is inseparable from other fields of systems biology. Possessing huge resources, quantitative proteomics handles colossal amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 different methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing both chemical modification of proteins and metabolic and enzymatic methods of isotope labeling.

  16. Quantitative Evaluation of Heavy Duty Machine Tools Remanufacturing Based on Modified Catastrophe Progression Method

    NASA Astrophysics Data System (ADS)

    shunhe, Li; jianhua, Rao; lin, Gui; weimin, Zhang; degang, Liu

    2017-11-01

    The result of remanufacturing evaluation is the basis for judging whether a heavy duty machine tool can be remanufactured in the EOL stage of the machine tool lifecycle. The objectivity and accuracy of the evaluation are the key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy duty machine tool remanufacturing, and the results are modified by the comprehensive adjustment method, which brings the evaluation results into line with conventional human judgment. The catastrophe progression method is used to establish a quantitative evaluation model for heavy duty machine tools and to evaluate the remanufacturing of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.

  17. Phase retrieval with the reverse projection method in the presence of object's scattering

    NASA Astrophysics Data System (ADS)

    Wang, Zhili; Gao, Kun; Wang, Dajiang

    2017-08-01

    X-ray grating interferometry can provide substantially increased contrast over traditional attenuation-based techniques in biomedical applications, and therefore novel and complementary information. Recently, special attention has been paid to quantitative phase retrieval in X-ray grating interferometry, which is mandatory to perform phase tomography, to achieve material identification, etc. An innovative approach, dubbed "Reverse Projection" (RP), has been developed for quantitative phase retrieval. The RP method abandons grating scanning completely, and is thus advantageous in terms of higher efficiency and reduced radiation damage. Therefore, it is expected that this novel method will find its potential in preclinical and clinical implementations. Strictly speaking, the reverse projection method is applicable only to objects exhibiting absorption and refraction. In this contribution, we discuss phase retrieval with the reverse projection method for general objects with absorption, refraction and scattering simultaneously. In particular, we investigate the influence of the object's scattering on the retrieved refraction signal. Both theoretical analysis and numerical experiments are performed. The results show that the retrieved refraction signal is the product of the object's refraction and scattering signals for small values. In the case of strong scattering, the reverse projection method cannot provide reliable phase retrieval. These results will guide the use of the reverse projection method in future practical applications, and help to explain possible artifacts in the retrieved images and/or reconstructed slices.

  18. A GIS-BASED METHOD FOR MULTI-OBJECTIVE EVALUATION OF PARK VEGETATION. (R824766)

    EPA Science Inventory

    Abstract

    In this paper we describe a method for evaluating the concordance between a set of mapped landscape attributes and a set of quantitatively expressed management priorities. The method has proved to be useful in planning urban green areas, allowing objectively d...

  19. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior.

    PubMed

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-01-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
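    The core DPDs operation, subtracting the antecedent from the subsequent frame so that only changes in mass distribution remain, can be sketched as follows. It assumes each frame has already been converted to dry-mass density in picograms per pixel (which QPI permits); the frames here are synthetic, not data from the paper.

```python
import numpy as np

def dynamic_phase_differences(frames):
    """Per-pixel differences between consecutive frames (pg per pixel):
    positive where mass arrived, negative where it left."""
    return [frames[i + 1] - frames[i] for i in range(len(frames) - 1)]

rng = np.random.default_rng(1)
base = rng.uniform(0.0, 2.0, size=(64, 64))   # initial dry-mass map
shifted = np.roll(base, 3, axis=1)            # cell mass redistributes laterally
frames = [base, shifted]

dpds = dynamic_phase_differences(frames)
net_change = float(dpds[0].sum())  # pure redistribution conserves total mass
print(abs(net_change) < 1e-9)      # ~0 pg: mass moved, none gained or lost
```

In the paper's visualizations such a difference map is color-coded in 2D, and the summed positive or negative changes give the picogram time courses.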

  20. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior

    NASA Astrophysics Data System (ADS)

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-11-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.

  1. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  2. Multi-objective decision-making under uncertainty: Fuzzy logic methods

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.

    1995-01-01

    Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
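    The fuzzy-logic representation of vague objectives can be sketched with simple ramp membership functions and min-aggregation, a common conservative aggregation choice; NASA's actual software, objectives, and numbers are not reproduced here, and the engine options below are hypothetical.

```python
def ramp_down(x, lo, hi):
    """Membership 1 below lo, 0 above hi, linear in between (good = small x)."""
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    return (hi - x) / (hi - lo)

def ramp_up(x, lo, hi):
    """Membership 0 below lo, 1 above hi (good = large x)."""
    return 1.0 - ramp_down(x, lo, hi)

def fitness(cost, thrust):
    """Overall fitness = worst satisfied objective ("low cost" AND "high thrust")."""
    return min(ramp_down(cost, 10.0, 50.0), ramp_up(thrust, 100.0, 300.0))

# Two hypothetical ascent-engine options: (cost in $M, thrust in kN)
option_a = fitness(20.0, 280.0)   # cheap, near-max thrust
option_b = fitness(45.0, 150.0)   # expensive, middling thrust
print(option_a > option_b)        # option A dominates under both objectives
```

The min operator makes a single badly-met objective cap the alternative's score, which is one way to encode "all objectives matter" in multi-objective selection.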

  3. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the
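    The study's chi-square comparison of component proportions between article types can be sketched directly for a 2x2 table. The counts below are hypothetical, not the study's data; only the test statistic's formula is standard.

```python
def chi_square_2x2(table):
    """Uncorrected chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]]: n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    (a, b), (c, d) = table
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Rows: mixed-methods vs quantitative articles (hypothetical counts);
# columns: methodological component present vs absent.
table = [[22, 78],
         [47, 53]]

chi2 = chi_square_2x2(table)
print(chi2 > 3.84)  # exceeds the 0.05 critical value for 1 degree of freedom
```

A significant statistic here, as in the study, says the proportion of articles reporting the component differs between the two article types.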

  4. [Progress in stable isotope labeled quantitative proteomics methods].

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods; the latter have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, supporting rapid advances in biological research. In this work, we discuss progress in stable isotope labeling methods for quantitative proteomics, including relative and absolute quantification, and then give our opinions on the outlook of proteome quantification methods.

  5. Qualitative versus quantitative methods in psychiatric research.

    PubMed

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, they are considered of special importance in psychiatry. Besides hypothesis generation in earlier phases of research, qualitative methods can be employed in questionnaire design, establishment of diagnostic criteria, feasibility studies, as well as studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be confirmed statistically with quantitative methods. The tendency to combine qualitative and quantitative methods as complementary approaches has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  6. Nuclear medicine and quantitative imaging research (instrumentation and quantitative methods of evaluation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1990-09-01

    This report summarizes goals and accomplishments of the research program supported under DOE Grant No. FG02-86ER60418 entitled Instrumentation and Quantitative Methods of Evaluation, with R. Beck, P.I. and M. Cooper, Co-P.I., during the period January 15, 1990 through September 1, 1990. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 7 figs.

  7. A new method to evaluate image quality of CBCT images quantitatively without observers

    PubMed Central

    Shimizu, Mayumi; Okamura, Kazutoshi; Yoshida, Shoko; Weerawanich, Warangkana; Tokumori, Kenji; Jasa, Gainer R; Yoshiura, Kazunori

    2017-01-01

    Objectives: To develop an observer-free method for quantitatively evaluating the image quality of CBCT images by applying just-noticeable difference (JND). Methods: We used two test objects: (1) a Teflon (polytetrafluoroethylene) plate phantom attached to a dry human mandible; and (2) a block phantom consisting of a Teflon step phantom and an aluminium step phantom. These phantoms had holes with different depths. They were immersed in water and scanned with a CB MercuRay (Hitachi Medical Corporation, Tokyo, Japan) at tube voltages of 120 kV, 100 kV, 80 kV and 60 kV. Superimposed images of the phantoms with holes were used for evaluation. The number of detectable holes was used as an index of image quality. In detecting holes quantitatively, the threshold grey value (ΔG), which differentiated holes from the background, was calculated using a specific threshold (the JND), and we extracted the holes with grey values above ΔG. The indices obtained by this quantitative method (the extracted hole values) were compared with the observer evaluations (the observed hole values). In addition, the contrast-to-noise ratio (CNR) of the shallowest detectable holes and the deepest undetectable holes were measured to evaluate the contribution of CNR to detectability. Results: The results of this evaluation method corresponded almost exactly with the evaluations made by observers. The extracted hole values reflected the influence of different tube voltages. All extracted holes had an area with a CNR of ≥1.5. Conclusions: This quantitative method of evaluating CBCT image quality may be more useful and less time-consuming than evaluation by observation. PMID:28045343
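    The ΔG extraction step can be illustrated on a synthetic phantom slice; the Weber-type JND fraction below is an assumed placeholder for illustration, not the paper's calibrated threshold:

```python
import numpy as np

def extract_holes(img, background_mean, jnd_fraction=0.02):
    """Flag pixels whose grey value differs from the background by more than
    a threshold grey value (delta-G) derived from a JND-style fraction."""
    delta_g = jnd_fraction * background_mean  # threshold grey difference
    return np.abs(img - background_mean) > delta_g

# Synthetic slice: uniform background with a deep and a shallow "hole".
img = np.full((64, 64), 200.0)
img[10:14, 10:14] = 180.0  # deep hole: well outside the JND band
img[40:44, 40:44] = 199.0  # shallow hole: inside the JND band, undetected
mask = extract_holes(img, background_mean=200.0)
```

    Counting connected regions in `mask` would then give the "extracted hole value" used as the image-quality index.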

  8. Multi-objective decision-making under uncertainty: Fuzzy logic methods

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.

    1994-01-01

    Selecting the best option among alternatives is often a difficult process. This process becomes even more difficult when the evaluation criteria are vague or qualitative, and when the objectives vary in importance and scope. Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.

  9. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282
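    The paired t-test used to compare the two quantification methods can be sketched as follows; the γ-oryzanol values are hypothetical, not the paper's measurements:

```python
from scipy.stats import ttest_rel

# Hypothetical gamma-oryzanol contents (mg/g) for the same six oil samples,
# measured by the two methods; values are illustrative only.
densitometric = [2.10, 2.35, 1.98, 2.50, 2.22, 2.41]
image_analysis = [2.12, 2.31, 2.01, 2.47, 2.25, 2.38]

# Paired test, since both methods measure the same samples.
t_stat, p_value = ttest_rel(densitometric, image_analysis)
```

    A p-value above the chosen significance level (commonly 0.05) supports the abstract's conclusion of no statistically significant difference between the methods.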

  10. Methods for Quantitative Creatinine Determination.

    PubMed

    Moore, John F; Sharer, J Daniel

    2017-04-06

    Reliable measurement of creatinine is necessary to assess kidney function, and also to quantitate drug levels and diagnostic compounds in urine samples. The most commonly used methods are based on the Jaffe principle of alkaline creatinine-picric acid complex color formation. However, other compounds commonly found in serum and urine may interfere with Jaffe creatinine measurements. Therefore, many laboratories have made modifications to the basic method to remove or account for these interfering substances. This appendix will summarize the basic Jaffe method, as well as a modified, automated version. Also described is a high performance liquid chromatography (HPLC) method that separates creatinine from contaminants prior to direct quantification by UV absorption. Lastly, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is described that uses stable isotope dilution to reliably quantify creatinine in any sample. This last approach has been recommended by experts in the field as a means to standardize all quantitative creatinine methods against an accepted reference. © 2017 John Wiley & Sons, Inc.
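    Colorimetric quantitation of the kind used in Jaffe-type assays reduces to a linear calibration curve; the standard concentrations and absorbances below are illustrative values, not published ones:

```python
import numpy as np

# Hypothetical calibration: absorbance of creatinine standards (mg/dL).
standards_mg_dl = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
absorbance = np.array([0.02, 0.11, 0.20, 0.38, 0.74])  # illustrative

# Linear fit: absorbance = slope * concentration + intercept.
slope, intercept = np.polyfit(standards_mg_dl, absorbance, 1)

def creatinine_conc(a):
    """Invert the linear calibration to get concentration (mg/dL)."""
    return (a - intercept) / slope
```

    An unknown sample's absorbance is then converted to concentration by inverting the fitted line, as `creatinine_conc` does.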

  11. Quantitative Methods in Psychology: Inevitable and Useless

    PubMed Central

    Toomela, Aaro

    2010-01-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: structural-systemic, which is based on Aristotelian thinking, and associative-quantitative, which is based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause–effect relationships between events with no possible access to the understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments. PMID:21833199

  12. Quantitative methods in psychology: inevitable and useless.

    PubMed

    Toomela, Aaro

    2010-01-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: structural-systemic, which is based on Aristotelian thinking, and associative-quantitative, which is based on Cartesian-Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause-effect relationships between events with no possible access to the understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

  13. From themes to hypotheses: following up with quantitative methods.

    PubMed

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.

  14. Comparison of DNA fragmentation and color thresholding for objective quantitation of apoptotic cells

    NASA Technical Reports Server (NTRS)

    Plymale, D. R.; Ng Tang, D. S.; Fermin, C. D.; Lewis, D. E.; Martin, D. S.; Garry, R. F.

    1995-01-01

    Apoptosis is a process of cell death characterized by distinctive morphological changes and fragmentation of cellular DNA. Using video imaging and color thresholding techniques, we objectively quantitated the number of cultured CD4+ T-lymphoblastoid cells (HUT78 cells, RH9 subclone) displaying morphological signs of apoptosis before and after exposure to gamma-irradiation. The numbers of apoptotic cells measured by objective video imaging techniques were compared to numbers of apoptotic cells measured in the same samples by sensitive apoptotic assays that quantitate DNA fragmentation. DNA fragmentation assays gave consistently higher values compared with the video imaging assays that measured morphological changes associated with apoptosis. These results suggest that substantial DNA fragmentation can precede or occur in the absence of the morphological changes which are associated with apoptosis in gamma-irradiated RH9 cells.

  15. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  16. Creating Objects and Object Categories for Studying Perception and Perceptual Learning

    PubMed Central

    Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay

    2012-01-01

    In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties. Many innovative and useful methods currently exist for creating novel objects and object categories (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection. 
Objects and object categories created

  17. Creating objects and object categories for studying perception and perceptual learning.

    PubMed

    Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay

    2012-11-02

    In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties. Many innovative and useful methods currently exist for creating novel objects and object categories (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection. 
Objects and object categories created by these simulations can

  18. Novel method for quantitative ANA measurement using near-infrared imaging.

    PubMed

    Peterson, Lisa K; Wells, Daniel; Shaw, Laura; Velez, Maria-Gabriela; Harbeck, Ronald; Dragone, Leonard L

    2009-09-30

    Antinuclear antibodies (ANA) have been detected in patients with systemic rheumatic diseases and are used in the screening and/or diagnosis of autoimmunity in patients as well as mouse models of systemic autoimmunity. Indirect immunofluorescence (IIF) on HEp-2 cells is the gold standard for ANA screening. However, its usefulness is limited in diagnosis, prognosis and monitoring of disease activity due to the lack of standardization in performing the technique, subjectivity in interpreting the results and the fact that it is only semi-quantitative. Various immunological techniques have been developed in an attempt to improve upon the method to quantify ANA, including enzyme-linked immunosorbent assays (ELISAs), line immunoassays (LIAs), multiplexed bead immunoassays and IIF on substrates other than HEp-2 cells. Yet IIF on HEp-2 cells remains the most common screening method for ANA. In this study, we describe a simple quantitative method to detect ANA which combines IIF on HEp-2 coated slides with analysis using a near-infrared imaging (NII) system. Using NII to determine ANA titer, 86.5% (32 of 37) of the titers for human patient samples were within 2 dilutions of those determined by IIF, which is the acceptable range for proficiency testing. Combining an initial screening for nuclear staining using microscopy with titration by NII resulted in 97.3% (36 of 37) of the titers detected to be within two dilutions of those determined by IIF. The NII method for quantitative ANA measurements using serum from both patients and mice with autoimmunity provides a fast, relatively simple, objective, sensitive and reproducible assay, which could easily be standardized for comparison between laboratories.
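    The "within two dilutions" agreement criterion mentioned above follows naturally from the serial two-fold dilution scheme of ANA titration, and can be checked on a log2 scale; the titer pairs below are illustrative:

```python
import math

def within_two_dilutions(titer_a, titer_b):
    """ANA titers come from serial two-fold dilutions (1:40, 1:80, ...),
    so 'within two dilutions' means at most 2 steps apart on a log2 scale."""
    return abs(math.log2(titer_a) - math.log2(titer_b)) <= 2

# Illustrative titer pairs (e.g., IIF vs. near-infrared imaging results).
agree = within_two_dilutions(160, 640)       # two dilutions apart
disagree = within_two_dilutions(40, 1280)    # five dilutions apart
```

    Applying this check across all patient samples yields the percentage-agreement figures of the kind reported in the abstract.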

  19. Chemoenzymatic method for glycomics: isolation, identification, and quantitation

    PubMed Central

    Yang, Shuang; Rubin, Abigail; Eshghi, Shadi Toghi; Zhang, Hui

    2015-01-01

    Over the past decade, considerable progress has been made with respect to the analytical methods for analysis of glycans from biological sources. Regardless of the specific methods that are used, glycan analysis includes isolation, identification, and quantitation. Derivatization is indispensable for improving glycan identification. Derivatization of glycans can be performed by permethylation or carbodiimide coupling/esterification. By introducing a fluorophore or chromophore at their reducing end, glycans can be separated by electrophoresis or chromatography. The fluorogenically labeled glycans can be quantitated using fluorescent detection. The recently developed approaches using solid phases, such as glycoprotein immobilization for glycan extraction and on-tissue glycan mass spectrometry imaging, demonstrate advantages over methods performed in solution. Derivatization of sialic acids is favorably implemented on the solid support using carbodiimide coupling, and the released glycans can be further modified at the reducing end or permethylated for quantitative analysis. In this review, methods for glycan isolation, identification, and quantitation are discussed. PMID:26390280

  20. A comparison of moving object detection methods for real-time moving object detection

    NASA Astrophysics Data System (ADS)

    Roshan, Aditya; Zhang, Yun

    2014-06-01

    Moving object detection has a wide variety of applications, from traffic monitoring, site monitoring, automatic theft identification, and face detection to military surveillance. Many methods have been developed across the globe for moving object detection, but it is very difficult to find one which can work in all situations and with different types of videos. The purpose of this paper is to evaluate existing moving object detection methods which can be implemented in software on a desktop or laptop for real-time object detection. There are several moving object detection methods noted in the literature, but few of them are suitable for real-time moving object detection. Most of the methods which provide for real-time operation are further limited by the number of objects and the scene complexity. This paper evaluates the four most commonly used moving object detection methods: the background subtraction technique, Gaussian mixture model, wavelet-based, and optical-flow-based methods. The work is based on evaluation of these four moving object detection methods using two different sets of cameras and two different scenes. The moving object detection methods were implemented using MATLAB, and results are compared based on completeness of detected objects, noise, light change sensitivity, processing time, etc. After comparison, it is observed that the optical-flow-based method took the least processing time and successfully detected the boundaries of moving objects, which also implies that it can be implemented for real-time moving object detection.
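    The simplest of the compared approaches, frame differencing (a basic form of background subtraction), can be sketched as follows; the synthetic frames and the fixed threshold are illustrative assumptions:

```python
import numpy as np

def frame_difference_mask(prev, curr, threshold=25):
    """Mark pixels whose intensity change between consecutive frames
    exceeds a fixed threshold (a minimal change-detection rule)."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return diff > threshold

# Synthetic frames: a bright 8x8 block moves 4 pixels to the right.
prev = np.zeros((64, 64), dtype=np.uint8)
curr = np.zeros((64, 64), dtype=np.uint8)
prev[20:28, 20:28] = 255
curr[20:28, 24:32] = 255
mask = frame_difference_mask(prev, curr)
```

    Note that only the leading and trailing edges of the moving block change between frames, which is why plain frame differencing detects object boundaries rather than full object interiors.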

  1. A method for three-dimensional quantitative observation of the microstructure of biological samples

    NASA Astrophysics Data System (ADS)

    Wang, Pengfei; Chen, Dieyan; Ma, Wanyun; Wu, Hongxin; Ji, Liang; Sun, Jialin; Lv, Danyu; Zhang, Lu; Li, Ying; Tian, Ning; Zheng, Jinggao; Zhao, Fengying

    2009-07-01

    Contemporary biology has entered the era of cell and molecular biology, in which researchers seek to study the mechanisms of biological phenomena at the microscopic level. Accurate description of the microstructure of biological samples is an urgent need in many biomedical experiments. This paper introduces a method for 3-dimensional quantitative observation of the microstructure of living biological samples based on two-photon laser scanning microscopy (TPLSM). TPLSM is a form of fluorescence microscopy notable for its low optical damage, high resolution, deep penetration depth, and suitability for 3-dimensional (3D) imaging. Fluorescently stained samples were observed by TPLSM, and their original shapes were then obtained through 3D image reconstruction. The spatial distribution of all objects in the samples, as well as their volumes, could be derived by image segmentation and mathematical calculation. In this way a 3-dimensional, quantitative depiction of the sample microstructure was derived. We applied this method to quantitative analysis of the spatial distribution of chromosomes in meiotic mouse oocytes at metaphase, with good results.
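    The segmentation-then-measurement step (object volumes and spatial distribution from a reconstructed 3D stack) can be sketched on a synthetic binary volume; the voxel size is an assumed placeholder:

```python
import numpy as np
from scipy import ndimage

# Synthetic binary stack standing in for a segmented TPLSM volume.
stack = np.zeros((20, 20, 20), dtype=bool)
stack[2:6, 2:6, 2:6] = True        # object 1: 4x4x4 voxels
stack[12:15, 12:15, 12:15] = True  # object 2: 3x3x3 voxels

# Label connected components, then measure volume and position per object.
labels, n_objects = ndimage.label(stack)
voxel_volume_um3 = 0.5 * 0.5 * 1.0  # illustrative voxel size (x, y, z in um)
volumes = ndimage.sum(stack, labels, index=range(1, n_objects + 1))
volumes_um3 = volumes * voxel_volume_um3
centroids = ndimage.center_of_mass(stack, labels, range(1, n_objects + 1))
```

    The per-object centroids describe the spatial distribution, and the voxel counts scaled by voxel size give the physical volumes.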

  2. Engineering Ethics Education : Its Necessity, Objectives, Methods, Current State, and Challenges

    NASA Astrophysics Data System (ADS)

    Fudano, Jun

    The importance of engineering ethics education has become widely recognized in industrialized countries, including Japan. This paper examines the background against which engineering ethics education is required, and reviews its objectives, methods, and challenges, as well as its current state. In pointing out important issues associated with the apparent acceptance and quantitative development of ethics education, especially after the establishment of the Japan Accreditation Board for Engineering Education in 1999, the author stresses that the most serious problem is the lack of common understanding of the objectives of engineering ethics education. As a strategy to improve the situation, the so-called “Ethics-across-the-Curriculum” approach is introduced. The author also claims that business/organization ethics consistent with engineering ethics should be promoted in Japan.

  3. Mapcurves: a quantitative method for comparing categorical maps.

    Treesearch

    William W. Hargrove; M. Hoffman Forrest; Paul F. Hessburg

    2006-01-01

    We present Mapcurves, a quantitative goodness-of-fit (GOF) method that unambiguously shows the degree of spatial concordance between two or more categorical maps. Mapcurves graphically and quantitatively evaluates the degree of fit among any number of maps and quantifies a GOF for each polygon, as well as for the entire map. The Mapcurves method indicates a perfect fit even if...

  4. Methods and strategies of object localization

    NASA Technical Reports Server (NTRS)

    Shao, Lejun; Volz, Richard A.

    1989-01-01

    An important property of an intelligent robot is to be able to determine the location of an object in 3-D space. A general object localization system structure is proposed, some important issues on localization discussed, and an overview given for current available object localization algorithms and systems. The algorithms reviewed are characterized by their feature extracting and matching strategies; the range finding methods; the types of locatable objects; and the mathematical formulating methods.

  5. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
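    Mathematically combining independent relative-uncertainty components typically means adding them in quadrature; the component values below are illustrative assumptions, not the paper's estimates:

```python
import math

# Illustrative relative-uncertainty components (as fractions) for a
# plate-count assay; values are assumed for demonstration only.
components = {
    "microorganism type": 0.15,
    "product matrix": 0.12,
    "reading/interpretation": 0.20,
}

# Under an independence assumption, components combine in quadrature
# (root sum of squares) to give the combined relative uncertainty.
combined = math.sqrt(sum(u ** 2 for u in components.values()))
```

    With these placeholder values the combined relative uncertainty comes out below the 35% bound the abstract cites for traditional plate-count methods.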

  6. Calibration methods influence quantitative material decomposition in photon-counting spectral CT

    NASA Astrophysics Data System (ADS)

    Curtis, Tyler E.; Roeder, Ryan K.

    2017-03-01

    Photon-counting detectors and nanoparticle contrast agents can potentially enable molecular imaging and material decomposition in computed tomography (CT). Material decomposition has been investigated using both simulated and acquired data sets. However, the effect of calibration methods on material decomposition has not been systematically investigated. Therefore, the objective of this study was to investigate the influence of the range and number of contrast agent concentrations within a modular calibration phantom on quantitative material decomposition. A commercially available photon-counting spectral micro-CT (MARS Bioimaging) was used to acquire images with five energy bins selected to normalize photon counts and leverage the contrast agent k-edge. Material basis matrix values were determined using multiple linear regression models, and material decomposition was performed using a maximum a posteriori estimator. The accuracy of quantitative material decomposition was evaluated by the root mean squared error (RMSE), specificity, sensitivity, and area under the curve (AUC). An increased maximum concentration (range) in the calibration significantly improved RMSE, specificity and AUC. The effects of an increased number of concentrations in the calibration were not statistically significant for the conditions in this study. The overall results demonstrated that the accuracy of quantitative material decomposition in spectral CT is significantly influenced by calibration methods, which must therefore be carefully considered for the intended diagnostic imaging application.
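    The calibration step (fitting concentration as a linear function of the per-bin measurements) can be sketched with a hypothetical data set; the concentrations, bin responses, and noise level are all illustrative assumptions, not the study's calibration:

```python
import numpy as np

# Hypothetical calibration: simulated signal in 5 energy bins for known
# contrast-agent concentrations (mg/mL); values are illustrative only.
concentrations = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0, 12.0, 16.0])
rng = np.random.default_rng(0)
bin_response = np.array([0.8, 1.1, 2.0, 1.4, 0.9])  # per-bin sensitivity
signals = (np.outer(concentrations, bin_response)
           + rng.normal(0, 0.05, (len(concentrations), 5)))

# Least-squares fit of concentration as a linear function of bin signals,
# analogous to the multiple linear regression used to build a basis matrix.
A = np.column_stack([signals, np.ones(len(concentrations))])
coef, *_ = np.linalg.lstsq(A, concentrations, rcond=None)

predicted = A @ coef
rmse = np.sqrt(np.mean((predicted - concentrations) ** 2))
```

    Varying the maximum concentration or the number of calibration points in such a setup is how one would probe the range and number effects the study investigates.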

  7. Object extraction method for image synthesis

    NASA Astrophysics Data System (ADS)

    Inoue, Seiki

    1991-11-01

    The extraction of component objects from images is fundamentally important for image synthesis. In TV program production, one useful method is the Video-Matte technique for specifying the necessary boundary of an object. This, however, involves some intricate and tedious manual processes. The new method proposed in this paper reduces the level of operator skill needed and simplifies object extraction: the object is extracted automatically from just a simple drawing of a thick boundary line. The basic principle is a thinning of the thick-boundary binary image guided by the edge intensity of the original image. This method has many practical advantages, including the simplicity of specifying an object, the high accuracy of the thinned boundary line, its ease of application to moving images, and the lack of any need for adjustment.
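A toy sketch of the idea (not the paper's exact algorithm): within the hand-drawn thick band, keep in each row the pixel where the original image's edge intensity is strongest, collapsing the band to a one-pixel boundary. The arrays below are invented:

```python
def thin_boundary(band, edge):
    """Collapse a thick boundary mask to one pixel per row by keeping,
    in each row, the band pixel with the strongest edge response."""
    h, w = len(band), len(band[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        cols = [x for x in range(w) if band[y][x]]
        if cols:
            out[y][max(cols, key=lambda x: edge[y][x])] = 1
    return out

band = [[0, 1, 1, 1, 0],   # operator's thick boundary stroke
        [0, 1, 1, 1, 0]]
edge = [[0, 2, 9, 3, 0],   # edge intensity of the original image
        [0, 1, 4, 8, 0]]
thin = thin_boundary(band, edge)
```

A production version would work along the band's medial axis rather than image rows, but the selection principle is the same.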

  8. An object tracking method based on guided filter for night fusion image

    NASA Astrophysics Data System (ADS)

    Qian, Xiaoyan; Wang, Yuedong; Han, Lei

    2016-01-01

    Online object tracking is a challenging problem, as it entails learning an effective model to account for appearance changes caused by intrinsic and extrinsic factors. In this paper, we propose a novel online object tracker using a guided image filter for accurate and robust tracking in night fusion images. Firstly, frame difference is applied to produce a coarse target, which helps to generate the observation models. Under the restriction of these models and the local source image, the guided filter generates a sufficient and accurate foreground target, from which accurate boundaries of the target can be extracted. Finally, timely updating of the observation models helps to avoid tracking drift. Both qualitative and quantitative evaluations on challenging image sequences demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.

  9. Applying Quantitative Genetic Methods to Primate Social Behavior

    PubMed Central

    Brent, Lauren J. N.

    2013-01-01

    Increasingly, behavioral ecologists have applied quantitative genetic methods to investigate the evolution of behaviors in wild animal populations. The promise of quantitative genetics in unmanaged populations is that it opens the door to simultaneous analysis of inheritance, phenotypic plasticity, and patterns of selection on behavioral phenotypes, all within the same study. In this article, we describe how quantitative genetic techniques provide studies of the evolution of behavior with information that is unique and valuable. We outline technical obstacles to applying quantitative genetic techniques that are of particular relevance to studies of behavior in primates, especially those living in noncaptive populations (e.g., the need for pedigree information and the non-Gaussian distributions of many behavioral phenotypes), and demonstrate that many of these barriers are now surmountable. We illustrate this by applying recent quantitative genetic methods to spatial proximity data, a simple and widely collected primate social behavior, from adult rhesus macaques on Cayo Santiago. Our analysis shows that proximity measures are consistent across repeated measurements on individuals (repeatable) and that kin have similar mean measurements (heritable). Quantitative genetics may hold lessons of considerable importance for studies of primate behavior, even those without a specific genetic focus. PMID:24659839
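Repeatability of a behavioral measure (the fraction of variance attributable to consistent differences among individuals) can be sketched with a one-way ANOVA intraclass correlation. The proximity scores below are invented, with a balanced design of k repeated measures per individual:

```python
def repeatability(groups):
    """One-way ANOVA intraclass correlation for a balanced design:
    r = (MSB - MSW) / (MSB + (k - 1) * MSW)."""
    n = len(groups)       # number of individuals
    k = len(groups[0])    # repeated measures per individual
    grand = sum(sum(g) for g in groups) / (n * k)
    means = [sum(g) / k for g in groups]
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2
              for g, m in zip(groups, means) for x in g) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Invented proximity scores: three macaques, three repeat measures each.
scores = [[5.0, 5.2, 4.8], [2.0, 2.1, 1.9], [8.0, 7.9, 8.1]]
r = repeatability(scores)  # near 1: individuals differ consistently
```

Heritability estimation additionally partitions the among-individual variance using the pedigree (e.g., an animal model), which is beyond this sketch.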

  10. Quantitative diagnostic method for biceps long head tendinitis by using ultrasound.

    PubMed

    Huang, Shih-Wei; Wang, Wei-Te

    2013-01-01

    To investigate the feasibility of a grayscale quantitative diagnostic method for biceps tendinitis and to determine the cut-off points of a quantitative biceps ultrasound (US) method for diagnosing biceps tendinitis. Design: prospective cross-sectional case-controlled study in an outpatient rehabilitation service. A total of 336 patients with shoulder pain and suspected biceps tendinitis were recruited in this prospective observational study. Grayscale pixel data of the region of interest (ROI) were obtained for both the transverse and longitudinal views of the biceps US. A total of 136 patients were classified as having biceps tendinitis and 200 patients as not having biceps tendinitis, based on the diagnostic criteria. Based on the Youden index, the cut-off points for the standard deviation (StdDev) of the ROI values were determined as 26.85 for the transverse view and 21.25 for the longitudinal view. When the ROI evaluation surpassed the cut-off point, the sensitivity was 68% and the specificity 90% for the StdDev of the transverse view, and the sensitivity was 81% and the specificity 73% for the StdDev of the longitudinal view. For equivocal cases or inexperienced sonographers, our study provides a more objective method for diagnosing biceps tendinitis in patients with shoulder pain.
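Cut-off selection via the Youden index can be sketched by scanning candidate thresholds and maximizing J = sensitivity + specificity − 1. The StdDev values below are invented, not the study's data:

```python
def best_cutoff(cases, controls):
    """Pick the threshold maximizing Youden's J; a sample is called
    positive when its value is >= the threshold."""
    best_t, best_j = None, -1.0
    for t in sorted(set(cases) | set(controls)):
        sens = sum(v >= t for v in cases) / len(cases)
        spec = sum(v < t for v in controls) / len(controls)
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Invented ROI StdDev values for tendinitis cases and healthy controls.
cases = [30, 28, 25, 35, 27]
controls = [20, 22, 26, 24, 19]
cutoff, j = best_cutoff(cases, controls)
```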

  11. Biological characteristics of crucian by quantitative inspection method

    NASA Astrophysics Data System (ADS)

    Chu, Mengqi

    2015-04-01

    Through a quantitative inspection method, the biological characteristics of the crucian carp were studied in a preliminary way. The crucian carp (Carassius auratus; order Cypriniformes, family Cyprinidae) is a mainly plant-eating, omnivorous fish that is gregarious and exhibits selection and ranking behavior. Crucian carp are widely distributed, and perennial waters all over the country yield production. Indicators were determined in the experiment to characterize the growth and reproduction of crucian carp in this area. Using the measured data (scale length, scale size, annulus diameter, and so on) and the related functions, the growth of a crucian carp in any given year can be calculated. Maturity was determined from egg shape, color, weight, and similar characters; the mean diameter of 20 eggs and the number of eggs per 0.5 g were used to calculate the relative and absolute fecundity of the fish. The measured crucian carp were females at puberty. From the relation between scale diameter and body length, a linear relationship was obtained: y = 1.530x + 3.0649. The data show that fecundity is closely related to age: the older the fish, the more mature the gonad development and the greater the number of eggs; in addition, absolute fecundity increases with development of the pituitary gland. Quantitative inspection of the food objects in the gut revealed the main, secondary, and incidental foods of crucian carp, and clarified the degree to which crucian carp prefer various bait organisms. Fish fecundity increases with weight gain; it is a characteristic of species and populations and is also influenced by individual age, body length, body weight, environmental conditions (especially nutritional conditions), breeding habits, spawning frequency, and egg size. This series of studies of the biological characteristics of the crucian carp provides an ecological basis for local crucian carp feeding and breeding

  12. Targeted methods for quantitative analysis of protein glycosylation

    PubMed Central

    Goldman, Radoslav; Sanda, Miloslav

    2018-01-01

    Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218

  13. Revisiting the Quantitative-Qualitative Debate: Implications for Mixed-Methods Research

    PubMed Central

    SALE, JOANNA E. M.; LOHFELD, LYNNE H.; BRAZIL, KEVIN

    2015-01-01

    Health care research includes many studies that combine quantitative and qualitative methods. In this paper, we revisit the quantitative-qualitative debate and review the arguments for and against using mixed-methods. In addition, we discuss the implications stemming from our view, that the paradigms upon which the methods are based have a different view of reality and therefore a different view of the phenomenon under study. Because the two paradigms do not study the same phenomena, quantitative and qualitative methods cannot be combined for cross-validation or triangulation purposes. However, they can be combined for complementary purposes. Future standards for mixed-methods research should clearly reflect this recommendation. PMID:26523073

  14. Mathematical modelling and quantitative methods.

    PubMed

    Edler, L; Poirier, K; Dourson, M; Kleiner, J; Mileson, B; Nordmann, H; Renwick, A; Slob, W; Walton, K; Würtzen, G

    2002-01-01

    The present review reports on the mathematical methods and statistical techniques presently available for hazard characterisation. The state of the art of mathematical modelling and quantitative methods used currently for regulatory decision-making in Europe and additional potential methods for risk assessment of chemicals in food and diet are described. Existing practices of JECFA, FDA, EPA, etc., are examined for their similarities and differences. A framework is established for the development of new and improved quantitative methodologies. Areas for refinement, improvement and increase of efficiency of each method are identified in a gap analysis. Based on this critical evaluation, needs for future research are defined. It is concluded from our work that mathematical modelling of the dose-response relationship would improve the risk assessment process. An adequate characterisation of the dose-response relationship by mathematical modelling clearly requires the use of a sufficient number of dose groups to achieve a range of different response levels. This need not necessarily lead to an increase in the total number of animals in the study if an appropriate design is used. Chemical-specific data relating to the mode or mechanism of action and/or the toxicokinetics of the chemical should be used for dose-response characterisation whenever possible. It is concluded that a single method of hazard characterisation would not be suitable for all kinds of risk assessments, and that a range of different approaches is necessary so that the method used is the most appropriate for the data available and for the risk characterisation issue. Future refinements to dose-response characterisation should incorporate more clearly the extent of uncertainty and variability in the resulting output.
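As a toy illustration of dose-response characterization (not a specific method from the review), the dose producing a target response level can be interpolated from observed dose groups; a model-based fit would replace the linear interpolation in practice. The data are invented:

```python
def dose_at_response(doses, responses, target):
    """Linearly interpolate the dose giving `target` response,
    assuming responses increase monotonically with dose."""
    for i in range(len(doses) - 1):
        r0, r1 = responses[i], responses[i + 1]
        if r0 <= target <= r1:
            frac = (target - r0) / (r1 - r0)
            return doses[i] + frac * (doses[i + 1] - doses[i])
    raise ValueError("target outside observed response range")

# Invented dose groups (mg/kg) and incidence of the response.
doses = [0.0, 10.0, 50.0, 100.0]
responses = [0.00, 0.05, 0.20, 0.40]
bmd10 = dose_at_response(doses, responses, 0.10)  # dose at 10% response
```

The review's point about needing enough dose groups shows up directly here: with too few groups, the bracketing interval is wide and the estimate is poorly constrained.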

  15. A Quantitative Analysis Method of Trabecular Pattern in a Bone

    NASA Astrophysics Data System (ADS)

    Idesawa, Masanor; Yatagai, Toyohiko

    1982-11-01

    The orientation and density of the trabecular pattern observed in a bone are closely related to its mechanical properties, and diseases of a bone appear as changes in the orientation and/or density distribution of its trabecular patterns. These properties have so far been treated only qualitatively because no quantitative analysis method had been established. In this paper, the authors propose and investigate quantitative methods for analyzing the density and orientation of the trabecular patterns observed in a bone. These methods provide an index for evaluating the orientation of a trabecular pattern quantitatively; they have been applied to the analysis of trabecular patterns observed in a femoral head, and their usefulness is confirmed. Key words: index of pattern orientation, trabecular pattern, pattern density, quantitative analysis

  16. [The development and validation of the methods for the quantitative determination of sibutramine derivatives in dietary supplements].

    PubMed

    Stern, K I; Malkova, T L

    The objective of the present study was the development and validation of methods for the quantitative determination of the demethylated derivatives of sibutramine, desmethyl sibutramine and didesmethyl sibutramine. Gas-liquid chromatography with a flame ionization detector was used for the quantitative determination of the above substances in dietary supplements. Conditions were proposed for the chromatographic determination of the analytes in the presence of the reference standard, methyl stearate, allowing efficient separation to be achieved. The method has the necessary sensitivity, specificity, linearity, accuracy, and precision (on an intra-day and inter-day basis), which indicates its good validation characteristics. The proposed method can be employed in analytical laboratories for the quantitative determination of sibutramine derivatives in biologically active dietary supplements.

  17. Quantitative mass spectrometry methods for pharmaceutical analysis

    PubMed Central

    Loos, Glenn; Van Schepdael, Ann

    2016-01-01

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance the ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With the focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods are generally aimed for to provide automated multi-residue analysis, (less sensitive) miniaturized set-ups have a great potential due to their ability for in-field usage. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644982

  18. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    PubMed

    Chavez, Juan D; Eng, Jimmy K; Schweppe, Devin K; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E

    2016-01-01

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as x-ray crystallography, NMR and cryo-electron microscopy[1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise ultimately limiting more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open source software package Skyline, for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that is supported by and validates previously published data on quantified cross-linked peptide pairs. This advance provides an easy to use resource so that any lab with access to a LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  19. Structured decision making as a method for linking quantitative decision support to community fundamental objectives

    EPA Science Inventory

    Decision support intended to improve ecosystem sustainability requires that we link stakeholder priorities directly to quantitative tools and measures of desired outcomes. Actions taken at the community level can have large impacts on production and delivery of ecosystem service...

  20. A Spectral Method for Color Quantitation of a Protein Drug Solution.

    PubMed

    Swartz, Trevor E; Yin, Jian; Patapoff, Thomas W; Horst, Travis; Skieresz, Susan M; Leggett, Gordon; Morgan, Charles J; Rahimi, Kimia; Marhoul, Joseph; Kabakoff, Bruce

    2016-01-01

    Color is an important quality attribute for biotherapeutics. In the biotechnology industry, a visual method is most commonly utilized for color characterization of liquid protein drug solutions. The color testing method is used for both batch release and stability testing in quality control. Using that method, an analyst visually determines the color of the sample by choosing the closest matching European Pharmacopoeia reference color solution. The requirement to judge the best match makes it a subjective method. Furthermore, the visual method does not capture data on hue or chroma that would allow for improved product characterization and the ability to detect subtle differences between samples. To overcome these challenges, we describe a quantitative method for color determination that greatly reduces the variability in measuring color and allows for a more precise understanding of color differences. Following color industry standards established by the International Commission on Illumination, this method converts a protein solution's visible absorption spectrum to L*a*b* color space. Color matching is achieved within the L*a*b* color space, a practice that is already widely used in other industries. The work performed here is to facilitate the adoption of, and transition from, the traditional visual assessment method to a quantitative spectral method. We describe the algorithm used such that the quantitative spectral method correlates with the currently used visual method. In addition, we provide the L*a*b* values for the European Pharmacopoeia reference color solutions required for the quantitative method. We determined these L*a*b* values by gravimetrically preparing and measuring multiple lots of the reference color solutions. We demonstrate that the visual assessment and the quantitative spectral method are comparable using both low- and high-concentration antibody solutions and solutions with varying turbidity.
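The spectral route ends in a standard XYZ to L*a*b* conversion (CIE 1976 formulas). A sketch with a D65/2° white point; the tristimulus inputs would in practice come from weighting the measured transmittance spectrum by the CIE observer functions and illuminant:

```python
def xyz_to_lab(X, Y, Z, white=(95.047, 100.0, 108.883)):  # D65/2° white
    """CIE 1976 L*a*b* from tristimulus values."""
    def f(t):
        delta = 6 / 29
        return t ** (1 / 3) if t > delta ** 3 else t / (3 * delta ** 2) + 4 / 29
    fx, fy, fz = (f(v / n) for v, n in zip((X, Y, Z), white))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

# A perfectly colorless, fully transmitting sample reproduces the white
# point: L* = 100, a* = b* = 0.
L, a, b = xyz_to_lab(95.047, 100.0, 108.883)
```

Color differences between a sample and the nearest reference solution can then be expressed as Euclidean distance (ΔE) in L*a*b* space.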

  1. A New Green Method for the Quantitative Analysis of Enrofloxacin by Fourier-Transform Infrared Spectroscopy.

    PubMed

    Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes

    2018-05-18

    Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.
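LOD and LOQ are commonly estimated from the calibration statistics as LOD = 3.3σ/S and LOQ = 10σ/S (σ: residual standard deviation, S: calibration slope), per ICH Q2 guidance. A sketch with invented numbers, not the paper's σ and slope:

```python
def lod_loq(sigma, slope):
    """ICH Q2(R1)-style detection and quantitation limits."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Invented calibration statistics for illustration only.
lod, loq = lod_loq(sigma=0.03, slope=1.0)
```

Note the fixed LOQ/LOD ratio of 10/3.3 ≈ 3, consistent with the reported 0.36/0.12 mg.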

  2. A novel method for morphological pleomorphism and heterogeneity quantitative measurement: Named cell feature level co-occurrence matrix.

    PubMed

    Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro

    2016-01-01

    Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological approach to pathology diagnosis, which can connect these molecular data with clinical diagnosis, is still mostly subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts remain largely non-quantitative, and their associated data measurements depend on the pathologist's sense and experience. Such features include pleomorphism and heterogeneity. In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity, using a cell-level co-occurrence matrix. Our method is based on the widely used gray-level co-occurrence matrix (GLCM), in which relations between neighboring pixel intensity levels are captured in a co-occurrence matrix, followed by the application of analysis functions such as Haralick features. In a pathological tissue image, through image processing techniques, each nucleus can be measured, and each nucleus has its own measurable features such as size, roundness, contour length, and intra-nucleus texture data (GLCM is one such method). In our approach, each nucleus in the tissue image plays the role of one pixel in GLCM, so the most important point is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, then create the co-occurrence matrix and apply Haralick feature functions, so that pleomorphism and heterogeneity are determined quantitatively for each image. Because one pixel corresponds to one nucleus feature in our method, we named it the Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method on several nucleus features, and CFLCM proved to be a useful quantitative method for evaluating pleomorphism and heterogeneity in histopathological images.
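The pixel-level GLCM on which CFLCM builds can be sketched in a few lines: a 4-level image, a horizontal offset of one pixel, and the Haralick contrast feature (example values invented). CFLCM replaces pixels with nuclei and intensity levels with binned nucleus features, but the matrix mechanics are the same:

```python
def glcm(img, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix for one offset."""
    m = [[0.0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                m[img[y][x]][img[ny][nx]] += 1
    total = sum(sum(row) for row in m)
    return [[v / total for v in row] for row in m]

def contrast(p):
    """Haralick contrast: sum over i,j of p(i,j) * (i - j)**2."""
    n = len(p)
    return sum(p[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [2, 2, 3, 3],
       [2, 2, 3, 3]]
c = contrast(glcm(img))  # only the 0-1 and 2-3 transitions contribute
```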

  3. Quantitative analysis of fracture surface by roughness and fractal method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X.W.; Tian, J.F.; Kang, Y.

    1995-09-01

    In recent years there has been extensive research and great development in quantitative fractography, which acts as an integral part of fractographic analysis. A prominent technique for studying the fracture surface is based on fracture profile generation, and the major means for characterizing the profile quantitatively are roughness and fractal methods. In this way, quantitative indexes such as the roughness parameters R_L for the profile and R_S for the surface, and the fractal dimensions D_L for the profile and D_S for the surface, can be measured. Given the relationships between these indexes and the mechanical properties of materials, it is possible to achieve the goal of protecting materials from fracture. But, as the case stands, the theory and experimental technology of quantitative fractography are still imperfect and remain to be studied further. Recently, Gokhale and Underwood et al. proposed an assumption-free method for estimating the surface roughness by vertically sectioning the fracture surface with sections at an angle of 120 deg to each other, which can be expressed as R_S = ⟨R_L·Ψ⟩, where Ψ is the profile structure factor and ⟨·⟩ denotes averaging over the sections. This method is based on classical stereological principles and was verified with the aid of computer simulations for some ruled surfaces. The results are considered applicable to fracture surfaces of arbitrary complexity and anisotropy. In order to extend the detailed application of this method in quantitative fractography, the authors studied roughness and fractal methods based on it by performing quantitative measurements on some typical low-temperature impact fractures.
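The profile roughness parameter R_L is the true arc length of the fracture profile divided by its projected length. A sketch on an invented sawtooth profile:

```python
import math

def profile_roughness(xs, ys):
    """R_L: arc length of the fracture profile over its projected length."""
    arc = sum(math.hypot(xs[i + 1] - xs[i], ys[i + 1] - ys[i])
              for i in range(len(xs) - 1))
    return arc / (xs[-1] - xs[0])

# Invented sawtooth profile: every segment rises or falls at 45 degrees,
# so R_L should equal sqrt(2).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 1.0, 0.0, 1.0, 0.0]
rl = profile_roughness(xs, ys)
```

The surface estimate then averages R_L·Ψ over the three vertical sections, per the Gokhale-Underwood relation quoted above.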

  4. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    PubMed

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic analysis method for OCT images, for computer-aided disease diagnosis, which is a critical part of eye fundus diagnosis. This study analyzed 300 OCT images acquired with an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). Firstly, a normal retinal reference model based on retinal boundaries was presented. Subsequently, two kinds of quantitative methods, based on geometric features and on morphological features, were proposed. The paper puts forward a retinal-abnormality grading decision-making method that was used in the actual analysis and evaluation of multiple OCT images, and shows the detailed analysis process on four retinal OCT images with different degrees of abnormality. The final grading results verified that the analysis method can distinguish abnormal severity and lesion regions. In a simulation on the 150 test images, analysis of retinal status showed a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status: it obtains parameters and features associated with retinal morphology, and quantitative analysis and evaluation of these features, combined with the reference model, enable abnormality judgment of the target image and provide a reference for disease diagnosis.

  5. A novel semi-quantitative method for measuring tissue bleeding.

    PubMed

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula: N = Bt x 100 / At where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples.
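With the "polyline" regions exported as vertex lists, the same percentage can be computed with the shoelace formula rather than inside AutoCAD. All coordinates below are invented:

```python
def polygon_area(pts):
    """Shoelace formula for a simple polygon given as (x, y) vertices."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

sample = [(0, 0), (10, 0), (10, 10), (0, 10)]   # whole tissue section, A
bleeds = [[(1, 1), (3, 1), (3, 3), (1, 3)],     # bleeding regions, B
          [(5, 5), (7, 5), (7, 6), (5, 6)]]
A = polygon_area(sample)
B = sum(polygon_area(p) for p in bleeds)
N = B * 100.0 / A   # percentage of the section affected by bleeding
```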

  6. A collimator optimization method for quantitative imaging: application to Y-90 bremsstrahlung SPECT.

    PubMed

    Rong, Xing; Frey, Eric C

    2013-08-01

    The authors simulated multiple tumors of various sizes in the liver. They realistically simulated human anatomy using a digital phantom, and the image formation process using a previously validated and computationally efficient method for modeling the image-degrading effects, including object scatter, attenuation, and the full collimator-detector response (CDR). The scatter kernels and CDR function tables used in the modeling method were generated using a previously validated Monte Carlo simulation code. The hole length, hole diameter, and septal thickness of the obtained optimal collimator were 84, 3.5, and 1.4 mm, respectively. Compared to a commercial high-energy general-purpose collimator, the optimal collimator improved the resolution and figure of merit (FOM) by 27% and 18%, respectively. The proposed collimator optimization method may be useful for improving quantitative SPECT imaging for radionuclides with complex energy spectra. The obtained optimal collimator provided a substantial improvement in quantitative performance for the microsphere radioembolization task considered.

  7. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    PubMed

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples' experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  8. Method and System for Object Recognition Search

    NASA Technical Reports Server (NTRS)

    Duong, Tuan A. (Inventor); Duong, Vu A. (Inventor); Stubberud, Allen R. (Inventor)

    2012-01-01

    A method for object recognition using shape and color features of the object to be recognized. An adaptive architecture is used to recognize and adapt the shape and color features for moving objects to enable object recognition.

  9. Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface

    DTIC Science & Technology

    2017-02-01

    COMPARATIVE EVALUATION OF QUANTITATIVE TEST METHODS FOR GASES ON A HARD SURFACE (ECBC-TR-1426; Vipin Rastogi et al.). From the report's experimental design: each quantitative method was performed three times on three consecutive days.

  10. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
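    The per-pair computation described in this record (voltage difference divided by the known antenna spacing) can be sketched as follows; the function name and values are illustrative, not from the patent:

```python
def field_estimates(voltages, positions):
    """Estimate electric-field components between adjacent antenna pairs.

    voltages:  measured potentials (V) at each antenna
    positions: antenna coordinates (m) along one dimension (known spacings)
    Returns one field estimate (V/m) per adjacent pair: -dV / dx.
    """
    estimates = []
    for i in range(len(voltages) - 1):
        dv = voltages[i + 1] - voltages[i]
        dx = positions[i + 1] - positions[i]
        estimates.append(-dv / dx)  # E points from high to low potential
    return estimates

# A uniform 100 V/m field along x gives V(x) = -100 * x (up to a constant):
print(field_estimates([0.0, -50.0, -100.0], [0.0, 0.5, 1.0]))
```

    The plurality of per-pair quantities, taken over the whole array, is what quantitatively describes the field in the region.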

  12. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    PubMed Central

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
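    The two-step point counting described above follows standard stereological estimators (a Cavalieri-type volume estimate, then point-count tissue fractions); a minimal sketch with hypothetical grid parameters:

```python
def cavalieri_volume(points_per_section, area_per_point, section_spacing):
    """Volume estimate: section spacing x area represented by one grid
    point x total number of points hitting the defect (Cavalieri principle)."""
    return section_spacing * area_per_point * sum(points_per_section)

def tissue_fractions(category_counts):
    """Step 2: fraction of each tissue category = points on that
    category / all points counted."""
    total = sum(category_counts.values())
    return {name: count / total for name, count in category_counts.items()}

# 3 sections, 0.01 mm^2 per grid point, 0.5 mm apart:
volume = cavalieri_volume([10, 12, 8], 0.01, 0.5)
fractions = tissue_fractions({"hyaline": 30, "fibrocartilage": 10, "fibrous": 10})
```

    The fractions sum to 1 by construction, which is what makes the composition analysis directly comparable across defects.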

  13. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  14. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After extensive kinetic studies, PE (Perkin-Elmer) found a linear relation between the initial template number and the cycle at which the accumulating fluorescent product becomes detectable, and on this basis developed the quantitative PCR technique used in the PE7700 and PE5700. However, the error of this technique is too large for biotechnology development and clinical research, so a better quantitative PCR technique is needed. The mathematical model submitted here draws on results from related fields and is based on the PCR principle and a careful analysis of the molecular relationships among the main members of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity), the initial template number, and the other reaction conditions, and accurately reflects how PCR product molecules accumulate. Accurate quantitative PCR analysis can be performed using this functional relation, and the accumulated product quantity can be obtained from the initial template number. With this model, the result error depends only on the accuracy of the measured fluorescence intensity, i.e. on the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template number lies between 100 and 1,000,000, the quantitative accuracy exceeds 99%. Under the same conditions and on the same instrument, different analysis methods give distinctly different errors; processing the data with the proposed quantitative analysis system yields roughly 80 times the accuracy of the CT method.
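    For contrast, the conventional Ct-based quantification that the article criticizes can be sketched as a log-linear standard-curve fit; a minimal illustration with hypothetical dilution-series data:

```python
import math

def fit_standard_curve(log10_n0, ct_values):
    """Least-squares fit of Ct = slope * log10(N0) + intercept
    from a dilution series of standards with known template numbers."""
    n = len(log10_n0)
    mx = sum(log10_n0) / n
    my = sum(ct_values) / n
    sxx = sum((x - mx) ** 2 for x in log10_n0)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_n0, ct_values))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Initial template number inferred from an unknown sample's Ct."""
    return 10 ** ((ct - intercept) / slope)
```

    With perfect doubling each cycle the slope is -1/log10(2), about -3.32 cycles per decade; real reactions deviate from this ideal, which is one source of the error the article discusses.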

  15. Introduction of a method for quantitative evaluation of spontaneous motor activity development with age in infants.

    PubMed

    Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter

    2012-04-01

    Coordination between perception and action is required to interact with the environment successfully. This is already trained by very young infants who perform spontaneous movements to learn how their body interacts with the environment. The strategies used by the infants for this purpose change with age. Therefore, very early progress in action control can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced, which allows the quantitative evaluation of the development of spontaneous motor activity in newborns. The introduced methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies like infantile cerebral palsy influence development of motor activity significantly. Since the introduced methodology is objective and quantitative, it is suitable to monitor how newborns train their cognitive processes, which will enable them to cope with their environment by motor interaction.
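    Movement-based parameters of the kind described (computed from acquired 3D foot trajectories) can be illustrated with generic examples; the study's actual parameter set is not specified here, so these two are purely illustrative:

```python
import math

def path_length(traj):
    """Total 3D path length of a foot-marker trajectory,
    given as a list of (x, y, z) sample points."""
    return sum(math.dist(p, q) for p, q in zip(traj, traj[1:]))

def mean_speed(traj, dt):
    """Average speed over the trajectory, with sampling interval dt."""
    return path_length(traj) / (dt * (len(traj) - 1))
```

    Any parameter computed this way is objective in the sense the article intends: it depends only on the recorded trajectory, not on an observer's rating.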

  16. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to reach 30-50 target molecules, which is close to theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7% to 18%, with an average of 2 +/- 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials.
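    The stated dependence of the LOQ on genome size follows from counting genome copies in 200 ng of DNA. A back-of-the-envelope sketch (assumed average base-pair mass and rough haploid genome sizes; it reproduces the quoted LOQs only to order of magnitude, since ploidy and calibration details also matter):

```python
AVOGADRO = 6.022e23
BP_MASS = 650.0  # g/mol per double-stranded base pair (average, assumed)

def genome_copies(dna_ng, genome_bp):
    """Number of haploid genome copies in `dna_ng` nanograms of genomic DNA."""
    return dna_ng * 1e-9 * AVOGADRO / (genome_bp * BP_MASS)

def loq_percent(loq_copies, dna_ng, genome_bp):
    """Smallest quantifiable GMO fraction (%), given an LOQ in target copies."""
    return 100.0 * loq_copies / genome_copies(dna_ng, genome_bp)

# Rice (~0.43 Gbp): 200 ng holds roughly 4e5 genome copies -> LOQ ~ 0.01%
# Wheat (~16 Gbp):  200 ng holds roughly 1e4 genome copies -> LOQ ~ 0.3%
```

    The larger the genome, the fewer copies fit into 200 ng, so the same 30-50-molecule detection limit corresponds to a larger relative LOQ.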

  17. Quantitative endoscopy: initial accuracy measurements.

    PubMed

    Truitt, T O; Adelman, R A; Kelly, D H; Willging, J P

    2000-02-01

    The geometric optics of an endoscope can be used to determine the absolute size of an object in an endoscopic field without knowing the actual distance from the object. This study explores the accuracy of a technique that estimates absolute object size from endoscopic images. Quantitative endoscopy involves calibrating a rigid endoscope to produce size estimates from 2 images taken with a known traveled distance between the images. The heights of 12 samples, ranging in size from 0.78 to 11.80 mm, were estimated with this calibrated endoscope. Backup distances of 5 mm and 10 mm were used for comparison. The mean percent error for all estimated measurements when compared with the actual object sizes was 1.12%. The mean errors for 5-mm and 10-mm backup distances were 0.76% and 1.65%, respectively. The mean errors for objects <2 mm and ≥2 mm were 0.94% and 1.18%, respectively. Quantitative endoscopy estimates endoscopic image size to within 5% of the actual object size. This method remains promising for quantitatively evaluating object size from endoscopic images. It does not require knowledge of the absolute distance of the endoscope from the object; only the distance traveled by the endoscope between images is needed.
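    Under a simple pinhole/thin-lens model (an assumption; the study's actual calibration procedure is not described in this abstract), the two-image size estimate works out as:

```python
def object_height(y1, y2, backup, focal=1.0):
    """Absolute object height from two image heights taken `backup` apart
    along the optical axis.  Pinhole model: y = f*h/z, hence
    1/y2 - 1/y1 = backup / (f*h)  =>  h = backup / (f * (1/y2 - 1/y1)).
    The object's absolute distance z cancels and never appears."""
    return backup / (focal * (1.0 / y2 - 1.0 / y1))
```

    For example, a 5 mm object imaged (with f = 1) from 20 mm and then, after a 10 mm backup, from 30 mm gives y1 = 0.25 and y2 = 1/6, from which the formula recovers 5 mm without ever knowing the 20 mm starting distance.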

  18. Method for imaging a concealed object

    DOEpatents

    Davidson, James R [Idaho Falls, ID; Partin, Judy K [Idaho Falls, ID; Sawyers, Robert J [Idaho Falls, ID

    2007-07-03

    A method for imaging a concealed object is described and which includes a step of providing a heat radiating body, and wherein an object to be detected is concealed on the heat radiating body; imaging the heat radiating body to provide a visibly discernible infrared image of the heat radiating body; and determining if the visibly discernible infrared image of the heat radiating body is masked by the presence of the concealed object.

  19. Quantitative Assessment of Transportation Network Vulnerability with Dynamic Traffic Simulation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shekar, Venkateswaran; Fiondella, Lance; Chatterjee, Samrat

    Transportation networks are critical to the social and economic function of nations. Given the continuing increase in the populations of cities throughout the world, the criticality of transportation infrastructure is expected to increase. Thus, it is ever more important to mitigate congestion as well as to assess the impact disruptions would have on individuals who depend on transportation for their work and livelihood. Moreover, several government organizations are responsible for ensuring transportation networks are available despite the constant threat of natural disasters and terrorist activities. Most of the previous transportation network vulnerability research has been performed in the context of static traffic models, many of which are formulated as traditional optimization problems. However, transportation networks are dynamic because their usage varies over time. Thus, more appropriate methods to characterize the vulnerability of transportation networks should consider their dynamic properties. This paper presents a quantitative approach to assess the vulnerability of a transportation network to disruptions with methods from traffic simulation. Our approach can prioritize the critical links over time and is generalizable to the case where both link and node disruptions are of concern. We illustrate the approach through a series of examples. Our results demonstrate that the approach provides quantitative insight into the time varying criticality of links. Such an approach could be used as the objective function of less traditional optimization methods that use simulation and other techniques to evaluate the relative utility of a particular network defense to reduce vulnerability and increase resilience.

  20. Advancing the study of violence against women using mixed methods: integrating qualitative methods into a quantitative research program.

    PubMed

    Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol

    2011-02-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.

  1. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    PubMed

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-02

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data using both controlled experiments with known quantitative differences for specific proteins used as standards as well as "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and is straightforward to apply.

  2. Quantitative 4D Transcatheter Intraarterial Perfusion MR Imaging as a Method to Standardize Angiographic Chemoembolization Endpoints

    PubMed Central

    Jin, Brian; Wang, Dingxin; Lewandowski, Robert J.; Ryu, Robert K.; Sato, Kent T.; Larson, Andrew C.; Salem, Riad; Omary, Reed A.

    2011-01-01

    PURPOSE We aimed to test the hypothesis that subjective angiographic endpoints during transarterial chemoembolization (TACE) of hepatocellular carcinoma (HCC) exhibit consistency and correlate with objective intraprocedural reductions in tumor perfusion as determined by quantitative four dimensional (4D) transcatheter intraarterial perfusion (TRIP) magnetic resonance (MR) imaging. MATERIALS AND METHODS This prospective study was approved by the institutional review board. Eighteen consecutive patients underwent TACE in a combined MR/interventional radiology (MR-IR) suite. Three board-certified interventional radiologists independently graded the angiographic endpoint of each procedure based on a previously described subjective angiographic chemoembolization endpoint (SACE) scale. A consensus SACE rating was established for each patient. Patients underwent quantitative 4D TRIP-MR imaging immediately before and after TACE, from which mean whole tumor perfusion (Fρ) was calculated. Consistency of SACE ratings between observers was evaluated using the intraclass correlation coefficient (ICC). The relationship between SACE ratings and intraprocedural TRIP-MR imaging perfusion changes was evaluated using Spearman’s rank correlation coefficient. RESULTS The SACE rating scale demonstrated very good consistency among all observers (ICC = 0.80). The consensus SACE rating was significantly correlated with both absolute (r = 0.54, P = 0.022) and percent (r = 0.85, P < 0.001) intraprocedural perfusion reduction. CONCLUSION The SACE rating scale demonstrates very good consistency between raters, and significantly correlates with objectively measured intraprocedural perfusion reductions during TACE. These results support the use of the SACE scale as a standardized alternative method to quantitative 4D TRIP-MR imaging to classify patients based on embolic endpoints of TACE. PMID:22021520

  4. Holographic quantitative imaging of sample hidden by turbid medium or occluding objects

    NASA Astrophysics Data System (ADS)

    Bianco, V.; Miccio, L.; Merola, F.; Memmolo, P.; Gennari, O.; Paturzo, Melania; Netti, P. A.; Ferraro, P.

    2015-03-01

    Digital Holography (DH) numerical procedures have been developed to allow imaging through turbid media. A fluid is considered turbid when dispersed particles provoke strong light scattering, thus destroying the image formation by any standard optical system. Here we show that sharp amplitude imaging and phase-contrast mapping of objects hidden behind a turbid medium and/or occluding objects are possible in harsh noise conditions and with a large field of view by Multi-Look DH microscopy. In particular, it will be shown that both amplitude imaging and phase-contrast mapping of cells hidden behind a flow of Red Blood Cells can be obtained. This allows, in a noninvasive way, the quantitative evaluation of living processes in Lab on Chip platforms where conventional microscopy techniques fail. The combination of this technique with endoscopic imaging can pave the way for holographic blood vessel inspection, e.g. to look for settled cholesterol plaques as well as blood clots for rapid diagnostics of blood diseases.

  5. Objective and quantitative equilibriometric evaluation of individual locomotor behaviour in schizophrenia: Translational and clinical implications.

    PubMed

    Haralanov, Svetlozar; Haralanova, Evelina; Milushev, Emil; Shkodrova, Diana; Claussen, Claus-Frenz

    2018-04-17

    Psychiatry is the only medical specialty that lacks clinically applicable biomarkers for objective evaluation of the existing pathology at a single-patient level. On the basis of an original translational equilibriometric method for evaluation of movement patterns, we have introduced into the everyday clinical practice of psychiatry an easy-to-perform computerized objective quantification of individual locomotor behaviour during execution of the Unterberger stepping test. Over the last 20 years, we have gradually collected a large database of more than 1000 schizophrenic patients, their relatives, and matched psychiatric, neurological, and healthy controls via cross-sectional and longitudinal investigations. Comparative analyses revealed transdiagnostic locomotor similarities among schizophrenic patients, high-risk schizotaxic individuals, and neurological patients with multiple sclerosis and cerebellar ataxia, thus suggesting common underlying brain mechanisms. In parallel, intradiagnostic dissimilarities were revealed, which make it possible to separate out subclinical locomotor subgroups within the diagnostic categories. Prototypical qualitative (dysmetric and ataxic) locomotor abnormalities in schizophrenic patients were differentiated from 2 atypical quantitative ones, manifested as either hypolocomotion or hyperlocomotion. Theoretical analyses suggested that these 3 subtypes of locomotor abnormalities could be conceived as objectively measurable biomarkers of 3 schizophrenic subgroups with dissimilar brain mechanisms, which require different treatment strategies. Analogies with the prominent role of locomotor measures in some well-known animal models of mental disorders advocate for promising objective translational research in the so far over-subjective field of psychiatry. Distinctions among prototypical, atypical, and diagnostic biomarkers, as well as between neuromotor and psychomotor locomotor abnormalities, are discussed. Conclusions are drawn about the

  6. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  7. A new, objective, quantitative scale for measuring local skin responses following topical actinic keratosis therapy with ingenol mebutate.

    PubMed

    Rosen, Robert; Marmur, Ellen; Anderson, Lawrence; Welburn, Peter; Katsamas, Janelle

    2014-12-01

    Local skin responses (LSRs) are the most common adverse effects of topical actinic keratosis (AK) therapy. There is currently no method available that allows objective characterization of LSRs. Here, the authors describe a new scale developed to quantitatively and objectively assess the six most common LSRs resulting from topical AK therapy with ingenol mebutate. The LSR grading scale was developed using a 0-4 numerical rating, with clinical descriptors and representative photographic images for each rating. Good inter-observer grading concordance was demonstrated in peer review during development of the tool. Data on the use of the scale are described from four phase III double-blind studies of ingenol mebutate (n = 1,005). LSRs peaked on days 4 (face/scalp) or 8 (trunk/extremities), with mean maximum composite LSR scores of 9.1 and 6.8, respectively, and a rapid return toward baseline by day 15 in most cases. Mean composite LSR score at day 57 was generally lower than at baseline. The LSR grading scale is an objective tool allowing practicing dermatologists to characterize and compare LSRs to existing and, potentially, future AK therapies.

  8. Quantitative Method of Measuring Metastatic Activity

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  9. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and the commercial software where the results were used to obtain corrected images. Corrected images based on the ML approach and the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality and assurance of performance during clinical use of endoscopic technologies.
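    A local-magnification (ML) style analysis of a square-grid image can be sketched as follows; this is a simplified, hypothetical illustration of the idea, not the authors' algorithm:

```python
def local_magnification(spacings_px, true_spacing_mm):
    """Local magnification at each grid location: imaged grid spacing
    (pixels) divided by the target's true spacing (mm)."""
    return [s / true_spacing_mm for s in spacings_px]

def distortion_percent(ml_values):
    """Distortion at each location relative to the central (first) value;
    negative values indicate barrel (fisheye) compression toward the edge."""
    m0 = ml_values[0]  # assume the first entry is the image center
    return [100.0 * (m - m0) / m0 for m in ml_values]

# Grid squares imaged at 10 px near the center shrink to 8 px at the edge:
d = distortion_percent(local_magnification([10.0, 9.0, 8.0], 1.0))
```

    Mapping such per-location distortion values across the field is what lets the ML approach characterize distortion patterns more finely than a single global distortion coefficient.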

  10. A Novel Feature-Tracking Echocardiographic Method for the Quantitation of Regional Myocardial Function

    PubMed Central

    Pirat, Bahar; Khoury, Dirar S.; Hartley, Craig J.; Tiller, Les; Rao, Liyun; Schulz, Daryl G.; Nagueh, Sherif F.; Zoghbi, William A.

    2012-01-01

    Objectives The aim of this study was to validate a novel, angle-independent, feature-tracking method for the echocardiographic quantitation of regional function. Background A new echocardiographic method, Velocity Vector Imaging (VVI) (syngo Velocity Vector Imaging technology, Siemens Medical Solutions, Ultrasound Division, Mountain View, California), has been introduced, based on feature tracking (incorporating speckle and endocardial border tracking), that allows the quantitation of endocardial strain, strain rate (SR), and velocity. Methods Seven dogs were studied during baseline, and various interventions causing alterations in regional function: dobutamine, 5-min coronary occlusion with reperfusion up to 1 h, followed by dobutamine and esmolol infusions. Echocardiographic images were acquired from short- and long-axis views of the left ventricle. Segment-length sonomicrometry crystals were used as the reference method. Results Changes in systolic strain in ischemic segments were tracked well with VVI during the different states of regional function. There was a good correlation between circumferential and longitudinal systolic strain by VVI and sonomicrometry (r = 0.88 and r = 0.83, respectively, p < 0.001). Strain measurements in the nonischemic basal segments also demonstrated a significant correlation between the 2 methods (r = 0.65, p < 0.001). Similarly, a significant relation was observed for circumferential and longitudinal SR between the 2 methods (r = 0.94, p < 0.001 and r = 0.90, p < 0.001, respectively). The endocardial velocity relation to changes in strain by sonomicrometry was weaker owing to significant cardiac translation. Conclusions Velocity Vector Imaging, a new feature-tracking method, can accurately assess regional myocardial function at the endocardial level and is a promising clinical tool for the simultaneous quantification of regional and global myocardial function. PMID:18261685
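    The strain and strain-rate quantities validated in this study follow their standard definitions; a minimal, illustrative sketch of how they are computed from segment lengths (not the VVI implementation):

```python
def strain_percent(lengths, l0):
    """Lagrangian strain (%) of a segment relative to its reference
    (e.g. end-diastolic) length l0: 100 * (L - L0) / L0."""
    return [100.0 * (l - l0) / l0 for l in lengths]

def strain_rate(strains_pct, dt):
    """Finite-difference strain rate (1/s) between consecutive frames
    sampled dt seconds apart."""
    return [(b - a) / 100.0 / dt for a, b in zip(strains_pct, strains_pct[1:])]

# A segment shortening from 10 mm to 8 mm reaches -20% systolic strain:
s = strain_percent([10.0, 9.0, 8.0], 10.0)
```

    Negative systolic strain thus corresponds to shortening; its magnitude falls in ischemic segments, which is the change tracked against sonomicrometry above.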

  11. Robust Dynamic Multi-objective Vehicle Routing Optimization Method.

    PubMed

    Guo, Yi-Nan; Cheng, Jian; Luo, Sha; Gong, Dun-Wei

    2017-03-21

    For dynamic multi-objective vehicle routing problems, the waiting time of vehicle, the number of serving vehicles, the total distance of routes were normally considered as the optimization objectives. Except for above objectives, fuel consumption that leads to the environmental pollution and energy consumption was focused on in this paper. Considering the vehicles' load and the driving distance, corresponding carbon emission model was built and set as an optimization objective. Dynamic multi-objective vehicle routing problems with hard time windows and randomly appeared dynamic customers, subsequently, were modeled. In existing planning methods, when the new service demand came up, global vehicle routing optimization method was triggered to find the optimal routes for non-served customers, which was time-consuming. Therefore, robust dynamic multi-objective vehicle routing method with two-phase is proposed. Three highlights of the novel method are: (i) After finding optimal robust virtual routes for all customers by adopting multi-objective particle swarm optimization in the first phase, static vehicle routes for static customers are formed by removing all dynamic customers from robust virtual routes in next phase. (ii)The dynamically appeared customers append to be served according to their service time and the vehicles' statues. Global vehicle routing optimization is triggered only when no suitable locations can be found for dynamic customers. (iii)A metric measuring the algorithms' robustness is given. The statistical results indicated that the routes obtained by the proposed method have better stability and robustness, but may be sub-optimum. Moreover, time-consuming global vehicle routing optimization is avoided as dynamic customers appear.

  12. Quantitative assessment in thermal image segmentation for artistic objects

    NASA Astrophysics Data System (ADS)

    Yousefi, Bardia; Sfarra, Stefano; Maldague, Xavier P. V.

    2017-07-01

    The application of thermal and infrared technology in different areas of research is increasing considerably. These applications include Non-destructive Testing (NDT), medical analysis (Computer-Aided Diagnosis/Detection, CAD), and arts and archaeology, among many others. In the arts and archaeology field, infrared technology makes significant contributions in finding defects in possibly impaired regions, through a wide range of thermographic experiments and infrared methods. The approach proposed here applies known factor analysis methods, such as standard Non-Negative Matrix Factorization (NMF) optimized by gradient-descent-based multiplicative rules (SNMF1) and standard NMF optimized by the Non-negative Least Squares (NNLS) active-set algorithm (SNMF2), as well as eigendecomposition approaches such as Principal Component Thermography (PCT) and Candid Covariance-Free Incremental Principal Component Thermography (CCIPCT), to obtain thermal features. On one hand, these methods are usually applied as preprocessing before clustering for the purpose of segmenting possible defects. On the other hand, wavelet-based data fusion combines the output of each method with PCT to increase the accuracy of the algorithm. Quantitative assessment of these approaches indicates considerable segmentation quality at reasonable computational complexity, confirming the outlined properties. In particular, a polychromatic wooden statue and a fresco were analyzed using the above-mentioned methods, and interesting results were obtained.

  13. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    PubMed

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 in four top-ranked health services journals were reviewed. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ(2) (1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ(2) (1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and
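
    The group comparisons reported above use a Pearson chi-square test on a 2×2 contingency table. A minimal sketch of that statistic, computed by hand (the counts below are hypothetical, not the study's data):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]] (no continuity correction)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row = [a + b, c + d]          # row margins
    col = [a + c, b + d]          # column margins
    obs = [[a, b], [c, d]]
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            e = row[i] * col[j] / n   # expected count under independence
            chi2 += (obs[i][j] - e) ** 2 / e
    return chi2

# Hypothetical counts: components present/absent in two article types.
stat = chi_square_2x2([[20, 30], [40, 10]])
```

    With 1 degree of freedom, the statistic is compared against the chi-square distribution to obtain the p-value (e.g., via `scipy.stats.chi2.sf`).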

  14. Failure to Integrate Quantitative Measurement Methods of Ocular Inflammation Hampers Clinical Practice and Trials on New Therapies for Posterior Uveitis.

    PubMed

    Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc

    2017-05-01

    Uveitis is one of the fields in ophthalmology where a tremendous evolution took place in the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but also in parallel precise and quantitative measurement methods developed allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, rendering possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation represents an exquisite security in monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements are integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters such as clinical evaluation of vitreous haze as a main endpoint; whereas a whole array of precise, quantitative, and objective modalities are available for the design of clinical studies. The scope of this work was to review the quantitative investigations that improved the management of uveitis in the past 2-3 decades.

  15. [Application and Integration of Qualitative and Quantitative Research Methods in Intervention Studies in Rehabilitation Research].

    PubMed

    Wirtz, M A; Strohmer, J

    2016-06-01

    In order to develop and evaluate interventions in rehabilitation research, a wide range of empirical research methods may be adopted. Qualitative research methods emphasize the relevance of an open research focus and a natural proximity to the objects of research. Accordingly, qualitative methods offer particular benefits when researchers strive to identify and organize unknown information aspects (inductive purpose). Quantitative research methods, in contrast, require a high degree of standardization and transparency of the research process, together with a clear definition of efficacy and effectiveness (deductive purpose). These paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes, and quality criteria. Hence, specific aspects must be considered if researchers aim to select or combine these approaches in order to ensure an optimal gain in knowledge. © Georg Thieme Verlag KG Stuttgart · New York.

  16. Object-oriented Persistent Homology

    PubMed Central

    Wang, Bao; Wei, Guo-Wei

    2015-01-01

    Persistent homology provides a new approach for the topological simplification of big data via measuring the life time of intrinsic topological features in a filtration process and has found its success in scientific and engineering applications. However, such a success is essentially limited to qualitative data classification and analysis. Indeed, persistent homology has rarely been employed for quantitative modeling and prediction. Additionally, the present persistent homology is a passive tool, rather than a proactive technique, for classification and analysis. In this work, we outline a general protocol to construct object-oriented persistent homology methods. By means of differential geometry theory of surfaces, we construct an objective functional, namely, a surface free energy defined on the data of interest. The minimization of the objective functional leads to a Laplace-Beltrami operator which generates a multiscale representation of the initial data and offers an objective oriented filtration process. The resulting differential geometry based object-oriented persistent homology is able to preserve desirable geometric features in the evolutionary filtration and enhances the corresponding topological persistence. The cubical complex based homology algorithm is employed in the present work to be compatible with the Cartesian representation of the Laplace-Beltrami flow. The proposed Laplace-Beltrami flow based persistent homology method is extensively validated. The consistence between Laplace-Beltrami flow based filtration and Euclidean distance based filtration is confirmed on the Vietoris-Rips complex for a large amount of numerical tests. The convergence and reliability of the present Laplace-Beltrami flow based cubical complex filtration approach are analyzed over various spatial and temporal mesh sizes. The Laplace-Beltrami flow based persistent homology approach is utilized to study the intrinsic topology of proteins and fullerene molecules. 
Based on a

  17. Quantitative phase imaging method based on an analytical nonparaxial partially coherent phase optical transfer function.

    PubMed

    Bao, Yijun; Gaylord, Thomas K

    2016-11-01

    Multifilter phase imaging with partially coherent light (MFPI-PC) is a promising new quantitative phase imaging method. However, the existing MFPI-PC method is based on the paraxial approximation. In the present work, an analytical nonparaxial partially coherent phase optical transfer function is derived. This enables the MFPI-PC to be extended to the realistic nonparaxial case. Simulations over a wide range of test phase objects as well as experimental measurements on a microlens array verify higher levels of imaging accuracy compared to the paraxial method. Unlike the paraxial version, the nonparaxial MFPI-PC with obliquity factor correction exhibits no systematic error. In addition, due to its analytical expression, the increase in computation time compared to the paraxial version is negligible.

  18. Data-centric method for object observation through scattering media

    NASA Astrophysics Data System (ADS)

    Tanida, Jun; Horisaki, Ryoichi

    2018-03-01

    A data-centric method is introduced for object observation through scattering media. A large number of training pairs are used to characterize the relation between the object and the observation signals based on machine learning. Using the method, object information can be retrieved even from strongly disturbed signals. As potential applications, object recognition, imaging, and focusing through scattering media were demonstrated.

  19. Flexible Method for Inter-object Communication in C++

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Gould, Jack J.

    1994-01-01

    A method has been developed for organizing and sharing large amounts of information between objects in C++ code. This method uses a set of object classes to define variables and group them into tables. The variable tables presented here provide a convenient way of defining and cataloging data, as well as a user-friendly input/output system, a standardized set of access functions, mechanisms for ensuring data integrity, methods for interprocessor data transfer, and an interpretive language for programming relationships between parameters. The object-oriented nature of these variable tables enables the use of multiple data types, each with unique attributes and behavior. Because each variable provides its own access methods, redundant table lookup functions can be bypassed, thus decreasing access times while maintaining data integrity. In addition, a method for automatic reference counting was developed to manage memory safely.

  20. System and method for disrupting suspect objects

    DOEpatents

    Gladwell, T. Scott; Garretson, Justin R; Hobart, Clinton G; Monda, Mark J

    2013-07-09

    A system and method for disrupting at least one component of a suspect object is provided. The system includes a source for passing radiation through the suspect object, a screen for receiving the radiation passing through the suspect object and generating at least one image therefrom, a weapon having a discharge deployable therefrom, and a targeting unit. The targeting unit displays the image(s) of the suspect object and aims the weapon at a disruption point on the displayed image such that the weapon may be positioned to deploy the discharge at the disruption point whereby the suspect object is disabled.

  1. An Analysis on Usage Preferences of Learning Objects and Learning Object Repositories among Pre-Service Teachers

    ERIC Educational Resources Information Center

    Yeni, Sabiha; Ozdener, Nesrin

    2014-01-01

    The purpose of the study is to investigate how pre-service teachers benefit from learning objects repositories while preparing course content. Qualitative and quantitative data collection methods were used in a mixed methods approach. This study was carried out with 74 teachers from the Faculty of Education. In the first phase of the study,…

  2. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
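
    The engineering-economics metrics mentioned above (net present value and levelized cost of energy) reduce to short discounting formulas. A minimal sketch, assuming the standard textbook definitions (function names and inputs are illustrative):

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def lcoe(rate, capex, annual_cost, annual_mwh, years):
    """Levelized cost of energy: discounted lifetime costs divided by
    discounted lifetime energy output."""
    costs = capex + sum(annual_cost / (1 + rate) ** t
                        for t in range(1, years + 1))
    energy = sum(annual_mwh / (1 + rate) ** t
                 for t in range(1, years + 1))
    return costs / energy

# Hypothetical example: invest 100 now, receive 60 in each of two years.
v = npv(0.1, [-100.0, 60.0, 60.0])
```

    A positive `v` indicates the investment beats the 10% discount rate; `lcoe` is the per-MWh price at which discounted revenues exactly cover discounted costs.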

  3. Intra-laboratory validation of chronic bee paralysis virus quantitation using an accredited standardised real-time quantitative RT-PCR method.

    PubMed

    Blanchard, Philippe; Regnault, Julie; Schurr, Frank; Dubois, Eric; Ribière, Magali

    2012-03-01

    Chronic bee paralysis virus (CBPV) is responsible for chronic bee paralysis, an infectious and contagious disease in adult honey bees (Apis mellifera L.). A real-time RT-PCR assay to quantitate the CBPV load is now available. To propose this assay as a reference method, it was characterised further in an intra-laboratory study during which the reliability and the repeatability of results and the performance of the assay were confirmed. The qPCR assay alone and the whole quantitation method (from sample RNA extraction to analysis) were both assessed following the ISO/IEC 17025 standard and the recent XP U47-600 standard issued by the French Standards Institute. The performance of the qPCR assay and of the overall CBPV quantitation method were validated over a 6 log range from 10(2) to 10(8) with a detection limit of 50 and 100 CBPV RNA copies, respectively, and the protocol of the real-time RT-qPCR assay for CBPV quantitation was approved by the French Accreditation Committee. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. A general method for bead-enhanced quantitation by flow cytometry

    PubMed Central

    Montes, Martin; Jaensson, Elin A.; Orozco, Aaron F.; Lewis, Dorothy E.; Corry, David B.

    2009-01-01

    Flow cytometry provides accurate relative cellular quantitation (percent abundance) of cells from diverse samples, but technical limitations of most flow cytometers preclude accurate absolute quantitation. Several quantitation standards are now commercially available which, when added to samples, permit absolute quantitation of CD4+ T cells. However, these reagents are limited by their cost, technical complexity, requirement for additional software and/or limited applicability. Moreover, few studies have validated the use of such reagents in complex biological samples, especially for quantitation of non-T cells. Here we show that addition to samples of known quantities of polystyrene fluorescence standardization beads permits accurate quantitation of CD4+ T cells from complex cell samples. This procedure, here termed single bead-enhanced cytofluorimetry (SBEC), was equally capable of enumerating eosinophils as well as subcellular fragments of apoptotic cells, moieties with very different optical and fluorescent characteristics. Relative to other proprietary products, SBEC is simple, inexpensive and requires no special software, suggesting that the method is suitable for the routine quantitation of most cells and other particles by flow cytometry. PMID:17067632
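
    The core arithmetic behind bead-based absolute counting is a simple ratio: cell events are scaled by the recovered fraction of a known bead spike. A minimal sketch (variable names and numbers are hypothetical, not taken from the paper):

```python
def absolute_count(cell_events, bead_events, beads_added, sample_volume_ul):
    """Cells per microlitre: (cells seen / beads seen) scales the known
    bead concentration up to the cell concentration."""
    return (cell_events / bead_events) * (beads_added / sample_volume_ul)

# Hypothetical acquisition: 5,000 gated cell events and 1,000 bead events
# from a 100 uL sample spiked with 50,000 beads.
conc = absolute_count(cell_events=5000, bead_events=1000,
                      beads_added=50000, sample_volume_ul=100.0)
```

    Because both cells and beads pass through the cytometer under identical conditions, the acquired-volume term cancels out of the ratio, which is what makes the approach independent of the instrument's volumetric accuracy.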

  5. An Image Analysis Method for the Precise Selection and Quantitation of Fluorescently Labeled Cellular Constituents

    PubMed Central

    Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.

    2012-01-01

    The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. This method can be used to overcome many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600

  6. Quantitative evaluation methods of skin condition based on texture feature parameters.

    PubMed

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of skin condition after using skin care products or cosmetic treatment, a quantitative evaluation method for skin surface state and texture is presented which is convenient, fast and non-destructive. Human skin images were collected by image sensors. First, a 3 × 3 median filter is applied, and the locations of hair pixels on the skin are accurately detected according to the gray mean value and color information. Bilinear interpolation is used to modify the gray values of the hair pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters, namely the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of skin condition. Experiments show that skin condition evaluated by this method agrees both with biochemical skin-evaluation indicators and with human visual assessment. The method overcomes the drawbacks of biochemical evaluation, namely skin damage and long waiting times, as well as the subjectivity and fuzziness of visual evaluation, achieving non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can quantitatively evaluate subtle improvements of skin condition after using skin care products or cosmetic treatment.
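
    The texture computation described above (a GLCM followed by second moment, contrast, entropy, and correlation) can be sketched directly; pixel offsets (1,0), (1,1), (0,1), and (-1,1) correspond to the 0°, 45°, 90°, and 135° directions. This is a minimal illustration under those standard definitions, not the paper's implementation:

```python
import numpy as np

def glcm(img, dx, dy, levels):
    """Normalized gray-level co-occurrence matrix for offset (dx, dy)."""
    h, w = img.shape
    P = np.zeros((levels, levels))
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

def glcm_features(P):
    """Angular second moment, contrast, entropy, and correlation of a GLCM."""
    i, j = np.indices(P.shape)
    asm = (P ** 2).sum()
    contrast = ((i - j) ** 2 * P).sum()
    entropy = -(P[P > 0] * np.log2(P[P > 0])).sum()
    mi, mj = (i * P).sum(), (j * P).sum()
    si = np.sqrt(((i - mi) ** 2 * P).sum())
    sj = np.sqrt(((j - mj) ** 2 * P).sum())
    corr = ((i - mi) * (j - mj) * P).sum() / (si * sj)
    return asm, contrast, entropy, corr

# Tiny 3-level example image; 0-degree direction (offset dx=1, dy=0).
img = np.array([[0, 0, 1], [0, 1, 1], [1, 1, 2]])
feats = glcm_features(glcm(img, 1, 0, 3))
```

    Averaging the four feature values over the four directional offsets gives the per-direction means the abstract refers to.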

  7. Blending quantitative and qualitative methods in language research and intervention.

    PubMed

    Brinton, Bonnie; Fujiki, Martin

    2003-05-01

    Best practice in speech-language pathology should be informed by current research findings. Traditional research methods are not always geared to address some of the complex, individual questions that arise in clinical intervention, however. Qualitative research methods may provide useful tools for bridging the gap from research to practice. Combinations of qualitative and quantitative procedures may be particularly helpful in sorting out some of the important issues surrounding language intervention in both clinical and research contexts. Examples of research blending qualitative and quantitative methods, as well as the case study of Sid, an 11-year-old boy with specific language impairment, are presented to illustrate how a combination of procedures can be used to enhance language research and intervention.

  8. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis method, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or not possible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance with technical, radiological, and statistical experts developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
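
    Among the metrology areas listed above, repeatability is commonly summarized by a repeatability coefficient. A minimal sketch, assuming the widely used definition RC = 1.96·√2·wSD with the within-subject SD estimated from test-retest pairs (the data values are hypothetical):

```python
import math

def within_subject_sd(replicate_pairs):
    """Within-subject SD from test-retest pairs:
    wSD = sqrt(mean squared difference / 2)."""
    n = len(replicate_pairs)
    return math.sqrt(sum((a - b) ** 2 for a, b in replicate_pairs) / (2 * n))

def repeatability_coefficient(replicate_pairs):
    """RC = 1.96 * sqrt(2) * wSD: 95% of differences between two repeat
    measurements on the same subject are expected to fall within +/-RC."""
    return 1.96 * math.sqrt(2) * within_subject_sd(replicate_pairs)

# Hypothetical test-retest biomarker measurements for three subjects.
rc = repeatability_coefficient([(10.0, 12.0), (8.0, 8.0), (11.0, 9.0)])
```

    Reporting RC in the biomarker's own units makes repeatability results comparable across the otherwise heterogeneous study designs the abstract describes.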

  9. Salient object detection method based on multiple semantic features

    NASA Astrophysics Data System (ADS)

    Wang, Chunyang; Yu, Chunyan; Song, Meiping; Wang, Yulei

    2018-04-01

    Existing salient object detection models can only detect the approximate location of a salient object, or may highlight the background instead. To resolve this problem, a salient object detection method based on image semantic features is proposed. First, three novel saliency features are presented in this paper: an object edge density feature (EF), an object semantic feature based on the convex hull (CF), and an object lightness contrast feature (LF). Second, the multiple salient features are trained with random detection windows. Third, a Naive Bayesian model is used to combine these features for salient detection. Results on public datasets show that the method performs well: the location of the salient object can be fixed, and the salient object can be accurately detected and marked by a specific window.

  10. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  11. Object Recognition using Feature- and Color-Based Methods

    NASA Technical Reports Server (NTRS)

    Duong, Tuan; Duong, Vu; Stubberud, Allen

    2008-01-01

    An improved adaptive method of processing image data in an artificial neural network has been developed to enable automated, real-time recognition of possibly moving objects under changing (including suddenly changing) conditions of illumination and perspective. The method combines two prior object-recognition methods, one based on adaptive detection of shape features and one based on adaptive color segmentation, to enable recognition in situations in which either prior method by itself may be inadequate. The chosen prior feature-based method is known as adaptive principal-component analysis (APCA); the chosen prior color-based method is known as adaptive color segmentation (ACOSE). These methods are made to interact with each other in a closed-loop system to obtain an optimal solution of the object-recognition problem in a dynamic environment. One result of the interaction is to increase, beyond what would otherwise be possible, the accuracy of the determination of a region of interest (containing an object that one seeks to recognize) within an image. Another result is to provide a minimized adaptive step that can be used to update the results obtained by the two component methods when changes of color and apparent shape occur. The net effect is to enable the neural network to update its recognition output and improve its recognition capability via an adaptive learning sequence. In principle, the improved method could readily be implemented in integrated circuitry to make a compact, low-power, real-time object-recognition system. It has been proposed to demonstrate the feasibility of such a system by integrating a 256-by-256 active-pixel sensor with APCA, ACOSE, and neural processing circuitry on a single chip. It has been estimated that such a system on a chip would have a volume no larger than a few cubic centimeters, could operate at a rate as high as 1,000 frames per second, and would consume on the order of milliwatts of power.

  12. The Health-Related Quality of Life in Long-Term Colorectal Cancer Survivors Study: objectives, methods, and patient sample

    PubMed Central

    Mohler, M. Jane; Coons, Stephen Joel; Hornbrook, Mark C.; Herrinton, Lisa J.; Wendel, Christopher S.; Grant, Marcia; Krouse, Robert S.

    2008-01-01

    Objectives The objective of this paper is to describe the complex mixed-methods design of a study conducted to assess health-related quality of life (HRQOL) outcomes and ostomy-related obstacles and adjustments among long-term (>five years) colorectal cancer (CRC) survivors with ostomies (cases) and without ostomies (controls). In addition, details are provided regarding the study sample and the psychometric properties of the quantitative data collection measures used. Subsequent manuscripts will present the study findings. Research Design and Methods The study design involved a cross-sectional mail survey for collecting quantitative data and focus groups for collecting qualitative data. The study subjects were individuals identified as long-term CRC survivors within a community-based health maintenance organization's enrolled population. Focus groups comprised of cases and divided by gender and HRQOL high and low quartile contrasts (based on the mail survey data) were conducted. Main Outcome Measures The modified City of Hope Quality of Life (mCOH-QOL)-Ostomy and SF-36v2 questionnaires were used in the mail survey. An abridged version of the mCOH-QOL-Ostomy was used for the control subjects. Focus groups explored ostomy-related barriers to self-care, adaptation methods/skills, and advice for others with an ostomy. Results The survey response rate was 52% (679/1308) and 34 subjects participated in focus groups. The internal consistency reliability estimates for the mCOH-QOL-Ostomy and SF-36v2 questionnaires were very acceptable for group comparisons. In addition, evidence supports the construct validity of the abridged version of the mCOH-QOL-Ostomy. Study limitations include potential non-response bias and limited minority participation. Conclusions We were able to successfully recruit long-term CRC survivors into this study and the psychometric properties of the quantitative measures used were quite acceptable. 
Mixed-methods designs, such as the one used in this

  13. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.

  14. A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers

    PubMed Central

    Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.

    2016-01-01

    Abstract Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β‐O‐4 linkages. Current thioacidolysis methods are low‐throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non‐chlorinated organic solvent and is tailored for higher‐throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1–2 mg of biomass per assay and has been quantified using fast‐GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day‐to‐day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. PMID:27534715

  15. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  16. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE PAGES

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; ...

    2016-09-14

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  17. Object-based change detection method using refined Markov random field

    NASA Astrophysics Data System (ADS)

    Peng, Daifeng; Zhang, Yongjun

    2017-01-01

    In order to fully consider the local spatial constraints between neighboring objects in object-based change detection (OBCD), an OBCD approach is presented by introducing a refined Markov random field (MRF). First, the images from the two periods are stacked and segmented to produce image objects. Second, object spectral and textural histogram features are extracted, and the G-statistic is used to measure the distance between the histogram distributions. Meanwhile, object heterogeneity is calculated by combining the spectral and textural histogram distances with an adaptive weight. Third, an expectation-maximization algorithm is applied to determine the change category of each object, and an initial change map is generated. Finally, a refined change map is produced by employing the proposed refined object-based MRF method. Three experiments were conducted and compared with several state-of-the-art unsupervised OBCD methods to evaluate the effectiveness of the proposed approach. Experimental results demonstrate that the proposed method obtains the highest accuracy among the methods compared in this paper, which confirms its validity and effectiveness for OBCD.
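
    A minimal sketch of a G-statistic distance between two object histograms, assuming the standard two-sample G-test form with expected counts taken from the pooled distribution; the paper's exact formulation may differ.

```python
import math

def g_statistic(h1, h2):
    """Two-sample G-test statistic between two histograms over the same
    bins: G = 2 * sum(observed * ln(observed / expected)), where the
    expected counts come from the pooled distribution."""
    n1, n2 = sum(h1), sum(h2)
    total = n1 + n2
    g = 0.0
    for o1, o2 in zip(h1, h2):
        pooled = o1 + o2
        if pooled == 0:
            continue  # empty bin contributes nothing
        e1 = pooled * n1 / total
        e2 = pooled * n2 / total
        if o1 > 0:
            g += o1 * math.log(o1 / e1)
        if o2 > 0:
            g += o2 * math.log(o2 / e2)
    return 2.0 * g

# Identical histograms give zero distance; shifted ones a large value.
identical = g_statistic([10, 20, 30], [10, 20, 30])
different = g_statistic([30, 5, 5], [5, 5, 30])
```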

  18. The detection methods of dynamic objects

    NASA Astrophysics Data System (ADS)

    Knyazev, N. L.; Denisova, L. A.

    2018-01-01

    The article deals with the application of cluster analysis methods to the task of aircraft detection, based on partitioning a selection of navigation parameters into groups (clusters). A modified cluster analysis method is suggested for the search and detection of objects, with iterative merging into clusters followed by a count of their number to increase the accuracy of aircraft detection. The course of the method's operation and the features of its implementation are considered. In conclusion, the efficiency of the proposed method of exact cluster analysis for finding targets is shown.

  19. Iterative optimization method for design of quantitative magnetization transfer imaging experiments.

    PubMed

    Levesque, Ives R; Sled, John G; Pike, G Bruce

    2011-09-01

    Quantitative magnetization transfer imaging (QMTI) using spoiled gradient echo sequences with pulsed off-resonance saturation can be a time-consuming technique. A method is presented for selection of an optimum experimental design for quantitative magnetization transfer imaging based on the iterative reduction of a discrete sampling of the Z-spectrum. The applicability of the technique is demonstrated for human brain white matter imaging at 1.5 T and 3 T, and optimal designs are produced to target specific model parameters. The optimal number of measurements and the signal-to-noise ratio required for stable parameter estimation are also investigated. In vivo imaging results demonstrate that this optimal design approach substantially improves parameter map quality. The iterative method presented here provides an advantage over free form optimal design methods, in that pragmatic design constraints are readily incorporated. In particular, the presented method avoids clustering and repeated measures in the final experimental design, an attractive feature for the purpose of magnetization transfer model validation. The iterative optimal design technique is general and can be applied to any method of quantitative magnetization transfer imaging. Copyright © 2011 Wiley-Liss, Inc.
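
    The iterative reduction idea can be illustrated on a toy problem: start from a dense candidate design and greedily drop the point whose removal least degrades a D-optimality criterion (the determinant of the Fisher information), here for a simple quadratic model rather than the actual quantitative MT signal model. The model, grid, and stopping size are illustrative assumptions, not the paper's design.

```python
def fisher_det(points):
    """det(X^T X) for the design matrix of the toy model y = a + b*x + c*x^2."""
    # Build the 3x3 normal matrix X^T X.
    m = [[0.0] * 3 for _ in range(3)]
    for x in points:
        row = (1.0, x, x * x)
        for i in range(3):
            for j in range(3):
                m[i][j] += row[i] * row[j]
    # 3x3 determinant by cofactor expansion.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def reduce_design(points, target_size):
    """Greedy backward elimination: at each step drop the point whose
    removal keeps the information determinant largest."""
    pts = list(points)
    while len(pts) > target_size:
        best = max(range(len(pts)),
                   key=lambda k: fisher_det(pts[:k] + pts[k + 1:]))
        pts.pop(best)
    return pts

dense = [i / 10 for i in range(11)]        # dense candidate grid on [0, 1]
design = reduce_design(dense, target_size=4)
```

    As expected for a quadratic model, the surviving design keeps the high-leverage endpoints of the interval.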

  20. Embedding Quantitative Methods by Stealth in Political Science: Developing a Pedagogy for Psephology

    ERIC Educational Resources Information Center

    Gunn, Andrew

    2017-01-01

    Student evaluations of quantitative methods courses in political science often reveal they are characterised by aversion, alienation and anxiety. As a solution to this problem, this paper describes a pedagogic research project with the aim of embedding quantitative methods by stealth into the first-year undergraduate curriculum. This paper…

  1. QUALITATIVE AND QUANTITATIVE METHODS OF SUICIDE RESEARCH IN OLD AGE.

    PubMed

    Ojagbemi, A

    2017-06-01

    This paper examines the merits of the qualitative and quantitative methods of suicide research in the elderly using two studies identified through a free search of the Pubmed database for articles that might have direct bearing on suicidality in the elderly. The studies have been purposively selected for critical appraisal because they meaningfully reflect the quantitative and qualitative divide as well as the social, economic, and cultural boundaries between the elderly living in sub-Saharan Africa and Europe. The paper concludes that an integration of both the qualitative and quantitative research approaches may provide a better platform for unraveling the complex phenomenon of suicide in the elderly.

  2. An object-oriented description method of EPMM process

    NASA Astrophysics Data System (ADS)

    Jiang, Zuo; Yang, Fan

    2017-06-01

    In order to use mature object-oriented tools and languages in software process modelling, and to make software process models better conform to industrial standards, it is necessary to study the object-oriented modelling of software processes. Based on the formal process definition in EPMM, and considering that Petri nets are primarily a formal modelling tool, this paper combines Petri net modelling with object-oriented modelling ideas and provides an implementation method for converting Petri-net-based EPMM models into object models based on object-oriented description.

  3. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology

    PubMed Central

    Counsell, Alyssa; Cribbie, Robert A.; Harlow, Lisa. L.

    2016-01-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada. PMID:28042199

  4. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology.

    PubMed

    Counsell, Alyssa; Cribbie, Robert A; Harlow, Lisa L

    2016-08-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada.

  5. The importance of quantitative measurement methods for uveitis: laser flare photometry endorsed in Europe while neglected in Japan where the technology measuring quantitatively intraocular inflammation was developed.

    PubMed

    Herbort, Carl P; Tugal-Tutkun, Ilknur

    2017-06-01

    Laser flare photometry (LFP) is an objective and quantitative method to measure intraocular inflammation. The LFP technology was developed in Japan and has been commercially available since 1990. The aim of this work was to review the application of LFP in uveitis practice in Europe compared to Japan where the technology was born. We reviewed PubMed articles published on LFP and uveitis. Although LFP has been largely integrated in routine uveitis practice in Europe, it has been comparatively neglected in Japan and still has not received FDA approval in the USA. As LFP is the only method that provides a precise measure of intraocular inflammation, it should be used as a gold standard in uveitis centres worldwide.

  6. Method of synthesized phase objects for pattern recognition with rotation invariance

    NASA Astrophysics Data System (ADS)

    Ostroukh, Alexander P.; Butok, Alexander M.; Shvets, Rostislav A.; Yezhov, Pavel V.; Kim, Jin-Tae; Kuzmenko, Alexander V.

    2015-11-01

    We present a development of the method of synthesized phase objects (SPO-method) [1] for rotation-invariant pattern recognition. For both the standard recognition method and the SPO-method, the parameters of the correlation signals for a number of amplitude objects under rotation are compared in an optical-digital correlator with joint Fourier transformation. It is shown that joint correlation with synthesized phase objects (SP-objects) attains not only rotation invariance but also the main advantage of SP-objects over the reference method: a unified δ-like recognition signal with the largest possible signal-to-noise ratio, independent of the type of object.

  7. The Use of Object-Oriented Analysis Methods in Surety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  8. Quantitative Evaluation Method of Each Generation Margin for Power System Planning

    NASA Astrophysics Data System (ADS)

    Su, Su; Tanaka, Kazuyuki

    As power system deregulation advances, competition among power companies intensifies, and they seek more efficient system planning using existing facilities. Therefore, an efficient system planning method has been expected. This paper proposes a quantitative evaluation method for the (N-1) generation margin considering overload and voltage stability restrictions. Concerning the generation margin related to overload, a fast solution method without recalculation of the (N-1) Y-matrix is proposed. Concerning voltage stability, an efficient method to search for the stability limit is proposed. The IEEE30 model system, which is composed of 6 generators and 14 load nodes, is employed to validate the proposed method. According to the results, the proposed method can reduce the computational cost of the generation margin related to overload under the (N-1) condition and can specify the margin value quantitatively.

  9. Schlieren System and method for moving objects

    NASA Technical Reports Server (NTRS)

    Weinstein, Leonard M. (Inventor)

    1995-01-01

    A system and method are provided for recording density changes in a flow field surrounding a moving object. A mask having an aperture for regulating the passage of images is placed in front of an image recording medium. An optical system is placed in front of the mask. A transition having a light field-of-view and a dark field-of-view is located beyond the test object. The optical system focuses an image of the transition at the mask such that the aperture causes a band of light to be defined on the image recording medium. The optical system further focuses an image of the object through the aperture of the mask so that the image of the object appears on the image recording medium. Relative motion is minimized between the mask and the transition. Relative motion is also minimized between the image recording medium and the image of the object. In this way, the image of the object and density changes in a flow field surrounding the object are recorded on the image recording medium when the object crosses the transition in front of the optical system.

  10. Methods in hair research: how to objectively distinguish between anagen and catagen in human hair follicle organ culture.

    PubMed

    Kloepper, Jennifer Elisabeth; Sugawara, Koji; Al-Nuaimi, Yusur; Gáspár, Erzsébet; van Beek, Nina; Paus, Ralf

    2010-03-01

    The organ culture of human scalp hair follicles (HFs) is the best currently available assay for hair research in the human system. In order to determine the hair growth-modulatory effects of agents in this assay, one critical read-out parameter is the assessment of whether the test agent has prolonged anagen duration or induced catagen in vitro. However, objective criteria to distinguish between anagen VI HFs and early catagen in human HF organ culture, two hair cycle stages with a deceptively similar morphology, remain to be established. Here, we develop, document and test an objective classification system that allows one to distinguish between anagen VI and early catagen in organ-cultured human HFs, using both qualitative and quantitative parameters that can be generated by light microscopy or immunofluorescence. Seven qualitative classification criteria are defined that are based on assessing the morphology of the hair matrix, the dermal papilla and the distribution of pigmentary markers (melanin, gp100). These are complemented by ten quantitative parameters. We have tested this classification system by employing the clinically used topical hair growth inhibitor, eflornithine, and show that eflornithine indeed produces the expected premature catagen induction, as identified by the novel classification criteria reported here. Therefore, this classification system offers a standardized, objective and reproducible new experimental method to reliably distinguish between human anagen VI and early catagen HFs in organ culture.

  11. An automated and objective method for age partitioning of reference intervals based on continuous centile curves.

    PubMed

    Yang, Qian; Lew, Hwee Yeong; Peh, Raymond Hock Huat; Metz, Michael Patrick; Loh, Tze Ping

    2016-10-01

    Reference intervals are the most commonly used decision support tool when interpreting quantitative laboratory results. They may require partitioning to better describe subpopulations that display significantly different reference values. Partitioning by age is particularly important for the paediatric population since there are marked physiological changes associated with growth and maturation. However, most partitioning methods are either technically complex or require prior knowledge of the underlying physiology/biological variation of the population. There is growing interest in the use of continuous centile curves, which provide seamless laboratory reference values as a child grows, as an alternative to rigidly described fixed reference intervals. However, the mathematical functions that describe these curves can be complex and may not be easily implemented in laboratory information systems. Hence, the use of fixed reference intervals is expected to continue for the foreseeable future. We developed a method that objectively proposes optimised age partitions and reference intervals for quantitative laboratory data (http://research.sph.nus.edu.sg/pp/ppResult.aspx), based on the sum of gradient that best describes the underlying distribution of the continuous centile curves. It is hoped that this method may improve the selection of age intervals for partitioning, which is receiving increasing attention in paediatric laboratory medicine. Copyright © 2016 Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.
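
    One way to picture a gradient-based partitioning rule (a loose sketch, not the authors' algorithm): split the age axis so that each partition carries roughly an equal share of the total absolute change (sum of gradients) of a centile curve. The curve values below are invented; the real method works on fitted continuous centile curves.

```python
def partition_by_gradient(ages, centile, n_parts):
    """Propose n_parts - 1 cut ages so each partition holds an
    approximately equal share of the curve's total absolute change."""
    grads = [abs(centile[i + 1] - centile[i]) for i in range(len(ages) - 1)]
    total = sum(grads)
    cuts, acc, next_share = [], 0.0, total / n_parts
    for i, g in enumerate(grads):
        acc += g
        if acc >= next_share and len(cuts) < n_parts - 1:
            cuts.append(ages[i + 1])
            next_share += total / n_parts
    return cuts

ages = list(range(0, 19))                       # 0..18 years
# Invented analyte: steep change in infancy, nearly flat later.
centile = [100, 80, 65, 55, 50, 47, 45, 44, 43, 42.5,
           42, 41.8, 41.6, 41.5, 41.4, 41.3, 41.2, 41.1, 41.0]
cut_ages = partition_by_gradient(ages, centile, n_parts=3)
```

    The steep early change concentrates the cut points in infancy, which is the behaviour one wants from age partitioning of paediatric reference intervals.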

  12. Comparison of Quantitative Antifungal Testing Methods for Textile Fabrics.

    PubMed

    Imoto, Yasuo; Seino, Satoshi; Nakagawa, Takashi; Yamamoto, Takao A

    2017-01-01

     Quantitative antifungal testing methods for textile fabrics under growth-supportive conditions were studied. Fungal growth activities on unfinished textile fabrics and textile fabrics modified with Ag nanoparticles were investigated using the colony counting method and the luminescence method. Morphological changes of the fungi during incubation were investigated by microscopic observation. Comparison of the results indicated that the fungal growth activity values obtained with the colony counting method depended on the morphological state of the fungi on textile fabrics, whereas those obtained with the luminescence method did not. Our findings indicated that unique characteristics of each testing method must be taken into account for the proper evaluation of antifungal activity.

  13. Quantitative methods for analysing cumulative effects on fish migration success: a review.

    PubMed

    Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G

    2012-07-01

    It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  14. Development of a quantitative assessment method of pigmentary skin disease using ultraviolet optical imaging.

    PubMed

    Lee, Onseok; Park, Sunup; Kim, Jaeyoung; Oh, Chilhwan

    2017-11-01

    The visual scoring method has been used as a subjective evaluation of pigmentary skin disorders. Severity of pigmentary skin disease, especially melasma, is evaluated using a visual scoring method, the MASI (melasma area severity index). This study differentiates between epidermal and dermal pigmented disease. The study was undertaken to determine methods to quantitatively measure the severity of pigmentary skin disorders under ultraviolet illumination. The optical imaging system consists of illumination (white LED, UV-A lamp) and image acquisition (DSLR camera, air cooling CMOS CCD camera). Each camera is equipped with a polarizing filter to remove glare. To analyze images of visible and UV light, images are divided into frontal, cheek, and chin regions of melasma patients. Each image must undergo image processing. To reduce the curvature error in facial contours, a gradient mask is used. The new method of segmentation of front and lateral facial images is more objective for face-area-measurement than the MASI score. Image analysis of darkness and homogeneity is adequate to quantify the conventional MASI score. Under visible light, active lesion margins appear in both epidermal and dermal melanin, whereas melanin is found in the epidermis under UV light. This study objectively analyzes severity of melasma and attempts to develop new methods of image analysis with ultraviolet optical imaging equipment. Based on the results of this study, our optical imaging system could be used as a valuable tool to assess the severity of pigmentary skin disease. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. Reproducibility of CSF quantitative culture methods for estimating rate of clearance in cryptococcal meningitis.

    PubMed

    Dyal, Jonathan; Akampurira, Andrew; Rhein, Joshua; Morawski, Bozena M; Kiggundu, Reuben; Nabeta, Henry W; Musubire, Abdu K; Bahr, Nathan C; Williams, Darlisha A; Bicanic, Tihana; Larsen, Robert A; Meya, David B; Boulware, David R

    2016-05-01

    Quantitative cerebrospinal fluid (CSF) cultures provide a measure of disease severity in cryptococcal meningitis. The fungal clearance rate by quantitative cultures has become a primary endpoint for phase II clinical trials. This study determined the inter-assay accuracy of three different quantitative culture methodologies. Among 91 participants with meningitis symptoms in Kampala, Uganda, during August-November 2013, 305 CSF samples were prospectively collected from patients at multiple time points during treatment. Samples were simultaneously cultured by three methods: (1) St. George's 100 mcl input volume of CSF with five 1:10 serial dilutions, (2) AIDS Clinical Trials Group (ACTG) method using 1000, 100, 10 mcl input volumes, and two 1:100 dilutions with 100 and 10 mcl input volume per dilution on seven agar plates; and (3) 10 mcl calibrated loop of undiluted and 1:100 diluted CSF (loop). Quantitative culture values did not statistically differ between the St. George and ACTG methods (P = .09) but did for St. George versus the 10 mcl loop (P < .001). Repeated measures pairwise correlation between any of the methods was high (r ≥ 0.88). For detecting sterility, the ACTG method had the highest negative predictive value of 97% (91% St. George, 60% loop), but the ACTG method had occasional (∼10%) difficulties in quantification due to colony clumping. For CSF clearance rate, the St. George and ACTG methods did not differ overall (mean -0.05 ± 0.07 log10 CFU/ml/day; P = .14) on a group level; however, individual-level clearance varied. The St. George and ACTG quantitative CSF culture methods produced comparable but not identical results. Quantitative cultures can inform treatment management strategies. © The Author 2016. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
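
    The arithmetic behind quantitative cultures is straightforward: colony counts are scaled by the plated volume and dilution to CFU/mL, and the clearance rate is the slope of log10 CFU/mL against time. A sketch with invented counts (the conversion formula is standard; the specific numbers are not from the study):

```python
def cfu_per_ml(colonies, plated_ul, dilution_factor):
    """CFU/mL = colonies x dilution factor / plated volume in mL."""
    return colonies * dilution_factor / (plated_ul / 1000.0)

def clearance_rate(days, log10_cfu):
    """Least-squares slope: log10 CFU/mL per day (negative = clearing)."""
    n = len(days)
    md, ml = sum(days) / n, sum(log10_cfu) / n
    return sum((d - md) * (l - ml) for d, l in zip(days, log10_cfu)) / \
           sum((d - md) ** 2 for d in days)

# 52 colonies from 100 uL of a 1:100 dilution -> 5.2 x 10^4 CFU/mL.
c = cfu_per_ml(colonies=52, plated_ul=100, dilution_factor=100)

# Hypothetical serial measurements over two weeks of treatment.
rate = clearance_rate([0, 3, 7, 14],
                      [5.2, 3.9, 2.1, 0.0])
```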

  16. Autoradiographic method for quantitation of deposition and distribution of radiocalcium in bone

    PubMed Central

    Lawrence Riggs, B; Bassingthwaighte, James B.; Jowsey, Jenifer; Peter Pequegnat, E

    2010-01-01

    A method is described for quantitating autoradiographs of bone-seeking isotopes in microscopic sections of bone. Autoradiographs of bone sections containing 45Ca and internal calibration standards are automatically scanned with a microdensitometer. The digitized optical density output is stored on magnetic tape and is converted by computer to equivalent activity of 45Ca per gram of bone. The computer determines the total 45Ca uptake in the bone section and, on the basis of optical density and anatomic position, quantitatively divides the uptake into 4 components, each representing a separate physiologic process (bone formation, secondary mineralization, diffuse long-term exchange, and surface short-term exchange). The method is also applicable for quantitative analysis of microradiographs of bone sections for mineral content and density. PMID:5416906
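
    The conversion from digitized optical density to equivalent 45Ca activity via internal calibration standards can be sketched as piecewise-linear interpolation against a calibration wedge scanned with the section. The wedge and pixel values below are invented, and the original work used a computer program on microdensitometer output rather than this exact scheme.

```python
def od_to_activity(od, cal_od, cal_act):
    """Map an optical density to activity by piecewise-linear
    interpolation between calibration standards (clamped at the
    ends of the calibration range). cal_od must be sorted ascending."""
    if od <= cal_od[0]:
        return cal_act[0]
    if od >= cal_od[-1]:
        return cal_act[-1]
    for (o0, o1), (a0, a1) in zip(zip(cal_od, cal_od[1:]),
                                  zip(cal_act, cal_act[1:])):
        if o0 <= od <= o1:
            return a0 + (a1 - a0) * (od - o0) / (o1 - o0)

# Hypothetical calibration wedge: OD vs 45Ca activity (arbitrary units).
cal_od  = [0.1, 0.4, 0.8, 1.2]
cal_act = [0.0, 10.0, 30.0, 60.0]

pixels = [0.25, 0.6, 1.0]              # digitised OD values for one region
total = sum(od_to_activity(p, cal_od, cal_act) for p in pixels)
```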

  17. [Research on rapid and quantitative detection method for organophosphorus pesticide residue].

    PubMed

    Sun, Yuan-Xin; Chen, Bing-Tai; Yi, Sen; Sun, Ming

    2014-05-01

    The methods of physical-chemical inspection adopted in traditional pesticide residue detection require many pretreatment processes and are time-consuming and complicated. In the present study, the authors take chlorpyrifos, which is widely applied in agriculture, as the research object and propose a rapid and quantitative detection method for organophosphorus pesticide residues. First, according to the chemical characteristics of chlorpyrifos and the overall chromogenic effect and secondary pollution of several colorimetric reagents, a pretreatment scheme based on the chromogenic reaction of chlorpyrifos with resorcin in a weakly alkaline environment was determined. Second, by analyzing the UV-Vis spectra of chlorpyrifos samples with contents between 0.5 and 400 mg kg-1, it was confirmed that the characteristic information after the color reaction was mainly concentrated in the 360-400 nm region. Third, a full-spectrum forecasting model was established based on partial least squares, with a correlation coefficient of calibration of 0.9996, a correlation coefficient of prediction of 0.9956, a standard deviation of calibration (RMSEC) of 2.8147 mg kg-1, and a standard deviation of verification (RMSEP) of 8.0124 mg kg-1. Fourth, the wavelengths centered at 400 nm were extracted as the characteristic region to build a forecasting model, with a correlation coefficient of calibration of 0.9996, a correlation coefficient of prediction of 0.9993, an RMSEC of 2.5667 mg kg-1, and an RMSEP of 4.8866 mg kg-1. Finally, by analyzing the near-infrared spectra of chlorpyrifos samples with contents between 0.5 and 16 mg kg-1, the authors found that although the characteristics of the chromogenic functional group are not obvious, the change of absorption peaks of resorcin itself in the neighborhood of 5200 cm
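
    The figures of merit quoted above (correlation coefficient, RMSEC/RMSEP) can be illustrated on a toy single-band calibration. The data points below are invented, and the paper's actual models are PLS fits over the full spectrum or the 400 nm region, not this simple line:

```python
import math

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - c_my) * 0 for a, c_my in ()) if False else \
        sum((a - mx) * (c - my) for a, c in zip(x, y)) / \
        sum((a - mx) ** 2 for a in x)
    return b, my - b * mx

def rmse(actual, predicted):
    """Root mean squared error, as in RMSEC/RMSEP."""
    return math.sqrt(sum((a - p) ** 2
                         for a, p in zip(actual, predicted)) / len(actual))

conc = [0.5, 2.0, 8.0, 32.0, 128.0, 400.0]        # mg/kg, invented
absorb = [0.01, 0.04, 0.16, 0.63, 2.52, 7.9]      # A(400 nm), invented

slope, intercept = fit_line(absorb, conc)          # inverse calibration
predicted = [slope * a + intercept for a in absorb]
rmsec = rmse(conc, predicted)                      # error on calibration set
```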

  18. Effects of calibration methods on quantitative material decomposition in photon-counting spectral computed tomography using a maximum a posteriori estimator.

    PubMed

    Curtis, Tyler E; Roeder, Ryan K

    2017-10-01

    Advances in photon-counting detectors have enabled quantitative material decomposition using multi-energy or spectral computed tomography (CT). Supervised methods for material decomposition utilize an estimated attenuation for each material of interest at each photon energy level, which must be calibrated based upon calculated or measured values for known compositions. Measurements using a calibration phantom can advantageously account for system-specific noise, but the effect of calibration methods on the material basis matrix and subsequent quantitative material decomposition has not been experimentally investigated. Therefore, the objective of this study was to investigate the influence of the range and number of contrast agent concentrations within a modular calibration phantom on the accuracy of quantitative material decomposition in the image domain. Gadolinium was chosen as a model contrast agent in imaging phantoms, which also contained bone tissue and water as negative controls. The maximum gadolinium concentration (30, 60, and 90 mM) and total number of concentrations (2, 4, and 7) were independently varied to systematically investigate effects of the material basis matrix and scaling factor calibration on the quantitative (root mean squared error, RMSE) and spatial (sensitivity and specificity) accuracy of material decomposition. Images of calibration and sample phantoms were acquired using a commercially available photon-counting spectral micro-CT system with five energy bins selected to normalize photon counts and leverage the contrast agent k-edge. Material decomposition of gadolinium, calcium, and water was performed for each calibration method using a maximum a posteriori estimator. Both the quantitative and spatial accuracy of material decomposition were most improved by using an increased maximum gadolinium concentration (range) in the basis matrix calibration; the effects of using a greater number of concentrations were relatively small in
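
    The linear model underlying image-domain material decomposition can be sketched as a least-squares solve of per-bin measured attenuation against a material basis matrix. The paper uses a maximum a posteriori estimator, of which this unregularized fit is only the simplest analogue, and all basis values below are invented (with a jump at the gadolinium k-edge to mimic the bin selection described).

```python
def lstsq_2col(A, b):
    """Least-squares solve for x in A @ x ~ b when A has two columns,
    via the 2x2 normal equations."""
    ata = [[sum(A[i][r] * A[i][c] for i in range(len(A))) for c in range(2)]
           for r in range(2)]
    atb = [sum(A[i][r] * b[i] for i in range(len(A))) for r in range(2)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    x0 = (atb[0] * ata[1][1] - atb[1] * ata[0][1]) / det
    x1 = (atb[1] * ata[0][0] - atb[0] * ata[1][0]) / det
    return [x0, x1]

# Rows = energy bins, columns = basis attenuation per unit of
# (gadolinium, calcium); invented numbers.
basis = [[1.0, 2.0],
         [3.5, 1.5],     # bin just above the Gd k-edge
         [0.8, 1.0]]

# A voxel containing 2 units Gd and 1 unit Ca (noiseless for clarity).
measured = [1.0 * 2 + 2.0 * 1, 3.5 * 2 + 1.5 * 1, 0.8 * 2 + 1.0 * 1]
gd, ca = lstsq_2col(basis, measured)
```

    With noiseless data the fit recovers the amounts exactly; the calibration question studied in the paper is how errors in the basis matrix propagate into these recovered amounts.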

  19. Large project experiences with object-oriented methods and reuse

    NASA Technical Reports Server (NTRS)

    Wessale, William; Reifer, Donald J.; Weller, David

    1992-01-01

    The SSVTF (Space Station Verification and Training Facility) project is completing the Preliminary Design Review of a large software development using object-oriented methods and systematic reuse. An incremental developmental lifecycle was tailored to provide early feedback and guidance on methods and products, with repeated attention to reuse. Object oriented methods were formally taught and supported by realistic examples. Reuse was readily accepted and planned by the developers. Schedule and budget issues were handled by agreements and work sharing arranged by the developers.

  20. [Quantitative and qualitative research methods, can they coexist yet?].

    PubMed

    Hunt, Elena; Lavoie, Anne-Marise

    2011-06-01

    Qualitative design is gaining ground in nursing research. Despite this relative progress, however, the evidence-based practice movement continues to dominate and to emphasize the exclusive value of quantitative design (particularly randomized clinical trials) for clinical decision making. In the current context, with decision makers favoring utilitarian choices on one hand, and nursing criticism of the establishment in favor of qualitative research on the other, it is difficult to choose a practical and ethical path that values the nursing role within the health care system while remaining committed to quality care and maintaining the researcher's integrity. Both qualitative and quantitative methods have advantages and disadvantages, and clearly neither can, by itself, capture, describe, and explain reality adequately. A balance between the two methods is therefore needed. Researchers bear a responsibility to society and to science, and they should opt for the design best suited to answering the research question, not promote the design favored by research funding bodies.

  1. Quantitation of valve regurgitation severity by three-dimensional vena contracta area is superior to flow convergence method of quantitation on transesophageal echocardiography.

    PubMed

    Abudiab, Muaz M; Chao, Chieh-Ju; Liu, Shuang; Naqvi, Tasneem Z

    2017-07-01

    Quantitation of regurgitation severity using the proximal isovelocity surface area (PISA) method to calculate effective regurgitant orifice (ERO) area has limitations. Measurement of three-dimensional (3D) vena contracta area (VCA) accurately grades mitral regurgitation (MR) severity on transthoracic echocardiography (TTE). We evaluated 3D VCA quantitation of regurgitant jet severity using 3D transesophageal echocardiography (TEE) in 110 native mitral, aortic, and tricuspid valves and six prosthetic valves in patients with at least mild valvular regurgitation. The ASE-recommended integrative method comprising semiquantitative and quantitative assessment of valvular regurgitation was used as a reference method, including ERO area by 2D PISA for assigning severity of regurgitation grade. Mean age was 62.2±14.4 years; 3D VCA quantitation was feasible in 91% of regurgitant valves compared to 78% by the PISA method. When both methods were feasible and in the presence of a single regurgitant jet, 3D VCA and 2D PISA were similar in differentiating assigned severity (ANOVA, P<.001). In valves with multiple jets, however, 3D VCA had a better correlation to assigned severity (ANOVA, P<.0001). The agreement of 2D PISA and 3D VCA with the integrative method was 47% and 58% for moderate and 65% and 88% for severe regurgitation, respectively. Measurement of 3D VCA by TEE is superior to the 2D PISA method in determination of regurgitation severity in multiple native and prosthetic valves. © 2017, Wiley Periodicals, Inc.
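
    For context, the 2D PISA approach against which 3D VCA is compared derives the effective regurgitant orifice from a hemispheric flow-convergence model. A minimal sketch of that standard calculation follows; the example values are invented.

```python
import math

def ero_pisa(radius_cm, v_alias_cm_s, v_peak_cm_s):
    """Effective regurgitant orifice area (cm^2) from the hemispheric
    flow-convergence model: flow = 2*pi*r^2 * v_alias, and
    ERO = flow / peak regurgitant velocity."""
    flow_cm3_s = 2.0 * math.pi * radius_cm ** 2 * v_alias_cm_s
    return flow_cm3_s / v_peak_cm_s

# Invented example: PISA radius 1.0 cm, aliasing velocity 40 cm/s,
# peak regurgitant velocity 500 cm/s.
ero = ero_pisa(1.0, 40.0, 500.0)   # ~0.50 cm^2
```

    The hemispheric assumption is one of the limitations noted above: with eccentric or multiple jets the isovelocity shell is not a hemisphere, which is where a directly planimetered 3D VCA avoids the model entirely.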

  2. Objective and quantitative definitions of modified food textures based on sensory and rheological methodology

    PubMed Central

    Wendin, Karin; Ekman, Susanne; Bülow, Margareta; Ekberg, Olle; Johansson, Daniel; Rothenberg, Elisabet; Stading, Mats

    2010-01-01

    Introduction Patients who suffer from chewing and swallowing disorders, i.e. dysphagia, may have difficulties ingesting normal food and liquids. In these patients a texture-modified diet may enable the patient to maintain adequate nutrition. However, there is no generally accepted definition of ‘texture’ that includes measurements describing different food textures. Objective Objectively define and quantify categories of texture-modified food by conducting rheological measurements and sensory analyses. A further objective was to facilitate the communication and recommendation of appropriate food textures for patients with dysphagia. Design About 15 food samples varying in texture qualities were characterized by descriptive sensory and rheological measurements. Results Soups were perceived as homogenous; thickened soups were perceived as being easier to swallow, more melting and creamy compared with soups without thickener. Viscosity differed between the two types of soups. Texture descriptors for pâtés were characterized by high chewing resistance, firmness, and having larger particles compared with timbales and jellied products. Jellied products were perceived as wobbly, creamy, and easier to swallow. Concerning the rheological measurements, all solid products were more elastic than viscous (G′>G″), belonging to different G′ intervals: jellied products (low G′) and timbales together with pâtés (higher G′). Conclusion By combining sensory and rheological measurements, a system of objective, quantitative, and well-defined food textures was developed that characterizes the different texture categories. PMID:20592965

  3. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is built on an index system comprising a causation index, an inherent risk index, a consequence index, and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis, and a risk evaluation. The outcome of the qualitative method is a qualitative risk value; the outcomes of the quantitative method are individual risk and societal risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion, and unconfined vapour cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the basic data available for the gas pipelines and the precision required of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
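
    The qualitative index system can be illustrated as a weighted aggregation of its component indices. This is a toy sketch with invented index values and weights; the paper's actual index definitions and weighting scheme are not reproduced here.

```python
# Invented component indices (each scaled to [0, 1]) and normalized
# weights; the paper's actual definitions and weights differ.
indices = {"causation": 0.6, "inherent_risk": 0.4, "consequence": 0.7}
weights = {"causation": 0.3, "inherent_risk": 0.2, "consequence": 0.5}

# Qualitative risk value as a weighted sum of the component indices.
risk_value = sum(indices[k] * weights[k] for k in indices)
```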

  4. Effect of ethnicity on performance in a final objective structured clinical examination: qualitative and quantitative study

    PubMed Central

    Wass, Val; Roberts, Celia; Hoogenboom, Ron; Jones, Roger; Van der Vleuten, Cees

    2003-01-01

    Objective To assess the effect of ethnicity on student performance in stations assessing communication skills within an objective structured clinical examination. Design Quantitative and qualitative study. Setting A final UK clinical examination consisting of a two day objective structured clinical examination with 22 stations. Participants 82 students from ethnic minorities and 97 white students. Main outcome measures Mean scores for stations (quantitative) and observations made using discourse analysis on selected communication stations (qualitative). Results Mean performance of students from ethnic minorities was significantly lower than that of white students for stations assessing communication skills on days 1 (67.0% (SD 6.8%) and 72.3% (7.6%); P=0.001) and 2 (65.2% (6.6%) and 69.5% (6.3%); P=0.003). No examples of overt discrimination were found in 309 video recordings. Transcriptions showed subtle differences in communication styles in some students from ethnic minorities who performed poorly. Examiners' assumptions about what is good communication may have contributed to differences in grading. Conclusions There was no evidence of explicit discrimination between students from ethnic minorities and white students in the objective structured clinical examination. A small group of male students from ethnic minorities used particularly poorly rated communicative styles, and some subtle problems in assessing communication skills may have introduced bias. Tests need to reflect issues of diversity to ensure that students from ethnic minorities are not disadvantaged. What is already known on this topic: UK medical schools are concerned that students from ethnic minorities may perform less well than white students in examinations. It is important to understand whether our examination system disadvantages them. What this study adds: Mean performance of students from ethnic minorities was significantly lower than that of white students in a final year objective structured

  5. A Real-Time Method to Estimate Speed of Object Based on Object Detection and Optical Flow Calculation

    NASA Astrophysics Data System (ADS)

    Liu, Kaizhan; Ye, Yunming; Li, Xutao; Li, Yan

    2018-04-01

    In recent years the Convolutional Neural Network (CNN) has been widely used in computer vision and has made great progress in tasks such as object detection and classification. Beyond that, combining CNNs, that is, running multiple CNN frameworks synchronously and sharing their output information, can yield useful information that none of them can provide singly. Here we introduce a method to estimate object speed in real time by combining two CNNs: YOLOv2 and FlowNet. In every frame, YOLOv2 provides object size, location, and type, while FlowNet provides the optical flow of the whole image. On one hand, object size and location select the object's portion of the optical-flow image, from which the average optical flow of each object is calculated. On the other hand, object type and size determine the relationship between optical flow and true speed by means of optics theory and prior knowledge. With these two pieces of information, the speed of each object can be estimated. The method estimates the speed of multiple objects in real time using only a normal camera, even while the camera is moving, with error acceptable in most application fields such as autonomous driving and robot vision.
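
    The flow-averaging step described above can be sketched as follows. The detector bounding box and the metres-per-pixel scale (which the paper infers from object type and size) are treated as given inputs; the function name and values are illustrative, not from the paper.

```python
import numpy as np

def object_speed(flow, bbox, metres_per_pixel, fps):
    """Average the dense optical flow inside a detector bounding box
    and convert pixels/frame to metres/second.

    flow: (H, W, 2) array of per-pixel (dx, dy) displacement in pixels
    bbox: (x0, y0, x1, y1) from the detector
    metres_per_pixel: scale assumed known from object type/size
    """
    x0, y0, x1, y1 = bbox
    patch = flow[y0:y1, x0:x1]                # flow vectors on the object
    mean_flow = patch.reshape(-1, 2).mean(axis=0)
    px_per_frame = np.linalg.norm(mean_flow)  # pixel displacement per frame
    return px_per_frame * metres_per_pixel * fps

# Synthetic flow field: 5 px/frame of motion inside the detected box.
flow = np.zeros((100, 100, 2))
flow[20:40, 30:60] = (3.0, 4.0)
speed = object_speed(flow, (30, 20, 60, 40), 0.05, 30)  # 7.5 m/s
```

    In a moving-camera setting the background flow would first have to be subtracted from `mean_flow`, which is part of what makes the combined-network design non-trivial.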

  6. Method and apparatus for determining the coordinates of an object

    DOEpatents

    Pedersen, Paul S.

    2002-01-01

    A simplified method and related apparatus are described for determining the location of points on the surface of an object by varying, in accordance with a unique sequence, the intensity of each illuminated pixel directed to the object surface, and detecting at known detector pixel locations the intensity sequence of reflected illumination from the surface of the object whereby the identity and location of the originating illuminated pixel can be determined. The coordinates of points on the surface of the object are then determined by conventional triangulation methods.
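
    The final triangulation step can be illustrated with a simple planar model: once the unique intensity sequence identifies which projector pixel lit a surface point, the point's projection angle is known, and its depth follows from the two ray angles and the projector-camera baseline. This is a 2D sketch of the geometry, not the patent's full apparatus.

```python
import math

def depth_by_triangulation(baseline, alpha, beta):
    """Planar triangulation: projector at the origin, camera at distance
    `baseline` along the x-axis; alpha and beta are the ray angles
    measured from the baseline. Intersecting z = x*tan(alpha) with
    z = (baseline - x)*tan(beta) gives the depth z."""
    ta, tb = math.tan(alpha), math.tan(beta)
    return baseline * ta * tb / (ta + tb)

# Symmetric 45-degree rays over a 2 m baseline meet at depth 1 m.
z = depth_by_triangulation(2.0, math.pi / 4, math.pi / 4)
```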

  7. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  8. Criteria for quantitative and qualitative data integration: mixed-methods research methodology.

    PubMed

    Lee, Seonah; Smith, Carrol A M

    2012-05-01

    Many studies have emphasized the need and importance of a mixed-methods approach for evaluation of clinical information systems. However, those studies had no criteria to guide integration of multiple data sets. Integrating different data sets serves to actualize the paradigm that a mixed-methods approach argues; thus, we require criteria that provide the right direction to integrate quantitative and qualitative data. The first author used a set of criteria organized from a literature search for integration of multiple data sets from mixed-methods research. The purpose of this article was to reorganize the identified criteria. Through critical appraisal of the reasons for designing mixed-methods research, three criteria resulted: validation, complementarity, and discrepancy. In applying the criteria to empirical data of a previous mixed methods study, integration of quantitative and qualitative data was achieved in a systematic manner. It helped us obtain a better organized understanding of the results. The criteria of this article offer the potential to produce insightful analyses of mixed-methods evaluations of health information systems.

  9. System and method for removal of buried objects

    DOEpatents

    Alexander, Robert G [Richland, WA; Crass, Dennis [Kennewick, WA; Grams, William [Kennewick, WA; Phillips, Steven J [Sunnyside, WA; Riess, Mark [Kennewick, WA

    2008-06-03

    The present invention is a system and method for removal of buried objects. According to one embodiment of the invention, a crane with a vibrator casing driver is used to lift and suspend a large diameter steel casing over the buried object. Then the casing is driven into the ground by the vibratory driver until the casing surrounds the buried object. Then the open bottom of the casing is sealed shut by injecting grout into the ground within the casing near its bottom. When the seal has cured and hardened, the top of the casing is lifted to retrieve the casing, with the buried object inside, from the ground.

  10. A Method for Comprehensive Glycosite-Mapping and Direct Quantitation of Serum Glycoproteins.

    PubMed

    Hong, Qiuting; Ruhaak, L Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B

    2015-12-04

    A comprehensive glycan map was constructed for the top eight abundant glycoproteins in plasma using both specific and nonspecific enzyme digestions followed by nano liquid chromatography (LC)-chip/quadrupole time-of-flight mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum/plasma without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20 min ultra high performance (UP)LC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared with the quantitation method that involves protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries.
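
    The two quantitation steps described above, absolute protein amount from a peptide calibration curve and glycopeptide abundance normalized to protein abundance, can be sketched as below. All numeric values are invented; only the workflow shape follows the abstract.

```python
import numpy as np

# Invented MRM calibration data: peak area vs peptide amount (fmol).
amounts = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
areas = 220.0 * amounts + 15.0          # assumed linear detector response

# Invert the calibration curve so a peak area maps to an absolute amount.
slope, intercept = np.polyfit(areas, amounts, 1)

def quantify(peak_area):
    return slope * peak_area + intercept

protein_fmol = quantify(11015.0)        # -> 50 fmol of the protein peptide

# Normalizing a glycopeptide's ion abundance to its parent protein's
# abundance separates glycosylation changes from expression changes.
glyco_normalized = 4000.0 / protein_fmol
```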

  11. Variety of geologic silhouette shapes distinguishable by multiple rotations method of quantitative shape analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, D.G.; Parks, J.M.

    1984-04-01

    Silhouette shapes are two-dimensional projections of three-dimensional objects such as sand grains, gravel, and fossils. Within-the-margin markings such as chamber boundaries, sutures, or ribs are ignored. Comparison between populations of objects from similar and different origins (i.e., environments, species or genera, growth series, etc.) is aided by quantifying the shapes. The Multiple Rotations Method (MRM) uses a variation of "eigenshapes" that is capable of distinguishing most of the subtle variations the "trained eye" can detect. With a video digitizer and microcomputer, MRM is fast, more accurate, and more objective than the human eye. The resulting shape descriptors comprise 5 or 6 numbers per object that can be stored and retrieved for comparison with similar descriptions of other objects. The original shape outlines can be reconstituted from these few numerical descriptors sufficiently for gross recognition. Thus, a semi-automated data-retrieval system becomes feasible, with silhouette-shape descriptions as one of several recognition criteria. MRM consists of four "rotations": rotation about a center to a comparable orientation; a principal-components rotation to reduce the many original shape descriptors to a few; a VARIMAX orthogonal-factor rotation to achieve simple structure; and a rotation to obtain factor scores for individual objects. A variety of subtly different shapes, including sand grains from several locations, ages, and environments, and fossils of several types, illustrates the feasibility of quantitative comparison by MRM.
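
    The principal-components rotation at the heart of MRM can be sketched as follows: outlines sampled as radii at fixed angles are reduced to a handful of eigenshape scores, from which the outline can be approximately reconstituted. The data below are synthetic grain outlines; the VARIMAX rotation and the other two rotation steps are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic silhouettes: each outline is 64 radii sampled at equal
# angles after rotation to a common orientation (invented data).
angles = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
elongation = 0.2 + 0.2 * rng.random(100)             # per-object shape factor
shapes = 1.0 + elongation[:, None] * np.cos(2 * angles)
shapes += 0.01 * rng.standard_normal(shapes.shape)   # digitization noise

# Principal-components rotation: reduce 64 radii to 5 eigenshape scores.
mean = shapes.mean(axis=0)
_, _, Vt = np.linalg.svd(shapes - mean, full_matrices=False)
scores = (shapes - mean) @ Vt[:5].T      # 5 numbers per object

# The outline can be reconstituted (approximately) from the scores.
recon = mean + scores @ Vt[:5]
max_err = np.abs(recon - shapes).max()
```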

  12. Analytical methods for quantitation of prenylated flavonoids from hops.

    PubMed

    Nikolić, Dejan; van Breemen, Richard B

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach.

  13. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com; Stoessel, Rainer, E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian, E-mail: Grosse@tum.de

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- and air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas; however, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system that allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization capabilities, which yield important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damage. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  14. Joint Geophysical Inversion With Multi-Objective Global Optimization Methods

    NASA Astrophysics Data System (ADS)

    Lelievre, P. G.; Bijani, R.; Farquharson, C. G.

    2015-12-01

    Pareto multi-objective global optimization (PMOGO) methods generate a suite of solutions that minimize multiple objectives (e.g. data misfits and regularization terms) in a Pareto-optimal sense. Providing a suite of models, as opposed to a single model that minimizes a weighted sum of objectives, allows a more complete assessment of the possibilities and avoids the often difficult choice of how to weight each objective. We are applying PMOGO methods to three classes of inverse problems. The first class are standard mesh-based problems where the physical property values in each cell are treated as continuous variables. The second class of problems are also mesh-based but cells can only take discrete physical property values corresponding to known or assumed rock units. In the third class we consider a fundamentally different type of inversion in which a model comprises wireframe surfaces representing contacts between rock units; the physical properties of each rock unit remain fixed while the inversion controls the position of the contact surfaces via control nodes. This third class of problem is essentially a geometry inversion, which can be used to recover the unknown geometry of a target body or to investigate the viability of a proposed Earth model. Joint inversion is greatly simplified for the latter two problem classes because no additional mathematical coupling measure is required in the objective function. PMOGO methods can solve numerically complicated problems that could not be solved with standard descent-based local minimization methods. This includes the latter two classes of problems mentioned above. There are significant increases in the computational requirements when PMOGO methods are used but these can be ameliorated using parallelization and problem dimension reduction strategies.
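
    The Pareto-optimality criterion that underlies PMOGO methods can be stated compactly in code: a candidate model survives unless some other candidate is at least as good in every objective and strictly better in at least one. A minimal sketch for minimization over explicit objective vectors (the objectives here stand in for data misfits and regularization terms):

```python
def pareto_front(points):
    """Return the Pareto-optimal subset for minimization of all
    objectives. A point is dominated if another distinct point is
    <= in every objective (and so < in at least one)."""
    front = []
    for p in points:
        dominated = any(
            all(qi <= pi for qi, pi in zip(q, p)) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Each tuple is (objective 1, objective 2) for one candidate model.
pts = [(1, 5), (2, 3), (4, 1), (3, 3), (5, 5)]
front = pareto_front(pts)   # [(1, 5), (2, 3), (4, 1)]
```

    Returning the whole front, rather than the single minimizer of a weighted sum, is exactly the "suite of solutions" advantage described above.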

  15. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.

    Here, the accumulation of bacteria in surface-attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
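
    The image-analysis step can be sketched as a threshold-and-count on the stain-enhanced photograph. The published algorithm is more robust than this, but the core surface-coverage metric reduces to something like:

```python
import numpy as np

def coverage_fraction(image, threshold):
    """Fraction of pixels whose stain-enhanced intensity exceeds a
    threshold -- a simple surrogate for biofilm surface coverage."""
    return float((image > threshold).mean())

# Synthetic 100x100 grayscale photo: stained fouling in one quadrant.
img = np.zeros((100, 100))
img[:50, :50] = 200.0
frac = coverage_fraction(img, 128)   # 0.25
```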

  16. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGES

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; ...

    2015-12-07

    Here, the accumulation of bacteria in surface-attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  17. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    PubMed

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms.

  18. Fuzzy object models for newborn brain MR image segmentation

    NASA Astrophysics Data System (ADS)

    Kobashi, Syoji; Udupa, Jayaram K.

    2013-03-01

    Newborn brain MR image segmentation is a challenging problem because of the wide variation in size, shape, and MR signal, although it is fundamental to quantitative radiology of brain MR images. Because of the large differences between the adult brain and the newborn brain, it is difficult to directly apply conventional methods to the newborn brain. Inspired by the original fuzzy object model introduced by Udupa et al. at SPIE Medical Imaging 2011, here called the fuzzy shape object model (FSOM), this paper introduces the fuzzy intensity object model (FIOM) and proposes a new image segmentation method that combines FSOM and FIOM within fuzzy connected (FC) image segmentation. The fuzzy object models are built from training datasets in which the cerebral parenchyma is delineated by experts. After registering the FSOM with the image under evaluation, the proposed method roughly recognizes the cerebral parenchyma region based on the prior knowledge of location, shape, and MR signal given by the registered FSOM and FIOM. Then, FC image segmentation delineates the cerebral parenchyma using the fuzzy object models. The proposed method has been evaluated on 9 newborn brain MR images using the leave-one-out strategy; the revised age was between -1 and 2 months. Quantitative evaluation using false positive volume fraction (FPVF) and false negative volume fraction (FNVF) has been conducted, yielding a FPVF of 0.75% and an FNVF of 3.75%. More data collection and testing are underway.
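
    The combination of shape and intensity priors can be illustrated with a standard fuzzy intersection (min) of membership values. This is a stand-in sketch, not the authors' FC formulation: the Gaussian intensity membership plays the role of an FIOM and the registered prior map the role of an FSOM, with all values invented.

```python
import numpy as np

def fuzzy_membership(intensity, mu, sigma, shape_prior):
    """Voxel membership as the fuzzy intersection (min) of an intensity
    membership (Gaussian around the expected MR signal, standing in for
    an FIOM) and a registered shape prior (standing in for an FSOM)."""
    m_intensity = np.exp(-0.5 * ((intensity - mu) / sigma) ** 2)
    return np.minimum(m_intensity, shape_prior)

# Three voxels: matching intensity + strong prior, matching intensity +
# weak prior, and off-model intensity + strong prior (values invented).
m = fuzzy_membership(np.array([100.0, 100.0, 30.0]), 100.0, 10.0,
                     np.array([0.9, 0.2, 0.9]))
```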

  19. Mixed methods in gerontological research: Do the qualitative and quantitative data “touch”?

    PubMed Central

    Happ, Mary Beth

    2010-01-01

    This paper distinguishes between parallel and integrated mixed methods research approaches. Barriers to integrated mixed methods approaches in gerontological research are discussed and critiqued. The author presents examples of mixed methods gerontological research to illustrate approaches to data integration at the levels of data analysis, interpretation, and research reporting. As a summary of the methodological literature, four basic levels of mixed methods data combination are proposed. Opportunities for mixing qualitative and quantitative data are explored using contemporary examples from published studies. Data transformation and visual display, judiciously applied, are proposed as pathways to fuller mixed methods data integration and analysis. Finally, practical strategies for mixing qualitative and quantitative data types are explicated as gerontological research moves beyond parallel mixed methods approaches to achieve data integration. PMID:20077973

  20. Objective and quantitative definitions of modified food textures based on sensory and rheological methodology.

    PubMed

    Wendin, Karin; Ekman, Susanne; Bülow, Margareta; Ekberg, Olle; Johansson, Daniel; Rothenberg, Elisabet; Stading, Mats

    2010-06-28

    Patients who suffer from chewing and swallowing disorders, i.e. dysphagia, may have difficulties ingesting normal food and liquids. In these patients a texture-modified diet may enable the patient to maintain adequate nutrition. However, there is no generally accepted definition of 'texture' that includes measurements describing different food textures. Objectively define and quantify categories of texture-modified food by conducting rheological measurements and sensory analyses. A further objective was to facilitate the communication and recommendation of appropriate food textures for patients with dysphagia. About 15 food samples varying in texture qualities were characterized by descriptive sensory and rheological measurements. Soups were perceived as homogenous; thickened soups were perceived as being easier to swallow, more melting and creamy compared with soups without thickener. Viscosity differed between the two types of soups. Texture descriptors for pâtés were characterized by high chewing resistance, firmness, and having larger particles compared with timbales and jellied products. Jellied products were perceived as wobbly, creamy, and easier to swallow. Concerning the rheological measurements, all solid products were more elastic than viscous (G'>G''), belonging to different G' intervals: jellied products (low G') and timbales together with pâtés (higher G'). By combining sensory and rheological measurements, a system of objective, quantitative, and well-defined food textures was developed that characterizes the different texture categories.

  1. Quantitative analysis of a reconstruction method for fully three-dimensional PET.

    PubMed

    Suckling, J; Ott, R J; Deehan, B J

    1992-03-01

    The major advantage of positron emission tomography (PET) using large area planar detectors over scintillator-based commercial ring systems is the potentially larger (by a factor of two or three) axial field-of-view (FOV). However, to achieve the space invariance of the point spread function necessary for Fourier filtering a polar angle rejection criterion is applied to the data during backprojection resulting in a trade-off between FOV size and sensitivity. A new algorithm due to Defrise and co-workers developed for list-mode data overcomes this problem with a solution involving the division of the image into several subregions. A comparison between the existing backprojection-then-filter algorithm and the new method (with three subregions) has been made using both simulated and real data collected from the MUP-PET positron camera. Signal-to-noise analysis reveals that improvements of up to a factor of 1.4 are possible resulting from an increased data usage of up to a factor of 2.5 depending on the axial extent of the imaged object. Quantitation is also improved.

  2. Diagnostic accuracy of stress perfusion CMR in comparison with quantitative coronary angiography: fully quantitative, semiquantitative, and qualitative assessment.

    PubMed

    Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E

    2014-01-01

    This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease, and comparisons to semiquantitative and qualitative methods are limited. Dual-bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to the semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively). Fully quantitative stress perfusion CMR has high diagnostic accuracy for identifying significant coronary artery stenosis.
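    The decision rule described (flag disease when endocardial flow falls below 50% of mean epicardial flow) can be sketched as a sensitivity/specificity computation. The flow ratios and disease labels below are made up for illustration; only the <0.5 thresholding idea follows the abstract.

    ```python
    # Sketch: sensitivity and specificity of a "ratio < threshold" test,
    # evaluated on hypothetical endocardial/epicardial flow ratios.

    def sens_spec(ratios, diseased, threshold=0.5):
        """Sensitivity/specificity of 'ratio < threshold' as a positive test."""
        tp = sum(1 for r, d in zip(ratios, diseased) if r < threshold and d)
        fn = sum(1 for r, d in zip(ratios, diseased) if r >= threshold and d)
        tn = sum(1 for r, d in zip(ratios, diseased) if r >= threshold and not d)
        fp = sum(1 for r, d in zip(ratios, diseased) if r < threshold and not d)
        return tp / (tp + fn), tn / (tn + fp)

    # Toy data: four diseased and four non-diseased territories.
    ratios   = [0.3, 0.45, 0.8, 0.9, 0.6, 0.95, 0.55, 0.2]
    diseased = [True, True, False, False, True, False, False, True]
    sens, spec = sens_spec(ratios, diseased)
    print(sens, spec)  # → 0.75 1.0 (one diseased territory above threshold)
    ```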

  3. Quantitative methods used in Australian health promotion research: a review of publications from 1992-2002.

    PubMed

    Smith, Ben J; Zehle, Katharina; Bauman, Adrian E; Chau, Josephine; Hawkshaw, Barbara; Frost, Steven; Thomas, Margaret

    2006-04-01

    This study examined the use of quantitative methods in Australian health promotion research in order to identify methodological trends and priorities for strengthening the evidence base for health promotion. Australian health promotion articles were identified by hand searching publications from 1992-2002 in six journals: Health Promotion Journal of Australia, Australian and New Zealand Journal of Public Health, Health Promotion International, Health Education Research, Health Education and Behavior and the American Journal of Health Promotion. The study designs and statistical methods used in articles presenting quantitative research were recorded. 591 (57.7%) of the 1,025 articles used quantitative methods. Cross-sectional designs were used in the majority (54.3%) of studies, with pre- and post-test (14.6%) and post-test only (9.5%) the next most common designs. Bivariate statistical methods were used in 45.9% of papers, multivariate methods in 27.1% and simple numbers and proportions in 25.4%. Few studies used higher-level statistical techniques. While most studies used quantitative methods, the majority were descriptive in nature. The study designs and statistical methods used provided limited scope for demonstrating intervention effects or understanding the determinants of change.

  4. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    PubMed

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods (multi-criteria decision analysis, health outcomes modeling, and stated-choice survey) are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria, and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied to medicines, with fewer BRAs in devices. Despite substantial differences in the characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  5. Object-Based Dense Matching Method for Maintaining Structure Characteristics of Linear Buildings

    PubMed Central

    Yan, Yiming; Qiu, Mingjie; Zhao, Chunhui; Wang, Liguo

    2018-01-01

    In this paper, we propose a novel object-based dense matching method designed specifically for high-precision disparity maps of building objects in urban areas, which can maintain accurate object structure characteristics. The proposed framework mainly includes three stages. Firstly, an improved edge line extraction method is proposed so that the edge segments fit closely to building outlines. Secondly, a fusion method is proposed for the outlines under the constraint of straight lines, which preserves the building's structural attribute of parallel or vertical edges and is very useful for dense matching. Finally, we propose an edge constraint and outline compensation (ECAOC) dense matching method to maintain building object structural characteristics in the disparity map. In the proposed method, the improved edge lines are used to optimize the matching search scope and matching template window, and the high-precision building outlines are used to compensate the shape features of building objects. Our method can greatly increase the matching accuracy of building objects in urban areas, especially at building edges. For the outline extraction experiments, our fusion method demonstrates superiority and robustness on panchromatic images from different satellites and at different resolutions. For the dense matching experiments, our ECAOC method shows great advantages in matching accuracy for building objects in urban areas compared with three other methods. PMID:29596393

  6. An efficient direct method for image registration of flat objects

    NASA Astrophysics Data System (ADS)

    Nikolaev, Dmitry; Tihonkih, Dmitrii; Makovetskii, Artyom; Voronin, Sergei

    2017-09-01

    Image alignment of rigid surfaces is a rapidly developing area of research and has many practical applications. Alignment methods can be roughly divided into two types: feature-based methods and direct methods. The well-known SURF and SIFT algorithms are examples of feature-based methods. Direct methods are those that exploit pixel intensities without resorting to image features; image-based deformation is a general direct method for aligning images of deformable objects in 3D space. Nevertheless, it is not well suited to the registration of images of 3D rigid objects, since the underlying structure cannot be directly evaluated. In this article, we propose a model that is suitable for image alignment of rigid flat objects under various illumination models. The brightness consistency assumption is used to reconstruct the optimal geometrical transformation. Computer simulation results are provided to illustrate the performance of the proposed algorithm in computing a correspondence between the pixels of two images.

  7. Comparison Of Methods Used In Cartography For The Skeletonisation Of Areal Objects

    NASA Astrophysics Data System (ADS)

    Szombara, Stanisław

    2015-12-01

    The article presents a method for comparing skeletonisation methods for areal objects. The skeleton of an areal object, being its linear representation, is used, among others, in cartographic visualisation. The method allows any skeletonisation methods to be compared in terms of, on the one hand, the deviations of the distances between the object's skeleton and its border and, on the other, the distortions introduced by skeletonisation. In the article, 5 methods were compared: Voronoi diagrams, densified Voronoi diagrams, constrained Delaunay triangulation, Straight Skeleton and Medial Axis (Transform). The results of the comparison were illustrated using several areal objects. The comparison showed that in all the analysed objects the Medial Axis (Transform) gives the smallest distortion and deviation values, which allows us to recommend it.

  8. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  9. Quantitative methods in electroencephalography to access therapeutic response.

    PubMed

    Diniz, Roseane Costa; Fontenele, Andrea Martins Melo; Carmo, Luiza Helena Araújo do; Ribeiro, Aurea Celeste da Costa; Sales, Fábio Henrique Silva; Monteiro, Sally Cristina Moutinho; Sousa, Ana Karoline Ferreira de Castro

    2016-07-01

    Pharmacometrics, or Quantitative Pharmacology, aims to quantitatively analyze the interaction between drugs and patients, resting on the tripod of pharmacokinetics, pharmacodynamics and disease monitoring to identify variability in drug response. As a subject of central interest in the training of pharmacists, this work was carried out with a view to promoting this idea through methods to assess the therapeutic response of drugs with central action. This paper discusses quantitative methods (Fast Fourier Transform, Magnitude Square Coherence, Conditional Entropy, Generalised Linear semi-canonical Correlation Analysis, Statistical Parametric Network and Mutual Information Function) used to evaluate EEG signals obtained after drug administration regimens, the main findings and their clinical relevance, pointing to them as a contribution to the construction of a different pharmaceutical practice. Peter Anderer et al. in 2000 showed the effect of 20 mg of buspirone in 20 healthy subjects at 1, 2, 4, 6 and 8 h after oral ingestion of the drug. The areas of increased power of the theta frequency occurred mainly in the temporo-occipito-parietal region. It was shown by Sampaio et al. in 2007 that the use of bromazepam, which allows the release of GABA (gamma-aminobutyric acid), an inhibitory neurotransmitter of the central nervous system, could theoretically promote dissociation of cortical functional areas, a decrease of functional connectivity and a decrease of cognitive functions, by means of smaller coherence values (an electrophysiological magnitude measured from the EEG by software). Ahmad Khodayari-Rostamabad et al. in 2015 suggested that such a measure could potentially be a useful clinical tool to assess adverse effects of opioids and hence give rise to treatment guidelines. There was a relation between changes in pain intensity and brain sources (at maximum activity locations) during remifentanil infusion despite its potent analgesic effect. The statement of mathematical and computational
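    One of the listed measures, magnitude squared coherence, can be sketched as a Welch-style average over signal segments. The segment count, sampling rate, and the shared 6 Hz (theta-band) test rhythm below are illustrative assumptions, not values taken from the cited studies.

    ```python
    import numpy as np

    # Sketch: magnitude squared coherence between two signals, estimated by
    # averaging cross- and auto-spectra over equal-length segments (without
    # averaging, single-segment coherence is identically 1).

    def msc(x, y, nseg=16):
        """Magnitude squared coherence averaged over nseg segments."""
        n = len(x) // nseg
        sxx = syy = sxy = 0.0
        for k in range(nseg):
            fx = np.fft.rfft(x[k*n:(k+1)*n])
            fy = np.fft.rfft(y[k*n:(k+1)*n])
            sxx = sxx + np.abs(fx) ** 2
            syy = syy + np.abs(fy) ** 2
            sxy = sxy + fx * np.conj(fy)
        return np.abs(sxy) ** 2 / (sxx * syy)

    rng = np.random.default_rng(0)
    t = np.arange(4096) / 256.0                  # 16 s at 256 Hz (illustrative)
    common = np.sin(2 * np.pi * 6.0 * t)         # shared 6 Hz rhythm
    x = common + 0.5 * rng.standard_normal(t.size)
    y = common + 0.5 * rng.standard_normal(t.size)
    c = msc(x, y)
    freqs = np.fft.rfftfreq(4096 // 16, d=1/256.0)
    print(c[np.argmin(np.abs(freqs - 6.0))] > 0.8)  # high coherence at 6 Hz
    ```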

  10. The Objective Borderline Method: A Probabilistic Method for Standard Setting

    ERIC Educational Resources Information Center

    Shulruf, Boaz; Poole, Phillippa; Jones, Philip; Wilkinson, Tim

    2015-01-01

    A new probability-based standard setting technique, the Objective Borderline Method (OBM), was introduced recently. This was based on a mathematical model of how test scores relate to student ability. The present study refined the model and tested it using 2500 simulated data-sets. The OBM was feasible to use. On average, the OBM performed well…

  11. Bayesian data augmentation methods for the synthesis of qualitative and quantitative research findings

    PubMed Central

    Crandell, Jamie L.; Voils, Corrine I.; Chang, YunKyung; Sandelowski, Margarete

    2010-01-01

    The possible utility of Bayesian methods for the synthesis of qualitative and quantitative research has been repeatedly suggested but insufficiently investigated. In this project, we developed and used a Bayesian method for synthesis, with the goal of identifying factors that influence adherence to HIV medication regimens. We investigated the effect of 10 factors on adherence. Recognizing that not all factors were examined in all studies, we considered standard methods for dealing with missing data and chose a Bayesian data augmentation method. We were able to summarize, rank, and compare the effects of each of the 10 factors on medication adherence. This is a promising methodological development in the synthesis of qualitative and quantitative research. PMID:21572970

  12. Quantitative Urine Culture Method Using a Plastic "Paddle" Containing Dual Media

    PubMed Central

    Craig, William A.; Kunin, Calvin M.

    1972-01-01

    A new dip-inoculum method for quantitative urine culture is described which utilizes a dual-chambered plastic "paddle" housing both a general purpose and differential medium. Comparative bacterial counts of 1,000 clinical specimens using the pour plate and this device were identical in 82.9% and within a factor of five in 95.6%. The "paddle" detected all but 19 of 258 specimens (92.6%) with 100,000 or greater colonies per ml. This simple, convenient method should allow more extensive use of quantitative urine culture in the diagnosis and follow-up of patients with urinary tract infections in office practice. It should not be considered as a substitute for the more definitive pour plate method or for standard methods for characterization of bacteriological species when more exact information is required. PMID:4555636

  13. A singular-value method for reconstruction of nonradial and lossy objects.

    PubMed

    Jiang, Wei; Astheimer, Jeffrey; Waag, Robert

    2012-03-01

    Efficient inverse scattering algorithms for nonradial lossy objects are presented using singular-value decomposition to form reduced-rank representations of the scattering operator. These algorithms extend eigenfunction methods that are not applicable to nonradial lossy scattering objects because the scattering operators for these objects do not have orthonormal eigenfunction decompositions. A method of local reconstruction by segregation of scattering contributions from different local regions is also presented. Scattering from each region is isolated by forming a reduced-rank representation of the scattering operator that has domain and range spaces comprised of far-field patterns with retransmitted fields that focus on the local region. Methods for the estimation of the boundary, average sound speed, and average attenuation slope of the scattering object are also given. These methods yielded approximations of scattering objects that were sufficiently accurate to allow residual variations to be reconstructed in a single iteration. Calculated scattering from a lossy elliptical object with a random background, internal features, and white noise is used to evaluate the proposed methods. Local reconstruction yielded images with spatial resolution that is finer than a half wavelength of the center frequency and reproduces sound speed and attenuation slope with relative root-mean-square errors of 1.09% and 11.45%, respectively.
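    The reduced-rank step underlying these algorithms is a truncated singular-value decomposition. This is a minimal sketch in which a random matrix with a decaying spectrum stands in for a discretized scattering operator; the sizes and decay rate are illustrative.

    ```python
    import numpy as np

    # Sketch: best rank-k approximation of an operator via truncated SVD
    # (Eckart-Young), the core linear-algebra step in reduced-rank methods.

    def reduced_rank(a, rank):
        """Return the best rank-`rank` approximation of matrix a."""
        u, s, vh = np.linalg.svd(a, full_matrices=False)
        return u[:, :rank] * s[:rank] @ vh[:rank, :]

    rng = np.random.default_rng(1)
    # Build a 60x60 matrix with rapidly decaying singular values.
    u, _ = np.linalg.qr(rng.standard_normal((60, 60)))
    v, _ = np.linalg.qr(rng.standard_normal((60, 60)))
    s = 2.0 ** -np.arange(60.0)
    a = (u * s) @ v.T                      # A = U diag(s) V^T
    a5 = reduced_rank(a, 5)
    rel_err = np.linalg.norm(a - a5) / np.linalg.norm(a)
    print(rel_err < 1e-1)                  # rank 5 captures the decayed spectrum
    ```

    With singular values decaying as 2^-k, the Frobenius error of the rank-5 approximation is dominated by the sixth singular value, so the relative error is on the order of a few percent.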

  14. Comparison of Objective and Subjective Methods on Determination of Differential Item Functioning

    ERIC Educational Resources Information Center

    Sahin, Melek Gülsah

    2017-01-01

    The research objective is to compare the objective methods often used in the literature for the determination of differential item functioning (DIF) with a subjective method based on the opinions of experts, which is not used so often in the literature. Mantel-Haenszel (MH), Logistic Regression (LR) and SIBTEST are chosen as objective methods. While the data…

  15. Simple and rapid method for isolation and quantitation of polyhydroxyalkanoate by SDS-sonication treatment.

    PubMed

    Arikawa, Hisashi; Sato, Shunsuke; Fujiki, Tetsuya; Matsumoto, Keiji

    2017-08-01

    We developed a new method for isolation and quantitation of polyhydroxyalkanoate (PHA) from culture broth. In this method, the cells were sonicated in sodium dodecyl sulfate (SDS) solution and centrifuged to recover PHA. The recovered PHA was rinsed with deionized water and ethanol, and then weighed after drying. Hazardous chemicals such as chloroform, methanol, and sulfuric acid were not used, and no expensive analytical instruments were needed. We applied this method to Cupriavidus necator culture broths that included various amounts of poly(3-hydroxybutyrate) (PHB) or poly(3-hydroxybutyrate-co-3-hydroxyhexanoate) (PHBHHx) from flasks and jar fermentors. Quantitation by this method was practical across a wide range of production amounts and PHA monomer compositions compared to the conventional whole-cell methanolysis method with gas chromatographic analysis; moreover, the recovered PHAs were adequately pure (≥96% purity). Therefore, this new method would be valuable not only for quantitation of PHA but also for preparation of samples to characterize their mechanical properties. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  16. MR Imaging-based Semi-quantitative Methods for Knee Osteoarthritis

    PubMed Central

    JARRAYA, Mohamed; HAYASHI, Daichi; ROEMER, Frank Wolfgang; GUERMAZI, Ali

    2016-01-01

    Magnetic resonance imaging (MRI)-based semi-quantitative (SQ) methods applied to knee osteoarthritis (OA) were introduced during the last decade and have since fundamentally changed our understanding of knee OA pathology. Several epidemiological studies and clinical trials have used MRI-based SQ methods to evaluate different outcome measures. Interest in MRI-based SQ scoring systems has led to their continuous update and refinement. This article reviews the different SQ approaches for MRI-based whole-organ assessment of knee OA and also discusses practical aspects of whole-joint assessment. PMID:26632537

  17. A Study of Dim Object Detection for the Space Surveillance Telescope

    DTIC Science & Technology

    2013-03-21

    ENG-13-M-32 Abstract Current methods of dim object detection for space surveillance make use of a Gaussian log-likelihood-ratio-test-based... quantitatively comparing the efficacy of two methods for dim object detection, termed in this paper the point detector and the correlator, both of which rely... applications. It is used in national defense for detecting satellites. It is used to detect space debris, which threatens both civilian and

  18. A New Moving Object Detection Method Based on Frame-difference and Background Subtraction

    NASA Astrophysics Data System (ADS)

    Guo, Jiajia; Wang, Junping; Bai, Ruixue; Zhang, Yao; Li, Yong

    2017-09-01

    Although many methods of moving object detection have been proposed, moving object extraction remains the core task in video surveillance. However, in the complex scenes of the real world, false detections, missed detections and deficiencies resulting from cavities inside the body still exist. In order to solve the problem of incomplete detection of moving objects, a new moving object detection method combining an improved frame-difference and Gaussian mixture background subtraction is proposed in this paper. To make moving object detection more complete and accurate, image repair and morphological processing techniques, which provide spatial compensation, are applied in the proposed method. Experimental results show that our method can effectively eliminate ghosts and noise and fill the cavities of the moving object. Compared to four other moving object detection methods, namely GMM, VIBE, frame-difference and a method from the literature, the proposed method improves the efficiency and accuracy of detection.
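    The combination of frame differencing and background subtraction can be sketched in simplified form. Here a running-average background model stands in for the paper's Gaussian mixture model, and the image-repair and morphological steps are omitted; the thresholds and toy sequence are illustrative, not from the paper.

    ```python
    import numpy as np

    # Sketch: union of two foreground cues per frame:
    #   1) frame difference against the previous frame, and
    #   2) difference against a running-average background model.

    def detect_moving(frames, alpha=0.1, diff_thr=20, bg_thr=25):
        """Return a boolean foreground mask for each frame after the first."""
        bg = frames[0].astype(float)
        prev = frames[0]
        masks = []
        for f in frames[1:]:
            frame_diff = np.abs(f.astype(float) - prev) > diff_thr
            bg_diff = np.abs(f.astype(float) - bg) > bg_thr
            masks.append(frame_diff | bg_diff)   # union of the two cues
            bg = (1 - alpha) * bg + alpha * f    # update background model
            prev = f
        return masks

    # Toy sequence: a bright 4x4 block moves across a dark 20x20 scene.
    frames = []
    for x in (2, 8, 14):
        f = np.full((20, 20), 10, dtype=np.uint8)
        f[8:12, x:x+4] = 200
        frames.append(f)
    masks = detect_moving(frames)
    print(masks[0][9, 9], masks[0][0, 0])  # → True False
    ```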

  19. Error analysis of motion correction method for laser scanning of moving objects

    NASA Astrophysics Data System (ADS)

    Goel, S.; Lohani, B.

    2014-05-01

    The limitation of conventional laser scanning methods is that the objects being scanned must be static. The need to scan moving objects has resulted in the development of new methods capable of generating correct 3D geometry of moving objects. Limited literature is available on the few methods capable of addressing the problem of object motion during scanning, and all of the existing methods utilize their own models or sensors. Studies on error modelling or analysis of any of these motion correction methods are lacking in the literature. In this paper, we develop the error budget and present the analysis of one such `motion correction' method. This method assumes the availability of position and orientation information for the moving object, which in general can be obtained by installing a POS system on board or by use of tracking devices. It then uses this information along with the laser scanner data to apply a correction to the laser data, thus producing correct geometry despite the object being mobile during scanning. The major application of this method lies in the shipping industry, to scan ships either moving or parked at sea, and to scan other objects like hot air balloons or aerostats. It is to be noted that the other methods of "motion correction" described in the literature cannot be applied to scan the objects mentioned here, making the chosen method quite unique. This paper presents some interesting insights into the functioning of the "motion correction" method as well as a detailed account of the behavior and variation of the error due to different sensor components, alone and in combination with each other. The analysis can be used to obtain insights into the optimal utilization of the available components for achieving the best results.
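    The core correction step, mapping scanner-frame measurements back into the object's own frame using the known pose, can be sketched as a rigid-body transform. The pose values and the helper name below are hypothetical illustrations of the idea, not the paper's implementation.

    ```python
    import numpy as np

    # Sketch: undoing object motion for one scan point, assuming the object's
    # pose (rotation R, translation t) at the scan instant is known from a
    # POS/tracking system. A point measured in the scanner frame is mapped
    # back into the object frame, cancelling the motion.

    def undo_motion(p_scanner, R, t):
        """Map a scanner-frame point into the object frame: R^T (p - t)."""
        return R.T @ (np.asarray(p_scanner, float) - t)

    # Object rotated 90 deg about z and shifted by (5, 0, 0) when measured.
    c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
    R = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
    t = np.array([5.0, 0.0, 0.0])
    p_object = np.array([1.0, 2.0, 3.0])        # true point on the object
    p_measured = R @ p_object + t               # what the scanner records
    recovered = undo_motion(p_measured, R, t)
    print(np.allclose(recovered, p_object))     # → True
    ```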

  1. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  2. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Technical Performance Assessment

    PubMed Central

    2017-01-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in the designs, analysis methods and metrics used to assess a QIB for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarker Alliance (QIBA), with technical, radiological and statistical experts, developed a set of technical performance analysis methods, metrics and study designs that provide terminology, metrics and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted or combined. PMID:24919831
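    A widely used repeatability metric in this setting is the repeatability coefficient, RC = 2.77 × within-subject SD, the 95% limit for the absolute difference between two repeat measurements. The formula is standard metrology; the test-retest values below are made up for illustration.

    ```python
    import math

    # Sketch: repeatability coefficient from test-retest pairs. The
    # within-subject variance is estimated as the mean of d_i^2 / 2
    # over per-subject differences d_i, and RC = 2.77 * within-subject SD.

    def repeatability_coefficient(pairs):
        """RC from a list of (test, retest) measurement pairs."""
        wvar = sum((a - b) ** 2 / 2 for a, b in pairs) / len(pairs)
        return 2.77 * math.sqrt(wvar)

    # Hypothetical test-retest tumor volumes (mL) for five subjects.
    pairs = [(10.2, 10.8), (25.1, 24.5), (8.9, 9.4), (31.0, 30.2), (15.5, 15.9)]
    rc = repeatability_coefficient(pairs)
    print(round(rc, 2))  # → 1.17
    ```

    Two repeat measurements of the same subject would then be expected to differ by less than about 1.17 mL in 95% of cases, for this toy data.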

  3. A scoring system for appraising mixed methods research, and concomitantly appraising qualitative, quantitative and mixed methods primary studies in Mixed Studies Reviews.

    PubMed

    Pluye, Pierre; Gagnon, Marie-Pierre; Griffiths, Frances; Johnson-Lafleur, Janique

    2009-04-01

    A new form of literature review has emerged, the Mixed Studies Review (MSR). These reviews include qualitative, quantitative and mixed methods studies. In the present paper, we examine MSRs in health sciences and provide guidance on the processes that should be included and reported. However, there are no valid and usable criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies. The objective was therefore to propose criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies or study components. A three-step critical review was conducted: 2,322 references were identified in MEDLINE, and their titles and abstracts were screened; 149 potentially relevant references were selected and the full-text papers were examined; 59 MSRs were retained and scrutinized using a deductive-inductive qualitative thematic data analysis. This revealed three types of MSR: convenience, reproducible, and systematic. Guided by a proposal, we conducted a qualitative thematic data analysis of the quality appraisal procedures used in the 17 systematic MSRs (SMSRs). Of the 17 SMSRs, 12 showed clear quality appraisal procedures with explicit criteria, but no SMSR used valid checklists to concomitantly appraise qualitative, quantitative and mixed methods studies. In two SMSRs, criteria were developed following a specific procedure. Checklists usually contained more criteria than needed. In four SMSRs, a reliability assessment was described or mentioned. While criteria for quality appraisal were usually based on descriptors that require specific methodological expertise (e.g., appropriateness), no SMSR described the fit between reviewers' expertise and the appraised studies. Quality appraisal usually resulted in studies being ranked by methodological quality. A scoring system is proposed for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies for SMSRs. This

  4. Quantitative magnetic resonance micro-imaging methods for pharmaceutical research.

    PubMed

    Mantle, M D

    2011-09-30

    The use of magnetic resonance imaging (MRI) as a tool in pharmaceutical research is now well established, and the current literature covers a multitude of pharmaceutically relevant research areas. This review focuses on quantitative magnetic resonance micro-imaging techniques and how they have been exploited to extract information of direct relevance to the pharmaceutical industry. The article is divided into two main parts. The first half outlines the theoretical aspects of magnetic resonance: it deals with basic magnetic resonance theory and the effects of nuclear spin-lattice (T1) relaxation, spin-spin (T2) relaxation, and molecular diffusion upon image quantitation, and discusses the application of rapid magnetic resonance imaging techniques. In addition to the theory, the review aims to provide practical guidelines for the pharmaceutical researcher with an interest in MRI as to which MRI pulse sequences/protocols should be used and when. The second half of the article reviews recent advances and developments in the literature concerning the application of quantitative micro-imaging methods to pharmaceutically relevant research. Copyright © 2010 Elsevier B.V. All rights reserved.
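The relaxation effects outlined above can be made concrete with the idealized spin-echo signal equation, S = S0 (1 - exp(-TR/T1)) exp(-TE/T2). The sketch below is an illustrative simplification (not a protocol from the review); it shows how the repetition time TR and echo time TE determine how strongly T1 and T2 weight the measured intensity, and hence bias quantitation:

```python
import math

def spin_echo_signal(s0, tr, te, t1, t2):
    """Idealized spin-echo signal intensity.

    S = S0 * (1 - exp(-TR/T1)) * exp(-TE/T2)
    Assumes mono-exponential relaxation and neglects diffusion attenuation.
    """
    return s0 * (1.0 - math.exp(-tr / t1)) * math.exp(-te / t2)

# For quantitative imaging, TR >> T1 and TE << T2 keep the measured
# intensity close to the true spin density S0.
signal = spin_echo_signal(s0=1.0, tr=5000.0, te=10.0, t1=1000.0, t2=100.0)
```

With TR = 5 T1 and TE = T2/10 (hypothetical values), roughly 10% of the signal is already lost to relaxation; shorter TR or longer TE biases quantitation further.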

  5. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  6. Method for the Simultaneous Quantitation of Apolipoprotein E Isoforms using Tandem Mass Spectrometry

    PubMed Central

    Wildsmith, Kristin R.; Han, Bomie; Bateman, Randall J.

    2009-01-01

    Using Apolipoprotein E (ApoE) as a model protein, we developed a protein isoform analysis method utilizing Stable Isotope Labeling Tandem Mass Spectrometry (SILT MS). ApoE isoforms are quantitated using the intensities of the b and y ions of the 13C-labeled tryptic isoform-specific peptides versus unlabeled tryptic isoform-specific peptides. The ApoE protein isoform analysis using SILT allows for the simultaneous detection and relative quantitation of different ApoE isoforms from the same sample. This method provides a less biased assessment of ApoE isoforms compared to antibody-dependent methods, and may lead to a better understanding of the biological differences between isoforms. PMID:19653990
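The quantitation step described above reduces to a ratio of summed fragment-ion intensities between the unlabeled sample peptide and its co-analyzed 13C-labeled standard. A minimal sketch (the intensity values are invented, and the two isoforms shown are illustrative, not the actual ApoE tryptic data):

```python
def isoform_ratio(sample_ion_intensities, labeled_ion_intensities):
    """Relative quantitation: summed b/y fragment-ion intensities of the
    unlabeled (sample) isoform-specific peptide divided by those of the
    co-analyzed 13C-labeled internal standard."""
    return sum(sample_ion_intensities) / sum(labeled_ion_intensities)

# Hypothetical b/y ion intensities for two isoform-specific peptides
# measured in the same run against their 13C-labeled counterparts.
apoe3 = isoform_ratio([4.0e5, 2.5e5, 1.5e5], [2.0e5, 1.25e5, 0.75e5])
apoe4 = isoform_ratio([1.0e5, 0.6e5, 0.4e5], [2.0e5, 1.2e5, 0.8e5])
```

Because both isoforms are referenced to labeled standards in the same sample, the two ratios are directly comparable, which is what makes simultaneous relative quantitation possible.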

  7. A method for the quantitative determination of crystalline phases by X-ray

    NASA Technical Reports Server (NTRS)

    Petzenhauser, I.; Jaeger, P.

    1988-01-01

    A mineral analysis method is described for rapid quantitative determination of crystalline substances in those cases in which the sample is present in pure form or in a mixture of known composition. With this method there is no need for prior chemical analysis.

  8. Object-Oriented Image Clustering Method Using UAS Photogrammetric Imagery

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Larson, A.; Schultz-Fellenz, E. S.; Sussman, A. J.; Swanson, E.; Coppersmith, R.

    2016-12-01

    Unmanned Aerial Systems (UAS) have been widely used as an imaging modality to obtain remotely sensed multi-band surface imagery, and are growing in popularity due to their efficiency, ease of use, and affordability. Los Alamos National Laboratory (LANL) has employed UAS for geologic site characterization and change detection studies at a variety of field sites. The deployed UAS was equipped with a standard visible-band camera to collect imagery datasets. Based on the imagery collected, we use deep sparse algorithmic processing to detect and discriminate subtle topographic features created or impacted by subsurface activities. In this work, we develop an object-oriented remote sensing imagery clustering method for land cover classification. To improve clustering and segmentation accuracy, instead of using conventional pixel-based clustering methods, we integrate spatial information from neighboring regions to create super-pixels, avoiding salt-and-pepper noise and subsequent over-segmentation. To further improve the robustness of our clustering method, we also incorporate a custom digital elevation model (DEM) dataset, generated using a structure-from-motion (SfM) algorithm, together with the red, green, and blue (RGB) band data for clustering. In particular, we first employ agglomerative clustering to create an initial segmentation map, in which every object is treated as a single (new) pixel. Based on the new pixels obtained, we generate new features to implement another level of clustering. We apply our clustering method to the RGB+DEM datasets collected at the field site. Through binary clustering and multi-object clustering tests, we verify that our method can accurately separate vegetation from non-vegetation regions and can also differentiate object features on the surface.
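The two-level idea (average pixels into super-pixels, then cluster the super-pixel features) can be sketched as follows. This is a deliberately simplified stand-in: square blocks replace the agglomerative initial segmentation, a binary k-means replaces the full multi-object clustering, and the RGB+DEM band values are synthetic:

```python
import numpy as np

def block_superpixels(bands, block=4):
    """Average an (H, W, C) multi-band image (e.g. R, G, B, DEM) over
    square blocks -- a crude stand-in for the initial agglomerative
    segmentation that yields super-pixel feature vectors."""
    h, w, c = bands.shape
    bands = bands[:h - h % block, :w - w % block]
    blocks = bands.reshape(bands.shape[0] // block, block,
                           bands.shape[1] // block, block, c)
    return blocks.mean(axis=(1, 3)).reshape(-1, c)

def two_means(features, iters=20):
    """Minimal binary k-means over super-pixel feature vectors
    (assumes both clusters stay non-empty)."""
    centers = features[[features[:, 0].argmin(), features[:, 0].argmax()]]
    for _ in range(iters):
        dist = ((features[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dist.argmin(axis=1)
        centers = np.array([features[labels == k].mean(axis=0) for k in (0, 1)])
    return labels

# Synthetic 8x8 scene: left half "vegetation" (low red, high green, low DEM),
# right half bare ground (high red, lower green, higher DEM).
scene = np.zeros((8, 8, 4))
scene[:, :4] = [0.1, 0.8, 0.1, 10.0]
scene[:, 4:] = [0.6, 0.3, 0.2, 12.0]
labels = two_means(block_superpixels(scene))
```

Clustering block averages rather than raw pixels is what suppresses the salt-and-pepper labeling that pixel-wise methods produce.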

  9. Combinational pixel-by-pixel and object-level classifying, segmenting, and agglomerating in performing quantitative image analysis that distinguishes between healthy non-cancerous and cancerous cell nuclei and delineates nuclear, cytoplasm, and stromal material objects from stained biological tissue materials

    DOEpatents

    Boucheron, Laura E

    2013-07-16

    Quantitative object- and spatial-arrangement-level analysis of tissue is detailed, using expert (pathologist) input to guide the classification process. A two-step method is disclosed for imaging tissue: classifying one or more biological materials, e.g. nuclei, cytoplasm, and stroma, in the tissue into one or more identified classes on a pixel-by-pixel basis, and segmenting the identified classes to agglomerate one or more sets of identified pixels into segmented regions. Typically, the one or more biological materials comprise nuclear material, cytoplasm material, and stromal material. The method further allows a user to mark up the image after classification to re-classify said materials. The markup is performed via a graphical user interface to edit designated regions in the image.

  10. Validation of a Spectral Method for Quantitative Measurement of Color in Protein Drug Solutions.

    PubMed

    Yin, Jian; Swartz, Trevor E; Zhang, Jian; Patapoff, Thomas W; Chen, Bartolo; Marhoul, Joseph; Shih, Norman; Kabakoff, Bruce; Rahimi, Kimia

    2016-01-01

    A quantitative spectral method has been developed to precisely measure the color of protein solutions. In this method, a spectrophotometer is utilized for capturing the visible absorption spectrum of a protein solution, which can then be converted to color values (L*a*b*) that represent human perception of color in a quantitative three-dimensional space. These quantitative values (L*a*b*) allow for calculating the best match of a sample's color to a European Pharmacopoeia reference color solution. In order to qualify this instrument and assay for use in clinical quality control, a technical assessment was conducted to evaluate the assay suitability and precision. Setting acceptance criteria for this study required development and implementation of a unique statistical method for assessing precision in 3-dimensional space. Different instruments, cuvettes, protein solutions, and analysts were compared in this study. The instrument accuracy, repeatability, and assay precision were determined. The instrument and assay were found suitable for use in assessing the color of drug substances and drug products, and the method is comparable to the current European Pharmacopoeia visual assessment method. In the biotechnology industry, visual assessment is the most commonly used method for color characterization, batch release, and stability testing of liquid protein drug solutions. Using this method, an analyst visually determines the color of the sample by choosing the closest match to a standard color series. This visual method can be subjective because it requires an analyst to make a judgment of the best match of the sample's color to the standard color series, and it does not capture data on hue and chroma that would allow for improved product characterization and the ability to detect subtle differences between samples. To overcome these challenges, we developed a quantitative spectral method for color determination that greatly reduces the variability in measuring color and allows…
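Once a sample's L*a*b* values are in hand, the best-matching reference is the one at minimum colour distance. A sketch using the CIE76 delta-E metric (the reference names follow the Ph. Eur. BY scale, but the numeric L*a*b* values below are invented placeholders, not actual pharmacopoeial data):

```python
def delta_e76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in L*a*b* space."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

def best_match(sample_lab, references):
    """Name of the reference colour solution closest to the sample."""
    return min(references, key=lambda name: delta_e76(sample_lab, references[name]))

# Placeholder reference series (illustrative values only).
refs = {"BY5": (99.0, -1.0, 5.0),
        "BY6": (99.5, -0.5, 2.5),
        "BY7": (99.8, -0.2, 1.2)}
match = best_match((99.4, -0.6, 2.8), refs)
```

Unlike a visual comparison, the distance itself is retained, so sub-reference-step differences in hue and chroma between batches remain measurable.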

  11. A Method for a Retrospective Analysis of Course Objectives: Have Pursued Objectives in Fact Been Attained? Twente Educational Report Number 7.

    ERIC Educational Resources Information Center

    Plomp, Tjeerd; van der Meer, Adri

    A method pertaining to the identification and analysis of course objectives is discussed. A framework is developed by which post facto objectives can be determined and students' attainment of the objectives can be assessed. The method can also be used for examining the quality of instruction. Using this method, it is possible to determine…

  12. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A summary of a new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is presented. Results of the analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are given, together with the extraction method and an equation for quantitation. A graph showing the relationship between concentration and fluorescence units is provided; it may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean).
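The quantitation equation amounts to a fluorescence-to-concentration calibration line through the origin. A minimal sketch with invented calibration points (the paper's actual coefficients are not reproduced here):

```python
def calibration_slope(concentrations, fluorescence):
    """Least-squares slope k of F = k * C forced through the origin."""
    num = sum(c * f for c, f in zip(concentrations, fluorescence))
    den = sum(c * c for c in concentrations)
    return num / den

def phe_concentration(fluorescence_units, k):
    """Invert the calibration line: C = F / k."""
    return fluorescence_units / k

# Hypothetical standards for a 575-580 nm (cryptophyte-type) PHE extract.
k = calibration_slope([0.5, 1.0, 2.0, 4.0], [26.0, 49.0, 101.0, 199.0])
sample_conc = phe_concentration(75.0, k)
```

A separate slope would be fitted for each fluorescence maximum (575-580 nm vs. 560 nm), since the PHE types differ in response per unit concentration.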

  13. Electrochemical method for synthesizing metal-containing particles and other objects

    DOEpatents

    Rondinone, Adam Justin; Ivanov, Ilia N.; Smith, Sean Campbell; Liang, Chengdu; Hensley, Dale K.; Moon, Ji-Won; Phelps, Tommy Joe

    2017-05-02

    The invention is directed to a method for producing metal-containing (e.g., non-oxide, oxide, or elemental) nano-objects, which may be nanoparticles or nanowires, the method comprising contacting an aqueous solution comprising a metal salt and water with an electrically powered electrode to form said metal-containing nano-objects dislodged from the electrode, wherein said electrode possesses a nanotextured surface that functions to confine the particle growth process to form said metal-containing nano-objects. The invention is also directed to the resulting metal-containing compositions as well as devices in which they are incorporated.

  14. An improved transmutation method for quantitative determination of the components in multicomponent overlapping chromatograms.

    PubMed

    Shao, Xueguang; Yu, Zhengliang; Ma, Chaoxiong

    2004-06-01

    An improved method is proposed for the quantitative determination of the components in multicomponent overlapping chromatograms, based on a known transmutation method. To overcome the main limitation of the transmutation method, the oscillation generated in the transmutation process, two techniques were adopted: wavelet transform smoothing and cubic spline interpolation for reducing the number of data points; a new criterion was also developed. Using the proposed algorithm, the oscillation can be suppressed effectively, and quantitative determination of the components in both simulated and experimental overlapping chromatograms is successfully achieved.
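The oscillation-suppression idea can be illustrated with the simplest possible wavelet smoother: a Haar decomposition in which the detail coefficients are discarded and the signal is rebuilt from the approximations alone. This stands in for the paper's combination of wavelet-transform smoothing and cubic-spline data reduction, and it assumes an even-length signal:

```python
def haar_smooth(signal, levels=1):
    """Suppress high-frequency oscillation by one or more levels of Haar
    wavelet smoothing: keep pairwise averages (approximation coefficients),
    drop pairwise differences (detail coefficients), then reconstruct.
    Assumes len(signal) is divisible by 2**levels."""
    s = list(signal)
    for _ in range(levels):
        approx = [(s[i] + s[i + 1]) / 2.0 for i in range(0, len(s), 2)]
        s = [a for a in approx for _ in range(2)]  # rebuild without details
    return s
```

A pure alternating oscillation is removed entirely, while a slowly varying chromatographic profile passes through almost unchanged, which is the property the transmutation iteration needs.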

  15. Evaluation of the remineralization capacity of CPP-ACP containing fluoride varnish by different quantitative methods

    PubMed Central

    SAVAS, Selcuk; KAVRIK, Fevzi; KUCUKYILMAZ, Ebru

    2016-01-01

    Objective: The aim of this study was to evaluate the efficacy of a CPP-ACP-containing fluoride varnish for remineralizing white spot lesions (WSLs) with four different quantitative methods. Material and Methods: Four windows (3x3 mm) were created on the enamel surfaces of bovine incisor teeth. A control window was covered with nail varnish, and WSLs were created on the other windows (after demineralization, first week and fourth week) in an acidified gel system. The test material (MI Varnish) was applied on the demineralized areas, and the treated enamel samples were stored in artificial saliva. At the fourth week, the enamel surfaces were tested by surface microhardness (SMH), quantitative light-induced fluorescence-digital (QLF-D), energy-dispersive spectroscopy (EDS) and laser fluorescence (LF pen). The data were statistically analyzed (α=0.05). Results: While the LF pen measurements showed significant differences at baseline, after demineralization, and after the one-week remineralization period (p<0.05), the difference between the 1- and 4-week measurements was not significant (p>0.05). With regard to the SMH and QLF-D analyses, statistically significant differences were found among all the phases (p<0.05). After the 1- and 4-week treatment periods, the calcium (Ca) and phosphate (P) concentrations and the Ca/P ratio were higher compared to those of the demineralized surfaces (p<0.05). Conclusion: CPP-ACP-containing fluoride varnish provides remineralization of WSLs after a single application and seems suitable for clinical use. PMID:27383699

  16. Coherent diffraction imaging of non-isolated object with apodized illumination.

    PubMed

    Khakurel, Krishna P; Kimura, Takashi; Joti, Yasumasa; Matsuyama, Satoshi; Yamauchi, Kazuto; Nishino, Yoshinori

    2015-11-02

    Coherent diffraction imaging (CDI) is an established lensless imaging method, widely used in the x-ray regime, applicable to the imaging of non-periodic materials. Conventional CDI can in practice image only isolated objects, which hinders broader application of the method. We present the imaging of non-isolated objects by employing the recently proposed "non-scanning" apodized-illumination CDI at an optical wavelength. We realized isolated apodized illumination with a specially designed optical configuration and succeeded in imaging phase objects as well as amplitude objects. The non-scanning nature of the method is particularly important in imaging live cells and tissues, where fast imaging of non-isolated objects is required, and is an advantage over ptychography. We believe that our result of phase-contrast imaging at an optical wavelength can be extended to the quantitative phase imaging of cells and tissues. The method also offers the feasibility of lensless single-shot imaging of extended objects with x-ray free-electron lasers.

  17. Application of Multi-Objective Human Learning Optimization Method to Solve AC/DC Multi-Objective Optimal Power Flow Problem

    NASA Astrophysics Data System (ADS)

    Cao, Jia; Yan, Zheng; He, Guangyu

    2016-06-01

    This paper introduces an efficient algorithm, the multi-objective human learning optimization method (MOHLO), to solve the AC/DC multi-objective optimal power flow problem (MOPF). First, the model of the AC/DC MOPF including wind farms is constructed, which comprises three objective functions: operating cost, power loss, and pollutant emission. Combining the non-dominated sorting technique and the crowding distance index, the MOHLO method is derived, which involves an individual learning operator, a social learning operator, a random exploration learning operator, and adaptive strategies. Both the proposed MOHLO method and the non-dominated sorting genetic algorithm II (NSGAII) are tested on an improved IEEE 30-bus AC/DC hybrid system. Simulation results show that the MOHLO method has excellent search efficiency and a powerful ability to locate optimal solutions. Above all, the MOHLO method obtains a more complete Pareto front than the NSGAII method. How to choose the final solution from the Pareto front, however, depends mainly on whether the decision makers prioritize economy or energy saving and emission reduction.
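The non-dominated sorting shared by MOHLO and NSGAII rests on a simple dominance test over the objective vectors (here assuming all three objectives, cost, loss, and emission, are minimized). A minimal sketch with hypothetical objective triples:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_pareto_front(solutions):
    """Solutions not dominated by any other: the first non-dominated front."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# Hypothetical (cost, loss, emission) triples for four candidate dispatches.
front = first_pareto_front([(10.0, 2.0, 5.0), (9.0, 3.0, 5.0),
                            (10.0, 2.0, 6.0), (11.0, 4.0, 7.0)])
```

Repeatedly removing each front and re-sorting the remainder yields the ranked fronts; the crowding-distance index then spreads solutions along each front, which is what produces a "complete" Pareto front.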

  18. Reuse Metrics for Object Oriented Software

    NASA Technical Reports Server (NTRS)

    Bieman, James M.

    1998-01-01

    One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA-supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.

  19. Quantitative Analysis of Qualitative Information from Interviews: A Systematic Literature Review

    ERIC Educational Resources Information Center

    Fakis, Apostolos; Hilliam, Rachel; Stoneley, Helen; Townend, Michael

    2014-01-01

    Background: A systematic literature review was conducted in the mixed methods area. Objectives: The overall aim was to explore how qualitative information from interviews has been analyzed using quantitative methods. Methods: A contemporary review was undertaken, based on a predefined protocol. The references were identified using inclusion and…

  20. A novel, objective, quantitative method of evaluation of the back pain component using comparative computerized multi-parametric tactile mapping before/after spinal cord stimulation and database analysis: the "Neuro-Pain't" software.

    PubMed

    Rigoard, P; Nivole, K; Blouin, P; Monlezun, O; Roulaud, M; Lorgeoux, B; Bataille, B; Guetarni, F

    2015-03-01

    One of the major challenges of neurostimulation is to address the back pain component in patients suffering from refractory chronic back and leg pain. Facing a tremendous expansion of neurostimulation techniques and available devices, implanters and patients can still be confused, as they need to select the right tool for the right indication. To objectively evaluate and compare patient outcomes across therapeutic strategies, it appears essential to develop a rational, quantitative approach to pain assessment for those who undergo neurostimulation implantation. We developed a touch-screen interface at Poitiers University Hospital and the N(3)Lab, called the "Neuro-Pain'T", to detect, record and quantify changes in painful-area surface and intensity in an implanted patient over time. The second aim of this software is to analyse the link between the paraesthesia coverage generated by a given type of neurostimulation and its potential analgesic effect, measured by reduction of the pain surface, reduction of pain intensity within that surface, and local change in the distribution of pain characteristics. The third aim of Neuro-Pain'T is to correlate these clinical parameters with global patient data and functional outcome analysis, via a network database (Neuro-Database), to provide a concise but objective measure of neurostimulation efficacy, summarized by an index called the "RFG Index". The software has been used in more than 190 patients since 2012, leading us to define three clinical parameters grouped as the clinical component of the RFG Index, which might be helpful for assessing neurostimulation efficacy and comparing implanted devices. The Neuro-Pain'T is an original software package designed to objectively and quantitatively characterize the reduction of a painful area in a given individual, in terms of intensity, surface and pain typology, in response to a treatment strategy or implantation of an analgesic device. Because pain is a physical sensation…

  1. Object Segmentation Methods for Online Model Acquisition to Guide Robotic Grasping

    NASA Astrophysics Data System (ADS)

    Ignakov, Dmitri

    A vision system is an integral component of many autonomous robots. It enables the robot to perform essential tasks such as mapping, localization, or path planning. A vision system also assists with guiding the robot's grasping and manipulation tasks. As an increased demand is placed on service robots to operate in uncontrolled environments, advanced vision systems must be created that can function effectively in visually complex and cluttered settings. This thesis presents the development of segmentation algorithms to assist in online model acquisition for guiding robotic manipulation tasks. Specifically, the focus is placed on localizing door handles to assist in robotic door opening, and on acquiring partial object models to guide robotic grasping. First, a method for localizing a door handle of unknown geometry based on a proposed 3D segmentation method is presented. Following segmentation, localization is performed by fitting a simple box model to the segmented handle. The proposed method functions without requiring assumptions about the appearance of the handle or the door, and without a geometric model of the handle. Next, an object segmentation algorithm is developed, which combines multiple appearance (intensity and texture) and geometric (depth and curvature) cues. The algorithm is able to segment objects without utilizing any a priori appearance or geometric information in visually complex and cluttered environments. The segmentation method is based on the Conditional Random Fields (CRF) framework, and the graph cuts energy minimization technique. A simple and efficient method for initializing the proposed algorithm which overcomes graph cuts' reliance on user interaction is also developed. Finally, an improved segmentation algorithm is developed which incorporates a distance metric learning (DML) step as a means of weighing various appearance and geometric segmentation cues, allowing the method to better adapt to the available data. 
The improved method…

  2. A New Kinetic Spectrophotometric Method for the Quantitation of Amorolfine.

    PubMed

    Soto, César; Poza, Cristian; Contreras, David; Yáñez, Jorge; Nacaratte, Fallon; Toral, M Inés

    2017-01-01

    Amorolfine (AOF) is a compound with fungicide activity based on dual inhibition of the growth of the fungal cell membrane, the biosynthesis and accumulation of sterols, and the reduction of ergosterol. In this work, a sensitive kinetic spectrophotometric method for AOF quantitation was developed, based on the oxidation of AOF by KMnO4 at a fixed time of 30 min, under alkaline pH and controlled ionic strength. Measurements of changes in absorbance at 610 nm were used as the criterion of oxidation progress. In order to maximize sensitivity, the experimental reaction parameters were carefully studied via factorial screening and optimized by a multivariate method. The linearity and the intraday and interday assay precision and accuracy were determined. The absorbance-concentration plot for spiked tap water samples was rectilinear over the range 7.56 × 10^-6 to 3.22 × 10^-5 mol L^-1, with detection and quantitation limits of 2.49 × 10^-6 mol L^-1 and 7.56 × 10^-6 mol L^-1, respectively. The proposed method was successfully validated for the determination of the drug in spiked tap water samples, with percentage recoveries of 94.0-105.0%. The method is simple and does not require expensive instruments or complicated extraction steps for the reaction product.
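The reported limits are consistent with the usual 3.3σ/slope and 10σ/slope definitions (note LOQ/LOD = 7.56/2.49 ≈ 3.0 ≈ 10/3.3). A sketch of that calculation; the slope and blank standard deviation below are invented inputs chosen only to land in the same range:

```python
def detection_limits(slope, sd_blank):
    """ICH-style limits from a calibration slope and the standard deviation
    of blank responses: LOD = 3.3*sd/slope, LOQ = 10*sd/slope."""
    lod = 3.3 * sd_blank / slope
    loq = 10.0 * sd_blank / slope
    return lod, loq

# Hypothetical slope (absorbance per mol/L) and blank standard deviation.
lod, loq = detection_limits(slope=1.5e4, sd_blank=0.0113)
```

With these illustrative inputs the limits come out near 2.5 × 10^-6 and 7.5 × 10^-6 mol L^-1, the same order as the values reported in the abstract.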

  3. Quantitative determination and validation of octreotide acetate using 1H-NMR spectroscopy with the internal standard method.

    PubMed

    Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang

    2018-01-01

    Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (proton exchange) in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including the relaxation delay time, the number of scans, and the pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high performance liquid chromatography, the method provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicate that there was no significant difference between the two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards. It has potential for the quantitative analysis of other peptide drugs and the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
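The internal-standard qNMR calculation itself is a single ratio equation, P_s = (A_s/A_ref)(N_ref/N_s)(M_s/M_ref)(m_ref/m_s)P_ref. A sketch of that standard form; the numeric inputs are illustrative, not the paper's measured values (molar masses: octreotide ~1019 g/mol, gemcitabine hydrochloride ~300 g/mol):

```python
def qnmr_purity(area_s, area_ref, n_s, n_ref, mw_s, mw_ref,
                mass_s, mass_ref, purity_ref):
    """Internal-standard qNMR: sample purity from integrated signal areas (A),
    protons per quantified signal (N), molar masses (MW), weighed masses (m),
    and the purity of the internal standard."""
    return ((area_s / area_ref) * (n_ref / n_s) * (mw_s / mw_ref)
            * (mass_ref / mass_s) * purity_ref)

# Illustrative numbers only, not measured data.
purity = qnmr_purity(area_s=1.05, area_ref=1.00, n_s=2, n_ref=1,
                     mw_s=1019.2, mw_ref=299.7,
                     mass_s=9.0, mass_ref=5.0, purity_ref=0.999)
```

Because every factor is a ratio against the co-weighed standard, no octreotide-specific reference standard is needed, which is the practical appeal of qNMR noted in the abstract.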

  4. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    PubMed

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
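The paired t-test used to compare the two assays reduces to the t statistic of the per-sample differences. A self-contained sketch (the gamma-oryzanol values are invented, not the study's data):

```python
import math

def paired_t_statistic(method_a, method_b):
    """t = mean(d) / (sd(d) / sqrt(n)) for paired differences d = a - b.
    Compare |t| against the t distribution with n-1 degrees of freedom."""
    d = [a - b for a, b in zip(method_a, method_b)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Hypothetical gamma-oryzanol contents (mg/g) from the two TLC read-outs.
densitometric = [2.10, 2.35, 2.02, 2.48, 2.26]
image_based = [2.08, 2.38, 2.05, 2.44, 2.27]
t = paired_t_statistic(densitometric, image_based)
```

Here |t| is far below the two-sided 5% critical value for 4 degrees of freedom (about 2.78), the pattern of "no statistically significant difference" that the abstract reports for the real data.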

  5. Method of center localization for objects containing concentric arcs

    NASA Astrophysics Data System (ADS)

    Kuznetsova, Elena G.; Shvets, Evgeny A.; Nikolaev, Dmitry P.

    2015-02-01

    This paper proposes a method for automatic center localization of objects containing concentric arcs. The method utilizes structure tensor analysis and a voting scheme optimized with the Fast Hough Transform. Two applications of the proposed method are considered: (i) wheel tracking in a video-based system for automatic vehicle classification and (ii) tree growth ring analysis on a tree cross-cut image.

  6. Informatics methods to enable sharing of quantitative imaging research data.

    PubMed

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

    The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach that enables data sharing and promotes reuse of quantitative imaging data in the community. We performed a survey of the tools currently in use by the QIN member sites for representation and storage of their QIN research data, including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There is a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are working to fill so that members can share research images and image meta-data across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. A quantitative method for optimized placement of continuous air monitors.

    PubMed

    Whicker, Jeffrey J; Rodgers, John C; Moxley, John S

    2003-11-01

    Alarming continuous air monitors (CAMs) are a critical component for worker protection in facilities that handle large amounts of hazardous materials. In nuclear facilities, continuous air monitors alarm when levels of airborne radioactive materials exceed alarm thresholds, thus prompting workers to exit the room to reduce inhalation exposures. To maintain a high level of worker protection, continuous air monitors are required to detect radioactive aerosol clouds quickly and with good sensitivity. This requires that there are sufficient numbers of continuous air monitors in a room and that they are well positioned. Yet there are no published methodologies to quantitatively determine the optimal number and placement of continuous air monitors in a room. The goal of this study was to develop and test an approach to quantitatively determine optimal number and placement of continuous air monitors in a room. The method we have developed uses tracer aerosol releases (to simulate accidental releases) and the measurement of the temporal and spatial aspects of the dispersion of the tracer aerosol through the room. The aerosol dispersion data is then analyzed to optimize continuous air monitor utilization based on simulated worker exposure. This method was tested in a room within a Department of Energy operated plutonium facility at the Savannah River Site in South Carolina, U.S. Results from this study show that the value of quantitative airflow and aerosol dispersion studies is significant and that worker protection can be significantly improved while balancing the costs associated with CAM programs.

  8. A software package to improve image quality and isolation of objects of interest for quantitative stereology studies of rat hepatocarcinogenesis.

    PubMed

    Xu, Yihua; Pitot, Henry C

    2006-03-01

    In studies of the quantitative stereology of rat hepatocarcinogenesis, we have used image analysis technology (automatic particle analysis) to obtain data such as liver tissue area, size and location of altered hepatic focal lesions (AHF), and nuclei counts. These data are then used for three-dimensional estimation of AHF occurrence and nuclear labeling index analysis, important parameters for quantitative studies of carcinogenesis, for screening and classifying carcinogens, and for risk estimation. To take such measurements, structures or cells of interest must be separated from the other components based on differences in color and density. Common background problems in the captured sample image, such as uneven illumination or color shading, can cause severe problems in the measurement. Two application programs (BK_Correction and Pixel_Separator) have been developed to solve these problems. With BK_Correction, common background problems such as an incorrect color temperature setting, color shading, and uneven background illumination can be corrected. With Pixel_Separator, different types of objects can be separated from each other according to their color, as seen with differently colored immunohistochemically stained slides. The resultant images of objects separated from other components are then ready for particle analysis. Objects that have the same darkness but different colors can be accurately differentiated in a grayscale image analysis system after application of these programs.
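The uneven-illumination fix in BK_Correction is conceptually a flat-field correction: divide the sample image by a blank background frame and rescale to preserve overall brightness. A minimal numpy sketch (function and variable names are mine, not those of the package):

```python
import numpy as np

def flat_field_correct(raw, background, eps=1e-9):
    """Remove uneven illumination or color shading captured in a blank
    background frame, preserving overall brightness."""
    background = np.maximum(background.astype(float), eps)
    return raw * (background.mean() / background)

# A synthetic frame whose illumination falls off toward the right edge:
falloff = np.linspace(1.0, 0.5, 64)
background = np.tile(falloff, (64, 1))
raw = background * 100.0       # a uniform specimen seen through that falloff
flat = flat_field_correct(raw, background)
```

After correction, a uniform specimen yields a uniform image, so downstream thresholding and particle analysis no longer mistake the illumination gradient for density variation.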

  9. Fast Quantitative Analysis Of Museum Objects Using Laser-Induced Breakdown Spectroscopy And Multiple Regression Algorithms

    NASA Astrophysics Data System (ADS)

    Lorenzetti, G.; Foresta, A.; Palleschi, V.; Legnaioli, S.

    2009-09-01

    The recent development of mobile instrumentation specifically devoted to in situ analysis of museum objects allows the acquisition of many LIBS spectra in a very short time. However, such a large amount of data calls for new analytical approaches that guarantee prompt analysis of the results obtained. In this communication, we present and discuss the advantages of statistical analytical methods, such as Partial Least Squares (PLS) multiple regression algorithms, over the classical calibration curve approach. PLS algorithms allow the composition of the objects under study to be obtained in real time; this feature of the method, compared to traditional off-line analysis of the data, is extremely useful for optimizing the measurement times and the number of points associated with the analysis. In fact, the real-time availability of compositional information makes it possible to concentrate attention on the most `interesting' parts of the object, without over-sampling zones that would not provide useful information for scholars or conservators. Some examples of the applications of this method will be presented, including studies recently performed by the researchers of the Applied Laser Spectroscopy Laboratory on museum bronze objects.
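
    The contrast with a univariate calibration curve can be illustrated with a minimal single-response PLS fit (NIPALS algorithm) on synthetic spectra, where the whole spectrum is regressed against concentration at once; this is a generic implementation, not the laboratory's pipeline:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal single-response partial-least-squares regression (NIPALS).
    A generic sketch of the multivariate calibration idea, trained on
    synthetic 'spectra'; not the authors' implementation."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                      # covariance-maximizing direction
        w /= np.linalg.norm(w)
        t = Xc @ w                         # scores
        tt = t @ t
        p = Xc.T @ t / tt                  # loadings
        q = (yc @ t) / tt
        Xc = Xc - np.outer(t, p)           # deflate
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    B = W @ np.linalg.solve(P.T @ W, np.array(Q))   # regression vector
    return B, x_mean, y_mean

def pls1_predict(Xnew, B, x_mean, y_mean):
    return y_mean + (Xnew - x_mean) @ B

# Synthetic LIBS-like data: 30 spectra x 50 channels, concentration
# linear in three channels.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 50))
beta = np.zeros(50); beta[[5, 20, 35]] = (1.0, -0.5, 2.0)
y = X @ beta
B, xm, ym = pls1_fit(X, y, n_components=8)
y_hat = pls1_predict(X, B, xm, ym)
```

    Once `B` is fitted on reference standards, predicting the composition of each new spectrum is a single dot product, which is what makes real-time feedback during a measurement campaign practical.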

  10. Quantitative fluorescence microscopy and image deconvolution.

    PubMed

    Swedlow, Jason R

    2013-01-01

    Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches: deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Proper use of these methods demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image.
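
    The "restoration" family described above can be illustrated with the classic Richardson-Lucy algorithm in one dimension; a minimal sketch assuming a known, shift-invariant PSF and noise-free, non-negative data (not tied to any particular microscopy package):

```python
import numpy as np

def richardson_lucy_1d(observed, psf, iters):
    """Minimal 1-D Richardson-Lucy deconvolution: iteratively seeks an
    object that, convolved with the PSF, reproduces the observed data."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1]
    est = np.full_like(observed, observed.mean())
    for _ in range(iters):
        conv = np.convolve(est, psf, mode="same")
        ratio = observed / np.maximum(conv, 1e-12)
        est = est * np.convolve(ratio, psf_mirror, mode="same")
    return est

# Two point sources blurred by a Gaussian PSF, then restored.
truth = np.zeros(64)
truth[25], truth[40] = 1.0, 0.5
psf = np.exp(-0.5 * (np.arange(-7, 8) / 2.0) ** 2)
blurred = np.convolve(truth, psf / psf.sum(), mode="same")
restored = richardson_lucy_1d(blurred, psf, iters=200)
```

    Note that the update is multiplicative, so it preserves non-negativity and approximately conserves total flux, which is exactly the "preserve relative signal levels" property required before quantitative analysis.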

  11. High-fidelity, low-cost, automated method to assess laparoscopic skills objectively.

    PubMed

    Gray, Richard J; Kahol, Kanav; Islam, Gazi; Smith, Marshall; Chapital, Alyssa; Ferrara, John

    2012-01-01

    We sought to define the extent to which a motion analysis-based assessment system constructed with simple equipment could measure technical skill objectively and quantitatively. An "off-the-shelf" digital video system was used to capture the hand and instrument movement of surgical trainees (beginner level = PGY-1, intermediate level = PGY-3, and advanced level = PGY-5/fellows) while they performed a peg transfer exercise. The video data were passed through a custom computer vision algorithm that analyzed incoming pixels to measure movement smoothness objectively. The beginner-level group had the poorest performance, whereas those in the advanced group generated the highest scores. Intermediate-level trainees scored significantly (p < 0.04) better than beginner trainees. Advanced-level trainees scored significantly better than intermediate-level trainees and beginner-level trainees (p < 0.04 and p < 0.03, respectively). A computer vision-based analysis of surgical movements provides an objective basis for technical expertise-level analysis with construct validity. The technology to capture the data is simple, low cost, and readily available, and it obviates the need for expert human assessment in this setting. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
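
    Objective smoothness scoring of this kind is commonly built on jerk, the third derivative of position; the sketch below is an illustrative stand-in (the study's exact algorithm is not given), using a standard dimensionless integrated-squared-jerk cost:

```python
import numpy as np

def jerk_cost(traj, dt):
    """Dimensionless integrated-squared-jerk smoothness cost for a 2-D
    trajectory (lower = smoother). Generic motion-analysis metric,
    normalized to be scale- and duration-invariant."""
    vel = np.gradient(traj, dt, axis=0)
    acc = np.gradient(vel, dt, axis=0)
    jerk = np.gradient(acc, dt, axis=0)
    duration = dt * (len(traj) - 1)
    peak_speed = np.linalg.norm(vel, axis=1).max()
    return (jerk ** 2).sum() * dt * duration ** 3 / peak_speed ** 2

dt = 0.01
t = np.arange(0.0, 2.0, dt)
smooth = np.column_stack([np.sin(t), np.cos(t)])       # expert-like motion
rng = np.random.default_rng(1)
shaky = smooth + 0.01 * rng.normal(size=smooth.shape)  # novice-like tremor
```

    Applied to hand or instrument trajectories extracted from video, a cost like this separates fluid expert motion from hesitant novice motion without any human rater.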

  12. Combining qualitative and quantitative research within mixed method research designs: a methodological review.

    PubMed

    Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh

    2011-03-01

    It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a subject of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified, and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers to clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and

  13. An eigenfunction method for reconstruction of large-scale and high-contrast objects.

    PubMed

    Waag, Robert C; Lin, Feng; Varslot, Trond K; Astheimer, Jeffrey P

    2007-07-01

    A multiple-frequency inverse scattering method that uses eigenfunctions of a scattering operator is extended to image large-scale and high-contrast objects. The extension uses an estimate of the scattering object to form the difference between the scattering by the object and the scattering by the estimate of the object. The scattering potential defined by this difference is expanded in a basis of products of acoustic fields. These fields are defined by eigenfunctions of the scattering operator associated with the estimate. In the case of scattering objects for which the estimate is radial, symmetries in the expressions used to reconstruct the scattering potential greatly reduce the amount of computation. The range of parameters over which the reconstruction method works well is illustrated using calculated scattering by different objects. The method is applied to experimental data from a 48-mm diameter scattering object with tissue-like properties. The image reconstructed from measurements has, relative to a conventional B-scan formed using a low f-number at the same center frequency, significantly higher resolution and less speckle, implying that small, high-contrast structures can be demonstrated clearly using the extended method.

  14. Solving multi-objective optimization problems in conservation with the reference point method

    PubMed Central

    Dujardin, Yann; Chadès, Iadine

    2018-01-01

    Managing the biodiversity extinction crisis requires wise decision-making processes able to account for the limited resources available. In most decision problems in conservation biology, several conflicting objectives have to be taken into account. Most methods used in conservation either provide suboptimal solutions or use strong assumptions about the decision-maker’s preferences. Our paper reviews some of the existing approaches to solve multi-objective decision problems and presents new multi-objective linear programming formulations of two multi-objective optimization problems in conservation, allowing the use of a reference point approach. Reference point approaches solve multi-objective optimization problems by interactively representing the preferences of the decision-maker with a point in the criteria (objectives) space, called the reference point. We modelled and solved the following two problems in conservation: a dynamic multi-species management problem under uncertainty and a spatial allocation resource management problem. Results show that the reference point method outperforms classic methods while illustrating the use of an interactive methodology for solving combinatorial problems with multiple objectives. The method is general and can be adapted to a wide range of ecological combinatorial problems. PMID:29293650
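
    The reference-point idea can be illustrated on a toy discrete problem: the decision-maker states a reference point in objective space, and the candidate minimizing the augmented Chebyshev achievement function is returned. A sketch with invented numbers (the paper's actual formulations are linear programs, not enumerated candidates):

```python
import numpy as np

def reference_point_choice(F, ref, weights, rho=1e-3):
    """Among candidate solutions with objective vectors F (rows,
    maximization), pick the one minimizing the augmented Chebyshev
    achievement function relative to the reference point `ref`."""
    F = np.asarray(F, dtype=float)
    ref = np.asarray(ref, dtype=float)
    w = np.asarray(weights, dtype=float)
    # max weighted shortfall plus a small sum term to break ties
    ach = (w * (ref - F)).max(axis=1) + rho * (ref - F).sum(axis=1)
    return int(ach.argmin())

# Three candidate plans scored on (species persistence, cost savings):
F = [[0.9, 0.2],
     [0.5, 0.8],
     [0.7, 0.6]]
best = reference_point_choice(F, ref=[0.8, 0.7], weights=[1.0, 1.0])
```

    Moving the reference point steers the choice interactively: with `ref=[1.0, 0.0]` the persistence-heavy first plan is selected instead, which is the mechanism by which the decision-maker explores the Pareto front.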

  15. [Quantitative method for simultaneous assay of four coumarins with one marker in Fraxini Cortex].

    PubMed

    Feng, Weihong; Wang, Zhimin; Zhang, Qiwei; Liu, Limei; Wang, Jinyu; Yang, Fei

    2011-07-01

    To establish a new quantitative method for simultaneous determination of multiple coumarins in Fraxini Cortex using one chemical reference substance, and to validate its feasibility. The new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with Fraxini Cortex. Four main coumarins were selected as analytes to evaluate the quality, and their relative correction factors (RCF) were determined by HPLC-DAD. Within the linear range, the values of the RCF at 340 nm of aesculin to aesculetin, fraxin and fraxetin were 1.771, 0.799 and 1.409, respectively. The contents of aesculin in samples of Fraxini Cortex were determined directly by the external standard method, and the contents of the three other coumarins were calculated from their RCF. The contents of these four coumarins in all samples were also determined by the external standard method. Within a certain range, the RCF had good reproducibility (RSD 2.5%-3.9%). No significant differences were observed between the quantitative results of the two methods. It is feasible and suitable to evaluate the quality of Fraxini Cortex and its Yinpian by QAMS.
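
    The single-marker arithmetic can be sketched as follows, assuming one common definition of the relative correction factor (conventions differ between papers, so this is an illustration rather than the paper's exact formula):

```python
def qams_contents(areas, marker_conc_per_area, rcf):
    """Single-marker multi-component quantitation (QAMS) sketch.
    Assumes the relative correction factor of analyte k is defined as
    f_k = (A_k / C_k) / (A_m / C_m), i.e. detector response per unit
    concentration relative to the marker.
    areas: peak areas per analyte; marker_conc_per_area: C_m / A_m from
    the marker's external-standard calibration; rcf: f_k per analyte
    (marker itself has f = 1)."""
    return {k: areas[k] * marker_conc_per_area / rcf[k] for k in areas}

# Synthetic check: build areas from known concentrations and responses,
# then recover all concentrations from the marker calibration alone.
true_conc = {"aesculin": 2.0, "aesculetin": 1.0, "fraxin": 3.0}
resp = {"aesculin": 10.0, "aesculetin": 8.0, "fraxin": 14.0}  # area per unit conc
areas = {k: true_conc[k] * resp[k] for k in true_conc}
rcf = {k: resp[k] / resp["aesculin"] for k in resp}
conc = qams_contents(areas, 1.0 / resp["aesculin"], rcf)
```

    The practical appeal is visible in the last line: only the marker's calibration is needed at analysis time, while the RCF values are established once during method validation.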

  16. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    PubMed

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of research worldwide. In this study, the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains isolated from urine samples of patients of the dr Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative method with TTC, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in the ability to form biofilm observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  17. Method for Statically Checking an Object-oriented Computer Program Module

    NASA Technical Reports Server (NTRS)

    Bierhoff, Kevin M. (Inventor); Aldrich, Jonathan (Inventor)

    2012-01-01

    A method for statically checking an object-oriented computer program module includes the step of identifying objects within a computer program module, at least one of the objects having a plurality of references thereto, possibly from multiple clients. A discipline of permissions is imposed on the objects identified within the computer program module. The permissions enable tracking, from among a discrete set of changeable states, a subset of states each object might be in. A determination is made regarding whether the imposed permissions are violated by a potential reference to any of the identified objects. The results of the determination are output to a user.

  18. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

    Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
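
    The direct evaluation the abstract describes can be sketched with present-day Python objects in place of Explorer-LISP Flavors; basic events are assumed independent, and the example tree and probabilities are invented:

```python
class BasicEvent:
    """Leaf event carrying a failure probability from reliability data."""
    def __init__(self, p):
        self.p = p
    def prob(self):
        return self.p

class AndGate:
    """Occurs only if all children occur (independence assumed)."""
    def __init__(self, *children):
        self.children = children
    def prob(self):
        q = 1.0
        for c in self.children:
            q *= c.prob()
        return q

class OrGate:
    """Occurs if at least one child occurs (independence assumed)."""
    def __init__(self, *children):
        self.children = children
    def prob(self):
        q = 1.0
        for c in self.children:
            q *= 1.0 - c.prob()
        return 1.0 - q

# TOP = (primary pump fails AND backup fails) OR controller fails
tree = OrGate(AndGate(BasicEvent(0.1), BasicEvent(0.2)), BasicEvent(0.05))
top = tree.prob()
```

    As in the Flavors implementation, each node object can also cache intermediate results or design metadata, so the same structure serves both evaluation and interrogation of the knowledge base.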

  19. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications.

    PubMed

    Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang

    2017-01-01

    Objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is important for the treatment monitoring of brain disorders. Therefore, a computer-aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of the computer simulations show that the derived CRMs have high similarity with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach localizes recovered regions with 93.4% accuracy. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution for generating a parametric image and deriving quantitative indexes from longitudinal SPECT brain images for treatment monitoring.
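
    The change-rate map itself is simple voxel-wise arithmetic on co-registered scans; a minimal sketch (the values and mask rule are invented, and registration/normalization are assumed done):

```python
import numpy as np

def change_rate_map(baseline, followup, mask=None, eps=1e-6):
    """Voxel-wise change-rate map: % change of follow-up vs baseline.
    Assumes co-registered, count-normalized images; `mask` selects
    voxels with meaningful baseline uptake."""
    baseline = np.asarray(baseline, dtype=float)
    followup = np.asarray(followup, dtype=float)
    crm = 100.0 * (followup - baseline) / np.maximum(baseline, eps)
    if mask is not None:
        crm = np.where(mask, crm, 0.0)
    return crm

base = np.array([[10.0, 10.0],
                 [20.0,  0.0]])
post = np.array([[15.0,  8.0],
                 [20.0,  0.0]])
crm = change_rate_map(base, post, mask=base > 0)
```

    Thresholding such a map (e.g. at the 20% change rate the simulations identify as reliable) yields the recovered-region segmentation from which the quantitative indexes are derived.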

  20. Method for determining the weight of functional objectives on manufacturing system.

    PubMed

    Zhang, Qingshan; Xu, Wei; Zhang, Jiekun

    2014-01-01

    We propose a three-dimensional integrated weight determination for the functional objectives of a manufacturing system, in which consumer weights are determined by triangular fuzzy numbers, the subjective parts of the enterprise weights are determined by the expert scoring method, and the objective parts are determined by the entropy method together with competitive-advantage considerations. Based on the integration of the three methods into a comprehensive weight, we provide some suggestions for the manufacturing system. A numerical example is provided to illustrate the feasibility of the method.
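
    The entropy (objective) part of such a weighting scheme is standard and can be sketched as follows; the normalized product used to combine it with subjective weights is an assumption for illustration, not the paper's exact three-dimensional scheme:

```python
import numpy as np

def entropy_weights(D):
    """Objective criterion weights via the entropy method: criteria whose
    values vary more across alternatives carry more information and get
    larger weights. D is (alternatives x criteria), entries positive."""
    P = D / D.sum(axis=0)
    n = D.shape[0]
    e = -(P * np.log(np.maximum(P, 1e-12))).sum(axis=0) / np.log(n)
    d = 1.0 - e                      # degree of diversification
    return d / d.sum()

# Combine with subjective (expert-scoring) weights by normalized product.
D = np.array([[0.9, 5.0],
              [0.1, 5.1],
              [0.5, 4.9]])
w_obj = entropy_weights(D)
w_subj = np.array([0.4, 0.6])
w = w_obj * w_subj
w = w / w.sum()
```

    In this toy decision matrix the first criterion varies strongly across alternatives and therefore dominates the objective weights, which is exactly the discriminating-power intuition behind the entropy method.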

  2. Quantitative nondestructive evaluation of ceramic matrix composite by the resonance method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watanabe, T.; Aizawa, T.; Kihara, J.

    The resonance method was developed to make quantitative nondestructive evaluations of mechanical properties without any troublesome procedure. Since the present method is insensitive to the geometry of the specimen, both monolithic and ceramic matrix composite materials in process can be evaluated in a nondestructive manner. Al2O3, Si3N4, SiC/Si3N4, and various C/C composite materials are employed to demonstrate the validity and effectiveness of the present method.

  3. Spectro-refractometry of individual microscopic objects using swept-source quantitative phase imaging.

    PubMed

    Jung, Jae-Hwang; Jang, Jaeduck; Park, Yongkeun

    2013-11-05

    We present a novel spectroscopic quantitative phase imaging technique with a wavelength-swept source, referred to as swept-source diffraction phase microscopy (ssDPM), for quantifying the optical dispersion of individual microscopic samples. Employing the swept source and the principle of common-path interferometry, ssDPM performs multispectral full-field quantitative phase imaging and spectroscopic microrefractometry of transparent microscopic samples in the visible spectrum, with a wavelength range of 450-750 nm and a spectral resolution of less than 8 nm. With unprecedented precision and sensitivity, we demonstrate the quantitative spectroscopic microrefractometry of individual polystyrene beads, 30% bovine serum albumin solution, and healthy human red blood cells.
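
    The refractometry step rests on the standard quantitative-phase-imaging relation between phase delay, thickness, and index; a minimal sketch assuming the object thickness is known (an illustrative simplification, e.g. a bead of known diameter):

```python
import numpy as np

# Generic QPI relation: a thin transparent object of thickness h and
# index difference dn from the medium produces, at wavelength lam, a
# phase delay phi = 2*pi*h*dn/lam. Inverting per wavelength yields the
# dispersion n(lam) that spectroscopic refractometry reports.
def refractive_index(phi, lam, h, n_medium):
    return n_medium + phi * lam / (2.0 * np.pi * h)

lam = 532e-9        # wavelength [m]
h = 10e-6           # object thickness [m]
n_medium = 1.336    # surrounding medium index
# phase a polystyrene-like bead (n ~ 1.59) would produce:
phi = 2.0 * np.pi * h * (1.59 - n_medium) / lam
n = refractive_index(phi, lam, h, n_medium)
```

    Repeating this inversion at each wavelength of the swept source gives the dispersion curve n(lam) of the individual object.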

  4. Provisional-Ideal-Point-Based Multi-objective Optimization Method for Drone Delivery Problem

    NASA Astrophysics Data System (ADS)

    Omagari, Hiroki; Higashino, Shin-Ichiro

    2018-04-01

    In this paper, we propose a new evolutionary multi-objective optimization method for solving drone delivery problems (DDP), which can be formulated as constrained multi-objective optimization problems. In our previous research, we proposed the "aspiration-point-based method" to solve multi-objective optimization problems. However, that method needs to calculate the optimal value of each objective function in advance, and it does not consider constraint conditions other than the objective functions. Therefore, it cannot be applied to DDP, which has many constraint conditions. To solve these issues, we propose the "provisional-ideal-point-based method." The proposed method defines a "penalty value" to search for feasible solutions. It also defines a new reference solution, named the "provisional ideal point," to search for the solution preferred by a decision maker. In this way, we can eliminate the preliminary calculations and the limited application scope. Results on benchmark test problems show that the proposed method can generate the preferred solution efficiently. The usefulness of the proposed method is also demonstrated by applying it to a DDP. As a result, the delivery path combining one drone and one truck drastically reduces the traveling distance and the delivery time compared with the case of using only one truck.

  5. System and method for detecting a faulty object in a system

    DOEpatents

    Gunnels, John A.; Gustavson, Fred Gehrung; Engle, Robert Daniel

    2010-12-14

    A method (and system) for detecting at least one faulty object in a system including a plurality of objects in communication with each other in an n-dimensional architecture, includes probing a first plane of objects in the n-dimensional architecture and probing at least one other plane of objects in the n-dimensional architecture which would result in identifying a faulty object in the system.

  6. System and method for detecting a faulty object in a system

    DOEpatents

    Gunnels, John A [Brewster, NY; Gustavson, Fred Gehrung [Briarcliff Manor, NY; Engle, Robert Daniel [St. Louis, MO

    2009-03-17

    A method (and system) for detecting at least one faulty object in a system including a plurality of objects in communication with each other in an n-dimensional architecture, includes probing a first plane of objects in the n-dimensional architecture and probing at least one other plane of objects in the n-dimensional architecture which would result in identifying a faulty object in the system.

  7. Quantitative phase retrieval with arbitrary pupil and illumination

    DOE PAGES

    Claus, Rene A.; Naulleau, Patrick P.; Neureuther, Andrew R.; ...

    2015-10-02

    We present a general algorithm for combining measurements taken under various illumination and imaging conditions to quantitatively extract the amplitude and phase of an object wave. The algorithm uses the weak object transfer function, which incorporates arbitrary pupil functions and partially coherent illumination. The approach is extended beyond the weak object regime using an iterative algorithm. Finally, we demonstrate the method on measurements of Extreme Ultraviolet Lithography (EUV) multilayer mask defects taken in an EUV zone plate microscope with both a standard zone plate lens and a zone plate implementing Zernike phase contrast.

  8. [Bio-objects and biological methods of space radiation effects evaluation].

    PubMed

    Kaminskaia, E V; Nevzgodina, L V; Platova, N G

    2009-01-01

    The unique conditions of space experiments impose stringent requirements on bio-objects and on the biological methods used to evaluate radiation effects. The paper discusses the suitability of a number of bio-objects, varying in evolutionary stage and metabolism, for space research aimed at establishing common patterns of the radiation damage caused by heavy ions (HI) and the character of HI-cell interaction. Physical detectors in space experiments of the BIOBLOCK series make it possible to identify bio-objects hit by space HI and to establish a correlation between HI track topography and biological effect. The paper provides an all-round description of the bio-objects chosen for two BIOBLOCK experiments (a population of the hydrophyte Wolffia arrhiza (fam. duckweed) and Lactuca sativa seeds) and the method of evaluating the effects of single space-radiation HI. Direct effects of heavy ions on cells can be determined by the criteria of chromosomal aberrations and delayed morphologic abnormalities. The evaluation results are compared with data on human blood lymphocytes. Consideration is also given to the procedures for treating and investigating the test objects.

  9. Geophysics-based method of locating a stationary earth object

    DOEpatents

    Daily, Michael R [Albuquerque, NM; Rohde, Steven B [Corrales, NM; Novak, James L [Albuquerque, NM

    2008-05-20

    A geophysics-based method for determining the position of a stationary earth object uses the periodic changes in the gravity vector of the earth caused by the orbits of the sun and the moon. Because the local gravity field is highly irregular over a global scale, a model of local tidal accelerations can be compared to actual accelerometer measurements to determine the latitude and longitude of the stationary object.

  10. Simple and cost-effective liquid chromatography-mass spectrometry method to measure dabrafenib quantitatively and six metabolites semi-quantitatively in human plasma.

    PubMed

    Vikingsson, Svante; Dahlberg, Jan-Olof; Hansson, Johan; Höiom, Veronica; Gréen, Henrik

    2017-06-01

    Dabrafenib is an inhibitor of BRAF V600E used for treating metastatic melanoma but a majority of patients experience adverse effects. Methods to measure the levels of dabrafenib and major metabolites during treatment are needed to allow development of individualized dosing strategies to reduce the burden of such adverse events. In this study, an LC-MS/MS method capable of measuring dabrafenib quantitatively and six metabolites semi-quantitatively is presented. The method is fully validated with regard to dabrafenib in human plasma in the range 5-5000 ng/mL. The analytes were separated on a C18 column after protein precipitation and detected in positive electrospray ionization mode using a Xevo TQ triple quadrupole mass spectrometer. As no commercial reference standards are available, the calibration curve of dabrafenib was used for semi-quantification of dabrafenib metabolites. Compared to earlier methods the presented method represents a simpler and more cost-effective approach suitable for clinical studies. Graphical abstract Combined multi reaction monitoring transitions of dabrafenib and metabolites in a typical case sample.

  11. IT: An Effective Pedagogic Tool in the Teaching of Quantitative Methods in Management.

    ERIC Educational Resources Information Center

    Nadkami, Sanjay M.

    1998-01-01

    Examines the possibility of supplementing conventional pedagogic methods with information technology-based teaching aids in the instruction of quantitative methods to undergraduate students. Considers the case for a problem-based learning approach, and discusses the role of information technology. (Author/LRW)

  12. General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.

    PubMed

    de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael

    2016-11-01

    Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMM can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. Copyright © 2016 de Villemereuil et al.
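
    A concrete instance of the latent-to-data-scale mapping discussed above: for a Poisson response with log link, the population mean on the data scale follows in closed form and can be checked by integrating the inverse link over the latent distribution. This is a textbook relation used for illustration, not code from QGglmm:

```python
import numpy as np

# For a Poisson GLMM with log link and latent values ~ Normal(mu, sigma^2),
# the data-scale population mean is E[y] = exp(mu + sigma^2/2) (the
# log-normal mean). Check the closed form against brute-force quadrature
# of the inverse link over the latent distribution.
mu, sigma = 0.5, 0.8
closed_form = np.exp(mu + sigma ** 2 / 2.0)

grid = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 20001)
step = grid[1] - grid[0]
density = np.exp(-0.5 * ((grid - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
numeric = (np.exp(grid) * density).sum() * step
```

    The agreement illustrates the paper's central point: latent-scale parameters (here mu and sigma) fully determine data-scale quantities, even though the mapping is nonlinear; for other links the integral is evaluated numerically just as above.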

  13. FE-ANN based modeling of 3D Simple Reinforced Concrete Girders for Objective Structural Health Evaluation : Tech Transfer Summary

    DOT National Transportation Integrated Search

    2017-06-01

    The objective of this study was to develop an objective, quantitative method for evaluating damage to bridge girders by using artificial neural networks (ANNs). This evaluation method, which is a supplement to visual inspection, requires only the res...

  14. A Novel Method for Relative Quantitation of N-Glycans by Isotopic Labeling Using 18O-Water

    PubMed Central

    Tao, Shujuan; Orlando, Ron

    2014-01-01

    Quantitation is an essential aspect of comprehensive glycomics study. Here, a novel isotopic-labeling method is described for N-glycan quantitation using 18O-water. The incorporation of the 18O-labeling into the reducing end of N-glycans is simply and efficiently achieved during peptide-N4-(N-acetyl-β-glucosaminyl) asparagine amidase F release. This process provides a 2-Da mass difference compared with the N-glycans released in 16O-water. A mathematical calculation method was also developed to determine the 18O/16O ratios from isotopic peaks. Application of this method to several standard glycoprotein mixtures and human serum demonstrated that this method can facilitate the relative quantitation of N-glycans over a linear dynamic range of two orders of magnitude, with high accuracy and reproducibility. PMID:25365792
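
    The overlap correction at the heart of the ratio calculation can be sketched for the simplest two-peak case; the full method models the whole isotopic envelope, so this is only an illustration of the principle:

```python
def o18_o16_ratio(i_m, i_m2, a2):
    """Estimate the 18O/16O ratio of a glycan from the monoisotopic peak
    intensity i_m (16O form) and the +2 Da peak i_m2, which contains the
    18O-labeled form overlapping with the natural M+2 isotopologue of
    the 16O form; a2 is that natural M+2 relative abundance. A
    simplified two-peak sketch, not the paper's full calculation."""
    return (i_m2 - a2 * i_m) / i_m

# Synthetic check: true light:heavy = 1.0 : 0.75, natural M+2 = 10%.
light, heavy, a2 = 1000.0, 750.0, 0.10
i_m = light
i_m2 = heavy + a2 * light   # heavy species plus the light species' M+2
ratio = o18_o16_ratio(i_m, i_m2, a2)
```

    Because the label adds exactly 2 Da, the heavy channel always sits on top of the light species' natural M+2 isotopologue, which is why the subtraction term is required before taking the ratio.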

  15. A method of object recognition for single pixel imaging

    NASA Astrophysics Data System (ADS)

    Li, Boxuan; Zhang, Wenwen

    2018-01-01

    Computational ghost imaging (CGI), utilizing a single-pixel detector, has been extensively used in many fields. However, in order to achieve a high-quality reconstructed image, a large number of iterations is needed, which limits the flexibility of using CGI in practical situations, especially in the field of object recognition. In this paper, we propose a method utilizing feature matching to identify objects. In the given system, a recognition accuracy of approximately 90% can be achieved, which provides a new idea for the application of single-pixel imaging in the field of object recognition.

  16. Research design: qualitative, quantitative and mixed methods approaches. Creswell, John W. Sage, 320 pp, £29, ISBN 0761924426.

    PubMed

    2004-09-01

    The second edition of Creswell's book has been significantly revised and updated. The author clearly sets out three approaches to research: quantitative, qualitative and mixed methods. As someone who has used mixed methods in my research, it is refreshing to read a textbook that addresses this. The differences between the approaches are clearly identified and a rationale for using each methodological stance provided.

  17. Towards Robust Designs Via Multiple-Objective Optimization Methods

    NASA Technical Reports Server (NTRS)

    Man Mohan, Rai

    2006-01-01

    A differential evolution (DE) method is first used to solve a relatively difficult problem in extended surface heat transfer wherein optimal fin geometries are obtained for different safe operating base temperatures. The objective of maximizing the safe operating base temperature range is in direct conflict with the objective of maximizing fin heat transfer. This problem is a good example of achieving robustness in the context of changing operating conditions. The evolutionary method is then used to design a turbine airfoil; the two objectives being reduced sensitivity of the pressure distribution to small changes in the airfoil shape and the maximization of the trailing edge wedge angle with the consequent increase in airfoil thickness and strength. This is a relevant example of achieving robustness to manufacturing tolerances and wear and tear in the presence of other objectives.

  18. The beam stop array method to measure object scatter in digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Lee, Haeng-hwa; Kim, Ye-seul; Park, Hye-Suk; Kim, Hee-Joung; Choi, Jae-Gu; Choi, Young-Wook

    2014-03-01

    Scattered radiation is inevitably generated in the object. Its distribution is influenced by object thickness, field size, object-to-detector distance, and primary energy. One way to measure scatter intensities is to measure the signal detected under the shadow of the lead discs of a beam-stop array (BSA). The scatter measured by a BSA includes not only the radiation scattered within the object (object scatter) but also contributions from external scatter sources: the X-ray tube, detector, collimator, X-ray filter, and the BSA itself. Once this background is excluded, the approach can be applied to different scanner geometries by simple parameter adjustments, without prior knowledge of the scanned object. In this study, a BSA-based method was used to differentiate scatter arising in the phantom (object scatter) from the external background, and this method was then applied to the BSA algorithm to correct the object scatter. To confirm the background scattered radiation, we obtained scatter profiles and scatter fraction (SF) profiles in the direction perpendicular to the chest wall edge (CWE), with and without scattering material. The scatter profiles with and without the scattering material were similar in the region between 127 mm and 228 mm from the chest wall, indicating that the scatter measured by the BSA included background scatter. Moreover, the BSA algorithm with the proposed method could correct the object scatter, because the total radiation profiles after object-scatter correction corresponded to the original image in the region between 127 mm and 228 mm from the chest wall. As a result, the BSA method for measuring object scatter can be used to remove background scatter, and the method can be applied to different scanner geometries after background scatter correction. In conclusion, the BSA algorithm with the proposed method is effective for correcting object scatter.

  19. Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT.

    PubMed

    Son, Hye Joo; Jeong, Young Jin; Yoon, Hyun Jin; Park, Jong-Hwan; Kang, Do-Young

    2016-01-01

    We integrated visual and quantitative methods for analyzing the stability of respiration using four constructs: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. Respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline position. Visual grading was done by inspecting the shape of the diagram and classified into two states: regular and irregular. Quantitation was done by measuring the standard deviations of the x and v coordinates of the Poincaré map (SDx, SDv), the height of the fundamental peak (A1) in the Fourier spectrum, or the difference between maximal upward and downward drift (MUD-MDD). Each group showed a characteristic pattern on visual analysis. The quantitative parameters (SDx, SDv, A1, and MUD-MDD) differed among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv; p = 0.0002 for A1). In ROC analysis, the cutoff values were 0.11 for SDx (AUC: 0.982, p < 0.0001), 0.062 for SDv (AUC: 0.847, p < 0.0001), 0.117 for A1 (AUC: 0.876, p < 0.0001), and 0.349 for MUD-MDD (AUC: 0.948, p < 0.0001). This is the first study to analyze multiple aspects of respiration using various mathematical constructs; it provides quantitative indices of respiratory stability and quantitative cutoff values for differentiating regular from irregular respiration.
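The Poincaré-map quantitation described above (standard deviations of the phase-space x and v coordinates) can be sketched as follows. Sampling the trajectory at positive-going zero crossings is an assumption made for illustration, not necessarily the authors' exact sectioning rule:

```python
import numpy as np

def poincare_sd(signal, dt):
    """SD of the (x, v) Poincaré points of a respiratory trace.

    Phase space is (amplitude x, velocity v = dx/dt); the map is sampled
    once per cycle at positive-going zero crossings of x (an assumption
    for this sketch). Small SDs indicate a regular, repeatable cycle.
    """
    x = np.asarray(signal, dtype=float)
    v = np.gradient(x, dt)                         # phase-space velocity
    idx = np.where((x[:-1] < 0) & (x[1:] >= 0))[0]  # section crossings
    return x[idx].std(), v[idx].std()

# A perfectly periodic "breathing" trace yields near-zero SDx and SDv
t = np.arange(0, 60, 0.1)
regular = np.sin(2 * np.pi * 0.25 * t)             # 15 breaths/min
sdx, sdv = poincare_sd(regular, 0.1)
```

For an irregular trace (varying amplitude or period), the Poincaré points scatter and both SDs grow, which is what the paper's cutoffs discriminate.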

  20. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of genetically modified organism (GMO) in a multilaboratory evaluation. Finally, to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the relative standard deviation of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5%, based on the definition in the ISO 24276 guideline. The results of the collaborative trial suggest that the developed quantitative method is suitable for practical testing of LY038 maize.
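The conversion-factor step can be illustrated with a toy calculation. Assuming, as is conventional for event-specific GMO assays, that the weight-based GMO content is the event-to-endogenous copy ratio divided by the conversion factor (the ratio measured in pure GM material), a sketch with hypothetical numbers:

```python
def gmo_content_percent(event_copies, endogenous_copies, cf):
    """Weight-based GMO content (%) from real-time PCR copy numbers.

    cf is the conversion factor: the event/endogenous copy-number ratio
    observed in 100% GM material. All values here are illustrative, not
    the LY038 assay's actual calibration.
    """
    return (event_copies / endogenous_copies) / cf * 100.0

# e.g. 250 event copies against 50,000 reference copies, cf = 0.5
pct = gmo_content_percent(250, 50_000, 0.5)
```

With these numbers the sample works out to 1.0% GMO, i.e. twice the stated 0.5% limit of quantitation.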

  1. Quantitative data standardization of X-ray based densitometry methods

    NASA Astrophysics Data System (ADS)

    Sergunova, K. A.; Petraikin, A. V.; Petrjajkin, F. A.; Akhmad, K. S.; Semenov, D. S.; Potrakhov, N. N.

    2018-02-01

    In the present work, we propose the design of a special liquid phantom for assessing the accuracy of quantitative densitometric data. We also present the relationships between the measured bone mineral density (BMD) values and the nominal values for different X-ray based densitometry techniques. The linear plots obtained make it possible to introduce correction factors that increase the accuracy of BMD measurement by QCT, DXA and DECT, and to use these factors for standardization and comparison of measurements.
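The correction-factor idea can be sketched as a linear recalibration of scanner readings against the phantom's nominal values; the BMD numbers below are invented for illustration only:

```python
import numpy as np

# Hypothetical phantom calibration: nominal BMD of the inserts (mg/cm^3)
# versus what a scanner reported. A linear fit of nominal against measured
# yields the correction factors (slope, intercept) that undo the bias.
nominal = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
measured = np.array([57.5, 110.0, 215.0, 425.0, 845.0])   # toy readings

slope, intercept = np.polyfit(measured, nominal, 1)

def correct_bmd(reading):
    """Map a scanner reading back onto the phantom's nominal BMD scale."""
    return slope * reading + intercept
```

A reading of 215 mg/cm^3 from this toy scanner is corrected back to the 200 mg/cm^3 nominal value; the same pair of factors would be tabulated per technique (QCT, DXA, DECT).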

  2. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    PubMed

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
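A minimal sketch of DE-driven wavelength (channel) selection. The Gaussian "absorption bands" are synthetic, and matrix conditioning stands in for the fitness function; the paper's actual objective and parameters are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy pure-component THz absorption spectra over 50 frequency channels
freq = np.linspace(0.2, 2.0, 50)   # THz
S = np.stack([np.exp(-((freq - 0.8) / 0.15) ** 2),
              np.exp(-((freq - 1.4) / 0.20) ** 2)], axis=1)

def fitness(mask):
    """Lower is better: conditioning of the channel-selected mixing matrix."""
    if mask.sum() < 2:
        return 1e6                      # need at least 2 channels
    return np.linalg.cond(S[mask])

# Minimal DE/rand/1/bin over a real-valued genome; genes > 0.5 turn a
# channel on. Population size, F and CR are arbitrary illustrative values.
NP, D, F, CR = 20, len(freq), 0.6, 0.9
pop = rng.random((NP, D))
cost = np.array([fitness(ind > 0.5) for ind in pop])
for _ in range(60):
    for i in range(NP):
        others = [j for j in range(NP) if j != i]
        a, b, c = pop[rng.choice(others, 3, replace=False)]
        trial = np.where(rng.random(D) < CR, a + F * (b - c), pop[i])
        t_cost = fitness(trial > 0.5)
        if t_cost <= cost[i]:           # greedy selection
            pop[i], cost[i] = trial, t_cost
selected = pop[np.argmin(cost)] > 0.5   # chosen wavelength channels
```

In a real mixture analysis the fitness would instead score prediction error of the component concentrations on calibration data, but the selection loop is the same.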

  3. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    ERIC Educational Resources Information Center

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  4. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    ERIC Educational Resources Information Center

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  5. Speckle correlation method used to measure object's in-plane velocity.

    PubMed

    Smíd, Petr; Horváth, Pavel; Hrabovský, Miroslav

    2007-06-20

    We present a measurement of an object's in-plane velocity in one direction by use of the speckle correlation method. Numerical correlation of speckle patterns recorded periodically during motion of the object under investigation yields the information used to evaluate the object's in-plane velocity. The proposed optical setup uses a detection plane in the image field and enables one to detect velocities within the interval 10-150 µm·s⁻¹. Simulation analysis shows a way of controlling the measuring range. The presented theory, simulation analysis, and setup are verified through an experiment measuring the velocity profile of an object.
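The core of the speckle correlation method, locating the cross-correlation peak between successive frames and converting the shift to a velocity, can be sketched as follows; the pixel pitch and frame interval are assumed values, not the paper's:

```python
import numpy as np

def displacement_fft(img0, img1):
    """Integer-pixel shift between two speckle frames via FFT cross-correlation."""
    f = np.conj(np.fft.fft2(img0)) * np.fft.fft2(img1)
    corr = np.fft.ifft2(f).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap circular peak positions into signed shifts
    return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

rng = np.random.default_rng(1)
frame0 = rng.random((64, 64))                        # synthetic speckle
frame1 = np.roll(frame0, shift=(0, 3), axis=(0, 1))  # object moved 3 px in x

dy, dx = displacement_fft(frame0, frame1)
pixel_pitch_um, dt_s = 1.0, 0.02                     # assumed camera settings
vx = dx * pixel_pitch_um / dt_s                      # in-plane velocity, um/s
```

With these assumed settings a 3-pixel shift per frame corresponds to 150 µm/s, the upper end of the interval quoted in the abstract; sub-pixel interpolation of the peak would refine this.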

  6. Quantitating Human Optic Disc Topography

    NASA Astrophysics Data System (ADS)

    Graebel, William P.; Cohan, Bruce E.; Pearch, Andrew C.

    1980-07-01

    A method is presented for quantitatively expressing the topography of the human optic disc, applicable in a clinical setting to the diagnosis and management of glaucoma. Photographs of the disc illuminated by a pattern of fine, high-contrast parallel lines are digitized. From the measured deviation of the lines as they traverse the disc surface, disc topography is calculated using the principles of optical sectioning. The quantitators applied to express this topography have the following advantages: sensitivity to disc shape; objectivity; going beyond the limits of cup-disc ratio estimates and volume calculations; perfect generality in a mathematical sense; and an inherent scheme for determining a non-subjective reference frame to compare different discs or the same disc over time.

  7. Algorithms for Learning Preferences for Sets of Objects

    NASA Technical Reports Server (NTRS)

    Wagstaff, Kiri L.; desJardins, Marie; Eaton, Eric

    2010-01-01

    A method is being developed that provides for an artificial-intelligence system to learn a user's preferences for sets of objects and to thereafter automatically select subsets of objects according to those preferences. The method was originally intended to enable automated selection, from among large sets of images acquired by instruments aboard spacecraft, of image subsets considered to be scientifically valuable enough to justify use of limited communication resources for transmission to Earth. The method is also applicable to other sets of objects: examples of sets of objects considered in the development of the method include food menus, radio-station music playlists, and assortments of colored blocks for creating mosaics. The method does not require the user to perform the often-difficult task of quantitatively specifying preferences; instead, the user provides examples of preferred sets of objects. This method goes beyond related prior artificial-intelligence methods for learning which individual items are preferred by the user: this method supports a concept of setbased preferences, which include not only preferences for individual items but also preferences regarding types and degrees of diversity of items in a set. Consideration of diversity in this method involves recognition that members of a set may interact with each other in the sense that when considered together, they may be regarded as being complementary, redundant, or incompatible to various degrees. The effects of such interactions are loosely summarized in the term portfolio effect. The learning method relies on a preference representation language, denoted DD-PREF, to express set-based preferences. In DD-PREF, a preference is represented by a tuple that includes quality (depth) functions to estimate how desired a specific value is, weights for each feature preference, the desired diversity of feature values, and the relative importance of diversity versus depth. 
The system applies statistical
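The depth-versus-diversity trade-off that DD-PREF encodes can be loosely sketched as a weighted score over a candidate set; the quality function, range-based diversity measure, and weights below are illustrative stand-ins, not the actual DD-PREF language:

```python
import numpy as np

def set_score(values, depth_fn, target_diversity, alpha):
    """Score a candidate set on one feature, in the spirit of set-based
    preferences: blend mean per-item quality ("depth") with how close the
    set's spread is to the preferred diversity.

    alpha weighs diversity against depth (0 = depth only, 1 = diversity only).
    Feature values are assumed normalized to [0, 1].
    """
    v = np.asarray(values, dtype=float)
    depth = np.mean([depth_fn(x) for x in v])
    spread = v.max() - v.min()                 # crude diversity proxy
    diversity = 1.0 - abs(spread - target_diversity)
    return (1 - alpha) * depth + alpha * diversity

def quality(x):
    return x            # e.g. brightness itself is the per-item quality

uniform_set = [0.9, 0.9, 0.9]   # bright but redundant images
varied_set = [0.4, 0.7, 0.9]    # dimmer on average but diverse

s_uniform = set_score(uniform_set, quality, target_diversity=0.5, alpha=0.6)
s_varied = set_score(varied_set, quality, target_diversity=0.5, alpha=0.6)
```

With diversity weighted at 0.6, the varied set outscores the redundant one even though its items are individually worse, which is the portfolio effect the abstract describes.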

  8. Quantitative imaging of aggregated emulsions.

    PubMed

    Penfold, Robert; Watson, Andrew D; Mackie, Alan R; Hibberd, David J

    2006-02-28

    Noise reduction, restoration, and segmentation methods are developed for the quantitative structural analysis in three dimensions of aggregated oil-in-water emulsion systems imaged by fluorescence confocal laser scanning microscopy. Mindful of typical industrial formulations, the methods are demonstrated for concentrated (30% volume fraction) and polydisperse emulsions. Following a regularized deconvolution step using an analytic optical transfer function and appropriate binary thresholding, novel application of the Euclidean distance map provides effective discrimination of closely clustered emulsion droplets with size variation over at least 1 order of magnitude. The a priori assumption of spherical nonintersecting objects provides crucial information to combat the ill-posed inverse problem presented by locating individual particles. Position coordinates and size estimates are recovered with sufficient precision to permit quantitative study of static geometrical features. In particular, aggregate morphology is characterized by a novel void distribution measure based on the generalized Apollonius problem. This is also compared with conventional Voronoi/Delauney analysis.
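The role of the Euclidean distance map in discriminating closely clustered droplets can be shown on a toy binary image: the map peaks near the centre of each droplet even when the droplets touch. A brute-force distance transform is used below for self-containment; for real images one would use scipy.ndimage.distance_transform_edt:

```python
import numpy as np

def distance_map(binary):
    """Euclidean distance of each foreground pixel to the nearest background
    pixel, computed by brute force (adequate only for tiny toy images)."""
    fg = np.argwhere(binary)
    bg = np.argwhere(~binary)
    d = np.zeros(binary.shape)
    for y, x in fg:
        d[y, x] = np.sqrt(((bg - (y, x)) ** 2).sum(axis=1)).min()
    return d

# Two touching "droplets": overlapping disks of radius 10 in a binary mask
yy, xx = np.mgrid[0:40, 0:60]
mask = (((yy - 20) ** 2 + (xx - 20) ** 2 < 100) |
        ((yy - 20) ** 2 + (xx - 38) ** 2 < 100))

dist = distance_map(mask)
# The map has a local maximum near each disk centre (value ~= radius),
# with a saddle at the neck between them, so the cluster can be split.
```

Here dist peaks at roughly 10 at each disk centre and dips to about 5 at the neck, which is exactly the structure watershed-style splitting exploits.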

  9. Method for observing phase objects without halos and directional shadows

    NASA Astrophysics Data System (ADS)

    Suzuki, Yoshimasa; Kajitani, Kazuo; Ohde, Hisashi

    2015-03-01

    A new microscopy method for observing phase objects without halos and directional shadows is proposed. The key optical element is an annular aperture at the front focal plane of a condenser, with a larger diameter than those used in standard phase contrast microscopy. The light flux passing through the annular aperture is modified by the specimen's surface profile, then passes through an objective and contributes to image formation. This paper presents the essential conditions for realizing the method. Images of colonies formed by induced pluripotent stem (iPS) cells taken with this method are compared with the conventional phase contrast method and with the bright-field method at small illumination NA, to identify differences among these techniques. The outlines of the iPS cells are clearly visible with this method, whereas they are obscured by halos in the phase contrast method and by weak contrast in the bright-field method. Further images, of a mouse ovum and superimpositions of several different images of mouse iPS cells, are also presented to demonstrate the method's capabilities.

  10. Clusters of Insomnia Disorder: An Exploratory Cluster Analysis of Objective Sleep Parameters Reveals Differences in Neurocognitive Functioning, Quantitative EEG, and Heart Rate Variability

    PubMed Central

    Miller, Christopher B.; Bartlett, Delwyn J.; Mullins, Anna E.; Dodds, Kirsty L.; Gordon, Christopher J.; Kyle, Simon D.; Kim, Jong Won; D'Rozario, Angela L.; Lee, Rico S.C.; Comas, Maria; Marshall, Nathaniel S.; Yee, Brendon J.; Espie, Colin A.; Grunstein, Ronald R.

    2016-01-01

    Study Objectives: To empirically derive and evaluate potential clusters of Insomnia Disorder through cluster analysis from polysomnography (PSG). We hypothesized that clusters would differ on neurocognitive performance, sleep-onset measures of quantitative (q)-EEG and heart rate variability (HRV). Methods: Research volunteers with Insomnia Disorder (DSM-5) completed a neurocognitive assessment and overnight PSG measures of total sleep time (TST), wake time after sleep onset (WASO), and sleep onset latency (SOL) were used to determine clusters. Results: From 96 volunteers with Insomnia Disorder, cluster analysis derived at least two clusters from objective sleep parameters: Insomnia with normal objective sleep duration (I-NSD: n = 53) and Insomnia with short sleep duration (I-SSD: n = 43). At sleep onset, differences in HRV between I-NSD and I-SSD clusters suggest attenuated parasympathetic activity in I-SSD (P < 0.05). Preliminary work suggested three clusters by retaining the I-NSD and splitting the I-SSD cluster into two: I-SSD A (n = 29): defined by high WASO and I-SSD B (n = 14): a second I-SSD cluster with high SOL and medium WASO. The I-SSD B cluster performed worse than I-SSD A and I-NSD for sustained attention (P ≤ 0.05). In an exploratory analysis, q-EEG revealed reduced spectral power also in I-SSD B before (Delta, Alpha, Beta-1) and after sleep-onset (Beta-2) compared to I-SSD A and I-NSD (P ≤ 0.05). Conclusions: Two insomnia clusters derived from cluster analysis differ in sleep onset HRV. Preliminary data suggest evidence for three clusters in insomnia with differences for sustained attention and sleep-onset q-EEG. Clinical Trial Registration: Insomnia 100 sleep study: Australia New Zealand Clinical Trials Registry (ANZCTR) identification number 12612000049875. URL: https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=347742. Citation: Miller CB, Bartlett DJ, Mullins AE, Dodds KL, Gordon CJ, Kyle SD, Kim JW, D'Rozario AL, Lee RS, Comas

  11. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, together with its practical implementation in Matlab. The approach enables fully automated and reproducible measurements of selected parameters in thermal images. Two examples of the use of the proposed image analysis methods are shown, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    ERIC Educational Resources Information Center

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  13. Method and platform standardization in MRM-based quantitative plasma proteomics.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. 
This

  14. A review of virtual cutting methods and technology in deformable objects.

    PubMed

    Wang, Monan; Ma, Yuzheng

    2018-06-05

    Virtual cutting of deformable objects has been a research topic for more than a decade and has been used in many areas, especially surgery simulation. We survey the relevant literature, introduce the main virtual cutting methods, discuss their benefits and limitations, and explore possible research directions. Virtual cutting is a category of object deformation: it needs to represent the deformation of models in real time as accurately, robustly and efficiently as possible. To represent models accurately, a method must be able to (1) model objects with different material properties; (2) handle collision detection and collision response; and (3) update the geometry and topology of the deformable model as they are changed by cutting. Virtual cutting is widely used in surgery simulation, and research on cutting methods is important to the development of surgery simulation. Copyright © 2018 John Wiley & Sons, Ltd.

  15. Beyond CCT: The spectral index system as a tool for the objective, quantitative characterization of lamps

    NASA Astrophysics Data System (ADS)

    Galadí-Enríquez, D.

    2018-02-01

    Correlated color temperature (CCT) is a semi-quantitative measure that only roughly describes the spectrum of a lamp: it gives the temperature (in kelvins) of the black body whose hue is most similar to that of the light emitted by the lamp. Modern lamps for indoor and outdoor lighting display a wide variety of spectral energy distributions, most of them extremely different from those of black bodies, which makes CCT far from a perfect descriptor from the physical point of view. The spectral index system presented in this work provides an accurate, objective, quantitative procedure to characterize the spectral properties of lamps with just a few numbers. The system is an adaptation to lighting technology of the classical procedures of multi-band astronomical photometry with wide- and intermediate-band filters. We describe the basic concepts and apply the system to a representative set of lamps of many kinds. The results lead to interesting, sometimes surprising conclusions. The spectral index system is extremely easy to implement from the spectral data that are routinely measured at laboratories, so including this kind of computation in the standard protocols for the certification of lamps would be straightforward and would enrich the technical description of lighting devices.
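A spectral index in this photometric sense can be computed by integrating a lamp's spectral power distribution through band passbands and taking a magnitude-style logarithmic ratio, exactly as in multi-band photometry. The rectangular passbands and Gaussian toy spectra below are illustrative, not the paper's actual band definitions:

```python
import numpy as np

def band_flux(wl, spd, centre, width):
    """Integrate a spectral power distribution through a rectangular
    passband (real systems would use measured filter curves)."""
    T = ((wl > centre - width / 2) & (wl < centre + width / 2)).astype(float)
    return np.sum(spd * T) * (wl[1] - wl[0])

def colour_index(wl, spd, band1, band2):
    """Magnitude-style index between two bands: -2.5 log10(F1 / F2)."""
    return -2.5 * np.log10(band_flux(wl, spd, *band1) /
                           band_flux(wl, spd, *band2))

wl = np.linspace(380, 780, 801)                  # nm
warm = np.exp(-0.5 * ((wl - 620) / 80) ** 2)     # toy red-heavy lamp SPD
cool = np.exp(-0.5 * ((wl - 470) / 80) ** 2)     # toy blue-heavy lamp SPD

blue, red = (450, 60), (650, 60)                 # (centre, width) in nm
idx_warm = colour_index(wl, warm, blue, red)     # positive: faint in blue
idx_cool = colour_index(wl, cool, blue, red)     # negative: bright in blue
```

A few such indices over well-chosen bands pin down the spectral shape far more tightly than a single CCT value, which is the point of the system described in the abstract.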

  16. Full skin quantitative optical coherence elastography achieved by combining vibration and surface acoustic wave methods

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam

    2015-03-01

    Combined with phase-sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have each been reported to provide elastography of skin tissue. However, neither method alone can cover the full skin depth in current systems. This paper presents a feasibility study of an optical coherence elastography method that combines vibration and SAW measurements to give the quantitative mechanical properties of skin over the full depth range, including epidermis, dermis and subcutaneous fat. Experiments were carried out on layered tissue-mimicking phantoms and on in vivo human forearm and palm skin. A ring actuator generated vibration while a line actuator was used to excite SAWs, and a PhS-OCT system provided ultrahigh-sensitivity measurement of the generated waves. The experimental results demonstrate that, by combining the vibration and SAW methods, the bulk mechanical properties of the full skin can be quantitatively measured and the elastography obtained over a sensing depth from ~0 mm to ~4 mm. This method is promising for clinical applications where the quantitative elasticity of localized skin diseases is needed to aid diagnosis and treatment.
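The SAW half of such a measurement typically converts surface wave speed into an elastic modulus. A sketch using the widely used Viktorov approximation for Rayleigh-type surface waves in a nearly incompressible medium; the density and Poisson's ratio below are typical soft-tissue assumptions, not values from this paper:

```python
def youngs_modulus_from_saw(c_r, density=1000.0, poisson=0.49):
    """Young's modulus (Pa) from surface wave speed (m/s).

    Uses the Viktorov approximation
        c_R = (0.87 + 1.12*nu) / (1 + nu) * sqrt(mu / rho)
    inverted for the shear modulus mu, then E = 2 * mu * (1 + nu).
    Defaults assume water-like density and near-incompressibility.
    """
    shear_speed = c_r * (1 + poisson) / (0.87 + 1.12 * poisson)
    mu = density * shear_speed ** 2
    return 2.0 * mu * (1 + poisson)

# e.g. a 3 m/s surface wave in skin-like tissue
E = youngs_modulus_from_saw(3.0)
```

A 3 m/s wave maps to roughly 30 kPa under these assumptions, a plausible order of magnitude for dermis; depth-resolved moduli follow by evaluating wave speed at different frequencies or depths.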

  17. Compatibility of Qualitative and Quantitative Methods: Studying Child Sexual Abuse in America.

    ERIC Educational Resources Information Center

    Phelan, Patricia

    1987-01-01

    Illustrates how the combined use of qualitative and quantitative methods were necessary in obtaining a clearer understanding of the process of incest in American society. Argues that the exclusive use of one methodology would have obscured important information. (FMW)

  18. The health-related quality of life in long-term colorectal cancer survivors study: objectives, methods and patient sample.

    PubMed

    Mohler, M Jane; Coons, Stephen Joel; Hornbrook, Mark C; Herrinton, Lisa J; Wendel, Christopher S; Grant, Marcia; Krouse, Robert S

    2008-07-01

    The objective of this paper is to describe the complex mixed-methods design of a study conducted to assess health-related quality of life (HRQOL) outcomes and ostomy-related obstacles and adjustments among long-term (>5 years) colorectal cancer (CRC) survivors with ostomies (cases) and without ostomies (controls). In addition, details are provided regarding the study sample and the psychometric properties of the quantitative data collection measures used. Subsequent manuscripts will present the study findings. The study design involved a cross-sectional mail survey for collecting quantitative data and focus groups for collecting qualitative data. The study subjects were individuals identified as long-term CRC survivors within a community-based health maintenance organization's enrolled population. Focus groups comprised of cases were conducted. The groups were divided by gender and HRQOL high and low quartile contrasts (based on the mail survey data). The modified City of Hope Quality of Life (mCOH-QOL)-Ostomy and SF-36v2 questionnaires were used in the mail survey. An abridged version of the mCOH-QOL-Ostomy was used for the control subjects. Focus groups explored ostomy-related barriers to self-care, adaptation methods/skills, and advice for others with an ostomy. The survey response rate was 52% (679/1308) and 34 subjects participated in focus groups. The internal consistency reliability estimates for the mCOH-QOL-Ostomy and SF-36v2 questionnaires were very acceptable for group comparisons. In addition, evidence supports the construct validity of the abridged version of the mCOH-QOL-Ostomy. Study limitations include potential non-response bias and limited minority participation. We were able to successfully recruit long-term CRC survivors into this study and the psychometric properties of the quantitative measures used were quite acceptable. Mixed-methods designs, such as the one used in this study, may be useful in identification and further elucidation of

  19. Two schemes for quantitative photoacoustic tomography based on Monte Carlo simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yubin; Yuan, Zhen, E-mail: zhenyuan@umac.mo

    Purpose: The aim of this study was to develop novel methods for photoacoustically determining the optical absorption coefficient of biological tissues using Monte Carlo (MC) simulation. Methods: The authors propose two quantitative photoacoustic tomography (PAT) methods for mapping the optical absorption coefficient. The reconstruction methods combine conventional PAT with MC simulation in a novel way to determine the optical absorption coefficient of biological tissues or organs. The two schemes were examined theoretically and experimentally using simulations, tissue-mimicking phantoms, and ex vivo and in vivo tests. In particular, the authors explored these methods using several objects with different absorption contrasts embedded in turbid media, and using high-absorption media for which the diffusion approximation is not effective at describing photon transport. Results: The simulations and experimental tests showed that the reconstructions were quantitatively accurate in terms of the locations, sizes, and optical properties of the targets. The positions of the recovered targets were assessed from the property profiles, where the off-center error was less than 0.1 mm for the circular target. The sizes and optical properties of the targets were quantified by estimating the full width at half maximum of the optical absorption profile. For the reconstructed sizes, the errors ranged from 0 for relatively small targets to 26% for relatively large targets, whereas for the recovered optical properties the errors ranged from 0% to 12.5% across cases. Conclusions: The authors found that their methods can quantitatively reconstruct absorbing objects of different sizes and optical contrasts even when the diffusion approximation is unable to accurately describe photon propagation in biological tissues. In particular

  20. Real time quantitative phase microscopy based on single-shot transport of intensity equation (ssTIE) method

    NASA Astrophysics Data System (ADS)

    Yu, Wei; Tian, Xiaolin; He, Xiaoliang; Song, Xiaojun; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-08-01

    Microscopy based on the transport of intensity equation provides quantitative phase distributions, opening another perspective for cellular observation. However, it requires multi-focal image capturing, and the mechanical and electrical scanning involved limits its real-time capacity in sample detection. Here, in order to break through this restriction, real-time quantitative phase microscopy based on the single-shot transport of intensity equation (ssTIE) method is proposed. A programmed phase mask is designed to realize simultaneous multi-focal image recording without any scanning; thus, phase distributions can be quantitatively retrieved in real time. It is believed the proposed method can be potentially applied in various biological and medical applications, especially live cell imaging.

  1. Composition and Quantitation of Microalgal Lipids by ERETIC 1H NMR Method

    PubMed Central

    Nuzzo, Genoveffa; Gallo, Carmela; d’Ippolito, Giuliana; Cutignano, Adele; Sardo, Angela; Fontana, Angelo

    2013-01-01

    Accurate characterization of biomass constituents is a crucial aspect of research in the biotechnological application of natural products. Here we report an efficient, fast and reproducible method for the identification and quantitation of fatty acids and complex lipids (triacylglycerols, glycolipids, phospholipids) in microalgae under investigation for the development of functional health products (probiotics, food ingredients, drugs, etc.) or third generation biofuels. The procedure consists of extraction of the biological matrix by a modified Folch method and direct analysis of the resulting material by proton nuclear magnetic resonance (1H NMR). The protocol uses a reference electronic signal as external standard (ERETIC method) and allows assessment of total lipid content, degree of saturation and class distribution in both high-throughput screening of algal collections and metabolic analysis during genetic or culturing studies. As proof of concept, the methodology was applied to the analysis of three microalgal species (Thalassiosira weissflogii, Cyclotella cryptica and Nannochloropsis salina), which differ drastically in the qualitative and quantitative composition of their fatty acid-based lipids. PMID:24084790

  2. Comparison of culture-based, vital stain and PMA-qPCR methods for the quantitative detection of viable hookworm ova.

    PubMed

    Gyawali, P; Sidhu, J P S; Ahmed, W; Jagals, P; Toze, S

    2017-06-01

    Accurate quantitative measurement of viable hookworm ova from environmental samples is the key to controlling hookworm re-infections in the endemic regions. In this study, the accuracy of three quantitative detection methods [culture-based, vital stain and propidium monoazide-quantitative polymerase chain reaction (PMA-qPCR)] was evaluated by enumerating 1,000 ± 50 Ancylostoma caninum ova in the laboratory. The culture-based method was able to quantify an average of 397 ± 59 viable hookworm ova. Similarly, vital stain and PMA-qPCR methods quantified 644 ± 87 and 587 ± 91 viable ova, respectively. The numbers of viable ova estimated by the culture-based method were significantly (P < 0.05) lower than vital stain and PMA-qPCR methods. Therefore, both PMA-qPCR and vital stain methods appear to be suitable for the quantitative detection of viable hookworm ova. However, PMA-qPCR would be preferable over the vital stain method in scenarios where ova speciation is needed.

  3. Quantitative method of measuring cancer cell urokinase and metastatic potential

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  4. Quantitative Methods in the Study of Local History

    ERIC Educational Resources Information Center

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment roles, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  5. Method and Application for Dynamic Comprehensive Evaluation with Subjective and Objective Information

    PubMed Central

    Liu, Dinglin; Zhao, Xianglian

    2013-01-01

    In an effort to deal with more complicated evaluation situations, scientists have focused their efforts on dynamic comprehensive evaluation research. Making full use of both subjective and objective information has become a noteworthy issue. In this paper, a dynamic comprehensive evaluation method using subjective and objective information is proposed. We use the combination weighting method to determine the index weight: the analytic hierarchy process method is applied to process the subjective information, and the criteria importance through intercriteria correlation method is used to handle the objective information. For the time weight determination, we consider both time distance and information size to embody the principle of esteeming the present over the past. A linear weighted average model is then constructed to make the evaluation process more practicable. Finally, an example is presented to illustrate the effectiveness of this method. Overall, the results suggest that the proposed method is reasonable and effective. PMID:24386176

  6. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
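The FOSM step described above propagates input covariance through model sensitivities to obtain output covariance. A minimal sketch, with an illustrative Jacobian and input covariance standing in for MODFLOW-2000 sensitivities:

```python
import numpy as np

# First-Order Second Moment (FOSM) sketch: output covariance is obtained by
# propagating the input-parameter covariance through the model sensitivity
# (Jacobian). J and cov_in below are illustrative placeholders, not values
# from the study.

def fosm_output_covariance(J, cov_in):
    """First-order approximation: cov_out = J @ cov_in @ J.T."""
    return J @ cov_in @ J.T

# Toy setting: 2 model outputs (piezometric heads), 3 inputs (conductivities).
J = np.array([[0.5, 0.2, 0.1],
              [0.1, 0.4, 0.3]])
cov_in = np.diag([0.04, 0.09, 0.01])   # input-parameter variances

cov_out = fosm_output_covariance(J, cov_in)
head_variances = np.diag(cov_out)
# A QDE-style rule would sample next where the propagated variance is largest.
next_sample = int(np.argmax(head_variances))
```

The same propagation works for any differentiable model once its sensitivities are available, whether by perturbation or by direct programming as in the abstract.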

  7. Technical note: development of a quantitative PCR method for monitoring strain dynamics during yogurt manufacture.

    PubMed

    Miller, D M; Dudley, E G; Roberts, R F

    2012-09-01

    Yogurt starter cultures may consist of multiple strains of Lactobacillus delbrueckii ssp. bulgaricus (LB) and Streptococcus thermophilus (ST). Conventional plating methods for monitoring LB and ST levels during yogurt manufacture do not allow for quantification of individual strains. The objective of the present work was to develop a quantitative PCR method for quantification of individual strains in a commercial yogurt starter culture. Strain-specific primers were designed for 2 ST strains (ST DGCC7796 and ST DGCC7710), 1 LB strain (DGCC4078), and 1 Lactobacillus delbrueckii ssp. lactis strain (LL; DGCC4550). Primers for the individual ST and LB strains were designed to target unique DNA sequences in clustered regularly interspersed short palindromic repeats. Primers for LL were designed to target a putative mannitol-specific IIbC component of the phosphotransferase system. Following evaluation of primer specificity, standard curves relating cell number to cycle threshold were prepared for each strain individually and in combination in yogurt mix, and no significant differences in the slopes were observed. Strain balance data was collected for yogurt prepared at 41 and 43°C to demonstrate the potential application of this method. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
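The standard curves relating cell number to cycle threshold can be sketched as a linear fit of Ct against log10(cell number); the values below are synthetic illustrations, not the study's data:

```python
import numpy as np

# qPCR standard-curve sketch: cycle threshold (Ct) is linear in
# log10(cell number). The calibration points are synthetic.
log_cells = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
ct = np.array([30.1, 26.8, 23.5, 20.2, 16.9])   # ideal slope is about -3.32

slope, intercept = np.polyfit(log_cells, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1           # amplification efficiency

def cells_from_ct(ct_value):
    """Invert the standard curve to estimate cell number from a measured Ct."""
    return 10 ** ((ct_value - intercept) / slope)
```

Comparing the fitted slopes of single-strain and mixed-culture curves, as the abstract does, amounts to checking that the strains amplify with the same efficiency.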

  8. Grid-based precision aim system and method for disrupting suspect objects

    DOEpatents

    Gladwell, Thomas Scott; Garretson, Justin; Hobart, Clinton G.; Monda, Mark J.

    2014-06-10

    A system and method for disrupting at least one component of a suspect object is provided. The system has a source for passing radiation through the suspect object, a grid board positionable adjacent the suspect object (the grid board having a plurality of grid areas, the radiation from the source passing through the grid board), a screen for receiving the radiation passing through the suspect object and generating at least one image, a weapon for deploying a discharge, and a targeting unit for displaying the image of the suspect object and aiming the weapon according to a disruption point on the displayed image and deploying the discharge into the suspect object to disable the suspect object.

  9. The use of digital PCR to improve the application of quantitative molecular diagnostic methods for tuberculosis.

    PubMed

    Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F

    2016-08-03

    Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration, however there was a marked difference in the measured magnitude. TB is a disease where the quantification of the pathogen could lead to better patient management and qPCR methods offer the potential to rapidly perform such analysis. 
However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification

  10. Semi-quantitative methods yield greater inter- and intraobserver agreement than subjective methods for interpreting 99m technetium-hydroxymethylene-diphosphonate uptake in equine thoracic processi spinosi.

    PubMed

    van Zadelhoff, Claudia; Ehrle, Anna; Merle, Roswitha; Jahn, Werner; Lischer, Christoph

    2018-05-09

    Scintigraphy is a standard diagnostic method for evaluating horses with back pain due to suspected thoracic processus spinosus pathology. Lesion detection is based on subjective or semi-quantitative assessments of increased uptake. This retrospective, analytical study aimed to compare semi-quantitative and subjective methods in the evaluation of scintigraphic images of the processi spinosi in the equine thoracic spine. Scintigraphic images of 20 Warmblood horses, presented for assessment of orthopedic conditions between 2014 and 2016, were included in the study. Randomized, blinded image evaluation was performed by 11 veterinarians using subjective and semi-quantitative methods. Subjective grading was performed for the analysis of red-green-blue and grayscale scintigraphic images, which were presented in full-size or as masked images. For the semi-quantitative assessment, observers placed regions of interest over each processus spinosus. The uptake ratio of each processus spinosus in comparison to a reference region of interest was determined. Subsequently, a modified semi-quantitative calculation was developed whereby only the highest counts-per-pixel for a specified number of pixels were processed. Inter- and intraobserver agreement was calculated using intraclass correlation coefficients. Inter- and intraobserver intraclass correlation coefficients were 41.65% and 71.39%, respectively, for the subjective image assessment. Additionally, a correlation between intraobserver agreement, experience, and grayscale images was identified. The inter- and intraobserver agreement was significantly increased when using semi-quantitative analysis (97.35% and 98.36%, respectively) or the modified semi-quantitative calculation (98.61% and 98.82%, respectively). The proposed modified semi-quantitative technique showed higher inter- and intraobserver agreement when compared to other methods, which makes it a useful tool for the analysis of scintigraphic images. The
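The ROI-based uptake ratio and the modified top-pixels variant can be sketched as follows; the arrays, ROI sizes, and count levels are hypothetical, not taken from the study:

```python
import numpy as np

# Semi-quantitative scintigraphy sketch: uptake ratio of a lesion ROI versus
# a reference ROI, plus a modified variant that keeps only the N highest
# counts-per-pixel in each ROI. All data below are synthetic.

def uptake_ratio(roi_counts, ref_counts):
    """Mean counts in the lesion ROI divided by mean counts in the reference ROI."""
    return np.mean(roi_counts) / np.mean(ref_counts)

def modified_uptake_ratio(roi_counts, ref_counts, n_pixels=10):
    """As above, but averaging only the n_pixels highest counts in each ROI."""
    top = lambda a: np.sort(np.ravel(a))[-n_pixels:]
    return np.mean(top(roi_counts)) / np.mean(top(ref_counts))

rng = np.random.default_rng(0)
roi = rng.poisson(120, size=(8, 8))   # hypothetical region with increased uptake
ref = rng.poisson(80, size=(8, 8))    # hypothetical reference region
ratio = uptake_ratio(roi, ref)
```

Restricting the average to the hottest pixels reduces the influence of exactly how each observer outlines the ROI, which is one plausible reason the modified calculation agreed so closely across observers.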

  11. Method and apparatus for acoustic imaging of objects in water

    DOEpatents

    Deason, Vance A.; Telschow, Kenneth L.

    2005-01-25

    A method, system and underwater camera for acoustic imaging of objects in water or other liquids includes an acoustic source for generating an acoustic wavefront for reflecting from a target object as a reflected wavefront. The reflected acoustic wavefront deforms a screen on an acoustic side and correspondingly deforms the opposing optical side of the screen. An optical processing system is optically coupled to the optical side of the screen and converts the deformations on the optical side of the screen into an optical intensity image of the target object.

  12. Image Retrieval Method for Multiscale Objects from Optical Colonoscopy Images

    PubMed Central

    Sakanashi, Hidenori; Takahashi, Eiichi; Murakawa, Masahiro; Aoki, Hiroshi; Takeuchi, Ken; Suzuki, Yasuo

    2017-01-01

    Optical colonoscopy is the most common approach to diagnosing bowel diseases through direct colon and rectum inspections. Periodic optical colonoscopy examinations are particularly important for detecting cancers at early stages while still treatable. However, diagnostic accuracy is highly dependent on both the experience and knowledge of the medical doctor. Moreover, it is extremely difficult, even for specialist doctors, to detect the early stages of cancer when obscured by inflammations of the colonic mucosa due to intractable inflammatory bowel diseases, such as ulcerative colitis. Thus, to assist UC diagnosis, it is necessary to develop a new technology that can retrieve cases similar to a diagnostic target image from past cases in which diagnosed images with various symptoms of the colonic mucosa are stored. In order to assist diagnoses with optical colonoscopy, this paper proposes a retrieval method for colonoscopy images that can cope with multiscale objects. The proposed method can retrieve similar colonoscopy images despite varying visible sizes of the target objects. Through three experiments conducted with real clinical colonoscopy images, we demonstrate that the method is able to retrieve objects of any visible size and any location at a high level of accuracy. PMID:28255295

  13. Image Retrieval Method for Multiscale Objects from Optical Colonoscopy Images.

    PubMed

    Nosato, Hirokazu; Sakanashi, Hidenori; Takahashi, Eiichi; Murakawa, Masahiro; Aoki, Hiroshi; Takeuchi, Ken; Suzuki, Yasuo

    2017-01-01

    Optical colonoscopy is the most common approach to diagnosing bowel diseases through direct colon and rectum inspections. Periodic optical colonoscopy examinations are particularly important for detecting cancers at early stages while still treatable. However, diagnostic accuracy is highly dependent on both the experience and knowledge of the medical doctor. Moreover, it is extremely difficult, even for specialist doctors, to detect the early stages of cancer when obscured by inflammations of the colonic mucosa due to intractable inflammatory bowel diseases, such as ulcerative colitis. Thus, to assist UC diagnosis, it is necessary to develop a new technology that can retrieve cases similar to a diagnostic target image from past cases in which diagnosed images with various symptoms of the colonic mucosa are stored. In order to assist diagnoses with optical colonoscopy, this paper proposes a retrieval method for colonoscopy images that can cope with multiscale objects. The proposed method can retrieve similar colonoscopy images despite varying visible sizes of the target objects. Through three experiments conducted with real clinical colonoscopy images, we demonstrate that the method is able to retrieve objects of any visible size and any location at a high level of accuracy.

  14. Method for detecting a mass density image of an object

    DOEpatents

    Wernick, Miles N [Chicago, IL; Yang, Yongyi [Westmont, IL

    2008-12-23

    A method for detecting a mass density image of an object. An x-ray beam is transmitted through the object and a transmitted beam is emitted from the object. The transmitted beam is directed at an angle of incidence upon a crystal analyzer. A diffracted beam is emitted from the crystal analyzer onto a detector and digitized. A first image of the object is detected from the diffracted beam emitted from the crystal analyzer when positioned at a first angular position. A second image of the object is detected from the diffracted beam emitted from the crystal analyzer when positioned at a second angular position. A refraction image is obtained and a regularized mathematical inversion algorithm is applied to the refraction image to obtain a mass density image.

  15. Surrogate Based Uni/Multi-Objective Optimization and Distribution Estimation Methods

    NASA Astrophysics Data System (ADS)

    Gong, W.; Duan, Q.; Huo, X.

    2017-12-01

    Parameter calibration has been demonstrated as an effective way to improve the performance of dynamic models, such as hydrological models, land surface models, weather and climate models, etc. Traditional optimization algorithms usually require a huge number of model evaluations, making dynamic model calibration very difficult, or even computationally prohibitive. With the help of a series of recently developed adaptive surrogate-modelling based optimization methods: the uni-objective optimization method ASMO, the multi-objective optimization method MO-ASMO, and the probability distribution estimation method ASMO-PODE, the number of model evaluations can be significantly reduced to several hundred, making it possible to calibrate very expensive dynamic models, such as regional high resolution land surface models, weather forecast models such as WRF, and intermediate complexity earth system models such as LOVECLIM. This presentation provides a brief introduction to the common framework of the adaptive surrogate-based optimization algorithms ASMO, MO-ASMO and ASMO-PODE, a case study of Common Land Model (CoLM) calibration in the Heihe river basin in Northwest China, and an outlook on potential applications of surrogate-based optimization methods.

  16. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  17. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  18. A Quantitative Comparison of Calibration Methods for RGB-D Sensors Using Different Technologies.

    PubMed

    Villena-Martínez, Víctor; Fuster-Guilló, Andrés; Azorín-López, Jorge; Saval-Calvo, Marcelo; Mora-Pascual, Jeronimo; Garcia-Rodriguez, Jose; Garcia-Garcia, Alberto

    2017-01-27

    RGB-D (Red Green Blue and Depth) sensors are devices that can provide color and depth information from a scene at the same time. Recently, they have been widely used in many solutions due to their commercial growth from the entertainment market to many diverse areas (e.g., robotics, CAD, etc.). In the research community, these devices have had good uptake due to their acceptable level of accuracy for many applications and their low cost, but in some cases, they work at the limit of their sensitivity, near to the minimum feature size that can be perceived. For this reason, calibration processes are critical in order to increase their accuracy and enable them to meet the requirements of such kinds of applications. To the best of our knowledge, there is not a comparative study of calibration algorithms evaluating their results on multiple RGB-D sensors. Specifically, in this paper, a comparison of the three most used calibration methods has been applied to three different RGB-D sensors based on structured light and time-of-flight. The comparison of methods has been carried out by a set of experiments to evaluate the accuracy of depth measurements. Additionally, an object reconstruction application has been used as an example of an application for which the sensor works at the limit of its sensitivity. The obtained results of reconstruction have been evaluated through visual inspection and quantitative measurements.

  19. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    NASA Astrophysics Data System (ADS)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance compared to the conventional manner with a square-wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance for m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system, with the identified Earth impulse response obtained by measuring the system output with the input of the voltage response. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by field experiment. The quantitative analysis method proposed in this paper provides a new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
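An m-sequence of the kind used as the transmitted code can be generated with a linear feedback shift register; its nearly ideal periodic autocorrelation is what underlies the anti-noise behaviour. A minimal sketch, with a 4-bit register chosen purely for illustration:

```python
import numpy as np

# m-sequence generation with a Fibonacci linear feedback shift register
# (LFSR). The feedback taps below realise the primitive polynomial
# x^4 + x^3 + 1, giving the maximal period 2^4 - 1 = 15.

def m_sequence(nbits=4):
    state = [1] * nbits                    # any nonzero seed works
    seq = []
    for _ in range(2 ** nbits - 1):
        seq.append(state[-1])              # output bit
        fb = state[-1] ^ state[-2]         # feedback from taps 4 and 3
        state = [fb] + state[:-1]          # shift the register
    return np.array(seq)

seq = m_sequence()
bipolar = 1 - 2 * seq                      # map {0, 1} -> {+1, -1}

# The periodic autocorrelation of an m-sequence is N at zero lag and -1 at
# every other lag; this sharp peak is what lets the Earth response be
# identified by correlation even under strong noise.
acf = np.array([np.dot(bipolar, np.roll(bipolar, k)) for k in range(len(seq))])
```

Longer registers (more bits, different primitive polynomials) trade acquisition time against correlation gain, which is exactly the kind of coding-parameter choice the quantitative analysis in the abstract is meant to optimize.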

  20. Pareto Tracer: a predictor-corrector method for multi-objective optimization problems

    NASA Astrophysics Data System (ADS)

    Martín, Adanay; Schütze, Oliver

    2018-03-01

    This article proposes a novel predictor-corrector (PC) method for the numerical treatment of multi-objective optimization problems (MOPs). The algorithm, Pareto Tracer (PT), is capable of performing a continuation along the set of (local) solutions of a given MOP with k objectives, and can cope with equality and box constraints. Additionally, the first steps towards a method that manages general inequality constraints are also introduced. The properties of PT are first discussed theoretically and later numerically on several examples.

  1. A method for operative quantitative interpretation of multispectral images of biological tissues

    NASA Astrophysics Data System (ADS)

    Lisenko, S. A.; Kugeiko, M. M.

    2013-10-01

    A method for operative retrieval of spatial distributions of biophysical parameters of a biological tissue by using a multispectral image of it has been developed. The method is based on multiple regressions between linearly independent components of the diffuse reflection spectrum of the tissue and unknown parameters. Possibilities of the method are illustrated by an example of determining biophysical parameters of the skin (concentrations of melanin, hemoglobin and bilirubin, blood oxygenation, and scattering coefficient of the tissue). Examples of quantitative interpretation of the experimental data are presented.

  2. A multiple-point spatially weighted k-NN method for object-based classification

    NASA Astrophysics Data System (ADS)

    Tang, Yunwei; Jing, Linhai; Li, Hui; Atkinson, Peter M.

    2016-10-01

    Object-based classification, commonly referred to as object-based image analysis (OBIA), is now commonly regarded as able to produce more appealing classification maps, often of greater accuracy, than pixel-based classification and its application is now widespread. Therefore, improvement of OBIA using spatial techniques is of great interest. In this paper, multiple-point statistics (MPS) is proposed for object-based classification enhancement in the form of a new multiple-point k-nearest neighbour (k-NN) classification method (MPk-NN). The proposed method first utilises a training image derived from a pre-classified map to characterise the spatial correlation between multiple points of land cover classes. The MPS borrows spatial structures from other parts of the training image, and then incorporates this spatial information, in the form of multiple-point probabilities, into the k-NN classifier. Two satellite sensor images with a fine spatial resolution were selected to evaluate the new method. One is an IKONOS image of the Beijing urban area and the other is a WorldView-2 image of the Wolong mountainous area, in China. The images were object-based classified using the MPk-NN method and several alternatives, including the k-NN, the geostatistically weighted k-NN, the Bayesian method, the decision tree classifier (DTC), and the support vector machine classifier (SVM). It was demonstrated that the new spatial weighting based on MPS can achieve greater classification accuracy relative to the alternatives and it is, thus, recommended as appropriate for object-based classification.

  3. Towards a non-invasive quantitative analysis of the organic components in museum objects varnishes by vibrational spectroscopies: methodological approach.

    PubMed

    Daher, Céline; Pimenta, Vanessa; Bellot-Gurlet, Ludovic

    2014-11-01

    The compositions of ancient varnishes are mainly determined destructively by separation methods coupled to mass spectrometry. In this study, a methodology for non-invasive quantitative analysis of varnishes by vibrational spectroscopies is proposed. For that, experimental simplified varnishes of colophony and linseed oil were prepared according to 18th century traditional recipes with an increasing mass concentration ratio of colophony/linseed oil. FT-Raman and IR analyses using ATR and non-invasive reflectance modes were done on the "pure" materials and on the different mixtures. Then, a new approach involving spectral decomposition calculation was developed, considering the mixture spectra as a linear combination of the pure-material spectra and giving a relative amount of each component. Specific spectral regions were treated, and the obtained results show good agreement between the prepared and calculated amounts of the two compounds. We were thus able to detect and quantify from 10% to 50% of colophony in linseed oil using non-invasive techniques that can also be conducted in situ with portable instruments when it comes to varnished museum objects and artifacts. Copyright © 2014 Elsevier B.V. All rights reserved.
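The spectral-decomposition calculation treats a mixture spectrum as a linear combination of the pure-component spectra, with the fitted coefficients giving the relative amounts. A minimal least-squares sketch with synthetic Gaussian bands standing in for the colophony and linseed-oil spectra:

```python
import numpy as np

# Spectral unmixing sketch: fit a mixture spectrum as a linear combination
# of two pure-component spectra. The spectra are synthetic Gaussian bands,
# not real colophony or linseed-oil data.

wavenumber = np.linspace(400, 1800, 500)
band = lambda centre, width: np.exp(-((wavenumber - centre) / width) ** 2)

pure_a = band(1700, 40) + 0.5 * band(1450, 60)   # stand-in for colophony
pure_b = band(1740, 30) + 0.8 * band(1160, 50)   # stand-in for linseed oil

true_frac = 0.3                                   # 30% component A by weight
mixture = true_frac * pure_a + (1 - true_frac) * pure_b

# Solve mixture ~= x_a * pure_a + x_b * pure_b in the least-squares sense.
A = np.column_stack([pure_a, pure_b])
coeffs, *_ = np.linalg.lstsq(A, mixture, rcond=None)
fractions = coeffs / coeffs.sum()                 # normalise to relative amounts
```

Real spectra add baselines, band overlap, and noise, which is why the abstract restricts the fit to specific spectral regions; the linear-combination model itself is unchanged.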

  4. Development and validation of stability indicating method for the quantitative determination of venlafaxine hydrochloride in extended release formulation using high performance liquid chromatography

    PubMed Central

    Kaur, Jaspreet; Srinivasan, K. K.; Joseph, Alex; Gupta, Abhishek; Singh, Yogendra; Srinivas, Kona S.; Jain, Garima

    2010-01-01

    Objective: Venlafaxine hydrochloride is a structurally novel phenethyl bicyclic antidepressant, and is usually categorized as a serotonin–norepinephrine reuptake inhibitor (SNRI), but it has been referred to as a serotonin–norepinephrine–dopamine reuptake inhibitor, as it also inhibits the reuptake of dopamine. Venlafaxine HCl is widely prescribed in the form of sustained release formulations. In the current article we report the development and validation of a fast and simple stability indicating, isocratic high performance liquid chromatographic (HPLC) method for the determination of venlafaxine hydrochloride in sustained release formulations. Materials and Methods: The quantitative determination of venlafaxine hydrochloride was performed on a Kromasil C18 analytical column (250 × 4.6 mm i.d., 5 μm particle size) with 0.01 M phosphate buffer (pH 4.5): methanol (40: 60) as a mobile phase, at a flow rate of 1.0 ml/min. For HPLC methods, UV detection was made at 225 nm. Results: During method validation, parameters such as precision, linearity, accuracy, stability, limit of quantification and detection and specificity were evaluated, which remained within acceptable limits. Conclusions: The method has been successfully applied for the quantification and dissolution profiling of Venlafaxine HCl in sustained release formulation. The method presents a simple and reliable solution for the routine quantitative analysis of Venlafaxine HCl. PMID:21814426

  5. An Alu-based, MGB Eclipse real-time PCR method for quantitation of human DNA in forensic samples.

    PubMed

    Nicklas, Janice A; Buel, Eric

    2005-09-01

    The forensic community needs quick, reliable methods to quantitate human DNA in crime scene samples to replace the laborious and imprecise slot blot method. A real-time PCR based method has the possibility of allowing development of a faster and more quantitative assay. Alu sequences are primate-specific and are found in many copies in the human genome, making these sequences an excellent target or marker for human DNA. This paper describes the development of a real-time Alu sequence-based assay using MGB Eclipse primers and probes. The advantages of this assay are simplicity, speed, less hands-on-time and automated quantitation, as well as a large dynamic range (128 ng/microL to 0.5 pg/microL).
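    Quantitation in real-time PCR assays of this kind typically proceeds via a standard curve of threshold cycle (Ct) against the log of known DNA concentrations. The sketch below uses idealized numbers for a perfectly efficient assay, not the assay's actual calibration data:

    ```python
    import numpy as np

    # Hypothetical standard-curve data: Ct values for an 8-fold dilution series
    known_conc = np.array([128, 16, 2, 0.25, 0.03125])   # ng/uL
    known_ct = np.array([16.0, 19.0, 22.0, 25.0, 28.0])  # idealized 100%-efficiency assay

    # Fit Ct = slope * log10(conc) + intercept; slope ≈ -3.32 at 100% efficiency
    slope, intercept = np.polyfit(np.log10(known_conc), known_ct, 1)
    efficiency = 10 ** (-1 / slope) - 1                  # ≈ 1.0, i.e. 100%

    def conc_from_ct(ct):
        """Invert the standard curve to quantitate an unknown sample."""
        return 10 ** ((ct - intercept) / slope)

    print(conc_from_ct(20.5))   # ≈ 5.7 ng/uL
    ```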

  6. Comparison of transect sampling and object-oriented image classification methods of urbanizing catchments

    NASA Astrophysics Data System (ADS)

    Yang, Y.; Tenenbaum, D. E.

    2009-12-01

    The process of urbanization has major effects on both human and natural systems. In order to monitor these changes and better understand how urban ecological systems work, urban spatial structure and its variation first need to be quantified at a fine scale. Because land-use and land-cover (LULC) in urbanizing areas is highly heterogeneous, the classification of urbanizing environments is one of the most challenging problems in remote sensing. Although pixel-based classification is common, its results are not accurate enough for many research objectives that require fine-scale classification data. Transect sampling and object-oriented classification methods are more appropriate for urbanizing areas. Tenenbaum used a transect sampling method within a widely available commercial GIS in the Glyndon Catchment and the Upper Baismans Run Catchment, Baltimore, Maryland. It was a two-tiered classification system, with a primary level of 7 classes and a secondary level of 37 categories, from which statistical information on LULC was collected. W. Zhou applied an object-oriented method at the parcel level in the Gwynn's Falls Watershed, which includes the two previously mentioned catchments, and extracted six classes. The two urbanizing catchments are located in greater Baltimore, Maryland, and drain into Chesapeake Bay. In this research, the two methods are compared for six classes (woody, herbaceous, water, ground, pavement and structure). The comparison uses the segments from the transect method to extract LULC information from the results of the object-oriented method, and the classification results were compared in order to evaluate the difference between the two methods. The overall proportions of LULC classes from the two studies show an overestimation of structures in the object-oriented method. For the other five classes, the results from the two methods are

  7. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue.

    PubMed

    Foldager, Casper Bindzus; Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-04-01

    To implement stereological principles to develop an easily applicable algorithm for unbiased, quantitative evaluation of cartilage repair. Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes, providing 7 to 10 hematoxylin-eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. We describe the application of the stereological method; in the example, we assessed the defect repair tissue volume to be 4.4 mm³ (CE = 0.01), and the tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased interobserver agreement compared with conventional transmitted light. We have applied a design-unbiased method for the quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discriminating between hyaline cartilage and fibrocartilage.
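    The two-step point counting can be sketched as a Cavalieri-type volume estimate followed by point-fraction composition. All numbers below are illustrative, not the study's data:

    ```python
    # Step 1: Cavalieri volume estimate from systematic parallel sections.
    # V = t * a_p * sum(P), with t the section spacing, a_p the grid area per point,
    # and P the number of grid points hitting the defect on each section.
    section_spacing_mm = 0.5
    area_per_point_mm2 = 0.01
    points_hitting_defect = [120, 150, 180, 170, 140, 110, 90]   # one count per section

    volume_mm3 = section_spacing_mm * area_per_point_mm2 * sum(points_hitting_defect)

    # Step 2: tissue composition as point fractions within the defect
    class_counts = {"hyaline": 300, "fibrocartilage": 420, "fibrous": 150,
                    "bone": 60, "scaffold": 20, "other": 10}
    total = sum(class_counts.values())
    fractions = {k: v / total for k, v in class_counts.items()}

    print(volume_mm3)              # 4.8 mm^3 for these toy counts
    print(fractions["hyaline"])    # 0.3125
    ```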

  8. "Inject-mix-react-separate-and-quantitate" (IMReSQ) method for screening enzyme inhibitors.

    PubMed

    Wong, Edmund; Okhonin, Victor; Berezovski, Maxim V; Nozaki, Tomoyoshi; Waldmann, Herbert; Alexandrov, Kirill; Krylov, Sergey N

    2008-09-10

    Many regulatory enzymes are considered attractive therapeutic targets, and their inhibitors are potential drug candidates. Screening combinatorial libraries for enzyme inhibitors is pivotal to identifying hit compounds for the development of drugs targeting regulatory enzymes. Here, we introduce the first inhibitor screening method that consumes only nanoliters of the reactant solutions and is applicable to regulatory enzymes. The method is termed inject-mix-react-separate-and-quantitate (IMReSQ) and includes five steps. First, nanoliter volumes of substrate, candidate inhibitor, and enzyme solutions are injected by pressure into a capillary as separate plugs. Second, the plugs are mixed inside this capillary microreactor by transverse diffusion of laminar flow profiles. Third, the reaction mixture is incubated to form the enzymatic product. Fourth, the product is separated from the substrate inside the capillary by electrophoresis. Fifth, the amounts of the product and substrate are quantitated. In this proof-of-principle work, we applied IMReSQ to study inhibition of recently cloned protein farnesyltransferase from parasite Entamoeba histolytica. This enzyme is a potential therapeutic target for antiparasitic drugs. We identified three previously unknown inhibitors of this enzyme and proved that IMReSQ could be used for quantitatively ranking the potencies of inhibitors.

  9. Method and apparatus for releasably connecting first and second objects

    NASA Technical Reports Server (NTRS)

    Monford, Leo G., Jr. (Inventor)

    1991-01-01

    An apparatus and method are disclosed for releasably connecting first and second objects, where a magnetic end effector may include at least one elongated pin member, a proximal end of which is connected to the first object and a distal end of which may be inserted into a receiving portion in the second object. Latch members are carried by the pin member for radial movement between retracted and expanded positions for releasing and locking, respectively, the first and second objects. A plunger member carried by the pin member is axially movable between first and second positions. In the first plunger position, the latch members are held in the expanded (locked) position; in the second plunger position, the latch members are released for movement to the retracted (unlocked) position. The magnetic end effector is provided for releasable attachment to the first object and for moving the plunger member to the second position, releasing the first object.

  10. A quantitative study of gully erosion based on object-oriented analysis techniques: a case study in Beiyanzikou catchment of Qixia, Shandong, China.

    PubMed

    Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo

    2014-01-01

    This paper took a subregion of a small watershed gully system at Beiyanzikou catchment of Qixia, China, as a study area and, using object-oriented image analysis (OBIA), extracted the shoulder lines of gullies from high-spatial-resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the adjacent distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder lines of the gullies. Finally, the original surface was fitted using linear regression to the elevations of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract gully information; the average distance between points field-measured along the edges of the gullies and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m², 5074.1790 m³ and 1316.1250 m², 1591.5784 m³, respectively. The results of the study provide a new method for the quantitative study of small gully erosion.

  11. A Quantitative Study of Gully Erosion Based on Object-Oriented Analysis Techniques: A Case Study in Beiyanzikou Catchment of Qixia, Shandong, China

    PubMed Central

    Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo

    2014-01-01

    This paper took a subregion of a small watershed gully system at Beiyanzikou catchment of Qixia, China, as a study area and, using object-oriented image analysis (OBIA), extracted the shoulder lines of gullies from high-spatial-resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the adjacent distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder lines of the gullies. Finally, the original surface was fitted using linear regression to the elevations of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract gully information; the average distance between points field-measured along the edges of the gullies and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m², 5074.1790 m³ and 1316.1250 m², 1591.5784 m³, respectively. The results of the study provide a new method for the quantitative study of small gully erosion. PMID:24616626
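    The volume calculation described above, fitting the pre-erosion surface to rim elevations by linear regression and then integrating the depth below it, can be sketched as follows (all coordinates and the toy DEM are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical rim points extracted along the gully shoulder line (x, y, z in m)
    rim_xy = rng.uniform(0, 50, size=(200, 2))
    rim_z = 0.02 * rim_xy[:, 0] - 0.01 * rim_xy[:, 1] + 100.0   # a gently sloping plane

    # Fit the pre-erosion surface z = a*x + b*y + c by least squares
    A = np.column_stack([rim_xy, np.ones(len(rim_xy))])
    (a, b, c), *_ = np.linalg.lstsq(A, rim_z, rcond=None)

    # Erosion volume: fitted surface minus present-day DEM, summed over gully cells
    cell = 1.0                                   # DEM cell size (m)
    xs, ys = np.meshgrid(np.arange(10, 20), np.arange(10, 20))
    dem = a * xs + b * ys + c - 2.0              # toy DEM: uniformly 2 m below the old surface
    depth = (a * xs + b * ys + c) - dem
    erosion_volume = depth.sum() * cell ** 2     # m^3
    print(erosion_volume)                        # ≈ 200.0 for this toy grid
    ```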

  12. ADvanced IMage Algebra (ADIMA): a novel method for depicting multiple sclerosis lesion heterogeneity, as demonstrated by quantitative MRI

    PubMed Central

    Tozer, Daniel J; Schmierer, Klaus; Chard, Declan T; Anderson, Valerie M; Altmann, Daniel R; Miller, David H; Wheeler-Kingshott, Claudia AM

    2013-01-01

    Background: There are modest correlations between multiple sclerosis (MS) disability and white matter lesion (WML) volumes as measured on T2-weighted (T2w) magnetic resonance imaging (MRI) scans (T2-WML). This may partly reflect pathological heterogeneity in WMLs, which is not apparent on T2w scans. Objective: To determine whether ADvanced IMage Algebra (ADIMA), a novel MRI post-processing method, can reveal WML heterogeneity from proton-density-weighted (PDw) and T2w images. Methods: We obtained conventional PDw and T2w images from 10 patients with relapsing–remitting MS (RRMS) and calculated ADIMA images from these. We classified all WMLs into bright (ADIMA-b) and dark (ADIMA-d) sub-regions, which were segmented. We obtained conventional T2-WML and T1-WML volumes for comparison, as well as the following quantitative magnetic resonance parameters: magnetisation transfer ratio (MTR), T1 and T2. We also assessed the reproducibility of the segmentation for ADIMA-b, ADIMA-d and T2-WML. Results: ADIMA-derived volumes correlated with conventional lesion volumes (p < 0.05). ADIMA-b exhibited higher T1 and T2, and lower MTR, than T2-WML (p < 0.001). Despite the similarity in T1 values between ADIMA-b and T1-WML, these regions only partly overlapped. ADIMA-d exhibited quantitative characteristics similar to T2-WML, but again the regions only partly overlapped. Mean intra- and inter-observer coefficients of variation for ADIMA-b, ADIMA-d and T2-WML volumes were all <6% and <10%, respectively. Conclusion: ADIMA enabled the simple classification of WMLs into two groups with different quantitative magnetic resonance properties, which can be reproducibly distinguished. PMID:23037551

  13. System and method for inventorying multiple remote objects

    DOEpatents

    Carrender, Curtis L.; Gilbert, Ronald W.

    2007-10-23

    A system and method of inventorying multiple objects utilizing a multi-level or a chained radio frequency identification system. The system includes a master tag and a plurality of upper level tags and lower level tags associated with respective objects. The upper and lower level tags communicate with each other and the master tag so that reading of the master tag reveals the presence and absence of upper and lower level tags. In the chained RF system, the upper and lower level tags communicate locally with each other in a manner so that more remote tags that are out of range of some of the upper and lower level tags have their information relayed through adjacent tags to the master tag and thence to a controller.

  14. System and method for inventorying multiple remote objects

    DOEpatents

    Carrender, Curtis L [Morgan Hill, CA; Gilbert, Ronald W [Morgan Hill, CA

    2009-12-29

    A system and method of inventorying multiple objects utilizing a multi-level or a chained radio frequency identification system. The system includes a master tag and a plurality of upper level tags and lower level tags associated with respective objects. The upper and lower level tags communicate with each other and the master tag so that reading of the master tag reveals the presence and absence of upper and lower level tags. In the chained RF system, the upper and lower level tags communicate locally with each other in a manner so that more remote tags that are out of range of some of the upper and lower level tags have their information relayed through adjacent tags to the master tag and thence to a controller.

  15. Comparative assessment of fluorescent transgene methods for quantitative imaging in human cells.

    PubMed

    Mahen, Robert; Koch, Birgit; Wachsmuth, Malte; Politi, Antonio Z; Perez-Gonzalez, Alexis; Mergenthaler, Julia; Cai, Yin; Ellenberg, Jan

    2014-11-05

    Fluorescence tagging of proteins is a widely used tool to study protein function and dynamics in live cells. However, the extent to which different mammalian transgene methods faithfully report on the properties of endogenous proteins has not been studied comparatively. Here we use quantitative live-cell imaging and single-molecule spectroscopy to analyze how different transgene systems affect imaging of the functional properties of the mitotic kinase Aurora B. We show that the transgene method fundamentally influences level and variability of expression and can severely compromise the ability to report on endogenous binding and localization parameters, providing a guide for quantitative imaging studies in mammalian cells. © 2014 Mahen et al. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  16. Quantitative methods for evaluating the efficacy of thalamic deep brain stimulation in patients with essential tremor.

    PubMed

    Wastensson, Gunilla; Holmberg, Björn; Johnels, Bo; Barregard, Lars

    2013-01-01

    Deep brain stimulation (DBS) of the thalamus is a safe and efficient method for the treatment of disabling tremor in patients with essential tremor (ET). However, successful tremor suppression after surgery requires careful selection of stimulus parameters. Our aim was to examine the possible use of certain quantitative methods for evaluating the efficacy of thalamic DBS in ET patients in clinical practice, and to compare these methods with traditional clinical tests. We examined 22 patients using the Essential Tremor Rating Scale (ETRS) and quantitative assessment of tremor with the stimulator both activated and deactivated. We used an accelerometer (CATSYS tremor pen) for quantitative measurement of postural tremor, and a eurythmokinesimeter (EKM) to evaluate kinetic tremor in a rapid pointing task. The efficacy of DBS for tremor suppression was prominent irrespective of the method used. The agreement between clinical rating of postural tremor and tremor intensity as measured by the CATSYS tremor pen was relatively high (rs = 0.74). The agreement between kinetic tremor as assessed by the ETRS and the main outcome variable from the EKM test was low (rs = 0.34); this lack of agreement indicates that the EKM test is not comparable with the clinical test. Quantitative methods, such as the CATSYS tremor pen, could be a useful complement to clinical tremor assessment in evaluating the efficacy of DBS in clinical practice. Future studies should evaluate the precision of these methods and the long-term impact on tremor suppression, activities of daily living (ADL) function, and quality of life.

  17. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, in quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes glycosylation-targeting enrichment technologies, mainly solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  18. Nondestructive analysis of three-dimensional objects using a fluid displacement method

    USDA-ARS?s Scientific Manuscript database

    Quantification of three-dimensional (3-D) objects has been a real challenge in agricultural, hydrological and environmental studies. We designed and tested a method that is capable of quantifying 3-D objects using measurements of fluid displacement. The device consists of a stand that supports a mov...

  19. Real-time optical multiple object recognition and tracking system and method

    NASA Technical Reports Server (NTRS)

    Chao, Tien-Hsin (Inventor); Liu, Hua Kuang (Inventor)

    1987-01-01

    The invention relates to an apparatus and associated methods for the optical recognition and tracking of multiple objects in real time. Multiple point spatial filters are employed that pre-define the objects to be recognized at run-time. The system takes the basic technology of a Vander Lugt filter and adds a hololens. The technique replaces time-, space- and cost-intensive digital techniques. In place of multiple objects, the system can also recognize multiple orientations of a single object. This latter capability has potential for space applications where space and weight are at a premium.

  20. Methods and Apparatus for Detecting Defects in an Object of Interest

    NASA Technical Reports Server (NTRS)

    Hartman, John K. (Inventor); Pearson, Lee H (Inventor)

    2017-01-01

    A method for detecting defects in an object of interest comprises applying an ultrasonic signal including a tone burst having a predetermined frequency and number of cycles into an object of interest, receiving a return signal reflected from the object of interest, and processing the return signal to detect defects in at least one inner material. The object may have an outer material and the at least one inner material that have different acoustic impedances. An ultrasonic sensor system includes an ultrasonic sensor configured to generate an ultrasonic signal having a tone burst at a predetermined frequency corresponding to a resonant frequency of an outer material of an object of interest.

  1. Selective Weighted Least Squares Method for Fourier Transform Infrared Quantitative Analysis.

    PubMed

    Wang, Xin; Li, Yan; Wei, Haoyun; Chen, Xia

    2017-06-01

    Classical least squares (CLS) regression is a popular multivariate statistical method used frequently for quantitative analysis with Fourier transform infrared (FT-IR) spectrometry. Classical least squares provides the best unbiased estimator for uncorrelated residual errors with zero mean and equal variance. However, the noise in FT-IR spectra, which accounts for a large portion of the residual errors, is heteroscedastic. Thus, if this zero-mean noise dominates the residual errors, the weighted least squares (WLS) regression method described in this paper is a better estimator than CLS; however, if bias errors, such as residual baseline error, are significant, WLS may perform worse than CLS. In this paper, we compare the effects of noise and bias error on CLS and WLS in quantitative analysis. Results indicated that for wavenumbers with low absorbance, the bias error dominates, so that CLS performs better than WLS, whereas for wavenumbers with high absorbance, the noise dominates, and WLS proves better than CLS. We therefore propose a selective weighted least squares (SWLS) regression that processes data at different wavenumbers using either CLS or WLS based on a selection criterion, i.e., whether the absorbance is below or above a threshold. The effects of various factors on the optimal threshold value (OTV) for SWLS have been studied through numerical simulations, which showed that (1) the concentration and the analyte type have minimal effect on the OTV; and (2) the major factor influencing the OTV is the ratio between the bias error and the standard deviation of the noise. The last part of this paper is dedicated to quantitative analysis of methane gas spectra and methane/toluene gas-mixture spectra measured using FT-IR spectrometry with CLS, WLS, and SWLS. The standard error of prediction (SEP), bias of prediction (bias), and the residual sum of squares of the errors
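    The SWLS idea, per-wavenumber selection between CLS and WLS based on an absorbance threshold, can be sketched as a single weighted normal-equation solve. The matrix, threshold, and noise variance below are illustrative, and the weighting is simplified to one common noise variance for all WLS channels:

    ```python
    import numpy as np

    def swls_concentrations(K, y, noise_var, threshold):
        """Sketch of SWLS: weight 1 (CLS) below the absorbance threshold,
        1/noise-variance (WLS) above it; then solve the weighted normal equations
        c = (K^T W K)^-1 K^T W y."""
        w = np.where(y < threshold, 1.0, 1.0 / noise_var)
        W = np.diag(w)
        return np.linalg.solve(K.T @ W @ K, K.T @ W @ y)

    # Toy two-component system: K is the pure-component absorptivity matrix
    # (wavenumbers x components), y the measured mixture spectrum (noise-free here)
    K = np.array([[0.2, 0.05],
                  [0.8, 0.10],
                  [0.1, 0.90]])
    true_c = np.array([1.0, 0.5])
    y = K @ true_c
    print(swls_concentrations(K, y, noise_var=1e-4, threshold=0.3))   # ≈ [1.0, 0.5]
    ```

    With noise-free data any weighting recovers the exact concentrations; the weighting only matters once heteroscedastic noise is added to `y`.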

  2. Qualities of a Psychiatric Mentor: A Quantitative Singaporean Survey

    ERIC Educational Resources Information Center

    Tor, Phern-Chern; Goh, Lee-Gan; Ang, Yong-Guan; Lim, Leslie; Winslow, Rasaiah-Munidasa; Ng, Beng-Yeong; Wong, Sze-Tai; Ng, Tse-Pin; Kia, Ee-Heok

    2011-01-01

    Objective: Psychiatric mentors are an important part of the new, seamless training program in Singapore. There is a need to assess the qualities of a good psychiatric mentor vis-a-vis those of a good psychiatrist. Method: An anonymous survey was sent out to all psychiatry trainees and psychiatrists in Singapore to assess quantitatively the…

  3. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    ERIC Educational Resources Information Center

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  4. Revisiting Individual Creativity Assessment: Triangulation in Subjective and Objective Assessment Methods

    ERIC Educational Resources Information Center

    Park, Namgyoo K.; Chun, Monica Youngshin; Lee, Jinju

    2016-01-01

    Compared to the significant development of creativity studies, individual creativity research has not reached a meaningful consensus regarding the most valid and reliable method for assessing individual creativity. This study revisited 2 of the most popular methods for assessing individual creativity: subjective and objective methods. This study…

  5. The analysis of selected orientation methods of architectural objects' scans

    NASA Astrophysics Data System (ADS)

    Markiewicz, Jakub S.; Kajdewicz, Irmina; Zawieska, Dorota

    2015-05-01

    Terrestrial laser scanning is commonly used in different areas, inter alia in modelling architectural objects. One of the most important parts of TLS data processing is scan registration, which significantly affects the accuracy of the high-resolution photogrammetric documentation generated from the data. This process is time consuming, especially for a large number of scans, and is mostly based on automatic detection and semi-automatic measurement of control points placed on the object. For complicated historical buildings, it is sometimes forbidden to place survey targets on the object, or it may be difficult to distribute them optimally. Such problems encourage the search for new registration methods that eliminate the step of placing survey targets on the object. In this paper the results of the target-based registration method are presented. Survey targets placed on the walls of the historical chambers of the Museum of King Jan III's Palace at Wilanów and on the walls of the ruins of the Bishops Castle in Iłża were used for scan orientation, and several variants of orientation were performed taking into account different placements and numbers of survey marks. In subsequent research, raster images were generated from the scans and the SIFT and SURF image-processing algorithms were used to automatically search for corresponding natural points; the use of these automatically identified points for TLS data orientation was then analysed. The results of both registration methods are summarized and presented in numerical and graphical form.

  6. A novel method for overlapping community detection using Multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Ebrahimi, Morteza; Shahmoradi, Mohammad Reza; Heshmati, Zainabolhoda; Salehi, Mostafa

    2018-09-01

    The problem of community detection, one of the most important applications of network science, can be addressed effectively by multi-objective optimization. In this paper, we present a novel, efficient method based on this approach and introduce the idea of using all Pareto fronts to detect overlapping communities. The proposed method has two main advantages over other multi-objective optimization based approaches: scalability, and the ability to find overlapping communities. Unlike most previous work, the proposed method finds overlapping communities effectively: the algorithm extracts appropriate communities from all the Pareto-optimal solutions instead of choosing a single optimal solution. Empirical experiments on different features of separated and overlapping communities, on both synthetic and real networks, show that the proposed method performs better than competing methods.

  7. High-quality slab-based intermixing method for fusion rendering of multiple medical objects.

    PubMed

    Kim, Dong-Joon; Kim, Bohyoung; Lee, Jeongjin; Shin, Juneseuk; Kim, Kyoung Won; Shin, Yeong-Gil

    2016-01-01

    The visualization of multiple 3D objects is increasingly required in medical applications. Due to heterogeneity in data representation and configuration, it is difficult to render multiple medical objects efficiently and in high quality. In this paper, we present a novel intermixing scheme for fusion rendering of multiple medical objects that preserves real-time performance. First, we present an in-slab visibility interpolation method for the representation of subdivided slabs. Second, we introduce the virtual zSlab, which extends an infinitely thin boundary (such as a polygonal object) into a slab of finite thickness. Finally, based on the virtual zSlab and in-slab visibility interpolation, we propose a slab-based visibility intermixing method with a newly proposed rendering pipeline. Experimental results demonstrate that the proposed method delivers more effective multiple-object renderings, in terms of rendering quality, than conventional approaches. The proposed intermixing scheme also provides high-quality results for the visualization of intersecting and overlapping surfaces by resolving aliasing and z-fighting problems. Moreover, two case studies are presented that apply the proposed method to real clinical applications; these manifest the method's advantages of rendering independency and reusability. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Job Search as Goal-Directed Behavior: Objectives and Methods

    ERIC Educational Resources Information Center

    Van Hoye, Greet; Saks, Alan M.

    2008-01-01

    This study investigated the relationship between job search objectives (finding a new job/turnover, staying aware of job alternatives, developing a professional network, and obtaining leverage against an employer) and job search methods (looking at job ads, visiting job sites, networking, contacting employment agencies, contacting employers, and…

  9. A method for the evaluation of image quality according to the recognition effectiveness of objects in the optical remote sensing image using machine learning algorithm.

    PubMed

    Yuan, Tao; Zheng, Xinqi; Hu, Xuan; Zhou, Wei; Wang, Wei

    2014-01-01

    Objective and effective image quality assessment (IQA) is directly related to the application of optical remote sensing images (ORSI). In this study, a new IQA method based on the target-object recognition rate (ORR) is presented. First, several quality-degradation treatments are applied to high-resolution ORSIs to model images obtained under different imaging conditions; then, a machine learning algorithm is used in recognition experiments on a chosen target object to obtain ORRs; finally, a comparison with commonly used IQA indicators is performed to reveal their applicability and limitations. The ORR of the original ORSI was calculated to be 81.95%, whereas the ORR ratios of the quality-degraded images to the original images were 65.52%, 64.58%, 71.21%, and 73.11%. These data reflect the advantages and disadvantages of different images for object identification and information extraction more accurately than conventional digital image assessment indexes. By assessing image quality from the perspective of application effect, using a machine learning algorithm to extract regional gray-scale features of typical objects in the image, this method provides a new approach for objective ORSI assessment.

  10. APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis

    ERIC Educational Resources Information Center

    Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara

    2009-01-01

    Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

  11. Methods, systems and devices for detecting threatening objects and for classifying magnetic data

    DOEpatents

    Kotter, Dale K [Shelley, ID; Roybal, Lyle G [Idaho Falls, ID; Rohrbaugh, David T [Idaho Falls, ID; Spencer, David F [Idaho Falls, ID

    2012-01-24

    A method for detecting threatening objects in a security screening system. The method includes a step of classifying unique features of magnetic data as representing a threatening object. Another step includes acquiring magnetic data. Another step includes determining if the acquired magnetic data comprises a unique feature.

  12. Dual respiratory and cardiac motion estimation in PET imaging: Methods design and quantitative evaluation.

    PubMed

    Feng, Tao; Wang, Jizhe; Tsui, Benjamin M W

    2018-04-01

The goal of this study was to develop and evaluate four post-reconstruction respiratory and cardiac (R&C) motion vector field (MVF) estimation methods for cardiac 4D PET data. In Method 1, the dual R&C motions were estimated directly from the dual R&C gated images. In Method 2, respiratory motion (RM) and cardiac motion (CM) were estimated separately from the respiratory-gated-only and cardiac-gated-only images. Method 3 modeled the effects of RM on CM estimation by applying an image-based RM correction to the cardiac gated images before CM estimation; the effects of CM on RM estimation were neglected. Method 4 iteratively modeled the mutual effects of RM and CM during dual R&C motion estimation. Realistic simulation data were generated for quantitative evaluation of the four methods. Nearly noise-free PET projection data were generated from the 4D XCAT phantom with realistic R&C MVFs using Monte Carlo simulation. Poisson noise was added to the scaled projection data to generate additional datasets at two further noise levels. All projection data were reconstructed using a 4D image reconstruction method to obtain dual R&C gated images. The four dual R&C MVF estimation methods were applied to these images, and the accuracy of motion estimation was quantitatively evaluated using the root mean square error (RMSE) of the estimated MVFs. Among the four estimation methods, Method 2 performed the worst in the noise-free case and Method 1 performed the worst in the noisy cases in terms of quantitative accuracy of the estimated MVF. Methods 3 and 4 showed comparable results and achieved RMSEs up to 35% lower than Method 1 in the noisy cases. In conclusion, we have developed and evaluated four different post-reconstruction R&C MVF estimation methods for use in 4D PET imaging. Comparison of the performance of the four methods on simulated data indicates separate R&C estimation with modeling of RM before CM estimation (Method 3) to be
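The RMSE figure of merit used to rank the four estimation methods can be sketched as follows (the motion vectors below are toy values, not the study's data):

```python
import math

def mvf_rmse(estimated, truth):
    """Root mean square error between an estimated and a true motion
    vector field, each given as a list of (dx, dy, dz) displacements."""
    assert len(estimated) == len(truth)
    sq = 0.0
    for (ex, ey, ez), (tx, ty, tz) in zip(estimated, truth):
        sq += (ex - tx) ** 2 + (ey - ty) ** 2 + (ez - tz) ** 2
    return math.sqrt(sq / len(estimated))

# Toy example: an estimate that is off by one voxel in x at every point.
truth = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
est   = [(1.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
print(mvf_rmse(est, truth))  # → 1.0
```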

  13. Influence of echo time in quantitative proton MR spectroscopy using LCModel.

    PubMed

    Yamamoto, Tetsuya; Isobe, Tomonori; Akutsu, Hiroyoshi; Masumoto, Tomohiko; Ando, Hiroki; Sato, Eisuke; Takada, Kenta; Anno, Izumi; Matsumura, Akira

    2015-06-01

The objective of this study was to elucidate the influence on quantitative analysis using LCModel when the echo time (TE) is longer than the values recommended in the spectrum-acquisition specifications. A 3T magnetic resonance system was used to perform proton magnetic resonance spectroscopy. The participants were 5 healthy volunteers and 11 patients with glioma. Data were collected at TEs of 72, 144 and 288 ms. LCModel was used to quantify several metabolites (N-acetylaspartate, creatine and phosphocreatine, and choline-containing compounds). The results were compared with quantitative values obtained using the T2-corrected internal reference method. In healthy volunteers, at long TE the quantitative values obtained using LCModel were up to 6.8-fold larger (p<0.05) than those obtained using the T2-corrected internal reference method. The ratios of the quantitative values obtained by the two methods differed between metabolites (p<0.05). In patients with glioma, the ratios of the quantitative values obtained by the two methods likewise tended to be larger at longer TE, and large between-individual variation in the ratios was observed. In clinical practice, TE is sometimes set longer than the value recommended for LCModel. At long TE, LCModel overestimates the quantitative value because it cannot compensate for signal attenuation, and this effect differs for each metabolite and condition. Therefore, if TE is longer than recommended, the possibly reduced reliability of quantitative values calculated using LCModel must be taken into account. Copyright © 2015 Elsevier Inc. All rights reserved.
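The TE-dependent signal attenuation described here follows the standard mono-exponential T2 decay, S(TE) = S0·exp(−TE/T2), which is what the T2-corrected internal reference method compensates for. A minimal sketch of that correction (the T2 value below is illustrative, not from the study):

```python
import math

def t2_corrected(signal, te_ms, t2_ms):
    """Recover the TE = 0 signal from a measurement at echo time te_ms,
    assuming mono-exponential decay S(TE) = S0 * exp(-TE / T2)."""
    return signal / math.exp(-te_ms / t2_ms)

# Illustrative: a metabolite with an assumed T2 of ~250 ms, measured at TE = 144 ms.
s_measured = 10.0
s0 = t2_corrected(s_measured, te_ms=144.0, t2_ms=250.0)
print(round(s0, 2))  # → 17.79
```

Because the correction factor depends on each metabolite's T2, an analysis that ignores it distorts the relative concentrations differently per metabolite, consistent with the between-metabolite differences reported above.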

  14. Quantitative photogrammetric analysis of the Klapp method for treating idiopathic scoliosis.

    PubMed

    Iunes, Denise H; Cecílio, Maria B B; Dozza, Marina A; Almeida, Polyanna R

    2010-01-01

Few studies have shown that physical therapy techniques are effective in the treatment of scoliosis. The aim was to analyze the efficiency of the Klapp method for the treatment of scoliosis through a quantitative analysis using computerized biophotogrammetry. Sixteen participants with idiopathic scoliosis, of mean age 15 ± 2.61 years, were treated using the Klapp method. To analyze the results of the treatment, all participants were photographed before and after treatment, following a standardized photographic method. All photographs were analyzed quantitatively by the same examiner using the ALCimagem 2000 software. Statistical analyses were performed using the paired t-test with a significance level of 5%. The treatment improved the angles evaluating shoulder symmetry, i.e. the acromioclavicular joint angle (AJ; p=0.00) and the sternoclavicular joint angle (SJ; p=0.01). There was also improvement in the angle evaluating the left Thales triangle (ΔT; p=0.02). Regarding flexibility, there were improvements in the tibiotarsal angle (TTA; p=0.01) and the hip joint angles (HJA; p=0.00). There were no changes in the vertebral curvatures and no improvement in head positioning; only the lumbar curvature, evaluated by the lumbar lordosis angle (LL; p=0.00), changed after treatment. The Klapp method was an efficient therapeutic technique for treating asymmetries of the trunk and improving its flexibility. However, it was not efficient for modifying pelvic asymmetry, head positioning, cervical lordosis or thoracic kyphosis.
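The paired t-test used for these before/after angle comparisons can be sketched in pure Python (the angle values below are toy data, not the study's measurements):

```python
import math

def paired_t(before, after):
    """Paired t statistic for before/after measurements on the same
    subjects: t = mean(d) / (sd(d) / sqrt(n)), with d_i = after_i - before_i."""
    n = len(before)
    d = [a - b for a, b in zip(after, before)]
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Toy data: a joint angle (degrees) before and after treatment in 4 subjects.
before = [152.0, 148.5, 150.0, 149.0]
after  = [155.0, 151.0, 153.5, 151.5]
print(round(paired_t(before, after), 3))
```

The statistic is then compared against the t distribution with n − 1 degrees of freedom at the 5% significance level, as in the study.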

  15. Taxonomy based analysis of force exchanges during object grasping and manipulation

    PubMed Central

    Martin-Brevet, Sandra; Jarrassé, Nathanaël; Burdet, Etienne

    2017-01-01

The flexibility of the human hand in object manipulation is essential for daily life activities, but remains relatively little explored with quantitative methods. On the one hand, recent taxonomies qualitatively describe the classes of hand postures for object grasping and manipulation. On the other hand, quantitative analysis of hand function has generally been restricted to precision grip (thumb and index opposition) during lifting tasks. The aim of the present study is to fill the gap between these two kinds of description by quantitatively investigating the forces exerted by the hand on an instrumented object in a set of representative manipulation tasks. The object was a parallelepiped able to measure the force exerted on each of its six faces and its own acceleration. The grasping force was estimated from the lateral force and the unloading force from the bottom force. The protocol included eleven tasks with complementary constraints inspired by recent taxonomies: four tasks corresponding to lifting and holding the object with different grasp configurations, and seven to manipulating the object (rotation around each of its axes, and translation). The grasping and unloading forces and object rotations were measured during the five phases of the actions: unloading, lifting, holding or manipulation, preparation to deposit, and deposit. The results confirm the tight regulation between grasping and unloading forces during lifting, and extend it to the deposit phase. In addition, they provide a precise description of the regulation of force exchanges during various manipulation tasks spanning representative actions of daily life. The timing of manipulation showed both sequential and overlapping organization of the different sub-actions, and micro-errors could be detected. This phenomenological study confirms the feasibility of using an instrumented object to investigate complex manipulative behavior in humans. This protocol will be used in the future to

  16. Are three generations of quantitative molecular methods sufficient in medical virology? Brief review.

    PubMed

    Clementi, Massimo; Bagnarelli, Patrizia

    2015-10-01

In the last two decades, the development of quantitative molecular methods has characterized the evolution of clinical virology more than any other methodological advance. Using these methods, a great number of studies have efficiently addressed in vivo the role of viral load, viral replication activity, and viral transcriptional profiles as correlates of disease outcome and progression, and have highlighted the physiopathology of important human virus diseases. Furthermore, these studies have contributed to a better understanding of virus-host interactions and have sharply revolutionized research strategies in basic and medical virology. In addition, and importantly from a medical point of view, quantitative methods have provided a rationale for therapeutic intervention and therapy monitoring in medically important viral diseases. Despite the advances in technology and the development of three generations of molecular methods within the last two decades (competitive PCR, real-time PCR, and digital PCR), great challenges remain for viral testing, related not only to standardization, accuracy, and precision, but also to the selection of the best molecular targets for clinical use and to the identification of thresholds for risk stratification and therapeutic decisions. Future research directions, novel methods and technical improvements could be important in addressing these challenges.

  17. Developing Investigative Entry Points: Exploring the Use of Quantitative Methods in English Education Research

    ERIC Educational Resources Information Center

    McGraner, Kristin L.; Robbins, Daniel

    2010-01-01

    Although many research questions in English education demand the use of qualitative methods, this paper will briefly explore how English education researchers and doctoral students may use statistics and quantitative methods to inform, complement, and/or deepen their inquiries. First, the authors will provide a general overview of the survey areas…

  18. Tactile objects based on an amplitude disturbed diffraction pattern method

    NASA Astrophysics Data System (ADS)

    Liu, Yuan; Nikolovski, Jean-Pierre; Mechbal, Nazih; Hafez, Moustapha; Vergé, Michel

    2009-12-01

Tactile sensing is becoming widely used in human-computer interfaces. Recent advances in acoustic approaches have demonstrated the possibility of transforming ordinary solid objects into interactive interfaces. This letter proposes a static finger-contact localization process using an amplitude-disturbed diffraction pattern method. The localization method is based on the following physical phenomenon: a finger contact modifies the energy distribution of an acoustic wave in a solid, and these variations depend on the wave frequency and the contact position. The method first excites the object with an acoustic signal containing multiple frequency components. In a second step, the measured acoustic signal is compared with prerecorded values to deduce the contact position. This position is then used for human-machine interaction (e.g., finger tracking on a computer screen). The selection of excitation signals is discussed and a frequency-choice criterion based on a contrast value is proposed. Tests on a sandwich plate (a liquid crystal display screen) demonstrate the simplicity and ease of applying the process to various solids.
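The second step, comparing a measured multi-frequency amplitude signature against prerecorded ones, amounts to nearest-neighbor matching over calibrated positions. A minimal sketch under that assumption (the signatures and positions below are made up):

```python
import math

def locate(measured, templates):
    """Return the calibrated contact position whose prerecorded amplitude
    signature (one value per excitation frequency) is closest, in the
    Euclidean sense, to the measured signature."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda pos: dist(measured, templates[pos]))

# Made-up signatures: amplitudes at three excitation frequencies for
# three calibrated finger positions (x, y) on the plate.
templates = {
    (10, 20): [0.9, 0.4, 0.1],
    (30, 20): [0.5, 0.8, 0.3],
    (50, 40): [0.2, 0.3, 0.9],
}
print(locate([0.55, 0.75, 0.25], templates))  # → (30, 20)
```

The contrast criterion mentioned above would govern which excitation frequencies make these signatures most separable.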

  19. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  20. An experimental comparison of online object-tracking algorithms

    NASA Astrophysics Data System (ADS)

    Wang, Qing; Chen, Feng; Xu, Wenli; Yang, Ming-Hsuan

    2011-09-01

This paper reviews and evaluates several state-of-the-art online object tracking algorithms. Notwithstanding decades of efforts, object tracking remains a challenging problem due to factors such as illumination, pose, scale, deformation, motion blur, noise, and occlusion. To account for appearance change, most recent tracking algorithms focus on robust object representations and effective state prediction. In this paper, we analyze the components of each tracking method and identify their key roles in dealing with specific challenges, thereby shedding light on how to choose and design algorithms for different situations. We compare state-of-the-art online tracking methods, including the IVT, VRT, FragT, BoostT, SemiT, BeSemiT, L1T, MILT, VTD and TLD algorithms, on numerous challenging sequences, and evaluate them with different performance metrics. The qualitative and quantitative comparative results demonstrate the strengths and weaknesses of these algorithms.
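A standard quantitative metric in such tracker comparisons is the bounding-box overlap ratio (intersection over union) between tracker output and ground truth; the exact metrics vary by benchmark, but a minimal sketch looks like this:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x, y, w, h)."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))  # overlap width
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))  # overlap height
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

# A tracker box shifted 10 px horizontally from a 100x100 ground-truth box.
print(round(iou((0, 0, 100, 100), (10, 0, 100, 100)), 3))  # → 0.818
```

Averaging this overlap (or thresholding it to count successful frames) over a sequence yields the kind of per-tracker score used in the comparisons above.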

  1. A novel method for quantitative geosteering using azimuthal gamma-ray logging.

    PubMed

    Yuan, Chao; Zhou, Cancan; Zhang, Feng; Hu, Song; Li, Chaoliu

    2015-02-01

A novel method for quantitative geosteering using azimuthal gamma-ray logging is proposed. Real-time up and bottom gamma-ray logs, recorded as a logging tool travels through a boundary surface at different relative dip angles, are simulated with the Monte Carlo method. The results show that the response points of the up and bottom gamma-ray logs as the tool approaches a highly radioactive formation can be used to predict the relative dip angle, from which the distance from the drill bit to the boundary surface is then calculated. Copyright © 2014 Elsevier Ltd. All rights reserved.
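The final step, converting a relative dip angle into a bit-to-boundary distance, reduces under a simple planar-geometry assumption to basic trigonometry; the sketch below uses that assumed geometry, not the paper's actual formula:

```python
import math

def distance_to_boundary(md_offset, relative_dip_deg):
    """Perpendicular distance from the bit to a planar boundary, given the
    along-hole (measured-depth) offset between the log response point and
    the bit, and the relative dip angle between wellbore and bed.
    Assumes a simple planar geometry: d = offset * sin(dip)."""
    return md_offset * math.sin(math.radians(relative_dip_deg))

# Hypothetical numbers: response onset 8 m back along the hole, 5° relative dip.
print(round(distance_to_boundary(8.0, 5.0), 3))
```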

  2. Quantitative Methods Intervention: What Do the Students Want?

    ERIC Educational Resources Information Center

    Frankland, Lianne; Harrison, Jacqui

    2016-01-01

    The shortage of social science graduates with competent quantitative skills jeopardises the competitive UK economy, public policy making effectiveness and the status the UK has as a world leader in higher education and research (British Academy for Humanities and Social Sciences, 2012). There is a growing demand for quantitative skills across all…

  3. A non-radioisotopic quantitative competitive polymerase chain reaction method: application in measurement of human herpesvirus 7 load.

    PubMed

    Kidd, I M; Clark, D A; Emery, V C

    2000-06-01

    Quantitative-competitive polymerase chain reaction (QCPCR) is a well-optimised and objective methodology for the determination of viral load in clinical specimens. A major advantage of QCPCR is the ability to control for the differential modulation of the PCR process in the presence of potentially inhibitory material. QCPCR protocols were developed previously for CMV, HHV-6, HHV-7 and HHV-8 and relied upon radioactively labelled primers, followed by autoradiography of the separated and digested PCR products to quantify viral load. Whilst this approach offers high accuracy and dynamic range, non-radioactive approaches would be attractive. Here, an alternative detection system is reported, based on simple ethidium bromide staining and computer analysis of the separated reaction products, which enables its adoption in the analysis of a large number of samples. In calibration experiments using cloned HHV-7 DNA, the ethidium bromide detection method showed an improved correlation with known copy number over that obtained with the isotopic method. In addition, 67 HHV-7 PCR positive blood samples, derived from immunocompromised patients, were quantified using both detection techniques. The results showed a highly significant correlation with no significant difference between the two methods. The applicability of the computerised densitometry method in the routine laboratory is discussed.
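In competitive PCR, the viral load is read off at the equivalence point where target and competitor products are equal; from densitometry of a competitor dilution series this can be found by interpolating where log(target/competitor) crosses zero. A sketch with made-up band intensities (not the study's data):

```python
import math

def equivalence_copies(competitor_copies, ratios):
    """Given competitor input copy numbers and the measured target/competitor
    band-intensity ratio at each, interpolate (in log-log space) the
    competitor amount at which the ratio equals 1 -- an estimate of the
    target copy number."""
    pts = sorted(zip((math.log10(c) for c in competitor_copies),
                     (math.log10(r) for r in ratios)))
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if y0 >= 0.0 >= y1 or y0 <= 0.0 <= y1:
            x = x0 + (0.0 - y0) * (x1 - x0) / (y1 - y0)
            return 10 ** x
    raise ValueError("ratio never crosses 1 within the dilution series")

# Made-up series: more competitor input lowers the target/competitor ratio.
print(round(equivalence_copies([1e2, 1e3, 1e4], [10.0, 1.0, 0.1])))  # → 1000
```

The same interpolation applies whether the band intensities come from autoradiography or, as here, ethidium bromide densitometry.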

  4. Comparison of salivary collection and processing methods for quantitative HHV-8 detection.

    PubMed

    Speicher, D J; Johnson, N W

    2014-10-01

Saliva is a proven diagnostic fluid for the qualitative detection of infectious agents, but the accuracy of viral load determinations is unknown. Stabilising fluids impede nucleic acid degradation, compared with collection onto ice followed by freezing, and we have shown that the DNA Genotek P-021 prototype kit (P-021) can produce high-quality DNA after 14 months of storage at room temperature. Here we evaluate the quantitative capability of 10 collection/processing methods. Unstimulated whole-mouth fluid was spiked with a mixture of HHV-8 cloned constructs, 10-fold serial dilutions were produced, and samples were extracted and then examined with quantitative PCR (qPCR). Calibration curves were compared by linear regression and qPCR dynamics. All methods extracted with commercial spin columns produced linear calibration curves with a large dynamic range and gave accurate viral loads. Ethanol precipitation of the P-021 does not produce a linear standard curve, and virus is lost in the cell pellet. DNA extractions from the P-021 using commercial spin columns produced linear standard curves with wide dynamic range and an excellent limit of detection. When extracted with spin columns, the P-021 enables accurate viral loads down to 23 copies μl⁻¹ of DNA. The quantitative and long-term storage capability of this system makes it ideal for the study of salivary DNA viruses in resource-poor settings. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
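Calibration curves from a 10-fold dilution series have the standard qPCR form Ct = slope·log10(copies) + intercept, with amplification efficiency E = 10^(−1/slope) − 1. A least-squares sketch with idealized Ct values (not the study's measurements):

```python
import math

def fit_standard_curve(copies, ct):
    """Least-squares fit of Ct against log10(input copies); returns
    (slope, intercept, efficiency), where efficiency = 10**(-1/slope) - 1."""
    x = [math.log10(c) for c in copies]
    n = len(x)
    mx, my = sum(x) / n, sum(ct) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, ct)) \
            / sum((xi - mx) ** 2 for xi in x)
    intercept = my - slope * mx
    return slope, intercept, 10 ** (-1.0 / slope) - 1.0

# Idealized 10-fold series for a perfectly efficient reaction
# (~3.32 cycles per decade of input).
copies = [1e6, 1e5, 1e4, 1e3]
ct = [18.0, 21.32, 24.64, 27.96]
slope, intercept, eff = fit_standard_curve(copies, ct)
print(round(slope, 2), round(eff, 2))  # → -3.32 1.0
```

Deviations from linearity in such a fit are exactly what distinguished the spin-column extractions from the ethanol-precipitation protocol above.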

  5. Assessment of Intrathecal Free Light Chain Synthesis: Comparison of Different Quantitative Methods with the Detection of Oligoclonal Free Light Chains by Isoelectric Focusing and Affinity-Mediated Immunoblotting

    PubMed Central

    Kušnierová, Pavlína; Švagera, Zdeněk; Všianský, František; Byrtusová, Monika; Hradílek, Pavel; Kurková, Barbora; Zapletalová, Olga; Bartoš, Vladimír

    2016-01-01

Objectives: We aimed to compare various methods for free light chain (fLC) quantitation in cerebrospinal fluid (CSF) and serum and to determine whether quantitative CSF measurements could reliably predict intrathecal fLC synthesis. In addition, we wished to determine the relationship between free kappa and free lambda light chain concentrations in CSF and serum in various disease groups. Methods: We analysed 166 paired CSF and serum samples by at least one of the following methods: turbidimetry (Freelite™, SPAPLUS), nephelometry (N Latex FLC™, BN ProSpec), and two different (commercially available and in-house developed) sandwich ELISAs. The results were compared with oligoclonal fLC detected by affinity-mediated immunoblotting after isoelectric focusing. Results: Although the correlations between quantitative methods were good, both proportional and systematic differences were discerned. However, no major differences were observed in the prediction of a positive oligoclonal fLC test. Surprisingly, CSF free kappa/free lambda light chain ratios were lower than those in serum in about 75% of samples with a negative oligoclonal fLC test. In about half of the patients with multiple sclerosis and clinically isolated syndrome, profoundly increased free kappa/free lambda light chain ratios were found in the CSF. Conclusions: Our results show that, using appropriate method-specific cut-offs, different methods of CSF fLC quantitation can be used for the prediction of intrathecal fLC synthesis. The reason for unexpectedly low free kappa/free lambda light chain ratios in normal CSF remains to be elucidated. Whereas CSF free kappa light chain concentration is increased in most patients with multiple sclerosis and clinically isolated syndrome, CSF free lambda light chain values show large interindividual variability in these patients and should be investigated further for possible immunopathological and prognostic significance. PMID:27846293

  6. Laboratory Evaluations of the Enterococcus qPCR Method for Recreational Water Quality Testing: Method Performance and Sources of Uncertainty in Quantitative Measurements

    EPA Science Inventory

    The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...

  7. Method for preventing micromechanical structures from adhering to another object

    DOEpatents

    Smith, James H.; Ricco, Antonio J.

    1998-01-01

    A method for preventing micromechanical structures from adhering to another object includes the step of immersing a micromechanical structure and its associated substrate in a chemical species that does not stick to itself. The method can be employed during the manufacture of micromechanical structures to prevent micromechanical parts from sticking or adhering to one another and their associated substrate surface.

  8. Method for preventing micromechanical structures from adhering to another object

    DOEpatents

    Smith, J.H.; Ricco, A.J.

    1998-06-16

    A method for preventing micromechanical structures from adhering to another object includes the step of immersing a micromechanical structure and its associated substrate in a chemical species that does not stick to itself. The method can be employed during the manufacture of micromechanical structures to prevent micromechanical parts from sticking or adhering to one another and their associated substrate surface. 3 figs.

  9. A new method of edge detection for object recognition

    USGS Publications Warehouse

    Maddox, Brian G.; Rhew, Benjamin

    2004-01-01

    Traditional edge detection systems function by returning every edge in an input image. This can result in a large amount of clutter and make certain vectorization algorithms less accurate. Accuracy problems can then have a large impact on automated object recognition systems that depend on edge information. A new method of directed edge detection can be used to limit the number of edges returned based on a particular feature. This results in a cleaner image that is easier for vectorization. Vectorized edges from this process could then feed an object recognition system where the edge data would also contain information as to what type of feature it bordered.

  10. Integrating Quantitative and Qualitative Data in Mixed Methods Research--Challenges and Benefits

    ERIC Educational Resources Information Center

    Almalki, Sami

    2016-01-01

    This paper is concerned with investigating the integration of quantitative and qualitative data in mixed methods research and whether, in spite of its challenges, it can be of positive benefit to many investigative studies. The paper introduces the topic, defines the terms with which this subject deals and undertakes a literature review to outline…

  11. [Comparison of two quantitative methods of endobronchial ultrasound real-time elastography for evaluating intrathoracic lymph nodes].

    PubMed

    Mao, X W; Yang, J Y; Zheng, X X; Wang, L; Zhu, L; Li, Y; Xiong, H K; Sun, J Y

    2017-06-12

Objective: To compare the clinical value of two quantitative methods for analyzing endobronchial ultrasound real-time elastography (EBUS-RTE) images in the evaluation of intrathoracic lymph nodes. Methods: From January 2014 to April 2014, EBUS-RTE examination was performed in patients who underwent EBUS-TBNA examination at Shanghai Chest Hospital. For each intrathoracic lymph node, one EBUS-RTE image was selected, and the stiff area ratio and mean hue value of the region of interest (ROI) were calculated. The final diagnosis of each lymph node was based on the pathologic/microbiologic results of EBUS-TBNA, the pathologic/microbiologic results of other examinations, and clinical follow-up. Sensitivity, specificity, positive predictive value, negative predictive value and accuracy were evaluated for distinguishing malignant from benign lesions. Results: Fifty-six patients and 68 lymph nodes were enrolled in this study, of which 35 lymph nodes were malignant and 33 benign. The stiff area ratios of benign and malignant lesions were 0.32±0.29 and 0.62±0.20, and the mean hue values were 109.99±28.13 and 141.62±17.52, respectively; statistically significant differences were found for both methods (t=-5.14, P<0.01; t=-5.53, P<0.01). The areas under the ROC curves were 0.813 for the stiff area ratio and 0.814 for the mean hue value. The optimal diagnostic cut-off value of the stiff area ratio was 0.48, with sensitivity, specificity, positive predictive value, negative predictive value and accuracy of 82.86%, 81.82%, 82.86%, 81.82% and 82.35%, respectively. The optimal diagnostic cut-off value of the mean hue value was 126.28, with sensitivity, specificity, positive predictive value, negative predictive value and accuracy of 85.71%, 75.76%, 78.95%, 83.33% and 80.88%, respectively. Conclusion: Both the stiff area ratio and the mean hue value can be used to analyze EBUS-RTE images quantitatively, and both have value for differentiating benign and malignant intrathoracic
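The cut-off statistics reported here follow mechanically from the confusion matrix at a given threshold; a sketch with made-up stiff-area ratios (the study's per-node values are not available):

```python
def diagnostics(values, malignant, cutoff):
    """Sensitivity, specificity, PPV, NPV and accuracy for the rule
    'value >= cutoff means malignant'; `malignant` holds the true labels."""
    tp = sum(1 for v, m in zip(values, malignant) if m and v >= cutoff)
    fn = sum(1 for v, m in zip(values, malignant) if m and v < cutoff)
    tn = sum(1 for v, m in zip(values, malignant) if not m and v < cutoff)
    fp = sum(1 for v, m in zip(values, malignant) if not m and v >= cutoff)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / len(values),
    }

# Made-up stiff-area ratios, using the study's 0.48 cut-off.
vals = [0.62, 0.55, 0.71, 0.30, 0.44, 0.20, 0.50, 0.10]
lbls = [True, True, True, True, False, False, False, False]
print(diagnostics(vals, lbls, cutoff=0.48))
```

Sweeping the cutoff and plotting sensitivity against (1 − specificity) yields the ROC curve whose area (0.813/0.814) is quoted above.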

  12. Nuclear medicine and imaging research (Instrumentation and quantitative methods of evaluation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1989-09-01

This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility.

  13. The Objective Borderline method (OBM): a probability-based model for setting up an objective pass/fail cut-off score in medical programme assessments.

    PubMed

    Shulruf, Boaz; Turner, Rolf; Poole, Phillippa; Wilkinson, Tim

    2013-05-01

The decision to pass or fail a medical student is a 'high-stakes' one. The aim of this study is to introduce and demonstrate the feasibility and practicality of a new objective standard-setting method for determining the pass/fail cut-off score from borderline grades. Three methods for setting pass/fail cut-off scores were compared: the Regression Method, the Borderline Group Method, and the new Objective Borderline Method (OBM). Using Year 5 students' OSCE results from one medical school, we established the pass/fail cut-off scores by each of the three methods. The comparison indicated that the pass/fail cut-off scores generated by the OBM were similar to those generated by the more established methods (0.840 ≤ r ≤ 0.998; p < .0001). Based on theoretical and empirical analysis, we suggest that the OBM has advantages over existing methods in that it combines objectivity, realism and a robust empirical basis and, no less importantly, is simple to use.
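The agreement statistic reported (0.840 ≤ r ≤ 0.998) is a correlation between the cut-off scores produced by pairs of methods; assuming a Pearson correlation, it can be sketched as follows (the scores below are toy values, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy cut-off scores from two standard-setting methods on five stations.
obm        = [54.0, 61.0, 58.5, 63.0, 56.0]
regression = [53.5, 60.0, 59.0, 64.0, 55.5]
print(round(pearson_r(obm, regression), 3))
```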

  14. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    PubMed

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (qPCR), quantitative reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs, and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques are compared: a traditional method and a mixed-model Bayesian approach. The latter accounts for statistical dependence among QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed-model approach assimilates information from replicate QMM assays, improving reliability and inter-assay homogeneity and providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed-model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to substantially improve the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
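The "traditional" calibration that the Bayesian approach is contrasted with is simple inverse prediction from a least-squares standard curve fitted to calibration standards; a sketch with synthetic standards (not the study's assay data):

```python
def classical_calibration(log_density, signal):
    """Fit signal = a + b * log10(density) on calibration standards and
    return a function mapping a new signal back to an estimated density
    (classical 'inverse prediction'; no inter-assay variability is modelled)."""
    n = len(log_density)
    mx, my = sum(log_density) / n, sum(signal) / n
    b = sum((x - mx) * (y - my) for x, y in zip(log_density, signal)) \
        / sum((x - mx) ** 2 for x in log_density)
    a = my - b * mx
    return lambda s: 10 ** ((s - a) / b)

# Synthetic standards: signal rises linearly with log10 pathogen density
# (here a = 1, b = 4 by construction).
standards_log10 = [1.0, 2.0, 3.0, 4.0]
standards_sig   = [5.0, 9.0, 13.0, 17.0]
estimate = classical_calibration(standards_log10, standards_sig)
print(round(estimate(11.0)))  # signal 11 maps back to log10 density 2.5
```

Because this single-curve inversion treats every assay as identical, it cannot absorb run-to-run variability, which is the shortcoming the mixed-model Bayesian calibration addresses.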

  15. Advanced display object selection methods for enhancing user-computer productivity

    NASA Technical Reports Server (NTRS)

    Osga, Glenn A.

    1993-01-01

The User-Interface Technology Branch at NCCOSC RDT&E Division has been conducting a series of studies to address the suitability of commercial off-the-shelf (COTS) graphical user-interface (GUI) methods for efficiency and performance in critical naval combat systems. This paper presents an advanced selection algorithm and method developed to increase user performance when making selections on tactical displays. The method has also been applied with considerable success to a variety of cursor and pointing tasks. Typical GUIs allow user selection by (1) moving a cursor with a pointing device such as a mouse, trackball, joystick or touchscreen, and (2) placing the cursor on the object. Examples of GUI objects are the buttons, icons, folders, scroll bars, etc. used in many personal computer and workstation applications. This paper presents an improved method of selection and the theoretical basis for the significant performance gains achieved with the various input devices tested. The method is applicable to all GUI styles and display sizes, and is particularly useful for selections on small screens such as notebook computers. Considering the number of work-hours spent pointing and clicking across all styles of available graphical user-interfaces, the cost/benefit of applying this method is substantial, with the potential for increasing productivity across thousands of users and applications.

  16. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because aberrant glycosylation of a glycoprotein is widely recognized to be involved in the progression of certain diseases, efficient analysis tools for aberrant glycoproteins are very important both for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews published by Wiley Periodicals, Inc. Mass Spectrom Rev 34:148–165, 2015. PMID:24889823

  17. Identification, quantitation, and method validation for flavan-3-ols in fermented ready-to-drink teas from the Italian market using HPLC-UV/DAD and LC-MS/MS.

    PubMed

    Cordero, Chiara; Canale, Francesca; Del Rio, Daniele; Bicchi, Carlo

    2009-11-01

    The present study focuses on the flavan-3-ols that characterize the antioxidant properties of fermented tea (Camellia sinensis). These bioactive compounds, the object of nutritional claims in commercial products, should be quantified with rigorous analytical procedures whose accuracy and precision have been established with a stated level of confidence. An HPLC-UV/DAD method able to detect and quantify flavan-3-ols in infusions and ready-to-drink teas has been developed for routine analysis and validated by characterizing several performance parameters; accuracy was assessed through a series of LC-MS/MS analyses. Epigallocatechin, (+)-catechin, (-)-epigallocatechingallate, (-)-epicatechin, (-)-gallocatechingallate, (-)-epicatechingallate, and (-)-catechingallate were chosen as markers of the polyphenolic fraction. Quantitative results showed that samples obtained by infusion of tea leaves were richer in polyphenolic antioxidants than those obtained through other industrial processes. The influence of shelf life and packaging material on flavan-3-ol content was also considered; marker levels decreased with an exponential trend over the shelf life, while the packaging material influenced the composition of the flavan-3-ol fraction differently over time. The method presented here provides quantitative results with a stated level of confidence and is suitable for routine quality control of iced teas whose antioxidant properties are the object of nutritional claims.
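    The exponential shelf-life trend described above can be quantified with a simple log-linear fit, since C(t) = C0·exp(-kt) is a straight line in log space. The marker concentrations below are simulated for illustration, not the paper's data.

```python
import numpy as np

# Hypothetical stability data: flavan-3-ol marker content (mg/L) of a
# bottled tea measured at several points over its shelf life (months).
months = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
content = np.array([120.0, 98.3, 80.5, 65.9, 54.0])

# An exponential trend C(t) = C0 * exp(-k t) is linear in log space,
# so fit log(content) against time by least squares.
k_neg, log_c0 = np.polyfit(months, np.log(content), 1)
c0, k = np.exp(log_c0), -k_neg

half_life = np.log(2) / k  # months until the marker content halves
print(round(c0, 1), round(k, 3), round(half_life, 1))
```

    The fitted rate constant k gives an immediately interpretable shelf-life metric (the half-life), which is one reason the exponential model is convenient for stability studies.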

  18. A fast and objective multidimensional kernel density estimation method: fastKDE

    DOE PAGES

    O'Brien, Travis A.; Kashinath, Karthik; Cavanaugh, Nicholas R.; ...

    2016-03-07

    Numerous facets of scientific research implicitly or explicitly call for the estimation of probability densities. Histograms and kernel density estimates (KDEs) are two commonly used techniques for estimating such information, with the KDE generally providing a higher fidelity representation of the probability density function (PDF). Both methods require specification of either a bin width or a kernel bandwidth. While techniques exist for choosing the kernel bandwidth optimally and objectively, they are computationally intensive, since they require repeated calculation of the KDE. A solution for objectively and optimally choosing both the kernel shape and width has recently been developed by Bernacchia and Pigolotti (2011). While this solution theoretically applies to multidimensional KDEs, it has not been clear how to practically do so. A method for practically extending the Bernacchia-Pigolotti KDE to multidimensions is introduced. This multidimensional extension is combined with a recently-developed computational improvement to their method that makes it computationally efficient: a 2D KDE on 10^5 samples takes only 1 s on a modern workstation. This fast and objective KDE method, called the fastKDE method, retains the excellent statistical convergence properties that have been demonstrated for univariate samples. The fastKDE method exhibits statistical accuracy comparable to state-of-the-science KDE methods publicly available in R, and it produces kernel density estimates several orders of magnitude faster. The fastKDE method does an excellent job of encoding covariance information for bivariate samples. This property allows for direct calculation of conditional PDFs with fastKDE. It is demonstrated how this capability might be leveraged for detecting non-trivial relationships between quantities in physical systems, such as transitional behavior.
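    The evaluation step of any kernel density estimate can be sketched generically. The following uses a fixed Gaussian kernel with a rule-of-thumb bandwidth; fastKDE's contribution is choosing the kernel shape and width objectively, so this sketch illustrates the KDE concept only, not the fastKDE package itself.

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(loc=0.0, scale=1.0, size=2000)

# Silverman's rule-of-thumb bandwidth; fastKDE instead chooses kernel
# shape and width objectively, but evaluating the estimate looks the same.
h = 1.06 * samples.std() * samples.size ** (-1 / 5)

def kde(x, data, bandwidth):
    """Gaussian kernel density estimate evaluated at the points x."""
    u = (x[:, None] - data[None, :]) / bandwidth
    k = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    return k.sum(axis=1) / (len(data) * bandwidth)

grid = np.linspace(-4.0, 4.0, 81)
pdf = kde(grid, samples, h)
mass = pdf.sum() * (grid[1] - grid[0])  # crude integral; should be close to 1
print(round(mass, 2))
```

    The O(N·M) cost of this direct evaluation (N samples, M grid points) is exactly the bottleneck that fastKDE's FFT-based formulation avoids.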

  19. Dual-mode nested search method for categorical uncertain multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Tang, Long; Wang, Hu

    2016-10-01

    Categorical multi-objective optimization is an important issue involved in many matching design problems. Non-numerical variables and their uncertainty are the major challenges of such optimizations. Therefore, this article proposes a dual-mode nested search (DMNS) method. In the outer layer, kriging metamodels are established using standard regular simplex mapping (SRSM) from categorical candidates to numerical values. Assisted by the metamodels, a k-cluster-based intelligent sampling strategy is developed to search Pareto frontier points. The inner layer uses an interval number method to model the uncertainty of categorical candidates. To improve the efficiency, a multi-feature convergent optimization via most-promising-area stochastic search (MFCOMPASS) is proposed to determine the bounds of objectives. Finally, typical numerical examples are employed to demonstrate the effectiveness of the proposed DMNS method.

  20. Overcoming Methods Anxiety: Qualitative First, Quantitative Next, Frequent Feedback along the Way

    ERIC Educational Resources Information Center

    Bernstein, Jeffrey L.; Allen, Brooke Thomas

    2013-01-01

    Political Science research methods courses face two problems. First is what to cover, as there are too many techniques to explore in any one course. Second is dealing with student anxiety around quantitative material. We explore a novel way to approach these issues. Our students began by writing a qualitative paper. They followed with a term…

  1. Saving Educational Dollars through Quality Objectives.

    ERIC Educational Resources Information Center

    Alvir, Howard P.

    This document is a collection of working papers written to meet the specific needs of teachers who are starting to think about and write performance objectives. It emphasizes qualitative objectives as opposed to quantitative classroom goals. The author describes quality objectives as marked by their clarity, accessibility, accountability, and…

  2. Validation of quantitative and qualitative methods for detecting allergenic ingredients in processed foods in Japan.

    PubMed

    Sakai, Shinobu; Adachi, Reiko; Akiyama, Hiroshi; Teshima, Reiko

    2013-06-19

    A labeling system for food allergenic ingredients was established in Japan in April 2002. To monitor the labeling, the Japanese government announced official methods for detecting allergens in processed foods in November 2002. The official methods consist of quantitative screening tests using enzyme-linked immunosorbent assays (ELISAs) and qualitative confirmation tests using Western blotting or polymerase chain reactions (PCR). In addition, the Japanese government designated 10 μg protein/g food (the corresponding allergenic ingredient soluble protein weight/food weight), determined by ELISA, as the labeling threshold. To standardize the official methods, the criteria for the validation protocol were described in the official guidelines. This paper, which was presented at the Advances in Food Allergen Detection Symposium, ACS National Meeting and Expo, San Diego, CA, Spring 2012, describes the validation protocol outlined in the official Japanese guidelines, the results of interlaboratory studies for the quantitative detection method (ELISA for crustacean proteins) and the qualitative detection method (PCR for shrimp and crab DNAs), and the reliability of the detection methods.

  3. Quantitative analysis of tympanic membrane perforation: a simple and reliable method.

    PubMed

    Ibekwe, T S; Adeosun, A A; Nwaorgu, O G

    2009-01-01

    Accurate assessment of the features of tympanic membrane perforation, especially size, site, duration and aetiology, is important, as it enables optimum management. To describe a simple, cheap and effective method of quantitatively analysing tympanic membrane perforations. The system described comprises a video-otoscope (capable of generating still and video images of the tympanic membrane), adapted via a universal serial bus box to a computer screen, with images analysed using the Image J geometrical analysis software package. The reproducibility of results and their correlation with conventional otoscopic methods of estimation were tested statistically with the paired t-test and correlational tests, using the Statistical Package for the Social Sciences version 11 software. The following equation was generated: P/T x 100 per cent = percentage perforation, where P is the area (in pixels²) of the tympanic membrane perforation and T is the total area (in pixels²) of the entire tympanic membrane (including the perforation). Illustrations are shown. Comparison of blinded data on tympanic membrane perforation area obtained independently from assessments by two trained otologists, of comparable years of experience, using the video-otoscopy system described, showed similar findings, with strong correlations devoid of inter-observer error (p = 0.000, r = 1). Comparison with conventional otoscopic assessment also indicated significant correlation, comparing results for the two trained otologists, but some inter-observer variation was present (p = 0.000, r = 0.896). Correlation between the two methods for each of the otologists was also highly significant (p = 0.000). A computer-adapted video-otoscope, with images analysed by Image J software, represents a cheap, reliable, technology-driven, clinical method of quantitative analysis of tympanic membrane perforations and injuries.
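    The P/T formula is trivial to apply once the two pixel areas have been traced in the image-analysis software; the pixel counts below are invented for illustration.

```python
# Hypothetical pixel areas traced in ImageJ-style analysis software.
perforation_px = 5400   # P: area of the perforation (pixels^2)
membrane_px = 43200     # T: total membrane area, perforation included

# The paper's formula: P/T x 100 per cent = percentage perforation.
percentage = perforation_px / membrane_px * 100
print(f"{percentage:.1f}% perforation")  # prints "12.5% perforation"
```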

  4. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    ERIC Educational Resources Information Center

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a…

  5. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    ERIC Educational Resources Information Center

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…
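    A classic classroom exercise of the kind Monte Carlo methods courses use is simulating the sampling distribution of the mean from a skewed population; the population, sample size and replication count below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(42)

# Classroom-style Monte Carlo: draw many samples from a skewed population
# and watch the sampling distribution of the mean behave as theory predicts.
sample_size, n_reps = 50, 10_000
# exponential(scale=1) has population mean 1 and standard deviation 1
means = rng.exponential(scale=1.0, size=(n_reps, sample_size)).mean(axis=1)

print(round(means.mean(), 2))  # close to the population mean, 1.0
print(round(means.std(), 2))   # close to 1 / sqrt(50), about 0.14
```

    Students can vary `sample_size` and watch the spread of `means` shrink as 1/sqrt(n), turning an abstract theorem into a repeatable experiment.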

  6. Quantitative rotating frame relaxometry methods in MRI.

    PubMed

    Gilani, Irtiza Ali; Sepponen, Raimo

    2016-06-01

    Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. It has been shown in several studies that the rotating frame longitudinal relaxation rate constant (R1ρ ) and the rotating frame transverse relaxation rate constant (R2ρ ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of the tissue (i.e. R1ρ and R2ρ ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during the process of spin-locking in rotating frame relaxometry are the major barriers to the establishment of these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurements of R1ρ and R2ρ . The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd.
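    The quantitative endpoint of such measurements is typically a mono-exponential fit of signal versus spin-lock duration, S(TSL) = S0·exp(-TSL·R1ρ). A minimal sketch with simulated data (all values hypothetical):

```python
import numpy as np

# Hypothetical spin-lock experiment: signal decays as S = S0 * exp(-TSL * R1rho).
tsl_ms = np.array([2.0, 10.0, 30.0, 50.0, 80.0])  # spin-lock durations (ms)
signal = 1000.0 * np.exp(-tsl_ms / 45.0)          # simulated, T1rho = 45 ms

# A mono-exponential decay is linear in log space: ln S = ln S0 - R1rho * TSL.
slope, intercept = np.polyfit(tsl_ms, np.log(signal), 1)
r1rho = -slope           # rate constant (1/ms)
t1rho_ms = 1.0 / r1rho   # relaxation time T1rho (ms)

print(round(t1rho_ms, 1))  # recovers the simulated 45.0 ms
```

    In practice the fit is done voxel-by-voxel to produce an R1ρ map; a log-linear fit is shown here for brevity, though nonlinear least squares is more robust to noise.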

  7. Multiscale moment-based technique for object matching and recognition

    NASA Astrophysics Data System (ADS)

    Thio, HweeLi; Chen, Liya; Teoh, Eam-Khwang

    2000-03-01

    A new method is proposed to extract features from an object for matching and recognition. The proposed features are a combination of local and global characteristics -- local characteristics from the 1-D signature function defined at each pixel on the object boundary, global characteristics from the moments generated from that signature function. The boundary of the object is first extracted; the signature function is then generated by computing, for every point on the boundary, the angle between two lines drawn from that point, as a function of position along the boundary. This signature function is position, scale and rotation invariant (PSRI). The shape of the signature function is then described quantitatively using moments. The moments of the signature function are thus global characters of a local feature set. Using moments as the eventual features, instead of the signature function itself, reduces the time and complexity of an object matching application. Multiscale moments are implemented to produce several sets of moments for more accurate matching; the multiscale technique is essentially a coarse-to-fine procedure that makes the proposed method more robust to noise. The method is proposed to match and recognize objects under simple transformations such as translation, scale change, rotation and skewing. A simple logo indexing system is implemented to illustrate the performance of the proposed method.
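    The signature-then-moments pipeline can be sketched as follows, using a circle as the test boundary. The neighbour offset k, the moment orders, and the boundary itself are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

# Hypothetical closed boundary: points sampled around a shape (here a circle,
# for which the signature function is constant by symmetry).
n = 100
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
boundary = np.stack([np.cos(theta), np.sin(theta)], axis=1)

def signature(points, k=5):
    """Angle at each boundary point between the lines to its k-th neighbours."""
    prev_v = np.roll(points, k, axis=0) - points
    next_v = np.roll(points, -k, axis=0) - points
    dot = (prev_v * next_v).sum(axis=1)
    norm = np.linalg.norm(prev_v, axis=1) * np.linalg.norm(next_v, axis=1)
    return np.arccos(np.clip(dot / norm, -1.0, 1.0))

sig = signature(boundary)

# Global descriptors: ordinary moments of the signature function.
moments = [np.mean(sig**p) for p in (1, 2, 3)]
print([round(m, 3) for m in moments])
```

    Comparing a handful of moments is far cheaper than aligning and comparing entire signature functions, which is the efficiency argument the abstract makes.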

  8. An object recognition method based on fuzzy theory and BP networks

    NASA Astrophysics Data System (ADS)

    Wu, Chuan; Zhu, Ming; Yang, Dong

    2006-01-01

    It is difficult to choose feature vectors when a neural network is used to recognize objects. If the feature vectors are not chosen appropriately, different objects may yield similar feature vectors, or the same object may yield different ones under scaling, shifting and rotation. In order to solve this problem, the image is edge-detected, the membership function is reconstructed, and a new threshold segmentation method based on fuzzy theory is proposed to obtain the binary image. The moment invariants of the binary image are extracted and normalized. Because some moment invariants are too small to compute with effectively, the logarithm of each moment invariant is taken as the input feature vector of the BP network. The experimental results demonstrate that the proposed approach recognizes objects effectively, correctly and quickly.

  9. [CHROMATOSPECTROPHOTOMETRIC METHOD OF QUANTITATIVE ANALYSIS OF LAPPACONITINE IN THE UNDERGROUND PARTS OF ACONITUM ORIENTALE MILL, GROWING IN GEORGIA].

    PubMed

    Kintsurashvili, L

    2016-05-01

    Aconitum orientale Mill (family Helleboraceae) is a perennial herb. It grows in forests of western and eastern Georgia and in the subalpine zone. The research objects were the underground parts of Aconitum orientale Mill, picked in the fruiting phase in Borjomi in 2014. The total alkaloids were obtained from the air-dried underground parts (1.5 kg) by chloroform extraction after basification with 5% sodium carbonate. The alkaloid sum amounted to 16.5 g, and the predominant component was determined to be the pharmacologically active diterpene alkaloid lappaconitine, the active principle of the antiarrhythmic drug "Allapinin". A chromatospectrophotometric method of quantitative analysis of lappaconitine was elaborated to determine the productivity of the underground parts of Aconitum orientale Mill. The wavelength of maximal absorption in the ultraviolet spectrum (λmax) was determined to be 308 nm. Statistical processing of the quantitative results established that the relative error is within the norm (4%). The content of lappaconitine in the underground parts of Aconitum orientale Mill in the fruiting phase was determined to be 0.11-0.13%. Based on these experimental data, Aconitum orientale Mill is approved as a raw material for obtaining pharmacologically active lappaconitine.

  10. Qualitative to quantitative: linked trajectory of method triangulation in a study on HIV/AIDS in Goa, India.

    PubMed

    Bailey, Ajay; Hutter, Inge

    2008-10-01

    With 3.1 million people estimated to be living with HIV/AIDS in India and 39.5 million people globally, the epidemic has posed academics the challenge of identifying behaviours and their underlying beliefs in the effort to reduce the risk of HIV transmission. The Health Belief Model (HBM) is frequently used to identify risk behaviours and adherence behaviour in the field of HIV/AIDS. Risk behaviour studies that apply HBM have been largely quantitative and use of qualitative methodology is rare. The marriage of qualitative and quantitative methods has never been easy. The challenge is in triangulating the methods. Method triangulation has been largely used to combine insights from the qualitative and quantitative methods but not to link both the methods. In this paper we suggest a linked trajectory of method triangulation (LTMT). The linked trajectory aims to first gather individual level information through in-depth interviews and then to present the information as vignettes in focus group discussions. We thus validate information obtained from in-depth interviews and gather emic concepts that arise from the interaction. We thus capture both the interpretation and the interaction angles of the qualitative method. Further, using the qualitative information gained, a survey is designed. In doing so, the survey questions are grounded and contextualized. We employed this linked trajectory of method triangulation in a study on the risk assessment of HIV/AIDS among migrant and mobile men. Fieldwork was carried out in Goa, India. Data come from two waves of studies, first an explorative qualitative study (2003), second a larger study (2004-2005), including in-depth interviews (25), focus group discussions (21) and a survey (n=1259). By employing the qualitative to quantitative LTMT we can not only contextualize the existing concepts of the HBM, but also validate new concepts and identify new risk groups.

  11. Employing Machine-Learning Methods to Study Young Stellar Objects

    NASA Astrophysics Data System (ADS)

    Moore, Nicholas

    2018-01-01

    Vast amounts of data exist in the astronomical data archives, and yet a large number of sources remain unclassified. We developed a multi-wavelength pipeline to classify infrared sources. The pipeline uses supervised machine learning methods to classify objects into the appropriate categories. The program is fed data that is already classified to train it, and is then applied to unknown catalogues. The primary use for such a pipeline is the rapid classification and cataloging of data that would take a much longer time to classify otherwise. While our primary goal is to study young stellar objects (YSOs), the applications extend beyond the scope of this project. We present preliminary results from our analysis and discuss future applications.
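    As a stand-in for the pipeline's supervised learners, a minimal nearest-centroid classifier over two hypothetical infrared colours shows the train-on-labelled, apply-to-unknown pattern the abstract describes. The colour axes, class centres and all data below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training set: two infrared colours for sources already
# labelled as YSO (1) or non-YSO (0).
yso = rng.normal([1.0, 0.8], 0.2, size=(200, 2))
other = rng.normal([0.1, 0.0], 0.2, size=(200, 2))
X = np.vstack([yso, other])
y = np.array([1] * 200 + [0] * 200)

# Minimal supervised classifier: nearest class centroid in colour space.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def classify(colors):
    """Assign each row of colours to the class with the nearest centroid."""
    d = np.linalg.norm(colors[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

unknown = np.array([[0.9, 0.9], [0.0, -0.1]])
print(classify(unknown))  # prints "[1 0]": first looks like a YSO
```

    A production pipeline would use richer multi-wavelength features and a stronger learner, but the train/apply structure is the same.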

  12. Frame sequences analysis technique of linear objects movement

    NASA Astrophysics Data System (ADS)

    Oshchepkova, V. Y.; Berg, I. A.; Shchepkin, D. V.; Kopylova, G. V.

    2017-12-01

    Obtaining data by noninvasive methods is often needed in many fields of science and engineering. This is achieved through video recording at various frame rates and in various light spectra. In doing so, quantitative analysis of the motion of the objects under study becomes an important component of the research. This work discusses the analysis of the motion of linear objects on the two-dimensional plane. The complexity of this problem increases when the frame contains numerous objects whose images may overlap. This study uses a sequence of 30 frames at a resolution of 62 × 62 pixels and a frame rate of 2 Hz. The task was to determine the average velocity of the objects' motion; it was found as an average over 8-12 objects with an error of 15%. After processing, dependencies of the average velocity on the control parameters were found. The processing was performed in the software environment GMimPro, with subsequent approximation of the data obtained using the Hill equation.
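    Fitting the extracted velocities with the Hill equation, v = Vmax·c^h / (K^h + c^h), can be sketched with noise-free synthetic data. Vmax is assumed known here (e.g. from the plateau), which linearises the fit; all numbers are illustrative, not the study's data.

```python
import numpy as np

# Hypothetical dose-response data following a Hill curve,
# v = Vmax * c^h / (K^h + c^h): velocity vs. a control parameter.
conc = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 8.0])
vmax, k_true, h_true = 10.0, 2.0, 2.0
v = vmax * conc**h_true / (k_true**h_true + conc**h_true)

# With Vmax known, the Hill equation linearises:
# log(v / (Vmax - v)) = h*log(c) - h*log(K), a straight line in h and K.
y = np.log(v / (vmax - v))
h, b = np.polyfit(np.log(conc), y, 1)
k = np.exp(-b / h)

print(round(h, 2), round(k, 2))  # recovers h = 2.0, K = 2.0
```

    With noisy measurements one would instead fit all three parameters by nonlinear least squares, but the linearised form shows what the Hill coefficient and K represent.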

  13. Finding the rarest objects in the universe: A new, efficient method for discovering BL Lacertae objects

    NASA Technical Reports Server (NTRS)

    Stocke, John; Perlman, Eric; Granados, Arno; Schachter, Jonathan; Elvis, Martin; Urry, Meg; Impey, Chris; Smith, Paul

    1993-01-01

    We present a new, efficient method for discovering new BL Lac Objects based upon the results of the Einstein Extended Medium Sensitivity Survey (EMSS). We have found that all x-ray selected BL Lacs are radio emitters, and further, that in a 'color-color' diagram (radio/optical and optical/x-ray) the BL Lac Objects occupy an area distinct from both radio loud quasars and the radio quiet QSOs and Seyferts which dominate x-ray selected samples. After obtaining radio counterparts via VLA 'snapshot' observations of a large sample of unidentified x-ray sources, the list of candidates is reduced. These candidates then can be confirmed with optical spectroscopy and/or polarimetry. Since greater than 70 percent of these sources are expected to be BL Lacs, the optical observations are very efficient. We have tested this method using unidentified sources found in the Einstein Slew Survey. The 162 Slew Survey x-ray source positions were observed with the VLA in a mixed B/C configuration at 6 cm resulting in 60 detections within 1.5 position error circle radii. These x-ray/optical/radio sources were then plotted, and 40 BL Lac candidates were identified. To date, 10 candidates have been spectroscopically observed resulting in 10 new BL Lac objects! Radio flux, optical magnitude, and polarization statistics (obtained in white light with the Steward Observatory 2.3 m CCD polarimeter) for each are given.

  14. Method for detecting an image of an object

    DOEpatents

    Chapman, Leroy Dean; Thomlinson, William C.; Zhong, Zhong

    1999-11-16

    A method for detecting an absorption, refraction and scatter image of an object by independently analyzing, detecting, digitizing, and combining images acquired on a high and a low angle side of a rocking curve of a crystal analyzer. An x-ray beam which is generated by any suitable conventional apparatus can be irradiated upon either a Bragg type crystal analyzer or a Laue type crystal analyzer. Images of the absorption, refraction and scattering effects are detected, such as on an image plate, and then digitized. The digitized images are simultaneously solved, preferably on a pixel-by-pixel basis, to derive a combined visual image which has dramatically improved contrast and spatial resolution over an image acquired through conventional radiology methods.
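    The pixel-by-pixel combination of the two images can be sketched as follows. Writing each measured intensity as I = I_R[R(θ) + R'(θ)Δθ] at the low- and high-angle rocking-curve points gives two equations that are linear in I_R and I_R·Δθ, so each pixel reduces to a 2×2 solve. The rocking-curve values, image sizes and units below are invented for illustration.

```python
import numpy as np

# Hypothetical analyser rocking-curve values at the low- and high-angle
# half-slope points: reflectivity R and its angular derivative dR/dtheta.
R_lo, dR_lo = 0.5, 60.0    # derivative units (per mrad) are illustrative
R_hi, dR_hi = 0.5, -60.0

# Simulated measured images (2x2 pixels) on each side of the rocking curve.
true_abs = np.array([[1.0, 0.8], [0.6, 0.9]])            # absorption image I_R
true_dtheta = np.array([[0.001, -0.002], [0.0, 0.003]])  # refraction angles
I_lo = true_abs * (R_lo + dR_lo * true_dtheta)
I_hi = true_abs * (R_hi + dR_hi * true_dtheta)

# Pixel-by-pixel solution of the two linear equations in I_R and I_R*dtheta
# (Cramer's rule on the 2x2 system, vectorised over all pixels).
det = R_lo * dR_hi - R_hi * dR_lo
I_R = (I_lo * dR_hi - I_hi * dR_lo) / det
dtheta = (R_lo * I_hi - R_hi * I_lo) / det / I_R

print(np.allclose(I_R, true_abs), np.allclose(dtheta, true_dtheta))  # True True
```

    Separating the refraction-angle image from the apparent-absorption image in this way is what yields the improved contrast over a single conventional radiograph.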

  15. Objective Versus Subjective Military Pilot Selection Methods in the United States of America

    DTIC Science & Technology

    2015-12-14

    …a computerized test designed to assess pilot skills by measuring spatial orientation and psychomotor skills and multitasking. The second is the… (AFRL-SA-WP-SR-2015-0028, September 2014).

  16. Are quantitative sensitivity analysis methods always reliable?

    NASA Astrophysics Data System (ADS)

    Huang, X.

    2016-12-01

    Physical parameterizations developed to represent subgrid-scale physical processes include various uncertain parameters, leading to large uncertainties in today's Earth System Models (ESMs). Sensitivity analysis (SA) is an efficient approach to quantitatively determine how the uncertainty of an evaluation metric can be apportioned to each parameter, and it can identify the most influential parameters, thereby reducing the high-dimensional parametric space. In previous studies, SA-based approaches such as Sobol' and Fourier amplitude sensitivity testing (FAST) divide the parameters into a sensitive group, which is retained, and an insensitive group, which is eliminated from further study. However, these approaches ignore the loss of the interaction effects between the retained parameters and the eliminated ones, which are also part of the total sensitivity indices. As a result, traditional SA approaches and tools may identify the wrong sensitive parameters. In this study, we propose a dynamic global sensitivity analysis method (DGSAM), which iteratively removes the least important parameter until only two parameters are left. We use CLM-CASA, a global terrestrial model, as an example to verify our findings, with sample sizes ranging from 7000 to 280000. The results show that DGSAM identifies more influential parameters, which is confirmed by parameter calibration experiments using four popular optimization methods. For example, optimization using the top three parameters selected by DGSAM achieved a roughly 10% improvement over Sobol', and the computational cost of calibration was reduced to 1/6 of the original. In future work, it will be necessary to explore alternative SA methods that emphasize parameter interactions.
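    The kind of first-order (main-effect) sensitivity index these methods estimate can be illustrated with a toy model: bin one parameter, take the variance of the conditional means of the output across bins, and normalise by the total variance. The model, the binning estimator and all numbers below are illustrative assumptions, not the DGSAM algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy model with three uncertain parameters of very different influence.
n = 200_000
x = rng.uniform(0, 1, size=(n, 3))
y = 4.0 * x[:, 0] + 1.0 * x[:, 1] + 0.1 * x[:, 2]

def first_order_index(xi, y, bins=50):
    """Crude first-order sensitivity: Var over bins of E[y | xi] / Var[y]."""
    edges = np.quantile(xi, np.linspace(0, 1, bins + 1))
    which = np.clip(np.searchsorted(edges, xi) - 1, 0, bins - 1)
    cond_means = np.array([y[which == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

s = [first_order_index(x[:, i], y) for i in range(3)]
print([round(v, 2) for v in s])  # largest index for the first parameter
```

    Note what this simple index misses: it says nothing about interaction effects, which is precisely the gap in "keep the sensitive, drop the insensitive" screening that the abstract criticizes.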

  17. Quantitative vibro-acoustography of tissue-like objects by measurement of resonant modes

    NASA Astrophysics Data System (ADS)

    Mazumder, Dibbyan; Umesh, Sharath; Mohan Vasu, Ram; Roy, Debasish; Kanhirodan, Rajan; Asokan, Sundarrajan

    2017-01-01

    We demonstrate a simple and computationally efficient method to recover the shear modulus pertaining to the focal volume of an ultrasound transducer from the measured vibro-acoustic spectral peaks. A model that explains the transport of local deformation information with the acoustic wave acting as a carrier is put forth. It is also shown that the peaks correspond to the natural frequencies of vibration of the focal volume, which may be readily computed by solving an eigenvalue problem associated with the vibrating region. Having measured the first natural frequency with a fibre Bragg grating sensor, and armed with an expedient means of computing the same, we demonstrate a simple procedure, based on the method of bisection, to recover the average shear modulus of the object in the ultrasound focal volume. We demonstrate this recovery for four homogeneous agarose slabs of different stiffness and verify the accuracy of the recovery using independent rheometer-based measurements. Extension of the method to anisotropic samples through the measurement of a more complete set of resonant modes and the recovery of an elasticity tensor distribution, as is done in resonant ultrasound spectroscopy, is suggested.
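    The bisection recovery can be sketched once a forward model for the first natural frequency is available. Here a hypothetical one-parameter model (frequency proportional to sqrt(G/ρ) with an assumed geometry factor) stands in for the eigenvalue computation over the focal volume; all constants are invented.

```python
import math

# Illustrative forward model: the first natural frequency of the focal
# volume scales as sqrt(G / rho); the geometry factor is an assumed constant.
RHO = 1050.0   # tissue-like density, kg/m^3
GEOM = 2.0e4   # hypothetical geometry factor (1/m)

def first_frequency(shear_modulus):
    """Computed first resonant frequency (Hz) for a given shear modulus (Pa)."""
    return GEOM * math.sqrt(shear_modulus / RHO) / (2 * math.pi)

def recover_shear_modulus(measured_hz, lo=1e2, hi=1e6, tol=1e-3):
    """Bisection: find G whose computed frequency matches the measured peak."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if first_frequency(mid) < measured_hz:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

measured = first_frequency(25_000.0)           # pretend the FBG sensor read this
print(round(recover_shear_modulus(measured)))  # -> 25000 (Pa)
```

    Bisection works here because the natural frequency increases monotonically with stiffness; in the paper the forward evaluation is an eigenvalue solve rather than a closed-form expression, but the outer loop is the same.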

  18. Matrix evaluation of science objectives

    NASA Technical Reports Server (NTRS)

    Wessen, Randii R.

    1994-01-01

    The most fundamental objective of all robotic planetary spacecraft is to return science data. To accomplish this, a spacecraft is fabricated and built, software is planned and coded, and a ground system is designed and implemented. However, the quantitative analysis required to determine how the collection of science data drives ground system capabilities has received very little attention. This paper defines a process by which science objectives can be quantitatively evaluated. By applying it to the Cassini Mission to Saturn, this paper further illustrates the power of this technique. The results show which science objectives drive specific ground system capabilities. In addition, this process can assist system engineers and scientists in the selection of the science payload during pre-project mission planning; ground system designers during ground system development and implementation; and operations personnel during mission operations.

  19. Student Performance in a Quantitative Methods Course under Online and Face-to-Face Delivery

    ERIC Educational Resources Information Center

    Verhoeven, Penny; Wakeling, Victor

    2011-01-01

    In a study conducted at a large public university, the authors assessed, for an upper-division quantitative methods business core course, the impact of delivery method (online versus face-toface) on the success rate (percentage of enrolled students earning a grade of A, B, or C in the course). The success rate of the 161 online students was 55.3%,…

  20. Comparative analysis of quantitative efficiency evaluation methods for transportation networks.

    PubMed

    He, Yuxin; Qin, Jin; Hong, Jian

    2017-01-01

    An effective evaluation of transportation network efficiency can offer guidance for the optimal control of urban traffic. After introducing and mathematically analyzing three quantitative evaluation methods for transportation network efficiency, this paper compares the information they measure, including network structure, traffic demand, travel choice behavior and other factors that affect network efficiency, and accordingly discusses the applicability of each method. Analysis of different example networks shows that the Q-H method reflects well the influence of network structure, traffic demand and user route choice behavior on transportation network efficiency. In addition, the network efficiency measured by this method and Braess's paradox can be explained in terms of each other, indicating a better evaluation of the real operating condition of a transportation network. Analysis of the network efficiency calculated by the Q-H method also shows that a given transportation network has a specific appropriate demand. Meanwhile, under fixed demand, both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure that yields the largest transportation network efficiency can be identified.

  1. A simple objective method for determining a dynamic journal collection.

    PubMed

    Bastille, J D; Mankin, C J

    1980-10-01

    In order to determine the content of a journal collection responsive to both user needs and space and dollar constraints, quantitative measures of the use of a 647-title collection have been related to space and cost requirements to develop objective criteria for a dynamic collection for the Treadwell Library at the Massachusetts General Hospital, a large medical research center. Data were collected for one calendar year (1977) and stored with the elements for each title's profile in a computerized file. To account for the effect of the bulk of the journal runs on the number of uses, raw use data have been adjusted using linear shelf space required for each title to produce a factor called density of use. Titles have been ranked by raw use and by density of use with space and cost requirements for each. Data have also been analyzed for five special categories of use. Given automated means of collecting and storing data, use measures should be collected continuously. Using raw use frequency ranking to relate use to space and costs seems sensible since a decision point cutoff can be chosen in terms of the potential interlibrary loans generated. But it places new titles at risk while protecting titles with long, little used runs. Basing decisions on density of use frequency ranking seems to produce a larger yield of titles with fewer potential interlibrary loans and to identify titles with overlong runs which may be pruned or converted to microform. The method developed is simple and practical. Its design will be improved to apply to data collected in 1980 for a continuous study of journal use. The problem addressed is essentially one of inventory control. Viewed as such it makes good financial sense to measure use as part of the routine operation of the library to provide information for effective management decisions.
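    The ranking idea in the abstract, raw use versus "density of use" (uses per unit of linear shelf space), can be sketched with hypothetical titles and numbers:

    ```python
    # Illustrative sketch only (not the Treadwell Library's actual data):
    # rank journal titles by raw use and by density of use, defined as
    # annual uses divided by linear shelf space occupied.
    titles = {
        # title: (annual uses, shelf space in metres), hypothetical values
        "Journal A": (120, 3.0),   # heavily used, long run
        "Journal B": (90, 0.5),    # heavily used, compact run
        "Journal C": (15, 4.0),    # little used, long run
    }

    by_raw_use = sorted(titles, key=lambda t: titles[t][0], reverse=True)
    by_density = sorted(titles, key=lambda t: titles[t][0] / titles[t][1],
                        reverse=True)

    print(by_raw_use)   # long, heavily used runs rank high
    print(by_density)   # compact, heavily used titles rank high
    ```

    Under the density ranking, Journal C's long and little-used run falls to the bottom, matching the abstract's point that density of use flags overlong runs as candidates for pruning or conversion to microform.
    
    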

  2. Exploring the Objective and Perceived Environmental Attributes of Older Adults’ Neighborhood Walking Routes: A Mixed Methods Analysis

    PubMed Central

    Moran, Mika R.; Werner, Perla; Doron, Israel; HaGani, Neta; Benvenisti, Yael; King, Abby C.; Winter, Sandra J.; Sheats, Jylana L.; Garber, Randi; Motro, Hadas; Ergon, Shlomit

    2018-01-01

    Walking is a central form of physical activity among older adults that is associated with the physical environment at various scales. This mixed-methods study employs a concurrent nested design to explore objective and perceived environmental characteristics of older adults’ local walking routes. This was achieved by integrating quantitative Geographic Information System (GIS) data with qualitative data obtained using the Stanford Discovery Tool (DT). Fifty-nine community-dwelling middle-aged and older adults (14 men and 45 women aged 50+) were recruited in a snowball approach through community centers in the city of Haifa (Israel). Four neighborhood environment themes were identified: pedestrian infrastructure, access to destinations, aesthetics, and environmental quality. Both geometrical traits (i.e., distance, slope) and urban features (i.e., land-uses, greenery) of the route may impact the experience of walking. The findings thus highlight the importance of micro-scale environmental elements in shaping environmental perceptions, which may consequently influence the choice of being active. PMID:27992252

  3. QUANTITATIVE CANCER RISK ASSESSMENT METHODOLOGY USING SHORT-TERM GENETIC BIOASSAYS: THE COMPARATIVE POTENCY METHOD

    EPA Science Inventory

    Quantitative risk assessment is fraught with many uncertainties. The validity of the assumptions underlying the methods employed are often difficult to test or validate. Cancer risk assessment has generally employed either human epidemiological data from relatively high occupatio...

  4. Automated Quantitative Nuclear Cardiology Methods

    PubMed Central

    Motwani, Manish; Berman, Daniel S.; Germano, Guido; Slomka, Piotr J.

    2016-01-01

    Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps and estimate global and local measures of stress/rest perfusion, all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease and to predict prognostic outcomes. This chapter briefly reviews these techniques, highlights several challenges and discusses the latest developments. PMID:26590779

  5. A method to validate quantitative high-frequency power doppler ultrasound with fluorescence in vivo video microscopy.

    PubMed

    Pinter, Stephen Z; Kim, Dae-Ro; Hague, M Nicole; Chambers, Ann F; MacDonald, Ian C; Lacefield, James C

    2014-08-01

    Flow quantification with high-frequency (>20 MHz) power Doppler ultrasound can be performed objectively using the wall-filter selection curve (WFSC) method to select the cutoff velocity that yields a best-estimate color pixel density (CPD). An in vivo video microscopy system (IVVM) is combined with high-frequency power Doppler ultrasound to provide a method for validation of CPD measurements based on WFSCs in mouse testicular vessels. The ultrasound and IVVM systems are instrumented so that the mouse remains on the same imaging platform when switching between the two modalities. In vivo video microscopy provides gold-standard measurements of vascular diameter to validate power Doppler CPD estimates. Measurements in four image planes from three mice exhibit wide variation in the optimal cutoff velocity and indicate that a predetermined cutoff velocity setting can introduce significant errors in studies intended to quantify vascularity. Consistent with previously published flow-phantom data, in vivo WFSCs exhibited three characteristic regions and detectable plateaus. Selection of a cutoff velocity at the right end of the plateau yielded a CPD close to the gold-standard vascular volume fraction estimated using IVVM. An investigator can implement the WFSC method to help adapt cutoff velocity to current blood flow conditions and thereby improve the accuracy of power Doppler for quantitative microvascular imaging. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
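    A loose sketch of the plateau-selection idea behind the WFSC method follows; the curve values and the flatness tolerance are assumptions for illustration, not values from the paper. The idea is to scan color pixel density (CPD) as a function of cutoff velocity and take the cutoff at the right end of the flat plateau region.

    ```python
    import numpy as np

    # Hypothetical wall-filter selection curve: CPD measured at a series
    # of candidate cutoff velocities. Real curves show three regions; the
    # middle plateau yields the best-estimate CPD.
    cutoffs = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])     # mm/s
    cpd = np.array([40.0, 25.0, 18.0, 17.6, 17.2, 17.0, 9.0, 3.0])   # percent

    slopes = np.abs(np.diff(cpd) / np.diff(cutoffs))
    flat = slopes < 1.0            # plateau tolerance (assumed threshold)

    # take the cutoff at the right end of the flat run
    plateau_idx = [i for i, f in enumerate(flat) if f]
    best_cutoff = cutoffs[plateau_idx[-1] + 1] if plateau_idx else cutoffs[0]
    print(best_cutoff)             # 3.0 for this synthetic curve
    ```

    Selecting the cutoff adaptively from the measured curve, rather than using a predetermined value, is exactly the error source the abstract warns about.
    
    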

  6. Quantitative Determination of Caffeine in Beverages Using a Combined SPME-GC/MS Method

    NASA Astrophysics Data System (ADS)

    Pawliszyn, Janusz; Yang, Min J.; Orton, Maureen L.

    1997-09-01

    Solid-phase microextraction (SPME) combined with gas chromatography/mass spectrometry (GC/MS) has been applied to the analysis of various caffeinated beverages. Unlike the current methods, this technique is solvent free and requires no pH adjustments. The simplicity of the SPME-GC/MS method lends itself well to undergraduate laboratory practice. This publication describes the analytical conditions and presents data for the determination of caffeine in coffee, tea, and coke. Quantitation by isotopic dilution is also illustrated.
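    Quantitation by isotopic dilution, mentioned at the end of the abstract, can be illustrated with hypothetical numbers: a known amount of isotopically labelled caffeine is spiked into the sample, and the analyte amount follows from the GC/MS peak-area ratio.

    ```python
    # Minimal isotope-dilution calculation (all values hypothetical, not
    # from the paper). The labelled internal standard experiences the
    # same extraction losses as the analyte, so the area ratio survives.
    spiked_standard_ug = 10.0     # labelled caffeine added to the sample
    area_analyte = 84000.0        # peak area, unlabelled caffeine
    area_standard = 42000.0       # peak area, labelled caffeine
    response_factor = 1.0         # equal MS response assumed for both forms

    caffeine_ug = (spiked_standard_ug
                   * (area_analyte / area_standard) / response_factor)
    print(caffeine_ug)            # 20.0
    ```
    
    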

  7. Optical micromanipulation methods for controlled rotation, transportation, and microinjection of biological objects.

    PubMed

    Mohanty, S K; Gupta, P K

    2007-01-01

    The use of laser microtools for rotation and controlled transport of microscopic biological objects and for microinjection of exogenous material in cells is discussed. We first provide a brief overview of the laser tweezers-based methods for rotation or orientation of microscopic objects. Particular emphasis is placed on the methods that are more suitable for the manipulation of biological objects, and the use of these for two-dimensional (2D) and 3D rotations/orientations of intracellular objects is discussed. We also discuss how a change in the shape of a red blood cell (RBC) suspended in hypertonic buffer leads to its rotation when it is optically tweezed. The potential use of this approach for the diagnosis of malaria is also illustrated. The use of a line tweezers having an asymmetric intensity distribution about the center of its major axis for simultaneous transport of microscopic objects, and the successful use of this approach for induction, enhancement, and guidance of neuronal growth cones is presented next. Finally, we describe laser microbeam-assisted microinjection of impermeable drugs into cells and also briefly discuss possible adverse effects of the laser trap or microbeams on cells.

  8. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    PubMed

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

    Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug-candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear ideal for pharmaceutical quantitative MSI analysis. However, they are more challenging, as they involve almost no sample preparation and are more susceptible to ion suppression/enhancement. Herein, based on our developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method uses a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Tissue sections (heart, kidney, and brain) from rats administered CAT were then analyzed. The quantitative MSI results were cross-validated against LC-MS/MS data from the same tissues. The consistency suggests that the approach can quickly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. mHealth Series: mHealth project in Zhao County, rural China – Description of objectives, field site and methods

    PubMed Central

    van Velthoven, Michelle Helena; Li, Ye; Wang, Wei; Du, Xiaozhen; Wu, Qiong; Chen, Li; Majeed, Azeem; Rudan, Igor; Zhang, Yanfeng; Car, Josip

    2013-01-01

    Background We set up a collaboration between researchers in China and the UK that aimed to explore the use of mHealth in China. This is the first paper in a series of papers on a large mHealth project that is part of this collaboration. This paper covers the aims and objectives of the mHealth project, our field site, and the detailed methods of two studies. Field site The field site for this mHealth project was Zhao County, which lies 280 km south of Beijing in Hebei Province, China. Methods We described the methodology of two studies: (i) a mixed-methods study exploring factors influencing sample size calculations for mHealth-based health surveys and (ii) a cross-over study determining the validity of an mHealth text-messaging data collection tool. The first study used mixed methods, both quantitative and qualitative, including: (i) two surveys with caregivers of young children, (ii) interviews with caregivers, village doctors and participants of the cross-over study, and (iii) researchers' views. We combined data from caregivers, village doctors and researchers to provide an in-depth understanding of factors influencing sample size calculations for mHealth-based health surveys. The second study used a randomised cross-over design to compare the traditional face-to-face survey method with the new text-messaging survey method. We assessed data equivalence (intrarater agreement), the amount of information in responses, reasons for giving different responses, the response rate, characteristics of non-responders, and the error rate. Conclusions This paper described the objectives, field site and methods of a large mHealth project that is part of a collaboration between researchers in China and the UK. The mixed-methods study evaluating factors that influence sample size calculations could help future studies estimate reliable sample sizes. The cross-over study comparing face-to-face and text-message survey data collection

  10. Determination of feature generation methods for PTZ camera object tracking

    NASA Astrophysics Data System (ADS)

    Doyle, Daniel D.; Black, Jonathan T.

    2012-06-01

    Object detection and tracking using computer vision (CV) techniques have been widely applied to sensor fusion applications. Many papers continue to be written that speed up performance and increase the learning of artificially intelligent systems through improved algorithms, workload distribution, and information fusion. Military application of real-time tracking systems is becoming more and more complex, with an ever-increasing need for fusion and CV techniques to actively track and control dynamic systems. Examples include the use of metrology systems for tracking and measuring micro air vehicles (MAVs) and autonomous navigation systems for controlling MAVs. This paper seeks to contribute to the determination of select tracking algorithms that best track a moving object using a pan/tilt/zoom (PTZ) camera, applicable to both of the examples presented. The feature generation algorithms compared in this paper are the trained Scale-Invariant Feature Transform (SIFT) and Speeded Up Robust Features (SURF), the Mixture of Gaussians (MoG) background subtraction method, the Lucas-Kanade optical flow method (2000) and the Farneback optical flow method (2003). The matching algorithm used for the trained feature generation algorithms is the Fast Library for Approximate Nearest Neighbors (FLANN). The BSD-licensed OpenCV library is used extensively to demonstrate the viability of each algorithm and its performance. Initial testing is performed on a sequence of images using a stationary camera. Further testing is performed on a sequence of images in which the PTZ camera is moving in order to capture the moving object. Comparisons are made based upon accuracy, speed and memory.
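    The background-subtraction family of methods compared here can be illustrated with a deliberately simplified sketch: plain differencing against a running-average background rather than the full Mixture of Gaussians model, with synthetic arrays standing in for camera frames.

    ```python
    import numpy as np

    # Simplified background subtraction (NOT the MoG model of the paper):
    # pixels far from the background estimate are flagged as foreground,
    # and the background is updated as a running average.
    rng = np.random.default_rng(0)
    background = np.zeros((8, 8))
    frame = background + rng.normal(0.0, 0.01, size=(8, 8))  # sensor noise
    frame[2:4, 2:4] += 5.0                    # a bright 2x2 moving object

    alpha = 0.05                              # background learning rate
    mask = np.abs(frame - background) > 1.0   # foreground threshold (assumed)
    background = (1 - alpha) * background + alpha * frame  # adapt background

    print(int(mask.sum()))                    # 4 pixels flagged
    ```

    MoG generalizes this by keeping several Gaussian modes per pixel, which tolerates multimodal backgrounds such as swaying foliage.
    
    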

  11. An augmented classical least squares method for quantitative Raman spectral analysis against component information loss.

    PubMed

    Zhou, Yan; Cao, Hui

    2013-01-01

    We propose an augmented classical least squares (ACLS) calibration method for quantitative Raman spectral analysis that guards against component information loss. Raman spectral signals with low analyte-concentration correlations were selected and used as substitutes for unknown quantitative component information during the CLS calibration procedure. The number of selected signals was determined using the leave-one-out root-mean-square error of cross-validation (RMSECV) curve. An ACLS model was built from the augmented concentration matrix and the reference spectral signal matrix. The proposed method was compared with partial least squares (PLS) and principal component regression (PCR) on one example: a data set recorded from an experiment determining analyte concentrations by Raman spectroscopy. A 2-fold cross-validation with a Venetian-blinds strategy was used to evaluate the predictive power of the proposed method, and one-way analysis of variance (ANOVA) was used to assess the difference in predictive power between the proposed method and existing methods. Results indicated that the proposed method effectively increases the robustness of the traditional CLS model against component information loss, and its predictive power is comparable to that of PLS or PCR.
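    A toy version of the CLS calibration underlying the proposed method follows; the spectra are synthetic, and the paper's RMSECV-driven signal selection is replaced by simply assuming the second concentration column (the interferent) is known, so this shows only the least-squares core, not ACLS itself.

    ```python
    import numpy as np

    # CLS model: spectra S = concentrations C @ pure spectra K (+ noise).
    rng = np.random.default_rng(1)
    K_true = rng.random((2, 50))            # pure-component "spectra"
    C = rng.random((20, 2))                 # analyte + interferent columns
    S = C @ K_true + rng.normal(0, 1e-3, (20, 50))

    # Calibration: least-squares estimate of the pure spectra from the
    # full (here, two-column) concentration matrix.
    K_hat, *_ = np.linalg.lstsq(C, S, rcond=None)

    # Prediction: project a new spectrum onto the estimated pure spectra.
    c_new = np.array([[0.3, 0.7]])
    s_new = c_new @ K_true
    c_pred = s_new @ np.linalg.pinv(K_hat)
    print(np.round(c_pred, 2))              # close to [[0.3, 0.7]]
    ```

    Omitting the interferent column from C biases K_hat, which is exactly the "component information loss" the augmentation step is designed to absorb.
    
    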

  12. Atmospheric Blocking and Intercomparison of Objective Detection Methods: Flow Field Characteristics

    NASA Astrophysics Data System (ADS)

    Pinheiro, M. C.; Ullrich, P. A.; Grotjahn, R.

    2017-12-01

    A number of objective methods for identifying and quantifying atmospheric blocking have been developed over the last couple of decades, but there is variable consensus on the resultant blocking climatology. This project examines blocking climatologies as produced by three different methods: two anomaly-based methods, and the geopotential height gradient method of Tibaldi and Molteni (1990). The results highlight the differences in blocking that arise from the choice of detection method, with emphasis on the physical characteristics of the flow field and the subsequent effects on the blocking patterns that emerge.
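    The Tibaldi and Molteni (1990) geopotential height gradient test mentioned above can be sketched at a single longitude; the 500 hPa heights below are hypothetical, and the latitudes are fixed at the standard 40/60/80 N reference values without the paper's latitude offsets.

    ```python
    # Tibaldi-Molteni blocking test at one longitude: blocked when the
    # southern height gradient reverses (GHGS > 0) while the northern
    # gradient is strongly negative (GHGN < -10 m per degree latitude).
    z500 = {80.0: 5300.0, 60.0: 5600.0, 40.0: 5550.0}  # heights in metres

    ghgs = (z500[60.0] - z500[40.0]) / (60.0 - 40.0)   # southern gradient
    ghgn = (z500[80.0] - z500[60.0]) / (80.0 - 60.0)   # northern gradient

    blocked = ghgs > 0.0 and ghgn < -10.0
    print(blocked)   # True: easterly flow south of a high-latitude ridge
    ```

    Anomaly-based detection methods instead threshold departures of the height field from a climatology, which is why the resulting climatologies can disagree.
    
    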

  13. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.

  14. Ionizing radiation post-curing of objects produced by stereolithography and other methods

    DOEpatents

    Howell, David H.; Eberle, Claude C.; Janke, Christopher J.

    2000-01-01

    An object comprised of a curable material and formed by stereolithography or another three-dimensional prototyping method, in which the object has undergone initial curing, is subjected to post-curing by ionizing radiation, such as an electron beam having a predetermined beam output energy, which is applied in a predetermined dosage and at a predetermined dose rate. The post-cured object exhibits a property profile which is superior to that which existed prior to the ionizing radiation post-curing.

  15. Method for a quantitative investigation of the frozen flow hypothesis

    PubMed

    Schock; Spillar

    2000-09-01

    We present a technique to test the frozen flow hypothesis quantitatively, using data from wave-front sensors such as those found in adaptive optics systems. Detailed treatments of the theoretical background of the method and of the error analysis are presented. Analyzing data from the 1.5-m and 3.5-m telescopes at the Starfire Optical Range, we find that the frozen flow hypothesis is an accurate description of the temporal development of atmospheric turbulence on time scales of the order of 1-10 ms but that significant deviations from the frozen flow behavior are present for longer time scales.

  16. A method for normalizing pathology images to improve feature extraction for quantitative pathology.

    PubMed

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-01

    With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. ICHE may be a useful preprocessing step in a digital pathology image-processing pipeline.
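    A loose sketch of the two ICHE stages on a toy grayscale image follows; plain global histogram equalization stands in for the paper's modified CLAHE, and the centering target of 128 is an assumption, so this illustrates the shape of the pipeline rather than the published method.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    img = rng.integers(60, 120, size=(32, 32)).astype(float)  # dim "slide"

    # Stage 1: intensity centering, scale the histogram centroid to a
    # common target value (128 chosen here for illustration).
    img_centered = img * (128.0 / img.mean())

    # Stage 2: histogram equalization via the empirical CDF (rank map),
    # a global stand-in for the paper's modified CLAHE.
    flat = img_centered.ravel()
    ranks = flat.argsort().argsort()                  # 0 .. n-1 ranks
    img_eq = (ranks / (flat.size - 1) * 255.0).reshape(img.shape)

    print(round(img_centered.mean()))                 # 128
    ```

    Mapping all slides to a common centroid before equalization is what removes the batch-to-batch brightness offset that varied staining introduces.
    
    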

  17. Applying the Multiple Signal Classification Method to Silent Object Detection Using Ambient Noise

    NASA Astrophysics Data System (ADS)

    Mori, Kazuyoshi; Yokoyama, Tomoki; Hasegawa, Akio; Matsuda, Minoru

    2004-05-01

    The revolutionary concept of using ocean ambient noise positively to detect objects, called acoustic daylight imaging, has attracted much attention. The authors attempted the detection of a silent target object using ambient noise and a wide-band beam former consisting of an array of receivers. In experimental results obtained in air, using the wide-band beam former, we successfully applied the delay-sum array (DSA) method to detect a silent target object in an acoustic noise field generated by a large number of transducers. This paper reports some experimental results obtained by applying the multiple signal classification (MUSIC) method to a wide-band beam former to detect silent targets. The ocean ambient noise was simulated by transducers decentralized to many points in air. Both MUSIC and DSA detected a spherical target object in the noise field. The relative power levels near the target obtained with MUSIC were compared with those obtained by DSA. Then the effectiveness of the MUSIC method was evaluated according to the rate of increase in the maximum and minimum relative power levels.
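    The MUSIC algorithm at the core of this comparison can be sketched for a narrowband uniform linear array with synthetic data; the paper's wide-band beam former requires per-band processing not shown here, and all parameter values below are illustrative assumptions.

    ```python
    import numpy as np

    # Narrowband MUSIC: estimate a source direction from the noise
    # subspace of the array covariance. One source simulated at 20 deg.
    rng = np.random.default_rng(3)
    M, d, true_deg = 8, 0.5, 20.0    # sensors, spacing (wavelengths), source

    def steer(theta_deg):
        k = 2 * np.pi * d * np.sin(np.radians(theta_deg))
        return np.exp(1j * k * np.arange(M))

    snapshots = 200
    s = rng.normal(size=snapshots) + 1j * rng.normal(size=snapshots)
    X = np.outer(steer(true_deg), s) + 0.1 * (
        rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots)))

    R = X @ X.conj().T / snapshots   # sample covariance matrix
    w, V = np.linalg.eigh(R)         # eigenvalues in ascending order
    En = V[:, :-1]                   # noise subspace (one source assumed)

    angles = np.arange(-90, 91)
    p = [1.0 / np.linalg.norm(En.conj().T @ steer(a)) ** 2 for a in angles]
    est = int(angles[int(np.argmax(p))])
    print(est)                       # pseudospectrum peaks near 20
    ```

    The delay-sum array, by contrast, simply sums time-aligned sensor outputs, which is why MUSIC can resolve the target more sharply in the noise field.
    
    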

  18. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image-processing pipeline.

  19. Paradigms Lost and Pragmatism Regained: Methodological Implications of Combining Qualitative and Quantitative Methods

    ERIC Educational Resources Information Center

    Morgan, David L.

    2007-01-01

    This article examines several methodological issues associated with combining qualitative and quantitative methods by comparing the increasing interest in this topic with the earlier renewal of interest in qualitative research during the 1980s. The first section argues for the value of Kuhn's concept of paradigm shifts as a tool for examining…

  20. Rapid method for protein quantitation by Bradford assay after elimination of the interference of polysorbate 80.

    PubMed

    Cheng, Yongfeng; Wei, Haiming; Sun, Rui; Tian, Zhigang; Zheng, Xiaodong

    2016-02-01

    The Bradford assay is one of the most common methods for measuring protein concentrations. However, some pharmaceutical excipients, such as detergents, interfere with the Bradford assay even at low concentrations. Protein precipitation can be used to overcome sample incompatibility with protein quantitation, but the protein recovery rate after acetone precipitation is only about 70%. In this study, we found that sucrose not only increased the rate of protein recovery after a 1 h acetone precipitation, but also did not interfere with the Bradford assay. We therefore developed a method for rapid protein quantitation in protein drugs even when they contain interfering substances. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Haptic exploratory behavior during object discrimination: a novel automatic annotation method.

    PubMed

    Jansen, Sander E M; Bergmann Tiest, Wouter M; Kappers, Astrid M L

    2015-01-01

    In order to acquire information concerning the geometry and material of handheld objects, people tend to execute stereotypical hand movement patterns called haptic Exploratory Procedures (EPs). Manual annotation of haptic exploration trials with these EPs is a laborious task that is affected by subjectivity, attentional lapses, and viewing angle limitations. In this paper we propose an automatic EP annotation method based on position and orientation data from motion tracking sensors placed on both hands and inside a stimulus. A set of kinematic variables is computed from these data and compared to sets of predefined criteria for each of four EPs. Whenever all criteria for a specific EP are met, it is assumed that that particular hand movement pattern was performed. This method is applied to data from an experiment where blindfolded participants haptically discriminated between objects differing in hardness, roughness, volume, and weight. In order to validate the method, its output is compared to manual annotation based on video recordings of the same trials. Although mean pairwise agreement is less between human-automatic pairs than between human-human pairs (55.7% vs 74.5%), the proposed method performs much better than random annotation (2.4%). Furthermore, each EP is linked to a specific object property for which it is optimal (e.g., Lateral Motion for roughness). We found that the percentage of trials where the expected EP was found does not differ between manual and automatic annotation. For now, this method cannot yet completely replace a manual annotation procedure. However, it could be used as a starting point that can be supplemented by manual annotation.

  2. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, Leon J.; Cremers, David A.

    1985-01-01

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is demonstrated. Analysis time is significantly shortened compared with the usual chemical techniques of analysis.

  3. Apparatus and method for quantitative determination of materials contained in fluids

    DOEpatents

    Radziemski, L.J.; Cremers, D.A.

    1982-09-07

    Apparatus and method for near real-time in-situ monitoring of particulates and vapors contained in fluids are described. Initial filtration of a known volume of the fluid sample is combined with laser-induced dielectric breakdown spectroscopy of the filter employed to obtain qualitative and quantitative information with high sensitivity. Application of the invention to monitoring of beryllium, beryllium oxide, or other beryllium-alloy dusts is shown. Analysis time is significantly shortened compared with the usual chemical techniques of analysis.

  4. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C.; Kuske, Cheryl R.; Mullen, Kenneth I.

    2002-01-01

    A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.

  5. Detection of blob objects in microscopic zebrafish images based on gradient vector diffusion.

    PubMed

    Li, Gang; Liu, Tianming; Nie, Jingxin; Guo, Lei; Malicki, Jarema; Mara, Andrew; Holley, Scott A; Xia, Weiming; Wong, Stephen T C

    2007-10-01

    The zebrafish has become an important vertebrate animal model for the study of developmental biology, functional genomics, and disease mechanisms. It is also being used for drug discovery. Computerized detection of blob objects has been one of the important tasks in quantitative phenotyping of zebrafish. We present a new automated method that is able to detect blob objects, such as nuclei or cells in microscopic zebrafish images. This method is composed of three key steps. The first step is to produce a diffused gradient vector field by a physical elastic deformable model. In the second step, the flux image is computed on the diffused gradient vector field. The third step performs thresholding and nonmaximum suppression based on the flux image. We report the validation and experimental results of this method using zebrafish image datasets from three independent research labs. Both sensitivity and specificity of this method are over 90%. This method is able to differentiate closely juxtaposed or connected blob objects, with high sensitivity and specificity in different situations. It is characterized by a good, consistent performance in blob object detection.

  6. Effects of DNA extraction and purification methods on real-time quantitative PCR analysis of Roundup Ready soybean.

    PubMed

    Demeke, Tigst; Ratnayaka, Indira; Phan, Anh

    2009-01-01

    The quality of DNA affects the accuracy and repeatability of quantitative PCR results. Different DNA extraction and purification methods were compared for quantification of Roundup Ready (RR) soybean (event 40-3-2) by real-time PCR. DNA was extracted using cetyltrimethylammonium bromide (CTAB), the DNeasy Plant Mini Kit, and the Wizard Magnetic DNA Purification System for Food. CTAB-extracted DNA was also purified using the Zymo (DNA Clean & Concentrator 25 kit), Qtip 100 (Qiagen Genomic-Tip 100/G), and QIAEX II Gel Extraction Kit. The CTAB extraction method provided the largest amount of DNA, and the Zymo purification kit resulted in the highest percentage of DNA recovery. The Abs260/280 and Abs260/230 ratios were less than the expected values for some of the DNA extraction and purification methods used, indicating the presence of substances that could inhibit PCR. Real-time quantitative PCR results were affected by the DNA extraction and purification methods used. Further purification or dilution of the CTAB DNA was required for successful quantification of RR soybean. Less variability of quantitative PCR results was observed among experiments and replications for DNA extracted and/or purified by the CTAB, CTAB+Zymo, CTAB+Qtip 100, and DNeasy methods. Correct and repeatable results for real-time PCR quantification of RR soybean were achieved using CTAB DNA purified with the Zymo and Qtip 100 methods.
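    The absorbance-ratio screen mentioned above (Abs260/280, Abs260/230) is easy to automate. The thresholds below are the commonly cited rule-of-thumb values for pure DNA, not figures from this study:

```python
def purity_flags(a260, a280, a230):
    """Screen a DNA extract for likely PCR-inhibiting carryover using
    absorbance ratios (rule-of-thumb targets: A260/280 ~1.8 for pure DNA,
    A260/230 ~2.0-2.2)."""
    r280 = a260 / a280
    r230 = a260 / a230
    flags = []
    if r280 < 1.7:
        flags.append("low A260/280: protein or phenol carryover suspected")
    if r230 < 2.0:
        flags.append("low A260/230: salt or carbohydrate carryover suspected")
    return r280, r230, flags

# hypothetical extract that would need further purification or dilution
r280, r230, flags = purity_flags(a260=1.00, a280=0.62, a230=0.80)
```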

  7. Experimental Null Method to Guide the Development of Technical Procedures and to Control False-Positive Discovery in Quantitative Proteomics.

    PubMed

    Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun

    2015-10-02

    Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct the method development/optimization for quantitative proteomics, which nonetheless remains challenging largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures and the same batch as the case-vs-control experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performance of quantitative analysis. As a proof of concept, we employed the EN method to assess the quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups using different experimental and data-processing approaches and in various cellular and tissue proteomes. It was found that choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect the quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and a rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the necessary number of technical/biological replicates per group that affords sufficient power for discovery. Furthermore, we assessed the ability of the EN method to estimate levels of false-positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling using technical and biological replicates, respectively, where the true
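    The core idea, measuring a null distribution from replicate runs of the same samples and cutting ratios at its tail, can be sketched as follows. This is a schematic of the concept only; the simulated noise level, sample size, and 95% quantile are assumptions, not the paper's parameters:

```python
import numpy as np

def en_ratio_cutoff(replicate_a, replicate_b, alpha=0.05):
    """Experimental-null sketch: per-protein log2 ratios between two
    replicate runs of the SAME samples form the null distribution; the
    (1 - alpha) quantile of |log2 ratio| becomes the cutoff below which
    changes are indistinguishable from technical variability."""
    log_ratios = np.log2(np.asarray(replicate_a) / np.asarray(replicate_b))
    return float(np.quantile(np.abs(log_ratios), 1 - alpha))

rng = np.random.default_rng(0)
abundance = rng.uniform(1e3, 1e5, size=2000)    # per-protein signal
noise = rng.normal(0.0, 0.1, size=2000)         # technical variability
cutoff = en_ratio_cutoff(abundance * 2.0 ** noise, abundance)
# cutoff comes out near 0.2 log2 units for ~10% technical noise
```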

  8. Association between quantitative measures obtained using fluorescence-based methods and activity status of occlusal caries lesions in primary molars.

    PubMed

    Novaes, Tatiane Fernandes; Reyes, Alessandra; Matos, Ronilza; Antunes-Pontes, Laura Regina; Marques, Renata Pereira de Samuel; Braga, Mariana Minatel; Diniz, Michele Baffi; Mendes, Fausto Medeiros

    2017-05-01

    Fluorescence-based methods (FBM) can add objectiveness to diagnosis strategy for caries. Few studies, however, have focused on the evaluation of caries activity. To evaluate the association between quantitative measures obtained with FBM, clinical parameters acquired from the patients, caries detection, and assessment of activity status in occlusal surfaces of primary molars. Six hundred and six teeth from 113 children (4-14 years) were evaluated. The presence of a biofilm, caries experience, and the number of active lesions were recorded. The teeth were assessed using FBM: DIAGNOdent pen (Lfpen) and Quantitative light-induced fluorescence (QLF). As reference standard, all teeth were evaluated using the ICDAS (International Caries Detection and Assessment System) associated with clinical activity assessments. Multilevel regressions compared the FBM values and evaluated the association between the FBM measures and clinical variables related to the caries activity. The measures from the FBM were higher in cavitated lesions. Only, ∆F values distinguished active and inactive lesions. The LFpen measures were higher in active lesions, at the cavitated threshold (56.95 ± 29.60). Following regression analyses, only the presence of visible biofilm on occlusal surfaces (adjusted prevalence ratio = 1.43) and ∆R values of the teeth (adjusted prevalence ratio = 1.02) were associated with caries activity. Some quantitative measures from FBM parameters are associated with caries activity evaluation, which is similar to the clinical evaluation of the presence of visible biofilm. © 2016 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. Methods for Quantitative Interpretation of Retarding Field Analyzer Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calvey, J.R.; Crittenden, J.A.; Dugan, G.F.

    2011-03-28

    Over the course of the CesrTA program at Cornell, over 30 Retarding Field Analyzers (RFAs) have been installed in the CESR storage ring, and a great deal of data has been taken with them. These devices measure the local electron cloud density and energy distribution, and can be used to evaluate the efficacy of different cloud mitigation techniques. Obtaining a quantitative understanding of RFA data requires use of cloud simulation programs, as well as a detailed model of the detector itself. In a drift region, the RFA can be modeled by postprocessing the output of a simulation code, and one can obtain best fit values for important simulation parameters with a chi-square minimization method.
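    The chi-square minimization mentioned at the end can be illustrated with a one-parameter toy fit, a hypothetical overall scale factor applied to a simulated detector template; the real analysis fits multiple cloud-simulation parameters:

```python
def chi_square(data, model, sigma):
    """Standard chi-square statistic for independent Gaussian errors."""
    return sum((d - m) ** 2 / s ** 2 for d, m, s in zip(data, model, sigma))

def best_fit_scale(data, template, sigma, grid):
    """Grid search for the scale factor minimizing chi-square."""
    return min(grid,
               key=lambda a: chi_square(data, [a * t for t in template], sigma))

template = [1.0, 2.0, 4.0, 2.0, 1.0]     # simulated RFA response (toy)
data = [3.1, 5.9, 12.2, 6.0, 2.9]        # "measured" signal, roughly 3x
sigma = [0.2] * len(data)
grid = [round(0.1 * i, 1) for i in range(1, 61)]    # scan 0.1 .. 6.0
best = best_fit_scale(data, template, sigma, grid)  # -> 3.0
```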

  10. Overview of Student Affairs Research Methods: Qualitative and Quantitative.

    ERIC Educational Resources Information Center

    Perl, Emily J.; Noldon, Denise F.

    2000-01-01

    Reviews the strengths and weaknesses of quantitative and qualitative research in student affairs, noting that many student affairs professionals question the value of traditional quantitative approaches to research, even though their typically strong people skills have made them good qualitative researchers.…

  11. A quantitative method for determining spatial discriminative capacity.

    PubMed

    Zhang, Zheng; Tannan, Vinay; Holden, Jameson K; Dennis, Robert G; Tommerdahl, Mark

    2008-03-10

    The traditional two-point discrimination (TPD) test, a widely used tactile spatial acuity measure, has been criticized as being imprecise because it is based on subjective criteria and involves a number of non-spatial cues. The results of a recent study showed that when two stimuli were delivered simultaneously, vibrotactile amplitude discrimination became worse when the two stimuli were positioned relatively close together and was significantly degraded when the probes were within a subject's two-point limen. The impairment of amplitude discrimination with decreasing inter-probe distance suggested that the metric of amplitude discrimination could provide a means of objective and quantitative measurement of spatial discrimination capacity. A two-alternative forced-choice (2AFC) tracking procedure was used to assess a subject's ability to discriminate the amplitude difference between two stimuli positioned at near-adjacent skin sites. Two 25 Hz flutter stimuli, identical except for a constant difference in amplitude, were delivered simultaneously to the hand dorsum. The stimuli were initially spaced 30 mm apart, and the inter-stimulus distance was modified on a trial-by-trial basis based on the subject's performance in identifying the stimulus with the higher intensity. The experiment was repeated via sequential, rather than simultaneous, delivery of the same vibrotactile stimuli. Results obtained from this study showed that performance of the amplitude discrimination task was significantly degraded when the stimuli were delivered simultaneously and were near a subject's two-point limen. In contrast, subjects were able to correctly discriminate between the amplitudes of the two stimuli when they were sequentially delivered at all inter-probe distances (including those within the two-point limen), and performance improved when an adapting stimulus was delivered prior to the simultaneously delivered stimuli. Subjects' capacity to discriminate the amplitude difference
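    The trial-by-trial distance tracking can be sketched as a simple one-up/one-down staircase against a toy observer whose discrimination fails inside a hypothetical two-point limen; all numbers are illustrative, not the study's parameters:

```python
def staircase(limen_mm, start_mm=30.0, step_mm=2.0, trials=40):
    """Shrink the inter-probe distance after a correct discrimination,
    widen it after an error; the track settles around the limen."""
    d = start_mm
    history = []
    for _ in range(trials):
        correct = d > limen_mm           # toy observer model
        d = max(step_mm, d - step_mm if correct else d + step_mm)
        history.append(d)
    return history

track = staircase(limen_mm=11.0)
estimate = sum(track[-10:]) / 10         # hovers near the 11 mm limen
```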

  12. Hepatitis C Virus RNA Real-Time Quantitative RT-PCR Method Based on a New Primer Design Strategy.

    PubMed

    Chen, Lida; Li, Wenli; Zhang, Kuo; Zhang, Rui; Lu, Tian; Hao, Mingju; Jia, Tingting; Sun, Yu; Lin, Guigao; Wang, Lunan; Li, Jinming

    2016-01-01

    Viral nucleic acids are unstable when improperly collected, handled, and stored, resulting in decreased sensitivity of currently available commercial quantitative nucleic acid testing kits. Using known unstable hepatitis C virus RNA, we developed a quantitative RT-PCR method based on a new primer design strategy to reduce the impact of nucleic acid instability on nucleic acid testing. The performance of the method was evaluated for linearity, limit of detection, precision, specificity, and agreement with commercial hepatitis C virus assays. Its clinical application was compared to that of two commercial kits, Cobas AmpliPrep/Cobas TaqMan (CAP/CTM) and Kehua. The quantitative RT-PCR method delivered a good performance, with a linearity of R(2) = 0.99, a total limit of detection (genotypes 1 to 6) of 42.6 IU/mL (95% CI, 32.84 to 67.76 IU/mL), a CV of 1.06% to 3.34%, a specificity of 100%, and a high concordance with the CAP/CTM assay (R(2) = 0.97), with a mean ± SD value of -0.06 ± 1.96 log IU/mL (range, -0.38 to 0.25 log IU/mL). The method was superior to commercial assays in detecting unstable hepatitis C virus RNA (P < 0.05). This quantitative RT-PCR method can effectively eliminate the influence of RNA instability on nucleic acid testing. The principle of the primer design strategy may be applied to the detection of other RNA or DNA viruses. Copyright © 2016 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  13. Quantitative phase microscopy for cellular dynamics based on transport of intensity equation.

    PubMed

    Li, Ying; Di, Jianglei; Ma, Chaojie; Zhang, Jiwei; Zhong, Jinzhan; Wang, Kaiqiang; Xi, Teli; Zhao, Jianlin

    2018-01-08

    We demonstrate a simple method for quantitative phase imaging of tiny transparent objects such as living cells based on the transport of intensity equation. The experiments are performed using an inverted bright field microscope upgraded with a flipping imaging module, which makes it possible to simultaneously create two laterally separated images with unequal defocus distances. This add-on module does not include any lenses or gratings and is cost-effective and easy to align. The validity of this method is confirmed by the measurement of a microlens array and human osteoblastic cells in culture, indicating its potential in the applications of dynamically measuring living cells and other transparent specimens in a quantitative, non-invasive and label-free manner.
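    For a uniform in-focus intensity I0, the transport of intensity equation reduces to a Poisson equation for the phase, ∇²φ = -(k/I0) ∂I/∂z, which an FFT solver inverts directly. The sketch below is the generic textbook solver under that uniform-intensity assumption, not the authors' implementation:

```python
import numpy as np

def tie_phase(dI_dz, I0, k, dx):
    """Solve laplacian(phi) = -(k / I0) * dI/dz with an FFT Poisson solver.
    Returns the phase up to an additive constant (DC term set to zero)."""
    n = dI_dz.shape[0]
    f = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    ky, kx = np.meshgrid(f, f, indexing="ij")
    k2 = kx ** 2 + ky ** 2
    k2[0, 0] = 1.0                          # avoid division by zero at DC
    phi_hat = np.fft.fft2(-(k / I0) * dI_dz) / (-k2)
    phi_hat[0, 0] = 0.0                     # fix the arbitrary constant
    return np.real(np.fft.ifft2(phi_hat))

# self-consistency check with a known smooth phase
n, dx, I0, k = 64, 1.0, 1.0, 1.0
x = 2 * np.pi * np.arange(n) / n
phi_true = np.cos(x)[:, None] * np.cos(x)[None, :]
f = 2 * np.pi * np.fft.fftfreq(n, d=dx)
ky, kx = np.meshgrid(f, f, indexing="ij")
lap = np.real(np.fft.ifft2(-(kx ** 2 + ky ** 2) * np.fft.fft2(phi_true)))
phi_rec = tie_phase(-(I0 / k) * lap, I0, k, dx)   # recovers phi_true
```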

  14. Determining characteristics of artificial near-Earth objects using observability analysis

    NASA Astrophysics Data System (ADS)

    Friedman, Alex M.; Frueh, Carolin

    2018-03-01

    Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
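    For the linearized case, the classical check is the rank of the observability matrix, sketched below; the harmonic-oscillator dynamics here are a stand-in for illustration, not the paper's orbit-plus-solar-radiation-pressure model:

```python
import numpy as np

def observability_matrix(A, C):
    """Stack [C; CA; CA^2; ...; CA^(n-1)]; full column rank n means the
    chosen state is observable from the chosen measurements."""
    blocks = [C]
    for _ in range(A.shape[0] - 1):
        blocks.append(blocks[-1] @ A)
    return np.vstack(blocks)

def is_observable(A, C):
    return np.linalg.matrix_rank(observability_matrix(A, C)) == A.shape[0]

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])        # toy oscillator dynamics (assumption)
C_pos = np.array([[1.0, 0.0]])     # measure position only
observable = is_observable(A, C_pos)           # True: velocity recoverable
degenerate = is_observable(np.eye(2), C_pos)   # False: decoupled state hidden
```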

  15. PCR-free quantitative detection of genetically modified organism from raw materials. An electrochemiluminescence-based bio bar code method.

    PubMed

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R

    2008-05-15

    A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.

  16. A quantitative approach to evolution of music and philosophy

    NASA Astrophysics Data System (ADS)

    Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano

    2012-08-01

    The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
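    The bootstrap step can be sketched generically: resample the small data set with replacement and read confidence bounds off the resampled statistic. The seven scores below are invented for illustration, not data from the study:

```python
import numpy as np

def bootstrap_ci(values, stat=np.mean, n_boot=5000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for `stat`, mitigating the
    bias of inference from a very small sample."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    boots = [stat(rng.choice(values, size=len(values), replace=True))
             for _ in range(n_boot)]
    return np.quantile(boots, [alpha / 2, 1 - alpha / 2])

# seven "composers" scored on one stylistic feature (illustrative numbers)
scores = [2.1, 2.4, 3.0, 3.2, 3.3, 4.0, 4.8]
lo, hi = bootstrap_ci(scores)     # 95% CI for the mean feature value
```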

  17. Comparative analysis of quantitative efficiency evaluation methods for transportation networks

    PubMed Central

    He, Yuxin; Hong, Jian

    2017-01-01

    An effective evaluation of transportation network efficiency could offer guidance for the optimal control of urban traffic. Based on the introduction and related mathematical analysis of three quantitative evaluation methods for transportation network efficiency, this paper compares the information measured by them, including network structure, traffic demand, travel choice behavior and other factors which affect network efficiency. Accordingly, the applicability of the various evaluation methods is discussed. Analysis of different transportation network examples shows that the Q-H method reflects well the influence of network structure, traffic demand, and user route choice behavior on transportation network efficiency. In addition, the network efficiency measured by this method and Braess's paradox can explain each other, indicating a better evaluation of the real operating condition of a transportation network. The analysis of the efficiency calculated by the Q-H method also shows that a specific appropriate demand exists for a given transportation network. Meanwhile, under fixed demand, both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure that yields the largest transportation network efficiency can be identified. PMID:28399165
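    One widely used quantitative measure in this family is the Latora-Marchiori global efficiency, the average inverse shortest-path length over node pairs; it is shown here for illustration and is not the Q-H measure the paper favors:

```python
from collections import deque

def global_efficiency(adj):
    """Average of 1/d(i, j) over all ordered node pairs, with hop-count
    distances d from breadth-first search; unreachable pairs contribute 0."""
    n = len(adj)
    total = 0.0
    for source in adj:
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(1.0 / d for node, d in dist.items() if node != source)
    return total / (n * (n - 1))

ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}            # 4-cycle
full = {i: [j for j in range(4) if j != i] for i in range(4)}  # complete K4
e_ring = global_efficiency(ring)   # -> 5/6
e_full = global_efficiency(full)   # -> 1.0: denser is more efficient
```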

  18. QDMR: a quantitative method for identification of differentially methylated regions by entropy

    PubMed Central

    Zhang, Yan; Liu, Hongbo; Lv, Jie; Xiao, Xue; Zhu, Jiang; Liu, Xiaojuan; Su, Jianzhong; Li, Xia; Wu, Qiong; Wang, Fang; Cui, Ying

    2011-01-01

    DNA methylation plays critical roles in transcriptional regulation and chromatin remodeling. Differentially methylated regions (DMRs) have important implications for development, aging and diseases. Therefore, genome-wide mapping of DMRs across various temporal and spatial methylomes is important in revealing the impact of epigenetic modifications on heritable phenotypic variation. We present a quantitative approach, quantitative differentially methylated regions (QDMR), to quantify methylation difference and identify DMRs from genome-wide methylation profiles by adapting Shannon entropy. QDMR was applied to synthetic methylation patterns and methylation profiles detected by methylated DNA immunoprecipitation microarray (MeDIP-chip) in human tissues/cells. This approach gives a reasonable quantitative measure of methylation difference across multiple samples. A DMR threshold was then determined from a methylation probability model. Using this threshold, QDMR identified 10,651 tissue DMRs related to genes enriched for cell differentiation, including 4,740 DMRs not identified by the method developed by Rakyan et al. QDMR can also measure the sample specificity of each DMR. Finally, its application to methylation profiles detected by reduced representation bisulphite sequencing (RRBS) in mouse showed the platform-free and species-free nature of QDMR. This approach provides an effective tool for the high-throughput identification of potential functional regions involved in epigenetic regulation. PMID:21306990
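    The entropy idea can be sketched directly from Shannon's formula: normalize a region's methylation levels across samples into a distribution and compute its entropy; near-uniform methylation gives maximal entropy, while sample-specific methylation gives low entropy and flags a candidate DMR. The numbers below are illustrative, not QDMR's actual scoring:

```python
import numpy as np

def methylation_entropy(levels):
    """Shannon entropy (bits) of one region's methylation across samples,
    after normalizing the levels into a probability distribution."""
    p = np.asarray(levels, dtype=float)
    p = p / p.sum()
    p = p[p > 0]                       # treat 0 * log(0) as 0
    return float(-(p * np.log2(p)).sum())

uniform = [0.5, 0.5, 0.5, 0.5]         # same methylation in four tissues
specific = [0.9, 0.05, 0.05, 0.05]     # hypermethylated in one tissue only
h_uniform = methylation_entropy(uniform)    # -> 2.0 bits (log2 of 4)
h_specific = methylation_entropy(specific)  # lower: candidate DMR
```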

  19. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical properties of a liquid. In this study, we propose a method for solution identification and quantitative concentration analysis based on multivariate statistics. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze the different solution libraries and establish discriminant functions. Test solutions can be discriminated by these functions. After determining the variety of a test solution, a Spearman correlation test and principal component analysis are used to filter and reduce the dimensions of the eight characteristic values, producing a new representative parameter. A cubic spline interpolation function is built between the parameters and concentrations, from which the concentration of the test solution can be calculated. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen as the standard library, and the other two concentrations compose the test group. Using the methods described above, all eight test solutions are correctly identified, and the average relative error of the quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of recognizing liquids based on the COT and improves the precision of concentration quantification.

  20. A validated method for the quantitation of 1,1-difluoroethane using a gas in equilibrium method of calibration.

    PubMed

    Avella, Joseph; Lehrer, Michael; Zito, S William

    2008-10-01

    1,1-Difluoroethane (DFE), also known as Freon 152A, is a member of a class of compounds known as halogenated hydrocarbons. A number of these compounds have gained notoriety because of their ability to induce rapid onset of intoxication after inhalation exposure. Abuse of DFE has necessitated development of methods for its detection and quantitation in postmortem and human performance specimens. Furthermore, methodologies applicable to research studies are required, as there have been limited toxicokinetic and toxicodynamic reports published on DFE. This paper describes a method for the quantitation of DFE using a gas chromatography-flame-ionization headspace technique that employs solventless standards for calibration. Two calibration curves using 0.5 mL whole blood calibrators were developed, ranging from 0.225-1.350 mg/L (curve A) to 9.0-180.0 mg/L (curve B). These were evaluated for linearity (0.9992 and 0.9995), limit of detection (0.018 mg/L), limit of quantitation (0.099 mg/L; recovery 111.9%, CV 9.92%), and upper limit of linearity (27,000.0 mg/L). Combined-curve recovery for a 98.0 mg/L DFE control prepared using an alternate technique was 102.2%, with a CV of 3.09%. No matrix interference was observed in DFE-enriched blood, urine or brain specimens, nor did analysis of variance detect any significant differences (alpha = 0.01) in the area under the curve of blood, urine or brain specimens at three identical DFE concentrations. The method is suitable for use in forensic laboratories because validation was performed on instrumentation routinely used in forensic labs and because of the ease with which the calibration range can be adjusted. Perhaps more importantly, it is also useful for research-oriented studies, because the removal of solvent from standard preparation eliminates the possibility of solvent-induced changes to the gas/liquid partitioning of DFE or chromatographic interference due to the presence of solvent in specimens.

  1. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  2. Objective, Way and Method of Faculty Management Based on Ergonomics

    ERIC Educational Resources Information Center

    WANG, Hong-bin; Liu, Yu-hua

    2008-01-01

    The core problem that influences the educational quality of talent in colleges and universities is faculty management. Without an advanced faculty, it is difficult to cultivate excellent talent. Addressing several problems in the present faculty development of colleges and universities, this paper puts forward new objectives, ways and methods of…

  3. A novel image-based quantitative method for the characterization of NETosis

    PubMed Central

    Zhao, Wenpu; Fogg, Darin K.; Kaplan, Mariana J.

    2015-01-01

    NETosis is a newly recognized mechanism of programmed neutrophil death. It is characterized by a stepwise progression of chromatin decondensation, membrane rupture, and release of bactericidal DNA-based structures called neutrophil extracellular traps (NETs). Conventional ‘suicidal’ NETosis has been described in pathogenic models of systemic autoimmune disorders. Recent in vivo studies suggest that a process of ‘vital’ NETosis also exists, in which chromatin is condensed and membrane integrity is preserved. Techniques to assess ‘suicidal’ or ‘vital’ NET formation in a specific, quantitative, rapid and semiautomated way have been lacking, hindering the characterization of this process. Here we have developed a new method to simultaneously assess both ‘suicidal’ and ‘vital’ NETosis, using high-speed multi-spectral imaging coupled to morphometric image analysis, to quantify spontaneous NET formation observed ex vivo or stimulus-induced NET formation triggered in vitro. Use of imaging flow cytometry allows automated, quantitative and rapid analysis of subcellular morphology and texture, and introduces the potential for further investigation using NETosis as a biomarker in pre-clinical and clinical studies. PMID:26003624

  4. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  5. A Verification System for Distributed Objects with Asynchronous Method Calls

    NASA Astrophysics Data System (ADS)

    Ahrendt, Wolfgang; Dylla, Maximilian

    We present a verification system for Creol, an object-oriented modeling language for concurrent distributed applications. The system is an instance of KeY, a framework for object-oriented software verification, which has so far been applied foremost to sequential Java. Building on KeY's characteristic concepts, like dynamic logic, sequent calculus, explicit substitutions, and the taclet rule language, the system presented in this paper addresses functional correctness of Creol models featuring local cooperative thread parallelism and global communication via asynchronous method calls. The calculus heavily operates on communication histories which describe the interfaces of Creol units. Two example scenarios demonstrate the usage of the system.

  6. Buried Object Detection Method Using Optimum Frequency Range in Extremely Shallow Underground

    NASA Astrophysics Data System (ADS)

    Sugimoto, Tsuneyoshi; Abe, Touma

    2011-07-01

    We propose a new detection method for buried objects using the optimum frequency response range of the corresponding vibration velocity. Flat speakers and a scanning laser Doppler vibrometer (SLDV) are used for noncontact acoustic imaging in the extremely shallow underground. The exploration depth depends on the sound pressure, but it is usually less than 10 cm. Styrofoam, wood (silver fir), and acrylic boards of the same size, styrofoam boards of different sizes, a hollow toy duck, a hollow plastic container, a plastic container filled with sand, a hollow steel can and an unglazed pot are used as buried objects, buried in sand at a depth of about 2 cm. The imaging procedure for buried objects using the optimum frequency range is as follows. First, the standardized difference from the average vibration velocity is calculated for all scan points. Next, using this result, underground images are made using a constant frequency width to search for the frequency response range of the buried object. After choosing an approximate frequency response range, the difference between the average vibration velocity for all points and that for several points that showed a clear response is calculated for the final confirmation of the optimum frequency range. Using this optimum frequency range, we can obtain the clearest image of the buried object. From the experimental results, we confirmed the effectiveness of our proposed method. In particular, a clear image of the buried object was obtained when the SLDV image was unclear.
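    The first step, the standardized difference of vibration velocity over all scan points, is an ordinary z-score; a one-dimensional sketch with invented numbers:

```python
import numpy as np

def standardized_difference(velocity):
    """(v - mean) / std over all scan points; a buried object shows up as
    scan points with large positive deviations."""
    v = np.asarray(velocity, dtype=float)
    return (v - v.mean()) / v.std()

# toy line scan: background around 1.0, buried object under points 4-6
scan = [1.0, 1.1, 0.9, 1.0, 3.0, 3.2, 2.9, 1.0, 1.1, 0.9]
z = standardized_difference(scan)
hot = np.where(z > 1.0)[0]     # -> indices 4, 5, 6
```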

  7. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions

    DOE PAGES

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; ...

    2016-04-20

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of our or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.
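    The underlying signal, interacting partners co-eluting across fractionation steps, can be sketched as a correlation of elution profiles. The Gaussian profiles below are synthetic, and a real analysis must also control the FDR as the abstract stresses:

```python
import numpy as np

def copurification_score(profile_a, profile_b):
    """Pearson correlation of two proteins' abundances across
    chromatography fractions; co-eluting partners score near 1."""
    return float(np.corrcoef(profile_a, profile_b)[0, 1])

fractions = np.arange(10)
partner_a = np.exp(-0.5 * (fractions - 4) ** 2)            # elutes at 4
partner_b = 0.8 * partner_a + 0.01                         # co-elutes
unrelated = np.exp(-0.5 * (fractions - 8) ** 2)            # elutes at 8
s_pair = copurification_score(partner_a, partner_b)        # -> ~1.0
s_background = copurification_score(partner_a, unrelated)  # low / negative
```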

  9. Integrating planning perception and action for informed object search.

    PubMed

    Manso, Luis J; Gutierrez, Marco A; Bustos, Pablo; Bachiller, Pilar

    2018-05-01

This paper presents a method to reduce the time spent by a robot with cognitive abilities when looking for objects in unknown locations. It describes how machine learning techniques can be used to decide which places should be inspected first, based on images that the robot acquires passively. The proposal is composed of two concurrent processes. The first one uses the aforementioned images to generate a description of the types of objects found in each object container seen by the robot. This is done passively, regardless of the task being performed. The containers can be tables, boxes, shelves or any other kind of container of known shape whose contents can be seen from a distance. The second process uses the previously computed estimation of the contents of the containers to decide which container most likely holds the object to be found. This second process is deliberative and takes place only when the robot needs to find an object, whether because it is explicitly asked to locate one or because finding it is a step in fulfilling the robot's mission. Upon failure to guess the right container, the robot can continue making guesses until the object is found. Guesses are made based on the semantic distance between the object to find and the description of the types of the objects found in each object container. The paper provides quantitative results comparing the efficiency of the proposed method with two baseline approaches.
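The core idea of the deliberative process, ranking containers by the semantic distance between the target object and the objects observed in each container, can be sketched as follows. The word vectors, container names, and the max-similarity scoring rule below are all illustrative assumptions; the paper's learned descriptors and distance measure are not reproduced here.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two vectors (a stand-in semantic distance)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

# Toy word vectors standing in for learned semantic embeddings.
VEC = {
    "mug":   (0.9, 0.1, 0.0),
    "plate": (0.8, 0.2, 0.1),
    "book":  (0.1, 0.9, 0.2),
    "pen":   (0.2, 0.8, 0.3),
}

def rank_containers(target, containers):
    """Order containers by semantic closeness of their observed contents to
    the target object; the robot then inspects them in this order, moving to
    the next guess whenever a container fails to hold the object."""
    def score(contents):
        return max(cosine(VEC[target], VEC[obj]) for obj in contents)
    return sorted(containers, key=lambda c: score(containers[c]), reverse=True)

containers = {"kitchen_shelf": ["mug", "plate"], "desk": ["book", "pen"]}
order = rank_containers("mug", containers)
print(order)  # the kitchen shelf, holding mug-like objects, is tried first
```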

  10. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    PubMed

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of a HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the HCO's risk structure by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in northern Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
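A minimal sketch of the Monte Carlo step: claim counts per year are drawn from a Poisson frequency model and claim sizes from a lognormal severity model, and the simulated aggregate-loss distribution yields the expected loss and a high percentile (the "unexpected loss" tail). All distribution choices and parameter values here are illustrative assumptions, not fitted to the Lodi claims data.

```python
import math
import random
import statistics

def poisson_sample(rng, lam):
    """Draw one Poisson variate via Knuth's method (fine for moderate rates)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_annual_losses(n_years=20000, freq_mean=25.0,
                           sev_mu=9.0, sev_sigma=1.2, seed=7):
    """Aggregate-loss Monte Carlo: Poisson claim counts, lognormal claim
    sizes. Parameter values are purely illustrative."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_years):
        n_claims = poisson_sample(rng, freq_mean)
        totals.append(sum(rng.lognormvariate(sev_mu, sev_sigma)
                          for _ in range(n_claims)))
    return sorted(totals)

losses = simulate_annual_losses()
expected = statistics.mean(losses)        # expected annual loss
var95 = losses[int(0.95 * len(losses))]   # 95th-percentile annual loss
print(f"expected loss: {expected:,.0f}  95th percentile: {var95:,.0f}")
```

Stratified versions of the same computation (per department or hospital) would simply refit the frequency and severity parameters on each stratum.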

  11. A novel statistical method for quantitative comparison of multiple ChIP-seq datasets.

    PubMed

    Chen, Li; Wang, Chi; Qin, Zhaohui S; Wu, Hao

    2015-06-15

ChIP-seq is a powerful technology to measure protein binding or histone modification strength on a whole-genome scale. Although a number of methods are available for single ChIP-seq data analysis (e.g. 'peak detection'), rigorous statistical methods for quantitative comparison of multiple ChIP-seq datasets that account for data from control experiments, signal-to-noise ratios, biological variation and multiple-factor experimental designs remain underdeveloped. In this work, we develop a statistical method to perform quantitative comparison of multiple ChIP-seq datasets and detect genomic regions showing differential protein binding or histone modification. We first detect peaks from all datasets and then take their union to form a single set of candidate regions. The read counts from the IP experiment at the candidate regions are assumed to follow a Poisson distribution. The underlying Poisson rates are modeled as an experiment-specific function of artifacts and biological signals. We then obtain the estimated biological signals and compare them through a hypothesis testing procedure in a linear model framework. Simulations and real data analyses demonstrate that the proposed method provides more accurate and robust results compared with existing ones. An R software package ChIPComp is freely available at http://web1.sph.emory.edu/users/hwu30/software/ChIPComp.html. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
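The Poisson assumption on region counts can be illustrated with a likelihood-ratio test for whether two conditions share a single rate at one candidate region. This is a much-reduced sketch: the published linear-model framework additionally models control data, signal-to-noise and experiment-specific artifacts, all of which are omitted here, and the example counts are made up.

```python
import math

def poisson_lrt(counts_a, counts_b):
    """Likelihood-ratio test that two groups of Poisson-distributed region
    counts share one common rate. Assumes nonzero group totals."""
    na, nb = len(counts_a), len(counts_b)
    sa, sb = sum(counts_a), sum(counts_b)
    la, lb, l0 = sa / na, sb / nb, (sa + sb) / (na + nb)

    def loglik(total, n, lam):
        # Poisson log-likelihood up to terms that cancel in the ratio.
        return total * math.log(lam) - n * lam

    stat = max(0.0, 2 * (loglik(sa, na, la) + loglik(sb, nb, lb)
                         - loglik(sa + sb, na + nb, l0)))
    pvalue = math.erfc(math.sqrt(stat / 2))  # chi-square(1) upper tail
    return stat, pvalue

# Replicate counts for one candidate region under two conditions.
stat, p = poisson_lrt([50, 55, 48], [95, 102, 99])
print(f"LRT statistic {stat:.1f}, p = {p:.1e}")
```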

  12. Assessing deep and shallow learning methods for quantitative prediction of acute chemical toxicity.

    PubMed

    Liu, Ruifeng; Madore, Michael; Glover, Kyle P; Feasel, Michael G; Wallqvist, Anders

    2018-05-02

    Animal-based methods for assessing chemical toxicity are struggling to meet testing demands. In silico approaches, including machine-learning methods, are promising alternatives. Recently, deep neural networks (DNNs) were evaluated and reported to outperform other machine-learning methods for quantitative structure-activity relationship modeling of molecular properties. However, most of the reported performance evaluations relied on global performance metrics, such as the root mean squared error (RMSE) between the predicted and experimental values of all samples, without considering the impact of sample distribution across the activity spectrum. Here, we carried out an in-depth analysis of DNN performance for quantitative prediction of acute chemical toxicity using several datasets. We found that the overall performance of DNN models on datasets of up to 30,000 compounds was similar to that of random forest (RF) models, as measured by the RMSE and correlation coefficients between the predicted and experimental results. However, our detailed analyses demonstrated that global performance metrics are inappropriate for datasets with a highly uneven sample distribution, because they show a strong bias for the most populous compounds along the toxicity spectrum. For highly toxic compounds, DNN and RF models trained on all samples performed much worse than the global performance metrics indicated. Surprisingly, our variable nearest neighbor method, which utilizes only structurally similar compounds to make predictions, performed reasonably well, suggesting that information of close near neighbors in the training sets is a key determinant of acute toxicity predictions.
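The bias the authors describe, global RMSE being dominated by the populous region of the activity spectrum, is easy to demonstrate. The numbers below are synthetic and purely illustrative: a model that predicts 95 low-toxicity compounds well and 5 highly toxic compounds poorly still posts a flattering global RMSE.

```python
import math

def rmse(pred, true):
    """Root mean squared error between predicted and experimental values."""
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, true)) / len(true))

# Synthetic illustration: 95 low-toxicity compounds predicted accurately,
# 5 highly toxic compounds predicted badly (all values are made up).
true = [2.0] * 95 + [6.0] * 5   # e.g. toxicity scores on a log scale
pred = [2.1] * 95 + [4.0] * 5   # large errors confined to the toxic tail

print(f"global RMSE:    {rmse(pred, true):.2f}")
print(f"toxic-bin RMSE: {rmse(pred[95:], true[95:]):.2f}")
```

Stratifying the metric by toxicity bin, as in the study's detailed analysis, exposes the poor performance on exactly the compounds that matter most.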

  13. Object permanence and method of disappearance: looking measures further contradict reaching measures.

    PubMed

    Charles, Eric P; Rivera, Susan M

    2009-11-01

    Piaget proposed that understanding permanency, understanding occlusion events, and forming mental representations were synonymous; however, accumulating evidence indicates that those concepts are not unified in development. Infants reach for endarkened objects at younger ages than for occluded objects, and infants' looking patterns suggest that they expect occluded objects to reappear at younger ages than they reach for them. We reaffirm the latter finding in 5- to 6-month-olds and find similar responses to faded objects, but we fail to find that pattern in response to endarkened objects. This suggests that looking behavior and reaching behavior are both sensitive to method of disappearance, but with opposite effects. Current cognition-oriented (i.e. representation-oriented) explanations of looking behavior cannot easily accommodate these results; neither can perceptual-preference explanations, nor the traditional ecological reinterpretations of object permanence. A revised ecological hypothesis, invoking affordance learning, suggests how these differences could arise developmentally.

  14. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    PubMed

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

Like other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed-methods) is beginning to assume a more prominent role in public health studies. Indeed, using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  15. QACD: A method for the quantitative assessment of compositional distribution in geologic materials

    NASA Astrophysics Data System (ADS)

    Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.

    2017-12-01

In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines the microanalytical techniques that allow for the analysis of major and minor elements at high spatial resolution (e.g., electron microbeam analysis) with 2D mapping of samples in order to provide unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large-area X-ray element maps obtained by energy-dispersive X-ray spectrometry (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full thin section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used to not only accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section), but, critically, enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.
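The step from classified element maps to mineral modes can be sketched as below. The two-element thresholds and the phase names are toy assumptions for illustration only; the actual QACD/Quack classification uses full multi-element quantitative maps.

```python
from collections import Counter

def classify_pixel(mg, ca):
    """Toy phase classification from per-pixel element values (wt%).
    Real classification schemes are far more complete than these cutoffs."""
    if mg > 20 and ca < 2:
        return "olivine"
    if ca > 8 and mg < 5:
        return "plagioclase"
    return "other"

def mineral_modes(pixels):
    """Modal percentages: the fraction of map pixels assigned to each phase."""
    counts = Counter(classify_pixel(mg, ca) for mg, ca in pixels)
    total = sum(counts.values())
    return {phase: 100.0 * n / total for phase, n in counts.items()}

# A made-up 100-pixel (Mg, Ca) map: 60 olivine-like, 30 plagioclase-like,
# 10 unclassified pixels.
pixels = [(25, 1)] * 60 + [(1, 12)] * 30 + [(8, 5)] * 10
print(mineral_modes(pixels))  # modal percentage per phase
```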

  16. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  17. A comparison of quantitative methods for clinical imaging with hyperpolarized (13)C-pyruvate.

    PubMed

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized (13)C-labelled molecules, such as the conversion of [1-(13)C]pyruvate to [1-(13)C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized (13)C data. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd.
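The two model-free metrics the study highlights are simple to compute from dynamic signal curves. The time courses below are made-up illustrative numbers, not data from the paper.

```python
def auc_ratio(lactate, pyruvate):
    """Lactate-to-pyruvate area-under-the-curve ratio; with uniform
    sampling the time step cancels, so plain sums suffice."""
    return sum(lactate) / sum(pyruvate)

def time_to_peak(signal, dt=1.0):
    """Time of the signal maximum relative to the first sample."""
    return dt * max(range(len(signal)), key=signal.__getitem__)

# Illustrative (made-up) dynamic time courses, arbitrary units, 1 s sampling.
pyr = [0, 80, 60, 40, 25, 15, 8, 4]
lac = [0, 10, 25, 32, 30, 24, 16, 9]

print(f"AUC ratio: {auc_ratio(lac, pyr):.2f}")
print(f"lactate time-to-peak: {time_to_peak(lac):.0f} s")
```

Per the study, the time-to-peak metric performed indistinguishably from the best (two-way) kinetic model while avoiding kinetic fitting altogether.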

  18. Space Archaeology: Attribute, Object, Task and Method

    NASA Astrophysics Data System (ADS)

    Wang, Xinyuan; Guo, Huadong; Luo, Lei; Liu, Chuansheng

    2017-04-01

Archaeology takes the material remains of human activity as its research object, and uses those fragmentary remains to reconstruct the humanistic and natural environment in different historical periods. Space archaeology is a new branch of archaeology. Its study object is the humanistic-natural complex, including the remains of human activities and living environments on the Earth's surface. Its research method, the application of space information technologies to this complex, is an innovative process concerning archaeological information acquisition, interpretation and reconstruction, achieving 3-D dynamic reconstruction of cultural heritage by constructing the digital cultural-heritage sphere. Space archaeology is highly interdisciplinary, linking the natural sciences, the social sciences and the humanities. Its task is to reveal the history, characteristics, and patterns of human activities in the past, as well as to understand the evolutionary processes guiding the relationship between humans and their environment. This paper summarizes six important aspects of space archaeology and five crucial recommendations for the establishment and development of this new discipline. The six important aspects are: (1) technologies and methods for non-destructive detection of archaeological sites; (2) space technologies for the protection and monitoring of cultural heritages; (3) digital environmental reconstruction of archaeological sites; (4) spatial data storage and data mining of cultural heritages; (5) virtual archaeology, digital reproduction and public information and presentation system; and (6) the construction of scientific platform of digital cultural-heritage sphere. The five key recommendations for establishing the discipline of Space Archaeology are: (1) encouraging the full integration of the strengths of both archaeology and museology with space technology to promote the development of space technologies' application for cultural heritages; (2) a new

  19. An Object-Based Requirements Modeling Method.

    ERIC Educational Resources Information Center

    Cordes, David W.; Carver, Doris L.

    1992-01-01

    Discusses system modeling and specification as it relates to object-based information systems development and software development. An automated system model based on the objects in the initial requirements document is described, the requirements document translator is explained, and a sample application of the technique is provided. (12…

  20. Quantitative optical metrology with CMOS cameras

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.

    2004-08-01

Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full field of view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.

  1. Correlation between average tissue depth data and quantitative accuracy of forensic craniofacial reconstructions measured by geometric surface comparison method.

    PubMed

    Lee, Won-Joon; Wilkinson, Caroline M; Hwang, Hyeon-Shik; Lee, Sang-Mi

    2015-05-01

Accuracy, judged by comparison to the corresponding actual face, is the most important factor supporting the reliability of forensic facial reconstruction (FFR). A number of methods have been employed to evaluate the objective accuracy of FFR. Recently, attempts have been made to measure the degree of resemblance between a computer-generated FFR and the actual face by geometric surface comparison. In this study, three FFRs were produced employing live adult Korean subjects and three-dimensional computerized modeling software. The deviations of the facial surfaces between the FFR and the head scan CT of the corresponding subject were analyzed in reverse modeling software. The results were compared with those from a previous study which applied the same methodology as this study except for the average facial soft tissue depth dataset. The three FFRs of this study, which applied the updated dataset, demonstrated smaller deviation errors between the facial surfaces of the FFR and the corresponding subject than those from the previous study. The results suggest that appropriate average tissue depth data are important to increase the quantitative accuracy of FFR. © 2015 American Academy of Forensic Sciences.

  2. Development and evaluation of event-specific quantitative PCR method for genetically modified soybean A2704-12.

    PubMed

    Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, A2704-12. During the plant transformation, DNA fragments derived from the pUC19 plasmid were integrated into A2704-12, and the region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared in both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, and we then constructed a plasmid using pBR322. The conversion factor (C(f)), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined C(f) values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSD(R)), and the determined bias and RSD(R) values for the method were each less than 20%. These results suggest that the developed method would be suitable for practical analyses for the detection and quantification of A2704-12.
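Assuming the standard event-specific quantification scheme, where the GM content is the event-to-endogenous-reference copy-number ratio divided by the conversion factor, the calculation looks like this. The copy numbers in the example are hypothetical; only the C(f) value of 0.98 comes from the study.

```python
def gmo_percent(event_copies, reference_copies, cf=0.98):
    """GM content (%) from real-time PCR copy numbers: the event/reference
    copy-number ratio divided by the conversion factor Cf. Cf = 0.98 was
    the value determined for A2704-12 on both instruments."""
    return 100.0 * (event_copies / reference_copies) / cf

# Hypothetical copy numbers from a quantitative PCR run of a test sample.
print(f"{gmo_percent(980.0, 100000.0):.2f}% GM soybean")
```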

  3. Effectiveness evaluation of objective and subjective weighting methods for aquifer vulnerability assessment in urban context

    NASA Astrophysics Data System (ADS)

    Sahoo, Madhumita; Sahoo, Satiprasad; Dhar, Anirban; Pradhan, Biswajeet

    2016-10-01

Groundwater vulnerability assessment has been an accepted practice to identify the zones with relatively increased potential for groundwater contamination. DRASTIC is the most popular secondary-information-based vulnerability assessment approach. The original DRASTIC approach assigns relative importance to features/sub-features through subjective weighting/rating values. However, variability of features at a smaller scale is not reflected in this subjective vulnerability assessment process. In contrast to the subjective approach, objective weighting-based methods provide flexibility in weight assignment depending on the variation of the local system, but experts' opinion is not directly considered in them. Thus, the effectiveness of both subjective and objective weighting-based approaches needs to be evaluated. In the present study, three methods - the entropy information method (E-DRASTIC), the fuzzy pattern recognition method (F-DRASTIC) and single parameter sensitivity analysis (SA-DRASTIC) - were used to modify the weights of the original DRASTIC features to include local variability. Moreover, a grey incidence analysis was used to evaluate the relative performance of subjective (DRASTIC and SA-DRASTIC) and objective (E-DRASTIC and F-DRASTIC) weighting-based methods. The performance of the developed methodology was tested in an urban area of Kanpur City, India. The relative performance of the subjective and objective methods varies with the choice of water quality parameters. The methodology can be applied elsewhere, with or without suitable modification. These evaluations establish the potential applicability of the methodology for general vulnerability assessment in urban contexts.
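A minimal sketch of the entropy information idea behind E-DRASTIC: features whose ratings vary more across assessment cells carry more information and therefore receive larger weights, while a feature that is constant everywhere is uninformative. The rating matrix below is made up for illustration and is not the Kanpur data.

```python
import math

def entropy_weights(matrix):
    """Entropy information weighting: normalize each feature column to a
    probability vector, compute its Shannon entropy (scaled to [0, 1] by
    log n), and weight each feature by its divergence degree 1 - entropy."""
    n, m = len(matrix), len(matrix[0])
    weights = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        ps = [v / total for v in col]
        e = -sum(p * math.log(p) for p in ps if p > 0) / math.log(n)
        weights.append(1.0 - e)
    s = sum(weights)
    return [w / s for w in weights]

# Rows = assessment cells, columns = rating values of three hypothetical
# DRASTIC features; the middle feature is the only one that varies.
ratings = [[5, 3, 7],
           [5, 9, 7],
           [5, 1, 7]]
print([round(w, 3) for w in entropy_weights(ratings)])
```

Constant columns end up with (near-)zero weight, which is exactly the local-variability sensitivity that the subjective DRASTIC weights cannot express.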

  4. The analysis of quantitative methods for renewable fuel processes and lubricant of materials derived from plastic waste

    NASA Astrophysics Data System (ADS)

    Rajagukguk, J. R.

    2018-01-01

Plastic has become an important component of modern life. It has replaced wood and metal, given advantages such as light weight and strength, corrosion resistance, transparency, ease of coloring, and good insulation properties. The study uses quantitative and engineering research methods. Its objective is to convert plastic waste into something more economical while preserving the surrounding environment. The renewable fuel and lubricant variables together have a significant influence on environmental sustainability: the calculated F exceeds the tabulated F (62.101 > 4.737) with significance 0.000 < 0.05, so H0 is rejected and Ha accepted, meaning the renewable fuel and lubricant variables have a very large effect on the sustainable environment, and the correlation coefficient of 0.941 (94.1%) indicates a very strong relationship between these variables and the sustainable environment. Processing the plastic waste by pyrolysis yields liquid hydrocarbons containing compounds similar to crude oil; the renewable fuels obtained from the calculations are CO2 + H2O + C1-C4 + residual substances. The plastic waste can then be converted to lubricating oil by an isomerization process with a catalyst, the products of the chemical calculation being CO2, H2O, C18H21 and the remainder.

  5. Introducing the Benson Prize for Discovery Methods of Near Earth Objects by Amateurs

    NASA Astrophysics Data System (ADS)

    Benson, J. W.

    1997-05-01

The Benson Prize for Discovery Methods of Near Earth Objects by Amateurs, sponsored by the Space Development Corporation, is an annual competition which awards prizes to the best proposed methods by which amateur astronomers may discover such near earth objects as asteroids and comet cores. The purpose of the Benson Prize is to encourage the discovery of near earth objects by amateur astronomers. The utilization of valuable near earth resources can provide many new jobs and economic activities on earth, while also creating many new opportunities for opening up the space frontier. The utilization of near earth resources will significantly contribute to the lessening of environmental degradation on the Earth caused by mining and chemical leaching operations required to exploit the low grade ores now remaining on Earth. In addition, near earth objects pose grave dangers for life on earth. Discovering and plotting the orbits of all potentially dangerous near earth objects is the first and necessary step in protecting ourselves against the enormous potential damage possible from near earth objects. With the high quality, large size and low cost of today's consumer telescopes, the rapid development of powerful, high resolution and inexpensive CCD cameras, and the proliferation of inexpensive software for today's powerful home computers, the discovery of near earth objects by amateur astronomers is more attainable than ever. The Benson Prize is sponsored by the Space Development Corporation, a space resource exploration and utilization company. In 1997 one prize of $500 will be awarded to the best proposed method for the amateur discovery of NEOs, and in each of the four following years, prizes of $500, $250 and $100 will be awarded. Prizes for the actual discovery of Near Earth Asteroids will be added in later years.

  6. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays.

    PubMed

    Guetterman, Timothy C; Fetters, Michael D; Creswell, John W

    2015-11-01

    Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. © 2015 Annals of Family Medicine, Inc.

  7. Preliminary research on quantitative methods of water resources carrying capacity based on water resources balance sheet

    NASA Astrophysics Data System (ADS)

    Wang, Yanqiu; Huang, Xiaorong; Gao, Linyun; Guo, Biying; Ma, Kai

    2018-06-01

    Water resources are not only basic natural resources, but also strategic economic resources and ecological control factors. Water resources carrying capacity constrains the sustainable development of regional economy and society. Studies of water resources carrying capacity can provide helpful information about how the socioeconomic system is both supported and restrained by the water resources system. Based on the research of different scholars, the major problems in the study of water resources carrying capacity can be summarized as follows: the definition of water resources carrying capacity is not yet unified; quantification methods built on these inconsistent definitions are difficult to apply in practice; current quantitative methods do not fully reflect the principles of sustainable development; and it is difficult to quantify the relationships among water resources, the economy and society, and the ecological environment. A better quantitative evaluation method is therefore needed to determine regional water resources carrying capacity. This paper proposes a new approach to quantifying water resources carrying capacity through the compilation of a water resources balance sheet. The balance sheet provides a grasp of regional water resources depletion and water environmental degradation (as well as regional water resources stock assets and liabilities), quantifies the pressure that socioeconomic activities exert on the environment, and supports a discussion of quantitative calculation methods and a technical route for water resources carrying capacity that embody the substance of sustainable development.

  8. A novel baseline correction method using convex optimization framework in laser-induced breakdown spectroscopy quantitative analysis

    NASA Astrophysics Data System (ADS)

    Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun

    2017-12-01

    For the laser-induced breakdown spectroscopy (LIBS) quantitative analysis technique, baseline correction is an essential part of LIBS data preprocessing. Baseline drift is widespread, generated by fluctuations in laser energy, inhomogeneity of sample surfaces, and background noise, and it has attracted the interest of many researchers. Most prevalent algorithms require presetting key parameters, such as a suitable spline function or the fitting order, and thus lack adaptability. Based on characteristics of LIBS signals, namely the sparsity of spectral peaks and the low-pass character of the baseline, a novel baseline correction and spectral data denoising method is studied in this paper. The proposed technique uses a convex optimization scheme to form a non-parametric baseline correction model. An asymmetric penalty function is applied to enhance the signal-to-noise ratio (SNR) of the LIBS signal and improve reconstruction precision, and an efficient iterative algorithm is applied to the optimization process to ensure convergence. To validate the proposed method, the concentrations of chromium (Cr), manganese (Mn), and nickel (Ni) in 23 certified high-alloy steel samples were assessed using quantitative models based on Partial Least Squares (PLS) and Support Vector Machine (SVM). Because it requires no prior knowledge of sample composition and no mathematical hypothesis, the proposed method achieves better accuracy in quantitative analysis than competing methods and fully demonstrates its adaptive ability.
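    The abstract does not give the exact optimization model, but a closely related non-parametric baseline estimator with an asymmetric penalty is asymmetric least squares smoothing. The sketch below (all parameter values and the synthetic spectrum are illustrative assumptions, not the authors' settings) fits a smooth baseline that hugs the signal from below:

```python
import numpy as np

def asls_baseline(y, lam=1e4, p=0.01, n_iter=10):
    """Asymmetric least squares baseline estimate (dense sketch).

    Minimizes sum_i w_i (y_i - z_i)^2 + lam * ||D2 z||^2, where D2 is the
    second-difference operator. Asymmetric weights (small weight p above
    the fit, 1 - p below) make the baseline hug the signal from below.
    """
    n = len(y)
    D = np.diff(np.eye(n), 2, axis=0)      # second-difference operator
    P = lam * D.T @ D                       # smoothness penalty matrix
    w = np.ones(n)
    for _ in range(n_iter):
        z = np.linalg.solve(np.diag(w) + P, w * y)
        w = np.where(y > z, p, 1.0 - p)     # down-weight peak points
    return z

# Synthetic LIBS-like spectrum: sloped baseline plus one sharp emission peak.
x = np.linspace(0, 1, 200)
baseline = 1.0 + 0.5 * x
peak = 3.0 * np.exp(-((x - 0.5) / 0.01) ** 2)
y = baseline + peak
z = asls_baseline(y)
```

    Subtracting `z` from `y` leaves the peak on a near-flat background; the asymmetry parameter `p` controls how strongly peak samples are excluded from the baseline fit.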

  9. Monochloramine disinfection kinetics of Nitrosomonas europaea by propidium monoazide quantitative PCR and Live/Dead BacLight Methods

    EPA Science Inventory

    Monochloramine disinfection kinetics were determined for the pure culture ammonia-oxidizing bacterium Nitrosomonas europaea (ATCC 19718) by two culture independent methods: (1) LIVE/DEAD® BacLight™ (LD) and (2) propidium monoazide quantitative PCR (PMA-qPCR). Both methods were f...

  10. The ACCE method: an approach for obtaining quantitative or qualitative estimates of residual confounding that includes unmeasured confounding

    PubMed Central

    Smith, Eric G.

    2015-01-01

    Background:  Nonrandomized studies typically cannot account for confounding from unmeasured factors.  Method:  A method is presented that exploits the recently-identified phenomenon of  “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors.  Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure.  Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results:  Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met.  Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations:  Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions:  To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is

  11. Multi-object segmentation framework using deformable models for medical imaging analysis.

    PubMed

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

    Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures on medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation, combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions, such as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing a suitable combination to be selected for different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed

  12. Method and apparatus for determining the coordinates of an object

    DOEpatents

    Pedersen, Paul S; Sebring, Robert

    2003-01-01

    A method and apparatus is described for determining the coordinates on the surface of an object which is illuminated by a beam having pixels which have been modulated according to predetermined mathematical relationships with pixel position within the modulator. The reflected illumination is registered by an image sensor at a known location which registers the intensity of the pixels as received. Computations on the intensity, which relate the pixel intensities received to the pixel intensities transmitted at the modulator, yield the proportional loss of intensity and planar position of the originating pixels. The proportional loss and position information can then be utilized within triangulation equations to resolve the coordinates of associated surface locations on the object.

  13. Portable smartphone based quantitative phase microscope

    NASA Astrophysics Data System (ADS)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a smartphone-based quantitative phase microscope using the transport of intensity equation method. The system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source; a 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, and the sample phase is then calculated by a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; a red blood cell smear, a Pap smear, broad bean epidermis sections, and monocot roots were then measured to demonstrate its performance. Owing to its accuracy, high contrast, cost-effectiveness, and portability, the portable smartphone-based quantitative phase microscope is a promising tool that could in the future be adopted in remote healthcare and medical diagnosis.
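    Under the common uniform-intensity approximation, the transport of intensity equation reduces to a Poisson equation for the phase, which is typically inverted with FFTs. A minimal sketch of that core step (function name and test grid are assumptions, not the authors' implementation):

```python
import numpy as np

def fft_poisson_solve(rhs, dx=1.0):
    """Solve lap(phi) = rhs on a periodic grid via FFTs.

    In uniform-intensity TIE phase retrieval, rhs = -(k / I0) * dI/dz.
    The zero-frequency (mean phase) component is undetermined and set to 0.
    """
    ny, nx = rhs.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    KX, KY = np.meshgrid(kx, ky)
    symbol = -(KX ** 2 + KY ** 2)   # Fourier symbol of the Laplacian
    symbol[0, 0] = 1.0              # avoid division by zero at DC
    phi_hat = np.fft.fft2(rhs) / symbol
    phi_hat[0, 0] = 0.0             # fix the free mean-phase term
    return np.real(np.fft.ifft2(phi_hat))

# Sanity check: recover a known sinusoidal phase from its analytic Laplacian.
N = 32
yg, xg = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
phi_true = np.sin(2 * np.pi * xg / N)
rhs = -(2 * np.pi / N) ** 2 * phi_true
phi = fft_poisson_solve(rhs)
```

    In a real instrument, `rhs` would come from a finite-difference estimate of the intensity derivative between the manually captured focal planes.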

  14. Spiraling between qualitative and quantitative data on women's health behaviors: a double helix model for mixed methods.

    PubMed

    Mendlinger, Sheryl; Cwikel, Julie

    2008-02-01

    A double helix spiral model is presented which demonstrates how to combine qualitative and quantitative methods of inquiry in an interactive fashion over time. Using findings on women's health behaviors (e.g., menstruation, breast-feeding, coping strategies), we show how qualitative and quantitative methods highlight the theory of knowledge acquisition in women's health decisions. A rich data set of 48 semistructured, in-depth ethnographic interviews with mother-daughter dyads from six ethnic groups (Israeli, European, North African, Former Soviet Union [FSU], American/Canadian, and Ethiopian), plus seven focus groups, provided the qualitative sources for analysis. This data set formed the basis of research questions used in a quantitative telephone survey of 302 Israeli women from the ages of 25 to 42 from four ethnic groups. We employed multiple cycles of data analysis from both data sets to produce a more detailed and multidimensional picture of women's health behavior decisions through a spiraling process.

  15. Examining the Use of Web-Based Reusable Learning Objects by Animal and Veterinary Nursing Students

    ERIC Educational Resources Information Center

    Chapman-Waterhouse, Emily; Silva-Fletcher, Ayona; Whittlestone, Kim David

    2016-01-01

    This intervention study examined the interaction of animal and veterinary nursing students with reusable learning objects (RLO) in the context of preparing for summative assessment. Data was collected from 199 undergraduates using quantitative and qualitative methods. Students accessed RLO via personal devices in order to reinforce taught…

  16. Objective Data Assessment (ODA) Methods as Nutritional Assessment Tools.

    PubMed

    Hamada, Yasuhiro

    2015-01-01

    Nutritional screening and assessment should be a standard of care for all patients because nutritional management plays an important role in clinical practice. However, there is no gold standard for the diagnosis of malnutrition or undernutrition, although a large number of nutritional screening and assessment tools have been developed. Nutritional screening and assessment tools are classified into two categories, namely, subjective global assessment (SGA) and objective data assessment (ODA). SGA assesses nutritional status based on the features of medical history and physical examination. On the other hand, ODA consists of objective data provided from various analyses, such as anthropometry, bioimpedance analysis (BIA), dual-energy X-ray absorptiometry (DEXA), computed tomography (CT), magnetic resonance imaging (MRI), laboratory tests, and functional tests. This review highlights knowledge on the performance of ODA methods for the assessment of nutritional status in clinical practice. J. Med. Invest. 62: 119-122, August, 2015.

  17. Critical Quantitative Inquiry in Context

    ERIC Educational Resources Information Center

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  18. A new quantitative evaluation method for age-related changes of individual pigmented spots in facial skin.

    PubMed

    Kikuchi, K; Masuda, Y; Yamashita, T; Sato, K; Katagiri, C; Hirao, T; Mizokami, Y; Yaguchi, H

    2016-08-01

    Facial skin pigmentation is one of the most prominent visible features of skin aging and often affects perception of health and beauty. To date, facial pigmentation has been evaluated using various image analysis methods developed for the cosmetic and esthetic fields. However, existing methods cannot provide precise information on pigmented spots, such as variations in size, color shade, and distribution pattern. The purpose of this study is the development of image evaluation methods to analyze individual pigmented spots and acquire detailed information on their age-related changes. To characterize the individual pigmented spots within a cheek image, we established a simple object-counting algorithm. First, we captured cheek images using an original imaging system equipped with an illumination unit and a high-resolution digital camera. The acquired images were converted into melanin concentration images using compensation formulae. Next, the melanin images were converted into binary images. The binary images were then subjected to noise reduction. Finally, we calculated parameters such as the melanin concentration, quantity, and size of individual pigmented spots using a connected-components labeling algorithm, which assigns a unique label to each separate group of connected pixels. The cheek image analysis was evaluated on 643 female Japanese subjects. We confirmed that the proposed method was sufficiently sensitive to measure the melanin concentration, and the numbers and sizes of individual pigmented spots through manual evaluation of the cheek images. The image analysis results for the 643 Japanese women indicated clear relationships between age and the changes in the pigmented spots. We developed a new quantitative evaluation method for individual pigmented spots in facial skin. This method facilitates the analysis of the characteristics of various pigmented facial spots and is directly applicable to the fields of dermatology, pharmacology, and esthetic
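    The counting step described above, assigning a unique label to each separate group of connected pixels, can be sketched with a simple breadth-first connected-components pass (a minimal illustration, not the authors' implementation):

```python
from collections import deque

def label_spots(binary):
    """Label 8-connected foreground regions in a binary image.

    Returns a label image (0 = background) and a dict {label: pixel count},
    mirroring the per-spot size statistics described in the abstract.
    """
    h, w = len(binary), len(binary[0])
    labels = [[0] * w for _ in range(h)]
    sizes = {}
    current = 0
    for sy in range(h):
        for sx in range(w):
            if binary[sy][sx] and labels[sy][sx] == 0:
                current += 1
                labels[sy][sx] = current
                size = 0
                queue = deque([(sy, sx)])
                while queue:                      # breadth-first flood fill
                    y, x = queue.popleft()
                    size += 1
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and binary[ny][nx]
                                    and labels[ny][nx] == 0):
                                labels[ny][nx] = current
                                queue.append((ny, nx))
                sizes[current] = size
    return labels, sizes

# Toy binarized melanin image with two separate "spots".
binary = [[1, 1, 0, 0],
          [1, 0, 0, 1],
          [0, 0, 0, 1],
          [0, 0, 0, 0]]
labels, sizes = label_spots(binary)
```

    On real cheek images, the same pass over the thresholded melanin map yields the number and size distribution of individual pigmented spots.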

  19. A Simple Objective Method for Determining a Dynamic Journal Collection

    PubMed Central

    Bastille, Jacqueline D.; Mankin, Carole J.

    1980-01-01

    In order to determine the content of a journal collection responsive to both user needs and space and dollar constraints, quantitative measures of the use of a 647-title collection have been related to space and cost requirements to develop objective criteria for a dynamic collection for the Treadwell Library at the Massachusetts General Hospital, a large medical research center. Data were collected for one calendar year (1977) and stored with the elements for each title's profile in a computerized file. To account for the effect of the bulk of the journal runs on the number of uses, raw use data have been adjusted using linear shelf space required for each title to produce a factor called density of use. Titles have been ranked by raw use and by density of use with space and cost requirements for each. Data have also been analyzed for five special categories of use. Given automated means of collecting and storing data, use measures should be collected continuously. Using raw use frequency ranking to relate use to space and costs seems sensible since a decision point cutoff can be chosen in terms of the potential interlibrary loans generated. But it places new titles at risk while protecting titles with long, little used runs. Basing decisions on density of use frequency ranking seems to produce a larger yield of titles with fewer potential interlibrary loans and to identify titles with overlong runs which may be pruned or converted to microform. The method developed is simple and practical. Its design will be improved to apply to data collected in 1980 for a continuous study of journal use. The problem addressed is essentially one of inventory control. Viewed as such it makes good financial sense to measure use as part of the routine operation of the library to provide information for effective management decisions. PMID:7437589
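    The density-of-use adjustment is a simple ratio of uses to linear shelf space; a sketch with hypothetical titles and figures (not the Treadwell Library data) shows how the two rankings can diverge:

```python
# Hypothetical use and shelf-space data for a few journal titles.
titles = {
    "Journal A": {"uses": 120, "shelf_cm": 300},
    "Journal B": {"uses": 90,  "shelf_cm": 60},
    "Journal C": {"uses": 30,  "shelf_cm": 150},
}

# Density of use = raw uses per unit of linear shelf space.
for t in titles.values():
    t["density"] = t["uses"] / t["shelf_cm"]

by_raw = sorted(titles, key=lambda n: titles[n]["uses"], reverse=True)
by_density = sorted(titles, key=lambda n: titles[n]["density"], reverse=True)
```

    Here Journal A leads on raw use, but Journal B, heavily used relative to its short run, leads on density of use, the kind of title the density ranking is designed to protect.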

  20. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    PubMed Central

    Zhu, Hao; Sun, Yan; Rajagopal, Gunaretnam; Mondry, Adrian; Dhar, Pawan

    2004-01-01

    Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods described. PMID:15339335
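    The paper's whole-heart automaton is far richer, but the core idea of excitation spreading through discrete cell states can be conveyed with a minimal Greenberg-Hastings excitable-medium automaton (an illustrative sketch, not the authors' model):

```python
def gh_step(grid):
    """One update of a Greenberg-Hastings excitable-medium automaton.

    States: 0 = resting, 1 = excited, 2 = refractory. A resting cell
    becomes excited if any von Neumann neighbor is excited; an excited
    cell becomes refractory; a refractory cell recovers to resting.
    """
    h, w = len(grid), len(grid[0])
    nxt = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s = grid[y][x]
            if s == 1:
                nxt[y][x] = 2
            elif s == 2:
                nxt[y][x] = 0
            else:
                excited_nbr = any(
                    grid[ny][nx] == 1
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= ny < h and 0 <= nx < w
                )
                nxt[y][x] = 1 if excited_nbr else 0
    return nxt

# A single excited cell launches an expanding ring of excitation,
# the discrete analogue of a propagating cardiac wavefront.
grid = [[0] * 5 for _ in range(5)]
grid[2][2] = 1
grid = gh_step(grid)
```

    Re-entrant waves, the automaton analogue of arrhythmia, appear in such models when a wavefront curls around a region still in the refractory state.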

  1. PCR-free quantitative detection of genetically modified organism from raw materials – A novel electrochemiluminescence-based bio-barcode method

    PubMed Central

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R.

    2018-01-01

    Bio-barcode assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio-barcode assay requires lengthy experimental procedures including the preparation and release of barcode DNA probes from the target-nanoparticle complex, and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio-barcode assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris(2,2'-bipyridyl) ruthenium (TBR)-labeled barcode DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products. PMID:18386909

  2. An iterative method for near-field Fresnel region polychromatic phase contrast imaging

    NASA Astrophysics Data System (ADS)

    Carroll, Aidan J.; van Riessen, Grant A.; Balaur, Eugeniu; Dolbnya, Igor P.; Tran, Giang N.; Peele, Andrew G.

    2017-07-01

    We present an iterative method for polychromatic phase contrast imaging that is suitable for broadband illumination and which allows for the quantitative determination of the thickness of an object given the refractive index of the sample material. Experimental and simulation results suggest the iterative method provides comparable image quality and quantitative object thickness determination when compared to the analytical polychromatic transport of intensity and contrast transfer function methods. Because it works over a wider range of experimental conditions, the iterative method is a suitable candidate for use with polychromatic illumination and may deliver more utility for laboratory-based x-ray sources, which typically have a broad spectrum.

  3. A Targeted LC-MS/MS Method for the Simultaneous Detection and Quantitation of Egg, Milk, and Peanut Allergens in Sugar Cookies.

    PubMed

    Boo, Chelsea C; Parker, Christine H; Jackson, Lauren S

    2018-01-01

    Food allergy is a growing public health concern, with many individuals reporting allergies to multiple food sources. Compliance with food labeling regulations and prevention of inadvertent cross-contact in manufacturing requires the use of reliable methods for the detection and quantitation of allergens in processed foods. In this work, a novel liquid chromatography-tandem mass spectrometry multiple-reaction monitoring method for multiallergen detection and quantitation of egg, milk, and peanut was developed and evaluated in an allergen-incurred baked sugar cookie matrix. A systematic evaluation of method parameters, including sample extraction, concentration, and digestion, were optimized for candidate allergen peptide markers. The optimized method enabled the reliable detection and quantitation of egg, milk, and peanut allergens in sugar cookies, with allergen concentrations as low as 5 ppm allergen-incurred ingredient.

  4. Toward an Objective Enhanced-V Detection Algorithm

    NASA Technical Reports Server (NTRS)

    Brunner, Jason; Feltz, Wayne; Moses, John; Rabin, Robert; Ackerman, Steven

    2007-01-01

    The area of coldest cloud tops above thunderstorms sometimes has a distinct V or U shape. This pattern, often referred to as an "enhanced-V" signature, has been observed to occur during and preceding severe weather in previous studies. This study describes an algorithmic approach to objectively detect enhanced-V features with observations from the Geostationary Operational Environmental Satellite and Low Earth Orbit data. The methodology consists of cross correlation statistics of pixels and thresholds of enhanced-V quantitative parameters. The effectiveness of the enhanced-V detection method will be examined using Geostationary Operational Environmental Satellite, MODerate-resolution Imaging Spectroradiometer, and Advanced Very High Resolution Radiometer image data from case studies in the 2003-2006 seasons. The main goal of this study is to develop an objective enhanced-V detection algorithm for future implementation into operations with future sensors, such as GOES-R.
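    The abstract mentions cross-correlation statistics of pixels with detection thresholds; one plausible form of that step is normalized cross-correlation template matching, sketched below (the threshold value and toy "V" template are illustrative assumptions, not the study's parameters):

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between an image patch and a template."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def detect(image, template, threshold=0.8):
    """Slide the template over the image and return offsets where the
    correlation exceeds the (assumed) detection threshold."""
    ih, iw = image.shape
    th, tw = template.shape
    hits = []
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            if ncc(image[y:y + th, x:x + tw], template) >= threshold:
                hits.append((y, x))
    return hits

# Toy example: a small V-like template embedded in a larger field.
template = np.array([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 0.0]])
image = np.zeros((6, 8))
image[2:4, 3:6] = template
hits = detect(image, template)
```

    In an operational setting the image would be a brightness-temperature field and the template an idealized enhanced-V shape, with additional quantitative thresholds applied to each candidate match.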

  5. Stable isotope labelling methods in mass spectrometry-based quantitative proteomics.

    PubMed

    Chahrour, Osama; Cobice, Diego; Malone, John

    2015-09-10

    Mass-spectrometry based proteomics has evolved as a promising technology over the last decade and is undergoing a dramatic development in a number of different areas, such as; mass spectrometric instrumentation, peptide identification algorithms and bioinformatic computational data analysis. The improved methodology allows quantitative measurement of relative or absolute protein amounts, which is essential for gaining insights into their functions and dynamics in biological systems. Several different strategies involving stable isotopes label (ICAT, ICPL, IDBEST, iTRAQ, TMT, IPTL, SILAC), label-free statistical assessment approaches (MRM, SWATH) and absolute quantification methods (AQUA) are possible, each having specific strengths and weaknesses. Inductively coupled plasma mass spectrometry (ICP-MS), which is still widely recognised as elemental detector, has recently emerged as a complementary technique to the previous methods. The new application area for ICP-MS is targeting the fast growing field of proteomics related research, allowing absolute protein quantification using suitable elemental based tags. This document describes the different stable isotope labelling methods which incorporate metabolic labelling in live cells, ICP-MS based detection and post-harvest chemical label tagging for protein quantification, in addition to summarising their pros and cons. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. The Quantitative Methods Boot Camp: Teaching Quantitative Thinking and Computing Skills to Graduate Students in the Life Sciences

    PubMed Central

    Stefan, Melanie I.; Gutlerner, Johanna L.; Born, Richard T.; Springer, Michael

    2015-01-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a “boot camp” in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students’ engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  7. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    PubMed

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  8. Method of forming cavitated objects of controlled dimension

    DOEpatents

    Anderson, Paul R.; Miller, Wayne J.

    1982-01-01

    A method of controllably varying the dimensions of cavitated objects such as hollow spherical shells wherein a precursor shell is heated to a temperature above the shell softening temperature in an ambient atmosphere wherein the ratio of gases which are permeable through the shell wall at that temperature to gases which are impermeable through the shell wall is substantially greater than the corresponding ratio for gases contained within the precursor shell. As the shell expands, the partial pressures of permeable gases internally and externally of the shell approach and achieve equilibrium, so that the final shell size depends solely upon the difference in impermeable gas partial pressures and shell surface tension.

  9. Reconciling incongruous qualitative and quantitative findings in mixed methods research: exemplars from research with drug using populations.

    PubMed

    Wagner, Karla D; Davidson, Peter J; Pollini, Robin A; Strathdee, Steffanie A; Washburn, Rachel; Palinkas, Lawrence A

    2012-01-01

    Mixed methods research is increasingly being promoted in the health sciences as a way to gain more comprehensive understandings of how social processes and individual behaviours shape human health. Mixed methods research most commonly combines qualitative and quantitative data collection and analysis strategies. Often, integrating findings from multiple methods is assumed to confirm or validate the findings from one method with the findings from another, seeking convergence or agreement between methods. Cases in which findings from different methods are congruous are generally thought of as ideal, whilst conflicting findings may, at first glance, appear problematic. However, the latter situation provides the opportunity for a process through which apparently discordant results are reconciled, potentially leading to new emergent understandings of complex social phenomena. This paper presents three case studies drawn from the authors' research on HIV risk amongst injection drug users in which mixed methods studies yielded apparently discrepant results. We use these case studies (involving injection drug users [IDUs] using a Needle/Syringe Exchange Program in Los Angeles, CA, USA; IDUs seeking to purchase needle/syringes at pharmacies in Tijuana, Mexico; and young street-based IDUs in San Francisco, CA, USA) to identify challenges associated with integrating findings from mixed methods projects, summarize lessons learned, and make recommendations for how to more successfully anticipate and manage the integration of findings. Despite the challenges inherent in reconciling apparently conflicting findings from qualitative and quantitative approaches, in keeping with others who have argued in favour of integrating mixed methods findings, we contend that such an undertaking has the potential to yield benefits that emerge only through the struggle to reconcile discrepant results and may provide a sum that is greater than the individual qualitative and quantitative parts

  11. A Fully Automated Method to Detect and Segment a Manufactured Object in an Underwater Color Image

    NASA Astrophysics Data System (ADS)

    Barat, Christian; Phlypo, Ronald

    2010-12-01

    We propose a fully automated active contours-based method for the detection and the segmentation of a moored manufactured object in an underwater image. Detection of objects in underwater images is difficult due to the variable lighting conditions and shadows on the object. The proposed technique is based on the information contained in the color maps and uses the visual attention method, combined with a statistical approach for the detection and an active contour for the segmentation of the object to overcome the above problems. In the classical active contour method the region descriptor is fixed and the convergence of the method depends on the initialization. With our approach, this dependence is overcome with an initialization using the visual attention results and a criterion to select the best region descriptor. This approach improves the convergence and the processing time while providing the advantages of a fully automated method.

  12. Improved Methods for Capture, Extraction, and Quantitative Assay of Environmental DNA from Asian Bigheaded Carp (Hypophthalmichthys spp.)

    PubMed Central

    Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207
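    The detection probabilities quoted above (94.8% vs. 4.2%) are, in essence, proportions of positive replicates. As a rough illustration (the function name and the replicate counts below are our own inventions, not the paper's data), such a proportion and a simple normal-approximation confidence interval can be computed as:

```python
import math

def detection_probability(positives, replicates, z=1.96):
    """Estimate eDNA detection probability as the fraction of positive
    replicates, with a Wald (normal-approximation) confidence interval."""
    p = positives / replicates
    half = z * math.sqrt(p * (1 - p) / replicates)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical replicate counts, for illustration only (the paper reports
# 94.8% for qPCR vs. 4.2% for endpoint PCR, not these raw counts).
p, lo, hi = detection_probability(91, 96)
print(f"detection probability = {p:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```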

  14. A method to explore the quantitative interactions between metal and ceria for M/CeO2 catalysts

    NASA Astrophysics Data System (ADS)

    Zhu, Kong-Jie; Liu, Jie; Yang, Yan-Ju; Xu, Yu-Xing; Teng, Bo-Tao; Wen, Xiao-Dong; Fan, Maohong

    2018-03-01

    Exploring the quantitative relationship between a metal and ceria plays a key role in the theoretical design of M/CeO2 catalysts, especially for the new hot topic of atomically dispersed catalysts. In the present work, a method to quantitatively explore metal-ceria interactions is proposed on the basis of a qualitative analysis of the factors affecting metal adsorption at different ceria surfaces, using Ag/CeO2 as a case study. Two parameters are first introduced: Ep, which converts the total adsorption energy into the interaction energy per Ag-O bond, and θdiff, which measures the deviation of the Ag-O-Ce bond angle from the angle of the sp3 orbital hybridization of the O atom. Using these two parameters, a quantitative relationship for the interaction energy between Ag and ceria is established: both Ep and the Ag-O bond length correlate linearly with θdiff. The higher θdiff is, the weaker Ep and the longer the Ag-O bond. The method is also suitable for other metals on ceria (Cu, Ni, Pd, Rh, etc.). This is the first quantitative relationship established for the metal-ceria interaction, and it sheds light on the theoretical design of M/CeO2 catalysts.
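    The two descriptors reduce to simple arithmetic: Ep divides the total adsorption energy by the number of Ag-O bonds, and θdiff is the deviation of the Ag-O-Ce angle from the ideal sp3 tetrahedral angle of about 109.47°. A minimal sketch (the function names and sample numbers are ours, not the paper's):

```python
TETRAHEDRAL_ANGLE = 109.47  # ideal sp3 O bond angle, in degrees

def e_per_bond(e_ads_total, n_ag_o_bonds):
    """Ep: total adsorption energy converted to energy per Ag-O bond."""
    return e_ads_total / n_ag_o_bonds

def theta_diff(angle_ag_o_ce):
    """theta_diff: deviation of the Ag-O-Ce angle from sp3 hybridization."""
    return abs(angle_ag_o_ce - TETRAHEDRAL_ANGLE)

# Illustrative numbers only: a -2.4 eV adsorption through three Ag-O bonds
print(e_per_bond(-2.4, 3))   # ≈ -0.8 eV per bond
print(theta_diff(120.0))     # ≈ 10.53 degrees off the sp3 angle
```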

  15. Dual-Hierarchy Graph Method for Object Indexing and Recognition

    DTIC Science & Technology

    2014-07-01

    from examples would be too late for the prey. Mythical monsters in movies or cartoons can look quite scary even though we have never seen their...uniform, at 25 blocks per parent, but depends on the number of SIFT features in the parent blocks. If we have more features, we create more children for...method mentioned above to these descriptors to derive the 3D structure and pose of the object. In effect, we replace the previous “spatial verification

  16. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    PubMed

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Evaluation of methods for detection of fluorescence labeled subcellular objects in microscope images.

    PubMed

    Ruusuvuori, Pekka; Aijö, Tarmo; Chowdhury, Sharif; Garmendia-Torres, Cecilia; Selinummi, Jyrki; Birbaumer, Mirko; Dudley, Aimée M; Pelkmans, Lucas; Yli-Harja, Olli

    2010-05-13

    Several algorithms have been proposed for detecting fluorescently labeled subcellular objects in microscope images. Many of these algorithms have been designed for specific tasks and validated with limited image data. Yet despite the potential of extensive comparisons between algorithms to guide method selection, and thus yield more accurate results, relatively few such studies have been performed. To better understand algorithm performance under different conditions, we have carried out a comparative study including eleven spot detection or segmentation algorithms from various application fields. We used microscope images from well plate experiments with a human osteosarcoma cell line and frames from image stacks of yeast cells in different focal planes. These experimentally derived images permit a comparison of method performance in realistic situations where the number of objects varies within the image set. We also used simulated microscope images in order to compare the methods and validate them against a ground truth reference result. Our study finds major differences in the performance of different algorithms, in terms of both object counts and segmentation accuracies. These results suggest that the selection of detection algorithms for image-based screens should be done carefully and take into account different conditions, such as the possibility of acquiring empty images or images with very few spots. Our inclusion of methods that have not been used before in this context broadens the set of available detection methods and compares them against the current state-of-the-art methods for subcellular particle detection.
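    Scoring a detector against a ground-truth reference typically means matching detected spots to true spots within a distance tolerance and reporting precision, recall, and F1. The greedy nearest-match rule below is a simplified sketch (the tolerance and all names are our assumptions, not the paper's validation protocol):

```python
import math

def match_score(detected, truth, tol=2.0):
    """Greedily match detected spot centroids to ground-truth centroids
    within a distance tolerance, then report precision, recall, and F1."""
    unmatched = list(truth)
    tp = 0
    for d in detected:
        best = min(unmatched, key=lambda t: math.dist(d, t), default=None)
        if best is not None and math.dist(d, best) <= tol:
            unmatched.remove(best)  # each true spot may be matched only once
            tp += 1
    precision = tp / len(detected) if detected else 0.0
    recall = tp / len(truth) if truth else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

truth = [(10, 10), (20, 20), (30, 30)]
detected = [(10.5, 10.2), (19.8, 20.1), (50, 50)]  # one false positive, one miss
print(match_score(detected, truth))
```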

  18. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed, and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
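    The abstract does not spell out the exact IDT definition, but a doubling-time-style metric can be derived from a dilution series: regress time-to-positivity on log2 of input copies, and the slope magnitude gives minutes per target doubling. A sketch under that assumption (the numbers are synthetic):

```python
import math

def isothermal_doubling_time(copies, t_positive):
    """Least-squares slope of time-to-positivity vs. log2(input copies);
    its magnitude approximates minutes per target doubling."""
    x = [math.log2(c) for c in copies]
    n = len(x)
    mx = sum(x) / n
    my = sum(t_positive) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, t_positive))
             / sum((xi - mx) ** 2 for xi in x))
    return abs(slope)

# Synthetic 10-fold dilution series: times to positivity in minutes
copies = [1e6, 1e5, 1e4, 1e3]
times = [10.0, 15.0, 20.0, 25.0]
print(isothermal_doubling_time(copies, times))  # ≈ 1.5 min per doubling
```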

  19. Comparative evaluation of two quantitative test methods for determining the efficacy of liquid sporicides and sterilants on a hard surface: a precollaborative study.

    PubMed

    Tomasino, Stephen F; Hamilton, Martin A

    2007-01-01

    Two quantitative carrier-based test methods for determining the efficacy of liquid sporicides and sterilants on a hard surface, the Standard Quantitative Carrier Test Method-ASTM E 2111-00 and an adaptation of a quantitative micro-method as reported by Sagripanti and Bonifacino, were compared in this study. The methods were selected based on their desirable characteristics (e.g., well-developed protocol, previous use with spores, fully quantitative, and use of readily available equipment) for testing liquid sporicides and sterilants on a hard surface. In this paper, the Sagripanti-Bonifacino procedure is referred to as the Three Step Method (TSM). AOAC Official Method 966.04 was included in this study as a reference method. Three laboratories participated in the evaluation. Three chemical treatments were tested: (1) 3000 ppm sodium hypochlorite with pH adjusted to 7.0, (2) a hydrogen peroxide/peroxyacetic acid product, and (3) 3000 ppm sodium hypochlorite with pH unadjusted (pH of approximately 10.0). A fourth treatment, 6000 ppm sodium hypochlorite solution with pH adjusted to 7.0, was included only for Method 966.04 as a positive control (high level of efficacy). The contact time was 10 min for all chemical treatments except the 6000 ppm sodium hypochlorite treatment which was tested at 30 min. Each chemical treatment was tested 3 times using each of the methods. Only 2 of the laboratories performed the AOAC method. Method performance was assessed by the within-laboratory variance, between-laboratory variance, and total variance associated with the log reduction (LR) estimates generated by each quantitative method. The quantitative methods performed similarly, and the LR values generated by each method were not statistically different for the 3 treatments evaluated. Based on feedback from the participating laboratories, compared to the TSM, ASTM E 2111-00 was more resource demanding and required more set-up time. The logistical and resource concerns identified
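    The log reduction (LR) at the heart of this comparison is the difference in log10 viable counts between control and treated carriers, and the methods were judged on the within-laboratory, between-laboratory, and total variance of those LR estimates. A hedged sketch (the replicate values below are invented for illustration):

```python
import math
from statistics import mean, pvariance

def log_reduction(control_cfu, treated_cfu):
    """LR = log10(control count) - log10(treated count)."""
    return math.log10(control_cfu) - math.log10(treated_cfu)

def variance_components(lr_by_lab):
    """Within-lab variance (mean of per-lab variances), between-lab
    variance (variance of per-lab means), and their total."""
    within = mean(pvariance(v) for v in lr_by_lab)
    between = pvariance([mean(v) for v in lr_by_lab])
    return within, between, within + between

# Hypothetical LR replicates from three laboratories (illustration only)
labs = [[5.9, 6.1, 6.0], [5.5, 5.7, 5.6], [6.2, 6.4, 6.3]]
print(log_reduction(1e6, 1.0))  # 6.0
print(variance_components(labs))
```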

  20. Archimedes Revisited: A Faster, Better, Cheaper Method of Accurately Measuring the Volume of Small Objects

    ERIC Educational Resources Information Center

    Hughes, Stephen W.

    2005-01-01

    A little-known method of measuring the volume of small objects based on Archimedes' principle is described, which involves suspending an object in a water-filled container placed on electronic scales. The suspension technique is a variation on the hydrostatic weighing technique used for measuring volume. The suspension method was compared with two…
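    The physics behind the suspension technique is compact: with the object suspended in (not resting on) the water-filled container, the scale reading rises by exactly the mass of the displaced water, so dividing that increase by the water density gives the volume. A sketch (names and numbers are ours):

```python
def volume_cm3(reading_with_object_g, reading_without_g, rho_water=0.9982):
    """Volume from the suspension method: the increase in the scale reading
    (grams) equals the mass of displaced water, so V = delta_m / rho_water.
    Default density is water at 20 C in g/cm^3."""
    delta_m = reading_with_object_g - reading_without_g
    return delta_m / rho_water

# A suspended pebble raises the balance reading by 12.50 g
print(volume_cm3(512.50, 500.00))  # ≈ 12.52 cm^3
```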

  1. Breast tumour visualization using 3D quantitative ultrasound methods

    NASA Astrophysics Data System (ADS)

    Gangeh, Mehrdad J.; Raheem, Abdul; Tadayyon, Hadi; Liu, Simon; Hadizad, Farnoosh; Czarnota, Gregory J.

    2016-04-01

    Breast cancer is one of the most common cancer types, accounting for 29% of all cancer cases. Early detection and treatment have a crucial impact on improving the survival of affected patients. Ultrasound (US) is a non-ionizing, portable, inexpensive, real-time imaging modality for screening and quantifying breast cancer. Due to these attractive attributes, the last decade has witnessed many studies on using quantitative ultrasound (QUS) methods in tissue characterization. However, these studies have mainly been limited to 2-D QUS methods using hand-held US (HHUS) scanners. With the availability of automated breast ultrasound (ABUS) technology, this study is the first to develop 3-D QUS methods for the ABUS visualization of breast tumours. Using an ABUS system, unlike the manual 2-D HHUS device, the patient's whole breast was scanned in an automated manner. The acquired frames were subsequently examined and a region of interest (ROI) was selected in each frame where tumour was identified. Standard 2-D QUS methods were used to compute spectral and backscatter coefficient (BSC) parametric maps on the selected ROIs. Next, the computed 2-D parameters were mapped to a Cartesian 3-D space, interpolated, and rendered to provide a transparent color-coded visualization of the entire breast tumour. Such 3-D visualization can potentially be used for further analysis of breast tumours in terms of their size and extension. Moreover, the 3-D volumetric scans can be used for tissue characterization and the categorization of breast tumours as benign or malignant by quantifying the computed parametric maps over the whole tumour volume.
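    The 2-D-to-3-D mapping step amounts to stacking the per-frame parametric maps at their scan positions and interpolating between adjacent slices before rendering. A minimal sketch of linear interpolation between two neighbouring parametric slices (all names are ours; the actual pipeline interpolates in a Cartesian 3-D space):

```python
def interpolate_slices(slice_a, slice_b, n_between):
    """Linearly interpolate n_between intermediate 2-D parametric maps
    between two adjacent slices (lists of lists of equal shape), producing
    a denser stack for 3-D rendering."""
    stack = [slice_a]
    for k in range(1, n_between + 1):
        w = k / (n_between + 1)  # interpolation weight toward slice_b
        stack.append([[(1 - w) * a + w * b
                       for a, b in zip(row_a, row_b)]
                      for row_a, row_b in zip(slice_a, slice_b)])
    stack.append(slice_b)
    return stack

a = [[0.0, 0.0], [0.0, 0.0]]
b = [[1.0, 1.0], [1.0, 1.0]]
stack = interpolate_slices(a, b, 3)  # 5 slices: original 2 + 3 interpolated
print(stack[2])                      # midway slice, all values 0.5
```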

  2. Moving object detection via low-rank total variation regularization

    NASA Astrophysics Data System (ADS)

    Wang, Pengcheng; Chen, Qian; Shao, Na

    2016-09-01

    Moving object detection is a challenging task in video surveillance. The recently proposed Robust Principal Component Analysis (RPCA) can recover outlier patterns from low-rank data under some mild conditions. However, the ℓ1-penalty in RPCA does not work well in moving object detection because the irrepresentable condition is often not satisfied. In this paper, a method based on a total variation (TV) regularization scheme is proposed. In our model, image sequences captured with a static camera are highly correlated and can be described using a low-rank matrix. Meanwhile, the low-rank matrix can absorb background motion, e.g., periodic and random perturbations. The foreground objects in the sequence are usually sparsely distributed and drift continuously, and can be treated as group outliers from the highly correlated background scenes. Instead of the ℓ1-penalty, we exploit the total variation of the foreground. By minimizing the total variation energy, the outliers tend to collapse and finally converge to the exact moving objects. The TV-penalty is superior to the ℓ1-penalty especially when the outliers are in the majority for some pixels, and our method can estimate the outliers explicitly with less bias but higher variance. To solve the problem, a joint optimization function is formulated and can be effectively solved through the inexact Augmented Lagrange Multiplier (ALM) method. We evaluate our method along with several state-of-the-art approaches in MATLAB. Both qualitative and quantitative results demonstrate that our proposed method works effectively on a large range of complex scenarios.
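    The core assumption (background frames span a low-rank subspace while moving objects appear as residual outliers) can be illustrated without the TV term or the ALM solver. The sketch below is a deliberately simplified stand-in, not the paper's algorithm: it extracts a rank-1 background by power iteration and flags the foreground through the residual:

```python
def rank1_background(frames, iters=50):
    """Crude stand-in for the low-rank background model: treat each frame
    as a row of a matrix and extract its dominant rank-1 component via
    power iteration. Residuals then flag moving (foreground) pixels."""
    n_pix = len(frames[0])
    n_frames = len(frames)
    u = [1.0] * n_pix  # spatial pattern: one weight per pixel
    for _ in range(iters):
        # project each frame onto u, then rebuild u from those projections
        v = [sum(f[i] * u[i] for i in range(n_pix)) for f in frames]
        u = [sum(v[j] * frames[j][i] for j in range(n_frames))
             for i in range(n_pix)]
        norm = sum(x * x for x in u) ** 0.5
        u = [x / norm for x in u]
    v = [sum(f[i] * u[i] for i in range(n_pix)) for f in frames]
    # outer product v u^T approximates the static background
    return [[v[j] * u[i] for i in range(n_pix)] for j in range(n_frames)]

# Static background with one transient "object" in frame 1, pixel 2
frames = [[4, 4, 4, 4], [4, 4, 9, 4], [4, 4, 4, 4]]
bg = rank1_background(frames)
residual = [[abs(frames[j][i] - bg[j][i]) for i in range(4)] for j in range(3)]
print(round(residual[1][2], 3))  # the largest residual marks the object
```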

  3. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. We evaluated this

  4. Method for contour extraction for object representation

    DOEpatents

    Skourikhine, Alexei N.; Prasad, Lakshman

    2005-08-30

    Contours are extracted for representing a pixelated object in a background pixel field. An object pixel is located that is the start of a new contour for the object and identifying that pixel as the first pixel of the new contour. A first contour point is then located on the mid-point of a transition edge of the first pixel. A tracing direction from the first contour point is determined for tracing the new contour. Contour points on mid-points of pixel transition edges are sequentially located along the tracing direction until the first contour point is again encountered to complete tracing the new contour. The new contour is then added to a list of extracted contours that represent the object. The contour extraction process associates regions and contours by labeling all the contours belonging to the same object with the same label.
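    For a binary mask, the patented procedure can be sketched directly: collect every pixel edge where an object pixel borders background, then chain those transition edges into closed loops whose midpoints form the contour. The implementation below follows that idea with our own naming, and assumes simple shapes without corner-touching ambiguities:

```python
def boundary_midpoints(mask):
    """Trace closed contours of a binary mask as sequences of mid-points
    of pixel transition edges (object pixel bordering background)."""
    rows, cols = len(mask), len(mask[0])
    def val(r, c):
        return mask[r][c] if 0 <= r < rows and 0 <= c < cols else 0
    segments = set()
    for r in range(rows):
        for c in range(cols):
            if not val(r, c):
                continue
            # each side bordering background is a transition edge, stored
            # as a pair of pixel-corner coordinates (row, col)
            if not val(r - 1, c): segments.add(((r, c), (r, c + 1)))          # top
            if not val(r + 1, c): segments.add(((r + 1, c), (r + 1, c + 1)))  # bottom
            if not val(r, c - 1): segments.add(((r, c), (r + 1, c)))          # left
            if not val(r, c + 1): segments.add(((r, c + 1), (r + 1, c + 1)))  # right
    adj = {}
    for seg in segments:
        for corner in seg:
            adj.setdefault(corner, []).append(seg)
    contours, unused = [], set(segments)
    while unused:
        seg = next(iter(unused))
        unused.remove(seg)
        start, cur = seg
        contour = [seg]
        while cur != start:  # walk edge to edge until the loop closes
            nxt = next(s for s in adj[cur] if s in unused)
            unused.remove(nxt)
            contour.append(nxt)
            cur = nxt[1] if nxt[0] == cur else nxt[0]
        # contour points are the mid-points of the traced transition edges
        contours.append([((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
                         for a, b in contour])
    return contours

mask = [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
contours = boundary_midpoints(mask)
print(len(contours[0]))  # 8 mid-points around the 2x2 block
```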

  5. Assessment of acute myocarditis by cardiac magnetic resonance imaging: Comparison of qualitative and quantitative analysis methods.

    PubMed

    Imbriaco, Massimo; Nappi, Carmela; Puglia, Marta; De Giorgi, Marco; Dell'Aversana, Serena; Cuocolo, Renato; Ponsiglione, Andrea; De Giorgi, Igino; Polito, Maria Vincenza; Klain, Michele; Piscione, Federico; Pace, Leonardo; Cuocolo, Alberto

    2017-10-26

    The aim of this study was to compare cardiac magnetic resonance (CMR) qualitative and quantitative analysis methods for the noninvasive assessment of myocardial inflammation in patients with suspected acute myocarditis (AM). A total of 61 patients with suspected AM underwent coronary angiography and CMR. Qualitative analysis was performed applying Lake-Louise Criteria (LLC), followed by quantitative analysis based on the evaluation of edema ratio (ER) and global relative enhancement (RE). Diagnostic performance was assessed for each method by measuring the area under the curves (AUC) of the receiver operating characteristic analyses. The final diagnosis of AM was based on symptoms and signs suggestive of cardiac disease, evidence of myocardial injury as defined by electrocardiogram changes, elevated troponin I, exclusion of coronary artery disease by coronary angiography, and clinical and echocardiographic follow-up at 3 months after admission to the chest pain unit. In all patients, coronary angiography did not show significant coronary artery stenosis. Troponin I levels and creatine kinase were higher in patients with AM compared to those without (both P < .001). There were no significant differences among LLC, T2-weighted short inversion time inversion recovery (STIR) sequences, early (EGE), and late (LGE) gadolinium-enhancement sequences for the diagnosis of AM. The AUC for qualitative (T2-weighted STIR 0.92, EGE 0.87 and LGE 0.88) and quantitative (ER 0.89 and global RE 0.80) analyses were also similar. Qualitative and quantitative CMR analysis methods show similar diagnostic accuracy for the diagnosis of AM. These findings suggest that a simplified approach using a shortened CMR protocol including only T2-weighted STIR sequences might be useful to rule out AM in patients with acute coronary syndrome and normal coronary angiography.
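    The AUC values reported here can be computed nonparametrically as the probability that a randomly chosen patient with AM scores higher than one without (the Mann-Whitney formulation of ROC AUC). A sketch with invented scores:

```python
def auc(scores_pos, scores_neg):
    """Nonparametric ROC AUC: fraction of (positive, negative) pairs
    ranked correctly, counting ties as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Illustrative edema-ratio values for diseased vs. healthy patients
print(auc([2.4, 2.9, 3.1], [1.8, 2.5, 2.0]))  # ≈ 0.89
```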

  6. Understanding Variation in Treatment Effects in Education Impact Evaluations: An Overview of Quantitative Methods. NCEE 2014-4017

    ERIC Educational Resources Information Center

    Schochet, Peter Z.; Puma, Mike; Deke, John

    2014-01-01

    This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…

  7. Validation of a method for quantitation of the clopidogrel active metabolite, clopidogrel, clopidogrel carboxylic acid, and 2-oxo-clopidogrel in feline plasma.

    PubMed

    Lyngby, Janne G; Court, Michael H; Lee, Pamela M

    2017-08-01

    The clopidogrel active metabolite (CAM) is unstable and challenging to quantitate. The objective was to validate a new method for stabilization and quantitation of CAM, clopidogrel, and the inactive metabolites clopidogrel carboxylic acid and 2-oxo-clopidogrel in feline plasma. Two healthy cats were administered clopidogrel to demonstrate the assay's in vivo utility. Stabilization of CAM was achieved by adding 2-bromo-3'-methoxyacetophenone to blood tubes to form a derivatized CAM (CAM-D). Method validation included evaluation of calibration curve linearity, accuracy, and precision; within- and between-assay precision and accuracy; and compound stability using spiked blank feline plasma. Analytes were measured by high-performance liquid chromatography with tandem mass spectrometry. In vivo utility was demonstrated by a pharmacokinetic study of cats given a single oral dose of 18.75 mg clopidogrel. The 2-oxo-clopidogrel metabolite was unstable. Clopidogrel, CAM-D, and clopidogrel carboxylic acid appear stable for 1 week at room temperature and 9 months at -80°C. Standard curves showed linearity for CAM-D, clopidogrel, and clopidogrel carboxylic acid (r > 0.99). Between-assay accuracy and precision were ≤2.6% and ≤7.1% for CAM-D and ≤17.9% and ≤11.3% for clopidogrel and clopidogrel carboxylic acid. Within-assay precision for all three compounds was ≤7%. All three compounds were detected in plasma from healthy cats receiving clopidogrel. This methodology is accurate and precise for simultaneous quantitation of CAM-D, clopidogrel, and clopidogrel carboxylic acid in feline plasma, but not 2-oxo-clopidogrel. Validation of this assay is the first step to more fully understanding the use of clopidogrel in cats. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Multiple-3D-object secure information system based on phase shifting method and single interference.

    PubMed

    Li, Wei-Na; Shi, Chen-Xiao; Piao, Mei-Lan; Kim, Nam

    2016-05-20

    We propose a multiple-3D-object secure information system for encrypting multiple three-dimensional (3D) objects based on the three-step phase shifting method. During the decryption procedure, five phase functions (PFs) are reduced to three PFs in comparison with our previous method, which implies that one cross beam splitter is utilized to implement the single decryption interference. Moreover, the advantages of the proposed scheme include: each 3D object can be decrypted individually, without first decrypting a series of other objects; the quality of the decrypted slice image of each object is high, with no correlation coefficient value lower than 0.95; and no iterative algorithm is involved. The feasibility of the proposed scheme is demonstrated by computer simulation results.
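    In standard three-step phase shifting with shifts of 0, 2π/3, and 4π/3, the recorded intensities are I_k = A + B·cos(φ + δ_k), and the phase is recovered as φ = atan2(√3·(I3 − I2), 2·I1 − I2 − I3). A sketch of that recovery step only (the paper's full encryption pipeline is not reproduced here):

```python
import math

def recover_phase(i1, i2, i3):
    """Standard three-step phase-shifting recovery for shifts 0, 2*pi/3,
    4*pi/3: phi = atan2(sqrt(3)*(i3 - i2), 2*i1 - i2 - i3)."""
    return math.atan2(math.sqrt(3) * (i3 - i2), 2 * i1 - i2 - i3)

# Round-trip check: synthesize the three interferograms, then recover phi
A, B, phi = 1.0, 0.5, 0.7
intensities = [A + B * math.cos(phi + k * 2 * math.pi / 3) for k in range(3)]
print(recover_phase(*intensities))  # ≈ 0.7
```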

  9. An objective method for a video quality evaluation in a 3DTV service

    NASA Astrophysics Data System (ADS)

    Wilczewski, Grzegorz

    2015-09-01

    This article describes a proposed objective method for 3DTV video quality evaluation: the Compressed Average Image Intensity (CAII) method. Identifying the nodes of the 3DTV service's content chain enables the design of a versatile, objective video quality metric based on an advanced approach to stereoscopic videostream analysis. The mechanisms of the designed metric, as well as an evaluation of its performance under simulated environmental conditions, are discussed. The resulting CAII metric can be used effectively in a variety of service quality assessment applications.

  10. Metrology Standards for Quantitative Imaging Biomarkers

    PubMed Central

    Obuchowski, Nancy A.; Kessler, Larry G.; Raunig, David L.; Gatsonis, Constantine; Huang, Erich P.; Kondratovich, Marina; McShane, Lisa M.; Reeves, Anthony P.; Barboriak, Daniel P.; Guimaraes, Alexander R.; Wahl, Richard L.

    2015-01-01

    Although investigators in the imaging community have been active in developing and evaluating quantitative imaging biomarkers (QIBs), the development and implementation of QIBs have been hampered by the inconsistent or incorrect use of terminology or methods for technical performance and statistical concepts. Technical performance is an assessment of how a test performs in reference objects or subjects under controlled conditions. In this article, some of the relevant statistical concepts are reviewed, methods that can be used for evaluating and comparing QIBs are described, and some of the technical performance issues related to imaging biomarkers are discussed. More consistent and correct use of terminology and study design principles will improve clinical research, advance regulatory science, and foster better care for patients who undergo imaging studies. © RSNA, 2015 PMID:26267831

  11. Development of a quantitative diagnostic method of estrogen receptor expression levels by immunohistochemistry using organic fluorescent material-assembled nanoparticles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonda, Kohsuke, E-mail: gonda@med.tohoku.ac.jp; Miyashita, Minoru; Watanabe, Mika

    2012-09-28

    Highlights: • Organic fluorescent material-assembled nanoparticles for IHC were prepared. • New nanoparticle fluorescent intensity was 10.2-fold greater than Qdot655. • Nanoparticle staining analyzed a wide range of ER expression levels in tissue. • Nanoparticle staining enhanced the quantitative sensitivity for ER diagnosis. -- Abstract: The detection of estrogen receptors (ERs) by immunohistochemistry (IHC) using 3,3'-diaminobenzidine (DAB) is slightly weak as a prognostic marker, but it is essential to the application of endocrine therapy, such as antiestrogen tamoxifen-based therapy. IHC using DAB is a poor quantitative method because horseradish peroxidase (HRP) activity depends on reaction time, temperature and substrate concentration. However, IHC using fluorescent material provides an effective quantitative method because the signal intensity is proportional to the intensity of the photon excitation energy. However, the high level of autofluorescence has impeded the development of quantitative IHC using fluorescence. We developed organic fluorescent material (tetramethylrhodamine)-assembled nanoparticles for IHC. Tissue autofluorescence is comparable to the fluorescence intensity of quantum dots, which are the most representative fluorescent nanoparticles. The fluorescent intensity of our novel nanoparticles was 10.2-fold greater than that of quantum dots, and they did not bind non-specifically to breast cancer tissues due to the polyethylene glycol chain that coated their surfaces. Therefore, the fluorescent intensity of our nanoparticles significantly exceeded autofluorescence, which produced a significantly higher signal-to-noise ratio on IHC-imaged cancer tissues than previous methods. Moreover, immunostaining data from our nanoparticle fluorescent IHC and IHC with DAB were compared in the same region of adjacent tissues

  12. Saliency-Guided Detection of Unknown Objects in RGB-D Indoor Scenes.

    PubMed

    Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Xi, Ning

    2015-08-27

    This paper studies the problem of detecting unknown objects within indoor environments in an active and natural manner. The visual saliency scheme utilizing both color and depth cues is proposed to arouse the interests of the machine system for detecting unknown objects at salient positions in a 3D scene. The 3D points at the salient positions are selected as seed points for generating object hypotheses using the 3D shape. We perform multi-class labeling on a Markov random field (MRF) over the voxels of the 3D scene, combining cues from object hypotheses and 3D shape. The results from MRF are further refined by merging the labeled objects, which are spatially connected and have high correlation between color histograms. Quantitative and qualitative evaluations on two benchmark RGB-D datasets illustrate the advantages of the proposed method. The experiments of object detection and manipulation performed on a mobile manipulator validate its effectiveness and practicability in robotic applications.

  13. Saliency-Guided Detection of Unknown Objects in RGB-D Indoor Scenes

    PubMed Central

    Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Xi, Ning

    2015-01-01

    This paper studies the problem of detecting unknown objects within indoor environments in an active and natural manner. The visual saliency scheme utilizing both color and depth cues is proposed to arouse the interests of the machine system for detecting unknown objects at salient positions in a 3D scene. The 3D points at the salient positions are selected as seed points for generating object hypotheses using the 3D shape. We perform multi-class labeling on a Markov random field (MRF) over the voxels of the 3D scene, combining cues from object hypotheses and 3D shape. The results from MRF are further refined by merging the labeled objects, which are spatially connected and have high correlation between color histograms. Quantitative and qualitative evaluations on two benchmark RGB-D datasets illustrate the advantages of the proposed method. The experiments of object detection and manipulation performed on a mobile manipulator validate its effectiveness and practicability in robotic applications. PMID:26343656

  14. [Comparative analysis of real-time quantitative PCR-Sanger sequencing method and TaqMan probe method for detection of KRAS/BRAF mutation in colorectal carcinomas].

    PubMed

    Zhang, Xun; Wang, Yuehua; Gao, Ning; Wang, Jinfen

    2014-02-01

    To compare the application values of real-time quantitative PCR-Sanger sequencing and the TaqMan probe method in the detection of KRAS and BRAF mutations, and to correlate KRAS/BRAF mutations with the clinicopathological characteristics of colorectal carcinomas. Genomic DNA of the tumor cells was extracted from formalin-fixed paraffin-embedded (FFPE) tissue samples of 344 colorectal carcinomas by microdissection. Real-time quantitative PCR-Sanger sequencing and the TaqMan probe method were performed to detect KRAS/BRAF mutations. The frequency and types of KRAS/BRAF mutations, clinicopathological characteristics and survival time were analyzed. KRAS mutations were detected in 39.8% (137/344) and 38.7% (133/344) of 344 colorectal carcinomas by real-time quantitative PCR-Sanger sequencing and the TaqMan probe method, respectively. BRAF mutation was detected in 4.7% (16/344) and 4.1% (14/344), respectively. There was no significant difference between the results of the two methods. The frequency of KRAS mutation in females was higher than that in males (P < 0.05). The frequency of BRAF mutation in the colon was higher than that in the rectum. The frequency of BRAF mutation in stage III-IV cases was higher than that in stage I-II cases. The frequency of BRAF mutation in signet ring cell carcinoma was higher than that in mucinous carcinoma, and nonspecific adenocarcinoma had the lowest mutation rate. The frequency of BRAF mutation in grade III cases was higher than that in grade II cases (P < 0.05). The overall concordance of the two methods for KRAS/BRAF mutation detection was 98.8% (kappa = 0.976). There was a statistically significant difference in the survival time of colorectal carcinoma patients between BRAF and KRAS mutations (P = 0.039). There was no statistical significance between the BRAF mutation type and the BRAF/KRAS wild type (P = 0.058). (1) Compared with real-time quantitative PCR-Sanger sequencing, the TaqMan probe method is better with regard to handling time, efficiency, repeatability, cost
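The reported agreement statistics (98.8% concordance, kappa = 0.976) can be reproduced with Cohen's kappa. The 2x2 agreement table below is hypothetical, chosen only to be consistent with the reported marginals (137 vs. 133 KRAS-positive calls out of 344):

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (rows: method A calls, cols: method B calls)."""
    n = sum(sum(row) for row in table)
    k = len(table)
    po = sum(table[i][i] for i in range(k)) / n                      # observed agreement
    row_tot = [sum(row) for row in table]
    col_tot = [sum(table[i][j] for i in range(k)) for j in range(k)]
    pe = sum(row_tot[i] * col_tot[i] for i in range(k)) / (n * n)   # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical counts: [[mutant/mutant, mutant/wild], [wild/mutant, wild/wild]]
table = [[133, 4], [0, 207]]
kappa = cohens_kappa(table)  # ~0.976, with 340/344 = 98.8% raw concordance
```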

  15. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, J.T.; Kunz, W.E.; Atencio, J.D.

    1982-03-31

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux, additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors, and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether permanent low-level burial is appropriate for the waste sample.
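The final conversion from assayed isotope masses to overall alpha activity is a weighted sum over specific activities. A sketch under stated assumptions: the specific-activity values below are approximate literature figures (consult nuclear data tables for authoritative values), the masses and drum size are hypothetical, and the 100 nCi/g transuranic threshold is an illustrative regulatory cutoff:

```python
# Approximate alpha specific activities in Ci/g (illustrative; verify against nuclear data tables)
SPECIFIC_ACTIVITY_CI_PER_G = {
    "U-233": 9.7e-3,
    "U-235": 2.2e-6,
    "Pu-239": 6.2e-2,
    "Pu-240": 2.3e-1,
    "Am-241": 3.4,
    "Cm-244": 8.1e1,
    "Cf-252": 5.4e2,
}

def alpha_activity_nci_per_g(isotope_masses_g, waste_mass_g):
    """Total alpha activity of the waste package, in nCi per gram of waste."""
    total_ci = sum(SPECIFIC_ACTIVITY_CI_PER_G[iso] * m for iso, m in isotope_masses_g.items())
    return total_ci * 1e9 / waste_mass_g  # 1 Ci = 1e9 nCi

# Hypothetical assay result: 10 mg Pu-239 and 1 mg Am-241 in a 50 kg drum
activity = alpha_activity_nci_per_g({"Pu-239": 0.010, "Am-241": 0.001}, 50_000)
above_limit = activity > 100  # illustrative 100 nCi/g transuranic threshold
```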

  16. Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Atencio, James D.

    1984-01-01

    A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux, additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors, and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.

  17. Comparison of two quantitative fit-test methods using N95 filtering facepiece respirators.

    PubMed

    Sietsema, Margaret; Brosseau, Lisa M

    2016-08-01

    Current regulations require annual fit testing before an employee can wear a respirator during work activities. The goal of this research was to determine whether respirator fit measured with two TSI Portacount instruments, simultaneously sampling ambient particle concentrations inside and outside of the respirator facepiece, is similar to fit measured during an ambient aerosol condensation nuclei counter quantitative fit test. Sixteen subjects (ten female; six male) were recruited to cover a range of facial sizes. Each subject donned an N95 filtering facepiece respirator and completed two fit tests in random order (ambient aerosol condensation nuclei counter quantitative fit test and two-instrument real-time fit test) without removing or adjusting the respirator between tests. Fit tests were compared using Spearman's rank correlation coefficients. The real-time two-instrument method fit factors were similar to those measured with the single-instrument quantitative fit test. The first four exercises were highly correlated (r > 0.7) between the two protocols. Respirator fit was altered during the talking or grimace exercise, both of which involve facial movements that could dislodge the facepiece. Our analyses suggest that the new real-time two-instrument methodology can be used in future studies to evaluate fit before and during work activities.
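A quantitative fit factor is conventionally the ratio of ambient to in-facepiece particle concentration, and the protocol comparison above uses Spearman rank correlation. A stdlib-only sketch with hypothetical per-exercise fit factors (the no-ties rank formula is used for simplicity):

```python
def fit_factor(c_ambient, c_inside):
    """Quantitative fit factor: ambient / in-facepiece particle concentration."""
    return c_ambient / c_inside

def spearman_rho(x, y):
    """Spearman rank correlation between two equal-length samples (no-ties case)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical per-exercise fit factors from the two protocols
single_instrument = [112.0, 95.0, 143.0, 88.0, 120.0, 101.0]
two_instrument = [105.0, 99.0, 150.0, 80.0, 117.0, 98.0]
rho = spearman_rho(single_instrument, two_instrument)  # high rank agreement
```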

  18. An Object-Oriented Classification Method on High Resolution Satellite Data

    DTIC Science & Technology

    2004-11-01

    Proceedings of the 25th Asian Conference on Remote Sensing (ACRS 2004), held in Chiang Mai, Thailand, 22-26 November 2004; paper B-4.6 (Data Processing): "An Object-Oriented Classification Method on High Resolution Satellite Data." Distribution unlimited. The paper compares panchromatic and multispectral high resolution imagery.

  19. Real-time quantitative polymerase chain reaction methods for four genetically modified maize varieties and maize DNA content in food.

    PubMed

    Brodmann, Peter D; Ilg, Evelyn C; Berthoud, Hélène; Herrmann, Andre

    2002-01-01

    Quantitative detection methods are needed for enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients. This labeling threshold, which is set to 1% in the European Union and Switzerland, must be applied to all approved GMOs. Four different varieties of maize are approved in the European Union: the insect-resistant Bt176 (Maximizer) maize, Bt11 maize, and Mon810 (YieldGard) maize, and the herbicide-tolerant T25 (Liberty Link) maize. Because the labeling must be considered individually for each ingredient, a quantitation system for the endogenous maize content is needed in addition to the GMO-specific detection systems. Quantitative real-time polymerase chain reaction detection methods were developed for the 4 approved genetically modified maize varieties and for an endogenous maize (invertase) gene system.
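In this style of assay, the GMO percentage is the ratio of transgene copies to endogenous reference-gene (invertase) copies, each read off a standard curve Ct = intercept + slope * log10(copies). The Ct values, slope, and intercept below are illustrative assumptions, not the paper's calibration data:

```python
def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve Ct = intercept + slope*log10(copies); slope ~ -3.32 at 100% PCR efficiency."""
    return 10 ** ((ct - intercept) / slope)

def gmo_percent(ct_gmo, ct_reference, slope=-3.32, intercept=40.0):
    """GMO content relative to the endogenous (e.g. invertase) reference gene, in %."""
    return 100.0 * copies_from_ct(ct_gmo, slope, intercept) / copies_from_ct(ct_reference, slope, intercept)

# Hypothetical Ct values for one sample
pct = gmo_percent(ct_gmo=31.0, ct_reference=24.4)
label_required = pct > 1.0  # 1% EU/Swiss labeling threshold
```

In practice each target would have its own fitted slope and intercept, and replicate Ct values would be averaged before conversion.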

  20. Quantitative and qualitative approaches in the study of poverty and adolescent development: separation or integration?

    PubMed

    Leung, Janet T Y; Shek, Daniel T L

    2011-01-01

    This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. Given the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed methods approach can integrate quantitative and qualitative approaches and offers an alternative strategy for studying the impact of economic disadvantage on family processes and adolescent development.

  1. An Improved Flow Cytometry Method For Precise Quantitation Of Natural-Killer Cell Activity

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Nehlsen-Cannarella, Sandra; Sams, Clarence

    2006-01-01

    The ability to assess NK cell cytotoxicity using flow cytometry has been previously described and can serve as a powerful tool to evaluate effector immune function in the clinical setting. Previous methods used membrane-permeable dyes to identify target cells. The use of these dyes requires great care to achieve optimal staining and results in a broad spectral emission that can make multicolor cytometry difficult. Previous methods have also used negative staining (the elimination of target cells) to identify effector cells. This makes a precise quantitation of effector NK cells impossible due to the interfering presence of T and B lymphocytes, and renders the data highly subjective to the variable levels of NK cells normally found in human peripheral blood. In this study an improved version of the standard flow cytometry assay for NK activity is described that has several advantages over previous methods. Fluorescent antibody staining (CD45-FITC) is used to positively identify target cells in place of membrane-permeable dyes. Fluorescent antibody staining of target cells is less labor intensive and more easily reproducible than membrane dyes. NK cells (true effector lymphocytes) are also positively identified by fluorescent antibody staining (CD56-PE), allowing a simultaneous absolute count assessment of both NK cells and target cells. Dead cells are identified by membrane disruption using the DNA-intercalating dye PI. Using this method, an exact NK:target ratio may be determined for each assessment, including quantitation of NK:target complexes. Back-immunoscatter gating may be used to track live vs. dead target cells via scatter properties. If desired, NK activity may then be normalized to standardized ratios for clinical comparisons between patients, making the determination of PBMC counts or NK cell percentages prior to testing unnecessary. This method provides an exact cytometric determination of NK activity that is highly reproducible and may be suitable for routine use in the
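The quantities this assay yields per tube, an exact effector:target (E:T) ratio from absolute counts and a percent cytotoxicity corrected for spontaneous target death, can be sketched as follows. The counts and the spontaneous-death correction formula are illustrative assumptions, not the study's data:

```python
def percent_cytotoxicity(dead_exp, total_exp, dead_spont, total_spont):
    """Target lysis corrected for spontaneous target-cell death in a no-effector control tube."""
    exp = 100.0 * dead_exp / total_exp          # % dead targets with NK effectors present
    spont = 100.0 * dead_spont / total_spont    # % dead targets in the control
    return 100.0 * (exp - spont) / (100.0 - spont)

def effector_target_ratio(nk_count, target_count):
    """Exact E:T ratio from simultaneous absolute counts of CD56+ NK cells and labeled targets."""
    return nk_count / target_count

# Hypothetical event counts from one assessment
lysis = percent_cytotoxicity(dead_exp=1800, total_exp=5000, dead_spont=250, total_spont=5000)
et = effector_target_ratio(25_000, 5_000)  # 5:1
```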

  2. Communication Profiles of Psychiatric Residents and Attending Physicians in Medication-Management Appointments: A Quantitative Pilot Study

    ERIC Educational Resources Information Center

    Castillo, Enrico G.; Pincus, Harold A.; Wieland, Melissa; Roter, Debra; Larson, Susan; Houck, Patricia; Reynolds, Charles F.; Cruz, Mario

    2012-01-01

    Objective: The authors quantitatively examined differences in psychiatric residents' and attending physicians' communication profiles and voice tones. Methods: Audiotaped recordings of 49 resident-patient and 35 attending-patient medication-management appointments at four ambulatory sites were analyzed with the Roter Interaction Analysis System…

  3. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually while standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. VSA is often regarded as subjective, so there is a need to verify it. Also, many VSAs have not been fine-tuned for contrasting soil types, which could lead to misinterpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA while taking soil type into account. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant.
For the reproducibility study, a group of 9 soil scientists and 7

  4. Hotspot Identification for Shanghai Expressways Using the Quantitative Risk Assessment Method

    PubMed Central

    Chen, Can; Li, Tienan; Sun, Jian; Chen, Feng

    2016-01-01

    Hotspot identification (HSID) is the first and key step of the expressway safety management process. This study presents a new HSID method using the quantitative risk assessment (QRA) technique. Crashes that are likely to happen for a specific site are treated as the risk. The aggregation of the crash occurrence probability for all exposure vehicles is estimated based on the empirical Bayesian method. As for the consequences of crashes, crashes may not only cause direct losses (e.g., occupant injuries and property damages) but also result in indirect losses. The indirect losses are expressed by the extra delays calculated using the deterministic queuing diagram method. The direct losses and indirect losses are uniformly monetized to be considered as the consequences of this risk. The potential costs of crashes, as a criterion to rank high-risk sites, can be explicitly expressed as the sum of the crash probability for all passing vehicles and the corresponding consequences of crashes. A case study on the urban expressways of Shanghai is presented. The results show that the new QRA method for HSID enables the identification of a set of high-risk sites that truly reveal the potential crash costs to society. PMID:28036009
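The ranking criterion described above, potential crash cost as aggregated crash probability times monetized consequences (direct losses plus delay-based indirect losses), can be sketched as follows. All site parameters and the value-of-time figure are hypothetical illustrations, not the Shanghai study's inputs:

```python
def site_risk_cost(crash_prob_per_vehicle, annual_vehicles, direct_loss, delay_hours, value_of_time):
    """Expected annual crash cost at a site: aggregated crash probability over all
    exposed vehicles, times direct losses plus monetized extra delay (indirect losses)."""
    consequence = direct_loss + delay_hours * value_of_time
    return crash_prob_per_vehicle * annual_vehicles * consequence

# Hypothetical sites: (P(crash) per vehicle pass, annual vehicles, direct loss $, extra delay h, $/h)
sites = {
    "A": site_risk_cost(2e-7, 10_000_000, 40_000, 300, 25),
    "B": site_risk_cost(5e-7, 4_000_000, 25_000, 500, 25),
    "C": site_risk_cost(1e-7, 20_000_000, 60_000, 100, 25),
}
ranked = sorted(sites, key=sites.get, reverse=True)  # high-risk sites first
```

Note that sites A, B and C all expect about two crashes per year here; the ranking differs only through the consequences, which is the point of a cost-based (rather than frequency-based) HSID criterion.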

  5. Quantitative Evaluation of the Use of Actigraphy for Neurological and Psychiatric Disorders

    PubMed Central

    Song, Yu; Kwak, Shin; Yoshida, Sohei; Yamamoto, Yoshiharu

    2014-01-01

    Quantitative and objective evaluation of disease severity and/or drug effect is necessary in clinical practice. Wearable accelerometers such as an actigraph enable long-term recording of a patient's movement during activities and they can be used for quantitative assessment of symptoms due to various diseases. We reviewed some applications of actigraphy with analytical methods that are sufficiently sensitive and reliable to determine the severity of diseases and disorders such as motor and nonmotor disorders like Parkinson's disease, sleep disorders, depression, behavioral and psychological symptoms of dementia (BPSD) for vascular dementia (VD), seasonal affective disorder (SAD), and stroke, as well as the effects of drugs used to treat them. We believe it is possible to develop analytical methods to assess more neurological or psychopathic disorders using actigraphy records. PMID:25214709

  6. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, and by physicochemical studies conducted using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for quantitative HPLC determination. These impurities were detected by a newly developed gradient, reverse-phase high performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and the stability of analytical solutions, demonstrating the power of the newly developed HPLC method.

  7. Hydro-environmental management of groundwater resources: A fuzzy-based multi-objective compromise approach

    NASA Astrophysics Data System (ADS)

    Alizadeh, Mohammad Reza; Nikoo, Mohammad Reza; Rakhshandehroo, Gholam Reza

    2017-08-01

    Sustainable management of water resources necessitates close attention to social, economic and environmental aspects such as water quality and quantity concerns and potential conflicts. This study presents a new fuzzy-based multi-objective compromise methodology to determine socially optimal and sustainable policies for hydro-environmental management of groundwater resources, simultaneously considering the conflicts and negotiation of involved stakeholders, uncertainties in decision makers' preferences, existing uncertainties in the groundwater parameters, and groundwater quality and quantity issues. The fuzzy multi-objective simulation-optimization model is developed based on qualitative and quantitative groundwater simulation models (MODFLOW and MT3D), a multi-objective optimization model (NSGA-II), Monte Carlo analysis and the Fuzzy Transformation Method (FTM). Best compromise solutions (best management policies) on trade-off curves are determined using four different Fuzzy Social Choice (FSC) methods. Finally, a unanimity fallback bargaining method is utilized to suggest the most preferred FSC method. The Kavar-Maharloo aquifer system in Fars, Iran, a typical multi-stakeholder, multi-objective real-world problem, is considered to verify the proposed methodology. Results showed the effective performance of the framework for determining the most sustainable allocation policy in groundwater resource management.

  8. A subagging regression method for estimating the qualitative and quantitative state of groundwater

    NASA Astrophysics Data System (ADS)

    Jeong, J.; Park, E.; Choi, J.; Han, W. S.; Yun, S. T.

    2016-12-01

    A subagging regression (SBR) method for the analysis of groundwater data pertaining to the estimation of trend and the associated uncertainty is proposed. The SBR method is validated against synthetic data competitively with other conventional robust and non-robust methods. From the results, it is verified that the estimation accuracies of the SBR method are consistent and superior to those of the other methods and the uncertainties are reasonably estimated where the others have no uncertainty analysis option. To validate further, real quantitative and qualitative data are employed and analyzed comparatively with Gaussian process regression (GPR). For all cases, the trend and the associated uncertainties are reasonably estimated by SBR, whereas the GPR has limitations in representing the variability of non-Gaussian skewed data. From the implementations, it is determined that the SBR method has potential to be further developed as an effective tool of anomaly detection or outlier identification in groundwater state data.
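Subagging (subsample aggregating) fits a base estimator on many random subsamples drawn without replacement and aggregates the results; the spread across subsample fits provides the uncertainty estimate that conventional point estimators lack. A minimal stdlib sketch with a linear-trend base estimator and synthetic groundwater-level data (the abstract does not specify the base estimator; ordinary least squares is an assumption here):

```python
import random
import statistics

def fit_line(points):
    """Ordinary least-squares slope and intercept for (t, y) pairs."""
    n = len(points)
    mt = sum(t for t, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((t - mt) ** 2 for t, _ in points)
    sxy = sum((t - mt) * (y - my) for t, y in points)
    slope = sxy / sxx
    return slope, my - slope * mt

def subagging_trend(points, n_subsamples=200, frac=0.5, seed=1):
    """Aggregate the linear trend over many half-size subsamples (drawn without
    replacement); the spread of subsample slopes serves as an uncertainty estimate."""
    rng = random.Random(seed)
    k = max(2, int(frac * len(points)))
    slopes = [fit_line(rng.sample(points, k))[0] for _ in range(n_subsamples)]
    return statistics.mean(slopes), statistics.stdev(slopes)

# Synthetic monthly groundwater levels (m): slow decline of 0.02 m/month plus one outlier
data = [(t, 10.0 - 0.02 * t + (0.5 if t == 12 else 0.0)) for t in range(24)]
trend, trend_sd = subagging_trend(data)  # trend near -0.02 despite the outlier
```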

  9. Diagnostic accuracy of semi-quantitative and quantitative culture techniques for the diagnosis of catheter-related infections in newborns and molecular typing of isolated microorganisms.

    PubMed

    Riboli, Danilo Flávio Moraes; Lyra, João César; Silva, Eliane Pessoa; Valadão, Luisa Leite; Bentlin, Maria Regina; Corrente, José Eduardo; Rugolo, Ligia Maria Suppo de Souza; da Cunha, Maria de Lourdes Ribeiro de Souza

    2014-05-22

    Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. 
The semi-quantitative culture method showed higher
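The sensitivity and specificity figures quoted above follow the standard definitions over confirmed CR-BSI cases. A sketch with hypothetical confusion-matrix counts chosen only to reproduce the semi-quantitative technique's reported 72.7% / 95.7% values, not the study's actual tabulation:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Diagnostic accuracy of a test against a reference diagnosis, in %."""
    sens = 100.0 * tp / (tp + fn)  # true positives among all diseased
    spec = 100.0 * tn / (tn + fp)  # true negatives among all non-diseased
    return sens, spec

# Hypothetical counts for the semi-quantitative technique
sens, spec = sensitivity_specificity(tp=16, fn=6, tn=510, fp=23)
```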

  10. Extracting 3D Parametric Curves from 2D Images of Helical Objects.

    PubMed

    Willcocks, Chris G; Jackson, Philip T G; Nelson, Carl J; Obara, Boguslaw

    2017-09-01

    Helical objects occur in medicine, biology, cosmetics, nanotechnology, and engineering. Extracting a 3D parametric curve from a 2D image of a helical object has many practical applications, in particular being able to extract metrics such as tortuosity, frequency, and pitch. We present a method that is able to straighten the image object and derive a robust 3D helical curve from peaks in the object boundary. The algorithm has a small number of stable parameters that require little tuning, and the curve is validated against both synthetic and real-world data. The results show that the extracted 3D curve comes within close Hausdorff distance to the ground truth, and has near identical tortuosity for helical objects with a circular profile. Parameter insensitivity and robustness against high levels of image noise are demonstrated thoroughly and quantitatively.
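The Hausdorff-distance validation mentioned above compares the extracted curve to the ground truth as two sampled point sets: each direction takes the largest nearest-neighbor distance, and the symmetric distance is the maximum of the two. A brute-force sketch on two coarsely sampled helices (the sampling and offset are illustrative):

```python
import math

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two point sets (e.g. sampled 3D curves)."""
    def dist(p, q):
        return sum((x - y) ** 2 for x, y in zip(p, q)) ** 0.5
    def directed(u, v):
        # largest distance from a point of u to its nearest point in v
        return max(min(dist(p, q) for q in v) for p in u)
    return max(directed(a, b), directed(b, a))

# Ground-truth helix samples, and an "extracted" curve shifted by 0.1 along z
truth = [(math.cos(t), math.sin(t), 0.1 * t) for t in [i * 0.5 for i in range(20)]]
extracted = [(x, y, z + 0.1) for x, y, z in truth]
dist_to_truth = hausdorff(truth, extracted)  # ~0.1, the applied offset
```

The brute-force version is O(|a|*|b|); for dense samplings a KD-tree nearest-neighbor query would replace the inner minimum.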

  11. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  12. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    PubMed

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research.
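
    For the designs without a reference standard mentioned above, precision metrics such as the repeatability coefficient (from test-retest data) and Bland-Altman limits of agreement (between two algorithms) are standard tools. The sketch below is illustrative only; the paper's exact estimators are not given in the abstract, and the measurement values are hypothetical:

```python
import numpy as np

def repeatability_coefficient(test, retest):
    """Repeatability coefficient: the difference below which two repeated
    measurements on the same case fall with 95% probability,
    RC = 1.96 * sqrt(2) * within-subject SD."""
    d = np.asarray(test, float) - np.asarray(retest, float)
    within_subject_sd = np.sqrt(np.mean(d ** 2) / 2.0)
    return 1.96 * np.sqrt(2.0) * within_subject_sd

def bland_altman_limits(a, b):
    """95% limits of agreement between two algorithms' measurements."""
    d = np.asarray(a, float) - np.asarray(b, float)
    spread = 1.96 * d.std(ddof=1)
    return d.mean() - spread, d.mean() + spread

scan1 = [12.1, 8.4, 15.0, 9.7]   # hypothetical biomarker values, scan/rescan
scan2 = [11.9, 8.8, 14.6, 9.9]
print(round(repeatability_coefficient(scan1, scan2), 2))  # 0.62
```

    Both statistics characterize precision and agreement without requiring the true biomarker value, which is exactly the setting of the "no reference standard" designs.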

  13. Multiagent scheduling method with earliness and tardiness objectives in flexible job shops.

    PubMed

    Wu, Zuobao; Weng, Michael X

    2005-04-01

    Flexible job-shop scheduling problems are an important extension of the classical job-shop scheduling problems and present additional complexity, mainly because modern machines offer a considerable amount of overlapping capacity. Classical scheduling methods are generally incapable of addressing such capacity overlap. We propose a multiagent scheduling method with job earliness and tardiness objectives in a flexible job-shop environment. The earliness and tardiness objectives are consistent with the just-in-time production philosophy, which has attracted significant attention in both industry and the academic community. A new job-routing and sequencing mechanism is proposed. In this mechanism, two kinds of jobs are defined to distinguish jobs with one operation left from jobs with more than one operation left. Different criteria are proposed to route these two kinds of jobs. Job sequencing makes it possible to hold a job that would otherwise be completed too early. Two heuristic algorithms for job sequencing are developed to deal with these two kinds of jobs. The computational experiments show that the proposed multiagent scheduling method significantly outperforms the existing scheduling methods in the literature. In addition, the proposed method is quite fast: the simulation time to find a complete schedule with over 2000 jobs on ten machines is less than 1.5 min.
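
    The earliness/tardiness objective described above is, in its simplest form, a sum over jobs of penalized deviations of completion times from due dates. A minimal sketch, with hypothetical per-job penalty weights alpha and beta:

```python
def earliness_tardiness_cost(completion_times, due_dates, alpha=1.0, beta=1.0):
    """Just-in-time objective: sum over jobs of
    alpha * earliness + beta * tardiness, where
    earliness_j = max(d_j - C_j, 0) and tardiness_j = max(C_j - d_j, 0)."""
    return sum(alpha * max(d - c, 0.0) + beta * max(c - d, 0.0)
               for c, d in zip(completion_times, due_dates))

# Three jobs: one finishes 1 unit early, one on time, one 2 units late.
print(earliness_tardiness_cost([5, 9, 14], [6, 9, 12]))  # 3.0
```

    Unlike a pure tardiness objective, this cost also penalizes jobs that finish early, which is what motivates the hold-a-job mechanism in the sequencing heuristics.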

  14. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses.

    PubMed

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical, and fast way. Both linear qualitative and linear quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating, owing to the predominance of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; the linear quantitative similarity (LQTS), however, was able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was supported by quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by immediate detection and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard.

  15. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses

    PubMed Central

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical, and fast way. Both linear qualitative and linear quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating, owing to the predominance of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; the linear quantitative similarity (LQTS), however, was able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was supported by quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by immediate detection and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425
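
    The abstract does not give the LQLS/LQTS formulas. A plausible reading, following common fingerprint-similarity practice, is a pattern (cosine) similarity for the qualitative index and a content-scaled variant for the quantitative one; the sketch below is a hypothetical illustration of that distinction, not the paper's exact model:

```python
import numpy as np

def qualitative_similarity(sample, reference):
    """Cosine similarity of two fingerprint peak-area vectors: sensitive
    to the peak pattern but blind to the overall content level."""
    s = np.asarray(sample, float)
    r = np.asarray(reference, float)
    return float(s @ r / (np.linalg.norm(s) * np.linalg.norm(r)))

def quantitative_similarity(sample, reference):
    """Pattern similarity scaled by the total-content ratio, so a sample
    with the right pattern but half the content scores about 0.5."""
    s = np.asarray(sample, float)
    r = np.asarray(reference, float)
    return qualitative_similarity(s, r) * float(s.sum() / r.sum())

rfp = np.array([10.0, 30.0, 60.0])  # hypothetical RFP peak areas
diluted = rfp / 2.0                 # same pattern, half the content
print(round(qualitative_similarity(diluted, rfp), 3))   # 1.0
print(round(quantitative_similarity(diluted, rfp), 3))  # 0.5
```

    This reproduces the behavior the abstract describes: a qualitative index saturates when a few dominant compounds fix the pattern, while a quantitative index still separates samples that differ in total content.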

  16. Using quantitative and qualitative data in health services research - what happens when mixed method findings conflict? [ISRCTN61522618].

    PubMed

    Moffatt, Suzanne; White, Martin; Mackintosh, Joan; Howel, Denise

    2006-03-08

    In this methodological paper we document the interpretation of a mixed methods study and outline an approach to dealing with apparent discrepancies between qualitative and quantitative research data in a pilot study evaluating whether welfare rights advice has an impact on health and social outcomes among a population aged 60 and over. Quantitative and qualitative data were collected contemporaneously. Quantitative data were collected from 126 men and women aged over 60 within a randomised controlled trial. Participants received a full welfare benefits assessment which successfully identified additional financial and non-financial resources for 60% of them. A range of demographic, health and social outcome measures were assessed at baseline, 6, 12 and 24 month follow up. Qualitative data were collected from a sub-sample of 25 participants purposively selected to take part in individual interviews to examine the perceived impact of welfare rights advice. Separate analysis of the quantitative and qualitative data revealed discrepant findings. The quantitative data showed little evidence of significant differences of a size that would be of practical or clinical interest, suggesting that the intervention had no impact on these outcome measures. The qualitative data suggested wide-ranging impacts, indicating that the intervention had a positive effect. Six ways of further exploring these data were considered: (i) treating the methods as fundamentally different; (ii) exploring the methodological rigour of each component; (iii) exploring dataset comparability; (iv) collecting further data and making further comparisons; (v) exploring the process of the intervention; and (vi) exploring whether the outcomes of the two components match. The study demonstrates how using mixed methods can lead to different and sometimes conflicting accounts and, using this six step approach, how such discrepancies can be harnessed to interrogate each dataset more fully. Not only does this

  17. Qualification of a Quantitative Method for Monitoring Aspartate Isomerization of a Monoclonal Antibody by Focused Peptide Mapping.

    PubMed

    Cao, Mingyan; Mo, Wenjun David; Shannon, Anthony; Wei, Ziping; Washabaugh, Michael; Cash, Patricia

    Aspartate (Asp) isomerization is a common post-translational modification of recombinant therapeutic proteins that can occur during manufacturing, storage, or administration. Asp isomerization in the complementarity-determining regions of a monoclonal antibody may affect target binding, and thus a sufficiently robust quality control method for routine monitoring is desirable. In this work, we utilized a liquid chromatography-mass spectrometry (LC/MS)-based approach to identify the Asp isomerization in the complementarity-determining regions of a therapeutic monoclonal antibody. To quantitate the site-specific Asp isomerization of the monoclonal antibody, a UV detection-based quantitation assay utilizing the same LC platform was developed. The assay was qualified and implemented for routine monitoring of this product-specific modification. Compared with existing methods, this analytical paradigm is applicable to identify Asp isomerization (or other modifications) and subsequently develop a rapid, sufficiently robust quality control method for routine site-specific monitoring and quantitation to ensure product quality. This approach first identifies and locates a product-related impurity (a critical quality attribute) caused by isomerization, deamidation, oxidation, or other post-translational modifications, and then utilizes synthetic peptides and MS to assist the development of an LC-UV-based chromatographic method that separates and quantifies the product-related impurities by UV peaks. The established LC-UV method has acceptable peak specificity, precision, linearity, and accuracy; it can be validated and used in a good manufacturing practice environment for lot release and stability testing. Aspartate isomerization is a common post-translational modification of recombinant proteins during the manufacturing process and storage. Isomerization in the complementarity-determining regions (CDRs) of a monoclonal antibody A (mAb-A) has been detected and has been shown to
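
    Once the native and isomerized peptide peaks are chromatographically resolved, site-specific quantitation by UV reduces to a ratio of integrated peak areas. A minimal sketch (the area values are hypothetical):

```python
def percent_isomerized(area_iso, area_native):
    """Site-specific isomerization level: the isomerized peptide's UV peak
    area as a percentage of the combined native + isomerized peak areas."""
    return 100.0 * area_iso / (area_iso + area_native)

# Hypothetical integrated UV peak areas for one CDR peptide:
print(percent_isomerized(area_iso=2.5, area_native=97.5))  # 2.5
```

    Reporting the modified fraction rather than an absolute area makes the result insensitive to injection amount, which is part of why a UV area-percent readout is practical for lot release and stability testing.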

  18. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are among the drugs most frequently screened for in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because benzodiazepine concentrations in biological samples can vary with bleeding, postmortem changes, and redistribution, potentially biasing forensic examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane:ethyl acetate and subsequent detection by a high-performance liquid chromatography method coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear calibration curve for each drug was obtained within the range of 30–3000 ng/mL with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peak due to interfering substances was observed in the samples. Conclusion: The present method was selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in a forensic toxicology laboratory. PMID:27635251
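
    Linearity, LOD, and LOQ figures of the kind reported above can be derived from a calibration series by ordinary least squares. The ICH convention (LOD = 3.3·σ/slope, LOQ = 10·σ/slope, with σ the residual standard deviation) is assumed here, since the abstract does not state which estimator was used, and the calibration data are synthetic:

```python
import numpy as np

def calibration(conc, response):
    """Fit a least-squares calibration line and estimate LOD/LOQ from the
    residual standard deviation (ICH: LOD = 3.3*sd/slope, LOQ = 10*sd/slope)."""
    conc = np.asarray(conc, float)
    response = np.asarray(response, float)
    slope, intercept = np.polyfit(conc, response, 1)
    residual_sd = (response - (slope * conc + intercept)).std(ddof=2)
    return slope, intercept, 3.3 * residual_sd / slope, 10.0 * residual_sd / slope

# Hypothetical calibration series over the reported 30-3000 ng/mL range:
conc = [30.0, 100.0, 300.0, 1000.0, 3000.0]   # ng/mL
resp = [65.0, 205.0, 605.0, 2005.0, 6005.0]   # idealized detector response
slope, intercept, lod, loq = calibration(conc, resp)
```

    With real chromatographic data the residuals are nonzero, and the same two lines of algebra yield the LOD/LOQ directly from the fit.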

  19. Practicable methods for histological section thickness measurement in quantitative stereological analyses.

    PubMed

    Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger; Blutke, Andreas

    2018-01-01

    The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measuring histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness values due to unintentional, slightly oblique (non-orthogonal) positioning of the re-embedded sample section. Here, an improved ORE method is presented, allowing for determination of the factual section plane angle of the re-embedded section and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil) and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated, using basic geometry, from the deviation of the measured thickness of the calibration foil from its factual thickness. To find a practicable, fast, and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determining plastic section thicknesses was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm semi-thin Epon (glycid ether) sections and of 1-3 μm plastic sections (glycolmethacrylate/methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could be measured precisely within a few seconds. Compared with the section thicknesses determined by ORE, SR measurements deviated by less than 1%. Our results prove the applicability

  20. Practicable methods for histological section thickness measurement in quantitative stereological analyses

    PubMed Central

    Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger

    2018-01-01

    The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measuring histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness values due to unintentional, slightly oblique (non-orthogonal) positioning of the re-embedded sample section. Here, an improved ORE method is presented, allowing for determination of the factual section plane angle of the re-embedded section and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil) and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated, using basic geometry, from the deviation of the measured thickness of the calibration foil from its factual thickness. To find a practicable, fast, and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determining plastic section thicknesses was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm semi-thin Epon (glycid ether) sections and of 1–3 μm plastic sections (glycolmethacrylate/methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could be measured precisely within a few seconds. Compared with the section thicknesses determined by ORE, SR measurements deviated by less than 1%. Our results prove the applicability
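
    The angle correction described above follows from simple trigonometry: a cut tilted by θ from orthogonal inflates every apparent thickness by 1/cos(θ), so a calibration foil of known thickness yields both the tilt angle and the correction factor. A sketch with hypothetical measurements:

```python
import math

def corrected_section_thickness(section_meas, foil_meas, foil_true):
    """Correct an apparent section thickness for oblique (non-orthogonal)
    re-embedding: apparent = true / cos(theta), so the ratio of the
    calibration foil's true to measured thickness is cos(theta)."""
    cos_theta = foil_true / foil_meas  # <= 1 for an oblique cut
    theta_deg = math.degrees(math.acos(cos_theta))
    return section_meas * cos_theta, theta_deg

# Hypothetical values: a 25.0 um foil measures 26.0 um, so the cut is oblique.
thickness, angle = corrected_section_thickness(
    section_meas=1.10, foil_meas=26.0, foil_true=25.0)
print(round(thickness, 4), round(angle, 1))  # 1.0577 15.9
```

    The same correction factor (foil_true / foil_meas) applies to every thickness measured in that re-embedded block, since section and foil share the one section plane.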