12 CFR 614.4362 - Loan and lease concentration risk mitigation policy.
Code of Federal Regulations, 2012 CFR
2012-01-01
... include: (1) A purpose and objective; (2) Clearly defined and consistently used terms; (3) Quantitative... exceptions and reporting requirements. (b) Quantitative methods. (1) At a minimum, the quantitative methods...
Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C
2015-04-13
Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective way to evaluate these systems is to compare their performance on the end task required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e., if we have a gold standard. However, this gold standard is very rarely known in human studies. Thus, no-gold-standard techniques are required to optimize and evaluate systems and algorithms in the absence of a gold standard. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.
A further component analysis for illicit drugs mixtures with THz-TDS
NASA Astrophysics Data System (ADS)
Xiong, Wei; Shen, Jingling; He, Ting; Pan, Rui
2009-07-01
A new method for quantitative analysis of mixtures of illicit drugs with THz time-domain spectroscopy was proposed and verified experimentally. The traditional method requires the fingerprints of all the pure chemical components in a mixture; in practice, however, only the objective components and their absorption features are known, so a more practical technique for detection and identification is needed. Our new method for quantitative inspection of illicit-drug mixtures is based on the derivative spectrum: the ratio of the objective components in a mixture can be obtained on the assumption that the objective components and their absorption features are known, while the unknown components need not be. Methamphetamine and flour, an illicit drug and a common adulterant, were selected for the experiment. The experimental results verified the effectiveness of the method, suggesting that it could serve for quantitative identification of illicit drugs. This THz spectroscopy technique is of great significance for real-world quantitative analysis of illicit drugs and could be an effective method in security and pharmaceutical inspection.
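The derivative-spectrum idea described in the abstract can be sketched numerically: differentiating suppresses a smooth, unknown baseline (e.g. an adulterant with no sharp THz features), so a least-squares fit of the objective component's derivative fingerprint recovers its mixing ratio. The spectra and feature positions below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical absorbance spectra on a common THz frequency grid (a.u.).
# Only the objective component's fingerprint is needed; the first
# derivative suppresses the slowly varying baseline from unknown adulterants.
freqs = np.linspace(0.2, 2.6, 200)                    # THz
comp_a = np.exp(-((freqs - 1.2) / 0.05) ** 2)         # sharp feature (drug)
baseline = 0.3 + 0.1 * freqs                          # smooth unknown background

mixture = 0.6 * comp_a + baseline                     # synthetic mixture spectrum

# First derivatives with respect to frequency.
d_mix = np.gradient(mixture, freqs)
d_a = np.gradient(comp_a, freqs)

# Least-squares coefficient of the objective component's derivative
# fingerprint in the mixture derivative; the linear baseline contributes
# a constant that averages out against the zero-mean Gaussian derivative.
ratio = np.dot(d_a, d_mix) / np.dot(d_a, d_a)
print(round(ratio, 2))  # recovers the true mixing fraction 0.6
```

The same fit extends to several objective components by stacking their derivative fingerprints as columns of a design matrix and solving a small least-squares system.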
NASA Astrophysics Data System (ADS)
Li, Shunhe; Rao, Jianhua; Gui, Lin; Zhang, Weimin; Liu, Degang
2017-11-01
The result of remanufacturing evaluation is the basis for judging whether a heavy-duty machine tool can be remanufactured at the end-of-life (EOL) stage of the machine tool lifecycle. The objectivity and accuracy of the evaluation are the key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy-duty machine tool remanufacturing, and the results are modified by the comprehensive adjustment method, which brings the evaluation results into line with conventional human judgment. A quantitative evaluation model for heavy-duty machine tools was established using the catastrophe progression method and applied to evaluate the remanufacturing of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.
Assessment and monitoring of forest ecosystem structure
Oscar A. Aguirre Calderón; Javier Jiménez Pérez; Horst Kramer
2006-01-01
Characterization of forest ecosystems structure must be based on quantitative indices that allow objective analysis of human influences or natural succession processes. The objective of this paper is the compilation of diverse quantitative variables to describe structural attributes from the arboreal stratum of the ecosystem, as well as different methods of forest...
Quantitative Analysis of Qualitative Information from Interviews: A Systematic Literature Review
ERIC Educational Resources Information Center
Fakis, Apostolos; Hilliam, Rachel; Stoneley, Helen; Townend, Michael
2014-01-01
Background: A systematic literature review was conducted on mixed methods area. Objectives: The overall aim was to explore how qualitative information from interviews has been analyzed using quantitative methods. Methods: A contemporary review was undertaken and based on a predefined protocol. The references were identified using inclusion and…
DOT National Transportation Integrated Search
2017-06-01
The objective of this study was to develop an objective, quantitative method for evaluating damage to bridge girders by using artificial neural networks (ANNs). This evaluation method, which is a supplement to visual inspection, requires only the res...
An iterative method for near-field Fresnel region polychromatic phase contrast imaging
NASA Astrophysics Data System (ADS)
Carroll, Aidan J.; van Riessen, Grant A.; Balaur, Eugeniu; Dolbnya, Igor P.; Tran, Giang N.; Peele, Andrew G.
2017-07-01
We present an iterative method for polychromatic phase contrast imaging that is suitable for broadband illumination and which allows for the quantitative determination of the thickness of an object given the refractive index of the sample material. Experimental and simulation results suggest the iterative method provides comparable image quality and quantitative object thickness determination when compared to the analytical polychromatic transport of intensity and contrast transfer function methods. The ability of the iterative method to work over a wider range of experimental conditions means the iterative method is a suitable candidate for use with polychromatic illumination and may deliver more utility for laboratory-based x-ray sources, which typically have a broad spectrum.
NASA Technical Reports Server (NTRS)
Pepper, Stephen V.
1995-01-01
A grazing angle objective on an infrared microspectrometer is studied for quantitative spectroscopy by considering the angular dependence of the incident intensity within the objective's angular aperture. The assumption that there is no angular dependence is tested by comparing the experimental reflectance of Si and KBr surfaces with the reflectance calculated by integrating the Fresnel reflection coefficient over the angular aperture under this assumption. Good agreement was found, indicating that the specular reflectance of surfaces can be quantitatively integrated over the angular aperture in a straightforward way, without accounting for non-uniform incident intensity. This quantitative approach is applied to the thickness determination of dip-coated Krytox on gold. The infrared optical constants of both materials are known, allowing the integration to be carried out. The thickness obtained is in fair agreement with the value determined by ellipsometry in the visible. This paper therefore illustrates a method for more quantitative use of a grazing angle objective for infrared reflectance microspectroscopy.
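The aperture integration the abstract describes can be sketched as follows: average the unpolarized Fresnel power reflectance over the objective's range of incidence angles, assuming uniform incident intensity across the aperture. The refractive index and the 65-85 degree aperture below are illustrative values, not the paper's.

```python
import numpy as np

def fresnel_R(theta, n):
    """Unpolarized Fresnel power reflectance of a bare dielectric surface
    at incidence angle theta (radians), external medium index 1."""
    cos_i = np.cos(theta)
    sin_t = np.sin(theta) / n          # Snell's law
    cos_t = np.sqrt(1.0 - sin_t**2)
    rs = (cos_i - n * cos_t) / (cos_i + n * cos_t)   # s-polarization
    rp = (n * cos_i - cos_t) / (n * cos_i + cos_t)   # p-polarization
    return 0.5 * (rs**2 + rp**2)

# Grazing-angle objective: incidence angles spanning, say, 65..85 degrees,
# sampled uniformly; the mean over a uniform grid approximates the
# aperture-averaged reflectance under the uniform-intensity assumption.
thetas = np.radians(np.linspace(65.0, 85.0, 401))
R_avg = fresnel_R(thetas, n=1.5).mean()
print(f"aperture-averaged reflectance: {R_avg:.3f}")
```

With complex optical constants (as for Krytox on gold), the same loop would use the complex Fresnel coefficients and a thin-film transfer matrix in place of `fresnel_R`.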
[Reconstituting evaluation methods based on both qualitative and quantitative paradigms].
Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro
2011-01-01
Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has made the concepts obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. Many evaluation methods based on qualitative paradigms have been used in quantitative research, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.
Jha, Abhinav K; Caffo, Brian; Frey, Eric C
2016-01-01
The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest.
Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation. PMID:26982626
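The figure of merit in this abstract is simple once the NGS fit is done. The technique models each method's measurement as a linear function of the unknown true value, hat_a = a + b*true + noise with standard deviation sigma, and estimates (a, b, sigma) without truth; methods are then ranked by NSR = sigma / b (lower is more precise). The method names and fitted values below are invented to illustrate the ranking step only, not the NGS estimation itself.

```python
# Figure-of-merit sketch for the no-gold-standard (NGS) ranking step.
# Each method's measurement model: hat_a = a + b * true + noise(sd = sigma).
# Precision ranking uses the noise-to-slope ratio, NSR = sigma / b.
def noise_to_slope_ratio(sigma, slope):
    return sigma / slope

# Hypothetical fitted (sigma, slope) pairs for three reconstruction methods.
fits = {
    "OSEM": (0.9, 1.10),
    "FBP":  (1.4, 0.95),
    "MAP":  (0.7, 1.05),
}

# Sort methods from most to least precise (smallest NSR first).
ranking = sorted(fits, key=lambda m: noise_to_slope_ratio(*fits[m]))
print(ranking)  # ['MAP', 'OSEM', 'FBP']
```

Note that NSR compares precision only; the intercept a and slope b would separately characterize each method's bias.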
Quantitative endoscopy: initial accuracy measurements.
Truitt, T O; Adelman, R A; Kelly, D H; Willging, J P
2000-02-01
The geometric optics of an endoscope can be used to determine the absolute size of an object in an endoscopic field without knowing the actual distance from the object. This study explores the accuracy of a technique that estimates absolute object size from endoscopic images. Quantitative endoscopy involves calibrating a rigid endoscope to produce size estimates from 2 images taken with a known traveled distance between them. The heights of 12 samples, ranging in size from 0.78 to 11.80 mm, were estimated with this calibrated endoscope. Backup distances of 5 mm and 10 mm were used for comparison. The mean percent error for all estimated measurements compared with the actual object sizes was 1.12%. The mean errors for 5-mm and 10-mm backup distances were 0.76% and 1.65%, respectively. The mean errors for objects <2 mm and > or =2 mm were 0.94% and 1.18%, respectively. Quantitative endoscopy estimates object size from endoscopic images to within 5% of the actual size. This method remains promising for quantitatively evaluating object size from endoscopic images. It does not require knowledge of the absolute distance of the endoscope from the object; rather, it requires only the distance traveled by the endoscope between images.
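The two-image geometry behind this technique can be sketched with a pinhole model (our notation, not the paper's): apparent size scales as s = k*H/z, so two images separated by a known backup distance d give two equations that eliminate the unknown working distance z. The calibration constant and sample values below are invented.

```python
# Two-image size estimation for a calibrated rigid endoscope.
# Pinhole model: apparent size s (pixels) = k * H / z, where H is the
# true object height (mm), z the working distance (mm), and k a
# calibration constant (pixels*mm per mm of object height).
def object_height(s1, s2, d, k):
    """s1: apparent size at the closer position; s2: apparent size after
    backing up by d (mm). Returns the estimated true height H in mm."""
    z = d * s2 / (s1 - s2)    # solve the two pinhole equations for z
    return s1 * z / k         # then recover H from the first image

# Consistency check with synthetic values: H = 5 mm, z = 20 mm, k = 100.
s1 = 100 * 5 / 20             # 25 px at z = 20 mm
s2 = 100 * 5 / 30             # ~16.7 px after backing up d = 10 mm
print(round(object_height(s1, s2, d=10, k=100), 2))  # 5.0
```

Because s1 - s2 appears in a denominator, small apparent-size differences (short backup distances or distant objects) amplify measurement noise, consistent with the error figures reported above varying with backup distance.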
A GIS-BASED METHOD FOR MULTI-OBJECTIVE EVALUATION OF PARK VEGETATION. (R824766)
In this paper we describe a method for evaluating the concordance between a set of mapped landscape attributes and a set of quantitatively expressed management priorities. The method has proved to be useful in planning urban green areas, allowing objectively d...
Quantitative Information Differences Between Object-Person Presentation Methods
ERIC Educational Resources Information Center
Boyd, J. Edwin; Perry, Raymond P.
1972-01-01
Subjects used significantly more adjectives on an adjective checklist (ACL) in giving their impressions of an object-person following written and audiovisual presentations than following audio presentations. (SD)
Creating objects and object categories for studying perception and perceptual learning.
Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay
2012-11-02
In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties. Many innovative and useful methods currently exist for creating novel objects and object categories (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection. 
Objects and object categories created by these simulations can be further manipulated by various morphing methods to generate systematic variations of shape characteristics. The VP and morphing methods can also be applied, in principle, to novel virtual objects other than digital embryos, or to virtual versions of real-world objects. Virtual objects created in this fashion can be rendered as visual images using a conventional graphical toolkit, with desired manipulations of surface texture, illumination, size, viewpoint and background. The virtual objects can also be 'printed' as haptic objects using a conventional 3-D prototyper. We also describe some implementations of these computational algorithms to help illustrate the potential utility of the algorithms. It is important to distinguish the algorithms from their implementations. The implementations are demonstrations offered solely as a 'proof of principle' of the underlying algorithms. It is important to note that, in general, an implementation of a computational algorithm often has limitations that the algorithm itself does not have. Together, these methods represent a set of powerful and flexible tools for studying object recognition and perceptual learning by biological and computational systems alike. With appropriate extensions, these methods may also prove useful in the study of morphogenesis and phylogenesis.
ERIC Educational Resources Information Center
Yeni, Sabiha; Ozdener, Nesrin
2014-01-01
The purpose of the study is to investigate how pre-service teachers benefit from learning objects repositories while preparing course content. Qualitative and quantitative data collection methods were used in a mixed methods approach. This study was carried out with 74 teachers from the Faculty of Education. In the first phase of the study,…
Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana
2014-01-01
Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282
Normalized Temperature Contrast Processing in Infrared Flash Thermography
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2016-01-01
The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, along with methods of converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves a flash thermography data acquisition set-up with a high-reflectivity foil and high-emissivity tape such that the foil, tape, and test object are imaged simultaneously. Methods of assessing other quantitative parameters, such as the emissivity of the object, afterglow heat flux, reflection temperature change, and surface temperature during flash thermography, are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.
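A minimal sketch of the normalized contrast idea (our notation, not NASA's processing chain): each pixel's post-flash decay is compared against a defect-free reference region, C(t) = (I(t) - I_ref(t)) / I_ref(t), so a flaw that slows local cooling shows up as positive contrast. The synthetic decay model and array sizes below are illustrative.

```python
import numpy as np

def normalized_contrast(frames, ref_mask):
    """Normalized pixel-intensity contrast for a flash thermography stack.
    frames: (T, H, W) intensity frames after the flash.
    ref_mask: (H, W) bool array marking a defect-free reference region."""
    ref = frames[:, ref_mask].mean(axis=1)          # (T,) reference decay
    return (frames - ref[:, None, None]) / ref[:, None, None]

# Synthetic check: one pixel cools 10% slower than the sound area.
t = np.arange(1, 6, dtype=float)
sound = 1.0 / np.sqrt(t)                            # idealized 1/sqrt(t) decay
frames = np.tile(sound[:, None, None], (1, 4, 4))
frames[:, 2, 2] = 1.1 * sound                       # slower-cooling 'flaw'
mask = np.ones((4, 4), dtype=bool)
mask[2, 2] = False                                  # exclude flaw from reference
C = normalized_contrast(frames, mask)
print(round(C[-1, 2, 2], 2))  # 0.1
```

The temperature-contrast variant described in the abstract follows the same form once the foil and tape measurements have been used to convert pixel intensities to surface temperatures.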
Quantitative phase retrieval with arbitrary pupil and illumination
Claus, Rene A.; Naulleau, Patrick P.; Neureuther, Andrew R.; ...
2015-10-02
We present a general algorithm for combining measurements taken under various illumination and imaging conditions to quantitatively extract the amplitude and phase of an object wave. The algorithm uses the weak object transfer function, which incorporates arbitrary pupil functions and partially coherent illumination. The approach is extended beyond the weak object regime using an iterative algorithm. Finally, we demonstrate the method on measurements of Extreme Ultraviolet Lithography (EUV) multilayer mask defects taken in an EUV zone plate microscope with both a standard zone plate lens and a zone plate implementing Zernike phase contrast.
Evaluation of background parenchymal enhancement on breast MRI: a systematic review
Signori, Alessio; Valdora, Francesca; Rossi, Federica; Calabrese, Massimo; Durando, Manuela; Mariscotto, Giovanna; Tagliafico, Alberto
2017-01-01
Objective: To perform a systematic review of the methods used for background parenchymal enhancement (BPE) evaluation on breast MRI. Methods: Studies dealing with BPE assessment on breast MRI were retrieved from major medical libraries independently by four reviewers up to 6 October 2015. The keywords used for database searching are “background parenchymal enhancement”, “parenchymal enhancement”, “MRI” and “breast”. The studies were included if qualitative and/or quantitative methods for BPE assessment were described. Results: Of the 420 studies identified, a total of 52 articles were included in the systematic review. 28 studies performed only a qualitative assessment of BPE, 13 studies performed only a quantitative assessment and 11 studies performed both qualitative and quantitative assessments. A wide heterogeneity was found in the MRI sequences and in the quantitative methods used for BPE assessment. Conclusion: A wide variability exists in the quantitative evaluation of BPE on breast MRI. More studies focused on a reliable and comparable method for quantitative BPE assessment are needed. Advances in knowledge: More studies focused on a quantitative BPE assessment are needed. PMID:27925480
Rakesh Minocha; Stephanie Long
2004-01-01
The objective of the present study was to develop a rapid HPLC method for simultaneous separation and quantitation of dansylated amino acids and common polyamines in the same matrix for analyzing forest tree tissues and cell cultures. The major modifications incorporated into this method as compared to previously published HPLC methods for separation of only dansyl...
Phase retrieval with the reverse projection method in the presence of object's scattering
NASA Astrophysics Data System (ADS)
Wang, Zhili; Gao, Kun; Wang, Dajiang
2017-08-01
X-ray grating interferometry can provide substantially increased contrast over traditional attenuation-based techniques in biomedical applications, and therefore novel and complementary information. Recently, special attention has been paid to quantitative phase retrieval in X-ray grating interferometry, which is mandatory to perform phase tomography, achieve material identification, etc. An innovative approach, dubbed "Reverse Projection" (RP), has been developed for quantitative phase retrieval. The RP method abandons grating scanning completely and is thus advantageous in terms of higher efficiency and reduced radiation damage. It is therefore expected that this novel method will find applications in preclinical and clinical settings. Strictly speaking, the reverse projection method is applicable only to objects exhibiting absorption and refraction. In this contribution, we discuss phase retrieval with the reverse projection method for general objects exhibiting absorption, refraction, and scattering simultaneously. In particular, we investigate the influence of the object's scattering on the retrieved refraction signal. Both theoretical analysis and numerical experiments are performed. The results show that, for small signals, the retrieved refraction signal is the product of the object's refraction and scattering signals. In the case of strong scattering, the reverse projection method cannot provide reliable phase retrieval. These results will guide the use of the reverse projection method in future practical applications and help to explain some possible artifacts in the retrieved images and/or reconstructed slices.
Nonparticipatory Stiffness in the Male Perioral Complex
ERIC Educational Resources Information Center
Chu, Shin-Ying; Barlow, Steven M.; Lee, Jaehoon
2009-01-01
Purpose: The objective of this study was to extend previously published findings from the authors' laboratory using a new automated technology to quantitatively characterize nonparticipatory perioral stiffness in healthy male adults. Method: Quantitative measures of perioral stiffness were sampled during a nonparticipatory task using a…
ERIC Educational Resources Information Center
Metos, Julie; Gren, Lisa; Brusseau, Timothy; Moric, Endi; O'Toole, Karen; Mokhtari, Tahereh; Buys, Saundra; Frost, Caren
2018-01-01
Objective: The objective of this study was to understand adolescent girls' experiences using practical diet and physical activity measurement tools and to explore the food and physical activity settings that influence their lifestyle habits. Design: Mixed methods study using quantitative and qualitative methods. Setting: Large city in the western…
Decision support intended to improve ecosystem sustainability requires that we link stakeholder priorities directly to quantitative tools and measures of desired outcomes. Actions taken at the community level can have large impacts on production and delivery of ecosystem service...
Quantitative method of medication system interface evaluation.
Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F
2007-01-01
The objective of this study was to develop a quantitative method of evaluating the user interface of medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of the estimated failure rates provided quantitative data for the fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
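The aggregation step above (means of estimated failure rates feeding a simplified fault tree) can be sketched in a few lines. The step names and the estimates are entirely hypothetical, standing in for the nurses' ratings described in the abstract:

```python
from statistics import mean

# Hypothetical per-step failure-rate estimates (0-1 scale), one list entry
# per rater; these stand in for the nurses' estimates in the study.
estimates = {
    "select_patient":  [0.02, 0.05, 0.03],
    "scan_medication": [0.20, 0.15, 0.25],
    "confirm_dose":    [0.10, 0.08, 0.12],
}

# The mean estimated failure rate per step is the quantitative input to the
# simplified fault tree; ranking the means points at the interface steps
# most in need of redesign.
step_rates = {step: mean(v) for step, v in estimates.items()}
worst_step = max(step_rates, key=step_rates.get)
```

Comparing `step_rates` across prototypes gives the kind of quantitative comparison the method is designed for.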
A Study of Dim Object Detection for the Space Surveillance Telescope
2013-03-21
ENG-13-M-32 Abstract Current methods of dim object detection for space surveillance make use of a Gaussian log-likelihood-ratio-test-based...quantitatively comparing the efficacy of two methods for dim object detection , termed in this paper the point detector and the correlator, both of which rely... applications . It is used in national defense for detecting satellites. It is used to detecting space debris, which threatens both civilian and
Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim
2015-01-01
Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry-mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is visualized either as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then, in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. The advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
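A minimal sketch of the DPDs idea: subtract consecutive quantitative phase images and convert the per-pixel change to dry mass in picograms. The wavelength, the specific refraction increment, and the helper name `dynamic_phase_difference` are assumptions for illustration, not values from the paper:

```python
import numpy as np

# Assumed constants (not from the abstract): illumination wavelength in um
# and the specific refraction increment commonly used to convert optical
# path to cellular dry mass (~0.18 um^3 per pg).
WAVELENGTH_UM = 0.65
ALPHA_UM3_PER_PG = 0.18

def dynamic_phase_difference(phase_prev, phase_next, pixel_area_um2):
    """Subtract the antecedent phase image from the subsequent one and
    convert the per-pixel change to dry mass in picograms. Positive values
    are mass gained at a pixel, negative values mass lost."""
    dphi = np.asarray(phase_next) - np.asarray(phase_prev)   # rad
    opd_um = dphi * WAVELENGTH_UM / (2.0 * np.pi)            # optical path change
    return opd_um * pixel_area_um2 / ALPHA_UM3_PER_PG        # pg per pixel

# One pixel gains a full 2*pi of phase between frames (1 um^2 pixels):
prev = np.zeros((4, 4))
nxt = np.zeros((4, 4))
nxt[2, 2] = 2.0 * np.pi
dpd = dynamic_phase_difference(prev, nxt, pixel_area_um2=1.0)
```

Summing `dpd` over a cell mask would give the net mass change per time step, i.e. the "time dependence of changes quantified in picograms" described above.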
Micro-vibration detection with heterodyne holography based on time-averaged method
NASA Astrophysics Data System (ADS)
Qin, XiaoDong; Pan, Feng; Chen, ZongHui; Hou, XueQin; Xiao, Wen
2017-02-01
We propose a micro-vibration detection method by introducing heterodyne interferometry into time-averaged holography. This method compensates for the deficiency of time-averaged holography in quantitative measurements and effectively widens its range of application. Acousto-optic modulators are used to modulate the frequencies of the reference beam and the object beam. Accurate detection of the maximum amplitude of each point in the vibration plane is performed by altering the frequency difference of the two beams. The range of amplitude detection of plane vibration is thus extended. In the stable vibration mode, the distribution of the maximum amplitude of each point is measured and the fitted curves are plotted. Hence the plane vibration mode of the object is demonstrated intuitively and detected quantitatively. We analyzed the method in theory and built an experimental system with a sine signal as the excitation source and a typical piezoelectric ceramic plate as the target. The experimental results indicate that, within a certain error range, the detected vibration mode agrees with the intrinsic vibration characteristics of the object, thus proving the validity of the method.
ERIC Educational Resources Information Center
Castillo, Enrico G.; Pincus, Harold A.; Wieland, Melissa; Roter, Debra; Larson, Susan; Houck, Patricia; Reynolds, Charles F.; Cruz, Mario
2012-01-01
Objective: The authors quantitatively examined differences in psychiatric residents' and attending physicians' communication profiles and voice tones. Methods: Audiotaped recordings of 49 resident-patient and 35 attending-patient medication-management appointments at four ambulatory sites were analyzed with the Roter Interaction Analysis System…
Quantitative method for measuring heat flux emitted from a cryogenic object
Duncan, Robert V.
1993-01-01
The present invention is a quantitative method for measuring the total heat flux, and of deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infra-red sensing devices.
Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc
2017-05-01
Uveitis is one of the fields in ophthalmology in which a tremendous evolution has taken place over the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but, in parallel, precise and quantitative measurement methods were developed, allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, rendering possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation provides exquisite security in monitoring uveitis. What is enigmatic, however, is the slow pace at which these improvements are being integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters, such as clinical evaluation of vitreous haze, as a main endpoint, whereas a whole array of precise, quantitative, and objective modalities is available for the design of clinical studies. The scope of this work was to review the quantitative investigations that have improved the management of uveitis in the past 2-3 decades.
Computer simulation of schlieren images of rotationally symmetric plasma systems: a simple method.
Noll, R; Haas, C R; Weikl, B; Herziger, G
1986-03-01
Schlieren techniques are commonly used for quantitative analysis of cylindrical or spherical index-of-refraction profiles. Many schlieren objects, however, are characterized by more complex geometries, so we have investigated the more general case of noncylindrical, rotationally symmetric distributions of the index of refraction n(r,z). Assuming straight ray paths in the schlieren object, we have calculated 2-D beam deviation profiles. It is shown that experimental schlieren images of the noncylindrical plasma generated by a plasma focus device can be simulated with these deviation profiles. The computer simulation allows a quantitative analysis of these schlieren images, which yields, for example, the plasma parameters, electron density, and electron density gradients.
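The straight-ray deviation calculation mentioned above can be sketched with the standard small-angle form: the deflection angle is the path integral of the transverse index gradient along the (assumed straight) ray. The discretization below and the function name are illustrative, not the authors' implementation:

```python
import numpy as np

def deviation_profile(n, dy, dx):
    """Straight-ray schlieren deviation sketch. Assumption: deflections are
    small, so each ray travels along a straight horizontal line through the
    (Ny, Nx) refractive-index slice n. The returned epsilon(y) is the
    transverse deflection angle (rad) for the ray entering at row y,
    approximating the integral of dn/dy along x with a Riemann sum."""
    dn_dy = np.gradient(n, dy, axis=0)
    return dn_dy.sum(axis=1) * dx

# Index slab with a uniform vertical gradient k: epsilon should be k * L,
# where L is the traversed path length.
k, dy, dx, Ny, Nx = 1e-4, 0.01, 0.01, 50, 101
y = (np.arange(Ny) * dy)[:, None]
n = 1.0 + k * y + np.zeros((1, Nx))
eps = deviation_profile(n, dy, dx)
```

For a rotationally symmetric n(r,z), evaluating such sums over a grid of entry points yields the 2-D deviation profiles used to simulate the schlieren images.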
Two schemes for quantitative photoacoustic tomography based on Monte Carlo simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yubin; Yuan, Zhen, E-mail: zhenyuan@umac.mo
Purpose: The aim of this study was to develop novel methods for photoacoustically determining the optical absorption coefficient of biological tissues using Monte Carlo (MC) simulation. Methods: In this study, the authors propose two quantitative photoacoustic tomography (PAT) methods for mapping the optical absorption coefficient. The reconstruction methods combine conventional PAT with MC simulation in a novel way to determine the optical absorption coefficient of biological tissues or organs. Specifically, the authors' two schemes were theoretically and experimentally examined using simulations, tissue-mimicking phantoms, ex vivo, and in vivo tests. In particular, the authors explored these methods using several objects with different absorption contrasts embedded in turbid media and by using high-absorption media when the diffusion approximation was not effective at describing the photon transport. Results: The simulations and experimental tests showed that the reconstructions were quantitatively accurate in terms of the locations, sizes, and optical properties of the targets. The positions of the recovered targets were assessed by the property profiles, where the authors discovered that the off-center error was less than 0.1 mm for the circular target. Meanwhile, the sizes and quantitative optical properties of the targets were quantified by estimating the full width at half maximum of the optical absorption property. Interestingly, for the reconstructed sizes, the authors discovered that the errors ranged from 0 for relatively small-size targets to 26% for relatively large-size targets, whereas for the recovered optical properties, the errors ranged from 0% to 12.5% for different cases. Conclusions: The authors found that their methods can quantitatively reconstruct absorbing objects of different sizes and optical contrasts even when the diffusion approximation is unable to accurately describe the photon propagation in biological tissues. In particular, their methods are able to resolve the intrinsic difficulties that occur when quantitative PAT is conducted by combining conventional PAT with the diffusion approximation or with radiation transport modeling.
A quantitative method for evaluating alternatives. [aid to decision making
NASA Technical Reports Server (NTRS)
Forthofer, M. J.
1981-01-01
When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called the 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
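A hierarchical weighted average can be sketched directly: each criterion is either a leaf scored for an alternative, or an internal node whose score is the weighted average of its children. The tree encoding and the example criteria below are assumptions for illustration; the report defines its own structure:

```python
def hwa(tree, scores):
    """Hierarchical weighted average. 'tree' maps criterion name to
    (weight, subtree); subtree is None for a leaf, which is scored directly
    from 'scores'. Weights within each level are assumed to sum to 1."""
    total = 0.0
    for name, (weight, subtree) in tree.items():
        value = scores[name] if subtree is None else hwa(subtree, scores)
        total += weight * value
    return total

# Hypothetical criteria for comparing design alternatives (illustrative):
criteria = {
    "cost":        (0.4, None),
    "performance": (0.6, {
        "speed":       (0.5, None),
        "reliability": (0.5, None),
    }),
}
alternative_a = {"cost": 7.0, "speed": 9.0, "reliability": 8.0}
score_a = hwa(criteria, alternative_a)   # 0.4*7 + 0.6*(0.5*9 + 0.5*8)
```

Scoring every alternative against the same tree yields a single comparable number per alternative, which is the uniformity the abstract describes.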
A new method to evaluate image quality of CBCT images quantitatively without observers
Shimizu, Mayumi; Okamura, Kazutoshi; Yoshida, Shoko; Weerawanich, Warangkana; Tokumori, Kenji; Jasa, Gainer R; Yoshiura, Kazunori
2017-01-01
Objectives: To develop an observer-free method for quantitatively evaluating the image quality of CBCT images by applying just-noticeable difference (JND). Methods: We used two test objects: (1) a Teflon (polytetrafluoroethylene) plate phantom attached to a dry human mandible; and (2) a block phantom consisting of a Teflon step phantom and an aluminium step phantom. These phantoms had holes with different depths. They were immersed in water and scanned with a CB MercuRay (Hitachi Medical Corporation, Tokyo, Japan) at tube voltages of 120 kV, 100 kV, 80 kV and 60 kV. Superimposed images of the phantoms with holes were used for evaluation. The number of detectable holes was used as an index of image quality. In detecting holes quantitatively, the threshold grey value (ΔG), which differentiated holes from the background, was calculated using a specific threshold (the JND), and we extracted the holes with grey values above ΔG. The indices obtained by this quantitative method (the extracted hole values) were compared with the observer evaluations (the observed hole values). In addition, the contrast-to-noise ratio (CNR) of the shallowest detectable holes and the deepest undetectable holes were measured to evaluate the contribution of CNR to detectability. Results: The results of this evaluation method corresponded almost exactly with the evaluations made by observers. The extracted hole values reflected the influence of different tube voltages. All extracted holes had an area with a CNR of ≥1.5. Conclusions: This quantitative method of evaluating CBCT image quality may be more useful and less time-consuming than evaluation by observation. PMID:28045343
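The hole-extraction step above can be sketched under a simplifying assumption: here the just-noticeable difference is modelled as a multiple of the background noise SD, which may differ from the paper's exact JND definition; the function name and numbers are illustrative:

```python
import numpy as np

def count_detectable_holes(hole_means, background_mean, background_sd, jnd=2.0):
    """Observer-free hole count sketch. The threshold grey value dG that
    differentiates holes from background is taken as jnd * background_sd
    (an assumed JND model). A hole counts as detectable when its mean grey
    value differs from the background by more than dG."""
    dG = jnd * background_sd
    diffs = np.abs(np.asarray(hole_means, float) - background_mean)
    return int((diffs > dG).sum()), dG

# Three holes of increasing depth against a background of mean 100, SD 2:
count, dG = count_detectable_holes([100.0, 104.0, 120.0],
                                   background_mean=100.0, background_sd=2.0)
```

The extracted-hole count plays the role of the image-quality index, to be compared across tube voltages or against observer scores.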
Conflicts Management Model in School: A Mixed Design Study
ERIC Educational Resources Information Center
Dogan, Soner
2016-01-01
The object of this study is to evaluate the reasons for conflicts occurring in school according to perceptions and views of teachers and resolution strategies used for conflicts and to build a model based on the results obtained. In the research, explanatory design including quantitative and qualitative methods has been used. The quantitative part…
ERIC Educational Resources Information Center
Haworth, Claire M. A.; Plomin, Robert
2010-01-01
Objective: To consider recent findings from quantitative genetic research in the context of molecular genetic research, especially genome-wide association studies. We focus on findings that go beyond merely estimating heritability. We use learning abilities and disabilities as examples. Method: Recent twin research in the area of learning…
An improved level set method for brain MR images segmentation and bias correction.
Chen, Yunjie; Zhang, Jianwei; Macione, Jim
2009-10-01
Intensity inhomogeneities cause considerable difficulty in the quantitative analysis of magnetic resonance (MR) images. Thus, bias field estimation is a necessary step before quantitative analysis of MR data can be undertaken. This paper presents a variational level set approach to bias correction and segmentation for images with intensity inhomogeneities. Our method is based on an observation that intensities in a relatively small local region are separable, despite the inseparability of the intensities in the whole image caused by the overall intensity inhomogeneity. We first define a localized K-means-type clustering objective function for image intensities in a neighborhood around each point. The cluster centers in this objective function have a multiplicative factor that estimates the bias within the neighborhood. The objective function is then integrated over the entire domain to define the data term in the level set framework. Our method is able to capture bias of quite general profiles. Moreover, it is robust to initialization, and thereby allows fully automated application. The proposed method has been used for images of various modalities with promising results.
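The localized clustering objective can be sketched for a 1-D signal (the paper works on 2-D images within a level set framework): within a window around each point y, each pixel of class i contributes its squared distance to the bias-scaled center b(y)*c_i. The discretization and names are illustrative assumptions:

```python
import numpy as np

def local_clustering_energy(I, labels, centers, bias, radius):
    """Localized K-means-type objective sketch: one energy value per point.
    Within a (2*radius+1) window around y, each sample x of class i
    contributes (I(x) - bias[y]*centers[i])**2, where bias[y] is the
    multiplicative bias estimate at y. 1-D for brevity."""
    I = np.asarray(I, float)
    E = np.zeros_like(I)
    for y in range(len(I)):
        lo, hi = max(0, y - radius), min(len(I), y + radius + 1)
        window, wlab = I[lo:hi], labels[lo:hi]
        E[y] = sum(((window[wlab == i] - bias[y] * c) ** 2).sum()
                   for i, c in enumerate(centers))
    return E

# Perfectly separable two-class signal with unit bias: energy is zero.
I = np.array([1.0, 1.0, 4.0, 4.0])
labels = np.array([0, 0, 1, 1])
E = local_clustering_energy(I, labels, centers=[1.0, 4.0],
                            bias=np.ones(4), radius=1)
```

In the full method, labels, centers, and bias are the unknowns, and this energy (integrated over the domain) is minimized alternately under the level set evolution.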
Proper survey methods for research of aquatic plant ecology and management
USDA-ARS?s Scientific Manuscript database
Proper survey methods are essential for objective, quantitative assessment of the distribution and abundance of aquatic plants as part of research and demonstration efforts. For research, the use of the appropriate method is an essential part of the scientific method, to ensure that the experimenta...
Quantitative phase microscopy via optimized inversion of the phase optical transfer function.
Jenkins, Micah H; Gaylord, Thomas K
2015-10-01
Although the field of quantitative phase imaging (QPI) has wide-ranging biomedical applicability, many QPI methods are not well-suited for such applications due to their reliance on coherent illumination and specialized hardware. By contrast, methods utilizing partially coherent illumination have the potential to promote the widespread adoption of QPI due to their compatibility with microscopy, which is ubiquitous in the biomedical community. Described herein is a new defocus-based reconstruction method that utilizes a small number of efficiently sampled micrographs to optimally invert the partially coherent phase optical transfer function under assumptions of weak absorption and slowly varying phase. Simulation results are provided that compare the performance of this method with similar algorithms and demonstrate compatibility with large phase objects. The accuracy of the method is validated experimentally using a microlens array as a test phase object. Lastly, time-lapse images of live adherent cells are obtained with an off-the-shelf microscope, thus demonstrating the new method's potential for extending QPI capability widely in the biomedical community.
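The defocus-stack inversion described above can be sketched as a joint Tikhonov-regularized deconvolution in Fourier space. The all-pass "OTF" used in the self-check below is a placeholder: the actual partially coherent phase OTFs must be computed from the illumination and the pupil, and the exact optimization in the paper may differ:

```python
import numpy as np

def invert_phase_otf(intensity_stack, otf_stack, reg=1e-3):
    """Joint least-squares inversion sketch. Under a weak-object model each
    defocused intensity spectrum is F{I_k} = H_k * F{phi}; the stack is
    combined as phi = IFFT[ sum_k conj(H_k) F{I_k} / (reg + sum_k |H_k|^2) ],
    where reg is a Tikhonov regularizer against zeros of the OTFs."""
    num = np.zeros(otf_stack[0].shape, dtype=complex)
    den = np.full(otf_stack[0].shape, reg, dtype=float)
    for I, H in zip(intensity_stack, otf_stack):
        num += np.conj(H) * np.fft.fft2(I)
        den += np.abs(H) ** 2
    return np.real(np.fft.ifft2(num / den))

# Consistency check with a trivial all-pass "OTF": recovery up to ~reg error.
rng = np.random.default_rng(0)
phi = rng.random((8, 8))
H = np.ones((8, 8))
recovered = invert_phase_otf([phi], [H], reg=1e-6)
```

Optimizing the defocus sampling then amounts to choosing the set of H_k so that the summed |H_k|^2 stays well away from zero over the frequencies of interest.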
Boucheron, Laura E
2013-07-16
Quantitative object and spatial arrangement-level analysis of tissue is detailed using expert (pathologist) input to guide the classification process. A two-step method is disclosed for imaging tissue, by classifying one or more biological materials, e.g. nuclei, cytoplasm, and stroma, in the tissue into one or more identified classes on a pixel-by-pixel basis, and segmenting the identified classes to agglomerate one or more sets of identified pixels into segmented regions. Typically, the one or more biological materials comprise nuclear material, cytoplasm material, and stromal material. The method further allows a user to mark up the image subsequent to the classification to re-classify said materials. The markup is performed via a graphical user interface to edit designated regions in the image.
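The two-step structure (pixel-wise classification, then agglomeration into regions) can be sketched as follows. The intensity thresholds stand in for the trained, pathologist-guided classifier of the patent, and the 4-connected flood fill is one simple way to agglomerate identified pixels:

```python
import numpy as np

# Pixel classes; the thresholds below are hypothetical stand-ins for the
# expert-guided classifier described in the patent.
NUCLEI, CYTOPLASM, STROMA = 0, 1, 2

def classify_pixels(img):
    """Step 1: label every pixel as nuclei / cytoplasm / stroma by intensity."""
    classes = np.full(img.shape, STROMA)
    classes[img < 0.33] = NUCLEI
    classes[(img >= 0.33) & (img < 0.66)] = CYTOPLASM
    return classes

def segment_class(classes, target):
    """Step 2: agglomerate pixels of one class into 4-connected regions."""
    mask = classes == target
    labels = np.zeros(mask.shape, int)
    nxt = 0
    for start in zip(*np.nonzero(mask)):
        if labels[start]:
            continue
        nxt += 1
        stack = [start]
        while stack:                      # iterative flood fill
            r, c = stack.pop()
            if not (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]):
                continue
            if not mask[r, c] or labels[r, c]:
                continue
            labels[r, c] = nxt
            stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    return labels, nxt

img = np.array([[0.1, 0.9, 0.1],
                [0.9, 0.9, 0.9],
                [0.1, 0.9, 0.5]])
classes = classify_pixels(img)
regions, n_regions = segment_class(classes, NUCLEI)
```

User markup would then amount to editing `classes` inside designated regions and re-running the agglomeration step.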
Qualities of a Psychiatric Mentor: A Quantitative Singaporean Survey
ERIC Educational Resources Information Center
Tor, Phern-Chern; Goh, Lee-Gan; Ang, Yong-Guan; Lim, Leslie; Winslow, Rasaiah-Munidasa; Ng, Beng-Yeong; Wong, Sze-Tai; Ng, Tse-Pin; Kia, Ee-Heok
2011-01-01
Objective: Psychiatric mentors are an important part of the new, seamless training program in Singapore. There is a need to assess the qualities of a good psychiatric mentor vis-a-vis those of a good psychiatrist. Method: An anonymous survey was sent out to all psychiatry trainees and psychiatrists in Singapore to assess quantitatively the…
ERIC Educational Resources Information Center
Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.
2012-01-01
Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…
Fuzzy object models for newborn brain MR image segmentation
NASA Astrophysics Data System (ADS)
Kobashi, Syoji; Udupa, Jayaram K.
2013-03-01
Newborn brain MR image segmentation is a challenging problem because of the wide variation in size, shape, and MR signal, although it is a fundamental step for quantitative radiology of brain MR images. Because of the large difference between the adult brain and the newborn brain, it is difficult to apply conventional methods directly to the newborn brain. Inspired by the original fuzzy object model introduced by Udupa et al. at SPIE Medical Imaging 2011, called the fuzzy shape object model (FSOM) here, this paper introduces the fuzzy intensity object model (FIOM) and proposes a new image segmentation method which combines the FSOM and FIOM into fuzzy connected (FC) image segmentation. The fuzzy object models are built from training datasets in which the cerebral parenchyma is delineated by experts. After registering the FSOM with the image under evaluation, the proposed method roughly recognizes the cerebral parenchyma region based on prior knowledge of location, shape, and MR signal given by the registered FSOM and FIOM. Then, FC image segmentation delineates the cerebral parenchyma using the fuzzy object models. The proposed method has been evaluated on 9 newborn brain MR images using the leave-one-out strategy. The revised age was between -1 and 2 months. Quantitative evaluation using false positive volume fraction (FPVF) and false negative volume fraction (FNVF) has been conducted, yielding an FPVF of 0.75% and an FNVF of 3.75%. More data collection and testing are underway.
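The FPVF/FNVF evaluation metrics can be computed directly from binary masks; the sketch below uses one common definition (both errors expressed as fractions of the expert-delineated reference volume), which matches the usual usage of these terms:

```python
import numpy as np

def volume_fractions(seg, ref):
    """False positive / false negative volume fractions of a segmentation
    'seg' against an expert reference 'ref' (both boolean volumes):
    FPVF = |seg \\ ref| / |ref|,  FNVF = |ref \\ seg| / |ref|."""
    seg, ref = np.asarray(seg, bool), np.asarray(ref, bool)
    ref_vol = ref.sum()
    fpvf = np.logical_and(seg, ~ref).sum() / ref_vol
    fnvf = np.logical_and(~seg, ref).sum() / ref_vol
    return fpvf, fnvf

# Toy volume: reference has 4 voxels; segmentation misses one, adds one.
ref = np.array([1, 1, 1, 1, 0, 0], bool)
seg = np.array([1, 1, 1, 0, 1, 0], bool)
fpvf, fnvf = volume_fractions(seg, ref)
```

In a leave-one-out evaluation these fractions are averaged across the held-out cases, as reported above (FPVF 0.75%, FNVF 3.75%).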
Stern, K I; Malkova, T L
The objective of the present study was the development and validation of a method for the determination of the demethylated derivatives of sibutramine, desmethyl sibutramine and didesmethyl sibutramine. Gas-liquid chromatography with flame ionization detection was used for the quantitative determination of these substances in dietary supplements. Conditions for the chromatographic determination of the analytes in the presence of the reference standard, methyl stearate, were proposed, allowing efficient separation to be achieved. The method has the necessary sensitivity, specificity, linearity, accuracy, and precision (on an intra-day and inter-day basis), which attests to its good validation characteristics. The proposed method can be employed in analytical laboratories for the quantitative determination of sibutramine derivatives in biologically active dietary supplements.
Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter
2012-04-01
Coordination between perception and action is required to interact with the environment successfully. This coordination is already trained by very young infants, who perform spontaneous movements to learn how their body interacts with the environment. The strategies used by infants for this purpose change with age. Therefore, very early progress in action control made by infants can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced which allows the quantitative evaluation of the development of spontaneous motor activity in newborns. The methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies such as infantile cerebral palsy significantly influence the development of motor activity. Since the introduced methodology is objective and quantitative, it is suitable for monitoring how newborns train the cognitive processes that will enable them to cope with their environment through motor interaction.
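Extracting movement-based parameters from a sampled 3D foot trajectory can be sketched as below. The two parameters shown (path length and mean speed) are simple illustrative examples, not the specific parameter set derived in the paper:

```python
import numpy as np

def movement_parameters(traj, dt):
    """Two illustrative movement-based parameters from a 3-D foot
    trajectory of shape (T, 3) sampled at interval dt seconds:
    total path length and mean speed along the path."""
    steps = np.diff(np.asarray(traj, float), axis=0)   # per-sample displacement
    seg = np.linalg.norm(steps, axis=1)                # segment lengths
    path_length = seg.sum()
    mean_speed = path_length / (dt * len(seg))
    return path_length, mean_speed

# Straight 2-unit path sampled at 2 Hz:
traj = [[0, 0, 0], [1, 0, 0], [2, 0, 0]]
path_length, mean_speed = movement_parameters(traj, dt=0.5)
```

Tracking how such parameters change across recording sessions is what allows the age-dependent developmental steps to be described objectively.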
The SCHEIE Visual Field Grading System
Sankar, Prithvi S.; O’Keefe, Laura; Choi, Daniel; Salowe, Rebecca; Miller-Ellis, Eydie; Lehman, Amanda; Addis, Victoria; Ramakrishnan, Meera; Natesh, Vikas; Whitehead, Gideon; Khachatryan, Naira; O’Brien, Joan
2017-01-01
Objective No method of grading visual field (VF) defects has been widely accepted throughout the glaucoma community. The SCHEIE (Systematic Classification of Humphrey visual fields-Easy Interpretation and Evaluation) grading system for glaucomatous visual fields was created to convey qualitative and quantitative information regarding visual field defects in an objective, reproducible, and easily applicable manner for research purposes. Methods The SCHEIE grading system is composed of a qualitative and quantitative score. The qualitative score consists of designation in one or more of the following categories: normal, central scotoma, paracentral scotoma, paracentral crescent, temporal quadrant, nasal quadrant, peripheral arcuate defect, expansive arcuate, or altitudinal defect. The quantitative component incorporates the Humphrey visual field index (VFI), location of visual defects for superior and inferior hemifields, and blind spot involvement. Accuracy and speed at grading using the qualitative and quantitative components was calculated for non-physician graders. Results Graders had a median accuracy of 96.67% for their qualitative scores and a median accuracy of 98.75% for their quantitative scores. Graders took a mean of 56 seconds per visual field to assign a qualitative score and 20 seconds per visual field to assign a quantitative score. Conclusion The SCHEIE grading system is a reproducible tool that combines qualitative and quantitative measurements to grade glaucomatous visual field defects. The system aims to standardize clinical staging and to make specific visual field defects more easily identifiable. Specific patterns of visual field loss may also be associated with genetic variants in future genetic analysis. PMID:28932621
Meta-Analysis and Systematic Review Assessing the Efficacy of Dialectical Behavior Therapy (DBT)
ERIC Educational Resources Information Center
Panos, Patrick T.; Jackson, John W.; Hasan, Omar; Panos, Angelea
2014-01-01
Objective: The objective was to quantitatively and qualitatively examine the efficacy of DBT (e.g., decreasing life-threatening suicidal and parasuicidal acts, attrition, and depression) explicitly with borderline personality disorder (BPD) and using conservative assumptions and criteria, across treatment providers and settings. Method: Five…
Trade Space Analysis: Rotational Analyst Research Project
2015-09-01
POM Program Objective Memoranda PM Program Manager RFP Request for Proposal ROM Rough Order of Magnitude RSM Response Surface Method RSE ...response surface method (RSM) / response surface equations (RSEs) as surrogate models. It uses the RSEs with Monte Carlo simulation to quantitatively
Qualitative Research in Educational Gerontology.
ERIC Educational Resources Information Center
Applewhite, Steven Lozano
1997-01-01
Quantitative methods such as logical positivism often view nondominant groups as deviant and purport to be objective. Qualitative methods such as ethnography help educational gerontologists understand diverse elderly populations and allow elders to participate in the process of defining reality and producing knowledge. (SK)
Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro
2016-01-01
Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological analysis approach to pathology diagnosis, which can connect these molecular data and clinical diagnosis, is still mostly subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts still remain largely non-quantitative and their associated data measurements depend on the pathologist's sense and experience. Such features include pleomorphism and heterogeneity. In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity, using the cell-level co-occurrence matrix. Our method is based on the widely used Gray-level co-occurrence matrix (GLCM), where relations between neighboring pixel intensity levels are captured into a co-occurrence matrix, followed by the application of analysis functions such as Haralick features. In a pathological tissue image, image processing techniques allow each nucleus to be measured, and each nucleus has its own measurable features, such as size, roundness, contour length, and intra-nucleus texture data (GLCM is one such texture method). In our cell-level analogue of GLCM, each nucleus in the tissue image plays the role of one pixel. The most important point in this approach is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, then create the co-occurrence matrix and apply Haralick feature functions. Pleomorphism and heterogeneity in each image are then determined quantitatively. Because in our method one pixel corresponds to one nucleus feature, we named it the Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method for several nucleus features.
CFLCM is shown to be a useful quantitative method for assessing pleomorphism and heterogeneity in histopathological image analysis.
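The pixel-level GLCM on which CFLCM is built can be sketched as follows. This is a minimal illustration of the standard construction (a quantized toy image, one offset, two Haralick-style features), not the authors' nucleus-level implementation:

```python
import numpy as np

def glcm(img, levels=4, offset=(0, 1)):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    P = np.zeros((levels, levels))
    dr, dc = offset
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                P[img[r, c], img[r2, c2]] += 1
    return P / P.sum()

def haralick_contrast(P):
    i, j = np.indices(P.shape)
    return ((i - j) ** 2 * P).sum()           # high for rapidly varying texture

def haralick_homogeneity(P):
    i, j = np.indices(P.shape)
    return (P / (1.0 + np.abs(i - j))).sum()  # high for smooth texture

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])                # toy 4-level quantized image
P = glcm(img)
print(round(haralick_contrast(P), 3), round(haralick_homogeneity(P), 3))  # 0.333 0.833
```

CFLCM replaces the pixel grid with nuclei: the co-occurrence counts then run over pairs of neighboring nuclei and their quantized feature values rather than pixel intensities.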
Enumeration of viable and non-viable larvated Ascaris eggs with quantitative PCR
Aims: The goal of the study was to further develop an incubation-qPCR method for quantifying viable Ascaris eggs. The specific objectives were to characterize the detection limit and number of template copies per egg, determine the specificity of the method, and test the method w...
Measuring landscape esthetics: the scenic beauty estimation method
Terry C. Daniel; Ron S. Boster
1976-01-01
The Scenic Beauty Estimation Method (SBE) provides quantitative measures of esthetic preferences for alternative wildland management systems. Extensive experimentation and testing with user, interest, and professional groups validated the method. SBE shows promise as an efficient and objective means for assessing the scenic beauty of public forests and wildlands, and...
The Application of Montessori Method in Learning Mathematics: An Experimental Research
ERIC Educational Resources Information Center
Faryadi, Qais
2017-01-01
The prime objective of this research was to investigate whether the Montessori method of learning helped kindergarten pupils improve their mathematical proficiency, critical thinking and problem-solving skills, besides training them to be responsible learners. Quantitative, qualitative, and observational methods were employed in the investigation.…
Hagen, C K; Diemoz, P C; Endrizzi, M; Rigon, L; Dreossi, D; Arfelli, F; Lopez, F C M; Longo, R; Olivo, A
2014-04-07
X-ray phase contrast imaging (XPCi) methods are sensitive to phase in addition to attenuation effects and, therefore, can achieve improved image contrast for weakly attenuating materials, such as often encountered in biomedical applications. Several XPCi methods exist, most of which have already been implemented in computed tomographic (CT) modality, thus allowing volumetric imaging. The Edge Illumination (EI) XPCi method had, until now, not been implemented as a CT modality. This article provides indications that quantitative 3D maps of an object's phase and attenuation can be reconstructed from EI XPCi measurements. Moreover, a theory for the reconstruction of combined phase and attenuation maps is presented. Both reconstruction strategies find applications in tissue characterisation and the identification of faint, weakly attenuating details. Experimental results for wires of known materials and for a biological object validate the theory and confirm the superiority of the phase over conventional, attenuation-based image contrast.
ERIC Educational Resources Information Center
Storfer-Isser, Amy; Musher-Eizenman, Dara
2013-01-01
Objective: To examine the psychometric properties of 9 quantitative items that assess time scarcity and fatigue as parent barriers to planning and preparing meals for their children. Methods: A convenience sample of 342 parents of children aged 2-6 years completed a 20-minute online survey. Exploratory factor analysis was used to examine the…
NASA Astrophysics Data System (ADS)
Gibergans-Báguena, J.; Llasat, M. C.
2007-12-01
The objective of this paper is to present an improvement in the quantitative forecasting of daily rainfall in Catalonia (NE Spain) from an analogues technique, taking into account synoptic and local data. This method is based on an analogues sorting technique: meteorological situations similar to the current one, in terms of the 700 and 1000 hPa geopotential fields at 00 UTC, are retrieved from a historical data file and complemented with some thermodynamic parameters. Thermodynamic analysis acts as a highly discriminating feature for situations in which the synoptic situation fails to explain either atmospheric phenomena or rainfall distribution. This is the case in heavy rainfall situations, where the existence of instability and high water vapor content is essential. With the objective of including these vertical thermodynamic features, information provided by the Palma de Mallorca radiosounding (Spain) has been used. Previously, a selection of the most discriminating thermodynamic parameters for the daily rainfall was made, and the analogues technique was then applied to them. Finally, three analog forecasting methods were applied for quantitative daily rainfall forecasting in Catalonia. The first is based on analogies from geopotential fields at the synoptic scale; the second is based exclusively on the search for similarity in local thermodynamic information; and the third combines the other two. The results show that this last method provides a substantial improvement in quantitative rainfall estimation.
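The analogues search itself amounts to a nearest-neighbour lookup over an archive of past days. The RMS field distance and the toy data below are assumptions for illustration, not the paper's exact similarity measure:

```python
import numpy as np

def analog_forecast(current, history_fields, history_rain, k=3):
    """Forecast rainfall as the mean over the k archived days whose
    gridded fields are closest (RMS difference) to the current field."""
    d = np.sqrt(((history_fields - current) ** 2).mean(axis=(1, 2)))
    nearest = np.argsort(d)[:k]
    return history_rain[nearest].mean()

rng = np.random.default_rng(0)
fields = rng.normal(size=(50, 4, 5))   # toy archive: 50 days of gridded data
rain = rng.random(50) * 30.0           # toy daily rainfall totals (mm)

# A day identical to archive day 7 recovers that day's rainfall when k=1:
print(analog_forecast(fields[7], fields, rain, k=1) == rain[7])  # True
```

Combining several predictors, as the paper's third method does, corresponds to concatenating synoptic and thermodynamic variables into the distance computation.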
Mixed Methods in CAM Research: A Systematic Review of Studies Published in 2012
Bishop, Felicity L.; Holmes, Michelle M.
2013-01-01
Background. Mixed methods research uses qualitative and quantitative methods together in a single study or a series of related studies. Objectives. To review the prevalence and quality of mixed methods studies in complementary medicine. Methods. All studies published in the top 10 integrative and complementary medicine journals in 2012 were screened. The quality of mixed methods studies was appraised using a published tool designed for mixed methods studies. Results. 4% of papers (95 out of 2349) reported mixed methods studies, 80 of which met criteria for applying the quality appraisal tool. The most popular formal mixed methods design was triangulation (used by 74% of studies), followed by embedded (14%), sequential explanatory (8%), and finally sequential exploratory (5%). Quantitative components were generally of higher quality than qualitative components; when quantitative components involved RCTs they were of particularly high quality. Common methodological limitations were identified. Most strikingly, none of the 80 mixed methods studies addressed the philosophical tensions inherent in mixing qualitative and quantitative methods. Conclusions and Implications. The quality of mixed methods research in CAM can be enhanced by addressing philosophical tensions and improving reporting of (a) analytic methods and reflexivity (in qualitative components) and (b) sampling and recruitment-related procedures (in all components). PMID:24454489
Examining the Use of Web-Based Reusable Learning Objects by Animal and Veterinary Nursing Students
ERIC Educational Resources Information Center
Chapman-Waterhouse, Emily; Silva-Fletcher, Ayona; Whittlestone, Kim David
2016-01-01
This intervention study examined the interaction of animal and veterinary nursing students with reusable learning objects (RLO) in the context of preparing for summative assessment. Data was collected from 199 undergraduates using quantitative and qualitative methods. Students accessed RLO via personal devices in order to reinforce taught…
Drew, Peter; Tippett, Vivienne; Devenish, Scott
2018-05-01
The objective of this review is to develop an aggregated synthesis of qualitative and quantitative data on occupational violence (OV) mitigation interventions for Emergency Service Workers (ESW), to cultivate useful conclusions and recommendations for paramedic occupational safety and policy development. Emergency Service Worker is a broad term encompassing all elements of community-based emergency support and includes paramedics, firefighters, and police.The objective of the quantitative component of this review is to quantify the effectiveness of OV mitigation interventions for ESW.The objective of the qualitative component of this review is to explore the perceptions and experiences of ESW on the effectiveness of OV mitigation interventions.This review seeks to address the following questions.
The Evolution of 3D Microimaging Techniques in Geosciences
NASA Astrophysics Data System (ADS)
Sahagian, D.; Proussevitch, A.
2009-05-01
In the analysis of geomaterials, it is essential to be able to analyze internal structures on a quantitative basis. Techniques have evolved from rough qualitative methods to highly accurate quantitative methods coupled with 3-D numerical analysis. The earliest primitive method for "seeing" what was inside a rock was multiple sectioning to produce a series of image slices. This technique typically completely destroyed the sample being analyzed. Another destructive method was developed to give more detailed quantitative information by forming plastic casts of internal voids in sedimentary and volcanic rocks. For this, voids were filled with plastic and the rock dissolved away with HF to reveal plastic casts of internal vesicles. Later, new approaches to stereology were developed to extract 3D information from 2D cross-sectional images. This has long been possible for spheres because the probability distribution for cutting a sphere along any small circle is known analytically (greatest probability is near the equator). However, large numbers of objects are required for statistical validity, and geomaterials are seldom spherical, so crystals, vesicles, and other inclusions would need a more sophisticated approach. Consequently, probability distributions were developed using numerical techniques for rectangular solids and various ellipsoids so that stereological techniques could be applied to these. The "holy grail" has always been to obtain 3D quantitative images non-destructively. A key method is Computed X-ray Tomography (CXT), in which attenuation of X-rays is recorded as a function of angular position in a cylindrical sample, providing a 2D "slice" of the interior. When a series of these "slices" is stacked (in increments matching the resolution of the X-ray to make cubic voxels), a 3D image results with quantitative information regarding internal structure, particle/void volumes, nearest neighbors, coordination numbers, preferred orientations, etc.
CXT can be done at three basic levels of resolution, with "normal" x-rays providing tens of microns resolution, synchrotron sources providing single to few microns, and emerging XuM techniques providing a practical 300 nm and theoretical 60 nm. The main challenges in CXT imaging have been in segmentation, which delineates material boundaries, and object recognition (registration), in which the individual objects within a material are identified. The former is critical in quantifying object volume, while the latter is essential for preventing the false appearance of individual objects as a continuous structure. Additional new techniques are now being developed to enhance resolution and provide more detailed analysis without the complex infrastructure needed for CXT. One such method is Laser Scanning Confocal Microscopy, in which a laser is reflected from individual interior surfaces of a fluorescing material, providing a series of sharp images of internal slices with quantitative information available, just as in x-ray tomography, after "z-stacking" of planes of pixels. Another novel approach is the use of Stereo Scanning Electron Microscopy to create digital elevation models of 3D surficial features such as partial bubble margins on the surfaces of fine volcanic ash particles. As other novel techniques emerge, new opportunities will be presented to the geological research community to obtain ever more detailed and accurate information regarding the interior structure of geomaterials.
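The sphere result mentioned above (a random planar cut most probably passes near the equator) can be verified with a few lines of Monte Carlo; slicing planes uniformly distributed along the axis are the assumption matching the analytic case:

```python
import math
import random

def section_radii(n, seed=42):
    """Radii of circles cut from a unit sphere by planes whose signed
    distance z from the centre is uniform on [-1, 1]: r = sqrt(1 - z^2)."""
    rng = random.Random(seed)
    return [math.sqrt(1.0 - z * z)
            for z in (rng.uniform(-1.0, 1.0) for _ in range(n))]

radii = section_radii(100_000)
near_equator = sum(r > 0.9 for r in radii) / len(radii)  # large cross-sections
near_pole = sum(r < 0.1 for r in radii) / len(radii)     # tiny cross-sections
print(near_equator > near_pole)  # True
```

Cuts with near-equatorial (large) radii dominate by roughly two orders of magnitude, which is exactly why unfolding a measured 2D size distribution into a 3D one is statistically well-posed for spheres.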
Quantitative phase microscopy for cellular dynamics based on transport of intensity equation.
Li, Ying; Di, Jianglei; Ma, Chaojie; Zhang, Jiwei; Zhong, Jinzhan; Wang, Kaiqiang; Xi, Teli; Zhao, Jianlin
2018-01-08
We demonstrate a simple method for quantitative phase imaging of tiny transparent objects such as living cells based on the transport of intensity equation. The experiments are performed using an inverted bright field microscope upgraded with a flipping imaging module, which enables the simultaneous creation of two laterally separated images with unequal defocus distances. This add-on module does not include any lenses or gratings and is cost-effective and easy to align. The validity of this method is confirmed by measurements of a microlens array and human osteoblastic cells in culture, indicating its potential for dynamically measuring living cells and other transparent specimens in a quantitative, non-invasive and label-free manner.
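For reference, the standard transport of intensity equation underlying this class of methods is given below in its general textbook form (not specific to this paper's flipping module); the axial derivative is approximated by a finite difference between the two images at defocus distances $z_1$ and $z_2$:

```latex
-k\,\frac{\partial I(x,y,z)}{\partial z}
   = \nabla_{\perp}\cdot\left[\, I(x,y,z)\,\nabla_{\perp}\phi(x,y,z) \,\right],
\qquad k = \frac{2\pi}{\lambda},
\qquad
\frac{\partial I}{\partial z} \approx \frac{I(z_{2}) - I(z_{1})}{z_{2} - z_{1}}
```

where $I$ is the measured intensity and $\phi$ the phase; solving this elliptic equation for $\phi$ yields the quantitative phase map.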
Anguera, M Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2018-01-01
Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. 
The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.
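The code-matrix idea (matrices of codes, not frequencies) can be illustrated with a toy example; the segment contents and codebook here are hypothetical:

```python
# Each observation unit (text segment) carries a set of qualitative codes
# assigned with an ad hoc observation instrument.
segments = [
    {"unit": 1, "codes": {"greeting", "request"}},
    {"unit": 2, "codes": {"request"}},
    {"unit": 3, "codes": {"complaint", "request"}},
]
codebook = ["greeting", "request", "complaint"]

# Quantitization: binary code matrix (rows = segments, columns = codes),
# ready for pattern-detection analyses.
matrix = [[int(code in seg["codes"]) for code in codebook] for seg in segments]
print(matrix)  # [[1, 1, 0], [0, 1, 0], [0, 1, 1]]
```

Reliability checks in the quality-control stage amount to comparing such matrices produced independently by different coders.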
NASA Astrophysics Data System (ADS)
Yuan, Zhen; Li, Xiaoqi; Xi, Lei
2014-06-01
Biomedical photoacoustic tomography (PAT), as a potential imaging modality, can visualize tissue structure and function with high spatial resolution and excellent optical contrast. It is widely recognized that the ability to quantitatively image optical absorption and scattering coefficients from photoacoustic measurements is essential before PAT can become a powerful imaging modality. Existing quantitative PAT (qPAT), while successful, has focused on recovering the absorption coefficient only, assuming the scattering coefficient to be constant. An effective method for photoacoustically recovering the optical scattering coefficient is presently not available. Here we propose and experimentally validate such a method for quantitative scattering coefficient imaging using photoacoustic data from one-wavelength illumination. The reconstruction method developed combines conventional PAT with the photon diffusion equation in a novel way to realize the recovery of the scattering coefficient. We demonstrate the method using various objects having scattering contrast only, or both absorption and scattering contrasts, embedded in turbid media. The listening-to-light-scattering method described will be able to provide high resolution scattering imaging for various biomedical applications ranging from breast to brain imaging.
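The photon diffusion model referred to, in its standard continuous-wave form (a textbook sketch; the paper's coupling to PAT may differ in detail), is:

```latex
\nabla\cdot\left[\, D(\mathbf{r})\,\nabla\Phi(\mathbf{r}) \,\right]
  - \mu_{a}(\mathbf{r})\,\Phi(\mathbf{r}) = -S(\mathbf{r}),
\qquad
D = \frac{1}{3\left(\mu_{a} + \mu_{s}'\right)}
```

Conventional PAT reconstructs the absorbed energy density $H(\mathbf{r}) = \Gamma\,\mu_a(\mathbf{r})\,\Phi(\mathbf{r})$ ($\Gamma$ being the Grüneisen parameter), so coupling $H$ with the diffusion model constrains the fluence $\Phi$ and thereby makes the reduced scattering coefficient $\mu_s'$, which enters through $D$, accessible.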
Coherent diffraction imaging of non-isolated object with apodized illumination.
Khakurel, Krishna P; Kimura, Takashi; Joti, Yasumasa; Matsuyama, Satoshi; Yamauchi, Kazuto; Nishino, Yoshinori
2015-11-02
Coherent diffraction imaging (CDI) is an established lensless imaging method widely used at the x-ray regime applicable to the imaging of non-periodic materials. Conventional CDI can practically image isolated objects only, which hinders the broader application of the method. We present the imaging of non-isolated objects by employing recently proposed "non-scanning" apodized-illumination CDI at an optical wavelength. We realized isolated apodized illumination with a specially designed optical configuration and succeeded in imaging phase objects as well as amplitude objects. The non-scanning nature of the method is important particularly in imaging live cells and tissues, where fast imaging is required for non-isolated objects, and is an advantage over ptychography. We believe that our result of phase contrast imaging at an optical wavelength can be extended to the quantitative phase imaging of cells and tissues. The method also provides the feasibility of the lensless single-shot imaging of extended objects with x-ray free-electron lasers.
A New Approach for the Quantitative Evaluation of Drawings in Children with Learning Disabilities
ERIC Educational Resources Information Center
Galli, Manuela; Vimercati, Sara Laura; Stella, Giacomo; Caiazzo, Giorgia; Norveti, Federica; Onnis, Francesca; Rigoldi, Chiara; Albertini, Giorgio
2011-01-01
A new method for a quantitative and objective description of drawing and for the quantification of drawing ability in children with learning disabilities (LD) is hereby presented. Twenty-four normally developing children (N) (age 10.6 [plus or minus] 0.5) and 18 children with learning disabilities (LD) (age 10.3 [plus or minus] 2.4) took part to…
DOT National Transportation Integrated Search
1995-07-01
An objective and quantitative method has been developed for deriving models of complex and specialized spheres of activity (domains) from domain-generated verbal data. The method was developed for analysis of interview transcripts, incident reports, ...
Detection of blur artifacts in histopathological whole-slide images of endomyocardial biopsies.
Hang Wu; Phan, John H; Bhatia, Ajay K; Cundiff, Caitlin A; Shehata, Bahig M; Wang, May D
2015-01-01
Histopathological whole-slide images (WSIs) have emerged as an objective and quantitative means for image-based disease diagnosis. However, WSIs may contain acquisition artifacts that affect downstream image feature extraction and quantitative disease diagnosis. We develop a method for detecting blur artifacts in WSIs using distributions of local blur metrics. As features, these distributions enable accurate classification of WSI regions as sharp or blurry. We evaluate our method using over 1000 portions of an endomyocardial biopsy (EMB) WSI. Results indicate that local blur metrics accurately detect blurry image regions.
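The abstract does not name the local blur metrics used; the variance of the discrete Laplacian is one common choice, sketched here on synthetic data as a hypothetical stand-in for the authors' metric:

```python
import numpy as np

def laplacian_variance(img):
    """Local blur metric: variance of the 5-point discrete Laplacian.
    Sharp regions have strong second derivatives, hence high variance."""
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def box_blur(img):
    """3x3 mean filter on the interior, simulating defocus."""
    out = img.astype(float).copy()
    out[1:-1, 1:-1] = sum(
        img[1 + dr:img.shape[0] - 1 + dr, 1 + dc:img.shape[1] - 1 + dc]
        for dr in (-1, 0, 1) for dc in (-1, 0, 1)) / 9.0
    return out

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))           # high-frequency "tissue" texture
blurry = box_blur(box_blur(sharp))     # same region after two blur passes
print(laplacian_variance(sharp) > laplacian_variance(blurry))  # True
```

Computing such a metric per tile and classifying its distribution over a region is the structure the abstract describes.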
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lima, F.W.; Pagano, C.; Schneiderman, B.
1959-07-01
Boron can be determined quantitatively by absorption spectrophotometry of solutions of the red compound formed by the reaction of boric acid with curcumin. This reaction is affected by various factors, some of which can be detected easily in the data interpretation. Others, however, are more difficult to detect. The application of modern statistical methods to the study of the influence of these factors on the quantitative determination of boron is presented. These methods provide objective ways of establishing significant effects of the factors involved. (auth)
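The quantitation step rests on the Beer-Lambert law for the red boron-curcumin complex (the standard relation, not specific to this report):

```latex
A = \log_{10}\frac{I_{0}}{I} = \varepsilon\,\ell\,c
```

where $A$ is the measured absorbance, $\varepsilon$ the molar absorptivity of the complex, $\ell$ the optical path length, and $c$ the concentration; the statistical methods in the report assess how the interfering factors perturb this relationship.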
Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko
2017-07-01
Identification of human semen is indispensable for the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful to detect human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, because of a lack of evaluation methods, these studies did not provide objective, or quantitative, descriptions of the results nor clear criteria for the decisions reached. In addition, the variety of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. Use of this method enabled objective and specific identification of fluorescent sperm's spots and quantitative comparisons of the sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated in high humidity and high temperature conditions. Semen with quite low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases in application of the kit. Finally, the tolerance of the kit against various acidic and basic environments was analyzed. The validations herein provide useful information for the practical applications of the SPERM HY-LITER™ Express kit, which were previously unobtainable. Moreover, the versatility of our image analysis method toward various complex cases was demonstrated.
Parker, Christine H; Khuda, Sefat E; Pereira, Marion; Ross, Mark M; Fu, Tong-Jen; Fan, Xuebin; Wu, Yan; Williams, Kristina M; DeVries, Jonathan; Pulvermacher, Brian; Bedford, Binaifer; Zhang, Xi; Jackson, Lauren S
2015-12-16
Undeclared food allergens account for 30-40% of food recalls in the United States. Compliance with ingredient labeling regulations and the implementation of effective manufacturing allergen control plans require the use of reliable methods for allergen detection and quantitation in complex food products. The objectives of this work were to (1) produce industry-processed model foods incurred with egg, milk, and peanut allergens, (2) compare analytical method performance for allergen quantitation in thermally processed bakery products, and (3) determine the effects of thermal treatment on allergen detection. Control and allergen-incurred cereal bars and muffins were formulated in a pilot-scale industry processing facility. Quantitation of egg, milk, and peanut in incurred baked goods was compared at various processing stages using commercial enzyme-linked immunosorbent assay (ELISA) kits and a novel multi-allergen liquid chromatography (LC)-tandem mass spectrometry (MS/MS) multiple-reaction monitoring (MRM) method. Thermal processing was determined to negatively affect the recovery and quantitation of egg, milk, and peanut to different extents depending on the allergen, matrix, and analytical test method. The Morinaga ELISA and LC-MS/MS quantitative methods reported the highest recovery across all monitored allergens, whereas the ELISA Systems, Neogen BioKits, Neogen Veratox, and R-Biopharm ELISA Kits underperformed in the determination of allergen content of industry-processed bakery products.
ERIC Educational Resources Information Center
Rehberg, Robb S.; Gazzillo Diaz, Linda; Middlemas, David A.
2009-01-01
Objective: The objective of this study was to determine whether computer-based CPR training is comparable to traditional classroom training. Design and Setting: This study was quantitative in design. Data was gathered from a standardized examination and skill performance evaluation which yielded numerical scores. Subjects: The subjects were 64…
NASA Astrophysics Data System (ADS)
Chien, Kuang-Che Chang; Tu, Han-Yen; Hsieh, Ching-Huang; Cheng, Chau-Jern; Chang, Chun-Yen
2018-01-01
This study proposes a regional fringe analysis (RFA) method to detect the regions of a target object in captured shifted images to improve depth measurement in phase-shifting fringe projection profilometry (PS-FPP). In the RFA method, region-based segmentation is exploited to segment the de-fringed image of a target object, and a multi-level fuzzy-based classification with five presented features is used to analyze and discriminate the regions of an object from the segmented regions, which were associated with explicit fringe information. Then, in the experiment, the performance of the proposed method is tested and evaluated on 26 test cases made of five types of materials. The qualitative and quantitative results demonstrate that the proposed RFA method can effectively detect the desired regions of an object to improve depth measurement in the PS-FPP system.
Computer-aided analysis with Image J for quantitatively assessing psoriatic lesion area.
Sun, Z; Wang, Y; Ji, S; Wang, K; Zhao, Y
2015-11-01
Body surface area is important in determining the severity of psoriasis. However, an objective, reliable, and practical method for this purpose is still needed. We performed a computer image analysis (CIA) of psoriatic area using the image J freeware to determine whether this method could be used for objective evaluation of psoriatic area. Fifteen psoriasis patients were randomized to be treated with adalimumab or placebo in a clinical trial. At each visit, the psoriasis area of each body site was estimated by two physicians (E-method), and standard photographs were taken. The psoriasis area in the pictures was assessed with CIA using semi-automatic threshold selection (T-method) or manual selection (M-method, gold standard). The results assessed by the three methods were analyzed, and reliability and affecting factors were evaluated. Both the T- and E-methods correlated strongly with the M-method, with the T-method showing a slightly stronger correlation. Both the T- and E-methods had good consistency between the evaluators. All three methods were able to detect the change in psoriatic area after treatment, although the E-method tended to overestimate. The CIA with image J freeware is reliable and practicable for quantitatively assessing the lesional area of psoriasis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
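A minimal NumPy sketch of the threshold-based area measurement behind the T-method; the synthetic image, intensity channel, and threshold below are hypothetical stand-ins for the ImageJ workflow:

```python
import numpy as np

def lesion_area_fraction(img, threshold):
    """Fraction of pixels at or above an intensity threshold,
    analogous to semi-automatic threshold selection in ImageJ."""
    mask = img >= threshold
    return mask.sum() / img.size

# Hypothetical 8-bit image: lesion pixels are brighter than background.
img = np.zeros((100, 100), dtype=np.uint8)
img[20:60, 30:80] = 200                       # simulated 40 x 50 px plaque
img += np.random.default_rng(1).integers(0, 20, img.shape, dtype=np.uint8)

frac = lesion_area_fraction(img, threshold=128)
print(frac)  # 0.2  (2000 lesion pixels / 10000 total)
```

The semi-automatic part of the T-method corresponds to the operator adjusting `threshold` until the mask visually matches the lesion.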
NASA Technical Reports Server (NTRS)
Bromage, Timothy G.; Doty, Stephen B.; Smolyar, Igor; Holton, Emily
1997-01-01
Our stated primary objective is to quantify the growth rate variability of rat lamellar bone exposed to micro- (near zero G: e.g., Cosmos 1887 & 2044; SLS-1 & SLS-2) and macrogravity (2G). The primary significance of the proposed work is that an elegant method will be established that unequivocally characterizes the morphological consequences of gravitational factors on developing bone. The integrity of this objective depends upon our successful preparation of thin sections suitable for imaging individual bone lamellae, and our imaging and quantitation of growth rate variability in populations of lamellae from individual bone samples.
Stephen R. Shifley; Hong S. He; Heike Lischke; Wen J. Wang; Wenchi Jin; Eric J. Gustafson; Jonathan R. Thompson; Frank R. Thompson; William D. Dijak; Jian Yang
2017-01-01
Context. Quantitative models of forest dynamics have followed a progression toward methods with increased detail, complexity, and spatial extent. Objectives. We highlight milestones in the development of forest dynamics models and identify future research and application opportunities. Methods. We reviewed...
[Information value of "additional tasks" method to evaluate pilot's work load].
Gorbunov, V V
2005-01-01
"Additional task" method was used to evaluate pilot's work load in prolonged flight. Calculated through durations of latent periods of motor responses, quantitative criterion of work load is more informative for objective evaluation of pilot's involvement in his piloting functions rather than of other registered parameters.
ERIC Educational Resources Information Center
Nielsen, Karina; Randall, Raymond; Christensen, Karl B.
2017-01-01
A mixed methods approach was applied to examine the effects of a naturally occurring teamwork intervention supported with training. The first objective was to integrate qualitative process evaluation and quantitative effect evaluation to examine "how" and "why" the training influenced intervention outcomes. The intervention (N =…
Peeters, Michael J; Vaidya, Varun A
2016-06-25
Objective. To describe an approach for assessing the Accreditation Council for Pharmacy Education's (ACPE) doctor of pharmacy (PharmD) Standard 4.4, which focuses on students' professional development. Methods. This investigation used mixed methods with triangulation of qualitative and quantitative data to assess professional development. Qualitative data came from an electronic developmental portfolio of professionalism and ethics, completed by PharmD students during their didactic studies. Quantitative confirmation came from the Defining Issues Test (DIT)-an assessment of pharmacists' professional development. Results. Qualitatively, students' development reflections described growth through this course series. Quantitatively, the 2015 PharmD class's DIT N2-scores illustrated positive development overall; the lower 50% had a large initial improvement compared to the upper 50%. Subsequently, the 2016 PharmD class confirmed these average initial improvements of students and also showed further substantial development among students thereafter. Conclusion. Applying an assessment for learning approach, triangulation of qualitative and quantitative assessments confirmed that PharmD students developed professionally during this course series.
Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J
2016-08-20
Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method to occupational health risk in enterprises using dimethylformamide (DMF) in a certain area in Jiangsu, China, and to put forward related risk control measures. Methods: The industries involving DMF exposure in Jiangsu province were chosen as the evaluation objects in 2013 and three risk assessment models were used in the evaluation. EPA inhalation risk assessment model: HQ = EC/RfC; Singapore semi-quantitative risk assessment model: Risk = (HR × ER)^(1/2); occupational hazards risk assessment index = 2^(health effect level) × 2^(exposure ratio) × operation condition level. Results: The hazard quotients (HQ > 1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method, and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The results of the Singapore semi-quantitative risk assessment model indicated that the risk levels of the dry method, wet method, and printing workshops were 3.5 (high), 3.5 (high), and 2.8 (general), and the position risk levels of pasting, burdening, unreeling, rolling, and assisting were 4 (high), 4 (high), 2.8 (general), 2.8 (general), and 2.8 (general). The results of the occupational hazards risk assessment index method demonstrated that the position risk indices of pasting, burdening, unreeling, rolling, and assisting were 42 (high), 33 (high), 23 (middle), 21 (middle), and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated that all workshops and positions were high risk.
Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions and can comprehensively and accurately evaluate occupational health risk caused by DMF.
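The three risk formulas quoted in the abstract can be sketched in code. This is a minimal illustration only: the input values below are hypothetical, not the study's measurements.

```python
import math

def epa_hazard_quotient(ec_mg_m3, rfc_mg_m3):
    """EPA inhalation model: HQ = EC / RfC; HQ > 1 flags high risk."""
    return ec_mg_m3 / rfc_mg_m3

def singapore_risk(hazard_rating, exposure_rating):
    """Singapore semi-quantitative model: Risk = (HR x ER)^(1/2)."""
    return math.sqrt(hazard_rating * exposure_rating)

def hazard_index(health_effect_level, exposure_ratio, operation_level):
    """Index = 2^(health effect level) x 2^(exposure ratio) x operation level."""
    return 2 ** health_effect_level * 2 ** exposure_ratio * operation_level

# Hypothetical inputs for illustration
hq = epa_hazard_quotient(ec_mg_m3=0.09, rfc_mg_m3=0.03)    # > 1 -> high risk
risk = singapore_risk(hazard_rating=4, exposure_rating=4)  # 4.0 -> high
idx = hazard_index(health_effect_level=3, exposure_ratio=2, operation_level=1)
print(hq, risk, idx)
```

Note how the index method multiplies exponential weights for health effect and exposure by a linear operation-condition factor, which is why its rankings track the Singapore model more closely than the single-ratio EPA quotient.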
Herbort, Carl P; Tugal-Tutkun, Ilknur
2017-06-01
Laser flare photometry (LFP) is an objective and quantitative method to measure intraocular inflammation. The LFP technology was developed in Japan and has been commercially available since 1990. The aim of this work was to review the application of LFP in uveitis practice in Europe compared to Japan where the technology was born. We reviewed PubMed articles published on LFP and uveitis. Although LFP has been largely integrated in routine uveitis practice in Europe, it has been comparatively neglected in Japan and still has not received FDA approval in the USA. As LFP is the only method that provides a precise measure of intraocular inflammation, it should be used as a gold standard in uveitis centres worldwide.
Objective comparison of particle tracking methods.
Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R; Godinez, William J; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E G; Jaldén, Joakim; Blau, Helen M; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P; Dan, Han-Wei; Tsai, Yuh-Show; Ortiz de Solórzano, Carlos; Olivo-Marin, Jean-Christophe; Meijering, Erik
2014-03-01
Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Because manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized an open competition in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to notable practical conclusions for users and developers.
Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A
2012-01-01
Objectives: Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources: All empirical articles (n = 1,651) published between 2003 and 2007 in four top-ranked health services journals. Study Design: All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed methods reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings: Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion: Few published health services research articles use mixed methods. The frequency of key methodological components is variable.
Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. PMID:22092040
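The chi-square comparison of component proportions used in the study above can be illustrated with a minimal sketch of Pearson's chi-square statistic for a 2×2 table. The closed form is standard; the example counts are hypothetical, not the article's data.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)
    for the 2x2 contingency table [[a, b], [c, d]].
    Closed form: n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical counts: methodological components present/absent
# in mixed methods papers (row 1) vs quantitative papers (row 2)
chi2 = chi_square_2x2(90, 320, 470, 530)
print(round(chi2, 2))
```

Comparing the resulting statistic against the chi-square critical value with 1 degree of freedom (3.84 at p = .05) reproduces the kind of significance test reported in the abstract.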
Towards standardized assessment of endoscope optical performance: geometric distortion
NASA Astrophysics Data System (ADS)
Wang, Quanzeng; Desai, Viraj N.; Ngo, Ying Z.; Cheng, Wei-Chung; Pfefer, Joshua
2013-12-01
Technological advances in endoscopes, such as capsule, ultrathin and disposable devices, promise significant improvements in safety, clinical effectiveness and patient acceptance. Unfortunately, the industry lacks test methods for preclinical evaluation of key optical performance characteristics (OPCs) of endoscopic devices that are quantitative, objective and well-validated. As a result, it is difficult for researchers and developers to compare image quality and evaluate equivalence to, or improvement upon, prior technologies. While endoscope OPCs include resolution, field of view, and depth of field, among others, our focus in this paper is geometric image distortion. We reviewed specific test methods for distortion and then developed an objective, quantitative test method based on well-defined experimental and data processing steps to evaluate radial distortion in the full field of view of an endoscopic imaging system. Our measurements and analyses showed that a second-degree polynomial equation could well describe the radial distortion curve of a traditional endoscope. The distortion evaluation method was effective for correcting the image and can be used to explain other widely accepted evaluation methods such as picture height distortion. Development of consensus standards based on promising test methods for image quality assessment, such as the method studied here, will facilitate clinical implementation of innovative endoscopic devices.
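As a sketch of the second-degree radial-distortion model the authors found adequate, the code below distorts a radial position and then inverts the model to correct it. The model form r_d = r + k·r² and the coefficient k are assumptions for illustration, not the paper's calibration data.

```python
def distorted_radius(r, k):
    """Map an undistorted radius r to its distorted radius under a
    second-degree polynomial model: r_d = r + k * r**2."""
    return r + k * r * r

def correct_radius(r_d, k, tol=1e-9):
    """Invert the (monotone) distortion model by bisection to recover
    the true radius from an observed distorted radius."""
    lo, hi = 0.0, 2.0 * max(r_d, 1.0)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if distorted_radius(mid, k) < r_d:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

k = -0.08       # hypothetical barrel-distortion coefficient (radii compressed)
r_true = 0.9    # normalized radial position of a test-target feature
r_obs = distorted_radius(r_true, k)
r_corr = correct_radius(r_obs, k)
print(r_obs, r_corr)
```

Applying the correction to every pixel radius is the image-correction step the abstract describes; picture-height distortion can then be read off the same curve at the image corner radius.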
Quantitative analysis of voids in percolating structures in two-dimensional N-body simulations
NASA Technical Reports Server (NTRS)
Harrington, Patrick M.; Melott, Adrian L.; Shandarin, Sergei F.
1993-01-01
We present in this paper a quantitative method for defining void size in large-scale structure based on percolation threshold density. Beginning with two-dimensional gravitational clustering simulations smoothed to the threshold of nonlinearity, we perform percolation analysis to determine the large scale structure. The resulting objective definition of voids has a natural scaling property, is topologically interesting, and can be applied immediately to redshift surveys.
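The percolation-based void definition can be illustrated with a minimal sketch (not the authors' code): connected underdense regions of a 2-D density field, thresholded at a percolation density, are labeled by flood fill and measured.

```python
from collections import deque

def label_voids(density, threshold):
    """Return sizes (cell counts) of 4-connected regions whose density is
    below the percolation threshold, largest first."""
    rows, cols = len(density), len(density[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for i in range(rows):
        for j in range(cols):
            if seen[i][j] or density[i][j] >= threshold:
                continue
            # flood-fill one underdense ("void") region
            size, queue = 0, deque([(i, j)])
            seen[i][j] = True
            while queue:
                x, y = queue.popleft()
                size += 1
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nx, ny = x + dx, y + dy
                    if (0 <= nx < rows and 0 <= ny < cols
                            and not seen[nx][ny]
                            and density[nx][ny] < threshold):
                        seen[nx][ny] = True
                        queue.append((nx, ny))
            sizes.append(size)
    return sorted(sizes, reverse=True)

# toy density field; cells below 1.0 are underdense
field = [
    [0.2, 0.3, 1.5, 0.1],
    [0.4, 1.2, 1.6, 0.2],
    [1.1, 1.3, 0.3, 0.2],
]
print(label_voids(field, threshold=1.0))
```

Sweeping the threshold and watching when the dense phase first spans the grid is the percolation analysis itself; the void sizes at that threshold give the scaling property the abstract mentions.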
2017-01-01
Background The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Objective Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. Methods A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Results Of the 23 IAM items, 21 were validated for content, while 2 were removed. 
In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). Conclusions This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. PMID:28292738
NASA Astrophysics Data System (ADS)
Latief, F. D. E.; Mohammad, I. H.; Rarasati, A. D.
2017-11-01
Digital imaging of a concrete sample using high-resolution tomographic imaging by means of X-ray micro computed tomography (μ-CT) has been conducted to assess the characteristics of the sample's structure. A standard procedure of image acquisition, reconstruction and image processing using a particular scanning device, i.e. the Bruker SkyScan 1173 High Energy Micro-CT, is elaborated. Qualitative and quantitative analyses were briefly performed on the sample to illustrate the capability of the system and the bundled software package. Calculations of total VOI volume, object volume, percent object volume, total VOI surface, object surface, object surface/volume ratio, object surface density, structure thickness, structure separation and total porosity were performed and analysed. This paper should serve as a brief description of how the device can produce the preferred image quality as well as the ability of the bundled software packages to assist in qualitative and quantitative analysis.
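Several of the global morphometric quantities listed in the abstract are simple ratios of the volume of interest (VOI), the segmented object (solid) volume, and the object surface. The sketch below computes them from hypothetical numbers, not the paper's measurements.

```python
def ct_morphometrics(voi_volume_mm3, object_volume_mm3, object_surface_mm2):
    """Global CT morphometrics from a VOI and its segmented solid phase."""
    percent_object_volume = 100.0 * object_volume_mm3 / voi_volume_mm3  # %
    surface_to_volume = object_surface_mm2 / object_volume_mm3          # 1/mm
    surface_density = object_surface_mm2 / voi_volume_mm3               # 1/mm
    total_porosity = 100.0 - percent_object_volume                      # %
    return (percent_object_volume, surface_to_volume,
            surface_density, total_porosity)

# hypothetical example: 1000 mm^3 VOI, 850 mm^3 solid, 4250 mm^2 surface
pov, sv, sd, por = ct_morphometrics(1000.0, 850.0, 4250.0)
print(pov, sv, sd, por)
```

Structure thickness and separation require local sphere-fitting on the binarized stack and are left to the vendor software the paper describes.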
Object-oriented Persistent Homology
Wang, Bao; Wei, Guo-Wei
2015-01-01
Persistent homology provides a new approach for the topological simplification of big data by measuring the lifetime of intrinsic topological features in a filtration process and has found success in scientific and engineering applications. However, such success is essentially limited to qualitative data classification and analysis. Indeed, persistent homology has rarely been employed for quantitative modeling and prediction. Additionally, present persistent homology is a passive tool, rather than a proactive technique, for classification and analysis. In this work, we outline a general protocol to construct object-oriented persistent homology methods. By means of the differential geometry theory of surfaces, we construct an objective functional, namely a surface free energy defined on the data of interest. The minimization of the objective functional leads to a Laplace-Beltrami operator which generates a multiscale representation of the initial data and offers an objective-oriented filtration process. The resulting differential-geometry-based object-oriented persistent homology is able to preserve desirable geometric features in the evolutionary filtration and enhances the corresponding topological persistence. The cubical complex based homology algorithm is employed in the present work to be compatible with the Cartesian representation of the Laplace-Beltrami flow. The proposed Laplace-Beltrami flow based persistent homology method is extensively validated. The consistency between Laplace-Beltrami flow based filtration and Euclidean distance based filtration is confirmed on the Vietoris-Rips complex in a large number of numerical tests. The convergence and reliability of the present Laplace-Beltrami flow based cubical complex filtration approach are analyzed over various spatial and temporal mesh sizes. The Laplace-Beltrami flow based persistent homology approach is utilized to study the intrinsic topology of proteins and fullerene molecules.
Based on a quantitative model which correlates the topological persistence of fullerene central cavity with the total curvature energy of the fullerene structure, the proposed method is used for the prediction of fullerene isomer stability. The efficiency and robustness of the present method are verified by more than 500 fullerene molecules. It is shown that the proposed persistent homology based quantitative model offers good predictions of total curvature energies for ten types of fullerene isomers. The present work offers the first example to design object-oriented persistent homology to enhance or preserve desirable features in the original data during the filtration process and then automatically detect or extract the corresponding topological traits from the data. PMID:26705370
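To make the notion of "topological persistence" concrete, here is a greatly simplified illustration, not the paper's cubical-complex or Laplace-Beltrami machinery: 0-dimensional persistence (connected components) of a planar point set under the Vietoris-Rips distance filtration, computed Kruskal-style with union-find. Each component is born at filtration value 0 and dies at the edge length that merges it into another.

```python
import itertools, math

def zero_dim_persistence(points):
    """Return the sorted death times of connected components as edges are
    added in order of increasing length (one component survives forever)."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in itertools.combinations(range(len(points)), 2)
    )
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)  # one component dies when two merge
    return deaths

# a tight cluster of three points plus one distant outlier
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]
print(zero_dim_persistence(pts))
```

The long-lived outlier component (large death time) is the kind of persistent feature the paper's filtration is designed to enhance or preserve, while short-lived components are treated as topological noise.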
ERIC Educational Resources Information Center
Bliss, Leonard B.; Tashakkori, Abbas
This paper discusses the objectives that would be appropriate for statistics classes for students who are not majoring in statistics, evaluation, or quantitative research design. These "non-majors" should be able to choose appropriate analytical methods for specific sets of data based on the research question and the nature of the data, and they…
Development of an Interactive Social Media Tool for Parents with Concerns about Vaccines
ERIC Educational Resources Information Center
Shoup, Jo Ann; Wagner, Nicole M.; Kraus, Courtney R.; Narwaney, Komal J.; Goddard, Kristin S.; Glanz, Jason M.
2015-01-01
Objective: Describe a process for designing, building, and evaluating a theory-driven social media intervention tool to help reduce parental concerns about vaccination. Method: We developed an interactive web-based tool using quantitative and qualitative methods (e.g., survey, focus groups, individual interviews, and usability testing). Results:…
The Path to Graduation: A Model Interactive Web Site Design Supporting Doctoral Students
ERIC Educational Resources Information Center
Simmons-Johnson, Nicole
2012-01-01
Objective. This 2-phase mixed method study assessed 2nd-year doctoral students' and dissertation students' perceptions of the current Graduate School of Education dissertation support Web site, with implications for designing a model dissertation support Web site. Methods. Phase 1 collected quantitative and qualitative data through an…
Sexual Health Promotion Programme: Participants' Perspectives on Capacity Building
ERIC Educational Resources Information Center
Keogh, Brian; Daly, Louise; Sharek, Danika; De Vries, Jan; McCann, Edward; Higgins, Agnes
2016-01-01
Objectives: The aim of this study was to evaluate a Health Service Executive (HSE) Foundation Programme in Sexual Health Promotion (FPSHP) with a specific emphasis on capacity building. Design: A mixed-method design using both quantitative and qualitative methods was used to collect the data. Setting: The FPSHP was delivered to staff working in…
DEVELOPMENT OF CRITERIA AND METHODS FOR EVALUATING TRAINER AIRCRAFT EFFECTIVENESS.
ERIC Educational Resources Information Center
KUSEWITT, J.B.
THE PURPOSE OF THIS STUDY WAS TO DEVELOP A METHOD FOR DETERMINING OBJECTIVE MEASURES OF TRAINER AIRCRAFT EFFECTIVENESS TO EVALUATE PROGRAM ALTERNATIVES FOR TRAINING PILOTS FOR FLEET FIGHTER AND ATTACK-TYPE AIRCRAFT. THE TRAINING SYLLABUS WAS BASED ON AVERAGE STUDENT ABILITY. THE BASIC PROBLEM WAS TO ESTABLISH QUANTITATIVE TIME-DIFFICULTY…
Development of a Computer-Based Visualised Quantitative Learning System for Playing Violin Vibrato
ERIC Educational Resources Information Center
Ho, Tracy Kwei-Liang; Lin, Huann-shyang; Chen, Ching-Kong; Tsai, Jih-Long
2015-01-01
Traditional methods of teaching music are largely subjective, with the lack of objectivity being particularly challenging for violin students learning vibrato because of the existence of conflicting theories. By using a computer-based analysis method, this study found that maintaining temporal coincidence between the intensity peak and the target…
A Rationale for Mixed Methods (Integrative) Research Programmes in Education
ERIC Educational Resources Information Center
Niaz, Mansoor
2008-01-01
Recent research shows that research programmes (quantitative, qualitative and mixed) in education are not displaced (as suggested by Kuhn) but rather lead to integration. The objective of this study is to present a rationale for mixed methods (integrative) research programs based on contemporary philosophy of science (Lakatos, Giere, Cartwright,…
Process Evaluation of a Parenting Program for Low-Income Families in South Africa
ERIC Educational Resources Information Center
Lachman, Jamie M.; Kelly, Jane; Cluver, Lucie; Ward, Catherine L.; Hutchings, Judy; Gardner, Frances
2018-01-01
Objective: This mixed-methods process evaluation examined the feasibility of a parenting program delivered by community facilitators to reduce the risk of child maltreatment in low-income families with children aged 3-8 years in Cape Town, South Africa (N = 68). Method: Quantitative measures included attendance registers, fidelity checklists,…
Although two-dimensional electrophoresis (2D-GE) remains the basis for many ecotoxicoproteomic analyses, new, non gel-based methods are beginning to be applied to overcome throughput and coverage limitations of 2D-GE. The overall objective of our research was to apply a comprehe...
Model-assisted development of a laminography inspection system
NASA Astrophysics Data System (ADS)
Grandin, R.; Gray, J.
2012-05-01
Traditional computed tomography (CT) is an effective method of determining the internal structure of an object through non-destructive means; however, inspection of certain objects, such as those with planar geometries or with limited access, requires an alternate approach. An alternative is laminography, which has been the focus of a number of researchers in the past decade for both medical and industrial inspections. Many research efforts rely on geometrically simple analytical models, such as the Shepp-Logan phantom, for the development of their algorithms. Recent work at the Center for Non-Destructive Evaluation makes extensive use of a forward model, XRSIM, to study artifacts arising from the reconstruction method, the effects of complex geometries, and known issues such as high-density features in the laminography reconstruction process. The use of a model provides full knowledge of all aspects of the geometry and provides a means to quantitatively evaluate the impact of methods designed to reduce artifacts that are generated by the reconstruction methods or that result from the part geometry. We will illustrate the use of forward simulations to quantitatively assess reconstruction algorithm development and artifact reduction.
Near-infrared fluorescence image quality test methods for standardized performance evaluation
NASA Astrophysics Data System (ADS)
Kanniyappan, Udayakumar; Wang, Bohan; Yang, Charles; Ghassemi, Pejhman; Wang, Quanzeng; Chen, Yu; Pfefer, Joshua
2017-03-01
Near-infrared fluorescence (NIRF) imaging has gained much attention as a clinical method for enhancing visualization of cancers, perfusion and biological structures in surgical applications where a fluorescent dye is monitored by an imaging system. In order to address the emerging need for standardization of this innovative technology, it is necessary to develop and validate test methods suitable for objective, quantitative assessment of device performance. Towards this goal, we develop target-based test methods and investigate best practices for key NIRF imaging system performance characteristics including spatial resolution, depth of field and sensitivity. Characterization of fluorescence properties was performed by generating excitation-emission matrix properties of indocyanine green and quantum dots in biological solutions and matrix materials. A turbid, fluorophore-doped target was used, along with a resolution target for assessing image sharpness. Multi-well plates filled with either liquid or solid targets were generated to explore best practices for evaluating detection sensitivity. Overall, our results demonstrate the utility of objective, quantitative, target-based testing approaches as well as the need to consider a wide range of factors in establishing standardized approaches for NIRF imaging system performance.
3D methodology for evaluating rail crossing roughness.
DOT National Transportation Integrated Search
2015-03-02
Description of Research Project: The overall objective of this project is to investigate and develop a quantitative method or measure for determining the need to rehabilitate rail crossings. The scope of the project includes investigation of sensor capabi...
Influence of echo time in quantitative proton MR spectroscopy using LCModel.
Yamamoto, Tetsuya; Isobe, Tomonori; Akutsu, Hiroyoshi; Masumoto, Tomohiko; Ando, Hiroki; Sato, Eisuke; Takada, Kenta; Anno, Izumi; Matsumura, Akira
2015-06-01
The objective of this study was to elucidate the influence on quantitative analysis using LCModel when the echo time (TE) is longer than the values recommended in the spectrum acquisition specifications. A 3T magnetic resonance system was used to perform proton magnetic resonance spectroscopy. The participants were 5 healthy volunteers and 11 patients with glioma. Data were collected at TE of 72, 144 and 288 ms. LCModel was used to quantify several metabolites (N-acetylaspartate, creatine and phosphocreatine, and choline-containing compounds). The results were compared with quantitative values obtained using the T2-corrected internal reference method. In healthy volunteers, when TE was long, the quantitative values obtained using LCModel were up to 6.8-fold larger (p<0.05) than those obtained using the T2-corrected internal reference method. The ratios of the quantitative values obtained by the two methods differed between metabolites (p<0.05). In patients with glioma, the ratios of quantitative values obtained by the two methods tended to be larger at longer TE, as in healthy volunteers, and large between-individual variation in the ratios was observed. In clinical practice, TE is sometimes set longer than the value recommended for LCModel. If TE is long, LCModel overestimates the quantitative value since it cannot compensate for signal attenuation, and this effect differs for each metabolite and condition. Therefore, if TE is longer than recommended, it is necessary to account for the possibly reduced reliability of quantitative values calculated using LCModel. Copyright © 2015 Elsevier Inc. All rights reserved.
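The TE-dependent bias described above comes from uncompensated T2 relaxation: each signal decays as S = S0·exp(−TE/T2), and metabolite and reference decay at different rates. The sketch below shows the T2 correction idea; the T2 values are hypothetical illustrations, not the study's estimates.

```python
import math

def t2_corrected_signal(measured_signal, te_ms, t2_ms):
    """Recover the TE = 0 signal: S0 = S / exp(-TE / T2)."""
    return measured_signal / math.exp(-te_ms / t2_ms)

t2_metab, t2_ref = 250.0, 80.0   # hypothetical T2 values (ms)
te = 288.0                       # longest echo time used in the study (ms)

# Simulated measured signals for unit true concentrations (S0 = 1)
s_metab = math.exp(-te / t2_metab)
s_ref = math.exp(-te / t2_ref)

# Uncorrected metabolite/reference ratio is inflated because the
# reference decays faster; correcting both signals removes the bias.
raw_ratio = s_metab / s_ref
corr_ratio = (t2_corrected_signal(s_metab, te, t2_metab)
              / t2_corrected_signal(s_ref, te, t2_ref))
print(raw_ratio, corr_ratio)
```

The gap between the raw and corrected ratios grows with TE, which is consistent with the abstract's finding that overestimation worsens at longer echo times and differs per metabolite.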
Sornborger, Andrew; Broder, Josef; Majumder, Anirban; Srinivasamoorthy, Ganesh; Porter, Erika; Reagin, Sean S; Keith, Charles; Lauderdale, James D
2008-09-01
Ratiometric fluorescent indicators are used for making quantitative measurements of a variety of physiological variables. Their utility is often limited by noise. This is the second in a series of papers describing statistical methods for denoising ratiometric data with the aim of obtaining improved quantitative estimates of variables of interest. Here, we outline a statistical optimization method that is designed for the analysis of ratiometric imaging data in which multiple measurements have been taken of systems responding to the same stimulation protocol. This method takes advantage of correlated information across multiple datasets for objectively detecting and estimating ratiometric signals. We demonstrate our method by showing results of its application on multiple, ratiometric calcium imaging experiments.
Leung, Janet T Y; Shek, Daniel T L
2011-01-01
This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. With the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed methods approach can integrate quantitative and qualitative approaches and offers an alternative strategy for studying the impact of economic disadvantage on family processes and adolescent development.
Objective comparison of particle tracking methods
Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F.; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R.; Godinez, William J.; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E. G.; Jaldén, Joakim; Blau, Helen M.; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L.; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P.; Dan, Han-Wei; Tsai, Yuh-Show; de Solórzano, Carlos Ortiz; Olivo-Marin, Jean-Christophe; Meijering, Erik
2014-01-01
Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Since manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized, for the first time, an open competition, in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to important practical conclusions for users and developers. PMID:24441936
NASA Astrophysics Data System (ADS)
Batyaev, V. F.; Belichenko, S. G.; Bestaev, R. R.
2016-04-01
This work is devoted to a quantitative comparison of different inorganic scintillators to be used in neutron-radiation inspection systems. Such systems can be based on the tagged neutron (TN) method and have significant potential in different applications such as detection of explosives, drugs and mines, identification of chemical warfare agents, and assay of nuclear materials and human body composition [1]-[3]. The elemental composition of an inspected object is determined via spectrometry of gammas from the object bombarded by neutrons, which are tagged by an alpha-detector built inside a neutron generator. This creates the task of finding a quantitative indicator of the object identification quality (via elemental composition) as a function of the basic parameters of the γ-detectors, such as their efficiency and their energy and time resolutions, which in turn are generally defined by the scintillator of the detector. We have tried to solve this task for a set of four scintillators that are often used in studies of the TN method, namely BGO, LaBr3, LYSO and NaI(Tl), whose basic parameters are well known [4]-[7].
Quantitative Evaluation of the Use of Actigraphy for Neurological and Psychiatric Disorders
Song, Yu; Kwak, Shin; Yoshida, Sohei; Yamamoto, Yoshiharu
2014-01-01
Quantitative and objective evaluation of disease severity and/or drug effect is necessary in clinical practice. Wearable accelerometers such as an actigraph enable long-term recording of a patient's movement during activities and can be used for quantitative assessment of symptoms due to various diseases. We reviewed applications of actigraphy with analytical methods that are sufficiently sensitive and reliable to determine the severity of diseases and disorders such as motor and nonmotor disorders like Parkinson's disease, sleep disorders, depression, behavioral and psychological symptoms of dementia (BPSD) in vascular dementia (VD), seasonal affective disorder (SAD), and stroke, as well as the effects of drugs used to treat them. We believe it is possible to develop analytical methods to assess additional neurological or psychiatric disorders using actigraphy records. PMID:25214709
Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E
2014-01-01
This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease and comparisons to semiquantitative and qualitative methods are limited. Dual bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively). 
Fully quantitative stress perfusion CMR has high diagnostic accuracy for detecting obstructive coronary artery disease. QP outperforms semiquantitative measures of perfusion and qualitative methods that incorporate a combination of cine, perfusion, and late gadolinium enhancement imaging. These findings suggest a potential clinical role for quantitative stress perfusion CMR. Copyright © 2014 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
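The diagnostic metrics reported above (sensitivity and specificity at the optimal QP threshold) can be sketched with a minimal example. Here a "positive" QP result is an endocardial/epicardial flow ratio below 0.50, as in the abstract; the patient data are hypothetical, not the study's.

```python
def sens_spec(scores, labels, threshold):
    """scores: endo/epi flow ratios; labels: True if >= 70% stenosis on
    quantitative coronary angiography. Test is positive when the ratio
    falls below the threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if s < threshold and y)
    fn = sum(1 for s, y in zip(scores, labels) if s >= threshold and y)
    tn = sum(1 for s, y in zip(scores, labels) if s >= threshold and not y)
    fp = sum(1 for s, y in zip(scores, labels) if s < threshold and not y)
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical cohort of eight patients
ratios = [0.30, 0.42, 0.45, 0.48, 0.80, 0.95, 0.70, 0.60]
stenosis = [True, True, False, True, False, False, False, True]
sensitivity, specificity = sens_spec(ratios, stenosis, threshold=0.50)
print(sensitivity, specificity)
```

Sweeping the threshold over all observed ratios and plotting sensitivity against 1 − specificity yields the receiver-operating characteristic curve from which the study's optimal cut point and area under the curve were derived.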
Anguera, M. Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2018-01-01
Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. 
The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts. PMID:29441028
A method for three-dimensional quantitative observation of the microstructure of biological samples
NASA Astrophysics Data System (ADS)
Wang, Pengfei; Chen, Dieyan; Ma, Wanyun; Wu, Hongxin; Ji, Liang; Sun, Jialin; Lv, Danyu; Zhang, Lu; Li, Ying; Tian, Ning; Zheng, Jinggao; Zhao, Fengying
2009-07-01
Contemporary biology has developed into the era of cell biology and molecular biology, and researchers now try to study the mechanisms of all kinds of biological phenomena at the microcosmic level. Accurate description of the microstructure of biological samples is an exigent need in many biomedical experiments. This paper introduces a method for 3-dimensional quantitative observation of the microstructure of vital biological samples based on two-photon laser scanning microscopy (TPLSM). TPLSM is a novel kind of fluorescence microscopy that offers low optical damage, high resolution, deep penetration depth, and suitability for 3-dimensional (3D) imaging. Fluorescently stained samples were observed by TPLSM, and their original shapes were then obtained through 3D image reconstruction. The spatial distribution of all objects in the samples, as well as their volumes, could be derived by image segmentation and mathematical calculation. The 3-dimensionally and quantitatively depicted microstructure of the samples was thus finally derived. We applied this method to quantitative analysis of the spatial distribution of chromosomes in meiotic mouse oocytes at metaphase and obtained promising results.
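Once segmentation has produced a binary mask, the volume computation described above reduces to counting voxels and scaling by the voxel dimensions. A minimal sketch in Python (the function name and voxel spacing are illustrative assumptions, not the authors' code):

```python
# Hypothetical sketch: object volume from a segmented 3D image stack.
# The voxel spacing below is an assumed TPLSM sampling, not the study's value.

def object_volume(mask, voxel_size_um=(0.2, 0.2, 0.5)):
    """Volume of a binary 3D mask (list of 2D slices) in cubic micrometers.

    mask: nested lists mask[z][y][x] with truthy entries inside the object.
    voxel_size_um: (x, y, z) edge lengths of one voxel in micrometers.
    """
    vx, vy, vz = voxel_size_um
    # count the voxels that belong to the object, then scale by voxel volume
    voxels = sum(1 for plane in mask for row in plane for v in row if v)
    return voxels * vx * vy * vz

# a 2x2x2 all-foreground toy mask: 8 voxels of 0.02 um^3 each
tiny = [[[1, 1], [1, 1]], [[1, 1], [1, 1]]]
volume = object_volume(tiny)
```

Per-object spatial statistics (e.g. chromosome centroids) follow the same pattern of summing over the mask.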
Detection of blob objects in microscopic zebrafish images based on gradient vector diffusion.
Li, Gang; Liu, Tianming; Nie, Jingxin; Guo, Lei; Malicki, Jarema; Mara, Andrew; Holley, Scott A; Xia, Weiming; Wong, Stephen T C
2007-10-01
The zebrafish has become an important vertebrate animal model for the study of developmental biology, functional genomics, and disease mechanisms. It is also being used for drug discovery. Computerized detection of blob objects has been one of the important tasks in quantitative phenotyping of zebrafish. We present a new automated method that is able to detect blob objects, such as nuclei or cells in microscopic zebrafish images. This method is composed of three key steps. The first step is to produce a diffused gradient vector field by a physical elastic deformable model. In the second step, the flux image is computed on the diffused gradient vector field. The third step performs thresholding and nonmaximum suppression based on the flux image. We report the validation and experimental results of this method using zebrafish image datasets from three independent research labs. Both sensitivity and specificity of this method are over 90%. This method is able to differentiate closely juxtaposed or connected blob objects, with high sensitivity and specificity in different situations. It is characterized by a good, consistent performance in blob object detection.
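The sensitivity and specificity figures quoted above follow from standard confusion-table arithmetic over detected versus ground-truth blob objects. A minimal sketch, with counts and names chosen for illustration only:

```python
# Illustrative validation arithmetic: sensitivity and specificity from a
# 2x2 confusion table. The example counts are invented, not the study's data.

def sensitivity_specificity(tp, fn, fp, tn):
    """Return (sensitivity, specificity) from true/false positive/negative counts."""
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return sensitivity, specificity

# e.g. 95 of 100 true nuclei detected, 4 false detections among 80 negatives
sens, spec = sensitivity_specificity(tp=95, fn=5, fp=4, tn=76)
```

Both values exceed 0.9 in this toy case, matching the kind of performance the abstract reports.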
Designing a mixed methods study in primary care.
Creswell, John W; Fetters, Michael D; Ivankova, Nataliya V
2004-01-01
Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research.
Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes
2018-05-18
Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.
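Linearity validation of the kind reported above (a correlation coefficient over the 1.0-3.0 mg range) rests on an ordinary least-squares calibration fit. A minimal pure-Python sketch with hypothetical instrument readings (the actual ICH validation data are not reproduced here):

```python
# Minimal sketch of a least-squares calibration line for method validation.
# The (mass, signal) pairs below are hypothetical, not the study's data.

def linear_fit(xs, ys):
    """Ordinary least-squares slope, intercept, and correlation coefficient r."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r

# hypothetical calibration over the validated 1.0-3.0 mg range
masses = [1.0, 1.5, 2.0, 2.5, 3.0]
signals = [0.210, 0.312, 0.409, 0.513, 0.611]
slope, intercept, r = linear_fit(masses, signals)
```

A validated method would require r very close to 1 over this range, as the abstract's >0.9999 criterion illustrates.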
Jin, Brian; Wang, Dingxin; Lewandowski, Robert J.; Ryu, Robert K.; Sato, Kent T.; Larson, Andrew C.; Salem, Riad; Omary, Reed A.
2011-01-01
PURPOSE We aimed to test the hypothesis that subjective angiographic endpoints during transarterial chemoembolization (TACE) of hepatocellular carcinoma (HCC) exhibit consistency and correlate with objective intraprocedural reductions in tumor perfusion as determined by quantitative four dimensional (4D) transcatheter intraarterial perfusion (TRIP) magnetic resonance (MR) imaging. MATERIALS AND METHODS This prospective study was approved by the institutional review board. Eighteen consecutive patients underwent TACE in a combined MR/interventional radiology (MR-IR) suite. Three board-certified interventional radiologists independently graded the angiographic endpoint of each procedure based on a previously described subjective angiographic chemoembolization endpoint (SACE) scale. A consensus SACE rating was established for each patient. Patients underwent quantitative 4D TRIP-MR imaging immediately before and after TACE, from which mean whole tumor perfusion (Fρ) was calculated. Consistency of SACE ratings between observers was evaluated using the intraclass correlation coefficient (ICC). The relationship between SACE ratings and intraprocedural TRIP-MR imaging perfusion changes was evaluated using Spearman’s rank correlation coefficient. RESULTS The SACE rating scale demonstrated very good consistency among all observers (ICC = 0.80). The consensus SACE rating was significantly correlated with both absolute (r = 0.54, P = 0.022) and percent (r = 0.85, P < 0.001) intraprocedural perfusion reduction. CONCLUSION The SACE rating scale demonstrates very good consistency between raters, and significantly correlates with objectively measured intraprocedural perfusion reductions during TACE. These results support the use of the SACE scale as a standardized alternative method to quantitative 4D TRIP-MR imaging to classify patients based on embolic endpoints of TACE. PMID:22021520
Bao, Yijun; Gaylord, Thomas K
2016-11-01
Multifilter phase imaging with partially coherent light (MFPI-PC) is a promising new quantitative phase imaging method. However, the existing MFPI-PC method is based on the paraxial approximation. In the present work, an analytical nonparaxial partially coherent phase optical transfer function is derived. This enables the MFPI-PC to be extended to the realistic nonparaxial case. Simulations over a wide range of test phase objects as well as experimental measurements on a microlens array verify higher levels of imaging accuracy compared to the paraxial method. Unlike the paraxial version, the nonparaxial MFPI-PC with obliquity factor correction exhibits no systematic error. In addition, due to its analytical expression, the increase in computation time compared to the paraxial version is negligible.
Content Validity of National Post Marriage Educational Program Using Mixed Methods
MOHAJER RAHBARI, Masoumeh; SHARIATI, Mohammad; KERAMAT, Afsaneh; YUNESIAN, Masoud; ESLAMI, Mohammad; MOUSAVI, Seyed Abbas; MONTAZERI, Ali
2015-01-01
Background: Although the content validity of a program is mostly assessed with qualitative methods, this study used both qualitative and quantitative methods to validate the content of the post marriage training program provided for newly married couples. Content validation is a preliminary step toward obtaining the authorization required to install the program in the country's health care system. Methods: This mixed-methods content validation study was carried out in four steps, with three expert panels formed. Altogether, 24 expert panelists were involved in the three qualitative and quantitative panels: 6 in the first (item development) panel; 12 in the second (item reduction) panel, 4 of whom also served on the first; and 10 executive experts in the last panel, organized to evaluate the psychometric properties of the content validity ratio (CVR), the content validity index (CVI), and the face validity of 57 educational objectives. Results: The raw content of the post marriage program had been written by professional experts of the Ministry of Health; through the qualitative expert panel, it was further developed by generating 3 topics and refining one topic and its respective content. In the second panel, six further objectives were deleted: three for falling below the agreement cut-off point and three by experts' consensus. In the quantitative assessment, the content validity ratios of all remaining items were above 0.8, and their content validity indices (0.8–1) were fully appropriate. Conclusion: This study provided good evidence for the validation and accreditation of the national post marriage program planned for newly married couples in the country's health centers in the near future. PMID:26056672
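The CVR values reported above follow Lawshe's formula, CVR = (ne − N/2)/(N/2), where ne is the number of panelists rating an item "essential" and N is the panel size. A small illustrative sketch:

```python
# Lawshe's content validity ratio. Panel sizes and counts below are
# illustrative, not the study's item-level data.

def content_validity_ratio(n_essential, n_panelists):
    """CVR = (ne - N/2) / (N/2); ranges from -1 (none essential) to +1 (all)."""
    half = n_panelists / 2
    return (n_essential - half) / half

# a 10-member panel where 8 experts rate the objective "essential"
cvr = content_validity_ratio(8, 10)
```

With this formula, an item clears the study's 0.8 criterion only when nearly all panelists agree it is essential (9 of 10 gives CVR = 0.8).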
Wirtz, M A; Strohmer, J
2016-06-01
In order to develop and evaluate interventions in rehabilitation research, a wide range of empirical research methods may be adopted. Qualitative research methods emphasize an open research focus and a natural proximity to the object of research. Accordingly, qualitative methods offer particular benefits when researchers strive to identify and organize previously unknown information (inductive purpose). Quantitative research methods, in contrast, require a high degree of standardization and transparency of the research process, along with a clear definition of efficacy and effectiveness (deductive purpose). The two paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes, and quality criteria. Hence, specific aspects have to be considered if researchers aim to select or combine these approaches in order to ensure an optimal gain in knowledge. © Georg Thieme Verlag KG Stuttgart · New York.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, D.G.; Parks, J.M.
1984-04-01
Silhouette shapes are two-dimensional projections of three-dimensional objects such as sand grains, gravel, and fossils. Within-the-margin markings such as chamber boundaries, sutures, or ribs are ignored. Comparisons between populations of objects from similar and different origins (i.e., environments, species or genera, growth series, etc.) are aided by quantifying the shapes. The Multiple Rotations Method (MRM) uses a variation of ''eigenshapes'', which is capable of distinguishing most of the subtle variations that the ''trained eye'' can detect. With a video-digitizer and microcomputer, MRM is fast, more accurate, and more objective than the human eye. The resulting shape descriptors comprise 5 or 6 numbers per object that can be stored and retrieved for comparison with similar descriptions of other objects. The original shape outlines can be reconstituted from these few numerical descriptors sufficiently well for gross recognition. Thus, a semi-automated data-retrieval system becomes feasible, with silhouette-shape descriptions as one of several recognition criteria. MRM consists of four ''rotations'': rotation about a center to a comparable orientation; a principal-components rotation to reduce the many original shape descriptors to a few; a VARIMAX orthogonal-factor rotation to achieve simple structure; and a rotation to obtain factor scores on individual objects. A variety of subtly different shapes, including sand grains from several locations, ages, and environments, and fossils of several types, illustrates the feasibility of quantitative comparisons by MRM.
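The first of the four MRM rotations, bringing every outline to a comparable orientation, can be sketched by centering the outline and aligning its principal axis with the x-axis. A hypothetical pure-Python version (the original microcomputer implementation is not available; names are our own):

```python
import math

# Sketch of orientation normalization for a digitized 2D outline: center it,
# then rotate so the principal axis of the point cloud lies along x.

def normalize_orientation(points):
    """Return the outline centered at the origin and rotated to its principal axis.

    points: list of (x, y) outline coordinates.
    """
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    xs = [p[0] - cx for p in points]
    ys = [p[1] - cy for p in points]
    # 2x2 covariance of the centered outline
    sxx = sum(x * x for x in xs) / n
    syy = sum(y * y for y in ys) / n
    sxy = sum(x * y for x, y in zip(xs, ys)) / n
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)  # principal-axis angle
    c, s = math.cos(-theta), math.sin(-theta)     # rotate by -theta
    return [(x * c - y * s, x * s + y * c) for x, y in zip(xs, ys)]
```

After this step, outlines from differently oriented grains are directly comparable, and the subsequent principal-components rotation can reduce them to the few descriptors the abstract mentions.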
Evaluation of macrozone dimensions by ultrasound and EBSD techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreau, Andre, E-mail: Andre.Moreau@cnrc-nrc.gc.ca; Toubal, Lotfi; Ecole de technologie superieure, 1100, rue Notre-Dame Ouest, Montreal, QC, Canada H3C 1K3
2013-01-15
Titanium alloys are known to have texture heterogeneities, i.e. regions much larger than the grain dimensions, where the local orientation distribution of the grains differs from one region to the next. The electron backscattering diffraction (EBSD) technique is the method of choice to characterize these macro regions, which are called macrozones. Qualitatively, the images obtained by EBSD show that these macrozones may be larger or smaller, elongated or equiaxed. However, often no well-defined boundaries are observed between the macrozones and it is very hard to obtain objective and quantitative estimates of the macrozone dimensions from these data. In the present work, we present a novel, non-destructive ultrasonic technique that provides objective and quantitative characteristic dimensions of the macrozones. The obtained dimensions are based on the spatial autocorrelation function of fluctuations in the sound velocity. Thus, a pragmatic definition of macrozone dimensions naturally arises from the ultrasonic measurement. This paper has three objectives: 1) to disclose the novel, non-destructive ultrasonic technique to measure macrozone dimensions, 2) to propose a quantitative and objective definition of macrozone dimensions adapted to and arising from the ultrasonic measurement, and which is also applicable to the orientation data obtained by EBSD, and 3) to compare the macrozone dimensions obtained using the two techniques on two samples of the near-alpha titanium alloy IMI834. In addition, it was observed that macrozones may present a semi-periodical arrangement.
Highlights: • Discloses a novel, ultrasonic NDT technique to measure macrozone dimensions. • Proposes a quantitative and objective definition of macrozone dimensions. • Compares macrozone dimensions obtained using EBSD and ultrasonics on 2 Ti samples. • Observes that macrozones may have a semi-periodical arrangement.
Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.
2012-01-01
The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. This method can be used to overcome many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600
Cordero, Chiara; Canale, Francesca; Del Rio, Daniele; Bicchi, Carlo
2009-11-01
The present study is focused on the flavan-3-ols responsible for the antioxidant properties of fermented tea (Camellia sinensis). These bioactive compounds, the object of nutritional claims in commercial products, should be quantified with rigorous analytical procedures whose accuracy and precision have been established with a stated level of confidence. An HPLC-UV/DAD method, able to detect and quantify flavan-3-ols in infusions and ready-to-drink teas, has been developed for routine analysis and validated by characterizing several performance parameters. The accuracy assessment was run through a series of LC-MS/MS analyses. Epigallocatechin, (+)-catechin, (-)-epigallocatechingallate, (-)-epicatechin, (-)-gallocatechingallate, (-)-epicatechingallate, and (-)-catechingallate were chosen as markers of the polyphenolic fraction. Quantitative results showed that samples obtained from tea-leaf infusion were richer in polyphenolic antioxidants than those obtained through other industrial processes. The influence of shelf life and packaging material on the flavan-3-ol content was also considered; marker levels decreased with an exponential trend over the shelf life, while packaging materials were shown to influence the flavan-3-ol fraction composition differently over time. The method presented here provides quantitative results with a stated level of confidence and is suitable for routine quality control of iced teas whose antioxidant properties are the object of a nutritional claim.
NASA Astrophysics Data System (ADS)
Boehm, H. F.; Fink, C.; Becker, C.; Reiser, M.
2007-03-01
Reliable and accurate methods for objective quantitative assessment of parenchymal alterations in the lung are necessary for diagnosis, treatment, and follow-up of pulmonary diseases. Two major types of alteration are pulmonary emphysema and fibrosis, emphysema being characterized by abnormal enlargement of the air spaces distal to the terminal, nonrespiratory bronchiole, accompanied by destructive changes of the alveolar walls. The main characteristic of fibrosis is coarsening of the interstitial fibers and compaction of the pulmonary tissue. With its ability to display anatomy free from superimposing structures and with greater visual clarity, Multi-Detector-CT has been shown to be more sensitive than the chest radiograph in identifying alterations of lung parenchyma. In automated evaluation of pulmonary CT scans, quantitative image processing techniques are applied for objective evaluation of the data. A number of methods have been proposed in the past, most of which utilize simple densitometric tissue features based on the mean X-ray attenuation coefficients expressed in Hounsfield Units [HU]. Due to partial volume effects, most density-based methodologies tend to fail, particularly in cases where emphysema and fibrosis occur within narrow spatial limits. In this study, we propose a methodology based upon the topological assessment of the gray-level distribution in 3D image data of lung tissue, which provides a way of improving quantitative CT evaluation. Results are compared to the more established density-based methods.
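The densitometric baseline the authors compare against can be illustrated by a low-attenuation index: the percentage of lung voxels whose HU value falls below a fixed cut-off. A minimal sketch, assuming the commonly used −950 HU emphysema threshold (the study's actual thresholds are not given in the abstract):

```python
# Sketch of a densitometric emphysema index: the fraction of lung voxels
# below a density cut-off. The -950 HU threshold is a conventional choice,
# assumed here for illustration.

def low_attenuation_fraction(hu_values, threshold_hu=-950):
    """Percentage of voxels with attenuation below `threshold_hu`."""
    if not hu_values:
        return 0.0
    below = sum(1 for v in hu_values if v < threshold_hu)
    return 100.0 * below / len(hu_values)

# four toy voxels: two emphysematous (below -950 HU), two normal lung
laa_percent = low_attenuation_fraction([-980, -960, -940, -900])
```

Partial volume effects corrupt exactly this kind of per-voxel thresholding, which motivates the topological gray-level approach proposed in the abstract.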
A benchmark for comparison of dental radiography analysis algorithms.
Wang, Ching-Wei; Huang, Cheng-Ta; Lee, Jia-Hong; Li, Chung-Hsing; Chang, Sheng-Wei; Siao, Ming-Jhih; Lai, Tat-Ming; Ibragimov, Bulat; Vrtovec, Tomaž; Ronneberger, Olaf; Fischer, Philipp; Cootes, Tim F; Lindner, Claudia
2016-07-01
Dental radiography plays an important role in clinical diagnosis, treatment and surgery. In recent years, efforts have been made on developing computerized dental X-ray image analysis systems for clinical usages. A novel framework for objective evaluation of automatic dental radiography analysis algorithms has been established under the auspices of the IEEE International Symposium on Biomedical Imaging 2015 Bitewing Radiography Caries Detection Challenge and Cephalometric X-ray Image Analysis Challenge. In this article, we present the datasets, methods and results of the challenge and lay down the principles for future uses of this benchmark. The main contributions of the challenge include the creation of the dental anatomy data repository of bitewing radiographs, the creation of the anatomical abnormality classification data repository of cephalometric radiographs, and the definition of objective quantitative evaluation for comparison and ranking of the algorithms. With this benchmark, seven automatic methods for analysing cephalometric X-ray image and two automatic methods for detecting bitewing radiography caries have been compared, and detailed quantitative evaluation results are presented in this paper. Based on the quantitative evaluation results, we believe automatic dental radiography analysis is still a challenging and unsolved problem. The datasets and the evaluation software will be made available to the research community, further encouraging future developments in this field. (http://www-o.ntust.edu.tw/~cweiwang/ISBI2015/). Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
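Objective quantitative evaluation of landmark-based cephalometric analysis typically reduces to point-to-point distances against expert annotations. A hedged sketch of two such metrics, mean radial error and a within-tolerance success rate (names and the 2 mm tolerance are illustrative assumptions, not necessarily the challenge's exact protocol):

```python
import math

# Sketch of landmark-evaluation metrics: mean Euclidean error and the
# fraction of landmarks detected within a clinical tolerance.

def mean_radial_error(predicted, truth):
    """Mean Euclidean distance between predicted and reference landmarks."""
    dists = [math.hypot(px - tx, py - ty)
             for (px, py), (tx, ty) in zip(predicted, truth)]
    return sum(dists) / len(dists)

def success_rate(predicted, truth, tol=2.0):
    """Fraction of landmarks within `tol` (e.g. 2 mm) of the reference point."""
    hits = sum(1 for (px, py), (tx, ty) in zip(predicted, truth)
               if math.hypot(px - tx, py - ty) <= tol)
    return hits / len(truth)
```

Ranking competing algorithms on a shared test set by such metrics is what makes a benchmark comparison objective and reproducible.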
Physical activity among South Asian women: a systematic, mixed-methods review
2012-01-01
Introduction The objective of this systematic mixed-methods review is to assess what is currently known about the levels of physical activity (PA) and sedentary time (ST) and to contextualize these behaviors among South Asian women with an immigrant background. Methods A systematic search of the literature was conducted using combinations of the key words PA, ST, South Asian, and immigrant. A mixed-methods approach was used to analyze and synthesize all evidence, both quantitative and qualitative. Twenty-six quantitative and twelve qualitative studies were identified as meeting the inclusion criteria. Results Studies quantifying PA and ST among South Asian women showed low levels of PA compared with South Asian men and with white European comparison populations. However making valid comparisons between studies was challenging due to a lack of standardized PA measurement. The majority of studies indicated that South Asian women did not meet recommended amounts of PA for health benefits. Few studies assessed ST. Themes emerging from qualitative studies included cultural and structural barriers to PA, faith and education as facilitators, and a lack of understanding of the recommended amounts of PA and its benefits among South Asian women. Conclusions Quantitative and qualitative evidence indicate that South Asian women do not perform the recommended level of PA for health benefits. Both types of studies suffer from limitations due to methods of data collection. More research should be dedicated to standardizing objective PA measurement and to understanding how to utilize the resources of the individuals and communities to increase PA levels and overall health of South Asian women. PMID:23256686
Tugal-Tutkun, Ilknur; Herbort, Carl P
2010-10-01
Aqueous flare and cells are the two inflammatory parameters of anterior chamber inflammation resulting from disruption of the blood-ocular barriers. When examined with the slit lamp, measurement of intraocular inflammation remains subjective with considerable intra- and interobserver variations. Laser flare cell photometry is an objective quantitative method that enables accurate measurement of these parameters with very high reproducibility. Laser flare photometry allows detection of subclinical alterations in the blood-ocular barriers, identifying subtle pathological changes that could not have been recorded otherwise. With the use of this method, it has been possible to compare the effect of different surgical techniques, surgical adjuncts, and anti-inflammatory medications on intraocular inflammation. Clinical studies of uveitis patients have shown that flare measurements by laser flare photometry allowed precise monitoring of well-defined uveitic entities and prediction of disease relapse. Relationships of laser flare photometry values with complications of uveitis and visual loss further indicate that flare measurement by laser flare photometry should be included in the routine follow-up of patients with uveitis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1990-09-01
This report summarizes goals and accomplishments of the research program supported under DOE Grant No. FG02-86ER60418, entitled Instrumentation and Quantitative Methods of Evaluation, with R. Beck, P.I. and M. Cooper, Co-P.I., during the period January 15, 1990 through September 1, 1990. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 7 figs.
Multi-objective decision-making under uncertainty: Fuzzy logic methods
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1995-01-01
Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
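The core idea, representing each objective as a fuzzy membership function and aggregating across objectives to rank alternatives, can be sketched as follows. The membership parameters, criteria, and minimum ("weakest link") aggregation are illustrative assumptions, not the NASA software's actual rules:

```python
# Sketch of fuzzy multi-objective ranking: triangular memberships per
# criterion, aggregated by the minimum across criteria.

def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def rank_alternatives(alternatives, criteria):
    """Score each alternative by its minimum membership over all criteria.

    alternatives: {name: {criterion: raw value}}
    criteria: {criterion: (a, b, c) triangular parameters}
    Returns (score, name) pairs, best first.
    """
    scored = []
    for name, values in alternatives.items():
        score = min(triangular(values[c], *criteria[c]) for c in criteria)
        scored.append((score, name))
    return sorted(scored, reverse=True)

# hypothetical criteria and candidate systems, e.g. propulsion options
criteria = {"cost": (0, 5, 10), "performance": (0, 8, 10)}
alternatives = {
    "A": {"cost": 5, "performance": 8},   # ideal on both criteria
    "B": {"cost": 2, "performance": 4},   # mediocre on both
}
ranking = rank_alternatives(alternatives, criteria)
```

The minimum aggregation penalizes an alternative that fails badly on any one objective; a weighted average would instead allow strong objectives to compensate for weak ones.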
Interactions across Multiple Stimulus Dimensions in Primary Auditory Cortex.
Sloas, David C; Zhuo, Ran; Xue, Hongbo; Chambers, Anna R; Kolaczyk, Eric; Polley, Daniel B; Sen, Kamal
2016-01-01
Although sensory cortex is thought to be important for the perception of complex objects, its specific role in representing complex stimuli remains unknown. Complex objects are rich in information along multiple stimulus dimensions. The position of cortex in the sensory hierarchy suggests that cortical neurons may integrate across these dimensions to form a more gestalt representation of auditory objects. Yet, studies of cortical neurons typically explore single or few dimensions due to the difficulty of determining optimal stimuli in a high dimensional stimulus space. Evolutionary algorithms (EAs) provide a potentially powerful approach for exploring multidimensional stimulus spaces based on real-time spike feedback, but two important issues arise in their application. First, it is unclear whether it is necessary to characterize cortical responses to multidimensional stimuli or whether it suffices to characterize cortical responses to a single dimension at a time. Second, quantitative methods for analyzing complex multidimensional data from an EA are lacking. Here, we apply a statistical method for nonlinear regression, the generalized additive model (GAM), to address these issues. The GAM quantitatively describes the dependence between neural response and all stimulus dimensions. We find that auditory cortical neurons in mice are sensitive to interactions across dimensions. These interactions are diverse across the population, indicating significant integration across stimulus dimensions in auditory cortex. This result strongly motivates using multidimensional stimuli in auditory cortex. Together, the EA and the GAM provide a novel quantitative paradigm for investigating neural coding of complex multidimensional stimuli in auditory and other sensory cortices.
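The interaction question the GAM addresses, whether the response to two stimulus dimensions departs from a purely additive combination, can be illustrated with a two-way decomposition of a response grid into grand mean, row effects, and column effects; nonzero residuals signal interaction. This is a simplified stand-in for the GAM analysis, not the study's code:

```python
# Sketch: detect interaction between two stimulus dimensions by removing the
# best additive (row + column) model from a response grid. Nonzero residuals
# mean the dimensions do not combine additively.

def interaction_residuals(grid):
    """Residuals of grid[i][j] after subtracting grand mean + row + column effects."""
    ni, nj = len(grid), len(grid[0])
    grand = sum(sum(row) for row in grid) / (ni * nj)
    row_eff = [sum(row) / nj - grand for row in grid]
    col_eff = [sum(grid[i][j] for i in range(ni)) / ni - grand
               for j in range(nj)]
    return [[grid[i][j] - grand - row_eff[i] - col_eff[j] for j in range(nj)]
            for i in range(ni)]

# additive toy response r[i][j] = i + 2j: residuals vanish
additive = [[0, 2, 4], [1, 3, 5]]
# non-additive toy response: a response only to the joint condition
interacting = [[0, 0], [0, 1]]
```

The GAM generalizes this idea to smooth, continuous dependence on each dimension and statistical tests of the interaction terms.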
ERIC Educational Resources Information Center
Bogaert, Inge; De Martelaer, Kristine; Deforche, Benedicte; Clarys, Peter; Zinzen, Evert
2015-01-01
Objective: The primary aim of this study was to describe and analyse the physical activity and sedentary levels of secondary school teachers in Flanders. A secondary aim was to collect information regarding a possible worksite intervention of special relevance to secondary school teachers. Design: Mixed-methods quantitative and qualitative…
Bennett, Clare; Rebafka, Anne; Carrier, Judith; Edwards, Deborah; Jones, Jonathan
2018-05-01
This mixed methods review seeks to develop an aggregated synthesis of quantitative and qualitative data on the health-related quality of life (HRQOL) implications of genital herpes for the individual, in order to derive conclusions and recommendations for clinical practice and policy decision making.
Accuracy of a remote quantitative image analysis in the whole slide images.
Słodkowska, Janina; Markiewicz, Tomasz; Grala, Bartłomiej; Kozłowski, Wojciech; Papierz, Wielisław; Pleskacz, Katarzyna; Murawski, Piotr
2011-03-30
The rationale for choosing a remote quantitative method to support a diagnostic decision requires empirical studies and knowledge of scenarios that include valid telepathology standards. Tumours of the central nervous system [CNS] are graded on the basis of morphological features and the Ki-67 labelling index [Ki-67 LI]. Various methods have been applied for Ki-67 LI estimation. Recently we introduced the Computerized Analysis of Medical Images [CAMI] software for automated Ki-67 LI counting in digital images. The aim of our study was to explore the accuracy and reliability of remote assessment of Ki-67 LI with the CAMI software applied to whole slide images [WSI]. The WSI representing CNS tumours (18 meningiomas and 10 oligodendrogliomas) were stored on a server of the Warsaw University of Technology. Digital copies of entire glass slides were created automatically by the Aperio ScanScope CS with a 20x or 40x objective. Aperio's ImageScope software provided the functionality for remote viewing of the WSI. The Ki-67 LI assessment was carried out within 2 of 20 selected fields of view (40x objective) representing the highest-labelling areas in each WSI. Ki-67 LI counting was performed by three methods: (1) manual reading in the light microscope (LM), (2) automated counting with the CAMI software on digital images (DI), and (3) remote quantitation on the WSI (the WSI method). The quality of the WSI and the technical efficiency of the on-line system were analysed. A comparative statistical analysis was performed for the results obtained by the three methods of Ki-67 LI counting. The preliminary analysis showed that in 18% of WSI the Ki-67 LI results differed from those obtained by the other two counting methods when the quality of the glass slides was below the standard range.
The results of our investigations indicate that remote automated Ki-67 LI analysis performed with the CAMI algorithm on whole slide images of meningiomas and oligodendrogliomas can be used successfully as an alternative both to manual reading and to digital-image quantitation with the CAMI software. Our observations also indicate that remote supervision/consultation and training are necessary for the effective use of remote quantitative analysis of WSI.
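The quantity being compared across the three counting methods is simple: the Ki-67 labelling index is the percentage of immunopositive tumour cell nuclei among all counted nuclei. A minimal sketch (the counts and the 10-point discordance threshold below are illustrative, not from the study):

```python
def ki67_li(positive_nuclei, total_nuclei):
    """Ki-67 labelling index: percentage of immunopositive nuclei."""
    if total_nuclei <= 0:
        raise ValueError("total_nuclei must be positive")
    return 100.0 * positive_nuclei / total_nuclei

def discordant(li_values, threshold=10.0):
    """Flag a field of view whose counting methods disagree by more
    than `threshold` percentage points."""
    return max(li_values) - min(li_values) > threshold

# Hypothetical counts from one field of view by the three methods
# (manual LM, automated DI, remote WSI).
li = [ki67_li(45, 300), ki67_li(48, 300), ki67_li(44, 300)]
flag = discordant(li)  # False here: the three methods agree within 10 points
```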
Using mixed methods in health research
Woodman, Jenny
2013-01-01
Summary Mixed methods research is the use of quantitative and qualitative methods in a single study or series of studies. It is an emergent methodology which is increasingly used by health researchers, especially within health services research. There is a growing literature on the theory, design and critical appraisal of mixed methods research. However, there are few papers that summarize this methodological approach for health practitioners who wish to conduct or critically engage with mixed methods studies. The objective of this paper is to provide an accessible introduction to mixed methods for clinicians and researchers unfamiliar with this approach. We present a synthesis of key methodological literature on mixed methods research, with examples from our own work and that of others, to illustrate the practical applications of this approach within health research. We summarize definitions of mixed methods research, the value of this approach, key aspects of study design and analysis, and discuss the potential challenges of combining quantitative and qualitative methods and data. One of the key challenges within mixed methods research is the successful integration of quantitative and qualitative data during analysis and interpretation. However, the integration of different types of data can generate insights into a research question, resulting in enriched understanding of complex health research problems. PMID:23885291
Office-Based Elastographic Technique for Quantifying Mechanical Properties of Skeletal Muscle
Ballyns, Jeffrey J.; Turo, Diego; Otto, Paul; Shah, Jay P.; Hammond, Jennifer; Gebreab, Tadesse; Gerber, Lynn H.; Sikdar, Siddhartha
2012-01-01
Objectives: Our objectives were to develop a new, efficient, and easy-to-administer approach to ultrasound elastography and assess its ability to provide quantitative characterization of viscoelastic properties of skeletal muscle in an outpatient clinical environment. We sought to show its validity and clinical utility in assessing myofascial trigger points, which are associated with myofascial pain syndrome. Methods: Ultrasound imaging was performed while the muscle was externally vibrated at frequencies in the range of 60 to 200 Hz using a handheld vibrator. The spatial gradient of the vibration phase yielded the shear wave speed, which is related to the viscoelastic properties of tissue. The method was validated using a calibrated experimental phantom, the biceps brachii muscle in healthy volunteers (n = 6), and the upper trapezius muscle in symptomatic patients with axial neck pain (n = 13) and asymptomatic (pain-free) control participants (n = 9). Results: Using the experimental phantom, our method was able to quantitatively measure the shear moduli with error rates of less than 20%. The mean shear modulus ± SD in the normal biceps brachii measured 12.5 ± 3.4 kPa, within the range of published values using more sophisticated methods. Shear wave speeds in active myofascial trigger points and the surrounding muscle tissue were significantly higher than those in normal tissue at high frequency excitations (>100 Hz; P < .05). Conclusions: Off-the-shelf office-based equipment can be used to quantitatively characterize skeletal muscle viscoelastic properties with estimates comparable to those using more sophisticated methods. Our preliminary results using this method indicate that patients with spontaneous neck pain and symptomatic myofascial trigger points have increased tissue heterogeneity at the trigger point site and the surrounding muscle tissue. PMID:22837285
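The physics behind the measurement is compact: the spatial gradient of the vibration phase gives the shear wavenumber k, the wave speed is c = ω/k, and for a nearly elastic medium the shear modulus is approximately G = ρc². A sketch under those simplifying assumptions (the paper's viscoelastic treatment is more involved; the sample values below are illustrative):

```python
import math

def shear_wave_speed(freq_hz, phase_rad, positions_m):
    """Wave speed from a linear fit of vibration phase vs. position:
    the slope is the wavenumber k (rad/m), and c = 2*pi*f / |k|."""
    n = len(positions_m)
    mx = sum(positions_m) / n
    mp = sum(phase_rad) / n
    k = (sum((x - mx) * (p - mp) for x, p in zip(positions_m, phase_rad))
         / sum((x - mx) ** 2 for x in positions_m))
    return 2.0 * math.pi * freq_hz / abs(k)

def shear_modulus(speed_m_s, density_kg_m3=1000.0):
    """Elastic approximation G = rho * c^2 (Pa); ignores viscosity."""
    return density_kg_m3 * speed_m_s ** 2

# Hypothetical 100 Hz excitation sampled at 5 positions along the muscle.
c_true = 3.5  # m/s
k_true = 2.0 * math.pi * 100.0 / c_true
xs = [0.000, 0.005, 0.010, 0.015, 0.020]
phases = [-k_true * x for x in xs]  # phase lags with distance from the source

c = shear_wave_speed(100.0, phases, xs)
g = shear_modulus(c)  # ~12.25 kPa, in the range reported for biceps brachii
```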
Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography
Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila
2016-01-01
Objective: Benzodiazepines are among the drugs most frequently screened for in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because benzodiazepine concentrations in biological samples can vary with bleeding, postmortem changes, and redistribution, and can thereby bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out by liquid–liquid extraction with n-hexane:ethyl acetate, with subsequent detection by high-performance liquid chromatography coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear calibration curve for each drug was obtained within the range of 30–3000 ng/mL, with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peaks due to interfering substances were observed in the samples. Conclusion: The present method is selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in a forensic toxicology laboratory. PMID:27635251
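The validation figures reported here follow directly from the calibration line: fit peak area against concentration, then, by one common ICH-style convention, estimate LOD ≈ 3.3·σ/slope and LOQ ≈ 10·σ/slope from the residual standard deviation σ. A sketch with made-up calibration data (not the paper's measurements):

```python
import math

def linear_fit(x, y):
    """Least-squares slope, intercept, and residual SD (n-2 degrees of freedom)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    intercept = my - slope * mx
    resid_sd = math.sqrt(sum((b - (intercept + slope * a)) ** 2
                             for a, b in zip(x, y)) / (n - 2))
    return slope, intercept, resid_sd

def lod_loq(slope, resid_sd):
    """ICH-style detection and quantitation limits from the calibration line."""
    return 3.3 * resid_sd / slope, 10.0 * resid_sd / slope

# Hypothetical calibration: concentrations (ng/mL) vs. peak areas.
conc = [30.0, 100.0, 300.0, 1000.0, 3000.0]
area = [16.0, 49.0, 151.0, 500.0, 1501.0]

slope, intercept, sd = linear_fit(conc, area)
lod, loq = lod_loq(slope, sd)  # loq is about 3x lod by construction
```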
Analyser-based phase contrast image reconstruction using geometrical optics.
Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A
2007-07-21
Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
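The Pearson type VII profile used to fit the rocking curve interpolates between a Lorentzian (m = 1) and, in the limit m → ∞, a Gaussian. A small sketch of the function (parameter names are ours; the normalizing factor makes w the half-width at half-maximum for any shape parameter m):

```python
def pearson_vii(x, x0, w, m, peak=1.0):
    """Symmetric Pearson type VII profile centred at x0.

    m = 1 gives a Lorentzian; m -> infinity approaches a Gaussian.
    The factor (2**(1/m) - 1) makes w the half-width at half-maximum:
    pearson_vii(x0 + w, ...) == peak / 2 for any m.
    """
    return peak * (1.0 + ((x - x0) / w) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

# At one half-width from the centre the profile is exactly half the peak.
half = pearson_vii(1.0 + 0.3, 1.0, 0.3, m=1.7)
```

Fitting a measured rocking curve then reduces to adjusting (x0, w, m, peak) to minimize the residual against the data, which is why a shape-flexible profile beats a fixed linear or Gaussian form.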
A new approach for the quantitative evaluation of drawings in children with learning disabilities.
Galli, Manuela; Vimercati, Sara Laura; Stella, Giacomo; Caiazzo, Giorgia; Norveti, Federica; Onnis, Francesca; Rigoldi, Chiara; Albertini, Giorgio
2011-01-01
A new method for the quantitative and objective description of drawing, and for the quantification of drawing ability in children with learning disabilities (LD), is presented here. Twenty-four normally developing children (N) (age 10.6 ± 0.5) and 18 children with learning disabilities (age 10.3 ± 2.4) took part in the study. The drawing tasks were chosen among those already used in daily clinical practice (Denver Developmental Screening Test). A set of parameters was defined to describe the features of the children's drawings quantitatively, introducing new objective measurements besides the subjective standard clinical evaluation. The experimental set-up proved valid for clinical application with LD children. The parameters highlighted differences in the drawing features of N and LD children. This paper suggests the applicability of this protocol to other fields of motor and cognitive evaluation, as well as the possibility of studying upper-limb position and muscle activation during drawing. Copyright © 2011 Elsevier Ltd. All rights reserved.
An Overview of data science uses in bioimage informatics.
Chessel, Anatole
2017-02-15
This review aims to provide a practical overview of the use of statistical features and associated data science methods in bioimage informatics. To achieve a quantitative link between images and biological concepts, one typically replaces an object coming from an image (a segmented cell or intracellular object, a pattern of expression or localisation, even a whole image) with a vector of numbers. These features range from carefully crafted, biologically relevant measurements to features learnt through deep neural networks. This replacement allows the use of practical algorithms for visualisation, comparison and inference, such as those from machine learning or multivariate statistics. While these methods originated, in biology, mainly in high-content screening, they are integral to the use of data science for the quantitative analysis of microscopy images to gain biological insight, and they are sure to gather more interest as the need to make sense of the increasing amount of acquired imaging data grows more pressing. Copyright © 2017 Elsevier Inc. All rights reserved.
2012-05-18
…by the AWAC. It is a surface-penetrating device that measures continuous changes in the water elevations over time at much higher sampling rates of… background subtraction, a technique based on detecting change from a background scene. Their study highlights the difficulty in object detection and tracking… movements (Zhang et al. 2009). Alternatively, another common object detection method, known as Optical Flow Analysis, may be utilized for vessel…
Photogrammetry Applied to Wind Tunnel Testing
NASA Technical Reports Server (NTRS)
Liu, Tian-Shu; Cattafesta, L. N., III; Radeztsky, R. H.; Burner, A. W.
2000-01-01
In image-based measurements, quantitative image data must be mapped to three-dimensional object space. Analytical photogrammetric methods, which may be used to accomplish this task, are discussed from the viewpoint of experimental fluid dynamicists. The Direct Linear Transformation (DLT) for camera calibration, as used in pressure-sensitive paint measurements, is summarized. An optimization method for camera calibration is developed that can determine the camera calibration parameters, including those describing lens distortion, from a single image. Combined with the DLT method, this method allows rapid and comprehensive in-situ camera calibration and is therefore particularly useful for quantitative flow visualization and for other measurements such as model attitude and deformation in production wind tunnels. The paper also includes a brief description of typical photogrammetric applications to temperature- and pressure-sensitive paint measurements and model deformation measurements in wind tunnels.
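The DLT maps a 3-D object-space point to 2-D image coordinates through eleven parameters L1–L11, as ratios of linear expressions. A minimal sketch of the forward projection (the parameter values are illustrative; calibration would recover the L's from imaged targets of known position):

```python
def dlt_project(L, X, Y, Z):
    """Project object-space point (X, Y, Z) to image coordinates (x, y)
    using the 11 Direct Linear Transformation parameters L[0..10]."""
    denom = L[8] * X + L[9] * Y + L[10] * Z + 1.0
    x = (L[0] * X + L[1] * Y + L[2] * Z + L[3]) / denom
    y = (L[4] * X + L[5] * Y + L[6] * Z + L[7]) / denom
    return x, y

# Degenerate illustrative camera: the last three parameters are zero,
# so the denominator is 1 and the projection reduces to an affine map.
L = [1.0, 0.0, 0.0, 2.0,
     0.0, 1.0, 0.0, 3.0,
     0.0, 0.0, 0.0]
x, y = dlt_project(L, 4.0, 5.0, 6.0)  # (6.0, 8.0)
```

Calibration inverts this relation: given several object points and their image coordinates, each point contributes two linear equations in the L's, which are solved by least squares.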
Shaver, Aaron C; Greig, Bruce W; Mosse, Claudio A; Seegmiller, Adam C
2015-05-01
Optimizing a clinical flow cytometry panel can be a subjective process dependent on experience. We develop a quantitative method to make this process more rigorous and apply it to B lymphoblastic leukemia/lymphoma (B-ALL) minimal residual disease (MRD) testing. We retrospectively analyzed our existing three-tube, seven-color B-ALL MRD panel and used our novel method to develop an optimized one-tube, eight-color panel, which was tested prospectively. The optimized one-tube, eight-color panel resulted in greater efficiency of time and resources with no loss in diagnostic power. Constructing a flow cytometry panel using a rigorous, objective, quantitative method permits optimization and avoids problems of interdependence and redundancy in a large, multiantigen panel. Copyright© by the American Society for Clinical Pathology.
Dunbar, Richard L.; Goel, Harsh; Tuteja, Sony; Song, Wen-Liang; Nathanson, Grace; Babar, Zeeshan; Lalic, Dusanka; Gelfand, Joel M.; Rader, Daniel J.; Grove, Gary L.
2017-01-01
Though cardioprotective, niacin monotherapy is limited by unpleasant cutaneous symptoms mimicking dermatitis: niacin-associated skin toxicity (NASTy). Niacin is prototypical of several emerging drugs suffering off-target rubefacient properties whereby agonizing the GPR109A receptor on cutaneous immune cells provokes vasodilation, prompting skin plethora and rubor, as well as dolor, tumor, and calor, and systemically, heat loss, frigor, chills, and rigors. Typically, NASTy effects are described by subjective patient-reported perception, at best semi-quantitative and bias-prone. Conversely, objective, quantitative, and unbiased methods measuring NASTy stigmata would facilitate research to abolish them, motivating development of several objective methods. In early drug development, such methods might better predict clinical tolerability in larger clinical trials. Measuring cutaneous stigmata may also aid investigations of vasospastic, ischemic, and inflammatory skin conditions. We present methods to measure NASTy physical stigmata to facilitate research into novel niacin mimetics/analogs, detailing characteristics of each technique following niacin, and how NASTy stigmata relate to symptom perception. We gave niacin orally and measured rubor by colorimetry and white-light spectroscopy, plethora by laser Doppler flowmetry, and calor/frigor by thermometry. Surprisingly, each stigma’s abruptness predicted symptom perception, whereas peak intensity did not. These methods are adaptable to study other rubefacient drugs or dermatologic and vascular disorders. PMID:28119443
Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong
2015-01-01
Objectives: This study presents indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. Methods: The collected results were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. Results: These weights may prove useful in avoiding having to resort to qualitative means, in the absence of weights between indicators, when integrating the results of quantitative assessment by indicator. Conclusions: This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for the quantitative assessment of green chemistry technologies. PMID:26206364
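The analytic hierarchy process derives weights from a matrix of pairwise importance ratios; the principal eigenvector can be approximated by the normalized geometric means of the rows (exact when the matrix is consistent). A sketch with an illustrative, perfectly consistent 3×3 matrix, not the study's data:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via row geometric means
    (exact when the pairwise comparison matrix is consistent)."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Consistent comparison matrix built from underlying importances
# 0.5 : 0.3 : 0.2, with entry [i][j] = w_i / w_j.
w = [0.5, 0.3, 0.2]
M = [[wi / wj for wj in w] for wi in w]
weights = ahp_weights(M)  # recovers [0.5, 0.3, 0.2]
```

With real expert judgments the matrix is rarely perfectly consistent, and a consistency ratio is usually computed alongside the weights to decide whether the judgments need revisiting.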
Quantitative imaging of aggregated emulsions.
Penfold, Robert; Watson, Andrew D; Mackie, Alan R; Hibberd, David J
2006-02-28
Noise reduction, restoration, and segmentation methods are developed for the quantitative structural analysis in three dimensions of aggregated oil-in-water emulsion systems imaged by fluorescence confocal laser scanning microscopy. Mindful of typical industrial formulations, the methods are demonstrated for concentrated (30% volume fraction) and polydisperse emulsions. Following a regularized deconvolution step using an analytic optical transfer function and appropriate binary thresholding, novel application of the Euclidean distance map provides effective discrimination of closely clustered emulsion droplets with size variation over at least 1 order of magnitude. The a priori assumption of spherical nonintersecting objects provides crucial information to combat the ill-posed inverse problem presented by locating individual particles. Position coordinates and size estimates are recovered with sufficient precision to permit quantitative study of static geometrical features. In particular, aggregate morphology is characterized by a novel void distribution measure based on the generalized Apollonius problem. This is also compared with conventional Voronoi/Delauney analysis.
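The step that discriminates closely clustered droplets, the Euclidean distance map, can be shown on a toy binary image: each object pixel is assigned its distance to the nearest background pixel, and local maxima of the map mark droplet centres (and approximate radii). A brute-force 2-D sketch (the real data are 3-D and use fast transforms; the grid below is illustrative):

```python
import math

def distance_map(grid):
    """Brute-force Euclidean distance map of a binary image:
    each 1-pixel gets the distance to the nearest 0-pixel."""
    h, w = len(grid), len(grid[0])
    background = [(i, j) for i in range(h) for j in range(w) if grid[i][j] == 0]
    dm = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            if grid[i][j]:
                dm[i][j] = min(math.hypot(i - a, j - b) for a, b in background)
    return dm

# A single square "droplet" in a 7x7 field; the map peaks at its centre.
grid = [[1 if 1 <= i <= 5 and 1 <= j <= 5 else 0 for j in range(7)]
        for i in range(7)]
dm = distance_map(grid)  # dm[3][3] == 3.0, falling off towards the rim
```

Two touching droplets produce two separate maxima in the map even when their boundary pixels merge, which is what makes the transform useful for splitting clusters before locating particle centres.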
Patient-specific dosimetry based on quantitative SPECT imaging and 3D-DFT convolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akabani, G.; Hawkins, W.G.; Eckblade, M.B.
1999-01-01
The objective of this study was to validate the use of a 3-D discrete Fourier transform (3D-DFT) convolution method to carry out the dosimetry for I-131 for soft tissues in radioimmunotherapy procedures. To validate this convolution method, mathematical and physical phantoms were used as a basis of comparison with Monte Carlo transport (MCT) calculations, which were carried out using the EGS4 system code. The mathematical phantom consisted of a sphere containing uniform and nonuniform activity distributions. The physical phantom consisted of a cylinder containing uniform and nonuniform activity distributions. Quantitative SPECT reconstruction was carried out using the Circular Harmonic Transform (CHT) algorithm.
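The dosimetry calculation is a convolution of the activity distribution with a dose-point kernel, evaluated efficiently in the Fourier domain via the convolution theorem. A 1-D pure-Python sketch of that theorem (real use is 3-D with fast transforms; the activity profile and kernel below are illustrative):

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def convolve_dft(activity, kernel):
    """Circular convolution via the DFT convolution theorem:
    pointwise product of spectra, then inverse transform."""
    A, K = dft(activity), dft(kernel)
    return idft([a * k for a, k in zip(A, K)])

def convolve_direct(activity, kernel):
    """Direct circular convolution, for comparison."""
    n = len(activity)
    return [sum(activity[m] * kernel[(t - m) % n] for m in range(n))
            for t in range(n)]

# Illustrative 1-D activity profile and a short-range, symmetric dose kernel.
activity = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0, 0.0]
kernel = [1.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5]
dose = convolve_dft(activity, kernel)  # matches convolve_direct(activity, kernel)
```

The payoff in 3-D is speed: direct convolution over an N-voxel grid costs O(N²), whereas the transform route costs O(N log N) with an FFT.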
A flexible skin piloerection monitoring sensor
NASA Astrophysics Data System (ADS)
Kim, Jaemin; Seo, Dae Geon; Cho, Young-Ho
2014-06-01
We have designed, fabricated, and tested a capacitive-type flexible micro sensor for measuring human skin piloerection arising from sudden emotional and environmental change. Present skin piloerection monitoring methods are limited in objective and quantitative measurement because the bulky size and heavy weight of the measuring devices physically disturb the skin. The proposed flexible skin piloerection monitoring sensor is composed of a 3 × 3 spiral coplanar capacitor array using conductive polymer, giving a high capacitive density and a thickness small enough to be attached to human skin. The performance of the sensor was characterized using an artificial bump representing a human skin goosebump, resulting in a sensitivity of -0.00252%/μm and a nonlinearity of 25.9% for artificial goosebump deformation in the range of 0-326 μm. We also verified successive human skin piloerection lasting 3.5 s on the subjects' dorsal forearms, measuring capacitance changes of -6.2 fF and -9.2 fF for piloerection intensities of 145 μm and 194 μm, respectively. It is demonstrated experimentally that the proposed sensor is capable of measuring human skin piloerection objectively and quantitatively, thereby suggesting a quantitative evaluation method for qualitative human emotional status in cognitive human-machine interface applications.
Effectiveness of Facebook in English Language Learning: A Case Study
ERIC Educational Resources Information Center
Faryadi, Qais
2017-01-01
The prime objective of this research was to investigate whether Facebook helped undergraduate students of Universiti Sains Islam Malaysia (USIM) improve their English language proficiency, critical thinking, comprehension skills, and motivation. A triangulation method (quantitative, qualitative, and descriptive) was employed in the investigation.…
Health Literacy Skills in Rural and Urban Populations
ERIC Educational Resources Information Center
Zahnd, Whitney E.; Scaife, Steven L.; Francis, Mark L.
2009-01-01
Objective: To determine whether health literacy is lower in rural populations. Method: We analyzed health, prose, document, and quantitative literacy from the National Assessment of Adult Literacy study. Metropolitan Statistical Area designated participants as rural or urban. Results: Rural populations had lower literacy levels for all literacy…
Development of a Patient-Centered Antipsychotic Medication Adherence Intervention
ERIC Educational Resources Information Center
Pyne, Jeffrey M.; Fischer, Ellen P.; Gilmore, LaNissa; McSweeney, Jean C.; Stewart, Katharine E.; Mittal, Dinesh; Bost, James E.; Valenstein, Marcia
2014-01-01
Objective: A substantial gap exists between patients and their mental health providers about patient's perceived barriers, facilitators, and motivators (BFMs) for taking antipsychotic medications. This article describes how we used an intervention mapping (IM) framework coupled with qualitative and quantitative item-selection methods to…
Inclusive Education at Primary Level: Reality or Phantasm
ERIC Educational Resources Information Center
Khan, Itfaq Khaliq; Behlol, Malik Ghulam
2014-01-01
The objectives of this study were to assess the impacts of Inclusive Education (IE) Project implemented in government schools of Islamabad and anticipate its practicability for public schools. Quantitative and qualitative methods were applied for data collection. Study instruments were structured interviews, unstructured focus group discussions,…
Women's Reported Health Behaviours before and during Pregnancy: A Retrospective Study
ERIC Educational Resources Information Center
Smedley, Jenna; Jancey, Jonine M.; Dhaliwal, Satvinder; Zhao, Yun; Monteiro, Sarojini M. D. R.; Howat, Peter
2014-01-01
Objective: This study aimed to determine women's reported health behaviours (physical activity, diet, weight management) before and during pregnancy, and to identify sources of health information. Design: Retrospective study incorporating quantitative (a self-completed survey) and qualitative (one-on-one interviews) methods. Methodology:…
da Vinci decoded: does da Vinci stereopsis rely on disparity?
Tsirlin, Inna; Wilcox, Laurie M; Allison, Robert S
2012-11-01
In conventional stereopsis, the depth between two objects is computed from the retinal disparity in the position of matching points in the two eyes. When an object is occluded by another object in the scene, so that it is visible in only one eye, its retinal disparity cannot be computed. Nakayama and Shimojo (1990) found that a percept of quantitative depth between the two objects could still be established for such stimuli and proposed that this percept is based on the constraints imposed by occlusion geometry. They named this and other occlusion-based depth phenomena "da Vinci stereopsis." Subsequent research found quantitative depth based on occlusion geometry in several other classes of stimuli grouped under the term da Vinci stereopsis. However, Nakayama and Shimojo's findings were later brought into question by Gillam, Cook, and Blackburn (2003), who suggested that quantitative depth in their stimuli was perceived based on conventional disparity. In order to understand whether da Vinci stereopsis relies on one type of mechanism, or whether its function is stimulus dependent, we examine the nature and source of depth in the class of stimuli used by Nakayama and Shimojo (1990). We use three different psychophysical and computational methods to show that the most likely source for depth in these stimuli is occlusion geometry. Based on these experiments and previous data, we discuss the potential mechanisms responsible for processing depth from monocular features in da Vinci stereopsis.
ERIC Educational Resources Information Center
Fisher, Aaron J.; Newman, Michelle G.; Molenaar, Peter C. M.
2011-01-01
Objective: The present article aimed to demonstrate that the establishment of dynamic patterns during the course of psychotherapy can create attractor states for continued adaptive change following the conclusion of treatment. Method: This study is a secondary analysis of T. D. Borkovec and E. Costello (1993). Of the 55 participants in the…
Quantification of EEG reactivity in comatose patients
Hermans, Mathilde C.; Westover, M. Brandon; van Putten, Michel J.A.M.; Hirsch, Lawrence J.; Gaspard, Nicolas
2016-01-01
Objective: EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. Methods: In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of the probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. Results: The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha, and beta) showed the best accuracy (median AUC: 0.95) and was accompanied by substantial agreement with the individual opinions of experts (Gwet's AC1: 65–70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-expert agreement regarding reactivity for each individual case. Conclusion: Automated quantitative EEG approaches based on a probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Significance: Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. PMID:26183757
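Spectral temporal symmetry boils down to comparing band power before and after stimulus onset. A pure-Python sketch using a naive DFT and short synthetic segments (the study uses 1-minute clinical segments and combinations of 13 parameters; the band edges below follow the usual delta/theta/alpha/beta conventions):

```python
import cmath, math

def band_power(signal, fs, f_lo, f_hi):
    """Power in [f_lo, f_hi) Hz via a naive DFT (fine for short segments)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        if f_lo <= k * fs / n < f_hi:
            X = sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
            power += abs(X) ** 2
    return power / n

def reactivity_ratio(pre, post, fs, band):
    """Post/pre band-power ratio; values far from 1 suggest reactivity."""
    return band_power(post, fs, *band) / band_power(pre, fs, *band)

# Synthetic example: a 10 Hz (alpha) rhythm attenuated after stimulation.
fs = 64.0
pre = [math.sin(2 * math.pi * 10 * t / fs) for t in range(64)]
post = [0.5 * s for s in pre]
ratio = reactivity_ratio(pre, post, fs, (8.0, 13.0))  # ~0.25: amplitude halved
```

A classifier in the spirit of the study would combine such pre/post comparisons across several bands and parameters into a single probability of reactivity.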
Taxonomy based analysis of force exchanges during object grasping and manipulation
Martin-Brevet, Sandra; Jarrassé, Nathanaël; Burdet, Etienne
2017-01-01
The flexibility of the human hand in object manipulation is essential for daily life activities, but remains relatively little explored with quantitative methods. On the one hand, recent taxonomies describe qualitatively the classes of hand postures for object grasping and manipulation. On the other hand, quantitative analysis of hand function has generally been restricted to the precision grip (thumb and index opposition) during lifting tasks. The aim of the present study is to fill the gap between these two kinds of description by investigating quantitatively the forces exerted by the hand on an instrumented object in a set of representative manipulation tasks. The object was an instrumented parallelepiped able to measure the force exerted on each of its six faces and its own acceleration. The grasping force was estimated from the lateral forces and the unloading force from the bottom force. The protocol included eleven tasks with complementary constraints inspired by recent taxonomies: four corresponding to lifting and holding the object with different grasp configurations, and seven to manipulating the object (rotation around each of its axes and translation). The grasping and unloading forces and object rotations were measured during the five phases of the actions: unloading, lifting, holding or manipulation, preparation to deposit, and deposit. The results confirm the tight regulation between grasping and unloading forces during lifting, and extend this to the deposit phase. In addition, they provide a precise description of the regulation of force exchanges during various manipulation tasks spanning representative actions of daily life. The timing of manipulation showed both sequential and overlapping organization of the different sub-actions, and micro-errors could be detected. This phenomenological study confirms the feasibility of using an instrumented object to investigate complex manipulative behavior in humans.
This protocol will be used in the future to investigate upper-limb dexterity in patients with sensory-motor impairments. PMID:28562617
Algorithms for Learning Preferences for Sets of Objects
NASA Technical Reports Server (NTRS)
Wagstaff, Kiri L.; desJardins, Marie; Eaton, Eric
2010-01-01
A method is being developed that provides for an artificial-intelligence system to learn a user's preferences for sets of objects and to thereafter automatically select subsets of objects according to those preferences. The method was originally intended to enable automated selection, from among large sets of images acquired by instruments aboard spacecraft, of image subsets considered to be scientifically valuable enough to justify use of limited communication resources for transmission to Earth. The method is also applicable to other sets of objects: examples of sets of objects considered in the development of the method include food menus, radio-station music playlists, and assortments of colored blocks for creating mosaics. The method does not require the user to perform the often-difficult task of quantitatively specifying preferences; instead, the user provides examples of preferred sets of objects. This method goes beyond related prior artificial-intelligence methods for learning which individual items are preferred by the user: it supports a concept of set-based preferences, which include not only preferences for individual items but also preferences regarding the types and degrees of diversity of items in a set. Consideration of diversity in this method involves recognition that members of a set may interact with each other in the sense that, when considered together, they may be regarded as complementary, redundant, or incompatible to various degrees. The effects of such interactions are loosely summarized in the term portfolio effect. The learning method relies on a preference representation language, denoted DD-PREF, to express set-based preferences. In DD-PREF, a preference is represented by a tuple that includes quality (depth) functions to estimate how desired a specific value is, weights for each feature preference, the desired diversity of feature values, and the relative importance of diversity versus depth.
The system applies statistical concepts to estimate quantitative measures of the user's preferences from training examples (preferred subsets) specified by the user. Once preferences have been learned, the system uses those preferences to select preferred subsets from new sets. The method was found to be viable when tested in computational experiments on menus, music playlists, and rover images. Contemplated future development efforts include further tests on more diverse sets and development of a sub-method for (a) estimating the parameter that represents the relative importance of diversity versus depth and (b) incorporating background knowledge about the nature of quality functions, which are special functions that specify depth preferences for features.
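The DD-PREF representation described above combines per-item "depth" with set-level diversity. A minimal sketch of that trade-off follows, with hypothetical names (`depth_fns`, `weights`, `alpha`) standing in for the actual DD-PREF machinery, which is not reproduced here:

```python
# Illustrative sketch of a set-based preference score in the spirit of
# DD-PREF. All function and parameter names are assumptions for
# illustration, not the published implementation.

def depth_score(items, depth_fns, weights):
    """Average desirability of individual feature values in the set."""
    total = 0.0
    for item in items:
        for feature, value in item.items():
            total += weights[feature] * depth_fns[feature](value)
    return total / len(items)

def diversity_score(items, feature):
    """Fraction of distinct values for one feature (a crude diversity proxy)."""
    values = [item[feature] for item in items]
    return len(set(values)) / len(values)

def set_preference(items, depth_fns, weights, div_feature, alpha):
    """alpha trades off diversity (alpha=1) against depth (alpha=0)."""
    return (alpha * diversity_score(items, div_feature)
            + (1 - alpha) * depth_score(items, depth_fns, weights))

# Toy playlist example: two tempos in the preferred range, one outside.
songs = [{"tempo": 120}, {"tempo": 120}, {"tempo": 90}]
prefs = set_preference(
    songs,
    depth_fns={"tempo": lambda t: 1.0 if 100 <= t <= 130 else 0.0},
    weights={"tempo": 1.0},
    div_feature="tempo",
    alpha=0.5,
)
```

A higher `alpha` would reward playlists that mix tempos even when some fall outside the preferred range.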
Physical activity among South Asian women: a systematic, mixed-methods review.
Babakus, Whitney S; Thompson, Janice L
2012-12-20
The objective of this systematic mixed-methods review is to assess what is currently known about the levels of physical activity (PA) and sedentary time (ST) and to contextualize these behaviors among South Asian women with an immigrant background. A systematic search of the literature was conducted using combinations of the key words PA, ST, South Asian, and immigrant. A mixed-methods approach was used to analyze and synthesize all evidence, both quantitative and qualitative. Twenty-six quantitative and twelve qualitative studies were identified as meeting the inclusion criteria. Studies quantifying PA and ST among South Asian women showed low levels of PA compared with South Asian men and with white European comparison populations. However, making valid comparisons between studies was challenging due to a lack of standardized PA measurement. The majority of studies indicated that South Asian women did not meet recommended amounts of PA for health benefits. Few studies assessed ST. Themes emerging from qualitative studies included cultural and structural barriers to PA, faith and education as facilitators, and a lack of understanding of the recommended amounts of PA and its benefits among South Asian women. Quantitative and qualitative evidence indicates that South Asian women do not perform the recommended level of PA for health benefits. Both types of studies suffer from limitations due to methods of data collection. More research should be dedicated to standardizing objective PA measurement and to understanding how to utilize the resources of individuals and communities to increase PA levels and the overall health of South Asian women.
Quantitating Human Optic Disc Topography
NASA Astrophysics Data System (ADS)
Graebel, William P.; Cohan, Bruce E.; Pearch, Andrew C.
1980-07-01
A method is presented for quantitatively expressing the topography of the human optic disc, applicable in a clinical setting to the diagnosis and management of glaucoma. Photographs of the disc illuminated by a pattern of fine, high-contrast parallel lines are digitized. From the measured deviation of the lines as they traverse the disc surface, disc topography is calculated using the principles of optical sectioning. The quantitative measures applied to express this topography have the following advantages: sensitivity to disc shape; objectivity; going beyond the limits of cup-disc ratio estimates and volume calculations; perfect generality in a mathematical sense; and an inherent scheme for determining a non-subjective reference frame to compare different discs or the same disc over time.
CART V: recent advancements in computer-aided camouflage assessment
NASA Astrophysics Data System (ADS)
Müller, Thomas; Müller, Markus
2011-05-01
In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises semi-automatic marking of target objects (ground truth generation), including their propagation over the image sequence, evaluation via user-defined feature extractors, and methods to assess the object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements of the past year and addresses the camouflage assessment of static and moving objects in multispectral image data that can show noise or image artefacts. The presented methods fathom the correlations between image processing and camouflage assessment. A novel algorithm based on template matching is presented to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on the camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.
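The core operation behind a template-matching conspicuity score can be illustrated with a minimal sum-of-squared-differences search; the CART system's actual structural-assessment algorithm is not given in the abstract, so everything below is a generic sketch on toy data:

```python
# Generic sketch of template matching by sum of squared differences (SSD):
# slide the template over the image and keep the best-matching location.
# Not the CART algorithm, only the underlying principle.

def ssd(patch, template):
    """Sum of squared pixel differences between two equal-size patches."""
    return sum((p - t) ** 2
               for row_p, row_t in zip(patch, template)
               for p, t in zip(row_p, row_t))

def match_template(image, template):
    """Return (row, col, score) of the best-matching patch (lowest SSD)."""
    th, tw = len(template), len(template[0])
    best = None
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            patch = [row[c:c + tw] for row in image[r:r + th]]
            score = ssd(patch, template)
            if best is None or score < best[2]:
                best = (r, c, score)
    return best

image = [
    [0, 0, 0, 0],
    [0, 9, 8, 0],
    [0, 7, 9, 0],
    [0, 0, 0, 0],
]
template = [[9, 8], [7, 9]]
print(match_template(image, template))  # (1, 1, 0): exact match at row 1, col 1
```

A low minimum SSD for an object patch against its background would indicate structural inconspicuity, i.e. good camouflage.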
Kloepper, Jennifer Elisabeth; Sugawara, Koji; Al-Nuaimi, Yusur; Gáspár, Erzsébet; van Beek, Nina; Paus, Ralf
2010-03-01
The organ culture of human scalp hair follicles (HFs) is the best currently available assay for hair research in the human system. In order to determine the hair growth-modulatory effects of agents in this assay, one critical read-out parameter is the assessment of whether the test agent has prolonged anagen duration or induced catagen in vitro. However, objective criteria to distinguish between anagen VI HFs and early catagen in human HF organ culture, two hair cycle stages with a deceptively similar morphology, remain to be established. Here, we develop, document and test an objective classification system that allows one to distinguish between anagen VI and early catagen in organ-cultured human HFs, using both qualitative and quantitative parameters that can be generated by light microscopy or immunofluorescence. Seven qualitative classification criteria are defined that are based on assessing the morphology of the hair matrix, the dermal papilla and the distribution of pigmentary markers (melanin, gp100). These are complemented by ten quantitative parameters. We have tested this classification system by employing the clinically used topical hair growth inhibitor, eflornithine, and show that eflornithine indeed produces the expected premature catagen induction, as identified by the novel classification criteria reported here. Therefore, this classification system offers a standardized, objective and reproducible new experimental method to reliably distinguish between human anagen VI and early catagen HFs in organ culture.
León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.
2013-01-01
The majority of mass spectrometry-based protein quantification studies use peptide-centric analytical methods and thus strongly rely on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921
Change Detection Algorithms for Surveillance in Visual IoT: A Comparative Study
NASA Astrophysics Data System (ADS)
Akram, Beenish Ayesha; Zafar, Amna; Akbar, Ali Hammad; Wajid, Bilal; Chaudhry, Shafique Ahmad
2018-01-01
The VIoT (Visual Internet of Things) connects the virtual information world with real-world objects using sensors and pervasive computing. For video surveillance in VIoT, ChD (Change Detection) is a critical component. ChD algorithms identify regions of change in multiple images of the same scene recorded at different time intervals for video surveillance. This paper presents a performance comparison of histogram thresholding and classification ChD algorithms using quantitative measures for video surveillance in VIoT based on salient features of datasets. The thresholding algorithms Otsu, Kapur, Rosin and the classification methods k-means and EM (Expectation Maximization) were simulated in MATLAB using diverse datasets. For performance evaluation, the quantitative measures used include OSR (Overall Success Rate), YC (Yule's Coefficient), JC (Jaccard's Coefficient), execution time, and memory consumption. Experimental results showed that Kapur's algorithm performed better for both indoor and outdoor environments with illumination changes, shadowing and medium to fast moving objects. However, it showed degraded performance for small object sizes with minor changes. Otsu's algorithm showed better results for indoor environments with slow to medium changes and nomadic object mobility. k-means showed good results in indoor environments with small object sizes producing slow change, no shadowing and scarce illumination changes.
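Otsu's algorithm, one of the thresholding methods compared above, selects the threshold that maximizes the between-class variance of an intensity histogram. A stripped-down, pure-Python sketch on a toy frame-difference signal (not the paper's datasets):

```python
# Otsu's method on a 1D list of intensity values: exhaustively test each
# threshold and keep the one maximizing between-class variance. Applied to
# an absolute frame-difference signal, it yields a change mask.

def otsu_threshold(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b = 0.0      # running intensity sum of the background class
    w_b = 0          # running pixel count of the background class
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b                  # background mean
        m_f = (sum_all - sum_b) / w_f      # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Toy absolute frame difference: three "changed" pixels among background noise.
diff = [5, 6, 5, 7, 6, 200, 210, 205, 5, 6]
t = otsu_threshold(diff)
mask = [p > t for p in diff]
print(t)  # 7: the last background level before the changed pixels
```

The resulting `mask` flags exactly the three high-difference pixels as changed regions.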
Non-invasive, investigative methods in skin aging.
Longo, C; Ciardo, S; Pellacani, G
2015-12-01
A precise and noninvasive quantification of aging is of utmost importance for in vivo assessment of the skin aging "stage", and thus for efforts to minimize it. Several bioengineering methods have been proposed to objectively, precisely, and non-invasively measure skin aging, and to detect early skin damage that is only sub-clinically observable. In this review we describe the most relevant methods that have emerged from recently introduced technologies, aiming to quantitatively assess the effects of aging on the skin.
Perceptions of Sexual Harassment in Athletic Training
ERIC Educational Resources Information Center
Shingles, René Revis; Smith, Yevonne
2008-01-01
Objective: To describe and analyze the experiences of ethnically diverse female certified athletic trainers (ATCs) in order to discern the perceived nature of sexual harassment in the athletic training profession. Design and Setting: Both quantitative and qualitative methods were used for a larger study; however, only the qualitative data are…
How College Students Conceptualize and Practice Responsible Drinking
ERIC Educational Resources Information Center
Barry, Adam E.; Goodson, Patricia
2011-01-01
Objective: This study sought to employ a mixed-methods approach to (a) qualitatively explore responsible drinking beliefs and behaviors among a sample of college students, and (b) quantitatively assess the prevalence of those behaviors. Participants: Convenience samples, drawn from currently enrolled students attending a large public university in…
High School Students' Concepts of Acids and Bases.
ERIC Educational Resources Information Center
Ross, Bertram H. B.
An investigation of Ontario high school students' understanding of acids and bases with quantitative and qualitative methods revealed misconceptions. A concept map, based on the objectives of the Chemistry Curriculum Guideline, generated multiple-choice items and interview questions. The multiple-choice test was administered to 34 grade 12…
Student Attrition in Mathematics E-Learning
ERIC Educational Resources Information Center
Smith, Glenn Gordon; Ferguson, David
2005-01-01
Qualitative studies indicate that mathematics does not work well in e-learning. The current study used quantitative methods to investigate more objectively the extent of problems with mathematics in e-learning. The authors used student attrition as a simple measure of student satisfaction and course viability in two studies, one investigating…
Diet and Colorectal Cancer Risk: Baseline Dietary Knowledge of Colorectal Patients
ERIC Educational Resources Information Center
Dyer, K. J.; Fearon, K. C. H.; Buckner, K.; Richardson, R. A.
2004-01-01
Objective: To establish the dietary knowledge, attitudes and potential barriers to change of patients attending a colorectal outpatient clinic. Design: Use of a semistructured interview to generate qualitative and quantitative data. Setting: A regional colorectal outpatient clinic within Edinburgh. Method: Patients attending clinic with colorectal…
ERIC Educational Resources Information Center
Simons, Jacob V., Jr.; Price, Barbara A.
2005-01-01
A recent classroom revelation caused us to reconsider the adequacy of the instructions offered in our textbooks for one of our most elementary quantitative methods. Specifically, we found that many students were mystified concerning how to pick an initial objective function value when plotting an isoprofit line in order to graphically solve a…
VISUALLY OBSERVED MOLD AND MOLDY ODOR VERSUS QUANTITATIVELY MEASURED MICROBIAL EXPOSURE IN HOMES
The main study objective was to compare different methods for assessing mold exposure in conjunction with an epidemiologic study on the development of children's asthma. Homes of 184 children were assessed for mold by visual observations and dust sampling at child's age 1 (Year ...
Media Literacy and Cigarette Smoking in Hungarian Adolescents
ERIC Educational Resources Information Center
Page, Randy M.; Piko, Bettina F.; Balazs, Mate A.; Struk, Tamara
2011-01-01
Objective: To assess smoking media literacy in a sample of Hungarian youth and to determine its association with current smoking and susceptibility to future smoking. Design: Quantitative cross-sectional survey. Setting: Four elementary and four high schools in Mako, Hungary. Method: A survey form was administered in regularly-scheduled classes to…
Main principles and technique of electronystagmography (a brief survey of the literature)
NASA Technical Reports Server (NTRS)
Tanchev, K. S.
1980-01-01
Electronystagmography (ENG) is one of the modern methods for objective recording of nystagmus and its quantitative and qualitative assessment. It is used increasingly often in clinical practice. A brief review of the history of nystagmus recording and a survey of the relevant literature are presented.
Perceptions of a Campus-Wide Condom Distribution Programme: An Exploratory Study
ERIC Educational Resources Information Center
Francis, Diane B.; Noar, Seth M.; Widman, Laura; Willoughby, Jessica Fitts; Sanchez, Diana M.; Garrett, Kyla P.
2016-01-01
Objective: Condom distribution programmes are an important means of preventing sexually transmitted infections (STIs); yet little research has examined their perceived and actual impact on college campuses. Design: Quantitative, cross-sectional study. Setting: Large public university in the Southeastern USA. Method: Approximately 2 months after a…
Gender Balance in Teaching Awards: Evidence from 18 Years of National Data
ERIC Educational Resources Information Center
Marchant, Teresa; Wallace, Michelle
2016-01-01
Gender implications of nationally competitive teaching awards were examined to determine whether women receive sufficient accolades, given their dominant position in university teaching. Quantitative methods and secondary data provided objective analysis of teaching awards for Australian universities, for an 18-year data set with 2046 units of…
ERIC Educational Resources Information Center
Jensen, Chad D.; Cushing, Christopher C.; Aylward, Brandon S.; Craig, James T.; Sorell, Danielle M.; Steele, Ric G.
2011-01-01
Objective: This study was designed to quantitatively evaluate the effectiveness of motivational interviewing (MI) interventions for adolescent substance use behavior change. Method: Literature searches of electronic databases were undertaken in addition to manual reference searches of identified review articles. Databases searched include…
An approach for quantitative image quality analysis for CT
NASA Astrophysics Data System (ADS)
Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe
2016-03-01
An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method to generate a modified set of PCA components, as compared to standard principal component analysis (PCA), with sparse loadings, in conjunction with the Hotelling T² statistical analysis method to compare, qualify, and detect faults in the tested systems.
NASA Astrophysics Data System (ADS)
Buendía, M.; Salvador, R.; Cibrián, R.; Laguia, M.; Sotoca, J. M.
1999-01-01
The projection of structured light is a technique frequently used to determine the surface shape of an object. In this paper, a new procedure is described that efficiently resolves the correspondence between the knots of the projected grid and those obtained on the object when the projection is made. The method is based on the use of three images of the projected grid. In two of them the grid is projected over a flat surface placed, respectively, before and behind the object; both images are used for calibration. In the third image the grid is projected over the object. The method does not rely on accurate determination of the positions of the camera and projector relative to the grid and object. Once the method is calibrated, we can obtain the surface function by just analysing the projected grid on the object. The procedure is especially suitable for the study of objects without discontinuities or large depth gradients. It can be employed for determining, in a non-invasive way, the patient's back surface function. Symmetry differences permit a quantitative diagnosis of spinal deformities such as scoliosis.
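The two-reference-plane calibration described above can be caricatured as interpolating a grid knot's depth between the front and back planes from its lateral position in the image; the paper's actual calibration is richer, so the function below is only an illustrative assumption:

```python
# Sketch of the two-plane structured-light idea: a knot's depth is
# linearly interpolated between the front and back calibration planes
# according to where its image falls between the two calibration
# positions. A simplifying assumption, not the paper's full procedure.

def depth_from_shift(x_obj, x_front, x_back, z_front, z_back):
    """Interpolate the depth of a grid knot from its image position x_obj,
    given the knot's imaged positions on the two calibration planes."""
    frac = (x_obj - x_front) / (x_back - x_front)
    return z_front + frac * (z_back - z_front)

# A knot imaged halfway between its two calibration positions lies
# halfway between the planes:
z = depth_from_shift(x_obj=15.0, x_front=10.0, x_back=20.0,
                     z_front=0.0, z_back=40.0)
print(z)  # 20.0
```

Repeating this for every knot of the grid yields a sampled surface function of the object.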
Tamburini, Elena; Tagliati, Chiara; Bonato, Tiziano; Costa, Stefania; Scapoli, Chiara; Pedrini, Paola
2016-01-01
Near-infrared spectroscopy (NIRS) has been widely used for quantitative and/or qualitative determination of a wide range of matrices. The objective of this study was to develop a NIRS method for the quantitative determination of fluorine content in polylactide (PLA)-talc blends. A blending profile was obtained by mixing different amounts of PLA granules and talc powder. The calibration model was built by correlating wet chemical data (alkali digestion method) and NIR spectra. Using the FT (Fourier transform)-NIR technique, a Partial Least Squares (PLS) regression model was set up over a concentration interval from 0 ppm (pure PLA) to 800 ppm (pure talc). Fluorine content prediction (R²cal = 0.9498; standard error of calibration, SEC = 34.77; standard error of cross-validation, SECV = 46.94) was then externally validated by means of a further 15 independent samples (R²ex.v = 0.8955; root mean square error of prediction, RMSEP = 61.08). A positive relationship between an inorganic component such as fluorine and the NIR signal was demonstrated and used to obtain quantitative analytical information from the spectra. PMID:27490548
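The PLS calibration itself requires a chemometrics library; as a stripped-down stand-in for the calibration-model idea, the sketch below fits a one-wavelength linear calibration by ordinary least squares and predicts held-out samples. All numbers are synthetic, not the paper's data:

```python
# Minimal stand-in for a spectroscopic calibration model: ordinary least
# squares on a single synthetic "absorbance" predictor instead of full
# PLS on spectra. Illustrates calibration and prediction only.

def fit_line(x, y):
    """Least-squares slope and intercept for y ~ slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def r_squared(y_true, y_pred):
    """Coefficient of determination for a set of predictions."""
    my = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - my) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

# Synthetic calibration set: NIR absorbance vs. fluorine content (ppm).
absorbance = [0.10, 0.20, 0.30, 0.40]
fluorine = [100.0, 200.0, 300.0, 400.0]
slope, intercept = fit_line(absorbance, fluorine)

# Predict two held-out samples from their absorbance.
pred = [slope * a + intercept for a in [0.15, 0.35]]
print(pred)  # ~[150.0, 350.0]
```

External validation, as in the study, would compare such predictions against independent wet-chemistry reference values via R² and RMSEP.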
NASA Astrophysics Data System (ADS)
Aldrin, John C.; Lindgren, Eric A.
2018-04-01
This paper expands on the objective and motivation for NDE-based characterization and includes a discussion of the current approach using model-assisted inversion being pursued within the Air Force Research Laboratory (AFRL). This includes a discussion of the multiple model-based methods that can be used, including physics-based models, deep machine learning, and heuristic approaches. The benefits and drawbacks of each method is reviewed and the potential to integrate multiple methods is discussed. Initial successes are included to highlight the ability to obtain quantitative values of damage. Additional steps remaining to realize this capability with statistical metrics of accuracy are discussed, and how these results can be used to enable probabilistic life management are addressed. The outcome of this initiative will realize the long-term desired capability of NDE methods to provide quantitative characterization to accelerate certification of new materials and enhance life management of engineered systems.
Investigation on microfluidic particles manipulation by holographic 3D tracking strategies
NASA Astrophysics Data System (ADS)
Cacace, Teresa; Paturzo, Melania; Memmolo, Pasquale; Vassalli, Massimo; Fraldi, Massimiliano; Mensitieri, Giuseppe; Ferraro, Pietro
2017-06-01
We demonstrate a 3D holographic tracking method to investigate particle motion in a microfluidic channel, both while unperturbed and while inducing migration through microfluidic manipulation. Digital holography (DH) in microscopy is a full-field, label-free imaging technique able to provide quantitative phase contrast. The employed 3D tracking method proceeds in steps. First, displacements along the optical axis are assessed by numerical refocusing criteria. In particular, an automatic refocusing method to recover the particles' axial position is implemented employing a contrast-based refocusing criterion. Then, the transverse position of the in-focus object is evaluated through quantitative phase-map segmentation methods and a centroid-based 2D tracking strategy. DH is thus suggested as a powerful approach for controlling the manipulation of particles and biological samples, as well as a possible aid to the precise design and implementation of advanced lab-on-chip microfluidic devices.
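A contrast-based refocusing criterion of the kind mentioned above can be sketched by ranking candidate reconstruction planes by image variance, one common contrast measure; the paper's exact metric may differ:

```python
# Sketch of contrast-based autofocus: among numerically reconstructed
# planes along the optical axis, pick the one whose image has the highest
# variance. Variance is one common contrast measure; the paper's exact
# criterion is not specified here.

def contrast(image):
    """Pixel-intensity variance of a 2D image (list of rows)."""
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

def best_focus(planes):
    """planes: dict mapping axial distance z -> reconstructed 2D image.
    Returns the z whose reconstruction has the highest contrast."""
    return max(planes, key=lambda z: contrast(planes[z]))

# Toy reconstructions: the in-focus plane shows the sharpest structure.
planes = {
    1.0: [[5, 5], [5, 5]],        # defocused: flat
    2.0: [[0, 10], [10, 0]],      # in focus: high contrast
    3.0: [[4, 6], [6, 4]],        # slightly defocused
}
print(best_focus(planes))  # 2.0
```

The recovered axial position is then combined with centroid-based 2D tracking in the in-focus plane to give the full 3D trajectory.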
Caries Detection Methods Based on Changes in Optical Properties between Healthy and Carious Tissue
Karlsson, Lena
2010-01-01
A conservative, noninvasive or minimally invasive approach to clinical management of dental caries requires diagnostic techniques capable of detecting and quantifying lesions at an early stage, when progression can be arrested or reversed. Objective evidence of initiation of the disease can be detected in the form of distinct changes in the optical properties of the affected tooth structure. Caries detection methods based on changes in a specific optical property are collectively referred to as optically based methods. This paper presents a simple overview of the feasibility of three such technologies for quantitative or semiquantitative assessment of caries lesions. Two of the techniques are well-established: quantitative light-induced fluorescence, which is used primarily in caries research, and laser-induced fluorescence, a commercially available method used in clinical dental practice. The third technique, based on near-infrared transillumination of dental enamel is in the developmental stages. PMID:20454579
Ranacher, Peter; Tzavella, Katerina
2014-05-27
In geographic information science, a plethora of different approaches and methods is used to assess the similarity of movement. Some of these approaches term two moving objects similar if they share akin paths. Others require objects to move at similar speed and yet others consider movement similar if it occurs at the same time. We believe that a structured and comprehensive classification of movement comparison measures is missing. We argue that such a classification not only depicts the status quo of qualitative and quantitative movement analysis, but also allows for identifying those aspects of movement for which similarity measures are scarce or entirely missing. In this review paper we, first, decompose movement into its spatial, temporal, and spatiotemporal movement parameters. A movement parameter is a physical quantity of movement, such as speed, spatial path, or temporal duration. For each of these parameters we then review qualitative and quantitative methods of how to compare movement. Thus, we provide a systematic and comprehensive classification of different movement similarity measures used in geographic information science. This classification is a valuable first step toward a GIS toolbox comprising all relevant movement comparison methods.
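Comparing movement by a single parameter, as the decomposition above suggests, can be sketched for speed: derive speed profiles from timestamped positions and take their mean absolute difference. This is an illustrative measure, not one prescribed by the paper:

```python
# Sketch of comparing one movement parameter (speed) between two moving
# objects: compute per-segment speeds from (t, x, y) fixes, then take the
# mean absolute difference of the profiles. An illustrative measure only.

import math

def speeds(track):
    """track: list of (t, x, y) fixes. Returns per-segment speeds."""
    out = []
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        out.append(math.hypot(x1 - x0, y1 - y0) / (t1 - t0))
    return out

def speed_dissimilarity(a, b):
    """Mean absolute difference between two speed profiles."""
    sa, sb = speeds(a), speeds(b)
    return sum(abs(u - v) for u, v in zip(sa, sb)) / min(len(sa), len(sb))

walk = [(0, 0, 0), (1, 1, 0), (2, 2, 0)]   # steady 1 unit/s
jog = [(0, 0, 0), (1, 2, 0), (2, 4, 0)]    # steady 2 units/s
print(speed_dissimilarity(walk, jog))  # 1.0
```

Analogous measures could be defined for the spatial path (e.g. distance between trajectories) or for temporal parameters such as duration, mirroring the classification proposed above.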
Determining characteristics of artificial near-Earth objects using observability analysis
NASA Astrophysics Data System (ADS)
Friedman, Alex M.; Frueh, Carolin
2018-03-01
Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
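For a linear (or linearized) system, the classical observability test builds the matrix O = [C; CA; ...; CA^(n-1)] and checks that it has full column rank n. The paper's analysis is numerical and applied to the nonlinear orbit problem; the sketch below only shows the linear criterion on a toy system:

```python
# Classical linear observability test: stack C, CA, ..., CA^(n-1) and
# check full column rank via Gaussian elimination. Pure Python, toy-scale.

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def rank(M, eps=1e-9):
    """Rank of a matrix (list of rows) by row reduction."""
    M = [row[:] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if abs(M[i][c]) > eps), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and abs(M[i][c]) > eps:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def is_observable(A, C):
    """True if the pair (A, C) is observable (observability matrix has
    full column rank n)."""
    n = len(A)
    O, block = [], C
    for _ in range(n):
        O.extend(block)
        block = mat_mul(block, A)
    return rank(O) == n

# Double integrator (state = position, velocity), position-only measurement:
A = [[1.0, 1.0], [0.0, 1.0]]   # discrete-time dynamics
C = [[1.0, 0.0]]               # measure position only
print(is_observable(A, C))     # True: velocity becomes observable over time
```

Measuring velocity alone instead (`C = [[0, 1]]`) leaves position unobservable, illustrating how the choice of measurements determines which state components, and by extension which extended-state parameters, can be recovered.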
ERIC Educational Resources Information Center
Benítez, Isabel; Padilla, José-Luis
2014-01-01
Differential item functioning (DIF) can undermine the validity of cross-lingual comparisons. While a lot of efficient statistics for detecting DIF are available, few general findings have been found to explain DIF results. The objective of the article was to study DIF sources by using a mixed method design. The design involves a quantitative phase…
Sun, Jared H; Twomey, Michele; Tran, Jeffrey; Wallis, Lee A
2012-11-01
Ninety percent of emergency incidents occur in developing countries, and this number is only expected to grow as these nations develop. As a result, governments in developing countries are establishing emergency care systems. However, there is currently no widely usable, objective method to monitor or research the rapid growth of emergency care in the developing world. This paper analyses current quantitative methods for assessing emergency care in developing countries and proposes a more appropriate method. Currently accepted methods to quantitatively assess the efficacy of emergency care systems cannot be performed in most developing countries owing to weak record-keeping infrastructure and the inappropriateness of applying Western-derived coefficients to developing-country conditions. As a result, although emergency care in the developing world is rapidly growing, researchers and clinicians are unable to objectively measure its progress or determine which policies work best in their respective countries. We propose the TEWS methodology, a simple analytical tool suited to low-resource, developing countries. By relying on the most basic universal parameters, the simplest calculations and a straightforward protocol, the TEWS methodology allows for widespread analysis of emergency care in the developing world. This could become essential in the establishment and growth of new emergency care systems worldwide.
NASA Astrophysics Data System (ADS)
Alizadeh, Mohammad Reza; Nikoo, Mohammad Reza; Rakhshandehroo, Gholam Reza
2017-08-01
Sustainable management of water resources necessitates close attention to social, economic and environmental aspects such as water quality and quantity concerns and potential conflicts. This study presents a new fuzzy-based multi-objective compromise methodology to determine the socio-optimal and sustainable policies for hydro-environmental management of groundwater resources, which simultaneously considers the conflicts and negotiation of involved stakeholders, uncertainties in decision makers' preferences, existing uncertainties in the groundwater parameters and groundwater quality and quantity issues. The fuzzy multi-objective simulation-optimization model is developed based on qualitative and quantitative groundwater simulation model (MODFLOW and MT3D), multi-objective optimization model (NSGA-II), Monte Carlo analysis and Fuzzy Transformation Method (FTM). Best compromise solutions (best management policies) on trade-off curves are determined using four different Fuzzy Social Choice (FSC) methods. Finally, a unanimity fallback bargaining method is utilized to suggest the most preferred FSC method. Kavar-Maharloo aquifer system in Fars, Iran, as a typical multi-stakeholder multi-objective real-world problem is considered to verify the proposed methodology. Results showed an effective performance of the framework for determining the most sustainable allocation policy in groundwater resource management.
Quantitative fluorescence microscopy and image deconvolution.
Swedlow, Jason R
2013-01-01
Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches: deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Their proper use demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image.
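The restoration approach described above (find an object that, convolved with the point-spread function, reproduces the image) can be illustrated with Richardson-Lucy iteration, a classic restoration algorithm. This 1-D toy example is our sketch, not the chapter's specific implementation.

```python
import numpy as np

# Minimal 1-D Richardson-Lucy restoration sketch: iteratively refine an
# object estimate so that estimate * psf matches the observed image.

def richardson_lucy(image, psf, n_iter=50):
    psf = psf / psf.sum()
    estimate = np.full_like(image, image.mean(), dtype=float)
    psf_flip = psf[::-1]  # mirrored PSF for the correction step
    for _ in range(n_iter):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)  # guard divide-by-zero
        estimate *= np.convolve(ratio, psf_flip, mode="same")
    return estimate

# Blur a point source, then recover it
true = np.zeros(31); true[15] = 1.0
psf = np.array([0.25, 0.5, 0.25])
blurred = np.convolve(true, psf, mode="same")
restored = richardson_lucy(blurred, psf)
print(restored.argmax())  # → 15, the original point-source position
```

The same multiplicative update generalizes directly to 2-D and 3-D image stacks, which is where mishandled out-of-focus light makes restoration preferable to simple deblurring.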
Mohler, M. Jane; Coons, Stephen Joel; Hornbrook, Mark C.; Herrinton, Lisa J.; Wendel, Christopher S.; Grant, Marcia; Krouse, Robert S.
2008-01-01
Objectives The objective of this paper is to describe the complex mixed-methods design of a study conducted to assess health-related quality of life (HRQOL) outcomes and ostomy-related obstacles and adjustments among long-term (>five years) colorectal cancer (CRC) survivors with ostomies (cases) and without ostomies (controls). In addition, details are provided regarding the study sample and the psychometric properties of the quantitative data collection measures used. Subsequent manuscripts will present the study findings. Research Design and Methods The study design involved a cross-sectional mail survey for collecting quantitative data and focus groups for collecting qualitative data. The study subjects were individuals identified as long-term CRC survivors within a community-based health maintenance organization's enrolled population. Focus groups comprised of cases and divided by gender and HRQOL high and low quartile contrasts (based on the mail survey data) were conducted. Main Outcome Measures The modified City of Hope Quality of Life (mCOH-QOL)-Ostomy and SF-36v2 questionnaires were used in the mail survey. An abridged version of the mCOH-QOL-Ostomy was used for the control subjects. Focus groups explored ostomy-related barriers to self-care, adaptation methods/skills, and advice for others with an ostomy. Results The survey response rate was 52% (679/1308) and 34 subjects participated in focus groups. The internal consistency reliability estimates for the mCOH-QOL-Ostomy and SF-36v2 questionnaires were very acceptable for group comparisons. In addition, evidence supports the construct validity of the abridged version of the mCOH-QOL-Ostomy. Study limitations include potential non-response bias and limited minority participation. Conclusions We were able to successfully recruit long-term CRC survivors into this study and the psychometric properties of the quantitative measures used were quite acceptable. 
Mixed-methods designs, such as the one used in this study, may be useful in identification and further elucidation of common problems, coping strategies, and HRQOL outcomes among long-term cancer survivors. PMID:18544186
Engineering Ethics Education : Its Necessity, Objectives, Methods, Current State, and Challenges
NASA Astrophysics Data System (ADS)
Fudano, Jun
The importance of engineering ethics education has become widely recognized in the industrialized countries including Japan. This paper examines the background against which engineering ethics education is required, and reviews its objectives, methods, and challenges, as well as its current state. In pointing out important issues associated with the apparent acceptance and quantitative development of ethics education, especially after the establishment of the Japan Accreditation Board for Engineering Education in 1999, the author stresses that the most serious problem is the lack of common understanding on the objectives of engineering ethics education. As a strategy to improve the situation, the so-called “Ethics-across-the-Curriculum” approach is introduced. The author also claims that business/organization ethics which is consistent with engineering ethics should be promoted in Japan.
An object tracking method based on guided filter for night fusion image
NASA Astrophysics Data System (ADS)
Qian, Xiaoyan; Wang, Yuedong; Han, Lei
2016-01-01
Online object tracking is a challenging problem as it entails learning an effective model to account for appearance changes caused by intrinsic and extrinsic factors. In this paper, we propose a novel online object tracking method with a guided image filter for accurate and robust tracking in night fusion images. First, frame differencing is applied to produce a coarse target, which helps to generate the observation models. Under the restriction of these models and the local source image, the guided filter generates a sufficient and accurate foreground target. Accurate boundaries of the target can then be extracted from the detection results. Finally, timely updating of the observation models helps to avoid tracking drift. Both qualitative and quantitative evaluations on challenging image sequences demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.
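The first stage described above, frame differencing to obtain a coarse target, is straightforward to sketch; the threshold and frames below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Coarse-target detection by frame differencing: mark pixels whose
# intensity changed by more than a threshold between consecutive frames.

def coarse_target_mask(prev_frame, cur_frame, thresh=20):
    diff = np.abs(cur_frame.astype(int) - prev_frame.astype(int))
    return diff > thresh

prev = np.zeros((8, 8), dtype=np.uint8)
cur = prev.copy()
cur[2:5, 3:6] = 200            # a bright object appears in the new frame
mask = coarse_target_mask(prev, cur)
ys, xs = np.nonzero(mask)
print(ys.min(), ys.max(), xs.min(), xs.max())  # → 2 4 3 5 (bounding box)
```

In the paper's pipeline this coarse mask would then seed the observation models and the guided-filter refinement step.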
Optimization of Dual-Energy Xenon-CT for Quantitative Assessment of Regional Pulmonary Ventilation
Fuld, Matthew K.; Halaweish, Ahmed; Newell, John D.; Krauss, Bernhard; Hoffman, Eric A.
2013-01-01
Objective Dual-energy X-ray computed tomography (DECT) offers visualization of the airways and quantitation of regional pulmonary ventilation using a single breath of inhaled xenon gas. In this study we seek to optimize scanning protocols for DECT xenon gas ventilation imaging of the airways and lung parenchyma and to characterize the quantitative nature of the developed protocols through a series of test-object and animal studies. Materials and Methods The Institutional Animal Care and Use Committee approved all animal studies reported here. A range of xenon-oxygen gas mixtures (0, 20, 25, 33, 50, 66, 100%; balance oxygen) were scanned in syringes and balloon test-objects to optimize the delivered gas mixture for assessment of regional ventilation while allowing for the development of improved three-material decomposition calibration parameters. Additionally, to alleviate gravitational effects on xenon gas distribution, we replaced a portion of the oxygen in the xenon/oxygen gas mixture with helium and compared gas distributions in a rapid-prototyped human central-airway test-object. Additional syringe tests were performed to determine if the introduction of helium had any effect on xenon quantitation. Xenon gas mixtures were delivered to anesthetized swine in order to assess airway and lung parenchymal opacification while evaluating various DECT scan acquisition settings. Results Attenuation curves for xenon were obtained from the syringe test objects and were used to develop improved three-material decomposition parameters (HU enhancement per percent xenon: within the chest phantom, 2.25 at 80 kVp, 1.7 at 100 kVp, and 0.76 at 140 kVp with tin filtration; in open air, 2.5 at 80 kVp, 1.95 at 100 kVp, and 0.81 at 140 kVp with tin filtration). The addition of helium improved the distribution of xenon gas to the gravitationally non-dependent portion of the airway tree test-object, while not affecting quantitation of xenon in the three-material decomposition DECT.
A 40%Xe/40%He/20%O2 mixture provided good signal-to-noise, greater than the Rose criterion (SNR > 5), while avoiding the gravitational effects seen with similar concentrations of xenon in a 60%O2 mixture. The 80/140-kVp (tin-filtered) pair provided improved SNR compared with 100/140-kVp in a swine with a thoracic transverse density equivalent to that of a human subject with a body mass index of 33. Airways were brighter in the 80/140-kVp scan (80/140Sn, 31.6%; 100/140Sn, 25.1%) with considerably lower noise (80/140Sn, CV of 0.140; 100/140Sn, CV of 0.216). Conclusion In order to provide a truly quantitative measure of regional lung function with xenon-DECT, the basic protocols and parameter calibrations needed to be better understood and quantified. It is critically important to understand the fundamentals of new techniques to allow for proper implementation and interpretation of their results prior to widespread usage. With the use of an in-house-derived xenon calibration curve for three-material decomposition, rather than the scanner-supplied calibration, and a xenon/helium/oxygen mixture, we demonstrate highly accurate quantitation of xenon gas volumes and avoid gravitational effects on gas distribution. This study provides a foundation for other researchers to use and test these methods with the goal of clinical translation. PMID:23571834
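The reported chest-phantom slopes (2.25 HU per %Xe at 80 kVp and 0.76 HU per %Xe at 140 kVp with tin filtration) suffice to sketch how xenon concentration can be recovered from dual-energy enhancement. A real three-material decomposition also separates soft tissue and air; this toy example, our own simplification, solves only for xenon from the enhancement over a baseline scan.

```python
import numpy as np

# Slopes from the abstract: HU enhancement per percent xenon at the
# (80 kVp, tin-filtered 140 kVp) energy pair, inside the chest phantom.
slopes = np.array([2.25, 0.76])

def xenon_percent(delta_hu):
    """Least-squares xenon fraction (%) from the enhancement measured
    at the two energies; overdetermined: two equations, one unknown."""
    x, *_ = np.linalg.lstsq(slopes[:, None],
                            np.asarray(delta_hu, dtype=float), rcond=None)
    return float(x[0])

measured = slopes * 40.0                 # simulate a voxel holding 40% xenon
print(round(xenon_percent(measured), 2))  # → 40.0
```

With noisy data the least-squares fit averages the information from both energy bins rather than trusting either alone.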
NASA Astrophysics Data System (ADS)
Mikheeva, A. I.; Tutubalina, O. V.; Zimin, M. V.; Golubeva, E. I.
2017-12-01
The tundra-taiga ecotone plays a significant role in northern ecosystems. Due to global climatic change, the vegetation of the ecotone is a key object of many remote-sensing studies. The interpretation of vegetation and non-vegetation objects of the tundra-taiga ecotone in satellite imagery of moderate resolution is complicated by the difficulty of extracting these objects from the spectral and spatial mixtures within a pixel. This article describes a method for subpixel classification of a Terra ASTER satellite image for vegetation mapping of the tundra-taiga ecotone in the Tuliok River area, Khibiny Mountains, Russia. It is demonstrated that this method allows the position of the boundaries of ecotone objects and their abundance to be determined on the basis of quantitative criteria, which provides a more accurate characterization of ecotone vegetation compared with the per-pixel approach to automatic imagery interpretation.
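The generic idea behind subpixel classification is linear spectral unmixing: a pixel's spectrum is modeled as a weighted mixture of pure class ("endmember") spectra, and the weights give each class's abundance within the pixel. The endmember spectra and class names below are invented for illustration; the paper's actual ASTER workflow is more involved.

```python
import numpy as np

# Illustrative linear unmixing: rows are hypothetical endmember spectra
# (reflectance in three bands), and a mixed pixel is decomposed into
# per-class abundances that sum to one.

endmembers = np.array([
    [0.05, 0.40, 0.20],   # "shrub tundra"  (visible, NIR, SWIR)
    [0.08, 0.30, 0.25],   # "open woodland"
    [0.20, 0.25, 0.30],   # "bare rock"
])

def abundances(pixel):
    """Least-squares abundances, clipped to be nonnegative and
    renormalized to sum to 1 (a common simple constraint scheme)."""
    a, *_ = np.linalg.lstsq(endmembers.T, pixel, rcond=None)
    a = np.clip(a, 0, None)
    return a / a.sum()

mixed = 0.6 * endmembers[0] + 0.4 * endmembers[1]  # 60/40 tundra-woodland pixel
print(np.round(abundances(mixed), 2))              # ≈ [0.6, 0.4, 0.0]
```

Thresholding such abundance maps is one way to place ecotone boundaries on quantitative criteria rather than per-pixel labels.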
High-resolution ab initio three-dimensional x-ray diffraction microscopy
Chapman, Henry N.; Barty, Anton; Marchesini, Stefano; ...
2006-01-01
Coherent x-ray diffraction microscopy is a method of imaging nonperiodic isolated objects at resolutions limited, in principle, by only the wavelength and largest scattering angles recorded. We demonstrate x-ray diffraction imaging with high resolution in all three dimensions, as determined by a quantitative analysis of the reconstructed volume images. These images are retrieved from the three-dimensional diffraction data using no a priori knowledge about the shape or composition of the object, which has never before been demonstrated on a nonperiodic object. We also construct two-dimensional images of thick objects with greatly increased depth of focus (without loss of transverse spatial resolution). These methods can be used to image biological and materials science samples at high resolution with x-ray undulator radiation and establish the techniques to be used in atomic-resolution ultrafast imaging at x-ray free-electron laser sources.
An experimental comparison of online object-tracking algorithms
NASA Astrophysics Data System (ADS)
Wang, Qing; Chen, Feng; Xu, Wenli; Yang, Ming-Hsuan
2011-09-01
This paper reviews and evaluates several state-of-the-art online object tracking algorithms. Notwithstanding decades of effort, object tracking remains a challenging problem due to factors such as illumination, pose, scale, deformation, motion blur, noise, and occlusion. To account for appearance change, most recent tracking algorithms focus on robust object representations and effective state prediction. In this paper, we analyze the components of each tracking method and identify their key roles in dealing with specific challenges, thereby shedding light on how to choose and design algorithms for different situations. We compare state-of-the-art online tracking methods, including the IVT, VRT, FragT, BoostT, SemiT, BeSemiT, L1T, MILT, VTD, and TLD algorithms, on numerous challenging sequences and evaluate them with different performance metrics. The qualitative and quantitative comparative results demonstrate the strengths and weaknesses of these algorithms.
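Two performance metrics commonly used in such quantitative tracking comparisons are the center location error and the bounding-box overlap (intersection over union). The (x, y, w, h) box convention and the sample boxes below are assumptions for this sketch.

```python
# Standard quantitative tracking metrics: distance between box centers,
# and intersection-over-union of a predicted box against ground truth.

def center_error(a, b):
    ax, ay = a[0] + a[2] / 2, a[1] + a[3] / 2
    bx, by = b[0] + b[2] / 2, b[1] + b[3] / 2
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

def iou(a, b):
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2 = min(a[0] + a[2], b[0] + b[2])
    y2 = min(a[1] + a[3], b[1] + b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union

truth = (10, 10, 20, 20)
predicted = (15, 10, 20, 20)          # tracker drifted 5 px to the right
print(center_error(truth, predicted), round(iou(truth, predicted), 3))  # → 5.0 0.6
```

Averaging these per-frame scores over a sequence, or plotting the fraction of frames above an overlap threshold, yields the success curves typically reported in tracker benchmarks.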
Real-time quantitative Schlieren imaging by fast Fourier demodulation of a checkered backdrop
NASA Astrophysics Data System (ADS)
Wildeman, Sander
2018-06-01
A quantitative synthetic Schlieren imaging (SSI) method based on fast Fourier demodulation is presented. Instead of a random dot pattern (as usually employed in SSI), a 2D periodic pattern (such as a checkerboard) is used as a backdrop to the refractive object of interest. The range of validity and accuracy of this "Fast Checkerboard Demodulation" (FCD) method are assessed using both synthetic data and experimental recordings of patterns optically distorted by small waves on a water surface. It is found that the FCD method is at least as accurate as sophisticated, multi-stage, digital image correlation (DIC) or optical flow (OF) techniques used with random dot patterns, and it is significantly faster. Efficient, fully vectorized, implementations of both the FCD and DIC/OF schemes developed for this study are made available as open source Matlab scripts.
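The core of Fourier demodulation can be shown in a 1-D analogue: a periodic backdrop of known carrier frequency is apparently displaced by the refractive object, and the displacement is recovered from the phase of the carrier's Fourier component. The published FCD method works on 2-D checkerboards with windowed local phase; this uniform-shift toy is our simplification.

```python
import numpy as np

# 1-D demodulation sketch: recover an apparent displacement from the
# phase shift of a sinusoidal backdrop at a known carrier frequency.

n, k = 256, 16                       # samples, and carrier period in samples
x = np.arange(n)
shift = 1.5                          # apparent displacement (samples)
pattern = np.cos(2 * np.pi * x / k)                # reference backdrop
distorted = np.cos(2 * np.pi * (x - shift) / k)    # optically displaced view

def phase_at_carrier(signal, cycles):
    """Phase of the FFT bin at the carrier frequency (cycles per record)."""
    return np.angle(np.fft.fft(signal)[cycles])

dphi = phase_at_carrier(distorted, n // k) - phase_at_carrier(pattern, n // k)
recovered = -dphi * k / (2 * np.pi)  # phase shift -> displacement in samples
print(round(recovered, 3))           # → 1.5
```

Because the demodulation is a single FFT and phase comparison rather than a patch-by-patch correlation search, it is much cheaper than DIC-style matching, which is the speed advantage the abstract reports.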
Metrology Standards for Quantitative Imaging Biomarkers
Obuchowski, Nancy A.; Kessler, Larry G.; Raunig, David L.; Gatsonis, Constantine; Huang, Erich P.; Kondratovich, Marina; McShane, Lisa M.; Reeves, Anthony P.; Barboriak, Daniel P.; Guimaraes, Alexander R.; Wahl, Richard L.
2015-01-01
Although investigators in the imaging community have been active in developing and evaluating quantitative imaging biomarkers (QIBs), the development and implementation of QIBs have been hampered by the inconsistent or incorrect use of terminology or methods for technical performance and statistical concepts. Technical performance is an assessment of how a test performs in reference objects or subjects under controlled conditions. In this article, some of the relevant statistical concepts are reviewed, methods that can be used for evaluating and comparing QIBs are described, and some of the technical performance issues related to imaging biomarkers are discussed. More consistent and correct use of terminology and study design principles will improve clinical research, advance regulatory science, and foster better care for patients who undergo imaging studies. © RSNA, 2015 PMID:26267831
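One elementary technical-performance statistic of the kind this article covers is the repeatability coefficient from test-retest measurements of a QIB: RC = 1.96 * sqrt(2) * within-subject SD, the smallest change detectable beyond measurement noise. The measurement data below are invented for illustration.

```python
import math

# Repeatability coefficient from paired test-retest QIB measurements.
# Within-subject variance is estimated as mean squared difference / 2.

def repeatability_coefficient(pairs):
    wsv = sum((a - b) ** 2 for a, b in pairs) / (2 * len(pairs))
    return 1.96 * math.sqrt(2) * math.sqrt(wsv)

scans = [(5.1, 5.3), (3.2, 3.0), (7.8, 7.5), (4.4, 4.6)]  # e.g. repeated tumor SUVs
rc = repeatability_coefficient(scans)
print(round(rc, 2))  # → 0.45: differences larger than this suggest real change
```

Reporting RC alongside bias against a reference object is exactly the kind of consistent terminology and study design the article advocates.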
Multiscale moment-based technique for object matching and recognition
NASA Astrophysics Data System (ADS)
Thio, HweeLi; Chen, Liya; Teoh, Eam-Khwang
2000-03-01
A new method is proposed to extract features from an object for matching and recognition. The proposed features combine local and global characteristics: local characteristics from the 1-D signature function defined at each pixel on the object boundary, and global characteristics from the moments generated from that signature function. The boundary of the object is first extracted; the signature function is then generated by computing, at every point on the boundary, the angle between two lines as a function of position along the boundary. This signature function is position, scale and rotation invariant (PSRI). The shape of the signature function is then described quantitatively using moments; the moments of the signature function are thus global characteristics of a local feature set. Using moments as the eventual features instead of the signature function itself reduces the time and complexity of an object matching application. Multiscale moments are implemented to produce several sets of moments for more accurate matching; the multiscale technique is essentially a coarse-to-fine procedure and makes the proposed method more robust to noise. The method is proposed to match and recognize objects under simple transformations such as translation, scale change, rotation and skewing. A simple logo indexing system is implemented to illustrate the performance of the proposed method.
Benefit-risk analysis : a brief review and proposed quantitative approaches.
Holden, William L
2003-01-01
Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
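The decision rule in the abstract, treatment is acceptable when the NNT is less than the relative-value-adjusted NNH, can be shown with basic arithmetic. The event rates and the relative-value weight below are invented, and the exact adjustment form used here (scaling the excess harm rate by the patient-derived utility weight) is a schematic assumption rather than the published RV-NNT formula.

```python
# Schematic benefit-risk arithmetic in the spirit of RV-NNT analysis.
# All rates and the relative-value weight are hypothetical.

def nnt(control_rate, treated_rate):
    """Number needed to treat: reciprocal of the absolute risk reduction."""
    return 1 / (control_rate - treated_rate)

def rv_adjusted_nnh(harm_treated, harm_control, relative_value):
    """Number needed to harm, with the excess harm weighted by the
    patients' relative value of avoiding that harm (assumed form)."""
    return 1 / ((harm_treated - harm_control) * relative_value)

benefit_nnt = nnt(0.40, 0.25)                 # 15% absolute benefit -> NNT ~ 6.7
adjusted_nnh = rv_adjusted_nnh(0.08, 0.05, 0.5)  # 3% excess harm, half-weighted
print(benefit_nnt < adjusted_nnh)             # → True: benefit outweighs weighted harm
```

On these numbers, roughly 7 patients must be treated for one extra responder, while about 67 (after utility weighting) must be treated for one weighted adverse event, so the profile favors treatment.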
NASA Technical Reports Server (NTRS)
Kirby, Michelle R.
2002-01-01
The TIES method is a forecasting environment whereby the decision-maker has the ability to easily assess and trade-off the impact of various technologies without sophisticated and time-consuming mathematical formulations. TIES provides a methodical approach where technically feasible alternatives can be identified with accuracy and speed to reduce design cycle time, and subsequently, life cycle costs, and was achieved through the use of various probabilistic methods, such as Response Surface Methodology and Monte Carlo Simulations. Furthermore, structured and systematic techniques are utilized from other fields to identify possible concepts and evaluation criteria by which comparisons can be made. This objective is achieved by employing the use of Morphological Matrices and Multi-Attribute Decision Making techniques. Through the execution of each step, a family of design alternatives for a given set of customer requirements can be identified and assessed subjectively or objectively. This methodology allows for more information (knowledge) to be brought into the earlier phases of the design process and will have direct implications on the affordability of the system. The increased knowledge allows for optimum allocation of company resources and quantitative justification for program decisions. Finally, the TIES method provided novel results and quantitative justification to facilitate decision making in the early stages of design so as to produce affordable and quality products.
Savel'eva, N B; Bykovskaia, N Iu; Dikunets, M A; Bolotov, S L; Rodchenkov, G M
2010-01-01
The objective of this study was to demonstrate the possibility of using deuterated compounds as internal standards for the quantitative analysis of morphine by gas chromatography with mass-selective detection for the purposes of doping control. The paper focuses on the problems associated with the use of deuterated morphine-D3 as the internal standard. Quantitative characteristics of the resulting calibration dependence are presented, along with the uncertainty values obtained in measurements using deuterated morphine-D6. An approach to assessing the method bias associated with the application of morphine-D6 as the deuterated internal standard is described.
Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.
He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan
2009-07-01
Facial paralysis is the loss of voluntary muscle movement of one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on the local binary patterns (LBPs) on the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. Support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrates its accuracy and efficiency.
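The building block of the features described above is the basic 3x3 local binary pattern: each neighbor of a pixel contributes one bit, set when the neighbor is at least as bright as the center. The paper's multiresolution uniform-LBP extension over temporal-spatial blocks is more elaborate; this is the elementary operator only, with an illustrative patch.

```python
import numpy as np

# 8-bit LBP code of the center pixel of a 3x3 patch: neighbors are
# visited clockwise from the top-left; each >= center sets one bit.

def lbp_code(patch):
    center = patch[1, 1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (i, j) in enumerate(order):
        if patch[i, j] >= center:
            code |= 1 << bit
    return code

patch = np.array([[9, 9, 9],
                  [1, 5, 9],
                  [1, 1, 1]])
print(lbp_code(patch))  # → 15: the bright top row and right column set bits 0-3
```

Histograms of such codes over facial regions, compared between the two sides of the face, give the symmetry measure the abstract feeds into the SVM classifier.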
Reuse Metrics for Object Oriented Software
NASA Technical Reports Server (NTRS)
Bieman, James M.
1998-01-01
One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.
A quantitative approach to evolution of music and philosophy
NASA Astrophysics Data System (ADS)
Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano
2012-08-01
The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
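The bootstrap step described above can be shown in miniature: resample a small set of observations with replacement to obtain a spread for a statistic despite the tiny sample. The seven "opposition index" values below are invented for illustration, not the study's data.

```python
import random

# Bootstrap sketch: resample with replacement to estimate the sampling
# distribution of the mean of a small (n=7) set of feature values.

random.seed(1)

def bootstrap_means(sample, n_boot=1000):
    means = []
    for _ in range(n_boot):
        resample = [random.choice(sample) for _ in sample]
        means.append(sum(resample) / len(resample))
    return means

opposition_index = [0.61, 0.55, 0.72, 0.48, 0.66, 0.59, 0.70]  # 7 "philosophers"
means = sorted(bootstrap_means(opposition_index))
lo, hi = means[25], means[974]   # rough 95% percentile interval
print(lo <= sum(opposition_index) / 7 <= hi)  # sample mean inside the interval
```

In the study, the same resampling idea generates hundreds of artificial composers and philosophers so that the multivariate statistics are not dominated by the seven original names.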
Calibration methods influence quantitative material decomposition in photon-counting spectral CT
NASA Astrophysics Data System (ADS)
Curtis, Tyler E.; Roeder, Ryan K.
2017-03-01
Photon-counting detectors and nanoparticle contrast agents can potentially enable molecular imaging and material decomposition in computed tomography (CT). Material decomposition has been investigated using both simulated and acquired data sets. However, the effect of calibration methods on material decomposition has not been systematically investigated. Therefore, the objective of this study was to investigate the influence of the range and number of contrast agent concentrations within a modular calibration phantom on quantitative material decomposition. A commercially available photon-counting spectral micro-CT (MARS Bioimaging) was used to acquire images with five energy bins selected to normalize photon counts and leverage the contrast agent k-edge. Material basis matrix values were determined using multiple linear regression models and material decomposition was performed using a maximum a posteriori estimator. The accuracy of quantitative material decomposition was evaluated by the root mean squared error (RMSE), specificity, sensitivity, and area under the curve (AUC). An increased maximum concentration (range) in the calibration significantly improved RMSE, specificity and AUC. The effects of an increased number of concentrations in the calibration were not statistically significant for the conditions in this study. The overall results demonstrated that the accuracy of quantitative material decomposition in spectral CT is significantly influenced by calibration methods, which must therefore be carefully considered for the intended diagnostic imaging application.
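The calibration step described above, fitting material basis values by linear regression on phantom vials of known concentration, can be sketched as a per-energy-bin slope fit. All concentrations and slopes below are illustrative, not values from the study.

```python
import numpy as np

# Fit the contrast-agent column of a material basis matrix: HU signal
# per unit concentration in each energy bin, by least squares over
# calibration vials of known concentration (noiseless toy data).

concentrations = np.array([0.0, 2.0, 4.0, 8.0, 16.0])  # mg/mL in vials
true_slopes = np.array([30.0, 22.0, 15.0, 9.0, 5.0])   # HU per mg/mL, 5 bins
measured = np.outer(concentrations, true_slopes)       # vials x energy bins

# One regression per energy bin, done jointly: x has shape (1, n_bins)
slopes, *_ = np.linalg.lstsq(concentrations[:, None], measured, rcond=None)
print(np.round(slopes[0], 1))  # recovers the HU-per-concentration basis values
```

With noisy data, extending the maximum calibration concentration lengthens the lever arm of the regression, which is consistent with the study's finding that range, more than the number of concentrations, drives decomposition accuracy.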
An Inexpensive and Simple Method to Demonstrate Soil Water and Nutrient Flow
ERIC Educational Resources Information Center
Nichols, K. A.; Samson-Liebig, S.
2011-01-01
Soil quality, soil health, and soil sustainability are concepts that are being widely used but are difficult to define and illustrate, especially to a non-technical audience. The objectives of this manuscript were to develop simple and inexpensive methodologies to both qualitatively and quantitatively estimate water infiltration rates (IR),…
Evaluating the role of landscape in the spread of invasive species: the case of the biomass crop
USDA-ARS?s Scientific Manuscript database
As the development and cultivation of new bioeconomy crops and in particular biofuel feedstocks expands there is a pressing need for objective and quantitative methods to evaluate risks and benefits of their production. In particular, the traits being selected for in biofuel crops are highly aligned...
Postpartum Mental State of Mothers of Twins
ERIC Educational Resources Information Center
Brantmüller, Éva; Gyúró, Mónika; Galgán, Kitti; Pakai, Annamária
2016-01-01
Twin birth is a relevant risk factor for postnatal depression (PND). The primary objective of our study is to reveal the prevalence of suspected cases of depression and to identify some background factors among mothers of twins. We applied convenience sampling method within a retrospective, quantitative study among mothers given birth to twins for…
Fine phenotyping of pod and seed traits in Arachis germplasm accessions using digital image analysis
USDA-ARS?s Scientific Manuscript database
Reliable and objective phenotyping of peanut pod and seed traits is important for cultivar selection and genetic mapping of yield components. To develop useful and efficient methods to quantitatively define peanut pod and seed traits, a group of peanut germplasm with high levels of phenotypic varia...
Force Exertion Capacity Measurements in Haptic Virtual Environments
ERIC Educational Resources Information Center
Munih, Marko; Bardorfer, Ales; Ceru, Bojan; Bajd, Tadej; Zupan, Anton
2010-01-01
An objective test for evaluating functional status of the upper limbs (ULs) in patients with muscular distrophy (MD) is presented. The method allows for quantitative assessment of the UL functional state with an emphasis on force exertion capacity. The experimental measurement setup and the methodology for the assessment of maximal exertable force…
The environment of x ray selected BL Lacs: Host galaxies and galaxy clustering
NASA Technical Reports Server (NTRS)
Wurtz, Ron; Stocke, John T.; Ellingson, Erica; Yee, Howard K. C.
1993-01-01
Using the Canada-France-Hawaii Telescope, we have imaged a complete, flux-limited sample of Einstein Medium Sensitivity Survey BL Lacertae objects in order to study the properties of BL Lac host galaxies and to use quantitative methods to determine the richness of their galaxy cluster environments.
ERIC Educational Resources Information Center
Chen, Shih-Neng; Tseng, Jauling
2010-01-01
Objective: To assess various marginal effects of nutrient intakes, health behaviours and nutrition knowledge on the entire distribution of body mass index (BMI) across individuals. Design: Quantitative and distributional study. Setting: Taiwan. Methods: This study applies Becker's (1965) model of health production to construct an individual's BMI…
Reflective Teaching Practices in Turkish Primary School Teachers
ERIC Educational Resources Information Center
Tok, Sukran; Dolapcioglu, Sevda Dogan
2013-01-01
The objective of the study is to explore the prevalence of reflective teaching practices among Turkish primary school teachers. Qualitative and quantitative research methods were used together in the study. The sample was composed of 328 primary school teachers working in 30 primary education institutions in the town of Antakya in the province of…
ERIC Educational Resources Information Center
Heffernan, Bernadette M.
1998-01-01
Describes work done to provide staff of the Sandy Point Discovery Center with methods for evaluating exhibits and interpretive programming. Quantitative and qualitative evaluation measures were designed to assess the program's objective of estuary education. Pretest-posttest questionnaires and interviews are used to measure subjects' knowledge and…
ERIC Educational Resources Information Center
Khurana, Gauri; Henderson, Schuyler; Walter, Garry; Martin, Andres
2012-01-01
Objective: The authors reviewed and characterized conflict of interest (COI) and disclosure policies published in peer-reviewed psychiatric and nonpsychiatric journals. Methods: The authors examined peer-reviewed publications in the psychiatric (N=20) and nonpsychiatric (N=20) literature. Using qualitative and quantitative approaches, they…
ERIC Educational Resources Information Center
Wood, Alexis C.; Neale, Michael C.
2010-01-01
Objective: To describe the utility of twin studies for attention-deficit/hyperactivity disorder (ADHD) research and demonstrate their potential for the identification of alternative phenotypes suitable for genomewide association, developmental risk assessment, treatment response, and intervention targets. Method: Brief descriptions of the classic…
Study of Human Barriers upon Development of Virtual Disciplines at University of Isfahan
ERIC Educational Resources Information Center
Nikoonezhad, Sepideh; Nili, Mohammadreza; Esfahani, Ahmadreza Nasr
2015-01-01
The present study was carried out to investigate the human barriers to developing virtual majors at Isfahan University; considering its objective, it is applied research. It was conducted in a combined (quantitative-qualitative) manner via the descriptive survey method. In order to conduct the research, investigating the texts, interview…
School Administrators Skills in Organizing the Parent Participation Studies
ERIC Educational Resources Information Center
Albez, Canan; Ada, Sükrü
2017-01-01
The objective of this study is to ascertain administrator, teacher and parent opinions on the level of school administrators' skills in organising parent participation efforts. The study group, assembled according to the descriptive survey model using the quantitative method, consists of 273 school administrators, 916 teachers and 395…
Teaching Research and Practice Evaluation Skills to Graduate Social Work Students
ERIC Educational Resources Information Center
Wong, Stephen E.; Vakharia, Sheila P.
2012-01-01
Objective: The authors examined outcomes of a graduate course on evaluating social work practice that required students to use published research, quantitative measures, and single-system designs in a simulated practice evaluation project. Method: Practice evaluation projects from a typical class were analyzed for the number of research references…
Genome-Wide Association Study of Intelligence: Additive Effects of Novel Brain Expressed Genes
ERIC Educational Resources Information Center
Loo, Sandra K.; Shtir, Corina; Doyle, Alysa E.; Mick, Eric; McGough, James J.; McCracken, James; Biederman, Joseph; Smalley, Susan L.; Cantor, Rita M.; Faraone, Stephen V.; Nelson, Stanley F.
2012-01-01
Objective: The purpose of the present study was to identify common genetic variants that are associated with human intelligence or general cognitive ability. Method: We performed a genome-wide association analysis with a dense set of 1 million single-nucleotide polymorphisms (SNPs) and quantitative intelligence scores within an ancestrally…
A Meta-Analysis of Predictors of Offender Treatment Attrition and Its Relationship to Recidivism
ERIC Educational Resources Information Center
Olver, Mark E.; Stockdale, Keira C.; Wormith, J. Stephen
2011-01-01
Objective: The failure of offenders to complete psychological treatment can pose significant concerns, including increased risk for recidivism. Although a large literature identifying predictors of offender treatment attrition has accumulated, there has yet to be a comprehensive quantitative review. Method: A meta-analysis of the offender…
Multi-objective decision-making under uncertainty: Fuzzy logic methods
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1994-01-01
Selecting the best option among alternatives is often a difficult process. This process becomes even more difficult when the evaluation criteria are vague or qualitative, and when the objectives vary in importance and scope. Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
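The weighted aggregation at the heart of such fuzzy multi-objective scoring can be sketched as follows. This is a toy illustration under assumed triangular membership functions and invented criteria, not the NASA Lewis software:

```python
# Hypothetical sketch: each criterion's raw score is fuzzified (degree of
# membership in a "good" fuzzy set), weighted by its importance, and the
# weighted memberships are aggregated to rank alternatives.

def triangular(x, a, b, c):
    """Membership of x in a triangular fuzzy set with corners (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def rank_alternatives(alternatives, weights):
    """alternatives: {name: {criterion: raw score in [0, 10]}};
    weights: {criterion: importance}. Returns names sorted best-first."""
    total_w = sum(weights.values())
    def score(crit_scores):
        # "Good" is a fuzzy set peaking at 10, so membership rises with score.
        return sum(weights[c] * triangular(v, 0.0, 10.0, 20.0)
                   for c, v in crit_scores.items()) / total_w
    return sorted(alternatives, key=lambda a: score(alternatives[a]), reverse=True)
```

A usage example: `rank_alternatives({"pressure-fed": {"cost": 8, "reliability": 9}, "pump-fed": {"cost": 3, "reliability": 4}}, {"cost": 1.0, "reliability": 2.0})` ranks the first option ahead of the second.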
Han, Xue; Jiang, Hong; Han, Li; Xiong, Xi; He, Yanan; Fu, Chaomei; Xu, Runchun; Zhang, Dingkun; Lin, Junzhi; Yang, Ming
2018-03-01
Traditional Chinese herbs (TCH) are currently gaining attention in disease prevention and health care plans. However, their generally bitter taste hinders their use. Despite the development of a variety of taste evaluation methods, establishing a quantitative detection technique that is objective, authentic and sensitive remains a major challenge. Based on the two-bottle preference test (TBP), we propose a novel quantitative strategy using a standardized animal test and a unified quantitative benchmark. To reduce variability in the results, the TBP methodology was optimized. The relationship between the concentration of quinine and the animal preference index (PI) was obtained. The PI of each TCH was then measured through TBP, and bitterness results were converted into a unified numerical system using the concentration-PI relationship. To verify the authenticity and sensitivity of the quantified results, human sensory testing and electronic tongue testing were applied. The quantified results showed good discrimination ability. For example, the bitterness of Coptidis Rhizoma was equal to 0.0579 mg/mL quinine, and Nelumbinis Folium was equal to 0.0001 mg/mL. The validation results proved that the new assessment method for TCH was objective and reliable. In conclusion, this study provides an option for the quantification of bitterness and the evaluation of taste masking effects.
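The conversion step, from a measured preference index back to a quinine-equivalent concentration through the calibration relationship, might look like this in outline. The linear log-concentration/PI model and all numeric values are assumptions for illustration, not data from the paper:

```python
import math

# Illustrative sketch: calibrate preference index (PI) against log quinine
# concentration with ordinary least squares, then invert the fitted line to
# express a herb extract's measured PI as a quinine-equivalent (mg/mL).

def fit_line(xs, ys):
    """Least-squares slope m and intercept b for y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

def quinine_equivalent(pi, calib):
    """calib: list of (quinine mg/mL, observed PI) pairs. Returns the
    quinine concentration whose fitted PI matches the measured PI."""
    m, b = fit_line([math.log10(c) for c, _ in calib],
                    [p for _, p in calib])
    return 10 ** ((pi - b) / m)
```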
Forest management and economics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buongiorno, J.; Gilless, J.K.
1987-01-01
This volume provides a survey of quantitative methods, guiding the reader through formulation and analysis of models that address forest management problems. The authors use simple mathematics, graphics, and short computer programs to explain each method. Emphasizing applications, they discuss linear, integer, dynamic, and goal programming; simulation; network modeling; and econometrics, as these relate to problems of determining economic harvest schedules in even-aged and uneven-aged forests, the evaluation of forest policies, multiple-objective decision making, and more.
Method for beam hardening correction in quantitative computed X-ray tomography
NASA Technical Reports Server (NTRS)
Yan, Chye Hwang (Inventor); Whalen, Robert T. (Inventor); Napel, Sandy (Inventor)
2001-01-01
Each voxel is assumed to contain exactly two distinct materials, with the volume fraction of each material calculated iteratively. The method requires that the spectrum of the X-ray beam be known and that the attenuation spectra of the materials in the object be known and monotonically decreasing with increasing X-ray photon energy. A volume fraction is then estimated for the voxel, and the spectrum is iteratively calculated.
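Under the stated assumptions (known beam spectrum, known and monotonically decreasing attenuation spectra), the per-voxel estimation can be sketched as a one-dimensional root search. The spectra below are invented and the bisection loop is a simplification, not the patented algorithm itself:

```python
import math

# Two-material voxel sketch: for a mix of materials A and B with volume
# fraction f of A, model the polychromatic attenuation and search for the f
# whose modeled value matches the measured one. Monotone in f when mu_a > mu_b.

def poly_attenuation(f, path_len, spectrum, mu_a, mu_b):
    """spectrum: {energy: relative weight}; mu_a, mu_b: {energy: 1/cm}."""
    total = sum(spectrum.values())
    transmitted = sum(w * math.exp(-(f * mu_a[E] + (1 - f) * mu_b[E]) * path_len)
                      for E, w in spectrum.items())
    return -math.log(transmitted / total)

def solve_fraction(measured, path_len, spectrum, mu_a, mu_b, iters=60):
    """Bisection over f in [0, 1], assuming attenuation increases with f."""
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if poly_attenuation(mid, path_len, spectrum, mu_a, mu_b) < measured:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```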
Network of TAMCNS: Identifying Influence Regions Within the GCSS-MC Database
2017-06-01
relationships between objects and provides tools to quantitatively determine objects whose influence impacts other objects or the system as a whole. This methodology identifies the most important TAMCN and provides a list of TAMCNs in order of importance. We also analyze the community and core structure of...
Development of a Thiolysis HPLC Method for the Analysis of Procyanidins in Cranberry Products.
Gao, Chi; Cunningham, David G; Liu, Haiyan; Khoo, Christina; Gu, Liwei
2018-03-07
The objective of this study was to develop a thiolysis HPLC method to quantify total procyanidins, the ratio of A-type linkages, and A-type procyanidin equivalents in cranberry products. Cysteamine was utilized as a low-odor substitute for toluene-α-thiol in the thiolysis depolymerization. A reaction temperature of 70 °C and a reaction time of 20 min, in 0.3 M HCl, were determined to be the optimum depolymerization conditions. Thiolytic products of cranberry procyanidins were separated by RP-HPLC and identified using high-resolution mass spectrometry. Standard curves with good linearity were obtained for thiolyzed procyanidin dimer A2 and B2 external standards. The detection and quantification limits, recovery, and precision of this method were validated. The new method was applied to quantify total procyanidins, average degree of polymerization, ratio of A-type linkages, and A-type procyanidin equivalents in cranberry products. Results showed that the method was suitable for quantitative and qualitative analysis of procyanidins in cranberry products.
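The quantitation behind such a thiolysis assay reduces, conceptually, to molar ratios: after depolymerization, terminal units elute as free flavan-3-ols while extension units elute as thioether adducts. A minimal sketch of the arithmetic (function names are illustrative, not from the paper):

```python
# Conventional thiolysis arithmetic: the average degree of polymerization
# (aDP) is the molar ratio of all units to chain-terminating units, since
# each procyanidin chain contributes exactly one terminal unit.

def average_dp(terminal_mol, extension_mol):
    """aDP = (terminal + extension units) / terminal units (moles)."""
    return (terminal_mol + extension_mol) / terminal_mol

def a_type_ratio(a_type_mol, total_mol):
    """Fraction of quantified units carrying an A-type linkage."""
    return a_type_mol / total_mol
```

For example, 1 μmol of terminal units alongside 3 μmol of extension adducts implies an average chain length of 4.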
Qualitative research and the profound grasp of the obvious.
Hurley, R E
1999-01-01
OBJECTIVE: To discuss the value of promoting coexistent and complementary relationships between qualitative and quantitative research methods as illustrated by presentations made by four respected health services researchers who described their experiences in multi-method projects. DATA SOURCES: Presentations and publications related to the four research projects, which described key substantive and methodological areas that had been addressed with qualitative techniques. PRINCIPAL FINDINGS: Sponsor interest in timely, insightful, and reality-anchored evidence has provided a strong base of support for the incorporation of qualitative methods into major contemporary policy research studies. In addition, many issues may be suitable for study only with qualitative methods because of their complexity, their emergent nature, or because of the need to revisit and reexamine previously untested assumptions. CONCLUSION: Experiences from the four projects, as well as from other recent health services studies with major qualitative components, support the assertion that the interests of sponsors in the policy realm and pressure from them suppress some of the traditional tensions and antagonisms between qualitative and quantitative methods. PMID:10591276
NASA Astrophysics Data System (ADS)
Apperl, B.; Pulido-Velazquez, M.; Andreu, J.; Llopis-Albert, C.
2012-04-01
The implementation of the EU Water Framework Directive, with its consideration of environmental, economic and social objectives, calls for participatory water resource management methods. To deal with conflicting objectives it is necessary to apply a method for clarifying stakeholders' positions (identifying stakeholders' values and opinions, and quantifying their valuations), improving transparency with respect to the outcomes of alternatives, and moving the discussion from alternatives towards fundamental objectives (value-thinking approach) and valuing trade-offs, facilitating negotiation. The method allows the incorporation of stakeholders into the planning process, which should guarantee higher acceptance of the policies to be implemented. This research was conducted in the Mancha Oriental groundwater system (Spain), subject to intensive use of groundwater for irrigation. The main goals according to the WFD are: a good qualitative and quantitative status of the aquifer and a good quantitative and ecological status of the related surface water resources (mainly the Jucar river and dependent ecosystems). The aim is to analyze the contribution of multi-attribute value theory (MAVT) to conflict resolution and sustainable groundwater management, involving the stakeholders in the valuation process. A complex set of objectives and attributes has been defined. The alternatives have been evaluated according to their compliance with ecological, economic and social interests. Results show that the acceptance of alternatives depends strongly on the combination of measures and the implementation status. A high conflict potential is expected from alternatives consisting of a single measure. Uncertainties in the results are notable but do not heavily influence the alternative ranking. Different future scenarios also influence the preference among alternatives. For instance, an expected reduction of future groundwater resources due to climate change increases the conflict potential, with two observed reactions: acceptance of more rigorous measures on one hand, and a tendency towards soft measures with the same cost, as a reaction to the decreased effectiveness of the alternatives, on the other. The application of the method to a very complex case study, with many conflicting objectives and alternatives and uncertain outcomes, including future scenarios (climate change), illustrates the potential of the method for supporting management decisions.
NASA Astrophysics Data System (ADS)
Min, Junwei; Yao, Baoli; Ketelhut, Steffi; Kemper, Björn
2017-02-01
The modular combination of optical microscopes with digital holographic microscopy (DHM) has been proven to be a powerful tool for quantitative live cell imaging. The introduction of a condenser and different microscope objectives (MOs) simplifies the usage of the technique and makes it easier to measure different kinds of specimens at different magnifications. However, the high flexibility of illumination and imaging also causes variable phase aberrations that need to be eliminated for high resolution quantitative phase imaging. Existing phase aberration compensation methods either require additional elements in the reference arm or need specimen-free reference areas or separate reference holograms to build suitable digital phase masks. These requirements make them impractical for use with highly variable illumination and imaging systems and prevent on-line monitoring of living cells. In this paper, we present a simple numerical method for phase aberration compensation based on the analysis of holograms in the spatial frequency domain, with capabilities for on-line quantitative phase imaging. From a single-shot off-axis hologram, the whole phase aberration can be eliminated automatically without numerical fitting or pre-knowledge of the setup. The capabilities and robustness for quantitative phase imaging of living cancer cells are demonstrated.
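One common ingredient of such spatial-frequency-domain processing, locating the off-axis carrier peak and shifting it to the spectrum's center to remove the tilt component of the aberration, can be sketched as below. This is a toy fragment on a precomputed magnitude array; the paper's full method is more elaborate:

```python
# Toy sketch (details assumed): in an off-axis hologram's spectrum, the
# +1-order carrier peak sits away from the DC (zero-order) term. Finding it
# and circularly shifting it to the center demodulates the carrier, which
# removes the linear (tilt) part of the phase aberration.

def find_carrier_peak(mag, dc_radius=1):
    """mag: 2D list of spectral magnitudes. Returns (row, col) of the
    strongest peak outside a small neighborhood of the DC term (center)."""
    rows, cols = len(mag), len(mag[0])
    cr, cc = rows // 2, cols // 2
    best, best_rc = -1.0, (cr, cc)
    for r in range(rows):
        for c in range(cols):
            if abs(r - cr) <= dc_radius and abs(c - cc) <= dc_radius:
                continue  # skip the zero-order neighborhood
            if mag[r][c] > best:
                best, best_rc = mag[r][c], (r, c)
    return best_rc

def center_shift(peak, shape):
    """Circular shift (dr, dc) that moves the carrier peak to the center."""
    (r, c), (rows, cols) = peak, shape
    return ((rows // 2 - r) % rows, (cols // 2 - c) % cols)
```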
Morphometric Analysis of Chemoreception Organ in Male and Female Ticks (Acari: Ixodidae).
Josek, Tanya; Allan, Brian F; Alleyne, Marianne
2018-05-04
The Haller's organ plays a crucial role in a tick's ability to detect hosts. Even though this sensory organ is vital to tick survival, the morphology of this organ is not well understood. The objective of this study was to characterize variation in the morphological components of the Haller's organ of three medically important tick species using quantitative methods. The Haller's organs of Ixodes scapularis Say (Ixodida: Ixodidae) (black-legged tick), Amblyomma americanum (L.) (Ixodida: Ixodidae) (lone star tick), and Dermacentor variabilis (Say) (Ixodida: Ixodidae) (American dog tick) were morphologically analyzed using environmental scanning electron microscopy and geometric morphometrics, and the results were statistically interpreted using canonical variate analysis. Our data reveal significant, quantitative differences in the morphology of the Haller's organ among all three tick species and that in D. variabilis the sensory structure is sexually dimorphic. Studies like this can serve as a quantitative basis for further studies on sensor physiology, behavior, and tick species life history, potentially leading to novel methods for the prevention of tick-borne disease.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Odom, R.W.
1991-06-04
The objective of the research was to develop quantitative microanalysis methods for dielectric thin films using the laser ionization mass spectrometry (LIMS) technique. The research involved preparation of thin (5,000 Å) films of SiO2, Al2O3, MgF2, TiO2, Cr2O3, Ta2O5, Si3N4, and ZrO2, and doping these films with ion implant impurities of 11B, 40Ca, 56Fe, 68Zn, 81Br, and 121Sb. Laser ionization mass spectrometry (LIMS), secondary ion mass spectrometry (SIMS) and Rutherford backscattering spectrometry (RBS) were performed on these films. The research demonstrated quantitative LIMS analysis down to detection levels of 10-100 ppm, and led to the development of (1) a compound thin film standards product line for the performing organization, (2) routine LIMS analytical methods, and (3) the manufacture of high speed preamplifiers for time-of-flight mass spectrometry (TOF-MS) techniques.
Lifestyle Factors and Visible Skin Aging in a Population of Japanese Elders
Asakura, Keiko; Nishiwaki, Yuji; Milojevic, Ai; Michikawa, Takehiro; Kikuchi, Yuriko; Nakano, Makiko; Iwasawa, Satoko; Hillebrand, Greg; Miyamoto, Kukizo; Ono, Masaji; Kinjo, Yoshihide; Akiba, Suminori; Takebayashi, Toru
2009-01-01
Background The number of studies that use objective and quantitative methods to evaluate facial skin aging in elderly people is extremely limited, especially in Japan. Therefore, in this cross-sectional study we attempted to characterize the condition of facial skin (hyperpigmentation, pores, texture, and wrinkling) in Japanese adults aged 65 years or older by using objective and quantitative imaging methods. In addition, we aimed to identify lifestyle factors significantly associated with these visible signs of aging. Methods The study subjects were 802 community-dwelling Japanese men and women aged at least 65 years and living in the town of Kurabuchi (Takasaki City, Gunma Prefecture, Japan), a mountain community with a population of approximately 4800. The facial skin condition of subjects was assessed quantitatively using a standardized facial imaging system and subsequent computer image analysis. Lifestyle information was collected using a structured questionnaire. The association between skin condition and lifestyle factors was examined using multivariable regression analysis. Results Among women, the mean values for facial texture, hyperpigmentation, and pores were generally lower than those among age-matched men. There was no significant difference between sexes in the severity of facial wrinkling. Older age was associated with worse skin condition among women only. After adjusting for age, smoking status and topical sun protection were significantly associated with skin condition among both men and women. Conclusions Our study revealed significant differences between sexes in the severity of hyperpigmentation, texture, and pores, but not wrinkling. Smoking status and topical sun protection were significantly associated with signs of visible skin aging in this study population. PMID:19700917
Visual Aggregate Analysis of Eligibility Features of Clinical Trials
He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua
2015-01-01
Objective To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition using one eligibility feature each time. Methods Using a previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query builder, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. Results We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving quantitative features BMI and HbA1c for conditions “hypertension” and “Type 2 diabetes”, respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. Conclusions We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. PMID:25615940
Retinal status analysis method based on feature extraction and quantitative grading in OCT images.
Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri
2016-07-22
Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic OCT image analysis method for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. This study analyzed 300 OCT images acquired by an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). Firstly, a normal retinal reference model based on retinal boundaries was constructed. Subsequently, two kinds of quantitative methods, based on geometric features and morphological features, were proposed. The paper puts forward a retinal abnormality grading decision-making method, applied to the analysis and evaluation of multiple OCT images, and shows the detailed analysis process for four retinal OCT images with different degrees of abnormality. The final grading results verified that the analysis method can distinguish abnormal severity and lesion regions. In a simulation on 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status: it obtains parameters and features associated with retinal morphology, and quantitative analysis and evaluation of these features, combined with the reference model, enables abnormality judgment of the target image and provides a reference for disease diagnosis.
Scale-based fuzzy connectivity: a novel image segmentation methodology and its validation
NASA Astrophysics Data System (ADS)
Saha, Punam K.; Udupa, Jayaram K.
1999-05-01
This paper extends a previously reported theory and algorithms for fuzzy connected object definition. It introduces `object scale' for determining the neighborhood size for defining affinity, the degree of local hanging-togetherness between image elements. Object scale allows the use of a varying neighborhood size in different parts of the image. This paper argues that scale-based fuzzy connectivity is natural in object definition and demonstrates that it leads to more effective object segmentation than fuzzy connectedness without scale. Affinity is described as consisting of a homogeneity-based and an object-feature-based component. Families of non-scale-based and scale-based affinity relations are constructed. An effective method for giving a rough estimate of scale at different locations in the image is presented. The original theoretical and algorithmic framework remains more-or-less the same, but considerably improved segmentations result. A quantitative statistical comparison between the non-scale-based and scale-based methods was made on phantom images generated from patient MR brain studies by first segmenting the objects and then adding noise, blurring, and a background component. Both the statistical and the subjective tests clearly indicate the superiority of the scale-based method in capturing details and in robustness to noise.
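The underlying fuzzy connectedness computation, in which a path's strength is its weakest affinity link and a pixel's connectivity to the seed is the strongest path, can be sketched on a 1-D image as below. The affinity form is an assumption; the scale-based variant in the paper would additionally derive each local affinity from a neighborhood sized by the estimated object scale:

```python
import heapq
import math

def affinity(a, b, sigma=30.0):
    """Homogeneity-based affinity: near 1 when intensities are similar."""
    return math.exp(-((a - b) ** 2) / (2 * sigma ** 2))

def fuzzy_connectivity(image, seed):
    """Dijkstra-like propagation of max-min path strength from the seed."""
    n = len(image)
    conn = [0.0] * n
    conn[seed] = 1.0
    heap = [(-1.0, seed)]
    while heap:
        strength, i = heapq.heappop(heap)
        strength = -strength
        if strength < conn[i]:
            continue  # stale entry
        for j in (i - 1, i + 1):
            if 0 <= j < n:
                # Path strength is the weakest link along the path.
                s = min(strength, affinity(image[i], image[j]))
                if s > conn[j]:
                    conn[j] = s
                    heapq.heappush(heap, (-s, j))
    return conn
```

On `[100, 100, 100, 10, 10]` with the seed at index 0, connectivity stays high across the homogeneous left region and drops sharply past the intensity edge.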
Kaur, Jaspreet; Srinivasan, K. K.; Joseph, Alex; Gupta, Abhishek; Singh, Yogendra; Srinivas, Kona S.; Jain, Garima
2010-01-01
Objective: Venlafaxine hydrochloride is a structurally novel phenethyl bicyclic antidepressant, usually categorized as a serotonin–norepinephrine reuptake inhibitor (SNRI), although it has also been referred to as a serotonin–norepinephrine–dopamine reuptake inhibitor because it inhibits the reuptake of dopamine. Venlafaxine HCl is widely prescribed in sustained release formulations. In the current article we report the development and validation of a fast, simple, stability-indicating, isocratic high performance liquid chromatographic (HPLC) method for the determination of venlafaxine hydrochloride in sustained release formulations. Materials and Methods: The quantitative determination of venlafaxine hydrochloride was performed on a Kromasil C18 analytical column (250 × 4.6 mm i.d., 5 μm particle size) with 0.01 M phosphate buffer (pH 4.5): methanol (40:60) as the mobile phase, at a flow rate of 1.0 ml/min. UV detection was performed at 225 nm. Results: During method validation, parameters such as precision, linearity, accuracy, stability, limits of quantification and detection, and specificity were evaluated and remained within acceptable limits. Conclusions: The method has been successfully applied for the quantification and dissolution profiling of venlafaxine HCl in a sustained release formulation, and presents a simple and reliable solution for routine quantitative analysis. PMID:21814426
Histopathological image analysis of chemical-induced hepatocellular hypertrophy in mice.
Asaoka, Yoshiji; Togashi, Yuko; Mutsuga, Mayu; Imura, Naoko; Miyoshi, Tomoya; Miyamoto, Yohei
2016-04-01
Chemical-induced hepatocellular hypertrophy is frequently observed in rodents, and is mostly caused by the induction of phase I and phase II drug metabolic enzymes and peroxisomal lipid metabolic enzymes. Liver weight is a sensitive and commonly used marker for detecting hepatocellular hypertrophy, but is also increased by a number of other factors. Histopathological observations subjectively detect changes such as hepatocellular hypertrophy based on the size of a hepatocyte. Therefore, quantitative microscopic observations are required to evaluate histopathological alterations objectively. In the present study, we developed a novel quantitative method for an image analysis of hepatocellular hypertrophy using liver sections stained with hematoxylin and eosin, and demonstrated its usefulness for evaluating hepatocellular hypertrophy induced by phenobarbital (a phase I and phase II enzyme inducer) and clofibrate (a peroxisomal enzyme inducer) in mice. The algorithm of this imaging analysis was designed to recognize an individual hepatocyte through a combination of pixel-based and object-based analyses. Hepatocellular nuclei and the surrounding non-hepatocellular cells were recognized by the pixel-based analysis, while the areas of the recognized hepatocellular nuclei were then expanded until they ran against their expanding neighboring hepatocytes and surrounding non-hepatocellular cells by the object-based analysis. The expanded area of each hepatocellular nucleus was regarded as the size of an individual hepatocyte. The results of this imaging analysis showed that changes in the sizes of hepatocytes corresponded with histopathological observations in phenobarbital and clofibrate-treated mice, and revealed a correlation between hepatocyte size and liver weight. In conclusion, our novel image analysis method is very useful for quantitative evaluations of chemical-induced hepatocellular hypertrophy. Copyright © 2015 Elsevier GmbH. All rights reserved.
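The object-based expansion step is, in outline, a multi-seed breadth-first growth in which each nucleus claims surrounding pixels until it meets a neighboring expanding nucleus or a blocked (non-hepatocyte) region. The grid, labels, and stopping rule below are simplified assumptions, not the authors' algorithm:

```python
from collections import deque

# Toy sketch: each detected hepatocellular nucleus is a seed; all seeds grow
# breadth-first in lockstep, so a pixel is claimed by the seed that reaches it
# first. The pixel count claimed by each seed approximates that hepatocyte's
# cross-sectional area.

def expand_nuclei(shape, seeds, blocked):
    """shape: (rows, cols); seeds: {label: (r, c)}; blocked: set of (r, c)
    pixels (non-hepatocellular cells). Returns {label: claimed pixel count}."""
    rows, cols = shape
    owner = {}
    queue = deque()
    for label, rc in seeds.items():
        owner[rc] = label
        queue.append(rc)
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in owner and (nr, nc) not in blocked):
                owner[(nr, nc)] = owner[(r, c)]  # claimed by the nearest seed
                queue.append((nr, nc))
    sizes = {}
    for label in owner.values():
        sizes[label] = sizes.get(label, 0) + 1
    return sizes
```

On a 1×4 strip with seeds at the two ends, each nucleus claims two pixels; a blocked pixel between them would cap the growth earlier.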
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pavlickova, Katarina; Vyskupova, Monika, E-mail: vyskupova@fns.uniba.sk
Cumulative environmental impact assessment is only occasionally used in practical environmental impact assessment (EIA) processes. The main reasons are the difficulty of identifying cumulative impacts caused by lack of data, the inability to measure the intensity and spatial effect of all types of impacts, and the uncertainty of their future evolution. This work presents a proposed method to predict cumulative impacts on the basis of landscape vulnerability evaluation. For this purpose, a qualitative assessment of landscape ecological stability is conducted and major vulnerability indicators of environmental and socio-economic receptors are specified and valuated. Potential cumulative impacts and the overall impact significance are predicted quantitatively in modified Argonne multiple matrixes, considering the vulnerability of affected landscape receptors and the significance of impacts identified individually. The method was employed in a concrete environmental impact assessment process conducted in Slovakia. The results obtained in this case study reflect that the methodology is simple to apply, valid for all types of impacts and projects, inexpensive and not time-consuming. The objectivity of the partial methods used in this procedure is improved by quantitative landscape ecological stability evaluation, assignment of weights to vulnerability indicators based on the detailed characteristics of affected factors, and grading of impact significance. - Highlights: • This paper suggests a method proposal for cumulative impact prediction. • The method includes landscape vulnerability evaluation. • The vulnerability of affected receptors is determined by their sensitivity. • This method can increase the objectivity of impact prediction in the EIA process.
Digital holographic microscopy combined with optical tweezers
NASA Astrophysics Data System (ADS)
Cardenas, Nelson; Yu, Lingfeng; Mohanty, Samarendra K.
2011-02-01
While optical tweezers have been widely used for the manipulation and organization of microscopic objects in three dimensions, observing the manipulated objects along the axial direction has been quite challenging. In order to visualize the organization and orientation of objects along the axial direction, we report the development of digital holographic optical tweezers (DHOT), combining digital holographic microscopy with optical tweezers. Digital holography is achieved by use of a modified Mach-Zehnder interferometer with digital recording of the interference pattern of the reference and sample laser beams by a single CCD camera. In this method, quantitative phase information is retrieved dynamically with high temporal resolution, limited only by the frame rate of the CCD. Digital focusing, phase unwrapping, and online analysis and display of the quantitative phase images were performed with software developed on the LabVIEW platform. Since the phase changes observed in DHOT are very sensitive to the optical thickness of the trapped volume, the number of particles trapped along the axial direction as well as the orientation of non-spherical objects could be estimated with high precision. Since diseases such as malaria and diabetes change the refractive index of red blood cells, this system can be employed to map such disease-specific changes in biological samples upon immobilization with optical tweezers.
Rosen, Robert; Marmur, Ellen; Anderson, Lawrence; Welburn, Peter; Katsamas, Janelle
2014-12-01
Local skin responses (LSRs) are the most common adverse effects of topical actinic keratosis (AK) therapy. There is currently no method available that allows objective characterization of LSRs. Here, the authors describe a new scale developed to quantitatively and objectively assess the six most common LSRs resulting from topical AK therapy with ingenol mebutate. The LSR grading scale was developed using a 0-4 numerical rating, with clinical descriptors and representative photographic images for each rating. Good inter-observer grading concordance was demonstrated in peer review during development of the tool. Data on the use of the scale are described from four phase III double-blind studies of ingenol mebutate (n = 1,005). LSRs peaked on day 4 (face/scalp) or day 8 (trunk/extremities), with mean maximum composite LSR scores of 9.1 and 6.8, respectively, and a rapid return toward baseline by day 15 in most cases. The mean composite LSR score at day 57 was generally lower than at baseline. The LSR grading scale is an objective tool that allows practicing dermatologists to characterize and compare the LSRs of existing and, potentially, future AK therapies.
Visual conspicuity: a new simple standard, its reliability, validity and applicability.
Wertheim, A H
2010-03-01
A general standard for quantifying conspicuity is described. It derives from a simple and easy method to quantitatively measure the visual conspicuity of an object. The method stems from the theoretical view that the conspicuity of an object is not a property of that object, but describes the degree to which the object is perceptually embedded in, i.e. laterally masked by, its visual environment. First, three variations of a simple method to measure the strength of such lateral masking are described, and empirical evidence for its reliability and its validity is presented, as are several tests of predictions concerning the effects of viewing distance and ambient light. It is then shown how this method yields a conspicuity standard, expressed as a number, which can be made part of a rule of law, and which can be used to test whether or not, and to what extent, the conspicuity of a particular object, e.g. a traffic sign, meets a predetermined criterion. An additional feature is that, when used under different ambient light conditions, the method may also yield an index of the amount of visual clutter in the environment. Taken together, the evidence illustrates the method's applicability both in the laboratory and in real-life situations. STATEMENT OF RELEVANCE: This paper concerns a proposal for a new method to measure visual conspicuity, yielding a numerical index that can be used in a rule of law. It is of importance to ergonomists and human factors specialists who are asked to measure the conspicuity of an object, such as a traffic or railroad sign, or any other object. The new method is simple and circumvents the need to perform elaborate (search) experiments and thus has great relevance as a simple tool for applied research.
A method to characterize the roughness of 2-D line features: recrystallization boundaries.
Sun, J; Zhang, Y B; Dahl, A B; Conradsen, K; Juul Jensen, D
2017-03-01
A method is presented, which allows quantification of the roughness of nonplanar boundaries of objects for which the neutral plane is not known. The method provides quantitative descriptions of both the local and global characteristics. How the method can be used to estimate the sizes of rough features and local curvatures is also presented. The potential of the method is illustrated by quantification of the roughness of two recrystallization boundaries in a pure Al specimen characterized by scanning electron microscopy. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
Image enhancement using MCNP5 code and MATLAB in neutron radiography.
Tharwat, Montaser; Mohamed, Nader; Mongy, T
2014-07-01
This work presents a method that can be used to enhance neutron radiography (NR) images for objects containing strongly scattering materials such as hydrogen, carbon, and other light materials. The method uses the Monte Carlo code MCNP5 to simulate the NR process, obtain the flux distribution for each pixel of the image, and determine the scattered-neutron distribution that causes image blur; MATLAB is then used to subtract this scattered-neutron distribution from the initial image to improve its quality. This work was performed before the commissioning of the digital NR system in January 2013. The MATLAB enhancement method is well suited to static film-based neutron radiography, while in the neutron imaging (NI) technique, image enhancement and quantitative measurement were performed efficiently using ImageJ software. The enhanced image quality and quantitative measurements are presented in this work. Copyright © 2014 Elsevier Ltd. All rights reserved.
Chaotic dynamics of controlled electric power systems
NASA Astrophysics Data System (ADS)
Kozlov, V. N.; Trosko, I. U.
2016-12-01
The conditions for the appearance of chaotic dynamics of electromagnetic and electromechanical processes in power systems described by the Park-Gorev bilinear differential equations, with account taken of coordinate lags and control constraints, have been formulated. On the basis of the classical equations, the parameters of synchronous generators and power lines at which the chaotic dynamics of power systems appears have been found. The qualitative and quantitative characteristics of chaotic processes in power associations of two types, based on the Hopf theorem and on methods of nonstationary linearization and decomposition, are given. The properties of the spectral characteristics of chaotic processes have been investigated, and a qualitative similarity between the bilinear equations of power systems and the Lorenz equations has been found. These results can be used for the modernization of control systems for power facilities. The qualitative and quantitative characteristics of power systems as objects of control, and of some feedback control laws, have been established.
Kušnierová, Pavlína; Švagera, Zdeněk; Všianský, František; Byrtusová, Monika; Hradílek, Pavel; Kurková, Barbora; Zapletalová, Olga; Bartoš, Vladimír
2016-01-01
Objectives We aimed to compare various methods for free light chain (fLC) quantitation in cerebrospinal fluid (CSF) and serum and to determine whether quantitative CSF measurements could reliably predict intrathecal fLC synthesis. In addition, we wished to determine the relationship between free kappa and free lambda light chain concentrations in CSF and serum in various disease groups. Methods We analysed 166 paired CSF and serum samples by at least one of the following methods: turbidimetry (Freelite™, SPAPLUS), nephelometry (N Latex FLC™, BN ProSpec), and two different (commercially available and in-house developed) sandwich ELISAs. The results were compared with oligoclonal fLC detected by affinity-mediated immunoblotting after isoelectric focusing. Results Although the correlations between quantitative methods were good, both proportional and systematic differences were discerned. However, no major differences were observed in the prediction of positive oligoclonal fLC test. Surprisingly, CSF free kappa/free lambda light chain ratios were lower than those in serum in about 75% of samples with negative oligoclonal fLC test. In about a half of patients with multiple sclerosis and clinically isolated syndrome, profoundly increased free kappa/free lambda light chain ratios were found in the CSF. Conclusions Our results show that using appropriate method-specific cut-offs, different methods of CSF fLC quantitation can be used for the prediction of intrathecal fLC synthesis. The reason for unexpectedly low free kappa/free lambda light chain ratios in normal CSFs remains to be elucidated. Whereas CSF free kappa light chain concentration is increased in most patients with multiple sclerosis and clinically isolated syndrome, CSF free lambda light chain values show large interindividual variability in these patients and should be investigated further for possible immunopathological and prognostic significance. PMID:27846293
Hansen, Matthew; O’Brien, Kerth; Meckler, Garth; Chang, Anna Marie; Guise, Jeanne-Marie
2016-01-01
Mixed methods research has significant potential to broaden the scope of emergency care and specifically emergency medical services investigation. Mixed methods studies involve the coordinated use of qualitative and quantitative research approaches to gain a fuller understanding of practice. By combining what is learnt from multiple methods, these approaches can help to characterise complex healthcare systems, identify the mechanisms of complex problems such as medical errors and understand aspects of human interaction such as communication, behaviour and team performance. Mixed methods approaches may be particularly useful for out-of-hospital care researchers because care is provided in complex systems where equipment, interpersonal interactions, societal norms, environment and other factors influence patient outcomes. The overall objectives of this paper are to (1) introduce the fundamental concepts and approaches of mixed methods research and (2) describe the interrelation and complementary features of the quantitative and qualitative components of mixed methods studies using specific examples from the Children’s Safety Initiative-Emergency Medical Services (CSI-EMS), a large National Institutes of Health-funded research project conducted in the USA. PMID:26949970
Link-Based Similarity Measures Using Reachability Vectors
Yoon, Seok-Ho; Kim, Ji-Soo; Ryu, Minsoo; Choi, Ho-Jin
2014-01-01
We present a novel approach for accurately computing link-based similarities among objects by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach for computing link-based similarities that does not suffer from these problems. In the proposed approach each target object is represented by a vector. Each element of the vector corresponds to one of the objects in the given data, and the value of each element denotes the weight for the corresponding object. For this weight value, we propose to utilize the probability of reaching the corresponding object from the target object, computed using the “Random Walk with Restart” strategy. We then define the similarity between two objects as the cosine similarity of their two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance of the proposed methods in comparison with existing link-based measures, qualitatively and quantitatively, with respect to two kinds of data sets, scientific papers and Web documents. Our experimental results indicate that the proposed methods significantly outperform the existing measures. PMID:24701188
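The reachability-vector idea described in this abstract can be illustrated with a minimal sketch: a Random Walk with Restart (RWR) vector is computed for each target node by power iteration, and similarity is the cosine of two such vectors. The toy graph, restart probability, and iteration count below are illustrative assumptions, not values from the paper.

```python
import math

def rwr_vector(adj, start, restart=0.15, iters=100):
    """Reachability vector from `start`: Random Walk with Restart by power iteration."""
    n = len(adj)
    out = [sum(row) for row in adj]  # out-degree of each node
    p = [0.0] * n
    p[start] = 1.0
    for _ in range(iters):
        q = [0.0] * n
        for i in range(n):
            if out[i] == 0:
                continue  # dangling nodes contribute nothing (none in this example)
            share = (1.0 - restart) * p[i] / out[i]
            for j in range(n):
                if adj[i][j]:
                    q[j] += share
        q[start] += restart  # restart mass always returns to the target node
        p = q
    return p

def cosine(u, v):
    """Cosine similarity between two reachability vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

# Toy 4-node undirected graph (adjacency matrix); nodes 0 and 3 share neighbors.
adj = [[0, 1, 1, 0],
       [1, 0, 0, 1],
       [1, 0, 0, 1],
       [0, 1, 1, 0]]
similarity = cosine(rwr_vector(adj, 0), rwr_vector(adj, 3))
```

Each reachability vector sums to one (a probability distribution over nodes), so for non-negative weights the cosine similarity lies in [0, 1].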
ERIC Educational Resources Information Center
Vellut, Natacha; Cook, Jon M.; Tursz, Anne
2012-01-01
Objectives: Using judicial files on neonaticides, (1) to examine the frequency of the association between neonaticide and denial of pregnancy; (2) to assess the accuracy of the concept of denial of pregnancy; (3) to examine its usefulness in programs to prevent neonaticides. Methods: Quantitative and qualitative analyses of data collected from…
A Comparison of Urban School- and Community-Based Dental Clinics
ERIC Educational Resources Information Center
Larsen, Charles D.; Larsen, Michael D.; Handwerker, Lisa B.; Kim, Maile S.; Rosenthal, Murray
2009-01-01
Background: The objective of the study was to quantitatively compare school- and community-based dental clinics in New York City that provide dental services to children in need. It was hypothesized that the school-based clinics would perform better in terms of several measures. Methods: We reviewed billing and visit data derived from encounter…
A Guide to Health Manpower Resources--1970.
ERIC Educational Resources Information Center
Washington State Dept. of Social and Health Services, Olympia.
The stated objective of this guide is to provide a quantitative description of the current supply of health manpower in the State of Washington. To do so, two methods of data collecting are used, with explanations for each. Precautions for interpreting their data are noted. The major portion (152 of 180 pages) of the guide lists information on…
ERIC Educational Resources Information Center
Muller, Veronica; Brooks, Jessica; Tu, Wei-Mo; Moser, Erin; Lo, Chu-Ling; Chan, Fong
2015-01-01
Purpose: The main objective of this study was to determine the extent to which physical and cognitive-affective factors are associated with fibromyalgia (FM) fatigue. Method: A quantitative descriptive design using correlation techniques and multiple regression analysis. The participants consisted of 302 members of the National Fibromyalgia &…
ICT Accessibility and Usability to Support Learning of Visually-Impaired Students in Tanzania
ERIC Educational Resources Information Center
Eligi, Innosencia; Mwantimwa, Kelefa
2017-01-01
The main objective of this study was to assess the accessibility and usability of Information and Communication Technology facilities to facilitate learning among visually-impaired students at the University of Dar es Salaam (UDSM). The study employed a mixed methods design in gathering, processing and analysing quantitative and qualitative data.…
Sudden Gains during Psychological Treatments of Anxiety and Depression: A Meta-Analysis
ERIC Educational Resources Information Center
Aderka, Idan M.; Nickerson, Angela; Boe, Hans Jakob; Hofmann, Stefan G.
2012-01-01
Objective: The present study quantitatively reviewed the literature on sudden gains in psychological treatments for anxiety and depression. The authors examined the short- and long-term effects of sudden gains on treatment outcome as well as moderators of these effects. Method: The authors conducted a literature search using PubMed, PsycINFO, the…
The Representation of Islam in the Hungarian Geography Textbooks
ERIC Educational Resources Information Center
Császár, Zsuzsu M.; Vati, Tamás
2012-01-01
This research has been seeking an answer to the question of what kind of image of Islam is conveyed to students by the most popular and most widely used textbooks. In the course of the analysis, primary and secondary school textbooks were examined via quantitative and qualitative methods. The objective demonstration of the research results aims to…
ERIC Educational Resources Information Center
Bramwell-Lalor, Sharon; Rainford, Marcia
2015-01-01
This paper reports on a Mixed Methods study involving an investigation into the attitudes of advanced level biology teachers towards assessment and describes the teachers' experiences while being engaged in Assessment for Learning (AfL) practices such as sharing of learning objectives and peer- and self-assessment. Quantitative data were collected…
ERIC Educational Resources Information Center
Armin, Julie; Torres, Cristina Huebner; Vivian, James; Vergara, Cunegundo; Shaw, Susan J.
2014-01-01
Objective: This study aimed to quantitatively and qualitatively examine breast cancer screening practices, including breast self-examination (BSE), and health literacy among patients with chronic disease. Design: A prospective, multi-method study conducted with a targeted purposive sample of 297 patients with diabetes and/or hypertension from four…
Cigarette Use in 6th Through 10th Grade: The Sarasota County Demonstration Project
ERIC Educational Resources Information Center
Zapata, Lauren B.; Forthofer, Melinda S.; Eaton, Danice K.; Brown, Kelli McCormack; Bryant, Carol A.; Reynolds, Sherri T.; McDermot, Robert J.
2004-01-01
Objectives: To identify factors associated with cigarette smoking in the 6th-grade through 10th-grade youth population of Sarasota County, Florida. Methods: A closed-ended, quantitative survey was completed by 2,004 youths and used to extract population-specific data on the correlates of cigarette use. Results: A range of factors influence…
Extracting 3D Parametric Curves from 2D Images of Helical Objects.
Willcocks, Chris G; Jackson, Philip T G; Nelson, Carl J; Obara, Boguslaw
2017-09-01
Helical objects occur in medicine, biology, cosmetics, nanotechnology, and engineering. Extracting a 3D parametric curve from a 2D image of a helical object has many practical applications, in particular being able to extract metrics such as tortuosity, frequency, and pitch. We present a method that is able to straighten the image object and derive a robust 3D helical curve from peaks in the object boundary. The algorithm has a small number of stable parameters that require little tuning, and the curve is validated against both synthetic and real-world data. The results show that the extracted 3D curve comes within close Hausdorff distance to the ground truth, and has near identical tortuosity for helical objects with a circular profile. Parameter insensitivity and robustness against high levels of image noise are demonstrated thoroughly and quantitatively.
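One of the metrics named in this abstract, tortuosity, is commonly defined as arc length divided by end-to-end chord length; the sketch below applies that common definition to a sampled 3D curve. The helix parameters are illustrative assumptions, not values from the paper.

```python
import math

def tortuosity(points):
    """Arc length of the polyline divided by its end-to-end chord length."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    arc = sum(dist(points[i], points[i + 1]) for i in range(len(points) - 1))
    return arc / dist(points[0], points[-1])

def sample_helix(n=200, turns=3, pitch=0.2):
    """Sample points on the helix (cos t, sin t, pitch * t), an illustrative curve."""
    pts = []
    for i in range(n + 1):
        t = 2.0 * math.pi * turns * i / n
        pts.append((math.cos(t), math.sin(t), pitch * t))
    return pts
```

A straight line has tortuosity exactly 1; any winding curve, such as the helix above, scores above 1.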
Quantitative methods in assessment of neurologic function.
Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J
1981-01-01
Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.
Yang, Qian; Lew, Hwee Yeong; Peh, Raymond Hock Huat; Metz, Michael Patrick; Loh, Tze Ping
2016-10-01
Reference intervals are the most commonly used decision support tool when interpreting quantitative laboratory results. They may require partitioning to better describe subpopulations that display significantly different reference values. Partitioning by age is particularly important for the paediatric population, since there are marked physiological changes associated with growth and maturation. However, most partitioning methods are either technically complex or require prior knowledge of the underlying physiology/biological variation of the population. There is growing interest in the use of continuous centile curves, which provide seamless laboratory reference values as a child grows, as an alternative to rigidly defined fixed reference intervals. However, the mathematical functions that describe these curves can be complex and may not be easily implemented in laboratory information systems. Hence, the use of fixed reference intervals is expected to continue for the foreseeable future. We developed a method that objectively proposes optimised age partitions and reference intervals for quantitative laboratory data (http://research.sph.nus.edu.sg/pp/ppResult.aspx), based on the sum of gradient that best describes the underlying distribution of the continuous centile curves. It is hoped that this method may improve the selection of age intervals for partitioning, which is receiving increasing attention in paediatric laboratory medicine. Copyright © 2016 Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.
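The idea of deriving age partitions from the gradient of centile curves can be sketched as follows: place a partition boundary where the centile curve changes fastest. This is an illustrative simplification, not the authors' published algorithm, and the analyte values below are invented.

```python
def steepest_boundary(ages, centile):
    """Propose one partition boundary at the age of steepest centile change."""
    grads = [abs((centile[i + 1] - centile[i]) / (ages[i + 1] - ages[i]))
             for i in range(len(ages) - 1)]
    k = max(range(len(grads)), key=grads.__getitem__)
    return 0.5 * (ages[k] + ages[k + 1])  # midpoint of the steepest segment

# Invented median centile values for a hypothetical analyte that falls sharply
# after age 12 (loosely reminiscent of alkaline phosphatase around puberty).
ages = list(range(1, 19))
centile = [300.0] * 12 + [150.0, 120.0, 100.0, 90.0, 85.0, 80.0]
boundary = steepest_boundary(ages, centile)
```

In a fuller treatment, boundaries would be proposed recursively until the gradient within each segment falls below a chosen criterion.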
NASA Astrophysics Data System (ADS)
Lorenzetti, G.; Foresta, A.; Palleschi, V.; Legnaioli, S.
2009-09-01
The recent development of mobile instrumentation specifically devoted to in situ analysis and study of museum objects allows the acquisition of many LIBS spectra in a very short time. However, such a large amount of data calls for new analytical approaches that would guarantee prompt analysis of the results obtained. In this communication, we present and discuss the advantages of statistical analytical methods, such as Partial Least Squares (PLS) multiple regression algorithms, over the classical calibration-curve approach. PLS algorithms allow the information on the composition of the objects under study to be obtained in real time; this feature of the method, compared to traditional off-line analysis of the data, is extremely useful for optimizing the measurement times and the number of points associated with the analysis. In fact, the real-time availability of compositional information makes it possible to concentrate attention on the most 'interesting' parts of the object, without over-sampling zones that would not provide useful information for scholars or conservators. Some examples of the applications of this method will be presented, including studies recently performed by the researchers of the Applied Laser Spectroscopy Laboratory on museum bronze objects.
Single-shot quantitative phase microscopy with color-multiplexed differential phase contrast (cDPC).
Phillips, Zachary F; Chen, Michael; Waller, Laura
2017-01-01
We present a new technique for quantitative phase and amplitude microscopy from a single color image with coded illumination. Our system consists of a commercial brightfield microscope with one hardware modification: an inexpensive 3D-printed condenser insert. The method, color-multiplexed Differential Phase Contrast (cDPC), is a single-shot variant of Differential Phase Contrast (DPC), which recovers the phase of a sample from images with asymmetric illumination. We employ partially coherent illumination to achieve resolution corresponding to 2× the objective NA. Quantitative phase can then be used to synthesize DIC and phase contrast images or to extract shape and density. We demonstrate amplitude and phase recovery at camera-limited frame rates (50 fps) for various in vitro cell samples and C. elegans in a micro-fluidic channel.
Computerized quantitative evaluation of mammographic accreditation phantom images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu
2010-12-15
Purpose: The objective was to develop and investigate an automated scoring scheme for American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of the region of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (each facility provided one image). A medical physicist and two radiologic technologists also scored the images. The human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fibers, masses, and specks were 90%, 80%, and 98%, respectively. Contingency table analysis revealed significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may achieve a stable assessment of the visibility of test objects in mammographic accreditation phantom images and of whether a phantom image meets the ACR's criteria, although there is room for improvement in the approach for fiber and mass objects.
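The fiber/mass classification step mentioned above relies on Mahalanobis distance to per-class statistics. A minimal sketch, assuming invented 2-D features (e.g. elongation and area) and invented class means/covariances rather than the paper's calibrated values:

```python
def mahalanobis2(x, mean, cov):
    """Mahalanobis distance for 2-D feature vectors; cov is a 2x2 covariance matrix."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv00, inv01 = d / det, -b / det   # explicit 2x2 inverse
    inv10, inv11 = -c / det, a / det
    dx0, dx1 = x[0] - mean[0], x[1] - mean[1]
    y0 = inv00 * dx0 + inv01 * dx1
    y1 = inv10 * dx0 + inv11 * dx1
    return (dx0 * y0 + dx1 * y1) ** 0.5

def classify(features, classes):
    """Assign features to the class whose distribution is nearest in Mahalanobis distance."""
    return min(classes, key=lambda name: mahalanobis2(features, *classes[name]))

# Invented class statistics: (mean, covariance) over hypothetical
# (elongation, area) features for fiber-like and mass-like objects.
classes = {
    "fiber": ((0.9, 10.0), ((0.01, 0.0), (0.0, 4.0))),
    "mass": ((0.2, 40.0), ((0.01, 0.0), (0.0, 25.0))),
}
```

Unlike plain Euclidean distance, this weights each feature by its within-class variance, so elongated fiber-like objects are not penalized for their naturally larger spread in area.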
A Quantitative and Qualitative Exploration of Photoaversion in Achromatopsia
Aboshiha, Jonathan; Kumaran, Neruban; Kalitzeos, Angelos; Hogg, Chris; Rubin, Gary; Michaelides, Michel
2017-01-01
Purpose Photoaversion (PA) is a disabling and ubiquitous feature of achromatopsia (ACHM). We aimed to help define the characteristics of this important symptom, and present the first published assessment of its impact on patients' lives, as well as quantitative and qualitative PA assessments. Methods Molecularly confirmed ACHM subjects were assessed for PA using four tasks: structured survey of patient experience, novel quantitative subjective measurement of PA, visual acuities in differing ambient lighting, and objective palpebral aperture-related PA testing. Results Photoaversion in ACHM was found to be the most significant symptom for a substantial proportion (38%) of patients. A novel subjective PA measurement technique was developed and demonstrated fidelity with more invasive paradigms without exposing often very photosensitive patients to brighter light intensities used elsewhere. An objective PA measurement was also refined for use in trials, indicating that higher light intensities than previously published are likely to be needed. Monocular testing, as required for trials, was also validated for the first time. Conclusions This study offers new insights into PA in ACHM. It provides the first structured evidence of the great significance of this symptom to patients, suggesting that PA should be considered as an additional outcome measure in therapeutic trials. It also offers new insights into the characteristics of PA in ACHM, and describes both subjective and objective measures of PA that could be employed in clinical trials. PMID:28715587
NASA Technical Reports Server (NTRS)
Bromage, Timothy G.; Doty, Stephen B.; Smolyar, Igor; Holton, Emily
1996-01-01
Our stated primary objective is to quantify the growth rate variability of rat lamellar bone exposed to micro and macrogravity (2G). The primary significance of the proposed work is that an elegant method will be established that unequivocally characterizes the morphological consequences of gravitational factors on developing bone. The integrity of this objective depends upon our successful preparation of thin sections suitable for imaging individual bone lamellae, and our imaging and quantitation of growth rate variability in populations of lamellae from individual bone samples.
A New Object-Based Framework to Detect Shadows in High-Resolution Satellite Imagery Over Urban Areas
NASA Astrophysics Data System (ADS)
Tatar, N.; Saadatseresht, M.; Arefi, H.; Hadavand, A.
2015-12-01
In this paper a new object-based framework to detect shadow areas in high-resolution satellite images is proposed. To produce a shadow map at the pixel level, state-of-the-art supervised machine-learning algorithms are employed. Automatic ground-truth generation based on Otsu thresholding of shadow and non-shadow indices is used to train the classifiers. This is followed by segmenting the image scene to create image objects. To detect shadow objects, a majority vote over the pixel-based shadow-detection results is applied. A GeoEye-1 multi-spectral image over an urban area in the city of Qom, Iran, is used in the experiments. The results show the superiority of our proposed method over traditional pixel-based methods, both visually and quantitatively.
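Two steps of this pipeline are easy to sketch: Otsu thresholding for automatic ground-truth generation, and the per-object majority vote. The grayscale values below are illustrative toy data; the actual work applies these ideas to shadow indices of a GeoEye-1 image.

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: pick the threshold that maximizes between-class variance."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0, sum0 = 0, 0.0
    for t in range(levels):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                 # mean of the "dark" class (<= t)
        m1 = (sum_all - sum0) / w1     # mean of the "bright" class (> t)
        between = w0 * w1 * (m0 - m1) ** 2
        if between > best_var:
            best_var, best_t = between, t
    return best_t

def object_is_shadow(pixel_flags):
    """Majority vote over a segment's pixel-level shadow flags (1 = shadow)."""
    return sum(pixel_flags) > len(pixel_flags) / 2

# Bimodal toy "shadow index" image: 50 dark (shadow) and 50 bright pixels.
pixels = [20] * 50 + [200] * 50
threshold = otsu_threshold(pixels)
```

With the convention that pixels at or below the threshold are shadow candidates, a segment is labeled shadow only if most of its pixels vote that way.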
Multi-object segmentation framework using deformable models for medical imaging analysis.
Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel
2016-08-01
Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures on medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation, combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions, such as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing a suitable combination to be selected in different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted.
Experimental results on both computed tomography and magnetic resonance imaging show that the proposed framework has a wide range of applications, especially in the presence of adjacent structures of interest or under intra-structure inhomogeneities, giving excellent quantitative results.
Single-exposure quantitative phase imaging in color-coded LED microscopy.
Lee, Wonchan; Jung, Daeseong; Ryu, Suho; Joo, Chulmin
2017-04-03
We demonstrate single-shot quantitative phase imaging (QPI) in a platform of color-coded LED microscopy (cLEDscope). The light source in a conventional microscope is replaced by a circular LED pattern that is trisected into subregions with equal area, assigned to red, green, and blue colors. Image acquisition with a color image sensor and subsequent computation based on weak object transfer functions allow for the QPI of a transparent specimen. We also provide a correction method for color-leakage, which may be encountered in implementing our method with consumer-grade LEDs and image sensors. Most commercially available LEDs and image sensors do not provide spectrally isolated emissions and pixel responses, generating significant error in phase estimation in our method. We describe the correction scheme for this color-leakage issue, and demonstrate improved phase measurement accuracy. The computational model and single-exposure QPI capability of our method are presented by showing images of calibrated phase samples and cellular specimens.
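The color-leakage correction described here amounts to spectral unmixing: if a calibrated 3×3 crosstalk matrix maps true per-LED intensities to the measured R/G/B sensor channels, inverting that linear system recovers leak-free channels. The matrix values below are invented for illustration; the paper calibrates its own from the actual LEDs and sensor.

```python
def solve3(m, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with partial pivoting."""
    aug = [row[:] + [bi] for row, bi in zip(m, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(3):
            if r != col:
                f = aug[r][col] / aug[col][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    return [aug[i][3] / aug[i][i] for i in range(3)]

def unmix(crosstalk, measured_rgb):
    """Recover per-LED intensities from color-leaky R/G/B sensor readings."""
    return solve3(crosstalk, measured_rgb)

# Invented crosstalk matrix: entry [i][j] is how much of LED color j leaks
# into sensor channel i. True intensities [1.0, 2.0, 3.0] give these readings.
crosstalk = [[0.90, 0.10, 0.00],
             [0.05, 0.85, 0.10],
             [0.00, 0.10, 0.90]]
measured_rgb = [1.10, 2.05, 2.90]
recovered = unmix(crosstalk, measured_rgb)
```

One plausible way to obtain such a matrix is to switch on one LED color at a time and record the response of each sensor channel; that calibration detail is an assumption here, not taken from the paper.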
Novel method for quantitative ANA measurement using near-infrared imaging.
Peterson, Lisa K; Wells, Daniel; Shaw, Laura; Velez, Maria-Gabriela; Harbeck, Ronald; Dragone, Leonard L
2009-09-30
Antinuclear antibodies (ANA) have been detected in patients with systemic rheumatic diseases and are used in the screening and/or diagnosis of autoimmunity in patients as well as mouse models of systemic autoimmunity. Indirect immunofluorescence (IIF) on HEp-2 cells is the gold standard for ANA screening. However, its usefulness is limited in diagnosis, prognosis and monitoring of disease activity due to the lack of standardization in performing the technique, subjectivity in interpreting the results and the fact that it is only semi-quantitative. Various immunological techniques have been developed in an attempt to improve upon the method to quantify ANA, including enzyme-linked immunosorbent assays (ELISAs), line immunoassays (LIAs), multiplexed bead immunoassays and IIF on substrates other than HEp-2 cells. Yet IIF on HEp-2 cells remains the most common screening method for ANA. In this study, we describe a simple quantitative method to detect ANA which combines IIF on HEp-2 coated slides with analysis using a near-infrared imaging (NII) system. Using NII to determine ANA titer, 86.5% (32 of 37) of the titers for human patient samples were within 2 dilutions of those determined by IIF, which is the acceptable range for proficiency testing. Combining an initial screening for nuclear staining using microscopy with titration by NII resulted in 97.3% (36 of 37) of the titers detected to be within two dilutions of those determined by IIF. The NII method for quantitative ANA measurements using serum from both patients and mice with autoimmunity provides a fast, relatively simple, objective, sensitive and reproducible assay, which could easily be standardized for comparison between laboratories.
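The agreement criterion used above ("within two dilutions") can be made precise on a log2 scale, since ANA titers form a doubling-dilution series. A minimal sketch of that check (the small tolerance term guards against floating-point rounding):

```python
import math

def within_dilutions(titer_a, titer_b, steps=2):
    """True if two endpoint titers (e.g. 40 for 1:40) agree within `steps` doubling dilutions."""
    return abs(math.log2(titer_a) - math.log2(titer_b)) <= steps + 1e-9
```

For example, 1:40 versus 1:160 is exactly two doubling dilutions apart and counts as agreement, while 1:40 versus 1:320 does not.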
Curtis, Tyler E; Roeder, Ryan K
2017-10-01
Advances in photon-counting detectors have enabled quantitative material decomposition using multi-energy or spectral computed tomography (CT). Supervised methods for material decomposition utilize an estimated attenuation for each material of interest at each photon energy level, which must be calibrated based upon calculated or measured values for known compositions. Measurements using a calibration phantom can advantageously account for system-specific noise, but the effect of calibration methods on the material basis matrix and subsequent quantitative material decomposition has not been experimentally investigated. Therefore, the objective of this study was to investigate the influence of the range and number of contrast agent concentrations within a modular calibration phantom on the accuracy of quantitative material decomposition in the image domain. Gadolinium was chosen as a model contrast agent in imaging phantoms, which also contained bone tissue and water as negative controls. The maximum gadolinium concentration (30, 60, and 90 mM) and total number of concentrations (2, 4, and 7) were independently varied to systematically investigate effects of the material basis matrix and scaling factor calibration on the quantitative (root mean squared error, RMSE) and spatial (sensitivity and specificity) accuracy of material decomposition. Images of calibration and sample phantoms were acquired using a commercially available photon-counting spectral micro-CT system with five energy bins selected to normalize photon counts and leverage the contrast agent k-edge. Material decomposition of gadolinium, calcium, and water was performed for each calibration method using a maximum a posteriori estimator. 
Both the quantitative and spatial accuracy of material decomposition were most improved by using an increased maximum gadolinium concentration (range) in the basis matrix calibration; the effects of using a greater number of concentrations were relatively small in magnitude by comparison. The material basis matrix calibration was more sensitive to changes in the calibration methods than the scaling factor calibration. The material basis matrix calibration significantly influenced both the quantitative and spatial accuracy of material decomposition, while the scaling factor calibration influenced quantitative but not spatial accuracy. Importantly, the median RMSE of material decomposition was as low as ~1.5 mM (~0.24 mg/mL gadolinium), which was similar in magnitude to that measured by optical spectroscopy on the same samples. The accuracy of quantitative material decomposition in photon-counting spectral CT was significantly influenced by the calibration methods, which must therefore be carefully considered for the intended diagnostic imaging application. © 2017 American Association of Physicists in Medicine.
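The study's maximum a posteriori estimator is system-specific, but the role of the calibrated material basis matrix can be illustrated with a simpler unregularized least-squares decomposition: each voxel's attenuation across the energy bins is modeled as a linear combination of the per-material basis attenuations. A pure-Python sketch with illustrative numbers and names (not the paper's estimator):

```python
def decompose(basis, measured):
    """Least-squares material decomposition: solve basis @ x = measured.

    basis: one row per energy bin, one column per material (calibrated
    attenuation of each basis material in each bin); measured: the
    attenuation of one voxel in each bin. Returns per-material amounts.
    """
    m, n = len(basis), len(basis[0])
    # Normal equations: (A^T A) x = A^T y
    ata = [[sum(basis[k][i] * basis[k][j] for k in range(m)) for j in range(n)]
           for i in range(n)]
    aty = [sum(basis[k][i] * measured[k] for k in range(m)) for i in range(n)]
    # Gaussian elimination with partial pivoting on the small n x n system
    M = [row[:] + [b] for row, b in zip(ata, aty)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x
```

A basis matrix calibrated over a different concentration range changes the columns of `basis` and hence every decomposed value, which is the sensitivity the study quantifies.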
[Study on objectively evaluating skin aging according to areas of skin texture].
Shan, Gaixin; Gan, Ping; He, Ling; Sun, Lu; Li, Qiannan; Jiang, Zheng; He, Xiangqian
2015-02-01
Skin aging principles play important roles in skin disease diagnosis, the evaluation of skin cosmetic effects, forensic identification and age identification in sports competition, etc. This paper proposes a new method to evaluate skin aging objectively and quantitatively by skin texture area. First, an enlarged skin image was acquired. Then, the skin texture image was segmented using the iterative threshold method, and the skin ridge image was extracted with the watershed algorithm. Finally, the areas of the skin ridges were extracted. The experimental data showed that the average skin ridge areas of both men and women correlated well with age (the correlation coefficient r was 0.938 for males and 0.922 for females), and the regression curve of skin texture area against age showed that the area increases with age. Therefore, the new method presented in this paper is effective for evaluating skin aging objectively.
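The reported correlation coefficients (r = 0.938 for males, 0.922 for females) are ordinary Pearson correlations between average ridge area and age; a stdlib-only sketch of the computation:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

A value near +1, as found here, is what justifies fitting the increasing area-versus-age regression curve.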
Zhu, H B; Su, C J; Tang, H F; Ruan, Z; Liu, D H; Wang, H; Qian, Y L
2017-10-20
Objective: To establish a method for the rapid determination of 47 volatile organic compounds in workplace air using a portable gas chromatography-mass spectrometer (GC-MS). Methods: Mixed standard gases at different concentration levels were prepared by static gas dilution with high-purity nitrogen as the dilution gas. The samples were injected into the GC-MS by a hand-held probe. Retention time and characteristic ions were used for qualitative analysis, and the internal standard method was used for quantitation. Results: The 47 toxic substances were well separated and determined. The linear range of the method was 0.2-16.0 mg/m³, the relative standard deviation for 45 volatile organic compounds was 3.8%-15.8%, and the average recovery was 79.3%-119.0%. Conclusion: The method is simple, accurate, and sensitive, with good separation and a short analysis time; it can be used for the qualitative and quantitative analysis of volatile organic compounds in the workplace and supports the rapid identification and detection of occupational hazards.
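Internal-standard quantitation as named above follows a standard pattern: a response factor is calibrated from a standard of known concentration, then applied to the sample's analyte-to-internal-standard peak-area ratio. A generic sketch (the paper does not give its actual calculation; names and numbers here are illustrative):

```python
def response_factor(area_analyte, area_istd, conc_analyte, conc_istd):
    """Calibrate RF = (A_analyte / A_istd) / (C_analyte / C_istd)
    from a standard of known analyte concentration."""
    return (area_analyte / area_istd) / (conc_analyte / conc_istd)

def quantify(area_analyte, area_istd, conc_istd, rf):
    """Analyte concentration from sample peak areas and a calibrated RF."""
    return (area_analyte / area_istd) * conc_istd / rf
```

Because the analyte is always ratioed against the co-injected internal standard, injection-volume and instrument-drift errors largely cancel.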
Gao, Hua-Jun; Chen, Ya-Jing; Zuo, Duo; Xiao, Ming-Ming; Li, Ying; Guo, Hua; Zhang, Ning; Chen, Rui-Bing
2015-01-01
Objective Hepatocellular carcinoma (HCC) is a leading cause of cancer-related deaths. Novel serum biomarkers are required to increase the sensitivity and specificity of serum screening for early HCC diagnosis. This study employed a quantitative proteomic strategy to analyze the differential expression of serum glycoproteins between HCC and normal control serum samples. Methods Lectin affinity chromatography (LAC) was used to enrich glycoproteins from the serum samples. Quantitative mass spectrometric analysis combined with stable isotope dimethyl labeling and 2D liquid chromatography (LC) separations were performed to examine the differential levels of the detected proteins between HCC and control serum samples. Western blot was used to analyze the differential expression levels of the three serum proteins. Results A total of 2,280 protein groups were identified in the serum samples from HCC patients by using the 2D LC-MS/MS method. Up to 36 proteins were up-regulated in the HCC serum, whereas 19 proteins were down-regulated. Three differential glycoproteins, namely, fibrinogen gamma chain (FGG), FOS-like antigen 2 (FOSL2), and α-1,6-mannosylglycoprotein 6-β-N-acetylglucosaminyltransferase B (MGAT5B) were validated by Western blot. All these three proteins were up-regulated in the HCC serum samples. Conclusion A quantitative glycoproteomic method was established and proven useful to determine potential novel biomarkers for HCC. PMID:26487969
Quantitative diagnostic method for biceps long head tendinitis by using ultrasound.
Huang, Shih-Wei; Wang, Wei-Te
2013-01-01
To investigate the feasibility of a grayscale quantitative diagnostic method for biceps tendinitis and determine the cut-off points of a quantitative biceps ultrasound (US) method to diagnose biceps tendinitis. Design: prospective cross-sectional case-controlled study in an outpatient rehabilitation service. A total of 336 shoulder pain patients with suspected biceps tendinitis were recruited in this prospective observational study. Grayscale pixel data of the region of interest (ROI) were obtained for both the transverse and longitudinal views of the biceps US. A total of 136 patients were classified with biceps tendinitis, and 200 patients were classified as not having biceps tendinitis based on the diagnostic criteria. Based on the Youden index, the cut-off points were determined as 26.85 for the transverse view and 21.25 for the longitudinal view of the standard deviation (StdDev) of the ROI values, respectively. When the ROI evaluation of the US surpassed the cut-off point, the sensitivity was 68% and the specificity was 90% for the StdDev of the transverse view, and the sensitivity was 81% and the specificity was 73% for the StdDev of the longitudinal view in diagnosing biceps tendinitis. For equivocal cases or inexperienced sonographers, our study provides a more objective method for diagnosing biceps tendinitis in shoulder pain patients.
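The Youden index used to pick these cut-offs is J = sensitivity + specificity − 1, maximized over candidate thresholds. A minimal sketch, assuming (as the reported cut-offs suggest) that StdDev values at or above the threshold are called positive:

```python
def youden_cutoff(pos_scores, neg_scores):
    """Threshold maximizing J = sensitivity + specificity - 1.

    pos_scores: measurements from subjects with the condition;
    neg_scores: measurements from those without. A score at or above
    the threshold is called positive.
    """
    best_j, best_t = -1.0, None
    for t in sorted(set(pos_scores) | set(neg_scores)):
        sens = sum(s >= t for s in pos_scores) / len(pos_scores)
        spec = sum(s < t for s in neg_scores) / len(neg_scores)
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j
```

Scanning every observed score as a candidate threshold is exactly the ROC-style search the Youden criterion implies; ties keep the first (lowest) maximizing threshold.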
2D and 3D X-ray phase retrieval of multi-material objects using a single defocus distance.
Beltran, M A; Paganin, D M; Uesugi, K; Kitchen, M J
2010-03-29
A method of tomographic phase retrieval is developed for multi-material objects whose components each have a distinct complex refractive index. The phase-retrieval algorithm, based on the Transport-of-Intensity equation, utilizes propagation-based X-ray phase contrast images acquired at a single defocus distance for each tomographic projection. The method requires a priori knowledge of the complex refractive index for each material present in the sample, together with the total projected thickness of the object at each orientation. The requirement of only a single defocus distance per projection simplifies the experimental setup and imposes no additional dose compared to conventional tomography. The algorithm was implemented using phase contrast data acquired at the SPring-8 Synchrotron facility in Japan. The three-dimensional (3D) complex refractive index distribution of a multi-material test object was quantitatively reconstructed using a single X-ray phase-contrast image per projection. The technique is robust in the presence of noise compared to conventional absorption-based tomography.
Thapaliya, Kiran; Pyun, Jae-Young; Park, Chun-Su; Kwon, Goo-Rak
2013-01-01
The level set approach is a powerful tool for segmenting images. This paper proposes a method for segmenting brain tumor images from MR images. A new signed pressure function (SPF) that can efficiently stop the contours at weak or blurred edges is introduced. The local statistics of the different objects present in the MR images were calculated and used to identify the tumor among the different objects. In level set methods, parameter selection is a challenging task; here, the parameters were calculated automatically for different types of images. The basic thresholding value was updated and adjusted automatically for each MR image and used to calculate the different parameters of the proposed algorithm. The proposed algorithm was tested on magnetic resonance images of the brain for tumor segmentation, and its performance was evaluated visually and quantitatively. Numerical experiments on brain tumor images highlighted the efficiency and robustness of the method. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
Quantitative Market Research Regarding Funding of District 8 Construction Projects
DOT National Transportation Integrated Search
1995-05-01
The primary objective of this quantitative research is to provide information for more effective decision making regarding the level of investment in various transportation systems in District 8. This objective was accomplished by establishing ...
Surface plasmon resonance microscopy: achieving a quantitative optical response
Peterson, Alexander W.; Halter, Michael; Plant, Anne L.; Elliott, John T.
2016-01-01
Surface plasmon resonance (SPR) imaging allows real-time label-free imaging based on index of refraction, and changes in index of refraction at an interface. Optical parameter analysis is achieved by application of the Fresnel model to SPR data typically taken by an instrument in a prism based configuration. We carry out SPR imaging on a microscope by launching light into a sample, and collecting reflected light through a high numerical aperture microscope objective. The SPR microscope enables spatial resolution that approaches the diffraction limit, and has a dynamic range that allows detection of subnanometer to submicrometer changes in thickness of biological material at a surface. However, unambiguous quantitative interpretation of SPR changes using the microscope system could not be achieved using the Fresnel model because of polarization dependent attenuation and optical aberration that occurs in the high numerical aperture objective. To overcome this problem, we demonstrate a model to correct for polarization diattenuation and optical aberrations in the SPR data, and develop a procedure to calibrate reflectivity to index of refraction values. The calibration and correction strategy for quantitative analysis was validated by comparing the known indices of refraction of bulk materials with corrected SPR data interpreted with the Fresnel model. Subsequently, we applied our SPR microscopy method to evaluate the index of refraction for a series of polymer microspheres in aqueous media and validated the quality of the measurement with quantitative phase microscopy. PMID:27782542
Application of shift-and-add algorithms for imaging objects within biological media
NASA Astrophysics Data System (ADS)
Aizert, Avishai; Moshe, Tomer; Abookasis, David
2017-01-01
The Shift-and-Add (SAA) technique is a simple mathematical operation developed to reconstruct, at high spatial resolution, atmospherically degraded solar images obtained from stellar speckle interferometry systems. This method shifts and assembles individual degraded short-exposure images into a single average image with significantly improved contrast and detail. Since the inhomogeneous refractive indices of biological tissue cause light scattering similar to that induced by optical turbulence in the atmospheric layers, we assume that SAA methods can be successfully implemented to reconstruct the image of an object within a scattering biological medium. To test this hypothesis, five SAA algorithms were evaluated for reconstructing images acquired from multiple viewpoints. After successfully retrieving the hidden object's shape, quantitative image quality metrics were derived, enabling comparison of imaging error across a spectrum of layer thicknesses and demonstrating the relative efficacy of each SAA algorithm for biological imaging.
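The simplest SAA variant aligns each short-exposure frame on its brightest pixel before averaging, so the object reinforces while the random degradation averages out. A toy sketch of that basic variant only (the paper evaluates five algorithms, which are not specified here):

```python
def shift_and_add(frames):
    """Align each frame on its brightest pixel, then average.

    frames: list of 2-D lists (grayscale). The brightest pixel of each
    frame is shifted to the image center before summation (zero-padded
    shift), which is the basic Shift-and-Add reconstruction.
    """
    h, w = len(frames[0]), len(frames[0][0])
    cy, cx = h // 2, w // 2
    acc = [[0.0] * w for _ in range(h)]
    for f in frames:
        # locate the brightest pixel of this frame
        by, bx = max(((y, x) for y in range(h) for x in range(w)),
                     key=lambda p: f[p[0]][p[1]])
        dy, dx = cy - by, cx - bx
        for y in range(h):
            for x in range(w):
                sy, sx = y - dy, x - dx  # source pixel after the shift
                if 0 <= sy < h and 0 <= sx < w:
                    acc[y][x] += f[sy][sx]
    n = len(frames)
    return [[v / n for v in row] for row in acc]
```

With two frames whose single bright pixel sits in different corners, the averaged output concentrates all the energy at the center, illustrating the reinforcement effect.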
Tracer Methods for Characterizing Fracture Creation in Engineered Geothermal Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Peter; Harris, Joel
2014-05-08
The aim of this proposal is to develop, through novel high-temperature tracing approaches, three technologies for characterizing fracture creation within Engineered Geothermal Systems (EGS). The objective of the first task is to identify, develop and demonstrate adsorbing tracers for characterizing interwell reservoir-rock surface areas and fracture spacing. The objective of the second task is to develop and demonstrate a methodology for measuring fracture surface areas adjacent to single wells. The objective of the third task is to design, fabricate and test an instrument that uses tracers to measure fluid flow between newly created fractures and wellbores. In one method of deployment, it will be used to identify qualitatively which fractures were activated during a hydraulic stimulation experiment. In a second method of deployment, it will serve to measure quantitatively the rate of fluid flowing from one or more activated fractures during a production test following a hydraulic stimulation.
Surface pressure measurement by oxygen quenching of luminescence
NASA Technical Reports Server (NTRS)
Gouterman, Martin P. (Inventor); Kavandi, Janet L. (Inventor); Gallery, Jean (Inventor); Callis, James B. (Inventor)
1993-01-01
Methods and compositions for measuring the pressure of an oxygen-containing gas on an aerodynamic surface, by oxygen quenching of the luminescence of molecular sensors, are disclosed. Objects are coated with luminescent films containing a first sensor and at least one of two additional sensors, each sensor having a luminescence with a different dependence on temperature and oxygen pressure. Methods and compositions are also provided for improving pressure measurements (qualitative or quantitative) on surfaces coated with a film having one or more types of sensor.
Surface pressure measurement by oxygen quenching of luminescence
NASA Technical Reports Server (NTRS)
Gouterman, Martin P. (Inventor); Kavandi, Janet L. (Inventor); Gallery, Jean (Inventor); Callis, James B. (Inventor)
1994-01-01
Methods and compositions for measuring the pressure of an oxygen-containing gas on an aerodynamic surface, by oxygen quenching of the luminescence of molecular sensors, are disclosed. Objects are coated with luminescent films containing a first sensor and at least one of two additional sensors, each sensor having a luminescence with a different dependence on temperature and oxygen pressure. Methods and compositions are also provided for improving pressure measurements (qualitative or quantitative) on surfaces coated with a film having one or more types of sensor.
Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.
Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E
2007-01-01
This paper describes a set of tools for performing measurements of objects in a virtual-reality-based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that hitherto had been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue-engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ragan, Eric D; Goodall, John R
2014-01-01
Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
HOLST, Alexandra Ioana; HOLST, Stefan; HIRSCHFELDER, Ursula; von SECKENDORFF, Volker
2012-01-01
Objective The objective of this study was to investigate the applicability of micro-analytical methods with high spatial resolution to the characterization of the composition and corrosion behavior of two bracket systems. Material and methods The surfaces of six nickel-free brackets and six nickel-containing brackets were examined for signs of corrosion and subjected to qualitative surface analysis using an electron probe microanalyzer (EPMA), prior to bonding to patients' tooth surfaces and four months after clinical use. The surfaces were characterized qualitatively by secondary electron (SE) images and backscattered electron (BSE) images in both compositional and topographical modes. Qualitative and quantitative wavelength-dispersive analyses were performed for different elements, and qualitative analysis was used to map the relative concentration of selected elements two-dimensionally. The absolute concentration of the elements was determined in specially prepared brackets by quantitative analysis, using pure element standards for calibration and calculating correction factors (ZAF). Results Clear differences were observed between the bracket types. The nickel-containing stainless steel brackets consist of two separate pieces joined by a brazing alloy. Compositional analysis revealed two different alloy compositions and reaction zones on both sides of the brazing alloy. The nickel-free bracket was a single piece with only slight variation in element concentration, but had a significantly rougher surface. After clinical use, no corrosive phenomena were detectable with the methods applied. Traces of intraoral wear at the contact areas between the bracket slot and the arch wire were verified. Conclusion Electron probe microanalysis is a valuable tool for the characterization of element distribution and quantitative analysis in corrosion studies. PMID:23032212
ERIC Educational Resources Information Center
Timmerman, M. C.
2004-01-01
Objective: To explore the impact of the school climate on adolescents' reporting of sexual harassment. Design: A quantitative survey among students in their 4th year of secondary education. Setting: Questionnaires were completed in a class setting. Method: A random sampling strategy was used to select 2808 students in 22 schools. Results:…
The Influence of Islamic Moral Values on the Students' Behavior in Aceh
ERIC Educational Resources Information Center
Nuriman; Fauzan
2017-01-01
This study shows the influence and relationship of Islamic moral values to the students' behavior in Aceh Province. The learning objects are the Islamic moral values taught in high schools and vocational institutions, which are assumed to affect the students' behavior. Quantitative methods were used in this study, and the analysis was run in SPSS…
Joseph B. James and Fred J. Genthner
United States Environmental Protection Agency, Gulf Breeze, FL
Background: Methods using rapid-cycle, real-time, quantitative PCR (QPCR) are being developed for detecting and quantifying Enterococcus spp. as well as other aquatic b...
ERIC Educational Resources Information Center
Wang, Ye; Willis, Erin
2016-01-01
Objective: To examine whether and to what extent relevant and meaningful discussions of weight loss occurred in the Weight Watchers' online community, and whether and to what extent the online community is designed for fostering such discussions. Methods: A multimethod approach was used here. First, a quantitative content analysis was conducted on…
ERIC Educational Resources Information Center
Echon, Roger M.
2014-01-01
Purpose/Objectives: The purpose of this paper is to provide baseline data and characteristics of food served and consumed prior to the recently mandated nutrition standards as authorized by the Healthy, Hunger-Free Kids Act of 2010 (HHFKA). Methods: Over 600,000 school lunch menus with associated food production records from 61 elementary schools…
ERIC Educational Resources Information Center
Dehghan, Mahshid; Lopez Jaramillo, Patricio; Duenas, Ruby; Anaya, Lilliam Lima; Garcia, Ronald G.; Zhang, Xiaohe; Islam, Shofiqul; Merchant, Anwar T.
2012-01-01
Objective: To validate a food frequency questionnaire (FFQ) against multiple 24-hour dietary recalls (DRs) that could be used for Colombian adults. Methods: A convenience sample of 219 individuals participated in the study. The validity of the FFQ was evaluated against multiple DRs. Four dietary recalls were collected during the year, and an FFQ…
The Snacking Habits of Adolescents: Is Snack Food Necessary to Meet Dietary Recommendations?
ERIC Educational Resources Information Center
Howard, Susan; Reeves, Sue
2005-01-01
Objective: To investigate the role of snack foods in the diets of adolescents in relation to recommendations. Design: A quantitative study whereby the food intakes of 28 adolescents aged 11-14 years were recorded for three consecutive days. Setting: A secondary school in South West London. Methods: Food intake was recorded using food diaries and…
Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo
2014-01-01
This paper took a subregion of a small watershed gully system in the Beiyanzikou catchment of Qixia, China, as the study area and, using object-oriented image analysis (OBIA), extracted the shoulder lines of gullies from high-spatial-resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder lines of the gullies. Finally, the original surface was fitted using linear regression in accordance with the elevation of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract gully information; the average distance between field-measured points along the gully edges and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m(2) and 5074.1790 m(3), and 1316.1250 m(2) and 1591.5784 m(3), respectively. The results of the study provide a new method for the quantitative study of small gully erosion.
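The accuracy assessment reduces to nearest-neighbour distances between field-measured GPS points and the classified boundary. A stdlib sketch of that comparison, treating both as point sets (an assumption; the paper does not spell out its exact distance computation):

```python
import math

def mean_boundary_distance(gps_points, boundary_points):
    """Mean and variance of nearest-neighbour distances from
    field-measured GPS points to classified boundary points,
    both given as lists of (x, y) coordinates."""
    dists = []
    for gx, gy in gps_points:
        d = min(math.hypot(gx - bx, gy - by) for bx, by in boundary_points)
        dists.append(d)
    mean = sum(dists) / len(dists)
    var = sum((d - mean) ** 2 for d in dists) / len(dists)
    return mean, var
```

A small mean and variance, like the 0.3166 m reported above, indicates the classified shoulder line tracks the surveyed edge closely.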
Yuan, Tao; Zheng, Xinqi; Hu, Xuan; Zhou, Wei; Wang, Wei
2014-01-01
Objective and effective image quality assessment (IQA) is directly related to the application of optical remote sensing images (ORSI). In this study, a new IQA method that standardizes the target object recognition rate (ORR) to reflect quality is presented. First, several quality-degradation treatments are applied to high-resolution ORSIs to model ORSIs obtained under different imaging conditions; then, a machine learning algorithm is adopted for recognition experiments on a chosen target object to obtain ORRs; finally, a comparison with commonly used IQA indicators is performed to reveal their applicability and limitations. The results showed that the ORR of the original ORSI was up to 81.95%, whereas the ORR ratios of the quality-degraded images to the original images were 65.52%, 64.58%, 71.21%, and 73.11%. These data reflect the advantages and disadvantages of different images in object identification and information extraction more accurately than conventional digital image assessment indexes. By judging image quality from the perspective of application effect, using a machine learning algorithm to extract regional grayscale features of typical objects in the image, and quantitatively assessing ORSI quality according to the resulting differences, this method provides a new approach to objective ORSI assessment.
An Objective Measure of Interconnection Usage for High Levels of Wind Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yasuda, Yoh; Gomez-Lazaro, Emilio; Holttinen, Hannele
2014-11-13
This paper analyzes selected interconnectors in Europe using several evaluation factors: capacity factor, congested time, and congestion ratio. For a quantitative and objective evaluation, the authors propose using publicly available data on maximum net transmission capacity (NTC) levels during a single year to study congestion rates, recognizing that the capacity factor depends upon the chosen capacity of the selected interconnector. This value will be referred to as 'the annual maximum transmission capacity (AMTC)', which gives a transparent and objective evaluation of interconnector usage based on the published grid data. While the method is general, its initial application is motivated by the transfer of renewable energy.
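The evaluation factors named above are straightforward to compute from hourly flow data once a reference capacity is fixed (the paper's AMTC; the 99%-of-capacity congestion threshold below is an illustrative assumption, not the paper's definition):

```python
def interconnector_stats(hourly_flow_mw, capacity_mw, congested_frac=0.99):
    """Capacity factor and congestion metrics for one interconnector.

    hourly_flow_mw: absolute flows per hour over the study period;
    capacity_mw: the reference transmission capacity (e.g. AMTC).
    An hour counts as congested when flow reaches congested_frac * capacity.
    """
    hours = len(hourly_flow_mw)
    capacity_factor = sum(hourly_flow_mw) / (capacity_mw * hours)
    congested_hours = sum(f >= congested_frac * capacity_mw
                          for f in hourly_flow_mw)
    congestion_ratio = congested_hours / hours
    return capacity_factor, congested_hours, congestion_ratio
```

Because the capacity factor scales inversely with the chosen `capacity_mw`, fixing it to the published annual maximum (AMTC) is what makes comparisons across interconnectors transparent.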
Resolving power of diffraction imaging with an objective: a numerical study.
Wang, Wenjin; Liu, Jing; Lu, Jun Qing; Ding, Junhua; Hu, Xin-Hua
2017-05-01
Diffraction imaging in the far field can detect 3D morphological features of an object owing to its coherent nature. We describe methods for the accurate calculation and analysis of diffraction images of single- and double-sphere scatterers by an imaging unit based on a microscope objective at non-conjugate positions. A quantitative study of the calculated diffraction images in the spectral domain was performed to assess the resolving power of diffraction imaging. It is shown numerically that, with coherent illumination at 532 nm in wavelength, the imaging unit can resolve single spheres of 2 μm or larger in diameter and double spheres separated by less than 300 nm between their centers.
[Classical and molecular methods for identification and quantification of domestic moulds].
Fréalle, E; Bex, V; Reboux, G; Roussel, S; Bretagne, S
2017-12-01
To study the impact of the constant and inevitable inhalation of moulds, it is necessary to sample, identify and count the spores. Environmental sampling methods fall into three categories: surface sampling, which is easy to perform but non-quantitative; air sampling, which is easy to calibrate but provides only time-limited information; and dust sampling, which is more representative of long-term exposure to moulds. The sampling strategy depends on the objectives (evaluation of the risk of exposure for individuals; quantification of household contamination; evaluation of the efficacy of remediation). The mould colonies obtained in culture are identified using microscopy, MALDI-TOF, and/or DNA sequencing. Electrostatic dust collectors are an alternative to older methods for identifying and quantifying household mould spores; they are easy to use and relatively cheap. Colony counting should be progressively replaced by quantitative real-time PCR, which is already validated, while waiting for more standardised high-throughput sequencing methods to assess mould contamination without technical bias. Despite some technical recommendations for obtaining reliable and comparable results, the huge diversity of environmental moulds, the variable quantity of spores inhaled and the association with other allergens (mites, plants) make the evaluation of their impact on human health difficult. Hence there is a need for reliable and generally applicable quantitative methods. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.
Lee, Onseok; Park, Sunup; Kim, Jaeyoung; Oh, Chilhwan
2017-11-01
The visual scoring method has been used for the subjective evaluation of pigmentary skin disorders. The severity of pigmentary skin disease, especially melasma, is evaluated using a visual scoring method, the MASI (melasma area and severity index). This study differentiates between epidermal and dermal pigmented disease and was undertaken to determine methods to quantitatively measure the severity of pigmentary skin disorders under ultraviolet illumination. The optical imaging system consists of illumination (white LED, UV-A lamp) and image acquisition (DSLR camera, air-cooled CMOS CCD camera). Each camera is equipped with a polarizing filter to remove glare. For analysis of the visible- and UV-light images, the images of melasma patients are divided into frontal, cheek, and chin regions, and each image undergoes image processing; to reduce the curvature error of facial contours, a gradient mask is used. The new segmentation method for frontal and lateral facial images measures facial area more objectively than the MASI score, and image analysis of darkness and homogeneity is adequate to quantify the conventional MASI score. Under visible light, active lesion margins appear in both epidermal and dermal melanin, whereas under UV light melanin is found in the epidermis. This study objectively analyzes the severity of melasma and attempts to develop new methods of image analysis with ultraviolet optical imaging equipment. Based on the results of this study, our optical imaging system could be used as a valuable tool to assess the severity of pigmentary skin disease. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Watkins, Daphne C.; Wharton, Tracy; Mitchell, Jamie A.; Matusko, Niki; Kales, Helen
2016-01-01
The purpose of this study was to explore the role of non-spousal family support on mental health among older, church-going African American men. The mixed methods objective was to employ a design that used existing qualitative and quantitative data to explore the interpretive context within which social and cultural experiences occur. Qualitative data (n=21) were used to build a conceptual model that was tested using quantitative data (n= 401). Confirmatory factor analysis indicated an inverse association between non-spousal family support and distress. The comparative fit index, Tucker-Lewis fit index, and root mean square error of approximation indicated good model fit. This study offers unique methodological approaches to using existing, complementary data sources to understand the health of African American men. PMID:28943829
Single-shot quantitative phase microscopy with color-multiplexed differential phase contrast (cDPC)
2017-01-01
We present a new technique for quantitative phase and amplitude microscopy from a single color image with coded illumination. Our system consists of a commercial brightfield microscope with one hardware modification: an inexpensive 3D printed condenser insert. The method, color-multiplexed Differential Phase Contrast (cDPC), is a single-shot variant of Differential Phase Contrast (DPC), which recovers the phase of a sample from images with asymmetric illumination. We employ partially coherent illumination to achieve resolution corresponding to 2× the objective NA. Quantitative phase can then be used to synthesize DIC and phase contrast images or to extract shape and density. We demonstrate amplitude and phase recovery at camera-limited frame rates (50 fps) for various in vitro cell samples and C. elegans in a micro-fluidic channel. PMID:28152023
A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.
Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
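The quantification step described in this record (stain, photograph, measure surface coverage) can be sketched with a simple global-threshold approach. This is an illustrative assumption, not the authors' exact algorithm: here Otsu's method picks the threshold separating stained biomass from background, and coverage is the fraction of bright pixels.

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: pick the gray level maximizing between-class variance."""
    hist = np.histogram(img, bins=256, range=(0, 256))[0].astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # class-0 probability up to each level
    mu = np.cumsum(p * np.arange(256))      # class-0 cumulative mean
    mu_t = mu[-1]                           # global mean
    denom = omega * (1.0 - omega)
    denom[denom == 0] = np.nan              # avoid division by zero at the ends
    sigma_b = (mu_t * omega - mu) ** 2 / denom
    return int(np.nanargmax(sigma_b))

def fouling_coverage(img):
    """Fraction of pixels brighter than the Otsu threshold (stained biomass)."""
    return float((img > otsu_threshold(img)).mean())
```

On a synthetic bimodal image with 30% bright (stained) pixels, the coverage estimate returns 0.30; a real pipeline would add illumination correction before thresholding.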
An Approach to Extract Moving Objects from Mls Data Using a Volumetric Background Representation
NASA Astrophysics Data System (ADS)
Gehrung, J.; Hebel, M.; Arens, M.; Stilla, U.
2017-05-01
Data recorded by mobile LiDAR systems (MLS) can be used for the generation and refinement of city models or for the automatic detection of long-term changes in the public road space. Since only static structures are of interest for this task, all mobile objects need to be removed. This work presents a straightforward but powerful approach to removing the subclass of moving objects. A probabilistic volumetric representation is utilized to separate MLS measurements recorded by a Velodyne HDL-64E into mobile objects and static background. The method was subjected to a quantitative and a qualitative examination using multiple datasets recorded by a mobile mapping platform. The results show that, depending on the chosen octree resolution, 87-95% of the measurements are labeled correctly.
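The probabilistic volumetric representation above can be illustrated with a minimal log-odds occupancy grid in the style of OctoMap, using a hash map instead of a true octree. The update constants and the 0.5 m resolution are illustrative assumptions, not values from the paper.

```python
import math
from collections import defaultdict

# Illustrative log-odds update constants (hit/miss increments and clamps).
L_HIT, L_MISS, L_MIN, L_MAX = 0.85, -0.4, -2.0, 3.5

class OccupancyGrid:
    def __init__(self, resolution=0.5):
        self.res = resolution
        self.logodds = defaultdict(float)   # voxel key -> log-odds of occupancy

    def _key(self, point):
        return tuple(int(math.floor(c / self.res)) for c in point)

    def integrate(self, point, hit):
        """Bayesian log-odds update for one observation of a voxel."""
        k = self._key(point)
        delta = L_HIT if hit else L_MISS
        self.logodds[k] = min(L_MAX, max(L_MIN, self.logodds[k] + delta))

    def is_static(self, point):
        """Persistently occupied voxels are treated as static background."""
        return self.logodds[self._key(point)] > 0.0
```

After all scans are integrated, points whose voxels fail the persistence test would be labeled as moving objects and removed from the static city model.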
3D Slicer as an Image Computing Platform for the Quantitative Imaging Network
Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron
2012-01-01
Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to reproducibility and efficiency of the quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, and providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications.
To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future directions that can further facilitate development and validation of imaging biomarkers using 3D Slicer. PMID:22770690
Graberski Matasović, M; Matasović, T; Markovac, Z
1997-06-01
Femoral quadriceps muscle hypotrophy has become a frequent and significant therapeutic problem. Efforts are being made to improve the standard kinesitherapeutic treatment scheme with additional, more effective therapeutic methods. Besides kinesitherapy, the authors used magnetotherapy in 30 of the 60 patients. A total of 60 patients, of both sexes and with similar age and intensity of hypotrophy, were included in the study. They were divided into an experimental group A and a control group B (30 patients each). The treatment was scheduled for the usual 5-6 weeks. Quantitative electromyographic (EMG) analysis was used to check the treatment results achieved after 5 and 6 weeks of treatment. Analysis of the results confirmed the assumption that magnetotherapy may yield better and faster treatment results, disappearance of pain and a decreased risk of complications: the same results were obtained in the experimental group one week earlier than in the control group. However, quantitative EMG analysis did not prove to be a sufficiently reliable and objective method for assessing the real condition of the muscle and the effects of treatment.
Quantitative tomographic imaging of intermolecular FRET in small animals
Venugopal, Vivek; Chen, Jin; Barroso, Margarida; Intes, Xavier
2012-01-01
Förster resonance energy transfer (FRET) is a nonradiative transfer of energy between two fluorescent molecules (a donor and an acceptor) in nanometer range proximity. FRET imaging methods have been applied to proteomic studies and drug discovery applications based on intermolecular FRET efficiency measurements and stoichiometric measurements of FRET interaction as quantitative parameters of interest. Importantly, FRET provides information about biomolecular interactions at a molecular level, well beyond the diffraction limits of standard microscopy techniques. The application of FRET to small animal imaging will allow biomedical researchers to investigate physiological processes occurring at nanometer range in vivo as well as in situ. In this work a new method for the quantitative reconstruction of FRET measurements in small animals, incorporating a full-field tomographic acquisition system with a Monte Carlo based hierarchical reconstruction scheme, is described and validated in murine models. Our main objective is to estimate the relative concentration of two forms of donor species, i.e., a donor molecule involved in FRETing to an acceptor close by and a non-FRETing donor molecule. PMID:23243567
Quantitative subsurface analysis using frequency modulated thermal wave imaging
NASA Astrophysics Data System (ADS)
Subhani, S. K.; Suresh, B.; Ghali, V. S.
2018-01-01
Quantitative depth analysis of subsurface anomalies with enhanced depth resolution is a challenging task in thermography. Frequency modulated thermal wave imaging, introduced earlier, provides a complete depth scan of the object by stimulating it with a suitable band of frequencies and analyzing the subsequent thermal response with a suitable post-processing approach to resolve subsurface details. Conventional Fourier-transform-based post-processing, however, unscrambles the frequencies with limited frequency resolution and therefore yields a finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution to axially explore the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide a closer estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first solution for quantitative depth estimation in frequency modulated thermal wave imaging.
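The spectral zooming this record attributes to the chirp z-transform amounts to evaluating the spectrum on a dense frequency grid confined to a narrow band. As a sketch, the discrete-time Fourier transform can be evaluated directly on such a grid; this gives the same zoomed output as a CZT at O(N·M) cost rather than the CZT's FFT-based cost. The signal parameters below are illustrative, not from the paper.

```python
import numpy as np

def zoom_spectrum(x, fs, f_lo, f_hi, n_points):
    """Evaluate the DTFT of x on a dense grid inside [f_lo, f_hi].

    Mimics the band 'zooming' that a chirp z-transform provides,
    via a direct O(N*M) matrix evaluation.
    """
    n = np.arange(len(x))
    freqs = np.linspace(f_lo, f_hi, n_points)
    kernel = np.exp(-2j * np.pi * np.outer(freqs, n) / fs)
    return freqs, np.abs(kernel @ x)
```

For example, a 10.25 Hz tone sampled for 4 s at 100 Hz falls between the 0.25-Hz-spaced FFT bins, but a zoomed grid with millihertz spacing localizes the peak far more finely, which is the frequency-resolution gain the abstract maps onto depth resolution.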
High Resolution Qualitative and Quantitative MR Evaluation of the Glenoid Labrum
Iwasaki, Kenyu; Tafur, Monica; Chang, Eric Y.; Statum, Sheronda; Biswas, Reni; Tran, Betty; Bae, Won C.; Du, Jiang; Bydder, Graeme M.; Chung, Christine B.
2015-01-01
Objective To implement qualitative and quantitative MR sequences for the evaluation of labral pathology. Methods Six glenoid labra were dissected and the anterior and posterior portions were divided into normal, mildly degenerated, or severely degenerated groups using gross and MR findings. Qualitative evaluation was performed using T1-weighted, proton density-weighted (PD), spoiled gradient echo (SPGR) and ultra-short echo time (UTE) sequences. Quantitative evaluation included T2 and T1rho measurements as well as T1, T2*, and T1rho measurements acquired with UTE techniques. Results SPGR and UTE sequences best demonstrated labral fiber structure. Degenerated labra had a tendency towards decreased T1 values, increased T2/T2* values and increased T1rho values. T2* values obtained with the UTE sequence allowed for delineation between normal, mildly degenerated and severely degenerated groups (p<0.001). Conclusion Quantitative T2* measurements acquired with the UTE technique are useful for distinguishing between normal, mildly degenerated and severely degenerated labra. PMID:26359581
Abuali, M M; Katariwala, R; LaBombardi, V J
2012-05-01
The agar proportion method (APM) for determining Mycobacterium tuberculosis susceptibilities is a qualitative method that requires 21 days to produce results. The Sensititre method allows a quantitative assessment. Our objective was to compare the accuracy, time to results, and ease of use of the Sensititre method with those of the APM. 7H10 plates for the APM and 96-well microtiter dry MYCOTB panels containing 12 antibiotics at full dilution ranges for the Sensititre method were inoculated with M. tuberculosis and read for colony growth. Thirty-seven clinical isolates were tested using both methods, and 26 challenge strains of blinded susceptibilities were tested using the Sensititre method only. The Sensititre method displayed 99.3% concordance with the APM. The APM provided reliable results on day 21, whereas the Sensititre method displayed consistent results by day 10. The Sensititre method provides a more rapid, quantitative, and efficient method of testing both first- and second-line drugs when compared to the gold standard. It will give clinicians a sense of the degree of susceptibility, thus guiding the therapeutic decision-making process. Furthermore, the microwell plate format, which requires no instrumentation, will allow its use in resource-poor settings.
Odendaal, Willem; Atkins, Salla; Lewin, Simon
2016-12-15
Formative programme evaluations assess intervention implementation processes, and are seen widely as a way of unlocking the 'black box' of any programme in order to explore and understand why a programme functions as it does. However, few critical assessments of the methods used in such evaluations are available, and there are especially few that reflect on how well the evaluation achieved its objectives. This paper describes a formative evaluation of a community-based lay health worker programme for TB and HIV/AIDS clients across three low-income communities in South Africa. It assesses each of the methods used in relation to the evaluation objectives, and offers suggestions on ways of optimising the use of multiple, mixed-methods within formative evaluations of complex health system interventions. The evaluation's qualitative methods comprised interviews, focus groups, observations and diary keeping. Quantitative methods included a time-and-motion study of the lay health workers' scope of practice and a client survey. The authors conceptualised and conducted the evaluation, and through iterative discussions, assessed the methods used and their results. Overall, the evaluation highlighted programme issues and insights beyond the reach of traditional single methods evaluations. The strengths of the multiple, mixed-methods in this evaluation included a detailed description and nuanced understanding of the programme and its implementation, and triangulation of the perspectives and experiences of clients, lay health workers, and programme managers. However, the use of multiple methods needs to be carefully planned and implemented as this approach can overstretch the logistic and analytic resources of an evaluation. For complex interventions, formative evaluation designs including multiple qualitative and quantitative methods hold distinct advantages over single method evaluations. 
However, their value is not in the number of methods used, but in how each method matches the evaluation questions and the scientific integrity with which the methods are selected and implemented.
Anguera, M Teresa; Camerino, Oleguer; Castañer, Marta; Sánchez-Algarra, Pedro; Onwuegbuzie, Anthony J
2017-01-01
Mixed methods studies are being applied to an increasing diversity of fields. In this paper, we discuss the growing use, and enormous potential, of mixed methods research in the field of sport and physical activity. A second aim is to contribute to strengthening the characteristics of mixed methods research by showing how systematic observation offers rigor within a flexible framework that can be applied to a wide range of situations. Observational methodology is characterized by high scientific rigor and flexibility throughout its different stages and allows the objective study of spontaneous behavior in natural settings, with no external influence. Mixed methods researchers need to take bold yet thoughtful decisions regarding both substantive and procedural issues. We present three fundamental and complementary ideas to guide researchers in this respect: we show why studies of sport and physical activity that use a mixed methods research approach should be included in the field of mixed methods research; we highlight the numerous possibilities offered by observational methodology in this field through the transformation of descriptive data into quantifiable code matrices; and we discuss possible solutions for achieving true integration of qualitative and quantitative findings.
How to assess intestinal viability during surgery: A review of techniques
Urbanavičius, Linas; Pattyn, Piet; Van de Putte, Dirk; Venskutonis, Donatas
2011-01-01
Objective and quantitative intraoperative methods of bowel viability assessment are essential in gastrointestinal surgery. Exact determination of the borderline of the viable bowel with the help of an objective test could result in a decrease of postoperative ischemic complications. An accurate, reproducible and cost effective method is desirable in every operating theater dealing with abdominal operations. Numerous techniques assessing various parameters of intestinal viability have been described in the literature. However, there is no consensus about their clinical use. To evaluate the available methods, a systematic search of the English literature was performed. Virtues and drawbacks of the techniques and possibilities of clinical application are reviewed. Valuable parameters related to postoperative intestinal anastomotic or stoma complications are analyzed. Important issues in the measurement and interpretation of bowel viability are discussed. To date, only a few methods are applicable in surgical practice. Further studies are needed to determine the limiting values of intestinal tissue oxygenation and flow indicative of ischemic complications and to standardize the methods. PMID:21666808
Pilot-optimal augmentation synthesis
NASA Technical Reports Server (NTRS)
Schmidt, D. K.
1978-01-01
An augmentation synthesis method usable in the absence of quantitative handling qualities specifications, and yet explicitly including design objectives based on pilot-rating concepts, is presented. The algorithm involves the unique approach of simultaneously solving for the stability augmentation system (SAS) gains, pilot equalization and pilot rating prediction via optimal control techniques. Simultaneous solution is required in this case since the pilot model (gains, etc.) depends upon the augmented plant dynamics, and the augmentation is obviously not a priori known. Another special feature is the use of the pilot's objective function (from which the pilot model evolves) to design the SAS.
Ma, Danjun; Wang, Jiarui; Zhao, Yingchun; Lee, Wai-Nang Paul; Xiao, Jing; Go, Vay Liang W.; Wang, Qi; Recker, Robert; Xiao, Gary Guishan
2011-01-01
Objectives Novel quantitative proteomic approaches were used to study the effects of inhibition of glycogen phosphorylase on proteome and signaling pathways in MIA PaCa-2 pancreatic cancer cells. Methods We performed quantitative proteomic analysis in MIA PaCa-2 cancer cells treated with a stratified dose of CP-320626 (25 μM, 50 μM and 100 μM). The effect of metabolic inhibition on cellular protein turnover dynamics was also studied using the modified SILAC method (mSILAC). Results A total of twenty-two protein spots and four phosphoprotein spots were quantitatively analyzed. We found that dynamic expression of total proteins and phosphoproteins was significantly changed in MIA PaCa-2 cells treated with an incremental dose of CP-320626. Functional analyses suggested that most of the proteins differentially expressed were in the pathways of MAPK/ERK and TNF-α/NF-κB. Conclusions Signaling pathways and metabolic pathways share many common cofactors and substrates forming an extended metabolic network. The restriction of substrate through one pathway such as inhibition of glycogen phosphorylation induces pervasive metabolomic and proteomic changes manifested in protein synthesis, breakdown and post-translational modification of signaling molecules. Our results suggest that quantitative proteomic is an important approach to understand the interaction between metabolism and signaling pathways. PMID:22158071
Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang
2017-01-01
The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is important for monitoring the treatment of brain disorders. Therefore, a computer aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of computer simulations show that the derived CRMs have high similarity with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach localizes recovered regions with 93.4% accuracy. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive quantitative indexes from longitudinal SPECT brain images for treatment monitoring.
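A change-rate map as described here is, at its core, a voxelwise relative difference between two registered volumes, with the 20% figure from the abstract serving as a detection threshold. The sketch below is a plausible minimal version under those assumptions; the actual CAA pipeline (registration, smoothing, region statistics) is omitted.

```python
import numpy as np

def change_rate_map(baseline, followup, eps=1e-6):
    """Voxelwise relative change between two registered SPECT volumes."""
    return (followup - baseline) / np.maximum(baseline, eps)

def recovered_regions(crm, min_rate=0.20):
    """Mask voxels whose rCBF increased by at least `min_rate` (20% here)."""
    return crm >= min_rate
```

On a toy 2x2 "volume", a voxel going from 100 to 130 counts yields a change rate of 0.30 and is flagged as recovered, while a voxel dropping to 70 yields -0.30 and is not.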
A method of plane geometry primitive presentation
NASA Astrophysics Data System (ADS)
Jiao, Anbo; Luo, Haibo; Chang, Zheng; Hui, Bin
2014-11-01
Point features and line features are basic elements in object feature sets, and they play an important role in object matching and recognition. Point features are sensitive to noise, and an image usually contains a huge number of them, which makes matching complex. Line features include straight line segments and curves. One difficulty in straight line segment matching is the uncertainty of endpoint location; another is fragmentation, where a long segment is broken into short segments that must be rejoined. Curves share these problems and add a further difficulty: how to quantitatively describe the shape difference between curves. Because of these problems with point and line features, the robustness and accuracy of target description are affected. To address this, a method of plane geometry primitive presentation is proposed to describe the significant structure of an object. Firstly, two types of primitives are constructed: intersecting line primitives and blob primitives. Secondly, a line segment detector (LSD) is applied to detect line segments, from which intersecting line primitives are extracted. Finally, the robustness and accuracy of the plane geometry primitive presentation method are studied. The method captures the structural information of an object well, even under rotation or scale change in the image. Experimental results verify its robustness and accuracy.
Moving object detection via low-rank total variation regularization
NASA Astrophysics Data System (ADS)
Wang, Pengcheng; Chen, Qian; Shao, Na
2016-09-01
Moving object detection is a challenging task in video surveillance. The recently proposed Robust Principal Component Analysis (RPCA) can recover outlier patterns from low-rank data under some mild conditions. However, the l1-penalty in RPCA does not work well in moving object detection because the irrepresentable condition is often not satisfied. In this paper, a method based on a total variation (TV) regularization scheme is proposed. In our model, image sequences captured with a static camera are highly correlated and can be described using a low-rank matrix; the low-rank matrix can also absorb background motion, e.g. periodic and random perturbations. The foreground objects in the sequence are usually sparsely distributed and drift continuously, and can be treated as group outliers against the highly correlated background scenes. Instead of the l1-penalty, we exploit the total variation of the foreground. By minimizing the total variation energy, the outliers tend to collapse and finally converge to the exact moving objects. The TV-penalty is superior to the l1-penalty especially when the outliers are in the majority for some pixels, and our method can estimate the outliers explicitly with less bias but higher variance. A joint optimization function is formulated and effectively solved through the inexact Augmented Lagrange Multiplier (ALM) method. We evaluate our method along with several state-of-the-art approaches in MATLAB. Both qualitative and quantitative results demonstrate that our proposed method works effectively on a large range of complex scenarios.
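The baseline this record builds on, RPCA solved by inexact ALM, decomposes a data matrix M (frames stacked as columns) into a low-rank background L plus a sparse foreground S. A minimal sketch of that baseline follows, with the standard parameter choices (lambda = 1/sqrt(max(m, n)), geometric mu growth) as assumptions; the paper's contribution is to replace the elementwise sparsity penalty on S with a total-variation penalty, which is not implemented here.

```python
import numpy as np

def rpca_ialm(M, tol=1e-7, max_iter=500):
    """Baseline RPCA: decompose M into low-rank L plus sparse S via inexact ALM.

    Solves min ||L||_* + lam*||S||_1 s.t. L + S = M.
    (The paper discussed above swaps the l1 term for a TV penalty.)
    """
    m, n = M.shape
    lam = 1.0 / np.sqrt(max(m, n))
    norm2 = np.linalg.norm(M, 2)
    mu = 1.25 / norm2
    rho, mu_max = 1.5, mu * 1e7
    Y = M / max(norm2, np.abs(M).max() / lam)   # dual variable initialization
    S = np.zeros_like(M)
    for _ in range(max_iter):
        # L-step: singular value thresholding at level 1/mu
        U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # S-step: elementwise soft thresholding at level lam/mu
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        # Dual update and penalty growth
        Y = Y + mu * (M - L - S)
        mu = min(mu * rho, mu_max)
        if np.linalg.norm(M - L - S) <= tol * np.linalg.norm(M):
            break
    return L, S
```

In the surveillance setting, the columns of S would then be reshaped back into frames to obtain the foreground masks.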
Derkacs, Amanda D Felder; Ward, Samuel R; Lieber, Richard L
2012-02-01
Understanding cytoskeletal dynamics in living tissue is prerequisite to understanding mechanisms of injury, mechanotransduction, and mechanical signaling. Real-time visualization is now possible using transfection with plasmids that encode fluorescent cytoskeletal proteins. Using this approach with the muscle-specific intermediate filament protein desmin, we found that a green fluorescent protein-desmin chimeric protein was unevenly distributed throughout the muscle fiber, resulting in some image areas that were saturated as well as others that lacked any signal. Our goal was to analyze the muscle fiber cytoskeletal network quantitatively in an unbiased fashion. To objectively select areas of the muscle fiber that are suitable for analysis, we devised a method that provides objective classification of regions of images of striated cytoskeletal structures into "usable" and "unusable" categories. This method consists of a combination of spatial analysis of the image using Fourier methods along with a boosted neural network that "decides" on the quality of the image based on previous training. We trained the neural network using the expert opinion of three scientists familiar with these types of images. We found that this method was over 300 times faster than manual classification and that it permitted objective and accurate classification of image regions.
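The Fourier half of the approach described above (the boosted neural network is omitted) can be sketched as a single spectral feature: a regularly striated region produces a sharp peak in the spatial-frequency spectrum of its intensity profile, while a saturated or empty region does not. The peak-to-mean ratio below is an illustrative feature, not the authors' exact measure.

```python
import numpy as np

def striation_score(patch):
    """Spectral peak-to-mean ratio of the intensity profile along the fiber axis.

    Regular sarcomeric striations give one sharp spatial-frequency peak,
    so the ratio is large; unstructured regions give a flat spectrum.
    """
    profile = patch.mean(axis=0)            # average across the fiber width
    profile = profile - profile.mean()      # drop the DC component
    spec = np.abs(np.fft.rfft(profile))[1:]
    return float(spec.max() / (spec.mean() + 1e-12))
```

A classifier (as in the record, a boosted neural network trained on expert labels) would then threshold or combine such features to sort image regions into "usable" and "unusable" categories.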
DOE Office of Scientific and Technical Information (OSTI.GOV)
WIELOPOLSKI, L.
In this short report, I reassess the feasibility of measuring iron in vivo in the liver and heart of thalassemia patients undergoing chelation therapy. Despite the multiplicity of analytical methods for analyzing iron, only two, magnetic resonance imaging and magnetic susceptibility, are suitable for in vivo applications, and these are limited to the liver because of cardiac motion. Previously, a nuclear method, gamma-resonance scattering, offered a quantitative measure of iron in these organs; however, it was abandoned because it necessitated a nuclear reactor to produce the radioactive source. I reviewed and reassessed the status of two alternative nuclear methods, based on iron spectroscopy of gamma rays induced by fast neutron inelastic scattering and delayed activation in iron. Both are quantitative methods with high specificity for iron and adequate penetrating power to measure it in organs sited deep within the human body. My experiments demonstrated that both modalities met the stated qualitative objectives to measure iron. However, neutron dosimetry revealed that the intensity of the neutron radiation field was too weak to reliably assess the minimum detection limits, and to allow quantitative extrapolations to measurements in people. A review of the literature, included in this report, showed that these findings agree qualitatively with the published results, although the doses reported were about three orders of magnitude higher than those I used. Reviewing the limitations of the present work, steps were outlined for overcoming some of the shortcomings. Due to a dearth of valid quantitative alternatives for determining iron in vivo, I conclude that nuclear methods remain the only viable option. However, from the lessons learned, further systematic work is required before embarking on clinical studies.
Vibration analysis based on electronic stroboscopic speckle-shearing pattern interferometry
NASA Astrophysics Data System (ADS)
Jia, Dagong; Yu, Changsong; Xu, Tianhua; Jin, Chao; Zhang, Hongxia; Jing, Wencai; Zhang, Yimo
2008-12-01
In this paper, an electronic speckle-shearing pattern interferometer with a pulsed laser and a pulse frequency controller is fabricated. The principle of measuring object vibration using an electronic stroboscopic speckle-shearing pattern interferometer is analyzed. Using a metal plate clamped at its edge as an experimental specimen, shear interferograms are obtained at two experimental frequencies, 100 Hz and 200 Hz. The vibration of the same metal plate under the same experimental conditions is also measured using the time-average method in order to test the performance of the electronic stroboscopic speckle-shearing pattern interferometer. The results indicate that the fringes of the shear interferogram become denser as the experimental frequency increases. Comparing the fringe patterns, the shearing interferogram obtained by the stroboscopic method is clearer than that obtained by the time-average method. Both the time-average method and the stroboscopic method are suited to qualitative analysis of object vibration; moreover, the stroboscopic method is well adapted to quantitative vibration analysis.
Assessment of calcium scoring performance in cardiac computed tomography.
Ulzheimer, Stefan; Kalender, Willi A
2003-03-01
Electron beam tomography (EBT) has been used for cardiac diagnosis and the quantitative assessment of coronary calcium since the late 1980s. The introduction of mechanical multi-slice spiral CT (MSCT) scanners with shorter rotation times opened new possibilities of cardiac imaging with conventional CT scanners. The purpose of this work was to qualitatively and quantitatively evaluate the performance of EBT and MSCT for the task of coronary artery calcium imaging as a function of acquisition protocol, heart rate, spiral reconstruction algorithm (where applicable) and calcium scoring method. A cardiac CT semi-anthropomorphic phantom was designed and manufactured for the investigation of all relevant image quality parameters in cardiac CT. This phantom includes various test objects, some of which can be moved within the anthropomorphic phantom in a manner that mimics realistic heart motion. These tools were used to qualitatively and quantitatively demonstrate the accuracy of coronary calcium imaging using typical protocols for an electron beam scanner (Evolution C-150XP, Imatron, South San Francisco, Calif.) and a 0.5-s four-slice spiral CT scanner (Sensation 4, Siemens, Erlangen, Germany). A special focus was put on the method of quantifying coronary calcium, and three scoring systems were evaluated (Agatston, volume, and mass scoring). Good reproducibility in coronary calcium scoring is always the result of a combination of high temporal and spatial resolution; consequently, thin-slice protocols in combination with retrospective gating on MSCT scanners yielded the best results. The Agatston score was found to be the least reproducible scoring method. The hydroxyapatite mass, being more reproducible, comparable across different scanners, and a physical quantitative measure, appears to be the method of choice for future clinical studies. The hydroxyapatite mass is highly correlated with the Agatston score.
The introduced phantoms can be used to quantitatively assess the performance characteristics of, for example, different scanners, reconstruction algorithms, and quantification methods in cardiac CT. This is especially important for quantitative tasks, such as the determination of the amount of calcium in the coronary arteries, to achieve high and constant quality in this field.
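The three scoring systems compared above differ mainly in how they weight lesion attenuation. A minimal sketch of the standard Agatston weighting and a simplified calibration-based mass score follows; the lesion values and the calibration factor are hypothetical illustrative numbers, not the phantom measurements from this study.

```python
# Hedged sketch: Agatston density weighting and a simplified mass score.
# Lesion tuples (area_mm2, peak_hu, mean_hu) are hypothetical values.

def agatston_weight(peak_hu):
    """Density weight from a lesion's peak attenuation (standard Agatston bins)."""
    if peak_hu >= 400:
        return 4
    if peak_hu >= 300:
        return 3
    if peak_hu >= 200:
        return 2
    if peak_hu >= 130:
        return 1
    return 0

def agatston_score(lesions):
    """Sum of lesion area x density weight over all lesions (>= 130 HU)."""
    return sum(area * agatston_weight(peak) for area, peak, _ in lesions)

def mass_score(lesions, slice_thickness_mm, calibration):
    """Simplified hydroxyapatite mass: calibration * sum(lesion volume * mean HU).
    The calibration factor would come from a phantom scan in practice."""
    return calibration * sum(area * slice_thickness_mm * mean
                             for area, _, mean in lesions)

lesions = [(4.0, 310, 250), (2.5, 180, 150)]   # two hypothetical lesions
print(agatston_score(lesions))                 # 4.0*3 + 2.5*1 = 14.5
```

The mass score's linear dependence on mean attenuation (rather than a stepped peak-HU weight) is one reason it reproduces better across scanners, as the abstract reports.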
Mohler, M Jane; Coons, Stephen Joel; Hornbrook, Mark C; Herrinton, Lisa J; Wendel, Christopher S; Grant, Marcia; Krouse, Robert S
2008-07-01
The objective of this paper is to describe the complex mixed-methods design of a study conducted to assess health-related quality of life (HRQOL) outcomes and ostomy-related obstacles and adjustments among long-term (>5 years) colorectal cancer (CRC) survivors with ostomies (cases) and without ostomies (controls). In addition, details are provided regarding the study sample and the psychometric properties of the quantitative data collection measures used. Subsequent manuscripts will present the study findings. The study design involved a cross-sectional mail survey for collecting quantitative data and focus groups for collecting qualitative data. The study subjects were individuals identified as long-term CRC survivors within a community-based health maintenance organization's enrolled population. Focus groups composed of cases were conducted. The groups were divided by gender and HRQOL high and low quartile contrasts (based on the mail survey data). The modified City of Hope Quality of Life (mCOH-QOL)-Ostomy and SF-36v2 questionnaires were used in the mail survey. An abridged version of the mCOH-QOL-Ostomy was used for the control subjects. Focus groups explored ostomy-related barriers to self-care, adaptation methods/skills, and advice for others with an ostomy. The survey response rate was 52% (679/1308), and 34 subjects participated in focus groups. The internal consistency reliability estimates for the mCOH-QOL-Ostomy and SF-36v2 questionnaires were very acceptable for group comparisons. In addition, evidence supports the construct validity of the abridged version of the mCOH-QOL-Ostomy. Study limitations include potential non-response bias and limited minority participation. We were able to successfully recruit long-term CRC survivors into this study, and the psychometric properties of the quantitative measures used were quite acceptable.
Mixed-methods designs, such as the one used in this study, may be useful in identification and further elucidation of common problems, coping strategies, and HRQOL outcomes among long-term cancer survivors.
Wirth, Troy A.; Pyke, David A.
2007-01-01
Emergency Stabilization and Rehabilitation (ES&R) and Burned Area Emergency Response (BAER) treatments are short-term, high-intensity treatments designed to mitigate the adverse effects of wildfire on public lands. The federal government expends significant resources implementing ES&R and BAER treatments after wildfires; however, recent reviews have found that existing data from monitoring and research are insufficient to evaluate the effects of these activities. The purpose of this report is to: (1) document what monitoring methods are generally used by personnel in the field; (2) describe approaches and methods for post-fire vegetation and soil monitoring documented in agency manuals; (3) determine the common elements of monitoring programs recommended in these manuals; and (4) describe a common monitoring approach to determine the effectiveness of future ES&R and BAER treatments in non-forested regions. Federal land management agencies use both qualitative and quantitative methods to measure the effectiveness of ES&R treatments; whether quantitative methods are used in the field depends on factors such as funding, personnel, and time constraints. Seven vegetation monitoring manuals produced by the federal government address monitoring methods for (primarily) vegetation and soil attributes. These methods vary in their objectivity and repeatability. The most repeatable methods are point-intercept, quadrat-based density measurements, gap intercepts, and direct measurement of soil erosion. Additionally, these manuals recommend approaches for designing monitoring programs for the state of ecosystems or the effect of management actions. The elements of a defensible monitoring program applicable to ES&R and BAER projects that most of these manuals have in common are objectives, stratification, control areas, random sampling, data quality, and statistical analysis.
The effectiveness of treatments can be determined more accurately if data are gathered using an approach that incorporates these six monitoring program design elements and objectives, as well as repeatable procedures to measure cover, density, gap intercept, and soil erosion within each ecoregion and plant community. Additionally, using a common monitoring program design with comparable methods, consistently documenting results, and creating and maintaining a central database for query and reporting, will ultimately allow a determination of the effectiveness of post-fire rehabilitation activities region-wide.
ERIC Educational Resources Information Center
Nigg, Joel T.; Lewis, Kara; Edinger, Tracy; Falk, Michael
2012-01-01
Objective: The role of diet and of food colors in attention-deficit/hyperactivity disorder (ADHD) or its symptoms warrants updated quantitative meta-analysis, in light of recent divergent policy in Europe and the United States. Method: Studies were identified through a literature search using the PubMed, Cochrane Library, and PsycNET databases…
NASA/BLM Applications Pilot Test (APT), phase 2. Volume 3: Technology transfer
NASA Technical Reports Server (NTRS)
1981-01-01
Techniques used and materials presented at a planning session and two workshops held to provide hands-on training in the integration of quantitatively based remote sensing data are described, as are methods used to enhance understanding of approaches to inventories that integrate multiple data sources given various resource information objectives. Significant results from each of the technology transfer sessions are examined.
ERIC Educational Resources Information Center
Sharma, Shreela V.; Hedberg, Ann Marie; Skala, Katherine A.; Chuang, Ru-Jye; Lewis, Tamara
2015-01-01
Garden-based lessons are gaining popularity as a means of increasing fruit and vegetable intake among children. The study objective was to pilot test a garden-based preschool curriculum for feasibility and acceptability in Harris County Department of Education Head Start using qualitative and quantitative methods. A total of 103, 3- to 5-year-old…
2012-01-01
Background Sasang constitutional medicine (SCM) is a unique form of traditional Korean medicine that divides human beings into four constitutional types (Tae-Yang: TY, Tae-Eum: TE, So-Yang: SY, and So-Eum: SE), which differ in inherited characteristics, such as external appearance, personality traits, susceptibility to particular diseases, drug responses, and equilibrium among internal organ functions. According to SCM, herbs that belong to a certain constitution cannot be used in patients with other constitutions; otherwise, this practice may result in no effect or in an adverse effect. Thus, the diagnosis of SC type is the most crucial step in SCM practice. The diagnosis, however, tends to be subjective due to a lack of quantitative standards for SC diagnosis. Methods We have attempted to make the diagnosis method as objective as possible by basing it on an analysis of quantitative data from various Oriental medical clinics. Four individual diagnostic models were developed with multinomial logistic regression based on face, body shape, voice, and questionnaire responses. Inspired by SCM practitioners’ holistic diagnostic processes, an integrated diagnostic model was then proposed by combining the four individual models. Results The diagnostic accuracies in the test set, after the four individual models had been integrated into a single model, improved to 64.0% and 55.2% in the male and female patient groups, respectively. Using a cut-off value for the integrated SC score, such as 1.6, the accuracies increased by 14.7% in male patients and by 4.6% in female patients, which showed that a higher integrated SC score corresponded to a higher diagnostic accuracy. Conclusions This study represents the first trial of integrating the objectification of SC diagnosis based on quantitative data and SCM practitioners’ holistic diagnostic processes. 
Although the diagnostic accuracy was not high, the proposed diagnostic model captures rules common to practitioners with differing points of view. Our results are expected to serve as a guide for research on objective diagnosis in traditional medicine, as well as to contribute to the precise, objective diagnosis of SC types in clinical practice. PMID:22762505
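The abstract describes fusing four per-modality diagnostic models and thresholding an integrated SC score at a cut-off such as 1.6. The exact fusion rule is not given there, so the sketch below assumes a simple log-probability (naive-Bayes-style) combination, with the margin between the top two constitutional types standing in for the integrated SC score; all model outputs are hypothetical.

```python
import math

TYPES = ["TY", "TE", "SY", "SE"]  # the four Sasang constitutional types

def integrate(model_probs):
    """Fuse per-model class probabilities by summing log-probabilities
    (an assumed combination rule, not necessarily the paper's).
    Returns the winning type and its margin over the runner-up."""
    scores = {t: sum(math.log(p[t]) for p in model_probs) for t in TYPES}
    best = max(scores, key=scores.get)
    runner_up = max(s for t, s in scores.items() if t != best)
    margin = scores[best] - runner_up   # analogue of the integrated SC score
    return best, margin

# Hypothetical outputs of the face, body-shape, voice and questionnaire models
face  = {"TY": 0.1, "TE": 0.5, "SY": 0.2, "SE": 0.2}
body  = {"TY": 0.1, "TE": 0.6, "SY": 0.2, "SE": 0.1}
voice = {"TY": 0.2, "TE": 0.4, "SY": 0.2, "SE": 0.2}
quest = {"TY": 0.1, "TE": 0.5, "SY": 0.3, "SE": 0.1}

best, margin = integrate([face, body, voice, quest])
print(best)        # TE
CUTOFF = 1.6       # from the abstract: larger margins -> more reliable calls
confident = margin >= CUTOFF
```

Restricting diagnoses to cases above the cut-off trades coverage for accuracy, which matches the abstract's observation that higher integrated scores correspond to higher diagnostic accuracy.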
[Assessment of skin aging grading based on computer vision].
Li, Lingyu; Xue, Jinxia; He, Xiangqian; Zhang, Sheng; Fan, Chu
2017-06-01
Skin aging is the most intuitive and obvious sign of human aging. Qualitative and quantitative determination of skin aging is of particular importance for evaluating human aging and anti-aging treatment effects. To address the subjectivity of conventional skin aging grading methods, a self-organizing map (SOM) network was used to explore an automatic method for skin aging grading. First, ventral forearm skin images were obtained with a portable digital microscope, and two texture parameters, i.e., the mean width of skin furrows and the number of intersections, were extracted by an image-processing algorithm. The values of these texture parameters were then used as inputs to train the SOM network. The experimental results showed that the network achieved an overall accuracy of 80.8% compared with aging grades assigned by human graders. The designed method is rapid and objective, and can be used for quantitative analysis of skin images and automatic assessment of skin aging grade.
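The grading idea above can be sketched with a tiny one-dimensional SOM over the two texture features. This is a minimal illustration, not the paper's network: the node count, training schedule, feature values and grade mapping are all assumptions.

```python
import math
import random

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((u - v) ** 2 for u, v in zip(a, b))

def train_som(data, n_nodes=3, epochs=300, lr0=0.5, radius0=1.0, seed=0):
    """Train a tiny 1-D self-organizing map over 2-feature vectors
    (mean furrow width, number of intersections); a sketch only."""
    rng = random.Random(seed)
    nodes = [list(rng.choice(data)) for _ in range(n_nodes)]
    for e in range(epochs):
        lr = lr0 * (1.0 - e / epochs)                     # decaying learning rate
        radius = max(radius0 * (1.0 - e / epochs), 0.01)  # shrinking neighborhood
        x = rng.choice(data)
        best = min(range(n_nodes), key=lambda i: dist2(nodes[i], x))
        for i in range(n_nodes):
            # Gaussian neighborhood pull toward the sample
            h = math.exp(-((i - best) ** 2) / (2.0 * radius ** 2))
            for d in range(len(x)):
                nodes[i][d] += lr * h * (x[d] - nodes[i][d])
    return nodes

def grade(nodes, x):
    """Aging grade = index of the winning (closest) node."""
    return min(range(len(nodes)), key=lambda i: dist2(nodes[i], x))

# Hypothetical (furrow width, intersection count) vectors for three groups
samples = ([[0.1 + 0.02 * i, 9.0] for i in range(5)] +
           [[0.5 + 0.02 * i, 5.0] for i in range(5)] +
           [[0.9 + 0.02 * i, 2.0] for i in range(5)])
nodes = train_som(samples)
```

Because the SOM is unsupervised, the node indices only become "grades" once they are matched against human grader labels, which is how the 80.8% agreement above would be computed.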
An exploratory sequential design to validate measures of moral emotions.
Márquez, Margarita G; Delgado, Ana R
2017-05-01
This paper presents an exploratory and sequential mixed methods approach in validating measures of knowledge of the moral emotions of contempt, anger and disgust. The sample comprised 60 participants in the qualitative phase when a measurement instrument was designed. Item stems, response options and correction keys were planned following the results obtained in a descriptive phenomenological analysis of the interviews. In the quantitative phase, the scale was used with a sample of 102 Spanish participants, and the results were analysed with the Rasch model. In the qualitative phase, salient themes included reasons, objects and action tendencies. In the quantitative phase, good psychometric properties were obtained. The model fit was adequate. However, some changes had to be made to the scale in order to improve the proportion of variance explained. Substantive and methodological implications of this mixed-methods study are discussed. Had the study used a single research method in isolation, aspects of the global understanding of contempt, anger and disgust would have been lost.
[Free crystalline silica: a comparison of methods for its determination in total dust].
Maciejewska, Aleksandra; Szadkowska-Stańczyk, Irena; Kondratowicz, Grzegorz
2005-01-01
The major objective of the study was to compare and investigate the usefulness of quantitative analyses of free crystalline silica (FCS) in the assessment of dust exposure in samples of total dust of varied composition, using three methods: the chemical method in common use in Poland; infrared spectrometry; and x-ray powder diffraction. Mineral composition and FCS contents were investigated in 9 laboratory samples of raw materials, materials, and industrial wastes, containing from about 2 to over 80% of crystalline silica and reduced to particles of a size corresponding with that of total dust. Sample components were identified using the XRD and FT-IR methods. Ten independent determinations of FCS with each of the three study methods were performed on the dust samples. Linear correlation analysis was applied to investigate the interrelationship between mean FCS determinations. In the analyzed dust samples, numerous minerals that interfere with silica during quantitative analysis were present alongside the silica dust. Comparison of mean FCS determinations showed that the results obtained using the FT-IR method were 12-13% lower than those obtained with the two other methods. However, the differences observed were within the limits of variability associated with the precision of the results and their dependence on the reference materials used. Assessment of occupational exposure to dusts containing crystalline silica can be performed on the basis of quantitative analysis of FCS in total dusts using any of the compared methods. The FT-IR method is most appropriate for FCS determination in samples with small amounts of silica or collected at low dust concentrations; the XRD method for the analysis of multicomponent samples; and the chemical method in the case of medium and high FCS contents in samples or high concentrations of dusts in the work environment.
Yurt, Kıymet Kübra; Kivrak, Elfide Gizem; Altun, Gamze; Mohamed, Hamza; Ali, Fathelrahman; Gasmalla, Hosam Eldeen; Kaplan, Suleyman
2018-02-26
A quantitative description of a three-dimensional (3D) object based on two-dimensional images can be made using stereological methods. These methods involve unbiased approaches and provide reliable quantitative results. The quantitative morphology of the nervous system has been thoroughly researched in this context. In particular, various novel methods such as design-based stereological approaches have been applied in neuromorphological studies. The main foundations of these methods are systematic random sampling and a 3D approach to structures such as tissues and organs. One key point is that the selected samples should represent the entire structure. Quantification of neurons, i.e. particles, is important for revealing degrees of neurodegeneration and regeneration in an organ or system. One of the most crucial morphometric parameters in biological studies is thus the "number". The disector counting method introduced by Sterio in 1984 is an efficient and reliable solution for particle number estimation. In order to obtain precise results by means of stereological analysis, the items to be counted must be clearly visible in the tissue; if an item cannot be seen, it cannot be analyzed, even with unbiased stereological techniques. Staining and sectioning processes therefore play a critical role in stereological analysis. The purpose of this review is to evaluate current neuroscientific studies using optical and physical disector counting methods and to discuss their definitions and methodological characteristics. Although the efficiency of the optical disector method in light-microscopic studies has been demonstrated in recent years, the physical disector method is more easily performed in electron-microscopic studies. We also offer readers summaries of some common staining and sectioning methods that can be used with stereological techniques.
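The arithmetic behind the disector estimator mentioned above is simple: particles counted in the reference section but absent from the look-up section (Q-) are divided by the total disector volume sampled. The sketch below illustrates this with hypothetical counts and frame dimensions; it is a numerical illustration of the Sterio (1984) estimator, not a full sampling design.

```python
def numerical_density(q_minus_counts, frame_area_um2, disector_height_um):
    """Physical disector estimator N_V = sum(Q-) / sum(disector volumes).
    Q- = particles seen in the reference section but not the look-up section.
    Frame area and height are per-disector values (hypothetical here)."""
    v_dis = frame_area_um2 * disector_height_um          # volume of one disector
    return sum(q_minus_counts) / (len(q_minus_counts) * v_dis)

def total_number(n_v, reference_volume_um3):
    """Total particle number = numerical density x reference volume
    (the reference volume is usually estimated separately, e.g. by Cavalieri)."""
    return n_v * reference_volume_um3

q_minus = [3, 2, 4, 1, 2]   # Q- counts in five disector pairs (hypothetical)
n_v = numerical_density(q_minus, frame_area_um2=900.0, disector_height_um=10.0)
# n_v = 12 / (5 * 9000) particles per um^3
```

Because only particle tops (or first appearances) are counted, the estimate is unbiased with respect to particle size and shape, which is the method's central advantage over profile counting.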
Fluorescence imaging of tryptophan and collagen cross-links to evaluate wound closure ex vivo
NASA Astrophysics Data System (ADS)
Wang, Ying; Ortega-Martinez, Antonio; Farinelli, Bill; Anderson, R. R.; Franco, Walfre
2016-02-01
Wound size is a key parameter in monitoring healing. Current methods to measure wound size are often subjective, time-consuming and marginally invasive. Recently, we developed a non-invasive, non-contact, fast and simple yet robust fluorescence imaging (u-FEI) method to monitor the healing of skin wounds. This method exploits the fluorescence of molecules native to tissue as functional and structural markers. The objective of the present study is to demonstrate the feasibility of using variations in the fluorescence intensity of tryptophan and of collagen cross-links to evaluate keratinocyte proliferation and to quantify wound size during healing, respectively. Circular dermal wounds were created in ex vivo human skin and cultured in different media. Serial fluorescence images of tryptophan and collagen cross-links were acquired every two days. Histology and immunohistology were used to validate the correlation between fluorescence and epithelialization. Images of collagen cross-links show fluorescence of the exposed dermis and hence provide a measure of wound area. Images of tryptophan show higher fluorescence intensity from proliferating keratinocytes forming new epithelium, compared with surrounding keratinocytes not involved in epithelialization. These images are complementary, since collagen cross-links report on structure while tryptophan reports on function. H&E staining and immunohistology show that tryptophan fluorescence correlates with newly formed epidermis. We have established a fluorescence imaging method for studying epithelialization during wound healing in a skin organ culture model; our approach has the potential to provide a non-invasive, non-contact, quick, objective and direct method for quantitative measurement of wound healing in vivo.
Remote sensing imagery classification using multi-objective gravitational search algorithm
NASA Astrophysics Data System (ADS)
Zhang, Aizhu; Sun, Genyun; Wang, Zhenjie
2016-10-01
Simultaneous optimization of different validity measures can capture different data characteristics of remote sensing imagery (RSI), thereby achieving high-quality classification results. In this paper, two conflicting cluster validity indices, the Xie-Beni (XB) index and the fuzzy C-means (FCM) objective (Jm), are integrated with a diversity-enhanced and memory-based multi-objective gravitational search algorithm (DMMOGSA) to present a novel multi-objective-optimization-based RSI classification method. In this method, Gabor filtering is first applied to extract texture features of the RSI. The texture features are then combined with the spectral features to construct the spatial-spectral feature set of the RSI. Afterwards, clustering of the spatial-spectral feature set is carried out: cluster centers are randomly generated initially and then updated and optimized adaptively by the DMMOGSA, yielding a set of non-dominated cluster centers. A number of classification results are thereby produced, and users can select the most promising one according to their problem requirements. To quantitatively and qualitatively validate its effectiveness, the proposed method was applied to classify two aerial high-resolution remote sensing images. The obtained classification results were compared with those produced by two single-validity-index-based methods and two state-of-the-art multi-objective-optimization-based methods. The comparison shows that the proposed method achieves more accurate RSI classification.
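The two objectives being traded off above have standard closed forms: Jm measures cluster compactness, while the Xie-Beni index divides that compactness by the minimal separation between centers, so the two can pull candidate centers in different directions. A minimal sketch of both indices follows; the toy data and candidate centers are hypothetical, and the DMMOGSA search itself is out of scope here.

```python
def sq_dist(a, b):
    """Squared Euclidean distance."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def memberships(X, V, m=2.0):
    """Standard FCM memberships u_ik from squared point-to-center distances."""
    U = []
    for x in X:
        d2 = [max(sq_dist(x, v), 1e-12) for v in V]   # guard zero distances
        inv = [d ** (-1.0 / (m - 1.0)) for d in d2]
        s = sum(inv)
        U.append([w / s for w in inv])
    return U

def jm_index(X, V, m=2.0):
    """FCM objective Jm = sum_k sum_i u_ik^m * ||x_k - v_i||^2 (lower = tighter)."""
    U = memberships(X, V, m)
    return sum(U[k][i] ** m * sq_dist(X[k], V[i])
               for k in range(len(X)) for i in range(len(V)))

def xb_index(X, V, m=2.0):
    """Xie-Beni index: compactness / (n * minimal squared center separation);
    lower values indicate compact, well-separated clusters."""
    sep = min(sq_dist(V[i], V[j])
              for i in range(len(V)) for j in range(len(V)) if i != j)
    return jm_index(X, V, m) / (len(X) * sep)

X = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]  # two tight blobs
good = [(0.05, 0.0), (5.05, 5.0)]   # centers on the blobs
poor = [(2.0, 2.0), (3.0, 3.0)]     # centers between the blobs
```

A multi-objective optimizer like DMMOGSA would evaluate both indices for each candidate center set and retain the non-dominated ones, which is where the "set of non-dominated cluster centers" in the abstract comes from.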
Toomey, Elaine; Matthews, James; Hurley, Deirdre A
2017-01-01
Objectives and design: Despite an increasing awareness of the importance of fidelity of delivery within complex behaviour change interventions, it is often poorly assessed. This mixed methods study aimed to establish the fidelity of delivery of a complex self-management intervention and explore the reasons for these findings using a convergent/triangulation design. Setting: Feasibility trial of the Self-management of Osteoarthritis and Low back pain through Activity and Skills (SOLAS) intervention (ISRCTN49875385), delivered in primary care physiotherapy. Methods and outcomes: 60 SOLAS sessions were delivered across seven sites by nine physiotherapists. Fidelity of delivery of prespecified intervention components was evaluated using (1) audio-recordings (n=60), direct observations (n=24) and self-report checklists (n=60) and (2) individual interviews with physiotherapists (n=9). Quantitatively, fidelity scores were calculated using percentage means and SDs of components delivered. Associations between fidelity scores and physiotherapist variables were analysed using Spearman’s correlations. Interviews were analysed using thematic analysis to explore potential reasons for fidelity scores. Integration of quantitative and qualitative data occurred at an interpretation level using triangulation. Results: Quantitatively, fidelity scores were high for all assessment methods, with self-report (92.7%) consistently higher than direct observations (82.7%) or audio-recordings (81.7%). There was significant variation between physiotherapists’ individual scores (69.8% to 100%). Both qualitative and quantitative data (from physiotherapist variables) found that physiotherapists’ knowledge (Spearman’s association at p=0.003) and previous experience (p=0.008) influenced their fidelity. The qualitative data also postulated participant-level (e.g., individual needs) and programme-level factors (e.g., resources) as additional elements that influenced fidelity.
Conclusion: The intervention was delivered with high fidelity. This study contributes to the limited evidence regarding fidelity assessment methods within complex behaviour change interventions. The findings suggest a combination of quantitative methods is suitable for the assessment of fidelity of delivery. A mixed methods approach provided a more insightful understanding of fidelity and its influencing factors. Trial registration number: ISRCTN49875385; Pre-results. PMID:28780544
The Basic Shelf Experience: a comprehensive evaluation.
Dewolfe, Judith A; Greaves, Gaye
2003-01-01
The Basic Shelf Experience is a program designed to assist people living on limited incomes to make better use of their food resources. The purpose of this research was to learn if the Basic Shelf Experience program helps such people to (1) utilize food resources more effectively and (2) cope, through group support, with poverty-associated stressors that influence food security. Both quantitative and qualitative methods were used to evaluate the program objectives. Participants completed a questionnaire at the beginning and end of the six-week program. The questionnaire asked about their food access, food security, and feelings about themselves. Participants returned for a focus group discussion and completed the questionnaire again three months after the program ended. The focus group was designed to elicit information about perceived changes, if any, attributed to the program. Forty-two people completed the questionnaires pre-program and 20 post-program; 17 participated in the three-month follow-up session. While results from quantitative data analysis indicate that program objectives were not met, qualitative data provide evidence that the program did achieve its stated objectives. Our results suggest such programs as the Basic Shelf Experience can assist people living on limited incomes to achieve food security.
Wang, Yiqin; Yan, Hanxia; Yan, Jianjun; Yuan, Fengyin; Xu, Zhaoxia; Liu, Guoping; Xu, Wenjie
2015-01-01
Objective. This research provides objective and quantitative parameters of traditional Chinese medicine (TCM) pulse conditions for distinguishing between patients with coronary heart disease (CHD) and normal subjects, using a proposed classification approach based on the Hilbert-Huang transform (HHT) and random forest. Methods. The energy and sample entropy features were extracted by applying the HHT to TCM pulse signals treated as time series. Using the random forest classifier, the two types of extracted features and their combination were each used as input data to establish a classification model. Results. Statistical results showed that there were significant differences in pulse energy and sample entropy between the CHD group and the normal group. Moreover, when the energy features, the sample entropy features, and their combination were input as pulse feature vectors, the corresponding average recognition rates were 84%, 76.35%, and 90.21%, respectively. Conclusion. The proposed approach can appropriately be used to analyze the pulses of patients with CHD, laying a foundation for research on objective and quantitative criteria for disease diagnosis or Zheng differentiation. PMID:26180536
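Of the two feature types above, sample entropy has a compact definition that is easy to sketch: count template matches of length m and m+1 within a tolerance, and take the negative log of their ratio. The sketch below uses the common convention of expressing the tolerance r as a fraction of the series' standard deviation (an assumption; the paper's exact parameters are not given in the abstract), and leaves the HHT decomposition and random forest classifier out of scope.

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series.
    r is a fraction of the series' standard deviation (assumed convention).
    Lower values mean a more regular, predictable signal."""
    mean = sum(x) / len(x)
    sd = (sum((v - mean) ** 2 for v in x) / len(x)) ** 0.5
    tol = r * sd

    def count_matches(mm):
        # Count template pairs of length mm whose pointwise (Chebyshev)
        # distance stays within the tolerance.
        n = len(x) - mm
        c = 0
        for i in range(n):
            for j in range(i + 1, n):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= tol:
                    c += 1
        return c

    b, a = count_matches(m), count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

rng = random.Random(1)
regular = [math.sin(0.2 * i) for i in range(200)]    # periodic, predictable
noisy = [rng.uniform(-1.0, 1.0) for _ in range(200)] # irregular
# A pulse waveform disrupted by disease would be expected to shift this measure.
```

In the paper's pipeline, such entropy values (computed per intrinsic mode function from the HHT) would be concatenated with energy features and fed to the random forest.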
Non-animal approaches for toxicokinetics in risk evaluations of food chemicals.
Punt, Ans; Peijnenburg, Ad A C M; Hoogenboom, Ron L A P; Bouwmeester, Hans
2017-01-01
The objective of the present work was to review the availability and predictive value of non-animal toxicokinetic approaches and to evaluate their current use in European risk evaluations of food contaminants, additives and food contact materials, as well as pesticides and medicines. Results revealed little use of quantitative animal or human kinetic data in risk evaluations of food chemicals, compared with pesticides and medicines. Risk evaluations of medicines provided sufficient in vivo kinetic data from different species to evaluate the predictive value of animal kinetic data for humans. These data showed a relatively poor correlation between the in vivo bioavailability in rats and dogs versus that in humans. In contrast, in vitro (human) kinetic data have been demonstrated to provide adequate predictions of the fate of compounds in humans, using appropriate in vitro-in vivo scalers and by integration of in vitro kinetic data with in silico kinetic modelling. Even though in vitro kinetic data were found to be occasionally included within risk evaluations of food chemicals, particularly results from Caco-2 absorption experiments and in vitro data on gut-microbial conversions, only minor use of in vitro methods for metabolism and quantitative in vitro-in vivo extrapolation methods was identified. Yet, such quantitative predictions are essential in the development of alternatives to animal testing as well as to increase human relevance of toxicological risk evaluations. Future research should aim at further improving and validating quantitative alternative methods for kinetics, thereby increasing regulatory acceptance of non-animal kinetic data.
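The quantitative in vitro-in vivo extrapolation the review calls for typically chains two standard steps: scaling an in vitro intrinsic clearance to the whole organ, then applying a liver model. The sketch below uses the well-stirred liver model with typical literature scaling factors (MPPGL, liver weight, hepatic blood flow); all numeric values are illustrative assumptions, not data from this review.

```python
def scale_clint(clint_ul_min_mg, mppgl=40.0, liver_g=1500.0):
    """Scale microsomal intrinsic clearance (uL/min/mg microsomal protein)
    to whole-liver CLint in L/h. MPPGL (mg microsomal protein per g liver)
    and liver weight are typical assumed values."""
    ul_per_min = clint_ul_min_mg * mppgl * liver_g   # uL/min, whole liver
    return ul_per_min * 60.0 / 1e6                   # -> L/h

def hepatic_clearance(clint_l_h, fu, q_h=90.0):
    """Well-stirred liver model: CLh = Qh * fu * CLint / (Qh + fu * CLint),
    with hepatic blood flow Qh in L/h and fu the unbound fraction in blood."""
    return q_h * fu * clint_l_h / (q_h + fu * clint_l_h)

clint = scale_clint(50.0)                 # 50 uL/min/mg -> 180 L/h whole liver
clh = hepatic_clearance(clint, fu=0.1)    # 90*0.1*180/(90+18) = 15 L/h
```

Note the model's built-in ceiling: however large CLint becomes, predicted hepatic clearance cannot exceed hepatic blood flow, which is one reason such predictions stay physiologically plausible where naive extrapolation would not.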
Computer-Based Image Analysis for Plus Disease Diagnosis in Retinopathy of Prematurity
Wittenberg, Leah A.; Jonsson, Nina J.; Chan, RV Paul; Chiang, Michael F.
2014-01-01
Presence of plus disease in retinopathy of prematurity (ROP) is an important criterion for identifying treatment-requiring ROP. Plus disease is defined by a standard published photograph selected over 20 years ago by expert consensus. However, diagnosis of plus disease has been shown to be subjective and qualitative. Computer-based image analysis, using quantitative methods, has the potential to improve the objectivity of plus disease diagnosis. The objective was to review the published literature involving computer-based image analysis for ROP diagnosis. The PubMed and Cochrane Library databases were searched for the keywords “retinopathy of prematurity” AND “image analysis” AND/OR “plus disease.” Reference lists of retrieved articles were searched to identify additional relevant studies. All relevant English-language studies were reviewed. There are four main computer-based systems: ROPtool (area under the ROC curve [AUC]: plus tortuosity 0.95, plus dilation 0.87), RISA (AUC: arteriolar TI 0.71, venular diameter 0.82), Vessel Map (AUC: arteriolar dilation 0.75, venular dilation 0.96), and CAIAR (AUC: arteriole tortuosity 0.92, venular dilation 0.91), each attempting to objectively analyze vessel tortuosity and dilation in plus disease in ROP. Some of them show promise for identification of plus disease using quantitative methods. This has the potential to improve the diagnosis of plus disease and may contribute to the management of ROP using both traditional binocular indirect ophthalmoscopy and image-based telemedicine approaches. PMID:21366159
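The tortuosity measures these systems compute vary, but the simplest common definition is the ratio of a vessel's path length to its straight-line chord. The sketch below implements that basic arc-to-chord ratio on hypothetical centerline points; the four systems named above use their own, more elaborate formulations.

```python
import math

def tortuosity_index(points):
    """Basic tortuosity: vessel centerline path length / straight chord length.
    Equals 1.0 for a straight vessel and grows with waviness. This is the
    simplest of several definitions used in retinal image analysis."""
    path = sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))
    chord = math.dist(points[0], points[-1])
    return path / chord

straight = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]          # straight segment
wavy = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.0), (3.0, 1.0), (4.0, 0.0)]
print(tortuosity_index(straight))   # 1.0
```

In a plus-disease pipeline, such an index would be computed per vessel segment after centerline extraction, then thresholded or fed to a classifier alongside dilation measures.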
Hammer, K A; Janes, F R
1995-01-01
The objectives for developing the participative method of subject definition were to gain all the relevant information to a high level of fidelity in the earliest stages of the work and so be able to build a realistic model at reduced labour cost. In order to better integrate the two activities--information acquisition and mathematical modelling--a procedure was devised using the methods of interactive management to facilitate teamwork. This procedure provided the techniques to create suitable working relationships between the two groups, the informants and the modellers, so as to maximize their free and accurate intercommunication, both during the initial definition of the linen service and during the monitoring of the accuracy and reality of the draft models. The objectives of this project were met in that the final model was quickly validated and approved, at a low labour cost.
Badran, Hani; Pluye, Pierre; Grad, Roland
2017-03-14
The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. 
In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. ©Hani Badran, Pierre Pluye, Roland Grad. Originally published in JMIR Medical Education (http://mededu.jmir.org), 14.03.2017.
Saliency-Guided Detection of Unknown Objects in RGB-D Indoor Scenes.
Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Xi, Ning
2015-08-27
This paper studies the problem of detecting unknown objects within indoor environments in an active and natural manner. The visual saliency scheme utilizing both color and depth cues is proposed to arouse the interests of the machine system for detecting unknown objects at salient positions in a 3D scene. The 3D points at the salient positions are selected as seed points for generating object hypotheses using the 3D shape. We perform multi-class labeling on a Markov random field (MRF) over the voxels of the 3D scene, combining cues from object hypotheses and 3D shape. The results from MRF are further refined by merging the labeled objects, which are spatially connected and have high correlation between color histograms. Quantitative and qualitative evaluations on two benchmark RGB-D datasets illustrate the advantages of the proposed method. The experiments of object detection and manipulation performed on a mobile manipulator validate its effectiveness and practicability in robotic applications.
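The final refinement step merges spatially connected labeled objects whose color histograms are highly correlated. A minimal sketch of that criterion, assuming a Pearson-style correlation between histograms and an illustrative threshold (the abstract does not give the exact formula or cutoff), might look like:

```python
import numpy as np

def histogram_correlation(h1, h2):
    """Pearson correlation between two color histograms.

    Values near 1 suggest the two spatially connected segments
    share a color distribution and are candidates for merging.
    """
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    d1 = h1 - h1.mean()
    d2 = h2 - h2.mean()
    return float((d1 * d2).sum() / np.sqrt((d1 ** 2).sum() * (d2 ** 2).sum()))

def should_merge(h1, h2, threshold=0.9):
    # `threshold` is an illustrative value, not taken from the paper.
    return histogram_correlation(h1, h2) > threshold
```

In the full pipeline this test would only be applied to label pairs already known to be spatially connected in the voxel grid.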
Wachowiak, Roman; Strach, Bogna
2006-01-01
The study takes advantage of the presently available effective physicochemical methods (isolation, crystallization, determination of melting point, TLC, GLC and UV spectrophotometry) for an objective and reliable qualitative and quantitative analysis of frequently abused drugs. The authors determined the conditions for qualitative and quantitative analysis of the active components of the secured evidence materials containing amphetamine sulphate, methylamphetamine hydrochloride, and 3,4-methylenedioxymethamphetamine hydrochloride (MDMA, Ecstasy), as well as delta(9)-tetrahydrocannabinol (delta(9)-THC) as an active component of cannabis (marihuana, hashish). The usefulness of physicochemical tests of evidence materials for expert-opinion purposes is subject to a detailed forensic toxicological interpretation.
Quantitative Petri net model of gene regulated metabolic networks in the cell.
Chen, Ming; Hofestädt, Ralf
2011-01-01
A method to exploit hybrid Petri nets (HPN) for quantitatively modeling and simulating gene regulated metabolic networks is demonstrated. A global kinetic modeling strategy and a Petri net modeling algorithm are applied to simulate the functioning of the bioprocess and analyze the model. With the model, the interrelations between pathway analysis and the metabolic control mechanism are outlined. Diagrammatical results of the dynamics of metabolites are simulated and observed by implementing an HPN tool, Visual Object Net ++. An explanation of the observed behavior of the urea cycle is proposed to indicate possibilities for metabolic engineering and medical care. Finally, the perspective of Petri nets on modeling and simulation of metabolic networks is discussed.
Machine learning for predicting the response of breast cancer to neoadjuvant chemotherapy
Mani, Subramani; Chen, Yukun; Li, Xia; Arlinghaus, Lori; Chakravarthy, A Bapsi; Abramson, Vandana; Bhave, Sandeep R; Levy, Mia A; Xu, Hua; Yankeelov, Thomas E
2013-01-01
Objective: To employ machine learning methods to predict the eventual therapeutic response of breast cancer patients after a single cycle of neoadjuvant chemotherapy (NAC). Materials and methods: Quantitative dynamic contrast-enhanced MRI and diffusion-weighted MRI data were acquired on 28 patients before and after one cycle of NAC. A total of 118 semiquantitative and quantitative parameters were derived from these data and combined with 11 clinical variables. We used Bayesian logistic regression in combination with feature selection using a machine learning framework for predictive model building. Results: The best predictive models using feature selection obtained an area under the curve of 0.86 and an accuracy of 0.86, with a sensitivity of 0.88 and a specificity of 0.82. Discussion: With the numerous options for NAC available, development of a method to predict response early in the course of therapy is needed. Unfortunately, by the time most patients are found not to be responding, their disease may no longer be surgically resectable, and this situation could be avoided by the development of techniques to assess response earlier in the treatment regimen. The method outlined here is one possible solution to this important clinical problem. Conclusions: Predictive modeling approaches based on machine learning using readily available clinical and quantitative MRI data show promise in distinguishing breast cancer responders from non-responders after the first cycle of NAC. PMID:23616206
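The reported 0.86 is the area under the ROC curve of the predictive model. As a small, hedged illustration of how such a number is computed from predicted scores (variable names here are hypothetical; this is not the paper's Bayesian model itself), the rank-statistic (Mann-Whitney) form of the AUC is:

```python
import numpy as np

def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic.

    labels: 1 for responders, 0 for non-responders;
    scores: model-predicted probability of response.
    AUC = fraction of (responder, non-responder) pairs in which
    the responder receives the higher score, counting ties as half.
    """
    labels = np.asarray(labels)
    scores = np.asarray(scores)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))
```

An AUC of 0.5 corresponds to chance-level ranking, 1.0 to perfect separation of responders from non-responders.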
Towards discrete wavelet transform-based human activity recognition
NASA Astrophysics Data System (ADS)
Khare, Manish; Jeon, Moongu
2017-06-01
Providing accurate recognition of human activities is a challenging problem for visual surveillance applications. In this paper, we present a simple and efficient algorithm for human activity recognition based on the wavelet transform. We adopt discrete wavelet transform (DWT) coefficients as features of human objects to obtain the advantages of its multiresolution approach. The proposed method is tested on multiple levels of the DWT. Experiments are carried out on different standard action datasets, including KTH and i3DPost. The proposed method is compared with other state-of-the-art methods in terms of different quantitative performance measures and is found to have better recognition accuracy.
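As a self-contained illustration of the kind of DWT coefficients used as features, one level of the 1-D Haar transform (the simplest wavelet; the abstract does not state which mother wavelet the authors use, so treat this as an assumed example) can be written as:

```python
import numpy as np

def haar_dwt_1level(x):
    """One level of the 1-D Haar discrete wavelet transform.

    Returns (approximation, detail) coefficients; in the activity-
    recognition setting such coefficients, computed over rows and
    columns of the object region, serve as multiresolution features.
    Even-length input is assumed.
    """
    x = np.asarray(x, dtype=float)
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2.0)  # low-pass (coarse) band
    detail = (even - odd) / np.sqrt(2.0)  # high-pass (edge) band
    return approx, detail
```

Applying the transform recursively to the approximation band yields the multiple decomposition levels the paper evaluates; the orthonormal scaling preserves signal energy across bands.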
Video-based noncooperative iris image segmentation.
Du, Yingzi; Arslanturk, Emrah; Zhou, Zhi; Belcher, Craig
2011-02-01
In this paper, we propose a video-based noncooperative iris image segmentation scheme that incorporates a quality filter to quickly eliminate images without an eye, employs a coarse-to-fine segmentation scheme to improve the overall efficiency, uses a direct least squares fitting of ellipses method to model the deformed pupil and limbic boundaries, and develops a window gradient-based method to remove noise in the iris region. A remote iris acquisition system is set up to collect noncooperative iris video images. An objective method is used to quantitatively evaluate the accuracy of the segmentation results. The experimental results demonstrate the effectiveness of this method. The proposed method would make noncooperative iris recognition or iris surveillance possible.
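The pupil and limbic boundaries are modeled with a direct least squares fit of ellipses, which solves a constrained conic-fitting problem. As a simplified, hedged sketch of the same algebraic-fitting idea, the classic Kåsa least-squares *circle* fit can be written in a few lines; the actual method fits general ellipses, so this stands in for the concept rather than reproducing the paper's algorithm:

```python
import numpy as np

def fit_circle(x, y):
    """Kasa least-squares circle fit.

    Rewrites x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)
    as a linear system in (cx, cy, c) and solves it by least squares.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return cx, cy, r
```

The appeal of this family of fits for iris segmentation is that they are non-iterative and tolerate the partial, noisy boundary samples typical of noncooperative video frames.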
Whitcombe, Anne; Cooper, Kay; Palmer, Emma
2016-06-01
The objective of this mixed methods systematic review is to examine the relationship between organizational culture and the health and wellbeing of hospital nurses, and to develop an aggregated synthesis of quantitative and qualitative systematic reviews to derive recommendations for policy and practice. Organizational culture comprises factors such as leadership, management and support, a health and safety oriented workplace climate and job characteristics. The quantitative component of this review will explore the relationship between organizational culture and the following outcomes in hospital nurses which may be indicators of health and wellbeing: work-related injury such as needlestick or sharp injuries, musculoskeletal injuries and conditions such as low back pain, burnout and general wellbeing. The qualitative component of this review will explore the perceptions of hospital nurses in relation to the impact of organizational culture on their own health and wellbeing and those of their nursing colleagues.
Design-based and model-based inference in surveys of freshwater mollusks
Dorazio, R.M.
1999-01-01
Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.
Nakano, Eri; Hata, Masayuki; Oishi, Akio; Miyamoto, Kazuaki; Uji, Akihito; Fujimoto, Masahiro; Miyata, Manabu; Yoshimura, Nagahisa
2016-08-01
The purpose was to investigate an objective and quantitative method to estimate the redness of the optic disc neuroretinal rim, and to determine the usefulness of this method in differentiating compressive optic neuropathy (CON) from glaucomatous optic neuropathy (GON). Our study included 126 eyes: 40 with CON, 40 with normal tension glaucoma (NTG), and 46 normal eyes (NOR). Digital color fundus photographs were assessed for the redness of disc rim color using ImageJ software. We separately measured the intensity of red, green, and blue pixels from RGB images. Three disc color indices (DCIs), which indicate the redness intensity, were calculated through existing formulas. All three DCIs of CON were significantly smaller than those of NOR (P < 0.001). In addition, when compared with NTG, DCIs were also significantly smaller in CON (P < 0.05). In a comparison of mild CON and mild NTG (mean deviation (MD) > -6 dB), in which the extent of retinal nerve fiber layer thinning is comparable, the DCIs of mild CON were significantly smaller than those of mild NTG (P < 0.05). In contrast, DCIs did not differ between moderate-to-severe stages of CON and NTG (MD ≤ -6 dB), though the retinal nerve fibers of CON were more severely damaged than those of NTG. For differentiating between mild CON and mild NTG, all AUROCs for the three DCIs were above 0.700. A quantitative and objective assessment of optic disc color was useful in differentiating early-stage CON from GON and NOR.
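The DCIs combine the mean red, green, and blue pixel intensities measured inside the rim region. The exact formulas are the ones cited in the paper; as a representative redness index of the common form R/(R+G+B) (an assumption for illustration, not necessarily one of the three DCIs used), one could compute:

```python
import numpy as np

def redness_index(rgb_region):
    """Mean relative red intensity of an RGB pixel region.

    rgb_region: array of shape (n_pixels, 3) with R, G, B columns,
    e.g. the pixels inside the neuroretinal rim. Smaller values
    correspond to a paler (less red) disc rim.
    """
    rgb = np.asarray(rgb_region, dtype=float)
    totals = rgb.sum(axis=1)
    return float((rgb[:, 0] / totals).mean())
```

The clinical finding is then a threshold question: rims in CON yield smaller index values than rims in NTG or normal eyes at a comparable disease stage.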
Riboli, Danilo Flávio Moraes; Lyra, João César; Silva, Eliane Pessoa; Valadão, Luisa Leite; Bentlin, Maria Regina; Corrente, José Eduardo; Rugolo, Ligia Maria Suppo de Souza; da Cunha, Maria de Lourdes Ribeiro de Souza
2014-05-22
Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. 
The semi-quantitative culture method showed higher sensitivity and specificity for the diagnosis of CR-BSIs in newborns when compared to the quantitative technique. In addition, this method is easier to perform and shows better agreement with the gold standard, and should therefore be recommended for routine clinical laboratory use. PFGE may contribute to the control of CR-BSIs by identifying clusters of microorganisms in neonatal ICUs, providing a means of determining potential cross-infection between patients.
A collimator optimization method for quantitative imaging: application to Y-90 bremsstrahlung SPECT.
Rong, Xing; Frey, Eric C
2013-08-01
Post-therapy quantitative 90Y bremsstrahlung single photon emission computed tomography (SPECT) has shown great potential to provide reliable activity estimates, which are essential for dose verification. Typically 90Y imaging is performed with high- or medium-energy collimators. However, the energy spectrum of 90Y bremsstrahlung photons is substantially different from the spectra these collimators are typically designed for. In addition, dosimetry requires quantitative images, and collimators are not typically optimized for such tasks. Optimizing a collimator for 90Y imaging is thus both novel and potentially important. Conventional optimization methods are not appropriate for 90Y bremsstrahlung photons, which have a continuous and broad energy distribution. In this work, the authors developed a parallel-hole collimator optimization method for quantitative tasks that is particularly applicable to radionuclides with complex emission energy spectra. The authors applied the proposed method to develop an optimal collimator for quantitative 90Y bremsstrahlung SPECT in the context of microsphere radioembolization. To account for the effects of the collimator on both the bias and the variance of the activity estimates, the authors used the root mean squared error (RMSE) of the volume of interest activity estimates as the figure of merit (FOM). In the FOM, the bias due to the null space of the image formation process was taken into account. The RMSE was weighted by the inverse mass to reflect the application to dosimetry; for a different application, more relevant weighting could easily be adopted. The authors proposed a parameterization for the collimator that facilitates the incorporation of the important factors (geometric sensitivity, geometric resolution, and septal penetration fraction) determining collimator performance, while keeping the number of free parameters describing the collimator small (i.e., two parameters).
To make the optimization results for quantitative 90Y bremsstrahlung SPECT more general, the authors simulated multiple tumors of various sizes in the liver. The authors realistically simulated human anatomy using a digital phantom and the image formation process using a previously validated and computationally efficient method for modeling the image-degrading effects including object scatter, attenuation, and the full collimator-detector response (CDR). The scatter kernels and CDR function tables used in the modeling method were generated using a previously validated Monte Carlo simulation code. The hole length, hole diameter, and septal thickness of the obtained optimal collimator were 84, 3.5, and 1.4 mm, respectively. Compared to a commercial high-energy general-purpose collimator, the optimal collimator improved the resolution and FOM by 27% and 18%, respectively. The proposed collimator optimization method may be useful for improving quantitative SPECT imaging for radionuclides with complex energy spectra. The obtained optimal collimator provided a substantial improvement in quantitative performance for the microsphere radioembolization task considered.
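The figure of merit weights each VOI's activity-estimate RMSE by inverse mass. One plausible reading of that FOM (a sketch under assumptions; the paper's exact normalization may differ) is:

```python
import numpy as np

def weighted_rmse_fom(estimates, truths, masses):
    """Inverse-mass-weighted RMSE over VOI activity estimates.

    estimates: repeated noisy activity estimates, shape (n_voi, n_reps),
    so the RMSE captures both bias and variance of the estimator;
    truths: true activities, shape (n_voi,);
    masses: VOI masses, shape (n_voi,).
    Lower values indicate a better collimator design.
    """
    est = np.asarray(estimates, dtype=float)
    err2 = (est - np.asarray(truths, dtype=float)[:, None]) ** 2
    rmse = np.sqrt(err2.mean(axis=1))        # per-VOI RMSE
    w = 1.0 / np.asarray(masses, dtype=float)  # inverse-mass weighting
    return float((w * rmse).sum() / w.sum())
```

The inverse-mass weighting reflects that activity errors in small structures translate into larger absorbed-dose errors, which is exactly the quantity dosimetry cares about.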
Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong
2013-01-01
Land-use planning has triggered debates on social and environmental values, in which two key questions are faced: one is how to see different planning simulation results instantaneously and apply the results back to interactively assist planning work; the other is how to ensure that the planning simulation result is scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches to land-use planning, and to find a way to overcome the gap that exists between the ability to construct computer simulation models to aid integrated land-use plan making and the demand for them by planning professionals. The study presented a theoretical framework of land-use planning based on the integration of the scenario analysis (SA) method and multiagent system (MAS) simulation, and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. Study results showed that the MAS simulation technique, emphasizing the quantitative process, effectively compensated for the SA method, emphasizing the qualitative process, which realized an organic combination of qualitative and quantitative land-use planning work and provided a new idea and method for land-use planning and the sustainable management of land resources.
Randhawa, Parmjeet S; Farasati, Noush A; Huang, Yuchen; Mapara, Markus Y; Shapiro, Ron
2010-12-01
Our objective was to determine whether quantitative polymerase chain reaction (PCR) can be used to measure the effect of tyrosine kinase (TK) inhibition on polyomavirus BK (BKV) replication. The BKV was grown in a cell culture system. The rate of viral replication in the presence or absence of the drug being tested was assessed by amplifying the viral genome using primers directed against the viral capsid 1 protein. Dasatinib, erlotinib, gefitinib, imatinib, sunitinib, and sorafenib all showed antiviral activity at micromolar concentrations. The 50% effective concentration for erlotinib and sorafenib was within blood concentrations readily achieved in human subjects. Quantitative PCR is a convenient method for viral drug sensitivity testing for slow-growing viruses that do not readily produce cytopathic effect. TK inhibitors deserve further consideration as a potential therapeutic option for BKV-associated nephropathy and hemorrhagic cystitis.
QPI for prostate cancer diagnosis: quantitative separation of Gleason grades 3 and 4
NASA Astrophysics Data System (ADS)
Sridharan, Shamira; Macias, Virgilia; Tangella, Krishnarao; Kajdacsy-Balla, Andre; Popescu, Gabriel
2015-03-01
One in 7 men receives a diagnosis of prostate cancer in his lifetime. The aggressiveness of the treatment plan adopted by the patient is strongly influenced by the Gleason grade, which is determined by the pathologist based on the level of glandular formation and complexity seen in the patient's biopsy. However, studies have shown that the disagreement rate between pathologists on Gleason grades 3 and 4 is high, and this affects treatment options. We used quantitative phase imaging to develop an objective method for Gleason grading. Using the glandular solidity, which is the ratio of the area of the gland to a convex hull fit around it, and the anisotropy of light scattered from the stroma immediately adjoining the gland, we were able to quantitatively separate Gleason grades 3 and 4 with 81% accuracy in 43 cases marked as difficult by pathologists.
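Glandular solidity, as defined above, is the gland area divided by the area of its convex hull; a fragmented or fused gland has solidity well below 1, a compact gland near 1. A dependency-free sketch of that measurement, treating the gland boundary as a polygon (the paper works on segmented image regions, so this is an illustrative simplification), could look like:

```python
import numpy as np

def polygon_area(pts):
    """Shoelace area of a polygon given as an (n, 2) vertex sequence."""
    x, y = np.asarray(pts, dtype=float).T
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def convex_hull(pts):
    """Andrew's monotone-chain convex hull, vertices in order."""
    pts = sorted(set(map(tuple, np.asarray(pts, dtype=float).tolist())))

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h

    lower, upper = half(pts), half(list(reversed(pts)))
    return np.array(lower[:-1] + upper[:-1])

def solidity(pts):
    """Gland solidity: boundary-polygon area over convex hull area."""
    return polygon_area(pts) / polygon_area(convex_hull(pts))
```

A convex boundary gives solidity exactly 1; concavities, typical of the irregular glands of higher Gleason grades, pull the value down.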
Portable smartphone based quantitative phase microscope
NASA Astrophysics Data System (ADS)
Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu
2018-01-01
To realize a portable device with high-contrast imaging capability, we designed a quantitative phase microscope based on a smartphone, using the transport of intensity equation method. The whole system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source. A 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; red blood cell smears, Pap smears, broad bean epidermis sections and monocot roots were then also measured to show its performance. Owing to its accuracy, high contrast, cost-effectiveness and portability, the portable smartphone-based quantitative phase microscope is a promising tool that could in future be adopted in remote healthcare and medical diagnosis.
Mari, João Fernando; Saito, José Hiroki; Neves, Amanda Ferreira; Lotufo, Celina Monteiro da Cruz; Destro-Filho, João-Batista; Nicoletti, Maria do Carmo
2015-12-01
Microelectrode Arrays (MEA) are devices for long-term electrophysiological recording of extracellular spontaneous or evoked activities of in vitro neuron cultures. This work proposes and develops a framework for quantitative and morphological analysis of neuron cultures on MEAs, by processing their corresponding images, acquired by fluorescence microscopy. The neurons are segmented from the fluorescence channel images using a combination of segmentation by thresholding, the watershed transform, and object classification. The positioning of microelectrodes is obtained from the transmitted light channel images using the circular Hough transform. The proposed method was applied to images of dissociated cultures of rat dorsal root ganglion (DRG) neuronal cells. The morphological and topological quantitative analysis carried out produced information regarding the state of the culture, such as population count, neuron-to-neuron and neuron-to-microelectrode distances, soma morphologies, neuron sizes, and neuron and microelectrode spatial distributions. Most analyses of microscopy images taken from neuronal cultures on MEAs consider only simple qualitative aspects. The proposed framework thus aims to standardize the image processing and to compute quantitatively useful measures for integrated image-signal studies and further computational simulations. As the results show, the implemented microelectrode identification method is robust, and so are the implemented neuron segmentation and classification methods (with a correct segmentation rate of up to 84%). The quantitative information retrieved by the method is highly relevant to assist the integrated signal-image study of recorded electrophysiological signals as well as the physical aspects of the neuron culture on the MEA. Although the experiments deal with DRG cell images, cortical and hippocampal cell images could also be processed with small adjustments in the image processing parameter estimation.
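The segmentation pipeline begins with thresholding of the fluorescence channel before the watershed transform. The abstract does not name the thresholding rule, so as an assumed but standard choice, Otsu's automatic threshold (which maximizes between-class variance of the intensity histogram) can be sketched as:

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Otsu's automatic threshold for a grayscale intensity array.

    Returns the intensity value that best separates background from
    (fluorescent) foreground by maximizing between-class variance.
    """
    counts, edges = np.histogram(img, bins=nbins)
    p = counts / counts.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)            # probability of class 0 up to each bin
    m = np.cumsum(p * centers)   # cumulative intensity mean
    mt = m[-1]                   # global mean
    num = (mt * w0 - m) ** 2
    denom = w0 * (1.0 - w0)
    var_between = np.zeros_like(num)
    valid = denom > 0
    var_between[valid] = num[valid] / denom[valid]
    return centers[np.argmax(var_between)]
```

Pixels above the threshold form the foreground mask that the watershed transform then splits into individual somata.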
Xiao, Xia; Lei, Kin Fong; Huang, Chia-Hao
2015-01-01
Cell migration is a cellular response and is involved in various biological processes such as cancer metastasis, the primary cause of death for cancer patients. Quantitative investigation of the correlation between cell migration and extracellular stimulation is essential for developing effective therapeutic strategies for controlling invasive cancer cells. The conventional method of determining the cell migration rate, based on comparison of successive images, may not be an objective approach. In this work, a microfluidic chip embedded with measurement electrodes has been developed to quantitatively monitor cell migration activity based on the impedimetric measurement technique. A damage-free wound was constructed by a microfluidic phenomenon, and cell migration activity under the stimulation of a cytokine and an anti-cancer drug, i.e., interleukin-6 and doxorubicin, respectively, was investigated. Impedance measurement was performed concurrently during the cell migration process. The impedance change was directly correlated with the cell migration activity; therefore, the migration rate could be calculated. In addition, a good match was found between impedance measurement and conventional imaging analysis, but the impedimetric measurement technique provides an objective and quantitative measurement. Based on our technique, cell migration rates were calculated to be 8.5, 19.1, and 34.9 μm/h under the stimulation of cytokine at concentrations of 0 (control), 5, and 10 ng/ml. This technique has high potential to be developed into a powerful analytical platform for cancer research. PMID:26180566
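The rates quoted in μm/h follow from how fast the wound gap closes. As a hedged arithmetic sketch (the paper derives the gap width from the impedance calibration; here the widths are hypothetical inputs), each cell front advances at half the rate the gap narrows, since cells migrate in from both sides:

```python
import numpy as np

def migration_rate(hours, gap_widths_um):
    """Cell migration rate (um/h per cell front) from gap-width samples.

    hours: sampling times; gap_widths_um: wound gap width at each time.
    Slope of the best-fit line is the gap-closure rate (negative);
    dividing by two attributes the closure to the two opposing fronts.
    """
    slope = np.polyfit(hours, gap_widths_um, 1)[0]  # um/h, negative
    return -slope / 2.0
```

A least-squares slope is used rather than a two-point difference so that measurement noise in individual samples averages out.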
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of a proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of the environmental impacts of selected constructions, namely flood protection structures, using risk analysis methods. The application of the methodology designed for the environmental impact assessment process will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process; through the use of risk analysis methods in that process, the set objective has been achieved. Highlights: This paper is informed by an effort to develop research with the aim of:
• Improving existing qualitative and quantitative methods for assessing the impacts
• A better understanding of relations between probabilities and consequences
• Methodology for the EIA of flood protection constructions based on risk analysis
• Creative approaches in the search for environmentally friendly proposed activities
Demidenko, Natalia V; Penin, Aleksey A
2012-01-01
qRT-PCR is a generally acknowledged method for gene expression analysis due to its precision and reproducibility. However, it is well known that the accuracy of qRT-PCR data varies greatly depending on the experimental design and data analysis. Recently, a set of guidelines has been proposed that aims to improve the reliability of qRT-PCR. However, there are additional factors that have not been taken into consideration in these guidelines that can seriously affect the data obtained using this method. In this study, we report the influence that object morphology can have on qRT-PCR data. We have used a number of Arabidopsis thaliana mutants with altered floral morphology as models for this study. These mutants have been well characterised (including in terms of gene expression levels and patterns) by other techniques. This allows us to compare the results from the qRT-PCR with the results inferred from other methods. We demonstrate that the comparison of gene expression levels in objects that differ greatly in their morphology can lead to erroneous results.
Elovic, Elie P; Simone, Lisa K; Zafonte, Ross
2004-01-01
The objectives of this article were to (1) review the engineering and medical literature to structure the available information concerning the assessment of spasticity in the neurological population; (2) discuss the strengths and weaknesses of the different methods currently in use in spasticity assessment; and (3) make recommendations for future efforts in spasticity outcome assessment. Spasticity textbooks, Web sites, and OVID, IEEE, and Medline searches from 1966 through 2003 for spasticity, quantitative measure, or outcome assessment in the rehabilitation population were used as data sources. Over 500 articles were reviewed. Articles that discussed outcome measures used to assess interventions and the evaluation of spasticity were included. The authors reviewed the articles for inclusion criteria, data collection, methodology, assessment methods, and conclusions, judging validity and relevance to this article. Clinical relevance, real-world function, lack of objectivity, and the time consumed during administration are important considerations in spasticity assessment. Some measures, such as the Ashworth Scale, remain in common use owing to ease of use despite their obvious functional limitations. More functional outcome measures are hampered by being more time consuming and by a general inability to demonstrate changes after an intervention; this may be secondary to the other factors that combine with spasticity to cause dysfunction at that level. Quantitative metrics can provide more objective measurements, but their clinical relevance is sometimes problematic. The assessment of spasticity outcomes remains somewhat problematic, and further work is necessary to develop measures that have real-world functional significance to both the individuals being treated and their clinicians. A lack of objectivity is still a problem. In the future it will be important for clinicians and engineers to work together on the development of better outcome measures.
Investigation of Portevin-Le Chatelier band with temporal phase analysis of speckle interferometry
NASA Astrophysics Data System (ADS)
Jiang, Zhenyu; Zhang, Qingchuan; Wu, Xiaoping
2003-04-01
A new method combining temporal phase analysis with dynamic digital speckle pattern interferometry is proposed to study the Portevin-Le Chatelier (PLC) effect quantitatively. The principle is that the phase difference between interference speckle patterns is a time-dependent function related to the object's deformation. The interference speckle patterns of the specimen are recorded at a high sampling rate while the PLC effect occurs, and the 2D displacement map of the PLC band and its width are obtained by analyzing the displacement of the specimen with the proposed method.
Advanced wave field sensing using computational shear interferometry
NASA Astrophysics Data System (ADS)
Falldorf, Claas; Agour, Mostafa; Bergmann, Ralf B.
2014-07-01
In this publication we give a brief introduction into the field of Computational Shear Interferometry (CoSI), which allows for determining arbitrary wave fields from a set of shear interferograms. We discuss limitations of the method with respect to the coherence of the underlying wave field and present various numerical methods to recover it from its sheared representations. Finally, we show experimental results on Digital Holography of objects with rough surface using a fiber coupled light emitting diode and quantitative phase contrast imaging as well as numerical refocusing in Differential Interference Contrast (DIC) microscopy.
Stoyanova, Detelina; Algee-Hewitt, Bridget F B; Slice, Dennis E
2015-11-01
The pubic symphysis is frequently used to estimate age-at-death from the adult skeleton. Assessment methods require the visual comparison of the bone morphology against age-informative characteristics that represent a series of phases. Age-at-death is then estimated from the age-range previously associated with the chosen phase. While easily executed, the "morphoscopic" process of feature-scoring and bone-to-phase-matching is known to be subjective. Studies of method and practitioner error demonstrate a need for alternative tools to quantify age-progressive change in the pubic symphysis. This article proposes a more objective, quantitative method that analyzes three-dimensional (3D) surface scans of the pubic symphysis using a thin plate spline algorithm (TPS). This algorithm models the bending of a flat plane to approximately match the surface of the bone and minimizes the bending energy required for this transformation. Known age-at-death and bending energy were used to construct a linear model to predict age from observed bending energy. This approach is tested with scans from 44 documented white male skeletons and 12 casts. The results of the surface analysis show a significant association (regression p-value = 0.0002 and coefficient of determination = 0.2270) between the minimum bending energy and age-at-death, with a root mean square error of ≈19 years. This TPS method yields estimates comparable to established methods but offers a fully integrated, objective and quantitative framework of analysis and has potential for use in archaeological and forensic casework. © 2015 Wiley Periodicals, Inc.
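As an illustration of the regression step described above, the sketch below fits a linear model predicting age-at-death from minimum bending energy and reports RMSE and the coefficient of determination. All values are synthetic; this is not the authors' code or data.

```python
import numpy as np

# Illustrative sketch (not the authors' code): the final step of the TPS
# approach fits a linear model predicting age-at-death from the minimum
# bending energy of the fitted surface. Values below are synthetic.
rng = np.random.default_rng(0)
bending_energy = rng.uniform(0.1, 2.0, 44)             # hypothetical TPS energies
age = 25 + 20 * bending_energy + rng.normal(0, 5, 44)  # synthetic ages

# Ordinary least squares: age = intercept + slope * energy
slope, intercept = np.polyfit(bending_energy, age, 1)
pred = intercept + slope * bending_energy
rmse = np.sqrt(np.mean((age - pred) ** 2))
r2 = 1 - np.sum((age - pred) ** 2) / np.sum((age - age.mean()) ** 2)
print(round(rmse, 1), round(r2, 2))
```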
NASA Astrophysics Data System (ADS)
Viswanath, Satish; Tiwari, Pallavi; Rosen, Mark; Madabhushi, Anant
2008-03-01
Recently, in vivo Magnetic Resonance Imaging (MRI) and Magnetic Resonance Spectroscopy (MRS) have emerged as promising new modalities to aid in prostate cancer (CaP) detection. MRI provides anatomic and structural information of the prostate while MRS provides functional data pertaining to biochemical concentrations of metabolites such as creatine, choline and citrate. We have previously presented a hierarchical clustering scheme for CaP detection on in vivo prostate MRS and have recently developed a computer-aided method for CaP detection on in vivo prostate MRI. In this paper we present a novel scheme to develop a meta-classifier to detect CaP in vivo via quantitative integration of multimodal prostate MRS and MRI by use of non-linear dimensionality reduction (NLDR) methods including spectral clustering and locally linear embedding (LLE). Quantitative integration of multimodal image data (MRI and PET) involves the concatenation of image intensities following image registration. However multimodal data integration is non-trivial when the individual modalities include spectral and image intensity data. We propose a data combination solution wherein we project the feature spaces (image intensities and spectral data) associated with each of the modalities into a lower dimensional embedding space via NLDR. NLDR methods preserve the relationships between the objects in the original high dimensional space when projecting them into the reduced low dimensional space. Since the original spectral and image intensity data are divorced from their original physical meaning in the reduced dimensional space, data at the same spatial location can be integrated by concatenating the respective embedding vectors. Unsupervised consensus clustering is then used to partition objects into different classes in the combined MRS and MRI embedding space. 
Quantitative results of our multimodal computer-aided diagnosis scheme on 16 sets of patient data obtained from the ACRIN trial, for which corresponding histological ground truth for spatial extent of CaP is known, show a marginally higher sensitivity, specificity, and positive predictive value compared to corresponding CAD results with the individual modalities.
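The data-combination idea above can be sketched as follows. PCA stands in here for the NLDR methods (LLE, spectral clustering) used in the paper, and all array names and dimensions are illustrative: each modality is embedded separately, then the embedding vectors at the same spatial location are concatenated.

```python
import numpy as np

# Schematic sketch of the data-combination idea, with PCA standing in for
# the NLDR methods (LLE / spectral embedding) used in the paper.
# Names and dimensions are illustrative, not from the ACRIN data.
rng = np.random.default_rng(1)
n_voxels = 200
mri_features = rng.normal(size=(n_voxels, 30))   # image-intensity features
mrs_features = rng.normal(size=(n_voxels, 128))  # spectral features

def embed(X, k=3):
    """Project rows of X onto their top-k principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data gives the principal axes in Vt
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

# Each modality is embedded separately; embeddings at the same spatial
# location are then concatenated into one joint feature vector, on which
# clustering can subsequently be performed.
joint = np.hstack([embed(mri_features), embed(mrs_features)])
print(joint.shape)  # (200, 6)
```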
Digital learning objects in nursing consultation: technology assessment by undergraduate students.
Silveira, DeniseTolfo; Catalan, Vanessa Menezes; Neutzling, Agnes Ludwig; Martinato, Luísa Helena Machado
2010-01-01
This study followed the teaching-learning process about the nursing consultation, based on digital learning objects developed through the active Problem Based Learning method. The goals were to evaluate the digital learning objects about nursing consultation, develop cognitive skills on the subject using problem based learning and identify the students' opinions on the use of technology. This is an exploratory and descriptive study with a quantitative approach. The sample consisted of 71 students in the sixth period of the nursing program at the Federal University of Rio Grande do Sul. The data was collected through a questionnaire to evaluate the learning objects. The results showed positive agreement (58%) on the content, usability and didactics of the proposed computer-mediated activity regarding the nursing consultation. The application of materials to the students is considered positive.
Intensity-based segmentation and visualization of cells in 3D microscopic images using the GPU
NASA Astrophysics Data System (ADS)
Kang, Mi-Sun; Lee, Jeong-Eom; Jeon, Woong-ki; Choi, Heung-Kook; Kim, Myoung-Hee
2013-02-01
3D microscopy images contain an enormous amount of data, rendering 3D microscopy image processing time-consuming and laborious on a central processing unit (CPU). To work around this, many users crop a region of interest (ROI) of the input image to a small size. Although this reduces cost and time, there are drawbacks at the image processing level: the selected ROI strongly depends on the user, and information from the original image is lost. To mitigate these problems, we developed a 3D microscopy image processing tool that runs on a graphics processing unit (GPU). Our tool provides a variety of efficient automatic thresholding methods for intensity-based segmentation of 3D microscopy images; users can select the algorithm to be applied. Further, the tool provides visualization of segmented volume data and supports scaling, translation, etc. using a keyboard and mouse. However, the rapidly visualized 3D objects still need to be analyzed to obtain information useful to biologists, and such analysis requires quantitative data. Therefore, we label the segmented 3D objects within all 3D microscopic images and obtain quantitative information on each labeled object; this information can be used as classification features. A user can select an object to be analyzed, and our tool displays the selected object in a new window so that more details can be observed. Finally, we validate the effectiveness of our tool by comparing CPU and GPU processing times under matched specifications and configurations.
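A minimal CPU-side sketch of the segmentation-and-labeling pipeline described above (the paper's tool runs on a GPU): an automatic Otsu threshold, connected-component labeling, and per-object voxel counts. The synthetic volume and all parameters are illustrative.

```python
import numpy as np
from scipy import ndimage

# Minimal sketch (CPU/NumPy, not the authors' GPU tool) of the pipeline the
# paper describes: automatic intensity thresholding, then labeling the
# segmented objects so per-object quantities can be measured.

def otsu_threshold(img, nbins=256):
    """Return the threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(img, bins=nbins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)            # cumulative class probability
    mu = np.cumsum(p * centers)  # cumulative class mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1 - w0))
    return centers[np.nanargmax(sigma_b)]

# Synthetic "volume": two bright blobs on a dark, noisy background
vol = np.zeros((20, 20, 20))
vol[2:6, 2:6, 2:6] = 1.0
vol[12:16, 12:16, 12:16] = 1.0
vol += np.random.default_rng(2).normal(0, 0.05, vol.shape)

mask = vol > otsu_threshold(vol)
labels, n_objects = ndimage.label(mask)                     # connected components
sizes = ndimage.sum(mask, labels, range(1, n_objects + 1))  # voxels per object
print(n_objects)  # 2
```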
Statistical procedures for evaluating daily and monthly hydrologic model predictions
Coffey, M.E.; Workman, S.R.; Taraba, J.L.; Fogle, A.W.
2004-01-01
The overall study objective was to evaluate the applicability of different qualitative and quantitative methods for comparing daily and monthly SWAT computer model hydrologic streamflow predictions to observed data, and to recommend statistical methods for use in future model evaluations. Statistical methods were tested using daily streamflows and monthly equivalent runoff depths. The statistical techniques included linear regression, Nash-Sutcliffe efficiency, nonparametric tests, t-test, objective functions, autocorrelation, and cross-correlation. None of the methods specifically accounted for the non-normal distribution of, and dependence between, data points in the daily predicted and observed data. Of the tested methods, median objective functions, the sign test, autocorrelation, and cross-correlation were most applicable for the daily data. The robust coefficient of determination (CD*) and robust modeling efficiency (EF*) objective functions were the preferred methods for daily model results due to the ease of comparing these values with a fixed ideal reference value of one. Predicted and observed monthly totals were more normally distributed, and there was less dependence between individual monthly totals than was observed for the corresponding predicted and observed daily values. More statistical methods were available for comparing SWAT model-predicted and observed monthly totals. The 1995 monthly SWAT model predictions and observed data had a regression R² of 0.70, a Nash-Sutcliffe efficiency of 0.41, and the t-test failed to reject the equal-data-means hypothesis. The Nash-Sutcliffe coefficient and the R² coefficient were the preferred methods for monthly results due to the ability to compare these coefficients to a set ideal value of one.
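Two of the reviewed statistics can be computed directly; the sketch below implements the Nash-Sutcliffe efficiency and R² for a predicted/observed series. The streamflow values are invented for illustration, not from the study.

```python
import numpy as np

# Sketch of two of the reviewed statistics: the Nash-Sutcliffe efficiency (NSE)
# and the coefficient of determination (R^2) for predicted vs. observed flows.
# Both equal 1 for a perfect model; the values below are illustrative.

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    r = np.corrcoef(obs, sim)[0, 1]  # Pearson correlation
    return r ** 2

observed  = [3.1, 2.8, 4.0, 6.5, 5.2, 3.9]   # e.g. monthly runoff depths
predicted = [2.9, 3.0, 4.4, 6.1, 5.5, 3.5]
print(round(nash_sutcliffe(observed, predicted), 2),
      round(r_squared(observed, predicted), 2))
```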
ERIC Educational Resources Information Center
Tuzlak, Kadir
2016-01-01
The overall objective of this study is to increase the awareness of secondary school students of the effects of pollen allergy on human health by mapping the allergenic pollens appearing in the Burdur atmosphere. The study uses a pre-test and post-test experimental design with a mixed-methods approach: both qualitative and quantitative data are gathered. The…
Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo
2014-01-01
This paper took a subregion in a small watershed gully system at Beiyanzikou catchment of Qixia, China, as the study area and, using object-orientated image analysis (OBIA), extracted the shoulder lines of gullies from high spatial resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the adjacent distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder lines of the gullies. Finally, the original surface was fitted using linear regression in accordance with the elevation of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract information on gullies; the average range difference between points field-measured along the edges of the gullies and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m2, 5074.1790 m3 and 1316.1250 m2, 1591.5784 m3, respectively. The results of the study provide a new method for the quantitative study of small gully erosion. PMID:24616626
Context-specific metabolic networks are consistent with experiments.
Becker, Scott A; Palsson, Bernhard O
2008-05-16
Reconstructions of cellular metabolism are publicly available for a variety of different microorganisms and some mammalian genomes. To date, these reconstructions are "genome-scale" and strive to include all reactions implied by the genome annotation, as well as those with direct experimental evidence. Clearly, many of the reactions in a genome-scale reconstruction will not be active under particular conditions or in a particular cell type. Methods to tailor these comprehensive genome-scale reconstructions into context-specific networks will aid predictive in silico modeling for a particular situation. We present a method called Gene Inactivity Moderated by Metabolism and Expression (GIMME) to achieve this goal. The GIMME algorithm uses quantitative gene expression data and one or more presupposed metabolic objectives to produce the context-specific reconstruction that is most consistent with the available data. Furthermore, the algorithm provides a quantitative inconsistency score indicating how consistent a set of gene expression data is with a particular metabolic objective. We show that this algorithm produces results consistent with biological experiments and intuition for adaptive evolution of bacteria, rational design of metabolic engineering strains, and human skeletal muscle cells. This work represents progress towards producing constraint-based models of metabolism that are specific to the conditions where the expression profiling data is available.
Liu, L; Kan, A; Leckie, C; Hodgkin, P D
2017-04-01
Time-lapse fluorescence microscopy is a valuable technology in cell biology, but it suffers from the inherent problem of intensity inhomogeneity due to uneven illumination or camera nonlinearity, known as shading artefacts. This will lead to inaccurate estimates of single-cell features such as average and total intensity. Numerous shading correction methods have been proposed to remove this effect. In order to compare the performance of different methods, many quantitative performance measures have been developed. However, there is little discussion about which performance measure should be generally applied for evaluation on real data, where the ground truth is absent. In this paper, the state-of-the-art shading correction methods and performance evaluation methods are reviewed. We implement 10 popular shading correction methods on two artificial datasets and four real ones. In order to make an objective comparison between those methods, we employ a number of quantitative performance measures. Extensive validation demonstrates that the coefficient of joint variation (CJV) is the most applicable measure in time-lapse fluorescence images. Based on this measure, we have proposed a novel shading correction method that performs better compared to well-established methods for a range of real data tested. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
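The paper's preferred no-ground-truth measure, the coefficient of joint variation (CJV), can be sketched as follows, assuming the common definition CJV = (σ_f + σ_b)/|μ_f − μ_b| over foreground and background regions (lower is better). The synthetic data mimic a multiplicative shading gradient.

```python
import numpy as np

# Sketch of the coefficient of joint variation (CJV), the measure the paper
# finds most applicable for evaluating shading correction without ground truth.
# Assumed definition: CJV = (sigma_f + sigma_b) / |mu_f - mu_b| over
# foreground/background pixels; lower means less intensity inhomogeneity.

def cjv(foreground, background):
    f, b = np.asarray(foreground, float), np.asarray(background, float)
    return (f.std() + b.std()) / abs(f.mean() - b.mean())

rng = np.random.default_rng(3)
# "Shaded" image: a multiplicative gradient inflates within-class spread
gradient = np.linspace(0.5, 1.5, 1000)
fg_shaded = 100 * gradient + rng.normal(0, 2, 1000)
bg_shaded = 20 * gradient + rng.normal(0, 2, 1000)
# "Corrected" image: gradient removed, only sensor noise remains
fg_flat = 100 + rng.normal(0, 2, 1000)
bg_flat = 20 + rng.normal(0, 2, 1000)

print(cjv(fg_shaded, bg_shaded) > cjv(fg_flat, bg_flat))  # True
```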
Assessment methods for the evaluation of vitiligo.
Alghamdi, K M; Kumar, A; Taïeb, A; Ezzedine, K
2012-12-01
There is no standardized method for assessing vitiligo. In this article, we review the literature from 1981 to 2011 on different vitiligo assessment methods. We aim to classify the techniques available for vitiligo assessment as subjective, semi-objective or objective; microscopic or macroscopic; and as based on morphometry or colorimetry. Macroscopic morphological measurements include visual assessment, photography in natural or ultraviolet light, photography with computerized image analysis and tristimulus colorimetry or spectrophotometry. Non-invasive micromorphological methods include confocal laser microscopy (CLM). Subjective methods include clinical evaluation by a dermatologist and a vitiligo disease activity score. Semi-objective methods include the Vitiligo Area Scoring Index (VASI) and point-counting methods. Objective methods include software-based image analysis, tristimulus colorimetry, spectrophotometry and CLM. Morphometry is the measurement of the vitiliginous surface area, whereas colorimetry quantitatively analyses skin colour changes caused by erythema or pigment. Most methods involve morphometry, except for the chromameter method, which assesses colorimetry. Some image analysis software programs can assess both morphometry and colorimetry. The details of these programs (Corel Draw, Image Pro Plus, AutoCad and Photoshop) are discussed in the review. Reflectance confocal microscopy provides real-time images and has great potential for the non-invasive assessment of pigmentary lesions. In conclusion, there is no single best method for assessing vitiligo. This review revealed that VASI, the rule of nine and Wood's lamp are likely to be the best techniques available for assessing the degree of pigmentary lesions and measuring the extent and progression of vitiligo in the clinic and in clinical trials. © 2012 The Authors. Journal of the European Academy of Dermatology and Venereology © 2012 European Academy of Dermatology and Venereology.
Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.
2014-01-01
Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139
Automatic trajectory measurement of large numbers of crowded objects
NASA Astrophysics Data System (ADS)
Li, Hui; Liu, Ye; Chen, Yan Qiu
2013-06-01
Complex motion patterns of natural systems, such as fish schools, bird flocks, and cell groups, have attracted great attention from scientists for years. Trajectory measurement of individuals is vital for quantitative and high-throughput study of their collective behaviors. However, such data are rare mainly due to the challenges of detection and tracking of large numbers of objects with similar visual features and frequent occlusions. We present an automatic and effective framework to measure trajectories of large numbers of crowded oval-shaped objects, such as fish and cells. We first use a novel dual ellipse locator to detect the coarse position of each individual and then propose a variance minimization active contour method to obtain the optimal segmentation results. For tracking, cost matrix of assignment between consecutive frames is trainable via a random forest classifier with many spatial, texture, and shape features. The optimal trajectories are found for the whole image sequence by solving two linear assignment problems. We evaluate the proposed method on many challenging data sets.
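The final tracking step, solving a linear assignment problem between consecutive frames, can be sketched with a plain distance-based cost matrix (the paper learns its costs with a random forest over spatial, texture, and shape features):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Sketch of the tracking step: objects detected in consecutive frames are
# linked by solving a linear assignment problem over a cost matrix.
# Here the cost is simple Euclidean distance; in the paper the cost is
# learned by a random forest over spatial, texture, and shape features.

frame_a = np.array([[0.0, 0.0], [5.0, 5.0], [10.0, 0.0]])  # positions at t
frame_b = np.array([[10.2, 0.1], [0.3, 0.2], [5.1, 4.8]])  # positions at t+1

cost = np.linalg.norm(frame_a[:, None, :] - frame_b[None, :, :], axis=2)
rows, cols = linear_sum_assignment(cost)  # minimal-cost one-to-one matching
print(list(cols))  # [1, 2, 0]: object i at time t matches object cols[i] at t+1
```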
Discomfort Evaluation of Truck Ingress/Egress Motions Based on Biomechanical Analysis
Choi, Nam-Chul; Lee, Sang Hun
2015-01-01
This paper presents a quantitative discomfort evaluation method based on biomechanical analysis results for human body movement, as well as its application to an assessment of the discomfort for truck ingress and egress. In this study, the motions of a human subject entering and exiting truck cabins with different types, numbers, and heights of footsteps were first measured using an optical motion capture system and load sensors. Next, the maximum voluntary contraction (MVC) ratios of the muscles were calculated through a biomechanical analysis of the musculoskeletal human model for the captured motion. Finally, the objective discomfort was evaluated using the proposed discomfort model based on the MVC ratios. To validate this new discomfort assessment method, human subject experiments were performed to investigate the subjective discomfort levels through a questionnaire for comparison with the objective discomfort levels. The validation results showed that the correlation between the objective and subjective discomforts was significant and could be described by a linear regression model. PMID:26067194
Latychevskaia, T; Chushkin, Y; Fink, H-W
2016-10-01
In coherent diffractive imaging, the resolution of the reconstructed object is limited by the numerical aperture of the experimental setup. We present here a theoretical and numerical study for achieving super-resolution by postextrapolation of coherent diffraction images, such as diffraction patterns or holograms. We demonstrate that a diffraction pattern can unambiguously be extrapolated from only a fraction of the entire pattern and that the ratio of the extrapolated signal to the originally available signal is linearly proportional to the oversampling ratio. Although there could be in principle other methods to achieve extrapolation, we devote our discussion to employing iterative phase retrieval methods and demonstrate their limits. We present two numerical studies; namely, the extrapolation of diffraction patterns of nonbinary and that of phase objects together with a discussion of the optimal extrapolation procedure. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
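A toy version of the iterative phase retrieval the authors employ can be sketched in 1D with Fienup-style error reduction, alternating the measured-magnitude constraint with a support/positivity constraint. This only illustrates the iteration, not the paper's 2D extrapolation procedure.

```python
import numpy as np

# Toy sketch of iterative phase retrieval of the kind the paper employs:
# Fienup-style error reduction on a 1D signal, alternating a Fourier-magnitude
# constraint with a real, non-negative support constraint. The paper works
# with 2D diffraction patterns; this only illustrates the iteration itself.

rng = np.random.default_rng(4)
n = 64
support = np.zeros(n, bool)
support[:16] = True                       # object confined to 16 samples
obj = np.zeros(n)
obj[:16] = rng.uniform(0.5, 1.0, 16)      # unknown object
measured_mag = np.abs(np.fft.fft(obj))    # "measured" diffraction magnitudes

def fourier_error(x):
    """Residual between the estimate's Fourier magnitude and the measurement."""
    return np.linalg.norm(np.abs(np.fft.fft(x)) - measured_mag)

x = rng.uniform(0, 1, n) * support        # random start inside the support
err0 = fourier_error(x)
for _ in range(200):
    F = np.fft.fft(x)
    F = measured_mag * np.exp(1j * np.angle(F))   # impose measured magnitude
    x = np.fft.ifft(F).real
    x = np.where(support & (x > 0), x, 0.0)       # impose support & positivity
print(fourier_error(x) <= err0)  # error reduction is non-increasing
```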
Frame sequences analysis technique of linear objects movement
NASA Astrophysics Data System (ADS)
Oshchepkova, V. Y.; Berg, I. A.; Shchepkin, D. V.; Kopylova, G. V.
2017-12-01
Obtaining data by noninvasive methods is often needed in many fields of science and engineering, and is achieved through video recording at various frame rates and in various light spectra. In doing so, quantitative analysis of the movement of the objects being studied becomes an important component of the research. This work discusses the analysis of the motion of linear objects in the two-dimensional plane; the complexity of this problem increases when the frame contains numerous objects whose images may overlap. This study uses a sequence of 30 frames at a resolution of 62 × 62 pixels and a frame rate of 2 Hz, from which the average velocity of object motion was to be determined. This velocity was found as an average over 8-12 objects with an error of 15%. After processing, dependencies of the average velocity on the control parameters were found. The processing was performed in the software environment GMimPro, with subsequent approximation of the obtained data using the Hill equation.
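The final approximation step, fitting the measured dependence with the Hill equation, might look like the following sketch; the functional form v(c) = v_max·c^h/(K^h + c^h) and all data points are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of the final processing step described: approximating the measured
# velocity-vs-control-parameter dependence with the Hill equation.
# The parameterization v(c) = v_max * c^h / (K^h + c^h) and all data
# are assumptions for illustration, not the study's measurements.

def hill(c, v_max, K, h):
    return v_max * c**h / (K**h + c**h)

conc = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0])  # control parameter
rng = np.random.default_rng(5)
velocity = hill(conc, 4.0, 1.0, 2.0) + rng.normal(0, 0.05, conc.size)

params, _ = curve_fit(hill, conc, velocity, p0=[3.0, 1.0, 1.5])
v_max, K, h = params
print(round(v_max, 1), round(K, 1), round(h, 1))
```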
High-Throughput Quantitation of Neonicotinoids in Lyophilized Surface Water by LC-APCI-MS/MS.
Morrison, Lucas M; Renaud, Justin B; Sabourin, Lyne; Sumarah, Mark W; Yeung, Ken K C; Lapen, David R
2018-05-21
Background: Neonicotinoids are among the most widely used insecticides. Recently, there has been concern associated with unintended adverse effects on honeybees and aquatic invertebrates at low parts-per-trillion levels. Objective: There is a need for LC-MS/MS methods that are capable of high-throughput measurements of the most widely used neonicotinoids at environmentally relevant concentrations in surface water. Methods: This method allows for quantitation of acetamiprid, clothianidin, imidacloprid, dinotefuran, nitenpyram, thiacloprid, and thiamethoxam in surface water. Deuterated internal standards are added to 20 mL environmental samples, which are concentrated by lyophilization and reconstituted with methanol followed by acetonitrile. Results: A large variation of mean recovery efficiencies across the five surface water sampling sites within this study was observed, ranging from 45 to 74%; this demonstrated the need for labelled internal standards to compensate for these differences. Atmospheric pressure chemical ionization (APCI) performed better than electrospray ionization (ESI), with limited matrix suppression, achieving 71-110% of the laboratory fortified blank signal. Neonicotinoids were resolved on a C18 column using a 5 min LC method, with method quantitation limits (MQLs) ranging between 0.93 and 4.88 ng/L. Conclusions: This method enables cost-effective, accurate, and reproducible monitoring of these pesticides in the aquatic environment. Highlights: Lyophilization is used for high-throughput concentration of neonicotinoids in surface water. Variations in matrix effects between samples were greatly reduced by using APCI compared with ESI. Clothianidin and thiamethoxam were detected in all samples, with levels ranging from below the method quantitation limit to 65 ng/L.
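The role of the deuterated internal standards can be illustrated with a small arithmetic sketch: because the analyte and the co-spiked standard are attenuated by the same per-sample recovery, the peak-area ratio (and thus the reported concentration) is unaffected. All numbers below are invented.

```python
# Arithmetic sketch of internal-standard quantitation as described: a
# deuterated standard added before lyophilization compensates for per-sample
# recovery differences. All numbers are illustrative, not from the study.

def quantify(analyte_area, istd_area, istd_conc_ng_l, response_factor=1.0):
    """Concentration from the analyte/internal-standard peak-area ratio."""
    return (analyte_area / istd_area) * istd_conc_ng_l / response_factor

# Two samples with different recoveries (45% vs 74%): both the analyte and
# the co-spiked internal standard are attenuated by the same factor, so the
# area ratio, and hence the reported concentration, is unaffected.
true_conc, istd_conc = 50.0, 100.0            # ng/L
for recovery in (0.45, 0.74):
    analyte_area = 1e5 * true_conc * recovery  # hypothetical peak areas
    istd_area = 1e5 * istd_conc * recovery
    print(quantify(analyte_area, istd_area, istd_conc))  # 50.0 both times
```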
Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T
2016-01-01
The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms.
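The core of the quantification idea, thresholding stain intensity in a photograph and measuring the stained fraction of the surface, can be sketched as follows (the paper's algorithm is more elaborate; the threshold and synthetic image here are illustrative):

```python
import numpy as np

# Minimal sketch of the image-analysis idea: after broad-spectrum staining,
# biofilm accumulation is scored from a digital photograph by thresholding
# stain intensity and measuring the stained fraction of the surface.
# The threshold and synthetic "photograph" below are illustrative only.

rng = np.random.default_rng(6)
img = rng.uniform(0, 0.2, (100, 100))                # clean surface: weak signal
img[20:60, 30:80] = rng.uniform(0.6, 1.0, (40, 50))  # stained biofilm patch

mask = img > 0.4          # stained pixels
coverage = mask.mean()    # fraction of the surface covered by biofilm
print(round(coverage, 2))  # 0.2  (40*50 of 10000 pixels)
```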
Leaner and greener analysis of cannabinoids.
Mudge, Elizabeth M; Murch, Susan J; Brown, Paula N
2017-05-01
There is an explosion in the number of labs analyzing cannabinoids in marijuana (Cannabis sativa L., Cannabaceae) but existing methods are inefficient, require expert analysts, and use large volumes of potentially environmentally damaging solvents. The objective of this work was to develop and validate an accurate method for analyzing cannabinoids in cannabis raw materials and finished products that is more efficient and uses fewer toxic solvents. An HPLC-DAD method was developed for eight cannabinoids in cannabis flowers and oils using a statistically guided optimization plan based on the principles of green chemistry. A single-laboratory validation determined the linearity, selectivity, accuracy, repeatability, intermediate precision, limit of detection, and limit of quantitation of the method. Amounts of individual cannabinoids above the limit of quantitation in the flowers ranged from 0.02 to 14.9% w/w, with repeatability ranging from 0.78 to 10.08% relative standard deviation. The intermediate precision determined using HorRat ratios ranged from 0.3 to 2.0. The LOQs for individual cannabinoids in flowers ranged from 0.02 to 0.17% w/w. This is a significant improvement over previous methods and is suitable for a wide range of applications including regulatory compliance, clinical studies, direct patient medical services, and commercial suppliers.
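Two of the reported validation statistics can be reproduced arithmetically: percent relative standard deviation (%RSD) of replicate measurements and the HorRat ratio, assuming the standard Horwitz predicted PRSD = 2·C^(−0.1505) with C a mass fraction. The replicate values below are invented.

```python
import numpy as np

# Sketch of two validation statistics used in the single-laboratory study:
# percent relative standard deviation (%RSD) of replicates and the HorRat
# ratio, %RSD divided by the Horwitz predicted %RSD, assumed here as
# PRSD = 2 * C**(-0.1505) with C a mass fraction. Replicates are synthetic.

def percent_rsd(values):
    values = np.asarray(values, float)
    return 100 * values.std(ddof=1) / values.mean()

def horrat(values, mass_fraction):
    prsd = 2 * mass_fraction ** (-0.1505)  # Horwitz predicted %RSD
    return percent_rsd(values) / prsd

replicates = [5.02, 4.87, 5.10, 4.95, 5.06]  # % w/w of a cannabinoid, invented
c = 0.05                                     # 5% w/w as a mass fraction
print(round(percent_rsd(replicates), 2), round(horrat(replicates, c), 2))
# 1.83 0.58
```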
Pirat, Bahar; Khoury, Dirar S.; Hartley, Craig J.; Tiller, Les; Rao, Liyun; Schulz, Daryl G.; Nagueh, Sherif F.; Zoghbi, William A.
2012-01-01
Objectives: The aim of this study was to validate a novel, angle-independent, feature-tracking method for the echocardiographic quantitation of regional function. Background: A new echocardiographic method, Velocity Vector Imaging (VVI) (syngo Velocity Vector Imaging technology, Siemens Medical Solutions, Ultrasound Division, Mountain View, California), has been introduced, based on feature tracking (incorporating speckle and endocardial border tracking) that allows the quantitation of endocardial strain, strain rate (SR), and velocity. Methods: Seven dogs were studied at baseline and during various interventions causing alterations in regional function: dobutamine, a 5-min coronary occlusion with reperfusion up to 1 h, followed by dobutamine and esmolol infusions. Echocardiographic images were acquired from short- and long-axis views of the left ventricle. Segment-length sonomicrometry crystals were used as the reference method. Results: Changes in systolic strain in ischemic segments were tracked well with VVI during the different states of regional function. There was a good correlation between circumferential and longitudinal systolic strain by VVI and sonomicrometry (r = 0.88 and r = 0.83, respectively, p < 0.001). Strain measurements in the nonischemic basal segments also demonstrated a significant correlation between the 2 methods (r = 0.65, p < 0.001). Similarly, a significant relation was observed for circumferential and longitudinal SR between the 2 methods (r = 0.94, p < 0.001 and r = 0.90, p < 0.001, respectively). The relation of endocardial velocity to changes in strain by sonomicrometry was weaker owing to significant cardiac translation. Conclusions: Velocity Vector Imaging, a new feature-tracking method, can accurately assess regional myocardial function at the endocardial level and is a promising clinical tool for the simultaneous quantification of regional and global myocardial function. PMID:18261685
SU-F-207-16: CT Protocols Optimization Using Model Observer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tseng, H; Fan, J; Kupinski, M
2015-06-15
Purpose: To quantitatively evaluate the performance of different CT protocols using task-based measures of image quality. This work studies the task of estimating the size and contrast of different iodine concentration rods inserted in head- and body-sized phantoms using different imaging protocols. These protocols are designed to deliver the same dose level (CTDIvol) but use different X-ray tube voltage settings (kVp). Methods: Different concentrations of iodine objects inserted in a head-size phantom and a body-size phantom are imaged on a 64-slice commercial CT scanner. Scanning protocols with various tube voltages (80, 100, and 120 kVp) and current settings are selected that output the same absorbed dose level (CTDIvol). Because the phantom design (size of the iodine objects, the air gap between the inserted objects and the phantom) is not ideal for a model observer study, the acquired CT images are used to generate simulated images with four different sizes and five different contrasts of iodine objects. For each type of object, 500 images (100 x 100 pixels) are generated for the observer study. The observer selected in this study is the channelized scanning linear observer, which can be applied to estimate both size and contrast. The figure of merit used is the correct estimation ratio. The mean and the variance are estimated by the shuffle method. Results: The results indicate that the protocols with the 100 kVp tube voltage setting provide the best performance for iodine insert size and contrast estimation for both head and body phantom cases. Conclusion: This work presents a practical and robust quantitative approach using the channelized scanning linear observer to study contrast and size estimation performance from different CT protocols. Different protocols at the same CTDIvol setting can result in different image quality performance. The relationship between the absorbed dose and the diagnostic image quality is not linear.
Integrating planning perception and action for informed object search.
Manso, Luis J; Gutierrez, Marco A; Bustos, Pablo; Bachiller, Pilar
2018-05-01
This paper presents a method to reduce the time spent by a robot with cognitive abilities when looking for objects in unknown locations. It describes how machine learning techniques can be used to decide which places should be inspected first, based on images that the robot acquires passively. The proposal is composed of two concurrent processes. The first one uses the aforementioned images to generate a description of the types of objects found in each object container seen by the robot. This is done passively, regardless of the task being performed. The containers can be tables, boxes, shelves or any other kind of container of known shape whose contents can be seen from a distance. The second process uses the previously computed estimation of the contents of the containers to decide which container is most likely to hold the object to be found. This second process is deliberative and takes place only when the robot needs to find an object, whether because it is explicitly asked to locate one or because it is needed as a step to fulfil the mission of the robot. Upon failure to guess the right container, the robot can continue making guesses until the object is found. Guesses are made based on the semantic distance between the object to find and the description of the types of the objects found in each object container. The paper provides quantitative results comparing the efficiency of the proposed method and two baseline approaches.
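The deliberative guessing step can be sketched as ranking containers by the semantic similarity between the target object and each container's observed contents. The embeddings, container names, and scoring rule below are hypothetical stand-ins for illustration, not the authors' actual representation:

```python
# Sketch: rank object containers by semantic similarity to the target object.
# Embeddings here are toy 2-D vectors; a real system would use learned ones.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def rank_containers(target_vec, containers):
    """containers: {name: [embeddings of observed contents]}.
    Returns container names ordered best guess first."""
    scores = {
        name: max(cosine_similarity(target_vec, v) for v in vecs)
        for name, vecs in containers.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# A mug (target) is semantically closer to kitchen-table contents.
containers = {
    "kitchen_table": [[0.9, 0.1], [0.8, 0.3]],
    "bookshelf": [[0.1, 0.95], [0.2, 0.9]],
}
guesses = rank_containers([0.85, 0.2], containers)
```

On failure, the robot would simply proceed to the next name in `guesses`, mirroring the retry behaviour described in the abstract.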
A new approach to the identification of Landscape Quality Objectives (LQOs) as a set of indicators.
Sowińska-Świerkosz, Barbara Natalia; Chmielewski, Tadeusz J
2016-12-15
The objective of the paper is threefold: (1) to introduce Landscape Quality Objectives (LQOs) as a set of indicators; (2) to present a method of linking social and expert opinion in the process of the formulation of landscape indicators; and (3) to present a methodological framework for the identification of LQOs. The implementation of these goals adopted a six-stage procedure based on the use of landscape units: (1) GIS analysis; (2) classification; (3) social survey; (4) expert value judgement; (5) quality assessment; and (6) guidelines formulation. The essence of the research was the presentation of features that determine landscape quality according to public opinion as a set of indicators. The results showed that 80 such indicators were identified, of both a qualitative (49) and a quantitative character (31). Among the analysed units, 60% (18 objects) featured socially expected (and confirmed by experts) levels of landscape quality, and 20% (6 objects) required overall quality improvement in terms of both public and expert opinion. The adopted procedure provides a new tool for integrating social responsibility into environmental management. The advantage of the presented method is the possibility of its application in the territories of various European countries. It is flexible enough to be based on cartographic studies, landscape research methods, and environmental quality standards existing in a given country. Copyright © 2016 Elsevier Ltd. All rights reserved.
Quantitative phenotyping of X-disease resistance in chokecherry using real-time PCR.
Huang, Danqiong; Walla, James A; Dai, Wenhao
2014-03-01
A quantitative real-time SYBR Green PCR (qPCR) assay has been developed to detect and quantify X-disease phytoplasmas in chokecherry. An X-disease phytoplasma-specific and high sensitivity primer pair was designed based on the 16S rRNA gene sequence of X-disease phytoplasmas. This primer pair was specific to the 16SrIII group (X-disease) phytoplasmas. The qPCR method can quantify phytoplasmas from a DNA mix (a mix of both chokecherry and X-disease phytoplasma DNA) at as low as 0.001 ng, 10-fold lower than conventional PCR using the same primer pair. A significant correlation between the copy number of phytoplasmas and visual phenotypic rating scores of X-disease resistance in chokecherry plants was observed. Disease resistant chokecherries had a significantly lower titer of X-disease phytoplasmas than susceptible plants. This suggests that the qPCR assay provides a more objective tool to phenotype phytoplasma disease severity, particularly for early evaluation of host resistance; therefore, this method will facilitate quantitative phenotyping of disease resistance and has great potential in enhancing plant breeding. Copyright © 2013 Elsevier B.V. All rights reserved.
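Quantification in qPCR conventionally proceeds through a standard curve relating the cycle threshold (Ct) to the log10 of the template copy number. The sketch below uses hypothetical calibration values, not the curve from this study; a slope of about -3.32 corresponds to ~100% amplification efficiency:

```python
# Sketch of qPCR absolute quantification via a standard curve:
# Ct = slope * log10(copies) + intercept  =>  copies = 10**((Ct - intercept)/slope)
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Estimate template copy number from a Ct value.
    slope and intercept are hypothetical calibration constants obtained
    from a dilution series of known standards."""
    return 10 ** ((ct - intercept) / slope)

low_titer = copies_from_ct(36.0)   # high Ct -> few copies (resistant plants)
high_titer = copies_from_ct(25.0)  # low Ct -> many copies (susceptible plants)
```

This is the standard-curve logic behind statements such as "resistant chokecherries had a significantly lower titer": a higher Ct maps to exponentially fewer phytoplasma copies.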
A two-factor error model for quantitative steganalysis
NASA Astrophysics Data System (ADS)
Böhme, Rainer; Ker, Andrew D.
2006-02-01
Quantitative steganalysis refers to the exercise not only of detecting the presence of hidden stego messages in carrier objects, but also of estimating the secret message length. This problem is well studied, with many detectors proposed but only a sparse analysis of errors in the estimators. A deep understanding of the error model, however, is a fundamental requirement for the assessment and comparison of different detection methods. This paper presents a rationale for a two-factor model for sources of error in quantitative steganalysis, and shows evidence from a dedicated large-scale nested experimental set-up with a total of more than 200 million attacks. Apart from general findings about the distribution functions found in both classes of errors, their respective weight is determined, and implications for statistical hypothesis tests in benchmarking scenarios or regression analyses are demonstrated. The results are based on a rigorous comparison of five different detection methods under many different external conditions, such as size of the carrier, previous JPEG compression, and colour channel selection. We include analyses demonstrating the effects of local variance and cover saturation on the different sources of error, as well as presenting the case for a relative bias model for between-image error.
Validation of Greyscale-Based Quantitative Ultrasound in Manual Wheelchair Users
Collinger, Jennifer L.; Fullerton, Bradley; Impink, Bradley G.; Koontz, Alicia M.; Boninger, Michael L.
2010-01-01
Objective The primary aim of this study is to establish the validity of greyscale-based quantitative ultrasound (QUS) measures of the biceps and supraspinatus tendons. Design Nine QUS measures of the biceps and supraspinatus tendons were computed from ultrasound images collected from sixty-seven manual wheelchair users. Shoulder pathology was measured using questionnaires, physical examination maneuvers, and a clinical ultrasound grading scale. Results Increased age, duration of wheelchair use, and body mass correlated with a darker, more homogenous tendon appearance. Subjects with pain during physical examination tests for biceps tenderness and acromioclavicular joint tenderness exhibited significantly different supraspinatus QUS values. Even when controlling for tendon depth, QUS measures of the biceps tendon differed significantly between subjects with healthy tendons, mild tendinosis, and severe tendinosis. Clinical grading of supraspinatus tendon health was correlated with QUS measures of the supraspinatus tendon. Conclusions Quantitative ultrasound is a valid method to quantify tendinopathy and may allow for early detection of tendinosis. Manual wheelchair users are at a high risk for developing shoulder tendon pathology and may benefit from quantitative ultrasound-based research that focuses on identifying interventions designed to reduce this risk. PMID:20407304
Fuzzy Performance between Surface Fitting and Energy Distribution in Turbulence Runner
Liang, Zhongwei; Liu, Xiaochu; Ye, Bangyan; Brauwer, Richard Kars
2012-01-01
Because the application of surface fitting algorithms exerts a considerable fuzzy influence on the mathematical features of kinetic energy distribution, their relation mechanism under different external conditional parameters must be quantitatively analyzed. After determining the kinetic energy value of each selected representative position coordinate point by calculating kinetic energy parameters, several typical complicated-surface-fitting algorithms are applied to construct micro kinetic energy distribution surface models of the objective turbulence runner from the obtained kinetic energy values. On the basis of calculating the newly proposed mathematical features, we construct a fuzzy evaluation data sequence and present a new three-dimensional fuzzy quantitative evaluation method; the value change tendencies of the kinetic energy distribution surface features can then be clearly quantified, and the fuzzy performance mechanism linking the results of the surface fitting algorithms, the spatial features of the turbulence kinetic energy distribution surface, and their respective environmental parameter conditions can be quantitatively analyzed in detail, yielding final conclusions concerning the inherent turbulence kinetic energy distribution performance mechanism and its mathematical relations. This provides a basis for further quantitative study of turbulence energy. PMID:23213287
Hansen, Matthew; O'Brien, Kerth; Meckler, Garth; Chang, Anna Marie; Guise, Jeanne-Marie
2016-07-01
Mixed methods research has significant potential to broaden the scope of emergency care and specifically emergency medical services investigation. Mixed methods studies involve the coordinated use of qualitative and quantitative research approaches to gain a fuller understanding of practice. By combining what is learnt from multiple methods, these approaches can help to characterise complex healthcare systems, identify the mechanisms of complex problems such as medical errors and understand aspects of human interaction such as communication, behaviour and team performance. Mixed methods approaches may be particularly useful for out-of-hospital care researchers because care is provided in complex systems where equipment, interpersonal interactions, societal norms, environment and other factors influence patient outcomes. The overall objectives of this paper are to (1) introduce the fundamental concepts and approaches of mixed methods research and (2) describe the interrelation and complementary features of the quantitative and qualitative components of mixed methods studies using specific examples from the Children's Safety Initiative-Emergency Medical Services (CSI-EMS), a large National Institutes of Health-funded research project conducted in the USA. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Anguera, M. Teresa; Camerino, Oleguer; Castañer, Marta; Sánchez-Algarra, Pedro; Onwuegbuzie, Anthony J.
2017-01-01
Mixed methods studies are increasingly being applied to a diversity of fields. In this paper, we discuss the growing use—and enormous potential—of mixed methods research in the field of sport and physical activity. A second aim is to contribute to strengthening the characteristics of mixed methods research by showing how systematic observation offers rigor within a flexible framework that can be applied to a wide range of situations. Observational methodology is characterized by high scientific rigor and flexibility throughout its different stages and allows the objective study of spontaneous behavior in natural settings, with no external influence. Mixed methods researchers need to take bold yet thoughtful decisions regarding both substantive and procedural issues. We present three fundamental and complementary ideas to guide researchers in this respect: we show why studies of sport and physical activity that use a mixed methods research approach should be included in the field of mixed methods research, we highlight the numerous possibilities offered by observational methodology in this field through the transformation of descriptive data into quantifiable code matrices, and we discuss possible solutions for achieving true integration of qualitative and quantitative findings. PMID:29312061
Measure Landscape Diversity with Logical Scout Agents
NASA Astrophysics Data System (ADS)
Wirth, E.; Szabó, G.; Czinkóczky, A.
2016-06-01
The Common Agricultural Policy reform of the EU focuses on three long-term objectives: viable food production, sustainable management of natural resources and climate action with balanced territorial development. To achieve these goals, the EU farming and subsidizing policies (EEA, 2014) support landscape heterogeneity and diversity. This paper introduces an agent-based method to calculate the potential of landscape diversity. The method seeks to capture the nature of heterogeneity using logic and modelling as opposed to traditional statistical reasoning. The outlined Random Walk Scouting algorithm registers the land cover crossings of the scout agents in a Monte Carlo integral. The potential is proportional to the composition and the configuration (spatial character) of the landscape. Based on the measured points, a potential map is derived to give an objective and quantitative basis to the stakeholders (policy makers, farmers).
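The core idea of Random Walk Scouting can be sketched in a few lines: agents random-walk over a land-cover grid, and the fraction of steps that cross a cover boundary serves as the Monte Carlo estimate of diversity. The grid, neighbourhood, and parameters below are illustrative assumptions, not the authors' implementation:

```python
# Sketch of a random-walk diversity estimate: scout agents wander a
# land-cover grid and register boundary crossings into a Monte Carlo tally.
import random

def diversity_potential(grid, n_agents=200, steps=50, seed=0):
    """Fraction of random-walk moves that cross a land-cover boundary.
    grid: 2-D list of land-cover class labels. Edges wrap around."""
    rng = random.Random(seed)
    rows, cols = len(grid), len(grid[0])
    crossings = moves = 0
    for _ in range(n_agents):
        r, c = rng.randrange(rows), rng.randrange(cols)
        for _ in range(steps):
            dr, dc = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
            nr, nc = (r + dr) % rows, (c + dc) % cols
            moves += 1
            if grid[nr][nc] != grid[r][c]:
                crossings += 1  # the scout crossed a cover boundary
            r, c = nr, nc
    return crossings / moves

uniform = [[0] * 8 for _ in range(8)]                          # one cover class
checker = [[(i + j) % 2 for j in range(8)] for i in range(8)]  # maximal mixing
```

A uniform landscape yields a potential of 0, a checkerboard yields 1; real landscapes fall in between, and evaluating the tally per region gives the potential map mentioned above.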
Dual-channel in-line digital holographic double random phase encryption
Das, Bhargab; Yelleswarapu, Chandra S; Rao, D V G L N
2012-01-01
We present a robust encryption method for the encoding of 2D/3D objects using digital holography and virtual optics. Using our recently developed dual-plane in-line digital holography technique, two in-line digital holograms are recorded at two different planes and are encrypted using two different double random phase encryption configurations, independently. The process of using two mutually exclusive encryption channels makes the system more robust against attacks since both the channels should be decrypted accurately in order to get a recognizable reconstruction. Results show that the reconstructed object is unrecognizable even when the portion of the correct phase keys used during decryption is close to 75%. The system is verified against blind decryptions by evaluating the SNR and MSE. Validation of the proposed method and sensitivities of the associated parameters are quantitatively analyzed and illustrated. PMID:23471012
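The building block of the scheme, double random phase encoding, can be sketched compactly. This is the textbook single-channel 4f version for a 2D amplitude object, shown only to make the phase-key idea concrete; the paper's dual-plane, dual-channel holographic variant is more involved:

```python
# Sketch of classic double random phase encoding (DRPE):
# multiply by a random phase in the input plane, Fourier transform,
# multiply by a second random phase in the Fourier plane, transform back.
import numpy as np

def drpe_encrypt(img, phase1, phase2):
    field = img * np.exp(1j * phase1)
    spectrum = np.fft.fft2(field) * np.exp(1j * phase2)
    return np.fft.ifft2(spectrum)

def drpe_decrypt(cipher, phase1, phase2):
    # Undo the steps in reverse order with conjugate phase keys.
    spectrum = np.fft.fft2(cipher) * np.exp(-1j * phase2)
    return np.fft.ifft2(spectrum) * np.exp(-1j * phase1)

rng = np.random.default_rng(2)
img = rng.random((32, 32))                       # toy 2-D amplitude object
p1, p2 = rng.uniform(0, 2 * np.pi, (2, 32, 32))  # the two random phase keys
recovered = np.abs(drpe_decrypt(drpe_encrypt(img, p1, p2), p1, p2))
```

With both correct keys the object is recovered exactly; corrupting even a fraction of either key degrades the reconstruction, which is the sensitivity the abstract quantifies.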
Measurement of in vivo local shear modulus using MR elastography multiple-phase patchwork offsets.
Suga, Mikio; Matsuda, Tetsuya; Minato, Kotaro; Oshiro, Osamu; Chihara, Kunihiro; Okamoto, Jun; Takizawa, Osamu; Komori, Masaru; Takahashi, Takashi
2003-07-01
Magnetic resonance elastography (MRE) is a method that can visualize the propagating and standing shear waves in an object being measured. The quantitative value of a shear modulus can be calculated by estimating the local shear wavelength. Low-frequency mechanical motion must be used for soft, tissue-like objects because a propagating shear wave rapidly attenuates at a higher frequency. Moreover, a propagating shear wave is distorted by reflections from the boundaries of objects. However, the distortions are minimal around the wave front of the propagating shear wave. Therefore, we can avoid the effect of reflection on a region of interest (ROI) by adjusting the duration of mechanical vibrations. Thus, the ROI is often shorter than the propagating shear wavelength. In the MRE sequence, a motion-sensitizing gradient (MSG) is synchronized with mechanical cyclic motion. MRE images with multiple initial phase offsets can be generated with increasing delays between the MSG and mechanical vibrations. This paper proposes a method for measuring the local shear wavelength using MRE multiple initial phase patchwork offsets that can be used when the size of the object being measured is shorter than the local wavelength. To confirm the reliability of the proposed method, computer simulations, a simulated tissue study and in vitro and in vivo studies were performed.
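The quantitative step from an estimated local wavelength to a shear modulus uses the standard MRE relation mu = rho * (f * lambda)^2, valid for a locally homogeneous, isotropic, lossless medium. A minimal sketch (the tissue density value is a common assumption, not a number from this paper):

```python
# Sketch of the standard MRE shear-modulus estimate from local wavelength.
def shear_modulus(wavelength_m, frequency_hz, density_kg_m3=1000.0):
    """mu = rho * (f * lambda)^2, assuming a locally homogeneous,
    isotropic, lossless medium; soft tissue density ~1000 kg/m^3."""
    return density_kg_m3 * (frequency_hz * wavelength_m) ** 2

# e.g. a 25 mm local shear wavelength at 100 Hz mechanical vibration
mu_pa = shear_modulus(0.025, 100.0)  # -> 6250.0 Pa (~6.3 kPa)
```

This makes clear why accurate local wavelength estimation, the focus of the patchwork-offsets method, directly controls the accuracy of the modulus: errors in lambda enter squared.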
Automatic and Objective Assessment of Alternating Tapping Performance in Parkinson's Disease
Memedi, Mevludin; Khan, Taha; Grenholm, Peter; Nyholm, Dag; Westin, Jerker
2013-01-01
This paper presents the development and evaluation of a method for enabling quantitative and automatic scoring of alternating tapping performance of patients with Parkinson's disease (PD). Ten healthy elderly subjects and 95 patients in different clinical stages of PD have utilized a touch-pad handheld computer to perform alternate tapping tests in their home environments. First, a neurologist used a web-based system to visually assess impairments in four tapping dimensions (‘speed’, ‘accuracy’, ‘fatigue’ and ‘arrhythmia’) and a global tapping severity (GTS). Second, tapping signals were processed with time series analysis and statistical methods to derive 24 quantitative parameters. Third, principal component analysis was used to reduce the dimensions of these parameters and to obtain scores for the four dimensions. Finally, a logistic regression classifier was trained using a 10-fold stratified cross-validation to map the reduced parameters to the corresponding visually assessed GTS scores. Results showed that the computed scores correlated well to visually assessed scores and were significantly different across Unified Parkinson's Disease Rating Scale scores of upper limb motor performance. In addition, they had good internal consistency, had good ability to discriminate between healthy elderly and patients in different disease stages, had good sensitivity to treatment interventions and could reflect the natural disease progression over time. In conclusion, the automatic method can be useful to objectively assess the tapping performance of PD patients and can be included in telemedicine tools for remote monitoring of tapping. PMID:24351667
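The scoring pipeline described above (dimensionality reduction of the 24 tapping parameters with PCA, then a logistic regression classifier evaluated with 10-fold stratified cross-validation) can be sketched with scikit-learn on synthetic data. The data, component count, and accuracy are illustrative assumptions, not the authors' features or results:

```python
# Sketch of the PCA + logistic regression scoring pipeline on synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 24))     # stand-in for 24 quantitative tapping parameters
X[:, 0] *= 3.0                     # give one parameter dominant variance
y = (X[:, 0] > 0).astype(int)      # stand-in for the visually assessed GTS label

# Reduce dimensions, then map reduced parameters to the severity label,
# evaluated with 10-fold stratified cross-validation as in the paper.
model = make_pipeline(PCA(n_components=4), LogisticRegression(max_iter=1000))
acc = cross_val_score(model, X, y, cv=StratifiedKFold(n_splits=10)).mean()
```

Because the synthetic label is driven by the highest-variance parameter, PCA retains the discriminative direction and the cross-validated accuracy is high; with real tapping data the component count and classifier settings would need tuning.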
Exposure assessment of tetrafluoroethylene and ammonium perfluorooctanoate 1951-2002.
Sleeuwenhoek, Anne; Cherrie, John W
2012-03-01
To develop a method to reconstruct exposure to tetrafluoroethylene (TFE) and ammonium perfluorooctanoate (APFO) in plants producing polytetrafluoroethylene (PTFE) in the absence of suitable objective measurements. These data were used to inform an epidemiological study being carried out to investigate possible risks in workers employed in the manufacture of PTFE and to study trends in exposure over time. For each plant, detailed descriptions of all occupational titles, including tasks and changes over time, were obtained during semi-structured interviews with key plant personnel. A semi-quantitative assessment method was used to assess inhalation exposure to TFE and inhalation plus dermal exposure to APFO. Temporal trends in exposure to TFE and APFO were investigated. In each plant the highest exposures for both TFE and APFO occurred in the polymerisation area. Due to the introduction of control measures, increasing process automation and other improvements, exposures generally decreased over time. In the polymerisation area, the annual decline in exposure to TFE varied by plant from 3.8 to 5.7% and for APFO from 2.2 to 5.5%. A simple method for assessing exposure was developed which used detailed process information and job descriptions to estimate average annual TFE and APFO exposure on an arbitrary semi-quantitative scale. These semi-quantitative estimates are sufficient to identify relative differences in exposure for the epidemiological study and should good data become available, they could be used to provide quantitative estimates for all plants across the whole period of operation. This journal is © The Royal Society of Chemistry 2012
A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue
Nyengaard, Jens Randel; Lind, Martin; Spector, Myron
2015-01-01
Objective To implement stereological principles to develop an easy applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easy distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
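The two-step point counting above reduces to simple arithmetic: a Cavalieri estimate of defect volume from systematically sampled sections, and point-count fractions for tissue composition. The counts and spacing below are hypothetical examples, not the study's data:

```python
# Sketch of the two-step stereological quantification.
def cavalieri_volume(points_per_section, section_spacing_mm, area_per_point_mm2):
    """Cavalieri estimator: V = t * a(p) * sum(P), where P is the number of
    test points hitting the defect on each systematically sampled section,
    t the section spacing, and a(p) the area associated with one point."""
    return section_spacing_mm * area_per_point_mm2 * sum(points_per_section)

def tissue_fractions(counts):
    """Step 2: fraction of points assigned to each tissue category
    (hyaline cartilage, fibrocartilage, fibrous tissue, bone, ...)."""
    total = sum(counts.values())
    return {category: n / total for category, n in counts.items()}

# Hypothetical counts from 5 sections, 0.5 mm apart, 0.04 mm^2 per point:
vol_mm3 = cavalieri_volume([12, 18, 20, 15, 9], 0.5, 0.04)
frac = tissue_fractions({"hyaline": 30, "fibrocartilage": 45, "fibrous": 25})
```

Multiplying each fraction by the estimated volume then yields per-tissue volumes, which is what makes the method a quantitative supplement to semiquantitative scores.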
Krahenbühl, Tathyane; Gonçalves, Ezequiel Moreira; Costa, Eduardo Tavares; Barros, Antonio de Azevedo
2014-01-01
Objective: To analyze the main factors that influence bone mass in children and teenagers assessed by quantitative ultrasound (QUS) of the phalanges. Data source: A systematic literature review was performed according to the PRISMA method, with searches in the databases Pubmed/Medline, SciELO and Bireme for the period 2001-2012, in English and Portuguese, using the keywords: children, teenagers, adolescent, ultrasound finger phalanges, quantitative ultrasound of phalanges, phalangeal quantitative ultrasound. Data synthesis: 21 articles were included. In QUS, girls had higher Amplitude Dependent Speed of Sound (AD-SoS) values than boys during pubertal development. The values of the parameters of QUS of the phalanges and dual-energy X-ray absorptiometry (DXA) increased with increasing maturational stage. Anthropometric variables such as age, weight, height, body mass index (BMI) and lean mass showed positive correlations with the values of QUS of the phalanges. Physical activity has also been shown to be positively associated with increased bone mass. Factors such as ethnicity, genetics, caloric intake and socioeconomic profile have not yet shown a conclusive relationship and need a larger number of studies. Conclusions: QUS of the phalanges is a method used to evaluate the progressive acquisition of bone mass during growth and maturation of school-age individuals, by monitoring changes that occur with increasing age and pubertal stage. The variables of sex, maturity, height, weight and BMI showed mainly positive influences, with data similar to those of the gold-standard method, DXA. PMID:25479860
Using normalization 3D model for automatic clinical brain quantitative analysis and evaluation
NASA Astrophysics Data System (ADS)
Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping
2003-05-01
Functional medical imaging, such as PET or SPECT, is capable of revealing physiological functions of the brain, and has been broadly used in diagnosing brain disorders by clinical quantitative analysis for many years. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from the corresponding functional PET or SPECT images. The accuracy of quantitative analysis thus relies on that of the subjectively selected ROIs. Therefore, standardizing the analysis procedure is fundamental and important in improving the analysis outcome. In this paper, we propose and evaluate a normalization procedure with a standard 3D brain model to achieve precise quantitative analysis. In the normalization process, the mutual information registration technique was applied to realign functional medical images to standard structural medical images. Then, the standard 3D brain model, which shows well-defined brain regions, was used in place of the manual ROIs in the objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in a practical clinical evaluation. The results show that the quantitative analysis outcomes obtained from this automated method agree with the clinical diagnosis evaluation score to within 3% error on average. In summary, the method automatically obtains precise VOI information from the well-defined standard 3D brain model, sparing the traditional procedure of manually drawing ROIs slice by slice on structural medical images. The method thus not only provides precise analysis results, but also improves the processing rate for large volumes of clinical medical images.
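The similarity measure driving the registration step, mutual information, is computed from the joint intensity histogram of the two images; registration then searches for the transform that maximizes it. A minimal sketch of the measure itself (the bin count and test images are arbitrary choices, and the optimization loop is omitted):

```python
# Sketch of the mutual-information similarity measure used for registration.
import numpy as np

def mutual_information(a, b, bins=16):
    """Mutual information of two equally sized images, estimated from
    their joint intensity histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                 # joint intensity distribution
    px = pxy.sum(axis=1, keepdims=True)       # marginal of image a
    py = pxy.sum(axis=0, keepdims=True)       # marginal of image b
    nz = pxy > 0                              # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
img = rng.random((64, 64))
mi_aligned = mutual_information(img, img)                  # perfectly aligned
mi_unrelated = mutual_information(img, rng.random((64, 64)))
```

Aligned images share maximal intensity structure and score highest, which is why maximizing this quantity over rigid transforms realigns the functional image to the structural one even across modalities.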
Yaqoob, Zahid; Choi, Wonshik; Oh, Seungeun; Lue, Niyom; Park, Yongkeun; Fang-Yen, Christopher; Dasari, Ramachandra R.; Badizadegan, Kamran; Feld, Michael S.
2010-01-01
We report a quantitative phase microscope based on spectral domain optical coherence tomography and line-field illumination. The line illumination allows a self-phase-referencing method to reject common-mode phase noise. The quantitative phase microscope also features a separate reference arm, permitting the use of high numerical aperture (NA > 1) microscope objectives for high resolution phase measurement at multiple points along the line of illumination. We demonstrate that the path-length sensitivity of the instrument can be as good as 41 pm/√Hz, which makes it suitable for nanometer scale study of cell motility. We present the detection of natural motions of the cell surface and two-dimensional surface profiling of a HeLa cell. PMID:19550464
Nuclear medicine and imaging research (Instrumentation and quantitative methods of evaluation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1989-09-01
This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility.
NASA Technical Reports Server (NTRS)
Rey, P. A.; Gourinard, Y.; Cambou, F. (Principal Investigator); Guyader, J. C.; Gouaux, P.; Letoan, T.; Monchant, M.; Donville, B.; Loubet, D.
1973-01-01
The author has identified the following significant results of the ARNICA program (February - December 1973): (1) The quantitative processing of ERTS-1 data was developed along two lines: the study of geological structures and lineaments of Spanish Catalonia, and the phytogeographical study of the forest region of the Landes of Gascony (France). In both cases it is shown that the ERTS-1 imagery can be used in establishing zonings of equal quantitative interpretation value. (2) In keeping with the operational transfer program proposed in previous reports between exploration of the imagery and charting of the object, a precise data processing method was developed, concerning in particular the selection of digital equidensity samples, computer display, and rigorous referencing.
Methodology for determining the investment attractiveness of construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Nezhnikova, Ekaterina; Kashirin, Valentin; Davydova, Yana; Kazakova, Svetlana
2018-03-01
The article presents an analysis of the existing methods for assessing the investment attractiveness of high-rise construction. The authors determined and justified the primary choice of objects and territories that are the most attractive for the development of high-rise construction. A system of risk indicators has been developed that allows a quantitative adjustment to be made for a particular project when evaluating the efficiency of investment projects. The study is aimed at developing basic methodological concepts for a comparative evaluation of the prospects of construction of high-rise facilities, concepts that take into consideration the features of investment in construction and enable a quantitative evaluation of investment effectiveness in high-rise construction.
Alves, Antoine; Attik, Nina; Bayon, Yves; Royet, Elodie; Wirth, Carine; Bourges, Xavier; Piat, Alexis; Dolmazon, Gaëlle; Clermont, Gaëlle; Boutrand, Jean-Pierre; Grosgogeat, Brigitte; Gritsch, Kerstin
2018-03-14
The paradigm shift brought about by the expansion of tissue engineering and regenerative medicine away from the use of biomaterials currently calls into question the value of histopathologic methods in the evaluation of biological changes. To date, the available tools of evaluation are not fully consistent and satisfactory for these advanced therapies. We have developed a new, simple, and inexpensive quantitative digital approach that provides key metrics for the structural and compositional characterization of regenerated tissues. For example, the metrics provide the tissue ingrowth rate (TIR), which integrates two separate indicators, the cell ingrowth rate (CIR) and the total collagen content (TCC), as featured in the equation TIR% = CIR% + TCC%. Moreover, a subset of quantitative indicators describing the directional organization of the collagen (relating the structure and mechanical function of tissues), the ratio of collagen I to collagen III (remodeling quality), and the optical anisotropy of the collagen (a maturity indicator) was automatically assessed as well. Using an image analyzer, all metrics were extracted from only two serial sections stained with either Feulgen & Rossenbeck (cell specific) or Picrosirius Red F3BA (collagen specific). To validate this new procedure, three-dimensional (3D) scaffolds were intraperitoneally implanted in healthy and in diabetic rats. It was hypothesized that, quantitatively, the healing tissue would be significantly delayed and of poor quality in diabetic rats in comparison to healthy rats. In addition, a chemically modified 3D scaffold was similarly implanted in a third group of healthy rats with the assumption that modulation of the ingrown tissue would be quantitatively detectable in comparison to the 3D-scaffold-healthy group. After 21 days of implantation, both hypotheses were verified by use of this novel computerized approach.
When the two methods were run in parallel, the quantitative results revealed fine details and differences not detected by the semi-quantitative assessment, demonstrating the importance of quantitative analysis in the performance evaluation of soft tissue healing. This automated and supervised method reduced operator dependency and proved to be simple, sensitive, cost-effective and time-effective. It supports objective therapeutic comparisons and helps to elucidate regeneration and the dynamics of a functional tissue.
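As an illustrative sketch only (hypothetical pixel counts, not the authors' software), the metric relation quoted above, TIR% = CIR% + TCC%, reduces to area fractions measured over the same region of interest on the two stained serial sections:

```python
# Hypothetical example of the metric arithmetic described in the abstract:
# CIR and TCC are area fractions of a region of interest (ROI) measured on
# the cell-stained and collagen-stained serial sections, and TIR is their sum.

def ingrowth_metrics(cell_area_px, collagen_area_px, roi_area_px):
    """Return (CIR%, TCC%, TIR%) from pixel counts over the same ROI."""
    cir = 100.0 * cell_area_px / roi_area_px
    tcc = 100.0 * collagen_area_px / roi_area_px
    return cir, tcc, cir + tcc

print(ingrowth_metrics(12000, 30000, 100000))  # (12.0, 30.0, 42.0)
```

Each stain contributes one area fraction (Feulgen & Rossenbeck for cells, Picrosirius Red for collagen); their sum is the tissue ingrowth rate.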
Carrer, Francesco
2017-01-01
This paper deals with the ethnoarchaeological analysis of the spatial pattern of artefacts and ecofacts within two traditional pastoral huts (a dwelling and a seasonal dairy) in the uplands of Val Maudagna (Cuneo province, Italian western Alps). The composition of the ethnoarchaeological assemblages of the two huts was studied and compared; point pattern analysis was applied to identify spatial processes mirrored in the interactions between objects; Moran's I correlogram and empirical variogram were used to investigate the effects of trampling on the displacement of objects on the floor. The results were compared with information provided by the herder who still used the huts. The quantitative and ethnographical data enabled inferences to be made that can help in the interpretation of archaeological seasonal sites. The function of a seasonal site can be recognized, as can the impact of delayed curation on the composition of the assemblage and the importance of the intensity of occupation compared with the frequency of occupation. The spatial organization of activities is reflected in the spatial patterns of objects, with clearer identification of activity areas in intensively occupied sites, and there is evidence for the behaviour behind the spatial segregation of activities. Trampling is a crucial post-depositional factor in the displacement of artefacts and ecofacts, especially in non-intensively exploited sites. From a methodological point of view, this research is another example that highlights the importance of integrating quantitative methods (especially spatial analysis and geostatistical methods) and ethnoarchaeological data in order to improve the interpretation of archaeological sites and assemblages.
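The point-pattern statistics mentioned above can be made concrete with a minimal implementation of the global Moran's I statistic (toy values and a hypothetical adjacency matrix, not the Val Maudagna data):

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: spatial autocorrelation of `values` under a
    spatial weights matrix `weights` (zero diagonal)."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()
    n = x.size
    num = n * (w * np.outer(z, z)).sum()
    den = w.sum() * (z ** 2).sum()
    return num / den

# Four locations on a line; rook-style adjacency between neighbours.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
clustered = [1.0, 1.0, 5.0, 5.0]   # similar values adjacent -> positive I
print(morans_i(clustered, w) > 0)  # True
```

Positive I indicates that similar values sit near each other (clustering); values near -1/(n-1) indicate spatial randomness.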
Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J
2014-01-01
The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.
2006-11-01
A specific objective here is to determine the role and subtype of adenosine receptors that mediate skeletal muscle protection using a quantitative... using a mouse hindlimb model and it defined adenosine A3 receptors as one of the skeletal muscle protective adenosine receptors. The study also...from the US Army Medical Research and Materiel Command Human Subjects Research Review Board on 9/9/2005. Twenty two subjects were consented but
1998-09-01
breast tissues may provide unique information which could increase detection and/or characterization of potentially malignant masses not accessible... masses deep in the breast , or within relatively dense, stiff, or heterogeneous tissues, is poor. The principal objective of this project is to develop...or propagating shear wave is documented by imaging devices. In the original MRI method, spatial magnetization tagging was applied, but this had poor
Jose M. Iniguez; Joseph L. Ganey; Peter J. Daughtery; John D. Bailey
2005-01-01
The objective of this study was to develop a rule-based cover type classification system for the forest and woodland vegetation in the Sky Islands of southeastern Arizona. In order to develop such a system we qualitatively and quantitatively compared a hierarchical (Ward's) and a non-hierarchical (k-means) clustering method. Ecologically, unique groups represented by...
Jose M. Iniguez; Joseph L. Ganey; Peter J. Daugherty; John D. Bailey
2005-01-01
The objective of this study was to develop a rule-based cover type classification system for the forest and woodland vegetation in the Sky Islands of southeastern Arizona. In order to develop such a system we qualitatively and quantitatively compared a hierarchical (Ward's) and a non-hierarchical (k-means) clustering method. Ecologically, unique groups and plots...
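A hedged sketch of the kind of comparison the study describes, contrasting Ward's hierarchical clustering with k-means on hypothetical standardized plot attributes (not the Sky Islands data):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.cluster.vq import kmeans2

# Hypothetical plot data: two well-separated vegetation groups in a 2-D
# standardized attribute space (e.g. basal-area and elevation z-scores).
rng = np.random.default_rng(0)
plots = np.vstack([rng.normal(0.0, 0.3, (20, 2)),
                   rng.normal(3.0, 0.3, (20, 2))])

# Hierarchical clustering with Ward's minimum-variance criterion.
ward_labels = fcluster(linkage(plots, method="ward"), t=2, criterion="maxclust")

# Non-hierarchical k-means, started from one deterministic guess per group.
_, km_labels = kmeans2(plots, np.array([[0.0, 0.0], [3.0, 3.0]]), minit="matrix")

def same_partition(a, b):
    """True when two labelings induce the same grouping (label-permutation safe)."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(((a[:, None] == a) == (b[:, None] == b)).all())

print(same_partition(ward_labels, km_labels))  # True on clearly separated data
```

On clearly separated data the two methods agree; the study's interest lies in how they diverge on ecologically ambiguous plots.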
Dynamical Systems Approach to Endothelial Heterogeneity
Regan, Erzsébet Ravasz; Aird, William C.
2012-01-01
Rationale and Objective: Here we reexamine our current understanding of the molecular basis of endothelial heterogeneity. We introduce multistability as a new explanatory framework in vascular biology. Methods: We draw on the field of non-linear dynamics to propose a dynamical systems framework for modeling multistability and its derivative properties, including robustness, memory, and plasticity. Conclusions: Our perspective allows for both a conceptual and a quantitative description of system-level features of endothelial regulation. PMID:22723222
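A minimal, generic illustration of what multistability means dynamically (a mutual-repression toggle with assumed parameters, not the authors' endothelial model): the same system settles into different stable states depending on its history, which is the "memory" property referred to above.

```python
# Illustrative sketch only: a two-variable mutual-repression circuit is a
# classic multistable system; different initial conditions settle into
# different stable states even though the equations are identical.
def toggle_steady_state(x0, y0, a=4.0, n=3, dt=0.01, steps=20000):
    """Euler-integrate dx/dt = a/(1+y^n) - x, dy/dt = a/(1+x^n) - y."""
    x, y = x0, y0
    for _ in range(steps):
        dx = a / (1 + y ** n) - x
        dy = a / (1 + x ** n) - y
        x, y = x + dt * dx, y + dt * dy
    return x, y

hi_x = toggle_steady_state(2.0, 0.1)   # settles with x high, y low
hi_y = toggle_steady_state(0.1, 2.0)   # same system, opposite attractor
print(hi_x[0] > hi_x[1], hi_y[1] > hi_y[0])  # True True
```

The two runs end in distinct attractors, the simplest quantitative expression of robustness and memory in a regulatory network.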
Representation and Reconstruction of Three-dimensional Microstructures in Ni-based Superalloys
2010-12-20
Materiala, 56, pp. 427-437 (2009); • Application of joint histogram and mutual information to registration and data fusion problems in serial...sectioning data sets and synthetically generated microstructures. The method is easy to use, and allows for a quantitative description of shapes. Further...following objectives were achieved: • we have successfully applied 3-D moment invariant analysis to several experimental data sets; • we have extended 2-D
Blanchet-Réthoré, Sandrine; Bourdès, Valérie; Mercenier, Annick; Haddar, Cyrille H; Verhoeven, Paul O; Andres, Philippe
2017-01-01
Staphylococcus aureus dominates the skin microbiota in patients with atopic dermatitis (AD), with bacterial loads correlating with disease severity. The aim of this exploratory study was to investigate the effect of a cosmetic lotion containing heat-treated Lactobacillus johnsonii NCC 533 (HT La1) on S. aureus colonization in AD patients. This open-label, multicenter study was performed in AD patients in Germany. First, detection of S. aureus was performed in all patients using the swab or scrub-wash method of sampling, followed by quantitative culture or quantitative polymerase chain reaction. Repeatability and reproducibility of all method combinations were evaluated to select the best combination of sampling and quantification. Second, a lotion containing HT La1 was applied to lesional skin twice daily for 3 weeks. Scoring using local objective SCORing Atopic Dermatitis (SCORAD), measurement of S. aureus load, and lesional microbiome analysis were performed before and after the 3-week treatment period. Thirty-one patients with AD were included in the study. All sampling and quantification methods were found to be robust, reproducible, and repeatable for assessing S. aureus load. For simplicity, a combination of swab and quantitative polymerase chain reaction was chosen to assess the efficacy of HT La1. Following application of a lotion containing HT La1 to AD lesions for 3 weeks, a reduction in S. aureus load was observed in patients, which correlated with a decrease in local objective SCORAD. Interestingly, high baseline skin concentrations of S. aureus were associated with good responses to the lotion. This study demonstrated that the application of a lotion containing HT La1 to the lesional skin of patients with AD for 3 weeks controlled S. aureus colonization and was associated with local clinical improvement (SCORAD). 
These findings support further development of topical treatments containing heat-treated nonreplicating beneficial bacteria for patients with AD.
Weak fault detection and health degradation monitoring using customized standard multiwavelets
NASA Astrophysics Data System (ADS)
Yuan, Jing; Wang, Yu; Peng, Yizhen; Wei, Chenjun
2017-09-01
Because their nonobvious symptoms are contaminated by a large amount of background noise, it is challenging to detect weak faults in advance and to predictively monitor them for machinery security assurance. Multiwavelets can act as adaptive non-stationary signal processing tools and are potentially viable for weak fault diagnosis. However, signal-based multiwavelets suffer from problems such as imperfect properties that miss the crucial orthogonality, decomposition distortion that cannot reflect the relationships between faults and signatures, single-objective optimization, and independence from fault prognostics. Thus, customized standard multiwavelets are proposed for weak fault detection and health degradation monitoring, especially for the quantitative identification of weak fault signatures. First, flexible standard multiwavelets are designed using a construction method derived from scalar wavelets, seizing the desired properties for accurate detection of weak faults and avoiding the distortion issue in quantitative feature identification. Second, a multi-objective optimization combining three dimensionless indicators, the normalized energy entropy, the normalized singular entropy, and the kurtosis index, is introduced as the evaluation criterion; it aids the selection of the potentially best basis functions for weak faults without the influence of variable working conditions. Third, an ensemble health indicator, fusing the kurtosis index, impulse index, and clearance index of the original signal with the normalized energy entropy and normalized singular entropy from the customized standard multiwavelets, is constructed using the Mahalanobis distance to continuously monitor the health condition and track performance degradation. Finally, three experimental case studies are implemented to demonstrate the feasibility and effectiveness of the proposed method.
The results show that the proposed method can quantitatively identify the fault signature of a slight rub on the inner race of a locomotive bearing, effectively detect and locate the potential failure from a complicated epicyclic gear train and successfully reveal the fault development and performance degradation of a test bearing in the lifetime.
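The ensemble-health-indicator idea can be sketched as follows; the feature definitions and signals here are generic condition-monitoring stand-ins (and omit the multiwavelet entropies), not the authors' pipeline.

```python
import numpy as np

# Illustrative sketch: fuse several dimensionless condition indicators into a
# single health index by measuring the Mahalanobis distance of the current
# feature vector from a baseline (healthy) feature distribution.
def indicators(sig):
    """Kurtosis, impulse, and clearance indices of a vibration signal."""
    sig = np.asarray(sig, dtype=float)
    peak = np.max(np.abs(sig))
    kurtosis_idx = np.mean((sig - sig.mean()) ** 4) / np.var(sig) ** 2
    impulse_idx = peak / np.mean(np.abs(sig))
    clearance_idx = peak / np.mean(np.sqrt(np.abs(sig))) ** 2
    return np.array([kurtosis_idx, impulse_idx, clearance_idx])

def health_index(feature, baseline_features):
    """Mahalanobis distance of `feature` from the baseline distribution."""
    mu = baseline_features.mean(axis=0)
    cov = np.cov(baseline_features, rowvar=False)
    d = feature - mu
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

rng = np.random.default_rng(1)
baseline = np.array([indicators(rng.normal(size=4096)) for _ in range(50)])
healthy = indicators(rng.normal(size=4096))
faulty_sig = rng.normal(size=4096)
faulty_sig[::256] += 8.0                 # periodic impulses mimic a bearing fault
print(health_index(indicators(faulty_sig), baseline) >
      health_index(healthy, baseline))  # True
```

A rising distance from the healthy cloud is the degradation track; the paper's version additionally feeds multiwavelet-derived entropies into the same fusion.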
Quantitative Ultrasound: Transition from the Laboratory to the Clinic
NASA Astrophysics Data System (ADS)
Hall, Timothy
2014-03-01
There is a long history of development and testing of quantitative methods in medical ultrasound. From the initial attempts to scan breasts with ultrasound in the early 1950s, there was a simultaneous attempt to classify tissue as benign or malignant based on the appearance of the echo signal on an oscilloscope. Since that time, there has been substantial improvement in the ultrasound systems used, the models that describe wave propagation in random media, the methods of signal detection theory, and the combination of those models and methods into parameter estimation techniques. One particularly useful measure in ultrasonics is the acoustic differential scattering cross section per unit volume in the special case of 180° scattering (as occurs in pulse-echo ultrasound imaging), which is known as the backscatter coefficient. The backscatter coefficient, and parameters derived from it, can be used to objectively measure quantities that are used clinically to subjectively describe ultrasound images. For example, the "echogenicity" (relative ultrasound image brightness) of the renal cortex is commonly compared to that of the liver. When investigating the possibility of liver disease, it is assumed that renal cortex echogenicity is normal; when investigating the kidney, it is assumed that liver echogenicity is normal. Objective measures of backscatter remove these assumptions. There is a 30-year history of accurate estimates of acoustic backscatter coefficients with laboratory systems. Twenty years ago that ability was extended to clinical imaging systems with array transducers. Recent studies involving multiple laboratories and a variety of clinical imaging systems have demonstrated system-independent estimates of acoustic backscatter coefficients in well-characterized media (agreement within about 1.5 dB over about a one-decade frequency range).
Advancements that made this possible, transition of this and similar capabilities into medical practice and the prospects for quantitative image-based biomarkers will be discussed. This work was supported, in part, by NIH grants R01CA140271 and R01HD072077.
Using Live-Crown Ratio to Control Wood Quality: An Example of Quantitative Silviculture
Thomas J. Dean
1999-01-01
Quantitative silviculture is the application of biological relationships in meeting specific, quantitative management objectives. It is a two-sided approach requiring the identification and application of biological relationships. An example of quantitative silviculture is presented that uses a relationship between average live-crown ratio and relative stand density...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shekar, Venkateswaran; Fiondella, Lance; Chatterjee, Samrat
Transportation networks are critical to the social and economic function of nations. Given the continuing increase in the populations of cities throughout the world, the criticality of transportation infrastructure is expected to increase. Thus, it is ever more important to mitigate congestion as well as to assess the impact disruptions would have on individuals who depend on transportation for their work and livelihood. Moreover, several government organizations are responsible for ensuring transportation networks are available despite the constant threat of natural disasters and terrorist activities. Most of the previous transportation network vulnerability research has been performed in the context of static traffic models, many of which are formulated as traditional optimization problems. However, transportation networks are dynamic because their usage varies over time. Thus, more appropriate methods to characterize the vulnerability of transportation networks should consider their dynamic properties. This paper presents a quantitative approach to assess the vulnerability of a transportation network to disruptions with methods from traffic simulation. Our approach can prioritize the critical links over time and is generalizable to the case where both link and node disruptions are of concern. We illustrate the approach through a series of examples. Our results demonstrate that the approach provides quantitative insight into the time varying criticality of links. Such an approach could be used as the objective function of less traditional optimization methods that use simulation and other techniques to evaluate the relative utility of a particular network defense to reduce vulnerability and increase resilience.
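A static baseline for the link-criticality idea (the paper's point is that dynamic, simulation-based measures go further) can be sketched with all-pairs shortest paths on a hypothetical four-node network:

```python
import numpy as np

INF = float("inf")

def apsp_sum(adj):
    """Total all-pairs shortest-path cost via Floyd-Warshall."""
    d = adj.copy()
    for k in range(len(d)):
        d = np.minimum(d, d[:, k:k + 1] + d[k:k + 1, :])
    return d[np.triu_indices(len(d), 1)].sum()

def link_criticality(adj, links):
    """Cost increase caused by removing each link (higher = more critical)."""
    base = apsp_sum(adj)
    scores = {}
    for i, j in links:
        cut = adj.copy()
        cut[i, j] = cut[j, i] = INF
        scores[(i, j)] = apsp_sum(cut) - base
    return scores

# Hypothetical 4-node road network: link costs are travel times.
links = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (0, 3, 5.0), (0, 2, 2.0)]
adj = np.full((4, 4), INF)
np.fill_diagonal(adj, 0.0)
for i, j, c in links:
    adj[i, j] = adj[j, i] = c

scores = link_criticality(adj, [(i, j) for i, j, _ in links])
print(max(scores, key=scores.get))  # (2, 3): the only cheap access to node 3
```

A dynamic analysis replaces the fixed costs with simulated, time-varying travel times, so the ranking itself can change over the day.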
The application of multiple reaction monitoring and multi-analyte profiling to HDL proteins
2014-01-01
Background HDL carries a rich protein cargo and examining HDL protein composition promises to improve our understanding of its functions. Conventional mass spectrometry methods can be lengthy and difficult to extend to large populations. In addition, without prior enrichment of the sample, the ability of these methods to detect low abundance proteins is limited. Our objective was to develop a high-throughput approach to examine HDL protein composition applicable to diabetes and cardiovascular disease (CVD). Methods We optimized two multiplexed assays to examine HDL proteins using a quantitative immunoassay (Multi-Analyte Profiling- MAP) and mass spectrometric-based quantitative proteomics (Multiple Reaction Monitoring-MRM). We screened HDL proteins using human xMAP (90 protein panel) and MRM (56 protein panel). We extended the application of these two methods to HDL isolated from a group of participants with diabetes and prior cardiovascular events and a group of non-diabetic controls. Results We were able to quantitate 69 HDL proteins using MAP and 32 proteins using MRM. For several common proteins, the use of MRM and MAP was highly correlated (p < 0.01). Using MAP, several low abundance proteins implicated in atherosclerosis and inflammation were found on HDL. On the other hand, MRM allowed the examination of several HDL proteins not available by MAP. Conclusions MAP and MRM offer a sensitive and high-throughput approach to examine changes in HDL proteins in diabetes and CVD. This approach can be used to measure the presented HDL proteins in large clinical studies. PMID:24397693
Morgan, Patrick; Nissi, Mikko J; Hughes, John; Mortazavi, Shabnam; Ellerman, Jutta
2017-07-01
Objectives: The purpose of this study was to validate T2* mapping as an objective, noninvasive method for the prediction of acetabular cartilage damage. Methods: This is the second step in the validation of T2*. In a previous study, we established a quantitative predictive model for identifying and grading acetabular cartilage damage. In this study, the model was applied to a second cohort of 27 consecutive hips to validate the model. A clinical 3.0-T imaging protocol with T2* mapping was used. Acetabular regions of interest (ROIs) were identified on magnetic resonance images and graded using the previously established model. Each ROI was then graded in a blinded fashion by arthroscopy. Accurate surgical location of ROIs was facilitated with a 2-dimensional map projection of the acetabulum. A total of 459 ROIs were studied. Results: When T2* mapping and arthroscopic assessment were compared, 82% of ROIs were within 1 Beck group (of a total of 6 possible) and 32% of ROIs were classified identically. Disease prediction based on receiver operating characteristic curve analysis demonstrated a sensitivity of 0.713 and a specificity of 0.804. Model stability evaluation required no significant changes to the predictive model produced in the initial study. Conclusions: These results validate that T2* mapping provides statistically comparable information regarding acetabular cartilage when compared to arthroscopy. In contrast to arthroscopy, T2* mapping is quantitative, noninvasive, and can be used in follow-up. Unlike research quantitative magnetic resonance protocols, T2* takes little time and does not require a contrast agent. This may facilitate its use in the clinical sphere.
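The two agreement figures quoted above (identical classification and agreement within one Beck group) are simple proportions over paired grades; a toy sketch with hypothetical grades, not study data:

```python
# Illustrative only: exact and within-one-grade agreement between MRI-derived
# and arthroscopic grades for the same regions of interest.
def grading_agreement(mri_grades, scope_grades):
    pairs = list(zip(mri_grades, scope_grades))
    exact = sum(a == b for a, b in pairs) / len(pairs)
    within_one = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
    return exact, within_one

print(grading_agreement([0, 2, 3, 1, 4], [0, 3, 3, 3, 4]))  # (0.6, 0.8)
```

The study reports the same two proportions (0.32 exact, 0.82 within one group) over its 459 ROIs.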
Lim, Tze-Peng; Ledesma, Kimberly R.; Chang, Kai-Tai; Hou, Jing-Guo; Kwa, Andrea L.; Nikolaou, Michael; Quinn, John P.; Prince, Randall A.; Tam, Vincent H.
2008-01-01
Treatment of multidrug-resistant bacterial infections poses a therapeutic challenge to clinicians; combination therapy is often the only viable option for multidrug-resistant infections. A quantitative method was developed to assess the combined killing abilities of antimicrobial agents. Time-kill studies (TKS) were performed using a multidrug-resistant clinical isolate of Acinetobacter baumannii with escalating concentrations of cefepime (0 to 512 mg/liter), amikacin (0 to 256 mg/liter), and levofloxacin (0 to 64 mg/liter). The bacterial burden data in single and combined (two of the three agents with clinically achievable concentrations in serum) TKS at 24 h were mathematically modeled to provide an objective basis for comparing various antimicrobial agent combinations. Synergy and antagonism were defined as interaction indices of <1 and >1, respectively. A hollow-fiber infection model (HFIM) simulating various clinical (fluctuating concentrations over time) dosing exposures was used to selectively validate our quantitative assessment of the combined killing effect. Model fits in all single-agent TKS were satisfactory (r2 > 0.97). An enhanced combined overall killing effect was seen in the cefepime-amikacin combination (interactive index, 0.698; 95% confidence interval [CI], 0.675 to 0.722) and the cefepime-levofloxacin combination (interactive index, 0.929; 95% CI, 0.903 to 0.956), but no significant difference in the combined overall killing effect for the levofloxacin-amikacin combination was observed (interactive index, 0.994; 95% CI, 0.982 to 1.005). These assessments were consistent with observations in HFIM validation studies. Our method could be used to objectively rank the combined killing activities of two antimicrobial agents when used together against a multidrug-resistant A. baumannii isolate. It may offer better insights into the effectiveness of various antimicrobial combinations and warrants further investigations. PMID:18505848
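The study fits an interaction index to full concentration-ranged time-kill data; as a much simpler, conventional stand-in, the classical time-kill synergy rule (a ≥2 log10 CFU/ml drop versus the most active single agent) can be coded directly (illustrative thresholds, not the authors' model):

```python
# Simplified illustration only: classify a two-drug combination from 24-h
# log10 kills (reductions in CFU/ml versus growth control), using the
# conventional >=2-log10 rule rather than the paper's fitted interaction index.
def classify_combination(kill_a, kill_b, kill_ab):
    best_single = max(kill_a, kill_b)
    if kill_ab - best_single >= 2.0:
        return "synergy"
    if best_single - kill_ab >= 2.0:
        return "antagonism"
    return "indifference"

print(classify_combination(1.0, 0.5, 3.5))  # synergy
```

The paper's interaction index refines this by modeling the whole concentration-effect surface, so the verdict does not hinge on a single threshold.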
Quantitative damage imaging using Lamb wave diffraction tomography
NASA Astrophysics Data System (ADS)
Zhang, Hai-Yan; Ruan, Min; Zhu, Wen-Fa; Chai, Xiao-Dong
2016-12-01
In this paper, we investigate diffraction tomography for the quantitative imaging of damage in the form of partly through-thickness holes of various shapes in isotropic plates, using converted and non-converted scattered Lamb waves generated numerically. Finite element simulations are carried out to provide the scattered wave data. The validity of the finite element model is confirmed by comparing the scattering directivity pattern (SDP) of a circular blind-hole damage between the finite element simulations and the analytical results. The imaging method is based on a theoretical relation between the one-dimensional (1D) Fourier transform of the scattered projection and the two-dimensional (2D) spatial Fourier transform of the scattering object. A quantitative image of the damage is obtained by carrying out the 2D inverse Fourier transform of the scattering object. The proposed approach employs a circular transducer network containing forward and backward projections, which lead to so-called transmission-mode diffraction tomography (TMDT) and reflection-mode diffraction tomography (RMDT), respectively. The reconstructed results of the two projections for a non-converted S0 scattered mode are investigated to illuminate the influence of the scattering field data. The results show that Lamb wave diffraction tomography using the combination of TMDT and RMDT improves the imaging quality compared with using only TMDT or RMDT. The scattered data of the converted A0 mode are also used to assess the performance of the diffraction tomography method. It is found that circular and elliptical damages can still be reasonably identified from the reconstructed images, while the reconstructed results for more complex shapes such as crisscrossed rectangles and racecourse profiles are relatively poor. Project supported by the National Natural Science Foundation of China (Grant Nos. 11474195, 11274226, 11674214, and 51478258).
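The 1D-to-2D Fourier relation that the imaging method rests on is, in its straight-ray limit, the projection-slice theorem, which a few lines of NumPy can verify on a toy "damage" map (hypothetical geometry, not the paper's simulations):

```python
import numpy as np

# Projection-slice sketch: the 1-D FFT of a projection of an object equals
# the corresponding central slice of the object's 2-D FFT.
obj = np.zeros((64, 64))
obj[24:40, 20:44] = 1.0                 # simple rectangular "damage" region

projection = obj.sum(axis=0)            # project along y (vertical rays)
slice_1d = np.fft.fft(projection)

spectrum_2d = np.fft.fft2(obj)
central_slice = spectrum_2d[0, :]       # ky = 0 row of the 2-D spectrum

print(np.allclose(slice_1d, central_slice))  # True
```

Diffraction tomography generalizes this identity: with wave diffraction included, each projection maps onto a circular arc of the 2-D spectrum rather than a straight slice, which is why a circular transducer network fills Fourier space.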
Gastrointestinal Traits: Individualizing Therapy for Obesity with Drugs and Devices
Camilleri, Michael; Acosta, Andres
2015-01-01
Objectives: The objectives were to review the discrepancy between the number of people requiring weight-loss treatment and the results achieved, and to assess the potential effects on quantitative gastrointestinal traits of pharmacological treatments (recently approved for obesity) and endoscopically deployed devices in development for obesity treatment. Methods: We conducted a review of the relevant literature to achieve our objectives. Results: The 2013 guidelines increased the number of adults recommended for weight-loss treatment by 20.9% (116.0 million to 140.2 million). There is an imbalance between the efficacy and costs of commercial weight-loss programs and drug therapy (average weight loss ~5 kg). The number of bariatric procedures performed in the United States has doubled in the past decade. The efficacy of bariatric surgery is attributed to reduction in the volume of the stomach, nutrient malabsorption with some types of surgery, increased postprandial incretin responses, and activation of farnesoid X receptor mechanisms. These gastrointestinal and behavioral traits identify sub-phenotypes of obesity, based on recent research. Conclusions: The mechanisms or traits targeted by drug and device treatments include centrally mediated alterations of appetite or satiation, diversion of nutrients, and alteration of stomach capacity, gastric emptying, or incretin hormones. Future treatment may be individualized based on quantitative gastrointestinal and behavioral traits measured in obese patients. PMID:26271184
Prioritising coastal zone management issues through fuzzy cognitive mapping approach.
Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi
2012-04-30
Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built, ultimately resulting in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is proposed here for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors and to formalize their coastal zone perceptions and knowledge. We then investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round-table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), in which the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions, and sectoral initiatives. Furthermore, common problems, expectations, and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and for understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
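A minimal Fuzzy Cognitive Map iteration, with hypothetical concepts and weights rather than the stakeholders' actual maps, shows how such maps turn perceived influences into quantitative steady states:

```python
import numpy as np

# Illustrative FCM sketch: concepts hold activation levels in [0, 1], edges
# carry signed influence weights, and the map is iterated with a sigmoid
# squashing function until the activations settle.
def run_fcm(weights, state, steps=100, lam=1.0):
    w = np.asarray(weights, dtype=float)
    s = np.asarray(state, dtype=float)
    for _ in range(steps):
        s = 1.0 / (1.0 + np.exp(-lam * (s + w.T @ s)))  # common update rule
    return s

# Three toy concepts: tourism pressure -> coastal degradation -| fish stocks.
w = np.array([[0.0, 0.7, 0.0],
              [0.0, 0.0, -0.8],
              [0.0, 0.0, 0.0]])
final = run_fcm(w, [0.9, 0.1, 0.5])
print(np.all((final >= 0) & (final <= 1)))  # True
```

Running the map forward lets stakeholders see which management concepts end up reinforced or suppressed under their own stated influence structure.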
Attitudes of emergency department staff toward family presence during resuscitation.
Wacht, Oren; Dopelt, Keren; Snir, Yoram; Davidovitch, Nadav
2010-06-01
While family presence during resuscitation (FPDR) has been researched extensively in the international and especially the American medical literature, in Israel this subject has rarely been researched. Because such policies have become common practice in many countries, it is important to investigate the attitudes of health care staff in Israeli emergency departments (EDs) to better understand the potential implications of adopting such policies. Our objective was to examine the attitudes of the physicians and nurses in the ED of Soroka Medical Center toward FPDR. The methods we used were both qualitative (partly structured open interviews of 10 ED staff members from various medical professions) and quantitative (an anonymous questionnaire that collected sociodemographic, professional, and attitude data). The qualitative and quantitative results showed that most staff members opposed FPDR. The main reasons for objecting to FPDR were concern about family criticism, the added pressure that would be put on the staff members, fear of lawsuits, fear of hurting the feelings of the families, and the danger of losing one's "objectivity" while treating patients. Physicians objected more strongly to FPDR than did nurses. More research is needed on FPDR in Israel, including an examination of its medical, ethical, legal, and logistic aspects. In addition to the views of the medical staff, the attitudes of patients and their families should also be examined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdel-Kareem, O.; Ghoneim, M.; Harith, M. A.
2011-09-22
Analysis of metal objects is a necessary step in establishing an appropriate conservation treatment for an object or in following up the results of applying a suggested treatment. The main considerations in selecting a method for the investigation and analysis of metal objects are diagnostic power, representative sampling, reproducibility, destructive nature/invasiveness of the analysis, and accessibility of the appropriate instrument. This study aims at evaluating the usefulness of the Laser Induced Breakdown Spectroscopy (LIBS) technique for the analysis of historical metal objects. In this study, various historical metal objects collected from different museums and excavations in Egypt were investigated using the LIBS technique. To evaluate the usefulness of the suggested analytical protocol, the same metal objects were also investigated by other methods, such as Scanning Electron Microscopy with energy-dispersive x-ray analysis (SEM-EDX) and X-ray Diffraction (XRD). This study confirms that the LIBS technique is a very useful technique that can be used safely for investigating historical metal objects. LIBS analysis can quickly provide information on the qualitative and semi-quantitative elemental content of different metal objects and support their characterization and classification. It is a practically non-destructive technique with the critical advantage of being applicable in situ, thereby avoiding sampling and sample preparation. It can be a dependable, satisfactory, and effective method for the low-cost study of archaeological and historical metals. However, we have to take into consideration that the corrosion of metal leads to material alteration and possible loss of certain metals in the form of soluble salts.
Certain corrosion products are known to leach out of the object; therefore, their low content does not necessarily reflect the composition of the metal at the time of the object's manufacture. Another point to take into consideration is the heterogeneity of metal alloy objects, which often results from poor mixing of the alloy constituents. Further research is necessary to investigate and determine the most appropriate and effective approaches and methods for the conservation of these metal objects.
Proposed Objective Odor Control Test Methodology for Waste Containment
NASA Technical Reports Server (NTRS)
Vos, Gordon
2010-01-01
The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentration quantitatively. IAA is a clear, colorless liquid with a banana-like odor, a documented human odor detection threshold of 0.025 ppm, and a limit of quantitation of 15 ppb.
Shin, S M; Choi, Y-S; Yamaguchi, T; Maki, K; Cho, B-H; Park, S-B
2015-01-01
Objectives: To evaluate axial cervical vertebral (ACV) shape quantitatively and to build a prediction model for skeletal maturation level using statistical shape analysis for Japanese individuals. Methods: The sample included 24 female and 19 male patients with hand–wrist radiographs and CBCT images. Through generalized Procrustes analysis and principal components (PCs) analysis, the meaningful PCs were extracted from each ACV shape and analysed for the estimation regression model. Results: Each ACV shape had meaningful PCs, except for the second axial cervical vertebra. Based on these models, the smallest prediction intervals (PIs) were from the combination of the shape space PCs, age and gender. Overall, the PIs of the male group were smaller than those of the female group. There was no significant correlation between centroid size as a size factor and skeletal maturation level. Conclusions: Our findings suggest that the ACV maturation method, which was applied by statistical shape analysis, could confirm information about skeletal maturation in Japanese individuals as an available quantifier of skeletal maturation and could be as useful a quantitative method as the skeletal maturation index. PMID:25411713
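The generalized Procrustes step used before the principal components analysis removes translation, scale, and rotation from each landmark configuration. A minimal sketch of the pairwise (ordinary) Procrustes alignment underlying it, with invented 2D landmark data:

```python
import numpy as np

def procrustes_align(X, Y):
    """Align landmark set Y (n, d) to X (n, d): remove translation and
    centroid size, then find the orthogonal map minimizing the residual.
    Returns the aligned Y and the Procrustes distance to X."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Xc = Xc / np.linalg.norm(Xc)   # unit centroid size
    Yc = Yc / np.linalg.norm(Yc)
    U, _, Vt = np.linalg.svd(Yc.T @ Xc)
    Q = U @ Vt                      # optimal rotation (reflections allowed)
    return Yc @ Q, float(np.linalg.norm(Xc - Yc @ Q))
```

Generalized Procrustes analysis repeats this alignment against an iteratively updated mean shape; the aligned coordinates then feed the shape-space PCA.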
Lee, Won-Joon; Wilkinson, Caroline M; Hwang, Hyeon-Shik; Lee, Sang-Mi
2015-05-01
Accuracy is the most important factor supporting the reliability of a forensic facial reconstruction (FFR) when it is compared to the corresponding actual face. A number of methods have been employed to evaluate the objective accuracy of FFR. Recently, attempts have been made to measure the degree of resemblance between a computer-generated FFR and the actual face by geometric surface comparison. In this study, three FFRs were produced employing live adult Korean subjects and three-dimensional computerized modeling software. The deviations of the facial surfaces between each FFR and the head-scan CT of the corresponding subject were analyzed in reverse modeling software. The results were compared with those from a previous study that applied the same methodology except for the average facial soft tissue depth dataset. The three FFRs of this study, which applied the updated dataset, demonstrated smaller deviation errors between the facial surfaces of the FFR and the corresponding subject than those from the previous study. The results suggest that appropriate average tissue depth data are important for increasing the quantitative accuracy of FFR. © 2015 American Academy of Forensic Sciences.
Geometric, Kinematic and Radiometric Aspects of Image-Based Measurements
NASA Technical Reports Server (NTRS)
Liu, Tianshu
2002-01-01
This paper discusses theoretical foundations of quantitative image-based measurements for extracting and reconstructing geometric, kinematic and dynamic properties of observed objects. New results are obtained by using a combination of methods in perspective geometry, differential geometry, radiometry, kinematics and dynamics. Specific topics include perspective projection transformation, perspective developable conical surface, perspective projection under surface constraint, perspective invariants, the point correspondence problem, motion fields of curves and surfaces, and motion equations of image intensity. The methods given in this paper are useful for determining morphology and motion fields of deformable bodies such as elastic bodies, viscoelastic media and fluids.
Diagnostic analysis of liver B ultrasonic texture features based on LM neural network
NASA Astrophysics Data System (ADS)
Chi, Qingyun; Hua, Hu; Liu, Menglin; Jiang, Xiuying
2017-03-01
In this study, B-mode ultrasound images of 124 benign and malignant patients were randomly selected as the study objects. The B-mode ultrasound images of the liver were pre-processed by enhanced de-noising. Gray-level co-occurrence matrices reflecting the information at each angle were constructed, 22 texture features were extracted and reduced by principal component analysis, and the result was combined with an LM neural network for diagnosis and classification. Experimental results show that this method is a rapid and effective diagnostic method for liver imaging, which provides a quantitative basis for the clinical diagnosis of liver diseases.
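The gray-level co-occurrence matrix mentioned above tabulates how often pairs of gray levels co-occur at a given displacement; texture features such as contrast and homogeneity are then moments of that table. A minimal NumPy sketch (illustrative, not the study's 22-feature pipeline):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Normalized gray-level co-occurrence matrix for displacement (dx, dy).
    img must hold integer gray levels in [0, levels)."""
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

def glcm_features(P):
    """Two classic texture features as moments of the co-occurrence table."""
    i, j = np.indices(P.shape)
    contrast = float(np.sum(P * (i - j) ** 2))
    homogeneity = float(np.sum(P / (1.0 + np.abs(i - j))))
    return contrast, homogeneity
```

Computing such features over several displacements/angles and stacking them per image yields the kind of feature vector that is then reduced by PCA and fed to a classifier.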
An ultra-wideband microwave tomography system: preliminary results.
Gilmore, Colin; Mojabi, Puyan; Zakaria, Amer; Ostadrahimi, Majid; Kaye, Cam; Noghanian, Sima; Shafai, Lotfollah; Pistorius, Stephen; LoVetri, Joe
2009-01-01
We describe a 2D wide-band multi-frequency microwave imaging system intended for biomedical imaging. The system is capable of collecting data from 2-10 GHz, with 24 antenna elements connected to a vector network analyzer via a 2 x 24 port matrix switch. Through the use of two different nonlinear reconstruction schemes: the Multiplicative-Regularized Contrast Source Inversion method and an enhanced version of the Distorted Born Iterative Method, we show preliminary imaging results from dielectric phantoms where data were collected from 3-6 GHz. The early inversion results show that the system is capable of quantitatively reconstructing dielectric objects.
Orthoclinostatic test as one of the methods for evaluating the human functional state
NASA Technical Reports Server (NTRS)
Doskin, V. A.; Gissen, L. D.; Bomshteyn, O. Z.; Merkin, E. N.; Sarychev, S. B.
1980-01-01
The possible use of different methods to evaluate autonomic regulation in hygienic studies was examined. The simplest and most objective tests were selected. It is shown that the use of the optimized standards not only makes it possible to detect unfavorable shifts earlier, but also permits a quantitative characterization of the degree of impairment in the state of the organism. Precise interpretation of the observed shifts is possible. Results indicate that the standards can serve as one of the criteria for evaluating the state and can be widely used in hygienic practice.
3D Actin Network Centerline Extraction with Multiple Active Contours
Xu, Ting; Vavylonis, Dimitrios; Huang, Xiaolei
2013-01-01
Fluorescence microscopy is frequently used to study two and three dimensional network structures formed by cytoskeletal polymer fibers such as actin filaments and actin cables. While these cytoskeletal structures are often dilute enough to allow imaging of individual filaments or bundles of them, quantitative analysis of these images is challenging. To facilitate quantitative, reproducible and objective analysis of the image data, we propose a semi-automated method to extract actin networks and retrieve their topology in 3D. Our method uses multiple Stretching Open Active Contours (SOACs) that are automatically initialized at image intensity ridges and then evolve along the centerlines of filaments in the network. SOACs can merge, stop at junctions, and reconfigure with others to allow smooth crossing at junctions of filaments. The proposed approach is generally applicable to images of curvilinear networks with low SNR. We demonstrate its potential by extracting the centerlines of synthetic meshwork images, actin networks in 2D Total Internal Reflection Fluorescence Microscopy images, and 3D actin cable meshworks of live fission yeast cells imaged by spinning disk confocal microscopy. Quantitative evaluation of the method using synthetic images shows that for images with SNR above 5.0, the average vertex error measured by the distance between our result and ground truth is 1 voxel, and the average Hausdorff distance is below 10 voxels. PMID:24316442
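The Hausdorff distance used in the evaluation above is the largest nearest-neighbor gap between two point sets. A small NumPy sketch of the symmetric version:

```python
import numpy as np

def hausdorff(A, B):
    """Symmetric Hausdorff distance between point sets A (n, d) and B (m, d):
    the worst-case distance from any point in one set to the other set."""
    # Pairwise distance matrix via broadcasting, then directed maxima.
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    return float(max(D.min(axis=1).max(), D.min(axis=0).max()))
```

Applied to extracted versus ground-truth centerline vertices (in voxel coordinates), this gives exactly the kind of voxel-unit error figures reported above.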
Methods for collecting algal samples as part of the National Water-Quality Assessment Program
Porter, Stephen D.; Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.
1993-01-01
Benthic algae (periphyton) and phytoplankton communities are characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. This multidisciplinary approach provides multiple lines of evidence for evaluating water-quality status and trends, and for refining an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. Water quality can be characterized by evaluating the results of qualitative and quantitative measurements of the algal community. Qualitative periphyton samples are collected to develop a list of taxa present in the sampling reach. Quantitative periphyton samples are collected to measure algal community structure within selected habitats. These samples of benthic algal communities are collected from natural substrates, using the sampling methods that are most appropriate for the habitat conditions. Phytoplankton samples may be collected in large nonwadeable streams and rivers to meet specific program objectives. Estimates of algal biomass (chlorophyll content and ash-free dry mass) also are optional measures that may be useful for interpreting water-quality conditions. A nationally consistent approach provides guidance on site, reach, and habitat selection, as well as information on methods and equipment for qualitative and quantitative sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data locally, regionally, and nationally.
Berton, Danilo Cortozi; Kalil, Andre C; Cavalcanti, Manuela; Teixeira, Paulo José Zimermann
2008-10-08
Ventilator-associated pneumonia (VAP) is a common infectious disease in intensive care units (ICUs). The best diagnostic approach to resolve this condition remains uncertain. To evaluate whether quantitative cultures of respiratory secretions are effective in reducing mortality in immunocompetent patients with VAP, compared with qualitative cultures. We also considered changes in antibiotic use, length of ICU stay and mechanical ventilation. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library 2007, issue 4), which contains the Acute Respiratory Infections Group's Specialized Register; MEDLINE (1966 to December 2007); EMBASE (1974 to December 2007); and LILACS (1982 to December 2007). Randomized controlled trials (RCTs) comparing respiratory samples processed quantitatively or qualitatively, obtained by invasive or non-invasive methods from immunocompetent patients with VAP, and which analyzed the impact of these methods on antibiotic use and mortality rates. Two review authors independently reviewed and selected trials from the search results, and assessed studies for suitability, methodology and quality. We analyzed data using Review Manager software. We pooled the included studies to yield the risk ratio (RR) for mortality and antibiotic change with 95% confidence intervals (CI). Of the 3931 references identified from the electronic databases, five RCTs (1367 patients) met the inclusion criteria. Three studies compared invasive methods using quantitative cultures versus non-invasive methods using qualitative cultures, and were used to answer the main objective of this review. The other two studies compared invasive versus non-invasive methods, both using quantitative cultures. All five studies were combined to compare invasive versus non-invasive interventions for diagnosing VAP. 
The studies that compared quantitative and qualitative cultures (1240 patients) showed no statistically significant differences in mortality rates (RR = 0.91, 95% CI 0.75 to 1.11). The analysis of all five RCTs showed there was no evidence of mortality reduction in the invasive group versus the non-invasive group (RR = 0.93, 95% CI 0.78 to 1.11). There were no significant differences between the interventions with respect to the number of days on mechanical ventilation, length of ICU stay or antibiotic change. There is no evidence that the use of quantitative cultures of respiratory secretions results in reduced mortality, reduced time in ICU and on mechanical ventilation, or higher rates of antibiotic change when compared to qualitative cultures in patients with VAP. Similar results were observed when invasive strategies were compared with non-invasive strategies.
An interactive tool for semi-automatic feature extraction of hyperspectral data
NASA Astrophysics Data System (ADS)
Kovács, Zoltán; Szabó, Szilárd
2016-09-01
The spectral reflectance of the surface provides valuable information about the environment, which can be used to identify objects (e.g. land cover classification) or to estimate quantities of substances (e.g. biomass). We aimed to develop an MS Excel add-in, the Hyperspectral Data Analyst (HypDA), for multipurpose quantitative analysis of spectral data in the VBA programming language. HypDA was designed to calculate spectral indices from spectral data with user-defined formulas (in all possible combinations involving a maximum of 4 bands) and to find the best correlations between these indices and the quantitative attribute data of the same objects. Different types of regression models reveal the relationships, and the best results are saved in a worksheet. Qualitative variables can also be involved in the analysis, carried out with separability and hypothesis testing, i.e. to find the wavelengths responsible for separating data into predefined groups. HypDA can be used both with hyperspectral imagery and with spectrometer measurements. This bivariate approach requires significantly fewer observations than popular multivariate methods; it can therefore be applied to a wide range of research areas.
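The core band-combination search HypDA performs can be illustrated for the simplest two-band case: enumerate every band pair, compute a normalized-difference index, and keep the pair best correlated with the attribute data. This NumPy sketch shows the idea (it is not the add-in's VBA code):

```python
import numpy as np
from itertools import combinations

def best_nd_index(spectra, target):
    """Exhaustively try every band pair (i, j), score the normalized-
    difference index (b_i - b_j) / (b_i + b_j) by its correlation with the
    target attribute, and return the best pair with its correlation.
    spectra: (n_samples, n_bands); target: (n_samples,)."""
    best_pair, best_r = None, 0.0
    for i, j in combinations(range(spectra.shape[1]), 2):
        idx = (spectra[:, i] - spectra[:, j]) / (spectra[:, i] + spectra[:, j])
        r = np.corrcoef(idx, target)[0, 1]
        if abs(r) > abs(best_r):
            best_pair, best_r = (i, j), r
    return best_pair, best_r
```

Extending the enumeration to user-defined formulas of up to four bands, and to several regression models per index, gives the full search the abstract describes.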
Application of fuzzy theories to formulation of multi-objective design problems. [for helicopters
NASA Technical Reports Server (NTRS)
Dhingra, A. K.; Rao, S. S.; Miura, H.
1988-01-01
Much of the decision making in the real world takes place in an environment in which the goals, the constraints, and the consequences of possible actions are not known precisely. In order to deal with imprecision quantitatively, the tools of fuzzy set theory can be used. This paper demonstrates the effectiveness of fuzzy theories in the formulation and solution of two types of helicopter design problems involving multiple objectives. The first problem deals with the determination of optimal flight parameters to accomplish a specified mission in the presence of three competing objectives. The second problem addresses the optimal design of the main rotor of a helicopter involving eight objective functions. A method of solving these multi-objective problems using nonlinear programming techniques is presented. Results obtained using the fuzzy formulation are compared with those obtained using crisp optimization techniques. The outlined procedures are expected to be useful in situations where doubt arises about the exactness of permissible values, degree of credibility, and correctness of statements and judgements.
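One common way such fuzzy multi-objective formulations are resolved is the Bellman-Zadeh max-min rule: map each objective onto a [0, 1] membership and pick the alternative whose worst membership is largest. A minimal sketch with invented candidate designs (not the paper's helicopter data):

```python
import numpy as np

def fuzzy_membership(f, f_best, f_worst):
    """Linear membership: 1 at the best objective value, 0 at the worst
    (assumes f_best != f_worst for each objective)."""
    return np.clip((f_worst - f) / (f_worst - f_best), 0.0, 1.0)

# Hypothetical candidate designs scored on two minimized objectives
# (say, weight and cost); rows are candidates, columns are objectives.
candidates = np.array([[10.0, 5.0],
                       [12.0, 3.0],
                       [15.0, 2.0]])
mu = fuzzy_membership(candidates, candidates.min(axis=0), candidates.max(axis=0))
overall = mu.min(axis=1)          # satisfaction = weakest objective
chosen = int(np.argmax(overall))  # max-min decision
```

In a continuous design space the same max-min criterion becomes a nonlinear programming problem, which is the setting the paper addresses.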
Disease quantification on PET/CT images without object delineation
NASA Astrophysics Data System (ADS)
Tong, Yubing; Udupa, Jayaram K.; Odhner, Dewey; Wu, Caiyun; Fitzpatrick, Danielle; Winchell, Nicole; Schuster, Stephen J.; Torigian, Drew A.
2017-03-01
The derivation of quantitative information from images to make quantitative radiology (QR) clinically practical continues to face a major image analysis hurdle because of image segmentation challenges. This paper presents a novel approach to disease quantification (DQ) via positron emission tomography/computed tomography (PET/CT) images that explores how to decouple DQ methods from explicit dependence on object segmentation through the use of only object recognition results to quantify disease burden. The concept of an object-dependent disease map is introduced to express disease severity without performing explicit delineation and partial volume correction of either objects or lesions. The parameters of the disease map are estimated from a set of training image data sets. The idea is illustrated on 20 lung lesions and 20 liver lesions derived from 18F-2-fluoro-2-deoxy-D-glucose (FDG)-PET/CT scans of patients with various types of cancers and also on 20 NEMA PET/CT phantom data sets. Our preliminary results show that, on phantom data sets, "disease burden" can be estimated to within 2% of the known absolute true activity. Notwithstanding the difficulty of establishing true quantification on patient PET images, our results achieve 8% deviation from "true" estimates, with slightly larger deviations for small and diffuse lesions, where establishing ground truth becomes questionable, and smaller deviations for larger lesions, where ground-truth setup becomes more reliable. We are currently exploring extensions of the approach to include fully automated body-wide DQ, extensions to CT or magnetic resonance imaging (MRI) alone, to PET/CT performed with radiotracers other than FDG, and other functional forms of disease maps.
A practical material decomposition method for x-ray dual spectral computed tomography.
Hu, Jingjing; Zhao, Xing
2016-03-17
X-ray dual spectral CT (DSCT) scans the measured object with two different x-ray spectra, and the acquired rawdata can be used to perform the material decomposition of the object. Direct calibration methods allow a faster material decomposition for DSCT and can be separated in two groups: image-based and rawdata-based. The image-based method is an approximative method, and beam hardening artifacts remain in the resulting material-selective images. The rawdata-based method generally obtains better image quality than the image-based method, but this method requires geometrically consistent rawdata. However, today's clinical dual energy CT scanners usually measure different rays for different energy spectra and acquire geometrically inconsistent rawdata sets, and thus cannot meet the requirement. This paper proposes a practical material decomposition method to perform rawdata-based material decomposition in the case of inconsistent measurement. This method first yields the desired consistent rawdata sets from the measured inconsistent rawdata sets, and then employs rawdata-based technique to perform material decomposition and reconstruct material-selective images. The proposed method was evaluated by use of simulated FORBILD thorax phantom rawdata and dental CT rawdata, and simulation results indicate that this method can produce highly quantitative DSCT images in the case of inconsistent DSCT measurements.
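The basic idea of dual-spectral material decomposition can be sketched in its simplest image-based form: each pixel's low/high-energy attenuation pair is modeled as a linear combination of two basis materials, so decomposition reduces to a per-pixel 2x2 linear solve. The basis coefficients below are illustrative, not calibrated values, and this sketch ignores the beam-hardening and rawdata-consistency issues the paper actually addresses:

```python
import numpy as np

# Illustrative (uncalibrated) basis attenuation coefficients:
# rows = low/high energy spectrum, columns = material A, material B.
M = np.array([[0.50, 0.20],
              [0.30, 0.15]])

def decompose(mu_low, mu_high):
    """Per-pixel 2x2 solve of M @ [a, b] = [mu_low, mu_high], giving the
    amount of each basis material at every pixel of the input images."""
    rhs = np.stack([mu_low.ravel(), mu_high.ravel()])
    a, b = np.linalg.solve(M, rhs)
    return a.reshape(mu_low.shape), b.reshape(mu_low.shape)
```

Rawdata-based methods instead perform the (nonlinear) decomposition on projection data before reconstruction, which is why they require geometrically consistent rawdata sets.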
Techniques and Methods for Testing the Postural Function in Healthy and Pathological Subjects
Paillard, Thierry; Noé, Frédéric
2015-01-01
The different techniques and methods employed as well as the different quantitative and qualitative variables measured in order to objectify postural control are often chosen without taking into account the population studied, the objective of the postural test, and the environmental conditions. For these reasons, the aim of this review was to present and justify the different testing techniques and methods with their different quantitative and qualitative variables to make it possible to precisely evaluate each sensory, central, and motor component of the postural function according to the experiment protocol under consideration. The main practical and technological methods and techniques used in evaluating postural control were explained and justified according to the experimental protocol defined. The main postural conditions (postural stance, visual condition, balance condition, and test duration) were also analyzed. Moreover, the mechanistic exploration of the postural function often requires implementing disturbing postural conditions by using motor disturbance (mechanical disturbance), sensory stimulation (sensory manipulation), and/or cognitive disturbance (cognitive task associated with maintaining postural balance) protocols. Each type of disturbance was tackled in order to facilitate understanding of subtle postural control mechanisms and the means to explore them. PMID:26640800
Cui, Xueliang; Chen, Hui; Rui, Yunfeng; Niu, Yang; Li, He
2018-01-01
Objectives: Two-stage open reduction and internal fixation (ORIF) and limited internal fixation combined with external fixation (LIFEF) are two widely used methods to treat Pilon injury. However, which method is superior to the other remains controversial. This meta-analysis was performed to quantitatively compare two-stage ORIF and LIFEF and clarify which method is better with respect to postoperative complications in the treatment of tibial Pilon fractures. Methods: We conducted a meta-analysis to quantitatively compare the postoperative complications between two-stage ORIF and LIFEF. Eight studies involving 360 fractures in 359 patients were included in the meta-analysis. Results: The two-stage ORIF group had a significantly lower risk of superficial infection, nonunion, and bone healing problems than the LIFEF group. However, no significant differences in deep infection, delayed union, malunion, arthritis symptoms, or chronic osteomyelitis were found between the two groups. Conclusion: Two-stage ORIF was associated with a lower risk of postoperative complications with respect to superficial infection, nonunion, and bone healing problems than LIFEF for tibial Pilon fractures. Level of evidence: 2.
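The pooled effect sizes reported in meta-analyses like this one are typically computed with the fixed-effect Mantel-Haenszel estimator over per-study 2x2 tables. A minimal sketch of the pooled risk ratio, with invented counts rather than the review's data:

```python
def mh_pooled_rr(studies):
    """Fixed-effect Mantel-Haenszel pooled risk ratio.
    studies: iterable of (events_trt, n_trt, events_ctl, n_ctl) tuples.
    Each study contributes weight proportional to its size."""
    num = sum(a * n0 / (n1 + n0) for a, n1, c, n0 in studies)
    den = sum(c * n1 / (n1 + n0) for a, n1, c, n0 in studies)
    return num / den
```

For example, a single hypothetical study with 10/100 events in the treatment arm and 20/100 in the control arm pools to a risk ratio of 0.5; random-effects models and confidence intervals add variance weighting on top of this.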
Quality of data in multiethnic health surveys.
Pasick, R. J.; Stewart, S. L.; Bird, J. A.; D'Onofrio, C. N.
2001-01-01
OBJECTIVE: There has been insufficient research on the influence of ethno-cultural and language differences in public health surveys. Using data from three independent studies, the authors examine methods to assess data quality and to identify causes of problematic survey questions. METHODS: Qualitative and quantitative methods were used in this exploratory study, including secondary analyses of data from three baseline surveys (conducted in English, Spanish, Cantonese, Mandarin, and Vietnamese). Collection of additional data included interviews with investigators and interviewers; observations of item development; focus groups; think-aloud interviews; a test-retest assessment survey; and a pilot test of alternatively worded questions. RESULTS: The authors identify underlying causes for the 12 most problematic variables in three multiethnic surveys and describe them in terms of ethnic differences in reliability, validity, and cognitive processes (interpretation, memory retrieval, judgment formation, and response editing), and differences with regard to cultural appropriateness and translation problems. CONCLUSIONS: Multiple complex elements affect measurement in a multiethnic survey, many of which are neither readily observed nor understood through standard tests of data quality. Multiethnic survey questions are best evaluated using a variety of quantitative and qualitative methods that reveal different types and causes of problems. PMID:11889288
Cost analysis of objective resident cataract surgery assessments.
Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M
2015-05-01
To compare 8 ophthalmology resident surgical training tools to determine which is most cost-effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs in running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on the cost of the tools, the cost of time in evaluations, and supply and maintenance costs. For wet laboratory simulation, Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation in operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analyses of ophthalmology resident surgical training tools are needed so residency programs can implement tools that are valid, reliable, objective, and cost-effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Jawak, Shridhar D.; Jadhav, Ajay; Luis, Alvarinho J.
2016-05-01
Supraglacial debris was mapped in the Schirmacher Oasis, east Antarctica, using WorldView-2 (WV-2) high-resolution optical remote sensing data consisting of 8-band calibrated, Gram-Schmidt (GS)-sharpened and atmospherically corrected WV-2 imagery. This study is a preliminary attempt to develop an object-oriented rule set to extract supraglacial debris for the Antarctic region from 8-spectral-band imagery. Supraglacial debris was manually digitized from the satellite imagery to generate the ground reference data. Several trials were performed using a few existing traditional pixel-based classification techniques and color-texture-based object-oriented classification methods to extract supraglacial debris over a small domain of the study area. Multi-level segmentation and attributes such as scale, shape, size, and compactness, along with spectral information from the data, were used for developing the rule set. A quantitative error analysis was carried out against the manually digitized reference data to test the practicability of our approach relative to the traditional pixel-based methods. Our results indicate that the OBIA-based approach for extracting supraglacial debris (overall accuracy: 93%) performed better than all the traditional pixel-based methods (overall accuracy: 80-85%). The present attempt provides a comprehensively improved method for semiautomatic feature extraction in the supraglacial environment and a new direction in cryospheric research.
Leadership and management curriculum planning for Iranian general practitioners.
Khosravan, Shahla; Karimi Moonaghi, Hossein; Yazdani, Shahram; Ahmadi, Soleiman; Mansoorian, Mohammad Reza
2015-10-01
Leadership and management are two expected features and competencies for general practitioners (GPs). The purpose of this study was leadership and management curriculum planning for GPs, performed on the basis of Kern's curriculum planning cycle. The study was conducted in 2011-2012 in Iran using an explanatory mixed-methods approach. An initial qualitative phase used two focus group discussions and 28 semi-structured interviews with key informants to capture their experiences and viewpoints about the necessity of management courses for undergraduate medical students, and about goals, objectives, and educational strategies according to Kern's curriculum planning cycle. The data were used to develop a questionnaire for a quantitative written survey. The results of these two phases, together with a review of medical curricula in other countries and of the management curricula of other medical disciplines in Iran, were used in the management and leadership curriculum planning. In the qualitative phase, purposeful sampling and content analysis with constant comparison based on Strauss and Corbin's method were used; descriptive and analytic tests were applied to the quantitative data using SPSS version 14. In the qualitative stage of this research, six main categories were determined: the necessity of a management course, the features and objectives of the management curriculum, the proper educational setting, educational methods and strategies, evaluation methods, and feedback results. In the quantitative stage, 51.6% of the 126 respondents who completed the questionnaire rated management courses as highly necessary. The coordination of care and clinical leadership was determined to be the most important role for GPs, with a mean of 6.2 from the sample's viewpoint. Team working and group dynamics had the first priority among the principles and basics of management, with a mean of 3.59. Other results are presented in the paper.
Results of this study indicated the need to provide educational programs for GPs; it led to a systematic curriculum theory and clinical management using Kern cycle for general practitioner's discipline. Implementation and evaluation of this program is recommended.
Automated Tumor Volumetry Using Computer-Aided Image Segmentation
Bilello, Michel; Sadaghiani, Mohammed Salehi; Akbari, Hamed; Atthiah, Mark A.; Ali, Zarina S.; Da, Xiao; Zhan, Yiqang; O'Rourke, Donald; Grady, Sean M.; Davatzikos, Christos
2015-01-01
Rationale and Objectives Accurate segmentation of brain tumors, and quantification of tumor volume, is important for diagnosis, monitoring, and planning therapeutic intervention. Manual segmentation is not widely used because of time constraints. Previous efforts have mainly produced methods that are tailored to a particular type of tumor or acquisition protocol and have mostly failed to produce a method that functions on different tumor types and is robust to changes in scanning parameters, resolution, and image quality, thereby limiting their clinical value. Herein, we present a semiautomatic method for tumor segmentation that is fast, accurate, and robust to a wide variation in image quality and resolution. Materials and Methods A semiautomatic segmentation method based on the geodesic distance transform was developed and validated by using it to segment 54 brain tumors. Glioblastomas, meningiomas, and brain metastases were segmented. Qualitative validation was based on physician ratings provided by three clinical experts. Quantitative validation was based on comparing semiautomatic and manual segmentations. Results Tumor segmentations obtained using manual and automatic methods were compared quantitatively using the Dice measure of overlap. Subjective evaluation was performed by having human experts rate the computerized segmentations on a 0–5 rating scale where 5 indicated perfect segmentation. Conclusions The proposed method addresses a significant, unmet need in the field of neuro-oncology. Specifically, this method enables clinicians to obtain accurate and reproducible tumor volumes without the need for manual segmentation. PMID:25770633
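The Dice measure of overlap mentioned above can be computed directly from two binary masks. A minimal sketch (the masks and values here are toy examples, not data from the study):

```python
import numpy as np

def dice_overlap(seg_a, seg_b):
    """Dice similarity coefficient between two binary segmentation masks:
    2 * |A ∩ B| / (|A| + |B|)."""
    a = np.asarray(seg_a, dtype=bool)
    b = np.asarray(seg_b, dtype=bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    if total == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * intersection / total

# Toy example: manual vs. semiautomatic segmentation of a 2-D "tumor"
manual = np.zeros((8, 8), dtype=bool)
manual[2:6, 2:6] = True          # 16 voxels
auto = np.zeros((8, 8), dtype=bool)
auto[3:7, 2:6] = True            # same size, shifted by one row
score = dice_overlap(manual, auto)  # 2*12 / (16+16) = 0.75
```

A Dice value of 1.0 indicates identical masks; values above roughly 0.7-0.8 are commonly taken to indicate good agreement between raters or methods.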
Visual Tracking via Sparse and Local Linear Coding.
Wang, Guofeng; Qin, Xueying; Zhong, Fan; Liu, Yue; Li, Hongbo; Peng, Qunsheng; Yang, Ming-Hsuan
2015-11-01
The state search is an important component of any object tracking algorithm. Numerous algorithms have been proposed, but stochastic sampling methods (e.g., particle filters) are arguably one of the most effective approaches. However, the discretization of the state space complicates the search for the precise object location. In this paper, we propose a novel tracking algorithm that extends the state space of particle observations from discrete to continuous. The solution is determined accurately via iterative linear coding between two convex hulls. The algorithm is modeled by an optimal function, which can be efficiently solved by either convex sparse coding or locality constrained linear coding. The algorithm is also very flexible and can be combined with many generic object representations. Thus, we first use sparse representation to achieve an efficient searching mechanism of the algorithm and demonstrate its accuracy. Next, two other object representation models, i.e., least soft-threshold squares and adaptive structural local sparse appearance, are implemented with improved accuracy to demonstrate the flexibility of our algorithm. Qualitative and quantitative experimental results demonstrate that the proposed tracking algorithm performs favorably against the state-of-the-art methods in dynamic scenes.
An interactive method based on the live wire for segmentation of the breast in mammography images.
Zewei, Zhang; Tianyue, Wang; Li, Guo; Tingting, Wang; Lu, Xu
2014-01-01
To improve the accuracy of computer-aided diagnosis of breast lumps, the authors introduce an improved interactive segmentation method based on Live Wire. Gabor filters and the FCM clustering algorithm are incorporated into the definition of the Live Wire cost function. Using FCM analysis for image edge enhancement, the method suppresses interference from weak edges and yields clear segmentations of breast lumps, as demonstrated on two breast segmentation data sets. Compared with traditional image segmentation methods, experimental results show that the proposed method segments breast lumps more accurately and provides a more reliable objective basis for quantitative and qualitative analysis of breast lumps.
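At its core, Live Wire computes a minimum-cost path between user-placed seed points on a cost map in which likely edge pixels are cheap, so the path snaps to object boundaries. A minimal sketch of that shortest-path step using Dijkstra's algorithm on a 4-connected grid (the cost map here is synthetic; the paper's actual cost function combines Gabor and FCM terms):

```python
import heapq
import numpy as np

def live_wire_path(cost, seed, target):
    """Minimum-cost 4-connected path from seed to target on a cost map.
    Low cost marks likely edge pixels, so the optimal path follows
    object boundaries between user-placed seed points."""
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[seed] = cost[seed]
    pq = [(cost[seed], seed)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == target:
            break
        if d > dist[r, c]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk back from target to seed
    path, node = [target], target
    while node != seed:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Toy cost map: a low-cost "edge" runs along column 1
cost = np.full((4, 4), 10.0)
cost[:, 1] = 1.0
path = live_wire_path(cost, (0, 1), (3, 1))
```

The path follows the cheap column, illustrating why the choice of cost function (here, where the Gabor/FCM terms would enter) determines which boundaries the wire snaps to.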
Scattering calculation and image reconstruction using elevation-focused beams
Duncan, David P.; Astheimer, Jeffrey P.; Waag, Robert C.
2009-01-01
Pressure scattered by cylindrical and spherical objects with elevation-focused illumination and reception has been analytically calculated, and corresponding cross sections have been reconstructed with a two-dimensional algorithm. Elevation focusing was used to elucidate constraints on quantitative imaging of three-dimensional objects with two-dimensional algorithms. Focused illumination and reception are represented by angular spectra of plane waves that were efficiently computed using a Fourier interpolation method to maintain the same angles for all temporal frequencies. Reconstructions were formed using an eigenfunction method with multiple frequencies, phase compensation, and iteration. The results show that the scattered pressure reduces to a two-dimensional expression, and two-dimensional algorithms are applicable when the region of a three-dimensional object within an elevation-focused beam is approximately constant in elevation. The results also show that energy scattered out of the reception aperture by objects contained within the focused beam can result in the reconstructed values of attenuation slope being greater than true values at the boundary of the object. Reconstructed sound speed images, however, appear to be relatively unaffected by the loss in scattered energy. The broad conclusion that can be drawn from these results is that two-dimensional reconstructions require compensation to account for uncaptured three-dimensional scattering. PMID:19425653
Isotropic differential phase contrast microscopy for quantitative phase bio-imaging.
Chen, Hsi-Hsun; Lin, Yu-Zi; Luo, Yuan
2018-05-16
Quantitative phase imaging (QPI) has been investigated to retrieve optical phase information of an object and applied to biological microscopy and related medical studies. In recent examples, differential phase contrast (DPC) microscopy can recover the phase image of a thin sample from multi-axis intensity measurements in a wide-field scheme. Unlike conventional DPC, and based on a theoretical approach under partially coherent conditions, we propose a new method to achieve isotropic differential phase contrast (iDPC) with high accuracy and stability for phase recovery in a simple and high-speed fashion. The iDPC is implemented with a partially coherent microscope and a programmable thin-film transistor (TFT) shield that digitally modulates structured illumination patterns for QPI. In this article, simulation results show the consistency of our theoretical approach for iDPC under partial coherence. In addition, we demonstrate experimental quantitative phase images of a standard micro-lens array, as well as label-free live human cell samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Quantitative analysis of multiple sclerosis: a feasibility study
NASA Astrophysics Data System (ADS)
Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong
2006-03-01
Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
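The MAP-with-MRF-penalty idea described above can be illustrated with a much simpler classical scheme: Iterated Conditional Modes (ICM), where each pixel's label minimizes a Gaussian data term plus a smoothness penalty that rewards agreement with its neighbors. This is a generic sketch on synthetic data, not the paper's mixture-based PV algorithm:

```python
import numpy as np

def icm_segment(image, means, sigma, beta, n_iter=5):
    """MAP-style segmentation by Iterated Conditional Modes (ICM).
    Per-pixel energy = Gaussian data term - beta * (# agreeing 4-neighbors).
    `means` are the class intensity means; `beta` weights the MRF prior."""
    h, w = image.shape
    k = len(means)
    # Initial labels from the data term alone (maximum likelihood)
    data = np.stack([(image - m) ** 2 / (2 * sigma ** 2) for m in means], axis=-1)
    labels = np.argmin(data, axis=-1)
    for _ in range(n_iter):
        for r in range(h):
            for c in range(w):
                best_lab, best_e = labels[r, c], np.inf
                for lab in range(k):
                    agree = sum(
                        labels[nr, nc] == lab
                        for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1))
                        if 0 <= nr < h and 0 <= nc < w
                    )
                    e = data[r, c, lab] - beta * agree
                    if e < best_e:
                        best_lab, best_e = lab, e
                labels[r, c] = best_lab
    return labels

# Noisy two-region image: left half ~0, right half ~1, one flipped pixel
img = np.zeros((6, 6))
img[:, 3:] = 1.0
img[2, 1] = 0.9  # noise: an isolated bright pixel in the dark region
seg = icm_segment(img, means=[0.0, 1.0], sigma=0.3, beta=2.0)
```

The MRF penalty removes the isolated noisy pixel while preserving the true region boundary, which is exactly the role the neighborhood-correlation prior plays in the abstract's segmentation scheme.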
Knowles, D.B.
1955-01-01
The objective of the Ground Water Branch is to evaluate the occurrence, availability, and quality of ground water. The science of ground-water hydrology is applied toward attaining that goal. Although many ground-water investigations are of a qualitative nature, quantitative studies are necessarily an integral component of the complete evaluation of occurrence and availability. The worth of an aquifer as a fully developed source of water depends largely on two inherent characteristics: its ability to store, and its ability to transmit, water. Furthermore, quantitative knowledge of these characteristics facilitates measurement of hydrologic entities such as recharge, leakage, evapotranspiration, etc. It is recognized that these two characteristics, referred to as the coefficients of storage and transmissibility, generally provide the very foundation on which quantitative studies are constructed. Within the science of ground-water hydrology, ground-water hydraulics methods are applied to determine these constants from field data.
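One classical way to determine transmissibility from field data is the steady-state Thiem equation, T = Q ln(r2/r1) / (2π(s1 - s2)), using drawdowns observed at two radii around a pumping well. A sketch with hypothetical field values (the numbers are illustrative only):

```python
import math

def thiem_transmissibility(Q, r1, s1, r2, s2):
    """Steady-state Thiem estimate of transmissibility T (m^2/day) from
    drawdowns s1 > s2 (m) observed at radii r1 < r2 (m) around a well
    pumping at rate Q (m^3/day): T = Q * ln(r2/r1) / (2*pi*(s1 - s2))."""
    return Q * math.log(r2 / r1) / (2.0 * math.pi * (s1 - s2))

# Hypothetical observation-well data: drawdown shrinks with distance
T = thiem_transmissibility(Q=1000.0, r1=10.0, s1=2.0, r2=100.0, s2=1.0)
```

For these values T is roughly 366 m^2/day; transient methods (e.g., Theis) are used when steady state has not been reached.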
NASA Astrophysics Data System (ADS)
Wu, Tao; Cheung, Tak-Hong; Yim, So-Fan; Qu, Jianan Y.
2010-03-01
A quantitative colposcopic imaging system for the diagnosis of early cervical cancer is evaluated in a clinical study. This imaging technology based on 3-D active stereo vision and motion tracking extracts diagnostic information from the kinetics of acetowhitening process measured from the cervix of human subjects in vivo. Acetowhitening kinetics measured from 137 cervical sites of 57 subjects are analyzed and classified using multivariate statistical algorithms. Cross-validation methods are used to evaluate the performance of the diagnostic algorithms. The results show that an algorithm for screening precancer produced 95% sensitivity (SE) and 96% specificity (SP) for discriminating normal and human papillomavirus (HPV)-infected tissues from cervical intraepithelial neoplasia (CIN) lesions. For a diagnostic algorithm, 91% SE and 90% SP are achieved for discriminating normal tissue, HPV infected tissue, and low-grade CIN lesions from high-grade CIN lesions. The results demonstrate that the quantitative colposcopic imaging system could provide objective screening and diagnostic information for early detection of cervical cancer.
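The sensitivity and specificity figures reported above come directly from a confusion matrix over validated cases. A minimal sketch with toy labels (not the study's data):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity (true-positive rate) and specificity (true-negative rate)
    from binary labels, as reported for screening/diagnostic algorithms."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 10 sites, 5 truly high-grade (label 1)
truth = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
pred  = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1]
se, sp = sensitivity_specificity(truth, pred)  # 0.8, 0.8
```

In cross-validation, these quantities are computed on each held-out fold and then aggregated, which is how the reported 95%/96% figures would be obtained.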
Fortin, Carole; Ehrmann Feldman, Debbie; Cheriet, Farida; Labelle, Hubert
2013-08-01
The objective of this study was to explore whether differences in standing and sitting postures of youth with idiopathic scoliosis could be detected from quantitative analysis of digital photographs. Standing and sitting postures of 50 participants aged 10-20-years-old with idiopathic scoliosis (Cobb angle: 15° to 60°) were assessed from digital photographs using a posture evaluation software program. Based on the XY coordinates of markers, 13 angular and linear posture indices were calculated in both positions. Paired t-tests were used to compare values of standing and sitting posture indices. Significant differences between standing and sitting positions (p < 0.05) were found for head protraction, shoulder elevation, scapula asymmetry, trunk list, scoliosis angle, waist angles, and frontal and sagittal plane pelvic tilt. Quantitative analysis of digital photographs is a clinically feasible method to measure standing and sitting postures among youth with scoliosis and to assist in decisions on therapeutic interventions.
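Angular posture indices of the kind described are computed from the XY coordinates of pairs of photographic markers. A sketch of one such index, the inclination of a body segment relative to horizontal (marker coordinates here are hypothetical, and the specific index definitions in the study's software may differ):

```python
import math

def segment_angle(p1, p2):
    """Angle (degrees) of the line p1->p2 relative to horizontal, e.g.
    shoulder elevation from left/right acromion marker XY coordinates."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.degrees(math.atan2(dy, dx))

# Hypothetical markers (pixels): right shoulder slightly higher than left
left_shoulder = (100.0, 200.0)
right_shoulder = (300.0, 190.0)  # image y grows downward, so smaller y = higher
angle = segment_angle(left_shoulder, right_shoulder)
```

Computing each index in both standing and sitting photographs for every participant yields the paired samples compared with the t-tests mentioned above.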
NASA Astrophysics Data System (ADS)
Zhang, Jialin; Chen, Qian; Li, Jiaji; Zuo, Chao
2017-02-01
The transport of intensity equation (TIE) is a powerful tool for direct quantitative phase retrieval in microscopy imaging. However, handling the boundary conditions of the TIE can be problematic. Previous work introduced a hard-edged aperture at the camera port of a conventional bright-field microscope to generate the boundary signal for the TIE solver. Under this Neumann boundary condition, the quantitative phase can be obtained without any assumption or prior knowledge about the test object or the setup. In this paper, we demonstrate the effectiveness of this method through practical experiments. A microlens array is used to compare the results of the two TIE solvers, with and without the aperture, and this accurate quantitative phase imaging technique allows measurement of cell dry mass, which is used in biology to follow the cell cycle, to investigate cell metabolism, or to assess the effects of drugs.
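Under the simplifying assumption of uniform intensity I0, the TIE reduces to a Poisson equation, -k ∂I/∂z = I0 ∇²φ, which can be inverted with an FFT-based inverse Laplacian. This is a generic textbook-style solver with periodic boundaries, not the aperture-based Neumann-boundary solver the paper evaluates:

```python
import numpy as np

def tie_solve(dIdz, I0, k, dx):
    """Recover phase from the TIE assuming uniform intensity I0:
    -k dI/dz = I0 * Laplacian(phi), inverted spectrally (periodic
    boundary assumption; the DC term is fixed to give zero-mean phase)."""
    n = dIdz.shape[0]
    f = np.fft.fftfreq(n, d=dx)
    fx, fy = np.meshgrid(f, f, indexing="ij")
    lap = -4.0 * np.pi ** 2 * (fx ** 2 + fy ** 2)  # FFT symbol of the Laplacian
    lap[0, 0] = 1.0                                # avoid 0/0; DC phase is arbitrary
    rhs = -k * dIdz / I0
    phi_hat = np.fft.fft2(rhs) / lap
    phi_hat[0, 0] = 0.0                            # zero-mean phase
    return np.fft.ifft2(phi_hat).real

# Self-consistency check on a synthetic smooth phase
n, dx, I0, k = 64, 1.0, 1.0, 2 * np.pi / 0.5e-6
x = np.arange(n) * dx
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sin(2 * np.pi * X / n) * np.cos(2 * np.pi * Y / n)
# Forward model: dI/dz = -(I0/k) * Laplacian(phi), computed spectrally
f = np.fft.fftfreq(n, d=dx)
fx, fy = np.meshgrid(f, f, indexing="ij")
lap_sym = -4.0 * np.pi ** 2 * (fx ** 2 + fy ** 2)
dIdz = -(I0 / k) * np.fft.ifft2(lap_sym * np.fft.fft2(phi)).real
phi_rec = tie_solve(dIdz, I0, k, dx)
```

In practice ∂I/∂z is estimated from a through-focus image stack by finite differences; the paper's hard-edged aperture supplies the boundary data that this periodic-boundary sketch sidesteps.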
Tanaka, Yohei; Tsunemi, Yuichiro; Kawashima, Makoto; Tatewaki, Naoto; Nishida, Hiroshi
2013-01-01
Background Near-infrared has been shown to penetrate deeper than optical light sources independent of skin color, allowing safer treatment for the Asian skin type. Many studies have indicated the efficacy of various types of devices, but have not included a sufficiently objective evaluation. In this study, we used three-dimensional imaging for objective evaluation of facial skin tightening using a water-filtered near-infrared device. Methods Twenty Japanese patients were treated with the water-filtered near-infrared (1,000–1,800 nm) device using a contact-cooling and nonfreezing gel stored in a freezer. Three-dimensional imaging was performed, and quantitative volume measurements were taken to evaluate the change in post-treatment volume. The patients then provided their subjective assessments. Results Objective assessments of the treated cheek volume evaluated by a three-dimensional color schematic representation with quantitative volume measurements showed significant improvement 3 months after treatment. The mean volume reduction at the last post-treatment visit was 2.554 ± 0.999 mL. The post-treatment volume was significantly reduced compared with the pretreatment volume in all patients (P < 0.0001). Eighty-five percent of patients reported satisfaction with the improvement of skin laxity, and 80% of patients reported satisfaction with improvement of rhytids, such as the nasolabial folds. Side effects, such as epidermal burns and scar formation, were not observed throughout the study. Conclusion The advantages of this water-filtered near-infrared treatment are its high efficacy for skin tightening, associated with a minimal level of discomfort and minimal side effects. Together, these characteristics facilitate our ability to administer repeated treatments and provide alternative or adjunctive treatment for patients, with improved results. 
This study provides a qualitative and quantitative volumetric assessment, establishing the ability of this technology to reduce volume through noninvasive skin tightening. PMID:23837000
3D Filament Network Segmentation with Multiple Active Contours
NASA Astrophysics Data System (ADS)
Xu, Ting; Vavylonis, Dimitrios; Huang, Xiaolei
2014-03-01
Fluorescence microscopy is frequently used to study two and three dimensional network structures formed by cytoskeletal polymer fibers such as actin filaments and microtubules. While these cytoskeletal structures are often dilute enough to allow imaging of individual filaments or bundles of them, quantitative analysis of these images is challenging. To facilitate quantitative, reproducible and objective analysis of the image data, we developed a semi-automated method to extract actin networks and retrieve their topology in 3D. Our method uses multiple Stretching Open Active Contours (SOACs) that are automatically initialized at image intensity ridges and then evolve along the centerlines of filaments in the network. SOACs can merge, stop at junctions, and reconfigure with others to allow smooth crossing at junctions of filaments. The proposed approach is generally applicable to images of curvilinear networks with low SNR. We demonstrate its potential by extracting the centerlines of synthetic meshwork images, actin networks in 2D TIRF Microscopy images, and 3D actin cable meshworks of live fission yeast cells imaged by spinning disk confocal microscopy.
Coherent diffraction surface imaging in reflection geometry.
Marathe, Shashidhara; Kim, S S; Kim, S N; Kim, Chan; Kang, H C; Nickles, P V; Noh, D Y
2010-03-29
We present a reflection-based coherent diffraction imaging method that can reconstruct a nonperiodic surface image from a diffraction amplitude measured in reflection geometry. Using a He-Ne laser, we demonstrate that a surface image can be reconstructed solely from the intensity reflected from a surface, without relying on any prior knowledge of the sample object or the object support. The reconstructed phase image of the exit wave is particularly interesting, since it can be used to obtain quantitative information on the surface depth profile or the phase change during the reflection process. We believe this work will broaden the application areas of coherent diffraction imaging techniques using light sources with limited penetration depth.
[Progress in stable isotope labeled quantitative proteomics methods].
Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui
2013-06-01
Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods, the latter having become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed that support rapid progress in biological research. In this work, we discuss progress in stable isotope labeling methods for quantitative proteomics, covering both relative and absolute quantification, and give our views on the outlook for proteome quantification methods.
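In stable isotope labeling, relative quantification typically reduces to heavy/light intensity ratios per peptide, summarized robustly (often by the median) at the protein level. A minimal sketch of that common scheme with hypothetical intensities:

```python
def relative_quantify(light, heavy):
    """Relative quantification for a stable-isotope labeled pair:
    heavy/light intensity ratio per peptide, then the median ratio
    as a robust protein-level estimate (a common, simple scheme)."""
    ratios = sorted(h / l for l, h in zip(light, heavy))
    n = len(ratios)
    mid = n // 2
    return ratios[mid] if n % 2 else 0.5 * (ratios[mid - 1] + ratios[mid])

# Hypothetical peptide intensities from the two labeling states
light = [1.0e6, 2.0e6, 1.5e6]
heavy = [2.0e6, 4.4e6, 2.7e6]
ratio = relative_quantify(light, heavy)  # median of [1.8, 2.0, 2.2] = 2.0
```

Absolute quantification follows the same arithmetic but compares against a spiked-in labeled standard of known concentration.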
Using multi-species occupancy models in structured decision making on managed lands
Sauer, John R.; Blank, Peter J.; Zipkin, Elise F.; Fallon, Jane E.; Fallon, Frederick W.
2013-01-01
Land managers must balance the needs of a variety of species when manipulating habitats. Structured decision making provides a systematic means of defining choices and choosing among alternative management options; implementation of a structured decision requires quantitative approaches to predicting consequences of management on the relevant species. Multi-species occupancy models provide a convenient framework for making structured decisions when the management objective is focused on a collection of species. These models use replicate survey data that are often collected on managed lands. Occupancy can be modeled for each species as a function of habitat and other environmental features, and Bayesian methods allow for estimation and prediction of collective responses of groups of species to alternative scenarios of habitat management. We provide an example of this approach using data from breeding bird surveys conducted in 2008 at the Patuxent Research Refuge in Laurel, Maryland, evaluating the effects of eliminating meadow and wetland habitats on scrub-successional and woodland-breeding bird species using summed total occupancy of species as an objective function. Removal of meadows and wetlands decreased value of an objective function based on scrub-successional species by 23.3% (95% CI: 20.3–26.5), but caused only a 2% (0.5, 3.5) increase in value of an objective function based on woodland species, documenting differential effects of elimination of meadows and wetlands on these groups of breeding birds. This approach provides a useful quantitative tool for managers interested in structured decision making.
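The objective function described above, summed total occupancy across a species group, can be sketched with a logistic occupancy model per species. The coefficients below are hypothetical illustrations, not estimates from the Patuxent analysis:

```python
import math

def occupancy_prob(beta0, beta1, habitat):
    """Occupancy probability from a logistic model: psi = logit^-1(b0 + b1*x)."""
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * habitat)))

def summed_occupancy(coefs, habitat):
    """Objective function: occupancy summed over species for one habitat scenario."""
    return sum(occupancy_prob(b0, b1, habitat) for b0, b1 in coefs)

# Hypothetical coefficients for three scrub-successional species;
# habitat covariate = meadow retained (1 = keep, 0 = remove)
coefs = [(-1.0, 2.0), (0.0, 1.5), (-0.5, 1.0)]
keep = summed_occupancy(coefs, 1.0)
remove = summed_occupancy(coefs, 0.0)
pct_change = 100.0 * (remove - keep) / keep  # negative: removal lowers the objective
```

In the Bayesian setting of the paper, this sum would be evaluated over posterior draws of the coefficients, yielding the credible intervals quoted for each percentage change.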
Destounis, Stamatia; Arieno, Andrea; Morgan, Renee; Roberts, Christina; Chan, Ariane
2017-01-01
Mammographic breast density (MBD) has been proven to be an important risk factor for breast cancer and an important determinant of mammographic screening performance. The measurement of density has changed dramatically since its inception. Initial qualitative measurement methods were found to have limited consistency between readers and limited association with breast cancer risk. Following the introduction of full-field digital mammography, more sophisticated measurement methodology is now possible. Automated computer-based density measurements can provide consistent, reproducible, and objective results. In this review paper, we describe various methods currently available to assess MBD, and discuss the clinical utility of such methods for breast cancer screening. PMID:28561776
1984-05-01
chemicals used by the U.S. Air Force. Snyder-Theilen Feline Sarcoma Virus (ST-FeSV), quantitatively transforms human skin fibroblasts following second...Objective 1 The cell line used for this aspect of this program was Detroit 550, a human diploid skin fibroblast line from the American Type Culture...Branch of the National Cancer Institute. The results are presented herein. Materials and Methods 1. Cells. Detroit 550 human skin fibroblast (HSF) cells
Method for matching customer and manufacturer positions for metal product parameters standardization
NASA Astrophysics Data System (ADS)
Polyakova, Marina; Rubin, Gennadij; Danilova, Yulija
2018-04-01
Decision making is the main stage in regulating relations between customer and manufacturer when the requirements of norms in standards are designed. The positions of the negotiating sides must be matched in order to reach consensus. To account for the differences between customer and manufacturer assessments of the object undergoing standardization, special methods of analysis are needed. It is proposed to establish relationships between product properties and product functions using functional-target analysis; the special feature of this type of functional analysis is the joint consideration of the research object's functions and properties. Using the example of a hexagonal head screw, we show how links between its functions and properties can be established. This approach yields a quantitative assessment of the closeness of the customer's and manufacturer's positions during decision making on the establishment of standard norms.
NASA Astrophysics Data System (ADS)
Hubert, Maxime; Pacureanu, Alexandra; Guilloud, Cyril; Yang, Yang; da Silva, Julio C.; Laurencin, Jerome; Lefebvre-Joud, Florence; Cloetens, Peter
2018-05-01
In X-ray tomography, ring-shaped artifacts present in the reconstructed slices are an inherent problem degrading the global image quality and hindering the extraction of quantitative information. To overcome this issue, we propose a strategy for suppression of ring artifacts originating from the coherent mixing of the incident wave and the object. We discuss the limits of validity of the empty beam correction in the framework of a simple formalism. We then deduce a correction method based on two-dimensional random sample displacement, with minimal cost in terms of spatial resolution, acquisition, and processing time. The method is demonstrated on bone tissue and on a hydrogen electrode of a ceramic-metallic solid oxide cell. Compared to the standard empty beam correction, we obtain high quality nanotomography images revealing detailed object features. The resulting absence of artifacts allows straightforward segmentation and posterior quantification of the data.
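The standard empty beam (flat-field) correction that the paper takes as its baseline is a simple per-pixel division of each projection by the beam profile recorded without the sample. A minimal sketch on toy arrays (the paper's contribution, random sample displacement, addresses the coherent residuals this division leaves behind):

```python
import numpy as np

def empty_beam_correct(projections, flat, dark=None):
    """Standard empty-beam (flat-field) correction: divide each projection
    by the beam profile recorded without the sample, optionally after
    subtracting a dark (beam-off) image from both."""
    flat = flat.astype(float)
    if dark is not None:
        projections = projections - dark
        flat = flat - dark
    return projections / flat

# Toy data: a smooth beam profile multiplying the true transmission
true_transmission = np.array([[1.0, 0.8], [0.9, 1.0]])
beam = np.array([[2.0, 2.0], [4.0, 4.0]])
measured = true_transmission * beam
corrected = empty_beam_correct(measured, beam)
```

For incoherent imaging this division recovers the transmission exactly, as in the toy example; under coherent illumination the incident wave and object mix nonlinearly, which is why stationary beam structure survives the division and prints as rings after reconstruction.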
Sugimoto, Masahiro; Obiya, Shinichi; Kaneko, Miku; Enomoto, Ayame; Honma, Mayu; Wakayama, Masataka; Soga, Tomoyoshi; Tomita, Masaru
2017-01-18
Dry-cured hams are popular among consumers. To increase the attractiveness of the product, objective analytical methods and algorithms to evaluate the relationship between observable properties and consumer acceptability are required. In this study, metabolomics, which is used for quantitative profiling of hundreds of small molecules, was applied to 12 kinds of dry-cured hams from Japan and Europe. In total, 203 charged metabolites, including amino acids, organic acids, nucleotides, and peptides, were successfully identified and quantified. Metabolite profiles were compared for the samples with different countries of origin and processing methods (e.g., smoking or use of a starter culture). Principal component analysis of the metabolite profiles with sensory properties revealed significant correlations for redness, homogeneity, and fat whiteness. This approach could be used to design new ham products by objective evaluation of various features.
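Principal component analysis of a samples-by-metabolites concentration matrix, as used above, can be sketched via SVD of the mean-centered data. The matrix below is a hypothetical 4-sample, 3-metabolite illustration, not the study's 203-metabolite profiles:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Principal component scores via SVD of the mean-centered data matrix
    (rows = samples, e.g. hams; columns = metabolite concentrations)."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Hypothetical 4 samples x 3 metabolites: two pairs of similar products
X = np.array([
    [1.0, 2.0, 0.5],
    [1.2, 2.1, 0.4],
    [3.0, 0.5, 2.0],
    [3.1, 0.4, 2.2],
])
scores = pca_scores(X, n_components=2)
```

Plotting such scores and overlaying sensory properties is what reveals the correlations (e.g., with redness or fat whiteness) reported in the abstract; in practice columns are usually also scaled to unit variance before the SVD.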
Raina, Abhay; Hennessy, Ricky; Rains, Michael; Allred, James; Hirshburg, Jason M; Diven, Dayna; Markey, Mia K.
2016-01-01
Background Traditional metrics for evaluating the severity of psoriasis are subjective, which complicates efforts to measure effective treatments in clinical trials. Methods We collected images of psoriasis plaques and calibrated the coloration of the images according to an included color card. Features were extracted from the images and used to train a linear discriminant analysis classifier with cross-validation to automatically classify the degree of erythema. The results were tested against numerical scores obtained by a panel of dermatologists using a standard rating system. Results Quantitative measures of erythema based on the digital color images showed good agreement with subjective assessment of erythema severity (κ = 0.4203). The color calibration process improved the agreement from κ = 0.2364 to κ = 0.4203. Conclusions We propose a method for the objective measurement of the psoriasis severity parameter of erythema and show that the calibration process improved the results. PMID:26517973
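The κ values quoted above are Cohen's kappa, agreement between two raters corrected for chance. A minimal sketch with toy erythema grades (the ratings below are illustrative, not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement between two sets of categorical
    ratings, corrected for the agreement expected by chance."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1.0 - expected)

# Toy erythema grades (0-3): automated classifier vs. dermatologist panel
algo = [0, 1, 1, 2, 2, 3, 3, 0]
panel = [0, 1, 2, 2, 2, 3, 2, 0]
kappa = cohens_kappa(algo, panel)
```

Values near 0.4, as reported after color calibration, are conventionally read as moderate agreement; 1.0 is perfect agreement and 0 is chance-level.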
Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.
1993-01-01
Benthic invertebrate communities are evaluated as part of the ecological survey component of the U.S. Geological Survey's National Water-Quality Assessment Program. These biological data are collected along with physical and chemical data to assess water-quality conditions and to develop an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. The objectives of benthic invertebrate community characterizations are to (1) develop for each site a list of taxa within the associated stream reach and (2) determine the structure of benthic invertebrate communities within selected habitats of that reach. A nationally consistent approach is used to achieve these objectives. This approach provides guidance on site, reach, and habitat selection and on methods and equipment for qualitative multihabitat sampling and semi-quantitative single-habitat sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data within and among study units.
Developing a Social, Cultural and Economic Report Card for a Regional Industrial Harbour.
Pascoe, Sean; Tobin, Renae; Windle, Jill; Cannard, Toni; Marshall, Nadine; Kabir, Zobaidul; Flint, Nicole
2016-01-01
Report cards are increasingly used to provide ongoing snap-shots of progress towards specific ecosystem health goals, particularly in coastal regions where planners need to balance competing demands for coastal resources from a range of industries. While most previous report cards focus on the biophysical components of the system, there is a growing interest in including the social and economic implications of ecosystem management to provide a greater social-ecological system understanding. Such a report card was requested on the Gladstone Harbour area in central Queensland, Australia. Gladstone Harbour adjoins the southern Great Barrier Reef, and is also a major industrial and shipping port. Balancing social, economic and environmental interests is therefore of great concern to the regional managers. While environmental benchmarking procedures are well established within Australia (and elsewhere), a method for assessing social and economic performance of coastal management is generally lacking. The key aim of this study was to develop and pilot a system for the development of a report card relating to appropriate cultural, social and economic objectives. The approach developed uses a range of multicriteria decision analysis methods to assess and combine different qualitative and quantitative measures, including the use of Bayesian Belief Networks to combine the different measures and provide an overall quantitative score for each of the key management objectives. The approach developed is readily transferable for purposes of similar assessments in other regions.
Statistical significance of trace evidence matches using independent physicochemical measurements
NASA Astrophysics Data System (ADS)
Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George
1997-02-01
A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to,' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when an object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and to express statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated against these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons is also discussed. This statistical approach should not replace the current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements, and the potential for confusion arising from statistical analysis, currently hamper this approach; ways of overcoming these obstacles are presented.
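The comparison of fragments on independent variables can be sketched as follows. This is an illustration, not the authors' procedure: each variable (refractive index plus element concentrations) is z-scored against a reference population, and a Euclidean distance between two fragments is thresholded into a match/exclusion decision. All measurement values and the threshold are fabricated.

```python
# Illustrative sketch (not the published procedure): standardise each
# physicochemical variable across a reference population, then compare
# two fragments by Euclidean distance. All values are fabricated.
import math

def standardise(columns):
    """Z-score each variable across the reference population."""
    out = []
    for col in columns:
        mean = sum(col) / len(col)
        sd = math.sqrt(sum((x - mean) ** 2 for x in col) / (len(col) - 1))
        out.append([(x - mean) / sd for x in col])
    return out

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Rows: samples; columns: RI, Mg, Al, Ca, Fe (fabricated values).
population = [
    [1.5181, 2.1, 0.9, 8.0, 0.30],
    [1.5190, 2.4, 1.1, 8.3, 0.35],
    [1.5230, 3.0, 1.5, 7.5, 0.50],
    [1.5179, 2.0, 0.8, 8.1, 0.28],
]
cols = standardise(list(zip(*population)))
samples = list(zip(*cols))            # back to per-sample rows
d = distance(samples[0], samples[3])  # questioned vs known fragment
is_match = d < 1.0                    # illustrative threshold only
```

In practice the threshold would be calibrated against the population database so that the distance can be converted into a statistical significance.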
SpArcFiRe: Scalable automated detection of spiral galaxy arm segments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Darren R.; Hayes, Wayne B., E-mail: drdavis@uci.edu, E-mail: whayes@uci.edu
Given an approximately centered image of a spiral galaxy, we describe an entirely automated method that finds, centers, and sizes the galaxy (possibly masking nearby stars and other objects if necessary in order to isolate the galaxy itself) and then automatically extracts structural information about the spiral arms. For each arm segment found, we list the pixels in that segment, allowing image analysis on a per-arm-segment basis. We also perform a least-squares fit of a logarithmic spiral arc to the pixels in that segment, giving per-arc parameters, such as the pitch angle, arm segment length, location, etc. The algorithm takes about one minute per galaxy and can easily be scaled using parallelism. We have run it on all ∼644,000 Sloan objects that are larger than 40 pixels across and classified as 'galaxies'. We find a very good correlation between our quantitative description of a spiral structure and the qualitative description provided by Galaxy Zoo humans. Our objective, quantitative measures of structure demonstrate the difficulty in defining exactly what constitutes a spiral 'arm', leading us to prefer the term 'arm segment'. We find that pitch angle often varies significantly segment-to-segment in a single spiral galaxy, making it difficult to define the pitch angle for a single galaxy. We demonstrate how our new database of arm segments can be queried to find galaxies satisfying specific quantitative visual criteria. For example, even though our code does not explicitly find rings, a good surrogate is to look for galaxies having one long, low-pitch-angle arm—which is how our code views ring galaxies. SpArcFiRe is available at http://sparcfire.ics.uci.edu.
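The per-segment fit described above can be sketched in a few lines: a logarithmic spiral r = a·exp(b·θ) is linear in log-space, log r = log a + b·θ, so ordinary least squares recovers (a, b), and the pitch angle follows as arctan(b). The example uses synthetic, noise-free points (not SpArcFiRe data) so the fit recovers the generating parameters exactly.

```python
# Minimal sketch of a least-squares logarithmic-spiral fit:
# r = a * exp(b * theta)  =>  log r = log a + b * theta,
# so a linear fit recovers (a, b); pitch angle = atan(b).
import math

def fit_log_spiral(points):
    """points: (theta, r) pairs; returns (a, b, pitch angle in degrees)."""
    thetas = [t for t, _ in points]
    logs = [math.log(r) for _, r in points]
    n = len(points)
    mean_t = sum(thetas) / n
    mean_y = sum(logs) / n
    b = (sum((t - mean_t) * (y - mean_y) for t, y in zip(thetas, logs))
         / sum((t - mean_t) ** 2 for t in thetas))
    a = math.exp(mean_y - b * mean_t)
    return a, b, math.degrees(math.atan(b))

# Synthetic points from r = 2 * exp(0.3 * theta); the fit recovers them.
pts = [(t / 10, 2.0 * math.exp(0.3 * t / 10)) for t in range(0, 63)]
a, b, pitch = fit_log_spiral(pts)
```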
Risk assessment for construction projects of transport infrastructure objects
NASA Astrophysics Data System (ADS)
Titarenko, Boris
2017-10-01
The paper analyzes and compares different methods of risk assessment for construction projects of transport infrastructure objects. Managing projects of this type demands special probabilistic and statistical methods because of the high level of uncertainty in their implementation. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that allow reliable risk assessments to be obtained. The robust approach is based on the principle of maximum likelihood and, in assessing risk, allows the researcher to obtain reliable results in situations of great uncertainty. The application of robust procedures allows a quantitative assessment of the main risk indicators of projects when solving the tasks of managing innovation-investment projects. Any competent specialist can calculate the damage from the occurrence of a risk event; assessing the probability of its occurrence, however, requires special probabilistic methods based on the proposed robust approaches. Practice shows the effectiveness and reliability of the results. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.
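The abstract does not specify the robust estimators used. As an illustration of why robust statistics matter under heavy uncertainty, the sketch below contrasts the classical mean/standard deviation with the outlier-resistant median/MAD pair on hypothetical cost-overrun data; the data and the choice of estimator are assumptions, not the paper's method.

```python
# Illustrative contrast of classical vs robust location/scale estimates
# on fabricated percent cost-overrun data with one gross outlier.
import statistics

def mad(data):
    """Median absolute deviation, scaled by 1.4826 so it is consistent
    with the standard deviation for normally distributed data."""
    med = statistics.median(data)
    return 1.4826 * statistics.median(abs(x - med) for x in data)

# Percent cost overruns for ten hypothetical transport projects;
# the last value is a gross outlier.
overruns = [4.0, 5.5, 3.8, 6.1, 4.9, 5.2, 4.4, 5.8, 4.7, 60.0]
classical = (statistics.mean(overruns), statistics.stdev(overruns))
robust = (statistics.median(overruns), mad(overruns))
```

The single outlier drags the mean above 10% while the median stays near 5%, which is the practical argument for robust risk indicators.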
Bae, Won C.; Ruangchaijatuporn, Thumanoon; Chang, Eric Y; Biswas, Reni; Du, Jiang; Statum, Sheronda
2016-01-01
Objective To evaluate pathology of the triangular fibrocartilage complex (TFCC) using high resolution morphologic magnetic resonance (MR) imaging, and compare with quantitative MR and biomechanical properties. Materials and Methods Five cadaveric wrists (22 to 70 yrs) were imaged at 3T using morphologic (proton density weighted spin echo, PD FS, and 3D spoiled gradient echo, 3D SPGR) and quantitative MR sequences to determine T2 and T1rho properties. In eight geographic regions, morphology of TFC disc and laminae were evaluated for pathology and quantitative MR values. Samples were disarticulated and biomechanical indentation testing was performed on the distal surface of the TFC disc. Results On morphologic PD SE images, TFC disc pathology included degeneration and tears, while that of the laminae included degeneration, degeneration with superimposed tear, mucinous transformation, and globular calcification. Punctate calcifications were highly visible on 3D SPGR images and found only in pathologic regions. Disc pathology occurred more frequently in proximal regions of the disc than distal regions. Quantitative MR values were lowest in normal samples, and generally higher in pathologic regions. Biomechanical testing demonstrated an inverse relationship, with indentation modulus being high in normal regions with low MR values. The laminae studied were mostly pathologic, and additional normal samples are needed to discern quantitative changes. Conclusion These results show technical feasibility of morphologic MR, quantitative MR, and biomechanical techniques to characterize pathology of the TFCC. Quantitative MRI may be a suitable surrogate marker of soft tissue mechanical properties, and a useful adjunct to conventional morphologic MR techniques. PMID:26691643
Toward an Objective Enhanced-V Detection Algorithm
NASA Technical Reports Server (NTRS)
Brunner, Jason; Feltz, Wayne; Moses, John; Rabin, Robert; Ackerman, Steven
2007-01-01
The area of coldest cloud tops above thunderstorms sometimes has a distinct V or U shape. This pattern, often referred to as an "enhanced-V" signature, has been observed to occur during and preceding severe weather in previous studies. This study describes an algorithmic approach to objectively detect enhanced-V features with observations from the Geostationary Operational Environmental Satellite and Low Earth Orbit data. The methodology consists of cross correlation statistics of pixels and thresholds of enhanced-V quantitative parameters. The effectiveness of the enhanced-V detection method will be examined using Geostationary Operational Environmental Satellite, MODerate-resolution Imaging Spectroradiometer, and Advanced Very High Resolution Radiometer image data from case studies in the 2003-2006 seasons. The main goal of this study is to develop an objective enhanced-V detection algorithm for future implementation into operations with future sensors, such as GOES-R.
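The cross-correlation step can be sketched as sliding a small "V" template over an image and scoring every offset with the normalised cross-correlation coefficient. The template and image below are fabricated binary stand-ins for cold/warm brightness temperatures, not the study's actual satellite data.

```python
# Hedged sketch of template matching by normalised cross-correlation.
# Template and image values (1 = cold pixel, 0 = warm) are fabricated.
import math

def ncc(window, template):
    """Normalised cross-correlation of two equal-sized 2-D lists."""
    w = [v for row in window for v in row]
    t = [v for row in template for v in row]
    mw, mt = sum(w) / len(w), sum(t) / len(t)
    num = sum((a - mw) * (b - mt) for a, b in zip(w, t))
    den = math.sqrt(sum((a - mw) ** 2 for a in w)
                    * sum((b - mt) ** 2 for b in t))
    return num / den if den else 0.0

template = [[1, 0, 1],
            [1, 0, 1],
            [0, 1, 0]]   # crude "V" shape

def best_offset(image):
    """Return (score, row, col) of the best template match."""
    h, w = len(template), len(template[0])
    best = (-2.0, 0, 0)
    for r in range(len(image) - h + 1):
        for c in range(len(image[0]) - w + 1):
            window = [row[c:c + w] for row in image[r:r + h]]
            best = max(best, (ncc(window, template), r, c))
    return best

image = [[0, 0, 0, 0, 0],
         [0, 1, 0, 1, 0],
         [0, 1, 0, 1, 0],
         [0, 0, 1, 0, 0],
         [0, 0, 0, 0, 0]]
score, row, col = best_offset(image)
```

A detection algorithm would then threshold the correlation score together with the quantitative V parameters (depth, arm length, etc.) mentioned above.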
Subjective and objective scales to assess the development of children with cerebral palsy.
Pietrzak, S; Jóźwiak, M
2001-01-01
Many scoring systems have been constructed to assess the motor development of cerebral palsy children and to evaluate the effectiveness of treatment. According to the purposes they fulfill, these instruments may be divided into three types: discriminative, evaluative and predictive. The design and measurement methodology are the criteria that determine whether a given scale is quantitative or qualitative in nature, and whether it should be considered objective or subjective. The article presents the "reaching, losing and regaining" scale (constructed by the authors to assess functional development and its changes over certain periods of time), the Munich Functional Development Diagnostics, and the Gross Motor Function Measure (GMFM). Special attention is given to the GMFM, its methods, evaluation of results, and application. A comparison of subjective and objective assessment of two cerebral palsy children is included.
Multi-sensor image fusion algorithm based on multi-objective particle swarm optimization algorithm
NASA Astrophysics Data System (ADS)
Xie, Xia-zhu; Xu, Ya-wei
2017-11-01
On the basis of DT-CWT (Dual-Tree Complex Wavelet Transform) theory, an approach based on MOPSO (Multi-objective Particle Swarm Optimization) was proposed to objectively choose the fusion weights of the low-frequency sub-bands. High- and low-frequency sub-bands were produced by the DT-CWT. The absolute value of coefficients was adopted as the fusion rule for the high-frequency sub-bands. Fusion weights in the low-frequency sub-bands were used as particles in MOPSO, with Spatial Frequency and Average Gradient adopted as the two fitness functions. The experimental results show that the proposed approach performs better than Average Fusion and fusion methods based on local variance and local energy, respectively, in brightness, clarity and quantitative evaluation, which includes Entropy, Spatial Frequency, Average Gradient and QAB/F.
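The two fitness functions named above are standard image-sharpness measures and can be sketched directly. The exact normalisations used in the paper are not given, so the formulas below follow common definitions (RMS of row/column differences for spatial frequency; mean local gradient magnitude for average gradient) on a fabricated grayscale image.

```python
# Sketch of the two MOPSO fitness functions on a small grayscale image
# given as a 2-D list. Image values are fabricated; normalisations
# follow common definitions and may differ from the paper's.
import math

def spatial_frequency(img):
    """Root of mean squared row-wise and column-wise differences."""
    h, w = len(img), len(img[0])
    rf = sum((img[r][c] - img[r][c - 1]) ** 2
             for r in range(h) for c in range(1, w)) / (h * w)
    cf = sum((img[r][c] - img[r - 1][c]) ** 2
             for r in range(1, h) for c in range(w)) / (h * w)
    return math.sqrt(rf + cf)

def average_gradient(img):
    """Mean magnitude of the local intensity gradient."""
    h, w = len(img), len(img[0])
    total = sum(math.sqrt(((img[r + 1][c] - img[r][c]) ** 2
                           + (img[r][c + 1] - img[r][c]) ** 2) / 2)
                for r in range(h - 1) for c in range(w - 1))
    return total / ((h - 1) * (w - 1))

img = [[10, 10, 80],
       [10, 20, 80],
       [15, 25, 90]]
sf = spatial_frequency(img)
ag = average_gradient(img)
```

In the MOPSO loop, each particle (a set of low-frequency fusion weights) would be scored by evaluating both functions on the resulting fused image.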
Robust infrared target tracking using discriminative and generative approaches
NASA Astrophysics Data System (ADS)
Asha, C. S.; Narasimhadhan, A. V.
2017-09-01
The process of designing an efficient tracker for thermal infrared imagery is one of the most challenging tasks in computer vision. Although considerable advances have been achieved for RGB videos over the decades, the textureless and colorless properties of objects in thermal imagery pose hard constraints on the design of an efficient tracker. Tracking an object using a single feature or technique often fails to achieve high accuracy. Here, we propose an effective method to track an object in infrared imagery based on a combination of discriminative and generative approaches. The discriminative technique makes use of two complementary methods, a kernelized correlation filter with spatial features and an AdaBoost classifier with pixel intensity features, operating in parallel. After obtaining optimized locations through the discriminative approaches, the generative technique is applied to determine the best target location using a linear search method. Unlike the baseline algorithms, the proposed method estimates the scale of the target by Lucas-Kanade homography estimation. To evaluate the proposed method, extensive experiments are conducted on 17 challenging infrared image sequences obtained from the LTIR dataset, and a significant improvement in mean distance precision and mean overlap precision is accomplished as compared with the existing trackers. Further, a quantitative and qualitative assessment of the proposed approach against the state-of-the-art trackers is illustrated to clearly demonstrate an overall increase in performance.
King, Michael A; Scotty, Nicole; Klein, Ronald L; Meyer, Edwin M
2002-10-01
Assessing the efficacy of in vivo gene transfer often requires a quantitative determination of the number, size, shape, or histological visualization characteristics of biological objects. The optical fractionator has become a choice stereological method for estimating the number of objects, such as neurons, in a structure, such as a brain subregion. Digital image processing and analytic methods can increase detection sensitivity and quantify structural and/or spectral features located in histological specimens. We describe a hardware and software system that we have developed for conducting the optical fractionator process. A microscope equipped with a video camera and motorized stage and focus controls is interfaced with a desktop computer. The computer contains a combination live video/computer graphics adapter with a video frame grabber and controls the stage, focus, and video via a commercial imaging software package. Specialized macro programs have been constructed with this software to execute command sequences requisite to the optical fractionator method: defining regions of interest, positioning specimens in a systematic uniform random manner, and stepping through known volumes of tissue for interactive object identification (optical dissectors). The system affords the flexibility to work with count regions that exceed the microscope image field size at low magnifications and to adjust the parameters of the fractionator sampling to best match the demands of particular specimens and object types. Digital image processing can be used to facilitate object detection and identification, and objects that meet criteria for counting can be analyzed for a variety of morphometric and optical properties. Copyright 2002 Elsevier Science (USA)
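The optical fractionator estimate described above multiplies the raw dissector counts by the reciprocal of each sampling fraction. A minimal sketch, with invented counts and fractions:

```python
# Optical fractionator estimate: N = sum(Q-) / (ssf * asf * tsf).
# Counts and sampling fractions below are fabricated for illustration.

def fractionator_estimate(counts, ssf, asf, tsf):
    """Estimated total object number from systematic sampling.

    counts: objects counted per sampled section (Q-)
    ssf: section sampling fraction (e.g. every 10th section -> 0.1)
    asf: area sampling fraction (counting-frame area / grid-step area)
    tsf: thickness sampling fraction (dissector height / section thickness)
    """
    return sum(counts) / (ssf * asf * tsf)

# Hypothetical neuron counts in 8 systematically sampled sections:
q_minus = [12, 15, 9, 14, 11, 13, 10, 16]
n_hat = fractionator_estimate(q_minus, ssf=0.1, asf=0.05, tsf=0.5)
```

With 100 neurons counted and combined sampling fraction 1/400, the estimate is 40,000 neurons, which is the arithmetic the macro programs automate.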
NASA Astrophysics Data System (ADS)
Dwiyanti, Stephani; Soeroso, Yuniarti; Sunarto, Hari; Radi, Basuni
2017-02-01
Coronary heart disease (CHD) is a narrowing of the coronary arteries due to plaque build-up. [1] Chronic periodontitis increases the risk of cardiovascular disease; P. gingivalis is linked to both diseases. Objective: to analyse the quantitative difference in P. gingivalis on dental plaque and its relationship with periodontal status in CHD patients and controls. Methods: The periodontal status of 66 CHD patients and 40 controls was checked. Subgingival plaque was isolated and P. gingivalis was measured using real-time PCR. Results: The P. gingivalis count of CHD patients differs from that of controls and is linked to pocket depth in CHD patients. Conclusion: The P. gingivalis count of CHD patients is higher than that of controls. The count is not linked to any other measure of periodontal status except pocket depth in CHD patients.
Olokundun, Maxwell; Iyiola, Oluwole; Ibidunni, Stephen; Ogbari, Mercy; Falola, Hezekiah; Salau, Odunayo; Peter, Fred; Borishade, Taiye
2018-06-01
The article presented data on the effectiveness of entrepreneurship curriculum contents on university students' entrepreneurial interest and knowledge. The study focused on the perceptions of Nigerian university students, with emphasis on the first four universities in Nigeria to offer a degree programme in entrepreneurship. The study adopted a quantitative approach with a descriptive research design to establish trends related to the objective of the study; a survey was used as the quantitative research method. The population of this study included all students in the selected universities. Data were analyzed with the Statistical Package for the Social Sciences (SPSS), using the mean score as the statistical tool of analysis. The field data set is made widely accessible to enable critical or more comprehensive investigation.
Kidd, I M; Clark, D A; Emery, V C
2000-06-01
Quantitative-competitive polymerase chain reaction (QCPCR) is a well-optimised and objective methodology for the determination of viral load in clinical specimens. A major advantage of QCPCR is the ability to control for the differential modulation of the PCR process in the presence of potentially inhibitory material. QCPCR protocols were developed previously for CMV, HHV-6, HHV-7 and HHV-8 and relied upon radioactively labelled primers, followed by autoradiography of the separated and digested PCR products to quantify viral load. Whilst this approach offers high accuracy and dynamic range, non-radioactive approaches would be attractive. Here, an alternative detection system is reported, based on simple ethidium bromide staining and computer analysis of the separated reaction products, which enables its adoption in the analysis of a large number of samples. In calibration experiments using cloned HHV-7 DNA, the ethidium bromide detection method showed an improved correlation with known copy number over that obtained with the isotopic method. In addition, 67 HHV-7 PCR positive blood samples, derived from immunocompromised patients, were quantified using both detection techniques. The results showed a highly significant correlation with no significant difference between the two methods. The applicability of the computerised densitometry method in the routine laboratory is discussed.
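The calibration step underlying QCPCR densitometry can be sketched as follows. This is an illustration of the general competitive-PCR arithmetic, not the paper's exact protocol: the log intensity ratio of target to competitor bands is regressed against the log of known competitor copy numbers, and the fit is inverted to read off an unknown viral load. All intensities and copy numbers are fabricated.

```python
# Illustrative QCPCR calibration: regress log10(target/competitor band
# intensity) on log10(known competitor copies), then invert the fit.
# All values are fabricated.
import math

def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Competitor copies per reaction vs log10 intensity ratio (fabricated);
# the ratio crosses zero at the equivalence point.
copies = [1e2, 1e3, 1e4, 1e5]
ratios = [2.0, 1.0, 0.0, -1.0]
slope, intercept = linear_fit([math.log10(c) for c in copies], ratios)

def viral_load(log_ratio):
    """Invert the calibration to estimate target copies per reaction."""
    return 10 ** ((log_ratio - intercept) / slope)
```

At a measured log ratio of 0 (equal band intensities), the estimate equals the competitor copy number, which is the defining property of competitive quantitation.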
HPLC analysis and standardization of Brahmi vati – An Ayurvedic poly-herbal formulation
Mishra, Amrita; Mishra, Arun K.; Tiwari, Om Prakash; Jha, Shivesh
2013-01-01
Objectives The aim of the present study was to standardize Brahmi vati (BV) by simultaneous quantitative estimation of Bacoside A3 and Piperine using an HPLC–UV method. BV is an important Ayurvedic poly-herbal formulation used to treat epilepsy and mental disorders, containing thirty-eight ingredients including Bacopa monnieri L. and Piper longum L. Materials and methods An HPLC–UV method was developed for the standardization of BV through simultaneous quantitative estimation of Bacoside A3 and Piperine, the major constituents of B. monnieri L. and P. longum L. respectively. The developed method was validated on parameters including linearity, precision, accuracy and robustness. Results The HPLC analysis showed a significantly higher amount of Bacoside A3 and Piperine in the in-house sample of BV than in all three marketed samples of the same. The results showed variations in the amounts of Bacoside A3 and Piperine among the samples, indicating non-uniformity in their quality that would lead to differences in their therapeutic effects. Conclusion The outcome of the present investigation underlines the importance of standardization of Ayurvedic formulations. The developed method may be further used to standardize other samples of BV or other formulations containing Bacoside A3 and Piperine. PMID:24396246
Do, Jun-Hyeong; Jang, Eunsu; Ku, Boncho; Jang, Jun-Su; Kim, Honggie; Kim, Jong Yeol
2012-07-04
Sasang constitutional medicine (SCM) is a unique form of traditional Korean medicine that divides human beings into four constitutional types (Tae-Yang: TY, Tae-Eum: TE, So-Yang: SY, and So-Eum: SE), which differ in inherited characteristics, such as external appearance, personality traits, susceptibility to particular diseases, drug responses, and equilibrium among internal organ functions. According to SCM, herbs that belong to a certain constitution cannot be used in patients with other constitutions; otherwise, this practice may result in no effect or in an adverse effect. Thus, the diagnosis of SC type is the most crucial step in SCM practice. The diagnosis, however, tends to be subjective due to a lack of quantitative standards for SC diagnosis. We have attempted to make the diagnosis method as objective as possible by basing it on an analysis of quantitative data from various Oriental medical clinics. Four individual diagnostic models were developed with multinomial logistic regression based on face, body shape, voice, and questionnaire responses. Inspired by SCM practitioners' holistic diagnostic processes, an integrated diagnostic model was then proposed by combining the four individual models. The diagnostic accuracies in the test set, after the four individual models had been integrated into a single model, improved to 64.0% and 55.2% in the male and female patient groups, respectively. Using a cut-off value for the integrated SC score, such as 1.6, the accuracies increased by 14.7% in male patients and by 4.6% in female patients, which showed that a higher integrated SC score corresponded to a higher diagnostic accuracy. This study represents the first trial of integrating the objectification of SC diagnosis based on quantitative data and SCM practitioners' holistic diagnostic processes. 
Although the diagnostic accuracy was not great, it is noted that the proposed diagnostic model represents common rules among practitioners who have various points of view. Our results are expected to contribute as a desirable research guide for objective diagnosis in traditional medicine, as well as to contribute to the precise diagnosis of SC types in an objective manner in clinical practice.
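The integration of the four per-feature models can be sketched under an assumption the abstract does not spell out: that each model's class probabilities are combined multiplicatively (here, by summing log-probabilities) and that the margin over the runner-up serves as the integrated SC score to which a cut-off such as 1.6 is applied. The probabilities below are fabricated, and the paper's actual combination rule and score scale may differ.

```python
# Hedged sketch of integrating four diagnostic models (face, body,
# voice, questionnaire) over the four Sasang types. The combination
# rule and all probabilities are illustrative assumptions.
import math

TYPES = ["TY", "TE", "SY", "SE"]

def integrate(model_probs):
    """model_probs: per-model dicts mapping type -> probability.
    Returns (top type, log-score margin over the runner-up)."""
    combined = {t: sum(math.log(p[t]) for p in model_probs) for t in TYPES}
    ranked = sorted(combined, key=combined.get, reverse=True)
    margin = combined[ranked[0]] - combined[ranked[1]]
    return ranked[0], margin

face  = {"TY": 0.05, "TE": 0.60, "SY": 0.20, "SE": 0.15}
body  = {"TY": 0.10, "TE": 0.50, "SY": 0.25, "SE": 0.15}
voice = {"TY": 0.15, "TE": 0.40, "SY": 0.30, "SE": 0.15}
quest = {"TY": 0.10, "TE": 0.45, "SY": 0.25, "SE": 0.20}

diagnosis, margin = integrate([face, body, voice, quest])
confident = margin >= 1.6   # cut-off value mentioned in the abstract
```

The design mirrors the reported finding: diagnoses with a larger integrated margin are the ones where accuracy improves once a cut-off is imposed.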
VanderMolen, Karen M.; Cech, Nadja B.; Paine, Mary F.
2013-01-01
Introduction Grapefruit juice can increase or decrease the systemic exposure of myriad oral medications, leading to untoward effects or reduced efficacy. Furanocoumarins in grapefruit juice have been established as inhibitors of cytochrome P450 3A (CYP3A)-mediated metabolism and P-glycoprotein (P-gp)-mediated efflux, while flavonoids have been implicated as inhibitors of organic anion transporting polypeptide (OATP)-mediated absorptive uptake in the intestine. The potential for drug interactions with a food product necessitates an understanding of the expected concentrations of a suite of structurally diverse and potentially bioactive compounds. Objective Develop methods for the rapid quantitation of two furanocoumarins (bergamottin and 6′,7′-dihydroxybergamottin) and four flavonoids (naringin, naringenin, narirutin, and hesperidin) in five grapefruit juice products using ultra performance liquid chromatography (UPLC). Methodology Grapefruit juice products were extracted with ethyl acetate; the concentrated extract was analyzed by UPLC using acetonitrile:water gradients and a C18 column. Analytes were detected using a photodiode array detector, set at 250 nm (furanocoumarins) and 310 nm (flavonoids). Intraday and interday precision and accuracy and limits of detection and quantitation were determined. Results Rapid (<5.0 min) UPLC methods were developed to measure the aforementioned furanocoumarins and flavonoids. R2 values for the calibration curves of all analytes were >0.999. Considerable between-juice variation in the concentrations of these compounds was observed, and the quantities measured were in agreement with the concentrations published in HPLC studies. Conclusion These analytical methods provide an expedient means to quantitate key furanocoumarins and flavonoids in grapefruit juice and other foods used in dietary substance-drug interaction studies. PMID:23780830
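The calibration arithmetic behind the reported figures can be sketched as follows. This is not the authors' data: a calibration line is fitted for one hypothetical analyte, the R² > 0.999 acceptance criterion is checked, and detection/quantitation limits are derived from the common 3.3·σ/S and 10·σ/S rules (σ = residual standard deviation, S = slope).

```python
# Illustrative UPLC calibration check with fabricated peak areas:
# linear fit, R^2 criterion, and LOD/LOQ from 3.3*sigma/S and 10*sigma/S.
import math

def calibration(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    ss_res = sum(r ** 2 for r in resid)
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot
    sigma = math.sqrt(ss_res / (n - 2))   # residual standard deviation
    return slope, intercept, r2, sigma

concs = [1.0, 2.0, 5.0, 10.0, 20.0]       # hypothetical standards, ug/mL
areas = [10.1, 19.8, 50.3, 99.7, 200.4]   # hypothetical peak areas
slope, intercept, r2, sigma = calibration(concs, areas)
lod = 3.3 * sigma / slope                 # limit of detection
loq = 10.0 * sigma / slope                # limit of quantitation
```

An unknown sample's concentration would then be back-calculated as (area − intercept) / slope, provided it falls above the LOQ.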
Lyngby, Janne G; Court, Michael H; Lee, Pamela M
2017-08-01
The clopidogrel active metabolite (CAM) is unstable and challenging to quantitate. The objective was to validate a new method for stabilization and quantitation of CAM, clopidogrel, and the inactive metabolites clopidogrel carboxylic acid and 2-oxo-clopidogrel in feline plasma. Two healthy cats were administered clopidogrel to demonstrate the in vivo utility of the assay. Stabilization of CAM was achieved by adding 2-bromo-3'-methoxyacetophenone to blood tubes to form a derivatized CAM (CAM-D). Method validation included evaluation of calibration curve linearity, accuracy, and precision; within- and between-assay precision and accuracy; and compound stability using spiked blank feline plasma. Analytes were measured by high-performance liquid chromatography with tandem mass spectrometry. In vivo utility was demonstrated by a pharmacokinetic study of cats given a single oral dose of 18.75 mg clopidogrel. The 2-oxo-clopidogrel metabolite was unstable. Clopidogrel, CAM-D, and clopidogrel carboxylic acid appear stable for 1 week at room temperature and 9 months at -80°C. Standard curves showed linearity for CAM-D, clopidogrel, and clopidogrel carboxylic acid (r > 0.99). Between-assay accuracy and precision were ≤2.6% and ≤7.1% for CAM-D, and ≤17.9% and ≤11.3% for clopidogrel and clopidogrel carboxylic acid. Within-assay precision for all three compounds was ≤7%. All three compounds were detected in plasma from healthy cats receiving clopidogrel. This methodology is accurate and precise for simultaneous quantitation of CAM-D, clopidogrel, and clopidogrel carboxylic acid in feline plasma, but not 2-oxo-clopidogrel. Validation of this assay is the first step toward more fully understanding the use of clopidogrel in cats. Copyright © 2017 Elsevier B.V. All rights reserved.
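The precision and accuracy percentages quoted above are conventionally computed as the coefficient of variation and the percent bias from the nominal concentration. A minimal sketch on fabricated QC replicates (not the study's data):

```python
# Precision = coefficient of variation (%); accuracy = % bias from the
# nominal concentration. QC replicate values below are fabricated.
import statistics

def cv_percent(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def bias_percent(values, nominal):
    return 100.0 * (statistics.mean(values) - nominal) / nominal

qc_replicates = [98.5, 101.2, 99.8, 102.0, 100.5]   # ng/mL, nominal 100
precision = cv_percent(qc_replicates)
accuracy = bias_percent(qc_replicates, nominal=100.0)
```

Within-assay figures use replicates from one run; between-assay figures pool runs across days, but the arithmetic is the same.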
Tozer, Daniel J; Schmierer, Klaus; Chard, Declan T; Anderson, Valerie M; Altmann, Daniel R; Miller, David H; Wheeler-Kingshott, Claudia AM
2013-01-01
Background: There are modest correlations between multiple sclerosis (MS) disability and white matter lesion (WML) volumes, as measured by T2-weighted (T2w) magnetic resonance imaging (MRI) scans (T2-WML). This may partly reflect pathological heterogeneity in WMLs, which is not apparent on T2w scans. Objective: To determine if ADvanced IMage Algebra (ADIMA), a novel MRI post-processing method, can reveal WML heterogeneity from proton-density weighted (PDw) and T2w images. Methods: We obtained conventional PDw and T2w images from 10 patients with relapsing–remitting MS (RRMS) and ADIMA images were calculated from these. We classified all WML into bright (ADIMA-b) and dark (ADIMA-d) sub-regions, which were segmented. We obtained conventional T2-WML and T1-WML volumes for comparison, as well as the following quantitative magnetic resonance parameters: magnetisation transfer ratio (MTR), T1 and T2. Also, we assessed the reproducibility of the segmentation for ADIMA-b, ADIMA-d and T2-WML. Results: Our study’s ADIMA-derived volumes correlated with conventional lesion volumes (p < 0.05). ADIMA-b exhibited higher T1 and T2, and lower MTR than the T2-WML (p < 0.001). Despite the similarity in T1 values between ADIMA-b and T1-WML, these regions were only partly overlapping with each other. ADIMA-d exhibited quantitative characteristics similar to T2-WML; however, they were only partly overlapping. Mean intra- and inter-observer coefficients of variation for ADIMA-b, ADIMA-d and T2-WML volumes were all < 6 % and < 10 %, respectively. Conclusion: ADIMA enabled the simple classification of WML into two groups having different quantitative magnetic resonance properties, which can be reproducibly distinguished. PMID:23037551
Fuld, Matthew K; Halaweish, Ahmed F; Newell, John D; Krauss, Bernhard; Hoffman, Eric A
2013-09-01
Dual-energy x-ray computed tomography (DECT) offers visualization of the airways and quantitation of regional pulmonary ventilation using a single breath of inhaled xenon gas. In this study, we sought to optimize scanning protocols for DECT xenon gas ventilation imaging of the airways and lung parenchyma and to characterize the quantitative nature of the developed protocols through a series of test-object and animal studies. The Institutional Animal Care and Use Committee approved all animal studies reported here. A range of xenon/oxygen gas mixtures (0%, 20%, 25%, 33%, 50%, 66%, 100%; balance oxygen) were scanned in syringes and balloon test-objects to optimize the delivered gas mixture for assessment of regional ventilation while allowing for the development of improved 3-material decomposition calibration parameters. In addition, to alleviate gravitational effects on xenon gas distribution, we replaced a portion of the oxygen in the xenon/oxygen gas mixture with helium and compared gas distributions in a rapid-prototyped human central-airway test-object. Additional syringe tests were performed to determine if the introduction of helium had any effect on xenon quantitation. Xenon gas mixtures were delivered to anesthetized swine to assess airway and lung parenchymal opacification while evaluating various DECT scan acquisition settings. Attenuation curves for xenon were obtained from the syringe test-objects and were used to develop improved 3-material decomposition parameters (Hounsfield unit enhancement per percentage xenon: within the chest phantom, 2.25 at 80 kVp, 1.7 at 100 kVp, and 0.76 at 140 kVp with tin filtration; in open air, 2.5 at 80 kVp, 1.95 at 100 kVp, and 0.81 at 140 kVp with tin filtration). The addition of helium improved the distribution of xenon gas to the gravitationally nondependent portion of the airway tree test-object, while not affecting the quantitation of xenon in the 3-material decomposition DECT. 
The mixture 40% Xe/40% He/20% O2 provided good signal-to-noise ratio (SNR), greater than the Rose criterion (SNR > 5), while avoiding the gravitational effects of similar concentrations of xenon in a 60% O2 mixture. Compared with 100/140 Sn kVp, 80/140 Sn kVp (Sn = tin filtered) provided improved SNR in a swine with a thoracic transverse density equivalent to a human subject with a body mass index of 33 kg/m². Airways were brighter in the 80/140 Sn kVp scan (80/140 Sn, 31.6%; 100/140 Sn, 25.1%) with considerably lower noise (80/140 Sn, coefficient of variation of 0.140; 100/140 Sn, coefficient of variation of 0.216). To provide a truly quantitative measure of regional lung function with xenon-DECT, the basic protocols and parameter calibrations need to be better understood and quantified. It is critically important to understand the fundamentals of new techniques to allow for proper implementation and interpretation of their results before widespread usage. With the use of an in-house derived xenon calibration curve for 3-material decomposition rather than the scanner-supplied calibration, and a xenon/helium/oxygen mixture, we demonstrate highly accurate quantitation of xenon gas volumes and avoid gravitational effects on gas distribution. This study provides a foundation for other researchers to use and test these methods with the goal of clinical translation.
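The quantitation implied by the calibration slopes above can be sketched directly: with an enhancement of 2.25 HU per percent xenon at 80 kVp (the in-chest value quoted), a measured enhancement converts linearly to a regional xenon concentration, and the Rose criterion (SNR > 5) flags whether the signal is usable. The measured values in the example are invented; the full 3-material decomposition is more involved than this single-slope conversion.

```python
# Single-slope sketch of xenon quantitation from HU enhancement, using
# the 80 kVp in-chest calibration slope quoted above. Measured values
# are fabricated; the actual method is a 3-material decomposition.

HU_PER_PCT_XE_80KVP = 2.25   # HU enhancement per percent xenon (quoted)

def xenon_percent(delta_hu, slope=HU_PER_PCT_XE_80KVP):
    """Convert measured HU enhancement to xenon concentration (%)."""
    return delta_hu / slope

def passes_rose(signal_hu, noise_hu):
    """Rose criterion: a signal is reliably detectable if SNR > 5."""
    return (signal_hu / noise_hu) > 5

delta = 67.5                                # enhancement in a region (HU)
concentration = xenon_percent(delta)        # regional xenon fraction
usable = passes_rose(delta, noise_hu=9.0)   # SNR check
```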