Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.
NASA Technical Reports Server (NTRS)
Leonard, Desiree M.
1991-01-01
Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in the three main phases of the application, categorized as image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.
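The pipeline this abstract outlines (segment a binary image, recognize connected objects, compute geometric properties) can be illustrated with a short sketch. The code below is not from the thesis; it is a minimal stand-in using NumPy/SciPy, with the threshold value and 8-connectivity as illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def analyze_binary_image(gray, threshold=128):
    binary = gray > threshold                        # image segmentation
    structure = np.ones((3, 3), dtype=bool)          # 8-connected neighbourhood
    labels, n_objects = ndimage.label(binary, structure=structure)
    objects = []
    for i, slc in enumerate(ndimage.find_objects(labels), start=1):
        mask = labels[slc] == i
        area = int(mask.sum())                       # area in pixels
        cy, cx = ndimage.center_of_mass(mask)        # centroid within the bbox
        objects.append({"label": i, "area": area,
                        "centroid": (slc[0].start + cy, slc[1].start + cx)})
    return n_objects, objects

# Two separated blobs in a toy image
img = np.zeros((10, 10)); img[1:4, 1:4] = 255; img[6:9, 5:9] = 255
print(analyze_binary_image(img)[0])                  # -> 2 objects
```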
Assessment and monitoring of forest ecosystem structure
Oscar A. Aguirre Calderón; Javier Jiménez Pérez; Horst Kramer
2006-01-01
Characterization of forest ecosystem structure must be based on quantitative indices that allow objective analysis of human influences or natural succession processes. The objective of this paper is the compilation of diverse quantitative variables to describe structural attributes of the arboreal stratum of the ecosystem, as well as different methods of forest...
Dolled-Filhart, Marisa P; Gustavson, Mark D
2012-11-01
Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.
Integrated Approach To Design And Analysis Of Systems
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Iverson, David L.
1993-01-01
Object-oriented fault-tree representation unifies evaluation of reliability and diagnosis of faults. Programming/fault tree described more fully in "Object-Oriented Algorithm For Evaluation Of Fault Trees" (ARC-12731). Augmented fault tree object contains more information than fault tree object used in quantitative analysis of reliability. Additional information needed to diagnose faults in system represented by fault tree.
Multivariate Quantitative Chemical Analysis
NASA Technical Reports Server (NTRS)
Kinchen, David G.; Capezza, Mary
1995-01-01
Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which it is necessary to know and control critical properties of products via quantitative chemical analyses. In addition to chemical composition, also used to determine such physical properties as densities and strengths.
A further component analysis for illicit drugs mixtures with THz-TDS
NASA Astrophysics Data System (ADS)
Xiong, Wei; Shen, Jingling; He, Ting; Pan, Rui
2009-07-01
A new method for quantitative analysis of mixtures of illicit drugs with THz time domain spectroscopy was proposed and verified experimentally. The traditional method requires fingerprints of all the pure chemical components. In practice, only the objective components in a mixture and their absorption features are known, so a more practical technique for detection and identification is needed. Our new method for quantitative inspection of mixtures of illicit drugs is based on the derivative spectrum. In this method, the ratio of objective components in a mixture can be obtained on the assumption that all objective components in the mixture and their absorption features are known, while the unknown components need not be. Methamphetamine and flour, an illicit drug and a common adulterant, were selected for our experiment. The experimental result verified the effectiveness of the method, suggesting that it could be an effective method for quantitative identification of illicit drugs. This THz spectroscopy technique is of great significance for real-world quantitative analysis of illicit drugs and could be an effective method in the field of security and pharmaceutical inspection.
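The derivative-spectrum idea can be sketched as follows: differentiating the spectra suppresses slowly varying baselines and constant offsets from unknown components, after which the mixture's derivative spectrum is least-squares fitted as a linear combination of the known components' derivatives. The spectra below are synthetic stand-ins, not measured THz data.

```python
import numpy as np

def component_ratios(mixture, components):
    d_mix = np.gradient(mixture)                     # derivative spectrum
    d_comp = np.stack([np.gradient(c) for c in components], axis=1)
    coeffs, *_ = np.linalg.lstsq(d_comp, d_mix, rcond=None)
    coeffs = np.clip(coeffs, 0.0, None)              # physical (non-negative) ratios
    return coeffs / coeffs.sum()

freq = np.linspace(0.2, 2.6, 500)                    # THz axis (illustrative)
drug = np.exp(-((freq - 1.2) / 0.05) ** 2)           # synthetic absorption peak
adulterant = 0.3 * freq                              # synthetic featureless slope
mix = 0.7 * drug + 0.3 * adulterant + 0.01           # constant offset is suppressed
print(component_ratios(mix, [drug, adulterant]))     # ~[0.7 0.3]
```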
On the Formal-Logical Analysis of the Foundations of Mathematics Applied to Problems in Physics
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2016-03-01
An analysis of the foundations of mathematics as applied to problems in physics is proposed. The unity of formal logic and of rational dialectics is the methodological basis of the analysis. It is shown that critical analysis of the concept of mathematical quantity, the central concept of mathematics, leads to the following conclusions: (1) The concept of ``mathematical quantity'' is the result of the following mental operations: (a) abstraction of the ``quantitative determinacy of physical quantity'' from the ``physical quantity'', whereby the ``quantitative determinacy of physical quantity'' becomes an independent object of thought; (b) abstraction of the ``amount (i.e., abstract number)'' from the ``quantitative determinacy of physical quantity'', whereby the ``amount (i.e., abstract number)'' becomes an independent object of thought. In this case, unnamed abstract numbers are the only sign of the ``mathematical quantity''. This sign is not an essential sign of material objects. (2) The concept of mathematical quantity is a meaningless, erroneous, and inadmissible concept in science, because it represents the following formal-logical and dialectical-materialistic error: negation of the existence of the essential sign of the concept (i.e., negation of the existence of the essence of the concept) and negation of the existence of the measure of a material object.
Economic analysis of light brown apple moth using GIS and quantitative modeling
Glenn Fowler; Lynn Garrett; Alison Neeley; Roger Magarey; Dan Borchert; Brian Spears
2011-01-01
We conducted an economic analysis of the light brown apple moth (LBAM) (Epiphyas postvittana (Walker)), whose presence in California has resulted in a regulatory program. Our objective was to quantitatively characterize the economic costs to apple, grape, orange, and pear crops that would result from LBAM's introduction into the continental...
Xu, Yihua; Pitot, Henry C
2006-03-01
In the studies of quantitative stereology of rat hepatocarcinogenesis, we have used image analysis technology (automatic particle analysis) to obtain data such as liver tissue area, size and location of altered hepatic focal lesions (AHF), and nuclei counts. These data are then used for three-dimensional estimation of AHF occurrence and nuclear labeling index analysis. These are important parameters for quantitative studies of carcinogenesis, for screening and classifying carcinogens, and for risk estimation. To take such measurements, structures or cells of interest should be separated from the other components based on the difference of color and density. Common background problems seen on the captured sample image such as uneven light illumination or color shading can cause severe problems in the measurement. Two application programs (BK_Correction and Pixel_Separator) have been developed to solve these problems. With BK_Correction, common background problems such as incorrect color temperature setting, color shading, and uneven light illumination background, can be corrected. With Pixel_Separator different types of objects can be separated from each other in relation to their color, such as seen with different colors in immunohistochemically stained slides. The resultant images of such objects separated from other components are then ready for particle analysis. Objects that have the same darkness but different colors can be accurately differentiated in a grayscale image analysis system after application of these programs.
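BK_Correction and Pixel_Separator are not described here in implementation detail, but the uneven-illumination problem they address can be sketched with a standard flat-field approach: estimate the smooth background with a heavy blur and divide it out. A hedged stand-in, not the authors' code; the Gaussian sigma is an illustrative assumption.

```python
import numpy as np
from scipy import ndimage

def correct_uneven_illumination(image, sigma=50):
    background = ndimage.gaussian_filter(image.astype(float), sigma)
    corrected = image / np.maximum(background, 1e-6)  # flat-field division
    return corrected * background.mean()              # restore original scale

# Synthetic shaded field: constant tissue signal under a linear illumination ramp
img = np.outer(np.linspace(0.5, 1.5, 256), np.ones(256)) * 100
flat = correct_uneven_illumination(img)
print(flat.std() < img.std())                         # shading removed -> True
```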
NASA Astrophysics Data System (ADS)
Latief, F. D. E.; Mohammad, I. H.; Rarasati, A. D.
2017-11-01
Digital imaging of a concrete sample using high resolution tomographic imaging by means of X-Ray Micro Computed Tomography (μ-CT) has been conducted to assess the characteristics of the sample’s structure. A standard procedure of image acquisition, reconstruction, and image processing using a particular scanning device, i.e., the Bruker SkyScan 1173 High Energy Micro-CT, is elaborated. A qualitative and a quantitative analysis were briefly performed on the sample to deliver some basic ideas of the capability of the system and the bundled software package. Calculations of total VOI volume, object volume, percent object volume, total VOI surface, object surface, object surface/volume ratio, object surface density, structure thickness, structure separation, and total porosity were conducted and analysed. This paper should serve as a brief description of how the device can produce the preferred image quality as well as the ability of the bundled software packages to help in performing qualitative and quantitative analysis.
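Two of the listed metrics, percent object volume and total porosity, follow directly from voxel counts in a binarized volume of interest (VOI). A minimal sketch (not the vendor's bundled software); the voxel size and the synthetic volume are illustrative assumptions.

```python
import numpy as np

def voi_metrics(binary_volume, voxel_size_mm=0.01):
    total_voxels = binary_volume.size
    object_voxels = int(binary_volume.sum())
    object_volume = object_voxels * voxel_size_mm ** 3       # mm^3
    percent_object = 100.0 * object_voxels / total_voxels
    total_porosity = 100.0 - percent_object                  # % of VOI that is pore/void
    return {"object_volume_mm3": object_volume,
            "percent_object_volume": percent_object,
            "total_porosity_pct": total_porosity}

vol = np.random.rand(64, 64, 64) > 0.4    # synthetic binarized concrete volume
print(voi_metrics(vol))
```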
ERIC Educational Resources Information Center
Reese, Debbie Denise; Tabachnick, Barbara G.
2010-01-01
In this paper, the authors summarize a quantitative analysis demonstrating that the CyGaMEs toolset for embedded assessment of learning within instructional games measures growth in conceptual knowledge by quantifying player behavior. CyGaMEs stands for Cyberlearning through GaME-based, Metaphor Enhanced Learning Objects. Some scientists of…
ERIC Educational Resources Information Center
Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.
2012-01-01
Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…
Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana
2014-01-01
Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions: The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282
United States Marine Corps Basic Reconnaissance Course: Predictors of Success
2017-03-01
The objective of my research is to provide quantitative ...percent over the last three years, illustrating there is room for improvement. This study conducts a quantitative and qualitative analysis of the...criteria used to select candidates for the BRC. The research uses multi-variate logistic regression models and survival analysis to determine to what
NASA Technical Reports Server (NTRS)
1986-01-01
Digital imaging is the computer-processed numerical representation of physical images. Enhancement of images results in easier interpretation. Quantitative digital image analysis by Perceptive Scientific Instruments locates objects within an image and measures them to extract quantitative information. Applications include CAT scanners, radiography, and microscopy in medicine, as well as various industrial and manufacturing uses. The PSICOM 327 performs all digital image analysis functions. It is based on Jet Propulsion Laboratory technology and is accurate and cost efficient.
Quantitative analysis of voids in percolating structures in two-dimensional N-body simulations
NASA Technical Reports Server (NTRS)
Harrington, Patrick M.; Melott, Adrian L.; Shandarin, Sergei F.
1993-01-01
We present in this paper a quantitative method for defining void size in large-scale structure based on percolation threshold density. Beginning with two-dimensional gravitational clustering simulations smoothed to the threshold of nonlinearity, we perform percolation analysis to determine the large scale structure. The resulting objective definition of voids has a natural scaling property, is topologically interesting, and can be applied immediately to redshift surveys.
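The percolation criterion can be sketched as follows: threshold the smoothed density field at successively lower levels and report the highest level at which one connected region spans the box; regions below that level define the voids. The density field below is a synthetic random field, not a gravitational clustering simulation.

```python
import numpy as np
from scipy import ndimage

def percolates(field, threshold):
    labels, _ = ndimage.label(field >= threshold)
    spans_x = bool(set(labels[:, 0]).intersection(labels[:, -1]) - {0})
    spans_y = bool(set(labels[0, :]).intersection(labels[-1, :]) - {0})
    return spans_x or spans_y                 # a cluster touches opposite edges

def percolation_threshold(field, n_levels=64):
    for t in np.linspace(field.max(), field.min(), n_levels):
        if percolates(field, t):              # first (densest) spanning level
            return t
    return None

rng = np.random.default_rng(0)
density = ndimage.gaussian_filter(rng.random((128, 128)), sigma=3)
print(percolation_threshold(density))
```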
Luciw, Paul A; Oslund, Karen L; Yang, Xiao-Wei; Adamson, Lourdes; Ravindran, Resmi; Canfield, Don R; Tarara, Ross; Hirst, Linda; Christensen, Miles; Lerche, Nicholas W; Offenstein, Heather; Lewinsohn, David; Ventimiglia, Frank; Brignolo, Laurie; Wisner, Erik R; Hyde, Dallas M
2011-11-01
Infection with Mycobacterium tuberculosis primarily produces a multifocal distribution of pulmonary granulomas in which the pathogen resides. Accordingly, quantitative assessment of the bacterial load and pathology is a substantial challenge in tuberculosis. Such assessments are critical for studies of the pathogenesis and for the development of vaccines and drugs in animal models of experimental M. tuberculosis infection. Stereology enables unbiased quantitation of three-dimensional objects from two-dimensional sections and thus is suited to quantify histological lesions. We have developed a protocol for stereological analysis of the lung in rhesus macaques inoculated with a pathogenic clinical strain of M. tuberculosis (Erdman strain). These animals exhibit a pattern of infection and tuberculosis similar to that of naturally infected humans. Conditions were optimized for collecting lung samples in a nonbiased, random manner. Bacterial load in these samples was assessed by a standard plating assay, and granulomas were graded and enumerated microscopically. Stereological analysis provided quantitative data that supported a significant correlation between bacterial load and lung granulomas. Thus this stereological approach enables a quantitative, statistically valid analysis of the impact of M. tuberculosis infection in the lung and will serve as an essential tool for objectively comparing the efficacy of drugs and vaccines.
Quantitative Analysis of Qualitative Information from Interviews: A Systematic Literature Review
ERIC Educational Resources Information Center
Fakis, Apostolos; Hilliam, Rachel; Stoneley, Helen; Townend, Michael
2014-01-01
Background: A systematic literature review was conducted on mixed methods area. Objectives: The overall aim was to explore how qualitative information from interviews has been analyzed using quantitative methods. Methods: A contemporary review was undertaken and based on a predefined protocol. The references were identified using inclusion and…
ERIC Educational Resources Information Center
Castillo, Enrico G.; Pincus, Harold A.; Wieland, Melissa; Roter, Debra; Larson, Susan; Houck, Patricia; Reynolds, Charles F.; Cruz, Mario
2012-01-01
Objective: The authors quantitatively examined differences in psychiatric residents' and attending physicians' communication profiles and voice tones. Methods: Audiotaped recordings of 49 resident-patient and 35 attending-patient medication-management appointments at four ambulatory sites were analyzed with the Roter Interaction Analysis System…
Madill, A; Jordan, A; Shirley, C
2000-02-01
The effect of the individual analyst on research findings can create a credibility problem for qualitative approaches from the perspective of evaluative criteria utilized in quantitative psychology. This paper explicates the ways in which objectivity and reliability are understood in qualitative analysis conducted from within three distinct epistemological frameworks: realism, contextual constructionism, and radical constructionism. It is argued that quality criteria utilized in quantitative psychology are appropriate to the evaluation of qualitative analysis only to the extent that it is conducted within a naive or scientific realist framework. The discussion is illustrated with reference to the comparison of two independent grounded theory analyses of identical material. An implication of this illustration is to identify the potential to develop a radical constructionist strand of grounded theory.
Computer simulation of schlieren images of rotationally symmetric plasma systems: a simple method.
Noll, R; Haas, C R; Weikl, B; Herziger, G
1986-03-01
Schlieren techniques are commonly used methods for quantitative analysis of cylindrical or spherical index of refraction profiles. Many schlieren objects, however, are characterized by more complex geometries, so we have investigated the more general case of noncylindrical, rotationally symmetric distributions of index of refraction n(r,z). Assuming straight ray paths in the schlieren object we have calculated 2-D beam deviation profiles. It is shown that experimental schlieren images of the noncylindrical plasma generated by a plasma focus device can be simulated with these deviation profiles. The computer simulation allows a quantitative analysis of these schlieren images, which yields, for example, the plasma parameters, electron density, and electron density gradients.
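Under the straight-ray assumption stated above, the beam deviation is the transverse gradient of the refractive index integrated along the ray, theta_y ≈ ∫ (1/n) ∂n/∂y dx. A numerical sketch with a synthetic rotationally symmetric n(r,z), not the plasma-focus data:

```python
import numpy as np

nx, ny, nz = 64, 64, 64
x = np.linspace(-1, 1, nx); y = np.linspace(-1, 1, ny); z = np.linspace(-1, 1, nz)
X, Y, Z = np.meshgrid(x, y, z, indexing="ij")
R = np.sqrt(X**2 + Y**2)
n = 1.0 - 1e-4 * np.exp(-R**2 / 0.1 - Z**2 / 0.3)   # synthetic n(r, z) channel

dn_dy = np.gradient(n, y, axis=1)                    # transverse index gradient
dx = x[1] - x[0]
theta_y = (dn_dy / n).sum(axis=0) * dx               # 2-D deviation map over (y, z)
print(float(np.abs(theta_y).max()))                  # peak deflection angle [rad]
```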
Quantitative method of medication system interface evaluation.
Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F
2007-01-01
The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of estimated failure rates provided quantitative data for fault analysis. Authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
ERIC Educational Resources Information Center
Chudgar, Amita; Luschei, Thomas F.
2016-01-01
The objective of this commentary is to call attention to the feasibility and importance of large-scale, systematic, quantitative analysis in international and comparative education research. We contend that although many existing databases are under- or unutilized in quantitative international-comparative research, these resources present the…
USDA-ARS?s Scientific Manuscript database
Cotton cultivars with reduced fiber-seed attachment force have the potential to be ginned faster with less energy. The objective of this study was to identify quantitative trait loci (QTL) for net ginning energy (NGE) requirement, and its relationship with other fiber quality traits in upland cotton...
Application of magnetic carriers to two examples of quantitative cell analysis
NASA Astrophysics Data System (ADS)
Zhou, Chen; Qian, Zhixi; Choi, Young Suk; David, Allan E.; Todd, Paul; Hanley, Thomas R.
2017-04-01
The use of magnetophoretic mobility as a surrogate for fluorescence intensity in quantitative cell analysis was investigated. The objectives of quantitative fluorescence flow cytometry include establishing a level of labeling for the setting of parameters in fluorescence activated cell sorters (FACS) and the determination of levels of uptake of fluorescently labeled substrates by living cells. Likewise, the objectives of quantitative magnetic cytometry include establishing a level of labeling for the setting of parameters in flowing magnetic cell sorters and the determination of levels of uptake of magnetically labeled substrates by living cells. The magnetic counterpart to fluorescence intensity is magnetophoretic mobility, defined as the velocity imparted to a suspended cell per unit of magnetic ponderomotive force. A commercial velocimeter available for making this measurement was used to demonstrate both applications. Cultured Gallus lymphoma cells were immunolabeled with commercial magnetic beads and shown to have adequate magnetophoretic mobility to be separated by a novel flowing magnetic separator. Phagocytosis of starch nanoparticles having magnetic cores by cultured Chinese hamster ovary cells, a CHO line, was quantified on the basis of magnetophoretic mobility.
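With mobility defined, as above, as velocity per unit ponderomotive force, the computation itself is a simple ratio applied to each tracked cell. The numbers below are illustrative, not measurements from the study.

```python
import numpy as np

def magnetophoretic_mobility(velocities, forces):
    # velocity imparted per unit magnetic ponderomotive force [m s^-1 N^-1]
    return np.asarray(velocities) / np.asarray(forces)

v = np.array([18e-6, 22e-6, 20e-6])            # tracked cell velocities [m/s]
F = np.array([0.45e-12, 0.55e-12, 0.5e-12])    # ponderomotive forces [N]
print(magnetophoretic_mobility(v, F).mean())   # population mean mobility
```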
THE APPLICATION OF CYBERNETICS IN PEDAGOGY.
ERIC Educational Resources Information Center
ATUTOV, P.R.
THE APPLICATION OF CYBERNETICS TO PEDAGOGY CAN CREATE A PRECISE SCIENCE OF INSTRUCTION AND EDUCATION THROUGH THE TIME-CONSUMING BUT INEVITABLE TRANSITION FROM IDENTIFICATION OF QUALITATIVE RELATIONSHIPS AMONG PEDAGOGICAL OBJECTS TO QUANTITATIVE ANALYSIS OF THESE OBJECTS. THE THEORETICAL UTILITY OF MATHEMATICAL MODELS AND FORMULAE FOR EXPLANATORY…
Anguera, M Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2018-01-01
Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.
Boucheron, Laura E
2013-07-16
Quantitative object and spatial arrangement-level analysis of tissue are detailed using expert (pathologist) input to guide the classification process. A two-step method is disclosed for imaging tissue, by classifying one or more biological materials, e.g. nuclei, cytoplasm, and stroma, in the tissue into one or more identified classes on a pixel-by-pixel basis, and segmenting the identified classes to agglomerate one or more sets of identified pixels into segmented regions. Typically, the one or more biological materials comprises nuclear material, cytoplasm material, and stromal material. The method further allows a user to mark up the image subsequent to the classification to re-classify said materials. The markup is performed via a graphic user interface to edit designated regions in the image.
ERIC Educational Resources Information Center
Yeni, Sabiha; Ozdener, Nesrin
2014-01-01
The purpose of the study is to investigate how pre-service teachers benefit from learning objects repositories while preparing course content. Qualitative and quantitative data collection methods were used in a mixed methods approach. This study was carried out with 74 teachers from the Faculty of Education. In the first phase of the study,…
Products of combustion of non-metallic materials
NASA Technical Reports Server (NTRS)
Perry, Cortes L.
1995-01-01
The objective of this project is to evaluate methodologies for the qualitative and quantitative determination of the gaseous products of combustion of non-metallic materials of interest to the aerospace community. The goal is to develop instrumentation and analysis procedures which qualitatively and quantitatively identify gaseous products evolved by thermal decomposition and provide NASA a detailed system operating procedure.
DOT National Transportation Integrated Search
1995-07-01
An objective and quantitative method has been developed for deriving models of complex and specialized spheres of activity (domains) from domain-generated verbal data. The method was developed for analysis of interview transcripts, incident reports, ...
Matrix evaluation of science objectives
NASA Technical Reports Server (NTRS)
Wessen, Randii R.
1994-01-01
The most fundamental objective of all robotic planetary spacecraft is to return science data. To accomplish this, a spacecraft is fabricated and built, software is planned and coded, and a ground system is designed and implemented. However, the quantitative analysis required to determine how the collection of science data drives ground system capabilities has received very little attention. This paper defines a process by which science objectives can be quantitatively evaluated. By applying it to the Cassini Mission to Saturn, this paper further illustrates the power of this technique. The results show which science objectives drive specific ground system capabilities. In addition, this process can assist system engineers and scientists in the selection of the science payload during pre-project mission planning; ground system designers during ground system development and implementation; and operations personnel during mission operations.
Meta-Analysis and Systematic Review Assessing the Efficacy of Dialectical Behavior Therapy (DBT)
ERIC Educational Resources Information Center
Panos, Patrick T.; Jackson, John W.; Hasan, Omar; Panos, Angelea
2014-01-01
Objective: The objective was to quantitatively and qualitatively examine the efficacy of DBT (e.g., decreasing life-threatening suicidal and parasuicidal acts, attrition, and depression) explicitly with borderline personality disorder (BPD) and using conservative assumptions and criteria, across treatment providers and settings. Method: Five…
Histopathological image analysis of chemical-induced hepatocellular hypertrophy in mice.
Asaoka, Yoshiji; Togashi, Yuko; Mutsuga, Mayu; Imura, Naoko; Miyoshi, Tomoya; Miyamoto, Yohei
2016-04-01
Chemical-induced hepatocellular hypertrophy is frequently observed in rodents, and is mostly caused by the induction of phase I and phase II drug metabolic enzymes and peroxisomal lipid metabolic enzymes. Liver weight is a sensitive and commonly used marker for detecting hepatocellular hypertrophy, but is also increased by a number of other factors. Histopathological observations subjectively detect changes such as hepatocellular hypertrophy based on the size of a hepatocyte. Therefore, quantitative microscopic observations are required to evaluate histopathological alterations objectively. In the present study, we developed a novel quantitative method for an image analysis of hepatocellular hypertrophy using liver sections stained with hematoxylin and eosin, and demonstrated its usefulness for evaluating hepatocellular hypertrophy induced by phenobarbital (a phase I and phase II enzyme inducer) and clofibrate (a peroxisomal enzyme inducer) in mice. The algorithm of this imaging analysis was designed to recognize an individual hepatocyte through a combination of pixel-based and object-based analyses. Hepatocellular nuclei and the surrounding non-hepatocellular cells were recognized by the pixel-based analysis, while the areas of the recognized hepatocellular nuclei were then expanded until they ran against their expanding neighboring hepatocytes and surrounding non-hepatocellular cells by the object-based analysis. The expanded area of each hepatocellular nucleus was regarded as the size of an individual hepatocyte. The results of this imaging analysis showed that changes in the sizes of hepatocytes corresponded with histopathological observations in phenobarbital and clofibrate-treated mice, and revealed a correlation between hepatocyte size and liver weight. In conclusion, our novel image analysis method is very useful for quantitative evaluations of chemical-induced hepatocellular hypertrophy. Copyright © 2015 Elsevier GmbH. All rights reserved.
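The two-step recognition described (pixel-based detection of nuclei, then object-based expansion of each nucleus until it meets its neighbours) can be approximated with a seeded watershed on a distance transform. This is a hedged stand-in for the authors' algorithm, with synthetic masks in place of stained sections.

```python
import numpy as np
from scipy import ndimage
from skimage.segmentation import watershed

def hepatocyte_sizes(nuclei_mask, tissue_mask):
    seeds, _ = ndimage.label(nuclei_mask)                 # one seed per nucleus
    distance = ndimage.distance_transform_edt(~nuclei_mask)
    regions = watershed(distance, markers=seeds, mask=tissue_mask)
    return np.bincount(regions.ravel())[1:]              # px area per hepatocyte

rng = np.random.default_rng(1)
nuclei = np.zeros((100, 100), bool)
for cy, cx in rng.integers(10, 90, size=(12, 2)):
    nuclei[cy - 2:cy + 2, cx - 2:cx + 2] = True          # synthetic nuclei
tissue = np.ones((100, 100), bool)
print(hepatocyte_sizes(nuclei, tissue).mean())           # mean hepatocyte area (px)
```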
SSBD: a database of quantitative data of spatiotemporal dynamics of biological phenomena
Tohsato, Yukako; Ho, Kenneth H. L.; Kyoda, Koji; Onami, Shuichi
2016-01-01
Motivation: Rapid advances in live-cell imaging analysis and mathematical modeling have produced a large amount of quantitative data on spatiotemporal dynamics of biological objects ranging from molecules to organisms. There is now a crucial need to bring these large amounts of quantitative biological dynamics data together centrally in a coherent and systematic manner. This will facilitate the reuse of this data for further analysis. Results: We have developed the Systems Science of Biological Dynamics database (SSBD) to store and share quantitative biological dynamics data. SSBD currently provides 311 sets of quantitative data for single molecules, nuclei and whole organisms in a wide variety of model organisms from Escherichia coli to Mus musculus. The data are provided in Biological Dynamics Markup Language format and also through a REST API. In addition, SSBD provides 188 sets of time-lapse microscopy images from which the quantitative data were obtained and software tools for data visualization and analysis. Availability and Implementation: SSBD is accessible at http://ssbd.qbic.riken.jp. Contact: sonami@riken.jp PMID:27412095
2016-09-15
Investigative Questions: This research will quantitatively address the impact of proposed benefits of a 3D printed satellite architecture on the...subsystems of a CubeSat. The objective of this research is to bring a quantitative analysis to the discussion of whether a fully 3D printed satellite...manufacturers to quantitatively address what impact the architecture would have on the subsystems of a CubeSat. Summary of Research Gap, Research Questions, and
ERIC Educational Resources Information Center
Storfer-Isser, Amy; Musher-Eizenman, Dara
2013-01-01
Objective: To examine the psychometric properties of 9 quantitative items that assess time scarcity and fatigue as parent barriers to planning and preparing meals for their children. Methods: A convenience sample of 342 parents of children aged 2-6 years completed a 20-minute online survey. Exploratory factor analysis was used to examine the…
Quantification of EEG reactivity in comatose patients
Hermans, Mathilde C.; Westover, M. Brandon; van Putten, Michel J.A.M.; Hirsch, Lawrence J.; Gaspard, Nicolas
2016-01-01
Objective: EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. Methods: In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. Results: The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed the best accuracy (median AUC: 0.95) and was accompanied by substantial agreement with the individual opinions of experts (Gwet’s AC1: 65–70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-experts’ agreement regarding reactivity for each individual case. Conclusion: Automated quantitative EEG approaches based on probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Significance: Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. PMID:26183757
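The core quantitative step, comparing spectral characteristics of matched segments before and after stimulation onset in the four bands, can be sketched as band-wise log power ratios. The authors' probability model combining the 13 parameters is not reproduced here; sampling rate and signals are illustrative.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_log_ratios(pre, post, fs=256):
    f, p_pre = welch(pre, fs=fs, nperseg=fs * 4)
    _, p_post = welch(post, fs=fs, nperseg=fs * 4)
    out = {}
    for name, (lo, hi) in BANDS.items():
        m = (f >= lo) & (f < hi)
        out[name] = float(np.log(p_post[m].mean() / p_pre[m].mean()))
    return out                                 # ~0 in every band = unreactive

rng = np.random.default_rng(0)
pre = rng.standard_normal(256 * 60)            # 1 min before stimulation
post = 0.5 * rng.standard_normal(256 * 60)     # 1 min after: attenuated EEG
print(band_log_ratios(pre, post))              # negative ratios in all bands
```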
Taxonomy based analysis of force exchanges during object grasping and manipulation
Martin-Brevet, Sandra; Jarrassé, Nathanaël; Burdet, Etienne
2017-01-01
The flexibility of the human hand in object manipulation is essential for daily life activities, but remains relatively little explored with quantitative methods. On the one hand, recent taxonomies describe qualitatively the classes of hand postures for object grasping and manipulation. On the other hand, the quantitative analysis of hand function has been generally restricted to precision grip (with thumb and index opposition) during lifting tasks. The aim of the present study is to fill the gap between these two kinds of descriptions, by investigating quantitatively the forces exerted by the hand on an instrumented object in a set of representative manipulation tasks. The object was a parallelepiped able to measure the force exerted on each of its six faces and its acceleration. The grasping force was estimated from the lateral force and the unloading force from the bottom force. The protocol included eleven tasks with complementary constraints inspired by recent taxonomies: four tasks corresponding to lifting and holding the object with different grasp configurations, and seven to manipulating the object (rotation around each of its axes and translation). The grasping and unloading forces and object rotations were measured during the five phases of the actions: unloading, lifting, holding or manipulation, preparation to deposit, and deposit. The results confirm the tight regulation between grasping and unloading forces during lifting, and extend this to the deposit phase. In addition, they provide a precise description of the regulation of force exchanges during various manipulation tasks spanning representative actions of daily life. The timing of manipulation showed both sequential and overlapping organization of the different sub-actions, and micro-errors could be detected. This phenomenological study confirms the feasibility of using an instrumented object to investigate complex manipulative behavior in humans. This protocol will be used in the future to investigate upper-limb dexterity in patients with sensory-motor impairments. PMID:28562617
2012-05-18
by the AWAC. It is a surface-penetrating device that measures continuous changes in the water elevations over time at much higher sampling rates of...background subtraction, a technique based on detecting change from a background scene. Their study highlights the difficulty in object detection and tracking...movements (Zhang et al. 2009). Alternatively, another common object detection method, known as Optical Flow Analysis, may be utilized for vessel
A programmable light engine for quantitative single molecule TIRF and HILO imaging.
van 't Hoff, Marcel; de Sars, Vincent; Oheim, Martin
2008-10-27
We report on a simple yet powerful implementation of objective-type total internal reflection fluorescence (TIRF) and highly inclined and laminated optical sheet (HILO, a type of dark-field) illumination. Instead of focusing the illuminating laser beam to a single spot close to the edge of the microscope objective, we scan the focused spot in a circular orbit during the acquisition of a fluorescence image, thereby illuminating the sample from various directions. We measure parameters relevant for quantitative image analysis during fluorescence image acquisition by capturing an image of the excitation light distribution in an equivalent objective backfocal plane (BFP). Operating at scan rates above 1 MHz, our programmable light engine allows directional averaging by circularly spinning the spot even for sub-millisecond exposure times. We show that restoring the symmetry of TIRF/HILO illumination reduces scattering and produces an evenly lit field-of-view that affords on-line analysis of evanescent-field excited fluorescence without pre-processing. Utilizing crossed acousto-optical deflectors, our device generates arbitrary intensity profiles in the BFP, permitting variable-angle, multi-color illumination, or objective lenses to be rapidly exchanged.
A Historical Analysis of Internal Review
1981-03-01
the background material presented. In such a study as this, the absence of quantitative data forces narrative descriptions and arguments vice... definitive graphic displays. The chapter sought to convey a sense of the history and development of auditing in general and internal review in particular. In...measure represents the closest feasible way of measuring the accomplishment of an objective that cannot itself be expressed quantitatively. Such a measure
Gross, Colin A; Reddy, Chandan K; Dazzo, Frank B
2010-02-01
Quantitative microscopy and digital image analysis are underutilized in microbial ecology largely because of the laborious task to segment foreground object pixels from background, especially in complex color micrographs of environmental samples. In this paper, we describe an improved computing technology developed to alleviate this limitation. The system's uniqueness is its ability to edit digital images accurately when presented with the difficult yet commonplace challenge of removing background pixels whose three-dimensional color space overlaps the range that defines foreground objects. Image segmentation is accomplished by utilizing algorithms that address color and spatial relationships of user-selected foreground object pixels. Performance of the color segmentation algorithm evaluated on 26 complex micrographs at single pixel resolution had an overall pixel classification accuracy of 99+%. Several applications illustrate how this improved computing technology can successfully resolve numerous challenges of complex color segmentation in order to produce images from which quantitative information can be accurately extracted, thereby gain new perspectives on the in situ ecology of microorganisms. Examples include improvements in the quantitative analysis of (1) microbial abundance and phylotype diversity of single cells classified by their discriminating color within heterogeneous communities, (2) cell viability, (3) spatial relationships and intensity of bacterial gene expression involved in cellular communication between individual cells within rhizoplane biofilms, and (4) biofilm ecophysiology based on ribotype-differentiated radioactive substrate utilization. The stand-alone executable file plus user manual and tutorial images for this color segmentation computing application are freely available at http://cme.msu.edu/cmeias/ . This improved computing technology opens new opportunities of imaging applications where discriminating colors really matter most, thereby strengthening quantitative microscopy-based approaches to advance microbial ecology in situ at individual single-cell resolution.
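A simplified, colour-only stand-in for the segmentation idea (the published CMEIAS system also exploits spatial relationships among pixels): classify each pixel by its nearest match among user-selected foreground and background colour samples in RGB space.

```python
import numpy as np

def segment_by_color_samples(image, fg_samples, bg_samples):
    pixels = image.reshape(-1, 3).astype(float)
    # Distance to the closest user-picked sample of each class
    d_fg = np.linalg.norm(pixels[:, None] - fg_samples[None], axis=2).min(1)
    d_bg = np.linalg.norm(pixels[:, None] - bg_samples[None], axis=2).min(1)
    return (d_fg < d_bg).reshape(image.shape[:2])   # True = foreground object

img = np.random.randint(0, 255, (32, 32, 3))        # synthetic colour micrograph
fg = np.array([[200.0, 40.0, 40.0]])                # user-picked "cell" colour
bg = np.array([[30.0, 30.0, 30.0]])                 # user-picked background colour
print(segment_by_color_samples(img, fg, bg).mean()) # foreground pixel fraction
```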
DOT National Transportation Integrated Search
1981-10-01
The objectives of the Systems Operation Studies (SOS) for automated guideway transit (AGT) systems are to develop models for the analysis of system operations, to evaluate performance and cost, and to establish guidelines for the design and operation...
A benchmark for comparison of dental radiography analysis algorithms.
Wang, Ching-Wei; Huang, Cheng-Ta; Lee, Jia-Hong; Li, Chung-Hsing; Chang, Sheng-Wei; Siao, Ming-Jhih; Lai, Tat-Ming; Ibragimov, Bulat; Vrtovec, Tomaž; Ronneberger, Olaf; Fischer, Philipp; Cootes, Tim F; Lindner, Claudia
2016-07-01
Dental radiography plays an important role in clinical diagnosis, treatment and surgery. In recent years, efforts have been made on developing computerized dental X-ray image analysis systems for clinical usages. A novel framework for objective evaluation of automatic dental radiography analysis algorithms has been established under the auspices of the IEEE International Symposium on Biomedical Imaging 2015 Bitewing Radiography Caries Detection Challenge and Cephalometric X-ray Image Analysis Challenge. In this article, we present the datasets, methods and results of the challenge and lay down the principles for future uses of this benchmark. The main contributions of the challenge include the creation of the dental anatomy data repository of bitewing radiographs, the creation of the anatomical abnormality classification data repository of cephalometric radiographs, and the definition of objective quantitative evaluation for comparison and ranking of the algorithms. With this benchmark, seven automatic methods for analysing cephalometric X-ray image and two automatic methods for detecting bitewing radiography caries have been compared, and detailed quantitative evaluation results are presented in this paper. Based on the quantitative evaluation results, we believe automatic dental radiography analysis is still a challenging and unsolved problem. The datasets and the evaluation software will be made available to the research community, further encouraging future developments in this field. (http://www-o.ntust.edu.tw/~cweiwang/ISBI2015/). Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Quantitative Hyperspectral Reflectance Imaging
Klein, Marvin E.; Aalderink, Bernard J.; Padoan, Roberto; de Bruin, Gerrit; Steemers, Ted A.G.
2008-01-01
Hyperspectral imaging is a non-destructive optical analysis technique that can for instance be used to obtain information from cultural heritage objects unavailable with conventional colour or multi-spectral photography. This technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation and study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared). By using a wavelength tunable narrow-bandwidth light-source, the light energy used to illuminate the measured object is minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document, where different types of ink had been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands) to study degradation effects of artificial samples and original documents, exposed in their permanent exhibition area or stored in their deposit rooms. PMID:27873831
CCTV Coverage Index Based on Surveillance Resolution and Its Evaluation Using 3D Spatial Analysis
Choi, Kyoungah; Lee, Impyeong
2015-01-01
We propose a novel approach to evaluating how effectively a closed circuit television (CCTV) system can monitor a targeted area. With 3D models of the target area and the camera parameters of the CCTV system, the approach produces surveillance coverage index, which is newly defined in this study as a quantitative measure for surveillance performance. This index indicates the proportion of the space being monitored with a sufficient resolution to the entire space of the target area. It is determined by computing surveillance resolution at every position and orientation, which indicates how closely a specific object can be monitored with a CCTV system. We present full mathematical derivation for the resolution, which depends on the location and orientation of the object as well as the geometric model of a camera. With the proposed approach, we quantitatively evaluated the surveillance coverage of a CCTV system in an underground parking area. Our evaluation process provided various quantitative-analysis results, compelling us to examine the design of the CCTV system prior to its installation and understand the surveillance capability of an existing CCTV system. PMID:26389909
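The coverage index can be sketched as the fraction of sampled target positions seen by at least one camera with surveillance resolution above a required minimum. The pinhole resolution model, the 2-D grid, and the camera parameters below are simplifying assumptions, not the paper's full 3-D analysis.

```python
import numpy as np

def coverage_index(grid_pts, cameras, min_px_per_m=25.0):
    covered = np.zeros(len(grid_pts), dtype=bool)
    for cam in cameras:
        diff = grid_pts - cam["pos"]
        d = np.maximum(np.linalg.norm(diff, axis=1), 1e-9)
        px_per_m = cam["focal_px"] / d                    # pinhole resolution falloff
        in_fov = (diff / d[:, None]) @ cam["dir"] > np.cos(cam["half_fov"])
        covered |= in_fov & (px_per_m >= min_px_per_m)
    return covered.mean()                                 # surveillance coverage index

xx, yy = np.meshgrid(np.linspace(0, 30, 61), np.linspace(0, 20, 41))
pts = np.column_stack([xx.ravel(), yy.ravel()])           # 30 m x 20 m target area
cams = [{"pos": np.array([0.0, 0.0]),
         "dir": np.array([0.8, 0.6]),                     # unit viewing direction
         "half_fov": np.deg2rad(35),
         "focal_px": 1200.0}]
print(coverage_index(pts, cams))
```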
Quantitative subsurface analysis using frequency modulated thermal wave imaging
NASA Astrophysics Data System (ADS)
Subhani, S. K.; Suresh, B.; Ghali, V. S.
2018-01-01
Quantitative depth analysis of subsurface anomalies with enhanced depth resolution is a challenging task in thermography. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the subsequent thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier transform based methods used for post-processing unscramble the frequencies with limited frequency resolution and thus yield a finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution to axially explore the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide the closest estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first solution for quantitative depth estimation in frequency modulated thermal wave imaging.
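The spectral zooming step can be illustrated with SciPy's chirp z-transform, which evaluates the spectrum on an arbitrarily fine grid inside a narrow band; the signal and band below are illustrative, not thermal-response data.

```python
import numpy as np
from scipy.signal import czt   # requires SciPy >= 1.8

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 100.37 * t)             # tone off the 1 Hz FFT grid

f1, f2, m = 99.0, 102.0, 512                   # zoom band [Hz] and output bins
w = np.exp(-2j * np.pi * (f2 - f1) / (m * fs)) # ratio between spectrum points
a = np.exp(2j * np.pi * f1 / fs)               # start of the zoom band
spectrum = czt(x, m=m, w=w, a=a)               # DTFT sampled finely in [f1, f2)
freqs = f1 + np.arange(m) * (f2 - f1) / m
print(freqs[np.argmax(np.abs(spectrum))])      # ~100.37, despite the 1 Hz FFT grid
```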
Using enterprise architecture to analyse how organisational structure impact motivation and learning
NASA Astrophysics Data System (ADS)
Närman, Pia; Johnson, Pontus; Gingnell, Liv
2016-06-01
When technology, environment, or strategies change, organisations need to adjust their structures accordingly. These structural changes do not always enhance organisational performance as intended, partly because organisational developers do not understand the performance consequences of structural changes. This article presents a model-based framework for quantitative analysis of the effect of organisational structure on organisational performance in terms of employee motivation and learning. The model is based on Mintzberg's work on organisational structure. The quantitative analysis is formalised using the Object Constraint Language (OCL) and the Unified Modelling Language (UML) and implemented in an enterprise architecture tool.
Determining characteristics of artificial near-Earth objects using observability analysis
NASA Astrophysics Data System (ADS)
Friedman, Alex M.; Frueh, Carolin
2018-03-01
Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
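The basic linear test underlying such analysis stacks the measurement map against the state transition matrix over the observation times and checks the rank of the result. A toy double-integrator sketch, not the orbit-plus-solar-radiation-pressure model of the paper:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])           # state: [position, velocity]
H = np.array([[1.0, 0.0]])           # we measure position only

def observability_rank(A, H, times):
    rows = [H @ expm(A * t) for t in times]   # H * Phi(t_k, t_0)
    return np.linalg.matrix_rank(np.vstack(rows))

print(observability_rank(A, H, [0.0]))        # 1: velocity not yet observable
print(observability_rank(A, H, [0.0, 10.0]))  # 2: full state observable
```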
Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy
NASA Astrophysics Data System (ADS)
Sugiyama, Naruhisa; Shirakawa, Tomohiro
2017-07-01
The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that realizes the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.
Influence of corrosion layers on quantitative analysis
NASA Astrophysics Data System (ADS)
Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Röhrich, J.; Strub, E.
2005-09-01
Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of the objects is required, this has to be done by penetrating the corrosion layer. In this work the influence of corrosion was studied on Chinese and Roman coins, where removal of oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed.
An approach for quantitative image quality analysis for CT
NASA Astrophysics Data System (ADS)
Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe
2016-03-01
An objective and standardized approach to assessing the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end, we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis toolkit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
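As one illustration of how a metric of this kind can be computed, here is a minimal sketch (Python, not the authors' MATLAB toolkit) of estimating the modulation transfer function from a sampled edge-spread function, the standard edge-phantom approach:

```python
import numpy as np
from math import erf

def mtf_from_edge(esf, dx=1.0):
    """Estimate the MTF from a 1-D edge spread function (ESF)."""
    lsf = np.gradient(esf, dx)         # differentiate ESF -> line spread function
    lsf = lsf * np.hanning(lsf.size)   # taper to suppress noise leakage
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                      # normalize to 1 at zero frequency
    freqs = np.fft.rfftfreq(lsf.size, d=dx)
    return freqs, mtf

# Toy edge: an ideal step blurred by a Gaussian system response.
x = np.linspace(-5, 5, 256)
esf = np.array([0.5 * (1 + erf(v / np.sqrt(2))) for v in x])
freqs, mtf = mtf_from_edge(esf, dx=x[1] - x[0])
print(freqs[:3], mtf[:3])   # MTF falls off as a Gaussian for this toy case
```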
Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia
2016-01-01
Introduction/Background: Fuhrman nuclear grade is the most important histological parameter for predicting prognosis in a patient with renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is a measure of the aggressiveness of a tumour and is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser-known marker of proliferation that identifies a greater proliferating fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. Material and Methods: n=50 cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, mouse monoclonal antibody, Dako) and MCM-2 (mouse monoclonal antibody, Thermo) was performed on the paraffin-embedded blocks in the Department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskal-Wallis test was used to determine the correlation of the proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. Results: The labeling index of MCM-2 (median=24.29%) was found to be much higher than that of Ki-67 (median=13.05%). Both markers were significantly related to grade (p=0.00; Kruskal-Wallis test). The LI of MCM-2 was found to correlate significantly with the LI of Ki-67 (r=0.0934; p=0.01, Pearson's correlation). Results of the semi-quantitative analysis correlated well with the quantitative analysis. Conclusion: Both Ki-67 and MCM-2 are markers of proliferation that are closely linked to grade. Therefore, they can act as surrogate markers for grade in a manner that is more objective and reproducible. PMID:27532114
TREX13 data analysis/modeling
Tang, Dajun (DJ); Applied Physics Laboratory, University of Washington, 1013 NE 40th Street, Seattle, WA 98105
2015-09-30
...accuracy in those predictions. With extensive TREX13 data in hand, the objective now shifts to realizing the long-term goals using data analysis and ...be quantitatively addressed. The approach to analysis can be summarized in the following steps: 1. Based on measurements, assess to what degree ...
Miller, Christopher B.; Bartlett, Delwyn J.; Mullins, Anna E.; Dodds, Kirsty L.; Gordon, Christopher J.; Kyle, Simon D.; Kim, Jong Won; D'Rozario, Angela L.; Lee, Rico S.C.; Comas, Maria; Marshall, Nathaniel S.; Yee, Brendon J.; Espie, Colin A.; Grunstein, Ronald R.
2016-01-01
Study Objectives: To empirically derive and evaluate potential clusters of Insomnia Disorder through cluster analysis from polysomnography (PSG). We hypothesized that clusters would differ on neurocognitive performance, sleep-onset measures of quantitative (q)-EEG, and heart rate variability (HRV). Methods: Research volunteers with Insomnia Disorder (DSM-5) completed a neurocognitive assessment, and overnight PSG measures of total sleep time (TST), wake time after sleep onset (WASO), and sleep onset latency (SOL) were used to determine clusters. Results: From 96 volunteers with Insomnia Disorder, cluster analysis derived at least two clusters from objective sleep parameters: Insomnia with normal objective sleep duration (I-NSD: n = 53) and Insomnia with short sleep duration (I-SSD: n = 43). At sleep onset, differences in HRV between I-NSD and I-SSD clusters suggest attenuated parasympathetic activity in I-SSD (P < 0.05). Preliminary work suggested three clusters by retaining the I-NSD and splitting the I-SSD cluster into two: I-SSD A (n = 29), defined by high WASO, and I-SSD B (n = 14), a second I-SSD cluster with high SOL and medium WASO. The I-SSD B cluster performed worse than I-SSD A and I-NSD for sustained attention (P ≤ 0.05). In an exploratory analysis, q-EEG revealed reduced spectral power in I-SSD B before (Delta, Alpha, Beta-1) and after sleep onset (Beta-2) compared to I-SSD A and I-NSD (P ≤ 0.05). Conclusions: Two insomnia clusters derived from cluster analysis differ in sleep-onset HRV. Preliminary data suggest evidence for three clusters in insomnia, with differences in sustained attention and sleep-onset q-EEG. Clinical Trial Registration: Insomnia 100 sleep study: Australia New Zealand Clinical Trials Registry (ANZCTR) identification number 12612000049875. URL: https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=347742. PMID:27568796
A Meta-Analysis of Predictors of Offender Treatment Attrition and Its Relationship to Recidivism
ERIC Educational Resources Information Center
Olver, Mark E.; Stockdale, Keira C.; Wormith, J. Stephen
2011-01-01
Objective: The failure of offenders to complete psychological treatment can pose significant concerns, including increased risk for recidivism. Although a large literature identifying predictors of offender treatment attrition has accumulated, there has yet to be a comprehensive quantitative review. Method: A meta-analysis of the offender…
NASA Astrophysics Data System (ADS)
Qi, Pan; Shao, Wenbin; Liao, Shusheng
2016-02-01
For research on quantitative defect detection in heat transfer tubes in nuclear power plants (NPP), two lines of work were carried out, with cracks as the main research objects. (1) Optimization of calibration tube production. First, ASME, RSEM and homemade crack calibration tubes were applied to quantitatively analyze defect depth on other designed crack test tubes, to judge which calibration tube yields the more accurate quantitative results. On that basis, a weight analysis of the factors influencing quantitative crack depth testing, such as crack orientation, length and volume, was undertaken to optimize the manufacturing technology of calibration tubes. (2) Optimization of quantitative crack depth estimation. A neural network model with multiple calibration curves, adopted to refine the estimated depth of natural cracks generated in in-service tubes, shows a preliminary ability to improve quantitative accuracy.
Leung, Janet T Y; Shek, Daniel T L
2011-01-01
This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision, and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. Given the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in addressing the research inquiry, a mixed methods approach offers a way to integrate quantitative and qualitative approaches and an alternative strategy for studying the impact of economic disadvantage on family processes and adolescent development.
An improved level set method for brain MR images segmentation and bias correction.
Chen, Yunjie; Zhang, Jianwei; Macione, Jim
2009-10-01
Intensity inhomogeneities cause considerable difficulty in the quantitative analysis of magnetic resonance (MR) images. Thus, bias field estimation is a necessary step before quantitative analysis of MR data can be undertaken. This paper presents a variational level set approach to bias correction and segmentation for images with intensity inhomogeneities. Our method is based on the observation that intensities in a relatively small local region are separable, despite the inseparability of the intensities in the whole image caused by the overall intensity inhomogeneity. We first define a localized K-means-type clustering objective function for image intensities in a neighborhood around each point. The cluster centers in this objective function have a multiplicative factor that estimates the bias within the neighborhood. The objective function is then integrated over the entire domain to define the data term in the level set framework. Our method is able to capture bias fields of quite general profiles. Moreover, it is robust to initialization, and thereby allows fully automated application. The proposed method has been used for images of various modalities with promising results.
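In symbols, a localized clustering energy of the kind the abstract describes might read as follows (the notation is illustrative, not necessarily the authors' exact formulation): for image I, bias field b, cluster centers c_k, partition regions Omega_k and a localizing kernel K,

\[
E \;=\; \int \sum_{k=1}^{N} \int_{\Omega_k} K(x - y)\,\bigl|\,I(y) - b(x)\,c_k\,\bigr|^{2}\,dy\,dx ,
\]

so that within each neighborhood the bias value b(x) enters as the multiplicative factor on the cluster centers, and minimizing E jointly over b, the c_k, and the level-set partition yields segmentation and bias correction together.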
Wachowiak, Roman; Strach, Bogna
2006-01-01
The study takes advantage of presently available, effective physicochemical methods (isolation, crystallization, determination of melting point, TLC, GLC and UV spectrophotometry) for an objective and reliable qualitative and quantitative analysis of frequently abused drugs. The authors determined the conditions for qualitative and quantitative analysis of the active components of secured evidence materials containing amphetamine sulphate, methylamphetamine hydrochloride, 3,4-methylenedioxymethamphetamine hydrochloride (MDMA, Ecstasy), as well as delta(9)-tetrahydrocannabinol (delta(9)-THC) as an active component of cannabis (marihuana, hashish). The usefulness of physicochemical tests of evidence materials for opinionating purposes is subject to a detailed forensic toxicological interpretation.
Detection and analysis of high-temperature events in the BIRD mission
NASA Astrophysics Data System (ADS)
Zhukov, Boris; Briess, Klaus; Lorenz, Eckehard; Oertel, Dieter; Skrbek, Wolfgang
2005-01-01
The primary mission objective of a new small Bi-spectral InfraRed Detection (BIRD) satellite is detection and quantitative analysis of high-temperature events like fires and volcanoes. An absence of saturation in the BIRD infrared channels makes it possible to improve false alarm rejection as well as to retrieve quantitative characteristics of hot targets, including their effective fire temperature, area and radiative energy release. Examples are given of detection and analysis of wild and coal seam fires, of volcanic activity, as well as of oil fires in Iraq. The smallest fires detected by BIRD, which were verified on the ground, had an area of 12 m2 in daytime and 4 m2 at night.
Man-machine analysis of translation and work tasks of Skylab films
NASA Technical Reports Server (NTRS)
Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.
1979-01-01
An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.
DOT National Transportation Integrated Search
2017-05-01
In this project, Florida Atlantic University researchers developed a methodology and software tools that allow objective, quantitative analysis of the performance of signal systems. The researchers surveyed the state of practice for traffic signal ...
Frame sequences analysis technique of linear objects movement
NASA Astrophysics Data System (ADS)
Oshchepkova, V. Y.; Berg, I. A.; Shchepkin, D. V.; Kopylova, G. V.
2017-12-01
Obtaining data by noninvasive methods is often needed in many fields of science and engineering. This is achieved through video recording at various frame rates and in various light spectra. In doing so, quantitative analysis of the movement of the objects being studied becomes an important component of the research. This work discusses analysis of the motion of linear objects in the two-dimensional plane. The complexity of this problem increases when the frame contains numerous objects whose images may overlap. This study uses a sequence containing 30 frames at a resolution of 62 × 62 pixels and a frame rate of 2 Hz. It was required to determine the average velocity of object motion. This velocity was found as an average over 8-12 objects, with an error of 15%. After processing, dependencies of the average velocity on the control parameters were found. The processing was performed in the software environment GMimPro with subsequent approximation of the obtained data using the Hill equation.
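An illustrative sketch of the core measurement (not the GMimPro pipeline): given per-frame centroid coordinates for each tracked object, the average speed follows from the frame-to-frame displacements. The `tracks` structure and parameter names here are invented for the example.

```python
import numpy as np

def mean_speed(tracks, fps=2.0, px_size=1.0):
    """Average speed over all tracked objects.

    tracks: dict mapping object id -> sequence of (x, y) centroids, one per frame.
    Returns (mean, std) of per-object speeds in units of px_size per second.
    """
    speeds = []
    for xy in tracks.values():
        xy = np.asarray(xy, dtype=float)
        steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)  # px per frame
        speeds.append(steps.mean() * px_size * fps)
    return float(np.mean(speeds)), float(np.std(speeds))

tracks = {0: [(1, 1), (2, 2), (3, 3)], 1: [(10, 5), (12, 5), (14, 5)]}
print(mean_speed(tracks, fps=2.0))   # averaged over the two toy objects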
Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro
2016-01-01
Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological approach to pathology diagnosis, which can connect these molecular data with clinical diagnosis, is still mostly subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts still remain largely non-quantitative, and their associated data measurements depend on the pathologist's sense and experience. Such features include pleomorphism and heterogeneity. In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity, using a cell-level co-occurrence matrix. Our method is based on the widely used gray-level co-occurrence matrix (GLCM), in which relations between neighboring pixel intensity levels are captured in a co-occurrence matrix, followed by the application of analysis functions such as Haralick features. In a pathological tissue image, through image processing techniques, each nucleus can be measured, and each nucleus has its own measurable features such as size, roundness, contour length, and intra-nucleus texture data (GLCM is one of the methods). In our approach, each nucleus in the tissue image plays the role of one pixel, and the most important point is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, then create the co-occurrence matrix and apply Haralick feature functions. In each image, pleomorphism and heterogeneity are then determined quantitatively. Because one "pixel" corresponds to one nucleus feature, we named our method the Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method for several nucleus features. CFLCM is shown to be a useful quantitative method for assessing pleomorphism and heterogeneity in histopathological image analysis.
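For readers unfamiliar with the pixel-level step that CFLCM generalizes, here is a minimal sketch of GLCM plus Haralick features using scikit-image (the library choice and quantization are mine, not the authors'):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
img = rng.integers(0, 8, size=(64, 64), dtype=np.uint8)  # image quantized to 8 levels

glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                    levels=8, symmetric=True, normed=True)
contrast = graycoprops(glcm, 'contrast').mean()
homogeneity = graycoprops(glcm, 'homogeneity').mean()
print(contrast, homogeneity)

# In CFLCM, each "pixel" would instead be a nucleus carrying a quantized
# feature value (size, roundness, ...), and adjacency would come from one
# of the three nucleus-neighborhood definitions rather than the pixel grid.
```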
Comparative analysis of imaging configurations and objectives for Fourier microscopy.
Kurvits, Jonathan A; Jiang, Mingming; Zia, Rashid
2015-11-01
Fourier microscopy is becoming an increasingly important tool for the analysis of optical nanostructures and quantum emitters. However, achieving quantitative Fourier space measurements requires a thorough understanding of the impact of aberrations introduced by optical microscopes that have been optimized for conventional real-space imaging. Here we present a detailed framework for analyzing the performance of microscope objectives for several common Fourier imaging configurations. To this end, we model objectives from Nikon, Olympus, and Zeiss using parameters that were inferred from patent literature and confirmed, where possible, by physical disassembly. We then examine the aberrations most relevant to Fourier microscopy, including the alignment tolerances of apodization factors for different objective classes, the effect of magnification on the modulation transfer function, and vignetting-induced reductions of the effective numerical aperture for wide-field measurements. Based on this analysis, we identify an optimal objective class and imaging configuration for Fourier microscopy. In addition, the Zemax files for the objectives and setups used in this analysis have been made publicly available as a resource for future studies.
Imaging has enormous untapped potential to improve cancer research through software to extract and process morphometric and functional biomarkers. In the era of non-cytotoxic treatment agents, multi-modality image-guided ablative therapies and rapidly evolving computational resources, quantitative imaging software can be transformative in enabling minimally invasive, objective and reproducible evaluation of cancer treatment response. Post-processing algorithms are integral to high-throughput analysis and fine-grained differentiation of multiple molecular targets.
Surface plasmon resonance microscopy: achieving a quantitative optical response
Peterson, Alexander W.; Halter, Michael; Plant, Anne L.; Elliott, John T.
2016-01-01
Surface plasmon resonance (SPR) imaging allows real-time label-free imaging based on index of refraction, and changes in index of refraction at an interface. Optical parameter analysis is achieved by application of the Fresnel model to SPR data typically taken by an instrument in a prism based configuration. We carry out SPR imaging on a microscope by launching light into a sample, and collecting reflected light through a high numerical aperture microscope objective. The SPR microscope enables spatial resolution that approaches the diffraction limit, and has a dynamic range that allows detection of subnanometer to submicrometer changes in thickness of biological material at a surface. However, unambiguous quantitative interpretation of SPR changes using the microscope system could not be achieved using the Fresnel model because of polarization dependent attenuation and optical aberration that occurs in the high numerical aperture objective. To overcome this problem, we demonstrate a model to correct for polarization diattenuation and optical aberrations in the SPR data, and develop a procedure to calibrate reflectivity to index of refraction values. The calibration and correction strategy for quantitative analysis was validated by comparing the known indices of refraction of bulk materials with corrected SPR data interpreted with the Fresnel model. Subsequently, we applied our SPR microscopy method to evaluate the index of refraction for a series of polymer microspheres in aqueous media and validated the quality of the measurement with quantitative phase microscopy. PMID:27782542
ERIC Educational Resources Information Center
Muller, Veronica; Brooks, Jessica; Tu, Wei-Mo; Moser, Erin; Lo, Chu-Ling; Chan, Fong
2015-01-01
Purpose: The main objective of this study was to determine the extent to which physical and cognitive-affective factors are associated with fibromyalgia (FM) fatigue. Method: A quantitative descriptive design using correlation techniques and multiple regression analysis. The participants consisted of 302 members of the National Fibromyalgia &…
Trade Space Analysis: Rotational Analyst Research Project
2015-09-01
Acronyms: POM, Program Objective Memoranda; PM, Program Manager; RFP, Request for Proposal; ROM, Rough Order Magnitude; RSM, Response Surface Method; RSE, Response Surface Equation. ...response surface method (RSM) / response surface equations (RSEs) as surrogate models. It uses the RSEs with Monte Carlo simulation to quantitatively ...
With the objective of detecting and quantitating low concentrations of perfluorinated carboxylic acids (PFCAs), including perfluorinated octanoic acid (PFOA), in soils, we compared the analytical suitability of liquid chromatography columns containing three different stationary p...
[Reconstituting evaluation methods based on both qualitative and quantitative paradigms].
Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro
2011-01-01
Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.
Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko
2017-07-01
Identification of human semen is indispensable in the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful for detecting human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, for lack of suitable evaluation methods, previous studies provided neither objective, quantitative descriptions of the results nor clear criteria for the decisions reached. In addition, the variety of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. This method enabled objective and specific identification of fluorescent sperm spots and quantitative comparisons of sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated under high-humidity and high-temperature conditions. Semen with quite low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases for application of the kit. Finally, the tolerance of the kit against various acidic and basic environments was analyzed. The validations herein provide useful information, previously unobtainable, for practical applications of the SPERM HY-LITER™ Express kit. Moreover, the versatility of our image analysis method in various complex cases was demonstrated.
Statistical significance of trace evidence matches using independent physicochemical measurements
NASA Astrophysics Data System (ADS)
Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George
1997-02-01
A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when an object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and to provide statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated against these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons is also discussed. This statistical approach should not replace the current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements and the potential for confusion arising from statistical analysis currently hamper this approach, and ways of overcoming these obstacles are presented.
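A hedged sketch of the kind of clustering step described, on synthetic data standing in for refractive index and elemental concentrations (the paper's actual chemometric workflow may differ):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Columns: RI, Mg, Al, Ca, Fe (toy values); two simulated glass sources.
source_a = rng.normal([1.518, 2.1, 0.8, 8.5, 0.3], 0.01, size=(5, 5))
source_b = rng.normal([1.522, 1.4, 1.2, 7.9, 0.5], 0.01, size=(5, 5))
X = np.vstack([source_a, source_b])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # autoscale each variable

Z = linkage(X, method='ward')              # agglomerative (Ward) clustering
labels = fcluster(Z, t=2, criterion='maxclust')
print(labels)   # fragments from the same source should share a cluster label
```

With a population database in place, the distance between a questioned fragment and a known source, relative to within-source spread, gives the quantitative measure of significance the abstract argues for.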
ERIC Educational Resources Information Center
Plonsky, Luke; Gonulal, Talip
2015-01-01
Research synthesis and meta-analysis provide a pathway to bring together findings in a given domain with greater systematicity, objectivity, and transparency than traditional reviews. The same techniques and corresponding benefits can be and have been applied to examine methodological practices in second language (L2) research (e.g., Plonsky,…
Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter
2012-04-01
Coordination between perception and action is required to interact with the environment successfully. This is already trained by very young infants, who perform spontaneous movements to learn how their body interacts with the environment. The strategies used by infants for this purpose change with age. Therefore, very early progress in action control made by infants can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced which allows the quantitative evaluation of the development of spontaneous motor activity in newborns. The introduced methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies like infantile cerebral palsy influence the development of motor activity significantly. Since the introduced methodology is objective and quantitative, it is suitable for monitoring how newborns train the cognitive processes that will enable them to cope with their environment through motor interaction.
Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He
2017-07-01
This study attempts to evaluate the quality of Chinese formula granules by the combined use of multi-component simultaneous quantitative analysis and bioassay, with rhubarb dispensing granules as the model drug for a demonstrative study. An ultra-high performance liquid chromatography (UPLC) method was adopted for simultaneous quantitative determination of 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules. The purgative biopotency of different batches of rhubarb dispensing granules was determined based on a compound diphenoxylate tablets-induced mouse constipation model, and the blood-activating biopotency was determined based on an in vitro rat antiplatelet aggregation model. SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and the purgative and blood-activating biopotencies. The results of the multi-component simultaneous quantitative analysis showed a great difference in chemical characterization and certain differences in purgative and blood-activating biopotency among the 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.01), and the intensity of blood-activating biopotency was significantly correlated with the content of free anthraquinone (P<0.01). In summary, the combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality differences among batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.
The SCHEIE Visual Field Grading System
Sankar, Prithvi S.; O’Keefe, Laura; Choi, Daniel; Salowe, Rebecca; Miller-Ellis, Eydie; Lehman, Amanda; Addis, Victoria; Ramakrishnan, Meera; Natesh, Vikas; Whitehead, Gideon; Khachatryan, Naira; O’Brien, Joan
2017-01-01
Objective: No method of grading visual field (VF) defects has been widely accepted throughout the glaucoma community. The SCHEIE (Systematic Classification of Humphrey visual fields-Easy Interpretation and Evaluation) grading system for glaucomatous visual fields was created to convey qualitative and quantitative information regarding visual field defects in an objective, reproducible, and easily applicable manner for research purposes. Methods: The SCHEIE grading system is composed of a qualitative and a quantitative score. The qualitative score consists of designation in one or more of the following categories: normal, central scotoma, paracentral scotoma, paracentral crescent, temporal quadrant, nasal quadrant, peripheral arcuate defect, expansive arcuate, or altitudinal defect. The quantitative component incorporates the Humphrey visual field index (VFI), location of visual defects for superior and inferior hemifields, and blind spot involvement. Accuracy and speed at grading using the qualitative and quantitative components were calculated for non-physician graders. Results: Graders had a median accuracy of 96.67% for their qualitative scores and a median accuracy of 98.75% for their quantitative scores. Graders took a mean of 56 seconds per visual field to assign a qualitative score and 20 seconds per visual field to assign a quantitative score. Conclusion: The SCHEIE grading system is a reproducible tool that combines qualitative and quantitative measurements to grade glaucomatous visual field defects. The system aims to standardize clinical staging and to make specific visual field defects more easily identifiable. Specific patterns of visual field loss may also be associated with genetic variants in future genetic analysis. PMID:28932621
Cardiac imaging: working towards fully-automated machine analysis & interpretation.
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-03-01
Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.
Benefit-risk analysis : a brief review and proposed quantitative approaches.
Holden, William L
2003-01-01
Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
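To make the RV-NNT decision rule concrete, here is a small piece of illustrative arithmetic with hypothetical numbers; the exact form of the relative-value adjustment in the paper may differ from this sketch.

```python
def nnt(arr):
    """Number needed to treat = 1 / absolute risk reduction (benefit)."""
    return 1.0 / arr

def nnh(ari):
    """Number needed to harm = 1 / absolute risk increase (adverse event)."""
    return 1.0 / ari

benefit_arr = 0.20      # hypothetical: 20% more patients respond on the new drug
harm_ari = 0.05         # hypothetical: 5% more patients get the adverse event
relative_value = 0.4    # hypothetical patient-derived weight of harm vs. benefit

# Discounting the harm by its relative value inflates the adjusted NNH.
rv_nnh = nnh(harm_ari) / relative_value
print(nnt(benefit_arr), rv_nnh)   # 5.0 vs. 50.0
# Treatment is favored when NNT < RV-adjusted NNH, as is the case here.
```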
Normalized Temperature Contrast Processing in Infrared Flash Thermography
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2016-01-01
The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, as are methods of converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves a flash thermography data acquisition set-up with a high-reflectivity foil and high-emissivity tape such that the foil, tape and test object are imaged simultaneously. Methods of assessing other quantitative parameters, such as the emissivity of the object, afterglow heat flux, reflection temperature change and surface temperature during flash thermography, are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.
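A minimal sketch of one common normalized-contrast definition used in flash thermography; this is a generic formulation for illustration, not necessarily the paper's exact one. The idea is that normalizing the defect-pixel temperature rise by a sound-region rise cancels pre-flash offsets and flash-energy variations.

```python
import numpy as np

def normalized_contrast(T_def, T_ref, T_pre_def, T_pre_ref):
    """T_def/T_ref: temperature-vs-time arrays for defect and sound pixels;
    T_pre_*: their pre-flash (ambient) temperatures."""
    rise_def = T_def - T_pre_def
    rise_ref = T_ref - T_pre_ref
    return (rise_def - rise_ref) / rise_ref

t = np.linspace(0.01, 2.0, 50)
T_ref = 20 + 5.0 / np.sqrt(t)   # ~1/sqrt(t) cooling of a sound area after flash
T_def = 20 + 5.5 / np.sqrt(t)   # a subsurface flaw slows the cooling
print(normalized_contrast(T_def, T_ref, 20.0, 20.0)[:3])  # positive contrast
```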
HOLST, Alexandra Ioana; HOLST, Stefan; HIRSCHFELDER, Ursula; von SECKENDORFF, Volker
2012-01-01
Objective: The objective of this study was to investigate the applicability of micro-analytical methods with high spatial resolution to the characterization of the composition and corrosion behavior of two bracket systems. Material and methods: The surfaces of six nickel-free brackets and six nickel-containing brackets were examined for signs of corrosion, with qualitative surface analysis using an electron probe microanalyzer (EPMA), prior to bonding to patients' tooth surfaces and four months after clinical use. The surfaces were characterized qualitatively by secondary electron (SE) images and back-scattered electron (BSE) images in both compositional and topographical mode. Qualitative and quantitative wavelength-dispersive analyses were performed for different elements, and by utilizing qualitative analysis the relative concentration of selected elements was mapped two-dimensionally. The absolute concentration of the elements was determined in specially prepared brackets by quantitative analysis using pure element standards for calibration and calculated correction factors (ZAF). Results: Clear differences were observed between the bracket types. The nickel-containing stainless steel brackets consist of two separate pieces joined by a brazing alloy. Compositional analysis revealed two different alloy compositions and reaction zones on both sides of the brazing alloy. The nickel-free bracket was a single piece with only slight variation in element concentration, but had a significantly rougher surface. After clinical use, no corrosive phenomena were detectable with the methods applied. Traces of intraoral wear at the contact areas between the bracket slot and the arch wire were verified. Conclusion: Electron probe microanalysis is a valuable tool for the characterization of element distribution and for quantitative analysis in corrosion studies. PMID:23032212
Fortin, Carole; Ehrmann Feldman, Debbie; Cheriet, Farida; Labelle, Hubert
2013-08-01
The objective of this study was to explore whether differences in the standing and sitting postures of youth with idiopathic scoliosis could be detected through quantitative analysis of digital photographs. Standing and sitting postures of 50 participants aged 10 to 20 years with idiopathic scoliosis (Cobb angle: 15° to 60°) were assessed from digital photographs using a posture evaluation software program. Based on the XY coordinates of markers, 13 angular and linear posture indices were calculated in both positions. Paired t-tests were used to compare the values of standing and sitting posture indices. Significant differences between the standing and sitting positions (p < 0.05) were found for head protraction, shoulder elevation, scapula asymmetry, trunk list, scoliosis angle, waist angles, and frontal and sagittal plane pelvic tilt. Quantitative analysis of digital photographs is a clinically feasible method to measure standing and sitting postures among youth with scoliosis and to assist in decisions on therapeutic interventions.
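As a hypothetical illustration of how one angular index can be derived from digitized marker XY coordinates (the marker names and sign convention here are mine, not the study's protocol):

```python
import numpy as np

def tilt_from_vertical(p_upper, p_lower):
    """Angle in degrees between the segment p_lower -> p_upper and vertical."""
    dx, dy = np.subtract(p_upper, p_lower)
    return np.degrees(np.arctan2(dx, dy))

# Toy markers in image coordinates (y increasing upward), e.g. C7 and S1
# for a frontal-plane trunk-list index.
c7 = (102.0, 250.0)
s1 = (100.0, 150.0)
print(tilt_from_vertical(c7, s1))   # ~1.15 degrees of trunk list
```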
DOT National Transportation Integrated Search
2010-01-01
The Smart Grid is a cyber-physical system comprised of physical components, such as transmission lines and generators, and a : network of embedded systems deployed for their cyber control. Our objective is to qualitatively and quantitatively analyze ...
Impact of antimicrobial use during beef production on fecal occurrence of antimicrobial resistance
USDA-ARS?s Scientific Manuscript database
Objective: To determine the impact of typical antimicrobial use during cattle production on fecal occurrence of antimicrobial resistance by culture, quantitative PCR, and metagenomic sequencing. Experimental Design & Analysis: Feces were recovered from colons of 36 lots of "conventional" (CONV) ca...
Sensorized toys for measuring manipulation capabilities of infants at home.
Passetti, Giovanni; Cecchi, Francesca; Baldoli, Ilaria; Sgandurra, Giuseppina; Beani, Elena; Cioni, Giovanni; Laschi, Cecilia; Dario, Paolo
2015-01-01
Preterm infants, i.e. babies born after a gestation period shorter than 37 weeks, spend less time exploring objects. The quantitative measurement of grasping actions and forces in infants can give insights into their typical or atypical motor development. The aim of this work was to test a new tool, a kit of sensorized toys, to longitudinally measure, monitor and promote preterm infants' manipulation capabilities with a purposive training in an ecological environment. This study presents a preliminary analysis of grasping activity. Three preterm infants performed 4 weeks of daily training at home. Sensorized toys with embedded pressure sensors were used as part of the training to allow quantitative analysis of grasping (pressure and acceleration applied to the toys while playing). Each toy was placed on the midline, with the infant in supine position. Preliminary data show differences in the grasping parameters in relation to infants' age and the daily training performed. An ongoing clinical trial will allow a full validation of this new tool for promoting object exploration in preterm infants.
Trojan, Hilda, and Cybele asteroids - New lightcurve observations and analysis
NASA Technical Reports Server (NTRS)
Binzel, Richard P.; Sauter, Linda M.
1992-01-01
Lightcurve observations of 23 Trojan, Hilda, and Cybele asteroids are presently subjected to a correction procedure for multiple-aspect lightcurves, followed by a quantitative, bias-corrected analysis of lightcurve amplitude distributions for all published data on these asteroids. While the largest Trojans are found to have a higher mean-lightcurve amplitude than their low-albedo, main-belt counterparts, the smaller Trojans and all Hildas and Cybeles display lightcurve properties resembling main-belt objects. Only the largest Trojans have retained their initial forms after subsequent collisional evolution; 90 km may accordingly represent a transitional magnitude between primordial objects and collision fragments.
Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong
2015-01-01
Objectives: This study presents indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. Methods: The results collected were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. Results: These weights may prove useful in avoiding recourse to qualitative judgment when integrating the results of indicator-level quantitative assessments, which would otherwise be necessary in the absence of weights between indicators. Conclusions: This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to set the future direction for their quantitative assessment. PMID:26206364
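For reference, the analytic hierarchy process (AHP) weighting step the study describes reduces, in its textbook form, to the principal eigenvector of a pairwise-comparison matrix. The 3x3 matrix below is made up for illustration and is not the study's data.

```python
import numpy as np

# A[i, j] encodes how much more important indicator i is than indicator j
# on Saaty's 1-9 scale; the matrix is reciprocal by construction.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                     # principal eigenvector -> indicator weights
print(w)                         # ~[0.63, 0.26, 0.11]

# Consistency index / ratio (RI = 0.58 for n = 3; CR < 0.1 is conventional).
ci = (eigvals.real.max() - 3) / (3 - 1)
print(ci / 0.58)
```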
A clustering approach to segmenting users of internet-based risk calculators.
Harle, C A; Downs, J S; Padman, R
2011-01-01
Risk calculators are widely available Internet applications that deliver quantitative health risk estimates to consumers. Although these tools are known to have varying effects on risk perceptions, little is known about who is more likely to accept objective risk estimates. The objective of this study was to identify clusters of online health consumers that help explain variation in individual improvement in risk perceptions from web-based quantitative disease risk information. A secondary analysis was performed on data collected in a field experiment that measured people's pre-diabetes risk perceptions before and after visiting a realistic health promotion website that provided quantitative risk information. K-means clustering was performed on numerous candidate variable sets, and the different segmentations were evaluated based on between-cluster variation in risk perception improvement. Variation in responses to risk information was best explained by clustering on pre-intervention absolute pre-diabetes risk perceptions and an objective estimate of personal risk. Members of a high-risk overestimator cluster showed large improvements in their risk perceptions, but clusters of both moderate-risk and high-risk underestimators were much more muted in improving their optimistically biased perceptions. Cluster analysis provided a unique approach for segmenting health consumers and predicting their acceptance of quantitative disease risk information. These clusters suggest that health consumers were very responsive to good news but tended not to incorporate bad news into their self-perceptions. These findings help to quantify variation among online health consumers and may inform the targeted marketing of, and improvements to, risk communication tools on the Internet.
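A sketch of the kind of K-means segmentation described, on synthetic data (scikit-learn; the variable set mirrors the abstract's best-performing pair of perceived and objective risk, but the numbers are invented):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
perceived = rng.uniform(0, 100, 200)    # self-rated absolute risk (%)
objective = rng.uniform(0, 60, 200)     # calculator-estimated risk (%)
X = np.column_stack([perceived, objective])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for c in range(3):
    sel = km.labels_ == c
    gap = (perceived[sel] - objective[sel]).mean()
    print(f"cluster {c}: n={sel.sum()}, mean perception gap={gap:+.1f}")
# Positive gaps mark overestimators, negative gaps underestimators.
```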
NASA Astrophysics Data System (ADS)
Lorenzetti, G.; Foresta, A.; Palleschi, V.; Legnaioli, S.
2009-09-01
The recent development of mobile instrumentation specifically devoted to in situ analysis and study of museum objects allows the acquisition of many LIBS spectra in a very short time. However, such a large amount of data calls for new analytical approaches that guarantee prompt analysis of the results obtained. In this communication, we present and discuss the advantages of statistical analytical methods, such as Partial Least Squares multiple regression algorithms, over the classical calibration curve approach. PLS algorithms make it possible to obtain information on the composition of the objects under study in real time; this feature of the method, compared to traditional off-line analysis of the data, is extremely useful for optimizing the measurement times and the number of points associated with the analysis. In fact, the real-time availability of compositional information makes it possible to concentrate attention on the most 'interesting' parts of the object, without over-sampling zones that would not provide useful information for scholars or conservators. Some examples of applications of this method are presented, including studies recently performed by the researchers of the Applied Laser Spectroscopy Laboratory on museum bronze objects.
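A hedged sketch of PLS calibration of the kind described, with synthetic arrays standing in for LIBS spectra and reference concentrations (scikit-learn is my choice of tool, not necessarily the laboratory's):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
n_standards, n_channels = 20, 500
spectra = rng.normal(size=(n_standards, n_channels))       # stand-in spectra
true_w = np.zeros(n_channels)
true_w[[50, 200, 350]] = [2.0, -1.0, 1.5]                   # "emission lines"
conc = spectra @ true_w + rng.normal(scale=0.1, size=n_standards)  # e.g. %Cu

pls = PLSRegression(n_components=3).fit(spectra, conc)      # calibration step
unknown = rng.normal(size=(1, n_channels))
print(pls.predict(unknown))   # immediate composition estimate for a new spectrum
```

Once the model is fit on calibration standards, each newly acquired spectrum yields a composition estimate in milliseconds, which is what enables the real-time, in situ workflow the abstract describes.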
Practical considerations of image analysis and quantification of signal transduction IHC staining.
Grunkin, Michael; Raundahl, Jakob; Foged, Niels T
2011-01-01
The dramatic increase in computer processing power in combination with the availability of high-quality digital cameras during the last 10 years has fertilized the grounds for quantitative microscopy based on digital image analysis. With the present introduction of robust scanners for whole slide imaging in both research and routine, the benefits of automation and objectivity in the analysis of tissue sections will be even more obvious. For in situ studies of signal transduction, the combination of tissue microarrays, immunohistochemistry, digital imaging, and quantitative image analysis will be central operations. However, immunohistochemistry is a multistep procedure including a lot of technical pitfalls leading to intra- and interlaboratory variability of its outcome. The resulting variations in staining intensity and disruption of original morphology are an extra challenge for the image analysis software, which therefore preferably should be dedicated to the detection and quantification of histomorphometrical end points.
Ghosh, Debasree; Chattopadhyay, Parimal
2012-06-01
The objective of the work was to use quantitative descriptive analysis (QDA) to describe the sensory attributes of fermented food products prepared with the incorporation of lactic cultures. Panellists were selected and trained to evaluate various attributes, especially color and appearance, body texture, flavor, overall acceptability and acidity, of fermented food products such as cow milk curd and soymilk curd, idli, sauerkraut and probiotic ice cream. Principal component analysis (PCA) identified six significant principal components that accounted for more than 90% of the variance in the sensory attribute data. Overall product quality was modelled as a function of the principal components using multiple least squares regression (R(2) = 0.8). The results from PCA were statistically analyzed by analysis of variance (ANOVA). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the fermented food product attributes that are important for consumer acceptability.
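A minimal sketch of the PCA step on a QDA panel matrix (rows are products, columns are mean panel scores per attribute); the data and attribute names below are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
attributes = ["color", "texture", "flavor", "acidity", "acceptability"]
scores = rng.uniform(1, 9, size=(12, len(attributes)))  # 12 toy products

X = StandardScaler().fit_transform(scores)   # standardize each attribute
pca = PCA().fit(X)
cum = np.cumsum(pca.explained_variance_ratio_)
print([f"PC{i+1}: {v:.0%}" for i, v in enumerate(cum)])
# Retain the leading components that together explain ~90% of the variance,
# then regress overall quality on those component scores.
```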
Quantitative imaging of aggregated emulsions.
Penfold, Robert; Watson, Andrew D; Mackie, Alan R; Hibberd, David J
2006-02-28
Noise reduction, restoration, and segmentation methods are developed for the quantitative structural analysis in three dimensions of aggregated oil-in-water emulsion systems imaged by fluorescence confocal laser scanning microscopy. Mindful of typical industrial formulations, the methods are demonstrated for concentrated (30% volume fraction) and polydisperse emulsions. Following a regularized deconvolution step using an analytic optical transfer function and appropriate binary thresholding, novel application of the Euclidean distance map provides effective discrimination of closely clustered emulsion droplets with size variation over at least 1 order of magnitude. The a priori assumption of spherical nonintersecting objects provides crucial information to combat the ill-posed inverse problem presented by locating individual particles. Position coordinates and size estimates are recovered with sufficient precision to permit quantitative study of static geometrical features. In particular, aggregate morphology is characterized by a novel void distribution measure based on the generalized Apollonius problem. This is also compared with conventional Voronoi/Delauney analysis.
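A sketch of the Euclidean-distance-map step that makes closely clustered droplets separable, here paired with a watershed split (SciPy/scikit-image; the toy image and parameters are illustrative, not the paper's pipeline):

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Toy binary image: two overlapping "droplets".
yy, xx = np.mgrid[0:60, 0:60]
binary = (((xx - 22) ** 2 + (yy - 30) ** 2) < 144) | \
         (((xx - 38) ** 2 + (yy - 30) ** 2) < 144)

edm = ndi.distance_transform_edt(binary)      # Euclidean distance map
lbl, _ = ndi.label(binary)
peaks = peak_local_max(edm, labels=lbl, min_distance=5)  # one peak per droplet
markers = np.zeros_like(binary, dtype=int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
labels = watershed(-edm, markers, mask=binary)  # split the touching droplets
print(labels.max())                              # ~2 droplets recovered
```

The peak positions approximate droplet centers and the distance values at the peaks approximate radii, which is the position-and-size information the quantitative aggregate analysis then consumes.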
Quantitative petri net model of gene regulated metabolic networks in the cell.
Chen, Ming; Hofestädt, Ralf
2011-01-01
A method to exploit hybrid Petri nets (HPN) for quantitatively modeling and simulating gene regulated metabolic networks is demonstrated. A global kinetic modeling strategy and Petri net modeling algorithm are applied to perform the bioprocess functioning and model analysis. With the model, the interrelations between pathway analysis and metabolic control mechanism are outlined. Diagrammatical results of the dynamics of metabolites are simulated and observed by implementing a HPN tool, Visual Object Net ++. An explanation of the observed behavior of the urea cycle is proposed to indicate possibilities for metabolic engineering and medical care. Finally, the perspective of Petri nets on modeling and simulation of metabolic networks is discussed.
Research Analysis on MOOC Course Dropout and Retention Rates
ERIC Educational Resources Information Center
Gomez-Zermeno, Marcela Gerogina; Aleman de La Garza, Lorena
2016-01-01
This research's objective was to identify the terminal efficiency of the Massive Open Online Course "Educational Innovation with Open Resources" offered by a Mexican private university. A quantitative methodology was used, combining descriptive statistics and probabilistic models to analyze the levels of retention, completion, and…
ERIC Educational Resources Information Center
Chiu, Chung-Yi; Lynch, Ruth Torkelson; Chan, Fong; Rose, Lindsey
2012-01-01
The main objective of this study was to evaluate the health action process approach (HAPA) as a motivational model for dietary self-management for people with multiple sclerosis (MS). Quantitative descriptive research design using path analysis was used. Participants were 209 individuals with MS recruited from the National MS Society and a…
ERIC Educational Resources Information Center
Fisher, Aaron J.; Newman, Michelle G.; Molenaar, Peter C. M.
2011-01-01
Objective: The present article aimed to demonstrate that the establishment of dynamic patterns during the course of psychotherapy can create attractor states for continued adaptive change following the conclusion of treatment. Method: This study is a secondary analysis of T. D. Borkovec and E. Costello (1993). Of the 55 participants in the…
ERIC Educational Resources Information Center
Nigg, Joel T.; Lewis, Kara; Edinger, Tracy; Falk, Michael
2012-01-01
Objective: The role of diet and of food colors in attention-deficit/hyperactivity disorder (ADHD) or its symptoms warrants updated quantitative meta-analysis, in light of recent divergent policy in Europe and the United States. Method: Studies were identified through a literature search using the PubMed, Cochrane Library, and PsycNET databases…
ERIC Educational Resources Information Center
Keegan, John P.; Chan, Fong; Ditchman, Nicole; Chiu, Chung-Yi
2012-01-01
The main objective of this study was to validate Pender's Health Promotion Model (HPM) as a motivational model for exercise/physical activity self-management for people with spinal cord injuries (SCIs). Quantitative descriptive research design using hierarchical regression analysis (HRA) was used. A total of 126 individuals with SCI were recruited…
Analysis and Derivation of Allocations for Fiber Contaminants in Liquid Bipropellant Systems
NASA Technical Reports Server (NTRS)
Lowrey, N. M.; Ibrahim, K. Y.
2012-01-01
An analysis was performed to identify the engineering rationale for the existing particulate limits in MSFC-SPEC-164, Cleanliness of Components for Use in Oxygen, Fuel, and Pneumatic Systems, determine the applicability of this rationale to fibers, identify potential risks that may result from fiber contamination in liquid oxygen/fuel bipropellant systems, and bound each of these risks. The objective of this analysis was to determine whether fiber contamination exceeding the established quantitative limits for particulate can be tolerated in these systems and, if so, to derive and recommend quantitative allocations for fibers beyond the limits established for other particulate. Knowledge gaps were identified that limit a complete understanding of the risk of promoted ignition from an accumulation of fibers in a gaseous oxygen system.
Accuracy of a remote quantitative image analysis in the whole slide images.
Słodkowska, Janina; Markiewicz, Tomasz; Grala, Bartłomiej; Kozłowski, Wojciech; Papierz, Wielisław; Pleskacz, Katarzyna; Murawski, Piotr
2011-03-30
The rationale for choosing a remote quantitative method to support a diagnostic decision requires some empirical studies and knowledge of scenarios, including valid telepathology standards. Tumours of the central nervous system [CNS] are graded on the basis of morphological features and the Ki-67 labelling index [Ki-67 LI]. Various methods have been applied for Ki-67 LI estimation. Recently we introduced the Computerized Analysis of Medical Images [CAMI] software for automated Ki-67 LI counting in digital images. The aim of our study was to explore the accuracy and reliability of remote assessment of Ki-67 LI with CAMI software applied to whole slide images [WSI]. The WSI representing CNS tumours (18 meningiomas and 10 oligodendrogliomas) were stored on a server at the Warsaw University of Technology. The digital copies of entire glass slides were created automatically by an Aperio ScanScope CS with a 20x or 40x objective. Aperio's ImageScope software provided functionality for remote viewing of WSI. The Ki-67 LI assessment was carried out within 2 out of 20 selected fields of view (40x objective) representing the highest labelling areas in each WSI. Ki-67 LI counting was performed by 3 methods: 1) manual reading in the light microscope (LM), 2) automated counting with CAMI software on the digital images (DI), and 3) remote quantitation on the WSI (the WSI method). The quality of the WSI and the technical efficiency of the on-line system were analysed. A comparative statistical analysis was performed on the results obtained by the 3 methods of Ki-67 LI counting. The preliminary analysis showed that in 18% of WSI the Ki-67 LI results differed from those obtained by the other 2 counting methods when the quality of the glass slides was below the standard range. The results of our investigations indicate that remote automated Ki-67 LI analysis performed with the CAMI algorithm on whole slide images of meningiomas and oligodendrogliomas could be successfully used as an alternative to manual reading as well as to digital image quantitation with CAMI software. Our observations also indicate that remote supervision/consultation and training are necessary for the effective use of remote quantitative analysis of WSI.
NASA Astrophysics Data System (ADS)
Chien, Kuang-Che Chang; Tu, Han-Yen; Hsieh, Ching-Huang; Cheng, Chau-Jern; Chang, Chun-Yen
2018-01-01
This study proposes a regional fringe analysis (RFA) method to detect the regions of a target object in captured shifted images to improve depth measurement in phase-shifting fringe projection profilometry (PS-FPP). In the RFA method, region-based segmentation is exploited to segment the de-fringed image of a target object, and a multi-level fuzzy-based classification with five presented features is used to analyze and discriminate the regions of an object from the segmented regions, which were associated with explicit fringe information. Then, in the experiment, the performance of the proposed method is tested and evaluated on 26 test cases made of five types of materials. The qualitative and quantitative results demonstrate that the proposed RFA method can effectively detect the desired regions of an object to improve depth measurement in the PS-FPP system.
Quantitative pathology in virtual microscopy: history, applications, perspectives.
Kayser, Gian; Kayser, Klaus
2013-07-01
With the emerging success of commercially available personal computers and the rapid progress in the development of information technologies, morphometric analyses of static histological images have been introduced to improve our understanding of the biology of diseases such as cancer. The first applications were quantifications of immunohistochemical expression patterns. In addition to object counting and feature extraction, laws of thermodynamics have been applied in morphometric calculations termed syntactic structure analysis. Here, one has to consider that the information of an image can be calculated for separate hierarchical layers such as single pixels, clusters of pixels, segmented small objects, clusters of small objects, and objects of higher order composed of several small objects. Using syntactic structure analysis in histological images, functional states can be extracted and the efficiency of labor in tissues can be quantified. Image standardization procedures, such as shading correction and color normalization, can overcome artifacts blurring clear thresholds. Morphometric techniques are not only useful for learning more about the biological features of growth patterns; they can also be helpful in routine diagnostic pathology. In such cases, entropy calculations are applied in analogy to theoretical considerations concerning information content. Thus, regions with high information content can automatically be highlighted. Analysis of the "regions of high diagnostic value" can deliver, in the context of clinical information, site of involvement, and patient data (e.g., age, sex), support for histopathological differential diagnoses. It can be expected that quantitative virtual microscopy will open new possibilities for automated histological support. Automated integrated quantification of histological slides also serves quality assurance. The development and theoretical background of morphometric analyses in histopathology are reviewed, as well as their application and potential future implementation in virtual microscopy.
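The entropy-based highlighting described above is easy to prototype. The sketch below is an illustration, not the authors' implementation: it computes a local Shannon entropy map over a greyscale field and crudely flags high-information regions. The random stand-in image, window radius, and threshold are assumptions; in practice the input would be a histology image.

```python
import numpy as np
from skimage.filters.rank import entropy
from skimage.morphology import disk
from skimage.util import img_as_ubyte

# Random data keeps the sketch self-contained; a real input would be a
# greyscale histology field converted to 8-bit.
img = img_as_ubyte(np.random.rand(256, 256))
ent = entropy(img, disk(15))            # local Shannon entropy (bits) per pixel
mask = ent > ent.mean() + ent.std()     # crude "high diagnostic value" candidates
print(mask.mean())                      # fraction of pixels flagged
```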
The Evolution of 3D Microimaging Techniques in Geosciences
NASA Astrophysics Data System (ADS)
Sahagian, D.; Proussevitch, A.
2009-05-01
In the analysis of geomaterials, it is essential to be able to analyze internal structures on a quantitative basis. Techniques have evolved from rough qualitative methods to highly accurate quantitative methods coupled with 3-D numerical analysis. The earliest primitive method for "seeing" what was inside a rock was multiple sectioning to produce a series of image slices. This technique typically completely destroyed the sample being analyzed. Another destructive method was developed to give more detailed quantitative information by forming plastic casts of internal voids in sedimentary and volcanic rocks. For this, voids were filled with plastic and the rock dissolved away with HF to reveal plastic casts of internal vesicles. Later, new approaches to stereology were developed to extract 3D information from 2D cross-sectional images. This has long been possible for spheres because the probability distribution for cutting a sphere along any small circle is known analytically (greatest probability is near the equator). However, large numbers of objects are required for statistical validity, and geomaterials are seldom spherical, so crystals, vesicles, and other inclusions would need a more sophisticated approach. Consequently, probability distributions were developed using numerical techniques for rectangular solids and various ellipsoids so that stereological techniques could be applied to these. The "holy grail" has always been to obtain 3D quantitative images non-destructively. A key method is Computed X-ray Tomography (CXT), in which attenuation of X-rays is recorded as a function of angular position in a cylindrical sample, providing a 2D "slice" of the interior. When a series of these "slices" is stacked (in increments equivalent with the resolution of the X-ray to make cubic voxels), a 3D image results with quantitative information regarding internal structure, particle/void volumes, nearest neighbors, coordination numbers, preferred orientations, etc. CXT can be done at three basic levels of resolution, with "normal" x-rays providing tens of microns resolution, synchrotron sources providing single to few microns, and emerging XuM techniques providing a practical 300 nm and theoretical 60 nm. The main challenges in CXT imaging have been in segmentation, which delineates material boundaries, and object recognition (registration), in which the individual objects within a material are identified. The former is critical in quantifying object volume, while the latter is essential for preventing the false appearance of individual objects as a continuous structure. Additional new techniques are now being developed to enhance resolution and provide more detailed analysis without the complex infrastructure needed for CXT. One such method is Laser Scanning Confocal Microscopy, in which a laser is reflected from individual interior surfaces of a fluorescing material, providing a series of sharp images of internal slices with quantitative information available, just as in x-ray tomography, after "z-stacking" of planes of pixels. Another novel approach is the use of Stereo Scanning Electron Microscopy to create digital elevation models of 3D surficial features such as partial bubble margins on the surfaces of fine volcanic ash particles. As other novel techniques emerge, new opportunities will be presented to the geological research community to obtain ever more detailed and accurate information regarding the interior structure of geomaterials.
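The stereological point above, that random planar sections of a sphere most probably cut near the equator, can be checked numerically. The following minimal sketch samples uniformly offset cutting planes through a sphere of radius R and compares the section-radius histogram with the analytic density f(r) = r / (R * sqrt(R^2 - r^2)).

```python
import numpy as np

rng = np.random.default_rng(0)
R = 1.0
h = rng.uniform(0.0, R, 100_000)      # random plane offset from the centre
r = np.sqrt(R**2 - h**2)              # radius of the circular section

hist, edges = np.histogram(r, bins=50, density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
analytic = mid / (R * np.sqrt(R**2 - mid**2))   # f(r) = r / (R*sqrt(R^2 - r^2))
# `hist` tracks `analytic`: both rise sharply toward r = R, i.e. sections
# near the equator are the most probable.
print(hist[-5:], analytic[-5:])
```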
Yang, Lixia; Mu, Yuming; Quaglia, Luiz Augusto; Tang, Qi; Guan, Lina; Wang, Chunmei; Shih, Ming Chi
2012-01-01
The study aim was to compare two different stress echocardiography interpretation techniques based on their correlation with thrombosis in myocardial infarction (TIMI) flow grading in acute coronary syndrome (ACS) patients. Forty-one patients with suspected ACS were studied before diagnostic coronary angiography with myocardial contrast echocardiography (MCE) at rest and at stress. The correlation of visual interpretation of MCE with TIMI flow grade was significant, as was the correlation of quantitative analysis (myocardial perfusion parameters: A, β, and A × β) with TIMI flow grade. MCE visual interpretation and TIMI flow grade had a high degree of agreement in diagnosing myocardial perfusion abnormality. If one considers TIMI flow grade <3 as abnormal, MCE visual interpretation at rest had 73.1% accuracy with 58.2% sensitivity and 84.2% specificity, and at stress had 80.4% accuracy with 76.6% sensitivity and 83.3% specificity. MCE quantitative analysis had better accuracy, with 100% agreement across the different levels of TIMI flow grading. MCE quantitative analysis at stress showed a direct correlation with TIMI flow grade, more significant than that of the visual interpretation technique. Further studies could measure the clinical relevance of this more objective approach to managing acute coronary syndrome patients before percutaneous coronary intervention (PCI). PMID:22778555
Allahdina, Ali M; Stetson, Paul F; Vitale, Susan; Wong, Wai T; Chew, Emily Y; Ferris, Fredrick L; Sieving, Paul A; Cukras, Catherine
2018-04-01
As optical coherence tomography (OCT) minimum intensity (MI) analysis provides a quantitative assessment of changes in the outer nuclear layer (ONL), we evaluated the ability of OCT-MI analysis to detect hydroxychloroquine toxicity. Fifty-seven predominantly female participants (91.2% female; mean age, 55.7 ± 10.4 years; mean time on hydroxychloroquine, 15.0 ± 7.5 years) were enrolled in a case-control study and categorized into affected (i.e., with toxicity, n = 19) and unaffected (n = 38) groups using objective multifocal electroretinographic (mfERG) criteria. Spectral-domain OCT scans of the macula were analyzed and OCT-MI values quantitated for each subfield of the Early Treatment Diabetic Retinopathy Study (ETDRS) grid. A two-sample U-test and a cross-validation approach were used to assess the sensitivity and specificity of toxicity detection according to OCT-MI criteria. The medians of the OCT-MI values in all nine of the ETDRS subfields were significantly elevated in the affected group relative to the unaffected group (P < 0.005 for all comparisons), with the largest difference found for the inner inferior subfield (P < 0.0001). The receiver operating characteristic analysis of median MI values of the inner inferior subfields showed high sensitivity and high specificity in the detection of toxicity with area under the curve = 0.99. Retinal changes secondary to hydroxychloroquine toxicity result in increased OCT reflectivity in the ONL that can be detected and quantitated using OCT-MI analysis. Analysis of OCT-MI values demonstrates high sensitivity and specificity for detecting the presence of hydroxychloroquine toxicity in this cohort and may contribute additionally to current screening practices.
Using a normalization 3D model for automatic clinical brain quantitative analysis and evaluation
NASA Astrophysics Data System (ADS)
Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping
2003-05-01
Functional medical imaging, such as PET or SPECT, is capable of revealing physiological functions of the brain and has been broadly used in diagnosing brain disorders by clinically quantitative analysis for many years. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from the corresponding functional PET or SPECT images. The accuracy of quantitative analysis thus relies on that of the subjectively selected ROIs. Therefore, standardizing the analysis procedure is fundamental and important in improving the analysis outcome. In this paper, we propose and evaluate a normalization procedure with a standard 3D brain model to achieve precise quantitative analysis. In the normalization process, the mutual information registration technique was applied to realign functional medical images to standard structural medical images. Then the standard 3D brain model, which shows well-defined brain regions, was used to replace the manual ROIs in the objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in practical clinical evaluation. The results show that the quantitative analysis outcomes obtained from this automated method agree with the clinical diagnosis evaluation score, with less than 3% error on average. In summary, the method automatically obtains precise VOI information from the well-defined standard 3D brain model, sparing the manual slice-by-slice drawing of ROIs on structural medical images required in the traditional procedure. That is, the method not only provides precise analysis results, but also improves the processing rate for mass medical images in clinical use.
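A mutual information registration step of the kind described above can be sketched with an off-the-shelf toolkit. The following is a minimal illustration assuming SimpleITK, not the authors' implementation; the file names, rigid transform choice, and optimizer settings are all assumptions.

```python
import SimpleITK as sitk

# Hypothetical inputs: a structural MR image (fixed) and a functional
# SPECT image (moving) to be realigned to it.
fixed = sitk.ReadImage("mri_t1.nii", sitk.sitkFloat32)
moving = sitk.ReadImage("spect_ibzm.nii", sitk.sitkFloat32)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0,
                                             minStep=1e-4,
                                             numberOfIterations=200)
reg.SetInitialTransform(
    sitk.CenteredTransformInitializer(fixed, moving, sitk.Euler3DTransform()))
reg.SetInterpolator(sitk.sitkLinear)

transform = reg.Execute(fixed, moving)
# Resample the functional image into the structural image's space, so that
# atlas-defined ROIs/VOIs can be applied directly.
resampled = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
```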
Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C
2016-02-01
Promising development and expansion of the market for cham-cham, a traditional Indian dairy product, is expected in the coming future with the organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in sensory properties of market samples of cham-cham collected from four different locations known for their excellence in cham-cham production, and to find out the attributes that govern much of the variation in sensory scores of this product, using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed significant (p < 0.05) differences in sensory attributes of cham-cham among the market samples. PCA identified four significant principal components that accounted for 72.4% of the variation in the sensory data. Factor scores on each of the four principal components, which primarily correspond to sweetness/shape/dryness of interior, surface appearance/surface dryness, rancidity, and firmness attributes, specify the location of each market sample along each of the axes in 3-D graphs. These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the attributes of cham-cham that contribute most to its sensory acceptability.
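For readers unfamiliar with the QDA-to-PCA pipeline, the following minimal sketch shows the usual computation: standardize the panel's attribute ratings, extract principal components, and inspect cumulative explained variance and loadings. The data and attribute names are made-up stand-ins, and scikit-learn is an assumed tool, not necessarily the one used in the study.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Rows: sample x panelist observations; columns: attribute intensity ratings.
rng = np.random.default_rng(0)
attrs = ["sweetness", "interior_dryness", "surface_dryness", "rancid", "firmness"]
scores = rng.normal(5.0, 1.5, size=(48, len(attrs)))

pca = PCA(n_components=4)
pcs = pca.fit_transform(StandardScaler().fit_transform(scores))
print(pca.explained_variance_ratio_.cumsum())  # variance captured by 4 PCs
loadings = pca.components_.T                   # which attributes drive each PC
```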
León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.
2013-01-01
The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lima, F.W.; Pagano, C.; Schneiderman, B.
1959-07-01
Boron can be determined quantitatively by absorption spectrophotometry of solutions of the red compound formed by the reaction of boric acid with curcumin. This reaction is affected by various factors, some of which can be detected easily in the data interpretation. Others, however, present more difficulty. The application of modern statistical methods to the study of the influence of these factors on the quantitative determination of boron is presented. These methods provide objective ways of establishing the significant effects of the factors involved. (auth)
An Analysis of Trends in Publications on "Tobacco Control"
ERIC Educational Resources Information Center
Unutmaz Durmusoglu, Zeynep Didem; Kocabey Çiftçi, Pinar
2017-01-01
Objectives: Publications on tobacco control were quantitatively analysed to gain insight into the essential characteristics of the research field and trends and patterns in publication activities. The goal was to detect changes in the number of publications before and after the World Health Organization Framework Convention on Tobacco Control (WHO…
42 CFR 50.605 - Management and reporting of financial conflicts of interest.
Code of Federal Regulations, 2013 CFR
2013-10-01
... quantitative data to support any actual or future harm; analysis of whether the research project is salvageable... SERVICES GRANTS POLICIES OF GENERAL APPLICABILITY Promoting Objectivity in Research § 50.605 Management and... to the Institution's expenditure of any funds under a PHS-funded research project, the designated...
42 CFR 50.605 - Management and reporting of financial conflicts of interest.
Code of Federal Regulations, 2014 CFR
2014-10-01
... quantitative data to support any actual or future harm; analysis of whether the research project is salvageable... SERVICES GRANTS POLICIES OF GENERAL APPLICABILITY Promoting Objectivity in Research § 50.605 Management and... to the Institution's expenditure of any funds under a PHS-funded research project, the designated...
42 CFR 50.605 - Management and reporting of financial conflicts of interest.
Code of Federal Regulations, 2012 CFR
2012-10-01
... quantitative data to support any actual or future harm; analysis of whether the research project is salvageable... SERVICES GRANTS POLICIES OF GENERAL APPLICABILITY Promoting Objectivity in Research § 50.605 Management and... to the Institution's expenditure of any funds under a PHS-funded research project, the designated...
42 CFR 50.605 - Management and reporting of financial conflicts of interest.
Code of Federal Regulations, 2011 CFR
2011-10-01
... quantitative data to support any actual or future harm; analysis of whether the research project is salvageable... SERVICES GRANTS POLICIES OF GENERAL APPLICABILITY Promoting Objectivity in Research § 50.605 Management and... to the Institution's expenditure of any funds under a PHS-funded research project, the designated...
Gender Balance in Teaching Awards: Evidence from 18 Years of National Data
ERIC Educational Resources Information Center
Marchant, Teresa; Wallace, Michelle
2016-01-01
Gender implications of nationally competitive teaching awards were examined to determine whether women receive sufficient accolades, given their dominant position in university teaching. Quantitative methods and secondary data provided objective analysis of teaching awards for Australian universities, for an 18-year data set with 2046 units of…
Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R
2016-01-01
Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard.
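One concrete processing step named above is SUV normalization of the PET images. The following minimal sketch shows the standard body-weight SUV formula under simplified assumptions; the helper name and numbers are illustrative, and real DICOM handling of units, calibration, and decay reference times is considerably more involved.

```python
import math

def suv_bw(activity_bq_per_ml, injected_dose_bq, weight_kg,
           delay_s=0.0, half_life_s=6588.0):   # 18F half-life ~109.8 min
    """Body-weight SUV = tissue activity / (decay-corrected dose / body weight)."""
    dose_at_scan = injected_dose_bq * math.exp(-math.log(2) * delay_s / half_life_s)
    # Body weight in grams, assuming ~1 g/mL tissue density, makes SUV unitless.
    return activity_bq_per_ml / (dose_at_scan / (weight_kg * 1000.0))

# Hypothetical numbers: 370 MBq injected, 70 kg patient, 60 min uptake delay.
print(suv_bw(activity_bq_per_ml=12_000, injected_dose_bq=370e6,
             weight_kg=70, delay_s=3600))
```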
Cardiac imaging: working towards fully-automated machine analysis & interpretation
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-01-01
Introduction Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804
Joshi, Molishree; Keith Pittman, H; Haisch, Carl; Verbanac, Kathryn
2008-09-01
Quantitative real-time PCR (qPCR) is a sensitive technique for the detection and quantitation of specific DNA sequences. Here we describe a Taqman qPCR assay for quantification of tissue-localized, adoptively transferred enhanced green fluorescent protein (EGFP)-transgenic cells. A standard curve constructed from serial dilutions of a plasmid containing the EGFP transgene (i) was highly reproducible, (ii) detected as few as two copies, and (iii) was included in each qPCR assay. qPCR analysis of genomic DNA was used to determine transgene copy number in several mouse strains. Fluorescent microscopy of tissue sections showed that adoptively transferred vascular endothelial cells (VEC) from EGFP-transgenic mice specifically localized to tissue with metastatic tumors in syngeneic recipients. VEC microscopic enumeration of liver metastases strongly correlated with qPCR analysis of identical sections (Pearson correlation 0.81). EGFP was undetectable in tissue from control mice by qPCR. In another study using intra-tumor EGFP-VEC delivery to subcutaneous tumors, manual cell count and qPCR analysis of alternating sections also strongly correlated (Pearson correlation 0.82). Confocal microscopy of the subcutaneous tumor sections determined that visual fluorescent signals were frequently tissue artifacts. This qPCR methodology offers specific, objective, and rapid quantitation, uncomplicated by tissue autofluorescence, and should be readily transferable to other in vivo models to quantitate the biolocalization of transplanted cells.
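The plasmid standard-curve quantitation described above reduces to a linear fit of Ct against log10(copy number). The sketch below, with made-up Ct values, shows the fit, the implied amplification efficiency, and the inversion used to estimate copies in unknown samples.

```python
import numpy as np

# Hypothetical serial-dilution standards: known copies and measured Ct.
copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5, 1e6])
ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7, 16.3])

slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0        # ~1.0 means 100% efficiency
print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")

def copies_from_ct(ct_unknown):
    """Invert the standard curve to estimate template copies."""
    return 10 ** ((ct_unknown - intercept) / slope)

print(copies_from_ct(21.5))   # e.g. EGFP copies in an unknown tissue sample
```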
Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.
Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L
2017-10-01
The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies, however, primarily on visual EEG scoring by experts. We introduced a model-based approach of EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent from visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome (p<0.005) and correlated with independently identified visual EEG patterns such as generalized periodic discharges (p<0.02). Receiver operating characteristic (ROC) analysis confirmed the predictive value of lower state space velocity for poor clinical outcome after cardiac arrest (AUC 80.8, 70% sensitivity, 15% false positive rate). Model-based quantitative EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic encephalopathy.
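A spectral-variability index in the spirit of the state space velocity described above can be sketched simply: compute a normalized power spectrum per EEG epoch and average the distance between consecutive spectra. This is an illustration of the general idea, not the authors' model; the sampling rate, epoch length, and distance metric are assumptions.

```python
import numpy as np
from scipy.signal import welch

def spectral_velocity(eeg, fs=250.0, epoch_s=2.0):
    """Mean step size between consecutive normalized epoch spectra."""
    n = int(fs * epoch_s)
    epochs = [eeg[i:i + n] for i in range(0, len(eeg) - n + 1, n)]
    spectra = []
    for e in epochs:
        f, p = welch(e, fs=fs, nperseg=min(256, n))
        spectra.append(p / p.sum())        # normalize away absolute power
    spectra = np.array(spectra)
    # "Velocity": mean Euclidean distance between consecutive spectra.
    return np.linalg.norm(np.diff(spectra, axis=0), axis=1).mean()

rng = np.random.default_rng(0)
print(spectral_velocity(rng.standard_normal(250 * 60)))  # white-noise baseline
```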
Yan, Jianjun; Shen, Xiaojing; Wang, Yiqin; Li, Fufeng; Xia, Chunming; Guo, Rui; Chen, Chunfeng; Shen, Qingwei
2010-01-01
This study aims at utilising the Wavelet Packet Transform (WPT) and the Support Vector Machine (SVM) algorithm for objective analysis and quantitative research on auscultation in Traditional Chinese Medicine (TCM) diagnosis. First, Wavelet Packet Decomposition (WPD) at level 6 was employed to split the auscultation signals into finer frequency bands. Then statistical analysis was performed on the Wavelet Packet Energy (WPE) features extracted from the WPD coefficients. Furthermore, pattern recognition with SVM was used to distinguish the statistical feature values of mixed subject sample groups. Finally, the experimental results showed that the classification accuracies were at a high level.
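The WPD energy-feature pipeline above maps directly onto common libraries. Below is a minimal sketch assuming PyWavelets and scikit-learn: a level-6 wavelet packet decomposition yields 64 frequency-band nodes, whose relative energies form the feature vector fed to an SVM. The wavelet choice, signal length, and labels are illustrative, not those of the study.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def wpe_features(signal, wavelet="db4", level=6):
    """Relative wavelet packet energy per frequency band at the given level."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                            mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="freq")      # 2**level = 64 bands
    energy = np.array([np.sum(node.data ** 2) for node in nodes])
    return energy / energy.sum()

rng = np.random.default_rng(0)
X = np.array([wpe_features(rng.standard_normal(4096)) for _ in range(40)])
y = np.repeat([0, 1], 20)                          # hypothetical two classes
clf = SVC(kernel="rbf").fit(X, y)                  # voice-class discrimination
```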
The application of LIBS for the analysis of archaeological ceramic and metal artifacts
NASA Astrophysics Data System (ADS)
Melessanaki, Kristalia; Mateo, Maripaz; Ferrence, Susan C.; Betancourt, Philip P.; Anglos, Demetrios
2002-09-01
A bench-top laser-induced breakdown spectroscopy (LIBS) system has been used in the examination of pottery, jewelry, and metal artifacts found in archaeological excavations in central and eastern Crete, Greece. The objects date from the Middle and Late Minoan periods (ca. 20th-13th century B.C.) through Byzantine and Venetian to Ottoman times (ca. 5th-19th century A.D.). The spectral data indicate the qualitative and often the semi-quantitative elemental composition of the examined materials. In the case of colored glazed ceramics, the identity of pigments was established, while in the case of metal and jewelry analysis, the type of metal or metal alloy used was determined. The analyses demonstrate the potential of the LIBS technique for performing routine, rapid, on-site analysis of archaeological objects, which leads to quick characterization or screening of different types of objects.
Quantitative Medical Image Analysis for Clinical Development of Therapeutics
NASA Astrophysics Data System (ADS)
Analoui, Mostafa
There has been significant progress in development of therapeutics for prevention and management of several disease areas in recent years, leading to increased average life expectancy, as well as of quality of life, globally. However, due to complexity of addressing a number of medical needs and financial burden of development of new class of therapeutics, there is a need for better tools for decision making and validation of efficacy and safety of new compounds. Numerous biological markers (biomarkers) have been proposed either as adjunct to current clinical endpoints or as surrogates. Imaging biomarkers are among rapidly increasing biomarkers, being examined to expedite effective and rational drug development. Clinical imaging often involves a complex set of multi-modality data sets that require rapid and objective analysis, independent of reviewer's bias and training. In this chapter, an overview of imaging biomarkers for drug development is offered, along with challenges that necessitate quantitative and objective image analysis. Examples of automated and semi-automated analysis approaches are provided, along with technical review of such methods. These examples include the use of 3D MRI for osteoarthritis, ultrasound vascular imaging, and dynamic contrast enhanced MRI for oncology. Additionally, a brief overview of regulatory requirements is discussed. In conclusion, this chapter highlights key challenges and future directions in this area.
Keenan, S J; Diamond, J; McCluggage, W G; Bharucha, H; Thompson, D; Bartels, P H; Hamilton, P W
2000-11-01
The histological grading of cervical intraepithelial neoplasia (CIN) remains subjective, resulting in inter- and intra-observer variation and poor reproducibility in the grading of cervical lesions. This study has attempted to develop an objective grading system using automated machine vision. The architectural features of cervical squamous epithelium are quantitatively analysed using a combination of computerized digital image processing and Delaunay triangulation analysis; 230 images digitally captured from cases previously classified by a gynaecological pathologist included normal cervical squamous epithelium (n=30), koilocytosis (n=46), CIN 1 (n=52), CIN 2 (n=56), and CIN 3 (n=46). Intra- and inter-observer variation had kappa values of 0.502 and 0.415, respectively. A machine vision system was developed in KS400 macro programming language to segment and mark the centres of all nuclei within the epithelium. By object-oriented analysis of image components, the positional information of nuclei was used to construct a Delaunay triangulation mesh. Each mesh was analysed to compute triangle dimensions including the mean triangle area, the mean triangle edge length, and the number of triangles per unit area, giving an individual quantitative profile of measurements for each case. Discriminant analysis of the geometric data revealed the significant discriminatory variables from which a classification score was derived. The scoring system distinguished between normal and CIN 3 in 98.7% of cases and between koilocytosis and CIN 1 in 76.5% of cases, but only 62.3% of the CIN cases were classified into the correct group, with the CIN 2 group showing the highest rate of misclassification. Graphical plots of triangulation data demonstrated the continuum of morphological change from normal squamous epithelium to the highest grade of CIN, with overlapping of the groups originally defined by the pathologists. This study shows that automated location of nuclei in cervical biopsies using computerized image analysis is possible. Analysis of positional information enables quantitative evaluation of architectural features in CIN using Delaunay triangulation meshes, which is effective in the objective classification of CIN. This demonstrates the future potential of automated machine vision systems in diagnostic histopathology.
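The triangulation measurements named above (mean triangle area, mean edge length, triangle count) are straightforward to compute once nuclear centres are available. A minimal sketch with scipy follows; the nuclear coordinates are randomly generated stand-ins, and shared edges are counted once per triangle for simplicity.

```python
import numpy as np
from scipy.spatial import Delaunay

def delaunay_features(points):
    """Mean triangle area, mean edge length, and triangle count of the mesh."""
    tri = Delaunay(points)
    pts = points[tri.simplices]                # (n_triangles, 3, 2)
    a = pts[:, 1] - pts[:, 0]
    b = pts[:, 2] - pts[:, 0]
    # Triangle area from the 2D cross product of two edge vectors.
    areas = 0.5 * np.abs(a[:, 0] * b[:, 1] - a[:, 1] * b[:, 0])
    edges = np.concatenate([pts[:, 1] - pts[:, 0],
                            pts[:, 2] - pts[:, 1],
                            pts[:, 0] - pts[:, 2]])
    return areas.mean(), np.linalg.norm(edges, axis=1).mean(), len(areas)

rng = np.random.default_rng(0)
nuclei = rng.uniform(0, 512, size=(200, 2))    # hypothetical nuclear centres
print(delaunay_features(nuclei))
```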
LOD significance thresholds for QTL analysis in experimental populations of diploid species
Van Ooijen, J. W.
1999-11-01
Linkage analysis with molecular genetic markers is a very powerful tool in the biological research of quantitative traits. The lack of an easy way to determine which regions of the genome can be designated as statistically significant for containing a gene affecting the quantitative trait of interest hampers the important control of the rate of false positives. In this paper four tables, obtained by large-scale simulations, are presented that can be used with a simple formula to obtain the false-positive rate for analyses of the standard types of experimental populations with diploid species of any genome size. A new definition of the term 'suggestive linkage' is proposed that allows a more objective comparison of results across species.
Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.
2012-01-01
The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. This method can overcome many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600
Sornborger, Andrew; Broder, Josef; Majumder, Anirban; Srinivasamoorthy, Ganesh; Porter, Erika; Reagin, Sean S; Keith, Charles; Lauderdale, James D
2008-09-01
Ratiometric fluorescent indicators are used for making quantitative measurements of a variety of physiological variables. Their utility is often limited by noise. This is the second in a series of papers describing statistical methods for denoising ratiometric data with the aim of obtaining improved quantitative estimates of variables of interest. Here, we outline a statistical optimization method that is designed for the analysis of ratiometric imaging data in which multiple measurements have been taken of systems responding to the same stimulation protocol. This method takes advantage of correlated information across multiple datasets for objectively detecting and estimating ratiometric signals. We demonstrate our method by showing results of its application on multiple, ratiometric calcium imaging experiments.
Thakran, S; Gupta, P K; Kabra, V; Saha, I; Jain, P; Gupta, R K; Singh, A
2018-06-14
The objective of this study was to quantify hemodynamic parameters using first-pass analysis of T1-perfusion magnetic resonance imaging (MRI) data of the human breast, and to compare these parameters with the existing tracer kinetic parameters and with semi-quantitative and qualitative T1-perfusion analysis in terms of lesion characterization. MRI of the breast was performed in 50 women (mean age, 44±11 [SD] years; range: 26-75 years) with a total of 15 benign and 35 malignant breast lesions. After pre-processing, T1-perfusion MRI data were analyzed using a qualitative approach by two radiologists (visual classification of the kinetic curve into types I, II, or III), a semi-quantitative approach (characterization of kinetic curve types using empirical parameters), the generalized tracer kinetic model (tracer kinetic parameters), and first-pass analysis (hemodynamic parameters). The chi-squared test, t-test, one-way analysis of variance (ANOVA) with Bonferroni post-hoc test, and receiver operating characteristic (ROC) curves were used for statistical analysis. All quantitative parameters except leakage volume (Ve), qualitative curves (types I and III), and semi-quantitative curves (types I and III) provided significant differences (P<0.05) between benign and malignant lesions. Kinetic parameters, particularly the volume transfer coefficient (Ktrans), provided a significant difference (P<0.05) between all grades except grade II vs III. The hemodynamic parameter (relative leakage-corrected breast blood volume [rBBVcorr]) provided a statistically significant difference (P<0.05) between all grades. It also provided the highest sensitivity and specificity among all parameters in differentiating between different grades of malignant breast lesions. Quantitative parameters, particularly rBBVcorr and Ktrans, provided similar sensitivity and specificity in differentiating benign from malignant breast lesions for this cohort. Moreover, rBBVcorr provided the best differentiation between different grades of malignant breast lesions among all the parameters.
[Sample preparation and bioanalysis in mass spectrometry].
Bourgogne, Emmanuel; Wagner, Michel
2015-01-01
The quantitative analysis of compounds of clinical interest of low molecular weight (<1000 Da) in biological fluids is currently performed in most cases by liquid chromatography-mass spectrometry (LC-MS). Analysis of these compounds in biological fluids (plasma, urine, saliva, hair...) is a difficult task requiring sample preparation. Sample preparation is a crucial part of chemical/biological analysis and in a sense is considered the bottleneck of the whole analytical process. The main objectives of sample preparation are the removal of potential interferences, analyte preconcentration, and converting (if needed) the analyte into a more suitable form for detection or separation. Without chromatographic separation, endogenous compounds and co-eluted products may affect the performance of a quantitative mass spectrometry method. This work focuses on three distinct parts. First, quantitative bioanalysis is defined, along with the different matrices and sample preparation techniques currently used in mass spectrometry bioanalysis of small molecules of clinical interest in biological fluids. In a second step, the goals of sample preparation are described. Finally, in a third step, sample preparation strategies performed either directly ("dilute and shoot") or after precipitation are discussed.
Graberski Matasović, M; Matasović, T; Markovac, Z
1997-06-01
The frequency of femoral quadriceps muscle hypotrophy has made it a significant therapeutic problem. Efforts are being made to improve the standard scheme of kinesitherapeutic treatment by adding more effective therapeutic methods. Besides kinesitherapy, the authors used magnetotherapy in 30 of the 60 patients. A total of 60 patients of both sexes, of similar age groups and intensity of hypotrophy, were included in the study. They were divided into groups A and B, the experimental and the control one (30 patients each). The treatment was scheduled for the usual 5-6 weeks. Electromyographic quantitative analysis was used to assess the treatment results achieved after 5 and 6 weeks of the treatment period. Analysis of the results confirmed the assumption that magnetotherapy may yield better and faster treatment results, disappearance of pain, and decreased risk of complications: the same results were obtained in the experimental group, only one week earlier than in the control group. However, EMG quantitative analysis did not prove to be a sufficiently reliable and objective method for assessing the real condition of the muscle and the effects of treatment.
A Method to Measure and Estimate Normalized Contrast in Infrared Flash Thermography
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2016-01-01
The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, as are methods of converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves the use of a flash thermography data acquisition set-up with a high-reflectivity foil and high-emissivity tape such that the foil, tape, and test object are imaged simultaneously. Methods of assessing other quantitative parameters, such as the emissivity of the object, afterglow heat flux, reflection temperature change, and surface temperature during flash thermography, are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.
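As a simple illustration of normalized contrast (a sketch of the general idea, not NASA's specific processing), the code below computes a per-pixel image contrast for a post-flash frame stack against the mean evolution of an assumed defect-free reference region.

```python
import numpy as np

def normalized_contrast(frames, ref_mask):
    """frames: (t, h, w) post-flash intensity stack; ref_mask: bool (h, w)."""
    ref = frames[:, ref_mask].mean(axis=1)            # reference evolution I_r(t)
    # C(x, t) = (I(x, t) - I_r(t)) / I_r(t); subsurface flaws retain heat
    # longer than sound material and so show up as sustained C > 0.
    return (frames - ref[:, None, None]) / ref[:, None, None]

frames = np.random.rand(100, 64, 64) + 1.0            # hypothetical frame stack
mask = np.zeros((64, 64), dtype=bool)
mask[:8, :8] = True                                   # assumed sound region
C = normalized_contrast(frames, mask)
```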
Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.
He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan
2009-07-01
Facial paralysis is the loss of voluntary muscle movement of one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on the local binary patterns (LBPs) on the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. Support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrates its accuracy and efficiency.
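The building block of the method above, a uniform LBP histogram per facial region, can be sketched with scikit-image. The parameters are illustrative, and the simple L1 histogram difference below stands in for the resistor-average distance (RAD) used in the paper.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(region, P=8, R=1):
    """Histogram of uniform LBP codes for one greyscale facial region."""
    codes = local_binary_pattern(region, P, R, method="uniform")
    n_bins = P + 2                        # P+1 uniform patterns + "non-uniform"
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist

rng = np.random.default_rng(0)
left = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)    # stand-in regions
right = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
# Facial symmetry proxy: distance between the two sides' LBP histograms.
asymmetry = np.abs(lbp_histogram(left) - lbp_histogram(right)).sum()
print(asymmetry)
```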
Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi
2015-01-01
Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366
Savel'eva, N B; Bykovskaia, N Iu; Dikunets, M A; Bolotov, S L; Rodchenkov, G M
2010-01-01
The objective of this study was to demonstrate the possibility to use deuterated compounds as internal standards for the quantitative analysis of morphine by gas chromatography with mass-selective detection for the purpose of doping control. The paper is focused on the problems associated with the use of deuterated morphine-D3 as the internal standard. Quantitative characteristics of the calibration dependence thus documented are presented along with uncertainty values obtained in the measurements with the use of deuterated morphine-D6. An approach to the assessment of method bias associated with the application of morphine-D6 as the deuterated internal standard is described.
Robot and Human Surface Operations on Solar System Bodies
NASA Technical Reports Server (NTRS)
Weisbin, C. R.; Easter, R.; Rodriguez, G.
2001-01-01
This paper presents a comparison of robot and human surface operations on solar system bodies. The topics include: 1) Long Range Vision of Surface Scenarios; 2) Humans and Robots Complement Each Other; 3) Respective Human and Robot Strengths; 4) Need for More In-Depth Quantitative Analysis; 5) Projected Study Objectives; 6) Analysis Process Summary; 7) Mission Scenarios Decompose into Primitive Tasks; 8) Features of the Projected Analysis Approach; and 9) The "Getting There Effect" is a Major Consideration. This paper is in viewgraph form.
Network of TAMCNS: Identifying Influence Regions Within the GCSS-MC Database
2017-06-01
relationships between objects and provides tools to quantitatively determine objects whose influence impacts other objects or the system as a whole. This methodology identifies the most important TAMCN and provides a list of TAMCNs in order of importance. We also analyze the community and core structure of...
Subcellular object quantification with Squassh3C and SquasshAnalyst.
Rizk, Aurélien; Mansouri, Maysam; Ballmer-Hofer, Kurt; Berger, Philipp
2015-11-01
Quantitative image analysis plays an important role in contemporary biomedical research. Squassh is a method for automatic detection, segmentation, and quantification of subcellular structures and analysis of their colocalization. Here we present the applications Squassh3C and SquasshAnalyst. Squassh3C extends the functionality of Squassh to three fluorescence channels and live-cell movie analysis. SquasshAnalyst is an interactive web interface for the analysis of Squassh3C object data. It provides segmentation image overview and data exploration, figure generation, object and image filtering, and a statistical significance test in an easy-to-use interface. The overall procedure combines the Squassh3C plug-in for the free biological image processing program ImageJ and a web application working in conjunction with the free statistical environment R, and it is compatible with Linux, MacOS X, or Microsoft Windows. Squassh3C and SquasshAnalyst are available for download at www.psi.ch/lbr/SquasshAnalystEN/SquasshAnalyst.zip.
Fine phenotyping of pod and seed traits in Arachis germplasm accessions using digital image analysis
USDA-ARS?s Scientific Manuscript database
Reliable and objective phenotyping of peanut pod and seed traits is important for cultivar selection and genetic mapping of yield components. To develop useful and efficient methods to quantitatively define peanut pod and seed traits, a group of peanut germplasm with high levels of phenotypic varia...
ERIC Educational Resources Information Center
Pruett, Steven R.; Deiches, Jon; Pfaller, Joseph; Moser, Erin; Chan, Fong
2014-01-01
Objective: To determine the factorial validity of the Internal and External Motivation to Respond without Prejudice toward People with Disabilities Scale (D-IMS/EMS). Design: A quantitative descriptive design using factor analysis. Participants: 233 rehabilitation counseling and rehabilitation services students. Results: Both exploratory and…
Modelling Factors of Students' Work in Western Balkan Countries
ERIC Educational Resources Information Center
Savic, Mirko; Kresoja, Milena
2018-01-01
The positive side of employment during studies is the increase of net investments in human capital. The main objective of this paper is to discover factors influencing the work of students in Serbia, Bosnia and Herzegovina and Montenegro and to compare students' employment in these three Western Balkan countries. Quantitative analysis based on…
Development of a Computer-Based Visualised Quantitative Learning System for Playing Violin Vibrato
ERIC Educational Resources Information Center
Ho, Tracy Kwei-Liang; Lin, Huann-shyang; Chen, Ching-Kong; Tsai, Jih-Long
2015-01-01
Traditional methods of teaching music are largely subjective, with the lack of objectivity being particularly challenging for violin students learning vibrato because of the existence of conflicting theories. By using a computer-based analysis method, this study found that maintaining temporal coincidence between the intensity peak and the target…
Examination of Modeling Languages to Allow Quantitative Analysis for Model-Based Systems Engineering
2014-06-01
...SysML. There are many variants such as the Unified Profile for DODAF/MODAF (UPDM) and Business Process Model & Notation (BPMN) that have origins in...
Genome-Wide Association Study of Intelligence: Additive Effects of Novel Brain Expressed Genes
ERIC Educational Resources Information Center
Loo, Sandra K.; Shtir, Corina; Doyle, Alysa E.; Mick, Eric; McGough, James J.; McCracken, James; Biederman, Joseph; Smalley, Susan L.; Cantor, Rita M.; Faraone, Stephen V.; Nelson, Stanley F.
2012-01-01
Objective: The purpose of the present study was to identify common genetic variants that are associated with human intelligence or general cognitive ability. Method: We performed a genome-wide association analysis with a dense set of 1 million single-nucleotide polymorphisms (SNPs) and quantitative intelligence scores within an ancestrally…
Nutrient uptake, biomass yield and quantitative analysis of aliphatic aldehydes in cilantro plants
USDA-ARS's Scientific Manuscript database
The objective of this study was to evaluate the nutrient uptake, biomass production and yield of the major compounds in the essential oil of five genotypes of Coriandrum sativum L. The treatments were four accessions donated by the National Genetic Resources Advisory Council (NGRAC), U.S. Department...
Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes
2018-05-18
Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.
An interactive tool for semi-automatic feature extraction of hyperspectral data
NASA Astrophysics Data System (ADS)
Kovács, Zoltán; Szabó, Szilárd
2016-09-01
The spectral reflectance of the surface provides valuable information about the environment, which can be used to identify objects (e.g. land cover classification) or to estimate quantities of substances (e.g. biomass). We aimed to develop an MS Excel add-in - Hyperspectral Data Analyst (HypDA) - for multipurpose quantitative analysis of spectral data, written in the VBA programming language. HypDA was designed to calculate spectral indices from spectral data with user-defined formulas (in all possible combinations involving a maximum of 4 bands) and to find the best correlations between these indices and the quantitative attribute data of the same object. Different types of regression models reveal the relationships, and the best results are saved in a worksheet. Qualitative variables can also be involved in the analysis, carried out with separability and hypothesis testing, i.e. to find the wavelengths responsible for separating data into predefined groups. HypDA can be used both with hyperspectral imagery and spectrometer measurements. This bivariate approach requires significantly fewer observations than popular multivariate methods; it can therefore be applied to a wide range of research areas.
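HypDA itself is written in Excel VBA; as a rough Python sketch of its core idea (assumed array inputs, normalized-difference indices only, not the published implementation), the exhaustive band-combination search could look like:

```python
import numpy as np
from itertools import combinations

def best_two_band_index(spectra, attribute):
    """Exhaustive search over two-band normalized-difference indices,
    keeping the band pair whose index correlates best with a measured
    quantitative attribute (e.g. biomass). A simplified analogue of
    HypDA's band-combination search.

    spectra:   (n_samples, n_bands) reflectance matrix
    attribute: (n_samples,) measured quantity
    """
    best_pair, best_r = None, 0.0
    for i, j in combinations(range(spectra.shape[1]), 2):
        idx = (spectra[:, i] - spectra[:, j]) / (spectra[:, i] + spectra[:, j] + 1e-12)
        r = np.corrcoef(idx, attribute)[0, 1]
        if abs(r) > abs(best_r):
            best_pair, best_r = (i, j), r
    return best_pair, best_r
```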
NASA Astrophysics Data System (ADS)
Kobrina, Yevgeniya; Isaksson, Hanna; Sinisaari, Miikka; Rieppo, Lassi; Brama, Pieter A.; van Weeren, René; Helminen, Heikki J.; Jurvelin, Jukka S.; Saarakkala, Simo
2010-11-01
The collagen phase in bone is known to undergo major changes during growth and maturation. The objective of this study is to clarify whether Fourier transform infrared (FTIR) microspectroscopy, coupled with cluster analysis, can detect quantitative and qualitative changes in the collagen matrix of subchondral bone in horses during maturation and growth. Equine subchondral bone samples (n = 29) from the proximal joint surface of the first phalanx are prepared from two sites subjected to different loading conditions. Three age groups are studied: newborn (0 days old), immature (5 to 11 months old), and adult (6 to 10 years old) horses. Spatial collagen content and collagen cross-link ratio are quantified from the spectra. Additionally, normalized second derivative spectra of samples are clustered using the k-means clustering algorithm. In quantitative analysis, collagen content in the subchondral bone increases rapidly between the newborn and immature horses. The collagen cross-link ratio increases significantly with age. In qualitative analysis, clustering is able to separate newborn and adult samples into two different groups. The immature samples display some nonhomogeneity. In conclusion, this is the first study showing that FTIR spectral imaging combined with clustering techniques can detect quantitative and qualitative changes in the collagen matrix of subchondral bone during growth and maturation.
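A minimal sketch of the qualitative clustering step described above (Python; the Savitzky-Golay second derivative is an assumed choice, since the abstract does not name the derivative method):

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cluster import KMeans

def cluster_ftir_spectra(spectra, n_clusters=2):
    """Compute normalized second-derivative spectra and group them
    with k-means, mirroring the study's qualitative analysis.

    spectra: (n_spectra, n_wavenumbers) absorbance matrix
    """
    d2 = savgol_filter(spectra, window_length=9, polyorder=3,
                       deriv=2, axis=1)                    # 2nd derivative
    d2 /= np.linalg.norm(d2, axis=1, keepdims=True)        # vector-normalize
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(d2)
```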
Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang
2017-01-01
The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images are significant for the treatment monitoring of brain disorders. Therefore, a computer aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performances of the CAA-CRM approach in treatment monitoring are evaluated by the computer simulations and clinical applications. The results of computer simulations show that the derived CRMs have high similarities with their ground truths when the lesion size is larger than system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that CAA-CRM approach has a 93.4% accuracy of recovered region's localization. Moreover, the quantitative indexes of recovered regions derived from CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive the quantitative indexes from the longitudinal SPECT brain images for treatment monitoring.
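The abstract does not give the CRM formula; a simplified voxelwise stand-in, assuming already registered and count-normalized baseline/follow-up volumes, might be:

```python
import numpy as np

def change_rate_map(baseline, followup, brain_mask=None, eps=1e-6):
    """Voxelwise change-rate map for longitudinal SPECT volumes.
    The published CAA pipeline also performs registration and
    normalization, omitted here; e.g. crm > 0.2 would flag the
    >20% rCBF changes the simulations found recoverable."""
    crm = (followup - baseline) / (baseline + eps)
    if brain_mask is not None:
        crm = np.where(brain_mask, crm, 0.0)
    return crm
```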
CALIPSO: an interactive image analysis software package for desktop PACS workstations
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Huang, H. K.
1990-07-01
The purpose of this project is to develop a low-cost workstation for quantitative analysis of multimodality images using a Macintosh II personal computer. In the current configuration the Macintosh operates as a stand-alone workstation where images are imported either from a central PACS server through a standard Ethernet network or recorded through a video digitizer board. The CALIPSO software developed contains a large variety of basic image display and manipulation tools. We focused our effort, however, on the design and implementation of quantitative analysis methods that can be applied to images from different imaging modalities. Analysis modules currently implemented include: geometric and densitometric volume and ejection fraction calculation from radionuclide and cine-angiograms; Fourier analysis of cardiac wall motion; vascular stenosis measurement; color-coded parametric display of regional flow distribution from dynamic coronary angiograms; and automatic analysis of myocardial distribution of radiolabelled tracers from tomoscintigraphic images. Several of these analysis tools were selected because they use similar color-coded and parametric display methods to communicate quantitative data extracted from the images. 1. Rationale and objectives of the project: Developments of Picture Archiving and Communication Systems (PACS) in the clinical environment allow physicians and radiologists to assess radiographic images directly through imaging workstations. This convenient access to the images is often limited by the number of workstations available, due in part to their high cost. There is also an increasing need for quantitative analysis of the images. During the past decade...
Stenner, A Jackson; Fisher, William P; Stone, Mark H; Burdick, Donald S
2013-01-01
Rasch's unidimensional models for measurement show how to connect object measures (e.g., reader abilities), measurement mechanisms (e.g., machine-generated cloze reading items), and observational outcomes (e.g., counts correct on reading instruments). Substantive theory shows what interventions or manipulations to the measurement mechanism can be traded off against a change to the object measure to hold the observed outcome constant. A Rasch model integrated with a substantive theory dictates the form and substance of permissible interventions. Rasch analysis, absent construct theory and an associated specification equation, is a black box in which understanding may be more illusory than not. Finally, the quantitative hypothesis can be tested by comparing theory-based trade-off relations with observed trade-off relations. Only quantitative variables (as measured) support such trade-offs. Note that to test the quantitative hypothesis requires more than manipulation of the algebraic equivalencies in the Rasch model or descriptively fitting data to the model. A causal Rasch model involves experimental intervention/manipulation on either reader ability or text complexity or a conjoint intervention on both simultaneously to yield a successful prediction of the resultant observed outcome (count correct). We conjecture that when this type of manipulation is introduced for individual reader text encounters and model predictions are consistent with observations, the quantitative hypothesis is sustained.
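For reference, the dichotomous Rasch model behind this argument is

$$P(X_{pi} = 1 \mid b_p, d_i) = \frac{\exp(b_p - d_i)}{1 + \exp(b_p - d_i)},$$

so an intervention raising text complexity $d_i$ by $\delta$ is exactly offset by raising reader ability $b_p$ by $\delta$, leaving the expected count correct unchanged; this is the trade-off relation the authors propose testing against observation.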
Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C
2018-01-01
This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
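For context, the simplest two-point mono-exponential ADC computation (one plausible instance of the compared models; the participating sites' exact fitting procedures are not detailed in the abstract) is:

```python
import numpy as np

def adc_map(s0, sb, b_value):
    """Mono-exponential apparent diffusion coefficient from two
    diffusion weightings, S(b) = S0 * exp(-b * ADC); ADC is in
    mm^2/s when b is in s/mm^2 -- the units/scale metadata the
    project found missing from many site-generated maps."""
    with np.errstate(divide="ignore", invalid="ignore"):
        adc = np.log(s0 / sb) / b_value
    return np.nan_to_num(adc, nan=0.0, posinf=0.0, neginf=0.0)
```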
Retinal status analysis method based on feature extraction and quantitative grading in OCT images.
Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri
2016-07-22
Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic analysis method for OCT images to support computer-aided disease diagnosis, a critical part of eye fundus diagnosis. The study analyzed 300 OCT images acquired by an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was constructed. Two kinds of quantitative methods, based on geometric features and morphological features respectively, were then proposed, together with a retinal abnormality grading decision-making method used in the analysis and evaluation of multiple OCT images. The analysis process is illustrated in detail with four retinal OCT images of differing degrees of abnormality; the final grading results verified that the method can distinguish abnormal severity and lesion regions. In a simulation on 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status: it extracts parameters and features associated with retinal morphology, and their quantitative analysis and evaluation, combined with the reference model, enable abnormality judgment of the target image and provide a reference for disease diagnosis.
Sankhalkar, Sangeeta; Vernekar, Vrunda
2016-01-01
Background: A number of secondary compounds are produced by plants as natural antioxidants. Moringa oleifera Lam. and Ocimum tenuiflorum L. are known for their wide applications in the food and pharmaceutical industries. Objective: To compare phenolic and flavonoid content in M. oleifera Lam. and O. tenuiflorum L. by quantitative and qualitative analysis. Materials and Methods: Phenolic and flavonoid content were studied spectrophotometrically and by paper chromatography in M. oleifera Lam. and O. tenuiflorum L. Results: Higher phenolic and flavonoid content were observed in Moringa leaf and flower. Ocimum flower showed higher phenolic but lower flavonoid content in comparison to Moringa. Flavonoids such as biflavonyl, flavones, glycosylflavones, and kaempferol were identified by paper chromatography. Phytochemical analyses for flavonoids, tannins, saponins, alkaloids, reducing sugars, and anthraquinones tested positive for Moringa and Ocimum leaf as well as flower. Conclusions: In the present study, the higher phenolic and flavonoid content indicated the natural antioxidant character of Moringa and Ocimum, signifying their medicinal importance. SUMMARY: Moringa oleifera Lam. and Ocimum tenuiflorum L. are widely grown in India and are known for their medicinal properties. A number of secondary metabolites, such as phenolics and flavonoids, are known to be present in both plants. The present study was conducted with the objective of qualitatively and quantitatively comparing the phenolics and flavonoids in these two medicinally important plants. Quantitation of total phenolics and flavonoids was done spectrophotometrically, while qualitative analysis was performed by paper chromatography and by phytochemical tests. Our results showed higher phenolic and flavonoid content in Moringa leaf and flower, with Ocimum flower higher in phenolics but lower in flavonoids than Moringa. Phytochemical analysis of various metabolites such as flavonoids, tannins, saponins, alkaloids, and anthraquinones revealed that both plant extracts are rich sources of secondary metabolites and thus tested positive in the above tests. Various flavonoids and phenolics were identified by paper chromatography based on their Rf values and characteristic colors. From the above study we conclude that Moringa and Ocimum are rich in natural antioxidants and hence are a potent resource for the pharmaceutical industry. PMID:26941531
Object-oriented Persistent Homology
Wang, Bao; Wei, Guo-Wei
2015-01-01
Persistent homology provides a new approach for the topological simplification of big data via measuring the life time of intrinsic topological features in a filtration process and has found its success in scientific and engineering applications. However, such a success is essentially limited to qualitative data classification and analysis. Indeed, persistent homology has rarely been employed for quantitative modeling and prediction. Additionally, the present persistent homology is a passive tool, rather than a proactive technique, for classification and analysis. In this work, we outline a general protocol to construct object-oriented persistent homology methods. By means of differential geometry theory of surfaces, we construct an objective functional, namely, a surface free energy defined on the data of interest. The minimization of the objective functional leads to a Laplace-Beltrami operator which generates a multiscale representation of the initial data and offers an objective oriented filtration process. The resulting differential geometry based object-oriented persistent homology is able to preserve desirable geometric features in the evolutionary filtration and enhances the corresponding topological persistence. The cubical complex based homology algorithm is employed in the present work to be compatible with the Cartesian representation of the Laplace-Beltrami flow. The proposed Laplace-Beltrami flow based persistent homology method is extensively validated. The consistence between Laplace-Beltrami flow based filtration and Euclidean distance based filtration is confirmed on the Vietoris-Rips complex for a large amount of numerical tests. The convergence and reliability of the present Laplace-Beltrami flow based cubical complex filtration approach are analyzed over various spatial and temporal mesh sizes. The Laplace-Beltrami flow based persistent homology approach is utilized to study the intrinsic topology of proteins and fullerene molecules. Based on a quantitative model which correlates the topological persistence of fullerene central cavity with the total curvature energy of the fullerene structure, the proposed method is used for the prediction of fullerene isomer stability. The efficiency and robustness of the present method are verified by more than 500 fullerene molecules. It is shown that the proposed persistent homology based quantitative model offers good predictions of total curvature energies for ten types of fullerene isomers. The present work offers the first example to design object-oriented persistent homology to enhance or preserve desirable features in the original data during the filtration process and then automatically detect or extract the corresponding topological traits from the data. PMID:26705370
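In schematic form (the paper's functional differs in its details), a surface free energy of mean-curvature type and the Laplace-Beltrami-type flow obtained by minimizing it can be written as

$$E[u] = \int_{\Omega} \gamma \sqrt{1 + |\nabla u|^2}\,\mathrm{d}\mathbf{x}, \qquad \frac{\partial u}{\partial t} = \sqrt{1 + |\nabla u|^2}\;\nabla \cdot \left( \frac{\gamma\,\nabla u}{\sqrt{1 + |\nabla u|^2}} \right),$$

where evolving $u$ in time generates the multiscale family of data representations that drives the object-oriented filtration.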
Mari, João Fernando; Saito, José Hiroki; Neves, Amanda Ferreira; Lotufo, Celina Monteiro da Cruz; Destro-Filho, João-Batista; Nicoletti, Maria do Carmo
2015-12-01
Microelectrode Arrays (MEA) are devices for long-term electrophysiological recording of extracellular spontaneous or evoked activities of in vitro neuron cultures. This work proposes and develops a framework for quantitative and morphological analysis of neuron cultures on MEAs by processing their corresponding images, acquired by fluorescence microscopy. The neurons are segmented from the fluorescence channel images using a combination of segmentation by thresholding, watershed transform, and object classification. The positioning of microelectrodes is obtained from the transmitted light channel images using the circular Hough transform. The proposed method was applied to images of dissociated cultures of rat dorsal root ganglion (DRG) neuronal cells. The morphological and topological quantitative analysis carried out produced information regarding the state of the culture, such as population count, neuron-to-neuron and neuron-to-microelectrode distances, soma morphologies, neuron sizes, and neuron and microelectrode spatial distributions. Most analyses of microscopy images taken from neuronal cultures on MEAs consider only simple qualitative measures. The proposed framework therefore aims to standardize the image processing and to compute quantitative, useful measures for integrated image-signal studies and further computational simulations. As the results show, the implemented microelectrode identification method is robust, and so are the implemented neuron segmentation and classification ones (with a correct segmentation rate up to 84%). The quantitative information retrieved by the method is highly relevant to assist the integrated signal-image study of recorded electrophysiological signals as well as the physical aspects of the neuron culture on the MEA. Although the experiments deal with DRG cell images, cortical and hippocampal cell images could also be processed with small adjustments in the image processing parameter estimation.
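A condensed sketch of the two image-processing stages named above, using scikit-image (the specific function choices, such as Otsu thresholding, are assumptions; the object-classification step is omitted):

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import canny, peak_local_max
from skimage.filters import threshold_otsu
from skimage.segmentation import watershed
from skimage.transform import hough_circle, hough_circle_peaks

def segment_neurons(fluor):
    """Threshold the fluorescence channel, then split touching
    somata with a distance-transform watershed."""
    binary = fluor > threshold_otsu(fluor)
    dist = ndi.distance_transform_edt(binary)
    peaks = peak_local_max(dist, min_distance=5, labels=binary.astype(int))
    markers = np.zeros(fluor.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-dist, markers, mask=binary)

def find_electrodes(brightfield, radii=np.arange(10, 20)):
    """Locate circular microelectrodes in the transmitted-light
    channel via the circular Hough transform."""
    h = hough_circle(canny(brightfield), radii)
    _, cx, cy, r = hough_circle_peaks(h, radii, total_num_peaks=60)
    return np.stack([cx, cy, r], axis=1)
```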
Computer-Based Image Analysis for Plus Disease Diagnosis in Retinopathy of Prematurity
Wittenberg, Leah A.; Jonsson, Nina J.; Chan, RV Paul; Chiang, Michael F.
2014-01-01
Presence of plus disease in retinopathy of prematurity (ROP) is an important criterion for identifying treatment-requiring ROP. Plus disease is defined by a standard published photograph selected over 20 years ago by expert consensus. However, diagnosis of plus disease has been shown to be subjective and qualitative. Computer-based image analysis, using quantitative methods, has potential to improve the objectivity of plus disease diagnosis. The objective was to review the published literature involving computer-based image analysis for ROP diagnosis. The PubMed and Cochrane library databases were searched for the keywords “retinopathy of prematurity” AND “image analysis” AND/OR “plus disease.” Reference lists of retrieved articles were searched to identify additional relevant studies. All relevant English-language studies were reviewed. There are four main computer-based systems, ROPtool (AU ROC curve, plus tortuosity 0.95, plus dilation 0.87), RISA (AU ROC curve, arteriolar TI 0.71, venular diameter 0.82), Vessel Map (AU ROC curve, arteriolar dilation 0.75, venular dilation 0.96), and CAIAR (AU ROC curve, arteriole tortuosity 0.92, venular dilation 0.91), attempting to objectively analyze vessel tortuosity and dilation in plus disease in ROP. Some of them show promise for identification of plus disease using quantitative methods. This has potential to improve the diagnosis of plus disease, and may contribute to the management of ROP using both traditional binocular indirect ophthalmoscopy and image-based telemedicine approaches. PMID:21366159
John Archibald Wheeler: A study of mentoring in modern physics
NASA Astrophysics Data System (ADS)
Christensen, Terry M.
This dissertation has two objectives. The first objective is to determine where best to situate the study of mentoring (i.e. the 'making of scientists') on the landscape of the history of science and science studies. This task is accomplished by establishing mentoring studies as a link between the robust body of literature dealing with Research Schools and the emerging scholarship surrounding the development, dispersion, and evolution of pedagogy in the training of twentieth century physicists. The second, and perhaps more significant and novel objective, is to develop a means to quantitatively assess the mentoring workmanship of scientific craftsmen who preside over the final stages of preparation when apprentices are transformed into professional scientists. The project builds upon a 2006 Master's Thesis that examined John Archibald Wheeler's work as a mentor of theoretical physicists at Princeton University in the years 1938--1976. It includes Wheeler's work as a mentor at the University of Texas and is qualitatively and quantitatively enhanced by virtue of the author having access to five separate collections with archival holdings of John Wheeler's papers and correspondence, as well as having access to thirty one tape recorded interviews that feature John Wheeler as either the interviewee or a prominent subject of discussion. The project also benefited from the opportunity to meet with and gather background information from a number of John Wheeler's former colleagues and students. Included in the dissertation is a content analysis of the acknowledgements in 949 Ph.D. dissertations, 122 Master's Theses, and 670 Senior Theses that were submitted during Wheeler's career as an active mentor. By establishing a census of the students of the most active mentors at Princeton and Texas, it is possible to tabulate the publication record of these apprentice groups and obtain objective measures of mentoring efficacy. The dissertation concludes by discussing the wider applicability of the quantitative methodology and the qualitative analysis for the history of science and science studies.
NASA Technical Reports Server (NTRS)
Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.
1995-01-01
Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch and assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, what environmental variable(s), in the absence of bedrock, is/are responsible for influencing the spectral properties of the desert alluvial surface.
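Decorrelation stretching is a standard PCA-based enhancement; one common formulation (a sketch, not the exact TIMS processing chain) rotates the bands to principal components, equalizes their variances, and rotates back:

```python
import numpy as np

def decorrelation_stretch(img):
    """Decorrelation stretch of a multiband image (bands last).
    After stretching, bands 5, 3, 1 would be mapped to R, G, B
    as in the study."""
    h, w, nb = img.shape
    flat = img.reshape(-1, nb).astype(float)
    mean = flat.mean(axis=0)
    cov = np.cov(flat - mean, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    evals = np.clip(evals, 1e-12, None)          # guard degenerate bands
    whiten = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
    out = (flat - mean) @ whiten * np.sqrt(np.diag(cov)) + mean
    return out.reshape(h, w, nb)
```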
Morphometric Analysis of Chemoreception Organ in Male and Female Ticks (Acari: Ixodidae).
Josek, Tanya; Allan, Brian F; Alleyne, Marianne
2018-05-04
The Haller's organ plays a crucial role in a tick's ability to detect hosts. Even though this sensory organ is vital to tick survival, the morphology of this organ is not well understood. The objective of this study was to characterize variation in the morphological components of the Haller's organ of three medically important tick species using quantitative methods. The Haller's organs of Ixodes scapularis Say (Ixodida: Ixodidae) (black-legged tick), Amblyomma americanum (L.) (Ixodida: Ixodidae) (lone star tick), and Dermacentor variabilis (Say) (Ixodida: Ixodidae) (American dog tick) were morphologically analyzed using environmental scanning electron microscopy and geometric morphometrics, and the results were statistically interpreted using canonical variate analysis. Our data reveal significant, quantitative differences in the morphology of the Haller's organ among all three tick species and that in D. variabilis the sensory structure is sexually dimorphic. Studies like this can serve as a quantitative basis for further studies on sensor physiology, behavior, and tick species life history, potentially leading to novel methods for the prevention of tick-borne disease.
Quantitative analysis of multiple sclerosis: a feasibility study
NASA Astrophysics Data System (ADS)
Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong
2006-03-01
Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
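Schematically, and omitting the paper's partial-volume and intensity-inhomogeneity terms, the MAP segmentation with an MRF neighborhood penalty maximizes

$$\hat{x} = \arg\max_{x} \left[ \sum_i \log p(y_i \mid x_i) \;-\; \beta \sum_{\langle i,j \rangle} V(x_i, x_j) \right],$$

where $y$ are the observed MR intensities, $x$ the tissue labels (WM, GM, CSF), and the pairwise potentials $V$ over neighboring voxels $\langle i,j \rangle$ encode the noise-suppressing neighborhood correlation prior.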
Quantitative endoscopy: initial accuracy measurements.
Truitt, T O; Adelman, R A; Kelly, D H; Willging, J P
2000-02-01
The geometric optics of an endoscope can be used to determine the absolute size of an object in an endoscopic field without knowing the actual distance from the object. This study explores the accuracy of a technique that estimates absolute object size from endoscopic images. Quantitative endoscopy involves calibrating a rigid endoscope to produce size estimates from 2 images taken with a known traveled distance between the images. The heights of 12 samples, ranging in size from 0.78 to 11.80 mm, were estimated with this calibrated endoscope. Backup distances of 5 mm and 10 mm were used for comparison. The mean percent error for all estimated measurements when compared with the actual object sizes was 1.12%. The mean errors for 5-mm and 10-mm backup distances were 0.76% and 1.65%, respectively. The mean errors for objects <2 mm and ≥2 mm were 0.94% and 1.18%, respectively. Quantitative endoscopy estimates endoscopic image size to within 5% of the actual object size. This method remains promising for quantitatively evaluating object size from endoscopic images. It does not require knowledge of the absolute distance of the endoscope from the object; rather, only the distance traveled by the endoscope between images.
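Under an idealized pinhole model (an assumption; the paper calibrates the actual endoscope optics), the two-image geometry yields absolute size without the absolute working distance:

```python
def object_height(s1, s2, backup, k):
    """Absolute object height from two endoscopic images taken with
    a known pull-back between them, assuming image size s = k * H / z
    for a calibrated constant k.

    s1, s2: image heights in pixels before and after pull-back (s1 > s2)
    backup: distance the endoscope was withdrawn (mm)
    k:      calibration constant from the endoscope calibration step
    """
    z = backup * s2 / (s1 - s2)   # initial working distance (mm)
    return s1 * z / k             # absolute object height (mm)
```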
Digital holographic microscopy combined with optical tweezers
NASA Astrophysics Data System (ADS)
Cardenas, Nelson; Yu, Lingfeng; Mohanty, Samarendra K.
2011-02-01
While optical tweezers have been widely used for the manipulation and organization of microscopic objects in three dimensions, observing the manipulated objects along the axial direction has been quite challenging. In order to visualize the organization and orientation of objects along the axial direction, we report the development of digital holographic microscopy combined with optical tweezers (digital holographic optical tweezers, DHOT). Digital holography is achieved by use of a modified Mach-Zehnder interferometer with digital recording of the interference pattern of the reference and sample laser beams by use of a single CCD camera. In this method, quantitative phase information is retrieved dynamically with high temporal resolution, limited only by the frame rate of the CCD. Digital focusing, phase unwrapping, as well as online analysis and display of the quantitative phase images, were performed in software developed on the LabVIEW platform. Since the phase changes observed in DHOT are very sensitive to the optical thickness of the trapped volume, estimation of the number of particles trapped in the axial direction as well as the orientation of non-spherical objects could be achieved with high precision. Since diseases such as malaria and diabetes change the refractive index of red blood cells, this system can be employed to map such disease-specific changes in biological samples upon immobilization with optical tweezers.
3D Slicer as an Image Computing Platform for the Quantitative Imaging Network
Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron
2012-01-01
Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of the clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to reproducibility and efficiency of the quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of the new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, and providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future directions that can further facilitate development and validation of imaging biomarkers using 3D Slicer. PMID:22770690
Shen, Qijun; Shan, Yanna; Hu, Zhengyu; Chen, Wenhui; Yang, Bing; Han, Jing; Huang, Yanfang; Xu, Wen; Feng, Zhan
2018-04-30
To objectively quantify intracranial hematoma (ICH) enlargement by analysing the image texture of head CT scans and to provide objective and quantitative imaging parameters for predicting early hematoma enlargement. We retrospectively studied 108 ICH patients with baseline non-contrast computed tomography (NCCT) and 24-h follow-up CT available. Image data were assessed by a chief radiologist and a resident radiologist. Consistency analysis between observers was tested. The patients were divided into a training set (75%) and a validation set (25%) by stratified sampling. Patients in the training set were dichotomized according to 24-h hematoma expansion ≥ 33%. Using the Laplacian of Gaussian bandpass filter, we chose different anatomical spatial domains ranging from fine texture to coarse texture to obtain a series of derived parameters (mean grayscale intensity, variance, uniformity) in order to quantify and evaluate all data. The parameters were externally validated on the validation set. Significant differences were found between the two groups of patients in variance at V1.0 and in uniformity at U1.0, U1.8, and U2.5. The intraclass correlation coefficients for the texture parameters were between 0.67 and 0.99. The area under the ROC curve between the two groups of ICH cases was between 0.77 and 0.92. The accuracy of CT texture analysis (CTTA) on the validation set was 0.59-0.85. NCCT texture analysis can objectively quantify the heterogeneity of ICH and independently predict early hematoma enlargement. • Heterogeneity is helpful in predicting ICH enlargement. • CTTA could play an important role in predicting early ICH enlargement. • After filtering, fine texture had the best diagnostic performance. • The histogram-based uniformity parameters can independently predict ICH enlargement. • CTTA is more objective, more comprehensive, and more independently operable than previous methods.
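A minimal sketch of the filtration-histogram approach described above (Python; the bin count, and reading the V/U subscripts as LoG filter scales, are assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def texture_parameters(roi, sigmas=(1.0, 1.8, 2.5), bins=64):
    """Band-pass the hematoma ROI with Laplacian-of-Gaussian filters
    from fine to coarse scale, then compute mean, variance, and
    histogram uniformity (sum of squared bin probabilities)."""
    features = {}
    for s in sigmas:
        f = gaussian_laplace(roi.astype(float), sigma=s)
        p, _ = np.histogram(f, bins=bins)
        p = p / p.sum()
        features[s] = {"mean": f.mean(),
                       "variance": f.var(),
                       "uniformity": float(np.sum(p ** 2))}
    return features
```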
Farberg, Aaron S; Winkelmann, Richard R; Tucker, Natalie; White, Richard; Rigel, Darrell S
2017-09-01
BACKGROUND: Early diagnosis of melanoma is critical to survival. New technologies, such as a multi-spectral digital skin lesion analysis (MSDSLA) device [MelaFind, STRATA Skin Sciences, Horsham, Pennsylvania], may be useful to enhance clinician evaluation of concerning pigmented skin lesions. Previous studies evaluated the effect of only the binary output. OBJECTIVE: The objective of this study was to determine how decisions dermatologists make regarding pigmented lesion biopsies are impacted by providing both the underlying classifier score (CS) and the associated probability risk provided by multi-spectral digital skin lesion analysis. This outcome was also compared against the improvement reported with the provision of only the binary output. METHODS: Dermatologists attending an educational conference evaluated 50 pigmented lesions (25 melanomas and 25 benign lesions). Participants were asked if they would biopsy the lesion based on clinical images, and were asked this question again after being shown multi-spectral digital skin lesion analysis data that included the probability graphs and classifier score. RESULTS: Data were analyzed from a total of 160 United States board-certified dermatologists. Biopsy sensitivity for melanoma improved from 76 percent following clinical evaluation to 92 percent after quantitative multi-spectral digital skin lesion analysis information was provided (p < 0.0001). Specificity improved from 52 percent to 79 percent (p < 0.0001). The positive predictive value increased from 61 percent to 81 percent (p < 0.01) when the quantitative data were provided. Negative predictive value also increased (68% vs. 91%, p < 0.01), and overall biopsy accuracy was greater with multi-spectral digital skin lesion analysis (64% vs. 86%, p < 0.001). Interrater reliability improved (intraclass correlation 0.466 before, 0.559 after). CONCLUSION: Incorporating the classifier score and probability data into physician evaluation of pigmented lesions led to both increased sensitivity and specificity, thereby resulting in more accurate biopsy decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miron, M.S.; Christopher, C.; Hirshfield, S.
1978-05-01
Psycholinguistics provides crisis managers in nuclear threat incidents with a quantitative methodology which can aid in the determination of threat credibility, authorship identification and perpetrator apprehension. The objective of this contract is to improve and enhance present psycholinguistic software systems by means of newly-developed, computer-automated techniques which significantly extend the technology of automated content and stylistic analysis of nuclear threat. In accordance with this overall objective, the first two contract Tasks have been completed and are reported on in this document. The first Task specifies the development of software support for the purpose of syntax regularization of vocabulary to root form. The second calls for the exploration and development of alternative approaches to correlative analysis of vocabulary usage.
Reiner, Bruce I
2018-04-01
Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
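As a toy illustration of the string-search strategy mentioned (the hedge lexicon below is invented for the example, not drawn from the paper):

```python
import re
from collections import Counter

# Illustrative hedge lexicon -- a deployed system would derive and
# weight terms via NLP and correlate them with outcomes data.
HEDGE_TERMS = ["possible", "probable", "cannot exclude", "suspicious for",
               "may represent", "likely", "equivocal", "borderline"]

def uncertainty_profile(report_text):
    """Count hedge phrases in a report and return the per-term tally
    plus a crude hedges-per-word density score."""
    text = report_text.lower()
    counts = Counter({t: len(re.findall(re.escape(t), text))
                      for t in HEDGE_TERMS})
    density = sum(counts.values()) / max(len(text.split()), 1)
    return counts, density
```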
NASA Technical Reports Server (NTRS)
Pepper, Stephen V.
1995-01-01
A grazing angle objective on an infrared microspectrometer is studied for quantitative spectroscopy by considering the angular dependence of the incident intensity within the objective's angular aperture. The assumption that there is no angular dependence is tested by comparing the experimental reflectance of Si and KBr surfaces with the reflectance calculated by integrating the Fresnel reflection coefficient over the angular aperture under this assumption. Good agreement was found, indicating that the specular reflectance of surfaces can be straightforwardly and quantitatively integrated over the angular aperture without considering non-uniform incident intensity. This quantitative approach is applied to the thickness determination of dip-coated Krytox on gold. The infrared optical constants of both materials are known, allowing the integration to be carried out. The thickness obtained is in fair agreement with the value determined by ellipsometry in the visible. Therefore, this paper illustrates a method for more quantitative use of a grazing angle objective for infrared reflectance microspectroscopy.
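A numerical sketch of the reflectance integration described, under the tested uniform-intensity assumption (the angular range and the polarization averaging are assumptions):

```python
import numpy as np

def aperture_reflectance(n, theta_min_deg, theta_max_deg, pol="avg"):
    """Specular reflectance of a surface with (possibly complex)
    refractive index n, averaged over the objective's angular
    aperture with uniform incident intensity."""
    th = np.radians(np.linspace(theta_min_deg, theta_max_deg, 500))
    cos_i = np.cos(th)
    cos_t = np.sqrt(1 - (np.sin(th) / n) ** 2 + 0j)   # Snell's law
    rs = (cos_i - n * cos_t) / (cos_i + n * cos_t)    # s-polarized
    rp = (n * cos_i - cos_t) / (n * cos_i + cos_t)    # p-polarized
    Rs, Rp = np.abs(rs) ** 2, np.abs(rp) ** 2
    R = {"s": Rs, "p": Rp, "avg": 0.5 * (Rs + Rp)}[pol]
    return np.trapz(R, th) / (th[-1] - th[0])
```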
Smile line assessment comparing quantitative measurement and visual estimation.
Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie
2011-02-01
Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display.
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of the methodology designed for the environmental impact assessment process will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process; through the use of risk analysis methods in that process, this objective has been achieved. Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.
NASA Astrophysics Data System (ADS)
Cabrera Fernandez, Delia; Salinas, Harry M.; Somfai, Gabor; Puliafito, Carmen A.
2006-03-01
Optical coherence tomography (OCT) is a rapidly emerging medical imaging technology. In ophthalmology, OCT is a powerful tool because it enables visualization of the cross-sectional structure of the retina and anterior eye with higher resolutions than any other non-invasive imaging modality. Furthermore, OCT image information can be quantitatively analyzed, enabling objective assessment of features such as macular edema and diabetic retinopathy. We present specific improvements in the quantitative analysis of the OCT system, by combining the diffusion equation with the free Schrödinger equation. In this formulation, important features of the image can be extracted by extending the analysis from the real axis to the complex domain. Experimental results indicate that our proposed novel approach has good performance in speckle noise removal, enhancement, and segmentation of the various cellular layers of the retina using the OCT system.
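One way to write the combination referred to is as a complex diffusion process (an assumed form consistent with the description; the paper's exact formulation may differ):

$$\frac{\partial I}{\partial t} = c\,\nabla^2 I, \qquad c = e^{i\theta},$$

which reduces to the real diffusion (smoothing) equation as $\theta \to 0$ and to the free Schrödinger equation at $\theta = \pi/2$; for small $\theta$, the imaginary part of $I$ behaves like a smoothed second derivative of the image, useful for edge-preserving despeckling and layer segmentation.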
Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary
2014-12-05
Data-dependent acquisition (DDA) and data-independent acquisition strategies (DIA) have both resulted in improved understanding of proteomics samples. Both strategies have advantages and disadvantages that are well-published, where DDA is typically applied for deep discovery and DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.
Murray C. Richardson; Carl P. J. Mitchell; Brian A. Branfireun; Randall K. Kolka
2010-01-01
A new technique for quantifying the geomorphic form of northern forested wetlands from airborne LiDAR surveys is introduced, demonstrating the unprecedented ability to characterize the geomorphic form of northern forested wetlands using high-resolution digital topography. Two quantitative indices are presented, including the lagg width index (LWI) which objectively...
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
ERIC Educational Resources Information Center
Vellut, Natacha; Cook, Jon M.; Tursz, Anne
2012-01-01
Objectives: Using judicial files on neonaticides, (1) to examine the frequency of the association between neonaticide and denial of pregnancy; (2) to assess the accuracy of the concept of denial of pregnancy; (3) to examine its usefulness in programs to prevent neonaticides. Methods: Quantitative and qualitative analyses of data collected from…
ERIC Educational Resources Information Center
Mooss, Angela; Brock-Getz, Petra; Ladner, Robert; Fiano, Theresa
2013-01-01
Objective: The aim of this study was to examine the relationships between health literacy, knowledge of health status, and human immunodeficiency virus/acquired immune deficiency syndrome (HIV/AIDS) transmission beliefs among recipients of Ryan White care. Design: Quota and convenience sampled, quantitative analysis captured with closed and…
Sudden Gains during Psychological Treatments of Anxiety and Depression: A Meta-Analysis
ERIC Educational Resources Information Center
Aderka, Idan M.; Nickerson, Angela; Boe, Hans Jakob; Hofmann, Stefan G.
2012-01-01
Objective: The present study quantitatively reviewed the literature on sudden gains in psychological treatments for anxiety and depression. The authors examined the short- and long-term effects of sudden gains on treatment outcome as well as moderators of these effects. Method: The authors conducted a literature search using PubMed, PsycINFO, the…
The Representation of Islam in the Hungarian Geography Textbooks
ERIC Educational Resources Information Center
Császár, Zsuzsu M.; Vati, Tamás
2012-01-01
This research has been seeking an answer to the question about what kind of image of the Islam is conveyed by the most popular and densely used textbooks to students. In the course of analysis, primary and secondary schools textbooks were examined via quantitative and qualitative methods. The objective demonstration of the research results aims to…
Why Do Photo Finish Images Look Weird?
ERIC Educational Resources Information Center
Gregorcic, Bor; Planinsic, Gorazd
2012-01-01
This paper deals with effects that appear on photographs of rotating objects when taken by a photo finish camera, a rolling shutter camera or a computer scanner. These effects are very similar to Roget's palisade illusion. A simple quantitative analysis of the images is also provided. The effects are explored using a computer scanner in a way that…
USDA-ARS's Scientific Manuscript database
Soybean seeds are major sources of essential amino acids, protein, and fatty acids. Limited information is available on the genetic analysis of amino acid composition in soybean. Therefore, the objective of this study was to identify genomic regions containing quantitative trait loci (QTL) controlli...
ERIC Educational Resources Information Center
Nobile, Maria; Perego, Paolo; Piccinini, Luigi; Mani, Elisa; Rossi, Agnese; Bellina, Monica; Molteni, Massimo
2011-01-01
In order to increase the knowledge of locomotor disturbances in children with autism, and of the mechanism underlying them, the objective of this exploratory study was to reliably and quantitatively evaluate linear gait parameters (spatio-temporal and kinematic parameters), upper body kinematic parameters, walk orientation and smoothness using an…
Ronald S., Jr. Zalesny; Don E. Riemenschneider; Richard B. Hall
2005-01-01
Rooting of hardwood cuttings is under strong genetic control, although genotype x environment interactions affect selection of promising genotypes. Our objectives were (1) to assess the variation in rooting ability among 21 Populus clones and (2) to examine genotype x environment interactions to refine clonal recommendations. The clones belonged to...
A Meta-Analysis of Interventions for Bereaved Children and Adolescents
ERIC Educational Resources Information Center
Rosner, Rita; Kruse, Joachim; Hagl, Maria
2010-01-01
The main objective of this review was to provide a quantitative and methodologically sound evaluation of existing treatments for bereavement and grief reactions in children and adolescents. Two meta-analyses were conducted: 1 on controlled studies and 1 on uncontrolled studies. The 2 meta-analyses were based on a total of 27 treatment studies…
Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R.
2016-01-01
Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard. PMID:27257542
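For reference, the body-weight SUV normalization applied to the PET volumes before segmentation is simply:

```python
def suv_bw(activity_bq_per_ml, injected_dose_bq, body_weight_g):
    """Body-weight standardized uptake value: tissue activity
    concentration divided by injected dose per gram of body weight.
    Decay correction, carried in the DICOM metadata, is omitted."""
    return activity_bq_per_ml / (injected_dose_bq / body_weight_g)
```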
2004-06-01
obtained. Further refinements of the technique based on recent research in experimental psychology are also considered. INTRODUCTION The key... an established line of research in psychology in which objective and subjective metrics are combined to analyse the degree of 'calibration' in... Creelman, 1991). A notable exception is the study by Kunimoto et al. (2001) in which confidence ratings were subjected to SDT analysis to evaluate the
Informational analysis involving application of complex information system
NASA Astrophysics Data System (ADS)
Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael
The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. It has been applied to internal audit, integrating the accounting field with the information systems field. Technological advancements can improve the work performed by internal audit. Thus we aim to find, within complex information systems, priorities for the internal audit work of a major private institution of higher education. The method applied is quali-quantitative: strategic linguistic variables were defined and then transformed into quantitative values through matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the institution's strategic planning, it was possible to draw inferences concerning the points that must be prioritized in the internal audit work. We emphasize that the priorities were identified when the data were processed in a system (of academic use). From the study we can conclude that, starting from these information systems, audit can identify priorities for its work program. Together with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures that work toward the attainment of the organization's objectives.
Laboratory techniques and rhythmometry
NASA Technical Reports Server (NTRS)
Halberg, F.
1973-01-01
Some of the procedures used for the analysis of rhythms are illustrated, notably as these apply to current medical and biological practice. For a quantitative approach to medical and broader socio-ecologic goals, the chronobiologist gathers numerical objective reference standards for rhythmic biophysical, biochemical, and behavioral variables. These biological reference standards can be derived by specialized computer analyses of largely self-measured (until eventually automatically recorded) time series (autorhythmometry). Objective numerical values for individual and population parameters of reproductive cycles can be obtained concomitantly with characteristics of about-yearly (circannual), about-daily (circadian) and other rhythms.
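The specialized computer analyses alluded to here are typified by the single-component cosinor, the least-squares rhythm fit associated with Halberg's chronobiology work. A minimal sketch with illustrative variable names, fitting y(t) = M + A·cos(2πt/τ + φ) in its linearized form:

```python
# Single-component cosinor: y = M + b*cos(wt) + c*sin(wt), solved by least
# squares; A = hypot(b, c), phi = atan2(-c, b). Period tau is assumed known.
import numpy as np

def cosinor(t, y, period=24.0):
    w = 2 * np.pi / period
    X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
    (M, b, c), *_ = np.linalg.lstsq(X, y, rcond=None)
    amplitude = np.hypot(b, c)        # A, the rhythm amplitude
    acrophase = np.arctan2(-c, b)     # phi, in radians
    return M, amplitude, acrophase
```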
Visual Aggregate Analysis of Eligibility Features of Clinical Trials
He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua
2015-01-01
Objective To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition, one eligibility feature at a time. Methods Using the previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query building, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using the Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. Results We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving the quantitative features BMI and HbA1c for the conditions "hypertension" and "Type 2 diabetes", respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. Conclusions We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. PMID:25615940
Multi-object segmentation framework using deformable models for medical imaging analysis.
Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel
2016-08-01
Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures on medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation, combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions, such as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows several deformable models to be integrated to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing a suitable combination to be selected for different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed framework has a wide range of applications, especially in the presence of adjacent structures of interest or intra-structure inhomogeneities, giving excellent quantitative results.
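As context, the single-object building block that DMA coordinates can be illustrated with scikit-image's classic active contour. This sketch evolves two independent snakes on a sample image and deliberately omits DMA's cooperative control module; initial circle positions are illustrative:

```python
# Not the DMA framework itself: two parametric snakes evolved independently
# with skimage's classic active contour on the built-in coins image.
import numpy as np
from skimage import data, filters
from skimage.segmentation import active_contour

image = filters.gaussian(data.coins(), sigma=2)

def circle(r0, c0, radius, n=200):
    """Initial snake as (row, col) vertices on a circle."""
    s = np.linspace(0, 2 * np.pi, n)
    return np.column_stack([r0 + radius * np.sin(s), c0 + radius * np.cos(s)])

# One initial snake per target object; each evolves without interaction.
snakes = [active_contour(image, circle(r0, c0, 25),
                         alpha=0.015, beta=10, gamma=0.001)
          for r0, c0 in [(60, 55), (60, 150)]]
```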
Rigoard, P; Nivole, K; Blouin, P; Monlezun, O; Roulaud, M; Lorgeoux, B; Bataille, B; Guetarni, F
2015-03-01
One of the major challenges of neurostimulation is currently to address the back pain component in patients suffering from refractory chronic back and leg pain. Facing a tremendous expansion of neurostimulation techniques and available devices, implanters and patients can still remain confused, as they need to select the right tool for the right indication. To evaluate and compare patient outcomes objectively, depending on therapeutic strategy, it appears essential to develop a rational and quantitative approach to pain assessment for those who undergo neurostimulation implantation. We developed a touch-screen interface, at Poitiers University Hospital and N(3)Lab, called the "Neuro-Pain'T", to detect, record and quantify changes in painful area surface and intensity in an implanted patient over time. The second aim of this software is to analyse the link between the paraesthesia coverage generated by a given type of neurostimulation and a potential analgesic effect, measured by pain surface reduction, pain intensity reduction within the painful surface, and local change in the distribution of pain characteristics. The third aim of Neuro-Pain'T is to correlate these clinical parameters with global patient data and functional outcome analysis, via a network database (Neuro-Database), to provide a concise but objective measure of neurostimulation efficacy, summarized by an index called the "RFG Index". This software has been used in more than 190 patients since 2012, leading us to define three clinical parameters, grouped as the clinical component of the RFG Index, which might be helpful to assess neurostimulation efficacy and compare implanted devices. The Neuro-Pain'T is an original software application designed to objectively and quantitatively characterize the reduction of a painful area in a given individual, in terms of intensity, surface and pain typology, in response to a treatment strategy or implantation of an analgesic device. Because pain is a physical sensation that integrates a psychological dimension, its assessment justifies the use of multidimensional and global evaluation scales. However, in the context of neurostimulation and comparative clinical trials designed to test the technical efficacy of a given device, a simple, objective and quantitative evaluation tool could help to guide tomorrow's treatment options by transforming personal convictions into a more robust scientific rationale based on data collection and data mining techniques. Copyright © 2014. Published by Elsevier Masson SAS.
Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun
2015-02-01
Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
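For concreteness, the standard random-effects pooling that such a meta-analysis starts from is sketched below (DerSimonian-Laird estimator over per-study performance estimates such as within-subject coefficients of variation). Note the paper's point is precisely that this standard approach degrades when studies are small:

```python
# DerSimonian-Laird random-effects pooling of per-study estimates theta
# with within-study variances var; illustrative, not the paper's alternatives.
import numpy as np

def dersimonian_laird(theta, var):
    theta, var = np.asarray(theta, float), np.asarray(var, float)
    w = 1.0 / var
    theta_fixed = np.sum(w * theta) / np.sum(w)
    q = np.sum(w * (theta - theta_fixed) ** 2)      # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(theta) - 1)) / c)     # between-study variance
    w_star = 1.0 / (var + tau2)
    pooled = np.sum(w_star * theta) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2
```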
A spherical aberration-free microscopy system for live brain imaging.
Ue, Yoshihiro; Monai, Hiromu; Higuchi, Kaori; Nishiwaki, Daisuke; Tajima, Tetsuya; Okazaki, Kenya; Hama, Hiroshi; Hirase, Hajime; Miyawaki, Atsushi
2018-06-02
The high-resolution in vivo imaging of mouse brain for quantitative analysis of fine structures, such as dendritic spines, requires objectives with high numerical apertures (NAs) and long working distances (WDs). However, this imaging approach is often hampered by spherical aberration (SA) that results from the mismatch of refractive indices in the optical path and becomes more severe with increasing depth of the target from the brain surface. Whereas a revolving objective correction collar has been designed to compensate for SA, its adjustment requires manual operation and is inevitably accompanied by considerable focal shift, making it difficult to acquire the best image of a given fluorescent object. To solve these problems, we have created an objective-attached device and formulated a fast iterative algorithm to realize an automatic SA compensation system. The device coordinates the collar rotation and the Z-position of an objective, enabling correction collar adjustment while stably focusing on a target. The algorithm provides the best adjustment on the basis of the calculated contrast of acquired images. Together, they enable the system to compensate for SA at a given depth. As proof of concept, we applied the SA compensation system to in vivo two-photon imaging with a 25× water-immersion objective (NA, 1.05; WD, 2 mm). It effectively reduced SA regardless of location, allowing quantitative and reproducible analysis of fine structures of YFP-labeled neurons in the mouse cerebral cortical layers. Interestingly, although the cortical structure was optically heterogeneous along the z-axis, the refractive index of each layer could be assessed on the basis of the compensation degree. It was also possible to make fully corrected three-dimensional reconstructions of YFP-labeled neurons in live brain samples. Our SA compensation system, called Deep-C, is expected to bring out the best in all correction-collar-equipped objectives for imaging deep regions of heterogeneous tissues. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
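The iterative contrast-driven adjustment can be caricatured as a bounded one-dimensional search over collar angle. A sketch with a simulated contrast response standing in for image acquisition; none of the names below belong to the Deep-C system:

```python
# Bounded scalar search maximizing an image-contrast score over a
# (hypothetical) collar angle; the Gaussian response simulates acquisition.
import numpy as np
from scipy.optimize import minimize_scalar

def simulated_contrast(angle_deg, best=62.0):
    # Stand-in for: rotate collar, refocus, acquire image, score contrast.
    return np.exp(-((angle_deg - best) / 15.0) ** 2)

res = minimize_scalar(lambda a: -simulated_contrast(a),
                      bounds=(0.0, 180.0), method="bounded")
best_angle = res.x   # ~62 degrees for the simulated response
```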
Quantitative Market Research Regarding Funding of District 8 Construction Projects
DOT National Transportation Integrated Search
1995-05-01
The primary objective of this quantitative research is to provide information for more effective decision making regarding the level of investment in various transportation systems in District 8. This objective was accomplished by establishing ...
Discomfort Evaluation of Truck Ingress/Egress Motions Based on Biomechanical Analysis
Choi, Nam-Chul; Lee, Sang Hun
2015-01-01
This paper presents a quantitative discomfort evaluation method based on biomechanical analysis results for human body movement, as well as its application to an assessment of the discomfort for truck ingress and egress. In this study, the motions of a human subject entering and exiting truck cabins with different types, numbers, and heights of footsteps were first measured using an optical motion capture system and load sensors. Next, the maximum voluntary contraction (MVC) ratios of the muscles were calculated through a biomechanical analysis of the musculoskeletal human model for the captured motion. Finally, the objective discomfort was evaluated using the proposed discomfort model based on the MVC ratios. To validate this new discomfort assessment method, human subject experiments were performed to investigate the subjective discomfort levels through a questionnaire for comparison with the objective discomfort levels. The validation results showed that the correlation between the objective and subjective discomforts was significant and could be described by a linear regression model. PMID:26067194
Salgia, Gaurav; Kulkarni, Deepak G; Shetty, Lakshmi
2015-01-01
C-reactive protein (CRP) estimation was used for quantitative analysis to assess the anti-inflammatory action of nonsteroidal anti-inflammatory drugs (NSAIDs) after maxillofacial surgery. This study evaluated the efficacy of CRP as a quantitative measure for objective assessment of the efficacy of three NSAIDs in postoperative inflammation and pain control. A parallel-group randomized study design was used. In total, 60 patients were divided into three groups. CRP was evaluated at baseline and postoperatively (immediately and at 72 h) after surgical removal of impacted lower third molars. Each group received its drug by random coding postoperatively. Pain control and inflammation after surgical removal of impacted lower third molars were assessed qualitatively and quantitatively using CRP levels. The patients' blood samples were assessed immediately postoperatively and after 72 h. The visual analog scale (VAS) was used for assessment of pain and its correlation with CRP levels. The difference between immediate postoperative and baseline CRP levels was significant (P < 0.05). The association between duration of surgery and CRP levels was nonsignificant (P = 0.425). The pain score was increased with mefenamic acid (P = 0.003), which was significant on the VAS. Diclofenac had the best anti-inflammatory action. There was a significant increase in CRP levels from baseline to the immediate postoperative and 72-h measurements. The CRP test proved to be a useful quantitative assessment tool for monitoring postsurgical inflammation and the therapeutic effects of various anti-inflammatory drugs, and for comparative evaluation of NSAIDs.
Kudella, Patrick Wolfgang; Moll, Kirsten; Wahlgren, Mats; Wixforth, Achim; Westerhausen, Christoph
2016-04-18
Rosetting is associated with severe malaria and is a primary cause of death in Plasmodium falciparum infections. Detailed understanding of this adhesive phenomenon may enable the development of new therapies interfering with rosette formation. For this, it is crucial to determine parameters such as rosetting and parasitaemia of laboratory strains or patient isolates, a bottleneck in malaria research due to the time-consuming and error-prone manual analysis of specimens. Here, the automated, free, stand-alone analysis software Automated Rosetting Analyzer for Micrographs (ARAM), which determines rosetting rate, rosette size distribution and parasitaemia with a convenient graphical user interface, is presented. ARAM is an executable with two operation modes for automated identification of objects in images. The default mode detects red blood cells and fluorescently labelled parasitized red blood cells by combining an intensity-gradient filter with a threshold filter. The second mode determines object location and size distribution by a single contrast method. The obtained results are compared with standardized manual analysis. ARAM calculates statistical confidence probabilities for rosetting rate and parasitaemia, and analyses 25 cell objects per second, reliably delivering results identical to manual analysis. For the first time, rosette size distribution is determined in a precise and quantitative manner by employing ARAM in combination with established inhibition tests. Additionally, ARAM measures the essential observables parasitaemia, rosetting rate, and the size and location of all detected objects, and provides confidence intervals for the determined observables. No other existing software solution offers this range of functions. The second, non-malaria-specific analysis mode of ARAM offers the functionality to detect arbitrary objects. ARAM has the capability to push malaria research to a more quantitative and statistically significant level, with increased reliability due to operator independence. As an installation file for Windows 7, 8.1 and 10 is available for free, ARAM offers a novel, open and easy-to-use platform for the malaria community to elucidate rosetting.
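ARAM's default detection mode, as described, combines intensity and threshold criteria. A much-simplified stand-in using scikit-image (Otsu threshold plus connected components; not ARAM's actual pipeline, and it assumes cells are brighter than background):

```python
# Count cell-like objects in a grayscale field and derive parasitaemia from
# a second (fluorescence) channel; the area cutoff is an arbitrary choice.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def count_cells(gray):
    mask = gray > threshold_otsu(gray)
    regions = [r for r in regionprops(label(mask)) if r.area > 50]  # drop debris
    return len(regions)

def parasitaemia_percent(brightfield, fluorescence):
    n_rbc = count_cells(brightfield)
    n_infected = count_cells(fluorescence)
    return 100.0 * n_infected / max(n_rbc, 1)
```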
Hu, Yong; Kwok, Jerry Weilun; Tse, Jessica Yuk-Hang; Luk, Keith Dip-Kei
2014-06-01
Nonsurgical rehabilitation therapy is a commonly used strategy to treat chronic low back pain (LBP). The selection of the most appropriate therapeutic options remains a big challenge in clinical practice. Surface electromyography (sEMG) topography has been proposed as an objective assessment of LBP rehabilitation. The quantitative analysis of dynamic sEMG would provide an objective prognostic tool for LBP rehabilitation. To evaluate the prognostic value of quantitative sEMG topographic analysis and to verify the accuracy of the proposed time-varying topographic parameters for identifying patients who have a better response to the rehabilitation program. A retrospective study of consecutive patients. Thirty-eight patients with chronic nonspecific LBP and 43 healthy subjects. The accuracy of the time-varying quantitative sEMG topographic analysis for monitoring LBP rehabilitation progress was determined by calculating the corresponding receiver-operating characteristic (ROC) curves. The physiologic measure was sEMG during lumbar flexion and extension. Patients suffering from chronic nonspecific LBP, without a history of back surgery or any medical condition causing acute exacerbation of LBP during the clinical test, were enlisted to perform the clinical test during the 12-week physiotherapy (PT) treatment. LBP patients were classified into two groups, "responding" and "nonresponding", based on the clinical assessment. The responding group comprised the LBP patients who began to recover after the PT treatment, whereas the nonresponding group comprised those who did not recover or got worse after the treatment. The results of the time-varying analysis in the responding group were compared with those in the nonresponding group. In addition, the accuracy of the analysis was analyzed through ROC curves. The time-varying analysis showed discrepancies in the root-mean-square difference (RMSD) parameters between the responding and nonresponding groups. The relative area (RA) and relative width (RW) of RMSD at flexion and extension in the responding group were significantly lower than those in the nonresponding group (p<.05). The areas under the ROC curves of RA and RW of RMSD at flexion and extension were greater than 0.7 and were statistically significant. The quantitative time-varying analysis of sEMG topography showed significant differences between the healthy and LBP groups. The discrepancies in quantitative dynamic sEMG topography of the LBP group from the normal group, in terms of RA and RW of RMSD at flexion and extension, were able to identify those LBP subjects who would respond to a conservative rehabilitation program focused on functional restoration of the lumbar muscles. Copyright © 2014 Elsevier Inc. All rights reserved.
Multicriteria decision analysis applied to Glen Canyon Dam
Flug, M.; Seitz, H.L.H.; Scott, J.F.
2000-01-01
Conflicts in water resources exist because river-reservoir systems are managed to optimize traditional benefits (e.g., hydropower and flood control), which are historically quantified in economic terms, whereas natural and environmental resources, including in-stream and riparian resources, are more difficult or impossible to quantify in economic terms. Multicriteria decision analysis provides a quantitative approach to evaluate resources subject to river basin management alternatives. This objective quantification method includes inputs from special interest groups, the general public, and concerned individuals, as well as professionals for each resource considered in a trade-off analysis. Multicriteria decision analysis is applied to resources and flow alternatives presented in the environmental impact statement for Glen Canyon Dam on the Colorado River. A numeric rating and priority-weighting scheme is used to evaluate 29 specific natural resource attributes, grouped into seven main resource objectives, for nine flow alternatives enumerated in the environmental impact statement.
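At its core, the rating-and-weighting scheme described is a weighted-sum aggregation. A toy illustration with made-up numbers; the EIS itself uses 29 attributes grouped into 7 objectives across 9 flow alternatives:

```python
# Weighted-sum MCDA: rows are alternatives, columns are resource objectives;
# weights encode stakeholder priorities and sum to 1. All values illustrative.
import numpy as np

ratings = np.array([[7, 4, 6],      # alternative 1: rating per objective
                    [5, 8, 5],      # alternative 2
                    [6, 6, 8]])     # alternative 3
weights = np.array([0.5, 0.3, 0.2])

utility = ratings @ weights          # aggregate score per alternative
best = int(np.argmax(utility))       # preferred alternative under these weights
```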
Mirsky, Simcha K; Barnea, Itay; Levi, Mattan; Greenspan, Hayit; Shaked, Natan T
2017-09-01
Currently, the delicate process of selecting sperm cells for in vitro fertilization (IVF) is still based on the subjective, qualitative analysis of experienced clinicians using non-quantitative optical microscopy techniques. In this work, a method was developed for the automated analysis of sperm cells based on quantitative phase maps acquired through interferometric phase microscopy (IPM). Over 1,400 human sperm cells from 8 donors were imaged using IPM, and an algorithm was designed to digitally isolate sperm cell heads from the quantitative phase maps, taking into consideration both the cell's 3D morphology and its contents, and to extract features describing sperm head morphology. A subset of these features was used to train a support vector machine (SVM) classifier to automatically classify sperm of good and bad morphology. The SVM achieves an area under the receiver operating characteristic curve of 88.59% and an area under the precision-recall curve of 88.67%, as well as precisions of 90% or higher. We believe that our automatic analysis can become the basis for objective and automatic sperm cell selection in IVF. © 2017 International Society for Advancement of Cytometry.
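The classification step can be sketched with scikit-learn, assuming a precomputed feature matrix; the synthetic data below only demonstrate the workflow, not the reported performance:

```python
# RBF-kernel SVM on (synthetic) head-morphology features, scored by
# cross-validated ROC AUC as in the paper; AUC uses the decision function.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))        # stand-in for per-cell morphology features
y = rng.integers(0, 2, size=200)      # 1 = good morphology, 0 = bad

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
```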
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process and enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e., input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, and also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques
Rosebrock, Adrian; Caban, Jesus J.; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark
2014-01-01
Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age-appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to describe and classify TDLUs more objectively. To validate the accuracy of our system, we compared the computer-based morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank to the assessments of a pathologist, demonstrating 70% agreement. Secondly, to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures of the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant. PMID:25722829
Keighren, Margaret A; Flockhart, Jean; Hodson, Benjamin A; Shen, Guan-Yi; Birtley, James R; Notarnicola-Harwood, Antonio; West, John D
2015-08-01
Recent reports of a new generation of ubiquitous transgenic chimaera markers prompted us to consider the criteria used to evaluate new chimaera markers and develop more objective assessment methods. To investigate this experimentally we used several series of fetal and adult chimaeras, carrying an older, multi-copy transgenic marker. We used two additional independent markers and objective, quantitative criteria for cell selection and cell mixing to investigate quantitative and spatial aspects of developmental neutrality. We also suggest how the quantitative analysis we used could be simplified for future use with other markers. As a result, we recommend a five-step procedure for investigators to evaluate new chimaera markers based partly on criteria proposed previously but with a greater emphasis on examining the developmental neutrality of prospective new markers. These five steps comprise (1) review of published information, (2) evaluation of marker detection, (3) genetic crosses to check for effects on viability and growth, (4) comparisons of chimaeras with and without the marker and (5) analysis of chimaeras with both cell populations labelled. Finally, we review a number of different chimaera markers and evaluate them using the extended set of criteria. These comparisons indicate that, although the new generation of ubiquitous fluorescent markers are the best of those currently available and fulfil most of the criteria required of a chimaera marker, further work is required to determine whether they are developmentally neutral.
Carrer, Francesco
2017-01-01
This paper deals with the ethnoarchaeological analysis of the spatial pattern of artefacts and ecofacts within two traditional pastoral huts (a dwelling and a seasonal dairy) in the uplands of Val Maudagna (Cuneo province, Italian western Alps). The composition of the ethnoarchaeological assemblages of the two huts was studied and compared; point pattern analysis was applied to identify spatial processes mirrored in the interactions between objects; Moran's I correlogram and empirical variogram were used to investigate the effects of trampling on the displacement of objects on the floor. The results were compared with information provided by the herder who still used the huts. The quantitative and ethnographical data enabled inferences to be made that can help in the interpretation of archaeological seasonal sites. The function of a seasonal site can be recognized, as can the impact of delayed curation on the composition of the assemblage and the importance of the intensity of occupation compared with the frequency of occupation. The spatial organization of activities is reflected in the spatial patterns of objects, with clearer identification of activity areas in intensively occupied sites, and there is evidence for the behaviour behind the spatial segregation of activities. Trampling is a crucial post-depositional factor in the displacement of artefacts and ecofacts, especially in non-intensively exploited sites. From a methodological point of view, this research is another example that highlights the importance of integrating quantitative methods (especially spatial analysis and geostatistical methods) and ethnoarchaeological data in order to improve the interpretation of archaeological sites and assemblages.
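Moran's I, used here for the trampling analysis, has a compact closed form: I = (n/S0) · Σij wij zi zj / Σi zi², with z the mean-centered attribute and S0 the sum of the weights. A minimal implementation, assuming a precomputed spatial weights matrix built from inter-object distances:

```python
# Moran's I for spatial autocorrelation of an object attribute x, given an
# (n x n) spatial weights matrix w with zero diagonal; names illustrative.
import numpy as np

def morans_i(x, w):
    x = np.asarray(x, float)
    z = x - x.mean()
    n = len(x)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)
```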
Animal behavior as a conceptual framework for the study of obsessive-compulsive disorder (OCD).
Eilam, David; Zor, Rama; Fineberg, Naomi; Hermesh, Haggai
2012-06-01
Research on affective disorders may benefit from the methodology of studying animal behavior, in which tools are available for qualitatively and quantitatively measuring and assessing behavior with as much sophistication and attention to detail as in the analysis of the brain. To illustrate this, we first briefly review the characteristics of obsessive-compulsive disorder (OCD), and then demonstrate how the quinpirole rat model is used as a conceptual model in studying human OCD patients. Like the rat model, the study of OCD in humans is based on video-telemetry, whereby observable, measurable, and relatively objective characteristics of OCD behavior may be extracted. In this process, OCD rituals are defined in terms of the space in which they are executed and the movements (acts) that are performed at each location or object in this space. Accordingly, OCD behavior is conceived of as comprising three hierarchical components: (i) rituals (as defined by the patients); (ii) visits to objects/locations in the environment at which the patient stops during the ritual; and (iii) acts performed at each object/location during visits. Scoring these structural components (behavioral units) is conveniently possible with readily available tools for behavioral description and analysis, providing quantitative and qualitative measures of the OCD hallmarks of repetition and addition, as well as the reduced functionality in OCD behavior. Altogether, the concept that was developed in the context of an animal model provides a useful tool that may facilitate OCD diagnosis, assessment and treatment, and may be similarly applied for other psychiatric disorders. Copyright © 2011 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Abouelenein, Mahmoud S.
2012-01-01
The purpose of this quantitative, descriptive research study was to determine, through statistical analysis, any correlation between the perceived transformational leadership traits of CIOs at two-year community colleges in Kansas and measures of the job satisfaction among IT workers at those community colleges. The objectives of this research…
USDA-ARS?s Scientific Manuscript database
Soybean (Glycine max (L.) Merr.) is a major source of plant protein for humans and livestock. Low levels of sulfur containing amino acids (cysteine and methionine) in soybean protein is the main limitation of soybean meal as animal food. The objectives of this study were to identify and validate Q...
ERIC Educational Resources Information Center
Robbins, Chandan Morris
2012-01-01
The objectives of this work are: (1) to agarose-stabilize fragile biofilms for quantitative structure analysis; (2) to understand the influences of LuxS on biofilm formation; (3) to improve teacher quality by preparing Georgia's middle school science teachers to integrate inquiry-based, hands-on research modules in the classroom. Quantitative…
ERIC Educational Resources Information Center
Wang, Ye; Willis, Erin
2016-01-01
Objective: To examine whether and to what extent relevant and meaningful discussions of weight loss occurred in the Weight Watchers' online community, and whether and to what extent the online community is designed for fostering such discussions. Methods: A multimethod approach was used here. First, a quantitative content analysis was conducted on…
Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo
2014-01-01
This paper took a subregion of a small watershed gully system at the Beiyanzikou catchment of Qixia, China, as a study area and, using object-oriented image analysis (OBIA), extracted the shoulder lines of gullies from high-spatial-resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the distance between the boundary classified by remote sensing and points measured by RTK-GPS along the gully shoulder line. Finally, the original surface was fitted using linear regression in accordance with the elevation of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract gully information; the average distance between points field-measured along the gully edge and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m² and 5074.1790 m³, and 1316.1250 m² and 1591.5784 m³, respectively. The results of the study provide a new method for the quantitative study of small gully erosion.
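The proposed accuracy measure amounts to nearest-neighbour distances from the GPS check points to the classified boundary. A sketch assuming the boundary is available as densely sampled polyline vertices:

```python
# Mean and variance of nearest distances from RTK-GPS points to the
# classified shoulder line; both inputs are (n, 2) coordinate arrays.
import numpy as np
from scipy.spatial import cKDTree

def boundary_accuracy(gps_points, boundary_vertices):
    distances, _ = cKDTree(boundary_vertices).query(gps_points)
    return distances.mean(), distances.var()
```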
Sun, Ziwen; Shi, Xiaomin; Wang, Yun; Zhao, Yi
2018-06-05
An objective and quantitative method to evaluate psoriasis severity is important for practice and research in the precision care of psoriasis. We aimed to explore serum biomarkers quantitatively in association with disease severity and treatment response in psoriasis patients, with serum squamous cell carcinoma antigen (SCCA) evaluated in this pilot study. Fifteen psoriasis patients were treated with adalimumab. At different visits before and after treatment, quantitative body surface area (qBSA) was obtained from standardized digital body images of the patients, and the psoriasis area severity index (PASI) was also monitored. SCCA was detected using a microparticle enzyme immunoassay. The serum biomarkers were also tested in healthy volunteers as normal controls. Receiver-operating characteristic (ROC) curve analysis was used to explore the optimal cutoff point of SCCA to differentiate mild from moderate-to-severe psoriasis. The serum SCCA level in the psoriasis group was significantly higher (p < 0.05) than in the normal control group. After treatment, the serum SCCA levels were significantly decreased (p < 0.05). The SCCA level was well correlated with PASI and qBSA. In ROC analysis, when taking PASI = 10 or qBSA = 10% as the threshold, an optimal cutoff point of SCCA was found at 2.0 ng/mL with the highest Youden index. Serum SCCA might be a useful quantitative biomarker for psoriasis disease severity. © 2018 S. Karger AG, Basel.
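The cutoff selection reduces to maximizing the Youden index (J = sensitivity + specificity − 1) along the ROC curve. A sketch with illustrative values, not the study's data:

```python
# ROC-optimal biomarker cutoff via the Youden index; severe = 1 when, say,
# PASI >= 10. Both arrays below are made up for illustration.
import numpy as np
from sklearn.metrics import roc_curve

scca   = np.array([0.8, 1.1, 1.6, 2.1, 2.4, 3.0, 3.8, 1.3, 2.6, 0.9])
severe = np.array([0,   0,   0,   1,   1,   1,   1,   0,   1,   0  ])

fpr, tpr, thresholds = roc_curve(severe, scca)
cutoff = thresholds[np.argmax(tpr - fpr)]   # argmax of the Youden index
```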
Resolving power of diffraction imaging with an objective: a numerical study.
Wang, Wenjin; Liu, Jing; Lu, Jun Qing; Ding, Junhua; Hu, Xin-Hua
2017-05-01
Diffraction imaging in the far field can detect 3D morphological features of an object owing to its coherent nature. We describe methods for accurate calculation and analysis of diffraction images of single and double spheres by an imaging unit based on a microscope objective at non-conjugate positions. A quantitative study of the calculated diffraction images in the spectral domain has been performed to assess the resolving power of diffraction imaging. It has been shown numerically that, with coherent illumination of 532 nm in wavelength, the imaging unit can resolve single spheres of 2 μm or larger in diameter and double spheres separated by less than 300 nm between their centers.
Objective analysis of pseudostress over the Indian Ocean using a direct-minimization approach
NASA Technical Reports Server (NTRS)
Legler, David M.; Navon, I. M.; O'Brien, James J.
1989-01-01
A technique not previously used in the objective analysis of meteorological data is used here to produce monthly average surface pseudostress data over the Indian Ocean. An initial guess field is derived and a cost functional is constructed with five terms: approximation to the initial guess, approximation to climatology, a smoothness parameter, and two kinematic terms. The functional is minimized using a conjugate-gradient technique, and the weight for the climatology term controls the overall balance of influence between the climatology and the initial guess. Results from various weight combinations are presented for January and July 1984. Quantitative and qualitative comparisons to the subjective analysis are made to find which weight combination provides the best results.
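The direct-minimization approach can be miniaturized as follows: a gridded field is fit by conjugate gradients to a functional with data-fidelity, climatology and smoothness terms. The paper's two kinematic terms are omitted here, and all weights and fields are illustrative:

```python
# Toy objective analysis: minimize a three-term cost functional over a grid
# with scipy's conjugate-gradient method; periodic boundaries via np.roll.
import numpy as np
from scipy.optimize import minimize

ny, nx = 20, 30
obs = np.random.default_rng(1).normal(size=(ny, nx))   # initial-guess field
clim = np.zeros((ny, nx))                              # climatology

def cost(u_flat, w_obs=1.0, w_clim=0.1, w_smooth=0.5):
    u = u_flat.reshape(ny, nx)
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)   # discrete Laplacian
    return (w_obs * np.sum((u - obs) ** 2) +
            w_clim * np.sum((u - clim) ** 2) +
            w_smooth * np.sum(lap ** 2))

res = minimize(cost, obs.ravel(), method="CG", options={"maxiter": 200})
analysis = res.x.reshape(ny, nx)
```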
Phase retrieval with the reverse projection method in the presence of object's scattering
NASA Astrophysics Data System (ADS)
Wang, Zhili; Gao, Kun; Wang, Dajiang
2017-08-01
X-ray grating interferometry can provide substantially increased contrast over traditional attenuation-based techniques in biomedical applications, and therefore novel and complementary information. Recently, special attention has been paid to quantitative phase retrieval in X-ray grating interferometry, which is mandatory to perform phase tomography, to achieve material identification, etc. An innovative approach, dubbed "Reverse Projection" (RP), has been developed for quantitative phase retrieval. The RP method abandons grating scanning completely and is thus advantageous in terms of higher efficiency and reduced radiation damage. Therefore, it is expected that this novel method will find use in preclinical and clinical implementations. Strictly speaking, the reverse projection method is applicable only to objects exhibiting absorption and refraction. In this contribution, we discuss phase retrieval with the reverse projection method for general objects exhibiting absorption, refraction and scattering simultaneously. In particular, we investigate the influence of the object's scattering on the retrieved refraction signal. Both theoretical analysis and numerical experiments are performed. The results show that the retrieved refraction signal is the product of the object's refraction and scattering signals for small signal values. In the case of strong scattering, the reverse projection method cannot provide reliable phase retrieval. These results will guide the use of the reverse projection method in future practical applications and help to explain some possible artifacts in retrieved images and/or reconstructed slices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, D.G.; Parks, J.M.
1984-04-01
Silhouette shapes are two-dimensional projections of three-dimensional objects such as sand grains, gravel, and fossils. Within-the-margin markings such as chamber boundaries, sutures, or ribs are ignored. Comparison between populations of objects from similar and different origins (i.e., environments, species or genera, growth series, etc.) is aided by quantifying the shapes. The Multiple Rotations Method (MRM) uses a variation of "eigenshapes", which is capable of distinguishing most of the subtle variations that the "trained eye" can detect. With a video-digitizer and microcomputer, MRM is fast, more accurate, and more objective than the human eye. The resulting shape descriptors comprise 5 or 6 numbers per object that can be stored and retrieved to compare with similar descriptions of other objects. The original shape outlines can be reconstituted sufficiently for gross recognition from these few numerical descriptors. Thus, a semi-automated data-retrieval system becomes feasible, with silhouette-shape descriptions as one of several recognition criteria. MRM consists of four "rotations": rotation about a center to a comparable orientation; a principal-components rotation to reduce the many original shape descriptors to a few; a VARIMAX orthogonal-factor rotation to achieve simple structure; and a rotation to achieve factor scores on individual objects. A variety of subtly different shapes includes sand grains from several locations, ages, and environments, and fossils of several types. This variety illustrates the feasibility of quantitative comparisons by MRM.
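The principal-components rotation at the heart of MRM is the "eigenshape" reduction. A bare-bones version via SVD, omitting the orientation and VARIMAX rotations that MRM adds:

```python
# Reduce per-silhouette outline descriptors to a few principal-component
# scores; `outlines` is a (n_shapes, n_descriptors) array, names illustrative.
import numpy as np

def eigenshape_scores(outlines, n_components=5):
    X = outlines - outlines.mean(axis=0)          # center across shapes
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T                # compact per-object descriptors
```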
Longitudinal study of skin aging: from microrelief to wrinkles.
Bazin, Roland; Lévêque, Jean Luc
2011-05-01
To study the changes in skin microrelief and periocular wrinkles during the aging process. Replicas of the crow's feet area of volunteers were recorded in 1987 and 2008 and observed comparatively. Characteristic features were quantified by image analysis. Observation shows that some microrelief features disappear and even merge with wrinkles that become more marked. Some primary lines also tend to merge to form thin new wrinkles. Quantitative data support these observations: the size of small and medium skin-relief objects decreases with age, while large objects become larger. Over the 21 years, in the group studied, the total area of the detected objects remains quite constant. Only the distribution between small and large detected objects (microrelief features and wrinkles, respectively) is modified. © 2011 John Wiley & Sons A/S.
NASA Technical Reports Server (NTRS)
Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.
1993-01-01
The Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, has developed a prototype interactive software system called the Spectral Image Processing System (SIPS) using IDL (the Interactive Data Language) on UNIX-based workstations. SIPS is designed to take advantage of the combination of high spectral resolution and spatial data presentation unique to imaging spectrometers. It streamlines analysis of these data by allowing scientists to rapidly interact with entire datasets. SIPS provides visualization tools for rapid exploratory analysis and numerical tools for quantitative modeling. The user interface is X-Windows-based, user friendly, and provides 'point and click' operation. SIPS is being used for multidisciplinary research concentrating on use of physically based analysis methods to enhance scientific results from imaging spectrometer data. The objective of this continuing effort is to develop operational techniques for quantitative analysis of imaging spectrometer data and to make them available to the scientific community prior to the launch of imaging spectrometer satellite systems such as the Earth Observing System (EOS) High Resolution Imaging Spectrometer (HIRIS).
Shin, S M; Choi, Y-S; Yamaguchi, T; Maki, K; Cho, B-H; Park, S-B
2015-01-01
Objectives: To evaluate axial cervical vertebral (ACV) shape quantitatively and to build a prediction model for skeletal maturation level using statistical shape analysis for Japanese individuals. Methods: The sample included 24 female and 19 male patients with hand–wrist radiographs and CBCT images. Through generalized Procrustes analysis and principal components (PCs) analysis, the meaningful PCs were extracted from each ACV shape and analysed for the estimation regression model. Results: Each ACV shape had meaningful PCs, except for the second axial cervical vertebra. Based on these models, the smallest prediction intervals (PIs) were from the combination of the shape space PCs, age and gender. Overall, the PIs of the male group were smaller than those of the female group. There was no significant correlation between centroid size as a size factor and skeletal maturation level. Conclusions: Our findings suggest that the ACV maturation method, which was applied by statistical shape analysis, could confirm information about skeletal maturation in Japanese individuals as an available quantifier of skeletal maturation and could be as useful a quantitative method as the skeletal maturation index. PMID:25411713
Peeters, Michael J; Vaidya, Varun A
2016-06-25
Objective. To describe an approach for assessing the Accreditation Council for Pharmacy Education's (ACPE) doctor of pharmacy (PharmD) Standard 4.4, which focuses on students' professional development. Methods. This investigation used mixed methods with triangulation of qualitative and quantitative data to assess professional development. Qualitative data came from an electronic developmental portfolio of professionalism and ethics, completed by PharmD students during their didactic studies. Quantitative confirmation came from the Defining Issues Test (DIT), an assessment of pharmacists' professional development. Results. Qualitatively, students' development reflections described growth through this course series. Quantitatively, the 2015 PharmD class's DIT N2-scores illustrated positive development overall; the lower 50% had a large initial improvement compared to the upper 50%. Subsequently, the 2016 PharmD class confirmed these average initial improvements and also showed further substantial development among students thereafter. Conclusion. Applying an assessment-for-learning approach, triangulation of qualitative and quantitative assessments confirmed that PharmD students developed professionally during this course series.
Ha, Richard; Mema, Eralda; Guo, Xiaotao; Mango, Victoria; Desperito, Elise; Ha, Jason; Wynn, Ralph; Zhao, Binsheng
2016-04-01
The amount of fibroglandular tissue (FGT) has been linked to breast cancer risk based on mammographic density studies. Currently, the qualitative assessment of FGT on mammogram (MG) and magnetic resonance imaging (MRI) is prone to intra- and inter-observer variability. The purpose of this study is to develop an objective quantitative FGT measurement tool for breast MRI that could provide significant clinical value. An IRB-approved study was performed. Sixty breast MRI cases with qualitative assessment of mammographic breast density and MRI FGT were randomly selected for quantitative analysis from routine breast MRIs performed at our institution from 1/2013 to 12/2014. Blinded to the qualitative data, whole breast and FGT contours were delineated on T1-weighted pre-contrast sagittal images using an in-house, proprietary segmentation algorithm which combines region-based active contours and a level set approach. FGT (%) was calculated as [segmented FGT volume (mm³) / segmented whole-breast volume (mm³)] × 100. Statistical correlation analysis was performed between quantified FGT (%) on MRI and qualitative assessments of mammographic breast density and MRI FGT. There was a significant positive correlation between quantitative MRI FGT assessment and qualitative MRI FGT assessment (r=0.809, n=60, P<0.001) and mammographic density assessment (r=0.805, n=60, P<0.001). There was a significant correlation between qualitative MRI FGT assessment and mammographic density assessment (r=0.725, n=60, P<0.001). The four qualitative assessment categories of FGT corresponded to calculated mean quantitative FGT (%) of 4.61% (95% CI, 0-12.3%), 8.74% (7.3-10.2%), 18.1% (15.1-21.1%), and 37.4% (29.5-45.3%). Quantitative measures of FGT (%) were computed from breast MRI data and correlated significantly with conventional qualitative assessments. This quantitative technique may prove to be a valuable clinical tool by providing computer-generated, standardized measurements with limited intra- or inter-observer variability.
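The reported formula is a direct ratio of segmented volumes. As a one-liner over binary masks, with mask names and voxel spacing illustrative (the voxel volume cancels, but is kept to mirror the paper's formula):

```python
# FGT (%) from binary segmentation masks; voxel_mm3 comes from the image
# header (pixel spacing x slice thickness).
import numpy as np

def fgt_percent(fgt_mask, breast_mask, voxel_mm3):
    vol_fgt = fgt_mask.sum() * voxel_mm3       # segmented FGT volume, mm^3
    vol_breast = breast_mask.sum() * voxel_mm3 # segmented whole-breast volume
    return 100.0 * vol_fgt / vol_breast
```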
de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R
2016-04-01
A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to the monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial-design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg and mirabegron 50 mg (5+25 and 5+50) scored the highest clinical utility and supported phase III development of solifenacin-mirabegron combination therapy at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
Analysis of artifacts suggests DGGE should not be used for quantitative diversity analysis.
Neilson, Julia W; Jordan, Fiona L; Maier, Raina M
2013-03-01
PCR-denaturing gradient gel electrophoresis (PCR-DGGE) is widely used in microbial ecology for the analysis of comparative community structure. However, artifacts generated during PCR-DGGE of mixed-template communities impede the application of this technique to quantitative analysis of community diversity. The objective of the current study was to employ an artificial bacterial community to document and analyze artifacts associated with multiband signatures and preferential template amplification, and to highlight their impacts on the use of this technique for quantitative diversity analysis. Six bacterial species (three Betaproteobacteria, two Alphaproteobacteria, and one Firmicutes) were amplified individually and in combinations with primers targeting the V7/V8 region of the 16S rRNA gene. Two of the six isolates produced multiband profiles, demonstrating that band number does not correlate directly with α-diversity. Analysis of the multiple bands from one of these isolates confirmed that both bands had identical sequences, which led to the hypothesis that the multiband pattern resulted from two distinct structural conformations of the same amplicon. In addition, consistent preferential amplification was demonstrated following pairwise amplifications of the six isolates. DGGE and real-time PCR analysis identified primer mismatch and PCR inhibition due to 16S rDNA secondary structure as the most probable causes of the preferential amplification patterns. Reproducible DGGE community profiles generated in this study confirm that PCR-DGGE provides an excellent high-throughput tool for comparative community structure analysis, but method-specific artifacts preclude its use for accurate comparative diversity analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Urbaneja, Miguel A.; Kudritzki, Rolf P.
2017-11-01
Blue supergiant stars of B and A spectral types are amongst the visually brightest non-transient astronomical objects. Their intrinsic brightness makes it possible to obtain high-quality optical spectra of these objects in distant galaxies, enabling not only the study of these stars in different environments but also their use as tools to probe their host galaxies. Quantitative analysis of their optical spectra provides tight constraints on their evolution across a wide range of metallicities, as well as on the present-day chemical composition, extinction laws, and distances of their host galaxies. In this contribution we review recent results in this field.
Thermal Imaging with Novel Infrared Focal Plane Arrays and Quantitative Analysis of Thermal Imagery
NASA Technical Reports Server (NTRS)
Gunapala, S. D.; Rafol, S. B.; Bandara, S. V.; Liu, J. K.; Mumolo, J. M.; Soibel, A.; Ting, D. Z.; Tidrow, Meimei
2012-01-01
We have developed a single long-wavelength infrared (LWIR) quantum well infrared photodetector (QWIP) camera for thermography. This camera has been used to measure the temperature profiles of patients. A pixel-coregistered, simultaneously reading mid-wavelength infrared (MWIR)/LWIR dual-band QWIP camera was developed to improve the accuracy of temperature measurements, especially for objects of unknown emissivity. Even the dual-band measurement can give inaccurate results because emissivity is a function of wavelength. Thus we have been developing a four-band QWIP camera for accurate temperature measurement of remote objects.
Li, Yuanpeng; Li, Fucui; Yang, Xinhao; Guo, Liu; Huang, Furong; Chen, Zhenqiang; Chen, Xingdan; Zheng, Shifu
2018-08-05
A rapid quantitative analysis model for determining the glycated albumin (GA) content based on attenuated total reflectance Fourier transform infrared spectroscopy (ATR-FTIR) combined with linear SiPLS and nonlinear SVM has been developed. Firstly, the true GA content in human serum was determined by the GA enzymatic method, and the ATR-FTIR spectra of serum samples from a health-examination population were obtained. The spectral data of the whole mid-infrared region (4000–600 cm⁻¹) and GA's characteristic region (1800–800 cm⁻¹) were used as the objects of quantitative analysis. Secondly, several preprocessing steps, including first derivative, second derivative, variable standardization, and spectral normalization, were performed. Lastly, quantitative regression models were established using SiPLS and SVM, respectively. The SiPLS modeling results were as follows: root mean square error of cross validation (RMSECV_T) = 0.523 g/L, calibration coefficient (R_C) = 0.937, root mean square error of prediction (RMSEP_T) = 0.787 g/L, and prediction coefficient (R_P) = 0.938. The SVM modeling results were as follows: RMSECV_T = 0.0048 g/L, R_C = 0.998, RMSEP_T = 0.442 g/L, and R_P = 0.916. The results indicated that model performance improved significantly after preprocessing and optimization of the characteristic regions, and that the modeling performance of the nonlinear SVM was considerably better than that of the linear SiPLS. Hence, the quantitative analysis model for GA in human serum based on ATR-FTIR combined with SiPLS and SVM is effective. It requires no sample pretreatment, is simple to operate, and is time-efficient, providing a rapid and accurate method for GA content determination.
Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool
NASA Astrophysics Data System (ADS)
Chakraborty, Monisha; Ghosh, Dipak
2017-12-01
An accurate prognostic tool to identify the severity of arrhythmia is yet to be established, owing to the complexity of the ECG signal. In this paper, we show that quantitative assessment of arrhythmia is possible using a non-linear technique based on Hurst rescaled range analysis. Although the concept of applying non-linearity to the study of various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess the efficiency of different medicines. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. Arrhythmia ECG time series were collected from the MIT-BIH database, and normal ECG time series were acquired using the POLYPARA system. Both types of time series were analyzed in the light of the non-linear approach using rescaled range analysis, from which the quantitative parameter "fractal dimension" (D) was obtained. The major finding is that arrhythmia ECG shows lower values of D compared with normal ECG. This information can be used to assess the severity of arrhythmia quantitatively, which is a new direction for prognosis, and suitable software may be developed for use in medical practice.
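A minimal sketch of the rescaled range (R/S) calculation follows, with the fractal dimension taken as D = 2 − H for a self-affine record; the input here is synthetic noise rather than MIT-BIH ECG data, and the window sizes are arbitrary choices.

```python
# Hurst rescaled-range (R/S) analysis on a 1-D time series (illustrative).
import numpy as np

def hurst_rs(x: np.ndarray, window_sizes=(16, 32, 64, 128, 256)) -> float:
    rs_means = []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())   # cumulative deviation from the mean
            r = dev.max() - dev.min()       # range of the deviation
            s = w.std(ddof=1)               # standard deviation of the window
            if s > 0:
                rs_vals.append(r / s)
        rs_means.append(np.mean(rs_vals))
    # slope of log(R/S) vs log(n) estimates the Hurst exponent H
    h, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return h

rng = np.random.default_rng(0)
series = rng.standard_normal(2048)          # synthetic stand-in for an ECG record
H = hurst_rs(series)
print(f"H = {H:.3f}, fractal dimension D = {2 - H:.3f}")
```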
Quantification of EEG reactivity in comatose patients.
Hermans, Mathilde C; Westover, M Brandon; van Putten, Michel J A M; Hirsch, Lawrence J; Gaspard, Nicolas
2016-01-01
EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of the probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha, and beta) showed the best accuracy (median AUC: 0.95) and substantial agreement with the individual opinions of the experts (Gwet's AC1: 65-70%), at least as good as inter-expert agreement (AC1: 55%). The probability values also reflected the degree of reactivity, as measured by inter-expert agreement for each individual case. Automated quantitative EEG approaches based on a probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity.
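One plausible reading of the "spectral temporal symmetry" idea is comparing band power in the minute before and after stimulation onset; the sketch below does exactly that with Welch spectra. The sampling rate, band edges, symmetry index, and random signals are assumptions for illustration, not the paper's 13 parameters.

```python
# Band power before vs. after stimulation onset (illustrative sketch).
import numpy as np
from scipy.signal import welch

FS = 256  # Hz (assumed)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(segment: np.ndarray) -> dict[str, float]:
    f, pxx = welch(segment, fs=FS, nperseg=FS * 4)
    return {name: pxx[(f >= lo) & (f < hi)].sum() for name, (lo, hi) in BANDS.items()}

rng = np.random.default_rng(1)
pre, post = rng.standard_normal(60 * FS), rng.standard_normal(60 * FS)
p_pre, p_post = band_powers(pre), band_powers(post)
for b in BANDS:
    # a symmetric ratio near 0 suggests no reactivity in that band
    ratio = (p_post[b] - p_pre[b]) / (p_post[b] + p_pre[b])
    print(f"{b}: symmetry index = {ratio:+.3f}")
```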
de Certaines, J D; Henriksen, O; Spisni, A; Cortsen, M; Ring, P B
1993-01-01
Quantitative magnetic resonance imaging may offer unique potential for tissue characterization in vivo. In this connection, texture analysis of quantitative MR images may be of special importance. Because evaluation of texture analysis requires large amounts of data, multicenter approaches become mandatory. Within the framework of the BME Concerted Action on Tissue Characterization by MRI and MRS, a pilot multicenter study was launched to evaluate the technical problems, including the comparability of relaxation time measurements carried out at the individual sites. Human brain, skeletal muscle, and liver were used as models. A total of 218 healthy volunteers were studied. Fifteen MRI scanners with field strengths ranging from 0.08 T to 1.5 T were included. Measurement accuracy was tested on the Eurospin relaxation time test object (TO5), and the resulting calibration curve was used to correct the in vivo data. The results established that, by following a standardized procedure, comparable quantitative measurements can be obtained in vivo from a number of MR sites. The overall variation coefficient in vivo was of the same order of magnitude as in ex vivo relaxometry. Thus, it is possible to carry out international multicenter studies on quantitative imaging, provided that quality control with respect to measurement accuracy and calibration of the MR equipment is performed.
Simulating realistic predator signatures in quantitative fatty acid signature analysis
Bromaghin, Jeffrey F.
2015-01-01
Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
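A sketch of the core pseudo-predator construction follows: bootstrap-sample prey signatures, average within prey type, and mix by a known diet. The prey data, diet proportions, and bootstrap size n_boot are hypothetical; the paper's contribution is precisely an objective way to choose that size.

```python
# Building one pseudo-predator fatty acid signature (illustrative sketch).
import numpy as np

rng = np.random.default_rng(42)

def pseudo_predator(prey_sigs: dict[str, np.ndarray], diet: dict[str, float],
                    n_boot: int = 30) -> np.ndarray:
    """prey_sigs maps prey type -> (n_samples, n_fatty_acids) proportion matrix."""
    mix = np.zeros(next(iter(prey_sigs.values())).shape[1])
    for prey, sigs in prey_sigs.items():
        idx = rng.integers(0, len(sigs), size=n_boot)   # bootstrap rows
        mix += diet[prey] * sigs[idx].mean(axis=0)      # diet-weighted mean signature
    return mix / mix.sum()                              # renormalize to proportions

prey = {"seal": rng.dirichlet(np.ones(10), size=50),    # hypothetical prey signatures
        "walrus": rng.dirichlet(np.ones(10), size=50)}
sig = pseudo_predator(prey, diet={"seal": 0.7, "walrus": 0.3})
print(np.round(sig, 3))
```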
Modeling and Analysis of Space Based Transceivers
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Liebetreu, John; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben
2005-01-01
This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.
Computer-Assisted Digital Image Analysis of Plus Disease in Retinopathy of Prematurity.
Kemp, Pavlina S; VanderVeen, Deborah K
2016-01-01
The objective of this study is to review the current state and role of computer-assisted analysis in the diagnosis of plus disease in retinopathy of prematurity. Diagnosis and documentation of retinopathy of prematurity are increasingly being supplemented by digital imaging. The incorporation of computer-aided techniques has the potential to add valuable information and standardization regarding the presence of plus disease, an important criterion in deciding the necessity of treatment of vision-threatening retinopathy of prematurity. A review of the literature found that several published techniques examine the process and role of computer-aided analysis of plus disease in retinopathy of prematurity. These techniques use semiautomated image analysis to evaluate retinal vascular dilation and tortuosity, using the calculated parameters to evaluate the presence or absence of plus disease. These values are then compared with expert consensus. The study concludes that computer-aided image analysis has the potential to use quantitative and objective criteria as a supplemental tool in evaluating for plus disease in the setting of retinopathy of prematurity.
High-fidelity, low-cost, automated method to assess laparoscopic skills objectively.
Gray, Richard J; Kahol, Kanav; Islam, Gazi; Smith, Marshall; Chapital, Alyssa; Ferrara, John
2012-01-01
We sought to define the extent to which a motion analysis-based assessment system constructed with simple equipment could measure technical skill objectively and quantitatively. An "off-the-shelf" digital video system was used to capture the hand and instrument movement of surgical trainees (beginner level = PGY-1, intermediate level = PGY-3, and advanced level = PGY-5/fellows) while they performed a peg transfer exercise. The video data were passed through a custom computer vision algorithm that analyzed incoming pixels to measure movement smoothness objectively. The beginner-level group had the poorest performance, whereas the advanced group generated the highest scores. Intermediate-level trainees scored significantly better than beginner trainees (p < 0.04). Advanced-level trainees scored significantly better than intermediate-level and beginner-level trainees (p < 0.04 and p < 0.03, respectively). A computer vision-based analysis of surgical movements provides an objective basis for technical expertise-level analysis with construct validity. The technology required to capture the data is simple, low cost, and readily available, and it obviates the need for expert human assessment in this setting.
RNA-based determination of ESR1 and HER2 expression and response to neoadjuvant chemotherapy.
Denkert, C; Loibl, S; Kronenwett, R; Budczies, J; von Törne, C; Nekljudova, V; Darb-Esfahani, S; Solbach, C; Sinn, B V; Petry, C; Müller, B M; Hilfrich, J; Altmann, G; Staebler, A; Roth, C; Ataseven, B; Kirchner, T; Dietel, M; Untch, M; von Minckwitz, G
2013-03-01
Hormone and human epidermal growth factor receptor 2 (HER2) receptors are the most important breast cancer biomarkers, and additional objective and quantitative test methods, such as messenger RNA (mRNA)-based quantitative analysis, are urgently needed. In this study, we investigated the clinical validity of RT-PCR-based evaluation of estrogen receptor (ESR1) and HER2 mRNA expression. A total of 1050 core biopsies from two retrospective (GeparTrio, GeparQuattro) and one prospective (PREDICT) neoadjuvant studies were evaluated by quantitative RT-PCR for ESR1 and HER2. ESR1 mRNA was significantly predictive of reduced response to neoadjuvant chemotherapy in univariate and multivariate analyses in all three cohorts. The pathological complete response (pCR) rate for ESR1+/HER2- tumors was 7.3%, 8.0%, and 8.6%; for ESR1-/HER2- tumors it was 34.4%, 33.7%, and 37.3% in GeparTrio, GeparQuattro, and PREDICT, respectively (P < 0.001 in each cohort). In the Kaplan-Meier analysis in GeparTrio, patients with ESR1+/HER2- tumors had the best prognosis, compared with ESR1-/HER2- and ESR1-/HER2+ tumors [disease-free survival (DFS): P < 0.0005, overall survival (OS): P < 0.0005]. Our results suggest that mRNA levels of ESR1 and HER2 predict response to neoadjuvant chemotherapy and are significantly associated with long-term outcome. As an additional option to standard immunohistochemistry and gene-array-based analysis, quantitative RT-PCR analysis might be useful for determining receptor status in breast cancer.
ERIC Educational Resources Information Center
Ikuma, Takeshi; Kunduk, Melda; McWhorter, Andrew J.
2014-01-01
Purpose: The model-based quantitative analysis of high-speed videoendoscopy (HSV) data at a low frame rate of 2,000 frames per second was assessed for its clinical adequacy. Stepwise regression was employed to evaluate the HSV parameters using harmonic models and their relationships to the Voice Handicap Index (VHI). Also, the model-based HSV…
Software for Automated Image-to-Image Co-registration
NASA Technical Reports Server (NTRS)
Benkelman, Cody A.; Hughes, Heidi
2007-01-01
The project objectives are: a) develop software to fine-tune image-to-image co-registration, presuming images are orthorectified prior to input; b) create a reusable software development kit (SDK) to enable incorporation of these tools into other software; c) provide automated testing for quantitative analysis; and d) develop software that applies multiple techniques to achieve subpixel precision in the co-registration of image pairs.
NASA Technical Reports Server (NTRS)
1977-01-01
Those areas of future missions which will be impacted by planetary quarantine (PQ) constraints were identified. The specific objectives for this reporting period were (1) to perform an analysis of the effects of PQ on an outer planet atmospheric probe, and (2) to prepare a quantitative illustration of spacecraft microbial reduction resulting from exposure to space environments. The Jupiter Orbiter Probe mission was used as a model for both of these efforts.
ERIC Educational Resources Information Center
Grobler, R.; Khatite, M.
2012-01-01
This study inquires into some of the factors that might predispose the use and abuse of drugs among secondary school learners in a township school. The objective of this research is to identify these factors and to offer a few suggestions on how the abuse may be prevented. A quantitative research strategy is used and a document analysis technique…
Miller, Christopher B; Bartlett, Delwyn J; Mullins, Anna E; Dodds, Kirsty L; Gordon, Christopher J; Kyle, Simon D; Kim, Jong Won; D'Rozario, Angela L; Lee, Rico S C; Comas, Maria; Marshall, Nathaniel S; Yee, Brendon J; Espie, Colin A; Grunstein, Ronald R
2016-11-01
To empirically derive and evaluate potential clusters of Insomnia Disorder through cluster analysis from polysomnography (PSG). We hypothesized that clusters would differ on neurocognitive performance, sleep-onset measures of quantitative EEG (q-EEG), and heart rate variability (HRV). Research volunteers with Insomnia Disorder (DSM-5) completed a neurocognitive assessment, and overnight PSG measures of total sleep time (TST), wake time after sleep onset (WASO), and sleep onset latency (SOL) were used to determine clusters. From 96 volunteers with Insomnia Disorder, cluster analysis derived at least two clusters from objective sleep parameters: insomnia with normal objective sleep duration (I-NSD: n = 53) and insomnia with short sleep duration (I-SSD: n = 43). At sleep onset, differences in HRV between the I-NSD and I-SSD clusters suggest attenuated parasympathetic activity in I-SSD (P < 0.05). Preliminary work suggested three clusters, retaining I-NSD and splitting the I-SSD cluster into two: I-SSD A (n = 29), defined by high WASO, and I-SSD B (n = 14), a second I-SSD cluster with high SOL and medium WASO. The I-SSD B cluster performed worse than I-SSD A and I-NSD on sustained attention (P ≤ 0.05). In an exploratory analysis, q-EEG revealed reduced spectral power in I-SSD B before (delta, alpha, beta-1) and after sleep onset (beta-2) compared with I-SSD A and I-NSD (P ≤ 0.05). Two insomnia clusters derived from cluster analysis differ in sleep-onset HRV. Preliminary data suggest evidence for three clusters in insomnia, with differences in sustained attention and sleep-onset q-EEG. Insomnia 100 sleep study: Australia New Zealand Clinical Trials Registry (ANZCTR) identification number 12612000049875. URL: https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=347742.
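As an illustration of how such clusters can be derived from PSG summaries, here is a k-means sketch on simulated TST/WASO/SOL values; the study's actual clustering procedure and data are not reproduced, and k = 2 merely mirrors the two-cluster solution.

```python
# Deriving sleep-duration clusters from PSG summary measures (illustrative).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# columns: total sleep time, wake after sleep onset, sleep onset latency (min)
normal_sleep = rng.normal([420, 30, 15], [25, 10, 5], size=(50, 3))
short_sleep = rng.normal([330, 90, 45], [25, 20, 15], size=(45, 3))
X = np.vstack([normal_sleep, short_sleep])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(X))
for k in (0, 1):
    tst, waso, sol = X[labels == k].mean(axis=0)
    print(f"cluster {k}: TST={tst:.0f} min, WASO={waso:.0f} min, SOL={sol:.0f} min")
```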
An Overview of data science uses in bioimage informatics.
Chessel, Anatole
2017-02-15
This review aims to provide a practical overview of the use of statistical features and associated data science methods in bioimage informatics. To achieve a quantitative link between images and biological concepts, one typically replaces an object coming from an image (a segmented cell or intracellular object, a pattern of expression or localisation, even a whole image) with a vector of numbers, ranging from carefully crafted, biologically relevant measurements to features learnt through deep neural networks. This replacement allows the use of practical algorithms for visualisation, comparison, and inference, such as those from machine learning and multivariate statistics. While these methods originated, for biology, mainly in high-content screening, they are integral to the use of data science for the quantitative analysis of microscopy images to gain biological insight, and they are sure to gather more interest as the need to make sense of the increasing amount of acquired imaging data grows more pressing.
A computational image analysis glossary for biologists.
Roeder, Adrienne H K; Cunha, Alexandre; Burl, Michael C; Meyerowitz, Elliot M
2012-09-01
Recent advances in biological imaging have resulted in an explosion in the quality and quantity of images obtained in a digital format. Developmental biologists are increasingly acquiring beautiful and complex images, thus creating vast image datasets. In the past, patterns in image data have been detected by the human eye. Larger datasets, however, necessitate high-throughput objective analysis tools to computationally extract quantitative information from the images. These tools have been developed in collaborations between biologists, computer scientists, mathematicians and physicists. In this Primer we present a glossary of image analysis terms to aid biologists and briefly discuss the importance of robust image analysis in developmental studies.
The Basic Shelf Experience: a comprehensive evaluation.
Dewolfe, Judith A; Greaves, Gaye
2003-01-01
The Basic Shelf Experience is a program designed to assist people living on limited incomes to make better use of their food resources. The purpose of this research was to learn if the Basic Shelf Experience program helps such people to (1) utilize food resources more effectively and (2) cope, through group support, with poverty-associated stressors that influence food security. Both quantitative and qualitative methods were used to evaluate the program objectives. Participants completed a questionnaire at the beginning and end of the six-week program. The questionnaire asked about their food access, food security, and feelings about themselves. Participants returned for a focus group discussion and completed the questionnaire again three months after the program ended. The focus group was designed to elicit information about perceived changes, if any, attributed to the program. Forty-two people completed the questionnaires pre-program and 20 post-program; 17 participated in the three-month follow-up session. While results from quantitative data analysis indicate that program objectives were not met, qualitative data provide evidence that the program did achieve its stated objectives. Our results suggest such programs as the Basic Shelf Experience can assist people living on limited incomes to achieve food security.
Guo, Rui; Wang, Yiqin; Yan, Hanxia; Yan, Jianjun; Yuan, Fengyin; Xu, Zhaoxia; Liu, Guoping; Xu, Wenjie
2015-01-01
Objective. This research provides objective and quantitative parameters of traditional Chinese medicine (TCM) pulse conditions for distinguishing between patients with coronary heart disease (CHD) and normal subjects, using a proposed classification approach based on the Hilbert-Huang transform (HHT) and random forest. Methods. Energy and sample entropy features were extracted by applying the HHT to TCM pulse signals treated as time series. Using a random forest classifier, the two extracted feature types and their combination were used, respectively, as input data to establish classification models. Results. Statistical results showed significant differences in pulse energy and sample entropy between the CHD group and the normal group. Moreover, when the energy features, sample entropy features, and their combination were input as pulse feature vectors, the corresponding average recognition rates were 84%, 76.35%, and 90.21%, respectively. Conclusion. The proposed approach is appropriate for analyzing the pulses of patients with CHD and can lay a foundation for research on objective and quantitative criteria for disease diagnosis or Zheng differentiation.
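To show the flavor of the feature-plus-classifier pipeline, the sketch below pairs one of the described feature types (sample entropy) with a random forest; the pulse waveforms are simulated, and the HHT/energy features are omitted for brevity.

```python
# Sample entropy feature + random forest classification (illustrative).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def sample_entropy(x: np.ndarray, m: int = 2, r_frac: float = 0.2) -> float:
    r = r_frac * x.std()
    def match_fraction(m):
        templates = np.lib.stride_tricks.sliding_window_view(x, m)
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=-1)
        n = len(templates)
        return ((d <= r).sum() - n) / (n * (n - 1))   # exclude self-matches
    return -np.log(match_fraction(m + 1) / match_fraction(m))

rng = np.random.default_rng(5)
t = np.linspace(0, 4 * np.pi, 200)
# simulated "pulse" waveforms with different irregularity levels
healthy = [np.sin(t) + 0.1 * rng.standard_normal(200) for _ in range(30)]
chd = [np.sin(t) + 0.5 * rng.standard_normal(200) for _ in range(30)]

X = np.array([[sample_entropy(s)] for s in healthy + chd])
y = np.array([0] * 30 + [1] * 30)
clf = RandomForestClassifier(n_estimators=100, oob_score=True, random_state=0).fit(X, y)
print(f"out-of-bag accuracy: {clf.oob_score_:.2f}")
```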
Accurate object tracking system by integrating texture and depth cues
NASA Astrophysics Data System (ADS)
Chen, Ju-Chin; Lin, Yu-Hang
2016-03-01
A robust object tracking system that is invariant to object appearance variations and background clutter is proposed. Multiple instance learning with a boosting algorithm is applied to select discriminant texture information between the object and background data. Additionally, depth information, which is important for distinguishing the object from a complicated background, is integrated. We propose two depth-based models that can complement texture information to cope with both appearance variations and background clutter. Moreover, to reduce the increased risk of drift with textureless depth templates, an update mechanism is proposed that selects more precise tracking results and avoids incorrect model updates. In the experiments, the robustness of the proposed system is evaluated and quantitative results are provided for performance analysis. Experimental results show that the proposed system achieves the best success rate and more accurate tracking results than other well-known algorithms.
Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography
Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila
2016-01-01
Objective: Benzodiazepines are among the most frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because benzodiazepine concentrations in biological samples can vary with bleeding, postmortem changes, and redistribution, potentially biasing forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out by liquid–liquid extraction with n-hexane:ethyl acetate, with subsequent detection by high-performance liquid chromatography coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear curve for each drug was obtained within the range of 30–3000 ng/mL, with coefficients of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peaks due to interfering substances were observed in the samples. Conclusion: The present method is selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in the forensic toxicology laboratory.
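A sketch of the calibration arithmetic behind such validation figures follows, using the common 3.3σ/S and 10σ/S rules for LOD and LOQ; the standard concentrations and peak areas are invented, and the paper's own LOD/LOQ values were established experimentally.

```python
# Linear calibration fit with LOD/LOQ estimates (illustrative).
import numpy as np

conc = np.array([30, 100, 300, 1000, 3000], dtype=float)   # ng/mL standards (assumed)
area = np.array([1.1, 3.4, 10.2, 33.9, 101.0])             # detector response (assumed)

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r = np.corrcoef(area, pred)[0, 1]
sigma = np.std(area - pred, ddof=2)   # residual standard deviation

lod = 3.3 * sigma / slope             # limit of detection
loq = 10 * sigma / slope              # limit of quantitation
print(f"r = {r:.4f}, LOD = {lod:.1f} ng/mL, LOQ = {loq:.1f} ng/mL")
```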
Method for matching customer and manufacturer positions for metal product parameters standardization
NASA Astrophysics Data System (ADS)
Polyakova, Marina; Rubin, Gennadij; Danilova, Yulija
2018-04-01
Decision making is the main stage in regulating relations between customer and manufacturer when the requirements of norms in standards are designed. The positions of the negotiating sides must be matched in order to reach consensus. To take into consideration the differences between customer and manufacturer estimations of the object undergoing standardization, special methods of analysis are needed. It is proposed to establish relationships between product properties and product functions using functional-target analysis. The special feature of this type of functional analysis is the joint consideration of the research object's functions and properties. Using the example of a hexagonal-head screw, we show that links can be established between its functions and properties. Such an approach allows a quantitative assessment of how close the positions of customer and manufacturer are during decision making when standard norms are established.
Foglia, L.; Hill, Mary C.; Mehl, Steffen W.; Burlando, P.
2009-01-01
We evaluate the utility of three interrelated means of using data to calibrate the fully distributed rainfall‐runoff model TOPKAPI as applied to the Maggia Valley drainage area in Switzerland. The use of error‐based weighting of observation and prior information data, local sensitivity analysis, and single‐objective function nonlinear regression provides quantitative evaluation of sensitivity of the 35 model parameters to the data, identification of data types most important to the calibration, and identification of correlations among parameters that contribute to nonuniqueness. Sensitivity analysis required only 71 model runs, and regression required about 50 model runs. The approach presented appears to be ideal for evaluation of models with long run times or as a preliminary step to more computationally demanding methods. The statistics used include composite scaled sensitivities, parameter correlation coefficients, leverage, Cook's D, and DFBETAS. Tests suggest predictive ability of the calibrated model typical of hydrologic models.
Spector, P E; Jex, S M
1998-10-01
Despite the widespread use of self-report measures of both job-related stressors and strains, relatively few carefully developed scales for which validity data exist are available. In this article, we discuss 3 job stressor scales (Interpersonal Conflict at Work Scale, Organizational Constraints Scale, and Quantitative Workload Inventory) and 1 job strain scale (Physical Symptoms Inventory). Using meta-analysis, we combined the results of 18 studies to provide estimates of relations between our scales and other variables. The data showed moderate convergent validity for the 3 job stressor scales, suggesting some objectivity in these self-reports. Norms for each scale are provided.
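For readers unfamiliar with how correlations are combined across studies, a minimal Fisher-z pooling sketch is shown below; the per-study r and n values are invented, not the article's 18 studies, and the inverse-variance weighting is the standard textbook choice rather than necessarily the authors' exact procedure.

```python
# Pooling correlations across studies via Fisher's z transform (illustrative).
import numpy as np

r = np.array([0.35, 0.42, 0.28, 0.51])   # per-study correlations (hypothetical)
n = np.array([120, 85, 200, 60])         # per-study sample sizes (hypothetical)

z = np.arctanh(r)                        # Fisher z transform
w = n - 3                                # inverse variance of z is 1/(n - 3)
z_bar = np.average(z, weights=w)
se = 1 / np.sqrt(w.sum())
lo, hi = np.tanh(z_bar - 1.96 * se), np.tanh(z_bar + 1.96 * se)
print(f"pooled r = {np.tanh(z_bar):.3f} (95% CI {lo:.3f} to {hi:.3f})")
```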
Cognitive functioning in patients with affective disorders and schizophrenia: a meta-analysis.
Stefanopoulou, Evgenia; Manoharan, Andiappan; Landau, Sabine; Geddes, John R; Goodwin, Guy; Frangou, Sophia
2009-01-01
There is considerable evidence for cognitive dysfunction in schizophrenia and affective disorders, but the pattern of potential similarities or differences between diagnostic groups remains uncertain. The objective of this study was to conduct a quantitative review of studies on cognitive performance in schizophrenia and affective disorders. Relevant articles were identified through a literature search of major databases for the period between January 1980 and December 2005. Meta-analytic treatment of the original studies revealed widespread cognitive deficits in patients with schizophrenia and affective disorders in intellectual ability and speed of information processing, in encoding and retrieval, in rule discovery, and in response generation and response inhibition. Differences between diagnostic groups were quantitative rather than qualitative.
NecroQuant: quantitative assessment of radiological necrosis
NASA Astrophysics Data System (ADS)
Hwang, Darryl H.; Mohamed, Passant; Varghese, Bino A.; Cen, Steven Y.; Duddalwar, Vinay
2017-11-01
Clinicians can now objectively quantify tumor necrosis by Hounsfield units and enhancement characteristics from multiphase contrast-enhanced CT imaging. NecroQuant has been designed to work as part of a radiomics pipeline. The software is a departure from the conventional qualitative assessment of tumor necrosis, as it provides the user (radiologists and researchers) a simple interface to precisely and interactively define and measure necrosis in contrast-enhanced CT images. Although the software is tested here on renal masses, it can be reconfigured to assess tumor necrosis across a variety of tumors from different body sites, providing a generalized, open, portable, and extensible quantitative analysis platform that is widely applicable across cancer types to quantify tumor necrosis.
A sensitivity analysis of regional and small watershed hydrologic models
NASA Technical Reports Server (NTRS)
Ambaruch, R.; Salomonson, V. V.; Simmons, J. W.
1975-01-01
Continuous simulation models of the hydrologic behavior of watersheds are important tools in several practical applications such as hydroelectric power planning, navigation, and flood control. Several recent studies have addressed the feasibility of using remote earth observations as sources of input data for hydrologic models. The objective of the study reported here was to determine how accurately remotely sensed measurements must be to provide inputs to hydrologic models of watersheds, within the tolerances needed for acceptably accurate synthesis of streamflow by the models. The study objective was achieved by performing a series of sensitivity analyses using continuous simulation models of three watersheds. The sensitivity analysis showed quantitatively how variations in each of 46 model inputs and parameters affect simulation accuracy with respect to five different performance indices.
Losh, Molly; Gordon, Peter C.
2014-01-01
Autism Spectrum Disorder (ASD) is characterized by difficulties with social communication and functioning, and ritualistic/repetitive behaviors (American Psychiatric Association, 2013). While substantial heterogeneity exists in symptom expression, impairments in language discourse skills, including narrative, are universally observed (Tager-Flusberg, Paul, & Lord, 2005). This study applied a computational linguistic tool, Latent Semantic Analysis (LSA), to objectively characterize narrative performance in ASD across two narrative contexts differing in interpersonal and cognitive demands. Results indicated that individuals with ASD produced narratives comparable in semantic content to those from controls when narrating from a picture book, but produced narratives diminished in semantic quality in a more demanding narrative recall task. Results are discussed in terms of the utility of LSA as a quantitative, objective, and efficient measure of narrative ability.
Dorph, Annalie
2017-01-01
Defining an acoustic repertoire is essential to understanding vocal signalling and communicative interactions within a species. Currently, quantitative and statistical definition is lacking for the vocalisations of many dasyurids, an important group of small to medium-sized marsupials from Australasia that includes the eastern quoll (Dasyurus viverrinus), a species of conservation concern. Beyond generating a better understanding of this species' social interactions, determining an acoustic repertoire will further improve detection rates and inference of vocalisations gathered by automated bioacoustic recorders. Hence, this study investigated eastern quoll vocalisations using objective signal processing techniques to quantitatively analyse spectrograms recorded from 15 different individuals. Recordings were collected in conjunction with observations of the behaviours associated with each vocalisation to develop an acoustic-based behavioural repertoire for the species. Analysis of recordings produced a putative classification of five vocalisation types: Bark, Growl, Hiss, Cp-cp, and Chuck. These were most frequently observed during agonistic encounters between conspecifics, most likely as a graded sequence from Hisses occurring in a warning context through to Growls and finally Barks being given prior to, or during, physical confrontations between individuals. Quantitative and statistical methods were used to objectively establish the accuracy of these five putative call types. A multinomial logistic regression indicated a 97.27% correlation with the perceptual classification, demonstrating support for the five different vocalisation types. This putative classification was further supported by hierarchical cluster analysis and silhouette information that determined the optimal number of clusters to be five. Minor disparity between the objective and perceptual classifications was potentially the result of gradation between vocalisations, or subtle differences present within vocalisations not discernible to the human ear. The implication of these different vocalisations and their given context is discussed in relation to the ecology of the species and the potential application of passive acoustic monitoring techniques.
Xiao, Xia; Lei, Kin Fong; Huang, Chia-Hao
2015-01-01
Cell migration is a cellular response involved in various biological processes such as cancer metastasis, the primary cause of death for cancer patients. Quantitative investigation of the correlation between cell migration and extracellular stimulation is essential for developing effective therapeutic strategies for controlling invasive cancer cells. The conventional method of determining cell migration rate, based on comparison of successive images, may not be an objective approach. In this work, a microfluidic chip embedded with measurement electrodes has been developed to quantitatively monitor cell migration activity using an impedimetric measurement technique. A damage-free wound was constructed by a microfluidic phenomenon, and cell migration activity under stimulation with a cytokine (interleukin-6) and an anti-cancer drug (doxorubicin) was investigated. Impedance measurements were performed concurrently during the cell migration process. The impedance change was directly correlated with cell migration activity; therefore, the migration rate could be calculated. A good match was found between the impedance measurements and conventional imaging analysis, but the impedimetric technique provides an objective and quantitative measurement. Based on our technique, cell migration rates were calculated to be 8.5, 19.1, and 34.9 μm/h under cytokine stimulation at concentrations of 0 (control), 5, and 10 ng/ml. This technique has high potential to be developed into a powerful analytical platform for cancer research.
Application of remote sensing to monitoring and studying dispersion in ocean dumping
NASA Technical Reports Server (NTRS)
Johnson, R. W.; Ohlhorst, C. W.
1981-01-01
Remotely sensed wide-area synoptic data provide information on ocean dumping that is not readily available by other means. A qualitative approach has been used to map features such as river plumes. Results of quantitative analyses have been used to develop maps showing quantitative distributions of one or more water quality parameters, such as suspended solids or chlorophyll a. Joint NASA/NOAA experiments have been conducted at designated dump areas in U.S. coastal zones to determine the applicability of aircraft remote sensing systems to mapping plumes resulting from ocean dumping of sewage sludge and industrial wastes. A second objective relates to the evaluation of previously developed quantitative analysis techniques for studying the dispersion of materials in these plumes. It was found that plumes resulting from the dumping of four waste materials have distinctive spectral characteristics. The development of a technology for use in a routine monitoring system, based on remote sensing techniques, is discussed.
Fakhouri, A; Kuperman, A
2014-01-01
The paper focuses on the quantitative analysis of the Israeli Government's target of 10% renewable energy penetration by 2020 and on determining the appropriate methodology (models) for assessing the effects on the electricity market, addressing the fact that Israel is an electricity island. The main objective is to determine the influence of achieving the Government's renewable energy penetration goals on the need for backup in the Israeli electricity system. This work presents the current situation of the Israeli electricity market and the study to be undertaken to assess the undesirable effects resulting from the intermittency of electricity generated by wind and solar power stations, and it presents some solutions for mitigating these phenomena. Future work will focus on a quantitative analysis of model runs to determine the amounts of backup required relative to the amount of installed capacity from renewable resources.
Stevanović, Nikola R; Perušković, Danica S; Gašić, Uroš M; Antunović, Vesna R; Lolić, Aleksandar Đ; Baošić, Rada M
2017-03-01
The objectives of this study were to gain insights into structure-retention relationships and to propose a model for estimating retention. A chromatographic investigation of a series of 36 Schiff bases and their copper(II) and nickel(II) complexes was performed under both normal- and reverse-phase conditions. The chemical structures of the compounds were characterized by molecular descriptors calculated from structure and related to the chromatographic retention parameters by multiple linear regression analysis. The effects of chelation on the retention parameters of the investigated compounds under normal- and reverse-phase chromatographic conditions were analyzed by principal component analysis; quantitative structure-retention relationship and quantitative structure-activity relationship models were developed on the basis of theoretical molecular descriptors, calculated exclusively from molecular structure, and parameters of retention and lipophilicity.
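A QSRR-style regression of retention on descriptors can be sketched as follows; the descriptor matrix and retention values are random stand-ins for the 36 compounds, so only the modeling pattern, not the chemistry, is illustrated.

```python
# Multiple linear regression of retention on molecular descriptors (illustrative).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(9)
X = rng.normal(size=(36, 4))                   # hypothetical descriptors (e.g., logP, ...)
true_coef = np.array([0.8, -0.3, 0.5, 0.1])    # synthetic ground truth for the demo
rf = X @ true_coef + 0.5 + 0.1 * rng.standard_normal(36)   # Rf-like retention values

model = LinearRegression().fit(X, rf)
print(f"R^2 = {model.score(X, rf):.3f}, coefficients = {np.round(model.coef_, 2)}")
```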
Quantitative analysis of spatial variability of geotechnical parameters
NASA Astrophysics Data System (ADS)
Fang, Xing
2018-04-01
Geotechnical parameters are the basic parameters of geotechnical engineering design, yet they have strong regional characteristics. Their spatial variability is now well recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on geostatistical theory, the spatial variability of geotechnical parameters is quantitatively analyzed here, and correlation coefficients between geotechnical parameters are calculated. A residential district surveyed by the Tianjin Survey Institute was selected as the research object, with 68 boreholes in the area and 9 mechanical strata. The parameters considered are water content, natural unit weight, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion, and the SP index. Correlation coefficients between the geotechnical parameters were calculated according to the principles of statistical correlation, and from these the correlation structure of the geotechnical parameters was obtained.
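The standard geostatistical summary of such spatial variability is the empirical semivariogram; a sketch follows, with synthetic borehole coordinates and values standing in for the 68-borehole dataset and with arbitrary lag bins.

```python
# Empirical semivariogram for one geotechnical parameter (illustrative).
import numpy as np

rng = np.random.default_rng(3)
xy = rng.uniform(0, 500, size=(68, 2))   # hypothetical borehole coordinates (m)
val = rng.normal(20, 2, size=68)         # e.g., water content (%)

# pairwise separation distances and half squared differences
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
g = 0.5 * (val[:, None] - val[None, :]) ** 2
iu = np.triu_indices(len(val), k=1)      # keep each pair once
d, g = d[iu], g[iu]

bins = np.arange(0, 300, 50)
for lo, hi in zip(bins[:-1], bins[1:]):
    m = (d >= lo) & (d < hi)
    if m.any():
        print(f"lag {lo:3.0f}-{hi:3.0f} m: gamma = {g[m].mean():.2f}")
```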
Freitas, Mírian Luisa Faria; Dutra, Mariana Borges de Lima; Bolini, Helena Maria André
2016-12-01
The objective of this study was to evaluate the sensory properties and acceptability of pitanga nectar samples prepared with sucrose and with different sweeteners (sucralose, aspartame, stevia with 40% rebaudioside A, stevia with 95% rebaudioside A, neotame, and a 2:1 cyclamate/saccharin blend). Thirteen assessors participated in a quantitative descriptive analysis and evaluated the samples on the descriptor terms. The acceptability test was carried out by 120 fruit juice consumers. The results of the quantitative descriptive analysis showed that samples prepared with sucralose, aspartame, and the 2:1 cyclamate/saccharin blend had sensory profiles similar to that of the sample prepared with sucrose. The samples most accepted by consumers were those prepared with sucrose, sucralose, aspartame, and neotame. The sweeteners with the greatest potential to replace sucrose in pitanga nectar are therefore sucralose and aspartame.
Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advancing Molecular Imaging Tools.
Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe
2018-01-01
Processing and interpretation of biological images may provide invaluable insights into complex living systems, because images capture the overall dynamics as a "whole." Therefore, extraction of key quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and the different instrumentation focused on molecular image processing and analysis. Quantitative data analysis through various open source software packages and algorithmic protocols provides a novel approach for modeling experimental research programs. We also highlight foreseeable trends in methods for automatically analyzing biological data; such tools will be very useful for understanding the biological and mathematical descriptions that underlie in-silico systems biology models.
Lifestyle Factors and Visible Skin Aging in a Population of Japanese Elders
Asakura, Keiko; Nishiwaki, Yuji; Milojevic, Ai; Michikawa, Takehiro; Kikuchi, Yuriko; Nakano, Makiko; Iwasawa, Satoko; Hillebrand, Greg; Miyamoto, Kukizo; Ono, Masaji; Kinjo, Yoshihide; Akiba, Suminori; Takebayashi, Toru
2009-01-01
Background: The number of studies that use objective and quantitative methods to evaluate facial skin aging in elderly people is extremely limited, especially in Japan. Therefore, in this cross-sectional study we attempted to characterize the condition of facial skin (hyperpigmentation, pores, texture, and wrinkling) in Japanese adults aged 65 years or older by using objective and quantitative imaging methods. In addition, we aimed to identify lifestyle factors significantly associated with these visible signs of aging. Methods: The study subjects were 802 community-dwelling Japanese men and women aged at least 65 years and living in the town of Kurabuchi (Takasaki City, Gunma Prefecture, Japan), a mountain community with a population of approximately 4800. The facial skin condition of subjects was assessed quantitatively using a standardized facial imaging system and subsequent computer image analysis. Lifestyle information was collected using a structured questionnaire. The association between skin condition and lifestyle factors was examined using multivariable regression analysis. Results: Among women, the mean values for facial texture, hyperpigmentation, and pores were generally lower than those among age-matched men. There was no significant difference between sexes in the severity of facial wrinkling. Older age was associated with worse skin condition among women only. After adjusting for age, smoking status and topical sun protection were significantly associated with skin condition among both men and women. Conclusions: Our study revealed significant differences between sexes in the severity of hyperpigmentation, texture, and pores, but not wrinkling. Smoking status and topical sun protection were significantly associated with signs of visible skin aging in this study population.
Riboli, Danilo Flávio Moraes; Lyra, João César; Silva, Eliane Pessoa; Valadão, Luisa Leite; Bentlin, Maria Regina; Corrente, José Eduardo; Rugolo, Ligia Maria Suppo de Souza; da Cunha, Maria de Lourdes Ribeiro de Souza
2014-05-22
Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. The semi-quantitative culture method showed higher sensitivity and specificity for the diagnosis of CR-BSIs in newborns when compared to the quantitative technique. In addition, this method is easier to perform and shows better agreement with the gold standard, and should therefore be recommended for routine clinical laboratory use. PFGE may contribute to the control of CR-BSIs by identifying clusters of microorganisms in neonatal ICUs, providing a means of determining potential cross-infection between patients.
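The sensitivity/specificity arithmetic used to compare the two culture techniques reduces to a 2x2 table; the sketch below shows it with hypothetical counts chosen only to land near the reported semi-quantitative figures.

```python
# Sensitivity and specificity from 2x2 confusion counts (illustrative).
def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    sensitivity = tp / (tp + fn)   # true positives among confirmed CR-BSI cases
    specificity = tn / (tn + fp)   # true negatives among non-CR-BSI catheters
    return sensitivity, specificity

# hypothetical counts for a semi-quantitative culture result
se, sp = sens_spec(tp=16, fn=6, tn=528, fp=24)
print(f"sensitivity = {se:.1%}, specificity = {sp:.1%}")
```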
Troy, Karen L; Edwards, W Brent
2018-05-01
Quantitative CT (QCT) analysis involves the calculation of specific parameters, such as bone volume and density, from CT image data, and can be a powerful tool for understanding bone quality and quantity. However, without careful attention to detail during all steps of acquisition and analysis, the data can be of poor to unusable quality. Good-quality QCT for research requires meticulous attention to detail and standardization of all aspects of data collection and analysis to a degree that is uncommon in the clinical setting. Here, we review the literature to summarize practical and technical considerations for obtaining high-quality QCT data and provide examples of how each recommendation affects the calculated variables. We also provide an overview of the QCT analysis technique to illustrate additional opportunities to improve data reproducibility and reliability. Key recommendations include standardizing the scanner and data acquisition settings, minimizing image artifacts, selecting an appropriate reconstruction algorithm, and maximizing repeatability and objectivity during QCT analysis. The goal of these recommendations is to reduce potential sources of error throughout the analysis, from scan acquisition to the interpretation of results.
Kintsurashvili, L
2016-05-01
Aconitum orientale Mill (family Helleboraceae) is a perennial herb that grows in forests of western and eastern Georgia and in the subalpine zone. The research objects were the underground parts of Aconitum orientale Mill, picked in the fruiting phase in Borjomi in 2014. The total alkaloids were obtained from the air-dry underground parts (1.5 kg) by chloroform extraction after basification with 5% sodium carbonate. The total alkaloid yield was 16.5 g, and the predominant component was determined to be the pharmacologically active diterpene alkaloid Lappaconitine, the active principle of the antiarrhythmic drug "Allapinin". A chromatospectrophotometric method of quantitative analysis of Lappaconitine was developed for determining the productivity of the underground parts of Aconitum orientale Mill. The maximal absorption wavelength in the ultraviolet spectrum (λmax) was determined to be 308 nm, and statistical processing of the quantitative analysis results established that the relative error is within the 4% norm. The content of Lappaconitine in the underground parts of Aconitum orientale Mill is 0.11-0.13% in the fruiting phase. On the basis of these experimental data, Aconitum orientale Mill is approved as a raw material for obtaining pharmacologically active Lappaconitine.
A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.
2015-12-07
Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
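As a sketch of the kind of staining-contrast image analysis described above (not the authors' published algorithm, which includes more preprocessing), the fouled area fraction of an RGB photograph can be estimated by global thresholding; the file name and threshold rule are assumptions:

```python
import numpy as np
from skimage import color, filters, io

def fouling_area_fraction(path):
    """Fraction of pixels classified as stained biofilm in an RGB photograph."""
    img = io.imread(path)
    gray = color.rgb2gray(img[..., :3])   # drop any alpha channel
    thresh = filters.threshold_otsu(gray)
    biofilm_mask = gray < thresh          # assumes stain darkens fouled regions
    return biofilm_mask.mean()

# fraction = fouling_area_fraction("coupon_photo.png")  # hypothetical image file
```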
Demidenko, Natalia V; Penin, Aleksey A
2012-01-01
qRT-PCR is a generally acknowledged method for gene expression analysis due to its precision and reproducibility. However, it is well known that the accuracy of qRT-PCR data varies greatly depending on the experimental design and data analysis. Recently, a set of guidelines has been proposed that aims to improve the reliability of qRT-PCR. However, there are additional factors that have not been taken into consideration in these guidelines that can seriously affect the data obtained using this method. In this study, we report the influence that object morphology can have on qRT-PCR data. We have used a number of Arabidopsis thaliana mutants with altered floral morphology as models for this study. These mutants have been well characterised (including in terms of gene expression levels and patterns) by other techniques. This allows us to compare the results from the qRT-PCR with the results inferred from other methods. We demonstrate that the comparison of gene expression levels in objects that differ greatly in their morphology can lead to erroneous results.
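The morphology effect reported here enters through normalization: relative quantification divides out a reference gene, so any shift in the reference's representation between morphologically different samples lands directly in the fold change. A minimal sketch of the standard 2^-ΔΔCt calculation with hypothetical Ct values:

```python
def fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt method (Livak & Schmittgen)."""
    d_ct_sample = ct_target_sample - ct_ref_sample      # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control
    return 2.0 ** (-dd_ct)

# Hypothetical Ct values: mutant vs. wild type, normalized to one reference gene
print(fold_change(24.1, 18.0, 26.3, 18.2))  # ~4-fold apparent up-regulation
```

A one-cycle shift in the reference gene's Ct, for example from altered organ proportions in a morphological mutant, would double or halve this estimate.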
Spectral saliency via automatic adaptive amplitude spectrum analysis
NASA Astrophysics Data System (ADS)
Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan
2016-03-01
Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect the visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and the bottom-up saliency detection characterized by spectrum scale-space analysis for natural images, we propose to detect visual saliency, especially with salient objects of different sizes and locations via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also reserve the saliency maps corresponding to different salient objects with meaningful saliency information by adaptive weighted combination. The performance of quantitative and qualitative comparisons is evaluated by three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.
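A single-scale sketch of the underlying operation, smoothing the amplitude spectrum so that the sharp spectral spikes of repeated, non-salient patterns are suppressed, is given below; the paper's actual contributions (automatic scale selection and adaptive weighted combination of maps) are omitted, and the smoothing values are arbitrary:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def spectral_saliency(gray, sigma=3.0):
    """Saliency map from one smoothing scale of the amplitude spectrum."""
    f = np.fft.fft2(gray)
    amplitude, phase = np.abs(f), np.angle(f)
    smoothed = gaussian_filter(amplitude, sigma)   # suppress spectral spikes
    recon = np.fft.ifft2(smoothed * np.exp(1j * phase))
    saliency = np.abs(recon) ** 2
    return gaussian_filter(saliency, 2.0)          # mild post-smoothing of the map
```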
12 CFR 614.4362 - Loan and lease concentration risk mitigation policy.
Code of Federal Regulations, 2012 CFR
2012-01-01
... include: (1) A purpose and objective; (2) Clearly defined and consistently used terms; (3) Quantitative... exceptions and reporting requirements. (b) Quantitative methods. (1) At a minimum, the quantitative methods...
12 CFR 614.4362 - Loan and lease concentration risk mitigation policy.
Code of Federal Regulations, 2014 CFR
2014-01-01
... include: (1) A purpose and objective; (2) Clearly defined and consistently used terms; (3) Quantitative... exceptions and reporting requirements. (b) Quantitative methods. (1) At a minimum, the quantitative methods...
12 CFR 614.4362 - Loan and lease concentration risk mitigation policy.
Code of Federal Regulations, 2013 CFR
2013-01-01
... include: (1) A purpose and objective; (2) Clearly defined and consistently used terms; (3) Quantitative... exceptions and reporting requirements. (b) Quantitative methods. (1) At a minimum, the quantitative methods...
Thekkek, Nadhi; Lee, Michelle H.; Polydorides, Alexandros D.; Rosen, Daniel G.; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca
2015-01-01
Current imaging tools are associated with inconsistent sensitivity and specificity for detection of Barrett’s-associated neoplasia. Optical imaging has shown promise in improving the classification of neoplasia in vivo. The goal of this pilot study was to evaluate whether in vivo vital dye fluorescence imaging (VFI) has the potential to improve the accuracy of early detection of Barrett’s-associated neoplasia. In vivo endoscopic VFI images were collected from 65 sites in 14 patients with confirmed Barrett’s esophagus (BE), dysplasia, or esophageal adenocarcinoma using a modular video endoscope and a high-resolution microendoscope (HRME). Qualitative image features were compared to histology; VFI and HRME images show changes in glandular structure associated with neoplastic progression. Quantitative image features in VFI images were identified for objective image classification of metaplasia and neoplasia, and a diagnostic algorithm was developed using leave-one-out cross validation. Three image features extracted from VFI images were used to classify tissue as neoplastic or not with a sensitivity of 87.8% and a specificity of 77.6% (AUC=0.878). A multimodal approach incorporating VFI and HRME imaging can delineate epithelial changes present in Barrett’s-associated neoplasia. Quantitative analysis of VFI images may provide a means for objective interpretation of BE during surveillance. PMID:25950645
Investigation of Portevin-Le Chatelier band with temporal phase analysis of speckle interferometry
NASA Astrophysics Data System (ADS)
Jiang, Zhenyu; Zhang, Qingchuan; Wu, Xiaoping
2003-04-01
A new method combining temporal phase analysis with dynamic digital speckle pattern interferometry is proposed to study the Portevin-Le Chatelier (PLC) effect quantitatively. The principle is that the phase difference of interference speckle patterns is a time-dependent function related to the object deformation. The interference speckle patterns of the specimen are recorded at a high sampling rate while the PLC effect occurs, and the 2D displacement map of the PLC band and its width are obtained by analyzing the displacement of the specimen with the proposed method.
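As a rough illustration of temporal phase analysis (not necessarily the authors' exact processing), the sketch below recovers one pixel's out-of-plane displacement history from its speckle intensity signal via the analytic-signal phase; it assumes a temporal carrier so the phase evolves monotonically, and uses the out-of-plane sensitivity d = λ·Δφ/(4π):

```python
import numpy as np
from scipy.signal import hilbert

def displacement_history(intensity_t, wavelength_nm=632.8):
    """Out-of-plane displacement (nm) of one pixel from its intensity signal.

    intensity_t: 1D numpy array sampled at a high frame rate.
    """
    ac = intensity_t - intensity_t.mean()          # remove the DC bias term
    phase = np.unwrap(np.angle(hilbert(ac)))       # time-dependent interference phase
    dphi = phase - phase[0]
    return wavelength_nm * dphi / (4.0 * np.pi)    # d = lambda * dphi / (4*pi)
```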
A quantitative analysis of TIMS data obtained on the Learjet 23 at various altitudes
NASA Technical Reports Server (NTRS)
Jaggi, S.
1992-01-01
A series of Thermal Infrared Multispectral Scanner (TIMS) data acquisition flights were conducted on the NASA Learjet 23 at different altitudes over a test site. The objective was to monitor the performance of the TIMS (its estimation of the brightness temperatures of the ground scene) with increasing altitude. The results do not show any significant correlation between the brightness temperatures and the altitude. The analysis indicates that the estimation of the temperatures is a function of the accuracy of the atmospheric correction used for each altitude.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdel-Kareem, O.; Ghoneim, M.; Harith, M. A.
2011-09-22
Analysis of metal objects is a necessary step for establishing an appropriate conservation treatment of an object or for following up the results of suggested treatments. The main considerations in selecting a method for the investigation and analysis of metal objects are the diagnostic power, representative sampling, reproducibility, destructive nature/invasiveness of the analysis, and accessibility of the appropriate instrument. This study aims at evaluating the usefulness of the Laser Induced Breakdown Spectroscopy (LIBS) technique for the analysis of historical metal objects. In this study, various historical metal objects collected from different museums and excavations in Egypt were investigated using the LIBS technique. To evaluate the usefulness of the suggested analytical protocol, the same metal objects were also investigated by other methods, such as Scanning Electron Microscopy with energy-dispersive X-ray analysis (SEM-EDX) and X-ray Diffraction (XRD). This study confirms that the LIBS technique is very useful and can be used safely for investigating historical metal objects. LIBS analysis can quickly provide information on the qualitative and semi-quantitative elemental content of different metal objects and support their characterization and classification. It is practically non-destructive, with the critical advantage of being applicable in situ, thereby avoiding sampling and sample preparation. It can be a dependable, satisfactory, and effective method for the low-cost study of archaeological and historical metals. However, it must be taken into consideration that the corrosion of metal leads to material alteration and possible loss of certain metals in the form of soluble salts. Certain corrosion products are known to leach out of the object; therefore, their low content does not necessarily reflect the composition of the metal at the time of the object's manufacture. Another point to take into consideration is the heterogeneity of metal alloy objects, which often results from poor mixing of the alloy components. Further research is needed to investigate and determine the most appropriate and effective approaches and methods for the conservation of these metal objects.
Draffehn, Astrid M; Li, Li; Krezdorn, Nicolas; Ding, Jia; Lübeck, Jens; Strahwald, Josef; Muktar, Meki S; Walkemeier, Birgit; Rotter, Björn; Gebhardt, Christiane
2013-01-01
Resistance to pathogens is essential for survival of wild and cultivated plants. Pathogen susceptibility causes major losses of crop yield and quality. Durable field resistance combined with high yield and other superior agronomic characters are, therefore, important objectives in every crop breeding program. Precision and efficacy of resistance breeding can be enhanced by molecular diagnostic tools, which result from knowledge of the molecular basis of resistance and susceptibility. Breeding uses resistance conferred by single R genes and polygenic quantitative resistance. The latter is partial but considered more durable. Molecular mechanisms of plant-pathogen interactions are elucidated mainly in experimental systems involving single R genes, whereas most genes important for quantitative resistance in crops like potato are unknown. Quantitative resistance of potato to Phytophthora infestans causing late blight is often compromised by late plant maturity, a negative agronomic character. Our objective was to identify candidate genes for quantitative resistance to late blight not compromised by late plant maturity. We used diagnostic DNA-markers to select plants with different field levels of maturity-corrected resistance (MCR) to late blight and compared their leaf transcriptomes before and after infection with P. infestans using SuperSAGE (serial analysis of gene expression) technology and next generation sequencing. We identified 2034 transcripts up- or down-regulated upon infection, including a homolog of the kiwi fruit allergen kiwellin. A total of 806 transcripts showed differential expression between groups of genotypes with contrasting MCR levels. The observed expression patterns suggest that MCR is in part controlled by differential transcript levels in uninfected plants. Functional annotation suggests that, besides biotic and abiotic stress responses, general cellular processes such as photosynthesis, protein biosynthesis, and degradation play a role in MCR.
A multi-objective constraint-based approach for modeling genome-scale microbial ecosystems.
Budinich, Marko; Bourdon, Jérémie; Larhlimi, Abdelhalim; Eveillard, Damien
2017-01-01
Interplay within microbial communities impacts ecosystems on several scales, and elucidation of the consequent effects is a difficult task in ecology. In particular, the integration of genome-scale data within quantitative models of microbial ecosystems remains elusive. This study advocates the use of constraint-based modeling to build predictive models from recent high-resolution -omics datasets. Following recent studies that have demonstrated the accuracy of constraint-based models (CBMs) for simulating single-strain metabolic networks, we sought to study microbial ecosystems as a combination of single-strain metabolic networks that exchange nutrients. This study presents two multi-objective extensions of CBMs for modeling communities: multi-objective flux balance analysis (MO-FBA) and multi-objective flux variability analysis (MO-FVA). Both methods were applied to a hot spring mat model ecosystem. As a result, multiple trade-offs between nutrients and growth rates, as well as thermodynamically favorable relative abundances at community level, were emphasized. We expect this approach to be used for integrating genomic information in microbial ecosystems. Following models will provide insights about behaviors (including diversity) that take place at the ecosystem scale.
ERIC Educational Resources Information Center
Benítez, Isabel; Padilla, José-Luis
2014-01-01
Differential item functioning (DIF) can undermine the validity of cross-lingual comparisons. While a lot of efficient statistics for detecting DIF are available, few general findings have been found to explain DIF results. The objective of the article was to study DIF sources by using a mixed method design. The design involves a quantitative phase…
Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo
2014-01-01
This paper took a subregion of a small watershed gully system at Beiyanzikou catchment of Qixia, China, as the study area and, using object-orientated image analysis (OBIA), extracted the shoulder line of gullies from high spatial resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the adjacent distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder line of gullies. Finally, the original surface was fitted using linear regression in accordance with the elevation of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract information on gullies; the average range difference between points field-measured along the edge of the gullies and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion area and volume of the two gullies are 2141.6250 m2 and 5074.1790 m3, and 1316.1250 m2 and 1591.5784 m3, respectively. The results of the study provide a new method for the quantitative study of small gully erosion. PMID:24616626
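The proposed accuracy measure, the adjacent distance between the classified shoulder line and RTK-GPS checkpoints, can be sketched as a nearest-neighbour computation. This simplified version uses point-to-vertex distances (a fuller one would use point-to-segment distances along the polyline), and the input arrays are hypothetical map coordinates:

```python
import numpy as np

def boundary_accuracy(gps_points, boundary_points):
    """Mean and variance of nearest distances from GPS checkpoints (N x 2)
    to vertices of the classified gully shoulder line (M x 2)."""
    diff = gps_points[:, None, :] - boundary_points[None, :, :]
    dists = np.linalg.norm(diff, axis=2).min(axis=1)  # one distance per checkpoint
    return dists.mean(), dists.var()

# mean_d, var_d = boundary_accuracy(gps_xy, shoulder_xy)  # hypothetical arrays
```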
Hybrid, experimental and computational, investigation of mechanical components
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1996-07-01
Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.
Sun, Jared H; Twomey, Michele; Tran, Jeffrey; Wallis, Lee A
2012-11-01
Ninety percent of emergency incidents occur in developing countries, and this is only expected to get worse as these nations develop. As a result, governments in developing countries are establishing emergency care systems. However, there is currently no widely usable, objective method to monitor or research the rapid growth of emergency care in the developing world. Our objective was to analyze current quantitative methods for assessing emergency care in developing countries and to propose a more appropriate method. Currently accepted methods to quantitatively assess the efficacy of emergency care systems cannot be performed in most developing countries due to weak record-keeping infrastructure and the inappropriateness of applying Western-derived coefficients to developing country conditions. As a result, although emergency care in the developing world is rapidly growing, researchers and clinicians are unable to objectively measure its progress or determine which policies work best in their respective countries. We propose the TEWS methodology, a simple analytical tool that can be handled by low-resource, developing countries. By relying on the most basic universal parameters, the simplest calculations, and a straightforward protocol, the TEWS methodology allows for widespread analysis of emergency care in the developing world. This could become essential in the establishment and growth of new emergency care systems worldwide.
Computer-aided analysis with Image J for quantitatively assessing psoriatic lesion area.
Sun, Z; Wang, Y; Ji, S; Wang, K; Zhao, Y
2015-11-01
Body surface area is important in determining the severity of psoriasis. However, an objective, reliable, and practical method for this purpose is still needed. We performed computer image analysis (CIA) of the psoriatic area using the ImageJ freeware to determine whether this method could be used for objective evaluation of the psoriatic area. Fifteen psoriasis patients were randomized to be treated with adalimumab or placebo in a clinical trial. At each visit, the psoriasis area of each body site was estimated by two physicians (E-method), and standard photographs were taken. The psoriasis area in the pictures was assessed with CIA using semi-automatic threshold selection (T-method) or manual selection (M-method, gold standard). The results of the three methods were analyzed, with reliability and affecting factors evaluated. Both the T- and E-methods correlated strongly with the M-method, with the T-method showing a slightly stronger correlation. Both the T- and E-methods had good consistency between the evaluators. All three methods were able to detect the change in the psoriatic area after treatment, while the E-method tended to overestimate it. CIA with the ImageJ freeware is reliable and practicable for quantitatively assessing the lesional area of psoriasis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G
2014-11-20
The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
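The reassessment factor named here, the coefficient of variation of coating mass, is simply the standard deviation over the mean; a minimal sketch with hypothetical per-tablet masses:

```python
import numpy as np

def coating_cv(masses_mg):
    """Coefficient of variation of per-tablet coating mass (uniformity metric)."""
    m = np.asarray(masses_mg, dtype=float)
    return m.std(ddof=1) / m.mean()

print(coating_cv([9.8, 10.1, 10.4, 9.6, 10.2]))  # hypothetical masses in mg
```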
[An object-oriented intelligent engineering design approach for lake pollution control].
Zou, Rui; Zhou, Jing; Liu, Yong; Zhu, Xiang; Zhao, Lei; Yang, Ping-Jian; Guo, Huai-Cheng
2013-03-01
Regarding the shortage and deficiency of traditional lake pollution control engineering techniques, a new lake pollution control engineering approach was proposed in this study, based on object-oriented intelligent design (OOID) from the perspective of intelligence. It can provide a new methodology and framework for effectively controlling lake pollution and improving water quality. The differences between traditional engineering techniques and the OOID approach were compared. The key points of OOID were described as: the object perspective, a cause-and-effect foundation, extending points into surfaces, and temporal and spatial optimization. Blue algae control in a lake was taken as the example in this study. The effects of algae control and water quality improvement were analyzed in detail from the perspective of object-oriented intelligent design, based on two engineering techniques (vertical hydrodynamic mixers and pumping algaecide recharge). The modeling results showed that the traditional engineering design paradigm cannot provide scientific and effective guidance for engineering design and decision-making regarding lake pollution. The intelligent design approach in this case is based on the object perspective and quantitative causal analysis. This approach identified that the efficiency of mixers was much higher than that of pumps in achieving low to moderate water quality improvement. However, when the water quality objective exceeded a certain value (such as a control objective of peak Chla concentration above 100 microg x L(-1) in this experimental water), the mixers could not achieve the goal. The pump technique could achieve the goal, but at a higher cost. The efficiency of combining the two techniques was higher than that of using either technique alone. Moreover, the quantitative scale control of the two engineering techniques has a significant impact on the actual project benefits and costs.
A comparison of manual and quantitative elbow strength testing.
Shahgholi, Leili; Bengtson, Keith A; Bishop, Allen T; Shin, Alexander Y; Spinner, Robert J; Basford, Jeffrey R; Kaufman, Kenton R
2012-10-01
The aim of this study was to compare the clinical ratings of elbow strength obtained by skilled clinicians with objective strength measurement obtained through quantitative testing. A retrospective comparison of subject clinical records with quantitative strength testing results in a motion analysis laboratory was conducted. A total of 110 individuals between the ages of 8 and 65 yrs with traumatic brachial plexus injuries were identified. Patients underwent manual muscle strength testing as assessed on the 5-point British Medical Research Council Scale (5/5, normal; 0/5, absent) and quantitative elbow flexion and extension strength measurements. A total of 92 subjects had elbow flexion testing. Half of the subjects clinically assessed as having normal (5/5) elbow flexion strength on manual muscle testing exhibited less than 42% of their age-expected strength on quantitative testing. Eighty-four subjects had elbow extension strength testing. Similarly, half of those displaying normal elbow extension strength on manual muscle testing were found to have less than 62% of their age-expected values on quantitative testing. Significant differences between manual muscle testing and quantitative findings were not detected for the lesser (0-4) strength grades. Manual muscle testing, even when performed by experienced clinicians, may be more misleading than expected for subjects graded as having normal (5/5) strength. Manual muscle testing estimates for the lesser strength grades (1-4/5) seem reasonably accurate.
Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C
2015-04-13
Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.
Objective comparison of particle tracking methods.
Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R; Godinez, William J; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E G; Jaldén, Joakim; Blau, Helen M; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P; Dan, Han-Wei; Tsai, Yuh-Show; Ortiz de Solórzano, Carlos; Olivo-Marin, Jean-Christophe; Meijering, Erik
2014-03-01
Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Because manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized an open competition in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to notable practical conclusions for users and developers.
Kuhn, Felix P; Spinner, Georg; Del Grande, Filippo; Wyss, Michael; Piccirelli, Marco; Erni, Stefan; Pfister, Pascal; Ho, Michael; Sah, Bert-Ram; Filli, Lukas; Ettlin, Dominik A; Gallo, Luigi M; Andreisek, Gustav
2017-01-01
Objectives: To qualitatively and quantitatively compare MRI of the temporomandibular joint (TMJ) at 7.0 T using high-permittivity dielectric pads and 3.0 T using a clinical high-resolution protocol. Methods: Institutional review board-approved study with written informed consent. 12 asymptomatic volunteers were imaged at 7.0 and 3.0 T using 32-channel head coils. High-permittivity dielectric pads consisting of barium titanate in deuterated suspension were used for imaging at 7.0 T. Imaging protocol consisted of oblique sagittal proton density weighted turbo spin echo sequences. For quantitative analysis, pixelwise signal-to-noise ratio maps of the TMJ were calculated. For qualitative analysis, images were evaluated by two independent readers using 5-point Likert scales. Quantitative and qualitative results were compared using t-tests and Wilcoxon signed-rank tests, respectively. Results: TMJ imaging at 7.0 T using high-permittivity dielectric pads was feasible in all volunteers. Quantitative analysis showed similar signal-to-noise ratio for both field strengths (mean ± SD; 7.0 T, 13.02 ± 3.92; 3.0 T, 14.02 ± 3.41; two-sample t-tests, p = 0.188). At 7.0 T, qualitative analysis yielded better visibility of all anatomical subregions of the temporomandibular disc (anterior band, intermediate zone and posterior band) than 3.0 T (Wilcoxon signed-rank tests, p < 0.05, corrected for multiple comparisons). Conclusions: MRI of the TMJ at 7.0 T using high-permittivity dielectric pads yields superior visibility of the temporomandibular disc compared with 3.0 T. PMID:27704872
Wass, Val; Roberts, Celia; Hoogenboom, Ron; Jones, Roger; Van der Vleuten, Cees
2003-01-01
Objective To assess the effect of ethnicity on student performance in stations assessing communication skills within an objective structured clinical examination. Design Quantitative and qualitative study. Setting A final UK clinical examination consisting of a two day objective structured clinical examination with 22 stations. Participants 82 students from ethnic minorities and 97 white students. Main outcome measures Mean scores for stations (quantitative) and observations made using discourse analysis on selected communication stations (qualitative). Results Mean performance of students from ethnic minorities was significantly lower than that of white students for stations assessing communication skills on days 1 (67.0% (SD 6.8%) and 72.3% (7.6%); P=0.001) and 2 (65.2% (6.6%) and 69.5% (6.3%); P=0.003). No examples of overt discrimination were found in 309 video recordings. Transcriptions showed subtle differences in communication styles in some students from ethnic minorities who performed poorly. Examiners' assumptions about what is good communication may have contributed to differences in grading. Conclusions There was no evidence of explicit discrimination between students from ethnic minorities and white students in the objective structured clinical examination. A small group of male students from ethnic minorities used particularly poorly rated communicative styles, and some subtle problems in assessing communication skills may have introduced bias. Tests need to reflect issues of diversity to ensure that students from ethnic minorities are not disadvantaged.
What is already known on this topic: UK medical schools are concerned that students from ethnic minorities may perform less well than white students in examinations. It is important to understand whether our examination system disadvantages them.
What this study adds: Mean performance of students from ethnic minorities was significantly lower than that of white students in a final year objective structured clinical examination. Two possible reasons for the difference were poor communicative performance of a small group of male students from ethnic minorities and examiners' use of a textbook patient centred notion of good communication. Issues of diversity in test construction and implementation must be addressed to ensure that students from ethnic minorities are not disadvantaged.
PMID:12689978
Kurita, Junki; Shoji, Jun; Inada, Noriko; Yoneda, Tsuyoshi; Sumi, Tamaki; Kobayashi, Masahiko; Hoshikawa, Yasuhiro; Fukushima, Atsuki; Yamagami, Satoru
2018-06-01
Digitization of clinical observation is necessary for assessing the severity of superior limbic keratoconjunctivitis (SLK). This study aimed to use a novel quantitative marker to examine hyperemia in patients with SLK. We included six eyes of six patients with both dry eye disease and SLK (SLK group) and eight eyes of eight patients with Sjögren syndrome (SS group). We simultaneously obtained the objective finding scores by using slit-lamp examination and calculated the superior hyperemia index (SHI) with an automated conjunctival hyperemia analysis software by using photographs of the anterior segment. Three objective finding scores, including papillary formation of the superior palpebral conjunctiva, superior limbal hyperemia and swelling, and superior corneal epitheliopathy, were determined. The SHI was calculated as the superior/temporal ratio of bulbar conjunctival hyperemia by using the software. Fisher's exact test was used to compare a high SHI (≥1.07) ratio between the SLK and SS groups. P-Values < 0.05 were considered statistically significant. The SHI (mean ± standard deviation) in the SLK and SS groups was 1.19 ± 0.50 and 0.69 ± 0.24, respectively. The number of patients with a high SHI (≥1.07) was significantly higher in the SLK group than in the SS group (p < 0.05). The sensitivity and specificity of the SHI in the differential diagnosis between SS and SLK were 66.7% and 87.5%, respectively. An analysis of the association between the objective finding scores and SHI showed that the SHI had a tendency to indicate the severity of superior limbal hyperemia and swelling score in the SLK group. The SHI calculated using the automated conjunctival hyperemia analysis software could successfully quantify superior bulbar conjunctival hyperemia and may be a useful tool for the differential diagnosis between SS and SLK and for the quantitative follow-up of patients with SLK.
Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J
2014-01-01
The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.
Quantitative fluorescence microscopy and image deconvolution.
Swedlow, Jason R
2013-01-01
Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches--deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Their proper use demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image. A very common image-processing algorithm, image deconvolution, is used to remove blurred signal from an image. Copyright © 1998 Elsevier Inc. All rights reserved.
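As an executable example of the restoration class described here, the sketch below applies Richardson-Lucy deconvolution, one widely used restoration algorithm, to a synthetic blurred object; the PSF, noise level, and iteration count are arbitrary illustrative choices:

```python
import numpy as np
from scipy.signal import convolve2d
from skimage import restoration

rng = np.random.default_rng(0)

# Synthetic object blurred by a known PSF, plus a little noise
psf = np.ones((5, 5)) / 25.0                      # stand-in for a measured PSF
obj = np.zeros((64, 64))
obj[30:34, 30:34] = 1.0                           # small bright structure
blurred = convolve2d(obj, psf, mode="same")
blurred += 0.01 * rng.standard_normal(blurred.shape)

# Estimate an object that, convolved with the PSF, reproduces the image
restored = restoration.richardson_lucy(np.clip(blurred, 0, None), psf, 30)
```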
Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Van Dorssaeler, Alain; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne
2016-01-30
Proteomic workflows based on nanoLC-MS/MS data-dependent-acquisition analysis have progressed tremendously in recent years. High-resolution and fast sequencing instruments have enabled the use of label-free quantitative methods, based either on spectral counting or on MS signal analysis, which appear as an attractive way to analyze differential protein expression in complex biological samples. However, the computational processing of the data for label-free quantification still remains a challenge. Here, we used a proteomic standard composed of an equimolar mixture of 48 human proteins (Sigma UPS1) spiked at different concentrations into a background of yeast cell lysate to benchmark several label-free quantitative workflows, involving different software packages developed in recent years. This experimental design allowed to finely assess their performances in terms of sensitivity and false discovery rate, by measuring the number of true and false-positive (respectively UPS1 or yeast background proteins found as differential). The spiked standard dataset has been deposited to the ProteomeXchange repository with the identifier PXD001819 and can be used to benchmark other label-free workflows, adjust software parameter settings, improve algorithms for extraction of the quantitative metrics from raw MS data, or evaluate downstream statistical methods. Bioinformatic pipelines for label-free quantitative analysis must be objectively evaluated in their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. This can be done through the use of complex spiked samples, for which the "ground truth" of variant proteins is known, allowing a statistical evaluation of the performances of the data processing workflow. We provide here such a controlled standard dataset and used it to evaluate the performances of several label-free bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, for detection of variant proteins with different absolute expression levels and fold change values. The dataset presented here can be useful for tuning software tool parameters, and also testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. Copyright © 2015 Elsevier B.V. All rights reserved.
A method for three-dimensional quantitative observation of the microstructure of biological samples
NASA Astrophysics Data System (ADS)
Wang, Pengfei; Chen, Dieyan; Ma, Wanyun; Wu, Hongxin; Ji, Liang; Sun, Jialin; Lv, Danyu; Zhang, Lu; Li, Ying; Tian, Ning; Zheng, Jinggao; Zhao, Fengying
2009-07-01
Contemporary biology has developed into the era of cell biology and molecular biology, and researchers now study the mechanisms of all kinds of biological phenomena at the microscopic level. Accurate description of the microstructure of biological samples is an exigent need of many biomedical experiments. This paper introduces a method for 3-dimensional quantitative observation of the microstructure of vital biological samples based on two-photon laser scanning microscopy (TPLSM). TPLSM is a novel kind of fluorescence microscopy, which excels in its low optical damage, high resolution, deep penetration depth, and suitability for 3-dimensional (3D) imaging. Fluorescently stained samples were observed by TPLSM, and the original shapes of the samples were then obtained through 3D image reconstruction. The spatial distribution of all objects in the samples, as well as their volumes, could be derived by image segmentation and mathematical calculation. Thus the 3-dimensionally and quantitatively depicted microstructure of the samples was finally derived. We applied this method to quantitative analysis of the spatial distribution of chromosomes in meiotic mouse oocytes at metaphase, with excellent results.
Improved FTA Methodology and Application to Subsea Pipeline Reliability Design
Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan
2014-01-01
An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form. PMID:24667681
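For independent basic events, the quantitative step of a fault or failure tree reduces to simple gate algebra: AND gates multiply probabilities, OR gates combine complements. A minimal sketch with hypothetical pipeline failure probabilities:

```python
from functools import reduce

def and_gate(probs):
    """P(all events occur), independent basic events."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    """P(at least one event occurs), independent basic events."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical: failure if (corrosion AND coating breach) OR third-party impact
p_top = or_gate([and_gate([0.02, 0.10]), 0.005])
print(f"{p_top:.4f}")  # -> 0.0070
```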
Gao, Hua-Jun; Chen, Ya-Jing; Zuo, Duo; Xiao, Ming-Ming; Li, Ying; Guo, Hua; Zhang, Ning; Chen, Rui-Bing
2015-01-01
Objective Hepatocellular carcinoma (HCC) is a leading cause of cancer-related deaths. Novel serum biomarkers are required to increase the sensitivity and specificity of serum screening for early HCC diagnosis. This study employed a quantitative proteomic strategy to analyze the differential expression of serum glycoproteins between HCC and normal control serum samples. Methods Lectin affinity chromatography (LAC) was used to enrich glycoproteins from the serum samples. Quantitative mass spectrometric analysis combined with stable isotope dimethyl labeling and 2D liquid chromatography (LC) separations were performed to examine the differential levels of the detected proteins between HCC and control serum samples. Western blot was used to analyze the differential expression levels of the three serum proteins. Results A total of 2,280 protein groups were identified in the serum samples from HCC patients by using the 2D LC-MS/MS method. Up to 36 proteins were up-regulated in the HCC serum, whereas 19 proteins were down-regulated. Three differential glycoproteins, namely, fibrinogen gamma chain (FGG), FOS-like antigen 2 (FOSL2), and α-1,6-mannosylglycoprotein 6-β-N-acetylglucosaminyltransferase B (MGAT5B) were validated by Western blot. All these three proteins were up-regulated in the HCC serum samples. Conclusion A quantitative glycoproteomic method was established and proven useful to determine potential novel biomarkers for HCC. PMID:26487969
Sandhu, Rupninder; Chollet-Hinton, Lynn; Kirk, Erin L; Midkiff, Bentley; Troester, Melissa A
2016-02-01
Complete age-related regression of mammary epithelium, often termed postmenopausal involution, is associated with decreased breast cancer risk. However, most studies have qualitatively assessed involution. We quantitatively analyzed epithelium, stroma, and adipose tissue from histologically normal breast tissue of 454 patients in the Normal Breast Study. High-resolution digital images of normal breast hematoxylin and eosin-stained slides were partitioned into epithelium, adipose tissue, and nonfatty stroma. Percentage area and nuclei per unit area (nuclear density) were calculated for each component. Quantitative data were evaluated in association with age using linear regression and cubic spline models. Stromal area decreased (P = 0.0002), and adipose tissue area increased (P < 0.0001), with an approximate 0.7% change in area for each component, until age 55 years when these area measures reached a steady state. Although epithelial area did not show linear changes with age, epithelial nuclear density decreased linearly beginning in the third decade of life. No significant age-related trends were observed for stromal or adipose nuclear density. Digital image analysis offers a high-throughput method for quantitatively measuring tissue morphometry and for objectively assessing age-related changes in adipose tissue, stroma, and epithelium. Epithelial nuclear density is a quantitative measure of age-related breast involution that begins to decline in the early premenopausal period. Copyright © 2015 Elsevier Inc. All rights reserved.
Lee, Alex Pui-Wai; Fang, Fang; Jin, Chun-Na; Kam, Kevin Ka-Ho; Tsui, Gary K W; Wong, Kenneth K Y; Looi, Jen-Li; Wong, Randolph H L; Wan, Song; Sun, Jing Ping; Underwood, Malcolm J; Yu, Cheuk-Man
2014-01-01
The mitral valve (MV) has complex 3-dimensional (3D) morphology and motion. Advance in real-time 3D echocardiography (RT3DE) has revolutionized clinical imaging of the MV by providing clinicians with realistic visualization of the valve. Thus far, RT3DE of the MV structure and dynamics has adopted an approach that depends largely on subjective and qualitative interpretation of the 3D images of the valve, rather than objective and reproducible measurement. RT3DE combined with image-processing computer techniques provides precise segmentation and reliable quantification of the complex 3D morphology and rapid motion of the MV. This new approach to imaging may provide additional quantitative descriptions that are useful in diagnostic and therapeutic decision-making. Quantitative analysis of the MV using RT3DE has increased our understanding of the pathologic mechanism of degenerative, ischemic, functional, and rheumatic MV disease. Most recently, 3D morphologic quantification has entered into clinical use to provide more accurate diagnosis of MV disease and for planning surgery and transcatheter interventions. Current limitations of this quantitative approach to MV imaging include labor-intensiveness during image segmentation and lack of a clear definition of the clinical significance of many of the morphologic parameters. This review summarizes the current development and applications of quantitative analysis of the MV morphology using RT3DE.
Valdés, Pablo A.; Leblond, Frederic; Kim, Anthony; Harris, Brent T.; Wilson, Brian C.; Fan, Xiaoyao; Tosteson, Tor D.; Hartov, Alex; Ji, Songbai; Erkmen, Kadir; Simmons, Nathan E.; Paulsen, Keith D.; Roberts, David W.
2011-01-01
Object Accurate discrimination between tumor and normal tissue is crucial for optimal tumor resection. Qualitative fluorescence of protoporphyrin IX (PpIX), synthesized endogenously following δ-aminolevulinic acid (ALA) administration, has been used for this purpose in high-grade glioma (HGG). The authors show that diagnostically significant but visually imperceptible concentrations of PpIX can be quantitatively measured in vivo and used to discriminate normal from neoplastic brain tissue across a range of tumor histologies. Methods The authors studied 14 patients with diagnoses of low-grade glioma (LGG), HGG, meningioma, and metastasis under an institutional review board–approved protocol for fluorescence-guided resection. The primary aim of the study was to compare the diagnostic capabilities of a highly sensitive, spectrally resolved quantitative fluorescence approach to conventional fluorescence imaging for detection of neoplastic tissue in vivo. Results A significant difference in the quantitative measurements of PpIX concentration occurred in all tumor groups compared with normal brain tissue. Receiver operating characteristic (ROC) curve analysis of PpIX concentration as a diagnostic variable for detection of neoplastic tissue yielded a classification efficiency of 87% (AUC = 0.95, specificity = 92%, sensitivity = 84%) compared with 66% (AUC = 0.73, specificity = 100%, sensitivity = 47%) for conventional fluorescence imaging (p < 0.0001). More than 81% (57 of 70) of the quantitative fluorescence measurements that were below the threshold of the surgeon's visual perception were classified correctly in an analysis of all tumors. Conclusions These findings are clinically profound because they demonstrate that ALA-induced PpIX is a targeting biomarker for a variety of intracranial tumors beyond HGGs. This study is the first to measure quantitative ALA-induced PpIX concentrations in vivo, and the results have broad implications for guidance during resection of intracranial tumors. PMID:21438658
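The ROC analysis used above to rate PpIX concentration as a diagnostic variable can be reproduced in outline as follows; the concentrations and labels are hypothetical, and Youden's J is just one common rule for choosing an operating threshold:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical PpIX concentrations with histology labels (1 = neoplastic)
labels = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1])
c_ppix = np.array([0.01, 0.03, 0.02, 0.10, 0.08, 0.40, 0.90, 0.25, 1.70])

auc = roc_auc_score(labels, c_ppix)               # classification efficiency
fpr, tpr, thresholds = roc_curve(labels, c_ppix)
best = np.argmax(tpr - fpr)                       # Youden's J operating point
print(auc, thresholds[best])
```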
The stellar wind of an O8.5 I(f) star in M 31
NASA Technical Reports Server (NTRS)
Haser, S. M.; Lennon, D. J.; Kudritzki, R.-P.; Puls, J.; Pauldrach, A. W. A.; Bianchi, L.; Hutchings, J. B.
1995-01-01
We re-examine the UV spectrum of OB 78#231, an O8.5 I(f) star in the Andromeda galaxy M 31, obtained with the Faint Object Spectrograph on the Hubble Space Telescope by Hutchings et al. (1992). The spectrum has been re-extracted with better knowledge of background, calibration, and scattered light. The empirical analysis of the stellar wind lines yields a terminal velocity and mass-loss rate similar to those typically found in comparable galactic objects. Furthermore, a comparison with an FOS spectrum of an O7 supergiant in the Small Magellanic Cloud and IUE spectra of galactic objects implies a metallicity close to that of the galactic counterparts. These results are confirmed quantitatively by spectrum synthesis calculations using a theoretical description of O-star winds.
Using Live-Crown Ratio to Control Wood Quality: An Example of Quantitative Silviculture
Thomas J. Dean
1999-01-01
Quantitative silviculture is the application of biological relationships in meeting specific, quantitative management objectives. It is a two-sided approach requiring both the identification and the application of biological relationships. An example of quantitative silviculture is presented that uses a relationship between average live-crown ratio and relative stand density...
Derkacs, Amanda D Felder; Ward, Samuel R; Lieber, Richard L
2012-02-01
Understanding cytoskeletal dynamics in living tissue is a prerequisite to understanding mechanisms of injury, mechanotransduction, and mechanical signaling. Real-time visualization is now possible using transfection with plasmids that encode fluorescent cytoskeletal proteins. Using this approach with the muscle-specific intermediate filament protein desmin, we found that a green fluorescent protein-desmin chimeric protein was unevenly distributed throughout the muscle fiber, resulting in some image areas that were saturated as well as others that lacked any signal. Our goal was to analyze the muscle fiber cytoskeletal network quantitatively in an unbiased fashion. To objectively select areas of the muscle fiber that are suitable for analysis, we devised a method that provides objective classification of regions of images of striated cytoskeletal structures into "usable" and "unusable" categories. This method consists of a combination of spatial analysis of the image using Fourier methods along with a boosted neural network that "decides" on the quality of the image based on previous training. We trained the neural network using the expert opinion of three scientists familiar with these types of images. We found that this method was over 300 times faster than manual classification and that it permitted objective and accurate classification of image regions.
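A minimal sketch of this screening idea, assuming illustrative patch sizes and frequency-band limits; the paper's exact features are not given here, and a boosted tree ensemble stands in for its boosted neural network:

import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def striation_score(patch, f_lo=0.05, f_hi=0.25):
    """Fraction of spectral power in the band where striation
    periodicity is expected (frequencies in cycles/pixel)."""
    spec = np.abs(np.fft.rfft2(patch - patch.mean())) ** 2
    fy = np.fft.fftfreq(patch.shape[0])[:, None]
    fx = np.fft.rfftfreq(patch.shape[1])[None, :]
    r = np.hypot(fy, fx)
    band = (r >= f_lo) & (r <= f_hi)
    return spec[band].sum() / spec.sum()

def train_patch_classifier(X, y):
    # X: per-patch feature vectors (e.g., striation scores at several
    # band settings); y: expert "usable"/"unusable" labels.
    return AdaBoostClassifier(n_estimators=100).fit(X, y)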
Methodology for determining the investment attractiveness of construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Nezhnikova, Ekaterina; Kashirin, Valentin; Davydova, Yana; Kazakova, Svetlana
2018-03-01
The article presents an analysis of the existing methods for assessing the investment attractiveness of high-rise construction. The authors determined and justified the primary choice of objects and territories that are the most attractive for the development of high-rise construction. A system of risk indicators has been developed that allows a quantitative adjustment to be made for a particular project when evaluating the efficiency of investment projects. The study is aimed at developing basic methodological concepts for a comparative evaluation of the prospects of high-rise construction, concepts that take into consideration the features of investment in construction and enable quantitative evaluation of investment effectiveness in high-rise construction.
Masci, Ilaria; Vannozzi, Giuseppe; Bergamini, Elena; Pesce, Caterina; Getchell, Nancy; Cappozzo, Aurelio
2013-04-01
Objective quantitative evaluation of motor skill development is of increasing importance for carefully guiding physical exercise programs in childhood. Running is a fundamental motor skill that humans adopt to accomplish locomotion and is linked to physical activity levels, yet its assessment is traditionally carried out using qualitative evaluation tests. The present study aimed at investigating the feasibility of using inertial sensors to quantify developmental differences in the running pattern of young children. Qualitative and quantitative assessment tools were adopted to identify a skill-sensitive set of biomechanical parameters for running and to further our understanding of the factors that determine progression to skilled running performance. Running performances of 54 children between the ages of 2 and 12 years were submitted to both qualitative and quantitative analysis, the former using sequences of developmental level, the latter estimating temporal and kinematic parameters from inertial sensor measurements. Discriminant analysis with running developmental level as the dependent variable made it possible to identify, among the parameters obtained with the sensor, a set of temporal and kinematic parameters that best classified children into the qualitative developmental levels (accuracy higher than 67%). Multivariate analysis of variance with the quantitative parameters as dependent variables made it possible to identify which specific parameters or parameter subsets, if any, were differentially sensitive to specific transitions between contiguous developmental levels. The findings showed that different sets of temporal and kinematic parameters are able to tap all steps of the transitional process in running skill described through qualitative observation and can prospectively be used for applied diagnostic and sport training purposes. Copyright © 2012 Elsevier B.V. All rights reserved.
Systematic review of quantitative clinical gait analysis in patients with dementia.
van Iersel, M B; Hoefsloot, W; Munneke, M; Bloem, B R; Olde Rikkert, M G M
2004-02-01
Diminished mobility often accompanies dementia and has a great impact on independence and quality of life. New treatment strategies for dementia are emerging, but their effects on gait remain to be studied objectively. In this review we address the general effects of dementia on gait as revealed by quantitative gait analysis. A systematic literature search was performed with the MeSH terms 'dementia' and 'gait disorders' in Medline, CC, PsycLIT and CINAHL for 1980-2002. Main inclusion criteria: controlled studies; patients with dementia; quantitative gait data. Seven publications met the inclusion criteria. All compared gait in Alzheimer's disease (AD) with that of healthy elderly controls; one also assessed gait in vascular dementia (VaD). The methodology used was inconsistent and often had many shortcomings. However, there were several consistent findings: walking velocity was decreased in dementia compared with healthy controls and decreased further with progressing severity of dementia. VaD was associated with a significant decrease in walking velocity compared with AD subjects. Dementia was associated with a shortened step length, an increased double support time and increased step-to-step variability. Gait in dementia has rarely been analyzed in a well-designed manner. Despite this, the literature suggests that quantitative gait analysis can be sufficiently reliable and responsive to measure the decline in walking velocity between subjects with and without dementia. More research is required to assess, both on an individual and a group level, how the minimal clinically relevant changes in gait in elderly demented patients should be defined and what would be the most responsive method to measure these changes.
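For illustration, the basic spatiotemporal parameters compared across these studies can be derived from heel-strike times and walkway length; a minimal sketch with assumed variable names:

import numpy as np

def gait_parameters(heel_strikes_s, walkway_m):
    steps = np.diff(heel_strikes_s)                 # step times (s)
    total_time = heel_strikes_s[-1] - heel_strikes_s[0]
    velocity = walkway_m / total_time               # m/s
    step_length = walkway_m / len(steps)            # mean, assumes a straight walk
    step_time_cv = 100 * steps.std(ddof=1) / steps.mean()  # variability, %
    return velocity, step_length, step_time_cv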
Lupidi, Marco; Coscas, Florence; Cagini, Carlo; Fiore, Tito; Spaccini, Elisa; Fruttini, Daniela; Coscas, Gabriel
2016-09-01
To describe a new automated quantitative technique for displaying and analyzing macular vascular perfusion using optical coherence tomography angiography (OCT-A) and to determine a normative data set, which might be used as a reference in identifying progressive changes due to different retinal vascular diseases. Reliability study. A retrospective review of 47 eyes of 47 consecutive healthy subjects imaged with a spectral-domain OCT-A device was performed in a single institution. Full-spectrum amplitude-decorrelation angiography generated OCT angiograms of the retinal superficial and deep capillary plexuses. Fully automated custom-built software was used to provide quantitative data on foveal avascular zone (FAZ) features and the total vascular and avascular surfaces. A comparative analysis between central macular thickness (and volume) and FAZ metrics was performed. Repeatability and reproducibility were also assessed in order to establish the feasibility and reliability of the method. The comparative analysis between the superficial capillary plexus and the deep capillary plexus revealed a statistically significant difference (P < .05) in FAZ perimeter, surface, and major axis, and no statistically significant difference (P > .05) for the total vascular and avascular surfaces. A linear correlation was demonstrated between central macular thickness (and volume) and the FAZ surface. Coefficients of repeatability and reproducibility were less than 0.4, thus demonstrating high intraobserver repeatability and interobserver reproducibility for all the examined data. A quantitative approach to retinal vascular perfusion, which is visible on Spectralis OCT angiography, may offer an objective and reliable method for monitoring disease progression in several retinal vascular diseases. Copyright © 2016 Elsevier Inc. All rights reserved.
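A hedged sketch of how such FAZ metrics and a repeatability coefficient might be computed from a binary avascular-zone mask; the study's custom software is not public, so the pixel scaling and metric definitions here are assumptions:

import numpy as np
from skimage import measure

def faz_metrics(mask, um_per_px):
    """mask: binary image, 1 = avascular; returns surface, perimeter, major axis."""
    props = measure.regionprops(measure.label(mask))
    faz = max(props, key=lambda p: p.area)        # largest avascular region
    surface = faz.area * um_per_px ** 2           # um^2
    perimeter = faz.perimeter * um_per_px         # um
    major_axis = faz.major_axis_length * um_per_px
    return surface, perimeter, major_axis

def coefficient_of_repeatability(m1, m2):
    # 1.96 * SD of differences between two repeated measurement sessions
    return 1.96 * np.std(np.asarray(m1) - np.asarray(m2), ddof=1)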
High-resolution ab initio three-dimensional x-ray diffraction microscopy
Chapman, Henry N.; Barty, Anton; Marchesini, Stefano; ...
2006-01-01
Coherent x-ray diffraction microscopy is a method of imaging nonperiodic isolated objects at resolutions limited, in principle, only by the wavelength and the largest scattering angles recorded. We demonstrate x-ray diffraction imaging with high resolution in all three dimensions, as determined by a quantitative analysis of the reconstructed volume images. These images are retrieved from the three-dimensional diffraction data using no a priori knowledge about the shape or composition of the object, which has never before been demonstrated on a nonperiodic object. We also construct two-dimensional images of thick objects with greatly increased depth of focus (without loss of transverse spatial resolution). These methods can be used to image biological and materials science samples at high resolution with x-ray undulator radiation and establish the techniques to be used in atomic-resolution ultrafast imaging at x-ray free-electron laser sources.
Objective measurement of accommodative biometric changes using ultrasound biomicroscopy
Ramasubramanian, Viswanathan; Glasser, Adrian
2015-01-01
PURPOSE To demonstrate that ultrasound biomicroscopy (UBM) can be used for objective quantitative measurements of anterior segment accommodative changes. SETTING College of Optometry, University of Houston, Houston, Texas, USA. DESIGN Prospective cross-sectional study. METHODS Anterior segment biometric changes in response to 0 to 6.0 diopters (D) of accommodative stimuli in 1.0 D steps were measured in eyes of human subjects aged 21 to 36 years. Imaging was performed in the left eye using a 35 MHz UBM (Vumax) and an A-scan ultrasound (A-5500) while the right eye viewed the accommodative stimuli. An automated Matlab image-analysis program was developed to measure the biometry parameters from the UBM images. RESULTS The UBM-measured accommodative changes in anterior chamber depth (ACD), lens thickness, anterior lens radius of curvature, posterior lens radius of curvature, and anterior segment length were statistically significantly (P < .0001) linearly correlated with accommodative stimulus amplitudes. Standard deviations of the UBM-measured parameters were independent of the accommodative stimulus demands (ACD 0.0176 mm, lens thickness 0.0294 mm, anterior lens radius of curvature 0.3350 mm, posterior lens radius of curvature 0.1580 mm, and anterior segment length 0.0340 mm). The mean difference between the A-scan and UBM measurements was −0.070 mm for ACD and 0.166 mm for lens thickness. CONCLUSIONS Accommodating phakic eyes imaged using UBM allowed visualization of the accommodative response, and automated image analysis of the UBM images allowed reliable, objective, quantitative measurements of the accommodative intraocular biometric changes. PMID:25804579
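The reported stimulus-response linearity corresponds to a per-parameter linear regression against accommodative demand; a minimal sketch with hypothetical ACD values:

import numpy as np
from scipy import stats

stimulus_D = np.arange(0.0, 6.5, 1.0)     # 0 to 6 D in 1.0 D steps
acd_mm = np.array([3.65, 3.61, 3.57, 3.52, 3.47, 3.43, 3.40])  # hypothetical ACD data

fit = stats.linregress(stimulus_D, acd_mm)
print(f"slope = {fit.slope:.3f} mm/D, r = {fit.rvalue:.3f}, p = {fit.pvalue:.2g}")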
Good practices for quantitative bias analysis.
Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander
2014-12-01
Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage more widespread use of bias analysis to estimate the potential magnitude and direction of biases, as well as the uncertainty in estimates potentially influenced by the biases. © The Author 2014; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
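As one concrete instance of the simple bias analyses the authors describe, a 2x2 table can be back-corrected for nondifferential exposure misclassification under assumed sensitivity and specificity; the counts and bias parameters below are illustrative:

def corrected_2x2(a_obs, b_obs, c_obs, d_obs, se, sp):
    """a/b: exposed/unexposed cases; c/d: exposed/unexposed controls.
    se, sp: assumed sensitivity and specificity of exposure classification."""
    A = (a_obs - (1 - sp) * (a_obs + b_obs)) / (se + sp - 1)
    C = (c_obs - (1 - sp) * (c_obs + d_obs)) / (se + sp - 1)
    B = (a_obs + b_obs) - A
    D = (c_obs + d_obs) - C
    return A, B, C, D

A, B, C, D = corrected_2x2(215, 1449, 668, 4296, se=0.90, sp=0.95)
or_corrected = (A * D) / (B * C)   # bias-corrected odds ratio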
Forest management and economics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buongiorno, J.; Gilless, J.K.
1987-01-01
This volume provides a survey of quantitative methods, guiding the reader through formulation and analysis of models that address forest management problems. The authors use simple mathematics, graphics, and short computer programs to explain each method. Emphasizing applications, they discuss linear, integer, dynamic, and goal programming; simulation; network modeling; and econometrics, as these relate to problems of determining economic harvest schedules in even-aged and uneven-aged forests, the evaluation of forest policies, multiple-objective decision making, and more.
Jose M. Iniguez; Joseph L. Ganey; Peter J. Daugherty; John D. Bailey
2005-01-01
The objective of this study was to develop a rule-based cover type classification system for the forest and woodland vegetation in the Sky Islands of southeastern Arizona. In order to develop such a system we qualitatively and quantitatively compared a hierarchical (Ward's) and a non-hierarchical (k-means) clustering method. Ecologically, unique groups represented by...
Representation and Reconstruction of Three-dimensional Microstructures in Ni-based Superalloys
2010-12-20
Materialia, 56, pp. 427-437 (2009). • Application of joint histogram and mutual information to registration and data fusion problems in serial-sectioning data sets and synthetically generated microstructures; the method is easy to use and allows for a quantitative description of shapes. • The following objectives were achieved: we have successfully applied 3-D moment invariant analysis to several experimental data sets; we have extended 2-D
Quantitative motor assessment of muscular weakness in myasthenia gravis: a pilot study.
Hoffmann, Sarah; Siedler, Jana; Brandt, Alexander U; Piper, Sophie K; Kohler, Siegfried; Sass, Christian; Paul, Friedemann; Reilmann, Ralf; Meisel, Andreas
2015-12-23
Muscular weakness in myasthenia gravis (MG) is commonly assessed using the Quantitative Myasthenia Gravis Score (QMG). More objective and quantitative measures may complement the use of clinical scales and might detect subclinical involvement of muscles. We hypothesized that muscular weakness in patients with MG can be quantified with the non-invasive Quantitative Motor (Q-Motor) tests for Grip Force Assessment (QGFA) and Involuntary Movement Assessment (QIMA) and that pathological findings correlate with disease severity as measured by QMG. This was a cross-sectional pilot study investigating patients with a confirmed diagnosis of MG. Data were compared with healthy controls (HC). Subjects were asked to lift a device (250 and 500 g) equipped with electromagnetic sensors that measured grip force (GF) and three-dimensional changes in position and orientation. These were used to calculate the position index (PI) and orientation index (OI) as measures for involuntary movements due to muscular weakness. Overall, 40 MG patients and 23 HC were included. PI and OI were significantly higher in MG patients for both weights in the dominant and non-dominant hand. Subgroup analysis revealed that patients with clinically ocular myasthenia gravis (OMG) also showed significantly higher values for PI and OI in both hands and for both weights. Disease severity correlates with QIMA performance in the non-dominant hand. Q-Motor tests and particularly QIMA may be useful objective tools for measuring motor impairment in MG and seem to detect subclinical generalized motor signs in patients with OMG. Q-Motor parameters might serve as sensitive endpoints for clinical trials in MG.
Prioritising coastal zone management issues through fuzzy cognitive mapping approach.
Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi
2012-04-30
Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. We then investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), where the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and for understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
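A minimal sketch of the fuzzy-cognitive-map inference typically used to analyze such maps, assuming a signed weight matrix elicited from the round tables and a sigmoid squashing function:

import numpy as np

def fcm_run(W, x0, steps=50, lam=1.0):
    """W[i, j]: signed influence of concept i on concept j; x0: initial
    activations in [0, 1]. Iterates the modified Kosko update with
    self-memory; convergence to a fixed point indicates a stable scenario."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-lam * (W.T @ x + x)))
    return x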
Genetic analysis of PAX3 for diagnosis of Waardenburg syndrome type I.
Matsunaga, Tatsuo; Mutai, Hideki; Namba, Kazunori; Morita, Noriko; Masuda, Sawako
2013-04-01
PAX3 genetic analysis increased the diagnostic accuracy for Waardenburg syndrome type I (WS1). Analysis of the three-dimensional (3D) structure of PAX3 helped verify the pathogenicity of a missense mutation, and multiple ligation-dependent probe amplification (MLPA) analysis of PAX3 increased the sensitivity of genetic diagnosis in patients with WS1. Clinical diagnosis of WS1 is often difficult in individual patients with isolated, mild, or non-specific symptoms. The objective of the present study was to facilitate the accurate diagnosis of WS1 through genetic analysis of PAX3 and to expand the spectrum of known PAX3 mutations. In two Japanese families with WS1, we conducted a clinical evaluation of symptoms and genetic analysis, which involved direct sequencing, MLPA analysis, quantitative PCR of PAX3, and analysis of the predicted 3D structure of PAX3. The normal-hearing control group comprised 92 subjects who had normal hearing according to pure tone audiometry. In one family, direct sequencing of PAX3 identified a heterozygous mutation, p.I59F. Analysis of PAX3 3D structures indicated that this mutation distorted the DNA-binding site of PAX3. In the other family, MLPA analysis and subsequent quantitative PCR detected a large, heterozygous deletion spanning 1759-2554 kb that eliminated 12-18 genes including a whole PAX3 gene.
Kuo, Phillip Hsin; Avery, Ryan; Krupinski, Elizabeth; Lei, Hong; Bauer, Adam; Sherman, Scott; McMillan, Natalie; Seibyl, John; Zubal, George
2013-03-01
A fully automated objective striatal analysis (OSA) program that quantitates dopamine transporter uptake in subjects with suspected Parkinson's disease was applied to images from clinical (123)I-ioflupane studies. Striatal binding ratios, in particular the specific binding ratio (SBR) of the putamen with the lowest uptake, were computed, and receiver-operating-characteristic (ROC) analysis was applied to 94 subjects to determine the best discriminator using this quantitative method. Ninety-four (123)I-ioflupane SPECT scans were analyzed from patients referred to our clinical imaging department and were reconstructed using the manufacturer-supplied reconstruction and filtering parameters for the radiotracer. Three trained readers conducted independent visual interpretations and reported each case as either normal or showing dopaminergic deficit (abnormal). The same images were analyzed using the OSA software, which locates the striatal and occipital structures and places regions of interest on the caudate and putamen. Additionally, the OSA places a region of interest on the occipital region that is used to calculate the background-subtracted SBR. The lower SBR of the 2 putamen regions was taken as the quantitative report. The 33 normal (bilateral comma-shaped striata) and 61 abnormal (unilateral or bilateral dopaminergic deficit) studies were analyzed to generate ROC curves. Twenty-nine of the scans were interpreted as normal and 59 as abnormal by all 3 readers. For 12 scans, the 3 readers did not unanimously agree in their interpretations (discordant). The ROC analysis, which used the visual-majority-consensus interpretation from the readers as the gold standard, yielded an area under the curve of 0.958 when using 1.08 as the threshold SBR for the lowest putamen. The sensitivity and specificity of the automated quantitative analysis were 95% and 89%, respectively. The OSA program delivers quantitative SBR values with high sensitivity and specificity compared with visual interpretations by trained nuclear medicine readers. Such a program could be a helpful aid for readers not yet experienced with (123)I-ioflupane SPECT images and, if further adapted and validated, may be useful to assess disease progression during pharmaceutical testing of therapies.
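A sketch of the background-subtracted SBR computation described above, assuming ROI masks from the automated placement step; reading values below the 1.08 threshold as dopaminergic deficit is an inference from the context, not a stated rule:

import numpy as np

def sbr(image, target_mask, occipital_mask):
    occ = image[occipital_mask].mean()
    return (image[target_mask].mean() - occ) / occ   # background-subtracted SBR

def lowest_putamen_sbr(image, left_mask, right_mask, occ_mask, thr=1.08):
    value = min(sbr(image, left_mask, occ_mask),
                sbr(image, right_mask, occ_mask))
    return value, ("abnormal" if value < thr else "normal")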
Quantitative holographic interferometry applied to combustion and compressible flow research
NASA Astrophysics Data System (ADS)
Bryanston-Cross, Peter J.; Towers, D. P.
1993-03-01
The application of holographic interferometry to phase object analysis is described. Emphasis has been given to a method of extracting quantitative information automatically from the interferometric fringe data. To achieve this, a carrier frequency has been added to the holographic data. This has made it possible, first, to form a phase map using a fast Fourier transform (FFT) algorithm and then to 'solve', or unwrap, this image to give a contiguous density map using a minimum-weight spanning tree (MST) noise-immune algorithm known as fringe analysis (FRAN). Applications of this work to a burner flame and a compressible flow are presented. In both cases the spatial frequency of the fringes exceeds the resolvable limit of conventional digital framestores. Therefore, a flatbed scanner with a resolution of 3200 x 2400 pixels has been used to produce very high resolution digital images from photographs. This approach has allowed the processing of data despite the presence of caustics generated by strong thermal gradients at the edge of the combustion field. A similar example is presented from the analysis of a compressible transonic flow in the shock wave and trailing edge regions.
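A minimal sketch of the carrier-fringe phase-map step (Takeda-style Fourier demodulation); the carrier column and sideband width are assumptions to be located from the spectrum, and the MST-based FRAN unwrapping is only indicated:

import numpy as np

def wrapped_phase(fringes, carrier_col):
    """carrier_col: column index of the carrier peak in the unshifted 2-D FFT."""
    F = np.fft.fft2(fringes)
    # keep one sideband: a window of columns around the carrier (assumed width)
    w = F.shape[1] // 8
    cols = (np.arange(F.shape[1]) - carrier_col) % F.shape[1]
    keep = (cols < w) | (cols > F.shape[1] - w)
    mask = np.zeros_like(F)
    mask[:, keep] = F[:, keep]
    demod = np.roll(mask, -carrier_col, axis=1)   # translate carrier to DC
    return np.angle(np.fft.ifft2(demod))          # wrapped phase map

# 2-D unwrapping (the role of the MST-based FRAN step) could follow, e.g.
# skimage.restoration.unwrap_phase(wrapped).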
Blackboard architecture for medical image interpretation
NASA Astrophysics Data System (ADS)
Davis, Darryl N.; Taylor, Christopher J.
1991-06-01
There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.
Zhu, H B; Su, C J; Tang, H F; Ruan, Z; Liu, D H; Wang, H; Qian, Y L
2017-10-20
Objective: To establish a method for rapid determination of 47 volatile organic compounds in workplace air using a portable gas chromatography-mass spectrometry (GC-MS) system. Methods: Mixed standard gases at different concentration levels were prepared by the static gas distribution method with high-purity nitrogen as the dilution gas. The samples were injected into the GC-MS by a hand-held probe. Retention time and characteristic ions were used for qualitative analysis, and the internal standard method was used for quantitation. Results: The 47 toxic substances were well separated and determined. The linear range of the method was 0.2-16.0 mg/m(3), and the relative standard deviation for 45 volatile organic compounds was 3.8%-15.8%. The average recovery was 79.3%-119.0%. Conclusion: The method is simple, accurate and sensitive, with good separation and a short analysis period; it can be used for qualitative and quantitative analysis of volatile organic compounds in the workplace and supports rapid identification and detection of occupational hazards.
An Attempt of Formalizing the Selection Parameters for Settlements Generalization in Small-Scales
NASA Astrophysics Data System (ADS)
Karsznia, Izabela
2014-12-01
The paper covers one of the most important problems concerning context-sensitive settlement selection for the purpose of small-scale maps. So far, no formal parameters for small-scale settlement generalization have been specified, hence the problem is an important and innovative challenge. It is also crucial from the practical point of view, as it is necessary to develop appropriate generalization algorithms for the generalization of the General Geographic Objects Database, which is an essential Spatial Data Infrastructure component in Poland. The author proposes and verifies quantitative generalization parameters for the settlement selection process in small-scale maps. The selection of settlements was carried out in two research areas, Lower Silesia and Łódź Province. Based on the conducted analysis, appropriate context-sensitive settlement selection parameters have been defined. Particular effort has been made to develop a methodology of quantitative settlement selection that would be useful in automation processes and would make it possible to keep the specifics of generalized objects unchanged.
Disease quantification on PET/CT images without object delineation
NASA Astrophysics Data System (ADS)
Tong, Yubing; Udupa, Jayaram K.; Odhner, Dewey; Wu, Caiyun; Fitzpatrick, Danielle; Winchell, Nicole; Schuster, Stephen J.; Torigian, Drew A.
2017-03-01
The derivation of quantitative information from images to make quantitative radiology (QR) clinically practical continues to face a major image analysis hurdle because of image segmentation challenges. This paper presents a novel approach to disease quantification (DQ) via positron emission tomography/computed tomography (PET/CT) images that explores how to decouple DQ methods from explicit dependence on object segmentation through the use of only object recognition results to quantify disease burden. The concept of an object-dependent disease map is introduced to express disease severity without performing explicit delineation and partial volume correction of either objects or lesions. The parameters of the disease map are estimated from a set of training image data sets. The idea is illustrated on 20 lung lesions and 20 liver lesions derived from 18F-2-fluoro-2-deoxy-D-glucose (FDG)-PET/CT scans of patients with various types of cancers and also on 20 NEMA PET/CT phantom data sets. Our preliminary results show that, on phantom data sets, "disease burden" can be estimated to within 2% of known absolute true activity. Notwithstanding the difficulty in establishing true quantification on patient PET images, our results achieve 8% deviation from "true" estimates, with slightly larger deviations for small and diffuse lesions where establishing ground truth becomes really questionable, and smaller deviations for larger lesions where ground truth set up becomes more reliable. We are currently exploring extensions of the approach to include fully automated body-wide DQ, extensions to just CT or magnetic resonance imaging (MRI) alone, to PET/CT performed with radiotracers other than FDG, and other functional forms of disease maps.
Combined elemental and microstructural analysis of genuine and fake copper-alloy coins
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartoli, L; Agresti, J; Mascalchi, M
2011-07-31
Innovative noninvasive material analysis techniques are applied to determine the archaeometallurgical characteristics of copper-alloy coins from Florence's National Museum of Archaeology. Three supposedly authentic Roman coins and three hypothetically fraudulent imitations are thoroughly investigated using laser-induced plasma spectroscopy and time-of-flight neutron diffraction, along with 3D videomicroscopy and electron microscopy. The material analyses are aimed at collecting data that allow for objective discrimination between genuine Roman productions and later fakes. The results show that the mentioned techniques provide quantitative compositional and textural data, which are strictly related to the manufacturing processes and aging of copper alloys.
Diagnostic analysis of liver B ultrasonic texture features based on LM neural network
NASA Astrophysics Data System (ADS)
Chi, Qingyun; Hua, Hu; Liu, Menglin; Jiang, Xiuying
2017-03-01
In this study, B ultrasound images from 124 patients with benign or malignant liver lesions were randomly selected as the study objects. The liver B ultrasound images were preprocessed with enhanced de-noising. Gray-level co-occurrence matrices reflecting the information of each angle were constructed, 22 texture features were extracted and reduced by principal component analysis, and the result was combined with an LM neural network for diagnosis and classification. Experimental results show that this is a rapid and effective diagnostic method for liver imaging, which provides a quantitative basis for the clinical diagnosis of liver diseases.
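A hedged sketch of this pipeline with scikit-image and scikit-learn; the feature list, PCA dimensionality and network size are assumptions, and since scikit-learn has no Levenberg-Marquardt trainer, the 'lbfgs' solver stands in for LM:

import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

def glcm_features(roi_u8):
    """Texture features from co-occurrence matrices at four angles."""
    g = graycomatrix(roi_u8, distances=[1],
                     angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                     levels=256, symmetric=True, normed=True)
    props = ["contrast", "dissimilarity", "homogeneity",
             "energy", "correlation", "ASM"]
    return np.hstack([graycoprops(g, p).ravel() for p in props])

def build_model():
    # PCA reduction followed by a small neural network classifier
    return make_pipeline(PCA(n_components=10),
                         MLPClassifier(hidden_layer_sizes=(16,),
                                       solver="lbfgs", max_iter=2000))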
Efficient Algorithm for Fuzzy Linear Programming with Multiple Objectives.
1984-12-01
constraint). For other reasons, at least 6 of the smallest trucks were wanted in the fleet. The management wanted to use quantitative analysis and ... to the price K_i. For each share, the values of the criteria are multiplied by a weight g_i, where g_i = 100/K_i (Eq. 66). This percentage transformation is useful in ... could be useful for a repeated and promising analysis.
POPA: A Personality and Object Profiling Assistant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dreicer, J.S.
POPA: A Personality and Object Profiling Assistant system utilizes an extension and variation of a process developed for decision analysis as a tool to quantify intuitive feelings and subjective judgments. The technique is based on a manipulation of the Analytical Hierarchy Process. The POPA system models an individual in terms of his character type, life orientation, and incentive (motivational) factors. Then an object (i.e., individual, project, situation, or policy) is modeled with respect to its three most important factors. The individual and object models are combined to indicate the influence each of the three object factors has on the individual. We have investigated this problem: 1) to develop a technique that models personality types in a quantitative and organized manner, 2) to develop a tool capable of evaluating the probable success of obtaining funding for proposed programs at Los Alamos National Laboratory, 3) to determine the feasibility of quantifying feelings and intuition, and 4) to better understand subjective knowledge acquisition (especially intuition). 49 refs., 10 figs., 5 tabs.
Objective fitting of hemoglobin dynamics in traumatic bruises based on temperature depth profiling
NASA Astrophysics Data System (ADS)
Vidovič, Luka; Milanič, Matija; Majaron, Boris
2014-02-01
Pulsed photothermal radiometry (PPTR) allows noninvasive measurement of laser-induced temperature depth profiles. The obtained profiles provide information on the depth distribution of absorbing chromophores, such as melanin and hemoglobin. We apply this technique to objectively characterize the mass diffusion and decomposition rate of extravasated hemoglobin during the bruise healing process. In the present study, we introduce objective fitting of PPTR data obtained over the course of bruise healing. By applying Monte Carlo simulation of laser energy deposition and simulation of the corresponding PPTR signal, quantitative analysis of the underlying healing processes is possible. Objective fitting enables a direct comparison between the simulated and experimental PPTR signals. In this manner, we avoid reconstruction of laser-induced depth profiles and the inherent loss of information in that process. This approach enables us to determine the value of hemoglobin mass diffusivity, which remains controversial in the existing literature. Such information will be a valuable addition to existing bruise age determination techniques.
Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W
2014-06-01
Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.
Cordero, Chiara; Canale, Francesca; Del Rio, Daniele; Bicchi, Carlo
2009-11-01
The present study focuses on the flavan-3-ols that characterize the antioxidant properties of fermented tea (Camellia sinensis). These bioactive compounds, the object of nutritional claims in commercial products, should be quantified with rigorous analytical procedures whose accuracy and precision have been established with a stated level of confidence. An HPLC-UV/DAD method, able to detect and quantify flavan-3-ols in infusions and ready-to-drink teas, has been developed for routine analysis and validated by characterizing several performance parameters. The accuracy assessment was run through a series of LC-MS/MS analyses. Epigallocatechin, (+)-catechin, (-)-epigallocatechingallate, (-)-epicatechin, (-)-gallocatechingallate, (-)-epicatechingallate, and (-)-catechingallate were chosen as markers of the polyphenolic fraction. Quantitative results showed that samples obtained from tea leaf infusion were richer in polyphenolic antioxidants than those obtained through other industrial processes. The influence of shelf life and packaging material on the flavan-3-ol content was also considered; the markers decreased with an exponential trend as a function of time within the shelf life, while the packaging materials were shown to influence the flavan-3-ol fraction composition differently over time. The method presented here provides quantitative results with a stated level of confidence and is suitable for routine quality control of iced teas whose antioxidant properties are the object of nutritional claims.
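The reported exponential shelf-life trend corresponds to a first-order decay fit; a minimal sketch with hypothetical data points:

import numpy as np
from scipy.optimize import curve_fit

t_days = np.array([0, 30, 60, 120, 180])
marker_mg_l = np.array([98.0, 80.5, 66.9, 45.1, 30.8])   # hypothetical values

decay = lambda t, c0, k: c0 * np.exp(-k * t)             # c(t) = c0 * exp(-k t)
(c0, k), _ = curve_fit(decay, t_days, marker_mg_l, p0=(100.0, 0.005))
half_life_days = np.log(2) / k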
NASA Astrophysics Data System (ADS)
Gibergans-Báguena, J.; Llasat, M. C.
2007-12-01
The objective of this paper is to present an improvement in the quantitative forecasting of daily rainfall in Catalonia (NE Spain) using an analogues technique that takes into account synoptic and local data. The method is based on an analogues sorting technique: meteorological situations similar to the current one, in terms of the 700 and 1000 hPa geopotential fields at 00 UTC, are sought in a historical data file and complemented with the inclusion of some thermodynamic parameters. Thermodynamic analysis acts as a highly discriminating feature in situations where the synoptic situation alone fails to explain either the atmospheric phenomena or the rainfall distribution. This is the case in heavy rainfall situations, where the existence of instability and high water vapor content is essential. To include these vertical thermodynamic features, information provided by the Palma de Mallorca (Spain) radiosounding has been used. First, a selection of the most discriminating thermodynamic parameters for daily rainfall was made, and the analogues technique was then applied to them. Finally, three analog forecasting methods were applied to quantitative daily rainfall forecasting in Catalonia. The first is based on analogies of geopotential fields at the synoptic scale; the second is based exclusively on the search for similarity in local thermodynamic information; and the third combines the other two. The results show that this last method provides a substantial improvement in quantitative rainfall estimation.
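A minimal sketch of the analog search at the core of such methods, assuming gridded geopotential fields and RMSE as the similarity measure (the paper's exact metric is not specified here):

import numpy as np

def best_analogs(today, archive, k=10):
    """today: (2, ny, nx) geopotential fields (700 and 1000 hPa);
    archive: (ndays, 2, ny, nx) historical fields."""
    rmse = np.sqrt(((archive - today) ** 2).mean(axis=(1, 2, 3)))
    order = np.argsort(rmse)[:k]
    return order, rmse[order]    # indices of the k most similar days

# The quantitative forecast can then be, e.g., the mean or median of the
# observed rainfall on the selected analog days, optionally after refining
# the candidate set with thermodynamic indices from the sounding.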
NASA Astrophysics Data System (ADS)
Boehm, H. F.; Fink, C.; Becker, C.; Reiser, M.
2007-03-01
Reliable and accurate methods for objective quantitative assessment of parenchymal alterations in the lung are necessary for the diagnosis, treatment and follow-up of pulmonary diseases. Two major types of alteration are pulmonary emphysema and fibrosis, emphysema being characterized by abnormal enlargement of the air spaces distal to the terminal, nonrespiratory bronchiole, accompanied by destructive changes of the alveolar walls. The main characteristic of fibrosis is coarsening of the interstitial fibers and compaction of the pulmonary tissue. With its ability to display anatomy free from superimposed structures and with greater visual clarity, multi-detector CT has been shown to be more sensitive than the chest radiograph in identifying alterations of the lung parenchyma. In automated evaluation of pulmonary CT scans, quantitative image processing techniques are applied for objective evaluation of the data. A number of methods have been proposed in the past, most of which utilize simple densitometric tissue features based on the mean X-ray attenuation coefficients expressed in Hounsfield units [HU]. Due to partial volume effects, most of the density-based methodologies tend to fail, particularly in cases where emphysema and fibrosis occur within narrow spatial limits. In this study, we propose a methodology based upon topological assessment of the gray-level distribution in 3D image data of lung tissue, which provides a way of improving quantitative CT evaluation. Results are compared to the more established density-based methods.
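For context, the established density-based indices mentioned above reduce to simple Hounsfield-unit thresholds over the segmented lung; a minimal sketch with commonly used (but here assumed) thresholds:

import numpy as np

def density_indices(hu, lung_mask):
    """hu: CT volume in Hounsfield units; lung_mask: boolean lung segmentation."""
    voxels = hu[lung_mask]
    emphysema_index = (voxels < -950).mean()                  # RA-950 fraction
    # high-attenuation fraction, one common fibrosis surrogate range
    high_attenuation = ((voxels > -600) & (voxels < -250)).mean()
    return emphysema_index, high_attenuation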
Cost analysis of objective resident cataract surgery assessments.
Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M
2015-05-01
To compare 8 ophthalmology resident surgical training tools to determine which is most cost effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs of running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on the cost of the tools, the cost of time spent in evaluations, and supply and maintenance costs. For wet laboratory simulation, Eyesi was the least expensive cataract surgery simulation method; however, it is capable only of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation of operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analyses of ophthalmology resident surgical training tools are needed so residency programs can implement tools that are valid, reliable, objective, and cost effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Cui, Xueliang; Chen, Hui; Rui, Yunfeng; Niu, Yang; Li, He
2018-01-01
Objectives Two-stage open reduction and internal fixation (ORIF) and limited internal fixation combined with external fixation (LIFEF) are two widely used methods to treat Pilon injury. However, which method is superior to the other remains controversial. This meta-analysis was performed to quantitatively compare two-stage ORIF and LIFEF and clarify which method is better with respect to postoperative complications in the treatment of tibial Pilon fractures. Methods We conducted a meta-analysis to quantitatively compare the postoperative complications between two-stage ORIF and LIFEF. Eight studies involving 360 fractures in 359 patients were included in the meta-analysis. Results The two-stage ORIF group had a significantly lower risk of superficial infection, nonunion, and bone healing problems than the LIFEF group. However, no significant differences in deep infection, delayed union, malunion, arthritis symptoms, or chronic osteomyelitis were found between the two groups. Conclusion Two-stage ORIF was associated with a lower risk of postoperative complications with respect to superficial infection, nonunion, and bone healing problems than LIFEF for tibial Pilon fractures. Level of evidence 2.
An importance-performance analysis of hospital information system attributes: A nurses' perspective.
Cohen, Jason F; Coleman, Emma; Kangethe, Matheri J
2016-02-01
Health workers have numerous concerns about hospital IS (HIS) usage. Addressing these concerns requires understanding the system attributes most important to their satisfaction and productivity. Following a recent HIS implementation, our objective was to identify priorities for managerial intervention based on user evaluations of the performance of the HIS attributes as well as the relative importance of these attributes to user satisfaction and productivity outcomes. We collected data along a set of attributes representing system quality, data quality, information quality, and service quality from 154 nurse users. Their quantitative responses were analysed using the partial least squares approach followed by an importance-performance analysis. Qualitative responses were analysed using thematic analysis to triangulate and supplement the quantitative findings. Two system quality attributes (responsiveness and ease of learning), one information quality attribute (detail), one service quality attribute (sufficient support), and three data quality attributes (records complete, accurate and never missing) were identified as high priorities for intervention. Our application of importance-performance analysis is unique in HIS evaluation and we have illustrated its utility for identifying those system attributes for which underperformance is not acceptable to users and therefore should be high priorities for intervention. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
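A hedged sketch of the importance-performance step, using the correlation of each attribute with overall satisfaction as a stand-in for the study's PLS-derived importance weights, and the classic IPA grid split at the means:

import numpy as np

def ipa(ratings, satisfaction, names):
    """ratings: (n_users, n_attrs) attribute scores; satisfaction: (n_users,)."""
    performance = ratings.mean(axis=0)
    importance = np.array([np.corrcoef(ratings[:, j], satisfaction)[0, 1]
                           for j in range(ratings.shape[1])])
    p_cut, i_cut = performance.mean(), importance.mean()
    quadrant = {True: {True: "keep up the good work", False: "possible overkill"},
                False: {True: "concentrate here", False: "low priority"}}
    # quadrant[performance high][importance high]; "concentrate here" marks
    # high-importance, low-performance attributes: the intervention priorities
    return {n: quadrant[p >= p_cut][i >= i_cut]
            for n, p, i in zip(names, performance, importance)}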
The role of 3-D interactive visualization in blind surveys of H I in galaxies
NASA Astrophysics Data System (ADS)
Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Oosterloo, T. A.; Ramatsoku, M.; Verheijen, M. A. W.
2015-09-01
Upcoming H I surveys will deliver large datasets, and automated processing using the full 3-D information (two positional dimensions and one spectral dimension) to find and characterize H I objects is imperative. In this context, visualization is an essential tool for enabling qualitative and quantitative human control on an automated source finding and analysis pipeline. We discuss how Visual Analytics, the combination of automated data processing and human reasoning, creativity and intuition, supported by interactive visualization, enables flexible and fast interaction with the 3-D data, helping the astronomer to deal with the analysis of complex sources. 3-D visualization, coupled to modeling, provides additional capabilities helping the discovery and analysis of subtle structures in the 3-D domain. The requirements for a fully interactive visualization tool are: coupled 1-D/2-D/3-D visualization, quantitative and comparative capabilities, combined with supervised semi-automated analysis. Moreover, the source code must have the following characteristics for enabling collaborative work: open, modular, well documented, and well maintained. We review four state-of-the-art 3-D visualization packages, assessing their capabilities and feasibility for use in the case of 3-D astronomical data.
A novel approach for evaluating the risk of health care failure modes.
Chang, Dong Shang; Chung, Jenq Hann; Sun, Kuo Lung; Yang, Fu Chiang
2012-12-01
Failure mode and effects analysis (FMEA) can be employed to reduce medical errors by identifying the risk ranking of health care failure modes and taking priority action for safety improvement. The purpose of this paper is to propose a novel approach to data analysis: integrating FMEA with a mathematical tool, data envelopment analysis (DEA) with the slack-based measure (SBM). The risk indexes of FMEA (severity, occurrence, and detection) are viewed as multiple inputs of DEA. The practicality and usefulness of the proposed approach is illustrated with a health care case. As a systematic approach for improving the service quality of health care, it can offer quantitative corrective information on the risk indexes that thereafter reduces failure possibility. For safety improvement, these new targets for the risk indexes can be used for management by objectives. FMEA alone cannot provide quantitative corrective information on the risk indexes; the proposed approach overcomes this chief shortcoming. By combining the DEA SBM model with FMEA, the two goals of increased patient safety and reduced medical cost can be achieved together.
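One plausible reading of the proposed model, sketched with scipy and not to be taken as the authors' exact formulation: each failure mode is a DMU whose inputs are its severity, occurrence, and detection scores (with a unit output), scored by an input-oriented slacks-based-measure LP; the optimal slacks are the quantitative corrective information on the risk indexes.

import numpy as np
from scipy.optimize import linprog

def sbm_input_oriented(X, j):
    """X: (n_modes, 3) matrix of S, O, D scores; j: mode to evaluate."""
    n, m = X.shape
    x0 = X[j]
    # minimize -(1/m) * sum_i s_i / x0_i  (SBM efficiency = 1 + optimum)
    c = np.concatenate([np.zeros(n), -1.0 / (m * x0)])
    A_eq = np.hstack([X.T, np.eye(m)])                 # X' lam + s = x0
    b_eq = x0
    A_ub = np.concatenate([-np.ones(n), np.zeros(m)])[None, :]  # sum lam >= 1
    res = linprog(c, A_ub=A_ub, b_ub=[-1.0], A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    rho = 1.0 + res.fun            # efficiency in (0, 1]
    slacks = res.x[n:]             # suggested reductions of S, O, D
    return rho, slacks

X = np.array([[9, 3, 6], [7, 7, 4], [2, 2, 8], [5, 4, 3]], float)  # toy scores
rho, slacks = sbm_input_oriented(X, 0)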
NASA Astrophysics Data System (ADS)
Federico, S.; Avolio, E.; Bellecci, C.; Colacino, M.; Walko, R. L.
2006-03-01
This paper reports preliminary results for a Limited area model Ensemble Prediction System (LEPS), based on RAMS (Regional Atmospheric Modelling System), for eight case studies of moderate-to-intense precipitation over Calabria, the southernmost tip of the Italian peninsula. LEPS aims to transfer the benefits of a probabilistic forecast from global to regional scales in countries where local orographic forcing is a key factor forcing convection. To accomplish this task and to limit computational time in an operational implementation of LEPS, we perform a cluster analysis of ECMWF-EPS runs. Starting from the 51 members that form the ECMWF-EPS we generate five clusters. For each cluster a representative member is selected and used to provide initial and dynamic boundary conditions to RAMS, whose integrations generate LEPS. RAMS runs have a 12-km horizontal resolution. To analyze the impact of enhanced horizontal resolution on quantitative precipitation forecasts, LEPS forecasts are compared to a full Brute Force (BF) ensemble. This ensemble is based on RAMS, has a 36-km horizontal resolution and is generated by 51 members, nested in each ECMWF-EPS member. LEPS and BF results are compared subjectively and by objective scores. Subjective analysis is based on precipitation and probability maps of case studies whereas objective analysis is made by deterministic and probabilistic scores. Scores and maps are calculated by comparing ensemble precipitation forecasts against reports from the Calabria regional raingauge network. Results show that LEPS provided better rainfall predictions than BF for all case studies selected. This strongly suggests the importance of the enhanced horizontal resolution, compared to ensemble population, for Calabria for these cases. To further explore the impact of local physiographic features on QPF (Quantitative Precipitation Forecasting), LEPS results are also compared with a 6-km horizontal resolution deterministic forecast. Due to local and mesoscale forcing, the high-resolution forecast (Hi-Res) has better performance compared to the ensemble mean for rainfall thresholds larger than 10 mm but it tends to overestimate precipitation for lower amounts. This yields larger false alarms that have a detrimental effect on objective scores for lower thresholds. To exploit the advantages of a probabilistic forecast compared to a deterministic one, the relation between the ECMWF-EPS 700 hPa geopotential height spread and LEPS performance is analyzed. Results are promising even if additional studies are required.
Learning a cost function for microscope image segmentation.
Nilufar, Sharmin; Perkins, Theodore J
2014-01-01
Quantitative analysis of microscopy images is increasingly important in clinical researchers' efforts to unravel the cellular and molecular determinants of disease, and for pathological analysis of tissue samples. Yet, manual segmentation and measurement of cells or other features in images remains the norm in many fields. We report on a new system that aims for robust and accurate semi-automated analysis of microscope images. A user interactively outlines one or more examples of a target object in a training image. We then learn a cost function for detecting more objects of the same type, either in the same or different images. The cost function is incorporated into an active contour model, which can efficiently determine optimal boundaries by dynamic programming. We validate our approach and compare it to some standard alternatives on three different types of microscopic images: light microscopy of blood cells, light microscopy of muscle tissue sections, and electron microscopy cross-sections of axons and their myelin sheaths.
The S-054 X-ray telescope experiment on Skylab
NASA Technical Reports Server (NTRS)
Vaiana, G. S.; Van Speybroeck, L.; Zombeck, M. V.; Krieger, A. S.; Silk, J. K.; Timothy, A.
1977-01-01
A description of the S-054 X-ray telescope on Skylab is presented with a discussion of the experimental objectives, observing program, data reduction and analysis. Some results from the Skylab mission are given. The telescope photographically records high-resolution images of the solar corona in several broadband regions of the soft X-ray spectrum. It includes an objective grating used to study the line spectrum. The spatial resolution, sensitivity, dynamic range and time resolution of the instrument were chosen to survey a wide variety of solar phenomena. It embodies improvements in design, fabrication, and calibration techniques which were developed over a ten-year period. The observing program was devised to optimize the use of the instrument and to provide studies on a wide range of time scales. The data analysis program includes morphological studies and quantitative analysis using digitized images. A small sample of the data obtained in the mission is presented to demonstrate the type of information that is available and the kinds of results that can be obtained from it.
Forensic practice in the field of protection of cultural heritage
NASA Astrophysics Data System (ADS)
Kotrlý, Marek; Turková, Ivana
2012-06-01
Microscopic methods play a key role in the analysis of objects of art: on the one hand they are used for screening, and on the other hand they can yield data relevant to the completion of an expert report. Analyses of artworks, gemmological objects and other highly valuable commodities are usually not routine; every analysis is specific, whether it concerns material investigation of artworks, historical textile materials and other antiques (coins, etc.), identification of fragments (from transporters, storage places, etc.), period statues and sculptures compared with originals, or analyses of gems and jewellery. A number of analytical techniques may be employed: optical microscopy in transmitted and reflected light, polarization and fluorescence in visible, UV and IR radiation; image analysis; quantitative microspectrophotometry; SEM/EDS/WDS; FTIR and Raman spectroscopy; XRF and microXRF, including mobile instruments; XRD and microXRD; x-ray backlighting; and LA-ICP-MS, SIMS, PIXE; further methods of organic analysis are also utilised, e.g., GC-MS and MALDI-TOF.
Subjective Socioeconomic Status and Adolescent Health: A Meta-Analysis
Quon, Elizabeth C.; McGrath, Jennifer J.
2017-01-01
Objective To comprehensively and quantitatively examine the association between subjective socioeconomic status (SES) and health outcomes during adolescence. Methods Forty-four studies met criteria for inclusion in the meta-analysis. Information on study quality, demographics, subjective SES, health outcomes, and covariates was extracted from each study. Fisher’s Z was selected as the common effect size metric across studies. Random-effect meta-analytic models were employed and fail-safe numbers were generated to address publication bias. Results Overall, subjective SES was associated with health during adolescence (Fisher’s Z = .10). The magnitude of the effect varied by type of health outcome, with larger effects observed for mental health outcomes, self-rated health, and general health symptoms; and nonsignificant effects observed for biomarkers of health and substance-use-related health behaviors. Of the measures of subjective SES employed in the reviewed studies, perception of financial constraints was most strongly associated with adolescent health outcomes. Analysis of covariates indicated that inclusion of objective SES covariates did not affect the association between subjective SES and health. Conclusions This meta-analysis has implications for the measurement of subjective SES in adolescents, for the conceptualization of subjective and objective SES, and for the pathways between SES and health in adolescents. PMID:24245837
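For reference, Fisher's Z, the common effect-size metric used above, is a standard transform of the correlation coefficient. A minimal sketch of the standard formulas (not the authors' code):

```python
import numpy as np

def fishers_z(r):
    """Fisher's Z transform of a correlation coefficient r."""
    return np.arctanh(r)  # identical to 0.5 * ln((1 + r) / (1 - r))

def z_variance(n):
    """Approximate sampling variance of Fisher's Z for sample size n,
    used for inverse-variance weighting in meta-analytic models."""
    return 1.0 / (n - 3)
```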
DOE Office of Scientific and Technical Information (OSTI.GOV)
Odom, R.W.
1991-06-04
The objective of the research was to develop quantitative microanalysis methods for dielectric thin films using the laser ionization mass spectrometry (LIMS) technique. The research involved preparation of thin (5,000 Å) films of SiO2, Al2O3, MgF2, TiO2, Cr2O3, Ta2O5, Si3N4, and ZrO2, and doping these films with ion implant impurities of 11B, 40Ca, 56Fe, 68Zn, 81Br, and 121Sb. Laser ionization mass spectrometry (LIMS), secondary ion mass spectrometry (SIMS) and Rutherford backscattering spectrometry (RBS) were performed on these films. The research demonstrated quantitative LIMS analysis down to detection levels of 10-100 ppm, and led to the development of (1) a compound thin film standards product line for the performing organization, (2) routine LIMS analytical methods, and (3) the manufacture of high speed preamplifiers for time-of-flight mass spectrometry (TOF-MS) techniques.
Burmester, Bridget; Leathem, Janet; Merrick, Paul
2016-12-01
Research investigating how subjective cognitive complaints (SCCs) might reliably indicate impairments in objective cognitive functioning has produced highly varied findings, and despite attempts to synthesise this literature (e.g., Jonker et al. International Journal of Geriatric Psychiatry, 15, 983-991, 2000; Reid and MacLullich Dementia and Geriatric Cognitive Disorders, 22(5-6), 471-485, 2006; Crumley et al. Psychology and Aging, 29(2), 250-263, 2014), recent work continues to offer little resolution. This review provides both quantitative and qualitative synthesis of research conducted since the last comprehensive review in 2006, with the aim of identifying reasons for these discrepancies that might provide fruitful avenues for future exploration. Meta-analysis found a small but significant association between SCCs and objective cognitive function, although it was limited by large heterogeneity between studies and evidence of potential publication bias. Often, assessments of SCCs and objective cognitive function were brief or not formally validated. However, studies that employed more comprehensive SCC measures tended to find that SCCs were associated independently with both objective cognitive function and depressive symptoms. Further explicit investigation of how assessment measures relate to reports of SCCs, and the validity of the proposed 'compensation theory' of SCC aetiology, is recommended.
Objective comparison of particle tracking methods
Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F.; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R.; Godinez, William J.; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E. G.; Jaldén, Joakim; Blau, Helen M.; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L.; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P.; Dan, Han-Wei; Tsai, Yuh-Show; de Solórzano, Carlos Ortiz; Olivo-Marin, Jean-Christophe; Meijering, Erik
2014-01-01
Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Since manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized, for the first time, an open competition, in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to important practical conclusions for users and developers. PMID:24441936
Quantification of Posterior Globe Flattening: Methodology Development and Validation
NASA Technical Reports Server (NTRS)
Lumpkins, S. B.; Garcia, K. M.; Sargsyan, A. E.; Hamilton, D. R.; Berggren, M. D.; Antonsen, E.; Ebert, D.
2011-01-01
Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in several astronauts. This phenomenon is known to affect some terrestrial patient populations, and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semi-quantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology for posterior globe flattening.
NASA Astrophysics Data System (ADS)
Min, Junwei; Yao, Baoli; Ketelhut, Steffi; Kemper, Björn
2017-02-01
The modular combination of optical microscopes with digital holographic microscopy (DHM) has proven to be a powerful tool for quantitative live cell imaging. The introduction of a condenser and different microscope objectives (MO) simplifies the usage of the technique and makes it easier to measure different kinds of specimens at different magnifications. However, the high flexibility of illumination and imaging also causes variable phase aberrations that need to be eliminated for high-resolution quantitative phase imaging. Existing phase aberration compensation methods either require additional elements in the reference arm or need specimen-free reference areas or separate reference holograms to build suitable digital phase masks. These requirements make them impractical for highly variable illumination and imaging systems and prevent on-line monitoring of living cells. In this paper, we present a simple numerical method for phase aberration compensation based on the analysis of holograms in the spatial frequency domain, with capabilities for on-line quantitative phase imaging. From a single-shot off-axis hologram, the whole phase aberration can be eliminated automatically without numerical fitting or pre-knowledge of the setup. The capabilities and robustness for quantitative phase imaging of living cancer cells are demonstrated.
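The spatial-frequency-domain core of such a method can be sketched as follows. This is a generic off-axis demodulation (isolating the +1 diffraction order and re-centering it, which cancels the linear tilt part of the aberration), not the authors' complete aberration-removal procedure.

```python
import numpy as np

def demodulate_off_axis(hologram):
    """Spatial-frequency-domain demodulation of an off-axis hologram.

    Minimal sketch: locate the +1 order carrier peak, shift it to the
    spectrum center (removing the linear phase tilt), and return the
    wrapped phase. Higher-order aberration terms are not handled here.
    """
    F = np.fft.fftshift(np.fft.fft2(hologram))
    ny, nx = F.shape
    # suppress the strong zero-order region before searching for the peak
    mag = np.abs(F).copy()
    mag[ny//2 - ny//8 : ny//2 + ny//8, nx//2 - nx//8 : nx//2 + nx//8] = 0
    py, px = np.unravel_index(np.argmax(mag), mag.shape)
    # roll the +1 order carrier peak to the spectrum center
    F1 = np.roll(np.roll(F, ny//2 - py, axis=0), nx//2 - px, axis=1)
    field = np.fft.ifft2(np.fft.ifftshift(F1))
    return np.angle(field)  # wrapped phase map
```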
Quantitative reflectance spectroscopy of buddingtonite from the Cuprite mining district, Nevada
NASA Technical Reports Server (NTRS)
Felzer, Benjamin; Hauff, Phoebe; Goetz, Alexander F. H.
1994-01-01
Buddingtonite, an ammonium-bearing feldspar diagnostic of volcanic-hosted alteration, can be identified and, in some cases, quantitatively measured using short-wave infrared (SWIR) reflectance spectroscopy. In this study over 200 samples from Cuprite, Nevada, were evaluated by X-ray diffraction, chemical analysis, scanning electron microscopy, and SWIR reflectance spectroscopy with the objective of developing a quantitative remote-sensing technique for rapid determination of the amount of ammonium or buddingtonite present, and its distribution across the site. Based upon the Hapke theory of radiative transfer from particulate surfaces, spectra from quantitative, physical mixtures were compared with computed mixture spectra. We hypothesized that the concentration of ammonium in each sample is related to the size and shape of the ammonium absorption bands and tested this hypothesis for samples of relatively pure buddingtonite. We found that the band depth of the 2.12-micron NH4 feature is linearly related to the NH4 concentration for the Cuprite buddingtonite, and that the relationship is approximately exponential for a larger range of NH4 concentrations. Associated minerals such as smectite and jarosite suppress the depth of the 2.12-micron NH4 absorption band. Quantitative reflectance spectroscopy is possible when the effects of these associated minerals are also considered.
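A band-depth calculation of the kind used in such work can be sketched as follows; the shoulder wavelengths here are illustrative values bracketing the 2.12-micron NH4 feature, not the study's calibrated choices.

```python
import numpy as np

def band_depth(wavelengths, reflectance, center=2.12, shoulders=(2.02, 2.22)):
    """Continuum-removed band depth at an absorption feature.

    Generic sketch: fit a linear continuum between two shoulder
    wavelengths and report D = 1 - R_band / R_continuum at the band
    center. Wavelengths (in micrometers) are assumed ascending.
    """
    r = np.interp([shoulders[0], center, shoulders[1]], wavelengths, reflectance)
    # linear continuum interpolated across the feature
    frac = (center - shoulders[0]) / (shoulders[1] - shoulders[0])
    r_cont = r[0] + frac * (r[2] - r[0])
    return 1.0 - r[1] / r_cont
```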
SpArcFiRe: Scalable automated detection of spiral galaxy arm segments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Darren R.; Hayes, Wayne B., E-mail: drdavis@uci.edu, E-mail: whayes@uci.edu
Given an approximately centered image of a spiral galaxy, we describe an entirely automated method that finds, centers, and sizes the galaxy (possibly masking nearby stars and other objects if necessary in order to isolate the galaxy itself) and then automatically extracts structural information about the spiral arms. For each arm segment found, we list the pixels in that segment, allowing image analysis on a per-arm-segment basis. We also perform a least-squares fit of a logarithmic spiral arc to the pixels in that segment, giving per-arc parameters, such as the pitch angle, arm segment length, location, etc. The algorithm takes about one minute per galaxy, and can easily be scaled using parallelism. We have run it on all ∼644,000 Sloan objects that are larger than 40 pixels across and classified as 'galaxies'. We find a very good correlation between our quantitative description of a spiral structure and the qualitative description provided by Galaxy Zoo humans. Our objective, quantitative measures of structure demonstrate the difficulty in defining exactly what constitutes a spiral 'arm', leading us to prefer the term 'arm segment'. We find that pitch angle often varies significantly segment-to-segment in a single spiral galaxy, making it difficult to define the pitch angle for a single galaxy. We demonstrate how our new database of arm segments can be queried to find galaxies satisfying specific quantitative visual criteria. For example, even though our code does not explicitly find rings, a good surrogate is to look for galaxies having one long, low-pitch-angle arm, which is how our code views ring galaxies. SpArcFiRe is available at http://sparcfire.ics.uci.edu.
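The per-segment logarithmic-spiral fit reduces to ordinary least squares after a log transform. A minimal sketch of that core relation, assuming the segment's pixels are ordered along the arm and the galaxy center is known (SpArcFiRe's actual fit handles multi-turn arcs and other details):

```python
import numpy as np

def fit_log_spiral(x, y, cx=0.0, cy=0.0):
    """Least-squares fit of r = a * exp(b * theta) to arm-segment pixels.

    With theta unwrapped, log r is linear in theta, so ordinary least
    squares recovers (log a, b); the pitch angle follows as arctan(b).
    Pixels must be ordered along the arm for unwrapping to make sense.
    """
    dx, dy = np.asarray(x) - cx, np.asarray(y) - cy
    theta = np.unwrap(np.arctan2(dy, dx))
    logr = np.log(np.hypot(dx, dy))
    b, loga = np.polyfit(theta, logr, 1)
    pitch_deg = np.degrees(np.arctan(b))
    return np.exp(loga), b, pitch_deg
```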
Development of a Thiolysis HPLC Method for the Analysis of Procyanidins in Cranberry Products.
Gao, Chi; Cunningham, David G; Liu, Haiyan; Khoo, Christina; Gu, Liwei
2018-03-07
The objective of this study was to develop a thiolysis HPLC method to quantify total procyanidins, the ratio of A-type linkages, and A-type procyanidin equivalents in cranberry products. Cysteamine was utilized as a low-odor substitute of toluene-α-thiol for thiolysis depolymerization. A reaction temperature of 70 °C and reaction time of 20 min, in 0.3 M of HCl, were determined to be optimum depolymerization conditions. Thiolytic products of cranberry procyanidins were separated by RP-HPLC and identified using high-resolution mass spectrometry. Standard curves of good linearity were obtained for thiolyzed procyanidin dimer A2 and B2 external standards. The detection and quantification limits, recovery, and precision of this method were validated. The new method was applied to quantitate total procyanidins, average degree of polymerization, ratio of A-type linkages, and A-type procyanidin equivalents in cranberry products. Results showed that the method was suitable for quantitative and qualitative analysis of procyanidins in cranberry products.
Current perspectives of CASA applications in diverse mammalian spermatozoa.
van der Horst, Gerhard; Maree, Liana; du Plessis, Stefan S
2018-03-26
Since the advent of computer-aided sperm analysis (CASA) some four decades ago, advances in computer technology and software algorithms have helped establish it as a research and diagnostic instrument for the analysis of spermatozoa. Despite mammalian spermatozoa being the most diverse cell type known, CASA is a great tool that has the capacity to provide rapid, reliable and objective quantitative assessment of sperm quality. This paper provides contemporary research findings illustrating the scientific and commercial applications of CASA and its ability to evaluate diverse mammalian spermatozoa (human, primates, rodents, domestic mammals, wildlife species) at both structural and functional levels. The potential of CASA to quantitatively measure essential aspects related to sperm subpopulations, hyperactivation, morphology and morphometry is also demonstrated. Furthermore, applications of CASA are provided for improved mammalian sperm quality assessment, evaluation of sperm functionality and the effect of different chemical substances or pathologies on sperm fertilising ability. It is clear that CASA has evolved significantly and is currently superior to many manual techniques in the research and clinical setting.
3D Filament Network Segmentation with Multiple Active Contours
NASA Astrophysics Data System (ADS)
Xu, Ting; Vavylonis, Dimitrios; Huang, Xiaolei
2014-03-01
Fluorescence microscopy is frequently used to study two- and three-dimensional network structures formed by cytoskeletal polymer fibers such as actin filaments and microtubules. While these cytoskeletal structures are often dilute enough to allow imaging of individual filaments or bundles of them, quantitative analysis of these images is challenging. To facilitate quantitative, reproducible and objective analysis of the image data, we developed a semi-automated method to extract actin networks and retrieve their topology in 3D. Our method uses multiple Stretching Open Active Contours (SOACs) that are automatically initialized at image intensity ridges and then evolve along the centerlines of filaments in the network. SOACs can merge, stop at junctions, and reconfigure with others to allow smooth crossing at junctions of filaments. The proposed approach is generally applicable to images of curvilinear networks with low SNR. We demonstrate its potential by extracting the centerlines of synthetic meshwork images, actin networks in 2D TIRF Microscopy images, and 3D actin cable meshworks of live fission yeast cells imaged by spinning disk confocal microscopy.
Quantitative phase retrieval with arbitrary pupil and illumination
Claus, Rene A.; Naulleau, Patrick P.; Neureuther, Andrew R.; ...
2015-10-02
We present a general algorithm for combining measurements taken under various illumination and imaging conditions to quantitatively extract the amplitude and phase of an object wave. The algorithm uses the weak object transfer function, which incorporates arbitrary pupil functions and partially coherent illumination. The approach is extended beyond the weak object regime using an iterative algorithm. Finally, we demonstrate the method on measurements of Extreme Ultraviolet Lithography (EUV) multilayer mask defects taken in an EUV zone plate microscope with both a standard zone plate lens and a zone plate implementing Zernike phase contrast.
Influence of echo time in quantitative proton MR spectroscopy using LCModel.
Yamamoto, Tetsuya; Isobe, Tomonori; Akutsu, Hiroyoshi; Masumoto, Tomohiko; Ando, Hiroki; Sato, Eisuke; Takada, Kenta; Anno, Izumi; Matsumura, Akira
2015-06-01
The objective of this study was to elucidate the influence on quantitative analysis using LCModel when the echo time (TE) is set longer than the values recommended in the spectrum acquisition specifications. A 3T magnetic resonance system was used to perform proton magnetic resonance spectroscopy. The participants were 5 healthy volunteers and 11 patients with glioma. Data were collected at TE of 72, 144 and 288 ms. LCModel was used to quantify several metabolites (N-acetylaspartate, creatine and phosphocreatine, and choline-containing compounds). The results were compared with quantitative values obtained by using the T2-corrected internal reference method. In healthy volunteers, when TE was long, the quantitative values obtained using LCModel were up to 6.8-fold larger (p<0.05) than those obtained using the T2-corrected internal reference method. The ratios of the quantitative values obtained by the two methods differed between metabolites (p<0.05). In patients with glioma, the ratios of quantitative values obtained by the two methods tended to be larger at longer TE, similarly to the case of healthy volunteers, and large between-individual variation in the ratios was observed. In clinical practice, TE is sometimes set longer than the value recommended for LCModel. If TE is long, LCModel overestimates the quantitative value since it cannot compensate for signal attenuation, and this effect differs between metabolites and conditions. Therefore, if TE is longer than recommended, it is necessary to account for the possibly reduced reliability of quantitative values calculated using LCModel. Copyright © 2015 Elsevier Inc. All rights reserved.
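The T2 correction underlying the internal reference method is the standard mono-exponential decay model. A minimal sketch of the generic formula (not the study's pipeline):

```python
import numpy as np

def t2_corrected_concentration(c_measured, te_ms, t2_ms):
    """Undo mono-exponential T2 signal decay at echo time TE.

    Standard relation: S(TE) = S0 * exp(-TE/T2), so a concentration
    estimated from the attenuated signal is scaled back by exp(TE/T2).
    Both TE and T2 are in milliseconds.
    """
    return c_measured * np.exp(te_ms / t2_ms)
```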
Mast, Fred D.; Ratushny, Alexander V.
2014-01-01
Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. PMID:25225336
Detailed Modeling and Analysis of the CPFM Dataset
NASA Technical Reports Server (NTRS)
Swartz, William H.; Lloyd, Steven A.; DeMajistre, Robert
2004-01-01
A quantitative understanding of photolysis rate coefficients (or "j-values") is essential to determining the photochemical reaction rates that define ozone loss and other crucial processes in the atmosphere. j-Values can be calculated with radiative transfer models, derived from actinic flux observations, or inferred from trace gas measurements. The principal objective of this study is to cross-validate j-values from the Composition and Photodissociative Flux Measurement (CPFM) instrument during the Photochemistry of Ozone Loss in the Arctic Region In Summer (POLARIS) and SAGE III Ozone Loss and Validation Experiment (SOLVE) field campaigns with model calculations and other measurements and to use this detailed analysis to improve our ability to determine j-values. Another objective is to analyze the spectral flux from the CPFM (not just the j-values) and, using a multi-wavelength/multi-species spectral fitting technique, determine atmospheric composition.
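The j-value itself is defined by a spectral integral over actinic flux, absorption cross section, and quantum yield. A minimal numerical sketch, assuming all inputs share one wavelength grid and consistent units:

```python
import numpy as np

def j_value(wavelength_nm, actinic_flux, cross_section, quantum_yield):
    """Photolysis rate coefficient from spectral quantities.

    Standard definition: j = integral of F(lambda) * sigma(lambda) *
    phi(lambda) d(lambda), evaluated here with the trapezoid rule.
    With flux in photons cm^-2 s^-1 nm^-1 and cross section in cm^2,
    j comes out in s^-1.
    """
    return np.trapz(actinic_flux * cross_section * quantum_yield,
                    wavelength_nm)
```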
Conducting On-orbit Gene Expression Analysis on ISS: WetLab-2
NASA Technical Reports Server (NTRS)
Parra, Macarena; Almeida, Eduardo; Boone, Travis; Jung, Jimmy; Lera, Matthew P.; Ricco, Antonio; Souza, Kenneth; Wu, Diana; Richey, C. Scott
2013-01-01
WetLab-2 will enable expanded genomic research on orbit by developing tools that support in situ sample collection, processing, and analysis on ISS. This capability will reduce the time-to-results for investigators and define new pathways for discovery on the ISS National Lab. The primary objective is to develop a research platform on ISS that will facilitate real-time quantitative gene expression analysis of biological samples collected on orbit. WetLab-2 will be capable of processing multiple sample types ranging from microbial cultures to animal tissues dissected on orbit. WetLab-2 will significantly expand the analytical capabilities onboard ISS and enhance science return from ISS.
Olokundun, Maxwell; Iyiola, Oluwole; Ibidunni, Stephen; Ogbari, Mercy; Falola, Hezekiah; Salau, Odunayo; Peter, Fred; Borishade, Taiye
2018-06-01
The article presented data on the effectiveness of entrepreneurship curriculum contents in developing university students' entrepreneurial interest and knowledge. The study focused on the perceptions of Nigerian university students, with emphasis on the first four universities in Nigeria to offer a degree programme in entrepreneurship. The study adopted a quantitative approach with a descriptive research design to establish trends related to the objective of the study; a survey was used as the quantitative research method. The population of this study included all students in the selected universities. Data were analyzed using the Statistical Package for the Social Sciences (SPSS), with mean scores as the statistical tool of analysis. The field data set is made widely accessible to enable critical or more comprehensive investigation.
Inverse transport problems in quantitative PAT for molecular imaging
NASA Astrophysics Data System (ADS)
Ren, Kui; Zhang, Rongting; Zhong, Yimin
2015-12-01
Fluorescence photoacoustic tomography (fPAT) is a molecular imaging modality that combines photoacoustic tomography with fluorescence imaging to obtain high-resolution imaging of fluorescence distributions inside heterogeneous media. The objective of this work is to study inverse problems in the quantitative step of fPAT where we intend to reconstruct physical coefficients in a coupled system of radiative transport equations using internal data recovered from ultrasound measurements. We derive uniqueness and stability results on the inverse problems and develop some efficient algorithms for image reconstructions. Numerical simulations based on synthetic data are presented to validate the theoretical analysis. The results we present here complement those in Ren K and Zhao H (2013 SIAM J. Imaging Sci. 6 2024-49) on the same problem but in the diffusive regime.
NASA Astrophysics Data System (ADS)
Prades, Cristina; García-Olmo, Juan; Romero-Prieto, Tomás; García de Ceca, José L.; López-Luque, Rafael
2010-06-01
The procedures used today to characterize cork plank for the manufacture of cork bottle stoppers continue to be based on a traditional, manual method that is highly subjective. Furthermore, there is no specific legislation regarding cork classification. The objective of this viability study is to assess the potential of near-infrared spectroscopy (NIRS) technology for characterizing cork plank according to the following variables: aspect or visual quality, porosity, moisture and geographical origin. In order to calculate the porosity coefficient, an image analysis program was specifically developed in Visual Basic language for a desktop scanner. A set comprising 170 samples from two geographical areas of Andalusia (Spain) was classified into eight quality classes by visual inspection. Spectra were obtained in the transverse and tangential sections of the cork planks using an NIRSystems 6500 SY II reflectance spectrophotometer. The quantitative calibrations showed cross-validation coefficients of determination of 0.47 for visual quality, 0.69 for porosity and 0.66 for moisture. The results obtained using NIRS technology are promising considering the heterogeneity and variability of a natural product such as cork in spite of the fact that the standard error of cross validation (SECV) in the quantitative analysis is greater than the standard error of laboratory (SEL) for the three variables. The qualitative analysis regarding geographical origin achieved very satisfactory results. Applying these methods in industry will permit quality control procedures to be automated, as well as establishing correlations between the different classification systems currently used in the sector. These methods can be implemented in the cork chain of custody certification and will also provide a certainly more objective tool for assessing the economic value of the product.
NASA Astrophysics Data System (ADS)
Argyropoulou, Evangelia
2015-04-01
The current study focused on the seafloor morphology of the North Aegean Basin in Greece, through Object Based Image Analysis (OBIA) using a Digital Elevation Model. The goal was the automatic extraction of morphologic and morphotectonic features, resulting in fault surface extraction. An Object Based Image Analysis approach was developed based on the bathymetric data, and the extracted features, based on morphological criteria, were compared with the corresponding landforms derived through tectonic analysis. A digital elevation model of 150 meters spatial resolution was used. At first, slope, profile curvature, and percentile were extracted from this bathymetry grid. The OBIA approach was developed within the eCognition environment. Four segmentation levels were created, with "level 4" as the target. At level 4, the final classes of geomorphological features were classified: discontinuities, fault-like features and fault surfaces. On previous levels, additional landforms were also classified, such as continental platform and continental slope. The results of the developed approach were evaluated by two methods. At first, classification stability measures were computed within eCognition. Then, qualitative and quantitative comparison of the results took place against a reference tectonic map created manually from the analysis of seismic profiles. The results of this comparison were satisfactory, confirming the validity of the developed OBIA approach.
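The terrain derivatives feeding such an OBIA workflow are simple finite differences. A minimal sketch for the slope layer, assuming the 150 m grid mentioned above (eCognition's own kernels may differ in detail):

```python
import numpy as np

def slope_degrees(dem, cell_size=150.0):
    """Slope map (degrees) from a bathymetric DEM.

    Central-difference gradients in both grid directions, combined
    into the surface gradient magnitude and converted to an angle.
    """
    dzdy, dzdx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
```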
Automated analysis of art object surfaces using time-averaged digital speckle pattern interferometry
NASA Astrophysics Data System (ADS)
Lukomski, Michal; Krzemien, Leszek
2013-05-01
Technical development and practical evaluation of a laboratory built, out-of-plane digital speckle pattern interferometer (DSPI) are reported. The instrument was used for non-invasive, non-contact detection and characterization of early-stage damage, like fracturing and layer separation, of painted objects of art. A fully automated algorithm was developed for recording and analysis of vibrating objects utilizing continuous-wave laser light. The algorithm uses direct, numerical fitting or Hilbert transformation for an independent, quantitative evaluation of the Bessel function at every point of the investigated surface. The procedure does not require phase modulation and thus can be implemented within any, even the simplest, DSPI apparatus. The proposed deformation analysis is fast and computationally inexpensive. Diagnosis of physical state of the surface of a panel painting attributed to Nicolaus Haberschrack (a late-mediaeval painter active in Krakow) from the collection of the National Museum in Krakow is presented as an example of an in situ application of the developed methodology. It has allowed the effectiveness of the deformation analysis to be evaluated for the surface of a real painting (heterogeneous colour and texture) in a conservation studio where vibration level was considerably higher than in the laboratory. It has been established that the methodology, which offers automatic analysis of the interferometric fringe patterns, has a considerable potential to facilitate and render more precise the condition surveys of works of art.
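In time-averaged DSPI of a sinusoidally vibrating surface, fringe intensity follows the squared zero-order Bessel function of the local vibration amplitude. The sketch below is illustrative only: it fits that J0 law at one point, assuming the amplitude grows linearly with a known excitation level; the paper's per-pixel direct-fit and Hilbert-transform variants are not reproduced.

```python
import numpy as np
from scipy.special import j0
from scipy.optimize import curve_fit

def fit_j0_fringes(drive_levels, intensity, wavelength=532e-9):
    """Fit the J0 fringe law of time-averaged out-of-plane DSPI.

    Model: I = gain * J0(4*pi*k*level/lambda)**2 + offset, where
    k converts drive level to vibration amplitude (an assumption of
    this sketch). drive_levels and intensity are 1-D numpy arrays.
    """
    def model(level, k, gain, offset):
        return gain * j0(4 * np.pi * k * level / wavelength) ** 2 + offset

    p0 = [1e-7, float(np.max(intensity)), 0.0]
    popt, _ = curve_fit(model, drive_levels, intensity, p0=p0)
    return popt  # k (metres per drive unit), gain, offset
```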
A quantitative analysis of IRAS maps of molecular clouds
NASA Technical Reports Server (NTRS)
Wiseman, Jennifer J.; Adams, Fred C.
1994-01-01
We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100 and 60 micrometers to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
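A distance function on maps of this kind can be illustrated with a simple stand-in: an L1 (total-variation-style) pseudometric between the normalized column-density distributions of two maps. This is only one possible 'output' function; the paper's formal construction assigns a separate pseudometric to each physical characteristic.

```python
import numpy as np

def map_pseudometric(map_a, map_b, bins=64):
    """L1 distance between the column-density distributions of two maps.

    A pseudometric: two different maps with identical density
    histograms are at distance zero, which is exactly the 'pseudo'
    in pseudometric. Returns a value in [0, 1].
    """
    lo = min(map_a.min(), map_b.min())
    hi = max(map_a.max(), map_b.max())
    ha, _ = np.histogram(map_a, bins=bins, range=(lo, hi), density=True)
    hb, _ = np.histogram(map_b, bins=bins, range=(lo, hi), density=True)
    bin_width = (hi - lo) / bins
    return 0.5 * np.abs(ha - hb).sum() * bin_width
```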
NASA Astrophysics Data System (ADS)
Murakoshi, Dai; Hirota, Kazuhiro; Ishii, Hiroyasu; Hashimoto, Atsushi; Ebata, Tetsurou; Irisawa, Kaku; Wada, Takatsugu; Hayakawa, Toshiro; Itoh, Kenji; Ishihara, Miya
2018-02-01
Photoacoustic (PA) imaging technology is expected to be applied to clinical assessment of peripheral vascularity. We started a clinical evaluation with the prototype PA imaging system we recently developed. The prototype system comprised an in-house Q-switched alexandrite laser emitting short pulses at a wavelength of 750 nm, a handheld ultrasound transducer with integrated illumination optics, and signal processing for PA image reconstruction implemented in a clinical ultrasound (US) system. For quantitative assessment of PA images, an image analyzing function was developed and applied to clinical PA images. In this analyzing function, vascularity, derived from the PA signal intensity within a prescribed threshold range, was defined as a numerical index of vessel fulfillment and calculated for a prescribed region of interest (ROI). The skin surface was automatically detected using the B-mode image acquired simultaneously with the PA image. The skin-surface position is used to place the ROI objectively while avoiding unwanted signals such as artifacts caused by melanin pigment in the epidermal layer, which absorbs the laser emission and generates strong PA signals. Multiple images were available to support the scanned image set for 3D viewing. PA images of several fingers of patients with systemic sclerosis (SSc) were quantitatively assessed. Since the artifact region is trimmed off in the PA images, the visibility of vessels with rather low PA signal intensity in the 3D projection image was enhanced and the reliability of the quantitative analysis was improved.
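The vessel-fulfillment index reduces to a thresholded pixel fraction within the ROI. A minimal sketch, with the threshold and ROI placement (below the detected skin line) treated as study-specific inputs:

```python
import numpy as np

def vascularity_index(pa_image, roi_mask, threshold):
    """Fraction of ROI pixels whose PA signal exceeds a threshold.

    pa_image is a 2-D signal-intensity array; roi_mask is a boolean
    array of the same shape marking the region of interest.
    """
    roi = pa_image[roi_mask]
    return np.count_nonzero(roi > threshold) / roi.size
```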
The objective assessment of experts' and novices' suturing skills using an image analysis program.
Frischknecht, Adam C; Kasten, Steven J; Hamstra, Stanley J; Perkins, Noel C; Gillespie, R Brent; Armstrong, Thomas J; Minter, Rebecca M
2013-02-01
To objectively assess suturing performance using an image analysis program and to provide validity evidence for this assessment method by comparing experts' and novices' performance. In 2009, the authors used an image analysis program to extract objective variables from digital images of suturing end products obtained during a previous study involving third-year medical students (novices) and surgical faculty and residents (experts). Variables included number of stitches, stitch length, total bite size, travel, stitch orientation, total bite-size-to-travel ratio, and symmetry across the incision ratio. The authors compared all variables between groups to detect significant differences and two variables (total bite-size-to-travel ratio and symmetry across the incision ratio) to ideal values. Five experts and 15 novices participated. Experts' and novices' performances differed significantly (P < .05) with large effect sizes attributable to experience (Cohen d > 0.8) for total bite size (P = .009, d = 1.5), travel (P = .045, d = 1.1), total bite-size-to-travel ratio (P < .0001, d = 2.6), stitch orientation (P = .014, d = 1.4), and symmetry across the incision ratio (P = .022, d = 1.3). The authors found that a simple computer algorithm can extract variables from digital images of a running suture and rapidly provide quantitative summative assessment feedback. The significant differences found between groups confirm that this system can discriminate between skill levels. This image analysis program represents a viable training tool for objectively assessing trainees' suturing, a foundational skill for many medical specialties.
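As an illustration only, stitch metrics of this kind can be computed from digitized entry/exit points. The definitions below are hypothetical reconstructions (incision assumed along the x-axis), not the paper's exact algorithm.

```python
import numpy as np

def suture_metrics(entries, exits):
    """Hypothetical stitch metrics from digitized entry/exit points.

    Assumes stitch i has entry (x, y > 0) and exit (x, y < 0) about an
    incision on the x-axis. Bite size is |y_entry| + |y_exit|, travel
    is the spacing between consecutive stitches along the incision,
    and symmetry compares total bites on the two sides.
    """
    entries = np.asarray(entries, dtype=float)
    exits = np.asarray(exits, dtype=float)
    bites = np.abs(entries[:, 1]) + np.abs(exits[:, 1])
    travel = np.diff(entries[:, 0])
    total_travel = travel.sum()
    bite_travel_ratio = bites.sum() / total_travel if total_travel else np.inf
    symmetry = np.abs(entries[:, 1]).sum() / np.abs(exits[:, 1]).sum()
    return bites, travel, bite_travel_ratio, symmetry
```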
Kushida, Ikuo
2012-03-01
The objective of this study was to develop a quantitative crystallinity analysis method for the bulk drug of E1010 ((+)-(4R,5S,6S)-6-[(R)-1-hydroxyethyl]-3-[(2S,4S)-2-[(R)-1-hydroxy-1-[(R)-pyrrolidin-3-yl]methyl]pyrrolidin-4-yl]thio-4-methyl-7-oxo-1-azabicyclo[3.2.0]hept-2-ene-2-carboxylic acid monohydrochloride), a novel carbapenem antibiotic. X-ray analyses, thermal analyses and hygroscopicity measurements were used to elucidate the crystal structure and the solid state properties. To develop a quantitative method for the crystallinity of E1010 bulk drug, the relationship between the enthalpy change obtained by differential scanning calorimetry (DSC) and the crystalline form ratio was investigated. E1010 bulk drug was found to exist as a crystalline trihydrate formed in two layers, i.e. a layer of the E1010 free form, and a layer consisting of chloride ions and water molecules. The thermal analysis showed an endothermic peak, derived from dehydration with loss of the crystal lattice, with an onset at around 100°C. The enthalpy change value for the endothermic peak correlated well with crystalline content in binary physical mixtures of the crystalline trihydrate and the amorphous form. In addition, for nine lots of the bulk drug, a positive correlation between the enthalpy change and chemical stability in the solid state was observed. This quantitative analysis of crystallinity using DSC could be applicable for the quality control of the bulk drug to detect variability among manufacturing batches and to estimate the chemical stability of partially amorphous samples. © 2011 The Author. JPP © 2011 Royal Pharmaceutical Society.
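The crystallinity estimate follows the usual one-point DSC calibration against a fully crystalline reference. A minimal sketch of the standard relation (not the authors' validated calibration curve):

```python
def crystallinity_percent(dh_sample, dh_reference):
    """Crystalline content (%) from the DSC dehydration endotherm.

    dh_sample is the sample's enthalpy change (J/g) and dh_reference
    that of the fully crystalline trihydrate; a linear enthalpy-
    content relationship is assumed, as the binary-mixture data above
    support.
    """
    return 100.0 * dh_sample / dh_reference
```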
Sadeghi, N.; Namjoshi, D.; Irfanoglu, M. O.; Wellington, C.; Diaz-Arrastia, R.
2017-01-01
Diffuse axonal injury (DAI) is a hallmark of traumatic brain injury (TBI) pathology. Recently, the Closed Head Injury Model of Engineered Rotational Acceleration (CHIMERA) was developed to generate an experimental model of DAI in a mouse. The characterization of DAI using diffusion tensor magnetic resonance imaging (MRI; diffusion tensor imaging, DTI) may provide a useful set of outcome measures for preclinical and clinical studies. The objective of this study was to identify the complex neurobiological underpinnings of DTI features following DAI using a comprehensive and quantitative evaluation of DTI and histopathology in the CHIMERA mouse model. A consistent neuroanatomical pattern of pathology in specific white matter tracts was identified across ex vivo DTI maps and photomicrographs of histology. These observations were confirmed by voxelwise and regional analysis of DTI maps, demonstrating reduced fractional anisotropy (FA) in distinct regions such as the optic tract. Similar regions were identified by quantitative histology and exhibited axonal damage as well as robust gliosis. Additional analysis using a machine-learning algorithm was performed to identify regions and metrics important for injury classification in a manner free from potential user bias. This analysis found that diffusion metrics were able to identify injured brains almost with the same degree of accuracy as the histology metrics. Good agreement between regions detected as abnormal by histology and MRI was also found. The findings of this work elucidate the complexity of cellular changes that give rise to imaging abnormalities and provide a comprehensive and quantitative evaluation of the relative importance of DTI and histological measures to detect brain injury. PMID:28966972
ERIC Educational Resources Information Center
Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E.; Constantino, John N.; Povinelli, Daniel J.; Pruett, John R., Jr.
2011-01-01
Objective: Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species…
Comparison of DNA fragmentation and color thresholding for objective quantitation of apoptotic cells
NASA Technical Reports Server (NTRS)
Plymale, D. R.; Ng Tang, D. S.; Fermin, C. D.; Lewis, D. E.; Martin, D. S.; Garry, R. F.
1995-01-01
Apoptosis is a process of cell death characterized by distinctive morphological changes and fragmentation of cellular DNA. Using video imaging and color thresholding techniques, we objectively quantitated the number of cultured CD4+ T-lymphoblastoid cells (HUT78 cells, RH9 subclone) displaying morphological signs of apoptosis before and after exposure to gamma-irradiation. The numbers of apoptotic cells measured by objective video imaging techniques were compared to numbers of apoptotic cells measured in the same samples by sensitive apoptotic assays that quantitate DNA fragmentation. DNA fragmentation assays gave consistently higher values compared with the video imaging assays that measured morphological changes associated with apoptosis. These results suggest that substantial DNA fragmentation can precede or occur in the absence of the morphological changes which are associated with apoptosis in gamma-irradiated RH9 cells.
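Generic color-thresholding quantitation of this kind can be sketched with a per-channel window and connected-component counting; the thresholds and minimum object size below are illustrative, not the study's values.

```python
import numpy as np
from scipy import ndimage

def count_thresholded_objects(rgb, lo, hi, min_pixels=20):
    """Count connected objects whose color falls inside an RGB window.

    rgb is an (H, W, 3) array; lo and hi are per-channel bounds.
    Pixels inside [lo, hi] on every channel are grouped into
    connected components, and components smaller than min_pixels
    are discarded as specks.
    """
    lo = np.asarray(lo)
    hi = np.asarray(hi)
    mask = np.all((rgb >= lo) & (rgb <= hi), axis=-1)
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return int(np.count_nonzero(sizes >= min_pixels))
```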
An approach to computing direction relations between separated object groups
NASA Astrophysics Data System (ADS)
Yan, H.; Wang, Z.; Li, J.
2013-09-01
Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on gestalt principles and the idea of multi-directions. The approach firstly triangulates the two object groups, and then it constructs the Voronoi diagram between the two groups using the triangular network. After this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed. Finally, the quantitative direction relations are transformed into qualitative ones. The psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and the results are correct from the point of view of spatial cognition.
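The core geometric step can be illustrated in miniature: build the Voronoi diagram of both groups, keep the ridges that separate the groups, and aggregate their normals into a direction. The sketch below collapses the paper's multi-direction qualitative output into a single mean azimuth for brevity.

```python
import numpy as np
from scipy.spatial import Voronoi

def group_direction(points_a, points_b):
    """Mean direction from group A to group B via shared Voronoi ridges.

    Simplified illustration: a Voronoi ridge between an A-generator
    and a B-generator lies on their perpendicular bisector, so the
    ridge normal is parallel to the vector between the two points.
    Returns an azimuth in degrees (counterclockwise from +x), or
    None if the groups share no ridge.
    """
    pts = np.vstack([points_a, points_b]).astype(float)
    n_a = len(points_a)
    vor = Voronoi(pts)
    normals = []
    for p, q in vor.ridge_points:
        if (p < n_a) != (q < n_a):              # ridge between the groups
            a, b = (p, q) if p < n_a else (q, p)
            v = pts[b] - pts[a]                 # A -> B ridge normal
            normals.append(v / np.linalg.norm(v))
    if not normals:
        return None
    mean = np.mean(normals, axis=0)
    return float(np.degrees(np.arctan2(mean[1], mean[0])))
```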
Linking agent-based models and stochastic models of financial markets
Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H. Eugene
2012-01-01
It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that “fat” tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting. PMID:22586086
Pulmonary nodule characterization, including computer analysis and quantitative features.
Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E
2015-03-01
Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.
Quantitative methods in assessment of neurologic function.
Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J
1981-01-01
Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.
Image analysis in cytology: DNA-histogramming versus cervical smear prescreening.
Bengtsson, E W; Nordin, B
1993-01-01
The visual inspection of cellular specimens and histological sections through a light microscope plays an important role in clinical medicine and biomedical research. The human visual system is very good at the recognition of various patterns but less efficient at quantitative assessment of these patterns. Some samples are prepared in great numbers, most notably the screening for cervical cancer, the so-called PAP-smears, which results in hundreds of millions of samples each year, creating a tedious mass inspection task. Numerous attempts have been made over the last 40 years to create systems that solve these two tasks, the quantitative supplement to the human visual system and the automation of mass screening. The most difficult task, the total automation, has received the greatest attention with many large scale projects over the decades. In spite of all these efforts, still no generally accepted automated prescreening device exists on the market. The main reason for this failure is the great pattern recognition capabilities needed to distinguish between cancer cells and all other kinds of objects found in the specimens: cellular clusters, debris, degenerate cells, etc. Improved algorithms, the ever-increasing processing power of computers and progress in biochemical specimen preparation techniques make it likely that eventually useful automated prescreening systems will become available. Meanwhile, much less effort has been put into the development of interactive cell image analysis systems. Still, some such systems have been developed and put into use at thousands of laboratories worldwide. In these the human pattern recognition capability is used to select the fields and objects that are to be analysed while the computational power of the computer is used for the quantitative analysis of cellular DNA content or other relevant markers. Numerous studies have shown that the quantitative information about the distribution of cellular DNA content is of prognostic significance in many types of cancer. Several laboratories are therefore putting these techniques into routine clinical use. The more advanced systems can also study many other markers and cellular features, some known to be of clinical interest, others useful in research. The advances in computer technology are making these systems more generally available through decreasing cost, increasing computational power and improved user interfaces. We have been involved in research and development of both automated and interactive cell analysis systems during the last 20 years. Here some experiences and conclusions from this work will be presented as well as some predictions about what can be expected in the near future.
Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.
Sugino, T; Kawahira, H; Nakamura, R
2014-09-01
Advanced surgical procedures, which have become complex and difficult, increase the burden on surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. It performs division of the surgical procedure, task progress analysis, and task efficiency analysis. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of the surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate the task efficiency during each stage. These analysis methods were evaluated through experimental validation with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters reflected incidents during the operation, the surgical environment, and the surgeon's skills or habits. Comparison of the groups revealed that skilled surgeons tended to perform the procedure in less time and within smaller regions; they also manipulated the surgical instruments more gently. Surgical task analysis developed for quantitative assessment of surgical procedures and surgical performance may provide practical methods and metrics for objective evaluation of surgical expertise.
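Task-efficiency metrics of the kind listed above follow directly from the navigation log. A minimal sketch, assuming uniformly sampled 3D tip positions (the system's exact ellipse definition may differ):

```python
import numpy as np

def motion_metrics(positions, dt):
    """Instrument-motion efficiency metrics from navigation logs.

    positions is an (N, 3) array of sampled tip coordinates and dt
    the sampling interval in seconds. Returns mean speed, mean
    acceleration magnitude, and the 1-sigma covariance-ellipse area
    spanned by the two largest principal axes of the position cloud,
    used here as a proxy for the size of the working region.
    """
    p = np.asarray(positions, dtype=float)
    v = np.diff(p, axis=0) / dt
    a = np.diff(v, axis=0) / dt
    mean_speed = np.linalg.norm(v, axis=1).mean()
    mean_accel = np.linalg.norm(a, axis=1).mean()
    eigvals = np.linalg.eigvalsh(np.cov(p.T))[::-1]   # descending
    ellipse_area = np.pi * np.sqrt(eigvals[0] * eigvals[1])
    return mean_speed, mean_accel, ellipse_area
```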
Optimizing morphology through blood cell image analysis.
Merino, A; Puigví, L; Boldú, L; Alférez, S; Rodellar, J
2018-05-01
Morphological review of the peripheral blood smear is still a crucial diagnostic aid, as it provides relevant information related to the diagnosis and is important for the selection of additional techniques. Nevertheless, the distinctive cytological characteristics of blood cells are subjective and influenced by the reviewer's interpretation; because of that, translating subjective morphological examination into objective parameters is a challenge. The use of digital microscopy systems has become widespread in clinical laboratories. As automatic analyzers have some limitations for abnormal or neoplastic cell detection, it is of interest to identify quantitative features for the morphological characteristics of different cells through digital image analysis. Three main classes of features are used: geometric, color, and texture. Geometric parameters (nucleus/cytoplasmic ratio, cellular area, nucleus perimeter, cytoplasmic profile, RBC proximity, and others) are familiar to pathologists, as they are related to the visual cell patterns. Different color spaces can be used to investigate the rich amount of information that color may offer to describe abnormal lymphoid or blast cells. Texture is related to spatial patterns of color or intensities, which can be visually detected and quantitatively represented using statistical tools. This study reviews current and new quantitative features which can contribute to optimizing morphology through blood cell digital image processing techniques. © 2018 John Wiley & Sons Ltd.
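A few representative features from each class can be sketched with scikit-image (assumed available). The masks, the [0, 1] gray-level range, and the GLCM settings below are illustrative choices, not a validated feature set.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from skimage.measure import regionprops

def cell_features(gray_image, nucleus_mask, cell_mask):
    """Geometric and texture features of one segmented blood cell.

    gray_image is assumed float in [0, 1]; nucleus_mask and cell_mask
    are boolean arrays each containing a single connected region.
    Returns the nucleus/cell area ratio plus two classic GLCM texture
    statistics (contrast and homogeneity).
    """
    nucleus = regionprops(nucleus_mask.astype(int))[0]
    cell = regionprops(cell_mask.astype(int))[0]
    nc_ratio = nucleus.area / cell.area
    glcm = graycomatrix((gray_image * 255).astype(np.uint8),
                        distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    contrast = graycoprops(glcm, "contrast")[0, 0]
    homogeneity = graycoprops(glcm, "homogeneity")[0, 0]
    return nc_ratio, contrast, homogeneity
```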
Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong
2013-01-01
Land-use planning has triggered debates on social and environmental values, raising two key questions: one is how to view different planning simulation results instantaneously and apply the results back to interactively assist planning work; the other is how to ensure that the planning simulation result is scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches for land-use planning, and to find a way to overcome the gap between the ability to construct computer simulation models to aid integrated land-use plan making and the demand for them by planning professionals. The study presented a theoretical framework for land-use planning that integrates the scenario analysis (SA) method with multiagent system (MAS) simulation, and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. Study results showed that the MAS simulation technique, which emphasizes quantitative process, effectively complemented the SA method, which emphasizes qualitative process; this achieved an organic combination of qualitative and quantitative land-use planning work and provided a new idea and method for land-use planning and the sustainable management of land resources.
Using mixed methods in health research
Woodman, Jenny
2013-01-01
Mixed methods research is the use of quantitative and qualitative methods in a single study or series of studies. It is an emergent methodology which is increasingly used by health researchers, especially within health services research. There is a growing literature on the theory, design and critical appraisal of mixed methods research. However, there are few papers that summarize this methodological approach for health practitioners who wish to conduct or critically engage with mixed methods studies. The objective of this paper is to provide an accessible introduction to mixed methods for clinicians and researchers unfamiliar with this approach. We present a synthesis of key methodological literature on mixed methods research, with examples from our own work and that of others, to illustrate the practical applications of this approach within health research. We summarize definitions of mixed methods research, the value of this approach, key aspects of study design and analysis, and discuss the potential challenges of combining quantitative and qualitative methods and data. One of the key challenges within mixed methods research is the successful integration of quantitative and qualitative data during analysis and interpretation. However, the integration of different types of data can generate insights into a research question, resulting in enriched understanding of complex health research problems. PMID:23885291
A low cost mobile phone dark-field microscope for nanoparticle-based quantitative studies.
Sun, Dali; Hu, Tony Y
2018-01-15
Dark-field microscope (DFM) analysis of nanoparticle binding signal is highly useful for a variety of research and biomedical applications, but current applications for nanoparticle quantification rely on expensive DFM systems. The cost, size, and limited robustness of these DFMs limit their utility for non-laboratory settings. Most nanoparticle analyses use high-magnification DFM images, which are labor intensive to acquire and subject to operator bias. Low-magnification DFM image capture is faster, but is subject to background from surface artifacts and debris, although image processing can partially compensate for background signal. We thus mated an LED light source, a dark-field condenser, and a 20× objective lens with a mobile phone camera to create an inexpensive, portable, and robust DFM system suitable for use in non-laboratory conditions. This proof-of-concept mobile DFM device weighs less than 400 g and costs less than $2000, but analysis of images captured with this device reveals nanoparticle quantitation results similar to those acquired with a much larger and more expensive desktop DFM system. Our results suggest that similar devices may be useful for quantification of stable, nanoparticle-based activity and quantitation assays in resource-limited areas where conventional assay approaches are not practical. Copyright © 2017 Elsevier B.V. All rights reserved.
Quantitative proteomic analysis of microdissected oral epithelium for cancer biomarker discovery.
Xiao, Hua; Langerman, Alexander; Zhang, Yan; Khalid, Omar; Hu, Shen; Cao, Cheng-Xi; Lingen, Mark W; Wong, David T W
2015-11-01
Specific biomarkers are urgently needed for the detection and progression of oral cancer. The objective of this study was to discover cancer biomarkers from oral epithelium through utilizing high throughput quantitative proteomics approaches. Morphologically malignant, epithelial dysplasia, and adjacent normal epithelial tissues were laser capture microdissected (LCM) from 19 patients and used for proteomics analysis. Total proteins from each group were extracted, digested and then labelled with corresponding isobaric tags for relative and absolute quantitation (iTRAQ). Labelled peptides from each sample were combined and analyzed by liquid chromatography-mass spectrometry (LC-MS/MS) for protein identification and quantification. In total, 500 proteins were identified and 425 of them were quantified. When compared with adjacent normal oral epithelium, 17 and 15 proteins were consistently up-regulated or down-regulated in malignant and epithelial dysplasia, respectively. Half of these candidate biomarkers were discovered for oral cancer for the first time. Cornulin was initially confirmed in tissue protein extracts and was further validated in tissue microarray. Its presence in the saliva of oral cancer patients was also explored. Myoglobin and S100A8 were pre-validated by tissue microarray. These data demonstrated that the proteomic biomarkers discovered through this strategy are potential targets for oral cancer detection and salivary diagnostics. Copyright © 2015 Elsevier Ltd. All rights reserved.
Integrating regional conservation priorities for multiple objectives into national policy
Beger, Maria; McGowan, Jennifer; Treml, Eric A.; Green, Alison L.; White, Alan T.; Wolff, Nicholas H.; Klein, Carissa J.; Mumby, Peter J.; Possingham, Hugh P.
2015-01-01
Multinational conservation initiatives that prioritize investment across a region invariably navigate trade-offs among multiple objectives. It seems logical to focus where several objectives can be achieved efficiently, but such multi-objective hotspots may be ecologically inappropriate, or politically inequitable. Here we devise a framework to facilitate a regionally cohesive set of marine-protected areas driven by national preferences and supported by quantitative conservation prioritization analyses, and illustrate it using the Coral Triangle Initiative. We identify areas important for achieving six objectives to address ecosystem representation, threatened fauna, connectivity and climate change. We expose trade-offs between areas that contribute substantially to several objectives and those meeting one or two objectives extremely well. Hence there are two strategies to guide countries choosing to implement regional goals nationally: multi-objective hotspots and complementary sets of single-objective priorities. This novel framework is applicable to any multilateral or global initiative seeking to apply quantitative information in decision making. PMID:26364769
Zhang, Xiaoyang; Xue, Lei; Zhang, Zhi; Zhang, Yiwen
2016-01-01
Background: Children's health problems have long attracted the attention of parents and society at large, and child language development is an active research topic. Researchers have found that appropriate early intervention by guardians can effectively promote the development of children's language and cognitive ability and lends itself to quantitative analysis. Intervention based on artificial intelligence technology also has a clear effect on children with autism spectrum disorders. Objective and Methods: This paper presents a speech signal analysis system for children, with preprocessing of the speaker's speech signal, subsequent calculation of word counts in the speech of guardians and children, and some other characteristic parameters or indicators (e.g., cognizable syllable number and the continuity of the language). Results: With these quantitative analysis tools and parameters, we can evaluate and analyze the quality of children's language and cognitive ability objectively and quantitatively, providing a basis for parents' decision-making; they can thereby adopt appropriate measures to promote the development of children's language and cognitive status. Conclusion: In this paper, building on existing studies of children's language development, we put forward several indicators for the automatic measurement of language development that influence the formation of children's language. The experimental results show that after pretreatment (including signal enhancement and speech activity detection), both the divergence algorithm results and the subsequent word counts agree well with the actual situation. PMID:27583037
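A minimal sketch of the kind of preprocessing step mentioned above, short-time-energy speech activity detection that could feed the subsequent word and syllable counting; frame length, hop, and threshold are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

def detect_speech(x, sr, frame_ms=25, hop_ms=10, thresh_ratio=0.1):
    """Mark frames whose short-time energy exceeds a fraction of the maximum."""
    frame = int(sr * frame_ms / 1000)
    hop = int(sr * hop_ms / 1000)
    n = 1 + max(0, (len(x) - frame) // hop)
    energy = np.array([np.sum(x[i * hop:i * hop + frame] ** 2) for i in range(n)])
    return energy > thresh_ratio * energy.max()
```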
Speech graphs provide a quantitative measure of thought disorder in psychosis.
Mota, Natalia B; Vasconcelos, Nivaldo A P; Lemos, Nathalia; Pieretti, Ana C; Kinouchi, Osame; Cecchi, Guillermo A; Copelli, Mauro; Ribeiro, Sidarta
2012-01-01
Psychosis has various causes, including mania and schizophrenia. Since the differential diagnosis of psychosis is exclusively based on subjective assessments of oral interviews with patients, an objective quantification of the speech disturbances that characterize mania and schizophrenia is in order. In principle, such quantification could be achieved by the analysis of speech graphs. A graph represents a network with nodes connected by edges; in speech graphs, nodes correspond to words and edges correspond to semantic and grammatical relationships. To quantify speech differences related to psychosis, interviews with schizophrenics, manics, and normal subjects were recorded and represented as graphs. Manics scored significantly higher than schizophrenics in ten graph measures. Psychopathological symptoms such as logorrhea, poor speech, and flight of thoughts were captured by the analysis even when verbosity differences were discounted. Binary classifiers based on speech graph measures sorted schizophrenics from manics with up to 93.8% sensitivity and 93.7% specificity. In contrast, sorting based on the scores of two standard psychiatric scales (BPRS and PANSS) reached only 62.5% sensitivity and specificity. The results demonstrate that alterations of the thought process manifested in the speech of psychotic patients can be objectively measured using graph-theoretical tools developed to capture specific features of the normal and dysfunctional flow of thought, such as divergence and recurrence. The quantitative analysis of speech graphs is not redundant with standard psychometric scales but rather complementary, as it yields a very accurate sorting of schizophrenics and manics. Overall, the results point to automated psychiatric diagnosis based not on what is said, but on how it is said.
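To make the graph representation concrete, the sketch below builds a directed word graph from a transcript by linking consecutive words and computes a few generic measures with networkx; the edge definition and the particular measures are illustrative assumptions, not necessarily those used in the study.

```python
import networkx as nx

def speech_graph(transcript):
    words = transcript.lower().split()
    g = nx.DiGraph()
    g.add_edges_from(zip(words, words[1:]))  # edge = consecutive-word link
    return g

g = speech_graph("thoughts fly and fly and return to thoughts")
print({
    "nodes": g.number_of_nodes(),
    "edges": g.number_of_edges(),
    "density": nx.density(g),
    "largest_scc": len(max(nx.strongly_connected_components(g), key=len)),
})
```

Recurrence of thought shows up here as cycles (a large strongly connected component), while divergence shows up as branching out of a node.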
Automated CT Scan Scores of Bronchiectasis and Air Trapping in Cystic Fibrosis
Swiercz, Waldemar; Heltshe, Sonya L.; Anthony, Margaret M.; Szefler, Paul; Klein, Rebecca; Strain, John; Brody, Alan S.; Sagel, Scott D.
2014-01-01
Background: Computer analysis of high-resolution CT (HRCT) scans may improve the assessment of structural lung injury in children with cystic fibrosis (CF). The goal of this cross-sectional pilot study was to validate automated, observer-independent image analysis software to establish objective, simple criteria for bronchiectasis and air trapping. Methods: HRCT scans of the chest were performed in 35 children with CF and compared with scans from 12 disease control subjects. Automated image analysis software was developed to count visible airways on inspiratory images and to measure a low attenuation density (LAD) index on expiratory images. Among the children with CF, relationships among automated measures, Brody HRCT scanning scores, lung function, and sputum markers of inflammation were assessed. Results: The number of total, central, and peripheral airways on inspiratory images and LAD (%) on expiratory images were significantly higher in children with CF compared with control subjects. Among subjects with CF, peripheral airway counts correlated strongly with Brody bronchiectasis scores by two raters (r = 0.86, P < .0001; r = 0.91, P < .0001), correlated negatively with lung function, and were positively associated with sputum free neutrophil elastase activity. LAD (%) correlated with Brody air trapping scores (r = 0.83, P < .0001; r = 0.69, P < .0001) but did not correlate with lung function or sputum inflammatory markers. Conclusions: Quantitative airway counts and LAD (%) on HRCT scans appear to be useful surrogates for bronchiectasis and air trapping in children with CF. Our automated methodology provides objective quantitative measures of bronchiectasis and air trapping that may serve as end points in CF clinical trials. PMID:24114359
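A minimal sketch of a low attenuation density index of the kind described: the percentage of lung voxels on the expiratory image falling below an attenuation cutoff. The -856 HU cutoff is a common air-trapping convention assumed here, not necessarily the study's criterion.

```python
import numpy as np

def lad_percent(hu_volume, lung_mask, cutoff_hu=-856):
    """LAD (%): share of lung voxels below the attenuation cutoff."""
    lung = hu_volume[lung_mask]
    return 100.0 * np.count_nonzero(lung < cutoff_hu) / lung.size
```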
Russell, Richard A; Adams, Niall M; Stephens, David A; Batty, Elizabeth; Jensen, Kirsten; Freemont, Paul S
2009-04-22
Considerable advances in microscopy, biophysics, and cell biology have provided a wealth of imaging data describing the functional organization of the cell nucleus. Until recently, cell nuclear architecture has largely been assessed by subjective visual inspection of fluorescently labeled components imaged by the optical microscope. This approach is inadequate to fully quantify spatial associations, especially when the patterns are indistinct, irregular, or highly punctate. Accurate image processing techniques as well as statistical and computational tools are thus necessary to interpret this data if meaningful spatial-function relationships are to be established. Here, we have developed a thresholding algorithm, stable count thresholding (SCT), to segment nuclear compartments in confocal laser scanning microscopy image stacks to facilitate objective and quantitative analysis of the three-dimensional organization of these objects using formal statistical methods. We validate the efficacy and performance of the SCT algorithm using real images of immunofluorescently stained nuclear compartments and fluorescent beads as well as simulated images. In all three cases, the SCT algorithm delivers a segmentation that is far better than standard thresholding methods, and more importantly, is comparable to manual thresholding results. By applying the SCT algorithm and statistical analysis, we quantify the spatial configuration of promyelocytic leukemia nuclear bodies with respect to irregular-shaped SC35 domains. We show that the compartments are closer than expected under a null model for their spatial point distribution, and furthermore that their spatial association varies according to cell state. The methods reported are general and can readily be applied to quantify the spatial interactions of other nuclear compartments.
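The sketch below is one plausible reading of the stable-count idea, not the published algorithm: sweep candidate thresholds, count segmented objects at each, and pick a threshold inside the window where the object count varies least. The sweep range and window width are assumptions.

```python
import numpy as np
from scipy import ndimage

def stable_count_threshold(stack, n_levels=100, window=5):
    """Pick a threshold from the plateau where the object count is stable."""
    levels = np.linspace(stack.min(), stack.max(), n_levels)
    counts = np.array([ndimage.label(stack > t)[1] for t in levels])
    spreads = [np.ptp(counts[i:i + window]) for i in range(n_levels - window)]
    i = int(np.argmin(spreads))
    return levels[i + window // 2]
```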
Descriptive quantitative analysis of hallux abductovalgus transverse plane radiographic parameters.
Meyr, Andrew J; Myers, Adam; Pontious, Jane
2014-01-01
Although the transverse plane radiographic parameters of the first intermetatarsal angle (IMA), hallux abductus angle (HAA), and metatarsal-sesamoid position (MSP) form the basis of preoperative procedure selection and postoperative surgical evaluation of the hallux abductovalgus deformity, the so-called normal values of these measurements have not been well established. The objectives of the present study were to (1) evaluate the descriptive statistics of the first IMA, HAA, and MSP from a large patient population and (2) determine an objective basis for defining "normal" versus "abnormal" measurements. Anteroposterior foot radiographs from 373 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated for the first IMA, HAA, and MSP. The results revealed mean measurements of 9.93°, 17.59°, and position 3.63 for the first IMA, HAA, and MSP, respectively. An advanced descriptive analysis demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, clear differentiations in deformity progression were appreciated when the variables were graphically depicted against each other. This could represent a quantitative basis for defining "normal" versus "abnormal" values. From the results of the present study, we have concluded that these radiographic parameters can be more conservatively reported and analyzed using nonparametric descriptive and comparative statistics within medical studies, and that the combination of a first IMA, HAA, and MSP at or greater than approximately 10°, 18°, and position 4, respectively, appears to be an objective "tipping point" in terms of deformity progression and might represent an upper limit of what is acceptable in surgical deformity correction. Copyright © 2014 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.
[Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].
Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie
2013-11-01
In order to improve the accuracy of AES quantitative analysis, we combined XPS with AES and studied how to reduce the error of AES quantitative analysis. Pt-Co, Cu-Au, and Cu-Ag binary alloy thin films were selected as samples, and XPS was used to correct the AES quantitative results by adjusting the Auger sensitivity factors so that the two techniques gave similar results. We then verified the accuracy of AES quantitative analysis with the revised sensitivity factors on other samples with different composition ratios; the results showed that the corrected relative sensitivity factors reduce the error of AES quantitative analysis to less than 10%. Peak definition is difficult in the integral form of the AES spectrum, since choosing the starting and ending points when determining the characteristic Auger peak intensity area involves great uncertainty. To make the analysis easier, we also processed the data in differential form, performed quantitative analysis on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors, and verified the accuracy of the quantitative analysis on other samples with different composition ratios. The result showed that the analytical error of AES quantitative analysis was reduced to less than 9%. This shows that the accuracy of AES quantitative analysis can be greatly improved by combining XPS with AES to correct the Auger sensitivity factors, since matrix effects are taken into account. Good consistency was obtained, proving the feasibility of this method.
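For context, relative-sensitivity-factor quantification in AES and XPS computes atomic fractions as C_i = (I_i/S_i) / Σ_j (I_j/S_j), so correcting an Auger sensitivity factor against an XPS result, as described above, amounts to rescaling S_i. A minimal sketch with invented intensities and factors:

```python
def atomic_fractions(intensities, sensitivities):
    """C_i = (I_i / S_i) normalized over all elements."""
    norm = [i / s for i, s in zip(intensities, sensitivities)]
    total = sum(norm)
    return [x / total for x in norm]

# Hypothetical two-element (e.g., Cu-Au) film measured by AES.
print(atomic_fractions(intensities=[1200.0, 800.0], sensitivities=[0.8, 1.2]))
```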
NASA Astrophysics Data System (ADS)
Nguyen, Thanh C.; Nehmetallah, George; Lam, Van; Chung, Byung Min; Raub, Christopher
2017-02-01
Digital holographic microscopy (DHM) provides label-free and real-time quantitative phase information relevant to the analysis of dynamic biological systems. A DHM based on a telecentric configuration optically mitigates phase aberrations due to the microscope objective and linear high-frequency fringes due to the reference beam, thus minimizing the digital aberration correction needed for distortion-free 3D reconstruction. The purpose of this work is to quantitatively assess the growth and migratory behavior of invasive cancer cells using a telecentric DHM system. Together, the height and lateral shape features of individual cells, determined from time-lapse series of phase reconstructions, should reveal aspects of cell migration, cell-matrix adhesion, and cell cycle phase transitions. To test this, MDA-MB-231 breast cancer cells were cultured on collagen-coated or uncoated glass, and 3D holograms were reconstructed over 2 hours. Cells on collagen-coated glass had an average 14% larger spread area than cells on uncoated glass (n=18-22 cells/group). The spread area of cells on uncoated glass was 15-21% larger than that of cells seeded on collagen hydrogels (n=18-22 cells/group). Premitotic cell rounding was observed, with average phase height increasing 57% over 10 minutes. Following cell division, phase height decreased linearly (R²=0.94) to 58% of the original pre-division height. Phase objects consistent with lamellipodia were apparent in the reconstructions at the leading edge of migrating cells. These data demonstrate the ability to track quantitative phase parameters and relate them to cell morphology during cell migration and division on adherent substrates using telecentric DHM. The technique enables future studies of cell-matrix interactions relevant to cancer.
PXRF, μ-XRF, vacuum μ-XRF, and EPMA analysis of Email Champlevé objects present in Belgian museums.
Van der Linden, Veerle; Meesdom, Eva; Devos, Annemie; Van Dooren, Rita; Nieuwdorp, Hans; Janssen, Elsje; Balace, Sophie; Vekemans, Bart; Vincze, Laszlo; Janssens, Koen
2011-10-01
The enamel of 20 Email Champlevé objects dating between the 12th and 19th centuries was investigated by means of microscopic and portable X-ray fluorescence analysis (μ-XRF and PXRF). Seven of these objects were microsampled and the fragments were analyzed with electron probe microanalysis (EPMA) and vacuum μ-XRF to obtain quantitative data about the composition of the glass used to produce these enameled objects. As a result of the evolution of the raw materials employed to produce the base glass, three different compositional groups could be discriminated. The first group consisted of soda-lime-silica glass with a sodium source of mineral origin (with low K content) that was opacified by addition of calcium antimonate crystals. This type of glass was only used in objects made in the 12th century. Email Champlevé objects from the beginning of the 13th century onward were enameled with soda-lime-silica glass with a sodium source of vegetal origin. This type of glass, which has a higher potassium content, was opacified with SnO2 crystals. The glass used for 19th century Email Champlevé artifacts was produced with synthetic and purified components resulting in a different chemical composition compared to the other groups. Although the four analytical techniques employed in this study have their own specific characteristics, they were all found to be suitable for classifying the objects into the different chronological categories.
Objective, Quantitative, Data-Driven Assessment of Chemical Probes.
Antolin, Albert A; Tym, Joseph E; Komianou, Angeliki; Collins, Ian; Workman, Paul; Al-Lazikani, Bissan
2018-02-15
Chemical probes are essential tools for understanding biological systems and for target validation, yet selecting probes for biomedical research is rarely based on objective assessment of all potential compounds. Here, we describe the Probe Miner: Chemical Probes Objective Assessment resource, capitalizing on the plethora of public medicinal chemistry data to empower quantitative, objective, data-driven evaluation of chemical probes. We assess >1.8 million compounds for their suitability as chemical tools against 2,220 human targets and dissect the biases and limitations encountered. Probe Miner represents a valuable resource to aid the identification of potential chemical probes, particularly when used alongside expert curation. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Spinner, Georg; Wyss, Michael; Erni, Stefan; Ettlin, Dominik A; Nanz, Daniel; Ulbrich, Erika J; Gallo, Luigi M; Andreisek, Gustav
2016-01-01
Objectives: To quantitatively and qualitatively compare MRI of the temporomandibular joint (TMJ) using an optimized high-resolution protocol at 3.0 T and a clinical standard protocol at 1.5 T. Methods: A phantom and 12 asymptomatic volunteers were MR imaged using a 2-channel surface coil (standard TMJ coil) at 1.5 and 3.0 T (Philips Achieva and Philips Ingenia, respectively; Philips Healthcare, Best, Netherlands). Imaging protocol consisted of coronal and oblique sagittal proton density-weighted turbo spin echo sequences. For quantitative evaluation, a spherical phantom was imaged. Signal-to-noise ratio (SNR) maps were calculated on a voxelwise basis. For qualitative evaluation, all volunteers underwent MRI of the TMJ with the jaw in closed position. Two readers independently assessed visibility and delineation of anatomical structures of the TMJ and overall image quality on a 5-point Likert scale. Quantitative and qualitative measurements were compared between field strengths. Results: The quantitative analysis showed similar SNR for the high-resolution protocol at 3.0 T compared with the clinical protocol at 1.5 T. The qualitative analysis showed significantly better visibility and delineation of clinically relevant anatomical structures of the TMJ, including the TMJ disc and pterygoid muscle as well as better overall image quality at 3.0 T than at 1.5 T. Conclusions: The presented results indicate that expected gains in SNR at 3.0 T can be used to increase the spatial resolution when imaging the TMJ, which translates into increased visibility and delineation of anatomical structures of the TMJ. Therefore, imaging at 3.0 T should be preferred over 1.5 T for imaging the TMJ. PMID:26371077
A quantitative study of nanoparticle skin penetration with interactive segmentation.
Lee, Onseok; Lee, See Hyun; Jeong, Sang Hoon; Kim, Jaeyoung; Ryu, Hwa Jung; Oh, Chilhwan; Son, Sang Wook
2016-10-01
In the last decade, the application of nanotechnology techniques has expanded within diverse areas such as pharmacology, medicine, and optical science. Despite such wide-ranging possibilities for implementation into practice, the mechanisms behind nanoparticle skin absorption remain unknown. Moreover, the main mode of investigation has been qualitative analysis. Using interactive segmentation, this study suggests a method of objectively and quantitatively analyzing the mechanisms underlying the skin absorption of nanoparticles. Silica nanoparticles (SNPs) were assessed using transmission electron microscopy and applied to a human skin equivalent model. Captured fluorescence images of this model were used to evaluate degrees of skin penetration. These images underwent interactive segmentation and image processing, in addition to statistical quantitative analyses of calculated image parameters including the mean, integrated density, skewness, kurtosis, and area fraction. In images from both groups, the distribution area and intensity of fluorescent silica gradually increased in proportion to time. Statistical significance was achieved after 2 days in the negative charge group but only after 4 days in the positive charge group, indicating a difference in timing. Furthermore, the quantity of silica per unit area showed a dramatic change after 6 days in the negative charge group. Although this quantitative result agrees with that obtained by qualitative assessment, it is meaningful in that it was demonstrated by statistical analysis, with quantitation performed through image processing. The present study suggests that the surface charge of SNPs could play an important role in the percutaneous absorption of NPs. These findings can help achieve a better understanding of the percutaneous transport of NPs. In addition, these results provide important guidance for the design of NPs for biomedical applications.
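A minimal sketch of the listed image parameters (mean, integrated density, skewness, kurtosis, area fraction) computed from a grayscale fluorescence image; the fixed intensity threshold used for the area fraction is an assumption for illustration.

```python
import numpy as np
from scipy import stats

def fluorescence_params(img, thresh=20):
    """Basic histogram-style parameters of a grayscale fluorescence image."""
    pix = img.ravel().astype(float)
    return {
        "mean": pix.mean(),
        "integrated_density": pix.sum(),
        "skewness": stats.skew(pix),
        "kurtosis": stats.kurtosis(pix),
        "area_fraction": np.count_nonzero(img > thresh) / img.size,
    }
```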
Quantitative optical metrology with CMOS cameras
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.
2004-08-01
Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full field of view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.
Testicular Dysgenesis Syndrome and the Estrogen Hypothesis: A Quantitative Meta-Analysis
Martin, Olwenn V.; Shialis, Tassos; Lester, John N.; Scrimshaw, Mark D.; Boobis, Alan R.; Voulvoulis, Nikolaos
2008-01-01
Background Male reproductive tract abnormalities such as hypospadias and cryptorchidism, and testicular cancer have been proposed to comprise a common syndrome together with impaired spermatogenesis with a common etiology resulting from the disruption of gonadal development during fetal life, the testicular dysgenesis syndrome (TDS). The hypothesis that in utero exposure to estrogenic agents could induce these disorders was first proposed in 1993. The only quantitative summary estimate of the association between prenatal exposure to estrogenic agents and testicular cancer was published over 10 years ago, and other systematic reviews of the association between estrogenic compounds, other than the potent pharmaceutical estrogen diethylstilbestrol (DES), and TDS end points have remained inconclusive. Objectives We conducted a quantitative meta-analysis of the association between the end points related to TDS and prenatal exposure to estrogenic agents. Inclusion in this analysis was based on mechanistic criteria, and the plausibility of an estrogen receptor (ER)-α–mediated mode of action was specifically explored. Results We included in this meta-analysis eight studies investigating the etiology of hypospadias and/or cryptorchidism that had not been identified in previous systematic reviews. Four additional studies of pharmaceutical estrogens yielded a statistically significant updated summary estimate for testicular cancer. Conclusions The doubling of the risk ratios for all three end points investigated after DES exposure is consistent with a shared etiology and the TDS hypothesis but does not constitute evidence of an estrogenic mode of action. Results of the subset analyses point to the existence of unidentified sources of heterogeneity between studies or within the study population. PMID:18288311
Noninvasive Dry Eye Assessment Using High-Technology Ophthalmic Examination Devices.
Yamaguchi, Masahiko; Sakane, Yuri; Kamao, Tomoyuki; Zheng, Xiaodong; Goto, Tomoko; Shiraishi, Atsushi; Ohashi, Yuichi
2016-11-01
Recently, the number of dry eye cases has dramatically increased. Thus, it is important that easy screening, exact diagnoses, and suitable treatments be available. We developed 3 original and noninvasive assessments for this disorder. First, a DR-1 dry eye monitor was used to determine the tear meniscus height quantitatively by capturing a tear meniscus digital image that was analyzed by Meniscus Processor software. The DR-1 meniscus height value significantly correlated with the fluorescein meniscus height (r = 0.06, Bland-Altman analysis). At a cutoff value of 0.22 mm, sensitivity of the dry eye diagnosis was 84.1% with 90.9% specificity. Second, the Tear Stability Analysis System was used to quantitatively measure tear film stability using a topographic modeling system corneal shape analysis device. Tear film stability was objectively and quantitatively evaluated every second during sustained eye openings. The Tear Stability Analysis System is currently installed in an RT-7000 autorefractometer and topographer to automate the diagnosis of dry eye. Third, the Ocular Surface Thermographer uses ophthalmic thermography for diagnosis. The decrease in ocular surface temperature in dry eyes was significantly greater than that in normal eyes (P < 0.001) at 10 seconds after eye opening. Decreased corneal temperature correlated significantly with the tear film breakup time (r = 0.572; P < 0.001). When changes in the ocular surface temperature of the cornea were used as indicators for dry eye, sensitivity was 0.83 and specificity was 0.80 after 10 seconds. This article describes the details and potential of these 3 noninvasive dry eye assessment systems.
2017-01-01
Background The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Objective Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. Methods A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Results Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). Conclusions This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. PMID:28292738
Han, Zhihua; Shao, Lixin; Xie, Yan; Wu, Jianhong; Zhang, Yan; Xin, Hongkui; Ren, Aijun; Guo, Yong; Wang, Deli; He, Qing; Ruan, Dike
2014-01-01
Objective The objective of this study was to evaluate the efficacy of quantitative T2 magnetic resonance imaging (MRI) for quantifying early cervical intervertebral disc (IVD) degeneration in asymptomatic young adults by correlating the T2 value with Pfirrmann grade, sex, and anatomic level. Methods Seventy asymptomatic young subjects (34 men and 36 women; mean age, 22.80±2.11 yr; range, 18–25 years) underwent 3.0-T MRI to obtain morphological data (one T1-fast spin echo (FSE) and three-plane T2-FSE, used to assign a Pfirrmann grade (I–V)) and for T2 mapping (multi-echo spin echo). T2 values in the nucleus pulposus (NP, n = 350) and anulus fibrosus (AF, n = 700) were obtained. Differences in T2 values between sexes and anatomic levels were evaluated, and linear correlation analysis of T2 values versus degenerative grade was conducted. Findings Cervical IVDs of healthy young adults were commonly determined to be at Pfirrmann grades I and II. T2 values of NPs were significantly higher than those of AF at all anatomic levels (P<0.000). The NP, anterior AF, and posterior AF values did not differ significantly between genders at the same anatomic level (P>0.05). T2 values decreased linearly with degenerative grade. Linear correlation analysis revealed a strong negative association between the Pfirrmann grade and the T2 values of the NP (P = 0.000) but not the T2 values of the AF (P = 0.854). However, non-degenerated discs (Pfirrmann grades I and II) showed a wide range of T2 relaxation times. T2 values according to disc degeneration level classification were as follows: grade I (>62.03 ms), grade II (54.60–62.03 ms), grade III (<54.60 ms). Conclusions T2 quantitation provides a more sensitive and robust approach for detecting and characterizing the early stage of cervical IVD degeneration and for establishing a reliable quantitative baseline in healthy young adults. PMID:24498384
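For context, T2 values from a multi-echo spin-echo series are commonly estimated by fitting S(TE) = S0·exp(-TE/T2). The log-linear least-squares shortcut below is a minimal sketch with invented numbers, not the study's fitting procedure.

```python
import numpy as np

def fit_t2(signals, echo_times_ms):
    """Mono-exponential T2 (ms) from ROI signals: ln S = ln S0 - TE/T2."""
    slope, _ = np.polyfit(np.asarray(echo_times_ms, float), np.log(signals), 1)
    return -1.0 / slope

print(fit_t2([900.0, 600.0, 400.0, 270.0], [10, 30, 50, 70]))  # ~50 ms
```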
Mast, Fred D; Ratushny, Alexander V; Aitchison, John D
2014-09-15
Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. © 2014 Mast et al.
[Integral quantitative evaluation of working conditions in the construction industry].
Guseĭnov, A A
1993-01-01
The present method of evaluating the quality of the environment (using MAC and MAL values) does not make it possible to assess the working conditions of the construction industry completely and objectively, owing to multiple confounding elements. A solution to this complicated problem, which includes the analysis of various correlated elements of the system "human--work conditions--environment", may be aided by a social norm of morbidity that is independent of the industrial and natural environment. Such a complete integral assessment makes it possible to see the whole situation and reveal the points at risk.
Variance fluctuations in nonstationary time series: a comparative study of music genres
NASA Astrophysics Data System (ADS)
Jennings, Heather D.; Ivanov, Plamen Ch.; De Martins, Allan M.; da Silva, P. C.; Viswanathan, G. M.
2004-05-01
An important problem in physics concerns the analysis of audio time series generated by transduced acoustic phenomena. Here, we develop a new method to quantify the scaling properties of the local variance of nonstationary time series. We apply this technique to analyze audio signals obtained from selected genres of music. We find quantitative differences in the correlation properties of high art music, popular music, and dance music. We discuss the relevance of these objective findings in relation to the subjective experience of music.
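A minimal sketch of one generic way to quantify such scaling: compute the variance of the signal within non-overlapping windows of increasing length and fit a power law to the window-averaged local variance. This is a simplified reading of the approach, not the authors' exact estimator.

```python
import numpy as np

def variance_scaling_exponent(x, window_sizes):
    """Slope of log(mean local variance) vs. log(window size)."""
    fluct = []
    for w in window_sizes:
        n = len(x) // w
        chunks = x[:n * w].reshape(n, w)  # non-overlapping windows of length w
        fluct.append(chunks.var(axis=1).mean())
    slope, _ = np.polyfit(np.log(window_sizes), np.log(fluct), 1)
    return slope
```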
Getting started on metrics - Jet Propulsion Laboratory productivity and quality
NASA Technical Reports Server (NTRS)
Bush, M. W.
1990-01-01
A review is presented to describe the effort and difficulties of reconstructing fifteen years of JPL software history. In 1987 the collection and analysis of project data were started with the objective of creating laboratory-wide measures of quality and productivity for software development. As a result of this two-year Software Product Assurance metrics study, a rough measurement foundation for software productivity and software quality, and an order-of-magnitude quantitative baseline for software systems and subsystems are now available.
Gandomkar, Ziba; Brennan, Patrick C.; Mello-Thoms, Claudia
2017-01-01
Context: Previous studies showed that the agreement among pathologists in recognition of mitoses in breast slides is fairly modest. Aims: Determining the significantly different quantitative features among easily identifiable mitoses, challenging mitoses, and miscounted nonmitoses within breast slides and identifying which color spaces capture the difference among groups better than others. Materials and Methods: The dataset contained 453 mitoses and 265 miscounted objects in breast slides. The mitoses were grouped into three categories based on the confidence degree of three pathologists who annotated them. The mitoses annotated as "probably a mitosis" by the majority of pathologists were considered as the challenging category. The miscounted objects were recognized as a mitosis or probably a mitosis by only one of the pathologists. The mitoses were segmented using k-means clustering, followed by morphological operations. Morphological, intensity-based, and textural features were extracted from the segmented area and also the image patch of 63 × 63 pixels in different channels of eight color spaces. Holistic features describing the mitoses' surrounding cells of each image were also extracted. Statistical Analysis Used: The Kruskal–Wallis H-test followed by the Tukey-Kramer test was used to identify significantly different features. Results: The results indicated that challenging mitoses were smaller and rounder compared to other mitoses. Among different features, the Gabor textural features differed more than others between challenging mitoses and the easily identifiable ones. Sizes of the nonmitoses were similar to those of easily identifiable mitoses, but nonmitoses were rounder. The intensity-based features from chromatin channels were the most discriminative features between the easily identifiable mitoses and the miscounted objects. Conclusions: Quantitative features can be used to describe the characteristics of challenging mitoses and miscounted nonmitotic objects. PMID:28966834
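The group comparison described above can be reproduced generically with SciPy; the feature values below are invented placeholders standing in for one extracted feature across the three groups.

```python
from scipy import stats

easy = [4.1, 3.8, 5.0, 4.6]         # easily identifiable mitoses
challenging = [2.9, 3.1, 2.7, 3.3]  # "probably a mitosis" cases
miscounted = [4.0, 4.4, 3.9, 4.2]   # miscounted nonmitotic objects

h, p = stats.kruskal(easy, challenging, miscounted)
print(f"H = {h:.2f}, p = {p:.4f}")  # follow with a post hoc test if p < 0.05
```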
Vanmeert, Frederik; De Nolf, Wout; Dik, Joris; Janssens, Koen
2018-06-05
At or below the surface of painted works of art, valuable information is present that provides insights into an object's past, such as the artist's technique and the creative process that was followed or its conservation history, but also on its current state of preservation. Various noninvasive techniques have been developed over the past 2 decades that can probe this information either locally (via point analysis) or on a macroscopic scale (e.g., full-field imaging and raster scanning). Recently macroscopic X-ray powder diffraction (MA-XRPD) mapping using laboratory X-ray sources was developed. This method can visualize highly specific chemical distributions at the macroscale (dm²). In this work we demonstrate the synergy between the quantitative aspects of powder diffraction and the noninvasive scanning capability of MA-XRPD, highlighting the potential of the method to reveal new types of information. Quantitative data derived from a 15th/16th century illuminated sheet of parchment revealed three lead white pigments with different hydrocerussite-cerussite compositions in specific pictorial elements, while quantification analysis of impurities in the blue azurite pigment revealed two distinct azurite types: one rich in barite and one in quartz. Furthermore, on the same artifact, the depth-selective possibilities of the method that stem from an exploitation of the shift of the measured diffraction peaks with respect to reference data are highlighted. The influence of different experimental parameters on the depth-selective analysis results is briefly discussed. Promising stratigraphic information could be obtained, even though the analysis is hampered by not completely understood variations in the unit cell dimensions of the crystalline pigment phases.
Choi, Moon Hyung; Oh, Soon Nam; Rha, Sung Eun; Choi, Joon-Il; Lee, Sung Hak; Jang, Hong Seok; Kim, Jun-Gi; Grimm, Robert; Son, Yohan
2016-07-01
To investigate the usefulness of apparent diffusion coefficient (ADC) values derived from histogram analysis of the whole rectal cancer as a quantitative parameter to evaluate pathologic complete response (pCR) on preoperative magnetic resonance imaging (MRI). We enrolled a total of 86 consecutive patients who had undergone surgery for rectal cancer after neoadjuvant chemoradiotherapy (CRT) at our institution between July 2012 and November 2014. Two radiologists who were blinded to the final pathological results reviewed post-CRT MRI to evaluate tumor stage. Quantitative image analysis was performed using T2-weighted and diffusion-weighted images independently by two radiologists using dedicated software that performed histogram analysis to assess the distribution of ADC in the whole tumor. After surgery, 16 patients were confirmed to have achieved pCR (18.6%). All parameters from the pre- and post-CRT ADC histograms showed good or excellent agreement between the two readers. The minimum, 10th, 25th, 50th, and 75th percentile and mean ADC from the post-CRT ADC histogram were significantly higher in the pCR group than in the non-pCR group for both readers. The 25th percentile value from the ADC histogram in post-CRT MRI had the best diagnostic performance for detecting pCR, with an area under the receiver operating characteristic curve of 0.796. Low percentile values derived from the ADC histogram analysis of rectal cancer on MRI after CRT showed a significant difference between pCR and non-pCR groups, demonstrating the utility of the ADC value as a quantitative and objective marker to evaluate complete pathologic response to preoperative CRT in rectal cancer. J. Magn. Reson. Imaging 2016;44:212-220. © 2015 Wiley Periodicals, Inc.
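A minimal sketch of the whole-tumor ADC histogram analysis described: gather ADC values inside the tumor mask and read off the percentile parameters; the function and key names are our own.

```python
import numpy as np

def adc_histogram_params(adc_map, tumor_mask):
    """Histogram parameters of ADC values within the tumor mask."""
    vals = adc_map[tumor_mask]
    params = dict(zip(["min", "p10", "p25", "p50", "p75"],
                      np.percentile(vals, [0, 10, 25, 50, 75])))
    params["mean"] = vals.mean()
    return params
```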
Medical Student Research: An Integrated Mixed-Methods Systematic Review and Meta-Analysis
Amgad, Mohamed; Man Kin Tsui, Marco; Liptrott, Sarah J.; Shash, Emad
2015-01-01
Importance Despite the rapidly declining number of physician-investigators, there is no consistent structure within medical education so far for involving medical students in research. Objective To conduct an integrated mixed-methods systematic review and meta-analysis of published studies about medical students' participation in research, and to evaluate the evidence in order to guide policy decision-making regarding this issue. Evidence Review We followed the PRISMA statement guidelines during the preparation of this review and meta-analysis. We searched various databases as well as the bibliographies of the included studies between March 2012 and September 2013. We identified all relevant quantitative and qualitative studies assessing the effect of medical student participation in research, without restrictions regarding study design or publication date. Prespecified outcome-specific quality criteria were used to judge the admission of each quantitative outcome into the meta-analysis. Initial screening of titles and abstracts resulted in the retrieval of 256 articles for full-text assessment. Eventually, 79 articles were included in our study, including eight qualitative studies. An integrated approach was used to combine quantitative and qualitative studies into a single synthesis. Once all included studies were identified, a data-driven thematic analysis was performed. Findings and Conclusions Medical student participation in research is associated with improved short- and long-term scientific productivity, more informed career choices, and improved knowledge about, interest in, and attitudes towards research. Financial worries, gender, having a higher degree (MSc or PhD) before matriculation, and perceived competitiveness of the residency of choice are among the factors that affect the engagement of medical students in research and/or their scientific productivity. Intercalated BSc degrees, mandatory graduation theses, and curricular research components may help in standardizing research education during medical school. PMID:26086391
Namkoong, Sun; Hong, Seung Phil; Kim, Myung Hwa; Park, Byung Cheol
2013-02-01
Nowadays, although its clinical value remains controversial, institutions utilize hair mineral analysis. Arguments about the reliability of hair mineral analysis persist, and there have been evaluations of commercial laboratories performing it. The objective of this study was to assess the intra-laboratory and inter-laboratory reliability of three commercial laboratories conducting hair mineral analysis, compared with serum mineral analysis. Two divided hair samples taken from near the scalp of one healthy volunteer were submitted to all laboratories for analysis at the same time. Each laboratory sent a report consisting of quantitative results and an interpretation of their health implications. Differences among intra-laboratory and inter-laboratory data were analyzed using SPSS version 12.0 (SPSS Inc., USA). All the laboratories used identical methods for quantitative analysis, and they generated consistent numerical results according to Friedman analysis of variance. However, the normal reference ranges of the laboratories varied, and as a result each laboratory interpreted the patient's health differently. For intra-laboratory data, Wilcoxon analysis suggested that the laboratories generated relatively coherent data, but laboratory B failed to do so for one element, so its reliability was doubtful. In comparison with the blood test, laboratory C generated identical results, but laboratories A and B did not. Hair mineral analysis has its limitations, considering the reliability of inter- and intra-laboratory analyses compared with blood analysis. As such, clinicians should be cautious when applying hair mineral analysis as an ancillary tool, and each laboratory included in this study requires continuous refinement to establish standardized normal reference ranges.
3-D interactive visualisation tools for HI spectral line imaging
NASA Astrophysics Data System (ADS)
van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.
2017-06-01
Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.
Different techniques of multispectral data analysis for vegetation fraction retrieval
NASA Astrophysics Data System (ADS)
Kancheva, Rumiana; Georgiev, Georgi
2012-07-01
Vegetation monitoring is one of the most important applications of remote sensing technologies. With respect to farmlands, the assessment of crop condition constitutes the basis for monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, leaf area index, etc. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time, it is a defining factor of the spectral signatures of the soil-vegetation system. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates, and dominant wavelength). The objective is to reveal their potential, accuracy, and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
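A minimal sketch of the first technique listed, linear spectral unmixing: each pixel spectrum is modeled as a non-negative combination of soil and vegetation endmember spectra, and the vegetation weight gives the cover fraction. The band reflectances are invented for illustration.

```python
import numpy as np
from scipy.optimize import nnls

endmembers = np.array([
    [0.10, 0.12, 0.25, 0.30],  # soil reflectance in four bands
    [0.05, 0.08, 0.45, 0.50],  # vegetation reflectance in four bands
]).T                            # shape (bands, endmembers)

mixed = np.array([0.07, 0.095, 0.37, 0.42])  # observed pixel spectrum
weights, _ = nnls(endmembers, mixed)         # non-negative least squares
print(f"vegetation fraction ~ {weights[1] / weights.sum():.2f}")  # ~0.60
```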
Zheng, Xianlin; Lu, Yiqing; Zhao, Jiangbo; Zhang, Yuhai; Ren, Wei; Liu, Deming; Lu, Jie; Piper, James A; Leif, Robert C; Liu, Xiaogang; Jin, Dayong
2016-01-19
Compared with routine microscopy imaging of a few analytes at a time, rapid scanning through the whole sample area of a microscope slide to locate every single target object offers many advantages in terms of simplicity, speed, throughput, and potential for robust quantitative analysis. Existing techniques that accommodate solid-phase samples incorporating individual micrometer-sized targets generally rely on digital microscopy and image analysis, with intrinsically low throughput and reliability. Here, we report an advanced on-the-fly stage scanning method to achieve high-precision target location across the whole slide. By integrating X- and Y-axis linear encoders to a motorized stage as the virtual "grids" that provide real-time positional references, we demonstrate an orthogonal scanning automated microscopy (OSAM) technique which can search a coverslip area of 50 × 24 mm² in just 5.3 min and locate individual 15 μm lanthanide luminescent microspheres with standard deviations of 1.38 and 1.75 μm in the X and Y directions. Alongside implementation of an autofocus unit that compensates for the tilt of a slide in the Z-axis in real time, we increase the luminescence detection efficiency by 35% with an improved coefficient of variation. We demonstrate the capability of advanced OSAM for robust quantification of luminescence intensities and lifetimes for a variety of micrometer-scale luminescent targets, specifically single down-shifting and upconversion microspheres, crystalline microplates, and color-barcoded microrods, as well as quantitative suspension array assays of biotinylated-DNA functionalized upconversion nanoparticles.
Radhakrishna, K.; Bowles, K.; Zettek-Sumner, A.
2013-01-01
Background Telehealth data overload through high alert generation is a significant barrier to sustained adoption of telehealth for managing HF patients. Objective To explore the factors contributing to frequent telehealth alerts, including false alerts, for Medicare heart failure (HF) patients admitted to a home health agency. Materials and Methods A mixed methods design that combined quantitative correlation analysis of patient characteristic data with the number of telehealth alerts and qualitative analysis of telehealth and visiting nurses' notes on follow-up actions to patients' telehealth alerts was employed. All the quantitative and qualitative data were collected through retrospective review of electronic records of the home health agency. Results Subjects in the study had a mean age of 83 (SD = 7.6); 56% were female. Patient co-morbidities (p<0.05) of renal disorders, anxiety, and cardiac arrhythmias emerged as predictors of telehealth alerts through quantitative analysis (n = 168) using multiple regression. Inappropriate telehealth measurement technique by patients (54%) and home healthcare system inefficiencies (37%) contributed to most telehealth false alerts in the purposive qualitative sub-sample (n = 35) of patients with high telehealth alerts. Conclusion Encouraging patient engagement with the telehealth process, fostering a collaborative approach among all the clinicians involved with the telehealth intervention, and tailoring telehealth alert thresholds to patient characteristics, along with establishing patient-centered telehealth outcome goals, may allow meaningful generation of telehealth alerts. Reducing avoidable telehealth alerts could vastly improve the efficiency and sustainability of telehealth programs for HF management. PMID:24454576
Using computer-based video analysis in the study of fidgety movements.
Adde, Lars; Helbostad, Jorunn L; Jensenius, Alexander Refsum; Taraldsen, Gunnar; Støen, Ragnhild
2009-09-01
Absence of fidgety movements (FM) in high-risk infants is a strong marker for later cerebral palsy (CP). FMs can be classified by the General Movement Assessment (GMA), based on Gestalt perception of the infant's movement pattern. More objective movement analysis may be provided by computer-based technology. The aim of this study was to explore the feasibility of computer-based video analysis of infants' spontaneous movements in classifying non-fidgety versus fidgety movements. GMA was performed on video material from the fidgety period in 82 term and preterm infants at low and high risk of developing CP. The same videos were analysed using custom software, the General Movement Toolbox (GMT), with visualisation of the infant's movements for qualitative analyses. Variables derived from the displacement of pixels from one video frame to the next were used for quantitative analyses. Visual representations from GMT showed easily recognisable patterns of FMs. Of the eight quantitative variables derived, the variability in displacement of the spatial centre of active pixels in the image had the highest sensitivity (81.5%) and specificity (70.0%) in classifying FMs. By setting triage thresholds at 90% sensitivity and specificity for FM, the need for further referral was reduced by 70%. Video recordings can be used for qualitative and quantitative analyses of FMs with GMT. GMT is easy to implement in clinical practice and may assist in detecting infants without FMs.
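A minimal sketch of the general idea behind the best-performing GMT variable: track the centroid of "active" (changed) pixels between consecutive frames and summarise its variability. This is not the actual GMT implementation; the difference threshold and synthetic video are assumptions.

```python
# Sketch of a frame-differencing movement variable: variability of the
# centroid of active pixels across frames. Not the GMT implementation.
import numpy as np

def centroid_variability(frames, diff_threshold=10):
    """frames: array (T, H, W). Returns std of the active-pixel centroid."""
    centroids = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        active = np.abs(curr.astype(float) - prev) > diff_threshold
        if active.any():
            ys, xs = np.nonzero(active)
            centroids.append((ys.mean(), xs.mean()))
    return np.array(centroids).std(axis=0)     # variability in y and x

rng = np.random.default_rng(2)
video = rng.integers(0, 255, (50, 64, 64))     # synthetic stand-in video
print(centroid_variability(video))
```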
Modelling the structure of sludge aggregates
Smoczyński, Lech; Ratnaweera, Harsha; Kosobucka, Marta; Smoczyński, Michał; Kalinowski, Sławomir; Kvaal, Knut
2016-01-01
The structure of sludge is closely associated with the process of wastewater treatment. Synthetic dyestuff wastewater and sewage were coagulated using the PAX and PIX methods, and electro-coagulated on aluminium electrodes. The wastewater treatment processes were supported with an organic polymer. Images of the surface structures of the investigated sludge were obtained using scanning electron microscopy (SEM). Software image analysis yielded plots of log A vs. log P, where A is the surface area and P is the perimeter, for individual objects comprised in the structure of the sludge. The resulting database confirmed the 'self-similarity' of the structural objects in the studied groups of sludge, which enabled calculating their fractal dimension and proposing models for these objects. A quantitative description of the sludge aggregates permitted proposing a mechanism for the processes responsible for their formation. The paper also discusses the impact of the structure of the investigated sludge on sedimentation and on dehydration of the thickened sludge after sedimentation. PMID:26549812
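A sketch of one common perimeter-area convention for such log A vs. log P data: for self-similar objects one often assumes P ∝ A^(D/2), so regressing log P on log A gives a slope of D/2. The convention and the values below are assumptions for illustration, not the paper's data.

```python
# Perimeter-area fractal estimate under the common assumption
# P ~ A**(D/2): the slope of log P vs. log A is D/2. Synthetic values.
import numpy as np

A = np.array([120., 340., 900., 2500., 7100.])   # object areas (px^2)
P = np.array([55., 110., 220., 450., 930.])      # object perimeters (px)

slope, intercept = np.polyfit(np.log(A), np.log(P), 1)
D = 2 * slope
print(f"estimated fractal dimension D = {D:.2f}")
```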
Quantitative ptychographic reconstruction by applying a probe constraint
NASA Astrophysics Data System (ADS)
Reinhardt, J.; Schroer, C. G.
2018-04-01
The coherent scanning technique X-ray ptychography has become a routine tool for high-resolution imaging and nanoanalysis in various fields of research such as chemistry, biology, and materials science. Often the ptychographic reconstruction results are analysed to yield absolute quantitative values for the object transmission and the illuminating probe function. In this work, we address a common ambiguity encountered in scaling the object transmission and probe intensity by applying an additional constraint in the reconstruction algorithm. A ptychographic measurement of a model sample containing nanoparticles is used as a test data set against which to benchmark the reconstruction results obtained with each type of constraint. Achieving quantitative absolute values for the reconstructed object transmission is essential for advanced investigation of samples that change over time, e.g., during in situ experiments, or in general when different data sets are compared.
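A toy illustration of the scaling ambiguity the abstract refers to: the exit wave O·P is unchanged under O → O/c, P → c·P, so diffraction data alone cannot fix absolute transmission. Renormalising the probe to a known total incident intensity is one possible constraint; the authors' exact constraint may differ.

```python
# Toy object/probe scaling ambiguity in ptychography, removed by fixing
# the probe's total intensity. One possible constraint, not necessarily
# the paper's exact one.
import numpy as np

rng = np.random.default_rng(3)
obj = rng.uniform(0.5, 1.0, (8, 8))        # "true" object transmission
probe = rng.uniform(0.0, 1.0, (8, 8))      # "true" probe amplitude
c = 2.7                                    # arbitrary reconstruction scaling

obj_rec, probe_rec = obj / c, probe * c    # fits the data equally well
assert np.allclose(obj * probe, obj_rec * probe_rec)

# Constraint: rescale so the probe carries the known incident intensity.
I_incident = np.sum(np.abs(probe) ** 2)
s = np.sqrt(I_incident / np.sum(np.abs(probe_rec) ** 2))
probe_fixed, obj_fixed = probe_rec * s, obj_rec / s
print(np.allclose(obj_fixed, obj), np.allclose(probe_fixed, probe))
```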
40 CFR 35.102 - Definitions of terms.
Code of Federal Regulations, 2010 CFR
2010-07-01
... that is related to an environmental or programmatic goal or objective. Outcomes must be quantitative... will be produced or provided over a period of time or by a specified date. Outputs may be quantitative...
A Workstation for Interactive Display and Quantitative Analysis of 3-D and 4-D Biomedical Images
Robb, R.A.; Heffeman, P.B.; Camp, J.J.; Hanson, D.P.
1986-01-01
The capability to extract objective and quantitatively accurate information from 3-D radiographic biomedical images has not kept pace with the capability to produce the images themselves. This is an ironic paradox: the new 3-D and 4-D imaging capabilities promise greater specificity and sensitivity (i.e., precise objective discrimination and accurate quantitative measurement of body tissue characteristics and function) in clinical diagnostic and basic investigative imaging procedures than ever before possible, yet the momentous advances in computer and electronic imaging technology that made these capabilities possible have not been developed concomitantly for their full exploitation. Therefore, we have developed a powerful new microcomputer-based system which permits detailed investigation and evaluation of 3-D and 4-D (dynamic 3-D) biomedical images. The system comprises a special workstation through which all the information in a large 3-D image data base is accessible for rapid display, manipulation, and measurement. The system provides important capabilities for simultaneously representing and analyzing both structural and functional data and their relationships in various organs of the body. This paper provides a detailed description of this system, as well as some of the rationale, background, theoretical concepts, and practical considerations related to system implementation.
Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve; Constantino, John; Povinelli, Daniel; Pruett, John R.
2011-01-01
Objective Comparative studies of social responsiveness, an ability that is impaired in autistic spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species (human-chimpanzee) social responsiveness measure. Method We translated the Social Responsiveness Scale (SRS), an instrument that quantifies human social responsiveness, into an analogous instrument for chimpanzees. We then retranslated this "Chimp SRS" into a human "Cross-Species SRS" (XSRS). We evaluated three groups of chimpanzees (n=29) with the Chimp SRS and typical and autistic spectrum disorder (ASD) human children (n=20) with the XSRS. Results The Chimp SRS demonstrated strong inter-rater reliability at the three sites (ranges for individual ICCs: .534–.866 and mean ICCs: .851–.970). As has been observed in humans, exploratory principal components analysis of Chimp SRS scores supports a single factor underlying chimpanzee social responsiveness. Human subjects' XSRS scores were fully concordant with their SRS scores (r=.976, p=.001) and distinguished appropriately between typical and ASD subjects. One chimpanzee known for inappropriate social behavior displayed a significantly higher score than all other chimpanzees at its site, demonstrating the scale's ability to detect impaired social responsiveness in chimpanzees. Conclusion Our initial cross-species social responsiveness scale proved reliable and discriminated differences in social responsiveness across (in a relative sense) and within (in a more objectively quantifiable manner) humans and chimpanzees. PMID:21515200
Ernst, Marielle; Kriston, Levente; Romero, Javier M; Frölich, Andreas M; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik
2016-01-01
We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology through the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and skills lab performance of interventionalists. We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. In multivariate analysis, knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Our study illustrates what an integrated curriculum for rigorous and cost-effective assessment of the key competencies of an interventional neuroradiologist might look like. In addition to traditional assessment of theoretical knowledge, practical skills are measured through endovascular simulators, yielding objective, quantitative, and constructive data for evaluating both the current performance status of participants and the evolution of their technical competency over time.
NASA Astrophysics Data System (ADS)
Vidovič, Luka; Milanič, Matija; Majaron, Boris
2015-01-01
Pulsed photothermal radiometry (PPTR) allows noninvasive determination of laser-induced temperature depth profiles in optically scattering layered structures. The obtained profiles provide information on spatial distribution of selected chromophores such as melanin and hemoglobin in human skin. We apply the described approach to study time evolution of incidental bruises (hematomas) in human subjects. By combining numerical simulations of laser energy deposition in bruised skin with objective fitting of the predicted and measured PPTR signals, we can quantitatively characterize the key processes involved in bruise evolution (i.e., hemoglobin mass diffusion and biochemical decomposition). Simultaneous analysis of PPTR signals obtained at various times post injury provides an insight into the variations of these parameters during the bruise healing process. The presented methodology and results advance our understanding of the bruise evolution and represent an important step toward development of an objective technique for age determination of traumatic bruises in forensic medicine.
Fuzzy Logic Approaches to Multi-Objective Decision-Making in Aerospace Applications
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1994-01-01
Fuzzy logic allows for the quantitative representation of multi-objective decision-making problems which have vague or fuzzy objectives and parameters. As such, fuzzy logic approaches are well-suited to situations where alternatives must be assessed by using criteria that are subjective and of unequal importance. This paper presents an overview of fuzzy logic and provides sample applications from the aerospace industry. Applications include an evaluation of vendor proposals, an analysis of future space vehicle options, and the selection of a future space propulsion system. On the basis of the results provided in this study, fuzzy logic provides a unique perspective on the decision-making process, allowing the evaluator to assess the degree to which each option meets the evaluation criteria. Future decision-making should take full advantage of fuzzy logic methods to complement existing approaches in the selection of alternatives.
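A minimal sketch of fuzzy multi-objective scoring in the spirit described: map each alternative's raw criterion values to [0, 1] memberships and aggregate with importance weights. The membership functions, criteria, weights, and alternative names are illustrative assumptions, not the paper's actual evaluation.

```python
# Minimal fuzzy multi-objective scoring sketch: linear membership
# functions plus weighted aggregation. Illustrative only; the paper's
# membership functions and aggregation rules may differ.
import numpy as np

def membership(x, worst, best):
    """Linear membership: 0 at `worst`, 1 at `best` (either direction)."""
    return np.clip((x - worst) / (best - worst), 0.0, 1.0)

# criteria: cost ($M, lower better), thrust (kN, higher better),
#           risk (1-10 scale, lower better) -- hypothetical values
alternatives = {
    "vehicle A": (120, 900, 4),
    "vehicle B": (90, 700, 6),
    "vehicle C": (150, 1100, 3),
}
weights = np.array([0.5, 0.3, 0.2])    # relative importance of criteria

for name, (cost, thrust, risk) in alternatives.items():
    m = np.array([membership(cost, 160, 80),
                  membership(thrust, 600, 1200),
                  membership(risk, 10, 1)])
    print(name, round(float(weights @ m), 3))
```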
Kim, Hyun Joo; Bang, Ji-In; Kim, Ji-Young; Moon, Jae Hoon; So, Young
2017-01-01
Objective Since Graves' disease (GD) is resistant to antithyroid drugs (ATDs), an accurate quantitative measurement of thyroid function is required for predicting early responses to ATDs. Quantitative parameters derived from the novel technology single-photon emission computed tomography/computed tomography (SPECT/CT) were investigated for predicting the achievement of euthyroidism after methimazole (MMI) treatment in GD. Materials and Methods A total of 36 GD patients (10 males, 26 females; mean age, 45.3 ± 13.8 years) were enrolled in this study from April 2015 to January 2016. They underwent quantitative thyroid SPECT/CT 20 minutes post-injection of 99mTc-pertechnetate (5 mCi). Associations between the time to biochemical euthyroidism after MMI treatment and %uptake, standardized uptake value (SUV), functional thyroid mass (SUVmean × thyroid volume) from the SPECT/CT, and clinical/biochemical variables were investigated. Results GD patients had a significantly greater %uptake (6.9 ± 6.4%) than historical control euthyroid patients (n = 20, 0.8 ± 0.5%, p < 0.001) from the same quantitative SPECT/CT protocol. Euthyroidism was achieved in 14 patients at 156 ± 62 days post-MMI treatment, but 22 patients had still not achieved euthyroidism by the last follow-up time-point (208 ± 80 days). In the univariate Cox regression analysis, the initial MMI dose (p = 0.014), %uptake (p = 0.015), and functional thyroid mass (p = 0.016) were significant predictors of euthyroidism in response to MMI treatment. However, only %uptake remained significant in a multivariate Cox regression analysis (p = 0.034). A %uptake cutoff of 5.0% distinguished the faster-responding from the slower-responding GD patients (p = 0.006). Conclusion The novel parameter of thyroid %uptake from quantitative SPECT/CT is a predictive indicator of an early response to MMI in GD patients. PMID:28458607
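A hypothetical sketch of the Cox regression used here: time to euthyroidism with censoring for patients not yet euthyroid, %uptake as a covariate. Uses the lifelines package; the column names and data are assumptions, not the study's records.

```python
# Hypothetical Cox proportional hazards sketch relating %uptake to time
# to euthyroidism, with censoring. Column names and data are illustrative.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days_to_euthyroid": [150, 208, 120, 190, 250, 95],   # follow-up time
    "euthyroid": [1, 0, 1, 0, 0, 1],                      # 1 = achieved
    "pct_uptake": [3.2, 8.1, 2.5, 9.4, 12.0, 4.0],
    "mmi_dose": [10, 20, 10, 30, 30, 15],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_euthyroid", event_col="euthyroid")
cph.print_summary()
```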
Andrzejak, Ralph G.; Hauf, Martinus; Pollo, Claudio; Müller, Markus; Weisstanner, Christian; Wiest, Roland; Schindler, Kaspar
2015-01-01
Background Epilepsy surgery is a potentially curative treatment option for pharmacoresistant patients. If non-invasive methods alone do not allow delineation of the epileptogenic brain areas, the surgical candidates undergo long-term monitoring with intracranial EEG. Visual EEG analysis is then used to identify the seizure onset zone for targeted resection as a standard procedure. Methods Despite its great potential to assess the epileptogenicity of brain tissue, quantitative EEG analysis has not yet found its way into routine clinical practice. To demonstrate that quantitative EEG may yield clinically highly relevant information, we retrospectively investigated how post-operative seizure control is associated with four selected EEG measures evaluated in the resected brain tissue and the seizure onset zone. Importantly, the exact spatial location of the intracranial electrodes was determined by coregistration of pre-operative MRI and post-implantation CT, and coregistration with post-resection MRI was used to delineate the extent of tissue resection. Using data-driven thresholding, quantitative EEG results were separated into normally contributing and salient channels. Results In patients with favorable post-surgical seizure control, a significantly larger fraction of salient channels in three of the four quantitative EEG measures was resected than in patients with unfavorable outcome in terms of seizure control (median over the whole peri-ictal recordings). The same statistics revealed no association with post-operative seizure control when EEG channels contributing to the seizure onset zone were studied. Conclusions We conclude that quantitative EEG measures provide clinically relevant and objective markers of target tissue, which may be used to optimize epilepsy surgery. The finding that differentiation between favorable and unfavorable outcome was better for the fraction of salient values in the resected brain tissue than in the seizure onset zone is consistent with growing evidence that spatially extended networks might be more relevant for seizure generation, evolution, and termination than a single highly localized brain region (i.e., a "focus") where seizures start. PMID:26513359
Saving Educational Dollars through Quality Objectives.
ERIC Educational Resources Information Center
Alvir, Howard P.
This document is a collection of working papers written to meet the specific needs of teachers who are starting to think about and write performance objectives. It emphasizes qualitative objectives as opposed to quantitative classroom goals. The author describes quality objectives as marked by their clarity, accessibility, accountability, and…
Mendoza, Maria C.B.; Burns, Trudy L.; Jones, Michael P.
2009-01-01
Objectives Case-deletion diagnostic methods are tools that allow identification of influential observations that may affect parameter estimates and model fitting conclusions. The goal of this paper was to develop two case-deletion diagnostics, the exact case deletion (ECD) and the empirical influence function (EIF), for detecting outliers that can affect results of sib-pair maximum likelihood quantitative trait locus (QTL) linkage analysis. Methods Subroutines to compute the ECD and EIF were incorporated into the maximum likelihood QTL variance estimation components of the linkage analysis program MAPMAKER/SIBS. Performance of the diagnostics was compared in simulation studies that evaluated the proportion of outliers correctly identified (sensitivity), and the proportion of non-outliers correctly identified (specificity). Results Simulations involving nuclear family data sets with one outlier showed EIF sensitivities approximated ECD sensitivities well for outlier-affected parameters. Sensitivities were high, indicating the outlier was identified a high proportion of the time. Simulations also showed the enormous computational time advantage of the EIF. Diagnostics applied to body mass index in nuclear families detected observations influential on the lod score and model parameter estimates. Conclusions The EIF is a practical diagnostic tool that has the advantages of high sensitivity and quick computation. PMID:19172086
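A toy contrast between the two diagnostics' logic, shown for the sample mean rather than the QTL likelihood the paper targets: exact case deletion refits without observation i, while the influence-function approach approximates that change in one step. For the mean the two coincide exactly; in nonlinear likelihood models the EIF is only an approximation, which is where its speed advantage matters.

```python
# Toy case-deletion diagnostics on the sample mean: exact deletion (ECD)
# vs. an influence-function one-step approximation (EIF). Only the
# underlying principle of the paper's QTL diagnostics, not its model.
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(0, 1, 200)
x[10] = 8.0                                  # planted outlier
n, mean = len(x), x.mean()

# exact: change in the estimate when each observation is deleted
ecd = np.array([mean - np.delete(x, i).mean() for i in range(n)])
# approximation: empirical influence of each observation
eif = (x - mean) / (n - 1)                   # exact for the mean

print("max |ECD - EIF|:", np.abs(ecd - eif).max())
print("most influential obs:", int(np.argmax(np.abs(eif))))
```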
Effects of complex aural stimuli on mental performance.
Vij, Mohit; Aghazadeh, Fereydoun; Ray, Thomas G; Hatipkarasulu, Selen
2003-06-01
The objective of this study is to investigate the effect of complex aural stimuli on mental performance. A series of experiments was designed to obtain data for two different analyses. The first analysis is a "stimulus" versus "no-stimulus" comparison for each of the four dependent variables, i.e., the quantitative ability, reasoning ability, spatial ability, and memory of an individual, comparing the control treatment with the rest of the treatments. The second analysis is a multivariate analysis of variance for component-level main effects and interactions. The two component factors are the tempo of the complex aural stimuli and the sound volume level, each administered at three discrete levels for all four dependent variables. Ten experiments were conducted on eleven subjects. It was found that complex aural stimuli influence quantitative and spatial abilities, while reasoning ability was unaffected by the stimuli. Although memory showed a trend toward being worse in the presence of complex aural stimuli, the effect was statistically insignificant. Variation in the tempo and sound volume level of an aural stimulus did not significantly affect the mental performance of an individual. The results of these experiments can be effectively used in designing work environments.
Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T
2016-01-01
The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms.
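A sketch of the kind of image analysis described: segment stained biofilm from a photograph with an automatic threshold, then report covered-area fraction and mean stain intensity. The published algorithm may differ in detail; the file name is hypothetical and the Otsu threshold is an assumed choice.

```python
# Sketch of stained-biofilm quantification from a photograph: Otsu
# threshold, then area fraction and mean intensity of stained pixels.
# Assumed approach, not necessarily the paper's exact algorithm.
import numpy as np
from skimage import io, color, filters

def biofilm_coverage(path):
    gray = color.rgb2gray(io.imread(path))   # stain appears dark/intense
    t = filters.threshold_otsu(gray)
    mask = gray < t                          # stained (darker) pixels
    return mask.mean(), gray[mask].mean()    # area fraction, mean intensity

# frac, intensity = biofilm_coverage("coupon_day3.png")  # hypothetical file
```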
de los Reyes-Guzmán, Ana; Dimbwadyo-Terrer, Iris; Trincado-Alonso, Fernando; Monasterio-Huelin, Félix; Torricelli, Diego; Gil-Agudo, Angel
2014-08-01
Quantitative measures of human movement quality are important for discriminating healthy and pathological conditions and for expressing the outcomes and clinically important changes in subjects' functional state. However, the instruments most frequently used for upper extremity functional assessment are clinical scales which, although standardized and validated, have a high subjective component that depends on the observer who scores the test. They are not sufficient to assess the motor strategies used during movements, and their use in combination with other, more objective measures is necessary. The objective of the present review is to provide an overview of objective metrics found in the literature that quantify upper extremity performance during functional tasks, regardless of the equipment or system used for registering kinematic data. A search of the Medline, Google Scholar, and IEEE Xplore databases was performed using a combination of keywords. Full scientific papers that fulfilled the inclusion criteria were included in the review. A set of kinematic metrics was found in the literature relating to joint displacements, analysis of hand trajectories, and velocity profiles. These metrics were classified into different categories according to the movement characteristic being measured. They provide the starting point for proposed objective metrics for the functional assessment of the upper extremity in people with movement disorders resulting from neurological injuries. Potential areas of future and further research are presented in the Discussion section.
TH-A-207B-00: Shear-Wave Imaging and a QIBA US Biomarker Update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Shigao; Garra, Brian
Imaging of tissue elastic properties is a relatively new and powerful approach to one of the oldest and most important diagnostic tools. Imaging of shear wave speed with ultrasound has been added to most high-end ultrasound systems. Understanding this exciting imaging mode and aiding its most effective use in medicine can be a rewarding effort for medical physicists and other medical imaging and treatment professionals. Assuring consistent, quantitative measurements across the many ultrasound systems in a typical imaging department will constitute a major step toward realizing the great potential of this technique and other quantitative imaging. This session will target these two goals with two presentations.
A. Basics and Current Implementations of Ultrasound Imaging of Shear Wave Speed and Elasticity - Shigao Chen, Ph.D. Learning objectives, to understand: the importance of tissue elasticity measurement; strain versus shear wave elastography (SWE) and the beneficial features of SWE; the link between shear wave speed and material properties, including the influence of viscosity; generation of shear waves by external vibration (Fibroscan) and by ultrasound radiation force (point push; supersonic push, Aixplorer; comb push, GE Logiq E9); detection of shear waves via motion detection from pulse-echo ultrasound, the importance of frame rate for shear wave imaging, plane wave imaging detection, and how to achieve a high effective frame rate using line-by-line scanners; shear wave speed calculation by time to peak, random sample consensus (RANSAC), and cross correlation; and sources of bias and variation in SWE, including tissue viscosity, transducer compression or internal pressure of the organ, and reflection of shear waves at boundaries.
B. Elasticity Imaging System Biomarker Qualification and User Testing of Systems - Brian Garra, M.D. Learning objectives, to understand: the need for quantitative medical imaging and examples of quantitative imaging biomarkers; the purpose of the RSNA Quantitative Imaging Biomarker Alliance (QIBA) and the need for such an organization; the QIBA process for creating a quantitative biomarker; and the steps needed to verify adherence of site, operators, and imaging systems to a QIBA profile. Underlying premise: objective, quantifiable results are needed to enhance the value of diagnostic imaging in clinical practice, because evidence-based medicine requires objective rather than subjective observer data, computerized decision support tools (e.g., CAD) generally require quantitative input, and quantitative, reproducible measures are more easily used to develop personalized molecular medical diagnostic and treatment systems. Topics include the QIBA formation (2008), mission, and structure; example imaging biomarkers being explored; biomarker selection, groundwork, draft protocols for imaging and data evaluation, and QIBA profile drafting; equipment and site validation (technical and clinical); and site and equipment QA and compliance checking. For the ultrasound shear wave speed (SWS) biomarker for liver fibrosis, the presentation covers background, current status and problems, the biomarker selection process and outcome, literature search and analysis results, Phase I phantom testing with elastic phantoms, Phase II phantom testing with viscoelastic phantoms, digital simulated data, protocol drafting based on UPICT and existing literature and standards-body protocols, the profile's current claims and manufacturer-specific appendices, profile validation (technical and clinical), QA and compliance approaches (site operator testing, site protocol re-evaluation, manufacturer testing and attestation, user acceptance testing and periodic QA, phantom tests, digital-phantom-based testing, standard QA testing, and remediation schemes), and profile evolution toward additional applications and higher accuracy and precision. Supported in part by NIH contract HHSN268201300071C from NIBIB. Collaboration with GE Global Research, no personal support. S. Chen: some technologies described in this presentation have been licensed; Mayo Clinic and Dr. Chen have financial interests in these technologies.
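A sketch of the "time to peak" shear wave speed estimate named in the session outline: find when displacement peaks at each lateral position, then fit arrival time versus distance; the inverse slope is the speed. The synthetic pulse and parameters are assumptions for illustration.

```python
# Time-to-peak shear wave speed sketch: fit arrival time vs. lateral
# distance; inverse slope gives the speed. Synthetic data, illustrative.
import numpy as np

fs = 10_000                                  # tracking frame rate (Hz)
x = np.arange(0, 0.02, 0.001)                # lateral positions (m)
c_true = 2.0                                 # true shear wave speed (m/s)
t = np.arange(0, 0.02, 1 / fs)

# displacement at each position: Gaussian pulse arriving at x / c_true
disp = np.exp(-((t[None, :] - x[:, None] / c_true) ** 2) / (2 * 0.001 ** 2))

t_peak = t[np.argmax(disp, axis=1)]          # time to peak per position
slope = np.polyfit(x, t_peak, 1)[0]          # seconds per meter
print(f"estimated SWS = {1 / slope:.2f} m/s")
```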
Daher, Céline; Pimenta, Vanessa; Bellot-Gurlet, Ludovic
2014-11-01
The compositions of ancient varnishes are mainly determined destructively by separation methods coupled to mass spectrometry. In this study, a methodology for non-invasive quantitative analysis of varnishes by vibrational spectroscopies is proposed. Experimental simplified varnishes of colophony and linseed oil were prepared according to 18th-century traditional recipes with increasing colophony/linseed oil mass concentration ratios. FT-Raman and IR analyses using ATR and non-invasive reflectance modes were performed on the "pure" materials and on the different mixtures. Then, a new approach involving spectral decomposition was developed, treating each mixture spectrum as a linear combination of the pure-material spectra and yielding the relative amount of each component. Specific spectral regions were treated, and the results show good agreement between the prepared and calculated amounts of the two compounds. We were thus able to detect and quantify from 10% to 50% of colophony in linseed oil using non-invasive techniques that can also be applied in situ with portable instruments to museum varnished objects and artifacts.
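A sketch of the spectral-decomposition idea: model a mixture spectrum as a non-negative linear combination of the pure-component spectra and read relative amounts from the coefficients. The paper's fitting scheme may differ; this version uses non-negative least squares on synthetic Gaussian bands.

```python
# Spectral unmixing sketch: mixture spectrum as a non-negative linear
# combination of pure spectra. Synthetic bands; the paper's fitting
# scheme may differ.
import numpy as np
from scipy.optimize import nnls

wavenumbers = np.linspace(600, 1800, 400)

def band(center, width):                     # toy Gaussian band
    return np.exp(-((wavenumbers - center) ** 2) / (2 * width ** 2))

colophony = band(1695, 30) + 0.5 * band(1450, 25)   # synthetic pure spectra
linseed = band(1745, 20) + 0.7 * band(1160, 35)

mixture = 0.3 * colophony + 0.7 * linseed           # "measured" mixture
A = np.column_stack([colophony, linseed])
coef, residual = nnls(A, mixture)
print("relative amounts:", coef / coef.sum())       # ~ [0.3, 0.7]
```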
Ranacher, Peter; Tzavella, Katerina
2014-05-27
In geographic information science, a plethora of different approaches and methods is used to assess the similarity of movement. Some of these approaches term two moving objects similar if they share akin paths. Others require objects to move at similar speed and yet others consider movement similar if it occurs at the same time. We believe that a structured and comprehensive classification of movement comparison measures is missing. We argue that such a classification not only depicts the status quo of qualitative and quantitative movement analysis, but also allows for identifying those aspects of movement for which similarity measures are scarce or entirely missing. In this review paper we, first, decompose movement into its spatial, temporal, and spatiotemporal movement parameters. A movement parameter is a physical quantity of movement, such as speed, spatial path, or temporal duration. For each of these parameters we then review qualitative and quantitative methods of how to compare movement. Thus, we provide a systematic and comprehensive classification of different movement similarity measures used in geographic information science. This classification is a valuable first step toward a GIS toolbox comprising all relevant movement comparison methods.
Tracking moving targets behind a scattering medium via speckle correlation.
Guo, Chengfei; Liu, Jietao; Wu, Tengfei; Zhu, Lei; Shao, Xiaopeng
2018-02-01
Tracking moving targets behind a scattering medium is a challenge with many important applications in various fields. Owing to multiple scattering, the camera records only a random speckle pattern rather than an image of the object when light passes through highly scattering layers. Importantly, speckle patterns retain a correlation structure from which target information can be derived. In this work, inspired by notions used in computer vision and deformation detection, we demonstrate through simulations and experiments a simple object tracking method in which speckle correlation is used to track the movement of a hidden object in the lateral and axial directions. In addition, the rotation state of the moving target can be recognized from the autocorrelation of the speckle pattern. This work will benefit biomedical applications in the quantitative analysis of the working mechanisms of micro-objects and the acquisition of dynamical information about micro-object motion.
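A sketch of the lateral-tracking ingredient: the cross-correlation peak between two speckle frames gives the in-plane shift, since small target motion leaves the pattern largely similar (the memory effect). This illustrates only the correlation step; the paper's full method also handles axial motion and rotation.

```python
# Estimate the lateral shift between two speckle frames from the
# FFT cross-correlation peak. Illustrative correlation step only.
import numpy as np

def shift_from_correlation(f, g):
    """Integer (dy, dx) shift of f relative to g via FFT correlation."""
    F, G = np.fft.fft2(f), np.fft.fft2(g)
    corr = np.fft.ifft2(F * np.conj(G)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    n, m = f.shape                           # wrap to signed shifts
    return ((dy + n // 2) % n - n // 2), ((dx + m // 2) % m - m // 2)

rng = np.random.default_rng(5)
speckle = rng.random((256, 256))
moved = np.roll(speckle, shift=(7, -12), axis=(0, 1))
print(shift_from_correlation(moved, speckle))        # ~ (7, -12)
```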
Returning to Work after Cancer: Quantitative Studies and Prototypical Narratives
Steiner, John F.; Nowels, Carolyn T.; Main, Deborah S.
2009-01-01
Objective A combination of quantitative data and illustrative narratives may allow cancer survivorship researchers to disseminate their research findings more broadly. We identified recent, methodologically rigorous quantitative studies on return to work after cancer, summarized the themes from these studies, and illustrated those themes with narratives of individual cancer survivors. Methods We reviewed English-language studies of return to work for adult cancer survivors through June, 2008, and identified 13 general themes from papers that met methodological criteria (population-based sampling, prospective and longitudinal assessment, detailed assessment of work, evaluation of economic impact, assessment of moderators of work return, and large sample size). We drew survivorship narratives from a prior qualitative research study to illustrate these themes. Results Nine quantitative studies met 4 or more of our 6 methodological criteria. These studies suggested that most cancer survivors could return to work without residual disabilities. Cancer site, clinical prognosis, treatment modalities, socioeconomic status, and attributes of the job itself influenced the likelihood of work return. Three narratives - a typical survivor who returned to work after treatment, an individual unable to return to work, and an inspiring survivor who returned to work despite substantial barriers - illustrated many of the themes from the quantitative literature while providing additional contextual details. Conclusion Illustrative narratives can complement the findings of cancer survivorship research if researchers are rigorous and transparent in the selection, analysis, and retelling of those stories. PMID:19507264
McCord, Layne K; Scarfe, William C; Naylor, Rachel H; Scheetz, James P; Silveira, Anibal; Gillespie, Kevin R
2007-05-01
The objectives of this study were to assess the effect of JPEG 2000 compression of hand-wrist radiographs on observers' qualitative image quality ratings and to compare these with a software-derived quantitative image quality index. Fifteen hand-wrist radiographs were digitized and saved as TIFF and JPEG 2000 images at 4 levels of compression (20:1, 40:1, 60:1, and 80:1). The images, including rereads, were viewed by 13 orthodontic residents who rated image quality on a scale of 1 to 5. A quantitative analysis was also performed using readily available software based on the human visual system (Image Quality Measure Computer Program, version 6.2, Mitre, Bedford, Mass). ANOVA was used to determine the optimal compression level (P ≤ .05). On the subjective index, JPEG compression greater than 60:1 significantly reduced image quality. On the quantitative index, the JPEG 2000 images had lower quality at all compression ratios than the original TIFF images. There was excellent correlation (R² > 0.92) between the qualitative and quantitative indexes. Image Quality Measure indexes are more sensitive than subjective image quality assessments in quantifying image degradation with compression. This software-based quantitative method has potential for determining the optimal compression ratio for any image without the use of subjective raters.
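The study used a specific HVS-based Image Quality Measure program; as a generic stand-in for a full-reference quality computation against the uncompressed original, a simple PSNR sketch is shown below. PSNR is only an assumed substitute metric, not the study's index.

```python
# Generic full-reference quality computation; PSNR is a simple stand-in
# for the HVS-based Image Quality Measure index used in the study.
import numpy as np

def psnr(reference, test, max_val=255.0):
    """Peak signal-to-noise ratio (dB) of `test` against `reference`."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(max_val ** 2 / mse)

rng = np.random.default_rng(6)
original = rng.integers(0, 256, (512, 512))            # stands in for TIFF
compressed = np.clip(original + rng.normal(0, 5, original.shape), 0, 255)
print(f"PSNR = {psnr(original, compressed):.1f} dB")
```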
[Acoustic and aerodynamic characteristics of the oesophageal voice].
Vázquez de la Iglesia, F; Fernández González, S
2005-12-01
The aim of the study is to determine the physiology and pathophysiology of esophageal voice according to objective aerodynamic and acoustic parameters (quantitative and qualitative). Our subjects comprised 33 laryngectomized patients (all male) who underwent an aerodynamic, acoustic, and perceptual protocol. There is a statistical association between qualitative acoustic and aerodynamic parameters (phonation flow chart type, sound spectrum, perceptual analysis) and quantitative parameters (neoglottic pressure, phonation flow, phonation time, fundamental frequency, maximum sound intensity level, speech rate). Nevertheless, such observations do not always bring practical resources to clinical practice. We consider that the findings may enable us to add, pragmatically, new resources for more effective vocal rehabilitation of these patients. The physiology of esophageal voice is well characterized by the method we have applied, which also serves rehabilitation by improving oral communication skills in the laryngectomee population.
Watkins, Daphne C.; Wharton, Tracy; Mitchell, Jamie A.; Matusko, Niki; Kales, Helen
2016-01-01
The purpose of this study was to explore the role of non-spousal family support on mental health among older, church-going African American men. The mixed methods objective was to employ a design that used existing qualitative and quantitative data to explore the interpretive context within which social and cultural experiences occur. Qualitative data (n=21) were used to build a conceptual model that was tested using quantitative data (n= 401). Confirmatory factor analysis indicated an inverse association between non-spousal family support and distress. The comparative fit index, Tucker-Lewis fit index, and root mean square error of approximation indicated good model fit. This study offers unique methodological approaches to using existing, complementary data sources to understand the health of African American men. PMID:28943829
The cutting edge - Micro-CT for quantitative toolmark analysis of sharp force trauma to bone.
Norman, D G; Watson, D G; Burnett, B; Fenne, P M; Williams, M A
2018-02-01
Toolmark analysis involves examining marks created on an object to identify the likely tool responsible for creating those marks (e.g., a knife). Although a potentially powerful forensic tool, knife mark analysis is still in its infancy, and the validation of imaging techniques as well as quantitative approaches is ongoing. This study builds on previous work by simulating real-world stabbings experimentally and statistically exploring quantitative toolmark properties, such as cut mark angle captured by micro-CT imaging, to predict the knife responsible. In Experiment 1, a mechanical stab rig and two knives were used to create 14 knife cut marks on dry pig ribs. The toolmarks were laser and micro-CT scanned to allow quantitative measurement of numerous toolmark properties. The findings from Experiment 1 demonstrated that the two knives produced statistically different cut mark widths, wall angles, and shapes. Experiment 2 examined knife marks created on fleshed pig torsos under conditions designed to better simulate real-world stabbings. Eight knives were used to generate 64 incision cut marks that were also micro-CT scanned. Statistical exploration of these cut marks suggested that knife type, serrated or plain, can be predicted from cut mark width and wall angle, and preliminary results further suggest that knife edge thickness correlates with cut mark width. An additional 16 cut mark walls were imaged for striation marks using scanning electron microscopy, with results suggesting that this approach might not be useful for knife mark analysis. Results also indicated that observer judgements of cut mark shape were more consistent when rated from micro-CT images than from light microscopy images. The potential to combine micro-CT data, medical-grade CT data, and photographs to develop highly realistic virtual models for visualisation and 3D printing is also demonstrated. This is the first study to statistically explore simulated real-world knife marks imaged by micro-CT, demonstrating the potential of quantitative approaches in knife mark analysis. The findings and methods presented are relevant to both forensic toolmark researchers and practitioners. Limitations of the experimental methodologies and imaging techniques are discussed, and further work is recommended.
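An illustrative classifier in the spirit of the study's finding that knife type (serrated versus plain) can be predicted from cut mark width and wall angle. The feature values are synthetic assumptions, not the study's micro-CT measurements.

```python
# Illustrative serrated-vs-plain classifier from two toolmark features.
# Synthetic feature values, not the study's measurements.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 64
serrated = rng.integers(0, 2, n)                        # 1 = serrated
width = 0.4 + 0.15 * serrated + rng.normal(0, 0.05, n)  # mark width (mm)
angle = 55 + 10 * serrated + rng.normal(0, 4, n)        # wall angle (deg)
X = np.column_stack([width, angle])

clf = LogisticRegression()
print("CV accuracy:", cross_val_score(clf, X, serrated, cv=5).mean())
```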
Kamilari, Eleni; Farsalinos, Konstantinos; Poulas, Konstantinos; Kontoyannis, Christos G; Orkoula, Malvina G
2018-06-01
Electronic cigarettes are considered healthier alternatives to conventional cigarettes containing tobacco. They produce vapor by heating refill liquids (e-liquids), which consist of propylene glycol, vegetable glycerin, nicotine (in various concentrations), water, and flavoring agents. Heavy metals may enter the refill liquid during production, posing a risk to consumers' health due to their toxicity. The objective of the present study was the development of a methodology for the detection and quantitative analysis of cadmium (Cd), lead (Pb), nickel (Ni), copper (Cu), arsenic (As), and chromium (Cr), employing Total Reflection X-Ray Fluorescence Spectroscopy (TXRF) as an alternative to the ICP-MS or ICP-OES techniques commonly used for this type of analysis. TXRF was chosen for its advantages, which include short analysis time, simultaneous multi-element analysis capability, minimal sample preparation, and low purchase and operational cost. The proposed methodology was applied to a large number of commercially available electronic cigarette liquids, as well as their constituents, in order to evaluate their safety. TXRF may be a valuable tool for probing heavy metals in electronic cigarette refill liquids, serving the protection of human health.
A Quantitative Technique for Beginning Microscopists.
ERIC Educational Resources Information Center
Sundberg, Marshall D.
1984-01-01
Stereology is the study of three-dimensional objects through the interpretation of two-dimensional images. Stereological techniques used in introductory botany to quantitatively examine changes in leaf anatomy in response to different environments are discussed. (JN)
Automatic and Objective Assessment of Alternating Tapping Performance in Parkinson's Disease
Memedi, Mevludin; Khan, Taha; Grenholm, Peter; Nyholm, Dag; Westin, Jerker
2013-01-01
This paper presents the development and evaluation of a method for enabling quantitative and automatic scoring of alternating tapping performance of patients with Parkinson's disease (PD). Ten healthy elderly subjects and 95 patients in different clinical stages of PD have utilized a touch-pad handheld computer to perform alternate tapping tests in their home environments. First, a neurologist used a web-based system to visually assess impairments in four tapping dimensions (‘speed’, ‘accuracy’, ‘fatigue’ and ‘arrhythmia’) and a global tapping severity (GTS). Second, tapping signals were processed with time series analysis and statistical methods to derive 24 quantitative parameters. Third, principal component analysis was used to reduce the dimensions of these parameters and to obtain scores for the four dimensions. Finally, a logistic regression classifier was trained using a 10-fold stratified cross-validation to map the reduced parameters to the corresponding visually assessed GTS scores. Results showed that the computed scores correlated well to visually assessed scores and were significantly different across Unified Parkinson's Disease Rating Scale scores of upper limb motor performance. In addition, they had good internal consistency, had good ability to discriminate between healthy elderly and patients in different disease stages, had good sensitivity to treatment interventions and could reflect the natural disease progression over time. In conclusion, the automatic method can be useful to objectively assess the tapping performance of PD patients and can be included in telemedicine tools for remote monitoring of tapping. PMID:24351667
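A sketch mirroring the paper's analysis pipeline: reduce the 24 tapping parameters with principal component analysis, then map the reduced scores to the visually assessed GTS classes with logistic regression under 10-fold stratified cross-validation. The data are synthetic, and the standardization step and component count are assumptions.

```python
# PCA + logistic regression pipeline with 10-fold stratified CV, in the
# spirit of the paper's steps 2-4. Synthetic data; preprocessing details
# (scaling, number of components) are assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(8)
X = rng.normal(size=(105, 24))                 # 24 tapping parameters
y = rng.integers(0, 3, 105)                    # GTS category per test

model = make_pipeline(StandardScaler(), PCA(n_components=4),
                      LogisticRegression(max_iter=1000))
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
print("CV accuracy:", cross_val_score(model, X, y, cv=cv).mean())
```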
Rho, Nark Kyoung; Park, Je Young; Youn, Choon Shik; Lee, Soo-Keun; Kim, Hei Sung
2017-02-01
Quantitative measurements are important for objective evaluation of postprocedural outcomes. Three-dimensional (3D) imaging is known as an objective, accurate, and reliable system for quantifying the soft tissue dimensions of the face. To compare the preprocedural and acute postprocedural nasofrontal, nasofacial, nasolabial, and nasomental angles, early changes in the height and length of the nose, and nasal volume using 3D surface imaging with a light-emitting diode. A 3D imaging analysis of 40 Korean women who underwent structured nonsurgical rhinoplasty was conducted. The 3D assessment was performed before, immediately after, 1 day after, and 2 weeks after filler rhinoplasty with a Morpheus 3D scanner (Morpheus Co., Seoul, Korea). There were significant early changes in facial profile following nonsurgical rhinoplasty with a hyaluronic acid filler: an average increase of 6.03° in the nasofrontal angle, an increase of 3.79° in the nasolabial angle, an increase of 0.88° in the nasomental angle, and a reduction of 0.83° in the nasofacial angle at 2 weeks of follow-up. Increases in nasal volume and nose height were also found after 2 weeks. Side effects, such as hematoma, nodules, and skin necrosis, were not observed. The 3D surface imaging quantitatively demonstrated the early changes in facial profile after structured filler rhinoplasty. The study results describe significant acute spatial changes in nose shape following treatment.
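A sketch of how a profile angle such as the nasofrontal angle can be computed from three 3D landmarks: the angle at the middle landmark between the vectors to its neighbours. The landmark coordinates are hypothetical, not Morpheus 3D output.

```python
# Angle at a middle landmark from three 3D points, as used for profile
# angles. Hypothetical landmark coordinates.
import numpy as np

def landmark_angle(a, b, c):
    """Angle (degrees) at landmark b formed by points a-b-c."""
    u, v = np.asarray(a) - b, np.asarray(c) - b
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

glabella, nasion, tip = [0, 8, 95], [0, 0, 90], [0, -25, 105]
print(f"nasofrontal angle ~ {landmark_angle(glabella, nasion, tip):.1f} deg")
```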
Genome Scale Modeling in Systems Biology: Algorithms and Resources
Najafi, Ali; Bidkhori, Gholamreza; Bozorgmehr, Joseph H.; Koch, Ina; Masoudi-Nejad, Ali
2014-01-01
In recent years, in silico studies and trial simulations have complemented experimental procedures. A model is a description of a system, and a system is any collection of interrelated objects; an object, moreover, is some elemental unit upon which observations can be made but whose internal structure either does not exist or is ignored. Therefore, a network analysis approach is critical for successful quantitative modeling of biological systems. This review highlights some of the most popular and important modeling algorithms, tools, and emerging standards for representing, simulating, and analyzing cellular networks, in five sections. We also illustrate these concepts by means of simple examples and appropriate images and graphs. Overall, systems biology aims for a holistic description and understanding of biological processes through the integration of analytical experimental approaches with synthetic computational models. In fact, biological networks have been developed as a platform for integrating information from high- to low-throughput experiments for the analysis of biological systems. We provide an overview of all processes used in modeling and simulating biological networks in such a way that they can become easily understandable for researchers with both biological and mathematical backgrounds. Consequently, given the complexity of generated experimental data and cellular networks, it is no surprise that researchers have turned to computer simulation and the development of more theory-based approaches to augment and assist in the development of a fully quantitative understanding of cellular dynamics. PMID:24822031
Video methods in the quantification of children's exposures.
Ferguson, Alesia C; Canales, Robert A; Beamer, Paloma; Auyeung, Willa; Key, Maya; Munninghoff, Amy; Lee, Kevin Tse-Wing; Robertson, Alexander; Leckie, James O
2006-05-01
In 1994, Stanford University's Exposure Research Group (ERG) conducted its first pilot study to collect micro-level activity time series (MLATS) data for young children. The pilot study involved videotaping four children of farm workers in the Salinas Valley of California and converting their videotaped activities to valuable text files of contact behavior using video-translation techniques. These MLATS are especially useful for describing intermittent dermal (i.e., second-by-second account of surfaces and objects contacted) and non-dietary ingestion (second-by-second account of objects or hands placed in the mouth) contact behavior. Second-by-second records of children's contact behavior are amenable to quantitative and statistical analysis and allow for more accurate model estimates of human exposure and dose to environmental contaminants. Activity pattern data for modeling inhalation exposure (i.e., accounts of microenvironments visited) can also be extracted from the MLATS data. Since the pilot study, ERG has collected an immense MLATS data set for 92 children using more developed and refined videotaping and video-translation methodologies. This paper describes all aspects required for the collection of MLATS, including: subject recruitment techniques, videotaping and video-translation processes, and potential data analysis. This paper also describes the quality assurance steps employed for these new MLATS projects, including: training, data management, and the application of interobserver and intraobserver agreement during video translation. The discussion of these issues and ERG's experiences in dealing with them can assist other groups in the conduct of research that employs these more quantitative techniques.
Arabidopsis phenotyping through Geometric Morphometrics.
Manacorda, Carlos A; Asurmendi, Sebastian
2018-06-18
Recently, much technical progress has been achieved in the field of plant phenotyping. High-throughput platforms and the development of improved algorithms for rosette image segmentation now make it possible to extract shape and size parameters for genetic, physiological and environmental studies on a large scale. The development of low-cost phenotyping platforms and freeware resources makes it possible to widely expand phenotypic analysis tools for Arabidopsis. However, objective descriptors of shape parameters that could be used independently of the platform and segmentation software are still lacking, and shape descriptions still rely on ad hoc or even sometimes contradictory descriptors, which can make comparisons difficult and perhaps inaccurate. Modern geometric morphometrics is a family of methods in quantitative biology proposed to be the main source of data and analytical tools in the emerging field of phenomics studies. Based on the location of landmarks (corresponding points) over imaged specimens and by combining geometry, multivariate analysis and powerful statistical techniques, these tools offer the possibility to reproducibly and accurately account for shape variations amongst groups and measure them in shape distance units. Here, a particular scheme of landmark placement on Arabidopsis rosette images is proposed to study shape variation in the case of viral infection processes. Shape differences between controls and infected plants are quantified throughout the infectious process and visualized. Quantitative comparisons between two unrelated ssRNA+ viruses are shown and reproducibility issues are assessed. Combined with the newest automated platforms and plant segmentation procedures, geometric morphometric tools could boost phenotypic feature extraction and processing in an objective, reproducible manner.
Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez
2014-01-01
According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, the use of quantitative microbiological risk assessment is an appealing approach to link new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to practically show how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of a primo-contamination event of the milk, the fresh cheese or the process environment is simulated over time, space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model will help prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed for the identification of major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures. © 2013 Society for Risk Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, J; Nishikawa, R; Reiser, I
Purpose: Segmentation quality can affect quantitative image feature analysis. The objective of this study is to examine the relationship between computed tomography (CT) image quality, segmentation performance, and quantitative image feature analysis. Methods: A total of 90 pathology-proven breast lesions in 87 dedicated breast CT images were considered. An iterative image reconstruction (IIR) algorithm was used to obtain CT images of different quality. With different combinations of 4 variables in the algorithm, this study obtained a total of 28 different qualities of CT images. Two imaging tasks/objectives were considered: 1) segmentation and 2) classification of the lesion as benign or malignant. Twenty-three image features were extracted after segmentation using a semi-automated algorithm and 5 of them were selected via a feature selection technique. Logistic regression was trained and tested using leave-one-out cross-validation and its area under the ROC curve (AUC) was recorded. The standard deviation of a homogeneous portion and the gradient of a parenchymal portion of an example breast were used as estimates of image noise and sharpness. The DICE coefficient was computed using a radiologist’s drawing of the lesion. Mean DICE and AUC were used as performance metrics for each of the 28 reconstructions. The relationship between segmentation and classification performance under different reconstructions was compared, and the distributions (median, 95% confidence interval) of DICE and AUC for each reconstruction were also compared. Results: A moderate correlation (Pearson’s rho = 0.43, p-value = 0.02) between DICE and AUC values was found. However, the variation between DICE and AUC values for each reconstruction increased as the image sharpness increased. There was a combination of IIR parameters that resulted in the best segmentation with the worst classification performance. Conclusion: There are certain images that yield better segmentation or classification performance. The best segmentation result does not necessarily lead to the best classification result. This work has been supported in part by grants from the NIH R21-EB015053. R. Nishikawa receives royalties from Hologic, Inc.
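The two performance metrics used in this study have simple closed forms; the sketch below, with illustrative synthetic values rather than the study's data, shows a DICE coefficient for binary masks and the Pearson correlation between per-reconstruction mean DICE and AUC.

```python
# DICE overlap between an algorithmic segmentation and a reference
# drawing, plus the correlation across reconstructions.
import numpy as np
from scipy.stats import pearsonr

def dice(seg: np.ndarray, ref: np.ndarray) -> float:
    """DICE coefficient: 2|A & B| / (|A| + |B|) for binary masks."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    denom = seg.sum() + ref.sum()
    return 1.0 if denom == 0 else 2.0 * (seg & ref).sum() / denom

rng = np.random.default_rng(0)
mean_dice = rng.uniform(0.6, 0.9, size=28)    # one value per reconstruction
auc = 0.5 * mean_dice + rng.normal(0, 0.05, 28)
rho, p = pearsonr(mean_dice, auc)             # analog of the reported rho
print(rho, p)
```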
Vibration analysis based on electronic stroboscopic speckle-shearing pattern interferometry
NASA Astrophysics Data System (ADS)
Jia, Dagong; Yu, Changsong; Xu, Tianhua; Jin, Chao; Zhang, Hongxia; Jing, Wencai; Zhang, Yimo
2008-12-01
In this paper, an electronic speckle-shearing pattern interferometer with a pulsed laser and a pulse frequency controller is fabricated. The principle of measuring the vibration of an object using the electronic stroboscopic speckle-shearing pattern interferometer is analyzed. Using a metal plate clamped at its edge as an experimental specimen, shear interferograms were obtained at two excitation frequencies, 100 Hz and 200 Hz. At the same time, the vibration of this metal plate under the same experimental conditions was measured using the time-average method in order to test the performance of the electronic stroboscopic speckle-shearing pattern interferometer. The results indicated that the fringes of the shear interferogram become denser as the excitation frequency increases. Comparing the fringe patterns obtained by the two methods, the shearing interferogram from the stroboscopic method is clearer than that from the time-average method. Both the time-average and stroboscopic methods are suited to qualitative analysis of the vibration of the object; moreover, the stroboscopic method is also well adapted to quantitative vibration analysis.
Improved analysis of palm creases
Park, Jin Seo; Shin, Dong Sun; Jung, Wonsug
2010-01-01
Palm creases are helpful in revealing anthropologic characteristics and diagnosing chromosomal aberrations, and have been analyzed qualitatively and quantitatively. However, previous methods of analyzing palm creases were not objective, so reproducibility could not be guaranteed. In this study, a more objective morphologic analysis of palm creases was developed. The features of the improved methods include the strict definition of major and minor palm creases and the systematic classification of major palm creases based on their relationships, branches, and variants. Furthermore, based on the analysis of 3,216 Koreans, palm creases were anthropologically interpreted. There was a tendency for palm creases to be evenly distributed on the palm, which was acknowledged by the relationship between major and minor creases as well as by the incidences of major crease types. This tendency was consistent with the role of palm creases in facilitating folding of the palm skin. The union of major palm creases was frequent in males and in right palms, consistent with a powerful hand grip. The new method of analyzing palm creases is expected to be widely used for anthropologic investigation and chromosomal diagnosis. PMID:21189999
Multiparametric Analysis of the Tumor Microenvironment: Hypoxia Markers and Beyond.
Mayer, Arnulf; Vaupel, Peter
2017-01-01
We have established a novel in situ protein analysis pipeline, which is built upon highly sensitive, multichannel immunofluorescent staining of paraffin sections of human and xenografted tumor tissue. Specimens are digitized using slide scanners equipped with suitable light sources and fluorescence filter combinations. Resulting digital images are subsequently subjected to quantitative image analysis using a primarily object-based approach, which comprises segmentation of single cells or higher-order structures (e.g., blood vessels), cell shape approximation, measurement of signal intensities in individual fluorescent channels and correlation of these data with positional information for each object. Our approach could be particularly useful for the study of the hypoxic tumor microenvironment as it can be utilized to systematically explore the influence of spatial factors on cell phenotypes, e.g., the distance of a given cell type from the nearest blood vessel on the cellular expression of hypoxia-associated biomarkers and other proteins reflecting their specific state of activation or function. In this report, we outline the basic methodology and provide an outlook on possible use cases.
Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S
2014-01-01
The problem of correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of the available approaches allowing a quantitative analysis of the human movement control system's performance, the clinical assessment and diagnostic approach to fall risk assessment still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing an automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits of these scales, i.e., limited granularity and limited inter-/intra-examiner reliability, and to obtain objective scores and more detailed information allowing fall risk to be predicted. We used Microsoft Kinect to record subjects' movements while performing challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained good accuracy (~82%) and especially high sensitivity (~83%).
de Gramatica, Martina; Massacci, Fabio; Shim, Woohyun; Turhan, Uğur; Williams, Julian
2017-02-01
We analyze the issue of agency costs in aviation security by combining results from a quantitative economic model with a qualitative study based on semi-structured interviews. Our model extends previous principal-agent models by combining the traditional fixed and varying monetary responses to physical and cognitive effort with nonmonetary welfare and potentially transferable value of employees' own human capital. To provide empirical evidence for the tradeoffs identified in the quantitative model, we have undertaken an extensive interview process with regulators, airport managers, security personnel, and those tasked with training security personnel from an airport operating in a relatively high-risk state, Turkey. Our results indicate that the effectiveness of additional training depends on the mix of "transferable skills" and "emotional" buy-in of the security agents. Principals need to identify on which side of a critical tipping point their agents are to ensure that additional training, with attached expectations of the burden of work, aligns the incentives of employees with the principals' own objectives. © 2016 Society for Risk Analysis.
Data-Driven Surface Traversability Analysis for Mars 2020 Landing Site Selection
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Rothrock, Brandon; Almeida, Eduardo; Ansar, Adnan; Otero, Richard; Huertas, Andres; Heverly, Matthew
2015-01-01
The objective of this paper is three-fold: 1) to describe the engineering challenges in the surface mobility of the Mars 2020 Rover mission that are considered in the landing site selection process, 2) to introduce new automated traversability analysis capabilities, and 3) to present the preliminary analysis results for top candidate landing sites. The analysis capabilities presented in this paper include automated terrain classification, automated rock detection, digital elevation model (DEM) generation, and multi-ROI (region of interest) route planning. These capabilities make it possible to fully utilize the vast volume of high-resolution orbiter imagery, to quantitatively evaluate surface mobility requirements for each candidate site, and to remove subjectivity from the comparison between sites in terms of engineering considerations. The analysis results supported the discussion in the Second Landing Site Workshop held in August 2015, which resulted in selecting eight candidate sites that will be considered in the third workshop.
Systemic Analysis Approaches for Air Transportation
NASA Technical Reports Server (NTRS)
Conway, Sheila
2005-01-01
Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling of safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, let alone predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.
Nicholas, Dequina; Proctor, Elizabeth A; Raval, Forum M; Ip, Blanche C; Habib, Chloe; Ritou, Eleni; Grammatopoulos, Tom N; Steenkamp, Devin; Dooms, Hans; Apovian, Caroline M; Lauffenburger, Douglas A; Nikolajczyk, Barbara S
2017-01-01
Numerous studies show that mitochondrial energy generation determines the effectiveness of immune responses. Furthermore, changes in mitochondrial function may regulate lymphocyte function in inflammatory diseases like type 2 diabetes. Analysis of lymphocyte mitochondrial function has been facilitated by introduction of 96-well format extracellular flux (XF96) analyzers, but the technology remains imperfect for analysis of human lymphocytes. Limitations in XF technology include the lack of practical protocols for analysis of archived human cells, and inadequate data analysis tools that require manual quality checks. Current analysis tools for XF outcomes are also unable to automatically assess data quality and delete untenable data from the relatively high number of biological replicates needed to power complex human cell studies. The objectives of work presented herein are to test the impact of common cellular manipulations on XF outcomes, and to develop and validate a new automated tool that objectively analyzes a virtually unlimited number of samples to quantitate mitochondrial function in immune cells. We present significant improvements on previous XF analyses of primary human cells that will be absolutely essential to test the prediction that changes in immune cell mitochondrial function and fuel sources support immune dysfunction in chronic inflammatory diseases like type 2 diabetes.
Standardizing Quality Assessment of Fused Remotely Sensed Images
NASA Astrophysics Data System (ADS)
Pohl, C.; Moellmann, J.; Fries, K.
2017-09-01
The multitude of available operational remote sensing satellites has led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the criteria and indices chosen, so the result varies. It is therefore necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process for objectively comparing fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
A framing theory-based content analysis of a Turkish newspaper's coverage of nanotechnology
NASA Astrophysics Data System (ADS)
Şenocak, Erdal
2017-07-01
This study aims at examining how nanotechnology is covered in the Turkish print media. As an initial part of this objective, a total of 76 articles from a widespread national newspaper were analyzed based on framing theory. These articles were analyzed using both quantitative and qualitative traditions of content analysis; however, the quantitative method was the primary form of investigation. The analyses showed that the first news about nanotechnology appeared in 1991 and that the frequency of articles increased in subsequent years but declined after a time. The findings demonstrated a remarkably positive tone in the articles; there were only a few articles with negative tones, and these were published in the first years of nanotechnology news. It was further found that the articles were mostly concerned with implementations of nanotechnology, such as research and education centers, medicine, and electronics. The study also investigated the presentation style of the nanotechnology news, in other words, how the articles were framed. The results showed that the articles were mostly framed around scientific research or discoveries and future expectations.
Quantitative analysis of brain magnetic resonance imaging for hepatic encephalopathy
NASA Astrophysics Data System (ADS)
Syh, Hon-Wei; Chu, Wei-Kom; Ong, Chin-Sing
1992-06-01
High intensity lesions around the ventricles have recently been observed in T1-weighted brain magnetic resonance images of patients suffering from hepatic encephalopathy. The exact etiology that causes these magnetic resonance imaging (MRI) gray scale changes is not fully understood. The objective of our study was to investigate, through quantitative means, (1) the amount of change to brain white matter due to the disease process, and (2) the extent and distribution of these high intensity lesions, since it is believed that the abnormality may not be entirely limited to the white matter. Eleven patients with proven hepatic encephalopathy and three normal subjects without any evidence of liver abnormality constituted our database. Trans-axial, sagittal, and coronal brain MRI were obtained on a 1.5 Tesla scanner. All processing was carried out on a microcomputer-based image analysis system in an off-line manner. Histograms were decomposed into regular brain tissues and lesions. Gray scale ranges coded as lesion were then mapped back onto the original images to identify the distribution of abnormality. Our results indicated that the disease process involves the pallidus, mesencephalon, and subthalamic regions.
Han, Jung Mi; Boo, Eun Hee; Kim, Jung A; Yoon, Soo Jin; Kim, Seong Woo
2012-01-01
Objectives This study evaluated the qualitative and quantitative performances of the newly developed information system which was implemented on November 4, 2011 at the National Health Insurance Corporation Ilsan Hospital. Methods Registration waiting time and changes in the satisfaction scores for the key performance indicators (KPI) before and after the introduction of the system were compared; and the economic effects of the system were analyzed by using the information economics approach. Results After the introduction of the system, the waiting time for registration was reduced by 20%, and the waiting time at the internal medicine department was reduced by 15%. The benefit-to-cost ratio was increased to 1.34 when all intangible benefits were included in the economic analysis. Conclusions The economic impact and target satisfaction rates increased due to the introduction of the new system. The results were proven by the quantitative and qualitative analyses carried out in this study. This study was conducted only seven months after the introduction of the system. As such, a follow-up study should be carried out in the future when the system stabilizes. PMID:23115744
Quantitative analysis of a reconstruction method for fully three-dimensional PET.
Suckling, J; Ott, R J; Deehan, B J
1992-03-01
The major advantage of positron emission tomography (PET) using large area planar detectors over scintillator-based commercial ring systems is the potentially larger (by a factor of two or three) axial field-of-view (FOV). However, to achieve the space invariance of the point spread function necessary for Fourier filtering a polar angle rejection criterion is applied to the data during backprojection resulting in a trade-off between FOV size and sensitivity. A new algorithm due to Defrise and co-workers developed for list-mode data overcomes this problem with a solution involving the division of the image into several subregions. A comparison between the existing backprojection-then-filter algorithm and the new method (with three subregions) has been made using both simulated and real data collected from the MUP-PET positron camera. Signal-to-noise analysis reveals that improvements of up to a factor of 1.4 are possible resulting from an increased data usage of up to a factor of 2.5 depending on the axial extent of the imaged object. Quantitation is also improved.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1998-05-01
Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental solution methodologies, in the form of computational modeling, noninvasive optical techniques, and fringe-prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate its viability as an effective engineering tool for analysis and optimization.
Qasim, Ban J.; Ali, Hussam H.; Hussein, Alaa G.
2013-01-01
Background/Aim: To evaluate the immunohistochemical expression of matrix metalloproteinase-7 (MMP-7) in colorectal adenomas, and to correlate this expression with different clinicopathological parameters. Patients and Methods: The study was retrospectively designed. Thirty-three paraffin blocks from patients with colorectal adenoma and 20 samples of non-tumorous colonic tissue taken as a control group were included in the study. MMP-7 expression was assessed by immunohistochemistry. The scoring of immunohistochemical staining was conducted utilizing a specified automated cellular image analysis system (Digimizer). Results: The frequency of positive immunohistochemical expression of MMP-7 was significantly higher in adenoma than in the control group (45.45% versus 10%) (P value < 0.001). Strong MMP-7 staining was mainly seen in adenoma cases (30.30%) in comparison with control (0%); the difference is significant (P < 0.001). The three digital parameters of MMP-7 immunohistochemical expression (area (A), number of objects (N), and intensity (I)) were significantly higher in adenoma than in control. Mean A and I of MMP-7 showed a significant correlation with large adenoma size (≥1 cm) (P < 0.05); a significant positive correlation of the three digital parameters (A, N, and I) of MMP-7 expression with villous configuration and severe dysplasia in colorectal adenoma was also identified (P < 0.05). Conclusion: MMP-7 plays an important role in the growth and malignant conversion of colorectal adenomas, as it is more likely to be expressed in advanced colorectal adenomatous polyps with large size, severe dysplasia and villous histology. The use of an automated cellular image analysis system (Digimizer) to quantify immunohistochemical staining yields more consistent assay results, converts a semi-quantitative assay into a truly quantitative one, and improves assay objectivity and reproducibility. PMID:23319034
Hong, Ha; Solomon, Ethan A.; DiCarlo, James J.
2015-01-01
To go beyond qualitative models of the biological substrate of object recognition, we ask: can a single ventral stream neuronal linking hypothesis quantitatively account for core object recognition performance over a broad range of tasks? We measured human performance in 64 object recognition tests using thousands of challenging images that explore shape similarity and identity preserving object variation. We then used multielectrode arrays to measure neuronal population responses to those same images in visual areas V4 and inferior temporal (IT) cortex of monkeys and simulated V1 population responses. We tested leading candidate linking hypotheses and control hypotheses, each postulating how ventral stream neuronal responses underlie object recognition behavior. Specifically, for each hypothesis, we computed the predicted performance on the 64 tests and compared it with the measured pattern of human performance. All tested hypotheses based on low- and mid-level visually evoked activity (pixels, V1, and V4) were very poor predictors of the human behavioral pattern. However, simple learned weighted sums of distributed average IT firing rates exactly predicted the behavioral pattern. More elaborate linking hypotheses relying on IT trial-by-trial correlational structure, finer IT temporal codes, or ones that strictly respect the known spatial substructures of IT (“face patches”) did not improve predictive power. Although these results do not reject those more elaborate hypotheses, they suggest a simple, sufficient quantitative model: each object recognition task is learned from the spatially distributed mean firing rates (100 ms) of ∼60,000 IT neurons and is executed as a simple weighted sum of those firing rates. SIGNIFICANCE STATEMENT We sought to go beyond qualitative models of visual object recognition and determine whether a single neuronal linking hypothesis can quantitatively account for core object recognition behavior. To achieve this, we designed a database of images for evaluating object recognition performance. We used multielectrode arrays to characterize hundreds of neurons in the visual ventral stream of nonhuman primates and measured the object recognition performance of >100 human observers. Remarkably, we found that simple learned weighted sums of firing rates of neurons in monkey inferior temporal (IT) cortex accurately predicted human performance. Although previous work led us to expect that IT would outperform V4, we were surprised by the quantitative precision with which simple IT-based linking hypotheses accounted for human behavior. PMID:26424887
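The winning linking hypothesis above, learned weighted sums of distributed IT firing rates, amounts to a linear readout; a minimal sketch under assumed synthetic shapes (images by recording sites) might look like the following, with the data stand-ins purely illustrative of the recorded population responses.

```python
# Linear readout: task performance predicted from learned weighted
# sums of time-averaged IT firing rates.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
rates = rng.poisson(5.0, size=(2000, 168)).astype(float)  # images x IT sites
labels = rng.integers(0, 2, size=2000)                    # object identity task

readout = LogisticRegression(max_iter=1000)  # weighted sum + decision rule
print(cross_val_score(readout, rates, labels, cv=5).mean())
```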
Sankhalkar, Sangeeta; Vernekar, Vrunda
2016-01-01
A number of secondary compounds are produced by plants as natural antioxidants. Moringa oleifera Lam. and Ocimum tenuiflorum L. are known for their wide applications in the food and pharmaceutical industries. The objective was to compare the phenolic and flavonoid content of M. oleifera Lam. and O. tenuiflorum L. by quantitative and qualitative analysis. Phenolic and flavonoid content were studied spectrophotometrically and by paper chromatography in M. oleifera Lam. and O. tenuiflorum L. Higher phenolic and flavonoid content were observed in Moringa leaf and flower. Ocimum flower showed higher phenolic and lower flavonoid content in comparison with Moringa. Flavonoids such as biflavonyl, flavones, glycosylflavones, and kaempferol were identified by paper chromatography. Phytochemical tests for flavonoids, tannins, saponins, alkaloids, reducing sugars, and anthraquinones were positive for Moringa and Ocimum leaf as well as flower. In the present study, the high phenolic and flavonoid content indicated the natural antioxidant character of Moringa and Ocimum, signifying their medicinal importance. Moringa oleifera Lam. and Ocimum tenuiflorum L. are widely grown in India and are known for their medicinal properties. A number of secondary metabolites such as phenolics and flavonoids are known to be present in both plants. The present study was conducted with the objective of qualitatively and quantitatively comparing the phenolics and flavonoids in these two medicinally important plants. Quantitation of total phenolics and flavonoids was done spectrophotometrically, while qualitative analysis was performed by paper chromatography and by phytochemical tests. Our results showed higher phenolic and flavonoid content in Moringa leaf and flower, whereas the higher phenolic content was absent in Ocimum flower compared with that of Moringa. Phytochemical analysis of various metabolites such as flavonoids, tannins, saponins, alkaloids and anthraquinones revealed that both plant extracts are rich sources of secondary metabolites, testing positive in the above assays. Various flavonoids and phenolics were identified by paper chromatography based on their Rf values and characteristic colors. From the above study we conclude that Moringa and Ocimum are rich in natural antioxidants and hence are a potent resource for the pharmaceutical industry.
Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc
2017-05-01
Uveitis is one of the fields in ophthalmology where a tremendous evolution took place in the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but also in parallel precise and quantitative measurement methods developed allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, rendering possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation represents an exquisite security in monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements are integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters such as clinical evaluation of vitreous haze as a main endpoint; whereas a whole array of precise, quantitative, and objective modalities are available for the design of clinical studies. The scope of this work was to review the quantitative investigations that improved the management of uveitis in the past 2-3 decades.
[Free crystalline silica: a comparison of methods for its determination in total dust].
Maciejewska, Aleksandra; Szadkowska-Stańczyk, Irena; Kondratowicz, Grzegorz
2005-01-01
The major objective of the study was to compare and investigate the usefulness of quantitative analyses of free crystalline silica (FCS) for the assessment of dust exposure in samples of total dust of varied composition, using three methods: the chemical method in common use in Poland, infrared spectrometry, and X-ray powder diffraction. Mineral composition and FCS contents were investigated in 9 laboratory samples of raw materials, materials, and industrial wastes, containing from about 2 to over 80% crystalline silica and reduced to particles of a size corresponding to that of total dust. Sample components were identified using the XRD and FT-IR methods. Ten independent determinations of FCS with each of the three study methods were performed on the dust samples. Linear correlation analysis was applied to investigate the interrelationship between mean FCS determinations. In the analyzed dust samples, numerous minerals that interfere with silica during quantitative analysis were present along with the silica dust. Comparison of the mean results of FCS determinations showed that the results obtained using the FT-IR method were 12-13% lower than those obtained with the two other methods. However, the differences observed were within the limits of variability of results associated with their precision and their dependence on the reference materials used. Assessment of occupational exposure to dusts containing crystalline silica can be performed on the basis of quantitative analysis of FCS in total dust using any of the compared methods. The FT-IR method is most appropriate for FCS determination in samples with a small amount of silica or collected at low dust concentrations; the XRD method for the analysis of multicomponent samples; and the chemical method in the case of medium and high FCS contents in samples or high concentrations of dusts in the work environment.
Representational Similarity Analysis – Connecting the Branches of Systems Neuroscience
Kriegeskorte, Nikolaus; Mur, Marieke; Bandettini, Peter
2008-01-01
A fundamental challenge for systems neuroscience is to quantitatively relate its three major branches of research: brain-activity measurement, behavioral measurement, and computational modeling. Using measured brain-activity patterns to evaluate computational network models is complicated by the need to define the correspondency between the units of the model and the channels of the brain-activity data, e.g., single-cell recordings or voxels from functional magnetic resonance imaging (fMRI). Similar correspondency problems complicate relating activity patterns between different modalities of brain-activity measurement (e.g., fMRI and invasive or scalp electrophysiology), and between subjects and species. In order to bridge these divides, we suggest abstracting from the activity patterns themselves and computing representational dissimilarity matrices (RDMs), which characterize the information carried by a given representation in a brain or model. Building on a rich psychological and mathematical literature on similarity analysis, we propose a new experimental and data-analytical framework called representational similarity analysis (RSA), in which multi-channel measures of neural activity are quantitatively related to each other and to computational theory and behavior by comparing RDMs. We demonstrate RSA by relating representations of visual objects as measured with fMRI in early visual cortex and the fusiform face area to computational models spanning a wide range of complexities. The RDMs are simultaneously related via second-level application of multidimensional scaling and tested using randomization and bootstrap techniques. We discuss the broad potential of RSA, including novel approaches to experimental design, and argue that these ideas, which have deep roots in psychology and neuroscience, will allow the integrated quantitative analysis of data from all three branches, thus contributing to a more unified systems neuroscience. PMID:19104670
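A minimal sketch of the RSA comparison described above: compute an RDM for each system (here, correlation distance between activity patterns over stimuli) and relate the two RDMs by rank correlation. The pattern matrices below are illustrative stand-ins for fMRI voxels and model units, not the authors' data.

```python
# Representational dissimilarity matrices (RDMs): one per system,
# each entry the correlation distance between activity patterns for
# a pair of stimuli; systems are compared by rank-correlating RDMs.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
patterns_a = rng.normal(size=(40, 300))  # 40 stimuli x 300 voxels
patterns_b = rng.normal(size=(40, 120))  # 40 stimuli x 120 model units

rdm_a = pdist(patterns_a, metric="correlation")  # condensed upper triangle
rdm_b = pdist(patterns_b, metric="correlation")
rho, p = spearmanr(rdm_a, rdm_b)
print(rho, p)
```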
The Quantitative Analysis of Chennai Automotive Industry Cluster
NASA Astrophysics Data System (ADS)
Bhaskaran, Ethirajan
2016-07-01
Chennai, also called the Detroit of India, hosts an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here capture the approximate relationship in a closed form. The KWT shows no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, which supports the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows no significant difference between industrial units with respect to costs such as production, infrastructure, technology and marketing, and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralized facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. Value chain analysis models have been implemented in all the cluster units. This CDA model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity increase.
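The four procedures named (CA, RA, FMT, KWT) are standard statistical tests; a sketch of how they could be run with SciPy follows, with illustrative stand-in data in place of the 100 questionnaire responses.

```python
# Correlation, regression, Kruskal-Wallis, and Friedman tests on
# synthetic stand-ins for the cluster questionnaire variables.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
net_profit = rng.normal(10, 2, size=100)                 # one value per ACI unit
gross_output = 3 * net_profit + rng.normal(0, 1, size=100)

r, p = stats.pearsonr(net_profit, gross_output)          # Correlation Analysis
slope, intercept, *_ = stats.linregress(net_profit, gross_output)  # Regression
print(r, slope)

# Kruskal-Wallis: compare a variable across the three estate locations.
a, b, c = np.split(net_profit[:99], 3)
print(stats.kruskal(a, b, c))

# Friedman: compare repeated cost measures within the same units.
prod_cost, infra_cost, tech_cost = rng.normal(size=(3, 33))
print(stats.friedmanchisquare(prod_cost, infra_cost, tech_cost))
```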
A quantitative analysis of municipal solid waste disposal charges in China.
Wu, Jian; Zhang, Weiqian; Xu, Jiaxuan; Che, Yue
2015-03-01
Rapid industrialization and economic development have caused a tremendous increase in municipal solid waste (MSW) generation in China. China began implementing a policy of MSW disposal fees for household waste management at the end of the last century. Three charging methods were implemented throughout the country: a fixed disposal fee, a potable water-based disposal fee, and a plastic bag-based disposal fee. To date, there has been little qualitative or quantitative analysis of the effectiveness of this relatively new policy. This paper provides a general overview of MSW fee policy in China, attempts to verify whether the policy is successful in reducing the general waste collected, and proposes an improved charging system to address current problems. The paper presents an empirical statistical analysis of policy effectiveness derived from an environmental Kuznets curve (EKC) test on panel data for China. EKC tests on the different kinds of MSW charge systems were then examined for individual provinces or cities. A comparison of existing charging systems was conducted using environmental and economic criteria. The results indicate the following: (1) the MSW policies implemented over the study period were effective in the reduction of waste generation, (2) the household waste discharge fee policy did not act as a strong driver in terms of waste prevention and reduction, and (3) the plastic bag-based disposal fee appeared to be performing well according to qualitative and quantitative analysis. Based on the current situation of waste discharge management in China, a three-stage transitional charging scheme is proposed, and both its advantages and drawbacks are discussed. Evidence suggests that a transition from a fixed disposal fee to a plastic bag-based disposal fee involving various stakeholders should be the next objective of waste reduction efforts.
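A minimal sketch of the EKC test underlying such an analysis, under the usual reduced-form assumption that waste generation is quadratic in income: an inverted U (negative quadratic coefficient) with a turning point inside the observed income range is consistent with an EKC. The data and the pooled (non-panel) fit here are simplified, illustrative assumptions, not the paper's estimation.

```python
# Reduced-form EKC check: regress per-capita waste on a quadratic in
# per-capita income and inspect the curvature and turning point.
import numpy as np

rng = np.random.default_rng(4)
income = rng.uniform(1, 10, size=200)                      # e.g. GDP per capita
waste = 2 + 1.5 * income - 0.1 * income**2 + rng.normal(0, 0.3, 200)

b2, b1, b0 = np.polyfit(income, waste, deg=2)              # highest degree first
turning_point = -b1 / (2 * b2)
print(b2 < 0, income.min() < turning_point < income.max())
```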
Suzuki, Masaharu; Ketterling, Matthew G; McCarty, Donald R
2005-09-01
We have developed a simple quantitative computational approach for objective analysis of cis-regulatory sequences in promoters of coregulated genes. The program, designated MotifFinder, identifies oligo sequences that are overrepresented in promoters of coregulated genes. We used this approach to analyze promoter sequences of Viviparous1 (VP1)/abscisic acid (ABA)-regulated genes and cold-regulated genes, respectively, of Arabidopsis (Arabidopsis thaliana). We detected significantly enriched sequences in up-regulated genes but not in down-regulated genes. This result suggests that gene activation but not repression is mediated by specific and common sequence elements in promoters. The enriched motifs include several known cis-regulatory sequences as well as previously unidentified motifs. With respect to known cis-elements, we dissected the flanking nucleotides of the core sequences of Sph element, ABA response elements (ABREs), and the C repeat/dehydration-responsive element. This analysis identified the motif variants that may correlate with qualitative and quantitative differences in gene expression. While both VP1 and cold responses are mediated in part by ABA signaling via ABREs, these responses correlate with unique ABRE variants distinguished by nucleotides flanking the ACGT core. ABRE and Sph motifs are tightly associated uniquely in the coregulated set of genes showing a strict dependence on VP1 and ABA signaling. Finally, analysis of distribution of the enriched sequences revealed a striking concentration of enriched motifs in a proximal 200-base region of VP1/ABA and cold-regulated promoters. Overall, each class of coregulated genes possesses a discrete set of the enriched motifs with unique distributions in their promoters that may account for the specificity of gene regulation.
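The core of a MotifFinder-style scan can be expressed compactly: count each oligo in the promoters of coregulated genes and in background promoters, then rank oligos by overrepresentation. The sketch below uses a one-sided Fisher exact test as the enrichment score; the scoring rule, sequences, and oligo length are illustrative assumptions, not the published program.

```python
# Count k-mers in foreground vs. background promoter sets and rank
# them by enrichment p-value (one-sided Fisher exact test).
from collections import Counter
from scipy.stats import fisher_exact

def kmer_counts(seqs, k):
    counts = Counter()
    for s in seqs:
        for i in range(len(s) - k + 1):
            counts[s[i:i + k]] += 1
    return counts

def enriched_kmers(foreground, background, k=6):
    fg, bg = kmer_counts(foreground, k), kmer_counts(background, k)
    fg_n, bg_n = sum(fg.values()), sum(bg.values())
    scored = []
    for kmer, c in fg.items():
        table = [[c, fg_n - c], [bg.get(kmer, 0), bg_n - bg.get(kmer, 0)]]
        _, p = fisher_exact(table, alternative="greater")
        scored.append((p, kmer))
    return sorted(scored)

fg = ["TTACGTGGCA" * 8, "CCACGTGTCG" * 8]  # promoters with an ACGT-core element
bg = ["ATATATATAT" * 8, "GGCCGGCCGG" * 8]
print(enriched_kmers(fg, bg)[:3])
```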
Neuromuscular Characterization of the Urethra in Continent Women
Kenton, Kimberly; Mueller, Elizabeth; Brubaker, Linda
2011-01-01
Objectives To describe quantitative urethral function parameters in a racially diverse group of continent women. Materials and Methods Following Institutional Review Board approval, we recruited women without urinary incontinence from the community. To be considered continent, participants answered “never” to the first six questions on the stress subscale of the Medical, Epidemiologic, and Social Aspects of Aging urinary incontinence (MESA) questionnaire. All participants underwent quantitative concentric urethral electromyography (EMG) and urodynamic testing (UDS). Results Thirty-one women with a mean±SD age of 39±14 years underwent EMG and UDS. The cohort was racially diverse, with 13 Caucasians (43%), 13 African Americans (43%), and 4 Hispanics (14%). Body mass index (BMI) (P=.12, .06), age (P=.40, .64), and vaginal parity (P=.53, .76) did not differ by race or ethnicity. We did not detect differences in any EMG parameter by race, ethnicity, or vaginal parity. A mean (range) of 30 (10-55) motor unit action potentials (MUPs) were identified and analyzed in multi-MUP analysis, and 14 (8-21) in interference pattern (IP) analysis. On average, 37±20% of MUPs were polyphasic. Age significantly correlated with several measures of urethral sphincter function: increasing age was inversely correlated with IP turns (r=−.57, p=.001), IP amplitude (r=−.43, p=.02), IP turns/amplitude (r=−.54, p=.003), and maximum urethral closure pressure (MUCP) (r=−.41, p=.04). Similarly, MUCP correlated with IP amplitude (r=.38, p=.04). Conclusions These urethral neuromuscular function data, from the largest cohort of continent women fully characterized with quantitative urethral EMG, demonstrate significant neuropathic MUP changes with advancing age. PMID:22453105
A quantitative approach to evolution of music and philosophy
NASA Astrophysics Data System (ADS)
Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano
2012-08-01
The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
Ma, Danjun; Wang, Jiarui; Zhao, Yingchun; Lee, Wai-Nang Paul; Xiao, Jing; Go, Vay Liang W.; Wang, Qi; Recker, Robert; Xiao, Gary Guishan
2011-01-01
Objectives Novel quantitative proteomic approaches were used to study the effects of inhibition of glycogen phosphorylase on proteome and signaling pathways in MIA PaCa-2 pancreatic cancer cells. Methods We performed quantitative proteomic analysis in MIA PaCa-2 cancer cells treated with a stratified dose of CP-320626 (25 μM, 50 μM and 100 μM). The effect of metabolic inhibition on cellular protein turnover dynamics was also studied using the modified SILAC method (mSILAC). Results A total of twenty-two protein spots and four phosphoprotein spots were quantitatively analyzed. We found that dynamic expression of total proteins and phosphoproteins was significantly changed in MIA PaCa-2 cells treated with an incremental dose of CP-320626. Functional analyses suggested that most of the proteins differentially expressed were in the pathways of MAPK/ERK and TNF-α/NF-κB. Conclusions Signaling pathways and metabolic pathways share many common cofactors and substrates forming an extended metabolic network. The restriction of substrate through one pathway such as inhibition of glycogen phosphorylation induces pervasive metabolomic and proteomic changes manifested in protein synthesis, breakdown and post-translational modification of signaling molecules. Our results suggest that quantitative proteomic is an important approach to understand the interaction between metabolism and signaling pathways. PMID:22158071
Comparison of two laboratory-based systems for evaluation of halos in intraocular lenses
Alexander, Elsinore; Wei, Xin; Lee, Shinwook
2018-01-01
Purpose Multifocal intraocular lenses (IOLs) can be associated with unwanted visual phenomena, including halos. Predicting potential for halos is desirable when designing new multifocal IOLs. Halo images from 6 IOL models were compared using the Optikos modulation transfer function bench system and a new high dynamic range (HDR) system. Materials and methods One monofocal, 1 extended depth of focus, and 4 multifocal IOLs were evaluated. An off-the-shelf optical bench was used to simulate a distant (>50 m) car headlight and record images. A custom HDR system was constructed using an imaging photometer to simulate headlight images and to measure quantitative halo luminance data. A metric was developed to characterize halo luminance properties. Clinical relevance was investigated by correlating halo measurements to visual outcomes questionnaire data. Results The Optikos system produced halo images useful for visual comparisons; however, measurements were relative and not quantitative. The HDR halo system provided objective and quantitative measurements used to create a metric from the area under the curve (AUC) of the logarithmic normalized halo profile. This proposed metric differentiated between IOL models, and linear regression analysis found strong correlations between AUC and subjective clinical ratings of halos. Conclusion The HDR system produced quantitative, preclinical metrics that correlated to patients’ subjective perception of halos. PMID:29503526
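A minimal sketch of how such an AUC metric could be computed from a measured halo profile, assuming a peak-normalized radial luminance curve; the synthetic profile and units are illustrative, and under this construction a larger (less negative) AUC would correspond to a brighter, more extended halo.

```python
# Area under the curve of the logarithm of a peak-normalized radial
# luminance profile, as a scalar halo metric. The profile is
# synthetic; real data would come from the imaging photometer.
import numpy as np

radius = np.linspace(0.1, 5.0, 200)                 # degrees, illustrative
luminance = 1e4 * np.exp(-2.0 * radius)             # cd/m^2, synthetic halo
profile = luminance / luminance.max()               # normalize to the peak
auc = np.trapz(np.log10(profile), radius)           # relative halo metric
print(auc)
```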
Zikmund, T; Kvasnica, L; Týč, M; Křížová, A; Colláková, J; Chmelík, R
2014-11-01
Transmitted-light holographic microscopy is particularly used for quantitative phase imaging of transparent microscopic objects such as living cells. The study of cells is based on extraction of dynamic data on cell behaviour from a time-lapse sequence of phase images. However, the phase images are affected by phase aberrations that make the analysis particularly difficult, because the phase deformation is prone to change during long-term experiments. Here, we present a novel algorithm for sequential processing of phase images of living cells in a time-lapse sequence. The algorithm compensates for the deformation of a phase image using weighted least-squares surface fitting. Moreover, it identifies and segments the individual cells in the phase image. All these procedures are performed automatically and applied immediately after obtaining every single phase image. This property of the algorithm is important for real-time quantitative phase imaging of cells and for instantaneous control of the course of the experiment by playback of the recorded sequence up to the actual time. Such operator intervention is a forerunner of process automation derived from image analysis. The efficiency of the proposed algorithm is demonstrated on images of rat fibrosarcoma cells acquired with an off-axis holographic microscope. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.
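A minimal sketch of background compensation by weighted least-squares surface fitting, as named above: fit a low-order polynomial surface to the phase image, with weights chosen to suppress cell pixels, and subtract it. The quadratic surface, image size, and uniform weights are illustrative assumptions, not the authors' implementation.

```python
# Fit a polynomial background surface to a phase image by weighted
# least squares and subtract it to flatten the phase.
import numpy as np

def fit_background(phase, weights, order=2):
    h, w = phase.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x, y = xx.ravel() / w, yy.ravel() / h
    # Design matrix of all monomials x^i * y^j with i + j <= order.
    cols = [x**i * y**j for i in range(order + 1)
            for j in range(order + 1 - i)]
    A = np.stack(cols, axis=1)
    sw = np.sqrt(weights.ravel())
    coef, *_ = np.linalg.lstsq(A * sw[:, None], phase.ravel() * sw,
                               rcond=None)
    return (A @ coef).reshape(h, w)

phase = np.random.default_rng(5).normal(size=(128, 128))
weights = np.ones_like(phase)          # e.g. set to 0 on segmented cell pixels
flattened = phase - fit_background(phase, weights)
```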
NASA Astrophysics Data System (ADS)
Adams, Hans-Peter; Wagner, Simone; Koziol, James A.
1998-06-01
Magnetic resonance imaging (MRI) is routinely used for the diagnosis of multiple sclerosis (MS) and for objective assessment of the extent of disease as a marker of treatment efficacy in MS clinical trials. The purpose of this study is to compare the evaluation of T2-weighted MRI scans in MS patients using a semi-automated quantitative technique with an independent assessment by a neurologist. Baseline, 6-month, and 12-month T2-weighted MRI scans from 41 chronic progressive MS patients were examined. The lesion volume ranged from 0.50 to 51.56 cm3 (mean: 8.08 cm3). Reproducibility of the quantitative technique was assessed by re-evaluation of a random subset of 20 scans; the coefficient of variation of the replicate determinations was 8.2%. Reproducibility of the neurologist's evaluations was assessed by re-evaluation of a random subset of 10 patients. The rank correlation between the results of the two methods was 0.097, which did not significantly differ from zero. Disease-related activity in T2-weighted MRI scans is a multi-dimensional construct and is not adequately summarized solely by determination of lesion volume. In this setting, image analysis software should not only support storage and retrieval of lesions as sets of pixels, but should also support links to an anatomical dictionary.
TECHNOLOGICAL INNOVATION IN NEUROSURGERY: A QUANTITATIVE STUDY
Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar
2015-01-01
Object Technological innovation within healthcare may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technologically intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical technique. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery, using patents and peer-reviewed publications as metrics of technology development and clinical translation, respectively. Methods A patent database was searched between 1960 and 2010 using the search terms "neurosurgeon" OR "neurosurgical" OR "neurosurgery". The 50 top-performing patent codes were then grouped into technology clusters, and patent and publication growth curves were generated for each cluster. A top-performing technology cluster was then selected as an exemplar for more detailed analysis of individual patents. Results In all, 11,672 patents and 208,203 publications relating to neurosurgery were identified. The top-performing technology clusters over the 50 years were image guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes, and endoscopes. Image guidance and neuromodulation devices demonstrated a highly correlated, rapid rise in patents and publications, suggesting that they are areas of technology expansion. In-depth analysis of neuromodulation patents revealed that the majority of high-performing patents were related to deep brain stimulation (DBS). Conclusions Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery. PMID:25699414
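A hedged sketch of this kind of cluster-level analysis: count patents and publications per year for a technology cluster, then correlate the two growth curves. The record format (a list of filing or publication years per cluster) is an assumption for illustration, not the study's data model.

```python
from collections import Counter

def growth_curve(years, start=1960, end=2010):
    """Yearly counts over the study window for one technology cluster."""
    counts = Counter(y for y in years if start <= y <= end)
    return [counts.get(y, 0) for y in range(start, end + 1)]

def pearson(xs, ys):
    """Pearson correlation between two equal-length count series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# patent_years and pub_years would come from the patent and literature searches:
# r = pearson(growth_curve(patent_years), growth_curve(pub_years))
```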
3D Actin Network Centerline Extraction with Multiple Active Contours
Xu, Ting; Vavylonis, Dimitrios; Huang, Xiaolei
2013-01-01
Fluorescence microscopy is frequently used to study two- and three-dimensional network structures formed by cytoskeletal polymer fibers such as actin filaments and actin cables. While these cytoskeletal structures are often dilute enough to allow imaging of individual filaments or bundles of them, quantitative analysis of such images is challenging. To facilitate quantitative, reproducible, and objective analysis of the image data, we propose a semi-automated method to extract actin networks and retrieve their topology in 3D. Our method uses multiple Stretching Open Active Contours (SOACs) that are automatically initialized at image intensity ridges and then evolve along the centerlines of filaments in the network. SOACs can merge, stop at junctions, and reconfigure with others to allow smooth crossing at filament junctions. The proposed approach is generally applicable to images of curvilinear networks with low signal-to-noise ratio (SNR). We demonstrate its potential by extracting the centerlines of synthetic meshwork images, actin networks in 2D total internal reflection fluorescence microscopy images, and 3D actin cable meshworks of live fission yeast cells imaged by spinning disk confocal microscopy. Quantitative evaluation of the method using synthetic images shows that for images with SNR above 5.0, the average vertex error, measured as the distance between our result and the ground truth, is 1 voxel, and the average Hausdorff distance is below 10 voxels. PMID:24316442
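The two evaluation measures quoted for the synthetic images can be stated precisely in a few lines; the sketch below assumes the extracted centerline points and the ground truth are given as (N, 3) arrays of voxel coordinates.

```python
import numpy as np
from scipy.spatial import cKDTree

def vertex_error(extracted, truth):
    """Mean distance from each extracted point to its nearest ground-truth point."""
    d, _ = cKDTree(truth).query(extracted)
    return d.mean()

def hausdorff(a, b):
    """Symmetric Hausdorff distance between point sets a and b (in voxels)."""
    da, _ = cKDTree(b).query(a)
    db, _ = cKDTree(a).query(b)
    return max(da.max(), db.max())
```

The vertex error captures typical localization accuracy, while the Hausdorff distance bounds the worst-case deviation between the extracted and true networks.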
Hsieh, Anne M-Y; Polyakova, Olena; Fu, Guodong; Chazen, Ronald S; MacMillan, Christina; Witterick, Ian J; Ralhan, Ranju; Walfish, Paul G
2018-04-13
Recognition of noninvasive follicular thyroid neoplasms with papillary-like nuclear features (NIFTP), which distinguishes them from invasive malignant encapsulated follicular variant of papillary thyroid carcinoma (EFVPTC), can prevent overtreatment of NIFTP patients. We and others have previously reported that programmed death-ligand 1 (PD-L1) is a useful biomarker in thyroid tumors; however, all reports to date have relied on manual scoring, which is time-consuming and subject to individual bias. Consequently, we developed a digital image analysis (DIA) protocol for cytoplasmic and membranous stain quantitation (ThyApp) and evaluated three tumor sampling methods [Systematic Uniform Random Sampling, hotspot nucleus, and hotspot nucleus/3,3'-diaminobenzidine (DAB)]. A patient cohort of 153 cases, consisting of 48 NIFTP, 44 EFVPTC, 26 benign nodules, and 35 encapsulated follicular lesions/neoplasms with lymphocytic thyroiditis (LT), was studied. ThyApp quantitation of PD-L1 expression revealed a significant difference between invasive EFVPTC and NIFTP, but none between NIFTP and benign nodules. ThyApp integrated with the hotspot nucleus tumor sampling method proved the most clinically relevant, consumed the least processing time, and eliminated interobserver variance. In conclusion, the fully automatic DIA algorithm, developed using a histomorphological approach, objectively quantitated PD-L1 expression in encapsulated thyroid neoplasms and outperformed manual scoring in reproducibility and efficiency.
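As an illustration of the kind of stain quantitation ThyApp automates, the fragment below unmixes an RGB IHC image into hematoxylin, eosin, and DAB channels and reports the DAB-positive fraction within a tumor mask. The threshold value and the omission of hotspot selection are simplifying assumptions, not the published protocol.

```python
import numpy as np
from skimage.color import rgb2hed

def dab_positive_fraction(rgb_image, tumor_mask, threshold=0.05):
    """Fraction of tumor-mask pixels whose DAB optical density exceeds threshold."""
    hed = rgb2hed(rgb_image)      # color deconvolution into H, E, DAB channels
    dab = hed[..., 2]             # DAB (brown chromogen) channel
    positive = (dab > threshold) & tumor_mask
    return positive.sum() / max(tumor_mask.sum(), 1)
```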
Thurman, Steven M; Lu, Hongjing
2014-01-01
Visual form analysis is fundamental to shape perception and likely plays a central role in the perception of more complex dynamic shapes, such as moving objects or biological motion. Two primary form-based cues serve to represent the overall shape of an object: the spatial position and the orientation of locations along the object's boundary. However, it is unclear how the visual system integrates these two sources of information in dynamic form analysis, and in particular how the brain resolves ambiguities due to sensory uncertainty and/or cue conflict. In the current study, we created animations of sparsely sampled dynamic objects (human walkers or rotating squares) composed of oriented Gabor patches, in which orientation could either coincide or conflict with the information provided by position cues. When the cues were incongruent, we found a characteristic trade-off between position and orientation information, whereby position cues increasingly dominated perception as the relative uncertainty of orientation increased, and vice versa. Furthermore, we found no evidence for differences in the visual processing of biological and non-biological objects, casting doubt on the claim that biological motion processing is specialized in the human brain, at least with respect to form analysis. To explain these behavioral results quantitatively, we adopt a probabilistic template-matching model that uses Bayesian inference within local modules to estimate object shape separately from either spatial position or orientation signals. The outputs of the two modules are integrated with weights that reflect individual estimates of subjective cue reliability, and are integrated over time to produce a decision about the perceived dynamics of the input data. This model provided a close fit to the behavioral data, suggesting a mechanism in the human visual system that approximates rational Bayesian inference to integrate position and orientation signals in dynamic form analysis.
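The core of the model's cue combination is standard precision-weighted (Bayesian) integration, which can be sketched in a few lines; the numbers and names below are illustrative, not the authors' fitted parameters.

```python
def integrate_cues(pos_estimate, pos_var, ori_estimate, ori_var):
    """Precision-weighted average of position- and orientation-based estimates."""
    w_pos = (1.0 / pos_var) / (1.0 / pos_var + 1.0 / ori_var)
    return w_pos * pos_estimate + (1.0 - w_pos) * ori_estimate

# As orientation uncertainty (ori_var) grows, w_pos -> 1 and position dominates,
# reproducing the trade-off observed in the incongruent-cue conditions.
combined = integrate_cues(pos_estimate=0.8, pos_var=0.1,
                          ori_estimate=0.2, ori_var=0.4)
```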