Virtualising the Quantitative Research Methods Course: An Island-Based Approach
ERIC Educational Resources Information Center
Baglin, James; Reece, John; Baker, Jenalle
2015-01-01
Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…
Quantitative proteomics in cardiovascular research: global and targeted strategies
Shen, Xiaomeng; Young, Rebeccah; Canty, John M.; Qu, Jun
2014-01-01
Extensive technical advances in the past decade have substantially expanded quantitative proteomics in cardiovascular research. This holds great promise for elucidating the mechanisms of cardiovascular disease (CVD) and for discovering cardiac biomarkers for diagnosis and treatment evaluation. Global and targeted proteomics are the two major avenues of quantitative proteomics. While global approaches enable unbiased discovery of altered proteins via relative quantification at the proteome level, targeted techniques provide higher sensitivity and accuracy, and are capable of multiplexed absolute quantification in numerous clinical/biological samples. While promising, technical challenges need to be overcome to enable full utilization of these techniques in cardiovascular medicine. Here we discuss recent advances in quantitative proteomics and summarize applications in cardiovascular research with an emphasis on biomarker discovery and elucidating molecular mechanisms of disease. We propose the integration of global and targeted strategies as a high-throughput pipeline for cardiovascular proteomics. Targeted approaches enable rapid, extensive validation of biomarker candidates discovered by global proteomics. These approaches provide a promising alternative to immunoassays and other low-throughput means currently used for limited validation. PMID:24920501
Neutron multiplicity counting: Confidence intervals for reconstruction parameters
Verbeke, Jerome M.
2016-03-09
From nuclear materials accountability to homeland security, the need for improved nuclear material detection, assay, and authentication has grown over the past decades. Starting in the 1940s, neutron multiplicity counting techniques have enabled quantitative evaluation of masses and multiplications of fissile materials. In this paper, we propose a new method to compute uncertainties on these parameters using a model-based sequential Bayesian processor, resulting in credible regions in the fissile material mass and multiplication space. These uncertainties will enable us to evaluate quantitatively proposed improvements to the theoretical fission chain model. Additionally, because the processor can calculate uncertainties in real time, it is a useful tool in applications such as portal monitoring: monitoring can stop as soon as a preset confidence of non-threat is reached.
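The stopping rule lends itself to a compact illustration. The sketch below is not the authors' model-based processor (which works in the mass-multiplication space); it is a minimal two-hypothesis sequential Bayesian update on Poisson count data, with assumed rates and prior, that stops once a preset non-threat confidence is reached.

```python
import numpy as np
from scipy.stats import poisson

# Minimal sketch, not the paper's processor: sequentially update the
# posterior over two hypotheses (background only vs. source present) from
# one-second Poisson counts; all rates and the prior are assumed values.
RATE_BG, RATE_SRC = 2.0, 5.0   # counts/s under each hypothesis (assumed)
CONFIDENCE = 0.99              # preset non-threat confidence

def bayes_step(prior, counts):
    """Multiply the prior by the Poisson likelihoods and renormalize."""
    like = np.array([poisson.pmf(counts, RATE_BG),
                     poisson.pmf(counts, RATE_SRC)])
    post = prior * like
    return post / post.sum()

rng = np.random.default_rng(0)
posterior = np.array([0.5, 0.5])        # flat prior: P(bg), P(src)
for t in range(60):                     # up to 60 one-second intervals
    posterior = bayes_step(posterior, rng.poisson(RATE_BG))  # background truth
    if posterior[0] >= CONFIDENCE:      # stop as soon as confidence is met
        print(f"non-threat declared after {t + 1} s (P = {posterior[0]:.3f})")
        break
```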
Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero
2011-03-24
High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
[Integral quantitative evaluation of working conditions in the construction industry].
Guseĭnov, A A
1993-01-01
The present method of evaluating environmental quality (using MAC and MAL values) does not enable complete and objective assessment of working conditions in the construction industry, owing to multiple confounding elements. A solution to this complicated problem, which involves analysing the various correlated elements of the "human--work conditions--environment" system, may be supported by a social norm of morbidity that is independent of the industrial and natural environment. The complete integral assessment makes it possible to see the whole situation and to reveal the points of risk.
Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.
2016-01-01
Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933
NASA Astrophysics Data System (ADS)
Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.
2005-03-01
Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.
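For orientation, the explicit force-inversion result most often associated with this work is the Sader-Jarvis formula; the form quoted below follows the general FM-AFM literature rather than this paper's own notation, so consult the original for the authoritative statement. With \( \Omega(t) = \Delta f(t)/f_{\mathrm{res}} \) the normalized frequency shift, \( k \) the cantilever spring constant, and \( a \) the oscillation amplitude:

\[
F(z) = 2k \int_{z}^{\infty} \left[ \left( 1 + \frac{a^{1/2}}{8\sqrt{\pi\,(t-z)}} \right) \Omega(t) \; - \; \frac{a^{3/2}}{\sqrt{2\,(t-z)}} \, \frac{\mathrm{d}\Omega(t)}{\mathrm{d}t} \right] \mathrm{d}t .
\]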
Boo, Chelsea C; Parker, Christine H; Jackson, Lauren S
2018-01-01
Food allergy is a growing public health concern, with many individuals reporting allergies to multiple food sources. Compliance with food labeling regulations and prevention of inadvertent cross-contact in manufacturing requires the use of reliable methods for the detection and quantitation of allergens in processed foods. In this work, a novel liquid chromatography-tandem mass spectrometry multiple-reaction monitoring method for multiallergen detection and quantitation of egg, milk, and peanut was developed and evaluated in an allergen-incurred baked sugar cookie matrix. Method parameters, including sample extraction, concentration, and digestion, were systematically optimized for candidate allergen peptide markers. The optimized method enabled the reliable detection and quantitation of egg, milk, and peanut allergens in sugar cookies, with allergen concentrations as low as 5 ppm allergen-incurred ingredient.
Nondestructive Evaluation for Aerospace Composites
NASA Technical Reports Server (NTRS)
Leckey, Cara; Cramer, Elliott; Perey, Daniel
2015-01-01
Nondestructive evaluation (NDE) techniques are important for enabling NASA's missions in space exploration and aeronautics. The expanded and continued use of composite materials for aerospace components and vehicles leads to a need for advanced NDE techniques capable of quantitatively characterizing damage in composites. Quantitative damage detection techniques help to ensure safety, reliability and durability of space and aeronautic vehicles. This presentation will give a broad outline of NASA's range of technical work and an overview of the NDE research performed in the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center. The presentation will focus on ongoing research in the development of NDE techniques for composite materials and structures, including development of automated data processing tools to turn NDE data into quantitative location and sizing results. Composites focused NDE research in the areas of ultrasonics, thermography, X-ray computed tomography, and NDE modeling will be discussed.
Abildgaard, Johan S.; Saksvik, Per Ø.; Nielsen, Karina
2016-01-01
Organizational interventions aiming at improving employee health and wellbeing have proven to be challenging to evaluate. To analyze intervention processes two methodological approaches have widely been used: quantitative (often questionnaire data), or qualitative (often interviews). Both methods are established tools, but their distinct epistemological properties enable them to illuminate different aspects of organizational interventions. In this paper, we use the quantitative and qualitative process data from an organizational intervention conducted in a national postal service, where the Intervention Process Measure questionnaire (N = 285) as well as an extensive interview study (N = 50) were used. We analyze what type of knowledge about intervention processes these two methodologies provide and discuss strengths and weaknesses as well as potentials for mixed methods evaluation methodologies. PMID:27713707
Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong
2014-07-01
Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable in proteotypic peptide-based quantitative proteomics strategies because of differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions still needed to be conducted. In this study, the practical workflow of ERLPS was explicitly illustrated; different experimental variables, such as MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast, mouse, and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analyses, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
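The published ERLPS procedure is not reproduced in this abstract; as a rough illustration of the underlying idea (retaining only proteotypic peptides whose responses are mutually linearly correlated across samples), one might screen peptides as below. The data layout, the greedy seeding, and the correlation threshold are all assumptions made for the sketch.

```python
import numpy as np

def select_linear_peptides(intensities, r_min=0.99):
    """Illustrative screen (not the published ERLPS rule): keep peptides of
    one protein whose intensity profiles across a sample/dilution series
    correlate linearly (Pearson r >= r_min) with a consensus peptide.

    intensities: (n_peptides, n_samples) array for a single protein.
    """
    r = np.corrcoef(intensities)          # peptide-by-peptide correlations
    seed = int(np.argmax(r.sum(axis=1)))  # peptide most consistent with all
    return [i for i in range(len(r)) if r[seed, i] >= r_min]

# toy example: three linear-response peptides plus one flat outlier
series = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
peptides = np.vstack([2.0 * series,
                      3.0 * series + 0.1,
                      0.5 * series,
                      [5.0, 5.0, 6.0, 5.0, 5.5]])
print(select_linear_peptides(peptides))   # -> [0, 1, 2]
```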
Morimoto, Yoshitaka; Hoshino, Hironobu; Sakurai, Takashi; Terakawa, Susumu; Nagano, Akira
2009-04-01
Quantitative evaluation of bone resorption activity in live osteoclast-like cells (OCLs) has not previously been reported. In this study, we observed the sequential morphological change of OCLs and measured the resorbed calcium phosphate (CP) area made by OCLs alone and with the addition of elcatonin, utilizing incubator-facilitated video-enhanced microscopy. OCLs, which were obtained from a coculture of ddy-mouse osteoblastic cells and bone marrow cells, were cultured on CP-coated quartz cover slips. The CP-free area increased steadily with OCLs alone, whereas it did not increase after the addition of elcatonin. This study showed that analysis of the resorbed areas under the OCL body using this method enables sequential quantitative evaluation of bone resorption activity and of the effect of several therapeutic agents on bone resorption in vitro.
NASA Astrophysics Data System (ADS)
Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung
2015-11-01
Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.
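The abstract does not spell out the computational processing step; a generic way to turn imaging readouts into dose-dependent concentration estimates is a Hill-type dose-response fit, sketched below with synthetic numbers. The model choice, concentrations, and signals are assumptions, not the paper's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(c, f_min, f_max, ec50, n):
    """Hill dose-response curve: signal as a function of concentration."""
    return f_min + (f_max - f_min) * c**n / (ec50**n + c**n)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0])        # mM ATP (synthetic)
signal = np.array([0.05, 0.12, 0.33, 0.62, 0.85, 0.95])  # normalized readout

(f_min, f_max, ec50, n), _ = curve_fit(hill, conc, signal,
                                       p0=[0.0, 1.0, 0.3, 1.0])

def to_concentration(f):
    """Invert the fitted curve: map a measured intensity to concentration."""
    y = np.clip((f - f_min) / (f_max - f_min), 1e-6, 1 - 1e-6)
    return ec50 * (y / (1.0 - y)) ** (1.0 / n)

print(f"EC50 = {ec50:.3f} mM; signal 0.50 -> {to_concentration(0.5):.3f} mM ATP")
```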
Pham Ba, Viet Anh; Cho, Dong-Guk; Kim, Daesan; Yoo, Haneul; Ta, Van-Thao; Hong, Seunghun
2017-08-15
We demonstrated the quantitative electrophysiological monitoring of histamine and antihistamine drug effects on live cells via reusable sensor platforms based on carbon nanotube transistors. This method enabled us to monitor the real-time electrophysiological responses of a single HeLa cell to histamine at different concentrations. The measured electrophysiological responses were attributed to the activity of histamine type 1 receptors on the HeLa cell membrane. Furthermore, the effects of antihistamine drugs such as cetirizine or chlorphenamine on the electrophysiological activities of HeLa cells were also evaluated quantitatively. Significantly, we utilized only a single device to monitor the responses of multiple HeLa cells to each drug, which allowed us to quantitatively analyze antihistamine drug effects on live cells without errors from device-to-device variation in device characteristics. The quantitative evaluation capability of our method promises versatile applications such as drug screening and nanoscale biosensor research. Copyright © 2017 Elsevier B.V. All rights reserved.
Health promotion and sustainability programmes in Australia: barriers and enablers to evaluation.
Patrick, Rebecca; Kingsley, Jonathan
2017-08-01
In an era characterised by the adverse impacts of climate change and environmental degradation, health promotion programmes are beginning to actively link human health with environmental sustainability imperatives. This paper draws on a study of health promotion and sustainability programmes in Australia, providing insights into the evaluation approaches being used and the barriers and enablers to these evaluations. The study was based on a multi-strategy research design involving both quantitative and qualitative methods. Health promotion practitioners explained through surveys and semi-structured interviews that they focused on five overarching health and sustainability programme types (healthy and sustainable food, active transport, energy efficiency, contact with nature, and capacity building). Various evaluation methods and indicators (health, social, environmental, economic and demographic) were identified as being valuable for monitoring and evaluating health and sustainability programmes. Findings identified several evaluation enablers such as successful community engagement, knowledge of health and sustainability issues and programme champions, whereas barriers included resource constraints and competing interests. This paper highlights the need for ecological models and evaluation tools to support the design and monitoring of health promotion and sustainability programmes.
In vivo estimation of target registration errors during augmented reality laparoscopic surgery.
Thompson, Stephen; Schneider, Crispin; Bosi, Michele; Gurusamy, Kurinchi; Ourselin, Sébastien; Davidson, Brian; Hawkes, David; Clarkson, Matthew J
2018-06-01
Successful use of augmented reality for laparoscopic surgery requires that the surgeon has a thorough understanding of the likely accuracy of any overlay. Whilst the accuracy of such systems can be estimated in the laboratory, it is difficult to extend such methods to the in vivo clinical setting. Herein we describe a novel method that enables the surgeon to estimate in vivo errors during use. We show that the method enables quantitative evaluation of in vivo data gathered with the SmartLiver image guidance system. The SmartLiver system utilises an intuitive display to enable the surgeon to compare the positions of landmarks visible in both a projected model and in the live video stream. From this the surgeon can estimate the system accuracy when using the system to locate subsurface targets not visible in the live video. Visible landmarks may be either point or line features. We test the validity of the algorithm using an anatomically representative liver phantom, applying simulated perturbations to achieve clinically realistic overlay errors. We then apply the algorithm to in vivo data. The phantom results show that using projected errors of surface features provides a reliable predictor of subsurface target registration error for a representative human liver shape. Applying the algorithm to in vivo data gathered with the SmartLiver image-guided surgery system shows that the system is capable of accuracies around 12 mm; however, achieving this reliably remains a significant challenge. We present an in vivo quantitative evaluation of the SmartLiver image-guided surgery system, together with a validation of the evaluation algorithm. This is the first quantitative in vivo analysis of an augmented reality system for laparoscopic surgery.
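The on-screen error measure is easy to picture: landmarks visible in both the projected model and the live video are compared in image space, and their residuals stand in for the accuracy at hidden targets. The sketch below (an assumed form with hypothetical numbers, not the SmartLiver implementation) computes an RMS pixel error over point landmarks.

```python
import numpy as np

def projected_rms_error(model_px, video_px):
    """RMS distance between corresponding landmark positions (pixels).
    model_px, video_px: (n, 2) arrays of matched 2D points."""
    d = np.linalg.norm(np.asarray(model_px) - np.asarray(video_px), axis=1)
    return float(np.sqrt(np.mean(d**2)))

# hypothetical landmark positions in the overlay and in the video frame
model = [[120.0, 80.0], [200.0, 150.0], [310.0, 95.0]]
video = [[126.0, 84.0], [195.0, 158.0], [318.0, 90.0]]
print(f"on-screen RMS error: {projected_rms_error(model, video):.1f} px")
```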
Imaging has enormous untapped potential to improve cancer research through software to extract and process morphometric and functional biomarkers. In the era of non-cytotoxic treatment agents, multi-modality image-guided ablative therapies and rapidly evolving computational resources, quantitative imaging software can be transformative in enabling minimally invasive, objective and reproducible evaluation of cancer treatment response. Post-processing algorithms are integral to high-throughput analysis and fine-grained differentiation of multiple molecular targets.
Quantitative Evaluation of the Use of Actigraphy for Neurological and Psychiatric Disorders
Song, Yu; Kwak, Shin; Yoshida, Sohei; Yamamoto, Yoshiharu
2014-01-01
Quantitative and objective evaluation of disease severity and/or drug effect is necessary in clinical practice. Wearable accelerometers such as an actigraph enable long-term recording of a patient's movement during activities and can be used for quantitative assessment of symptoms due to various diseases. We reviewed applications of actigraphy with analytical methods that are sufficiently sensitive and reliable to determine the severity of diseases and disorders such as motor and nonmotor disorders like Parkinson's disease, sleep disorders, depression, behavioral and psychological symptoms of dementia (BPSD) in vascular dementia (VD), seasonal affective disorder (SAD), and stroke, as well as the effects of drugs used to treat them. We believe it is possible to develop analytical methods to assess further neurological or psychiatric disorders using actigraphy records. PMID:25214709
Quantitative aspects of inductively coupled plasma mass spectrometry
NASA Astrophysics Data System (ADS)
Bulska, Ewa; Wagner, Barbara
2016-10-01
Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
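Of the calibration approaches listed, external calibration with pure standards is the simplest to show concretely: fit a line to the count rates of known standards, then invert it for the unknown (matrix-matched standards or standard addition follow the same arithmetic with different solutions). The numbers below are assumed for illustration only.

```python
import numpy as np

# External-calibration sketch with assumed values: linear fit of ICP-MS
# count rate vs. standard concentration, inverted for an unknown sample.
conc_std = np.array([0.0, 1.0, 5.0, 10.0, 20.0])              # ug/L standards
counts = np.array([150.0, 2100.0, 9900.0, 19800.0, 39500.0])  # counts/s

slope, intercept = np.polyfit(conc_std, counts, 1)
unknown = 12500.0                                             # counts/s, sample
print(f"estimated concentration: {(unknown - intercept) / slope:.2f} ug/L")
```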
Methodology for determining the investment attractiveness of construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Nezhnikova, Ekaterina; Kashirin, Valentin; Davydova, Yana; Kazakova, Svetlana
2018-03-01
The article presents an analysis of the existing methods for assessing the investment attractiveness of high-rise construction. The authors determined and justified the primary choice of objects and territories that are the most attractive for the development of high-rise construction. A system of risk indicators has been developed that allows a quantitative adjustment to be made for a particular project when evaluating the efficiency of investment projects. The study is aimed at developing basic methodological concepts for a comparative evaluation of the prospects of construction of high-rise facilities, concepts that take into consideration the features of investment in construction and enable quantitative evaluation of investment effectiveness in high-rise construction.
Addison, P F E; Flander, L B; Cook, C N
2015-02-01
Protected area management agencies are increasingly using management effectiveness evaluation (MEE) to better understand, learn from and improve conservation efforts around the globe. Outcome assessment is the final stage of MEE, where conservation outcomes are measured to determine whether management objectives are being achieved. When quantitative monitoring data are available, best-practice examples of outcome assessments demonstrate that data should be assessed against quantitative condition categories. Such assessments enable more transparent and repeatable integration of monitoring data into MEE, which can promote evidence-based management and improve public accountability and reporting. We interviewed key informants from marine protected area (MPA) management agencies to investigate how scientific data sources, especially long-term biological monitoring data, are currently informing conservation management. Our study revealed that even when long-term monitoring results are available, management agencies are not using them for quantitative condition assessment in MEE. Instead, many agencies conduct qualitative condition assessments, where monitoring results are interpreted using expert judgment only. Whilst we found substantial evidence for the use of long-term monitoring data in the evidence-based management of MPAs, MEE is rarely the sole mechanism that facilitates the knowledge transfer of scientific evidence to management action. This suggests that the first goal of MEE (to enable environmental accountability and reporting) is being achieved, but the second and arguably more important goal of facilitating evidence-based management is not. Given that many MEE approaches are in their infancy, recommendations are made to assist management agencies realize the full potential of long-term quantitative monitoring data for protected area evaluation and evidence-based management. Copyright © 2014 Elsevier Ltd. All rights reserved.
Barrow, Emma; Evans, D Gareth; McMahon, Ray; Hill, James; Byers, Richard
2011-03-01
Lynch Syndrome is caused by mutations in DNA mismatch repair (MMR) genes. Mutation carrier identification is facilitated by immunohistochemical detection of the MMR proteins MLH1 and MSH2 in tumour tissue and is desirable as colonoscopic screening reduces mortality. However, protein detection by conventional immunohistochemistry (IHC) is subjective, and quantitative techniques are required. Quantum dots (QDs) are novel fluorescent labels that enable quantitative multiplex staining. This study compared their use with quantitative 3,3'-diaminobenzidine (DAB) IHC for the diagnosis of Lynch Syndrome. Tumour sections from 36 mutation carriers and six controls were obtained. These were stained with DAB on an automated platform using antibodies against MLH1 and MSH2. Multiplex QD immunofluorescent staining of the sections was performed using antibodies against MLH1, MSH2 and smooth muscle actin (SMA). Multispectral analysis of the slides was performed. The staining intensity of DAB and QDs was measured in multiple colonic crypts, and the mean intensity scores calculated. Receiver operating characteristic (ROC) curves of staining performance for the identification of mutation carriers were evaluated. For quantitative DAB IHC, the area under the MLH1 ROC curve was 0.872 (95% CI 0.763 to 0.981), and the area under the MSH2 ROC curve was 0.832 (95% CI 0.704 to 0.960). For quantitative QD IHC, the area under the MLH1 ROC curve was 0.812 (95% CI 0.681 to 0.943), and the area under the MSH2 ROC curve was 0.598 (95% CI 0.418 to 0.777). Despite the advantage that QD staining enables several markers to be measured simultaneously, it was of lower utility than DAB IHC for the identification of MMR mutation carriers. Automated DAB IHC staining and quantitative slide analysis may enable high-throughput IHC.
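The ROC comparison reported above can be reproduced in outline with standard tooling; the scores below are synthetic stand-ins for the per-case mean staining intensities, since low MMR protein intensity is the carrier signal.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for the study's per-case mean staining intensities:
# 36 mutation carriers (lower MLH1 intensity) and 6 controls (higher).
rng = np.random.default_rng(1)
is_carrier = np.r_[np.ones(36), np.zeros(6)]
intensity = np.r_[rng.normal(0.4, 0.15, 36), rng.normal(0.7, 0.10, 6)]

# negate intensity so that a higher score means "more likely carrier"
auc = roc_auc_score(is_carrier, -intensity)
print(f"area under the ROC curve: {auc:.3f}")
```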
Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials
Dalecki, Diane; Mercado, Karla P.; Hocking, Denise C.
2015-01-01
Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering. PMID:26581347
Kageyama, Shinji; Shinmura, Kazuya; Yamamoto, Hiroko; Goto, Masanori; Suzuki, Koichi; Tanioka, Fumihiko; Tsuneyoshi, Toshihiro; Sugimura, Haruhiko
2008-04-01
The PCR-based DNA fingerprinting method called the methylation-sensitive amplified fragment length polymorphism (MS-AFLP) analysis is used for genome-wide scanning of methylation status. In this study, we developed a method of fluorescence-labeled MS-AFLP (FL-MS-AFLP) analysis by applying a fluorescence-labeled primer and fluorescence-detecting electrophoresis apparatus to the existing method of MS-AFLP analysis. The FL-MS-AFLP analysis enables quantitative evaluation of more than 350 random CpG loci per run. It was shown to allow evaluation of the differences in methylation level of blood DNA of gastric cancer patients and evaluation of hypermethylation and hypomethylation in DNA from gastric cancer tissue in comparison with adjacent non-cancerous tissue.
ERIC Educational Resources Information Center
Thomas, Sarah; Grimes, Darren
2003-01-01
Graduate apprenticeships in a British college's hospitality management course involved integration of key skills and National Vocational Qualifications units. Qualitative and quantitative data from seven students indicated they felt that integration enabled formal recognition of competency, provided valuable managerial experience, and facilitated…
Reliability and precision of pellet-group counts for estimating landscape-level deer density
David S. deCalesta
2013-01-01
This study provides hitherto unavailable methodology for reliably and precisely estimating deer density within forested landscapes, enabling quantitative rather than qualitative deer management. Reliability and precision of the deer pellet-group technique were evaluated in 1 small and 2 large forested landscapes. Density estimates, adjusted to reflect deer harvest and...
Detection of blur artifacts in histopathological whole-slide images of endomyocardial biopsies.
Hang Wu; Phan, John H; Bhatia, Ajay K; Cundiff, Caitlin A; Shehata, Bahig M; Wang, May D
2015-01-01
Histopathological whole-slide images (WSIs) have emerged as an objective and quantitative means for image-based disease diagnosis. However, WSIs may contain acquisition artifacts that affect downstream image feature extraction and quantitative disease diagnosis. We develop a method for detecting blur artifacts in WSIs using distributions of local blur metrics. As features, these distributions enable accurate classification of WSI regions as sharp or blurry. We evaluate our method using over 1000 portions of an endomyocardial biopsy (EMB) WSI. Results indicate that local blur metrics accurately detect blurry image regions.
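The "distributions of local blur metrics" idea can be sketched with one common metric, the variance of the Laplacian, computed per tile; the exact metrics and feature summary used in the paper are not given here, so the choices below are assumptions.

```python
import numpy as np
import cv2

# Sketch of one plausible local blur metric (not the paper's exact metrics):
# tile a grayscale region, score each tile by the variance of its Laplacian,
# and summarize the distribution of tile scores as a feature vector for a
# sharp-vs-blurry classifier.
def blur_features(gray, tile=64):
    scores = []
    for y in range(0, gray.shape[0] - tile + 1, tile):
        for x in range(0, gray.shape[1] - tile + 1, tile):
            patch = gray[y:y + tile, x:x + tile]
            scores.append(cv2.Laplacian(patch, cv2.CV_64F).var())
    s = np.array(scores)
    return np.array([s.mean(), s.std(), *np.percentile(s, [10, 50, 90])])

# synthetic demo: a sharp random texture vs. the same texture after blurring
rng = np.random.default_rng(0)
sharp = (rng.random((256, 256)) * 255).astype(np.uint8)
blurry = cv2.GaussianBlur(sharp, (15, 15), 5)
print("sharp :", blur_features(sharp))
print("blurry:", blur_features(blurry))
```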
Collaborating to improve the use of free-energy and other quantitative methods in drug discovery
NASA Astrophysics Data System (ADS)
Sherborne, Bradley; Shanmugasundaram, Veerabahu; Cheng, Alan C.; Christ, Clara D.; DesJarlais, Renee L.; Duca, Jose S.; Lewis, Richard A.; Loughney, Deborah A.; Manas, Eric S.; McGaughey, Georgia B.; Peishoff, Catherine E.; van Vlijmen, Herman
2016-12-01
In May and August, 2016, several pharmaceutical companies convened to discuss and compare experiences with Free Energy Perturbation (FEP). This unusual synchronization of interest was prompted by Schrödinger's FEP+ implementation and offered the opportunity to share fresh studies with FEP and enable broader discussions on the topic. This article summarizes key conclusions of the meetings, including a path forward of actions for this group to aid the accelerated evaluation, application and development of free energy and related quantitative, structure-based design methods.
NASA Astrophysics Data System (ADS)
Tanii, Takashi; Sasaki, Kosuke; Ichisawa, Kota; Demura, Takanori; Beppu, Yuichi; Vu, Hoan Anh; Thanh Chi, Hoan; Yamamoto, Hideaki; Sato, Yuko
2011-06-01
The adhesive ability of two human pancreatic cancer cell lines was evaluated using organosilane monolayer templates (OMTs). Using the OMT, the spreading area of adhered cells can be limited, and this enables us to focus on the initial attachment process of adhesion. Moreover, it becomes possible to arrange the cells in an array and to quantitatively evaluate the number of attached cells. The adhesive ability of the cancer cells cultured on the OMT was controlled by adding (-)-epigallocatechin-3-gallate (EGCG), which blocks a receptor that mediates cell adhesion and is overexpressed in cancer cells. Measurement of the relative ability of the cancer cells to attach to the OMT revealed that the ability for attachment decreased with increasing EGCG concentration. The results agreed well with the western blot analysis, indicating that the OMT can potentially be employed to evaluate the adhesive ability of various cancer cells.
Quantitative nondestructive evaluation: Requirements for tomorrow's reliability
NASA Technical Reports Server (NTRS)
Heyman, Joseph S.
1991-01-01
Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has impact on everyday life from the cars we drive, the planes we fly, the buildings we work or live in, literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost effective tomorrow. Specific technologies that are discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements result in enabling measurement technologies, which in turn change the boundaries of design/practice.
ERIC Educational Resources Information Center
Islam, Muhammad Faysal
2013-01-01
Cloud computing offers the advantage of on-demand, reliable and cost efficient computing solutions without the capital investment and management resources to build and maintain in-house data centers and network infrastructures. Scalability of cloud solutions enable consumers to upgrade or downsize their services as needed. In a cloud environment,…
Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.
ERIC Educational Resources Information Center
Lindahl, William H.; Gardner, James H.
Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…
HuMOVE: a low-invasive wearable monitoring platform in sexual medicine.
Ciuti, Gastone; Nardi, Matteo; Valdastri, Pietro; Menciassi, Arianna; Basile Fasolo, Ciro; Dario, Paolo
2014-10-01
To investigate an accelerometer-based wearable system, named the Human Movement (HuMOVE) platform, designed to enable quantitative and continuous measurement of sexual performance with minimal invasiveness and inconvenience for users. Design, implementation, and development of HuMOVE, a wearable platform equipped with an accelerometer sensor for monitoring inertial parameters for sexual performance assessment and diagnosis, were performed. The system enables quantitative measurement of movement parameters during sexual intercourse, meeting the requirements of wearability, data storage, sampling rate, and interfacing methods, which are fundamental for human sexual intercourse performance analysis. HuMOVE was validated through characterization using a controlled experimental test bench and evaluated in a human model under simulated sexual intercourse conditions. HuMOVE proved to be a robust and quantitative monitoring platform and a reliable candidate for sexual performance evaluation and diagnosis. Characterization analysis on the controlled experimental test bench demonstrated an accurate correlation between the HuMOVE system and data from a reference displacement sensor. Experimental tests in the human model under simulated intercourse conditions confirmed the accuracy of the sexual performance evaluation platform and the effectiveness of the selected and derived parameters. The outcomes also met the project expectations in terms of usability and comfort, as evidenced by questionnaires that highlighted the low invasiveness and high acceptance of the device. To the best of our knowledge, the HuMOVE platform is the first device for human sexual performance analysis compatible with sexual intercourse; the system has the potential to be a helpful tool for physicians to accurately classify sexual disorders, such as premature or delayed ejaculation. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Sumiya, H.; Hamaki, K.; Harano, K.
2018-05-01
Ultra-hard and high-strength spherical indenters with high precision and sphericity were successfully prepared from nanopolycrystalline diamond (NPD) synthesized by direct conversion sintering from graphite under high pressure and high temperature. It was shown that highly accurate and stable microfracture strength tests can be performed on various super-hard diamond materials by using the NPD spherical indenters. It was also verified that this technique enables quantitative evaluation of the strength characteristics of single crystal diamonds and NPDs which have been quite difficult to evaluate.
Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L
2018-01-01
Aims: A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods: We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results: Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions: The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
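A minimal numerical rendering of the Bland-Altman and regression reasoning, with synthetic assay values: the bias and limits of agreement expose constant error, while a fitted slope away from 1 flags proportional error (Deming regression, which permits error in both variables, would replace the simple fit in an assay-comparison study).

```python
import numpy as np

# Bland-Altman sketch for two quantitative assays measuring the same samples
# (synthetic values): bias = mean difference, limits of agreement = bias
# +/- 1.96 SD; a nonzero bias indicates constant error between methods.
ref = np.array([5.1, 12.0, 19.8, 31.0, 44.9, 60.2])   # validated assay
new = np.array([5.6, 12.9, 20.1, 32.2, 45.8, 61.5])   # assay under validation

diff = new - ref
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.2f}; limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}]")

# simple linear regression: slope != 1 suggests proportional error,
# intercept != 0 suggests constant error
slope, intercept = np.polyfit(ref, new, 1)
print(f"slope = {slope:.3f}, intercept = {intercept:.3f}")
```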
NASA Astrophysics Data System (ADS)
Pohl, L.; Kaiser, M.; Ketelhut, S.; Pereira, S.; Goycoolea, F.; Kemper, Björn
2016-03-01
Digital holographic microscopy (DHM) enables high-resolution non-destructive inspection of technical surfaces and minimally-invasive label-free live cell imaging. However, the analysis of confluent cell layers represents a challenge, as quantitative DHM phase images in this case do not provide sufficient information for image segmentation, determination of the cellular dry mass, or calculation of the cell thickness. We present novel strategies for the analysis of confluent cell layers with quantitative DHM phase contrast utilizing a histogram-based evaluation procedure. The applicability of our approach is illustrated by quantification of drug-induced cell morphology changes, and the method is shown to reliably quantify global morphology changes of confluent cell layers.
Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher
2018-03-07
Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction to the key theories describing polymer brush regimes, a user's guide is provided to estimating maximum chain coverage and, importantly, to examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided by carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical access guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid the reporting of grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for the determination of grafting density, in particular on single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
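Of the three approaches examined, the dry-thickness route reduces to a one-line relation widely used in the brush literature, sigma = h * rho * N_A / Mn; the sketch below folds in the unit conversions, and the layer parameters are assumed example values rather than data from the paper.

```python
AVOGADRO = 6.02214076e23  # 1/mol

def grafting_density(h_nm, rho_g_cm3, mn_g_mol):
    """Grafting density (chains/nm^2) from dry layer thickness h (nm),
    bulk density rho (g/cm^3) and number-average molar mass Mn (g/mol):
    sigma = h * rho * N_A / Mn, where the 1e-21 factor converts the
    nm * g/cm^3 product into chains per nm^2."""
    return h_nm * rho_g_cm3 * AVOGADRO / mn_g_mol * 1e-21

# assumed example: 10 nm dry polystyrene layer, rho ~ 1.05 g/cm^3, Mn = 10 kg/mol
print(f"sigma = {grafting_density(10.0, 1.05, 10_000.0):.2f} chains/nm^2")
```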
Rotorcraft Conceptual Design Environment
NASA Technical Reports Server (NTRS)
Johnson, Wayne; Sinsay, Jeffrey
2009-01-01
Requirements for a rotorcraft conceptual design environment are discussed, from the perspective of a government laboratory. Rotorcraft design work in a government laboratory must support research, by producing technology impact assessments and defining the context for research and development; and must support the acquisition process, including capability assessments and quantitative evaluation of designs, concepts, and alternatives. An information manager that will enable increased fidelity of analysis early in the design effort is described. This manager will be a framework to organize information that describes the aircraft, and enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.
Shi, Yin; Zong, Min; Xu, Xiaoquan; Zou, Yuefen; Feng, Yang; Liu, Wei; Wang, Chuanbing; Wang, Dehang
2015-04-01
To quantitatively evaluate nerve roots by measuring fractional anisotropy (FA) values in healthy volunteers and sciatica patients, to visualize nerve roots by tractography, and to compare the diagnostic efficacy of conventional magnetic resonance imaging (MRI) and diffusion tensor imaging (DTI). Seventy-five sciatica patients and thirty-six healthy volunteers underwent MR imaging using DTI. FA values for the L5-S1 lumbar nerve roots were calculated at three levels from the DTI images. Tractography was performed on the L3-S1 nerve roots. ROC analysis was performed for the FA values. The lumbar nerve roots were visualized and FA values were calculated in all subjects. FA values decreased in compressed nerve roots and declined from proximal to distal along the compressed nerve tracts. Mean FA values were more sensitive and specific than MR imaging for differentiating compressed nerve roots, especially in the far lateral zone at distal nerves. DTI can quantitatively evaluate compressed nerve roots, and diffusion tensor tractography (DTT) enables visualization of abnormal nerve tracts, providing vivid anatomic information and localization of probable nerve compression. DTI has great potential utility for evaluating lumbar nerve compression in sciatica. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Kitamura, Yutaka; Isobe, Kazushige; Kawabata, Hideo; Tsujino, Tetsuhiro; Watanabe, Taisuke; Nakamura, Masayuki; Toyoda, Toshihisa; Okudera, Hajime; Okuda, Kazuhiro; Nakata, Koh; Kawase, Tomoyuki
2018-06-18
Platelet activation and aggregation have conventionally been evaluated using an aggregometer. However, this method is suitable for short-term but not long-term quantitative evaluation of platelet aggregation, morphological changes, and/or adhesion to specific materials. The recently developed digital holographic microscopy (DHM) has enabled the quantitative evaluation of cell size and morphology without labeling or destruction. Thus, we aimed to validate its applicability in quantitatively evaluating changes in cell morphology, especially the aggregation and spreading of activated platelets, modifying typical image analysis procedures to suit aggregated platelets. Freshly prepared platelet-rich plasma was washed with phosphate-buffered saline and treated with 0.1% CaCl2. Platelets were then fixed and subjected to DHM, scanning electron microscopy (SEM), atomic force microscopy, optical microscopy, and flow cytometry (FCM). Tightly aggregated platelets were identified as single cells. Data obtained from time-course experiments were plotted two-dimensionally according to average optical thickness versus attachment area and divided into four regions. The majority of the control platelets, which supposedly contained small and round platelets, were distributed in the lower left region. As activation time increased, however, this population dispersed toward the upper right region. The distribution shift demonstrated by DHM was essentially consistent with data obtained from SEM and FCM. Therefore, DHM was validated as a promising device for testing platelet function, given that it allows for the quantitative evaluation of activation-dependent morphological changes in platelets. DHM technology will be applicable to the quality assurance of platelet concentrates, as well as to diagnosis and drug discovery related to platelet functions. Copyright © 2018 Elsevier Ltd. All rights reserved.
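The two-dimensional gating described above (average optical thickness versus attachment area, split into four regions) is straightforward to express in code; the cut-off values below are placeholders, not the thresholds used in the study.

```python
import numpy as np

def quadrant_counts(area_um2, thickness_um, area_cut=20.0, thick_cut=0.5):
    """Count platelet objects per quadrant of the (area, thickness) plane.
    Cut-offs are placeholder values; resting platelets are expected in the
    lower left (small, thin), activated aggregates in the upper right."""
    a, t = np.asarray(area_um2), np.asarray(thickness_um)
    return {"lower_left": int(((a < area_cut) & (t < thick_cut)).sum()),
            "upper_right": int(((a >= area_cut) & (t >= thick_cut)).sum()),
            "lower_right": int(((a >= area_cut) & (t < thick_cut)).sum()),
            "upper_left": int(((a < area_cut) & (t >= thick_cut)).sum())}

print(quadrant_counts([8.0, 12.0, 35.0, 50.0], [0.3, 0.4, 0.7, 0.9]))
```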
Evaluating ICT Integration in Turkish K-12 Schools through Teachers' Views
ERIC Educational Resources Information Center
Aydin, Mehmet Kemal; Gürol, Mehmet; Vanderlinde, Ruben
2016-01-01
The current study aims to explore ICT integration in Turkish K-12 schools purposively selected as a representation of F@tih and non-F@tih public schools together with a private school. A convergent mixed methods design was employed with a multiple case strategy, as this enables casewise comparisons. The quantitative data was…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
.... This model will enable the Sub-Adviser to evaluate, rank, and select the appropriate mix of investments... becoming, risky. The Sub-Adviser will use a quantitative metric to rank and select the appropriate mix of... imposes a duty of due diligence on its Equity Trading Permit Holders to learn the essential facts relating...
Hong, Seokpyo; Ahn, Kilsoo; Kim, Sungjune; Gong, Sungyong
2015-01-01
This study presents a methodology that enables a quantitative assessment of green chemistry technologies. The study carries out a quantitative evaluation of a particular case of material reutilization by calculating the level of "greenness", i.e., the level of compliance with the principles of green chemistry that was achieved by implementing a green chemistry technology. The results indicate that the greenness level was enhanced by 42% compared to the pre-improvement level, thus demonstrating the economic feasibility of green chemistry. The assessment technique established in this study will serve as a useful reference for setting the direction of industry-level and government-level technological R&D and for evaluating newly developed technologies, which can greatly contribute toward gaining a competitive advantage in the global market.
Estimation of 3D reconstruction errors in a stereo-vision system
NASA Astrophysics Data System (ADS)
Belhaoua, A.; Kohler, S.; Hirsch, E.
2009-06-01
The paper presents an approach to error estimation for the various steps of an automated 3D vision-based reconstruction procedure for manufactured workpieces. The process is based on a priori planning of the task and built around a cognitive intelligent sensory system using so-called Situation Graph Trees (SGT) as a planning tool. Such an automated quality control system requires the coordination of a set of complex processes performing sequentially data acquisition, its quantitative evaluation and the comparison with a reference model (e.g., a CAD object model) in order to evaluate the object quantitatively. To ensure efficient quality control, the aim is to be able to state whether reconstruction results fulfill tolerance rules or not. Thus, the goal is to evaluate independently the error for each step of the stereo-vision based 3D reconstruction (e.g., for calibration, contour segmentation, matching and reconstruction) and then to estimate the error for the whole system. In this contribution, we analyze in particular the segmentation error due to localization errors for extracted edge points supposed to belong to lines and curves composing the outline of the workpiece under evaluation. The fitting parameters describing these geometric features are used as a quality measure to determine confidence intervals and finally to estimate the segmentation errors. These errors are then propagated through the whole reconstruction procedure, enabling evaluation of their effect on the final 3D reconstruction result, specifically on position uncertainties. Lastly, analysis of these error estimates enables evaluation of the quality of the 3D reconstruction, as illustrated by the experimental results shown.
A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
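Accuracy and precision metrics of the LFQbench kind can be summarized directly from the known composition of hybrid proteome samples: accuracy as the deviation of the measured log-ratio from the expected mixing ratio, precision as its spread. A minimal sketch (the intensity arrays and the 2:1 ratio are placeholders, not LFQbench's actual API or data):

```python
import numpy as np

def lfq_metrics(intensity_a, intensity_b, expected_ratio):
    """Accuracy/precision summary for one species of a hybrid proteome sample.

    intensity_a, intensity_b: per-protein intensities in samples A and B.
    expected_ratio:           known A:B mixing ratio for this species.
    """
    log_ratio = np.log2(np.asarray(intensity_a) / np.asarray(intensity_b))
    accuracy = np.median(log_ratio) - np.log2(expected_ratio)  # bias vs. truth
    precision = np.std(log_ratio, ddof=1)                      # dispersion
    return accuracy, precision

# Hypothetical yeast proteins spiked at a 2:1 ratio between samples A and B
rng = np.random.default_rng(1)
a = rng.lognormal(mean=10.0, sigma=1.0, size=500)
b = a / 2.0 * rng.lognormal(mean=0.0, sigma=0.15, size=500)
bias, spread = lfq_metrics(a, b, expected_ratio=2.0)
print(f"median log2-ratio bias: {bias:.3f}, sd: {spread:.3f}")
```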
Ultrasound arthroscopy of human knee cartilage and subchondral bone in vivo.
Liukkonen, Jukka; Lehenkari, Petri; Hirvasniemi, Jukka; Joukainen, Antti; Virén, Tuomas; Saarakkala, Simo; Nieminen, Miika T; Jurvelin, Jukka S; Töyräs, Juha
2014-09-01
Arthroscopic ultrasound imaging enables quantitative evaluation of articular cartilage. However, the potential of this technique for evaluation of subchondral bone has not been investigated in vivo. In this study, we address this issue in clinical arthroscopy of the human knee (n = 11) by determining quantitative ultrasound (9 MHz) reflection and backscattering parameters for cartilage and subchondral bone. Furthermore, in each knee, seven anatomical sites were graded using the International Cartilage Repair Society (ICRS) system based on (i) conventional arthroscopy and (ii) ultrasound images acquired in arthroscopy with a miniature transducer. Ultrasound enabled visualization of articular cartilage and subchondral bone. ICRS grades based on ultrasound images were higher (p < 0.05) than those based on conventional arthroscopy. The higher ultrasound-based ICRS grades were expected, as ultrasound reveals additional information on, for example, the relative depth of a lesion. In line with previous literature, ultrasound reflection and scattering in cartilage varied significantly (p < 0.05) along the ICRS scale. However, no significant correlation between ultrasound parameters and the structure or density of subchondral bone could be demonstrated. To conclude, arthroscopic ultrasound imaging had a significant effect on the clinical grading of cartilage and was found to provide quantitative information on cartilage. The lack of correlation between the ultrasound parameters and bone properties may be related to minor bone changes, or to excessive attenuation in the overlying cartilage combined with the insufficient power of the applied miniature transducer. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L
2018-02-01
A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R²), using R² as the primary metric of assay agreement. However, the use of R² alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
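The Bland-Altman statistics reduce to a mean difference (bias) and its limits of agreement. A minimal sketch comparing paired outputs from two assays (the variant-allele-fraction values are invented for illustration):

```python
import numpy as np

def bland_altman(m1, m2):
    """Bias and 95% limits of agreement between two paired measurement sets."""
    m1, m2 = np.asarray(m1, float), np.asarray(m2, float)
    diff = m1 - m2
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)  # 95% limits of agreement around the bias
    return bias, bias - loa, bias + loa

# Hypothetical paired variant allele fractions: reference method vs. NGS assay
ref = np.array([0.05, 0.10, 0.20, 0.25, 0.40, 0.50])
ngs = np.array([0.06, 0.09, 0.22, 0.27, 0.38, 0.52])
bias, lo, hi = bland_altman(ngs, ref)
print(f"bias = {bias:+.3f}, limits of agreement = [{lo:+.3f}, {hi:+.3f}]")
```

A constant error shows up as a nonzero bias, while a proportional error appears as a trend of the differences against the measurement means.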
Design Standards for Engineered Tissues
Nawroth, Janna C.; Parker, Kevin Kit
2013-01-01
Traditional technologies are required to meet specific, quantitative standards of safety and performance. In tissue engineering, similar standards will have to be developed to enable routine clinical use and customized tissue fabrication. In this essay, we discuss a framework of concepts leading toward general design standards for tissue engineering, focusing in particular on systematic design strategies, control of cell behavior, physiological scaling, fabrication modes, and functional evaluation. PMID:23267860
Quantitative Evaluation of Performance during Robot-assisted Treatment.
Peri, E; Biffi, E; Maghini, C; Servodio Iammarrone, F; Gagliardi, C; Germiniasi, C; Pedrocchi, A; Turconi, A C; Reni, G
2016-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The great potential of robots in extracting quantitative and meaningful data is not always exploited in clinical practice. The aim of the present work is to describe a simple parameter to assess the performance of subjects during upper limb robotic training, exploiting data automatically recorded by the robot, with no additional effort for patients and clinicians. Fourteen children affected by cerebral palsy (CP) underwent training with the Armeo®Spring. Each session was evaluated with P, a simple parameter that depends on the overall performance recorded, and median and interquartile values were computed to perform a group analysis. Median (interquartile) values of P significantly increased from 0.27 (0.21) at T0 to 0.55 (0.27) at T1. This improvement was functionally validated by a significant increase of the Melbourne Assessment of Unilateral Upper Limb Function. The parameter described here was able to show variations in performance over time and enabled a quantitative evaluation of motion abilities in a way that is consistent with a well-known clinical scale.
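The group analysis described (median and interquartile values, with a paired nonparametric test for the T0-to-T1 change) can be sketched as follows; the per-child values of P are invented, chosen only to echo the reported medians:

```python
import numpy as np
from scipy import stats

# Hypothetical performance parameter P for 14 children at start (T0) and end (T1)
p_t0 = np.array([0.21, 0.30, 0.25, 0.27, 0.35, 0.18, 0.29, 0.26,
                 0.31, 0.22, 0.28, 0.24, 0.33, 0.27])
p_t1 = p_t0 + np.array([0.30, 0.25, 0.28, 0.22, 0.35, 0.27, 0.31, 0.26,
                        0.24, 0.33, 0.29, 0.21, 0.36, 0.28])

for label, p in (("T0", p_t0), ("T1", p_t1)):
    q1, med, q3 = np.percentile(p, [25, 50, 75])
    print(f"{label}: median {med:.2f} (IQR {q3 - q1:.2f})")

# Paired nonparametric test for the T0 -> T1 change
stat, pval = stats.wilcoxon(p_t0, p_t1)
print(f"Wilcoxon signed-rank p = {pval:.4f}")
```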
Boundary cooled rocket engines for space storable propellants
NASA Technical Reports Server (NTRS)
Kesselring, R. C.; Mcfarland, B. L.; Knight, R. M.; Gurnitz, R. N.
1972-01-01
An evaluation of an existing analytical heat transfer model was made to develop the technology of boundary film/conduction-cooled rocket thrust chambers for the space-storable propellant combination oxygen difluoride/diborane. Critical design parameters were identified and their importance determined. Test data reduction methods were developed to enable data obtained from short-duration hot firings with a thin-walled (calorimeter) chamber to be used to quantitatively evaluate the heat-absorbing capability of the vapor film. The modification of the existing like-doublet injector was based on the results obtained from the calorimeter firings.
Quantitative evaluation of brain development using anatomical MRI and diffusion tensor imaging
Oishi, Kenichi; Faria, Andreia V.; Yoshida, Shoko; Chang, Linda; Mori, Susumu
2013-01-01
The development of the brain is structure-specific, and the growth rate of each structure differs depending on the age of the subject. Magnetic resonance imaging (MRI) is often used to evaluate brain development because of the high spatial resolution and contrast that enable the observation of structure-specific developmental status. Currently, most clinical MRIs are evaluated qualitatively to assist in clinical decision-making and diagnosis. The clinical MRI report usually does not provide quantitative values that can be used to monitor developmental status. Recently, the importance of image quantification to detect and evaluate mild-to-moderate anatomical abnormalities has been emphasized because these alterations are possibly related to several psychiatric disorders and learning disabilities. In the research arena, structural MRI and diffusion tensor imaging (DTI) have been widely applied to quantify brain development in the pediatric population. To interpret the values from these MR modalities, a "growth percentile chart," which describes the mean and standard deviation of the normal developmental curve for each anatomical structure, is required. Although efforts have been made to create such growth percentile charts based on MRI and DTI, one of the greatest challenges is to standardize the anatomical boundaries of the measured structures. To avoid inter- and intra-reader variability in anatomical boundary definition, and hence to increase the precision of quantitative measurements, an automated structure parcellation method, customized for the neonatal and pediatric population, has been developed. This method enables quantification of multiple MR modalities within a common analytic framework. In this paper, we introduce the attempt to create MRI- and DTI-based growth percentile charts, followed by an application to investigate developmental abnormalities related to cerebral palsy, Williams syndrome, and Rett syndrome. Future directions include multimodal image analysis and personalization for clinical application. PMID:23796902
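A growth percentile chart reduces to an age-dependent mean curve and standard deviation per structure, against which a new measurement can be z-scored. A minimal sketch (the quadratic age model and synthetic volumes are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical normative data: structure volume (ml) vs. age (years)
age = rng.uniform(0.0, 6.0, 200)
volume = 10.0 + 4.0 * age - 0.3 * age**2 + rng.normal(0.0, 1.0, age.size)

# Fit the mean developmental curve and estimate the residual SD
coef = np.polyfit(age, volume, deg=2)
resid_sd = np.std(volume - np.polyval(coef, age), ddof=3)

def growth_z_score(subject_age, subject_volume):
    """z-score a subject against the normative developmental curve."""
    return (subject_volume - np.polyval(coef, subject_age)) / resid_sd

print(f"z = {growth_z_score(3.0, 12.0):+.2f}")  # below/above the norm
```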
Informatics methods to enable sharing of quantitative imaging research data.
Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L
2012-11-01
The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable sharing data and to promote reuse of quantitative imaging data in the community. We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data, including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image meta-data across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.
Ito, Kazunari; Gomi, Katsuya; Kariyama, Masahiro; Miyake, Tsuyoshi
2017-07-01
The construction of an experimental system that can mimic koji making in the manufacturing setting of a sake brewery is initially required for the quantitative evaluation of mycelia grown on/in koji pellets (haze formation). Koji making with rice was investigated with a solid-state fermentation (SSF) system using a non-airflow box (NAB), which produced uniform conditions in the culture substrate with high reproducibility and allowed for the control of favorable conditions in the substrate during culture. The SSF system using the NAB accurately reproduced koji making in a manufacturing setting. To evaluate haze formation during koji making, surfaces and cross sections of koji pellets obtained from koji making tests were observed using a digital microscope. Image analysis was used to distinguish between haze and non-haze sections of koji pellets, enabling the evaluation of haze formation in a batch by measuring the haze rate of a specific number of koji pellets. This method allowed us to obtain continuous and quantitative data on the time course of haze formation. Moreover, drying koji during the late stage of koji making was revealed to cause further penetration of mycelia into koji pellets (internal haze). The koji making test with the SSF system using the NAB, combined with quantitative evaluation of haze formation in a batch by image analysis, is a useful method for understanding the relations between haze formation and koji making conditions. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
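The batch-level haze rate amounts to segmenting bright, mycelium-covered pixels from darker rice and averaging the resulting fractions over pellets. A minimal sketch with a simple brightness threshold (the threshold and synthetic images are illustrative; the paper's image analysis is more elaborate):

```python
import numpy as np

def haze_rate(gray_image, threshold=0.6):
    """Fraction of pellet pixels classified as haze (bright mycelium)."""
    img = np.asarray(gray_image, dtype=float)
    return float((img > threshold).mean())

# Hypothetical grayscale pellet images (0 = dark rice, 1 = white mycelium)
rng = np.random.default_rng(3)
pellets = [np.clip(rng.normal(loc, 0.1, (64, 64)), 0, 1) for loc in (0.4, 0.55, 0.7)]
rates = [haze_rate(p) for p in pellets]
print("haze rate per pellet:", [f"{r:.2f}" for r in rates])
print(f"batch haze rate: {np.mean(rates):.2f}")
```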
Yamamoto, Tatsuki; Miura, Chihiro; Fuji, Masako; Nagata, Shotaro; Otani, Yuria; Yagame, Takahiro; Yamato, Masahide; Kaminaka, Hironori
2017-02-21
In nature, orchid plants depend completely on symbiotic fungi for their nutrition at the germination and subsequent seedling (protocorm) stages. However, only limited quantitative methods for evaluating the orchid-fungus interactions at the protocorm stage are currently available, which greatly constrains our understanding of the symbiosis. Here, we aimed to improve and integrate quantitative evaluations of growth and fungal colonization in the protocorms of a terrestrial orchid, Bletilla striata, growing on a plate medium. We achieved both symbiotic and asymbiotic germination for the terrestrial orchid B. striata. The protocorms produced by the two germination methods grew almost synchronously for the first three weeks. At week four, however, the length was significantly lower in the symbiotic protocorms. Interestingly, the dry weight of symbiotic protocorms did not significantly change during the growth period, which implies that there was only limited transfer of carbon compounds from the fungus to the protocorms in this relationship. Next, to evaluate the orchid-fungus interactions, we developed an ink-staining method to observe the hyphal coils in protocorms without preparing thin sections. Crushing the protocorm under the cover glass enables us to observe all hyphal coils in the protocorms at high resolution. For this observation, we established a criterion to categorize the stages of hyphal coils, depending on development and degradation. By counting the symbiotic cells within each stage, it was possible to quantitatively evaluate the orchid-fungus symbiosis. We describe a method for quantitative evaluation of orchid-fungus symbiosis by integrating measurements of plant growth and fungal colonization. The current study revealed that although fungal colonization was observed in the symbiotic protocorms, the weight of the protocorms did not significantly increase, which is probably due to the incompatibility of the fungus in this symbiosis. These results suggest that fungal colonization and nutrient transfer can be differentially regulated in the symbiosis. The evaluation methods developed in this study can be used to study various quantitative aspects of the orchid-fungus symbiosis.
Bokulich, Nicholas A.
2013-01-01
Ultra-high-throughput sequencing (HTS) of fungal communities has been restricted by short read lengths and primer amplification bias, slowing the adoption of newer sequencing technologies for fungal community profiling. To address these issues, we evaluated the performance of several common internal transcribed spacer (ITS) primers and designed a novel primer set and work flow for simultaneous quantification and species-level interrogation of fungal consortia. Primers were compared and validated in silico and by sequencing a "mock community" of mixed yeast species to explore the challenges of amplicon length and amplification bias for reconstructing defined yeast community structures. The amplicon size and distribution of this primer set are smaller than for all preexisting ITS primer sets, maximizing sequencing coverage of hypervariable ITS domains by very-short-amplicon, high-throughput sequencing platforms. This feature also enables the optional integration of quantitative PCR (qPCR) directly into the HTS preparatory work flow by substituting qPCR with these primers for standard PCR, yielding quantification of individual community members. The complete work flow described here, utilizing any of the qualified primer sets evaluated, can rapidly profile mixed fungal communities, and it capably reconstructed well-characterized beer and wine fermentation fungal communities. PMID:23377949
Solheim, Elisabeth; Plathe, Hilde Syvertsen; Eide, Hilde
2017-11-01
Clinical skills training is an important part of nurses' education programmes. Clinical skills are complex, and a common understanding of what characterizes clinical skills and learning outcomes needs to be established. The aim of the study was to develop and evaluate a new reflection and feedback tool for formative assessment. The study has a descriptive quantitative design. A total of 129 students participated, all at the end of the first year of a Bachelor degree in nursing. After high-fidelity simulation, data were collected using a questionnaire with 19 closed-ended and 2 open-ended questions. The tool stimulated peer assessment and enabled students to be more thorough in what to assess as an observer in clinical skills. The tool provided a structure for self-assessment and made visible items that are important to be aware of in clinical skills. This article adds to the simulation literature and provides a tool that is useful in enhancing peer learning, which is essential for nurses in practice. The tool has potential for enabling students to learn about reflection and to develop skills for guiding others in practice after they have graduated. Copyright © 2017 Elsevier Ltd. All rights reserved.
Quantitative nephelometry test (MedlinePlus: //medlineplus.gov/ency/article/003545.htm)
Quantitative nephelometry is a lab test to quickly and ...
Myzithras, Maria; Li, Hua; Bigwarfe, Tammy; Waltz, Erica; Gupta, Priyanka; Low, Sarah; Hayes, David B; MacDonnell, Scott; Ahlberg, Jennifer; Franti, Michael; Roberts, Simon
2016-03-01
Four bioanalytical platforms were evaluated to optimize sensitivity and enable detection of recombinant human GDF11 in biological matrices: ELISA, Meso Scale Discovery, Gyrolab xP Workstation and Simoa HD-1. Results & methodology: After completion of custom assay development, the single-molecule ELISA (Simoa) achieved the greatest sensitivity, with a lower limit of quantitation of 0.1 ng/ml, an improvement of 100-fold over the next most sensitive platform (MSD). This improvement was essential to enable detection of GDF11 in biological samples; without this technology, the sensitivity achieved on the other platforms would not have been sufficient. Other factors such as ease of use, cost, assay time and automation capability can also be considered when developing custom immunoassays, based on the requirements of the bioanalyst.
Shinde, V; Burke, K E; Chakravarty, A; Fleming, M; McDonald, A A; Berger, A; Ecsedy, J; Blakemore, S J; Tirrell, S M; Bowman, D
2014-01-01
Immunohistochemistry-based biomarkers are commonly used to understand target inhibition in key cancer pathways in preclinical models and clinical studies. Automated slide-scanning and advanced high-throughput image analysis software technologies have evolved into a routine methodology for quantitative analysis of immunohistochemistry-based biomarkers. Alongside the traditional pathology H-score based on physical slides, the pathology world is welcoming digital pathology and advanced quantitative image analysis, which have enabled tissue- and cellular-level analysis. An automated workflow was implemented that includes automated staining, slide-scanning, and image analysis methodologies to explore biomarkers involved in 2 cancer targets: Aurora A and NEDD8-activating enzyme (NAE). The 2 workflows highlight the evolution of our immunohistochemistry laboratory and the different needs and requirements of each biological assay. Skin biopsies obtained from MLN8237 (Aurora A inhibitor) phase 1 clinical trials were evaluated for mitotic and apoptotic index, while mitotic index and defects in chromosome alignment and spindles were assessed in tumor biopsies to demonstrate Aurora A inhibition. Additionally, in both preclinical xenograft models and an acute myeloid leukemia phase 1 trial of the NAE inhibitor MLN4924, development of a novel image algorithm enabled measurement of downstream pathway modulation upon NAE inhibition. In the highlighted studies, developing a biomarker strategy based on automated image analysis solutions enabled project teams to confirm target and pathway inhibition and understand downstream outcomes of target inhibition with increased throughput and quantitative accuracy. These case studies demonstrate a strategy that combines a pathologist's expertise with automated image analysis to support oncology drug discovery and development programs.
NASA Astrophysics Data System (ADS)
Sibai, Mira; Fisher, Carl; Veilleux, Israel; Elliott, Jonathan T.; Leblond, Frederic; Roberts, David W.; Wilson, Brian C.
2017-07-01
5-Aminolevulinic acid-induced protoporphyrin IX (PpIX) fluorescence-guided resection (FGR) enables maximum safe resection of glioma by providing real-time tumor contrast. However, subjective visual assessment and the variable intrinsic optical attenuation of tissue limit this technique to reliably delineating only high-grade tumors that display strong fluorescence. We have previously shown, using a fiber-optic probe, that quantitative assessment using noninvasive point spectroscopic measurements of the absolute PpIX concentration in tissue further improves the accuracy of FGR, extending it to surgically curable low-grade glioma. More recently, we have shown that implementing spatial frequency domain imaging with a fluorescent-light transport model enables recovery of two-dimensional images of [PpIX], alleviating the need for time-consuming point sampling of the brain surface. We present the first results of this technique, modified for in vivo imaging, on an RG2 rat brain tumor model. Despite moderate errors of 14% and 19%, respectively, in retrieving the absorption and reduced scattering coefficients in the subdiffusive regime, the recovered [PpIX] maps agree within 10% of the point [PpIX] values measured by the fiber-optic probe, validating the method's potential as an extension of, or an alternative to, point sampling during glioma resection.
Gaass, Thomas; Schneider, Moritz Jörg; Dietrich, Olaf; Ingrisch, Michael; Dinkel, Julien
2017-04-01
Variability across devices, patients, and time still hinders widespread recognition of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) as a quantitative biomarker. The purpose of this work was to introduce and characterize a dedicated microchannel phantom as a model for quantitative DCE-MRI measurements. A perfusable, MR-compatible microchannel network was constructed on the basis of sacrificial melt-spun sugar fibers embedded in a block of epoxy resin. Structural analysis was performed on the basis of light microscopy images before the DCE-MRI experiments. During dynamic acquisition, the capillary network was perfused with a standard contrast agent injection system. Flow dependency and the inter- and intrascanner reproducibility of the computed DCE parameters were evaluated using a 3.0 T whole-body MRI. Semi-quantitative and quantitative flow-related parameters exhibited the expected proportionality to the set flow rate (mean Pearson correlation coefficient: 0.991, P < 2.5e-5). The volume fraction was approximately independent of changes in the applied flow rate through the phantom. Repeatability and reproducibility experiments yielded maximum intrascanner coefficients of variation (CV) of 4.6% for quantitative parameters. All evaluated parameters were well within the range of known in vivo results for the applied flow rates. The constructed phantom enables reproducible, flow-dependent, contrast-enhanced MR measurements with the potential to facilitate standardization and comparability of DCE-MRI examinations. © 2017 American Association of Physicists in Medicine.
Quantitative evaluation of statistical errors in small-angle X-ray scattering measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sedlak, Steffen M.; Bruetzel, Linda K.; Lipfert, Jan
A new model is proposed for the measurement errors incurred in typical small-angle X-ray scattering (SAXS) experiments, which takes into account the setup geometry and physics of the measurement process. The model accurately captures the experimentally determined errors from a large range of synchrotron and in-house anode-based measurements. Its most general formulation gives for the variance of the buffer-subtracted SAXS intensity σ²(q) = [I(q) + const.]/(kq), where I(q) is the scattering intensity as a function of the momentum transfer q; k and const. are fitting parameters that are characteristic of the experimental setup. The model gives a concrete procedure for calculating realistic measurement errors for simulated SAXS profiles. In addition, the results provide guidelines for optimizing SAXS measurements, which are in line with established procedures for SAXS experiments, and enable a quantitative evaluation of measurement errors.
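The stated error model can be applied directly to simulate realistic SAXS uncertainties. A minimal sketch (the sphere form factor and the values of k and const. are illustrative assumptions):

```python
import numpy as np

def saxs_sigma(q, intensity, k=1.0e4, const=0.05):
    """Standard deviation from the variance model sigma^2(q) = [I(q) + const.]/(k q)."""
    return np.sqrt((intensity + const) / (k * q))

# Hypothetical scattering profile of a homogeneous sphere of radius R
q = np.linspace(0.01, 0.5, 200)  # momentum transfer (1/angstrom)
R = 30.0
x = q * R
form = (3.0 * (np.sin(x) - x * np.cos(x)) / x**3) ** 2
intensity = 100.0 * form

sigma = saxs_sigma(q, intensity)
noisy = intensity + np.random.default_rng(4).normal(0.0, sigma)  # simulated profile
print(f"relative error at q=0.01: {sigma[0] / intensity[0]:.2%}")
```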
NASA Astrophysics Data System (ADS)
Zhang, Min; Katsumata, Akitoshi; Muramatsu, Chisako; Hara, Takeshi; Suzuki, Hiroki; Fujita, Hiroshi
2014-03-01
Periodontal disease is a typical dental disease that affects many adults. The presence of alveolar bone resorption, which can be observed on dental panoramic radiographs, is one of the most important signs of the progression of periodontal disease. Automatically evaluating alveolar-bone resorption is therefore of considerable clinical importance in dental radiology. The purpose of this study was to propose, for the first time, a novel system for automated alveolar-bone-resorption evaluation from digital dental panoramic radiographs. The proposed system enables visualization and quantitative evaluation of the degree of alveolar bone resorption surrounding the teeth. It has the following procedures: (1) pre-processing of a test image; (2) detection of tooth root apices with a Gabor filter and curve fitting for the root apex line; (3) detection of features related to alveolar bone by using an image phase congruency map and template matching, and curve fitting for the alveolar line; (4) detection of the occlusion line with a selected Gabor filter; (5) finally, evaluation of the quantitative alveolar-bone-resorption degree in the area surrounding the teeth by simply computing the average ratio of the height of the alveolar bone to the height of the teeth. The proposed scheme was applied to 30 patient cases of digital panoramic radiographs with alveolar bone resorption of different stages. Our initial trial on these test cases indicates that the quantitative evaluation results are correlated with the alveolar-bone-resorption degree, although the performance still needs further improvement. The method therefore has potential clinical practicability.
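The final quantification step is simple arithmetic on the three detected lines. A minimal sketch (the per-tooth landmark heights are invented coordinates):

```python
import numpy as np

# Hypothetical per-tooth landmark heights (pixels, measured from the occlusion line)
tooth_height = np.array([310, 295, 320, 305])     # occlusion line to root apex
alveolar_height = np.array([250, 210, 260, 190])  # occlusion line to alveolar line

# Average ratio of alveolar bone height to tooth height; lower = more resorption
ratio = np.mean(alveolar_height / tooth_height)
print(f"alveolar bone support ratio: {ratio:.2f}")
```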
Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy
NASA Technical Reports Server (NTRS)
Edwards, Michelle
2010-01-01
Impact at management level: Qualitative assessment of risk criticality, in conjunction with risk consequence, likelihood, and severity, enables development of an "investment policy" for managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results; the quantitative assessment approach provides useful risk mitigation information.
Quantitative Glycomics Strategies*
Mechref, Yehia; Hu, Yunli; Desantos-Garcia, Janie L.; Hussein, Ahmed; Tang, Haixu
2013-01-01
The correlations between protein glycosylation and many biological processes and diseases are increasing the demand for quantitative glycomics strategies enabling sensitive monitoring of changes in the abundance and structure of glycans. This is currently attained through multiple strategies employing several analytical techniques such as capillary electrophoresis, liquid chromatography, and mass spectrometry. The detection and quantification of glycans often involve labeling with ionic and/or hydrophobic reagents. This step is needed in order to enhance detection in spectroscopic and mass spectrometric measurements. Recently, labeling with stable isotopic reagents has also been presented as a very viable strategy enabling relative quantitation. The different strategies available for reliable and sensitive quantitative glycomics are herein described and discussed. PMID:23325767
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com; Stoessel, Rainer, E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian, E-mail: Grosse@tum.de
2015-03-31
In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability on large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which enables important quantitative information to be gained for composite part design and stress analysis. Within this study, different NDT methods will be compared at CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and will be presented.
Murakami, Yoshihiko; Yokoyama, Masayuki; Nishida, Hiroshi; Tomizawa, Yasuko; Kurosawa, Hiromi
2008-09-01
Several hemostat hydrogels are used clinically, and other agents are being studied for safer, more facile, and more efficient hemostasis. In the present paper, we propose a novel method to evaluate local hemostat hydrogels on tissue surfaces. The procedure consisted of the following steps: (step 1) a mouse was fixed on a cork board, and its abdomen was incised; (step 2) serous fluid was carefully removed because it affected the estimation of the weight gained by the filter paper, and parafilm and preweighed filter paper were placed beneath the liver (the parafilm prevented the filter paper from absorbing gradually oozing serous fluid); (step 3) the cork board was tilted and maintained at an angle of about 45 degrees so that the blood would flow more easily from the liver toward the filter paper; and (step 4) the bleeding lasted for 3 min. In this step, a hemostat was applied to the liver wound immediately after the liver was pricked with a needle. We found that (1) careful removal of serous fluid prior to bleeding and (2) quantitative determination of the amount of excess aqueous solution that oozed out from a hemostat were important for a rigorous evaluation of hemostat efficacy. We successfully evaluated the efficacy of a fibrin-based hemostat hydrogel using our method. The method proposed in the present study enabled the quantitative, accurate, and easy evaluation of the efficacy of a local hemostatic hydrogel acting as a tissue-adhesive agent on biointerfaces.
Novel online monitoring and alert system for anaerobic digestion reactors.
Dong, Fang; Zhao, Quan-Bao; Li, Wen-Wei; Sheng, Guo-Ping; Zhao, Jin-Bao; Tang, Yong; Yu, Han-Qing; Kubota, Kengo; Li, Yu-You; Harada, Hideki
2011-10-15
Effective monitoring and diagnosis of anaerobic digestion processes remains a great challenge, which limits the stable operation of anaerobic digestion reactors. In this work, an online monitoring and alert system for upflow anaerobic sludge blanket (UASB) reactors is developed on the basis of a set of novel evaluating indexes. The two indexes, i.e., stability index S and auxiliary index a, which incorporate both gas- and liquid-phase parameters of the UASB, enable a quantitative and comprehensive evaluation of reactor status. A series of shock tests was conducted to evaluate the response of the monitoring and alert system to organic overloading, hydraulic, temperature, and toxicant shocks. The results show that this system enables accurate and rapid monitoring and diagnosis of the reactor status, and offers reliable early warnings of potential risks. As the core of this system, the evaluating indexes are demonstrated to be of high accuracy and sensitivity in process evaluation and to adapt well to artificial intelligence and automated control apparatus. This online monitoring and alert system represents a valuable effort to promote automated monitoring and control of the anaerobic digestion process, and holds high promise for application.
Yepes-Rios, Monica; Dudek, Nancy; Duboyce, Rita; Curtis, Jerri; Allard, Rhonda J; Varpio, Lara
2016-11-01
Many clinical educators feel unprepared and/or unwilling to report unsatisfactory trainee performance. This systematic review consolidates knowledge from the medical, nursing, and dental literature on the experiences and perceptions of evaluators or assessors with the "failure to fail" phenomenon. We searched the English language literature in CINAHL, EMBASE, and MEDLINE from January 2005 to January 2015. Qualitative and quantitative studies were included. Following our review protocol, registered with BEME, reviewers worked in pairs to identify relevant articles. The investigators participated in thematic analysis of the qualitative data reported in these studies. Through several cycles of analysis, discussion, and reflection, the team identified the barriers and enablers to failing a trainee. From 5330 articles, we included 28 publications in the review. The barriers identified were (1) the assessor's professional considerations, (2) the assessor's personal considerations, (3) trainee-related considerations, (4) unsatisfactory evaluator development and evaluation tools, (5) institutional culture, and (6) consideration of the remediation available to the trainee. The enablers identified were (1) duty to patients, to society, and to the profession; (2) institutional support, such as backing a failing evaluation, support from colleagues, evaluator development, and strong assessment systems; and (3) opportunities for students after failing. The inhibiting and enabling factors for failing an underperforming trainee were common across the professions included in this study, across the 10 years of data, and across the educational continuum. We suggest that these results can inform efforts aimed at addressing the failure to fail problem.
Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Van Dorsselaer, Alain; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne
2016-01-30
Proteomic workflows based on nanoLC-MS/MS data-dependent acquisition analysis have progressed tremendously in recent years. High-resolution, fast-sequencing instruments have enabled the use of label-free quantitative methods, based either on spectral counting or on MS signal analysis, which appear to be an attractive way to analyze differential protein expression in complex biological samples. However, the computational processing of the data for label-free quantification still remains a challenge. Here, we used a proteomic standard composed of an equimolar mixture of 48 human proteins (Sigma UPS1) spiked at different concentrations into a background of yeast cell lysate to benchmark several label-free quantitative workflows, involving different software packages developed in recent years. This experimental design allowed us to finely assess their performance in terms of sensitivity and false discovery rate, by measuring the number of true and false positives (UPS1 and yeast background proteins found as differential, respectively). The spiked standard dataset has been deposited to the ProteomeXchange repository with the identifier PXD001819 and can be used to benchmark other label-free workflows, adjust software parameter settings, improve algorithms for extraction of the quantitative metrics from raw MS data, or evaluate downstream statistical methods. Bioinformatic pipelines for label-free quantitative analysis must be objectively evaluated in their ability to detect variant proteins with good sensitivity and a low false discovery rate in large-scale proteomic studies. This can be done through the use of complex spiked samples, for which the "ground truth" of variant proteins is known, allowing a statistical evaluation of the performance of the data processing workflow. We provide here such a controlled standard dataset and used it to evaluate the performance of several label-free bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, for the detection of variant proteins with different absolute expression levels and fold change values. The dataset presented here can be useful for tuning software tool parameters, testing new algorithms for label-free quantitative analysis, or evaluating downstream statistical methods. Copyright © 2015 Elsevier B.V. All rights reserved.
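With a spiked standard, sensitivity and false discovery rate follow directly from set membership. A minimal sketch (the protein identifiers and detection list are invented placeholders):

```python
# Hypothetical ground truth and detection results from a label-free workflow
ups1_spiked = {f"UPS1_{i}" for i in range(48)}  # true variant proteins
detected = {f"UPS1_{i}" for i in range(40)} | {"YEAST_7", "YEAST_42"}

tp = len(detected & ups1_spiked)  # spiked proteins found as differential
fp = len(detected - ups1_spiked)  # background proteins found as differential
sensitivity = tp / len(ups1_spiked)
fdr = fp / len(detected)
print(f"sensitivity = {sensitivity:.2f}, FDR = {fdr:.2f}")
```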
Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc
2017-05-01
Uveitis is one of the fields in ophthalmology in which a tremendous evolution has taken place over the past 25 years. Not only did we gain access to more efficient, better targeted, and better tolerated therapies, but in parallel, precise and quantitative measurement methods were developed, allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, making possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation provides exquisite security in monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements have been integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still rely mostly on subjective parameters, such as clinical evaluation of vitreous haze, as a main endpoint, whereas a whole array of precise, quantitative, and objective modalities is available for the design of clinical studies. The scope of this work was to review the quantitative investigations that have improved the management of uveitis in the past 2-3 decades.
Hematocrit Measurement with R2* and Quantitative Susceptibility Mapping in Postmortem Brain.
Walsh, A J; Sun, H; Emery, D J; Wilman, A H
2018-05-24
Noninvasive venous oxygenation quantification with MR imaging will improve the neurophysiologic investigation and the understanding of the pathophysiology in neurologic diseases. Available MR imaging methods are limited by sensitivity to flow and often require assumptions of the hematocrit level. In situ postmortem imaging enables evaluation of methods in a fully deoxygenated environment without flow artifacts, allowing direct calculation of hematocrit. This study compares 2 venous oxygenation quantification methods in in situ postmortem subjects. Transverse relaxation (R2*) mapping and quantitative susceptibility mapping were performed on a whole-body 4.7T MR imaging system. Intravenous measurements in major draining intracranial veins were compared between the 2 methods in 3 postmortem subjects. The quantitative susceptibility mapping technique was also applied in 10 healthy control subjects and compared with reference venous oxygenation values. In 2 early postmortem subjects, R2* mapping and quantitative susceptibility mapping measurements within intracranial veins had a significant and strong correlation (R² = 0.805, P = .004 and R² = 0.836, P = .02). Higher R2* and susceptibility values were consistently demonstrated within gravitationally dependent venous segments during the early postmortem period. Hematocrit ranged from 0.102 to 0.580 in postmortem subjects, with R2* and susceptibility as large as 291 s⁻¹ and 1.75 ppm, respectively. Measurements of R2* and quantitative susceptibility mapping within large intracranial draining veins have a high correlation in early postmortem subjects. This study supports the use of quantitative susceptibility mapping for evaluation of in vivo venous oxygenation and postmortem hematocrit concentrations. © 2018 by American Journal of Neuroradiology.
A multisite assessment of the quantitative capabilities of the Xpert MTB/RIF assay.
Blakemore, Robert; Nabeta, Pamela; Davidow, Amy L; Vadwai, Viral; Tahirli, Rasim; Munsamy, Vanisha; Nicol, Mark; Jones, Martin; Persing, David H; Hillemann, Doris; Ruesch-Gerdes, Sabine; Leisegang, Felicity; Zamudio, Carlos; Rodrigues, Camilla; Boehme, Catharina C; Perkins, Mark D; Alland, David
2011-11-01
The Xpert MTB/RIF is an automated molecular test for Mycobacterium tuberculosis that estimates bacterial burden by measuring the threshold cycle (Ct) of its M. tuberculosis-specific real-time polymerase chain reaction. Bacterial burden is an important biomarker for disease severity, infection control risk, and response to therapy. Objective: to evaluate bacterial load quantitation by Xpert MTB/RIF compared with conventional quantitative methods. Xpert MTB/RIF results were compared with smear microscopy, semiquantitative solid culture, and time-to-detection in liquid culture for 741 patients and 2,008 samples tested in a multisite clinical trial. An internal control real-time polymerase chain reaction was evaluated for its ability to identify inaccurate quantitative Xpert MTB/RIF results. Assays with an internal control Ct greater than 34 were likely to be inaccurately quantitated; this represented 15% of M. tuberculosis-positive tests. Excluding these, decreasing M. tuberculosis Ct was associated with increasing smear microscopy grade for smears of concentrated sputum pellets (r(s) = -0.77) and directly from sputum (r(s) = -0.71). A Ct cutoff of approximately 27.7 best predicted smear-positive status. The association between M. tuberculosis Ct and time-to-detection in liquid culture (r(s) = 0.68) and semiquantitative colony counts (r(s) = -0.56) was weaker than for smear. Tests of paired same-patient sputum showed that high-viscosity sputum samples contained 32-fold more M. tuberculosis than nonviscous samples. Comparisons between the grade of the acid-fast bacilli smear and Xpert MTB/RIF quantitative data across study sites enabled us to identify a site outlier in microscopy. Xpert MTB/RIF quantitation offers a new, standardized approach to measuring bacterial burden in the sputum of patients with tuberculosis.
Liu, Hueiming; Lindley, Richard; Alim, Mohammed; Felix, Cynthia; Gandhi, Dorcas B C; Verma, Shweta J; Tugnawat, Deepak Kumar; Syrigapu, Anuradha; Ramamurthy, Ramaprabhu Krishnappa; Pandian, Jeyaraj D; Walker, Marion; Forster, Anne; Anderson, Craig S; Langhorne, Peter; Murthy, Gudlavalleti Venkata Satyanarayana; Shamanna, Bindiganavale Ramaswamy; Hackett, Maree L; Maulik, Pallab K; Harvey, Lisa A; Jan, Stephen
2016-01-01
Introduction: We are undertaking a randomised controlled trial (fAmily led rehabiliTaTion aftEr stroke in INDia, ATTEND) evaluating training a family carer to enable maximal rehabilitation of patients with stroke-related disability, as a potentially affordable, culturally acceptable and effective intervention for use in India. A process evaluation is needed to understand how and why this complex intervention may be effective, and to capture important barriers and facilitators to its implementation. We describe the protocol for our process evaluation to encourage the development of process evaluation methodology and transparency in reporting. Methods and analysis: The realist and RE-AIM (Reach, Effectiveness, Adoption, Implementation and Maintenance) frameworks informed the design. Mixed methods include semistructured interviews with health providers, patients and their carers; analysis of quantitative process data describing the fidelity and dose of the intervention; observations of trial set-up and implementation; and analysis of cost data from the perspective of patients and their families and of programme budgets. These qualitative and quantitative data will be analysed iteratively before the quantitative outcomes of the trial are known, and then triangulated with the results of the primary outcome evaluation. Ethics and dissemination: The process evaluation has received ethical approval for all sites in India. In low-income and middle-income countries, the available human capital can form an approach to reducing the evidence-practice gap, compared with the high-cost alternatives available in established market economies. This process evaluation will provide insights into how such a programme can be implemented in practice and brought to scale. Through local stakeholder engagement and dissemination of findings globally, we hope to build on patient-centred, cost-effective and sustainable models of stroke rehabilitation. Trial registration number: CTRI/2013/04/003557. PMID:27633636
Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter
2012-04-01
Coordination between perception and action is required to interact with the environment successfully. This is already trained by very young infants, who perform spontaneous movements to learn how their body interacts with the environment. The strategies used by infants for this purpose change with age. Therefore, very early progress in action control made by infants can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced that allows the quantitative evaluation of the development of spontaneous motor activity in newborns. The introduced methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and the subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies like infantile cerebral palsy influence the development of motor activity significantly. Since the introduced methodology is objective and quantitative, it is suitable for monitoring how newborns train the cognitive processes that will enable them to cope with their environment through motor interaction.
Zhao, Dan; Liu, Wei; Cai, Ailu; Li, Jingyu; Chen, Lizhu; Wang, Bing
2013-02-01
The purpose of this study was to investigate the effectiveness of three-dimensional (3D) ultrasound for quantitative evaluation of the cerebellar vermis and to establish a nomogram of fetal vermis measurements during gestation for a Chinese population. Sonographic examinations were performed in normal fetuses and in cases with suspected vermian rotation. 3D median planes were obtained with both OMNIVIEW and tomographic ultrasound imaging. Measurements of the cerebellar vermis were highly correlated between two-dimensional and 3D median planes. The diameter of the cerebellar vermis follows a growth pattern approximately predicted by a quadratic regression equation. The normal vermis was almost parallel to the brain stem, with an average angle of <2° in normal fetuses. The average angle in the 9 cases of vermian rotation was >5°. Three-dimensional median planes are obtained more easily than two-dimensional ones and allow accurate measurements of the cerebellar vermis. The 3D approach may enable rapid assessment of fetal cerebral anatomy in standard examinations. Measurements of the cerebellar vermis may provide a quantitative index for prenatal diagnosis of posterior fossa malformations. © 2012 John Wiley & Sons, Ltd.
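A nomogram of this form is a quadratic fit of vermis diameter against gestational age, with percentile bands from the residual spread. A minimal sketch (the gestational ages and diameters are synthetic, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical normative measurements: vermis diameter (mm) vs. gestation (weeks)
ga = rng.uniform(20, 36, 150)
diameter = -10.0 + 1.6 * ga - 0.01 * ga**2 + rng.normal(0, 1.0, ga.size)

coef = np.polyfit(ga, diameter, deg=2)  # quadratic regression equation
sd = np.std(diameter - np.polyval(coef, ga), ddof=3)

for week in (24, 28, 32):
    mean = np.polyval(coef, week)
    print(f"{week} wk: mean {mean:.1f} mm, 5th-95th pct "
          f"{mean - 1.645 * sd:.1f}-{mean + 1.645 * sd:.1f} mm")
```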
Technical skills measurement based on a cyber-physical system for endovascular surgery simulation.
Tercero, Carlos; Kodama, Hirokatsu; Shi, Chaoyang; Ooe, Katsutoshi; Ikeda, Seiichi; Fukuda, Toshio; Arai, Fumihito; Negoro, Makoto; Kwon, Guiryong; Najdovski, Zoran
2013-09-01
Quantification of medical skills is a challenge, particularly in simulator-based training. In the case of endovascular intervention, it is desirable that a simulator accurately recreates the morphology and mechanical characteristics of the vasculature while enabling scoring. For this purpose, we propose a cyber-physical system composed of optical sensors for encoding the motion of the catheter's body, a magnetic tracker for motion capture of the operator's hands, and opto-mechatronic sensors for measuring the interaction of the catheter tip with the vasculature model wall. Two pilot studies were conducted for measuring technical skills, one for distinguishing novices from experts and the other for measuring unnecessary motion. Proficiency levels were measurable between expert and novice and also between individual novice users. The results enabled scoring of the user's proficiency level, using sensitivity, reaction time, time to complete a task, and respect for tissue integrity as evaluation criteria. Additionally, unnecessary motion was also measurable. The development of cyber-physical simulators for other domains of medicine depends on the study of photoelastic materials for human tissue modelling, and enables quantitative evaluation of skills using surgical instruments and a realistic representation of human tissue. Copyright © 2012 John Wiley & Sons, Ltd.
Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo
2017-08-04
Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.
Rapid Creation and Quantitative Monitoring of High Coverage shRNA Libraries
Bassik, Michael C.; Lebbink, Robert Jan; Churchman, L. Stirling; Ingolia, Nicholas T.; Patena, Weronika; LeProust, Emily M.; Schuldiner, Maya; Weissman, Jonathan S.; McManus, Michael T.
2009-01-01
Short hairpin RNA (shRNA) libraries are limited by the low efficacy of many shRNAs, which gives false negatives, and by off-target effects, which give false positives. Here we present a strategy for rapidly creating expanded shRNA pools (∼30 shRNAs/gene) that are analyzed by deep sequencing (EXPAND). This approach enables identification of multiple effective target-specific shRNAs from a complex pool, allowing a rigorous statistical evaluation of whether a gene is a true hit. PMID:19448642
Kaneko, Hiromasa; Funatsu, Kimito
2013-09-23
We propose predictive performance criteria for nonlinear regression models without cross-validation. The proposed criteria are the determination coefficient and the root-mean-square error for the midpoints between k-nearest-neighbor data points. These criteria can be used to evaluate predictive ability after the regression models are updated, whereas cross-validation cannot be performed in such a situation. The proposed method is effective and helpful in handling big data when cross-validation cannot be applied. By analyzing data from numerical simulations and quantitative structural relationships, we confirm that the proposed criteria enable the predictive ability of the nonlinear regression models to be appropriately quantified.
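A sketch of the midpoint idea as I read it: midpoints are formed between each point and its k nearest neighbors, the model predicts at those midpoints, and the reference value at a midpoint is assumed here to be the mean of the two neighbors' responses (the random-forest model and simulated data are placeholders for any nonlinear regression setting):

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(6)
X = rng.uniform(-3, 3, (200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, 200)

model = RandomForestRegressor(random_state=0).fit(X, y)

# Midpoints between each point and its k nearest neighbors (skip self at index 0)
k = 3
nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
_, idx = nn.kneighbors(X)
mid_X = np.concatenate([(X + X[idx[:, j]]) / 2 for j in range(1, k + 1)])
mid_y = np.concatenate([(y + y[idx[:, j]]) / 2 for j in range(1, k + 1)])

# Predictive-performance criteria evaluated at the midpoints, no cross-validation
pred = model.predict(mid_X)
rmse = np.sqrt(np.mean((mid_y - pred) ** 2))
r2 = 1 - np.sum((mid_y - pred) ** 2) / np.sum((mid_y - mid_y.mean()) ** 2)
print(f"midpoint RMSE = {rmse:.3f}, midpoint R2 = {r2:.3f}")
```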
Framework for the quantitative weight-of-evidence analysis of 'omics data for regulatory purposes.
Bridges, Jim; Sauer, Ursula G; Buesen, Roland; Deferme, Lize; Tollefsen, Knut E; Tralau, Tewes; van Ravenzwaay, Ben; Poole, Alan; Pemberton, Mark
2017-12-01
A framework for the quantitative weight-of-evidence (QWoE) analysis of 'omics data for regulatory purposes is presented. The QWoE framework encompasses seven steps to evaluate 'omics data (also together with non-'omics data): (1) Hypothesis formulation, identification and weighting of lines of evidence (LoEs). LoEs conjoin different (types of) studies that are used to critically test the hypothesis. As an essential component of the QWoE framework, step 1 includes the development of templates for scoring sheets that predefine scoring criteria with scores of 0-4 to enable a quantitative determination of study quality and data relevance; (2) literature searches and categorisation of studies into the pre-defined LoEs; (3) and (4) quantitative assessment of study quality and data relevance using the respective pre-defined scoring sheets for each study; (5) evaluation of LoE-specific strength of evidence based upon the study quality and study relevance scores of the studies conjoined in the respective LoE; (6) integration of the strength of evidence from the individual LoEs to determine the overall strength of evidence; (7) characterisation of uncertainties and conclusion on the QWoE. To put the QWoE framework in practice, case studies are recommended to confirm the relevance of its different steps, or to adapt them as necessary. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
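Steps 3-6 reduce to arithmetic on the predefined 0-4 scores. A minimal sketch (the studies, scores, LoE names, and weights are invented; the quality-times-relevance aggregation is only a schematic reading of the framework):

```python
# Hypothetical scored studies: (LoE name, quality score 0-4, relevance score 0-4)
studies = [
    ("in vitro 'omics", 3, 2), ("in vitro 'omics", 4, 3),
    ("in vivo apical", 2, 4), ("in vivo apical", 3, 3),
]
loe_weights = {"in vitro 'omics": 1.0, "in vivo apical": 2.0}  # step-1 weighting

# Step 5: LoE-specific strength = mean of (quality x relevance), normalized to 0-1
loe_strength = {}
for loe in loe_weights:
    scores = [q * r / 16 for name, q, r in studies if name == loe]
    loe_strength[loe] = sum(scores) / len(scores)

# Step 6: integrate LoEs into the overall strength of evidence
total_w = sum(loe_weights.values())
overall = sum(loe_weights[l] * s for l, s in loe_strength.items()) / total_w
print(loe_strength, f"overall strength = {overall:.2f}")
```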
Using confidence intervals to evaluate the focus alignment of spectrograph detector arrays.
Sawyer, Travis W; Hawkins, Kyle S; Damento, Michael
2017-06-20
High-resolution spectrographs extract detailed spectral information of a sample and are frequently used in astronomy, laser-induced breakdown spectroscopy, and Raman spectroscopy. These instruments employ dispersive elements such as prisms and diffraction gratings to spatially separate different wavelengths of light, which are then detected by a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) detector array. Precise alignment along the optical axis (focus position) of the detector array is critical to maximize the instrumental resolution; however, traditional approaches of scanning the detector through focus lack a quantitative measure of precision, limiting the repeatability and relying on one's experience. Here we propose a method to evaluate the focus alignment of spectrograph detector arrays by establishing confidence intervals to measure the alignment precision. We show that propagation of uncertainty can be used to estimate the variance in an alignment, thus providing a quantitative and repeatable means to evaluate the precision and confidence of an alignment. We test the approach by aligning the detector array of a prototype miniature echelle spectrograph. The results indicate that the procedure effectively quantifies alignment precision, enabling one to objectively determine when an alignment has reached an acceptable level. This quantitative approach also provides a foundation for further optimization, including automated alignment. Furthermore, the procedure introduced here can be extended to other alignment techniques that rely on numerically fitting data to a model, providing a general framework for evaluating the precision of alignment methods.
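A minimal sketch of the idea under simple assumptions: a focus metric from a through-focus scan is fitted with a parabola, the vertex gives the best-focus position, and the fit covariance is propagated to a confidence interval on that position. The scan data and geometry below are invented.

```python
import numpy as np

# Through-focus scan: spot width (um) measured at several detector positions (mm).
z = np.array([-0.30, -0.20, -0.10, 0.00, 0.10, 0.20, 0.30])
w = np.array([19.1, 15.8, 13.2, 12.4, 13.0, 15.5, 18.8])

coef, cov = np.polyfit(z, w, deg=2, cov=True)    # w ~ a z^2 + b z + c
a, b, _ = coef
z_best = -b / (2 * a)                            # vertex = best-focus position

# Propagation of uncertainty: gradient of z_best w.r.t. (a, b, c).
g = np.array([b / (2 * a**2), -1 / (2 * a), 0.0])
var = g @ cov @ g
ci95 = 1.96 * np.sqrt(var)
print(f"best focus = {z_best:.4f} mm +/- {ci95:.4f} mm (95% CI)")
```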
Uemoto, Michihisa; Makino, Masanori; Ota, Yuji; Sakaguchi, Hiromi; Shimizu, Yukari; Sato, Kazuhiro
2018-01-01
Minor and trace metals in aluminum and aluminum alloys were determined by inductively coupled plasma atomic emission spectrometry (ICP-AES) in an interlaboratory test conducted toward standardization. The trueness of the measured data was investigated using certified reference materials of aluminum in order to improve the analytical protocols, and the precision was evaluated in a way that allowed the uncertainties to be estimated separately. The accuracy (trueness and precision) of the data was ultimately in good agreement with the certified values and their assigned uncertainties. Repeated measurements of aluminum solutions with different analyte concentrations revealed how the relative standard deviations of the measurements varied with concentration, thus enabling estimation of the limits of quantitation. These limits differed among the analytes and were slightly higher in the presence of an aluminum matrix than without one. In addition, the upper limit of the detectable concentration of silicon with simple acid digestion was estimated to be 0.03 % by mass fraction.
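A sketch of one common way such precision data yield a limit of quantitation, assuming a two-component precision model and the convention that the LOQ is the concentration at which the RSD falls to 10 %; all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

# Replicate measurements at several analyte concentrations (mg/L) in matrix.
conc = np.array([0.01, 0.02, 0.05, 0.1, 0.5, 1.0])
rsd = np.array([0.42, 0.22, 0.10, 0.06, 0.03, 0.025])   # relative std dev

# Two-component precision model: a constant absolute noise floor sd0 plus a
# proportional term, so RSD(c) = sqrt((sd0/c)^2 + rsd_inf^2).
def rsd_model(c, sd0, rsd_inf):
    return np.sqrt((sd0 / c) ** 2 + rsd_inf ** 2)

(sd0, rsd_inf), _ = curve_fit(rsd_model, conc, rsd, p0=(0.005, 0.02))

target = 0.10   # a common LOQ convention: concentration where RSD drops to 10 %
loq = sd0 / np.sqrt(target**2 - rsd_inf**2)
print(f"estimated LOQ ~ {loq:.3f} mg/L")
```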
Ferreira, Fabiano G; Nouer, Darcy F; Silva, Nelson P; Garbui, Ivana U; Correr-Sobrinho, Lourenço; Nouer, Paulo R A
2014-09-01
The aim of this study was to undertake a qualitative and quantitative evaluation of changes on enamel surfaces after debonding of brackets followed by finishing procedures, using a high-resolution three-dimensional optical profiler and to investigate the accuracy of the technique. The labial surfaces of 36 extracted upper central incisors were examined. Before bonding, the enamel surfaces were subjected to profilometry, recording four amplitude parameters. Brackets were then bonded using two types of light-cured orthodontic adhesive: composite resin and resin-modified glass ionomer cement. Finishing was performed by three different methods: pumice on a rubber cup, fine and ultrafine aluminum oxide discs, and microfine diamond cups followed by silicon carbide brushes. The samples were subsequently re-analyzed by profilometry. Wilcoxon signed-rank test, Kruskal-Wallis test (p < 0.05) and a posteriori Mann-Whitney U test with Bonferroni correction (p < 0.0167) revealed a significant reduction of enamel roughness when diamond cups followed by silicon carbide brushes were used to finish surfaces that had remnants of resin-modified glass ionomer adhesive and when pumice was used to finish surfaces that had traces of composite resin. Enamel loss was minimal. The 3D optical profilometry technique was able to provide accurate qualitative and quantitative assessment of changes on the enamel surface after debonding. Morphological changes in the topography of dental surfaces, especially if related to enamel loss and roughness, are of considerable clinical importance. The quantitative evaluation method used herein enables a more comprehensive understanding of the effects of orthodontic bonding on teeth.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-05
... Request; Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... clearance. Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... research has proposed that providing quantitative information about product efficacy enables consumers to...
Infrared thermography for wood density estimation
NASA Astrophysics Data System (ADS)
López, Gamaliel; Basterra, Luis-Alfonso; Acuña, Luis
2018-03-01
Infrared thermography (IRT) is becoming a commonly used technique for the non-destructive inspection and evaluation of wood structures. Based on the radiation emitted by all objects, this technique enables remote, contact-free visualization of surface temperature using a thermographic device. The process of transforming radiant energy into temperature depends on many parameters, and interpreting the results is usually complicated. However, some works have analyzed the operation of IRT and expanded its applications, as reflected in the latest literature. This work analyzes the effect of density on the thermodynamic behavior of timber as observed by IRT. The cooling of various wood samples was recorded, and a statistical procedure was designed that enables a quantitative estimate of timber density. This procedure represents a new method for physically characterizing this material.
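The abstract does not give the statistical procedure in detail; the sketch below shows one plausible construction under the assumption of Newtonian cooling: a cooling constant is fitted per sample from the IRT record and linearly calibrated against known densities. All data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def cooling(t, t_env, dT0, tau):
    # Newtonian cooling of the surface temperature seen by the IR camera.
    return t_env + dT0 * np.exp(-t / tau)

def fit_tau(times, temps):
    (t_env, dT0, tau), _ = curve_fit(
        cooling, times, temps, p0=(temps[-1], temps[0] - temps[-1], 300.0))
    return tau

rng = np.random.default_rng(1)
t = np.linspace(0, 1800, 90)                        # 30 min of frames, seconds
densities = np.array([410.0, 480.0, 560.0, 655.0])  # known densities, kg/m^3
taus = []
for rho in densities:
    tau_true = 0.8 * rho                            # assumed density/tau relation
    temps = cooling(t, 21.0, 14.0, tau_true) + 0.05 * rng.normal(size=t.size)
    taus.append(fit_tau(t, temps))

slope, intercept = np.polyfit(taus, densities, 1)   # linear calibration rho(tau)
print(f"density ~ {slope:.2f} * tau + {intercept:.1f}")
```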
NASA Technical Reports Server (NTRS)
Tai, Ann T.; Chau, Savio N.; Alkalai, Leon
2000-01-01
Using COTS products, standards and intellectual properties (IPs) for all the system and component interfaces is a crucial step toward significant reduction of both system cost and development cost as the COTS interfaces enable other COTS products and IPs to be readily accommodated by the target system architecture. With respect to the long-term survivable systems for deep-space missions, the major challenge for us is, under stringent power and mass constraints, to achieve ultra-high reliability of the system comprising COTS products and standards that are not developed for mission-critical applications. The spirit of our solution is to exploit the pertinent standard features of a COTS product to circumvent its shortcomings, though these standard features may not be originally designed for highly reliable systems. In this paper, we discuss our experiences and findings on the design of an IEEE 1394 compliant fault-tolerant COTS-based bus architecture. We first derive and qualitatively analyze a "stack-tree topology" that not only complies with IEEE 1394 but also enables the implementation of a fault-tolerant bus architecture without node redundancy. We then present a quantitative evaluation that demonstrates significant reliability improvement from the COTS-based fault tolerance.
NASA Astrophysics Data System (ADS)
Su, Long-Jyun; Wu, Meng-Shiue; Hui, Yuen Yung; Chang, Be-Ming; Pan, Lei; Hsu, Pei-Chen; Chen, Yit-Tsong; Ho, Hong-Nerng; Huang, Yen-Hua; Ling, Thai-Yen; Hsu, Hsao-Hsun; Chang, Huan-Cheng
2017-03-01
Cell therapy is a promising strategy for the treatment of human diseases. While the first use of cells for therapeutic purposes can be traced to the 19th century, there has been a lack of general and reliable methods to study the biodistribution and associated pharmacokinetics of transplanted cells in various animal models for preclinical evaluation. Here, we present a new platform using albumin-conjugated fluorescent nanodiamonds (FNDs) as biocompatible and photostable labels for quantitative tracking of human placenta choriodecidual membrane-derived mesenchymal stem cells (pcMSCs) in miniature pigs by magnetic modulation. With this background-free detection technique and time-gated fluorescence imaging, we have been able to precisely determine the numbers as well as positions of the transplanted FND-labeled pcMSCs in organs and tissues of the miniature pigs after intravenous administration. The method is applicable to single-cell imaging and quantitative tracking of human stem/progenitor cells in rodents and other animal models as well.
Theoretical framework for analyzing structural compliance properties of proteins.
Arikawa, Keisuke
2018-01-01
We propose methods for directly analyzing structural compliance (SC) properties of elastic network models of proteins, and we also propose methods for extracting information about motion properties from the SC properties. The analysis of SC properties involves describing the relationships between the applied forces and the deformations. When decomposing the motion according to the magnitude of SC (SC mode decomposition), we can obtain information about the motion properties under the assumption that the lower SC mode motions or the softer motions occur easily. For practical applications, the methods are formulated in a general form. The parts where forces are applied and those where deformations are evaluated are separated from each other for enabling the analyses of allosteric interactions between the specified parts. The parts are specified not only by the points but also by the groups of points (the groups are treated as flexible bodies). In addition, we propose methods for quantitatively evaluating the properties based on the screw theory and the considerations of the algebraic structures of the basic equations expressing the SC properties. These methods enable quantitative discussions about the relationships between the SC mode motions and the motions estimated from two different conformations; they also help identify the key parts that play important roles for the motions by comparing the SC properties with those of partially constrained models. As application examples, lactoferrin and ATCase are analyzed. The results show that we can understand their motion properties through their lower SC mode motions or the softer motions.
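A schematic illustration of SC analysis on a deliberately tiny stand-in model (a grounded 1-D spring chain rather than a 3-D elastic network of a protein): the compliance matrix maps applied forces to deformations, and an SVD of the block linking two specified parts yields SC modes, softest first. The network and part choices are invented.

```python
import numpy as np

# Toy elastic network: 1-D chain of n nodes joined by unit springs, fixed at node 0.
n = 6
K = np.zeros((n, n))
for i in range(n - 1):                       # assemble spring stiffness matrix
    K[i, i] += 1.0
    K[i + 1, i + 1] += 1.0
    K[i, i + 1] -= 1.0
    K[i + 1, i] -= 1.0
K = K[1:, 1:]                                # ground node 0 to remove rigid motion

C = np.linalg.inv(K)                         # structural compliance: u = C f

# Allosteric-style query: forces applied at nodes {1,2}, deformation read at {4,5}.
force_part, obs_part = [0, 1], [3, 4]        # indices after grounding
C_sub = C[np.ix_(obs_part, force_part)]

U, s, Vt = np.linalg.svd(C_sub)              # SC mode decomposition
print("SC magnitudes (softest mode first):", s)
print("softest force direction:", Vt[0], "-> deformation direction:", U[:, 0])
```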
Albrich, Werner C.; van der Linden, Mark P. G.; Bénet, Thomas; Chou, Monidarin; Sylla, Mariam; Barreto Costa, Patricia; Richard, Nathalie; Klugman, Keith P.; Endtz, Hubert P.; Paranhos-Baccalà, Gláucia; Telles, Jean-Noël
2016-01-01
For epidemiological and surveillance purposes, it is relevant to monitor the distribution and dynamics of Streptococcus pneumoniae serotypes. Conventional serotyping methods do not provide rapid or quantitative information on serotype loads. Quantitative serotyping may enable prediction of the invasiveness of a specific serotype compared to other serotypes carried. Here, we describe a novel, rapid multiplex real-time PCR assay for identification and quantification of the 40 most prevalent pneumococcal serotypes, and we evaluate its impact in pneumonia specimens from emerging and developing countries. Eleven multiplex PCRs detecting 40 serotypes or serogroups were optimized. Quantification was enabled by reference to standard dilutions of known bacterial load. Performance of the assay in specifically typing and quantifying S. pneumoniae was evaluated in nasopharyngeal and blood samples from adult and pediatric patients hospitalized with pneumonia (n = 664) in five different countries. Serogroup 6 was widely represented in nasopharyngeal specimens from all five cohorts. The most frequent serotypes in the French, South African, and Brazilian cohorts were 1 and 7A/F, 3 and 19F, and 14, respectively. When both samples were available, the serotype found in blood was always also present as carriage with other serotypes in the nasopharynx. Moreover, the ability of a serotype to invade the bloodstream may be linked to its nasopharyngeal load: the mean nasopharyngeal concentration of the serotypes that moved to the blood was 3 log-fold higher than that of serotypes found only in the nasopharynx. This novel, rapid, quantitative assay may help predict the invasiveness of some S. pneumoniae serotypes and support assessment of pneumococcal serotype distribution. PMID:26986831
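A minimal sketch of quantification against standard dilutions of known bacterial load, as described: a linear standard curve of Ct versus log10 load is fitted and inverted for samples. The numbers are illustrative.

```python
import numpy as np

# Standard dilutions of known bacterial load: Ct values vs log10(copies/mL).
log10_load = np.array([7.0, 6.0, 5.0, 4.0, 3.0])
ct = np.array([18.1, 21.5, 24.8, 28.2, 31.6])

slope, intercept = np.polyfit(log10_load, ct, 1)      # Ct = slope*log10(load) + b
efficiency = 10 ** (-1.0 / slope) - 1.0               # amplification efficiency

def quantify(ct_sample):
    """Convert a sample Ct to serotype load using the standard curve."""
    return 10 ** ((ct_sample - intercept) / slope)

print(f"PCR efficiency: {efficiency:.1%}")
print(f"serotype load at Ct 26.0: {quantify(26.0):.2e} copies/mL")
```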
Yoshida, Yoko; Miyata, Toshiyuki; Matsumoto, Masanori; Shirotani-Ikejima, Hiroko; Uchida, Yumiko; Ohyama, Yoshifumi; Kokubo, Tetsuro; Fujimura, Yoshihiro
2015-01-01
For thrombotic microangiopathies (TMAs), the diagnosis of atypical hemolytic uremic syndrome (aHUS) is made by ruling out Shiga toxin-producing Escherichia coli (STEC)-associated HUS and ADAMTS13 activity-deficient thrombotic thrombocytopenic purpura (TTP), often using the exclusion criteria for secondary TMAs. Nowadays, assays for ADAMTS13 activity and evaluation for STEC infection can be performed within a few hours. However, a confident diagnosis of aHUS often requires comprehensive gene analysis of the alternative complement activation pathway, which usually takes at least several weeks. However, predisposing genetic abnormalities are only identified in approximately 70% of aHUS. To facilitate the diagnosis of complement-mediated aHUS, we describe a quantitative hemolytic assay using sheep red blood cells (RBCs) and human citrated plasma, spiked with or without a novel inhibitory anti-complement factor H (CFH) monoclonal antibody. Among 45 aHUS patients in Japan, 24% (11/45) had moderate-to-severe (≥50%) hemolysis, whereas the remaining 76% (34/45) patients had mild or no hemolysis (<50%). The former group is largely attributed to CFH-related abnormalities, and the latter group has C3-p.I1157T mutations (16/34), which were identified by restriction fragment length polymorphism (RFLP) analysis. Thus, a quantitative hemolytic assay coupled with RFLP analysis enabled the early diagnosis of complement-mediated aHUS in 60% (27/45) of patients in Japan within a week of presentation. We hypothesize that this novel quantitative hemolytic assay would be more useful in a Caucasian population, who may have a higher proportion of CFH mutations than Japanese patients. PMID:25951460
Performance evaluation of knowledge management among hospital employees.
Chang, Ying-Ying; Hsu, Pi-Fang; Li, Min-Hua; Chang, Ching-Ching
2011-01-01
The purpose of this study is to investigate the cognition of knowledge management (KM) among hospital employees and the relationship between KM and the KM enabler activities (financial, customer, internal business processes, learning and growth) in a regional hospital in Taiwan. Both qualitative and quantitative methods were used. The qualitative component consisted of in-depth interviews with three policy-makers; the quantitative data were collected from a regional hospital in northern Taiwan with a 77 percent effective response rate (n=154). The findings indicate that the cognition of and demand for KM among subordinates is close to the expectations of policy-makers. The policy-makers expect subordinates working in the hospital to be brave in taking on new responsibilities and to comply with hospital operating norms. KM is emphasized as a powerful and positive asset, and understanding KM predicts good performance in an organization. The findings can be generalized to other regional hospitals and may be applied to a wider population. This study provides insights into the perceptions and cognitions of hospital workers regarding KM and the activities of KM enablers. The responses and perceptions observed in the interviews, together with the quantitative results, could be useful to other hospitals and to individuals who engage in KM as a new management trend. The study also suggests KM guidelines for policy-makers who are experienced managers.
Chen, Alice A.; Underhill, Gregory H.; Bhatia, Sangeeta N.
2014-01-01
Three-dimensional (3D) tissue models have significantly improved our understanding of structure/function relationships and promise to lead to new advances in regenerative medicine. However, despite the expanding diversity of 3D tissue fabrication methods, approaches for functional assessment have been relatively limited. Here, we describe the fabrication of microtissue (μ-tissue) suspensions and their quantitative evaluation with techniques capable of analyzing large sample numbers and performing multiplexed parallel analysis. We applied this platform to 3D μ-tissues representing multiple stages of liver development and disease including: embryonic stem cells, bipotential hepatic progenitors, mature hepatocytes, and hepatoma cells photoencapsulated in polyethylene glycol hydrogels. Multiparametric μ-tissue cytometry enabled quantitation of fluorescent reporter expression within populations of intact μ-tissues (n ≥ 10^2-10^3) and sorting-based enrichment of subsets for subsequent studies. Further, 3D μ-tissues could be implanted in vivo, respond to systemic stimuli, and then be retrieved and quantitatively assessed. In order to facilitate multiplexed 'pooled' experimentation, fluorescent labeling strategies were developed and utilized to investigate the impact of μ-tissue composition and exposure to soluble factors. In particular, examination of drug/gene interactions on collections of 3D hepatoma μ-tissues indicated a synergistic influence of doxorubicin and knockdown of the anti-apoptotic gene BCL-XL. Collectively, these studies highlight the broad utility of μ-tissue suspensions as an enabling approach for high-n, populational analysis of 3D tissue biology in vitro and in vivo. PMID:20820630
Remans, Tony; Keunen, Els; Bex, Geert Jan; Smeets, Karen; Vangronsveld, Jaco; Cuypers, Ann
2014-10-01
Reverse transcription-quantitative PCR (RT-qPCR) has been widely adopted to measure differences in mRNA levels; however, biological and technical variation strongly affects the accuracy of the reported differences. RT-qPCR specialists have warned that, unless researchers minimize this variability, they may report inaccurate differences and draw incorrect biological conclusions. The Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines describe procedures for conducting and reporting RT-qPCR experiments. The MIQE guidelines enable others to judge the reliability of reported results; however, a recent literature survey found low adherence to these guidelines. Additionally, even experiments that use appropriate procedures remain subject to individual variation that statistical methods cannot correct. For example, since ideal reference genes do not exist, the widely used method of normalizing RT-qPCR data to reference genes generates background noise that affects the accuracy of measured changes in mRNA levels. However, current RT-qPCR data reporting styles ignore this source of variation. In this commentary, we direct researchers to appropriate procedures, outline a method to present the remaining uncertainty in data accuracy, and propose an intuitive way to select reference genes to minimize uncertainty. Reporting the uncertainty in data accuracy also serves for quality assessment, enabling researchers and peer reviewers to confidently evaluate the reliability of gene expression data. © 2014 American Society of Plant Biologists. All rights reserved.
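A small sketch of the kind of uncertainty reporting the commentary argues for, under the common conventions of 100 % amplification efficiency and normalization to the geometric mean of reference genes. The Cq values are invented, and the propagation shown is one straightforward choice, not the authors' prescribed method.

```python
import numpy as np

# Cq values for one sample: target gene plus three candidate reference genes,
# measured in technical triplicate (values are illustrative).
cq_target = np.array([24.10, 24.05, 24.18])
cq_refs = {
    "ref1": np.array([19.80, 19.85, 19.78]),
    "ref2": np.array([22.40, 22.55, 22.31]),
    "ref3": np.array([17.95, 18.40, 17.60]),   # noisier candidate
}

# Relative quantity assuming 100 % efficiency (2^-Cq), normalized to the
# geometric mean of the references (arithmetic mean in Cq space).
ref_means = np.array([r.mean() for r in cq_refs.values()])
delta_cq = cq_target.mean() - ref_means.mean()
rel_expr = 2.0 ** (-delta_cq)

# Propagate the standard errors of target and reference Cq values; the spread
# of the reference genes sets a floor on the achievable accuracy.
se_target = cq_target.std(ddof=1) / np.sqrt(cq_target.size)
se_refs = np.array([r.std(ddof=1) / np.sqrt(r.size) for r in cq_refs.values()])
se_delta = np.sqrt(se_target**2 + np.sum(se_refs**2) / len(cq_refs) ** 2)
lo = 2.0 ** (-(delta_cq + 1.96 * se_delta))
hi = 2.0 ** (-(delta_cq - 1.96 * se_delta))
print(f"normalized expression {rel_expr:.3f} (95% interval {lo:.3f}-{hi:.3f})")
```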
Parry, Selina M; Knight, Laura D; Connolly, Bronwen; Baldwin, Claire; Puthucheary, Zudin; Morris, Peter; Mortimore, Jessica; Hart, Nicholas; Denehy, Linda; Granger, Catherine L
2017-04-01
To identify, evaluate and synthesise studies examining the barriers and enablers for survivors of critical illness to participate in physical activity in the ICU and post-ICU settings from the perspective of patients, caregivers and healthcare providers. Systematic review of articles using five electronic databases: MEDLINE, CINAHL, EMBASE, Cochrane Library, Scopus. Quantitative and qualitative studies that were published in English in a peer-reviewed journal and assessed barriers or enablers for survivors of critical illness to perform physical activity were included. Prospero ID: CRD42016035454. Eighty-nine papers were included. Five major themes and 28 sub-themes were identified, encompassing: (1) patient physical and psychological capability to perform physical activity, including delirium, sedation, illness severity, comorbidities, weakness, anxiety, confidence and motivation; (2) safety influences, including physiological stability and concern for lines, e.g. risk of dislodgement; (3) culture and team influences, including leadership, interprofessional communication, administrative buy-in, clinician expertise and knowledge; (4) motivation and beliefs regarding the benefits/risks; and (5) environmental influences, including funding, access to rehabilitation programs, staffing and equipment. The main barriers identified were patient physical and psychological capability to perform physical activity, safety concerns, lack of leadership and ICU culture of mobility, lack of interprofessional communication, expertise and knowledge, and lack of staffing/equipment and funding to provide rehabilitation programs. Barriers and enablers are multidimensional and span diverse factors. The majority of these barriers are modifiable and can be targeted in future clinical practice.
Sonnaert, Maarten; Papantoniou, Ioannis; Luyten, Frank P; Schrooten, Jan Ir
2015-06-01
As the fields of tissue engineering and regenerative medicine mature toward clinical applications, the need for online monitoring both for quantitative and qualitative use becomes essential. Resazurin-based metabolic assays are frequently applied for determining cytotoxicity and have shown great potential for monitoring 3D bioreactor-facilitated cell culture. However, no quantitative correlation between the metabolic conversion rate of resazurin and cell number has been defined yet. In this work, we determined conversion rates of Presto Blue, a resazurin-based metabolic assay, for human periosteal cells during 2D and 3D static and 3D perfusion cultures. Our results showed that for the evaluated culture systems there is a quantitative correlation between the Presto Blue conversion rate and the cell number during the expansion phase with no influence of the perfusion-related parameters, that is, flow rate and shear stress. The correlation between the cell number and Presto Blue conversion subsequently enabled the definition of operating windows for optimal signal readouts. In conclusion, our data showed that the conversion of the resazurin-based Presto Blue metabolic assay can be used as a quantitative readout for online monitoring of cell proliferation in a 3D perfusion bioreactor system, although a system-specific validation is required.
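A minimal sketch, with invented numbers, of how such a correlation defines both a calibration curve and an operating window for the readout.

```python
import numpy as np

# Calibration: Presto Blue conversion rate (fluorescence units/h) measured on
# cultures of known cell number during the expansion phase (values invented).
cells = np.array([5e4, 1e5, 2e5, 4e5, 8e5])
rate = np.array([120.0, 250.0, 480.0, 1010.0, 1950.0])

slope, intercept = np.polyfit(cells, rate, 1)   # linear correlation

def estimate_cells(conversion_rate):
    """Invert the calibration to estimate cell number from a readout."""
    return (conversion_rate - intercept) / slope

# Operating window: trust only readouts within the calibrated (linear) range.
window = (rate.min(), rate.max())
r = 760.0
if window[0] <= r <= window[1]:
    print(f"~{estimate_cells(r):.3e} cells")
else:
    print("readout outside the validated operating window")
```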
Evaluation of self-esteem in cancer patients undergoing chemotherapy treatment
Leite, Marilia Aparecida Carvalho; Nogueira, Denismar Alves; Terra, Fábio de Souza
2015-01-01
Objective: to evaluate the self-esteem of cancer patients undergoing chemotherapy. Method: descriptive, analytical, cross-sectional study with a quantitative approach. A total of 156 patients attending the oncology unit of a mid-sized hospital participated in the study. Results: we found a higher frequency of patients with high self-esteem, although some showed average or low self-esteem. The scale showed a Cronbach's alpha value of 0.746, indicating acceptable internal consistency for the evaluated items. No independent variables showed significant associations with self-esteem. Conclusion: the cancer patients evaluated presented high self-esteem; it is therefore crucial for nursing to plan the care of patients undergoing chemotherapy with actions and strategies that address their physical and psychosocial conditions, aiming to maintain and rehabilitate these patients' emotional well-being. PMID:26625999
NASA Astrophysics Data System (ADS)
Mercado, Karla Patricia E.
Tissue engineering holds great promise for the repair or replacement of native tissues and organs. Further advancements in the fabrication of functional engineered tissues are partly dependent on developing new and improved technologies to monitor the properties of engineered tissues volumetrically, quantitatively, noninvasively, and nondestructively over time. Currently, engineered tissues are evaluated during fabrication using histology, biochemical assays, and direct mechanical tests. However, these techniques destroy tissue samples and, therefore, lack the capability for real-time, longitudinal monitoring. The research reported in this thesis developed nondestructive, noninvasive approaches to characterize the structural, biological, and mechanical properties of 3-D engineered tissues using high-frequency quantitative ultrasound and elastography technologies. A quantitative ultrasound technique, using a system-independent parameter known as the integrated backscatter coefficient (IBC), was employed to visualize and quantify structural properties of engineered tissues. Specifically, the IBC was demonstrated to estimate cell concentration and quantitatively detect differences in the microstructure of 3-D collagen hydrogels. Additionally, the feasibility of an ultrasound elastography technique called Single Tracking Location Acoustic Radiation Force Impulse (STL-ARFI) imaging was demonstrated for estimating the shear moduli of 3-D engineered tissues. High-frequency ultrasound techniques can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, these high-frequency quantitative ultrasound techniques can enable noninvasive, volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation.
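As an illustration of the IBC idea, the sketch below applies reference-phantom normalization to toy power spectra and averages the resulting backscatter coefficient over the analysis band. All spectra and constants are fabricated for the example.

```python
import numpy as np

rng = np.random.default_rng(4)
f = np.linspace(10e6, 40e6, 256)            # analysis bandwidth, Hz

# Gated power spectra from the engineered tissue and from a reference phantom
# whose backscatter coefficient (BSC) is known (all values illustrative).
S_sample = 1e-3 * (f / 25e6) ** 1.5 * (1 + 0.05 * rng.normal(size=f.size))
S_ref = 2e-3 * (f / 25e6) ** 1.2
bsc_ref = 1e-4 * (f / 25e6) ** 1.3          # 1/(sr*m), known a priori

# Reference-phantom normalization cancels system-dependent factors; the IBC is
# then the BSC averaged over the usable bandwidth.
bsc_sample = bsc_ref * S_sample / S_ref
ibc = bsc_sample.mean()                     # approx (1/df) * integral of BSC(f)
print(f"IBC ~ {ibc:.3e} 1/(sr*m)")
```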
Quick, Harald H; Zenge, Michael O; Kuehl, Hilmar; Kaiser, Gernot; Aker, Stephanie; Massing, Sandra; Bosk, Silke; Ladd, Mark E
2005-02-01
Active instrument visualization strategies for interventional MR angiography (MRA) require vascular instruments to be equipped with some type of radiofrequency (RF) coil or dipole RF antenna for MR signal detection. Such visualization strategies traditionally necessitate a connection to the scanner with either coaxial cable or laser fibers. In order to eliminate any wire connection, RF resonators that inductively couple their signal to MR surface coils were implemented into catheters to enable wireless active instrument visualization. The instrument-to-background contrast-to-noise ratio was systematically investigated as a function of the excitation flip angle. Signal coupling between the catheter RF coil and surface RF coils was evaluated qualitatively and quantitatively as a function of the catheter position and orientation with regard to the static magnetic field B0 and to the surface coils. In vivo evaluation of the instruments was performed in interventional MRA procedures on five pigs under MR guidance. Cartesian and projection reconstruction TrueFISP imaging enabled simultaneous visualization of the instruments and vascular morphology in real time. The implementation of RF resonators enabled robust visualization of the catheter curvature up to the very tip. Additionally, this active visualization strategy does not require any wire connection to the scanner and thus does not hamper the interventionalist during the course of an intervention.
Computer vision and machine learning for robust phenotyping in genome-wide studies
Zhang, Jiaoping; Naik, Hsiang Sing; Assefa, Teshale; Sarkar, Soumik; Reddy, R. V. Chowda; Singh, Arti; Ganapathysubramanian, Baskar; Singh, Asheesh K.
2017-01-01
Traditional evaluation of crop biotic and abiotic stresses is time-consuming and labor-intensive, limiting the ability to dissect the genetic basis of quantitative traits. A machine learning (ML)-enabled image-phenotyping pipeline for genetic studies of the abiotic stress iron deficiency chlorosis (IDC) in soybean is reported. IDC classification and severity for an association panel of 461 diverse plant-introduction accessions were evaluated using an end-to-end phenotyping workflow. The workflow consisted of a multi-stage procedure including: (1) optimized protocols for consistent image capture across plant canopies, (2) canopy identification and registration from cluttered backgrounds, (3) extraction of domain-expert-informed features from the processed images to accurately represent IDC expression, and (4) supervised ML-based classifiers that linked the automatically extracted features with expert-rating-equivalent IDC scores. ML-generated phenotypic data were subsequently utilized for a genome-wide association study and genomic prediction. The results illustrate the reliability and advantage of the ML-enabled image-phenotyping pipeline, identifying a previously reported locus and a novel locus harboring a gene homolog involved in iron acquisition. This study demonstrates a promising path for integrating the phenotyping pipeline into genomic prediction, and provides a systematic framework enabling robust and quicker phenotyping through ground-based systems. PMID:28272456
Development of in vitro and in vivo neutralization assays based on the pseudotyped H7N9 virus.
Tian, Yabin; Zhao, Hui; Liu, Qiang; Zhang, Chuntao; Nie, Jianhui; Huang, Weijing; Li, Changgui; Li, Xuguang; Wang, Youchun
2018-05-31
H7N9 viral infections pose a great threat to both animal and human health. This avian virus cannot be handled in level 2 biocontainment laboratories, substantially hindering evaluation of prophylactic vaccines and therapeutic agents. Here, we report a high-titer pseudoviral system with a bioluminescent reporter gene, enabling visual and quantitative analyses of virus replication in both tissue culture and animals. For evaluation of the immunogenicity of H7N9 vaccines, we developed an in vitro assay for neutralizing antibody measurement based on the pseudoviral system; results generated by the in vitro assay were strongly correlated with those of either the hemagglutination inhibition (HI) or micro-neutralization (MN) assay. Furthermore, we injected the viruses into Balb/c mice and observed the dynamic distribution of the viruses in the animals, which provides an ideal imaging model for quantitative analyses of prophylactic and therapeutic monoclonal antibodies. Taken together, the pseudoviral system reported here could be of great value for both in vitro and in vivo evaluation of vaccines and antiviral agents without the need for wild-type H7N9 virus.
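A sketch of how neutralization readouts from such a bioluminescent pseudovirus assay are typically reduced to a titer, assuming a four-parameter logistic dose-response; the data are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Luminescence of pseudovirus-infected cells vs serum dilution, expressed as a
# fraction of the no-serum control (illustrative values).
dilution = np.array([20, 60, 180, 540, 1620, 4860], dtype=float)
signal = np.array([0.08, 0.15, 0.38, 0.71, 0.90, 0.97])

def logistic4(x, bottom, top, ic50, hill):
    # Signal rises with dilution as antibody is diluted out.
    return bottom + (top - bottom) / (1 + (ic50 / x) ** hill)

p0 = (0.0, 1.0, 500.0, 1.0)
(bottom, top, ic50, hill), _ = curve_fit(logistic4, dilution, signal, p0=p0)
print(f"50% neutralization titer ~ 1:{ic50:.0f}")
```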
CASTIN: a system for comprehensive analysis of cancer-stromal interactome.
Komura, Daisuke; Isagawa, Takayuki; Kishi, Kazuki; Suzuki, Ryohei; Sato, Reiko; Tanaka, Mariko; Katoh, Hiroto; Yamamoto, Shogo; Tatsuno, Kenji; Fukayama, Masashi; Aburatani, Hiroyuki; Ishikawa, Shumpei
2016-11-09
Cancer microenvironment plays a vital role in cancer development and progression, and cancer-stromal interactions have been recognized as important targets for cancer therapy. However, identifying relevant and druggable cancer-stromal interactions is challenging due to the lack of quantitative methods for analyzing the whole cancer-stromal interactome. We present CASTIN (CAncer-STromal INteractome analysis), a novel framework for the evaluation of the cancer-stromal interactome from RNA-Seq data using cancer xenograft models. For each ligand-receptor interaction derived from a curated protein-protein interaction database, CASTIN summarizes the gene expression profiles of cancer and stroma into three evaluation indices. These indices provide quantitative evaluation and comprehensive visualization of the interactome, and thus enable identification of critical cancer-microenvironment interactions that would be potential drug targets. We applied CASTIN to a pancreatic ductal adenocarcinoma dataset, successfully characterized individual cancers in terms of cancer-stromal relationships, and identified both well-known and less-characterized druggable interactions. CASTIN provides a comprehensive view of the cancer-stromal interactome and is useful for identifying critical interactions that may serve as potential drug targets in the cancer microenvironment. CASTIN is available at: http://github.com/tmd-gpat/CASTIN .
Four-point bending as a method for quantitatively evaluating spinal arthrodesis in a rat model.
Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F
2015-02-01
The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague-Dawley rat spines after single-level posterolateral fusion procedures at L4-L5. Segments were classified as 'not fused,' 'restricted motion,' or 'fused' by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4-L5 motion segment, and stiffness was measured as the slope of the moment-displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery.
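A minimal sketch of the stiffness computation described: in four-point bending, the moment between the inner loading points is constant, so stiffness is the fitted slope of the moment-displacement curve. The rig geometry and data below are assumed for illustration.

```python
import numpy as np

# Four-point bending of an explanted L4-L5 segment: between the inner contacts
# the bending moment is constant, M = F * a / 2, where a is the distance from
# an outer support to the adjacent inner loading point (assumed rig geometry).
a = 0.01                                     # m, outer-to-inner span
force = np.array([0.0, 0.5, 1.0, 1.5, 2.0])  # N, applied load steps
angle = np.array([0.0, 0.011, 0.023, 0.034, 0.046])  # rad, angular displacement

moment = force * a / 2.0
stiffness = np.polyfit(angle, moment, 1)[0]  # slope of moment-displacement curve
print(f"flexion stiffness ~ {stiffness:.3f} N*m/rad")
```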
Systems-Level Analysis of Innate Immunity
Zak, Daniel E.; Tam, Vincent C.; Aderem, Alan
2014-01-01
Systems-level analysis of biological processes strives to comprehensively and quantitatively evaluate the interactions between the relevant molecular components over time, thereby enabling development of models that can be employed to ultimately predict behavior. Rapid development in measurement technologies (omics), when combined with the accessible nature of the cellular constituents themselves, is allowing the field of innate immunity to take significant strides toward this lofty goal. In this review, we survey exciting results derived from systems biology analyses of the immune system, ranging from gene regulatory networks to influenza pathogenesis and systems vaccinology. PMID:24655298
Goda, Ryoya; Kobayashi, Nobuhiro
2012-05-01
To evaluate the usefulness of peptide adsorption-controlled liquid chromatography-tandem mass spectrometry (PAC-LC-MS/MS) for reproducible measurement of peptides in biological fluids, simultaneous quantitation of the amyloid β 1-38, 1-40, 1-42 and 1-43 peptides (Aβ38, Aβ40, Aβ42 and Aβ43) in dog cerebrospinal fluid (CSF) was attempted. A stable-isotope-labeled analog of each Aβ peptide was used as the internal standard to minimize the influence of the CSF matrix on reproducible Aβ quantitation. To reduce loss of Aβ during pretreatment, dog CSF diluted with water-acetic acid-methanol (2:6:1, v/v/v) was loaded onto the PAC-LC-MS/MS system directly. Quantification of Aβ in the diluted dog CSF was carried out in multiple reaction monitoring (MRM) mode. The [M+5H]5+ ion and the corresponding b5+ fragment ion of each peptide were chosen as the precursor and product ions for the MRM transitions. Calibration curves were constructed from Aβ standard calibration solutions using PAC-LC-MS/MS. Analysis of dog CSF samples suggests that the basal concentrations of Aβ38, Aβ40, Aβ42 and Aβ43 in dog CSF are approximately 300, 900, 200 and 30 pM, respectively. This is the first time Aβ concentrations in dog CSF have been reported. Additionally, evaluation of the intra- and inter-day reproducibility of the analysis of Aβ standard solutions, together with their freeze-thaw and room-temperature stability, suggests that the PAC-LC-MS/MS method enables reproducible Aβ quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.
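A sketch of isotope-dilution calibration of the kind used here, assuming a linear response of the analyte-to-internal-standard peak-area ratio; the numbers are invented.

```python
import numpy as np

# MRM calibration: peak-area ratio of an Abeta peptide to its stable-isotope-
# labeled internal standard vs spiked concentration (values illustrative).
conc_pM = np.array([30.0, 100.0, 300.0, 1000.0, 3000.0])
area_ratio = np.array([0.032, 0.11, 0.31, 1.05, 3.02])   # analyte / SIL-IS

slope, intercept = np.polyfit(conc_pM, area_ratio, 1)

def quantify(ratio):
    """Back-calculate peptide concentration from an observed area ratio."""
    return (ratio - intercept) / slope

print(f"Abeta40 in CSF sample: ~{quantify(0.95):.0f} pM")
```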
ERIC Educational Resources Information Center
Walters, Charles David
2017-01-01
Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008)…
Multidimensional quantitative analysis of mRNA expression within intact vertebrate embryos.
Trivedi, Vikas; Choi, Harry M T; Fraser, Scott E; Pierce, Niles A
2018-01-08
For decades, in situ hybridization methods have been essential tools for studies of vertebrate development and disease, as they enable qualitative analyses of mRNA expression in an anatomical context. Quantitative mRNA analyses typically sacrifice the anatomy, relying on embryo microdissection, dissociation, cell sorting and/or homogenization. Here, we eliminate the trade-off between quantitation and anatomical context, using quantitative in situ hybridization chain reaction (qHCR) to perform accurate and precise relative quantitation of mRNA expression with subcellular resolution within whole-mount vertebrate embryos. Gene expression can be queried in two directions: read-out from anatomical space to expression space reveals co-expression relationships in selected regions of the specimen; conversely, read-in from multidimensional expression space to anatomical space reveals those anatomical locations in which selected gene co-expression relationships occur. As we demonstrate by examining gene circuits underlying somitogenesis, quantitative read-out and read-in analyses provide the strengths of flow cytometry expression analyses, but by preserving subcellular anatomical context, they enable bi-directional queries that open a new era for in situ hybridization. © 2018. Published by The Company of Biologists Ltd.
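A toy illustration of the read-out/read-in duality on a synthetic voxel table; the positions and expression values are random stand-ins, not qHCR data.

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy voxel table: (x, y, z) position plus expression of two genes per voxel.
pos = rng.uniform(0, 100, size=(5000, 3))
expr = rng.lognormal(mean=1.0, sigma=0.6, size=(5000, 2))

# Read-out: anatomical space -> expression space. Select voxels in a region
# (here a crude box) and inspect their co-expression relationship.
region = (pos[:, 0] < 30) & (pos[:, 1] < 40)
co = np.corrcoef(expr[region, 0], expr[region, 1])[0, 1]
print(f"gene1/gene2 correlation within region: {co:.2f}")

# Read-in: expression space -> anatomy. Gate on a co-expression signature and
# recover where in the specimen those voxels sit.
gate = (expr[:, 0] > 4.0) & (expr[:, 1] > 4.0)
print(f"{gate.sum()} voxels pass the gate; centroid at {pos[gate].mean(axis=0)}")
```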
In situ cross-linking of stimuli-responsive hemicellulose microgels during spray drying.
Zhao, Weifeng; Nugroho, Robertus Wahyu N; Odelius, Karin; Edlund, Ulrica; Zhao, Changsheng; Albertsson, Ann-Christine
2015-02-25
Chemical cross-linking during spray drying offers the potential for green fabrication of microgels with a rapid stimuli response and good blood compatibility and provides a platform for stimuli-responsive hemicellulose microgels (SRHMGs). The cross-linking reaction occurs rapidly in situ at elevated temperature during spray drying, enabling the production of microgels in a large scale within a few minutes. The SRHMGs with an average size range of ∼ 1-4 μm contain O-acetyl-galactoglucomannan as a matrix and poly(acrylic acid), aniline pentamer (AP), and iron as functional additives, which are responsive to external changes in pH, electrochemical stimuli, magnetic field, or dual-stimuli. The surface morphologies, chemical compositions, charge, pH, and mechanical properties of these smart microgels were evaluated using scanning electron microscopy, IR, zeta potential measurements, pH evaluation, and quantitative nanomechanical mapping, respectively. Different oxidation states were observed when AP was introduced, as confirmed by UV spectroscopy and cyclic voltammetry. Systematic blood compatibility evaluations revealed that the SRHMGs have good blood compatibility. This bottom-up strategy to synthesize SRHMGs enables a new route to the production of smart microgels for biomedical applications.
Crawford, S; Garrard, J
2013-01-01
This mixed methods study was a comprehensive impact-process evaluation of the Ride2School program in metropolitan and regional areas in Victoria, Australia. The program aimed to promote transport to school for primary school children. Qualitative and quantitative data were collected at baseline and followup from two primary schools involved in the pilot phase of the program and two matched comparison schools, and a further 13 primary schools that participated in the implementation phase of the program. Classroom surveys, structured and unstructured observations, and interviews with Ride2School program staff were used to evaluate the pilot program. For the 13 schools in the second phase of the program, parents and students completed questionnaires at baseline (N = 889) and followup (N = 761). Based on the quantitative data, there was little evidence of an overall increase in active transport to school across participating schools, although impacts varied among individual schools. Qualitative data in the form of observations, interviews, and focus group discussions with students, school staff, and program staff provided insight into the reasons for variable program impacts. This paper highlights the benefits of undertaking a mixed methods approach to evaluating active transport to school programs that enables both measurement and understanding of program impacts.
NASA Astrophysics Data System (ADS)
Morita, Yoshifumi; Hirose, Akinori; Uno, Takashi; Uchida, Masaki; Ukai, Hiroyuki; Matsui, Nobuyuki
2007-12-01
In this paper we propose a new rehabilitation training support system for upper limbs. The proposed system enables therapists to quantitatively evaluate the therapeutic effect of training on upper limb motor function, to easily change the resistance load during training, and to easily develop new training programs suited to individual subjects. For this purpose we develop control algorithms for training programs on a 3D force display robot, which has a parallel link mechanism driven by three motors. A control algorithm simulating sanding training is developed for the robot, together with a teaching/training function algorithm that enables therapists to easily create training trajectories suited to each subject's condition. The effectiveness of the developed control algorithms is verified by experiments.
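The abstract does not specify the control law; the sketch below shows one plausible impedance-style rendering of a virtual sanding task, a stiff wall normal force plus kinetic friction opposing tangential motion, with all parameters invented.

```python
import numpy as np

def sanding_force(pos, vel, k_wall=2000.0, b=5.0, mu=0.6):
    """Force rendered by the display for a virtual sanding board in the z=0
    plane: penalty-based wall normal force plus kinetic friction opposing
    tangential motion (parameters illustrative; mu sets the training load)."""
    if pos[2] < 0:                                   # tool penetrates the board
        fz = -k_wall * pos[2] - b * vel[2]
    else:
        fz = 0.0
    v_t = np.array([vel[0], vel[1], 0.0])            # tangential velocity
    speed = np.linalg.norm(v_t)
    f_t = -mu * abs(fz) * v_t / speed if speed > 1e-6 else np.zeros(3)
    return f_t + np.array([0.0, 0.0, fz])

# Tool pressed 2 mm into the board while moving along x at 0.2 m/s.
print(sanding_force(np.array([0.1, 0.0, -0.002]), np.array([0.2, 0.0, 0.0])))
```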
Ghana's National Health insurance scheme and maternal and child health: a mixed methods study.
Singh, Kavita; Osei-Akoto, Isaac; Otchere, Frank; Sodzi-Tettey, Sodzi; Barrington, Clare; Huang, Carolyn; Fordham, Corinne; Speizer, Ilene
2015-03-17
Ghana is attracting global attention for efforts to provide health insurance to all citizens through the National Health Insurance Scheme (NHIS). With the program's strong emphasis on maternal and child health, an expectation of the program is that members will have increased use of relevant services. This paper uses qualitative and quantitative data from a baseline assessment for the Maternal and Newborn Referrals Evaluation in the Northern and Central Regions to describe women's experiences with the NHIS and to study associations between insurance and skilled facility delivery, antenatal care and early care-seeking for sick children. The assessment included a quantitative household survey (n = 1267 women), a quantitative community leader survey (n = 62), qualitative birth narratives with mothers (n = 20) and fathers (n = 18), key informant interviews with health care workers (n = 5) and focus groups (n = 3) with community leaders and stakeholders. The key independent variables for the quantitative analyses were health insurance coverage during the past three years (categorized as all three years, 1-2 years or no coverage) and health insurance during the exact time of pregnancy. Quantitative findings indicate that insurance coverage during the past three years and insurance during pregnancy were associated with greater use of facility delivery but not ANC. Respondents with insurance were also significantly more likely to indicate that an illness need not be severe for them to take a sick child for care. The NHIS does appear to enable pregnant women to access services and allow caregivers to seek care early for sick children, but both the quantitative and qualitative assessments also indicated that the poor and least educated were less likely to have insurance than their wealthier and more educated counterparts. Findings from the qualitative interviews uncovered specific challenges women faced in registering for the NHIS, as well as other barriers such as a lack of understanding of who was eligible and which services were covered for free. Efforts should be undertaken to ensure that all individuals understand the NHIS policy, including who is eligible for free services and what services are covered. Increasing access to health insurance will enable Ghana to further improve maternal and child health outcomes.
Caetano, Fabiana A; Dirk, Brennan S; Tam, Joshua H K; Cavanagh, P Craig; Goiko, Maria; Ferguson, Stephen S G; Pasternak, Stephen H; Dikeakos, Jimmy D; de Bruyn, John R; Heit, Bryan
2015-12-01
Our current understanding of the molecular mechanisms which regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present Molecular Interactions in Super Resolution (MIiSR) software which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.
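As an illustration of the cluster-analysis component of such software, the sketch below applies DBSCAN to synthetic single-molecule localizations; MIiSR's actual algorithms and parameters may differ.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(3)
# Toy single-molecule localizations (nm): three clusters plus uniform background.
clusters = [rng.normal(c, 25, size=(150, 2))
            for c in ((500, 500), (1500, 800), (900, 1600))]
background = rng.uniform(0, 2000, size=(300, 2))
points = np.vstack(clusters + [background])

# Density-based clustering: eps is the neighborhood radius in nm.
labels = DBSCAN(eps=60.0, min_samples=10).fit_predict(points)
n_clusters = labels.max() + 1
print(f"{n_clusters} clusters; {np.sum(labels == -1)} background localizations")
for k in range(n_clusters):
    m = labels == k
    print(f"cluster {k}: {m.sum()} localizations, centroid {points[m].mean(axis=0)}")
```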
NASA Astrophysics Data System (ADS)
Braunagel, Margarita; Birnbacher, Lorenz; Willner, Marian; Marschner, Mathias; De Marco, Fabio; Viermetz, Manuel; Notohamiprodjo, Susan; Hellbach, Katharina; Auweter, Sigrid; Link, Vera; Woischke, Christine; Reiser, Maximilian F.; Pfeiffer, Franz; Notohamiprodjo, Mike; Herzen, Julia
2017-03-01
Current clinical imaging methods face limitations in the detection and correct characterization of different subtypes of renal cell carcinoma (RCC), although these are important for therapy and prognosis. The present study evaluates the potential of grating-based X-ray phase-contrast computed tomography (gbPC-CT) for the visualization and characterization of human RCC subtypes. The imaging results for 23 ex vivo formalin-fixed human kidney specimens obtained with phase-contrast CT were compared to the results of absorption-based CT (gbCT), clinical CT and 3T MRI, and were validated using histology. Regions of interest were placed on each specimen for quantitative evaluation. Qualitative and quantitative gbPC-CT imaging could significantly discriminate between normal kidney cortex (54 ± 4 HUp) and clear cell (42 ± 10), papillary (43 ± 6) and chromophobe RCCs (39 ± 7) (p < 0.05 for each). The sensitivity for detection of tumor areas was 100%, 50% and 40% for gbPC-CT, gbCT and clinical CT, respectively. RCC architecture such as fibrous strands, pseudocapsules, necrosis or hyalinization was depicted clearly in gbPC-CT and was not equally well visualized in gbCT, clinical CT and MRI. The results show that gbPC-CT enables improved discrimination of normal kidney parenchyma and tumorous tissue, as well as of different soft-tissue components of RCCs, without the use of contrast media.
Liu, Hueiming; Lindley, Richard; Alim, Mohammed; Felix, Cynthia; Gandhi, Dorcas B C; Verma, Shweta J; Tugnawat, Deepak Kumar; Syrigapu, Anuradha; Ramamurthy, Ramaprabhu Krishnappa; Pandian, Jeyaraj D; Walker, Marion; Forster, Anne; Anderson, Craig S; Langhorne, Peter; Murthy, Gudlavalleti Venkata Satyanarayana; Shamanna, Bindiganavale Ramaswamy; Hackett, Maree L; Maulik, Pallab K; Harvey, Lisa A; Jan, Stephen
2016-09-15
We are undertaking a randomised controlled trial (fAmily led rehabiliTaTion aftEr stroke in INDia, ATTEND) evaluating the training of a family carer to enable maximal rehabilitation of patients with stroke-related disability, as a potentially affordable, culturally acceptable and effective intervention for use in India. A process evaluation is needed to understand how and why this complex intervention may be effective, and to capture important barriers and facilitators to its implementation. We describe the protocol for our process evaluation to encourage the development of in-process evaluation methodology and transparency in reporting. The realist and RE-AIM (Reach, Effectiveness, Adoption, Implementation and Maintenance) frameworks informed the design. Mixed methods include semistructured interviews with health providers, patients and their carers, analysis of quantitative process data describing the fidelity and dose of the intervention, observations of trial set-up and implementation, and analysis of cost data from the perspective of patients and their families together with programme budgets. These qualitative and quantitative data will be analysed iteratively prior to knowing the quantitative outcomes of the trial, and then triangulated with the results from the primary outcome evaluation. The process evaluation has received ethical approval for all sites in India. In low-income and middle-income countries, the available human capital can form an approach to reducing the evidence-practice gap, compared with the high-cost alternatives available in established market economies. This process evaluation will provide insights into how such a programme can be implemented in practice and brought to scale. Through local stakeholder engagement and dissemination of findings globally, we hope to build on patient-centred, cost-effective and sustainable models of stroke rehabilitation. CTRI/2013/04/003557. Published by the BMJ Publishing Group Limited.
Verran, Joanna; Haigh, Carol; Brooks, Jane; Butler, Jonathan; Redfern, James
2018-05-31
There are many different initiatives, global and local, designed to raise awareness of antimicrobial resistance (AMR) and change audience behaviour. However, it is not possible to assess the impact of specific, small-scale events on national and international outcomes, although one might acknowledge some contribution to individual and collective knowledge and experience-focused 'science capital'. As with any research, in preparation for a public engagement event it is important to identify aims, and appropriate methods whose results might help satisfy those aims. The aim of this paper was therefore to develop, deliver and evaluate an event designed to engage an adult audience with AMR. The venue was a World War 2 air raid shelter, enabling comparison of the pre- and post-antibiotic eras via three different activity stations focusing on nursing, the search for new antibiotics, and investigations into novel antimicrobials. The use of observers released the presenters from evaluation duties, enabling them to focus on their specific activities. Qualitative measures of audience engagement were combined with quantitative data. The evaluation revealed that adult audiences can easily be absorbed into an activity, particularly if hands-on, after a brief introduction. This research demonstrates that hands-on practical engagement with AMR can enable high-level interaction and learning in an informal and enjoyable environment.
Jean-Louis, Girardin; Ayappa, Indu; Rapoport, David; Zizi, Ferdinand; Airhihenbuwa, Collins; Okuyemi, Kola; Ogedegbe, Gbenga
2016-02-01
The aim of this study was to evaluate the National Institutes of Health (NIH)-funded PRIDE Institute in Behavioral Medicine and Sleep Disorders Research at New York University (NYU) Langone Medical Center. The NYU PRIDE Institute provides intensive didactic and mentored research training to junior underrepresented minority (URM) faculty. The Kirkpatrick model, a mixed-methods program evaluation tool, was used to gather data on participants' satisfaction and program outcomes. Quantitative evaluation data were obtained from all 29 mentees using the PRIDE REDCap-based evaluation tool. In addition, in-depth interviews and focus groups were conducted with 17 mentees to learn about their experiences at the institute and their professional development activities. Quantitative data were examined, and emerging themes from in-depth interviews and focus groups were studied for patterns of connection and grouped into broader categories based on grounded theory. Overall, mentees rated all programmatic and mentoring aspects of the NYU PRIDE Institute very highly (80-100%). They identified the following areas as critical to their development: research and professional skills, mentorship, structured support and accountability, peer support, and continuous career development beyond the summer institute. Indicators of academic self-efficacy showed substantial improvement over time. Areas for improvement included tailoring programmatic activities to individual needs, greater assistance with publications, and identifying local mentors when K awards are sought. To promote career development, the numerous factors that uniquely influence URM investigators' ability to succeed should be addressed. The NYU PRIDE Institute, which provides exposure to a well-resourced academic environment, leadership and didactic skills building, and intensive individualized mentorship, proved successful in enabling URM mentees to excel in the academic environment. Overall, the institute accomplished its goal: to build an infrastructure that enables junior URM faculty to network with one another, and with senior investigators who serve as role models, in a supportive academic environment.
NASA Astrophysics Data System (ADS)
Viswanath, Satish; Toth, Robert; Rusu, Mirabela; Sperling, Dan; Lepor, Herbert; Futterer, Jurgen; Madabhushi, Anant
2013-03-01
Laser interstitial thermal therapy (LITT) has recently shown great promise as a treatment strategy for localized, focal, low-grade, organ-confined prostate cancer (CaP). Additionally, LITT is compatible with multi-parametric magnetic resonance imaging (MP-MRI), which in turn enables (1) high-resolution, accurate localization of ablation zones on in vivo MP-MRI prior to LITT, and (2) real-time monitoring of temperature changes in vivo via MR thermometry during LITT. In spite of rapidly increasing interest in the use of LITT for treating low-grade, focal CaP, very little is known about treatment-related changes following LITT. There is thus a clear need for studying post-LITT changes via MP-MRI and consequently to attempt to (1) quantitatively identify MP-MRI markers predictive of favorable treatment response and longer-term patient outcome, and (2) identify which MP-MRI markers are most sensitive to post-LITT changes in the prostate. In this work, we present the first attempt at examining focal treatment-related changes on a per-voxel basis (high resolution) via quantitative evaluation of MR parameters pre- and post-LITT. A retrospective cohort of MP-MRI data comprising both pre- and post-LITT T2-weighted (T2w) and diffusion-weighted (DWI) acquisitions was considered, where DWI MRI yielded an apparent diffusion coefficient (ADC) map. A spatially constrained affine registration scheme was implemented to first bring T2w and ADC images into alignment within each of the pre- and post-LITT acquisitions, following which the pre- and post-LITT acquisitions were aligned. Pre- and post-LITT MR parameters (T2w intensity, ADC value) were then standardized to a uniform scale (to correct for intensity drift) and quantified via the raw intensity values as well as via texture features derived from T2w MRI. In order to quantify imaging changes as a result of LITT, absolute differences were calculated between the normalized pre- and post-LITT MRI parameters. Quantitatively combining the ADC and T2w MRI parameters enabled construction of an integrated MP-MRI difference map that was highly indicative of changes specific to the LITT ablation zone. Preliminary quantitative comparison of the changes in different MR parameters indicated that T2w texture may be highly sensitive as well as specific in identifying changes within the ablation zone pre- and post-LITT. Visual evaluation of the differences in T2w texture features pre- and post-LITT also appeared to provide an indication of LITT-related effects such as edema. Our preliminary results thus indicate great potential for non-invasive MP-MRI imaging markers for determining focal treatment-related changes, and hence long- and short-term patient outcome.
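As a rough illustration of the final step above, the sketch below (assuming co-registered numpy arrays and a prostate mask; the equal weighting is an assumption, not the authors' choice) combines per-voxel absolute pre/post-LITT differences of T2w and ADC maps into a single integrated difference map.

```python
import numpy as np

def standardized(x, mask):
    """Standardize an MR parameter map to zero mean, unit std inside the mask,
    a simple stand-in for the intensity-drift correction described above."""
    vals = x[mask]
    return (x - vals.mean()) / vals.std()

def mpmri_difference_map(t2w_pre, t2w_post, adc_pre, adc_post, mask,
                         w_t2=0.5, w_adc=0.5):
    """Per-voxel absolute pre/post-LITT differences of standardized T2w and
    ADC maps, combined into one change map (weights are assumed)."""
    d_t2 = np.abs(standardized(t2w_post, mask) - standardized(t2w_pre, mask))
    d_adc = np.abs(standardized(adc_post, mask) - standardized(adc_pre, mask))
    return w_t2 * d_t2 + w_adc * d_adc
```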
Nonis, Alberto; Vezzaro, Alice; Ruperti, Benedetto
2012-07-11
Genome-wide transcriptomic surveys, together with targeted molecular studies, are uncovering an ever-increasing number of differentially expressed genes in relation to agriculturally relevant processes in olive (Olea europaea L.). These data need to be supported by quantitative approaches enabling the precise estimation of transcript abundance. As qPCR is the most widely adopted technique for mRNA quantification, preliminary work needs to be done to set up robust methods for the extraction of fully functional RNA and for the identification of the best reference genes, in order to obtain reliable quantification of transcripts. In this work, we assessed different methods for their suitability for RNA extraction from olive fruits and leaves, and we evaluated thirteen potential candidate reference genes on 21 RNA samples belonging to a fruit developmental/ripening series and to leaves subjected to wounding. Using two different algorithms, GAPDH2 and PP2A1 were identified as the best reference genes for olive fruit development and ripening, and their effectiveness for normalization of the expression of two ripening marker genes was demonstrated.
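The abstract does not name the two algorithms; the sketch below illustrates one widely used measure of this kind, a geNorm-style stability value M (mean standard deviation of pairwise log-ratios; lower is more stable), assuming a matrix of Cq values and an assumed amplification efficiency of 2.

```python
import numpy as np

def genorm_stability(cq, efficiency=2.0):
    """cq: (n_samples, n_genes) quantification cycles for candidate reference
    genes. Returns a geNorm-like stability value M per gene, computed from
    pairwise log2 expression ratios across samples."""
    log_expr = np.log2(efficiency ** (-np.asarray(cq, float)))
    n_genes = log_expr.shape[1]
    m = np.zeros(n_genes)
    for j in range(n_genes):
        # log-ratios of gene j against every other candidate, per sample
        ratios = log_expr[:, [j]] - np.delete(log_expr, j, axis=1)
        m[j] = ratios.std(axis=0, ddof=1).mean()
    return m
```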
Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram
2016-01-01
Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods.
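A minimal sketch of one of the implemented metrics, the radially averaged 1D NPS, is given below, assuming an ensemble of same-location 2D ROIs from repeated uniform-phantom scans; normalization details in the actual framework may differ.

```python
import numpy as np

def nps_1d(rois, pixel_size_mm):
    """Radially averaged 1D noise-power spectrum from an ensemble of 2D ROIs
    (shape (n, ny, nx)) taken at the same location in a uniform phantom."""
    rois = np.asarray(rois, dtype=float)
    n, ny, nx = rois.shape
    detrended = rois - rois.mean(axis=0)          # remove structured background
    nps2d = np.zeros((ny, nx))
    for roi in detrended:
        nps2d += np.abs(np.fft.fft2(roi)) ** 2
    nps2d *= pixel_size_mm ** 2 / (n * ny * nx)   # ICRU-style normalization
    nps2d = np.fft.fftshift(nps2d)

    # radial average over spatial frequency
    fy = np.fft.fftshift(np.fft.fftfreq(ny, d=pixel_size_mm))
    fx = np.fft.fftshift(np.fft.fftfreq(nx, d=pixel_size_mm))
    fr = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    bins = np.linspace(0.0, fr.max(), 50)
    idx = np.digitize(fr.ravel(), bins)
    radial = np.array([nps2d.ravel()[idx == i].mean() if np.any(idx == i)
                       else np.nan for i in range(1, len(bins))])
    return bins[1:], radial  # frequency (1/mm), NPS
```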
Non-contact versus contact-based sensing methodologies for in-home upper arm robotic rehabilitation.
Howard, Ayanna; Brooks, Douglas; Brown, Edward; Gebregiorgis, Adey; Chen, Yu-Ping
2013-06-01
In recent years, robot-assisted rehabilitation has gained momentum as a viable means of improving outcomes for therapeutic interventions. Such therapy experiences allow controlled and repeatable trials and quantitative evaluation of mobility metrics. Typically, though, these robotic devices have been focused on rehabilitation within a clinical setting. In these traditional robot-assisted rehabilitation studies, participants are required to perform goal-directed movements with the robot during a therapy session. This requires physical contact between the participant and the robot to enable precise control of the task, as well as a means to collect relevant performance data. On the other hand, non-contact means of robot interaction can provide a safe methodology for extracting the control data needed for in-home rehabilitation. As such, in this paper we discuss contact-based and non-contact methods for upper-arm rehabilitation exercises that enable quantification of upper-arm movements. We evaluate our methodology on upper-arm abduction/adduction movements and discuss the advantages and limitations of each approach as applied to an in-home rehabilitation scenario.
Method and platform standardization in MRM-based quantitative plasma proteomics.
Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H
2013-12-16
There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics.
Maity, Maitreya; Dhane, Dhiraj; Mungle, Tushar; Maiti, A K; Chakraborty, Chandan
2017-10-26
Web-enabled e-healthcare systems, or computer-assisted disease diagnosis, have the potential to improve the quality and service of conventional healthcare delivery. This article describes the design and development of a web-based distributed healthcare management system for medical information and the quantitative evaluation of microscopic images using a machine learning approach for malaria. In the proposed study, all health-care centres are connected in a distributed computer network. Each peripheral centre manages its own health-care service independently and communicates with the central server for remote assistance. The proposed methodology for automated evaluation of parasites includes pre-processing of blood smear microscopic images followed by erythrocyte segmentation. To differentiate between different parasites, a total of 138 quantitative features characterising colour, morphology, and texture are extracted from segmented erythrocytes. An integrated pattern classification framework is designed in which four feature selection methods, viz. Correlation-based Feature Selection (CFS), Chi-square, Information Gain, and RELIEF, are each paired with three different classifiers, i.e. Naive Bayes, C4.5, and Instance-Based Learning (IB1). The optimal feature subset with the best classifier is selected to achieve maximum diagnostic precision. The proposed method achieved 99.2% sensitivity and 99.6% specificity by combining CFS and C4.5, outperforming the other combinations. Moreover, the web-based tool is entirely designed using open standards, such as Java for the web application, ImageJ for image processing, and WEKA for data mining, considering its feasibility in rural places with minimal health-care facilities.
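The CFS + C4.5 combination here is WEKA-specific; as a hedged illustration of the same selection-plus-classification pattern in Python, the sketch below uses chi-square selection and scikit-learn's CART decision tree (a stand-in for C4.5) on placeholder data.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

# Placeholder data: 200 segmented erythrocytes x 138 non-negative features
X = np.random.rand(200, 138)
y = np.random.randint(0, 2, 200)   # parasitized vs. not (illustrative labels)

clf = Pipeline([
    ("select", SelectKBest(chi2, k=30)),                 # keep 30 best features
    ("tree", DecisionTreeClassifier(max_depth=8, random_state=0)),
])
scores = cross_val_score(clf, X, y, cv=5, scoring="recall")   # i.e. sensitivity
print("cross-validated sensitivity: %.3f" % scores.mean())
```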
Sawa, Mitsuru
2011-03-01
1. Slit-lamp microscopy is a principal ophthalmic clinical method because it provides microscopic findings of the anterior segment of the eye noninvasively. Its findings, however, are qualitative, and there are large inter-observer variations in their evaluation. Furthermore, slit-lamp microscopy provides morphological findings, but functional evaluation is difficult. We developed two novel methods that establish a quantitative methodology for the slit-lamp microscope and the pathophysiology of the anterior segment of the eye. One is the flare-cell photometer, to evaluate flare and cells in the aqueous humor of the eye; the other is an immunohistochemical examination method using tear fluid to evaluate ocular surface disorders. The comprehensive evaluation of these studies is overviewed herein. 2. INNOVATION OF THE FLARE-CELL PHOTOMETER AND ITS CLINICAL SIGNIFICANCE: The breakdown of the blood-aqueous barrier (BAB) causes an increase in protein (flare) and leakage of blood cells (cells) into the aqueous humor of the eye, and the severity of BAB breakdown has a positive correlation with the intensity of flare and cells. The flare and cells in the aqueous can be observed qualitatively by slit-lamp microscopy. These findings are primarily distinguished in optics by light scattering. Therefore, detection of the intensity of light scattering due to flare and cells can evaluate BAB function. The flare-cell photometer comprises three novel components: a laser beam system as incident light, a photomultiplier to detect scattered light intensity, and a computer-assisted system to operate the whole instrument and analyze the detected scattered light signals due to flare and cells. The instrument enables us to analyze flare and cells quantitatively, noninvasively and accurately, with a wide dynamic measurement range, allowing repeated examination of each individual case. It also enables the evaluation of inflammation in the aqueous not only postoperatively but also in endogenous uveitis, evaluation of the effects of anti-inflammatory drugs on the BAB, and evaluation of aqueous humor dynamics. Furthermore, repeated examination can minimize inter-individual variations and reduce the number of animals in animal experiments. 3. Sampling of tears can be performed noninvasively, but the obtainable volume is limited. Therefore, the determination of target biomarkers and the development of micro-volume analysis methods play a crucial role in pathophysiological studies of the ocular surface. Target biomarkers should be determined according to the various specified bioactive substances such as eosinophil cationic protein (ECP), cytokines and others. A number of micro-volume analysis methods, such as chemiluminescent enzyme immunoassay, immunochromatography, micro-array systems and the polymerase chain reaction, are used. Disorders targeted in the studies include allergic conjunctivitis and infectious diseases such as herpetic keratitis. Quantitative evaluation methods for ECP concentration, antigen-specific secretory IgA in allergic diseases and herpetic keratitis, herpes simplex virus DNA, and cytokine and chemokine profiles in tear fluid sampled by the filter paper method were investigated. We developed a clinically applicable quantitative immunochemical method for ECP concentration in tear fluid. The results revealed that tear fluid analysis using the above-mentioned methods is clinically useful for investigating the pathophysiology of the ocular surface.
4. Laser flare-cell photometry and tear fluid analysis are potent clinical quantitative methods for investigating the pathophysiology of the anterior segment of the eye.
Isola, A A; Schmitt, H; van Stevendaal, U; Begemann, P G; Coulon, P; Boussel, L; Grass, M
2011-09-21
Large area detector computed tomography systems with fast rotating gantries enable volumetric dynamic cardiac perfusion studies. Prospectively ECG-triggered acquisitions limit the data acquisition to a predefined cardiac phase and thereby reduce x-ray dose and limit motion artefacts. Even in the case of highly accurate prospective triggering and a stable heart rate, spatial misalignment of the cardiac volumes acquired and reconstructed per cardiac cycle may occur due to small motion pattern variations from cycle to cycle. These misalignments reduce the accuracy of the quantitative analysis of myocardial perfusion parameters on a per-voxel basis. An image-based solution to this problem is elastic 3D image registration of dynamic volume sequences with variable contrast, as introduced in this contribution. After circular cone-beam CT reconstruction of cardiac volumes covering large areas of the myocardial tissue, the complete series is aligned with respect to a chosen reference volume. The results of the registration process and of the perfusion analysis with and without registration are evaluated quantitatively in this paper. The spatial alignment leads to improved quantification of myocardial perfusion for three different pig data sets.
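One possible implementation of such an elastic alignment of contrast-varying volumes (not the authors' code) is a mutual-information-driven B-spline registration, e.g. with SimpleITK; file names and parameters below are placeholders.

```python
import SimpleITK as sitk

# Mutual information tolerates the contrast changes across the dynamic series.
fixed = sitk.ReadImage("volume_reference.mhd", sitk.sitkFloat32)
moving = sitk.ReadImage("volume_t05.mhd", sitk.sitkFloat32)

init = sitk.BSplineTransformInitializer(fixed, [8, 8, 8])  # control-point mesh
reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsLBFGSB(gradientConvergenceTolerance=1e-5,
                         numberOfIterations=100)
reg.SetInitialTransform(init, inPlace=True)
reg.SetInterpolator(sitk.sitkLinear)
transform = reg.Execute(fixed, moving)

aligned = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0,
                        moving.GetPixelID())
sitk.WriteImage(aligned, "volume_t05_aligned.mhd")
```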
Workstation-Based Simulation for Rapid Prototyping and Piloted Evaluation of Control System Designs
NASA Technical Reports Server (NTRS)
Mansur, M. Hossein; Colbourne, Jason D.; Chang, Yu-Kuang; Aiken, Edwin W. (Technical Monitor)
1998-01-01
The development and optimization of flight control systems for modern fixed- and rotary-wing aircraft consume a significant portion of the overall time and cost of aircraft development. Substantial savings can be achieved if the time and cost required to develop and flight test the control system are reduced. To bring about such reductions, software tools such as MATLAB/Simulink are being used to readily implement block diagrams and rapidly evaluate the expected responses of the completed system. Moreover, tools such as CONDUIT (CONtrol Designer's Unified InTerface) have been developed that enable controls engineers to optimize their control laws and ensure that all the relevant quantitative criteria are satisfied, all within a fully interactive, user-friendly, unified software environment.
ERIC Educational Resources Information Center
Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr
2018-01-01
The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…
NASA Astrophysics Data System (ADS)
Kemper, Björn; Bauwens, Andreas; Vollmer, Angelika; Ketelhut, Steffi; Langehanenberg, Patrik; Müthing, Johannes; Karch, Helge; von Bally, Gert
2010-05-01
Digital holographic microscopy (DHM) enables quantitative multifocus phase contrast imaging for nondestructive technical inspection and live cell analysis. Time-lapse investigations on human brain microvascular endothelial cells demonstrate the use of DHM for label-free dynamic quantitative monitoring of cell division of mother cells into daughter cells. Cytokinetic DHM analysis provides future applications in toxicology and cancer research.
Rudnick, Paul A.; Clauser, Karl R.; Kilpatrick, Lisa E.; Tchekhovskoi, Dmitrii V.; Neta, Pedatsur; Blonder, Nikša; Billheimer, Dean D.; Blackman, Ronald K.; Bunk, David M.; Cardasis, Helene L.; Ham, Amy-Joan L.; Jaffe, Jacob D.; Kinsinger, Christopher R.; Mesri, Mehdi; Neubert, Thomas A.; Schilling, Birgit; Tabb, David L.; Tegeler, Tony J.; Vega-Montoto, Lorenzo; Variyath, Asokan Mulayath; Wang, Mu; Wang, Pei; Whiteaker, Jeffrey R.; Zimmerman, Lisa J.; Carr, Steven A.; Fisher, Susan J.; Gibson, Bradford W.; Paulovich, Amanda G.; Regnier, Fred E.; Rodriguez, Henry; Spiegelman, Cliff; Tempst, Paul; Liebler, Daniel C.; Stein, Stephen E.
2010-01-01
A major unmet need in LC-MS/MS-based proteomics analyses is a set of tools for quantitative assessment of system performance and evaluation of technical variability. Here we describe 46 system performance metrics for monitoring chromatographic performance, electrospray source stability, MS1 and MS2 signals, dynamic sampling of ions for MS/MS, and peptide identification. Applied to data sets from replicate LC-MS/MS analyses, these metrics displayed consistent, reasonable responses to controlled perturbations. The metrics typically displayed variations less than 10% and thus can reveal even subtle differences in performance of system components. Analyses of data from interlaboratory studies conducted under a common standard operating procedure identified outlier data and provided clues to specific causes. Moreover, interlaboratory variation reflected by the metrics indicates which system components vary the most between laboratories. Application of these metrics enables rational, quantitative quality assessment for proteomics and other LC-MS/MS analytical applications. PMID:19837981
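A sketch of how such replicate variation might be summarized (metric names and values below are illustrative, not from the study): metrics whose coefficient of variation exceeds roughly 10% would flag a drifting system component.

```python
import numpy as np

def metric_cv(runs):
    """runs: dict mapping metric name -> values across replicate LC-MS/MS
    analyses. Returns the percent coefficient of variation per metric."""
    return {name: 100.0 * np.std(v, ddof=1) / np.mean(v)
            for name, v in runs.items()}

cvs = metric_cv({
    "median_MS1_peak_width_s": [22.1, 21.8, 23.0, 22.5],  # illustrative values
    "MS2_spectra_per_min":     [310, 298, 305, 322],
})
print(cvs)
```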
Chen, Yi; Fisher, Kate J.; Lloyd, Mark; Wood, Elizabeth R.; Coppola, Domenico; Siegel, Erin; Shibata, David; Chen, Yian A.; Koomen, John M.
2017-01-01
Quantitative evaluation of protein expression across multiple cancer-related signaling pathways (e.g. Wnt/β-catenin, TGF-β, receptor tyrosine kinases (RTK), MAP kinases, NF-κB, and apoptosis) in tumor tissues may enable the development of a molecular profile for each individual tumor that can aid in the selection of appropriate targeted cancer therapies. Here, we describe the development of a broadly applicable protocol to develop and implement quantitative mass spectrometry assays using cell line models and frozen tissue specimens from colon cancer patients. Cell lines are used to develop peptide-based assays for protein quantification, which are incorporated into a method based on SDS-PAGE protein fractionation, in-gel digestion, and liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM/MS). This analytical platform is then applied to frozen tumor tissues. This protocol can be broadly applied to the study of human disease using multiplexed LC-MRM assays. PMID:28808993
Li, Changqing; Zhao, Hongzhi; Anderson, Bonnie; Jiang, Huabei
2006-03-01
We describe a compact diffuse optical tomography system specifically designed for breast imaging. The system consists of 64 silicon photodiode detectors, 64 excitation points, and 10 diode lasers in the near-infrared region, allowing multispectral, three-dimensional optical imaging of breast tissue. We also detail the system performance and optimization through a calibration procedure. The system is evaluated using tissue-like phantom experiments and an in vivo clinical experiment. Quantitative two-dimensional (2D) and three-dimensional (3D) images of absorption and reduced scattering coefficients are obtained from these experiments. The ten-wavelength spectra of the extracted reduced scattering coefficient enable quantitative morphological images to be reconstructed with this system. From the in vivo clinical experiment, functional images including deoxyhemoglobin, oxyhemoglobin, and water concentration are recovered, and tumors are detected with correct size and position compared with mammography.
Research-IQ: Development and Evaluation of an Ontology-anchored Integrative Query Tool
Borlawsky, Tara B.; Lele, Omkar; Payne, Philip R. O.
2011-01-01
Investigators in the translational research and systems medicine domains require highly usable, efficient and integrative tools and methods that allow for the navigation of and reasoning over emerging large-scale data sets. Such resources must cover a spectrum of granularity from bio-molecules to population phenotypes. Given such information needs, we report upon the initial design and evaluation of an ontology-anchored integrative query tool, Research-IQ, which employs a combination of conceptual knowledge engineering and information retrieval techniques to enable the intuitive and rapid construction of queries, in terms of semi-structured textual propositions, that can subsequently be applied to integrative data sets. Our initial results, based upon both quantitative and qualitative evaluations of the efficacy and usability of Research-IQ, demonstrate its potential to increase clinical and translational research throughput. PMID:21821150
NASA Technical Reports Server (NTRS)
Zhog, Cheng Frank; Ye, Jing Yong; Norris, Theodore B.; Myc, Andrzej; Cao, Zhengyl; Bielinska, Anna; Thomas, Thommey; Baker, James R., Jr.
2004-01-01
Flow cytometry is a powerful technique for obtaining quantitative information from fluorescence in cells. Quantitation is achieved by assuring a high degree of uniformity in the optical excitation and detection, generally by using a highly controlled flow such as is obtained via hydrodynamic focusing. In this work, we demonstrate a two-beam, two-channel detection and two-photon excitation flow cytometry (T³FC) system that enables multi-dye analysis to be performed very simply, with greatly relaxed requirements on the fluid flow. Two-photon excitation using a femtosecond near-infrared (NIR) laser has the advantages that it enables simultaneous excitation of multiple dyes and achieves very high signal-to-noise ratio through simplified filtering and fluorescence background reduction. By matching the excitation volume to the size of a cell, single-cell detection is ensured. Labeling of cells by targeted nanoparticles with multiple fluorophores enables normalization of the fluorescence signal and thus ratiometric measurements under nonuniform excitation. Quantitative size measurements can also be done even under conditions of nonuniform flow via a two-beam layout. This innovative detection scheme not only considerably simplifies the fluid flow system and the excitation and collection optics, it opens the way to quantitative cytometry in simple and compact microfluidic systems, or in vivo. Real-time detection of fluorescent microbeads in the vasculature of a mouse ear demonstrates the ability to do flow cytometry in vivo. The conditions required to perform quantitative in vivo cytometry on labeled cells will be presented.
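The ratiometric idea can be sketched in a few lines, assuming per-event photon counts for a target and a reference dye on the same nanoparticle label; the background threshold is an assumption.

```python
import numpy as np

def ratiometric_signal(target_counts, reference_counts, threshold=50):
    """Per-event ratio of target-dye to reference-dye fluorescence. Both dyes
    on a labelled cell see the same (possibly nonuniform) two-photon
    excitation, so the ratio cancels excitation variation; events with
    reference counts below threshold are discarded as background."""
    t = np.asarray(target_counts, float)
    r = np.asarray(reference_counts, float)
    keep = r > threshold
    return t[keep] / r[keep]
```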
Vieira, Roberta Peixoto; Gomes, Sílvia Helena Pereira; Machado, Maria de Fátima Antero Sousa; Bezerra, Italla Maria Pinheiro; Machado, Caroline Antero
2014-01-01
To evaluate the participation of adolescents in the Family Health Strategy, based on a theoretical-methodological framework of enablers of participation. A quantitative study, conducted from December 2010 to March 2011, with 213 professionals of the FHS in the Cariri region, Ceará, Brazil. Data were collected through a questionnaire and organized in SPSS 18.0. The level of normative participation becomes manifest beginning with the adolescents' search for health services, motivated by disease (77.9%). Normative participation + independence appears when they seek prenatal care and family planning. Emancipatory participation was identified by the frequency of adolescents in group activities in the schools, and a move in the direction of the level of transformative participation was observed. In this context, it is understood that there is a need to stimulate the participatory process of adolescents for a change in health promotion in this group.
General practitioners learning qualitative research: A case study of postgraduate education.
Hepworth, Julie; Kay, Margaret
2015-10-01
Qualitative research is increasingly being recognised as a vital aspect of primary healthcare research. Teaching and learning how to conduct qualitative research is especially important for general practitioners and other clinicians in the professional educational setting. This article examines a case study of postgraduate professional education in qualitative research for clinicians, for the purpose of enabling a robust discussion around teaching and learning in medicine and the health sciences. A series of three workshops was delivered for primary healthcare academics. The workshops were evaluated using a quantitative survey and qualitative free-text responses to enable descriptive analyses. Participants found qualitative philosophy and theory the most difficult areas to engage with, and learning qualitative coding and analysis was considered the easiest to learn. Key elements for successful teaching were identified, including the use of adult learning principles, the value of an experienced facilitator and an awareness of the impact of clinical subcultures on learning.
CPTAC Assay Portal: a repository of targeted proteomic assays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiteaker, Jeffrey R.; Halusa, Goran; Hoofnagle, Andrew N.
2014-06-27
To address these issues, the Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) has launched an Assay Portal (http://assays.cancer.gov) to serve as a public repository of well-characterized, quantitative, MS-based, targeted proteomic assays. The purpose of the CPTAC Assay Portal is to facilitate widespread adoption of targeted MS assays by disseminating SOPs, reagents, and assay characterization data for highly characterized assays. A primary aim of the NCI-supported portal is to bring together clinicians or biologists and analytical chemists to answer hypothesis-driven questions using targeted, MS-based assays. Assay content is easily accessed through queries and filters, enabling investigators to find assays for proteins relevant to their areas of interest. Detailed characterization data are available for each assay, enabling researchers to evaluate assay performance prior to launching the assay in their own laboratory.
Fortier, Véronique; Levesque, Ives R
2018-06-01
Phase processing impacts the accuracy of quantitative susceptibility mapping (QSM). Techniques for phase unwrapping and background removal have been proposed and demonstrated mostly in brain. In this work, phase processing was evaluated in the context of large susceptibility variations (Δχ) and negligible signal, in particular for susceptibility estimation using the iterative phase replacement (IPR) algorithm. Continuous Laplacian, region-growing, and quality-guided unwrapping were evaluated. For background removal, Laplacian boundary value (LBV), projection onto dipole fields (PDF), sophisticated harmonic artifact reduction for phase data (SHARP), variable-kernel SHARP (V-SHARP), regularization-enabled SHARP (RESHARP), and 3D quadratic polynomial field removal were studied. Each algorithm was evaluated quantitatively in simulation and qualitatively in vivo. Additionally, IPR-QSM maps were produced to evaluate the impact of phase processing on the susceptibility in the context of large Δχ with negligible signal. Quality-guided unwrapping was the most accurate technique, whereas continuous Laplacian performed poorly in this context. All background removal algorithms tested resulted in important phase inaccuracies, suggesting that techniques used for brain do not translate well to situations where large Δχ and no or low signal are expected. LBV produced the smallest errors, followed closely by PDF. The results suggest that quality-guided unwrapping should be preferred, with PDF or LBV for background removal, for QSM in regions with large Δχ and negligible signal. This reduces the susceptibility inaccuracy introduced by phase processing. Accurate background removal remains an open question.
Quantitation of sweet steviol glycosides by means of a HILIC-MS/MS-SIDA approach.
Well, Caroline; Frank, Oliver; Hofmann, Thomas
2013-11-27
Meeting the rising consumer demand for natural food ingredients, steviol glycosides, the sweet principle of Stevia rebaudiana Bertoni (Bertoni), have recently been approved as food additives in the European Union. As regulatory constraints require sensitive methods to analyze the sweet-tasting steviol glycosides in foods and beverages, a HILIC-MS/MS method was developed enabling the accurate and reliable quantitation of the major steviol glycosides stevioside, rebaudiosides A-F, steviolbioside, rubusoside, and dulcoside A by using the corresponding deuterated 16,17-dihydrosteviol glycosides as suitable internal standards. This quantitation not only enables the analysis of the individual steviol glycosides in foods and beverages but also can support the optimization of breeding and postharvest downstream processing of Stevia plants to produce preferentially sweet and least bitter tasting Stevia extracts.
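The quantitation step reduces to isotope dilution arithmetic; the sketch below assumes peak areas from the MRM traces and a response factor from calibration (function name, values, and the unity default are illustrative).

```python
def sida_concentration(area_analyte, area_istd, conc_istd_spiked,
                       response_factor=1.0):
    """Stable isotope dilution analysis: analyte concentration from the
    analyte/internal-standard peak-area ratio, the known spiked concentration
    of the labeled standard, and a calibration-derived response factor."""
    return (area_analyte / area_istd) * conc_istd_spiked / response_factor

# e.g. stevioside with a spiked deuterated dihydrostevioside standard (µg/mL)
c = sida_concentration(area_analyte=1.85e6, area_istd=9.2e5,
                       conc_istd_spiked=10.0)
print(c)
```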
Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun
2017-01-30
Reliable quantification of low-abundance proteins in complex proteomes is challenging, largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently identified proteins were recovered, and the peptide-spectrum-match (PSM) FDR was recalculated and controlled at a confident level of FDR ≤ 1%, while protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. Increases of >60% in total quantified spectra/peptides were achieved for a spike-in sample set and a public dataset from CPTAC, respectively. Incorporating the peptide retrieval strategy significantly improved the quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity to confidently discover significantly altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising confidence of protein identification, and the strategy can be readily implemented in a broad range of quantitative proteomics techniques, including label-free and labeling approaches. We hypothesize that more quantifiable spectra and peptides in a protein, even including less confident peptides, could help reduce variations and improve protein quantification. Hence, the peptide retrieval strategy was developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approaches. The list of confidently identified proteins obtained with the standard target-decoy search strategy was fixed, and more spectra/peptides with less confidence matched to confident proteins were retrieved. However, the total PSM FDR after retrieval analysis was still controlled at a confident level of FDR ≤ 1%. As expected, the penalty for occasionally incorporating incorrect peptide identifications is negligible by comparison with the improvements in quantitative performance. More quantifiable peptides, a lower missing-value rate, and better quantitative accuracy and precision were achieved for the same protein identifications by this simple strategy. This strategy is theoretically applicable to any quantitative approach in proteomics and thereby provides more quantitative information, especially on low-abundance proteins.
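A simplified sketch of the retrieval idea (not the authors' implementation): admit previously filtered PSMs only when they map to an already-confident protein, while keeping the running decoy-based FDR estimate within the cap.

```python
def recover_psms(psms, confident_proteins, fdr_cap=0.01):
    """psms: list of (score, is_decoy, protein) sorted by descending score,
    including matches below the original 1% PSM FDR cut. Walks down the list,
    keeping PSMs that map to a confident protein while the running
    decoy/target FDR estimate stays within fdr_cap."""
    kept, targets, decoys = [], 0, 0
    for score, is_decoy, protein in psms:
        if protein not in confident_proteins:
            continue                      # protein list stays fixed
        if is_decoy:
            decoys += 1
        else:
            targets += 1
        if targets and decoys / targets > fdr_cap:
            break                         # stop before the FDR cap is exceeded
        kept.append((score, is_decoy, protein))
    return kept
```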
Lemmens, Karin M M; Rutten-Van Mölken, Maureen P M H; Cramm, Jane M; Huijsman, Robbert; Bal, Roland A; Nieboer, Anna P
2011-01-10
Disease management programmes (DMPs) have been developed to improve effectiveness and economic efficiency within chronic care delivery by combining patient-related, professional-directed, and organisational interventions. The benefits of DMPs within different settings, patient groups, and versions remain unclear. In this article we propose a protocol to evaluate a range of current DMPs by capturing them in a single conceptual framework, employing comparable structure, process, and outcome measures, and combining qualitative and quantitative research methods. To assess DMP effectiveness a practical clinical trial will be conducted. Twenty-two disease management experiments will be studied in various Dutch regions consisting of a variety of collaborations between organisations and/or professionals. Patient cohorts include those with cardiovascular diseases, chronic obstructive pulmonary disease, diabetes, stroke, depression, psychotic diseases, and eating disorders. Our methodological approach combines qualitative and quantitative research methods to enable a comprehensive evaluation of complex programmes. Process indicators will be collected from health care providers' data registries and measured via physician and staff questionnaires. Patient questionnaires include health care experiences, health care utilisation, and quality of life. Qualitative data will be gathered by means of interviews and document analysis for an in depth description of project interventions and the contexts in which DMPs are embedded, and an ethnographic process evaluation in five DMPs. Such a design will provide insight into ongoing DMPs and demonstrate which elements of the intervention are potentially (cost)-effective for which patient populations. It will also enable sound comparison of the results of the different programmes. The study will lead to a better understanding of (1) the mechanisms of disease management, (2) the feasibility, and cost-effectiveness of a disease management approach to improving health care, and (3) the factors that determine success and failure of DMPs. Our study results will be relevant to decision makers and managers who confront the challenge of implementing and integrating DMPs into the health care system. Moreover, it will contribute to the search for methods to evaluate complex healthcare interventions.
A Quantitative Comparison of Calibration Methods for RGB-D Sensors Using Different Technologies.
Villena-Martínez, Víctor; Fuster-Guilló, Andrés; Azorín-López, Jorge; Saval-Calvo, Marcelo; Mora-Pascual, Jeronimo; Garcia-Rodriguez, Jose; Garcia-Garcia, Alberto
2017-01-27
RGB-D (Red Green Blue and Depth) sensors are devices that can provide color and depth information from a scene at the same time. Recently, they have been widely used in many solutions due to their commercial growth, from the entertainment market to many diverse areas (e.g., robotics, CAD, etc.). In the research community, these devices have had good uptake due to their acceptable level of accuracy for many applications and their low cost, but in some cases they work at the limit of their sensitivity, near the minimum feature size that can be perceived. For this reason, calibration processes are critical in order to increase their accuracy and enable them to meet the requirements of such kinds of applications. To the best of our knowledge, there is no comparative study of calibration algorithms evaluating their results on multiple RGB-D sensors. Specifically, in this paper, a comparison of the three most used calibration methods is applied to three different RGB-D sensors based on structured light and time-of-flight. The comparison of methods has been carried out by a set of experiments to evaluate the accuracy of depth measurements. Additionally, an object reconstruction application has been used as an example of an application for which the sensor works at the limit of its sensitivity. The obtained reconstruction results have been evaluated through visual inspection and quantitative measurements.
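One common quantitative check of depth accuracy after calibration, consistent with but not necessarily identical to the paper's experiments, is the RMSE of point-to-plane distances for a captured flat target.

```python
import numpy as np

def plane_fit_rmse(points):
    """points: (N, 3) 3D points from an RGB-D capture of a flat target.
    Fits a least-squares plane via SVD and returns the RMSE of
    point-to-plane distances (in the units of the input points)."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]                       # direction of least variance
    d = centered @ normal                 # signed point-to-plane distances
    return np.sqrt(np.mean(d ** 2))
```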
A whole-cell bioreporter assay for quantitative genotoxicity evaluation of environmental samples.
Jiang, Bo; Li, Guanghe; Xing, Yi; Zhang, Dayi; Jia, Jianli; Cui, Zhisong; Luan, Xiao; Tang, Hui
2017-10-01
Whole-cell bioreporters have emerged as promising tools for genotoxicity evaluation, due to their rapidity, cost-effectiveness, sensitivity and selectivity. In this study, a method for detecting genotoxicity in environmental samples was developed using the bioluminescent whole-cell bioreporter Escherichia coli recA::luxCDABE. To further test its performance in a real-world scenario, the E. coli bioreporter was applied in two cases: i) soil samples collected from chromium(VI)-contaminated sites; ii) crude oil-contaminated seawater collected after the Jiaozhou Bay oil spill of 2013. The chromium(VI)-contaminated soils were pretreated by water extraction and directly exposed to the bioreporter in two phases: aqueous soil extract (water phase) and soil supernatant (solid phase). The results indicated that both extractable and soil-particle-fixed chromium(VI) were bioavailable to the bioreporter, and the solid-phase contact bioreporter assay provided a more precise evaluation of soil genotoxicity. For crude oil-contaminated seawater, the response of the bioreporter clearly illustrated the spatial and temporal changes in genotoxicity around the spill site, suggesting that the crude oil degradation process decreased the genotoxic risk to the ecosystem. In addition, the performance of the bioreporter was simulated by a modified cross-regulation gene expression model, which quantitatively described the DNA damage response of the E. coli bioreporter. Accordingly, the bioluminescent response of the bioreporter was calculated as the mitomycin C equivalent, enabling quantitative comparison of genotoxicities between different environmental samples. This bioreporter assay provides a rapid and sensitive screening tool for direct genotoxicity assessment of environmental samples.
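The mitomycin C (MMC) normalization can be sketched as interpolation on a calibration curve; the values below are illustrative, not the study's data.

```python
import numpy as np

def mitomycin_c_equivalent(sample_induction, mmc_doses, mmc_inductions):
    """Map a sample's bioluminescence induction ratio onto an MMC calibration
    curve (dose vs. induction from the recA::lux reporter), returning
    genotoxicity as an MMC-equivalent dose. Assumes a monotonically
    increasing calibration."""
    return float(np.interp(sample_induction, mmc_inductions, mmc_doses))

# illustrative calibration (µg/L MMC vs. induction ratio), not measured values
doses = [0.1, 1.0, 10.0, 100.0]
inductions = [1.2, 2.5, 6.0, 9.5]
print(mitomycin_c_equivalent(4.1, doses, inductions))
```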
Apparatus and method for quantitative assay of generic transuranic wastes from nuclear reactors
Caldwell, John T.; Kunz, Walter E.; Atencio, James D.
1984-01-01
A combination of passive and active neutron measurements which yields quantitative information about the isotopic composition of transuranic wastes from nuclear power or weapons material manufacture reactors is described. From the measurement of prompt and delayed neutron emission and the incidence of two coincidentally emitted neutrons from induced fission of fissile material in the sample, one can quantify ²³³U, ²³⁵U and ²³⁹Pu isotopes in waste samples. Passive coincidence counting, including neutron multiplicity measurement and determination of the overall passive neutron flux, additionally enables the separate quantitative evaluation of spontaneous fission isotopes such as ²⁴⁰Pu, ²⁴⁴Cm and ²⁵²Cf, and the spontaneous alpha particle emitter ²⁴¹Am. These seven isotopes are the most important constituents of wastes from nuclear power reactors, and once the mass of each isotope present is determined by the apparatus and method of the instant invention, the overall alpha particle activity can be determined to better than 1 nCi/g from known radioactivity data. Therefore, in addition to the quantitative analysis of the waste sample useful for later reclamation purposes, the alpha particle activity can be determined to decide whether "permanent" low-level burial is appropriate for the waste sample.
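The final activity bookkeeping is a weighted sum; a minimal sketch follows, with specific activities supplied from nuclear data tables rather than hard-coded, and isotope masses assumed to come from the neutron assay described above.

```python
def alpha_activity_nci_per_g(isotope_masses_g, specific_activities_ci_per_g,
                             waste_mass_g):
    """Total alpha activity in nCi per gram of waste, from assayed isotope
    masses (g) and tabulated specific activities (Ci/g, from evaluated
    nuclear data); 1 Ci = 1e9 nCi. The result can be compared against a
    low-level burial threshold."""
    total_ci = sum(mass * specific_activities_ci_per_g[iso]
                   for iso, mass in isotope_masses_g.items())
    return 1e9 * total_ci / waste_mass_g
```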
Calibration methods influence quantitative material decomposition in photon-counting spectral CT
NASA Astrophysics Data System (ADS)
Curtis, Tyler E.; Roeder, Ryan K.
2017-03-01
Photon-counting detectors and nanoparticle contrast agents can potentially enable molecular imaging and material decomposition in computed tomography (CT). Material decomposition has been investigated using both simulated and acquired data sets. However, the effect of calibration methods on material decomposition has not been systematically investigated. Therefore, the objective of this study was to investigate the influence of the range and number of contrast agent concentrations within a modular calibration phantom on quantitative material decomposition. A commercially available photon-counting spectral micro-CT (MARS Bioimaging) was used to acquire images with five energy bins selected to normalize photon counts and leverage the contrast agent k-edge. Material basis matrix values were determined using multiple linear regression models and material decomposition was performed using a maximum a posteriori estimator. The accuracy of quantitative material decomposition was evaluated by the root mean squared error (RMSE), specificity, sensitivity, and area under the curve (AUC). An increased maximum concentration (range) in the calibration significantly improved RMSE, specificity and AUC. The effects of an increased number of concentrations in the calibration were not statistically significant for the conditions in this study. The overall results demonstrated that the accuracy of quantitative material decomposition in spectral CT is significantly influenced by calibration methods, which must therefore be carefully considered for the intended diagnostic imaging application.
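A hedged sketch of the calibration and decomposition steps, using plain least squares in place of the study's maximum a posteriori estimator:

```python
import numpy as np

def calibrate_basis(counts, concentrations):
    """counts: (n_rois, n_bins) mean photon counts per energy bin from
    calibration-phantom ROIs; concentrations: (n_rois, n_materials) known
    material concentrations. Returns the material basis matrix M
    (n_bins x n_materials) for a linear model counts ≈ concentrations @ M.T."""
    m_t, *_ = np.linalg.lstsq(concentrations, counts, rcond=None)
    return m_t.T

def decompose(voxel_counts, basis):
    """Per-voxel least-squares material decomposition (a simple stand-in for
    a maximum a posteriori estimator): solve basis @ x ≈ voxel_counts."""
    x, *_ = np.linalg.lstsq(basis, voxel_counts, rcond=None)
    return x
```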
Manuilov, Anton V; Radziejewski, Czeslaw H; Lee, David H
2011-01-01
Comparability studies lie at the heart of assessments that evaluate differences amongst manufacturing processes and stability studies of protein therapeutics. Low resolution chromatographic and electrophoretic methods facilitate quantitation, but do not always yield detailed insight into the effect of the manufacturing change or environmental stress. Conversely, mass spectrometry (MS) can provide high resolution information on the molecule, but conventional methods are not very quantitative. This gap can be reconciled by use of a stable isotope-tagged reference standard (SITRS), a version of the analyte protein that is uniformly labeled with 13C6-arginine and 13C6-lysine. The SITRS serves as an internal control that is trypsin-digested and analyzed by liquid chromatography (LC)-MS with the analyte sample. The ratio of the ion intensities of each unlabeled and labeled peptide pair is then compared to that of other sample(s). A comparison of these ratios provides a readily accessible way to spot even minute differences among samples. In a study of a monoclonal antibody (mAb) spiked with varying amounts of the same antibody bearing point mutations, peptides containing the mutations were readily identified and quantified at concentrations as low as 2% relative to unmodified peptides. The method was robust, reproducible, and produced a linear response for every peptide that was monitored. The method was also successfully used to distinguish between two batches of a mAb that were produced in two different cell lines, while two batches produced from the same cell line were found to be highly comparable. Finally, the use of the SITRS method in the comparison of two stressed mAb samples enabled the identification of sites susceptible to deamidation and oxidation, as well as their quantitation. The experimental results indicate that use of a SITRS in a peptide mapping experiment with MS detection enables sensitive and quantitative comparability studies of proteins at high resolution.
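The core comparison is a ratio of light/heavy ratios per peptide; a minimal sketch (data structure and tolerance are assumptions):

```python
def sitrs_compare(sample_a, sample_b, tolerance=0.10):
    """sample_a/sample_b: dicts mapping peptide -> (light_intensity,
    heavy_intensity) from LC-MS of analyte digests spiked with the
    13C-labeled SITRS. Peptides whose ratio-of-ratios deviates from 1 by
    more than `tolerance` are flagged as differing between samples."""
    flagged = {}
    for pep in sample_a.keys() & sample_b.keys():
        ratio_a = sample_a[pep][0] / sample_a[pep][1]
        ratio_b = sample_b[pep][0] / sample_b[pep][1]
        rr = ratio_a / ratio_b
        if abs(rr - 1.0) > tolerance:
            flagged[pep] = rr
    return flagged
```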
Possible approaches for evaluating Public Health Surveillance practices.
Vilela, Maria Filomena de Gouveia; Santos, Dario Nunes Dos; Kemp, Brigina
2017-10-01
This evaluative, qualitative study investigates self-assessment as a device for analyzing Health Surveillance practices, using a questionnaire built by the researchers, adapted from the Self-Assessment of Improved Access and Primary Care Quality (AMAQ) and made available on the FORMSUS platform. Forty-one Health Surveillance workers and managers of a large municipality in São Paulo State evaluated the realms of "management" and "teamwork" and their respective sub-realms. Two categories were created to analyze the results, "Management" and "Team", in dialogue with references from Management, Evaluation and Health Surveillance. Most "management" and "teamwork" sub-realms were deemed satisfactory. Self-assessment through an applied evaluation tool was shown to be a powerful resource for the analysis of Health Surveillance practices in combination with other devices adopted by the Unified Health System (SUS). Unlike usual evaluation processes guided by quantitative markers, this self-assessed evaluative process included the subjects themselves and opened the possibility of a fresh look at how Health Surveillance is carried out, supporting future management contracts between workers and managers.
On the Distinction Between Quantitative and Qualitative Research.
ERIC Educational Resources Information Center
Smith, P. L.
Quantitative and qualitative research are differing modes of measurement, one using numbers and the other not. The assignment of numerals to represent properties enables a researcher to distinguish minutely between different properties. The major issue dividing these approaches to empirical research represents a philosophical dispute which has…
Quantitative single-molecule imaging by confocal laser scanning microscopy.
Vukojevic, Vladana; Heidkamp, Marcus; Ming, Yu; Johansson, Björn; Terenius, Lars; Rigler, Rudolf
2008-11-25
A new approach to quantitative single-molecule imaging by confocal laser scanning microscopy (CLSM) is presented. It relies on fluorescence intensity distribution to analyze the molecular occurrence statistics captured by digital imaging and enables direct determination of the number of fluorescent molecules and their diffusion rates without resorting to temporal or spatial autocorrelation analyses. Digital images of fluorescent molecules were recorded by using fast scanning and avalanche photodiode detectors. In this way the signal-to-background ratio was significantly improved, enabling direct quantitative imaging by CLSM. The potential of the proposed approach is demonstrated by using standard solutions of fluorescent dyes, fluorescently labeled DNA molecules, quantum dots, and the Enhanced Green Fluorescent Protein in solution and in live cells. The method was verified by using fluorescence correlation spectroscopy. The relevance for biological applications, in particular, for live cell imaging, is discussed.
Shewale, Jaiprakash G; Schneida, Elaine; Wilson, Jonathan; Walker, Jerilyn A; Batzer, Mark A; Sinha, Sudhir K
2007-03-01
The human DNA quantification (H-Quant) system, developed for use in human identification, enables quantitation of human genomic DNA in biological samples. The assay is based on real-time amplification of AluYb8 insertions in hominoid primates. The relatively high copy number of subfamily-specific Alu repeats in the human genome enables quantification of very small amounts of human DNA. The oligonucleotide primers present in H-Quant are specific for human DNA and closely related great apes. During the real-time PCR, the SYBR Green I dye binds to the DNA that is synthesized by the human-specific AluYb8 oligonucleotide primers. The fluorescence of the bound SYBR Green I dye is measured at the end of each PCR cycle. The cycle at which the fluorescence crosses the chosen threshold correlates to the quantity of amplifiable DNA in that sample. The minimal sensitivity of the H-Quant system is 7.6 pg/µL of human DNA. The amplicon generated in the H-Quant assay is 216 bp, which is within the same size range as common amplifiable short tandem repeat (STR) amplicons. This amplicon size enables quantitation of amplifiable DNA, as opposed to quantitation of degraded or nonamplifiable DNA of smaller sizes. Development and validation studies were performed on the 7500 real-time PCR system following the Quality Assurance Standards for Forensic DNA Testing Laboratories.
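The Ct-to-quantity conversion at the core of any real-time PCR assay of this kind rests on Ct being linear in log10 of input DNA. A minimal sketch with invented standard-curve values (not H-Quant's validated calibration) follows:

```python
# Sketch of real-time PCR quantitation: fit a standard curve of Ct versus
# log10(input DNA), then read unknowns off the regression. Standard
# concentrations and Ct values below are illustrative only.
import numpy as np

std_conc = np.array([10.0, 1.0, 0.1, 0.01])      # ng/uL standards
std_ct   = np.array([18.1, 21.5, 24.9, 28.3])    # measured Ct values

slope, intercept = np.polyfit(np.log10(std_conc), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1             # ideal doubling -> ~1.0

def quantify(ct: float) -> float:
    """Interpolate an unknown's concentration from its Ct."""
    return 10 ** ((ct - intercept) / slope)

print(f"slope={slope:.2f}, PCR efficiency={efficiency:.1%}")
print(f"unknown at Ct=23.0 -> {quantify(23.0):.3f} ng/uL")
```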
Pujol, Laure; Albert, Isabelle; Magras, Catherine; Johnson, Nicholas Brian; Membré, Jeanne-Marie
2015-11-20
In a previous study, a modular process risk model, from the raw material reception to the final product storage, was built to estimate the risk of a UHT-aseptic line of not complying with commercial sterility (Pujol et al., 2015). This present study was focused on demonstrating how the model (updated version with uncertainty and variability separated and a 2nd-order Monte Carlo procedure run) could be used to assess quantitatively the influence of management options. This assessment was done in three steps: pinpoint which process step had the highest influence on the risk, identify which management option(s) could be the most effective to control and/or reduce the risk, and finally evaluate quantitatively the influence of changing process setting(s) on the risk. For Bacillus cereus, it was identified that during post-process storage in an aseptic tank, there was potentially an air re-contamination due to filter efficiency loss (efficiency loss due to successive in-place sterilizations after cleaning operations), followed by B. cereus growth. Two options were then evaluated: i) reducing the number of filter sterilizations before filter renewal by one fifth, ii) designing new UHT-aseptic lines without an aseptic tank, i.e. without a storage period after the thermal process and before filling. Considering the uncertainty in the model, it was not possible to confirm whether these options had a significant influence on the risk associated with B. cereus. On the other hand, for Geobacillus stearothermophilus, combinations of heat-treatment time and temperature enabling the control or reduction in risk by a factor of ca. 100 were determined; for ease of operational implementation, they were presented graphically in the form of iso-risk curves. For instance, it was established that a heat treatment of 138 °C for 31 s (instead of 138 °C for 25 s) enabled a reduction in risk to 18×10⁻⁸ (95% CI = [10; 34]×10⁻⁸), instead of 578×10⁻⁸ (95% CI = [429; 754]×10⁻⁸) initially. In conclusion, a modular risk model, such as the one exemplified here with a UHT-aseptic line, is a valuable tool in process design and operation, bringing definitive quantitative elements into the decision making process. Copyright © 2015 Elsevier B.V. All rights reserved.
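The model's separation of uncertainty and variability is the defining feature of a 2nd-order (nested) Monte Carlo simulation: uncertain parameters are sampled in an outer loop, within-batch variability in an inner loop, and the spread of the outer-loop results yields a confidence interval on the risk. A deliberately simplified, hypothetical sketch of that structure (not the published process model):

```python
# Sketch of 2nd-order Monte Carlo: uncertainty sampled in an outer loop,
# variability in an inner loop. The log-reduction survival model and all
# distribution parameters here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
N_UNC, N_VAR = 200, 2000
risks = []
for _ in range(N_UNC):                             # outer loop: uncertainty
    mean_logred = rng.normal(8.0, 0.5)             # uncertain mean log10 reduction
    logred = rng.normal(mean_logred, 0.8, N_VAR)   # inner loop: variability
    n0 = rng.lognormal(2.0, 1.0, N_VAR)            # initial spores per unit
    survivors = n0 * 10.0 ** (-logred)
    risks.append(np.mean(1 - np.exp(-survivors)))  # P(>=1 survivor) per unit
lo, med, hi = np.percentile(risks, [2.5, 50, 97.5])
print(f"risk per unit: {med:.2e}  (95% CI {lo:.2e} - {hi:.2e})")
```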
Deller, Timothy W; Khalighi, Mohammad Mehdi; Jansen, Floris P; Glover, Gary H
2018-01-01
The recent introduction of simultaneous whole-body PET/MR scanners has enabled new research taking advantage of the complementary information obtainable with PET and MRI. One such application is kinetic modeling, which requires high levels of PET quantitative stability. To accomplish the required PET stability levels, the PET subsystem must be sufficiently isolated from the effects of MR activity. Performance measurements have previously been published, demonstrating sufficient PET stability in the presence of MR pulsing for typical clinical use; however, PET stability during radiofrequency (RF)-intensive and gradient-intensive sequences has not previously been evaluated for a clinical whole-body scanner. In this work, PET stability of the GE SIGNA PET/MR was examined during simultaneous scanning of aggressive MR pulse sequences. Methods: PET performance tests were acquired with MR idle and during simultaneous MR pulsing. Recent system improvements mitigating RF interference and gain variation were used. A fast recovery fast spin echo MR sequence was selected for high RF power, and an echo planar imaging sequence was selected for its high heat-inducing gradients. Measurements were performed to determine PET stability under varying MR conditions using the following metrics: sensitivity, scatter fraction, contrast recovery, uniformity, count rate performance, and image quantitation. A final PET quantitative stability assessment for simultaneous PET scanning during functional MRI studies was performed with a spiral in-and-out gradient echo sequence. Results: Quantitation stability of a 68Ge flood phantom was demonstrated within 0.34%. Normalized sensitivity was stable during simultaneous scanning within 0.3%. Scatter fraction measured with a 68Ge line source in the scatter phantom was stable within the range of 40.4%-40.6%. Contrast recovery and uniformity were comparable for PET images acquired simultaneously with multiple MR conditions. Peak noise equivalent count rate was 224 kcps at an effective activity concentration of 18.6 kBq/mL, and the count rate curves and scatter fraction curve were consistent for the alternating MR pulsing states. A final test demonstrated quantitative stability during a spiral functional MRI sequence. Conclusion: PET stability metrics demonstrated that PET quantitation was not affected during simultaneous aggressive MRI. This stability enables demanding applications such as kinetic modeling. © 2018 by the Society of Nuclear Medicine and Molecular Imaging.
Thonusin, Chanisa; IglayReger, Heidi B; Soni, Tanu; Rothberg, Amy E; Burant, Charles F; Evans, Charles R
2017-11-10
In recent years, mass spectrometry-based metabolomics has increasingly been applied to large-scale epidemiological studies of human subjects. However, the successful use of metabolomics in this context is subject to the challenge of detecting biologically significant effects despite substantial intensity drift that often occurs when data are acquired over a long period or in multiple batches. Numerous computational strategies and software tools have been developed to aid in correcting for intensity drift in metabolomics data, but most of these techniques are implemented using command-line driven software and custom scripts which are not accessible to all end users of metabolomics data. Further, it has not yet become routine practice to assess the quantitative accuracy of drift correction against techniques which enable true absolute quantitation such as isotope dilution mass spectrometry. We developed an Excel-based tool, MetaboDrift, to visually evaluate and correct for intensity drift in a multi-batch liquid chromatography - mass spectrometry (LC-MS) metabolomics dataset. The tool enables drift correction based on either quality control (QC) samples analyzed throughout the batches or using QC-sample independent methods. We applied MetaboDrift to an original set of clinical metabolomics data from a mixed-meal tolerance test (MMTT). The performance of the method was evaluated for multiple classes of metabolites by comparison with normalization using isotope-labeled internal standards (IS). QC sample-based intensity drift correction significantly improved correlation with IS-normalized data, and resulted in detection of additional metabolites with significant physiological response to the MMTT. The relative merits of different QC-sample curve fitting strategies are discussed in the context of batch size and drift pattern complexity. Our drift correction tool offers a practical, simplified approach to drift correction and batch combination in large metabolomics studies. Copyright © 2017 Elsevier B.V. All rights reserved.
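A minimal sketch of the QC-anchored correction pattern described above (the general idea, not MetaboDrift's Excel implementation): fit a smooth trend to the QC-sample intensities of each metabolite versus injection order, then divide the whole run by the normalized trend:

```python
# QC-based drift correction sketch on synthetic data: a polynomial is fit to
# QC intensities versus injection order and divided out of all samples.
import numpy as np

rng = np.random.default_rng(1)
order = np.arange(100)                         # injection order
drift = 1.0 - 0.004 * order                    # synthetic downward drift
signal = 1e5 * drift * rng.normal(1.0, 0.03, 100)
qc_idx = order[::10]                           # a QC sample every 10 injections

coef = np.polyfit(qc_idx, signal[qc_idx], deg=2)
trend = np.polyval(coef, order)
corrected = signal / (trend / trend.mean())    # rescale to the mean QC level

print(f"CV before: {signal.std()/signal.mean():.1%}, "
      f"after: {corrected.std()/corrected.mean():.1%}")
```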
Pharmacology-based toxicity assessment: towards quantitative risk prediction in humans.
Sahota, Tarjinder; Danhof, Meindert; Della Pasqua, Oscar
2016-05-01
Despite ongoing efforts to better understand the mechanisms underlying safety and toxicity, ~30% of the attrition in drug discovery and development is still due to safety concerns. Changes in current practice regarding the assessment of safety and toxicity are required to reduce late stage attrition and enable effective development of novel medicines. This review focuses on the implications of empirical evidence generation for the evaluation of safety and toxicity during drug development. A shift in paradigm is needed to (i) ensure that pharmacological concepts are incorporated into the evaluation of safety and toxicity; (ii) facilitate the integration of historical evidence and thereby the translation of findings across species as well as between in vitro and in vivo experiments and (iii) promote the use of experimental protocols tailored to address specific safety and toxicity questions. Based on historical examples, we highlight the challenges for the early characterisation of the safety profile of a new molecule and discuss how model-based methodologies can be applied for the design and analysis of experimental protocols. Issues relative to the scientific rationale are categorised and presented as a hierarchical tree describing the decision-making process. Focus is given to four different areas, namely, optimisation, translation, analytical construct and decision criteria. From a methodological perspective, the relevance of quantitative methods for estimation and extrapolation of risk from toxicology and safety pharmacology experimental protocols, such as points of departure and potency, is discussed in light of advancements in population and Bayesian modelling techniques (e.g. non-linear mixed effects modelling). Their use in the evaluation of pharmacokinetics (PK) and pharmacokinetic-pharmacodynamic relationships (PKPD) has enabled great insight into the dose rationale for medicines in humans, both in terms of efficacy and adverse events. Comparable benefits can be anticipated for the assessment of safety and toxicity profile of novel molecules. © The Author 2016. Published by Oxford University Press on behalf of the UK Environmental Mutagen Society. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Umetani, K.; Fukushima, K.
2013-03-15
An X-ray intravital microscopy technique was developed to enable in vivo visualization of the coronary, cerebral, and pulmonary arteries in rats without exposure of organs and with spatial resolution in the micrometer range and temporal resolution in the millisecond range. We have refined the system continually in terms of the spatial resolution and exposure time. X-rays transmitted through an object are detected by an X-ray direct-conversion type detector, which incorporates an X-ray SATICON pickup tube. The spatial resolution has been improved to 6 µm, yielding sharp images of small arteries. The exposure time has been shortened to around 2 ms using a new rotating-disk X-ray shutter, enabling imaging of beating rat hearts. Quantitative evaluations of the X-ray intravital microscopy technique were extracted from measurements of the smallest-detectable vessel size and detection of the vessel function. The smallest-diameter vessel viewed for measurements is determined primarily by the concentration of iodinated contrast material. The iodine concentration depends on the injection technique. We used ex vivo rat hearts under Langendorff perfusion for accurate evaluation. After the contrast agent is injected into the origin of the aorta in an isolated perfused rat heart, the contrast agent is delivered directly into the coronary arteries with minimum dilution. The vascular internal diameter response of coronary arterial circulation is analyzed to evaluate the vessel function. Small blood vessels of more than about 50 µm diameters were visualized clearly at heart rates of around 300 beats/min. Vasodilation compared to the control was observed quantitatively using drug manipulation. Furthermore, the apparent increase in the number of small vessels with diameters of less than about 50 µm was observed after the vasoactive agents increased the diameters of invisible small blood vessels to visible sizes. This technique is expected to offer the potential for direct investigation of mechanisms of vascular dysfunctions.
Mägi, Reedik; Suleimanov, Yury V; Clarke, Geraldine M; Kaakinen, Marika; Fischer, Krista; Prokopenko, Inga; Morris, Andrew P
2017-01-11
Genome-wide association studies (GWAS) of single nucleotide polymorphisms (SNPs) have been successful in identifying loci contributing genetic effects to a wide range of complex human diseases and quantitative traits. The traditional approach to GWAS analysis is to consider each phenotype separately, despite the fact that many diseases and quantitative traits are correlated with each other, and often measured in the same sample of individuals. Multivariate analyses of correlated phenotypes have been demonstrated, by simulation, to increase power to detect association with SNPs, and thus may enable improved detection of novel loci contributing to diseases and quantitative traits. We have developed the SCOPA software to enable GWAS analysis of multiple correlated phenotypes. The software implements "reverse regression" methodology, which treats the genotype of an individual at a SNP as the outcome and the phenotypes as predictors in a general linear model. SCOPA can be applied to quantitative traits and categorical phenotypes, and can accommodate imputed genotypes under a dosage model. The accompanying META-SCOPA software enables meta-analysis of association summary statistics from SCOPA across GWAS. Application of SCOPA to two GWAS of high- and low-density lipoprotein cholesterol, triglycerides and body mass index, and subsequent meta-analysis with META-SCOPA, highlighted stronger association signals than univariate phenotype analysis at established lipid and obesity loci. The META-SCOPA meta-analysis also revealed a novel signal of association at genome-wide significance for triglycerides mapping to GPC5 (lead SNP rs71427535, p = 1.1×10⁻⁸), which has not been reported in previous large-scale GWAS of lipid traits. The SCOPA and META-SCOPA software enable discovery and dissection of multiple phenotype association signals through implementation of a powerful reverse regression approach.
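The "reverse regression" idea is compact enough to illustrate directly: genotype dosage becomes the outcome and the correlated phenotypes the predictors, so a single F-test assesses the joint multi-phenotype association. A sketch on synthetic data (assuming the statsmodels package; this illustrates the concept, not the SCOPA implementation):

```python
# Reverse-regression sketch: SNP dosage is regressed on correlated phenotypes.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 5000
dosage = rng.binomial(2, 0.3, n).astype(float)   # additive genotype coding
# two correlated phenotypes, both weakly influenced by the SNP
ldl = 0.15 * dosage + rng.normal(0, 1, n)
tg  = 0.10 * dosage + 0.5 * ldl + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([ldl, tg]))
fit = sm.OLS(dosage, X).fit()
print(fit.f_pvalue)        # joint multi-phenotype association test
print(fit.params)          # per-phenotype effects on dosage
```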
Morales, Arturo; Marmesat, Susana; Dobarganes, M Carmen; Márquez-Ruiz, Gloria; Velasco, Joaquín
2012-09-07
The use of an ELS detector in NP-HPLC for quantitative analysis of oxidation products in FAME obtained from oils is evaluated in this study. The results obtained have shown that the ELS detector enables the quantitative determination of the hydroperoxides of oleic and linoleic acid methyl esters as a whole, and connected in series with a UV detector makes it possible to determine both groups of compounds by difference, providing useful complementary information. The limits of detection (LOD) and quantification (LOQ) found for hydroperoxides were respectively 2.5 and 5.7 μg mL⁻¹ and precision of quantitation expressed as coefficient of variation was lower than 10%. Due to its low sensitivity, the ELS detector shows limitations in determining the low contents of secondary oxidation products in the direct analysis of FAME oxidized at low or moderate temperature. Analysis of FAME samples obtained either from high linoleic sunflower oil (HLSO) or high oleic sunflower oil (HOSO) and oxidized at 80 °C showed that only ketodienes formed from methyl linoleate can be determined in samples with relatively high oxidation, with LOD and LOQ of 0.2 and 0.4 mg/g FAME, respectively, under the analytical conditions applied. The ELS detector also enabled the determination of methyl cis-9,10-epoxystearate and methyl trans-9,10-epoxystearate, which were resolved under the chromatographic conditions applied. Results showed that these compounds, which are formed from methyl oleate, were not detected in the high-linoleic sample, but occurred at non-negligible levels in the oxidized FAME obtained from HOSO. Copyright © 2012 Elsevier B.V. All rights reserved.
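Detection and quantification limits like those reported above are commonly derived from a calibration line as LOD = 3.3·σ/S and LOQ = 10·σ/S, with S the slope and σ the residual standard deviation; other conventions (e.g. signal-to-noise) exist, so this is one plausible reading rather than the authors' exact procedure. A sketch with invented calibration points:

```python
# LOD/LOQ from a calibration line: sigma is the residual SD of the fit.
import numpy as np

conc = np.array([0.0, 2.0, 5.0, 10.0, 20.0, 50.0])     # ug/mL standards
resp = np.array([0.3, 4.4, 10.6, 20.8, 41.1, 102.0])   # detector response

slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                            # n-2 dof for a line

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")
```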
Current challenges and future perspectives of plant and agricultural biotechnology.
Moshelion, Menachem; Altman, Arie
2015-06-01
Advances in understanding plant biology, novel genetic resources, genome modification, and omics technologies generate new solutions for food security and novel biomaterials production under changing environmental conditions. New gene and germplasm candidates that are anticipated to lead to improved crop yields and other plant traits under stress have to pass long development phases based on trial and error using large-scale field evaluation. Therefore, quantitative, objective, and automated screening methods combined with decision-making algorithms are likely to have many advantages, enabling rapid screening of the most promising crop lines at an early stage followed by final mandatory field experiments. The combination of novel molecular tools, screening technologies, and economic evaluation should become the main goal of the plant biotechnological revolution in agriculture. Copyright © 2015 Elsevier Ltd. All rights reserved.
LeGros, Theresa A; Amerongen, Helen M; Cooley, Janet H; Schloss, Ernest P
2015-01-01
Despite the increasing need for faculty and preceptors skilled in interprofessional facilitation (IPF), the relative novelty of the field poses a challenge to the development and evaluation of IPF programs. We use learning theory and IPF competencies with associated behavioral indicators to develop and evaluate six key messages in IPF training and experience. Our mixed methods approach included two phases: quantitative data collection with embedded qualitative data, followed by qualitative data collection in explanatory sequential fashion. This enabled triangulated analyses of both data types and of facilitation behaviors from facilitator and student perspectives. Results indicate the competency-based training was effective. Facilitators felt comfortable performing behaviors associated with IPF competencies; student observations of those behaviors supported facilitator self-reported performance. Overall, students perceived more facilitation opportunities than facilitators. Findings corroborate the importance of recruiting seasoned facilitators and establishing IPF guidelines that acknowledge variable team dynamics and help facilitators recognize teachable moments.
Evaluating the decision accuracy and speed of clinical data visualizations.
Pieczkiewicz, David S; Finkelstein, Stanley M
2010-01-01
Clinicians face an increasing volume of biomedical data. Assessing the efficacy of systems that enable accurate and timely clinical decision making merits corresponding attention. This paper discusses the multiple-reader multiple-case (MRMC) experimental design and linear mixed models as means of assessing and comparing decision accuracy and latency (time) for decision tasks in which clinician readers must interpret visual displays of data. These experimental and statistical techniques, used extensively in radiology imaging studies, offer a number of practical and analytic advantages over more traditional quantitative methods such as percent-correct measurements and ANOVAs, and are recommended for their statistical efficiency and generalizability. An example analysis using readily available, free, and commercial statistical software is provided as an appendix. While these techniques are not appropriate for all evaluation questions, they can provide a valuable addition to the evaluative toolkit of medical informatics research.
Recent Progress in the Remote Detection of Vapours and Gaseous Pollutants.
ERIC Educational Resources Information Center
Moffat, A. J.; And Others
Work has been continuing on the correlation spectrometry techniques described at previous remote sensing symposiums. Advances in the techniques are described which enable accurate quantitative measurements of diffused atmospheric gases to be made using controlled light sources, accurate quantitative measurements of gas clouds relative to…
General Platform for Systematic Quantitative Evaluation of Small-Molecule Permeability in Bacteria
2015-01-01
The chemical features that impact small-molecule permeability across bacterial membranes are poorly understood, and the resulting lack of tools to predict permeability presents a major obstacle to the discovery and development of novel antibiotics. Antibacterials are known to have vastly different structural and physicochemical properties compared to nonantiinfective drugs, as illustrated herein by principal component analysis (PCA). To understand how these properties influence bacterial permeability, we have developed a systematic approach to evaluate the penetration of diverse compounds into bacteria with distinct cellular envelopes. Intracellular compound accumulation is quantitated using LC-MS/MS, then PCA and Pearson pairwise correlations are used to identify structural and physicochemical parameters that correlate with accumulation. An initial study using 10 sulfonyladenosines in Escherichia coli, Bacillus subtilis, and Mycobacterium smegmatis has identified nonobvious correlations between chemical structure and permeability that differ among the various bacteria. Effects of cotreatment with efflux pump inhibitors were also investigated. This sets the stage for use of this platform in larger prospective analyses of diverse chemotypes to identify global relationships between chemical structure and bacterial permeability that would enable the development of predictive tools to accelerate antibiotic drug discovery. PMID:25198656
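The quantify-then-correlate workflow named above (PCA over physicochemical descriptors plus Pearson pairwise correlations against measured accumulation) can be sketched in a few lines. All descriptor and accumulation values below are invented:

```python
# PCA by SVD over standardized descriptors, plus per-descriptor Pearson
# correlation against intracellular accumulation. Synthetic illustration.
import numpy as np
from scipy import stats

# rows = compounds; columns = [logP, MW, polar surface area, charge]
descriptors = np.array([[1.2, 350, 90, 0], [2.5, 410, 60, 1],
                        [0.3, 290, 120, -1], [3.1, 480, 40, 1],
                        [1.8, 370, 80, 0], [0.9, 330, 100, -1]], float)
accumulation = np.array([5.2, 12.8, 1.9, 15.3, 8.0, 3.1])  # nmol/1e9 cells

Z = (descriptors - descriptors.mean(0)) / descriptors.std(0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
print("variance explained:", (S**2 / (S**2).sum()).round(2))

for name, col in zip(["logP", "MW", "PSA", "charge"], descriptors.T):
    r, p = stats.pearsonr(col, accumulation)
    print(f"{name}: r={r:+.2f} (p={p:.3f})")
```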
NASA Astrophysics Data System (ADS)
Li, Xuan; Liu, Zhiping; Jiang, Xiaoli; Lodewijks, Gabriel
2018-01-01
Eddy current pulsed thermography (ECPT) is well established for non-destructive testing of electrically conductive materials, featuring the advantages of contactless operation, intuitive detection and efficient heating. The concept of divergence characterization of the damage rate of carbon fibre-reinforced plastic (CFRP)-steel structures can be extended to ECPT thermal pattern characterization. It was found in this study that the use of ECPT technology on CFRP-steel structures generated a sizeable amount of valuable information for comprehensive material diagnostics. The relationship between divergence and transient thermal patterns can be identified and analysed by deploying mathematical models to analyse the information about fibre texture-like orientations, gaps and undulations in these multi-layered materials. The developed algorithm enabled the removal of information about fibre texture and the extraction of damage features. The model of the CFRP-glue-steel structures with damage was established using COMSOL Multiphysics® software, and quantitative non-destructive damage evaluation from the ECPT image areas was derived. The results of this proposed method illustrate that damaged areas are highly affected by available information about fibre texture. This proposed work can be applied for detection of impact-induced damage and quantitative evaluation of CFRP structures.
Gao, Bin; Li, Xiaoqing; Woo, Wai Lok; Tian, Gui Yun
2018-05-01
Thermographic inspection has been widely applied to non-destructive testing and evaluation with the capabilities of rapid, contactless, and large surface area detection. Image segmentation is considered essential for identifying and sizing defects. To attain a high-level performance, specific physics-based models that describe defects generation and enable the precise extraction of target region are of crucial importance. In this paper, an effective genetic first-order statistical image segmentation algorithm is proposed for quantitative crack detection. The proposed method automatically extracts valuable spatial-temporal patterns from unsupervised feature extraction algorithm and avoids a range of issues associated with human intervention in laborious manual selection of specific thermal video frames for processing. An internal genetic functionality is built into the proposed algorithm to automatically control the segmentation threshold to render enhanced accuracy in sizing the cracks. Eddy current pulsed thermography will be implemented as a platform to demonstrate surface crack detection. Experimental tests and comparisons have been conducted to verify the efficacy of the proposed method. In addition, a global quantitative assessment index F-score has been adopted to objectively evaluate the performance of different segmentation algorithms.
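The F-score used above as a global segmentation quality index is the harmonic mean of pixel-wise precision and recall of the predicted mask against ground truth. A toy sketch:

```python
# Pixel-wise F-score of a binary crack mask against a ground-truth mask.
import numpy as np

def f_score(pred: np.ndarray, truth: np.ndarray) -> float:
    tp = np.logical_and(pred, truth).sum()
    fp = np.logical_and(pred, ~truth).sum()
    fn = np.logical_and(~pred, truth).sum()
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

truth = np.zeros((8, 8), bool); truth[3:5, 1:7] = True   # "crack" region
pred  = np.zeros((8, 8), bool); pred[3:5, 2:8] = True    # slightly shifted
print(f"F-score = {f_score(pred, truth):.2f}")
```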
Oeck, Sebastian; Malewicz, Nathalie M; Hurst, Sebastian; Al-Refae, Klaudia; Krysztofiak, Adam; Jendrossek, Verena
2017-07-01
The quantitative analysis of foci plays an important role in various cell biological methods. In the fields of radiation biology and experimental oncology, the effect of ionizing radiation, chemotherapy or molecularly targeted drugs on DNA damage induction and repair is frequently assessed by the analysis of protein clusters or phosphorylated proteins recruited to so-called repair foci at DNA damage sites, involving for example γ-H2A.X, 53BP1 or RAD51. We recently developed "The Focinator" as a reliable and fast tool for automated quantitative and qualitative analysis of nuclei and DNA damage foci. The refined software is now even more user-friendly due to a graphical interface and further features. Thus, we included an R-script-based mode for automated image opening, file naming, progress monitoring and an error report. Consequently, the evaluation no longer requires the attendance of the operator after initial parameter definition. Moreover, the Focinator v2-0 is now able to perform multi-channel analysis of four channels and evaluation of protein-protein colocalization by comparison of up to three foci channels. This enables, for example, the quantification of foci in cells of a specific cell cycle phase.
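The counting core of foci-analysis tools of this kind reduces to thresholding the focus channel and labelling connected components; a full pipeline such as the one described adds nucleus segmentation, size filters, multi-channel colocalization and batch automation. A synthetic-image sketch of just the counting step:

```python
# Threshold-and-label foci counting on a synthetic fluorescence image.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
img = rng.normal(100, 10, (64, 64))            # background fluorescence
for y, x in [(10, 12), (30, 40), (50, 20)]:    # three synthetic foci
    img[y-1:y+2, x-1:x+2] += 120

mask = img > img.mean() + 5 * img.std()        # simple intensity threshold
labels, n_foci = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, range(1, n_foci + 1))
print(f"{n_foci} foci, pixel areas: {sizes.astype(int)}")
```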
Predicting episodic memory formation for movie events
Tang, Hanlin; Singer, Jed; Ison, Matias J.; Pivazyan, Gnel; Romaine, Melissa; Frias, Rosa; Meller, Elizabeth; Boulin, Adrianna; Carroll, James; Perron, Victoria; Dowcett, Sarah; Arellano, Marlise; Kreiman, Gabriel
2016-01-01
Episodic memories are long lasting and full of detail, yet imperfect and malleable. We quantitatively evaluated recollection of short audiovisual segments from movies as a proxy to real-life memory formation in 161 subjects at 15 minutes up to a year after encoding. Memories were reproducible within and across individuals, showed the typical decay with time elapsed between encoding and testing, were fallible yet accurate, and were insensitive to low-level stimulus manipulations but sensitive to high-level stimulus properties. Remarkably, memorability was also high for single movie frames, even one year post-encoding. To evaluate what determines the efficacy of long-term memory formation, we developed an extensive set of content annotations that included actions, emotional valence, visual cues and auditory cues. These annotations enabled us to document the content properties that showed a stronger correlation with recognition memory and to build a machine-learning computational model that accounted for episodic memory formation in single events for group averages and individual subjects with an accuracy of up to 80%. These results provide initial steps towards the development of a quantitative computational theory capable of explaining the subjective filtering steps that lead to how humans learn and consolidate memories. PMID:27686330
Gören, Ahmet C; Bilsel, Gökhan; Şimşek, Adnan; Bilsel, Mine; Akçadağ, Fatma; Topal, Kevser; Ozgen, Hasan
2015-05-15
High-performance liquid chromatography with UV detection (HPLC-UV) and LC-MS/MS methods were developed and validated for quantitative analyses of sodium benzoate and potassium sorbate in foods and beverages. The HPLC-UV and LC-MS/MS methods were compared for quantitative analyses of sodium benzoate and potassium sorbate in a representative ketchup sample. Optimisation of the methods enabled the chromatographic separation of the analytes in less than 4 min. A correlation coefficient of 0.999 was achieved over the measured calibration range for both compounds and methods (HPLC and LC-MS/MS). The uncertainty values of sodium benzoate and potassium sorbate were found to be 0.199 and 0.150 mg/L by HPLC and 0.072 and 0.044 mg/L by LC-MS/MS, respectively. Proficiency testing performance of Turkish accredited laboratories between the years 2005 and 2013 was evaluated and reported herein. The aim of the proficiency testing scheme was to evaluate the performance of the laboratories analysing benzoate and sorbate in tomato ketchup. Copyright © 2014 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sevcik, R. S.; Hyman, D. A.; Basumallich, L.
2013-01-01
A technique for carbohydrate analysis for bioprocess samples has been developed, providing reduced analysis time compared to current practice in the biofuels R&D community. The Thermofisher CarboPac SA10 anion-exchange column enables isocratic separation of monosaccharides, sucrose and cellobiose in approximately 7 minutes. Additionally, use of a low-volume (0.2 mL) injection valve in combination with a high-volume detection cell minimizes the extent of sample dilution required to bring sugar concentrations into the linear range of the pulsed amperometric detector (PAD). Three laboratories, representing academia, industry, and government, participated in an interlaboratory study which analyzed twenty-one opportunistic samples representing biomass pretreatment, enzymatic saccharification, and fermentation samples. The technique's robustness, linearity, and interlaboratory reproducibility were evaluated and showed excellent-to-acceptable characteristics. Additionally, quantitation by the CarboPac SA10/PAD was compared with the current practice method utilizing a HPX-87P/RID. While these two methods showed good agreement, a statistical comparison found significant quantitation differences between them, highlighting the difference between selective and universal detection modes.
auf dem Keller, Ulrich; Prudova, Anna; Gioia, Magda; Butler, Georgina S.; Overall, Christopher M.
2010-01-01
Terminal amine isotopic labeling of substrates (TAILS), our recently introduced platform for quantitative N-terminome analysis, enables wide dynamic range identification of original mature protein N-termini and protease cleavage products. Modifying TAILS by use of isobaric tag for relative and absolute quantification (iTRAQ)-like labels for quantification together with a robust statistical classifier derived from experimental protease cleavage data, we report reliable and statistically valid identification of proteolytic events in complex biological systems in MS2 mode. The statistical classifier is supported by a novel parameter evaluating ion intensity-dependent quantification confidences of single peptide quantifications, the quantification confidence factor (QCF). Furthermore, the isoform assignment score (IAS) is introduced, a new scoring system for the evaluation of single peptide-to-protein assignments based on high confidence protein identifications in the same sample prior to negative selection enrichment of N-terminal peptides. By these approaches, we identified and validated, in addition to known substrates, low abundance novel bioactive MMP-2 targets including the plasminogen receptor S100A10 (p11) and the proinflammatory cytokine proEMAP/p43 that were previously undescribed. PMID:20305283
Rudzki, Piotr J; Gniazdowska, Elżbieta; Buś-Kwaśnik, Katarzyna
2018-06-05
Liquid chromatography coupled to mass spectrometry (LC-MS) is a powerful tool for studying pharmacokinetics and toxicokinetics. Reliable bioanalysis requires the characterization of the matrix effect, i.e. the influence of endogenous or exogenous compounds on the analyte signal intensity. We have compared two methods for the quantitation of matrix effect. The CVs (%) of internal standard (IS)-normalized matrix factors recommended by the European Medicines Agency were evaluated against IS-normalized relative matrix effects derived from Matuszewski et al. (2003). Both methods use post-extraction spiked samples, but matrix factors require also neat solutions. We have tested both approaches using analytes of diverse chemical structures. The study did not reveal relevant differences in the results obtained with both calculation methods. After normalization with the internal standard, the CV (%) of the matrix factor was on average 0.5% higher than the corresponding relative matrix effect. The method adopted by the European Medicines Agency seems to be slightly more conservative in the analyzed datasets. Nine analytes of different structures enabled a general overview of the problem; still, further studies are encouraged to confirm our observations. Copyright © 2018 Elsevier B.V. All rights reserved.
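The two compared quantities are both ratios of peak areas. A sketch of the EMA-style calculation (invented peak areas; the 15% acceptance limit is the figure commonly cited from the EMA bioanalytical guideline):

```python
# Matrix factor (MF) = analyte peak area in post-extraction spiked matrix /
# area in neat solution; normalized by the internal standard's MF and
# summarized as CV% across matrix lots.
import numpy as np

# six plasma lots: analyte and internal-standard areas
analyte_matrix = np.array([9.5e4, 8.8e4, 9.9e4, 9.1e4, 9.7e4, 9.0e4])
is_matrix      = np.array([4.8e4, 4.4e4, 5.1e4, 4.6e4, 4.9e4, 4.5e4])
analyte_neat, is_neat = 1.0e5, 5.0e4           # neat-solution areas

mf_analyte = analyte_matrix / analyte_neat
mf_is = is_matrix / is_neat
is_normalized_mf = mf_analyte / mf_is

cv = is_normalized_mf.std(ddof=1) / is_normalized_mf.mean() * 100
print(f"IS-normalized MF CV = {cv:.1f}%  (commonly cited limit: <= 15%)")
```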
Automated classification of cell morphology by coherence-controlled holographic microscopy
NASA Astrophysics Data System (ADS)
Strbkova, Lenka; Zicha, Daniel; Vesely, Pavel; Chmelik, Radim
2017-08-01
In the last few years, classification of cells by machine learning has become frequently used in biology. However, most of the approaches are based on morphometric (MO) features, which are not quantitative in terms of cell mass. This may result in poor classification accuracy. Here, we study the potential contribution of coherence-controlled holographic microscopy enabling quantitative phase imaging for the classification of cell morphologies. We compare our approach with the commonly used method based on MO features. We tested both classification approaches in an experiment with nutritionally deprived cancer tissue cells, while employing several supervised machine learning algorithms. Most of the classifiers provided higher performance when quantitative phase features were employed. Based on the results, it can be concluded that the quantitative phase features played an important role in improving the performance of the classification. The methodology could be valuable help in refining the monitoring of live cells in an automated fashion. We believe that coherence-controlled holographic microscopy, as a tool for quantitative phase imaging, offers all preconditions for the accurate automated analysis of live cell behavior while enabling noninvasive label-free imaging with sufficient contrast and high-spatiotemporal phase sensitivity.
Tonkin-Crine, Sarah; Anthierens, Sibyl; Hood, Kerenza; Yardley, Lucy; Cals, Jochen W L; Francis, Nick A; Coenen, Samuel; van der Velden, Alike W; Godycki-Cwirko, Maciek; Llor, Carl; Butler, Chris C; Verheij, Theo J M; Goossens, Herman; Little, Paul
2016-05-12
Mixed methods are commonly used in health services research; however, data are not often integrated to explore complementarity of findings. A triangulation protocol is one approach to integrating such data. A retrospective triangulation protocol was carried out on mixed methods data collected as part of a process evaluation of a trial. The multi-country randomised controlled trial found that a web-based training in communication skills (including use of a patient booklet) and the use of a C-reactive protein (CRP) point-of-care test decreased antibiotic prescribing by general practitioners (GPs) for acute cough. The process evaluation investigated GPs' and patients' experiences of taking part in the trial. Three analysts independently compared findings across four data sets: qualitative data collected via semi-structured interviews with (1) 62 patients and (2) 66 GPs and quantitative data collected via questionnaires with (3) 2886 patients and (4) 346 GPs. Pairwise comparisons were made between data sets and were categorised as agreement, partial agreement, dissonance or silence. Three instances of dissonance occurred in 39 independent findings. GPs and patients reported different views on the use of a CRP test. GPs felt that the test was useful in convincing patients to accept a no-antibiotic decision, but patient data suggested that this was unnecessary if a full explanation was given. Whilst qualitative data indicated all patients were generally satisfied with their consultation, quantitative data indicated highest levels of satisfaction for those receiving a detailed explanation from their GP with a booklet giving advice on self-care. Both qualitative and quantitative data sets indicated higher patient enablement for those in the communication groups who had received a booklet. Use of CRP tests does not appear to engage patients or influence illness perceptions and its effect is more centred on changing clinician behaviour. Communication skills and the patient booklet were relevant and useful for all patients and associated with increased patient satisfaction. A triangulation protocol to integrate qualitative and quantitative data can reveal findings that need further interpretation and also highlight areas of dissonance that lead to a deeper insight than separate analyses.
Investigating the Educational Value of Social Learning Networks: A Quantitative Analysis
ERIC Educational Resources Information Center
Dafoulas, Georgios; Shokri, Azam
2016-01-01
Purpose: The emergence of Education 2.0 enabled technology-enhanced learning, necessitating new pedagogical approaches, while e-learning has evolved into an instrumental pedagogy of collaboration through affordances of social media. Social learning networks and ubiquitous learning enabled individual and group learning through social engagement and…
Steiner, S; Vogl, T J; Fischer, P; Steger, W; Neuhaus, P; Keck, H
1995-08-01
The aim of our study was to evaluate a T2-weighted turbo spin-echo (TSE) sequence in comparison with a conventional T2-weighted spin-echo (SE) sequence for imaging focal liver lesions. In our study 35 patients with suspected focal liver lesions were examined. The standardised imaging protocol included a conventional T2-weighted SE sequence (TR/TE = 2000/90/45, acquisition time = 10.20) as well as a T2-weighted TSE sequence (TR/TE = 4700/90, acquisition time = 6.33). Calculation of S/N and C/N ratios as the basis of quantitative evaluation was done using standard methods. A diagnostic score was implemented to enable qualitative assessment. In 7% of patients (n = 2) the TSE sequence enabled detection of additional liver lesions less than 1 cm in diameter. In the comparison of anatomical details, the TSE sequence was superior. The S/N and C/N ratios of anatomic and pathologic structures were higher for the TSE sequence than for the SE sequence. Our results indicate that the T2-weighted turbo spin-echo sequence is well suited to imaging focal liver lesions and leads to a reduction of imaging time.
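The S/N and C/N ratios used for the quantitative comparison follow the standard definitions: mean ROI signal (or the signal difference between two tissues) divided by the standard deviation of background noise. A sketch with invented ROI statistics:

```python
# Standard SNR/CNR arithmetic; all ROI values are hypothetical.
liver_signal = 420.0     # mean intensity, liver ROI
lesion_signal = 610.0    # mean intensity, focal lesion ROI
noise_sd = 18.0          # SD of a background-air ROI

snr_liver = liver_signal / noise_sd
snr_lesion = lesion_signal / noise_sd
cnr = (lesion_signal - liver_signal) / noise_sd
print(f"S/N liver={snr_liver:.1f}, lesion={snr_lesion:.1f}, C/N={cnr:.1f}")
```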
NASA Astrophysics Data System (ADS)
Taya, T.; Kataoka, J.; Kishimoto, A.; Tagawa, L.; Mochizuki, S.; Toshito, T.; Kimura, M.; Nagao, Y.; Kurita, K.; Yamaguchi, M.; Kawachi, N.
2017-07-01
Particle therapy is an advanced cancer therapy that uses a feature known as the Bragg peak, in which particle beams suddenly lose their energy near the end of their range. The Bragg peak enables particle beams to damage tumors effectively. To achieve precise therapy, the demand for accurate and quantitative imaging of the beam irradiation region or dosage during therapy has increased. The most common method of particle range verification is imaging of annihilation gamma rays by positron emission tomography. Not only 511-keV gamma rays but also prompt gamma rays are generated during therapy; therefore, the Compton camera is expected to be used as an on-line monitor for particle therapy, as it can image these gamma rays in real time. Proton therapy, one of the most common particle therapies, uses a proton beam of approximately 200 MeV, which has a range of ~25 cm in water. As gamma rays are emitted along the path of the proton beam, quantitative evaluation of the reconstructed images of diffuse sources becomes crucial, but it is far from being fully developed for Compton camera imaging at present. In this study, we first quantitatively evaluated reconstructed Compton camera images of uniformly distributed diffuse sources, and then confirmed that our Compton camera obtained 3% (1σ) and 5% (1σ) uniformity for line and plane sources, respectively. Based on this quantitative study, we demonstrated on-line gamma imaging during proton irradiation. Through these studies, we show that the Compton camera is suitable for future use as an on-line monitor for particle therapy.
Shukla, Garima; Bhatia, Manvir; Behari, Madhuri
2005-10-01
Small fiber neuropathy is a common neurological disorder, often missed or ignored by physicians, since examination and routine nerve conduction studies are usually normal in this condition. Many methods including quantitative thermal sensory testing are currently being used for early detection of this condition, so as to enable timely investigation and treatment. This study was conducted to assess the yield of quantitative thermal sensory testing in the diagnosis of small fiber neuropathy. We included patients presenting with a history suggestive of positive and/or negative sensory symptoms, with normal examination findings, clinically suggestive of small fiber neuropathy, with normal or minimally abnormal routine nerve conduction studies. These patients were subjected to quantitative thermal sensory testing (QST) using a Medoc TSA-II Neurosensory analyser at two sites and for two modalities. QST data were compared with those in 120 normal healthy controls. Twenty-five patients (16 males, 9 females) with mean age 46.8 ± 16.6 years (range: 21-75 years) were included in the study. The mean duration of symptoms was 1.6 ± 1.6 years (range: 3 months to 6 years). Eighteen patients (72%) had abnormal thresholds in at least one modality. Thermal thresholds were normal in 7 of the 25 patients. This study demonstrates that quantitative thermal sensory testing is a fairly sensitive method for detection of small fiber neuropathy, especially in patients with normal routine nerve conduction studies.
Boland, Mary Regina; Rusanov, Alexander; So, Yat; Lopez-Jimenez, Carlos; Busacca, Linda; Steinman, Richard C; Bakken, Suzanne; Bigger, J Thomas; Weng, Chunhua
2014-12-01
Underspecified user needs and frequent lack of a gold standard reference are typical barriers to technology evaluation. To address this problem, this paper presents a two-phase evaluation framework involving usability experts (phase 1) and end-users (phase 2). In phase 1, a cross-system functionality alignment between expert-derived user needs and system functions was performed to inform the choice of "the best available" comparison system to enable a cognitive walkthrough in phase 1 and a comparative effectiveness evaluation in phase 2. During phase 2, five quantitative and qualitative evaluation methods are mixed to assess usability: time-motion analysis, software log, questionnaires - System Usability Scale and the Unified Theory of Acceptance and Use of Technology, think-aloud protocols, and unstructured interviews. Each method contributes data for a unique measure (e.g., time-motion analysis contributes task-completion-time; software log contributes action transition frequency). The measures are triangulated to yield complementary insights regarding user-perceived ease-of-use, functionality integration, anxiety during use, and workflow impact. To illustrate its use, we applied this framework in a formative evaluation of a software called Integrated Model for Patient Care and Clinical Trials (IMPACT). We conclude that this mixed-methods evaluation framework enables an integrated assessment of user needs satisfaction and user-perceived usefulness and usability of a novel design. This evaluation framework effectively bridges the gap between co-evolving user needs and technology designs during iterative prototyping and is particularly useful when it is difficult for users to articulate their needs for technology support due to the lack of a baseline. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhao, Huangxuan; Wang, Guangsong; Lin, Riqiang; Gong, Xiaojing; Song, Liang; Li, Tan; Wang, Wenjia; Zhang, Kunya; Qian, Xiuqing; Zhang, Haixia; Li, Lin; Liu, Zhicheng; Liu, Chengbo
2018-04-01
For the diagnosis and evaluation of ophthalmic diseases, imaging and quantitative characterization of vasculature in the iris are very important. The recently developed photoacoustic imaging, which is ultrasensitive in imaging endogenous hemoglobin molecules, provides a highly efficient label-free method for imaging blood vasculature in the iris. However, the development of advanced vascular quantification algorithms is still needed to enable accurate characterization of the underlying vasculature. We have developed a vascular information quantification algorithm by adopting a three-dimensional (3-D) Hessian matrix and applied it for processing iris vasculature images obtained with a custom-built optical-resolution photoacoustic imaging system (OR-PAM). For the first time, we demonstrate in vivo the 3-D vascular structures of a rat iris with a label-free imaging method and also accurately extract quantitative vascular information, such as vessel diameter, vascular density, and vascular tortuosity. Our results indicate that the developed algorithm is capable of quantifying the vasculature in the 3-D photoacoustic images of the iris in vivo, thus enhancing the diagnostic capability of the OR-PAM system for vascular-related ophthalmic diseases.
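Hessian-based vessel enhancement of the kind adopted above is widely available off the shelf; the sketch below uses scikit-image's Frangi vesselness filter (an assumed dependency) on a 2-D synthetic image for brevity, followed by the simplest of the quantitative metrics mentioned, vascular density:

```python
# Frangi vesselness (multi-scale Hessian eigenvalue response) on a synthetic
# bright vessel, then vascular density = vessel pixels / total pixels.
import numpy as np
from skimage.filters import frangi  # assumed dependency: scikit-image

img = np.zeros((128, 128))
img[60:63, 10:120] = 1.0                     # a synthetic horizontal "vessel"
img += np.random.default_rng(4).normal(0, 0.05, img.shape)

vesselness = frangi(img, sigmas=range(1, 4), black_ridges=False)
vessel_mask = vesselness > 0.5 * vesselness.max()
density = vessel_mask.mean()
print(f"vascular density = {density:.3%}")
```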
Validating internal controls for quantitative plant gene expression studies.
Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H
2004-08-18
Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, nor compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments.
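The screening logic is easy to illustrate: express each candidate gene's Ct values across conditions on a linear scale and prefer the gene with the least between-condition variation. The sketch below uses a CV ranking as a simplified stand-in for the ANOVA/regression stability measures the authors describe; all Ct values are invented:

```python
# Reference-gene stability screen: lowest expression CV across conditions wins.
import numpy as np

# rows = candidate genes, columns = tissues/conditions
ct = {
    "UBQ":  np.array([18.2, 18.4, 18.1, 18.5, 18.3]),
    "ACT2": np.array([17.0, 18.9, 17.6, 19.4, 18.1]),
    "TUB":  np.array([20.1, 20.3, 22.0, 20.2, 21.1]),
}
for gene, cts in ct.items():
    expr = 2.0 ** (-cts)                 # relative expression scale
    cv = expr.std(ddof=1) / expr.mean()
    print(f"{gene}: expression CV across conditions = {cv:.1%}")
# The gene with the smallest CV (here UBQ) is the safest normalizer.
```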
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sirenko, Oksana, E-mail: oksana.sirenko@moldev.com; Cromwell, Evan F., E-mail: evan.cromwell@moldev.com; Crittenden, Carole
2013-12-15
Human induced pluripotent stem cell (iPSC)-derived cardiomyocytes show promise for screening during early drug development. Here, we tested a hypothesis that in vitro assessment of multiple cardiomyocyte physiological parameters enables predictive and mechanistically-interpretable evaluation of cardiotoxicity in a high-throughput format. Human iPSC-derived cardiomyocytes were exposed for 30 min or 24 h to 131 drugs, positive (107) and negative (24) for in vivo cardiotoxicity, in up to 6 concentrations (3 nM to 30 µM) in 384-well plates. Fast kinetic imaging was used to monitor changes in cardiomyocyte function using intracellular Ca²⁺ flux readouts synchronous with beating, and cell viability. A number of physiological parameters of cardiomyocyte beating, such as beat rate, peak shape (amplitude, width, rise, decay, etc.) and regularity were collected using automated data analysis. Concentration–response profiles were evaluated using logistic modeling to derive a benchmark concentration (BMC) point-of-departure value, based on one standard deviation departure from the estimated baseline in vehicle (0.3% dimethyl sulfoxide)-treated cells. BMC values were used for cardiotoxicity classification and ranking of compounds. Beat rate and several peak shape parameters were found to be good predictors, while cell viability had poor classification accuracy. In addition, we applied the Toxicological Prioritization Index (ToxPi) approach to integrate and display data across many collected parameters, to derive "cardiosafety" ranking of tested compounds. Multi-parameter screening of beating profiles allows for cardiotoxicity risk assessment and identification of specific patterns defining mechanism-specific effects. These data and analysis methods may be used widely for compound screening and early safety evaluation in drug development. - Highlights: • Induced pluripotent stem cell-derived cardiomyocytes are promising in vitro models. • We tested if evaluation of cardiotoxicity is possible in a high-throughput format. • The assay shows benefits of automated data integration across multiple parameters. • Quantitative assessment of concentration–response is possible using iPSCs. • Multi-parametric screening allows for cardiotoxicity risk assessment.
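The benchmark-concentration logic described above can be sketched as a logistic concentration-response fit followed by solving for the concentration at which the fitted curve departs one standard deviation from the vehicle baseline. All numbers below are synthetic:

```python
# Logistic concentration-response fit and BMC at a 1-SD baseline departure.
import numpy as np
from scipy.optimize import curve_fit, brentq

conc = np.array([0.003, 0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30])  # uM
beat_rate = np.array([30.1, 29.8, 29.5, 28.9, 27.0, 22.5, 15.0, 9.1, 7.8])
baseline_mean, baseline_sd = 30.0, 0.8      # vehicle wells (0.3% DMSO)

def logistic(x, top, bottom, ec50, hill):
    return bottom + (top - bottom) / (1 + (x / ec50) ** hill)

popt, _ = curve_fit(logistic, conc, beat_rate, p0=[30, 8, 1.0, 1.0])
target = baseline_mean - baseline_sd        # 1-SD departure (decreasing response)
bmc = brentq(lambda x: logistic(x, *popt) - target, conc[0], conc[-1])
print(f"BMC = {bmc:.3f} uM")
```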
Social Return on Investment: Valuing health outcomes or promoting economic values?
Leck, Chris; Upton, Dominic; Evans, Nick
2016-07-01
Interventions and activities that influence health are often concerned with intangible outcomes that are difficult to value despite their potential significance. Social Return on Investment is an evaluation framework that explores all aspects of change and expresses these in comparable terms. It combines qualitative narratives and quantitative measurements with a financial approach to enable outcomes that can otherwise be overlooked or undervalued to be incorporated appropriately. This article presents Social Return on Investment as an effective tool for supporting the development of a holistic appreciation of how interventions impact on the health and well-being of individuals, communities and societies. © The Author(s) 2014.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng, Mingsen; Guizhou Provincial Key Laboratory of Computational Nano-Material Science, Institute of Applied Physics, Guizhou Normal College, Guiyang, 550018; Ye, Gui
Probing flexible molecular conformations is crucial for the electronic application of molecular systems. We have developed a theoretical procedure to analyze the coupling of local molecular vibrations with the electron transport process, which enables us to evaluate the structural fingerprints of certain vibrational modes in inelastic electron tunneling spectroscopy (IETS). Based on a model molecule, bis-(4-mercaptophenyl)-ether, with a flexible center angle, we have revealed and validated a simple mathematical relationship between IETS signals and molecular angles. Our results may open a route to quantitatively measuring key geometrical parameters of molecular junctions, helping to achieve precise control of molecular devices.
Lorantfy, Bettina; Seyer, Bernhard; Herwig, Christoph
2014-01-25
Extreme halophilic Archaea are extremophile species which can thrive in hypersaline environments of up to 3-5 M sodium chloride concentration. Although their ecology and physiology are well characterized at the microbiological level, little emphasis has been placed on quantitative bioprocess development with extreme halophiles. The goal of this study was to establish, on the one hand, a methodological basis for quantitative bioprocess analysis of extreme halophilic Archaea, using an extreme halophilic strain as an example. Firstly, a corrosion-resistant bioreactor setup for extreme halophiles was implemented as a novel application. Then, paying special attention to total bioprocess quantification approaches, an indirect method for biomass quantification using on-line process signals was introduced. Subsequently, robust quantitative data evaluation methods for halophiles could be developed, providing defined and controlled cultivation conditions in the bioreactor and therefore obtaining suitable quality of on-line as well as off-line datasets. On the other hand, new physiological results for extreme halophiles in bioreactors were also obtained based on the quantitative methodological tools. For the first time, quantitative data on stoichiometry and kinetics were collected and evaluated on different carbon sources. The results on various substrates were interpreted, with proposed metabolic mechanisms, by linking to the reported primary carbon metabolism of extreme halophilic Archaea. Moreover, results of chemostat cultures demonstrated that extreme halophilic organisms show Monod kinetics on different sole carbon sources. A diauxic growth pattern was described on a mixture of substrates in batch cultivations. In addition, the methodologies presented here enable characterization of the utilized strain Haloferax mediterranei (HFX) as a potential new host organism. Thus, this study offers a strong methodological basis as well as a fundamental physiological assessment for bioreactor quantification of extreme halophiles that can serve as primary knowledge for applications of extreme halophiles in biotechnology. Copyright © 2013 Elsevier B.V. All rights reserved.
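The chemostat observation lends itself to a compact illustration: at steady state the specific growth rate equals the dilution rate D, so Monod parameters can be estimated from residual substrate measurements via s = Ks·D/(µmax − D). A hedged sketch with invented data, not values from the study:

```python
# Hedged sketch: fitting Monod parameters to steady-state chemostat data.
import numpy as np
from scipy.optimize import curve_fit

def residual_substrate(D, umax, Ks):
    # steady-state Monod relation: s = Ks*D/(umax - D)
    return Ks * D / (umax - D)

D = np.array([0.02, 0.04, 0.06, 0.08, 0.10])   # dilution rates, 1/h (illustrative)
s = np.array([0.15, 0.33, 0.58, 0.95, 1.60])   # residual substrate, g/L (illustrative)

(umax, Ks), _ = curve_fit(residual_substrate, D, s, p0=[0.15, 0.5])
print(f"umax ~ {umax:.3f} 1/h, Ks ~ {Ks:.3f} g/L")
```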
Pawlikowska, T R B; Nowak, P R; Szumilo-Grzesik, W; Walker, J J
2002-04-01
Primary health care reform is underpinned by a move towards patient-centred holistic care. This pilot study uses the Patient Enablement Instrument (PEI) to assess outcome at a fundamental level: that of the patient and their doctor at consultation. Our aim was to assess the evaluative potential of the PEI in relation to a reform programme in Poland by (i) comparing the outcomes of consultations (using the PEI) carried out by nine doctors (three diploma GPs who had participated in the training programme, three GPs who had not participated in the training programme and three polyclinic internists); and (ii) relating PEI scores to a proxy quality process measure (consultation length). A cross-sectional quantitative questionnaire survey was carried out using the PEI. The subjects were patients consulting with nine doctors distributed within a single region around Gdansk. The overall results with the PEI and consultation length reflected UK experience. In addition, there were significant differences between groups in this pilot study. Patients seen by diploma GPs achieved higher patient enablement scores (mean 4.33, 95% confidence interval 4.09-4.58) relative to GPs (mean 3.44, 3.21-3.67) and polyclinic doctors (mean 3.23, 2.99-3.47). However, there is evidence of appreciable between-doctor variation in PEI scores within groups. The difference in patient enablement between groups was not affected by patient case mix, in contrast to the duration of consultation, which was. Holistically trained diploma GPs spent longer with patients with psychological problems. Patients seen by diploma GPs received longer consultations (mean 12.65 min, 95% confidence interval 12.18-13.13) relative to their colleagues (the GPs' mean was 10.11, 9.82-10.41 min; that of the polyclinic internists was 10.16, 9.81-10.50 min). The duration of consultation was positively correlated with patient enablement. The results of such training courses should be examined from the perspective of both the patient and their doctor. Significant differences were found in both patient enablement and consultation length between patients attending groups of doctors delivering primary care, but working from different paradigms. This pilot shows promising results which, if repeated in a larger study, would provide an objective means of evaluating such reform programmes.
Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J
2014-01-01
The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.
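For the deconvolution-based branch, a common formulation (not necessarily ANTONIA's exact implementation) is truncated singular-value deconvolution of the tissue curve with the arterial input function; a minimal sketch on synthetic curves:

```python
# Hedged sketch of truncated-SVD (sSVD) deconvolution for DSC perfusion.
import numpy as np
from scipy.linalg import toeplitz

def ssvd_deconvolve(aif, tissue, dt, thresh=0.2):
    """Returns k(t) = flow * residue function via regularized deconvolution."""
    A = dt * toeplitz(aif, np.zeros_like(aif))            # lower-triangular convolution matrix
    U, S, Vt = np.linalg.svd(A)
    S_inv = np.where(S > thresh * S.max(), 1.0 / S, 0.0)  # drop small singular values
    return Vt.T @ np.diag(S_inv) @ U.T @ tissue

dt = 1.5                                                  # sampling interval, s
t = np.arange(0, 60, dt)
aif = np.exp(-((t - 12.0) ** 2) / 18.0)                   # synthetic arterial input function
residue = np.exp(-t / 8.0)                                # synthetic residue function
tissue = 0.6 * dt * np.convolve(aif, residue)[: t.size]   # forward model, "flow" = 0.6

k = ssvd_deconvolve(aif, tissue, dt)
print(f"recovered flow ~ {k.max():.3f} (true 0.6)")       # peak of k(t) estimates flow
```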
Measurements in quantitative research: how to select and report on research instruments.
Hagan, Teresa L
2014-07-01
Measures exist to numerically represent degrees of attributes. Quantitative research is based on measurement and is conducted in a systematic, controlled manner. These measures enable researchers to perform statistical tests, analyze differences between groups, and determine the effectiveness of treatments. If something is not measurable, it cannot be tested.
Understanding Knowledge-Sharing Breakdowns: A Meeting of the Quantitative and Qualitative Minds
ERIC Educational Resources Information Center
Soller, Amy
2004-01-01
The rapid advance of distance learning and networking technology has enabled universities and corporations to reach out and educate students across time and space barriers. Although this technology enables structured collaborative learning activities, online groups often do not enjoy the same benefits as face-to-face learners, and their…
Accuracy of lung nodule density on HRCT: analysis by PSF-based image simulation.
Ohno, Ken; Ohkubo, Masaki; Marasinghe, Janaka C; Murao, Kohei; Matsumoto, Toru; Wada, Shinichi
2012-11-08
A computed tomography (CT) image simulation technique based on the point spread function (PSF) was applied to analyze the accuracy of CT-based clinical evaluations of lung nodule density. The PSF of the CT system was measured and used to perform the lung nodule image simulation. Then, the simulated image was resampled at intervals equal to the pixel size and the slice interval found in clinical high-resolution CT (HRCT) images. On those images, the nodule density was measured by placing a region of interest (ROI) commonly used for routine clinical practice, and comparing the measured value with the true value (a known density of object function used in the image simulation). It was quantitatively determined that the measured nodule density depended on the nodule diameter and the image reconstruction parameters (kernel and slice thickness). In addition, the measured density fluctuated, depending on the offset between the nodule center and the image voxel center. This fluctuation was reduced by decreasing the slice interval (i.e., with the use of overlapping reconstruction), leading to a stable density evaluation. Our proposed method of PSF-based image simulation accompanied with resampling enables a quantitative analysis of the accuracy of CT-based evaluations of lung nodule density. These results could potentially reveal clinical misreadings in diagnosis, and lead to more accurate and precise density evaluations. They would also be of value for determining the optimum scan and reconstruction parameters, such as image reconstruction kernels and slice thicknesses/intervals.
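The simulation idea can be illustrated compactly: blur a known-density object with the PSF, resample at the clinical pixel spacing with varying offsets, and compare the ROI mean against the true density. A hedged sketch with a Gaussian standing in for the measured PSF; all sizes are illustrative:

```python
# Hedged sketch of PSF-based nodule image simulation with resampling.
import numpy as np
from scipy.ndimage import gaussian_filter

true_density = 100.0                        # HU above background
grid = 0.05                                 # fine simulation grid, mm
x = np.arange(-15, 15, grid)
X, Y = np.meshgrid(x, x)
nodule = np.where(X**2 + Y**2 <= 4.0**2, true_density, 0.0)   # 8 mm diameter nodule

blurred = gaussian_filter(nodule, sigma=0.8 / grid)           # stand-in PSF, 0.8 mm sigma

pixel = 0.6                                 # clinical pixel size, mm
step = int(round(pixel / grid))
for offset in range(0, step, 3):            # nodule-center vs voxel-center offset
    img = blurred[offset::step, offset::step]
    roi = img[img > 0.5 * img.max()]        # simple ROI over the nodule core
    print(f"offset {offset*grid:.2f} mm: measured {roi.mean():.1f} vs true {true_density}")
```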
Monitoring Astronaut Health at the Nanoscale Cellular Level Through the Eye
NASA Technical Reports Server (NTRS)
Ansari, Rafat R.; Singh, Bhim S.; Rovati, Luigi; Docchio, Franco; Sebag, Jerry
2000-01-01
A user friendly goggles-like head-mounted device equipped with a suite of instruments for several non-invasive and quantitative medical evaluation of the eye, skin, and brain is desired for monitoring the health of astronauts during space travel and exploration of neighboring and distant planets. Real-time non-invasive evaluation of the different structures within the above organs can provide indices of the health of not just these organs, but the entire body. The techniques such as dynamic light scattering (for the early detection of uveitis, cholesterol levels, cataract, changes in the vitreous and possibly Alzheimer's disease), corneal autofluorescence (to assess extracellular matrix biology e.g., in diabetes), optical activity measurements (of anterior ocular fluid to evaluate blood-glucose levels), laser Doppler velocimetry (to assess retinal, optic nerve, and choroidal blood flow), reflectometry/oximetry (for assessing ocular and central nervous system oxygen metabolism), optical coherence tomography (to determine retinal tissue microstructure) and possibly scanning laser technology (for intraocular tissue imaging and scanning) will he integrated into this compact device. Skin sensors will also be mounted on the portion of the device in contact with the periocular region. This will enable monitoring of body temperature, EEG, and electrolyte status. This device will monitor astronaut health during long-duration space travel by detecting aberrations from pre-established "nonns", enabling prompt diagnosis and possibly the initiation of early preventative/curative therapy. The non-invasive nature of the device technologies permits frequent repetition of tests, enabling real-time complete crew health monitoring. This device may ultimately be useful in tele-medicine to bring modern healthcare to under-served areas on Earth as well as in so-called "advanced" care settings (e.g. diabetes in the USA).
Surface plasmon resonance microscopy: achieving a quantitative optical response
Peterson, Alexander W.; Halter, Michael; Plant, Anne L.; Elliott, John T.
2016-01-01
Surface plasmon resonance (SPR) imaging allows real-time label-free imaging based on index of refraction, and changes in index of refraction at an interface. Optical parameter analysis is achieved by application of the Fresnel model to SPR data typically taken by an instrument in a prism based configuration. We carry out SPR imaging on a microscope by launching light into a sample, and collecting reflected light through a high numerical aperture microscope objective. The SPR microscope enables spatial resolution that approaches the diffraction limit, and has a dynamic range that allows detection of subnanometer to submicrometer changes in thickness of biological material at a surface. However, unambiguous quantitative interpretation of SPR changes using the microscope system could not be achieved using the Fresnel model because of polarization dependent attenuation and optical aberration that occurs in the high numerical aperture objective. To overcome this problem, we demonstrate a model to correct for polarization diattenuation and optical aberrations in the SPR data, and develop a procedure to calibrate reflectivity to index of refraction values. The calibration and correction strategy for quantitative analysis was validated by comparing the known indices of refraction of bulk materials with corrected SPR data interpreted with the Fresnel model. Subsequently, we applied our SPR microscopy method to evaluate the index of refraction for a series of polymer microspheres in aqueous media and validated the quality of the measurement with quantitative phase microscopy. PMID:27782542
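A hedged sketch of the underlying Fresnel model for a prism/gold/sample stack, computing p-polarized reflectivity versus sample index at fixed angle; the layer constants are illustrative, and the microscope's polarization and aberration corrections are not modelled here:

```python
# Hedged sketch: three-layer Fresnel reflectivity for an SPR configuration.
import numpy as np

def spr_reflectivity(n_sample, wavelength=633e-9, theta_deg=72.0,
                     n_prism=1.515, eps_gold=-11.6 + 1.2j, d_gold=47e-9):
    k0 = 2 * np.pi / wavelength
    kx = k0 * n_prism * np.sin(np.radians(theta_deg))      # in-plane wavevector
    eps = np.array([n_prism**2, eps_gold, n_sample**2], dtype=complex)
    kz = np.sqrt(eps * k0**2 - kx**2)                      # normal wavevector per layer
    r12 = (eps[1] * kz[0] - eps[0] * kz[1]) / (eps[1] * kz[0] + eps[0] * kz[1])
    r23 = (eps[2] * kz[1] - eps[1] * kz[2]) / (eps[2] * kz[1] + eps[1] * kz[2])
    phase = np.exp(2j * kz[1] * d_gold)                    # round trip through the gold film
    r = (r12 + r23 * phase) / (1 + r12 * r23 * phase)
    return np.abs(r) ** 2

for n in (1.330, 1.335, 1.340):            # aqueous media of increasing index
    print(f"n = {n:.3f}: R = {spr_reflectivity(n):.3f}")
```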
Quantitative prediction of phase transformations in silicon during nanoindentation
NASA Astrophysics Data System (ADS)
Zhang, Liangchi; Basak, Animesh
2013-08-01
This paper establishes the first quantitative relationship between the phases transformed in silicon and the shape characteristics of nanoindentation curves. Based on an integrated analysis using TEM and the unit cell properties of the phases, the volumes of the phases that emerge in a nanoindentation are formulated as a function of pop-out size and the depth of the nanoindentation impression. This simple formula enables a fast, accurate and quantitative prediction of the phases in a nanoindentation cycle, which had not previously been possible.
Zhu, Xiaolin; Zhang, Kexin; Wang, Chengzhi; Guan, Jiunian; Yuan, Xing; Li, Baikun
2016-01-01
This study aimed at developing a simple, sensitive and rapid electrochemical approach to quantitatively determine and assess the toxicity of 2,4-dichlorophenol (2,4-DCP), a priority pollutant posing a potential risk to public health, through a novel poly(eosin Y, EY)/hydroxylated multi-walled carbon nanotubes composite modified electrode (PEY/MWNTs-OH/GCE). The distinct feature of this easily fabricated electrode was the synergistic coupling effect between EY and MWNTs-OH, which enabled a high electrocatalytic activity toward 2,4-DCP. Under optimum conditions, the oxidation peak current increased linearly with concentration from 0.005 to 0.1 μM and from 0.2 to 40.0 μM, with a detection limit of 1.5 nM. Moreover, the PEY/MWNTs-OH/GCE exhibited excellent electrocatalytic activity toward intracellular electroactive species. Two sensitive electrochemical signals ascribed to guanine/xanthine and adenine/hypoxanthine in human hepatoma (HepG2) cells were detected simultaneously. The sensor was successfully applied to evaluate the toxicity of 2,4-DCP to HepG2 cells. The IC50 values based on the two electrochemical signals are 201.07 and 252.83 μM, respectively. This study established a sensitive platform for the comprehensive evaluation of 2,4-DCP and shows great potential to simplify environmental toxicity monitoring. PMID:27941912
He, Yuhui; Tsutsui, Makusu; Scheicher, Ralph H.; Fan, Chun; Taniguchi, Masateru; Kawai, Tomoji
2013-01-01
Experiments using nanopores demonstrated that a salt gradient enhances the capture rate of DNA and reduces its translocation speed. These two effects can help to enable electrical DNA sequencing with nanopores. Here, we provide a quantitative theoretical evaluation that shows the positive net charges, which accumulate around the pore entrance due to the salt gradient, are responsible for the two observed effects: they reinforce the electric capture field, resulting in promoted molecule capture rate; and they induce cationic electroosmotic flow through the nanopore, thus significantly retarding the motion of the anionic DNA through the nanopore. Our multiphysical simulation results show that, during the polymer trapping stage, the former effect plays the major role, thus resulting in promoted DNA capture rate, while during the nanopore-penetrating stage the latter effect dominates and consequently reduces the DNA translocation speed significantly. Quantitative agreement with experimental results has been reached by further taking nanopore wall surface charges into account. PMID:23931325
Dumbryte, Irma; Linkeviciene, Laura; Linkevicius, Tomas; Malinauskas, Mangirdas
2017-07-26
The study aimed at reviewing currently available techniques for enamel microcrack (EMC) detection and presenting a method for direct quantitative analysis of an individual EMC. Measurements of detailed EMC characteristics (location, length, and width) were taken from reconstructed images of the buccal tooth surface (teeth extracted from two age groups of patients), employing scanning electron microscopy (SEM) and our derived formulas, before and after ceramic bracket removal. Measured EMC parameters for the younger age group were 2.41 µm (width) and 3.68 mm (length) before debonding and 2.73 µm and 3.90 mm after; for the older group, 4.03 µm and 4.35 mm before and 4.80 µm and 4.37 mm after bracket removal. Following debonding, EMCs increased for both groups; however, the changes in width and length were statistically insignificant. Regardless of the age group, the proposed method enabled precise detection of the same EMC before and after debonding, and quantitative examination of its characteristics.
MetaPathways v2.5: quantitative functional, taxonomic and usability improvements.
Konwar, Kishori M; Hanson, Niels W; Bhatia, Maya P; Kim, Dongjae; Wu, Shang-Ju; Hahn, Aria S; Morgan-Lang, Connor; Cheung, Hiu Kan; Hallam, Steven J
2015-10-15
Next-generation sequencing is producing vast amounts of sequence information from natural and engineered ecosystems. Although this data deluge has an enormous potential to transform our lives, knowledge creation and translation need software applications that scale with increasing data processing and analysis requirements. Here, we present improvements to MetaPathways, an annotation and analysis pipeline for environmental sequence information that expedites this transformation. We specifically address pathway prediction hazards through integration of a weighted taxonomic distance and enable quantitative comparison of assembled annotations through a normalized read-mapping measure. Additionally, we improve LAST homology searches through BLAST-equivalent E-values and output formats that are natively compatible with prevailing software applications. Finally, an updated graphical user interface allows for keyword annotation query and projection onto user-defined functional gene hierarchies, including the Carbohydrate-Active Enzyme database. MetaPathways v2.5 is available on GitHub: http://github.com/hallamlab/metapathways2. shallam@mail.ubc.ca Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
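The normalized read-mapping measure can be illustrated in the RPKM style (reads per kilobase of ORF per million mapped reads); a minimal sketch of the arithmetic, not MetaPathways' exact implementation:

```python
# Hedged sketch of an RPKM-style normalized read-mapping measure.
def rpkm(mapped_reads, orf_length_bp, total_mapped_reads):
    # scale by ORF length (kb) and library size (millions) so annotations
    # can be compared quantitatively across samples
    return mapped_reads / ((orf_length_bp / 1e3) * (total_mapped_reads / 1e6))

# e.g. 250 reads on a 1.2 kb ORF in a library of 8 million mapped reads
print(f"{rpkm(250, 1200, 8_000_000):.2f} reads per kb per million mapped")
```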
Designing Domain-Specific HUMS Architectures: An Automated Approach
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi; Agarwal, Neha; Kumar, Pramod; Sundaram, Parthiban
2004-01-01
The HUMS automation system automates the design of HUMS architectures. The automated design process involves selection of solutions from a large space of designs as well as pure synthesis of designs. Hence the overall objective is to efficiently search for, or synthesize, designs or parts of designs in the database and to integrate them to form the entire system design. The automation system adopts two approaches in order to produce the designs: (a) a bottom-up approach and (b) a top-down approach. Both approaches are endowed with a suite of qualitative and quantitative techniques that enable a) the selection of matching component instances, b) the determination of design parameters, c) the evaluation of candidate designs at component level and at system level, d) the performance of cost-benefit analyses, e) the performance of trade-off analyses, etc. In short, the automation system attempts to capitalize on the knowledge developed from years of experience in engineering, system design and operation of HUMS systems in order to economically produce optimal, domain-specific designs.
Advancing Precision Nuclear Medicine and Molecular Imaging for Lymphoma.
Wright, Chadwick L; Maly, Joseph J; Zhang, Jun; Knopp, Michael V
2017-01-01
PET with fluorine-18 fluorodeoxyglucose (¹⁸F-FDG PET) is a meaningful biomarker for the detection, targeted biopsy, and treatment of lymphoma. This article reviews the evolution of ¹⁸F-FDG PET as a putative biomarker for lymphoma and addresses the current capabilities, challenges, and opportunities to enable precision medicine practices for lymphoma. Precision nuclear medicine is driven by new imaging technologies and methodologies to more accurately detect malignant disease. Although quantitative assessment of response is currently limited, these technologies will enable more precise metabolic mapping with much higher-definition image detail, and thus may make ¹⁸F-FDG PET a robust and valid quantitative response assessment methodology. Copyright © 2016 Elsevier Inc. All rights reserved.
Halcomb, Elizabeth; Hickman, Louise
2015-04-08
Mixed methods research involves the use of qualitative and quantitative data in a single research project. It represents an alternative methodological approach, combining qualitative and quantitative research approaches, which enables nurse researchers to explore complex phenomena in detail. This article provides a practical overview of mixed methods research and its application in nursing, to guide the novice researcher considering a mixed methods research project.
Synthesising quantitative and qualitative research in evidence-based patient information.
Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan
2007-03-01
Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence-based practice independently of other research methodologies, but methods for the synthesis of such data are still evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. This paper describes the review methods developed, and the difficulties encountered, during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening, or that addressed women's information needs, were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non-quantitative synthesis was conducted and a tabular evidence profile for each important outcome (e.g. "explain what the test involves") was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. In total, 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. A practical example is provided of how quantitative and qualitative data sources might successfully be brought together and considered in one review.
Da Costa, Caitlyn; Reynolds, James C; Whitmarsh, Samuel; Lynch, Tom; Creaser, Colin S
2013-01-01
RATIONALE: Chemical additives are incorporated into commercial lubricant oils to modify the physical and chemical properties of the lubricant. The quantitative analysis of additives in oil-based lubricants deposited on a surface without extraction of the sample from the surface presents a challenge. The potential of desorption electrospray ionization mass spectrometry (DESI-MS) for the quantitative surface analysis of an oil additive in a complex oil lubricant matrix without sample extraction has been evaluated. METHODS: The quantitative surface analysis of the antioxidant additive octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix was carried out by DESI-MS in the presence of 2-(pentyloxy)ethyl 3-(3,5-di-tert-butyl-4-hydroxyphenyl)propionate as an internal standard. A quadrupole/time-of-flight mass spectrometer fitted with an in-house modified ion source enabling non-proximal DESI-MS was used for the analyses. RESULTS: An eight-point calibration curve ranging from 1 to 80 µg/spot of octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix and in the presence of the internal standard was used to determine the quantitative response of the DESI-MS method. The sensitivity and repeatability of the technique were assessed by conducting replicate analyses at each concentration. The limit of detection was determined to be 11 ng/mm² of additive on spot, with relative standard deviations in the range 3–14%. CONCLUSIONS: The application of DESI-MS to the direct, quantitative surface analysis of a commercial lubricant additive in a native oil lubricant matrix is demonstrated. © 2013 The Authors. Rapid Communications in Mass Spectrometry published by John Wiley & Sons, Ltd. PMID:24097398
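A hedged sketch of the internal-standard calibration arithmetic: regress the analyte-to-internal-standard intensity ratio on the amount spotted and estimate a detection limit from the residual scatter. The numbers are invented, not the paper's:

```python
# Hedged sketch: internal-standard calibration with a detection-limit estimate.
import numpy as np
from scipy import stats

amount = np.array([1, 5, 10, 20, 40, 50, 60, 80], dtype=float)          # µg/spot
ratio = np.array([0.021, 0.102, 0.205, 0.401, 0.82, 1.02, 1.18, 1.63])  # analyte/IS intensity

fit = stats.linregress(amount, ratio)
resid_sd = np.std(ratio - (fit.intercept + fit.slope * amount), ddof=2)
lod = 3.3 * resid_sd / fit.slope          # common 3.3*sigma/slope convention
print(f"slope {fit.slope:.4f} per ug, R^2 {fit.rvalue**2:.4f}, LOD ~ {lod:.2f} ug/spot")
```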
Quantitative secondary electron detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agrawal, Jyoti; Joy, David C.; Nayak, Subuhadarshi
Quantitative Secondary Electron Detection (QSED) using an array of solid-state device (SSD) based electron counters enables critical dimension metrology measurements in materials such as semiconductors, nanomaterials, and biological samples. The methods and devices effect quantitative detection of secondary electrons with an array comprising a number of solid-state detectors: the array senses secondary electrons with a plurality of solid-state detectors and counts them with a time-to-digital converter circuit operating in counter mode.
Schilling, Birgit; Gibson, Bradford W.; Hunter, Christie L.
2017-01-01
Data-independent acquisition is a powerful mass spectrometry technique that enables comprehensive MS and MS/MS analysis of all detectable species, providing an information rich data file that can be mined deeply. Here, we describe how to acquire high-quality SWATH® Acquisition data to be used for large quantitative proteomic studies. We specifically focus on using variable sized Q1 windows for acquisition of MS/MS data for generating higher specificity quantitative data. PMID:28188533
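One simple scheme for variable window placement (an assumption here, not necessarily the vendor's algorithm) is to set window edges at quantiles of the precursor m/z distribution, so that dense regions receive narrow, more specific windows:

```python
# Hedged sketch: quantile-based variable Q1 window placement for DIA/SWATH.
import numpy as np

rng = np.random.default_rng(1)
# stand-in precursor m/z distribution from a hypothetical library
precursor_mz = np.sort(rng.normal(650, 120, size=5000).clip(400, 1200))

n_windows = 10
edges = np.quantile(precursor_mz, np.linspace(0, 1, n_windows + 1))
for lo, hi in zip(edges[:-1], edges[1:]):
    # each window covers roughly the same number of precursors
    print(f"{lo:7.1f} - {hi:7.1f} m/z  (width {hi - lo:5.1f})")
```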
Tranca, D. E.; Stanciu, S. G.; Hristu, R.; Stoichita, C.; Tofail, S. A. M.; Stanciu, G. A.
2015-01-01
A new method for high-resolution quantitative measurement of the dielectric function by using scattering scanning near-field optical microscopy (s-SNOM) is presented. The method is based on a calibration procedure that uses the s-SNOM oscillating dipole model of the probe-sample interaction and quantitative s-SNOM measurements. The nanoscale capabilities of the method have the potential to enable novel applications in various fields such as nano-electronics, nano-photonics, biology or medicine. PMID:26138665
Studying learning in the healthcare setting: the potential of quantitative diary methods.
Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke
2015-08-01
Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples' experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.
Bai, Cheng; Reilly, Charles C.; Wood, Bruce W.
2007-01-01
High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-eluting asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone elutes earlier than citrulline alone, but when both are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine to citrulline ratios [3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (µmol ml⁻¹/µmol ml⁻¹)], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects the relative concentrations of the Urea Cycle intermediates asparagine and citrulline present in sap. Consequently, we conclude that the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides. PMID:19662174
Berger, Sarah; Mahler, Cornelia; Krug, Katja; Szecsenyi, Joachim; Schultz, Jobst-Hendrik
2016-01-01
Introduction: This project report describes the development, "piloting" and evaluation of an interprofessional seminar on team communication bringing together medical students and Interprofessional Health Care B.Sc. students at the Medical Faculty of Heidelberg University, Germany. Project Description: A five-member interprofessional team collaborated on this project. Kolb's experiential learning concept formed the theoretical foundation for the seminar, which explored three interprofessional competency areas: team work, communication and values/ethics. Evaluation for the purposes of quality assurance and future curricula development was conducted using two quantitative measures: descriptive analysis of a standardized course evaluation tool (EvaSys) and ANOVA analysis of the German translation of the University of the West of England Interprofessional Questionnaire (UWE-IP-D). Results: The key finding from the standardized course evaluation was that the interprofessional seminars were rated more positively [M=2.11 (1 most positive and 5 most negative), SD=1, n=27] than the monoprofessional seminars [M=2.55, SD=0.98, n=90]. The key finding from the UWE-IP-D survey, comparing pre and post scores of the interprofessional (IP) (n=40) and monoprofessional (MP) groups (n=34), was that significant positive changes in mean scores towards communication, teamwork and interprofessional learning occurred in both groups. Conclusions: Lessons learnt included: a) recognising the benefit of being pragmatic when introducing interprofessional education initiatives, which enabled various logistical and attitudinal barriers to be overcome; b) quantitative evaluation of learning outcomes alone could not explain positive responses or potential influences of interprofessional aspects, which highlighted the need for a mixed-methods approach, including qualitative methods, to enrich judgment formation on interprofessional educational outcomes. PMID:27280133
Use magnetic resonance imaging to assess articular cartilage
Wang, Yuanyuan; Wluka, Anita E.; Jones, Graeme; Ding, Changhai
2012-01-01
Magnetic resonance imaging (MRI) enables a noninvasive, three-dimensional assessment of the entire joint, simultaneously allowing direct visualization of articular cartilage. Thus, MRI has become the imaging modality of choice in both clinical and research settings for musculoskeletal diseases, particularly osteoarthritis (OA). Although radiography, the current gold standard for the assessment of OA, has seen recent significant technical advances, radiographic methods have significant limitations when used to measure disease progression. MRI allows accurate and reliable assessment of articular cartilage that is sensitive to change, providing the opportunity to better examine and understand preclinical and very subtle early abnormalities in articular cartilage, prior to the onset of radiographic disease. MRI enables quantitative (cartilage volume and thickness) and semiquantitative assessment of articular cartilage morphology, and quantitative assessment of cartilage matrix composition. Cartilage volume and defect measures have demonstrated adequate validity, accuracy, reliability and sensitivity to change. They correlate with radiographic changes and clinical outcomes such as pain and joint replacement. Measures of cartilage matrix composition show promise as they seem to relate to cartilage morphology and symptoms. MRI-derived cartilage measurements provide a useful tool for exploring the effect of modifiable factors on articular cartilage prior to clinical disease and for identifying potential preventive strategies. MRI represents a useful approach to monitoring the natural history of OA and evaluating the effect of therapeutic agents. MRI assessment of articular cartilage has tremendous potential for large-scale epidemiological studies of OA progression, and for clinical trials of treatment response to disease-modifying OA drugs. PMID:22870497
Informal information for web-based engineering catalogues
NASA Astrophysics Data System (ADS)
Allen, Richard D.; Culley, Stephen J.; Hicks, Ben J.
2001-10-01
Success is highly dependent on the ability of a company to efficiently produce optimal designs. To achieve this, companies must minimize time to market and possess the ability to make fully informed decisions at the early phase of the design process. Such decisions may include the choice of component and suppliers, as well as cost and maintenance considerations. Computer modeling and electronic catalogues are becoming the preferred medium for the selection and design of mechanical components. In utilizing these techniques, the designer demands the capability to identify, evaluate and select mechanical components both quantitatively and qualitatively. Quantitative decisions generally encompass performance data included in the formal catalogue representation. It is in the area of qualitative decisions that the use of what the authors call 'Informal Information' is of crucial importance. Thus, 'Informal Information' must often be incorporated into the selection process and selection systems. This would enable more informed decisions to be made more quickly, without the need for information retrieval via discussion with colleagues in the design environment. This paper provides an overview of the use of electronic information in the design of mechanical systems, including a discussion of the limitations of current technology. The importance of Informal Information is discussed, and the requirements for associating it with web-based electronic catalogues are developed into a system based on a flexible XML schema that enables the storage, classification and recall of Informal Information packets. Furthermore, a strategy for the inclusion of Informal Information is proposed, and an example case is used to illustrate the benefits.
A custom-built PET phantom design for quantitative imaging of printed distributions.
Markiewicz, P J; Angelis, G I; Kotasidis, F; Green, M; Lionheart, W R; Reader, A J; Matthews, J C
2011-11-07
This note presents a practical approach to the custom-made design of PET phantoms enabling the use of digital radioactive distributions with high quantitative accuracy and spatial resolution. The phantom design allows planar sources of any radioactivity distribution to be imaged in transaxial and axial (sagittal or coronal) planes. Although the design presented here is specially adapted to the high-resolution research tomograph (HRRT), the methods can be adapted to almost any PET scanner. While the phantom design has many advantages, a number of practical issues had to be overcome, such as positioning of the printed source, calibration, and the uniformity and reproducibility of printing. A well counter (WC) was used in the calibration procedure to find the nonlinear relationship between digital voxel intensities and the actual measured radioactive concentrations. Repeated printing together with WC measurements and computed radiography (CR) using phosphor imaging plates (IP) were used to evaluate the reproducibility and uniformity of such printing. Results show satisfactory printing uniformity and reproducibility; however, calibration is dependent on the printing mode and the physical state of the cartridge. As a demonstration of the utility of printed phantoms, the image resolution and quantitative accuracy of reconstructed HRRT images are assessed. There is very good quantitative agreement in the calibration procedure between HRRT, CR and WC measurements. However, the high resolution of CR and its quantitative accuracy, supported by WC measurements, made it possible to show the degraded resolution of HRRT brain images caused by the partial-volume effect and the limits of iterative image reconstruction.
NASA Astrophysics Data System (ADS)
Esposito, Alessandro
2006-05-01
This PhD project aims at the development and evaluation of microscopy techniques for the quantitative detection of molecular interactions and cellular features. The primarily investigated techniques are Förster Resonance Energy Transfer imaging and Fluorescence Lifetime Imaging Microscopy. These techniques have the capability to quantitatively probe the biochemical environment of fluorophores. An automated microscope capable of unsupervised operation has been developed that enables the investigation of molecular and cellular properties at high throughput levels and the analysis of cellular heterogeneity. State-of-the-art Förster Resonance Energy Transfer imaging, Fluorescence Lifetime Imaging Microscopy, Confocal Laser Scanning Microscopy and the newly developed tools have been combined with cellular and molecular biology techniques for the investigation of protein-protein interactions, oligomerization and post-translational modifications of α-Synuclein and Tau, two proteins involved in Parkinson's and Alzheimer's disease, respectively. The high inter-disciplinarity of this project required the merging of the expertise of both the Molecular Biophysics Group at the Debye Institute - Utrecht University and the Cell Biophysics Group at the European Neuroscience Institute - Göttingen University. This project was conducted also with the support and the collaboration of the Center for the Molecular Physiology of the Brain (Göttingen), particularly with the groups associated with the Molecular Quantitative Microscopy and Parkinson's Disease and Aggregopathies areas. This work demonstrates that molecular and cellular quantitative microscopy can be used in combination with high-throughput screening as a powerful tool for the investigation of the molecular mechanisms of complex biological phenomena like those occurring in neurodegenerative diseases.
42 CFR 431.424 - Evaluation requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...
42 CFR 431.424 - Evaluation requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...
42 CFR 431.424 - Evaluation requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...
Temporal efficiency evaluation and small-worldness characterization in temporal networks
Dai, Zhongxiang; Chen, Yu; Li, Junhua; Fam, Johnson; Bezerianos, Anastasios; Sun, Yu
2016-01-01
Numerous real-world systems can be modeled as networks. To date, most network studies have been conducted assuming stationary network characteristics. Many systems, however, undergo topological changes over time. Temporal networks, which incorporate time into conventional network models, are therefore more accurate representations of such dynamic systems. Here, we introduce a novel generalized analytical framework for temporal networks, which enables 1) robust evaluation of the efficiency of temporal information exchange using two new network metrics and 2) quantitative inspection of the temporal small-worldness. Specifically, we define new robust temporal network efficiency measures by incorporating the time dependency of temporal distance. We propose a temporal regular network model, and based on this plus the redefined temporal efficiency metrics and widely used temporal random network models, we introduce a quantitative approach for identifying temporal small-world architectures (featuring high temporal network efficiency both globally and locally). In addition, within this framework, we can uncover network-specific dynamic structures. Applications to brain networks, international trade networks, and social networks reveal prominent temporal small-world properties with distinct dynamic network structures. We believe that the framework can provide further insight into dynamic changes in the network topology of various real-world systems and significantly promote research on temporal networks. PMID:27682314
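To make the temporal-distance construction concrete, a minimal sketch that computes earliest-arrival distances over snapshot adjacency matrices (assuming one hop per snapshot) and averages their reciprocals into a global efficiency; the paper's metrics add further time-dependency weighting not modelled here:

```python
# Hedged sketch: temporal distances and a basic temporal global efficiency.
import numpy as np

def temporal_distances(snapshots):
    """snapshots: list of (n x n) 0/1 adjacency matrices, one per time step."""
    n = snapshots[0].shape[0]
    dist = np.full((n, n), np.inf)
    np.fill_diagonal(dist, 0.0)
    reached = [set([i]) for i in range(n)]             # nodes reachable from i so far
    for t, A in enumerate(snapshots, start=1):         # one hop allowed per snapshot
        for i in range(n):
            new = {v for u in reached[i] for v in np.nonzero(A[u])[0]} - reached[i]
            for v in new:
                dist[i, v] = t                         # first time v becomes reachable
            reached[i] |= new
    return dist

def temporal_global_efficiency(snapshots):
    d = temporal_distances(snapshots)
    off = ~np.eye(d.shape[0], dtype=bool)
    return np.mean(1.0 / d[off])                       # unreachable pairs contribute 0

rng = np.random.default_rng(2)
snaps = [(rng.random((20, 20)) < 0.08).astype(int) for _ in range(15)]
print(f"temporal global efficiency: {temporal_global_efficiency(snaps):.3f}")
```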
Rieger, Theodore R; Musante, Cynthia J
2016-10-30
Quantitative Systems Pharmacology (QSP) is an emerging science with increasing application to pharmaceutical research and development paradigms. Through a case study, we provide an overview of the benefits and challenges of applying QSP approaches to inform program decisions in the early stages of drug discovery and development. Specifically, we describe the use of a type 2 diabetes systems model to inform a No-Go decision prior to lead development for a potential GLP-1/GIP dual agonist program, enabling prioritization of exploratory programs with a higher probability of clinical success. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Tercero, Carlos; Ikeda, Seiichi; Fukuda, Toshio; Arai, Fumihito; Negoro, Makoto; Takahashi, Ikuo
2011-10-01
There is a need to develop quantitative evaluation for simulator-based training in medicine. Photoelastic stress analysis can be used in human tissue modeling materials; this enables the development of simulators that measure respect for tissue. To apply this to endovascular surgery, we first present a model of a saccular aneurysm in which stress variation during micro-coil deployment is measured, and then, relying on a bi-planar vision system, we measure a catheter trajectory and compare it to a reference trajectory with regard to respect for tissue. New photoelastic tissue modeling materials will expand the applications of this technology to other medical training domains.
Final Project Report - Revised Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prasad, Manika; Carolyn, Koh
An over-arching goal of this research is to calibrate geophysical techniques for hydrate exploration, evaluation, and production monitoring. Extensive field data on hydrate-bearing sediments exist, but quantitative estimates of the amount and distribution of hydrates are difficult to determine. Thus, the primary project objectives were to relate seismic and acoustic velocities and attenuations to hydrate saturation and texture. The project aimed to collect seismic properties along with other measurements (e.g., complex resistivity, micro-focus x-ray computed tomography, etc.). The multiphysics dataset would enable researchers to understand not only the interaction between mineral surfaces and gas hydrates, but also how the hydrate formation method affects the hydrate-sediment system in terms of elastic properties.
The role of surface vorticity during unsteady separation
NASA Astrophysics Data System (ADS)
Melius, Matthew S.; Mulleners, Karen; Cal, Raúl Bayoán
2018-04-01
Unsteady flow separation in rotationally augmented flow fields plays a significant role in a variety of fundamental flows. Through the use of time-resolved particle image velocimetry, vorticity accumulation and vortex shedding during unsteady separation over a three-dimensional airfoil are examined. The results of the study describe the critical role of surface vorticity accumulation during unsteady separation and reattachment. Through evaluation of the unsteady characteristics of the shear layer, it is demonstrated that the buildup and shedding of surface vorticity directly influence the dynamic changes of the separation point location. The quantitative characterization of surface vorticity and shear layer stability enables improved aerodynamic designs and has a broad impact within the field of unsteady fluid dynamics.
BATSE Gamma-Ray Burst Line Search. IV. Line Candidates from the Visual Search
NASA Astrophysics Data System (ADS)
Band, D. L.; Ryder, S.; Ford, L. A.; Matteson, J. L.; Palmer, D. M.; Teegarden, B. J.; Briggs, M. S.; Paciesas, W. S.; Pendleton, G. N.; Preece, R. D.
1996-02-01
We evaluate the significance of the line candidates identified by a visual search of burst spectra from BATSE's Spectroscopy Detectors. None of the candidates satisfy our detection criteria: an F-test probability less than 10⁻⁴ for a feature in one detector and consistency among the detectors that viewed the burst. Most of the candidates are not very significant and are likely to be fluctuations. Because of the expectation of finding absorption lines, the search was biased toward absorption features. We do not have a quantitative measure of the completeness of the search, which would enable a comparison with previous missions. Therefore, a more objective computerized search has begun.
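The detection criterion can be illustrated with a standard F-test for an added line component; the chi-squared values below are invented for illustration:

```python
# Hedged sketch: F-test screening of a candidate spectral line.
from scipy import stats

def line_f_test(chi2_cont, dof_cont, chi2_line, dof_line):
    delta_dof = dof_cont - dof_line                        # parameters added by the line
    f = ((chi2_cont - chi2_line) / delta_dof) / (chi2_line / dof_line)
    return stats.f.sf(f, delta_dof, dof_line)              # chance probability

p = line_f_test(chi2_cont=142.0, dof_cont=110, chi2_line=118.5, dof_line=107)
print(f"F-test probability: {p:.2e} -> {'candidate' if p < 1e-4 else 'fluctuation'}")
```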
Optimising sulfuric acid hard coat anodising for an Al-Mg-Si wrought aluminium alloy
NASA Astrophysics Data System (ADS)
Bartolo, N.; Sinagra, E.; Mallia, B.
2014-06-01
This research evaluates the effects of sulfuric acid hard coat anodising parameters, such as acid concentration, electrolyte temperature, current density and time, on the hardness and thickness of the resultant anodised layers. A small-scale anodising facility was designed and set up to enable experimental investigation of the anodising parameters. An experimental design using the Taguchi method was performed to optimise the parameters within an established operating window. Qualitative and quantitative characterisation of the resultant anodised layers was carried out. The anodised layers' thickness and morphology were determined using a light optical microscope (LOM) and a field emission gun scanning electron microscope (FEG-SEM). Hardness measurements were carried out using a nano hardness tester. Correlations between the various anodising parameters and their effects on the hardness and thickness of the anodised layers were established. Careful evaluation of these effects enabled optimum parameters to be determined using the Taguchi method and verified experimentally. Anodised layers with hardness between 2.4 and 5.2 GPa and thickness between 20 and 80 μm were produced. The Taguchi method was shown to be applicable to anodising, a finding that could facilitate ongoing and future research and development of anodising, which is attracting remarkable academic and industrial interest.
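A hedged sketch of the Taguchi bookkeeping: compute the larger-is-better signal-to-noise ratio for hardness in each run of an L9-style orthogonal array and average S/N by parameter level; the array assignment and hardness values are illustrative, not the study's measurements:

```python
# Hedged sketch: Taguchi larger-is-better S/N analysis over an L9 array.
import numpy as np

# columns: acid conc., temperature, current density, time (levels 0/1/2)
L9 = np.array([[0,0,0,0],[0,1,1,1],[0,2,2,2],
               [1,0,1,2],[1,1,2,0],[1,2,0,1],
               [2,0,2,1],[2,1,0,2],[2,2,1,0]])
hardness = np.array([[3.1,3.3],[4.0,4.2],[4.9,5.1],
                     [2.6,2.4],[3.8,3.6],[4.4,4.6],
                     [2.5,2.3],[3.4,3.2],[4.1,4.3]])   # GPa, two repeats per run

# larger-is-better: S/N = -10*log10(mean(1/y^2))
sn = -10 * np.log10(np.mean(1.0 / hardness**2, axis=1))

for p, name in enumerate(["acid", "temperature", "current", "time"]):
    means = [sn[L9[:, p] == lvl].mean() for lvl in range(3)]
    print(f"{name:12s} best level: {int(np.argmax(means))}  (S/N per level: "
          + ", ".join(f"{m:.2f}" for m in means) + ")")
```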
Infrared-optical transmission and reflection measurements on loose powders
NASA Astrophysics Data System (ADS)
Kuhn, J.; Korder, S.; Arduini-Schuster, M. C.; Caps, R.; Fricke, J.
1993-09-01
A method is described to determine quantitatively the infrared-optical properties of loose powder beds via directional-hemispherical transmission and reflection measurements. Instead of integrating the powders into a potassium bromide (KBr) or paraffin oil matrix, which would drastically alter the scattering behavior, the powders are placed onto supporting layers of polyethylene (PE) and KBr. A commercial spectrometer is supplemented by external optics, which enable measurements on horizontally arranged samples. For data evaluation we use a solution of the equation of radiative transfer in the 3-flux approximation under boundary conditions adapted to the PE or KBr/powder system. A comparison with Kubelka-Munk theory and Schuster's 2-flux approximation is performed, which shows that the 3-flux approximation yields results closest to the exact solution. Equations are developed which correct the transmission and reflection of the samples for the influence of the supporting layer and which calculate the specific extinction and the albedo of the powder, thus enabling the scattering and absorption parts of the extinction spectrum to be separated. Measurements on TiO2 powder are presented, which show the influence of preparation techniques and of data evaluation with different methods on the obtained albedo. The specific extinction of various TiO2 powders is presented.
NASA Astrophysics Data System (ADS)
Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi
2017-01-01
Workflow for remote sensing quantitative retrieval is the 'bridge' between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow hides the low-level implementation details of the Grid and hence enables users to focus on higher levels of the application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. Validation of workflows is important in order to support large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To verify the semantic correctness of user-defined workflows, in this paper we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with an ontology, and we construct the domain ontology with Protégé. Through an experimental study, we verify the validity of this method in two ways, namely data-source consistency error validation and parameter-matching error validation.
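To make the two checks concrete, here is a minimal sketch using plain Python dicts in place of the paper's Protégé/OWL ontology; the model names, parameter names, and data types are hypothetical.

```python
# Declared inputs/outputs of two hypothetical retrieval models, standing in
# for the domain ontology.
MODELS = {
    "AerosolRetrieval": {"inputs": {"radiance": "TOA_Radiance"},
                         "outputs": {"aot": "AerosolOpticalThickness"}},
    "SurfaceReflectance": {"inputs": {"radiance": "TOA_Radiance",
                                      "aot": "AerosolOpticalThickness"},
                           "outputs": {"reflectance": "SurfaceReflectance"}},
}

def validate_workflow(steps):
    """steps: list of (model_name, {input_param: supplied_data_type}).
    Flags data-source consistency errors (unknown parameters) and
    parameter-matching errors (type mismatch with the declared input)."""
    errors = []
    for model, bindings in steps:
        declared = MODELS[model]["inputs"]
        for param, supplied_type in bindings.items():
            if param not in declared:
                errors.append(f"{model}: unknown parameter '{param}'")
            elif declared[param] != supplied_type:
                errors.append(f"{model}: parameter '{param}' expects "
                              f"{declared[param]}, got {supplied_type}")
    return errors

print(validate_workflow([
    ("AerosolRetrieval", {"radiance": "TOA_Radiance"}),
    ("SurfaceReflectance", {"radiance": "TOA_Radiance", "aot": "WaterVapour"}),
]))  # -> one parameter-matching error
```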
Towards quantitative mass spectrometry-based metabolomics in microbial and mammalian systems.
Kapoore, Rahul Vijay; Vaidyanathan, Seetharaman
2016-10-28
Metabolome analyses are a suite of analytical approaches that enable us to capture changes in the metabolome (small molecular weight components, typically less than 1500 Da) in biological systems. Mass spectrometry (MS) has been widely used for this purpose. The key challenge is to capture changes in a reproducible and reliable manner that is representative of the events that take place in vivo. Typically, the analysis is carried out in vitro, by isolating the system and extracting the metabolome. MS-based approaches enable us to capture metabolomic changes with high sensitivity and resolution. When developing the technique for different biological systems, there are shared challenges as well as differences specific to the system under investigation. Here, we review some of the challenges in capturing quantitative changes in the metabolome with MS-based approaches, primarily in microbial and mammalian systems. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).
Attendance at NHS mandatory training sessions.
Brand, Darren
2015-02-17
To identify factors that affect NHS healthcare professionals' attendance at mandatory training sessions. A quantitative approach was used, with a questionnaire sent to 400 randomly selected participants. A total of 122 responses were received, providing a mix of qualitative and quantitative data. Quantitative data were analysed using statistical methods. Open-ended responses were reviewed using thematic analysis. Clinical staff value mandatory training sessions highly. They are aware of the requirement to keep practice up-to-date and ensure patient safety remains a priority. However, changes to the delivery format of mandatory training sessions are required to enable staff to participate more easily, as staff are often unable to attend. The delivery of mandatory training should move from classroom-based sessions into the clinical area to maximise participation. Delivery should be assisted by local 'experts' who are able to customise course content to meet local requirements and the requirements of different staff groups. Improved arrangements to provide staff cover, for those attending training, would enable more staff to attend training sessions.
Nordmeyer-Massner, Jurek A; Wyss, Michael; Andreisek, Gustav; Pruessmann, Klaas P; Hodler, Juerg
2011-03-01
To evaluate in vivo MR imaging of the wrist at 3.0 Tesla (T) and 7.0T quantitatively and qualitatively. To enable unbiased signal-to-noise ratio (SNR) comparisons, geometrically identical eight-channel receiver arrays were used at both field strengths. First, in vitro images of a phantom bottle were acquired at 3.0T and 7.0T to obtain an estimate of the maximum SNR gain that can be expected. MR images of the dominant wrist of 10 healthy volunteers were acquired at both field strengths. All measurements were done using the same sequence parameters. Quantitative SNR maps were calculated on a pixel-by-pixel basis and analyzed in several regions-of-interest. Furthermore, the images were qualitatively evaluated by two independent radiologists. The quantitative analysis showed SNR increases of up to 100% at 7.0T compared with 3.0T, with considerable variation between different anatomical structures. The qualitative analysis revealed no significant difference in the visualization of anatomical structures comparing 3.0T and 7.0T MR images (P>0.05). The presented results establish the SNR benefits of the transition from 3.0T to 7.0T for wrist imaging without bias by different array designs and based on exact, algebraic SNR quantification. The observed SNR increase nearly reaches expected values but varies greatly between different tissues. It does not necessarily improve the visibility of anatomic structures but adds valuable latitude for sequence optimization. Copyright © 2011 Wiley-Liss, Inc.
Validating internal controls for quantitative plant gene expression studies
Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H
2004-01-01
Background Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that expression levels of reference genes are adequately consistent among the samples used, nor do they compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Results Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Conclusion Our results indicate that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments. PMID:15317655
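The variety-stability idea carries over directly to code. The sketch below, on fabricated expression values, regresses each candidate gene on a per-sample "environmental index" (the mean of all candidates) in the spirit of Finlay-Wilkinson stability analysis and ranks genes by residual mean square; it illustrates the general approach, not the authors' exact estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical log2 expression of 4 candidate reference genes across 12
# samples (tissues/treatments); rows = genes, columns = samples.
expr = rng.normal(loc=[[10.0], [12.0], [8.0], [11.0]],
                  scale=[[0.1], [0.4], [0.8], [0.2]], size=(4, 12))

# Environmental index: mean expression of all candidates in each sample.
index = expr.mean(axis=0)

for g in range(expr.shape[0]):
    slope, intercept = np.polyfit(index, expr[g], 1)
    resid = expr[g] - (slope * index + intercept)
    # Candidates with small residual mean square behave most consistently
    # across conditions and are the better normalization references.
    print(f"gene {g}: slope = {slope:.2f}, residual MS = {resid.var(ddof=2):.3f}")
```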
NASA Astrophysics Data System (ADS)
Traganos, D.; Cerra, D.; Reinartz, P.
2017-05-01
Seagrasses are one of the most productive and widespread yet threatened coastal ecosystems on Earth. Despite their importance, they are declining due to various threats, which are mainly anthropogenic. Lack of data on their distribution hinders any effort to rectify this decline through effective detection, mapping and monitoring. Remote sensing can mitigate this data gap by allowing retrospective quantitative assessment of seagrass beds over large and remote areas. In this paper, we evaluate the quantitative application of Planet high-resolution imagery for the detection of seagrasses in the Thermaikos Gulf, NW Aegean Sea, Greece. The low signal-to-noise ratio (SNR) that characterizes spectral bands at shorter wavelengths prompts the application of unmixing-based denoising (UBD) as a pre-processing step for seagrass detection. A total of 15 spectral-temporal patterns are extracted from a Planet image time series to restore the corrupted blue and green bands in the processed Planet image. Subsequently, we implement Lyzenga's empirical water column correction and Support Vector Machines (SVM) to evaluate the quantitative benefits of denoising. Denoising aids detection of the Posidonia oceanica seagrass species by increasing its producer and user accuracy by 31.7 % and 10.4 %, respectively, and by raising its Kappa value from 0.3 to 0.48. In the near future, our objective is to improve accuracies in seagrass detection by applying more sophisticated, analytical water column correction algorithms to Planet imagery, developing time- and cost-effective monitoring of seagrass distribution that will in turn enable the effective management and conservation of these highly valuable and productive ecosystems.
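Lyzenga's empirical correction is compact enough to sketch. Assuming radiances over a uniform bottom (e.g., sand) at varying depth and deep-water radiances for the atmospheric/surface offset, the band-pair depth-invariant index can be computed as below; variable names are illustrative.

```python
import numpy as np

def depth_invariant_index(Li, Lj, deep_i, deep_j):
    """Lyzenga-style depth-invariant bottom index for one band pair.
    Li, Lj: radiance samples over a uniform bottom at varying depth
    (must exceed the deep-water radiances deep_i, deep_j)."""
    Xi = np.log(Li - deep_i)          # linearize the exponential depth effect
    Xj = np.log(Lj - deep_j)
    # Attenuation-coefficient ratio k_i/k_j from the variances/covariance
    # of the log-transformed bands over the variable-depth sand pixels.
    a = (np.var(Xi) - np.var(Xj)) / (2.0 * np.cov(Xi, Xj)[0, 1])
    ki_kj = a + np.sqrt(a**2 + 1.0)
    return Xi - ki_kj * Xj            # index independent of water depth
```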
Gregg, Chelsea L; Recknagel, Andrew K; Butcher, Jonathan T
2015-01-01
Tissue morphogenesis and embryonic development are dynamic events that are challenging to quantify, especially considering the intricate events that happen simultaneously in different locations and at different times. Micro- and, more recently, nano-computed tomography (micro/nanoCT) have been used for the past 15 years to characterize large 3D fields of tortuous geometries at high spatial resolution. We and others have advanced micro/nanoCT imaging strategies for quantifying tissue- and organ-level fate changes throughout morphogenesis. Exogenous soft tissue contrast media enable visualization of vascular lumens and tissues via extravasation. Furthermore, the emergence of antigen-specific tissue contrast enables direct quantitative visualization of protein and mRNA expression. Micro-CT X-ray doses appear to be non-embryotoxic, enabling longitudinal imaging studies in live embryos. In this chapter we present established soft tissue contrast protocols for obtaining high-quality micro/nanoCT images and the image processing techniques useful for quantifying anatomical and physiological information from the data sets.
Araújo, Thiago Antonio Sousa; Almeida, Alyson Luiz Santos; Melo, Joabe Gomes; Medeiros, Maria Franco Trindade; Ramos, Marcelo Alves; Silva, Rafael Ricardo Vasconcelos; Almeida, Cecília Fátima Castelo Branco Rangel; Albuquerque, Ulysses Paulino
2012-03-15
We propose a new quantitative measure that enables the researcher to make decisions and test hypotheses about the distribution of knowledge in a community and estimate the richness and sharing of information among informants. In our study, this measure has two levels of analysis: intracultural and intrafamily. Using data collected in northeastern Brazil, we evaluated how these new estimators of richness and sharing behave for different categories of use. We observed trends in the distribution of the characteristics of informants. We were also able to evaluate how outliers interfere with these analyses and how other analyses may be conducted using these indices, such as determining the distance between the knowledge of a community and that of experts, as well as exhibiting the importance of these individuals' communal information of biological resources. One of the primary applications of these indices is to supply the researcher with an objective tool to evaluate the scope and behavior of the collected data.
Quantitation of Staphylococcus aureus in Seawater Using CHROMagar™ SA
Pombo, David; Hui, Jennifer; Kurano, Michelle; Bankowski, Matthew J; Seifried, Steven E
2010-01-01
A microbiological algorithm has been developed to analyze beach water samples for the determination of viable colony forming units (CFU) of Staphylococcus aureus (S. aureus). Membrane filtration enumeration of S. aureus from recreational beach waters using the chromogenic media CHROMagar™SA alone yields a positive predictive value (PPV) of 70%. Presumptive CHROMagar™SA colonies were confirmed as S. aureus by 24-hour tube coagulase test. Combined, these two tests yield a PPV of 100%. This algorithm enables accurate quantitation of S. aureus in seawater in 72 hours and could support risk-prediction processes for recreational waters. A more rapid protocol, utilizing a 4-hour tube coagulase confirmatory test, enables a 48-hour turnaround time with a modest false negative rate of less than 10%. PMID:20222490
Azimzadeh, Omid; Scherthan, Harry; Yentrapalli, Ramesh; Barjaktarovic, Zarko; Ueffing, Marius; Conrad, Marcus; Neff, Frauke; Calzada-Wack, Julia; Aubele, Michaela; Buske, Christian; Atkinson, Michael J; Hauck, Stefanie M; Tapio, Soile
2012-04-18
Qualitative proteome profiling of formalin-fixed, paraffin-embedded (FFPE) tissue is advancing the field of clinical proteomics. However, quantitative proteome analysis of FFPE tissue is hampered by the lack of an efficient labelling method: conventional protein labelling on FFPE tissue has proven inefficient, because classical labelling targets lysine residues that are blocked by the formalin treatment. The aim of this study was to establish quantitative proteomics analysis of FFPE tissue by combining a label-free approach with optimised protein extraction and separation conditions. As a model system we used FFPE heart tissue of control and exposed C57BL/6 mice after total body irradiation with a gamma-ray dose of 3 Gy. We identified 32 deregulated proteins (p ≤ 0.05) in irradiated hearts 24 h after the exposure. The proteomics data were further evaluated and validated by bioinformatics analysis and immunoblotting. In good agreement with our previous results using fresh-frozen tissue, the analysis indicated radiation-induced alterations in three main biological pathways: respiratory chain, lipid metabolism and pyruvate metabolism. The label-free approach enables quantitative measurement of radiation-induced alterations in FFPE tissue and facilitates retrospective biomarker identification using clinical archives. Copyright © 2012 Elsevier B.V. All rights reserved.
Presley, Tennille; Kuppusamy, Periannan; Zweier, Jay L; Ilangovan, Govindasamy
2006-12-15
Electron paramagnetic resonance (EPR) oximetry is widely used to measure the oxygen consumption of cells, mitochondria, and submitochondrial particles. However, further improvement of this technique, in terms of data analysis, is required to use it as a quantitative tool. Here, we present a new approach for quantitative analysis of cellular respiration using EPR oximetry. The time course of oxygen consumption by cells in suspension has been observed to have three distinct zones: pO2-independent respiration at higher pO2, pO2-dependent respiration at low pO2, and a static equilibrium with no change in pO2 at very low pO2 values. The approach presented here enables one to analyze all three zones together, considering the progression of O2 diffusion zones around each cell, their overlap over time, and their potential impact on the measured pO2 data. The results agree with previously established methods such as high-resolution respirometry. Additionally, it is demonstrated how the diffusion limitations can depend on cell density and consumption rate. In conclusion, the new approach establishes a more accurate and meaningful model for evaluating EPR oximetry data on cellular respiration and quantifying the related parameters.
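The first two zones are commonly captured by Michaelis-Menten-type kinetics: consumption is nearly constant when pO2 >> Km and nearly first-order when pO2 << Km. A minimal sketch follows, with illustrative parameter values and a small constant supply term standing in for back-diffusion; this is not the authors' full diffusion-zone model.

```python
import numpy as np
from scipy.integrate import odeint

def dpO2_dt(p, t, Vmax=2.0, Km=2.0, supply=0.05):
    """Michaelis-Menten-type consumption plus a small constant supply term.
    For p >> Km the rate is ~ -Vmax (pO2-independent zone); for p << Km it is
    ~ -(Vmax/Km)*p (pO2-dependent zone); a static equilibrium is reached
    where supply balances consumption."""
    return -Vmax * p / (Km + p) + supply

t = np.linspace(0.0, 120.0, 1201)             # minutes
p = odeint(dpO2_dt, 150.0, t).ravel()         # start near air saturation (mmHg)
print(f"pO2 after 120 min: {p[-1]:.2f} mmHg")
```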
Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment
NASA Technical Reports Server (NTRS)
Ancel, Ersin; Shih, Ann T.
2014-01-01
NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.
Gokduman, Kurtulus; Avsaroglu, M Dilek; Cakiris, Aris; Ustek, Duran; Gurakan, G Candan
2016-03-01
The aim of the current study was to develop a new, rapid, sensitive and quantitative Salmonella detection method using a Real-Time PCR technique based on an inexpensive, easy-to-produce, convenient and standardized recombinant plasmid positive control. To achieve this, two recombinant plasmids were constructed as reference molecules by cloning the two most commonly used Salmonella-specific target gene regions, invA and ttrRSBC. The more rapid detection enabled by the developed method (21 h) compared with the traditional culture method (90 h) allows the quantitative evaluation of Salmonella (quantification limits of 10¹ CFU/ml and 10⁰ CFU/ml for the invA and ttrRSBC targets, respectively), as illustrated using milk samples. Three advantages illustrated by the current study demonstrate the potential of the newly developed method for routine analyses in the medical, veterinary, food and water/environmental sectors: (I) the method provides fast analyses, including the simultaneous detection and determination of correct pathogen counts; (II) the method is applicable to challenging samples, such as milk; (III) the method's positive controls (recombinant plasmids) are reproducible in large quantities without the need to construct new calibration curves. Copyright © 2016 Elsevier B.V. All rights reserved.
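Quantification against a plasmid positive control rests on a standard curve relating threshold cycle (Ct) to log copy number; a slope near -3.32 corresponds to roughly 100% amplification efficiency. A minimal sketch with made-up calibration values:

```python
import numpy as np

# Hypothetical calibration: Ct values measured for 10-fold dilutions of the
# recombinant-plasmid positive control (copies per reaction).
log10_copies = np.array([6, 5, 4, 3, 2, 1], dtype=float)
ct = np.array([15.1, 18.5, 21.9, 25.3, 28.6, 32.0])

slope, intercept = np.polyfit(log10_copies, ct, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0     # ~1.0 means doubling each cycle

def copies_from_ct(ct_sample):
    """Invert the standard curve to estimate copies per reaction."""
    return 10.0 ** ((ct_sample - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}, "
      f"unknown at Ct=24.0 -> {copies_from_ct(24.0):.0f} copies/reaction")
```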
NASA Astrophysics Data System (ADS)
Lee, Jae-Seung; Im, In-Chul; Kang, Su-Man; Goo, Eun-Hoe; Baek, Seong-Min
2013-11-01
The purpose of this study is to present a new method of quality assurance (QA) to enable effective evaluation of the accuracy of respiratory-gated radiotherapy (RGR). The method quantitatively analyzes the patient's respiratory cycle and respiration-induced tumor motion, reproduced in our in-house respiration-simulating phantom, and then comparatively analyzes dose distributions using the gamma-index method. We therefore designed a respiration-simulating phantom capable of reproducing the patient's respiratory cycle and respiration-induced tumor motion and evaluated the accuracy of RGR by estimating pass rates. We applied gamma-index passing criteria of 3% and 3 mm to the dose distribution calculated with the treatment planning system (TPS) and the actual dose distribution of RGR. The pass rate clearly increased as the gating width was narrowed. When respiration-induced tumor motion was 12 mm or less, pass rates of 85% and above were achieved for the 30-70% respiratory phase, and pass rates of 90% and above were achieved for the 40-60% respiratory phase. However, a respiratory cycle with a very small fluctuation range of pass rates failed to prove reliable in evaluating the accuracy of RGR. Therefore, accurate and reliable radiotherapy outcomes will be obtainable only by establishing a novel QA system combining the respiration-simulating phantom, gamma-index analysis, and a quantitative analysis of diaphragmatic motion that enables an indirect measurement of tumor motion.
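The 3%/3 mm criterion refers to the standard gamma evaluation: a reference point passes if some nearby evaluated point is simultaneously close in dose and in space. A minimal 1D sketch on toy profiles (the clinical analysis is 2D/3D, but the formula is the same):

```python
import numpy as np

def gamma_index_1d(x, dose_ref, dose_eval, dd=0.03, dta=3.0):
    """1D global gamma analysis. dd: dose criterion as a fraction of the
    reference maximum (3%); dta: distance-to-agreement in mm (3 mm).
    Returns gamma at each reference point; gamma <= 1 passes."""
    norm = dd * dose_ref.max()
    gammas = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dist2 = ((x - xi) / dta) ** 2
        dose2 = ((dose_eval - di) / norm) ** 2
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

x = np.linspace(-20, 20, 201)                  # mm
ref = np.exp(-x**2 / 80.0)                     # toy reference profile
ev = np.exp(-(x - 1.0)**2 / 80.0) * 1.02       # shifted, rescaled delivery
g = gamma_index_1d(x, ref, ev)
print(f"pass rate = {100.0 * (g <= 1.0).mean():.1f}%")
```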
Evaluating public awareness of new currency design features
NASA Astrophysics Data System (ADS)
DiNunzio, Lisa; Church, Sara E.
2002-04-01
One of the goals of the 1996 series design was to integrate highly recognizable features that enable the general public to more easily distinguish counterfeit from genuine notes, thereby reducing the chance of counterfeit notes being passed. The purpose of this study is to evaluate how knowledgeable the public is concerning the new currency, to identify the channels through which the public learns about new currency design, and to assess the usefulness of the new currency's authentication features. The study will also serve as a baseline measurement for future design studies and for comparative analyses with other countries. The results of the qualitative research are described in the following sections of this paper. The quantitative research is scheduled to begin in February 2002, at the same time as the Netherlands' opinion poll of the Euro and NLG notes, in an effort to compare results.
Imaging challenges in biomaterials and tissue engineering
Appel, Alyssa A.; Anastasio, Mark A.; Larson, Jeffery C.; Brey, Eric M.
2013-01-01
Biomaterials are employed in the fields of tissue engineering and regenerative medicine (TERM) in order to enhance the regeneration or replacement of tissue function and/or structure. The unique environments resulting from the presence of biomaterials, cells, and tissues pose distinct challenges with regard to monitoring and assessing the results of these interventions. Imaging technologies for three-dimensional (3D) analysis have been identified as a strategic priority in TERM research. Traditionally, histological and immunohistochemical techniques have been used to evaluate engineered tissues. However, these methods do not allow for accurate volume assessment, are invasive, and do not provide information on functional status. Imaging techniques are needed that enable non-destructive, longitudinal, quantitative, and three-dimensional analysis of TERM strategies. This review focuses on evaluating the application of available imaging modalities for assessment of biomaterials and tissue in TERM applications, including a discussion of the limitations of these techniques and identification of areas for further development. PMID:23768903
Stenberg, Nicola; Furness, Penny J
2017-03-01
The outcomes of self-management interventions are commonly assessed using quantitative measurement tools, and few studies ask people with long-term conditions to explain, in their own words, what aspects of the intervention they valued. In this Grounded Theory study, a Health Trainers service in the north of England was evaluated based on interviews with eight service-users. Open, focused, and theoretical coding led to the development of a preliminary model explaining participants' experiences and perceived impact of the service. The model reflects the findings that living well with a long-term condition encompassed social connectedness, changed identities, acceptance, and self-care. Health trainers performed four related roles that were perceived to contribute to these outcomes: conceptualizer, connector, coach, and champion. The evaluation contributes a grounded theoretical understanding of a personalized self-management intervention that emphasizes the benefits of a holistic approach to enable cognitive, behavioral, emotional, and social adjustments.
Actinic imaging and evaluation of phase structures on EUV lithography masks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mochi, Iacopo; Goldberg, Kenneth; Huh, Sungmin
2010-09-28
The authors describe the implementation of a phase-retrieval algorithm to reconstruct the phase and complex amplitude of structures on EUV lithography masks. Many native defects commonly found on EUV reticles are difficult to detect and review accurately because they have a strong phase component. Understanding the complex amplitude of mask features is essential for predictive modeling of defect printability and defect repair. Besides printing in a stepper, the most accurate way to characterize such defects is with actinic inspection, performed at the design (EUV) wavelength. Phase defects and phase structures show a distinct through-focus behavior that enables qualitative evaluation of the object phase from two or more high-resolution intensity measurements. For the first time, the phase of structures and defects on EUV masks was quantitatively reconstructed from aerial image measurements, using a modified version of a phase-retrieval algorithm developed to test optical phase-shifting reticles.
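Recovering phase from two or more intensity measurements is the territory of iterative projection algorithms. The classic two-plane Gerchberg-Saxton iteration below is a generic sketch of the idea, not the authors' modified through-focus variant: measured amplitudes are enforced alternately in two Fourier-conjugate planes until the phase estimate stabilizes.

```python
import numpy as np

def gerchberg_saxton(amp_obj, amp_focal, n_iter=200, seed=0):
    """Two-plane Gerchberg-Saxton: find a phase consistent with measured
    amplitudes in the object plane and in a Fourier-conjugate (focal) plane."""
    rng = np.random.default_rng(seed)
    field = amp_obj * np.exp(1j * rng.uniform(0, 2 * np.pi, amp_obj.shape))
    for _ in range(n_iter):
        F = np.fft.fft2(field)
        F = amp_focal * np.exp(1j * np.angle(F))        # impose focal amplitude
        field = np.fft.ifft2(F)
        field = amp_obj * np.exp(1j * np.angle(field))  # impose object amplitude
    return np.angle(field)

# Toy usage: amplitudes generated from a known phase object, then recovered
# (up to the usual phase-retrieval ambiguities).
x = np.linspace(-1, 1, 64)
true_phase = np.pi * np.exp(-(x[None, :]**2 + x[:, None]**2) * 4.0)
amp_obj = np.ones((64, 64))
amp_focal = np.abs(np.fft.fft2(amp_obj * np.exp(1j * true_phase)))
est_phase = gerchberg_saxton(amp_obj, amp_focal)
```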
NASA Astrophysics Data System (ADS)
Hosoda, Masaki; Wang, Jing; Tsikudi, Diane; Nadkarni, Seemantini
2016-02-01
Acute myocardial infarction is frequently caused by the rupture of coronary plaques with severely compromised viscoelastic properties. We have developed a new optical technology termed intravascular laser speckle imaging (ILSI) that evaluates plaque viscoelastic properties by measuring the time scale (time constant, τ) of temporally evolving laser speckle fluctuations. To enable coronary evaluation in vivo, an optical ILSI catheter has been developed that accomplishes omni-directional illumination and viewing of the entire coronary circumference without the need for mechanical rotation. Here, we describe the capability of ILSI for evaluating human coronary atherosclerosis in cadaveric hearts. ILSI was conducted in conjunction with optical coherence tomography (OCT) imaging in five human cadaveric hearts. The left coronary artery (LCA), left anterior descending (LAD), left circumflex artery (LCx), and right coronary artery (RCA) segments were resected and secured on custom-developed coronary holders to enable accurate co-registration between ILSI, OCT, and histopathology. Speckle time constants, τ, calculated for each ILSI section were compared with lipid and collagen content measured from quantitative histopathological analysis of the corresponding Oil Red O and Picrosirius Red stained sections. Because the presence of low-viscosity lipid elicits rapid speckle fluctuations, we observed an inverse correlation between τ measured by ILSI and lipid content (R = -0.64, p < 0.05). In contrast, the higher viscoelastic modulus of fibrous regions resulted in a positive correlation between τ and collagen content (R = 0.54, p < 0.05). These results demonstrate the feasibility of ILSI evaluation of arterial mechanical properties using a miniaturized omni-directional catheter.
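The time constant τ is typically extracted from the normalized intensity autocorrelation g2 of the speckle time series. A minimal sketch follows, fitting a single-exponential decay to synthetic g2 values; the Siegert-type functional form and all parameters are illustrative assumptions, not the authors' calibrated model.

```python
import numpy as np
from scipy.optimize import curve_fit

def g2(intensity, max_lag):
    """Normalized intensity autocorrelation g2(lag) of a speckle time series."""
    m2 = intensity.mean() ** 2
    return np.array([np.mean(intensity[: len(intensity) - lag] * intensity[lag:]) / m2
                     for lag in range(max_lag)])

def g2_model(lag, tau, beta):
    """Single-exponential speckle decorrelation (assumed Siegert-type form)."""
    return 1.0 + beta * np.exp(-2.0 * lag / tau)

# Fit tau on synthetic g2 data with a little noise:
lags = np.arange(50, dtype=float)
g2_synth = g2_model(lags, tau=12.0, beta=0.4) \
           + 0.005 * np.random.default_rng(1).normal(size=lags.size)
(tau_fit, beta_fit), _ = curve_fit(g2_model, lags, g2_synth, p0=[5.0, 0.5])
print(f"fitted tau = {tau_fit:.1f} frames")
```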
Comparison of salivary collection and processing methods for quantitative HHV-8 detection.
Speicher, D J; Johnson, N W
2014-10-01
Saliva is a proven diagnostic fluid for the qualitative detection of infectious agents, but the accuracy of viral load determinations is unknown. Stabilising fluids impede nucleic acid degradation compared with collection onto ice and then freezing, and we have shown that the DNA Genotek P-021 prototype kit (P-021) can produce high-quality DNA after 14 months of storage at room temperature. Here we evaluate the quantitative capability of 10 collection/processing methods. Unstimulated whole mouth fluid was spiked with a mixture of HHV-8 cloned constructs, 10-fold serial dilutions were produced, and samples were extracted and then examined with quantitative PCR (qPCR). Calibration curves were compared by linear regression and qPCR dynamics. All methods extracted with commercial spin columns produced linear calibration curves with a large dynamic range and gave accurate viral loads. Ethanol precipitation of the P-021 does not produce a linear standard curve, and virus is lost in the cell pellet. DNA extractions from the P-021 using commercial spin columns produced linear standard curves with a wide dynamic range and an excellent limit of detection. When extracted with spin columns, the P-021 enables accurate viral loads down to 23 copies μl⁻¹ of DNA. The quantitative and long-term storage capability of this system makes it ideal for the study of salivary DNA viruses in resource-poor settings. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Choi, Ra-Young; Lee, Chang-Hee; Jun, Chul-Ho
2018-05-18
A methallylsilane coupling reagent containing both an N-hydroxysuccinimidyl (NHS) ester group and a UV/vis-absorbing azobenzene linker undergoes acid-catalyzed immobilization on silica. Analysis of the UV/vis absorption band associated with the azobenzene group in the adduct enables facile quantitative determination of the extent of loading of the NHS groups. Reaction of the NHS groups on the silica surface with amine groups of GOx and rhodamine can be employed to generate enzyme- or dye-immobilized silica for quantitative analysis.
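Quantifying loading from an absorption band is a direct Beer-Lambert calculation. A minimal sketch, in which the molar absorptivity, path length, and sample numbers are hypothetical placeholders rather than values from the paper:

```python
# Beer-Lambert estimate of NHS-group loading from the azobenzene band.
EPSILON = 2.5e4      # L mol^-1 cm^-1, assumed molar absorptivity of the band
PATH_CM = 1.0        # cuvette path length, cm

def loading_umol_per_g(absorbance, volume_L, silica_mass_g):
    """Chromophore loading (micromol per gram of silica) from the measured
    absorbance of a known sample volume and silica mass."""
    conc_mol_per_L = absorbance / (EPSILON * PATH_CM)
    return conc_mol_per_L * volume_L / silica_mass_g * 1e6

# Hypothetical sample: A = 0.42 in 10 mL from 50 mg of functionalized silica.
print(f"{loading_umol_per_g(0.42, 0.010, 0.050):.2f} umol/g")
```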
A quantitative reconstruction software suite for SPECT imaging
NASA Astrophysics Data System (ADS)
Namías, Mauro; Jeraj, Robert
2017-11-01
Quantitative Single Photon Emission Computed Tomography (SPECT) imaging allows for measurement of activity concentrations of a given radiotracer in vivo. Although SPECT has usually been perceived as non-quantitative by the medical community, the introduction of accurate CT-based attenuation correction and scatter correction in hybrid SPECT/CT scanners has enabled SPECT systems to be as quantitative as Positron Emission Tomography (PET) systems. We implemented a software suite to reconstruct quantitative SPECT images from hybrid or dedicated SPECT systems with a separate CT scanner. Attenuation, scatter and collimator response corrections were included in an Ordered Subset Expectation Maximization (OSEM) algorithm. A novel scatter fraction estimation technique was introduced. The SPECT/CT system was calibrated with a cylindrical phantom, and quantitative accuracy was assessed with an anthropomorphic phantom and a NEMA/IEC image quality phantom. Accurate activity measurements were achieved at an organ level. This software suite helps increase the quantitative accuracy of SPECT scanners.
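At the core of OSEM is the multiplicative expectation-maximization update; OSEM simply cycles it over ordered subsets of the projections to accelerate convergence. A minimal MLEM sketch on a dense toy system matrix follows; in a real implementation, the attenuation, scatter and collimator-response corrections mentioned above are folded into the system model.

```python
import numpy as np

def mlem(A, proj, n_iter=100):
    """Maximum-likelihood EM reconstruction. A: (n_bins, n_voxels) system
    matrix; proj: measured projection data. OSEM applies this same update
    restricted to ordered subsets of the rows of A in turn."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                         # back-projection of ones
    for _ in range(n_iter):
        expected = A @ x                         # forward projection
        x *= (A.T @ (proj / np.maximum(expected, 1e-12))) / np.maximum(sens, 1e-12)
    return x

# Toy self-consistency check on noiseless data:
rng = np.random.default_rng(0)
A = rng.random((80, 20))
truth = rng.random(20) * 5.0
rec = mlem(A, A @ truth, n_iter=2000)
print(f"max relative error: {np.abs(rec - truth).max() / truth.max():.3f}")
```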
Coy, Heidi; Young, Jonathan R; Douek, Michael L; Brown, Matthew S; Sayre, James; Raman, Steven S
2017-07-01
To evaluate the performance of a novel quantitative computer-aided diagnostic (CAD) algorithm on four-phase multidetector computed tomography (MDCT), detecting peak lesion attenuation to enable differentiation of clear cell renal cell carcinoma (ccRCC) from chromophobe RCC (chRCC), papillary RCC (pRCC), oncocytoma, and fat-poor angiomyolipoma (fp-AML). We queried our clinical databases to obtain a cohort of histologically proven renal masses with preoperative MDCT in four phases [unenhanced (U), corticomedullary (CM), nephrographic (NP), and excretory (E)]. A whole-lesion 3D contour was obtained in all four phases. The CAD algorithm determined a region of interest (ROI) of peak lesion attenuation within the 3D lesion contour. For comparison, a manual ROI was separately placed in the most enhancing portion of the lesion by visual inspection as a reference standard, and in uninvolved renal cortex. Relative lesion attenuation for both the CAD and manual methods was obtained by normalizing the CAD peak lesion attenuation ROI (and the reference-standard manually placed ROI) to uninvolved renal cortex with the formula [(peak lesion attenuation ROI - cortex ROI)/cortex ROI] × 100%. ROC analysis and area under the curve (AUC) were used to assess diagnostic performance. Bland-Altman analysis was used to compare peak ROIs between the CAD and manual methods. The study cohort comprised 200 patients with 200 unique renal masses: 106 (53%) ccRCCs, 32 (16%) oncocytomas, 18 (9%) chRCCs, 34 (17%) pRCCs, and 10 (5%) fp-AMLs. In the CM phase, the CAD-derived ROI enabled characterization of ccRCC from chRCC, pRCC, oncocytoma, and fp-AML with AUCs of 0.850 (95% CI 0.732-0.968), 0.959 (95% CI 0.930-0.989), 0.792 (95% CI 0.716-0.869), and 0.825 (95% CI 0.703-0.948), respectively. On Bland-Altman analysis, there was excellent agreement between the CAD and manual methods, with mean differences between 14 and 26 HU in each phase. A novel quantitative CAD algorithm enabled robust peak HU lesion detection and discrimination of ccRCC from other renal lesions, with performance similar to the manual method.
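The normalization is a one-liner worth stating explicitly; the sketch below implements the formula quoted above, with hypothetical input values.

```python
def relative_attenuation(peak_roi_hu, cortex_roi_hu):
    """Relative lesion attenuation (%), normalizing the peak-lesion ROI
    to uninvolved renal cortex: (peak - cortex) / cortex * 100."""
    return (peak_roi_hu - cortex_roi_hu) / cortex_roi_hu * 100.0

# Hypothetical corticomedullary-phase values (HU):
print(f"{relative_attenuation(180.0, 210.0):.1f}%")   # -> -14.3%
```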
Alborz, Alison; McNally, Rosalind
2004-12-01
To develop methods to facilitate the 'systematic' review of evidence from a range of methodologies on diffuse or 'soft' topics, as exemplified by 'access to health care'. Twenty-eight bibliographic databases, research registers, organizational websites or library catalogues were searched, along with reference lists from identified studies, contact with experts and service users, and current awareness and contents-alerting services in the area of learning disabilities. Inclusion criteria were English-language literature from 1980 onwards, relating to people with learning disabilities of any age, and all study designs. The main criterion for assessment was relevance to Gulliford's model of access to health care, which was adapted to the circumstances of people with learning disabilities. Selected studies were evaluated for scientific rigour, then data were extracted and the results synthesized. Quality assessment was by an initial set of 'generic' quality indicators. This enabled further evidence selection before evaluation of findings according to specific criteria for qualitative, quantitative or mixed-method studies. Eighty-two studies were fully evaluated. Five studies were rated 'highly rigorous', 22 'rigorous' and 46 'less rigorous'; nine 'poor' papers were retained as the sole evidence covering aspects of the guiding model. The majority of studies were quantitative but used only descriptive statistics. Most evidence lacked methodological detail, which often lowered final quality ratings. The application of a consistent structure to quality evaluation can facilitate data appraisal, extraction and synthesis across a range of methodologies in diffuse or 'soft' topics. Synthesis can be facilitated further by using software, such as a Microsoft Access database, for managing information.
Human figure drawings in the evaluation of severe adolescent suicidal behavior.
Zalsman, G; Netanel, R; Fischel, T; Freudenstein, O; Landau, E; Orbach, I; Weizman, A; Pfeffer, C R; Apter, A
2000-08-01
To evaluate the reliability of using certain indicators derived from human figure drawings to distinguish between suicidal and nonsuicidal adolescents. Ninety consecutive admissions to an adolescent inpatient unit were assessed. Thirty-nine patients were admitted because of suicidal behavior and 51 for other reasons. All subjects were given the Human Figure Drawing (HFD) test. HFDs were evaluated according to the method of Pfeffer and Richman, and the degree of suicidal behavior was rated by the Child Suicide Potential Scale. The internal reliability was satisfactory. HFD indicators correlated significantly with quantitative measures of suicidal behavior; among these indicators, the evaluator's overall impression in particular enabled prediction of suicidal behavior and distinction between suicidal and nonsuicidal inpatients (p < .001). A group of graphic indicators derived from a discriminant analysis formed a function that correctly identified 84.6% of the suicidal and 76.6% of the nonsuicidal adolescents. Many of the items had a regressive quality. The HFD is an example of a simple projective test that may have empirical reliability and may be useful for the assessment of severe suicidal behavior in adolescents.
Nanoscale Magnetism in Next Generation Magnetic Nanoparticles
2018-03-17
... dextran-coated SPIONs were studied. From the measured T1 and T2 relaxation times, a new method called Quantitative Ultra-Short Time-to-Echo ... angiograms with high clarity and definition, and enabled quantitative MRI in biological samples. At UCL, the work included (i) fabricating multi-element ... Compared to flat biosensor devices, 3D engineered biosensors achieve more intimate and conformal interfaces with cells.
Melzer, Nina; Wittenburg, Dörte; Repsilber, Dirk
2013-01-01
In this study the benefit of metabolome-level analysis for the prediction of genetic values for three traditional milk traits was investigated. Our proposed approach consists of three steps. First, milk metabolite profiles are used to predict three traditional milk traits of 1,305 Holstein cows; two regression methods, both enabling variable selection, are applied to identify important milk metabolites in this step. Second, the prediction of these important milk metabolites from single nucleotide polymorphisms (SNPs) enables the detection of SNPs with significant genetic effects. Finally, these SNPs are used to predict the milk traits. The observed precision of predicted genetic values was compared with the results of the classical genotype-phenotype prediction using all SNPs or a reduced SNP subset (reduced classical approach). To enable a comparison between SNP subsets, a special invariable evaluation design was implemented. SNPs close to or within known quantitative trait loci (QTL) were determined, which enabled us to assess whether the detected important SNP subsets were enriched in these regions. The results show that our approach can achieve genetic value prediction while requiring less than 1% of the total number of 40,317 SNPs. Moreover, significantly more important SNPs in known QTL regions were detected using our approach than with the reduced classical approach. In conclusion, our approach allows a deeper insight into the associations between the different levels of the genotype-phenotype map (genotype-metabolome, metabolome-phenotype, genotype-phenotype). PMID:23990900
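The first step, variable-selection regression from metabolite profiles to a milk trait, can be sketched with any sparse regression. Below, cross-validated lasso on fabricated stand-in data; the study's two actual methods are not named here, and the metabolite dimension and effect sizes are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)

# Stand-in data: 1,305 cows x 190 hypothetical milk metabolites, with a
# trait driven by a handful of them.
X = rng.normal(size=(1305, 190))
beta = np.zeros(190)
beta[[3, 17, 42]] = [0.8, -0.5, 0.6]
y = X @ beta + rng.normal(scale=0.5, size=1305)

# Cross-validated lasso selects a sparse subset of "important" metabolites.
model = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(model.coef_)
print("selected metabolite indices:", selected)
```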
Grzanka, Dariusz; Styczyński, Jan; Debski, Robert; Krenska, Anna; Pacholska, Małgorzata; Prokurat, Andrzej I; Wysocki, Mariusz; Marszałek, Andrzej
2008-01-01
Pathology diagnosis of chronic graft-versus-host disease (GVHD) after allogeneic haematopoietic stem cell transplantation (allo-HSCT) is an important issue in clinical follow-up, in spite of frequent difficulties in interpretation related to the dynamic changes occurring in the skin during the disease, as well as to sequelae of the underlying disease and immunosuppressive therapy. The recently presented consensus of the NIH (National Institutes of Health, Bethesda, USA) on histopathologic (HP) analysis is still complex and intrinsically divergent, and thus clinically difficult to implement. Here we analyse the clinical value of the histological evaluation of skin biopsies in children after allo-HSCT and its correlation with clinical status. Ten skin biopsies were taken from 7 patients (4 boys, 3 girls, age 3-15 years) after allo-HSCT (6 MFD, 1 MMUD) and analyzed after haematoxylin/eosin and immunohistochemical (CD3, CD45T, CD20) staining. Pathology analysis was based on commonly accepted criteria enabling simple and unambiguous interpretation. Results were compared with clinical data and indications for immunosuppressive therapy. It was found that reliable and coherent interpretation can be made when the following parameters are taken into account: (1) in the epithelium: the presence of apoptosis, archetypical changes and vacuolar degeneration in the basilar layer, and the presence of CD3/CD45 cells in the epidermis; (2) in the dermis: the extent of collagenization and the presence of melanophages and lymphocyte infiltrations; (3) in the eccrine gland epithelium: eccrine gland atrophy and the presence of lymphocytes. A new scoring system for skin biopsy analysis in patients with chronic GVHD, based on the modified NIH consensus, was proposed, and the preliminary clinical value of the histological results was assessed. Skin biopsy evaluation based on limited qualitative and quantitative analysis of lymphocyte infiltrates, together with studies of the intensity of apoptosis, collagenization and archetypical changes, is a valuable diagnostic method complementary to clinical records, enabling easier therapeutic decision-making.
Boersema, Paul J.; Foong, Leong Yan; Ding, Vanessa M. Y.; Lemeer, Simone; van Breukelen, Bas; Philp, Robin; Boekhorst, Jos; Snel, Berend; den Hertog, Jeroen; Choo, Andre B. H.; Heck, Albert J. R.
2010-01-01
Several mass spectrometry-based assays have emerged for the quantitative profiling of cellular tyrosine phosphorylation. Ideally, these methods should reveal the exact sites of tyrosine phosphorylation, be quantitative, and not be cost-prohibitive. The latter is often an issue, as typically several milligrams of (stable isotope-labeled) starting protein material are required to enable the detection of low abundance phosphotyrosine peptides. Here, we adopted and refined a peptide-centric immunoaffinity purification approach for the quantitative analysis of tyrosine phosphorylation by combining it with a cost-effective stable isotope dimethyl labeling method. Using just two LC-MS/MS runs, we were able to identify by mass spectrometry more than 1100 unique non-redundant phosphopeptides in HeLa cells from about 4 mg of starting material without requiring any further affinity enrichment, as close to 80% of the identified peptides were tyrosine-phosphorylated peptides. Stable isotope dimethyl labeling could be incorporated prior to the immunoaffinity purification, even for the large quantities (mg) of peptide material used, enabling the quantification of differences in tyrosine phosphorylation upon pervanadate treatment or epidermal growth factor stimulation. Analysis of the epidermal growth factor-stimulated HeLa cells, a frequently used model system for tyrosine phosphorylation, resulted in the quantification of 73 regulated unique phosphotyrosine peptides. The quantitative data were found to be exceptionally consistent with the literature, evidencing that such a targeted quantitative phosphoproteomics approach can provide reproducible results. In general, the combination of immunoaffinity purification of tyrosine-phosphorylated peptides with large-scale stable isotope dimethyl labeling provides a cost-effective approach that can alleviate variation in sample preparation and analysis, as samples can be combined early on. Using this approach, a rather complete qualitative and quantitative picture of tyrosine phosphorylation signaling events can be generated. PMID:19770167
Automated Tracking of Quantitative Assessments of Tumor Burden in Clinical Trials
Rubin, Daniel L; Willrett, Debra; O'Connor, Martin J; Hage, Cleber; Kurtz, Camille; Moreira, Dilvan A
2014-01-01
There are two key challenges hindering effective use of quantitative assessment of imaging in cancer response assessment: 1) Radiologists usually describe the cancer lesions in imaging studies subjectively and sometimes ambiguously, and 2) it is difficult to repurpose imaging data, because lesion measurements are not recorded in a format that permits machine interpretation and interoperability. We have developed a freely available software platform on the basis of open standards, the electronic Physician Annotation Device (ePAD), to tackle these challenges in two ways. First, ePAD facilitates the radiologist in carrying out cancer lesion measurements as part of routine clinical trial image interpretation workflow. Second, ePAD records all image measurements and annotations in a data format that permits repurposing image data for analyses of alternative imaging biomarkers of treatment response. To determine the impact of ePAD on radiologist efficiency in quantitative assessment of imaging studies, a radiologist evaluated computed tomography (CT) imaging studies from 20 subjects having one baseline and three consecutive follow-up imaging studies with and without ePAD. The radiologist made measurements of target lesions in each imaging study using Response Evaluation Criteria in Solid Tumors 1.1 criteria, initially with the aid of ePAD, and then after a 30-day washout period, the exams were reread without ePAD. The mean total time required to review the images and summarize measurements of target lesions was 15% (P < .039) shorter using ePAD than without using this tool. In addition, it was possible to rapidly reanalyze the images to explore lesion cross-sectional area as an alternative imaging biomarker to linear measure. We conclude that ePAD appears promising to potentially improve reader efficiency for quantitative assessment of CT examinations, and it may enable discovery of future novel image-based biomarkers of cancer treatment response. PMID:24772204
Li, Ning; Ziegemeier, Daisy; Bass, Laura; Wang, Wei
2008-12-15
In this study, size exclusion high performance liquid chromatography was evaluated for its application in the separation and quantitation of free polyethylene glycol (PEG) and its PEGylated protein conjugate (PEG-conjugate). Although the large mass of the free PEG (2-fold greater than that of the protein) made separation difficult, chromatographic conditions were identified enabling resolution and quantitation of the free PEG, PEG-conjugate and non-PEGylated protein with Shodex Protein KW803 and KW804 columns in series and refractive index detection. Optimum resolutions of 1.7 and 2.0 were achieved for the free PEG and PEG-conjugate, and for the free PEG and non-PEGylated protein, using 20 mM HEPES buffer at pH 6.5. Under this condition, the plot of log10 MW of all the pertinent analytes against retention time was linear with a correlation coefficient of 1. A limited assay performance evaluation demonstrated that the method was linear in the concentration range of 10 to 250 μg/mL of free PEG, with correlation coefficients ≥ 0.99. When free PEG in this concentration range was spiked into PEG-conjugate samples at 1 mg/mL, the recovery was in the range of 78-120%. Detection and quantitation limits were determined to be 10 and 25 μg/mL for free PEG, respectively. The R.S.D. for intra- and inter-day precision was 0.09% or less for retention time measurements and 2.9% or less for area count measurements. Robustness testing was performed by deliberately deviating ±0.2 pH units from the target pH and by increasing the flow rate. These deviations had no significant impact on the area percent distribution of all species. However, separation was found to be sensitive to high ionic strength and buffer species.
NASA Astrophysics Data System (ADS)
Karakatsanis, Nicolas A.; Rahmim, Arman
2014-03-01
Graphical analysis is employed in the research setting to provide quantitative estimation of PET tracer kinetics from dynamic images at a single bed. Recently, we proposed a multi-bed dynamic acquisition framework enabling clinically feasible whole-body parametric PET imaging by employing post-reconstruction parameter estimation. In addition, by incorporating linear Patlak modeling within the system matrix, we enabled direct 4D reconstruction in order to effectively circumvent noise amplification in dynamic whole-body imaging. However, direct 4D Patlak reconstruction exhibits relatively slow convergence due to the presence of non-sparse spatial correlations in temporal kinetic analysis. In addition, the standard Patlak model does not account for reversible uptake, thus underestimating the influx rate Ki. We have developed a novel whole-body PET parametric reconstruction framework in the STIR platform, a widely employed open-source reconstruction toolkit, a) enabling accelerated convergence of direct 4D multi-bed reconstruction, by employing a nested algorithm to decouple the temporal parameter estimation from the spatial image update process, and b) enhancing the quantitative performance particularly in regions with reversible uptake, by pursuing a non-linear generalized Patlak 4D nested reconstruction algorithm. A set of published kinetic parameters and the XCAT phantom were employed for the simulation of dynamic multi-bed acquisitions. Quantitative analysis on the Ki images demonstrated considerable acceleration in the convergence of the nested 4D whole-body Patlak algorithm. In addition, our simulated and patient whole-body data in the post-reconstruction domain indicated the quantitative benefits of our extended generalized Patlak 4D nested reconstruction for tumor diagnosis and treatment response monitoring.
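For reference, the two kinetic models at issue can be written compactly. A sketch of the standard and generalized Patlak forms, with assumed notation: C_T tissue activity, C_p plasma input, K_i influx rate, V the combined blood/reversible distribution volume, and k_loss an efflux rate restoring sensitivity to reversible uptake.

```latex
% Standard Patlak (irreversible uptake), linear in "Patlak time":
C_T(t) = K_i \int_0^t C_p(\tau)\,d\tau + V\,C_p(t)

% Generalized Patlak with an efflux (loss) rate k_{loss},
% accounting for reversible uptake:
C_T(t) = K_i \int_0^t C_p(\tau)\,e^{-k_{loss}(t-\tau)}\,d\tau + V\,C_p(t)
```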
The role of quantitative safety evaluation in regulatory decision making of drugs.
Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat
2016-01-01
Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.
Analytic Guided-Search Model of Human Performance Accuracy in Target- Localization Search Tasks
NASA Technical Reports Server (NTRS)
Eckstein, Miguel P.; Beutter, Brent R.; Stone, Leland S.
2000-01-01
Current models of human visual search have extended the traditional serial/parallel search dichotomy. Two successful models for predicting human visual search are the Guided Search model and the Signal Detection Theory model. Although these models are inherently different, it has been difficult to compare them because the Guided Search model is designed to predict response time, while Signal Detection Theory models are designed to predict performance accuracy. Moreover, current implementations of the Guided Search model require the use of Monte Carlo simulations, a method that makes fitting the model's performance quantitatively to human data more computationally time-consuming. We have extended the Guided Search model to predict human accuracy in target-localization search tasks. We have also developed analytic expressions that simplify simulation of the model to the evaluation of a small set of equations using only three free parameters. This new implementation and extension of the Guided Search model will enable direct quantitative comparisons with human performance in target-localization search experiments and with the predictions of Signal Detection Theory and other search accuracy models.
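The Signal Detection Theory baseline against which such models are compared has a closed-form accuracy for localizing one target among M statistically independent locations under a maximum-response rule; a minimal numeric sketch follows. The Guided Search extension would additionally weight locations by guidance, which is not modeled here.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def p_correct_localization(d_prime, n_locations):
    """SDT accuracy for localizing one target among n_locations with
    independent unit-variance Gaussian responses (max-observer rule):
    Pc = Integral of phi(x - d') * Phi(x)^(n_locations - 1) dx."""
    integrand = lambda x: norm.pdf(x - d_prime) * norm.cdf(x) ** (n_locations - 1)
    val, _ = quad(integrand, -10.0, 10.0 + d_prime)
    return val

# Example: target of strength d' = 1.5 among 8 possible locations.
print(f"P(correct) = {p_correct_localization(1.5, 8):.3f}")
```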
Lee, Heow Peuh; Gordon, Bruce R.
2012-01-01
During the past decades, numerous computational fluid dynamics (CFD) studies have simulated airflow in human nasal models constructed from CT or MRI images. Compared with rhinomanometry and acoustic rhinometry, which provide quantitative information only on nasal airflow, resistance, and cross-sectional areas, CFD enables additional measurements of the airflow passing through the nasal cavity that help visualize the physiologic impact of alterations in intranasal structures. It therefore becomes possible to quantitatively measure, and visually appreciate, the airflow pattern (laminar or turbulent), velocity, pressure, wall shear stress, particle deposition, and temperature changes at different flow rates in different parts of the nasal cavity. The effects of both existing anatomical factors and post-operative changes can be assessed. With recent improvements in CFD technology and computing power, there is a promising future for CFD to become a useful tool in planning, predicting, and evaluating the outcomes of nasal surgery. This review discusses the possibilities and potential impacts, as well as the technical limitations, of using CFD simulation to better understand nasal airflow physiology. PMID:23205221
Quantitative NDE of Composite Structures at NASA
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott; Leckey, Cara A. C.; Howell, Patricia A.; Johnston, Patrick H.; Burke, Eric R.; Zalameda, Joseph N.; Winfree, William P.; Seebo, Jeffery P.
2015-01-01
The use of composite materials continues to increase in the aerospace community due to the potential benefits of reduced weight, increased strength, and manufacturability. Ongoing work at NASA involves the use of large-scale composite structures for spacecraft (payload shrouds, cryotanks, crew modules, etc.). NASA is also working to enable the use and certification of composites in aircraft structures through the Advanced Composites Project (ACP). The rapid, in situ characterization of a wide range of composite materials and structures has become a critical concern for the industry, and in many applications it is necessary to monitor changes in these materials over a long time. The quantitative characterization of composite defects such as fiber waviness, reduced bond strength, delamination damage, and microcracking is of particular interest. The research approaches of NASA's Nondestructive Evaluation Sciences Branch include investigation of conventional, guided wave, and phase-sensitive ultrasonic methods, infrared thermography, and X-ray computed tomography techniques. The use of simulation tools for optimizing and developing these methods is also an active area of research. This paper focuses on current research activities related to large-area NDE for rapidly characterizing aerospace composites.
Tsang, Vic Wing-Hang; Lei, Ngai-Yu; Lam, Michael Hon-Wah
2009-10-01
A mild, low-temperature analytical approach based on sonication-assisted extraction coupled with HPLC electrospray ionization triple quadrupole tandem mass spectrometry has been developed for the simultaneous qualitative and quantitative determination of four Irgarol-related s-triazine species, namely Irgarol-1051, M1, M2 and M3, in coastal sediments and green-lipped mussel samples. Mild extraction conditions were necessary to preserve the thermally unstable M2. The multiple reaction monitoring (MRM) mode of detection by ESI-MS/MS enabled reliable qualitative identification and sensitive quantitative determination of those s-triazines. The method was applied to evaluate the degree of Irgarol-1051 contamination in the sediments and biota of the coastal environment of Hong Kong, one of the busiest maritime ports in the world. All four s-triazine species were observed in all of the samples. This is the first time that the newly identified M2 and M3 have been detected in coastal sediments and biota tissues.
A workload model and measures for computer performance evaluation
NASA Technical Reports Server (NTRS)
Kerner, H.; Kuemmerle, K.
1972-01-01
A generalized workload definition is presented which constructs measurable workloads of unit size from workload elements, called elementary processes. An elementary process makes almost exclusive use of one of the processors, CPU, I/O processor, etc., and is measured by the cost of its execution. Various kinds of user programs can be simulated by quantitative composition of elementary processes into a type. The character of the type is defined by the weights of its elementary processes, and its structure by the amount and sequence of transitions between its elementary processes. A set of types is batched into a mix. Mixes of identical cost are considered equivalent amounts of workload. These formalized descriptions of workloads allow investigators to compare the results of different studies quantitatively. Since workloads of different composition are assigned a unit of cost, these descriptions enable determination of the cost-effectiveness of different workloads on a machine. Subsequently, performance parameters such as throughput rate, gain factor, and internal and external delay factors are defined and used to demonstrate the effects of various workload attributes on the performance of a selected large-scale computer system.
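The composition scheme maps naturally onto a small data structure. A hypothetical sketch, with names and cost figures invented for illustration: elementary processes carry unit costs, a type weights them, and a mix batches types so that mixes of equal total cost represent equivalent workloads.

```python
from dataclasses import dataclass

# Cost per invocation of each elementary process (illustrative units);
# each elementary process uses (almost) exclusively one processor.
ELEMENTARY_COSTS = {"cpu": 1.0, "io": 2.5}

@dataclass
class WorkloadType:
    name: str
    weights: dict  # elementary process -> relative weight (sums to 1)

    def cost(self, total_invocations):
        """Total execution cost of this type for a given invocation count."""
        return sum(w * total_invocations * ELEMENTARY_COSTS[p]
                   for p, w in self.weights.items())

# A mix batches types; mixes of identical total cost count as equivalent
# amounts of workload, enabling quantitative cross-study comparison.
mix = [WorkloadType("compute_bound", {"cpu": 0.9, "io": 0.1}),
       WorkloadType("io_bound", {"cpu": 0.3, "io": 0.7})]
print([round(t.cost(1000), 1) for t in mix])
```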
Quantitative aspects of the clinical performance of transverse tripolar spinal cord stimulation.
Wesselink, W A; Holsheimer, J; King, G W; Torgerson, N A; Boom, H B
1999-01-01
A multicenter study was initiated to evaluate the performance of the transverse tripolar system for spinal cord stimulation. Computer modeling had predicted steering of paresthesia with a dual-channel stimulator to be the main benefit of the system. The quantitative analysis presented here includes the results of 484 tests in 30 patients. For each test, paresthesia coverage as a function of voltage levels was stored in a computerized database, including a body map which enabled calculation of the degree of paresthesia coverage of separate body areas, as well as the overlap with the painful areas. The results show that with the transverse tripolar system steering of the paresthesia is possible, although optimal steering requires proper placement of the electrode with respect to the spinal cord. Therefore, given this steering ability as well as a larger therapeutic stimulation window compared with conventional systems, we expect an increase in the long-term efficacy of spinal cord stimulation. Moreover, in view of the stimulation-induced paresthesia patterns, the system allows selective stimulation of the medial dorsal columns.
Carvalho, J J; Jerónimo, P C A; Gonçalves, C; Alpendurada, M F
2008-11-01
European Council Directive 98/83/EC on the quality of water intended for human consumption brought a new challenge for water-quality control routine laboratories, mainly on pesticides analysis. Under the guidelines of ISO/IEC 17025:2005, a multiresidue method was developed, validated, implemented in routine, and studied with real samples during a one-year period. The proposed method enables routine laboratories to handle a large number of samples, since 28 pesticides of 14 different chemical groups can be quantitated in a single procedure. The method comprises a solid-phase extraction step and subsequent analysis by liquid chromatography-mass spectrometry (LC-MS-MS). The accuracy was established on the basis of participation in interlaboratory proficiency tests, with encouraging results (majority |z-score| <2), and the precision was consistently analysed over one year. The limits of quantitation (below 0.050 microg L(-1)) are in agreement with the enforced threshold value for pesticides of 0.10 microg L(-1). Overall method performance is suitable for routine use according to accreditation rules, taking into account the data collected over one year.
NASA Astrophysics Data System (ADS)
Möller, M.; Diesner, M.; Manhart, A.; Küppers, P.; Spieth-Achtnich, A.; Pistner, C.
2014-08-01
In the study presented here, qualitative and quantitative life-cycle considerations were employed to assess the potential material and energy savings that might be achieved through nano-enabled applications. Ten nanotechnology application fields with broad market coverage and immediate impact on either the generation of renewable energies or the use of critical resources were analyzed. Organic photovoltaic modules (solar cells that essentially consist of organic materials) and electronically dimmable windows (electrochromic laminated glass, which can be adjusted to conform to the ambient light conditions), as two very promising nano-enabled applications, were quantitatively analyzed. Eight further products, including neodymium magnets, were evaluated on a qualitative basis. All assessments contain classical indicators such as energy efficiency, product carbon footprint, and resource consumption. In addition, pollutant aspects (exposure and toxicology) as well as other sustainability aspects (such as user benefits) were taken into account in the framework of a so-called "hot spot analysis". Furthermore, drivers behind the innovation as well as associated rebound effects were identified. The results highlight the importance of product-specific analyses based on a life-cycle thinking approach.
Radiation exposure in X-ray-based imaging techniques used in osteoporosis
Adams, Judith E.; Guglielmi, Giuseppe; Link, Thomas M.
2010-01-01
Recent advances in medical X-ray imaging have enabled the development of new techniques capable of assessing not only bone quantity but also structure. This article provides (a) a brief review of the current X-ray methods used for quantitative assessment of the skeleton, (b) data on the levels of radiation exposure associated with these methods and (c) information about radiation safety issues. Radiation doses associated with dual-energy X-ray absorptiometry are very low. However, as with any X-ray imaging technique, each particular examination must always be clinically justified. When an examination is justified, the emphasis must be on dose optimisation of imaging protocols. Dose optimisation is more important for paediatric examinations because children are more vulnerable to radiation than adults. Methods based on multi-detector CT (MDCT) are associated with higher radiation doses. New 3D volumetric hip and spine quantitative computed tomography (QCT) techniques and high-resolution MDCT for evaluation of bone structure deliver doses to patients from 1 to 3 mSv. Low-dose protocols are needed to reduce radiation exposure from these methods and minimise associated health risks. PMID:20559834
Mazurek, Artur; Jamroz, Jerzy
2015-04-15
In food analysis, a method for determination of vitamin C should enable measurement of the total content of ascorbic acid (AA) and dehydroascorbic acid (DHAA), because both chemical forms exhibit biological activity. The aim of the work was to confirm the applicability of an HPLC-DAD method for analysis of the total content of vitamin C (TC) and ascorbic acid in various types of food by determination of validation parameters such as selectivity, precision, accuracy, linearity, and limits of detection and quantitation. The results showed that the method applied for determination of TC and AA was selective, linear and precise. Precision of DHAA determination by the subtraction method was also evaluated. It was revealed that the results of DHAA determination obtained by the subtraction method were not precise, which follows directly from the assumption of this method and the principles of uncertainty propagation. The proposed chromatographic method can be recommended for routine determinations of total vitamin C in various foods. Copyright © 2014 Elsevier Ltd. All rights reserved.
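A one-line derivation makes the reported imprecision of the subtraction method concrete. The relation below is the standard propagation of uncorrelated uncertainties, which we assume applies here; it is not quoted from the paper.

```latex
\mathrm{DHAA} = \mathrm{TC} - \mathrm{AA}, \qquad
u(\mathrm{DHAA}) = \sqrt{u(\mathrm{TC})^{2} + u(\mathrm{AA})^{2}}
```

When TC and AA are large and nearly equal, the absolute uncertainty of the difference stays roughly constant while the difference itself is small, so the relative uncertainty u(DHAA)/DHAA becomes large, exactly the behaviour reported above.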
Arab, Lenore; Khan, Faraz; Lam, Helen
2013-01-01
A systematic literature review of human studies relating caffeine or caffeine-rich beverages to cognitive decline reveals only 6 studies that have collected and analyzed cognition data in a prospective fashion that enables study of decline across the spectrum of cognition. These 6 studies, in general, evaluate cognitive function using the Mini Mental State Exam and base their beverage data on food-frequency questionnaires (FFQs). Studies included in our review differed in their source populations, duration of study, and most dramatically in how their analyses were done, disallowing direct quantitative comparisons of their effect estimates. Only one of the studies reported on all 3 exposures, coffee, tea, and caffeine, making comparisons of findings across studies more difficult. However, in general, it can be stated that for all studies of tea and most studies of coffee and caffeine, the estimates of cognitive decline were lower among consumers, although there is a lack of a distinct dose response. Only a few measures showed quantitative significance and, interestingly, studies indicate a stronger effect among women than men. PMID:23319129
Characterization of Colloidal Quantum Dot Ligand Exchange by X-ray Photoelectron Spectroscopy
NASA Astrophysics Data System (ADS)
Atewologun, Ayomide; Ge, Wangyao; Stiff-Roberts, Adrienne D.
2013-05-01
Colloidal quantum dots (CQDs) are chemically synthesized semiconductor nanoparticles with size-dependent wavelength tunability. Chemical synthesis of CQDs involves the attachment of long organic surface ligands to prevent aggregation; however, these ligands also impede charge transport. Therefore, it is beneficial to exchange longer surface ligands for shorter ones for optoelectronic devices. Typical characterization techniques used to analyze surface ligand exchange include Fourier-transform infrared spectroscopy, x-ray diffraction, transmission electron microscopy, and nuclear magnetic resonance spectroscopy, yet these techniques do not provide a simultaneously direct, quantitative, and sensitive method for evaluating surface ligands on CQDs. In contrast, x-ray photoelectron spectroscopy (XPS) can provide nanoscale sensitivity for quantitative analysis of CQD surface ligand exchange. A unique aspect of this work is that a fingerprint is identified for shorter surface ligands by resolving the regional XPS spectrum corresponding to different types of carbon bonds. In addition, a deposition technique known as resonant infrared matrix-assisted pulsed laser evaporation is used to improve the CQD film uniformity such that stronger XPS signals are obtained, enabling more accurate analysis of the ligand exchange process.
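The bond-resolved fingerprinting described above is typically implemented as least-squares peak fitting of the C 1s region. Below is a minimal sketch with synthetic data and assumed binding energies (roughly 284.8 eV for aliphatic C-C and 288.5 eV for carboxyl-type carbon); none of it comes from the paper.

```python
# Illustrative only: least-squares decomposition of a C 1s region into two
# bond-specific Gaussian components; binding energies are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(x, a1, mu1, s1, a2, mu2, s2):
    g = lambda a, mu, s: a * np.exp(-(x - mu) ** 2 / (2 * s ** 2))
    return g(a1, mu1, s1) + g(a2, mu2, s2)

be = np.linspace(282.0, 292.0, 400)                  # binding energy axis (eV)
spectrum = two_gaussians(be, 1.0, 284.8, 0.6, 0.4, 288.5, 0.8)

params, _ = curve_fit(two_gaussians, be, spectrum,
                      p0=[1.0, 285.0, 0.5, 0.5, 288.0, 0.5])
# relative component areas then act as the ligand-exchange "fingerprint"
a1, _, s1, a2, _, s2 = params
print(a1 * s1 / (a1 * s1 + a2 * s2))
```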
Friedman, David B
2012-01-01
All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable the distinction between a relevant biological signal from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
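A minimal sketch of the kind of multivariate check described above, assuming a replicates-by-proteins abundance matrix; the data are synthetic and the use of scikit-learn is our choice, not the chapter's.

```python
# Minimal sketch (not the authors' pipeline): PCA on a samples x proteins
# abundance matrix to separate biological signal from technical noise.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
abundances = rng.normal(size=(12, 500))   # 12 biological replicates, 500 proteins
abundances[:6] += 0.8                     # condition A shifted relative to B

scores = PCA(n_components=2).fit_transform(abundances)
# Replicates from the same condition should cluster in the score plot;
# isolated points suggest outliers or fouled samples, as discussed above.
print(scores.round(2))
```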
Specificity and non-specificity in RNA–protein interactions
Jankowsky, Eckhard; Harris, Michael E.
2016-01-01
Gene expression is regulated by complex networks of interactions between RNAs and proteins. Proteins that interact with RNA have been traditionally viewed as either specific or non-specific; specific proteins interact preferentially with defined RNA sequence or structure motifs, whereas non-specific proteins interact with RNA sites devoid of such characteristics. Recent studies indicate that the binary "specific vs. non-specific" classification is insufficient to describe the full spectrum of RNA–protein interactions. Here, we review new methods that enable quantitative measurements of protein binding to large numbers of RNA variants, and the concepts aimed at describing the resulting binding spectra: affinity distributions, comprehensive binding models and free energy landscapes. We discuss how these new methodologies and associated concepts enable work towards inclusive, quantitative models for specific and non-specific RNA–protein interactions. PMID:26285679
Krappmann, Michael; de Boer, Arjen R; Kool, Daniël R W; Irth, Hubertus; Letzel, Thomas
2016-04-30
Continuous-flow reaction detection systems, which monitor enzymatic reactions with mass spectrometry (MS), have so far lacked quantitative readouts. Therefore, two independent internal standards (IS) were implemented such that online system stability can be observed, quantitative conversion values for substrate and product can be obtained, and the standards can serve as mass calibration references for high MS accuracy. An application previously developed for the MS detection of peptide phosphorylation by cAMP-dependent protein kinase A (PKA) (De Boer et al., Anal. Bioanal. Chem. 2005, 381, 647-655) was transferred to a continuous-flow reaction detection system. This enzymatic reaction, involving enzyme activation as well as the transfer of a phosphate group from ATP to a peptide substrate, was used to prove the compatibility of a quantitative enzymatic assay with a continuous-flow real-time system connected to MS. Moreover, using the internal standards, the critical parameter of reaction temperature (including temperature-dependent variations in solution density) was studied in the continuous-flow mixing system. Furthermore, two substrates (malantide and kemptide), two enzyme types (the catalytic subunit of PKA and complete PKA) and one inhibitor were tested to determine system robustness and long-term availability. Even spraying solutions that contained a significant amount of MS contaminants (e.g. the polluted catalytic subunit) resulted in quantifiable MS signal intensities. Subsequent recalculations using the internal standards led to results demonstrating the power of this application. The presented methodology and the data evaluation with the available Achroma freeware enable the direct coupling of biochemical assays with quantitative MS detection. Changes in parameters such as temperature, reaction time, inhibition, or compound concentrations can be monitored quantitatively, and thus enzymatic activity can be calculated. Copyright © 2016 John Wiley & Sons, Ltd.
SIMULTANEOUS MULTISLICE MAGNETIC RESONANCE FINGERPRINTING WITH LOW-RANK AND SUBSPACE MODELING
Zhao, Bo; Bilgic, Berkin; Adalsteinsson, Elfar; Griswold, Mark A.; Wald, Lawrence L.; Setsompop, Kawin
2018-01-01
Magnetic resonance fingerprinting (MRF) is a new quantitative imaging paradigm that enables simultaneous acquisition of multiple magnetic resonance tissue parameters (e.g., T1, T2, and spin density). Recently, MRF has been integrated with simultaneous multislice (SMS) acquisitions to enable volumetric imaging with faster scan time. In this paper, we present a new image reconstruction method based on low-rank and subspace modeling for improved SMS-MRF. Here the low-rank model exploits strong spatiotemporal correlation among contrast-weighted images, while the subspace model captures the temporal evolution of magnetization dynamics. With the proposed model, the image reconstruction problem is formulated as a convex optimization problem, for which we develop an algorithm based on variable splitting and the alternating direction method of multipliers. The performance of the proposed method has been evaluated by numerical experiments, and the results demonstrate that the proposed method leads to improved accuracy over the conventional approach. Practically, the proposed method has a potential to allow for a 3x speedup with minimal reconstruction error, resulting in less than 5 sec imaging time per slice. PMID:29060594
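In generic notation (ours, and necessarily simplified relative to the paper), the subspace model factors the Casorati matrix of contrast-weighted images as X = UΦ, where Φ is a temporal basis precomputed from the magnetization dynamics and U holds the spatial coefficients, so the reconstruction can be posed roughly as:

```latex
\hat{U} \;=\; \arg\min_{U}\ \bigl\| d - A\!\left(U\,\Phi\right) \bigr\|_{2}^{2},
\qquad \hat{X} = \hat{U}\,\Phi
```

Here A is the SMS encoding operator and d the measured k-space data; variable splitting and ADMM, as named above, then alternate between enforcing data consistency and the low-rank/subspace constraint.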
The accurate assessment of small-angle X-ray scattering data
Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; ...
2015-01-23
Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.
Park, Ilwoo; Hu, Simon; Bok, Robert; Ozawa, Tomoko; Ito, Motokazu; Mukherjee, Joydeep; Phillips, Joanna J.; James, C. David; Pieper, Russell O.; Ronen, Sabrina M.; Vigneron, Daniel B.; Nelson, Sarah J.
2013-01-01
High resolution compressed sensing hyperpolarized 13C magnetic resonance spectroscopic imaging was applied in orthotopic human glioblastoma xenografts for quantitative assessment of spatial variations in 13C metabolic profiles and comparison with histopathology. A new compressed sensing sampling design with a factor of 3.72 acceleration was implemented to enable a factor of 4 increase in spatial resolution. Compressed sensing 3D 13C magnetic resonance spectroscopic imaging data were acquired from a phantom and 10 tumor-bearing rats following injection of hyperpolarized [1-13C]-pyruvate using a 3T scanner. The 13C metabolic profiles were compared with hematoxylin and eosin staining and carbonic anhydrase 9 staining. The high-resolution compressed sensing 13C magnetic resonance spectroscopic imaging data enabled the differentiation of distinct 13C metabolite patterns within abnormal tissues with high specificity in similar scan times compared to the fully sampled method. The results from pathology confirmed the different characteristics of 13C metabolic profiles between viable, non-necrotic, nonhypoxic tumor, and necrotic, hypoxic tissue. PMID:22851374
NASA Astrophysics Data System (ADS)
Lin, Xianke; Lu, Wei
2017-07-01
This paper proposes a model that enables consideration of the realistic anisotropic environment surrounding an active material particle by incorporating both diffusion and migration of lithium ions and electrons in the particle. This model makes it possible to quantitatively evaluate effects such as fracture on capacity degradation. In contrast, the conventional model assumes an isotropic environment and considers only diffusion in the active particle, which cannot capture the effect of fracture since it would predict results contradictory to experimental observations. With the developed model we have investigated the effects of active material electronic conductivity, particle size, and State of Charge (SOC) swing window when fracture exists. The study shows that the low electronic conductivity of the active material has a significant impact on the lithium ion pattern. Fracture increases the resistance to electron transport and therefore reduces lithium intercalation/deintercalation. Particle size plays an important role in lithium ion transport. A smaller particle size is preferable for mitigating capacity loss when fracture occurs. The study also shows that operating at high SOC reduces the impact of fracture.
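For reference, the combination of diffusion and migration named above is conventionally written as a Nernst-Planck flux; this is the textbook form in standard notation, not necessarily the paper's exact formulation:

```latex
\mathbf{N}_{i} \;=\; -\,D_{i}\,\nabla c_{i}\;-\;\frac{z_{i}F}{RT}\,D_{i}\,c_{i}\,\nabla\phi
```

with D_i the diffusivity, c_i the concentration, z_i the charge number, F the Faraday constant, and φ the electric potential.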
Multi-Wavelength Photomagnetic Imaging for Oral Cancer
NASA Astrophysics Data System (ADS)
Marks, Michael
In this study, a multi-wavelength Photomagnetic Imaging (PMI) system is developed and evaluated with experimental studies. PMI measures temperature increases in samples illuminated by near-infrared light sources using magnetic resonance thermometry. A multiphysics solver combining light and heat transfer models the spatiotemporal distribution of the temperature change. The PMI system developed in this work uses three lasers of varying wavelength (785 nm, 808 nm, 860 nm) to heat the sample. By using multiple wavelengths, we enable the PMI system to quantify the relative concentrations of optical contrast agents in turbid media and monitor their distribution at a higher resolution than conventional diffuse optical imaging. Data collected from agarose phantoms with multiple embedded contrast agents designed to simulate the optical properties of oxy- and deoxy-hemoglobin are presented. The reconstructed images demonstrate that multi-wavelength PMI can resolve this complex inclusion structure with high resolution and recover the concentration of each contrast agent with high quantitative accuracy. The modified multi-wavelength PMI system operates under the maximum skin exposure limits defined by the American National Standards Institute, to enable future clinical applications.
Luciw, Paul A; Oslund, Karen L; Yang, Xiao-Wei; Adamson, Lourdes; Ravindran, Resmi; Canfield, Don R; Tarara, Ross; Hirst, Linda; Christensen, Miles; Lerche, Nicholas W; Offenstein, Heather; Lewinsohn, David; Ventimiglia, Frank; Brignolo, Laurie; Wisner, Erik R; Hyde, Dallas M
2011-11-01
Infection with Mycobacterium tuberculosis primarily produces a multifocal distribution of pulmonary granulomas in which the pathogen resides. Accordingly, quantitative assessment of the bacterial load and pathology is a substantial challenge in tuberculosis. Such assessments are critical for studies of the pathogenesis and for the development of vaccines and drugs in animal models of experimental M. tuberculosis infection. Stereology enables unbiased quantitation of three-dimensional objects from two-dimensional sections and thus is suited to quantify histological lesions. We have developed a protocol for stereological analysis of the lung in rhesus macaques inoculated with a pathogenic clinical strain of M. tuberculosis (Erdman strain). These animals exhibit a pattern of infection and tuberculosis similar to that of naturally infected humans. Conditions were optimized for collecting lung samples in a nonbiased, random manner. Bacterial load in these samples was assessed by a standard plating assay, and granulomas were graded and enumerated microscopically. Stereological analysis provided quantitative data that supported a significant correlation between bacterial load and lung granulomas. Thus this stereological approach enables a quantitative, statistically valid analysis of the impact of M. tuberculosis infection in the lung and will serve as an essential tool for objectively comparing the efficacy of drugs and vaccines.
Using Fault Trees to Advance Understanding of Diagnostic Errors.
Rogith, Deevakar; Iyengar, M Sriram; Singh, Hardeep
2017-11-01
Diagnostic errors annually affect at least 5% of adults in the outpatient setting in the United States. Formal analytic techniques are only infrequently used to understand them, in part because of the complexity of diagnostic processes and clinical work flows involved. In this article, diagnostic errors were modeled using fault tree analysis (FTA), a form of root cause analysis that has been successfully used in other high-complexity, high-risk contexts. How factors contributing to diagnostic errors can be systematically modeled by FTA to inform error understanding and error prevention is demonstrated. A team of three experts reviewed 10 published cases of diagnostic error and constructed fault trees. The fault trees were modeled according to currently available conceptual frameworks characterizing diagnostic error. The 10 trees were then synthesized into a single fault tree to identify common contributing factors and pathways leading to diagnostic error. FTA is a visual, structured, deductive approach that depicts the temporal sequence of events and their interactions in a formal logical hierarchy. The visual FTA enables easier understanding of causative processes and cognitive and system factors, as well as rapid identification of common pathways and interactions in a unified fashion. In addition, it enables calculation of empirical estimates for causative pathways. Thus, fault trees might provide a useful framework for both quantitative and qualitative analysis of diagnostic errors. Future directions include establishing validity and reliability by modeling a wider range of error cases, conducting quantitative evaluations, and undertaking deeper exploration of other FTA capabilities. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.
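To make the "empirical estimates for causative pathways" concrete, here is a toy sketch with entirely hypothetical probabilities and gate structure, showing how a top-event probability is computed from basic events under an independence assumption:

```python
# Illustrative sketch (hypothetical numbers): probability of a top event
# ("diagnostic error") from basic-event probabilities through OR/AND gates,
# assuming independent events.
def p_or(*ps):   # at least one contributing event occurs
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):  # all events on the pathway must occur
    out = 1.0
    for p in ps:
        out *= p
    return out

p_history_missed = 0.10          # assumed basic-event probabilities
p_test_not_ordered = 0.05
p_result_not_followed_up = 0.02

p_top = p_or(p_and(p_history_missed, p_test_not_ordered),
             p_result_not_followed_up)
print(f"{p_top:.4f}")
```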
NASA Astrophysics Data System (ADS)
Barroso Peña, Alvaro; Grüner, Malte; Forbes, Taylor; Denz, Cornelia; Strassert, Cristian A.
2016-09-01
Antimicrobial Photodynamic Inactivation (PDI) represents an attractive alternative in the treatment of infections by antibiotic-resistant pathogenic bacteria. In PDI, a photosensitizer (PS) is administered to the site of the biological target in order to generate cytotoxic singlet oxygen, which reacts with the biological membrane upon application of harmless visible light. Established methods for testing the photoinduced cytotoxicity of PSs rely on the observation of the whole bacterial ensemble, providing only population-averaged information about the overall toxicity produced. However, for a deeper understanding of the processes that take place in PDI, new methods are required that provide simultaneous regulation of ROS production, monitoring of the subsequent damage induced in the bacterial cells, and full control of the distance between the bacteria and the center of singlet oxygen production. Herein we present a novel method that enables quantitative, spatially and temporally resolved analysis at the single-cell level of the photoinduced damage produced by transparent microspheres functionalized with PSs. For this purpose, a methodology was introduced to monitor phototriggered changes with spatiotemporal resolution employing holographic optical tweezers and functional fluorescence microscopy. The defined distance between the photoactive particles and individual bacteria can be fixed under the microscope before the photosensitization process, and the photoinduced damage is monitored by tracing the fluorescence turn-on of a suitable marker. Our methodology constitutes a new tool for the in vitro design and analysis of photosensitizers, as it enables a quantitative evaluation of the response of living systems towards oxidative stress.
Four simple rules that are sufficient to generate the mammalian blastocyst
Nissen, Silas Boye; Perera, Marta; Gonzalez, Javier Martin; Morgani, Sophie M.; Jensen, Mogens H.; Sneppen, Kim; Brickman, Joshua M.
2017-01-01
Early mammalian development is both highly regulative and self-organizing. It involves the interplay of cell position, predetermined gene regulatory networks, and environmental interactions to generate the physical arrangement of the blastocyst with precise timing. However, this process occurs in the absence of maternal information and in the presence of transcriptional stochasticity. How does the preimplantation embryo ensure robust, reproducible development in this context? It utilizes a versatile toolbox that includes complex intracellular networks coupled to cell-cell communication, segregation by differential adhesion, and apoptosis. Here, we ask whether a minimal set of developmental rules based on this toolbox is sufficient for successful blastocyst development, and to what extent these rules can explain mutant and experimental phenotypes. We implemented experimentally reported mechanisms for polarity, cell-cell signaling, adhesion, and apoptosis as a set of developmental rules in an agent-based in silico model of physically interacting cells. We find that this model quantitatively reproduces specific mutant phenotypes and provides an explanation for the emergence of heterogeneity without requiring any initial transcriptional variation. It also suggests that a fixed time point for the cells' competence of fibroblast growth factor (FGF)/extracellular signal-regulated kinase (ERK) sets an embryonic clock that enables certain scaling phenomena, a concept that we evaluate quantitatively by manipulating embryos in vitro. Based on these observations, we conclude that the minimal set of rules enables the embryo to experiment with stochastic gene expression and could provide the robustness necessary for the evolutionary diversification of the preimplantation gene regulatory network. PMID:28700688
Registration of 3D spectral OCT volumes combining ICP with a graph-based approach
NASA Astrophysics Data System (ADS)
Niemeijer, Meindert; Lee, Kyungmoo; Garvin, Mona K.; Abràmoff, Michael D.; Sonka, Milan
2012-02-01
The introduction of spectral Optical Coherence Tomography (OCT) scanners has enabled acquisition of high resolution, 3D cross-sectional volumetric images of the retina. 3D-OCT is used to detect and manage eye diseases such as glaucoma and age-related macular degeneration. To follow up patients over time, image registration is a vital tool to enable more precise, quantitative comparison of disease states. In this work we present a 3D registration method based on a two-step approach. In the first step we register both scans in the XY domain using an Iterative Closest Point (ICP) based algorithm. This algorithm is applied to vessel segmentations obtained from the projection image of each scan. The distance minimized in the ICP algorithm includes measurements of the vessel orientation and vessel width to allow for a more robust match. In the second step, a graph-based method is applied to find the optimal translation along the depth axis of the individual A-scans in the volume to match both scans. The cost image used to construct the graph is based on the mean squared error (MSE) between matching A-scans in both images at different translations. We have applied this method to the registration of Optic Nerve Head (ONH) centered 3D-OCT scans of the same patient. First, 10 3D-OCT scans of 5 eyes with glaucoma imaged in vivo were registered for a qualitative evaluation of the algorithm performance. Then, 17 OCT data set pairs of 17 eyes with known deformation were used for quantitative assessment of the method's robustness.
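As a rough sketch of the second step, our simplified reading of the cost image: for each matched A-scan pair, the cost of a candidate axial shift is the MSE between the overlapping portions of the shifted A-scans. Function and variable names are ours.

```python
# Our simplified reading of the second-stage cost, not the authors' code.
import numpy as np

def axial_cost_image(vol_a, vol_b, max_shift):
    # vol_a, vol_b: (n_ascans, depth) arrays of A-scans after XY alignment;
    # max_shift is assumed small relative to depth
    n, d = vol_a.shape
    shifts = range(-max_shift, max_shift + 1)
    cost = np.empty((n, len(shifts)))
    for j, s in enumerate(shifts):
        for i in range(n):
            a = vol_a[i, max(0, s): d + min(0, s)]
            b = vol_b[i, max(0, -s): d - max(0, s)]
            cost[i, j] = np.mean((a - b) ** 2)
    return cost  # a graph search then picks a smooth path of shifts
```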
NASA Astrophysics Data System (ADS)
Mooser, Matthias; Burri, Christian; Stoller, Markus; Luggen, David; Peyer, Michael; Arnold, Patrik; Meier, Christoph; Považay, Boris
2017-07-01
Ocular optical coherence tomography in the wavelength ranges of 850 and 1060 nm has been integrated with a confocal scanning laser ophthalmoscope eye-tracker as a clinical commercial-class system. Collinear optics enables an exact overlap of the different channels to produce precisely overlapping depth-scans for evaluating the similarities and differences between the wavelengths to extract additional physiologic information. A reliable segmentation algorithm utilizing graph cuts has been implemented and applied to automatically extract retinal and choroidal shape in cross-sections and volumes. The device has been tested in healthy subjects and in eyes with pathologies, including a cross-sectional and longitudinal study of myopia progression and control with a duplicate instrument in Asian children.
Evaluation of galectin binding by frontal affinity chromatography (FAC).
Iwaki, Jun; Hirabayashi, Jun
2015-01-01
Frontal affinity chromatography (FAC) is a simple and versatile procedure enabling quantitative determination of diverse biological interactions in terms of dissociation constants (Kd), even when these interactions are relatively weak. The method is best applied to glycans and their binding proteins, with the analytical system operating on the basis of highly reproducible isocratic elution by liquid chromatography. Its application to galectins has been successfully developed to characterize their binding specificities in detail. As a result, their minimal requirements for recognition of disaccharides, i.e., β-galactosides, as well as characteristic features of individual galectins, have been elucidated. In this chapter, we describe standard procedures to determine the Kd values for interactions between a series of standard glycans and various galectins.
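For orientation, the standard FAC relationship used in this literature (our summary, not quoted from the chapter) connects the elution front volume V of the analyte, the front volume V_0 of a non-interacting control, the effective ligand content B_t of the column, and the applied analyte concentration [A]_0:

```latex
V - V_{0} \;=\; \frac{B_{t}}{[A]_{0} + K_{d}}
```

In the dilute limit [A]_0 << K_d this reduces to V - V_0 ≈ B_t/K_d, so K_d follows directly from the measured front shift once B_t has been calibrated.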
Close, D.A.; Franks, L.A.; Kocimski, S.M.
1984-08-16
An invention is described that enables the quantitative simultaneous identification of the matrix materials in which fertile and fissile nuclides are embedded to be made along with the quantitative assay of the fertile and fissile materials. The invention also enables corrections for any absorption of neutrons by the matrix materials and by the measurement apparatus by the measurement of the prompt and delayed neutron flux emerging from a sample after the sample is interrogated by simultaneously applied neutrons and gamma radiation. High energy electrons are directed at a first target to produce gamma radiation. A second target receives the resulting pulsed gamma radiation and produces neutrons from the interaction with the gamma radiation. These neutrons are slowed by a moderator surrounding the sample and bathe the sample uniformly, generating second gamma radiation in the interaction. The gamma radiation is then resolved and quantitatively detected, providing a spectroscopic signature of the constituent elements contained in the matrix and in the materials within the vicinity of the sample. (LEW)
Self-powered integrated microfluidic point-of-care low-cost enabling (SIMPLE) chip
Yeh, Erh-Chia; Fu, Chi-Cheng; Hu, Lucy; Thakur, Rohan; Feng, Jeffrey; Lee, Luke P.
2017-01-01
Portable, low-cost, and quantitative nucleic acid detection is desirable for point-of-care diagnostics; however, current polymerase chain reaction testing often requires time-consuming multiple steps and costly equipment. We report an integrated microfluidic diagnostic device capable of on-site quantitative nucleic acid detection directly from the blood without separate sample preparation steps. First, we prepatterned the amplification initiator [magnesium acetate (MgOAc)] on the chip to enable digital nucleic acid amplification. Second, a simplified sample preparation step is demonstrated, where the plasma is separated autonomously into 224 microwells (100 nl per well) without any hemolysis. Furthermore, self-powered microfluidic pumping without any external pumps, controllers, or power sources is accomplished by an integrated vacuum battery on the chip. This simple chip allows rapid quantitative digital nucleic acid detection directly from human blood samples (10 to 10^5 copies of methicillin-resistant Staphylococcus aureus DNA per microliter, ~30 min, via isothermal recombinase polymerase amplification). These autonomous, portable, lab-on-chip technologies provide promising foundations for future low-cost molecular diagnostic assays. PMID:28345028
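Digital quantification of this kind conventionally rests on Poisson statistics over the well array. The sketch below is the standard calculation under that assumption (ours, not taken from the paper), using the 224 wells of 100 nl described above:

```python
# Standard Poisson back-calculation for digital amplification assays.
import math

def copies_per_microliter(n_positive, n_wells=224, well_volume_ul=0.1):
    # requires at least one negative well (p < 1)
    p = n_positive / n_wells
    lam = -math.log(1.0 - p)      # mean template copies per well (Poisson)
    return lam / well_volume_ul   # 100 nl = 0.1 microliter per well

print(copies_per_microliter(50))  # e.g., 50 of 224 wells turn positive
```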
Reina, J; Weber, I; Riera, E; Busquets, M; Morales, C
2014-05-01
Cytomegalovirus (CMV) is the main virus causing congenital and postnatal infections in the pediatric population. The aim of this study is to evaluate the usefulness of a quantitative real-time PCR in the diagnosis of these infections using urine as a single sample. We studied all the urine samples of newborns (< 7 days) with suspected congenital infection, and urine of patients with suspected postnatal infection (urine negative at birth). Urines were simultaneously studied by cell culture, qualitative PCR (PCRc), and quantitative real-time PCR (PCRq). We analyzed 332 urine samples (270 to rule out congenital infection and 62 postnatal infections). Of the first, 22 were positive by PCRq, 19 by PCRc, and 17 by culture. PCRq had a sensitivity of 100% when the culture was taken as the basis of comparison for the other techniques. Using PCRq as the reference method, culture had a sensitivity of 77.2%, and PCRc 86.3%. In cases of postnatal infection, PCRq detected 16 positive urines, PCRc 12, and cell culture 10. The urines showed viral loads ranging from 2,178 to 116,641 copies/ml. The real-time genomic amplification technique (PCRq) was more sensitive than the other techniques evaluated. This technique should be considered the reference (gold standard), leaving cell culture as a second diagnostic level. The low cost and automation of PCRq would enable screening for CMV infection in large neonatal and postnatal populations. Copyright © 2013 Asociación Española de Pediatría. Published by Elsevier Espana. All rights reserved.
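As a consistency check (our arithmetic, not stated in the abstract): with PCRq as the reference standard and its 22 congenital positives as the denominator, culture detected 17/22 ≈ 77.3% and PCRc 19/22 ≈ 86.4%, matching the quoted sensitivities of 77.2% and 86.3% up to rounding.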
NASA Astrophysics Data System (ADS)
Rusu, Mirabela; Wang, Haibo; Golden, Thea; Gow, Andrew; Madabhushi, Anant
2013-03-01
Mouse lung models facilitate the investigation of conditions such as chronic inflammation which are associated with common lung diseases. The multi-scale manifestation of lung inflammation prompted us to use multi-scale imaging (both in vivo and ex vivo MRI, along with ex vivo histology) for its study in a new quantitative way. Some imaging modalities, such as MRI, are non-invasive and capture macroscopic features of the pathology, while others, e.g. ex vivo histology, depict detailed structures. Registering such multi-modal data to the same spatial coordinates will allow the construction of a comprehensive 3D model to enable the multi-scale study of diseases. Moreover, it may facilitate the identification and definition of quantitative in vivo imaging signatures for diseases and pathologic processes. We introduce a quantitative, image-analytic framework to integrate in vivo MR images of the entire mouse with ex vivo histology of the lung alone, using ex vivo MRI of the lung as a conduit to facilitate their co-registration. In our framework, we first align the MR images by registering the in vivo and ex vivo MRI of the lung using an interactive rigid registration approach. Then we reconstruct the 3D volume of the ex vivo histological specimen by efficient groupwise registration of the 2D slices. The resulting 3D histologic volume is subsequently registered to the MRI volumes by interactive rigid registration, directly to the ex vivo MRI, and implicitly to the in vivo MRI. Qualitative evaluation of the registration framework was performed by comparing airway tree structures in ex vivo MRI and ex vivo histology, where airways are visible and may be annotated. We present a use case for evaluating our co-registration framework in the context of studying chronic inflammation in a diseased mouse.
Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P
2017-12-05
The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems as well as in quantitatively predicting their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.
Novel graphene-based biosensor for early detection of Zika virus infection.
Afsahi, Savannah; Lerner, Mitchell B; Goldstein, Jason M; Lee, Joo; Tang, Xiaoling; Bagarozzi, Dennis A; Pan, Deng; Locascio, Lauren; Walker, Amy; Barron, Francie; Goldsmith, Brett R
2018-02-15
We have developed a cost-effective and portable graphene-enabled biosensor to detect Zika virus with a highly specific immobilized monoclonal antibody. Field Effect Biosensing (FEB) with monoclonal antibodies covalently linked to graphene enables real-time, quantitative detection of native Zika viral (ZIKV) antigens. The percent change in capacitance in response to doses of antigen (ZIKV NS1) coincides with levels of clinical significance, with detection of antigen in buffer at concentrations as low as 450 pM. Potential diagnostic applications were demonstrated by measuring Zika antigen in a simulated human serum. Selectivity was validated using Japanese Encephalitis NS1, a homologous and potentially cross-reactive viral antigen. Further, the graphene platform can simultaneously provide the advanced quantitative data of nonclinical biophysical kinetics tools, making it adaptable to both clinical research and possible diagnostic applications. The speed, sensitivity, and selectivity of this first-of-its-kind graphene-enabled Zika biosensor make it an ideal candidate for development as a medical diagnostic test. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Deutsch, Eric W.; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L.
2015-01-01
Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include mass spectrometry to define protein sequence, protein:protein interactions, and protein post-translational modifications. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative mass spectrometry proteomics. It supports all major operating systems and instrument vendors via open data formats. Here we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of tandem mass spectrometry datasets, as well as some major upcoming features. PMID:25631240
Quantitative evaluation of phase processing approaches in susceptibility weighted imaging
NASA Astrophysics Data System (ADS)
Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.
2012-03-01
Susceptibility weighted imaging (SWI) takes advantage of the local variation in susceptibility between different tissues to enable highly detailed visualization of the cerebral venous system and sensitive detection of intracranial hemorrhages. Thus, it has been increasingly used in magnetic resonance imaging studies of traumatic brain injury as well as other intracranial pathologies. In SWI, magnitude information is combined with phase information to enhance the susceptibility-induced image contrast. Because of global susceptibility variations, the rate of phase accumulation varies widely across the image, resulting in phase-wrapping artifacts that interfere with the local assessment of phase variation. Homodyne filtering is a common approach to eliminate this global phase variation. However, the filter size requires careful selection in order to preserve image contrast and avoid errors resulting from residual phase wraps. An alternative approach is to apply phase unwrapping prior to high-pass filtering. A suitable phase unwrapping algorithm guarantees no residual phase wraps, but additional computational steps are required. In this work, we quantitatively evaluate these two phase processing approaches on both simulated and real data using different filters and cutoff frequencies. Our analysis leads to an improved understanding of the relationship between phase wraps, susceptibility effects, and acquisition parameters. Although homodyne filtering approaches are faster and more straightforward, phase unwrapping approaches perform more accurately in a wider variety of acquisition scenarios.
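A minimal sketch of the two processing routes being compared, for a complex-valued SWI slice; this is our illustration, not the authors' code, and the Gaussian filter width is arbitrary.

```python
# Two standard routes to a high-pass phase image from a complex SWI slice.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.restoration import unwrap_phase

def highpass_phase_homodyne(img, sigma=8.0):
    # divide by a low-pass-filtered copy of the complex image; residual
    # wraps survive if the filter is too small for the local phase accrual
    low = gaussian_filter(img.real, sigma) + 1j * gaussian_filter(img.imag, sigma)
    return np.angle(img / (low + 1e-12))

def highpass_phase_unwrapped(img, sigma=8.0):
    # unwrap first (guarantees no residual wraps), then remove the slowly
    # varying background phase by subtraction
    phase = unwrap_phase(np.angle(img))
    return phase - gaussian_filter(phase, sigma)
```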
Quantitative evaluation of 3D images produced from computer-generated holograms
NASA Astrophysics Data System (ADS)
Sheerin, David T.; Mason, Ian R.; Cameron, Colin D.; Payne, Douglas A.; Slinger, Christopher W.
1999-08-01
Advances in computing and optical modulation techniques now make it possible to anticipate the generation of near real-time, reconfigurable, high quality, three-dimensional images using holographic methods. Computer generated holography (CGH) is the only technique which holds promise of producing synthetic images having the full range of visual depth cues. These realistic images will be viewable by several users simultaneously, without the need for headtracking or special glasses. Such a data visualization tool will be key to speeding up the manufacture of new commercial and military equipment by negating the need for the production of physical 3D models in the design phase. DERA Malvern has been involved in designing and testing fixed CGH in order to understand the connection between the complexity of the CGH, the algorithms used to design them, the processes employed in their implementation and the quality of the images produced. This poster describes results from CGH containing up to 10^8 pixels. The methods used to evaluate the reconstructed images are discussed and quantitative measures of image fidelity made. An understanding of the effect of the various system parameters upon final image quality enables a study of the possible system trade-offs to be carried out. Such an understanding of CGH production and resulting image quality is key to effective implementation of a reconfigurable CGH system currently under development at DERA.
Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.
Sugino, T; Kawahira, H; Nakamura, R
2014-09-01
Advanced surgical procedures, which have become complex and difficult, increase the burden on surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. Division of the surgical procedure, task progress analysis, and task efficiency analysis were performed. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of the surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate the task efficiency during each stage. These analysis methods were evaluated by experimental validation with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters included incidents during the operation, the surgical environment, and the surgeon's skills or habits. Comparison of the groups revealed that skilled surgeons tended to perform the procedure in less time and within smaller regions; they also manipulated the surgical instruments more gently. Surgical task analysis developed for quantitative assessment of surgical procedures and surgical performance may provide practical methods and metrics for objective evaluation of surgical expertise.
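The efficiency measures named above reduce to simple computations on the navigation log. A sketch under our assumptions (positions sampled uniformly at interval dt; the "approximate ellipse" taken as a 2-sigma covariance ellipse of the 2D location distribution, which may differ from the authors' definition):

```python
# Our sketch of instrument-motion metrics from a navigation log.
import numpy as np

def motion_metrics(p, dt):
    # p: (n_samples, 3) instrument tip positions; dt: sampling interval (s)
    v = np.diff(p, axis=0) / dt                    # instantaneous velocity
    a = np.diff(v, axis=0) / dt                    # instantaneous acceleration
    mean_speed = np.linalg.norm(v, axis=1).mean()
    mean_accel = np.linalg.norm(a, axis=1).mean()
    # approximate ellipse of the 2D location distribution via covariance
    xy = p[:, :2] - p[:, :2].mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(xy.T))
    ellipse_area = np.pi * np.prod(2.0 * np.sqrt(eigvals))  # 2-sigma ellipse
    return mean_speed, mean_accel, ellipse_area
```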
Anconina, Reut; Zur, Dinah; Kesler, Anat; Lublinsky, Svetlana; Toledano, Ronen; Novack, Victor; Benkobich, Elya; Novoa, Rosa; Novic, Evelyne Farkash; Shelef, Ilan
2017-06-01
Dural sinuses vary in size and shape in many pathological conditions with abnormal intracranial pressure. Size and shape normograms of dural brain sinuses are not available. The creation of such normograms may enable computer-assisted comparison to pathologic exams and facilitate diagnoses. The purpose of this study was to quantitatively evaluate normal magnetic resonance venography (MRV) studies in order to create normograms of dural sinuses using a computerized algorithm for vessel cross-sectional analysis. This was a retrospective analysis of MRV studies of 30 healthy persons. Data were analyzed using a specially developed Matlab algorithm for vessel cross-sectional analysis. The cross-sectional area and shape measurements were evaluated to create normograms. Mean cross-sectional size was 53.27±13.31 for the right transverse sinus (TS), 46.87±12.57 for the left TS (p=0.089) and 36.65±12.38 for the superior sagittal sinus. Normograms were created. The distribution of cross-sectional areas along the vessels showed distinct patterns and a parallel course for the median, 25th, 50th and 75th percentiles. In conclusion, using a novel computerized method for vessel cross-sectional analysis we were able to quantitatively characterize dural sinuses of healthy persons and create normograms. Copyright © 2017 Elsevier Ltd. All rights reserved.
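The percentile normograms then follow directly from per-subject area curves sampled at matched positions along a vessel; a minimal sketch (ours, not the study's Matlab code):

```python
# Percentile normogram from per-subject cross-sectional area curves.
import numpy as np

def normogram(areas):
    # areas: (n_subjects, n_positions) cross-sectional areas along the vessel
    return {q: np.percentile(areas, q, axis=0) for q in (25, 50, 75)}
```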
Meier, D C; Benkstein, K D; Hurst, W S; Chu, P M
2017-05-01
Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, -5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals.
PET guidance for liver radiofrequency ablation: an evaluation
NASA Astrophysics Data System (ADS)
Lei, Peng; Dandekar, Omkar; Mahmoud, Faaiza; Widlus, David; Malloy, Patrick; Shekhar, Raj
2007-03-01
Radiofrequency ablation (RFA) is emerging as the primary mode of treatment of unresectable malignant liver tumors. With current intraoperative imaging modalities, quick, precise, and complete localization of lesions remains a challenge for liver RFA. Fusion of intraoperative CT and preoperative PET images, which relies on PET and CT registration, can produce a new image with complementary metabolic and anatomic data and thus greatly improve the targeting accuracy. Unlike neurological images, alignment of abdominal images by a combined PET/CT scanner is prone to errors as a result of large nonrigid misalignment in abdominal images. Our use of a normalized mutual information-based 3D nonrigid registration technique has proven powerful for whole-body PET and CT registration. We demonstrate here that this technique is capable of acceptable abdominal PET and CT registration as well. In five clinical cases, both qualitative and quantitative validation showed that the registration is robust and accurate. Quantitative accuracy was evaluated by comparing the algorithm's results with assessments by clinical experts. The registration error is well within the allowable margin for liver RFA. Study findings show the technique's potential to enable the augmentation of intraoperative CT with preoperative PET to reduce procedure time, avoid repeated procedures, provide clinicians with complementary functional/anatomic maps, avoid omitting dispersed small lesions, and improve the accuracy of tumor targeting in liver RFA.
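For reference, a minimal histogram-based estimate of normalized mutual information, using one common definition, NMI = (H(A) + H(B)) / H(A,B); the paper's exact variant may differ.

```python
# Histogram-based NMI between two images (one common definition).
import numpy as np

def nmi(a, b, bins=64):
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    return (entropy(px) + entropy(py)) / entropy(pxy.ravel())
```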
Leahy, Edmund; Chipchase, Lucy; Blackstock, Felicity
2017-04-17
Learning activities are fundamental for the development of expertise in physiotherapy practice. Continuing professional development (CPD) encompasses formal and informal learning activities undertaken by physiotherapists. Identifying the most efficient and effective learning activities is essential to enable the profession to assimilate research findings and improve clinical skills to ensure the most efficacious care for clients. To date, systematic reviews on the effectiveness of CPD provide limited guidance on the most efficacious models of professional development for physiotherapists. The aim of this systematic review is to evaluate which learning activities enhance physiotherapy practice. A search of Ovid MEDLINE, EMBASE, Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO (Psychological Abstracts), PEDro, Cochrane Library, AMED and Educational Resources and Information Center (ERIC) will be completed. Citation searching and reference list searching will be undertaken to locate additional studies. Quantitative and qualitative studies will be included if they examine the impact of learning activities on clinician's behaviour, attitude, knowledge, beliefs, skills, self-efficacy, work satisfaction and patient outcomes. Risk of bias will be assessed by two independent researchers. Grading of Recommendations Assessment, Development, and Evaluation (GRADE) and Confidence in the Evidence from Reviews of Qualitative research (CERQual) will be used to synthesise results where a meta-analysis is possible. Where a meta-analysis is not possible, a narrative synthesis will be conducted. PROSPERO CRD42016050157.
Fish acute toxicity syndromes and their use in the QSAR approach to hazard assessment.
McKim, J M; Bradbury, S P; Niemi, G J
1987-01-01
Implementation of the Toxic Substances Control Act of 1977 creates the need to reliably establish testing priorities because laboratory resources are limited and the number of industrial chemicals requiring evaluation is overwhelming. The use of quantitative structure activity relationship (QSAR) models as rapid and predictive screening tools to select more potentially hazardous chemicals for in-depth laboratory evaluation has been proposed. Further implementation and refinement of quantitative structure-toxicity relationships in aquatic toxicology and hazard assessment requires the development of a "mode-of-action" database. With such a database, a qualitative structure-activity relationship can be formulated to assign the proper mode of action, and respective QSAR, to a given chemical structure. In this review, the development of fish acute toxicity syndromes (FATS), which are toxic-response sets based on various behavioral and physiological-biochemical measurements, and their projected use in the mode-of-action database are outlined. Using behavioral parameters monitored in the fathead minnow during acute toxicity testing, FATS associated with acetylcholinesterase (AChE) inhibitors and narcotics could be reliably predicted. However, compounds classified as oxidative phosphorylation uncouplers or stimulants could not be resolved. Refinement of this approach by using respiratory-cardiovascular responses in the rainbow trout enabled FATS associated with AChE inhibitors, convulsants, narcotics, respiratory blockers, respiratory membrane irritants, and uncouplers to be correctly predicted. PMID:3297660
Capillary Electrophoresis Chips for Fingerprinting Endotoxin Chemotypes and Subclasses.
Kocsis, Béla; Makszin, Lilla; Kilár, Anikó; Péterfi, Zoltán; Kilár, Ferenc
2017-01-01
Endotoxins (lipopolysaccharides, LPS; lipooligosaccharides, LOS) are components of the envelope of Gram-negative bacteria. These molecules, responsible for both advantageous and harmful biological activity of these microorganisms, are highly immunogenic and directly involved in numerous bacterial diseases in humans, such as Gram-negative sepsis. The characterization of endotoxins is of importance, since their physiological and pathophysiological effects depend on their chemical structure. The differences among the LPS from different bacterial serotypes and their mutants include variations mainly in the composition and length, or the absence, of their O-polysaccharide chains. Microchip electrophoretic methodology enables the structural characterization of LPS molecules from several bacteria and the quantitative evaluation of components of endotoxin extracts. The improved microchip electrophoretic method is based on the direct labeling of endotoxins by covalent binding of a fluorescent dye. The classification of the S-type LPSs can be done according to their electrophoretic profiles, which are characteristics of the respective bacterial strains. According to the number, distribution, and the relative amounts of components in an endotoxin extract, it is possible to differentiate between the S-type endotoxins from different Gram-negative bacterial strains. The microchip electrophoresis affords high-resolution separation of pure and partially purified (e.g., obtained from whole-cell lysate) S and R endotoxins. This microchip technique provides a new, standardizable, fast, and sensitive method for the detection of endotoxins and for the quantitative evaluation of components of an endotoxin extract.
Gentry, S V; Powers, E F J; Azim, N; Maidrag, M
2018-07-01
Voluntary befriending schemes operate in many countries, promoting public health by supporting vulnerable individuals and families. Use of third sector and voluntary services to complement health and social care provision is increasingly important globally in the context of economic and demographic challenges, but the evidence base around such collaborations is limited. This article reports the results of operational evaluation research seeking to use robust routine work to generate transferable findings for use by those commissioning and providing services. The subject of our evaluation research is 'Home-Start Suffolk' (HSS) in Suffolk County, UK, an example of a third sector organisation commissioned to support the public health offer to local families. This evaluation research used the Donabedian framework, which assesses the structure, process and outcome in delivery of health services. Methods included a cross-sectional stakeholder survey with qualitative and quantitative elements (n = 96), qualitative interviews (n = 41) and quantitative analysis of the service's routine data (5740 visits) for the period from 01 July 2014 to 01 July 2016. Triangulation of data from each component revealed that HSS was perceived by diverse stakeholders to successfully support families in need of additional help. HSS service users perceived the service to offer greater flexibility, to be tailored to their needs and to be more trustworthy and supportive than statutory services. Volunteering with HSS enabled people to feel productive in their community and gain new skills. Managers of social care services perceived that HSS activity decreased burden on their staff. These benefits were facilitated through a long-standing organisational HSS structure and relationships between HSS and social care. Challenges posed by service provision by a third sector organisation included the need for volunteers to negotiate the boundary between being a friend and a professional outside of a professional framework. Quantitative analysis of impact was limited by the poor quality of routinely collected administrative data, highlighting the importance of planning processes for data collection with evaluation in mind. We believe that the results of this evaluation research provide transferable lessons. They demonstrate how a third sector organisation with a long-standing structure and relationships with statutory services was able to reduce perceived service burden while also offering support in a more flexible and tailored way greatly valued by service users. Copyright © 2018 The Royal Society for Public Health. All rights reserved.
Enabling Large-Scale IoT-Based Services through Elastic Publish/Subscribe.
Vavassori, Sergio; Soriano, Javier; Fernández, Rafael
2017-09-19
In this paper, we report an algorithm that is designed to leverage the cloud as infrastructure to support Internet of Things (IoT) by elastically scaling in/out so that IoT-based service users never stop receiving sensors' data. This algorithm is able to provide an uninterrupted service to end users even during the scaling operation since its internal state repartitioning is transparent for publishers or subscribers; its scaling operation is time-bounded and depends only on the dimension of the state partitions to be transmitted to the different nodes. We describe its implementation in E-SilboPS, an elastic content-based publish/subscribe (CBPS) system specifically designed to support context-aware sensing and communication in IoT-based services. E-SilboPS is a key internal asset of the FIWARE IoT services enablement platform, which offers an architecture of components specifically designed to capture data from, or act upon, IoT devices as easily as reading/changing the value of attributes linked to context entities. In addition, we discuss the quantitative measurements used to evaluate the scale-out process, as well as the results of this evaluation. This new feature rounds out the context-aware content-based features of E-SilboPS by providing, for example, the necessary middleware for constructing dashboards and monitoring panels that are capable of dynamically changing queries and continuously handling data in IoT-based services.
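The scaling property described, a repartitioning cost bounded by the size of the transferred state while routing keeps working, can be illustrated with a toy partition-based broker. This is a schematic sketch, not E-SilboPS's actual algorithm; the class and method names are hypothetical, and a real system would stream each partition's subscriptions to the new node while both nodes continue serving.

```python
# Minimal sketch of state repartitioning during scale-out, assuming the
# subscription table is split into fixed partitions.
class ElasticBroker:
    def __init__(self, n_partitions=12):
        self.n_partitions = n_partitions
        self.nodes = {"node-0": list(range(n_partitions))}

    def route(self, subscription_key):
        # Stable hash -> partition -> owning node; routing never blocks,
        # even mid-scale-out. (hash() is stable within one process.)
        part = hash(subscription_key) % self.n_partitions
        for node, parts in self.nodes.items():
            if part in parts:
                return node

    def scale_out(self, new_node):
        """Move partitions to the new node one at a time; the cost is
        bounded by the transferred partitions' size, not total state."""
        target = self.n_partitions // (len(self.nodes) + 1)
        self.nodes[new_node] = []
        for donor in sorted(self.nodes, key=lambda n: -len(self.nodes[n])):
            while len(self.nodes[new_node]) < target and len(self.nodes[donor]) > target:
                # In a real broker the partition's subscriptions would be
                # streamed to new_node here while both nodes keep serving.
                self.nodes[new_node].append(self.nodes[donor].pop())

broker = ElasticBroker()
broker.scale_out("node-1")
print(broker.nodes, broker.route("temperature/room-42"))
```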
Agrawal, Sony; Cifelli, Steven; Johnstone, Richard; Pechter, David; Barbey, Deborah A; Lin, Karen; Allison, Tim; Agrawal, Shree; Rivera-Gines, Aida; Milligan, James A; Schneeweis, Jonathan; Houle, Kevin; Struck, Alice J; Visconti, Richard; Sills, Matthew; Wildey, Mary Jo
2016-02-01
Quantitative reverse transcription PCR (qRT-PCR) is a valuable tool for characterizing the effects of inhibitors on viral replication. The amplification of target viral genes through the use of specifically designed fluorescent probes and primers provides a reliable method for quantifying RNA. Due to reagent costs, use of these assays for compound evaluation is limited. Until recently, the inability to accurately dispense low volumes of qRT-PCR assay reagents precluded the routine use of this PCR assay for compound evaluation in drug discovery. Acoustic dispensing has become an integral part of drug discovery during the past decade; however, acoustic transfer of microliter volumes of aqueous reagents was time-consuming. The Labcyte Echo 525 liquid handler was designed to enable rapid aqueous transfers. We compared the accuracy and precision of a qPCR assay using the Labcyte Echo 525 to those of the BioMek FX, a traditional liquid handler, with the goal of reducing the volume and cost of the assay. The data show that the Echo 525 provides higher accuracy and precision compared to the current process using a traditional liquid handler. Comparable data for assay volumes from 500 nL to 12 µL allowed the miniaturization of the assay, resulting in significant cost savings and process streamlining in drug discovery. © 2015 Society for Laboratory Automation and Screening.
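Accuracy and precision comparisons of this kind typically reduce to percent deviation from target and coefficient of variation across replicates. A small sketch with invented replicate values (the study's actual measurements are not reproduced here):

```python
import numpy as np

def accuracy_and_cv(measured, target):
    """Accuracy as mean % deviation from target; precision as %CV."""
    measured = np.asarray(measured, dtype=float)
    accuracy = 100.0 * (measured.mean() - target) / target
    cv = 100.0 * measured.std(ddof=1) / measured.mean()
    return accuracy, cv

# Hypothetical replicate concentrations (arbitrary units) recovered from
# a 500 nL transfer on each instrument; values are illustrative only.
echo_525 = [0.98, 1.01, 1.00, 0.99, 1.02]
biomek_fx = [0.91, 1.08, 0.91, 0.96, 1.10]
for name, data in [("Echo 525", echo_525), ("BioMek FX", biomek_fx)]:
    acc, cv = accuracy_and_cv(data, target=1.0)
    print(f"{name}: accuracy {acc:+.1f}%, CV {cv:.1f}%")
```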
Developing methods for systematic reviewing in health services delivery and organisation
Alborz, Alison; McNally, Rosalind
2007-01-01
Objectives To develop methods to facilitate the 'systematic' review of evidence from a range of methodologies on diffuse or 'soft' topics, as exemplified by 'access to healthcare'. Data sources 28 bibliographic databases, research registers, organisational web sites or library catalogues. Reference lists from identified studies. Contact with experts and service users. Current awareness and contents alerting services in the area of learning disabilities. Review methods Inclusion criteria were English language literature from 1980 onwards, relating to people with learning disabilities of any age and all study designs. The main criterion for assessment was relevance to Gulliford's model of access to health care, which was adapted to the circumstances of people with learning disabilities. Selected studies were evaluated for scientific rigour, then data were extracted and the results synthesised. Quality assessment was by an initial set of 'generic' quality indicators. This enabled further evidence selection before evaluation of findings according to specific criteria for qualitative, quantitative or mixed-method studies. Results 82 studies were fully evaluated. Five studies were rated 'highly rigorous', 22 'rigorous', 46 'less rigorous' and 9 'poor' papers were retained as the sole evidence covering aspects of the guiding model. The majority of studies were quantitative but used only descriptive statistics. Most evidence lacked methodological detail, which often lowered final quality ratings. Conclusions The application of a consistent structure to quality evaluation can facilitate data appraisal, extraction and synthesis across a range of methodologies in diffuse or 'soft' topics. Synthesis can be facilitated further by using software, such as the Microsoft 'Access' database, for managing information. PMID:15606880
Simulated Patients in Physical Therapy Education: Systematic Review and Meta-Analysis.
Pritchard, Shane A; Blackstock, Felicity C; Nestel, Debra; Keating, Jenny L
2016-09-01
Traditional models of physical therapy clinical education are experiencing unprecedented pressures. Simulation-based education with simulated (standardized) patients (SPs) is one alternative that has significant potential value, and implementation is increasing globally. However, no review evaluating the effects of SPs on professional (entry-level) physical therapy education is available. The purpose of this study was to synthesize and critically appraise the findings of empirical studies evaluating the contribution of SPs to entry-level physical therapy education, compared with no SP interaction or an alternative education strategy, on any outcome relevant to learning. A systematic search was conducted of Ovid MEDLINE, PubMed, AMED, ERIC, and CINAHL Plus databases and reference lists of included articles, relevant reviews, and gray literature up to May 2015. Articles reporting quantitative or qualitative data evaluating the contribution of SPs to entry-level physical therapy education were included. Two reviewers independently extracted study characteristics, intervention details, and quantitative and qualitative evaluation data from the 14 articles that met the eligibility criteria. Pooled random-effects meta-analysis indicated that replacing up to 25% of authentic patient-based physical therapist practice with SP-based education results in comparable competency (mean difference=1.55/100; 95% confidence interval=-1.08, 4.18; P=.25). Thematic analysis of qualitative data indicated that students value learning with SPs. Assumptions were made to enable pooling of data, and the search strategy was limited to English. Simulated patients appear to have an effect comparable to that of alternative educational strategies on development of physical therapy clinical practice competencies and serve a valuable role in entry-level physical therapy education. However, available research lacks the rigor required for confidence in findings. Given the potential advantages for students, high-quality studies that include an economic analysis should be conducted. © 2016 American Physical Therapy Association.
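The pooled random-effects estimate quoted (mean difference 1.55 on a 100-point scale) is the kind of output a DerSimonian-Laird meta-analysis produces. A compact sketch with made-up study effects and variances, not the review's data:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) with 95% CI."""
    effects = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                  # fixed-effect weights
    theta_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - theta_fe) ** 2)    # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                # between-study variance
    w_re = 1.0 / (v + tau2)
    theta = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return theta, theta - 1.96 * se, theta + 1.96 * se

# Illustrative competency mean differences (0-100 scale) and variances
md, lo, hi = dersimonian_laird([2.1, -0.4, 3.0], [1.9, 2.5, 3.2])
print(f"pooled MD {md:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```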
Serrona, Kevin Roy B; Yu, Jeongsoo; Aguinaldo, Emelita; Florece, Leonardo M
2014-09-01
The Philippines has been making inroads in solid waste management with the enactment and implementation of the Republic Act 9003 or the Ecological Waste Management Act of 2000. Said legislation has had tremendous influence in terms of how the national and local government units confront the challenges of waste management in urban and rural areas using the reduce, reuse, recycle and recovery framework or 4Rs. One of the sectors needing assistance is the informal waste sector whose aspiration is legal recognition of their rank and integration of their waste recovery activities in mainstream waste management. To realize this, the Philippine National Solid Waste Management Commission initiated the formulation of the National Framework Plan for the Informal Waste Sector, which stipulates approaches, strategies and methodologies to concretely involve the said sector in different spheres of local waste management, such as collection, recycling and disposal. What needs to be fleshed out is the monitoring and evaluation component in order to gauge qualitative and quantitative achievements vis-a-vis the Framework Plan. In the process of providing an enabling environment for the informal waste sector, progress has to be monitored and verified qualitatively and quantitatively and measured against activities, outputs, objectives and goals. Using the Framework Plan as the reference, this article developed monitoring and evaluation indicators using the logical framework approach in project management. The primary objective is to institutionalize monitoring and evaluation, not just in informal waste sector plans, but in any waste management initiatives to ensure that envisaged goals are achieved. © The Author(s) 2014.
Quantitative MR imaging in fracture dating--Initial results.
Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva
2016-04-01
For exact age determinations of bone fractures in a forensic context (e.g. in cases of child abuse) improved knowledge of the time course of the healing process and use of non-invasive modern imaging technology is of high importance. To date, fracture dating is based on radiographic methods by determining the callus status and thereby relying on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined for this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11 ♂: 6; median age 34 ± 15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or in the collar bone. Both, qualitative and quantitative MR measurements were performed in all subjects. MR sequences for a quantitative measurement of relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area by defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7 ♂: 5) showed an initial peak in T1 values in the fractured area (T1=1895 ± 607 ms), which decreased over time to a value of 1094 ± 182 ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2=115 ± 80 ms) and decreased to 73 ± 33 ms within 21 days after the fracture event. After that time point, no significant changes could be detected for T2. MTR remained constant at 35.5 ± 8.0% over time. The study shows that the quantitative assessment of T1 and T2 behaviour over time in the fractured region enables the generation of a novel model allowing for an objective age determination of a fracture. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
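Dating a fracture from such quantitative maps amounts to fitting the time course of a parameter like T1 and inverting the fitted curve. A sketch assuming a monoexponential decline, which is a modelling assumption for illustration rather than the paper's fitted model; the ROI values are synthetic but shaped like the reported trend (about 1895 ms acutely, about 1094 ms at 200 days):

```python
import numpy as np
from scipy.optimize import curve_fit

def t1_course(t_days, t1_acute, t1_chronic, tau):
    # Monoexponential relaxation of the ROI-mean T1 toward a chronic value
    return t1_chronic + (t1_acute - t1_chronic) * np.exp(-t_days / tau)

# Hypothetical ROI means (ms) at known post-fracture intervals (days)
t = np.array([1, 7, 21, 60, 120, 200], float)
t1 = np.array([1890, 1700, 1450, 1220, 1130, 1095], float)

(popt, _) = curve_fit(t1_course, t, t1, p0=(1900, 1100, 30)), None
t1_acute, t1_chronic, tau = popt[0]
print(f"T1 acute {t1_acute:.0f} ms -> chronic {t1_chronic:.0f} ms, tau {tau:.0f} d")

# Inverting the fitted curve dates a fracture from a new T1 measurement
t1_new = 1400.0
age = -tau * np.log((t1_new - t1_chronic) / (t1_acute - t1_chronic))
print(f"estimated fracture age: {age:.0f} days")
```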
Synthesising quantitative and qualitative research in evidence‐based patient information
Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan
2007-01-01
Background Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence‐based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. Aims This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Methods Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non‐quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg “explain what the test involves”) was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. Results 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. Conclusions A practical example is provided of how quantitative and qualitative data sources might successfully be brought together and considered in one review. PMID:17325406
Fundamentals of functional imaging I: current clinical techniques.
Luna, A; Martín Noguerol, T; Mata, L Alcalá
2018-05-01
Imaging techniques can establish a structural, physiological, and molecular phenotype for cancer, which helps enable accurate diagnosis and personalized treatment. In recent years, various imaging techniques that make it possible to study the functional characteristics of tumors quantitatively and reproducibly have been introduced and have become established in routine clinical practice. Perfusion studies enable us to estimate the microcirculation as well as tumor angiogenesis and permeability using ultrafast dynamic acquisitions with ultrasound, computed tomography, or magnetic resonance (MR) imaging. Diffusion-weighted sequences now form part of state-of-the-art MR imaging protocols to evaluate oncologic lesions in any anatomic location. Diffusion-weighted imaging provides information about the occupation of the extracellular and extravascular space and indirectly estimates the cellularity and apoptosis of tumors, having demonstrated its relation with biologic aggressiveness in various tumor lines and its usefulness in the evaluation of the early response to systemic and local targeted therapies. Another tool is hydrogen proton MR spectroscopy, which is used mainly in the study of the metabolic characteristics of brain tumors. However, the complexity of the technique and its lack of reproducibility have limited its clinical use in other anatomic areas, although much experience with the use of this technique in the assessment of prostate and breast cancers as well as liver lesions has also accumulated. This review analyzes the imaging techniques that make it possible to evaluate the physiological and molecular characteristics of cancer that have already been introduced into clinical practice, such as techniques that evaluate angiogenesis through dynamic acquisitions after the administration of contrast material, diffusion-weighted imaging, or hydrogen proton MR spectroscopy, as well as their principal applications in oncology. Copyright © 2018 SERAM. Publicado por Elsevier España, S.L.U. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuldna, Piret, E-mail: piret.kuldna@seit.ee; Peterson, Kaja; Kuhi-Thalfeldt, Reeli
Strategic Environmental Assessment (SEA) serves as a platform for bringing together researchers, policy developers and other stakeholders to evaluate and communicate significant environmental and socio-economic effects of policies, plans and programmes. Quantitative computer models can facilitate knowledge exchange between various parties that strive to use scientific findings to guide policy-making decisions. The process of facilitating knowledge generation and exchange, i.e. knowledge brokerage, has been increasingly explored, but there is not much evidence in the literature on how knowledge brokerage activities are used in full cycles of SEAs which employ quantitative models. We report on the SEA process of the national energy plan with reflections on where and how the Long-range Energy Alternatives Planning (LEAP) model was used for knowledge brokerage on emissions modelling between researchers and policy developers. Our main suggestion is that applying a quantitative model not only in ex ante, but also ex post scenario modelling and associated impact assessment can facilitate systematic and inspiring knowledge exchange process on a policy problem and capacity building of participating actors. - Highlights: • We examine the knowledge brokering on emissions modelling between researchers and policy developers in a full cycle of SEA. • Knowledge exchange process can evolve at any modelling stage within SEA. • Ex post scenario modelling enables systematic knowledge exchange and learning on a policy problem.
Presley, Tennille; Kuppusamy, Periannan; Zweier, Jay L.; Ilangovan, Govindasamy
2006-01-01
Electron paramagnetic resonance (EPR) oximetry is being widely used to measure the oxygen consumption of cells, mitochondria, and submitochondrial particles. However, further improvement of this technique, in terms of data analysis, is required to use it as a quantitative tool. Here, we present a new approach for quantitative analysis of cellular respiration using EPR oximetry. The course of oxygen consumption by cells in suspension has been observed to have three distinct zones: pO2-independent respiration at higher pO2 ranges, pO2-dependent respiration at low pO2 ranges, and a static equilibrium with no change in pO2 at very low pO2 values. The approach presented here enables comprehensive analysis of all three zones together, considering the progression of O2 diffusion zones around each cell, their overlap over time, and their potential impact on the measured pO2 data. The obtained results agree with previously established methods such as high-resolution respirometry measurements. Additionally, it is also demonstrated how the diffusion limitations can depend on cell density and consumption rate. In conclusion, the new approach establishes a more accurate and meaningful model for evaluating EPR oximetry data on cellular respiration and quantifying the related parameters. PMID:17012319
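The first two zones fall out of Michaelis-Menten-type consumption kinetics: the rate is nearly constant while pO2 is well above P50 and becomes proportional to pO2 below it. A numerical sketch with illustrative parameters (the static-equilibrium third zone, set by oxygen back-diffusion, is omitted here):

```python
import numpy as np
from scipy.integrate import solve_ivp

VMAX = 1.2   # mmHg/min, maximal consumption rate of the suspension (illustrative)
P50 = 3.0    # mmHg, pO2 at half-maximal consumption (illustrative)

def dpo2_dt(t, p):
    # Zero-order consumption at high pO2, first-order at low pO2
    return -VMAX * p / (p + P50)

sol = solve_ivp(dpo2_dt, t_span=(0, 200), y0=[160.0], dense_output=True)
for ti in np.linspace(0, 200, 9):
    print(f"t={ti:5.1f} min  pO2={max(sol.sol(ti)[0], 0):6.2f} mmHg")
```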
Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng
2014-01-01
Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, such as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach is capable of dramatically improving the spatial resolution, revealing finer details within a region of interest of a sample larger than the field of view than conventional techniques can. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach. The optimal experimental parameters are pre-analyzed. The quantitative results demonstrated that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and understanding the transformation of the minerals during coal processing. The method is generic and can be applied for three-dimensional compositional characterization of other materials. PMID:24763649
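At its core, data-constrained modelling of this kind solves a per-voxel unmixing problem: measured attenuation at several beam energies is a volume-fraction-weighted sum of the candidate compositions' attenuation coefficients, with fractions non-negative and summing to one (a zero-attenuation pore phase absorbs the remainder). A bounded least-squares sketch; the coefficient matrix and measurement are illustrative, not coal data:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Columns: [organic matrix, mineral A, mineral B, pore (mu = 0)];
# rows: linear attenuation coefficients at three beam energies.
MU = np.array([[0.55, 2.10, 1.60, 0.0],   # 20 keV
               [0.30, 1.20, 0.95, 0.0],   # 30 keV
               [0.22, 0.70, 0.60, 0.0]])  # 40 keV

def unmix_voxel(measured_mu, weight=1e3):
    # Append a heavily weighted row enforcing fractions that sum to 1,
    # with each fraction bounded to [0, 1].
    a = np.vstack([MU, weight * np.ones(MU.shape[1])])
    b = np.append(measured_mu, weight * 1.0)
    return lsq_linear(a, b, bounds=(0.0, 1.0)).x

fractions = unmix_voxel(np.array([0.95, 0.55, 0.35]))
labels = ["organic", "mineral A", "mineral B", "porosity"]
print({k: round(v, 3) for k, v in zip(labels, fractions)})
```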
Tang, Hsin-Yao; Beer, Lynn A.; Barnhart, Kurt T.; Speicher, David W.
2011-01-01
Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, the cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves quickly become unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1-D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μl of serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers. PMID:21726088
Caoili, Salvador Eugenio C.
2014-01-01
B-cell epitope prediction can enable novel pharmaceutical product development. However, a mechanistically framed consensus has yet to emerge on benchmarking such prediction, thus presenting an opportunity to establish standards of practice that circumvent epistemic inconsistencies of casting the epitope prediction task as a binary-classification problem. As an alternative to conventional dichotomous qualitative benchmark data, quantitative dose-response data on antibody-mediated biological effects are more meaningful from an information-theoretic perspective in the sense that such effects may be expressed as probabilities (e.g., of functional inhibition by antibody) for which the Shannon information entropy (SIE) can be evaluated as a measure of informativeness. Accordingly, half-maximal biological effects (e.g., at median inhibitory concentrations of antibody) correspond to maximally informative data while undetectable and maximal biological effects correspond to minimally informative data. This applies to benchmarking B-cell epitope prediction for the design of peptide-based immunogens that elicit antipeptide antibodies with functionally relevant cross-reactivity. Presently, the Immune Epitope Database (IEDB) contains relatively few quantitative dose-response data on such cross-reactivity. Only a small fraction of these IEDB data is maximally informative, and many more of them are minimally informative (i.e., with zero SIE). Nevertheless, the numerous qualitative data in IEDB suggest how to overcome the paucity of informative benchmark data. PMID:24949474
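The information-theoretic point is simple to make concrete: a dose-response datum whose antibody-mediated effect occurs with probability p carries binary Shannon entropy H(p), which peaks at half-maximal effects and vanishes for undetectable or saturating ones. A worked sketch:

```python
import numpy as np

def binary_entropy(p):
    """Shannon information entropy (bits) of an effect probability;
    maximal at p = 0.5, zero at p = 0 or p = 1."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Half-maximal inhibition (p = 0.5) is maximally informative benchmark
# data; undetectable (p -> 0) or maximal (p -> 1) effects carry none.
for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"effect probability {p:.1f}: SIE = {binary_entropy(p):.3f} bits")
```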
Nielsen, Gitte; Fritz-Hansen, Thomas; Dirks, Christina G; Jensen, Gorm B; Larsson, Henrik B W
2004-09-01
To investigate the diagnostic ability of quantitative magnetic resonance imaging (MRI) heart perfusion in acute heart patients, a fast, multislice dynamic contrast-enhanced MRI sequence was applied to patients with acute myocardial infarction. Seven patients with acute transmural myocardial infarction were studied using a Turbo-fast low angle shot (FLASH) MRI sequence to monitor the first pass of an extravascular contrast agent (CA), gadolinium diethylene triamine pentaacetic acid (Gd-DTPA). Quantitation of perfusion, expressed as Ki (mL/100 g/minute), in five slices, each having 60 sectors, provided an estimation of the severity and extent of the perfusion deficiency. Reperfusion was assessed both by noninvasive criteria and by coronary angiography (CAG). The Ki maps clearly delineated the infarction in all patients. Thrombolytic treatment was clearly beneficial in one case, but had no effect in the two other cases. Over the time-course of the study, normal perfusion values were not reestablished following thrombolytic treatment in all cases investigated. This study shows that quantitative MRI perfusion values can be obtained from acutely ill patients following acute myocardial infarction. The technique provides information on both the volume and severity of affected myocardial tissue, enabling the power of treatment regimes to be assessed objectively, and this approach should aid individual patient stratification and prognosis. Copyright 2004 Wiley-Liss, Inc.
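Transfer-constant estimates like the Ki maps described are commonly obtained from a linearized tracer-kinetic fit; a generic Patlak-plot sketch is shown below. This illustrates the general approach, not necessarily the kinetic model used in the paper, and the input curves are synthetic (the unit conversion assumes a tissue density of about 1 g/mL):

```python
import numpy as np

# Hypothetical first-pass curves: arterial input function cp(t) and
# tissue concentration ct(t), sampled once per second.
t = np.linspace(0, 60, 61)                       # s
cp = 5.0 * (t / 8.0) * np.exp(1 - t / 8.0)       # gamma-variate-like AIF
ki_true, v0 = 0.01, 0.05                         # 1/s, unitless
ct = ki_true * np.cumsum(cp) * (t[1] - t[0]) + v0 * cp

# Patlak plot: ct/cp versus integral(cp)/cp is linear with slope Ki
mask = cp > 0.1 * cp.max()                       # avoid division by ~0
x = np.cumsum(cp)[mask] * (t[1] - t[0]) / cp[mask]
y = ct[mask] / cp[mask]
ki, intercept = np.polyfit(x, y, 1)
print(f"Ki = {ki * 6000:.1f} mL/100 g/min (true {ki_true * 6000:.1f})")
```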
Bouteille, R; Gaudet, M; Lecanu, B; This, H
2013-04-01
When fermenting milk, lactic bacteria convert part of α- and β-lactoses into d- and l- lactic acids, causing a pH decrease responsible for casein coagulation. Lactic acid monitoring during fermentation is essential for the control of dairy gel textural and organoleptic properties, and is a way to evaluate strain efficiency. Currently, titrations are used to follow the quantity of acids formed during jellification of milk but they are not specific to lactic acid. An analytical method without the use of any reagent was investigated to quantify lactic acid during milk fermentation: in situ quantitative proton nuclear magnetic resonance spectroscopy. Two methods using in situ quantitative proton nuclear magnetic resonance spectroscopy were compared: (1) d- and l-lactic acids content determination, using the resonance of their methyl protons, showing an increase from 2.06 ± 0.02 to 8.16 ± 0.74 g/L during 240 min of fermentation; and (2) the determination of the α- and β-lactoses content, decreasing from 42.68 ± 0.02 to 30.76 ± 1.75 g/L for the same fermentation duration. The ratio between the molar concentrations of produced lactic acids and consumed lactoses enabled cross-validation, as the value (2.02 ± 0.18) is consistent with lactic acid bacteria metabolism. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
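The cross-validation in the final sentence is a short molar-balance calculation from the reported concentration changes:

```python
# Moles of lactic acid produced per mole of lactose consumed, from the
# concentrations reported in the abstract.
M_LACTIC = 90.08    # g/mol
M_LACTOSE = 342.30  # g/mol

lactic_produced = (8.16 - 2.06) / M_LACTIC      # mol/L
lactose_consumed = (42.68 - 30.76) / M_LACTOSE  # mol/L
print(f"ratio = {lactic_produced / lactose_consumed:.2f}")  # ~1.9, cf. 2.02 +/- 0.18
```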
Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko
2017-07-01
Identification of human semen is indispensable for the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful to detect human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, because of a lack of evaluation methods, these studies provided neither objective, quantitative descriptions of the results nor clear criteria for the decisions reached. In addition, the variety of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. Use of this method enabled objective and specific identification of fluorescent sperm's spots and quantitative comparisons of the sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated in high humidity and high temperature conditions. Semen with quite low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases in application of the kit. Finally, the tolerance of the kit against various acidic and basic environments was analyzed. The validations herein provide useful information for the practical applications of the SPERM HY-LITER™ Express kit, which were previously unobtainable. Moreover, the versatility of our image analysis method toward various complex cases was demonstrated.
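Objective, quantitative spot detection of this kind usually reduces to smoothing, thresholding, and connected-component analysis. A minimal scikit-image sketch on a synthetic fluorescence frame, standing in for, but not reproducing, the authors' established pipeline:

```python
import numpy as np
from skimage.filters import gaussian, threshold_otsu
from skimage.measure import label, regionprops

def count_spots(image, min_area=5):
    # Smooth, binarize with Otsu's threshold, and keep regions above a
    # minimum area to reject single-pixel noise.
    smoothed = gaussian(image, sigma=1.0)
    binary = smoothed > threshold_otsu(smoothed)
    return [r for r in regionprops(label(binary)) if r.area >= min_area]

# Synthetic frame: dark background with three bright "sperm head" spots
rng = np.random.default_rng(1)
frame = rng.normal(10, 2, (128, 128))
for y, x in [(30, 40), (64, 64), (100, 20)]:
    frame[y - 2:y + 3, x - 2:x + 3] += 120

spots = count_spots(frame)
print(len(spots), [tuple(np.round(s.centroid).astype(int)) for s in spots])
```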
Neutron-activation analysis applied to copper ores and artifacts
NASA Technical Reports Server (NTRS)
Linder, N. F.
1970-01-01
Neutron activation analysis is used for quantitative identification of trace metals in copper. Establishing a unique fingerprint of impurities in Michigan copper would enable identification of artifacts made from this copper.
Automated quantitative cytological analysis using portable microfluidic microscopy.
Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva
2016-06-01
In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DMD-based quantitative phase microscopy and optical diffraction tomography
NASA Astrophysics Data System (ADS)
Zhou, Renjie
2018-02-01
Digital micromirror devices (DMDs), which offer high speed and high degrees of freedom in steering light illuminations, have been increasingly applied to optical microscopy systems in recent years. Lately, we introduced DMDs into digital holography to enable new imaging modalities and break existing imaging limitations. In this paper, we will first present our progress in using DMDs for demonstrating laser-illumination Fourier ptychographic microscopy (FPM) with shot-noise-limited detection. After that, we will present a novel common-path quantitative phase microscopy (QPM) system based on using a DMD. Building on those early developments, a DMD-based high speed optical diffraction tomography (ODT) system has been recently demonstrated, and the results will also be presented. This ODT system is able to achieve video-rate 3D refractive-index imaging, which can potentially enable observations of high-speed 3D sample structural changes.
Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface
2017-02-01
ECBC-TR-1426; Vipin Rastogi … From 1. Introduction: Members of the U.S. Environmental … From 2.4, Experimental Design: Each quantitative method was performed three times on three consecutive days. For the CD runs, three …
2017-01-01
Recent advances in next-generation sequencing approaches have revolutionized our understanding of transcriptional expression in diverse systems. However, measurements of transcription do not necessarily reflect gene translation, the process of ultimate importance in understanding cellular function. To circumvent this limitation, biochemical tagging of ribosome subunits to isolate ribosome-associated mRNA has been developed. However, this approach, called TRAP, lacks quantitative resolution compared to a superior technology, ribosome profiling. Here, we report the development of an optimized ribosome profiling approach in Drosophila. We first demonstrate successful ribosome profiling from a specific tissue, larval muscle, with enhanced resolution compared to conventional TRAP approaches. We next validate the ability of this technology to define genome-wide translational regulation. This technology is leveraged to test the relative contributions of transcriptional and translational mechanisms in the postsynaptic muscle that orchestrate the retrograde control of presynaptic function at the neuromuscular junction. Surprisingly, we find no evidence that significant changes in the transcription or translation of specific genes are necessary to enable retrograde homeostatic signaling, implying that post-translational mechanisms ultimately gate instructive retrograde communication. Finally, we show that a global increase in translation induces adaptive responses in both transcription and translation of protein chaperones and degradation factors to promote cellular proteostasis. Together, this development and validation of tissue-specific ribosome profiling enables sensitive and specific analysis of translation in Drosophila. PMID:29194454
Evaluating resective surgery targets in epilepsy patients: A comparison of quantitative EEG methods.
Müller, Michael; Schindler, Kaspar; Goodfellow, Marc; Pollo, Claudio; Rummel, Christian; Steimer, Andreas
2018-07-15
Quantitative analysis of intracranial EEG is a promising tool to assist clinicians in the planning of resective brain surgery in patients suffering from pharmacoresistant epilepsies. Quantifying the accuracy of such tools, however, is nontrivial as a ground truth to verify predictions about hypothetical resections is missing. As one possibility to address this, we use customized hypotheses tests to examine the agreement of the methods on a common set of patients. One method uses machine learning techniques to enable the predictive modeling of EEG time series. The other estimates nonlinear interrelation between EEG channels. Both methods were independently shown to distinguish patients with excellent post-surgical outcome (Engel class I) from those without improvement (Engel class IV) when assessing the electrodes associated with the tissue that was actually resected during brain surgery. Using the AND and OR conjunction of both methods we evaluate the performance gain that can be expected when combining them. Both methods' assessments correlate strongly positively with the similarity between a hypothetical resection and the corresponding actual resection in class I patients. Moreover, the Spearman rank correlation between the methods' patient rankings is significantly positive. To the best of our knowledge, this is the first study comparing surgery target assessments from fundamentally differing techniques. Although conceptually completely independent, there is a relation between the predictions obtained from both methods. Their broad consensus supports their application in clinical practice to provide physicians with additional information in the process of presurgical evaluation. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
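Two of the quantitative comparisons described, rank agreement between the methods and the AND/OR conjunction of their assessments, are easy to reproduce in outline. A sketch with invented per-channel scores and an arbitrary 0.6 decision threshold:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-channel scores from the two independent methods
scores_model = np.array([0.91, 0.42, 0.77, 0.12, 0.66, 0.85])
scores_interrelation = np.array([0.80, 0.55, 0.70, 0.20, 0.49, 0.90])

rho, p = spearmanr(scores_model, scores_interrelation)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")

flag_a = scores_model > 0.6
flag_b = scores_interrelation > 0.6
print("AND (stricter, higher specificity):", flag_a & flag_b)
print("OR  (looser, higher sensitivity): ", flag_a | flag_b)
```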
Attene-Ramos, Matias S.; Huang, Ruili; Sakamuru, Srilatha; Witt, Kristine L.; Beeson, Gyda C.; Shou, Louie; Schnellmann, Rick G.; Beeson, Craig C.; Tice, Raymond R.; Austin, Christopher P.; Xia, Menghang
2014-01-01
A goal of the Tox21 program is to transit toxicity testing from traditional in vivo models to in vitro assays that assess how chemicals affect cellular responses and toxicity pathways. A critical contribution of the NIH Chemical Genomics Center (NCGC) to the Tox21 program is the implementation of a quantitative high throughput screening (qHTS) approach, using cell- and biochemical-based assays to generate toxicological profiles for thousands of environmental compounds. Here, we evaluated the effect of chemical compounds on mitochondrial membrane potential in HepG2 cells by screening a library of 1,408 compounds provided by the National Toxicology Program (NTP) in a qHTS platform. Compounds were screened over 14 concentrations, and results showed that 91 and 88 compounds disrupted mitochondrial membrane potential after treatment for 1 or 5 h, respectively. Seventy-six compounds active at both time points were clustered by structural similarity, producing 11 clusters and 23 singletons. Thirty-eight compounds covering most of the active chemical space were more extensively evaluated. Thirty-six of the 38 compounds were confirmed to disrupt mitochondrial membrane potential using a fluorescence plate reader and 35 were confirmed using a high content imaging approach. Among the 38 compounds, 4 and 6 induced LDH release, a measure of cytotoxicity, at 1 or 5 h, respectively. Compounds were further assessed for mechanism of action (MOA) by measuring changes in oxygen consumption rate, which enabled identification of 20 compounds as uncouplers. This comprehensive approach allows for evaluation of thousands of environmental chemicals for mitochondrial toxicity and identification of possible MOAs. PMID:23895456
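The qHTS readout behind such profiles is a concentration-response curve fit at each of the 14 test concentrations, typically with a four-parameter Hill equation. A sketch on synthetic data (all parameters and the noise level are invented):

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ac50, n):
    # Four-parameter Hill model; negative "top" = loss of signal
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** n)

conc = np.logspace(-9, -4, 14)                      # mol/L, 14-point series
rng = np.random.default_rng(2)
resp = hill(conc, 0, -85, 3e-7, 1.2) + rng.normal(0, 3, conc.size)

popt, _ = curve_fit(hill, conc, resp, p0=(0.0, -100, 1e-6, 1.0),
                    bounds=([-50, -150, 1e-10, 0.3], [50, 0, 1e-3, 4]))
print(f"AC50 = {popt[2]:.2e} M, efficacy = {popt[1]:.0f}%")
```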
Brossard-Racine, Marie; Mazer, Barbara; Julien, Marilyse; Majnemer, Annette
2012-01-01
In this study we sought to validate the discriminant ability of the Evaluation Tool of Children's Handwriting-Manuscript in identifying children in Grades 2-3 with handwriting difficulties and to determine the percentage of change in handwriting scores that is consistently detected by occupational therapists. Thirty-four therapists judged and compared 35 pairs of handwriting samples. Receiver operating characteristic (ROC) analyses were performed to determine (1) the optimal cutoff values for word and letter legibility scores that identify children with handwriting difficulties who should be seen in rehabilitation and (2) the minimal clinically important difference (MCID) in handwriting scores. Cutoff scores of 75.0% for total word legibility and 76.0% for total letter legibility were found to provide excellent levels of accuracy. A difference of 10.0%-12.5% for total word legibility and 6.0%-7.0% for total letter legibility were found as the MCID. Study findings enable therapists to quantitatively support clinical judgment when evaluating handwriting. Copyright © 2012 by the American Occupational Therapy Association, Inc.
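Cutoff selection of the kind reported is commonly done by maximizing the Youden index along the ROC curve; whether that was the authors' exact criterion is not stated here, and the legibility scores below are synthetic:

```python
import numpy as np
from sklearn.metrics import roc_curve

# Synthetic word-legibility scores (%) for typical writers and for
# children with handwriting difficulties.
rng = np.random.default_rng(3)
legibility = np.concatenate([rng.normal(85, 6, 40),    # typical writers
                             rng.normal(68, 8, 20)])   # difficulties
has_difficulty = np.concatenate([np.zeros(40), np.ones(20)])

# Lower legibility indicates difficulty, so use -legibility as the score
fpr, tpr, thresholds = roc_curve(has_difficulty, -legibility)
best = np.argmax(tpr - fpr)                            # Youden index J
print(f"optimal cutoff: {-thresholds[best]:.1f}% legibility")
print(f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```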
The EyeHarp: A Gaze-Controlled Digital Musical Instrument
Vamvakousis, Zacharias; Ramirez, Rafael
2016-01-01
We present and evaluate the EyeHarp, a new gaze-controlled Digital Musical Instrument, which aims to enable people with severe motor disabilities to learn, perform, and compose music using only their gaze as control mechanism. It consists of (1) a step-sequencer layer, which serves for constructing chords/arpeggios, and (2) a melody layer, for playing melodies and changing the chords/arpeggios. We have conducted a pilot evaluation of the EyeHarp involving 39 participants with no disabilities from both a performer and an audience perspective. In the first case, eight people with normal vision and no motor disability participated in a music-playing session in which both quantitative and qualitative data were collected. In the second case 31 people qualitatively evaluated the EyeHarp in a concert setting consisting of two parts: a solo performance part, and an ensemble (EyeHarp, two guitars, and flute) performance part. The obtained results indicate that, similarly to traditional music instruments, the proposed digital musical instrument has a steep learning curve, and enables expressive performances from both the performer and the audience perspective. PMID:27445885
Garbers, Samantha; Flandrick, Kathleen; Bermudez, Dayana; Meserve, Allison; Chiasson, Mary Ann
2014-11-01
Interventions to reduce unintended pregnancy through improved contraceptive use are a public health priority. A comprehensive process evaluation of a contraceptive assessment module intervention with demonstrated efficacy was undertaken. The 12-month process evaluation goal was to describe the extent to which the intervention was implemented as intended over time, and to identify programmatic adjustments to improve implementation fidelity. Quantitative and qualitative methods included staff surveys, electronic health record data, usage monitoring, and observations. Fidelity of implementation was low overall (<10% of eligible patients completed the entire module [dose received]). Although a midcourse correction making the module available in clinical areas led to increased dose delivered (23% vs. 30%, chi-square test p = .006), dose received did not increase significantly after this adjustment. Contextual factors including competing organizational and staff priorities and staff buy-in limited the level of implementation and precluded adoption of some strategies such as adjusting patient flow. Using a process evaluation framework enabled the research team to identify and address complexities inherent in effectiveness studies and facilitated the alignment of program and context. © 2014 Society for Public Health Education.
3D printed phantoms of retinal photoreceptor cells for evaluating adaptive optics imaging modalities
NASA Astrophysics Data System (ADS)
Kedia, Nikita; Liu, Zhuolin; Sochol, Ryan; Hammer, Daniel X.; Agrawal, Anant
2018-02-01
Adaptive optics-enabled optical coherence tomography (AO-OCT) and scanning laser ophthalmoscopy (AO-SLO) devices can resolve retinal cones and rods in three dimensions. To evaluate the improved resolution of AO-OCT and AO-SLO, a phantom that mimics retinal anatomy at the cellular level is required. We used a two-photon polymerization approach to fabricate three-dimensional (3D) photoreceptor phantoms modeled on the central foveal cones. By using a femtosecond laser to selectively photocure precise locations within a liquid-based photoresist via two-photon absorption, we produced high-resolution phantoms with μm-level dimensions similar to true anatomy. In this work, we present two phantoms to evaluate the resolution limits of an AO imaging system: one that models only the outer segments of the photoreceptor cells at varying retinal eccentricities and another that contains anatomically relevant features of the full-length photoreceptor. With these phantoms we are able to quantitatively estimate transverse resolution of an AO system and produce images that are comparable to those found in the human retina.
Uckermann, Ortrud; Sitoci-Ficici, Kerim H.; Later, Robert; Beiermeister, Rudolf; Doberenz, Falko; Gelinsky, Michael; Leipnitz, Elke; Schackert, Gabriele; Koch, Edmund; Sablinskas, Valdas; Steiner, Gerald; Kirsch, Matthias
2015-01-01
Spinal cord injury (SCI) induces complex biochemical changes, which result in inhibition of nervous tissue regeneration abilities. In this study, Fourier-transform infrared (FT-IR) spectroscopy was applied to assess the outcomes of implants made of a novel type of non-functionalized soft calcium alginate hydrogel in a rat model of spinal cord hemisection (n = 28). Using FT-IR spectroscopic imaging, we evaluated the stability of the implants and the effects on morphology and biochemistry of the injured tissue one and six months after injury. A semi-quantitative evaluation of the distribution of lipids and collagen showed that alginate significantly reduced injury-induced demyelination of the contralateral white matter and fibrotic scarring in the chronic state after SCI. The spectral information enabled to detect and localize the alginate hydrogel at the lesion site and proved its long-term persistence in vivo. These findings demonstrate a positive impact of alginate hydrogel on recovery after SCI and prove FT-IR spectroscopic imaging as alternative method to evaluate and optimize future SCI repair strategies. PMID:26559822
Risk manager formula for success: Influencing decision making.
Midgley, Mike
2017-10-01
Providing the ultimate decision makers with a quantitative risk analysis based on thoughtful assessment by the organization's experts enables an efficient decision. © 2017 American Society for Healthcare Risk Management of the American Hospital Association.
The application of time series models to cloud field morphology analysis
NASA Technical Reports Server (NTRS)
Chin, Roland T.; Jau, Jack Y. C.; Weinman, James A.
1987-01-01
A modeling method for the quantitative description of remotely sensed cloud field images is presented. A two-dimensional texture modeling scheme based on one-dimensional time series procedures is adopted for this purpose. The time series procedure used is the seasonal autoregressive, moving average (ARMA) process of Box and Jenkins. Cloud field properties such as directionality, clustering and cloud coverage can be retrieved by this method. It has been demonstrated that a cloud field image can be quantitatively defined by a small set of parameters and synthesized surrogates can be reconstructed from these model parameters. This method enables cloud climatology to be studied quantitatively.
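The core operation, summarizing an image texture with a small set of ARMA parameters, can be sketched by fitting a low-order model to one scan line of a synthetic cloud field; repeating this row by row (and column by column for directionality) yields the compact descriptor the paper describes. A statsmodels sketch with invented AR coefficients:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic scan line with AR(2) "cloud streak" correlation structure
rng = np.random.default_rng(4)
n = 256
row = np.zeros(n)
for i in range(2, n):
    row[i] = 1.2 * row[i - 1] - 0.4 * row[i - 2] + rng.normal()

fit = ARIMA(row, order=(2, 0, 1)).fit()   # ARMA(2,1) = ARIMA with d = 0
print(fit.params)   # const, AR/MA coefficients, innovation variance
print(fit.aic)      # criterion for selecting the model order
```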
Quantitative analysis of culture using millions of digitized books
Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman
2011-01-01
We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965
Qualitative and quantitative interpretation of SEM image using digital image processing.
Saladra, Dawid; Kopernik, Magdalena
2016-10-01
The aim of this study is to improve the qualitative and quantitative analysis of scanning electron microscopy (SEM) micrographs through the development of a computer program that enables automatic crack analysis of SEM micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs; the analysis must therefore be automatic. Tests of athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed, including microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of the qualitative analysis of SEM images was achieved with a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, the Laplacian 1 and Laplacian 2 filters, Otsu thresholding, and reverse binarization. Several modifications of known image processing techniques and combinations of selected techniques were applied. The introduced quantitative analysis of digital SEM images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed program with existing digital image processing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
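As a concrete illustration of one step in such a pipeline, the following sketch applies Otsu binarization and a skeleton-based crack-length estimate to a synthetic micrograph using scikit-image. The test image, the pixel calibration, and the use of skeleton pixel counts as a crack-length proxy are illustrative assumptions, not the authors' program.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label
from skimage.morphology import skeletonize

rng = np.random.default_rng(1)

# Stand-in for an SEM micrograph: dark crack-like defects on a bright,
# noisy background.
img = rng.normal(0.8, 0.05, (256, 256))
img[100:104, 20:230] = 0.2
img[30:180, 60:64] = 0.25

# Otsu thresholding separates the dark cracks from the bright substrate.
mask = img < threshold_otsu(img)

# A one-pixel-wide skeleton: its pixel count approximates total crack length.
skel = skeletonize(mask)
px_per_um = 4.0                                  # hypothetical calibration
total_len_um = skel.sum() / px_per_um
area_um2 = mask.size / px_per_um**2
print(f"crack count: {label(mask).max()}, "
      f"length per unit area: {total_len_um / area_um2:.4f} um/um^2")
```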
Shuga, Joe; Zeng, Yong; Novak, Richard; Lan, Qing; Tang, Xiaojiang; Rothman, Nathaniel; Vermeulen, Roel; Li, Laiyu; Hubbard, Alan; Zhang, Luoping; Mathies, Richard A; Smith, Martyn T
2013-09-01
Cancers are heterogeneous and genetically unstable. New methods are needed that provide the sensitivity and specificity to query single cells at the genetic loci that drive cancer progression, thereby enabling researchers to study the progression of individual tumors. Here, we report the development and application of a bead-based hemi-nested microfluidic droplet digital PCR (dPCR) technology to achieve 'quantitative' measurement and single-molecule sequencing of somatically acquired carcinogenic translocations at extremely low levels (<10⁻⁶) in healthy subjects. We use this technique in our healthy study population to determine the overall concentration of the t(14;18) translocation, which is strongly associated with follicular lymphoma. The nested dPCR approach improves the detection limit to 1×10⁻⁷ or lower while maintaining the analysis efficiency and specificity. Further, the bead-based dPCR enabled us to isolate and quantify the relative amounts of the various clonal forms of t(14;18) translocation in these subjects, and the single-molecule sensitivity and resolution of dPCR led to the discovery of new clonal forms of t(14;18) that were otherwise masked by the conventional quantitative PCR measurements. In this manner, we created a quantitative map for this carcinogenic mutation in this healthy population and identified the positions on chromosomes 14 and 18 where the vast majority of these t(14;18) events occur.
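Quantification in droplet digital PCR rests on Poisson statistics: if a fraction p of droplets scores positive, the mean number of targets per droplet is λ = −ln(1 − p). A minimal sketch applying that correction, with invented droplet counts and an assumed droplet volume:

```python
import math

# Hypothetical droplet counts from a single bead-based dPCR run.
n_droplets = 2_000_000
n_positive = 3                        # rare translocation-positive droplets

p = n_positive / n_droplets
lam = -math.log(1.0 - p)              # mean targets per droplet (Poisson)

droplet_volume_nl = 0.9               # assumed droplet volume
copies_per_ul = lam / (droplet_volume_nl * 1e-3)
print(f"lambda = {lam:.2e} targets/droplet, "
      f"~{copies_per_ul:.4f} copies/uL of partitioned sample")
```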
Yao, Yingyi; Guo, Weisheng; Zhang, Jian; Wu, Yudong; Fu, Weihua; Liu, Tingting; Wu, Xiaoli; Wang, Hanjie; Gong, Xiaoqun; Liang, Xing-Jie; Chang, Jin
2016-09-07
Ultrasensitive and quantitative fast screening of cancer biomarkers by immunochromatography test strip (ICTS) is still challenging in clinic. The gold nanoparticles (NPs) based ICTS with colorimetric readout enables a quick spectrum screening but suffers from nonquantitative performance; although ICTS with fluorescence readout (FICTS) allows quantitative detection, its sensitivity still deserves more efforts and attentions. In this work, by taking advantages of colorimetric ICTS and FICTS, we described a reverse fluorescence enhancement ICTS (rFICTS) with bimodal signal readout for ultrasensitive and quantitative fast screening of carcinoembryonic antigen (CEA). In the presence of target, gold NPs aggregation in T line induced colorimetric readout, allowing on-the-spot spectrum screening in 10 min by naked eye. Meanwhile, the reverse fluorescence enhancement signal enabled more accurately quantitative detection with better sensitivity (5.89 pg/mL for CEA), which is more than 2 orders of magnitude lower than that of the conventional FICTS. The accuracy and stability of the rFICTS were investigated with more than 100 clinical serum samples for large-scale screening. Furthermore, this rFICTS also realized postoperative monitoring by detecting CEA in a patient with colon cancer and comparing with CT imaging diagnosis. These results indicated this rFICTS is particularly suitable for point-of-care (POC) diagnostics in both resource-rich and resource-limited settings.
Quantitative analysis of periodontal pathogens by ELISA and real-time polymerase chain reaction.
Hamlet, Stephen M
2010-01-01
The development of analytical methods enabling the accurate identification and enumeration of bacterial species colonizing the oral cavity has led to the identification of a small number of bacterial pathogens that are major factors in the etiology of periodontal disease. Further, these methods also underpin more recent epidemiological analyses of the impact of periodontal disease on general health. Given the complex milieu of over 700 species of microorganisms known to exist within the complex biofilms found in the oral cavity, the identification and enumeration of oral periodontopathogens has not been an easy task. In recent years, however, some of the intrinsic limitations of the more traditional microbiological analyses previously used have been overcome with the advent of immunological and molecular analytical methods. Of the plethora of methodologies reported in the literature, the enzyme-linked immunosorbent assay (ELISA), which combines the specificity of antibody with the sensitivity of simple enzyme assays, and the polymerase chain reaction (PCR) have been widely utilized in both laboratory and clinical applications. Although conventional PCR does not allow quantitation of the target organism, real-time PCR (rtPCR) has the ability to detect amplicons as they accumulate in "real time", allowing subsequent quantitation. These methods enable the accurate quantitation of as few as 10² (using rtPCR) to 10⁴ (using ELISA) periodontopathogens in dental plaque samples.
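Absolute quantitation by rtPCR is typically performed against a standard curve: the threshold cycle Ct falls linearly with log10 of the starting copy number, and the slope encodes the amplification efficiency. A minimal sketch with invented calibrator data:

```python
import numpy as np

# Hypothetical calibrator series: known copy numbers and measured Ct values.
copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0   # 1.0 would mean perfect doubling

# Invert the standard curve to quantify an unknown plaque sample.
ct_unknown = 27.5
copies_unknown = 10.0 ** ((ct_unknown - intercept) / slope)
print(f"efficiency = {efficiency:.1%}, unknown ~ {copies_unknown:.0f} copies")
```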
Curran, V R; Hoekman, T; Gulliver, W; Landells, I; Hatcher, L
2000-01-01
Over the years, various distance learning technologies and methods have been applied to the continuing medical education needs of rural and remote physicians. They have included audio teleconferencing, slow-scan imaging, correspondence study, and compressed videoconferencing. The recent emergence and growth of Internet, World Wide Web (Web), and compact disk read-only-memory (CD-ROM) technologies have introduced new opportunities for providing continuing education to the rural medical practitioner. This evaluation study assessed the instructional effectiveness of a hybrid computer-mediated courseware delivery system on dermatologic office procedures. A hybrid delivery system merges Web documents, multimedia, computer-mediated communications, and CD-ROMs to enable self-paced instruction and collaborative learning. Using a modified pretest-posttest control group design, several evaluative criteria (participant reaction, learning achievement, self-reported performance change, and instructional transactions) were assessed by various qualitative and quantitative data collection methods. This evaluation revealed that a hybrid computer-mediated courseware system was an effective means for increasing knowledge (p < .05) and improving self-reported competency (p < .05) in dermatologic office procedures, and that participants were very satisfied with the self-paced instruction and use of asynchronous computer conferencing for collaborative information sharing among colleagues.
Cocks, Errol; Boaden, Ross
2009-10-01
The Individual Placement and Support (IPS) model aims to achieve open employment for people with mental illness. The Supported Employment Fidelity Scale (SEFS) is a 15-item instrument that evaluates the extent to which a service follows the IPS principles of best practice. This paper describes the IPS model and an evaluation of a specialist employment program for people with mental illness using the SEFS. The SEFS enabled a quantitative assessment of service provision against the criteria of evidence-based practice principles. Data were collected from multiple sources. In addition, a literature review was conducted, and personnel engaged in implementation of the IPS model at other Australian employment programs were consulted. The program achieved a score of 59 out of a possible 75 on the SEFS, which is described as fair supported employment. Analysis of the 15 scale items resulted in the identification of strengths, areas for further development, and a set of recommendations. The program was operating substantially in line with evidence-based practice principles and had considerable scope for further development. Issues arising from the evaluation, areas of applicability of the SEFS and the underlying literature, and implications for occupational therapy are highlighted.
Peters, Susan; Vermeulen, Roel; Olsson, Ann; Van Gelder, Rainer; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Williams, Nick; Woldbæk, Torill; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Dahmann, Dirk; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans
2012-01-01
SYNERGY is a large pooled analysis of case-control studies on the joint effects of occupational carcinogens and smoking in the development of lung cancer. A quantitative job-exposure matrix (JEM) will be developed to assign exposures to five major lung carcinogens [asbestos, chromium, nickel, polycyclic aromatic hydrocarbons (PAH), and respirable crystalline silica (RCS)]. We assembled an exposure database, called ExpoSYN, to enable such a quantitative exposure assessment. Existing exposure databases were identified and European and Canadian research institutes were approached to identify pertinent exposure measurement data. Results of individual air measurements were entered anonymized according to a standardized protocol. The ExpoSYN database currently includes 356 551 measurements from 19 countries. In total, 140 666 personal and 215 885 stationary data points were available. Measurements were distributed over the five agents as follows: RCS (42%), asbestos (20%), chromium (16%), nickel (15%), and PAH (7%). The measurement data cover the time period from 1951 to present. However, only a small portion of measurements (1.4%) were performed prior to 1975. The major contributing countries for personal measurements were Germany (32%), UK (22%), France (14%), and Norway and Canada (both 11%). ExpoSYN is a unique occupational exposure database with measurements from 18 European countries and Canada covering a time period of >50 years. This database will be used to develop a country-, job-, and time period-specific quantitative JEM. This JEM will enable data-driven quantitative exposure assessment in a multinational pooled analysis of community-based lung cancer case-control studies.
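A country-, job-, and period-specific JEM of the kind described amounts to aggregating exposure measurements over the matrix axes. The sketch below does this with pandas on an invented miniature table; the column schema, the decade binning, and the use of geometric means for lognormal exposure data are assumptions for illustration, not the ExpoSYN specification.

```python
import numpy as np
import pandas as pd

# Invented miniature measurement table; the ExpoSYN schema is assumed.
df = pd.DataFrame({
    "agent":   ["RCS", "RCS", "asbestos", "RCS", "RCS"],
    "country": ["DE", "DE", "UK", "NO", "DE"],
    "job":     ["miner", "miner", "insulator", "tunneler", "miner"],
    "year":    [1978, 1985, 1972, 1991, 1979],
    "conc":    [0.12, 0.08, 1.50, 0.05, 0.10],   # units differ by agent
})
df["period"] = (df["year"] // 10) * 10           # decade as the time axis

# One JEM cell per (agent, country, job, period); geometric means are the
# conventional summary for lognormally distributed exposure data.
log_mean = (np.log(df["conc"])
            .groupby([df["agent"], df["country"], df["job"], df["period"]])
            .mean())
jem = np.exp(log_mean)
print(jem)
```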
Infiltration and Selective Interactions at the Interface in Polymer-Oxide Hybrid Solar Cells
NASA Astrophysics Data System (ADS)
Ferragut, R.; Aghion, S.; Moia, F.; Binda, M.; Canesi, E. V.; Lanzani, G.; Petrozza, A.
2013-06-01
Positron annihilation spectroscopy was used to characterize polymer-based hybrid solar cells formed by poly(3-hexylthiophene) (P3HT) finely infiltrated into a porous TiO2 skeleton. A step-change improvement in device performance is enabled by engineering the hybrid interface through the insertion of a suitable molecular interlayer, namely 4-mercaptopyridine (4-MP). In order to obtain depth-resolved data, positrons were implanted in the sample using a variable-energy positron beam. The characteristics of the partially filled nanoporous structures were evaluated in terms of the depth profile of the positronium yield and the S-parameter. A quantitative evaluation of the pore filling in the deep region is given from the analysis of Coincidence Doppler Broadening taken at fixed implantation energy. We note a remarkable difference in the positronium yield when the 4-MP interlayer is introduced, indicating better coverage of the porous surface by P3HT.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagayama, T.; Bailey, J. E.; Loisel, G. P.
Iron opacity calculations presently disagree with measurements at an electron temperature of ~180–195 eV and an electron density of (2–4)×10²² cm⁻³, conditions similar to those at the base of the solar convection zone. The measurements use x rays to volumetrically heat a thin iron sample that is tamped with low-Z materials. The opacity is inferred from spectrally resolved x-ray transmission measurements. Plasma self-emission, tamper attenuation, and temporal and spatial gradients can all potentially cause systematic errors in the measured opacity spectra. In this article we quantitatively evaluate these potential errors with numerical investigations. The analysis exploits computer simulations that were previously found to reproduce the experimentally measured plasma conditions. The simulations, combined with a spectral synthesis model, enable evaluations of individual and combined potential errors in order to estimate their effects on the opacity measurement. Lastly, the results show that the errors considered here do not account for the previously observed model-data discrepancies.
Optofluidic time-stretch quantitative phase microscopy.
Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke
2018-03-01
Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One such innovation is optofluidic time-stretch quantitative phase microscopy - an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses, using the dispersive properties of light in both spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves the continuous acquisition of both intensity and phase images with a high throughput of more than 10,000 particles or cells per second by overcoming the speed limitations of conventional quantitative phase imaging methods. Applications enabled by such capabilities are versatile and include characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future perspective. Copyright © 2017 Elsevier Inc. All rights reserved.
Design and analysis issues in quantitative proteomics studies.
Karp, Natasha A; Lilley, Kathryn S
2007-09-01
Quantitative proteomics is the comparison of distinct proteomes, enabling the identification of protein species that exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure that valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses appropriate for the various experimental strategies that can be employed in quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and discusses various issues surrounding the design of experiments. The concepts and examples discussed within will show how robust design and analysis lead to confident results, ensuring that quantitative proteomics delivers on its promise.
Distance-based microfluidic quantitative detection methods for point-of-care testing.
Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James
2016-04-07
Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the length of a visual signal that corresponds to the target concentration, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translating molecular signals into distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.
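Reading a concentration off a distance-based device reduces to fitting a calibration curve and inverting it. A minimal sketch assuming a linear distance-concentration relationship with invented calibration points; real devices may require a log-linear or saturating fit:

```python
import numpy as np

# Hypothetical calibration: signal length (mm) versus concentration (uM).
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
dist = np.array([3.1, 6.0, 11.8, 29.5, 58.0])

slope, intercept = np.polyfit(conc, dist, 1)

def concentration_from_distance(d_mm: float) -> float:
    """Invert the linear calibration to estimate target concentration."""
    return (d_mm - intercept) / slope

print(f"20.0 mm -> {concentration_from_distance(20.0):.2f} uM")
```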
Quantitative proteomics in biological research.
Wilm, Matthias
2009-10-01
Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.
Uncertainty Analysis for Angle Calibrations Using Circle Closure
Estler, W. Tyler
1998-01-01
We analyze two types of full-circle angle calibrations: a simple closure in which a single set of unknown angular segments is sequentially compared with an unknown reference angle, and a dual closure in which two divided circles are simultaneously calibrated by intercomparison. In each case, the constraint of circle closure provides auxiliary information that (1) enables a complete calibration process without reference to separately calibrated reference artifacts, and (2) serves to reduce measurement uncertainty. We derive closed-form expressions for the combined standard uncertainties of angle calibrations, following guidelines published by the International Organization for Standardization (ISO) and NIST. The analysis includes methods for the quantitative evaluation of the standard uncertainty of small angle measurement using electronic autocollimators, including the effects of calibration uncertainty and air turbulence. PMID:28009359
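The simple closure works because the unknown segments must sum to a full circle: writing δᵢ for the measured difference between segment i and the uncalibrated reference angle R, closure gives Σ(R + δᵢ) = 360°, which fixes R without any external artifact. A numerical sketch with invented autocollimator readings (the per-reading uncertainty σ is likewise illustrative):

```python
import numpy as np

# Hypothetical autocollimator differences (arcsec) between each of 12
# nominally 30-degree segments and the single uncalibrated reference angle.
delta = np.array([0.8, -1.2, 0.3, 1.9, -0.5, 0.1,
                  -2.0, 0.7, 1.1, -0.9, 0.4, -0.6])
n = delta.size

# Closure: n*R + sum(delta) = 360 deg, so R follows with no external artifact.
R_arcsec = (360 * 3600 - delta.sum()) / n
segments = R_arcsec + delta                      # calibrated segment angles

# Each segment estimate mixes all n readings, so its standard uncertainty
# is sigma*sqrt((n-1)/n), slightly below the per-reading uncertainty sigma.
sigma = 0.5                                      # illustrative, in arcsec
u_segment = sigma * np.sqrt((n - 1) / n)
print(f"R = {R_arcsec / 3600:.6f} deg, u(segment) = {u_segment:.2f} arcsec")
```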
Tumor-targeted nanomedicines for cancer theranostics
Lammers, Twan; Shi, Yang
2017-01-01
Chemotherapeutic drugs have multiple drawbacks, including severe side effects and suboptimal therapeutic efficacy. Nanomedicines assist in improving the biodistribution and the target accumulation of chemotherapeutic drugs, and are therefore able to enhance the balance between efficacy and toxicity. Multiple different types of nanomedicines have been evaluated over the years, including liposomes, polymer-drug conjugates and polymeric micelles, which rely on strategies such as passive targeting, active targeting and triggered release for improved tumor-directed drug delivery. Based on the notion that tumors and metastases are highly heterogeneous, it is important to integrate imaging properties in nanomedicine formulations in order to enable non-invasive and quantitative assessment of targeting efficiency. By allowing for patient pre-selection, such next generation nanotheranostics are useful for facilitating clinical translation and personalizing nanomedicine treatments. PMID:27865762
Modeling of power transmission and stress grading for corona protection
NASA Astrophysics Data System (ADS)
Zohdi, T. I.; Abali, B. E.
2017-11-01
High-voltage (HV) electrical machines are prone to corona discharges, which lead to power losses as well as damage to the insulating layer. Many different techniques are applied for corona protection, and computational methods aid in selecting the best design. In this paper we develop a reduced-order 1D model estimating the electric field and temperature distribution of a conductor wrapped in different layers, as is usual for HV machines. Many assumptions and simplifications are undertaken for this 1D model; we therefore compare its results quantitatively to a direct numerical simulation in 3D. Both models are transient and nonlinear, allowing a quick estimate in 1D or a full computation in 3D at correspondingly higher computational cost. Such tools enable understanding, evaluation, and optimization of corona shielding systems for multilayered coils.
Functional magnetic resonance imaging in oncology: state of the art.
Guimaraes, Marcos Duarte; Schuch, Alice; Hochhegger, Bruno; Gross, Jefferson Luiz; Chojniak, Rubens; Marchiori, Edson
2014-01-01
In the investigation of tumors with conventional magnetic resonance imaging, both quantitative characteristics, such as size, edema, necrosis, and presence of metastases, and qualitative characteristics, such as contrast enhancement degree, are taken into consideration. However, changes in cell metabolism and tissue physiology which precede morphological changes cannot be detected by the conventional technique. The development of new magnetic resonance imaging techniques has enabled the functional assessment of the structures in order to obtain information on the different physiological processes of the tumor microenvironment, such as oxygenation levels, cellularity and vascularity. The detailed morphological study in association with the new functional imaging techniques allows for an appropriate approach to cancer patients, including the phases of diagnosis, staging, response evaluation and follow-up, with a positive impact on their quality of life and survival rate.
OIPAV: an integrated software system for ophthalmic image processing, analysis and visualization
NASA Astrophysics Data System (ADS)
Zhang, Lichun; Xiang, Dehui; Jin, Chao; Shi, Fei; Yu, Kai; Chen, Xinjian
2018-03-01
OIPAV (Ophthalmic Images Processing, Analysis and Visualization) is a cross-platform software system specially oriented to ophthalmic images. It provides a wide range of functionalities including data I/O, image processing, interaction, ophthalmic disease detection, data analysis and visualization to help researchers and clinicians deal with various ophthalmic images such as optical coherence tomography (OCT) images and color fundus photographs. It enables users to easily access ophthalmic image data produced by different imaging devices, facilitates workflows for processing ophthalmic images, and improves quantitative evaluation. In this paper, we present the system design and functional modules of the platform and demonstrate various applications. With satisfying scalability and expandability, we believe the software can be widely applied in the ophthalmology field.
Maintenance = reuse-oriented software development
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1989-01-01
Maintenance is viewed as a reuse process. In this context, a set of models that can be used to support the maintenance process is discussed. A high level reuse framework is presented that characterizes the object of reuse, the process for adapting that object for its target application, and the reused object within its target application. Based upon this framework, a qualitative comparison is offered of the three maintenance process models with regard to their strengths and weaknesses and the circumstances in which they are appropriate. To provide a more systematic, quantitative approach for evaluating the appropriateness of the particular maintenance model, a measurement scheme is provided, based upon the reuse framework, in the form of an organized set of questions that need to be answered. To support the reuse perspective, a set of reuse enablers are discussed.
Differentiation of Normal and Malignant Breast Tissues using Infrared Spectroscopy
NASA Astrophysics Data System (ADS)
Mehrotra, Ranjana; Jangir, Deepak Kumar; Gupta, Alka; Kandpal, H. C.
2008-11-01
Infrared spectra of carcinomatous and their normal fore-bearing tissues were collected in the 600 cm⁻¹ to 4000 cm⁻¹ region. Fourier Transform Infrared (FTIR) data of infiltrating ductal carcinoma of the breast with different grades of malignancy from patients of different age groups were analyzed. The infrared spectra demonstrate significant spectral differences between sections of normal and malignant breast tissue. In particular, changes in frequency and intensity in the spectra of protein, nucleic acid and glycogen were observed. This allows a qualitative and semi-quantitative evaluation of the changes in proliferative activity from normal to diseased tissue. The findings establish a framework for additional studies, which may enable a relation between the diseased state and its infrared spectra to be established.
Xiao, Xia; Feng, Ya-Ping; Du, Bin; Sun, Han-Ru; Ding, You-Quan; Qi, Jian-Guo
2017-03-01
Fluorescent immunolabeling and imaging in free-floating thick (50-60 μm) tissue sections is relatively simple in practice and enables design-based non-biased stereology, or 3-D reconstruction and analysis. This method is widely used for 3-D in situ quantitative biology in many areas of biological research. However, the labeling quality and efficiency of standard protocols for fluorescent immunolabeling of these tissue sections are not always satisfactory. Here, we systematically evaluate the effects of raising the conventional antibody incubation temperatures (4°C or 21°C) to mammalian body temperature (37°C) in these protocols. Our modification significantly enhances the quality (labeling sensitivity, specificity, and homogeneity) and efficiency (antibody concentration and antibody incubation duration) of fluorescent immunolabeling of free-floating thick tissue sections.
Urea Biosynthesis Using Liver Slices
ERIC Educational Resources Information Center
Teal, A. R.
1976-01-01
Presented is a practical scheme to enable introductory biology students to investigate the mechanism by which urea is synthesized in the liver. The tissue-slice technique is discussed, and methods for the quantitative analysis of metabolites are presented. (Author/SL)
Enhanced culvert inspections - best practices guidebook : final report.
DOT National Transportation Integrated Search
2017-06-01
Culvert inspection is a key enabler that allows MnDOT to manage the state's highway culvert system. When quantitative detail on culvert condition is required, an inspector will need to use enhanced inspection technologies. Enhanced inspection techn...
NASA Astrophysics Data System (ADS)
Cabrera Fernandez, Delia; Salinas, Harry M.; Somfai, Gabor; Puliafito, Carmen A.
2006-03-01
Optical coherence tomography (OCT) is a rapidly emerging medical imaging technology. In ophthalmology, OCT is a powerful tool because it enables visualization of the cross-sectional structure of the retina and anterior eye with higher resolution than any other non-invasive imaging modality. Furthermore, OCT image information can be quantitatively analyzed, enabling objective assessment of features such as macular edema and diabetic retinopathy. We present specific improvements in the quantitative analysis of the OCT system by combining the diffusion equation with the free Schrödinger equation. In such a formulation, important features of the image can be extracted by extending the analysis from the real axis to the complex domain. Experimental results indicate that our proposed novel approach performs well in speckle noise removal, enhancement, and segmentation of the various cellular layers of the retina using the OCT system.
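Combining the diffusion equation with the free Schrödinger equation amounts to diffusion with a complex-valued coefficient. Below is a minimal 2-D sketch of nonlinear complex diffusion in that spirit; the scheme, step sizes, and synthetic test image are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def complex_diffusion(img, steps=40, dt=0.2, theta=np.pi / 30, k=2.0):
    """Nonlinear complex diffusion: for small theta the imaginary part acts
    as a smoothed edge detector, and the coefficient c throttles smoothing
    across edges so that layer boundaries survive the denoising."""
    u = img.astype(complex)
    c0 = np.exp(1j * theta)
    for _ in range(steps):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        c = c0 / (1.0 + (u.imag / (k * theta)) ** 2)
        u = u + dt * c * lap
    return u.real

rng = np.random.default_rng(5)
clean = np.zeros((128, 128))
clean[:, 64:] = 1.0                   # a step edge, like a layer boundary
noisy = clean + rng.normal(scale=0.4, size=clean.shape)

rmse = lambda a: float(np.sqrt(((a - clean) ** 2).mean()))
print(f"RMSE before: {rmse(noisy):.3f}, after: {rmse(complex_diffusion(noisy)):.3f}")
```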
A real-time monitoring platform of myogenesis regulators using double fluorescent labeling
Sapoznik, Etai; Niu, Guoguang; Zhou, Yu; Prim, Peter M.; Criswell, Tracy L.
2018-01-01
Real-time, quantitative measurement of muscle progenitor cell (myoblast) differentiation is an important tool for skeletal muscle research and for the identification of drugs that support skeletal muscle regeneration. While most quantitative tools rely on a sacrificial approach, we developed a double fluorescent tagging approach, which allows for dynamic monitoring of myoblast differentiation through assessment of fusion index and nuclei count. Fluorescent tagging of both the cell cytoplasm and nucleus enables monitoring of cell fusion and the formation of new myotube fibers, similar to immunostaining results. This labeling approach allowed monitoring of the effects of Myf5 overexpression, TNFα, and a Wnt agonist on myoblast differentiation. It also enabled testing the effects of surface coating on the fusion levels of scaffold-seeded myoblasts. Double fluorescent labeling of myoblasts is a promising technique for visualizing even minor changes in myogenesis, supporting applications such as tissue engineering and drug screening. PMID:29444187
Let's push things forward: disruptive technologies and the mechanics of tissue assembly.
Varner, Victor D; Nelson, Celeste M
2013-09-01
Although many of the molecular mechanisms that regulate tissue assembly in the embryo have been delineated, the physical forces that couple these mechanisms to actual changes in tissue form remain unclear. Qualitative studies suggest that mechanical loads play a regulatory role in development, but clear quantitative evidence has been lacking. This is partly owing to the complex nature of these problems - embryonic tissues typically undergo large deformations and exhibit evolving, highly viscoelastic material properties. Still, despite these challenges, new disruptive technologies are enabling study of the mechanics of tissue assembly in unprecedented detail. Here, we present novel experimental techniques that enable the study of each component of these physical problems: kinematics, forces, and constitutive properties. Specifically, we detail advances in light sheet microscopy, optical coherence tomography, traction force microscopy, fluorescence force spectroscopy, microrheology and micropatterning. Taken together, these technologies are helping elucidate a more quantitative understanding of the mechanics of tissue assembly.
Broadband quantitative NQR for authentication of vitamins and dietary supplements
NASA Astrophysics Data System (ADS)
Chen, Cheng; Zhang, Fengchao; Bhunia, Swarup; Mandal, Soumyajit
2017-05-01
We describe hardware, pulse sequences, and algorithms for nuclear quadrupole resonance (NQR) spectroscopy of medicines and dietary supplements. Medicine and food safety is a pressing problem that has drawn increasing attention. NQR is an ideal technique for authenticating these substances because it is a non-invasive method for chemical identification. We have recently developed a broadband NQR front-end that can excite and detect ¹⁴N NQR signals over a wide frequency range; its operating frequency can be rapidly set by software, while sensitivity is comparable to conventional narrowband front-ends over the entire range. This front-end improves the accuracy of authentication by enabling multiple-frequency experiments. We have also developed calibration and signal processing techniques to convert measured NQR signal amplitudes into nuclear spin densities, thus enabling its use as a quantitative technique. Experimental results from several samples are used to illustrate the proposed methods.
Numerical Investigation of Vertical Plunging Jet Using a Hybrid Multifluid–VOF Multiphase CFD Solver
Shonibare, Olabanji Y.; Wardle, Kent E.
2015-06-28
A novel hybrid multiphase flow solver has been used to conduct simulations of a vertical plunging liquid jet. This solver combines a multifluid methodology with selective interface sharpening to enable simulation of both the initial jet impingement and the long-time entrained bubble plume phenomena. Models are implemented for variable bubble size capturing and dynamic switching of interface sharpened regions to capture transitions between the initially fully segregated flow types into the dispersed bubbly flow regime. It was found that the solver was able to capture the salient features of the flow phenomena under study, and areas for quantitative improvement have been explored and identified. In particular, a population balance approach is employed and detailed calibration of the underlying models with experimental data is required to enable quantitative prediction of bubble size and distribution to capture the transition between segregated and dispersed flow types with greater fidelity.
A toolbox to explore the mechanics of living embryonic tissues
Campàs, Otger
2016-01-01
The sculpting of embryonic tissues and organs into their functional morphologies involves the spatial and temporal regulation of mechanics at cell and tissue scales. Decades of in vitro work, complemented by some in vivo studies, have shown the relevance of mechanical cues in the control of cell behaviors that are central to developmental processes, but the lack of methodologies enabling precise, quantitative measurements of mechanical cues in vivo have hindered our understanding of the role of mechanics in embryonic development. Several methodologies are starting to enable quantitative studies of mechanics in vivo and in situ, opening new avenues to explore how mechanics contributes to shaping embryonic tissues and how it affects cell behavior within developing embryos. Here we review the present methodologies to study the role of mechanics in living embryonic tissues, considering their strengths and drawbacks as well as the conditions in which they are most suitable. PMID:27061360
Investigating rate-limiting barriers to nanoscale nonviral gene transfer with nanobiophotonics
NASA Astrophysics Data System (ADS)
Chen, Hunter H.
Nucleic acids are a novel class of therapeutics poised to address many unmet clinical needs. Safe and efficient delivery remains a significant challenge that has delayed the realization of the full therapeutic potential of nucleic acids. Nanoscale nonviral vectors offer an attractive alternative to viral vectors as natural and synthetic polymers or polypeptides may be rationally designed to meet the unique demands of individual applications. A mechanistic understanding of cellular barriers is necessary to develop guidelines for designing custom gene carriers which are expected to greatly impact this delivery challenge. The work herein focused on the relationships among nanocomplex stability, intracellular trafficking and unpacking kinetics, and DNA degradation. Ultrasensitive nanosensors based on QD-FRET were developed to characterize the biophysical properties of nanocomplexes and study these rate-limiting steps. Quantitative image analysis enabled the distributions of the subpopulation of condensed or released DNA to be determined within the major cellular compartments encountered during gene transfer. The steady state stability and unpacking kinetics within these compartments were found to impact transgene expression, elucidating multiple design strategies to achieve efficient gene transfer. To address enzymatic barriers, a novel two-step QD-FRET nanosensor was developed to analyze unpacking and DNA degradation simultaneously, which has not been accomplished previously. Bioresponsive strategies such as disulfide crosslinking and thermosensitivity were evaluated by QD-FRET and quantitative compartmental analysis as case studies to determine appropriate design specifications for thiolated polymers and thermoresponsive polypeptides. Relevant nanobiophotonic tools were developed as a platform to study major rate-limiting barriers to nanomedicine and demonstrated the feasibility of using mechanistic information gained from these tools to guide the rational design of gene carriers and achieve the desired properties that enable efficient gene transfer.
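The QD-FRET readout used throughout this work is governed by the Förster relation E = 1/(1 + (r/R₀)⁶): transfer efficiency collapses once the donor-acceptor separation r exceeds the Förster radius R₀, which is what makes the DNA condensation state optically measurable. A minimal sketch with illustrative distances and radius:

```python
def fret_efficiency(r_nm: float, r0_nm: float = 6.0) -> float:
    """Forster transfer efficiency at donor-acceptor distance r_nm.
    r0_nm is an illustrative Forster radius, not a measured QD-dye value."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

# Condensed nanocomplex (donor and acceptor close) versus released DNA (far).
for state, r in [("condensed", 4.0), ("released", 12.0)]:
    print(f"{state}: E = {fret_efficiency(r):.2f}")
```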
Hyperspectral imaging with near-infrared-enabled mobile phones for tissue oximetry
NASA Astrophysics Data System (ADS)
Lin, Jonathan L.; Ghassemi, Pejhman; Chen, Yu; Pfefer, Joshua
2018-02-01
Hyperspectral reflectance imaging (HRI) is an emerging clinical tool for characterizing spatial and temporal variations in blood perfusion and oxygenation for applications such as burn assessment, wound healing, retinal exams and intraoperative tissue viability assessment. Since clinical HRI-based oximeters often use near-infrared (NIR) light, NIR-enabled mobile phones may provide a useful platform for future point-of-care devices. Furthermore, quantitative NIR imaging on mobile phones may dramatically increase the availability and accessibility of medical diagnostics in low-resource settings. We have evaluated the potential for phone-based NIR oximetry imaging and elucidated factors affecting performance using devices from two different manufacturers, as well as a scientific CCD. A broadband light source and liquid crystal tunable filter were used for imaging at 10 nm bands from 650 to 1000 nm. Spectral sensitivity measurements indicated that mobile phones with standard NIR blocking filters had minimal response beyond 700 nm, whereas one modified phone showed sensitivity to 800 nm and another to 1000 nm. Red pixel channels showed the greatest sensitivity up to 800 nm, whereas all channels provided essentially equivalent sensitivity at longer wavelengths. Referencing of blood oxygenation levels was performed with a CO-oximeter. HRI measurements were performed using cuvettes filled with hemoglobin solutions of different oxygen saturation levels. Good agreement between absorbance spectra measured with the mobile phone and CCD cameras was seen for wavelengths below 900 nm. Saturation estimates showed root-mean-square errors of 5.2% and 4.5% for the CCD and phone, respectively. Overall, this work provides strong evidence of the potential for mobile phones to provide quantitative spectral imaging in the NIR for applications such as oximetry, and generates practical insights into factors that impact performance as well as test methods for performance assessment.
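Saturation estimates of this kind typically come from least-squares unmixing of the measured absorbance spectrum into oxy- and deoxyhemoglobin contributions via the Beer-Lambert law. A minimal sketch; the extinction coefficients below are placeholders rather than tabulated values:

```python
import numpy as np

wavelengths = [660, 730, 810, 850]     # nm, illustrative imaging bands

# Placeholder extinction coefficients for [HbO2, Hb] at those bands;
# real analyses use tabulated molar extinction spectra.
eps = np.array([[0.08, 0.33],
                [0.10, 0.11],
                [0.20, 0.18],
                [0.27, 0.16]])

# Simulated absorbance of a 70%-saturated sample plus measurement noise.
c_true = np.array([0.7, 0.3])          # relative [HbO2, Hb]
rng = np.random.default_rng(2)
absorbance = eps @ c_true + rng.normal(scale=0.002, size=len(wavelengths))

# Least-squares unmixing; saturation is HbO2 / (HbO2 + Hb).
c_hat, *_ = np.linalg.lstsq(eps, absorbance, rcond=None)
print(f"estimated SO2 ~ {c_hat[0] / c_hat.sum():.1%}")
```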
Zhang, Jing; Wang, Chenchen; Ji, Li; Liu, Weiping
2016-05-16
According to the electrophilic theory in toxicology, many chemical carcinogens in the environment and/or their active metabolites are electrophiles that exert their effects by forming covalent bonds with nucleophilic DNA centers. The theory of hard and soft acids and bases (HSAB), which states that a toxic electrophile reacts preferentially with a biological macromolecule of similar hardness or softness, clarifies the underlying chemistry involved in this critical event. Epoxides are hard electrophiles that are produced endogenously by the enzymatic oxidation of parent chemicals (e.g., alkenes and PAHs). Epoxide ring opening proceeds through an SN2-type mechanism, with hard nucleophilic DNA sites as the major facilitators of toxic effects. Thus, the quantitative prediction of chemical reactivity would enable a predictive assessment of the molecular potential to exert electrophile-mediated toxicity. In this study, we calculated the activation energies for reactions between epoxides and the guanine N7 site for a diverse set of epoxides, including aliphatic epoxides, substituted styrene oxides, and PAH epoxides, using a state-of-the-art density functional theory (DFT) method. Notably, these activation energies for diverse epoxides can be further predicted by quantum-chemically calculated nucleophilic indices from HSAB theory, a far less computationally demanding method than the exacting procedure of locating the transition state. More importantly, the good qualitative/quantitative correlations between the chemical reactivity of epoxides and their bioactivity suggest that the developed model based on HSAB theory may aid in the predictive hazard evaluation of epoxides, enabling the early identification of mutagenicity/carcinogenicity-relevant SN2 reactivity.
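The nucleophilic indices invoked here come from conceptual DFT: with frontier orbital energies, the chemical potential is μ = (E_HOMO + E_LUMO)/2, the hardness is η = (E_LUMO − E_HOMO)/2, and Parr's global electrophilicity is ω = μ²/(2η). A minimal sketch with illustrative orbital energies:

```python
# Conceptual-DFT reactivity indices from frontier orbital energies (eV).
# The orbital energies below are illustrative, not results from the paper.

def reactivity_indices(e_homo: float, e_lumo: float) -> dict:
    mu = (e_homo + e_lumo) / 2.0       # chemical potential
    eta = (e_lumo - e_homo) / 2.0      # chemical hardness (HSAB)
    return {
        "mu": mu,
        "eta": eta,
        "softness": 1.0 / eta,
        "omega": mu**2 / (2.0 * eta),  # Parr global electrophilicity
    }

for name, (homo, lumo) in {
    "aliphatic epoxide (hypothetical)": (-10.6, 1.2),
    "styrene oxide (hypothetical)": (-9.1, 0.4),
}.items():
    print(name, {k: round(v, 3) for k, v in reactivity_indices(homo, lumo).items()})
```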
Pang, Wai-Man; Qin, Jing; Lu, Yuqiang; Xie, Yongming; Chui, Chee-Kong; Heng, Pheng-Ann
2011-03-01
Our aim is to accelerate the simultaneous algebraic reconstruction technique (SART) with motion compensation for fast, high-quality computed tomography reconstruction by exploiting a CUDA-enabled GPU. Two core techniques are proposed to fit SART into the CUDA architecture: (1) a ray-driven projection along with hardware trilinear interpolation, and (2) a voxel-driven back-projection that can avoid redundant computation by using CUDA shared memory. We utilize the independence of each ray and voxel in both techniques to design CUDA kernels that represent a ray in the projection and a voxel in the back-projection, respectively; thus, significant parallelization and a substantial performance boost can be achieved. For motion compensation, we rectify each ray's direction during the projection and back-projection stages based on a known motion vector field. Extensive experiments demonstrate that the proposed techniques provide faster reconstruction without compromising image quality. The processing rate is nearly 100 projections s⁻¹, about 150 times faster than a CPU-based SART. The reconstructed image is compared against ground truth visually and quantitatively by peak signal-to-noise ratio (PSNR) and line profiles. We further evaluate the reconstruction quality using quantitative metrics such as signal-to-noise ratio (SNR) and mean-square error (MSE). All these reveal that satisfactory results are achieved. The effects of major parameters such as the ray sampling interval and the relaxation parameter are also investigated in a series of experiments. A simulated dataset is used to test the effectiveness of our motion compensation technique. The results demonstrate that our reconstructed volume can eliminate undesirable artifacts such as blurring. Our proposed method has the potential to realize instantaneous presentation of a 3D CT volume to physicians once the projection data are acquired.
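Stripped of the GPU specifics, one SART sweep corrects every voxel by back-projecting ray residuals normalized by per-ray and per-voxel weight sums. A minimal NumPy sketch on a dense toy system; the matrix, problem sizes, and relaxation parameter are illustrative, and the paper's CUDA kernels parallelize the analogous per-ray and per-voxel loops:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy linear system: A maps a flattened "volume" x to projection data b.
n_rays, n_voxels = 120, 64
A = rng.random((n_rays, n_voxels))
x_true = rng.random(n_voxels)
b = A @ x_true

row_sum = A.sum(axis=1)        # per-ray weights (projection normalization)
col_sum = A.sum(axis=0)        # per-voxel weights (back-projection norm.)

x = np.zeros(n_voxels)
lam = 0.5                      # relaxation parameter, illustrative
for _ in range(200):
    residual = (b - A @ x) / row_sum          # ray-driven projection step
    x += lam * (A.T @ residual) / col_sum     # voxel-driven back-projection

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3e}")
```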
Ma, Lina; Liu, Fuyao; Lei, Zhen; Wang, Zhenxin
2017-01-15
Herein, a novel upconversion@polydopamine core@shell nanoparticle (termed UCNP@PDA NP) -based aptameric biosensor has been fabricated for the quantitative analysis of cytochrome c (Cyt c) inside living cells, which comprises an UCNP@PDA NP, acting as an internal reference and fluorescence quenching agent, and a Cy3-modified aptamer enabling ratiometric quantitative Cyt c measurement. After hybridization of the Cy3-labeled aptamer with amino-terminated single-stranded DNA on the UCNP@PDA NP surface (termed UCNP@PDA@AP), the fluorescence of Cy3 can be efficiently quenched by the PDA shell. With the spontaneous cellular uptake of UCNP@PDA@AP, the Cyt c aptamer dissociates from the UCNP@PDA NP surface through formation of an aptamer-Cyt c complex, resulting in concomitant activation of the Cy3 fluorescence. A high amount of Cyt c leads to high fluorescence emission, enabling direct visualization/measurement of Cyt c by fluorescence microscopy/spectroscopy. The steady upconversion luminescence (UCL) signals can be employed not only for intracellular imaging, but also as an internal reference for evaluating the intracellular Cyt c amount using the ratio of the fluorescence intensity of Cy3 to the UCL intensity of the UCNP. The UCNP@PDA@AP shows a reasonable detection limit (20 nM) and a large dynamic range (50 nM to 10 μM, which covers the literature-reported values (1–10 μM) for cytosolic Cyt c in apoptotic cells) for detecting Cyt c in buffer with excellent selectivity. In addition, the UCNP@PDA@AP has been successfully used to monitor etoposide-induced intracellular release of Cyt c, providing the possibility for cell-based screening of apoptosis-inducing drugs. Copyright © 2016 Elsevier B.V. All rights reserved.
Jiřík, Miroslav; Bartoš, Martin; Tomášek, Petr; Malečková, Anna; Kural, Tomáš; Horáková, Jana; Lukáš, David; Suchý, Tomáš; Kochová, Petra; Hubálek Kalbáčová, Marie; Králíčková, Milena; Tonar, Zbyněk
2018-06-01
Quantification of the structure and composition of biomaterials using micro-CT requires image segmentation due to the low contrast and overlapping radiopacity of biological materials. The amount of bias introduced by segmentation procedures is generally unknown. We aim to develop software that generates three-dimensional models of fibrous and porous structures with known volumes, surfaces, lengths, and object counts, and to provide a software tool that calibrates quantitative micro-CT assessments. Virtual image stacks were generated using the newly developed software TeIGen, enabling the simulation of micro-CT scans of unconnected tubes, connected tubes, and porosities. A realistic noise generator was incorporated. Forty image stacks were evaluated using micro-CT, and the error between the true known and estimated data was quantified. Starting with geometric primitives, the error of the numerical estimation of surfaces and volumes was eliminated, thereby enabling the quantification of volumes and surfaces of colliding objects. Analysis of the sensitivity of thresholding to the parameters of the generated test image sets revealed the effects of decreasing resolution and increasing noise on the accuracy of micro-CT quantification. The size of the error increased with decreasing resolution when the voxel size exceeded 1/10 of the typical object size, which simulated the effect of the smallest details that could still be reliably quantified. Open-source software for calibrating quantitative micro-CT assessments by producing and saving virtually generated image data sets with known morphometric data was made freely available to researchers involved in morphometry of three-dimensional fibrillar and porous structures in micro-CT scans. © 2018 Wiley Periodicals, Inc.
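The calibration idea can be reproduced in a few lines: generate a volume containing an object of analytically known volume, degrade it with noise, then compare the thresholded estimate with ground truth. A minimal sketch (not the TeIGen code) for a single cylinder with assumed noise and threshold levels:

```python
import numpy as np

n = 128
_zz, yy, xx = np.meshgrid(np.arange(n), np.arange(n), np.arange(n),
                          indexing="ij")

# Ground-truth phantom: a cylinder of radius r along z, volume pi*r^2*n.
r = 20.0
phantom = (((xx - n / 2) ** 2 + (yy - n / 2) ** 2) <= r**2).astype(float)
true_volume = np.pi * r**2 * n

# Simulated degradation: additive noise only (no blur, for brevity).
rng = np.random.default_rng(4)
noisy = phantom + rng.normal(scale=0.3, size=phantom.shape)

# Fixed-threshold segmentation and the volume bias it introduces.
estimated_volume = float((noisy > 0.5).sum())
bias = (estimated_volume - true_volume) / true_volume
print(f"true {true_volume:.0f} vox, estimated {estimated_volume:.0f} vox, "
      f"bias {bias:+.1%}")
```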
Quantitative PLIF Imaging in High-Pressure Combustion
NASA Technical Reports Server (NTRS)
Hanson, R. K.
1997-01-01
This is the final report for a research project aimed at developing planar laser-induced fluorescence (PLIF) techniques for quantitative 2-D species imaging in fuel-lean, high-pressure combustion gases, relevant to modern aircraft gas turbine combustors. The program involved both theory and experiment. The theoretical activity led to spectroscopic models that allow calculation of the laser-induced fluorescence produced in OH, NO and O2 for arbitrary excitation wavelength, pressure, temperature, gas mixture and laser linewidth. These spectroscopic models incorporate new information on line-broadening, energy transfer and electronic quench rates. Extensive calculations have been made with these models in order to identify optimum excitation strategies, particularly for detecting low levels (ppm) of NO in the presence of large O2 mole fractions (10% is typical for the fuel-lean combustion of interest). A promising new measurement concept has emerged from these calculations, namely that excitation at specific wavelengths, together with detection of fluorescence in multiple spectral bands, promises to enable simultaneous detection of both NO (at ppm levels) and O2, or possibly NO, O2 and temperature. Calculations have been made to evaluate the expected performance of such a diagnostic for a variety of conditions and choices of excitation and detection wavelengths. The experimental effort began with assembly of a new high-pressure combustor to provide controlled high-temperature and high-pressure combustion products. The non-premixed burner enables access to postflame gases at high temperatures (to 2000 K) and high pressures (to 13 atm), and a range of fuel-air equivalence ratios. The chamber also allowed use of a sampling probe, for chemiluminescent detection of NO/NO2, and thermocouples for measurement of gas temperature. Experiments were conducted to confirm the spectroscopic models for OH, NO and O2.
Quantitative modeling of the reaction/diffusion kinetics of two-chemistry photopolymers
NASA Astrophysics Data System (ADS)
Kowalski, Benjamin Andrew
Optically driven diffusion in photopolymers is an appealing material platform for a broad range of applications, in which the recorded refractive index patterns serve either as images (e.g. data storage, display holography) or as optical elements (e.g. custom GRIN components, integrated optical devices). A quantitative understanding of the reaction/diffusion kinetics is difficult to obtain directly, but is nevertheless necessary in order to fully exploit the wide array of design freedoms in these materials. A general strategy for characterizing these kinetics is proposed, in which key processes are decoupled and independently measured. This strategy enables prediction of a material's potential refractive index change, solely on the basis of its chemical components. The degree to which a material does not reach this potential reveals the fraction of monomer that has participated in unwanted reactions, reducing spatial resolution and dynamic range. This approach is demonstrated for a model material similar to commercial media, achieving quantitative predictions of index response over three orders of exposure dose (~1 to ~10³ mJ cm⁻²) and three orders of feature size (0.35 to 500 microns). The resulting insights enable guided, rational design of new material formulations with demonstrated performance improvement.
Liu, Tian; Liu, Jing; de Rochefort, Ludovic; Spincemaille, Pascal; Khalidov, Ildar; Ledoux, James Robert; Wang, Yi
2011-09-01
Magnetic susceptibility varies among brain structures and provides insights into the chemical and molecular composition of brain tissues. However, the determination of an arbitrary susceptibility distribution from the measured MR signal phase is a challenging, ill-conditioned inverse problem. Although a previous method named calculation of susceptibility through multiple orientation sampling (COSMOS) has solved this inverse problem both theoretically and experimentally using multiple-angle acquisitions, it is often impractical to carry out on human subjects. Recently, the feasibility of calculating the brain susceptibility distribution from a single-angle acquisition was demonstrated using morphology enabled dipole inversion (MEDI). In this study, we further improved the original MEDI method by sparsifying the edges in the quantitative susceptibility map that do not have a corresponding edge in the magnitude image. Quantitative susceptibility maps generated by the improved MEDI were compared qualitatively and quantitatively with those generated by COSMOS. The results show a high degree of agreement between MEDI and COSMOS, and the practicality of MEDI allows many potential clinical applications. Copyright © 2011 Wiley-Liss, Inc.
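The ill-conditioning has a concrete form: in k-space, the measured field equals the susceptibility map multiplied by the unit dipole kernel D(k) = 1/3 − kz²/|k|², which vanishes on a cone and therefore cannot be inverted directly. A minimal sketch of the forward model and a naive thresholded k-space division (TKD), a much simpler stand-in for the morphology-regularized MEDI solve; the threshold is illustrative:

```python
import numpy as np

n = 64
k = np.fft.fftfreq(n)
kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
k2 = kx**2 + ky**2 + kz**2
k2[0, 0, 0] = np.inf                   # avoid division by zero at DC

# Unit dipole kernel in k-space (B0 along z); it vanishes on a cone.
D = 1.0 / 3.0 - kz**2 / k2

# Forward model: field map = IFFT(D * FFT(susceptibility)).
chi = np.zeros((n, n, n))
chi[24:40, 24:40, 24:40] = 1.0         # toy susceptibility inclusion
field = np.fft.ifftn(D * np.fft.fftn(chi)).real

# Naive inversion: clamp |D| away from zero before dividing (TKD).
t = 0.15                               # illustrative truncation threshold
mask = np.abs(D) > t
D_inv = np.zeros_like(D)
D_inv[mask] = 1.0 / D[mask]
D_inv[~mask] = np.sign(D[~mask]) / t
chi_hat = np.fft.ifftn(D_inv * np.fft.fftn(field)).real
print(f"voxelwise RMSE: {np.sqrt(np.mean((chi_hat - chi) ** 2)):.3f}")
```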
Kinetic Modeling of Accelerated Stability Testing Enabled by Second Harmonic Generation Microscopy.
Song, Zhengtian; Sarkar, Sreya; Vogt, Andrew D; Danzer, Gerald D; Smith, Casey J; Gualtieri, Ellen J; Simpson, Garth J
2018-04-03
The low limits of detection afforded by second harmonic generation (SHG) microscopy coupled with image analysis algorithms enabled quantitative modeling of the temperature-dependent crystallization of active pharmaceutical ingredients (APIs) within amorphous solid dispersions (ASDs). ASDs, in which an API is maintained in an amorphous state within a polymer matrix, are finding increasing use to address solubility limitations of small-molecule APIs. Extensive stability testing is typically performed for ASD characterization, the time frame for which is often dictated by the earliest detectable onset of crystal formation. Here a study of accelerated stability testing on ritonavir, a human immunodeficiency virus (HIV) protease inhibitor, has been conducted. Under accelerated stability testing conditions of 50 °C/75% RH and 40 °C/75% RH, ritonavir crystallization kinetics from amorphous solid dispersions were monitored by SHG microscopy. SHG microscopy coupled with image analysis yielded limits of detection for ritonavir crystals as low as 10 ppm, which is about 2 orders of magnitude lower than other methods currently available for crystallinity detection in ASDs. The four-decade dynamic range of SHG microscopy enabled quantitative modeling with an established (JMAK) kinetic model. From the SHG images, nucleation and crystal growth rates were independently determined.
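The JMAK (Johnson-Mehl-Avrami-Kolmogorov) model cited above expresses the crystallized fraction as X(t) = 1 - exp(-k t^n). Below is a minimal sketch of fitting that model to SHG-derived crystallinity data; the dataset, time units and starting guesses are hypothetical, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def jmak(t, k, n):
    """JMAK/Avrami model: crystallized fraction as a function of time."""
    return 1.0 - np.exp(-k * t**n)

# Hypothetical crystallinity fractions extracted from SHG images over time
t_hours = np.array([0.5, 1, 2, 4, 8, 16, 32])
x_frac = np.array([1e-5, 5e-5, 4e-4, 3e-3, 2e-2, 1.2e-1, 4.5e-1])

(k, n), _ = curve_fit(jmak, t_hours, x_frac, p0=(1e-4, 2.0))
print(f"rate constant k = {k:.3g} per h^n, Avrami exponent n = {n:.2f}")
```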
Automated reagent-dispensing system for microfluidic cell biology assays.
Ly, Jimmy; Masterman-Smith, Michael; Ramakrishnan, Ravichandran; Sun, Jing; Kokubun, Brent; van Dam, R Michael
2013-12-01
Microscale systems that enable measurements of oncological phenomena at the single-cell level have a great capacity to improve therapeutic strategies and diagnostics. Such measurements can reveal unprecedented insights into cellular heterogeneity and its implications into the progression and treatment of complicated cellular disease processes such as those found in cancer. We describe a novel fluid-delivery platform to interface with low-cost microfluidic chips containing arrays of microchambers. Using multiple pairs of needles to aspirate and dispense reagents, the platform enables automated coating of chambers, loading of cells, and treatment with growth media or other agents (e.g., drugs, fixatives, membrane permeabilizers, washes, stains, etc.). The chips can be quantitatively assayed using standard fluorescence-based immunocytochemistry, microscopy, and image analysis tools, to determine, for example, drug response based on differences in protein expression and/or activation of cellular targets on an individual-cell level. In general, automation of fluid and cell handling increases repeatability, eliminates human error, and enables increased throughput, especially for sophisticated, multistep assays such as multiparameter quantitative immunocytochemistry. We report the design of the automated platform and compare several aspects of its performance to manually-loaded microfluidic chips.
The role of 3-D interactive visualization in blind surveys of H I in galaxies
NASA Astrophysics Data System (ADS)
Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Oosterloo, T. A.; Ramatsoku, M.; Verheijen, M. A. W.
2015-09-01
Upcoming H I surveys will deliver large datasets, and automated processing using the full 3-D information (two positional dimensions and one spectral dimension) to find and characterize H I objects is imperative. In this context, visualization is an essential tool for enabling qualitative and quantitative human control on an automated source finding and analysis pipeline. We discuss how Visual Analytics, the combination of automated data processing and human reasoning, creativity and intuition, supported by interactive visualization, enables flexible and fast interaction with the 3-D data, helping the astronomer to deal with the analysis of complex sources. 3-D visualization, coupled to modeling, provides additional capabilities that aid the discovery and analysis of subtle structures in the 3-D domain. The requirements for a fully interactive visualization tool are: coupled 1-D/2-D/3-D visualization, quantitative and comparative capabilities, combined with supervised semi-automated analysis. Moreover, the source code must have the following characteristics for enabling collaborative work: open, modular, well documented, and well maintained. We review four state-of-the-art 3-D visualization packages, assessing their capabilities and feasibility for use in the case of 3-D astronomical data.
Quantitative modelling of amyloidogenic processing and its influence by SORLA in Alzheimer's disease
Schmidt, Vanessa; Baum, Katharina; Lao, Angelyn; Rateitschak, Katja; Schmitz, Yvonne; Teichmann, Anke; Wiesner, Burkhard; Petersen, Claus Munck; Nykjaer, Anders; Wolf, Jana; Wolkenhauer, Olaf; Willnow, Thomas E
2012-01-01
The extent of proteolytic processing of the amyloid precursor protein (APP) into neurotoxic amyloid-β (Aβ) peptides is central to the pathology of Alzheimer's disease (AD). Accordingly, modifiers that increase Aβ production rates are risk factors in the sporadic form of AD. In a novel systems biology approach, we combined quantitative biochemical studies with mathematical modelling to establish a kinetic model of amyloidogenic processing, and to evaluate the influence by SORLA/SORL1, an inhibitor of APP processing and important genetic risk factor. Contrary to previous hypotheses, our studies demonstrate that secretases represent allosteric enzymes that require cooperativity by APP oligomerization for efficient processing. Cooperativity enables swift adaptive changes in secretase activity with even small alterations in APP concentration. We also show that SORLA prevents APP oligomerization both in cultured cells and in the brain in vivo, eliminating the preferred form of the substrate and causing secretases to switch to a less efficient non-allosteric mode of action. These data represent the first mathematical description of the contribution of genetic risk factors to AD substantiating the relevance of subtle changes in SORLA levels for amyloidogenic processing as proposed for patients carrying SORL1 risk alleles. PMID:21989385
Belmonte, Frances R; Martin, James L; Frescura, Kristin; Damas, Joana; Pereira, Filipe; Tarnopolsky, Mark A; Kaufman, Brett A
2016-04-28
Mitochondrial DNA (mtDNA) mutations are a common cause of primary mitochondrial disorders, and have also been implicated in a broad collection of conditions, including aging, neurodegeneration, and cancer. Prevalent among these pathogenic variants are mtDNA deletions, which show a strong bias for the loss of sequence in the major arc between, but not including, the heavy and light strand origins of replication. Because individual mtDNA deletions can accumulate focally, occur with multiple mixed breakpoints, and in the presence of normal mtDNA sequences, methods that detect broad-spectrum mutations with enhanced sensitivity and limited costs have both research and clinical applications. In this study, we evaluated semi-quantitative and digital PCR-based methods of mtDNA deletion detection using double-stranded reference templates or biological samples. Our aim was to describe key experimental assay parameters that will enable the analysis of low levels or small differences in mtDNA deletion load during disease progression, with limited false-positive detection. We determined that the digital PCR method significantly improved mtDNA deletion detection sensitivity through absolute quantitation, improved precision and reduced assay standard error.
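For readers unfamiliar with digital PCR quantitation, the core arithmetic is a Poisson correction of the positive-partition fraction; the deletion load is then the ratio of deletion-specific to total mtDNA concentrations. A minimal sketch with hypothetical partition counts (not data from this study):

```python
import math

def dpcr_copies_per_partition(n_positive, n_total):
    """Poisson-corrected mean copies per partition: lambda = -ln(1 - p)."""
    p = n_positive / n_total
    return -math.log(1.0 - p)

# Hypothetical counts from a deletion-specific assay and a total-mtDNA assay
lam_deletion = dpcr_copies_per_partition(n_positive=312, n_total=20000)
lam_total = dpcr_copies_per_partition(n_positive=15500, n_total=20000)

deletion_load = lam_deletion / lam_total
print(f"mtDNA deletion load: {100 * deletion_load:.2f}%")
```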
Saldanha, J; Silvy, M; Beaufils, N; Arlinghaus, R; Barbany, G; Branford, S; Cayuela, J-M; Cazzaniga, G; Gonzalez, M; Grimwade, D; Kairisto, V; Miyamura, K; Lawler, M; Lion, T; Macintyre, E; Mahon, F-X; Muller, M C; Ostergaard, M; Pfeifer, H; Saglio, G; Sawyers, C; Spinelli, O; van der Velden, V H J; Wang, J Q; Zoi, K; Patel, V; Phillips, P; Matejtschuk, P; Gabert, J
2007-07-01
Monitoring of BCR-ABL transcripts has become established practice in the management of chronic myeloid leukemia. However, nucleic acid amplification techniques are prone to variations which limit the reliability of real-time quantitative PCR (RQ-PCR) for clinical decision making, highlighting the need for standardization of assays and reporting of minimal residual disease (MRD) data. We evaluated a lyophilized preparation of a leukemic cell line (K562) as a potential quality control reagent. This was found to be relatively stable, yielding comparable respective levels of ABL, GUS and BCR-ABL transcripts as determined by RQ-PCR before and after accelerated degradation experiments as well as following 5 years of storage at -20 °C. Vials of freeze-dried cells were sent at ambient temperature to 22 laboratories on four continents, with RQ-PCR analyses detecting BCR-ABL transcripts at levels comparable to those observed in primary patient samples. Our results suggest that freeze-dried cells can be used as quality control reagents with a range of analytical instrumentations and could enable the development of urgently needed international standards simulating clinically relevant levels of MRD.
Spectral imaging of histological and cytological specimens
NASA Astrophysics Data System (ADS)
Rothmann, Chana; Malik, Zvi
1999-05-01
Evaluation of cell morphology by bright field microscopy is the pillar of histopathological diagnosis. The need for quantitative and objective parameters for diagnosis has given rise to the development of morphometric methods. The development of spectral imaging for biological and medical applications introduced both fields to large amounts of information extracted from a single image. Spectroscopic analysis is based on the ability of a stained histological specimen to absorb, reflect, or emit photons in ways characteristic to its interactions with specific dyes. Spectral information obtained from a histological specimen is stored in a data cube, so named because it combines the two spatial dimensions of a flat sample (x and y) with a third dimension, the spectrum, representing the light intensity at every wavelength. The spectral information stored in the cube can be further processed by morphometric analysis and quantitative procedures. Such a procedure is spectral-similarity mapping (SSM), which enables the demarcation of areas occupied by the same type of material. SSM constructs new images of the specimen, revealing areas with similar stain-macromolecule characteristics and enhancing subcellular features. Spectral imaging combined with SSM reveals nuclear organization through the differentiation stages as well as in apoptotic and necrotic conditions and identifies specifically the nucleoli domains.
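The spectral-similarity mapping step can be illustrated as a per-pixel comparison of each spectrum against a reference spectrum. The sketch below uses cosine similarity as the metric; the abstract does not specify SSM's actual similarity measure, so the metric, threshold and array shapes are assumptions.

```python
import numpy as np

def ssm_map(cube, ref_spectrum, threshold=0.98):
    """Demarcate pixels whose spectra resemble a reference spectrum.

    cube: (ny, nx, n_wavelengths) spectral image cube
    ref_spectrum: (n_wavelengths,) spectrum of the material of interest
    """
    flat = cube.reshape(-1, cube.shape[-1]).astype(float)
    ref = ref_spectrum / np.linalg.norm(ref_spectrum)
    norms = np.linalg.norm(flat, axis=1, keepdims=True)
    sims = (flat / np.maximum(norms, 1e-12)) @ ref   # cosine similarity per pixel
    return (sims >= threshold).reshape(cube.shape[:2])

# Usage: map all pixels spectrally similar to the pixel at (10, 10)
cube = np.random.rand(64, 64, 30)                    # synthetic stand-in cube
mask = ssm_map(cube, cube[10, 10], threshold=0.99)
```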
Dynamic markers of altered gait rhythm in amyotrophic lateral sclerosis
NASA Technical Reports Server (NTRS)
Hausdorff, J. M.; Lertratanakul, A.; Cudkowicz, M. E.; Peterson, A. L.; Kaliton, D.; Goldberger, A. L.
2000-01-01
Amyotrophic lateral sclerosis (ALS) is a disorder marked by loss of motoneurons. We hypothesized that subjects with ALS would have an altered gait rhythm, with an increase in both the magnitude of the stride-to-stride fluctuations and perturbations in the fluctuation dynamics. To test for this locomotor instability, we quantitatively compared the gait rhythm of subjects with ALS with that of normal controls and with that of subjects with Parkinson's disease (PD) and Huntington's disease (HD), pathologies of the basal ganglia. Subjects walked for 5 min at their usual pace wearing an ankle-worn recorder that enabled determination of the duration of each stride and of stride-to-stride fluctuations. We found that the gait of patients with ALS is less steady and more temporally disorganized compared with that of healthy controls. In addition, advanced ALS, HD, and PD were associated with certain common, as well as apparently distinct, features of altered stride dynamics. Thus stride-to-stride control of gait rhythm is apparently compromised with ALS. Moreover, a matrix of markers based on gait dynamics may be useful in characterizing certain pathologies of motor control and, possibly, in quantitatively monitoring disease progression and evaluating therapeutic interventions.
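The two properties hypothesized above, fluctuation magnitude and fluctuation dynamics, can be given simple quantitative stand-ins: the coefficient of variation of stride times and a lag-1 autocorrelation. This is a generic sketch, not the study's exact metrics (which involved more sophisticated dynamical measures):

```python
import numpy as np

def stride_fluctuation_metrics(stride_times_s):
    """Fluctuation magnitude (CV, %) and a crude temporal-organization proxy."""
    x = np.asarray(stride_times_s, dtype=float)
    cv = 100.0 * x.std(ddof=1) / x.mean()    # stride-to-stride variability
    z = x - x.mean()
    r1 = (z[:-1] @ z[1:]) / (z @ z)          # lag-1 autocorrelation
    return cv, r1

cv, r1 = stride_fluctuation_metrics([1.05, 1.12, 0.98, 1.20, 1.03, 1.15])
print(f"stride-time CV = {cv:.1f}%, lag-1 autocorrelation = {r1:.2f}")
```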
Is the child 'father of the man'? evaluating the stability of genetic influences across development.
Ronald, Angelica
2011-11-01
This selective review considers findings in genetic research that have shed light on how genes operate across development. We will address the question of whether the child is 'father of the Man' from a genetic perspective. In other words, do the same genetic influences affect the same traits across development? Using a 'taster menu' approach and prioritizing newer findings on cognitive and behavioral traits, examples from the following genetic disciplines will be discussed: (a) developmental quantitative genetics (such as longitudinal twin studies), (b) neurodevelopmental genetic syndromes with known genetic causes (such as Williams syndrome), (c) developmental candidate gene studies (such as those that link infant and adult populations), (d) developmental genome-wide association studies (GWAS), and (e) DNA resequencing. Evidence presented here suggests that there is considerable genetic stability of cognitive and behavioral traits across development, but there is also evidence for genetic change. Quantitative genetic studies have a long history of assessing genetic continuity and change across development. It is now time for the newer, more technology-enabled fields such as GWAS and DNA resequencing also to take on board the dynamic nature of human behavior. 2011 Blackwell Publishing Ltd.
Design and evaluation of a miniature laser speckle imaging device to assess gingival health
Regan, Caitlin; White, Sean M.; Yang, Bruce Y.; Takesh, Thair; Ho, Jessica; Wink, Cherie; Wilder-Smith, Petra; Choi, Bernard
2016-01-01
Current methods used to assess gingivitis are qualitative and subjective. We hypothesized that gingival perfusion measurements could provide a quantitative metric of disease severity. We constructed a compact laser speckle imaging (LSI) system that could be mounted in custom-made oral molds. Rigid fixation of the LSI system in the oral cavity enabled measurement of blood flow in the gingiva. In vitro validation performed in controlled flow phantoms demonstrated that the compact LSI system had comparable accuracy and linearity compared to a conventional bench-top LSI setup. In vivo validation demonstrated that the compact LSI system was capable of measuring expected blood flow dynamics during a standard postocclusive reactive hyperemia and that the compact LSI system could be used to measure gingival blood flow repeatedly without significant variation in measured blood flow values (p<0.05). Finally, compact LSI system measurements were collected from the interdental papilla of nine subjects and compared to a clinical assessment of gingival bleeding on probing. A statistically significant correlation (ρ=0.53; p<0.005) was found between these variables, indicating that quantitative gingival perfusion measurements performed using our system may aid in the diagnosis and prognosis of periodontal disease. PMID:27787545
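Laser speckle imaging converts raw speckle frames to flow maps via the local speckle contrast K = σ/⟨I⟩; moving blood blurs the speckle and lowers K, and 1/K² is a common relative flow index. A minimal sketch (the window size and flow index are generic choices, not the device's documented processing):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw_frame, win=7):
    """Local speckle contrast K = std/mean over a sliding window."""
    img = raw_frame.astype(float)
    mean = uniform_filter(img, win)
    mean_sq = uniform_filter(img**2, win)
    var = np.clip(mean_sq - mean**2, 0.0, None)
    return np.sqrt(var) / np.maximum(mean, 1e-12)

K = speckle_contrast(np.random.rand(256, 256))   # synthetic stand-in frame
flow_index = 1.0 / np.maximum(K, 1e-6) ** 2      # higher = faster flow
```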
Assessing return on investment of defined-population disease management interventions.
Wilson, Thomas W; Gruen, Jeff; William, Thar; Fetterolf, Donald; Minalkumar, Patel; Popiel, Richard G; Lewis, Al; Nash, David B
2004-11-01
Strategies to reduce health expenditures through the improvement of health and quality of care are in high demand. A group of experts formed a nonpartisan, independent work group, under the sponsorship of the National Managed Health Care Congress. Its goal was to establish a list of easy-to-understand, actionable, and usable recommendations to enable disease management program advocates to conduct basic-level evaluations. The work group made recommendations concerning identification of reference and intervention population, population definitions, quantitative methods and data quality, confounding and bias, and stakeholder agreements/contracting. A case study was created to quantitatively illustrate some of the major issues raised by the work group. Five typical errors were simulated by applying different rules to the intervention population than to the reference population: differential inclusion (high versus low risk), differential exclusion (high versus low risk) and differential claims run-out. Compared with the true impact, four of the five errors resulted in a bias toward "intervention effect," while one (differential inclusion of high-risk patients) was biased against the "intervention effect." The direction and magnitude of the bias in natural settings will not necessarily follow this pattern.
Advantages and limitations of quantitative PCR (Q-PCR)-based approaches in microbial ecology.
Smith, Cindy J; Osborn, A Mark
2009-01-01
Quantitative PCR (Q-PCR or real-time PCR) approaches are now widely applied in microbial ecology to quantify the abundance and expression of taxonomic and functional gene markers within the environment. Q-PCR-based analyses combine 'traditional' end-point detection PCR with fluorescent detection technologies to record the accumulation of amplicons in 'real time' during each cycle of the PCR amplification. By detection of amplicons during the early exponential phase of the PCR, this enables the quantification of gene (or transcript) numbers when these are proportional to the starting template concentration. When Q-PCR is coupled with a preceding reverse transcription reaction, it can be used to quantify gene expression (RT-Q-PCR). This review firstly addresses the theoretical and practical implementation of Q-PCR and RT-Q-PCR protocols in microbial ecology, highlighting key experimental considerations. Secondly, we review the applications of (RT)-Q-PCR analyses in environmental microbiology and evaluate the contribution and advances gained from such approaches. Finally, we conclude by offering future perspectives on the application of (RT)-Q-PCR in furthering understanding in microbial ecology, in particular, when coupled with other molecular approaches and more traditional investigations of environmental systems.
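The quantification step described above, converting a threshold cycle (Ct) to a starting template amount, rests on a standard curve of Ct versus log10(copies). A minimal sketch with hypothetical dilution-series data; the slope also yields the amplification efficiency:

```python
import numpy as np

# Hypothetical standard curve: Ct values for 10-fold serial dilutions
log10_copies = np.array([7, 6, 5, 4, 3, 2], dtype=float)
ct = np.array([14.1, 17.5, 20.9, 24.3, 27.8, 31.2])

slope, intercept = np.polyfit(log10_copies, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # 1.0 would mean perfect doubling

def quantify(ct_unknown):
    """Starting copy number of an unknown sample from its Ct."""
    return 10 ** ((ct_unknown - intercept) / slope)

print(f"efficiency = {100 * efficiency:.0f}%, unknown ~ {quantify(22.6):.3g} copies")
```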
NASA Astrophysics Data System (ADS)
El-Nour, K. M. A.; Salam, E. T. A.; Soliman, H. M.; Orabi, A. S.
2017-03-01
A new optical sensor was developed for rapid, highly sensitive screening for biogenic amines (BAs) in poultry meat samples. Gold nanoparticles (GNPs) with particle sizes of 11-19 nm function as a fast and sensitive biosensor for detection of histamine resulting from bacterial decarboxylation of histidine, a spoilage marker for stored poultry meat. Upon reaction with histamine, the red color of the GNPs converts to deep blue. The appearance of the blue color coincides with concentrations of BAs high enough to induce symptoms of poisoning. This biosensor enables semi-quantitative detection of the analyte in real samples by the naked eye. Quality evaluation is carried out by measuring histamine and histidine using different analytical techniques such as UV-vis, FTIR, and fluorescence spectroscopy as well as TEM. A rapid quantitative readout of samples by UV-vis and fluorescence methods with standard instrumentation was also proposed, in contrast to slower chromatographic and electrophoretic methods. A sensitivity and limit of detection (LOD) of 6.59 × 10⁻⁴ and 0.6 μM, respectively, were determined for histamine as a spoilage marker, with a correlation coefficient (R²) of 0.993.
2017-10-13
We report a Co2-based magnetic resonance (MR) probe that enables the ratiometric quantitation and imaging of... ratios of CEST peak intensities at 104 and 64 ppm are correlated with solution pH in the physiological range 6.5−7.6 to construct a linear calibration... Subject terms: magnetic resonance (MR); ratiometric quantitation; chemical exchange saturation transfer (CEST); carboxamide; hydroxyl-substituted bisphosphonate
[Reconstituting evaluation methods based on both qualitative and quantitative paradigms].
Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro
2011-01-01
Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted content analysis regarding evaluation methods of qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.
A systematic review on how to conduct evaluations in community-based rehabilitation.
Grandisson, Marie; Hébert, Michèle; Thibeault, Rachel
2014-01-01
Community-based rehabilitation (CBR) must prove that it is making a significant difference for people with disabilities in low- and middle-income countries. Yet, evaluation is not a common practice and the evidence for its effectiveness is fragmented and largely insufficient. The objective of this article was to review the literature on best practices in program evaluation in CBR in relation to the evaluative process, the frameworks, and the methods of data collection. A systematic search was conducted on five rehabilitation databases and the World Health Organization website with keywords associated with CBR and program evaluation. Two independent researchers selected the articles. Twenty-two documents were included. The results suggest that (1) the evaluative process needs to be conducted in close collaboration with the local community, including people with disabilities, and to be followed by sharing the findings and taking actions, (2) many frameworks have been proposed to evaluate CBR but no agreement has been reached, and (3) qualitative methodologies have dominated the scene in CBR so far, but their combination with quantitative methods has a lot of potential to better capture the effectiveness of this strategy. In order to facilitate and improve evaluations in CBR, there is an urgent need to agree on a common framework, such as the CBR matrix, and to develop best practice guidelines based on the literature available and consensus among a group of experts. These will need to demonstrate a good balance between community development and standards for effective evaluations. Implications for Rehabilitation In the quest for evidence of the effectiveness of community-based rehabilitation (CBR), a shared program evaluation framework would better enable the combination of findings from different studies. The evaluation of CBR programs should always include sharing findings and taking action for the sake of the local community. Although qualitative methodologies have dominated the scene in CBR and remain highly relevant, there is also a call for the inclusion of quantitative indicators in order to capture the progress made by people participating in CBR programs. The production of best practice guidelines for evaluation in CBR could foster accountable and empowering program evaluations that are congruent with the principles at the heart of CBR and the standards for effective evaluations.
Cross-layer restoration with software defined networking based on IP over optical transport networks
NASA Astrophysics Data System (ADS)
Yang, Hui; Cheng, Lei; Deng, Junni; Zhao, Yongli; Zhang, Jie; Lee, Young
2015-10-01
The IP over optical transport network is a very promising networking architecture for the interconnection of geographically distributed data centers, owing to its performance guarantees of low delay, huge bandwidth and high reliability at low cost. It can enable efficient resource utilization and support heterogeneous bandwidth demands in a highly available, cost-effective and energy-efficient manner. In the case of cross-layer link failure, ensuring a high level of quality of service (QoS) for user requests after the failure becomes a research focus. In this paper, we propose a novel cross-layer restoration scheme for data center services with software defined networking based on IP over optical networks. The cross-layer restoration scheme enables joint optimization of IP network and optical network resources, and enhances the data center service restoration responsiveness to dynamic end-to-end service demands. We quantitatively evaluate the feasibility and performance through simulation under a heavy traffic load scenario in terms of path blocking probability and path restoration latency. Numerical results show that the cross-layer restoration scheme improves the recovery success rate and minimizes the overall recovery time.
Estimating skin sensitization potency from a single dose LLNA.
Roberts, David W
2015-04-01
Skin sensitization is an important aspect of safety assessment. The mouse local lymph node assay (LLNA), developed in the 1990s, is an in vivo test used for skin sensitization hazard identification and characterization. More recently a reduced version of the LLNA (rLLNA) has been developed as a means of identifying, but not quantifying, sensitization hazard. The work presented here is aimed at enabling rLLNA data to be used to give quantitative potency information that can be used, inter alia, in modeling and read-across approaches to non-animal-based potency estimation. A probit function has been derived enabling estimation of EC3 from a single dose. This has led to development of a modified version of the rLLNA, whereby as a general principle the SI value at 10%, or at a lower concentration if 10% is not testable, is used to calculate the EC3. This version of the rLLNA has been evaluated against a selection of chemicals for which full LLNA data are available, and has been shown to give EC3 values in good agreement with those derived from the full LLNA. Copyright © 2015 Elsevier Inc. All rights reserved.
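To make the single-dose idea concrete: if the dose-response is assumed linear through (0%, SI = 1), the EC3 (the concentration giving a stimulation index of 3) follows from one (concentration, SI) pair. This linear assumption is a simplified stand-in for illustration only; the paper derives a probit function, whose parameters are not given in the abstract.

```python
def ec3_single_dose(conc_pct, si):
    """EC3 estimate from a single (concentration, SI) point.

    Assumes SI rises linearly from 1 at 0% concentration, an
    illustrative simplification, not the paper's probit function.
    """
    if si <= 1.0:
        raise ValueError("SI must exceed 1 to estimate EC3")
    return conc_pct * (3.0 - 1.0) / (si - 1.0)

# e.g. a 10% dose producing SI = 5 implies EC3 of about 5%
print(f"EC3 ~ {ec3_single_dose(10.0, 5.0):.1f}%")
```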
NASA Astrophysics Data System (ADS)
Sheppard, Adrian; Latham, Shane; Middleton, Jill; Kingston, Andrew; Myers, Glenn; Varslot, Trond; Fogden, Andrew; Sawkins, Tim; Cruikshank, Ron; Saadatfar, Mohammad; Francois, Nicolas; Arns, Christoph; Senden, Tim
2014-04-01
This paper reports on recent advances at the micro-computed tomography facility at the Australian National University. Since 2000 this facility has been a significant centre for developments in imaging hardware and associated software for image reconstruction, image analysis and image-based modelling. In 2010 a new instrument was constructed that utilises theoretically-exact image reconstruction based on helical scanning trajectories, allowing higher cone angles and thus better utilisation of the available X-ray flux. We discuss the technical hurdles that needed to be overcome to allow imaging with cone angles in excess of 60°. We also present dynamic tomography algorithms that enable the changes between one moment and the next to be reconstructed from a sparse set of projections, allowing higher speed imaging of time-varying samples. Researchers at the facility have also created a sizeable distributed-memory image analysis toolkit with capabilities ranging from tomographic image reconstruction to 3D shape characterisation. We show results from image registration and present some of the new imaging and experimental techniques that it enables. Finally, we discuss the crucial question of image segmentation and evaluate some recently proposed techniques for automated segmentation.
Patel, Atit A.; Cox, Daniel N.
2017-01-01
To investigate cellular, molecular and behavioral mechanisms of noxious cold detection, we developed cold plate behavioral assays and quantitative means for evaluating the predominant noxious cold-evoked contraction behavior. To characterize neural activity in response to noxious cold, we implemented a GCaMP6-based calcium imaging assay enabling in vivo studies of intracellular calcium dynamics in intact Drosophila larvae. We identified Drosophila class III multidendritic (md) sensory neurons as multimodal sensors of innocuous mechanical and noxious cold stimuli. To dissect the mechanistic bases of multimodal sensory processing, we developed two independent functional assays. First, we developed an optogenetic dose-response assay to assess whether levels of neural activation contribute to the multimodal aspects of cold-sensitive sensory neurons. Second, we utilized CaMPARI, a photo-switchable calcium integrator that stably converts fluorescence from green to red in the presence of high intracellular calcium and photo-converting light, to assess in vivo functional differences in neural activation levels between innocuous mechanical and noxious cold stimuli. These novel assays enable investigations of behavioral and functional roles of peripheral sensory neurons and multimodal sensory processing in Drosophila larvae. PMID:28835907
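GCaMP-based assays like the one described are typically quantified as ΔF/F0, the fluorescence change relative to a pre-stimulus baseline. A minimal sketch; the baseline window length and the synthetic trace are assumptions, not values from the study:

```python
import numpy as np

def delta_f_over_f(trace, baseline_frames=30):
    """Normalized calcium response (F - F0) / F0, with F0 the mean
    fluorescence over the pre-stimulus window."""
    f = np.asarray(trace, dtype=float)
    f0 = f[:baseline_frames].mean()
    return (f - f0) / f0

# Synthetic trace: 30 baseline frames at F=100, then a response at F=180
dff = delta_f_over_f(np.r_[np.full(30, 100.0), np.full(20, 180.0)])
print(f"peak dF/F0 = {dff.max():.2f}")
```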
Fang, Yin; Leo, Sin-Yen; Ni, Yongliang; Wang, Junyu; Wang, Bingchen; Yu, Long; Dong, Zhe; Dai, Yuqiong; Basile, Vito; Taylor, Curtis; Jiang, Peng
2017-02-15
Traditional shape memory polymers (SMPs) are mostly thermoresponsive, and their applications in nano-optics are hindered by heat-demanding programming and recovery processes. By integrating a polyurethane-based shape memory copolymer with templating nanofabrication, reconfigurable/rewritable macroporous photonic crystals have been demonstrated. This SMP coupled with the unique macroporous structure enables unusual all-room-temperature shape memory cycles. "Cold" programming involving microscopic order-disorder transitions of the templated macropores is achieved by mechanically deforming the macroporous SMP membranes. The rapid recovery of the permanent, highly ordered photonic crystal structure from the temporary, disordered configuration can be triggered by multiple stimuli including a large variety of vapors and solvents, heat, and microwave radiation. Importantly, the striking chromogenic effects associated with these athermal and thermal processes render a sensitive and noninvasive optical methodology for quantitatively characterizing the intriguing nanoscopic shape memory effects. Some critical parameters/mechanisms that could significantly affect the final performance of SMP-based reconfigurable photonic crystals including strain recovery ratio, dynamics and reversibility of shape recovery, as well as capillary condensation of vapors in macropores, which play a crucial role in vapor-triggered recovery, can be evaluated using this new optical technology.
Laksmana, F L; Van Vliet, L J; Hartman Kok, P J A; Vromans, H; Frijlink, H W; Van der Voort Maarschalk, K
2009-04-01
This study aims to develop a characterization method for coating structure based on image analysis, which is particularly promising for the rational design of coated particles in the pharmaceutical industry. The method applies the MATLAB image processing toolbox to images of coated particles taken with confocal laser scanning microscopy (CLSM). The coating thicknesses were determined along the particle perimeter, from which a statistical analysis could be performed to obtain relevant thickness properties, e.g. the minimum coating thickness and the span of the thickness distribution. The characterization of the pore structure involved a proper segmentation of pores from the coating and a granulometry operation. The presented method facilitates the quantification of porosity, thickness and pore size distribution of a coating. These parameters are considered the important coating properties, which are critical to coating functionality. Additionally, the effect of coating process variations on coating quality can be straightforwardly assessed. Enabling a good characterization of coating quality, the presented method can be used as a fast and effective tool to predict coating functionality. This approach also enables the influence of different process conditions on coating properties to be effectively monitored, which ultimately enables process tailoring.
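The thickness-along-the-perimeter measurement can be sketched as ray casting from the particle center through a binary coating mask, then taking statistics over the per-angle thicknesses. This Python sketch is a simplified stand-in for the paper's MATLAB procedure; the segmentation producing the mask, and the center coordinates, are assumed inputs.

```python
import numpy as np

def coating_thickness_stats(coating_mask, center, n_angles=360):
    """Thickness statistics from a binary CLSM coating mask (True = coating).

    Rays are clipped at the image border, so the particle should be
    well inside the field of view for the counts to be meaningful.
    """
    cy, cx = center
    h, w = coating_mask.shape
    r = np.arange(0, max(h, w))
    thickness = []
    for a in np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False):
        ys = np.clip((cy + r * np.sin(a)).astype(int), 0, h - 1)
        xs = np.clip((cx + r * np.cos(a)).astype(int), 0, w - 1)
        thickness.append(int(coating_mask[ys, xs].sum()))  # coating pixels hit
    t = np.array(thickness, dtype=float)
    return {"min": t.min(), "mean": t.mean(), "span": t.max() - t.min()}
```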
Roberts, Shelley; McInnes, Elizabeth; Bucknall, Tracey; Wallis, Marianne; Banks, Merrilyn; Chaboyer, Wendy
2017-02-13
As pressure ulcers contribute to significant patient burden and increased health care costs, their prevention is a clinical priority. Our team developed and tested a complex intervention, a pressure ulcer prevention care bundle promoting patient participation in care, in a cluster-randomised trial. The UK Medical Research Council recommends process evaluation of complex interventions to provide insight into why they work or fail and how they might be improved. This study aimed to evaluate processes underpinning implementation of the intervention and explore end-users' perceptions of it, in order to give a deeper understanding of its effects. A pre-specified, mixed-methods process evaluation was conducted as an adjunct to the main trial, guided by a framework for process evaluation of cluster-randomised trials. Data were collected across eight Australian hospitals but mainly focused on the four intervention hospitals. Quantitative and qualitative data were collected across the evaluation domains: recruitment, reach, intervention delivery and response to intervention, at both cluster and individual patient level. Quantitative data were analysed using descriptive and inferential statistics. Qualitative data were analysed using thematic analysis. In the context of the main trial, which found a 42% reduction in risk of pressure ulcer with the intervention, a result that was not significant after adjusting for clustering and covariates, this process evaluation provides important insights. Recruitment and reach among clusters and individuals were high, indicating that patients, nurses and hospitals are willing to engage with a pressure ulcer prevention care bundle. Of 799 intervention patients in the trial, 96.7% received the intervention, which took under 10 min to deliver. Patients and nurses accepted the care bundle, recognising its benefits and describing how it enabled participation in pressure ulcer prevention (PUP) care. This process evaluation found no major failures relating to implementation of the intervention. The care bundle was easy to understand and deliver, reached a large proportion of the target population and was acceptable to patients and nurses; therefore, it may be an effective way of engaging patients in their pressure ulcer prevention care and promoting evidence-based practice.
NASA Astrophysics Data System (ADS)
Riches, A. J. V.; Burton, K. W.; Nowell, G. M.; Dale, C. W.; Ottley, C. J.
2016-08-01
New methods presented here enable quantitative determination of mineral-scale PGE abundances and Os-isotope compositions in meteorite materials, thereby providing valuable new insight into planetary evolution.
Resistance of a Wire as a Function of Temperature.
ERIC Educational Resources Information Center
Henry, David
1995-01-01
Presents a simple experiment that enables students to get a quantitative measure of the relationship between the resistance of a wire and the temperature of the wire allowing the calculation of the temperature coefficient of resistance. (JRH)
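The calculation the experiment leads to is a linear fit R(T) = R0(1 + αT), from which the temperature coefficient α falls out of the fitted slope and intercept. A minimal sketch with made-up measurements:

```python
import numpy as np

# Hypothetical resistance measurements of a copper wire vs. temperature
T_c = np.array([10.0, 25.0, 40.0, 55.0, 70.0])    # temperature, degrees C
R_ohm = np.array([0.94, 0.99, 1.04, 1.09, 1.15])  # resistance, ohms

slope, r0 = np.polyfit(T_c, R_ohm, 1)  # R = r0 + slope * T, r0 = R at 0 degC
alpha = slope / r0                      # temperature coefficient per degC
print(f"alpha ~ {alpha:.2e} per degC (copper is roughly 3.9e-3)")
```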
Evaluating HDR photos using Web 2.0 technology
NASA Astrophysics Data System (ADS)
Qiu, Guoping; Mei, Yujie; Duan, Jiang
2011-01-01
High dynamic range (HDR) photography is an emerging technology that has the potential to dramatically enhance the visual quality and realism of digital photos. One of the key technical challenges of HDR photography is displaying HDR photos on conventional devices through tone mapping or dynamic range compression. Although many different tone mapping techniques have been developed in recent years, evaluating tone mapping operators has proved extremely difficult. Web 2.0, social media and crowd-sourcing are emerging Internet technologies which can be harnessed to harvest the brain power of the masses to solve difficult problems in science, engineering and business. Paired comparison is used in the scientific study of preferences and attitudes and has been shown to be capable of obtaining an interval-scale ordering of items along a psychometric dimension such as preference or importance. In this paper, we exploit these technologies for evaluating HDR tone mapping algorithms. We have developed a Web 2.0-style system that enables Internet users anywhere to evaluate tone mapped HDR photos at any time. We adopt a simple paired comparison protocol: Internet users are presented with a pair of tone mapped images and are simply asked to select the one that they think is better or click a "no difference" button. These user inputs are collected on the web server and analyzed by a rank aggregation algorithm which ranks the tone mapped photos according to the votes they received. We present experimental results which demonstrate that the emerging Internet technologies can be exploited as a new paradigm for evaluating HDR tone mapping algorithms. The advantages of this approach include the potential of collecting large numbers of user inputs under a variety of viewing environments, rather than limited user participation under controlled laboratory environments, thus enabling more robust and reliable quality assessment. We also present data analysis to correlate user generated qualitative indices with quantitative image statistics, which may provide useful guidance for developing better tone mapping operators.
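The rank aggregation step can be illustrated with the Bradley-Terry model, a standard way to turn paired-comparison votes into interval-scale scores; the abstract does not name the algorithm actually used, so this is an assumed stand-in. "No difference" votes could, for instance, be split half-and-half between the pair.

```python
import numpy as np

def bradley_terry(wins, n_iter=200):
    """Scores from a paired-comparison matrix via minorization-maximization.

    wins[i, j] = number of votes preferring photo i over photo j.
    """
    n = wins.shape[0]
    p = np.ones(n)
    for _ in range(n_iter):
        for i in range(n):
            total_wins = wins[i].sum()
            denom = sum((wins[i, j] + wins[j, i]) / (p[i] + p[j])
                        for j in range(n) if j != i)
            p[i] = total_wins / denom
        p /= p.sum()
    return p

votes = np.array([[0, 8, 6], [2, 0, 5], [4, 5, 0]])  # toy vote matrix
print(bradley_terry(votes))  # higher score = more preferred tone mapping
```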
Chambers, Andrew G.; Percy, Andrew J.; Yang, Juncong; Camenzind, Alexander G.; Borchers, Christoph H.
2013-01-01
Dried blood spot (DBS) sampling, coupled with multiple reaction monitoring mass spectrometry (MRM-MS), is a well-established approach for quantifying a wide range of small molecule biomarkers and drugs. This sampling procedure is simpler and less invasive than those required for traditional plasma or serum samples, enabling collection by minimally trained personnel. Many analytes are stable in the DBS format without refrigeration, which reduces the cost and logistical challenges of sample collection in remote locations. These advantages make DBS sample collection desirable for advancing personalized medicine through population-wide biomarker screening. Here we expand this technology by demonstrating the first multiplexed method for the quantitation of endogenous proteins in DBS samples. A panel of 60 abundant proteins in human blood was targeted by monitoring proteotypic tryptic peptides and their stable isotope-labeled analogs by MRM. Linear calibration curves were obtained for 40 of the 65 peptide targets, demonstrating that multiple proteins can be quantitatively extracted from DBS collection cards. The method was also highly reproducible, with a coefficient of variation of <15% for all 40 peptides. Overall, this assay quantified 37 proteins spanning a range of more than four orders of magnitude in concentration within a single 25 min LC/MRM-MS analysis. The protein abundances of the 33 proteins quantified in matching DBS and whole blood samples showed an excellent correlation, with a slope of 0.96 and an R2 value of 0.97. Furthermore, the measured concentrations for 80% of the proteins were stable for at least 10 days when stored at −20 °C, 4 °C and 37 °C. This work represents an important first step in evaluating the integration of DBS sampling with highly-multiplexed MRM for quantitation of endogenous proteins. PMID:23221968
Stable isotopic labeling-based quantitative targeted glycomics (i-QTaG).
Kim, Kyoung-Jin; Kim, Yoon-Woo; Kim, Yun-Gon; Park, Hae-Min; Jin, Jang Mi; Hwan Kim, Young; Yang, Yung-Hun; Kyu Lee, Jun; Chung, Junho; Lee, Sun-Gu; Saghatelian, Alan
2015-01-01
Mass spectrometry (MS) analysis combined with stable isotopic labeling is a promising method for the relative quantification of aberrant glycosylation in diseases and disorders. We developed a stable isotopic labeling-based quantitative targeted glycomics (i-QTaG) technique for the comparative and quantitative analysis of total N-glycans using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). We established the analytical procedure with the chemical derivatizations (i.e., sialic acid neutralization and stable isotopic labeling) of N-glycans using a model glycoprotein (bovine fetuin). Moreover, i-QTaG using MALDI-TOF MS was evaluated at various molar ratios (1:1, 1:2, 1:5) of ¹³C₆/¹²C₆-2-aminobenzoic acid-labeled glycans from normal human serum. Finally, this method was applied to a direct comparison of the total N-glycan profiles between normal human sera (n = 8) and prostate cancer patient sera (n = 17). The intensities of the N-glycan peaks from the i-QTaG method showed good linearity (R² > 0.99) with the amount of bovine fetuin glycoprotein. The ratios of relative intensity between the isotopically 2-AA labeled N-glycans were close to the theoretical molar ratios (1:1, 1:2, 1:5). We also demonstrated up-regulation of the Lewis antigen (~82%) in sera from prostate cancer patients. In this proof-of-concept study, we demonstrated that the i-QTaG method, which enables reliable comparative quantitation of total N-glycans via MALDI-TOF MS analysis, has the potential to diagnose and monitor alterations in glycosylation associated with disease states or biotherapeutics. © 2015 American Institute of Chemical Engineers.
NASA Astrophysics Data System (ADS)
Gao, Liang; Hammoudi, Ahmad A.; Li, Fuhai; Thrall, Michael J.; Cagle, Philip T.; Chen, Yuanxin; Yang, Jian; Xia, Xiaofeng; Fan, Yubo; Massoud, Yehia; Wang, Zhiyong; Wong, Stephen T. C.
2012-06-01
The advent of molecularly targeted therapies requires effective identification of the various cell types of non-small cell lung carcinomas (NSCLC). Currently, cell type diagnosis is performed using small biopsies or cytology specimens that are often insufficient for molecular testing after morphologic analysis. Thus, the ability to rapidly recognize different cancer cell types, with minimal tissue consumption, would accelerate diagnosis and preserve tissue samples for subsequent molecular testing in targeted therapy. We report a label-free molecular vibrational imaging framework enabling three-dimensional (3-D) image acquisition and quantitative analysis of cellular structures for identification of NSCLC cell types. This diagnostic imaging system employs superpixel-based 3-D nuclear segmentation for extracting such disease-related features as nuclear shape, volume, and cell-cell distance. These features are used to characterize cancer cell types using machine learning. Using fresh unstained tissue samples derived from cell lines grown in a mouse model, the platform showed greater than 97% accuracy for diagnosis of NSCLC cell types within a few minutes. As an adjunct to subsequent histology tests, our novel system would allow fast delineation of cancer cell types with minimum tissue consumption, potentially facilitating on-the-spot diagnosis, while preserving specimens for additional tests. Furthermore, 3-D measurements of cellular structure permit evaluation closer to the native state of cells, creating an alternative to traditional 2-D histology specimen evaluation, potentially increasing accuracy in diagnosing cell type of lung carcinomas.
NASA Astrophysics Data System (ADS)
Vogt, William C.; Jia, Congxian; Wear, Keith A.; Garra, Brian S.; Pfefer, T. Joshua
2017-03-01
Recent years have seen rapid development of hybrid optical-acoustic imaging modalities with broad applications in research and clinical imaging, including photoacoustic tomography (PAT), photoacoustic microscopy, and ultrasound-modulated optical tomography. Tissue-mimicking phantoms are an important tool for objectively and quantitatively simulating in vivo imaging system performance. However, no standard tissue phantoms exist for such systems. One major challenge is the development of tissue-mimicking materials (TMMs) that are both highly stable and possess biologically realistic properties. To address this need, we have explored the use of various formulations of PVC plastisol (PVCP) based on varying mixtures of several liquid plasticizers. We developed a custom PVCP formulation with optical absorption and scattering coefficients, speed of sound, and acoustic attenuation that are tunable and tissue-relevant. This TMM can simulate different tissue compositions and offers greater mechanical strength than hydrogels. Optical properties of PVCP samples with varying composition were characterized using integrating sphere spectrophotometry and the inverse adding-doubling method. Acoustic properties were determined using a broadband pulse-transmission technique. To demonstrate the utility of this bimodal TMM, we constructed an image quality phantom designed to enable quantitative evaluation of PAT spatial resolution. The phantom was imaged using a custom combined PAT-ultrasound imaging system. Results indicated that this more biologically realistic TMM produced performance trends not captured in simpler liquid phantoms. In the future, this TMM may be broadly utilized for performance evaluation of optical, acoustic, and hybrid optical-acoustic imaging systems.
Padmore, Trudy; Stark, Carahline; Turkevich, Leonid A.; Champion, Julie A.
2017-01-01
Background In the lung, macrophages attempt to engulf inhaled high aspect ratio pathogenic materials, secreting inflammatory molecules in the process. The inability of macrophages to remove these materials leads to chronic inflammation and disease. How the biophysical and biochemical mechanisms of these effects are influenced by fiber length remains undetermined. This study evaluates the role of fiber length on phagocytosis and molecular inflammatory responses to non-cytotoxic fibers, enabling development of quantitative length-based models. Methods Murine alveolar macrophages were exposed to long and short populations of JM-100 glass fibers, produced by successive sedimentation and repeated crushing, respectively. Interactions between fibers and macrophages were observed using time-lapse video microscopy, and quantified by flow cytometry. Inflammatory biomolecules (TNF-α, IL-1 α, COX-2, PGE2) were measured. Results Uptake of short fibers occurred more readily than for long, but long fibers were more potent stimulators of inflammatory molecules. Stimulation resulted in dose-dependent secretion of inflammatory biomolecules but no cytotoxicity or strong ROS production. Linear cytokine dose-response curves evaluated with length-dependent potency models, using measured fiber length distributions, resulted in identification of critical fiber lengths that cause frustrated phagocytosis and increased inflammatory biomolecule production. Conclusion Short fibers played a minor role in the inflammatory response compared to long fibers. The critical lengths at which frustrated phagocytosis occurs can be quantified by fitting dose-response curves to fiber distribution data. PMID:27784615
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeraatkar, Navid; Farahani, Mohammad Hossein; Rahmim, Arman
Purpose: Given increasing efforts in biomedical research utilizing molecular imaging methods, development of dedicated high-performance small-animal SPECT systems has been growing rapidly in the last decade. In the present work, we propose and assess an alternative concept for SPECT imaging enabling desktop open-gantry imaging of small animals. Methods: The system, PERSPECT, consists of an imaging desk, with a set of tilted detector and pinhole collimator placed beneath it. The object to be imaged is simply placed on the desk. Monte Carlo (MC) and analytical simulations were utilized to accurately model and evaluate the proposed concept and design. Furthermore, a dedicated image reconstruction algorithm, finite-aperture-based circular projections (FABCP), was developed and validated for the system, enabling more accurate modeling of the system and higher quality reconstructed images. Image quality was quantified as a function of different tilt angles in the acquisition and number of iterations in the reconstruction algorithm. Furthermore, more complex phantoms including Derenzo, Defrise, and mouse whole body were simulated and studied. Results: The sensitivity of the PERSPECT was 207 cps/MBq. It was quantitatively demonstrated that for a tilt angle of 30°, comparable image qualities were obtained in terms of normalized squared error, contrast, uniformity, noise, and spatial resolution measurements, the latter at ∼0.6 mm. Furthermore, quantitative analyses demonstrated that 3 iterations of FABCP image reconstruction (16 subsets/iteration) led to optimally reconstructed images. Conclusions: The PERSPECT, using a novel imaging protocol, can achieve comparable image quality performance in comparison with a conventional pinhole SPECT of the same configuration. The dedicated FABCP algorithm, which was developed for reconstruction of data from the PERSPECT system, can produce high quality images for small-animal imaging via accurate modeling of the system as incorporated in the forward- and back-projection steps. Meanwhile, the developed MC model and the analytical simulator of the system can be applied in further studies on development and evaluation of the system.
Chen, Jing; Wang, Shu-Mei; Meng, Jiang; Sun, Fei; Liang, Sheng-Wang
2013-05-01
To establish a new method for quality evaluation and to validate its feasibility by the simultaneous quantitative assay of five alkaloids in Sophora flavescens. The new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with S. flavescens. Five main alkaloids, oxymatrine, sophocarpine, matrine, oxysophocarpine and sophoridine, were selected as analytes to evaluate the quality of the rhizome of S. flavescens, and the relative correction factors showed good repeatability. Their contents in 21 batches of samples, collected from different areas, were determined by both the external standard method and QAMS. The method was evaluated by comparing the quantitative results obtained by the external standard method and by QAMS. No significant differences were found between the two methods in the quantitative results for the five alkaloids in the 21 batches of S. flavescens. It is feasible and suitable to evaluate the quality of the rhizome of S. flavescens by QAMS.
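A minimal sketch of the QAMS idea, assuming simple one-point response factors: relative correction factors are established once against pure standards, after which only the single marker needs calibration in routine runs. All peak areas and concentrations below are illustrative:

```python
import numpy as np

# One-time calibration with pure standards (illustrative numbers): peak
# area at a known concentration gives each analyte's response factor k.
std_conc = {"matrine": 50.0, "oxymatrine": 50.0, "sophoridine": 50.0}   # ug/mL
std_area = {"matrine": 1250.0, "oxymatrine": 980.0, "sophoridine": 1100.0}

marker = "matrine"  # the single marker calibrated in routine use
k = {a: std_area[a] / std_conc[a] for a in std_conc}
F = {a: k[marker] / k[a] for a in std_conc}   # relative correction factors

# Routine sample: only the marker's response factor is re-determined; the
# other analytes are quantified from their areas via the stored factors.
sample_area = {"matrine": 640.0, "oxymatrine": 500.0, "sophoridine": 410.0}
k_marker = k[marker]
conc = {a: sample_area[a] * F[a] / k_marker for a in sample_area}   # ug/mL
print(conc)
```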
Gui, Jiang; Moore, Jason H.; Williams, Scott M.; Andrews, Peter; Hillege, Hans L.; van der Harst, Pim; Navis, Gerjan; Van Gilst, Wiek H.; Asselbergs, Folkert W.; Gilbert-Diamond, Diane
2013-01-01
We present an extension of the two-class multifactor dimensionality reduction (MDR) algorithm that enables detection and characterization of epistatic SNP-SNP interactions in the context of a quantitative trait. The proposed Quantitative MDR (QMDR) method handles continuous data by modifying MDR’s constructive induction algorithm to use a T-test. QMDR replaces the balanced accuracy metric with a T-test statistic as the score to determine the best interaction model. We used a simulation to identify the empirical distribution of QMDR’s testing score. We then applied QMDR to genetic data from the ongoing prospective Prevention of Renal and Vascular End-Stage Disease (PREVEND) study. PMID:23805232
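A minimal sketch of the QMDR scoring step under stated assumptions: genotype cells of a SNP pair are pooled into high and low groups by comparing each cell's trait mean to the overall mean, and the t statistic between the pooled groups scores the interaction model. Data are simulated toy values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 500
snp1 = rng.integers(0, 3, n)      # genotypes coded 0/1/2
snp2 = rng.integers(0, 3, n)
# Toy quantitative trait with an epistatic effect in one genotype cell.
trait = rng.normal(size=n) + 0.8 * ((snp1 == 2) & (snp2 == 0))

overall = trait.mean()
high = np.zeros(n, dtype=bool)
for g1 in range(3):
    for g2 in range(3):
        cell = (snp1 == g1) & (snp2 == g2)
        if cell.any() and trait[cell].mean() > overall:
            high[cell] = True   # pool cells with above-average trait mean

t_stat, _ = stats.ttest_ind(trait[high], trait[~high])
print(f"QMDR-style interaction score (t statistic): {t_stat:.2f}")
```

In the full method this score is computed for every SNP pair and its significance assessed against the empirical null distribution mentioned in the abstract.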
End-to-end deep neural network for optical inversion in quantitative photoacoustic imaging.
Cai, Chuangjian; Deng, Kexin; Ma, Cheng; Luo, Jianwen
2018-06-15
An end-to-end deep neural network, ResU-net, is developed for quantitative photoacoustic imaging. A residual learning framework is used to facilitate optimization and to gain better accuracy from considerably increased network depth. The contracting and expanding paths enable ResU-net to extract comprehensive context information from multispectral initial pressure images and, subsequently, to infer a quantitative image of chromophore concentration or oxygen saturation (sO2). According to our numerical experiments, the estimations of sO2 and indocyanine green concentration are accurate and robust against variations in both optical property and object geometry. An extremely short reconstruction time of 22 ms is achieved.
Quantitative phase microscopy for cellular dynamics based on transport of intensity equation.
Li, Ying; Di, Jianglei; Ma, Chaojie; Zhang, Jiwei; Zhong, Jinzhan; Wang, Kaiqiang; Xi, Teli; Zhao, Jianlin
2018-01-08
We demonstrate a simple method for quantitative phase imaging of tiny transparent objects, such as living cells, based on the transport of intensity equation. The experiments are performed using an inverted bright-field microscope upgraded with a flipping imaging module, which enables the simultaneous creation of two laterally separated images with unequal defocus distances. This add-on module does not include any lenses or gratings and is cost-effective and easy to align. The validity of this method is confirmed by measurements of a microlens array and of human osteoblastic cells in culture, indicating its potential for dynamically measuring living cells and other transparent specimens in a quantitative, non-invasive and label-free manner.
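A minimal numerical sketch of phase recovery from two defocused intensity images via the transport of intensity equation, assuming near-uniform intensity (reasonable for thin transparent cells) so the equation reduces to a Poisson problem solvable with FFTs; the function and its parameters are illustrative, not the authors' implementation:

```python
import numpy as np

def tie_phase(I_under, I_over, dz, wavelength, pixel):
    # Finite-difference axial intensity derivative from the two images.
    k = 2 * np.pi / wavelength
    dIdz = (I_over - I_under) / (2 * dz)
    I0 = 0.5 * (I_over + I_under).mean()      # near-uniform intensity assumed
    ny, nx = dIdz.shape
    fx = np.fft.fftfreq(nx, d=pixel)
    fy = np.fft.fftfreq(ny, d=pixel)
    FX, FY = np.meshgrid(fx, fy)
    q2 = (2 * np.pi) ** 2 * (FX ** 2 + FY ** 2)
    q2[0, 0] = np.inf                         # drop the undefined DC term
    # Poisson solve of  laplacian(phi) = -(k / I0) * dI/dz  in Fourier space.
    return np.fft.ifft2(np.fft.fft2(-k * dIdz / I0) / (-q2)).real

# Illustrative call: two simultaneously captured, unequally defocused frames
# (random stand-ins here), 0.5 um half-separation, 532 nm illumination.
I1 = np.random.default_rng(4).random((128, 128)) * 0.01 + 1.0
I2 = 2.0 - I1
phase = tie_phase(I1, I2, dz=0.5e-6, wavelength=532e-9, pixel=0.2e-6)
```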
He, Dan; Xie, Xiao; Yang, Fan; Zhang, Heng; Su, Haomiao; Ge, Yun; Song, Haiping; Chen, Peng R
2017-11-13
A genetically encoded, multifunctional photocrosslinker was developed for quantitative and comparative proteomics. By bearing a bioorthogonal handle and a releasable linker in addition to its photoaffinity warhead, this probe enables the enrichment of transient and low-abundance prey proteins after intracellular photocrosslinking and prey-bait separation, which can then be subjected to stable isotope dimethyl labeling and mass spectrometry analysis. This quantitative strategy (termed isoCAPP) allowed a comparative proteomic approach to be adopted to identify the proteolytic substrates of DegP, an E. coli protease-chaperone dual machinery. Two newly identified substrates were subsequently confirmed by proteolysis experiments. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
On predicting monitoring system effectiveness
NASA Astrophysics Data System (ADS)
Cappello, Carlo; Sigurdardottir, Dorotea; Glisic, Branko; Zonta, Daniele; Pozzi, Matteo
2015-03-01
While the objective of structural design is to achieve stability with an appropriate level of reliability, the design of systems for structural health monitoring is performed to identify a configuration that enables acquisition of data with an appropriate level of accuracy in order to understand the performance of a structure or its condition state. However, a rational standardized approach for monitoring system design is not fully available. Hence, when engineers design a monitoring system, their approach is often heuristic with performance evaluation based on experience, rather than on quantitative analysis. In this contribution, we propose a probabilistic model for the estimation of monitoring system effectiveness based on information available in prior condition, i.e. before acquiring empirical data. The presented model is developed considering the analogy between structural design and monitoring system design. We assume that the effectiveness can be evaluated based on the prediction of the posterior variance or covariance matrix of the state parameters, which we assume to be defined in a continuous space. Since the empirical measurements are not available in prior condition, the estimation of the posterior variance or covariance matrix is performed considering the measurements as a stochastic variable. Moreover, the model takes into account the effects of nuisance parameters, which are stochastic parameters that affect the observations but cannot be estimated using monitoring data. Finally, we present an application of the proposed model to a real structure. The results show how the model enables engineers to predict whether a sensor configuration satisfies the required performance.
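A minimal linear-Gaussian sketch of the preposterior idea described above: for an observation model y = Hx + e with Gaussian prior and noise, the posterior covariance of the state parameters does not depend on the measured values, so candidate sensor configurations can be ranked before acquiring any data. The matrices are hypothetical and nuisance-parameter effects are omitted:

```python
import numpy as np

# Prior covariance of two state parameters (e.g., a stiffness and a load).
Sigma0 = np.diag([4.0, 9.0])

def posterior_cov(H, R):
    # For y = H x + e, e ~ N(0, R), the posterior covariance is
    # (Sigma0^-1 + H' R^-1 H)^-1; it involves no measured data, so it can
    # be evaluated in the prior condition to predict effectiveness.
    return np.linalg.inv(np.linalg.inv(Sigma0) + H.T @ np.linalg.inv(R) @ H)

# Hypothetical sensor configurations (rows of H) with noise variance 0.25.
configs = {
    "one sensor":  np.array([[1.0, 0.5]]),
    "two sensors": np.array([[1.0, 0.5], [0.2, 1.0]]),
}
for name, H in configs.items():
    cov = posterior_cov(H, 0.25 * np.eye(H.shape[0]))
    print(name, "-> predicted posterior variances:", np.diag(cov).round(3))
```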
Thodis, Antonia; Itsiopoulos, Catherine; Kouris-Blazos, Antigone; Brazionis, Laima; Tyrovolas, Stefanos; Polychronopoulos, Evangelos; Panagiotakos, Demosthenes B
2018-02-01
To describe the study protocol of the MEDiterranean ISlands-Australia (MEDIS-Australia) Study, modelled on the MEDIS Study conducted in Greece. The present study aims to explore adherence to the traditional Mediterranean diet pattern, determine enablers of and barriers to adherence, explore the definition of Greek cuisine, and examine associations between adherence to the diet pattern and risk factors for cardiovascular disease (CVD) and metabolic syndrome in older Greek Australians originally from the Greek islands and Cyprus. The characteristics and risk factor profiles of older Greek islander-born Australians, now long-term immigrants with at least 50 years in Australia, will be compared and contrasted with those of their counterparts living on Greek islands to evaluate the influence of migration on adherence. The present study is an observational study of cross-sectional design using a modified lifestyle and semi-quantitative food frequency questionnaire to capture the sociodemographic, health, psychosocial and dietary characteristics, including cuisine, of 150 older Greek islander-born Australians. Anthropometric measures and medical history will be collected. Participants will be aged over 65 years, live independently, be originally from a Greek island, and be free from CVD. Data collection is underway. Characteristics and behaviours associated with adherence, if identified, could be evaluated in future studies, for example, exploration of enablers of or barriers to adherence to a Mediterranean dietary pattern in an Australian population. © 2017 Dietitians Association of Australia.
Tian, Q; Price, N D; Hood, L
2012-02-01
A grand challenge impeding optimal treatment outcomes for patients with cancer arises from the complex nature of the disease: the cellular heterogeneity and the myriad of dysfunctional molecular and genetic networks that result from genetic (somatic) and environmental perturbations. Systems biology, with its holistic approach to understanding fundamental principles in biology, and the empowering technologies in genomics, proteomics, single-cell analysis, microfluidics and computational strategies, enables a comprehensive approach to medicine, which strives to unveil the pathogenic mechanisms of diseases, identify disease biomarkers and begin thinking about new strategies for drug target discovery. The integration of multidimensional high-throughput 'omics' measurements from tumour tissues and corresponding blood specimens, together with new systems strategies for diagnostics, enables the identification of cancer biomarkers that will permit presymptomatic diagnosis, stratification of disease, assessment of disease progression, evaluation of patient response to therapy and the identification of recurrences. Whilst some aspects of systems medicine are being adopted in clinical oncology practice through companion molecular diagnostics for personalized therapy, the mounting influx of global quantitative data from both wellness and disease is shaping a transformational paradigm in medicine we have termed 'predictive', 'preventive', 'personalized' and 'participatory' (P4) medicine, which requires new strategies, both scientific and organizational, to bring this revolution in medicine to patients and to the healthcare system. P4 medicine will have a profound impact on society, transforming the healthcare system, turning around the ever-escalating costs of healthcare, digitizing the practice of medicine and creating enormous economic opportunities for those organizations and nations that embrace this revolution. © 2011 The Association for the Publication of the Journal of Internal Medicine.
Juliana, Philomin; Singh, Ravi P; Singh, Pawan K; Crossa, Jose; Rutkoski, Jessica E; Poland, Jesse A; Bergstrom, Gary C; Sorrells, Mark E
2017-07-01
The leaf spotting diseases in wheat, which include Septoria tritici blotch (STB) caused by Zymoseptoria tritici, Stagonospora nodorum blotch (SNB) caused by Parastagonospora nodorum, and tan spot (TS) caused by Pyrenophora tritici-repentis, pose challenges to breeding programs in selecting for resistance. A promising approach that could enable selection prior to phenotyping is genomic selection, which uses genome-wide markers to estimate breeding values (BVs) for quantitative traits. To evaluate this approach for seedling and/or adult plant resistance (APR) to STB, SNB, and TS, we compared the predictive ability of the least-squares (LS) approach with genomic-enabled prediction models including genomic best linear unbiased predictor (GBLUP), Bayesian ridge regression (BRR), Bayes A (BA), Bayes B (BB), Bayes Cπ (BC), Bayesian least absolute shrinkage and selection operator (BL), and reproducing kernel Hilbert spaces using markers (RKHS-M), a pedigree-based model (RKHS-P), and RKHS using markers and pedigree (RKHS-MP). We observed that LS gave the lowest prediction accuracies and RKHS-MP the highest. The genomic-enabled prediction models and RKHS-P gave similar accuracies. The increase in accuracy using genomic prediction models over LS was 48%. The mean genomic prediction accuracies were 0.45 for STB (APR), 0.55 for SNB (seedling), 0.66 for TS (seedling) and 0.48 for TS (APR). We also compared markers from two whole-genome profiling approaches, genotyping by sequencing (GBS) and diversity arrays technology sequencing (DArTseq), for prediction. While GBS markers performed slightly better than DArTseq, combining markers from the two approaches did not improve accuracies. We conclude that implementing GS in breeding for these diseases would help to achieve higher accuracies and rapid gains from selection. Copyright © 2017 Crop Science Society of America.
Ehrhardt, J; Säring, D; Handels, H
2007-01-01
Modern tomographic imaging devices enable the acquisition of spatial and temporal image sequences. However, the spatial and temporal resolution of such devices is limited, and image interpolation techniques are therefore needed to represent images at a desired level of discretization. This paper presents a method for structure-preserving interpolation between neighboring slices in temporal or spatial image sequences. In a first step, the spatiotemporal velocity field between image slices is determined using an optical flow-based registration method in order to establish spatial correspondence between adjacent slices. An iterative algorithm is applied using the spatial and temporal image derivatives and a spatiotemporal smoothing step. Afterwards, the calculated velocity field is used to generate an interpolated image at the desired time by averaging intensities between corresponding points. Three quantitative measures are defined to evaluate the performance of the interpolation method. The behavior and capability of the algorithm are demonstrated on synthetic images. A population of 17 temporal and spatial image sequences is utilized to compare the optical flow-based interpolation method to linear and shape-based interpolation. The quantitative results show that the optical flow-based method outperforms linear and shape-based interpolation with statistical significance. The presented interpolation method is able to generate image sequences with the spatial or temporal resolution needed for image comparison, analysis or visualization tasks. Quantitative and qualitative measures extracted from synthetic phantoms and medical image data show that the new method has clear advantages over linear and shape-based interpolation.
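A minimal sketch of the interpolation step, assuming the velocity field (u, v) between two slices has already been estimated by optical flow-based registration; nearest-neighbor sampling is used here for brevity, whereas a real implementation would interpolate sub-pixel positions:

```python
import numpy as np

def flow_interpolate(img_a, img_b, u, v, alpha=0.5):
    # Warp both slices along the velocity field to the intermediate position
    # alpha in [0, 1] and average intensities of corresponding points.
    ny, nx = img_a.shape
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    xa = np.clip(np.rint(xx - alpha * u).astype(int), 0, nx - 1)
    ya = np.clip(np.rint(yy - alpha * v).astype(int), 0, ny - 1)
    xb = np.clip(np.rint(xx + (1 - alpha) * u).astype(int), 0, nx - 1)
    yb = np.clip(np.rint(yy + (1 - alpha) * v).astype(int), 0, ny - 1)
    return (1 - alpha) * img_a[ya, xa] + alpha * img_b[yb, xb]

# Demo: slice b is slice a shifted 3 px to the right; the known flow is
# u = +3, v = 0, and the result approximates the slice halfway between.
a = np.random.default_rng(2).random((64, 64))
b = np.roll(a, 3, axis=1)
mid = flow_interpolate(a, b, np.full(a.shape, 3.0), np.zeros(a.shape))
```

Unlike plain linear interpolation, which blends intensities at fixed pixel positions, this averages along the estimated motion, which is what preserves structure.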
Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.
Smith, Anne E; Gans, Will
2015-03-01
The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5). This article discusses the purpose and appropriate role of a risk analysis tool to support risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance in quantitative risk analyses of characterization of epistemic uncertainty, or outright lack of knowledge, about the true risk relationships being quantified. This article describes and quantitatively illustrates sensitivities of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and more recognition of the full range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.
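A minimal sketch of the kind of risk-coefficient sensitivity analysis the article argues for, using the common log-linear concentration-response form for avoided mortality; the coefficients and population figures are illustrative, not values embedded in BenMAP:

```python
import numpy as np

def avoided_deaths(beta, delta_pm, baseline_rate, population):
    # Log-linear concentration-response commonly used in PM2.5 benefits
    # analysis: avoided deaths = y0 * pop * (1 - exp(-beta * dPM)).
    return baseline_rate * population * (1.0 - np.exp(-beta * delta_pm))

# Epistemic sensitivity: sweep the mortality risk coefficient over a range
# wider than any single embedded default (all numbers are illustrative).
for beta in (0.000, 0.003, 0.006, 0.012):
    d = avoided_deaths(beta, delta_pm=2.0, baseline_rate=0.008, population=1e6)
    print(f"beta = {beta:.3f} per ug/m3 -> ~{d:.0f} avoided deaths")
```

Even this toy sweep shows the estimate scaling almost linearly with the coefficient, which is why understating its plausible range understates the uncertainty of the result.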
NASA Astrophysics Data System (ADS)
Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.
2018-02-01
Biophysical properties of cells can complement and correlate with biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distributions of single cells at high spatial resolution. However, the limited camera frame rate, and thus imaging throughput, makes QPM incompatible with high-throughput flow cytometry, a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of the cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of >10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy, at >90% in the G1 and G2 phases and >80% in the S phase. We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
Haines, Seth S.; Diffendorfer, James; Balistrieri, Laurie S.; Berger, Byron R.; Cook, Troy A.; Gautier, Donald L.; Gallegos, Tanya J.; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen; Macknick, Jordan; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katherine
2013-01-01
Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.
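A minimal sketch of the probabilistic-input / Monte Carlo-output idea, chaining a hypothetical resource-volume distribution through uncertain development and impact coefficients to a distribution of habitat impact; all distributions and numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000   # Monte Carlo draws

# Hypothetical probabilistic inputs: recoverable gas volume (lognormal),
# recovery per well, and disturbed habitat per well pad (triangular).
gas_bcf = rng.lognormal(mean=np.log(500.0), sigma=0.5, size=n)
bcf_per_well = rng.triangular(1.0, 2.0, 4.0, size=n)
acres_per_pad = rng.triangular(3.0, 5.0, 8.0, size=n)

# Chained relationship: volume -> number of wells -> disturbed acreage.
habitat_acres = (gas_bcf / bcf_per_well) * acres_per_pad

for p in (5, 50, 95):
    print(f"F{p:02d}: {np.percentile(habitat_acres, p):,.0f} acres")
```

Reporting fractiles of the output distribution rather than a single value is what carries the input uncertainties through to the impact estimate.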
Paper Capillary Enables Effective Sampling for Microfluidic Paper Analytical Devices.
Shangguan, Jin-Wen; Liu, Yu; Wang, Sha; Hou, Yun-Xuan; Xu, Bi-Yi; Xu, Jing-Juan; Chen, Hong-Yuan
2018-06-06
Paper capillary is introduced to enable effective sampling on microfluidic paper analytical devices. By coupling the macroscale capillary force of a paper capillary with the microscale capillary forces of native paper, fluid transport can be flexibly tailored with proper design. Subsequently, a hybrid-fluid-mode paper capillary device was proposed, which enables fast and reliable sampling in an arrayed form, with less surface adsorption and bias for different components. The resulting device thus supports high-throughput, quantitative, and repeatable assays operated entirely by hand. With all these merits, multiplexed analysis of ions, proteins, and microbes has been realized on this platform, paving the way to a higher level of analysis on μPADs.
NASA Astrophysics Data System (ADS)
Wang, Shang; Lopez, Andrew L.; Morikawa, Yuka; Tao, Ge; Li, Jiasong; Larina, Irina V.; Martin, James F.; Larin, Kirill V.
2015-03-01
Optical coherence elastography (OCE) is an emerging low-coherence imaging technique that provides noninvasive assessment of tissue biomechanics with high spatial resolution. Among various OCE methods, the capability of quantitative measurement of tissue elasticity is of great importance for tissue characterization and pathology detection across different samples. Here we report a quantitative OCE technique, termed quantitative shear wave imaging optical coherence tomography (Q-SWI-OCT), which enables noncontact measurement of tissue Young's modulus based on ultra-fast imaging of the shear wave propagation inside the sample. A focused air-puff device is used to interrogate the tissue with a low-pressure, short-duration air stream that stimulates a localized displacement on the micron scale. The propagation of this tissue deformation in the form of a shear wave is captured by a phase-sensitive OCT system with M-mode imaging scanned over the path of the wave propagation. The temporal characteristics of the shear wave are quantified based on the cross-correlation of the tissue deformation profiles at all the measurement locations, and linear regression is utilized to fit the data plotted in the domain of time delay versus wave propagation distance. The wave group velocity is thus calculated, which results in the quantitative measurement of the Young's modulus. As a feasibility demonstration, experiments are performed on tissue-mimicking phantoms with different agar concentrations, and the quantified elasticity values from Q-SWI-OCT agree well with uniaxial compression tests. For functional characterization of myocardium with this OCE technique, we perform pilot experiments on ex vivo mouse cardiac muscle tissues with two studies: 1) the elasticity difference of cardiac muscle under relaxed and contracted conditions, and 2) the mechanical heterogeneity of the heart introduced by the muscle fiber orientation. Our results suggest the potential of using Q-SWI-OCT as an essential tool for nondestructive biomechanical evaluation of myocardium.
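A minimal sketch of the elasticity calculation described above: the shear wave group velocity is the slope of the distance-versus-delay regression, and for soft, nearly incompressible tissue the Young's modulus follows as E = 3ρc². The delays and distances below are illustrative stand-ins for the cross-correlation output:

```python
import numpy as np

# Illustrative cross-correlation output: measurement locations (mm) along
# the propagation path and arrival delays (ms) of the air-puff shear wave.
distance_mm = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
delay_ms = np.array([0.00, 0.31, 0.59, 0.92, 1.18])

# Group velocity = slope of the distance-versus-delay linear regression.
c = np.polyfit(delay_ms * 1e-3, distance_mm * 1e-3, 1)[0]   # m/s

rho = 1000.0            # kg/m^3, soft-tissue density assumption
E = 3.0 * rho * c ** 2  # Young's modulus, incompressible medium (E = 3*G)
print(f"c = {c:.2f} m/s, E = {E / 1e3:.1f} kPa")
```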
Using Electronic Messaging to Improve the Quality of Instruction.
ERIC Educational Resources Information Center
Zack, Michael H.
1995-01-01
Qualitative and quantitative data from business students using electronic mail and computer conferencing showed these methods enabled the instructor to be more accessible and responsive; greater class cohesion developed, and perceived quality of the course and instructor effectiveness increased. (SK)
Advanced Technologies for Structural and Functional Optical Coherence Tomography
2015-01-07
Vertical scale bar: 500 µm. OCT speckle noise can significantly affect polarimetry measurement and must be reduced for birefringence … The technique shown in Figure 7 enables more accurate polarimetry measurement and quantitative assessment of tissue birefringence.
Grid-Enabled Quantitative Analysis of Breast Cancer
2010-10-01
… large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer … In this research, we designed a pilot study utilizing large-scale parallel Grid computing, harnessing nationwide infrastructure for medical image analysis. Also …
Using Technology to Balance Algebraic Explorations
ERIC Educational Resources Information Center
Kurz, Terri L.
2013-01-01
In 2000, the "National Council of Teachers of Mathematics" recommended that Algebra Standards, "instructional programs from prekindergarten through grade 12 should enable all students to use mathematical models to represent and understand quantitative relationships." In this article, the authors suggest the "Balance"…
Kettenbach, Arminja N; Sano, Hiroyuki; Keller, Susanna R; Lienhard, Gustav E; Gerber, Scott A
2015-01-30
The study of cellular signaling remains a significant challenge for translational and clinical research. In particular, robust and accurate methods for quantitative phosphoproteomics in tissues and tumors represent significant hurdles for such efforts. In the present work, we design, implement and validate a method for single-stage phosphopeptide enrichment and stable isotope chemical tagging, or SPECHT, which enables iTRAQ, TMT and/or reductive dimethyl-labeling strategies to be applied to phosphoproteomics experiments performed on primary tissue. We develop and validate our approach using reductive dimethyl-labeling and HeLa cells in culture, and find these results indistinguishable from data generated from more traditional SILAC-labeled HeLa cells mixed at the cell level. We apply the SPECHT approach to the quantitative analysis of insulin signaling in a murine myotube cell line and muscle tissue, identify known as well as new phosphorylation events, and validate these phosphorylation sites using phospho-specific antibodies. Taken together, our work validates chemical tagging post-single-stage phosphoenrichment as a general strategy for studying cellular signaling in primary tissues. Through the use of a quantitatively reproducible, proteome-wide phosphopeptide enrichment strategy, we demonstrated the feasibility of post-phosphopeptide purification chemical labeling and tagging as an enabling approach for quantitative phosphoproteomics of primary tissues. Using reductive dimethyl labeling as a generalized chemical tagging strategy, we compared the performance of post-phosphopeptide purification chemical tagging to the well-established community standard, SILAC, in insulin-stimulated tissue culture cells. We then extended our method to the analysis of low-dose insulin signaling in murine muscle tissue, and report on the analytical and biological significance of our results. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Delhez, Robert; Van der Gaast, S. J.; Wielders, Arno; de Boer, J. L.; Helmholdt, R. B.; van Mechelen, J.; Reiss, C.; Woning, L.; Schenk, H.
2003-02-01
The mineralogy of the surface material of Mars is the key to disclose its present and past life and climates. Clay mineral species, carbonates, and ice (water and CO2) are and/or contain their witnesses. X-ray powder diffraction (XRPD) is the most powerful analytical method to identify and quantitatively characterize minerals in complex mixtures. This paper discusses the development of a working model of an instrument consisting of a reflection mode diffractometer and a transmission mode CCD-XRPD instrument, combined with an XRF module. The CCD-XRD/XRF instrument is analogous to the instrument for Mars missions developed by Sarrazin et al. (1998). This part of the tandem instrument enables "quick and dirty" analysis of powdered (!) matter to monitor semi-quantitatively the presence of clay minerals as a group, carbonates, and ices and yields semi-quantitative chemical information from X-ray fluorescence (XRF). The reflection mode instrument (i) enables in-situ measurements of rocks and soils and quantitative information on the compounds identified, (ii) has a high resolution and reveals large spacings for accurate identification, in particular of clay mineral species, and (iii) the shape of the line profiles observed reveals the kind and approximate amounts of lattice imperfections present. It will be shown that the information obtained with the reflection mode diffractometer is crucial for finding signs of life and changes in the climate on Mars. Obviously this instrument can also be used for other extra-terrestrial research.
Tamosaityte, Sandra; Leipnitz, Elke; Geiger, Kathrin D.; Schackert, Gabriele; Koch, Edmund; Steiner, Gerald; Kirsch, Matthias
2014-01-01
Background Coherent anti-Stokes Raman scattering (CARS) microscopy provides fine resolution imaging and displays morphochemical properties of unstained tissue. Here, we evaluated this technique to delineate and identify brain tumors. Methods Different human tumors (glioblastoma, brain metastases of melanoma and breast cancer) were induced in an orthotopic mouse model. Cryosections were investigated by CARS imaging tuned to probe C-H molecular vibrations, thereby addressing the lipid content of the sample. Raman microspectroscopy was used as reference. Histopathology provided information about the tumor's localization, cell proliferation and vascularization. Results The morphochemical contrast of CARS images enabled identifying brain tumors irrespective of the tumor type and properties: All tumors were characterized by a lower CARS signal intensity than the normal parenchyma. On this basis, tumor borders and infiltrations could be identified with cellular resolution. Quantitative analysis revealed that the tumor-related reduction of CARS signal intensity was more pronounced in glioblastoma than in metastases. Raman spectroscopy enabled relating the CARS intensity variation to the decline of total lipid content in the tumors. The analysis of the immunohistochemical stainings revealed no correlation between tumor-induced cytological changes and the extent of CARS signal intensity reductions. The results were confirmed on samples of human glioblastoma. Conclusions CARS imaging enables label-free, rapid and objective identification of primary and secondary brain tumors. Therefore, it is a potential tool for diagnostic neuropathology as well as for intraoperative tumor delineation. PMID:25198698
Butler, Claire; Brigden, Charlotte; Gage, Heather; Williams, Peter; Holdsworth, Laura; Greene, Kay; Wee, Bee; Barclay, Stephen; Wilson, Patricia
2018-05-16
Hospice at home (HAH) services aim to enable patients to be cared for and die in their place of choice, if that is at home, and to achieve a 'good death'. There is a considerable range of HAH services operating in England. The published evidence focuses on evaluations of individual services which vary considerably, and there is a lack of consistency in terms of the outcome measures reported. The evidence, therefore, does not provide generalisable information, so the question 'What are the features of hospice at home service models that work, for whom, and under what circumstances?' remains unanswered. The study aims to answer this question. This is a mixed-methods study in three phases informed by realist evaluation methodology. All HAH services in England will be invited to participate in a telephone survey to enable the development of a typology of services. In the second phase, case study sites representing the different service types will collect patient data and recruit carers, service managers and commissioners to gather quantitative and qualitative data about service provision and outcomes. A third phase will synthesise and refine the results through consensus workshops. The first survey phase has university ethics approval and the second phase, Integrated Research Application System (IRAS) and Health Research Authority (HRA) approval (IRAS ID:205986, REC:17/LO/0880); the third phase does not require ethics approval. Dissemination will be facilitated by project coapplicants with established connections to national policy-making forums, in addition to publications, conference presentations and reports targeted to service providers and commissioners. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Impact of voice- and knowledge-enabled clinical reporting--US example.
Bushko, Renata G; Havlicek, Penny L; Deppert, Edward; Epner, Stephen
2002-01-01
This study provides qualitative and quantitative estimates of the national- and clinic-level impact of utilizing voice- and knowledge-enabled clinical reporting systems. Using a common-sense estimation methodology, we show that the delivery of health care can experience a dramatic improvement in four areas as a result of the broad use of voice- and knowledge-enabled clinical reporting: (1) process quality, as measured by cost savings; (2) organizational quality, as measured by compliance; (3) clinical quality, as measured by clinical outcomes; and (4) service quality, as measured by patient satisfaction. If only 15 percent of US physicians replaced transcription with modern voice-based clinical reporting methodology, about half a billion dollars could be saved. $6.7 billion could be saved annually if all medical reporting currently transcribed were handled with voice- and knowledge-enabled dictation and reporting systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Najafi, M; El Kaffas, A; Han, B
Purpose: The Clarity Autoscan ultrasound monitoring system allows acquisition of raw radiofrequency (RF) ultrasound data prior to and during radiotherapy. This enables the computation of 3D Quantitative Ultrasound (QUS) tissue parametric maps from these data. We aim to evaluate whether QUS parameters undergo changes with radiotherapy and could thus potentially be used as early predictors and/or markers of treatment response in prostate cancer patients. Methods: In-vivo evaluation was performed under an IRB protocol allowing data collection in prostate patients treated with VMAT, whereby the prostate was imaged through the acoustic window of the perineum. QUS spectroscopy analysis was carried out by computing a tissue power spectrum normalized to the power spectrum obtained from a quartz to remove system transfer function effects. A ROI was selected within the 3D image volume of the prostate. Because longitudinal registration was optimal, the same features could be used to select ROIs at roughly the same location in images acquired on different days. Parametric maps were generated within the rectangular ROIs with window sizes that were approximately 8 times the wavelength of the ultrasound. The mid-band fit (MBF), spectral slope (SS) and spectral intercept (SI) QUS parameters were computed for each window within the ROI and displayed as parametric maps. Quantitative parameters were obtained by averaging each of the spectral parameters over the whole ROI. Results: Data were acquired over 21 treatment fractions. Preliminary results show changes in the parametric maps. MBF values decreased from −33.9 dB to −38.7 dB from pre-treatment to the last day of treatment. The spectral slope increased from −1.1 a.u. to −0.5 a.u., and the spectral intercept decreased from −28.2 dB to −36.3 dB over the 21-fraction treatment regimen. Conclusion: QUS parametric maps change over the course of treatment, which warrants further investigation of their potential use for treatment planning and predicting treatment outcomes. Research was supported by Elekta.
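A minimal sketch of how the three QUS spectral parameters can be computed from a reference-normalized power spectrum, assuming the spectra are already expressed in dB; the band limits and synthetic spectra in the demo are illustrative:

```python
import numpy as np

def qus_parameters(freq_mhz, tissue_db, reference_db, band=(2.0, 8.0)):
    # Normalization to the reference removes the system transfer function;
    # a line fit over the analysis band yields the spectral slope (SS) and
    # intercept (SI), and the mid-band fit (MBF) is the fit value at the
    # band center.
    norm = tissue_db - reference_db
    sel = (freq_mhz >= band[0]) & (freq_mhz <= band[1])
    ss, si = np.polyfit(freq_mhz[sel], norm[sel], 1)
    mbf = ss * 0.5 * (band[0] + band[1]) + si
    return mbf, ss, si

# Synthetic demo spectra in dB versus frequency in MHz.
f = np.linspace(1.0, 10.0, 200)
tissue = -40.0 - 1.2 * f + np.random.default_rng(5).normal(0.0, 0.5, 200)
reference = -10.0 - 0.4 * f
mbf, ss, si = qus_parameters(f, tissue, reference)
print(f"MBF {mbf:.1f} dB, SS {ss:.2f} dB/MHz, SI {si:.1f} dB")
```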
Molenaar, Remco J; Khurshed, Mohammed; Hira, Vashendriya V V; Van Noorden, Cornelis J F
2018-05-26
Altered cellular metabolism is a hallmark of many diseases, including cancer, cardiovascular diseases and infection. The metabolic motor units of cells are enzymes and their activity is heavily regulated at many levels, including the transcriptional, mRNA stability, translational, post-translational and functional level. This complex regulation means that conventional quantitative or imaging assays, such as quantitative mRNA experiments, Western Blots and immunohistochemistry, yield incomplete information regarding the ultimate activity of enzymes, their function and/or their subcellular localization. Quantitative enzyme cytochemistry and histochemistry (i.e., metabolic mapping) show in-depth information on in situ enzymatic activity and its kinetics, function and subcellular localization in an almost true-to-nature situation. We describe a protocol to detect the activity of dehydrogenases, which are enzymes that perform redox reactions to reduce cofactors such as NAD(P)+ and FAD. Cells and tissue sections are incubated in a medium that is specific for the enzymatic activity of one dehydrogenase. Subsequently, the dehydrogenase that is the subject of investigation performs its enzymatic activity in its subcellular site. In a chemical reaction with the reaction medium, this ultimately generates blue-colored formazan at the site of the dehydrogenase's activity. The formazan's absorbance is therefore a direct measure of the dehydrogenase's activity and can be quantified using monochromatic light microscopy and image analysis. The quantitative aspect of this protocol enables researchers to draw statistical conclusions from these assays. Besides observational studies, this technique can be used for inhibition studies of specific enzymes. In this context, studies benefit from the true-to-nature advantages of metabolic mapping, giving in situ results that may be physiologically more relevant than in vitro enzyme inhibition studies. In all, metabolic mapping is an indispensable technique to study metabolism at the cellular or tissue level. The technique is easy to adopt, provides in-depth, comprehensive and integrated metabolic information and enables rapid quantitative analysis.
Floberg, S; Hartvig, P; Lindström, B; Lönner-Holm, G; Odlind, B
1981-09-11
An analytical procedure was developed for the determination of 6-mercaptopurine in plasma. Owing to the polar character and low plasma concentration of the compound, extraction and derivatization were carried out directly from the plasma sample by extractive alkylation. Determination was made using gas chromatography-mass spectrometry with multiple-ion detection. Conditions with respect to the rate of formation and the stability of the derivative formed in the extractive alkylation step were evaluated. The selectivity of the method with respect to azathioprine and its metabolites was thoroughly investigated. No 6-mercaptopurine was formed from azathioprine added to water or plasma and run through the method. The method enables the detection of 2 ng of 6-mercaptopurine in a 1.0-ml plasma sample. Quantitative determinations were made down to 10 ng/ml of 6-mercaptopurine in plasma.
Exact kinetic energy enables accurate evaluation of weak interactions by the FDE-vdW method.
Sinha, Debalina; Pavanello, Michele
2015-08-28
The correlation energy of interaction is an elusive and sought-after interaction between molecular systems. By partitioning the response function of the system into subsystem contributions, the Frozen Density Embedding (FDE)-vdW method provides a computationally amenable nonlocal correlation functional based on the adiabatic connection fluctuation dissipation theorem applied to subsystem density functional theory. In reproducing potential energy surfaces of weakly interacting dimers, we show that FDE-vdW, either employing semilocal or exact nonadditive kinetic energy functionals, is in quantitative agreement with high-accuracy coupled cluster calculations (overall mean unsigned error of 0.5 kcal/mol). When employing the exact kinetic energy (which we term the Kohn-Sham (KS)-vdW method), the binding energies are generally closer to the benchmark, and the energy surfaces are also smoother.
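For context, the adiabatic-connection fluctuation-dissipation (ACFD) expression underlying such correlation functionals has the standard form sketched below (schematic notation following common ACFD conventions, not necessarily the paper's exact symbols); the FDE-vdW step is the partitioning of the response function χ into subsystem contributions:

```latex
% Schematic ACFD correlation energy: chi_lambda is the interacting and
% chi_0 the non-interacting (Kohn-Sham) response, v the Coulomb kernel.
% FDE-vdW's key step is approximating chi by subsystem responses.
E_c = -\frac{1}{2\pi} \int_0^\infty \! d\omega \int_0^1 \! d\lambda \,
      \mathrm{Tr}\!\left[ v \left( \chi_\lambda(i\omega) - \chi_0(i\omega) \right) \right],
\qquad \chi \approx \sum_I \chi^{I}
```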
Integrated Platform for Expedited Synthesis–Purification–Testing of Small Molecule Libraries
2017-01-01
The productivity of medicinal chemistry programs can be significantly increased through the introduction of automation, leading to shortened discovery cycle times. Herein, we describe a platform that consolidates synthesis, purification, quantitation, dissolution, and testing of small molecule libraries. The system was validated through the synthesis and testing of two libraries of binders of polycomb protein EED, and excellent correlation of obtained data with results generated through conventional approaches was observed. The fully automated and integrated platform enables batch-supported compound synthesis based on a broad array of chemical transformations with testing in a variety of biochemical assay formats. A library turnaround time of between 24 and 36 h was achieved, and notably, each library synthesis produces sufficient amounts of compounds for further evaluation in secondary assays thereby contributing significantly to the shortening of medicinal chemistry discovery cycles. PMID:28435537
Flow and clogging of a sheep herd passing through a bottleneck.
Garcimartín, A; Pastor, J M; Ferrer, L M; Ramos, J J; Martín-Gómez, C; Zuriguel, I
2015-02-01
We present an experimental study of a flock passing through a narrow door. Video monitoring of daily routines in a farm has enabled us to collect a sizable amount of data. By measuring the time lapse between the passage of consecutive animals, some features of the flow regime can be assessed. A quantitative definition of clogging is demonstrated based on the passage time statistics. These display broad tails, which can be fitted by power laws with a relatively large exponent. On the other hand, the distribution of burst sizes robustly evidences exponential behavior. Finally, borrowing concepts from granular physics and statistical mechanics, we evaluate the effect of increasing the door size and the performance of an obstacle placed in front of it. The success of these techniques opens new possibilities regarding their eventual extension to the management of human crowds.
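A minimal sketch of the two statistical signatures reported: a power-law tail exponent for the passage-time lapses, estimated with the standard continuous maximum-likelihood (Hill-type) estimator above a threshold, and exponentially decaying burst sizes; the samples are synthetic stand-ins for the farm data:

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic passage-time lapses with a heavy tail (stand-in for the data).
dt = rng.pareto(a=2.5, size=5000) + 1.0

# Continuous maximum-likelihood (Hill-type) estimate of the tail exponent
# alpha for P(t) ~ t^(-alpha) above a chosen threshold t_min.
t_min = 2.0
tail = dt[dt >= t_min]
alpha = 1.0 + len(tail) / np.sum(np.log(tail / t_min))
print(f"power-law tail exponent ~ {alpha:.2f} from {len(tail)} long lapses")

# Burst sizes (animals passing between consecutive clogs) decay
# exponentially; a geometric sample reproduces that signature.
bursts = rng.geometric(p=0.2, size=2000)
print(f"mean burst size ~ {bursts.mean():.1f}")
```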
Measuring the effectiveness and impact of an open innovation platform.
Carroll, Glenn P; Srivastava, Sanjay; Volini, Adam S; Piñeiro-Núñez, Marta M; Vetman, Tatiana
2017-05-01
Today, most pharmaceutical companies complement their traditional R&D models with some variation on the Open Innovation (OI) approach in an effort to better access global scientific talent, ideas and hypotheses. Traditional performance indicators that measure economic returns from R&D through commercialization are often not applicable to the practical assessment of these OI approaches, particularly within the context of early drug discovery. This leaves OI programs focused on early R&D without a standard assessment framework from which to evaluate overall performance. This paper proposes a practical dashboard for such assessment, encompassing quantitative and qualitative elements, to enable decision-making and improvement of future performance. The use of this dashboard is illustrated using real-time data from the Lilly Open Innovation Drug Discovery (OIDD) program. Copyright © 2017 The Author(s). Published by Elsevier Ltd.. All rights reserved.
Perea Palazón, R J; Solé Arqués, M; Prat González, S; de Caralt Robira, T M; Cibeira López, M T; Ortiz Pérez, J T
2015-01-01
Cardiac magnetic resonance imaging is considered the reference technique for characterizing myocardial tissue; for example, T2-weighted sequences make it possible to evaluate areas of edema or myocardial inflammation. However, traditional sequences have many limitations and provide only qualitative information. Moreover, traditional sequences depend on the reference to remote myocardium or skeletal muscle, which limits their ability to detect and quantify diffuse myocardial damage. Recently developed magnetic resonance myocardial mapping techniques enable quantitative assessment of parameters indicative of edema. These techniques have proven better than traditional sequences both in acute cardiomyopathy and in acute ischemic heart disease. This article synthesizes current developments in T2 mapping as well as their clinical applications and limitations. Copyright © 2014 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
Functional magnetic resonance imaging in oncology: state of the art
Guimaraes, Marcos Duarte; Schuch, Alice; Hochhegger, Bruno; Gross, Jefferson Luiz; Chojniak, Rubens; Marchiori, Edson
2014-01-01
In the investigation of tumors with conventional magnetic resonance imaging, both quantitative characteristics, such as size, edema, necrosis, and presence of metastases, and qualitative characteristics, such as contrast enhancement degree, are taken into consideration. However, changes in cell metabolism and tissue physiology which precede morphological changes cannot be detected by the conventional technique. The development of new magnetic resonance imaging techniques has enabled the functional assessment of the structures in order to obtain information on the different physiological processes of the tumor microenvironment, such as oxygenation levels, cellularity and vascularity. The detailed morphological study in association with the new functional imaging techniques allows for an appropriate approach to cancer patients, including the phases of diagnosis, staging, response evaluation and follow-up, with a positive impact on their quality of life and survival rate. PMID:25741058
Metabolomics: Insulin Resistance and Type 2 Diabetes Mellitus
USDA-ARS?s Scientific Manuscript database
Type 2 diabetes mellitus (T2DM) develops over many years, providing an opportunity to consider early prognostic tools that guide interventions to thwart disease. Advancements in analytical chemistry enable quantitation of hundreds of metabolites in biofluids and tissues (metabolomics), providing in...
20180312 - Mechanistic Modeling of Developmental Defects through Computational Embryology (SOT)
Significant advances in the genome sciences, in automated high-throughput screening (HTS), and in alternative methods for testing enable rapid profiling of chemical libraries for quantitative effects on diverse cellular activities. While a surfeit of HTS data and information is n...
The Evaluator's Perspective: Evaluating the State Capacity Building Program.
ERIC Educational Resources Information Center
Madey, Doren L.
A historical antagonism between the advocates of quantitative evaluation methods and the proponents of qualitative evaluation methods has stymied the recognition of the value to be gained by utilizing both methodologies in the same study. The integration of quantitative and qualitative methods within a single evaluation has synergistic effects in…
Schalk, Kathrin; Koehler, Peter; Scherf, Katharina Anne
2018-04-04
Celiac disease is triggered by the ingestion of gluten from wheat, barley, rye, and possibly oats. Gluten is quantitated by DNA-based methods or enzyme-linked immunosorbent assays (ELISAs). ELISAs mostly detect the prolamin fraction and potentially over- or underestimate gluten contents. Therefore, a new independent method is required to comprehensively detect gluten. A targeted liquid chromatography-tandem mass spectrometry method was developed to quantitate seven barley, seven rye, and three oat marker peptides derived from each gluten protein fraction (prolamin and glutelin) and type (barley, B-, C-, D-, and γ-hordeins; rye, γ-75k-, γ-40k-, ω-, and HMW-secalins). The quantitation of each marker peptide in the chymotryptic digest of a defined amount of the respective reference gluten protein type resulted in peptide-specific yields, which enabled the conversion of peptide into protein concentrations. This method was applied to quantitate gluten in samples from the brewing process, in raw materials for sourdough fermentation, and in dried sourdoughs.
Quantitation without Calibration: Response Profile as an Indicator of Target Amount.
Debnath, Mrittika; Farace, Jessica M; Johnson, Kristopher D; Nesterova, Irina V
2018-06-21
Quantitative assessment of biomarkers is essential in numerous contexts, from decision-making in clinical situations to food quality monitoring to the interpretation of life-science research findings. However, appropriate quantitation techniques are not as widely addressed as detection methods. One of the major challenges in biomarker quantitation is the need for a calibration that correlates a measured signal to a target amount. This step complicates the methodologies and makes them less sustainable. In this work we address the issue via a new strategy: relying on the position of the response profile, rather than on an absolute signal value, for assessment of a target's amount. In order to enable this capability, we develop a target-probe binding mechanism based on a negative cooperativity effect. A proof-of-concept example demonstrates that the model is suitable for quantitative analysis of nucleic acids over a wide concentration range. The general principles of the platform will be applicable toward a variety of biomarkers such as nucleic acids, proteins, peptides, and others.
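A minimal sketch of quantitation by profile position rather than absolute signal, using a simple 1:1 binding curve as a stand-in for the paper's negative-cooperativity mechanism: the midpoint of the bound-fraction profile versus added probe shifts with the target amount, so the amount can be fit without a signal calibration curve. All values are illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

def bound_fraction(probe, target, K=1.0):
    # Exact 1:1 binding (P + T <-> PT, dissociation constant K): fraction
    # of added probe that is bound, from the quadratic solution.
    b = probe + target + K
    return (b - np.sqrt(b ** 2 - 4.0 * probe * target)) / (2.0 * probe)

probe = np.logspace(-2, 2, 25)            # titrated probe amounts
signal = bound_fraction(probe, target=5.0)
signal = signal + np.random.default_rng(3).normal(0.0, 0.01, probe.size)

# The *position* of the profile, not its absolute amplitude, encodes the
# target amount, so it can be fit with no signal calibration curve.
(target_fit,), _ = curve_fit(lambda p, T: bound_fraction(p, T),
                             probe, signal, p0=[1.0])
print(f"estimated target amount ~ {target_fit:.2f} (true value 5.0)")
```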
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hou, Hongfei; Deng, Zhiqun; Martinez, Jayson
Currently, approximately 16% of the world’s electricity and over 80% of the world’s renewable electricity is generated from hydropower resources, and there is potential for development of a significant amount of new hydropower capacity. In practice, however, realizing all of the potential hydropower resources is limited by various factors, including environmental effects and related mitigation requirements. That is why hydropower regulatory requirements frequently call for targets to be met regarding fish injury and mortality rates. The Hydropower Biological Evaluation Toolset (HBET), an integrated suite of software tools, is designed to characterize hydraulic conditions of hydropower structures and provide quantitative estimates of fish injury and mortality rates due to various physical stressors, including strike, pressure, and shear. HBET enables users to design new studies, analyze data, perform statistical analyses, and evaluate biological responses. In this paper, we discuss the features of the HBET software and describe a case study that illustrates its functionalities. HBET can be used by turbine manufacturers, hydropower operators, and regulators to design and operate hydropower systems that minimize ecological impacts in a cost-effective manner.
Candy, Bridget; France, Rachel; Low, Joe; Sampson, Liz
2015-03-01
Despite the extent of volunteers' contribution to palliative care, and their role in direct patient care, there has been no systematic evaluation of the evidence base on volunteers in relation to patient and family wellbeing. Our aim was to critically review research on the impact of volunteers involved in the direct care of palliative patients and their families. We searched thirteen citation databases, up to May 2013, for studies reporting patient and family data on the impact of volunteer services in palliative care. We included quantitative comparative studies. We also noted any non-comparative studies, enabling us to give a comprehensive review of the existing research, and we included qualitative studies that explored the experiences of patients and families who received volunteer support, potentially illustrating which aspects of volunteer activities patients and families value. We applied quality appraisal criteria to all studies meeting the inclusion criteria. Two researchers undertook the key review processes. We found eight studies. Only two studies were undertaken outside of North America: one in the Netherlands and the other in Uganda. All studies were in adult palliative care services. All of the studies evaluated volunteers in home care settings; three also included other settings such as hospitals and nursing homes. All of the studies fulfilled our quality appraisal criteria. Six of them were quantitative studies, two of which were comparative: one found that families who experienced greater (as opposed to lesser) volunteer involvement were significantly more satisfied with care; the other found that patients survived significantly longer if they had received home visits from a volunteer. Four cross-sectional studies focused on satisfaction ratings. No study considered possible disadvantages or adverse effects of volunteer involvement. Two qualitative studies were identified; both highlighted the uniqueness of the role volunteers may fulfil in care support, from the viewpoint of patients and their families. Further research is needed to ensure the resource of volunteers in palliative care is used appropriately and effectively. Evaluation in well-designed comparative studies is recommended, including economic analyses, as are further qualitative studies to explore the roles, benefits and possible adverse effects of volunteers. Evaluation is particularly needed outside of North America and in dedicated hospice facilities. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Liao-Chan, Sindy; Daine-Matsuoka, Barbara; Heald, Nathan; Wong, Tiffany; Lin, Tracey; Cai, Allen G; Lai, Michelle; D'Alessio, Joseph A; Theunissen, Jan-Willem
2015-01-01
Antibodies against cell surface antigens may be internalized through their specific interactions with these proteins and in some cases may induce or perturb antigen internalization. The anti-cancer efficacy of antibody-drug conjugates is thought to rely on their uptake by cancer cells expressing the surface antigen. Numerous techniques, including microscopy and flow cytometry, have been used to identify antibodies with desired cellular uptake rates. To enable quantitative measurements of internalization of labeled antibodies, an assay based on internalized and quenched fluorescence was developed. For this approach, we generated novel anti-Alexa Fluor monoclonal antibodies (mAbs) that effectively and specifically quench cell surface-bound Alexa Fluor 488 or Alexa Fluor 594 fluorescence. Utilizing Alexa Fluor-labeled mAbs against the EphA2 receptor tyrosine kinase, we showed that the anti-Alexa Fluor reagents could be used to monitor internalization quantitatively over time. The anti-Alexa Fluor mAbs were also validated in a proof-of-concept dual-label internalization assay with simultaneous exposure of cells to two different mAbs. Importantly, the unique anti-Alexa Fluor mAbs described here may also enable other single- and dual-label experiments, including label detection and signal enhancement in macromolecules, trafficking of proteins and microorganisms, and cell migration and morphology.
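The quench-based readout lends itself to a simple calculation: if surface-bound dye is quenched with efficiency QE, the internalized fraction follows from the fluorescence measured with and without the quenching mAb. A minimal sketch under that assumption; the helper function and all numbers are illustrative, not the authors' analysis:

```python
import numpy as np

# Minimal sketch (not the authors' pipeline): estimating the internalized
# fraction in a fluorescence-quenching internalization assay.
# Assumptions: f_total is fluorescence without the anti-dye quenching mAb,
# f_quench is fluorescence after adding it, and qe is the quenching
# efficiency of surface-bound dye (e.g. measured on cells held at 4 degC,
# where no internalization occurs).

def internalized_fraction(f_total: float, f_quench: float, qe: float) -> float:
    """Solve f_quench = F_int + (1 - qe) * F_surf, with f_total = F_int + F_surf."""
    f_int = (f_quench - (1.0 - qe) * f_total) / qe
    return float(np.clip(f_int / f_total, 0.0, 1.0))

print(internalized_fraction(f_total=10_000, f_quench=4_600, qe=0.9))  # ~0.40
```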
Shemesh-Mayer, Einat; Ben-Michael, Tomer; Rotem, Neta; Rabinowitch, Haim D.; Doron-Faigenboim, Adi; Kosmala, Arkadiusz; Perlikowski, Dawid; Sherman, Amir; Kamenetsky, Rina
2015-01-01
Commercial cultivars of garlic, a popular condiment, are sterile, making genetic studies and breeding of this plant challenging. However, recent fertility restoration has enabled advanced physiological and genetic research and hybridization in this important crop. Morphophysiological studies, combined with transcriptome and proteome analyses and quantitative PCR validation, enabled the identification of genes and specific processes involved in gametogenesis in fertile and male-sterile garlic genotypes. Both genotypes exhibit normal meiosis at early stages of anther development, but in the male-sterile plants, tapetal hypertrophy after microspore release leads to pollen degeneration. Transcriptome analysis and global gene-expression profiling showed that >16,000 genes are differentially expressed in the fertile vs. male-sterile developing flowers. Proteome analysis and quantitative comparison of 2D-gel protein maps revealed 36 significantly different protein spots, 9 of which were present only in the male-sterile genotype. Bioinformatic and quantitative PCR validation of 10 candidate genes exhibited significant expression differences between male-sterile and fertile flowers. A comparison of morphophysiological and molecular traits of fertile and male-sterile garlic flowers suggests that respiratory restrictions and/or non-regulated programmed cell death of the tapetum can lead to energy deficiency and consequent pollen abortion. Potential molecular markers for male fertility and sterility in garlic are proposed. PMID:25972879
Hyperspectral and differential CARS microscopy for quantitative chemical imaging in human adipocytes
Di Napoli, Claudia; Pope, Iestyn; Masia, Francesco; Watson, Peter; Langbein, Wolfgang; Borri, Paola
2014-01-01
In this work, we demonstrate the applicability of coherent anti-Stokes Raman scattering (CARS) micro-spectroscopy for quantitative chemical imaging of saturated and unsaturated lipids in human stem-cell derived adipocytes. We compare dual-frequency/differential CARS (D-CARS), which enables rapid imaging and simple data analysis, with broadband hyperspectral CARS microscopy analyzed using an unsupervised phase-retrieval and factorization method recently developed by us for quantitative chemical image analysis. Measurements were taken in the vibrational fingerprint region (1200–2000 cm⁻¹) and in the CH stretch region (2600–3300 cm⁻¹) using a home-built CARS set-up which enables hyperspectral imaging with 10 cm⁻¹ resolution via spectral focussing from a single broadband 5 fs Ti:Sa laser source. Through a ratiometric analysis, both D-CARS and phase-retrieved hyperspectral CARS determine the concentration of unsaturated lipids with comparable accuracy in the fingerprint region, while in the CH stretch region D-CARS provides only a qualitative contrast owing to its non-linear behavior. When analyzing hyperspectral CARS images using the blind factorization into susceptibilities and concentrations of chemical components recently demonstrated by us, we are able to determine vol:vol concentrations of different lipid components and spatially resolve inhomogeneities in lipid composition with superior accuracy compared to state-of-the-art ratiometric methods. PMID:24877002
A quantitative literature-curated gold standard for kinase-substrate pairs
2011-01-01
We describe the Yeast Kinase Interaction Database (KID, http://www.moseslab.csb.utoronto.ca/KID/), which contains high- and low-throughput data relevant to phosphorylation events. KID includes 6,225 low-throughput and 21,990 high-throughput interactions, from greater than 35,000 experiments. By quantitatively integrating these data, we identified 517 high-confidence kinase-substrate pairs that we consider a gold standard. We show that this gold standard can be used to assess published high-throughput datasets, suggesting that it will enable similar rigorous assessments in the future. PMID:21492431
Simulation Of Combat With An Expert System
NASA Technical Reports Server (NTRS)
Provenzano, J. P.
1989-01-01
Proposed expert system predicts outcomes of combat situations. Called "COBRA" (Combat Outcome Based on Rules for Attrition), the system selects rules for mathematical modeling of losses and discrete events in combat according to previous experiences. It is used with another software module known as the "Game". The Game/COBRA software system, consisting of the Game and COBRA modules, provides for both quantitative and qualitative aspects in simulations of battles. Although COBRA is intended for simulation of large-scale military exercises, the concepts embodied in it have much broader applicability. In industrial research, a knowledge-based system enables qualitative as well as quantitative simulations.
Quantitative structure parameters from the NMR spectroscopy of quadrupolar nuclei
Perras, Frederic A.
2015-12-15
Nuclear magnetic resonance (NMR) spectroscopy is one of the most important characterization tools in chemistry; however, three-quarters of the NMR-active nuclei are underutilized due to their quadrupolar nature. This short review centers on the development of methods that use solid-state NMR of quadrupolar nuclei for obtaining quantitative structural information. Namely, techniques using dipolar recoupling as well as the resolution afforded by double-rotation are presented for the measurement of spin–spin coupling between quadrupoles, enabling the measurement of internuclear distances and connectivities.
Food intake attenuates the drug interaction between new quinolones and aluminum.
Imaoka, Ayuko; Abiru, Kosuke; Akiyoshi, Takeshi; Ohtani, Hisakazu
2018-01-01
Intestinal absorption of new quinolones is decreased by oral co-administration of polyvalent metal cations. Some clinical studies have demonstrated that this drug-drug interaction is more prominent under fasted conditions. However, the effect of food intake on the extent of the drug-drug interaction between new quinolones and metal cations had not been investigated quantitatively and systematically. The aim of this study was to develop an animal model that enables evaluation of the effect of food intake on the extent of drug-drug interactions in the gastrointestinal tract caused by chelation, and to apply the model to quantitatively evaluate the effect of food intake on the drug-drug interaction between two new quinolones, ofloxacin and ciprofloxacin, and sucralfate. Rats were orally administered a new quinolone (5.3 mg/kg of ofloxacin or 10 mg/kg of ciprofloxacin) with or without 13.3 mg/kg of sucralfate under fasted or fed conditions, and the plasma concentration profiles of the quinolones were monitored. For the fed group, the standard breakfast used in human studies was made into a paste and administered at a dose of 8.8 g/kg. The areas under the plasma concentration-time curves (AUC(0-6)) of ofloxacin and ciprofloxacin under the fasted condition were significantly decreased, to 28.8% and 17.1%, respectively, by co-administration of sucralfate. In contrast, sucralfate only moderately decreased the AUC(0-6) of ofloxacin and ciprofloxacin, to 54.9% and 33.2%, respectively, under the fed condition. The effects of sucralfate and food intake on the kinetics of ofloxacin in this study were highly consistent with the results of a previous clinical trial. The developed animal model thus quantitatively reproduced the effect of food intake on the drug-drug interaction between ofloxacin and sucralfate. Similar influences were observed for the drug-drug interaction between ciprofloxacin and sucralfate, suggesting that the extent of drug-drug interactions caused by chelation is generally attenuated by food intake.
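The pharmacokinetic endpoint here, AUC(0-6), is a standard trapezoidal-rule integral of the plasma concentration-time profile. A minimal sketch of the calculation; the sampling times and concentrations below are illustrative, not data from this study:

```python
import numpy as np

# Minimal sketch: AUC(0-6) from sampled plasma concentrations by the linear
# trapezoidal rule. All values are illustrative only.
t = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 4.0, 6.0])   # sampling times (h)
c = np.array([0.0, 0.8, 1.4, 1.9, 1.5, 0.9, 0.5])    # plasma conc. (ug/mL)

auc_0_6 = np.trapz(c, t)                              # ug*h/mL
print(f"AUC(0-6) = {auc_0_6:.2f} ug*h/mL")

# Percent of control, as in the abstract's fasted/fed comparisons:
auc_control = 7.0                                     # illustrative control AUC
print(f"{100 * auc_0_6 / auc_control:.1f}% of control")
```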
Preceptors' perceptions of a preceptorship programme for newly qualified nurses.
Muir, Jenny; Ooms, Ann; Tapping, Jen; Marks-Maran, Di; Phillips, Sonia; Burke, Linda
2013-06-01
A study was undertaken into preceptors' perceptions of a preceptorship programme for newly qualified nurses. The preceptorship programme is designed to enable newly qualified nurses to make the transition from student to registered nurse, and preceptors undergo a training programme to take on the role of preceptor. The aim was to evaluate the preceptors' perceptions of the preceptorship programme. A mixed-methods evaluative research design was used. The study took place in 2011 in one National Health Service Healthcare Trust in South West London, UK. Ninety preceptors were invited to participate, and the response rate was 44.4% (n=40). Qualitative and quantitative data were collected through questionnaires and one-to-one interviews with a convenience sample of preceptors. Quantitative data were analysed using SPSS, version 18; qualitative data were analysed using the Framework Method. Seven themes emerged from the quantitative data: preceptors' perceptions of the personal development of preceptees; the role development of preceptees; the communication skills development of preceptees; the clinical development of preceptees; the development of professional relationships by preceptees; the value of the preceptorship programme to the organisation; and the value of being a preceptor for their own professional development. Qualitative analysis confirmed many of the findings from the statistical analysis and was used to triangulate those findings. The preceptors largely viewed the preceptorship programme and their role within it positively. Although difficulty in making time to meet with preceptees was an issue, the preceptorship experience was perceived to have a positive impact on several aspects of preceptee development as well as on the organisation and on the preceptors' own development. The study is unique when mapped against other research because there is little in the literature about preceptors' perceptions of preceptorship programmes. Copyright © 2013 Elsevier Ltd. All rights reserved.
Predicting Future Morphological Changes of Lesions from Radiotracer Uptake in 18F-FDG-PET Images
Bagci, Ulas; Yao, Jianhua; Miller-Jaster, Kirsten; Chen, Xinjian; Mollura, Daniel J.
2013-01-01
We introduce a novel computational framework to enable automated identification of texture and shape features of lesions on 18F-FDG-PET images through a graph-based image segmentation method. The proposed framework predicts future morphological changes of lesions with high accuracy. The presented methodology has several benefits over conventional qualitative and semi-quantitative methods, owing to its fully quantitative nature and high accuracy in each step of (i) detection, (ii) segmentation, and (iii) feature extraction. To evaluate our proposed computational framework, thirty patients received two 18F-FDG-PET scans (60 scans in total) at two different time points. Metastatic papillary renal cell carcinoma, cerebellar hemangioblastoma, non-small cell lung cancer, neurofibroma, lymphomatoid granulomatosis, lung neoplasm, neuroendocrine tumor, soft tissue thoracic mass, nonnecrotizing granulomatous inflammation, renal cell carcinoma with papillary and cystic features, diffuse large B-cell lymphoma, metastatic alveolar soft part sarcoma, and small cell lung cancer were included in this analysis. The radiotracer accumulation in patients' scans was automatically detected and segmented by the proposed segmentation algorithm. Delineated regions were used to extract shape and textural features, with the proposed adaptive feature extraction framework, as well as standardized uptake values (SUV) of uptake regions, to conduct a broad quantitative analysis. Evaluation of segmentation results indicates that our proposed segmentation algorithm has a mean dice similarity coefficient of 85.75±1.75%. We found that 28 of 68 extracted imaging features correlated well with SUVmax (p<0.05), and that some of the textural features (such as entropy and maximum probability) were superior in predicting longitudinal morphological changes of radiotracer uptake regions, compared to a single intensity feature such as SUVmax. We also found that integrating textural features with SUV measurements significantly improves the prediction accuracy of morphological changes (Spearman correlation coefficient = 0.8715, p<2e-16). PMID:23431398
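Entropy and maximum probability, the textural features singled out above, are standard statistics of a gray-level co-occurrence matrix (GLCM). A minimal sketch of their computation on a toy quantized image; this is not the paper's implementation:

```python
import numpy as np

# Minimal sketch: entropy and maximum probability computed from a symmetric,
# normalized gray-level co-occurrence matrix (GLCM) of a quantized image.

def glcm(img: np.ndarray, levels: int, d=(0, 1)) -> np.ndarray:
    """Symmetric, normalized GLCM for a single pixel offset d."""
    p = np.zeros((levels, levels))
    rows, cols = img.shape
    for r in range(rows - d[0]):
        for c in range(cols - d[1]):
            i, j = img[r, c], img[r + d[0], c + d[1]]
            p[i, j] += 1
            p[j, i] += 1          # symmetrize
    return p / p.sum()

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])    # toy quantized uptake region

p = glcm(img, levels=4)
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
max_prob = p.max()
print(f"entropy = {entropy:.3f} bits, maximum probability = {max_prob:.3f}")
```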
Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?
Gizak, Agnieszka; Rakus, Dariusz
2016-01-11
Molecular and cellular biology methodology is traditionally based on the reasoning called "the mechanistic explanation". In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge of all parameters of the studied system. In practice, however, due to the systems' complexity, this requirement is rarely, if ever, met. Typically, it is limited to quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of changes in expression or post-translational modification of selected proteins. A quantitative proteomics approach offers the possibility of quantitatively characterizing the entire proteome of a biological system, in terms of both protein titer and post-translational modifications. This not only enables more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.
Pollock, Samuel B; Hu, Amy; Mou, Yun; Martinko, Alexander J; Julien, Olivier; Hornsby, Michael; Ploder, Lynda; Adams, Jarrett J; Geng, Huimin; Müschen, Markus; Sidhu, Sachdev S; Moffat, Jason; Wells, James A
2018-03-13
Human cells express thousands of different surface proteins that can be used for cell classification, or to distinguish healthy and disease conditions. A method capable of profiling a substantial fraction of the surface proteome simultaneously and inexpensively would enable more accurate and complete classification of cell states. We present a highly multiplexed and quantitative surface proteomic method using genetically barcoded antibodies called phage-antibody next-generation sequencing (PhaNGS). Using 144 preselected antibodies displayed on filamentous phage (Fab-phage) against 44 receptor targets, we assess changes in B cell surface proteins after the development of drug resistance in a patient with acute lymphoblastic leukemia (ALL) and in adaptation to oncogene expression in a Myc-inducible Burkitt lymphoma model. We further show PhaNGS can be applied at the single-cell level. Our results reveal that a common set of proteins including FLT3, NCR3LG1, and ROR1 dominate the response to similar oncogenic perturbations in B cells. Linking high-affinity, selective, genetically encoded binders to NGS enables direct and highly multiplexed protein detection, comparable to RNA-sequencing for mRNA. PhaNGS has the potential to profile a substantial fraction of the surface proteome simultaneously and inexpensively to enable more accurate and complete classification of cell states. Copyright © 2018 the Author(s). Published by PNAS.
Elliott, Jonathan T.; Samkoe, Kimberley S.; Davis, Scott C.; Gunn, Jason R.; Paulsen, Keith D.; Roberts, David W.; Pogue, Brian W.
2017-01-01
Receptor concentration imaging (RCI) with targeted-untargeted optical dye pairs has enabled in vivo immunohistochemistry analysis in preclinical subcutaneous tumors. Successful application of RCI to fluorescence guided resection (FGR), so that quantitative molecular imaging of tumor-specific receptors could be performed in situ, would have a high impact. However, assumptions of pharmacokinetics, permeability and retention, as well as the lack of a suitable reference region limit the potential for RCI in human neurosurgery. In this study, an arterial input graphic analysis (AIGA) method is presented which is enabled by independent component analysis (ICA). The percent difference in arterial concentration between the image-derived arterial input function (AIFICA) and that obtained by an invasive method (ICACAR) was 2.0 ± 2.7% during the first hour of circulation of a targeted-untargeted dye pair in mice. Estimates of distribution volume and receptor concentration in tumor bearing mice (n = 5) recovered using the AIGA technique did not differ significantly from values obtained using invasive AIF measurements (p=0.12). The AIGA method, enabled by the subject-specific AIFICA, was also applied in a rat orthotopic model of U-251 glioblastoma to obtain the first reported receptor concentration and distribution volume maps during open craniotomy. PMID:26349671
STARE-HI – Statement on Reporting of Evaluation Studies in Health Informatics
Brender, J.; Talmon, J.; de Keizer, N.; Nykänen, P.; Rigby, M.; Ammenwerth, E.
2013-01-01
Summary Background Improving the quality of reporting of evaluation studies in health informatics is an important requirement towards the vision of evidence-based health informatics. The STARE-HI – Statement on Reporting of Evaluation Studies in health informatics, published in 2009, provides guidelines on the elements to be contained in an evaluation study report. Objectives To elaborate on and provide a rationale for the principles of STARE-HI and to guide authors and readers of evaluation studies in health informatics by providing explanatory examples of reporting. Methods A group of methodologists, researchers and editors prepared the present elaboration of the STARE-HI statement and selected examples from the literature. Results The 35 STARE-HI items to be addressed in evaluation papers describing health informatics interventions are discussed one by one and each is extended with examples and elaborations. Conclusion The STARE-HI statement and this elaboration document should be helpful resources to improve reporting of both quantitative and qualitative evaluation studies. Evaluation manuscripts adhering to the principles will enable readers of such papers to better place the studies in a proper context and judge their validity and generalizability, and thus in turn optimize the exploitation of the evidence contained therein. Limitations This paper is based on experiences of a group of editors, reviewers, authors of systematic reviews and readers of the scientific literature. The applicability of the details of these principles has to evolve as a function of their use in practice. PMID:24155788
NASA Astrophysics Data System (ADS)
Schelenz, Sophie; Dietrich, Peter; Vienken, Thomas
2016-04-01
A sustainable thermal exploitation of the shallow subsurface requires a precise understanding of all relevant heat transport processes. Current planning practice for shallow geothermal systems (especially systems < 30 kW) focuses on conductive heat transport as the main energy source, while the impact of groundwater flow as the driver of advective heat transport is neglected or strongly simplified. The presented study shows that such simplifications of complex geological and hydrogeological subsurface characteristics are insufficient for a precise evaluation of site-specific energy extraction rates. Based on synthetic model scenarios with varying subsurface conditions (groundwater flow velocity and aquifer thickness), the impact of advection on induced long-term temperature changes at distances of 5 and 10 m from the borehole heat exchanger is presented. Extending known investigations, this study enhances the evaluation of shallow geothermal energy extraction rates by considering conductive and advective heat transport under varying aquifer thicknesses. Further, it evaluates the impact of advection on the installation length of the borehole heat exchanger, so as to optimize the initial financial investment. Finally, an evaluation approach is presented that classifies the relevant heat transport processes according to their Péclet number, to enable a first quantitative assessment of the subsurface energy regime and to recommend further investigation and planning procedures.
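The Péclet number used in this classification is the ratio of advective to conductive heat transport. A minimal sketch, assuming the common porous-media definition Pe = ρ_w c_w q L / λ (Darcy flux q, characteristic length L, bulk thermal conductivity λ); the parameter values are illustrative, not taken from the study:

```python
# Minimal sketch, assuming the thermal Peclet number for porous media,
# Pe = (rho_w * c_w * q * L) / lam: advective over conductive heat transport.
# All parameter values are illustrative.

def peclet(q: float, L: float, lam: float,
           rho_w: float = 1000.0, c_w: float = 4186.0) -> float:
    """q: Darcy flux (m/s), L: characteristic length (m),
    lam: bulk thermal conductivity (W/(m*K))."""
    return rho_w * c_w * q * L / lam

for q in (1e-9, 1e-8, 1e-7):   # slow to moderate groundwater flow
    print(f"q = {q:.0e} m/s -> Pe = {peclet(q, L=10.0, lam=2.4):.3f}")
# Pe << 1: conduction dominates; Pe around 1 or above: advection is significant.
```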
Modeling and evaluating user behavior in exploratory visual analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reda, Khairi; Johnson, Andrew E.; Papka, Michael E.
Empirical evaluation methods for visualizations have traditionally focused on assessing the outcome of the visual analytic process as opposed to characterizing how that process unfolds. There are only a handful of methods that can be used to systematically study how people use visualizations, making it difficult for researchers to capture and characterize the subtlety of cognitive and interaction behaviors users exhibit during visual analysis. To validate and improve visualization design, however, it is important for researchers to be able to assess and understand how users interact with visualization systems under realistic scenarios. This paper presents a methodology for modeling and evaluating the behavior of users in exploratory visual analysis. We model visual exploration using a Markov chain process comprising transitions between mental, interaction, and computational states. These states and the transitions between them can be deduced from a variety of sources, including verbal transcripts, videos and audio recordings, and log files. This model enables the evaluator to characterize the cognitive and computational processes that are essential to insight acquisition in exploratory visual analysis, and reconstruct the dynamics of interaction between the user and the visualization system. We illustrate this model with two exemplar user studies, and demonstrate the qualitative and quantitative analytical tools it affords.
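As a concrete illustration of the Markov-chain formulation, a minimal sketch of estimating a first-order transition matrix from a coded session log; the state labels and the example sequence are invented for illustration, not taken from the paper's studies:

```python
import numpy as np

# Minimal sketch: estimate a first-order Markov transition matrix over coded
# analyst states from a session log. States and sequence are illustrative.
states = ["mental", "interaction", "computational"]
idx = {s: i for i, s in enumerate(states)}

session = ["mental", "interaction", "computational", "mental",
           "interaction", "interaction", "computational", "mental"]

counts = np.zeros((len(states), len(states)))
for a, b in zip(session, session[1:]):
    counts[idx[a], idx[b]] += 1          # tally each observed transition

P = counts / counts.sum(axis=1, keepdims=True)   # row-normalize
print(np.round(P, 2))                            # P[i, j] = Pr(next = j | now = i)
```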
D'Elia, Paolo; Coppo, Alessandro; Di Stefano, Francesca; Charrier, Lorena; Piccinelli, Cristiano; Molinar, Roberta; Senore, Carlo; Giordano, Livia; Segnan, Nereo
2008-01-01
Community interventions represent a key component of current anti-smoking strategies. We propose a conceptual framework for classifying these interventions, based on the concept of community utilised in different studies. We identified five different focuses: geographical areas (i.e. city, county, region); targets (sub-groups of a population); settings (school, workplace); culture and individual attitudes; and multilevel networks. The two latter views refer to functional rather than structural aspects of a community, and they represent the most promising approaches for designing intervention strategies: communities are represented as a group of organizations, systems and social networks, and individual, environmental and cultural factors that can strongly influence behavioural change are investigated. The great heterogeneity in what authors mean by community interventions has, in our opinion, affected the evaluation of their impact. To facilitate their evaluation and to contribute to the detection of determinants, as well as of barriers, it is necessary to compare community interventions sharing similar theoretical approaches and focuses. Also, studies aimed at assessing the steps of the implementation process of community programmes may make it possible to identify those components related to specific levels of intervention, thus enabling the generalisation of results. To reach this goal it may be helpful to combine study designs allowing for both quantitative and qualitative assessments, such as action research and participatory evaluation research.
Rock Slide Risk Assessment: A Semi-Quantitative Approach
NASA Astrophysics Data System (ADS)
Duzgun, H. S. B.
2009-04-01
Rock slides can be better managed through systematic risk assessments. Any risk assessment methodology for rock slides involves identification of the rock slide risk components, which are hazard, elements at risk, and vulnerability. For a quantitative/semi-quantitative risk assessment of rock slides, a mathematical value of the risk has to be computed and evaluated. The quantitative evaluation of risk for rock slides enables comparison of the computed risk with the risk of other natural and/or human-made hazards, and provides better decision support and easier communication for decision makers. A quantitative/semi-quantitative risk assessment procedure involves danger identification, hazard assessment, identification of elements at risk, vulnerability assessment, risk computation, and risk evaluation. On the other hand, the steps of this procedure require adaptation of existing implementation methods, or development of new ones, depending on the type of landslide, data availability, investigation scale, and nature of the consequences. In this study, a generic semi-quantitative risk assessment (SQRA) procedure for rock slides is proposed. The procedure has five consecutive stages: data collection and analysis, hazard assessment, analysis of elements at risk, vulnerability assessment, and risk assessment. The implementation of the procedure is illustrated for a single rock slide case, a rock slope in Norway. Rock slides from mountain Ramnefjell to lake Loen are considered to be one of the major geohazards in Norway. Lake Loen is located in the inner part of Nordfjord in Western Norway. Ramnefjell Mountain is heavily jointed, leading to the formation of vertical rock slices with heights between 400-450 m and widths between 7-10 m. These slices threaten the settlements around Loen Valley and the tourists visiting the fjord during the summer season, as released slides have the potential to create a tsunami. Several rock slides were recorded from Ramnefjell Mountain between 1905 and 1950. Among them, four caused the formation of tsunami waves, which washed up to 74 m above the lake level; two of the slides resulted in many fatalities in the inner part of the Loen Valley as well as great damage. There are three predominant joint structures in Ramnefjell Mountain, which control failure and the geometry of the slides. The first joint set is a foliation plane striking northeast-southwest and dipping 35°-40° to the east-southeast. The second and third joint sets are almost perpendicular and parallel to the mountain side and scarp, respectively. These three joint sets form slices of rock columns with widths ranging between 7-10 m and heights of 400-450 m. The joints in set II are reported to be open between 1-2 m, which may allow collection of water during heavy rainfall or snowmelt, causing the slices to be pressed out. It is estimated that water in the vertical joints both reduces the shear strength of the sliding plane and reduces the normal stress on the sliding plane through the formation of an uplift force. Hence rock slides in Ramnefjell Mountain occur in the plane failure mode. The quantitative evaluation of rock slide risk requires probabilistic analysis of rock slope stability and identification of the consequences should a rock slide occur. In this study, the failure probability of a rock slice is evaluated by the first-order reliability method (FORM).
Then, in order to use the calculated probability of failure (Pf) in risk analyses, this Pf must be associated with a frequency-based probability (i.e., Pf/year), since the computed failure probability is a measure of hazard and not a measure of risk unless it is associated with the consequences of failure. This can be done either by considering the time-dependent behavior of the basic variables in the probabilistic models or by associating the computed Pf with the frequency of failures in the region. In this study, the frequency of rock slides in Ramnefjell over the previous century is used to derive the frequency-based probability for the risk assessment. The major consequence of a rock slide is the generation of a tsunami in lake Loen, causing inundation of the residential areas around the lake. Risk is assessed by adapting the damage probability matrix approach, which was originally developed for the risk assessment of buildings in earthquakes.
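To make the FORM step concrete: for a linear limit state g = R − S with independent normal resisting and driving terms, the Hasofer-Lind reliability index has a closed form and Pf = Φ(−β). A minimal sketch with illustrative moments, not the slope model used in the study:

```python
import numpy as np
from scipy.stats import norm

# Minimal sketch, not the study's model: first-order failure probability for a
# simplified plane-failure limit state g = R - S, with resisting capacity R
# and driving load S as independent normal variables. For this linear, normal
# case the Hasofer-Lind index is exact. All moments are illustrative.
mu_R, sd_R = 1.5, 0.3     # resisting force (normalized)
mu_S, sd_S = 1.0, 0.2     # driving force (normalized)

beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)   # reliability index
pf = norm.cdf(-beta)                          # FORM failure probability
print(f"beta = {beta:.2f}, Pf = {pf:.3e}")

# The abstract then associates Pf with a frequency-based measure using the
# historical record, e.g. n slides observed over T years (illustrative counts):
n_slides, T = 4, 100
print(f"historical annual slide frequency ~ {n_slides / T:.2f} / yr")
```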
Quantitative imaging of mammalian transcriptional dynamics: from single cells to whole embryos.
Zhao, Ziqing W; White, Melanie D; Bissiere, Stephanie; Levi, Valeria; Plachta, Nicolas
2016-12-23
Probing dynamic processes occurring within the cell nucleus at the quantitative level has long been a challenge in mammalian biology. Advances in bio-imaging techniques over the past decade have enabled us to directly visualize nuclear processes in situ with unprecedented spatial and temporal resolution and single-molecule sensitivity. Here, using transcription as our primary focus, we survey recent imaging studies that specifically emphasize the quantitative understanding of nuclear dynamics in both time and space. These analyses not only inform on previously hidden physical parameters and mechanistic details, but also reveal a hierarchical organizational landscape for coordinating a wide range of transcriptional processes shared by mammalian systems of varying complexity, from single cells to whole embryos.
A multiplexed system for quantitative comparisons of chromatin landscapes
van Galen, Peter; Viny, Aaron D.; Ram, Oren; Ryan, Russell J.H.; Cotton, Matthew J.; Donohue, Laura; Sievers, Cem; Drier, Yotam; Liau, Brian B.; Gillespie, Shawn M.; Carroll, Kaitlin M.; Cross, Michael B.; Levine, Ross L.; Bernstein, Bradley E.
2015-01-01
Genome-wide profiling of histone modifications can provide systematic insight into the regulatory elements and programs engaged in a given cell type. However, conventional chromatin immunoprecipitation and sequencing (ChIP-seq) does not capture quantitative information on histone modification levels, requires large amounts of starting material, and involves tedious processing of each individual sample. Here we address these limitations with a technology that leverages DNA barcoding to profile chromatin quantitatively and in multiplexed format. We concurrently map relative levels of multiple histone modifications across multiple samples, each comprising as few as a thousand cells. We demonstrate the technology by monitoring dynamic changes following inhibition of P300, EZH2 or KDM5, by linking altered epigenetic landscapes to chromatin regulator mutations, and by mapping active and repressive marks in purified human hematopoietic stem cells. Hence, this technology enables quantitative studies of chromatin state dynamics across rare cell types, genotypes, environmental conditions and drug treatments. PMID:26687680
Early Foundations for Mathematics Learning and Their Relations to Learning Disabilities.
Geary, David C
2013-02-01
Children's quantitative competencies upon entry into school can have lifelong consequences. Children who start behind generally stay behind, and mathematical skills at school completion influence employment prospects and wages in adulthood. I review the current debate over whether early quantitative learning is supported by (a) an inherent system for representing approximate magnitudes, (b) an attentional-control system that enables explicit processing of quantitative symbols, such as Arabic numerals, or (c) the logical problem-solving abilities that facilitate learning of the relations among numerals. Studies of children with mathematical learning disabilities and difficulties have suggested that each of these competencies may be involved, but to different degrees and at different points in the learning process. Clarifying how and when these competencies facilitate early quantitative learning and developing interventions to address their impact on children have the potential to yield substantial benefits for individuals and for society.
Uncertainty of quantitative microbiological methods of pharmaceutical analysis.
Gunar, O V; Sakhno, N G
2015-12-30
The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimation of microbial count, while highly significant effects of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) were established. The most appropriate method for statistical analysis of such data was ANOVA, which enabled estimation not only of the effects of individual factors but also of their interactions. By considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
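A minimal sketch of the mathematical combination step, assuming the standard root-sum-of-squares rule for independent relative uncertainty components (GUM-style); the component values are illustrative, not those estimated in the paper:

```python
import math

# Minimal sketch: combining independent relative uncertainty components by
# the root-sum-of-squares rule. Component values are illustrative.
components = {
    "microorganism type": 0.20,
    "pharmaceutical product": 0.15,
    "reading/interpretation": 0.18,
}

u_combined = math.sqrt(sum(u**2 for u in components.values()))
print(f"combined relative uncertainty = {u_combined:.1%}")  # ~30.8%
assert u_combined <= 0.35   # the paper reports totals not exceeding 35%
```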
Quantitative self-assembly prediction yields targeted nanomedicines
NASA Astrophysics Data System (ADS)
Shamay, Yosi; Shah, Janki; Işık, Mehtap; Mizrachi, Aviram; Leibold, Josef; Tschaharganeh, Darjus F.; Roxbury, Daniel; Budhathoki-Uprety, Januka; Nawaly, Karla; Sugarman, James L.; Baut, Emily; Neiman, Michelle R.; Dacek, Megan; Ganesh, Kripa S.; Johnson, Darren C.; Sridharan, Ramya; Chu, Karen L.; Rajasekhar, Vinagolu K.; Lowe, Scott W.; Chodera, John D.; Heller, Daniel A.
2018-02-01
Development of targeted nanoparticle drug carriers often requires complex synthetic schemes involving both supramolecular self-assembly and chemical modification. These processes are generally difficult to predict, execute, and control. We describe herein a targeted drug delivery system that is accurately and quantitatively predicted to self-assemble into nanoparticles based on the molecular structures of precursor molecules, which are the drugs themselves. The drugs assemble with the aid of sulfated indocyanines into particles with ultrahigh drug loadings of up to 90%. We devised quantitative structure-nanoparticle assembly prediction (QSNAP) models to identify and validate electrotopological molecular descriptors as highly predictive indicators of nano-assembly and nanoparticle size. The resulting nanoparticles selectively targeted kinase inhibitors to caveolin-1-expressing human colon cancer and autochthonous liver cancer models to yield striking therapeutic effects while avoiding pERK inhibition in healthy skin. This finding enables the computational design of nanomedicines based on quantitative models for drug payload selection.
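As an illustration of what a QSNAP-style model involves, a minimal sketch of a descriptor-based classifier predicting self-assembly; the descriptor matrix and labels are synthetic, and the model form (logistic regression) is an assumption for illustration, not the authors' published model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Minimal sketch (not the authors' trained model): predict whether a
# drug/dye pair self-assembles into nanoparticles from molecular descriptors
# (e.g., electrotopological-state indices). Features and labels are synthetic.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))            # 60 compounds x 5 descriptors
w_true = np.array([1.5, -0.8, 0.0, 0.6, 0.0])
y = (X @ w_true + 0.3 * rng.normal(size=60) > 0).astype(int)  # 1 = assembles

clf = LogisticRegression().fit(X, y)
print("training accuracy:", clf.score(X, y))
print("descriptor weights:", np.round(clf.coef_, 2))
```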
Quantitative multimodality imaging in cancer research and therapy.
Yankeelov, Thomas E; Abramson, Richard G; Quarles, C Chad
2014-11-01
Advances in hardware and software have enabled the realization of clinically feasible, quantitative multimodality imaging of tissue pathophysiology. Earlier efforts relating to multimodality imaging of cancer have focused on the integration of anatomical and functional characteristics, such as PET-CT and single-photon emission CT (SPECT-CT), whereas more-recent advances and applications have involved the integration of multiple quantitative, functional measurements (for example, multiple PET tracers, varied MRI contrast mechanisms, and PET-MRI), thereby providing a more-comprehensive characterization of the tumour phenotype. The enormous amount of complementary quantitative data generated by such studies is beginning to offer unique insights into opportunities to optimize care for individual patients. Although important technical optimization and improved biological interpretation of multimodality imaging findings are needed, this approach can already be applied informatively in clinical trials of cancer therapeutics using existing tools. These concepts are discussed herein.
Peng, Qiyu; Park, Hyung-Soon; Shah, Parag; Wilson, Nicole; Ren, Yupeng; Wu, Yi-Ning; Liu, Jie; Gaebler-Spira, Deborah J; Zhang, Li-Qun
2011-01-01
Spasticity and contracture are major sources of disability in people with neurological impairments and have been evaluated using various instruments: the Modified Ashworth Scale, the tendon reflex scale, the pendulum test, mechanical perturbations, and passive joint range of motion (ROM). These measures are generally either convenient to use in clinics but not quantitative, or quantitative but difficult to use conveniently in clinics. We have developed a manual spasticity evaluator (MSE) to evaluate spasticity/contracture quantitatively and conveniently, with ankle ROM and stiffness measured at a controlled low velocity, and joint resistance and Tardieu catch angle measured at several higher velocities. We found that the Tardieu catch angle was linearly related to velocity, indicating that increased resistance at higher velocities was felt at farther, stiffer positions and, thus, that the velocity dependence of spasticity may also be position-dependent. This finding indicates the need to control velocity in spasticity evaluation, which is achieved with the MSE. Quantitative measurements of spasticity, stiffness, and ROM can lead to more accurate characterization of pathological conditions and outcome evaluation of interventions, potentially contributing to better healthcare services for patients with neurological disorders such as cerebral palsy, spinal cord injury, traumatic brain injury, and stroke.
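The reported linear relation between catch angle and stretch velocity can be quantified with an ordinary least-squares fit. A minimal sketch on synthetic values, not MSE measurements:

```python
import numpy as np

# Minimal sketch: least-squares fit of Tardieu catch angle vs. stretch
# velocity, quantifying the linear relation reported in the abstract.
# Velocity and angle values are synthetic.
velocity = np.array([60.0, 90.0, 120.0, 150.0, 180.0])   # deg/s
catch_angle = np.array([12.0, 17.5, 22.0, 28.5, 33.0])   # deg

slope, intercept = np.polyfit(velocity, catch_angle, 1)
r = np.corrcoef(velocity, catch_angle)[0, 1]
print(f"catch_angle ~ {slope:.3f} * velocity + {intercept:.2f}  (r = {r:.3f})")
```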
Digital Immigrant Teacher Perceptions of an Extended Cyberhunt Strategy
ERIC Educational Resources Information Center
du Plessis, Andre; Webb, Paul
2012-01-01
This quantitative and qualitative interpretive exploratory case study investigates whether exposure to an Internet-based "Extended Cyberhunt" strategy enables teachers to attain a set of outcomes similar to Prensky's "Essential 21st Century Skills" and the "Critical Outcomes of the South African National Curriculum…
Comparison and quantitative verification of mapping algorithms for whole genome bisulfite sequencing
USDA-ARS?s Scientific Manuscript database
Coupling bisulfite conversion with next-generation sequencing (Bisulfite-seq) enables genome-wide measurement of DNA methylation, but poses unique challenges for mapping. However, despite a proliferation of Bisulfite-seq mapping tools, no systematic comparison of their genomic coverage and quantitat...
ERIC Educational Resources Information Center
Gluck, P.; Krakower, Zeev
2010-01-01
We present a unit comprising theory, simulation and experiment for a body oscillating on a vertical spring, in which the simultaneous use of a force probe and an ultrasonic range finder enables one to explore quantitatively and understand many aspects of simple and damped harmonic motions. (Contains 14 figures.)
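For the simulation component such a unit describes, a minimal sketch of a damped mass-spring oscillator m·x'' = −k·x − c·x', integrated with the semi-implicit Euler method; parameter values are illustrative, not taken from the unit:

```python
import numpy as np

# Minimal sketch: damped harmonic motion of a mass on a vertical spring,
# m*x'' = -k*x - c*x', integrated with semi-implicit (symplectic) Euler.
# Parameter values are illustrative.
m, k, c = 0.2, 10.0, 0.05          # mass (kg), stiffness (N/m), damping (N*s/m)
dt, steps = 0.001, 20000
x, v = 0.05, 0.0                   # initial displacement (m) and velocity (m/s)

trace = []
for _ in range(steps):
    a = (-k * x - c * v) / m       # acceleration from spring and damping forces
    v += a * dt                    # update velocity first (semi-implicit Euler)
    x += v * dt                    # then position, using the new velocity
    trace.append(x)

print(f"amplitude after {steps * dt:.0f} s: {np.max(np.abs(trace[-1000:])):.4f} m")
```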