Cardiac imaging: working towards fully-automated machine analysis & interpretation.
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-03-01
Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.
The value of health care information exchange and interoperability.
Walker, Jan; Pan, Eric; Johnston, Douglas; Adler-Milstein, Julia; Bates, David W; Middleton, Blackford
2005-01-01
In this paper we assess the value of electronic health care information exchange and interoperability (HIEI) between providers (hospitals and medical group practices) and independent laboratories, radiology centers, pharmacies, payers, public health departments, and other providers. We have created an HIEI taxonomy and combined published evidence with expert opinion in a cost-benefit model. Fully standardized HIEI could yield a net value of $77.8 billion per year once fully implemented. Nonstandardized HIEI offers smaller positive financial returns. The clinical impact of HIEI for which quantitative estimates cannot yet be made would likely add further value. A compelling business case exists for national implementation of fully standardized HIEI.
Villa, C A; Finlayson, S; Limpus, C; Gaus, C
2015-04-15
Biomonitoring of blood is commonly used to identify and quantify occupational or environmental exposure to chemical contaminants. Increasingly, this technique has been applied to wildlife contaminant monitoring, including for green turtles, allowing for the non-lethal evaluation of chemical exposure in their nearshore environment. The sources, composition, bioavailability and toxicity of metals in the marine environment are, however, often unknown and influenced by numerous biotic and abiotic factors. These factors can vary considerably across time and space, making the selection of the most informative elements for biomonitoring challenging. This study aimed to validate an ICP-MS multi-element screening method for green turtle blood in order to identify and facilitate prioritisation of target metals for subsequent fully quantitative analysis. Multi-element screening provided semiquantitative results for 70 elements, 28 of which were also determined through fully quantitative analysis. Of the 28 comparable elements, 23 of the semiquantitative results had an accuracy between 67% and 112% relative to the fully quantified values. In lieu of any available turtle certified reference materials (CRMs), we evaluated the use of human blood CRMs as a matrix surrogate for quality control, and compared two commonly used sample preparation methods for matrix related effects. The results demonstrate that human blood provides an appropriate matrix for use as a quality control material in the fully quantitative analysis of metals in turtle blood. An example for the application of this screening method is provided by comparing screening results from blood of green turtles foraging in an urban and rural region in Queensland, Australia. Potential targets for future metal biomonitoring in these regions were identified by this approach.
Image segmentation evaluation for very-large datasets
NASA Astrophysics Data System (ADS)
Reeves, Anthony P.; Liu, Shuang; Xie, Yiting
2016-03-01
With the advent of modern machine learning methods and fully automated image analysis there is a need for very large image datasets having documented segmentations for both computer algorithm training and evaluation. Current approaches of visual inspection and manual markings do not scale well to big data. We present a new approach that depends on fully automated algorithm outcomes for segmentation documentation, requires no manual marking, and provides quantitative evaluation for computer algorithms. The documentation of new image segmentations and new algorithm outcomes is achieved by visual inspection. The burden of visual inspection on large datasets is minimized by (a) customized visualizations for rapid review and (b) reducing the number of cases to be reviewed through analysis of quantitative segmentation evaluation. This method has been applied to a dataset of 7,440 whole-lung CT images for 6 different segmentation algorithms designed to fully automatically facilitate the measurement of a number of very important quantitative image biomarkers. The results indicate that we could achieve 93% to 99% successful segmentation for these algorithms on this relatively large image database. The presented evaluation method may be scaled to much larger image databases.
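The review-reduction idea described above lends itself to a simple implementation: score each new algorithm outcome against the documented segmentation and send only low-scoring cases for visual inspection. A minimal Python sketch follows; the Dice measure and the 0.95 review threshold are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def cases_needing_review(new_masks, reference_masks, threshold=0.95):
    """Flag only cases whose overlap with the documented segmentation falls
    below the threshold; the rest are accepted without visual inspection."""
    return [i for i, (m, r) in enumerate(zip(new_masks, reference_masks))
            if dice(m, r) < threshold]
```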
Asymptotic analysis of discrete schemes for non-equilibrium radiation diffusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Xia, E-mail: cui_xia@iapcm.ac.cn; Yuan, Guang-wei; Shen, Zhi-jun
Motivated by providing well-behaved fully discrete schemes in practice, this paper extends the asymptotic analysis on time integration methods for non-equilibrium radiation diffusion in [2] to space discretizations. Therein, studies were carried out on a two-temperature model with Larsen's flux-limited diffusion operator, and both the implicitly balanced (IB) and linearly implicit (LI) methods were shown to be asymptotic-preserving. In this paper, we focus on asymptotic analysis for space discrete schemes in dimensions one and two. First, in construction of the schemes, in contrast to traditional first-order approximations, asymmetric second-order accurate spatial approximations are devised for flux-limiters on the boundary, and discrete schemes with second-order accuracy on the global spatial domain are acquired consequently. Then, by employing formal asymptotic analysis, the first-order asymptotic-preserving property for these schemes, and furthermore for the fully discrete schemes, is shown. Finally, with the help of manufactured solutions, numerical tests are performed, which demonstrate quantitatively that the fully discrete schemes with IB time evolution indeed have the accuracy and asymptotic convergence the theory predicts, and hence are well qualified for both non-equilibrium and equilibrium radiation diffusion.
Highlights:
• Provide AP fully discrete schemes for non-equilibrium radiation diffusion.
• Propose second-order accurate schemes by an asymmetric approach for the boundary flux-limiter.
• Show first-order AP property of spatially and fully discrete schemes with IB evolution.
• Devise subtle artificial solutions; verify accuracy and AP property quantitatively.
• Ideas can be generalized to 3-dimensional problems and higher order implicit schemes.
Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E
2014-01-01
This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease, and comparisons to semiquantitative and qualitative methods are limited. Dual bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively). Fully quantitative stress perfusion CMR has high diagnostic accuracy for detecting obstructive coronary artery disease. QP outperforms semiquantitative measures of perfusion and qualitative methods that incorporate a combination of cine, perfusion, and late gadolinium enhancement imaging. These findings suggest a potential clinical role for quantitative stress perfusion CMR.
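The reported optimum QP threshold (endocardial flow below 50% of mean epicardial flow) amounts to a one-line decision rule. A hedged sketch follows, with input names, array shapes, and units assumed for illustration:

```python
import numpy as np

def abnormal_segments(endo_flow, epi_flow, ratio=0.5):
    """Flag myocardial segments as abnormal when endocardial flow drops
    below `ratio` times the mean epicardial flow (the abstract's optimum
    QP threshold from ROC analysis was 50%).

    endo_flow, epi_flow: 1-D arrays of per-segment perfusion estimates
    (e.g., ml/min/g); names and shapes are assumptions for illustration.
    """
    endo_flow = np.asarray(endo_flow, dtype=float)
    return endo_flow < ratio * np.mean(epi_flow)
```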
NASA Technical Reports Server (NTRS)
Crespi, H. L.; Harkness, L.; Katz, J. J.; Norman, G.; Saur, W.
1969-01-01
Method allows qualitative and quantitative analysis of mixtures of partially deuterated compounds. Nuclear magnetic resonance spectroscopy determines location and amount of deuterium in organic compounds but not fully deuterated compounds. Mass spectroscopy can detect fully deuterated species but not the location.
Brody, Sarah; Anilkumar, Thapasimuthu; Liliensiek, Sara; Last, Julie A; Murphy, Christopher J; Pandit, Abhay
2006-02-01
A fully effective prosthetic heart valve has not yet been developed. A successful tissue-engineered valve prosthetic must contain a scaffold that fully supports valve endothelial cell function. Recently, topographic features of scaffolds have been shown to influence the behavior of a variety of cell types and should be considered in rational scaffold design and fabrication. The basement membrane of the aortic valve endothelium provides important parameters for tissue engineering scaffold design. This study presents a quantitative characterization of the topographic features of the native aortic valve endothelial basement membrane; topographical features were measured, and quantitative data were generated using scanning electron microscopy (SEM), atomic force microscopy (AFM), transmission electron microscopy (TEM), and light microscopy. Optimal conditions for basement membrane isolation were established. Histological, immunohistochemical, and TEM analyses following decellularization confirmed basement membrane integrity. SEM and AFM photomicrographs of isolated basement membrane were captured and quantitatively analyzed. The basement membrane of the aortic valve has a rich, felt-like, 3-D nanoscale topography, consisting of pores, fibers, and elevations. All features measured were in the sub-100 nm range. No statistical difference was found between the fibrosal and ventricular surfaces of the cusp. These data provide a rational starting point for the design of extracellular scaffolds with nanoscale topographic features that mimic those found in the native aortic heart valve basement membrane.
2015-10-20
From 2000 to 2011, the U.S. Geological Survey conducted 139 quantitative assessments of continuous (unconventional) oil and gas accumulations within the United States. This report documents those assessments more fully than previously done by providing detailed documentation of both the assessment input and output. This report also compiles the data into spreadsheet tables that can be more readily used to provide analogs for future assessments, especially for hypothetical continuous accumulations.
Wang, Zhenyu; Li, Shiming; Ferguson, Stephen; Goodnow, Robert; Ho, Chi-Tang
2008-01-01
Polymethoxyflavones (PMFs), which exist exclusively in the citrus genus, have biological activities including anti-inflammatory, anticarcinogenic, and antiatherogenic properties. A validated RPLC method was developed for quantitative analysis of six major PMFs, namely nobiletin, tangeretin, sinensetin, 5,6,7,4'-tetramethoxyflavone, 3,5,6,7,3',4'-hexamethoxyflavone, and 3,5,6,7,8,3',4'-heptamethoxyflavone. The polar embedded LC stationary phase was able to fully resolve the six analogues. The developed method was fully validated in terms of linearity, accuracy, precision, sensitivity, and system suitability. The LOD of the method was calculated as 0.15 μg/mL and the recovery rate was between 97.0 and 105.1%. This analytical method was successfully applied to quantify the individual PMFs in four commercially available citrus peel extracts (CPEs). Each extract shows significant differences in PMF composition and concentration. This method may provide a simple, rapid, and reliable tool to help reveal the correlation between the bioactivity of the PMF extracts and the individual PMF content.
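For readers wanting to reproduce the validation arithmetic, a brief Python sketch follows. The recovery formula matches the abstract's usage; the 3.3·σ/slope LOD rule is the common ICH convention and only an assumption here, since the abstract does not state how its 0.15 μg/mL figure was derived:

```python
import numpy as np

def recovery_percent(measured, spiked):
    """Recovery rate: measured amount as a percentage of the spiked amount."""
    return 100.0 * measured / spiked

def lod_from_calibration(conc, response):
    """LOD by the common ICH convention 3.3 * sigma / slope, where sigma is
    the residual standard deviation of a linear calibration fit (an assumed
    convention, not stated in the abstract)."""
    conc = np.asarray(conc, dtype=float)
    response = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(conc, response, 1)
    sigma = np.std(response - (slope * conc + intercept), ddof=2)
    return 3.3 * sigma / slope
```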
Quantitative, spectrally-resolved intraoperative fluorescence imaging
Valdés, Pablo A.; Leblond, Frederic; Jacobs, Valerie L.; Wilson, Brian C.; Paulsen, Keith D.; Roberts, David W.
2012-01-01
Intraoperative visual fluorescence imaging (vFI) has emerged as a promising aid to surgical guidance, but does not fully exploit the potential of the fluorescent agents that are currently available. Here, we introduce a quantitative fluorescence imaging (qFI) approach that converts spectrally-resolved data into images of absolute fluorophore concentration pixel-by-pixel across the surgical field of view (FOV). The resulting estimates are linear, accurate, and precise relative to true values, and spectral decomposition of multiple fluorophores is also achieved. Experiments with protoporphyrin IX in a glioma rodent model demonstrate in vivo quantitative and spectrally-resolved fluorescence imaging of infiltrating tumor margins for the first time. Moreover, we present images from human surgery which detect residual tumor not evident with state-of-the-art vFI. The wide-field qFI technique has broad implications for intraoperative surgical guidance because it provides near real-time quantitative assessment of multiple fluorescent biomarkers across the operative field. PMID:23152935
Liu, Fang; Zhou, Zhaoye; Jang, Hyungseok; Samsonov, Alexey; Zhao, Gengyan; Kijowski, Richard
2018-04-01
To describe and evaluate a new fully automated musculoskeletal tissue segmentation method using deep convolutional neural network (CNN) and three-dimensional (3D) simplex deformable modeling to improve the accuracy and efficiency of cartilage and bone segmentation within the knee joint. A fully automated segmentation pipeline was built by combining a semantic segmentation CNN and 3D simplex deformable modeling. A CNN technique called SegNet was applied as the core of the segmentation method to perform high resolution pixel-wise multi-class tissue classification. The 3D simplex deformable modeling refined the output from SegNet to preserve the overall shape and maintain a desirable smooth surface for musculoskeletal structure. The fully automated segmentation method was tested using a publicly available knee image data set to compare with currently used state-of-the-art segmentation methods. The fully automated method was also evaluated on two different data sets, which include morphological and quantitative MR images with different tissue contrasts. The proposed fully automated segmentation method provided good segmentation performance with segmentation accuracy superior to most state-of-the-art methods in the publicly available knee image data set. The method also demonstrated versatile segmentation performance on both morphological and quantitative musculoskeletal MR images with different tissue contrasts and spatial resolutions. The study demonstrates that the combined CNN and 3D deformable modeling approach is useful for performing rapid and accurate cartilage and bone segmentation within the knee joint. The CNN has promising potential applications in musculoskeletal imaging. Magn Reson Med 79:2379-2391, 2018.
Lipid Informed Quantitation and Identification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin Crowell, PNNL
2014-07-21
LIQUID (Lipid Informed Quantitation and Identification) is a software program that has been developed to enable users to conduct both informed and high-throughput global liquid chromatography-tandem mass spectrometry (LC-MS/MS)-based lipidomics analysis. This newly designed desktop application can quickly identify and quantify lipids from LC-MS/MS datasets while providing a friendly graphical user interface for users to fully explore the data. Informed data analysis simply involves the user specifying an electrospray ionization mode, lipid common name (i.e. PE(16:0/18:2)), and associated charge carrier. A stemplot of the isotopic profile and a line plot of the extracted ion chromatogram are also provided to show the MS-level evidence of the identified lipid. In addition to plots, other information such as intensity, mass measurement error, and elution time are also provided. Typically, a global analysis for 15,000 lipid targets…
Orringer, Jeffrey S; Sachs, Dana L; Shao, Yuan; Hammerberg, Craig; Cui, Yilei; Voorhees, John J; Fisher, Gary J
2012-10-01
Fractionated ablative laser resurfacing has become a widely used treatment modality. Its clinical results are often found to approach those of traditional fully ablative laser resurfacing. To directly compare the molecular changes that result from fractionated and fully ablative carbon dioxide (CO₂) laser resurfacing in photodamaged human skin. Photodamaged skin of 34 adult volunteers was focally treated at distinct sites with a fully ablative CO₂ laser and a fractionated CO₂ laser. Serial skin samples were obtained at baseline and several time points after treatment. Real-time reverse transcriptase polymerase chain reaction technology and immunohistochemistry were used to quantify molecular responses to each type of laser treatment. Fully ablative and fractionated CO₂ laser resurfacing induced significant dermal remodeling and collagen induction. After a single treatment, fractionated ablative laser resurfacing resulted in collagen induction that was approximately 40% to 50% as pronounced as that induced by fully ablative laser resurfacing. The fundamental cutaneous responses that result from fully ablative and fractionated carbon dioxide laser resurfacing are similar but differ in magnitude and duration, with the fully ablative procedure inducing relatively greater changes including more pronounced collagen induction. However, the molecular data reported here provide substantial support for fractionated ablative resurfacing as an effective treatment modality for improving skin texture.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa; ...
2016-06-22
Currently, absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol-dosed mouse brain (10-μm-thick), kidney (10-μm-thick) and liver (8-, 10-, 16- and 24-μm-thick) were obtained. Absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. These values were used to determine propranolol extraction efficiency from the tissues with the droplet-based surface sampling approach. Results: Extraction efficiency of propranolol using 10-μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45% and 63%. Extraction efficiency decreased from ~65% to ~36% with liver thickness increasing from 8 μm to 24 μm. Randomly selecting half of the samples as standards, precision and accuracy of propranolol concentrations obtained for the other half of samples as quality control metrics were determined. Resulting precision (±15%) and accuracy (±3%) values, respectively, were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete using the former method, and that it depended on the organ and tissue thickness. However, once extraction efficiency was determined and applied, the droplet-based approach provided the required quantitation accuracy and precision for assay validations. Furthermore, this means that once the extraction efficiency was calibrated for a given tissue type and drug, the droplet-based approach provides a non-labor intensive and high-throughput means to acquire spatially resolved quantitative analysis of multiple samples of the same type.
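The calibration-and-correction workflow described in the conclusion is simple arithmetic. A minimal sketch, with function names assumed for illustration:

```python
def extraction_efficiency(droplet_amount, bulk_amount):
    """Extraction efficiency of droplet-based surface sampling, estimated
    against bulk extraction of a punch from an adjacent serial section
    (the abstract reports ~45-63% for 10-um-thick tissues)."""
    return droplet_amount / bulk_amount

def corrected_concentration(droplet_measured, efficiency):
    """Once efficiency is calibrated for a given tissue type and drug,
    droplet-sampling results can be corrected to absolute concentrations."""
    return droplet_measured / efficiency
```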
The Steep Nekhoroshev's Theorem
NASA Astrophysics Data System (ADS)
Guzzo, M.; Chierchia, L.; Benettin, G.
2016-03-01
Revising Nekhoroshev's geometry of resonances, we provide a fully constructive and quantitative proof of Nekhoroshev's theorem for steep Hamiltonian systems, proving, in particular, that the exponential stability exponent can be taken to be $1/(2n\alpha_1\cdots\alpha_{n-2})$ (the $\alpha_i$ being Nekhoroshev's steepness indices and $n \geq 3$ the number of degrees of freedom). On the basis of a heuristic argument, we conjecture that the new stability exponent is optimal.
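For context, Nekhoroshev-type theorems bound the drift of the action variables over exponentially long times for near-integrable Hamiltonians $H(I,\varphi) = h(I) + \varepsilon f(I,\varphi)$. A schematic form of the estimate follows, with the abstract's exponent inserted; the general shape and the remaining constants are the standard textbook form, not quoted from this record:

```latex
% Schematic Nekhoroshev estimate (standard form); only the exponent a
% is taken from the abstract, the constants C, b, c, T_0 are generic.
\[
  \|I(t) - I(0)\| \le C\,\varepsilon^{b}
  \quad \text{for all} \quad
  |t| \le T_0 \exp\!\bigl(c\,\varepsilon^{-a}\bigr),
  \qquad
  a = \frac{1}{2n\,\alpha_1 \cdots \alpha_{n-2}} .
\]
```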
NASA Astrophysics Data System (ADS)
Durst, Phillip J.; Gray, Wendell; Trentini, Michael
2013-05-01
A simple, quantitative measure for encapsulating the autonomous capabilities of unmanned systems (UMS) has yet to be established. Current models for measuring a UMS's autonomy level require extensive, operational level testing, and provide a means for assessing the autonomy level for a specific mission/task and operational environment. A more elegant technique for quantifying autonomy using component level testing of the robot platform alone, outside of mission and environment contexts, is desirable. Using a high level framework for UMS architectures, such a model for determining a level of autonomy has been developed. The model uses a combination of developmental and component level testing for each aspect of the UMS architecture to define a non-contextual autonomous potential (NCAP). The NCAP provides an autonomy level, ranging from fully non-autonomous to fully autonomous, in the form of a single numeric parameter describing the UMS's performance capabilities when operating at that level of autonomy.
Anguera, M Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2018-01-01
Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.
Some selected quantitative methods of thermal image analysis in Matlab.
Koprowski, Robert
2016-05-01
The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.)
Quantitative fluorescence tomography using a trimodality system: in vivo validation
Lin, Yuting; Barber, William C.; Iwanczyk, Jan S.; Roeck, Werner W.; Nalcioglu, Orhan; Gulsen, Gultekin
2010-01-01
A fully integrated trimodality fluorescence, diffuse optical, and x-ray computed tomography (FT/DOT/XCT) system for small animal imaging is reported in this work. The main purpose of this system is to obtain quantitatively accurate fluorescence concentration images using a multimodality approach. XCT offers anatomical information, while DOT provides the necessary background optical property map to improve FT image accuracy. The quantitative accuracy of this trimodality system is demonstrated in vivo. In particular, we show that a 2-mm-diam fluorescence inclusion located 8 mm deep in a nude mouse can only be localized when functional a priori information from DOT is available. However, the error in the recovered fluorophore concentration is nearly 87%. On the other hand, the fluorophore concentration can be accurately recovered within 2% error when both DOT functional and XCT structural a priori information are utilized together to guide and constrain the FT reconstruction algorithm. PMID:20799770
Support for All in the UK Work Programme? Differential Payments, Same Old Problem
Rees, James; Whitworth, Adam; Carter, Elle
2014-01-01
The UK has been a high profile policy innovator in welfare-to-work provision which has led in the Coalition government's Work Programme to a fully outsourced, ‘black box’ model with payments based overwhelmingly on job outcome results. A perennial fear in such programmes is providers' incentives to ‘cream’ and ‘park’ claimants, and the Department for Work and Pensions has sought to mitigate such provider behaviours through Work Programme design, particularly via the use of claimant groups and differential pricing. In this article, we draw on a qualitative study of providers in the programme alongside quantitative analysis of published performance data to explore evidence around creaming and parking. The combination of the quantitative and qualitative evidence suggest that creaming and parking are widespread, seem systematically embedded within the Work Programme, and are driven by a combination of intense cost-pressures and extremely ambitious performance targets alongside overly diverse claimant groups and inadequately calibrated differentiated payment levels. PMID:25411516
Fully automatic multi-atlas segmentation of CTA for partial volume correction in cardiac SPECT/CT
NASA Astrophysics Data System (ADS)
Liu, Qingyi; Mohy-ud-Din, Hassan; Boutagy, Nabil E.; Jiang, Mingyan; Ren, Silin; Stendahl, John C.; Sinusas, Albert J.; Liu, Chi
2017-05-01
Anatomical-based partial volume correction (PVC) has been shown to improve image quality and quantitative accuracy in cardiac SPECT/CT. However, this method requires manual segmentation of various organs from contrast-enhanced computed tomography angiography (CTA) data. In order to achieve fully automatic CTA segmentation for clinical translation, we investigated the most common multi-atlas segmentation methods. We also modified the multi-atlas segmentation method by introducing a novel label fusion algorithm for multiple organ segmentation to eliminate overlap and gap voxels. To evaluate our proposed automatic segmentation, eight canine 99mTc-labeled red blood cell SPECT/CT datasets that incorporated PVC were analyzed, using the leave-one-out approach. The Dice similarity coefficient of each organ was computed. Compared to the conventional label fusion method, our proposed label fusion method effectively eliminated gaps and overlaps and improved the CTA segmentation accuracy. The anatomical-based PVC of cardiac SPECT images with automatic multi-atlas segmentation provided consistent image quality and quantitative estimation of intramyocardial blood volume, as compared to those derived using manual segmentation. In conclusion, our proposed automatic multi-atlas segmentation method of CTAs is feasible, practical, and facilitates anatomical-based PVC of cardiac SPECT/CT images.
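To make the overlap/gap issue concrete: fusing each organ's label independently can assign a voxel to two organs (overlap) or to none (gap), whereas a joint vote over all labels assigns exactly one label per voxel. The sketch below is a generic majority-vote fusion for illustration only, not the paper's novel fusion algorithm:

```python
import numpy as np

def fuse_labels(atlas_labels: np.ndarray) -> np.ndarray:
    """Joint multi-label majority vote across registered atlases.

    atlas_labels: (n_atlases, *volume_shape) integer array, 0 = background,
    1..K = organs. Voting over all labels at once gives each voxel exactly
    one label, so per-organ overlaps and gaps cannot arise (a generic
    illustration; shapes and conventions are assumptions).
    """
    n_labels = atlas_labels.max() + 1
    # votes[k] counts how many atlases assign label k to each voxel
    votes = np.stack([(atlas_labels == k).sum(axis=0) for k in range(n_labels)])
    return votes.argmax(axis=0)
```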
Thompson, Katerina V; Nelson, Kären C; Marbach-Ad, Gili; Keller, Michael; Fagan, William F
2010-01-01
There is widespread agreement within the scientific and education communities that undergraduate biology curricula fall short in providing students with the quantitative and interdisciplinary problem-solving skills they need to obtain a deep understanding of biological phenomena and be prepared fully to contribute to future scientific inquiry. MathBench Biology Modules were designed to address these needs through a series of interactive, Web-based modules that can be used to supplement existing course content across the biological sciences curriculum. The effect of the modules was assessed in an introductory biology course at the University of Maryland. Over the course of the semester, students showed significant increases in quantitative skills that were independent of previous math course work. Students also showed increased comfort with solving quantitative problems, whether or not they ultimately arrived at the correct answer. A survey of spring 2009 graduates indicated that those who had experienced MathBench in their course work had a greater appreciation for the role of mathematics in modern biology than those who had not used MathBench. MathBench modules allow students from diverse educational backgrounds to hone their quantitative skills, preparing them for more complex mathematical approaches in upper-division courses.
The Impact of Situation-Based Learning to Students’ Quantitative Literacy
NASA Astrophysics Data System (ADS)
Latifah, T.; Cahya, E.; Suhendra
2017-09-01
Nowadays, the use of quantities can be seen almost everywhere, and quantitative thinking, such as quantitative reasoning and quantitative literacy, has become increasingly important in the context of daily life. However, many people today are still not fully equipped for such thinking, and a lot of individuals lack the quantitative skills needed to perform well in today's society. In response to this issue, this research aims to improve students' quantitative literacy in junior high school. Qualitative analysis of written student work and video observations during the experiment reveals that situation-based learning affects students' quantitative literacy.
Chen, Bin; Zhao, Kai; Li, Bo; Cai, Wenchao; Wang, Xiaoying; Zhang, Jue; Fang, Jing
2015-10-01
To demonstrate the feasibility of the improved temporal resolution achieved by using a compressed sensing (CS)-combined imaging sequence in dynamic contrast-enhanced MRI (DCE-MRI) of the kidney, and investigate its quantitative effects on renal perfusion measurements. Ten rabbits were included in the accelerated scans with a CS-combined 3D pulse sequence. To evaluate the image quality, the signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were compared between the proposed CS strategy and the conventional full sampling method. Moreover, renal perfusion was estimated by using the separable compartmental model in both CS simulation and realistic CS acquisitions. The CS method showed DCE-MRI images with improved temporal resolution and acceptable image contrast, while presenting significantly higher SNR than the fully sampled images (p<.01) at 2-, 3- and 4-X acceleration. In quantitative measurements, renal perfusion results were in good agreement with the fully sampled one (concordance correlation coefficient=0.95, 0.91, 0.88) at 2-, 3- and 4-X acceleration in CS simulation. Moreover, in realistic acquisitions, the estimated perfusion by the separable compartmental model exhibited no significant differences (p>.05) between each CS-accelerated acquisition and the full sampling method. The CS-combined 3D sequence could improve the temporal resolution for DCE-MRI in kidney while yielding diagnostically acceptable image quality, and it could provide effective measurements of renal perfusion.
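The SNR and CNR comparisons above follow standard region-of-interest definitions. A brief sketch under that assumption (the abstract does not specify its exact ROI convention):

```python
import numpy as np

def snr(signal_roi, noise_roi):
    """SNR as mean signal over the standard deviation of a background
    (noise) region; a common ROI-based definition, assumed here."""
    return np.mean(signal_roi) / np.std(noise_roi)

def cnr(roi_a, roi_b, noise_roi):
    """CNR as the difference of two tissue ROI means over background noise."""
    return (np.mean(roi_a) - np.mean(roi_b)) / np.std(noise_roi)
```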
3D Slicer as an Image Computing Platform for the Quantitative Imaging Network
Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron
2012-01-01
Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of the clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to reproducibility and efficiency of the quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of the new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, and providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future directions that can further facilitate development and validation of imaging biomarkers using 3D Slicer. PMID:22770690
Tunnel transport and interlayer excitons in bilayer fractional quantum Hall systems
NASA Astrophysics Data System (ADS)
Zhang, Yuhe; Jain, J. K.; Eisenstein, J. P.
2017-05-01
In a bilayer system consisting of a composite-fermion (CF) Fermi sea in each layer, the tunnel current is exponentially suppressed at zero bias, followed by a strong peak at a finite-bias voltage Vmax. This behavior, which is qualitatively different from that observed for the electron Fermi sea, provides fundamental insight into the strongly correlated non-Fermi-liquid nature of the CF Fermi sea and, in particular, offers a window into the short-distance high-energy physics of this highly nontrivial state. We identify the exciton responsible for the peak current and provide a quantitative account of the value of Vmax. The excitonic attraction is shown to be quantitatively significant, and its variation accounts for the increase of Vmax with the application of an in-plane magnetic field. We also estimate the critical Zeeman energy where transition occurs from a fully spin-polarized composite-fermion Fermi sea to a partially spin-polarized one, carefully incorporating corrections due to finite width and Landau level mixing, and find it to be in satisfactory agreement with the Zeeman energy where a qualitative change has been observed for the onset bias voltage [J. P. Eisenstein et al., Phys. Rev. B 94, 125409 (2016), 10.1103/PhysRevB.94.125409]. For fractional quantum Hall states, we predict a substantial discontinuous jump in Vmax when the system undergoes a transition from a fully spin-polarized state to a spin singlet or a partially spin-polarized state.
Knowles, D.B.
1955-01-01
The objective of the Ground Water Branch is to evaluate the occurrence, availability, and quality of ground water. The science of ground-water hydrology is applied toward attaining that goal. Although many ground-water investigations are of a qualitative nature, quantitative studies are necessarily an integral component of the complete evaluation of occurrence and availability. The worth of an aquifer as a fully developed source of water depends largely on two inherent characteristics: its ability to store, and its ability to transmit, water. Furthermore, quantitative knowledge of these characteristics facilitates measurement of hydrologic entities such as recharge, leakage, evapotranspiration, etc. It is recognized that these two characteristics, referred to as the coefficients of storage and transmissibility, generally provide the very foundation on which quantitative studies are constructed. Within the science of ground-water hydrology, ground-water hydraulics methods are applied to determine these constants from field data.
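As a reminder of the standard definitions behind these two characteristics (textbook hydrogeology, not quoted from the report), for an aquifer of saturated thickness $b$:

```latex
% Transmissibility (transmissivity) and storage coefficient of a
% confined aquifer; K = hydraulic conductivity, S_s = specific storage.
\[
  T = K\,b, \qquad S = S_s\,b .
\]
```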
Simulation of FRET dyes allows quantitative comparison against experimental data
NASA Astrophysics Data System (ADS)
Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander
2018-03-01
Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other leading to new insights into biomolecular dynamics and function.
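Converting simulated dye-to-dye distances into FRET efficiencies for comparison with experiment rests on the standard Förster relation. A minimal sketch; the distances and Förster radius below are illustrative values, not data from the paper:

```python
import numpy as np

def fret_efficiency(r, r0):
    """Foerster relation: transfer efficiency for donor-acceptor distance r
    and Foerster radius r0 (same units): E = 1 / (1 + (r / r0)**6)."""
    return 1.0 / (1.0 + (np.asarray(r, dtype=float) / r0) ** 6)

# Mean efficiency over a simulated trajectory of inter-dye distances,
# comparable to an experimentally determined value (illustrative numbers):
distances_nm = np.array([4.8, 5.2, 5.0, 6.1, 5.5])
mean_E = fret_efficiency(distances_nm, r0=5.4).mean()
```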
NASA Astrophysics Data System (ADS)
Sharma, Archie; Corona, Enrique; Mitra, Sunanda; Nutter, Brian S.
2006-03-01
Early detection of structural damage to the optic nerve head (ONH) is critical in diagnosis of glaucoma, because such glaucomatous damage precedes clinically identifiable visual loss. Early detection of glaucoma can prevent progression of the disease and consequent loss of vision. Traditional early detection techniques involve observing changes in the ONH through an ophthalmoscope. Stereo fundus photography is also routinely used to detect subtle changes in the ONH. However, clinical evaluation of stereo fundus photographs suffers from inter- and intra-subject variability. Even the Heidelberg Retina Tomograph (HRT) has not been found to be sufficiently sensitive for early detection. A semi-automated algorithm for quantitative representation of the optic disc and cup contours by computing accumulated disparities in the disc and cup regions from stereo fundus image pairs has already been developed using advanced digital image analysis methodologies. A 3-D visualization of the disc and cup is achieved assuming camera geometry. High correlation between computer-generated and manually segmented cup-to-disc ratios in a longitudinal study involving 159 stereo fundus image pairs has already been demonstrated. However, the clinical usefulness of the proposed technique can only be tested by a fully automated algorithm. In this paper, we present a fully automated algorithm for segmentation of optic cup and disc contours from corresponding stereo disparity information. Because this technique does not involve human intervention, it eliminates subjective variability encountered in currently used clinical methods and provides ophthalmologists with a cost-effective and quantitative method for detection of ONH structural damage for early detection of glaucoma.
The ease and rapidity of quantitative DNA sequence detection by real-time PCR instruments promises to make their use increasingly common for the microbial analysis of many different types of environmental samples. To fully exploit the capabilities of these instruments, correspondin...
Exchange-Dominated Pure Spin Current Transport in Alq3 Molecules.
Jiang, S W; Liu, S; Wang, P; Luan, Z Z; Tao, X D; Ding, H F; Wu, D
2015-08-21
We address the controversy over the spin transport mechanism in Alq3 utilizing spin pumping in the Y3Fe5O12/Alq3/Pd system. An unusual angular dependence of the inverse spin Hall effect is found. It, however, disappears when the microwave magnetic field is fully in the sample plane, excluding the presence of the Hanle effect. Together with the quantitative temperature-dependent measurements, these results provide compelling evidence that the pure spin current transport in Alq3 is dominated by the exchange-mediated mechanism.
Random diffusion and leverage effect in financial markets.
Perelló, Josep; Masoliver, Jaume
2003-03-01
We prove that Brownian market models with random diffusion coefficients provide an exact measure of the leverage effect [J-P. Bouchaud et al., Phys. Rev. Lett. 87, 228701 (2001)]. This empirical fact asserts that past returns are anticorrelated with the future diffusion coefficient. Several models with random diffusion have been suggested, but without a quantitative study of the leverage effect. Our analysis lets us fully estimate all parameters involved and allows a deeper study of correlated random diffusion models that may have practical implications for many aspects of financial markets.
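A common empirical definition of the leverage correlation, following Bouchaud et al., is L(τ) = ⟨r(t) r(t+τ)²⟩ / ⟨r²⟩²; the normalization used below is the usual convention and an assumption here. It can be estimated in a few lines:

```python
import numpy as np

def leverage_correlation(r, max_lag):
    """Empirical leverage correlation L(tau) = <r(t) * r(t+tau)^2> / <r^2>^2.

    Negative values at positive lags indicate the leverage effect:
    past returns anticorrelated with future volatility (squared returns)."""
    r = np.asarray(r, dtype=float)
    z = np.mean(r**2) ** 2  # normalization by the squared variance
    lags = np.arange(1, max_lag + 1)
    L = np.array([np.mean(r[:-tau] * r[tau:]**2) for tau in lags]) / z
    return lags, L

# Example with synthetic i.i.d. returns (no leverage expected here):
rng = np.random.default_rng(0)
lags, L = leverage_correlation(rng.standard_normal(10_000), max_lag=20)
```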
NASA Astrophysics Data System (ADS)
Georgiou, Mike F.; Sfakianakis, George N.; Johnson, Gary; Douligeris, Christos; Scandar, Silvia; Eisler, E.; Binkley, B.
1994-05-01
In an effort to improve patient care while considering cost-effectiveness, we developed a Picture Archiving and Communication System (PACS), which combines imaging cameras, computers and other peripheral equipment from multiple nuclear medicine vendors. The PACS provides fully-digital clinical operation which includes acquisition and automatic organization of patient data, distribution of the data to all networked units inside the department and other remote locations, digital analysis and quantitation of images, digital diagnostic reading of image studies and permanent data archival with the ability for fast retrieval. The PACS enabled us to significantly reduce the amount of film used, and we are currently proceeding with implementing a film-less laboratory. Hard copies are produced on paper or transparent sheets for non-digitally connected parts of the hospital. The PACS provides fully-digital operation which is faster, more reliable, better organized and managed, and overall more efficient than a conventional film-based operation. In this paper, the integration of the various PACS components from multiple vendors is reviewed, and the impact of PACS, with its advantages and limitations, on our clinical operation is analyzed.
Deep machine learning provides state-of-the-art performance in image-based plant phenotyping.
Pound, Michael P; Atkinson, Jonathan A; Townsend, Alexandra J; Wilson, Michael H; Griffiths, Marcus; Jackson, Aaron S; Bulat, Adrian; Tzimiropoulos, Georgios; Wells, Darren M; Murchie, Erik H; Pridmore, Tony P; French, Andrew P
2017-10-01
In plant phenotyping, it has become important to be able to measure many features on large image sets in order to aid genetic discovery. The size of the datasets, now often captured robotically, often precludes manual inspection, hence the motivation for finding a fully automated approach. Deep learning is an emerging field that promises unparalleled results on many data analysis problems. Building on artificial neural networks, deep approaches have many more hidden layers in the network, and hence have greater discriminative and predictive power. We demonstrate the use of such approaches as part of a plant phenotyping pipeline. We show the success offered by such techniques when applied to the challenging problem of image-based plant phenotyping and demonstrate state-of-the-art results (>97% accuracy) for root and shoot feature identification and localization. We use fully automated trait identification using deep learning to identify quantitative trait loci in root architecture datasets. The majority (12 out of 14) of manually identified quantitative trait loci were also discovered using our automated approach based on deep learning detection to locate plant features. We have shown deep learning-based phenotyping to have very good detection and localization accuracy in validation and testing image sets. We have shown that such features can be used to derive meaningful biological traits, which in turn can be used in quantitative trait loci discovery pipelines. This process can be completely automated. We predict a paradigm shift in image-based phenotyping brought about by such deep learning approaches, given sufficient training sets. © The Authors 2017. Published by Oxford University Press.
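As a hint of what such a pipeline looks like in code, here is a minimal, illustrative PyTorch patch classifier; the layer sizes, input resolution, and the two-class root/background task are assumptions for the sketch, not the architecture used in the study.

```python
import torch
import torch.nn as nn

# Minimal patch classifier in the spirit of CNN-based feature identification;
# channel counts and depth are illustrative only.
class PatchNet(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, n_classes)

    def forward(self, x):              # x: (N, 3, 32, 32) image patches
        h = self.features(x)           # (N, 32, 8, 8)
        return self.classifier(h.flatten(1))

net = PatchNet()
logits = net(torch.randn(4, 3, 32, 32))   # four random patches
print(logits.shape)                        # torch.Size([4, 2])
```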
Understanding Quantitative and Qualitative Research in Early Childhood Education.
ERIC Educational Resources Information Center
Goodwin, William L.; Goodwin, Laura D.
This book describes the research process in order to facilitate understanding of the process and its products, especially as they pertain to early childhood education. It examines both quantitative and qualitative research methods, emphasizing ways in which they can be used together to fully study a given phenomenon or topic. Chapter 1 examines…
Modern Projection of the Old Electroscope for Nuclear Radiation Quantitative Work and Demonstrations
ERIC Educational Resources Information Center
Bastos, Rodrigo Oliveira; Boch, Layara Baltokoski
2017-01-01
Although quantitative measurements in radioactivity teaching and research are only believed to be possible with high technology, early work in this area was fully accomplished with very simple apparatus such as zinc sulphide screens and electroscopes. This article presents an experimental practice using the electroscope, which is a very simple…
Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W
2016-04-01
Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
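A minimal sketch of the quantitative step described above, i.e., Monte Carlo estimation of the probability of failure at candidate design points: the process model, variability assumptions, and specification limits below are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_failure(speed_set, force_set, n=20_000, spec=(95.0, 105.0)):
    """P(CQA outside spec) at one design point, under an assumed process model."""
    speed = rng.normal(speed_set, 2.0, n)      # process variability (assumed)
    force = rng.normal(force_set, 1.5, n)
    cqa = 80.0 + 0.15 * speed + 0.10 * force + rng.normal(0, 1.0, n)  # toy model
    return np.mean((cqa < spec[0]) | (cqa > spec[1]))

# Map risk over the candidate operating space; retain points below, say, 1% risk.
for s in (80, 100, 120):
    for f in (100, 150):
        print(f"speed={s}, force={f}: P(fail) = {prob_failure(s, f):.3f}")
```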
Fully automated segmentation of callus by micro-CT compared to biomechanics.
Bissinger, Oliver; Götz, Carolin; Wolff, Klaus-Dietrich; Hapfelmeier, Alexander; Prodinger, Peter Michael; Tischer, Thomas
2017-07-11
A high percentage of closed femur fractures have slight comminution. Using micro-CT (μCT), multiple fragment segmentation is much more difficult than segmentation of unfractured or osteotomied bone. Manual or semi-automated segmentation has been performed to date. However, such segmentation is extremely laborious, time-consuming and error-prone. Our aim was therefore to apply a fully automated segmentation algorithm to determine μCT parameters and examine their association with biomechanics. The femora of 64 rats, randomised to medication that inhibits or is neutral to fracture healing, plus controls, were closed-fractured after insertion of a Kirschner wire. After 21 days, μCT and biomechanical parameters were determined by a fully automated method and correlated (Pearson's correlation). The fully automated segmentation algorithm automatically detected bone and simultaneously separated cortical bone from callus without requiring ROI selection for each single bony structure. We found an association between structural callus parameters obtained by μCT and the biomechanical properties. However, the results were only explicable by additionally considering the callus location. A large number of slightly comminuted fractures in combination with therapies that influence the callus qualitatively and/or quantitatively considerably affects the association between μCT and biomechanics. In the future, contrast-enhanced μCT imaging of the callus cartilage might provide more information to improve the non-destructive and non-invasive prediction of callus mechanical properties. As studies evaluating such important drugs increase, fully automated segmentation appears to be clinically important.
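The μCT-biomechanics association reported here rests on Pearson's correlation; a short illustrative computation with hypothetical per-animal values:

```python
import numpy as np
from scipy import stats

# Hypothetical per-animal values: a muCT callus parameter vs. a biomechanical one.
callus_bv_tv = np.array([0.21, 0.35, 0.28, 0.40, 0.31, 0.25])   # bone volume fraction
max_torque   = np.array([120., 210., 160., 250., 190., 140.])   # N*mm at failure

r, p = stats.pearsonr(callus_bv_tv, max_torque)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```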
Bischof, Sylvain; Umhang, Martin; Eicke, Simona; Streb, Sebastian; Qi, Weihong; Zeeman, Samuel C.
2013-01-01
The branched glucans glycogen and starch are the most widespread storage carbohydrates in living organisms. The production of semicrystalline starch granules in plants is more complex than that of small, soluble glycogen particles in microbes and animals. However, the factors determining whether glycogen or starch is formed are not fully understood. The tropical tree Cecropia peltata is a rare example of an organism able to make either polymer type. Electron micrographs and quantitative measurements show that glycogen accumulates to very high levels in specialized myrmecophytic structures (Müllerian bodies), whereas starch accumulates in leaves. Compared with polymers comprising leaf starch, glycogen is more highly branched and has shorter branches—factors that prevent crystallization and explain its solubility. RNA sequencing and quantitative shotgun proteomics reveal that isoforms of all three classes of glucan biosynthetic enzyme (starch/glycogen synthases, branching enzymes, and debranching enzymes) are differentially expressed in Müllerian bodies and leaves, providing a system-wide view of the quantitative programming of storage carbohydrate metabolism. This work will prompt targeted analysis in model organisms and cross-species comparisons. Finally, as starch is the major carbohydrate used for food and industrial applications worldwide, these data provide a basis for manipulating starch biosynthesis in crops to synthesize tailor-made polyglucans. PMID:23632447
Flexible automated approach for quantitative liquid handling of complex biological samples.
Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H
2007-11-01
A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.
Vocal development in a Waddington landscape
Teramoto, Yayoi; Takahashi, Daniel Y; Holmes, Philip; Ghazanfar, Asif A
2017-01-01
Vocal development is the adaptive coordination of the vocal apparatus, muscles, the nervous system, and social interaction. Here, we use a quantitative framework based on optimal control theory and Waddington's landscape metaphor to provide an integrated view of this process. With a biomechanical model of the marmoset monkey vocal apparatus and behavioral developmental data, we show that only the combination of the developing vocal tract, vocal apparatus muscles and nervous system can fully account for the patterns of vocal development. Together, these elements influence the shape of the monkeys' vocal developmental landscape, tilting, rotating or shifting it in different ways. We can thus use this framework to make quantitative predictions regarding how interfering factors or experimental perturbations can change the landscape within a species, or to explain comparative differences in vocal development across species. DOI: http://dx.doi.org/10.7554/eLife.20782.001 PMID:28092262
2016-09-15
Investigative Questions This research will quantitatively address the impact of proposed benefits of a 3D printed satellite architecture on the...subsystems of a CubeSat. The objective of this research is to bring a quantitative analysis to the discussion of whether a fully 3D printed satellite...manufacturers to quantitatively address what impact the architecture would have on the subsystems of a CubeSat. Summary of Research Gap, Research Questions, and
Introductory science and mathematics education for 21st-Century biologists.
Bialek, William; Botstein, David
2004-02-06
Galileo wrote that "the book of nature is written in the language of mathematics"; his quantitative approach to understanding the natural world arguably marks the beginning of modern science. Nearly 400 years later, the fragmented teaching of science in our universities still leaves biology outside the quantitative and mathematical culture that has come to define the physical sciences and engineering. This strikes us as particularly inopportune at a time when opportunities for quantitative thinking about biological systems are exploding. We propose that a way out of this dilemma is a unified introductory science curriculum that fully incorporates mathematics and quantitative thinking.
Temporal lobe epilepsy: quantitative MR volumetry in detection of hippocampal atrophy.
Farid, Nikdokht; Girard, Holly M; Kemmotsu, Nobuko; Smith, Michael E; Magda, Sebastian W; Lim, Wei Y; Lee, Roland R; McDonald, Carrie R
2012-08-01
To determine the ability of fully automated volumetric magnetic resonance (MR) imaging to depict hippocampal atrophy (HA) and to help correctly lateralize the seizure focus in patients with temporal lobe epilepsy (TLE). This study was conducted with institutional review board approval and in compliance with HIPAA regulations. Volumetric MR imaging data were analyzed for 34 patients with TLE and 116 control subjects. Structural volumes were calculated by using U.S. Food and Drug Administration-cleared software for automated quantitative MR imaging analysis (NeuroQuant). Results of quantitative MR imaging were compared with visual detection of atrophy, and, when available, with histologic specimens. Receiver operating characteristic analyses were performed to determine the optimal sensitivity and specificity of quantitative MR imaging for detecting HA and asymmetry. A linear classifier with cross validation was used to estimate the ability of quantitative MR imaging to help lateralize the seizure focus. Quantitative MR imaging-derived hippocampal asymmetries discriminated patients with TLE from control subjects with high sensitivity (86.7%-89.5%) and specificity (92.2%-94.1%). When a linear classifier was used to discriminate left versus right TLE, hippocampal asymmetry achieved 94% classification accuracy. Volumetric asymmetries of other subcortical structures did not improve classification. Compared with invasive video electroencephalographic recordings, lateralization accuracy was 88% with quantitative MR imaging and 85% with visual inspection of volumetric MR imaging studies but only 76% with visual inspection of clinical MR imaging studies. Quantitative MR imaging can depict the presence and laterality of HA in TLE with accuracy rates that may exceed those achieved with visual inspection of clinical MR imaging studies. Thus, quantitative MR imaging may enhance standard visual analysis, providing a useful and viable means for translating volumetric analysis into clinical practice.
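A sketch of the lateralization step: simulated hippocampal asymmetry indices, (L − R)/(L + R), fed to a linear classifier with cross-validation, mirroring the study's approach; the group means and spreads below are invented for illustration.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# Simulated per-patient hippocampal asymmetry indices (L - R) / (L + R):
left_tle  = rng.normal(-0.06, 0.03, (17, 1))   # left hippocampal atrophy
right_tle = rng.normal(+0.06, 0.03, (17, 1))   # right hippocampal atrophy
X = np.vstack([left_tle, right_tle])
y = np.array([0] * 17 + [1] * 17)              # 0 = left TLE, 1 = right TLE

acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print(f"cross-validated lateralization accuracy: {acc.mean():.2f}")
```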
Building Extraction from Remote Sensing Data Using Fully Convolutional Networks
NASA Astrophysics Data System (ADS)
Bittner, K.; Cui, S.; Reinartz, P.
2017-05-01
Building detection and footprint extraction are highly demanded for many remote sensing applications. Though most previous works have shown promising results, the automatic extraction of building footprints still remains a nontrivial topic, especially in complex urban areas. Recently developed extensions of the CNN framework made it possible to perform dense pixel-wise classification of input images. Based on these abilities we propose a methodology, which automatically generates a full resolution binary building mask out of a Digital Surface Model (DSM) using a Fully Convolutional Network (FCN) architecture. The advantage of using the depth information is that it provides geometrical silhouettes and allows a better separation of buildings from background, as well as invariance to illumination and color variations. The proposed framework has mainly two steps. Firstly, the FCN is trained on a large set of patches consisting of normalized DSM (nDSM) as inputs and available ground truth building masks as target outputs. Secondly, the generated predictions from the FCN are viewed as unary terms for a Fully connected Conditional Random Field (FCRF), which enables us to create a final binary building mask. A series of experiments demonstrate that our methodology is able to extract accurate building footprints which are close to the buildings' original shapes to a high degree. The quantitative and qualitative analysis shows significant improvements of the results in contrast to the multi-layer fully connected network from our previous work.
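A minimal encoder-decoder FCN of the kind described, sketched in PyTorch under assumed channel counts and input size; the FCRF post-processing stage is omitted here.

```python
import torch
import torch.nn as nn

# Minimal FCN: 1-channel nDSM in, per-pixel building logit out.
# Depths and channel counts are illustrative, not those of the paper's network.
class BuildingFCN(nn.Module):
    def __init__(self):
        super().__init__()
        self.down = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.up = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 2, stride=2),
        )

    def forward(self, x):                 # x: (N, 1, H, W) normalized DSM
        return self.up(self.down(x))      # (N, 1, H, W) building logits

mask_logits = BuildingFCN()(torch.randn(1, 1, 128, 128))
print(mask_logits.shape)   # torch.Size([1, 1, 128, 128])
```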
Planner-Based Control of Advanced Life Support Systems
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Kortenkamp, David; Fry, Chuck; Bell, Scott
2005-01-01
The paper describes an approach to the integration of qualitative and quantitative modeling techniques for advanced life support (ALS) systems. Developing reliable control strategies that scale up to fully integrated life support systems requires augmenting quantitative models and control algorithms with the abstractions provided by qualitative, symbolic models and their associated high-level control strategies. This will allow for effective management of the combinatorics due to the integration of a large number of ALS subsystems. By focusing control actions at different levels of detail and reactivity, we can use faster, simpler responses at the lowest level and predictive but more complex responses at the higher levels of abstraction. In particular, methods from model-based planning and scheduling can provide effective resource management over long time periods. We describe a reference implementation of an advanced control system using the IDEA control architecture developed at NASA Ames Research Center. IDEA uses planning/scheduling as the sole reasoning method for predictive and reactive closed-loop control. We describe preliminary experiments in planner-based control of ALS carried out on an integrated ALS simulation developed at NASA Johnson Space Center.
Xue, Angli; Wang, Hongcheng; Zhu, Jun
2017-09-28
Startle behavior is important for survival, and abnormal startle responses are related to several neurological diseases. Drosophila melanogaster provides a powerful system to investigate the genetic underpinnings of variation in startle behavior, since mechanically induced startle responses and environmental conditions can be readily quantified and precisely controlled. The 156 wild-derived fully sequenced lines of the Drosophila Genetic Reference Panel (DGRP) were used to identify SNPs and transcripts associated with variation in startle behavior. The results validated highly significant effects of 33 quantitative trait SNPs (QTSs) and 81 quantitative trait transcripts (QTTs) directly associated with phenotypic variation of the startle response. We also detected QTT variation controlled by 20 QTSs (tQTSs) and 73 transcripts (tQTTs). Association mapping based on genomic and transcriptomic data enabled us to construct a complex genetic network that underlies variation in startle behavior. Based on principles of evolutionary conservation, human orthologous genes could be superimposed on this network. This study provided both genetic and biological insights into the variation of startle response behavior of Drosophila melanogaster, and highlighted the importance of the genetic network for understanding the genetic architecture of complex traits.
Verplaetse, Ruth; Henion, Jack
2016-01-01
Opioids are well known, widely used painkillers. Increased stability of opioids in the dried blood spot (DBS) matrix compared to blood/plasma has been described. Other benefits provided by DBS techniques include point-of-care collection, less invasive micro-sampling, more economical shipment, and convenient storage. Current methodology for analysis of micro whole blood samples for opioids is limited to the classical DBS workflow, including tedious manual punching of the DBS cards followed by extraction and liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. The goal of this study was to develop and validate a fully automated on-line sample preparation procedure for the analysis of DBS micro samples relevant to the detection of opioids in finger prick blood. To this end, automated flow-through elution of DBS cards was followed by on-line solid-phase extraction (SPE) and analysis by LC-MS/MS. Selective, sensitive, accurate, and reproducible quantitation of five representative opioids in human blood at sub-therapeutic, therapeutic, and toxic levels was achieved. The range of reliable response (R² ≥ 0.997) was 1 to 500 ng/mL whole blood for morphine, codeine, oxycodone, hydrocodone; and 0.1 to 50 ng/mL for fentanyl. Inter-day, intra-day, and matrix inter-lot accuracy and precision were less than 15% (even at the lower limit of quantitation (LLOQ)). The method was successfully used to measure hydrocodone and its major metabolite norhydrocodone in incurred human samples. Our data support the enormous potential of DBS sampling and automated analysis for monitoring opioids as well as other pharmaceuticals in both anti-doping and pain management regimens. Copyright © 2015 John Wiley & Sons, Ltd.
General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.
de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael
2016-11-01
Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMM can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. Copyright © 2016 de Villemereuil et al.
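The latent-to-observed-scale conversion can be illustrated for a Poisson trait with a log link; the sketch below uses Monte Carlo integration over the latent distribution (the R package QGglmm mentioned above performs such integrals more carefully; the parameter values here are assumed).

```python
import numpy as np

rng = np.random.default_rng(3)

# Latent-scale GLMM parameters (assumed values for illustration):
mu, var_a, var_r = 0.5, 0.3, 0.4        # intercept, additive genetic, residual
var_tot = var_a + var_r

eta = rng.normal(mu, np.sqrt(var_tot), 1_000_000)   # latent values
lam = np.exp(eta)                                    # Poisson mean under log link

mean_obs = lam.mean()                    # E[y] on the observed (data) scale
var_obs = lam.var() + lam.mean()         # Var(E[y|eta]) + E[Var(y|eta)], Poisson
psi = lam.mean()                         # E[d lambda / d eta]; = E[lam] for log link
h2_obs = psi**2 * var_a / var_obs        # observed-scale heritability

# Closed form for the log link: E[y] = exp(mu + var_tot / 2)
print(mean_obs, np.exp(mu + var_tot / 2), h2_obs)
```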
Choe, Leila H; Lee, Kelvin H
2003-10-01
We investigate one approach to assess the quantitative variability in two-dimensional gel electrophoresis (2-DE) separations based on gel-to-gel variability, sample preparation variability, sample load differences, and the effect of automation on image analysis. We observe that 95% of spots present in three out of four replicate gels exhibit less than a 0.52 coefficient of variation (CV) in fluorescent stain intensity (% volume) for a single sample run on multiple gels. When four parallel sample preparations are performed, this value increases to 0.57. We do not observe any significant change in quantitative value for an increase or decrease in sample load of 30% when using appropriate image analysis variables. Increasing use of automation, while necessary in modern 2-DE experiments, does change the observed level of quantitative and qualitative variability among replicate gels. The number of spots that change qualitatively for a single sample run in parallel varies from a CV = 0.03 for fully manual analysis to CV = 0.20 for a fully automated analysis. We present a systematic method by which a single laboratory can measure gel-to-gel variability using only three gel runs.
Smith, Eric G.
2015-01-01
Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors. Method: A method is presented that exploits the recently-identified phenomenon of “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met. Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions: To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward. The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226
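The core arithmetic of the proposed estimate can be shown in a few lines, under one plausible reading in which the "degree of amplification" enters as (A − 1); all numbers are hypothetical, and the amplification factor is taken as given rather than estimated, sidestepping the open questions the authors list.

```python
# Illustrative arithmetic only; values are invented and the amplification
# factor A is assumed known, which the paper identifies as a key uncertainty.
effect_base = 1.40   # treatment effect estimate, baseline propensity model
effect_ext  = 1.52   # estimate after adding a strong predictor of exposure
A = 1.30             # estimated confounding-amplification factor (assumed)

# Change in estimate attributed to amplified residual confounding:
residual_confounding = (effect_ext - effect_base) / (A - 1.0)
print(f"estimated residual confounding: {residual_confounding:.2f}")
```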
Wadephul, Franziska; Jones, Catriona; Jomeen, Julie
2016-06-08
Depression, anxiety and stress in the perinatal period can have serious, long-term consequences for women, their babies and their families. Over the last two decades, an increasing number of group interventions with a psychological approach have been developed to improve the psychological well-being of pregnant women. This systematic review examines interventions targeting women with elevated symptoms of, or at risk of developing, perinatal mental health problems, with the aim of understanding the successful and unsuccessful features of these interventions. We systematically searched online databases to retrieve qualitative and quantitative studies on psychological antenatal group interventions. A total of 19 papers describing 15 studies were identified; these included interventions based on cognitive behavioural therapy, interpersonal therapy and mindfulness. Quantitative findings suggested beneficial effects in some studies, particularly for women with high baseline symptoms. However, overall there is insufficient quantitative evidence to make a general recommendation for antenatal group interventions. Qualitative findings suggest that women and their partners experience these interventions positively, in terms of psychological wellbeing and reassurance of their 'normality'. This review suggests that there are some benefits to attending group interventions, but further research is required to fully understand their successful and unsuccessful features.
Modern projection of the old electroscope for nuclear radiation quantitative work and demonstrations
NASA Astrophysics Data System (ADS)
Oliveira Bastos, Rodrigo; Baltokoski Boch, Layara
2017-11-01
Although quantitative measurements in radioactivity teaching and research are only believed to be possible with high technology, early work in this area was fully accomplished with very simple apparatus such as zinc sulphide screens and electroscopes. This article presents an experimental practice using the electroscope, which is a very simple apparatus that has been widely used for educational purposes, although generally for qualitative work. The main objective is to show the possibility of measuring radioactivity not only in qualitative demonstrations, but also in quantitative experimental practices. The experimental set-up is a low-cost ion chamber connected to an electroscope, in a configuration very similar to that used by Marie and Pierre Curie, Rutherford, Geiger, Pacini, Hess and other great researchers from the time of the big discoveries in nuclear and high-energy particle physics. An electroscope leaf is filmed and projected, permitting the collection of quantitative data for the measurement of the 220Rn half-life, as collected from the emanation of lantern mantles. The article presents the experimental procedures and the expected results, indicating that the experiment may provide support for nuclear physics classes. These practices could spread widely to either university or school didactic laboratories, and the apparatus has the potential to allow the development of new teaching activities for nuclear physics.
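The half-life measurement reduces to fitting an exponential to the leaf-discharge readings; a minimal sketch with simulated data (220Rn, thoron, has a half-life of about 55.6 s) follows.

```python
import numpy as np
from scipy.optimize import curve_fit

def discharge(t, a, lam, c):
    """Exponential decay of the ionization-driven discharge rate plus baseline."""
    return a * np.exp(-lam * t) + c

# Simulated leaf readings every 10 s; true half-life set to 55.6 s (220Rn).
t = np.arange(0, 300, 10.0)
rng = np.random.default_rng(4)
y = discharge(t, 100.0, np.log(2) / 55.6, 5.0) + rng.normal(0, 2.0, t.size)

(a, lam, c), _ = curve_fit(discharge, t, y, p0=(80.0, 0.01, 0.0))
print(f"fitted half-life: {np.log(2) / lam:.1f} s")
```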
Šrámková, Ivana; Amorim, Célia G; Sklenářová, Hana; Montenegro, Maria C B M; Horstkotte, Burkhard; Araújo, Alberto N; Solich, Petr
2014-01-01
In this work, an application of an enzymatic reaction for the determination of the highly hydrophobic drug propofol in emulsion dosage form is presented. Emulsions represent a complex and therefore challenging matrix for analysis. Ethanol was used for breakage of a lipid emulsion, which enabled optical detection. A fully automated method based on Sequential Injection Analysis was developed, allowing propofol determination without the requirement of tedious sample pre-treatment. The method was based on spectrophotometric detection after the enzymatic oxidation catalysed by horseradish peroxidase and subsequent coupling with 4-aminoantipyrine, leading to a coloured product with an absorbance maximum at 485 nm. This procedure was compared with a simple fluorimetric method, which was based on the direct selective fluorescence emission of propofol in ethanol at 347 nm. Both methods provide comparable validation parameters, with linear working ranges of 0.005-0.100 mg mL⁻¹ and 0.004-0.243 mg mL⁻¹ for the spectrophotometric and fluorimetric methods, respectively. The detection and quantitation limits achieved with the spectrophotometric method were 0.0016 and 0.0053 mg mL⁻¹, respectively. The fluorimetric method provided a detection limit of 0.0013 mg mL⁻¹ and a limit of quantitation of 0.0043 mg mL⁻¹. The RSD did not exceed 5% and 2% (n=10), respectively. A sample throughput of approx. 14 h⁻¹ for the spectrophotometric and 68 h⁻¹ for the fluorimetric detection was achieved. Both methods proved to be suitable for the determination of propofol in pharmaceutical formulation, with average recovery values of 98.1 and 98.5%. © 2013 Elsevier B.V. All rights reserved.
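Detection and quantitation limits of this kind can be estimated with the common ICH-style calibration formulas, LOD = 3.3σ/S and LOQ = 10σ/S (σ the residual standard deviation, S the slope); a short sketch with hypothetical calibration points, not the study's actual data:

```python
import numpy as np

# Hypothetical calibration: propofol concentration vs. absorbance at 485 nm.
conc = np.array([0.005, 0.02, 0.04, 0.06, 0.08, 0.100])   # mg/mL
absb = np.array([0.031, 0.118, 0.242, 0.355, 0.480, 0.601])

slope, intercept = np.polyfit(conc, absb, 1)
residual_sd = np.std(absb - (slope * conc + intercept), ddof=2)

# ICH Q2-style estimates from the calibration residuals:
lod = 3.3 * residual_sd / slope
loq = 10.0 * residual_sd / slope
print(f"LOD ~ {lod:.4f} mg/mL, LOQ ~ {loq:.4f} mg/mL")
```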
Umesh Agarwal; Sally A. Ralph
2003-01-01
With the objective of using FT-Raman to quantitatively analyze ethylenic units in lignin in thermomechanical pulps (TMPs), coniferyl alcohol, coniferin, coniferaldehyde, and G-DHP lignin models were used to first demonstrate that the technique was fully capable of quantifying ring conjugated ethylenic units. Based on this result, the amount of ethylenic units in TMP...
Design of an Electric Propulsion System for SCEPTOR
NASA Technical Reports Server (NTRS)
Dubois, Arthur; van der Geest, Martin; Bevirt, JoeBen; Clarke, Sean; Christie, Robert J.; Borer, Nicholas K.
2016-01-01
The rise of electric propulsion systems has pushed aircraft designers towards new and potentially transformative concepts. As part of this effort, NASA is leading the SCEPTOR program, which aims at designing a fully electric, distributed-propulsion general aviation aircraft. This article highlights critical aspects of the design of SCEPTOR's propulsion system conceived at Joby Aviation in partnership with NASA, including motor electromagnetic design and optimization as well as cooling system integration. The motor is designed with a finite element based multi-objective optimization approach. This provides insight into important design tradeoffs such as mass versus efficiency, and enables a detailed quantitative comparison between different motor topologies. Secondly, a complete design and Computational Fluid Dynamics analysis of the air-breathing cooling system is presented. The cooling system is fully integrated into the nacelle, contains few or no moving parts and incurs only a small drag penalty. Several concepts are considered and compared over a range of operating conditions. The study presents trade-offs between various parameters such as cooling efficiency, drag, mechanical simplicity and robustness.
Phenotypic convergence in bacterial adaptive evolution to ethanol stress.
Horinouchi, Takaaki; Suzuki, Shingo; Hirasawa, Takashi; Ono, Naoaki; Yomo, Tetsuya; Shimizu, Hiroshi; Furusawa, Chikara
2015-09-03
Bacterial cells have a remarkable ability to adapt to environmental changes, a phenomenon known as adaptive evolution. During adaptive evolution, phenotype and genotype change dynamically; however, the relationship between these changes and the associated constraints is yet to be fully elucidated. In this study, we analyzed phenotypic and genotypic changes in Escherichia coli cells during adaptive evolution to ethanol stress. Phenotypic changes were quantified by transcriptome and metabolome analyses and were similar among independently evolved ethanol-tolerant populations, which indicates the existence of evolutionary constraints in the dynamics of adaptive evolution. Furthermore, the contribution of the mutations identified in one of the tolerant strains was evaluated using site-directed mutagenesis. The result demonstrated that the introduction of all identified mutations could not fully explain the observed tolerance in the tolerant strain. Overall, the results demonstrated convergence of adaptive phenotypic changes despite diverse genotypic changes, suggesting that the phenotype-genotype mapping is complex. The integration of transcriptome and genome data provides a quantitative understanding of evolutionary constraints.
Soto, Juan M; Rodrigo, José A; Alieva, Tatiana
2018-01-01
Quantitative label-free imaging is an important tool for the study of living microorganisms that, during the last decade, has attracted wide attention from the optical community. Optical diffraction tomography (ODT) is probably the most relevant technique for quantitative label-free 3D imaging applied in wide-field microscopy in the visible range. The ODT is usually performed using spatially coherent light illumination and specially designed holographic microscopes. Nevertheless, the ODT is also compatible with partially coherent illumination and can be realized in conventional wide-field microscopes by applying refocusing techniques, as has recently been demonstrated. Here, we compare these two ODT modalities, underlining their pros and cons and discussing the optical setups for their implementation. In particular, we pay special attention to a system that is compatible with a conventional wide-field microscope and can be used for both ODT modalities. It consists of two easily attachable modules: the first for sample illumination engineering based on digital light processing technology; the other for focus scanning by using an electrically driven tunable lens. This hardware allows for a programmable selection of the wavelength and the illumination design, and provides fast data acquisition as well. Its performance is experimentally demonstrated in the case of ODT with partially coherent illumination, providing speckle-free 3D quantitative imaging.
NASA Astrophysics Data System (ADS)
Mehta, Shalin B.; Sheppard, Colin J. R.
2010-05-01
Various methods that use a large illumination aperture (i.e. partially coherent illumination) have been developed for making transparent (i.e. phase) specimens visible. These methods were developed to provide qualitative contrast rather than quantitative measurement; coherent illumination has instead been relied upon for quantitative phase analysis. Partially coherent illumination has some important advantages over coherent illumination and can be used for measurement of the specimen's phase distribution. However, quantitative analysis and image computation in partially coherent systems have not been explored fully due to the lack of a general, physically insightful and computationally efficient model of image formation. We have developed a phase-space model that satisfies these requirements. In this paper, we employ this model (called the phase-space imager) to elucidate five different partially coherent systems mentioned in the title. We compute images of an optical fiber under these systems and verify some of them with experimental images. These results and simulated images of a general phase profile are used to compare the contrast and the resolution of the imaging systems. We show that, for quantitative phase imaging of a thin specimen with matched illumination, differential phase contrast offers linear transfer of specimen information to the image. We also show that the edge enhancement properties of spiral phase contrast are compromised significantly as the coherence of illumination is reduced. The results demonstrate that the phase-space imager model provides a useful framework for analysis, calibration, and design of partially coherent imaging methods.
QACD: A method for the quantitative assessment of compositional distribution in geologic materials
NASA Astrophysics Data System (ADS)
Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.
2017-12-01
In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines the microanalytical techniques that allow for the analysis of major and minor elements at high spatial resolutions (e.g., electron microbeam analysis) with 2D mapping of samples in order to provide unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large-area X-ray element maps obtained by energy-dispersive X-ray spectrometer (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full thin section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used to not only accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section), but, critically, enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.
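To make the modal-abundance step concrete, here is a toy Python sketch that classifies pixels of quantified element maps by simple compositional thresholds and reports phase modes as pixel fractions; the thresholds and phase names are invented and do not reflect Quack's actual classification rules.

```python
import numpy as np

# Toy quantified element maps (wt%), shape (H, W); real maps come from EDS.
H = W = 64
rng = np.random.default_rng(5)
mg = rng.uniform(0, 30, (H, W))
ca = rng.uniform(0, 15, (H, W))

# Classify each pixel by simple compositional rules (illustrative thresholds),
# then report modal abundances as pixel fractions.
phase = np.full((H, W), "other", dtype=object)
phase[(mg > 15) & (ca < 5)] = "olivine"
phase[(ca > 8) & (mg <= 15)] = "plagioclase"

for name in ("olivine", "plagioclase", "other"):
    print(f"{name}: {np.mean(phase == name):.1%}")
```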
Providers' perceptions of spinal cord injury pressure ulcer guidelines.
Thomason, Susan S; Evitt, Celinda P; Harrow, Jeffrey J; Love, Linda; Moore, D Helen; Mullins, Maria A; Powell-Cope, Gail; Nelson, Audrey L
2007-01-01
Pressure ulcers are a serious complication for people with spinal cord injury (SCI). The Consortium for Spinal Cord Medicine (CSCM) published clinical practice guidelines (CPGs) that provided guidance for pressure ulcer prevention and treatment after SCI. The aim of this study was to assess providers' perceptions for each of the 32 CPG recommendations regarding their agreement with CPGs, degree of CPG implementation, and CPG implementation barriers and facilitators. This descriptive mixed-methods study included both qualitative (focus groups) and quantitative (survey) data collection approaches. The sample (n = 60) included 24 physicians and 36 nurses who attended the 2004 annual national conferences of the American Paraplegia Society or American Association of Spinal Cord Injury Nurses. This sample drew from two sources: a purposive sample from a list of preregistered participants and a convenience sample of conference attendee volunteers. We analyzed quantitative data using descriptive statistics and qualitative data using a coding scheme to capture barriers and facilitators. The focus groups agreed unanimously on the substance of 6 of the 32 recommendations. Nurse and physician focus groups disagreed on the degree of CGP implementation at their sites, with nurses as a group perceiving less progress in implementation of the guideline recommendations. The focus groups identified only one recommendation, complications of surgery, as being fully implemented at their sites. Categories of barriers and facilitators for implementation of CPGs that emerged from the qualitative analysis included (a) characteristics of CPGs: need for research/evidence, (b) characteristics of CPGs: complexity of design and wording, (c) organizational factors, (d) lack of knowledge, and (e) lack of resources. Although generally SCI physicians and nurses agreed with the CPG recommendations as written, they did not feel these recommendations were fully implemented in their respective clinical settings. The focus groups identified multiple barriers to the implementation of the CPGs and suggested several facilitators/solutions to improve implementation of these guidelines in SCI. Participants identified organizational factors and the lack of knowledge as the most substantial systems/issues that created barriers to CPG implementation.
Bosonic excitations and electron pairing in an electron-doped cuprate superconductor
NASA Astrophysics Data System (ADS)
Wang, M. C.; Yu, H. S.; Xiong, J.; Yang, Y.-F.; Luo, S. N.; Jin, K.; Qi, J.
2018-04-01
By applying ultrafast optical spectroscopy to electron-doped La1.9Ce0.1CuO4±δ, we discern a bosonic mode of electronic origin and provide the evolution of its coupling with the charge carriers as a function of temperature. Our results show that it has the strongest coupling strength near Tc and can fully account for the superconducting pairing. This mode can be associated with the two-dimensional antiferromagnetic spin correlations emerging below a critical temperature T† larger than Tc. Our work may help to establish a quantitative relation between bosonic excitations and superconducting pairing in electron-doped cuprates.
NASA Technical Reports Server (NTRS)
Nagpal, Vinod K.
1988-01-01
The effects of actual variations, also called uncertainties, in geometry and material properties on the structural response of a space shuttle main engine turbopump blade are evaluated. A normal distribution was assumed to represent the uncertainties statistically. Uncertainties were assumed to be totally random, partially correlated, and fully correlated. The magnitudes of these uncertainties were represented in terms of mean and variance. Blade responses, recorded in terms of displacements, natural frequencies, and maximum stress, were evaluated and plotted in the form of probabilistic distributions under combined uncertainties. These distributions provide an estimate of the range of magnitudes of the response and the probability of occurrence of a given response. Most importantly, these distributions provide the information needed to estimate quantitatively the risk in a structural design.
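The effect of the assumed correlation structure on the response distribution can be reproduced with a small Monte Carlo sketch; the two-variable linear response below is a stand-in for the blade model, not the analysis actually used.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000

# Two uncertain inputs (e.g., a geometry and a modulus deviation), unit variance;
# rho sweeps from totally random to fully correlated.
for rho, label in [(0.0, "totally random"), (0.5, "partially correlated"),
                   (1.0, "fully correlated")]:
    cov = np.array([[1.0, rho], [rho, 1.0]])
    g, e = rng.multivariate_normal([0.0, 0.0], cov, n).T
    response = 3.0 * g + 2.0 * e          # toy linear response surrogate
    print(f"{label}: response std = {response.std():.2f}")
```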
Theory of sampling: four critical success factors before analysis.
Wagner, Claas; Esbensen, Kim H
2015-01-01
Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.
33 CFR 325.1 - Applications for permits.
Code of Federal Regulations, 2012 CFR
2012-07-01
... assure that the potential applicant is fully aware of the substance (both quantitative and qualitative... Protection, Research and Sanctuaries Act of 1972, as amended, and sections 9 and 10 of the Rivers and Harbors...
Summers, Ronald M; Baecher, Nicolai; Yao, Jianhua; Liu, Jiamin; Pickhardt, Perry J; Choi, J Richard; Hill, Suvimol
2011-01-01
To show the feasibility of calculating the bone mineral density (BMD) from computed tomographic colonography (CTC) scans using fully automated software. Automated BMD measurement software was developed that measures the BMD of the first and second lumbar vertebrae on computed tomography and calculates the mean of the 2 values to provide a per patient BMD estimate. The software was validated in a reference population of 17 consecutive women who underwent quantitative computed tomography and in a population of 475 women from a consecutive series of asymptomatic patients enrolled in a CTC screening trial conducted at 3 medical centers. The mean (SD) BMD was 133.6 (34.6) mg/mL (95% confidence interval, 130.5-136.7; n = 475). In women aged 42 to 60 years (n = 316) and 61 to 79 years (n = 159), the mean (SD) BMDs were 143.1 (33.5) and 114.7 (28.3) mg/mL, respectively (P < 0.0001). Fully automated BMD measurements were reproducible for a given patient with 95% limits of agreement of -9.79 to 8.46 mg/mL for the mean difference between paired assessments on supine and prone CTC. Osteoporosis screening can be performed simultaneously with screening for colorectal polyps.
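The reported supine-prone agreement corresponds to standard Bland-Altman limits (mean difference ± 1.96 SD of the differences); the sketch below simulates paired measurements tuned to mimic the published limits.

```python
import numpy as np

# Simulated paired per-patient BMD estimates (mg/mL) from supine and prone CTC,
# tuned so the limits of agreement land near the published (-9.79, 8.46).
rng = np.random.default_rng(7)
supine = rng.normal(133.6, 34.6, 475)
prone = supine + rng.normal(0.67, 4.65, 475)

diff = supine - prone
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.2f} mg/mL, 95% limits of agreement = "
      f"({bias - half_width:.2f}, {bias + half_width:.2f})")
```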
NASA Astrophysics Data System (ADS)
Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia
2017-05-01
A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least squares between the measured and calculated values over time, which may encounter some problems such as the overfitting of model parameters and a lack of reproducibility, especially when handling noisy data or error data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model directly dealing with noisy data but not trying to smooth the noise in the image. Also, due to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which could find a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
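For contrast with the ML approach, the classical IF baseline can be sketched as a least-squares fit of a one-tissue compartment model, C_t(t) = K1 ∫ Cp(s) e^(−k2 (t−s)) ds; the input function, rate constants, and noise level below are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 60, 61)                   # minutes, uniform grid
cp = 10.0 * t * np.exp(-t / 4.0)             # toy plasma input function

def one_tissue(t, K1, k2):
    """C_t(t) = K1 * conv(Cp, exp(-k2 t)), discretized on the uniform grid."""
    dt = t[1] - t[0]
    kernel = np.exp(-k2 * t)
    return K1 * np.convolve(cp, kernel)[: t.size] * dt

rng = np.random.default_rng(8)
tac = one_tissue(t, 0.3, 0.1) + rng.normal(0, 0.5, t.size)   # noisy tissue curve

(K1, k2), _ = curve_fit(one_tissue, t, tac, p0=(0.1, 0.05))
print(f"K1 = {K1:.3f}, k2 = {k2:.3f}")
```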
Quantitative profiling of sphingolipids in wild Cordyceps and its mycelia by using UHPLC-MS
Mi, Jia-Ning; Wang, Jing-Rong; Jiang, Zhi-Hong
2016-01-01
In the present study, 101 sphingolipids in wild Cordyceps and its five mycelia were quantitatively profiled by using a fully validated UHPLC-MS method. The results revealed that a general rank order for the abundance of different classes of sphingolipids in wild Cordyceps and its mycelia is sphingoid bases/ceramides > phosphosphingolipids > glycosphingolipids. However, remarkable sphingolipid differences between wild Cordyceps and its mycelia were observed. One is that sphingoid base is the dominant sphingolipid in wild Cordyceps, whereas ceramide is the major sphingolipid in mycelia. Another difference is that the abundance of sphingomyelins in wild Cordyceps is almost 10-fold higher than in most mycelia. The third one is that mycelia contain more inositol phosphorylceramides and glycosphingolipids than wild Cordyceps. Multivariate analysis was further employed to visualize the difference among wild Cordyceps and different mycelia, leading to the identification of respective sphingolipids as potential chemical markers for the differentiation of wild Cordyceps and its related mycelia. This study represents the first report on the quantitative profiling of sphingolipids in wild Cordyceps and its related mycelia, which provides comprehensive chemical evidence for the quality control and rational utilization of wild Cordyceps and its mycelia. PMID:26868933
Tang, Xiaoying; Luo, Yuan; Chen, Zhibin; Huang, Nianwei; Johnson, Hans J.; Paulsen, Jane S.; Miller, Michael I.
2018-01-01
In this paper, we present a fully-automated subcortical and ventricular shape generation pipeline that acts on structural magnetic resonance images (MRIs) of the human brain. Principally, the proposed pipeline consists of three steps: (1) automated structure segmentation using the diffeomorphic multi-atlas likelihood-fusion algorithm; (2) study-specific shape template creation based on the Delaunay triangulation; (3) deformation-based shape filtering using the large deformation diffeomorphic metric mapping for surfaces. The proposed pipeline is shown to provide high accuracy, sufficient smoothness, and accurate anatomical topology. Two datasets focused upon Huntington's disease (HD) were used for evaluating the performance of the proposed pipeline. The first of these contains a total of 16 MRI scans, each with a gold standard available, on which the proposed pipeline's outputs were observed to be highly accurate and smooth when compared with the gold standard. Visual examinations and outlier analyses on the second dataset, which contains a total of 1,445 MRI scans, revealed 100% success rates for the putamen, the thalamus, the globus pallidus, the amygdala, and the lateral ventricle in both hemispheres and rates no smaller than 97% for the bilateral hippocampus and caudate. Another independent dataset, consisting of 15 atlas images and 20 testing images, was also used to quantitatively evaluate the proposed pipeline, with high accuracy having been obtained. In short, the proposed pipeline is herein demonstrated to be effective, both quantitatively and qualitatively, using a large collection of MRI scans. PMID:29867332
Quantitative microstructural imaging by scanning Laue x-ray micro- and nanodiffraction
Chen, Xian; Dejoie, Catherine; Jiang, Tengfei; ...
2016-06-08
We present that local crystal structure, crystal orientation, and crystal deformation can all be probed by Laue diffraction using a submicron x-ray beam. This technique, employed at a synchrotron facility, is particularly suitable for fast mapping of the mechanical and microstructural properties of inhomogeneous multiphase polycrystalline samples, as well as imperfect epitaxial films or crystals. As synchrotron Laue x-ray microdiffraction enters its 20th year of existence and new synchrotron nanoprobe facilities are being built and commissioned around the world, we take the opportunity to overview current capabilities as well as the latest technical developments. Fast data collection provided by state-of-the-art area detectors and fully automated pattern indexing algorithms optimized for speed make it possible to map large portions of a sample with fine step size and obtain quantitative images of its microstructure in near real time. Lastly, we extrapolate how the technique is anticipated to evolve in the near future and its potential emerging applications at a free-electron laser facility.
Stability and instability towards delocalization in many-body localization systems
NASA Astrophysics Data System (ADS)
De Roeck, Wojciech; Huveneers, François
2017-04-01
We propose a theory that describes quantitatively the (in)stability of fully many-body localized (MBL) systems due to ergodic, i.e., delocalized, grains, which can arise, for example, from disorder fluctuations. The theory is based on the ETH hypothesis and elementary notions of perturbation theory. The main idea is that we assume as much chaoticity as is consistent with conservation laws. The theory describes correctly—even without relying on the theory of local integrals of motion (LIOM)—the MBL phase in one dimension at strong disorder. It yields an explicit and quantitative picture of the spatial boundary between localized and ergodic systems. We provide numerical evidence for this picture. When the theory is taken to its extreme logical consequences, it predicts that the MBL phase is destabilised in the long-time limit (1) whenever interactions decay slower than exponentially in d = 1 and (2) always in d > 1. Finer numerics are required to assess these predictions.
Machado, Ana S; Darmohray, Dana M; Fayad, João; Marques, Hugo G; Carey, Megan R
2015-01-01
The coordination of movement across the body is a fundamental, yet poorly understood aspect of motor control. Mutant mice with cerebellar circuit defects exhibit characteristic impairments in locomotor coordination; however, the fundamental features of this gait ataxia have not been effectively isolated. Here we describe a novel system (LocoMouse) for analyzing limb, head, and tail kinematics of freely walking mice. Analysis of visibly ataxic Purkinje cell degeneration (pcd) mice reveals that while differences in the forward motion of individual paws are fully accounted for by changes in walking speed and body size, more complex 3D trajectories and, especially, inter-limb and whole-body coordination are specifically impaired. Moreover, the coordination deficits in pcd are consistent with a failure to predict and compensate for the consequences of movement across the body. These results isolate specific impairments in whole-body coordination in mice and provide a quantitative framework for understanding cerebellar contributions to coordinated locomotion. DOI: http://dx.doi.org/10.7554/eLife.07892.001 PMID:26433022
Context influences on TALE–DNA binding revealed by quantitative profiling
Rogers, Julia M.; Barrera, Luis A.; Reyon, Deepak; Sander, Jeffry D.; Kellis, Manolis; Joung, J Keith; Bulyk, Martha L.
2015-01-01
Transcription activator-like effector (TALE) proteins recognize DNA using a seemingly simple DNA-binding code, which makes them attractive for use in genome engineering technologies that require precise targeting. Although this code is used successfully to design TALEs to target specific sequences, off-target binding has been observed and is difficult to predict. Here we explore TALE–DNA interactions comprehensively by quantitatively assaying the DNA-binding specificities of 21 representative TALEs to ∼5,000–20,000 unique DNA sequences per protein using custom-designed protein-binding microarrays (PBMs). We find that protein context features exert significant influences on binding. Thus, the canonical recognition code does not fully capture the complexity of TALE–DNA binding. We used the PBM data to develop a computational model, Specificity Inference For TAL-Effector Design (SIFTED), to predict the DNA-binding specificity of any TALE. We provide SIFTED as a publicly available web tool that predicts potential genomic off-target sites for improved TALE design. PMID:26067805
Automatic spatiotemporal matching of detected pleural thickenings
NASA Astrophysics Data System (ADS)
Chaisaowong, Kraisorn; Keller, Simon Kai; Kraus, Thomas
2014-01-01
Pleural thickenings can be found in the lungs of asbestos-exposed patients. Non-invasive diagnosis including CT imaging can detect aggressive malignant pleural mesothelioma in its early stage. In order to create a quantitative documentation of automatically detected pleural thickenings over time, the differences in volume and thickness of the detected thickenings have to be calculated. Physicians usually estimate the change of each thickening via visual comparison, which provides neither quantitative nor qualitative measures. In this work, automatic spatiotemporal matching techniques for the detected pleural thickenings at two points in time, based on semi-automatic registration, have been developed, implemented, and tested so that the same thickening can be compared fully automatically. As a result, the mapping technique using principal components analysis turns out to be more advantageous than the feature-based mapping using the centroid and mean Hounsfield units of each thickening, since the sensitivity improved to 98.46% from 42.19%, while the accuracy of the feature-based mapping is only slightly higher (84.38% vs. 76.19%).
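As an illustrative aside (not code from the cited work), the principal-components mapping described above can be sketched as follows: each detected thickening is reduced to a centroid plus sorted PCA eigenvalues of its voxel coordinates, and thickenings are matched across the two time points by nearest signature. The weights and the greedy matching strategy are assumptions for illustration.

```python
import numpy as np

def pca_signature(voxels):
    """Centroid and sorted principal-component eigenvalues of one
    thickening's voxel coordinates (N x 3 array, post-registration)."""
    cov = np.cov(voxels - voxels.mean(axis=0), rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
    return voxels.mean(axis=0), eigvals

def match_thickenings(baseline, followup, w_shape=1.0, w_pos=0.1):
    """Greedily match each baseline thickening to the follow-up
    thickening with the closest combined shape/position signature."""
    matches = {}
    for i, vb in enumerate(baseline):
        cb, sb = pca_signature(vb)
        dists = []
        for vf in followup:
            cf, sf = pca_signature(vf)
            dists.append(w_shape * np.linalg.norm(sb - sf)
                         + w_pos * np.linalg.norm(cb - cf))
        matches[i] = int(np.argmin(dists))
    return matches
```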
NASA Astrophysics Data System (ADS)
Kang, Jinbum; Jang, Won Seuk; Yoo, Yangmo
2018-02-01
Ultrafast compound Doppler imaging based on plane-wave excitation (UCDI) can be used to evaluate cardiovascular diseases using high frame rates. In particular, it provides a fully quantifiable flow analysis over a large region of interest with high spatio-temporal resolution. However, the pulse-repetition frequency (PRF) in the UCDI method is limited for high-velocity flow imaging since it has a tradeoff between the number of plane-wave angles (N) and acquisition time. In this paper, we present high PRF ultrafast sliding compound Doppler imaging method (HUSDI) to improve quantitative flow analysis. With the HUSDI method, full scanline images (i.e. each tilted plane wave data) in a Doppler frame buffer are consecutively summed using a sliding window to create high-quality ensemble data so that there is no reduction in frame rate and flow sensitivity. In addition, by updating a new compounding set with a certain time difference (i.e. sliding window step size or L), the HUSDI method allows various Doppler PRFs with the same acquisition data to enable a fully qualitative, retrospective flow assessment. To evaluate the performance of the proposed HUSDI method, simulation, in vitro and in vivo studies were conducted under diverse flow circumstances. In the simulation and in vitro studies, the HUSDI method showed improved hemodynamic representations without reducing either temporal resolution or sensitivity compared to the UCDI method. For the quantitative analysis, the root mean squared velocity error (RMSVE) was measured using 9 angles (-12° to 12°) with L of 1-9, and the results were found to be comparable to those of the UCDI method (L = N = 9), i.e. ⩽0.24 cm s-1, for all L values. For the in vivo study, the flow data acquired from a full cardiac cycle of the femoral vessels of a healthy volunteer were analyzed using a PW spectrogram, and arterial and venous flows were successfully assessed with high Doppler PRF (e.g. 5 kHz at L = 4). These results indicate that the proposed HUSDI method can improve flow visualization and quantification with a higher frame rate, PRF and flow sensitivity in cardiovascular imaging.
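A minimal sketch of the sliding-window compounding idea (the data layout is an assumption, not the authors' implementation): summing every N consecutive tilted plane-wave frames while advancing the window by L frames preserves full compounded ensemble quality while multiplying the effective Doppler PRF by N/L.

```python
import numpy as np

def husdi_ensembles(frames, n_angles, step):
    """frames: (T, H, W) array of beamformed frames cycling through
    n_angles plane-wave tilts. Sum n_angles consecutive frames and
    slide the window by `step` (the paper's L) to form each
    compounded ensemble frame."""
    out = [frames[s:s + n_angles].sum(axis=0)
           for s in range(0, frames.shape[0] - n_angles + 1, step)]
    return np.stack(out)

# step == n_angles reproduces conventional UCDI compounding;
# step < n_angles raises the effective Doppler PRF by n_angles/step.
```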
Kang, Jinbum; Jang, Won Seuk; Yoo, Yangmo
2018-02-09
Ultrafast compound Doppler imaging based on plane-wave excitation (UCDI) can be used to evaluate cardiovascular diseases using high frame rates. In particular, it provides a fully quantifiable flow analysis over a large region of interest with high spatio-temporal resolution. However, the pulse-repetition frequency (PRF) in the UCDI method is limited for high-velocity flow imaging since it has a tradeoff between the number of plane-wave angles (N) and acquisition time. In this paper, we present high PRF ultrafast sliding compound Doppler imaging method (HUSDI) to improve quantitative flow analysis. With the HUSDI method, full scanline images (i.e. each tilted plane wave data) in a Doppler frame buffer are consecutively summed using a sliding window to create high-quality ensemble data so that there is no reduction in frame rate and flow sensitivity. In addition, by updating a new compounding set with a certain time difference (i.e. sliding window step size or L), the HUSDI method allows various Doppler PRFs with the same acquisition data to enable a fully qualitative, retrospective flow assessment. To evaluate the performance of the proposed HUSDI method, simulation, in vitro and in vivo studies were conducted under diverse flow circumstances. In the simulation and in vitro studies, the HUSDI method showed improved hemodynamic representations without reducing either temporal resolution or sensitivity compared to the UCDI method. For the quantitative analysis, the root mean squared velocity error (RMSVE) was measured using 9 angles (-12° to 12°) with L of 1-9, and the results were found to be comparable to those of the UCDI method (L = N = 9), i.e. ⩽0.24 cm s -1 , for all L values. For the in vivo study, the flow data acquired from a full cardiac cycle of the femoral vessels of a healthy volunteer were analyzed using a PW spectrogram, and arterial and venous flows were successfully assessed with high Doppler PRF (e.g. 5 kHz at L = 4). These results indicate that the proposed HUSDI method can improve flow visualization and quantification with a higher frame rate, PRF and flow sensitivity in cardiovascular imaging.
Alves, Antoine; Attik, Nina; Bayon, Yves; Royet, Elodie; Wirth, Carine; Bourges, Xavier; Piat, Alexis; Dolmazon, Gaëlle; Clermont, Gaëlle; Boutrand, Jean-Pierre; Grosgogeat, Brigitte; Gritsch, Kerstin
2018-03-14
The paradigm shift brought about by the expansion of tissue engineering and regenerative medicine away from the use of biomaterials currently questions the value of histopathologic methods in the evaluation of biological changes. To date, the available tools of evaluation are not fully consistent and satisfactory for these advanced therapies. We have developed a new, simple and inexpensive quantitative digital approach that provides key metrics for structural and compositional characterization of the regenerated tissues. For example, metrics provide the tissue ingrowth rate (TIR), which integrates two separate indicators: the cell ingrowth rate (CIR) and the total collagen content (TCC), as featured in the equation TIR% = CIR% + TCC%. Moreover, a subset of quantitative indicators describing the directional organization of the collagen (relating structure and mechanical function of tissues), the ratio of collagen I to collagen III (remodeling quality) and the optical anisotropy property of the collagen (maturity indicator) was automatically assessed as well. Using an image analyzer, all metrics were extracted from only two serial sections stained with either Feulgen & Rossenbeck (cell specific) or Picrosirius Red F3BA (collagen specific). To validate this new procedure, three-dimensional (3D) scaffolds were intraperitoneally implanted in healthy and in diabetic rats. It was hypothesized that quantitatively, the healing tissue would be significantly delayed and of poor quality in diabetic rats in comparison to healthy rats. In addition, a chemically modified 3D scaffold was similarly implanted in a third group of healthy rats with the assumption that modulation of the ingrown tissue would be quantitatively present in comparison to the 3D scaffold-healthy group. After 21 days of implantation, both hypotheses were verified by use of this novel computerized approach. When the two methods were run in parallel, the quantitative results revealed fine details and differences not detected by the semi-quantitative assessment, demonstrating the importance of quantitative analysis in the performance evaluation of soft tissue healing. This automated and supervised method reduced operator dependency and proved to be simple, sensitive, cost-effective and time-effective. It supports objective therapeutic comparisons and helps to elucidate regeneration and the dynamics of a functional tissue.
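The core metric reduces to simple area fractions over the scaffold region of interest; a minimal sketch (binary masks from the two stained serial sections are assumed inputs):

```python
import numpy as np

def ingrowth_metrics(cell_mask, collagen_mask, roi_mask):
    """CIR%, TCC% and TIR% = CIR% + TCC%, computed as percentages of
    the scaffold ROI covered by cells (Feulgen & Rossenbeck section)
    and collagen (Picrosirius Red section), respectively."""
    roi = roi_mask.astype(bool)
    cir = 100.0 * np.logical_and(cell_mask, roi).sum() / roi.sum()
    tcc = 100.0 * np.logical_and(collagen_mask, roi).sum() / roi.sum()
    return cir, tcc, cir + tcc
```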
Temporal Lobe Epilepsy: Quantitative MR Volumetry in Detection of Hippocampal Atrophy
Farid, Nikdokht; Girard, Holly M.; Kemmotsu, Nobuko; Smith, Michael E.; Magda, Sebastian W.; Lim, Wei Y.; Lee, Roland R.
2012-01-01
Purpose: To determine the ability of fully automated volumetric magnetic resonance (MR) imaging to depict hippocampal atrophy (HA) and to help correctly lateralize the seizure focus in patients with temporal lobe epilepsy (TLE). Materials and Methods: This study was conducted with institutional review board approval and in compliance with HIPAA regulations. Volumetric MR imaging data were analyzed for 34 patients with TLE and 116 control subjects. Structural volumes were calculated by using U.S. Food and Drug Administration–cleared software for automated quantitative MR imaging analysis (NeuroQuant). Results of quantitative MR imaging were compared with visual detection of atrophy, and, when available, with histologic specimens. Receiver operating characteristic analyses were performed to determine the optimal sensitivity and specificity of quantitative MR imaging for detecting HA and asymmetry. A linear classifier with cross validation was used to estimate the ability of quantitative MR imaging to help lateralize the seizure focus. Results: Quantitative MR imaging–derived hippocampal asymmetries discriminated patients with TLE from control subjects with high sensitivity (86.7%–89.5%) and specificity (92.2%–94.1%). When a linear classifier was used to discriminate left versus right TLE, hippocampal asymmetry achieved 94% classification accuracy. Volumetric asymmetries of other subcortical structures did not improve classification. Compared with invasive video electroencephalographic recordings, lateralization accuracy was 88% with quantitative MR imaging and 85% with visual inspection of volumetric MR imaging studies but only 76% with visual inspection of clinical MR imaging studies. Conclusion: Quantitative MR imaging can depict the presence and laterality of HA in TLE with accuracy rates that may exceed those achieved with visual inspection of clinical MR imaging studies. Thus, quantitative MR imaging may enhance standard visual analysis, providing a useful and viable means for translating volumetric analysis into clinical practice. © RSNA, 2012 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.12112638/-/DC1 PMID:22723496
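To make the asymmetry-based discrimination concrete, here is a hedged sketch: a normalized left-right volume asymmetry index (a common convention, assumed rather than taken from the paper) is scored against patient/control labels with an ROC curve, and a cutoff is chosen by Youden's J.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

def asymmetry_index(left_vol, right_vol):
    # |L - R| normalized by mean volume; the formula is an assumed convention
    return np.abs(left_vol - right_vol) / ((left_vol + right_vol) / 2.0)

def optimal_cutoff(scores, labels):
    """labels: 1 = TLE patient, 0 = control. Returns the threshold
    maximizing Youden's J, with its sensitivity, specificity and AUC."""
    fpr, tpr, thr = roc_curve(labels, scores)
    j = np.argmax(tpr - fpr)
    return thr[j], tpr[j], 1.0 - fpr[j], auc(fpr, tpr)
```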
The route to chaos for the Kuramoto-Sivashinsky equation
NASA Technical Reports Server (NTRS)
Papageorgiou, Demetrios T.; Smyrlis, Yiorgos
1990-01-01
The results of extensive numerical experiments on the spatially periodic initial value problem for the Kuramoto-Sivashinsky equation are presented. This paper is concerned with the asymptotic nonlinear dynamics as the dissipation parameter decreases and spatio-temporal chaos sets in. To this end the initial condition is taken to be the same for all numerical experiments (a single sine wave is used) and the large time evolution of the system is followed numerically. Numerous computations were performed to establish the existence of windows, in parameter space, in which the solution has the following characteristics as the viscosity is decreased: a steady fully modal attractor to a steady bimodal attractor to another steady fully modal attractor to a steady trimodal attractor to a periodic attractor, to another steady fully modal attractor, to another periodic attractor, to a steady tetramodal attractor, to another periodic attractor having a full sequence of period-doublings (in parameter space) to chaos. Numerous solutions are presented which provide conclusive evidence of the period-doubling cascades which precede chaos for this infinite-dimensional dynamical system. These results permit a computation of the length of subwindows, which in turn provides an estimate for their successive ratios as the cascade develops. A calculation based on the numerical results is also presented to show that the period-doubling sequences found here for the Kuramoto-Sivashinsky equation are in complete agreement with Feigenbaum's universal constant of 4.669201609... . Some preliminary work shows several other windows following the first chaotic one, including periodic, chaotic, and a steady octamodal window; however, the windows shrink too significantly in size to enable concrete quantitative conclusions to be made.
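The Feigenbaum comparison amounts to forming ratios of successive period-doubling intervals; a small sketch (the logistic-map bifurcation points in the example are standard textbook values, not data from this paper):

```python
def feigenbaum_ratios(r):
    """Ratios (r[n-1] - r[n-2]) / (r[n] - r[n-1]) of successive
    period-doubling points, which should approach Feigenbaum's
    constant 4.669201609..."""
    return [(r[n - 1] - r[n - 2]) / (r[n] - r[n - 1])
            for n in range(2, len(r))]

# Logistic-map example: ratios ~4.75, ~4.66, ~4.67 -> delta
print(feigenbaum_ratios([3.0, 3.449490, 3.544090, 3.564407, 3.568759]))
```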
Takeda, Hiroaki; Izumi, Yoshihiro; Takahashi, Masatomo; Paxton, Thanai; Tamura, Shohei; Koike, Tomonari; Yu, Ying; Kato, Noriko; Nagase, Katsutoshi; Shiomi, Masashi; Bamba, Takeshi
2018-05-03
Lipidomics, the mass spectrometry-based comprehensive analysis of lipids, has attracted attention as an analytical approach to provide novel insight into lipid metabolism and to search for biomarkers. However, an ideal method for both comprehensive and quantitative analysis of lipids has not been fully developed. Herein, we have proposed a practical methodology for widely-targeted quantitative lipidome analysis using supercritical fluid chromatography fast-scanning triple-quadrupole mass spectrometry (SFC/QqQMS) and a theoretically calculated comprehensive lipid multiple reaction monitoring (MRM) library. Lipid classes can be separated by SFC with a normal-phase diethylamine-bonded silica column with high resolution, high throughput, and good repeatability. Structural isomers of phospholipids can be monitored by mass spectrometric separation with fatty acyl-based MRM transitions. SFC/QqQMS analysis with an internal standard-dilution method offers quantitative information for both lipid class and individual lipid molecular species in the same lipid class. Additionally, data acquired using this method has advantages including reduction of misidentification and acceleration of data analysis. Using the SFC/QqQMS system, alteration of plasma lipid levels in myocardial infarction-prone rabbits in response to supplementation with eicosapentaenoic acid was observed for the first time. Our developed SFC/QqQMS method represents a potentially useful tool for in-depth studies focused on complex lipid metabolism and biomarker discovery. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
Zhang, Xirui; Daaboul, George G; Spuhler, Philipp S; Dröge, Peter; Ünlü, M Selim
2016-03-14
DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information of DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformational specific protein-DNA interactions.
Quantitative imaging features: extension of the oncology medical image database
NASA Astrophysics Data System (ADS)
Patel, M. N.; Looney, P. T.; Young, K. C.; Halling-Brown, M. D.
2015-03-01
Radiological imaging is fundamental within the healthcare industry and has become routinely adopted for diagnosis, disease monitoring and treatment planning. With the advent of digital imaging modalities and the rapid growth in both diagnostic and therapeutic imaging, the ability to be able to harness this large influx of data is of paramount importance. The Oncology Medical Image Database (OMI-DB) was created to provide a centralized, fully annotated dataset for research. The database contains both processed and unprocessed images, associated data, and annotations and, where applicable, expert-determined ground truths describing features of interest. Medical imaging provides the ability to detect and localize many changes that are important to determine whether a disease is present or a therapy is effective by depicting alterations in anatomic, physiologic, biochemical or molecular processes. Quantitative imaging features are sensitive, specific, accurate and reproducible imaging measures of these changes. Here, we describe an extension to the OMI-DB whereby a range of imaging features and descriptors are pre-calculated using a high-throughput approach. The ability to calculate multiple imaging features and data from the acquired images would be valuable and facilitate further research applications investigating detection, prognosis, and classification. The resultant data store contains more than 10 million quantitative features as well as features derived from CAD predictions. These data can be used to build predictive models to aid image classification, treatment response assessment as well as to identify prognostic imaging biomarkers.
Confocal microscopy imaging of the biofilm matrix.
Schlafer, Sebastian; Meyer, Rikke L
2017-07-01
The extracellular matrix is an integral part of microbial biofilms and an important field of research. Confocal laser scanning microscopy is a valuable tool for the study of biofilms, and in particular of the biofilm matrix, as it allows real-time visualization of fully hydrated, living specimens. Confocal microscopes are held by many research groups, and a number of methods for qualitative and quantitative imaging of the matrix have emerged in recent years. This review provides an overview and a critical discussion of techniques used to visualize different matrix compounds, to determine the concentration of solutes and the diffusive properties of the biofilm matrix. Copyright © 2016 Elsevier B.V. All rights reserved.
Vera-Candioti, Luciana; Culzoni, María J; Olivieri, Alejandro C; Goicoechea, Héctor C
2008-11-01
Drug monitoring in serum samples was performed using second-order data generated by CE-DAD, processed with a suitable chemometric strategy. Carbamazepine could be accurately quantitated in the presence of its main metabolite (carbamazepine epoxide), other therapeutic drugs (lamotrigine, phenobarbital, phenytoin, phenylephrine, ibuprofen, acetaminophen, theophylline, caffeine, acetyl salicylic acid), and additional serum endogenous components. The analytical strategy consisted of the following steps: (i) serum sample clean-up to remove matrix interferences, (ii) data pre-processing, in order to reduce the background and to correct for electrophoretic time shifts, and (iii) resolution of fully overlapped CE peaks (corresponding to carbamazepine, its metabolite, lamotrigine and unexpected serum components) by the well-known multivariate curve resolution-alternating least squares algorithm, which extracts quantitative information that can be uniquely ascribed to the analyte of interest. The analyte concentration in serum samples ranged from 2.00 to 8.00 mg/L. Mean recoveries were 102.6% (s=7.7) for binary samples, and 94.8% (s=13.5) for spiked serum samples, while CV (%)=4.0 was computed for five replicates, indicative of the acceptable accuracy and precision of the proposed method.
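For orientation, multivariate curve resolution-alternating least squares reduces to alternating non-negative least-squares updates of concentration and spectral profiles; a minimal sketch (initialization and constraints are simplified relative to real MCR-ALS implementations):

```python
import numpy as np

def mcr_als(D, C0, n_iter=100):
    """D (time x wavelength) ~= C @ S.T with non-negativity imposed on
    both factors. C0: initial guess of the concentration profiles,
    e.g. from evolving factor analysis."""
    C = C0.copy()
    for _ in range(n_iter):
        # Update spectra S given C, then concentrations C given S,
        # clipping negatives to enforce the non-negativity constraint.
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
    return C, S
```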
Brain tumor segmentation in MR slices using improved GrowCut algorithm
NASA Astrophysics Data System (ADS)
Ji, Chunhong; Yu, Jinhua; Wang, Yuanyuan; Chen, Liang; Shi, Zhifeng; Mao, Ying
2015-12-01
The detection of brain tumors from MR images is very significant for medical diagnosis and treatment. However, existing methods are mostly based on manual or semi-automatic segmentation, which is awkward when dealing with a large number of MR slices. In this paper, a new fully automatic method for the segmentation of brain tumors in MR slices is presented. Based on the hypothesis of a symmetric brain structure, the method improves the interactive GrowCut algorithm by further using the bounding box algorithm in the pre-processing step. More importantly, local reflectional symmetry is used to make up for the deficiency of the bounding box method. After segmentation, a 3D tumor image is reconstructed. We evaluate the accuracy of the proposed method on MR slices with synthetic tumors and actual clinical MR images. Results of the proposed method are compared with the actual position of the simulated 3D tumor qualitatively and quantitatively. In addition, our automatic method produces performance equivalent to manual segmentation and to the interactive GrowCut with manual interference, while providing fully automatic segmentation.
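For reference, the underlying GrowCut automaton is compact; a minimal 2D sketch under simplifying assumptions (4-neighborhood, wrap-around borders via np.roll, and none of the paper's bounding-box or symmetry refinements):

```python
import numpy as np

def growcut(image, seed_labels, n_iter=100):
    """Minimal GrowCut sketch. seed_labels: int map with 1 = tumor
    seed, 2 = background seed, 0 = unlabeled. A neighbor captures a
    pixel when its attack force g * strength exceeds the pixel's
    current strength."""
    img = image.astype(float)
    labels = seed_labels.copy()
    strength = (seed_labels > 0).astype(float)  # seeds start fully confident
    max_diff = np.ptp(img) or 1.0
    for _ in range(n_iter):
        for shift in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            nb_lab = np.roll(labels, shift, axis=(0, 1))
            nb_str = np.roll(strength, shift, axis=(0, 1))
            nb_img = np.roll(img, shift, axis=(0, 1))
            g = 1.0 - np.abs(img - nb_img) / max_diff  # attack force
            attack = g * nb_str
            win = (attack > strength) & (nb_lab > 0)
            labels[win] = nb_lab[win]
            strength[win] = attack[win]
    return labels
```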
2010-01-01
Background Cell motility is a critical parameter in many physiological as well as pathophysiological processes. In time-lapse video microscopy, manual cell tracking remains the most common method of analyzing the migratory behavior of cell populations. In addition to being labor-intensive, this method is susceptible to user-dependent errors regarding the selection of "representative" subsets of cells and the manual determination of precise cell positions. Results We have quantitatively analyzed these error sources, demonstrating that manual cell tracking of pancreatic cancer cells leads to miscalculation of migration rates of up to 410%. In order to provide objective measurements of cell migration rates, we have employed multi-target tracking technologies commonly used in radar applications to develop a fully automated cell identification and tracking system suitable for high-throughput screening of video sequences of unstained living cells. Conclusion We demonstrate that our automatic multi-target tracking system identifies cell objects, follows individual cells and computes migration rates with high precision, clearly outperforming manual procedures. PMID:20377897
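The migration-rate computation itself is straightforward once tracks exist; a minimal sketch (units and sampling interval are assumptions):

```python
import numpy as np

def migration_rate(track, minutes_per_frame):
    """Mean speed of one tracked cell. track: (T, 2) array of x, y
    positions in microns, one row per frame."""
    step_lengths = np.linalg.norm(np.diff(track, axis=0), axis=1)
    return step_lengths.sum() / (minutes_per_frame * (len(track) - 1))
```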
Wiegers, Susan E; Houser, Steven R; Pearson, Helen E; Untalan, Ann; Cheung, Joseph Y; Fisher, Susan G; Kaiser, Larry R; Feldman, Arthur M
2015-08-01
Academic medical centers are faced with increasing budgetary constraints due to a flat National Institutes of Health budget, lower reimbursements for clinical services, higher costs of technology including informatics, and a changing competitive landscape. As such, institutional stakeholders are increasingly asking whether resources are allocated appropriately and whether there are objective methods for measuring faculty contributions and engagement. The complexities of translational research can be particularly challenging when trying to assess faculty contributions because of team science. For over a decade, we have used an objective scoring system called the Matrix to assess faculty productivity and engagement in four areas: research, education, scholarship, and administration or services. The Matrix was developed to be dynamic, quantitative, and able to ensure that a fully engaged educator would have a Matrix score comparable to that of a fully engaged investigator. In this report, we present the Matrix in its current form in order to provide a well-tested objective system of performance evaluation for nonclinical faculty to help academic leaders in decision making. © 2015 Wiley Periodicals, Inc.
Pedersen, Maria; Overgaard, Dorthe; Andersen, Ingelise; Baastrup, Marie; Egerod, Ingrid
2018-05-17
To explore the extent to which the qualitative and quantitative data converge and explain mechanisms and drivers of social inequality in cardiac rehabilitation attendance. Social inequality in cardiac rehabilitation attendance has been a recognized problem for many years. However, to date the mechanisms driving these inequalities are still not fully understood. The study was designed as a convergent mixed methods study. From March 2015 to March 2016, patients hospitalized with acute coronary syndrome at two Danish regional hospitals were included in a quantitative prospective observational study (N=302). Qualitative interview informants (N=24) were sampled from the quantitative study population and half brought a close relative (N=12) for dyadic interviews. Interviews were conducted from August 2015 to February 2016. Integrated analyses were conducted in joint displays by merging the quantitative and qualitative findings. Qualitative and quantitative findings primarily confirmed and expanded each other; however, discordant results were also evident. Integrated analyses identified socially differentiated lifestyles, health beliefs, travel barriers and self-efficacy as potential drivers of social inequality in cardiac rehabilitation. Our study adds empirical evidence regarding how a mixed methods study can be used to obtain an understanding of complex healthcare problems. The study provides new knowledge concerning the mechanisms driving social inequality in cardiac rehabilitation attendance. To prevent social inequality, cardiac rehabilitation should be accommodated to patients with a history of unhealthy behaviour and low self-efficacy. Additionally, the rehabilitation programme should be offered in locations not requiring a long commute. This article is protected by copyright. All rights reserved.
2001-10-25
analyses of electroencephalogram at half-closed eye and fully closed eye. This study aimed at quantitatively estimating the rest rhythm of horses by the...analyses of eyeball movement. A mask fitted with a miniature CCD camera was newly developed. The continuous images of the horse eye for about 24...eyeball area were calculated. As for the results, the fluctuating status of the eyeball area was analyzed quantitatively, and the rest rhythm of horses was
NASA Astrophysics Data System (ADS)
Torres-Verdin, C.
2007-05-01
This paper describes the successful implementation of a new 3D AVA stochastic inversion algorithm to quantitatively integrate pre-stack seismic amplitude data and well logs. The stochastic inversion algorithm is used to characterize flow units of a deepwater reservoir located in the central Gulf of Mexico. Conventional fluid/lithology sensitivity analysis indicates that the shale/sand interface represented by the top of the hydrocarbon-bearing turbidite deposits generates typical Class III AVA responses. On the other hand, layer-dependent Biot-Gassmann analysis shows significant sensitivity of the P-wave velocity and density to fluid substitution. Accordingly, AVA stochastic inversion, which combines the advantages of AVA analysis with those of geostatistical inversion, provided quantitative information about the lateral continuity of the turbidite reservoirs based on the interpretation of inverted acoustic properties (P-velocity, S-velocity, density), and lithotype (sand-shale) distributions. The quantitative use of rock/fluid information through AVA seismic amplitude data, coupled with the implementation of co-simulation via lithotype-dependent multidimensional joint probability distributions of acoustic/petrophysical properties, yields accurate 3D models of petrophysical properties such as porosity and permeability. Finally, by fully integrating pre-stack seismic amplitude data and well logs, the vertical resolution of inverted products is higher than that of deterministic inversion methods.
NASA Astrophysics Data System (ADS)
Wang, Yanqiu; Huang, Xiaorong; Gao, Linyun; Guo, Biying; Ma, Kai
2018-06-01
Water resources are not only basic natural resources, but also strategic economic resources and ecological control factors. Water resources carrying capacity constrains the sustainable development of regional economy and society. Studies of water resources carrying capacity can provide helpful information about how the socioeconomic system is both supported and restrained by the water resources system. Based on the research of different scholars, major problems in the study of water resources carrying capacity were summarized as follows: the definition of water resources carrying capacity is not yet unified; the methods of carrying capacity quantification based on inconsistent definitions are poor in operability; the current quantitative research methods of water resources carrying capacity do not fully reflect the principles of sustainable development; and it is difficult to quantify the relationship among water resources, the economy and society, and the ecological environment. Therefore, it is necessary to develop a better quantitative evaluation method to determine the regional water resources carrying capacity. This paper proposes a new approach to quantifying water resources carrying capacity (that is, through the compilation of the water resources balance sheet) to get a grasp of regional water resources depletion and water environmental degradation (as well as regional water resources stock assets and liabilities), figure out the squeeze of socioeconomic activities on the environment, and discuss the quantitative calculation methods and technical route of water resources carrying capacity which are able to embody the substance of sustainable development.
iTRAQ-Based Quantitative Proteomics of Developing and Ripening Muscadine Grape Berry
Kambiranda, Devaiah; Katam, Ramesh; Basha, Sheikh M.; Siebert, Shalom
2014-01-01
Grapes are among the most widely cultivated fruit crops in the world. Grape berries, like other nonclimacteric fruits, undergo a complex set of dynamic, physical, physiological, and biochemical changes during ripening. Muscadine grapes are widely cultivated in the southern United States for fresh fruit and wine. To date, changes in the metabolite composition of muscadine grapes have been well documented; however, the molecular changes during berry development and ripening are not fully known. The aim of this study was to investigate changes in the berry proteome during ripening in muscadine grape cv. Noble. Isobaric tags for relative and absolute quantification (iTRAQ) MS/MS was used to detect statistically significant changes in the berry proteome. A total of 674 proteins were detected, and 76 were differentially expressed across four time points in muscadine berry. Proteins obtained were further analyzed to provide information about their potential functions during ripening. Several proteins involved in abiotic and biotic stimuli and sucrose and hexose metabolism were upregulated during berry ripening. Quantitative real-time PCR analysis validated the protein expression results for nine proteins. Identification of vicilin-like antimicrobial peptides indicates additional disease tolerance proteins are present in muscadines for berry protection during ripening. The results provide new information for characterizing and understanding the muscadine berry proteome and grape ripening. PMID:24251720
Health risk behaviours amongst school adolescents: protocol for a mixed methods study.
El Achhab, Youness; El Ammari, Abdelghaffar; El Kazdouh, Hicham; Najdi, Adil; Berraho, Mohamed; Tachfouti, Nabil; Lamri, Driss; El Fakir, Samira; Nejjari, Chakib
2016-11-29
Determining the risky behaviours of adolescents provides valuable information for designing appropriate intervention programmes for advancing adolescents' health. However, these behaviours have not been fully addressed by researchers in a comprehensive approach. We report the protocol of a mixed methods study designed to investigate the health risk behaviours of Moroccan adolescents with the goal of identifying suitable strategies to address their health concerns. We used a sequential two-phase explanatory mixed methods study design. The approach begins with the collection of quantitative data, followed by the collection of qualitative data to explain and enrich the quantitative findings. In the first phase, the global school-based student health survey (GSHS) was administered to 800 students who were between 14 and 19 years of age. The second phase engaged adolescents, parents and teachers in focus groups and assessed education documents to explore the level of coverage of health education in the middle school programme. To obtain opinions about strategies to reduce Moroccan adolescents' health risk behaviours, a nominal group technique will be used. The findings of this mixed methods sequential explanatory study provide insights into the risk behaviours that need to be considered if intervention programmes and preventive strategies are to be designed to promote adolescents' health in Moroccan schools.
Mapping Electrical Crosstalk in Pixelated Sensor Arrays
NASA Technical Reports Server (NTRS)
Seshadri, S.; Cole, D. M.; Hancock, B. R.; Smith, R. M.
2008-01-01
Electronic coupling effects such as Inter-Pixel Capacitance (IPC) affect the quantitative interpretation of image data from CMOS, hybrid visible and infrared imagers alike. Existing methods of characterizing IPC do not provide a map of the spatial variation of IPC over all pixels. We demonstrate a deterministic method that provides a direct quantitative map of the crosstalk across an imager. The approach requires only the ability to reset single pixels to an arbitrary voltage, different from the rest of the imager. No illumination source is required. Mapping IPC independently for each pixel is also made practical by the greater S/N ratio achievable for an electrical stimulus than for an optical stimulus, which is subject to both Poisson statistics and diffusion effects of photo-generated charge. The data we present illustrates a more complex picture of IPC in Teledyne HgCdTe and HyViSi focal plane arrays than is presently understood, including the presence of a newly discovered, long range IPC in the HyViSi FPA that extends tens of pixels in distance, likely stemming from extended field effects in the fully depleted substrate. The sensitivity of the measurement approach has been shown to be good enough to distinguish spatial structure in IPC of the order of 0.1%.
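A hedged sketch of how such a per-pixel IPC map could be extracted from the single-pixel-reset data (the normalization convention is an assumption):

```python
import numpy as np

def ipc_kernel(diff_image, row, col, halfwidth=2):
    """Local coupling kernel around one reset pixel. diff_image: the
    frame containing a single pixel reset to a distinct voltage minus
    a uniform reference frame; assumes (row, col) is an interior
    pixel. Each entry is that neighbor's share of the total response."""
    patch = diff_image[row - halfwidth:row + halfwidth + 1,
                       col - halfwidth:col + halfwidth + 1].astype(float)
    return patch / patch.sum()

# Repeating this over every resettable pixel yields a full-array IPC map.
```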
NASA Astrophysics Data System (ADS)
Wahi-Anwar, M. Wasil; Emaminejad, Nastaran; Hoffman, John; Kim, Grace H.; Brown, Matthew S.; McNitt-Gray, Michael F.
2018-02-01
Quantitative imaging in lung cancer CT seeks to characterize nodules through quantitative features, usually from a region of interest delineating the nodule. The segmentation, however, can vary depending on segmentation approach and image quality, which can affect the extracted feature values. In this study, we utilize a fully-automated nodule segmentation method - to avoid reader-influenced inconsistencies - to explore the effects of varied dose levels and reconstruction parameters on segmentation. Raw projection CT images from a low-dose screening patient cohort (N=59) were reconstructed at multiple dose levels (100%, 50%, 25%, 10%), two slice thicknesses (1.0mm, 0.6mm), and a medium kernel. Fully-automated nodule detection and segmentation was then applied, from which 12 nodules were selected. The Dice similarity coefficient (DSC) was used to assess the similarity of the segmentation ROIs of the same nodule across different reconstruction and dose conditions. Nodules at 1.0mm slice thickness and dose levels of 25% and 50% resulted in DSC values greater than 0.85 when compared to 100% dose, with lower dose leading to a lower average and wider spread of DSC values. At 0.6mm, the increased bias and wider spread of DSC values from lowering dose were more pronounced. The effects of dose reduction on DSC for CAD-segmented nodules were similar in magnitude to reducing the slice thickness from 1.0mm to 0.6mm. In conclusion, variation of dose and slice thickness can result in very different segmentations because of noise and image quality. However, there exists some stability in segmentation overlap, as even at 1.0mm, an image at 25% of the low-dose scan still results in segmentations similar to those seen in a full-dose scan.
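The overlap metric used above is the standard Dice similarity coefficient; a minimal sketch:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary segmentations:
    DSC = 2|A & B| / (|A| + |B|); returns 1.0 for two empty masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
```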
Graduate student theses supported by DOE's Environmental Sciences Division
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cushman, Robert M.; Parra, Bobbi M.
1995-07-01
This report provides complete bibliographic citations, abstracts, and keywords for 212 doctoral and master's theses supported fully or partly by the U.S. Department of Energy's Environmental Sciences Division (and its predecessors) in the following areas: Atmospheric Sciences; Marine Transport; Terrestrial Transport; Ecosystems Function and Response; Carbon, Climate, and Vegetation; Information; Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP); Atmospheric Radiation Measurement (ARM); Oceans; National Institute for Global Environmental Change (NIGEC); Unmanned Aerial Vehicles (UAV); Integrated Assessment; Graduate Fellowships for Global Change; and Quantitative Links. Information on the major professor, department, principal investigator, and program area is given for each abstract. Indexes are provided for major professor, university, principal investigator, program area, and keywords. This bibliography is also available in various machine-readable formats (ASCII text file, WordPerfect® files, and PAPYRUS™ files).
A new theoretical approach to analyze complex processes in cytoskeleton proteins.
Li, Xin; Kolomeisky, Anatoly B
2014-03-20
Cytoskeleton proteins are filament structures that support a large number of important biological processes. These dynamic biopolymers exist in nonequilibrium conditions stimulated by hydrolysis chemical reactions in their monomers. Current theoretical methods provide a comprehensive picture of biochemical and biophysical processes in cytoskeleton proteins. However, the description is only qualitative under biologically relevant conditions because utilized theoretical mean-field models neglect correlations. We develop a new theoretical method to describe dynamic processes in cytoskeleton proteins that takes into account spatial correlations in the chemical composition of these biopolymers. Our approach is based on analysis of probabilities of different clusters of subunits. It allows us to obtain exact analytical expressions for a variety of dynamic properties of cytoskeleton filaments. By comparing theoretical predictions with Monte Carlo computer simulations, it is shown that our method provides a fully quantitative description of complex dynamic phenomena in cytoskeleton proteins under all conditions.
Space-Time Data fusion for Remote Sensing Applications
NASA Technical Reports Server (NTRS)
Braverman, Amy; Nguyen, H.; Cressie, N.
2011-01-01
NASA has been collecting massive amounts of remote sensing data about Earth's systems for more than a decade. Missions are selected to be complementary in quantities measured, retrieval techniques, and sampling characteristics, so these datasets are highly synergistic. To fully exploit this, a rigorous methodology for combining data with heterogeneous sampling characteristics is required. For scientific purposes, the methodology must also provide quantitative measures of uncertainty that propagate input-data uncertainty appropriately. We view this as a statistical inference problem. The true but not directly observed quantities form a vector-valued field continuous in space and time. Our goal is to infer those true values or some function of them, and to provide uncertainty quantification for those inferences. We use a spatiotemporal statistical model that relates the unobserved quantities of interest at point-level to the spatially aggregated, observed data. We describe and illustrate our method using CO2 data from two NASA data sets.
The Role of Collaborations in Sustaining an Evidence-Based Intervention to Reduce Child Neglect
Green, Amy E.; Trott, Elise; Willging, Cathleen E.; Finn, Natalie K.; Ehrhart, Mark G.; Aarons, Gregory A.
2016-01-01
Child neglect is the most prevalent form of child maltreatment and represents 79.5% of open child-welfare cases. A recent study found the evidence-based intervention (EBI) SafeCare® (SC) to significantly reduce child neglect recidivism rates. To fully capitalize on the effectiveness of such EBIs, service systems must engage in successful implementation and sustainment; however, little is known regarding what factors influence EBI sustainment. Collaborations among stakeholders are suggested as a means for facilitating EBI implementation and sustainment. This study combines descriptive quantitative survey data with qualitative interview and focus group findings to examine the role of collaboration within the context of public-private partnerships in 11 child welfare systems implementing SC. Participants included administrators of government child welfare systems and community-based organizations, as well as supervisors, coaches, and home visitors of the SC program. Sites were classified as fully-, partially-, and non-sustaining based on implementation fidelity. One-way analysis of variance was used to examine differences in stakeholder reported Effective Collaboration scores across fully-sustaining, partially-sustaining, and non-sustaining sites. Qualitative transcripts were analyzed via open and focused coding to identify the commonality, diversity, and complexity of collaborations involved in implementing and sustaining SC. Fully-sustaining sites reported significantly greater levels of effective collaboration than non-sustaining sites. Key themes described by SC stakeholders included shared vision, building on existing relationships, academic support, problem solving and resource sharing, and maintaining collaborations over time. Both quantitative and qualitative results converge in highlighting the importance of effective collaboration in EBI sustainment in child welfare service systems. PMID:26712422
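The group comparison described above is a standard one-way ANOVA; a minimal sketch with hypothetical scores (the numbers are placeholders, not study data):

```python
from scipy import stats

# Hypothetical Effective Collaboration scores by sustainment status
fully_sustaining = [4.2, 4.5, 3.9, 4.4]
partially_sustaining = [3.6, 3.8, 3.4]
non_sustaining = [2.9, 3.1, 2.7, 3.0]

f_stat, p_value = stats.f_oneway(fully_sustaining,
                                 partially_sustaining,
                                 non_sustaining)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```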
NASA Astrophysics Data System (ADS)
Chiarot, C. B.; Siewerdsen, J. H.; Haycocks, T.; Moseley, D. J.; Jaffray, D. A.
2005-11-01
Development, characterization, and quality assurance of advanced x-ray imaging technologies require phantoms that are quantitative and well suited to such modalities. This note reports on the design, construction, and use of an innovative phantom developed for advanced imaging technologies (e.g., multi-detector CT and the numerous applications of flat-panel detectors in dual-energy imaging, tomosynthesis, and cone-beam CT) in diagnostic and image-guided procedures. The design addresses shortcomings of existing phantoms by incorporating criteria satisfied by no other single phantom: (1) inserts are fully 3D—spherically symmetric rather than cylindrical; (2) modules are quantitative, presenting objects of known size and contrast for quality assurance and image quality investigation; (3) features are incorporated in ideal and semi-realistic (anthropomorphic) contexts; and (4) the phantom allows devices to be inserted and manipulated in an accessible module (right lung). The phantom consists of five primary modules: (1) head, featuring contrast-detail spheres approximate to brain lesions; (2) left lung, featuring contrast-detail spheres approximate to lung nodules; (3) right lung, an accessible hull in which devices may be placed and manipulated; (4) liver, featuring contrast-detail spheres approximate to metastases; and (5) abdomen/pelvis, featuring simulated kidneys, colon, rectum, bladder, and prostate. The phantom represents a two-fold evolution in design philosophy—from 2D (cylindrically symmetric) to fully 3D, and from exclusively qualitative or quantitative to a design accommodating quantitative study within an anatomical context. It has proven a valuable tool in investigations throughout our institution, including low-dose CT, dual-energy radiography, and cone-beam CT for image-guided radiation therapy and surgery.
Preprocessing film-copied MRI for studying morphological brain changes.
Pham, Tuan D; Eisenblätter, Uwe; Baune, Bernhard T; Berger, Klaus
2009-06-15
The magnetic resonance imaging (MRI) of the brain is one of the important data items for studying memory and morbidity in the elderly, as these images can provide useful information through quantitative measures of various regions of interest of the brain. In an effort to fully automate the biomedical analysis of the brain, which can be combined with the genetic data of the same human population, in cases where the records of the original MRI data are missing, this paper presents two effective methods for addressing this imaging problem. The first method handles the restoration of the film-copied MRI. The second method involves the segmentation of the image data. Experimental results and comparisons with other methods suggest the usefulness of the proposed image analysis methodology.
What can posturography tell us about vestibular function?
NASA Technical Reports Server (NTRS)
Black, F. O.
2001-01-01
Patients with balance disorders want answers to the following basic questions: (1) What is causing my problem? and (2) What can be done about my problem? Information to fully answer these questions must include status of both sensory and motor components of the balance control systems. Computerized dynamic posturography (CDP) provides quantitative assessment of both sensory and motor components of postural control along with how the sensory inputs to the brain interact. This paper reviews the scientific basis and clinical applications of CDP. Specifically, studies describing the integration of vestibular inputs with other sensory systems for postural control are briefly summarized. Clinical applications, including assessment, rehabilitation, and management are presented. Effects of aging on postural control along with prevention and management strategies are discussed.
Endoscopic low coherence interferometry in upper airways
NASA Astrophysics Data System (ADS)
Delacrétaz, Yves; Boss, Daniel; Lang, Florian; Depeursinge, Christian
2009-07-01
We introduce Endoscopic Low Coherence Interferometry to obtain the topology of upper airways through commonly used rigid endoscopes. Quantitative dimensioning of upper airway pathologies is crucial to maximizing the chances of health recovery, for example in choosing the correct stent to treat endoluminal obstructing pathologies. Our device is fully compatible with procedures used in day-to-day examinations and can potentially be brought to the bedside. Moreover, the approach described here can be adapted almost straightforwardly to other endoscopy-related fields of interest, such as gastroscopy and arthroscopy. The principle of the method is first exposed, then the filtering procedure used to extract the depth information is described. Finally, the method's ability to operate on biological samples is demonstrated through measurements on ex-vivo pork bronchi.
NASA Technical Reports Server (NTRS)
Shearer, C. K.; Eppler, D.; Farrell, W.; Gruener, J.; Lawrence, S.; Pellis, N.; Spudis, P. D.; Stopar, J.; Zeigler, R.; Neal, C.
2016-01-01
The Lunar Exploration Analysis Group (LEAG) was tasked by the Human Exploration Operations Mission Directorate (HEOMD) to establish a Specific Action Team (SAT) to review lunar Strategic Knowledge Gaps (SKGs) within the context of new lunar data and some specific human mission scenarios. Within this review, the SAT was to identify the SKGs that have been fully or partially retired, identify new SKGs resulting from new data and observations, and review quantitative descriptions of measurements that are required to fill knowledge gaps, the fidelity of the measurements needed, and if relevant, provide examples of existing instruments or potential missions capable of filling the SKGs.
Two-channel Kondo effect and renormalization flow with macroscopic quantum charge states.
Iftikhar, Z; Jezouin, S; Anthore, A; Gennser, U; Parmentier, F D; Cavanna, A; Pierre, F
2015-10-08
Many-body correlations and macroscopic quantum behaviours are fascinating condensed matter problems. A powerful test-bed for the many-body concepts and methods is the Kondo effect, which entails the coupling of a quantum impurity to a continuum of states. It is central in highly correlated systems and can be explored with tunable nanostructures. Although Kondo physics is usually associated with the hybridization of itinerant electrons with microscopic magnetic moments, theory predicts that it can arise whenever degenerate quantum states are coupled to a continuum. Here we demonstrate the previously elusive 'charge' Kondo effect in a hybrid metal-semiconductor implementation of a single-electron transistor, with a quantum pseudospin of 1/2 constituted by two degenerate macroscopic charge states of a metallic island. In contrast to other Kondo nanostructures, each conduction channel connecting the island to an electrode constitutes a distinct and fully tunable Kondo channel, thereby providing unprecedented access to the two-channel Kondo effect and a clear path to multi-channel Kondo physics. Using a weakly coupled probe, we find the renormalization flow, as temperature is reduced, of two Kondo channels competing to screen the charge pseudospin. This provides a direct view of how the predicted quantum phase transition develops across the symmetric quantum critical point. Detuning the pseudospin away from degeneracy, we demonstrate, on a fully characterized device, quantitative agreement with the predictions for the finite-temperature crossover from quantum criticality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jannson, T.
1993-03-01
During the last two decades, there have been dramatic improvements in the development of optical sources. Examples of this development range from semiconductor laser diodes to free electron beam lasers and synchrotron radiation. Before these developments, standards for the measurement of basic optical parameters (quantities) were less demanding. Now, however, there is a fundamental need for new, reliable methods for providing fast quantitative results for a very broad variety of optical systems and sources. This is particularly true for partially coherent optical beams, since all optical sources are either fully or partially spatially coherent (including Lambertian sources). Until now, there has been no satisfactory solution to this problem. During the last two decades, however, the foundations of physical radiometry have been developed by Walther, Wolf and co-workers. By integrating physical optics, statistical optics and conventional radiometry, this body of work provides the necessary tools for the evaluation of radiometric quantities for partially coherent optical beams propagating through optical systems. In this program, Physical Optics Corporation (POC) demonstrated the viability of such a radiometric approach for the specific case of generalized energy concentrators called Liouville transformers. We believe that this radiometric approach is necessary to fully characterize any type of optical system since it takes into account the partial coherence of radiation. 90 refs., 57 figs., 4 tabs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Timchalk, Chuck; Poet, Torka S.; Kousba, Ahmed A.
2004-04-01
There is a need to develop approaches for assessing risk associated with acute exposures to a broad range of chemical agents and to rapidly determine the potential implications to human health. Non-invasive biomonitoring approaches are being developed using reliable portable analytical systems to quantitate dosimetry utilizing readily obtainable body fluids, such as saliva. Saliva has been used to evaluate a broad range of biomarkers, drugs, and environmental contaminants including heavy metals and pesticides. To advance the application of non-invasive biomonitoring, a microfluidic/electrochemical device has also been developed for the analysis of lead (Pb), using square wave anodic stripping voltammetry. The system demonstrates a linear response over a broad concentration range (1-2000 ppb) and is capable of quantitating saliva Pb in rats orally administered acute doses of Pb-acetate. Appropriate pharmacokinetic analyses have been used to quantitate systemic dosimetry based on determination of saliva Pb concentrations. In addition, saliva has recently been used to quantitate dosimetry following exposure to the organophosphate insecticide chlorpyrifos in a rodent model system by measuring the major metabolite, trichloropyridinol, and saliva cholinesterase inhibition following acute exposures. These results suggest that technology developed for non-invasive biomonitoring can provide a sensitive and portable analytical tool capable of assessing exposure and risk in real-time. By coupling these non-invasive technologies with pharmacokinetic modeling it is feasible to rapidly quantitate acute exposure to a broad range of chemical agents. In summary, it is envisioned that once fully developed, these monitoring and modeling approaches will be useful for assessing acute exposure and health risk.
EBM regeneration and changes in EBM component mRNA expression in stromal cells after corneal injury
Santhanam, Abirami; Marino, Gustavo K.; Torricelli, Andre A. M.
2017-01-01
Purpose To investigate the production of epithelial basement membrane (EBM) component mRNAs at time points before lamina lucida and lamina densa regeneration in anterior stromal cells after corneal injury that would heal with and without fibrosis. Methods Rabbit corneas were removed from 2 to 19 days after −4.5D or −9.0D photorefractive keratectomy (PRK) with the VISX S4 IR laser. Corneas were evaluated with transmission electron microscopy (TEM) for full regeneration of the lamina lucida and the lamina densa. Laser capture microdissection (LCM)-based quantitative real-time (RT)-PCR was used to quantitate the expression of mRNAs for laminin α-3 (LAMA3), perlecan, nidogen-1, and nidogen-2 in the anterior stroma. Results After −4.5D PRK, the EBM was found to be fully regenerated at 8 to 10 days after surgery. At 4 days after PRK, the nidogen-2 and LAMA3 mRNAs were detected at statistically significantly lower levels in the anterior stroma of the −9.0D PRK corneas (where the EBM would not fully regenerate) compared to the −4.5D PRK corneas (where the EBM was destined to fully regenerate). At 7 days after PRK, nidogen-2 and LAMA3 mRNAs continued to be statistically significantly lower in the anterior stroma of the −9.0D PRK corneas compared to their expression in the anterior stroma of the −4.5D PRK corneas. Conclusions The key EBM components LAMA3 and nidogen-2 mRNAs are expressed at higher levels in the anterior stroma during EBM regeneration in the −4.5D PRK corneas, where the EBM is destined to fully regenerate and no haze developed, compared to the −9.0D PRK corneas, where the EBM will not fully regenerate and myofibroblast-related stromal fibrosis (haze) will develop. PMID:28275314
Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien
2018-01-01
We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.
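The computational core of stage (4) is an iterative update alternating forward and back projection. Below is a minimal, runnable sketch of an MLEM loop with an additive randoms-plus-scatter term; the toy system matrix and all sizes are invented for illustration and stand in for the ray-traced span-1 projectors described above. This is a conceptual sketch, not the NiftyPET API.

```python
import numpy as np

# Toy MLEM reconstruction illustrating forward/back projection between
# image and sinogram space, with an additive term for randoms + scatter
# as in stages (5)-(6). A is a stand-in system matrix; real scanners use
# ray-traced projectors.
rng = np.random.default_rng(0)
n_vox, n_bins = 64, 256
A = rng.uniform(0.0, 1.0, size=(n_bins, n_vox))   # system matrix (toy)
x_true = rng.uniform(0.5, 2.0, size=n_vox)        # "activity" image
additive = 0.05 * np.ones(n_bins)                 # randoms + scatter
y = rng.poisson(A @ x_true + additive)            # measured sinogram

x = np.ones(n_vox)                                # initial image
sens = A.T @ np.ones(n_bins)                      # sensitivity image
for _ in range(50):                               # MLEM iterations
    proj = A @ x + additive                       # forward project
    x *= (A.T @ (y / proj)) / sens                # back project + update

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```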
Molnár, Borbála; Fodor, Blanka; Boldizsár, Imre; Molnár-Perl, Ibolya
2015-10-20
A novel, quantitative trimethylsilylation approach derivatizing 11 primary phenylalkyl amines (PPAAs), including amphetamine (A) and 3,4-methylenedioxyamphetamine (MDA), is described. Generating the fully derivatized ditrimethylsilyl (diTMS) species with the N-methyl-N-(trimethylsilyl)-trifluoroacetamide (MSTFA) reagent, a new derivatization principle was recognized and followed by GC/MS. In the course of method optimization, the complementary impact of solvents (acetonitrile, ACN; ethyl acetate, ETAC; pyridine, PYR) and catalysts (trimethylchlorosilane, TMCS; trimethyliodosilane, TMIS) was studied: the roles of solvent and catalyst proved to be equally crucial. Optimal, proportional, large responses were obtained with the MSTFA/PYR = 2/1-9/1 (v/v) reagent applying catalysts; A and MDA required TMIS, while the rest of the PPAAs provided the diTMS products also with TMCS. Similar to derivatives generated with hexamethyldisilazane and perfluorocarboxylic acid (HMDS and PFCA) (Molnár et al. Anal. Chem. 2015, 87, 848-852), the fully silylated PPAAs offer several advantages. Both of our methods save time and cost by allowing for direct injection of analytes into the column; this is in stark contrast with the requirement to evaporate acid anhydrides with nitrogen prior to injection. The efficiencies of the novel catalyzed trimethylsilylation (MSTFA) and our recently introduced (now extended to A and MDA) acylation principle were contrasted. Catalyzed trimethylsilylation led to diTMS derivatives resulting in, on average, a 1.7 times larger response compared to the corresponding acylated species. Catalyzed trimethylsilylation of the PPAAs, A, and MDA was characterized with retention, mass fragmentation, and analytical performance properties (R(2) and LOQ values). The practical utility of ditrimethylsilylation was shown by analyzing A in urine and mescaline (MSC) in cactus samples.
Virtual and augmented reality in the treatment of phantom limb pain: A literature review.
Dunn, Justin; Yeo, Elizabeth; Moghaddampour, Parisah; Chau, Brian; Humbert, Sarah
2017-01-01
Phantom limb pain (PLP), the perception of discomfort in a limb no longer present, commonly occurs following amputation. A variety of interventions have been employed for PLP, including mirror therapy. Virtual Reality (VR) and augmented reality (AR) mirror therapy treatments have also been utilized and have the potential to provide an even greater immersive experience for the amputee. However, there is not currently a consensus on the efficacy of VR and AR therapy. The aim of this review is to evaluate and summarize the current research on the effect of immersive VR and AR in the treatment of PLP. A comprehensive literature search was conducted utilizing PubMed and Google Scholar in order to collect all available studies concerning the use of VR and/or AR in the treatment of PLP using the search terms "virtual reality," "augmented reality," and "phantom limb pain." Eight studies in total were evaluated, with six of those reporting quantitative data and the other two reporting qualitative findings. All studies located were of low-level evidence. Each noted improved pain with VR and AR treatment for phantom limb pain, through quantitative or qualitative reporting. Additionally, adverse effects were limited only to simulator sickness occurring in one trial for one patient. Despite the positive findings, all of the studies were confined purely to case studies and case report series. No studies of higher evidence have been conducted, thus considerably limiting the strength of the findings. As such, the current use of VR and AR for PLP management, while attractive due to the increasing levels of immersion, customizable environments, and decreasing cost, is yet to be fully proven and continues to need further research with higher quality studies to fully explore its benefits.
Belda-Palazón, Borja; Nohales, María A.; Rambla, José L.; Aceña, José L.; Delgado, Oscar; Fustero, Santos; Martínez, M. Carmen; Granell, Antonio; Carbonell, Juan; Ferrando, Alejandro
2014-01-01
The eukaryotic translation elongation factor eIF5A is the only protein known to contain the unusual amino acid hypusine, which is essential for its biological activity. This post-translational modification is achieved by the sequential action of the enzymes deoxyhypusine synthase (DHS) and deoxyhypusine hydroxylase (DOHH). The crucial molecular function of eIF5A during translation has recently been elucidated in yeast and is expected to be fully conserved in every eukaryotic cell; however, the functional description of this pathway in plants is still sparse. Genetic approaches with transgenic plants for either eIF5A overexpression or antisense have revealed some activities related to the control of cell death processes, but the molecular details remain to be characterized. One important aspect of fully understanding this pathway is the biochemical description of the hypusine modification system. Here we have used recombinant eIF5A proteins, either modified by hypusination or non-modified, to establish a bi-dimensional electrophoresis (2D-E) profile for the three eIF5A protein isoforms and their hypusinated or unmodified proteoforms present in Arabidopsis thaliana. The combined use of the recombinant 2D-E profile together with 2D-E/western blot analysis of whole plant extracts has provided a quantitative approach to measure the hypusination status of eIF5A. We have used this information to demonstrate that treatment with the hormone abscisic acid alters the hypusine modification system in Arabidopsis thaliana. Overall, this study presents the first biochemical description of the post-translational modification of eIF5A by hypusination, which will be functionally relevant for future studies characterizing this pathway in Arabidopsis thaliana. PMID:24904603
Development of Miniaturized Optimized Smart Sensors (MOSS) for space plasmas
NASA Technical Reports Server (NTRS)
Young, D. T.
1993-01-01
The cost of space plasma sensors is high for several reasons: (1) most are one-of-a-kind and state-of-the-art, (2) the cost of launch to orbit is high, (3) ruggedness and reliability requirements lead to costly development and test programs, and (4) overhead is added by overly elaborate or generalized spacecraft interface requirements. Possible approaches to reducing costs include development of small 'sensors' (defined as including all necessary optics, detectors, and related electronics) that will ultimately lead to cheaper missions by reducing (2), improving (3), and, through work with spacecraft designers, reducing (4). Despite this logical approach, there is no guarantee that smaller sensors are necessarily either better or cheaper. We have previously advocated applying analytical 'quality factors' to plasma sensors (and spacecraft) and have begun to develop miniaturized particle optical systems by applying quantitative optimization criteria. We are currently designing a Miniaturized Optimized Smart Sensor (MOSS) in which miniaturized electronics (e.g., employing new power supply topology and extensive use of gate arrays and hybrid circuits) are fully integrated with newly developed particle optics to give significant savings in volume and mass. The goal of the SwRI MOSS program is development of a fully self-contained and functional plasma sensor weighing 1 lb and requiring 1 W. MOSS will require only a typical spacecraft DC power source (e.g., 30 V) and command/data interfaces in order to be fully functional, and will provide measurement capabilities comparable in most ways to current sensors.
Lozano, Ana; Rajski, Łukasz; Uclés, Samanta; Belmonte-Valles, Noelia; Mezcua, Milagros; Fernández-Alba, Amadeo R
2014-01-01
Two sorbents containing ZrO₂ (Z-Sep and Z-Sep+) were tested as d-SPE clean-ups in combination with the QuEChERS and ethyl acetate multiresidue methods for the extraction of pesticide residues from avocado. All extracts were analysed using gas chromatography coupled with a triple quadrupole mass spectrometer working in multi-reaction monitoring mode. GC-QToF was used to compare the amount of matrix compounds present in the final extracts prepared according to the different protocols. The highest number of pesticides with acceptable recoveries and the lowest amount of coextracted matrix compounds were provided by QuEChERS with Z-Sep. Subsequently, this method was fully validated in avocado and almonds. Validation studies were carried out according to DG SANCO guidelines, including the evaluation of recoveries at two levels (10 and 50 μg/kg), limit of quantitation, linearity, matrix effects, and interday and intraday precision. In avocado, 166 pesticides were fully validated, compared with 119 in almonds. The method operated satisfactorily in routine analysis and was applied to real samples. © 2013 Published by Elsevier B.V.
Early Impact Of CareFirst's Patient-Centered Medical Home With Strong Financial Incentives.
Afendulis, Christopher C; Hatfield, Laura A; Landon, Bruce E; Gruber, Jonathan; Landrum, Mary Beth; Mechanic, Robert E; Zinner, Darren E; Chernew, Michael E
2017-03-01
In 2011 CareFirst BlueCross BlueShield, a large mid-Atlantic health insurance plan, implemented a payment and delivery system reform program. The model, called the Total Care and Cost Improvement Program, includes enhanced payments for primary care, significant financial incentives for primary care physicians to control spending, and care coordination tools to support progress toward the goal of higher-quality and lower-cost patient care. We conducted a mixed-methods evaluation of the initiative's first three years. Our quantitative analyses used spending and utilization data for 2010-13 to compare enrollees who received care from participating physician groups to similar enrollees cared for by nonparticipating groups. Savings were small and fully shared with providers, which suggests no significant effect on total spending (including bonuses). Our qualitative analysis suggested that early in the program, many physicians were not fully engaged with the initiative and did not make full use of its tools. These findings imply that this and similar payment reforms may require greater time to realize significant savings than many stakeholders had expected. Patience may be necessary if payer-led reform is going to lead to system transformation. Project HOPE—The People-to-People Health Foundation, Inc.
ABSTRACT:Only a fraction of chemicals in commerce have been fully assessed for their potential hazards to human health due to difficulties involved in conventional regulatory tests. It has recently been proposed that quantitative transcriptomic data can be used to determine bench...
Vaudano, Enrico; Costantini, Antonella; Garcia-Moruno, Emilia
2016-10-03
The availability of genetically modified (GM) yeasts for winemaking and, in particular, of transgenic strains based on the integration of genetic constructs deriving from other organisms into the genome of Saccharomyces cerevisiae, has been a reality for several years. Despite this, their use is only authorized in a few countries and limited to two strains: ML01, able to convert malic acid into lactic acid during alcoholic fermentation, and ECMo01, suitable for reducing the risk of carbamate production. In this work we propose a quali-quantitative culture-independent method for the detection of the GM yeast ML01 in commercial preparations of ADY (Active Dry Yeast), consisting of efficient extraction of DNA and qPCR (quantitative PCR) analysis based on an event-specific assay targeting the MLC (malolactic cassette) and a taxon-specific S. cerevisiae assay detecting the MRP2 gene. The ADY DNA extraction methodology has been shown to provide DNA of good purity suitable for subsequent qPCR. The MLC and MRP2 qPCR assays showed characteristics of specificity, dynamic range, limit of quantification (LOQ), limit of detection (LOD), precision and trueness that were fully compliant with international reference guidelines. The method has been shown to reliably detect 0.005% (mass/mass) of GM ML01 S. cerevisiae in commercial preparations of ADY. Copyright © 2016 Elsevier B.V. All rights reserved.
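For context, qPCR assays of this kind are quantified against a standard curve relating the quantification cycle (Cq) to the log10 of the template amount; the sketch below shows that arithmetic. All calibration values, and the Cq of the unknown, are invented placeholders, not data from this study.

```python
import numpy as np

# Minimal sketch of qPCR absolute quantification from a standard curve,
# the kind of calibration behind LOD/LOQ claims for assays such as the
# MLC/MRP2 pair above. Calibration points are illustrative only.
log10_copies = np.array([2, 3, 4, 5, 6], dtype=float)  # known standards
cq = np.array([33.1, 29.8, 26.4, 23.1, 19.7])          # measured Cq

slope, intercept = np.polyfit(log10_copies, cq, 1)     # linear fit
efficiency = 10 ** (-1.0 / slope) - 1                  # amplification efficiency

def copies_from_cq(cq_unknown):
    """Invert the standard curve for an unknown sample."""
    return 10 ** ((cq_unknown - intercept) / slope)

print(f"efficiency = {efficiency:.1%}, "
      f"unknown at Cq 31.5 -> {copies_from_cq(31.5):.0f} copies")
```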
Quantitative detection of type A staphylococcal enterotoxin by Laurell electroimmunodiffusion.
Gasper, E; Heimsch, R C; Anderson, A W
1973-03-01
The detection of staphylococcal enterotoxin A by the quantitative technique of electroimmunodiffusion is described. High dilutions of type-specific rabbit antiserum were used in 1% agarose gels, 1 mm thick, prepared in 0.05-μ barbital buffer, pH 8.6. Volumes of 10 μl containing 1.5 to 10 ng of toxin were electrophoresed out of 4-mm diameter wells at 5 mA/cm width of gel. The precipitin cones formed were made visible by first immersing the agarose gels in 0.2 M NaCl and then overlaying the surface with the purified globulin fraction of sheep serum against rabbit globulin, followed by soaking of the gels in 1% aqueous cadmium acetate and staining with 0.1% thiazine red in 1% glacial acetic acid. Fully extended cones, 4 to 23 mm in length depending on toxin concentration and antiserum dilution, were developed in 2 to 5 h of electrophoresis, and visualization was achieved within 2 to 3 h. Because the method is qualitative, quantitative, simple, rapid, and sensitive, it offers a practical tool for the detection of small amounts of bacterial toxins in contaminated foods. The method should also qualify as a sensitive detection device in biochemical procedures which attempt to trace, detect, and identify biological substances in nanogram quantities, provided these substances are antigenic and capable of forming a precipitate with their specific antibodies.
Quantification of Microbial Phenotypes
Martínez, Verónica S.; Krömer, Jens O.
2016-01-01
Metabolite profiling technologies have improved to the point of generating close-to-quantitative metabolomics data, which can be employed to quantitatively describe the metabolic phenotype of an organism. Here, we review the current technologies available for quantitative metabolomics, present their advantages and drawbacks, and discuss the current challenges in generating fully quantitative metabolomics data. Metabolomics data can be integrated into metabolic networks using thermodynamic principles to constrain the directionality of reactions. We explain how to estimate Gibbs energy under physiological conditions, including examples of the estimations, and describe the different methods for thermodynamics-based network analysis. The fundamentals of the methods and how to perform the analyses are described. Finally, an example applying quantitative metabolomics to a yeast model by 13C fluxomics and thermodynamics-based network analysis is presented. The example shows that (1) these two methods are complementary to each other; and (2) there is a need to take into account Gibbs energy errors. Better estimations of metabolic phenotypes will be obtained when further constraints are included in the analysis. PMID:27941694
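The directionality constraint mentioned above rests on the transformed Gibbs energy of reaction evaluated at physiological metabolite levels. A minimal sketch of that calculation follows, with illustrative numbers rather than values from the review:

```python
import numpy as np

# Transformed Gibbs energy under physiological conditions:
# dG' = dG0' + RT ln(Q), with Q the mass-action ratio. Standard Gibbs
# energy and metabolite concentrations (mol/L) are placeholders.
R = 8.314e-3   # kJ/(mol*K)
T = 310.15     # K, physiological temperature

def delta_g_prime(dg0_prime, substrates, products):
    """dG' = dG0' + RT ln(prod([P]) / prod([S]))."""
    q = np.prod(products) / np.prod(substrates)
    return dg0_prime + R * T * np.log(q)

# e.g. a reaction with dG0' = +5 kJ/mol driven forward by mass action
dg = delta_g_prime(5.0, substrates=[1e-2], products=[1e-4])
print(f"dG' = {dg:.1f} kJ/mol -> "
      f"{'forward' if dg < 0 else 'backward'} direction feasible")
```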
Morris, Melody K.; Saez-Rodriguez, Julio; Clarke, David C.; Sorger, Peter K.; Lauffenburger, Douglas A.
2011-01-01
Predictive understanding of cell signaling network operation based on general prior knowledge but consistent with empirical data in a specific environmental context is a current challenge in computational biology. Recent work has demonstrated that Boolean logic can be used to create context-specific network models by training proteomic pathway maps to dedicated biochemical data; however, the Boolean formalism is restricted to characterizing protein species as either fully active or inactive. To advance beyond this limitation, we propose a novel form of fuzzy logic sufficiently flexible to model quantitative data but also sufficiently simple to efficiently construct models by training pathway maps on dedicated experimental measurements. Our new approach, termed constrained fuzzy logic (cFL), converts a prior knowledge network (obtained from literature or interactome databases) into a computable model that describes graded values of protein activation across multiple pathways. We train a cFL-converted network to experimental data describing hepatocytic protein activation by inflammatory cytokines and demonstrate the application of the resultant trained models for three important purposes: (a) generating experimentally testable biological hypotheses concerning pathway crosstalk, (b) establishing capability for quantitative prediction of protein activity, and (c) prediction and understanding of the cytokine release phenotypic response. Our methodology systematically and quantitatively trains a protein pathway map summarizing curated literature to context-specific biochemical data. This process generates a computable model yielding successful prediction of new test data and offering biological insight into complex datasets that are difficult to fully analyze by intuition alone. PMID:21408212
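To make the graded-activation idea concrete, the sketch below implements a normalized Hill curve as the transfer function, combined with min/max fuzzy gates. The parameter values and node names are illustrative, and the exact gate and normalization conventions of the trained cFL models may differ.

```python
import numpy as np

# Graded (fuzzy) transfer function of the kind cFL uses in place of
# Boolean gates: a Hill curve normalized so that full input activation
# (x = 1) maps to full output activation. n and k are fit in training;
# the values here are invented for illustration.
def norm_hill(x, n=3.0, k=0.5):
    """Monotone map [0,1] -> [0,1] with g(0) = 0 and g(1) = 1."""
    return x**n * (1 + k**n) / (x**n + k**n)

def fuzzy_and(a, b):   # min-AND on graded values
    return np.minimum(a, b)

def fuzzy_or(a, b):    # max-OR on graded values
    return np.maximum(a, b)

tnfa, il6 = 0.8, 0.3                     # graded stimulus levels (toy)
nfkb = norm_hill(fuzzy_or(tnfa, il6))    # downstream node activation
print(f"NF-kB activation = {nfkb:.2f}")
```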
Pertuz, Said; McDonald, Elizabeth S; Weinstein, Susan P; Conant, Emily F; Kontos, Despina
2016-04-01
To assess a fully automated method for volumetric breast density (VBD) estimation in digital breast tomosynthesis (DBT) and to compare the findings with those of full-field digital mammography (FFDM) and magnetic resonance (MR) imaging. Bilateral DBT images, FFDM images, and sagittal breast MR images were retrospectively collected from 68 women who underwent breast cancer screening from October 2011 to September 2012 under institutional review board-approved, HIPAA-compliant protocols. A fully automated computer algorithm was developed for quantitative estimation of VBD from DBT images. FFDM images were processed with U.S. Food and Drug Administration-cleared software, and the MR images were processed with a previously validated automated algorithm to obtain corresponding VBD estimates. Pearson correlation and analysis of variance with Tukey-Kramer post hoc correction were used to compare the multimodality VBD estimates. Estimates of VBD from DBT were significantly correlated with FFDM-based and MR imaging-based estimates, with r = 0.83 (95% confidence interval [CI]: 0.74, 0.90) and r = 0.88 (95% CI: 0.82, 0.93), respectively (P < .001). The corresponding correlation between FFDM and MR imaging was r = 0.84 (95% CI: 0.76, 0.90). However, statistically significant differences after post hoc correction (α = 0.05) were found among VBD estimates from FFDM (mean ± standard deviation, 11.1% ± 7.0) relative to MR imaging (16.6% ± 11.2) and DBT (19.8% ± 16.2). Differences between VBD estimates from DBT and MR imaging were not significant (P = .26). Fully automated VBD estimates from DBT, FFDM, and MR imaging are strongly correlated but show statistically significant differences. Therefore, absolute differences in VBD between FFDM, DBT, and MR imaging should be considered in breast cancer risk assessment.
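As a reference for how such correlations and intervals are typically computed, here is a short sketch using Pearson r with a Fisher z confidence interval; the paired VBD values are simulated stand-ins, not study data.

```python
import numpy as np
from scipy import stats

# Pearson r plus a 95% Fisher z-transform CI, the standard recipe
# behind reports like "r = 0.88 (95% CI: 0.82, 0.93)".
rng = np.random.default_rng(1)
vbd_dbt = rng.uniform(5, 50, size=68)              # n = 68 women (toy)
vbd_mri = vbd_dbt + rng.normal(0, 6, size=68)      # correlated noise

r, p = stats.pearsonr(vbd_dbt, vbd_mri)
z = np.arctanh(r)                                  # Fisher transform
se = 1 / np.sqrt(len(vbd_dbt) - 3)
lo, hi = np.tanh(z - 1.96 * se), np.tanh(z + 1.96 * se)
print(f"r = {r:.2f} (95% CI: {lo:.2f}, {hi:.2f}), p = {p:.1e}")
```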
Lupidi, Marco; Coscas, Florence; Cagini, Carlo; Fiore, Tito; Spaccini, Elisa; Fruttini, Daniela; Coscas, Gabriel
2016-09-01
To describe a new automated quantitative technique for displaying and analyzing macular vascular perfusion using optical coherence tomography angiography (OCT-A) and to determine a normative data set, which might be used as a reference for identifying progressive changes due to different retinal vascular diseases. Reliability study. A retrospective review of 47 eyes of 47 consecutive healthy subjects imaged with a spectral-domain OCT-A device was performed in a single institution. Full-spectrum amplitude-decorrelation angiography generated OCT angiograms of the retinal superficial and deep capillary plexuses. Fully automated custom-built software was used to provide quantitative data on the foveal avascular zone (FAZ) features and the total vascular and avascular surfaces. A comparative analysis between central macular thickness (and volume) and FAZ metrics was performed. Repeatability and reproducibility were also assessed in order to establish the feasibility and reliability of the method. The comparative analysis between the superficial capillary plexus and the deep capillary plexus revealed a statistically significant difference (P < .05) in FAZ perimeter, surface, and major axis, and no statistically significant difference (P > .05) in total vascular and avascular surfaces. A linear correlation was demonstrated between central macular thickness (and volume) and the FAZ surface. Coefficients of repeatability and reproducibility were less than 0.4, demonstrating high intraobserver repeatability and interobserver reproducibility for all the examined data. A quantitative approach to retinal vascular perfusion, which is visible on Spectralis OCT angiography, may offer an objective and reliable method for monitoring disease progression in several retinal vascular diseases. Copyright © 2016 Elsevier Inc. All rights reserved.
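The custom software itself is not public, but the reported FAZ metrics can be illustrated from a binary segmentation mask using standard region properties. A sketch under that assumption, with a synthetic mask and an assumed pixel size:

```python
import numpy as np
from skimage import measure

# FAZ surface, perimeter, and major axis from a binary mask, in the
# spirit of the metrics reported above. The elliptical mask and the
# pixel size (3 x 3 mm scan over 256 px) are invented for illustration.
yy, xx = np.mgrid[:256, :256]
faz = ((xx - 128) / 40) ** 2 + ((yy - 128) / 28) ** 2 <= 1  # toy FAZ

props = measure.regionprops(measure.label(faz.astype(int)))[0]
px_mm = 3.0 / 256                     # assumed pixel size in mm
print(f"FAZ surface   : {props.area * px_mm**2:.3f} mm^2")
print(f"FAZ perimeter : {props.perimeter * px_mm:.3f} mm")
print(f"FAZ major axis: {props.major_axis_length * px_mm:.3f} mm")
```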
Grussu, Francesco; Ianus, Andrada; Schneider, Torben; Prados, Ferran; Fairney, James; Ourselin, Sebastien; Alexander, Daniel C.; Cercignani, Mara; Gandini Wheeler‐Kingshott, Claudia A.M.; Samson, Rebecca S.
2017-01-01
Purpose To develop a framework to fully characterize quantitative magnetization transfer (qMT) indices in the human cervical cord in vivo within a clinically feasible time. Methods A dedicated spinal cord imaging protocol for quantitative magnetization transfer was developed using a reduced field-of-view approach with echo planar imaging (EPI) readout. Sequence parameters were optimized based on the Cramér-Rao lower bound. Quantitative model parameters (i.e., bound pool fraction, free and bound pool transverse relaxation times [T2F, T2B], and forward exchange rate [kFB]) were estimated by implementing a numerical model capable of dealing with the novelties of the adopted sequence. The framework was tested on five healthy subjects. Results Cramér-Rao lower bound minimization produces optimal sampling schemes without requiring the establishment of a steady-state MT effect. The proposed framework allows quantitative voxel-wise estimation of model parameters at the resolution typically used for spinal cord imaging (i.e., 0.75 × 0.75 × 5 mm3), with a protocol duration of ~35 min. Quantitative magnetization transfer parametric maps agree with literature values. Whole-cord mean values are: bound pool fraction = 0.11(±0.01), T2F = 46.5(±1.6) ms, T2B = 11.0(±0.2) µs, and kFB = 1.95(±0.06) Hz. Protocol optimization has a beneficial effect on reproducibility, especially for T2B and kFB. Conclusion The framework developed enables robust characterization of spinal cord microstructure in vivo using qMT. Magn Reson Med 79:2576-2588, 2018. © 2017 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of the International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited. PMID:28921614
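For reference, the optimization criterion named above is the Cramér-Rao lower bound, stated here in a generic form (the paper's exact noise model and parameterization may differ):

```latex
% Cramer-Rao lower bound for an unbiased estimator under i.i.d.
% Gaussian noise of variance sigma^2 over M measurements.
\mathrm{Cov}(\hat{\theta}) \succeq F(\theta)^{-1}, \qquad
F_{ij}(\theta) = \frac{1}{\sigma^{2}} \sum_{m=1}^{M}
  \frac{\partial s_{m}(\theta)}{\partial \theta_{i}}\,
  \frac{\partial s_{m}(\theta)}{\partial \theta_{j}}
```

Here s_m(θ) is the predicted qMT signal for the m-th measurement; protocol optimization selects the sampling scheme that minimizes the bound on the parameters of interest.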
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wahi-Anwar, M; Lo, P; Kim, H
Purpose: The use of Quantitative Imaging (QI) methods in clinical trials requires both verification of adherence to a specified protocol and an assessment of scanner performance under that protocol, which are currently accomplished manually. This work introduces automated phantom identification and image QA measure extraction towards a fully-automated CT phantom QA system to perform these functions and facilitate the use of Quantitative Imaging methods in clinical trials. Methods: This study used a retrospective cohort of CT phantom scans from existing clinical trial protocols, totaling 84 phantoms across 3 phantom types, acquired using various scanners and protocols. The QA system identifies the input phantom scan through an ensemble of threshold-based classifiers. Each classifier, corresponding to a phantom type, contains a template slice, which is compared to the input scan on a slice-by-slice basis, resulting in slice-wise similarity metric values for each slice compared. Pre-trained thresholds (established from a training set of phantom images matching the template type) are used to filter the similarity distribution, and the slice with the most optimal local mean similarity, with local neighboring slices meeting the threshold requirement, is chosen as the classifier's matched slice (if one exists). The classifier whose matched slice possesses the most optimal local mean similarity is then chosen as the ensemble's best matching slice. If a best matching slice exists, the image QA algorithm and ROIs corresponding to the matching classifier are used to extract the image QA measures. Results: Automated phantom identification performed with 84.5% accuracy and 88.8% sensitivity on 84 phantoms. Automated image quality measurements (following standard protocol) on identified water phantoms (n=35) matched user QA decisions with 100% accuracy. Conclusion: We provide a fully-automated CT phantom QA system consistent with manual QA performance. Further work will include a parallel component to automatically verify image acquisition parameters and automate adherence to specifications. Institutional research agreement, Siemens Healthcare; Past recipient, research grant support, Siemens Healthcare; Consultant, Toshiba America Medical Systems; Consultant, Samsung Electronics; NIH Grant support from: U01 CA181156.
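A minimal sketch of the slice-matching logic described in Methods follows. The similarity metric (normalized cross-correlation), threshold, and window size are assumptions for illustration, not the authors' exact choices.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two 2-D slices."""
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() /
                 (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def match_slice(volume, template, threshold=0.85, window=2):
    """Best local match whose neighbours also clear the threshold."""
    sims = np.array([ncc(s, template) for s in volume])
    best = None
    for i in range(window, len(sims) - window):
        local = sims[i - window:i + window + 1]
        if local.min() >= threshold:                   # neighbours pass too
            if best is None or local.mean() > best[1]:
                best = (i, local.mean())
    return best                                        # (index, score) or None

rng = np.random.default_rng(2)
vol = rng.normal(scale=0.2, size=(40, 64, 64))
pattern = rng.normal(size=(64, 64))
vol[18:23] += pattern                 # embed 5 template-like slices
print(match_slice(vol, pattern))      # -> matches around slice 20
```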
Lowe, Benjamin M; Sun, Kai; Zeimpekis, Ioannis; Skylaris, Chris-Kriton; Green, Nicolas G
2017-11-06
Field-Effect Transistor sensors (FET-sensors) have been receiving increasing attention for biomolecular sensing over the last two decades due to their potential for ultra-high sensitivity sensing, label-free operation, cost reduction and miniaturisation. Whilst the commercial application of FET-sensors in pH sensing has been realised, their commercial application in biomolecular sensing (termed BioFETs) is hindered by poor understanding of how to optimise device design for highly reproducible operation and high sensitivity. These difficulties stem in part from the highly interdisciplinary nature of the field, in which knowledge of biomolecular-binding kinetics, surface chemistry, electrical double layer physics and electrical engineering is required. In this work, a quantitative analysis and critical review have been performed comparing literature FET-sensor data for pH-sensing with data for sensing of biomolecular streptavidin binding to surface-bound biotin systems. The aim is to provide the first systematic, quantitative comparison of BioFET results for a single biomolecular analyte, specifically streptavidin, which is the most commonly used model protein in biosensing experiments and often used as an initial proof-of-concept for new biosensor designs. This novel quantitative and comparative analysis of the surface potential behaviour of a range of devices demonstrated a strong contrast between the trends observed in pH-sensing and those in biomolecule-sensing. Potential explanations are discussed in detail, and surface-chemistry optimisation is shown to be a vital component in sensitivity enhancement. Factors which can influence the response, yet which have not always been fully appreciated, are explored, and practical suggestions are provided on how to improve experimental design.
Factors Influencing Learning Environments in an Integrated Experiential Program
NASA Astrophysics Data System (ADS)
Koci, Peter
The research conducted for this dissertation examined the learning environment of a specific high school program that delivered the explicit curriculum in an integrated experiential manner, utilizing field and outdoor experiences. The program ran over one semester (five months) and integrated the grade 10 British Columbian curriculum in five subjects. A mixed methods approach was employed to identify the students' perceptions and provide richer descriptions of their experiences related to their unique learning environment. Quantitative instruments were used to assess changes in students' perspectives of their learning environment, as well as other supporting factors including students' mindfulness and behaviours towards the environment. Qualitative data collection included observations, open-ended questions, and impromptu interviews with the teacher. The qualitative data describe the factors and processes that influenced the learning environment and give a richer, deeper interpretation that complements the quantitative findings. The research results showed positive scores on all the quantitative measures conducted, and the qualitative data provided further insight into descriptions of the learning environment constructs that the students perceived as most important. A major finding was that the group cohesion measure was perceived by students as the most important attribute of their preferred learning environment. A flow chart was developed to help the researcher conceptualize how the learning environment, learning process, and outcomes relate to one another in the studied program. Through consideration of this case study, this research attempts to explain how learning environments can influence behavioural change and how an interconnectedness among several factors in the learning process is influenced by the type of learning environment facilitated. Considerably more research is needed in this area to fully understand the complexity of learning environments and how they influence learning and behaviour. Keywords: learning environments; integrated experiential programs; environmental education.
Real-time quantitative fluorescence measurement of microscale cell culture analog systems
NASA Astrophysics Data System (ADS)
Oh, Taek-il; Kim, Donghyun; Tatosian, Daniel; Sung, Jong Hwan; Shuler, Michael
2007-02-01
A microscale cell culture analog (μCCA) is a cell-based lab-on-a-chip assay that, as an animal surrogate, is applied to pharmacological studies for toxicology tests. A μCCA typically comprises multiple chambers and microfluidics that connect the chambers, which represent animal organs and blood flow, to mimic animal metabolism more realistically. A μCCA is expected to provide a tool for high-throughput drug discovery. Previously, a portable fluorescence detection system was investigated for a single μCCA device in real-time. In this study, we present a fluorescence-based imaging system that provides quantitative real-time data of the metabolic interactions in μCCAs, with an emphasis on measuring multiple μCCA samples simultaneously for high-throughput screening. The detection system is based on discrete optics components, with a high-power LED and a charge-coupled device (CCD) camera as light source and detector, for monitoring cellular status in the chambers of each μCCA sample. Multiple samples are scanned mechanically on a motorized linear stage, which is fully automated. Each μCCA sample has four chambers, in which the cell lines MES-SA/DX-5 and MES-SA (tumor cells of human uterus) have been cultured. All cell lines have been transfected to express the fusion protein H2B-GFP, which is a human histone protein fused at the amino terminus to EGFP. As a model cytotoxic drug, 10 μM doxorubicin (DOX) was used. Real-time quantitative data of the intensity loss of enhanced green fluorescent protein (EGFP) during cell death of target cells have been collected over periods of several minutes to 40 hours. Design issues and improvements are also discussed.
Van der Fels-Klerx, Ine H J; Goossens, Louis H J; Saatkamp, Helmut W; Horst, Suzan H S
2002-02-01
This paper presents a protocol for a formal expert judgment process using a heterogeneous expert panel aimed at the quantification of continuous variables. The emphasis is on the process's requirements related to the nature of expertise within the panel, in particular the heterogeneity of both substantive and normative expertise. The process provides the opportunity for interaction among the experts so that they fully understand and agree upon the problem at hand, including qualitative aspects relevant to the variables of interest, prior to the actual quantification task. Individual experts' assessments on the variables of interest, cast in the form of subjective probability density functions, are elicited with a minimal demand for normative expertise. The individual experts' assessments are aggregated into a single probability density function per variable, thereby weighting the experts according to their expertise. Elicitation techniques proposed include the Delphi technique for the qualitative assessment task and the ELI method for the actual quantitative assessment task. Appropriately, the Classical model was used to weight the experts' assessments in order to construct a single distribution per variable. Applying this model, the experts' quality typically was based on their performance on seed variables. An application of the proposed protocol in the broad and multidisciplinary field of animal health is presented. Results of this expert judgment process showed that the proposed protocol in combination with the proposed elicitation and analysis techniques resulted in valid data on the (continuous) variables of interest. In conclusion, the proposed protocol for a formal expert judgment process aimed at the elicitation of quantitative data from a heterogeneous expert panel provided satisfactory results. Hence, this protocol might be useful for expert judgment studies in other broad and/or multidisciplinary fields of interest.
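As an illustration of performance-based weighting in the spirit of the Classical model, the sketch below pools three expert densities with weights proportional to seed-variable scores. Real Classical-model weights combine calibration and information scores, so this is a simplification, and all numbers are invented.

```python
import numpy as np

# Performance-weighted linear pooling of expert densities for one
# continuous variable of interest. Scores and densities are toy values.
grid = np.linspace(0, 100, 501)            # support of the variable

def gaussian_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

expert_pdfs = [gaussian_pdf(grid, 40, 8),   # expert 1's density
               gaussian_pdf(grid, 55, 12),  # expert 2's density
               gaussian_pdf(grid, 48, 5)]   # expert 3's density
scores = np.array([0.10, 0.45, 0.30])       # seed-variable performance

w = scores / scores.sum()                   # normalized weights
pooled = sum(wi * p for wi, p in zip(w, expert_pdfs))
dx = grid[1] - grid[0]
mean = (grid * pooled).sum() * dx           # pooled expectation
print(f"weights = {np.round(w, 2)}, pooled mean = {mean:.1f}")
```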
High-Resolution Digital Two-Color PIV for Turbomachinery Flows
NASA Astrophysics Data System (ADS)
Copenhaver, W.; Gogineni, S.; Goss, L.
1996-11-01
Turbomachinery flows are inherently unsteady. However, steady design methods are currently used to develop turbomachinery, with the lack of basic understanding of unsteady effects being compensated by use of extensive empirical correlations. Conventional laser anemometry provides quantitative evidence of unsteady effects in turbomachinery but is limited in fully exploring this phenomenon. The PIV technique holds great promise for elucidating unsteady flow mechanisms in turbomachinery if obstacles to its application in a transonic turbomachine can be overcome. Implementation involves critical issues such as tracer seeding and optical access for transmitter and receiver. Initially, an 18-in.-dia. axial fan is used to explore these issues. One optical configuration considered is the fiber-optic fanning light sheet in conjunction with high-power pulsed lasers. Instantaneous velocity measurements are made between blades at different spanwise locations.
Integrating multiparametric prostate MRI into clinical practice
2011-01-01
Abstract Multifunctional magnetic resonance imaging (MRI) techniques are increasingly being used to address bottlenecks in prostate cancer patient management. These techniques yield qualitative, semi-quantitative and fully quantitative biomarkers that reflect on the underlying biological status of a tumour. If these techniques are to have a role in patient management, then standard methods of data acquisition, analysis and reporting have to be developed. Effective communication by the use of scoring systems, structured reporting and a graphical interface that matches prostate anatomy are key elements. Practical guidelines for integrating multiparametric MRI into clinical practice are presented. PMID:22187067
Establishing Content Validity for a Literacy Coach Performance Appraisal Instrument
ERIC Educational Resources Information Center
Lane, Mae; Robbins, Mary; Price, Debra
2013-01-01
This study's purpose was to determine whether or not the Literacy Coach Appraisal Instrument developed for use in evaluating literacy coaches had content validity. The study, a fully mixed concurrent equal status design conducted from a pragmatist philosophy, collected qualitative and quantitative data from literacy experts about the elements of…
Examining the Elements of Online Learning Quality in a Fully Online Doctoral Program
ERIC Educational Resources Information Center
Templeton, Nathan R.; Ballenger, Julia N.; Thompson, J. Ray
2015-01-01
The purpose of this descriptive quantitative study was to examine the quality elements of online learning in a regional doctoral program. Utilizing the six quality dimensions of Hathaway's (2009) theory of online learning quality as a framework, the study investigated instructor-learner, learner-learner, learner-content, learner-interface,…
Researching Students with Disabilities: The Importance of Critical Perspectives
ERIC Educational Resources Information Center
Vaccaro, Annemarie; Kimball, Ezekiel W.; Wells, Ryan S.; Ostiguy, Benjamin J.
2014-01-01
In this chapter, the authors critically review the current state of quantitative research on college students with disabilities and examine the exclusion of this marginalized population from much of our research. They propose ways to conduct research that more fully accounts for this diverse and important college population. The authors argue that…
Improving College Students' English Learning with Dr. Eye Android MID
ERIC Educational Resources Information Center
Yang, Ju Yin; Che, Pei-Chun
2015-01-01
This paper investigates college students' English language learning through use of Dr. Eye Android handheld mobile Internet device (MID). Compared to related studies, students' English learning using MIDs has not been evaluated and fully understood in the field of higher education. Quantitatively, the researchers used TOEIC pretest and posttest to…
Cunefare, David; Cooper, Robert F; Higgins, Brian; Katz, David F; Dubra, Alfredo; Carroll, Joseph; Farsiu, Sina
2016-05-01
Quantitative analysis of the cone photoreceptor mosaic in the living retina is potentially useful for early diagnosis and prognosis of many ocular diseases. Non-confocal split detector based adaptive optics scanning light ophthalmoscope (AOSLO) imaging reveals the cone photoreceptor inner segment mosaics often not visualized on confocal AOSLO imaging. Despite recent advances in automated cone segmentation algorithms for confocal AOSLO imagery, quantitative analysis of split detector AOSLO images is currently a time-consuming manual process. In this paper, we present the fully automatic adaptive filtering and local detection (AFLD) method for detecting cones in split detector AOSLO images. We validated our algorithm on 80 images from 10 subjects, showing an overall mean Dice's coefficient of 0.95 (standard deviation 0.03), when comparing our AFLD algorithm to an expert grader. This is comparable to the inter-observer Dice's coefficient of 0.94 (standard deviation 0.04). To the best of our knowledge, this is the first validated, fully-automated segmentation method which has been applied to split detector AOSLO images.
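Dice's coefficient on point detections requires a matching rule. Below is a sketch assuming greedy nearest-neighbour matching within a tolerance radius; the paper's exact matching rule may differ, and all coordinates are simulated.

```python
import numpy as np
from scipy.spatial import cKDTree

# Dice's coefficient between automatically detected and manually marked
# cone coordinates, with one-to-one matching within a tolerance radius.
def dice_points(auto, manual, tol=2.0):
    tree = cKDTree(manual)
    dist, idx = tree.query(auto)          # nearest manual cone per detection
    matched, tp = set(), 0
    for d, j in zip(dist, idx):
        if d <= tol and j not in matched: # greedy one-to-one matching
            tp += 1
            matched.add(j)
    return 2 * tp / (len(auto) + len(manual))

rng = np.random.default_rng(3)
manual = rng.uniform(0, 100, size=(150, 2))              # grader's cones
auto = manual[:140] + rng.normal(0, 0.5, size=(140, 2))  # detector output
print(f"Dice = {dice_points(auto, manual):.3f}")
```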
Hutson, Joel D; Hutson, Kelda N
2013-01-15
Using the extant phylogenetic bracket of dinosaurs (crocodylians and birds), recent work has reported that elbow joint range of motion (ROM) studies of fossil dinosaur forearms may be providing conservative underestimates of fully fleshed in vivo ROM. As humeral ROM occupies a more central role in forelimb movements, the placement of quantitative constraints on shoulder joint ROM could improve fossil reconstructions. Here, we investigated whether soft tissues affect the more mobile shoulder joint in the same manner in which they affect elbow joint ROM in an extant archosaur. This test involved separately and repeatedly measuring humeral ROM in Alligator mississippiensis as soft tissues were dissected away in stages to bare bone. Our data show that the ROMs of humeral flexion and extension, as well as abduction and adduction, both show a statistically significant increase as flesh is removed, but then decrease when the bones must be physically articulated and moved until they separate from one another and/or visible joint surfaces. A similar ROM pattern is inferred for humeral pronation and supination. All final skeletonized ROMs were less than initial fully fleshed ROMs. These results are consistent with previously reported elbow joint ROM patterns from the extant phylogenetic bracket of dinosaurs. Thus, studies that avoid separation of complementary articular surfaces may be providing fossil shoulder joint ROMs that underestimate in vivo ROM in dinosaurs, as well as other fossil archosaurs.
Ihlow, Alexander; Schweizer, Patrick; Seiffert, Udo
2008-01-23
To find candidate genes that potentially influence the susceptibility or resistance of crop plants to powdery mildew fungi, an assay system based on transient-induced gene silencing (TIGS) as well as transient over-expression in single epidermal cells of barley has been developed. However, this system relies on quantitative microscopic analysis of the barley/powdery mildew interaction and will only become a high-throughput tool of phenomics upon automation of the most time-consuming steps. We have developed a high-throughput screening system based on a motorized microscope which evaluates the specimens fully automatically. A large-scale double-blind verification of the system showed an excellent agreement of manual and automated analysis and proved the system to work dependably. Furthermore, in a series of bombardment experiments an RNAi construct targeting the Mlo gene was included, which is expected to phenocopy resistance mediated by recessive loss-of-function alleles such as mlo5. In most cases, the automated analysis system recorded a shift towards resistance upon RNAi of Mlo, thus providing proof of concept for its usefulness in detecting gene-target effects. Besides saving labor and enabling a screening of thousands of candidate genes, this system offers continuous operation of expensive laboratory equipment and provides a less subjective analysis as well as a complete and enduring documentation of the experimental raw data in terms of digital images. In general, it proves the concept of enabling available microscope hardware to handle challenging screening tasks fully automatically.
Brodersen, Jakob; Seehausen, Ole
2014-01-01
While ecological monitoring and biodiversity assessment programs are widely implemented and relatively well developed to survey and monitor the structure and dynamics of populations and communities in many ecosystems, quantitative assessment and monitoring of genetic and phenotypic diversity that is important to understand evolutionary dynamics is only rarely integrated. As a consequence, monitoring programs often fail to detect changes in these key components of biodiversity until after major loss of diversity has occurred. The extensive efforts in ecological monitoring have generated large data sets of unique value to macro-scale and long-term ecological research, but the insights gained from such data sets could be multiplied by the inclusion of evolutionary biological approaches. We argue that the lack of process-based evolutionary thinking in ecological monitoring means a significant loss of opportunity for research and conservation. Assessment of genetic and phenotypic variation within and between species needs to be fully integrated to safeguard biodiversity and the ecological and evolutionary dynamics in natural ecosystems. We illustrate our case with examples from fishes and conclude with examples of ongoing monitoring programs and provide suggestions on how to improve future quantitative diversity surveys. PMID:25553061
Julian, Timothy R; Bustos, Carla; Kwong, Laura H; Badilla, Alejandro D; Lee, Julia; Bischel, Heather N; Canales, Robert A
2018-05-08
Quantitative data on human-environment interactions are needed to fully understand infectious disease transmission processes and conduct accurate risk assessments. Interaction events occur during an individual's movement through, and contact with, the environment, and can be quantified using diverse methodologies. Methods that utilize videography, coupled with specialized software, can provide a permanent record of events, collect detailed interactions in high resolution, be reviewed for accuracy, capture events difficult to observe in real-time, and gather multiple concurrent phenomena. In the accompanying video, the use of specialized software to capture human-environment interactions for human exposure and disease transmission is highlighted. Use of videography, combined with specialized software, allows for the collection of accurate quantitative representations of human-environment interactions in high resolution. Two specialized programs are the Virtual Timing Device for the Personal Computer, which collects sequential microlevel activity time series of contact events and interactions, and LiveTrak, which is optimized to facilitate annotation of events in real-time. Opportunities to annotate behaviors at high resolution using these tools are promising, permitting detailed records that can be summarized to gain information on infectious disease transmission and incorporated into more complex models of human exposure and risk.
Spatio-temporal models of mental processes from fMRI.
Janoos, Firdaus; Machiraju, Raghu; Singh, Shantanu; Morocz, Istvan Ákos
2011-07-15
Understanding the highly complex, spatially distributed and temporally organized phenomena entailed by mental processes using functional MRI is an important research problem in cognitive and clinical neuroscience. Conventional analysis methods focus on the spatial dimension of the data, discarding the information about brain function contained in the temporal dimension. This paper presents a fully spatio-temporal multivariate analysis method using a state-space model (SSM) for brain function that yields not only spatial maps of activity but also its temporal structure, along with spatially varying estimates of the hemodynamic response. Efficient algorithms for estimating the parameters are given, along with quantitative validations. A novel low-dimensional feature-space for representing the data, based on a formal definition of functional similarity, is derived. Quantitative validation of the model and the estimation algorithms is provided with a simulation study. Using a real fMRI study of mental arithmetic, the ability of this neurophysiologically inspired model to represent the spatio-temporal information corresponding to mental processes is demonstrated. Moreover, by comparing the models across multiple subjects, natural patterns in mental processes organized according to different mental abilities are revealed. Copyright © 2011 Elsevier Inc. All rights reserved.
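As a concrete, much-simplified instance of a state-space model, the sketch below simulates and filters a scalar linear-Gaussian system with a Kalman filter. The paper's SSM for fMRI is substantially richer (hidden mental states, spatial maps, hemodynamic response estimation), so this shows only the skeleton of the idea.

```python
import numpy as np

# Scalar linear-Gaussian SSM: x_t = a x_{t-1} + w_t, y_t = c x_t + v_t.
# The latent state stands in for neural activity driving a noisy
# BOLD-like observation; all parameters are toy values.
rng = np.random.default_rng(4)
T, a, q, c, r = 200, 0.95, 0.1, 1.0, 0.5
x, y = np.zeros(T), np.zeros(T)
for t in range(1, T):                          # simulate the system
    x[t] = a * x[t - 1] + rng.normal(0, np.sqrt(q))
    y[t] = c * x[t] + rng.normal(0, np.sqrt(r))

xh, P = 0.0, 1.0                               # filter mean and variance
est = np.zeros(T)
for t in range(T):
    xh, P = a * xh, a * a * P + q              # predict
    k = P * c / (c * c * P + r)                # Kalman gain
    xh, P = xh + k * (y[t] - c * xh), (1 - k * c) * P   # update
    est[t] = xh

print("RMSE:", np.sqrt(np.mean((est - x) ** 2)))
```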
The role of 3-D interactive visualization in blind surveys of H I in galaxies
NASA Astrophysics Data System (ADS)
Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Oosterloo, T. A.; Ramatsoku, M.; Verheijen, M. A. W.
2015-09-01
Upcoming H I surveys will deliver large datasets, and automated processing using the full 3-D information (two positional dimensions and one spectral dimension) to find and characterize H I objects is imperative. In this context, visualization is an essential tool for enabling qualitative and quantitative human control of an automated source finding and analysis pipeline. We discuss how Visual Analytics, the combination of automated data processing and human reasoning, creativity and intuition, supported by interactive visualization, enables flexible and fast interaction with the 3-D data, helping the astronomer to deal with the analysis of complex sources. 3-D visualization, coupled to modeling, provides additional capabilities helping the discovery and analysis of subtle structures in the 3-D domain. The requirements for a fully interactive visualization tool are: coupled 1-D/2-D/3-D visualization, quantitative and comparative capabilities, combined with supervised semi-automated analysis. Moreover, the source code must have the following characteristics for enabling collaborative work: open, modular, well documented, and well maintained. We review four state-of-the-art 3-D visualization packages, assessing their capabilities and feasibility for use with 3-D astronomical data.
Wu, Q; Zhao, X; You, H
2017-05-18
This study aimed to test the diagnostic performance of a fully quantitative fibrosis assessment tool for liver fibrosis in patients with chronic hepatitis B (CHB), primary biliary cirrhosis (PBC) and non-alcoholic steatohepatitis (NASH). A total of 117 patients with liver fibrosis were included in this study, including 50 patients with CHB, 49 patients with PBC and 18 patients with NASH. All patients underwent liver biopsy (LB). Fibrosis stages were assessed by two experienced pathologists. Histopathological images of LB slices were processed by second harmonic generation (SHG)/two-photon excited fluorescence (TPEF) microscopy without staining, using a system called qFibrosis (quantitative fibrosis). Altogether 101 quantitative features of the SHG/TPEF images were acquired. The parameters of aggregated collagen in portal, septal and fibrillar areas increased significantly with stages of liver fibrosis in PBC and CHB (P<0.05), but the same was not found for parameters of distributed collagen (P>0.05). There was a significant correlation between parameters of aggregated collagen in portal, septal and fibrillar areas and stages of liver fibrosis from CHB and PBC (P<0.05), but no correlation was found between the distributed collagen parameters and the stages of liver fibrosis from those patients (P>0.05). There was no significant correlation between NASH parameters and stages of fibrosis (P>0.05). For CHB and PBC patients, the highest correlation was between septal parameters and fibrosis stages, the second highest was between portal parameters and fibrosis stages, and the lowest was between fibrillar parameters and fibrosis stages. In PBC, the correlation between the septal parameters and fibrosis stages was significantly higher than that of the parameters of the other two areas (P<0.05). The qFibrosis candidate parameters based on CHB were also applicable for quantitative analysis of liver fibrosis in PBC patients. Different parameters should be selected for liver fibrosis assessment in different stages of PBC compared with CHB.
NASA Astrophysics Data System (ADS)
Tao, Yu; Muller, Jan-Peter; Poole, William
2016-12-01
We present a wide range of research results in the area of orbit-to-orbit and orbit-to-ground data fusion, achieved within the EU-FP7 PRoVisG and EU-FP7 PRoViDE projects. We focus on examples from three Mars rover missions, i.e. MER-A/B and MSL, to provide examples of a new fully automated offline method for rover localisation. We start by introducing the mis-registration discovered between the current HRSC and HiRISE datasets. Then we introduce the HRSC to CTX and CTX to HiRISE co-registration workflow. Finally, we demonstrate results of wide baseline stereo reconstruction with fixed mast position rover stereo imagery and its application to ground-to-orbit co-registration with a HiRISE orthorectified image. We show examples of the quantitative assessment of recomputed rover traverses, and further exploitation of the co-registered datasets in visualisation and within an interactive web-GIS.
Atomistic and coarse-grained computer simulations of raft-like lipid mixtures.
Pandit, Sagar A; Scott, H Larry
2007-01-01
Computer modeling can provide insights into the existence, structure, size, and thermodynamic stability of localized raft-like regions in membranes. However, the challenges in the construction and simulation of accurate models of heterogeneous membranes are great. The primary obstacle in modeling the lateral organization within a membrane is the relatively slow lateral diffusion rate for lipid molecules. Microsecond or longer time-scales are needed to fully model the formation and stability of a raft in a membrane. Atomistic simulations currently are not able to reach this scale, but they do provide quantitative information on the intermolecular forces and correlations that are involved in lateral organization. In this chapter, the steps needed to carry out and analyze atomistic simulations of hydrated lipid bilayers having heterogeneous composition are outlined. It is then shown how the data from a molecular dynamics simulation can be used to construct a coarse-grained model for the heterogeneous bilayer that can predict the lateral organization and stability of rafts at up to millisecond time-scales.
Nijran, Kuldip S; Houston, Alex S; Fleming, John S; Jarritt, Peter H; Heikkinen, Jari O; Skrypniuk, John V
2014-07-01
In this second UK audit of quantitative parameters obtained from renography, phantom simulations were used in cases in which the 'true' values could be estimated, allowing the accuracy of the parameters measured to be assessed. A renal physical phantom was used to generate a set of three phantom simulations (six kidney functions) acquired on three different gamma camera systems. A total of nine phantom simulations and three real patient studies were distributed to UK hospitals participating in the audit. Centres were asked to provide results for the following parameters: relative function and time-to-peak (whole kidney and cortical region). As with previous audits, a questionnaire collated information on methodology. Errors were assessed as the root mean square deviation from the true value. Sixty-one centres responded to the audit, with some hospitals providing multiple sets of results. Twenty-one centres provided a complete set of parameter measurements. Relative function and time-to-peak showed a reasonable degree of accuracy and precision in most UK centres. The overall average root mean square deviation from the true value was 7.7% for the whole-kidney time-to-peak measurement and 4.5% for the relative function measurement. These results showed a measure of consistency in the relative function and time-to-peak that was similar to the results reported in a previous renogram audit by our group. Analysis of audit data suggests a reasonable degree of accuracy in the quantification of renography function using relative function and time-to-peak measurements. However, it is reasonable to conclude that the objectives of the audit could not be fully realized because of the limitations of the mechanical phantom in providing true values for renal parameters.
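The audit's error metric is straightforward to reproduce: the root mean square deviation of centre results from the phantom's simulated true value. A minimal sketch with hypothetical numbers (not the audit data):

```python
import numpy as np

def rmsd(measured, true_value):
    """Root mean square deviation from the phantom 'true' value,
    in the same percentage units as the audited parameter."""
    measured = np.asarray(measured, dtype=float)
    return np.sqrt(np.mean((measured - true_value) ** 2))

# Hypothetical relative-function results (%) from five centres for one
# phantom kidney whose simulated true relative function is 50%.
print(f"RMSD = {rmsd([48.2, 51.5, 46.9, 55.0, 50.3], 50.0):.1f}%")
```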
NASA Astrophysics Data System (ADS)
Hardie, Russell C.; Rucci, Michael A.; Dapore, Alexander J.; Karch, Barry K.
2017-07-01
We present a block-matching and Wiener filtering approach to atmospheric turbulence mitigation for long-range imaging of extended scenes. We evaluate the proposed method, along with some benchmark methods, using simulated and real-image sequences. The simulated data are generated with a simulation tool developed by one of the authors. These data provide objective truth and allow for quantitative error analysis. The proposed turbulence mitigation method takes a sequence of short-exposure frames of a static scene and outputs a single restored image. A block-matching registration algorithm is used to provide geometric correction for each of the individual input frames. The registered frames are then averaged, and the average image is processed with a Wiener filter to provide deconvolution. An important aspect of the proposed method lies in how we model the degradation point spread function (PSF) for the purposes of Wiener filtering. We use a parametric model that takes into account the level of geometric correction achieved during image registration. This is unlike any method we are aware of in the literature. By matching the PSF to the level of registration in this way, the Wiener filter is able to fully exploit the reduced blurring achieved by registration. We also describe a method for estimating the atmospheric coherence diameter (or Fried parameter) from the estimated motion vectors. We provide a detailed performance analysis that illustrates how the key tuning parameters impact system performance. The proposed method is relatively simple computationally, yet it has excellent performance in comparison with state-of-the-art benchmark methods in our study.
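The restoration chain described above (geometric correction of each frame, averaging, then Wiener deconvolution) can be sketched compactly. The sketch below is a schematic reconstruction, not the authors' code: a global phase-correlation shift stands in for the per-block matching, and the PSF is passed in directly rather than derived from the paper's registration-aware parametric model.

```python
import numpy as np
from numpy.fft import fft2, ifft2

def register_global_shift(frame, reference):
    """Phase-correlation global shift; stands in for the paper's per-block matching."""
    cross = fft2(reference) * np.conj(fft2(frame))
    corr = np.abs(ifft2(cross / (np.abs(cross) + 1e-12)))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    dy = dy - frame.shape[0] if dy > frame.shape[0] // 2 else dy
    dx = dx - frame.shape[1] if dx > frame.shape[1] // 2 else dx
    return np.roll(frame, (dy, dx), axis=(0, 1))

def wiener_deconvolve(image, psf, nsr=0.01):
    """Frequency-domain Wiener filter; psf is a centred kernel with the same
    shape as the image, nsr a scalar noise-to-signal ratio."""
    H = fft2(np.fft.ifftshift(psf))
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(ifft2(fft2(image) * W))

def mitigate_turbulence(frames, psf):
    reference = frames[0]
    registered = [register_global_shift(f, reference) for f in frames]
    fused = np.mean(registered, axis=0)   # average the registered frames
    return wiener_deconvolve(fused, psf)  # then deconvolve with the modelled PSF
```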
Quantitative mass spectrometry: an overview
NASA Astrophysics Data System (ADS)
Urban, Pawel L.
2016-10-01
Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS lies in the fact that it enables direct identification of molecules based on their mass-to-charge ratios and fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.
The NIST Quantitative Infrared Database
Chu, P. M.; Guenther, F. R.; Rhoderick, G. C.; Lafferty, W. J.
1999-01-01
With the recent developments in Fourier transform infrared (FTIR) spectrometers it is becoming more feasible to place these instruments in field environments. As a result, there has been an enormous increase in the use of FTIR techniques for a variety of qualitative and quantitative chemical measurements. These methods offer the possibility of fully automated real-time quantitation of many analytes; therefore FTIR has great potential as an analytical tool. Recently, the U.S. Environmental Protection Agency (U.S. EPA) has developed protocol methods for emissions monitoring using both extractive and open-path FTIR measurements. Depending upon the analyte, the experimental conditions and the analyte matrix, approximately 100 of the hazardous air pollutants (HAPs) listed in the 1990 U.S. EPA Clean Air Act amendment (CAAA) can be measured. The National Institute of Standards and Technology (NIST) has initiated a program to provide quality-assured infrared absorption coefficient data based on NIST-prepared primary gas standards. Currently, absorption coefficient data have been acquired for approximately 20 of the HAPs. For each compound, the absorption coefficient spectrum was calculated using nine transmittance spectra at 0.12 cm−1 resolution and the Beer's law relationship. The uncertainties in the absorption coefficient data were estimated from the linear regressions of the transmittance data and considerations of other error sources such as the nonlinear detector response. For absorption coefficient values greater than 1 × 10−4 (μmol/mol)−1 m−1, the average relative expanded uncertainty is 2.2%. This quantitative infrared database is currently an ongoing project at NIST. Additional spectra will be added to the database as they are acquired. Our current plans include continued data acquisition of the compounds listed in the CAAA, as well as the compounds that contribute to global warming and ozone depletion.
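As a worked illustration of the regression underlying the absorption coefficient data: Beer's law gives absorbance A = −log10 T = α·c·l, linear in the concentration-pathlength product c·l, so α at each wavenumber follows from regressing A against c·l across the set of standard spectra. A sketch with synthetic values (the NIST uncertainty treatment is not reproduced):

```python
import numpy as np

# Hypothetical standards: concentration-pathlength products c*l in (umol/mol)*m.
cl = np.linspace(10.0, 90.0, 9)
alpha_true = 2.5e-4              # assumed coefficient, (umol/mol)^-1 m^-1
T = 10 ** (-alpha_true * cl)     # Beer's law: T = 10^(-alpha * c * l)
T *= 1 + np.random.default_rng(0).normal(0, 1e-3, T.size)  # measurement noise

A = -np.log10(T)                 # absorbance from transmittance
# Linear regression through the origin: alpha = sum(A*cl) / sum(cl^2)
alpha_hat = np.sum(A * cl) / np.sum(cl ** 2)
print(f"estimated alpha = {alpha_hat:.3e} (umol/mol)^-1 m^-1")
```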
Yin, Xiao-Li; Gu, Hui-Wen; Liu, Xiao-Lu; Zhang, Shan-Hui; Wu, Hai-Long
2018-03-05
Multiway calibration in combination with spectroscopic techniques is an attractive tool for online or real-time monitoring of target analyte(s) in complex samples. However, choosing a suitable multiway calibration method for the resolution of spectroscopic-kinetic data remains a challenging problem in practice. In this work, for the first time, three-way and four-way fluorescence-kinetic data arrays were generated during the real-time monitoring of the hydrolysis of irinotecan (CPT-11) in human plasma by excitation-emission matrix fluorescence. Alternating normalization-weighted error (ANWE) and alternating penalty trilinear decomposition (APTLD) were used as three-way calibration for the decomposition of the three-way kinetic data array, whereas alternating weighted residual constraint quadrilinear decomposition (AWRCQLD) and alternating penalty quadrilinear decomposition (APQLD) were applied as four-way calibration to the four-way kinetic data array. The quantitative results of the two kinds of calibration models were fully compared from the perspective of predicted real-time concentrations, spiked recoveries of initial concentration, and analytical figures of merit. The comparison study demonstrated that both three-way and four-way calibration models could achieve real-time quantitative analysis of the hydrolysis of CPT-11 in human plasma under certain conditions. However, it was also found that each possesses critical advantages and shortcomings during dynamic analysis. The conclusions obtained in this paper can provide helpful guidance for the reasonable selection of multiway calibration models to achieve the real-time quantitative analysis of target analyte(s) in complex dynamic systems. Copyright © 2017 Elsevier B.V. All rights reserved.
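The methods compared above are penalty- or weight-modified variants of alternating least squares (ALS) fitting of a (quadri)linear model. For orientation, a bare-bones trilinear ALS update for a three-way array, without the ANWE/APTLD/AWRCQLD/APQLD penalty schemes, might look like this (rank = assumed number of fluorescent components):

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of A (I x r) and B (J x r) -> (I*J x r)."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def parafac_als(X, rank, n_iter=200, seed=0):
    """Plain trilinear ALS for a 3-way array X (I x J x K)."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X0 = X.reshape(I, -1)                    # mode-1 unfolding
    X1 = np.moveaxis(X, 1, 0).reshape(J, -1) # mode-2 unfolding
    X2 = np.moveaxis(X, 2, 0).reshape(K, -1) # mode-3 unfolding
    for _ in range(n_iter):
        A = X0 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = X1 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = X2 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C
```

In calibration use, the loadings of the concentration/sample mode would then be regressed against known standard concentrations to predict unknowns.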
Leong, James; McAuslane, Neil; Walker, Stuart; Salek, Sam
2013-09-01
To explore the current status of and need for a universal benefit-risk framework for medicines in regulatory agencies and pharmaceutical companies. A questionnaire was developed and sent to 14 mature regulatory agencies and 24 major companies. The data were analysed using descriptive statistics, preceded for a minority of questions by manual grouping of the responses. Overall response rate was 82%, and study participants included key decision makers from agencies and companies. None used a fully quantitative system, most companies preferring a qualitative method. The major reason this group gave for not using semi-quantitative or quantitative systems was the lack of a universal and scientifically validated framework. The main advantages of a benefit-risk framework were that it provided a systematic standardised approach to decision-making and that it acted as a tool to enhance the quality of communication. It was also reported that a framework should be of value to both agencies and companies throughout the life cycle of a product. They believed that it is possible to develop an overarching benefit-risk framework and that relevant stakeholders should be involved in the development, validation and application of such a universal framework. The entire cohort indicated that the common barriers to implementing a framework were resource limitations and the lack of knowledge and of a scientifically validated, acceptable framework. Stakeholders prefer a semi-quantitative, overarching framework that incorporates a toolbox of different methodologies. A coordinating committee of relevant stakeholders should be formed to guide its development and implementation. Through engaging the stakeholders, these outcomes confirm sentiments about and the need for developing a universal benefit-risk assessment framework. Copyright © 2013 John Wiley & Sons, Ltd.
Wood-adhesive bonding failure : modeling and simulation
Zhiyong Cai
2010-01-01
The mechanism of wood bonding failure when exposed to wet conditions or wet/dry cycles is not fully understood, and the role of the resulting internal stresses exerted upon the wood-adhesive bondline has yet to be quantitatively determined. Unlike previous modeling, this study developed a new two-dimensional internal-stress model on the basis of the mechanics of...
ERIC Educational Resources Information Center
Hwang, Eunjin; Smith, Rachel N.; Byers, Valerie Tharp; Dickerson, Shirley; McAlister-Shields, Leah; Onwuegbuzie, Anthony J.; Benge, Cindy
2015-01-01
The non-completion of doctoral degrees has been a concern due to its economic, social, and personal consequences. In the current study, the researchers investigated perceived barriers of select doctoral students in completing their doctoral degrees by utilizing a fully mixed sequential mixed research design. The quantitative and qualitative data…
Integrated Approach To Design And Analysis Of Systems
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Iverson, David L.
1993-01-01
Object-oriented fault-tree representation unifies evaluation of reliability and diagnosis of faults. Programming/fault tree described more fully in "Object-Oriented Algorithm For Evaluation Of Fault Trees" (ARC-12731). Augmented fault tree object contains more information than fault tree object used in quantitative analysis of reliability. Additional information needed to diagnose faults in system represented by fault tree.
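For illustration only, a minimal object-oriented fault tree in which each node carries quantitative data (failure probability, assuming independent basic events) plus a diagnostic annotation of the kind the augmented tree adds; this generic sketch is not the ARC-12731 algorithm:

```python
class FaultTreeNode:
    """Basic event or gate; gates combine child failure probabilities."""
    def __init__(self, name, gate=None, prob=0.0, children=None, diagnostic=""):
        self.name = name
        self.gate = gate              # None for basic events, 'AND' or 'OR' for gates
        self.prob = prob              # used only by basic events
        self.children = children or []
        self.diagnostic = diagnostic  # extra info for fault diagnosis (augmented tree)

    def probability(self):
        if self.gate is None:
            return self.prob
        p = [c.probability() for c in self.children]
        if self.gate == 'AND':        # all children must fail (independence assumed)
            out = 1.0
            for pi in p:
                out *= pi
            return out
        out = 1.0                     # OR gate: 1 - product of survival probabilities
        for pi in p:
            out *= (1.0 - pi)
        return 1.0 - out

# Hypothetical system: top event fails if either basic event occurs.
pump = FaultTreeNode('pump fails', prob=1e-3, diagnostic='check pump pressure log')
valve = FaultTreeNode('valve stuck', prob=5e-4, diagnostic='inspect actuator')
top = FaultTreeNode('loss of coolant', gate='OR', children=[pump, valve])
print(f"top-event probability = {top.probability():.2e}")
```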
Does Home Visiting Benefit Only First-Time Mothers?: Evidence from Healthy Families Virginia
ERIC Educational Resources Information Center
Huntington, Lee; Galano, Joseph
2013-01-01
It is a common assumption that mothers who have had previous births would participate less fully and have poorer outcomes from early home visitation programs than would first-time mothers. The authors conducted a qualitative and quantitative study to test that assumption by measuring three aspects of participation: time in the program, the number…
Combining Efforts to Encourage Student Research in Collaborative Quantitative Fields
ERIC Educational Resources Information Center
Nadolski, Jeremy; Smith, Lee Ann
2010-01-01
As technology and science advance, the boundary between the disciplines begins to blur, emphasizing that it is now, more than ever, a requirement to have a solid background in multiple fields to fully understand emerging scientific advances. As faculty, we need to equip our undergraduate students not only with an introduction to these modern…
USDA-ARS?s Scientific Manuscript database
Plants must respond to environmental cues and schedule their development in order to react to periods of abiotic stress and commit fully to growth and reproduction under favorable conditions. This study was initiated to identify SNP markers for characters expressed from the seedling stage to plant m...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Chuan, E-mail: chuan.huang@stonybrookmedicine.edu; Department of Radiology, Harvard Medical School, Boston, Massachusetts 02115; Departments of Radiology, Psychiatry, Stony Brook Medicine, Stony Brook, New York 11794
2015-02-15
Purpose: Degradation of image quality caused by cardiac and respiratory motions hampers the diagnostic quality of cardiac PET. It has been shown that improved diagnostic accuracy of myocardial defect can be achieved by tagged MR (tMR) based PET motion correction using simultaneous PET-MR. However, one major hurdle for the adoption of tMR-based PET motion correction in the PET-MR routine is the long acquisition time needed for the collection of fully sampled tMR data. In this work, the authors propose an accelerated tMR acquisition strategy using parallel imaging and/or compressed sensing and assess the impact on the tMR-based motion corrected PET using phantom and patient data. Methods: Fully sampled tMR data were acquired simultaneously with PET list-mode data on two simultaneous PET-MR scanners for a cardiac phantom and a patient. Parallel imaging and compressed sensing were retrospectively performed by GRAPPA and kt-FOCUSS algorithms with various acceleration factors. Motion fields were estimated using nonrigid B-spline image registration from both the accelerated and fully sampled tMR images. The motion fields were incorporated into a motion corrected ordered subset expectation maximization reconstruction algorithm with motion-dependent attenuation correction. Results: Although tMR acceleration introduced image artifacts into the tMR images for both phantom and patient data, motion corrected PET images yielded similar image quality as those obtained using the fully sampled tMR images for low to moderate acceleration factors (<4). Quantitative analysis of myocardial defect contrast over ten independent noise realizations showed similar results. It was further observed that although the image quality of the motion corrected PET images deteriorates for high acceleration factors, the images were still superior to the images reconstructed without motion correction. Conclusions: Accelerated tMR images obtained with more than 4 times acceleration can still provide relatively accurate motion fields and yield tMR-based motion corrected PET images with similar image quality as those reconstructed using fully sampled tMR data. The reduction of tMR acquisition time makes it more compatible with routine clinical cardiac PET-MR studies.
The current use of patient-centered/reported outcomes in implant dentistry: a systematic review.
De Bruyn, Hugo; Raes, Stefanie; Matthys, Carine; Cosyn, Jan
2015-09-01
To provide an update on the use of Patient-Reported Outcome Measures (PROMs) in the field of implant dentistry (1); to compare PROMs for prostheses supported by one or more implants to alternative treatment options or a healthy dentition (2). The dental literature was searched on PubMed until December 31, 2014, using a general search algorithm. An overall quantitative analysis was performed, and a qualitative appraisal was made on the output of the last 6 years. Per type of edentulism and prosthetic treatment, the general search algorithm was refined in order to select controlled studies comparing PROMs for prostheses supported by one or more implants to alternative treatment options or a healthy dentition. With nearly half of the output (300 of 635) published in the last 6 years, there is a growing interest in PROMs by the scientific community. When scrutinizing the 300 most recent publications, only 84 controlled studies could be identified among which 38 RCTs and 31 cohort studies. An "ad hoc" approach is commonly employed using non-standardized questions and different scoring methods, which may compromise validity and reliability. Overall, 39 eligible papers related to fully edentulous patients treated with an implant overdenture (IOD) and 9 to fully edentulous patients treated with a fixed implant prosthesis (FIP). There is plenty of evidence from well-controlled studies showing that fully edentulous patients in the mandible experience higher satisfaction with an IOD when compared to a conventional denture (CD). This may not hold true for fully edentulous patients in the maxilla. In general, fully edentulous patients seem to opt for a fixed or removable rehabilitation on implants for specific reasons. Data pertaining to partially edentulous patients were limited (FIP: n = 6; single implants: n = 16). In these patients, the timing of implant placement does not seem to affect patient satisfaction. Patients seem to prefer straightforward implant surgery over complex surgery that includes bone grafting. There is an urgent need for standardized reporting of PROMs in the field of implant dentistry. Fully edentulous patients in the mandible experience higher satisfaction with an IOD when compared to a CD. All other types of prostheses have been underexposed to research. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Lueders, Tillmann; Manefield, Mike; Friedrich, Michael W
2004-01-01
Stable isotope probing (SIP) of nucleic acids allows the detection and identification of active members of natural microbial populations that are involved in the assimilation of an isotopically labelled compound into nucleic acids. SIP is based on the separation of isotopically labelled DNA or rRNA by isopycnic density gradient centrifugation. We have developed a highly sensitive protocol for the detection of 'light' and 'heavy' nucleic acids in fractions of centrifugation gradients. It involves the fluorometric quantification of total DNA or rRNA, and the quantification of either 16S rRNA genes or 16S rRNA in gradient fractions by real-time PCR with domain-specific primers. Using this approach, we found that fully 13C-labelled DNA or rRNA of Methylobacterium extorquens was quantitatively resolved from unlabelled DNA or rRNA of Methanosarcina barkeri by cesium chloride or cesium trifluoroacetate density gradient centrifugation respectively. However, a constant low background of unspecific nucleic acids was detected in all DNA or rRNA gradient fractions, which is important for the interpretation of environmental SIP results. Consequently, quantitative analysis of gradient fractions provides a higher precision and finer resolution for retrieval of isotopically enriched nucleic acids than possible using ethidium bromide or gradient fractionation combined with fingerprinting analyses. This is a prerequisite for the fine-scale tracing of microbial populations metabolizing 13C-labelled compounds in natural ecosystems.
Markerless 3D motion capture for animal locomotion studies
Sellers, William Irvin; Hirasaki, Eishi
2014-01-01
Obtaining quantitative data describing the movements of animals is an essential step in understanding their locomotor biology. Outside the laboratory, measuring animal locomotion often relies on video-based approaches and analysis is hampered because of difficulties in calibration and often the limited availability of possible camera positions. It is also usually restricted to two dimensions, which is often an undesirable over-simplification given the essentially three-dimensional nature of many locomotor performances. In this paper we demonstrate a fully three-dimensional approach based on 3D photogrammetric reconstruction using multiple, synchronised video cameras. This approach allows full calibration based on the separation of the individual cameras and will work fully automatically with completely unmarked and undisturbed animals. As such it has the potential to revolutionise work carried out on free-ranging animals in sanctuaries and zoological gardens where ad hoc approaches are essential and access within enclosures often severely restricted. The paper demonstrates the effectiveness of video-based 3D photogrammetry with examples from primates and birds, as well as discussing the current limitations of this technique and illustrating the accuracies that can be obtained. All the software required is open source so this can be a very cost effective approach and provides a methodology of obtaining data in situations where other approaches would be completely ineffective. PMID:24972869
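The numerical core of multi-camera 3D photogrammetry is triangulation: with calibrated projection matrices, a point's 3D position follows from linear least squares over its pixel coordinates in each synchronised view. A minimal direct-linear-transform sketch (camera calibration itself is assumed already done):

```python
import numpy as np

def triangulate(points_2d, projections):
    """Linear (DLT) triangulation of one 3D point seen in several calibrated views.
    points_2d: list of (x, y) pixel coordinates; projections: list of 3x4 matrices."""
    rows = []
    for (x, y), P in zip(points_2d, projections):
        rows.append(x * P[2] - P[0])   # each view contributes two linear constraints
        rows.append(y * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    X = vt[-1]                         # null-space vector = homogeneous 3D point
    return X[:3] / X[3]                # dehomogenize
```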
Magnetic Propulsion of Microswimmers with DNA-Based Flagellar Bundles.
Maier, Alexander M; Weig, Cornelius; Oswald, Peter; Frey, Erwin; Fischer, Peer; Liedl, Tim
2016-02-10
We show that DNA-based self-assembly can serve as a general and flexible tool to construct artificial flagella of several micrometers in length and only tens of nanometers in diameter. By attaching the DNA flagella to biocompatible magnetic microparticles, we provide a proof of concept demonstration of hybrid structures that, when rotated in an external magnetic field, propel by means of a flagellar bundle, similar to self-propelling peritrichous bacteria. Our theoretical analysis predicts that flagellar bundles that possess a length-dependent bending stiffness should exhibit a superior swimming speed compared to swimmers with a single appendage. The DNA self-assembly method permits the realization of these improved flagellar bundles in good agreement with our quantitative model. DNA flagella with well-controlled shape could fundamentally increase the functionality of fully biocompatible nanorobots and extend the scope and complexity of active materials.
NASA Astrophysics Data System (ADS)
Neumann, Karl
1987-06-01
In the methodological discussion of recent years it has become apparent that many research problems, including problems relating to the theory of educational science, cannot be solved by using quantitative methods. The multifaceted aspects of human behaviour and all its environment-bound subtle nuances, especially the process of education or the development of identity, cannot fully be taken into account within a rigid neopositivist approach. In employing the paradigm of symbolic interactionism as a suitable model for the analysis of processes of education and formation, research generally has to start out from complex reciprocal social interactions instead of unambiguous causal connections. In analysing several particular methodological problems, the article demonstrates some weaknesses of quantitative approaches and then shows the advantages of, and the necessity for, using qualitative research tools.
Whole lung morphometry with 3D multiple b-value hyperpolarized gas MRI and compressed sensing.
Chan, Ho-Fung; Stewart, Neil J; Parra-Robles, Juan; Collier, Guilhem J; Wild, Jim M
2017-05-01
To demonstrate three-dimensional (3D) multiple b-value diffusion-weighted (DW) MRI of hyperpolarized 3He gas for whole lung morphometry with compressed sensing (CS). A fully-sampled, two b-value, 3D hyperpolarized 3He DW-MRI dataset was acquired from the lungs of a healthy volunteer and retrospectively undersampled in the ky and kz phase-encoding directions for CS simulations. Optimal k-space undersampling patterns were determined by minimizing the mean absolute error between reconstructed and fully-sampled 3He apparent diffusion coefficient (ADC) maps. Prospective three-fold, undersampled, 3D multiple b-value 3He DW-MRI datasets were acquired from five healthy volunteers and one chronic obstructive pulmonary disease (COPD) patient, and the mean values of maps of ADC and mean alveolar dimension (LmD) were validated against two-dimensional (2D) and 3D fully-sampled 3He DW-MRI experiments. Reconstructed undersampled datasets showed no visual artifacts and good preservation of the main image features and quantitative information. A good agreement between fully-sampled and prospective undersampled datasets was found, with a mean difference of +3.4% and +5.1% observed in mean global ADC and LmD values, respectively. These differences were within the standard deviation range and consistent with values reported from healthy and COPD lungs. Accelerated CS acquisition has facilitated 3D multiple b-value 3He DW-MRI scans in a single breath-hold, enabling whole lung morphometry mapping. Magn Reson Med 77:1916-1925, 2017. © 2016 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of the International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
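For a two-b-value acquisition like the one used for the CS simulations, the ADC map follows directly from the monoexponential model S(b) = S0·exp(−b·ADC). A minimal voxel-wise sketch; the b-value shown is an assumed figure typical of hyperpolarized-gas lung imaging, not taken from this paper:

```python
import numpy as np

def adc_map(s0, sb, b=1.6):
    """Voxel-wise ADC (cm^2/s) from a b=0 image s0 and a diffusion-weighted
    image sb, assuming S(b) = S0 * exp(-b * ADC). b is in s/cm^2; the default
    is an assumed value typical of 3He lung studies, not from this paper."""
    s0 = np.asarray(s0, dtype=float)
    sb = np.asarray(sb, dtype=float)
    with np.errstate(divide='ignore', invalid='ignore'):
        adc = np.log(s0 / sb) / b
    return np.where((s0 > 0) & (sb > 0), adc, 0.0)  # mask non-physical voxels
```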
Liu, Dan; Li, Xingrui; Zhou, Junkai; Liu, Shibo; Tian, Tian; Song, Yanling; Zhu, Zhi; Zhou, Leiji; Ji, Tianhai; Yang, Chaoyong
2017-10-15
Enzyme-linked immunosorbent assay (ELISA) is a popular laboratory technique for detection of disease-specific protein biomarkers with high specificity and sensitivity. However, ELISA requires labor-intensive and time-consuming procedures with skilled operators and spectroscopic instrumentation. Simplification of the procedures and miniaturization of the devices are crucial for ELISA-based point-of-care (POC) testing in resource-limited settings. Here, we present a fully integrated, instrument-free, low-cost and portable POC platform which integrates the ELISA process and the distance readout into a single microfluidic chip. Based on manipulation using a permanent magnet, the process is initiated by moving magnetic beads with capture antibody through different aqueous phases containing ELISA reagents to form a bead/antibody/antigen/antibody sandwich structure, and finally converts the molecular recognition signal into a highly sensitive distance readout for visual quantitative bioanalysis. Without additional equipment and complicated operations, our integrated ELISA-Chip with distance readout allows ultrasensitive quantitation of disease biomarkers within 2 h. The ELISA-Chip method also showed high specificity, good precision and great accuracy. Furthermore, the ELISA-Chip system is highly applicable as a sandwich-based platform for the detection of a variety of protein biomarkers. With the advantages of visual analysis, easy operation, high sensitivity, and low cost, the integrated sample-in-answer-out ELISA-Chip with distance readout shows great potential for quantitative POC testing in resource-limited settings. Copyright © 2017. Published by Elsevier B.V.
Fully On-the-Job Training: Experiences and Steps Ahead: Support Document
ERIC Educational Resources Information Center
Wood, Susanne
2004-01-01
This document was produced by DBM Consultants, who provided the research on Susanne Wood's report "Fully On-the-Job Training: Experiences and Steps Ahead." It contains the appendix, Stage 3: CATI Questionnaire for Fully On-the-Job Trainees/Apprentices, and is provided as an added resource for further information. [Full Report available at ED493985.]
López-Moreno, Sergio; Martínez-Ojeda, Rosa Haydeé; López-Arellano, Oliva; Jarillo-Soto, Edgar; Castro-Albarrán, Juan Manuel
2011-01-01
To assess the consequences of private outsourcing on the overall supply and filling of prescriptions in state health services. The research was conducted using quantitative and qualitative techniques in 13 states. The information was collected through interviews and direct observation. The interviews were carried out with staff of state health services involved in the drug supply chain and with users of health services. The quantitative approach examined the percentage of fully filled prescriptions in a sample of users. States that have opted for the fully outsourced model, and properly monitored this choice, have increased the supply of drugs to their users and guaranteed the supply in the care units under their responsibility. Other states with the outsourced model have multiple problems: direct purchase of drugs not included in the basic drugs catalogue, failure of suppliers and shortage of supplies in the laboratories that supply the company. The main disadvantages identified in all models were: the subordination of medical criteria to administrative criteria, insufficient planning based on local care needs, heterogeneous procedures, insufficient knowledge of regulations and gaps in the regulatory framework. The results indicate that the incorporation of private providers in the drug supply chain may not be the solution to bring down the shortages faced by health services, especially at the hospital level. The shift to outsourcing models has taken place without incorporating evaluation mechanisms, and the consequences that this transition can have on state health systems must be investigated more deeply.
Why Do Women Not Use Preconception Care? A Systematic Review On Barriers And Facilitators.
Poels, Marjolein; Koster, Maria P H; Boeije, Hennie R; Franx, Arie; van Stel, Henk F
2016-10-01
Preconception care (PCC) has the potential to optimize pregnancy outcomes. However, awareness of PCC among the target population is generally limited, and the use of PCC remains low. The objective of this study was to review the literature on women's perceptions regarding barriers and facilitators for the use of PCC. A systematic search was conducted in MEDLINE, Embase, CINAHL, and PsycINFO for published studies until February 2015. Original qualitative and quantitative peer-reviewed studies from Western countries in English, holding women's perceptions regarding barriers and facilitators for the use of PCC. Data extraction and analysis were performed using NVivo version 10 software. A coding frame was derived from the findings and applied by 2 authors. Thematic analysis was used to identify key topics and themes. Twenty-one good-quality articles were included, of which 10 qualitative and 11 quantitative studies. Seven main themes were identified: preconditions, emotions and beliefs, perceived need, knowledge and experience, social structure, accessibility, and provider characteristics. "Not (fully) planning pregnancy", "perceived absence of risks", "lack of awareness", and "pregnancy experiences" were the most frequently identified barriers and "believing in the benefits" and "availability of PCC" the most frequently identified facilitators for PCC use. Women perceive more barriers than facilitators related to PCC uptake, which explains why the use of PCC remains low. Our results provide a starting point to refocus interventions and strategies, aiming on enlarging the awareness, perceived importance, and accessibility of PCC to improve its uptake.
2017-01-01
Background The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Objective Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. Methods A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Results Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). Conclusions This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. PMID:28292738
Pertuz, Said; McDonald, Elizabeth S.; Weinstein, Susan P.; Conant, Emily F.
2016-01-01
Purpose To assess a fully automated method for volumetric breast density (VBD) estimation in digital breast tomosynthesis (DBT) and to compare the findings with those of full-field digital mammography (FFDM) and magnetic resonance (MR) imaging. Materials and Methods Bilateral DBT images, FFDM images, and sagittal breast MR images were retrospectively collected from 68 women who underwent breast cancer screening from October 2011 to September 2012 with institutional review board–approved, HIPAA-compliant protocols. A fully automated computer algorithm was developed for quantitative estimation of VBD from DBT images. FFDM images were processed with U.S. Food and Drug Administration–cleared software, and the MR images were processed with a previously validated automated algorithm to obtain corresponding VBD estimates. Pearson correlation and analysis of variance with Tukey-Kramer post hoc correction were used to compare the multimodality VBD estimates. Results Estimates of VBD from DBT were significantly correlated with FFDM-based and MR imaging–based estimates with r = 0.83 (95% confidence interval [CI]: 0.74, 0.90) and r = 0.88 (95% CI: 0.82, 0.93), respectively (P < .001). The corresponding correlation between FFDM and MR imaging was r = 0.84 (95% CI: 0.76, 0.90). However, statistically significant differences after post hoc correction (α = 0.05) were found among VBD estimates from FFDM (mean ± standard deviation, 11.1% ± 7.0) relative to MR imaging (16.6% ± 11.2) and DBT (19.8% ± 16.2). Differences between VDB estimates from DBT and MR imaging were not significant (P = .26). Conclusion Fully automated VBD estimates from DBT, FFDM, and MR imaging are strongly correlated but show statistically significant differences. Therefore, absolute differences in VBD between FFDM, DBT, and MR imaging should be considered in breast cancer risk assessment. © RSNA, 2015 Online supplemental material is available for this article. PMID:26491909
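The agreement analysis reported here rests on standard tools; a sketch of how paired density estimates from two modalities might be compared on synthetic data (Pearson correlation plus a paired test of the mean offset; the paper's analysis of variance with Tukey-Kramer correction is not reproduced):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
vbd_mri = rng.uniform(5, 40, 68)          # hypothetical MR-based VBD estimates (%)
vbd_dbt = vbd_mri + rng.normal(3, 5, 68)  # correlated DBT estimates with an offset

r, p = stats.pearsonr(vbd_mri, vbd_dbt)        # strength of association
t, p_diff = stats.ttest_rel(vbd_mri, vbd_dbt)  # paired test of the mean offset
print(f"r = {r:.2f} (p = {p:.1e}); mean-difference p = {p_diff:.1e}")
```

A strong correlation can coexist with a significant systematic offset, which is exactly the pattern the study reports across modalities.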
Thompson, Holly M; Minamimoto, Ryogo; Jamali, Mehran; Barkhodari, Amir; von Eyben, Rie; Iagaru, Andrei
2016-07-01
As quantitative 18F-FDG PET numbers and pooling of results from different PET/CT scanners become more influential in the management of patients, it becomes imperative that we fully interrogate differences between scanners to understand the degree of scanner bias on the statistical power of studies. Participants with body mass index (BMI) greater than 25, scheduled on a time-of-flight (TOF)-capable PET/CT scanner, had a consecutive scan on a non-TOF-capable PET/CT scanner and vice versa. SUVmean in various tissues and SUVmax of malignant lesions were measured from both scans, matched to each subject. Data were analyzed using a mixed-effects model, and statistical significance was determined using equivalence testing, with P < 0.05 being significant. Equivalence was established in all baseline organs, except the cerebellum, matched per patient between scanner types. Mixed-effects analysis of lesions, repeated between scan types and matched per patient, demonstrated good concordance between scanner types. Patients could be scanned on either a TOF-capable or non-TOF-capable PET/CT scanner without clinical compromise to quantitative SUV measurements.
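Equivalence testing inverts the logic of the familiar difference test: two one-sided tests (TOST) must both reject before the scanners can be declared equivalent within a stated margin. A manual paired-TOST sketch on synthetic SUV pairs (the paper's mixed-effects model is not reproduced, and the margin is illustrative):

```python
import numpy as np
from scipy import stats

def tost_paired(x, y, margin):
    """Two one-sided tests for equivalence of paired measurements within +/- margin.
    Returns the larger of the two one-sided p-values (declare equivalence if < 0.05)."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    se = d.std(ddof=1) / np.sqrt(d.size)
    df = d.size - 1
    p_lower = 1 - stats.t.cdf((d.mean() + margin) / se, df)  # H0: diff <= -margin
    p_upper = stats.t.cdf((d.mean() - margin) / se, df)      # H0: diff >= +margin
    return max(p_lower, p_upper)

# Hypothetical paired SUVmean values from the same subjects on two scanners.
suv_tof = np.random.default_rng(2).normal(2.0, 0.3, 25)
suv_nontof = suv_tof + np.random.default_rng(3).normal(0, 0.05, 25)
print(f"TOST p = {tost_paired(suv_tof, suv_nontof, margin=0.2):.3f}")
```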
Shi, Chuan; Goldberg, Shalom; Lin, Tricia; Dudkin, Vadim; Widdison, Wayne; Harris, Luke; Wilhelm, Sharon; Jmeian, Yazen; Davis, Darryl; O'Neil, Karyn; Weng, Naidong; Jian, Wenying
2018-04-17
Bioanalysis of antibody-drug conjugates (ADCs) is challenging due to the complex, heterogeneous nature of their structures and their complicated catabolism. To fully describe the pharmacokinetics (PK) of an ADC, several analytes are commonly quantified, including total antibody, conjugate, and payload. Among them, conjugate is the most challenging to measure, because it requires detection of both small and large molecules as one entity. Existing approaches to quantify the conjugated species of ADCs involve a ligand binding assay (LBA) for conjugated antibody or hybrid LBA/liquid chromatography/tandem mass spectrometry (LC/MS/MS) for quantitation of conjugated drug. In our current work for a protein-drug conjugate (PDC) using the Centyrin scaffold, a similar concept to ADCs but with smaller protein size, an alternative method to quantify the conjugate by using a surrogate peptide approach, was utilized. The His-tagged proteins were isolated from biological samples using immobilized metal affinity chromatography (IMAC), followed by trypsin digestion. The tryptic peptide containing the linker attached to the payload was used as a surrogate of the conjugate and monitored by LC/MS/MS analysis. During method development and its application, we found that hydrolysis of the succinimide ring of the linker was ubiquitous, taking place at many stages during the lifetime of the PDC including in the initial drug product, in vivo in circulation in the animals, and ex vivo during the trypsin digestion step of the sample preparation. We have shown that hydrolysis during trypsin digestion is concentration-independent and consistent during the work flow-therefore, having no impact on assay performance. However, for samples that have undergone extensive hydrolysis prior to trypsin digestion, significant bias could be introduced if only the non-hydrolyzed form is considered in the quantitation. Therefore, it is important to incorporate succinimide hydrolysis products in the quantitation method in order to provide an accurate estimation of the total conjugate level. More importantly, the LC/MS/MS-based method described here provides a useful tool to quantitatively evaluate succinimide hydrolysis of ADCs in vivo, which has been previously reported to have significant impact on their stability, exposure, and efficacy.
Probing lipid membrane electrostatics
NASA Astrophysics Data System (ADS)
Yang, Yi
The electrostatic properties of lipid bilayer membranes play a significant role in many biological processes. Atomic force microscopy (AFM) is highly sensitive to membrane surface potential in electrolyte solutions. With fully characterized probe tips, AFM can perform quantitative electrostatic analysis of lipid membranes. Electrostatic interactions between silicon nitride probes and a supported zwitterionic dioleoylphosphatidylcholine (DOPC) bilayer with a variable fraction of anionic dioleoylphosphatidylserine (DOPS) were measured by AFM. Classical Gouy-Chapman theory was used to model the membrane electrostatics. The nonlinear Poisson-Boltzmann equation was numerically solved with the finite element method to provide the potential distribution around the AFM tips. Theoretical tip-sample electrostatic interactions were calculated with the surface integral of both the Maxwell and osmotic stress tensors on the tip surface. The measured forces were interpreted with theoretical forces, and the resulting surface charge densities of the membrane surfaces were in quantitative agreement with the Gouy-Chapman-Stern model of membrane charge regulation. It was demonstrated that the AFM can quantitatively detect membrane surface potential at a separation of several screening lengths, and that the AFM probe only perturbs the membrane surface potential by <2%. One important application of this technique is to estimate the dipole density of lipid membranes. Electrostatic analysis of DOPC lipid bilayers with the AFM reveals a repulsive force between the negatively charged probe tips and the zwitterionic lipid bilayers. This unexpected interaction has been analyzed quantitatively to reveal that the repulsion is due to a weak external field created by the internal membrane dipole moment. The analysis yields a dipole moment of 1.5 Debye per lipid with a dipole potential of +275 mV for supported DOPC membranes. This new ability to quantitatively measure the membrane dipole density in a noninvasive manner will be useful in identifying the biological effects of the dipole potential. Finally, heterogeneous model membranes were studied with fluid electric force microscopy (FEFM). Electrostatic mapping was demonstrated with 50 nm resolution. The capabilities of quantitative electrostatic measurement and lateral charge density mapping make AFM a unique and powerful probe of membrane electrostatics.
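The Gouy-Chapman quantities invoked here are standard: for a symmetric 1:1 electrolyte, the Debye screening length sets the interaction range and the Grahame equation links surface charge density to surface potential. A numerical sketch of these textbook relations (not the thesis's finite-element Poisson-Boltzmann solution):

```python
import numpy as np

# Physical constants (SI)
e, kB, eps0, NA = 1.602e-19, 1.381e-23, 8.854e-12, 6.022e23

def debye_length(c_molar, T=298.0, eps_r=78.5):
    """Debye screening length (m) for a symmetric 1:1 electrolyte."""
    n = c_molar * 1e3 * NA                       # ion number density per m^3
    return np.sqrt(eps_r * eps0 * kB * T / (2 * n * e**2))

def grahame_sigma(psi0, c_molar, T=298.0, eps_r=78.5):
    """Surface charge density (C/m^2) from surface potential psi0 (V),
    Grahame equation for a 1:1 electrolyte."""
    n = c_molar * 1e3 * NA
    return np.sqrt(8 * n * eps_r * eps0 * kB * T) * np.sinh(e * psi0 / (2 * kB * T))

print(f"Debye length in 100 mM 1:1 salt: {debye_length(0.1)*1e9:.2f} nm")  # ~0.96 nm
print(f"sigma at psi0 = -50 mV: {grahame_sigma(-0.05, 0.1):.3e} C/m^2")
```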
Battiston, Marco; Grussu, Francesco; Ianus, Andrada; Schneider, Torben; Prados, Ferran; Fairney, James; Ourselin, Sebastien; Alexander, Daniel C; Cercignani, Mara; Gandini Wheeler-Kingshott, Claudia A M; Samson, Rebecca S
2018-05-01
To develop a framework to fully characterize quantitative magnetization transfer indices in the human cervical cord in vivo within a clinically feasible time. A dedicated spinal cord imaging protocol for quantitative magnetization transfer was developed using a reduced field-of-view approach with echo planar imaging (EPI) readout. Sequence parameters were optimized based on the Cramér-Rao lower bound. Quantitative model parameters (i.e., bound pool fraction, free and bound pool transverse relaxation times [T2F, T2B], and forward exchange rate [kFB]) were estimated implementing a numerical model capable of dealing with the novelties of the sequence adopted. The framework was tested on five healthy subjects. Cramér-Rao lower bound minimization produces optimal sampling schemes without requiring the establishment of a steady-state MT effect. The proposed framework allows quantitative voxel-wise estimation of model parameters at the resolution typically used for spinal cord imaging (i.e., 0.75 × 0.75 × 5 mm³), with a protocol duration of ∼35 min. Quantitative magnetization transfer parametric maps agree with literature values. Whole-cord mean values are: bound pool fraction = 0.11 (±0.01), T2F = 46.5 (±1.6) ms, T2B = 11.0 (±0.2) µs, and kFB = 1.95 (±0.06) Hz. Protocol optimization has a beneficial effect on reproducibility, especially for T2B and kFB. The framework developed enables robust characterization of spinal cord microstructure in vivo using qMT. Magn Reson Med 79:2576-2588, 2018. © 2017 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of the International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
Going fully digital: Perspective of a Dutch academic pathology lab
Stathonikos, Nikolas; Veta, Mitko; Huisman, André; van Diest, Paul J.
2013-01-01
During the last years, whole slide imaging has become more affordable and widely accepted in pathology labs. Digital slides are increasingly being used for digital archiving of routinely produced clinical slides, remote consultation and tumor boards, and quantitative image analysis for research purposes and in education. However, the implementation of a fully digital pathology department requires an in-depth look into the suitability of digital slides for routine clinical use (the image quality of the produced digital slides and the factors that affect it) and the required infrastructure to support such use (the storage requirements and integration with lab management and hospital information systems). Optimization of the digital pathology workflow requires communication between several systems, which can be facilitated by the use of open standards for digital slide storage and scanner management. Consideration of these aspects, along with appropriate validation of the use of digital slides for routine pathology, can pave the way for pathology departments to go “fully digital.” In this paper, we summarize our experiences so far in the process of implementing a fully digital workflow at our Pathology Department and the steps that are needed to complete this process. PMID:23858390
Chong, Siang Yew; Tiňo, Peter; He, Jun; Yao, Xin
2017-11-20
Studying coevolutionary systems in the context of simplified models (i.e., games with pairwise interactions between coevolving solutions modeled as self plays) remains an open challenge since the rich underlying structures associated with pairwise-comparison-based fitness measures are often not taken fully into account. Although cyclic dynamics have been demonstrated in several contexts (such as intransitivity in coevolutionary problems), there is no complete characterization of cycle structures and their effects on coevolutionary search. We develop a new framework to address this issue. At the core of our approach is the directed graph (digraph) representation of coevolutionary problems that fully captures structures in the relations between candidate solutions. Coevolutionary processes are modeled as a specific type of Markov chain: random walks on digraphs. Using this framework, we show that coevolutionary problems admit a qualitative characterization: a coevolutionary problem is either solvable (there is a subset of solutions that dominates the remaining candidate solutions) or not. This has implications for coevolutionary search. We further develop our framework to provide the means to construct quantitative tools for the analysis of coevolutionary processes and demonstrate their applications through case studies. We show that coevolution of solvable problems corresponds to an absorbing Markov chain for which we can compute the expected hitting time of the absorbing class. Otherwise, coevolution will cycle indefinitely and the quantity of interest will be the limiting invariant distribution of the Markov chain. We also provide an index for characterizing complexity in coevolutionary problems and show how such problems can be generated in a controlled manner.
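For the solvable case, the expected hitting time of the absorbing class follows from the standard fundamental-matrix construction: writing the transition matrix in canonical form with transient-to-transient block Q, the expected number of steps to absorption from each transient state is t = inverse(I − Q) · 1. A toy sketch with an assumed 3-state chain:

```python
import numpy as np

# Transition matrix in canonical form: states 0-1 transient, state 2 absorbing.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.0, 0.0, 1.0]])
Q = P[:2, :2]                        # transient-to-transient block
N = np.linalg.inv(np.eye(2) - Q)     # fundamental matrix
t = N @ np.ones(2)                   # expected steps to absorption per start state
print(t)
```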
A conservative fully implicit algorithm for predicting slug flows
NASA Astrophysics Data System (ADS)
Krasnopolsky, Boris I.; Lukyanov, Alexander A.
2018-02-01
An accurate and predictive modelling of slug flows is required by many industries (e.g., oil and gas, nuclear engineering, chemical engineering) to prevent undesired events potentially leading to serious environmental accidents. For example, hydrodynamic and terrain-induced slugging leads to unwanted unsteady flow conditions. This demands the development of fast and robust numerical techniques for predicting slug flows. The study presented in this paper proposes a multi-fluid model and its implementation method accounting for phase appearance and disappearance. The numerical modelling of phase appearance and disappearance presents a complex numerical challenge for all multi-component and multi-fluid models. Numerical challenges arise from the singular systems of equations when some phases are absent and from the solution discontinuity when some phases appear or disappear. This paper provides a flexible and robust solution to these issues. The fully implicit formulation described in this work enables efficient solution of the governing fluid flow equations. The proposed numerical method provides a modelling capability for phase appearance and disappearance processes, based on a switching procedure between various sets of governing equations. These sets of equations are constructed using information about the number of phases present in the computational domain. The proposed scheme does not require an explicit truncation of solutions, leading to a conservative scheme for mass and linear momentum. A transient two-fluid model is used to verify and validate the proposed algorithm for conditions of hydrodynamic and terrain-induced slug flow regimes. The developed modelling capabilities predict all the major features of the experimental data and are in good quantitative agreement with them.
Optimization and automation of quantitative NMR data extraction.
Bernstein, Michael A; Sýkora, Stan; Peng, Chen; Barba, Agustín; Cobas, Carlos
2013-06-18
NMR is routinely used to quantitate chemical species. The necessary experimental procedures to acquire quantitative data are well-known, but relatively little attention has been applied to data processing and analysis. We describe here a robust expert system that can be used to automatically choose the best signals in a sample for overall concentration determination and determine analyte concentration using all accepted methods. The algorithm is based on the complete deconvolution of the spectrum, which makes it tolerant of cases where signals are very close to one another, and includes robust methods for the automatic classification of NMR resonances and molecule-to-spectrum multiplet assignment. With the functionality in place and optimized, it is then a relatively simple matter to apply the same workflow to data in a fully automatic way. The procedure is desirable for both its inherent performance and applicability to NMR data acquired for very large sample sets.
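Once peaks have been deconvolved and integrated, the concentration arithmetic itself is the classical qNMR relation. A minimal sketch with invented values (the expert system's signal selection and deconvolution are not reproduced here):

```python
# Hedged sketch: the arithmetic of NMR quantitation once peak areas are
# known: analyte concentration from integral ratios against an internal
# standard, normalized by proton counts. All numbers are invented.
def concentration(area_analyte, n_h_analyte, area_std, n_h_std, conc_std):
    """Classical qNMR relation: c_a = c_s * (A_a / N_a) / (A_s / N_s)."""
    return conc_std * (area_analyte / n_h_analyte) / (area_std / n_h_std)

# A 2-proton analyte multiplet against a 9-proton standard singlet.
print(concentration(area_analyte=4.2, n_h_analyte=2,
                    area_std=9.5, n_h_std=9, conc_std=10.0), "mM")
```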
Analog to digital workflow improvement: a quantitative study.
Wideman, Catherine; Gallet, Jacqueline
2006-01-01
This study tracked a radiology department's conversion from utilization of a Kodak Amber analog system to a Kodak DirectView DR 5100 digital system. Through the use of ProModel Optimization Suite, a workflow simulation software package, significant quantitative information was derived from workflow process data measured before and after the change to a digital system. Once the digital room was fully operational and the radiology staff comfortable with the new system, average patient examination time was reduced from 9.24 to 5.28 min, indicating that a higher patient throughput could be achieved. Compared to the analog system, chest examination time for modality-specific activities was reduced by 43%. The percentage of repeat examinations with the digital system also decreased to 8%, versus 9.5% with the analog system. The study indicated that it is possible to quantitatively study clinical workflow and productivity by using commercially available software.
Localization-based super-resolution imaging meets high-content screening.
Beghin, Anne; Kechkar, Adel; Butler, Corey; Levet, Florian; Cabillic, Marine; Rossier, Olivier; Giannone, Gregory; Galland, Rémi; Choquet, Daniel; Sibarita, Jean-Baptiste
2017-12-01
Single-molecule localization microscopy techniques have proven to be essential tools for quantitatively monitoring biological processes at unprecedented spatial resolution. However, these techniques are very low throughput and are not yet compatible with fully automated, multiparametric cellular assays. This shortcoming is primarily due to the huge amount of data generated during imaging and the lack of software for automation and dedicated data mining. We describe an automated quantitative single-molecule-based super-resolution methodology that operates in standard multiwell plates and uses analysis based on high-content screening and data-mining software. The workflow is compatible with fixed- and live-cell imaging and allows extraction of quantitative data like fluorophore photophysics, protein clustering or dynamic behavior of biomolecules. We demonstrate that the method is compatible with high-content screening using 3D dSTORM and DNA-PAINT based super-resolution microscopy as well as single-particle tracking.
Open Science CBS Neuroimaging Repository: Sharing ultra-high-field MR images of the brain.
Tardif, Christine Lucas; Schäfer, Andreas; Trampel, Robert; Villringer, Arno; Turner, Robert; Bazin, Pierre-Louis
2016-01-01
Magnetic resonance imaging at ultra high field opens the door to quantitative brain imaging at sub-millimeter isotropic resolutions. However, novel image processing tools to analyze these new rich datasets are lacking. In this article, we introduce the Open Science CBS Neuroimaging Repository: a unique repository of high-resolution and quantitative images acquired at 7 T. The motivation for this project is to increase interest for high-resolution and quantitative imaging and stimulate the development of image processing tools developed specifically for high-field data. Our growing repository currently includes datasets from MP2RAGE and multi-echo FLASH sequences from 28 and 20 healthy subjects respectively. These datasets represent the current state-of-the-art in in-vivo relaxometry at 7 T, and are now fully available to the entire neuroimaging community. Copyright © 2015 Elsevier Inc. All rights reserved.
Pateman, B; Jinks, A M
1999-01-01
The focus of this paper is a study designed to explore the validity of quantitative approaches to student evaluation in a pre-registration degree programme. As managers of the students' education, we were concerned that the quantitative method, which used lecturer criteria, might not fully represent students' views. The approach taken is that of a process-type strategy for curriculum evaluation as described by Parlett and Hamilton (1972). The aim of the study is to produce illuminative data, or students' 'stories' of their educational experiences, through the use of semi-structured interviews. The results are then compared to the current quantitative measurement tools designed to obtain 'snapshots' of the educational effectiveness of the curriculum. The quantitative measurement tools use Likert scale measurements of teacher-devised criterion statements. The results of the study give a rich source of qualitative data which can be used to inform future curriculum development. However, complete validation of the current quantitative instruments used was not achieved in this study. Student and teacher agendas in respect of important issues pertaining to the course programme were found to differ. Limitations of the study are given. There is discussion of the options open to the management team with regard to future development of curriculum evaluation systems.
You can run, you can hide: The epidemiology and statistical mechanics of zombies
NASA Astrophysics Data System (ADS)
Alemi, Alexander A.; Bierbaum, Matthew; Myers, Christopher R.; Sethna, James P.
2015-11-01
We use a popular fictional disease, zombies, in order to introduce techniques used in modern epidemiology modeling, and ideas and techniques used in the numerical study of critical phenomena. We consider variants of zombie models, from fully connected continuous time dynamics to a full scale exact stochastic dynamic simulation of a zombie outbreak on the continental United States. Along the way, we offer a closed form analytical expression for the fully connected differential equation, and demonstrate that the single person per site two dimensional square lattice version of zombies lies in the percolation universality class. We end with a quantitative study of the full scale US outbreak, including the average susceptibility of different geographical regions.
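A minimal sketch of the fully connected (well-mixed) limit: SZR-style rate equations integrated numerically. The rate constants and initial fractions are assumed, and the interaction terms are a plausible reading of this model class rather than the paper's exact equations:

```python
# Hedged sketch: well-mixed SZR zombie dynamics. Parameters are illustrative;
# the paper also gives a closed-form solution for this mean-field limit.
import numpy as np
from scipy.integrate import solve_ivp

beta, kappa = 1.0, 0.8   # bite rate, kill rate (assumed values)

def szr(t, y):
    s, z, r = y
    return [-beta * s * z,            # susceptibles bitten
            (beta - kappa) * s * z,   # zombies created minus destroyed
            kappa * s * z]            # zombies permanently removed

sol = solve_ivp(szr, (0, 50), [0.99, 0.01, 0.0], dense_output=True)
print(sol.y[:, -1])   # final fractions of S, Z, R
```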
Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo
2017-08-04
Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.
Ozaki, Yu-ichi; Uda, Shinsuke; Saito, Takeshi H; Chung, Jaehoon; Kubota, Hiroyuki; Kuroda, Shinya
2010-04-01
Modeling of cellular functions on the basis of experimental observation is increasingly common in the field of cellular signaling. However, such modeling requires a large amount of quantitative data on signaling events with high spatio-temporal resolution. A novel technique which allows us to obtain such data is needed for systems biology of cellular signaling. We developed a fully automatable assay technique, termed quantitative image cytometry (QIC), which integrates a quantitative immunostaining technique and a high precision image-processing algorithm for cell identification. With the aid of an automated sample preparation system, this device can quantify protein expression, phosphorylation and localization with subcellular resolution at one-minute intervals. The signaling activities quantified by the assay system showed good correlation with, as well as comparable reproducibility to, western blot analysis. Taking advantage of the high spatio-temporal resolution, we investigated the signaling dynamics of the ERK pathway in PC12 cells. The QIC approach is a highly quantitative and versatile technique which can serve as a convenient replacement for conventional techniques including western blot, flow cytometry and live cell imaging. Thus, the QIC technique can be a powerful tool for investigating the systems biology of cellular signaling.
Sun, Wanxin; Chang, Shi; Tai, Dean C S; Tan, Nancy; Xiao, Guangfa; Tang, Huihuan; Yu, Hanry
2008-01-01
Liver fibrosis is associated with an abnormal increase in extracellular matrix in chronic liver diseases. Quantitative characterization of fibrillar collagen in intact tissue is essential for both fibrosis studies and clinical applications. Commonly used methods, histological staining followed by either semiquantitative or computerized image analysis, have limited sensitivity and accuracy, and suffer operator-dependent variation. The fibrillar collagen in sinusoids of normal livers could be observed through second-harmonic generation (SHG) microscopy. The two-photon excited fluorescence (TPEF) images, recorded simultaneously with SHG, clearly revealed the hepatocyte morphology. We have systematically optimized the parameters for the quantitative SHG/TPEF imaging of liver tissue and developed fully automated image analysis algorithms to extract the information of collagen changes and cell necrosis. Subtle changes in the distribution and amount of collagen and cell morphology are quantitatively characterized in SHG/TPEF images. By comparing to traditional staining, such as Masson's trichrome and Sirius red, SHG/TPEF is a sensitive quantitative tool for automated collagen characterization in liver tissue. Our system allows for enhanced detection and quantification of sinusoidal collagen fibers in fibrosis research and clinical diagnostics.
The protective role of coastal marshes: a systematic review and meta-analysis.
Shepard, Christine C; Crain, Caitlin M; Beck, Michael W
2011-01-01
Salt marshes lie between many human communities and the coast and have been presumed to protect these communities from coastal hazards by providing important ecosystem services. However, previous characterizations of these ecosystem services have typically been based on a small number of historical studies, and the consistency and extent to which marshes provide these services has not been investigated. Here, we review the current evidence for the specific processes of wave attenuation, shoreline stabilization and floodwater attenuation to determine if and under what conditions salt marshes offer these coastal protection services. We conducted a thorough search and synthesis of the literature with reference to these processes. Seventy-five publications met our selection criteria, and we conducted meta-analyses for publications with sufficient data available for quantitative analysis. We found that combined across all studies (n = 7), salt marsh vegetation had a significant positive effect on wave attenuation as measured by reductions in wave height per unit distance across marsh vegetation. Salt marsh vegetation also had a significant positive effect on shoreline stabilization as measured by accretion, lateral erosion reduction, and marsh surface elevation change (n = 30). Salt marsh characteristics that were positively correlated to both wave attenuation and shoreline stabilization were vegetation density, biomass production, and marsh size. Although we could not find studies quantitatively evaluating floodwater attenuation within salt marshes, there are several studies noting the negative effects of wetland alteration on water quantity regulation within coastal areas. Our results show that salt marshes have value for coastal hazard mitigation and climate change adaptation. Because we do not yet fully understand the magnitude of this value, we propose that decision makers employ natural systems to maximize the benefits and ecosystem services provided by salt marshes and exercise caution when making decisions that erode these services.
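A minimal sketch of the pooling step in such a meta-analysis, using inverse-variance weights with a DerSimonian-Laird between-study variance; the effect sizes and variances below are invented placeholders, not the study's data:

```python
# Hedged sketch: random-effects pooling (DerSimonian-Laird) of per-study
# effect sizes, e.g., wave-height reduction across marsh studies.
import numpy as np

effects   = np.array([0.8, 1.1, 0.6, 0.9, 1.3, 0.7, 1.0])   # invented
variances = np.array([0.10, 0.15, 0.08, 0.12, 0.20, 0.09, 0.11])

w = 1 / variances                                 # fixed-effect weights
fe_mean = np.sum(w * effects) / w.sum()
q = np.sum(w * (effects - fe_mean)**2)            # Cochran's Q
df = effects.size - 1
c = w.sum() - np.sum(w**2) / w.sum()
tau2 = max(0.0, (q - df) / c)                     # between-study variance
w_re = 1 / (variances + tau2)                     # random-effects weights
pooled = np.sum(w_re * effects) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
print(f"pooled effect {pooled:.2f} +/- {1.96 * se:.2f} (95% CI)")
```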
Modeling microbial reaction rates in a submarine hydrothermal vent chimney wall
NASA Astrophysics Data System (ADS)
LaRowe, Douglas E.; Dale, Andrew W.; Aguilera, David R.; L'Heureux, Ivan; Amend, Jan P.; Regnier, Pierre
2014-01-01
The fluids emanating from active submarine hydrothermal vent chimneys provide a window into subseafloor processes and, through mixing with seawater, are responsible for steep thermal and compositional gradients that provide the energetic basis for diverse biological communities. Although several models have been developed to better understand the dynamic interplay of seawater, hydrothermal fluid, minerals and microorganisms inside chimney walls, none provide a fully integrated approach to quantifying the biogeochemistry of these hydrothermal systems. In an effort to remedy this, a fully coupled biogeochemical reaction-transport model of a hydrothermal vent chimney has been developed that explicitly quantifies the rates of microbial catalysis while taking into account geochemical processes such as fluid flow, solute transport and oxidation-reduction reactions associated with fluid mixing as a function of temperature. The metabolisms included in the reaction network are methanogenesis, aerobic oxidation of hydrogen, sulfide and methane and sulfate reduction by hydrogen and methane. Model results indicate that microbial catalysis is generally fastest in the hottest habitable portion of the vent chimney (77-102 °C), and methane and sulfide oxidation peak near the seawater-side of the chimney. The fastest metabolisms are aerobic oxidation of H2 and sulfide and reduction of sulfate by H2 with maximum rates of 140, 900 and 800 pmol cm⁻³ d⁻¹, respectively. The maximum rate of hydrogenotrophic methanogenesis is just under 0.03 pmol cm⁻³ d⁻¹, the slowest of the metabolisms considered. Due to thermodynamic inhibition, there is no anaerobic oxidation of methane by sulfate (AOM). These simulations are consistent with vent chimney metabolic activity inferred from phylogenetic data reported in the literature. The model developed here provides a quantitative approach to describing the rates of biogeochemical transformations in hydrothermal systems and can be used to constrain the role of microbial activity in the deep subsurface.
Wollstein, Andreas; Walsh, Susan; Liu, Fan; Chakravarthy, Usha; Rahu, Mati; Seland, Johan H; Soubrane, Gisèle; Tomazzoli, Laura; Topouzis, Fotis; Vingerling, Johannes R; Vioque, Jesus; Böhringer, Stefan; Fletcher, Astrid E; Kayser, Manfred
2017-02-27
Success of genetic association and the prediction of phenotypic traits from DNA are known to depend on the accuracy of phenotype characterization, amongst other parameters. To overcome limitations in the characterization of human iris pigmentation, we introduce a fully automated approach that specifies the areal proportions proposed to represent differing pigmentation types, such as pheomelanin, eumelanin, and non-pigmented areas within the iris. We demonstrate the utility of this approach using high-resolution digital eye imagery and genotype data from 12 selected SNPs from over 3000 European samples of seven populations that are part of the EUREYE study. In comparison to previous quantification approaches, (1) we achieved an overall improvement in eye colour phenotyping, which provides a better separation of manually defined eye colour categories. (2) Single nucleotide polymorphisms (SNPs) known to be involved in human eye colour variation showed stronger associations with our approach. (3) We found new and confirmed previously noted SNP-SNP interactions. (4) We increased SNP-based prediction accuracy of quantitative eye colour. Our findings exemplify that precise quantification using the perceived biological basis of pigmentation leads to enhanced genetic association and prediction of eye colour. We expect our approach to deliver new pigmentation genes when applied to genome-wide association testing.
Quantitative analysis of Paratethys sea level change during the Messinian Salinity Crisis
NASA Astrophysics Data System (ADS)
de la Vara, Alba; Meijer, Paul; van Baak, Christiaan; Marzocchi, Alice; Grothe, Arjen
2016-04-01
At the time of the Messinian Salinity Crisis in the Mediterranean Sea (i.e., the Pontian stage of the Paratethys), sea level in the Paratethys also dropped. Evidence found in the sedimentary record of the Black Sea and the Caspian Sea has been interpreted to indicate that a sea level fall occurred between 5.6 and 5.5 Ma. Estimates for the magnitude of the fall range from tens of meters to more than 1500 m. The purpose of this study is to provide quantitative insight into the sensitivity of the water level of the Black Sea and the Caspian Sea to the hydrologic budget, for the case that the Paratethys is disconnected from the Mediterranean. Using a Late Miocene bathymetry based on a palaeogeographic map by Popov et al. (2004), we quantify the fall in sea level, the mean salinity, and the time to reach equilibrium for a wide range of negative hydrologic budgets. By combining our results with (i) estimates derived from a recent global Late Miocene climate simulation and (ii) reconstructed basin salinities, we are able to rule out a drop in sea level of the order of 1000 m in the Caspian Sea during this time period. In the Black Sea, however, such a large sea level fall cannot be fully discarded.
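The basin-scale reasoning can be illustrated with a toy water balance: the level falls until net evaporation over the shrinking surface area balances inflow. The linear hypsometry, inflow, and evaporation rate below are assumptions, not the paper's Late Miocene bathymetry or budgets:

```python
# Hedged sketch: relaxation of a closed basin toward the level at which
# river inflow balances net evaporation over the shrinking surface area.
def area(h):
    """Surface area (m^2) vs level drop h (m); toy linear hypsometry."""
    return max(0.0, 4.4e11 * (1 - h / 1000.0))   # ~Black Sea scale at h = 0

inflow = 3.0e11        # river + precipitation inflow, m^3/yr (assumed)
net_evap = 1.0         # evaporation minus precipitation, m/yr (assumed)

h, dt = 0.0, 1.0       # level drop below datum (m), time step (yr)
for year in range(20000):
    dV = inflow - net_evap * area(h)   # net volume change, m^3/yr
    if abs(dV) < 1e6:
        break
    h -= dV / area(h) * dt             # falling volume => h grows
print(f"equilibrium after ~{year} yr at {h:.0f} m below datum")
```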
Aguilar, Carlos A.; Shcherbina, Anna; Ricke, Darrell O.; Pop, Ramona; Carrigan, Christopher T.; Gifford, Casey A.; Urso, Maria L.; Kottke, Melissa A.; Meissner, Alexander
2015-01-01
Traumatic lower-limb musculoskeletal injuries are pervasive amongst athletes and the military, and typically an individual returns to activity prior to fully healing, increasing the predisposition for additional injuries and chronic pain. Monitoring healing progression after a musculoskeletal injury typically involves different types of imaging, but these approaches suffer from several disadvantages. Isolating and profiling transcripts from the injured site would abrogate these shortcomings and provide enumerative insights into the regenerative potential of an individual’s muscle after injury. In this study, a traumatic injury was administered to a mouse model and healing progression was examined from 3 hours to 1 month using high-throughput RNA-Sequencing (RNA-Seq). Comprehensive dissection of the genome-wide datasets revealed the injured site to be a dynamic, heterogeneous environment composed of multiple cell types and thousands of genes undergoing significant expression changes in highly regulated networks. Four independent approaches were used to determine the set of genes, isoforms, and genetic pathways most characteristic of different time points post-injury and two novel approaches were developed to classify injured tissues at different time points. These results highlight the possibility to quantitatively track healing progression in situ via transcript profiling using high-throughput sequencing. PMID:26381351
Macdonald, Patrick J.; Chen, Yan; Mueller, Joachim D.
2012-01-01
Cell-free synthesis, a method for the rapid expression of proteins, is increasingly used to study interactions of complex biological systems. GFP and its variants have become indispensable for fluorescence studies in live cells and are equally attractive as reporters for cell-free systems. This work investigates the use of fluorescence fluctuation spectroscopy (FFS) as a tool for quantitative analysis of protein interactions in cell-free expression systems. We also explore chromophore maturation of fluorescent proteins, which is of crucial importance for fluorescence studies. A droplet sample protocol was developed that ensured sufficient oxygenation for chromophore maturation and ease of manipulation for titration studies. The kinetics of chromophore maturation of EGFP, EYFP, and mCherry were analyzed as a function of temperature. A strong increase in the rate from room temperature to 37 °C was observed. We further demonstrate that all EGFP proteins fully mature in the cell-free solution and that brightness is a robust parameter specifying stoichiometry. Finally, FFS is applied to study the stoichiometry of the nuclear transport factor 2 in a cell-free system over a broad concentration range. We conclude that combining cell-free expression and FFS provides a powerful technique for quick, quantitative study of chromophore maturation and protein-protein interaction. PMID:22093611
Zero bias STS Kondo anomalies of Co impurities on Cu surfaces: do ab initio calculations work?
NASA Astrophysics Data System (ADS)
Baruselli, Pier Paolo; Smogunov, Alexander; Fabrizio, Michele; Requist, Ryan; Tosatti, Erio
2012-02-01
Transition metal atoms such as Co on Cu (111), (100), and (110) surfaces produce STS I-V spectra showing different zero bias Kondo anomalies [1], but these differences have been neither quantitatively predicted nor fully explained theoretically. We apply to this problem the DFT+NRG scheme of Lucignano et al [2], where one solves by NRG an Anderson model built from ab initio phase shifts provided by DFT. For Co/Cu(100) and Co/Cu(110) our calculations correctly describe the experimental trend of Kondo temperatures, and fairly well the lineshapes too. By contrast, they fail to describe Co/Cu(111), where in particular the anti-lorentzian lineshape found in experiment remains unexplained. This failure underscores the role of surface states, probably relevant for Co/Cu(111) [3] but not correctly described by our thin slab calculations. Future efforts to quantitatively include Kondo screening by surface states are therefore called for. 1. N. Knorr et al PRL 88, 096804 (2002); M. Ternes et al J. Phys.: Cond. Matt. 21, 053001 (2009); A. Gumbsch et al PRB 81, 165420 (2010). 2. P. Lucignano et al Nature Mat. 8, 563 (2009); P.P. Baruselli et al, Physica E, doi:10.1016/j.physe.2011.05.005. 3. C. Lin et al. PRB 71, 035417 (2005).
Moyanova, S; Kortenska, L; Kirov, R; Iliev, I
1998-12-01
The powerful vasoconstrictor peptide endothelin-1 (ET1) has been shown to reduce local cerebral blood flow in brain areas supplied by the middle cerebral artery (MCA) to a pathologically low level upon intracerebral injection adjacent to the MCA. This reduction manifests itself as an ischemic infarct, that is fully developed within 3 days after ET1 injection. The aim of the present study is to examine the effect of ET1 on electroencephalographic (EEG) activity. ET1 was microinjected unilaterally at a dose of 60 pmol in 3 microl of saline to the MCA in conscious rats. EEG signals were recorded from the frontoparietal cortical area, supplied by MCA, from the first up to the fourteenth day after ET1 injection. EEG activity was analyzed by the fast Fourier transformation. A significant shift to a lower EEG frequency, i.e., augmentation of slow waves and a reduction of alpha-like and faster EEG waves was found post-ET1. This effect was maximal after 3-7 days when the most severe destruction of neurons in this cortical area occurs, as has been previously demonstrated. The results suggest that the quantitative EEG analysis may provide useful additional information about the functional disturbances associated with focal cerebral ischemia.
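A minimal sketch of the quantitative EEG step: Fourier-based relative band powers, here via Welch's method on a synthetic signal standing in for the frontoparietal recordings; the sampling rate and band edges are assumed:

```python
# Hedged sketch: quantitative EEG via the FFT: relative band power,
# capturing the post-ischemic shift toward slow (delta) activity.
import numpy as np
from scipy.signal import welch

fs = 250                                    # sampling rate, Hz (assumed)
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
eeg = (1.5 * np.sin(2 * np.pi * 2.5 * t)    # delta component
       + 0.7 * np.sin(2 * np.pi * 10 * t)   # alpha-like component
       + 0.3 * rng.standard_normal(t.size))

f, pxx = welch(eeg, fs=fs, nperseg=2 * fs)

def band_power(lo, hi):
    m = (f >= lo) & (f < hi)
    return pxx[m].sum() * (f[1] - f[0])     # rectangle-rule integral

total = band_power(0.5, 30)
for name, lo, hi in [("delta", 0.5, 4), ("theta", 4, 8), ("alpha", 8, 13)]:
    print(f"{name}: {band_power(lo, hi) / total:.2f} of 0.5-30 Hz power")
```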
NASA Astrophysics Data System (ADS)
Lebec, Michael Thomas
Due to discipline specific shortages, web-based learning has been proposed as a convenient way to upgrade the content knowledge of instructors interested in learning to teach science. Despite quantitative evidence that web-based instruction is equivalent to traditional methods, questions remain regarding its use. The efficiency and practicality of this approach with teachers in particular has not been extensively studied. This investigation examines learning in an online biology course designed to help teachers prepare for science certification exams. Research questions concern how teachers learn biology in the online environment and how this setting influences the learning process. Quantitative and qualitative methodologies are employed in an attempt to provide a more complete perspective than typical studies of online learning. Concept maps, tests, and online discussion transcripts are compared as measures of assimilated knowledge, while interviews reflect participants' views on the course. Findings indicate that participants experienced gains in declarative knowledge, but little improvement with respect to conditional knowledge. Qualitative examination of concept maps demonstrates gaps in participants' understandings of key course ideas. Engagement in the use of online resources varied according to participants' attitudes towards online learning. Subjects also reported a lack of motivation to fully engage in the course due to busy teaching schedules and the absence of accountability.
Eckstein, Felix; Kunz, Manuela; Hudelmaier, Martin; Jackson, Rebecca; Yu, Joseph; Eaton, Charles B; Schneider, Erika
2007-02-01
Phased-array (PA) coils generally provide higher signal-to-noise ratios (SNRs) than quadrature knee coils. In this pilot study for the Osteoarthritis Initiative (OAI) we compared these two types of coils in terms of contrast-to-noise ratio (CNR), precision, and consistency of quantitative femorotibial cartilage measurements. Test-retest measurements were acquired using coronal fast low-angle shot with water excitation (FLASHwe) and coronal multiplanar reconstruction (MPR) of sagittal double-echo steady state with water excitation (DESSwe) at 3T. The precision errors for cartilage volume and thickness were
NASA Technical Reports Server (NTRS)
Riley, D. R.; Miller, G. K., Jr.
1978-01-01
The effect of time delay in the visual and motion cues of a flight simulator on pilot performance was determined for the task of tracking a target aircraft that was oscillating sinusoidally in altitude only. An audio side task was used to ensure the subject was fully occupied at all times. The results indicate that, within the test grid employed, about the same acceptable time delay (250 msec) was obtained for a single aircraft (fighter type) by each of two subjects for both fixed-base and motion-base conditions. Acceptable time delay is defined as the largest amount of delay that can be inserted simultaneously into the visual and motion cues before performance degradation occurs. A statistical analysis of the data was made to establish this value of time delay. The audio side task provided quantitative data that documented the subject's work level.
Experimental study of flow reattachment in a single-sided sudden expansion
NASA Technical Reports Server (NTRS)
Westphal, R. V.; Johnston, J. P.; Eaton, J. K.
1984-01-01
The reattachment of a fully turbulent, two dimensional, separated shear layer downstream of a single-sided sudden expansion in a planar duct flow was examined experimentally. The effect of changing the structure of the separated shear layer on the reattachment process itself was examined. For all cases, the Reynolds number based on step height was greater than 20,000, the expansion ratio was 5/3, and the inlet boundary layer was less than one-half step height in thickness. A crucially important phase was the development of a pulsed wall probe for measurement of skin friction in the reattachment region, thus providing an unambiguous definition of the reattachment length. Quantitative features of reattachment - including streamwise development of the mean and fluctuating velocity field, pressure rise, and skin friction - were found to be similar for all cases studied when scaled by the reattachment length. A definition of the reattachment zone is proposed.
volBrain: An Online MRI Brain Volumetry System
Manjón, José V.; Coupé, Pierrick
2016-01-01
The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in vast amounts of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently experienced many advances with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods showing very competitive results. PMID:27512372
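A minimal sketch of the core idea behind multi-atlas label fusion, reduced to its simplest form (majority voting over co-registered atlas labels). volBrain's actual pipeline uses more sophisticated patch-based fusion; the non-linear warping to target space is assumed done upstream here:

```python
# Hedged sketch: majority-vote label fusion over warped atlas label maps.
import numpy as np

def majority_vote(warped_labels):
    """warped_labels: (n_atlases, X, Y, Z) integer label maps in target space."""
    stack = np.asarray(warped_labels)
    n_labels = int(stack.max()) + 1
    # Count votes per label at each voxel, then take the argmax.
    votes = np.stack([(stack == k).sum(axis=0) for k in range(n_labels)])
    return votes.argmax(axis=0)

# Five toy "atlases" voting on labels {0: background, 1: structure}.
rng = np.random.default_rng(1)
atlases = rng.integers(0, 2, size=(5, 4, 4, 3))
seg = majority_vote(atlases)
print(seg.shape, np.unique(seg))
```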
Sempere, Lorenzo F
2014-01-01
miRNAs are short, non-coding, regulatory RNAs that exert cell type-dependent, context-dependent, transcriptome-wide gene expression control under physiological and pathological conditions. Tissue slide-based assays provide qualitative (tumor compartment) and semi-quantitative (expression levels) information about altered miRNA expression at single-cell resolution in clinical tumor specimens. Reviewed here are key technological advances in the last 5 years that have led to implementation of fully automated, robust and reproducible tissue slide-based assays for in situ miRNA detection on US FDA-approved instruments; recent tissue slide-based discovery studies that suggest potential clinical applications of specific miRNAs in cancer medicine are highlighted; and the challenges in bringing tissue slide-based miRNA assays into the clinic are discussed, including clinical validation, biomarker performance, biomarker space and integration with other biomarkers. PMID:25090088
Quantitative Study of Blue Stars in NGC 55
NASA Astrophysics Data System (ADS)
Castro, N.; Herrero, A.; Urbaneja, M. A.; García, M.; Simón-Díaz, S.; Bresolin, F.; Pietrzynski, G.; Kudritzki, R.-P.; Gieren, W.
2012-12-01
Massive blue stars are the rarest of stars; however, they are the main engines in the chemical and dynamical evolution of galaxies in the Universe. They are also among the brightest stars, making it possible to observe (and hence study) them beyond the edges of the Milky Way. In the case of the galaxy NGC 55 (1.9 Mpc), presented in this work, it has been possible not only to provide the first census of massive blue stars, but also to perform a full characterization of these stars, including the stellar parameters, the chemical abundances, and information about their evolutionary stages. This, in turn, permitted the derivation of important properties of the host galaxy. This challenging study is based on an objective and fast automatic technique built upon a new state-of-the-art FASTWIND atmosphere model grid. Both the tool and the grid were specially developed for this project.
Ruckh, Timothy T.; Mehta, Ankeeta A.; Dubach, J. Matthew; Clark, Heather A.
2013-01-01
This work introduces a polymer-free optode nanosensor for ratiometric sodium imaging. Transmembrane ion dynamics are often captured by electrophysiology and calcium imaging, but sodium dyes suffer from short excitation wavelengths and poor selectivity. Optodes, optical sensors composed of a polymer matrix with embedded sensing chemistry, have been translated into nanosensors that selectively image ion concentrations. Polymer-free nanosensors were fabricated by emulsification and were stable by diameter and sensitivity for at least one week. Ratiometric fluorescent measurements demonstrated that the nanosensors are selective for sodium over potassium by ~1.4 orders of magnitude, have a dynamic range centered at 20 mM, and are fully reversible. The ratiometric signal changes by 70% between 10 and 100 mM sodium, showing that they are sensitive to changes in sodium concentration. These nanosensors will provide a new tool for sensitive and quantitative ion imaging. PMID:24284431
Measuring the topology of large-scale structure in the universe
NASA Technical Reports Server (NTRS)
Gott, J. Richard, III
1988-01-01
An algorithm for quantitatively measuring the topology of large-scale structure has now been applied to a large number of observational data sets. The present paper summarizes and provides an overview of some of these observational results. On scales significantly larger than the correlation length, larger than about 1200 km/s, the cluster and galaxy data are fully consistent with a sponge-like random phase topology. At a smoothing length of about 600 km/s, however, the observed genus curves show a small shift in the direction of a meatball topology. Cold dark matter (CDM) models show similar shifts at these scales but not generally as large as those seen in the data. Bubble models, with voids completely surrounded on all sides by wall of galaxies, show shifts in the opposite direction. The CDM model is overall the most successful in explaining the data.
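For reference, the random-phase (Gaussian) genus curve against which the observed curves are compared has a standard analytic form from the Gaussian random-field literature; it is not restated in the abstract, so the expression below is supplied here:

```latex
% Genus per unit volume of a Gaussian random field, where \nu is the
% threshold density in standard deviations and \langle k^2 \rangle is the
% second moment of the smoothed power spectrum:
g(\nu) = \frac{1}{(2\pi)^2}
         \left(\frac{\langle k^2\rangle}{3}\right)^{3/2}
         \left(1-\nu^2\right) e^{-\nu^2/2}
% Shifts of the observed curve relative to this symmetric form diagnose
% "meatball" or "bubble" departures from random-phase topology.
```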
Unfolding large-scale online collaborative human dynamics
Zha, Yilong; Zhou, Tao; Zhou, Changsong
2016-01-01
Large-scale interacting human activities underlie all social and economic phenomena, but quantitative understanding of regular patterns and mechanism is very challenging and still rare. Self-organized online collaborative activities with a precise record of event timing provide unprecedented opportunity. Our empirical analysis of the history of millions of updates in Wikipedia shows a universal double–power-law distribution of time intervals between consecutive updates of an article. We then propose a generic model to unfold collaborative human activities into three modules: (i) individual behavior characterized by Poissonian initiation of an action, (ii) human interaction captured by a cascading response to previous actions with a power-law waiting time, and (iii) population growth due to the increasing number of interacting individuals. This unfolding allows us to obtain an analytical formula that is fully supported by the universal patterns in empirical data. Our modeling approaches reveal “simplicity” beyond complex interacting human activities. PMID:27911766
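A minimal sketch of module (ii): inverse-transform sampling of power-law waiting times, with the exponent recovered by the continuous maximum-likelihood (Hill-type) estimator. The exponent and cutoff values are illustrative, not fitted to the Wikipedia data:

```python
# Hedged sketch: sampling the power-law waiting times assigned to
# cascading responses, then recovering the tail exponent.
import numpy as np

rng = np.random.default_rng(7)
alpha, tau_min = 2.5, 1.0                        # assumed exponent and cutoff

u = rng.random(100000)
tau = tau_min * (1 - u) ** (-1 / (alpha - 1))    # P(tau) ~ tau^(-alpha)

# Continuous maximum-likelihood estimate of the exponent.
alpha_hat = 1 + tau.size / np.log(tau / tau_min).sum()
print(f"recovered exponent ~ {alpha_hat:.2f}")
```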
Automated volumetric evaluation of stereoscopic disc photography
Xu, Juan; Ishikawa, Hiroshi; Wollstein, Gadi; Bilonick, Richard A; Kagemann, Larry; Craig, Jamie E; Mackey, David A; Hewitt, Alex W; Schuman, Joel S
2010-01-01
PURPOSE: To develop a fully automated algorithm (AP) to perform a volumetric measure of the optic disc using conventional stereoscopic optic nerve head (ONH) photographs, and to compare algorithm-produced parameters with manual photogrammetry (MP), scanning laser ophthalmoscope (SLO) and optical coherence tomography (OCT) measurements. METHODS: One hundred twenty-two stereoscopic optic disc photographs (61 subjects) were analyzed. Disc area, rim area, cup area, cup/disc area ratio, vertical cup/disc ratio, rim volume and cup volume were automatically computed by the algorithm. Latent variable measurement error models were used to assess measurement reproducibility for the four techniques. RESULTS: AP had better reproducibility for disc area and cup volume and worse reproducibility for cup/disc area ratio and vertical cup/disc ratio, when the measurements were compared to the MP, SLO and OCT methods. CONCLUSION: AP provides a useful technique for an objective quantitative assessment of 3D ONH structures. PMID:20588996
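A minimal sketch of the planimetric parameters named above, computed from binary disc and cup masks. The masks here are toy ellipses and the pixel scale is assumed; the algorithm's actual stereoscopic segmentation, the hard part, is not reproduced:

```python
# Hedged sketch: ONH parameters from binary disc/cup masks.
import numpy as np

yy, xx = np.mgrid[:200, :200]
disc = (yy - 100)**2 / 80**2 + (xx - 100)**2 / 70**2 <= 1   # toy ellipse
cup  = (yy - 100)**2 / 40**2 + (xx - 100)**2 / 30**2 <= 1

pixel_area = 0.01**2                 # mm^2 per pixel (assumed scale)
disc_area = disc.sum() * pixel_area
cup_area = cup.sum() * pixel_area
rim_area = disc_area - cup_area
cd_area_ratio = cup_area / disc_area
# Vertical C/D: ratio of vertical extents (rows touched by each mask).
vertical_cd = cup.any(axis=1).sum() / disc.any(axis=1).sum()
print(f"disc {disc_area:.2f} mm^2, rim {rim_area:.2f} mm^2, "
      f"C/D area {cd_area_ratio:.2f}, vertical C/D {vertical_cd:.2f}")
```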
Participatory action research: involving students in parent education.
Fowler, Cathrine; Wu, Cynthia; Lam, Winsome
2014-01-01
Competition for scarce clinical placements has increased, requiring new and innovative models to meet the growing need. A participatory action research (PAR) project was used to provide a community nursing clinical experience of involvement in parent education. Nine Hong Kong nursing students self-selected to participate in the project to implement a parenting program called Parenting Young Children in a Digital World. Three project cycles were used: needs identification, skills development and program implementation. Students were fully involved in the planning, action and reflection phase of each cycle. Qualitative and quantitative data were collected to inform the project. The overall outcome of the project was the provision of a rich and viable clinical placement experience that created significant learning opportunities for the students and researchers. This paper will explore the students' participation in this PAR project as an innovative clinical practice opportunity. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Gupta, Anshu; Krishnan, Badri; Nielsen, Alex B.; Schnetter, Erik
2018-04-01
The behavior of quasilocal black hole horizons in a binary black hole merger is studied numerically. We compute the horizon multipole moments, fluxes, and other quantities on black hole horizons throughout the merger. These lead to a better qualitative and quantitative understanding of the coalescence of two black holes: how the final black hole is formed, initially grows, and then settles down to a Kerr black hole. We calculate the rate at which the final black hole approaches equilibrium in a fully nonperturbative situation and identify a time at which the linear ringdown phase begins. Finally, we provide additional support for the conjecture that fields at the horizon are correlated with fields in the wave zone by comparing the in-falling gravitational wave flux at the horizon to the outgoing flux as estimated from the gravitational waveform.
Foglia, L.; Hill, Mary C.; Mehl, Steffen W.; Burlando, P.
2009-01-01
We evaluate the utility of three interrelated means of using data to calibrate the fully distributed rainfall‐runoff model TOPKAPI as applied to the Maggia Valley drainage area in Switzerland. The use of error‐based weighting of observation and prior information data, local sensitivity analysis, and single‐objective function nonlinear regression provides quantitative evaluation of sensitivity of the 35 model parameters to the data, identification of data types most important to the calibration, and identification of correlations among parameters that contribute to nonuniqueness. Sensitivity analysis required only 71 model runs, and regression required about 50 model runs. The approach presented appears to be ideal for evaluation of models with long run times or as a preliminary step to more computationally demanding methods. The statistics used include composite scaled sensitivities, parameter correlation coefficients, leverage, Cook's D, and DFBETAS. Tests suggest predictive ability of the calibrated model typical of hydrologic models.
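A minimal sketch of composite scaled sensitivities, the screening statistic used to rank parameters. The Jacobian, weights, and parameter values below are random placeholders rather than TOPKAPI output:

```python
# Hedged sketch: composite scaled sensitivities (CSS) from a model Jacobian.
import numpy as np

rng = np.random.default_rng(3)
n_obs, n_par = 120, 6
J = rng.normal(size=(n_obs, n_par))     # d(sim)/d(param), from perturbation runs
w = rng.uniform(0.5, 2.0, n_obs)        # error-based observation weights
b = rng.uniform(0.1, 10.0, n_par)       # current parameter values

# Dimensionless scaled sensitivities, then root-mean-square per parameter.
dss = J * b[None, :] * np.sqrt(w)[:, None]
css = np.sqrt((dss**2).mean(axis=0))
ranking = np.argsort(css)[::-1]
print("parameters ranked by CSS:", ranking, css[ranking].round(2))
```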
NASA Astrophysics Data System (ADS)
Yong, Yan Ling; Tan, Li Kuo; McLaughlin, Robert A.; Chee, Kok Han; Liew, Yih Miin
2017-12-01
Intravascular optical coherence tomography (OCT) is an optical imaging modality commonly used in the assessment of coronary artery diseases during percutaneous coronary intervention. Manual segmentation to assess luminal stenosis from OCT pullback scans is challenging and time consuming. We propose a linear-regression convolutional neural network to automatically perform vessel lumen segmentation, parameterized in terms of radial distances from the catheter centroid in polar space. Benchmarked against gold-standard manual segmentation, our proposed algorithm achieves an average locational accuracy of the vessel wall of 22 microns, and 0.985 and 0.970 in Dice coefficient and Jaccard similarity index, respectively. The average absolute error of luminal area estimation is 1.38%. The processing rate is 40.6 ms per image, suggesting the potential to be incorporated into a clinical workflow and to provide quantitative assessment of vessel lumen in an intraoperative time frame.
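For reference, the two overlap metrics quoted above, in a minimal sketch on placeholder masks:

```python
# Hedged sketch: Dice coefficient and Jaccard index between two binary masks.
import numpy as np

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    return 2 * (a & b).sum() / (a.sum() + b.sum())

def jaccard(a, b):
    a, b = a.astype(bool), b.astype(bool)
    return (a & b).sum() / (a | b).sum()

rng = np.random.default_rng(4)
auto = rng.random((256, 256)) > 0.5        # placeholder "automated" mask
manual = auto.copy()
manual[:10] = ~manual[:10]                 # simulate a small disagreement
print(f"Dice {dice(auto, manual):.3f}, Jaccard {jaccard(auto, manual):.3f}")
```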
Functional Advantages of Conserved Intrinsic Disorder in RNA-Binding Proteins.
Varadi, Mihaly; Zsolyomi, Fruzsina; Guharoy, Mainak; Tompa, Peter
2015-01-01
Proteins form large macromolecular assemblies with RNA that govern essential molecular processes. RNA-binding proteins have often been associated with conformational flexibility, yet the extent and functional implications of their intrinsic disorder have never been fully assessed. Here, through large-scale analysis of comprehensive protein sequence and structure datasets we demonstrate the prevalence of intrinsic structural disorder in RNA-binding proteins and domains. We addressed their functionality through a quantitative description of the evolutionary conservation of disordered segments involved in binding, and investigated the structural implications of flexibility in terms of conformational stability and interface formation. We conclude that the functional role of intrinsically disordered protein segments in RNA-binding is two-fold: first, these regions establish extended, conserved electrostatic interfaces with RNAs via induced fit. Second, conformational flexibility enables them to target different RNA partners, providing multi-functionality, while also ensuring specificity. These findings emphasize the functional importance of intrinsically disordered regions in RNA-binding proteins.
ERIC Educational Resources Information Center
Klinkenberg, Laurel Beth
2013-01-01
In recent years, community colleges have come into the spotlight nationally in terms of their potential to assist in the revitalization of the economy. This has resulted in an increased need for community colleges to understand more fully the factors that influence student persistence. The purpose of this quasi-experimental study was to…
Three-Dimensional Computer Graphics Brain-Mapping Project.
1987-03-15
NEUROQUANT. This package was directed towards quantitative microneuroanatomic data acquisition and analysis. Using this interface, image frames captured...populations of brains. This would have been a prohibitive task if done manually with a densitometer and film, due to user error and bias. NEUROQUANT functioned...of cells were of interest. NEUROQUANT is presently being implemented with a more fully automatic method of localizing the cell bodies directly
Code of Federal Regulations, 2010 CFR
2010-01-01
... annuity or partially reduced annuity to provide a former spouse annuity. 831.632 Section 831.632...) RETIREMENT Survivor Annuities Post-Retirement Elections § 831.632 Post-retirement election of fully reduced annuity or partially reduced annuity to provide a former spouse annuity. (a)(1) Except as provided in...
Rambo, Robert P.; Tainer, John A.
2011-01-01
Unstructured proteins, RNA or DNA components provide functionally important flexibility that is key to many macromolecular assemblies throughout cell biology. As objective, quantitative experimental measures of flexibility and disorder in solution are limited, small angle scattering (SAS), and in particular small angle X-ray scattering (SAXS), provides a critical technology to assess macromolecular flexibility as well as shape and assembly. Here, we consider the Porod-Debye law as a powerful tool for detecting biopolymer flexibility in SAS experiments. We show that the Porod-Debye region fundamentally describes the nature of the scattering intensity decay, which captures information needed for distinguishing between folded and flexible particles. Particularly for comparative SAS experiments, application of the law, as described here, can distinguish between discrete conformational changes and localized flexibility relevant to molecular recognition and interaction networks. This approach aids insightful analyses of fully and partly flexible macromolecules that are more robust and conclusive than traditional Kratky analyses. Furthermore, we demonstrate for prototypic SAXS data that the ability to calculate particle density by the Porod-Debye criteria, as shown here, provides an objective quality assurance parameter that may prove of general use for SAXS modeling and validation. PMID:21509745
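A minimal sketch of a Porod-Debye plot: q⁴I(q) develops a plateau for a compact particle with sharp interfaces but not for a flexible chain. The two intensity curves are schematic stand-ins, not measured SAXS data:

```python
# Hedged sketch: Porod-Debye flatness check on two schematic decay curves.
import numpy as np

q = np.linspace(0.01, 0.3, 300)          # scattering vector, 1/Angstrom
compact  = 1.0 / (1 + (q * 30)**2)**2    # sharp-interface-like decay ~ q^-4
flexible = 1.0 / (1 + (q * 30)**2)       # Debye-like decay ~ q^-2

for name, i_q in [("compact", compact), ("flexible", flexible)]:
    y = q**4 * i_q                       # Porod-Debye transform
    tail = y[-50:]
    flatness = tail.std() / tail.mean()  # small => plateau => folded
    print(f"{name}: relative variation of q^4*I(q) in tail = {flatness:.3f}")
```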
Quantitative Cryo-Scanning Transmission Electron Microscopy of Biological Materials.
Elbaum, Michael
2018-05-11
Electron tomography provides a detailed view into the 3D structure of biological cells and tissues. Physical fixation by vitrification of the aqueous medium provides the most faithful preservation of biological specimens in the native, fully hydrated state. Cryo-microscopy is challenging, however, because of the sensitivity to electron irradiation and due to the weak electron scattering of organic material. Tomography is even more challenging because of the dependence on multiple exposures of the same area. Tomographic imaging is typically performed in wide-field transmission electron microscopy (TEM) mode with phase contrast generated by defocus. Scanning transmission electron microscopy (STEM) is an alternative mode based on detection of scattering from a focused probe beam, without imaging optics following the specimen. While careful configuration of the illumination and detectors is required to generate useful contrast, STEM circumvents the major restrictions of phase contrast TEM to very thin specimens and provides a signal that is more simply interpreted in terms of local composition and density. STEM has gained popularity in recent years for materials science. The extension of STEM to cryomicroscopy and tomography of cells and macromolecules is summarized herein. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Cryogenic Pressure Control Modeling for Ellipsoidal Space Tanks
NASA Technical Reports Server (NTRS)
Lopez, Alfredo; Grayson, Gary D.; Chandler, Frank O.; Hastings, Leon J.; Heyadat, Ali
2007-01-01
A computational fluid dynamics (CFD) model is developed to simulate pressure control of an ellipsoidal-shaped liquid hydrogen tank under external heating in normal gravity. Pressure control is provided by an axial jet thermodynamic vent system (TVS) centered within the vessel that injects cooler liquid into the tank, mixing the contents and reducing tank pressure. The two-phase cryogenic tank model considers liquid hydrogen in its own vapor with liquid density varying with temperature only and a fully compressible ullage. The axisymmetric model is developed using a custom version of the commercially available FLOW-3D software. Quantitative model validation is provided by engineering checkout tests performed at Marshall Space Flight Center in 1999 in support of the Solar Thermal Upper Stage Technology Demonstrator (STUSTD) program. The engineering checkout tests provide cryogenic tank self-pressurization test data at various heat leaks and tank fill levels. The predicted self-pressurization rates and ullage and liquid temperatures at discrete locations within the STUSTD tank are in good agreement with test data. The work presented here advances current CFD modeling capabilities for cryogenic pressure control and helps develop a low cost CFD-based design process for space hardware.
Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.
2016-01-01
Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969
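A minimal sketch of the clustering step on binary code profiles, using K-means (one of the three methods compared). The 0/1 matrix is simulated with two planted participant profiles, in the spirit of the first study's simulations; the `seed` keyword assumes a reasonably recent SciPy:

```python
# Hedged sketch: K-means clustering of binary qualitative code profiles.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(5)
p_a = np.full(12, 0.8); p_a[:6] = 0.2     # code-endorsement probabilities, group A
p_b = 1 - p_a                             # mirror-image profile, group B
data = np.vstack([
    rng.random((25, 12)) < p_a,           # 25 participants near profile A
    rng.random((25, 12)) < p_b,           # 25 participants near profile B
]).astype(float)

centroids, labels = kmeans2(data, k=2, minit="++", seed=0)
truth = np.repeat([0, 1], 25)
# Cluster labels are arbitrary, so score against both assignments.
accuracy = max((labels == truth).mean(), (labels != truth).mean())
print(f"cluster assignment accuracy: {accuracy:.2f}")
```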
Iancu, Ovidiu D; Darakjian, Priscila; Kawane, Sunita; Bottomly, Daniel; Hitzemann, Robert; McWeeney, Shannon
2012-01-01
Complex Mus musculus crosses, e.g., heterogeneous stock (HS), provide increased resolution for quantitative trait loci detection. However, increased genetic complexity challenges detection methods, with discordant results due to low data quality or complex genetic architecture. We quantified the impact of these factors across three mouse crosses and two different detection methods, identifying procedures that greatly improve detection quality. Importantly, HS populations have complex genetic architectures not fully captured by the whole genome kinship matrix, calling for incorporating chromosome-specific relatedness information. We analyze three increasingly complex crosses, using gene expression levels as quantitative traits. The three crosses were an F(2) intercross, a HS formed by crossing four inbred strains (HS4), and a HS (HS-CC) derived from the eight lines found in the collaborative cross. Brain (striatum) gene expression and genotype data were obtained using the Illumina platform. We found large disparities between methods, with concordance varying as genetic complexity increased; this problem was more acute for probes with distant regulatory elements (trans). A suite of data filtering steps resulted in substantial increases in reproducibility. Genetic relatedness between samples generated an overabundance of detected eQTLs; an adjustment procedure that includes the kinship matrix attenuates this problem. However, we find that relatedness between individuals is not evenly distributed across the genome; information from distinct chromosomes results in relatedness structure different from the whole genome kinship matrix. Shared polymorphisms from distinct chromosomes collectively affect expression levels, confounding eQTL detection. We suggest that considering chromosome-specific relatedness can result in improved eQTL detection.
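A minimal sketch of the chromosome-specific adjustment suggested above: a whole-genome kinship matrix versus leave-one-chromosome-out (LOCO) variants. Genotypes are simulated placeholders, and the marker-based kinship estimator is one common choice, not necessarily the authors':

```python
# Hedged sketch: whole-genome vs chromosome-specific (LOCO) kinship matrices.
import numpy as np

rng = np.random.default_rng(6)
n, m = 100, 2000                            # individuals, markers
chrom = rng.integers(1, 20, m)              # chromosome of each marker (1..19)
G = rng.binomial(2, 0.3, size=(n, m)).astype(float)
G -= G.mean(axis=0)                         # center genotypes
G /= G.std(axis=0) + 1e-9                   # standardize

def kinship(cols):
    """Marker-based kinship from the selected genotype columns."""
    return G[:, cols] @ G[:, cols].T / cols.sum()

K_all = kinship(np.ones(m, dtype=bool))
K_loco = {c: kinship(chrom != c) for c in range(1, 20)}
# When testing an eQTL on chromosome c, adjust with K_loco[c], not K_all.
print(K_all.shape, len(K_loco))
```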
NASA Astrophysics Data System (ADS)
Woolford, Alison; Holden, Marcia; Salit, Marc; Burns, Malcolm; Ellison, Stephen L. R.
2009-01-01
Key comparison CCQM-K61 was performed to demonstrate and document the capability of interested national metrology institutes in the determination of the quantity of specific DNA target in an aqueous solution. The study provides support for the following measurement claim: "Quantitation of a linearised plasmid DNA, based on a matched standard in a matrix of non-target DNA". The comparison was an activity of the Bioanalysis Working Group (BAWG) of the Comité Consultatif pour la Quantité de Matière and was coordinated by NIST (Gaithersburg, USA) and LGC (Teddington, UK). The following laboratories (in alphabetical order) participated in this key comparison: DMSC (Thailand); IRMM (European Union); KRISS (Republic of Korea); LGC (UK); NIM (China); NIST (USA); NMIA (Australia); NMIJ (Japan); VNIIM (Russian Federation). Good agreement was observed between the reported results of all nine of the participants. Uncertainty estimates did not account fully for the dispersion of results even after allowance for possible inhomogeneity in calibration materials. Preliminary studies suggest that the effects of fluorescence threshold setting might contribute to the excess dispersion, and further study of this topic is suggested. Main text: to reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).
2011-01-01
Background: Several materials are available on the market that work on the principle of magnetic fishing of proteins by their histidine (His) tags. Little information is available on their performance, and it is often claimed that greatly improved purification of His-tagged proteins from crude extracts can be achieved. While some commercial magnetic matrices can be used successfully for purification of several His-tagged proteins, others have proved to work for only a limited range of His-tagged proteins. Here, we address the quantitative evaluation of three commercially available nickel nanomagnetic beads for purification of two His-tagged proteins expressed in Escherichia coli and present helpful hints for optimized purification of such proteins and preparation of nanomagnetisable matrices. Results: Marked differences in the performance of the nanomagnetic matrices were obtained, principally on the basis of their specific binding capacity, recovery profile, the amount of imidazole needed for protein elution, and the extent of target protein loss and purity. Based on these criteria, one of the materials (SiMAG/N-NTA/Nickel) gave the best purification results for both proteins at a concentration of 4 mg/ml, while the other two (SiMAC-Nickel and SiMAG/CS-NTA/Nickel) did not perform well with respect to specific binding capacity and recovery profile. Conclusions: Taken together, the functionality of different types of nanomagnetic matrices varies considerably. This variability may depend not only on the structure and surface chemistry of the matrix, which in turn determine the affinity of the interaction, but also, to a lesser extent, on the physical properties of the protein itself. Although the results of the present study may not apply fully to all nanomagnetic matrices, they provide a framework for profiling and quantitatively evaluating other magnetisable matrices, as well as helpful hints for researchers facing the same challenge. PMID:21824404
In situ semi-quantitative analysis of polluted soils by laser-induced breakdown spectroscopy (LIBS).
Ismaël, Amina; Bousquet, Bruno; Michel-Le Pierrès, Karine; Travaillé, Grégoire; Canioni, Lionel; Roy, Stéphane
2011-05-01
Time-saving, low-cost analyses of soil contamination are required to ensure fast and efficient pollution removal and remedial operations. In this work, laser-induced breakdown spectroscopy (LIBS) has been successfully applied to in situ analyses of polluted soils, providing direct semi-quantitative information about the extent of pollution. A field campaign has been carried out in Brittany (France) on a site presenting high levels of heavy metal concentrations. Results on iron as a major component as well as on lead and copper as minor components are reported. Soil samples were dried and prepared as pressed pellets to minimize the effects of moisture and density on the results. LIBS analyses were performed with a Nd:YAG laser operating at 1064 nm, 60 mJ per 10 ns pulse, at a repetition rate of 10 Hz with a diameter of 500 μm on the sample surface. Good correlations were obtained between the LIBS signals and the values of concentrations deduced from inductively coupled plasma atomic emission spectroscopy (ICP-AES). This result proves that LIBS is an efficient method for optimizing sampling operations. Indeed, "LIBS maps" were established directly on-site, providing valuable assistance in optimizing the selection of the most relevant samples for future expensive and time-consuming laboratory analysis and avoiding useless analyses of very similar samples. Finally, it is emphasized that in situ LIBS is not described here as an alternative quantitative analytical method to the usual laboratory measurements but simply as an efficient time-saving tool to optimize sampling operations and to drastically reduce the number of soil samples to be analyzed, thus reducing costs. The detection limits of 200 ppm for lead and 80 ppm for copper reported here are compatible with the thresholds of toxicity; thus, this in situ LIBS campaign was fully validated for these two elements. Consequently, further experiments are planned to extend this study to other chemical elements and other matrices of soils.
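A hedged sketch of the calibration step, with invented intensities and concentrations rather than the campaign's data: LIBS line intensity is regressed against ICP-AES reference concentrations, and a 3-sigma detection limit is estimated from an assumed blank noise.

```python
# Hedged sketch (invented numbers): calibrating a LIBS line intensity against
# ICP-AES reference concentrations and estimating a 3-sigma detection limit.
import numpy as np

conc = np.array([50., 100., 250., 500., 1000.])   # ppm, ICP-AES reference
signal = np.array([0.9, 2.1, 5.2, 10.3, 20.8])    # LIBS peak intensity (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((signal - pred)**2) / np.sum((signal - signal.mean())**2)

sigma_blank = 0.06                                 # assumed blank noise (a.u.)
lod = 3 * sigma_blank / slope                      # 3-sigma detection limit

print(f"R^2 = {r2:.4f}, LOD ~ {lod:.0f} ppm")
```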
Recruitment of Community College Students Into a Web-Assisted Tobacco Intervention Study
Johnson, Tye; Wall, Andrew F; Prokhorov, Alexander V; Calabro, Karen Sue; Ververs, Duncan; Assibey-Mensah, Vanessa; Ossip, Deborah J
2017-01-01
Background United States college students, particularly those attending community colleges, have higher smoking rates than the national average. Recruitment of such smokers into research studies has not been studied in depth, despite a moderate amount information on study recruitment success with smokers from traditional four-year colleges. Recruitment channels and success are evolving as technology evolves, so it is important to understand how to best target, implement, and evaluate recruitment strategies. Objective The aim of this paper is to both qualitatively and quantitatively explore recruitment channels (eg, mass email, in-person referral, posted materials) and their success with enrollment into a Web-Assisted Tobacco Intervention study in this priority population of underserved and understudied smokers. Methods Qualitative research methods included key informant interviews (n=18) and four focus groups (n=37). Quantitative research methods included observed online responsiveness to any channel (n=10,914), responses from those completing online screening and study consent (n=2696), and responses to a baseline questionnaire from the fully enrolled study participants (n=1452). Results Qualitative results prior to recruitment provided insights regarding the selection of a variety of recruitment channels proposed to be successful, and provided context for the unique attributes of the study sample. Quantitative analysis of self-reported channels used to engage with students, and to enroll participants into the study, revealed the relative utilization of channels at several recruitment points. The use of mass emails to the student body was reported by the final sample as the most influential channel, accounting for 60.54% (879/1452) of the total enrolled sample. Conclusions Relative channel efficiency was analyzed across a wide variety of channels. One primary channel (mass emails) and a small number of secondary channels (including college websites and learning management systems) accounted for most of the recruitment success. Trial Registration ClinicalTrials.gov NCT01692730; https://clinicaltrials.gov/ct2/show/NCT01692730 (Archived by WebCite at http://www.webcitation.org/6qEcFQN9Q) PMID:28483741
Vikingsson, Svante; Dahlberg, Jan-Olof; Hansson, Johan; Höiom, Veronica; Gréen, Henrik
2017-06-01
Dabrafenib is an inhibitor of BRAF V600E used for treating metastatic melanoma, but a majority of patients experience adverse effects. Methods to measure the levels of dabrafenib and its major metabolites during treatment are needed to allow development of individualized dosing strategies to reduce the burden of such adverse events. In this study, an LC-MS/MS method capable of measuring dabrafenib quantitatively and six metabolites semi-quantitatively is presented. The method is fully validated with regard to dabrafenib in human plasma in the range 5-5000 ng/mL. The analytes were separated on a C18 column after protein precipitation and detected in positive electrospray ionization mode using a Xevo TQ triple quadrupole mass spectrometer. As no commercial reference standards are available, the calibration curve of dabrafenib was used for semi-quantification of the dabrafenib metabolites. Compared to earlier methods, the presented method represents a simpler and more cost-effective approach suitable for clinical studies. Graphical abstract: combined multiple reaction monitoring transitions of dabrafenib and metabolites in a typical case sample.
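The semi-quantification strategy can be sketched as follows, with illustrative peak areas rather than the paper's values: the dabrafenib calibration curve is reused for metabolites lacking reference standards, so the outputs are parent-equivalent concentrations.

```python
# Sketch under stated assumptions (illustrative areas, not the paper's data):
# a dabrafenib calibration curve reused to semi-quantify metabolites for
# which no reference standards exist.
import numpy as np

cal_conc = np.array([5, 50, 500, 5000.])        # ng/mL dabrafenib standards
cal_area = np.array([1.1e3, 1.0e4, 1.1e5, 1.0e6])
slope, intercept = np.polyfit(cal_conc, cal_area, 1)

def semi_quant(area):
    """Apply the parent-drug curve; metabolite response factors may differ,
    so results are semi-quantitative by construction."""
    return (area - intercept) / slope

# hypothetical sample peak areas
sample_areas = {"dabrafenib": 2.4e5, "metabolite A": 6.1e4, "metabolite B": 1.8e4}
for name, area in sample_areas.items():
    print(f"{name}: ~{semi_quant(area):.0f} ng/mL (parent-equivalents)")
```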
Chen, Weiqi; Wang, Lifei; Van Berkel, Gary J; Kertesz, Vilmos; Gan, Jinping
2016-03-25
Herein, quantitation aspects of a fully automated autosampler/HPLC-MS/MS system applied for unattended droplet-based surface sampling of repaglinide-dosed thin tissue sections, with subsequent HPLC separation and mass spectrometric analysis of parent drug and various drug metabolites, were studied. Major organs (brain, lung, liver, kidney and muscle) from whole-body thin tissue sections and corresponding organ homogenates prepared from repaglinide-dosed mice were sampled by surface sampling and by bulk extraction, respectively, and analyzed by HPLC-MS/MS. A semi-quantitative agreement between data obtained by surface sampling and that obtained by organ homogenate extraction was observed. Drug concentrations obtained by the two methods followed the same patterns for post-dose time points (0.25, 0.5, 1 and 2 h). Drug amounts determined in the specific tissues were typically higher when analyzing extracts from the organ homogenates. In addition, relative comparison of the levels of individual metabolites between the two analytical methods also revealed good semi-quantitative agreement. Copyright © 2015 Elsevier B.V. All rights reserved.
Olea, Ricardo A.; Luppens, James A.
2012-01-01
There are multiple ways to characterize uncertainty in the assessment of coal resources, but not all of them are equally satisfactory. Increasingly, the tendency is toward borrowing from the statistical tools developed in the last 50 years for the quantitative assessment of other mineral commodities. Here, we briefly review the most recent of such methods and formulate a procedure for the systematic assessment of multi-seam coal deposits taking into account several geological factors, such as fluctuations in thickness, erosion, oxidation, and bed boundaries. A lignite deposit explored in three stages is used for validating models based on comparing a first set of drill holes against data from infill and development drilling. Results were fully consistent with reality, providing a variety of maps, histograms, and scatterplots characterizing the deposit and associated uncertainty in the assessments. The geostatistical approach was particularly informative in providing a probability distribution modeling deposit wide uncertainty about total resources and a cumulative distribution of coal tonnage as a function of local uncertainty.
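For readers unfamiliar with the geostatistical machinery, a compact ordinary-kriging sketch (the drill-hole coordinates, thicknesses, and variogram parameters are all toy assumptions) shows how a thickness estimate and its kriging variance, the local uncertainty the abstract refers to, are obtained at an unsampled location.

```python
# Minimal ordinary-kriging sketch (toy data, assumed exponential covariance);
# a schematic of the approach, not the paper's full multi-seam workflow.
import numpy as np

def exp_cov(h, sill=1.0, rng=500.0):
    # exponential covariance model with practical range `rng`
    return sill * np.exp(-3.0 * h / rng)

xy = np.array([[0, 0], [100, 0], [0, 100], [100, 100]], float)  # drill holes
z = np.array([2.1, 2.5, 1.8, 2.3])                              # seam thickness, m
x0 = np.array([50.0, 50.0])                                     # target location

n = len(z)
# kriging system with a Lagrange multiplier enforcing unbiasedness
A = np.ones((n + 1, n + 1)); A[-1, -1] = 0.0
D = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
A[:n, :n] = exp_cov(D)
b = np.ones(n + 1)
b[:n] = exp_cov(np.linalg.norm(xy - x0, axis=1))

w = np.linalg.solve(A, b)          # weights + Lagrange multiplier
est = w[:n] @ z                    # kriged thickness estimate
var = exp_cov(0.0) - w @ b         # kriging (local uncertainty) variance
print(f"estimate = {est:.2f} m, kriging variance = {var:.3f}")
```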
The Role of Narratives in Sociohydrological Models of Flood Behaviors
NASA Astrophysics Data System (ADS)
Leong, Ching
2018-04-01
While current efforts to model sociohydrologic phenomena provide crucial insight, critics argue that these do not fully reflect the complexity one observes empirically in the real world. The policy sciences, with their focus on the interaction between human agency and the institutions that constrain public choice, can complement such efforts by providing a narrative approach. This paper demonstrates this complementarity by investigating the idea of resilience in a community response to floods. Using the quantitative Q methodology, we trace the dynamics of a common sociohydrologic hypothesis, the "memory effect": how it decreases vulnerability and, more crucially, the instances when such memory effects do not obtain. Our analysis of a flood-prone maladaptive community in Assam, India, finds four distinct narrative types: the Hardened Preparer, the Engineer, the Discontent, and the Pessimist. This paper puts forward an explicitly sociohydrological conception of resilience that takes into account the role of sociological indicators such as narrative types and perceptions. Such contextual understandings and narrative types can form the basis of generic resilience indicators that complement the anticipated outcomes of sociohydrologic models generally.
Translating the simulation of procedural drilling techniques for interactive neurosurgical training.
Stredney, Don; Rezai, Ali R; Prevedello, Daniel M; Elder, J Bradley; Kerwin, Thomas; Hittle, Bradley; Wiet, Gregory J
2013-10-01
Through previous efforts we have developed a fully virtual environment to provide procedural training in otologic surgical technique. The virtual environment is based on high-resolution volumetric data of the regional anatomy. These volumetric data help drive an interactive multisensory, i.e., visual (stereo), aural (stereo), and tactile, simulation environment. Subsequently, we have extended our efforts to support the training of neurosurgical procedural technique as part of the Congress of Neurological Surgeons simulation initiative. Our objective was to deliberately study the integration of simulation technologies into the neurosurgical curriculum and to determine their efficacy in teaching minimally invasive cranial and skull base approaches. We discuss issues of biofidelity and our methods to provide objective, quantitative, and automated assessment for the residents. We conclude with a discussion of our experiences, reporting preliminary formative pilot studies and proposed approaches to take the simulation to the next level through additional validation studies. We have presented our efforts to translate an otologic simulation environment for use in the neurosurgical curriculum. We have demonstrated the initial proof of principle and defined the steps to integrate and validate the system as an adjuvant to the neurosurgical curriculum.
Modifying the Human-Machine Interface Based on Quantitative Measurements of the Level of Awareness
NASA Technical Reports Server (NTRS)
Freund, Louis E.; Knapp, Benjamin
1999-01-01
This project got underway during the summer of 1998, before funding was approved. The initial project steps were to identify previously published work in the fields of error classification systems, physiological measurements of awareness, and related topics. This agenda was modified at the request of NASA Ames in August 1998 to include supporting the new Cargo Air Association (CAA) evaluation of the Human Factors related to the ADS-B technology. Additional funding was promised to fully support both efforts. Work on library research ended in the late fall of 1998, when the SJSU project directors were informed that NASA would not be adding to the initial funding of the research project as had been initially committed. However, NASA did provide additional funding for the CAA project activity. NASA elected to leave the research grant in place to provide a pathway for the CAA project funding to SJSU (San Jose State University) to support Dr. Freund's work on the CAA tasks. Dr. Knapp essentially terminated his involvement with the project at this time.
Fang, Qi; Curatolo, Andrea; Wijesinghe, Philip; Yeow, Yen Ling; Hamzah, Juliana; Noble, Peter B.; Karnowski, Karol; Sampson, David D.; Ganss, Ruth; Kim, Jun Ki; Lee, Woei M.; Kennedy, Brendan F.
2017-01-01
In this paper, we describe a technique capable of visualizing mechanical properties at the cellular scale deep in living tissue, by incorporating a gradient-index (GRIN)-lens micro-endoscope into an ultrahigh-resolution optical coherence elastography system. The optical system, after the endoscope, has a lateral resolution of 1.6 µm and an axial resolution of 2.2 µm. Bessel beam illumination and Gaussian mode detection are used to provide an extended depth-of-field of 80 µm, which is a 4-fold improvement over a fully Gaussian beam case with the same lateral resolution. Using this system, we demonstrate quantitative elasticity imaging of a soft silicone phantom containing a stiff inclusion and a freshly excised malignant murine pancreatic tumor. We also demonstrate qualitative strain imaging below the tissue surface on in situ murine muscle. The approach we introduce here can provide high-quality extended-focus images through a micro-endoscope with potential to measure cellular-scale mechanics deep in tissue. We believe this tool is promising for studying biological processes and disease progression in vivo. PMID:29188108
Phosphorus detection in vitrified bacteria by cryo-STEM annular dark-field analysis.
Wolf, Sharon Grayer; Rez, Peter; Elbaum, Michael
2015-11-01
Bacterial cells often contain dense granules. Among these, polyphosphate bodies (PPBs) store inorganic phosphate for a variety of essential functions. Identification of PPBs has until now been accomplished by analytical methods that required drying or chemically fixing the cells. These methods entail large electron doses that are incompatible with low-dose imaging of cryogenic specimens. We show here that Scanning Transmission Electron Microscopy (STEM) of fully hydrated, intact, vitrified bacteria provides a simple means for mapping of phosphorus-containing dense granules based on quantitative sensitivity of the electron scattering to atomic number. A coarse resolution of the scattering angles distinguishes phosphorus from the abundant lighter atoms: carbon, nitrogen and oxygen. The theoretical basis is similar to Z contrast of materials science. EDX provides a positive identification of phosphorus, but importantly, the method need not involve a more severe electron dose than that required for imaging. The approach should prove useful in general for mapping of heavy elements in cryopreserved specimens when the element identity is known from the biological context. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
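The contrast mechanism can be roughly quantified with the textbook Z-contrast scaling (a back-of-envelope restatement, not the paper's quantitative model): annular dark-field intensity rises steeply with atomic number, so phosphorus (Z = 15) scatters several times more strongly per atom than carbon, nitrogen, or oxygen.

```latex
% Back-of-envelope Z-contrast estimate (standard textbook scaling; the
% exponent n ~ 1.7 is an empirical value for ADF detectors, n = 2 in the
% Rutherford limit).
\[
  I_{\mathrm{ADF}} \propto Z^{n}, \qquad n \approx 1.7\text{--}2
\]
\[
  \frac{I_{\mathrm{P}}}{I_{\mathrm{C}}} \approx \left(\frac{15}{6}\right)^{1.7} \approx 4.7,
  \qquad
  \frac{I_{\mathrm{P}}}{I_{\mathrm{O}}} \approx \left(\frac{15}{8}\right)^{1.7} \approx 2.9
\]
```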
Liu, Jie; Zhuang, Xiahai; Wu, Lianming; An, Dongaolei; Xu, Jianrong; Peters, Terry; Gu, Lixu
2017-11-01
Objective: In this paper, we propose a fully automatic framework for myocardium segmentation of delayed-enhancement (DE) MRI images without relying on prior patient-specific information. Methods: We employ a multicomponent Gaussian mixture model to deal with the intensity heterogeneity of myocardium caused by the infarcts. To differentiate the myocardium from other tissues with similar intensities, while at the same time maintain spatial continuity, we introduce a coupled level set (CLS) to regularize the posterior probability. The CLS, as a spatial regularization, can be adapted to the image characteristics dynamically. We also introduce an image intensity gradient based term into the CLS, adding an extra force to the posterior probability based framework, to improve the accuracy of myocardium boundary delineation. The prebuilt atlases are propagated to the target image to initialize the framework. Results: The proposed method was tested on datasets of 22 clinical cases, and achieved Dice similarity coefficients of 87.43 ± 5.62% (endocardium), 90.53 ± 3.20% (epicardium) and 73.58 ± 5.58% (myocardium), which have outperformed three variants of the classic segmentation methods. Conclusion: The results can provide a benchmark for the myocardial segmentation in the literature. Significance: DE MRI provides an important tool to assess the viability of myocardium. The accurate segmentation of myocardium, which is a prerequisite for further quantitative analysis of myocardial infarction (MI) region, can provide important support for the diagnosis and treatment management for MI patients.
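The intensity-modeling step can be illustrated in isolation (the coupled level-set regularization is beyond a short example): this sketch fits a multicomponent Gaussian mixture to synthetic one-channel intensities, with assumed class means for background, myocardium, and enhanced infarct, and reads out the posterior probabilities that the framework regularizes.

```python
# Minimal sketch of the intensity-modeling step only (synthetic intensities
# with assumed class means, not DE-MRI data).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Synthetic intensities: background, healthy myocardium, enhanced infarct
img = np.concatenate([
    rng.normal(40, 8, 6000),     # background
    rng.normal(110, 10, 3000),   # myocardium
    rng.normal(180, 12, 1000),   # infarct (delayed enhancement)
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=3, random_state=0).fit(img)
post = gmm.predict_proba(img)    # posterior probabilities per class
labels = post.argmax(axis=1)     # hard assignment before regularization

order = np.argsort(gmm.means_.ravel())
print("recovered class means:", gmm.means_.ravel()[order].round(1))
```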
EBSD Imaging of Monazite: a Petrochronological Tool?
NASA Astrophysics Data System (ADS)
Mottram, C. M.; Cottle, J. M.
2014-12-01
Recent advances in in-situ U-Th/Pb monazite petrochronology allow ages obtained from micron-scale portions of texturally-constrained, individual crystals to be placed directly into a quantitative Pressure-Temperature framework. However, there remain major unresolved challenges in linking monazite ages to specific deformation events and discerning the effects of deformation on the isotopic and elemental tracers in these phases. Few studies have quantitatively investigated monazite microstructure, and these studies have largely focused only on crystals produced experimentally (e.g. Reddy et al., 2010). The dispersion in age data commonly yielded from monazite U-Th/Pb datasets suggest that monazite dynamically recrystallises during deformation. It remains unclear how this continual recrystallisation is reflected in the monazite crystal structure, and how this subsequently impacts the ages (or age ranges) yielded from single crystals. Here, combined laser ablation split-stream analysis of deformed monazite, EBSD imaging and Pressure-Temperature (P-T) phase equilibria modelling is used to quantify the influence of deformation on monazite (re)crystallisation mechanisms and its subsequent effect on the crystallographic structure, ages and trace-element distribution in individual grains. These data provide links between ages and specific deformation events, thus helping further our understanding of the role of dynamic recrystallisation in producing age variation within and between crystals in a deformed rock. These data provide a new dimension to the field of petrochronology, demonstrating the importance of fully integrating the Pressure-Temperature-time-deformation history of accessory phases to better interpret the meaningfulness of ages yielded from deformed rocks. Reddy, S. et al., 2010. Mineralogical Magazine 74: 493-506
Development of optical neuroimaging to detect drug-induced brain functional changes in vivo
NASA Astrophysics Data System (ADS)
Du, Congwu; Pan, Yingtian
2014-03-01
Deficits in prefrontal function play a crucial role in compulsive cocaine use, which is a hallmark of addiction. Dysfunction of the prefrontal cortex might result from effects of cocaine on neurons as well as from disruption of cerebral blood vessels. However, the mechanisms underlying cocaine's neurotoxic effects are not fully understood, partially due to technical limitations of current imaging techniques (e.g., PET, fMRI) to differentiate vascular from neuronal effects at sufficiently high temporal and spatial resolutions. We have recently developed a multimodal imaging platform which can simultaneously characterize the changes in cerebrovascular hemodynamics, hemoglobin oxygenation and intracellular calcium fluorescence for monitoring the effects of cocaine on the brain. Such a multimodality imaging technique (OFI) provides several uniquely important merits, including: 1) a large field-of-view, 2) high spatiotemporal resolutions, 3) quantitative 3D imaging of the cerebral blood flow (CBF) networks, 4) label-free imaging of hemodynamic changes, 5) separation of vascular compartments (e.g., arterial and venous vessels) and monitoring of cortical brain metabolic changes, 6) discrimination of cellular (neuronal) from vascular responses. These imaging features have been further advanced in combination with microprobes to form micro-OFI that allows quantification of drug effects on subcortical brain. In addition, our ultrahigh-resolution ODT (μODT) enables 3D microangiography and quantitative imaging of capillary CBF networks. These optical strategies have been used to investigate the effects of cocaine on brain physiology to facilitate the studies of brain functional changes induced by addictive substance to provide new insights into neurobiological effects of the drug on the brain.
Kline, Margaret C; Duewer, David L; Redman, Janette W; Butler, John M; Boyer, David A
2002-04-15
In collaboration with the Armed Forces Institute of Pathology's Department of Defense DNA Registry, the National Institute of Standards and Technology recently evaluated the performance of a short tandem repeat multiplex with dried whole blood stains on four different commercially available identification card matrixes. DNA from 70 stains that had been stored for 19 months at ambient temperature was extracted or directly amplified and then processed using routine methods. All four storage media provided fully typeable (qualitatively identical) samples. After standardization, the average among-locus fluorescence intensity (electropherographic peak height or area) provided a suitable metric for quantitative analysis of the relative amounts of amplifiable DNA in an archived sample. The amounts of DNA in Chelex extracts from stains on two untreated high-purity cotton linter pulp papers and a paper treated with a DNA-binding coating were essentially identical. Average intensities for the aqueous extracts from a paper treated with a DNA-releasing coating were somewhat lower but also somewhat less variable than for the Chelex extracts. Average intensities of directly amplified punches of the DNA-binding paper were much larger but somewhat more variable than the Chelex extracts. Approximately 25% of the observed variation among the intensity measurements is shared among the four media and thus can be attributed to intrinsic variation in white blood count among the donors. All of the evaluated media adequately "bank" forensically useful DNA in well-dried whole blood stains for at least 19 months at ambient temperature.
Evaporation, diffusion and self-assembly at drying interfaces.
Roger, K; Sparr, E; Wennerström, H
2018-04-18
Water evaporation from complex aqueous solutions leads to the build-up of structure and composition gradients at their interface with air. We recently introduced an experimental setup for quantitatively studying such gradients and discussed how structure formation can lead to a self-regulation mechanism for controlling water evaporation through self-assembly. Here, we provide a detailed theoretical analysis using an advection/diffusion transport equation that takes into account thermodynamically non-ideal conditions, and we directly relate the theoretical description to quantitative experimental data. We derive that the concentration profile develops according to a general square-root-of-time scaling law, which fully agrees with experimental observations. The evaporation rate notably decreases with time as t^(-1/2), which shows that diffusion in the liquid phase is the rate-limiting step for this system, in contrast to pure water evaporation. For the particular binary system that was investigated experimentally, which is composed of water and a sugar-based surfactant (α-dodecylmaltoside), the interfacial layer consists of a sequence of liquid crystalline phases of different mesostructures. We extract values for mutual diffusion coefficients of lamellar, hexagonal and micellar cubic phases, which are consistent with previously reported values and simple models. We thus provide a method to estimate the transport properties of oriented mesophases. The macroscopic humidity-independence of the evaporation rate up to 85% relative humidity is shown to result from both an extremely low mutual diffusion coefficient and the large range of water activities corresponding to relative humidities below 85%, at which the lamellar phase exists. Such a humidity self-regulation mechanism is expected for a large variety of complex systems.
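The scaling law admits a compact restatement via the standard similarity argument (a paraphrase consistent with the abstract, not the paper's full non-ideal derivation):

```latex
% Standard similarity-solution argument: for concentration-dependent
% diffusion with an evaporating boundary, profiles collapse onto a single
% similarity variable.
\[
  \frac{\partial c}{\partial t}
    = \frac{\partial}{\partial x}\!\left( D(c)\,\frac{\partial c}{\partial x} \right),
  \qquad
  \eta = \frac{x}{\sqrt{t}} \;\Rightarrow\; c = c(\eta),
\]
\[
  J(t) = -\,D(c)\,\frac{\partial c}{\partial x}\bigg|_{x=0}
       = -\,\frac{D(c)\,c'(\eta)}{\sqrt{t}}\bigg|_{\eta=0}
       \;\propto\; t^{-1/2},
\]
so the interfacial layer thickens as $\sqrt{t}$ and the evaporation rate
decays as $t^{-1/2}$ whenever liquid-phase diffusion is rate limiting.
```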
Shao, Qiang
2016-10-26
Large-scale conformational changes in proteins are important for their functions. Tracking the conformational change in real time at the level of a single protein molecule, however, remains a great challenge. In this article, we present a novel in silico approach with the combination of normal mode analysis and integrated-tempering-sampling molecular simulation (NMA-ITS) to give quantitative data for exploring the conformational transition pathway in multi-dimensional energy landscapes starting only from the knowledge of the two endpoint structures of the protein. The open-to-closed transitions of three proteins, including nCaM, AdK, and HIV-1 PR, were investigated using NMA-ITS simulations. The three proteins have varied structural flexibilities and domain communications in their respective conformational changes. The transition state structure in the conformational change of nCaM and the associated free-energy barrier are in agreement with those measured in a standard explicit-solvent REMD simulation. The experimentally measured transition intermediate structures of the intrinsically flexible AdK are captured by the conformational transition pathway measured here. The dominant transition pathways between the closed and fully open states of HIV-1 PR are very similar to those observed in recent REMD simulations. Finally, the evaluated relaxation times of the conformational transitions of three proteins are roughly at the same level as reported experimental data. Therefore, the NMA-ITS method is applicable for a variety of cases, providing both qualitative and quantitative insights into the conformational changes associated with the real functions of proteins.
Solomon, Keith R; Stephenson, Gladys L
2017-01-01
A quantitative weight of evidence (QWoE) methodology was developed and used to assess many higher-tier studies on the effects of three neonicotinoid insecticides, clothianidin (CTD), imidacloprid (IMI), and thiamethoxam (TMX), on honeybees. A general problem formulation, a conceptual model for exposures of honeybees, and an analysis plan were developed. A QWoE methodology was used to characterize the quality of the available studies from the literature and from unpublished reports of studies conducted by or for the registrants. These higher-tier studies focused on the exposures of honeybees to neonicotinoids via several matrices as measured in the field, as well as the effects in experimentally controlled field studies. Reports provided by Bayer Crop Protection and Syngenta Crop Protection and papers from the open literature were assessed in detail, using predefined criteria for quality and relevance to develop scores (on a relative scale of 0-4) to separate the higher-quality from lower-quality studies and the relevant from less-relevant results. The scores from the QWoEs were summarized graphically to illustrate the overall quality of the studies and their relevance. Through means and standard errors, this method provided graphical and numerical indications of the quality and relevance of the responses observed in the studies and the uncertainty associated with these two metrics. All analyses were conducted transparently and the derivations of the scores were fully documented. The results of these analyses are presented in three companion papers, and the QWoE analyses for each insecticide are presented in detailed supplemental information (SI) in those papers.
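The summary statistics are straightforward; a tiny sketch with hypothetical 0-4 scores shows the mean-and-standard-error aggregation described above.

```python
# Hedged sketch (hypothetical scores, not the assessment's data): summarizing
# 0-4 QWoE quality and relevance ratings by mean and standard error.
import numpy as np

scores = {
    "quality":   np.array([3.0, 2.5, 3.5, 1.5, 3.0, 2.0, 3.5]),
    "relevance": np.array([2.0, 3.0, 3.5, 2.5, 1.0, 3.0, 2.5]),
}
for metric, s in scores.items():
    mean = s.mean()
    sem = s.std(ddof=1) / np.sqrt(s.size)   # standard error of the mean
    print(f"{metric}: {mean:.2f} +/- {sem:.2f} (n={s.size})")
```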
Classification of microscopy images of Langerhans islets
NASA Astrophysics Data System (ADS)
Švihlík, Jan; Kybic, Jan; Habart, David; Berková, Zuzana; Girman, Peter; Kříž, Jan; Zacharovová, Klára
2014-03-01
Evaluation of images of Langerhans islets is a crucial procedure in planning an islet transplantation, which is a promising diabetes treatment. This paper deals with segmentation of microscopy images of Langerhans islets and evaluation of islet parameters such as area, diameter, or volume (IE). For all the available images, the ground truth and the islet parameters were independently evaluated by four medical experts. We use a pixelwise linear classifier (perceptron algorithm) and an SVM (support vector machine) for image segmentation. The volume is estimated based on circle or ellipse fitting to individual islets. The segmentations were compared with the corresponding ground truth. Quantitative islet parameters were also evaluated and compared with the parameters given by the medical experts. We conclude that the accuracy of the presented fully automatic algorithm is fully comparable with that of the medical experts.
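A hedged sketch of the parameter-extraction step (synthetic mask, assumed pixel size, and the common islet-equivalent convention of a 150 µm reference islet, which the paper may define differently): area, equivalent diameter, and IE volume are derived from a labeled segmentation.

```python
# Illustrative sketch (assumed formulas and pixel size): islet area,
# equivalent diameter, and islet-equivalent (IE) volume from a binary mask,
# approximating each islet as a sphere of the same projected area.
import numpy as np
from scipy import ndimage

mask = np.zeros((200, 200), bool)
yy, xx = np.ogrid[:200, :200]
mask |= (yy - 60)**2 + (xx - 60)**2 < 30**2     # synthetic islet 1
mask |= (yy - 140)**2 + (xx - 130)**2 < 18**2   # synthetic islet 2

labels, n = ndimage.label(mask)
um_per_px = 2.0                                  # assumed pixel size, um
for i in range(1, n + 1):
    area = (labels == i).sum() * um_per_px**2    # um^2
    d = 2 * np.sqrt(area / np.pi)                # equivalent diameter, um
    vol = (np.pi / 6) * d**3                     # sphere volume, um^3
    ie = vol / ((np.pi / 6) * 150.0**3)          # IE: 150-um reference islet
    print(f"islet {i}: d = {d:.0f} um, IE = {ie:.2f}")
```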
Towards quantitative PET/MRI: a review of MR-based attenuation correction techniques.
Hofmann, Matthias; Pichler, Bernd; Schölkopf, Bernhard; Beyer, Thomas
2009-03-01
Positron emission tomography (PET) is a fully quantitative technology for imaging metabolic pathways and dynamic processes in vivo. Attenuation correction of raw PET data is a prerequisite for quantification and is typically based on separate transmission measurements. In PET/CT, however, attenuation correction is performed routinely based on the available CT transmission data. Recently, combined PET/magnetic resonance (MR) has been proposed as a viable alternative to PET/CT. Current concepts of PET/MRI do not include CT-like transmission sources and, therefore, alternative methods of PET attenuation correction must be found. This article reviews existing approaches to MR-based attenuation correction (MR-AC). Most groups have proposed MR-AC algorithms for brain PET studies and, more recently, also for torso PET/MR imaging. Most MR-AC strategies require the use of complementary MR and transmission images, or morphology templates generated from transmission images. We review and discuss these algorithms and point out challenges for using MR-AC in clinical routine. MR-AC is work in progress, with potentially promising results from a template-based approach applicable to both brain and torso imaging. While efforts are ongoing to make clinically viable MR-AC fully automatic, further studies are required to realize the potential benefits of MR-based motion compensation and partial volume correction of the PET data.
Karanasios, Evangelos C; Tsiropoulos, Nikolaos G; Karpouzas, Dimitrios G
2013-09-01
Biobed substrates commonly exhibit high degradation capacity. However, degradation does not always lead to detoxification and information on the metabolic pathways of pesticides in biobeds is scarce. We studied the degradation and metabolism of three pesticides in selected biomixtures and soil. Biomixtures stimulated degradation of terbuthylazine and metribuzin, whereas chlorpyrifos degraded faster in soil. The latter was attributed to the lipophilicity of chlorpyrifos which increased adsorption and limited biodegradation in organic-rich biomixtures. Although the same metabolites were detected in all substrates, qualitative and quantitative differences in the metabolic routes of pesticides in the various substrates were observed. Chlorpyrifos was hydrolyzed to 3,5,6-tricholorpyridinol (TCP) which was further degraded only in compost-biomixture CBX1. Metabolism of terbuthylazine in compost biomixtures (BX) and soil resulted in the formation of desethyl-terbuthylazine (DES) which was fully degraded only in the compost-biomixture CBX2, whereas peat-based biomixture (OBX) promoted the hydroxylation of terbuthylazine. Desamino- (DA) (dominant) and diketo- (DK) metribuzin appear as intermediate metabolites in all substrates and were further transformed to desamino-diketo-metribuzin (DADK) which was fully degraded only in compost-biomixture GSBX. Overall, lower amounts of metabolites were accumulated in biomixtures compared to soil stressing the higher depuration efficiency of biobeds. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Rui-Wu; Dunn, Derek W.; Luo, Jun; He, Jun-Zhou; Shi, Lei
2015-10-01
Understanding the factors that enable mutualisms to evolve and to subsequently remain stable over time is essential to fully understand patterns of global biodiversity and for evidence-based conservation policy. Theoretically, spatial heterogeneity of mutualists, through an increased likelihood of fidelity between cooperative partners in structured populations, and 'self-restraint' of symbionts, due to selection against high levels of virulence leading to short-term host overexploitation, will result in either a positive correlation between the reproductive success of both mutualists prior to the total exploitation of any host resource, or no correlation after any host resource has been fully exploited. A quantitative review by meta-analysis of the results of 96 studies from 35 papers showed no evidence of a significant fitness correlation between mutualists across a range of systems that captured much taxonomic diversity. However, when the data were split according to four categories of host: 1) cnidarian corals, 2) woody plants, 3) herbaceous plants, and 4) insects, a significantly positive effect in corals was revealed. The trends for the remaining three categories did not significantly differ from zero. Our results suggest that stability in mutualisms requires alternative processes, or mechanisms in addition to, spatial heterogeneity of hosts and/or 'self-restraint' of symbionts.
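The pooling step of such a meta-analysis can be sketched with the standard Fisher z approach (fixed-effect for brevity; the per-study correlations and sample sizes are invented, not the 96 study results):

```python
# Minimal fixed-effect meta-analysis sketch (Fisher z; invented data):
# pooling per-study fitness correlations and testing against zero.
import numpy as np

r = np.array([0.31, -0.05, 0.12, 0.45, 0.02])   # per-study correlations
n = np.array([20, 35, 18, 24, 50])              # per-study sample sizes

z = np.arctanh(r)                               # Fisher z-transform
w = n - 3                                       # inverse-variance weights
z_bar = np.sum(w * z) / np.sum(w)
se = 1 / np.sqrt(np.sum(w))
ci = np.tanh([z_bar - 1.96 * se, z_bar + 1.96 * se])

print(f"pooled r = {np.tanh(z_bar):.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
```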
Fournier-Level, Alexandre; Le Cunff, Loïc; Gomez, Camila; Doligez, Agnès; Ageorges, Agnès; Roux, Catherine; Bertrand, Yves; Souquet, Jean-Marc; Cheynier, Véronique; This, Patrice
2009-11-01
The combination of QTL mapping studies of synthetic lines and association mapping studies of natural diversity represents an opportunity to throw light on the genetically based variation of quantitative traits. With the positional information provided through quantitative trait locus (QTL) mapping, which often leads to wide intervals encompassing numerous genes, it is now feasible to directly target candidate genes that are likely to be responsible for the observed variation in completely sequenced genomes and to test their effects through association genetics. This approach was performed in grape, a newly sequenced genome, to decipher the genetic architecture of anthocyanin content. Grapes may be either white or colored, ranging from the lightest pink to the darkest purple tones according to the amount of anthocyanin accumulated in the berry skin, which is a crucial trait for both wine quality and human nutrition. Although the determinism of the white phenotype has been fully identified, the genetic bases of the quantitative variation of anthocyanin content in berry skin remain unclear. A single QTL responsible for up to 62% of the variation in the anthocyanin content was mapped on a Syrah x Grenache F(1) pseudo-testcross. Among the 68 unigenes identified in the grape genome within the QTL interval, a cluster of four Myb-type genes was selected on the basis of physiological evidence (VvMybA1, VvMybA2, VvMybA3, and VvMybA4). From a core collection of natural resources (141 individuals), 32 polymorphisms revealed significant association, and extended linkage disequilibrium was observed. Using a multivariate regression method, we demonstrated that five polymorphisms in VvMybA genes except VvMybA4 (one retrotransposon, three single nucleotide polymorphisms and one 2-bp insertion/deletion) accounted for 84% of the observed variation. All these polymorphisms led to either structural changes in the MYB proteins or differences in the VvMybAs promoters. We concluded that the continuous variation in anthocyanin content in grape was explained mainly by a single gene cluster of three VvMybA genes. The use of natural diversity helped to reduce one QTL to a set of five quantitative trait nucleotides and gave a clear picture of how isogenes combined their effects to shape grape color. Such analysis also illustrates how isogenes combine their effects to shape a complex quantitative trait and enables the definition of markers directly targeted for upcoming breeding programs.
Code of Federal Regulations, 2010 CFR
2010-01-01
... reduced annuity or one-half reduced annuity to provide a former spouse annuity. 842.611 Section 842.611... EMPLOYEES RETIREMENT SYSTEM-BASIC ANNUITY Survivor Elections § 842.611 Post-retirement election of a fully reduced annuity or one-half reduced annuity to provide a former spouse annuity. (a) Except as provided in...
Chen, Kun; Wu, Tao; Wei, Haoyun; Zhou, Tian; Li, Yan
2016-01-01
Coherent anti-Stokes Raman microscopy (CARS) is a quantitative, chemically specific, and label-free optical imaging technique for studying inhomogeneous systems. However, the complicating influence of the nonresonant response on the CARS signal severely limits its sensitivity and specificity and especially limits the extent to which CARS microscopy has been used as a fully quantitative imaging technique. On the basis of the spectral focusing mechanism, we establish a dual-soliton Stokes based CARS microspectroscopy and microscopy scheme capable of quantifying the spatial information of densities and chemical composition within inhomogeneous samples, using a single fiber laser. The dual-soliton Stokes scheme not only removes the nonresonant background but also allows robust acquisition of multiple characteristic vibrational frequencies. This all-fiber based laser source can cover the entire fingerprint (800-2200 cm-1) region with a spectral resolution of 15 cm-1. We demonstrate that quantitative determination of the degree of lipid-chain unsaturation in the fatty acids mixture can be achieved by the characterization of C=C stretching and CH2 deformation vibrations. For microscopy purposes, we show that the spatially inhomogeneous distribution of lipid droplets can be further quantitatively visualized using this quantified degree of lipid unsaturation in the acyl chain for contrast in the hyperspectral CARS images. The combination of a compact excitation source and background-free capability to facilitate extraction of quantitative composition information with multiplex spectral peaks will enable wider applications of quantitative chemical imaging in studying biological and material systems. PMID:27867704
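A minimal sketch of the quantification idea, with made-up band intensities: the ratio of the C=C stretch (near 1655 cm-1) to the CH2 deformation (near 1445 cm-1) is calibrated against fatty acids of known unsaturation and then inverted for an unknown pixel.

```python
# Hedged sketch (made-up band intensities; band positions are standard Raman
# lipid assignments): estimating lipid-chain unsaturation from the
# I(C=C, ~1655 cm^-1) / I(CH2, ~1445 cm^-1) band ratio.
import numpy as np

# Calibration: number of C=C bonds per chain vs. measured band ratio
n_cc = np.array([0.0, 1.0, 2.0, 3.0])            # e.g. 18:0 ... 18:3 chains
ratio = np.array([0.02, 0.35, 0.68, 1.01])       # assumed calibration ratios
slope, intercept = np.polyfit(n_cc, ratio, 1)

sample_ratio = 0.52                               # ratio at one image pixel
unsat = (sample_ratio - intercept) / slope
print(f"estimated C=C bonds per chain: {unsat:.2f}")
```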
Comparing fully general relativistic and Newtonian calculations of structure formation
NASA Astrophysics Data System (ADS)
East, William E.; Wojtak, Radosław; Abel, Tom
2018-02-01
In the standard approach to studying cosmological structure formation, the overall expansion of the Universe is assumed to be homogeneous, with the gravitational effect of inhomogeneities encoded entirely in a Newtonian potential. A topic of ongoing debate is to what degree this fully captures the dynamics dictated by general relativity, especially in the era of precision cosmology. To quantitatively assess this, we directly compare standard N-body Newtonian calculations to full numerical solutions of the Einstein equations, for cold matter with various magnitude initial inhomogeneities on scales comparable to the Hubble horizon. We analyze the differences in the evolution of density, luminosity distance, and other quantities defined with respect to fiducial observers. This is carried out by reconstructing the effective spacetime and matter fields dictated by the Newtonian quantities, and by taking care to distinguish effects of numerical resolution. We find that the fully general relativistic and Newtonian calculations show excellent agreement, even well into the nonlinear regime. They only notably differ in regions where the weak gravity assumption breaks down, which arise when considering extreme cases with perturbations exceeding standard values.
Badran, Hani; Pluye, Pierre; Grad, Roland
2017-03-14
The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. ©Hani Badran, Pierre Pluye, Roland Grad. Originally published in JMIR Medical Education (http://mededu.jmir.org), 14.03.2017.
DOT National Transportation Integrated Search
2014-08-01
Fully automated or autonomous vehicles (AVs) hold great promise for the future of transportation. By 2020, Google, auto manufacturers and other technology providers intend to introduce self-driving cars to the public with either limited or fully a...
NASA Astrophysics Data System (ADS)
Noh, S. J.; Lee, J. H.; Lee, S.; Zhang, Y.; Seo, D. J.
2017-12-01
Hurricane Harvey was one of the most extreme weather events in Texas history and left significant damage in the Houston and adjoining coastal areas. To better understand the relative impacts on urban flooding of the extreme amount and spatial extent of rainfall, the unique geography, land use, and storm surge, high-resolution water modeling that fully resolves both natural and man-made components is necessary. In this presentation, we reconstruct the spatiotemporal evolution of inundation during Hurricane Harvey using hyper-resolution modeling and quantitative image reanalysis. The two-dimensional urban flood model used is based on the dynamic wave approximation and 10 m-resolution terrain data, and is forced by radar-based multisensor quantitative precipitation estimates. The model domain includes Buffalo, Brays, Greens and White Oak Bayous in Houston. The model is simulated using hybrid parallel computing. To evaluate dynamic inundation mapping, we combine various qualitative crowdsourced images and video footage with LiDAR-based terrain data.
NASA Technical Reports Server (NTRS)
Hoebel, Louis J.
1993-01-01
The problem of plan generation (PG) and the problem of plan execution monitoring (PEM), including updating, queries, and resource-bounded replanning, have different reasoning and representation requirements. PEM requires the integration of qualitative and quantitative information. PEM is the receiving of data about the world in which a plan or agent is executing. The problem is to quickly determine the relevance of the data, the consistency of the data with respect to the expected effects, and whether execution should continue. Only spatial and temporal aspects of the plan are addressed for relevance in this work. Current temporal reasoning systems are deficient in computational aspects or expressiveness. This work presents a hybrid qualitative and quantitative system that is fully expressive in its assertion language while offering certain computational efficiencies. In order to proceed, methods incorporating approximate reasoning using hierarchies, notions of locality, constraint expansion, and absolute parameters need to be used, and these are shown to be useful for the anytime nature of PEM.
NASA Astrophysics Data System (ADS)
Ragno, Rino; Ballante, Flavio; Pirolli, Adele; Wickersham, Richard B.; Patsilinakos, Alexandros; Hesse, Stéphanie; Perspicace, Enrico; Kirsch, Gilbert
2015-08-01
Vascular endothelial growth factor receptor-2 (VEGFR-2) is a key element in angiogenesis, the process by which new blood vessels are formed, and is thus an important pharmaceutical target. Here, 3-D quantitative structure-activity relationship (3-D QSAR) modeling was used to build a quantitative screening and pharmacophore model of the VEGFR-2 receptor for the design of inhibitors with improved activities. Most of the available experimental data were used as a training set to derive eight optimized and fully cross-validated mono-probe models and one multi-probe quantitative model. Notable is the use of 262 molecules, aligned following both structure-based and ligand-based protocols, as an external test set, confirming the 3-D QSAR models' predictive capability and their usefulness in designing new VEGFR-2 inhibitors. From a survey of the literature, this is the first wide-ranging computational medicinal chemistry application to VEGFR-2 inhibitors.
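The flavor of such cross-validated quantitative models can be conveyed with a generic PLS sketch (random placeholder descriptors stand in for the molecular interaction fields; this is not the authors' pipeline), scored by q2, the cross-validated predictive r2.

```python
# Generic 3-D QSAR-style sketch (random placeholder descriptors, assumed
# dimensions): a cross-validated PLS regression scored by q^2.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)
n_mol, n_fields = 60, 500
X = rng.normal(size=(n_mol, n_fields))            # field-point descriptors
beta = np.zeros(n_fields); beta[:10] = 1.0        # few informative regions
y = X @ beta + rng.normal(scale=2.0, size=n_mol)  # pIC50-like activities

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=5).ravel() # out-of-fold predictions
q2 = 1 - np.sum((y - y_cv)**2) / np.sum((y - y.mean())**2)
print(f"q^2 = {q2:.2f}")
```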
NASA Astrophysics Data System (ADS)
Wang, Hui; Wang, Jian-Tao; Cao, Ze-Xian; Zhang, Wen-Jun; Lee, Chun-Sing; Lee, Shuit-Tong; Zhang, Xiao-Hong
2015-03-01
While the vapour-liquid-solid process has been widely used for growing one-dimensional nanostructures, quantitative understanding of the process is still far from adequate. For example, the origins for the growth of periodic one-dimensional nanostructures are not fully understood. Here we observe that morphologies in a wide range of periodic one-dimensional nanostructures can be described by two quantitative relationships: first, inverse of the periodic spacing along the length direction follows an arithmetic sequence; second, the periodic spacing in the growth direction varies linearly with the diameter of the nanostructure. We further find that these geometric relationships can be explained by considering the surface curvature oscillation of the liquid sphere at the tip of the growing nanostructure. The work reveals the requirements of vapour-liquid-solid growth. It can be applied for quantitative understanding of vapour-liquid-solid growth and to design experiments for controlled growth of nanostructures with custom-designed morphologies.
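The two relationships can be written compactly (notation assumed, since the paper's symbols are not given here): let lambda_n denote the n-th periodic spacing along the length direction and D the nanostructure diameter.

```latex
% Compact restatement of the two empirical relationships (assumed notation).
\[
  \frac{1}{\lambda_n} = \frac{1}{\lambda_1} + (n-1)\,\Delta
  \quad\text{(inverse spacings form an arithmetic sequence)},
\]
\[
  \lambda = \alpha\,D + \beta
  \quad\text{(periodic spacing varies linearly with diameter)}.
\]
```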
Gravitational Force and the Cardiovascular System
NASA Technical Reports Server (NTRS)
Pendergast, D. R.; Olszowka, A. J.; Rokitka, M. A.; Farhi, L. E.
1991-01-01
Cardiovascular responses to changes in gravitational force are considered. Man is ideally suited to his 1-g environment. Although cardiovascular adjustments are required to accommodate to postural changes and exercise, these are fully accomplished for short periods (min). More challenging stresses are those of short-term microgravity (h) and long-term microgravity (days) and of gravitational forces greater than that of Earth. The latter can be simulated in the laboratory and quantitative studies can be conducted.
Nagai, Kanto; Hoshino, Yuichi; Nishizawa, Yuichiro; Araki, Daisuke; Matsushita, Takehiko; Matsumoto, Tomoyuki; Takayama, Koji; Nagamune, Kouki; Kurosaka, Masahiro; Kuroda, Ryosuke
2015-10-01
Tibial acceleration during the pivot shift test is a potential quantitative parameter to evaluate rotational laxity in anterior cruciate ligament (ACL) insufficiency. However, clinical application of this measurement has not been fully examined. This study aimed to measure and compare tibial acceleration before and after ACL reconstruction (ACLR) in ACL-injured patients. We hypothesized that tibial acceleration would be reduced by ACLR and would be consistent in the same knee at different time points. Seventy ACL-injured patients who underwent ACLR were enrolled. Tibial acceleration during the pivot shift test was measured using an electromagnetic measurement system before ACLR and at second-look arthroscopy 1 year post-operatively. Tibial acceleration was compared to clinical grading and between ACL-injured/ACL-reconstructed and contralateral knees. Pre-operative tibial acceleration increased stepwise with the increase in clinical grading (P < 0.01). Tibial acceleration in the ACL-injured knee (1.9 ± 1.2 m/s(2)) was larger than that in the contralateral knee (0.8 ± 0.3 m/s(2), P < 0.01), and was reduced to 0.9 ± 0.3 m/s(2) post-operatively (P < 0.01). There was no difference between ACL-reconstructed and contralateral knees (n.s.). Tibial acceleration in contralateral knees was consistent pre- and post-operatively (n.s.). Tibial acceleration measurement demonstrated increased rotational laxity in ACL-injured knees and its reduction by ACLR. Additionally, consistent measurements were obtained in ACL-intact knees at different time points. Therefore, tibial acceleration during the pivot shift test could provide quantitative evaluation of rotational stability before and after ACL reconstruction. Level of evidence: III.
Štrukil, Vjekoslav; Igrc, Marina D; Eckert-Maksić, Mirjana; Friščić, Tomislav
2012-07-02
Mechanochemical methods of neat grinding and liquid-assisted grinding have been applied to the synthesis of mono- and bis(thiourea)s by using the click coupling of aromatic and aliphatic diamines with aromatic isothiocyanates. The ability to modify the reaction conditions allowed the optimization of each reaction, leading to the quantitative formation of chiral bis(thiourea)s with known uses as organocatalysts or anion sensors. Quantitative reaction yields, combined with the fact that mechanochemical reaction conditions avoid the use of bulk solvents, enabled solution-based purification methods (such as chromatography or recrystallization) to be completely avoided. Importantly, by using selected model reactions, we also show that the described mechanochemical reaction procedures can be readily scaled up to at least the one-gram scale. In that way, mechanochemical synthesis provides a facile method to fully transform valuable enantiomerically pure reagents into useful products that can immediately be applied in their designed purpose. This was demonstrated by using some of the mechanochemically prepared reagents as organocatalysts in a model Morita-Baylis-Hillman reaction and as cyanide ion sensors in organic solvents. The use of electronically and sterically hindered ortho-phenylenediamine revealed that mechanochemical reaction conditions can be readily optimized to form either the 1:1 or the 1:2 click-coupling product, demonstrating that reaction stoichiometry can be more efficiently controlled under these conditions than in solution-based syntheses. In this way, it was shown that excellent stoichiometric control by mechanochemistry, previously established for mechanochemical syntheses of cocrystals and coordination polymers, can also be achieved in the context of covalent-bond formation. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Chemical Bonding in Sulfide Minerals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaughan, David J.; Rosso, Kevin M.
An understanding of chemical bonding and electronic structure in sulfide minerals is central to any attempt at understanding their crystal structures, stabilities and physical properties. It is also an essential precursor to understanding reactivity through modeling surface structure at the molecular scale. In recent decades, there have been remarkable advances in first principles (ab initio) methods for the quantitative calculation of electronic structure. These advances have been made possible by the very rapid development of high performance computers. Several review volumes that chart the applications of these developments in mineralogy and geochemistry are available (Tossell and Vaughan, 1992; Cygan and Kubicki, 2001). An important feature of the sulfide minerals is the diversity of their electronic structures, as evidenced by their electrical and magnetic properties (see Pearce et al. 2006, this volume). Thus, sulfide minerals range from insulators through semiconductors to metals, and exhibit every type of magnetic behavior. This has presented problems for those attempting to develop bonding models for sulfides, and also led to certain misconceptions regarding the kinds of models that may be appropriate. In this chapter, chemical bonding and electronic structure models for sulfides are reviewed with emphasis on more recent developments. Although the fully ab initio quantitative methods are now capable of a remarkable degree of sophistication in terms of agreement with experiment and potential to interpret and predict behavior with varying conditions, both qualitative and more simplistic quantitative approaches will also be briefly discussed. This is because we believe that the insights which they provide are still helpful to those studying sulfide minerals. In addition to the application of electronic structure models and calculations to solid sulfides, work on sulfide mineral surfaces (Rosso and Vaughan 2006a,b) and on solution complexes and clusters (Rickard and Luther, 2006) is discussed in detail later in this volume.
Lysanets, Yuliia V; Bieliaieva, Olena M
2018-02-23
This paper focuses on the prevalence of Latin terms and terminological collocations in the issues of the Journal of Medical Case Reports (February 2007-August 2017) and discusses the role of Latin terminology in the contemporary process of writing medical case reports. The objective of the research is to study the frequency of Latin terminology in English-language medical case reports, thus providing relevant guidelines for medical professionals who deal with this genre and drawing their attention to the peculiarities of using Latin in case reports. The selected medical case reports are considered using methods of quantitative examination and structural, narrative, and contextual analyses. We developed structural and thematic typologies of Latin terms and expressions, and we conducted a quantitative analysis that enabled us to observe the tendencies in using these lexical units in medical case reports. The research revealed that the use of Latin fully complies with the communicative strategies of medical case reports as a genre. Because Latin medical lexis is internationally adopted and understood worldwide, it promotes the conciseness of medical case reports and contributes to their narrative style and educational intentions. The adequate use of Latin terms in medical case reports is an essential prerequisite for effectively sharing one's clinical findings with fellow researchers from all over the world. Therefore, it is highly important to draw students' attention to the Latin terms and expressions that are used most frequently in medical case reports. Hence, the analysis of structural, thematic, and contextual features of Latin terms in case reports should be an integral part of curricula at medical universities.
Leung, Jacqueline M.; Rould, Mark A.; Konradt, Christoph; Hunter, Christopher A.; Ward, Gary E.
2014-01-01
T. gondii uses substrate-dependent gliding motility to invade cells of its hosts, egress from these cells at the end of its lytic cycle and disseminate through the host organism during infection. The ability of the parasite to move is therefore critical for its virulence. T. gondii engages in three distinct types of gliding motility on coated two-dimensional surfaces: twirling, circular gliding and helical gliding. We show here that motility in a three-dimensional Matrigel-based environment is strikingly different, in that all parasites move in irregular corkscrew-like trajectories. Methods developed for quantitative analysis of motility parameters along the smoothed trajectories demonstrate a complex but periodic pattern of motility with mean and maximum velocities of 0.58±0.07 µm/s and 2.01±0.17 µm/s, respectively. To test how a change in the parasite's crescent shape might affect trajectory parameters, we compared the motility of Δphil1 parasites, which are shorter and wider than wild type, to the corresponding parental and complemented lines. Although comparable percentages of parasites were moving for all three lines, the Δphil1 mutant exhibited significantly decreased trajectory lengths and mean and maximum velocities compared to the parental parasite line. These effects were either partially or fully restored upon complementation of the Δphil1 mutant. These results show that alterations in morphology may have a significant impact on T. gondii motility in an extracellular matrix-like environment, provide a possible explanation for the decreased fitness of Δphil1 parasites in vivo, and demonstrate the utility of the quantitative three-dimensional assay for studying parasite motility. PMID:24489670
Desmarais, Samantha M.; Leitner, Thomas; Barron, Annelise E.
2012-01-01
DNA barcodes are short, unique ssDNA primers that “mark” individual biomolecules. To gain a better understanding of the biophysical parameters constraining primer-dimer formation between primers that incorporate barcode sequences, we have developed a capillary electrophoresis method that utilizes drag-tag-DNA conjugates to quantify dimerization risk between primer-barcode pairs. Results obtained with this unique free-solution conjugate electrophoresis (FSCE) approach are useful as quantitatively precise input data to parameterize computational models of dimerization risk. A set of fluorescently labeled, model primer-barcode conjugates was designed with complementary regions of differing lengths to quantify heterodimerization as a function of temperature. Primer-dimer cases comprised two 30-mer primers, one of which was covalently conjugated to a lab-made, chemically synthesized poly-N-methoxyethylglycine drag-tag, which reduced the electrophoretic mobility of ssDNA to distinguish it from ds primer-dimers. The drag-tags also provided a shift in mobility for the dsDNA species, which allowed us to quantitate primer-dimer formation. In the experimental studies, pairs of oligonucleotide primer-barcodes with fully or partially complementary sequences were annealed and then separated by free-solution conjugate CE at different temperatures to assess effects on primer-dimer formation. When fewer than 30 out of 30 basepairs were bonded, dimerization was inversely correlated with temperature. Dimerization occurred when more than 15 consecutive basepairs formed, yet non-consecutive basepairs did not create stable dimers even when 20 out of 30 possible basepairs bonded. The use of free-solution electrophoresis in combination with a peptoid drag-tag and different fluorophores enabled precise separation of short DNA fragments and established a new mobility shift assay for detection of primer-dimer formation. PMID:22331820
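The reported thresholds suggest a simple screening rule: flag a primer-barcode pair when its longest run of consecutive Watson-Crick pairs exceeds about 15 bases. The sketch below implements that rule under stated assumptions (an antiparallel alignment scan and illustrative sequences); it is our reading of the finding, not the authors' software.

```python
# Screening rule suggested by the reported thresholds: flag a primer pair when
# its longest run of consecutive Watson-Crick pairs exceeds ~15 bases.

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def longest_consecutive_pairing(primer_a: str, primer_b: str) -> int:
    """Longest run of consecutive complementary bases over all alignments of
    primer_a against the reverse of primer_b (antiparallel annealing)."""
    b_rev = primer_b[::-1]
    best = 0
    for offset in range(-(len(b_rev) - 1), len(primer_a)):
        run = 0
        for i, base in enumerate(primer_a):
            j = i - offset
            if 0 <= j < len(b_rev) and COMPLEMENT.get(base) == b_rev[j]:
                run += 1
                best = max(best, run)
            else:
                run = 0
    return best

a = "ACGTACGTACGTACGTACGTACGTACGTAC"   # 30-mer, illustrative
b = "GTACGTACGTACGTACGT"              # partially complementary 18-mer
run = longest_consecutive_pairing(a, b)
print(run, "consecutive basepairs ->", "dimer risk" if run > 15 else "low risk")
```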
Extracting quantitative measures from EAP: a small clinical study using BFOR.
Hosseinbor, A Pasha; Chung, Moo K; Wu, Yu-Chien; Fleming, John O; Field, Aaron S; Alexander, Andrew L
2012-01-01
The ensemble average propagator (EAP) describes the 3D average diffusion process of water molecules, capturing both its radial and angular contents, and hence providing rich information about complex tissue microstructure properties. Bessel Fourier orientation reconstruction (BFOR) is one of several recently proposed analytical, non-Cartesian EAP reconstruction schemes employing multiple-shell acquisitions. Such modeling bases have not yet been fully exploited in the extraction of rotationally invariant q-space indices that describe the degree of diffusion anisotropy/restrictivity. Such quantitative measures include the zero-displacement probability (P(o)), mean squared displacement (MSD), q-space inverse variance (QIV), and generalized fractional anisotropy (GFA), all of which are simply scalar features of the EAP. In this study, a general relationship between MSD and the q-space diffusion signal is derived, and an EAP-based definition of GFA is introduced. A significant part of the paper is dedicated to utilizing BFOR in a clinical dataset, comprised of 5 multiple sclerosis (MS) patients and 4 healthy controls, to estimate P(o), MSD, QIV, and GFA of the corpus callosum, and, specifically, to see whether such indices can detect differences between normal-appearing white matter (NAWM) in patients and healthy white matter (WM) in controls. Although the sample size is small, this study is a proof of concept that can be extended to larger sample sizes in the future.
NASA Technical Reports Server (NTRS)
Manney, Gloria; Daffer, William H.; Zawodny, Joseph M.; Bernath, Peter F.; Hoppel, Karl W.; Walker, Kaley A.; Knosp, Brian W.; Boone, Chris; Remsberg, Ellis E.; Santee, Michelle L.;
2007-01-01
Derived Meteorological Products (DMPs, including potential temperature (theta), potential vorticity, equivalent latitude (EqL), horizontal winds and tropopause locations) have been produced for the locations and times of measurements by several solar occultation (SO) instruments and the Aura Microwave Limb Sounder (MLS). DMPs are calculated from several meteorological analyses for the Atmospheric Chemistry Experiment-Fourier Transform Spectrometer, Stratospheric Aerosol and Gas Experiment II and III, Halogen Occultation Experiment, and Polar Ozone and Aerosol Measurement II and III SO instruments and MLS. Time-series comparisons of MLS version 1.5 and SO data using DMPs show good qualitative agreement in the time evolution of O3, N2O, H2O, CO, HNO3, HCl and temperature; quantitative agreement is good in most cases. EqL-coordinate comparisons of MLS version 2.2 and SO data show good quantitative agreement throughout the stratosphere for most of these species, with significant biases for a few species in localized regions. Comparisons in EqL coordinates of MLS and SO data, and of SO data with geographically coincident MLS data, provide insight into where and how sampling effects are important in the interpretation of the sparse SO data, thus assisting in fully utilizing the SO data in scientific studies and comparisons with other sparse datasets. The DMPs are valuable for scientific studies and to facilitate validation of non-coincident measurements.
Evidence of reduced recombination rate in human regulatory domains.
Liu, Yaping; Sarkar, Abhishek; Kheradpour, Pouya; Ernst, Jason; Kellis, Manolis
2017-10-20
Recombination rate is non-uniformly distributed across the human genome. The variation of recombination rate at both fine and large scales cannot be fully explained by DNA sequences alone. Epigenetic factors, particularly DNA methylation, have recently been proposed to influence the variation in recombination rate. We study the relationship between recombination rate and gene regulatory domains, defined by a gene and its linked control elements. We define these links using expression quantitative trait loci (eQTLs), methylation quantitative trait loci (meQTLs), chromatin conformation from publicly available datasets (Hi-C and ChIA-PET), and correlated activity links that we infer across cell types. Each link type shows a "recombination rate valley" of significantly reduced recombination rate compared to matched control regions. This recombination rate valley is most pronounced for gene regulatory domains of early embryonic development genes, housekeeping genes, and constitutive regulatory elements, which are known to show increased evolutionary constraint across species. Recombination rate valleys show increased DNA methylation, reduced double-stranded break initiation, and increased repair efficiency, specifically in the lineage leading to the germ line. Moreover, by using only the overlap of functional links and DNA methylation in germ cells, we are able to predict the recombination rate with high accuracy. Our results suggest the existence of a recombination rate valley at regulatory domains and provide a potential molecular mechanism to interpret the interplay between genetic and epigenetic variations.
Strategies facilitating practice change in pediatric cancer: a systematic review.
Robinson, Paula D; Dupuis, Lee L; Tomlinson, George; Phillips, Bob; Greenberg, Mark; Sung, Lillian
2016-09-01
By conducting a systematic review, we describe strategies to actively disseminate knowledge or facilitate practice change among healthcare providers caring for children with cancer, and we evaluate the effectiveness of these strategies. We searched Ovid Medline, EMBASE and PsychINFO. Fully published primary studies were included if they evaluated one or more professional intervention strategies to actively disseminate knowledge or facilitate practice change in pediatric cancer or hematopoietic stem cell transplantation. Data extracted included study characteristics and strategies evaluated. In studies with a quantitative analysis of patient outcomes, the relationship between study-level characteristics and statistically significant primary analyses was evaluated. Of 20 644 titles and abstracts screened, 146 studies were retrieved in full and 60 were included. In 20 studies, quantitative evaluation of patient outcomes was examined and a primary outcome was stated. Eighteen studies were of 'before and after' design; there were no randomized studies. All studies were at risk for bias. Interrupted time series was never the primary analytic approach. No specific strategy type was successful at improving patient outcomes. Literature describing strategies to facilitate practice change in pediatric cancer is emerging. However, major methodological limitations exist. Studies with robust designs are required to identify effective strategies to effect practice change. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Leaf epidermis images for robust identification of plants
da Silva, Núbia Rosa; Oliveira, Marcos William da Silva; Filho, Humberto Antunes de Almeida; Pinheiro, Luiz Felipe Souza; Rossatto, Davi Rodrigo; Kolb, Rosana Marta; Bruno, Odemir Martinez
2016-01-01
This paper proposes a methodology for plant analysis and identification based on extracting texture features from microscopic images of leaf epidermis. All the experiments were carried out using 32 plant species with 309 epidermal samples captured by an optical microscope coupled to a digital camera. The results of the computational methods using texture features were compared to the conventional approach, where quantitative measurements of stomatal traits (density, length and width) were manually obtained. Epidermis image classification using texture achieved a success rate of over 96%, while the success rate was around 60% for quantitative measurements taken manually. Furthermore, we verified the robustness of our method with respect to the natural phenotypic plasticity of stomata, analysing samples from the same species grown in different environments. Texture methods remained robust even under phenotypic plasticity of stomatal traits, with a decrease of 20% in the success rate, whereas the manual quantitative measurements proved highly sensitive, with a decrease of 77%. Results from the comparison between the computational approach and the conventional quantitative measurements lead us to discover how computational systems are advantageous and promising in terms of solving problems related to Botany, such as species identification. PMID:27217018
Quantitative filter forensics for indoor particle sampling.
Haaland, D; Siegel, J A
2017-03-01
Filter forensics is a promising indoor air investigation technique involving the analysis of dust which has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the enormous possibility of this approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
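The proposed quantitative step amounts to a mass balance over the air that passed through the filter. A minimal sketch is below; the formula and every parameter name and value are illustrative assumptions, not the paper's published model.

```python
# Mass-balance sketch of the proposed quantitative step: convert contaminant
# mass extracted from HVAC filter dust into a time-averaged indoor air
# concentration. All parameter names and values are illustrative assumptions.

def time_averaged_concentration_ug_m3(extracted_mass_ug: float,
                                      flow_rate_m3_h: float,
                                      runtime_h: float,
                                      capture_efficiency: float) -> float:
    sampled_air_m3 = flow_rate_m3_h * runtime_h   # air that crossed the filter
    return extracted_mass_ug / (sampled_air_m3 * capture_efficiency)

# Example: 500 ug recovered after 900 h of fan runtime at 1700 m3/h, 30% capture.
print(time_averaged_concentration_ug_m3(500.0, 1700.0, 900.0, 0.3))  # ~0.0011 ug/m3
```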
Tidwell, W James; Owen, Cindy E; Kulp-Shorten, Carol; Maity, Abhishek; McCall, Michael; Brown, Timothy S
2016-11-01
Ablative laser resurfacing is a common treatment for post-surgical scars. Fractional ablative laser resurfacing is an emerging treatment option that is replacing fully ablative lasers in many applications. Data comparing fractionated and fully ablative lasers in treating post-operative scars are lacking. Twenty patients were enrolled in a split-scar study following excisions from dermatologic surgery. Wounds had to be older than 8 weeks but less than 1 year. The scars were randomly divided into two halves. One half of the scar was treated with fully ablative erbium-doped yttrium aluminum garnet (Er:YAG) and the other was treated with fractionated Er:YAG. The scars were treated at monthly intervals for 3 months, then followed up at months 1 and 2 after the last treatment. The POSAS was used to evaluate the scars by a panel of dermatologists blinded to the lasers, in conjunction with the patients, who were also blinded. Using the POSAS, physicians and patients both rated the fractionated side superior, by 32.5% (P = 0.019) and 58.1% (P = 0.001), respectively. There was no difference in patient-reported pain between the two lasers. Patients overwhelmingly preferred the fractionated Er:YAG laser (94%) to the fully ablative laser when asked at the end of the study. Although this study is limited by a short follow-up period, it shows a statistically significant superior outcome for fractionated Er:YAG over fully ablative Er:YAG for scar revision. It also adds quantitative values to the assessment of scar appearance when treated with fractionated lasers compared to fully ablative lasers. Fractionated Er:YAG also yielded higher patient satisfaction, with no difference in reported pain scores. These data are useful when counseling patients undergoing laser surgery. Lasers Surg. Med. 48:837-843, 2016. © 2016 Wiley Periodicals, Inc.
Fully wireless pressure sensor based on endoscopy images
NASA Astrophysics Data System (ADS)
Maeda, Yusaku; Mori, Hirohito; Nakagawa, Tomoaki; Takao, Hidekuni
2018-04-01
In this paper, the result of developing a fully wireless pressure sensor based on endoscopy images for endoscopic surgery is reported for the first time. The sensor device has structural color with a nm-scale narrow gap, and the gap is changed by air pressure. The structural color of the sensor is acquired from camera images, so pressure detection can be realized with existing endoscope configurations alone. The sensor allows the internal air pressure of the human body to be measured during flexible-endoscope operation. Air pressure monitoring has two important purposes. The first is to quantitatively measure tumor size under a constant air pressure for treatment selection. The second is to prevent endangering the patient through excessive insufflation of air. The developed sensor was evaluated, and the detection principle based only on endoscopy images has been successfully demonstrated.
NASA Astrophysics Data System (ADS)
Huang, Xia; Li, Chunqiang; Xiao, Chuan; Sun, Wenqing; Qian, Wei
2017-03-01
The temporal focusing two-photon microscope (TFM) was developed to perform depth-resolved wide-field fluorescence imaging by capturing frames sequentially. However, due to strong, non-negligible noise and diffraction rings surrounding particles, further analysis is extremely difficult without a precise particle localization technique. In this paper, we developed a fully-automated scheme to locate particle positions with high noise tolerance. Our scheme includes the following procedures: noise reduction using a hybrid Kalman filter method, particle segmentation based on a multiscale kernel graph cuts global and local segmentation algorithm, and a kinematic estimation based particle tracking method. Both isolated and partially overlapped particles can be accurately identified with removal of unrelated pixels. Based on our quantitative analysis, 96.22% of isolated particles and 84.19% of partially overlapped particles were successfully detected.
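A minimal detect-and-localize sketch of the pipeline's overall shape is below; Gaussian smoothing and a global threshold stand in for the paper's hybrid Kalman-filter denoising and multiscale kernel graph-cuts segmentation, and all parameters are illustrative.

```python
# Detect-and-localize sketch: denoise, segment, then report blob centroids.
# The denoising and segmentation stages are simplified stand-ins, not the
# paper's hybrid Kalman filter or graph-cuts algorithm.
import numpy as np
from scipy import ndimage

def locate_particles(frame: np.ndarray, sigma: float = 2.0, k: float = 4.0):
    smooth = ndimage.gaussian_filter(frame.astype(float), sigma)  # denoise
    mask = smooth > smooth.mean() + k * smooth.std()              # segment
    labels, n = ndimage.label(mask)                               # split blobs
    return ndimage.center_of_mass(smooth, labels, range(1, n + 1))

rng = np.random.default_rng(0)
frame = rng.poisson(5, (128, 128)).astype(float)
frame[40:44, 60:64] += 40.0               # one bright synthetic particle
print(locate_particles(frame))            # roughly [(41.5, 61.5)]
```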
NASA Astrophysics Data System (ADS)
Lottermoser, Werner; Redhammer, Günther J.; Weber, Sven-Ulf; Litterst, Fred Jochen; Tippelt, Gerold; Dlugosz, Stephen; Bank, Hermann; Amthauer, Georg; Grodzicki, Michael
2011-12-01
This work reports on the evaluation of the electric field gradient (EFG) in natural chrysoberyl Al2BeO4 and sinhalite MgAlBO4 using two different procedures: (1) experimental, with single-crystal Mössbauer spectroscopy (SCMBS) on the three principal sections of each sample, and (2) a "fully quantitative" method with cluster molecular orbital calculations based on density functional theory. Whereas the experimental and theoretical results for the EFG tensor are in quantitative agreement, the calculated isomer shifts and optical d-d transitions exhibit systematic deviations from the measured values. These deviations indicate that the substitution of Al and Mg with iron should be accompanied by considerable local expansion of the coordination octahedra.
Dou, Maowei; Lopez, Juan; Rios, Misael; Garcia, Oscar; Xiao, Chuan; Eastman, Michael
2016-01-01
A cost-effective battery-powered spectrophotometric system (BASS) was developed for quantitative point-of-care (POC) analysis on a microfluidic chip. Using methylene blue as a model analyte, we first compared the performance of the BASS with a commercial spectrophotometric system, and then applied the BASS to loop-mediated isothermal amplification (LAMP) detection and subsequent quantitative nucleic acid analysis, which exhibited a limit of detection comparable to that of the NanoDrop. Compared to the commercial spectrophotometric system, our spectrophotometric system is lower-cost, consumes less reagent, and has higher detection sensitivity. Most importantly, it does not rely on external power supplies. All these features make our spectrophotometric system highly suitable for a variety of POC analyses, such as field detection. PMID:27143408
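As a reference for the quantitation step, absorbance readings convert to nucleic acid concentration via the Beer-Lambert relation; the sketch below uses the standard dsDNA conversion (an A260 of 1.0 over a 1 cm path is about 50 ng/µL), which is a textbook assumption rather than the BASS device's calibration.

```python
# Reference quantitation step: absorbance to dsDNA concentration via the
# standard conversion factor. Factor and path length are textbook assumptions,
# not the BASS device's calibration.

def dsdna_ng_per_ul(a260: float, path_cm: float = 1.0, factor: float = 50.0) -> float:
    return (a260 / path_cm) * factor

print(dsdna_ng_per_ul(0.25))  # 12.5 ng/uL
```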
NASA Astrophysics Data System (ADS)
Singla, Neeru; Srivastava, Vishal; Singh Mehta, Dalip
2018-02-01
We report the first fully automated detection of human skin burn injuries in vivo, with the goal of automatic surgical margin assessment based on optical coherence tomography (OCT) images. Our proposed automated procedure entails building a machine-learning-based classifier by extracting quantitative features from normal and burn tissue images recorded by OCT. In this study, 56 samples (28 normal, 28 burned) were imaged by OCT and eight features were extracted. A linear model classifier was trained using 34 samples and 22 samples were used to test the model. Sensitivity of 91.6% and specificity of 90% were obtained. Our results demonstrate the capability of a computer-aided technique for accurately and automatically identifying burn tissue resection margins during surgical treatment.
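The reported workflow (quantitative image features feeding a linear classifier, with 34 training and 22 test samples) can be sketched as follows; the synthetic features and the logistic-regression choice are our assumptions, since the abstract does not specify the eight features or the exact linear model.

```python
# Sketch of the reported workflow: OCT-derived features feed a linear
# classifier, with the study's 34-train / 22-test split. Features here are
# synthetic stand-ins for the eight unspecified quantitative features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (28, 8)),    # 28 normal samples
               rng.normal(1.2, 1.0, (28, 8))])   # 28 burn samples
y = np.array([0] * 28 + [1] * 28)                # 0 = normal, 1 = burn
idx = rng.permutation(56)
train, test = idx[:34], idx[34:]

clf = LogisticRegression().fit(X[train], y[train])
tn, fp, fn, tp = confusion_matrix(y[test], clf.predict(X[test]), labels=[0, 1]).ravel()
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```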
NASA Astrophysics Data System (ADS)
Hincks, Ian; Granade, Christopher; Cory, David G.
2018-01-01
The analysis of photon count data from the standard nitrogen vacancy (NV) measurement process is treated as a statistical inference problem. This has applications in obtaining better and more rigorous error bars for tasks such as parameter estimation (e.g. magnetometry), tomography, and randomized benchmarking. We start by providing a summary of the standard phenomenological model of the NV optical process in terms of Lindblad jump operators. This model is used to derive random variables describing emitted photons during measurement, to which finite visibility, dark counts, and imperfect state preparation are added. NV spin-state measurement is then stated as an abstract statistical inference problem consisting of an underlying biased coin obstructed by three Poisson rates. Relevant frequentist and Bayesian estimators are provided, discussed, and quantitatively compared. We show numerically that the risk of the maximum likelihood estimator is well approximated by the Cramér-Rao bound, for which we provide a simple formula. Of the estimators, we in particular promote the Bayes estimator, owing to its slightly better risk performance and straightforward error propagation into more complex experiments. This is illustrated on experimental data, where quantum Hamiltonian learning is performed and cross-validated in a fully Bayesian setting, and compared to a more traditional weighted least squares fit.
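A grid-based maximum-likelihood sketch of the core inference problem is below; it simplifies the paper's three-Poisson-rate model to two rates (bright and dark), and the rates and coin bias are illustrative values.

```python
# Grid-based MLE sketch: spin readout as a biased coin (probability p of the
# bright state) whose photon counts are Poisson-distributed. A two-rate
# simplification of the paper's three-Poisson-rate model; values illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
alpha, beta, p_true = 8.0, 2.0, 0.7      # bright rate, dark rate, coin bias
shots = 5000
bright = rng.random(shots) < p_true
counts = rng.poisson(np.where(bright, alpha, beta))

# Mixture likelihood L(p) = prod_k [ p Pois(n_k | alpha) + (1 - p) Pois(n_k | beta) ]
p_grid = np.linspace(0.01, 0.99, 197)
pmf_a = stats.poisson.pmf(counts[:, None], alpha)
pmf_b = stats.poisson.pmf(counts[:, None], beta)
loglik = np.log(p_grid * pmf_a + (1.0 - p_grid) * pmf_b).sum(axis=0)
print("MLE of p:", p_grid[np.argmax(loglik)])  # close to 0.7
```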
Microfluidic sorting and multimodal typing of cancer cells in self-assembled magnetic arrays.
Saliba, Antoine-Emmanuel; Saias, Laure; Psychari, Eleni; Minc, Nicolas; Simon, Damien; Bidard, François-Clément; Mathiot, Claire; Pierga, Jean-Yves; Fraisier, Vincent; Salamero, Jean; Saada, Véronique; Farace, Françoise; Vielh, Philippe; Malaquin, Laurent; Viovy, Jean-Louis
2010-08-17
We propose a unique method for cell sorting, "Ephesia," using columns of biofunctionalized superparamagnetic beads self-assembled in a microfluidic channel onto an array of magnetic traps prepared by microcontact printing. It combines the advantages of microfluidic cell sorting, notably the application of a well controlled, flow-activated interaction between cells and beads, and those of immunomagnetic sorting, notably the use of batch-prepared, well characterized antibody-bearing beads. On cell line mixtures, we demonstrated a capture yield better than 94%, and the possibility to cultivate in situ the captured cells. A second series of experiments involved clinical samples--blood, pleural effusion, and fine needle aspirates--obtained from healthy donors and patients with B-cell hematological malignant tumors (leukemia and lymphoma). The immunophenotype and morphology of B-lymphocytes were analyzed directly in the microfluidic chamber, and compared with conventional flow cytometry and visual cytology data, in a blind test. Immunophenotyping results using Ephesia were fully consistent with those obtained by flow cytometry. We obtained in situ high resolution confocal three-dimensional images of the cell nuclei, showing intranuclear details consistent with conventional cytological staining. Ephesia thus provides a powerful approach to cell capture and typing allowing fully automated high resolution and quantitative immunophenotyping and morphological analysis. It requires at least 10 times smaller sample volume and cell numbers than cytometry, potentially increasing the range of indications and the success rate of microbiopsy-based diagnosis, and reducing analysis time and cost.
Automated determination of arterial input function for DCE-MRI of the prostate
NASA Astrophysics Data System (ADS)
Zhu, Yingxuan; Chang, Ming-Ching; Gupta, Sandeep
2011-03-01
Prostate cancer is one of the most common cancers in the world. Dynamic contrast enhanced MRI (DCE-MRI) provides an opportunity for non-invasive diagnosis, staging, and treatment monitoring. Quantitative analysis of DCE-MRI relies on determination of an accurate arterial input function (AIF). Although several methods for automated AIF detection have been proposed in the literature, none are optimized for use in prostate DCE-MRI, which is particularly challenging due to large spatial signal inhomogeneity. In this paper, we propose a fully automated method for determining the AIF from prostate DCE-MRI. Our method is based on modeling pixel uptake curves as gamma variate functions (GVF). First, we analytically compute bounds on GVF parameters for more robust fitting. Next, we approximate a GVF for each pixel based on local time-domain information, and eliminate pixels with falsely estimated AIFs using the deduced upper and lower bounds. This makes the algorithm robust to signal inhomogeneity. After that, using spatial information such as similarity and distance between pixels, we formulate global AIF selection as an energy minimization problem and solve it with a message passing algorithm to further rule out weak pixels and optimize the detected AIF. Our method is fully automated without training or a priori setting of parameters. Experimental results on clinical data have shown that our method obtained promising detection accuracy (all detected pixels inside major arteries) and a very good match with expert-traced manual AIFs.
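The per-pixel building block is a gamma variate fit to the uptake curve. The sketch below uses the common C(t) = A (t - t0)^α exp(-(t - t0)/β) parameterization with generic bounds; the paper's exact GVF form and its analytically derived parameter bounds are not reproduced here.

```python
# Gamma variate fit for one pixel's uptake curve. Parameterization and bounds
# are common generic choices, not the paper's analytic bounds.
import numpy as np
from scipy.optimize import curve_fit

def gvf(t, A, t0, alpha, beta):
    out = np.zeros_like(t)
    dt = t - t0
    pos = dt > 0                                   # zero before bolus arrival
    out[pos] = A * dt[pos] ** alpha * np.exp(-dt[pos] / beta)
    return out

t = np.linspace(0.0, 60.0, 120)                    # seconds
noisy = gvf(t, 5.0, 8.0, 2.0, 6.0) + np.random.default_rng(3).normal(0, 0.5, t.size)

popt, _ = curve_fit(gvf, t, noisy, p0=[1.0, 5.0, 1.5, 5.0],
                    bounds=([0, 0, 0.1, 0.1], [50, 30, 10, 30]))
print(popt)  # approximately [5, 8, 2, 6]
```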
Wels, Michael; Carneiro, Gustavo; Aplas, Alexander; Huber, Martin; Hornegger, Joachim; Comaniciu, Dorin
2008-01-01
In this paper we present a fully automated approach to the segmentation of pediatric brain tumors in multi-spectral 3-D magnetic resonance images. It is a top-down segmentation approach based on a Markov random field (MRF) model that combines probabilistic boosting trees (PBT) and lower-level segmentation via graph cuts. The PBT algorithm provides a strong discriminative observation model that classifies tumor appearance, while a spatial prior takes into account the pair-wise homogeneity in terms of classification labels and multi-spectral voxel intensities. The discriminative model relies not only on observed local intensities but also on surrounding context for detecting candidate regions for pathology. A mathematically sound formulation for integrating the two approaches into a unified statistical framework is given. The proposed method is applied to the challenging task of detection and delineation of pediatric brain tumors. This segmentation task is characterized by a high non-uniformity of both the pathology and the surrounding non-pathologic brain tissue. A quantitative evaluation illustrates the robustness of the proposed method. Despite dealing with more complicated cases of pediatric brain tumors, the results obtained are mostly better than those reported for current state-of-the-art approaches to 3-D MR brain tumor segmentation in adult patients. The entire processing of one multi-spectral data set does not require any user interaction, and takes less time than previously proposed methods.
Anibamine and its Analogues as Novel Anti Prostate Cancer Agents
2010-06-01
PC-3, and DU-145 has been conducted continuously to evaluate the efficacy of more ligands. A molecular modeling study (3D QSAR) protocol has been... Toxicology at Virginia Commonwealth University. Both the PI's lab and Dr. Selley's lab have fully functional binding assay facilities. The assays is...pursue the docking study and 3D QSAR study. 5.3 3D QSAR (Quantitative Structure-Activity Relationships) Study As proposed in our proposal, we will
Quantitative DLA-based compressed sensing for T1-weighted acquisitions
NASA Astrophysics Data System (ADS)
Svehla, Pavel; Nguyen, Khieu-Van; Li, Jing-Rebecca; Ciobanu, Luisa
2017-08-01
High resolution Manganese Enhanced Magnetic Resonance Imaging (MEMRI), which uses manganese as a T1 contrast agent, has great potential for functional imaging of live neuronal tissue at the single-neuron scale. However, reaching high resolutions often requires long acquisition times, which can lead to reduced image quality due to sample deterioration and hardware instability. Compressed Sensing (CS) techniques offer the opportunity to significantly reduce the imaging time. The purpose of this work is to test the feasibility of CS acquisitions based on Diffusion Limited Aggregation (DLA) sampling patterns for high resolution quantitative T1-weighted imaging. Fully encoded and DLA-CS T1-weighted images of Aplysia californica neural tissue were acquired on a 17.2T MRI system. The MR signal corresponding to single, identified neurons was quantified for both versions of the T1-weighted images. For 50% undersampling, DLA-CS accurately quantifies signal intensities in T1-weighted acquisitions, with only a 1.37% difference compared to the fully encoded data and minimal impact on image spatial resolution. In addition, we compared the conventional polynomial undersampling scheme with the DLA and showed that, for the data at hand, the latter performs better. Depending on the image signal-to-noise ratio, higher undersampling ratios can be used to further reduce the acquisition time in MEMRI-based functional studies of living tissues.
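A DLA sampling pattern of the kind used here can be generated by releasing random walkers that stick when they step next to an aggregate seeded at the k-space centre, so that low frequencies are densely covered. The sketch below is a generic DLA mask generator under illustrative choices (grid size, walk cap, stopping rule), not the authors' implementation.

```python
# Generic diffusion-limited-aggregation (DLA) k-space sampling mask generator.
# Grid size, walk cap, and stopping rule are illustrative choices.
import numpy as np

def dla_mask(n: int = 32, target_fraction: float = 0.3, seed: int = 4) -> np.ndarray:
    rng = np.random.default_rng(seed)
    mask = np.zeros((n, n), dtype=bool)
    mask[n // 2, n // 2] = True                      # seed at the centre
    steps = ((-1, 0), (1, 0), (0, -1), (0, 1))
    while mask.mean() < target_fraction:
        x, y = rng.integers(0, n, size=2)            # release a new walker
        for _ in range(20 * n * n):                  # cap the walk length
            dx, dy = steps[rng.integers(4)]
            x, y = (x + dx) % n, (y + dy) % n        # random walk on a torus
            if (mask[(x - 1) % n, y] or mask[(x + 1) % n, y]
                    or mask[x, (y - 1) % n] or mask[x, (y + 1) % n]):
                mask[x, y] = True                    # stick to the aggregate
                break
    return mask

print(dla_mask().mean())  # ~0.3 of k-space sampled
```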
Chen, Yongsheng; Persaud, Bhagwant
2014-09-01
Crash modification factors (CMFs) for road safety treatments are developed as multiplicative factors used to reflect the expected changes in safety performance associated with changes in highway design and/or traffic control features. However, current CMFs have methodological drawbacks. For example, variability with application circumstance is not well understood, and, as importantly, correlation is not addressed when several CMFs are applied multiplicatively. These issues can be addressed by developing safety performance functions (SPFs) with components of crash modification functions (CM-Functions), an approach that includes all CMF-related variables, along with others, while capturing quantitative and other effects of factors and accounting for cross-factor correlations. CM-Functions can capture the safety impact of factors through a continuous and quantitative approach, avoiding the problematic categorical analysis that is often used to capture CMF variability. There are two formulations for developing such SPFs with CM-Function components: fully specified models and hierarchical models. Based on sample datasets from two Canadian cities, both approaches are investigated in this paper. While both model formulations yielded promising results and reasonable CM-Functions, the hierarchical model was found to be more suitable in retaining homogeneity of first-level SPFs while addressing CM-Functions in sub-level modeling. In addition, hierarchical models better capture the correlations between different impact factors. Copyright © 2014 Elsevier Ltd. All rights reserved.
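A generic form of an SPF with CM-Function components, in our notation rather than the paper's fitted models (N predicted crash frequency, AADT traffic volume, f_j a continuous function of design feature x_j, x_j^0 a baseline value), is:

```latex
N = \alpha\,(\mathrm{AADT})^{\beta}\,
    \exp\!\Big(\textstyle\sum_{j} f_j(x_j)\Big),
\qquad
\mathrm{CMF}_j(x_j \mid x_j^{0})
  = \frac{\exp\big(f_j(x_j)\big)}{\exp\big(f_j(x_j^{0})\big)}
```

In this form, each CM-Function is the ratio of the fitted effect at a treatment value to the effect at its baseline, which is what allows continuous, correlated effects in place of categorical CMFs.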
Ahmadi, Mehdi; Shahlaei, Mohsen
2015-01-01
P2X7 antagonist activity for a set of 49 molecules, derivatives of purine acting as P2X7 receptor antagonists, was modeled with the aid of chemometric and artificial intelligence techniques. The activity of these compounds was estimated by means of a combination of principal component analysis (PCA), as a well-known data reduction method, a genetic algorithm (GA), as a variable selection technique, and an artificial neural network (ANN), as a non-linear modeling method. First, linear regression combined with PCA (principal component regression) was used to model the structure-activity relationships, and afterwards a combination of PCA and an ANN algorithm was employed to accurately predict the biological activity of the P2X7 antagonists. PCA preserves as much of the information as possible contained in the original data set. The seven PCs most relevant to the studied activity were selected as ANN inputs by GA, an efficient variable selection method. The best computational neural network model was a fully-connected, feed-forward model with 7-7-1 architecture. The developed ANN model was fully evaluated by different validation techniques, including internal and external validation, and the chemical applicability domain. All validations showed that the constructed quantitative structure-activity relationship model is robust and satisfactory.
Leb, Victoria; Stöcher, Markus; Valentine-Thon, Elizabeth; Hölzl, Gabriele; Kessler, Harald; Stekel, Herbert; Berg, Jörg
2004-02-01
We report on the development of a fully automated real-time PCR assay for the quantitative detection of hepatitis B virus (HBV) DNA in plasma with EDTA (EDTA plasma). The MagNA Pure LC instrument was used for automated DNA purification and automated preparation of PCR mixtures. Real-time PCR was performed on the LightCycler instrument. An internal amplification control was devised as a PCR competitor and was introduced into the assay at the stage of DNA purification to permit monitoring for sample adequacy. The detection limit of the assay was found to be 200 HBV DNA copies/ml, with a linear dynamic range of 8 orders of magnitude. When samples from the European Union Quality Control Concerted Action HBV Proficiency Panel 1999 were examined, the results were found to be in acceptable agreement with the HBV DNA concentrations of the panel members. In a clinical laboratory evaluation of 123 EDTA plasma samples, a significant correlation was found with the results obtained by the Roche HBV Monitor test on the Cobas Amplicor analyzer within the dynamic range of that system. In conclusion, the newly developed assay has a markedly reduced hands-on time, permits monitoring for sample adequacy, and is suitable for the quantitative detection of HBV DNA in plasma in a routine clinical laboratory.
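Quantitation in real-time PCR assays of this kind typically proceeds through a standard curve relating threshold cycle (Ct) to log10 concentration; the slope and intercept below are illustrative placeholders (a slope of about -3.32 corresponds to 100% amplification efficiency), not this assay's calibration.

```python
# Typical real-time PCR quantitation: a standard curve maps threshold cycle
# (Ct) to log10 copies/ml. Slope and intercept are illustrative placeholders.

def copies_per_ml(ct: float, slope: float = -3.32, intercept: float = 38.0) -> float:
    return 10.0 ** ((ct - intercept) / slope)

for ct in (35.0, 28.4, 21.7):
    print(ct, f"{copies_per_ml(ct):.2e}")  # lower Ct means more template
```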
NASA Astrophysics Data System (ADS)
Sepúlveda, J.; Hoyos Ortiz, C. D.
2017-12-01
An adequate quantification of precipitation over land is critical for many societal applications, including agriculture, hydroelectricity generation, water supply, and risk management associated with extreme events. Rain gauges, the traditional method for precipitation estimation and an excellent one for estimating the volume of liquid water during a particular precipitation event, do not fully capture the high spatial variability of the phenomenon, which is a requirement for almost all practical applications. On the other hand, the weather radar, an active remote sensing sensor, provides a proxy for rainfall with fine spatial resolution and adequate temporal sampling; however, it does not measure surface precipitation. In order to fully exploit the capabilities of the weather radar, it is necessary to develop quantitative precipitation estimation (QPE) techniques combining radar information with in-situ measurements. Different QPE methodologies are explored and adapted to local observations in a highly complex terrain region in tropical Colombia using a C-band radar and a relatively dense network of rain gauges and disdrometers. One important result is that the expressions reported in the literature for extratropical locations are not representative of the conditions found in the tropical region studied. In addition to reproducing the state-of-the-art techniques, a new multi-stage methodology based on radar-derived variables and disdrometer data is proposed in order to achieve the best QPE possible. The main motivation for this new methodology is that most traditional QPE methods do not directly take into account the different uncertainty sources involved in the process. The main advantage of the multi-stage model compared to traditional models is that it allows assessing and quantifying the uncertainty in the surface rain rate estimation. The sub-hourly rainfall estimates produced by the multi-stage methodology are realistic compared to observed data, in spite of the many sources of uncertainty, including the sampling volume, the different physical principles of the sensors, the incomplete understanding of the microphysics of precipitation and, most importantly, the rapidly varying droplet size distribution.
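The first stage of any radar QPE chain is a reflectivity-to-rain-rate conversion. The classic Marshall-Palmer power law Z = 200 R^1.6 is sketched below purely as a reference point; the abstract's finding is precisely that extratropical coefficients like these fit the tropical site poorly and must be re-derived from local observations.

```python
# Reference Z-R conversion with Marshall-Palmer coefficients (a=200, b=1.6).
# These coefficients are a textbook baseline, not the study's local fit.

def rain_rate_mm_h(dbz: float, a: float = 200.0, b: float = 1.6) -> float:
    z_linear = 10.0 ** (dbz / 10.0)      # dBZ -> linear Z (mm^6 m^-3)
    return (z_linear / a) ** (1.0 / b)

for dbz in (20.0, 35.0, 50.0):
    print(dbz, "dBZ ->", round(rain_rate_mm_h(dbz), 2), "mm/h")  # ~0.65, 5.6, 48.6
```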
Qazi, Arish A; Pekar, Vladimir; Kim, John; Xie, Jason; Breen, Stephen L; Jaffray, David A
2011-11-01
Intensity modulated radiation therapy (IMRT) allows greater control over dose distribution, which leads to a decrease in radiation related toxicity. IMRT, however, requires precise and accurate delineation of the organs at risk and target volumes. Manual delineation is tedious and suffers from both interobserver and intraobserver variability. State-of-the-art auto-segmentation methods are either atlas-based, model-based or hybrid; however, robust fully automated segmentation is often difficult due to the insufficient discriminative information provided by standard medical imaging modalities for certain tissue types. In this paper, the authors present a fully automated hybrid approach which combines deformable registration with the model-based approach to accurately segment normal and target tissues from head and neck CT images. The segmentation process starts by using an average atlas to reliably identify salient landmarks in the patient image. The relationship between these landmarks and the reference dataset serves to guide a deformable registration algorithm, which allows for a close initialization of a set of organ-specific deformable models in the patient image, ensuring their robust adaptation to the boundaries of the structures. Finally, the models are automatically fine-adjusted by our boundary refinement approach, which attempts to model the uncertainty in model adaptation using a probabilistic mask. This uncertainty is subsequently resolved by voxel classification based on local low-level organ-specific features. To quantitatively evaluate the method, they auto-segment several organs at risk and target tissues from 10 head and neck CT images. They compare the segmentations to the manual delineations outlined by the expert. The evaluation is carried out by estimating two common quantitative measures on 10 datasets: the volume overlap fraction, or Dice similarity coefficient (DSC), and a geometrical metric, the median symmetric Hausdorff distance (HD), which is evaluated slice-wise. They achieve an average overlap of 93% for the mandible, 91% for the brainstem, 83% for the parotids, 83% for the submandibular glands, and 74% for the lymph node levels. Our automated segmentation framework is able to segment anatomy in the head and neck region with high accuracy within a clinically acceptable segmentation time.
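Of the two reported metrics, the Dice similarity coefficient is straightforward to compute on label masks; a minimal volumetric sketch is below (the slice-wise median symmetric Hausdorff distance is omitted).

```python
# Dice similarity coefficient (DSC) on boolean masks: 2|A ∩ B| / (|A| + |B|).
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

auto = np.zeros((10, 10), bool);   auto[2:8, 2:8] = True
manual = np.zeros((10, 10), bool); manual[3:9, 3:9] = True
print(round(dice(auto, manual), 3))  # two offset squares overlap: 0.694
```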
HYDRODYNAMIC SIMULATIONS OF H ENTRAINMENT AT THE TOP OF He-SHELL FLASH CONVECTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woodward, Paul R.; Lin, Pei-Hung; Herwig, Falk, E-mail: paul@lcse.umn.edu, E-mail: fherwig@uvic.ca
2015-01-01
We present the first three-dimensional, fully compressible gas-dynamics simulations in 4π geometry of He-shell flash convection with proton-rich fuel entrainment at the upper boundary. This work is motivated by the insufficiently understood observed consequences of the H-ingestion flash in post-asymptotic giant branch (post-AGB) stars (Sakurai's object) and metal-poor AGB stars. Our investigation is focused on the entrainment process at the top convection boundary and on the subsequent advection of H-rich material into deeper layers, and we therefore ignore the burning of the proton-rich fuel in this study. We find that for our deep convection zone, coherent convective motions of near global scale appear to dominate the flow. At the top boundary convective shear flows are stable against Kelvin-Helmholtz instabilities. However, such shear instabilities are induced by the boundary-layer separation in large-scale, opposing flows. This links the global nature of thick shell convection with the entrainment process. We establish the quantitative dependence of the entrainment rate on grid resolution. With our numerical technique, simulations with 1024^3 cells or more are required to reach a numerical fidelity appropriate for this problem. However, only the result from the 1536^3 simulation provides a clear indication that we approach convergence with regard to the entrainment rate. Our results demonstrate that our method, which is described in detail, can provide quantitative results related to entrainment and convective boundary mixing in deep stellar interior environments with very stiff convective boundaries. For the representative case we study in detail, we find an entrainment rate of 4.38 ± 1.48 × 10^-13 M_☉ s^-1.
Pentacam Scheimpflug quantitative imaging of the crystalline lens and intraocular lens.
Rosales, Patricia; Marcos, Susana
2009-05-01
To implement geometrical and optical distortion correction methods for anterior segment Scheimpflug images obtained with a commercially available system (Pentacam, Oculus Optikgeräte GmbH). Ray tracing algorithms were implemented to obtain corrected ocular surface geometry from the original images captured by the Pentacam's CCD camera. As details of the optical layout were not fully provided by the manufacturer, an iterative procedure (based on imaging of calibrated spheres) was developed to estimate the camera lens specifications. The correction procedure was tested on Scheimpflug images of a physical water-cell model eye (with a polymethylmethacrylate cornea and a commercial IOL of known dimensions) and of a normal human eye previously measured with a geometrical- and optical-distortion-corrected Scheimpflug camera (Topcon SL-45 [Topcon Medical Systems Inc] from the Vrije University, Amsterdam, Holland). Uncorrected Scheimpflug images show flatter surfaces and thinner lenses than in reality. The application of geometrical and optical distortion correction algorithms improves the accuracy of the estimated anterior lens radii of curvature by 30% to 40% and of the estimated posterior lens radii by 50% to 100%. The average error in the retrieved radii was 0.37 and 0.46 mm for the anterior and posterior lens radii of curvature, respectively, and 0.048 mm for lens thickness. The Pentacam Scheimpflug system can be used to obtain quantitative information on the geometry of the crystalline lens, provided that geometrical and optical distortion correction algorithms are applied, within the accuracy of state-of-the-art phakometry and biometry. The techniques could improve with exact knowledge of the technical specifications of the instrument, improved edge detection algorithms, consideration of aspheric and non-rotationally symmetrical surfaces, and introduction of a crystalline lens gradient index.
Ultrasonic laboratory measurements of the seismic velocity changes due to CO2 injection
NASA Astrophysics Data System (ADS)
Park, K. G.; Choi, H.; Park, Y. C.; Hwang, S.
2009-04-01
Monitoring the behavior and movement of carbon dioxide (CO2) in the subsurface is quite important in the sequestration of CO2 in geological formations, because such information provides a basis for demonstrating the safety of CO2 sequestration. Several recent applications in commercial and pilot scale projects and research show that 4D surface or borehole seismic methods are among the most promising techniques for this purpose. However, information interpreted from seismic velocity changes can be quite subjective and qualitative without petrophysical characterization of the effect of CO2 saturation on those changes, since seismic wave velocity depends on various factors and parameters such as mineralogical composition, hydrogeological factors, and in-situ conditions. In this respect, we have developed an ultrasonic laboratory measurement system and have carried out measurements on a porous sandstone sample to characterize the effects of CO2 injection on seismic velocity and amplitude. Measurements are made by ultrasonic piezoelectric transducers mounted on both ends of a cylindrical core sample under various pressure, temperature, and saturation conditions. According to our fundamental experiments, injected CO2 causes a decrease in seismic velocity and amplitude. We identified that the velocity decreases by about 6% or more until the sample is fully saturated by CO2, but the seismic amplitude is attenuated more drastically than the velocity decreases. We also identified that Vs/Vp, or the elastic modulus, is more sensitive to CO2 saturation. We note that this means seismic amplitude and elastic modulus changes can be alternative target anomalies for seismic techniques in CO2 sequestration monitoring. We therefore expect that further research can establish more quantitative petrophysical relationships between changes in seismic attributes and CO2 concentration, providing a basis for the quantitative assessment of CO2 sequestration.
Khowaja, Asif Raza; Qureshi, Rahat Najam; Sawchuck, Diane; Oladapo, Olufemi T; Adetoro, Olalekan O; Orenuga, Elizabeth A; Bellad, Mrutyunjaya; Mallapur, Ashalata; Charantimath, Umesh; Sevene, Esperança; Munguambe, Khátia; Boene, Helena Edith; Vidler, Marianne; Bhutta, Zulfiqar A; von Dadelszen, Peter
2016-06-08
Globally, pre-eclampsia and eclampsia are major contributors to maternal and perinatal mortality, the vast majority of which occurs in less developed countries. In addition, a disproportionate number of morbidities and mortalities occur due to delayed access to health services. The Community Level Interventions for Pre-eclampsia (CLIP) Trial aims to task-shift the identification and emergency management of pre-eclampsia and eclampsia to community health workers to improve access and timely care. The literature reveals a paucity of published feasibility assessments conducted prior to initiating large-scale community-based interventions. Arguably, well-conducted feasibility studies can provide valuable information about the potential success of clinical trials prior to implementation. Failure to fully understand the study context risks undermining the effective implementation of the intervention and limits the likelihood of post-trial scale-up. It was therefore imperative to conduct community-level feasibility assessments for a trial of this magnitude. A mixed-methods design guided by normalization process theory was used for this study in Nigeria, Mozambique, Pakistan, and India to explore enabling and impeding factors for CLIP Trial implementation. Qualitative data were collected through participant observation, document review, focus group discussions and in-depth interviews with diverse groups of community members, key informants at community level, healthcare providers, and policy makers. Quantitative data were collected through health facility assessments, self-administered community health worker surveys, and household demographic and health surveillance. Refer to the CLIP Trial feasibility publications in the current and/or forthcoming supplement. Feasibility assessments for community-level interventions, particularly those involving task-shifting across diverse regions, require an appropriate theoretical framework and careful selection of research methods. The use of qualitative and quantitative methods increased the data richness, enabling a better understanding of the community contexts. NCT01911494.
Some effects of adverse weather conditions on performance of airplane antiskid braking systems
NASA Technical Reports Server (NTRS)
Horne, W. B.; Mccarty, J. L.; Tanner, J. A.
1976-01-01
The performance of current antiskid braking systems operating under adverse weather conditions was analyzed in an effort both to identify the causes of the locked-wheel skids that sometimes occur when the runway is slippery and to find possible solutions to this operational problem. This analysis was made possible by the quantitative test data provided by recently completed landing research programs using fully instrumented flight test airplanes, and was further supported by tests performed at the Langley aircraft landing loads and traction facility. The antiskid system logic for brake control and for both touchdown and locked-wheel protection is described, and its response behavior in adverse weather is discussed in detail with the aid of available data. The analysis indicates that the operational performance of the antiskid logic circuits is highly dependent upon wheel spin-up acceleration and can be adversely affected by certain pilot braking inputs when accelerations are low. Normal antiskid performance is assured if the tire-to-runway traction is sufficient to provide high wheel spin-up accelerations or if the system is provided a continuous, accurate ground speed reference. The design of antiskid systems is complicated by the necessity for tradeoffs between tire braking and cornering capabilities, both of which are necessary to provide safe operations in the presence of cross winds, particularly under slippery runway conditions.
Optical coherence elastography in ophthalmology
NASA Astrophysics Data System (ADS)
Kirby, Mitchell A.; Pelivanov, Ivan; Song, Shaozhen; Ambrozinski, Łukasz; Yoon, Soon Joon; Gao, Liang; Li, David; Shen, Tueng T.; Wang, Ruikang K.; O'Donnell, Matthew
2017-12-01
Optical coherence elastography (OCE) can provide clinically valuable information based on local measurements of tissue stiffness. Improved light sources and scanning methods in optical coherence tomography (OCT) have led to rapid growth in systems for high-resolution, quantitative elastography using imaged displacements and strains within soft tissue to infer local mechanical properties. We describe in some detail the physical processes underlying tissue mechanical response based on static and dynamic displacement methods. Namely, the assumptions commonly used to interpret displacement and strain measurements in terms of tissue elasticity for static OCE and propagating wave modes in dynamic OCE are discussed with the ultimate focus on OCT system design for ophthalmic applications. Practical OCT motion-tracking methods used to map tissue elasticity are also presented to fully describe technical developments in OCE, particularly noting those focused on the anterior segment of the eye. Clinical issues and future directions are discussed in the hope that OCE techniques will rapidly move forward to translational studies and clinical applications.
MEK-Dependent Negative Feedback Underlies BCR-ABL-Mediated Oncogene Addiction
Asmussen, Jennifer; Lasater, Elisabeth A.; Tajon, Cheryl; Oses-Prieto, Juan; Jun, Young-wook; Taylor, Barry S.; Burlingame, Alma; Craik, Charles S.; Shah, Neil P.
2014-01-01
The clinical experience with BCR-ABL tyrosine kinase inhibitors (TKIs) for the treatment of chronic myeloid leukemia (CML) provides compelling evidence for oncogene addiction. Yet, the molecular basis of oncogene addiction remains elusive. Through unbiased quantitative phosphoproteomic analyses of CML cells transiently exposed to BCR-ABL TKI, we identified persistent downregulation of growth factor receptor (GF-R) signaling pathways. We then established and validated a tissue-relevant isogenic model of BCR-ABL-mediated addiction, and found evidence for myeloid GF-R signaling pathway rewiring that profoundly and persistently dampens physiologic pathway activation. We demonstrate that eventual restoration of ligand-mediated GF-R pathway activation is insufficient to fully rescue cells from a competing apoptotic fate. In contrast to previous work with BRAFV600E in melanoma cells, feedback inhibition following BCR-ABL TKI treatment is markedly prolonged, extending beyond the time required to initiate apoptosis. Mechanistically, BCR-ABL-mediated oncogene addiction is facilitated by persistent high levels of MEK-dependent negative feedback. PMID:24362263
Molecular descriptor data explain market prices of a large commercial chemical compound library
NASA Astrophysics Data System (ADS)
Polanski, Jaroslaw; Kucia, Urszula; Duszkiewicz, Roksana; Kurczyk, Agata; Magdziarz, Tomasz; Gasteiger, Johann
2016-06-01
The relationship between the structure and a property of a chemical compound is an essential concept in chemistry, guiding, for example, drug design. In practice, however, we also need economic considerations to fully understand the fate of drugs on the market. Here we perform, for the first time, an exploration of quantitative structure-economy relationships (QSER) for a large dataset: a commercial building-block library of over 2.2 million chemicals. This investigation provided molecular statistics showing that, on average, what we pay for is the quantity of matter. The influence of synthetic availability scores is also revealed. Finally, we buy substances by looking at molecular graphs or molecular formulas; thus, molecules that have a higher number of atoms look more attractive and are, on average, also more expensive. Our study shows how data binning can be used as an informative method when analyzing big data in chemistry.
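A baseline QSER of the kind described can be sketched as a simple regression of log price against a size descriptor such as heavy-atom count. All numbers below are invented for illustration and do not come from the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

heavy_atoms = np.array([[10], [15], [22], [30], [41]])        # size descriptor
log_price = np.log(np.array([12.0, 18.0, 35.0, 60.0, 95.0]))  # hypothetical USD/mmol

qser = LinearRegression().fit(heavy_atoms, log_price)
r2 = qser.score(heavy_atoms, log_price)  # variance in price explained by size
```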
Nano Petri dishes: a new polystyrene platform for studying cell-nanoengineered surface interactions
NASA Astrophysics Data System (ADS)
Cha, Kyoung Je; Na, Moon-Hee; Kim, Hyung Woo; Kim, Dong Sung
2014-05-01
In this study, we fabricated and fully characterized a new type of polystyrene (PS) cell-culture platform containing nanoengineered surfaces (NES), referred to as a nano Petri dish, which can be used at the transition stage of basic cell-NES interaction studies for clinical applications. Nano-injection molding in this study was used for the mass production of the nano Petri dish having nanopore arrays. The effects of processing parameters of the injection molding on the replication quality of the nanopore arrays were quantitatively evaluated by means of design of experiments based on the Taguchi method. This allowed efficient and reliable cell culture studies by providing large numbers of the same dishes, in addition to removing the fixation step of the NES plates inside the cell-culture container. Physical, chemical and mechanical properties of the NES, as well as cell behavior including attachment and proliferation of human osteosarcoma MG-63 cells on the NES, were then characterized, with and without the oxygen plasma surface treatment.
Patra, Chandra N
2014-11-14
A systematic investigation of the spherical electric double layers with the electrolytes having size as well as charge asymmetry is carried out using density functional theory and Monte Carlo simulations. The system is considered within the primitive model, where the macroion is a structureless hard spherical colloid, the small ions as charged hard spheres of different size, and the solvent is represented as a dielectric continuum. The present theory approximates the hard sphere part of the one particle correlation function using a weighted density approach whereas a perturbation expansion around the uniform fluid is applied to evaluate the ionic contribution. The theory is in quantitative agreement with Monte Carlo simulation for the density and the mean electrostatic potential profiles over a wide range of electrolyte concentrations, surface charge densities, valence of small ions, and macroion sizes. The theory provides distinctive evidence of charge and size correlations within the electrode-electrolyte interface in spherical geometry.
Dyess, Susan; Sherman, Rose
2011-01-01
The authors of the recently published Institute of Medicine report on the Future of Nursing emphasized the importance of preparing nurses to lead change to advance health care in the United States. Other scholars have linked practice environments to safe, quality care. For nurses to fully actualize this role in practice environments, they need to possess leadership skill sets that identify and respond to the challenges faced. New nurses are no exception. This article presents a program, with a 5-year track record, designed to support transition and enhance leadership skill sets for new nurses in their first year of practice. Qualitative and quantitative evaluation measurements at baseline and postprogram provided data for evaluation of the first 4 cohorts in the program. The evaluative outcomes presented indicate that new nurses gained leadership and translational research skills that contributed to their ability to influence practice environments. Nonetheless, practice environments continue to need improvement, and ongoing leadership from all levels of nursing must be upheld.
Non-radioactive detection of trinucleotide repeat size variability.
Tomé, Stéphanie; Nicole, Annie; Gomes-Pereira, Mario; Gourdon, Genevieve
2014-03-06
Many human diseases are associated with the abnormal expansion of unstable trinucleotide repeat sequences. The mechanisms of trinucleotide repeat size mutation have not been fully dissected, and their understanding must be grounded on the detailed analysis of repeat size distributions in human tissues and animal models. Small-pool PCR (SP-PCR) is a robust, highly sensitive and efficient PCR-based approach to assess the levels of repeat size variation, providing both quantitative and qualitative data. The method relies on the amplification of a very low number of DNA molecules, through successive dilution of a stock genomic DNA solution. Radioactive Southern blot hybridization is sensitive enough to detect SP-PCR products derived from single template molecules, separated by agarose gel electrophoresis and transferred onto DNA membranes. We describe a variation of the detection method that uses digoxigenin-labelled locked nucleic acid probes. This protocol keeps the sensitivity of the original method, while eliminating the health risks associated with the manipulation of radiolabelled probes, and the burden associated with their regulation, manipulation and waste disposal.
Yong, Yan Ling; Tan, Li Kuo; McLaughlin, Robert A; Chee, Kok Han; Liew, Yih Miin
2017-12-01
Intravascular optical coherence tomography (OCT) is an optical imaging modality commonly used in the assessment of coronary artery diseases during percutaneous coronary intervention. Manual segmentation to assess luminal stenosis from OCT pullback scans is challenging and time consuming. We propose a linear-regression convolutional neural network to automatically perform vessel lumen segmentation, parameterized in terms of radial distances from the catheter centroid in polar space. Benchmarked against gold-standard manual segmentation, our proposed algorithm achieves average locational accuracy of the vessel wall of 22 microns, and 0.985 and 0.970 in Dice coefficient and Jaccard similarity index, respectively. The average absolute error of luminal area estimation is 1.38%. The processing rate is 40.6 ms per image, suggesting the potential to be incorporated into a clinical workflow and to provide quantitative assessment of vessel lumen in an intraoperative time frame. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
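For reference, the two overlap metrics quoted can be computed from binary masks as follows; this is a minimal sketch, not the authors' evaluation code.

```python
import numpy as np

def dice(a, b):
    """Dice coefficient between two boolean segmentation masks."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def jaccard(a, b):
    """Jaccard similarity index (intersection over union)."""
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()
```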
Numerical formulation for the prediction of solid/liquid change of a binary alloy
NASA Technical Reports Server (NTRS)
Schneider, G. E.; Tiwari, S. N.
1990-01-01
A computational model is presented for the prediction of solid/liquid phase change energy transport, including the influence of free convection fluid flow in the liquid phase region. The computational model considers the velocity components of all non-liquid phase change material control volumes to be zero but fully solves the coupled mass-momentum problem within the liquid region. The thermal energy model includes the entire domain and uses an enthalpy-like model and a recently developed method for handling the phase change interface nonlinearity. Convergence studies are performed and comparisons made with experimental data for two different problem specifications. The convergence studies indicate that grid independence was achieved, and the comparison with experimental data indicates excellent quantitative prediction of the melt fraction evolution. Qualitative data are also provided in the form of velocity vector diagrams and isotherm plots for selected times in the evolution of both problems. The computational costs incurred are quite low by comparison with previous efforts on solving these problems.
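The enthalpy-like formulation can be illustrated by the inversion step that recovers temperature from enthalpy in each control volume. The sketch below assumes a liquid fraction growing linearly between the solidus Ts and liquidus Tl of the binary alloy, with constant specific heat c and latent heat L; this closure is an assumption for illustration, not necessarily the paper's.

```python
def temperature_from_enthalpy(H, c, L, Ts, Tl):
    """Invert H = c*T + L*f(T), with liquid fraction f rising linearly
    from 0 at the solidus Ts to 1 at the liquidus Tl."""
    if H <= c * Ts:
        return H / c                    # fully solid
    if H >= c * Tl + L:
        return (H - L) / c              # fully liquid
    # mushy zone: H = c*T + L*(T - Ts)/(Tl - Ts)
    return (H + L * Ts / (Tl - Ts)) / (c + L / (Tl - Ts))
```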
Expansion of Non-Quasi-Neutral Limited Plasmas Driven by Two-Temperature Electron Clouds
NASA Astrophysics Data System (ADS)
Murakami, Masakatsu; Honrubia, Javier
2017-10-01
Fast heating of an isolated solid mass, under irradiation by an ultra-intense ultra-short laser pulse, to averaged temperatures of the order of keV is theoretically studied. Achievable maximum ion temperatures are determined as a consequence of the interplay between electron-to-ion energy deposition and nonrelativistic plasma expansion, where fast ion emission plays an important role in the energy balance. To describe the plasma expansion, we develop a self-similar solution in which the plasma is composed of three fluids, i.e., ions and two-temperature electrons. Under the condition of isothermal electron expansion in cylindrical geometry, such a fluid system, self-consistently incorporating the Poisson equation, is fully solved. The charge separation and the resultant ion population accelerated by the induced electrostatic field are quantitatively presented. The analytical model is compared with two-dimensional hydrodynamic simulations to provide practical working windows for the target and laser parameters for fast heating.
Molecular Filters for Noise Reduction.
Laurenti, Luca; Csikasz-Nagy, Attila; Kwiatkowska, Marta; Cardelli, Luca
2018-06-19
Living systems are inherently stochastic and operate in a noisy environment, yet despite all these uncertainties, they perform their functions in a surprisingly reliable way. The biochemical mechanisms used by natural systems to tolerate and control noise are still not fully understood, and this issue also limits our capacity to engineer reliable, quantitative synthetic biological circuits. We study how representative models of biochemical systems propagate and attenuate noise, accounting for intrinsic as well as extrinsic noise. We investigate three molecular noise-filtering mechanisms, study their noise-reduction capabilities and limitations, and show that nonlinear dynamics such as complex formation are necessary for efficient noise reduction. We further suggest that the derived molecular filters are widespread in gene expression and regulation and, particularly, that microRNAs can serve as such noise filters. To our knowledge, our results provide new insight into how biochemical networks control noise and could be useful to build robust synthetic circuits. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
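As a baseline for what such filters must beat, intrinsic noise in an unfiltered birth-death process gives a Fano factor of about 1 at steady state. The Gillespie-style sketch below illustrates how that baseline is quantified; it is a generic example, not one of the authors' models.

```python
import numpy as np

rng = np.random.default_rng(0)

def gillespie_birth_death(k, gamma, t_end):
    """Exact simulation of production (rate k) and degradation
    (rate gamma * x); returns the copy-number trajectory."""
    t, x, xs = 0.0, 0, []
    while t < t_end:
        a_prod, a_deg = k, gamma * x
        a_total = a_prod + a_deg
        t += rng.exponential(1.0 / a_total)
        x += 1 if rng.random() < a_prod / a_total else -1
        xs.append(x)
    return np.array(xs)

x = gillespie_birth_death(k=10.0, gamma=1.0, t_end=500.0)
fano = x[len(x) // 2:].var() / x[len(x) // 2:].mean()  # ~1 for Poissonian noise
```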
Noise from Supersonic Coaxial Jets. Part 1; Mean Flow Predictions
NASA Technical Reports Server (NTRS)
Dahl, Milo D.; Morris, Philip J.
1997-01-01
Recent theories for supersonic jet noise have used an instability wave noise generation model to predict radiated noise. This model requires a known mean flow that has typically been described by simple analytic functions for single jet mean flows. The mean flow of supersonic coaxial jets is not described easily in terms of analytic functions. To provide these profiles at all axial locations, a numerical scheme is developed to calculate the mean flow properties of a coaxial jet. The Reynolds-averaged, compressible, parabolic boundary layer equations are solved using a mixing length turbulence model. Empirical correlations are developed to account for the effects of velocity and temperature ratios and Mach number on the shear layer spreading. Both normal velocity profile and inverted velocity profile coaxial jets are considered. The mixing length model is modified in each case to obtain reasonable results when the two stream jet merges into a single fully developed jet. The mean flow calculations show both good qualitative and quantitative agreement with measurements in single and coaxial jet flows.
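The closure referred to is, in its generic Prandtl form (the paper's calibrated correlations are not reproduced here),

```latex
\mu_T = \rho\,\ell^{2}\left|\frac{\partial u}{\partial y}\right|,
\qquad \ell = \alpha\,\delta_{\omega}(x),
```

where δω(x) is the local shear-layer width and α an empirical constant; the empirical correlations mentioned in the abstract effectively adjust α (and the merged-jet treatment) for velocity ratio, temperature ratio, and Mach number.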
TrustBuilder2: A Reconfigurable Framework for Trust Negotiation
NASA Astrophysics Data System (ADS)
Lee, Adam J.; Winslett, Marianne; Perano, Kenneth J.
To date, research in trust negotiation has focused mainly on the theoretical aspects of the trust negotiation process and the development of proof-of-concept implementations. These theoretical works and proofs of concept have been quite successful from a research perspective, and thus researchers must now begin to address the systems constraints that act as barriers to the deployment of these systems. To this end, we present TrustBuilder2, a fully configurable and extensible framework for prototyping and evaluating trust negotiation systems. TrustBuilder2 leverages a plug-in based architecture, an extensible data type hierarchy, and a flexible communication protocol to provide a framework within which numerous trust negotiation protocols and system configurations can be quantitatively analyzed. In this paper, we discuss the design and implementation of TrustBuilder2, study its performance, examine the costs associated with flexible authorization systems, and leverage this knowledge to identify potential topics for future research, as well as a novel method for attacking trust negotiation systems.
Role of isostaticity and load-bearing microstructure in the elasticity of yielded colloidal gels.
Hsiao, Lilian C; Newman, Richmond S; Glotzer, Sharon C; Solomon, Michael J
2012-10-02
We report a simple correlation between microstructure and strain-dependent elasticity in colloidal gels by visualizing the evolution of cluster structure in high strain-rate flows. We control the initial gel microstructure by inducing different levels of isotropic depletion attraction between particles suspended in refractive index matched solvents. Contrary to previous ideas from mode coupling and micromechanical treatments, our studies show that bond breakage occurs mainly due to the erosion of rigid clusters that persist far beyond the yield strain. This rigidity contributes to gel elasticity even when the sample is fully fluidized; the origin of the elasticity is the slow Brownian relaxation of rigid, hydrodynamically interacting clusters. We find a power-law scaling of the elastic modulus with the stress-bearing volume fraction that is valid over a range of volume fractions and gelation conditions. These results provide a conceptual framework to quantitatively connect the flow-induced microstructure of soft materials to their nonlinear rheology.
Gregus, Michal; Roberg-Larsen, Hanne; Lundanes, Elsa; Foret, Frantisek; Kuban, Petr; Wilson, Steven Ray
2017-10-01
Capillary electrophoresis (CE) can provide high separation efficiency with very simple instrumentation, but has yet to be explored for oxysterols and cholesterol. Cholesterol and 25-hydroxycholesterol (both are 4-ene-3-ketosteroids) were quantitatively transformed into hydrazones using Girard P reagent after enzymatic oxidation by cholesterol oxidase. Separation was achieved using non-aqueous capillary electrophoresis with UV detection at 280 nm; the "charge-tagging" Girard P reagent provided both charge and a chromophore (requirements for CE-UV). Excess reagent was also separated from the two analytes, eliminating the need for its removal prior to analysis. The compounds were separated in less than 5 min with excellent separation efficiency, using separation electrolytes fully compatible with mass spectrometry. The CE-UV method was used to optimize the charge-tagging steps, revealing that the procedure is affected by the analyte/reagent ratio and reaction time, but also by the analyte structure. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Repellin, Cécile; Cook, Ashley M.; Neupert, Titus; Regnault, Nicolas
2018-03-01
Fractional quantum Hall-superconductor heterostructures may provide a platform towards non-abelian topological modes beyond Majoranas. However, their quantitative theoretical study remains extremely challenging. We propose and implement a numerical setup for studying edge states of fractional quantum Hall droplets with a superconducting instability. The fully gapped edges carry a topological degree of freedom that can encode quantum information protected against local perturbations. We simulate such a system numerically using exact diagonalization by restricting the calculation to the quasihole subspace of a (time-reversal symmetric) bilayer fractional quantum Hall system of Laughlin ν = 1/3 states. We show that the edge ground states are permuted by spin-dependent flux insertion and demonstrate their fractional 6π Josephson effect, evidencing their topological nature and the Cooper pairing of fractionalized quasiparticles. The versatility and efficiency of our setup make it a well-suited method to tackle wider questions of edge phases and phase transitions in fractional quantum Hall systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruban, V. P., E-mail: ruban@itp.ac.ru
2015-05-15
The nonlinear dynamics of an obliquely oriented wave packet on a sea surface is analyzed analytically and numerically for various initial parameters of the packet in relation to the problem of the so-called rogue waves. Within the Gaussian variational ansatz applied to the corresponding (1+2)-dimensional hyperbolic nonlinear Schrödinger equation (NLSE), a simplified Lagrangian system of differential equations is derived that describes the evolution of the coefficients of the real and imaginary quadratic forms appearing in the Gaussian. This model provides a semi-quantitative description of the process of nonlinear spatiotemporal focusing, which is one of the most probable mechanisms of rogue wave formation in random wave fields. The system of equations is integrated in quadratures, which allows one to better understand the qualitative differences between the linear and nonlinear focusing regimes of a wave packet. Predictions of the Gaussian model are compared with the results of direct numerical simulation of fully nonlinear long-crested waves.
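For orientation, one common deep-water normalization of the hyperbolic (1+2)-dimensional NLSE for the envelope A(x, y, t) of a carrier with wavenumber k0 and frequency ω0 is

```latex
i\left(\frac{\partial A}{\partial t} + c_g\,\frac{\partial A}{\partial x}\right)
- \frac{\omega_0}{8k_0^{2}}\,\frac{\partial^{2} A}{\partial x^{2}}
+ \frac{\omega_0}{4k_0^{2}}\,\frac{\partial^{2} A}{\partial y^{2}}
- \frac{\omega_0 k_0^{2}}{2}\,|A|^{2}A = 0,
```

the opposite signs of the two dispersive terms being what makes the equation hyperbolic; the paper's own normalization may differ.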
NASA Astrophysics Data System (ADS)
Refat, Moamen S.; Adam, Abdel Majid A.; Saad, Hosam A.
2015-04-01
The study of the complexing ability of macrocyclic compounds towards organic and inorganic substances is of great interest. The aim of this work is to provide basic data for the quantitative assessment of macrocyclic crown ethers based on charge-transfer (CT) complexation. This goal was achieved by preparing CT complexes of two mixed nitrogen-oxygen crown ethers with acidic acceptors (chloranilic acid and picric acid), which were fully structurally characterized. The crown ethers are 4,7,13,16,21,24-hexaoxa-1,10-diazabicyclo[8.8.8]hexacosane (HDHC) and 1,4,10-trioxa-7,13-diaza-cyclopentadecane (TDPD). The obtained complexes were structurally characterized via elemental analysis, IR, Raman, 1H NMR, and UV-visible spectroscopy. Thermal properties of these complexes were also studied, and their kinetic and thermodynamic parameters were calculated. Furthermore, the microstructural properties of these complexes were investigated using X-ray diffraction (XRD) and scanning electron microscopy (SEM).
Lamontanara, Allan Joaquim; Georgeon, Sandrine; Tria, Giancarlo; Svergun, Dmitri I; Hantschel, Oliver
2014-11-17
The activity of protein kinases is regulated by multiple molecular mechanisms, and their disruption is a common driver of oncogenesis. A central and almost universal control element of protein kinase activity is the activation loop that utilizes both conformation and phosphorylation status to determine substrate access. In this study, we use recombinant Abl tyrosine kinases and conformation-specific kinase inhibitors to quantitatively analyse structural changes that occur after Abl activation. Allosteric SH2-kinase domain interactions were previously shown to be essential for the leukemogenesis caused by the Bcr-Abl oncoprotein. We find that these allosteric interactions switch the Abl activation loop from a closed to a fully open conformation. This enables the trans-autophosphorylation of the activation loop and requires prior phosphorylation of the SH2-kinase linker. Disruption of the SH2-kinase interaction abolishes activation loop phosphorylation. Our analysis provides a molecular mechanism for the SH2 domain-dependent activation of Abl that may also regulate other tyrosine kinases.
Smartphones as Integrated Kinematic and Dynamic Sensors for Amusement Park Physics Applications
NASA Astrophysics Data System (ADS)
Peterson, Stephanie; Dennison, J. R.
2010-10-01
USU has hosted Physics Day at Lagoon for 21 years, attracting more than 120,000 secondary educators and students over that time. During this educational day, students explore basic physics concepts and apply their classroom content outdoors, in real-world applications. As part of the event, USU's Physics Department provides curriculum to be used at Lagoon, in similar outside venues, and in the classroom. One such educational instrument, and a primary focus of this work, is a set of student workbooks filled with activities ranging from very simple to more advanced topics. Workbooks cover the properties of waves, relative velocity, and acceleration, topics which have historically challenged students; future topics include kinematics, energy, and forces. The topics were selected based on requests from teachers throughout the Intermountain Region and on identified deficiencies in student performance on core curriculum assessments. An innovative approach is to identify physical applications of iPhone and Android smartphone software technologies, which make use of dynamic and kinematic sensors. These technologies allow students to realize their ability to do quantitative physics calculations anywhere, anytime, an approach that is highly appealing to today's teenage learners. This also provides an exciting way to more fully engage students in learning physics concepts.
Torabi, Mohsen; Nadali, Iman Zohoorian
2016-01-01
Given the importance of health care providers such as nurses, who are always in stressful environments, it is imperative to better understand how they become more engaged in their work. The purpose of this paper is to focus on health care providers (nurses) and examine how the interaction between spiritual intelligence and psychological empowerment affects job engagement. This descriptive, quantitative study was conducted among nurses at the Faghihi Hospital in Shiraz, Iran in 2010. A sample of nurses (n = 179) completed a standard survey questionnaire covering spiritual intelligence, psychological empowerment, and job engagement, with 5 questions for each dimension. To test the hypotheses of the study, results were analyzed through structural equation modeling (SEM) using LISREL 8.8. SEM revealed that psychological empowerment fully mediates the relationship between spiritual intelligence and job engagement. The correlation between spiritual intelligence and job engagement was significant but weak using the Pearson coefficient method. This implies that psychological empowerment plays a crucial role in the relationship between spiritual intelligence and job engagement. This paper indicates that spiritual intelligence might affect different organizational parameters, directly or indirectly. It is therefore recommended that researchers evaluate probable relationships between spiritual intelligence and other variables.
Tavares, Anthony J; Noor, M Omair; Vannoy, Charles H; Algar, W Russ; Krull, Ulrich J
2012-01-03
The glass surface of a glass-polydimethylsiloxane (PDMS) microfluidic channel was modified to develop a solid-phase assay for quantitative determination of nucleic acids. Electroosmotic flow (EOF) within channels was used to deliver and immobilize semiconductor quantum dots (QDs), and electrophoresis was used to decorate the QDs with oligonucleotide probe sequences. These processes took only minutes to complete. The QDs served as energy donors in fluorescence resonance energy transfer (FRET) for transduction of nucleic acid hybridization. Electrokinetic injection of fluorescent dye (Cy3) labeled oligonucleotide target into a microfluidic channel and subsequent hybridization (within minutes) provided the proximity for FRET, with emission from Cy3 being the analytical signal. The quantification of target concentration was achieved by measurement of the spatial length of coverage by target along a channel. Detection of femtomole quantities of target was possible with a dynamic range spanning an order of magnitude. The assay provided excellent resistance to nonspecific interactions of DNA. Further selectivity of the assay was achieved using 20% formamide, which allowed discrimination between a fully complementary target and a 3 base pair mismatch target at a contrast ratio of 4:1. © 2011 American Chemical Society
Extremal entanglement and mixedness in continuous variable systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adesso, Gerardo; Serafini, Alessio; Illuminati, Fabrizio
2004-08-01
We investigate the relationship between mixedness and entanglement for Gaussian states of continuous variable systems. We introduce generalized entropies based on Schatten p norms to quantify the mixedness of a state and derive their explicit expressions in terms of symplectic spectra. We compare the hierarchies of mixedness provided by such measures with the one provided by the purity (defined as tr ρ² for the state ρ) for generic n-mode states. We then review the analysis proving the existence of both maximally and minimally entangled states at given global and marginal purities, with the entanglement quantified by the logarithmic negativity. Based on these results, we extend such an analysis to generalized entropies, introducing and fully characterizing maximally and minimally entangled states for given global and local generalized entropies. We compare the different roles played by the purity and by the generalized p entropies in quantifying the entanglement and the mixedness of continuous variable systems. We introduce the concept of average logarithmic negativity, showing that it allows a reliable quantitative estimate of continuous variable entanglement by direct measurements of global and marginal generalized p entropies.
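As a reminder of the standard convention (the paper's exact normalization is not reproduced here), for a Gaussian state with symplectic eigenvalues νk the Schatten-based entropies take the closed form

```latex
S_{p} = \frac{1 - \operatorname{Tr}\rho^{p}}{p - 1},
\qquad
\operatorname{Tr}\rho^{p} = \prod_{k}\frac{2^{p}}{(\nu_{k}+1)^{p} - (\nu_{k}-1)^{p}},
```

which for p = 2 reduces to the purity tr ρ² = ∏k 1/νk.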
The environmental impact of wind turbine blades
NASA Astrophysics Data System (ADS)
Liu, P.; Barlow, C. Y.
2016-07-01
The first generation of wind turbine (WT) blades is now reaching end of life, signalling the beginning of a large problem for the future. Currently most waste is sent to landfill, which is not an environmentally desirable solution. Awareness of this issue is rising, but no studies have fully assessed the eco impact of WT blades. The present study aims to provide a macroscopic quantitative assessment of the lifetime environmental impact of WT blades. The first stage was to analyse global data to calculate the amount of WT blade materials consumed in the past. The life cycle environmental impact of a single WT blade was then estimated using eco data for raw materials, manufacturing processes, transportation, and operation and maintenance processes. For a typical 45.2-meter 1.5 MW blade, the lifetime energy consumption is 795 GJ (CO2 footprint 42.1 tonnes), dominated by manufacturing processes and raw materials (96% of the total). Based on the 2014 installed capacity, the total mass of WT blades is 78 kt, their energy consumption is 82 TJ and the carbon dioxide footprint is 4.35 Mt. These figures will provide a basis for suggesting possible solutions to reduce the environmental impact of WT blades.
Thermal image analysis using the serpentine method
NASA Astrophysics Data System (ADS)
Koprowski, Robert; Wilczyński, Sławomir
2018-03-01
Thermal imaging is an increasingly widespread alternative to other imaging methods. As a supplementary method in diagnostics, it can be used both statically and with dynamic temperature changes. The paper proposes a new image analysis method that allows for the acquisition of new diagnostic information as well as object segmentation. The proposed serpentine analysis uses known methods of image analysis and processing together with new ones proposed by the authors. Affine transformations of an image and subsequent Fourier analysis provide a new diagnostic quality. The method is fully repeatable and automatic, and independent of inter-individual variability in patients. The segmentation results are 10% better than those obtained with the watershed method and with a hybrid segmentation method based on the Canny detector. The first and second harmonics of the serpentine analysis make it possible to determine the type of temperature changes in the region of interest (gradient, number of heat sources, etc.). The presented serpentine method provides new quantitative information on thermal imaging; since it allows for image segmentation and for locating the contact points of two or more heat sources (local minima), it can be used to support medical diagnostics in many areas of medicine.
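The abstract does not specify the serpentine sampling itself; one plausible minimal reading is a boustrophedon unrolling of the image followed by Fourier analysis, sketched below in Python. The function name and the interpretation are assumptions, and the authors' affine-transformation step is omitted.

```python
import numpy as np

def serpentine_profile(img):
    """Unroll a 2-D thermal image into a 1-D profile by scanning rows
    alternately left-to-right and right-to-left."""
    rows = [row if i % 2 == 0 else row[::-1] for i, row in enumerate(img)]
    return np.concatenate(rows)

img = np.random.rand(64, 64)                       # stand-in thermal image
profile = serpentine_profile(img)
spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
h1, h2 = spectrum[1], spectrum[2]                  # first and second harmonics
```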
Genome-wide analysis of the WRKY transcription factors in Aegilops tauschii.
Ma, Jianhui; Zhang, Daijing; Shao, Yun; Liu, Pei; Jiang, Lina; Li, Chunxi
2014-01-01
The WRKY transcription factors (TFs) play important roles in responding to abiotic and biotic stress in plants. However, due to its unfinished genome sequencing, relatively few WRKY TFs with full-length coding sequences (CDSs) have been identified in wheat. Instead, the Aegilops tauschii genome, which is the D-genome progenitor of the hexaploid wheat genome, provides important resources for the discovery of new genes. In this study, we performed a bioinformatics analysis to identify WRKY TFs with full-length CDSs from the A. tauschii genome. A detailed evolutionary analysis for all these TFs was conducted, and quantitative real-time PCR was carried out to investigate the expression patterns of the abiotic stress-related WRKY TFs under different abiotic stress conditions in A. tauschii seedlings. A total of 93 WRKY TFs were identified from A. tauschii, and 79 of them were found to be newly discovered genes compared with wheat. Gene phylogeny, gene structure and chromosome location of the 93 WRKY TFs were fully analyzed. These studies provide a global view of the WRKY TFs from A. tauschii and a firm foundation for further investigations in both A. tauschii and wheat. © 2015 S. Karger AG, Basel.
Shear wave velocity imaging using transient electrode perturbation: phantom and ex vivo validation.
DeWall, Ryan J; Varghese, Tomy; Madsen, Ernest L
2011-03-01
This paper presents a new shear wave velocity imaging technique to monitor radio-frequency and microwave ablation procedures, coined electrode vibration elastography. A piezoelectric actuator attached to an ablation needle is transiently vibrated to generate shear waves that are tracked at high frame rates. The time-to-peak algorithm is used to reconstruct the shear wave velocity and thereby the shear modulus variations. The feasibility of electrode vibration elastography is demonstrated using finite element models and ultrasound simulations, tissue-mimicking phantoms simulating fully (phantom 1) and partially ablated (phantom 2) regions, and an ex vivo bovine liver ablation experiment. In phantom experiments, good boundary delineation was observed. Shear wave velocity estimates were within 7% of mechanical measurements in phantom 1 and within 17% in phantom 2. Good boundary delineation was also demonstrated in the ex vivo experiment. The shear wave velocity estimates inside the ablated region were higher than mechanical testing estimates, but estimates in the untreated tissue were within 20% of mechanical measurements. A comparison of electrode vibration elastography and electrode displacement elastography showed the complementary information that they can provide. Electrode vibration elastography shows promise as an imaging modality that provides ablation boundary delineation and quantitative information during ablation procedures.
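The time-to-peak reconstruction named above admits a compact sketch: peak arrival times are regressed against lateral position, and the inverse slope is the local shear wave speed (with μ = ρv² giving the shear modulus under the usual assumptions). This is a minimal illustration, not the authors' implementation.

```python
import numpy as np

def shear_wave_speed(displacement, dx_m, dt_s):
    """displacement: array (n_lateral, n_time) of tracked axial motion.
    Fit peak-arrival time versus lateral position; speed = 1 / slope."""
    t_peak = displacement.argmax(axis=1) * dt_s
    x = np.arange(displacement.shape[0]) * dx_m
    slope = np.polyfit(x, t_peak, 1)[0]  # s/m
    return 1.0 / slope                   # m/s
```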
Retardation effects on the dispersion and propagation of plasmons in metallic nanoparticle chains
NASA Astrophysics Data System (ADS)
Downing, Charles A.; Mariani, Eros; Weick, Guillaume
2018-01-01
We consider a chain of regularly-spaced spherical metallic nanoparticles, where each particle supports three degenerate localized surface plasmons. Due to the dipolar interaction between the nanoparticles, the localized plasmons couple to form extended collective modes. Using an open quantum system approach in which the collective plasmons are interacting with vacuum electromagnetic modes and which, importantly, readily incorporates retardation via the light-matter coupling, we analytically evaluate the resulting radiative frequency shifts of the plasmonic bandstructure. For subwavelength-sized nanoparticles, our analytical treatment provides an excellent quantitative agreement with the results stemming from laborious numerical calculations based on fully-retarded solutions to Maxwell’s equations. Indeed, the explicit expressions for the plasmonic spectrum which we provide showcase how including retardation gives rise to a logarithmic singularity in the bandstructure of transverse-polarized plasmons. We further study the impact of retardation effects on the propagation of plasmonic excitations along the chain. While for the longitudinal modes, retardation has a negligible effect, we find that the retarded dipolar interaction can significantly modify the plasmon propagation in the case of transverse-polarized modes. Moreover, our results elucidate the analogy between radiative effects in nanoplasmonic systems and the cooperative Lamb shift in atomic physics.
Perceptions of drinking water quality and risk and its effect on behaviour: a cross-national study.
Doria, Miguel de França; Pidgeon, Nick; Hunter, Paul R
2009-10-15
There is a growing effort to provide drinking water that has the trust of consumers, but the processes underlying the perception of drinking water quality and risks are still not fully understood. This paper intends to explore the factors involved in public perception of the quality and risks of drinking water. This purpose was addressed with a cross-national mixed-method approach, based on quantitative (survey) and qualitative (focus groups) data collected in the UK and Portugal. The data were analysed using several methods, including structural equation models and generalised linear models. Results suggest that perceptions of water quality and risk result from a complex interaction of diverse factors. The estimation of water quality is mostly influenced by satisfaction with organoleptic properties (especially flavour), risk perception, contextual cues, and perceptions of chemicals (lead, chlorine, and hardness). Risk perception is influenced by organoleptics, perceived water chemicals, external information, past health problems, and trust in water suppliers, among other factors. The use of tap and bottled water to drink was relatively well explained by regression analysis. Several cross-national differences were found and the implications are discussed. Suggestions for future research are provided.
Phase calibration target for quantitative phase imaging with ptychography.
Godden, T M; Muñiz-Piniella, A; Claverley, J D; Yacoot, A; Humphry, M J
2016-04-04
Quantitative phase imaging (QPI) utilizes refractive index and thickness variations that lead to optical phase shifts. This gives contrast to images of transparent objects. In quantitative biology, phase images are used to accurately segment cells and calculate properties such as dry mass, volume and proliferation rate. The fidelity of the measured phase shifts is of critical importance in this field. However to date, there has been no standardized method for characterizing the performance of phase imaging systems. Consequently, there is an increasing need for protocols to test the performance of phase imaging systems using well-defined phase calibration and resolution targets. In this work, we present a candidate for a standardized phase resolution target, and measurement protocol for the determination of the transfer of spatial frequencies, and sensitivity of a phase imaging system. The target has been carefully designed to contain well-defined depth variations over a broadband range of spatial frequencies. In order to demonstrate the utility of the target, we measure quantitative phase images on a ptychographic microscope, and compare the measured optical phase shifts with Atomic Force Microscopy (AFM) topography maps and surface profile measurements from coherence scanning interferometry. The results show that ptychography has fully quantitative nanometer sensitivity in optical path differences over a broadband range of spatial frequencies for feature sizes ranging from micrometers to hundreds of micrometers.
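For orientation, the optical path difference that QPI measures follows the standard relation: a feature of height h and refractive index n_s immersed in a medium of index n_m imparts a phase shift

```latex
\Delta\varphi = \frac{2\pi}{\lambda}\,(n_{s} - n_{m})\,h,
```

which is what the target's calibrated depth variations probe across its range of spatial frequencies.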
Sasai-Sakuma, Taeko; Frauscher, Birgit; Mitterling, Thomas; Ehrmann, Laura; Gabelia, David; Brandauer, Elisabeth; Inoue, Yuichi; Poewe, Werner; Högl, Birgit
2014-09-01
Rapid eye movement (REM) sleep without atonia (RWA) is observed in some patients without a clinical history of REM sleep behavior disorder (RBD). It remains unknown whether these patients meet the refined quantitative electromyographic (EMG) criteria supporting a clinical RBD diagnosis. We quantitatively evaluated EMG activity and investigated its overnight distribution in patients with isolated qualitative RWA. Fifty participants with an incidental polysomnographic (PSG) finding of RWA (isolated qualitative RWA) were included. Tonic, phasic, and 'any' EMG activity during REM sleep on PSG was quantified retrospectively. Referring to the quantitative cut-off values for a polysomnographic diagnosis of RBD, 7/50 (14%) and 6/50 (12%) of the patients showed phasic and 'any' EMG activity in the mentalis muscle above the respective cut-off values. No patient was above the cut-off value for tonic EMG activity or for phasic EMG activity in the anterior tibialis muscles. Patients with RWA above the cut-off value showed higher amounts of RWA during later REM sleep periods. This is the first study showing that some subjects with incidental RWA meet the refined quantitative EMG criteria for a diagnosis of RBD. Future longitudinal studies must investigate whether this subgroup with isolated qualitative RWA is at increased risk of developing fully expressed RBD and/or neurodegenerative disease. Copyright © 2014 Elsevier B.V. All rights reserved.
Hill, Jacqueline J; Kuyken, Willem; Richards, David A
2014-11-20
Stepped care is recommended and implemented as a means to organise depression treatment. Compared with alternative systems, it is assumed to achieve equivalent clinical effects and greater efficiency. However, no trials have examined these assumptions. A fully powered trial of stepped care compared with intensive psychological therapy is required but a number of methodological and procedural uncertainties associated with the conduct of a large trial need to be addressed first. STEPS (Developing stepped care treatment for depression) is a mixed methods study to address uncertainties associated with a large-scale evaluation of stepped care compared with high-intensity psychological therapy alone for the treatment of depression. We will conduct a pilot randomised controlled trial with an embedded process study. Quantitative trial data on recruitment, retention and the pathway of patients through treatment will be used to assess feasibility. Outcome data on the effects of stepped care compared with high-intensity therapy alone will inform a sample size calculation for a definitive trial. Qualitative interviews will be undertaken to explore what people think of our trial methods and procedures and the stepped care intervention. A minimum of 60 patients with Major Depressive Disorder will be recruited from an Improving Access to Psychological Therapies service and randomly allocated to receive stepped care or intensive psychological therapy alone. All treatments will be delivered at clinic facilities within the University of Exeter. Quantitative patient-related data on depressive symptoms, worry and anxiety and quality of life will be collected at baseline and 6 months. The pilot trial and interviews will be undertaken concurrently. Quantitative and qualitative data will be analysed separately and then integrated. The outcomes of this study will inform the design of a fully powered randomised controlled trial to evaluate the effectiveness and efficiency of stepped care. Qualitative data on stepped care will be of immediate interest to patients, clinicians, service managers, policy makers and guideline developers. A more informed understanding of the feasibility of a large trial will be obtained than would be possible from a purely quantitative (or qualitative) design. Current Controlled Trials ISRCTN66346646 registered on 2 July 2014.
Code of Federal Regulations, 2010 CFR
2010-01-01
Title 5, Administrative Personnel; Office of Personnel Management (Continued); Civil Service Regulations (Continued); Retirement; Survivor Annuities; Elections at the Time of Retirement. § 831.611 — Election at time of retirement of fully reduced…
Digital Health Education for the Fully Online College Student: An Exploratory Study
ERIC Educational Resources Information Center
Armstrong, Shelley N.; Burcin, Michelle M.
2016-01-01
Background: Just because more online degree programs are available does not mean that each university has the support services to provide health services to their online students. Purpose: The purpose of this study was to determine whether health-related services are provided to fully online students based on the American College Health…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abraham, Paul E.; Garcia, Benjamin J.; Gunter, Lee E.
Drought stress is a recurring feature of world climate and the single most important factor influencing agricultural yield worldwide. Plants display highly variable, species-specific responses to drought and these responses are multifaceted, requiring physiological and morphological changes influenced by genetic and molecular mechanisms. Moreover, the reproducibility of water deficit studies is very cumbersome, which significantly impedes research on drought tolerance, because how a plant responds is highly influenced by the timing, duration, and intensity of the water deficit. Despite progress in the identification of drought-related mechanisms in many plants, the molecular basis of drought resistance remains to be fully understood in trees, particularly in poplar species because their wide geographic distribution results in varying tolerances to drought. Herein, we aimed to better understand this complex phenomenon in eastern cottonwood (Populus deltoides) by performing a detailed contrast of the proteome changes between two different water deficit experiments to identify functional intersections and divergences in proteome responses. We investigated plants subjected to cyclic water deficit and compared these responses to plants subjected to prolonged acute water deficit. In total, we identified 108,012 peptide sequences across both experiments that provided insight into the quantitative state of 22,737 Populus gene models and 8,199 functional protein groups in response to drought. Together, these datasets provide the most comprehensive insight into proteome drought responses in poplar to date and a direct proteome comparison between short period dehydration shock and cyclic, post-drought re-watering. Altogether, this investigation provides novel insights into drought avoidance mechanisms that are distinct from progressive drought stress. Additionally, we identified proteins that have been associated as drought-relevant in previous studies. Importantly, we highlight the RD26 transcription factor as a gene regulated at both the transcript and protein level, regardless of species and drought condition, which thus represents a key, universal drought marker for Populus species.
Geades, Nicolas; Hunt, Benjamin A E; Shah, Simon M; Peters, Andrew; Mougin, Olivier E; Gowland, Penny A
2017-08-01
To develop a method that fits a multipool model to z-spectra acquired from non-steady-state sequences, taking into account the effects of variations in T1 or B1 amplitude, and to report results estimating the parameters of a four-pool model describing the z-spectrum of the healthy brain. We compared measured spectra with a look-up table (LUT) of possible spectra and investigated the potential advantages of simultaneously considering spectra acquired at different saturation powers (coupled spectra) to provide sensitivity to a range of different physicochemical phenomena. The LUT method provided reproducible results in healthy controls. The average values of the macromolecular pool sizes measured in white matter (WM) and gray matter (GM) of 10 healthy volunteers were 8.9% ± 0.3% (intersubject standard deviation) and 4.4% ± 0.4%, respectively, whereas the average nuclear Overhauser effect pool sizes in WM and GM were 5% ± 0.1% and 3% ± 0.1%, respectively, and average amide proton transfer pool sizes in WM and GM were 0.21% ± 0.03% and 0.20% ± 0.02%, respectively. The proposed method demonstrated increased robustness when compared with existing methods (such as Lorentzian fitting and asymmetry analysis) while yielding fully quantitative results. The method can be adjusted to measure other parameters relevant to the z-spectrum. Magn Reson Med 78:645-655, 2017. © 2016 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of the International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
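The LUT matching step lends itself to a very small sketch: concatenate the spectra acquired at the different saturation powers and pick the dictionary entry with the smallest squared residual. The Python below is an illustrative assumption about the procedure, not the authors' code.

```python
import numpy as np

def lut_fit(measured, lut_spectra, lut_params):
    """measured: coupled z-spectra concatenated into one vector;
    lut_spectra: (n_entries, n_points) precomputed dictionary;
    lut_params: (n_entries, ...) corresponding model parameters.
    Returns the parameters of the best least-squares match."""
    resid = ((lut_spectra - measured) ** 2).sum(axis=1)
    return lut_params[resid.argmin()]
```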
The Protective Role of Coastal Marshes: A Systematic Review and Meta-analysis
Shepard, Christine C.; Crain, Caitlin M.; Beck, Michael W.
2011-01-01
Background Salt marshes lie between many human communities and the coast and have been presumed to protect these communities from coastal hazards by providing important ecosystem services. However, previous characterizations of these ecosystem services have typically been based on a small number of historical studies, and the consistency and extent to which marshes provide these services has not been investigated. Here, we review the current evidence for the specific processes of wave attenuation, shoreline stabilization and floodwater attenuation to determine if and under what conditions salt marshes offer these coastal protection services. Methodology/Principal Findings We conducted a thorough search and synthesis of the literature with reference to these processes. Seventy-five publications met our selection criteria, and we conducted meta-analyses for publications with sufficient data available for quantitative analysis. We found that combined across all studies (n = 7), salt marsh vegetation had a significant positive effect on wave attenuation as measured by reductions in wave height per unit distance across marsh vegetation. Salt marsh vegetation also had a significant positive effect on shoreline stabilization as measured by accretion, lateral erosion reduction, and marsh surface elevation change (n = 30). Salt marsh characteristics that were positively correlated to both wave attenuation and shoreline stabilization were vegetation density, biomass production, and marsh size. Although we could not find studies quantitatively evaluating floodwater attenuation within salt marshes, there are several studies noting the negative effects of wetland alteration on water quantity regulation within coastal areas. Conclusions/Significance Our results show that salt marshes have value for coastal hazard mitigation and climate change adaptation. Because we do not yet fully understand the magnitude of this value, we propose that decision makers employ natural systems to maximize the benefits and ecosystem services provided by salt marshes and exercise caution when making decisions that erode these services. PMID:22132099
Fire frequency in the Interior Columbia River Basin: Building regional models from fire history data
McKenzie, D.; Peterson, D.L.; Agee, James K.
2000-01-01
Fire frequency affects vegetation composition and successional pathways; thus it is essential to understand fire regimes in order to manage natural resources at broad spatial scales. Fire history data are lacking for many regions for which fire management decisions are being made, so models are needed to estimate past fire frequency where local data are not yet available. We developed multiple regression models and tree-based (classification and regression tree, or CART) models to predict fire return intervals across the interior Columbia River basin at 1-km resolution, using georeferenced fire history, potential vegetation, cover type, and precipitation databases. The models combined semiqualitative methods and rigorous statistics. The fire history data are of uneven quality; some estimates are based on only one tree, and many are not cross-dated. Therefore, we weighted the models based on data quality and performed a sensitivity analysis of the effects on the models of estimation errors that are due to lack of cross-dating. The regression models predict fire return intervals from 1 to 375 yr for forested areas, whereas the tree-based models predict a range of 8 to 150 yr. Both types of models predict latitudinal and elevational gradients of increasing fire return intervals. Examination of regional-scale output suggests that, although the tree-based models explain more of the variation in the original data, the regression models are less likely to produce extrapolation errors. Thus, the models serve complementary purposes in elucidating the relationships among fire frequency, the predictor variables, and spatial scale. The models can provide local managers with quantitative information and provide data to initialize coarse-scale fire-effects models, although predictions for individual sites should be treated with caution because of the varying quality and uneven spatial coverage of the fire history database. The models also demonstrate the integration of qualitative and quantitative methods when requisite data for fully quantitative models are unavailable. They can be tested by comparing new, independent fire history reconstructions against their predictions and can be continually updated, as better fire history data become available.
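The modeling strategy above, weighted multiple regression alongside CART, can be sketched briefly. The following toy uses synthetic data and hypothetical variable names; it is not the authors' code, only the general technique, with quality weights down-weighting records that are not cross-dated.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

# Hypothetical predictors: potential vegetation class (encoded), cover type,
# mean annual precipitation; response: fire return interval (years).
rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = 10 + 300 * X[:, 2] + rng.normal(0, 10, 200)
w = rng.uniform(0.5, 1.0, 200)   # data-quality weights (cross-dated = 1.0)

# Fit both model families with the same quality weights.
reg = LinearRegression().fit(X, y, sample_weight=w)
cart = DecisionTreeRegressor(max_depth=4).fit(X, y, sample_weight=w)
print(reg.predict(X[:1]), cart.predict(X[:1]))
```

As the abstract notes, the two model families complement one another: the tree explains more variance in-sample, while the linear model extrapolates more safely to unsampled cells.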
Recruitment of Community College Students Into a Web-Assisted Tobacco Intervention Study.
McIntosh, Scott; Johnson, Tye; Wall, Andrew F; Prokhorov, Alexander V; Calabro, Karen Sue; Ververs, Duncan; Assibey-Mensah, Vanessa; Ossip, Deborah J
2017-05-08
United States college students, particularly those attending community colleges, have higher smoking rates than the national average. Recruitment of such smokers into research studies has not been studied in depth, despite a moderate amount of information on study recruitment success with smokers from traditional four-year colleges. Recruitment channels and success are evolving as technology evolves, so it is important to understand how to best target, implement, and evaluate recruitment strategies. The aim of this paper is to both qualitatively and quantitatively explore recruitment channels (eg, mass email, in-person referral, posted materials) and their success with enrollment into a Web-Assisted Tobacco Intervention study in this priority population of underserved and understudied smokers. Qualitative research methods included key informant interviews (n=18) and four focus groups (n=37). Quantitative research methods included observed online responsiveness to any channel (n=10,914), responses from those completing online screening and study consent (n=2696), and responses to a baseline questionnaire from the fully enrolled study participants (n=1452). Qualitative results prior to recruitment provided insights regarding the selection of a variety of recruitment channels proposed to be successful, and provided context for the unique attributes of the study sample. Quantitative analysis of self-reported channels used to engage with students, and to enroll participants into the study, revealed the relative utilization of channels at several recruitment points. The use of mass emails to the student body was reported by the final sample as the most influential channel, accounting for 60.54% (879/1452) of the total enrolled sample. Relative channel efficiency was analyzed across a wide variety of channels. One primary channel (mass emails) and a small number of secondary channels (including college websites and learning management systems) accounted for most of the recruitment success. ClinicalTrials.gov NCT01692730; https://clinicaltrials.gov/ct2/show/NCT01692730 (Archived by WebCite at http://www.webcitation.org/6qEcFQN9Q). ©Scott McIntosh, Tye Johnson, Andrew F Wall, Alexander V Prokhorov, Karen Sue Calabro, Duncan Ververs, Vanessa Assibey-Mensah, Deborah J Ossip. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 08.05.2017.
Comparing fully general relativistic and Newtonian calculations of structure formation
DOE Office of Scientific and Technical Information (OSTI.GOV)
East, William E.; Wojtak, Radosław; Abel, Tom
In the standard approach to studying cosmological structure formation, the overall expansion of the Universe is assumed to be homogeneous, with the gravitational effect of inhomogeneities encoded entirely in a Newtonian potential. A topic of ongoing debate is to what degree this fully captures the dynamics dictated by general relativity, especially in the era of precision cosmology. To quantitatively assess this, in this paper we directly compare standard N-body Newtonian calculations to full numerical solutions of the Einstein equations, for cold matter with various magnitude initial inhomogeneities on scales comparable to the Hubble horizon. We analyze the differences in the evolution of density, luminosity distance, and other quantities defined with respect to fiducial observers. This is carried out by reconstructing the effective spacetime and matter fields dictated by the Newtonian quantities, and by taking care to distinguish effects of numerical resolution. We find that the fully general relativistic and Newtonian calculations show excellent agreement, even well into the nonlinear regime. They differ notably only in regions where the weak-gravity assumption breaks down, which arise when considering extreme cases with perturbations exceeding standard values.
Comparing fully general relativistic and Newtonian calculations of structure formation
East, William E.; Wojtak, Radosław; Abel, Tom
2018-02-13
In the standard approach to studying cosmological structure formation, the overall expansion of the Universe is assumed to be homogeneous, with the gravitational effect of inhomogeneities encoded entirely in a Newtonian potential. A topic of ongoing debate is to what degree this fully captures the dynamics dictated by general relativity, especially in the era of precision cosmology. To quantitatively assess this, in this paper we directly compare standard N-body Newtonian calculations to full numerical solutions of the Einstein equations, for cold matter with various magnitude initial inhomogeneities on scales comparable to the Hubble horizon. We analyze the differences in the evolution of density, luminosity distance, and other quantities defined with respect to fiducial observers. This is carried out by reconstructing the effective spacetime and matter fields dictated by the Newtonian quantities, and by taking care to distinguish effects of numerical resolution. We find that the fully general relativistic and Newtonian calculations show excellent agreement, even well into the nonlinear regime. They differ notably only in regions where the weak-gravity assumption breaks down, which arise when considering extreme cases with perturbations exceeding standard values.
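For orientation, the "Newtonian potential" description referred to above is commonly written as a weakly perturbed FLRW metric. A standard form (longitudinal gauge, c = 1; the notation is illustrative and the paper's exact conventions may differ) is:

```latex
ds^2 = -(1 + 2\Phi)\,dt^2 + a^2(t)\,(1 - 2\Phi)\,\delta_{ij}\,dx^i\,dx^j,
\qquad
\nabla^2 \Phi = 4\pi G\,a^2\,\bar{\rho}\,\delta ,
```

with a(t) the homogeneous scale factor, ρ̄ the mean matter density, and δ the density contrast; the weak-gravity assumption mentioned in the abstract is the requirement that Φ remain small.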
Real-time particle tracking for studying intracellular trafficking of pharmaceutical nanocarriers.
Huang, Feiran; Watson, Erin; Dempsey, Christopher; Suh, Junghae
2013-01-01
Real-time particle tracking is a technique that combines fluorescence microscopy with object tracking and computing and can be used to extract quantitative transport parameters for small particles inside cells. Since the success of a nanocarrier can often be determined by how effectively it delivers cargo to the target organelle, understanding the complex intracellular transport of pharmaceutical nanocarriers is critical. Real-time particle tracking provides insight into the dynamics of the intracellular behavior of nanoparticles, which may lead to significant improvements in the design and development of novel delivery systems. Unfortunately, this technique is not often fully understood, limiting its implementation by researchers in the field of nanomedicine. In this chapter, one of the most complicated aspects of particle tracking, the mean square displacement (MSD) calculation, is explained in a simple manner designed for the novice particle tracker. Pseudo code for performing the MSD calculation in MATLAB is also provided. This chapter contains clear and comprehensive instructions for a series of basic procedures in the technique of particle tracking. Instructions for performing confocal microscopy of nanoparticle samples are provided, and two methods of determining particle trajectories that do not require commercial particle-tracking software are provided. Trajectory analysis and determination of the tracking resolution are also explained. By providing comprehensive instructions needed to perform particle-tracking experiments, this chapter will enable researchers to gain new insight into the intracellular dynamics of nanocarriers, potentially leading to the development of more effective and intelligent therapeutic delivery vectors.
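The chapter supplies MATLAB pseudocode for the MSD; as a language-neutral illustration of the same calculation, a time-averaged MSD for one trajectory can be computed as in the following sketch (variable names hypothetical):

```python
import numpy as np

def mean_square_displacement(xy, dt):
    """Time-averaged MSD for a single particle trajectory.

    xy: (n_frames, 2) array of positions; dt: frame interval in seconds.
    Returns lag times and MSD(lag) averaged over all start frames.
    """
    n = len(xy)
    lags = np.arange(1, n)
    msd = np.array([np.mean(np.sum((xy[lag:] - xy[:-lag]) ** 2, axis=1))
                    for lag in lags])
    return lags * dt, msd
```

The shape of MSD versus lag time is what distinguishes transport modes: linear growth indicates diffusion, superlinear growth indicates active (directed) transport, and a plateau indicates confinement.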
In the mind’s eye: Provider and patient attitudes on functional brain imaging
Illes, J.; Lombera, S.; Rosenberg, J.; Arnow, B.
2008-01-01
Success in functional neuroimaging has brought the promise of quantitative data in the form of brain images to the diagnosis of disorders of the central nervous system for which only qualitative clinical criteria have previously existed. Even though the translation of research to clinical neuroimaging for conditions such as major depression may not be available yet, rapid innovation along this trajectory of discovery to implementation compels exploration of how such information will eventually affect providers and patients. Clinical neuroethics is devoted to elucidating ethical challenges prior to and during the transfer of new research capabilities to the bedside. Through a model of proactive ethics, clinical neuroethics promotes the development of responsible social and public policies in response to new diagnostic and prognostic capabilities for the benefit of patients and their families, and for providers within the health care systems in which they practice. To examine views about the potential interaction of clinical neuroimaging and depression, we surveyed both mental health providers and outpatients and inpatients diagnosed with major depressive disorder. From responses of 52 providers and 72 patients, we found high receptivity to brain scans for treatment tailoring and choice, for improving understanding of and coping with disease, and for mitigating the effects of stigma and self-blame. Our results suggest that, once ready, roll out of the fully validated technology has significant potential to reduce social burden associated with highly stigmatized illnesses like depression. PMID:18423669
The local lymph node assay in 2014.
Basketter, David A; Gerberick, G Frank; Kimber, Ian
2014-01-01
Toxicology endeavors to predict the potential of materials to cause adverse health (and environmental) effects and to assess the risk(s) associated with exposure. For skin sensitizers, the local lymph node assay was the first method to be fully and independently validated, as well as the first to offer an objective end point with a quantitative measure of sensitizing potency (in addition to hazard identification). Fifteen years later, it serves as the primary standard for the development of in vitro/in chemico/in silico alternatives.
Identity method to study chemical fluctuations in relativistic heavy-ion collisions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gazdzicki, Marek; Grebieszkow, Katarzyna; Mackowiak, Maja
Event-by-event fluctuations of the chemical composition of the hadronic final state of relativistic heavy-ion collisions carry valuable information on the properties of strongly interacting matter produced in the collisions. However, in experiments incomplete particle identification distorts the observed fluctuation signals. The effect is quantitatively studied and a new technique for measuring chemical fluctuations, the identity method, is proposed. The method fully eliminates the effect of incomplete particle identification. The application of the identity method to experimental data is explained.
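In outline (the notation here is illustrative, following common presentations of the method rather than this paper's exact symbols): each detected particle with measured identification variable m is assigned a continuous "identity" with respect to species j, and these are summed event by event:

```latex
w_j(m) \;=\; \frac{\rho_j(m)}{\sum_k \rho_k(m)},
\qquad
W_j \;=\; \sum_{i=1}^{N} w_j(m_i),
```

where the ρ_j(m) are the inclusive distributions of m for each species and N is the event multiplicity. The measured moments of the event-by-event W_j distributions can then be unfolded into the moments of the true multiplicity distributions, which is how the distortion from incomplete identification is eliminated.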
NASA Technical Reports Server (NTRS)
Schroder, Christian; Klingelhofer, Gostar; Morris, Richard V.; Yen, Albert S.; Renz, Franz; Graff, Trevor G.
2016-01-01
The miniaturized Mossbauer spectrometer MIMOS II is an off-the-shelf instrument, which has been successfully deployed during NASA's Mars Exploration Rover (MER) mission and was on-board the ESA/UK Beagle 2 Mars lander and the Russian Phobos-Grunt sample return mission. We propose to use a fully-qualified flight-spare MIMOS II instrument available from these missions for in situ asteroid characterization with the Asteroid Redirect Robotic Mission (ARRM).
NASA Technical Reports Server (NTRS)
Wilckens, V.
1972-01-01
Present information display concepts for pilot landing guidance are outlined considering manual control as well as substitution of man by fully competent automatics. Display improvements are achieved by compressing the distributed indicators into an accumulative display and thus reducing information scanning. Complete integration of quantitative indications, outer loop information, and real world display in a pictorial information channel geometry constitutes an interface with human ability to differentiate and integrate for optimal manual control of the aircraft.
A Randomized Clinical Trial of Allopregnanolone for the Treatment of Severe Traumatic Brain Injury
2015-12-01
at 30-40 psi of hydrogen with a dry 10% palladium on carbon catalyst. The yield was almost quantitative and less than 1% of byproducts were formed ... to fully dissolve the material and heptane was added to precipitate allopregnanolone. After isolation and drying, a 59.2% recovery was obtained ... for 66 hours and 28 minutes, and was deemed 1% dry by thermogravimetric analysis. The product was reslurried for 6 hours and 15 minutes, washed with
A color video display technique for flow field surveys
NASA Technical Reports Server (NTRS)
Winkelmann, A. E.; Tsao, C. P.
1982-01-01
A computer-driven color video display technique has been developed for the presentation of wind tunnel flow field survey data. The results of both qualitative and quantitative flow field surveys can be presented in high-spatial-resolution, color-coded displays. The technique has been used for data obtained with a hot-wire probe, a split-film probe, a Conrad (pitch) probe, and a 5-tube pressure probe in surveys above and behind a wing with partially stalled and fully stalled flow.
Toward Accurate and Quantitative Comparative Metagenomics
Nayfach, Stephen; Pollard, Katherine S.
2016-01-01
Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341
Toward Accurate and Quantitative Comparative Metagenomics.
Nayfach, Stephen; Pollard, Katherine S
2016-08-25
Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. Copyright © 2016 Elsevier Inc. All rights reserved.
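One concrete instance of an abundance statistic that does estimate a meaningful community parameter is gene-length-normalized relative abundance: dividing read counts by gene length corrects for longer genes recruiting more reads, and renormalizing makes values comparable across samples of different sequencing depth. A minimal sketch (hypothetical function, not from the paper):

```python
import numpy as np

def relative_abundance(read_counts, gene_lengths):
    """Convert per-gene read counts to length-normalized relative abundances.

    read_counts:  reads mapped to each gene in one sample.
    gene_lengths: length of each gene (same order), e.g. in base pairs.
    """
    coverage = np.asarray(read_counts, float) / np.asarray(gene_lengths, float)
    return coverage / coverage.sum()
```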
Real-time quantitative Schlieren imaging by fast Fourier demodulation of a checkered backdrop
NASA Astrophysics Data System (ADS)
Wildeman, Sander
2018-06-01
A quantitative synthetic Schlieren imaging (SSI) method based on fast Fourier demodulation is presented. Instead of a random dot pattern (as usually employed in SSI), a 2D periodic pattern (such as a checkerboard) is used as a backdrop to the refractive object of interest. The range of validity and accuracy of this "Fast Checkerboard Demodulation" (FCD) method are assessed using both synthetic data and experimental recordings of patterns optically distorted by small waves on a water surface. It is found that the FCD method is at least as accurate as sophisticated, multi-stage digital image correlation (DIC) or optical flow (OF) techniques used with random dot patterns, and it is significantly faster. Efficient, fully vectorized implementations of both the FCD and DIC/OF schemes developed for this study are made available as open source Matlab scripts.
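The core of the approach can be sketched compactly: isolate one carrier peak of the periodic pattern in Fourier space, demodulate the reference and distorted images, and convert the phase difference into displacement. The Python sketch below is a simplified single-carrier illustration under assumed conventions (the released Matlab scripts handle both pattern directions, filter design, and phase unwrapping); it assumes displacements small enough that the phase stays unwrapped.

```python
import numpy as np

def fcd_displacement(ref, img, kc):
    """One-component displacement by Fourier demodulation (simplified sketch).

    ref, img: 2D reference and distorted images of a periodic backdrop.
    kc:       carrier wavevector (kx, ky) in rad/pixel of one pattern direction.
    Returns the displacement field projected on the carrier direction.
    """
    ky, kx = np.meshgrid(np.fft.fftfreq(ref.shape[0]) * 2 * np.pi,
                         np.fft.fftfreq(ref.shape[1]) * 2 * np.pi,
                         indexing="ij")
    # Band-pass mask that keeps only the carrier peak.
    mask = (kx - kc[0]) ** 2 + (ky - kc[1]) ** 2 < (0.5 * np.hypot(*kc)) ** 2
    demod = lambda im: np.fft.ifft2(np.fft.fft2(im) * mask)
    # Phase shift between distorted and reference carrier signals.
    dphi = np.angle(demod(img) * np.conj(demod(ref)))
    return dphi / np.hypot(*kc)   # displacement along the carrier direction
```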
Code of Federal Regulations, 2010 CFR
2010-01-01
..., within 1 year after a post-retirement marriage, a fully reduced annuity or a partially reduced annuity to... partially reduced annuity at the time of retirement may elect, within 1 year after a postretirement marriage... month beginning 1 year after the date of the post-retirement marriage. (b) Except as provided in...
Teaching quantitative biology: goals, assessments, and resources
Aikens, Melissa L.; Dolan, Erin L.
2014-01-01
More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425
de Kanel, J; Vickery, W E; Waldner, B; Monahan, R M; Diamond, F X
1998-05-01
A forensic procedure for the quantitative confirmation of lysergic acid diethylamide (LSD) and the qualitative confirmation of its metabolite, N-demethyl-LSD, in blood, serum, plasma, and urine samples is presented. The Zymark RapidTrace was used to perform fully automated solid-phase extractions of all specimen types. After extract evaporation, confirmations were performed using liquid chromatography (LC) followed by positive electrospray ionization (ESI+) tandem mass spectrometry (MS/MS) without derivatization. Quantitation of LSD was accomplished using LSD-d3 as an internal standard. The limit of quantitation (LOQ) for LSD was 0.05 ng/mL. The limit of detection (LOD) for both LSD and N-demethyl-LSD was 0.025 ng/mL. The recovery of LSD was greater than 95% at levels of 0.1 ng/mL and 2.0 ng/mL. For LSD at 1.0 ng/mL, the within-run and between-run (different day) relative standard deviation (RSD) was 2.2% and 4.4%, respectively.
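Quantitation against a stable-isotope internal standard reduces, in practice, to a linear calibration of analyte-to-IS response ratios. An illustrative sketch with made-up calibration numbers (not the paper's data):

```python
import numpy as np

# Hypothetical calibration: LSD/LSD-d3 peak-area ratios at known concentrations.
conc = np.array([0.05, 0.1, 0.5, 1.0, 2.0])          # ng/mL
ratio = np.array([0.024, 0.049, 0.26, 0.51, 1.02])   # analyte/IS area ratio

slope, intercept = np.polyfit(conc, ratio, 1)         # linear calibration

def quantify(sample_ratio):
    """Concentration (ng/mL) from an analyte/internal-standard area ratio."""
    return (sample_ratio - intercept) / slope

print(round(quantify(0.50), 3))   # back-calculate an unknown sample
```

Because the deuterated internal standard co-extracts and co-elutes with the analyte, the ratio cancels most extraction and ionization variability, which is what makes the low RSDs reported above achievable.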
Valle, Francesco; Brucale, Marco; Chiodini, Stefano; Bystrenova, Eva; Albonetti, Cristiano
2017-09-01
While the widespread emergence of nanoscience and nanotechnology can be dated back to the early eighties, the last decade has witnessed a true coming of age of this research field, with novel nanomaterials constantly finding their way into marketed products. Because the performance of nanomaterials is dominated by their nanoscale morphology, their quantitative characterization with respect to a number of properties is often crucial. In this context, those imaging techniques able to resolve nanometer-scale details are clearly key players. In particular, atomic force microscopy can yield a fully quantitative three-dimensional (3D) topography at the nanoscale. Herein, we review a set of morphological analyses based on the scaling approach, which give access to important quantitative parameters for describing nanomaterial samples. To generalize the use of such morphological analyses across all dimensions (1D, 2D and 3D), the review focuses on specific soft matter aggregates with fractal dimension ranging from just above 1 to just below 3. Copyright © 2017 Elsevier Ltd. All rights reserved.
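Among scaling analyses of this kind, the box-counting estimate of fractal dimension is the most compact to illustrate. A minimal 2D sketch follows (a 3D version iterates over cubes instead of squares); this is a generic illustration, not the review's protocol:

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16, 32)):
    """Fractal (box-counting) dimension of a binary 2D image.

    mask:  square boolean array marking the aggregate.
    sizes: box edge lengths in pixels.
    The dimension is the slope of log N(s) versus log(1/s).
    """
    counts = []
    for s in sizes:
        n = mask.shape[0] // s
        # Count boxes of edge s containing at least one occupied pixel.
        boxes = mask[:n * s, :n * s].reshape(n, s, n, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```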
Berendsen, Bjorn J A; Gerritsen, Henk W; Wegh, Robin S; Lameris, Steven; van Sebille, Ralph; Stolker, Alida A M; Nielen, Michel W F
2013-09-01
A comprehensive method for the quantitative residue analysis of trace levels of 22 β-lactam antibiotics, including penicillins, cephalosporins, and carbapenems, in poultry muscle by liquid chromatography in combination with tandem mass spectrometric detection is reported. The samples analyzed for β-lactam residues are hydrolyzed using piperidine in order to improve compound stability and to include the total residue content of the cephalosporin ceftiofur. The reaction procedure was optimized using a full experimental design. Following detailed isotope-labeling and tandem mass spectrometry studies, together with exact mass measurements using high-resolution mass spectrometry, reaction schemes could be proposed for all β-lactams studied. The main reaction occurring is the hydrolysis of the β-lactam ring with formation of the piperidine-substituted amide. For some β-lactams, multiple isobaric hydrolysis reaction products are obtained, in accordance with expectations, but this did not hamper quantitative analysis. The final method was fully validated as a quantitative confirmatory residue analysis method according to Commission Decision 2002/657/EC and showed satisfactory quantitative performance for all compounds, with trueness between 80 and 110% and within-laboratory reproducibility below 22% at target level, except for biapenem. For biapenem, the method proved suitable for qualitative analysis only.
Hematocrit Measurement with R2* and Quantitative Susceptibility Mapping in Postmortem Brain.
Walsh, A J; Sun, H; Emery, D J; Wilman, A H
2018-05-24
Noninvasive venous oxygenation quantification with MR imaging will improve neurophysiologic investigation and the understanding of the pathophysiology of neurologic diseases. Available MR imaging methods are limited by sensitivity to flow and often require assumptions of the hematocrit level. In situ postmortem imaging enables evaluation of methods in a fully deoxygenated environment without flow artifacts, allowing direct calculation of hematocrit. This study compares 2 venous oxygenation quantification methods in in situ postmortem subjects. Transverse relaxation (R2*) mapping and quantitative susceptibility mapping were performed on a whole-body 4.7T MR imaging system. Intravenous measurements in major draining intracranial veins were compared between the 2 methods in 3 postmortem subjects. The quantitative susceptibility mapping technique was also applied in 10 healthy control subjects and compared with reference venous oxygenation values. In 2 early postmortem subjects, R2* mapping and quantitative susceptibility mapping measurements within intracranial veins had a significant and strong correlation (R² = 0.805, P = .004 and R² = 0.836, P = .02). Higher R2* and susceptibility values were consistently demonstrated within gravitationally dependent venous segments during the early postmortem period. Hematocrit ranged from 0.102 to 0.580 in postmortem subjects, with R2* and susceptibility as large as 291 s⁻¹ and 1.75 ppm, respectively. Measurements of R2* and quantitative susceptibility mapping within large intracranial draining veins have a high correlation in early postmortem subjects. This study supports the use of quantitative susceptibility mapping for evaluation of in vivo venous oxygenation and postmortem hematocrit concentrations. © 2018 by American Journal of Neuroradiology.
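The direct hematocrit calculation exploits the fact that, in a fully deoxygenated vein, the susceptibility shift relative to surrounding tissue is, to first order, proportional to hematocrit alone. A commonly used model (illustrative notation, not necessarily the authors' exact formulation) is:

```latex
\Delta\chi_{\mathrm{vein}} \;\approx\; \Delta\chi_{do}\,\mathrm{Hct}\,(1 - Y),
```

where Δχ_do is the susceptibility difference between fully deoxygenated and fully oxygenated erythrocytes and Y is the venous oxygen saturation. Postmortem, Y → 0, so Hct ≈ Δχ_vein/Δχ_do; in vivo, an assumed Hct instead yields Y.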
Steiding, Christian; Kolditz, Daniel; Kalender, Willi A
2014-03-01
Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations was also verified. The maximum percentage interscan variation of repeated measurements was less than 4% and 1.7% on average for all investigated quality criteria. The NPS-based image noise differed by less than 5% from the conventional standard deviation approach and spatially selective 10% MTF values were well comparable to subjective results obtained with 3D resolution pattern. Determining only transverse spatial resolution and global noise behavior in the central field of measurement turned out to be insufficient. The proposed framework transfers QA routines employed in conventional CT in an advanced version to CBCT for fully automated and time-efficient evaluation of technical equipment. With the modular phantom design, a routine as well as an expert version for assessing IQ is provided. The QA program can be used for arbitrary CT units to evaluate 3D imaging characteristics automatically.
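As one example of the Fourier-based metrics involved, the MTF-from-edge-profile computation can be sketched in a few lines. This is a generic, simplified illustration (the validated framework uses oversampled edge profiles from the aluminum spheres and additional conditioning):

```python
import numpy as np

def mtf_from_edge(esf, pixel_size_mm):
    """MTF from a sampled edge-spread function (simplified sketch).

    esf:           1D intensity profile across a high-contrast edge.
    pixel_size_mm: sampling distance along the profile.
    Returns spatial frequencies (cycles/mm) and the normalized MTF.
    """
    lsf = np.gradient(esf)                 # line-spread function
    lsf = lsf * np.hanning(len(lsf))       # taper to suppress edge noise
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                          # normalize to DC
    freqs = np.fft.rfftfreq(len(lsf), d=pixel_size_mm)
    return freqs, mtf
```

The "10% MTF" figures quoted above are read off where this curve falls to 0.1.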
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steiding, Christian; Kolditz, Daniel; Kalender, Willi A., E-mail: willi.kalender@imp.uni-erlangen.de
Purpose: Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. Methods: The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. Results: The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations was also verified. The maximum percentage interscan variation of repeated measurements was less than 4% and 1.7% on average for all investigated quality criteria. The NPS-based image noise differed by less than 5% from the conventional standard deviation approach and spatially selective 10% MTF values were well comparable to subjective results obtained with 3D resolution pattern. Determining only transverse spatial resolution and global noise behavior in the central field of measurement turned out to be insufficient. Conclusions: The proposed framework transfers QA routines employed in conventional CT in an advanced version to CBCT for fully automated and time-efficient evaluation of technical equipment. With the modular phantom design, a routine as well as an expert version for assessing IQ is provided. The QA program can be used for arbitrary CT units to evaluate 3D imaging characteristics automatically.
Kohlhoff, Kai J.; Jahn, Thomas R.; Lomas, David A.; Dobson, Christopher M.; Crowther, Damian C.; Vendruscolo, Michele
2016-01-01
The use of animal models in medical research provides insights into molecular and cellular mechanisms of human disease, and helps identify and test novel therapeutic strategies. Drosophila melanogaster – the common fruit fly – is one of the most established model organisms, as its study can be performed more readily and with far less expense than for other model animal systems, such as mice, fish, or indeed primates. In the case of fruit flies, standard assays are based on the analysis of longevity and basic locomotor functions. Here we present the iFly tracking system, which increases the amount of quantitative information that can be extracted from these studies and significantly reduces their duration and costs. The iFly system uses a single camera to simultaneously track the trajectories of up to 20 individual flies with about 100 μm spatial and 33 ms temporal resolution. The statistical analysis of fly movements recorded with such accuracy makes it possible to perform a rapid and fully automated quantitative analysis of locomotor changes in response to a range of different stimuli. We anticipate that the iFly method will very considerably reduce the costs and duration of the testing of genetic and pharmacological interventions in Drosophila models, including an earlier detection of behavioural changes and a large increase in throughput compared to current longevity and locomotor assays. PMID:21698336
Doppler Fourier Domain Optical Coherence Tomography for Label-Free Tissue Angiography
NASA Astrophysics Data System (ADS)
Leitgeb, Rainer A.; Szkulmowski, Maciej; Blatter, Cedric; Wojtkowski, Maciej
Information about tissue perfusion and the vascular structure is certainly most important for assessment of tissue state or personal health and the diagnosis of any pathological conditions. It is therefore of key medical interest to have tools available for both quantitative blood flow assessment and qualitative vascular imaging. The strength of optical techniques is the unprecedented level of detail even for small capillary structures or microaneurysms, and the possibility to combine different techniques for additional tissue spectroscopy giving insight into tissue metabolism. There is an immediate diagnostic and pharmacological demand for high-resolution, label-free tissue angiography and flow assessment that in addition allows for precise depth gating of flow information. The most promising candidate is Doppler optical coherence tomography (DOCT), being noncontact, label free, and free of hazardous radiation. DOCT provides fully quantitative volumetric information about blood flow together with the vascular and structural anatomy. Besides flow quantification, analysis of OCT signal fluctuations makes it possible to contrast moving scatterers in tissue, such as red blood cells, against static tissue. This allows for non-invasive optical angiography and yields high resolution even for the smallest capillaries. Because of the huge potential of DOCT and label-free optical angiography for diagnosis, recent years have seen a rapid increase of publications in this field with many different approaches. The present chapter gives an overview of existing Doppler OCT approaches and angiography techniques. It furthermore discusses limitations and noise issues, and gives examples of angiography in the eye and the skin.
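The quantitative flow measurement rests on the phase shift between successive A-scans. In the standard formulation (symbols defined below; conventions vary slightly between implementations):

```latex
v \;=\; \frac{\lambda_0\,\Delta\varphi}{4\pi\, n\, \tau\,\cos\theta},
```

where Δφ is the measured phase difference between A-scans separated by time τ, λ₀ the center wavelength, n the tissue refractive index, and θ the Doppler angle between beam and flow direction. The cos θ dependence is the main practical limitation: flow nearly perpendicular to the beam produces almost no phase shift, which motivates the fluctuation-based angiography approaches also discussed in the chapter.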
The MAGO experiment for dust environment monitoring on the Martian surface
NASA Astrophysics Data System (ADS)
Palumbo, P.; Battaglia, R.; Brucato, J. R.; Colangeli, L.; della Corte, V.; Esposito, F.; Ferrini, G.; Mazzotta Epifani, E.; Mennella, V.; Palomba, E.; Panizza, A.; Rotundi, A.
2004-01-01
Among the main directions identified for future Martian exploration, the study of the properties of dust dispersed in the atmosphere, its cycle, and its impact on climate are considered of primary relevance. Dust storms, dust devils and the dust "cycle" have been identified and studied by past remote and in situ experiments, but little quantitative information is available on these processes so far. The airborne dust contributes to the determination of the dynamic and thermodynamic evolution of the atmosphere, including the large-scale circulation processes and its impact on the climate of Mars. Moreover, aeolian erosion, redistribution of dust on the surface and weathering processes are mostly known only qualitatively. In order to improve our knowledge of airborne dust evolution and other atmospheric processes, it is mandatory to measure the amount, mass-size distribution and dynamical properties of solid particles in the Martian atmosphere as a function of time. In this context, there is clearly a need for the implementation of experiments dedicated to studying atmospheric dust directly. The Martian atmospheric grain observer (MAGO) experiment is aimed at providing direct quantitative measurements of mass and size distributions of dust particles, a goal that has so far never been fully achieved. The instrument design combines three types of sensors to monitor in situ the dust mass flux (microbalance system, MBS) and single grain properties (grain detection system, GDS, plus impact sensor, IS). Technical solutions and science capabilities are discussed in this paper.
Fox, Bridget C; Devonshire, Alison S; Baradez, Marc-Olivier; Marshall, Damian; Foy, Carole A
2012-08-15
Single cell gene expression analysis can provide insights into development and disease progression by profiling individual cellular responses as opposed to reporting the global average of a population. Reverse transcription-quantitative polymerase chain reaction (RT-qPCR) is the "gold standard" for the quantification of gene expression levels; however, the technical performance of kits and platforms aimed at single cell analysis has not been fully defined in terms of sensitivity and assay comparability. We compared three kits using purification columns (PicoPure) or direct lysis (CellsDirect and Cells-to-CT) combined with a one- or two-step RT-qPCR approach using dilutions of cells and RNA standards to the single cell level. Single cell-level messenger RNA (mRNA) analysis was possible using all three methods, although the precision, linearity, and effect of lysis buffer and cell background differed depending on the approach used. The impact of using a microfluidic qPCR platform versus a standard instrument was investigated for potential variability introduced by preamplification of template or scaling down of the qPCR to nanoliter volumes using laser-dissected single cell samples. The two approaches were found to be comparable. These studies show that accurate gene expression analysis is achievable at the single cell level and highlight the importance of well-validated experimental procedures for low-level mRNA analysis. Copyright © 2012 Elsevier Inc. All rights reserved.
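For context, absolute RT-qPCR quantification against an RNA standard dilution series, the kind of curve used to establish sensitivity down to the single cell level, reduces to a log-linear fit. A sketch with made-up numbers (not the study's data):

```python
import numpy as np

# Hypothetical dilution series of an RNA standard: copies per reaction vs. Cq.
copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2, 1e1])
cq     = np.array([17.1, 20.5, 23.9, 27.3, 30.8, 34.2])

slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1     # 1.0 would mean perfect doubling

def copies_from_cq(sample_cq):
    """Back-calculate template copies from a measured quantification cycle."""
    return 10 ** ((sample_cq - intercept) / slope)

print(f"E = {efficiency:.2f}; Cq 35 -> {copies_from_cq(35.0):.1f} copies")
```

Deviations of the fitted efficiency from 1.0, and loss of linearity at the low end of the curve, are exactly the kit- and lysis-dependent effects the comparison above characterizes.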
Quantitative image reconstruction for total-body PET imaging using the 2-meter long EXPLORER scanner
NASA Astrophysics Data System (ADS)
Zhang, Xuezhu; Zhou, Jian; Cherry, Simon R.; Badawi, Ramsey D.; Qi, Jinyi
2017-03-01
The EXPLORER project aims to build a 2 meter long total-body PET scanner, which will provide extremely high sensitivity for imaging the entire human body. It will possess a range of capabilities currently unavailable to state-of-the-art clinical PET scanners with a limited axial field-of-view. The huge number of lines-of-response (LORs) of the EXPLORER poses a challenge to the data handling and image reconstruction. The objective of this study is to develop a quantitative image reconstruction method for the EXPLORER and compare its performance with current whole-body scanners. Fully 3D image reconstruction was performed using time-of-flight list-mode data with parallel computation. To recover the resolution loss caused by the parallax error between crystal pairs at a large axial ring difference or transaxial radial offset, we applied an image domain resolution model estimated from point source data. To evaluate the image quality, we conducted computer simulations using the SimSET Monte-Carlo toolkit and XCAT 2.0 anthropomorphic phantom to mimic a 20 min whole-body PET scan with an injection of 25 MBq 18F-FDG. We compare the performance of the EXPLORER with a current clinical scanner that has an axial FOV of 22 cm. The comparison results demonstrated superior image quality from the EXPLORER with a 6.9-fold reduction in noise standard deviation comparing with multi-bed imaging using the clinical scanner.
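The reconstruction builds on the standard list-mode MLEM update, shown here in generic notation; the EXPLORER implementation adds time-of-flight weighting, parallelization over events, and the image-domain resolution model described above:

```latex
\lambda_j^{(k+1)}
\;=\;
\frac{\lambda_j^{(k)}}{s_j}
\sum_{e=1}^{N_{\mathrm{events}}}
\frac{p_{i_e j}}{\sum_{j'} p_{i_e j'}\,\lambda_{j'}^{(k)}},
\qquad
s_j = \sum_i p_{ij},
```

where λ_j is the activity in voxel j, p_ij the probability that a decay in voxel j is detected on line-of-response i, i_e the LOR of event e, and s_j the sensitivity image. Working event by event is what keeps the huge LOR count of a 2-meter scanner tractable: only detected events are visited, never the full LOR space.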
Integrating CNVs into meta-QTL identified GBP4 as positional candidate for adult cattle stature.
Cao, Xiu-Kai; Huang, Yong-Zhen; Ma, Yi-Lei; Cheng, Jie; Qu, Zhen-Xian; Ma, Yun; Bai, Yue-Yu; Tian, Feng; Lin, Feng-Peng; Ma, Yu-Lin; Chen, Hong
2018-05-08
Copy number variation (CNV) of DNA sequences, functionally significant but not yet fully ascertained, is believed to account for a considerable share of the unexplained heritability of quantitative traits. Identification of phenotype-associated CNVs (paCNVs) is therefore a pressing need in CNV studies to speed up their exploitation in cattle breeding programs. Here, we provide a new avenue to achieve this goal: projecting published CNV data onto a meta-quantitative trait loci (meta-QTL) map, which connects causal genes with phenotypes. Any CNV overlapping a meta-QTL is therefore a potential paCNV. This study reports potential paCNVs on Bos taurus autosome 3 (BTA3). Notably, overview indexes and CNVs both highlighted a narrower region (BTA3 54,500,000-55,000,000 bp, named BTA3_INQTL_6) within one constructed meta-QTL. We then ascertained that guanylate-binding protein 4 (GBP4), among the nine positional candidate genes, was significantly associated with adult cattle stature, including body weight (BW, P < 0.05) and withers height (WHT, P < 0.05), fitting GBP4 CNV either with three levels or with six levels in the model. Although a higher copy number downregulated the mRNA levels of GBP2 (P < 0.05) and GBP4 (P < 0.05) in the 1-Mb window (54.0-55.0 Mb) in muscle and adipose, additional analyses will be needed to clarify the causality behind the ascertained association.
A Routine 'Top-Down' Approach to Analysis of the Human Serum Proteome.
D'Silva, Arlene M; Hyett, Jon A; Coorssen, Jens R
2017-06-06
Serum provides a rich source of potential biomarker proteoforms. One of the major obstacles in analysing serum proteomes is detecting lower abundance proteins owing to the presence of hyper-abundant species (e.g., serum albumin and immunoglobulins). Although depletion methods have been used to address this, these can lead to the concomitant removal of non-targeted protein species, and thus raise issues of specificity, reproducibility, and the capacity for meaningful quantitative analyses. Altering the native stoichiometry of the proteome components may thus yield a more complex series of issues than dealing directly with the inherent complexity of the sample. Hence, here we targeted method refinements so as to ensure optimum resolution of serum proteomes via a top-down two-dimensional gel electrophoresis (2DE) approach that enables the routine assessment of proteoforms and is fully compatible with subsequent mass spectrometric analyses. Testing included various fractionation and non-fractionation approaches. The data show that resolving 500 µg protein on 17 cm 3-10 non-linear immobilised pH gradient strips in the first dimension followed by second dimension resolution on 7-20% gradient gels with a combination of lithium dodecyl sulfate (LDS) and sodium dodecyl sulfate (SDS) detergents markedly improves the resolution and detection of proteoforms in serum. In addition, well established third dimension electrophoretic separations in combination with deep imaging further contributed to the best available resolution, detection, and thus quantitative top-down analysis of serum proteomes.
Genome Scale Modeling in Systems Biology: Algorithms and Resources
Najafi, Ali; Bidkhori, Gholamreza; Bozorgmehr, Joseph H.; Koch, Ina; Masoudi-Nejad, Ali
2014-01-01
In recent years, in silico studies and trial simulations have complemented experimental procedures. A model is a description of a system, and a system is any collection of interrelated objects; an object, moreover, is some elemental unit upon which observations can be made but whose internal structure either does not exist or is ignored. Therefore, any network analysis approach is critical for successful quantitative modeling of biological systems. This review highlights some of the most popular and important modeling algorithms, tools, and emerging standards for representing, simulating and analyzing cellular networks, in five sections. We also illustrate these concepts by means of simple examples and appropriate images and graphs. Overall, systems biology aims for a holistic description and understanding of biological processes by an integration of analytical experimental approaches along with synthetic computational models. In fact, biological networks have been developed as a platform for integrating information from high- to low-throughput experiments for the analysis of biological systems. We provide an overview of all processes used in modeling and simulating biological networks in such a way that they can become easily understandable for researchers with both biological and mathematical backgrounds. Consequently, given the complexity of generated experimental data and cellular networks, it is no surprise that researchers have turned to computer simulation and the development of more theory-based approaches to augment and assist in the development of a fully quantitative understanding of cellular dynamics. PMID:24822031
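As one concrete example of the constraint-based modeling covered by reviews of this kind, flux balance analysis reduces to a linear program over the stoichiometric matrix. A toy sketch follows (hypothetical three-reaction network, not from the review):

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux-balance analysis: maximize biomass flux v3 subject to S v = 0.
# Columns: v1 uptake, v2 internal conversion, v3 biomass (names hypothetical).
S = np.array([[1, -1,  0],    # metabolite A: produced by v1, consumed by v2
              [0,  1, -1]])   # metabolite B: produced by v2, consumed by v3
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 flux units

res = linprog(c=[0, 0, -1],                # linprog minimizes, so negate v3
              A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal biomass flux:", -res.fun)   # -> 10.0, limited by uptake
```

Genome-scale models follow exactly this structure, only with thousands of reactions and metabolites in S.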
Accurate phase measurements for thick spherical objects using optical quadrature microscopy
NASA Astrophysics Data System (ADS)
Warger, William C., II; DiMarzio, Charles A.
2009-02-01
In vitro fertilization (IVF) procedures have resulted in the birth of over three million babies since 1978. Yet the live birth rate in the United States was only 34% in 2005, with 32% of the successful pregnancies resulting in multiple births. These multiple pregnancies were directly attributed to the transfer of multiple embryos to increase the probability that a single, healthy embryo was included. Current viability markers used for IVF, such as the cell number, symmetry, size, and fragmentation, are analyzed qualitatively with differential interference contrast (DIC) microscopy. However, this method is not ideal for quantitative measures beyond the 8-cell stage of development because the cells overlap and obstruct the view within and below the cluster of cells. We have developed the phase-subtraction cell-counting method that uses the combination of DIC and optical quadrature microscopy (OQM) to count the number of cells accurately in live mouse embryos beyond the 8-cell stage. We have also created a preliminary analysis to measure the cell symmetry, size, and fragmentation quantitatively by analyzing the relative dry mass from the OQM image in conjunction with the phase-subtraction count. In this paper, we will discuss the characterization of OQM with respect to measuring the phase accurately for spherical samples that are much larger than the depth of field. Once fully characterized and verified with human embryos, this methodology could provide the means for a more accurate method to score embryo viability.
Quantitative Image Reconstruction for Total-Body PET Imaging Using the 2-meter Long EXPLORER Scanner
Zhang, Xuezhu; Zhou, Jian; Cherry, Simon R.; Badawi, Ramsey D.
2017-01-01
The EXPLORER project aims to build a 2-meter long total-body PET scanner, which will provide extremely high sensitivity for imaging the entire human body. It will possess a range of capabilities currently unavailable to state-of-the-art clinical PET scanners with a limited axial field-of-view. The huge number of lines-of-response (LORs) of the EXPLORER poses a challenge to the data handling and image reconstruction. The objective of this study is to develop a quantitative image reconstruction method for the EXPLORER and compare its performance with current whole-body scanners. Fully 3D image reconstruction was performed using time-of-flight list-mode data with parallel computation. To recover the resolution loss caused by the parallax error between crystal pairs at a large axial ring difference or transaxial radial offset, we applied an image domain resolution model estimated from point source data. To evaluate the image quality, we conducted computer simulations using the SimSET Monte-Carlo toolkit and XCAT 2.0 anthropomorphic phantom to mimic a 20-minute whole-body PET scan with an injection of 25 MBq 18F-FDG. We compare the performance of the EXPLORER with a current clinical scanner that has an axial FOV of 22 cm. The comparison results demonstrated superior image quality from the EXPLORER with a 6.9-fold reduction in noise standard deviation comparing with multi-bed imaging using the clinical scanner. PMID:28240215
Reilly, John F.; Games, Dora; Rydel, Russell E.; Freedman, Stephen; Schenk, Dale; Young, Warren G.; Morrison, John H.; Bloom, Floyd E.
2003-01-01
Various transgenic mouse models of Alzheimer's disease (AD) have been developed that overexpress mutant forms of amyloid precursor protein in an effort to elucidate more fully the potential role of β-amyloid (Aβ) in the etiopathogenesis of the disease. The present study represents the first complete 3D reconstruction of Aβ in the hippocampus and entorhinal cortex of PDAPP transgenic mice. Aβ deposits were detected by immunostaining and thioflavin fluorescence, and quantified by using high-throughput digital image acquisition and analysis. Quantitative analysis of amyloid load in hippocampal subfields showed a dramatic increase between 12 and 15 months of age, with little or no earlier detectable deposition. Three-dimensional reconstruction in the oldest brains visualized previously unrecognized sheets of Aβ coursing through the hippocampus and cerebral cortex. In contrast with previous hypotheses, compact plaques form before significant deposition of diffuse Aβ, suggesting that different mechanisms are involved in the deposition of diffuse amyloid and the aggregation into plaques. The dentate gyrus was the hippocampal subfield with the greatest amyloid burden. Sublaminar distribution of Aβ in the dentate gyrus correlated most closely with the termination of afferent projections from the lateral entorhinal cortex, mirroring the selective vulnerability of this circuit in human AD. This detailed temporal and spatial analysis of Aβ and compact amyloid deposition suggests that specific corticocortical circuits express selective, but late, vulnerability to the pathognomonic markers of amyloid deposition, and can provide a basis for detecting prior vulnerability factors. PMID:12697936
NASA Astrophysics Data System (ADS)
Zhong, Fulin; Li, Ting; Pan, Boan; Wang, Pengbo
2017-02-01
Laser acupuncture is an effective photochemical and nonthermal stimulation of traditional acupuncture points with low-intensity laser irradiation, which has the advantages of being painless, sterile, and safe compared with traditional acupuncture. A laser diode (LD) provides single-wavelength and relatively high-power light for phototherapy. The quantitative effect of LD illumination parameters is crucial for the practical operation of laser acupuncture. However, this issue has not been fully demonstrated, especially since experimental methodologies with animals or humans can hardly address it. For example, in order to protect the viability of cells and tissue and obtain a better therapeutic effect, it is necessary to control the output power within the 5 mW-10 mW range, yet the optimal power is still not clear. This study aimed to quantitatively optimize the laser output power, wavelength, and irradiation direction with highly realistic modeling of light transport in acupunctured tissue. A Monte Carlo simulation software package for 3D voxelized media and the highest-precision human anatomical model, the Visible Chinese Human (VCH), were employed. Our 3D simulation results showed that the longer the wavelength and the higher the illumination power, the larger the absorption in laser acupuncture; vertical emission of the acupuncture laser results in a higher amount of light absorbed in both the acupunctured voxel of tissue and the muscle layer. Our 3D light distribution of laser acupuncture within the VCH tissue model can potentially be used for optimization and real-time guidance in the clinical manipulation of laser acupuncture.
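Voxel-based Monte Carlo photon transport of this kind follows a simple loop: sample a free path, deposit a fraction of the packet weight as absorbed energy, and scatter. The sketch below is a deliberately minimal, isotropic-scattering toy under assumed conventions (actual tissue simulations use anisotropic phase functions, boundary refraction, and an anatomical model such as the VCH):

```python
import numpy as np

def photon_absorption_map(mu_a, mu_s, n_photons=10_000, seed=1):
    """Toy voxelized Monte Carlo of photon packets (isotropic scattering,
    index-matched boundaries), a much-simplified stand-in for the
    MCML/MCX-style codes used in tissue light-transport modeling.

    mu_a, mu_s: 3D arrays of absorption and scattering coefficients,
    in units of 1/voxel; both assumed strictly positive.
    """
    rng = np.random.default_rng(seed)
    shape = np.array(mu_a.shape)
    absorbed = np.zeros(mu_a.shape)
    for _ in range(n_photons):
        pos = np.array([shape[0] / 2, shape[1] / 2, 0.0])  # top-face launch
        u = np.array([0.0, 0.0, 1.0])                      # pointing inward
        weight = 1.0
        while weight > 1e-4:
            idx = tuple(pos.astype(int))
            step = rng.exponential(1.0 / (mu_a[idx] + mu_s[idx]))
            pos = pos + u * step                           # sample free path
            if (pos < 0).any() or (pos >= shape).any():
                break                                      # escaped the grid
            idx = tuple(pos.astype(int))
            mu_t = mu_a[idx] + mu_s[idx]
            absorbed[idx] += weight * mu_a[idx] / mu_t     # partial absorption
            weight *= mu_s[idx] / mu_t                     # surviving weight
            cos_t = 2.0 * rng.random() - 1.0               # isotropic scatter
            phi = 2.0 * np.pi * rng.random()
            sin_t = np.sqrt(1.0 - cos_t ** 2)
            u = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return absorbed
```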
3D-Reconstruction of recent volcanic activity from ROV-video, Charles Darwin Seamounts, Cape Verdes
NASA Astrophysics Data System (ADS)
Kwasnitschka, T.; Hansteen, T. H.; Kutterolf, S.; Freundt, A.; Devey, C. W.
2011-12-01
As well as providing well-localized samples, Remotely Operated Vehicles (ROVs) produce huge quantities of visual data whose potential for geological data mining has seldom if ever been fully realized. We present a new workflow to derive essential results of field geology such as quantitative stratigraphy and tectonic surveying from ROV-based photo and video material. We demonstrate the procedure on the Charles Darwin Seamounts, a field of small hot spot volcanoes recently identified at a depth of ca. 3500 m southwest of the island of Santo Antao in the Cape Verdes. The Charles Darwin Seamounts feature a wide spectrum of volcanic edifices with forms suggestive of scoria cones, lava domes, tuff rings and maar-type depressions, all of comparable dimensions. These forms, coupled with the highly fragmented volcaniclastic samples recovered by dredging, motivated surveying parts of some edifices down to centimeter scale. ROV-based surveys yielded volcaniclastic samples of key structures linked by extensive coverage of stereoscopic photographs and high-resolution video. Based upon the latter, we present our workflow to derive three-dimensional models of outcrops from a single-camera video sequence, allowing quantitative measurements of fault orientation, bedding structure, grain size distribution and photo mosaicking within a geo-referenced framework. With this information we can identify episodes of repetitive eruptive activity at individual volcanic centers and see changes in eruptive style over time, which, despite their proximity to each other, is highly variable.
Retinal health information and notification system (RHINO)
NASA Astrophysics Data System (ADS)
Dashtbozorg, Behdad; Zhang, Jiong; Abbasi-Sureshjani, Samaneh; Huang, Fan; ter Haar Romeny, Bart M.
2017-03-01
The retinal vasculature is the only part of the blood circulation system that can be observed non-invasively using fundus cameras. Changes in the dynamic properties of retinal blood vessels are associated with many systemic and vascular diseases, such as hypertension, coronary heart disease and diabetes. The assessment of the characteristics of the retinal vascular network provides important information for an early diagnosis and prognosis of many systemic and vascular diseases. The manual analysis of retinal vessels and measurement of quantitative biomarkers in large-scale screening programs is tedious, time-consuming, and costly. This paper describes a reliable, automated, and efficient retinal health information and notification system (acronym RHINO) which can extract a wealth of geometric biomarkers from large volumes of fundus images. The fully automated software presented in this paper includes vessel enhancement and segmentation; artery/vein classification; optic disc, fovea, and vessel junction detection; and bifurcation/crossing discrimination. Pipelining these tools allows the assessment of several quantitative vascular biomarkers: width, curvature, bifurcation geometry features and fractal dimension. The brain-inspired algorithms outperform most of the state-of-the-art techniques. Moreover, several annotation tools are implemented in RHINO for the manual labeling of arteries and veins, marking the optic disc and fovea, and delineating vessel centerlines. The validation phase is ongoing and the software is currently being used for the analysis of retinal images from the Maastricht Study (the Netherlands), which includes over 10,000 subjects (healthy and diabetic) with a broad spectrum of clinical measurements.
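Among the biomarkers listed, fractal dimension is commonly estimated from a binary vessel segmentation by box counting; RHINO's own implementation is not described here, so the following is a generic sketch in which the input mask is a placeholder:

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32, 64)):
    """Estimate the fractal dimension of a 2D binary vessel mask by box counting."""
    counts = []
    for s in sizes:
        h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s  # trim to tile exactly
        boxes = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())   # boxes touching a vessel pixel
    # Slope of log(count) against log(1/size) is the box-counting dimension
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Toy check: a diagonal line ("vessel") should give a dimension near 1
print(box_counting_dimension(np.eye(256, dtype=bool)))
```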
Informal information for web-based engineering catalogues
NASA Astrophysics Data System (ADS)
Allen, Richard D.; Culley, Stephen J.; Hicks, Ben J.
2001-10-01
Success is highly dependent on the ability of a company to efficiently produce optimal designs. In order to achieve this, companies must minimize time to market and possess the ability to make fully informed decisions at the early phase of the design process. Such decisions may include the choice of component and suppliers, as well as cost and maintenance considerations. Computer modeling and electronic catalogues are becoming the preferred medium for the selection and design of mechanical components. In utilizing these techniques, the designer demands the capability to identify, evaluate and select mechanical components both quantitatively and qualitatively. Quantitative decisions generally encompass performance data included in the formal catalogue representation. It is in the area of qualitative decisions that the use of what the authors call 'Informal Information' is of crucial importance. Thus, 'Informal Information' must often be incorporated into the selection process and selection systems. This would enable more informed decisions to be made more quickly, without the need for information retrieval via discussion with colleagues in the design environment. This paper provides an overview of the use of electronic information in the design of mechanical systems, including a discussion of the limitations of current technology. The importance of Informal Information is discussed and the requirements for its association with web-based electronic catalogues are developed. The proposed system is based on a flexible XML schema and enables the storage, classification and recall of Informal Information packets. Furthermore, a strategy for the inclusion of Informal Information is proposed, and an example case is used to illustrate the benefits.
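To make the idea of an 'Informal Information' packet concrete, a sketch of how such a packet might be serialized and recalled is given below; the element names and fields are invented for illustration and do not reproduce the paper's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical packet: a designer's informal note attached to a catalogue component
packet = ET.Element("informalInformation", id="II-0042", component="bearing-6204")
ET.SubElement(packet, "source").text = "phone call with supplier"
ET.SubElement(packet, "classification").text = "maintenance"
ET.SubElement(packet, "note").text = ("Supplier reports long lead times for the sealed "
                                      "variant; consider the shielded type for prototypes.")
xml_text = ET.tostring(packet, encoding="unicode")  # stored alongside the catalogue entry

# Recall: filter stored packets by classification when a component is viewed
root = ET.fromstring(xml_text)
if root.findtext("classification") == "maintenance":
    print(root.findtext("note"))
```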
Neuronavigation using three-dimensional proton magnetic resonance spectroscopy data.
Kanberoglu, Berkay; Moore, Nina Z; Frakes, David; Karam, Lina J; Debbins, Josef P; Preul, Mark C
2014-01-01
Applications in clinical medicine can benefit from fusion of spectroscopy data with anatomical imagery. For example, new 3-dimensional (3D) spectroscopy techniques allow for improved correlation of metabolite profiles with specific regions of interest in anatomical tumor images, which can be useful in characterizing and treating heterogeneous tumors that appear structurally homogeneous. We sought to develop a clinical workflow and uniquely capable custom software tool to integrate advanced 3-tesla 3D proton magnetic resonance spectroscopic imaging ((1)H-MRSI) into industry standard image-guided neuronavigation systems, especially for use in brain tumor surgery. (1)H-MRSI spectra from preoperative scanning on 15 patients with recurrent or newly diagnosed meningiomas were processed and analyzed, and specific voxels were selected based on their chemical contents. 3D neuronavigation overlays were then generated and applied to anatomical image data in the operating room. The proposed 3D methods fully account for scanner calibration and comprise tools that we have now made publicly available. The new methods were quantitatively validated through a phantom study and applied successfully to mitigate biopsy uncertainty in a clinical study of meningiomas. The proposed methods improve upon the current state of the art in neuronavigation through the use of detailed 3D (1)H-MRSI data. Specifically, 3D MRSI-based overlays provide comprehensive, quantitative visual cues and location information during neurosurgery, enabling a progressive new form of online spectroscopy-guided neuronavigation. © 2014 S. Karger AG, Basel.
Pramanick, Abhijit; Shapiro, Steve M.; Glavic, Artur; ...
2015-10-14
In this study, ferromagnetic shape memory alloys (FSMAs) have shown great potential as active components in next generation smart devices due to their exceptionally large magnetic-field-induced strains and fast response times. During application of magnetic fields in FSMAs, as is common in several magnetoelastic smart materials, there occurs simultaneous rotation of magnetic moments and reorientation of twin variants; resolving these, although critical for the design of new materials and devices, has been difficult to achieve quantitatively with current characterization methods. At the same time, theoretical modeling of these phenomena has also faced limitations due to uncertainties in the values of physical properties such as the magnetocrystalline anisotropy energy (MCA), especially for off-stoichiometric FSMA compositions. Here, in situ polarized neutron diffraction is used to directly measure the extents of both magnetic moment rotation and crystallographic twin reorientation in an FSMA single crystal during the application of magnetic fields. Additionally, high-resolution neutron scattering measurements and first-principles calculations based on fully relativistic density functional theory are used to accurately determine the MCA for the compositionally disordered alloy Ni2Mn1.14Ga0.86. The results from these state-of-the-art experiments and calculations are self-consistently described within a phenomenological framework, which provides quantitative insights into the energetics of magnetostructural coupling in FSMAs. Based on the current model, the energy for magnetoelastic twin boundary propagation in the studied alloy is estimated to be ~150 kJ/m3.
NASA Astrophysics Data System (ADS)
Dan, Li; Guo, Li-Xin; Li, Jiang-Ting; Chen, Wei; Yan, Xu; Huang, Qing-Qing
2017-09-01
The expression of the complex dielectric permittivity for non-magnetized, fully ionized dusty plasma is obtained based on the kinetic equation in the Fokker-Planck-Landau collision model and the charging equation of the statistical theory. The influences of the density and average size of dust grains, and of the balanced charging of the charge number of dust particles, on the attenuation properties of electromagnetic waves in fully ionized dusty plasma are investigated by calculating the attenuation constant. In addition, the attenuation characteristics of weakly ionized and fully ionized dusty plasmas are compared. The results enrich the physical picture of microwave attenuation in fully ionized dusty plasma and provide a theoretical basis for future studies.
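For context, once the complex relative permittivity of the plasma is in hand, the attenuation constant of a plane wave follows from the imaginary part of the propagation constant; this is the standard relation for a lossy medium, not a result specific to the collision model above:

```latex
% Plane-wave attenuation constant from the complex relative permittivity
% \varepsilon_r = \varepsilon' - i\varepsilon''
\alpha \;=\; \frac{\omega}{c}\,\operatorname{Im}\sqrt{\varepsilon_r}
       \;=\; \frac{\omega}{c}
       \left[\frac{\sqrt{\varepsilon'^{\,2}+\varepsilon''^{\,2}}-\varepsilon'}{2}\right]^{1/2}
```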
Cook, Linda; Ng, Ka-Wing; Bagabag, Arthur; Corey, Lawrence; Jerome, Keith R.
2004-01-01
Hepatitis C virus (HCV) infection is an increasing health problem worldwide. Quantitative assays for HCV viral load are valuable in predicting response to therapy and for following treatment efficacy. Unfortunately, most quantitative tests for HCV RNA are limited by poor sensitivity. We have developed a convenient, highly sensitive real-time reverse transcription-PCR assay for HCV RNA. The assay amplifies a portion of the 5′ untranslated region of HCV, which is then quantitated using the TaqMan 7700 detection system. Extraction of viral RNA for our assay is fully automated with the MagNA Pure LC extraction system (Roche). Our assay has a 100% detection rate for samples containing 50 IU of HCV RNA/ml and is linear up to viral loads of at least 10(9) IU/ml. The assay detects genotypes 1a, 2a, and 3a with equal efficiency. Quantitative results by our assay correlate well with HCV viral load as determined by the Bayer VERSANT HCV RNA 3.0 bDNA assay. In clinical use, our assay is highly reproducible, with high and low control specimens showing a coefficient of variation for the logarithmic result of 2.8 and 7.0%, respectively. The combination of reproducibility, extreme sensitivity, and ease of performance makes this assay an attractive option for routine HCV viral load testing. PMID:15365000
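Quantification in real-time PCR assays of this kind proceeds through a standard curve that is linear in the logarithm of input concentration: the threshold cycle (Ct) is regressed against log10 viral load for calibrators, and sample Ct values are inverted through the fit. A sketch with invented calibrator values (not the assay's actual calibration data):

```python
import numpy as np

# Hypothetical calibrators: known HCV loads (IU/ml) and their measured Ct values
known_iu_ml = np.array([1e2, 1e4, 1e6, 1e8])
ct = np.array([35.1, 28.4, 21.8, 15.2])

# Fit Ct = slope * log10(IU/ml) + intercept
slope, intercept = np.polyfit(np.log10(known_iu_ml), ct, 1)
print(f"slope = {slope:.2f} (100% PCR efficiency gives about -3.32)")

def viral_load(sample_ct):
    """Invert the standard curve to estimate IU/ml from a sample's Ct."""
    return 10 ** ((sample_ct - intercept) / slope)

print(f"Ct 25.0 -> {viral_load(25.0):.2e} IU/ml")
```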
Competency for graviresponse in the leaf-sheath pulvinus of Avena sativa: onset to loss
NASA Technical Reports Server (NTRS)
Brock, T. G.; Kaufman, P. B.
1988-01-01
The development of the leaf-sheath pulvinus of oat (Avena sativa L. cv. Victory) was studied in terms of its competency to respond to gravistimulation. Stages of onset of competency, maximum competency and loss of competency were identified, using the length of the subtending internode as a developmental marker. During the early phases in the onset of competency, the latency period between stimulus and graviresponse decreased and the steady state response rate increased significantly. When fully competent, the latency period remained constant as the plant continued to develop, suggesting that the latency period is relatively insensitive to quantitative changes (e.g., in carbohydrate or nutrient availability) at the cell level within the plant. In contrast, the response rate was found to increase with plant development, indicating that graviresponse rate is more strongly influenced by quantitative cellular changes. The total possible graviresponse of a single oat pulvinus was confirmed to be significantly less than the original presentation angle. This was shown to not result from a loss of competency, since the graviresponse could be reinitiated by increasing the presentation angle. As a result of the low overall graviresponse of individual pulvini, two or more pulvini are required to bring the plant apex to the vertical. This was determined to occur through the sequential, rather than simultaneous, action of successive pulvini, since a given pulvinus lost competency to gravirespond shortly after the next pulvinus became fully competent.
Hazrati, Sadegh; Harrad, Stuart
2007-03-01
PUF disk passive air samplers are increasingly employed for monitoring POPs in ambient air. In order to utilize them as quantitative sampling devices, a calibration experiment was conducted. Time-integrated indoor air concentrations of PCBs and PBDEs were obtained from a low-volume air sampler operated over a 50 d period alongside the PUF disk samplers in the same office microenvironment. Passive sampling rates for the fully-sheltered sampler design employed in our research were determined for the 51 PCB and 7 PBDE congeners detected in all calibration samples. These values varied from 0.57 to 1.55 m3 d(-1) for individual PCBs and from 1.1 to 1.9 m3 d(-1) for PBDEs. These values are appreciably lower than those reported elsewhere for different PUF disk sampler designs (e.g. partially sheltered) employed under different conditions (e.g. in outdoor air) and derived using different calibration experiment configurations. This suggests that sampling rates derived for a specific sampler configuration deployed under specific environmental conditions should not be extrapolated to different sampler configurations. Furthermore, our observation of variable congener-specific sampling rates (consistent with other studies) implies that more research is required to fully understand the factors that influence sampling rates. Analysis of wipe samples taken from the inside of the sampler housing revealed evidence that the housing surface scavenges particle-bound PBDEs.
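In calibration studies of this type, the congener-specific sampling rate is obtained by dividing the mass accumulated on the PUF disk by the time-integrated air concentration from the co-located active sampler; once known, it converts a field-deployed disk's accumulated mass back into a concentration. A minimal sketch with invented numbers (not the study's data):

```python
# Passive sampling rate calibration (illustrative numbers only)
mass_on_puf_ng = 42.0      # congener mass accumulated on the PUF disk
c_air_ng_per_m3 = 0.60     # time-integrated concentration from the active sampler
days_deployed = 50.0

# R such that mass = C_air * R * t, valid in the linear uptake phase
r_m3_per_day = mass_on_puf_ng / (c_air_ng_per_m3 * days_deployed)
print(f"R = {r_m3_per_day:.2f} m3/day")

# Applying the calibrated R to a field deployment of the same sampler design:
c_estimate_ng_per_m3 = mass_on_puf_ng / (r_m3_per_day * days_deployed)
```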
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-01
... emissions, (applicants are encouraged to provide quantitative information regarding expected reductions in...). Applicants are encouraged to provide quantitative information that validates the existence of substantial... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...
NASA Astrophysics Data System (ADS)
Andrade, João Rodrigo; Martins, Ramon Silva; Thompson, Roney Leon; Mompean, Gilmar; da Silveira Neto, Aristeu
2018-04-01
The present paper provides an analysis of the statistical uncertainties associated with direct numerical simulation (DNS) results and experimental data for turbulent channel and pipe flows, showing a new physically based quantification of these errors, to improve the determination of the statistical deviations between DNSs and experiments. The analysis is carried out using a recently proposed criterion by Thompson et al. ["A methodology to evaluate statistical errors in DNS data of plane channel flows," Comput. Fluids 130, 1-7 (2016)] for fully turbulent plane channel flows, where the mean velocity error is estimated by considering the Reynolds stress tensor, and using the balance of the mean force equation. It also presents how the residual error evolves in time for a DNS of a plane channel flow, and the influence of the Reynolds number on its convergence rate. The root mean square of the residual error is shown in order to capture a single quantitative value of the error associated with the dimensionless averaging time. The evolution in time of the error norm is compared with the final error provided by DNS data of similar Reynolds numbers available in the literature. A direct consequence of this approach is that it was possible to compare different numerical results and experimental data, providing an improved understanding of the convergence of the statistical quantities in turbulent wall-bounded flows.
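The criterion referenced above exploits the mean momentum balance of fully developed channel flow: the total (viscous plus Reynolds) shear stress must be exactly linear in the wall-normal coordinate, so any departure measures residual statistical error from finite averaging time. A schematic rendering of that check in wall units (a simplification of the published methodology):

```python
import numpy as np

def stress_residual_rms(y_plus, dudy_plus, uv_plus, re_tau):
    """RMS departure of the total shear stress from the analytic linear profile.

    For converged, fully developed channel flow statistics,
    dU+/dy+ - <u'v'>+ = 1 - y+/Re_tau holds exactly; the residual is
    attributable to finite-time averaging error.
    """
    total_stress = dudy_plus - uv_plus        # viscous + Reynolds shear stress
    analytic = 1.0 - y_plus / re_tau
    return np.sqrt(np.mean((total_stress - analytic) ** 2))

# Usage: y_plus, dudy_plus and uv_plus come from the DNS statistics, e.g.
# residual = stress_residual_rms(y_plus, dudy_plus, uv_plus, re_tau=590)
```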
NASA Technical Reports Server (NTRS)
Berger, Karen T.
2008-01-01
An experimental wind tunnel program is being conducted in support of a NASA-wide effort to develop a Space Shuttle replacement and to support the Agency's long-term objective of returning to the Moon and Mars. This report documents experimental measurements made on several scaled ceramic heat transfer models of the proposed Crew Exploration Vehicle Crew Module. The experimental data highlighted in this test report are to be used to assess numerical tools that will be used to generate the flight aerothermodynamic database. Global heat transfer images and heat transfer distributions were obtained over a range of freestream Reynolds numbers and angles of attack with the phosphor thermography technique. Heat transfer data were measured on the forebody and afterbody and were used to infer the heating on the vehicle as well as the boundary layer state on the forebody surface. Several model support configurations were assessed to minimize potential support interference. In addition, the ability of the global phosphor thermography method to provide quantitative heating measurements in the low temperature environment of the capsule base region was assessed. While naturally fully developed turbulent levels were not obtained on the forebody, the use of boundary layer trips generated fully developed turbulent flow. Laminar and turbulent computational results were shown to be in good agreement with the data. Backshell testing demonstrated the ability to obtain data in the low temperature region as well as demonstrating the lack of significant model support hardware influence on heating.
Kume, Teruyoshi; Kim, Byeong-Keuk; Waseda, Katsuhisa; Sathyanarayana, Shashidhar; Li, Wenguang; Teo, Tat-Jin; Yock, Paul G; Fitzgerald, Peter J; Honda, Yasuhiro
2013-02-01
The aim of this study was to evaluate a new fully automated lumen border tracing system based on a novel multifrequency processing algorithm. We developed the multifrequency processing method to enhance arterial lumen detection by exploiting the differential scattering characteristics of blood and arterial tissue. The implementation of the method can be integrated into current intravascular ultrasound (IVUS) hardware. This study was performed in vivo with conventional 40-MHz IVUS catheters (Atlantis SR Pro™, Boston Scientific Corp, Natick, MA) in 43 clinical patients with coronary artery disease. A total of 522 frames were randomly selected, and lumen areas were measured after automatically tracing lumen borders with the new tracing system and a commercially available tracing system (TraceAssist™) referred to as the "conventional tracing system." The data assessed by the two automated systems were compared with the results of manual tracings by experienced IVUS analysts. New automated lumen measurements showed better agreement with manual lumen area tracings compared with those of the conventional tracing system (correlation coefficient: 0.819 vs. 0.509). When compared against manual tracings, the new algorithm also demonstrated improved systematic error (mean difference: 0.13 vs. -1.02 mm(2) ) and random variability (standard deviation of difference: 2.21 vs. 4.02 mm(2) ) compared with the conventional tracing system. This preliminary study showed that the novel fully automated tracing system based on the multifrequency processing algorithm can provide more accurate lumen border detection than current automated tracing systems and thus, offer a more reliable quantitative evaluation of lumen geometry. Copyright © 2011 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Berger, Karen T.
2009-01-01
An experimental wind tunnel program is being conducted in support of a NASA-wide effort to develop a Space Shuttle replacement and to support the Agency's long-term objective of returning to the Moon and Mars. This article documents experimental measurements made on several scaled ceramic heat transfer models of the proposed Crew Exploration Vehicle Crew Module. The experimental data highlighted in this article are to be used to assess numerical tools that will be used to generate the flight aerothermodynamic database. Global heat transfer images and heat transfer distributions were obtained over a range of freestream Reynolds numbers and angles of attack with the phosphor thermography technique. Heat transfer data were measured on the forebody and afterbody and were used to infer the heating on the vehicle as well as the boundary layer state on the forebody surface. Several model support configurations were assessed to minimize potential support interference. In addition, the ability of the global phosphor thermography method to provide quantitative heating measurements in the low temperature environment of the capsule base region was assessed. While naturally fully developed turbulent levels were not obtained on the forebody, the use of boundary layer trips generated fully developed turbulent flow. Laminar and turbulent computational results were shown to be in good agreement with the data. Backshell testing demonstrated the ability to obtain data in the low temperature region as well as demonstrating the lack of significant model support hardware influence on heating.
Paul, Lorna; Coulter, Elaine H; Miller, Linda; McFadyen, Angus; Dorfman, Joe; Mattison, Paul George G
2014-09-01
To explore the effectiveness and participant experience of web-based physiotherapy for people moderately affected by Multiple Sclerosis (MS), and to provide data to establish the sample size required for a fully powered, definitive randomized controlled study. A randomized controlled pilot study. Rehabilitation centre and participants' homes. Thirty community-dwelling adults moderately affected by MS (Expanded Disability Status Scale 5-6.5). Twelve weeks of individualised web-based physiotherapy completed twice per week, or usual care (control). Online exercise diaries were monitored; participants were telephoned weekly by the physiotherapist and exercise programmes were altered remotely by the physiotherapist as required. The following outcomes were completed at baseline and after 12 weeks: 25 Foot Walk, Berg Balance Scale, Timed Up and Go, Multiple Sclerosis Impact Scale, Leeds MS Quality of Life Scale, MS-Related Symptom Checklist and Hospital Anxiety and Depression Scale. The intervention group also completed a website evaluation questionnaire and interviews. Participants reported that the website was easy to use, convenient and motivating, and said they would be happy to use it in the future. There was no statistically significant difference in the primary outcome measure, the timed 25 Foot Walk, in the intervention group (P=0.170), or in the other secondary outcome measures, except the Multiple Sclerosis Impact Scale (P=0.048). Effect sizes were generally small to moderate. People with MS were very positive about web-based physiotherapy. The results suggested that 80 participants, 40 in each group, would be sufficient for a fully powered, definitive randomized controlled trial. © The Author(s) 2014.
Wengert, Georg Johannes; Helbich, Thomas H; Vogl, Wolf-Dieter; Baltzer, Pascal; Langs, Georg; Weber, Michael; Bogner, Wolfgang; Gruber, Stephan; Trattnig, Siegfried; Pinker, Katja
2015-02-01
The purposes of this study were to introduce and assess an automated user-independent quantitative volumetric (AUQV) breast density (BD) measurement system based on magnetic resonance imaging (MRI) using the Dixon technique, and to compare it with qualitative and quantitative mammographic (MG) BD measurements. Forty-three women with normal mammogram results (Breast Imaging Reporting and Data System 1) were included in this institutional review board-approved prospective study. All participants underwent BD assessment with MRI using a Dixon-technique sequence (repetition time/first and second echo times, 6/2.45/2.67 milliseconds; 1-mm isotropic resolution; acquisition time, 3 minutes 38 seconds). To test reproducibility, a second MRI examination was performed after patient repositioning. The AUQV magnetic resonance (MR) BD measurement system automatically calculated percentage (%) BD. The qualitative BD assessment was performed using the American College of Radiology Breast Imaging Reporting and Data System BD categories. Quantitative BD was estimated semiautomatically using the thresholding technique Cumulus4. Appropriate statistical tests were used to assess the agreement between the AUQV MR measurements and to compare them with qualitative and quantitative MG BD estimations. The AUQV MR BD measurements were successfully performed in all 43 women. There was nearly perfect agreement of AUQV MR BD measurements between the 2 MR examinations for % BD (P < 0.001; intraclass correlation coefficient, 0.998), with no significant differences (P = 0.384). The AUQV MR BD measurements were significantly lower than both quantitative and qualitative MG BD assessments (P < 0.001). The AUQV MR BD measurement system allows fully automated, user-independent, robust, reproducible, as well as radiation- and compression-free volumetric quantitative BD assessment across different levels of BD. The AUQV MR BD measurements were significantly lower than the currently used qualitative and quantitative MG-based approaches, implying that current MG-based assessment might overestimate breast density.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-01
... provide quantitative information regarding expected reductions in emissions of CO 2 or fuel consumption as... provide quantitative information that validates the existence of substantial transportation-related costs... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...
Shafaati, A; Clark, B J
2000-03-01
The development of a stability-indicating capillary zone electrophoresis (CZE) method for the determination of the drug azathioprine (AZA) and its related substances in bulk and dosage forms is described. Theophylline was used as an internal standard to improve quantitative results. The method was fully validated in terms of repeatability (n = 10, RSD for migration time and peak area ratio were 0.15% and 0.60%, respectively), reproducibility (n = 5, RSD of peak area ratio was 0.84%), linearity at two ranges of the azathioprine concentration, limits of detection (LOD) and quantitation (LOQ), and robustness. The method was applied for determination of the drug in bulk and a commercial tablet dosage form (recovery 98.3-101.3%) and in powder for injection (recovery 98.7-100.6%). The method was fast and reliable for the analysis of AZA and its related substances in bulk and dosage forms.
Acne in adolescents: quality of life, self-esteem, mood, and psychological disorders.
Dunn, Lauren K; O'Neill, Jenna L; Feldman, Steven R
2011-01-15
Acne is a significant adolescent problem and may precipitate emotional and psychological effects. The impact of acne on psychological parameters and the implications for acne treatment are not fully understood. We performed a MEDLINE search using the terms "acne" and "adolescent" along with "psychological," "depression," or "psychiatric," which yielded 16 studies for review. Qualitative review of the selected articles revealed that the presence of acne has a significant impact on self-esteem and quality of life. Depression and other psychological disorders are more prevalent in acne patients, and acne treatment may improve symptoms of these disorders. The reviewed studies were semi-quantitative analyses utilizing various standardized surveys or questionnaires; therefore, quantitative analysis of the selected studies was not possible. The presence of co-morbid psychological disorders should be considered in the treatment of acne patients, and future prospective trials are needed to assess the impact of treatment on psychological outcomes.
Electrochemical Detection in Stacked Paper Networks.
Liu, Xiyuan; Lillehoj, Peter B
2015-08-01
Paper-based electrochemical biosensors are a promising technology that enables rapid, quantitative measurements on an inexpensive platform. However, the control of liquids in paper networks is generally limited to a single sample delivery step. Here, we propose a simple method to automate the loading and delivery of liquid samples to sensing electrodes on paper networks by stacking multiple layers of paper. Using these stacked paper devices (SPDs), we demonstrate a unique strategy to fully immerse planar electrodes in aqueous liquids via capillary flow. Amperometric measurements of xanthine oxidase revealed that electrochemical sensors on four-layer SPDs generated detection signals up to 75% higher compared with those on single-layer paper devices. Furthermore, measurements could be performed with minimal user involvement and completed within 30 min. Due to its simplicity, enhanced automation, and capability for quantitative measurements, stacked paper electrochemical biosensors can be useful tools for point-of-care testing in resource-limited settings. © 2015 Society for Laboratory Automation and Screening.
Numerical Investigation of Vertical Plunging Jet Using a Hybrid Multifluid–VOF Multiphase CFD Solver
Shonibare, Olabanji Y.; Wardle, Kent E.
2015-06-28
A novel hybrid multiphase flow solver has been used to conduct simulations of a vertical plunging liquid jet. This solver combines a multifluid methodology with selective interface sharpening to enable simulation of both the initial jet impingement and the long-time entrained bubble plume phenomena. Models are implemented for variable bubble size capturing and dynamic switching of interface sharpened regions to capture transitions between the initially fully segregated flow types into the dispersed bubbly flow regime. It was found that the solver was able to capture the salient features of the flow phenomena under study, and areas for quantitative improvement have been explored and identified. In particular, a population balance approach is employed and detailed calibration of the underlying models with experimental data is required to enable quantitative prediction of bubble size and distribution to capture the transition between segregated and dispersed flow types with greater fidelity.
Morin, Fanny; Courtecuisse, Hadrien; Reinertsen, Ingerid; Le Lann, Florian; Palombi, Olivier; Payan, Yohan; Chabanas, Matthieu
2017-08-01
During brain tumor surgery, planning and guidance are based on preoperative images which do not account for brain-shift. However, this deformation is a major source of error in image-guided neurosurgery and affects the accuracy of the procedure. In this paper, we present a constraint-based biomechanical simulation method to compensate for craniotomy-induced brain-shift that integrates the deformations of the blood vessels and cortical surface, using a single intraoperative ultrasound acquisition. Prior to surgery, a patient-specific biomechanical model is built from preoperative images, accounting for the vascular tree in the tumor region and brain soft tissues. Intraoperatively, a navigated ultrasound acquisition is performed directly in contact with the organ. Doppler and B-mode images are recorded simultaneously, enabling the extraction of the blood vessels and probe footprint, respectively. A constraint-based simulation is then executed to register the pre- and intraoperative vascular trees as well as the cortical surface with the probe footprint. Finally, preoperative images are updated to provide the surgeon with images corresponding to the current brain shape for navigation. The robustness of our method is first assessed using sparse and noisy synthetic data. In addition, quantitative results for five clinical cases are provided, first using landmarks set on blood vessels, then based on anatomical structures delineated in medical images. The average distances between paired vessel landmarks ranged from 3.51 to 7.32 mm before compensation. With our method, on average 67% of the brain-shift is corrected (residual range [1.26; 2.33] mm), against 57% using one of the closest existing works (residual range [1.71; 2.84] mm). Finally, our method is shown to be fully compatible with a surgical workflow in terms of execution times and user interactions. In this paper, a new constraint-based biomechanical simulation method is proposed to compensate for craniotomy-induced brain-shift. While efficiently correcting this deformation, the method is fully integrable into a clinical workflow. Copyright © 2017 Elsevier B.V. All rights reserved.
McGarry, Bryony L; Rogers, Harriet J; Knight, Michael J; Jokivarsi, Kimmo T; Sierra, Alejandra; Gröhn, Olli Hj; Kauppinen, Risto A
2016-08-01
Quantitative T2 relaxation magnetic resonance imaging allows estimation of stroke onset time. We aimed to examine the accuracy of quantitative T1 and quantitative T2 relaxation times, alone and in combination, in providing estimates of stroke onset time in a rat model of permanent focal cerebral ischemia, and to map the spatial distribution of elevated quantitative T1 and quantitative T2 to assess tissue status. Permanent middle cerebral artery occlusion was induced in Wistar rats. Animals were scanned at 9.4 T for quantitative T1, quantitative T2, and the trace of the diffusion tensor (Dav) up to 4 h post-middle cerebral artery occlusion. Time courses of the differentials of quantitative T1 and quantitative T2 between ischemic and non-ischemic contralateral brain tissue (ΔT1, ΔT2) and the volumes of tissue with elevated T1 and T2 relaxation times (f1, f2) were determined. TTC staining was used to highlight permanent ischemic damage. ΔT1, ΔT2, f1, f2, and the volume of tissue with both elevated quantitative T1 and quantitative T2 (V(Overlap)) increased with time post-middle cerebral artery occlusion, allowing stroke onset time to be estimated. V(Overlap) provided the most accurate estimate, with an uncertainty of ±25 min. At all time-points, regions with elevated relaxation times were smaller than areas with Dav-defined ischemia. Stroke onset time can be determined from quantitative T1 and quantitative T2 relaxation times and tissue volumes. Combining quantitative T1 and quantitative T2 provides the most accurate estimate and potentially identifies irreversibly damaged brain tissue. © 2016 World Stroke Organization.
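Onset-time estimation of this kind amounts to inverting an empirical growth curve of the relaxation-time differential against time post-occlusion. The linear calibration below is an invented stand-in that shows only the mechanics, not the study's fitted model or its ±25 min uncertainty:

```python
import numpy as np

# Hypothetical calibration: Delta-T2 (%) grows roughly linearly after occlusion
t_minutes = np.array([60.0, 120.0, 180.0, 240.0])
delta_t2_pct = np.array([4.1, 7.8, 12.2, 15.9])
slope, intercept = np.polyfit(t_minutes, delta_t2_pct, 1)

def onset_minutes(observed_delta_t2_pct):
    """Invert the calibration to estimate minutes elapsed since stroke onset."""
    return (observed_delta_t2_pct - intercept) / slope

print(f"Delta-T2 of 10% -> about {onset_minutes(10.0):.0f} min post-occlusion")
```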
NASA Astrophysics Data System (ADS)
Tian, Y.; Zheng, Y.; Zheng, C.; Han, F., Sr.
2017-12-01
Physically based and fully-distributed integrated hydrological models (IHMs) can quantitatively depict hydrological processes, both surface and subsurface, with sufficient spatial and temporal detail. However, the complexity involved in pre-processing data and setting up models has seriously hindered the wider application of IHMs in scientific research and management practice. This study introduces our design and development of Visual HEIFLOW, hereafter referred to as VHF, a comprehensive graphical data processing and modeling system for integrated hydrological simulation. The current version of VHF has been structured to accommodate an IHM named HEIFLOW (Hydrological-Ecological Integrated watershed-scale FLOW model). HEIFLOW, a model being developed by the authors, has all the typical elements of physically based and fully-distributed IHMs. It is based on GSFLOW, a representative integrated surface water-groundwater model developed by the USGS. HEIFLOW provides several ecological modules that enable simulation of the growth cycle of general vegetation and of particular plants (maize and Populus euphratica). VHF incorporates and streamlines all key steps of integrated modeling, and accommodates all types of GIS data necessary for hydrological simulation. It provides a GIS-based data processing framework to prepare an IHM for simulations, and has functionalities to flexibly display and modify model features (e.g., model grids, streams, boundary conditions, observational sites, etc.) and their associated data. It enables visualization and various spatio-temporal analyses of all model inputs and outputs at different scales (i.e., computing unit, sub-basin, basin, or user-defined spatial extent). The above system features, as well as many others, can significantly reduce the difficulty and time cost of building and using a complex IHM. A case study in the Heihe River Basin demonstrated the applicability of VHF for large-scale integrated SW-GW modeling. Visualization and spatio-temporal analysis of the modeling results by HEIFLOW greatly facilitate our understanding of the complicated hydrologic cycle and of the relationships among hydrological and ecological variables in the study area, and provide insights into regional water resources management.
Critical Quantitative Inquiry in Context
ERIC Educational Resources Information Center
Stage, Frances K.; Wells, Ryan S.
2014-01-01
This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…
Fully nonlinear theory of transcritical shallow-water flow past topography
NASA Astrophysics Data System (ADS)
El, Gennady; Grimshaw, Roger; Smyth, Noel
2010-05-01
In this talk, recent results on the generation of undular bores in one-dimensional fully nonlinear shallow-water flows past localised topographies will be presented. The description is made in the framework of the forced Su-Gardner (a.k.a. 1D Green-Naghdi) system of equations, with a primary focus on the transcritical regime, when the Froude number of the oncoming flow is close to unity. A combination of the local transcritical hydraulic solution over the localized topography, which produces upstream and downstream hydraulic jumps, and unsteady undular bore solutions describing the resolution of these hydraulic jumps, is used to describe various flow regimes depending on the combination of the topography height and the Froude number. We take advantage of the recently developed modulation theory of Su-Gardner undular bores to derive the main parameters of transcritical fully nonlinear shallow-water flow, such as the leading solitary wave amplitudes for the upstream and downstream undular bores, the speeds of the undular bore edges, and the drag force. Our results confirm that most features of the previously developed description in the framework of the uni-directional forced KdV model hold up qualitatively for finite-amplitude waves, while the quantitative description can be obtained in the framework of the bi-directional forced Su-Gardner system.
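For readers unfamiliar with the uni-directional model being compared against, the transcritical forced KdV equation, in one standard non-dimensional form (sign conventions and scalings vary between papers), reads:

```latex
% Forced KdV for transcritical flow past topography f(x);
% \Delta = F - 1 measures the deviation of the Froude number F from unity.
-u_t \;-\; \Delta\,u_x \;+\; \tfrac{3}{2}\,u\,u_x \;+\; \tfrac{1}{6}\,u_{xxx}
\;+\; \tfrac{1}{2}\,f_x(x) \;=\; 0
```

The Su-Gardner system replaces this with a bi-directional, fully nonlinear description; its modulation theory supplies the bore-edge speeds and lead-wave amplitudes quoted above.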
Quantitative evaluation methods of skin condition based on texture feature parameters.
Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing
2017-03-01
In order to quantitatively evaluate the improvement of skin condition after the use of skin care products or cosmetic treatment, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. First, a 3 × 3 median filter is applied, and the locations of hair pixels on the skin are then accurately detected from the gray-level mean and color information. Bilinear interpolation is used to replace the gray values of the hair pixels, eliminating the negative effects of noise and fine hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters (the angular second moment, contrast, entropy and correlation) and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which yields comprehensive parameters of skin condition. Experiments show that the method's evaluations of skin condition agree with biochemical skin evaluation methods and are also fully consistent with human visual perception. The method overcomes the shortcomings of biochemical evaluation (skin damage and long waiting times) as well as the subjectivity and fuzziness of visual evaluation, achieving non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can also quantitatively evaluate subtle improvements in skin condition after the use of skin care products or cosmetic treatment.
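A minimal sketch of the GLCM feature extraction described above, using scikit-image; the entropy is computed by hand because graycoprops does not provide it, and the preprocessing (median filtering, hair-pixel interpolation) is assumed to have been done already:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def skin_texture_features(gray_img):
    """GLCM features averaged over the four 45-degree-spaced directions.

    gray_img: 2D uint8 array, already median-filtered and hair-corrected.
    """
    angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
    glcm = graycomatrix(gray_img, distances=[1], angles=angles,
                        levels=256, symmetric=True, normed=True)
    feats = {prop: float(graycoprops(glcm, prop).mean())  # mean over the 4 angles
             for prop in ("ASM", "contrast", "correlation")}
    p = glcm[:, :, 0, :]                                  # distance index 0
    feats["entropy"] = float(np.mean(-np.sum(p * np.log2(p + 1e-12), axis=(0, 1))))
    return feats
```

A composite skin-condition score would then be some weighted combination of these four features; the weighting used by the authors is not specified here.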
Brady, Jacob S.; Romano-Keeler, Joann; Drake, Wonder P.; Norris, Patrick R.; Jenkins, Judith M.; Isaacs, Richard J.; Boczko, Erik M.
2015-01-01
BACKGROUND: Ventilator-associated pneumonia (VAP) remains a common complication in critically ill surgical patients, and its diagnosis remains problematic. Exhaled breath contains aerosolized droplets that reflect the lung microbiota. We hypothesized that exhaled breath condensate fluid (EBCF) in hygroscopic condenser humidifier/heat and moisture exchanger (HCH/HME) filters would contain bacterial DNA that qualitatively and quantitatively correlate with pathogens isolated from quantitative BAL samples obtained for clinical suspicion of pneumonia. METHODS: Forty-eight adult patients who were mechanically ventilated and undergoing quantitative BAL (n = 51) for suspected pneumonia in the surgical ICU were enrolled. Per protocol, patients fulfilling VAP clinical criteria undergo quantitative BAL bacterial culture. Immediately prior to BAL, time-matched HCH/HME filters were collected for study of EBCF by real-time polymerase chain reaction. Additionally, convenience samples of serially collected filters in patients with BAL-diagnosed VAP were analyzed. RESULTS: Forty-nine of 51 time-matched EBCF/BAL fluid samples were fully concordant (concordance > 95% by κ statistic) relative to identified pathogens and strongly correlated with clinical cultures. Regression analysis of quantitative bacterial DNA in paired samples revealed a statistically significant positive correlation (r = 0.85). In a convenience sample, qualitative and quantitative polymerase chain reaction analysis of serial HCH/HME samples for bacterial DNA demonstrated an increase in load that preceded the suspicion of pneumonia. CONCLUSIONS: Bacterial DNA within EBCF demonstrates a high correlation with BAL fluid and clinical cultures. Bacterial DNA within EBCF increases prior to the suspicion of pneumonia. Further study of this novel approach may allow development of a noninvasive tool for the early diagnosis of VAP. PMID:25474571
smiFISH and FISH-quant - a flexible single RNA detection approach with super-resolution capability.
Tsanov, Nikolay; Samacoits, Aubin; Chouaib, Racha; Traboulsi, Abdel-Meneem; Gostan, Thierry; Weber, Christian; Zimmer, Christophe; Zibara, Kazem; Walter, Thomas; Peter, Marion; Bertrand, Edouard; Mueller, Florian
2016-12-15
Single molecule FISH (smFISH) allows studying transcription and RNA localization by imaging individual mRNAs in single cells. We present smiFISH (single molecule inexpensive FISH), an easy-to-use and flexible RNA visualization and quantification approach that uses unlabelled primary probes and a fluorescently labelled secondary detector oligonucleotide. The gene-specific probes are unlabelled and can therefore be synthesized at low cost, allowing more probes to be used per mRNA and resulting in a substantial increase in detection efficiency. smiFISH is also flexible, since differently labelled secondary detector probes can be used with the same primary probes. We demonstrate that this flexibility allows multicolor labelling without the need to synthesize new probe sets. We further demonstrate that the use of a specific acrydite detector oligonucleotide allows smiFISH to be combined with expansion microscopy, enabling the resolution of transcripts in 3D below the diffraction limit on a standard microscope. Lastly, we provide improved, fully automated software tools from probe design to quantitative analysis of smFISH images. In short, we provide a complete workflow to automatically obtain counts of individual RNA molecules in single cells. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Translating the Simulation of Procedural Drilling Techniques for Interactive Neurosurgical Training
Stredney, Don; Rezai, Ali R.; Prevedello, Daniel M.; Elder, J. Bradley; Kerwin, Thomas; Hittle, Bradley; Wiet, Gregory J.
2014-01-01
Background Through previous and concurrent efforts, we have developed a fully virtual environment to provide procedural training of otologic surgical technique. The virtual environment is based on high-resolution volumetric data of the regional anatomy. This volumetric data helps drive an interactive multi-sensory, i.e., visual (stereo), aural (stereo), and tactile simulation environment. Subsequently, we have extended our efforts to support the training of neurosurgical procedural technique as part of the CNS simulation initiative. Objective The goal of this multi-level development is to deliberately study the integration of simulation technologies into the neurosurgical curriculum and to determine their efficacy in teaching minimally invasive cranial and skull base approaches. Methods We discuss issues of biofidelity as well as our methods to provide objective, quantitative automated assessment for the residents. Results We conclude with a discussion of our experiences by reporting on preliminary formative pilot studies and proposed approaches to take the simulation to the next level through additional validation studies. Conclusion We have presented our efforts to translate an otologic simulation environment for use in the neurosurgical curriculum. We have demonstrated the initial proof of principles and define the steps to integrate and validate the system as an adjuvant to the neurosurgical curriculum. PMID:24051887
Shear Wave Velocity Imaging Using Transient Electrode Perturbation: Phantom and ex vivo Validation
Varghese, Tomy; Madsen, Ernest L.
2011-01-01
This paper presents a new shear wave velocity imaging technique to monitor radio-frequency and microwave ablation procedures, coined electrode vibration elastography. A piezoelectric actuator attached to an ablation needle is transiently vibrated to generate shear waves that are tracked at high frame rates. The time-to-peak algorithm is used to reconstruct the shear wave velocity and thereby the shear modulus variations. The feasibility of electrode vibration elastography is demonstrated using finite element models and ultrasound simulations, tissue-mimicking phantoms simulating fully (phantom 1) and partially ablated (phantom 2) regions, and an ex vivo bovine liver ablation experiment. In phantom experiments, good boundary delineation was observed. Shear wave velocity estimates were within 7% of mechanical measurements in phantom 1 and within 17% in phantom 2. Good boundary delineation was also demonstrated in the ex vivo experiment. The shear wave velocity estimates inside the ablated region were higher than mechanical testing estimates, but estimates in the untreated tissue were within 20% of mechanical measurements. A comparison of electrode vibration elastography and electrode displacement elastography showed the complementary information that they can provide. Electrode vibration elastography shows promise as an imaging modality that provides ablation boundary delineation and quantitative information during ablation procedures. PMID:21075719
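The time-to-peak reconstruction named above reduces to: at each lateral position, record the arrival time of the peak displacement; the slope of position versus arrival time is the local shear wave speed. A simplified uniform-medium sketch (noise handling and regularization omitted):

```python
import numpy as np

def shear_wave_speed(displacement, x_mm, t_ms):
    """Estimate shear wave speed from tracked axial displacements.

    displacement: (n_positions, n_frames) array at lateral positions x_mm,
    sampled at frame times t_ms. Returns speed in m/s (= mm/ms).
    """
    t_peak = t_ms[np.argmax(displacement, axis=1)]  # time-to-peak per position
    speed, _ = np.polyfit(t_peak, x_mm, 1)          # slope of x vs arrival time
    return speed

# Synthetic check: a Gaussian pulse travelling at 2 m/s away from the electrode
x = np.linspace(1.0, 10.0, 10)                      # mm
t = np.linspace(0.0, 10.0, 200)                     # ms
disp = np.exp(-((t[None, :] - x[:, None] / 2.0) ** 2))
print(shear_wave_speed(disp, x, t))                 # ~2.0
```

The shear modulus then follows as mu = rho * c_s^2, with tissue density rho of roughly 1000 kg/m3.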
NASA Astrophysics Data System (ADS)
Marx, K. D.; Edwards, C. F.
1992-12-01
The effect of the single-particle constraint on the response of phase-Doppler instruments is determined for particle flows which are spatially nonuniform and time-dependent. Poisson statistics are applied to particle positions and arrival times within the phase-Doppler probe volume to determine the probability that a particle is measured successfully. It is shown that the single-particle constraint can be viewed as applying spatial and temporal filters to the particle flow. These filters have the same meaning as those that were defined previously for uniform, steady-state sprays, but in space- and time-dependent form. Criteria are developed for determining when a fully inhomogeneous analysis of a flow is required and when a quasi-steady analysis will suffice. A new bias due to particle arrival time displacement is identified and the conditions under which it must be considered are established. The present work provides the means to rigorously investigate the response of phase-Doppler measurement systems to transient sprays such as those which occur in diesel engines. To this end, the results are applied to a numerical simulation of a diesel spray. The calculated hypothetical response of the ideal instrument provides a quantitative demonstration of the regimes within which measurements can accurately be made in such sprays.
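The Poisson bookkeeping at the heart of this analysis is compact enough to state: if N̄(x, t) is the expected number of particles in the probe volume, the chance of a validated single-particle realization is the k = 1 term of the Poisson law. Schematically (the paper develops the space- and time-dependent filtered forms):

```latex
P(k) \;=\; \frac{\bar{N}^{\,k}\, e^{-\bar{N}}}{k!},
\qquad
P(\text{exactly one}) \;=\; \bar{N}\, e^{-\bar{N}}
```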
Fitts Cochrane, Jean; Lonsdorf, Eric; Allison, Taber D; Sanders-Reed, Carol A
2015-09-01
Challenges arise when renewable energy development triggers "no net loss" policies for protected species, such as where wind energy facilities affect Golden Eagles in the western United States. When established mitigation approaches are insufficient to fully avoid or offset losses, conservation goals may still be achievable through experimental implementation of unproven mitigation methods provided they are analyzed within a framework that deals transparently and rigorously with uncertainty. We developed an approach to quantify and analyze compensatory mitigation that (1) relies on expert opinion elicited in a thoughtful and structured process to design the analysis (models) and supplement available data, (2) builds computational models as hypotheses about cause-effect relationships, (3) represents scientific uncertainty in stochastic model simulations, (4) provides probabilistic predictions of "relative" mortality with and without mitigation, (5) presents results in clear formats useful to applying risk management preferences (regulatory standards) and selecting strategies and levels of mitigation for immediate action, and (6) defines predictive parameters in units that could be monitored effectively, to support experimental adaptive management and reduction in uncertainty. We illustrate the approach with a case study characterized by high uncertainty about underlying biological processes and high conservation interest: estimating the quantitative effects of voluntary strategies to abate lead poisoning in Golden Eagles in Wyoming due to ingestion of spent game hunting ammunition.
Statistical label fusion with hierarchical performance models
Asman, Andrew J.; Dagley, Alexander S.; Landman, Bennett A.
2014-01-01
Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally – fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. This new approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, we describe several contributions. First, we derive a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) performance models within the statistical fusion context. Second, we demonstrate that the proposed hierarchical formulation is highly amenable to the state-of-the-art advancements that have been made to the statistical fusion framework. Lastly, in an empirical whole-brain segmentation task we demonstrate substantial qualitative and significant quantitative improvement in overall segmentation accuracy. PMID:24817809
Magnetic small-angle neutron scattering of bulk ferromagnets.
Michels, Andreas
2014-09-24
We summarize recent theoretical and experimental work in the field of magnetic small-angle neutron scattering (SANS) of bulk ferromagnets. The response of the magnetization to spatially inhomogeneous magnetic anisotropy and magnetostatic stray fields is computed using linearized micromagnetic theory, and the ensuing spin-misalignment SANS is deduced. Analysis of experimental magnetic-field-dependent SANS data of various nanocrystalline ferromagnets corroborates the usefulness of the approach, which provides important quantitative information on the magnetic-interaction parameters such as the exchange-stiffness constant, the mean magnetic anisotropy field, and the mean magnetostatic field due to jumps ΔM of the magnetization at internal interfaces. Besides the value of the applied magnetic field, it turns out to be the ratio of the magnetic anisotropy field Hp to ΔM, which determines the properties of the magnetic SANS cross-section of bulk ferromagnets; specifically, the angular anisotropy on a two-dimensional detector, the asymptotic power-law exponent, and the characteristic decay length of spin-misalignment fluctuations. For the two most often employed scattering geometries where the externally applied magnetic field H0 is either perpendicular or parallel to the wave vector k0 of the incoming neutron beam, we provide a compilation of the various unpolarized, half-polarized (SANSPOL), and uniaxial fully-polarized (POLARIS) SANS cross-sections of magnetic materials.
Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng
2010-10-01
Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables, according to its nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology makes the most direct use possible of the original site information. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.
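For orientation, the crisp core of PROMETHEE II (before the 2-tuple linguistic extension developed in the paper) ranks alternatives by net outranking flow. A minimal sketch with a linear preference function; the threshold, weights, and site scores are invented:

```python
import numpy as np

def promethee_ii(scores, weights, p=1.0):
    """Net outranking flows for an alternatives-by-criteria score matrix.

    Linear preference function P(d) = clip(d / p, 0, 1); higher scores are
    assumed better on every criterion.
    """
    n = scores.shape[0]
    pi = np.zeros((n, n))                       # aggregated preference indices
    for a in range(n):
        for b in range(n):
            d = scores[a] - scores[b]
            pi[a, b] = np.sum(weights * np.clip(d / p, 0.0, 1.0))
    phi_plus = pi.sum(axis=1) / (n - 1)         # positive (leaving) flow
    phi_minus = pi.sum(axis=0) / (n - 1)        # negative (entering) flow
    return phi_plus - phi_minus                 # net flow; rank descending

# Three contaminated sites scored on three risk criteria (illustrative only)
scores = np.array([[0.8, 0.3, 0.5],
                   [0.4, 0.9, 0.6],
                   [0.6, 0.5, 0.9]])
weights = np.array([0.5, 0.3, 0.2])
print(promethee_ii(scores, weights))            # most positive = highest priority
```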
Finite-element modeling and micromagnetic modeling of perpendicular writers
NASA Astrophysics Data System (ADS)
Heinonen, Olle; Bozeman, Steven P.
2006-04-01
We compare finite-element modeling (FEM) and fully micromagnetic modeling results of four prototypical writers for perpendicular recording. In general, the agreement between the two models is quite good in the vicinity of saturated or near-saturated magnetic material, such as the pole tip, for quantities such as the magnetic field, the gradient of the magnetic field and the write width. However, in the vicinity of magnetic material far from saturation, e.g., return pole or trailing edge write shield, there can be large qualitative and quantitative differences.
A random-walk/giant-loop model for interphase chromosomes.
Sachs, R K; van den Engh, G; Trask, B; Yokota, H; Hearst, J E
1995-01-01
Fluorescence in situ hybridization data on distances between defined genomic sequences are used to construct a quantitative model for the overall geometric structure of a human chromosome. We suggest that the large-scale geometry during the G0/G1 part of the cell cycle may consist of flexible chromatin loops, averaging approximately 3 million bp, with a random-walk backbone. A fully explicit, three-parametric polymer model of this random-walk/giant-loop structure can account well for the data. More general models consistent with the data are briefly discussed. PMID:7708711
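The quantitative signature of such a model is worth stating: for a random-walk backbone, the mean-square physical distance between two loci grows linearly with their genomic separation, with the ~3 Mb loops modifying the behavior at short range. Schematically (a simplification of the three-parameter model):

```latex
\langle R^2(s) \rangle \;\propto\; s
\qquad \text{for } s \gg s_{\mathrm{loop}} \approx 3\ \mathrm{Mb}
```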
Analog of small Holstein polaron in hydrogen-bonded amide systems
NASA Astrophysics Data System (ADS)
Alexander, D. M.
1985-01-01
A class of amide-I (C=O stretch) related excitations and their contribution to the spectral function for infrared absorption is determined by use of the Davydov Hamiltonian. The treatment is a fully quantum, finite-temperature one. A consistent picture and a quantitative fit to the absorption data for crystalline acetanilide confirm that the model adequately explains the anomalous behavior cited by Careri et al. The localized excitation responsible for this behavior is the vibronic analog of the small Holstein polaron. Possible extensions to other modes, and the biological relevance, are examined.
Rapid video-referenced ratings of reciprocal social behavior in toddlers: A twin study
Marrus, Natasha; Glowinski, Anne L.; Jacob, Theodore; Klin, Ami; Jones, Warren; Drain, Caroline E.; Holzhauer, Kieran E.; Hariprasad, Vaishnavi; Fitzgerald, Rob T.; Mortenson, Erika L.; Sant, Sayli M.; Cole, Lyndsey; Siegel, Satchel A.; Zhang, Yi; Agrawal, Arpana; Heath, Andrew; Constantino, John N.
2015-01-01
Background Reciprocal social behavior (RSB) is a developmental prerequisite for social competency, and deficits in RSB constitute a core feature of autism spectrum disorder (ASD). Although clinical screeners categorically ascertain risk of ASD in early childhood, rapid methods for quantitative measurement of RSB in toddlers are not yet established. Such measurements are critical for tracking developmental trajectories and incremental responses to intervention. Methods We developed and validated a 20-minute video-referenced rating scale, the video-referenced rating of reciprocal social behavior (vrRSB), for untrained caregivers to provide standardized ratings of quantitative variation in RSB. Parents of 252 toddler twins [monozygotic (MZ) = 31 pairs, dizygotic (DZ) = 95 pairs], ascertained through birth records, rated their twins’ RSB at two time points, on average 6 months apart, and completed two developmental measures, the Modified Checklist for Autism in Toddlers (M-CHAT) and the MacArthur Communicative Development Inventory Short Form (MCDI-s). Results Scores on the vrRSB were fully continuously distributed, with excellent 6-month test-retest reliability (intraclass correlation coefficient [ICC] = 0.704, p<0.001). MZ twins displayed markedly greater trait concordance than DZ twins (MZ ICC = 0.863, p<0.001; DZ ICC = 0.231, p<0.012). vrRSB score distributions were highly distinct for children passing versus failing the M-CHAT (t = −8.588, df = 31, p<0.001), improved incrementally from 18 to 24 months, and were inversely correlated with receptive and expressive vocabulary on the MCDI-s. Conclusions Like quantitative autistic trait ratings in school-aged children and adults, toddler scores on the vrRSB are continuously distributed and appear highly heritable. These ratings exhibited minimal measurement error, high inter-individual stability, and developmental progression in RSB as children matured from 18 to 24 months, supporting their potential utility for serially quantifying the severity of early autistic syndromes over time and in response to intervention. In addition, these findings inform the genetic-environmental structure of RSB in early typical development. PMID:25677414
Four Forms of the Fourier Transform - for Freshmen, using Matlab
NASA Astrophysics Data System (ADS)
Simons, F. J.; Maloof, A. C.
2016-12-01
In 2015, a Fall "Freshman Seminar" at Princeton University (http://geoweb.princeton.edu/people/simons/FRS-SESC.html) taught students to combine field observations of the natural world with quantitative modeling and interpretation, to answer questions like: "How have Earth and human histories been recorded in the geology of Princeton, the Catskills, France and Spain?" (where we took the students on a data-gathering field trip during Fall Break), and "What experiments and analysis can a first-year (possibly non-future-major) student do to query such archives of the past?" In the classroom, through problem sets, and around campus, students gained practical experience collecting geological and geophysical data in a geographic context, and analyzing these data using statistical techniques such as regression, time-series and image analysis, with the programming language Matlab. In this presentation I will detail how we instilled basic Matlab skills for quantitative geoscience data analysis through a 6-week progression of topics and exercises. In the 6 weeks after the Fall Break trip, we strengthened these competencies to make our students fully proficient for further learning, as evidenced by their end-of-term independent research work. The particular case study is focused on introducing power-spectral analysis to Freshmen, in a way that even the least quantitative among them could functionally understand. Not counting (0) "inspection", the four ways by which we have successfully instilled the concept of power-spectral analysis in a hands-on fashion are (1) "correlation", (2) "inversion", (3) "stacking", and formal (4) "Fourier transformation". These four provide the main "mappings". Along the way, of course, we also make sure that the students understand that "power-spectral density estimation" is not the same as "Fourier transformation", and that not every Fourier transform has to be "Fast". Hence, concepts from analysis-of-variance techniques, regression, and hypothesis testing arise in this context and will be discussed.
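As a concrete illustration of the distinction the seminar draws, the periodogram obtained by direct Fourier transformation coincides with the Fourier transform of the circular autocorrelation (the Wiener-Khinchin relation). A minimal numpy sketch with a made-up test signal follows; the course itself used Matlab:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(1024)
x = np.sin(2 * np.pi * t / 50) + 0.5 * rng.standard_normal(t.size)  # toy signal

# Mapping (4): periodogram straight from the Fourier transform
X = np.fft.rfft(x)
psd_fft = np.abs(X) ** 2 / x.size

# Mapping (1): the same estimate via the circular autocorrelation
acf = np.fft.irfft(np.abs(np.fft.rfft(x)) ** 2) / x.size  # Wiener-Khinchin
psd_acf = np.fft.rfft(acf)

print(np.allclose(psd_fft, psd_acf.real))  # True: the two routes agree
```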
NASA Astrophysics Data System (ADS)
Obersteiner, F.; Bönisch, H.; Engel, A.
2016-01-01
We present the characterization and application of a new gas chromatography time-of-flight mass spectrometry instrument (GC-TOFMS) for the quantitative analysis of halocarbons in air samples. The setup comprises three fundamental enhancements compared to our earlier work (Hoker et al., 2015): (1) full automation, (2) a mass resolving power R = m/Δm of the TOFMS (Tofwerk AG, Switzerland) increased up to 4000 and (3) a fully accessible data format of the mass spectrometric data. Automation in combination with the accessible data allowed an in-depth characterization of the instrument. Mass accuracy was found to be approximately 5 ppm on average after automatic recalibration of the mass axis in each measurement. A TOFMS configuration giving R = 3500 was chosen to provide an R-to-sensitivity ratio suitable for our purpose. Calculated detection limits are as low as a few femtograms by means of the accurate mass information. The precision for substance quantification was 0.15 % at best for an individual measurement and in general mainly determined by the signal-to-noise ratio of the chromatographic peak. Detector non-linearity was found to be insignificant up to a mixing ratio of roughly 150 ppt at 0.5 L sampled volume. At higher concentrations, non-linearities of a few percent were observed (precision level: 0.2 %) but could be attributed to a potential source within the detection system. A straightforward correction for those non-linearities was applied in data processing, again by exploiting the accurate mass information. Based on the overall characterization results, the GC-TOFMS instrument was found to be very well suited to the task of quantitative halocarbon trace gas observation and a major step forward compared to scanning quadrupole MS with low mass resolving power and a TOFMS technique reported to be non-linear and restricted by a small dynamic range.
Resource for the Development of Biomedical Accelerator Mass Spectrometry (AMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turteltaub, K. W.; Bench, G.; Buchholz, B. A.
2016-04-08
The NIH Research Resource for Biomedical AMS was originally funded at Lawrence Livermore National Laboratory in 1999 to develop and apply the technology of accelerator mass spectrometry (AMS) in broad-based biomedical research. The Resource’s niche is to fill needs for ultra-high-sensitivity quantitation when isotope-labeled agents are used. The Research Resource’s Technology Research and Development (TR&D) efforts will focus on the needs of the biomedical research community in the context of seven Driving Biomedical Projects (DBPs) that will drive the Center’s technical capabilities through three core TR&Ds. We will expand our present capabilities by developing a fully integrated HPLC-AMS to increase our capabilities for metabolic measurements, we will develop methods to understand cellular processes, and we will develop and validate methods for the application of AMS in human studies, which is a growing area of demand by collaborators and service users. In addition, we will continue to support new and ongoing collaborative and service projects that require the capabilities of the Resource. The Center will continue to train researchers in the use of the AMS capabilities being developed, and the results of all efforts will be widely disseminated to advance progress in biomedical research. Towards these goals, our specific aims are to: 1) increase the value and information content of AMS measurements by combining molecular speciation with quantitation of defined macromolecular isolates, specifically by developing and validating methods for macromolecule labeling, characterization, and quantitation; 2) develop and validate methods and strategies to enable AMS to become more broadly used in human studies, specifically by demonstrating robust methods for conducting pharmacokinetic/pharmacodynamic studies in humans and model systems; 3) increase the accessibility of AMS to the biomedical research community and the throughput of AMS through direct coupling to separatory instruments; and 4) provide high-throughput 14C BioAMS analysis for collaborative and service clients.
Nyangoga, Hervé; Mercier, Philippe; Libouban, Hélène; Baslé, Michel Félix; Chappard, Daniel
2011-01-01
Background Angiogenesis contributes to proliferation and metastatic dissemination of cancer cells. The anatomy of blood vessels in tumors has been characterized with 2D techniques (histology or angiography), which are not fully representative of the trajectories of vessels throughout the tissues and are not suited to analyzing changes occurring inside the bone marrow cavities. Methodology/Principal Findings We characterized the vasculature of bone metastases in 3D at different times of evolution of the disease. Metastases were induced in the femur of Wistar rats by a local injection of Walker 256/B cells. Microfil® (a silicone-based polymer) was injected into the aorta at euthanasia, 12, 19 and 26 days after injection of tumor cells. Undecalcified bones (containing the radio-opaque vascular casts) were analyzed by microCT, and a first 3D model was reconstructed. Bones were then decalcified and reanalyzed by microCT; a second model (comprising only the vessels) was obtained and superimposed on the former, thus providing a clear visualization of vessel trajectories in the invaded metaphysis and allowing quantitative evaluation of the vascular volume and vessel diameter. Histological analysis of the marrow was possible on the decalcified specimens. Walker 256/B cells induced a marked osteolysis with cortical perforations. The metaphysis of invaded bones became progressively hypervascular. New vessels replaced the major central medullar artery coming from the diaphyseal shaft. They sprouted from the periosteum and extended into the metastatic area. The newly formed vessels were irregular in diameter, and tortuous with a disorganized architecture. A quantitative analysis of vascular volume indicated that neoangiogenesis increased with the development of the tumor, with the appearance of vessels of larger diameter. Conclusion This new method demonstrated the tumor angiogenesis in 3D at different development times of the metastasis growth. Bone and the vascular bed can be identified by a double reconstruction, allowing a quantitative evaluation of angiogenesis over time. PMID:21464932
A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.
Rutledge, Robert G
2011-03-02
Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
Pavlovic, Chris; Futamatsu, Hideki; Angiolillo, Dominick J; Guzman, Luis A; Wilke, Norbert; Siragusa, Daniel; Wludyka, Peter; Percy, Robert; Northrup, Martin; Bass, Theodore A; Costa, Marco A
2007-04-01
The purpose of this study is to evaluate the accuracy of semiautomated analysis of contrast-enhanced magnetic resonance angiography (MRA) in patients who have undergone standard angiographic evaluation for peripheral vascular disease (PVD). Magnetic resonance angiography is an important tool for evaluating PVD. Although this technique is both safe and noninvasive, the accuracy and reproducibility of quantitative measurements of disease severity using MRA in the clinical setting have not been fully investigated. Forty-three lesions in 13 patients who underwent both MRA and digital subtraction angiography (DSA) of iliac and common femoral arteries within 6 months were analyzed using quantitative magnetic resonance angiography (QMRA) and quantitative vascular analysis (QVA). Analysis was repeated by a second operator, and by the same operator approximately 1 month later. QMRA underestimated percent diameter stenosis (%DS) compared to measurements made with QVA by 2.47%. Limits of agreement between the two methods were +/- 9.14%. Interobserver variability in measurements of %DS was +/- 12.58% for QMRA and +/- 10.04% for QVA. Intraobserver variability of %DS was +/- 4.6% for QMRA and +/- 8.46% for QVA. QMRA displays a high level of agreement with QVA when used to determine stenosis severity in iliac and common femoral arteries. Similar levels of interobserver and intraobserver variability are present with each method. Overall, QMRA represents a useful method to quantify severity of PVD.
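The bias and limits of agreement quoted above are the outputs of a Bland-Altman style analysis; a minimal sketch follows, with made-up %DS values standing in for paired QMRA/QVA readings:

```python
import numpy as np

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement between two methods."""
    d = np.asarray(a, float) - np.asarray(b, float)
    bias = d.mean()
    half_width = 1.96 * d.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

qmra = np.array([45.0, 60.2, 33.1, 72.4, 51.8])   # %DS by QMRA (illustrative)
qva = np.array([47.2, 61.0, 36.5, 74.1, 53.0])    # %DS by QVA (illustrative)
print(bland_altman(qmra, qva))
```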
Liu, Yan; Song, Yang; Madahar, Vipul; Liao, Jiayu
2012-03-01
Förster resonance energy transfer (FRET) technology has been widely used in biological and biomedical research, and it is a very powerful tool for elucidating protein interactions in either dynamic or steady state. SUMOylation (the process of SUMO [small ubiquitin-like modifier] conjugation to substrates) is an important posttranslational protein modification with critical roles in multiple biological processes. Conjugating SUMO to substrates requires an enzymatic cascade. Sentrin/SUMO-specific proteases (SENPs) act as endopeptidases to process pre-SUMO or as isopeptidases to deconjugate SUMO from its substrate. To fully understand the roles of SENPs in the SUMOylation cycle, it is critical to understand their kinetics. Here, we report the development of a novel quantitative FRET-based protease assay for SENP1 kinetic parameter determination. The assay is based on the quantitative analysis of the FRET signal from the total fluorescent signal at the acceptor emission wavelength, which consists of three components: donor (CyPet-SUMO1) emission, acceptor (YPet) emission, and the FRET signal during the digestion process. Subsequently, we developed novel theoretical and experimental procedures to determine the kinetic parameters kcat, KM, and catalytic efficiency (kcat/KM) of the catalytic domain of SENP1 toward pre-SUMO1. Importantly, the general principles of this quantitative FRET-based protease kinetic determination can be applied to other proteases.
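Kinetic parameters such as kcat and KM are conventionally extracted by fitting initial rates to the Michaelis-Menten equation; a minimal sketch follows, with illustrative substrate concentrations, rates, and enzyme concentration rather than the paper's data:

```python
import numpy as np
from scipy.optimize import curve_fit

E_TOTAL = 5e-9  # active enzyme concentration (M), assumed

def mm_rate(s, vmax, km):
    """Michaelis-Menten initial rate: v = Vmax * [S] / (KM + [S])."""
    return vmax * s / (km + s)

s = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 5.0]) * 1e-6  # [pre-SUMO1] (M), illustrative
v = np.array([0.8, 1.7, 2.6, 3.6, 4.4, 5.1]) * 1e-9   # initial rates (M/s), illustrative

(vmax, km), _ = curve_fit(mm_rate, s, v, p0=(5e-9, 1e-6))
kcat = vmax / E_TOTAL
print(f"KM = {km:.2e} M, kcat = {kcat:.2f} 1/s, kcat/KM = {kcat / km:.2e} 1/(M s)")
```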
Progress in Fully Automated Abdominal CT Interpretation
Summers, Ronald M.
2016-01-01
OBJECTIVE Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessments of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are some examples where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION This article reviews the progress and provides insights into what is in store in the near future for automated analysis of abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207
Moazzami-Gudarzi, Mohsen; Adam, Pavel; Smith, Alexander M; Trefalt, Gregor; Szilágyi, István; Maroni, Plinio; Borkovec, Michal
2018-04-04
Direct force measurements involving amidine latex (AL) and sulfate latex (SL) particles in aqueous solutions containing multivalent ferrocyanide anions are presented. These measurements feature three different pairs of particles, namely SL-SL, AL-SL, and AL-AL. The force profiles are quantitatively interpreted in terms of the theory by Derjaguin, Landau, Verwey, and Overbeek (DLVO) that is combined with a short-ranged exponential attraction. In monovalent salt solutions, the AL particles are positively charged, while the SL particles are negatively charged. In solutions containing ferrocyanide, the charge of the AL particles is reversed as the concentration is increased. The longer-ranged component of all force profiles is fully compatible with DLVO theory, provided effects of charge regulation are included. At shorter distances, an additional exponential attraction must be introduced, whereby the respective decay length is about 2 nm for the AL-AL pair, and below 1 nm for the SL-SL pair. This non-DLVO force is intermediate for the asymmetric AL-SL pair. These additional forces are probably related to charge fluctuations, patch-charge interactions, or hydrophobic forces.
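A sketch of the kind of force law used to interpret such profiles is below: a constant-potential double-layer term in the Derjaguin/superposition approximation, a van der Waals term, and the additional short-ranged exponential attraction. All parameter values are illustrative assumptions, and the charge-regulation treatment applied in the paper is omitted:

```python
import numpy as np

EPS_WATER = 80 * 8.854e-12   # dielectric permittivity of water (F/m)

def force_over_radius(h, psi1, psi2, debye_len, hamaker, amp_extra, decay_extra):
    """F/R_eff (N/m) versus surface separation h (m)."""
    kappa = 1.0 / debye_len
    f_dl = 2 * np.pi * EPS_WATER * kappa * psi1 * psi2 * np.exp(-kappa * h)
    f_vdw = -hamaker / (6 * h ** 2)
    f_extra = -amp_extra * np.exp(-h / decay_extra)   # short-ranged non-DLVO attraction
    return f_dl + f_vdw + f_extra

h = np.linspace(0.5e-9, 30e-9, 200)
f = force_over_radius(h, psi1=-40e-3, psi2=-40e-3, debye_len=10e-9,
                      hamaker=3.5e-21, amp_extra=1e-3, decay_extra=2e-9)
# decay_extra ~ 2 nm mirrors the AL-AL value reported above; all else is assumed
```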
Numerical simulation of heat and mass transport during space crystal growth with MEPHISTO
NASA Technical Reports Server (NTRS)
Yao, Minwu; Raman, Raghu; Degroh, Henry C., III
1995-01-01
The MEPHISTO space experiments are collaborative United States and French investigations aimed at understanding the fundamentals of crystal growth. Microgravity experiments were conducted aboard the USMP-1 and -2 missions on STS-52 and 62, in October 1992 and March 1994, respectively. MEPHISTO is a French designed and built Bridgman-type furnace which uses the Seebeck technique to monitor the solid/liquid interface temperature and Peltier pulsing to mark the location and shape of the solid/liquid interface. In this paper the Bridgman growth of Sn-Bi and Bi-Sn under terrestrial and microgravity conditions is modeled using the finite element code FIDAP. The numerical model considers fully coupled heat and mass transport, fluid motion and solid/liquid phase changes in the crystal growth process. The primary goals of this work are: to provide a quantitative study of the thermal buoyancy-induced convection in the melt for the two flight experiments; to compare the vertical and horizontal growth configurations; and to systematically evaluate the effects of various gravity levels on solute segregation. Numerical results for the vertical and horizontal Bridgman growth configurations are presented.
Theory of the milieu dependent isomerisation dynamics of reducing sugars applied to d-erythrose.
Kaufmann, Martin; Mügge, Clemens; Kroh, Lothar W
2015-12-11
Quantitative ¹H selective saturation transfer NMR spectroscopy (¹H SST qNMR) was used to fully describe the milieu-dependent dynamics of the isomeric system of d-erythrose. Thermodynamic activation parameters are calculated for acidic as well as for basic catalysis by combining McConnell's modified Bloch equations for chemical exchange, solved for the constraint of saturating the non-hydrated acyclic isomer, the Eyring equation, and Hudson's equation for pH-dependent catalysis. A detailed mathematical examination describing the milieu-dependent dynamics of sugar isomerisation is provided. Thermodynamic data show evidence that photo-catalysed sugar isomerisation as well as degradation have to be considered. Approximations describing the pH and temperature dependence of the thermodynamic activation parameters are derived that indicate the possibility of photo-affecting equilibrium constants. Moreover, the results show that isomerisation dynamics are closely related to degradation kinetics and that sugars' reactivities are altered by the concentration of the acyclic carbonyl isomer and the sum of its ring-closing rate constants. Additionally, it is concluded that sugar solutions show a limited self-stabilising behaviour.
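For reference, the Eyring analysis reduces to a linear fit of ln(k/T) against 1/T; a minimal sketch with made-up rate constants (not the paper's data):

```python
import numpy as np

R, KB, H_PLANCK = 8.314, 1.381e-23, 6.626e-34

T = np.array([288.0, 298.0, 308.0, 318.0])   # temperatures (K)
k = np.array([0.011, 0.031, 0.082, 0.205])   # exchange rate constants (1/s), illustrative

# Eyring: ln(k/T) = ln(kB/h) + dS/R - dH/(R*T)
slope, intercept = np.polyfit(1.0 / T, np.log(k / T), 1)
dH = -slope * R                                  # activation enthalpy (J/mol)
dS = (intercept - np.log(KB / H_PLANCK)) * R     # activation entropy (J/(mol K))
print(f"dH = {dH / 1000:.0f} kJ/mol, dS = {dS:.0f} J/(mol K)")
```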
NASA Technical Reports Server (NTRS)
Bergsten, D. E.; Fleeter, S.
1983-01-01
To be of quantitative value to the designer and analyst, it is necessary to experimentally verify the flow modeling and the numerics inherent in calculation codes being developed to predict the three dimensional flow through turbomachine blade rows. This experimental verification requires that predicted flow fields be correlated with three dimensional data obtained in experiments which model the fundamental phenomena existing in the flow passages of modern turbomachines. The Purdue Annular Cascade Facility was designed specifically to provide these required three dimensional data. The overall three dimensional aerodynamic performance of an instrumented classical airfoil cascade was determined over a range of incidence angle values. This was accomplished utilizing a fully automated exit flow data acquisition and analysis system. The mean wake data, acquired at two downstream axial locations, were analyzed to determine the effect of incidence angle, the three dimensionality of the cascade exit flow field, and the similarity of the wake profiles. The hub, mean, and tip chordwise airfoil surface static pressure distributions determined at each incidence angle are correlated with predictions from the MERIDL and TSONIC computer codes.
NASA Astrophysics Data System (ADS)
Li, X.; Guo, F.; Li, G.; Li, H.
2016-12-01
Theories of particle transport and acceleration have shown that fluid compression is the leading mechanism for particle acceleration and plasma energization. However, the role of compression in particle acceleration during magnetic reconnection is unclear. We use two approaches to study this issue. First, using fully kinetic simulations, we quantitatively calculate the effect of compression on energy conversion and particle energization during magnetic reconnection for a range of plasma beta and guide field. We show that compression makes an important contribution to the energy conversion between the bulk kinetic energy and the internal energy when the guide field is smaller than the reconnecting component. Based on this result, we then study large-scale reconnection acceleration by solving Parker's transport equation in a background reconnecting flow provided by MHD simulations. Due to the compression effect, the simulations suggest fast particle acceleration to high energies in the reconnection layer. This study clarifies the nature of particle acceleration in the reconnection layer and may be important for understanding particle acceleration and plasma energization during solar flares.
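A minimal illustration of the compression effect encoded in Parker's transport equation: for a near-isotropic particle distribution, momentum obeys d(ln p)/dt = -(div u)/3, so particles gain energy steadily in a converging flow. The flow divergence and time span below are toy values, not simulation outputs:

```python
import numpy as np

div_u = -0.5                     # flow convergence in the layer (1/s), assumed
t = np.linspace(0.0, 10.0, 101)  # time (s)
p = np.exp(-div_u / 3.0 * t)     # momentum relative to its initial value
print(f"momentum gain after {t[-1]:.0f} s: x{p[-1]:.1f}")
```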
Fast multiclonal clusterization of V(D)J recombinations from high-throughput sequencing.
Giraud, Mathieu; Salson, Mikaël; Duez, Marc; Villenet, Céline; Quief, Sabine; Caillault, Aurélie; Grardel, Nathalie; Roumier, Christophe; Preudhomme, Claude; Figeac, Martin
2014-05-28
V(D)J recombinations in lymphocytes are essential for immunological diversity. They are also useful markers of pathologies. In leukemia, they are used to quantify the minimal residual disease during patient follow-up. However, the full breadth of lymphocyte diversity is not yet understood. We propose new algorithms that process high-throughput sequencing (HTS) data to extract unnamed V(D)J junctions and gather them into clones for quantification. This analysis is based on a seed heuristic and is fast and scalable because, in the first phase, no alignment is performed against germline database sequences. The algorithms were applied to TR γ HTS data from a patient with acute lymphoblastic leukemia, and also to data simulating hypermutations. Our methods identified the main clone, as well as additional clones that were not identified with standard protocols. The proposed algorithms provide new insight into the analysis of high-throughput sequencing data for leukemia, and into the quantitative assessment of any immunological profile. The methods described here are implemented in a C++ open-source program called Vidjil.
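To convey the flavor of an alignment-free first phase, the toy sketch below pools reads that share an exact central w-mer into one clone before any germline comparison. This is a deliberate simplification for illustration, not Vidjil's actual seed heuristic (which is implemented in C++):

```python
from collections import defaultdict

W = 12  # seed window length, assumed

def seed(read, w=W):
    """Central w-mer as a crude junction signature."""
    mid = len(read) // 2
    return read[mid - w // 2: mid + (w + 1) // 2]

def clusterize(reads):
    clones = defaultdict(list)
    for r in reads:
        clones[seed(r)].append(r)          # no germline alignment in this phase
    # clone abundance = read count per seed, usable for follow-up quantification
    return sorted(((len(v), s) for s, v in clones.items()), reverse=True)

reads = ["ACGTTGCACGTGGATTACGGTAGC",
         "ACGTTGCACGTGGATTACGGTAGC",
         "TTGCAAGGCACGAGCTTACGATCC"]
print(clusterize(reads))                   # e.g. [(2, 'CACGTGGATTAC'), (1, ...)]
```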
NASA Astrophysics Data System (ADS)
Vedula, Ravi Pramod; Mehrotra, Saumitra; Kubis, Tillmann; Povolotskyi, Michael; Klimeck, Gerhard; Strachan, Alejandro
2015-05-01
We use first principles simulations to engineer Ge nanofins for maximum hole mobility by controlling strain tri-axially through nano-patterning. Large-scale molecular dynamics predict fully relaxed, atomic structures for experimentally achievable nanofins, and orthogonal tight binding is used to obtain the corresponding electronic structure. Hole transport properties are then obtained via a linearized Boltzmann formalism. This approach explicitly accounts for free surfaces and associated strain relaxation as well as strain gradients which are critical for quantitative predictions in nanoscale structures. We show that the transverse strain relaxation resulting from the reduction in the aspect ratio of the fins leads to a significant enhancement in phonon-limited hole mobility (7× over unstrained bulk Ge, and 3.5× over biaxially strained Ge). Maximum enhancement is achieved by reducing the width to be approximately 1.5 times the height, and further reduction in width does not result in additional gains. These results indicate significant room for improvement over current-generation Ge nanofins, provide geometrical guidelines to design optimized geometries, and give insight into the physics behind the significant mobility enhancement.
NASA Astrophysics Data System (ADS)
Zhang, Huifang; Yang, Minghong; Xu, Xueke; Wu, Lunzhe; Yang, Weiguang; Shao, Jianda
2017-10-01
The surface figure control of a conventional annular polishing system is ordinarily realized through the interaction between the conditioner and the lap. The surface profile of the pitch lap corrected by the marble conditioner has been measured and analyzed as a function of kinematics, loading conditions, and polishing time. Surface profile measuring equipment for the large lap, based on laser alignment, was developed with an accuracy of about 1 μm. The conditioning mechanism of the conditioner is simply determined by the kinematics and the fully fitting principle, but unexpected surface profile deviations of the lap emerged frequently due to numerous influencing factors, including the geometrical relationship and the pressure distribution at the conditioner/lap interface. Both factors are quantitatively evaluated and described, and have been combined to develop a spatial and temporal model to simulate the surface profile evolution of the pitch lap. The simulations are consistent with the experiments. This study is an important step toward deterministic full-aperture annular polishing, providing beneficial guidance for the surface profile correction of the pitch lap.
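The abstract does not state the removal law inside the spatial-temporal model; a common assumption in polishing simulations is Preston's law, dh/dt = Kp * p * v. The toy sketch below applies it with an assumed Preston coefficient, pressure distribution, and lap speed:

```python
import numpy as np

KP = 1e-13                              # Preston coefficient (m^2/N), assumed
r = np.linspace(0.2, 1.0, 200)          # radial positions on the annular lap (m)
p = 5.0e3 * (1.0 + 0.1 * (r - 0.6))     # contact pressure (Pa), assumed distribution
v = 2.0 * np.pi * r                     # sliding speed at 1 rev/s, i.e. 60 rpm (m/s)
t = 3600.0                              # one hour of conditioning (s)

h = KP * p * v * t                      # accumulated removal depth profile (m)
print(f"peak-to-valley wear: {(h.max() - h.min()) * 1e6:.1f} um")
```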
Quantitative assessment of human motion using video motion analysis
NASA Technical Reports Server (NTRS)
Probe, John D.
1993-01-01
In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.
Moore, Alex M.; vanMarle, Kristy; Geary, David C.
2016-01-01
Fluency in first graders’ processing of the magnitudes associated with Arabic numerals, collections of objects, and mixtures of objects and numerals predicts current and future mathematics achievement. The quantitative competencies that support the development of fluent processing of magnitude are not fully understood, however. At the beginning and end of preschool (M = 3 years, 9 months at first assessment; range 3 years, 3 months to 4 years, 3 months), 112 children (51 boys) completed tasks measuring numeral recognition and comparison, acuity of the approximate number system, and knowledge of counting principles, cardinality, and implicit arithmetic, and completed a magnitude processing task (number sets test) in kindergarten. Use of Bayesian and linear regression techniques revealed that two measures of preschoolers’ cardinal knowledge and their competence at implicit arithmetic predicted later fluency of magnitude processing, controlling for domain-general factors, preliteracy skills, and parental education. The results help to narrow the search for the early foundation of children’s emerging competence with symbolic mathematics and provide direction for early interventions. PMID:27236038
NASA Astrophysics Data System (ADS)
Guha, Anirban
2017-11-01
Theoretical studies on linear shear instabilities as well as different kinds of wave interactions often use simple velocity and/or density profiles (e.g. constant, piecewise) to obtain good qualitative and quantitative predictions of the initial disturbances. Moreover, such simple profiles provide a minimal model with which to obtain a mechanistic understanding of shear instabilities. Here we have extended this minimal paradigm into the nonlinear domain using the vortex method. Making use of the unsteady Bernoulli equation in the presence of linear shear, and extending the Birkhoff-Rott equation to multiple interfaces, we have numerically simulated the interaction between multiple fully nonlinear waves. This methodology is quite general, and has allowed us to simulate diverse problems that can be reduced to the minimal system of interacting waves, e.g. spilling and plunging breakers, stratified shear instabilities (Holmboe, Taylor-Caulfield, stratified Rayleigh), jet flows, and even wave-topography interaction problems like Bragg resonance. We found that the minimal models capture key nonlinear features (e.g. wave-breaking features like cusp formation and roll-ups) which are observed in experiments and/or extensive simulations with smooth, realistic profiles.
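The core numerical ingredient of such a vortex method is the Birkhoff-Rott velocity of a discretized interface, usually desingularized with a Krasny-style vortex blob. A minimal single-interface sketch follows; the paper's multi-interface, linear-shear extension adds further terms, and all numerical values here are illustrative:

```python
import numpy as np

def blob_velocity(z, gamma, delta=0.1):
    """Desingularized Birkhoff-Rott sum (vortex blob):
    conj(dz/dt) = (1/(2*pi*i)) * sum_j gamma_j * conj(z - z_j) / (|z - z_j|^2 + delta^2)
    z: complex node positions; gamma: circulation carried by each node."""
    dz = z[:, None] - z[None, :]
    denom = np.abs(dz) ** 2 + delta ** 2          # self-term vanishes (numerator is 0)
    w_conj = (gamma[None, :] * np.conj(dz) / denom).sum(axis=1) / (2j * np.pi)
    return np.conj(w_conj)

# One explicit Euler step of roll-up for a slightly perturbed periodic vortex sheet.
n = 128
x = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
z = x + 0.01j * np.sin(x)                         # interface nodes
gamma = np.full(n, 2 * np.pi / n)                 # circulation per node
dt = 0.01
z = z + dt * blob_velocity(z, gamma)
```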
Finlayson-Pitts, Barbara J
2009-09-28
While particles have significant deleterious impacts on human health, visibility and climate, quantitative understanding of their formation, composition and fates remains problematic. Indeed, in many cases, even qualitative understanding is lacking. One area of particular uncertainty is the nature of particle surfaces and how this determines interactions with gases in the atmosphere, including water, which is important for cloud formation and properties. The focus in this Perspective article is on some chemistry relevant to airborne particles and especially to reactions occurring on their surfaces. The intent is not to provide a comprehensive review, but rather to highlight a few selected examples of interface chemistry involving inorganic and organic species that may be important in the lower atmosphere. This includes sea salt chemistry, nitrate and nitrite ion photochemistry, organics on surfaces and heterogeneous reactions of oxides of nitrogen on proxies for airborne mineral dust and boundary layer surfaces. Emphasis is on the molecular level understanding that can only be gained by fully integrating experiment and theory to elucidate these complex systems.
DNA Detection by Flow Cytometry using PNA-Modified Metal-Organic Framework Particles.
Mejia-Ariza, Raquel; Rosselli, Jessica; Breukers, Christian; Manicardi, Alex; Terstappen, Leon W M M; Corradini, Roberto; Huskens, Jurriaan
2017-03-23
A DNA-sensing platform is developed by exploiting the easy surface functionalization of metal-organic framework (MOF) particles and their highly parallelized fluorescence detection by flow cytometry. Two strategies were employed to functionalize the surface of MIL-88A, using either covalent or non-covalent interactions, resulting in alkyne-modified and biotin-modified MIL-88A, respectively. Covalent surface coupling of an azide-dye and the alkyne-MIL-88A was achieved by means of a click reaction. Non-covalent streptavidin-biotin interactions were employed to link biotin-PNA to biotin-MIL-88A particles mediated by streptavidin. Characterization by confocal imaging and flow cytometry demonstrated that DNA can be bound selectively to the MOF surface. Flow cytometry provided quantitative data of the interaction with DNA. Making use of the large numbers of particles that can be simultaneously processed by flow cytometry, this MOF platform was able to discriminate between fully complementary, single-base mismatched, and randomized DNA targets.
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks such as those attributable to manufacturing and assembly, yet these sources often dominate component-level risk. While the consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to influence the determination of whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood due to the absence of system-level test data or operational data. This paper will establish a method and approach to identify the pitfalls of, and precautions for, accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful for arriving at a more realistic quantification of risk prior to acceptance by a program.
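To make the concern concrete: predicted component failure rates combine into a series-system failure probability, and an unmodeled integration (manufacturing/assembly) penalty can shift that estimate by an order of magnitude. All rates, the mission time, and the integration factor below are illustrative assumptions, not values from the paper:

```python
import math

rates_per_hr = [2e-6, 5e-7, 1.2e-6, 8e-7]   # predicted component failure rates (1/hr)
mission_hr = 0.15                            # powered-flight duration (hr), assumed

lam_series = sum(rates_per_hr)               # series system: failure rates add
p_fail_pred = 1 - math.exp(-lam_series * mission_hr)

integration_factor = 10.0                    # assumed allowance for integration risk
p_fail_adj = 1 - math.exp(-lam_series * integration_factor * mission_hr)

print(f"predicted: {p_fail_pred:.2e}, with integration allowance: {p_fail_adj:.2e}")
```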
Asghar, Adeel; Lajeunesse, Audrey; Dulla, Kalyan; Combes, Guillaume; Thebault, Philippe; Nigg, Erich A.; Elowe, Sabine
2015-01-01
During mitosis, Bub1 kinase phosphorylates histone H2A-T120 to promote centromere sister chromatid cohesion through recruitment of shugoshin (Sgo) proteins. The regulation and dynamics of H2A-T120 phosphorylation are poorly understood. Using quantitative phosphoproteomics we show that Bub1 is autophosphorylated at numerous sites. We confirm mitosis-specific autophosphorylation of several residues and show that Bub1 activation is primed in interphase but fully achieved only in mitosis. Mutation of a single autophosphorylation site, T589, alters kinetochore turnover of Bub1 and results in uniform H2A-T120 phosphorylation and Sgo recruitment along chromosome arms. Consequently, improper sister chromatid resolution and chromosome segregation errors are observed. Kinetochore tethering of Bub1-T589A refocuses H2A-T120 phosphorylation and Sgo1 to centromeres. Recruitment of the Bub1-Bub3-BubR1 axis to kinetochores has recently been extensively studied. Our data provide novel insight into the regulation and kinetochore residency of Bub1 and indicate that its localization is dynamic and tightly controlled through feedback autophosphorylation. PMID:26399325
NASA Astrophysics Data System (ADS)
Yin, K.; Belonoshko, A. B.; Zhou, H.; Lu, X.
2016-12-01
The melting temperatures of materials in the interior of the Earth have significant implications in many areas of geophysics. Direct calculation of the melting point by atomic simulation faces a substantial hysteresis problem. To overcome this hysteresis, several independently founded melting-point determination methods are available nowadays, such as the free energy method, the two-phase or coexistence method, and the Z method. In this study, we provide a theoretical understanding of the relations among these methods from a geometrical perspective, based on a quantitative construction of the volume-entropy-energy thermodynamic surface, a model first proposed by J. Willard Gibbs in 1873. Combining this model with an experimental datum and/or a previous melting-point determination method, we then derive the high-pressure melting curves for several lower mantle minerals with less computational effort than previous methods require on their own. In this way, some polyatomic minerals at extreme pressures that were previously almost intractable can now be treated fully from first principles.
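One way to see how a single experimental anchor plus thermodynamic information yields a melting curve is to integrate the Clausius-Clapeyron relation dT/dP = T dV / dH upward in pressure. The sketch below does this with constant, illustrative dV and dH; the paper's Gibbs-surface construction is more general:

```python
import numpy as np

def melting_curve(T0, P0, P_end, dV, dH, n=1000):
    """Euler integration of dT/dP = T * dV / dH, with constant dV (m^3/mol), dH (J/mol)."""
    P = np.linspace(P0, P_end, n)
    T = np.empty(n)
    T[0] = T0
    dP = P[1] - P[0]
    for i in range(1, n):
        T[i] = T[i - 1] + T[i - 1] * dV / dH * dP
    return P, T

# Anchor point and slopes are illustrative: T0 = 3000 K at 25 GPa,
# dV = 2e-7 m^3/mol, dH = 6e4 J/mol.
P, T = melting_curve(3000.0, 25e9, 135e9, 2e-7, 6e4)
print(f"extrapolated melting point at {P[-1] / 1e9:.0f} GPa: {T[-1]:.0f} K")
```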
Multiview point clouds denoising based on interference elimination
NASA Astrophysics Data System (ADS)
Hu, Yang; Wu, Qian; Wang, Le; Jiang, Huanyu
2018-03-01
Newly emerging low-cost depth sensors offer huge potential for three-dimensional (3-D) modeling, but high noise levels prevent these sensors from producing accurate results. We therefore propose a method for denoising registered multiview point clouds with high noise. The proposed method aims to fully exploit redundant information to eliminate interference among point clouds of different views through an iterative procedure. In each iteration, noisy points are either deleted or moved to their weighted average targets in accordance with two cases. Simulated data and practical data captured by a Kinect v2 sensor were tested in experiments, qualitatively and quantitatively. Results showed that the proposed method can effectively reduce noise and recover local features from highly noisy multiview point clouds with good robustness, compared to the truncated signed distance function and moving least squares (MLS). Moreover, the resulting low-noise point clouds can be further smoothed by MLS to achieve improved results. This study demonstrates the feasibility of obtaining fine 3-D models with high-noise devices, especially depth sensors such as the Kinect.
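A generic flavor of the "move to a weighted average target" case is sketched below: each point is pulled toward the Gaussian-weighted mean of its neighbors from the merged views. This simplification omits the paper's interference-elimination and deletion rules; the radius, weighting, and test data are assumptions:

```python
import numpy as np

def smooth_once(points, radius=0.01):
    """One pass: move each point to the Gaussian-weighted mean of its
    neighbors within `radius` (m)."""
    out = points.copy()
    for i, p in enumerate(points):
        d = np.linalg.norm(points - p, axis=1)
        near = d < radius
        w = np.exp(-(d[near] / radius) ** 2)            # Gaussian weights, assumed
        out[i] = (w[:, None] * points[near]).sum(axis=0) / w.sum()
    return out

rng = np.random.default_rng(1)
plane = np.column_stack([rng.uniform(0, 0.2, 2000),
                         rng.uniform(0, 0.2, 2000),
                         np.zeros(2000)])               # true surface: z = 0
noisy = plane + rng.normal(scale=0.002, size=plane.shape)
denoised = smooth_once(noisy)
print(noisy[:, 2].std(), denoised[:, 2].std())          # z-noise shrinks
```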
New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.
Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G
2012-01-01
This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia, and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: the camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool produced deep lesion segmentations comparable to manual segmentations (RMSE = 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved reliable even during cutting procedures (RMSE = 0.073 ± 0.023 mm under 40× magnification). These results demonstrate that the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.
Lee, Hangyeore; Mun, Dong-Gi; So, Jeong Eun; Bae, Jingi; Kim, Hokeun; Masselon, Christophe; Lee, Sang-Won
2016-12-06
Proteomics aims to achieve complete profiling of the protein content and protein modifications in cells, tissues, and biofluids and to quantitatively determine changes in their abundances. This information serves to elucidate cellular processes and signaling pathways and to identify candidate protein biomarkers and/or therapeutic targets. Analyses must therefore be both comprehensive and efficient. Here, we present a novel online two-dimensional reverse-phase/reverse-phase liquid chromatography separation platform, which is based on a newly developed online noncontiguous fractionating and concatenating device (NCFC fractionator). In bottom-up proteomics analyses of a complex proteome, this system provided significantly improved exploitation of the separation space of the two RPs, considerably increasing the number of peptides identified compared to a contiguous 2D-RP/RPLC method. The fully automated online 2D-NCFC-RP/RPLC system bypassed a number of labor-intensive manual processes required by the previously described offline 2D-NCFC RP/RPLC method, and thus offers minimal sample loss in the context of highly reproducible 2D-RP/RPLC experiments.
Non-linear structure formation in the `Running FLRW' cosmological model
NASA Astrophysics Data System (ADS)
Bibiano, Antonio; Croton, Darren J.
2016-07-01
We present a suite of cosmological N-body simulations describing the `Running Friedmann-Lemaître-Robertson-Walker' (R-FLRW) cosmological model. This model is based on quantum field theory in a curved space-time and extends Lambda cold dark matter (ΛCDM) with a time-evolving vacuum density, Λ(z), and a time-evolving Newton's gravitational coupling, G(z). In this paper, we review the model and introduce the analytical treatment needed to adapt a reference N-body code. Our resulting simulations represent the first realization of the full growth history of structure in the R-FLRW cosmology into the non-linear regime, and our normalization choice makes them fully consistent with the latest cosmic microwave background data. The post-processing data products also allow, for the first time, an analysis of the properties of the halo and sub-halo populations. We explore the degeneracies of many statistical observables and discuss the steps needed to break them. Furthermore, we provide a quantitative description of the deviations of R-FLRW from ΛCDM, which could be readily exploited by future cosmological observations to test and further constrain the model.
Liu, Yan; Yi, Ting-Hua; Xu, Zhen-Jun
2013-01-01
As a high-risk subindustry of construction, highway construction safety has seen major developments in the past 20 years, yet Chinese construction projects have largely lacked effective safety early warnings. By combining the current state of early warning technology with the requirements of the State Administration of Work Safety and using case-based reasoning (CBR), this paper expounds the concept and workflow of highway construction safety early warning based on CBR. The present study provides solutions to three key issues (index selection, accident cause association analysis, and warning degree forecasting implementation) through the use of association rule mining, support vector machine classifiers, and variable fuzzy qualitative and quantitative change criterion modes, which together cover the needs of a safety early warning system. By describing the principles and advantages of each method and demonstrating the methods' effectiveness and ability to act together in safety early warning applications, effective means and intelligent technology for a highway construction safety early warning system are established. PMID:24191134
NASA Astrophysics Data System (ADS)
Lash, E. Lara; Schmisseur, John
2017-11-01
Pressure-sensitive paint has been used to evaluate the unsteady dynamics of transitional and turbulent shock wave-boundary layer interactions generated by a vertical cylinder on a flat plate in a Mach 2 freestream. The resulting shock structure consists of an inviscid bow shock that bifurcates into a separation shock and a trailing shock. The primary features of interest are the separation shock and an upstream influence shock that is intermittently present in transitional boundary layer interactions but not observed in turbulent interactions. The power spectral densities, frequency peaks, and normalized wall pressures are analyzed as the incoming boundary layer state changes from transitional to fully turbulent, comparing both centerline and outboard regions of the interaction. The present study compares the scales and frequencies of the dynamics of the separation shock structure in the different boundary layer regimes. Synchronized high-speed Schlieren imaging provides quantitative statistical analyses as well as qualitative comparisons to the fast-response pressure-sensitive paint measurements. This material is based on research supported by the U.S. Office of Naval Research under Award Number N00014-15-1-2269.
Understanding the low uptake of bone-anchored hearing aids: a review.
Powell, R; Wearden, A; Pardesi, S M; Green, K
2017-03-01
Bone-anchored hearing aids improve hearing for patients for whom conventional behind-the-ear aids are problematic. However, uptake of bone-anchored hearing aids is low and it is important to understand why this is the case. A narrative review was conducted. Studies examining why people accept or decline bone-anchored hearing aids and satisfaction levels of people with bone-anchored hearing aids were reviewed. Reasons for declining bone-anchored hearing aids included limited perceived benefits, concerns about surgery, aesthetic concerns and treatment cost. No studies providing in-depth analysis of the reasons for declining or accepting bone-anchored hearing aids were identified. Studies of patient satisfaction showed that most participants reported benefits with bone-anchored hearing aids. However, most studies used cross-sectional and/or retrospective designs and only included people with bone-anchored hearing aids. Important avenues for further research are in-depth qualitative research designed to fully understand the decision-making process for bone-anchored hearing aids and rigorous quantitative research comparing satisfaction of people who receive bone-anchored hearing aids with those who receive alternative (or no) treatments.
Ackermann, Mark R.
2006-01-01
The purpose of this manuscript is to discuss fluorogenic real-time quantitative polymerase chain reaction (qPCR) inhibition and to introduce/define a novel Microsoft Excel-based file system which provides a way to detect and avoid inhibition, and enables investigators to consistently design dynamically sound, truly LOG-linear qPCR reactions very quickly. The qPCR problems this invention solves are universal to all qPCR reactions, and it performs all necessary qPCR set-up calculations in about 52 seconds (using a Pentium 4 processor) for up to seven qPCR targets and seventy-two samples at a time; calculations that commonly take capable investigators days to finish. We have named this custom Excel-based file system "FocusField2-6GallupqPCRSet-upTool-001" (FF2-6-001 qPCR set-up tool), and are in the process of transforming it into professional qPCR set-up software to be made available in 2007. The current prototype is already fully functional. PMID:17033699
Wang, Guangliang; Rajpurohit, Surendra K; Delaspre, Fabien; Walker, Steven L; White, David T; Ceasrine, Alexis; Kuruvilla, Rejji; Li, Ruo-jing; Shim, Joong S; Liu, Jun O; Parsons, Michael J; Mumm, Jeff S
2015-01-01
Whole-organism chemical screening can circumvent bottlenecks that impede drug discovery. However, in vivo screens have not attained the throughput capacities possible with in vitro assays. We therefore developed a method enabling in vivo high-throughput screening (HTS) in zebrafish, termed automated reporter quantification in vivo (ARQiv). In this study, ARQiv was combined with robotics to fully actualize whole-organism HTS (ARQiv-HTS). In a primary screen, this platform quantified cell-specific fluorescent reporters in >500,000 transgenic zebrafish larvae to identify FDA (Food and Drug Administration)-approved drugs that increased the number of insulin-producing β cells in the pancreas. Twenty-four drugs were confirmed as inducers of endocrine differentiation and/or stimulators of β-cell proliferation. Further, we discovered novel roles for NF-κB signaling in regulating endocrine differentiation and for serotonergic signaling in selectively stimulating β-cell proliferation. These studies demonstrate the power of ARQiv-HTS for drug discovery and provide unique insights into signaling pathways controlling β-cell mass, potential therapeutic targets for treating diabetes. DOI: http://dx.doi.org/10.7554/eLife.08261.001 PMID:26218223
McClymont, Darryl; Mehnert, Andrew; Trakic, Adnan; Kennedy, Dominic; Crozier, Stuart
2014-04-01
To present and evaluate a fully automatic method for segmentation (i.e., detection and delineation) of suspicious tissue in breast MRI. The method, based on mean-shift clustering and graph-cuts on a region adjacency graph, was developed and its parameters tuned using multimodal (T1, T2, DCE-MRI) clinical breast MRI data from 35 subjects (training data). It was then tested using two data sets. Test set 1 comprises data for 85 subjects (93 lesions) acquired using the same protocol and scanner system used to acquire the training data. Test set 2 comprises data for eight subjects (nine lesions) acquired using a similar protocol but a different vendor's scanner system. Each lesion was manually delineated in three dimensions by an experienced breast radiographer to establish segmentation ground truth. The regions of interest identified by the method were compared with the ground truth, and the detection and delineation accuracies were quantitatively evaluated. One hundred percent of the lesions were detected, with a mean of 4.5 ± 1.2 false positives per subject. This false-positive rate is nearly 50% better than previously reported for a fully automatic breast lesion detection system. The median Dice coefficient for Test set 1 was 0.76 (interquartile range, 0.17), and 0.75 (interquartile range, 0.16) for Test set 2. The results demonstrate the efficacy and accuracy of the proposed method as well as its potential for direct application across different MRI systems. It is (to the authors' knowledge) the first fully automatic method for breast lesion detection and delineation in breast MRI.
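A minimal sketch of the Dice overlap metric quoted above, applied to toy 3-D masks; in real use the inputs would be the lesion masks from the segmentation pipeline and the manual ground truth.

```python
# Dice coefficient between a segmented 3-D mask and a ground-truth mask.
import numpy as np

def dice(seg: np.ndarray, truth: np.ndarray) -> float:
    """Dice = 2|A n B| / (|A| + |B|) for boolean volumes."""
    seg, truth = seg.astype(bool), truth.astype(bool)
    denom = seg.sum() + truth.sum()
    return 2.0 * np.logical_and(seg, truth).sum() / denom if denom else 1.0

# Toy 3-D example: two overlapping cubes
a = np.zeros((10, 10, 10), bool); a[2:7, 2:7, 2:7] = True
b = np.zeros((10, 10, 10), bool); b[3:8, 3:8, 3:8] = True
print(f"Dice = {dice(a, b):.3f}")
```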
Milles, J; van der Geest, R J; Jerosch-Herold, M; Reiber, J H C; Lelieveldt, B P F
2007-01-01
This paper presents a novel method for registration of cardiac perfusion MRI. The presented method successfully corrects for breathing motion without any manual interaction, using Independent Component Analysis to extract physiologically relevant features together with their time-intensity behavior. A time-varying reference image mimicking intensity changes in the data of interest is computed based on the results of ICA and used to compute the displacement caused by breathing for each frame. Qualitative and quantitative validation of the method is carried out using 46 clinical-quality, short-axis, perfusion MR datasets comprising 100 images each. Validation experiments showed a reduction of the average LV motion from 1.26 ± 0.87 to 0.64 ± 0.46 pixels. Time-intensity curves are also improved after registration, with the average error between registered data and the manual gold standard reduced from 2.65 ± 7.89% to 0.87 ± 3.88%. We conclude that this fully automatic ICA-based method shows excellent accuracy, robustness and computation speed, adequate for use in a clinical environment.
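The sketch below illustrates only the ICA step on synthetic time-intensity data, using scikit-learn's FastICA as a stand-in implementation; the reference-image construction and displacement estimation of the full method are not reproduced.

```python
# Decompose a synthetic perfusion time series into independent components;
# components with physiologically meaningful time-intensity behavior can
# then drive a synthetic reference image. Data and sources are assumed.
import numpy as np
from sklearn.decomposition import FastICA

n_frames, n_pixels = 100, 64 * 64
t = np.linspace(0, 1, n_frames)
# Two assumed sources: an LV-like bolus passage and slower tissue enhancement
s1 = np.exp(-((t - 0.2) ** 2) / 0.005)        # sharp first-pass bolus
s2 = 1.0 / (1.0 + np.exp(-(t - 0.4) * 20))    # gradual myocardial uptake
mixing = np.random.default_rng(1).random((n_pixels, 2))
frames = s1[:, None] * mixing[:, 0] + s2[:, None] * mixing[:, 1]  # (frames, pixels)

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(frames)   # recovered time courses, one per component
print(sources.shape)                  # (100, 2)
```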
The effect of capturing the correct turbulence dissipation rate in BHR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwarzkopf, John Dennis; Ristorcelli, Raymond
In this manuscript, we discuss the shortcoming of a quasi-equilibrium assumption made in the BHR closure model. Turbulence closure models generally assume fully developed turbulence, which is not applicable to 1) non-equilibrium turbulence (e.g. a change in mean pressure gradient) or 2) laminar-turbulent transition flows. Based on DNS data, we show that the current BHR dissipation equation [modeled on fully developed turbulence phenomenology] does not capture important features of non-equilibrium flows. To demonstrate our thesis, we use the BHR equations to predict a non-equilibrium flow both with the BHR dissipation and with the dissipation from DNS. We find that the prediction can be substantially improved, both qualitatively and quantitatively, with the correct dissipation rate. We conclude that a new set of non-equilibrium phenomenological assumptions must be used to develop a new model equation for the dissipation to accurately predict the turbulence time scale used by other models.
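The numerical experiment described (swapping the modeled dissipation for the DNS dissipation) can be mimicked with a generic two-equation model. The sketch below uses a k-epsilon analogue with an assumed oscillatory production term, not the actual BHR equations.

```python
# Illustrative sketch (not the BHR model): dk/dt = P - eps, with eps either
# evolved by a fully-developed-turbulence model equation or prescribed
# directly, mimicking the paper's experiment of feeding in DNS dissipation.
import numpy as np

C1, C2 = 1.44, 1.92          # standard k-epsilon model constants
dt, n = 1e-3, 5000
P = lambda t: 1.0 + 0.5 * np.sin(4 * t)   # assumed non-equilibrium production

def integrate(prescribed_eps=None):
    k, eps, t = 1.0, 1.0, 0.0
    for _ in range(n):
        prod = P(t)
        e = prescribed_eps(t) if prescribed_eps else eps
        k += dt * (prod - e)
        if prescribed_eps is None:
            eps += dt * (C1 * prod - C2 * eps) * eps / k   # modeled dissipation
        t += dt
    return k

print(integrate())                                     # modeled dissipation
print(integrate(lambda t: 1.0 + 0.4 * np.sin(4 * t)))  # "DNS-like" dissipation
```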
Gramene database in 2010: updates and extensions.
Youens-Clark, Ken; Buckler, Ed; Casstevens, Terry; Chen, Charles; Declerck, Genevieve; Derwent, Paul; Dharmawardhana, Palitha; Jaiswal, Pankaj; Kersey, Paul; Karthikeyan, A S; Lu, Jerry; McCouch, Susan R; Ren, Liya; Spooner, William; Stein, Joshua C; Thomason, Jim; Wei, Sharon; Ware, Doreen
2011-01-01
Now in its 10th year, the Gramene database (http://www.gramene.org) has grown from its primary focus on rice, the first fully-sequenced grass genome, to become a resource for major model and crop plants including Arabidopsis, Brachypodium, maize, sorghum, poplar and grape in addition to several species of rice. Gramene began with the addition of an Ensembl genome browser and has expanded in the last decade to become a robust resource for plant genomics hosting a wide array of data sets including quantitative trait loci (QTL), metabolic pathways, genetic diversity, genes, proteins, germplasm, literature, ontologies and a fully-structured markers and sequences database integrated with genome browsers and maps from various published studies (genetic, physical, bin, etc.). In addition, Gramene now hosts a variety of web services including a Distributed Annotation Server (DAS), BLAST and a public MySQL database. Twice a year, Gramene releases a major build of the database and makes interim releases to correct errors or to make important updates to software and/or data.
Automated frame selection process for high-resolution microendoscopy
NASA Astrophysics Data System (ADS)
Ishijima, Ayumu; Schwarz, Richard A.; Shin, Dongsuk; Mondrik, Sharon; Vigneswaran, Nadarajah; Gillenwater, Ann M.; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca
2015-04-01
We developed an automated frame selection algorithm for high-resolution microendoscopy video sequences. The algorithm rapidly selects a representative frame with minimal motion artifact from a short video sequence, enabling fully automated image analysis at the point-of-care. The algorithm was evaluated by quantitative comparison of diagnostically relevant image features and diagnostic classification results obtained using automated frame selection versus manual frame selection. A data set consisting of video sequences collected in vivo from 100 oral sites and 167 esophageal sites was used in the analysis. The area under the receiver operating characteristic curve was 0.78 (automated selection) versus 0.82 (manual selection) for oral sites, and 0.93 (automated selection) versus 0.92 (manual selection) for esophageal sites. The implementation of fully automated high-resolution microendoscopy at the point-of-care has the potential to reduce the number of biopsies needed for accurate diagnosis of precancer and cancer in low-resource settings where there may be limited infrastructure and personnel for standard histologic analysis.
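One plausible motion criterion for such frame selection is sketched below: score each frame by its mean absolute difference to neighboring frames and pick the minimum. The published algorithm's exact metric is not reproduced here.

```python
# Pick the "stillest" frame of a video stack by inter-frame difference.
import numpy as np

def select_frame(video: np.ndarray) -> int:
    """video: (n_frames, H, W) grayscale stack; returns index of stillest frame."""
    diffs = np.abs(np.diff(video.astype(float), axis=0)).mean(axis=(1, 2))
    # motion score of frame i = difference to previous frame + to next frame
    score = np.full(video.shape[0], np.inf)
    score[1:-1] = diffs[:-1] + diffs[1:]
    return int(np.argmin(score))

rng = np.random.default_rng(2)
video = rng.random((30, 64, 64))
video[12:15] = video[12]          # three nearly identical (still) frames
print(select_frame(video))        # expect 13, inside the still run
```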
Does architectural lighting contribute to breast cancer?
Figueiro, Mariana G; Rea, Mark S; Bullough, John D
2006-01-01
Objectives: There is a growing interest in the role that light plays in nocturnal melatonin production and, perhaps thereby, the incidence of breast cancer in modern societies. The direct causal relationships in this logical chain have not, however, been fully established, and the weakest link is an inability to quantitatively specify architectural lighting as a stimulus for the circadian system. The purpose of the present paper is to draw attention to this weakness. Data Sources and Extraction: We reviewed the literature on the relationship between melatonin, light at night, and cancer risk in humans and tumor growth in animals. More specifically, we focused on the impact of light on nocturnal melatonin suppression in humans and on the applicability of these data to women in real-life situations. Photometric measurement data from the lighted environment of women at work and at home are also reported. Data Synthesis: The literature review and measurement data demonstrate that more quantitative knowledge is needed about circadian light exposures actually experienced by women and girls in modern societies. Conclusion: Without such quantitative knowledge, limited insights can be gained about the causal relationship between melatonin and the etiology of breast cancer from epidemiological studies and from parametric studies using animal models. PMID:16901343
NASA Astrophysics Data System (ADS)
Favicchio, Rosy; Psycharakis, Stylianos; Schönig, Kai; Bartsch, Dusan; Mamalaki, Clio; Papamatheakis, Joseph; Ripoll, Jorge; Zacharakis, Giannis
2016-02-01
Fluorescent proteins and dyes are routine tools for biological research to describe the behavior of genes, proteins, and cells, as well as more complex physiological dynamics such as vessel permeability and pharmacokinetics. The use of these probes in whole body in vivo imaging would allow extending the range and scope of current biomedical applications and would be of great interest. In order to comply with a wide variety of application demands, in vivo imaging platform requirements span from wide spectral coverage to precise quantification capabilities. Fluorescence molecular tomography (FMT) detects and reconstructs in three dimensions the distribution of a fluorophore in vivo. Noncontact FMT allows fast scanning of an excitation source and noninvasive measurement of emitted fluorescent light using a virtual array detector operating in free space. Here, a rigorous process is defined that fully characterizes the performance of a custom-built horizontal noncontact FMT setup. Dynamic range, sensitivity, and quantitative accuracy across the visible spectrum were evaluated using fluorophores with emissions between 520 and 660 nm. These results demonstrate that high-performance quantitative three-dimensional visible light FMT allowed the detection of challenging mesenteric lymph nodes in vivo and the comparison of spectrally distinct fluorescent reporters in cell culture.
NASA Technical Reports Server (NTRS)
Carpenter, P. K.; Hahn, T. M.; Korotev, R. L.; Ziegler, R. A.; Jolliff, B. L.
2017-01-01
We present the first fully quantitative compositional maps of lunar meteorite NWA 2995 using electron microprobe stage mapping, and compare selected clast mineralogy and chemistry. NWA 2995 is a feldspathic fragmental breccia containing numerous fine-grained highland lithologies, including anorthosite, norite, olivine basalt, subophitic basalt, gabbro, KREEP-like basalt, granulitic and glassy impact melts, coarse-grained mineral fragments, Fe-Ni metal, and glassy matrix [1]. Chips of NWA 2995, representing these diverse materials, were analyzed by INAA and fused-bead electron-probe microanalysis (EPMA); comparison of analytical data suggests grouping of lunar meteorites NWA 2995, 2996, 3190, 4503, 5151, and 5152. The mean composition of NWA 2995 corresponds to a 2:1 mixture of feldspathic and mare material, with approximately 5% KREEP component [2]. Clast mineral chemistry and petrologic interpretation of paired stone NWA 2996 has been reported by Mercer et al. [3], and Gross et al. [4]. This study combines advances in quantitative EPMA compositional mapping and data analysis, as applied to selected mafic clasts in a polished section of NWA 2995, to investigate the origin of mafic lithic components and to demonstrate a procedural framework for petrologic analysis.
Quantitative modeling of the reaction/diffusion kinetics of two-chemistry photopolymers
NASA Astrophysics Data System (ADS)
Kowalski, Benjamin Andrew
Optically driven diffusion in photopolymers is an appealing material platform for a broad range of applications, in which the recorded refractive index patterns serve either as images (e.g. data storage, display holography) or as optical elements (e.g. custom GRIN components, integrated optical devices). A quantitative understanding of the reaction/diffusion kinetics is difficult to obtain directly, but is nevertheless necessary in order to fully exploit the wide array of design freedoms in these materials. A general strategy for characterizing these kinetics is proposed, in which key processes are decoupled and independently measured. This strategy enables prediction of a material's potential refractive index change, solely on the basis of its chemical components. The degree to which a material does not reach this potential reveals the fraction of monomer that has participated in unwanted reactions, reducing spatial resolution and dynamic range. This approach is demonstrated for a model material similar to commercial media, achieving quantitative predictions of index response over three orders of exposure dose (~1 to ~10³ mJ cm⁻²) and three orders of feature size (0.35 to 500 microns). The resulting insights enable guided, rational design of new material formulations with demonstrated performance improvement.
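A generic 1-D reaction/diffusion integration of the kind such kinetics models involve is sketched below; the rate constants, illumination profile, and grid are arbitrary assumptions, not fitted parameters from the study.

```python
# Monomer is consumed where illumination drives polymerization and
# diffuses in from dark regions; explicit finite-difference integration.
import numpy as np

nx, dx, dt = 200, 1.0, 0.1
D, k_p = 1.0, 0.05                     # diffusivity, polymerization rate (assumed)
m = np.ones(nx)                        # monomer concentration
I = np.zeros(nx); I[80:120] = 1.0      # illuminated stripe
p = np.zeros(nx)                       # converted polymer

for _ in range(2000):
    lap = (np.roll(m, 1) - 2 * m + np.roll(m, -1)) / dx**2  # periodic Laplacian
    react = k_p * I * m
    m += dt * (D * lap - react)
    p += dt * react

print(f"peak conversion {p.max():.2f}, monomer left in stripe {m[100]:.3f}")
```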
Steingass, Christof B; Langen, Johannes; Carle, Reinhold; Schmarr, Hans-Georg
2015-02-01
Headspace solid-phase microextraction and chirospecific gas chromatography-mass spectrometry in selected ion monitoring mode (HS-SPME-GC-SIM-MS) allowed quantitative determination of δ-lactones (δ-C8, δ-C10) and γ-lactones (γ-C6, γ-C8, γ-C10). A stable isotope dilution assay (SIDA) with d7-γ-decalactone as internal standard was used for quantitative analysis of pineapple lactones, performed at three successive post-harvest stages of fully ripe air-freighted and green-ripe sea-freighted fruits, covering the relevant shelf-life of the fruits. Fresh pineapples harvested at full maturity were characterised by γ-C6 of high enantiomeric purity that remained stable during the whole post-harvest period. In contrast, the enantiomeric purity of γ-C6 significantly decreased during post-harvest storage of sea-freighted pineapples. The biogenetic background and the potential of chirospecific analysis of lactones for authentication and quality evaluation of fresh pineapple fruits are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
Schlippenbach, Trixi von; Oefner, Peter J; Gronwald, Wolfram
2018-03-09
Non-uniform sampling (NUS) allows the accelerated acquisition of multidimensional NMR spectra. The aim of this contribution was the systematic evaluation of the impact of various quantitative NUS parameters on the accuracy and precision of 2D NMR measurements of urinary metabolites. Urine aliquots spiked with varying concentrations (15.6-500.0 µM) of tryptophan, tyrosine, glutamine, glutamic acid, lactic acid, and threonine, which can only be resolved fully by 2D NMR, were used to assess the influence of the sampling scheme, reconstruction algorithm, amount of omitted data points, and seed value on the quantitative performance of NUS in ¹H,¹H-TOCSY and ¹H,¹H-COSY45 NMR spectroscopy. Sinusoidal Poisson-gap sampling and a compressed sensing approach employing the iterative re-weighted least squares method for spectral reconstruction allowed a 50% reduction in measurement time while maintaining sufficient quantitative accuracy and precision for both types of homonuclear 2D NMR spectroscopy. Together with other advances in instrument design, such as state-of-the-art cryogenic probes, use of 2D NMR spectroscopy in large biomedical cohort studies seems feasible.
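A simplified generator for sinusoidal Poisson-gap schedules (the sampling scheme favored above, after Hyberts and co-workers) is sketched below; production schedulers refine the rate adjustment further, so treat this as an assumption-laden illustration.

```python
# Gaps between sampled increments are Poisson distributed with a
# sine-weighted mean, favoring early (high-signal) increments.
import numpy as np

def poisson_gap(n_total: int, n_keep: int, seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)
    lam = n_total / n_keep - 1.0              # initial mean gap
    picks = []
    for _ in range(500):                      # adjust rate until count matches
        picks, i = [], 0
        while i < n_total:
            picks.append(i)
            i += 1 + rng.poisson(lam * np.sin((i + 0.5) / n_total * np.pi / 2))
        if len(picks) == n_keep:
            break
        lam *= len(picks) / n_keep            # too many picks -> widen gaps
    return np.array(picks)

schedule = poisson_gap(128, 64)               # keep 50% of 128 increments
print(len(schedule), schedule[:10])
```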
Designing Online Playgrounds for Learning Mathematics
ERIC Educational Resources Information Center
Johnson, Heather Lynn; Hornbein, Peter; Bryson, Dana
2016-01-01
Fully online courses can provide teachers fresh opportunities to expand their mathematical conceptions and infuse technology into their classroom teaching. In this article, the authors share the experience of two classroom teachers (Hornbein and Bryson) who participated in a fully online mathematics education course--Expanding Conceptions of…
10 CFR Appendix II to Part 504 - Fuel Price Computation
Code of Federal Regulations, 2010 CFR
2010-01-01
... 504—Fuel Price Computation (a) Introduction. This appendix provides the equations and parameters... inflation indices must follow standard statistical procedures and must be fully documented within the... the weighted average fuel price must follow standard statistical procedures and be fully documented...
An exploration of equitable science teaching practices for students with learning disabilities
NASA Astrophysics Data System (ADS)
Morales, Marlene
In this study, a mixed methods approach was used to gather descriptive exploratory information regarding the teaching of science to middle grades students with learning disabilities within a general education classroom. The purpose of this study was to examine teachers' beliefs and their practices concerning providing equitable opportunities for students with learning disabilities in a general education science classroom. Equitable science teaching practices take into account each student's differences and use those differences to inform instructional decisions and tailor teaching practices based on the student's individualized learning needs. Students with learning disabilities are similar to their non-disabled peers; however, they need some differentiation in instruction to perform to their highest potential achievement levels (Finson, Ormsbee, & Jensen, 2011). In the quantitative phase, the purpose of the study was to identify patterns in the beliefs of middle grades science teachers about the inclusion of students with learning disabilities in the general education classroom. In the qualitative phase, the purpose of the study was to present examples of instruction in the classrooms of science education reform-oriented middle grades science teachers. The quantitative phase of the study collected data from 274 sixth through eighth grade teachers in the State of Florida during the 2007-2008 school year using The Teaching Science to Students with Learning Disabilities Inventory. Overall, the quantitative findings revealed that middle grades science teachers held positive beliefs about the inclusion of students with learning disabilities in the general education science classroom. The qualitative phase collected data from multiple sources (interviews, classroom observations, and artifacts) to develop two case studies of reform-oriented middle grades science teachers who were expected to provide equitable science teaching practices. Based on their responses to The Teaching Science to Students with Learning Disabilities Inventory, the case study teachers demonstrated characteristics of successful teachers of diverse learners developed by Lynch (2000). Overall, the qualitative findings revealed that the case study teachers were unsure how to provide equitable science teaching practices to all students, particularly to students with learning disabilities. They provided students with a variety of learning experiences that entailed high expectations for all; however, these experiences were similar for all students. Had the teachers fully implemented equitable science teaching practices, students would have had multiple options for taking in the information and making sense of it in each lesson. Teaching that includes using a variety of validated practices that take into account students' individualized learning needs can promote aspects of equitable science teaching practices. Finally, this study provides implications for teacher education programs and professional development programs. As teachers implement science education reform efforts related to equitable science teaching practices, both teacher education programs and professional development programs should include opportunities for teachers to reflect on their beliefs about how students with learning disabilities learn and provide them with a variety of validated teaching practices that will assist them in teaching students with learning disabilities in the general education classroom while implementing science reform efforts.
17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 17 Commodity and Securities Exchanges 2 2011-04-01 2011-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...
Mapping of quantitative trait loci controlling adaptive traits in coastal Douglas-fir
Nicholas C. Wheeler; Kathleen D. Jermstad; Konstantin V. Krutovsky; Sally N. Aitken; Glenn T. Howe; Jodie Krakowski; David B. Neale
2005-01-01
Quantitative trait locus (QTL) analyses are used by geneticists to characterize the genetic architecture of quantitative traits, provide a foundation for marker-aided-selection (MAS), and provide a framework for positional selection of candidate genes. The most useful QTL for breeding applications are those that have been verified in time, space, and/or genetic...
Intelligent Middle-Ware Architecture for Mobile Networks
NASA Astrophysics Data System (ADS)
Rayana, Rayene Ben; Bonnin, Jean-Marie
Recent advances in the electronics and automotive industries, as well as in wireless telecommunication technologies, have drawn a new picture in which each vehicle becomes "fully networked". Multiple stakeholders (network operators, drivers, car manufacturers, service providers, etc.) will participate in this emerging market, which could grow following various models. To free the market from technical constraints, it is important to return to the basics of the Internet, i.e., providing on-board devices with fully operational Internet connectivity (IPv6).
Bloomfield, M S
2002-12-06
4-Aminophenol (4AP) is the primary degradation product of paracetamol, which is limited to a low level (50 ppm or 0.005% w/w) in the drug substance by the European, United States, British and German Pharmacopoeias, employing a manual colourimetric limit test. The 4AP limit is widened to 1000 ppm or 0.1% w/w in the tablet product monographs, which quote the use of a less sensitive automated HPLC method. The lower drug substance specification limit is applied to our products (50 ppm, equivalent to 25 µg 4AP in a tablet containing 500 mg paracetamol), and the pharmacopoeial HPLC assay was not suitable at this low level due to matrix interference. For routine analysis a rapid, automated assay was required. This paper presents a highly sensitive, precise and automated method employing the technique of Flow Injection (FI) analysis to quantitatively assay low levels of this degradant. A solution of the drug substance, or an extract of the tablets, containing 4AP and paracetamol is injected into a solvent carrier stream and merged on-line with alkaline sodium nitroprusside reagent to form a specific blue derivative, which is detected spectrophotometrically at 710 nm. Standard HPLC equipment is used throughout. The procedure is fully quantitative and has been optimised for sensitivity and robustness using a multivariate experimental design (multi-level 'Central Composite' response surface) model. The method has been fully validated and is linear down to 0.01 µg ml⁻¹. The approach should be applicable to a range of paracetamol products.
Oyster reef restoration in the northern Gulf of Mexico: extent, methods and outcomes
LaPeyre, Megan K.; Furlong, Jessica N.; Brown, Laura A.; Piazza, Bryan P.; Brown, Ken
2014-01-01
Shellfish reef restoration to support ecological services has become more common in recent decades, driven by increasing awareness of the functional decline of shellfish systems. Maximizing restoration benefits and increasing efficiency of shellfish restoration activities would greatly benefit from understanding and measurement of system responses to management activities. This project (1) compiles a database of northern Gulf of Mexico inshore artificial oyster reefs created for restoration purposes, and (2) quantitatively assesses a subset of reefs to determine project outcomes. We documented 259 artificial inshore reefs created for ecological restoration. Information on reef material, reef design and monitoring was located for 94, 43 and 20% of the reefs identified. To quantify restoration success, we used diver surveys to quantitatively sample oyster density and substrate volume of 11 created reefs across the coast (7 with rock; 4 with shell), paired with 7 historic reefs. Reefs were defined as fully successful if there were live oysters, and partially successful if there was hard substrate. Of these created reefs, 73% were fully successful, while 82% were partially successful. These data highlight that critical information related to reef design, cost, and success remain difficult to find and are generally inaccessible or lost, ultimately hindering efforts to maximize restoration success rates. Maintenance of reef creation information data, development of standard reef performance measures, and inclusion of material and reef design testing within reef creation projects would be highly beneficial in implementing adaptive management. Adaptive management protocols seek specifically to maximize short and long-term restoration success, but are critically dependent on tracking and measuring system responses to management activities.
Adapting the γ-H2AX assay for automated processing in human lymphocytes. 1. Technological aspects.
Turner, Helen C; Brenner, David J; Chen, Youhua; Bertucci, Antonella; Zhang, Jian; Wang, Hongliang; Lyulko, Oleksandra V; Xu, Yanping; Shuryak, Igor; Schaefer, Julia; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y Lawrence; Amundson, Sally A; Garty, Guy
2011-03-01
The immunofluorescence-based detection of γ-H2AX is a reliable and sensitive method for quantitatively measuring DNA double-strand breaks (DSBs) in irradiated samples. Since H2AX phosphorylation is highly linear with radiation dose, this well-established biomarker is in current use in radiation biodosimetry. At the Center for High-Throughput Minimally Invasive Radiation Biodosimetry, we have developed a fully automated high-throughput system, the RABIT (Rapid Automated Biodosimetry Tool), that can be used to measure γ-H2AX yields from fingerstick-derived samples of blood. The RABIT workstation has been designed to fully automate the γ-H2AX immunocytochemical protocol, from the isolation of human blood lymphocytes in heparin-coated PVC capillaries to the immunolabeling of γ-H2AX protein and image acquisition to determine fluorescence yield. High throughput is achieved through the use of purpose-built robotics, lymphocyte handling in 96-well filter-bottomed plates, and high-speed imaging. The goal of the present study was to optimize and validate the performance of the RABIT system for the reproducible and quantitative detection of γ-H2AX total fluorescence in lymphocytes in a multiwell format. Validation of our biodosimetry platform was achieved by the linear detection of a dose-dependent increase in γ-H2AX fluorescence in peripheral blood samples irradiated ex vivo with γ rays over the range 0 to 8 Gy. This study demonstrates for the first time the optimization and use of our robotically based biodosimetry workstation to successfully quantify γ-H2AX total fluorescence in irradiated peripheral lymphocytes.
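A minimal sketch of the calibration step underlying such biodosimetry: fit fluorescence yield against dose and invert the fit to estimate dose for a new sample. All data below are synthetic.

```python
# Linear dose-response fit for gamma-H2AX fluorescence (synthetic data).
import numpy as np

dose = np.array([0, 1, 2, 4, 6, 8], dtype=float)   # Gy
fluor = 120 + 310 * dose + np.random.default_rng(3).normal(0, 40, dose.size)

slope, intercept = np.polyfit(dose, fluor, 1)
pred = slope * dose + intercept
r2 = 1 - ((fluor - pred) ** 2).sum() / ((fluor - fluor.mean()) ** 2).sum()
print(f"yield = {slope:.0f}*dose + {intercept:.0f} (a.u.), R^2 = {r2:.3f}")

# Inverting the calibration gives a dose estimate for a new sample:
print(f"estimated dose for 1500 a.u.: {(1500 - intercept) / slope:.2f} Gy")
```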
Lanzarotta, Adam; Lorenz, Lisa; Voelker, Sarah; Falconer, Travis M; Batson, JaCinta S
2018-05-01
This manuscript is a continuation of a recent study that described the use of fully integrated gas chromatography with direct deposition Fourier transform infrared detection and mass spectrometric detection (GC-FT-IR-MS) to identify and confirm the presence of sibutramine and AB-FUBINACA. The purpose of the current study was to employ the GC-FT-IR portion of the same instrument to quantify these compounds, thereby demonstrating the ability to identify, confirm, and quantify drug substances using a single GC-FT-IR-MS unit. The performance of the instrument was evaluated by comparing quantitative analytical figures of merit to those measured using an established, widely employed method for quantifying drug substances, high performance liquid chromatography with ultraviolet detection (HPLC-UV). The results demonstrated that GC-FT-IR was outperformed by HPLC-UV with regard to sensitivity, precision, and linear dynamic range (LDR). However, sibutramine and AB-FUBINACA concentrations measured using GC-FT-IR were not significantly different at the 95% confidence interval compared to those measured using HPLC-UV, which demonstrates promise for using GC-FT-IR as a semi-quantitative tool at the very least. The most significant advantage of GC-FT-IR compared to HPLC-UV is selectivity; a higher level of confidence regarding the identity of the analyte being quantified is achieved using GC-FT-IR. Additional advantages of using a single GC-FT-IR-MS instrument for identification, confirmation, and quantification are efficiency, increased sample throughput, decreased consumption of laboratory resources (solvents, chemicals, consumables, etc.), and thus cost.
Li, W.; Ma, Q.; Thorne, R. M.; ...
2016-06-10
Various physical processes are known to cause acceleration, loss, and transport of energetic electrons in the Earth's radiation belts, but their quantitative roles at different times and locations need further investigation. During the largest storm of the past decade (17 March 2015), relativistic electrons experienced fairly rapid acceleration up to ~7 MeV within 2 days after an initial substantial dropout, as observed by Van Allen Probes. In the present paper, we evaluate the relative roles of various physical processes during the recovery phase of this large storm using a 3-D diffusion simulation. By quantitatively comparing the observed and simulated electron evolution, we found that chorus plays a critical role in accelerating electrons up to several MeV near the developing peak location and produces characteristic flat-top pitch angle distributions. By only including radial diffusion, the simulation underestimates the observed electron acceleration, while radial diffusion plays an important role in redistributing electrons and potentially accelerates them to even higher energies. Moreover, plasmaspheric hiss is found to provide efficient pitch angle scattering losses for hundreds of keV electrons, while its scattering effect on >1 MeV electrons is relatively slow. Although an additional loss process is required to fully explain the overestimated electron fluxes at multi-MeV, the combined physical processes of radial diffusion and pitch angle and energy diffusion by chorus and hiss reproduce the observed electron dynamics remarkably well, suggesting that quasi-linear diffusion theory is adequate for evaluating radiation belt electron dynamics during this big storm.
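The radial-diffusion ingredient of such simulations can be sketched as a 1-D finite-difference solver for df/dt = L² d/dL (D_LL/L² df/dL); the grid, boundary treatment, and power-law D_LL below are illustrative assumptions, not the study's coefficients.

```python
# 1-D radial diffusion of phase-space density f(L) with assumed D_LL.
import numpy as np

L = np.linspace(3.0, 6.5, 71)
dL = L[1] - L[0]
D = 1e-3 * (L / 4.0) ** 10            # assumed power-law D_LL (arbitrary units)
f = np.exp(-((L - 4.5) ** 2) / 0.5)   # initial phase-space density

dt = 0.2 * dL**2 / D.max()            # explicit-scheme stability margin
for _ in range(2000):
    Dface = 0.5 * (D[1:] + D[:-1]) / (0.5 * (L[1:] + L[:-1])) ** 2
    inner = Dface * (f[1:] - f[:-1]) / dL      # (D_LL/L^2) df/dL at cell faces
    f[1:-1] += dt * L[1:-1] ** 2 * (inner[1:] - inner[:-1]) / dL
    f[0] = f[1]                                # zero-gradient inner boundary
print(f"peak f now at L = {L[np.argmax(f)]:.2f}")
```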
Methods for a study of Anticipatory and Preventive multidisciplinary Team Care in a family practice.
Dahrouge, Simone; Hogg, William; Lemelin, Jacques; Liddy, Clare; Legault, Frances
2010-02-01
BACKGROUND: To examine the methodology used to evaluate whether focusing the work of nurse practitioners and a pharmacist on frail and at-risk patients would improve the quality of care for such patients. The study was an evaluation of the methodology of a randomized controlled trial, including analysis of quantitative and qualitative data over time and analysis of cost-effectiveness, set in a single practice in a rural area near Ottawa, Ont. A total of 241 frail patients, aged 50 years and older, at risk of experiencing adverse health outcomes participated. At-risk patients were randomly assigned to receive Anticipatory and Preventive Team Care (from their family physicians, 1 of 3 nurse practitioners, and a pharmacist) or usual care. The principal outcome for the study was the quality of care for chronic disease management. Secondary outcomes included other quality-of-care measures and evaluation of the program process and its cost-effectiveness. This article examines the effectiveness of the methodology used. Quantitative data from surveys, administrative databases, and medical records were supplemented with qualitative information from interviews, focus groups, work logs, and study notes. Three factors limit our ability to fully demonstrate the potential effects of this team structure: for reasons outside our control, the intervention duration was shorter than intended; the practice's physical layout did not facilitate interactions between the care providers; and contamination of the intervention effect into the control arm cannot be excluded. The study used a randomized design, relied on a multifaceted approach to evaluating its effects, and used several sources of data. TRIAL REGISTRATION NUMBER: NCT00238836 (CONSORT).
Automated 3D closed surface segmentation: application to vertebral body segmentation in CT images.
Liu, Shuang; Xie, Yiting; Reeves, Anthony P
2016-05-01
A fully automated segmentation algorithm, progressive surface resolution (PSR), is presented in this paper to determine the closed surface of approximately convex blob-like structures that are common in biomedical imaging. The PSR algorithm was applied to the cortical surface segmentation of 460 vertebral bodies on 46 low-dose chest CT images, which can be potentially used for automated bone mineral density measurement and compression fracture detection. The target surface is realized by a closed triangular mesh, which thereby guarantees the enclosure. The surface vertices of the triangular mesh representation are constrained along radial trajectories that are uniformly distributed in 3D angle space. The segmentation is accomplished by determining for each radial trajectory the location of its intersection with the target surface. The surface is first initialized based on an input high-confidence boundary image and then resolved progressively based on a dynamic attraction map in an order of decreasing degree of evidence regarding the target surface location. For the visual evaluation, the algorithm achieved acceptable segmentation for 99.35% of vertebral bodies. Quantitative evaluation was performed on 46 vertebral bodies and achieved an overall mean Dice coefficient of 0.939 (max = 0.957, min = 0.906, standard deviation = 0.011) using manual annotations as the ground truth. Both visual and quantitative evaluations demonstrate encouraging performance of the PSR algorithm. This novel surface resolution strategy provides uniform angular resolution for the segmented surface with computation complexity and runtime that are linearly constrained by the total number of vertices of the triangular mesh representation.
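The radial-trajectory surface representation can be illustrated with a simple ray-marching sketch on a binary volume; the attraction map and progressive resolution of the actual PSR algorithm are not reproduced.

```python
# From a seed centroid, march along directions roughly uniform in 3D angle
# space until the boundary of a binary object is crossed.
import numpy as np

def radial_surface(mask: np.ndarray, center, n_theta=16, n_phi=32, step=0.5):
    pts = []
    for th in np.arccos(np.linspace(1, -1, n_theta)):   # uniform in cos(theta)
        for ph in np.linspace(0, 2 * np.pi, n_phi, endpoint=False):
            d = np.array([np.sin(th) * np.cos(ph),
                          np.sin(th) * np.sin(ph), np.cos(th)])
            r = 0.0
            while True:
                p = np.round(center + r * d).astype(int)
                if (p < 0).any() or (p >= mask.shape).any() or not mask[tuple(p)]:
                    break
                r += step
            pts.append(center + r * d)                  # first point outside
    return np.array(pts)

vol = np.zeros((40, 40, 40), bool)
x, y, z = np.ogrid[:40, :40, :40]
vol[(x - 20) ** 2 + (y - 20) ** 2 + (z - 20) ** 2 <= 100] = True  # radius-10 ball
pts = radial_surface(vol, np.array([20.0, 20.0, 20.0]))
print(pts.shape, np.linalg.norm(pts - 20, axis=1).mean())  # mean radius ~10
```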
Physical modeling of long-wave run-up mitigation using submerged breakwaters
NASA Astrophysics Data System (ADS)
Lee, Yu-Ting; Wu, Yun-Ta; Hwung, Hwung-Hweng; Yang, Ray-Yeng
2016-04-01
Natural hazards due to tsunami inundation inland have been viewed as a crucial issue by the coastal engineering community. The 2004 Indian Ocean tsunami and the 2011 Tohoku earthquake tsunami were caused by mega-scale earthquakes that brought tremendous catastrophe to the affected regions. It is thus of great importance to develop innovative approaches to reducing and mitigating tsunami hazards. In this study, new laboratory-scale experiments were carried out to investigate the physical process of long waves passing over submerged breakwaters built upon a mild slope. A solitary wave is employed to represent the character of a long wave of effectively infinite wavelength and period. Our goal is twofold. First, by changing the positions of a single breakwater and of multiple breakwaters upon a mild slope, the optimal breakwater locations can be identified in terms of maximum run-up reduction. Second, by using a state-of-the-art measuring technique, Bubble Image Velocimetry, which features non-intrusive, image-based measurement, the wave kinematics in the highly aerated region produced by solitary-wave shoaling, breaking, and uprush can be quantified. The mitigation of long waves by submerged breakwaters built upon a mild slope can therefore be evaluated not only by imaging run-up and run-down characteristics but also by measuring turbulent velocity fields due to wave breaking. Although the most devastating tsunami hazards cannot be fully mitigated, this study provides quantitative information on the levels of wave loading that such artificial coastal structures can withstand.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Si; Brown, Joseph N.; Tolic, Nikola
There are several notable challenges inherent to fully characterizing the entirety of the human saliva proteome using bottom-up approaches, including polymorphic isoforms, post-translational modifications, unique splice variants, deletions, and truncations. To address these challenges, we have developed a top-down liquid chromatography-mass spectrometry (LC-MS) approach, which cataloged 20 major human salivary proteins with a total of 83 proteoforms, containing a broad range of post-translational modifications. Among these proteins, several previously reported disease biomarker proteins were identified at the intact protein level, such as beta-2 microglobulin (B2M). In addition, intact glycosylated proteoforms of several saliva proteins were also characterized, including intact N-glycosylated prolactin-inducible protein (PIP) and O-glycosylated acidic proline-rich protein (aPRP). These characterized proteoforms constitute an intact saliva proteoform database, which was used for quantitative comparison of intact salivary proteoforms among six healthy individuals. Human parotid (PS) and submandibular/sublingual gland (SMSL) secretion samples (2 μg of protein each) from six healthy individuals were compared using RPLC coupled with a 12 T FTICR mass spectrometer. Significantly different protein and PTM patterns were resolved with high reproducibility between PS and SMSL glands. The results from this study provide further insight into the potential mechanisms of PTM pathways in oral glandular secretion, expanding our knowledge of this complex yet easily accessible fluid. The intact protein LC-MS approach presented herein can potentially be applied for rapid and accurate identification of biomarkers from only a few microliters of human glandular saliva.
Petersen, Esben Thade; Mouridsen, Kim; Golay, Xavier
2010-01-01
Arterial Spin Labeling (ASL) is a method to measure perfusion using magnetically labeled blood water as an endogenous tracer. Being fully non-invasive, this technique is attractive for longitudinal studies of cerebral blood flow in healthy and diseased individuals, or as a surrogate marker of metabolism. So far, ASL has been restricted mostly to specialist centers due to a generally low SNR of the method and potential issues with user-dependent analysis needed to obtain quantitative measurement of cerebral blood flow (CBF). Here, we evaluated a particular implementation of ASL (called Quantitative STAR labeling of Arterial Regions or QUASAR), a method providing user-independent quantification of CBF, in a large test-retest study across sites from around the world, dubbed "The QUASAR reproducibility study". Altogether, 28 sites located in Asia, Europe and North America participated and a total of 284 healthy volunteers were scanned. Minimal operator dependence was assured by using an automatic planning tool, and its accuracy and potential usefulness in multi-center trials was evaluated as well. Accurate repositioning between sessions was achieved with the automatic planning tool, showing mean displacements of 1.87 ± 0.95 mm and rotations of 1.56 ± 0.66 degrees. Mean gray matter CBF was 47.4 ± 7.5 [ml/100 g/min], with a between-subject standard deviation SD(b) = 5.5 [ml/100 g/min] and a within-subject standard deviation SD(w) = 4.7 [ml/100 g/min]. The corresponding repeatability was 13.0 [ml/100 g/min] and was found to be within the range of previous studies.
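A sketch of the test-retest statistics quoted above; the 2.77 factor is the Bland-Altman repeatability convention, consistent with 13.0 ≈ 2.77 × 4.7 reported here. Data are simulated.

```python
# Within- and between-subject SD from paired sessions, plus repeatability.
import numpy as np

rng = np.random.default_rng(4)
true_cbf = rng.normal(47.4, 5.5, 284)                 # between-subject spread
session1 = true_cbf + rng.normal(0, 4.7, 284)         # within-subject noise
session2 = true_cbf + rng.normal(0, 4.7, 284)

diff = session1 - session2
sd_w = diff.std(ddof=1) / np.sqrt(2)                  # within-subject SD
repeatability = 2.77 * sd_w                           # 2.77 = sqrt(2) * 1.96
print(f"SD_w = {sd_w:.1f}, repeatability = {repeatability:.1f} ml/100 g/min")
```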
Cordero, Chiara; Kiefl, Johannes; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo
2015-01-01
Modern omics disciplines dealing with food flavor focus the analytical efforts on the elucidation of sensory-active compounds, including all possible stimuli of multimodal perception (aroma, taste, texture, etc.) by means of a comprehensive, integrated treatment of sample constituents, such as physicochemical properties, concentration in the matrix, and sensory properties (odor/taste quality, perception threshold). Such analyses require detailed profiling of known bioactive components as well as advanced fingerprinting techniques to catalog sample constituents comprehensively, quantitatively, and comparably across samples. Multidimensional analytical platforms support comprehensive investigations required for flavor analysis by combining information on analytes' identities, physicochemical behaviors (volatility, polarity, partition coefficient, and solubility), concentration, and odor quality. Unlike other omics, flavor metabolomics and sensomics include the final output of the biological phenomenon (i.e., sensory perceptions) as an additional analytical dimension, which is specifically and exclusively triggered by the chemicals analyzed. However, advanced omics platforms, which are multidimensional by definition, pose challenging issues not only in terms of coupling with detection systems and sample preparation, but also in terms of data elaboration and processing. The large number of variables collected during each analytical run provides a high level of information, but requires appropriate strategies to exploit fully this potential. This review focuses on advances in comprehensive two-dimensional gas chromatography and analytical platforms combining two-dimensional gas chromatography with olfactometry, chemometrics, and quantitative assays for food sensory analysis to assess the quality of a given product. We review instrumental advances and couplings, automation in sample preparation, data elaboration, and a selection of applications.
Liu, Ang; Kozhich, Alexander; Passmore, David; Gu, Huidong; Wong, Richard; Zambito, Frank; Rangan, Vangipuram S; Myler, Heather; Aubry, Anne-Françoise; Arnold, Mark E; Wang, Jian
2015-10-01
Antibody drug conjugates (ADCs) are complex molecules composed of two pharmacologically distinct components, the cytotoxic payload and the antibody. The measurement of the payload molecules that are attached to the antibody in vivo is important for the evaluation of the safety and efficacy of ADCs, and can also provide distinct information compared to the antibody-related analytes. However, analyzing the antibody-conjugated payload is challenging and in some cases may not be feasible. The in vivo change in drug antibody ratio (DAR), due to deconjugation, biotransformation or other clearance phenomena, generates unique and additional challenges for ADC analysis in biological samples. Here, we report a novel hybrid approach with immuno-capture of the ADC, payload cleavage by specific enzyme, and LC-MS/MS of the cleaved payload to quantitatively measure the concentration of payload molecules still attached to the antibody via linker in plasma. The ADC reference material used for the calibration curve is not likely to be identical to the ADC measured in study samples due to the change in DAR distribution over the PK time course. The assay clearly demonstrated that there was no bias in the measurement of antibody-conjugated payload for ADC with varying DAR, which thus allowed accurate quantification even when the DAR distribution dynamically changes in vivo. This hybrid assay was fully validated based on a combination of requirements for both chromatographic and ligand binding methods, and was successfully applied to support a GLP safety study in monkeys. Copyright © 2015 Elsevier B.V. All rights reserved.
Food Web Designer: a flexible tool to visualize interaction networks.
Sint, Daniela; Traugott, Michael
Species are embedded in complex networks of ecological interactions, and assessing these networks provides a powerful approach to understanding the consequences of these interactions for ecosystem functioning and services. This is mandatory for developing and evaluating strategies for the management and control of pests. Graphical representations of networks can help recognize patterns that might otherwise be overlooked. However, there is a lack of software for visualizing these complex interaction networks. Food Web Designer is a stand-alone, highly flexible and user-friendly software tool to quantitatively visualize trophic and other types of bipartite and tripartite interaction networks. It is offered free of charge for use on Microsoft Windows platforms. Food Web Designer is easy to use without the need to learn a specific syntax due to its graphical user interface. Up to three (trophic) levels can be connected using links cascading from or pointing towards the taxa within each level to illustrate top-down and bottom-up connections. Link width/strength and abundance of taxa can be quantified, allowing generation of fully quantitative networks. Network datasets can be imported, saved for later adjustment, and the interaction webs can be exported as pictures for graphical display in different file formats. We show how Food Web Designer can be used to draw predator-prey and host-parasitoid food webs, demonstrating that this software is a simple and straightforward tool to graphically display interaction networks for assessing pest control or any other type of interaction in both managed and natural ecosystems from an ecological network perspective.
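A plain-matplotlib sketch of the kind of quantitative tripartite web the tool draws, with link widths proportional to interaction strength; the taxa names and strengths below are invented, and the actual Food Web Designer GUI is not reproduced.

```python
# Three levels of taxa; links drawn with width proportional to strength.
import matplotlib.pyplot as plt

levels = {0: ["plant A", "plant B"],
          1: ["herbivore A", "herbivore B", "herbivore C"],
          2: ["predator A"]}
# (level_i, taxon_i, level_j, taxon_j, strength)
links = [(0, 0, 1, 0, 3.0), (0, 1, 1, 1, 1.5), (0, 1, 1, 2, 2.0),
         (1, 0, 2, 0, 2.5), (1, 2, 2, 0, 1.0)]

pos = {(lvl, i): (i - (len(taxa) - 1) / 2, lvl)
       for lvl, taxa in levels.items() for i in range(len(taxa))}
fig, ax = plt.subplots()
for l0, i, l1, j, w in links:
    (x0, y0), (x1, y1) = pos[(l0, i)], pos[(l1, j)]
    ax.plot([x0, x1], [y0, y1], "k-", lw=w, alpha=0.6)   # width = strength
for (lvl, i), (x, y) in pos.items():
    ax.text(x, y, levels[lvl][i], ha="center", va="center",
            bbox=dict(boxstyle="round", fc="lightyellow"))
ax.set_axis_off()
plt.savefig("food_web.png")
```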
An Automated Solar Synoptic Analysis Software System
NASA Astrophysics Data System (ADS)
Hong, S.; Lee, S.; Oh, S.; Kim, J.; Lee, J.; Kim, Y.; Lee, J.; Moon, Y.; Lee, D.
2012-12-01
We have developed an automated software system for identifying solar active regions, filament channels, and coronal holes, the three major solar sources of space weather. Space weather forecasters at the NOAA Space Weather Prediction Center produce solar synoptic drawings on a daily basis to predict solar activities, i.e., solar flares, filament eruptions, high-speed solar wind streams, and co-rotating interaction regions, as well as their possible effects on the Earth. In an attempt to emulate this process in a fully automated and consistent way, we developed a software system named ASSA (Automated Solar Synoptic Analysis). When identifying solar active regions, ASSA uses high-resolution SDO HMI intensitygrams and magnetograms as inputs, providing the McIntosh classification and Mt. Wilson magnetic classification of each active region by applying appropriate image processing techniques such as thresholding, morphology extraction, and region growing. At the same time, it also extracts morphological and physical properties of active regions in a quantitative way for the short-term prediction of flares and CMEs. When identifying filament channels and coronal holes, images from the global H-alpha network and SDO AIA 193 are used for morphological identification, together with SDO HMI magnetograms for quantitative verification. The output results of ASSA are routinely checked and validated against NOAA's daily SRS (Solar Region Summary) and UCOHO (URSIgram code for coronal hole information). A couple of preliminary scientific results are presented using available outputs. ASSA will be deployed at the Korean Space Weather Center and serve its customers in an operational status by the end of 2012.
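The identification pattern named above (thresholding, morphology, region labeling) is sketched on a synthetic magnetogram below; the threshold and field values are assumptions for illustration, not ASSA's operational parameters.

```python
# Threshold |B|, clean up with morphology, then label candidate regions.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(5)
mag = rng.normal(0, 5, (256, 256))             # quiet-Sun background (assumed units)
mag[60:80, 90:115] += 400                      # synthetic strong-field patch
mag[150:165, 40:60] -= 350                     # opposite-polarity patch

strong = np.abs(mag) > 100                     # threshold on field strength
strong = ndimage.binary_closing(strong, iterations=2)   # morphology clean-up
labels, n = ndimage.label(strong)
sizes = ndimage.sum(strong, labels, index=range(1, n + 1))
print(f"{n} candidate regions, pixel areas: {sizes}")
```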
Russell, Richard A; Adams, Niall M; Stephens, David A; Batty, Elizabeth; Jensen, Kirsten; Freemont, Paul S
2009-04-22
Considerable advances in microscopy, biophysics, and cell biology have provided a wealth of imaging data describing the functional organization of the cell nucleus. Until recently, cell nuclear architecture has largely been assessed by subjective visual inspection of fluorescently labeled components imaged by the optical microscope. This approach is inadequate to fully quantify spatial associations, especially when the patterns are indistinct, irregular, or highly punctate. Accurate image processing techniques as well as statistical and computational tools are thus necessary to interpret this data if meaningful spatial-function relationships are to be established. Here, we have developed a thresholding algorithm, stable count thresholding (SCT), to segment nuclear compartments in confocal laser scanning microscopy image stacks to facilitate objective and quantitative analysis of the three-dimensional organization of these objects using formal statistical methods. We validate the efficacy and performance of the SCT algorithm using real images of immunofluorescently stained nuclear compartments and fluorescent beads as well as simulated images. In all three cases, the SCT algorithm delivers a segmentation that is far better than standard thresholding methods, and more importantly, is comparable to manual thresholding results. By applying the SCT algorithm and statistical analysis, we quantify the spatial configuration of promyelocytic leukemia nuclear bodies with respect to irregular-shaped SC35 domains. We show that the compartments are closer than expected under a null model for their spatial point distribution, and furthermore that their spatial association varies according to cell state. The methods reported are general and can readily be applied to quantify the spatial interactions of other nuclear compartments.
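The core SCT idea can be sketched as a threshold sweep that looks for a plateau in the connected-component count; this simplified 2-D version omits the validation machinery described in the paper.

```python
# Sweep intensity thresholds, count components, pick the stable plateau.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(6)
img = rng.normal(10, 2, (128, 128))                  # noisy background
for cx, cy in [(30, 40), (70, 90), (100, 20)]:       # three bright "bodies"
    y, x = np.ogrid[:128, :128]
    img += 40 * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 30)

thresholds = np.linspace(img.min(), img.max(), 100)
counts = np.array([ndimage.label(img > t)[1] for t in thresholds])

# longest run of consecutive thresholds with an unchanged count = stable plateau
best, run_start, best_len = 0, 0, 0
for i in range(1, len(counts)):
    if counts[i] != counts[run_start]:
        run_start = i
    if i - run_start > best_len:
        best_len, best = i - run_start, run_start
print(f"stable count = {counts[best]} at threshold {thresholds[best]:.1f}")
```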
Wyatt, S K; Barck, K H; Kates, L; Zavala-Solorio, J; Ross, J; Kolumam, G; Sonoda, J; Carano, R A D
2015-11-01
The ability to non-invasively measure body composition in mouse models of obesity and obesity-related disorders is essential for elucidating mechanisms of metabolic regulation and monitoring the effects of novel treatments. These studies aimed to develop a fully automated, high-throughput micro-computed tomography (micro-CT)-based image analysis technique for longitudinal quantitation of adipose, non-adipose and lean tissue as well as bone, and to demonstrate utility for assessing the effects of two distinct treatments. An initial validation study was performed in diet-induced obesity (DIO) and control mice on a vivaCT 75 micro-CT system. Subsequently, four groups of DIO mice were imaged pre- and post-treatment with an experimental agonistic anti-fibroblast growth factor receptor 1 antibody (anti-FGFR1, R1MAb1), a control immunoglobulin G antibody, a known anorectic antiobesity drug (rimonabant, SR141716), or solvent control. The body composition analysis technique was then ported to a faster micro-CT system (CT120) to markedly increase throughput as well as to evaluate the use of micro-CT image intensity for hepatic lipid content in DIO and control mice. Ex vivo chemical analysis and colorimetric analysis of the liver triglycerides were performed as the standard metrics for correlation with body composition and hepatic lipid status, respectively. Micro-CT-based body composition measures correlate with ex vivo chemical analysis metrics and enable distinction between DIO and control mice. R1MAb1 and rimonabant have differing effects on body composition as assessed by micro-CT. High-throughput body composition imaging is possible using a modified CT120 system. Micro-CT also provides a non-invasive assessment of hepatic lipid content. This work describes, validates and demonstrates the utility of a fully automated image analysis technique to quantify in vivo micro-CT-derived measures of adipose, non-adipose and lean tissue, as well as bone. These body composition metrics highly correlate with standard ex vivo chemical analysis and enable longitudinal evaluation of body composition and therapeutic efficacy monitoring.
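Intensity-window classification of the sort used for CT body composition is sketched below; the -190 to -30 HU adipose window is a common clinical-CT convention and stands in here for the paper's micro-CT calibration, which is not reproduced.

```python
# Bin voxels inside a body mask into adipose vs lean by HU range.
import numpy as np

rng = np.random.default_rng(7)
hu = np.concatenate([rng.normal(-100, 30, 4000),     # synthetic adipose voxels
                     rng.normal(40, 15, 6000)])      # synthetic lean voxels

adipose = (hu >= -190) & (hu <= -30)                 # common adipose HU window
lean = (hu > -30) & (hu <= 150)
voxel_ml = 0.001                                     # assumed voxel volume (ml)
print(f"adipose {adipose.sum() * voxel_ml:.1f} ml, "
      f"lean {lean.sum() * voxel_ml:.1f} ml")
```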