Sample records for activity quantitative estimates

  1. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    NASA Astrophysics Data System (ADS)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, various methods of fuzzy set theory, known as fuzzy mathematics, have been applied to the quantitative estimation of time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively; highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized; similarities in the temporal variation of seismic activity and seismic gaps can be examined; and the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and the other is based on the fuzzy equivalence relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards over different time scales can be estimated. This paper mainly deals with medium- and short-term precursors observed in Japan and China.
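
A minimal, generic sketch of fuzzy clustering via a fuzzy equivalence relation, one of the techniques named in this abstract: a symmetric, reflexive fuzzy similarity relation is made transitive by repeated max-min self-composition, and a lambda-cut of the closure then partitions the items (here, hypothetical seismicity indices) into clusters. All names and values are illustrative, not from the paper.

```python
def maxmin_compose(R, S):
    """Max-min composition of two fuzzy relations given as square lists."""
    n = len(R)
    return [[max(min(R[i][k], S[k][j]) for k in range(n)) for j in range(n)]
            for i in range(n)]

def transitive_closure(R):
    """Repeated max-min self-composition until R is max-min transitive,
    turning a fuzzy similarity relation into a fuzzy equivalence relation."""
    while True:
        R2 = maxmin_compose(R, R)
        if R2 == R:
            return R
        R = R2

def lambda_cut_clusters(R, lam):
    """Partition indices by the crisp relation R[i][j] >= lam."""
    n = len(R)
    clusters, seen = [], set()
    for i in range(n):
        if i in seen:
            continue
        group = [j for j in range(n) if R[i][j] >= lam]
        seen.update(group)
        clusters.append(group)
    return clusters
```

Raising the lambda threshold splits the data into finer clusters, which is how a hierarchy of seismicity groupings can be read off a single closure.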

  2. A method for energy window optimization for quantitative tasks that includes the effects of model-mismatch on bias: application to Y-90 bremsstrahlung SPECT imaging.

    PubMed

    Rong, Xing; Du, Yong; Frey, Eric C

    2012-06-21

    Quantitative Yttrium-90 ((90)Y) bremsstrahlung single photon emission computed tomography (SPECT) imaging has shown great potential to provide reliable estimates of (90)Y activity distribution for targeted radionuclide therapy dosimetry applications. One factor that potentially affects the reliability of the activity estimates is the choice of the acquisition energy window. In contrast to imaging conventional gamma photon emitters, where the acquisition energy windows are usually placed around photopeaks, there has been great variation in the choice of the acquisition energy window for (90)Y imaging due to the continuous and broad energy distribution of the bremsstrahlung photons. In quantitative imaging of conventional gamma photon emitters, previous methods for optimizing the acquisition energy window assumed unbiased estimators and used the variance in the estimates as a figure of merit (FOM). However, in situations such as (90)Y imaging, where there are errors in the model of the image formation process used in reconstruction, there will be bias in the activity estimates. In (90)Y bremsstrahlung imaging this is especially important due to the high levels of scatter, multiple scatter, and collimator septal penetration and scatter. Variance alone is therefore not a complete measure of the reliability of the estimates, and hence not a complete FOM. To address this, we first aimed to develop a new method to optimize the energy window that accounts for both the bias due to model-mismatch and the variance of the activity estimates. We applied this method to optimize the acquisition energy window for quantitative (90)Y bremsstrahlung SPECT imaging in microsphere brachytherapy. Since absorbed dose is defined as the energy absorbed from the radiation per unit mass of tissue, in this new method we proposed a mass-weighted root mean squared error of the volume of interest (VOI) activity estimates as the FOM.
To calculate this FOM, two analytical expressions were derived for calculating the bias due to model-mismatch and the variance of the VOI activity estimates, respectively. To obtain the optimal acquisition energy window for general situations of interest in clinical (90)Y microsphere imaging, we generated phantoms with multiple tumors of various sizes and various tumor-to-normal activity concentration ratios using a digital phantom that realistically simulates human anatomy, simulated (90)Y microsphere imaging with a clinical SPECT system and typical imaging parameters using a previously validated Monte Carlo simulation code, and used a previously proposed method for modeling the image degrading effects in quantitative SPECT reconstruction. The obtained optimal acquisition energy window was 100-160 keV. The values of the proposed FOM were much larger than the FOM taking into account only the variance of the activity estimates, thus demonstrating in our experiment that the bias of the activity estimates due to model-mismatch was a more important factor than the variance in terms of limiting the reliability of activity estimates.
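
The mass-weighted FOM described above can be sketched as follows. This is an illustrative reading of the abstract, not the authors' implementation: per-VOI bias and variance (which the paper derives analytically) are taken as inputs, combined as bias-squared plus variance, and weighted by VOI mass so that the FOM tracks absorbed-dose error.

```python
import math

def mass_weighted_rmse(biases, variances, masses):
    """Mass-weighted root mean squared error over VOI activity estimates.

    Combines the bias (e.g. from model-mismatch) and the variance of each
    VOI estimate into a single figure of merit, weighting each VOI by its
    mass. Sketch only; the analytical bias/variance expressions from the
    paper are assumed to be supplied by the caller.
    """
    total_mass = sum(masses)
    weighted = sum(m * (b * b + v) for b, v, m in zip(biases, variances, masses))
    return math.sqrt(weighted / total_mass)
```

Evaluating this FOM across candidate energy windows and picking the minimum mirrors the optimization the abstract describes.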

  3. APPLICATION OF RADIOISOTOPES TO THE QUANTITATIVE CHROMATOGRAPHY OF FATTY ACIDS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budzynski, A.Z.; Zubrzycki, Z.J.; Campbell, I.G.

    1959-10-31

    The paper reports work on the use of I-131, Zn-65, Sr-90, Zr-95 and Ce-144 for the quantitative estimation of fatty acids on paper chromatograms, and for determination of the degree of unsaturation of components of resolved fatty acid mixtures. I-131 is used to iodinate unsaturated fatty acids, and the amount of such acids is determined from the radiochromatogram. The degree of unsaturation of fatty acids is determined by estimation of the specific activity of spots. The other isotopes have been examined from the point of view of their suitability for estimation of total amounts of fatty acids by formation of insoluble radioactive soaps held on the chromatogram. In particular, work is reported on the quantitative estimation of saturated fatty acids by measurement of the activity of their insoluble soaps with radioactive metals. Various quantitative relationships are described between the amount of fatty acid in a spot and such parameters as radiometrically estimated spot length, width, maximum intensity, and integrated spot activity. A convenient detection apparatus for taking radiochromatograms is also described. In conjunction with conventional chromatographic methods for resolving fatty acids, the method permits estimation of the composition of fatty acid mixtures obtained from biological material. (auth)

  4. Reconstructing Dynamic Promoter Activity Profiles from Reporter Gene Data.

    PubMed

    Kannan, Soumya; Sams, Thomas; Maury, Jérôme; Workman, Christopher T

    2018-03-16

    Accurate characterization of promoter activity is important when designing expression systems for systems biology and metabolic engineering applications. Promoters that respond to changes in the environment enable the dynamic control of gene expression without the need for inducer compounds, for example. However, the dynamic nature of these processes poses challenges for estimating promoter activity. Most experimental approaches use reporter gene expression to estimate promoter activity. Typically, the reporter gene encodes a fluorescent protein that is used to infer a constant promoter activity, despite the fact that the observed output may be dynamic and is a number of steps removed from the transcription process. In fact, some promoters that are often thought of as constitutive can show changes in activity when growth conditions change. For these reasons, we have developed a system of ordinary differential equations for estimating the dynamic activity of promoters that respond to the environment, one that is robust to noise and to changes in growth rate. Our approach, inference of dynamic promoter activity (PromAct), improves on existing methods by more accurately inferring known promoter activity profiles. This method is also capable of estimating the correct scale of promoter activity and can be applied to quantitative data sets to estimate quantitative rates.
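
A minimal sketch of the kind of forward model that methods like PromAct invert: promoter activity drives mRNA synthesis, and the reporter protein lags behind both transcription and dilution by growth. Parameter names and the Euler integration are illustrative assumptions; PromAct itself solves the harder inverse problem (inferring the activity profile from the observed fluorescence), which this sketch does not attempt.

```python
def simulate_reporter(promoter_activity, mu, d_m, k_p, d_p, dt=0.01, t_end=10.0):
    """Forward-simulate reporter protein from a promoter activity profile.

    dm/dt = a(t) - (mu + d_m) * m      # mRNA: synthesis minus dilution/decay
    dp/dt = k_p * m - (mu + d_p) * p   # protein: translation minus dilution/decay

    mu is the growth (dilution) rate; d_m, d_p are degradation rates;
    k_p is the translation rate. Simple forward-Euler integration.
    Returns a list of (time, protein) samples.
    """
    m = p = 0.0
    t = 0.0
    out = []
    while t < t_end:
        a = promoter_activity(t)
        m += dt * (a - (mu + d_m) * m)
        p += dt * (k_p * m - (mu + d_p) * p)
        t += dt
        out.append((t, p))
    return out
```

For a constant activity a, the protein settles at k_p * a / ((mu + d_m) * (mu + d_p)), which shows why a static reporter readout conflates promoter activity with growth rate.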

  5. ESTIMATION OF MICROBIAL REDUCTIVE TRANSFORMATION RATES FOR CHLORINATED BENZENES AND PHENOLS USING A QUANTITATIVE STRUCTURE-ACTIVITY RELATIONSHIP APPROACH

    EPA Science Inventory

    A set of literature data was used to derive several quantitative structure-activity relationships (QSARs) to predict the rate constants for the microbial reductive dehalogenation of chlorinated aromatics. Dechlorination rate constants for 25 chloroaromatics were corrected for th...

  6. Toxicity Estimation Software Tool (TEST)

    EPA Science Inventory

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...

  7. Household physical activity and cancer risk: a systematic review and dose-response meta-analysis of epidemiological studies

    PubMed Central

    Shi, Yun; Li, Tingting; Wang, Ying; Zhou, Lingling; Qin, Qin; Yin, Jieyun; Wei, Sheng; Liu, Li; Nie, Shaofa

    2015-01-01

    Previous epidemiological studies have reported controversial results on the association between household physical activity and cancer risk. We conducted a meta-analysis to investigate the relationship between household physical activity and cancer risk quantitatively, especially in a dose-response manner. PubMed, Embase, Web of Science and the Cochrane Library were searched for cohort or case-control studies that examined the association between household physical activity and cancer risks. Random-effects models were used to estimate the summary relative risks (RRs), and nonlinear or linear dose-response meta-analyses were performed to estimate the trend from the correlated log RR estimates across levels of household physical activity. In total, 30 studies including 41 comparisons met the inclusion criteria. Total cancer risk was reduced by 16% among people with the highest household physical activity compared to those with the lowest (RR = 0.84, 95% CI = 0.76-0.93). The dose-response analyses indicated an inverse linear association between household physical activity and cancer risk. The relative risk was 0.98 (95% CI = 0.97-1.00) per additional 10 MET-hours/week and 0.99 (95% CI = 0.98-0.99) per 1 hour/week increase. These findings provide quantitative evidence that household physical activity is associated with decreased cancer risk in a dose-response manner. PMID:26443426
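
The random-effects pooling named in this abstract can be sketched with the standard DerSimonian-Laird estimator: study-level log relative risks are weighted by inverse variance plus a between-study heterogeneity term tau-squared. This is a generic textbook implementation, not the authors' analysis code, and the inputs below are invented.

```python
import math

def pooled_rr_random_effects(rrs, ses):
    """DerSimonian-Laird random-effects pooled relative risk.

    rrs: per-study relative risks; ses: standard errors of log(RR).
    Returns the pooled RR and a 95% confidence interval.
    """
    y = [math.log(r) for r in rrs]
    w = [1.0 / s ** 2 for s in ses]
    sw = sum(w)
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sw
    # Cochran's Q and the method-of-moments between-study variance tau^2
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
    df = len(y) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0
    # Re-weight with tau^2 added to each study's variance
    w_star = [1.0 / (s ** 2 + tau2) for s in ses]
    y_re = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se_re = math.sqrt(1.0 / sum(w_star))
    lo, hi = math.exp(y_re - 1.96 * se_re), math.exp(y_re + 1.96 * se_re)
    return math.exp(y_re), (lo, hi)
```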

  8. The effect of respiratory induced density variations on non-TOF PET quantitation in the lung.

    PubMed

    Holman, Beverley F; Cuplov, Vesna; Hutton, Brian F; Groves, Ashley M; Thielemans, Kris

    2016-04-21

    Accurate PET quantitation requires a matched attenuation map. Obtaining matched CT attenuation maps in the thorax is difficult due to the respiratory cycle which causes both motion and density changes. Unlike with motion, little attention has been given to the effects of density changes in the lung on PET quantitation. This work aims to explore the extent of the errors caused by pulmonary density attenuation map mismatch on dynamic and static parameter estimates. Dynamic XCAT phantoms were utilised using clinically relevant (18)F-FDG and (18)F-FMISO time activity curves for all organs within the thorax to estimate the expected parameter errors. The simulations were then validated with PET data from 5 patients suffering from idiopathic pulmonary fibrosis who underwent PET/Cine-CT. The PET data were reconstructed with three gates obtained from the Cine-CT and the average Cine-CT. The lung TACs clearly displayed differences between true and measured curves with error depending on global activity distribution at the time of measurement. The density errors from using a mismatched attenuation map were found to have a considerable impact on PET quantitative accuracy. Maximum errors due to density mismatch were found to be as high as 25% in the XCAT simulation. Differences in patient derived kinetic parameter estimates and static concentration between the extreme gates were found to be as high as 31% and 14%, respectively. Overall our results show that respiratory associated density errors in the attenuation map affect quantitation throughout the lung, not just regions near boundaries. The extent of this error is dependent on the activity distribution in the thorax and hence on the tracer and time of acquisition. Consequently there may be a significant impact on estimated kinetic parameters throughout the lung.
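
The core mechanism in this abstract, a density mismatch between the attenuation map and the lung at acquisition time, can be illustrated with a back-of-envelope calculation: attenuation correction scales the data by exp(mu * d), so using the wrong lung density rescales the recovered activity by exp((mu_map - mu_true) * d). The attenuation coefficient, densities, and path length below are illustrative assumptions, not values from the paper.

```python
import math

# Approximate linear attenuation coefficient of water for 511 keV photons
MU_WATER_511 = 0.096  # cm^-1

def activity_error_from_density_mismatch(true_density, map_density, path_cm):
    """Fractional error in attenuation-corrected activity when the CT
    attenuation map assumes the wrong lung density (g/mL).

    Scales mu linearly with density and compares the correction factors
    along a single path of length path_cm. Toy model, ignoring scatter
    and the spatial distribution of activity.
    """
    mu_true = MU_WATER_511 * true_density
    mu_map = MU_WATER_511 * map_density
    recovered_over_true = math.exp((mu_map - mu_true) * path_cm)
    return recovered_over_true - 1.0
```

Even a 0.1 g/mL density mismatch over a 20 cm path gives an error in the tens of percent range, consistent in magnitude with the mismatch-driven errors reported here.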

  9. The effect of respiratory induced density variations on non-TOF PET quantitation in the lung

    NASA Astrophysics Data System (ADS)

    Holman, Beverley F.; Cuplov, Vesna; Hutton, Brian F.; Groves, Ashley M.; Thielemans, Kris

    2016-04-01

    Accurate PET quantitation requires a matched attenuation map. Obtaining matched CT attenuation maps in the thorax is difficult due to the respiratory cycle which causes both motion and density changes. Unlike with motion, little attention has been given to the effects of density changes in the lung on PET quantitation. This work aims to explore the extent of the errors caused by pulmonary density attenuation map mismatch on dynamic and static parameter estimates. Dynamic XCAT phantoms were utilised using clinically relevant 18F-FDG and 18F-FMISO time activity curves for all organs within the thorax to estimate the expected parameter errors. The simulations were then validated with PET data from 5 patients suffering from idiopathic pulmonary fibrosis who underwent PET/Cine-CT. The PET data were reconstructed with three gates obtained from the Cine-CT and the average Cine-CT. The lung TACs clearly displayed differences between true and measured curves with error depending on global activity distribution at the time of measurement. The density errors from using a mismatched attenuation map were found to have a considerable impact on PET quantitative accuracy. Maximum errors due to density mismatch were found to be as high as 25% in the XCAT simulation. Differences in patient derived kinetic parameter estimates and static concentration between the extreme gates were found to be as high as 31% and 14%, respectively. Overall our results show that respiratory associated density errors in the attenuation map affect quantitation throughout the lung, not just regions near boundaries. The extent of this error is dependent on the activity distribution in the thorax and hence on the tracer and time of acquisition. Consequently there may be a significant impact on estimated kinetic parameters throughout the lung.

  10. Biomarkers and Surrogate Endpoints in Uveitis: The Impact of Quantitative Imaging.

    PubMed

    Denniston, Alastair K; Keane, Pearse A; Srivastava, Sunil K

    2017-05-01

    Uveitis is a major cause of sight loss across the world. The reliable assessment of intraocular inflammation in uveitis ('disease activity') is essential in order to score disease severity and response to treatment. In this review, we describe how 'quantitative imaging', the approach of using automated analysis and measurement algorithms across both standard and emerging imaging modalities, can develop objective instrument-based measures of disease activity. This is a narrative review based on searches of the current world literature using terms related to quantitative imaging techniques in uveitis, supplemented by clinical trial registry data, and expert knowledge of surrogate endpoints and outcome measures in ophthalmology. Current measures of disease activity are largely based on subjective clinical estimation, and are relatively insensitive, with poor discrimination and reliability. The development of quantitative imaging in uveitis is most established in the use of optical coherence tomographic (OCT) measurement of central macular thickness (CMT) to measure severity of macular edema (ME). The transformative effect of CMT in clinical assessment of patients with ME provides a paradigm for the development and impact of other forms of quantitative imaging. Quantitative imaging approaches are now being developed and validated for other key inflammatory parameters such as anterior chamber cells, vitreous haze, retinovascular leakage, and chorioretinal infiltrates. As new forms of quantitative imaging in uveitis are proposed, the uveitis community will need to evaluate these tools against the current subjective clinical estimates and reach a new consensus for how disease activity in uveitis should be measured. The development, validation, and adoption of sensitive and discriminatory measures of disease activity is an unmet need that has the potential to transform both drug development and routine clinical care for the patient with uveitis.

  11. Obscure phenomena in statistical analysis of quantitative structure-activity relationships. Part 1: Multicollinearity of physicochemical descriptors.

    PubMed

    Mager, P P; Rothe, H

    1990-10-01

    Multicollinearity of physicochemical descriptors leads to serious consequences in quantitative structure-activity relationship (QSAR) analysis, such as incorrect estimators and test statistics for the regression coefficients of the ordinary least-squares (OLS) model usually applied to QSARs. Besides the diagnosis of known simple collinearity, principal component regression analysis (PCRA) also allows the diagnosis of various types of multicollinearity. Only if the absolute values of the PCRA estimators are order statistics that decrease monotonically can the effects of multicollinearity be circumvented. Otherwise, obscure phenomena may be observed, such as good data recognition but low predictive power of a QSAR model.
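
A common way to surface the multicollinearity this abstract warns about is to compute condition indices of the standardized descriptor matrix: the same singular-value decomposition underlies PCRA, which regresses on the leading principal components rather than the raw descriptors. A minimal sketch, not the authors' procedure; the rule-of-thumb threshold of 30 is a conventional value, not from the paper.

```python
import numpy as np

def collinearity_diagnostics(X):
    """Condition indices of a descriptor matrix (rows = compounds,
    columns = physicochemical descriptors).

    Columns are standardized, then the ratio of the largest singular
    value to each singular value is returned; indices much greater
    than ~30 flag near-collinear descriptor combinations.
    """
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=0)
    s = np.linalg.svd(Z, compute_uv=False)
    return s.max() / s
```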

  12. Prediction of Solvent Physical Properties using the Hierarchical Clustering Method

    EPA Science Inventory

    Recently a QSAR (Quantitative Structure Activity Relationship) method, the hierarchical clustering method, was developed to estimate acute toxicity values for large, diverse datasets. This methodology has now been applied to estimate solvent physical properties including sur...

  13. A quantitative reconstruction software suite for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Namías, Mauro; Jeraj, Robert

    2017-11-01

    Quantitative Single Photon Emission Computed Tomography (SPECT) imaging allows for measurement of activity concentrations of a given radiotracer in vivo. Although SPECT has usually been perceived as non-quantitative by the medical community, the introduction of accurate CT-based attenuation correction and scatter correction in hybrid SPECT/CT scanners has enabled SPECT systems to be as quantitative as Positron Emission Tomography (PET) systems. We implemented a software suite to reconstruct quantitative SPECT images from hybrid or dedicated SPECT systems with a separate CT scanner. Attenuation, scatter and collimator response corrections were included in an Ordered Subset Expectation Maximization (OSEM) algorithm. A novel scatter fraction estimation technique was introduced. The SPECT/CT system was calibrated with a cylindrical phantom, and quantitative accuracy was assessed with an anthropomorphic phantom and a NEMA/IEC image quality phantom. Accurate activity measurements were achieved at the organ level. This software suite helps increase the quantitative accuracy of SPECT scanners.
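
The phantom calibration step mentioned above reduces to a simple conversion: a uniform cylinder of known activity concentration yields a factor in counts per second per (MBq/mL), which then converts reconstructed voxel count rates into activity concentrations. A minimal sketch with invented numbers, assuming corrections have already been applied.

```python
def spect_calibration_factor(count_rate_cps, activity_conc_mbq_per_ml):
    """Calibration factor from a uniform cylindrical phantom:
    counts per second per (MBq/mL), after attenuation/scatter/collimator
    corrections have been applied."""
    return count_rate_cps / activity_conc_mbq_per_ml

def to_activity_concentration(voxel_cps, cal_factor):
    """Convert a corrected voxel count rate to MBq/mL."""
    return voxel_cps / cal_factor
```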

  14. Overview of T.E.S.T. (Toxicity Estimation Software Tool)

    EPA Science Inventory

    This talk provides an overview of T.E.S.T. (Toxicity Estimation Software Tool). T.E.S.T. predicts toxicity values and physical properties using a variety of different QSAR (quantitative structure activity relationship) approaches including hierarchical clustering, group contribut...

  15. Estimating the Potential Toxicity of Chemicals Associated with Hydraulic Fracturing Operations Using Quantitative Structure-Activity Relationship Modeling

    EPA Pesticide Factsheets

    Researchers facilitated evaluation of chemicals that lack chronic oral toxicity values using a QSAR model to develop estimates of potential toxicity for chemicals used in HF fluids or found in flowback or produced water

  16. The impact of 3D volume of interest definition on accuracy and precision of activity estimation in quantitative SPECT and planar processing methods

    NASA Astrophysics Data System (ADS)

    He, Bin; Frey, Eric C.

    2010-06-01

    Accurate and precise estimation of organ activities is essential for treatment planning in targeted radionuclide therapy. We have previously evaluated the impact of processing methodology, statistical noise and variability in activity distribution and anatomy on the accuracy and precision of organ activity estimates obtained with quantitative SPECT (QSPECT) and planar (QPlanar) processing. Another important factor impacting the accuracy and precision of organ activity estimates is the accuracy of and variability in the definition of organ regions of interest (ROI) or volumes of interest (VOI). The goal of this work was thus to systematically study the effects of VOI definition on the reliability of activity estimates. To this end, we performed Monte Carlo simulation studies using randomly perturbed and shifted VOIs to assess the impact on organ activity estimates. The 3D NCAT phantom was used with activities that modeled clinically observed 111In ibritumomab tiuxetan distributions. In order to study the errors resulting from misdefinitions due to manual segmentation errors, VOIs of the liver and left kidney were first manually defined. Each control point was then randomly perturbed to one of the nearest or next-nearest voxels in three ways: with no directional bias, with inward bias, or with outward bias, resulting in random perturbation, erosion or dilation, respectively, of the VOIs. In order to study the errors resulting from the misregistration of VOIs, as would happen, e.g., in the case where the VOIs were defined using a misregistered anatomical image, the reconstructed SPECT images or projections were shifted by amounts ranging from -1 to 1 voxels in increments of 0.1 voxels in both the transaxial and axial directions. The activity estimates from the shifted reconstructions or projections were compared to those from the originals, and average errors were computed for the QSPECT and QPlanar methods, respectively.
    For misregistration, errors in organ activity estimates were linear in the shift for both the QSPECT and QPlanar methods. QPlanar was less sensitive to object definition perturbations than QSPECT, especially for the dilation and erosion cases. Up to 1 voxel of misregistration or misdefinition resulted in up to 8% error in organ activity estimates, with the largest errors for small or low-uptake organs. Both types of VOI definition error produced larger errors in activity estimates for a small, low-uptake organ (-7.5% to 5.3% for the left kidney) than for a large, high-uptake organ (-2.9% to 2.1% for the liver). We observed that misregistration generally had larger effects than misdefinition, with errors ranging from -7.2% to 8.4%. The different imaging methods evaluated responded differently to the errors from misregistration and misdefinition. We found that QSPECT was more sensitive to misdefinition errors, but less sensitive to misregistration errors, compared to the QPlanar method. Thus, sensitivity to VOI definition errors should be an important criterion in evaluating quantitative imaging methods.
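
The misregistration experiment described above can be sketched on a toy 2D image: shift a boolean VOI mask by an integer number of voxels and compare the summed activity inside the shifted mask against the original. The phantom values are invented for illustration; the study itself used subvoxel shifts on 3D NCAT reconstructions.

```python
import numpy as np

def voi_activity(image, mask):
    """Total counts inside a boolean VOI mask."""
    return float(image[mask].sum())

def misregistration_error(image, mask, shift):
    """Relative error in the VOI activity estimate when the mask is
    misregistered by an integer voxel shift (one offset per axis)."""
    shifted = np.roll(mask, shift, axis=tuple(range(mask.ndim)))
    true = voi_activity(image, mask)
    return (voi_activity(image, shifted) - true) / true
```

Shifting the mask off a small hot region loses a whole row of hot voxels at once, which is why small organs show the largest misregistration errors.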

  17. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  18. Effects of finite spatial resolution on quantitative CBF images from dynamic PET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phelps, M.E.; Huang, S.C.; Mahoney, D.K.

    1985-05-01

    The finite spatial resolution of PET causes the time-activity responses of pixels around the boundaries between gray and white matter regions to contain kinetic components from tissues with different CBFs. CBF values estimated from the kinetics of such mixtures are underestimated because of the nonlinear relationship between the time-activity response and the estimated CBF. Computer simulation is used to investigate these effects on phantoms of circular structures and a realistic brain slice, in terms of object size and quantitative CBF values. The calculated CBF image is compared to the case of resolution loss alone. Results show that the size of a high flow region in the CBF image is decreased, while that of a low flow region is increased. For brain phantoms, the qualitative appearance of CBF images is not seriously affected, but the estimated CBFs are underestimated by 11 to 16 percent in local gray matter regions (of size 1 cm²), with about a 14 percent reduction in global CBF over the whole slice. It is concluded that the combined effect of finite spatial resolution and the nonlinearity in estimating CBF from dynamic PET is quite significant and must be considered in processing and interpreting quantitative CBF images.

  19. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. 
We evaluated this combined method in terms of quantitative accuracy using the realistic 3D NCAT phantom and an activity distribution obtained from patient studies. We compared the accuracy of organ activity estimates in images reconstructed with and without addition of downscatter compensation from projections with and without downscatter contamination. Results: We observed that the proposed method provided substantial improvements in accuracy compared to no downscatter compensation and had accuracies comparable to reconstructions from projections without downscatter contamination. Conclusions: The results demonstrate that the proposed model-based downscatter compensation method is effective and may have a role in quantitative 131I imaging. PMID:21815394

  20. Absolute activity quantitation from projections using an analytical approach: comparison with iterative methods in Tc-99m and I-123 brain SPECT

    NASA Astrophysics Data System (ADS)

    Fakhri, G. El; Kijewski, M. F.; Moore, S. C.

    2001-06-01

    Estimates of SPECT activity within certain deep brain structures could be useful for clinical tasks such as early prediction of Alzheimer's disease with Tc-99m or Parkinson's disease with I-123; however, such estimates are biased by poor spatial resolution and inaccurate scatter and attenuation corrections. We compared an analytical approach (AA) to more accurate quantitation with a slower iterative approach (IA). Monte Carlo simulated projections of 12 normal and 12 pathologic Tc-99m perfusion studies, as well as 12 normal and 12 pathologic I-123 neurotransmission studies, were generated using a digital brain phantom and corrected for scatter by a multispectral fitting procedure. The AA included attenuation correction by a modified Metz-Fan algorithm and activity estimation by a technique that incorporated Metz filtering to compensate for variable collimator response (VCR); the IA modeled attenuation and VCR in the projector/backprojector of an ordered subsets-expectation maximization (OSEM) algorithm. Bias and standard deviation over the 12 normal and 12 pathologic patients were calculated with respect to the reference values in the corpus callosum, caudate nucleus, and putamen. The IA and AA yielded similar quantitation results in both Tc-99m and I-123 studies in all brain structures considered, in both normal and pathologic patients. The bias with respect to the reference activity distributions was less than 7% for Tc-99m studies, but greater than 30% for I-123 studies, due to the partial volume effect in the striata. Our results were validated using I-123 physical acquisitions of an anthropomorphic brain phantom. The AA yielded quantitation accuracy comparable to that obtained with the IA, while requiring much less processing time. However, in most conditions, the IA yielded lower noise for the same bias than did the AA.

  1. Quantitative Comparison of PET and Bremsstrahlung SPECT for Imaging the In Vivo Yttrium-90 Microsphere Distribution after Liver Radioembolization

    PubMed Central

    Elschot, Mattijs; Vermolen, Bart J.; Lam, Marnix G. E. H.; de Keizer, Bart; van den Bosch, Maurice A. A. J.; de Jong, Hugo W. A. M.

    2013-01-01

Background After yttrium-90 (90Y) microsphere radioembolization (RE), evaluation of extrahepatic activity and liver dosimetry is typically performed on 90Y Bremsstrahlung SPECT images. Since these images demonstrate low quantitative accuracy, 90Y PET has been suggested as an alternative. The aim of this study is to quantitatively compare SPECT and state-of-the-art PET on the ability to detect small accumulations of 90Y and on the accuracy of liver dosimetry. Methodology/Principal Findings SPECT/CT and PET/CT phantom data were acquired using several acquisition and reconstruction protocols, including resolution recovery and Time-Of-Flight (TOF) PET. Image contrast and noise were compared using a torso-shaped phantom containing six hot spheres of various sizes. The ability to detect extra- and intrahepatic accumulations of activity was tested by quantitative evaluation of the visibility and unique detectability of the phantom hot spheres. Image-based dose estimates of the phantom were compared to the true dose. For clinical illustration, the SPECT- and PET-based estimated liver dose distributions of five RE patients were compared. At equal noise level, PET showed higher contrast recovery coefficients than SPECT. The highest contrast recovery coefficients were obtained with TOF PET reconstruction including resolution recovery. All six spheres were consistently visible on SPECT and PET images, but PET was able to uniquely detect smaller spheres than SPECT. TOF PET-based estimates of the dose in the phantom spheres were more accurate than SPECT-based dose estimates, with underestimations ranging from 45% (10-mm sphere) to 11% (37-mm sphere) for PET, and from 75% to 58% for SPECT. The differences between TOF PET and SPECT dose estimates were supported by the patient data.
Conclusions/Significance In this study we quantitatively demonstrated that the image quality of state-of-the-art PET is superior to Bremsstrahlung SPECT for the assessment of the 90Y microsphere distribution after radioembolization. PMID:23405207
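The contrast recovery coefficient compared in this phantom study is conventionally the measured sphere-to-background contrast divided by the true contrast. A minimal sketch with invented phantom readings (the 8:1 ratio and the readings are hypothetical, chosen only for illustration):

```python
def contrast_recovery(sphere_mean, bkg_mean, true_ratio):
    """CRC = (measured sphere/background ratio - 1) / (true ratio - 1)."""
    return (sphere_mean / bkg_mean - 1.0) / (true_ratio - 1.0)

# Hypothetical ROI means: sphere 60, background 10, true activity
# concentration ratio 8:1 -> CRC of 5/7, i.e. about 0.71.
crc = contrast_recovery(60.0, 10.0, 8.0)
```

A CRC of 1.0 would mean perfect recovery; partial volume effects pull it below 1 for small spheres.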

  2. ESTIMATION OF CHEMICAL SPECIFIC PARAMETERS WITHIN PHYSIOLOGICALLY BASED PHARMACOKINETIC/PHARMACODYNAMIC MODELS

    EPA Science Inventory

    While relationships between chemical structure and observed properties or activities (QSAR - quantitative structure activity relationship) can be used to predict the behavior of unknown chemicals, this method is semiempirical in nature relying on high quality experimental data to...

  3. 3D-quantitative structure-activity relationship studies on benzothiadiazepine hydroxamates as inhibitors of tumor necrosis factor-alpha converting enzyme.

    PubMed

    Murumkar, Prashant R; Giridhar, Rajani; Yadav, Mange Ram

    2008-04-01

    A set of 29 benzothiadiazepine hydroxamates having selective tumor necrosis factor-alpha converting enzyme inhibitory activity were used to compare the quality and predictive power of 3D-quantitative structure-activity relationship, comparative molecular field analysis, and comparative molecular similarity indices models for the atom-based, centroid/atom-based, data-based, and docked conformer-based alignment. Removal of two outliers from the initial training set of molecules improved the predictivity of models. Among the 3D-quantitative structure-activity relationship models developed using the above four alignments, the database alignment provided the optimal predictive comparative molecular field analysis model for the training set with cross-validated r(2) (q(2)) = 0.510, non-cross-validated r(2) = 0.972, standard error of estimates (s) = 0.098, and F = 215.44 and the optimal comparative molecular similarity indices model with cross-validated r(2) (q(2)) = 0.556, non-cross-validated r(2) = 0.946, standard error of estimates (s) = 0.163, and F = 99.785. These models also showed the best test set prediction for six compounds with predictive r(2) values of 0.460 and 0.535, respectively. The contour maps obtained from 3D-quantitative structure-activity relationship studies were appraised for activity trends for the molecules analyzed. The comparative molecular similarity indices models exhibited good external predictivity as compared with that of comparative molecular field analysis models. The data generated from the present study helped us to further design and report some novel and potent tumor necrosis factor-alpha converting enzyme inhibitors.
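The cross-validated r(2), written q(2) in this record, is typically computed leave-one-out as q² = 1 − PRESS/SS, where PRESS sums squared prediction errors on held-out compounds. A toy sketch for a univariate linear model with invented descriptor/activity data (real CoMFA/CoMSIA models use many field descriptors and PLS, not this simple fit):

```python
import numpy as np

def loo_q2(x, y):
    """Leave-one-out cross-validated q^2 for a univariate linear model."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    press = 0.0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i          # hold out compound i
        slope, intercept = np.polyfit(x[mask], y[mask], 1)
        press += (y[i] - (slope * x[i] + intercept)) ** 2
    ss = ((y - y.mean()) ** 2).sum()
    return 1.0 - press / ss

# Toy descriptor values vs. activities (hypothetical numbers).
q2 = loo_q2([1, 2, 3, 4, 5], [1.1, 1.9, 3.2, 3.8, 5.1])
```

Values of q² above roughly 0.5, as in the models reported here, are usually taken as evidence of internal predictivity.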

  4. Improved dose-volume histogram estimates for radiopharmaceutical therapy by optimizing quantitative SPECT reconstruction parameters

    NASA Astrophysics Data System (ADS)

    Cheng, Lishui; Hobbs, Robert F.; Segars, Paul W.; Sgouros, George; Frey, Eric C.

    2013-06-01

    In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose-volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVHs estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator-detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. 
Smaller objects with fine details, such as the kidneys, required more iterations and less smoothing at early time points post-radiopharmaceutical administration but more smoothing and fewer iterations at later time points when the total organ activity was lower. The results of this study demonstrate the importance of using optimal reconstruction and regularization parameters. Optimal results were obtained with different parameters at each time point, but using a single set of parameters for all time points produced near-optimal dose-volume histograms.
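A cumulative DVH of the kind optimized in this study reports, for each dose level, the fraction of the organ volume receiving at least that dose. A minimal sketch with invented voxel doses (real use would take the dose-rate volume produced by the voxel S kernel convolution described above):

```python
import numpy as np

def cumulative_dvh(dose_voxels, dose_bins):
    """Fraction of the volume receiving at least each bin's dose."""
    d = np.asarray(dose_voxels, float)
    return np.array([(d >= b).mean() for b in dose_bins])

# Hypothetical voxel doses (Gy) in one organ, evaluated at four dose levels.
dvh = cumulative_dvh([1, 2, 2, 3, 4, 5, 5, 8], [0, 2, 4, 6])
```

Noise and partial volume effects distort the estimated voxel doses, which is why the histogram degrades with suboptimal reconstruction parameters.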

  5. Alzheimer disease: Quantitative analysis of I-123-iodoamphetamine SPECT brain imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hellman, R.S.; Tikofsky, R.S.; Collier, B.D.

    1989-07-01

To enable a more quantitative diagnosis of senile dementia of the Alzheimer type (SDAT), the authors developed and tested a semiautomated method to define regions of interest (ROIs) to be used in quantitating results from single photon emission computed tomography (SPECT) of regional cerebral blood flow performed with N-isopropyl iodine-123-iodoamphetamine. SPECT/IMP imaging was performed in ten patients with probable SDAT and seven healthy subjects. Multiple ROIs were manually and semiautomatically generated, and uptake was quantitated for each ROI. Mean cortical activity was estimated as the average of the mean activity in 24 semiautomatically generated ROIs; mean cerebellar activity was determined from the mean activity in separate ROIs. A ratio of parietal to cerebellar activity less than 0.60 and a ratio of parietal to mean cortical activity less than 0.90 allowed correct categorization of nine of ten and eight of ten patients, respectively, with SDAT and all control subjects. The degree of diminished mental status observed in patients with SDAT correlated with both global and regional changes in IMP uptake.
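The two uptake-ratio cutoffs above amount to a simple decision rule. A minimal sketch with made-up ROI values (the 0.60 and 0.90 cutoffs are from the abstract; every uptake number below is hypothetical):

```python
def classify_sdat(parietal, cerebellar, cortical_mean):
    """Flag probable SDAT when either uptake ratio falls below its cutoff."""
    return parietal / cerebellar < 0.60 or parietal / cortical_mean < 0.90

# Hypothetical ROI uptake values (arbitrary units).
patient = classify_sdat(parietal=5.0, cerebellar=9.5, cortical_mean=6.2)  # 0.53 < 0.60
control = classify_sdat(parietal=7.0, cerebellar=9.5, cortical_mean=7.2)  # both ratios above cutoff
```

In the study each cutoff was applied separately; combining them with `or`, as here, is one plausible reading, not the authors' stated procedure.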

  6. Investigation of the cortical activation by touching fabric actively using fingers.

    PubMed

    Wang, Q; Yu, W; He, N; Chen, K

    2015-11-01

Human subjects can tactually estimate the perception of touching fabric. Although many psychophysical and neurophysiological experiments have elucidated the peripheral neural mechanisms that underlie fabric hand estimation, the associated cortical mechanisms are not well understood. To identify the brain regions responsible for the tactile stimulation of fabric against human skin, we used functional magnetic resonance imaging (fMRI) to observe brain activation when the subjects actively touched silk fabric with their fingers. Consistent with previous research on brain responses to sensory stimulation, large activation in the primary somatosensory cortex (SI), the secondary somatosensory cortex (SII) and motor cortex, and little activation in the posterior insula cortex and Broca's area, were observed when the subjects touched silk fabric. fMRI is a promising tool for quantitatively observing and characterizing brain responses to the tactile stimulation of fabric. The intensity and extent of activation in the brain regions, especially SI and SII, can quantitatively represent the perception of stimulation by fabric. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of estimated failure rates provided quantitative data for fault analysis. Authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
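The method's quantitative input is the mean of the nurses' estimated failure rates at each fault-tree node. A minimal sketch with invented estimates, plus a hypothetical workaround-failure probability to show why frequent step failures can still make overall system failure rare:

```python
def step_failure_rate(estimates):
    """Mean of the raters' estimated failure rates for one fault-tree node."""
    return sum(estimates) / len(estimates)

# Hypothetical per-step estimates from three nurses.
step = step_failure_rate([0.20, 0.30, 0.25])   # the step fails often...

# ...but an unrecovered failure needs the workaround to fail too
# (0.05 is an invented workaround-failure probability).
unrecovered = step * 0.05
```

Even with a 25% step failure rate, the unrecovered failure rate here is 1.25%, at the cost of the extra processing time the abstract notes.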

  8. Vector analysis of ecoenzyme activities reveal constraints on coupled C, N and P dynamics

    EPA Science Inventory

    We developed a quantitative method for estimating resource allocation strategies of microbial communities based on the proportional activities of four, key extracellular enzymes, 1,4-ß-glucosidase (BG), leucine amino-peptidase (LAP), 1,4-ß-N-acetylglucosaminidase (NAG...

  9. Arctic Stratospheric Temperature In The Winters 1999/2000 and 2000/2001: A Quantitative Assessment and Microphysical Implications

    NASA Astrophysics Data System (ADS)

    Buss, S.; Wernli, H.; Peter, T.; Kivi, R.; Bui, T. P.; Kleinböhl, A.; Schiller, C.

Stratospheric winter temperatures play a key role in the chain of microphysical and chemical processes that lead to the formation of polar stratospheric clouds (PSCs), chlorine activation and eventually to stratospheric ozone depletion. Here the temperature conditions during the Arctic winters 1999/2000 and 2000/2001 are quantitatively investigated using observed profiles of water vapour and nitric acid, and temperatures from high-resolution radiosondes and aircraft observations, global ECMWF and UKMO analyses and mesoscale model simulations over Scandinavia and Greenland. The ECMWF model resolves parts of the gravity wave activity and generally agrees well with the observations. However, for the very cold temperatures near the ice frost point the ECMWF analyses have a warm bias of 1-6 K compared to radiosondes. For the mesoscale model HRM, this bias is generally reduced due to a more accurate representation of gravity waves. Quantitative estimates of the impact of the mesoscale temperature perturbations indicate that over Scandinavia and Greenland the wave-induced stratospheric cooling (as simulated by the HRM) only moderately affects the estimated chlorine activation and homogeneous NAT particle formation, but strongly enhances the potential for ice formation.

  10. QFASAR: Quantitative fatty acid signature analysis with R

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2017-01-01

Knowledge of predator diets provides essential insights into their ecology, yet diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) is a popular method of estimating diet composition that continues to be investigated and extended. However, software to implement QFASA has only recently become publicly available. I summarize a new R package, qfasar, for diet estimation using QFASA methods. The package also provides functionality to evaluate and potentially improve the performance of a library of prey signature data, compute goodness-of-fit diagnostics, and support simulation-based research. Several procedures in the package have not previously been published. qfasar makes traditional and recently published QFASA diet estimation methods accessible to ecologists for the first time. Use of the package is illustrated with signature data from Chukchi Sea polar bears and potential prey species.
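QFASA's core idea, estimating diet as the prey-signature mixture that best matches the predator's fatty acid signature, can be sketched as a least-squares search. This toy version handles only two prey species by grid search, with invented four-element signatures; the qfasar package implements the real estimators, distance measures, and calibration coefficients:

```python
import numpy as np

def estimate_diet(predator_sig, prey_sigs, step=0.001):
    """Grid-search the two-species mixture proportion minimizing the squared
    distance between the mixed prey signature and the predator signature."""
    alphas = np.arange(0.0, 1.0 + step, step)
    return min(alphas, key=lambda a: float(np.sum(
        (a * prey_sigs[0] + (1 - a) * prey_sigs[1] - predator_sig) ** 2)))

# Hypothetical fatty acid signatures (proportions summing to 1).
prey = np.array([[0.4, 0.3, 0.2, 0.1],
                 [0.1, 0.2, 0.3, 0.4]])
predator = 0.7 * prey[0] + 0.3 * prey[1]   # predator ate 70% species 0
p0 = estimate_diet(predator, prey)          # recovered proportion of species 0
```

Real QFASA handles many prey species, within-species signature variability, and metabolic calibration, so this sketch only illustrates the optimization at the heart of the method.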

  11. 42 CFR 82.26 - How will NIOSH report dose reconstruction results?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER... dose reconstruction, justification for the decision, and if possible, a quantitative estimate of the...

  12. 42 CFR 82.26 - How will NIOSH report dose reconstruction results?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER... dose reconstruction, justification for the decision, and if possible, a quantitative estimate of the...

  13. 42 CFR 82.26 - How will NIOSH report dose reconstruction results?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER... dose reconstruction, justification for the decision, and if possible, a quantitative estimate of the...

  14. 42 CFR 82.26 - How will NIOSH report dose reconstruction results?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER... dose reconstruction, justification for the decision, and if possible, a quantitative estimate of the...

  15. 42 CFR 82.26 - How will NIOSH report dose reconstruction results?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER... dose reconstruction, justification for the decision, and if possible, a quantitative estimate of the...

  16. 78 FR 56942 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    .... These indicators are both quantitative and descriptive and may include, for example, the characteristics... Centers, and to evaluate the progress of the program. Estimate of Burden: 185 hours per center for 223...

  17. 78 FR 50452 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    .... These indicators are both quantitative and descriptive and may include, for example, the characteristics... information to continue funding of the Centers, and to evaluate the progress of the program. Estimate of...

  18. 77 FR 32143 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-31

    ... contractor. These indicators are both quantitative and descriptive and may include, for example, the... the Centers, and to evaluate the progress of the program. Estimate of Burden: 100 hours per center for...

  19. Estimating the Potential Toxicity of Chemicals Associated with Hydraulic Fracturing Operations Using Quantitative Structure-Activity Relationship Modeling.

    PubMed

    Yost, Erin E; Stanek, John; DeWoskin, Robert S; Burgoon, Lyle D

    2016-07-19

The United States Environmental Protection Agency (EPA) identified 1173 chemicals associated with hydraulic fracturing fluids, flowback, or produced water, of which 1026 (87%) lack chronic oral toxicity values for human health assessments. To facilitate the ranking and prioritization of chemicals that lack toxicity values, it may be useful to employ toxicity estimates from quantitative structure-activity relationship (QSAR) models. Here we describe an approach for applying the results of a QSAR model from the TOPKAT program suite, which provides estimates of the rat chronic oral lowest-observed-adverse-effect level (LOAEL). Of the 1173 chemicals, TOPKAT was able to generate LOAEL estimates for 515 (44%). To address the uncertainty associated with these estimates, we assigned qualitative confidence scores (high, medium, or low) to each TOPKAT LOAEL estimate, and found 481 to be high-confidence. For 48 chemicals that had both a high-confidence TOPKAT LOAEL estimate and a chronic oral reference dose from EPA's Integrated Risk Information System (IRIS) database, Spearman rank correlation identified 68% agreement between the two values (permutation p-value = 1 × 10(-11)). These results provide support for the use of TOPKAT LOAEL estimates in identifying and prioritizing potentially hazardous chemicals. High-confidence TOPKAT LOAEL estimates were available for 389 of 1026 hydraulic fracturing-related chemicals that lack chronic oral reference values (RfVs) and oral slope factors (OSFs) from EPA-identified sources, including a subset of chemicals that are frequently used in hydraulic fracturing fluids.
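The Spearman rank correlation used above to compare TOPKAT LOAELs with IRIS reference doses is simply the Pearson correlation of the ranks. A minimal sketch with invented, tie-free values (the double-argsort ranking below does not handle ties, which real implementations must):

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation for tie-free data:
    Pearson correlation of the ranks."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical QSAR estimates vs. reference values (mg/kg-day); one pair
# of neighbors is swapped in rank order.
rho = spearman([10.0, 55.0, 200.0, 340.0, 900.0],
               [12.0, 150.0, 60.0, 400.0, 800.0])
```

Because only ranks matter, the metric is well suited to comparing toxicity values that span orders of magnitude.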

  20. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    PubMed Central

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-01-01

The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision.
In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation. PMID:26982626
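The noise-to-slope ratio divides the noise of a method's estimates by the slope of their (assumed linear) relation to the true values, so that methods are ranked by precision independent of scale. The sketch below illustrates the definition using known truth, which the NGS technique itself is designed to avoid needing; all numbers are invented:

```python
import numpy as np

def noise_to_slope(true_vals, measured):
    """NSR for the linear model measured = slope * true + intercept + noise.
    Truth is known here only to illustrate the figure of merit; the NGS
    technique estimates these parameters without ground truth."""
    true_vals = np.asarray(true_vals, float)
    measured = np.asarray(measured, float)
    slope, intercept = np.polyfit(true_vals, measured, 1)
    resid = measured - (slope * true_vals + intercept)
    return resid.std(ddof=2) / slope   # ddof=2: two fitted parameters

# Hypothetical true vs. measured activity concentrations.
nsr = noise_to_slope([1, 2, 3, 4, 5], [2.1, 4.0, 6.2, 7.9, 10.1])
```

A smaller NSR means a more precise method; note that a method can have a large bias (slope far from 1) and still rank well on precision.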

  1. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision.
In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation.

  2. History of EPI Suite™ and future perspectives on chemical property estimation in US Toxic Substances Control Act new chemical risk assessments.

    PubMed

    Card, Marcella L; Gomez-Alvarez, Vicente; Lee, Wen-Hsiung; Lynch, David G; Orentas, Nerija S; Lee, Mari Titcombe; Wong, Edmund M; Boethling, Robert S

    2017-03-22

Chemical property estimation is a key component in many industrial, academic, and regulatory activities, including in the risk assessment associated with the approximately 1000 new chemical pre-manufacture notices the United States Environmental Protection Agency (US EPA) receives annually. The US EPA evaluates fate, exposure and toxicity under the 1976 Toxic Substances Control Act (amended by the 2016 Frank R. Lautenberg Chemical Safety for the 21st Century Act), which does not require test data with new chemical applications. Though the submission of data is not required, the US EPA has, over the past 40 years, occasionally received chemical-specific data with pre-manufacture notices. The US EPA has been actively using this and publicly available data to develop and refine predictive computerized models, most of which are housed in EPI Suite™, to estimate chemical properties used in the risk assessment of new chemicals. The US EPA develops and uses models based on (quantitative) structure-activity relationships ([Q]SARs) to estimate critical parameters. As in any evolving field, (Q)SARs have experienced successes, suffered failures, and responded to emerging trends. Correlations of a chemical structure with its properties or biological activity were first demonstrated in the late 19th century and today have been encapsulated in a myriad of quantitative and qualitative SARs. The development and proliferation of the personal computer in the late 20th century gave rise to a quickly increasing number of property estimation models, and continually improved computing power and connectivity among researchers via the internet are enabling the development of increasingly complex models.

  3. Joint reconstruction of activity and attenuation in Time-of-Flight PET: A Quantitative Analysis.

    PubMed

    Rezaei, Ahmadreza; Deroose, Christophe M; Vahle, Thomas; Boada, Fernando; Nuyts, Johan

    2018-03-01

    Joint activity and attenuation reconstruction methods from time of flight (TOF) positron emission tomography (PET) data provide an effective solution to attenuation correction when no (or incomplete/inaccurate) information on the attenuation is available. One of the main barriers limiting their use in clinical practice is the lack of validation of these methods on a relatively large patient database. In this contribution, we aim at validating the activity reconstructions of the maximum likelihood activity reconstruction and attenuation registration (MLRR) algorithm on a whole-body patient data set. Furthermore, a partial validation (since the scale problem of the algorithm is avoided for now) of the maximum likelihood activity and attenuation reconstruction (MLAA) algorithm is also provided. We present a quantitative comparison of the joint reconstructions to the current clinical gold-standard maximum likelihood expectation maximization (MLEM) reconstruction with CT-based attenuation correction. Methods: The whole-body TOF-PET emission data of each patient data set is processed as a whole to reconstruct an activity volume covering all the acquired bed positions, which helps to reduce the problem of a scale per bed position in MLAA to a global scale for the entire activity volume. Three reconstruction algorithms are used: MLEM, MLRR and MLAA. A maximum likelihood (ML) scaling of the single scatter simulation (SSS) estimate to the emission data is used for scatter correction. The reconstruction results are then analyzed in different regions of interest. Results: The joint reconstructions of the whole-body patient data set provide better quantification in case of PET and CT misalignments caused by patient and organ motion. Our quantitative analysis shows a difference of -4.2% (±2.3%) and -7.5% (±4.6%) between the joint reconstructions of MLRR and MLAA compared to MLEM, averaged over all regions of interest, respectively. 
Conclusion: Joint activity and attenuation estimation methods provide a useful means to estimate the tracer distribution in cases where CT-based attenuation images are subject to misalignments or are not available. With an accurate estimate of the scatter contribution in the emission measurements, the joint TOF-PET reconstructions are within clinically acceptable accuracy. Copyright © 2018 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
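The MLEM reconstruction used as the clinical baseline above has a standard multiplicative update, lam ← lam · AᵀW(y / A·lam) / Aᵀ1. A toy noise-free sketch with a hypothetical 2×2 system matrix (no attenuation, scatter, or TOF modeling, so this is only the bare update rule, not the paper's joint MLRR/MLAA algorithms):

```python
import numpy as np

def mlem(A, y, n_iter=500):
    """Toy MLEM: multiplicative update lam *= A^T(y / A lam) / A^T 1."""
    lam = np.ones(A.shape[1])
    sens = A.sum(axis=0)                       # sensitivity image A^T 1
    for _ in range(n_iter):
        lam *= (A.T @ (y / (A @ lam))) / sens  # forward project, ratio, backproject
    return lam

# Hypothetical 2-voxel, 2-bin system; data generated noise-free, so MLEM
# should recover the true activity.
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
truth = np.array([3.0, 5.0])
lam = mlem(A, A @ truth)
```

One well-known property the sketch preserves: each update conserves total counts, i.e. the sensitivity-weighted sum of the activity equals the sum of the measured data.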

  4. Comparison of quantitative Y-90 SPECT and non-time-of-flight PET imaging in post-therapy radioembolization of liver cancer

    PubMed Central

    Yue, Jianting; Mauxion, Thibault; Reyes, Diane K.; Lodge, Martin A.; Hobbs, Robert F.; Rong, Xing; Dong, Yinfeng; Herman, Joseph M.; Wahl, Richard L.; Geschwind, Jean-François H.; Frey, Eric C.

    2016-01-01

    Purpose: Radioembolization with yttrium-90 microspheres may be optimized with patient-specific pretherapy treatment planning. Dose verification and validation of treatment planning methods require quantitative imaging of the post-therapy distribution of yttrium-90 (Y-90). Methods for quantitative imaging of Y-90 using both bremsstrahlung SPECT and PET have previously been described. The purpose of this study was to compare the two modalities quantitatively in humans. Methods: Calibration correction factors for both quantitative Y-90 bremsstrahlung SPECT and a non-time-of-flight PET system without compensation for prompt coincidences were developed by imaging three phantoms. The consistency of these calibration correction factors for the different phantoms was evaluated. Post-therapy images from both modalities were obtained from 15 patients with hepatocellular carcinoma who underwent hepatic radioembolization using Y-90 glass microspheres. Quantitative SPECT and PET images were rigidly registered and the total liver activities and activity distributions estimated for each modality were compared. The activity distributions were compared using profiles, voxel-by-voxel correlation and Bland–Altman analyses, and activity-volume histograms. Results: The mean ± standard deviation of difference in the total activity in the liver between the two modalities was 0% ± 9% (range −21%–18%). Voxel-by-voxel comparisons showed a good agreement in regions corresponding roughly to treated tumor and treated normal liver; the agreement was poorer in regions with low or no expected activity, where PET appeared to overestimate the activity. The correlation coefficients between intrahepatic voxel pairs for the two modalities ranged from 0.86 to 0.94. Cumulative activity volume histograms were in good agreement. 
Conclusions: These data indicate that, with appropriate reconstruction methods and measured calibration correction factors, either Y-90 SPECT/CT or Y-90 PET/CT can be used for quantitative post-therapy monitoring of Y-90 activity distribution following hepatic radioembolization. PMID:27782730
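The Bland–Altman analysis used in this comparison summarizes paired measurements by their mean difference and 95% limits of agreement. A minimal sketch with invented paired voxel values (the study worked on registered voxel pairs; these numbers are illustrative only):

```python
import numpy as np

def bland_altman(a, b):
    """Mean difference and 95% limits of agreement between paired measures."""
    d = np.asarray(a, float) - np.asarray(b, float)
    return d.mean(), d.mean() - 1.96 * d.std(), d.mean() + 1.96 * d.std()

# Hypothetical paired voxel activities from SPECT and PET (kBq/mL).
mean_diff, lo, hi = bland_altman([10.0, 12.0, 11.0, 13.0],
                                 [10.5, 11.5, 11.5, 12.5])
```

A mean difference near zero with narrow limits, as found here for total liver activity, indicates the two modalities agree on average even if individual voxels scatter.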

  5. Comparison of quantitative Y-90 SPECT and non-time-of-flight PET imaging in post-therapy radioembolization of liver cancer.

    PubMed

    Yue, Jianting; Mauxion, Thibault; Reyes, Diane K; Lodge, Martin A; Hobbs, Robert F; Rong, Xing; Dong, Yinfeng; Herman, Joseph M; Wahl, Richard L; Geschwind, Jean-François H; Frey, Eric C

    2016-10-01

    Radioembolization with yttrium-90 microspheres may be optimized with patient-specific pretherapy treatment planning. Dose verification and validation of treatment planning methods require quantitative imaging of the post-therapy distribution of yttrium-90 (Y-90). Methods for quantitative imaging of Y-90 using both bremsstrahlung SPECT and PET have previously been described. The purpose of this study was to compare the two modalities quantitatively in humans. Calibration correction factors for both quantitative Y-90 bremsstrahlung SPECT and a non-time-of-flight PET system without compensation for prompt coincidences were developed by imaging three phantoms. The consistency of these calibration correction factors for the different phantoms was evaluated. Post-therapy images from both modalities were obtained from 15 patients with hepatocellular carcinoma who underwent hepatic radioembolization using Y-90 glass microspheres. Quantitative SPECT and PET images were rigidly registered and the total liver activities and activity distributions estimated for each modality were compared. The activity distributions were compared using profiles, voxel-by-voxel correlation and Bland-Altman analyses, and activity-volume histograms. The mean ± standard deviation of the difference in the total activity in the liver between the two modalities was 0% ± 9% (range −21% to 18%). Voxel-by-voxel comparisons showed a good agreement in regions corresponding roughly to treated tumor and treated normal liver; the agreement was poorer in regions with low or no expected activity, where PET appeared to overestimate the activity. The correlation coefficients between intrahepatic voxel pairs for the two modalities ranged from 0.86 to 0.94. Cumulative activity volume histograms were in good agreement.
These data indicate that, with appropriate reconstruction methods and measured calibration correction factors, either Y-90 SPECT/CT or Y-90 PET/CT can be used for quantitative post-therapy monitoring of Y-90 activity distribution following hepatic radioembolization.
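The voxel-wise agreement measures named in this abstract (Pearson correlation and Bland-Altman bias with 95% limits of agreement) can be sketched as follows; the voxel activity values are hypothetical stand-ins, not the study's data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def bland_altman(x, y):
    """Mean difference (bias) and 95% limits of agreement between paired measurements."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical activity values for the same registered voxels on SPECT and PET:
spect = [1.0, 2.1, 3.0, 4.2, 5.1, 6.0]
pet   = [1.1, 2.0, 3.2, 4.0, 5.3, 6.1]
r = pearson_r(spect, pet)
bias, loa = bland_altman(spect, pet)
```

In the study the same statistics were computed over all intrahepatic voxel pairs; the toy arrays here merely show the mechanics.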

  6. Fitness to work of astronauts in conditions of action of the extreme emotional factors

    NASA Astrophysics Data System (ADS)

    Prisniakova, L. M.

    2004-01-01

    A theoretical model for quantitatively determining how the level of emotional exertion influences the success of human activity is presented. Learning curves for memorized words are analyzed in groups with different levels of emotional exertion. The time constants T obtained for each type of emotional exertion provide a quantitative measure of that exertion; these time constants could also be used to predict an astronaut's fitness to work under extreme conditions. A reversal of the sign of the effect on work efficiency is detected. The paper offers a mathematical model of the relation between successful activity and motivation or emotional exertion (the Yerkes-Dodson law). The proposed models can serve as a theoretical basis for quantitatively estimating astronaut performance under emotional stress during crew selection.

  7. Fitness to work of astronauts in conditions of action of the extreme emotional factors.

    PubMed

    Prisniakova, L M

    2004-01-01

    A theoretical model for quantitatively determining how the level of emotional exertion influences the success of human activity is presented. Learning curves for memorized words are analyzed in groups with different levels of emotional exertion. The time constants T obtained for each type of emotional exertion provide a quantitative measure of that exertion; these time constants could also be used to predict an astronaut's fitness to work under extreme conditions. A reversal of the sign of the effect on work efficiency is detected. The paper offers a mathematical model of the relation between successful activity and motivation or emotional exertion (the Yerkes-Dodson law). The proposed models can serve as a theoretical basis for quantitatively estimating astronaut performance under emotional stress during crew selection. Published by Elsevier Ltd on behalf of COSPAR.

  8. A novel Raman spectrophotometric method for quantitative measurement of nucleoside triphosphate hydrolysis.

    PubMed

    Jenkins, R H; Tuma, R; Juuti, J T; Bamford, D H; Thomas, G J

    1999-01-01

    A novel spectrophotometric method, based upon Raman spectroscopy, has been developed for accurate quantitative determination of nucleoside triphosphate phosphohydrolase (NTPase) activity. The method relies upon simultaneous measurement in real time of the intensities of Raman marker bands diagnostic of the triphosphate (1115 cm(-1)) and diphosphate (1085 cm(-1)) moieties of the NTPase substrate and product, respectively. The reliability of the method is demonstrated for the NTPase-active RNA-packaging enzyme (protein P4) of bacteriophage phi6, for which comparative NTPase activities have been estimated independently by radiolabeling assays. The Raman-determined rate for adenosine triphosphate substrate (8.6 +/- 1.3 micromol x mg(-1) x min(-1) at 40 degrees C) is in good agreement with previous estimates. The versatility of the Raman method is demonstrated by its applicability to a variety of nucleotide substrates of P4, including the natural ribonucleoside triphosphates (ATP, GTP) and dideoxynucleoside triphosphates (ddATP, ddGTP). Advantages of the present protocol include conservative sample requirements (approximately 10(-6) g enzyme/protocol) and relative ease of data collection and analysis. The latter conveniences are particularly advantageous for the measurement of activation energies of phosphohydrolase activity.
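The rate determination described here amounts to a linear fit of product amount (inferred in real time from the 1085 cm(-1) band intensity via a calibration factor) against time, normalized by enzyme mass. A minimal sketch with hypothetical calibrated values, chosen only to land near the order of the reported rate:

```python
# Hypothetical time course of ADP product (μmol, inferred from the
# 1085 cm^-1 band intensity); times in minutes.
times = [0.0, 0.5, 1.0, 1.5, 2.0]
adp_umol = [0.0, 0.9, 1.8, 2.6, 3.5]

def ls_slope(x, y):
    """Ordinary least-squares slope of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

enzyme_mg = 0.2                                # mg of enzyme in the cell (hypothetical)
rate = ls_slope(times, adp_umol) / enzyme_mg   # μmol·mg^-1·min^-1
```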

  9. A Hierarchical Clustering Methodology for the Estimation of Toxicity

    EPA Science Inventory

    A Quantitative Structure Activity Relationship (QSAR) methodology based on hierarchical clustering was developed to predict toxicological endpoints. This methodology utilizes Ward's method to divide a training set into a series of structurally similar clusters. The structural sim...

  10. A quantitative approach to assessing the efficacy of occupant protection programs: A case study from Montana.

    PubMed

    Manlove, Kezia; Stanley, Laura; Peck, Alyssa

    2015-10-01

    Quantitative evaluation of vehicle occupant protection programs is critical for ensuring efficient government resource allocation, but few methods exist for conducting evaluation across multiple programs simultaneously. Here we present an analysis of occupant protection efficacy in the state of Montana. This approach relies on seat belt compliance rates as measured by the National Occupant Protection Usage Survey (NOPUS). A hierarchical logistic regression model is used to estimate the impacts of four Montana Department of Transportation (MDT)-funded occupant protection programs, following adjustment for a suite of potential confounders. Activity from two programs, Buckle Up coalitions and media campaigns, is associated with increased seat belt use in Montana, whereas the impact of another program, Selective Traffic Enforcement, is potentially masked by other program activity. A final program, Driver's Education, is not associated with any shift in seat belt use. This method allows a preliminary quantitative estimation of program impacts without requiring states to obtain any new seat belt use data, and gives states a means for carefully planning future program allocation and investigation. Copyright © 2015 Elsevier Ltd. All rights reserved.
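As a toy illustration of the regression approach (a plain logistic regression rather than the paper's hierarchical model, which adds random effects and confounder adjustment), belt use can be fit against a single program-exposure indicator; the observations below are invented.

```python
import math

# (program_active, belt_used) pairs for hypothetical observed vehicle occupants
obs = [(0, 0), (0, 1), (0, 1), (0, 0), (1, 1), (1, 1), (1, 1), (1, 0)]

def fit_logit(data, lr=0.1, steps=5000):
    """Fit logit(P(belt use)) = b0 + b1*x by gradient ascent on the log-likelihood."""
    b0, b1 = 0.0, 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0
        b1 += lr * g1
    return b0, b1

b0, b1 = fit_logit(obs)
odds_ratio = math.exp(b1)  # >1: program exposure is associated with more belt use
```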

  11. Health Impacts of Increased Physical Activity from Changes in Transportation Infrastructure: Quantitative Estimates for Three Communities

    PubMed Central

    2015-01-01

    Recently, two quantitative tools have emerged for predicting the health impacts of projects that change population physical activity: the Health Economic Assessment Tool (HEAT) and Dynamic Modeling for Health Impact Assessment (DYNAMO-HIA). HEAT has been used to support health impact assessments of transportation infrastructure projects, but DYNAMO-HIA has not been previously employed for this purpose nor have the two tools been compared. To demonstrate the use of DYNAMO-HIA for supporting health impact assessments of transportation infrastructure projects, we employed the model in three communities (urban, suburban, and rural) in North Carolina. We also compared DYNAMO-HIA and HEAT predictions in the urban community. Using DYNAMO-HIA, we estimated benefit-cost ratios of 20.2 (95% C.I.: 8.7–30.6), 0.6 (0.3–0.9), and 4.7 (2.1–7.1) for the urban, suburban, and rural projects, respectively. For a 40-year time period, the HEAT predictions of deaths avoided by the urban infrastructure project were three times as high as DYNAMO-HIA's predictions due to HEAT's inability to account for changing population health characteristics over time. Quantitative health impact assessment coupled with economic valuation is a powerful tool for integrating health considerations into transportation decision-making. However, to avoid overestimating benefits, such quantitative HIAs should use dynamic, rather than static, approaches. PMID:26504832

  12. Health Impacts of Increased Physical Activity from Changes in Transportation Infrastructure: Quantitative Estimates for Three Communities.

    PubMed

    Mansfield, Theodore J; MacDonald Gibson, Jacqueline

    2015-01-01

    Recently, two quantitative tools have emerged for predicting the health impacts of projects that change population physical activity: the Health Economic Assessment Tool (HEAT) and Dynamic Modeling for Health Impact Assessment (DYNAMO-HIA). HEAT has been used to support health impact assessments of transportation infrastructure projects, but DYNAMO-HIA has not been previously employed for this purpose nor have the two tools been compared. To demonstrate the use of DYNAMO-HIA for supporting health impact assessments of transportation infrastructure projects, we employed the model in three communities (urban, suburban, and rural) in North Carolina. We also compared DYNAMO-HIA and HEAT predictions in the urban community. Using DYNAMO-HIA, we estimated benefit-cost ratios of 20.2 (95% C.I.: 8.7-30.6), 0.6 (0.3-0.9), and 4.7 (2.1-7.1) for the urban, suburban, and rural projects, respectively. For a 40-year time period, the HEAT predictions of deaths avoided by the urban infrastructure project were three times as high as DYNAMO-HIA's predictions due to HEAT's inability to account for changing population health characteristics over time. Quantitative health impact assessment coupled with economic valuation is a powerful tool for integrating health considerations into transportation decision-making. However, to avoid overestimating benefits, such quantitative HIAs should use dynamic, rather than static, approaches.

  13. Calculation of Drug Solubilities by Pharmacy Students.

    ERIC Educational Resources Information Center

    Cates, Lindley A.

    1981-01-01

    A method of estimating the solubilities of drugs in water is reported that is based on a principle applied in quantitative structure-activity relationships. This procedure involves correlation of partition coefficient values using the octanol/water system and aqueous solubility. (Author/MLW)
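The exact regression used in the article is not given in this abstract; one widely used relationship of this general form (aqueous solubility correlated with the octanol/water partition coefficient) is Yalkowsky's general solubility equation, sketched here with a hypothetical drug:

```python
def log_solubility_gse(log_p, melting_point_c):
    """Yalkowsky's general solubility equation: log10 S = 0.5 - 0.01*(MP - 25) - logP,
    an empirical correlation of aqueous molar solubility with octanol/water logP
    and melting point (°C). For liquids (MP below 25 °C) the melting-point term
    is taken as zero."""
    mp_term = 0.01 * max(melting_point_c - 25.0, 0.0)
    return 0.5 - mp_term - log_p

# Hypothetical drug: logP = 2.5, melting point 160 °C
log_s = log_solubility_gse(2.5, 160.0)   # log10 of molar solubility
```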

  14. Stroke onset time estimation from multispectral quantitative magnetic resonance imaging in a rat model of focal permanent cerebral ischemia.

    PubMed

    McGarry, Bryony L; Rogers, Harriet J; Knight, Michael J; Jokivarsi, Kimmo T; Sierra, Alejandra; Gröhn, Olli Hj; Kauppinen, Risto A

    2016-08-01

    Quantitative T2 relaxation magnetic resonance imaging allows estimation of stroke onset time. We aimed to examine the accuracy of quantitative T1 and quantitative T2 relaxation times alone and in combination to provide estimates of stroke onset time in a rat model of permanent focal cerebral ischemia and map the spatial distribution of elevated quantitative T1 and quantitative T2 to assess tissue status. Permanent middle cerebral artery occlusion was induced in Wistar rats. Animals were scanned at 9.4 T for quantitative T1, quantitative T2, and Trace of Diffusion Tensor (Dav) up to 4 h post-middle cerebral artery occlusion. Time courses of differentials of quantitative T1 and quantitative T2 in ischemic and non-ischemic contralateral brain tissue (ΔT1, ΔT2) and volumes of tissue with elevated T1 and T2 relaxation times (f1, f2) were determined. TTC staining was used to highlight permanent ischemic damage. ΔT1, ΔT2, f1, f2, and the volume of tissue with both elevated quantitative T1 and quantitative T2 (V(Overlap)) increased with time post-middle cerebral artery occlusion, allowing stroke onset time to be estimated. V(Overlap) provided the most accurate estimate with an uncertainty of ±25 min. At all time points, regions with elevated relaxation times were smaller than areas with Dav-defined ischemia. Stroke onset time can be determined by quantitative T1 and quantitative T2 relaxation times and tissue volumes. Combining quantitative T1 and quantitative T2 provides the most accurate estimate and potentially identifies irreversibly damaged brain tissue. © 2016 World Stroke Organization.
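The onset-time logic can be sketched as calibrating a linear growth of ΔT2 with time post-occlusion and then inverting that relation for a new measurement; the calibration numbers below are hypothetical, not the study's data.

```python
def ls_fit(x, y):
    """Least-squares slope and intercept of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

# Hypothetical calibration: time post-occlusion (min) vs ΔT2 (ms), the T2
# difference between ischemic and contralateral tissue.
t_min = [60, 120, 180, 240]
dT2   = [3.0, 5.9, 9.1, 12.0]
slope, intercept = ls_fit(t_min, dT2)

def estimate_onset_minutes(measured_dT2):
    """Invert the calibrated linear ΔT2(t) relation to estimate time since onset."""
    return (measured_dT2 - intercept) / slope

onset = estimate_onset_minutes(7.5)   # estimated minutes since occlusion for a new animal
```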

  15. Estimating contraction level using root mean square amplitude in control subjects and patients with neuromuscular disorders.

    PubMed

    Boe, Shaun G; Rice, Charles L; Doherty, Timothy J

    2008-04-01

    To assess the utility of the surface electromyographic signal as a means of estimating the level of muscle force during quantitative electromyography studies by examining the relationship between muscle force and the amplitude of the surface electromyographic activity signal; and to determine the impact of a reduction in the number of motor units on this relationship, through inclusion of a sample of patients with neuromuscular disease. Cross-sectional, cohort study design. Tertiary care, ambulatory, electromyography laboratory. A volunteer, convenience sample of healthy control subjects (n=10), patients with amyotrophic lateral sclerosis (n=9), and patients with Charcot-Marie-Tooth disease type X (n=5). Not applicable. The first dorsal interosseous (FDI) and biceps brachii muscles were examined. Force values (at 10% increments) were calculated from two 4-second maximal voluntary contractions (MVCs). Surface electromyographic activity was recorded during separate 4-second voluntary contractions at 9 force increments (10%-90% of MVC). Additionally, a motor unit number estimate was derived for each subject to quantify the degree of motor unit loss in patients relative to control subjects. The relationships between force and surface electromyographic activity for both muscles (controls and patients) were best fit by a linear function. The variability about the grouped regression lines was quantified by 95% confidence intervals and found to be +/-6.7% (controls) and +/-8.5% (patients) for the FDI and +/-5% (controls) and +/-6.1% (patients) for the biceps brachii. These results suggest that the amplitude of the surface electromyographic activity signal may be used as a means of estimating the level of muscle force during quantitative electromyography studies. 
Future studies should be directed at examining if the variability associated with these force and surface electromyographic activity relationships is acceptable in replacing previous methods of measuring muscle force.
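A minimal sketch of the measurement chain, assuming hypothetical EMG epochs and the reported linear amplitude-force relationship (here further simplified to a line through the origin):

```python
import math

def rms(signal):
    """Root mean square amplitude of a sampled signal."""
    return math.sqrt(sum(s * s for s in signal) / len(signal))

# Hypothetical surface EMG epochs (arbitrary units) at increasing %MVC;
# real recordings would be band-pass filtered before this step.
epochs = {
    10: [0.1, -0.1, 0.12, -0.09],
    50: [0.5, -0.52, 0.48, -0.5],
    90: [0.9, -0.88, 0.91, -0.92],
}
amplitudes = {pct: rms(sig) for pct, sig in epochs.items()}

# Under a linear amplitude-force relationship, %MVC can be estimated by scaling
# a measured RMS amplitude by the amplitude observed at a known reference level.
ref_pct, ref_amp = 50, amplitudes[50]

def estimate_pct_mvc(measured_rms):
    return ref_pct * measured_rms / ref_amp
```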

  16. A collimator optimization method for quantitative imaging: application to Y-90 bremsstrahlung SPECT.

    PubMed

    Rong, Xing; Frey, Eric C

    2013-08-01

    Post-therapy quantitative 90Y bremsstrahlung single photon emission computed tomography (SPECT) has shown great potential to provide reliable activity estimates, which are essential for dose verification. Typically, 90Y imaging is performed with high- or medium-energy collimators. However, the energy spectrum of 90Y bremsstrahlung photons is substantially different from the photon energies these collimators were designed for. In addition, dosimetry requires quantitative images, and collimators are not typically optimized for such tasks. Optimizing a collimator for 90Y imaging is both novel and potentially important. Conventional optimization methods are not appropriate for 90Y bremsstrahlung photons, which have a continuous and broad energy distribution. In this work, the authors developed a parallel-hole collimator optimization method for quantitative tasks that is particularly applicable to radionuclides with complex emission energy spectra. The authors applied the proposed method to develop an optimal collimator for quantitative 90Y bremsstrahlung SPECT in the context of microsphere radioembolization. To account for the effects of the collimator on both the bias and the variance of the activity estimates, the authors used the root mean squared error (RMSE) of the volume of interest activity estimates as the figure of merit (FOM). In the FOM, the bias due to the null space of the image formation process was taken into account. The RMSE was weighted by the inverse mass to reflect the application to dosimetry; for a different application, more relevant weighting could easily be adopted. The authors proposed a parameterization for the collimator that facilitates the incorporation of the important factors (geometric sensitivity, geometric resolution, and septal penetration fraction) determining collimator performance, while keeping the number of free parameters describing the collimator small (i.e., two parameters). 
To make the optimization results for quantitative 90Y bremsstrahlung SPECT more general, the authors simulated multiple tumors of various sizes in the liver. The authors realistically simulated human anatomy using a digital phantom and the image formation process using a previously validated and computationally efficient method for modeling the image-degrading effects including object scatter, attenuation, and the full collimator-detector response (CDR). The scatter kernels and CDR function tables used in the modeling method were generated using a previously validated Monte Carlo simulation code. The hole length, hole diameter, and septal thickness of the obtained optimal collimator were 84, 3.5, and 1.4 mm, respectively. Compared to a commercial high-energy general-purpose collimator, the optimal collimator improved the resolution and FOM by 27% and 18%, respectively. The proposed collimator optimization method may be useful for improving quantitative SPECT imaging for radionuclides with complex energy spectra. The obtained optimal collimator provided a substantial improvement in quantitative performance for the microsphere radioembolization task considered.
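The figure of merit described in this abstract, an inverse-mass-weighted RMSE of volume-of-interest activity estimates combining bias and variance, can be sketched as follows; all numbers are hypothetical.

```python
import math

# (mass_g, bias_MBq, std_MBq) per volume of interest; values are invented.
vois = [
    (20.0, 1.5, 2.0),
    (150.0, 4.0, 6.0),
    (1500.0, 10.0, 15.0),
]

def inverse_mass_weighted_rmse(vois):
    """RMSE of VOI activity estimates, combining bias^2 and variance, with each
    VOI weighted by inverse mass so small structures are not swamped by large ones."""
    weights = [1.0 / m for m, _, _ in vois]
    wsum = sum(weights)
    mse = sum(w * (b * b + s * s) for w, (_, b, s) in zip(weights, vois)) / wsum
    return math.sqrt(mse)

fom = inverse_mass_weighted_rmse(vois)  # lower is better; used to compare collimator designs
```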

  17. Manned Mars mission radiation environment and radiobiology

    NASA Technical Reports Server (NTRS)

    Nachtwey, D. S.

    1986-01-01

    Potential radiation hazards to crew members on manned Mars missions are discussed. The paper deals briefly with the radiation sources and environments likely to be encountered during various phases of such missions, providing quantitative estimates of these environments. Quantitative data on, and a discussion of, the effects of such radiation on the human body are also provided. Various protective measures are suggested. A recent re-evaluation of allowable dose limits by the National Council on Radiation Protection is discussed, and its potential implications are assessed.

  18. Quantitative estimation of minimum offset for multichannel surface-wave survey with actively exciting source

    USGS Publications Warehouse

    Xu, Y.; Xia, J.; Miller, R.D.

    2006-01-01

    Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and is normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitatively estimating the minimum offset for a layered homogeneous elastic model was developed. Analytical results based on simple models, together with experimental data, demonstrate that the formula is correct for surface-wave surveys in near-surface applications. © 2005 Elsevier B.V. All rights reserved.

  19. Accuracy and Precision of Radioactivity Quantification in Nuclear Medicine Images

    PubMed Central

    Frey, Eric C.; Humm, John L.; Ljungberg, Michael

    2012-01-01

    The ability to reliably quantify activity in nuclear medicine has a number of increasingly important applications. Dosimetry for targeted therapy treatment planning or for approval of new imaging agents requires accurate estimation of the activity in organs, tumors, or voxels at several imaging time points. Another important application is the use of quantitative metrics derived from images, such as the standard uptake value commonly used in positron emission tomography (PET), to diagnose and follow treatment of tumors. These measures require quantification of organ or tumor activities in nuclear medicine images. However, there are a number of physical, patient, and technical factors that limit the quantitative reliability of nuclear medicine images. There have been a large number of improvements in instrumentation, including the development of hybrid single-photon emission computed tomography/computed tomography and PET/computed tomography systems, and reconstruction methods, including the use of statistical iterative reconstruction methods, which have substantially improved the ability to obtain reliable quantitative information from planar, single-photon emission computed tomography, and PET images. PMID:22475429

  20. A quantitative characterization of the yeast heterotrimeric G protein cycle

    PubMed Central

    Yi, Tau-Mu; Kitano, Hiroaki; Simon, Melvin I.

    2003-01-01

    The yeast mating response is one of the best understood heterotrimeric G protein signaling pathways. Yet, most descriptions of this system have been qualitative. We have quantitatively characterized the heterotrimeric G protein cycle in yeast based on direct in vivo measurements. We used fluorescence resonance energy transfer to monitor the association state of cyan fluorescent protein (CFP)-Gα and Gβγ-yellow fluorescent protein (YFP), and we found that receptor-mediated G protein activation produced a loss of fluorescence resonance energy transfer. Quantitative time course and dose–response data were obtained for both wild-type and mutant cells possessing an altered pheromone response. These results paint a quantitative portrait of how regulators such as Sst2p and the C-terminal tail of α-factor receptor modulate the kinetics and sensitivity of G protein signaling. We have explored critical features of the dynamics including the rapid rise and subsequent decline of active G proteins during the early response, and the relationship between the G protein activation dose–response curve and the downstream dose–response curves for cell-cycle arrest and transcriptional induction. Fitting the data to a mathematical model produced estimates of the in vivo rates of heterotrimeric G protein activation and deactivation in yeast. PMID:12960402
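The activation/deactivation cycle fit in this study can be sketched as a minimal two-state model: receptor-driven activation at rate k_act and deactivation (GTP hydrolysis, accelerated in vivo by regulators such as Sst2p) at rate k_deact. The rate constants below are illustrative, not the paper's fitted estimates.

```python
def simulate_g_cycle(k_act, k_deact, total=1.0, dt=0.01, t_end=50.0):
    """Forward-Euler integration of d[active]/dt = k_act*(total-active) - k_deact*active,
    returning the active G protein fraction at t_end."""
    active = 0.0
    t = 0.0
    while t < t_end:
        inactive = total - active
        active += dt * (k_act * inactive - k_deact * active)
        t += dt
    return active

steady = simulate_g_cycle(k_act=0.01, k_deact=0.1)
# Analytically, the steady state is k_act / (k_act + k_deact) of the total G protein,
# so the active fraction at steady state is small when deactivation dominates.
```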

  1. Synthesis, quantitative structure-property relationship study of novel fluorescence active 2-pyrazolines and application.

    PubMed

    Girgis, Adel S; Basta, Altaf H; El-Saied, Houssni; Mohamed, Mohamed A; Bedair, Ahmad H; Salim, Ahmad S

    2018-03-01

    A variety of fluorescence-active fluorinated pyrazolines 13-33 was synthesized in good yields through cyclocondensation reaction of propenones 1-9 with aryl hydrazines 10-12. Some of the synthesized compounds provided promising fluorescence properties with quantum yield (Φ) higher than that of quinine sulfate (standard reference). Quantitative structure-property relationship studies were undertaken supporting the exhibited fluorescence properties and estimating the parameters governing properties. Five synthesized fluorescence-active pyrazolines (13, 15, 18, 19 and 23) with variable Φ were selected for treating two types of paper sheets (Fabriano and Bible paper). These investigated fluorescence compounds, especially compounds 19 and 23, provide improvements in strength properties of paper sheets. Based on the observed performance they can be used as markers in security documents.

  2. Synthesis, quantitative structure–property relationship study of novel fluorescence active 2-pyrazolines and application

    PubMed Central

    Girgis, Adel S.; El-Saied, Houssni; Mohamed, Mohamed A.; Bedair, Ahmad H.; Salim, Ahmad S.

    2018-01-01

    A variety of fluorescence-active fluorinated pyrazolines 13–33 was synthesized in good yields through cyclocondensation reaction of propenones 1–9 with aryl hydrazines 10–12. Some of the synthesized compounds provided promising fluorescence properties with quantum yield (Φ) higher than that of quinine sulfate (standard reference). Quantitative structure–property relationship studies were undertaken supporting the exhibited fluorescence properties and estimating the parameters governing properties. Five synthesized fluorescence-active pyrazolines (13, 15, 18, 19 and 23) with variable Φ were selected for treating two types of paper sheets (Fabriano and Bible paper). These investigated fluorescence compounds, especially compounds 19 and 23, provide improvements in strength properties of paper sheets. Based on the observed performance they can be used as markers in security documents. PMID:29657796

  3. Synthesis, quantitative structure-property relationship study of novel fluorescence active 2-pyrazolines and application

    NASA Astrophysics Data System (ADS)

    Girgis, Adel S.; Basta, Altaf H.; El-Saied, Houssni; Mohamed, Mohamed A.; Bedair, Ahmad H.; Salim, Ahmad S.

    2018-03-01

    A variety of fluorescence-active fluorinated pyrazolines 13-33 was synthesized in good yields through cyclocondensation reaction of propenones 1-9 with aryl hydrazines 10-12. Some of the synthesized compounds provided promising fluorescence properties with quantum yield (Φ) higher than that of quinine sulfate (standard reference). Quantitative structure-property relationship studies were undertaken supporting the exhibited fluorescence properties and estimating the parameters governing properties. Five synthesized fluorescence-active pyrazolines (13, 15, 18, 19 and 23) with variable Φ were selected for treating two types of paper sheets (Fabriano and Bible paper). These investigated fluorescence compounds, especially compounds 19 and 23, provide improvements in strength properties of paper sheets. Based on the observed performance they can be used as markers in security documents.
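Quantum yields measured against a quinine sulfate standard, as in these records, are conventionally computed with the relative (comparative) method; a sketch with hypothetical measurement slopes (the articles' raw values are not given in the abstracts):

```python
def relative_quantum_yield(grad_sample, grad_ref, n_sample, n_ref, phi_ref=0.546):
    """Relative fluorescence quantum yield by the standard comparative method:
    Phi = Phi_ref * (Grad / Grad_ref) * (n / n_ref)**2,
    where Grad is the slope of integrated emission intensity versus absorbance
    and n is the solvent refractive index. Phi_ref = 0.546 is the commonly
    quoted value for quinine sulfate in 0.1 M H2SO4."""
    return phi_ref * (grad_sample / grad_ref) * (n_sample / n_ref) ** 2

# Hypothetical slopes; sample in ethanol (n=1.36), reference in dilute H2SO4 (n=1.33):
phi = relative_quantum_yield(grad_sample=1.2e6, grad_ref=1.0e6, n_sample=1.36, n_ref=1.33)
```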

  4. Comparison of Global and Mode of Action-Based Models for Aquatic Toxicity

    EPA Science Inventory

    The ability to estimate aquatic toxicity for a wide variety of chemicals is a critical need for ecological risk assessment and chemical regulation. The consensus in the literature is that mode of action (MOA) based QSAR (Quantitative Structure Activity Relationship) models yield ...

  5. An ecological framework for informing permitting decisions on scientific activities in protected areas

    PubMed Central

    Saarman, Emily T.; Owens, Brian; Murray, Steven N.; Weisberg, Stephen B.; Field, John C.; Nielsen, Karina J.

    2018-01-01

    There are numerous reasons to conduct scientific research within protected areas, but research activities may also negatively impact organisms and habitats, and thus conflict with a protected area’s conservation goals. We developed a quantitative ecological decision-support framework that estimates these potential impacts so managers can weigh costs and benefits of proposed research projects and make informed permitting decisions. The framework generates quantitative estimates of the ecological impacts of the project and the cumulative impacts of the proposed project and all other projects in the protected area, and then compares the estimated cumulative impacts of all projects with policy-based acceptable impact thresholds. We use a series of simplified equations (models) to assess the impacts of proposed research to: a) the population of any targeted species, b) the major ecological assemblages that make up the community, and c) the physical habitat that supports protected area biota. These models consider both targeted and incidental impacts to the ecosystem and include consideration of the vulnerability of targeted species, assemblages, and habitats, based on their recovery time and ecological role. We parameterized the models for a wide variety of potential research activities that regularly occur in the study area using a combination of literature review and expert judgment with a precautionary approach to uncertainty. We also conducted sensitivity analyses to examine the relationships between model input parameters and estimated impacts to understand the dominant drivers of the ecological impact estimates. Although the decision-support framework was designed for and adopted by the California Department of Fish and Wildlife for permitting scientific studies in the state-wide network of marine protected areas (MPAs), the framework can readily be adapted for terrestrial and freshwater protected areas. PMID:29920527
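At its core, the permitting decision rule described in this abstract reduces to summing per-project impact scores and comparing the cumulative total against a policy-based threshold; a toy sketch with invented scores (the framework's actual models score targeted species, assemblages, and habitats separately):

```python
# Hypothetical impact scores for projects already permitted in the protected area,
# a proposed new project, and an acceptable-impact threshold (all invented numbers).
existing_projects = {"kelp survey": 0.8, "fish tagging": 1.5}
proposed_impact = 0.9
threshold = 4.0

# Cumulative impact of all projects, including the proposed one, versus the threshold:
cumulative = sum(existing_projects.values()) + proposed_impact
permit_ok = cumulative <= threshold
```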

  6. Investigation of BOLD fMRI Resonance Frequency Shifts and Quantitative Susceptibility Changes at 7 T

    PubMed Central

    Bianciardi, Marta; van Gelderen, Peter; Duyn, Jeff H.

    2013-01-01

    Although blood oxygenation level dependent (BOLD) functional magnetic resonance imaging (fMRI) experiments of brain activity generally rely on the magnitude of the signal, they also provide frequency information that can be derived from the phase of the signal. However, because of confounding effects of instrumental and physiological origin, BOLD-related frequency information is difficult to extract and therefore rarely used. Here, we explored the use of high field (7 T) and dedicated signal processing methods to extract frequency information and use it to quantify and interpret blood oxygenation and blood volume changes. We found that optimized preprocessing improves detection of task-evoked and spontaneous changes in phase signals and resonance frequency shifts over large areas of the cortex with sensitivity comparable to that of magnitude signals. Moreover, our results suggest the feasibility of mapping BOLD quantitative susceptibility changes in at least part of the activated area and its largest draining veins. Comparison with magnitude data suggests that the observed susceptibility changes originate from neuronal activity through induced blood volume and oxygenation changes in pial and intracortical veins. Further, from frequency shifts and susceptibility values, we estimated that, relative to baseline, the fractional oxygen saturation in large vessels increased by 0.02–0.05 during stimulation, which is consistent with previously published estimates. Together, these findings demonstrate that valuable information can be derived from fMRI measurements of BOLD frequency shifts and quantitative susceptibility changes. PMID:23897623
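The link between a measured venous susceptibility change and a change in oxygen saturation is commonly modeled as Δχ = -Δχ_do · Hct · ΔY; a sketch using commonly quoted literature constants, which are assumptions here rather than values from this paper:

```python
def venous_susceptibility_shift(delta_Y, hct=0.4, dchi_do_ppm=3.39):
    """Susceptibility change (ppm, SI) of venous blood for a change delta_Y in
    fractional oxygen saturation: dchi = -dchi_do * Hct * delta_Y, where dchi_do
    is the susceptibility difference between fully deoxygenated and fully
    oxygenated red cells (~3.39 ppm SI is a commonly quoted literature value)
    and Hct is the hematocrit. An increase in saturation lowers susceptibility."""
    return -dchi_do_ppm * hct * delta_Y

# Saturation rising by 0.03 during activation (within the 0.02-0.05 range reported):
dchi = venous_susceptibility_shift(delta_Y=0.03)
```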

  7. EVALUATING THE IMPACTS OF ENVIRONMENTAL ENFORCEMENT: THE CASE OF EFFLUENT DISCHARGES IN THE PETROLEUM REFINING INDUSTRY

    EPA Science Inventory

    This paper looks at the impact of enforcement activity on facility-level behavior and derives quantitative estimates of the impact. We measure facility-level behavior as the levels of Biological Oxygen Demand (BOD) and Total Suspended Solids (TSS) pollutant discharges generated b...

  8. Remote Determination of Auroral Energy Characteristics During Substorm Activity

    NASA Technical Reports Server (NTRS)

    Germany, G. A.; Parks, G. K.; Brittnacher, M. J.; Cumnock, J.; Lummerzheim, D.; Spann, J. F., Jr.

    1997-01-01

    Ultraviolet auroral images from the Ultraviolet Imager onboard the POLAR satellite can be used as quantitative remote diagnostics of the auroral regions, yielding estimates of incident energy characteristics, compositional changes, and other higher order data products. In particular, images of long and short wavelength N2 Lyman-Birge-Hopfield (LBH) emissions can be modeled to obtain functions of energy flux and average energy that are largely insensitive to seasonal and solar activity variations. This technique is used in this study to estimate incident electron energy flux and average energy during substorm activity occurring on May 19, 1996. This event was simultaneously observed by the WIND, GEOTAIL, INTERBALL, DMSP and NOAA spacecraft as well as by POLAR. Here, incident energy estimates derived from the Ultraviolet Imager (UVI) are compared with in situ measurements of the same parameters from an overflight by the DMSP F12 satellite coincident with the UVI image times.

  9. Using Weighted Entropy to Rank Chemicals in Quantitative High Throughput Screening Experiments

    PubMed Central

    Shockley, Keith R.

    2014-01-01

    Quantitative high throughput screening (qHTS) experiments can simultaneously produce concentration-response profiles for thousands of chemicals. In a typical qHTS study, a large chemical library is subjected to a primary screen in order to identify candidate hits for secondary screening, validation studies, or prediction modeling. Different algorithms, usually based on the Hill equation logistic model, have been used to classify compounds as active, inactive, or inconclusive. However, observed concentration-response activity relationships may not adequately fit a sigmoidal curve. Furthermore, it is unclear how to prioritize chemicals for follow-up studies given the large uncertainties that often accompany parameter estimates from nonlinear models. Weighted Shannon entropy can address these concerns by ranking compounds according to profile-specific statistics derived from estimates of the probability mass distribution of response at the tested concentration levels. This strategy can be used to rank all tested chemicals in the absence of a pre-specified model structure, or the approach can complement existing activity call algorithms by ranking the returned candidate hits. The weighted entropy approach was evaluated here using data simulated from the Hill equation model. The procedure was then applied to a chemical genomics profiling data set interrogating compounds for androgen receptor agonist activity. PMID:24056003
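
    The entropy-based ranking can be illustrated with a minimal sketch. This is a simplified reading of the weighted-entropy statistic (the paper derives the probability mass distribution of response from the profile and defines specific weights); the function below simply treats the normalized absolute responses as that distribution.

```python
import numpy as np

def weighted_entropy_score(responses, weights=None):
    """Convert a concentration-response profile into a probability mass
    distribution over the tested concentrations and compute a (weighted)
    Shannon entropy. Low entropy: response mass concentrated at few
    concentrations (active-like); high entropy: flat profile."""
    r = np.abs(np.asarray(responses, dtype=float))
    if r.sum() == 0:
        return float(np.log2(r.size))  # perfectly flat: maximal entropy
    p = r / r.sum()
    w = np.ones_like(p) if weights is None else np.asarray(weights, float)
    nz = p > 0
    return float(-np.sum(w[nz] * p[nz] * np.log2(p[nz])))

flat = weighted_entropy_score([1, 1, 1, 1])     # 2.0 bits: inactive-like
peaked = weighted_entropy_score([0, 0, 0, 10])  # 0.0 bits: active-like
```

    Ranking by this score requires no pre-specified curve model, which is the property the abstract highlights.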

  10. The environmental factors as reason for emotional tension

    NASA Astrophysics Data System (ADS)

    Prisniakova, L.

    Information from the environment activates the organism: it triggers abrupt changes in nervous processes and elicits emotions. Some emotions organize and support activity, while others disorganize it. In perception, decision making, execution of operations, and learning, emotional arousal raises performance on easier tasks and reduces it on more difficult ones. The report presents the outcomes of a quantitative determination of the level of emotional tension and its effect on successful activity, and shows that the sign of its influence on human performance can reverse. The effect of emotional tension on the efficiency of professional work was shown to resemble the influence of motivation, in accordance with the Yerkes-Dodson law. The report introduces a mathematical model connecting successful activity with motivation or emotional tension. These outcomes can serve as a theoretical basis for the quantitative assessment of astronaut activity under emotional stress during crew selection.

  11. Establishing the value of occupational health nurses' contributions to worker health and safety: a pilot test of a user-friendly estimation tool.

    PubMed

    Graeve, Catherine; McGovern, Patricia; Nachreiner, Nancy M; Ayers, Lynn

    2014-01-01

    Occupational health nurses use their knowledge and skills to improve the health and safety of the working population; however, companies increasingly face budget constraints and may eliminate health and safety programs. Occupational health nurses must be prepared to document their services and outcomes, and use quantitative tools to demonstrate their value to employers. The aim of this project was to create and pilot test a quantitative tool for occupational health nurses to track their activities and potential cost savings for on-site occupational health nursing services. Tool development included a pilot test in which semi-structured interviews with occupational health and safety leaders were conducted to identify current issues and existing products used for estimating the value of occupational health nursing services. The outcome was the creation of a tool that estimates the economic value of occupational health nursing services. The feasibility and potential value of this tool are described.

  12. Magnetoencephalographic Mapping of Epileptic Spike Population Using Distributed Source Analysis: Comparison With Intracranial Electroencephalographic Spikes.

    PubMed

    Tanaka, Naoaki; Papadelis, Christos; Tamilia, Eleonora; Madsen, Joseph R; Pearl, Phillip L; Stufflebeam, Steven M

    2018-04-27

    This study evaluates magnetoencephalographic (MEG) spike population as compared with intracranial electroencephalographic (IEEG) spikes using a quantitative method based on distributed source analysis. We retrospectively studied eight patients with medically intractable epilepsy who underwent MEG and subsequent IEEG monitoring. Fifty MEG spikes were analyzed in each patient using the minimum norm estimate. For individual spikes, each vertex in the source space was considered activated when its source amplitude at the peak latency was higher than a threshold, which was set at 50% of the maximum amplitude over all vertices. We mapped the total count of activation at each vertex. We also analyzed 50 IEEG spikes in the same manner over the intracranial electrodes and created the activation count map. The location of the electrodes was obtained in the MEG source space by coregistering postimplantation computed tomography to MRI. We estimated the MEG- and IEEG-active regions associated with the spike populations using the vertices/electrodes with a count over 25. The activation count maps of MEG spikes demonstrated the localization associated with the spike population by variable count values at each vertex. The MEG-active region overlapped with 65 to 85% of the IEEG-active region in our patient group. Mapping the MEG spike population is valid for demonstrating the trend of spike clustering in patients with epilepsy. In addition, quantitative comparison of MEG and IEEG spikes may be informative for understanding their relationship.
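
    The activation-count mapping can be sketched as follows; the array shapes and toy numbers are illustrative, and in the study the amplitudes come from minimum norm estimates over the full cortical source space.

```python
import numpy as np

def activation_count_map(spike_sources, threshold_frac=0.5):
    """spike_sources: (n_spikes, n_vertices) source amplitudes at each
    spike's peak latency. A vertex counts as activated for a spike when
    its amplitude exceeds threshold_frac times that spike's maximum
    amplitude. Returns the per-vertex activation count across spikes."""
    a = np.asarray(spike_sources, dtype=float)
    thresholds = threshold_frac * a.max(axis=1, keepdims=True)
    return (a > thresholds).sum(axis=0)

# Toy data: 4 spikes over 3 vertices.
counts = activation_count_map([[10, 6, 1],
                               [ 8, 5, 2],
                               [ 9, 1, 1],
                               [ 7, 4, 0]])
# counts -> [4, 3, 0]; with 50 spikes, vertices with a count over 25
# would define the "active region" used in the MEG-IEEG comparison.
```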

  13. Assessment of Scheduling and Plan Execution of Apollo 14 Lunar Surface Operations

    NASA Technical Reports Server (NTRS)

    Marquez, Jessica J.

    2010-01-01

    Although over forty years have passed since the first landing on the Moon, there is not yet a comprehensive, quantitative assessment of Apollo extravehicular activities (EVAs). Quantitatively evaluating lunar EVAs will provide a better understanding of the challenges involved with surface operations. This first evaluation of a surface EVA centers on comparing the planned and as-run timelines, specifically collecting data on discrepancies between durations that were estimated versus executed. Differences were summarized by task categories in order to gain insight into the types of surface operation activities that were most challenging. One Apollo 14 EVA was assessed using the described methodology. The selected metrics and task categorizations were effective, and limitations to the process were identified.

  14. INFLUENCE OF MATRIX FORMULATION ON DERMAL PERCUTANEOUS ABSORPTION OF TRIAZOLE FUNGICIDES USING QSAR AND PBPK / PD MODELS

    EPA Science Inventory

    The objective of this work is to use the Exposure Related Dose Estimating Model (ERDEM) and quantitative structure-activity relationship (QSAR) models to develop an assessment tool for human exposure assessment to triazole fungicides. A dermal exposure route is used for the physi...

  15. Modeling Dynamic Functional Neuroimaging Data Using Structural Equation Modeling

    ERIC Educational Resources Information Center

    Price, Larry R.; Laird, Angela R.; Fox, Peter T.; Ingham, Roger J.

    2009-01-01

    The aims of this study were to present a method for developing a path analytic network model using data acquired from positron emission tomography. Regions of interest within the human brain were identified through quantitative activation likelihood estimation meta-analysis. Using this information, a "true" or population path model was then…

  16. DEVELOPMENT OF QUANTITATIVE STRUCTURE ACTIVITY RELATIONSHIPS (QSARS) TO PREDICT TOXICITY FOR A VARIETY OF HUMAN AND ECOLOGICAL ENDPOINTS

    EPA Science Inventory

    In general, the accuracy of a predicted toxicity value increases with increase in similarity between the query chemical and the chemicals used to develop a QSAR model. A toxicity estimation methodology employing this finding has been developed. A hierarchical based clustering t...

  17. Lower reference limits of quantitative cord glucose-6-phosphate dehydrogenase estimated from healthy term neonates according to the clinical and laboratory standards institute guidelines: a cross sectional retrospective study

    PubMed Central

    2013-01-01

    Background Previous studies have reported the lower reference limit (LRL) of quantitative cord glucose-6-phosphate dehydrogenase (G6PD), but they have not used approved international statistical methodology. Using common standards is expected to yield more reliable findings. Therefore, we aimed to estimate the LRL of quantitative G6PD detection in healthy term neonates by using statistical analyses endorsed by the International Federation of Clinical Chemistry (IFCC) and the Clinical and Laboratory Standards Institute (CLSI) for reference interval estimation. Methods This cross sectional retrospective study was performed at King Abdulaziz Hospital, Saudi Arabia, between March 2010 and June 2012. The study monitored consecutive neonates born to mothers from one Arab Muslim tribe that was assumed to have a low prevalence of G6PD-deficiency. Neonates that satisfied the following criteria were included: full-term birth (≥37 weeks); no admission to the special care nursery; no phototherapy treatment; negative direct antiglobulin test; and fathers of female neonates were from the same mothers’ tribe. The G6PD activity (Units/gram Hemoglobin) was measured spectrophotometrically by an automated kit. This study used statistical analyses endorsed by IFCC and CLSI for reference interval estimation. The 2.5th percentiles and the corresponding 95% confidence intervals (CI) were estimated as LRLs, both in the presence and absence of outliers. Results 207 male and 188 female term neonates who had cord blood quantitative G6PD testing met the inclusion criteria. Horn's method detected 20 G6PD values as outliers (8 males and 12 females). Quantitative cord G6PD values were normally distributed only in the absence of the outliers. The Harris-Boyd method and proportion criteria revealed that combined gender LRLs were reliable.
    The combined bootstrap LRL in the presence of the outliers was 10.0 (95% CI: 7.5-10.7) and the combined parametric LRL in the absence of the outliers was 11.0 (95% CI: 10.5-11.3). Conclusion These results contribute to the LRL of quantitative cord G6PD detection in full-term neonates. They are transferable to another laboratory when pre-analytical factors and testing methods are comparable and the IFCC-CLSI requirements of transference are satisfied. We suggest using the LRL estimated in the absence of the outliers, because mislabeling G6PD-deficient neonates as normal is intolerable, whereas mislabeling G6PD-normal neonates as deficient is tolerable. PMID:24016342
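
    The nonparametric side of the LRL computation can be sketched as a 2.5th percentile with a bootstrap confidence interval. The data below are synthetic, and the sketch omits the outlier detection, normality checks, and gender-partitioning criteria that the study applies per the IFCC-CLSI methodology.

```python
import numpy as np

rng = np.random.default_rng(7)

def bootstrap_lrl(values, pct=2.5, n_boot=2000):
    """Lower reference limit as the 2.5th percentile of the reference
    sample, with a nonparametric bootstrap 95% CI."""
    v = np.asarray(values, dtype=float)
    lrl = np.percentile(v, pct)
    boots = [np.percentile(rng.choice(v, size=v.size, replace=True), pct)
             for _ in range(n_boot)]
    lo, hi = np.percentile(boots, [2.5, 97.5])
    return lrl, (lo, hi)

# Synthetic G6PD activities (U/g Hb), for illustration only.
sample = rng.normal(loc=13.0, scale=1.5, size=395)
lrl, ci = bootstrap_lrl(sample)
```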

  18. Quantitation using a stable isotope dilution assay (SIDA) and thresholds of taste-active pyroglutamyl decapeptide ethyl esters (PGDPEs) in sake.

    PubMed

    Hashizume, Katsumi; Ito, Toshiko; Igarashi, Shinya

    2017-03-01

    A stable isotope dilution assay (SIDA) for two taste-active pyroglutamyl decapeptide ethyl esters (PGDPE1: (pGlu)LFGPNVNPWCOOC2H5; PGDPE2: (pGlu)LFNPSTNPWCOOC2H5) in sake was developed using deuterated isotopes and high-resolution mass spectrometry. Recognition thresholds of PGDPEs in sake were estimated as 3.8 μg/L for PGDPE1 and 8.1 μg/L for PGDPE2, evaluated using 11 student panelists in their twenties. Quantitated concentrations in 18 commercial sake samples ranged from 0 to 27 μg/L for PGDPE1 and from 0 to 202 μg/L for PGDPE2. The maximum levels of PGDPE1 and PGDPE2 in the sake samples were approximately 8 and 25 times higher than the estimated recognition thresholds, respectively. The results indicate that PGDPEs may play significant sensory roles in sake. The level of PGDPEs in unpasteurized sake samples decreased during storage for 50 days at 6 °C, suggesting PGDPEs may be enzymatically decomposed.

  19. Quantitative analysis of desorption and decomposition kinetics of formic acid on Cu(111): The importance of hydrogen bonding between adsorbed species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiozawa, Yuichiro; Koitaya, Takanori; Mukai, Kozo

    2015-12-21

    Quantitative analysis of desorption and decomposition kinetics of formic acid (HCOOH) on Cu(111) was performed by temperature programmed desorption (TPD), X-ray photoelectron spectroscopy, and time-resolved infrared reflection absorption spectroscopy. The activation energy for desorption is estimated to be 53–75 kJ/mol by the threshold TPD method as a function of coverage. Vibrational spectra of the first layer HCOOH at 155.3 K show that adsorbed molecules form a polymeric structure via the hydrogen bonding network. Adsorbed HCOOH molecules are dissociated gradually into monodentate formate species. The activation energy for the dissociation into monodentate formate species is estimated to be 65.0 kJ/mol at a submonolayer coverage (0.26 molecules/surface Cu atom). The hydrogen bonding between adsorbed HCOOH species plays an important role in the stabilization of HCOOH on Cu(111). The monodentate formate species are stabilized at higher coverages, because of the lack of vacant sites for the bidentate formation.

  20. Combining PALM and SOFI for quantitative imaging of focal adhesions in living cells

    NASA Astrophysics Data System (ADS)

    Deschout, Hendrik; Lukes, Tomas; Sharipov, Azat; Feletti, Lely; Lasser, Theo; Radenovic, Aleksandra

    2017-02-01

    Focal adhesions are complicated assemblies of hundreds of proteins that allow cells to sense their extracellular matrix and adhere to it. Although most focal adhesion proteins have been identified, their spatial organization in living cells remains challenging to observe. Photo-activated localization microscopy (PALM) is an interesting technique for this purpose, especially since it allows estimation of molecular parameters such as the number of fluorophores. However, focal adhesions are dynamic entities, requiring a temporal resolution below one minute, which is difficult to achieve with PALM. In order to address this problem, we merged PALM with super-resolution optical fluctuation imaging (SOFI) by applying both techniques to the same data. Since SOFI tolerates an overlap of single molecule images, it can improve the temporal resolution compared to PALM. Moreover, an adaptation called balanced SOFI (bSOFI) allows estimation of molecular parameters, such as the fluorophore density. We therefore performed simulations in order to assess PALM and SOFI for quantitative imaging of dynamic structures. We demonstrated the potential of our PALM-SOFI concept as a quantitative imaging framework by investigating moving focal adhesions in living cells.

  1. Identification and Quantitation of Flavanols and Proanthocyanidins in Foods: How Good Are the Data?

    PubMed Central

    Kelm, Mark A.; Hammerstone, John F.; Schmitz, Harold H.

    2005-01-01

    Evidence suggesting that dietary polyphenols, flavanols, and proanthocyanidins in particular offer significant cardiovascular health benefits is rapidly increasing. Accordingly, reliable and accurate methods are needed to provide qualitative and quantitative food composition data necessary for high quality epidemiological and clinical research. Measurements for flavonoids and proanthocyanidins have employed a range of analytical techniques, with various colorimetric assays still being popular for estimating total polyphenolic content in foods and other biological samples despite advances made with more sophisticated analyses. More crudely, estimates of polyphenol content, as well as antioxidant activity, are also reported as values relating to radical scavenging activity. High-performance liquid chromatography (HPLC) is the method of choice for quantitative analysis of individual polyphenols such as flavanols and proanthocyanidins. Qualitative information regarding proanthocyanidin structure is currently obtained by chemical methods such as thiolysis and by HPLC-mass spectrometry (MS) techniques. The lack of appropriate standards is the single most important factor limiting the aforementioned analyses. However, with ever expanding research in the arena of flavanols, proanthocyanidins, and health, and the importance of their future inclusion in food composition databases, the need for standards becomes more critical. At present, sufficiently well-characterized standard material is available for selective flavanols and proanthocyanidins, and construction of at least a limited food composition database is feasible. PMID:15712597

  2. The effects of physical activity on impulsive choice: Influence of sensitivity to reinforcement amount and delay

    PubMed Central

    Strickland, Justin C.; Feinstein, Max A.; Lacy, Ryan T.; Smith, Mark A.

    2016-01-01

    Impulsive choice is a diagnostic feature and/or complicating factor for several psychological disorders and may be examined in the laboratory using delay-discounting procedures. Recent investigators have proposed using quantitative methods of analysis to examine the behavioral processes contributing to impulsive choice. The purpose of this study was to examine the effects of physical activity (i.e., wheel running) on impulsive choice in a single-response, discrete-trial procedure using two quantitative methods of analysis. To this end, rats were assigned to physical activity or sedentary groups and trained to respond in a delay-discounting procedure. In this procedure, one lever always produced one food pellet immediately, whereas a second lever produced three food pellets after a 0, 10, 20, 40, or 80-second delay. Estimates of sensitivity to reinforcement amount and sensitivity to reinforcement delay were determined using (1) a simple linear analysis and (2) an analysis of logarithmically transformed response ratios. Both analyses revealed that physical activity decreased sensitivity to reinforcement amount and sensitivity to reinforcement delay. These findings indicate that (1) physical activity has significant but functionally opposing effects on the behavioral processes that contribute to impulsive choice and (2) both quantitative methods of analysis are appropriate for use in single-response, discrete-trial procedures. PMID:26964905
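
    The log-ratio analysis can be illustrated roughly as follows. The choice proportions are hypothetical, and the study's actual model estimates separate sensitivity parameters for reinforcement amount and delay; in this sketch the intercept stands in for amount sensitivity (preference for the larger reinforcer at negligible delay) and the slope for delay sensitivity.

```python
import numpy as np

delays = np.array([10.0, 20.0, 40.0, 80.0])    # s, delay to larger reinforcer
p_large = np.array([0.80, 0.65, 0.45, 0.30])   # hypothetical choice proportions

# Log response ratio: preference for the larger (3-pellet) option.
log_ratio = np.log10(p_large / (1 - p_large))
slope, intercept = np.polyfit(np.log10(delays), log_ratio, 1)
# slope < 0: preference for the larger reinforcer decays with delay;
# a flatter slope after wheel running would indicate reduced delay
# sensitivity, a steeper one increased sensitivity.
```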

  3. Common and distinct neural correlates of personal and vicarious reward: A quantitative meta-analysis

    PubMed Central

    Morelli, Sylvia A.; Sacchet, Matthew D.; Zaki, Jamil

    2015-01-01

    Individuals experience reward not only when directly receiving positive outcomes (e.g., food or money), but also when observing others receive such outcomes. This latter phenomenon, known as vicarious reward, is a perennial topic of interest among psychologists and economists. More recently, neuroscientists have begun exploring the neuroanatomy underlying vicarious reward. Here we present a quantitative whole-brain meta-analysis of this emerging literature. We identified 25 functional neuroimaging studies that included contrasts between vicarious reward and a neutral control, and subjected these contrasts to an activation likelihood estimate (ALE) meta-analysis. This analysis revealed a consistent pattern of activation across studies, spanning structures typically associated with the computation of value (especially ventromedial prefrontal cortex) and mentalizing (including dorsomedial prefrontal cortex and superior temporal sulcus). We further quantitatively compared this activation pattern to activation foci from a previous meta-analysis of personal reward. Conjunction analyses yielded overlapping VMPFC activity in response to personal and vicarious reward. Contrast analyses identified preferential engagement of the nucleus accumbens in response to personal as compared to vicarious reward, and in mentalizing-related structures in response to vicarious as compared to personal reward. These data shed light on the common and unique components of the reward that individuals experience directly and through their social connections. PMID:25554428

  4. Modifiable lifestyle factors affecting bone health using calcaneus quantitative ultrasound in adolescent girls.

    PubMed

    Robinson, M L; Winters-Stone, K; Gabel, K; Dolny, D

    2007-08-01

    One hundred and fourteen girls were measured for calcaneus QUS (stiffness index score), calcium intake, weight, and total hours spent in physical activity (moderate to high-impact activities and low to no-impact activities). Multiple regression analysis indicated that hours spent in moderate to high-impact activities, current calcium intake, and weight significantly predicted SI. To determine the influence of modifiable lifestyle factors on adolescent girls' bone health measured by calcaneus quantitative ultrasound (QUS). One hundred and fourteen girls, ages 14-18 (15.97 +/- .7), enrolled in high school physical education classes, were measured for calcaneus QUS (stiffness index score), height, weight, current calcium intake from 2-3 day food records, and estimated total hours spent in physical activity from kindergarten to present. Cumulative physical activity hours were separated into two classifications (according to their estimated strain from ground reaction force): moderate to high-impact activities and low to no-impact activities. Pearson correlations between stiffness index (SI) and age, height, weight, current calcium intake, and hours spent in moderate to high-impact versus low to no-impact activities indicated a positive relationships between SI and weight (r = .259, p = .005), current calcium intake (r = .286, p = .002), and hours spent in moderate to high-impact activities (r = .451, p < .001). Multiple regression between SI and the above independent variables indicated that collectively, hours spent in moderate to high-impact activities, current calcium intake, and weight (r (2) = .363, p = <.001) significantly predicted SI. Our data indicate that moderate to high-impact activities, current calcium intake, and weight positively influence bone properties of the calcaneus in adolescent girls.

  5. The Mapping Model: A Cognitive Theory of Quantitative Estimation

    ERIC Educational Resources Information Center

    von Helversen, Bettina; Rieskamp, Jorg

    2008-01-01

    How do people make quantitative estimations, such as estimating a car's selling price? Traditionally, linear-regression-type models have been used to answer this question. These models assume that people weight and integrate all information available to estimate a criterion. The authors propose an alternative cognitive theory for quantitative…

  6. Quantitative assessment of hit detection and confirmation in single and duplicate high-throughput screenings.

    PubMed

    Wu, Zhijin; Liu, Dongmei; Sui, Yunxia

    2008-02-01

    The process of identifying active targets (hits) in high-throughput screening (HTS) usually involves 2 steps: first, removing or adjusting for systematic variation in the measurement process so that extreme values represent strong biological activity instead of systematic biases such as plate effect or edge effect and, second, choosing a meaningful cutoff on the calculated statistic to declare positive compounds. Both false-positive and false-negative errors are inevitable in this process. Common control or estimation of error rates is often based on an assumption of normally distributed noise. The error rates in hit detection, especially false-negative rates, are hard to verify because in most assays, only compounds selected in primary screening are followed up in confirmation experiments. In this article, the authors take advantage of a quantitative HTS experiment in which all compounds are tested 42 times over a wide range of 14 concentrations so true positives can be found through a dose-response curve. Using the activity status defined by the dose curve, the authors analyzed the effect of various data-processing procedures on the sensitivity and specificity of hit detection, the control of error rate, and hit confirmation. A new summary score is proposed and demonstrated to perform well in hit detection and to be useful in confirmation rate estimation. In general, adjusting for positional effects is beneficial, but a robust test can prevent overadjustment. Error rates estimated based on a normal assumption do not agree with actual error rates, because the tails of the noise distribution deviate from normality. However, the false discovery rate based on an empirically estimated null distribution is very close to the observed false discovery proportion.
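
    The advantage of an empirically estimated null can be sketched as follows; the scores are hypothetical and the function is an illustration, not the paper's proposed summary score.

```python
import numpy as np

def empirical_fdr(scores, null_scores, cutoff):
    """Estimate the false discovery rate at a hit-calling cutoff from an
    empirical null distribution (e.g. negative-control scores) instead of
    assuming normally distributed noise."""
    scores = np.asarray(scores, dtype=float)
    null_scores = np.asarray(null_scores, dtype=float)
    n_called = int(np.sum(scores >= cutoff))
    if n_called == 0:
        return 0.0
    # Expected false positives: the null tail fraction scaled to the
    # number of compounds screened.
    tail = np.mean(null_scores >= cutoff)
    return min(1.0, tail * scores.size / n_called)

fdr = empirical_fdr(scores=[3.0, 2.5, 0.5, 0.1],
                    null_scores=[0.0, 0.5, 2.5, 1.0], cutoff=2.0)
# null tail fraction 1/4, 2 of 4 compounds called -> estimated FDR 0.5
```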

  7. Comparison of blood flow models and acquisitions for quantitative myocardial perfusion estimation from dynamic CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-04-01

    Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivates this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 ml (min g)⁻¹, cardiac output = 3, 5, 8 L min⁻¹). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features including heterogeneous microvascular flow, permeability and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. CT acquisitions that reduce radiation exposure were implemented by varying both temporal sampling (1, 2, and 3 s sampling intervals) and tube currents (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (two-compartment model, an axially-distributed model, and the adiabatic approximation to the tissue homogeneous model) and a qualitative slope-based method. In total, over 11 000 time attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam hardening correction, the slope method consistently underestimated flow by 47.5% on average, while the quantitative models provided estimates with less than 6.5% average bias and increasing variance with increasing dose reductions. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates using the qualitative slope method were inferior in terms of bias and RMSE compared to the quantitative methods. MBF estimate error was equal at matched dose reductions for all quantitative methods and range of techniques evaluated.
    This suggests that there is no particular advantage among the quantitative estimation methods, nor any advantage to reducing dose via tube current reduction rather than temporal sampling reduction. These data are important for optimizing implementation of cardiac dynamic CT in clinical practice and in prospective CT MBF trials.
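
    The slope method's underestimation can be reproduced qualitatively with a toy one-compartment simulation. The AIF shape, flow value, and distribution volume below are hypothetical, and the paper's kinetic model (heterogeneous flow, permeability, capillary contrast gradients) is far more detailed.

```python
import numpy as np

dt = 1.0                                    # s, sampling interval
t = np.arange(0.0, 60.0, dt)
aif = 400 * (t / 8) ** 3 * np.exp(-t / 8)   # synthetic arterial input (HU)

def tissue_curve(mbf_ml_min_g, dist_vol=0.15):
    """One-compartment tissue curve: AIF convolved with a flow-scaled
    exponential impulse response."""
    f = mbf_ml_min_g / 60.0                 # flow in ml/(s g)
    irf = f * np.exp(-(f / dist_vol) * t)
    return np.convolve(aif, irf)[: t.size] * dt

tac = tissue_curve(2.0)                     # true MBF = 2.0 ml/(min g)
# Qualitative slope estimate: peak tissue upslope over peak arterial input.
slope_mbf = 60.0 * np.max(np.diff(tac)) / dt / np.max(aif)
# slope_mbf falls well below the true 2.0 ml/(min g), mirroring the
# systematic underestimation reported for the slope method.
```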

  8. Quantitative evaluation of hidden defects in cast iron components using ultrasound activated lock-in vibrothermography.

    PubMed

    Montanini, R; Freni, F; Rossi, G L

    2012-09-01

    This paper reports one of the first experimental results on the application of ultrasound activated lock-in vibrothermography for quantitative assessment of buried flaws in complex cast parts. The use of amplitude modulated ultrasonic heat generation allowed selective response of defective areas within the part, as the defect itself is turned into a local thermal wave emitter. Quantitative evaluation of hidden defects was accomplished by independently estimating both the area and the depth extension of the buried flaws, while x-ray 3D computed tomography was used as reference for sizing accuracy assessment. To retrieve the flaw area, a simple yet effective histogram-based phase image segmentation algorithm with automatic pixel classification has been developed. A clear correlation was found between the thermal (phase) signature measured by the infrared camera on the target surface and the actual mean cross-section area of the flaw. Due to the very fast cycle time (<30 s/part), the method could potentially be applied for 100% quality control of casting components.
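
    The histogram-based segmentation step can be sketched with an Otsu-style threshold. This is a generic stand-in: the paper describes a histogram-based phase image segmentation with automatic pixel classification, and its exact criterion may differ.

```python
import numpy as np

def segment_flaw(phase_img, pixel_area_mm2, n_bins=256):
    """Choose the histogram threshold minimizing within-class variance
    (Otsu's criterion), classify pixels as flaw vs. background, and
    return the mask plus the estimated flaw area."""
    p = np.asarray(phase_img, dtype=float).ravel()
    hist, edges = np.histogram(p, bins=n_bins)
    probs = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = edges[0], np.inf
    for i in range(1, n_bins):
        w0, w1 = probs[:i].sum(), probs[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (probs[:i] * centers[:i]).sum() / w0
        m1 = (probs[i:] * centers[i:]).sum() / w1
        var_w = (probs[:i] * (centers[:i] - m0) ** 2).sum() + \
                (probs[i:] * (centers[i:] - m1) ** 2).sum()
        if var_w < best_var:
            best_var, best_t = var_w, edges[i]
    mask = np.asarray(phase_img, dtype=float) >= best_t
    return mask, float(mask.sum()) * pixel_area_mm2

# Toy phase image: a 10 x 5 pixel "hot" flaw on a cold background.
img = np.zeros((64, 64))
img[20:30, 10:15] = 1.0
mask, area = segment_flaw(img, pixel_area_mm2=0.01)   # area ~ 0.5 mm^2
```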

  9. Quantitative genetic tools for insecticide resistance risk assessment: estimating the heritability of resistance

    Treesearch

    Michael J. Firko; Jane Leslie Hayes

    1990-01-01

    Quantitative genetic studies of resistance can provide estimates of genetic parameters not available with other types of genetic analyses. Three methods are discussed for estimating the amount of additive genetic variation in resistance to individual insecticides and subsequent estimation of heritability (h²) of resistance. Sibling analysis and...

  10. Quantitative metrics for assessing predicted climate change pressure on North American tree species

    Treesearch

    Kevin M. Potter; William W. Hargrove

    2013-01-01

    Changing climate may pose a threat to forest tree species, forcing three potential population-level responses: toleration/adaptation, movement to suitable environmental conditions, or local extirpation. Assessments that prioritize and classify tree species for management and conservation activities in the face of climate change will need to incorporate estimates of the...

  11. Benzene exposure in the petroleum distribution industry associated with leukemia in the United Kingdom: overview of the methodology of a case-control study.

    PubMed Central

    Rushton, L

    1996-01-01

    This paper describes basic principles underlying the methodology for obtaining quantitative estimates of benzene exposure in the petroleum marketing and distribution industry. Work histories for 91 cases of leukemia and 364 matched controls (4 per case) identified for a cohort of oil distribution workers up to the end of 1992 were obtained, primarily from personnel records. Information on the distribution sites, more than 90% of which were closed at the time of data collection, was obtained from site visits and archive material. Industrial hygiene measurements taken under known conditions were assembled for different tasks. Where measured data were not available, these were adjusted using variables known to influence exposure, such as temperature, technology, percentage of benzene in fuel handled, products handled, number of loads, and job activity. A quantitative estimate of dermal contact and peak exposure was also made. PMID:9118922
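
    The adjustment step can be sketched as a simple multiplicative exposure model; the baseline level and factor values below are hypothetical, not the study's calibrated values.

```python
def adjusted_exposure(base_ppm, modifiers):
    """base_ppm: task exposure measured under known conditions.
    modifiers: multiplicative adjustment factors for unmeasured
    conditions, e.g. era of technology, fuel benzene content, number of
    loads, or job activity."""
    exposure = base_ppm
    for factor in modifiers.values():
        exposure *= factor
    return exposure

# Hypothetical example: a 0.5 ppm baseline adjusted for top loading and
# a higher-benzene fuel.
level = adjusted_exposure(0.5, {"top_loading": 2.0, "high_benzene_fuel": 1.5})
# level = 1.5 ppm
```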

  12. A NOVEL TECHNIQUE FOR QUANTITATIVE ESTIMATION OF UPTAKE OF DIESEL EXHAUST PARTICLES BY LUNG CELLS

    EPA Science Inventory

    While airborne particulates like diesel exhaust particulates (DEP) exert significant toxicological effects on lungs, quantitative estimation of accumulation of DEP inside lung cells has not been reported due to a lack of an accurate and quantitative technique for this purpose. I...

  13. Should fatty acid signature proportions sum to 1 for diet estimation?

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Budge, Suzanne M.; Thiemann, Gregory W.

    2016-01-01

    Knowledge of predator diets, including how diets might change through time or differ among predators, provides essential insights into their ecology. Diet estimation therefore remains an active area of research within quantitative ecology. Quantitative fatty acid signature analysis (QFASA) is an increasingly common method of diet estimation. QFASA is based on a data library of prey signatures, which are vectors of proportions summarizing the fatty acid composition of lipids, and diet is estimated as the mixture of prey signatures that most closely approximates a predator’s signature. Diets are typically estimated using proportions from a subset of all fatty acids that are known to be solely or largely influenced by diet. Given the subset of fatty acids selected, the current practice is to scale their proportions to sum to 1.0. However, scaling signature proportions has the potential to distort the structural relationships within a prey library and between predators and prey. To investigate that possibility, we compared the practice of scaling proportions with two alternatives and found that the traditional scaling can meaningfully bias diet estimators under some conditions. Two aspects of the prey types that contributed to a predator’s diet influenced the magnitude of the bias: the degree to which the sums of unscaled proportions differed among prey types and the identifiability of prey types within the prey library. We caution investigators against the routine scaling of signature proportions in QFASA.
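The core QFASA idea can be sketched as a mixture problem: find the nonnegative combination of prey signatures closest to the predator signature. The prey library and signatures below are invented, and least squares stands in for the specialized distance measures used in QFASA proper:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical prey library: one row per prey type, columns are
# fatty acid proportions (each signature sums to 1).
prey = np.array([
    [0.50, 0.30, 0.20],  # prey type A
    [0.20, 0.50, 0.30],  # prey type B
])

# Predator signature constructed here as a 60/40 mixture of A and B.
predator = 0.6 * prey[0] + 0.4 * prey[1]

# Estimate diet as the nonnegative mixture of prey signatures that
# best approximates the predator signature.
coeffs, _ = nnls(prey.T, predator)
diet = coeffs / coeffs.sum()  # normalize to diet proportions
```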

  14. Development of quantitative exposure data for a pooled exposure-response analysis of 10 silica cohorts.

    PubMed

    Mannetje, Andrea 't; Steenland, Kyle; Checkoway, Harvey; Koskela, Riitta-Sisko; Koponen, Matti; Attfield, Michael; Chen, Jingqiong; Hnizdo, Eva; DeKlerk, Nicholas; Dosemeci, Mustafa

    2002-08-01

    Comprehensive quantitative silica exposure estimates over time, measured in the same units across a number of cohorts, would make possible a pooled exposure-response analysis for lung cancer. Such an analysis would help clarify the continuing controversy regarding whether silica causes lung cancer. Existing quantitative exposure data for 10 silica-exposed cohorts were retrieved from the original investigators. Occupation- and time-specific exposure estimates were either adopted/adapted or developed for each cohort, and converted to milligram per cubic meter (mg/m(3)) respirable crystalline silica. Quantitative exposure assignments were typically based on a large number (thousands) of raw measurements, or otherwise consisted of exposure estimates by experts (for two cohorts). Median exposure level of the cohorts ranged between 0.04 and 0.59 mg/m(3) respirable crystalline silica. Exposure estimates were partially validated via their successful prediction of silicosis in these cohorts. Existing data were successfully adopted or modified to create comparable quantitative exposure estimates over time for 10 silica-exposed cohorts, permitting a pooled exposure-response analysis. The difficulties encountered in deriving common exposure estimates across cohorts are discussed. Copyright 2002 Wiley-Liss, Inc.
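Converting historical particle-count measurements to a common gravimetric unit is the kind of step such pooling requires. A sketch with an assumed, approximate conversion factor (the true factor is dust- and site-specific, and is not the one used by the investigators):

```python
# Assumed approximate conversion: 1 mppcf (million particles per cubic
# foot) of respirable dust ~ 0.1 mg/m^3. Illustrative only.
MPPCF_TO_MG_M3 = 0.1

def respirable_silica_mg_m3(mppcf, quartz_fraction):
    """Estimate respirable crystalline silica (mg/m^3) from a
    particle-count measurement and the quartz fraction of the dust."""
    return mppcf * MPPCF_TO_MG_M3 * quartz_fraction

exposure = respirable_silica_mg_m3(5.0, 0.2)  # ~0.1 mg/m^3
```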

  15. Estimating Active Transportation Behaviors to Support Health Impact Assessment in the United States

    PubMed Central

    Mansfield, Theodore J.; Gibson, Jacqueline MacDonald

    2016-01-01

    Health impact assessment (HIA) has been promoted as a means to encourage transportation and city planners to incorporate health considerations into their decision-making. Ideally, HIAs would include quantitative estimates of the population health effects of alternative planning scenarios, such as scenarios with and without infrastructure to support walking and cycling. However, the lack of baseline estimates of time spent walking or biking for transportation (together known as “active transportation”), which are critically related to health, often prevents planners from developing such quantitative estimates. To address this gap, we use data from the 2009 US National Household Travel Survey to develop a statistical model that estimates baseline time spent walking and biking as a function of the type of transportation used to commute to work along with demographic and built environment variables. We validate the model using survey data from the Raleigh–Durham–Chapel Hill, NC, USA, metropolitan area. We illustrate how the validated model could be used to support transportation-related HIAs by estimating the potential health benefits of built environment modifications that support walking and cycling. Our statistical model estimates that on average, individuals who commute on foot spend an additional 19.8 (95% CI 16.9–23.2) minutes per day walking compared to automobile commuters. Public transit riders walk an additional 5.0 (95% CI 3.5–6.4) minutes per day compared to automobile commuters. Bicycle commuters cycle for an additional 28.0 (95% CI 17.5–38.1) minutes per day compared to automobile commuters. The statistical model was able to predict observed transportation physical activity in the Raleigh–Durham–Chapel Hill region to within 0.5 MET-hours per day (equivalent to about 9 min of daily walking time) for 83% of observations. 
Across the Raleigh–Durham–Chapel Hill region, an estimated 38 (95% CI 15–59) premature deaths potentially could be avoided if the entire population walked 37.4 min per week for transportation (the amount of transportation walking observed in previous US studies of walkable neighborhoods). The approach developed here is useful both for estimating baseline behaviors in transportation HIAs and for comparing the magnitude of risks associated with physical inactivity to other competing health risks in urban areas. PMID:27200327
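The equivalence quoted above between 0.5 MET-hours and about 9 minutes of walking is simple arithmetic. A sketch, where the MET intensity is an assumed typical value for transportation walking rather than a figure from the paper:

```python
WALKING_MET = 3.3  # assumed MET intensity of moderate-pace walking

def met_hours_per_day(minutes_per_day, met=WALKING_MET):
    """Convert daily activity minutes into MET-hours per day."""
    return minutes_per_day / 60.0 * met

# About 9 minutes of daily walking corresponds to roughly 0.5 MET-hours,
# matching the prediction tolerance quoted above.
tolerance = met_hours_per_day(9)
```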

  17. Inter- and intra-observer agreement of BI-RADS-based subjective visual estimation of amount of fibroglandular breast tissue with magnetic resonance imaging: comparison to automated quantitative assessment.

    PubMed

    Wengert, G J; Helbich, T H; Woitek, R; Kapetas, P; Clauser, P; Baltzer, P A; Vogl, W-D; Weber, M; Meyer-Baese, A; Pinker, Katja

    2016-11-01

    To evaluate the inter-/intra-observer agreement of BI-RADS-based subjective visual estimation of the amount of fibroglandular tissue (FGT) with magnetic resonance imaging (MRI), and to investigate whether FGT assessment benefits from an automated, observer-independent, quantitative MRI measurement by comparing both approaches. Eighty women with no imaging abnormalities (BI-RADS 1 and 2) were included in this institutional review board (IRB)-approved prospective study. All women underwent un-enhanced breast MRI. Four radiologists independently assessed FGT with MRI by subjective visual estimation according to BI-RADS. Automated observer-independent quantitative measurement of FGT with MRI was performed using a previously described measurement system. Inter-/intra-observer agreements of qualitative and quantitative FGT measurements were assessed using Cohen's kappa (k). Inexperienced readers achieved moderate inter-/intra-observer agreement and experienced readers a substantial inter- and perfect intra-observer agreement for subjective visual estimation of FGT. Practice and experience reduced observer-dependency. Automated observer-independent quantitative measurement of FGT was successfully performed and revealed only fair to moderate agreement (k = 0.209-0.497) with subjective visual estimations of FGT. Subjective visual estimation of FGT with MRI shows moderate intra-/inter-observer agreement, which can be improved by practice and experience. Automated observer-independent quantitative measurements of FGT are necessary to allow a standardized risk evaluation. • Subjective FGT estimation with MRI shows moderate intra-/inter-observer agreement in inexperienced readers. • Inter-observer agreement can be improved by practice and experience. • Automated observer-independent quantitative measurements can provide reliable and standardized assessment of FGT with MRI.

  18. Technique for Determination of Rational Boundaries in Combining Construction and Installation Processes Based on Quantitative Estimation of Technological Connections

    NASA Astrophysics Data System (ADS)

    Gusev, E. V.; Mukhametzyanov, Z. R.; Razyapov, R. V.

    2017-11-01

The limitations of existing methods for determining which technologically interlinked construction and installation processes can be combined are considered under the modern conditions of constructing various facilities. The need to identify common parameters characterizing the interaction of all technologically related construction and installation processes and activities is shown. The technologies of construction and installation processes for buildings and structures were studied with the goal of determining a common parameter for evaluating the relationship between technologically interconnected processes and construction works. The result of this research is a quantitative estimate of the interaction of construction and installation processes and activities: the minimum technologically necessary volume of a preceding process that allows a subsequent, technologically interconnected process to be planned and organized. This quantitative estimate is used as the basis for calculating the optimum range over which processes and activities can be combined. The calculation method is based on graph theory. The authors applied a generic characterization parameter to reveal the technological links between construction and installation processes, and the proposed technique has adaptive properties that are key for wide use in forming organizational decisions. The article also describes the practical significance of the developed technique.

  19. Reproducibility and Accuracy of Quantitative Myocardial Blood Flow Using 82Rb-PET: Comparison with 13N-Ammonia

    PubMed Central

    Fakhri, Georges El

    2011-01-01

82Rb cardiac PET allows the assessment of myocardial perfusion using a column generator in clinics that lack a cyclotron. We and others have previously shown that quantitation of myocardial blood flow (MBF) and coronary flow reserve (CFR) is feasible using dynamic 82Rb PET and factor and compartment analyses. The aim of the present work was to determine the intra- and inter-observer variability of MBF estimation using 82Rb PET, as well as the reproducibility of our generalized factor + compartment analysis methodology for estimating MBF, and to assess its accuracy by comparing, in the same subjects, 82Rb estimates of MBF to those obtained using 13N-ammonia. Methods Twenty-two subjects were included in the reproducibility study and twenty subjects in the validation study. Patients were injected with 60 ± 5 mCi of 82Rb and imaged dynamically for 6 minutes at rest and during dipyridamole stress. Left and right ventricular (LV+RV) time-activity curves were estimated by GFADS and used as input to a 2-compartment kinetic analysis that estimates parametric maps of myocardial tissue extraction (K1) and egress (k2), as well as LV+RV contributions (fv, rv). Results Our results show excellent reproducibility of the quantitative dynamic approach itself, with coefficients of repeatability of 1.7% for estimation of MBF at rest, 1.4% for MBF at peak stress, and 2.8% for CFR estimation. The inter-observer reproducibility between the four observers who participated in this study was also very good, with correlation coefficients greater than 0.87 between any two given observers when estimating coronary flow reserve. The reproducibility of MBF in repeated 82Rb studies was good at rest and excellent at peak stress (r2=0.835). Furthermore, the slope of the correlation line was very close to 1 when estimating stress MBF and CFR in repeated 82Rb studies. 
The correlation between myocardial flow estimates obtained at rest and during peak stress in 82Rb and 13N-ammonia studies was very good at rest (r2=0.843) and stress (r2=0.761). The Bland-Altman plots show no significant presence of proportional error at rest or stress, nor a dependence of the variations on the amplitude of the myocardial blood flow at rest or stress. A small systematic overestimation of 13N-ammonia MBF was observed with 82Rb at rest (0.129 ml/g/min) and the opposite, i.e., underestimation, at stress (0.22 ml/g/min). Conclusions Our results show that absolute quantitation of myocardial blood flow is reproducible and accurate with 82Rb dynamic cardiac PET as compared to 13N-ammonia. The reproducibility of the quantitation approach itself was very good, as was inter-observer reproducibility. PMID:19525467
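The kinetic step described above (estimating K1 and k2 from ventricular input and tissue curves) rests on a one-tissue-compartment model. A toy forward simulation of a tissue time-activity curve, with a made-up input function and hypothetical rate constants:

```python
import numpy as np

t = np.arange(0.0, 360.0, 1.0)       # 6-minute scan, 1-s sampling (s)
ca = (t / 60.0) * np.exp(-t / 60.0)  # toy arterial input function

K1, k2 = 0.8, 0.2                    # hypothetical rate constants (1/min)

# One-tissue-compartment model: Ct(t) = K1 * [Ca convolved with exp(-k2*t)]
dt_min = 1.0 / 60.0                  # step width in minutes
ct = K1 * np.convolve(ca, np.exp(-k2 * t / 60.0))[: t.size] * dt_min
```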

  20. Quantitative imaging biomarkers: Effect of sample size and bias on confidence interval coverage.

    PubMed

    Obuchowski, Nancy A; Bullen, Jennifer

    2017-01-01

    Introduction Quantitative imaging biomarkers (QIBs) are being increasingly used in medical practice and clinical trials. An essential first step in the adoption of a quantitative imaging biomarker is the characterization of its technical performance, i.e. precision and bias, through one or more performance studies. Then, given the technical performance, a confidence interval for a new patient's true biomarker value can be constructed. Estimating bias and precision can be problematic because rarely are both estimated in the same study, precision studies are usually quite small, and bias cannot be measured when there is no reference standard. Methods A Monte Carlo simulation study was conducted to assess factors affecting nominal coverage of confidence intervals for a new patient's quantitative imaging biomarker measurement and for change in the quantitative imaging biomarker over time. Factors considered include sample size for estimating bias and precision, effect of fixed and non-proportional bias, clustered data, and absence of a reference standard. Results Technical performance studies of a quantitative imaging biomarker should include at least 35 test-retest subjects to estimate precision and 65 cases to estimate bias. Confidence intervals for a new patient's quantitative imaging biomarker measurement constructed under the no-bias assumption provide nominal coverage as long as the fixed bias is <12%. For confidence intervals of the true change over time, linearity must hold and the slope of the regression of the measurements vs. true values should be between 0.95 and 1.05. The regression slope can be assessed adequately as long as fixed multiples of the measurand can be generated. Even small non-proportional bias greatly reduces confidence interval coverage. Multiple lesions in the same subject can be treated as independent when estimating precision. 
Conclusion Technical performance studies of quantitative imaging biomarkers require moderate sample sizes in order to provide robust estimates of bias and precision for constructing confidence intervals for new patients. Assumptions of linearity and non-proportional bias should be assessed thoroughly.
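The effect of ignoring a fixed bias on confidence interval coverage can be checked directly by Monte Carlo. A minimal sketch, much simpler than the study's simulation design, with arbitrary measurand, precision, and bias values:

```python
import numpy as np

rng = np.random.default_rng(0)

def coverage(bias_pct, truth=50.0, cv_pct=10.0, n=20000, z=1.96):
    """Empirical coverage of a 95% CI built under a no-bias assumption
    when the measurement actually carries a fixed percentage bias."""
    sd = truth * cv_pct / 100.0
    y = truth + truth * bias_pct / 100.0 + rng.normal(0.0, sd, n)
    return np.mean((y - z * sd <= truth) & (truth <= y + z * sd))

small_bias = coverage(bias_pct=1)   # close to the nominal 0.95
large_bias = coverage(bias_pct=30)  # coverage collapses
```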

  1. Variation in commercial smoking mixtures containing third-generation synthetic cannabinoids.

    PubMed

    Frinculescu, Anca; Lyall, Catherine L; Ramsey, John; Miserez, Bram

    2017-02-01

    Variation in ingredients (qualitative variation) and in quantity of active compounds (quantitative variation) in herbal smoking mixtures containing synthetic cannabinoids has been shown for older products. This can be dangerous to the user, as accurate and reproducible dosing is impossible. In this study, 69 packages containing third-generation cannabinoids of seven brands on the UK market in 2014 were analyzed both qualitatively and quantitatively for variation. When comparing the labels to actual active ingredients identified in the sample, only one brand was shown to be correctly labelled. The other six brands contained less, more, or ingredients other than those listed on the label. Only two brands were inconsistent, containing different active ingredients in different samples. Quantitative variation was assessed both within one package and between several packages. Within-package variation was within a 10% range for five of the seven brands, but two brands showed larger variation, up to 25% (Relative Standard Deviation). Variation between packages was significantly higher, with variation up to 38% and maximum concentration up to 2.7 times higher than the minimum concentration. Both qualitative and quantitative variation are common in smoking mixtures and endanger the user, as it is impossible to estimate the dose or to know the compound consumed when smoking commercial mixtures. Copyright © 2016 John Wiley & Sons, Ltd. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Adsorption of organic compounds onto activated carbons from recycled vegetables biomass.

    PubMed

    Mameli, Anna; Cincotti, Alberto; Lai, Nicola; Crisafulli, Carmelo; Sciré, Salvatore; Cao, Giacomo

    2004-01-01

The removal of organic species from aqueous solution by activated carbons is investigated. The carbons are prepared from olive husks and almond shells. A wide range of surface area values is obtained by varying the temperature and duration of both the carbonization and activation steps. The adsorption isotherms of phenol, catechol, and 2,6-dichlorophenol on the prepared activated carbons are obtained at 25 degrees C. The corresponding behavior is quantitatively correlated using classical isotherm models, whose parameters are estimated by fitting the equilibrium data. A two-component isotherm (phenol/2,6-dichlorophenol) is determined in order to test activated carbon behavior during competitive adsorption.
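A classical-isotherm fit of the kind described can be sketched as follows; the equilibrium data are synthetic, and the Langmuir form stands in for whichever classical isotherm model the authors actually used:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, k):
    """Classical Langmuir isotherm: q = q_max * k * c / (1 + k * c)."""
    return q_max * k * c / (1.0 + k * c)

# Synthetic equilibrium data (c in mg/L, q in mg/g) with small noise.
c_eq = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
q_eq = langmuir(c_eq, 120.0, 0.05) + np.array([1.0, -0.8, 0.5, -0.5, 0.3])

# Estimate the isotherm parameters by fitting the equilibrium data.
(q_max_fit, k_fit), _ = curve_fit(langmuir, c_eq, q_eq, p0=[100.0, 0.1])
```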

  3. Functional stability of cerebral circulatory system

    NASA Technical Reports Server (NTRS)

    Moskalenko, Y. Y.

    1980-01-01

The functional stability of the cerebral circulation system appears to rest both on active mechanisms and on mechanisms stemming from specific features of the biophysical structure of the system under study. The latter offers some relevant criteria for its quantitative estimation. The data obtained suggest that an essential part of the mechanism for active responses of cerebral vessels, which maintains the functional stability of this portion of the vascular system, consists of a neurogenic component involving central nervous structures localized, for instance, in the medulla oblongata.

  4. Behavior, sensitivity, and power of activation likelihood estimation characterized by massive empirical simulation.

    PubMed

    Eickhoff, Simon B; Nichols, Thomas E; Laird, Angela R; Hoffstaedter, Felix; Amunts, Katrin; Fox, Peter T; Bzdok, Danilo; Eickhoff, Claudia R

    2016-08-15

Given the increasing number of neuroimaging publications, the automated knowledge extraction on brain-behavior associations by quantitative meta-analyses has become a highly important and rapidly growing field of research. Among several methods to perform coordinate-based neuroimaging meta-analyses, Activation Likelihood Estimation (ALE) has been widely adopted. In this paper, we addressed two pressing questions related to ALE meta-analysis: i) Which thresholding method is most appropriate to perform statistical inference? ii) Which sample size, i.e., number of experiments, is needed to perform robust meta-analyses? We provided quantitative answers to these questions by simulating more than 120,000 meta-analysis datasets using empirical parameters (i.e., number of subjects, number of reported foci, distribution of activation foci) derived from the BrainMap database. This allowed us to characterize the behavior of ALE analyses, to derive first power estimates for neuroimaging meta-analyses, and thus to formulate recommendations for future ALE studies. We could show as a first consequence that cluster-level family-wise error (FWE) correction represents the most appropriate method for statistical inference, while voxel-level FWE correction is valid but more conservative. In contrast, uncorrected inference and false-discovery rate correction should be avoided. As a second consequence, researchers should aim to include at least 20 experiments in an ALE meta-analysis to achieve sufficient power for moderate effects. We would like to note, though, that these calculations and recommendations are specific to ALE and may not be extrapolated to other approaches for (neuroimaging) meta-analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
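At its core, ALE combines per-experiment modeled-activation (MA) maps by a voxel-wise union of probabilities. A toy example over five voxels, with invented MA values:

```python
import numpy as np

# Toy modeled-activation maps: one row per experiment, one column per
# voxel; each value is the probability that the experiment's focus
# lies at that voxel after Gaussian smoothing.
ma = np.array([
    [0.0, 0.2, 0.6, 0.2, 0.0],
    [0.1, 0.3, 0.4, 0.2, 0.0],
    [0.0, 0.1, 0.5, 0.3, 0.1],
])

# ALE score: voxel-wise union (noisy-OR) of probabilities across experiments.
ale = 1.0 - np.prod(1.0 - ma, axis=0)
```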

  6. Reactivity of propene, n-butene, and isobutene in the hydrogen transfer steps of n-hexane cracking over zeolites of different structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lukyanov, D.B.

The reaction of n-hexane cracking over HZSM-5, HY zeolite, and mordenite (HM) was studied in accordance with the procedure of the β-test recently proposed for quantitative characterization of zeolite hydrogen transfer activity. It is shown that this procedure allows one to obtain quantitative data on propene, n-butene, and isobutene reactivities in the hydrogen transfer steps of the reaction. The results demonstrate that in the absence of steric constraints (large-pore HY and HM zeolites) isobutene is approximately 5 times more reactive in hydrogen transfer than n-butene. The latter, in turn, is about 1.3 times more reactive than propene. With medium-pore HZSM-5, steric inhibition of the hydrogen transfer between n-hexane and isobutene is observed. This results in a sharp decrease in the isobutene reactivity: over HZSM-5 zeolites isobutene is only 1.2 times more reactive in hydrogen transfer than n-butene. On the basis of these data it is concluded that the β-test measures the "real" hydrogen transfer activity of zeolites, i.e., the activity that summarizes the effects of the acidic and structural properties of zeolites. An attempt is made to estimate the "ideal" zeolite hydrogen transfer activity, i.e., the activity determined by the zeolite acidic properties only. The estimates obtained show that this activity is approximately 1.8 and 1.6 times higher for HM zeolite in comparison with HZSM-5 and HY zeolites, respectively. 16 refs., 4 figs., 2 tabs.

  7. Motion compensation using origin ensembles in awake small animal positron emission tomography

    NASA Astrophysics Data System (ADS)

    Gillam, John E.; Angelis, Georgios I.; Kyme, Andre Z.; Meikle, Steven R.

    2017-02-01

In emission tomographic imaging, the stochastic origin ensembles algorithm provides unique information regarding the detected counts given the measured data. Precision in both voxel and region-wise parameters may be determined for a single data set based on the posterior distribution of the count density, allowing uncertainty estimates to be allocated to quantitative measures. Uncertainty estimates are of particular importance in awake animal neurological and behavioral studies, for which head motion, unique to each acquired data set, perturbs the measured data. Motion compensation can be conducted when rigid head pose is measured during the scan. However, errors in the pose measurements used for compensation can degrade the data and hence quantitative outcomes. In this investigation, motion compensation and detector resolution models were incorporated into the basic origin ensembles algorithm and an efficient approach to computation was developed. The approach was validated against maximum-likelihood expectation maximization and tested using simulated data. The resultant algorithm was then used to analyse quantitative uncertainty in regional activity estimates arising from changes in pose measurement precision. Finally, the posterior covariance acquired from a single data set was used to describe correlations between regions of interest, providing information about pose measurement precision that may be useful in system analysis and design. The investigation demonstrates the use of origin ensembles as a powerful framework for evaluating statistical uncertainty of voxel and regional estimates. While in this investigation rigid motion was considered in the context of awake animal PET, the extension to arbitrary motion may provide clinical utility where respiratory or cardiac motion perturb the measured data.

  8. The effects of physical activity on impulsive choice: Influence of sensitivity to reinforcement amount and delay.

    PubMed

    Strickland, Justin C; Feinstein, Max A; Lacy, Ryan T; Smith, Mark A

    2016-05-01

    Impulsive choice is a diagnostic feature and/or complicating factor for several psychological disorders and may be examined in the laboratory using delay-discounting procedures. Recent investigators have proposed using quantitative measures of analysis to examine the behavioral processes contributing to impulsive choice. The purpose of this study was to examine the effects of physical activity (i.e., wheel running) on impulsive choice in a single-response, discrete-trial procedure using two quantitative methods of analysis. To this end, rats were assigned to physical activity or sedentary groups and trained to respond in a delay-discounting procedure. In this procedure, one lever always produced one food pellet immediately, whereas a second lever produced three food pellets after a 0, 10, 20, 40, or 80-s delay. Estimates of sensitivity to reinforcement amount and sensitivity to reinforcement delay were determined using (1) a simple linear analysis and (2) an analysis of logarithmically transformed response ratios. Both analyses revealed that physical activity decreased sensitivity to reinforcement amount and sensitivity to reinforcement delay. These findings indicate that (1) physical activity has significant but functionally opposing effects on the behavioral processes that contribute to impulsive choice and (2) both quantitative methods of analysis are appropriate for use in single-response, discrete-trial procedures. Copyright © 2016 Elsevier B.V. All rights reserved.
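A minimal version of the log-response-ratio analysis might look like the following; the choice proportions are invented, and the simple linear fit is only a stand-in for the authors' full quantitative models:

```python
import numpy as np

delays = np.array([0.0, 10.0, 20.0, 40.0, 80.0])  # delay to large reinforcer (s)
# Hypothetical proportion of large-reinforcer choices at each delay.
p_large = np.array([0.95, 0.80, 0.60, 0.35, 0.15])

# Log response ratios: log10 of large- vs. small-reinforcer choices.
log_ratio = np.log10(p_large / (1.0 - p_large))

# Linear fit: the intercept reflects sensitivity to reinforcement amount,
# the (negative) slope reflects sensitivity to reinforcement delay.
slope, intercept = np.polyfit(delays, log_ratio, 1)
```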

  9. A quantitative link between face discrimination deficits and neuronal selectivity for faces in autism☆

    PubMed Central

    Jiang, Xiong; Bollich, Angela; Cox, Patrick; Hyder, Eric; James, Joette; Gowani, Saqib Ali; Hadjikhani, Nouchine; Blanz, Volker; Manoach, Dara S.; Barton, Jason J.S.; Gaillard, William D.; Riesenhuber, Maximilian

    2013-01-01

    Individuals with Autism Spectrum Disorder (ASD) appear to show a general face discrimination deficit across a range of tasks including social–emotional judgments as well as identification and discrimination. However, functional magnetic resonance imaging (fMRI) studies probing the neural bases of these behavioral differences have produced conflicting results: while some studies have reported reduced or no activity to faces in ASD in the Fusiform Face Area (FFA), a key region in human face processing, others have suggested more typical activation levels, possibly reflecting limitations of conventional fMRI techniques to characterize neuron-level processing. Here, we test the hypotheses that face discrimination abilities are highly heterogeneous in ASD and are mediated by FFA neurons, with differences in face discrimination abilities being quantitatively linked to variations in the estimated selectivity of face neurons in the FFA. Behavioral results revealed a wide distribution of face discrimination performance in ASD, ranging from typical performance to chance level performance. Despite this heterogeneity in perceptual abilities, individual face discrimination performance was well predicted by neural selectivity to faces in the FFA, estimated via both a novel analysis of local voxel-wise correlations, and the more commonly used fMRI rapid adaptation technique. Thus, face processing in ASD appears to rely on the FFA as in typical individuals, differing quantitatively but not qualitatively. These results for the first time mechanistically link variations in the ASD phenotype to specific differences in the typical face processing circuit, identifying promising targets for interventions. PMID:24179786

  10. TOXNET: Toxicology Data Network

    MedlinePlus

    ... 4. Supporting Data for Carcinogenicity Expand II.B. Quantitative Estimate of Carcinogenic Risk from Oral Exposure II. ... of Confidence (Carcinogenicity, Oral Exposure) Expand II.C. Quantitative Estimate of Carcinogenic Risk from Inhalation Exposure II. ...

  11. Does Community Resource Fit Matter to Fathers? A Study of Employed Fathers, School and School Activity Schedules, and Well-Being

    ERIC Educational Resources Information Center

    Barnett, Rosalind Chait; Gareis, Karen C.

    2009-01-01

    Several scholars have noted that community resources might facilitate or hinder employees' ability to meet their many work and family demands, thereby affecting their psychological well-being. However, this is the first study to estimate these relationships using a newly developed quantitative measure of community resource fit that assesses the…

  12. Standardisation of Gymnema sylvestre R.Br. by high-performance thin-layer chromatography: an improved method.

    PubMed

    Raju, Valivarthi S R; Kannababu, S; Subbaraju, Gottumukkala V

    2006-01-01

    An improved high-performance thin-layer chromatographic (HPTLC) method for the standardisation of Gymnema sylvestre is reported. The method involves the initial hydrolysis of gymnemic acids, the active ingredients, to a common aglycone followed by the quantitative estimation of gymnemagenin. The present method rectifies an error found in an HPTLC method reported recently.

  13. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    PubMed

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques are compared: a traditional method and a Bayesian mixed model approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replicate QMM assays, improving reliability and inter-assay homogeneity and providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
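
    The traditional calibration the abstract criticizes can be made concrete with a minimal sketch: fit a standard curve on samples of known density, then invert it for an unknown sample. All standards and signal values below are hypothetical, not taken from the study.

```python
# Minimal sketch of "traditional" statistical calibration for a QMM assay:
# fit a straight line to standards of known density, then invert it to map
# an observed assay signal to an estimated density. The numbers are invented
# for illustration.

def fit_line(x, y):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b

# Hypothetical standards: log10(gametocytes/mL) vs. an assay signal such as
# QT-NASBA time-to-positivity (signal falls as density rises).
log_density = [1.0, 2.0, 3.0, 4.0, 5.0]
signal = [9.1, 7.9, 7.1, 6.0, 5.1]
a, b = fit_line(log_density, signal)

def estimate_log_density(s):
    """Invert the calibration line for an unknown sample's signal."""
    return (s - a) / b

est = estimate_log_density(7.0)  # about 3.0 on this toy curve
```

    The Bayesian mixed-model alternative described in the abstract additionally pools replicate assays and lets measurement precision vary with density, which a single inverted curve like this cannot do.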

  14. A quantitative model for transforming reflectance spectra into the Munsell color space using cone sensitivity functions and opponent process weights.

    PubMed

    D'Andrade, Roy G; Romney, A Kimball

    2003-05-13

    This article presents a computational model of the process through which the human visual system transforms reflectance spectra into perceptions of color. Using physical reflectance spectra and standard human cone sensitivity functions, we describe the transformations necessary for predicting the location of colors in the Munsell color space. These transformations include quantitative estimates of the opponent process weights needed to transform cone activations into Munsell color space coordinates. Using these opponent process weights, the Munsell position of specific colors can be predicted from their physical spectra with a mean correlation of 0.989.
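
    The kind of linear opponent-process transform the abstract describes can be sketched as a small matrix-vector product mapping cone activations to opponent channels. The weight values below are invented for illustration and are not the article's fitted weights.

```python
# Hypothetical opponent-process transform: cone activations (L, M, S) are
# mapped to one achromatic and two chromatic channels by a weight matrix.

weights = [
    [0.6,  0.4,  0.0],   # achromatic channel (lightness / Munsell value)
    [1.0, -1.0,  0.0],   # red-green opponent channel
    [0.5,  0.5, -1.0],   # yellow-blue opponent channel
]

def opponent_channels(cones):
    """Apply the opponent weight matrix to a cone-activation triplet."""
    return [sum(w * c for w, c in zip(row, cones)) for row in weights]

channels = opponent_channels([0.8, 0.7, 0.3])
```

    Fitting such weights so that the channel outputs land on measured Munsell coordinates is the estimation step the article performs.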

  15. Impact of natural gas extraction on PAH levels in ambient air.

    PubMed

    Paulik, L Blair; Donald, Carey E; Smith, Brian W; Tidwell, Lane G; Hobbie, Kevin A; Kincl, Laurel; Haynes, Erin N; Anderson, Kim A

    2015-04-21

    Natural gas extraction, often referred to as "fracking," has increased rapidly in the U.S. in recent years. To address potential health impacts, passive air samplers were deployed in a rural community heavily affected by the natural gas boom. Samplers were analyzed for 62 polycyclic aromatic hydrocarbons (PAHs). Results were grouped based on distance from each sampler to the nearest active well. PAH levels were highest when samplers were closest to active wells. Additionally, PAH levels closest to natural gas activity were an order of magnitude higher than levels previously reported in rural areas. Sourcing ratios indicate that PAHs were predominantly petrogenic, suggesting that elevated PAH levels were influenced by direct releases from the earth. Quantitative human health risk assessment estimated the excess lifetime cancer risks associated with exposure to the measured PAHs. Closest to active wells, the risk estimated for maximum residential exposure was 2.9 in 10 000, which is above the U.S. EPA's acceptable risk level. Overall, risk estimates decreased 30% when comparing results from samplers closest to active wells to those farthest. This work suggests that natural gas extraction may be contributing significantly to PAHs in air, at levels that are relevant to human health.
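
    The excess-lifetime-cancer-risk arithmetic behind estimates like "2.9 in 10,000" can be sketched in EPA style: scale the measured concentration to a lifetime average, then multiply by an inhalation unit risk. Every number below is made up for illustration; none are the study's values.

```python
# Illustrative EPA-style excess lifetime cancer risk from inhaled PAHs:
# risk = EC * IUR, where EC is the lifetime-average benzo[a]pyrene-equivalent
# concentration and IUR is an inhalation unit risk factor.

def lifetime_average_concentration(c_air, ef_days=350, ed_years=30, at_years=70):
    """Scale a measured air concentration by exposure frequency (days/yr)
    and exposure duration (yr) over a lifetime averaging time (yr)."""
    return c_air * (ef_days / 365.0) * (ed_years / at_years)

c_air = 5e-6      # hypothetical BaP-equivalent concentration, mg/m^3
iur = 6e-1        # hypothetical inhalation unit risk, per mg/m^3

ec = lifetime_average_concentration(c_air)
risk = ec * iur   # excess lifetime cancer risk (dimensionless probability)
```

    A computed risk is then compared against a benchmark such as an agency's acceptable risk level, as the abstract does.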

  16. Impact of natural gas extraction on PAH levels in ambient air

    PubMed Central

    Paulik, L. Blair; Donald, Carey E.; Smith, Brian W.; Tidwell, Lane G.; Hobbie, Kevin A.; Kincl, Laurel; Haynes, Erin N.; Anderson, Kim A.

    2015-01-01

    Natural gas extraction, often referred to as “fracking,” has increased rapidly in the U.S. in recent years. To address potential health impacts, passive air samplers were deployed in a rural community heavily affected by the natural gas boom. Samplers were analyzed for 62 polycyclic aromatic hydrocarbons (PAHs). Results were grouped based on distance from each sampler to the nearest active well. PAH levels were highest when samplers were closest to active wells. Additionally, PAH levels closest to natural gas activity were an order of magnitude higher than levels previously reported in rural areas. Sourcing ratios indicate that PAHs were predominantly petrogenic, suggesting that elevated PAH levels were influenced by direct releases from the earth. Quantitative human health risk assessment estimated the excess lifetime cancer risks associated with exposure to the measured PAHs. Closest to active wells, the risk estimated for maximum residential exposure was 2.9 in 10,000, which is above the U.S. EPA's acceptable risk level. Overall, risk estimates decreased 30% when comparing results from samplers closest to active wells to those farthest. This work suggests that natural gas extraction may be contributing significantly to PAHs in air, at levels that are relevant to human health. PMID:25810398

  17. Linear solvation energy relationships: "rule of thumb" for estimation of variable values

    USGS Publications Warehouse

    Hickey, James P.; Passino-Reader, Dora R.

    1991-01-01

    For the linear solvation energy relationship (LSER), values are listed for each of the variables (Vi/100, π*, βm, αm) for fundamental organic structures and functional groups. We give guidelines to estimate LSER variable values quickly for a vast array of possible organic compounds such as those found in the environment. The difficulty in generating these variables has greatly discouraged the application of this quantitative structure-activity relationship (QSAR) method. This paper presents the first compilation of molecular functional group values together with a utilitarian set of LSER variable estimation rules. The availability of these variable values and rules should facilitate widespread application of LSER for hazard evaluation of environmental contaminants.

  18. On sweat analysis for quantitative estimation of dehydration during physical exercise.

    PubMed

    Ring, Matthias; Lohmueller, Clemens; Rauh, Manfred; Eskofier, Bjoern M

    2015-08-01

    Quantitative estimation of water loss during physical exercise is of importance because dehydration can impair both muscular strength and aerobic endurance. A physiological indicator for deficit of total body water (TBW) might be the concentration of electrolytes in sweat. It has been shown that concentrations differ after physical exercise depending on whether water loss was replaced by fluid intake or not. However, to the best of our knowledge, this fact has not been examined for its potential to quantitatively estimate TBW loss. Therefore, we conducted a study in which sweat samples were collected continuously during two hours of physical exercise without fluid intake. A statistical analysis of these sweat samples revealed significant correlations between chloride concentration in sweat and TBW loss (r = 0.41, p < 0.01), and between sweat osmolality and TBW loss (r = 0.43, p < 0.01). A quantitative estimation of TBW loss resulted in a mean absolute error of 0.49 L per estimation. Although the precision has to be improved for practical applications, the present results suggest that TBW loss estimation could be realizable using sweat samples.
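
    The two statistics the abstract reports, a correlation between a sweat marker and TBW loss and the mean absolute error of a predictor, can be sketched as follows. The paired observations are hypothetical, invented only to show the computation.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical paired observations: sweat chloride (mmol/L) and measured
# total-body-water loss (L); illustrative only.
chloride = [28, 35, 31, 44, 39, 50, 47, 55]
tbw_loss = [0.8, 1.1, 1.0, 1.5, 1.2, 1.9, 1.6, 2.0]
r = pearson_r(chloride, tbw_loss)

# A least-squares line turns the marker into a predictor whose quality is
# summarized by mean absolute error, as in the abstract.
n = len(chloride)
mx, my = sum(chloride) / n, sum(tbw_loss) / n
b = sum((a - mx) * (c - my) for a, c in zip(chloride, tbw_loss)) \
    / sum((a - mx) ** 2 for a in chloride)
a0 = my - b * mx
mae = sum(abs(c - (a0 + b * x)) for x, c in zip(chloride, tbw_loss)) / n
```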

  19. The physics of functional magnetic resonance imaging (fMRI)

    NASA Astrophysics Data System (ADS)

    Buxton, Richard B.

    2013-09-01

    Functional magnetic resonance imaging (fMRI) is a methodology for detecting dynamic patterns of activity in the working human brain. Although the initial discoveries that led to fMRI are only about 20 years old, this new field has revolutionized the study of brain function. The ability to detect changes in brain activity has a biophysical basis in the magnetic properties of deoxyhemoglobin, and a physiological basis in the way blood flow increases more than oxygen metabolism when local neural activity increases. These effects translate to a subtle increase in the local magnetic resonance signal, the blood oxygenation level dependent (BOLD) effect, when neural activity increases. With current techniques, this pattern of activation can be measured with resolution approaching 1 mm3 spatially and 1 s temporally. This review focuses on the physical basis of the BOLD effect, the imaging methods used to measure it, the possible origins of the physiological effects that produce a mismatch of blood flow and oxygen metabolism during neural activation, and the mathematical models that have been developed to understand the measured signals. An overarching theme is the growing field of quantitative fMRI, in which other MRI methods are combined with BOLD methods and analyzed within a theoretical modeling framework to derive quantitative estimates of oxygen metabolism and other physiological variables. That goal is the current challenge for fMRI: to move fMRI from a mapping tool to a quantitative probe of brain physiology.

  20. The physics of functional magnetic resonance imaging (fMRI)

    PubMed Central

    Buxton, Richard B

    2015-01-01

    Functional magnetic resonance imaging (fMRI) is a methodology for detecting dynamic patterns of activity in the working human brain. Although the initial discoveries that led to fMRI are only about 20 years old, this new field has revolutionized the study of brain function. The ability to detect changes in brain activity has a biophysical basis in the magnetic properties of deoxyhemoglobin, and a physiological basis in the way blood flow increases more than oxygen metabolism when local neural activity increases. These effects translate to a subtle increase in the local magnetic resonance signal, the blood oxygenation level dependent (BOLD) effect, when neural activity increases. With current techniques, this pattern of activation can be measured with resolution approaching 1 mm3 spatially and 1 s temporally. This review focuses on the physical basis of the BOLD effect, the imaging methods used to measure it, the possible origins of the physiological effects that produce a mismatch of blood flow and oxygen metabolism during neural activation, and the mathematical models that have been developed to understand the measured signals. An overarching theme is the growing field of quantitative fMRI, in which other MRI methods are combined with BOLD methods and analyzed within a theoretical modeling framework to derive quantitative estimates of oxygen metabolism and other physiological variables. That goal is the current challenge for fMRI: to move fMRI from a mapping tool to a quantitative probe of brain physiology. PMID:24006360

  1. The physics of functional magnetic resonance imaging (fMRI).

    PubMed

    Buxton, Richard B

    2013-09-01

    Functional magnetic resonance imaging (fMRI) is a methodology for detecting dynamic patterns of activity in the working human brain. Although the initial discoveries that led to fMRI are only about 20 years old, this new field has revolutionized the study of brain function. The ability to detect changes in brain activity has a biophysical basis in the magnetic properties of deoxyhemoglobin, and a physiological basis in the way blood flow increases more than oxygen metabolism when local neural activity increases. These effects translate to a subtle increase in the local magnetic resonance signal, the blood oxygenation level dependent (BOLD) effect, when neural activity increases. With current techniques, this pattern of activation can be measured with resolution approaching 1 mm³ spatially and 1 s temporally. This review focuses on the physical basis of the BOLD effect, the imaging methods used to measure it, the possible origins of the physiological effects that produce a mismatch of blood flow and oxygen metabolism during neural activation, and the mathematical models that have been developed to understand the measured signals. An overarching theme is the growing field of quantitative fMRI, in which other MRI methods are combined with BOLD methods and analyzed within a theoretical modeling framework to derive quantitative estimates of oxygen metabolism and other physiological variables. That goal is the current challenge for fMRI: to move fMRI from a mapping tool to a quantitative probe of brain physiology.

  2. Quantitative crystalline silica exposure assessment for a historical cohort epidemiologic study in the German porcelain industry.

    PubMed

    Birk, Thomas; Guldner, Karlheinz; Mundt, Kenneth A; Dahmann, Dirk; Adams, Robert C; Parsons, William

    2010-09-01

    A time-dependent quantitative assessment of crystalline silica exposure among nearly 18,000 German porcelain workers was conducted. Results will be used to evaluate exposure-response disease risks. Over 8000 historical industrial hygiene (IH) measurements with original sampling and analysis protocols from 1954-2006 were obtained from the German Berufsgenossenschaft der keramischen- und Glas-Industrie (BGGK) and used to construct a job exposure matrix (JEM). Early measurements from different devices were converted to modern gravimetric equivalent values. Conversion factors were derived from parallel historical measurements and new side-by-side measurements using historical and modern devices in laboratory dust tunnels and active workplace locations. Exposure values were summarized and smoothed using LOESS regression; estimates for early years were derived using backward extrapolation techniques. Employee work histories were merged with JEM values to determine cumulative crystalline silica exposures for cohort members. Average silica concentrations were derived for six primary similar exposure groups (SEGs) for 1938-2006. Over 40% of the cohort accumulated <0.5 mg/m³-years; just over one-third accumulated >1 mg/m³-years. Nearly 5000 workers had cumulative crystalline silica estimates >1.5 mg/m³-years. Similar numbers of men and women fell into each cumulative exposure category, except for 1113 women and 1567 men in the highest category. Over half of those hired before 1960 accumulated >3 mg/m³-years crystalline silica compared with 4.9% of those hired after 1960. Among those ever working in the materials preparation area, half accumulated >3 mg/m³-years compared with 12% of those never working in this area. Quantitative respirable silica exposures were estimated for each member of this cohort, including employment periods for which sampling used now-obsolete technologies. Although individual cumulative exposure estimates ranged from background to about 40 mg/m³-years, many of these estimates reflect long-term exposures near modern exposure limit values, allowing direct evaluation of lung cancer and silicosis risks near these limits without extrapolation. This quantitative exposure assessment is the largest to date in the porcelain industry.
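
    The core step of merging work histories with a job exposure matrix (JEM) to obtain cumulative exposure in mg/m³-years can be sketched briefly. The exposure groups, years, and concentrations below are hypothetical, not the study's values.

```python
# Minimal sketch of a job-exposure-matrix merge: cumulative exposure is the
# sum over employment periods of (JEM concentration) x (years worked).

jem = {
    # (similar-exposure group, calendar year) -> mean concentration, mg/m^3
    ("materials_preparation", 1955): 0.40,
    ("materials_preparation", 1965): 0.15,
    ("glazing", 1955): 0.10,
    ("glazing", 1965): 0.05,
}

work_history = [
    # (similar-exposure group, calendar year, years worked in that period)
    ("materials_preparation", 1955, 1.0),
    ("materials_preparation", 1965, 0.5),
    ("glazing", 1965, 1.0),
]

cumulative = sum(jem[(seg, year)] * years
                 for seg, year, years in work_history)
# 0.40*1.0 + 0.15*0.5 + 0.05*1.0 = 0.525 mg/m^3-years for this toy history
```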

  3. Computational design of a Diels-Alderase from a thermophilic esterase: the importance of dynamics

    NASA Astrophysics Data System (ADS)

    Linder, Mats; Johansson, Adam Johannes; Olsson, Tjelvar S. G.; Liebeschuetz, John; Brinck, Tore

    2012-09-01

    A novel computational Diels-Alderase design, based on a relatively rare form of carboxylesterase from Geobacillus stearothermophilus, is presented and theoretically evaluated. The structure was found by mining the PDB for a suitable oxyanion hole-containing structure, followed by a combinatorial approach to find suitable substrates and rational mutations. Four lead designs were selected and thoroughly modeled to obtain realistic estimates of substrate binding and prearrangement. Molecular dynamics simulations and DFT calculations were used to optimize and estimate binding affinity and activation energies. A large quantum chemical model was used to capture the salient interactions in the crucial transition state (TS). Our quantitative estimation of kinetic parameters was validated against four experimentally characterized Diels-Alderases with good results. The final designs in this work are predicted to have rate enhancements of ≈103-106 and high predicted proficiencies. This work emphasizes the importance of considering protein dynamics in the design approach, and provides a quantitative estimate of the how the TS stabilization observed in most de novo and redesigned enzymes is decreased compared to a minimal, `ideal' model. The presented design is highly interesting for further optimization and applications since it is based on a thermophilic enzyme ( T opt = 70 °C).

  4. Improved quantitative analysis of spectra using a new method of obtaining derivative spectra based on a singular perturbation technique.

    PubMed

    Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan

    2015-06-01

    Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. To improve quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data must therefore be estimated with high accuracy. A new spectral estimator based on a singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and a stability analysis of the estimator is given. Theoretical analysis and simulation experiments confirm that derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated using beer and marzipan spectra, whose derivative spectra are used to build calibration models with partial least squares (PLS) modeling. The results show that PLS based on the new estimator achieves better performance than the Savitzky-Golay algorithm and can serve as an alternative for quantitative analytical applications.
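
    The Savitzky-Golay baseline that the SPSE is compared against can be sketched directly (the SPSE itself is not reproduced here). For a 5-point window and a quadratic local fit, the classic convolution weights for the first derivative are (-2, -1, 0, 1, 2) / (10h), where h is the sample spacing; the test spectrum below is synthetic.

```python
# Savitzky-Golay first-derivative filter (5-point window, quadratic fit),
# used as a baseline for derivative-spectrum estimation.

def savgol_first_derivative(y, h=1.0):
    w = [-2, -1, 0, 1, 2]
    out = []
    for i in range(2, len(y) - 2):
        out.append(sum(wi * y[i + k - 2] for k, wi in enumerate(w)) / (10.0 * h))
    return out  # derivative estimates for the interior points only

# Sanity check on a noiseless line y = 3x: the derivative is 3 everywhere.
y = [3.0 * x for x in range(10)]
dy = savgol_first_derivative(y)
```

    On noisy spectra the same convolution smooths while differentiating, which is why it is the standard preprocessing reference.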

  5. Mathematical modeling of tetrahydroimidazole benzodiazepine-1-one derivatives as an anti HIV agent

    NASA Astrophysics Data System (ADS)

    Ojha, Lokendra Kumar

    2017-07-01

    The goal of the present work is the study of drug-receptor interaction via QSAR (Quantitative Structure-Activity Relationship) analysis for a set of 89 TIBO (Tetrahydroimidazole Benzodiazepine-1-one) derivatives. MLR (Multiple Linear Regression) is used to generate predictive models of quantitative structure-activity relationships between a set of molecular descriptors and biological activity (IC50). The best QSAR model had a correlation coefficient (r) of 0.9299, a Standard Error of Estimation (SEE) of 0.5022, a Fisher ratio (F) of 159.822 and a quality factor (Q) of 1.852. This model is statistically significant and strongly favours substitution of a sulphur atom, captured by IS, an indicator parameter for the -Z position of the TIBO derivatives. Two other parameters, logP (octanol-water partition coefficient) and SAG (Surface Area Grid), also played a vital role in the generation of the best QSAR model. All three descriptors show very good stability towards data variation in leave-one-out (LOO) cross-validation.
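
    The fit statistics quoted in the abstract (r, SEE, F, Q) can be computed from any least-squares model; note that Q = r/SEE is consistent with the quoted figures (0.9299/0.5022 ≈ 1.852). The sketch below uses a single hypothetical descriptor for brevity, whereas the paper's model uses three; all data values are invented.

```python
import math

def regression_stats(x, y):
    """r, SEE, F, and quality factor Q = r/SEE for a one-descriptor
    least-squares model y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    sst = sum((yi - my) ** 2 for yi in y)
    r = math.sqrt(1.0 - sse / sst)          # correlation (magnitude)
    see = math.sqrt(sse / (n - 2))          # standard error of estimation
    f = (sst - sse) / (sse / (n - 2))       # Fisher ratio, 1 and n-2 d.f.
    return r, see, f, r / see               # Q = r / SEE

# Hypothetical logP vs. pIC50 values, for illustration only.
logp = [1.2, 1.8, 2.3, 2.9, 3.4, 4.0, 4.6]
pic50 = [5.1, 5.6, 5.9, 6.5, 6.8, 7.5, 7.9]
r, see, f, q = regression_stats(logp, pic50)
```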

  6. Augmented multivariate image analysis applied to quantitative structure-activity relationship modeling of the phytotoxicities of benzoxazinone herbicides and related compounds on problematic weeds.

    PubMed

    Freitas, Mirlaine R; Matias, Stella V B G; Macedo, Renato L G; Freitas, Matheus P; Venturin, Nelson

    2013-09-11

    Two of the major weeds affecting cereal crops worldwide are Avena fatua L. (wild oat) and Lolium rigidum Gaud. (rigid ryegrass). Thus, development of new herbicides against these weeds is required; in line with this, benzoxazinones, their degradation products, and analogues have been shown to be important allelochemicals and natural herbicides. Despite earlier structure-activity studies demonstrating that hydrophobicity (log P) of aminophenoxazines correlates to phytotoxicity, our findings for a series of benzoxazinone derivatives do not show any relationship between phytotoxicity and log P, nor with two other common molecular descriptors. On the other hand, a quantitative structure-activity relationship (QSAR) analysis based on molecular graphs representing structural shape, atomic sizes, and colors to encode other atomic properties performed very accurately for the prediction of phytotoxicities of these compounds against wild oat and rigid ryegrass. Therefore, these QSAR models can be used to estimate the phytotoxicity of new congeners of benzoxazinone herbicides toward A. fatua L. and L. rigidum Gaud.

  7. Assessing covariate balance when using the generalized propensity score with quantitative or continuous exposures.

    PubMed

    Austin, Peter C

    2018-01-01

    Propensity score methods are increasingly being used to estimate the effects of treatments and exposures when using observational data. The propensity score was initially developed for use with binary exposures (e.g., active treatment vs. control). The generalized propensity score is an extension of the propensity score for use with quantitative exposures (e.g., dose or quantity of medication, income, years of education). A crucial component of any propensity score analysis is balance assessment: evaluating the degree to which conditioning on the propensity score (via matching, weighting, or stratification) has balanced measured baseline covariates between exposure groups. Methods for balance assessment have been well described and are frequently implemented when using the propensity score with binary exposures. However, there is a paucity of information on how to assess baseline covariate balance when using the generalized propensity score. We describe how methods based on the standardized difference can be adapted for use with quantitative exposures when using the generalized propensity score. We also describe a method based on assessing the correlation between the quantitative exposure and each covariate in the sample when weighted using generalized propensity score-based weights. We conducted a series of Monte Carlo simulations to evaluate the performance of these methods. We also compared two different methods of estimating the generalized propensity score: ordinary least squares regression and the covariate balancing propensity score method. We illustrate the application of these methods using data on patients hospitalized with a heart attack, with the quantitative exposure being creatinine level.
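
    The correlation-based balance diagnostic the abstract describes reduces to a weighted Pearson correlation between the quantitative exposure and each covariate. A minimal sketch, with toy data and unit weights standing in for generalized-propensity-score weights:

```python
import math

def weighted_corr(x, y, w):
    """Weighted Pearson correlation between a quantitative exposure x and a
    covariate y under weights w (e.g. GPS-based weights); values near zero
    after weighting indicate balance."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    cov = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y)) / sw
    vx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x)) / sw
    vy = sum(wi * (yi - my) ** 2 for wi, yi in zip(w, y)) / sw
    return cov / math.sqrt(vx * vy)

# Toy data: the exposure correlates strongly with the covariate, so with
# unit weights the diagnostic flags imbalance (values illustrative).
exposure = [1.0, 2.0, 3.0, 4.0, 5.0]
covariate = [0.9, 2.2, 2.8, 4.1, 5.0]
unweighted = weighted_corr(exposure, covariate, [1.0] * 5)
```

    In practice the same function would be re-evaluated with GPS-based weights, and a successful weighting should pull the correlation toward zero.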

  8. Quantitative Doppler Analysis Using Conventional Color Flow Imaging Acquisitions.

    PubMed

    Karabiyik, Yucel; Ekroll, Ingvild Kinn; Eik-Nes, Sturla H; Lovstakken, Lasse

    2018-05-01

    Interleaved acquisitions used in conventional triplex mode result in a tradeoff between the frame rate and the quality of velocity estimates. On the other hand, workflow becomes inefficient when the user has to switch between different modes, and measurement variability is increased. This paper investigates the use of the power spectral Capon estimator in quantitative Doppler analysis using data acquired with conventional color flow imaging (CFI) schemes. To preserve the number of samples used for velocity estimation, only spatial averaging was utilized, and clutter rejection was performed after spectral estimation. The resulting velocity spectra were evaluated in terms of spectral width using a recently proposed spectral envelope estimator. The spectral envelopes were also used for Doppler index calculations using in vivo and string phantom acquisitions. In vivo results demonstrated that the Capon estimator can provide spectral estimates with sufficient quality for quantitative analysis using packet-based CFI acquisitions. The calculated Doppler indices were similar to the values calculated using spectrograms estimated on a commercial ultrasound scanner.

  9. 50 CFR Table 2b to Part 660... - 2010, and Beyond, Harvest Guidelines for Minor Rockfish by Depth Sub-groups (weights in metric tons)

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... amount estimated to be taken as research catch and in non-groundfish fisheries is 3,000 mt. The... anticipated to be taken during research activity and 0.14 mt for the amount expected to be taken during EFP... unexploited rockfish population in the California Current ecosystem, a non-quantitative assessment was...

  10. Smile line assessment comparing quantitative measurement and visual estimation.

    PubMed

    Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie

    2011-02-01

    Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
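
    The interexaminer reliability statistic used in the abstract is Cohen's kappa, which corrects observed agreement for chance agreement. A minimal sketch with hypothetical 3-grade smile-line ratings (the ratings are invented, not the study's data):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical judgments
    (e.g. low / average / high smile-line grades)."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n        # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    cats = set(c1) | set(c2)
    pe = sum((c1[c] / n) * (c2[c] / n) for c in cats)   # chance agreement
    return (po - pe) / (1.0 - pe)

# Hypothetical 3-grade ratings for ten subjects; illustrative only.
rater1 = ["low", "avg", "avg", "high", "low", "avg", "high", "avg", "low", "high"]
rater2 = ["low", "avg", "high", "high", "low", "avg", "high", "avg", "avg", "high"]
kappa = cohens_kappa(rater1, rater2)
```

    Median kappa values of 0.79-0.88, as reported in the abstract, indicate substantially better-than-chance agreement on this scale.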

  11. Production and distribution of scientific and technical databases - Comparison among Japan, US and Europe

    NASA Astrophysics Data System (ADS)

    Onodera, Natsuo; Mizukami, Masayuki

    This paper estimates several quantitative indices on the production and distribution of scientific and technical databases based on various recent publications and attempts to compare the indices internationally. Raw data used for the estimation are drawn mainly from the Database Directory (published by MITI) for database production and from some domestic and foreign study reports for database revenues. The ratios of the indices among Japan, the US and Europe for database usage are similar to those for general scientific and technical activities such as population and R&D expenditures. But Japanese contributions to the production, revenue and cross-country distribution of databases are still lower than those of the US and European countries. An international comparison of relative database activities between the public and private sectors is also discussed.

  12. Estimation of Symptom Severity Scores for Patients with Schizophrenia Using ERP Source Activations during a Facial Affect Discrimination Task.

    PubMed

    Kim, Do-Won; Lee, Seung-Hwan; Shim, Miseon; Im, Chang-Hwan

    2017-01-01

    Precise diagnosis of psychiatric diseases and a comprehensive assessment of a patient's symptom severity are important in order to establish a successful treatment strategy for each patient. Although great efforts have been devoted to searching for diagnostic biomarkers of schizophrenia over the past several decades, no study has yet investigated how accurately these biomarkers are able to estimate an individual patient's symptom severity. In this study, we applied electrophysiological biomarkers obtained from electroencephalography (EEG) analyses to an estimation of symptom severity scores of patients with schizophrenia. EEG signals were recorded from 23 patients while they performed a facial affect discrimination task. Based on the source current density analysis results, we extracted voxels that showed a strong correlation between source activity and symptom scores. We then built a prediction model to estimate the symptom severity scores of each patient using the source activations of the selected voxels. The symptom scores of the Positive and Negative Syndrome Scale (PANSS) were estimated using the linear prediction model. The results of leave-one-out cross validation (LOOCV) showed that the mean errors of the estimated symptom scores were 3.34 ± 2.40 and 3.90 ± 3.01 for the Positive and Negative PANSS scores, respectively. The current pilot study is the first attempt to estimate symptom severity scores in schizophrenia using quantitative EEG features. It is expected that the present method can be extended to other cognitive paradigms or other psychological illnesses.
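
    The validation scheme the abstract uses, leave-one-out cross validation (LOOCV) of a linear prediction model scored by mean error, can be sketched with a simple one-feature regression. The activation and score values below are hypothetical stand-ins, not the study's EEG data.

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sxx
    return my - b * mx, b

def loocv_mae(x, y):
    """Leave-one-out cross-validation: refit on n-1 points, predict the
    held-out point, and average the absolute prediction errors."""
    errs = []
    for i in range(len(x)):
        xt, yt = x[:i] + x[i + 1:], y[:i] + y[i + 1:]
        a, b = fit_line(xt, yt)
        errs.append(abs(y[i] - (a + b * x[i])))
    return sum(errs) / len(errs)

# Hypothetical source activations vs. PANSS scores; illustrative only.
activation = [0.2, 0.5, 0.7, 1.1, 1.4, 1.9, 2.3]
panss = [12.0, 14.5, 15.0, 18.0, 19.5, 23.0, 25.5]
mae = loocv_mae(activation, panss)
```

    LOOCV is well suited to small clinical samples like the 23 patients here, since each fold trains on all but one subject.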

  13. Evaluation and Application of Satellite-Based Latent Heating Profile Estimation Methods

    NASA Technical Reports Server (NTRS)

    Olson, William S.; Grecu, Mircea; Yang, Song; Tao, Wei-Kuo

    2004-01-01

    In recent years, methods for estimating atmospheric latent heating vertical structure from both passive and active microwave remote sensing have matured to the point where quantitative evaluation of these methods is the next logical step. Two approaches for heating algorithm evaluation are proposed: First, application of heating algorithms to synthetic data, based upon cloud-resolving model simulations, can be used to test the internal consistency of heating estimates in the absence of systematic errors in physical assumptions. Second, comparisons of satellite-retrieved vertical heating structures to independent ground-based estimates, such as rawinsonde-derived analyses of heating, provide an additional test. The two approaches are complementary, since systematic errors in heating indicated by the second approach may be confirmed by the first. A passive microwave and a combined passive/active microwave heating retrieval algorithm are evaluated using the described approaches. In general, the passive microwave algorithm heating profile estimates are subject to biases due to the limited vertical heating structure information contained in the passive microwave observations. These biases may be partly overcome by including more environment-specific a priori information in the algorithm's database of candidate solution profiles. The combined passive/active microwave algorithm utilizes the much higher-resolution vertical structure information provided by spaceborne radar data to produce less biased estimates; however, the global spatio-temporal sampling by spaceborne radar is limited. In the present study, the passive/active microwave algorithm is used to construct a more physically-consistent and environment-specific set of candidate solution profiles for the passive microwave algorithm and to help evaluate errors in the passive algorithm's heating estimates. Although satellite estimates of latent heating are based upon instantaneous, footprint-scale data, suppression of random errors requires averaging to at least half-degree resolution. Analysis of mesoscale and larger space-time scale phenomena based upon passive and passive/active microwave heating estimates from TRMM, SSMI, and AMSR data will be presented at the conference.
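
    The first evaluation approach, testing a retrieval against cloud-resolving-model "truth", amounts to comparing retrieved and true heating profiles level by level. A minimal sketch, with an idealized Gaussian heating profile and a hypothetical retrieval that runs 10% low:

```python
import numpy as np

# Hypothetical check of a heating-retrieval algorithm against cloud-resolving
# model "truth": average (retrieved - true) at each vertical level to expose bias.
levels = np.linspace(1.0, 15.0, 15)              # height, km (assumed grid)
true = np.exp(-((levels - 6.0) / 3.0) ** 2)      # idealized heating profile, K/h
rng = np.random.default_rng(1)
# 500 synthetic retrievals: 10% systematic underestimate plus random noise
retrieved = true[None, :] * 0.9 + rng.normal(0.0, 0.05, size=(500, 15))

bias = retrieved.mean(axis=0) - true             # systematic error per level
rmse = np.sqrt(((retrieved - true[None, :]) ** 2).mean(axis=0))
```

    Averaging over many synthetic cases separates the systematic error (bias) from random error, which is exactly why the abstract notes that random errors require averaging to at least half-degree resolution.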

  14. Quantitative structure-activity relationships for predicting potential ecological hazard of organic chemicals for use in regulatory risk assessments.

    PubMed

    Comber, Mike H I; Walker, John D; Watts, Chris; Hermens, Joop

    2003-08-01

    The use of quantitative structure-activity relationships (QSARs) for deriving the predicted no-effect concentration of discrete organic chemicals for the purposes of conducting a regulatory risk assessment in Europe and the United States is described. In the United States, under the Toxic Substances Control Act (TSCA), the TSCA Interagency Testing Committee and the U.S. Environmental Protection Agency (U.S. EPA) use SARs to estimate the hazards of existing and new chemicals. Within the Existing Substances Regulation in Europe, QSARs may be used for data evaluation, test strategy indications, and the identification and filling of data gaps. To illustrate where and when QSARs may be useful and when their use is more problematic, an example, methyl tertiary-butyl ether (MTBE), is given and the predicted and experimental data are compared. Improvements needed for new QSARs and tools for developing and using QSARs are discussed.

  15. Spatio-temporal models of mental processes from fMRI.

    PubMed

    Janoos, Firdaus; Machiraju, Raghu; Singh, Shantanu; Morocz, Istvan Ákos

    2011-07-15

    Understanding the highly complex, spatially distributed and temporally organized phenomena entailed by mental processes using functional MRI is an important research problem in cognitive and clinical neuroscience. Conventional analysis methods focus on the spatial dimension of the data discarding the information about brain function contained in the temporal dimension. This paper presents a fully spatio-temporal multivariate analysis method using a state-space model (SSM) for brain function that yields not only spatial maps of activity but also its temporal structure along with spatially varying estimates of the hemodynamic response. Efficient algorithms for estimating the parameters along with quantitative validations are given. A novel low-dimensional feature-space for representing the data, based on a formal definition of functional similarity, is derived. Quantitative validation of the model and the estimation algorithms is provided with a simulation study. Using a real fMRI study for mental arithmetic, the ability of this neurophysiologically inspired model to represent the spatio-temporal information corresponding to mental processes is demonstrated. Moreover, by comparing the models across multiple subjects, natural patterns in mental processes organized according to different mental abilities are revealed. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. A framework for nowcasting and forecasting of rainfall-triggered landslide activity using remotely sensed data

    NASA Astrophysics Data System (ADS)

    Kirschbaum, Dalia; Stanley, Thomas

    2016-04-01

    Remote sensing data offer a unique means of providing situational awareness of hydrometeorological hazards over large areas in a way that is impossible to achieve with in situ data. Recent work has shown that rainfall-triggered landslides, while typically local hazards that occupy small spatial areas, can be approximated over regional or global scales in near real-time. This work presents a regional and global approach to approximating potential landslide activity using the landslide hazard assessment for situational awareness (LHASA) model. This system couples remote sensing data, including Global Precipitation Measurement rainfall data, Shuttle Radar Topography Mission topography, and other surface variables, to estimate where and when landslide activity may be likely. This system also evaluates the effectiveness of quantitative precipitation estimates from the Goddard Earth Observing System Model, Version 5 to provide a 24-hour forecast of potential landslide activity. Preliminary results of the LHASA model are presented for a regional version of this system in Central America as well as for a prototype global approach.
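
    A LHASA-style nowcast reduces to a threshold rule combining a static susceptibility map with recent satellite rainfall; the variable names and thresholds below are illustrative assumptions, not the model's calibrated values:

```python
# Hedged sketch of a threshold-based landslide nowcast: flag a grid cell when
# its (precomputed) susceptibility is high AND recent satellite rainfall is
# anomalously heavy. Both thresholds are hypothetical placeholders.

def nowcast(susceptibility, rainfall_percentile,
            susc_threshold=0.7, rain_threshold=0.95):
    """True if the cell should be flagged for potential landslide activity."""
    return susceptibility >= susc_threshold and rainfall_percentile >= rain_threshold

# Three hypothetical grid cells: (susceptibility, rainfall percentile)
alerts = [nowcast(s, r) for s, r in [(0.9, 0.97), (0.9, 0.50), (0.3, 0.99)]]
# -> [True, False, False]: only the cell that is both susceptible and wet alerts
```

    Swapping the observed rainfall percentile for a forecast one (e.g., from a numerical weather model) turns the same rule into the 24-hour forecast mode described above.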

  17. Comparative analysis of quantitative methodologies for Vibrionaceae biofilms.

    PubMed

    Chavez-Dozal, Alba A; Nourabadi, Neda; Erken, Martina; McDougald, Diane; Nishiguchi, Michele K

    2016-11-01

    Multiple symbiotic and free-living Vibrio spp. grow as a form of microbial community known as a biofilm. In the laboratory, methods to quantify Vibrio biofilm mass include crystal violet staining, direct colony-forming unit (CFU) counting, dry biofilm cell mass measurement, and observation of development of wrinkled colonies. Another approach for bacterial biofilms also involves the use of tetrazolium (XTT) assays (used widely in studies of fungi) that are an appropriate measure of metabolic activity and vitality of cells within the biofilm matrix. This study systematically tested five techniques, among which the XTT assay and wrinkled colony measurement provided the most reproducible, accurate, and efficient methods for the quantitative estimation of Vibrionaceae biofilms.

  18. 75 FR 35990 - Endangered and Threatened Wildlife and Plants; Listing the Flying Earwig Hawaiian Damselfly and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-24

    ... this location in 2008. No quantitative estimate of the size of this remaining population is available... observed in 1998. No quantitative estimates of the size of the extant populations are available. Howarth...

  19. Microscopic Structure and Solubility Predictions of Multifunctional Solids in Supercritical Carbon Dioxide: A Molecular Simulation Study.

    PubMed

    Noroozi, Javad; Paluch, Andrew S

    2017-02-23

    Molecular dynamics simulations were employed to both estimate the solubility of nonelectrolyte solids, such as acetanilide, acetaminophen, phenacetin, methylparaben, and lidocaine, in supercritical carbon dioxide and understand the underlying molecular-level driving forces. The solubility calculations involve the estimation of the solute's limiting activity coefficient, which may be computed using conventional staged free-energy calculations. For the case of lidocaine, wherein the infinite dilution approximation is not appropriate, we demonstrate how the activity coefficient at finite concentrations may be estimated without additional effort using the dilute solution approximation and how this may be used to further understand the solvation process. Combining with experimental pure-solid properties, namely, the normal melting point and enthalpy of fusion, solubilities were estimated. The results are in good quantitative agreement with available experimental data, suggesting that molecular simulations may be a powerful tool for understanding supercritical processes and the design of carbon dioxide-philic molecular systems. Structural analyses were performed to shed light on the microscopic details of the solvation of different functional groups by carbon dioxide and the observed solubility trends.
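
    The solubility recipe described, combining a simulation-derived activity coefficient with experimental melting-point and enthalpy-of-fusion data, follows the standard thermodynamic cycle for solid solubility. A sketch with placeholder numbers (not measured values for acetanilide or any of the other named solutes):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def solid_solubility(dH_fus, T_m, T, ln_gamma):
    """Mole-fraction solubility of a solid from the classic thermodynamic cycle:

        ln x = (dH_fus / R) * (1/T_m - 1/T) - ln(gamma)

    dH_fus: enthalpy of fusion (J/mol); T_m: melting point (K);
    T: system temperature (K); ln_gamma: solute activity coefficient
    in the solvent (the simulation-supplied quantity).
    """
    return math.exp(dH_fus / R * (1.0 / T_m - 1.0 / T) - ln_gamma)

# Illustrative inputs only
x = solid_solubility(dH_fus=25_000.0, T_m=420.0, T=310.0, ln_gamma=2.0)
```

    The ideal-solution case (ln_gamma = 0) gives an upper bound; a positive ln_gamma, i.e., unfavorable solvation, lowers the predicted solubility.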

  20. Estimation of 3-D conduction velocity vector fields from cardiac mapping data.

    PubMed

    Barnette, A R; Bayly, P V; Zhang, S; Walcott, G P; Ideker, R E; Smith, W M

    2000-08-01

    A method to estimate three-dimensional (3-D) conduction velocity vector fields in cardiac tissue is presented. The speed and direction of propagation are found from polynomial "surfaces" fitted to space-time (x, y, z, t) coordinates of cardiac activity. The technique is applied to sinus rhythm and paced rhythm mapped with plunge needles at 396-466 sites in the canine myocardium. The method was validated on simulated 3-D plane and spherical waves. For simulated data, conduction velocities were estimated with an accuracy of 1%-2%. In experimental data, estimates of conduction speeds during paced rhythm were slower than those found during normal sinus rhythm. Vector directions were also found to differ between different types of beats. The technique was able to distinguish between premature ventricular contractions and sinus beats and between sinus and paced beats. The proposed approach to computing velocity vector fields provides an automated, physiological, and quantitative description of local electrical activity in 3-D tissue. This method may provide insight into abnormal conduction associated with fatal ventricular arrhythmias.
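
    The core of the method, fitting a polynomial "surface" to activation times and reading velocity off its gradient, can be illustrated with a first-order fit to a synthetic plane wave. In one common formulation the velocity is v = ∇t/|∇t|², so speed and direction fall directly out of the fitted coefficients (a sketch, not the paper's full higher-order implementation):

```python
import numpy as np

rng = np.random.default_rng(2)
pts = rng.uniform(-5.0, 5.0, size=(400, 3))        # electrode sites, mm
true_dir = np.array([1.0, 0.0, 0.0])
true_speed = 0.5                                   # mm/ms
t = pts @ true_dir / true_speed                    # plane-wave activation times, ms

# Fit t ~ a*x + b*y + c*z + d by least squares
A = np.column_stack([pts, np.ones(len(pts))])
coef, *_ = np.linalg.lstsq(A, t, rcond=None)
g = coef[:3]                                       # spatial gradient of t, ms/mm
v = g / (g @ g)                                    # velocity vector, mm/ms
speed = float(np.linalg.norm(v))                   # recovers true_speed
```

    Higher-order polynomial surfaces allow the same gradient trick to resolve curved wavefronts (like the simulated spherical waves), with the gradient evaluated locally at each site.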

  1. Proficiency testing as a basis for estimating uncertainty of measurement: application to forensic alcohol and toxicology quantitations.

    PubMed

    Wallace, Jack

    2010-05-01

    While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
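
    Approach (ii), estimating uncertainty from a laboratory's deviations from the participant consensus over many proficiency tests, reduces to simple arithmetic. A sketch with hypothetical blood-alcohol results; the RMS-plus-coverage-factor form is an assumption for illustration, not necessarily the article's exact formula:

```python
import math

def uncertainty_from_proficiency(lab_results, consensus_means, k=2.0):
    """Expanded uncertainty from proficiency-test history: the RMS of the
    (lab result - consensus mean) differences, times coverage factor k
    (k=2 for roughly 95% coverage)."""
    diffs = [lab - ref for lab, ref in zip(lab_results, consensus_means)]
    rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return k * rms

# Hypothetical blood-alcohol proficiency results (g/100 mL)
lab = [0.101, 0.148, 0.082, 0.199, 0.119]
ref = [0.100, 0.150, 0.080, 0.200, 0.120]
u = uncertainty_from_proficiency(lab, ref)
```

    Note that, as the abstract cautions for toxicology, consensus-based differences measure comparability between laboratories, not absolute accuracy.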

  2. Security Events and Vulnerability Data for Cybersecurity Risk Estimation.

    PubMed

    Allodi, Luca; Massacci, Fabio

    2017-08-01

    Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in Finance). This article presents a model and methodology to leverage on the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system, and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.

  3. Confidence estimation for quantitative photoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Gröhl, Janek; Kirchner, Thomas; Maier-Hein, Lena

    2018-02-01

    Quantification of photoacoustic (PA) images is one of the major challenges currently being addressed in PA research. Tissue properties can be quantified by correcting the recorded PA signal with an estimation of the corresponding fluence. Fluence estimation itself, however, is an ill-posed inverse problem which usually needs simplifying assumptions to be solved with state-of-the-art methods. These simplifications, as well as noise and artifacts in PA images reduce the accuracy of quantitative PA imaging (PAI). This reduction in accuracy is often localized to image regions where the assumptions do not hold true. This impedes the reconstruction of functional parameters when averaging over entire regions of interest (ROI). Averaging over a subset of voxels with a high accuracy would lead to an improved estimation of such parameters. To achieve this, we propose a novel approach to the local estimation of confidence in quantitative reconstructions of PA images. It makes use of conditional probability densities to estimate confidence intervals alongside the actual quantification. It encapsulates an estimation of the errors introduced by fluence estimation as well as signal noise. We validate the approach using Monte Carlo generated data in combination with a recently introduced machine learning-based approach to quantitative PAI. Our experiments show at least a two-fold improvement in quantification accuracy when evaluating on voxels with high confidence instead of thresholding signal intensity.

  4. Data analysis in emission tomography using emission-count posteriors

    NASA Astrophysics Data System (ADS)

    Sitek, Arkadiusz

    2012-11-01

    A novel approach to the analysis of emission tomography data using the posterior probability of the number of emissions per voxel (emission count) conditioned on acquired tomographic data is explored. The posterior is derived from the prior and the Poisson likelihood of the emission-count data by marginalizing voxel activities. Based on emission-count posteriors, examples of Bayesian analysis including estimation and classification tasks in emission tomography are provided. The application of the method to computer simulations of 2D tomography is demonstrated. In particular, the minimum-mean-square-error point estimator of the emission count is demonstrated. The process of finding this estimator can be considered as a tomographic image reconstruction technique since the estimates of the number of emissions per voxel divided by voxel sensitivities and acquisition time are the estimates of the voxel activities. As an example of a classification task, a hypothesis stating that some region of interest (ROI) emitted at least or at most r-times the number of events in some other ROI is tested. The ROIs are specified by the user. The analysis described in this work provides new quantitative statistical measures that can be used in decision making in diagnostic imaging using emission tomography.
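
    A toy version of an emission-count posterior (far simpler than the paper's tomographic model, which marginalizes voxel activities under a full projection likelihood): if a voxel emits N ~ Poisson(mu) events and each is detected independently with probability s, Poisson thinning makes the missed count independent of the detected count d, giving a closed-form posterior mean:

```python
def mmse_emission_count(d, mu, s):
    """Posterior-mean (MMSE) estimate of the emission count N given d
    detected events, under N ~ Poisson(mu) with per-event detection
    probability s. By Poisson thinning the undetected count is
    Poisson(mu * (1 - s)) independent of d, so E[N | d] = d + mu*(1 - s)."""
    return d + mu * (1.0 - s)

# 120 detected events, prior mean 1000 emissions, 10% sensitivity
n_hat = mmse_emission_count(d=120, mu=1000.0, s=0.1)
```

    Dividing such emission-count estimates by voxel sensitivity and acquisition time converts them to activity estimates, which is why the abstract frames the MMSE estimator as a reconstruction technique.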

  5. Relative equilibrium plot improves graphical analysis and allows bias correction of standardized uptake value ratio in quantitative 11C-PiB PET studies.

    PubMed

    Zhou, Yun; Sojkova, Jitka; Resnick, Susan M; Wong, Dean F

    2012-04-01

    Both the standardized uptake value ratio (SUVR) and the Logan plot result in biased distribution volume ratios (DVRs) in ligand-receptor dynamic PET studies. The objective of this study was to use a recently developed relative equilibrium-based graphical (RE) plot method to improve and simplify the 2 commonly used methods for quantification of (11)C-Pittsburgh compound B ((11)C-PiB) PET. The overestimation of DVR in SUVR was analyzed theoretically using the Logan and the RE plots. A bias-corrected SUVR (bcSUVR) was derived from the RE plot. Seventy-eight (11)C-PiB dynamic PET scans (66 from controls and 12 from participants with mild cognitive impairment [MCI] from the Baltimore Longitudinal Study of Aging) were acquired over 90 min. Regions of interest (ROIs) were defined on coregistered MR images. Both the ROI and the pixelwise time-activity curves were used to evaluate the estimates of DVR. DVRs obtained using the Logan plot applied to ROI time-activity curves were used as a reference for comparison of DVR estimates. Results from the theoretic analysis were confirmed by human studies. ROI estimates from the RE plot and the bcSUVR were nearly identical to those from the Logan plot with ROI time-activity curves. In contrast, ROI estimates from DVR images in frontal, temporal, parietal, and cingulate regions and the striatum were underestimated by the Logan plot (controls, 4%-12%; MCI, 9%-16%) and overestimated by the SUVR (controls, 8%-16%; MCI, 16%-24%). This bias was higher in the MCI group than in controls (P < 0.01) but was not present when data were analyzed using either the RE plot or the bcSUVR. The RE plot improves pixelwise quantification of (11)C-PiB dynamic PET, compared with the conventional Logan plot. The bcSUVR results in lower bias and higher consistency of DVR estimates than the SUVR. The RE plot and the bcSUVR are practical quantitative approaches that improve the analysis of (11)C-PiB studies.
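
    For context, the Logan reference-tissue plot used as the comparison standard estimates DVR as the late-time slope of integrated target activity against integrated reference activity, both normalized by the target curve. A sketch on synthetic time-activity curves where the true ratio is known (idealized curves, not (11)C-PiB data):

```python
import numpy as np

t = np.linspace(0.0, 90.0, 91)                   # minutes
c_ref = t * np.exp(-t / 30.0)                    # idealized reference-region TAC
dvr_true = 1.5
c_t = dvr_true * c_ref                           # target TAC, exactly DVR x reference

def cumtrapz(y, x):
    """Cumulative trapezoidal integral of y over x."""
    out = np.zeros_like(y)
    out[1:] = np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(x))
    return out

late = t > 40.0                                  # restrict to the linear portion
x = cumtrapz(c_ref, t)[late] / c_t[late]
y = cumtrapz(c_t, t)[late] / c_t[late]
slope, intercept = np.polyfit(x, y, 1)           # slope -> DVR estimate
```

    With real, noisy pixelwise curves this slope becomes biased, which is the bias the RE plot and bcSUVR are designed to remove.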

  6. Emissions of Polycyclic Aromatic Hydrocarbons from Natural Gas Extraction into Air.

    PubMed

    Paulik, L Blair; Donald, Carey E; Smith, Brian W; Tidwell, Lane G; Hobbie, Kevin A; Kincl, Laurel; Haynes, Erin N; Anderson, Kim A

    2016-07-19

    Natural gas extraction, often referred to as "fracking", has increased rapidly in the United States in recent years. To address potential health impacts, passive air samplers were deployed in a rural community heavily affected by the natural gas boom. Samplers were analyzed for 62 polycyclic aromatic hydrocarbons (PAHs). Results were grouped based on distance from each sampler to the nearest active well. Levels of benzo[a]pyrene, phenanthrene, and carcinogenic potency of PAH mixtures were highest when samplers were closest to active wells. PAH levels closest to natural gas activity were comparable to levels previously reported in rural areas in winter. Sourcing ratios indicated that PAHs were predominantly petrogenic, suggesting that PAH levels were influenced by direct releases from the earth. Quantitative human health risk assessment estimated the excess lifetime cancer risks associated with exposure to the measured PAHs. At sites closest to active wells, the risk estimated for maximum residential exposure was 0.04 in a million, which is below the U.S. Environmental Protection Agency's acceptable risk level. Overall, risk estimates decreased 30% when comparing results from samplers closest to active wells to those farthest from them. This work suggests that natural gas extraction is contributing PAHs to the air, at levels that would not be expected to increase cancer risk.
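
    The risk-assessment step of such a study typically converts measured PAH concentrations to benzo[a]pyrene-equivalents via relative potency factors and multiplies by an inhalation unit risk. All numbers below are placeholders for illustration, not the study's measurements or regulatory values:

```python
# Hypothetical potency factors relative to benzo[a]pyrene and air levels
rpf = {"benzo[a]pyrene": 1.0, "phenanthrene": 0.001}        # relative potency
conc_ng_m3 = {"benzo[a]pyrene": 0.1, "phenanthrene": 50.0}  # air levels, ng/m^3

# Collapse the mixture to a single BaP-equivalent concentration
bap_eq = sum(conc_ng_m3[c] * rpf[c] for c in rpf)           # ng/m^3 BaP-eq

unit_risk_per_ng_m3 = 6e-7        # placeholder inhalation unit risk
excess_risk = bap_eq * unit_risk_per_ng_m3                  # excess lifetime risk
```

    Expressed per million people, a result like the abstract's "0.04 in a million" comes from exactly this kind of product of exposure concentration and potency.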

  7. Prediction of Physicochemical Properties of Energetic Materials for Identification of Treatment Technologies for Waste Streams

    DTIC Science & Technology

    2010-11-01

    estimate the pharmacokinetics of potential drugs (Horning and Klamt 2005). QSPR/QSARs also have potential applications in the fuel science field...group contribution methods, and (2) quantitative structure-property/activity relationships (QSPR/QSAR). The group contribution methods are primarily...development of QSPR/QSARs is the identification of the appropriate set of descriptors that allow the desired attribute of the compound to be adequately

  8. Metallurgical features of the formation of a solid-phase metal joint upon electric-circuit heating

    NASA Astrophysics Data System (ADS)

    Latypov, R. A.; Bulychev, V. V.; Zybin, I. N.

    2017-06-01

    The thermodynamic conditions of formation of a joint between metals using the solid-phase methods of powder metallurgy, welding, and deposition of functional coatings upon electric-current heating of the surfaces to be joined are studied. Relations are obtained to quantitatively estimate the critical sizes of the circular and linear active centers that result in the formation of stable bonding zones.

  9. Overview of data and conceptual approaches for derivation of quantitative structure-activity relationships for ecotoxicological effects of organic chemicals.

    PubMed

    Bradbury, Steven P; Russom, Christine L; Ankley, Gerald T; Schultz, T Wayne; Walker, John D

    2003-08-01

    The use of quantitative structure-activity relationships (QSARs) in assessing potential toxic effects of organic chemicals on aquatic organisms continues to evolve as computational efficiency and toxicological understanding advance. With the ever-increasing production of new chemicals, and the need to optimize resources to assess thousands of existing chemicals in commerce, regulatory agencies have turned to QSARs as essential tools to help prioritize tiered risk assessments when empirical data are not available to evaluate toxicological effects. Progress in designing scientifically credible QSARs is intimately associated with the development of empirically derived databases of well-defined and quantified toxicity endpoints, which are based on a strategic evaluation of diverse sets of chemical structures, modes of toxic action, and species. This review provides a brief overview of four databases created for the purpose of developing QSARs for estimating toxicity of chemicals to aquatic organisms. The evolution of QSARs based initially on general chemical classification schemes, to models founded on modes of toxic action that range from nonspecific partitioning into hydrophobic cellular membranes to receptor-mediated mechanisms is summarized. Finally, an overview of expert systems that integrate chemical-specific mode of action classification and associated QSAR selection for estimating potential toxicological effects of organic chemicals is presented.

  10. Effects of Horticultural Therapy on Psychosocial Health in Older Nursing Home Residents: A Preliminary Study.

    PubMed

    Chen, Yuh-Min; Ji, Jeng-Yi

    2015-09-01

    This preliminary study examined the effect of horticultural therapy on psychosocial health in older nursing home residents. A combined quantitative and qualitative design was adopted. Convenience sampling was used to recruit 10 older residents from a nursing home in Taichung, Taiwan. Participants joined a 10-week indoor horticultural program once a week, with each session lasting for about 1.5 hours. A single-group design with multiple measurements was adopted for the quantitative component of this study. Interviews held 1-2 days before the intervention (T0) were used to collect baseline data. The two outcome variables of this study, depression and loneliness, were reassessed during the 5th (T1) and 10th (T2) weeks of the intervention. Generalized estimating equations were used to test the mean differences among T0, T1, and T2 measures. After the 10-week program, qualitative data were collected by asking participants to share their program participation experiences. The results of generalized estimating equation showed significant improvements in depression and loneliness. Four categories emerged from the qualitative data content analysis: social connection, anticipation and hope, sense of achievement, and companionship. Given the beneficial effects of the horticulture therapy, the inclusion of horticultural activities in nursing home activity programs is recommended.

  11. Utilization of quantitative structure-activity relationships (QSARs) in risk assessment: Alkylphenols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, B.D.; Toole, A.P.; Callahan, B.G.

    1991-12-01

    Alkylphenols are a class of environmentally pervasive compounds, found both in natural (e.g., crude oils) and in anthropogenic (e.g., wood tar, coal gasification waste) materials. Despite the frequent environmental occurrence of these chemicals, there is a limited toxicity database on alkylphenols. The authors have therefore developed a 'toxicity equivalence approach' for alkylphenols which is based on their ability to inhibit, in a specific manner, the enzyme cyclooxygenase. Enzyme-inhibiting ability for individual alkylphenols can be estimated based on the quantitative structure-activity relationship developed by Dewhirst (1980) and is a function of the free hydroxyl group, electron-donating ring substituents, and hydrophobic aromatic ring substituents. The authors evaluated the toxicological significance of cyclooxygenase inhibition by comparing the inhibitory capacity of alkylphenols with that of acetylsalicylic acid, or aspirin, a compound whose low-level effects are due to cyclooxygenase inhibition. Since nearly complete absorption for alkylphenols and aspirin is predicted, based on estimates of hydrophobicity and fraction of charged molecules at gastrointestinal pHs, risks from alkylphenols can be expressed directly in terms of 'milligram aspirin equivalence,' without correction for absorption differences. They recommend this method for assessing risks of mixtures of alkylphenols, especially for those compounds with no chronic toxicity data. 38 references.
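
    The 'milligram aspirin equivalence' bookkeeping is plain arithmetic once potencies are in hand; the potency ratios and doses below are hypothetical placeholders, not values from Dewhirst (1980):

```python
def mg_aspirin_equivalent(dose_mg, potency_ratio_vs_aspirin):
    """Scale an alkylphenol dose by the ratio of its cyclooxygenase-inhibiting
    potency to aspirin's, yielding an aspirin-equivalent dose in mg."""
    return dose_mg * potency_ratio_vs_aspirin

# Hypothetical mixture: {name: (dose in mg, potency ratio vs. aspirin)}
mix = {"alkylphenol A": (10.0, 0.3), "alkylphenol B": (5.0, 1.2)}
total = sum(mg_aspirin_equivalent(d, p) for d, p in mix.values())
```

    Because near-complete absorption is predicted for both the alkylphenols and aspirin, the equivalent doses for a mixture can simply be summed, as above, without absorption corrections.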

  12. Three methods for estimating a range of vehicular interactions

    NASA Astrophysics Data System (ADS)

    Krbálek, Milan; Apeltauer, Jiří; Apeltauer, Tomáš; Szabová, Zuzana

    2018-02-01

    We present three different approaches to estimating the number of preceding cars that influence the decision-making of a given driver moving in saturated traffic flow. The first method is based on correlation analysis, the second evaluates (quantitatively) deviations from the main assumption of the convolution theorem for probability, and the third operates with advanced instruments of the theory of counting processes (statistical rigidity). We demonstrate that the universally accepted premise of short-ranged traffic interactions may not be correct. All of the methods introduced reveal that the minimum number of actively-followed vehicles is two, supporting the idea that vehicular interactions are, in fact, middle-ranged. Furthermore, the consistency between the three estimates is surprisingly good. In all cases we have found that the interaction range (the number of actively-followed vehicles) drops with traffic density: whereas drivers moving in congested regimes of lower density (around 30 vehicles per kilometer) react to four or five neighbors, drivers moving in high-density flows respond to only two predecessors.
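
    The first approach (correlation analysis) can be sketched on synthetic data: build speeds that respond to exactly two predecessors and check that the correlation falls off beyond them. The data-generating model is illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
base = rng.normal(0.0, 1.0, size=n + 4)          # independent per-car fluctuations
# Speed of car i responds to its two nearest predecessors, cars i+1 and i+2
speed = base[:n] + 0.8 * base[1:n+1] + 0.4 * base[2:n+2]

# Correlate each driver's speed with the k-th predecessor's fluctuation
corr = [abs(np.corrcoef(speed, base[k:n+k])[0, 1]) for k in (1, 2, 3)]
```

    The correlation is substantial for k = 1 and 2 but drops to noise level at k = 3, which is the signature the correlation method looks for when estimating the interaction range.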

  13. The ACCE method: an approach for obtaining quantitative or qualitative estimates of residual confounding that includes unmeasured confounding

    PubMed Central

    Smith, Eric G.

    2015-01-01

    Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors. Method: A method is presented that exploits the recently-identified phenomenon of "confounding amplification" to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method's requirements and assumptions are met. Previously published data are used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, it appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method's estimates (although bootstrapping is one plausible approach). Conclusions: To this author's knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward. The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226
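
    The central division step can be reduced to one line of arithmetic. This sketch omits the adjustment for the introduced variable's association with outcome, and all numbers are hypothetical:

```python
# Skeleton of the ACCE division step: if adding a strong predictor of exposure
# amplifies confounding by factor A, the shift in the treatment effect between
# the two nested propensity models reflects (A - 1) times residual confounding.

def residual_confounding(effect_base, effect_amplified, amplification):
    """Estimated residual confounding in the base model's effect estimate."""
    return (effect_amplified - effect_base) / (amplification - 1.0)

# Hypothetical effect estimates (e.g., log odds ratios) from the nested models,
# with an assumed amplification factor of 1.6
est = residual_confounding(effect_base=0.40, effect_amplified=0.52,
                           amplification=1.6)
```

    A positive estimate of this size would suggest the base model's effect is inflated by roughly 0.2 due to residual confounding, measured and unmeasured combined.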

  14. Comparison of Maximum Likelihood Estimation Approach and Regression Approach in Detecting Quantitative Trait Loci Using RAPD Markers

    Treesearch

    Changren Weng; Thomas L. Kubisiak; C. Dana Nelson; James P. Geaghan; Michael Stine

    1999-01-01

    Single-marker regression and single-marker maximum likelihood estimation were used to detect quantitative trait loci (QTLs) controlling the early height growth of longleaf pine and slash pine using a ((longleaf pine x slash pine) x slash pine) BC1 population consisting of 83 progeny. Maximum likelihood estimation was found to be more powerful than regression and could...
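
    Single-marker regression amounts to regressing the phenotype on marker class and testing the slope; a sketch on simulated data (the marker effect, error variance, and height scale are invented, not estimates from the pine cross):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 83                                   # progeny, as in the abstract
marker = rng.integers(0, 2, size=n).astype(float)   # RAPD band present/absent
height = 50.0 + 4.0 * marker + rng.normal(0.0, 2.0, size=n)  # simulated effect

# Ordinary least squares: height ~ intercept + slope * marker
X = np.column_stack([np.ones(n), marker])
beta, *_ = np.linalg.lstsq(X, height, rcond=None)
resid = height - X @ beta
s2 = resid @ resid / (n - 2)             # residual variance
se_slope = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
t_stat = beta[1] / se_slope              # large |t| flags a putative QTL
```

    The maximum likelihood alternative instead models the phenotype as a mixture over possible QTL genotypes given the marker, which is where its extra power over this simple regression comes from.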

  15. Estimating exposures in the asphalt industry for an international epidemiological cohort study of cancer risk.

    PubMed

    Burstyn, Igor; Boffetta, Paolo; Kauppinen, Timo; Heikkilä, Pirjo; Svane, Ole; Partanen, Timo; Stücker, Isabelle; Frentzel-Beyme, Rainer; Ahrens, Wolfgang; Merzenich, Hiltrud; Heederik, Dick; Hooiveld, Mariëtte; Langård, Sverre; Randem, Britt G; Järvholm, Bengt; Bergdahl, Ingvar; Shaham, Judith; Ribak, Joseph; Kromhout, Hans

    2003-01-01

    An exposure matrix (EM) for known and suspected carcinogens was required for a multicenter international cohort study of cancer risk and bitumen among asphalt workers. Production characteristics in companies enrolled in the study were ascertained through use of a company questionnaire (CQ). Exposures to coal tar, bitumen fume, organic vapor, polycyclic aromatic hydrocarbons, diesel fume, silica, and asbestos were assessed semi-quantitatively using information from CQs, expert judgment, and statistical models. Exposures of road paving workers to bitumen fume, organic vapor, and benzo(a)pyrene were estimated quantitatively by applying regression models, based on monitoring data, to exposure scenarios identified by the CQs. Exposure estimates were derived for 217 companies enrolled in the cohort, plus the Swedish asphalt paving industry in general. Most companies were engaged in road paving and asphalt mixing, but some also participated in general construction and roofing. Coal tar use was most common in Denmark and The Netherlands, but the practice is now obsolete. Quantitative estimates of exposure to bitumen fume, organic vapor, and benzo(a)pyrene for pavers, and semi-quantitative estimates of exposure to these agents among all subjects, were strongly correlated. Semi-quantitative estimates of exposure to bitumen fume and coal tar were only moderately correlated. The EM indicated a non-monotonic historical decrease in exposure to all agents assessed except silica and diesel exhaust. We produced a data-driven EM using methodology that can be adapted for other multicenter studies. Copyright 2003 Wiley-Liss, Inc.

  16. Quantitative Use of Fluorescent In Situ Hybridization To Examine Relationships between Mycolic Acid-Containing Actinomycetes and Foaming in Activated Sludge Plants

    PubMed Central

    Davenport, Russell J.; Curtis, Thomas P.; Goodfellow, Michael; Stainsby, Fiona M.; Bingley, Marc

    2000-01-01

    The formation of viscous foams on aeration basins and secondary clarifiers of activated sludge plants is a common and widespread problem. Foam formation is often attributed to the presence of mycolic acid-containing actinomycetes (mycolata). In order to examine the relationship between the number of mycolata and foam, we developed a group-specific probe targeting the 16S rRNA of the mycolata, a protocol to permeabilize mycolata, and a statistically robust quantification method. Statistical analyses showed that a lipase-based permeabilization method was quantitatively superior to previously described methods (P << 0.05). When mixed liquor and foam samples were examined, most of the mycolata present were rods or cocci, although filamentous mycolata were also observed. A nested analysis of variance showed that virtually all of the measured variance occurred between fields of view and not between samples. On this basis we determined that as few as five fields of view could be used to give a statistically meaningful sample. Quantitative fluorescent in situ hybridization (FISH) was used to examine the relationship between foaming and the concentration of mycolata in a 20-m^3 completely mixed activated sludge plant. Foaming occurred when the number of mycolata exceeded a certain threshold value. Baffling of the plant affected foaming without affecting the number of mycolata. We tentatively estimated that the threshold foaming concentration of mycolata was about 2 × 10^6 cells ml^-1 or 4 × 10^12 cells m^-2. We concluded that quantitative use of FISH is feasible and that quantification is a prerequisite for rational investigation of foaming in activated sludge. PMID:10698786

  17. Biological activity of aldose reductase and lipophilicity of pyrrolyl-acetic acid derivatives

    NASA Astrophysics Data System (ADS)

    Kumari, A.; Kumari, R.; Kumar, R.; Gupta, M.

    2011-12-01

    Quantitative structure-activity relationship (QSAR) modeling is a powerful approach for correlating an organic compound's structure with its lipophilicity. In this paper QSAR models are established to correlate the lipophilicity of a series of pyrrolyl-acetic acid derivatives, inhibitors of the aldose reductase enzyme, in the n-octanol-water system with their aldose reductase inhibitory activity. Lipophilicity is expressed by the logarithm of the n-octanol-water partition coefficient, log P, and biological activity by the logarithm of the aldose reductase inhibitory activity. Results obtained by QSAR modeling of the compound series reveal a definite trend in biological activity, and a further improvement in the quantitative relationships is obtained if, besides log P, the Hammett electronic constant σ and the third-order connectivity index (3χ) are included in the regression equation. The tri-parametric model with log P, 3χ and σ as correlating parameters was found to be the best, explaining 87% of the variance (R^2 = 0.8743). One compound was found to be a serious outlier, and when it is excluded the model explains about 94% of the variance of the data set (R^2 = 0.9447). The topological index 3χ has been found to be a good parameter for modeling the biological activity.
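    A tri-parametric QSAR fit of the form described (log P, 3χ and σ as predictors) can be sketched with ordinary least squares; the descriptor values and activities below are invented placeholders, not the paper's compounds:

```python
import numpy as np

# Hypothetical descriptors: columns are log P, the connectivity index
# 3-chi, and the Hammett constant sigma; activities are invented.
X = np.array([[1.2, 2.1, 0.10],
              [1.8, 2.5, 0.05],
              [2.3, 3.0, -0.02],
              [2.9, 3.4, 0.12],
              [3.5, 3.9, 0.07]])
y = np.array([4.1, 4.6, 5.2, 5.8, 6.3])

A = np.column_stack([X, np.ones(len(y))])   # append intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

    The variance explained (R^2) computed this way is the quantity the abstract reports for its tri-parametric model.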

  18. Trend of telomerase activity change during human iPSC self-renewal and differentiation revealed by a quartz crystal microbalance based assay

    NASA Astrophysics Data System (ADS)

    Zhou, Yitian; Zhou, Ping; Xin, Yinqiang; Wang, Jie; Zhu, Zhiqiang; Hu, Ji; Wei, Shicheng; Ma, Hongwei

    2014-11-01

    Telomerase plays an important role in governing the life span of cells for its capacity to extend telomeres. As high activity of telomerase has been found specifically in stem cells and cancer cells, various methods have been developed for the evaluation of telomerase activity. To overcome the time-consuming procedures and complicated manipulations of existing methods, we developed a novel method named Telomeric Repeat Elongation Assay based on Quartz crystal microbalance (TREAQ) to monitor telomerase activity during the self-renewal and differentiation of human induced pluripotent stem cells (hiPSCs). TREAQ results indicated that hiPSCs possess invariable telomerase activity for 11 passages on Matrigel and a steady decline of telomerase activity when differentiated for different periods, which was confirmed with the existing gold-standard method. The pluripotency of hiPSCs during differentiation could be estimated by monitoring telomerase activity and compared with the expression levels of pluripotency marker genes measured via quantitative real-time PCR. Regular assessment of factors associated with pluripotency or stemness is expensive and consumes excessive amounts of sample; TREAQ could thus be a promising alternative technology for routine monitoring of telomerase activity and estimation of the pluripotency of stem cells.

  19. Estimation of Anaerobic Debromination Rate Constants of PBDE Pathways Using an Anaerobic Dehalogenation Model.

    PubMed

    Karakas, Filiz; Imamoglu, Ipek

    2017-04-01

    This study aims to estimate anaerobic debromination rate constants (km) of PBDE pathways using previously reported laboratory soil data. km values of the pathways are estimated by modifying a previously developed model, the Anaerobic Dehalogenation Model. Debromination activities published in the literature in terms of bromine substitutions, as well as specific microorganisms and their combinations, are used for identification of pathways. The range of estimated km values is between 0.0003 and 0.0241 d^-1. The median and maximum km values are found to be comparable to the few available biologically confirmed rate constants published in the literature. The estimated km values can be used as input to numerical fate and transport models for a better and more detailed investigation of the fate of individual PBDEs in contaminated sediments. Various remediation scenarios, such as monitored natural attenuation or bioremediation with bioaugmentation, can be handled in a more quantitative manner with the help of the km values estimated in this study.
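    A first-order interpretation of such a rate constant can be sketched as follows; using the paper's reported maximum km = 0.0241 d^-1 for a half-life illustration is this sketch's own choice:

```python
import math

def remaining_fraction(km_per_day, t_days):
    """First-order debromination kinetics: fraction of a PBDE congener
    remaining after t_days at rate constant km (d^-1)."""
    return math.exp(-km_per_day * t_days)

# half-life implied by the maximum estimate reported, km = 0.0241 d^-1
half_life_days = math.log(2.0) / 0.0241
```

    Even at the fastest estimated rate, the implied half-life is roughly a month, consistent with the slow natural attenuation the abstract discusses.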

  20. Approach to estimation of level of information security at enterprise based on genetic algorithm

    NASA Astrophysics Data System (ADS)

    V, Stepanov L.; V, Parinov A.; P, Korotkikh L.; S, Koltsov A.

    2018-05-01

    The article considers a way of formalizing the various types of information security threats and the vulnerabilities of an enterprise information system. Given the complexity of ensuring the information security of any newly organized system, well-founded concepts and decisions in the sphere of information security are needed; one such approach is the genetic algorithm method. For enterprises in any field of activity, a comprehensive estimate of the security level of information systems, taking into account the quantitative and qualitative factors characterizing the components of information security, is relevant.
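    A minimal genetic-algorithm sketch for this kind of estimation task, posed here as selecting countermeasures that maximize a weighted security score under a cost budget; the weights, costs, and fitness function are illustrative assumptions, not the article's formalization:

```python
import random

random.seed(42)  # reproducible toy run

# Illustrative threat weights and countermeasure costs (assumptions).
WEIGHTS = [5, 3, 8, 2, 6]
COSTS = [2, 1, 4, 1, 3]
BUDGET = 7

def fitness(bits):
    """Security score of a countermeasure selection; over-budget
    selections score zero."""
    cost = sum(c for b, c in zip(bits, COSTS) if b)
    if cost > BUDGET:
        return 0.0
    return float(sum(w for b, w in zip(bits, WEIGHTS) if b))

def evolve(pop_size=20, generations=40, mutation=0.1):
    pop = [[random.randint(0, 1) for _ in WEIGHTS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(WEIGHTS))
            child = a[:cut] + b[cut:]            # one-point crossover
            children.append([1 - g if random.random() < mutation else g
                             for g in child])    # bit-flip mutation
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

    The best individual found approximates the optimal trade-off between threat coverage and cost, which is the kind of complex estimate the article targets.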

  1. Monitoring vegetation conditions from LANDSAT for use in range management

    NASA Technical Reports Server (NTRS)

    Haas, R. H.; Deering, D. W.; Rouse, J. W., Jr.; Schell, J. A.

    1975-01-01

    A summary of the LANDSAT Great Plains Corridor projects and the principal results are presented. Emphasis is given to the use of satellite acquired phenological data for range management and agri-business activities. A convenient method of reducing LANDSAT MSS data to provide quantitative estimates of green biomass on rangelands in the Great Plains is explained. Suggestions for the use of this approach for evaluating range feed conditions are presented. A LANDSAT Follow-on project has been initiated which will employ the green biomass estimation method in a quasi-operational monitoring of range readiness and range feed conditions on a regional scale.
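    The usual way LANDSAT MSS bands are reduced to a greenness measure is a band-ratio index such as NDVI; the linear biomass calibration below is a placeholder, not the project's regression coefficients:

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from near-infrared and
    red band reflectances."""
    return (nir - red) / (nir + red)

def green_biomass(nir, red, slope=2500.0, intercept=-500.0):
    """Hypothetical linear calibration of green biomass (kg/ha) against
    NDVI; the coefficients are illustrative assumptions."""
    return slope * ndvi(nir, red) + intercept
```

    In a quasi-operational setting, such an index computed per scene over the season gives the range-feed-condition trend the abstract describes.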

  2. Use of Mobile Device Data To Better Estimate Dynamic Population Size for Wastewater-Based Epidemiology.

    PubMed

    Thomas, Kevin V; Amador, Arturo; Baz-Lomba, Jose Antonio; Reid, Malcolm

    2017-10-03

    Wastewater-based epidemiology is an established approach for quantifying community drug use and has recently been applied to estimate population exposure to contaminants such as pesticides and phthalate plasticizers. A major source of uncertainty in the population weighted biomarker loads generated is related to estimating the number of people present in a sewer catchment at the time of sample collection. Here, the population quantified from mobile device-based population activity patterns was used to provide dynamic population normalized loads of illicit drugs and pharmaceuticals during a known period of high net fluctuation in the catchment population. Mobile device-based population activity patterns have for the first time quantified the high degree of intraday, intraweek, and intramonth variability within a specific sewer catchment. Dynamic population normalization showed that per capita pharmaceutical use remained unchanged during the period when static normalization would have indicated an average reduction of up to 31%. Per capita illicit drug use increased significantly during the monitoring period, an observation that was only possible to measure using dynamic population normalization. The study quantitatively confirms previous assessments that population estimates can account for uncertainties of up to 55% in static normalized data. Mobile device-based population activity patterns allow for dynamic normalization that yields much improved temporal and spatial trend analysis.
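    The normalization step can be sketched as follows; the population and load figures are invented for illustration:

```python
def per_capita_load(mass_load_mg_per_day, population):
    """Population-normalized biomarker load, mg/day per 1000 inhabitants."""
    return mass_load_mg_per_day / population * 1000.0

load = 180.0          # mg/day of a biomarker in influent (invented)
static_pop = 50_000   # census-style static estimate (invented)
dynamic_pop = 72_000  # mobile-device dynamic estimate (invented)

static_norm = per_capita_load(load, static_pop)
dynamic_norm = per_capita_load(load, dynamic_pop)
```

    With these invented figures the static estimate overstates per capita use by 44%, illustrating how an influx of people into the catchment distorts statically normalized loads.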

  3. The interrupted power law and the size of shadow banking.

    PubMed

    Fiaschi, Davide; Kondor, Imre; Marsili, Matteo; Volpati, Valerio

    2014-01-01

    Using public data (Forbes Global 2000) we show that the asset sizes for the largest global firms follow a Pareto distribution in an intermediate range, that is "interrupted" by a sharp cut-off in its upper tail, where it is totally dominated by financial firms. This flattening of the distribution contrasts with a large body of empirical literature which finds a Pareto distribution for firm sizes both across countries and over time. Pareto distributions are generally traced back to a mechanism of proportional random growth, based on a regime of constant returns to scale. This makes our findings of an "interrupted" Pareto distribution all the more puzzling, because we provide evidence that financial firms in our sample should operate in such a regime. We claim that the missing mass from the upper tail of the asset size distribution is a consequence of shadow banking activity and that it provides an (upper) estimate of the size of the shadow banking system. This estimate, which we propose as a shadow banking index, compares well with estimates of the Financial Stability Board until 2009, but it shows a sharper rise in shadow banking activity after 2010. Finally, we propose a proportional random growth model that reproduces the observed distribution, thereby providing a quantitative estimate of the intensity of shadow banking activity.
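    The tail-fitting logic can be sketched with a Hill estimator for the Pareto exponent and an extrapolated expected mass above the cut-off; the gap between expected and observed mass then plays the role of the missing-mass index, with toy inputs standing in for the Forbes data:

```python
import math

def hill_alpha(sizes, xmin):
    """Hill (maximum-likelihood) estimate of the Pareto tail exponent
    from observations at or above xmin."""
    tail = [s for s in sizes if s >= xmin]
    return len(tail) / sum(math.log(s / xmin) for s in tail)

def pareto_expected_mass(alpha, xcut, n_tail):
    """Total size the n_tail firms above xcut would hold if the Pareto
    law (alpha > 1) continued past the cut-off: n times the mean of a
    Pareto(alpha, xcut) variable."""
    return n_tail * alpha * xcut / (alpha - 1.0)

# toy index: expected tail mass minus observed tail mass (invented)
shadow_index = pareto_expected_mass(2.0, 5.0, 10) - 80.0
```

    The paper's index is conceptually this difference computed on the actual asset-size distribution, not on these toy values.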

  4. Kinetic theory for DNA melting with vibrational entropy

    NASA Astrophysics Data System (ADS)

    Sensale, Sebastian; Peng, Zhangli; Chang, Hsueh-Chia

    2017-10-01

    By treating DNA as a vibrating nonlinear lattice, an activated kinetic theory for DNA melting is developed to capture the breakage of the hydrogen bonds and subsequent softening of torsional and bending vibration modes. With a coarse-grained lattice model, we identify a key bending mode with GHz frequency that replaces the hydrogen vibration modes as the dominant out-of-phase phonon vibration at the transition state. By associating its bending modulus to a universal in-phase bending vibration modulus at equilibrium, we can hence estimate the entropic change in the out-of-phase vibration from near-equilibrium all-atom simulations. This and estimates of torsional and bending entropy changes lead to the first predictive and sequence-dependent theory with good quantitative agreement with experimental data for the activation energy of melting of short DNA molecules without intermediate hairpin structures.

  5. Neural electrical activity and neural network growth.

    PubMed

    Gafarov, F M

    2018-05-01

    The development of the central and peripheral neural systems depends in part on the emergence of the correct functional connectivity in their input and output pathways. It is now generally accepted that molecular factors guide neurons to establish a primary scaffold that undergoes activity-dependent refinement for building a fully functional circuit. However, a number of experimental results obtained recently show that neuronal electrical activity plays an important role in establishing initial interneuronal connections. Nevertheless, these processes are rather difficult to study experimentally, due to the absence of a theoretical description and of quantitative parameters for estimating the influence of neuronal activity on growth in neural networks. In this work we propose a general framework for a theoretical description of activity-dependent neural network growth. The theoretical description incorporates a closed-loop growth model in which neural activity can affect neurite outgrowth, which in turn can affect neural activity. We carried out a detailed quantitative analysis of spatiotemporal activity patterns and studied the relationship between individual cells and the network as a whole to explore the relationship between developing connectivity and activity patterns. The model developed in this work will allow us to develop new experimental techniques for studying and quantifying the influence of neuronal activity on growth processes in neural networks and may lead to novel techniques for constructing large-scale neural networks by self-organization. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Quantitative Assessment of Cancer Risk from Exposure to Diesel Engine Emissions

    EPA Science Inventory

    Quantitative estimates of lung cancer risk from exposure to diesel engine emissions were developed using data from three chronic bioassays with Fischer 344 rats. Human target organ dose was estimated with the aid of a comprehensive dosimetry model. This model accounted for rat-hum...

  7. A QUANTITATIVE APPROACH FOR ESTIMATING EXPOSURE TO PESTICIDES IN THE AGRICULTURAL HEALTH STUDY

    EPA Science Inventory

    We developed a quantitative method to estimate chemical-specific pesticide exposures in a large prospective cohort study of over 58,000 pesticide applicators in North Carolina and Iowa. An enrollment questionnaire was administered to applicators to collect basic time- and inten...

  8. Quantification of Microbial Phenotypes

    PubMed Central

    Martínez, Verónica S.; Krömer, Jens O.

    2016-01-01

    Metabolite profiling technologies have improved to generate close to quantitative metabolomics data, which can be employed to quantitatively describe the metabolic phenotype of an organism. Here, we review the current technologies available for quantitative metabolomics, present their advantages and drawbacks, and the current challenges to generate fully quantitative metabolomics data. Metabolomics data can be integrated into metabolic networks using thermodynamic principles to constrain the directionality of reactions. Here we explain how to estimate Gibbs energy under physiological conditions, including examples of the estimations, and the different methods for thermodynamics-based network analysis. The fundamentals of the methods and how to perform the analyses are described. Finally, an example applying quantitative metabolomics to a yeast model by 13C fluxomics and thermodynamics-based network analysis is presented. The example shows that (1) these two methods are complementary to each other; and (2) there is a need to take into account Gibbs energy errors. Better estimations of metabolic phenotypes will be obtained when further constraints are included in the analysis. PMID:27941694
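    The Gibbs-energy estimation step mentioned in the review can be sketched for a single reaction as dG = dG0' + RT ln Q, with physiological temperature taken as 310.15 K; the concentrations and standard energy below are illustrative (uses math.prod, Python 3.8+):

```python
import math

R_KJ = 8.314462618e-3   # gas constant, kJ/(mol*K)

def gibbs_energy(delta_g0, product_conc, substrate_conc, temp_k=310.15):
    """Reaction Gibbs energy dG = dG0' + R*T*ln(Q) for unit
    stoichiometry, with concentrations in mol/l (a single-reaction
    illustrative sketch)."""
    q = math.prod(product_conc) / math.prod(substrate_conc)
    return delta_g0 + R_KJ * temp_k * math.log(q)
```

    A negative dG under measured concentrations constrains the reaction's feasible direction, which is how metabolomics data bound reaction directionality in thermodynamics-based network analysis.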

  9. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology, etc. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging to improve the signal to noise ratio (SNR) and the contrast of the phase retardation (or birefringence) images introduces a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for a quantitative study. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved the image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on the SNR was not overcome. Hence the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for Jones matrix OCT (JM-OCT) [1]. Based on this, we developed a maximum a-posteriori (MAP) estimator, and quantitative birefringence imaging was demonstrated [2]. However, this first version of the estimator had a theoretical shortcoming: it did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator which takes into account the stochastic property of SNR. This estimator uses a probability distribution function (PDF) of the true local retardation, which is proportional to birefringence, under a specific set of measurements of the birefringence and SNR. The PDF was pre-computed by a Monte-Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented.
The comparisons are performed both by numerical simulation and in vivo measurements of anterior and posterior eye segment as well as in skin imaging. The new estimator shows superior performance and also shows clearer image contrast.

  10. Models of Quantitative Estimations: Rule-Based and Exemplar-Based Processes Compared

    ERIC Educational Resources Information Center

    von Helversen, Bettina; Rieskamp, Jorg

    2009-01-01

    The cognitive processes underlying quantitative estimations vary. Past research has identified task-contingent changes between rule-based and exemplar-based processes (P. Juslin, L. Karlsson, & H. Olsson, 2008). B. von Helversen and J. Rieskamp (2008), however, proposed a simple rule-based model--the mapping model--that outperformed the…

  11. Quantitative endoscopy: initial accuracy measurements.

    PubMed

    Truitt, T O; Adelman, R A; Kelly, D H; Willging, J P

    2000-02-01

    The geometric optics of an endoscope can be used to determine the absolute size of an object in an endoscopic field without knowing the actual distance from the object. This study explores the accuracy of a technique that estimates absolute object size from endoscopic images. Quantitative endoscopy involves calibrating a rigid endoscope to produce size estimates from 2 images taken with a known traveled distance between the images. The heights of 12 samples, ranging in size from 0.78 to 11.80 mm, were estimated with this calibrated endoscope. Backup distances of 5 mm and 10 mm were used for comparison. The mean percent error for all estimated measurements when compared with the actual object sizes was 1.12%. The mean errors for 5-mm and 10-mm backup distances were 0.76% and 1.65%, respectively. The mean errors for objects <2 mm and > or =2 mm were 0.94% and 1.18%, respectively. Quantitative endoscopy estimates endoscopic image size to within 5% of the actual object size. This method remains promising for quantitatively evaluating object size from endoscopic images. It does not require knowledge of the absolute distance of the endoscope from the object; rather, only the distance traveled by the endoscope between images.
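    A sketch of the underlying geometry: if image size scales as s = k·H/z for a pinhole-like scope, where k is a per-scope calibration constant (hypothetical value below), then two images a known backup distance apart determine the object height H without the absolute distance:

```python
def object_height(s1_px, s2_px, backup_mm, k):
    """Absolute object height from two endoscopic images taken
    backup_mm apart. Pinhole-like model: image size s = k*H/z, so
    z = k*H/s and backup = k*H*(1/s2 - 1/s1), giving
    H = backup / (k * (1/s2 - 1/s1))."""
    return backup_mm / (k * (1.0 / s2_px - 1.0 / s1_px))
```

    Self-consistency check under the model: with k = 200, an object of true height 5 mm imaged at 40 mm and 50 mm measures 25 px and 20 px, and the formula recovers 5 mm from the 10 mm backup alone.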

  12. Comparison of analytical methods of brain [18F]FDG-PET after severe traumatic brain injury.

    PubMed

    Madsen, Karine; Hesby, Sara; Poulsen, Ingrid; Fuglsang, Stefan; Graff, Jesper; Larsen, Karen B; Kammersgaard, Lars P; Law, Ian; Siebner, Hartwig R

    2017-11-01

    Loss of consciousness has been shown to reduce cerebral metabolic rates of glucose (CMRglc) measured by brain [18F]FDG-PET. Measurements of regional metabolic patterns by normalization to global cerebral metabolism or the cerebellum may underestimate widespread reductions. The aim of this study was to compare quantification methods of whole brain glucose metabolism, including whole brain [18F]FDG uptake normalized to uptake in the cerebellum, normalized to injected activity, normalized to plasma tracer concentration, and two methods for estimating CMRglc. Six patients suffering from severe traumatic brain injury (TBI) and ten healthy controls (HC) underwent a 10 min static [18F]FDG-PET scan and venous blood sampling. Except for normalization to the cerebellum, all quantification methods found a 25-33% lower level of whole brain glucose metabolism in TBI patients compared to HC. Accordingly, these measurements correlated with the level of consciousness. Our study demonstrates that the analysis method of the [18F]FDG-PET data has a substantial impact on the estimated whole brain cerebral glucose metabolism in patients with severe TBI. Importantly, the SUVR method, which is often used in a clinical setting, was not able to distinguish patients with severe TBI from HC at the whole-brain level. We recommend supplementing a static [18F]FDG scan with a single venous blood sample in future studies of patients with severe TBI or reduced level of consciousness. This can be used for simple semi-quantitative uptake values by normalizing brain activity uptake to plasma tracer concentration, or for quantitative estimates of CMRglc. Copyright © 2017 Elsevier B.V. All rights reserved.
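    The competing normalizations compared in the study can be sketched as follows; the formulas are the standard definitions and the numeric values are illustrative:

```python
def suv(tissue_kbq_per_ml, injected_mbq, weight_kg):
    """Standardized uptake value: tissue activity divided by injected
    activity per gram of body mass (1 ml of tissue ~ 1 g assumed)."""
    return tissue_kbq_per_ml / (injected_mbq * 1000.0 / (weight_kg * 1000.0))

def suvr(region_uptake, reference_uptake):
    """Uptake ratio to a reference region such as the cerebellum."""
    return region_uptake / reference_uptake

def plasma_normalized(brain_uptake, plasma_tracer_conc):
    """Brain uptake normalized to venous plasma tracer concentration,
    the simple semi-quantitative index the study recommends."""
    return brain_uptake / plasma_tracer_conc
```

    Note that a uniform global reduction cancels in the ratio (suvr(1.4, 1.4) equals suvr(2.0, 2.0)), consistent with the study's finding that reference-region normalization cannot detect whole-brain decline.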

  13. Quantitative Susceptibility Mapping and R2* Measured Changes during White Matter Lesion Development in Multiple Sclerosis: Myelin Breakdown, Myelin Debris Degradation and Removal, and Iron Accumulation.

    PubMed

    Zhang, Y; Gauthier, S A; Gupta, A; Chen, W; Comunale, J; Chiang, G C-Y; Zhou, D; Askin, G; Zhu, W; Pitt, D; Wang, Y

    2016-09-01

    Quantitative susceptibility mapping and R2* are sensitive to myelin and iron changes in multiple sclerosis lesions. This study was designed to characterize lesion changes on quantitative susceptibility mapping and R2* at various gadolinium-enhancement stages. This study included 64 patients with MS with different enhancing patterns in white matter lesions: nodular, shell-like, nonenhancing < 1 year old, and nonenhancing 1-3 years old. These represent acute, late acute, early chronic, and late chronic lesions, respectively. Susceptibility values measured on quantitative susceptibility mapping and R2* values were compared among the 4 lesion types. Their differences were assessed with a generalized estimating equation, controlling for Expanded Disability Status Scale score, age, and disease duration. We analyzed 203 lesions: 80 were nodular-enhancing, of which 77 (96.2%) were isointense on quantitative susceptibility mapping; 33 were shell-enhancing, of which 30 (90.9%) were hyperintense on quantitative susceptibility mapping; and 49 were nonenhancing lesions < 1 year old and 41 were nonenhancing lesions 1-3 years old, all of which were hyperintense on quantitative susceptibility mapping. Their relative susceptibility/R2* values were 0.5 ± 4.4 parts per billion/-5.6 ± 2.9 Hz, 10.2 ± 5.4 parts per billion/-8.0 ± 2.6 Hz, 20.2 ± 7.8 parts per billion/-3.1 ± 2.3 Hz, and 33.2 ± 8.2 parts per billion/-2.0 ± 2.6 Hz, respectively, and were significantly different (P < .005). Early active MS lesions with nodular enhancement show R2* decrease but no quantitative susceptibility mapping change, reflecting myelin breakdown; late active lesions with peripheral enhancement show R2* decrease and quantitative susceptibility mapping increase in the lesion center, reflecting further degradation and removal of myelin debris; and early or late chronic nonenhancing lesions show both quantitative susceptibility mapping and R2* increase, reflecting iron accumulation. 
© 2016 by American Journal of Neuroradiology.

  14. Quantitative Evaluation of Ion-implanted Arsenic in Silicon by Instrumental Neutron Activation Analysis

    NASA Astrophysics Data System (ADS)

    Takatsuka, Toshiko; Hirata, Kouichi; Kobayashi, Yoshinori; Kuroiwa, Takayoshi; Miura, Tsutomu; Matsue, Hideaki

    2008-11-01

    Certified reference materials (CRMs) of shallow arsenic implants in silicon are now under development at the National Metrology Institute of Japan (NMIJ). The amount of ion-implanted arsenic atoms is quantified by Instrumental Neutron Activation Analysis (INAA) using research reactor JRR-3 at the Japan Atomic Energy Agency (JAEA). It is found that this method can evaluate arsenic amounts of 10^15 atoms/cm^2 with small uncertainties, and is adaptable to shallower dopants. The estimated uncertainties can satisfy the industrial demands for reference materials to calibrate the implanted dose of arsenic at shallow junctions.

  15. Measurement of regional cerebral blood flow with copper-62-PTSM and a three-compartment model.

    PubMed

    Okazawa, H; Yonekura, Y; Fujibayashi, Y; Mukai, T; Nishizawa, S; Magata, Y; Ishizu, K; Tamaki, N; Konishi, J

    1996-07-01

    We quantitatively evaluated 62Cu-labeled pyruvaldehyde bis(N4-methylthiosemicarbazone) copper II (62Cu-PTSM) as a brain perfusion tracer for positron emission tomography (PET). For quantitative measurement, the octanol extraction method is needed to correct the arterial radioactivity in estimating the lipophilic input function, but the procedure is not practical for clinical studies. To measure regional cerebral blood flow (rCBF) by 62Cu-PTSM with simple arterial blood sampling, a standard curve of the octanol extraction ratio and a three-compartment model were applied. We performed both 15O-labeled water PET and 62Cu-PTSM PET with dynamic data acquisition and arterial sampling in six subjects. Data obtained in 10 subjects studied previously were used for the standard octanol extraction curve. Arterial activity was measured and corrected to obtain the true input function using the standard curve. Graphical analysis (Gjedde-Patlak plot), with the data for each subject fitted by a straight regression line, suggested that 62Cu-PTSM can be analyzed by the three-compartment model with negligible K4. Using this model, K1-K3 were estimated from curve fitting of the cerebral time-activity curve and the corrected input function. The fractional uptake of 62Cu-PTSM was corrected to rCBF with the individual extraction at steady state calculated from K1-K3. The influx rates (Ki) obtained from the three-compartment model and graphical analyses were compared for validation of the model. A comparison of rCBF values obtained from the 62Cu-PTSM and 15O-water studies demonstrated excellent correlation. The results suggest the potential feasibility of quantitation of cerebral perfusion with 62Cu-PTSM accompanied by dynamic PET and simple arterial sampling.
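    For the irreversible three-compartment model with negligible K4, the Patlak net influx rate and the flow correction can be sketched as follows; the rate constants are illustrative and the extraction-based conversion is simplified here:

```python
def net_influx_rate(k1, k2, k3):
    """Patlak net influx rate for an irreversible three-compartment
    model (k4 ~ 0): Ki = K1*k3/(k2 + k3)."""
    return k1 * k3 / (k2 + k3)

def rcbf_from_uptake(fractional_uptake, extraction):
    """Convert fractional tracer uptake to flow using a steady-state
    extraction fraction (simplified sketch of the paper's correction)."""
    return fractional_uptake / extraction
```

    Comparing Ki from this formula against the slope of the Patlak plot is the consistency check the paper uses to validate the compartment model.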

  16. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimated microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
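    The standard way to combine independent relative uncertainty components mathematically is root-sum-of-squares; the component values below are illustrative, not the paper's budget:

```python
import math

def combined_relative_uncertainty(components_percent):
    """Root-sum-of-squares combination of independent relative
    uncertainty components, each in percent."""
    return math.sqrt(sum(u * u for u in components_percent))

# illustrative budget: microorganism, product, reader, other (in %)
u_combined = combined_relative_uncertainty([20.0, 15.0, 10.0, 12.0])
```

    With these invented components the combined value stays under the 35% bound the abstract cites for plate-count methods.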

  17. Comparative Application of PLS and PCR Methods to Simultaneous Quantitative Estimation and Simultaneous Dissolution Test of Zidovudine - Lamivudine Tablets.

    PubMed

    Üstündağ, Özgür; Dinç, Erdal; Özdemir, Nurten; Tilkan, M Günseli

    2015-01-01

    In the development of new drug products and generic drug products, the simultaneous in-vitro dissolution behavior of oral dosage formulations is the most important indicator for the quantitative estimation of the efficiency and biopharmaceutical characteristics of drug substances. This compels scientists in the field to develop powerful analytical methods that yield more reliable, precise and accurate results in the quantitative analysis and dissolution testing of drug formulations. In this context, two chemometric tools, partial least squares (PLS) and principal component regression (PCR), were developed for the simultaneous quantitative estimation and dissolution testing of zidovudine (ZID) and lamivudine (LAM) in a tablet dosage form. The results obtained in this study strongly encourage their use for quality control, routine analysis and dissolution testing of marketed tablets containing the ZID and LAM drugs.

  18. Semi-quantitative estimation of cellular SiO2 nanoparticles using flow cytometry combined with X-ray fluorescence measurements.

    PubMed

    Choi, Seo Yeon; Yang, Nuri; Jeon, Soo Kyung; Yoon, Tae Hyun

    2014-09-01

In this study, we have demonstrated the feasibility of a semi-quantitative approach for the estimation of cellular SiO2 nanoparticles (NPs), based on flow cytometry measurements of their normalized side scattering intensity. In order to improve our understanding of the quantitative aspects of cell-nanoparticle interactions, flow cytometry, transmission electron microscopy, and X-ray fluorescence experiments were carefully performed on HeLa cells exposed to SiO2 NPs with different core diameters, hydrodynamic sizes, and surface charges. Based on the observed relationships among the experimental data, a semi-quantitative method for estimating cellular SiO2 NPs from their normalized side scattering intensity and core diameters was proposed, which can be applied for the determination of cellular SiO2 NPs within their size-dependent linear ranges. © 2014 International Society for Advancement of Cytometry.
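The calibration idea, normalized side scatter regressed against an absolute XRF measurement within the linear range, can be sketched as follows; all numbers are invented for illustration:

```python
import numpy as np

# Hypothetical calibration: mean side scatter of exposed cells divided by
# that of unexposed controls, regressed against cellular SiO2 mass from XRF.
norm_ssc = np.array([1.0, 1.4, 1.9, 2.5, 3.1])          # SSC(exposed)/SSC(control)
xrf_pg_per_cell = np.array([0.0, 2.1, 4.8, 7.9, 11.0])  # pg SiO2 per cell (invented)

slope, intercept = np.polyfit(norm_ssc, xrf_pg_per_cell, 1)

def estimate_cellular_sio2(nssc):
    """Semi-quantitative estimate; valid only inside the linear range."""
    return slope * nssc + intercept

print(round(estimate_cellular_sio2(2.0), 1))
```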

  19. CT-SPECT fusion plus conjugate views for determining dosimetry in iodine-131-monoclonal antibody therapy of lymphoma patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koral, K.F.; Zasadny, K.R.; Kessler, M.L.

A method of performing {sup 131}I quantitative SPECT imaging is described which uses the superimposition of markers placed on the skin to accomplish fusion of computed tomography (CT) and SPECT image sets. To calculate mean absorbed dose after administration of one of two {sup 131}I-labeled monoclonal antibodies (Mabs), the shape of the time-activity curve is measured by daily diagnostic conjugate views, the y-axis of that curve is normalized by a quantitative SPECT measurement (usually intra-therapy), and the tumor mass is deduced from a concurrent CT volume measurement. The method is applied to six B-cell non-Hodgkin's lymphoma patients. For four tumors in three patients treated with the MB1 Mab, a correlation appears to be present between resulting mean absorbed dose and disease response. Including all dosimetric estimates for both antibodies, the range for the specific absorbed dose is within that found by others in treating B-cell lymphoma patients. Excluding a retreated anti-B1 patient, the tumor-specific absorbed dose during anti-B1 therapy is from 1.4 to 1.7 mGy/MBq. For the one anti-B1 patient where quantitative SPECT and conjugate-view imaging were carried out back to back, the quantitative SPECT-measured activity was somewhat less for the spleen and much less for the tumor than that from conjugate views. The quantitative SPECT plus conjugate views method may be of general utility for macro-dosimetry of {sup 131}I therapies. 18 refs., 3 figs., 5 tabs.
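The normalization-and-integration pipeline the abstract describes (conjugate views give the curve shape, one SPECT point fixes the scale, CT gives the mass) can be sketched as follows; all numbers, and the final dose factor, are invented placeholders, not real {sup 131}I S-values:

```python
import numpy as np

# Hypothetical daily conjugate-view counts (relative time-activity shape)
# and a single quantitative SPECT point.
t_days = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
rel_counts = np.array([100.0, 80.0, 62.0, 49.0, 38.0])  # arbitrary units
spect_t, spect_mbq = 3.0, 45.0    # SPECT: 45 MBq in the tumor at day 3

# Rescale the curve so it passes through the SPECT measurement.
scale = spect_mbq / np.interp(spect_t, t_days, rel_counts)
activity_mbq = rel_counts * scale

# Cumulated activity: trapezoid over the data plus an exponential tail
# extrapolated from the terminal slope of the last two points.
lam = np.log(activity_mbq[-2] / activity_mbq[-1]) / (t_days[-1] - t_days[-2])
trap = float(np.sum((activity_mbq[1:] + activity_mbq[:-1]) / 2 * np.diff(t_days)))
cumulated = trap + activity_mbq[-1] / lam         # MBq*day

tumor_mass_g = 120.0   # from the concurrent CT volume (hypothetical)
dose_factor = 2.0      # toy conversion (MBq*day/g -> mGy), NOT a real S-value
mean_dose_mgy = dose_factor * cumulated / tumor_mass_g
print(round(mean_dose_mgy, 1))
```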

  20. Estimation of hydrolysis rate constants for carbamates ...

    EPA Pesticide Factsheets

Cheminformatics-based tools, such as the Chemical Transformation Simulator under development in EPA's Office of Research and Development, are increasingly used to evaluate chemicals for their potential to degrade in the environment or be transformed through metabolism. Hydrolysis represents a major environmental degradation pathway; unfortunately, hydrolysis rates for only a small fraction of the roughly 85,000 chemicals on the Toxic Substances Control Act (TSCA) inventory are in the public domain, making it critical to develop in silico approaches to estimate hydrolysis rate constants. In this presentation, we compare three complementary approaches to estimating hydrolysis rates for carbamates, an important chemical class widely used in agriculture as pesticides, herbicides and fungicides. Fragment-based quantitative structure-activity relationships (QSARs) using Hammett-Taft sigma constants are widely published and implemented for relatively simple functional groups such as carboxylic acid esters, phthalate esters, and organophosphate esters, and we extend these to carbamates. We also develop a pKa-based model and a quantitative structure-property relationship (QSPR) model, and evaluate them against measured rate constants using R-squared and root mean square (RMS) error. Our work shows that, for our relatively small sample of carbamates, the Hammett-Taft-based fragment model performs best, followed by the pKa and QSPR models.
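The Hammett-style fragment approach takes the general form log k = rho * sigma + log k0; a minimal sketch follows, in which rho, log k0 and the sigma values are hypothetical placeholders rather than fitted carbamate constants:

```python
import math

# Hammett-style fragment sketch: log k = rho * sigma + log k0.
# RHO, LOG_K0 and the sigma table are invented for illustration only.
RHO, LOG_K0 = 2.5, -7.0
SIGMA = {"H": 0.0, "4-NO2": 0.78, "4-Cl": 0.23, "4-OCH3": -0.27}

def log_k_hydrolysis(substituent):
    return RHO * SIGMA[substituent] + LOG_K0

def half_life_days(log_k):
    """Pseudo-first-order half-life from a rate constant in 1/s."""
    k = 10.0 ** log_k
    return math.log(2) / k / 86400.0

# Electron-withdrawing substituents raise log k and shorten the half-life.
for s in ("H", "4-NO2"):
    print(s, round(log_k_hydrolysis(s), 2), round(half_life_days(log_k_hydrolysis(s)), 1))
```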

  1. The Effect of Pickling on Blue Borscht Gelatin and Other Interesting Diffusive Phenomena.

    ERIC Educational Resources Information Center

    Davis, Lawrence C.; Chou, Nancy C.

    1998-01-01

    Presents some simple demonstrations that students can construct for themselves in class to learn the difference between diffusion and convection rates. Uses cabbage leaves and gelatin and focuses on diffusion in ungelified media, a quantitative diffusion estimate with hydroxyl ions, and a quantitative diffusion estimate with photons. (DDR)

  2. Software metrics: The quantitative impact of four factors on work rates experienced during software development. [reliability engineering

    NASA Technical Reports Server (NTRS)

    Gaffney, J. E., Jr.; Judge, R. W.

    1981-01-01

A model of a software development process is described. The software development process is seen as a sequence of activities, such as 'program design' and 'module development' (or coding). A manpower estimate is made by multiplying code size by the rates (man-months per thousand lines of code) for each of the activities relevant to the particular case of interest and summing the results. The effect of four objectively determinable factors (organization, software product type, computer type, and code type) on productivity values for each of nine principal software development activities was assessed. The four factors were found to account for 39% of the observed productivity variation.
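The estimation rule itself, size times per-activity rates, summed over the relevant activities, is simple enough to sketch directly; the activity names and rates below are hypothetical, not the paper's calibrated values:

```python
# Sketch of the manpower model: effort = size (KLOC) x sum of per-activity
# rates (man-months per KLOC). Rates are invented for illustration.
RATES_MM_PER_KLOC = {
    "requirements": 0.4,
    "program design": 0.8,
    "module development": 2.0,
    "integration test": 1.0,
}

def manpower_estimate(kloc, rates=RATES_MM_PER_KLOC, activities=None):
    """Man-months for the activities relevant to a given project."""
    keys = activities if activities is not None else rates
    return kloc * sum(rates[a] for a in keys)

print(manpower_estimate(10.0))  # all activities
print(manpower_estimate(10.0, activities=["program design", "module development"]))
```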

  3. Tracer kinetics of forearm endothelial function: comparison of an empirical method and a quantitative modeling technique.

    PubMed

    Zhao, Xueli; Arsenault, Andre; Lavoie, Kim L; Meloche, Bernard; Bacon, Simon L

    2007-01-01

Forearm endothelial function (FEF) is a marker that has been shown to discriminate patients with cardiovascular disease (CVD). FEF has been assessed using several parameters: the rate of uptake ratio (RUR), the elbow-to-wrist uptake ratio (EWUR) and the elbow-to-wrist relative uptake ratio (EWRUR). However, more robust models of FEF are required. The present study was designed to compare an empirical method with quantitative modeling techniques to better estimate the physiological parameters and understand the complex dynamic processes. The fitted time-activity curves of the forearms, estimating blood and muscle components, were assessed using both an empirical method and a two-compartment model. Although correlational analyses suggested a good correlation between the methods for RUR (r=.90) and EWUR (r=.79), but not EWRUR (r=.34), Bland-Altman plots showed poor agreement between the methods for all three parameters. These results indicate a large discrepancy between the empirical and compartmental methods for FEF. Further work is needed to establish the physiological and mathematical validity of the two modeling methods.
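The key methodological point, that high correlation does not imply agreement, is exactly what a Bland-Altman analysis exposes. A minimal sketch with invented paired estimates (perfectly correlated yet systematically offset):

```python
import numpy as np

def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement between two methods."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Invented paired RUR values from an "empirical" and a "compartmental"
# method: r is essentially 1.0, yet the bias is far from zero.
empirical     = np.array([1.10, 1.25, 1.40, 1.62, 1.80, 2.05])
compartmental = empirical * 1.4 + 0.1

r = np.corrcoef(empirical, compartmental)[0, 1]
bias, (lo, hi) = bland_altman(empirical, compartmental)
print(round(r, 3), round(bias, 2))
```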

  4. Quantifying heterogeneity in human tumours using MRI and PET.

    PubMed

    Asselin, Marie-Claude; O'Connor, James P B; Boellaard, Ronald; Thacker, Neil A; Jackson, Alan

    2012-03-01

    Most tumours, even those of the same histological type and grade, demonstrate considerable biological heterogeneity. Variations in genomic subtype, growth factor expression and local microenvironmental factors can result in regional variations within individual tumours. For example, localised variations in tumour cell proliferation, cell death, metabolic activity and vascular structure will be accompanied by variations in oxygenation status, pH and drug delivery that may directly affect therapeutic response. Documenting and quantifying regional heterogeneity within the tumour requires histological or imaging techniques. There is increasing evidence that quantitative imaging biomarkers can be used in vivo to provide important, reproducible and repeatable estimates of tumoural heterogeneity. In this article we review the imaging methods available to provide appropriate biomarkers of tumour structure and function. We also discuss the significant technical issues involved in the quantitative estimation of heterogeneity and the range of descriptive metrics that can be derived. Finally, we have reviewed the existing clinical evidence that heterogeneity metrics provide additional useful information in drug discovery and development and in clinical practice. Copyright © 2012 Elsevier Ltd. All rights reserved.
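Among the descriptive metrics such reviews cover, simple first-order descriptors are the easiest to illustrate; a sketch using the coefficient of variation and histogram entropy on synthetic voxel intensities (these generic metrics are examples, not necessarily the article's specific recommendations):

```python
import numpy as np

def heterogeneity_metrics(voxels, bins=16):
    """Two simple first-order heterogeneity descriptors for a tumour ROI:
    coefficient of variation and histogram (Shannon) entropy."""
    v = np.asarray(voxels, float)
    cov = v.std() / v.mean()
    counts, _ = np.histogram(v, bins=bins)
    p = counts[counts > 0] / counts.sum()
    entropy = float(-np.sum(p * np.log2(p)))
    return cov, entropy

rng = np.random.default_rng(1)
homogeneous = rng.normal(100, 5, 1000)              # narrow intensity spread
heterogeneous = np.concatenate([rng.normal(80, 15, 500),
                                rng.normal(140, 20, 500)])  # bimodal ROI
print([round(heterogeneity_metrics(x)[0], 3) for x in (homogeneous, heterogeneous)])
```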

  5. Changes in climate suitability for tourism at Adriatic coast since 1961

    NASA Astrophysics Data System (ADS)

    Zaninovic, Ksenija

    2017-04-01

The aim of this paper is to compare the suitability of climate conditions for tourism on the eastern Adriatic coast over the period 1961-2015. For quantitative estimation of the suitability of climate for different kinds of tourism, the climate index for tourism (CIT) is used. CIT integrates the thermal, aesthetic and physical facets of the atmospheric environment and is therefore suitable for estimating climate satisfaction, which ranges from very poor to very good. The thermal component is estimated using the physiologically equivalent temperature (PET). The index is applied to beach tourism, cycling, hiking, cultural tourism, golf, football, motor boating and sailing. Changes in the climate potential for tourism are estimated from differences in the distribution of CIT. For the warmest part of the day, the results indicate an extension of the summer tourist season for beach tourism at the end of the analyzed period. For the other tourist activities, the results indicate a more pronounced bimodal distribution of CIT during the year, resulting in a seasonal shift of ideal conditions for most activities to spring and autumn. In addition, the morning hours show an improvement in favourable climate conditions for all types of tourism at the end of the period.

  6. Bridging the Global Precipitation and Soil Moisture Active Passive Missions: Variability of Microwave Surface Emissivity from In situ and Remote Sensing Perspectives

    NASA Astrophysics Data System (ADS)

    Zheng, Y.; Kirstetter, P.; Hong, Y.; Turk, J.

    2016-12-01

The overland precipitation retrievals from satellite passive microwave (PMW) sensors such as the Global Precipitation Measurement (GPM) mission microwave imager (GMI) are impacted by the land surface emissivity. The estimation of PMW emissivity is challenging because it is highly variable under the influence of surface properties such as soil moisture, surface roughness and vegetation. This study proposes an improved quantitative understanding of the relationship between the emissivity and surface parameters. Surface parameter information is obtained through (i) in-situ measurements from the International Soil Moisture Network and (ii) satellite measurements from the Soil Moisture Active Passive (SMAP) mission, which provides global-scale soil moisture estimates. The variation of emissivity is quantified with soil moisture, surface temperature and vegetation at various frequencies/polarizations and over different types of land surfaces to shed light on the processes governing the emission of the land. This analysis is used to estimate the emissivity under rainy conditions. The framework built with in-situ measurements serves as a benchmark for satellite-based analyses, which paves the way toward global-scale emissivity estimates using SMAP.

  7. Application of counterpropagation artificial neural network for modelling properties of fish antibiotics.

    PubMed

    Maran, E; Novic, M; Barbieri, P; Zupan, J

    2004-01-01

The present study focuses on fish antibiotics, an important group of pharmaceuticals used in fish farming to treat infections; until recently, most of them have been released into the environment with very little attention. Information about the environmental behaviour and fate of medicinal substances is difficult or expensive to obtain. Experimental property information is reported when available; otherwise it is estimated by standard tools such as those provided by the United States Environmental Protection Agency EPISuite software and by custom quantitative structure-activity relationship (QSAR) applications. In this study, a QSAR screening of 15 fish antibiotics and 132 xenobiotic molecules was performed with two aims: (i) to develop a model for the estimation of the octanol-water partition coefficient (logP) and (ii) to estimate the relative binding affinity to the oestrogen receptor (log RBA) using a model constructed on the activities of the 132 xenobiotic compounds. The custom models are based on constitutional, topological, electrostatic and quantum-chemical descriptors computed by the CODESSA software. Kohonen neural networks (self-organising maps) were used to study similarity between the chemicals considered, while counter-propagation artificial neural networks were used to estimate the properties.

  8. Development of an agricultural job-exposure matrix for British Columbia, Canada.

    PubMed

    Wood, David; Astrakianakis, George; Lang, Barbara; Le, Nhu; Bert, Joel

    2002-09-01

Farmers in British Columbia (BC), Canada have been shown to have unexplained elevated proportional mortality rates for several cancers. Because agricultural exposures have never been documented systematically in BC, a quantitative agricultural job-exposure matrix (JEM) was developed containing exposure assessments from 1950 to 1998. This JEM was developed to document historical exposures and to facilitate future epidemiological studies. Available information regarding BC farming practices was compiled and checklists of potential exposures were produced for each crop. Exposures identified included chemical, biological, and physical agents. Interviews with farmers and agricultural experts were conducted using the checklists as a starting point. This allowed the creation of an initial or 'potential' JEM based on three axes: exposure agent, 'type of work' and time. The 'type of work' axis was determined by combining several variables: region, crop, job title and task. This allowed for a complete description of exposures. Exposure assessments were made quantitatively where data allowed, or by a dichotomous variable (exposed/unexposed). Quantitative calculations were divided into re-entry and application scenarios. 'Re-entry' exposures were quantified using a standard exposure model with some modification, while application exposure estimates were derived using data from the North American Pesticide Handlers Exposure Database (PHED). As expected, exposures differed between crops and job titles both quantitatively and qualitatively. Of the 290 agents included in the exposure axis, 180 were pesticides. Over 3000 exposure estimates were made; 50% of these were quantitative. Each quantitative estimate was at the level of daily absorbed dose. Exposure estimates were then rated as high, medium, or low by comparing them with their respective oral chemical reference dose (RfD) or acceptable daily intake (ADI). These data were mainly obtained from the US Environmental Protection Agency (EPA) Integrated Risk Information System database. Of the quantitative estimates, 74% were rated as low (<100%) and only 10% as high (>500%). The JEM resulting from this study fills a void concerning exposures of BC farmers and farm workers. While only limited validation of the assessments was possible, this JEM can serve as a benchmark for future studies. Preliminary analysis at the BC Cancer Agency (BCCA) using the JEM with prostate cancer records from a large cancer and occupation study has already shown promising results. Development of this JEM provides a useful model for developing historical quantitative exposure estimates where very little documented information is available.
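The rating step, binning each dose estimate as a percentage of its reference dose using the cut points the abstract reports (low < 100%, high > 500%), can be sketched as follows; the dose values are invented for illustration:

```python
# Sketch of the JEM rating step: compare a daily absorbed dose estimate with
# the agent's oral reference dose (RfD) and bin the ratio. Cut points follow
# the abstract (low < 100%, high > 500%); doses and RfDs below are invented.

def rate_exposure(daily_dose_mg_kg, rfd_mg_kg):
    pct = 100.0 * daily_dose_mg_kg / rfd_mg_kg
    if pct < 100.0:
        return "low"
    if pct > 500.0:
        return "high"
    return "medium"

estimates = {"agent A": (0.004, 0.13), "agent B": (0.02, 0.0045)}
for agent, (dose, rfd) in estimates.items():
    print(agent, rate_exposure(dose, rfd))
```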

  9. Quantitative genetic analysis of injury liability in infants and toddlers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

A threshold model of latent liability was applied to infant and toddler twin data on the total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low-order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher-risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  10. Contribution of Insula in Parkinson’s Disease: A Quantitative Meta-Analysis Study

    PubMed Central

    Criaud, Marion; Christopher, Leigh; Boulinguez, Philippe; Ballanger, Benedicte; Lang, Anthony E.; Cho, Sang S.; Houle, Sylvain; Strafella, Antonio P.

    2016-01-01

    The insula region is known to be an integrating hub interacting with multiple brain networks involved in cognitive, affective, sensory, and autonomic processes. There is growing evidence suggesting that this region may have an important role in Parkinson’s disease (PD). Thus, to investigate the functional organization of the insular cortex and its potential role in parkinsonian features, we used a coordinate-based quantitative meta-analysis approach, the activation likelihood estimation. A total of 132 insular foci were selected from 96 published experiments comprising the five functional categories: cognition, affective/behavioral symptoms, bodily awareness/autonomic function, sensorimotor function, and nonspecific resting functional changes associated with the disease. We found a significant convergence of activation maxima related to PD in different insular regions including anterior and posterior regions bilaterally. This study provides evidence of an important functional distribution of different domains within the insular cortex in PD, particularly in relation to nonmotor aspects, with an influence of medication effect. PMID:26800238

  11. Contribution of insula in Parkinson's disease: A quantitative meta-analysis study.

    PubMed

    Criaud, Marion; Christopher, Leigh; Boulinguez, Philippe; Ballanger, Benedicte; Lang, Anthony E; Cho, Sang S; Houle, Sylvain; Strafella, Antonio P

    2016-04-01

    The insula region is known to be an integrating hub interacting with multiple brain networks involved in cognitive, affective, sensory, and autonomic processes. There is growing evidence suggesting that this region may have an important role in Parkinson's disease (PD). Thus, to investigate the functional organization of the insular cortex and its potential role in parkinsonian features, we used a coordinate-based quantitative meta-analysis approach, the activation likelihood estimation. A total of 132 insular foci were selected from 96 published experiments comprising the five functional categories: cognition, affective/behavioral symptoms, bodily awareness/autonomic function, sensorimotor function, and nonspecific resting functional changes associated with the disease. We found a significant convergence of activation maxima related to PD in different insular regions including anterior and posterior regions bilaterally. This study provides evidence of an important functional distribution of different domains within the insular cortex in PD, particularly in relation to nonmotor aspects, with an influence of medication effect. © 2016 Wiley Periodicals, Inc.

  12. Structural parameterization and functional prediction of antigenic polypeptome sequences with biological activity through quantitative sequence-activity models (QSAM) by molecular electronegativity edge-distance vector (VMED).

    PubMed

    Li, ZhiLiang; Wu, ShiRong; Chen, ZeCong; Ye, Nancy; Yang, ShengXi; Liao, ChunYang; Zhang, MengJun; Yang, Li; Mei, Hu; Yang, Yan; Zhao, Na; Zhou, Yuan; Zhou, Ping; Xiong, Qing; Xu, Hong; Liu, ShuShen; Ling, ZiHua; Chen, Gang; Li, GenRong

    2007-10-01

Based only on the primary structures of peptides, a new set of descriptors called the molecular electronegativity edge-distance vector (VMED) was proposed and applied to describing and characterizing the molecular structures of oligopeptides and polypeptides, based on the electronegativity of each atom or the electronic charge index (ECI) of atomic clusters and the bonding distance between atom pairs. Here, the molecular structures of antigenic polypeptides were well expressed in order to propose an automated technique for the computerized identification of helper T lymphocyte (Th) epitopes. Furthermore, a modified MED vector was proposed from the primary structures of polypeptides, based on the ECI and the relative bonding distances of the fundamental skeleton groups; the side-chain of each amino acid was treated as a pseudo-atom. The developed VMED was easy to calculate and worked well. A quantitative model was established for 28 immunogenic or antigenic polypeptides (AGPP), with 14 A(d)-restricted sequences (1-14) and 14 sequences with other restrictions assigned as "1" (+) and "0" (-), respectively; the latter comprised 6 A(b) (15-20), 3 A(k) (21-23), 2 E(k) (24-26), and 2 H-2(k) (27 and 28) restricted sequences. Good results were obtained, with 90% correct classification (only 2 wrong among 20 training samples) and 100% correct prediction (none wrong among 8 testing samples); an alternative model gave 100% correct classification (none wrong for 20 training samples) and 88% correct prediction (1 wrong for 8 testing samples). Both stochastic sampling and cross-validation were performed and demonstrated good performance. The described method may also be suitable for the estimation and prediction of class I and class II major histocompatibility complex (MHC) epitopes in humans. It should be useful in the immune identification and recognition of proteins and genes and in the design and development of subunit vaccines.
Several quantitative structure-activity relationship (QSAR) models were developed for various oligopeptides and polypeptides, including 58 dipeptides and 31 pentapeptides with angiotensin-converting enzyme (ACE) inhibition, by the multiple linear regression (MLR) method. To demonstrate its ability to characterize the molecular structures of polypeptides, a molecular modeling investigation was performed for the functional prediction of polypeptide sequences with antigenic activity and heptapeptide sequences with tachykinin activity through quantitative sequence-activity models (QSAMs) based on VMED. The results showed that VMED exhibited both excellent structural selectivity and good activity prediction. Moreover, VMED behaved quite well for both QSAR and QSAM of poly- and oligopeptides, exhibiting estimation ability and prediction power equal to or better than those reported in previous references. Finally, a preliminary conclusion was drawn: both the classical and the modified MED vectors are very useful structural descriptors. Some suggestions were proposed for further studies on QSAR/QSAM of proteins in various fields.

  13. [Quantitative estimation of vegetation cover and management factor in USLE and RUSLE models by using remote sensing data: a review].

    PubMed

    Wu, Chang-Guang; Li, Sheng; Ren, Hua-Dong; Yao, Xiao-Hua; Huang, Zi-Jie

    2012-06-01

Soil loss prediction models such as the universal soil loss equation (USLE) and the revised universal soil loss equation (RUSLE) are useful tools for risk assessment of soil erosion and planning of soil conservation at the regional scale. A rational estimation of the vegetation cover and management factor, one of the most important parameters in USLE and RUSLE, is particularly important for accurate prediction of soil erosion. Traditional estimation based on field survey and measurement is time-consuming, laborious, and costly, and cannot rapidly extract the vegetation cover and management factor at the macro-scale. In recent years, the development of remote sensing technology has provided both data and methods for estimating the vegetation cover and management factor over broad geographic areas. This paper summarizes research findings on the quantitative estimation of the vegetation cover and management factor from remote sensing data and analyzes the advantages and disadvantages of the various methods, with the aim of providing a reference for further research and for quantitative estimation of the vegetation cover and management factor at large scales.
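One widely published remote-sensing route to the C factor maps NDVI through an exponential form, C = exp(-alpha * NDVI / (beta - NDVI)) (van der Knijff et al.), with alpha = 2 and beta = 1 as the commonly cited defaults; whether this specific formula is among those the review covers is an assumption here:

```python
import math

def c_factor_from_ndvi(ndvi, alpha=2.0, beta=1.0):
    """Cover-management (C) factor from NDVI via the exponential scaling
    C = exp(-alpha * NDVI / (beta - NDVI)); denser vegetation -> lower C."""
    return math.exp(-alpha * ndvi / (beta - ndvi))

for ndvi in (0.1, 0.4, 0.7):
    print(ndvi, round(c_factor_from_ndvi(ndvi), 3))
```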

  14. Parameter estimation and statistical analysis on frequency-dependent active control forces

    NASA Astrophysics Data System (ADS)

    Lim, Tau Meng; Cheng, Shanbao

    2007-07-01

The active control forces of an active magnetic bearing (AMB) system are known to be frequency dependent in nature. This is due to the frequency-dependent nature of the AMB system, i.e. time lags in sensors, digital signal processing, amplifiers, filters, and eddy current and hysteresis losses in the electromagnetic coils. The stiffness and damping coefficients of these control forces can be assumed to be linear for small perturbations within the air gap. Numerous studies have attempted to estimate these coefficients directly or indirectly without validating the model or verifying the results. This paper addresses these issues by proposing a one-axis electromagnetic suspension system to simplify the measurement requirements and eliminate the possibility of cross-coupling of the control forces. It also proposes an on-line frequency-domain parameter estimation procedure with statistical information to provide a quantitative measure for model validation and verification of results. This leads to a better understanding of, and a design platform for, optimal vibration control schemes for the suspended system. This is achieved by injecting Schroeder phased harmonic sequences (SPHS), a multi-frequency test signal, to persistently excite all possible modes of the suspended system. By treating the system as a black box, parameter estimation of the "actual" stiffness and damping coefficients in the frequency domain is realised experimentally. The digitally implemented PID controller also facilitated changes to the feedback gains, allowing numerous system response measurements with their corresponding estimated stiffness and damping coefficients.
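A generic version of the frequency-domain estimation step models the control force as F(w) = (k + j*w*c) * X(w) and recovers k and c from the complex ratio F/X at each excitation frequency; the true values and noise level below are invented, and this is a stand-in for, not a reproduction of, the paper's procedure:

```python
import numpy as np

# Hypothetical "measured" displacement X and force F at a few excitation
# frequencies, generated from known k, c with 1% multiplicative noise.
k_true, c_true = 2.0e5, 150.0                           # N/m, N*s/m
w = 2 * np.pi * np.array([20.0, 50.0, 100.0, 200.0])    # rad/s
rng = np.random.default_rng(2)
X = rng.normal(1e-5, 1e-6, w.size) * np.exp(1j * rng.uniform(0, 2 * np.pi, w.size))
F = (k_true + 1j * w * c_true) * X * (1 + rng.normal(0, 0.01, w.size))

H = F / X                        # complex dynamic stiffness at each frequency
k_est = H.real.mean()            # stiffness from the real part
c_est = (H.imag / w).mean()      # damping from the imaginary part / frequency
print(round(k_est / 1e5, 2), round(c_est, 1))
```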

  15. Derivation and validation of a multivariate model to predict mortality from pulmonary embolism with cancer: the POMPE-C tool

    PubMed Central

    Roy, Pierre-Marie; Than, Martin P.; Hernandez, Jackeline; Courtney, D. Mark; Jones, Alan E.; Penazola, Andrea; Pollack, Charles V.

    2012-01-01

Background: Clinical guidelines recommend risk stratification of patients with acute pulmonary embolism (PE). Active cancer increases the risk of PE and worsens prognosis, but also causes incidental PE that may be discovered during cancer staging. No quantitative decision instrument has been derived specifically for patients with active cancer and PE. Methods: A classification and regression tree technique was used to reduce 25 variables prospectively collected from 408 patients with active cancer and PE. Selected variables were transformed into a logistic regression model, termed POMPE-C, and compared with the pulmonary embolism severity index (PESI) score to predict the outcome of death within 30 days. Validation was performed in an independent sample of 182 patients with active cancer and PE. Results: POMPE-C included eight predictors: body mass, heart rate >100, respiratory rate, SaO2%, respiratory distress, altered mental status, do-not-resuscitate status, and unilateral limb swelling. In the derivation set, the area under the ROC curve for POMPE-C was 0.84 (95% CI: 0.82-0.87), significantly greater than for PESI (0.68, 0.60-0.76). In the validation sample, POMPE-C had an AUC of 0.86 (0.78-0.93). No patient with a POMPE-C estimate ≤5% died within 30 days (0/50, 0-7%), whereas 10/13 (77%, 46-95%) with a POMPE-C estimate >50% died within 30 days. Conclusion: In patients with active cancer and PE, POMPE-C demonstrated good prognostic accuracy for 30-day mortality and better performance than PESI. If validated in a large sample, POMPE-C may provide a quantitative basis for deciding treatment options for PE discovered during cancer staging and in advanced cancer. PMID:22475313

  16. Identification of the subthalamic nucleus in deep brain stimulation surgery with a novel wavelet-derived measure of neural background activity.

    PubMed

    Snellings, André; Sagher, Oren; Anderson, David J; Aldridge, J Wayne

    2009-10-01

    The authors developed a wavelet-based measure for quantitative assessment of neural background activity during intraoperative neurophysiological recordings so that the boundaries of the subthalamic nucleus (STN) can be more easily localized for electrode implantation. Neural electrophysiological data were recorded in 14 patients (20 tracks and 275 individual recording sites) with dopamine-sensitive idiopathic Parkinson disease during the target localization portion of deep brain stimulator implantation surgery. During intraoperative recording, the STN was identified based on audio and visual monitoring of neural firing patterns, kinesthetic tests, and comparisons between neural behavior and the known characteristics of the target nucleus. The quantitative wavelet-based measure was applied offline using commercially available software to measure the magnitude of the neural background activity, and the results of this analysis were compared with the intraoperative conclusions. Wavelet-derived estimates were also compared with power spectral density measurements. The wavelet-derived background levels were significantly higher in regions encompassed by the clinically estimated boundaries of the STN than in the surrounding regions (STN, 225 +/- 61 microV; ventral to the STN, 112 +/- 32 microV; and dorsal to the STN, 136 +/- 66 microV). In every track, the absolute maximum magnitude was found within the clinically identified STN. The wavelet-derived background levels provided a more consistent index with less variability than measurements with power spectral density. Wavelet-derived background activity can be calculated quickly, does not require spike sorting, and can be used to identify the STN reliably with very little subjective interpretation required. This method may facilitate the rapid intraoperative identification of STN borders.
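The abstract does not give the authors' exact wavelet measure; as a generic stand-in, background magnitude can be estimated from fine-scale Haar detail coefficients via the median-absolute-deviation rule, shown here on synthetic recordings with the inside/outside-STN levels the abstract reports:

```python
import numpy as np

def haar_detail(x):
    """One level of the Haar wavelet transform: detail coefficients only."""
    x = np.asarray(x, float)
    n = len(x) // 2 * 2
    return (x[0:n:2] - x[1:n:2]) / np.sqrt(2)

def background_level(signal):
    """Robust magnitude of fine-scale activity: median absolute Haar detail
    scaled to a sigma estimate (the MAD rule). A generic stand-in, not the
    authors' exact measure."""
    d = haar_detail(signal)
    return float(np.median(np.abs(d)) / 0.6745)

rng = np.random.default_rng(3)
inside_stn = rng.normal(0, 225e-6, 4000)    # ~225 uV background (synthetic)
outside_stn = rng.normal(0, 112e-6, 4000)   # ~112 uV background (synthetic)
print([round(background_level(s) * 1e6) for s in (inside_stn, outside_stn)])
```

No spike sorting is involved: the detail-coefficient median is insensitive to sparse large spikes, which is one reason wavelet-style background measures are attractive intraoperatively.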

  17. Identification of the subthalamic nucleus in deep brain stimulation surgery with a novel wavelet-derived measure of neural background activity

    PubMed Central

    Snellings, André; Sagher, Oren; Anderson, David J.; Aldridge, J. Wayne

    2016-01-01

Object: A wavelet-based measure was developed to quantitatively assess neural background activity in surgical neurophysiological recordings and localize the boundaries of the subthalamic nucleus (STN) during target localization for deep brain stimulator implantation surgery. Methods: Neural electrophysiological data were recorded from 14 patients (20 tracks, n = 275 individual recording sites) with dopamine-sensitive idiopathic Parkinson's disease during the target localization portion of deep brain stimulator implantation surgery. During intraoperative recording, the STN was identified based upon audio and visual monitoring of neural firing patterns, kinesthetic tests, and comparisons between neural behavior and known characteristics of the target nucleus. The quantitative wavelet-based measure was applied off-line using MATLAB software to measure the magnitude of the neural background activity, and the results of this analysis were compared with the intraoperative conclusions. Wavelet-derived estimates were compared with power spectral density measures. Results: The wavelet-derived background levels were significantly higher in regions encompassed by the clinically estimated boundaries of the STN than in surrounding regions (STN: 225 ± 61 μV vs. ventral to STN: 112 ± 32 μV, and dorsal to STN: 136 ± 66 μV). In every track, the absolute maximum magnitude was found within the clinically identified STN. The wavelet-derived background levels provided a more consistent index with less variability than power spectral density. Conclusions: The wavelet-derived background activity measure can be calculated quickly, requires no spike sorting, and can be reliably used to identify the STN with very little subjective interpretation required. This method may facilitate rapid intraoperative identification of subthalamic nucleus borders. PMID:19344225

  18. Quantitative structure-cytotoxicity relationship of piperic acid amides.

    PubMed

    Shimada, Chiyako; Uesawa, Yoshihiro; Ishihara, Mariko; Kagaya, Hajime; Kanamoto, Taisei; Terakubo, Shigemi; Nakashima, Hideki; Takao, Koichi; Miyashiro, Takaki; Sugita, Yoshiaki; Sakagami, Hiroshi

    2014-09-01

    A total of 12 piperic acid amides, including piperine, were subjected to quantitative structure-activity relationship (QSAR) analysis, based on their cytotoxicity, tumor selectivity and anti-HIV activity, in order to find new biological activities. Cytotoxicity against four human oral squamous cell carcinoma (OSCC) cell lines and three human oral normal cells was determined by the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) method. Tumor selectivity was evaluated by the ratio of the mean 50% cytotoxic concentration (CC50) against normal oral cells to that against OSCC cell lines. Anti-HIV activity was evaluated by the ratio of the CC50 to 50% HIV infection-cytoprotective concentration (EC50). Physicochemical, structural, and quantum-chemical parameters were calculated based on the conformations optimized by LowModeMD method followed by density functional theory method. All compounds showed low-to-moderate tumor selectivity, but no anti-HIV activity. N-Piperoyldopamine (compound 8), which has a catechol moiety, showed the highest tumor selectivity, possibly due to its unique molecular shape and electrostatic interaction, especially its largest partial equalization of orbital electronegativities and vsurf descriptors. The present study suggests that molecular shape and ability for electrostatic interaction are useful parameters for estimating the tumor selectivity of piperic acid amides. Copyright © 2014 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.
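
    The tumor-selectivity index defined in the abstract (mean CC50 against normal cells divided by mean CC50 against tumor cells) is a one-line computation; the CC50 values below are hypothetical placeholders, not the paper's data:

```python
import numpy as np

# Hypothetical CC50 values (µM) for illustration only.
cc50_normal = np.array([400.0, 380.0, 250.0])       # three normal oral cell lines
cc50_tumor = np.array([90.0, 120.0, 60.0, 110.0])   # four OSCC cell lines

# Tumor-selectivity index as defined in the abstract:
# mean CC50 against normal cells / mean CC50 against tumor cells.
si = cc50_normal.mean() / cc50_tumor.mean()
print(round(si, 2))  # 3.61
```

    A larger index means a compound kills tumor cells at concentrations well below those toxic to normal cells.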

  19. Quantitative estimation of gymnemagenin in Gymnema sylvestre extract and its marketed formulations using the HPLC-ESI-MS/MS method.

    PubMed

    Kamble, Bhagyashree; Gupta, Ankur; Patil, Dada; Janrao, Shirish; Khatal, Laxman; Duraiswamy, B

    2013-02-01

    Gymnema sylvestre, with gymnemic acids as active pharmacological constituents, is a popular ayurvedic herb and has been used to treat diabetes, as a remedy for cough and as a diuretic. However, very few analytical methods are available for quality control of this herb and its marketed formulations. The objective was to develop and validate a new, rapid, sensitive and selective HPLC-ESI (electrospray ionisation)-MS/MS method for quantitative estimation of gymnemagenin in G. sylvestre and its marketed formulations. An HPLC-ESI-MS/MS method using a multiple reaction monitoring mode was used for quantitation of gymnemagenin. Separation was carried out on a Luna C-18 column using gradient elution of water and methanol (with 0.1% formic acid and 0.3% ammonia). The developed method was validated as per International Conference on Harmonisation Guideline ICH-Q2B and found to be accurate, precise and linear over a relatively wide range of concentrations (5.280-305.920 ng/mL). Gymnemagenin contents were found to range from 0.056 ± 0.002% to 4.77 ± 0.59% w/w in G. sylvestre and its marketed formulations. The method established is simple and rapid, with high sample throughput, and can be used as a tool for quality control of G. sylvestre and its formulations. Copyright © 2012 John Wiley & Sons, Ltd.
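
    A hedged sketch of the calibration step behind such a quantitation: fit a linear calibration over the validated range and back-calculate an unknown sample. The peak areas below are invented for illustration; real MRM responses, replicates and weighting schemes will differ:

```python
import numpy as np

# Hypothetical calibration points (ng/mL vs. MRM peak area); the validated
# range reported in the abstract is 5.280-305.920 ng/mL.
conc = np.array([5.28, 19.12, 76.48, 152.96, 305.92])
area = np.array([1.1e3, 4.0e3, 15.9e3, 31.7e3, 63.5e3])

slope, intercept = np.polyfit(conc, area, 1)     # linear calibration
r2 = np.corrcoef(conc, area)[0, 1] ** 2          # linearity check
unknown = (42.0e3 - intercept) / slope           # back-calculate a sample
print(r2 > 0.999, 5.28 <= unknown <= 305.92)     # True True
```

    Samples back-calculating outside the validated range would be diluted and re-injected rather than extrapolated.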

  20. Quantitative structure-activity relationship study of P2X7 receptor inhibitors using combination of principal component analysis and artificial intelligence methods.

    PubMed

    Ahmadi, Mehdi; Shahlaei, Mohsen

    2015-01-01

    P2X7 antagonist activity for a set of 49 molecules of P2X7 receptor antagonists, derivatives of purine, was modeled with the aid of chemometric and artificial intelligence techniques. The activity of these compounds was estimated by means of a combination of principal component analysis (PCA), as a well-known data reduction method, a genetic algorithm (GA), as a variable selection technique, and an artificial neural network (ANN), as a non-linear modeling method. First, a linear regression combined with PCA (principal component regression) was applied to model the structure-activity relationships, and afterwards a combination of PCA and an ANN algorithm was employed to accurately predict the biological activity of the P2X7 antagonists. PCA preserves as much as possible of the information contained in the original data set. The seven PCs most important to the studied activity were selected as inputs to the ANN by an efficient variable selection method, the GA. The best computational neural network model was a fully connected, feed-forward model with 7-7-1 architecture. The developed ANN model was fully evaluated by different validation techniques, including internal and external validation, and chemical applicability domain. All validations showed that the constructed quantitative structure-activity relationship model is robust and satisfactory.
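
    The pipeline described (PCA for reduction, GA to select seven PCs, an ANN for the non-linear fit) can be sketched in miniature. The sketch below keeps the PCA step but substitutes a plain least-squares fit on the scores (principal component regression) for the GA+ANN stage, on synthetic data with 49 "molecules"; it is not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(42)
T = rng.standard_normal((49, 3))                  # 3 latent structural factors
P = rng.standard_normal((3, 20))                  # loadings onto 20 descriptors
X = T @ P + 0.1 * rng.standard_normal((49, 20))   # 49 molecules x 20 descriptors
y = T @ np.array([1.5, -2.0, 0.8]) + 0.1 * rng.standard_normal(49)  # "activity"

# PCA via SVD of the centred descriptor matrix; keep 7 PCs as in the paper
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:7].T

# Principal component regression in place of the paper's GA-selected 7-7-1 ANN
coef, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
pred = scores @ coef + y.mean()
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(r2 > 0.8)  # True: the retained PCs capture the latent factors
```

    Working on a handful of scores instead of all descriptors is what keeps the downstream model small (here 7 inputs, matching the paper's 7-7-1 network).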

  2. Estimations of BCR-ABL/ABL transcripts by quantitative PCR in chronic myeloid leukaemia after allogeneic bone marrow transplantation and donor lymphocyte infusion.

    PubMed

    Otazú, Ivone B; Tavares, Rita de Cassia B; Hassan, Rocío; Zalcberg, Ilana; Tabak, Daniel G; Seuánez, Héctor N

    2002-02-01

    Serial assays of qualitative (multiplex and nested) and quantitative PCR were carried out for detecting and estimating the level of BCR-ABL transcripts in 39 CML patients following bone marrow transplantation (BMT). Seven of these patients, who received donor lymphocyte infusions (DLIs) following relapse, were also monitored. Quantitative estimates of BCR-ABL transcripts were obtained by co-amplification with a competitor sequence. Estimates of ABL transcripts were used as an internal control, and the BCR-ABL/ABL ratio was thus estimated for evaluating the kinetics of residual clones. Twenty-four patients were followed shortly after BMT; two of these patients were in cytogenetic relapse coexisting with very high BCR-ABL levels, while the other 22 were in clinical, haematologic and cytogenetic remission 2-42 months after BMT. In this latter group, seven patients showed a favourable clinical-haematological progression in association with molecular remission, while in 14 patients quantitative PCR assays indicated molecular relapse that was not associated with an early cytogenetic-haematologic relapse. BCR-ABL/ABL levels could not be correlated with the presence of GVHD in 24 patients after BMT. In all seven patients treated with DLI, high levels of transcripts were detected at least 4 months before the appearance of clinical haematological relapse. Following DLI, five of these patients showed transcript levels decreasing by 2 to 5 logs between 4 and 12 months. Of eight other patients studied long after BMT, five showed molecular relapse up to 117 months post-BMT and only one showed cytogenetic relapse. Our findings indicated that quantitative estimates of BCR-ABL transcripts were valuable for monitoring minimal residual disease in each patient.

  3. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
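
    The bootstrap construction described above can be sketched as follows, with two hypothetical prey types whose fatty-acid signatures are Dirichlet-distributed; the real QFASA workflow also applies calibration coefficients and fits diet proportions, both omitted here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical prey signature libraries: rows are individual prey animals,
# columns are fatty-acid proportions (each row sums to 1).
seal = rng.dirichlet([8, 4, 2, 1, 1], size=200)
walrus = rng.dirichlet([2, 6, 5, 2, 1], size=200)
diet = np.array([0.7, 0.3])                 # known "true" diet proportions

def pseudo_predator(boot_n):
    """Bootstrap prey signatures, average within species, mix by the diet."""
    s1 = seal[rng.integers(0, len(seal), boot_n)].mean(axis=0)
    s2 = walrus[rng.integers(0, len(walrus), boot_n)].mean(axis=0)
    return diet[0] * s1 + diet[1] * s2

sig = pseudo_predator(boot_n=30)
print(np.isclose(sig.sum(), 1.0))           # True: a valid signature sums to one
```

    The bootstrap sample size (`boot_n`) is exactly the quantity the paper's algorithm chooses objectively, so that pseudo-predator signatures have realistic between-individual variability.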

  4. Audiovisual quality estimation of mobile phone video cameras with interpretation-based quality approach

    NASA Astrophysics Data System (ADS)

    Radun, Jenni E.; Virtanen, Toni; Olives, Jean-Luc; Vaahteranoksa, Mikko; Vuori, Tero; Nyman, Göte

    2007-01-01

    We present an effective method for comparing the subjective audiovisual quality of different video cameras and the features related to quality changes. Both quantitative estimation of overall quality and qualitative description of critical quality features are achieved by the method. The aim was to combine two image quality evaluation methods, the quantitative Absolute Category Rating (ACR) method with hidden reference removal and the qualitative Interpretation-Based Quality (IBQ) method, in order to see how they complement each other in audiovisual quality estimation tasks. Twenty-six observers estimated the audiovisual quality of six different cameras, mainly mobile phone video cameras. In order to achieve an efficient subjective estimation of audiovisual quality, only two contents with different quality requirements were recorded with each camera. The results show that the subjectively important quality features were more related to the overall estimations of the cameras' visual video quality than to the features related to sound. The data demonstrated two significant quality dimensions related to visual quality: darkness and sharpness. We conclude that the qualitative methodology can complement quantitative quality estimations with audiovisual material as well. The IBQ approach is especially valuable when the induced quality changes are multidimensional.

  5. Quantitative Estimate of the Relation Between Rolling Resistance on Fuel Consumption of Class 8 Tractor Trailers Using Both New and Retreaded Tires (SAE Paper 2014-01-2425)

    EPA Science Inventory

    Road tests of class 8 tractor trailers were conducted by the US Environmental Protection Agency on new and retreaded tires of varying rolling resistance in order to provide estimates of the quantitative relationship between rolling resistance and fuel consumption.

  6. 77 FR 70727 - Endangered and Threatened Wildlife and Plants; 90-Day Finding on a Petition to List the African...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-27

    ... indirectly through changes in regional climate; and (b) Quantitative research on the relationship of food...). Population Estimates The most quantitative estimate of the historic size of the African lion population... research conducted by Chardonnet et al., three subpopulations were described as consisting of 18 groups...

  7. Approaches to acceptable risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whipple, C

    Several alternative approaches to address the question “How safe is safe enough?” are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  8. Accounting for Limited Detection Efficiency and Localization Precision in Cluster Analysis in Single Molecule Localization Microscopy

    PubMed Central

    Shivanandan, Arun; Unnikrishnan, Jayakrishnan; Radenovic, Aleksandra

    2015-01-01

    Single Molecule Localization Microscopy techniques like PhotoActivated Localization Microscopy, with their sub-diffraction-limit spatial resolution, have been popularly used to characterize the spatial organization of membrane proteins by means of quantitative cluster analysis. However, such quantitative studies remain challenged by the techniques’ inherent sources of error, such as a limited detection efficiency of less than 60% due to incomplete photo-conversion, and a limited localization precision in the range of 10–30 nm, varying across the detected molecules, depending mainly on the number of photons collected from each. We provide analytical methods to estimate the effect of these errors in cluster analysis and to correct for them. These methods, based on the Ripley’s L(r) − r or Pair Correlation Function popularly used by the community, can facilitate potentially breakthrough results in quantitative biology by providing a more accurate and precise quantification of protein spatial organization. PMID:25794150
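
    A useful fact behind such corrections is that Ripley's K is invariant in expectation under independent thinning, which is precisely how a limited detection efficiency enters the analysis: it changes the density but not the K-based clustering summary. A minimal estimator (no edge correction, not the paper's corrected estimators) on simulated complete spatial randomness:

```python
import numpy as np

rng = np.random.default_rng(3)

def ripley_k(points, r, area):
    """Naive Ripley's K estimator (no edge correction)."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return area * np.sum(d < r) / (n * (n - 1))

pts = rng.random((1500, 2))                 # CSR on the unit square
p = 0.5                                     # limited detection efficiency
kept = pts[rng.random(1500) < p]            # independent thinning

r = 0.05
k_full = ripley_k(pts, r, 1.0)
k_thin = ripley_k(kept, r, 1.0)
# For CSR, K(r) = pi * r^2; thinning leaves K unchanged in expectation.
print(abs(k_full - np.pi * r**2) / (np.pi * r**2) < 0.2,
      abs(k_thin - k_full) / k_full < 0.3)
```

    Density-based summaries, by contrast, scale directly with the detection efficiency, which is why the choice of summary statistic matters for these data.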

  9. Estimating the fates of organic contaminants in an aquifer using QSAR.

    PubMed

    Lim, Seung Joo; Fox, Peter

    2013-01-01

    The quantitative structure activity relationship (QSAR) model, BIOWIN, was modified to more accurately estimate the fates of organic contaminants in an aquifer. The predictions from BIOWIN were modified to include oxidation and sorption effects. The predictive model therefore included the effects of sorption, biodegradation, and oxidation. A total of 35 organic compounds were used to validate the predictive model. The majority of the ratios of predicted half-life to measured half-life were within a factor of 2 and no ratio values were greater than a factor of 5. In addition, the accuracy of estimating the persistence of organic compounds in the sub-surface was superior when modified by the relative fraction adsorbed to the solid phase, 1/Rf, to that when modified by the remaining fraction of a given compound adsorbed to a solid, 1 - fs.
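
    Reading 1/Rf as the dissolved (mobile) fraction given by the classical retardation factor, the half-life adjustment can be sketched like this; the bulk density, porosity and Kd values are illustrative assumptions, not the paper's parameters:

```python
def retardation(kd, bulk_density=1.6, porosity=0.35):
    """Classical retardation factor: Rf = 1 + (rho_b / n) * Kd."""
    return 1.0 + (bulk_density / porosity) * kd

def adjusted_half_life(biowin_half_life_days, kd):
    """Scale a BIOWIN-derived half-life by sorption: only the dissolved
    fraction (1/Rf) is assumed bioavailable, so the effective half-life
    grows for strongly sorbing compounds."""
    return biowin_half_life_days * retardation(kd)

t = adjusted_half_life(10.0, kd=0.5)   # moderately sorbing compound
print(round(t, 1))  # 32.9
```

    The paper's factor-of-2 validation criterion would then compare such adjusted half-lives against measured aquifer half-lives.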

  10. Estimation of sample size and testing power (part 5).

    PubMed

    Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo

    2012-02-01

    Estimation of sample size and testing power is an important component of research design. This article introduced methods for sample size and testing power estimation of difference tests for quantitative and qualitative data with the single-group design, the paired design or the crossover design. To be specific, this article introduced formulas for sample size and testing power estimation of difference tests for quantitative and qualitative data with the above three designs, their realization based on the formulas and on the POWER procedure of SAS software, and elaborated them with examples, which will benefit researchers in implementing the repetition principle.
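
    For the paired design mentioned here, the textbook normal-approximation formula n = ((z_{1-α/2} + z_{power}) · σ_d / δ)² can be computed with the standard library alone; this is a generic approximation for illustration, not necessarily the exact formulas or SAS POWER setup the article uses:

```python
import math
from statistics import NormalDist

def n_paired(delta, sigma_d, alpha=0.05, power=0.80):
    """Number of pairs for a two-sided paired comparison of means
    (normal approximation): n = ((z_{1-a/2} + z_{power}) * sigma_d / delta)^2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(((z_a + z_b) * sigma_d / delta) ** 2)

# Detect a mean within-pair difference of 5 when differences have SD 10
print(n_paired(delta=5.0, sigma_d=10.0))  # 32
```

    Halving the detectable difference quadruples the required number of pairs, which is the practical force of the repetition principle.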

  11. A new mean estimator using auxiliary variables for randomized response models

    NASA Astrophysics Data System (ADS)

    Ozgul, Nilgun; Cingi, Hulya

    2013-10-01

    Randomized response models (RRMs) are commonly used in surveys dealing with sensitive questions such as abortion, alcoholism, sexual orientation, drug taking, annual income and tax evasion, to ensure interviewee anonymity and reduce nonresponse rates and biased responses. Starting from the pioneering work of Warner [7], many versions of RRMs have been developed that can deal with quantitative responses. In this study, a new mean estimator is suggested for RRMs involving quantitative responses. The mean square error is derived, and a simulation study is performed to show the efficiency of the proposed estimator relative to other existing estimators in RRMs.
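
    One simple member of this family, additive scrambling, illustrates why a population mean remains estimable even though no individual response is revealed. This is a generic sketch, not the specific auxiliary-variable estimator proposed in the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000
true_income = rng.lognormal(mean=10.0, sigma=0.5, size=n)   # sensitive variable

# Each respondent adds a draw from a known scrambling distribution, masking
# individual answers while keeping the population mean recoverable.
scramble = rng.normal(loc=5000.0, scale=2000.0, size=n)
reported = true_income + scramble

estimate = reported.mean() - 5000.0        # subtract the known scramble mean
rel_err = abs(estimate - true_income.mean()) / true_income.mean()
print(rel_err < 0.05)  # True
```

    The price of anonymity is extra variance from the scrambling noise; estimators like the paper's reduce that variance by exploiting auxiliary variables correlated with the sensitive one.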

  12. Investigation of practical initial attenuation image estimates in TOF-MLAA reconstruction for PET/MR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Ju-Chieh, E-mail: chengjuchieh@gmail.com; Y

    Purpose: Time-of-flight joint attenuation and activity positron emission tomography reconstruction requires additional calibration (scale factors) or constraints during or post-reconstruction to produce a quantitative μ-map. In this work, the impact of various initializations of the joint reconstruction was investigated, and the initial average μ-value (IAM) method was introduced such that the forward-projection of the initial μ-map is already very close to that of the reference μ-map, thus reducing/minimizing the offset (scale factor) during the early iterations of the joint reconstruction. Consequently, the accuracy and efficiency of unconstrained joint reconstruction such as time-of-flight maximum likelihood estimation of attenuation and activity (TOF-MLAA) can be improved by the proposed IAM method. Methods: 2D simulations of brain and chest were used to evaluate TOF-MLAA with various initial estimates, which include the object filled with water uniformly (conventional initial estimate), bone uniformly, the average μ-value uniformly (IAM magnitude initialization method), and the perfect spatial μ-distribution but with a wrong magnitude (initialization in terms of distribution). A 3D GATE simulation was also performed for the chest phantom under a typical clinical scanning condition, and the simulated data were reconstructed with a fully corrected list-mode TOF-MLAA algorithm with various initial estimates. The accuracy of the average μ-values within the brain, chest, and abdomen regions obtained from the MR-derived μ-maps was also evaluated using computed tomography μ-maps as the gold standard. Results: The estimated μ-map with the initialization in terms of magnitude (i.e., average μ-value) was observed to reach the reference more quickly and naturally as compared to all other cases. Both 2D and 3D GATE simulations produced similar results, and it was observed that the proposed IAM approach can produce a quantitative μ-map/emission estimate when the corrections for physical effects such as scatter and randoms were included. The average μ-value obtained from the MR-derived μ-map was accurate within 5% with corrections for bone, fat, and uniform lungs. Conclusions: The proposed IAM-TOF-MLAA can produce a quantitative μ-map without any calibration provided that there are sufficient counts in the measured data. For low-count data, noise reduction and additional regularization/rescaling techniques need to be applied and investigated. The average μ-value within the object is prior information which can be extracted from MR and patient databases, and it is feasible to obtain an accurate average μ-value using an MR-derived μ-map with corrections, as demonstrated in this work.

  13. Improvement of semi-quantitative small-animal PET data with recovery coefficients: a phantom and rat study.

    PubMed

    Aide, Nicolas; Louis, Marie-Hélène; Dutoit, Soizic; Labiche, Alexandre; Lemoisson, Edwige; Briand, Mélanie; Nataf, Valérie; Poulain, Laurent; Gauduchon, Pascal; Talbot, Jean-Noël; Montravers, Françoise

    2007-10-01

    To evaluate the accuracy of semi-quantitative small-animal PET data, uncorrected for attenuation, and then of the same semi-quantitative data corrected by means of recovery coefficients (RCs) based on phantom studies. A phantom containing six fillable spheres (diameter range: 4.4-14 mm) was filled with an 18F-FDG solution (sphere-to-background activity ratio = 10.1, 5.1 and 2.5). RCs, defined as measured activity/expected activity, were calculated. Nude rats harbouring tumours (n = 50) were imaged after injection of 18F-FDG and sacrificed. The standardized uptake value (SUV) in tumours was determined with small-animal PET and compared to ex-vivo counting (ex-vivo SUV). Small-animal PET SUVs were corrected with RCs based on the greatest tumour diameter. Tumour proliferation was assessed with cyclin A immunostaining and correlated to the SUV. RCs ranged from 0.33 for the smallest sphere to 0.72 for the largest. A sigmoidal correlation was found between RCs and sphere diameters (r² = 0.99). Small-animal PET SUVs were well correlated with ex-vivo SUVs (y = 0.48x - 0.2; r² = 0.71) and the use of RCs based on the greatest tumour diameter significantly improved the regression (y = 0.84x - 0.81; r² = 0.77), except for tumours with important necrosis. Similar results were obtained without sacrificing animals, by using PET images to estimate tumour dimensions. RC-based corrections improved the correlation between small-animal PET SUVs and tumour proliferation (uncorrected data: Rho = 0.79; corrected data: Rho = 0.83). Recovery correction significantly improves both the accuracy of small-animal PET semi-quantitative data in rat studies and their correlation with tumour proliferation, except for largely necrotic tumours.
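
    The correction described can be sketched as a sigmoidal RC curve applied to the measured SUV. The curve parameters below are illustrative values roughly anchored to the reported endpoints (RC ≈ 0.33 at 4.4 mm, ≈ 0.72 at 14 mm), not the fitted phantom curve:

```python
import numpy as np

def recovery_coefficient(d_mm, rc_max=0.75, d50=5.0, k=0.5):
    """Sigmoidal recovery coefficient vs. sphere/tumour diameter (mm).
    Parameters are illustrative, not the paper's fit."""
    return rc_max / (1.0 + np.exp(-k * (d_mm - d50)))

def corrected_suv(measured_suv, tumour_diameter_mm):
    """Partial-volume correction: divide the measured SUV by the RC."""
    return measured_suv / recovery_coefficient(tumour_diameter_mm)

suv = corrected_suv(measured_suv=2.1, tumour_diameter_mm=8.0)
print(suv > 2.1)  # True: RC < 1, so the correction always raises the SUV
```

    The correction is largest for the smallest lesions, which is exactly where uncorrected small-animal PET underestimates uptake the most.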

  14. Statistical Deconvolution for Superresolution Fluorescence Microscopy

    PubMed Central

    Mukamel, Eran A.; Babcock, Hazen; Zhuang, Xiaowei

    2012-01-01

    Superresolution microscopy techniques based on the sequential activation of fluorophores can achieve image resolution of ∼10 nm but require a sparse distribution of simultaneously activated fluorophores in the field of view. Image analysis procedures for this approach typically discard data from crowded molecules with overlapping images, wasting valuable image information that is only partly degraded by overlap. A data analysis method that exploits all available fluorescence data, regardless of overlap, could increase the number of molecules processed per frame and thereby accelerate superresolution imaging speed, enabling the study of fast, dynamic biological processes. Here, we present a computational method, referred to as deconvolution-STORM (deconSTORM), which uses iterative image deconvolution in place of single- or multiemitter localization to estimate the sample. DeconSTORM approximates the maximum likelihood sample estimate under a realistic statistical model of fluorescence microscopy movies comprising numerous frames. The model incorporates Poisson-distributed photon-detection noise, the sparse spatial distribution of activated fluorophores, and temporal correlations between consecutive movie frames arising from intermittent fluorophore activation. We first quantitatively validated this approach with simulated fluorescence data and showed that deconSTORM accurately estimates superresolution images even at high densities of activated fluorophores where analysis by single- or multiemitter localization methods fails. We then applied the method to experimental data of cellular structures and demonstrated that deconSTORM enables an approximately fivefold or greater increase in imaging speed by allowing a higher density of activated fluorophores/frame. PMID:22677393
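
    deconSTORM's core update is related to classic Richardson-Lucy deconvolution for Poisson noise; a minimal 1-D sketch, without the sparsity and inter-frame priors that deconSTORM adds, looks like this:

```python
import numpy as np

def richardson_lucy(observed, psf, n_iter=50):
    """Richardson-Lucy deconvolution (1-D, Poisson noise model)."""
    est = np.full_like(observed, observed.mean(), dtype=float)
    psf_flip = psf[::-1]
    for _ in range(n_iter):
        blurred = np.convolve(est, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        est = est * np.convolve(ratio, psf_flip, mode="same")
    return est

# Two point emitters separated by little more than the PSF width
truth = np.zeros(64)
truth[30], truth[36] = 100.0, 80.0
x = np.arange(-8, 9)
psf = np.exp(-x**2 / 8.0)
psf /= psf.sum()
observed = np.convolve(truth, psf, mode="same")

est = richardson_lucy(observed, psf)
peak = int(np.argmax(est))
print(28 <= peak <= 32)  # mass concentrates near the brighter emitter at 30
```

    The multiplicative update keeps the estimate non-negative, and the per-frame priors in deconSTORM additionally encourage sparse, temporally correlated solutions.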

  15. CO2 fluxes from diffuse degassing in Italy

    NASA Astrophysics Data System (ADS)

    Cardellini, C.; Chiodini, G.; Frondini, F.; Caliro, S.

    2016-12-01

    Central and southern Italy are affected by an intense process of CO2 Earth degassing from both active volcanoes and tectonically active areas. Regional-scale studies, based on the carbon mass balance of groundwater of regional aquifers in non-volcanic areas, highlighted the presence of two large CO2 degassing structures that, for their magnitude and geochemical-isotopic features, were related to a regional process of mantle degassing. Quantitative estimates provided a CO2 flux of 9 Mt/y for the region (62,000 km2). Besides the magnitude of the process, a strong link between the deep CO2 degassing and the seismicity of the region, and a strict correlation between the migration of deep CO2-rich fluids and the heat flux, have been highlighted. In addition, the region is also characterised by the presence of many cold gas emissions where deeply derived CO2 is released by vents and soil diffuse degassing areas. Both direct CO2 expulsion at the surface and C-rich groundwater are different manifestations of the same process; in fact, the deeply produced gas can be dissolved by groundwater or emitted directly to the atmosphere depending on the gas flux rate and the geological-structural and hydrogeological settings. Quantitative estimates of the CO2 fluxes are available only for a limited number (~30) of the about 270 catalogued gas manifestations, allowing an estimate of a CO2 flux of 1.4 Mt/y. Summing the two estimates, the non-volcanic CO2 flux from the region is globally relevant, being from 2 to 10% of the estimated present-day global CO2 discharge from subaerial volcanoes. Large amounts of CO2 are also discharged by soil diffuse degassing in volcanic-hydrothermal systems. Specific surveys at Solfatara of Pozzuoli (Campi Flegrei Caldera) pointed out the relevance of this process. CO2 diffuse degassing at Solfatara, measured since 1998, shows a persistent CO2 flux of 1300 t/d (± 390 t/d), a flux comparable to that of an erupting volcano. The quantification of diffuse CO2 degassing in Italy points out the relevance of non-volcanic CO2 degassing and of soil degassing from volcanoes, suggesting that the current underestimation of global CO2 degassing may also arise from the lack of specific and systematic studies of the numerous "degassing areas" of the world, which would contribute to better constraining the global CO2 budget.

  16. Identification and uncertainty estimation of vertical reflectivity profiles using a Lagrangian approach to support quantitative precipitation measurements by weather radar

    NASA Astrophysics Data System (ADS)

    Hazenberg, P.; Torfs, P. J. J. F.; Leijnse, H.; Delrieu, G.; Uijlenhoet, R.

    2013-09-01

    This paper presents a novel approach to estimate the vertical profile of reflectivity (VPR) from volumetric weather radar data using both a traditional Eulerian and a newly proposed Lagrangian implementation. For the latter, the recently developed Rotational Carpenter Square Cluster Algorithm (RoCaSCA) is used to delineate precipitation regions at different reflectivity levels. A piecewise-linear VPR is estimated for stratiform precipitation and for precipitation classified as neither stratiform nor convective. As a second aspect of this paper, a novel approach is presented that is able to account for the impact of VPR uncertainty on the estimated radar rainfall variability. Results show that implementation of the VPR identification and correction procedure has a positive impact on quantitative precipitation estimates from radar. Unfortunately, visibility problems severely limit the impact of the Lagrangian implementation beyond distances of 100 km. However, by combining this procedure with the global Eulerian VPR estimation procedure for a given rainfall type (stratiform, and neither stratiform nor convective), the quality of the quantitative precipitation estimates increases up to a distance of 150 km. Analyses of the impact of VPR uncertainty show that this aspect accounts for a large fraction of the differences between weather radar rainfall estimates and rain gauge measurements.
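
    The use of a piecewise-linear VPR can be sketched as follows: given a normalized profile (reflectivity factor relative to ground level), a measurement taken aloft is corrected back to ground. The breakpoints below are illustrative, not the paper's estimated profiles:

```python
import numpy as np

# Normalized piecewise-linear VPR with a bright-band peak near 2.5 km
heights = np.array([0.0, 1.5, 2.5, 4.0, 8.0])     # km above ground
vpr = np.array([1.0, 1.0, 1.6, 0.6, 0.1])          # linear factor vs. ground

def correct_to_ground(z_measured_dbz, beam_height_km):
    """Remove the VPR effect from a reflectivity measured aloft (dBZ)."""
    f = np.interp(beam_height_km, heights, vpr)    # profile value at beam height
    return z_measured_dbz - 10.0 * np.log10(f)

print(round(correct_to_ground(30.0, 2.5), 1))  # 28.0: bright-band excess removed
```

    At long range the beam samples high in the profile, where the factor drops well below one, which is why uncorrected radar underestimates surface rainfall there.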

  17. Physiological frailty index (PFI): quantitative in-life estimate of individual biological age in mice.

    PubMed

    Antoch, Marina P; Wrobel, Michelle; Kuropatwinski, Karen K; Gitlin, Ilya; Leonova, Katerina I; Toshkov, Ilia; Gleiberman, Anatoli S; Hutson, Alan D; Chernova, Olga B; Gudkov, Andrei V

    2017-03-19

    The development of healthspan-extending pharmaceuticals requires quantitative estimation of age-related progressive physiological decline. In humans, individual health status can be quantitatively assessed by means of a frailty index (FI), a parameter which reflects the scale of accumulation of age-related deficits. However, adaptation of this methodology to animal models is a challenging task, since it includes multiple subjective parameters. Here we report the development of a quantitative non-invasive procedure to estimate the biological age of an individual animal by creating a physiological frailty index (PFI). We demonstrated the dynamics of the PFI increase during chronological aging of male and female NIH Swiss mice. We also demonstrated accelerated growth of the PFI in animals placed on a high-fat diet, reflecting the acceleration of aging by obesity, and we provide a tool for its quantitative assessment. Additionally, we showed that the PFI could reveal the anti-aging effect of the mTOR inhibitor rapatar (a bioavailable formulation of rapamycin) prior to the registration of its effects on longevity. The PFI revealed substantial sex-related differences in normal chronological aging and in the efficacy of detrimental (high-fat diet) or beneficial (rapatar) aging-modulatory factors. Together, these data introduce the PFI as a reliable, non-invasive, quantitative tool suitable for testing potential anti-aging pharmaceuticals in pre-clinical studies.
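
    A generic deficit-accumulation score in the spirit of a frailty index can be sketched as follows; the reference values, the three parameters and the 3-SD saturation rule are invented for illustration (the actual PFI is built from a larger panel of physiological measurements):

```python
import numpy as np

# Hypothetical young-reference distribution for three parameters
# (e.g. body weight, grip strength, gait speed)
reference_mean = np.array([25.0, 120.0, 5.0])
reference_sd = np.array([2.0, 10.0, 0.5])

def frailty_index(measurements):
    """Mean deficit: deviation from the young reference in SD units,
    saturating at 3 SD, so the index lies in [0, 1]."""
    z = np.abs(measurements - reference_mean) / reference_sd
    deficits = np.clip(z / 3.0, 0.0, 1.0)
    return deficits.mean()

young = np.array([25.5, 118.0, 5.1])
old = np.array([31.0, 90.0, 6.8])
print(frailty_index(young) < frailty_index(old))  # True
```

    Scoring each parameter against a young reference is what lets a single scalar track biological (rather than chronological) age across individuals.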

  18. Estimated areal extent of colonies of black-tailed prairie dogs in the northern Great Plains

    USGS Publications Warehouse

    Sidle, John G.; Johnson, Douglas H.; Euliss, Betty R.

    2001-01-01

    During 1997–1998, we undertook an aerial survey, with an aerial line-intercept technique, to estimate the extent of colonies of black-tailed prairie dogs (Cynomys ludovicianus) in the northern Great Plains states of Nebraska, North Dakota, South Dakota, and Wyoming. We stratified the survey based on knowledge of colony locations, computed 2 types of estimates for each stratum, and combined ratio estimates for high-density strata with average density estimates for low-density strata. Estimates of colony areas for black-tailed prairie dogs were derived from the average percentages of lines intercepting prairie dog colonies and ratio estimators. We selected the best estimator based on the correlation between length of transect line and length of intercepted colonies. Active colonies of black-tailed prairie dogs occupied 2,377.8 ± 186.4 (SE) km2, whereas inactive colonies occupied 560.4 ± 89.2 km2. These data represent the first quantitative assessment of black-tailed prairie dog colonies in the northern Great Plains. The survey dispels popular notions that millions of hectares of colonies of black-tailed prairie dogs exist in the northern Great Plains and can form the basis for future survey efforts.
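
    The ratio estimator used for high-density strata amounts to scaling the stratum area by the fraction of transect length that crossed colonies; the transect and intercept lengths below are hypothetical, not survey data:

```python
import numpy as np

# Hypothetical line-intercept data for one stratum
transect_km = np.array([120.0, 95.0, 140.0, 110.0])   # line lengths flown
intercept_km = np.array([2.4, 1.1, 3.9, 1.8])          # length over colonies
stratum_area_km2 = 18000.0

# Ratio estimator: colony area = stratum area * (intercepted / total length)
ratio = intercept_km.sum() / transect_km.sum()
colony_area = stratum_area_km2 * ratio
print(round(colony_area, 1))  # 356.1
```

    Pooling lengths before dividing (rather than averaging per-line ratios) weights each transect by its length, which is the usual form of the ratio estimator.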

  19. Analysis of automated quantification of motor activity in REM sleep behaviour disorder.

    PubMed

    Frandsen, Rune; Nikolic, Miki; Zoetmulder, Marielle; Kempfner, Lykke; Jennum, Poul

    2015-10-01

    Rapid eye movement (REM) sleep behaviour disorder (RBD) is characterized by dream enactment and REM sleep without atonia. Atonia is evaluated on the basis of visual criteria, but there is a need for more objective, quantitative measurements. We aimed to define and optimize a method for establishing the baseline and all other parameters for automatically quantifying submental motor activity during REM sleep. We analysed the electromyographic activity of the submental muscle in polysomnographs of 29 patients with idiopathic RBD (iRBD), 29 controls and 43 patients with Parkinson's disease (PD). Six adjustable parameters for motor activity were defined, and motor activity was detected and quantified automatically. The optimal parameters for separating RBD patients from controls were identified as those yielding the greatest area under the receiver operating characteristic curve among a total of 648 possible combinations, and were validated on the PD patients. Automatic baseline estimation improved characterization of atonia during REM sleep, as it eliminates inter- and intra-observer variability and can be standardized across diagnostic centres. We found an optimized method for quantifying motor activity during REM sleep. The method was stable and can be used to differentiate RBD patients from controls and to quantify motor activity during REM sleep in patients with neurodegeneration. No control had more than 30% of REM sleep with increased motor activity, whereas patients with known RBD had activity as low as 4.5%. We developed and applied a sensitive, quantitative, automatic algorithm to evaluate loss of atonia in RBD patients. © 2015 European Sleep Research Society.
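    The selection criterion can be sketched in a few lines: for any one of the 648 parameter combinations, the resulting per-subject activity scores are ranked by the area under the ROC curve, computed here with the Mann-Whitney formulation. The per-subject scores below are invented for illustration, not study data.

    ```python
    def auc(pos, neg):
        """Mann-Whitney estimate of ROC AUC: P(score_pos > score_neg),
        counting ties as half a win."""
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    # Hypothetical "% of REM sleep with increased motor activity" per subject
    rbd      = [35.0, 48.0, 4.5, 52.0, 30.5]   # patients (one low outlier)
    controls = [4.0, 12.0, 25.0, 8.0, 29.0]    # controls
    print(auc(rbd, controls))  # → 0.84
    ```

    In a full parameter search, this score would simply be maximized over the grid of detector settings.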

  20. Conventional liquid chromatography/triple quadrupole mass spectrometer-based metabolite identification and semi-quantitative estimation approach in the investigation of dabigatran etexilate in vitro metabolism

    PubMed Central

    Hu, Zhe-Yi; Parker, Robert B.; Herring, Vanessa L.; Laizure, S. Casey

    2012-01-01

    Dabigatran etexilate (DABE) is an oral prodrug that is rapidly converted by esterases to dabigatran (DAB), a direct inhibitor of thrombin. To elucidate the esterase-mediated metabolic pathway of DABE, a high-performance liquid chromatography/mass spectrometry (LC-MS/MS)-based metabolite identification and semi-quantitative estimation approach was developed. To overcome the poor full-scan sensitivity of conventional triple quadrupole mass spectrometry, precursor-product ion pairs were predicted to search for potential in vitro metabolites, and the detected metabolites were confirmed by product ion scans. A dilution method was introduced to evaluate the matrix effects of tentatively identified metabolites without chemical standards. Quantitative information on detected metabolites was obtained using 'metabolite standards' generated from incubation samples containing a high concentration of metabolite, in combination with a correction factor for mass spectrometry response. Two in vitro metabolites of DABE (M1 and M2) were identified and quantified by the semi-quantitative estimation approach. It is noteworthy that CES1 converts DABE to M1, while CES2 mediates the conversion of DABE to M2; M1 (or M2) is further metabolized to DAB by CES2 (or CES1). The approach presented here addresses a bioanalytical need for fast identification and semi-quantitative estimation of CES metabolites in preclinical samples. PMID:23239178

  1. Multiparametric Quantitative Ultrasound Imaging in Assessment of Chronic Kidney Disease.

    PubMed

    Gao, Jing; Perlman, Alan; Kalache, Safa; Berman, Nathaniel; Seshan, Surya; Salvatore, Steven; Smith, Lindsey; Wehrli, Natasha; Waldron, Levi; Kodali, Hanish; Chevalier, James

    2017-11-01

    To evaluate the value of multiparametric quantitative ultrasound imaging in assessing chronic kidney disease (CKD), using kidney biopsy pathologic findings as the reference standard, we prospectively measured multiparametric quantitative ultrasound markers with grayscale, spectral Doppler, and acoustic radiation force impulse imaging in 25 patients with CKD before kidney biopsy and in 10 healthy volunteers. Based on all pathologic (glomerulosclerosis, interstitial fibrosis/tubular atrophy, arteriosclerosis, and edema) scores, the patients with CKD were classified into mild (no grade 3 and <2 of grade 2) and moderate to severe (at least 2 of grade 2 or 1 of grade 3) CKD groups. Multiparametric quantitative ultrasound parameters included kidney length, cortical thickness, pixel intensity, parenchymal shear wave velocity, intrarenal artery peak systolic velocity (PSV), end-diastolic velocity (EDV), and resistive index. We tested the difference in quantitative ultrasound parameters among mild CKD, moderate to severe CKD, and healthy controls using analysis of variance, analyzed correlations of quantitative ultrasound parameters with pathologic scores and the estimated glomerular filtration rate (GFR) using Pearson correlation coefficients, and examined the diagnostic performance of quantitative ultrasound parameters in determining moderate CKD and an estimated GFR of less than 60 mL/min/1.73 m² using receiver operating characteristic curve analysis. There were significant differences in cortical thickness, pixel intensity, PSV, and EDV among the 3 groups (all P < .01). Among quantitative ultrasound parameters, the top areas under the receiver operating characteristic curves for PSV and EDV were 0.88 and 0.97, respectively, for determining pathologic moderate to severe CKD, and 0.76 and 0.86 for an estimated GFR of less than 60 mL/min/1.73 m². Moderate to good correlations were found for PSV, EDV, and pixel intensity with pathologic scores and estimated GFR. The PSV, EDV, and pixel intensity are valuable in determining moderate to severe CKD. The value of shear wave velocity in assessing CKD needs further investigation. © 2017 by the American Institute of Ultrasound in Medicine.

  2. On the Agreement between Manual and Automated Methods for Single-Trial Detection and Estimation of Features from Event-Related Potentials

    PubMed Central

    Biurrun Manresa, José A.; Arguissain, Federico G.; Medina Redondo, David E.; Mørch, Carsten D.; Andersen, Ole K.

    2015-01-01

    The agreement between humans and algorithms on whether an event-related potential (ERP) is present or not, and the level of variation in the estimated values of its relevant features, are largely unknown. Thus, the aim of this study was to determine the categorical and quantitative agreement between manual and automated methods for single-trial detection and estimation of ERP features. To this end, ERPs were elicited in sixteen healthy volunteers using electrical stimulation at graded intensities below and above the nociceptive withdrawal reflex threshold. The presence/absence of an ERP peak (categorical outcome) and its amplitude and latency (quantitative outcome) in each single trial were evaluated independently by two human observers and by two automated algorithms taken from the existing literature. Categorical agreement was assessed using percentage positive and negative agreement and Cohen’s κ, whereas quantitative agreement was evaluated using Bland-Altman analysis and the coefficient of variation. Typical values for the categorical agreement between manual and automated methods were derived, as well as reference values for the average and maximum differences that can be expected if one method is used instead of the others. Results showed that the human observers presented the highest categorical and quantitative agreement, and there were significant differences among methods in the detection and estimation of quantitative features. In conclusion, substantial care should be taken in the selection of the detection/estimation approach, since factors like stimulation intensity and the expected number of trials with/without a response can play a significant role in the outcome of a study. PMID:26258532
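    The categorical agreement measure used here is standard and easy to sketch: Cohen's κ compares observed agreement against the agreement expected by chance. The present/absent calls below (1 = ERP detected) are invented for illustration.

    ```python
    def cohens_kappa(a, b):
        """Cohen's kappa for two binary raters over the same trials
        (1 = ERP present, 0 = absent)."""
        n = len(a)
        po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
        pa, pb = sum(a) / n, sum(b) / n                     # marginal "yes" rates
        pe = pa * pb + (1 - pa) * (1 - pb)                  # chance agreement
        return (po - pe) / (1 - pe)

    rater_1 = [1, 1, 0, 0, 1, 0, 1, 1]   # hypothetical single-trial calls
    rater_2 = [1, 0, 0, 0, 1, 0, 1, 1]
    print(cohens_kappa(rater_1, rater_2))  # → 0.75
    ```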

  3. A new efficient method for synaptic vesicle quantification reveals differences between medial prefrontal cortex perforated and nonperforated synapses.

    PubMed

    Nava, Nicoletta; Chen, Fenghua; Wegener, Gregers; Popoli, Maurizio; Nyengaard, Jens Randel

    2014-02-01

    Communication between neurons is mediated by the release of neurotransmitter-containing vesicles from presynaptic terminals. Quantitative characterization of synaptic vesicles can be highly valuable for understanding the mechanisms underlying synaptic function and plasticity. We performed a quantitative ultrastructural analysis of cortical excitatory synapses by means of a new, efficient method, as an alternative to three-dimensional (3D) reconstruction. Based on a hierarchical sampling strategy and unequivocal identification of the region of interest, serial sections from excitatory synapses of the medial prefrontal cortex (mPFC) of six Sprague-Dawley rats were acquired with a transmission electron microscope. Unbiased estimates of the total 3D volume of synaptic terminals were obtained with the Cavalieri estimator, and adequate correction factors for vesicle profile number estimation were applied for final vesicle quantification. Our analysis was based on 79 excitatory synapses of the nonperforated (NPS) and perforated (PS) subtypes. We found that the total numbers of docked and reserve-pool vesicles in PSs significantly exceeded those in NPSs (by 77% and 78%, respectively). These differences were related to differences in size between the two subtypes (active zone area by 86%; bouton volume by 105%) rather than to postsynaptic density shape. Positive significant correlations were found between the numbers of docked and reserve-pool vesicles, between active zone area and docked vesicles, and between bouton volume and reserve-pool vesicles. Our method confirmed the large size of mPFC PSs and a linear correlation between presynaptic features typical of hippocampal synapses. Moreover, the greater number of docked vesicles in PSs may underlie the high synaptic strength of these synapses. Copyright © 2013 Wiley Periodicals, Inc.
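    The Cavalieri estimator the authors mention reduces, in its simplest form, to the section spacing times the summed profile areas on systematically spaced sections. The profile areas and spacing below are hypothetical illustration values.

    ```python
    def cavalieri_volume(section_areas_um2, spacing_um):
        """Cavalieri estimator of total volume from systematically spaced
        serial sections: V = section spacing × Σ(profile areas)."""
        return spacing_um * sum(section_areas_um2)

    # Hypothetical profile areas of one bouton on five serial sections (µm²)
    areas = [0.12, 0.20, 0.25, 0.18, 0.09]
    v = cavalieri_volume(areas, spacing_um=0.07)   # 70 nm section spacing
    print(round(v, 4))  # → 0.0588 (bouton volume, µm³)
    ```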

  4. Application of Quantitative Structure–Activity Relationship Models of 5-HT1A Receptor Binding to Virtual Screening Identifies Novel and Potent 5-HT1A Ligands

    PubMed Central

    2015-01-01

    The 5-hydroxytryptamine 1A (5-HT1A) serotonin receptor has been an attractive target for treating mood and anxiety disorders such as schizophrenia. We have developed binary classification quantitative structure–activity relationship (QSAR) models of 5-HT1A receptor binding activity using data retrieved from the PDSP Ki database. The prediction accuracy of these models was estimated by external 5-fold cross-validation as well as using an additional validation set comprising 66 structurally distinct compounds from the World of Molecular Bioactivity database. These validated models were then used to mine three major types of chemical screening libraries, i.e., drug-like libraries, GPCR targeted libraries, and diversity libraries, to identify novel computational hits. The five best hits from each class of libraries were chosen for further experimental testing in radioligand binding assays, and nine of the 15 hits were confirmed to be active experimentally with binding affinity better than 10 μM. The most active compound, Lysergol, from the diversity library showed very high binding affinity (Ki) of 2.3 nM against 5-HT1A receptor. The novel 5-HT1A actives identified with the QSAR-based virtual screening approach could be potentially developed as novel anxiolytics or potential antischizophrenic drugs. PMID:24410373

  5. Space spin-offs: is technology transfer worth it?

    NASA Astrophysics Data System (ADS)

    Bush, Lance B.

    Dual uses, spin-offs, and technology transfer have all become part of the space lexicon, creating a cultural attitude toward justifying space activity. From the very beginning of space activities in the late 1950s, this idea of secondary benefits became a major part of the space culture and its belief system, and technology transfer has played a central role in public and political debates over funding for space activities. Over the years, several studies of the benefits of space activities have been performed, with some estimates reaching as high as a 60:1 return to the economy for each dollar spent on space activities, though many of the models claiming such high returns have been roundly criticized. More recent studies of technology transfer from federal laboratories to the private sector show a return on investment of 2.8:1, with little evidence of job creation. Yet a purely quantitative analysis is not sufficient, as there exist cultural and social benefits attainable only through case studies. Space projects tend to have long life cycles, making it difficult to track metrics on their secondary benefits. Recent studies have begun to make inroads toward a better understanding of the benefits and drawbacks of investing in space-related technology transfer, but significant analysis remains to be performed, and it must combine quantitative and qualitative approaches.

  6. Quantitative sonoelastography for the in vivo assessment of skeletal muscle viscoelasticity

    NASA Astrophysics Data System (ADS)

    Hoyt, Kenneth; Kneezel, Timothy; Castaneda, Benjamin; Parker, Kevin J.

    2008-08-01

    A novel quantitative sonoelastography technique for assessing the viscoelastic properties of skeletal muscle tissue was developed. Slowly propagating shear wave interference patterns (termed crawling waves) were generated using a two-source configuration vibrating normal to the surface. Theoretical models predict crawling wave displacement fields, which were validated through phantom studies. In experiments, a viscoelastic model was fit to dispersive shear wave speed sonoelastographic data using nonlinear least-squares techniques to determine frequency-independent shear modulus and viscosity estimates. Shear modulus estimates derived using the viscoelastic model were in agreement with that obtained by mechanical testing on phantom samples. Preliminary sonoelastographic data acquired in healthy human skeletal muscles confirm that high-quality quantitative elasticity data can be acquired in vivo. Studies on relaxed muscle indicate discernible differences in both shear modulus and viscosity estimates between different skeletal muscle groups. Investigations into the dynamic viscoelastic properties of (healthy) human skeletal muscles revealed that voluntarily contracted muscles exhibit considerable increases in both shear modulus and viscosity estimates as compared to the relaxed state. Overall, preliminary results are encouraging and quantitative sonoelastography may prove clinically feasible for in vivo characterization of the dynamic viscoelastic properties of human skeletal muscle.
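    A sketch of the dispersion fit described here, assuming the Kelvin-Voigt model commonly used with crawling-wave data (the abstract does not name the specific rheological model): shear wave speed as a function of angular frequency depends on shear modulus μ and viscosity η, which are recovered by nonlinear least squares. The synthetic "measurements" are generated from known parameter values, so the fit should return them.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    RHO = 1000.0  # assumed tissue density (kg/m³)

    def voigt_speed(omega, mu, eta):
        """Shear wave speed dispersion for a Kelvin-Voigt material:
        c(ω) = sqrt( 2(μ² + ω²η²) / (ρ(μ + sqrt(μ² + ω²η²))) )."""
        m = np.sqrt(mu**2 + (omega * eta)**2)
        return np.sqrt(2.0 * (mu**2 + (omega * eta)**2) / (RHO * (mu + m)))

    # Synthetic dispersion curve from known μ = 4 kPa, η = 2 Pa·s (illustration)
    f = np.array([100.0, 150.0, 200.0, 250.0, 300.0, 350.0, 400.0])  # Hz
    omega = 2 * np.pi * f
    c_obs = voigt_speed(omega, 4e3, 2.0)

    (mu_hat, eta_hat), _ = curve_fit(voigt_speed, omega, c_obs, p0=(1e3, 1.0))
    print(round(mu_hat), round(eta_hat, 2))  # recovers ≈ 4000 Pa and ≈ 2.0 Pa·s
    ```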

  7. An assessment of the reliability of quantitative genetics estimates in study systems with high rate of extra-pair reproduction and low recruitment.

    PubMed

    Bourret, A; Garant, D

    2017-03-01

    Quantitative genetics approaches, and particularly animal models, are widely used to assess the genetic (co)variance of key fitness-related traits and to infer the adaptive potential of wild populations. Despite the importance of precision and accuracy in genetic variance estimates, and their potential sensitivity to various ecological and population-specific factors, their reliability is rarely tested explicitly. Here, we used simulations and empirical data collected from an 11-year study of tree swallows (Tachycineta bicolor), a species showing a high rate of extra-pair paternity and a low recruitment rate, to assess the effects of identity errors and of pedigree structure and size on quantitative genetic estimates in our dataset. Our simulations revealed an important lack of precision in heritability and genetic-correlation estimates for most traits, low power to detect significant effects, and important identifiability problems. We also observed a large bias in heritability estimates when using the social pedigree instead of the genetic one (deflated heritabilities) or when not accounting for an important cause of resemblance among individuals (for example, a permanent environment or brood effect) in model parameterizations for some traits (inflated heritabilities). We discuss the causes underlying the low reliability observed here and why they are also likely to occur in other study systems. Altogether, our results re-emphasize the difficulty of generalizing quantitative genetic estimates reliably from one study system to another and the importance of reporting simulation analyses to evaluate these issues.

  8. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    NASA Astrophysics Data System (ADS)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of an anomaly with enhanced depth resolution is a challenging task in estimating the depth of subsurface anomalies using thermography. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the subsequent thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier-transform-based methods used for post-processing unscramble the frequencies with limited frequency resolution and therefore yield only finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution so that the finest subsurface features can be explored axially. Quantitative depth analysis with this augmented depth resolution is proposed to provide a closer estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first solution for quantitative depth estimation in frequency modulated thermal wave imaging.
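    The spectral zoom that the chirp z-transform provides can be illustrated with a direct zoomed DFT evaluated on a dense grid inside a narrow band (the CZT is simply a faster way to compute the same samples). The sampling rate, tone frequency, and zoom band below are made up for illustration.

    ```python
    import numpy as np

    def zoom_dft(x, fs, f_lo, f_hi, m):
        """Evaluate the DTFT of x on m points spanning [f_lo, f_hi] Hz.
        This is the spectral 'zoom' a chirp z-transform provides, written
        as a direct O(n·m) matrix product for clarity."""
        n = np.arange(len(x))
        freqs = np.linspace(f_lo, f_hi, m)
        kernel = np.exp(-2j * np.pi * np.outer(freqs, n) / fs)
        return freqs, kernel @ x

    fs = 1000.0
    t = np.arange(2048) / fs
    x = np.sin(2 * np.pi * 101.3 * t)                  # tone between FFT bins
    freqs, spec = zoom_dft(x, fs, 100.0, 103.0, 601)   # 5 mHz grid spacing
    f_peak = freqs[np.argmax(np.abs(spec))]
    print(round(f_peak, 2))  # locates the tone far inside one ~0.49 Hz FFT bin
    ```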

  9. Temporal intracavity detection of parasitic infrared absorption in Ti:Sapphire lasers

    NASA Astrophysics Data System (ADS)

    Deleva, A. D.; Peshev, Z. Y.; Aneva, Z. I.

    1993-12-01

    An intracavity technique with temporal sensitivity to optical losses is used to detect parasitic infrared absorption (PIRA) in Ti:sapphire crystals with high active-center concentrations. By means of comparative analysis, re-emission is established of part of the parasitically absorbed energy back into the laser action channel. A method is proposed for approximate quantitative determination of the relative part of re-emitting PIRA-centers with respect to their total number; for the highly-doped crystal described, it is estimated at about 11%.

  10. A heating-superfusion platform technology for the investigation of protein function in single cells.

    PubMed

    Xu, Shijun; Ainla, Alar; Jardemark, Kent; Jesorka, Aldo; Jeffries, Gavin D M

    2015-01-06

    Here, we report on a novel approach for the study of single-cell intracellular enzyme activity at various temperatures, utilizing a localized laser heating probe in combination with a freely positionable microfluidic perfusion device. Through directed exposure of individual cells to the pore-forming agent α-hemolysin, we have controlled the membrane permeability, enabling targeted delivery of the substrate. Mildly permeabilized cells were exposed to fluorogenic substrates to monitor the activity of intracellular enzymes, while adjusting the local temperature surrounding the target cells, using an infrared laser heating system. We generated quantitative estimates for the intracellular alkaline phosphatase activity at five different temperatures in different cell lines, constructing temperature-response curves of enzymatic activity at the single-cell level. Enzymatic activity was determined rapidly after cell permeation, generating five-point temperature-response curves within just 200 s.

  11. Distribution and activity of anaerobic ammonium-oxidising bacteria in natural freshwater wetland soils.

    PubMed

    Shen, Li-dong; Wu, Hong-sheng; Gao, Zhi-qiu; Cheng, Hai-xiang; Li, Ji; Liu, Xu; Ren, Qian-qi

    2016-04-01

    Anaerobic ammonium oxidation (anammox) plays a significant role in the marine nitrogen cycle. However, the quantitative importance of this process for nitrogen removal in wetland systems, particularly in natural freshwater wetlands, has not been determined. In the present study, we provide evidence of the distribution and activity of anammox bacteria in a natural freshwater wetland located in southeastern China, using ¹⁵N stable isotope measurements, quantitative PCR assays and 16S rRNA gene clone library analysis. The potential anammox rates measured in this wetland system ranged between 2.5 and 25.5 nmol N₂ g⁻¹ soil day⁻¹, and up to 20% of soil dinitrogen gas production could be attributed to the anammox process. Phylogenetic analysis of 16S rRNA genes showed that anammox bacteria related to Candidatus Brocadia, Candidatus Kuenenia, Candidatus Anammoxoglobus and two novel anammox clusters coexisted in the collected soil cores, with Candidatus Brocadia and Candidatus Kuenenia being the dominant anammox genera. Quantitative PCR of hydrazine synthase genes showed that the abundance of anammox bacteria varied from 2.3 × 10⁵ to 2.2 × 10⁶ copies g⁻¹ soil in the examined soil cores. Correlation analyses suggested that the soil ammonium concentration had a significant influence on the activity of anammox bacteria. On the basis of ¹⁵N tracing, it is estimated that a total nitrogen loss of 31.1 g N m⁻² per year could be linked to the anammox process in the examined wetland.

  12. Fluvial drainage networks: the fractal approach as an improvement of quantitative geomorphic analyses

    NASA Astrophysics Data System (ADS)

    Melelli, Laura; Liucci, Luisa; Vergari, Francesca; Ciccacci, Sirio; Del Monte, Maurizio

    2014-05-01

    Drainage basins are primary landscape units for geomorphological investigations, and both hillslopes and the river drainage system are fundamental components in drainage basin analysis. Like other geomorphological systems, drainage basins tend toward an equilibrium condition in which the sequence of erosion, transport and sedimentation approaches a state of minimum energy expenditure. This state is revealed by a typical geometry of landforms and of the drainage net. Several morphometric indexes can measure how far a drainage basin is from the theoretical equilibrium configuration, revealing possible external disturbance. In tectonically active areas, drainage basins are of primary importance for highlighting the style, amount and rate of tectonic impulses, and morphometric indexes allow the tectonic activity classes of different sectors in a study area to be estimated. Moreover, drainage networks are characterized by a self-similar structure, which promotes the use of fractal theory to investigate the system. In this study, fractal techniques are employed together with quantitative geomorphological analysis to study the Upper Tiber Valley (UTV), a tectonic intermontane basin located in the northern Apennines (Umbria, central Italy). The area is the result of different tectonic phases: from the Late Pliocene until the present, the UTV has been strongly controlled by regional uplift and by an extensional phase, with different sets of normal faults playing a fundamental role in basin morphology. Thirty-four basins are taken into account for the quantitative analysis, twenty on the left side of the valley and the others on the right side. Using the fractal dimension of drainage networks, Horton's law results, concavity and steepness indexes, and hypsometric curves, this study aims to obtain an evolutionary model of the UTV in which uplift is compared to local subsidence induced by normal fault activity. The results highlight a well-defined difference between western and eastern tributary basins, suggesting greater disequilibrium in the latter. The quantitative analysis also points out the segments of the basin boundaries where fault activity is more efficient, and the resulting geomorphological implications.
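    The fractal-dimension estimate underlying such analyses can be sketched with the box-counting method (the abstract does not specify the algorithm, so this is one common choice): cover the network with grids of shrinking box size s and fit the slope of log N(s) versus log(1/s). The sanity check uses a straight segment, whose dimension should be close to 1.

    ```python
    import numpy as np

    def box_counting_dimension(points, sizes):
        """Fractal dimension as the slope of log N(s) versus log(1/s), where
        N(s) is the number of grid boxes of side s occupied by the point set."""
        counts = []
        for s in sizes:
            occupied = set(map(tuple, np.floor(points / s).astype(int)))
            counts.append(len(occupied))
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
        return slope

    # Sanity check: a densely sampled straight segment has dimension ≈ 1;
    # a digitized river network would be passed in the same way.
    t = np.linspace(0.0, 1.0, 5000)
    segment = np.column_stack([t, 0.5 * t])
    d = box_counting_dimension(segment, sizes=[0.1, 0.05, 0.025, 0.0125])
    print(round(d, 2))  # close to 1.0
    ```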

  13. An open tool for input function estimation and quantification of dynamic PET FDG brain scans.

    PubMed

    Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro

    2016-08-01

    Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient, so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact on the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. The main contribution of this article is the development of an open-source, free-to-use tool that encapsulates several well-known methods for the estimation of the input function and the quantification of dynamic PET FDG studies. Some alternative strategies are also proposed and implemented in the tool for the segmentation of blood pools and parameter estimation. The tool was tested on phantoms with encouraging results that suggest that even bloodless estimators could provide a viable alternative to blood sampling for quantification using graphical analysis. The open tool is a promising opportunity for collaboration among investigators and further validation on real studies.
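    Of the graphical-analysis quantification the tool performs, the Patlak plot is the classic example for FDG (assumed here; the abstract does not name the specific method): plotting Ct/Cp against the normalized running integral of the input function gives a late-time slope equal to the net influx rate Ki. The plasma curve and the true Ki below are synthetic illustration values.

    ```python
    import numpy as np

    def patlak_ki(t, cp, ct, t_star=20.0):
        """Patlak graphical analysis: slope of Ct/Cp versus ∫Cp dτ / Cp
        over the late, linear portion (t ≥ t_star) estimates Ki."""
        integral = np.concatenate(
            ([0.0], np.cumsum(np.diff(t) * 0.5 * (cp[1:] + cp[:-1]))))
        x, y = integral / cp, ct / cp
        late = t >= t_star
        ki, _ = np.polyfit(x[late], y[late], 1)
        return ki

    t = np.linspace(0.5, 60.0, 120)                  # minutes
    cp = 10.0 * np.exp(-0.1 * t) + 1.0               # hypothetical plasma input
    integral = np.concatenate(
        ([0.0], np.cumsum(np.diff(t) * 0.5 * (cp[1:] + cp[:-1]))))
    ct = 0.05 * integral + 0.3 * cp                  # tissue TAC, true Ki = 0.05
    print(round(patlak_ki(t, cp, ct), 3))  # → 0.05
    ```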

  14. Dual respiratory and cardiac motion estimation in PET imaging: Methods design and quantitative evaluation.

    PubMed

    Feng, Tao; Wang, Jizhe; Tsui, Benjamin M W

    2018-04-01

    The goal of this study was to develop and evaluate four post-reconstruction respiratory and cardiac (R&C) motion vector field (MVF) estimation methods for cardiac 4D PET data. In Method 1, the dual R&C motions were estimated directly from the dual R&C gated images. In Method 2, respiratory motion (RM) and cardiac motion (CM) were estimated separately from the respiratory-gated-only and cardiac-gated-only images. In Method 3, the effects of RM on CM estimation were modeled by applying an image-based RM correction on the cardiac gated images before CM estimation, while the effects of CM on RM estimation were neglected. Method 4 iteratively models the mutual effects of RM and CM during dual R&C motion estimation. Realistic simulation data were generated for quantitative evaluation of the four methods: almost noise-free PET projection data were generated from the 4D XCAT phantom with realistic R&C MVFs using Monte Carlo simulation, and Poisson noise was added to the scaled projection data to generate additional datasets at two higher noise levels. All the projection data were reconstructed using a 4D image reconstruction method to obtain dual R&C gated images. The four dual R&C MVF estimation methods were applied to the dual R&C gated images, and the accuracy of motion estimation was quantitatively evaluated using the root mean square error (RMSE) of the estimated MVFs. Results show that, among the four estimation methods, Method 2 performed the worst for the noise-free case while Method 1 performed the worst for the noisy cases in terms of the quantitative accuracy of the estimated MVF. Methods 3 and 4 showed comparable results and achieved RMSEs up to 35% lower than Method 1 for the noisy cases. In conclusion, we have developed and evaluated four post-reconstruction R&C MVF estimation methods for use in 4D PET imaging. Comparison of the four methods on simulated data indicates that separate R&C estimation with modeling of RM before CM estimation (Method 3) is the best option for accurate estimation of dual R&C motion in clinical situations. © 2018 American Association of Physicists in Medicine.
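    The evaluation metric itself is straightforward; a minimal sketch of RMSE over a motion vector field, with tiny made-up 3-D displacement vectors standing in for the estimated and ground-truth MVFs:

    ```python
    import numpy as np

    def mvf_rmse(estimated, truth):
        """RMSE between motion vector fields (N×3 arrays of displacement
        vectors): root of the mean squared vector-difference norm."""
        diff = np.asarray(estimated) - np.asarray(truth)
        return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))

    truth = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0]])   # made-up vectors (mm)
    est   = np.array([[1.0, 0.0, 1.0], [0.0, 2.0, 1.0]])   # each off by 1 mm in z
    print(mvf_rmse(est, truth))  # → 1.0
    ```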

  15. Harnessing quantitative genetics and genomics for understanding and improving complex traits in crops

    USDA-ARS?s Scientific Manuscript database

    Classical quantitative genetics aids crop improvement by providing the means to estimate heritability, genetic correlations, and predicted responses to various selection schemes. Genomics has the potential to aid quantitative genetics and applied crop improvement programs via large-scale, high-thro...
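    The quantities named here have compact textbook estimators; a sketch of heritability from mid-parent/offspring regression and the predicted response to selection (the breeder's equation R = h²S), using invented trait values:

    ```python
    import numpy as np

    def heritability_po(midparent, offspring):
        """Narrow-sense h² estimated as the slope of the regression of
        offspring values on mid-parent values."""
        slope, _ = np.polyfit(midparent, offspring, 1)
        return slope

    # Hypothetical family means for some trait (arbitrary units)
    midparent = np.array([10.0, 12.0, 14.0, 16.0, 18.0])
    offspring = np.array([10.5, 11.5, 13.0, 14.0, 15.0])
    h2 = heritability_po(midparent, offspring)
    S = 2.0                       # assumed selection differential
    print(round(h2, 3), round(h2 * S, 3))  # → 0.575 1.15
    ```

    The second number is the predicted per-generation response R = h²S for the assumed selection differential.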

  16. Reduced density gradient as a novel approach for estimating QSAR descriptors, and its application to 1, 4-dihydropyridine derivatives with potential antihypertensive effects.

    PubMed

    Jardínez, Christiaan; Vela, Alberto; Cruz-Borbolla, Julián; Alvarez-Mendez, Rodrigo J; Alvarado-Rodríguez, José G

    2016-12-01

    The relationship between chemical structure and biological activity (log IC50) of 40 derivatives of 1,4-dihydropyridines (DHPs) was studied using density functional theory (DFT) and multiple linear regression analysis. With the aim of improving the quantitative structure-activity relationship (QSAR) model, the reduced density gradient s(r) of the optimized equilibrium geometries was used as a descriptor to include weak non-covalent interactions. The QSAR model highlights the correlation of log IC50 with the highest occupied molecular orbital energy (EHOMO), molecular volume (V), partition coefficient (log P), non-covalent interactions NCI(H4-G) and the dual descriptor [Δf(r)]. The model yielded values of R² = 79.57 and Q² = 69.67, which were validated with four internal validations, DK = 0.076, DQ = −0.006, RP = 0.056, and RN = 0.000, and the external validation Q²boot = 64.26. The QSAR model can be used to estimate biological activity with high reliability for new compounds based on the DHP series. Graphical abstract: The good correlation between log IC50 and the NCI(H4-G) estimated by the reduced density gradient approach of the DHP derivatives.
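    The Q² statistic reported in such QSAR work is typically a leave-one-out cross-validated R²; a sketch assuming that definition, applied to synthetic descriptor data with a known linear relationship (the descriptors and coefficients are invented, not the paper's):

    ```python
    import numpy as np

    def loo_q2(X, y):
        """Leave-one-out Q²: 1 − PRESS / total sum of squares, with each
        observation predicted by an OLS model fit on the remaining ones."""
        n = len(y)
        press = 0.0
        for i in range(n):
            mask = np.arange(n) != i
            A = np.column_stack([X[mask], np.ones(mask.sum())])  # add intercept
            coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
            press += (y[i] - np.append(X[i], 1.0) @ coef) ** 2
        return 1.0 - press / np.sum((y - y.mean()) ** 2)

    rng = np.random.default_rng(7)
    X = rng.normal(size=(30, 3))                     # three hypothetical descriptors
    y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=30)
    print(loo_q2(X, y) > 0.9)  # strong linear signal → Q² near 1
    ```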

  17. Biochemical interpretation of quantitative structure-activity relationships (QSAR) for biodegradation of N-heterocycles: a complementary approach to predict biodegradability.

    PubMed

    Philipp, Bodo; Hoff, Malte; Germa, Florence; Schink, Bernhard; Beimborn, Dieter; Mersch-Sundermann, Volker

    2007-02-15

Prediction of the biodegradability of organic compounds is an ecologically desirable and economically feasible tool for estimating the environmental fate of chemicals. We combined quantitative structure-activity relationships (QSAR) with the systematic collection of biochemical knowledge to establish rules for the prediction of aerobic biodegradation of N-heterocycles. Validated biodegradation data of 194 N-heterocyclic compounds were analyzed using the MULTICASE method, which delivered two QSAR models, based on 17 activating (QSAR 1) and on 16 inactivating molecular fragments (QSAR 2), that were statistically significantly linked to efficient or poor biodegradability, respectively. The percentages of correct classifications were over 99% for both models, and cross-validation resulted in 67.9% (QSAR 1) and 70.4% (QSAR 2) correct predictions. Biochemical interpretation of the activating and inactivating characteristics of the molecular fragments delivered plausible mechanistic interpretations and enabled us to establish the following biodegradation rules: (1) target sites for amidohydrolases and for cytochrome P450 monooxygenases enhance biodegradation of nonaromatic N-heterocycles; (2) target sites for molybdenum hydroxylases enhance biodegradation of aromatic N-heterocycles; (3) target sites for hydration by a urocanase-like mechanism enhance biodegradation of imidazoles. Our complementary approach represents a feasible strategy for generating concrete rules for the prediction of the biodegradability of organic compounds.

  18. High-throughput process development: determination of dynamic binding capacity using microtiter filter plates filled with chromatography resin.

    PubMed

    Bergander, Tryggve; Nilsson-Välimaa, Kristina; Oberg, Katarina; Lacki, Karol M

    2008-01-01

Steadily increasing demand for more efficient and more affordable biomolecule-based therapies puts a significant burden on biopharma companies to reduce the cost of R&D activities associated with the introduction of a new drug to the market. Reducing the time required to develop a purification process would be one option to address the high-cost issue. The reduction in time can be accomplished if more efficient methods/tools are available for process development work, including high-throughput techniques. This paper addresses the transition from traditional column-based process development to a modern high-throughput approach utilizing microtiter filter plates filled with a well-defined volume of chromatography resin. The approach is based on implementing the well-known batch uptake principle in a microtiter plate geometry. Two variants of the proposed approach, allowing for either qualitative or quantitative estimation of dynamic binding capacity as a function of residence time, are described. Examples are given of the quantitative estimation of the dynamic binding capacity of human polyclonal IgG on MabSelect SuRe and of the qualitative estimation of the dynamic binding capacity of amyloglucosidase on a prototype of the Capto DEAE weak ion exchanger. The proposed high-throughput method for determination of dynamic binding capacity significantly reduces time and sample consumption compared to a traditional method utilizing packed chromatography columns, without sacrificing the accuracy of the data obtained.

  19. Uncertainty analysis of the radiological characteristics of radioactive waste using a method based on log-normal distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gigase, Yves

    2007-07-01

Available in abstract form only. Full text of publication follows: The uncertainty on the characteristics of radioactive LILW waste packages is difficult to determine and often very large. This results from a lack of knowledge of the constitution of the waste package and of the composition of the radioactive sources inside. To calculate a quantitative estimate of the uncertainty on a characteristic of a waste package, one has to combine these various uncertainties. This paper discusses an approach to this problem, based on the use of the log-normal distribution, which is both elegant and easy to use. It can provide, for example, quantitative estimates of uncertainty intervals that 'make sense'. The purpose is to develop a pragmatic approach that can be integrated into existing characterization methods. In this paper we show how our method can be applied to the scaling factor method. We also explain how it can be used when estimating other, more complex characteristics, such as the total uncertainty of a collection of waste packages. This method could have applications in radioactive waste management, in particular in those decision processes where the uncertainty on the amount of activity is considered to be important, such as in probabilistic risk assessment or the definition of criteria for acceptance or categorization. (author)
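
A minimal sketch of why the log-normal choice is convenient, assuming a waste-package characteristic modeled as a product of independent log-normal factors (the factor values below are invented for illustration): medians multiply and the logarithmic variances add, so an uncertainty interval falls out directly:

```python
import math

# A waste-package activity estimate modeled as a product of log-normal factors
# (e.g. measured key-nuclide activity x scaling factor x matrix correction).
# Each factor is given as (median, geometric standard deviation); the numbers
# are illustrative, not from the paper.
factors = [
    (5.0e3, 1.3),   # key-nuclide activity, Bq
    (0.02, 2.5),    # scaling factor to a hard-to-measure nuclide
    (1.1, 1.2),     # matrix/geometry correction
]

# A product of log-normals is log-normal: medians multiply,
# and ln-variances add, so sigma_ln_total = sqrt(sum(sigma_ln_i^2)).
median = math.prod(m for m, _ in factors)
sigma_ln = math.sqrt(sum(math.log(gsd) ** 2 for _, gsd in factors))
gsd_total = math.exp(sigma_ln)

# Approximate 95% uncertainty interval: median / gsd^1.96 .. median * gsd^1.96
low, high = median / gsd_total**1.96, median * gsd_total**1.96
print(f"median = {median:.1f} Bq, GSD = {gsd_total:.2f}")
print(f"95% interval: [{low:.1f}, {high:.1f}] Bq")
```

Note how the largest geometric standard deviation (here the scaling factor) dominates the combined uncertainty, which matches the intuition behind scaling-factor-based characterization.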

  20. Application of a multicompartment dynamical model to multimodal optical imaging for investigating individual cerebrovascular properties

    NASA Astrophysics Data System (ADS)

    Desjardins, Michèle; Gagnon, Louis; Gauthier, Claudine; Hoge, Rick D.; Dehaes, Mathieu; Desjardins-Crépeau, Laurence; Bherer, Louis; Lesage, Frédéric

    2009-02-01

Biophysical models of hemodynamics provide a tool for quantitative multimodal brain imaging by allowing a deeper understanding of the interplay between neural activity and the blood oxygenation, volume, and flow responses to stimuli. Multicompartment dynamical models that describe the dynamics and interactions of the vascular and metabolic components of evoked hemodynamic responses have been developed in the literature. In this work, multimodal data from near-infrared spectroscopy (NIRS) and diffuse correlation flowmetry (DCF) are used to estimate total baseline hemoglobin concentration (HBT0) in 7 adult subjects. A validation of the model estimate and an investigation of the partial volume effect are done by comparing with time-resolved spectroscopy (TRS) measures of absolute HBT0. Simultaneous NIRS and DCF measurements during hypercapnia are then performed, but prove difficult to reproduce. The results raise questions about the feasibility of an all-optical model-based estimation of individual vascular properties.

  1. Nighttime image dehazing using local atmospheric selection rule and weighted entropy for visible-light systems

    NASA Astrophysics Data System (ADS)

    Park, Dubok; Han, David K.; Ko, Hanseok

    2017-05-01

    Optical imaging systems are often degraded by scattering due to atmospheric particles, such as haze, fog, and mist. Imaging under nighttime haze conditions may suffer especially from the glows near active light sources as well as scattering. We present a methodology for nighttime image dehazing based on an optical imaging model which accounts for varying light sources and their glow. First, glow effects are decomposed using relative smoothness. Atmospheric light is then estimated by assessing global and local atmospheric light using a local atmospheric selection rule. The transmission of light is then estimated by maximizing an objective function designed on the basis of weighted entropy. Finally, haze is removed using two estimated parameters, namely, atmospheric light and transmission. The visual and quantitative comparison of the experimental results with the results of existing state-of-the-art methods demonstrates the significance of the proposed approach.

  2. From Magma Fracture to a Seismic Magma Flow Meter

    NASA Astrophysics Data System (ADS)

    Neuberg, J. W.

    2007-12-01

Seismic swarms of low-frequency events occur during periods of enhanced volcanic activity and have been related to the flow of magma at depth. Often they precede a dome collapse on volcanoes like Soufriere Hills, Montserrat, or Mt St Helens. This contribution is based on the conceptual model of magma rupture as a trigger mechanism. Several source mechanisms and radiation patterns at the focus of a single event are discussed. We investigate the accelerating event rate and seismic amplitudes during one swarm, as well as over a time period spanning several swarms. The seismic slip vector is then linked to magma flow parameters, resulting in estimates of magma flux for a variety of flow models such as plug flow, parabolic flow, or friction-controlled flow. In this way we try to relate conceptual models to quantitative estimates, which could make it possible to infer magma flux at depth from low-frequency seismic signals.

  3. 76 FR 13018 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-09

    ... statistical surveys that yield quantitative results that can be generalized to the population of study. This... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. Total Burden Estimate for the...

  4. Quantitative analysis of low-density SNP data for parentage assignment and estimation of family contributions to pooled samples.

    PubMed

    Henshall, John M; Dierens, Leanne; Sellars, Melony J

    2014-09-02

While much attention has focused on the development of high-density single nucleotide polymorphism (SNP) assays, the costs of developing and running low-density assays have fallen dramatically. This makes it feasible to develop and apply SNP assays for agricultural species beyond the major livestock species. Although low-cost low-density assays may not have the accuracy of the high-density assays widely used in human and livestock species, we show that when combined with statistical analysis approaches that use quantitative instead of discrete genotypes, their utility may be improved. The data used in this study are from a 63-SNP marker Sequenom® iPLEX Platinum panel for the Black Tiger shrimp, for which high-density SNP assays are not currently available. For quantitative genotypes that could be estimated, in 5% of cases the most likely genotype for an individual at a SNP had a probability of less than 0.99. Matrix formulations of maximum likelihood equations for parentage assignment were developed for the quantitative genotypes and also for discrete genotypes perturbed by an assumed error term. Assignment rates that were based on maximum likelihood with quantitative genotypes were similar to those based on maximum likelihood with perturbed genotypes but, for more than 50% of cases, the two methods resulted in individuals being assigned to different families. Treating genotypes as quantitative values allows the same analysis framework to be used for pooled samples of DNA from multiple individuals. Resulting correlations between allele frequency estimates from pooled DNA and individual samples were consistently greater than 0.90, and as high as 0.97 for some pools. Estimates of family contributions to the pools based on quantitative genotypes in pooled DNA had a correlation of 0.85 with estimates of contributions from DNA-derived pedigree. Even with low numbers of SNPs of variable quality, parentage testing and family assignment from pooled samples are sufficiently accurate to provide useful information for a breeding program. Treating genotypes as quantitative values is an alternative to perturbing genotypes using an assumed error distribution, but the two can produce very different results. An understanding of the distribution of the error is required for SNP genotyping platforms.
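
The notion of a quantitative genotype can be illustrated with a small sketch: each individual carries posterior probabilities over the three genotype classes rather than a hard call, and allele frequencies follow from expected allele dosages. The probability vectors below are invented for illustration:

```python
import numpy as np

# Each individual's genotype at one SNP is represented quantitatively as
# posterior probabilities over (AA, AB, BB) rather than a discrete call.
# The rows below are illustrative, not from the shrimp panel.
probs = np.array([
    [0.98, 0.02, 0.00],
    [0.05, 0.90, 0.05],
    [0.00, 0.03, 0.97],
    [0.80, 0.20, 0.00],
    [0.10, 0.85, 0.05],
])

# Expected B-allele dosage per individual: 0*P(AA) + 1*P(AB) + 2*P(BB)
dosage = probs @ np.array([0.0, 1.0, 2.0])

# B-allele frequency estimate from the quantitative genotypes; for a DNA pool
# of the same individuals, the analogous quantity is the pooled signal ratio.
freq_individuals = dosage.mean() / 2.0
print(f"B allele frequency estimate: {freq_individuals:.3f}")
```

Because the dosage is a continuous value, the same estimator applies unchanged whether the input is per-individual genotype probabilities or an intensity-derived average over a pool, which is the point the abstract makes.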

  5. Measurement of lung expansion with computed tomography and comparison with quantitative histology.

    PubMed

    Coxson, H O; Mayo, J R; Behzad, H; Moore, B J; Verburgt, L M; Staples, C A; Paré, P D; Hogg, J C

    1995-11-01

    The total and regional lung volumes were estimated from computed tomography (CT), and the pleural pressure gradient was determined by using the milliliters of gas per gram of tissue estimated from the X-ray attenuation values and the pressure-volume curve of the lung. The data show that CT accurately estimated the volume of the resected lobe but overestimated its weight by 24 +/- 19%. The volume of gas per gram of tissue was less in the gravity-dependent regions due to a pleural pressure gradient of 0.24 +/- 0.08 cmH2O/cm of descent in the thorax. The proportion of tissue to air obtained with CT was similar to that obtained by quantitative histology. We conclude that the CT scan can be used to estimate total and regional lung volumes and that measurements of the proportions of tissue and air within the thorax by CT can be used in conjunction with quantitative histology to evaluate lung structure.
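
The milliliters-of-gas-per-gram quantity can be sketched from first principles, assuming each voxel is a linear air/tissue mixture on the Hounsfield scale and an assumed tissue density of about 1.065 g/mL (both standard modeling assumptions, not values from the paper):

```python
# Air fraction from CT attenuation: a voxel with Hounsfield units (HU) between
# -1000 (pure air) and 0 (pure tissue/water) is modeled as a linear mixture.

def gas_per_gram(hu: float, tissue_density: float = 1.065) -> float:
    """mL of gas per gram of tissue in a voxel with the given HU value."""
    if not -1000.0 <= hu <= 0.0:
        raise ValueError("expected HU in [-1000, 0] for lung parenchyma")
    air_fraction = -hu / 1000.0           # fraction of voxel volume that is gas
    tissue_fraction = 1.0 - air_fraction  # remainder is tissue
    tissue_mass = tissue_fraction * tissue_density  # grams per mL of voxel
    return air_fraction / tissue_mass

# A typical inflated-lung voxel around -850 HU:
print(f"{gas_per_gram(-850):.2f} mL gas / g tissue")
```

Mapping each voxel's mL-gas-per-gram value against the lung's pressure-volume curve is then what allows the pleural pressure gradient to be inferred, as described in the abstract.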

  6. The dangerousness of mountain recreation: A quantitative overview of fatal and non-fatal accidents in France.

    PubMed

    Soulé, Bastien; Lefèvre, Brice; Boutroy, Eric

    2017-08-01

In France, there is a growing enthusiasm for outdoor sports and recreation. At the same time, the risk of both severe and frequent injury associated with active pursuits in mountain areas is acknowledged. This paper tackles accidents related to mountain sports, with a focus on three critical activities: hiking, mountaineering and ski touring. The aim is to estimate the actual number of accidents (whether they entailed deaths or injuries) in the three above-mentioned activities. In order to align the information collected, and then provide estimates based on a reasoned cross-referencing of these secondary data, we consulted and summarised all the databases available for the French territory. Results address trauma-related mortality in absolute values, and a comparison with the death rates of other sports. The calculation of a mortality index, including secondary mortality, is then provided. Elements of mountain sports accident rates are also presented, with the intent of clarifying how many casualties occur each year in the French mountains. Lastly, a focus on the particularities of each mountain sport is provided.

  7. Galactic Cosmic Ray Intensity in the Upcoming Minimum of the Solar Activity Cycle

    NASA Astrophysics Data System (ADS)

    Krainev, M. B.; Bazilevskaya, G. A.; Kalinin, M. S.; Svirzhevskaya, A. K.; Svirzhevskii, N. S.

    2018-03-01

During the prolonged and deep minimum of solar activity between cycles 23 and 24, an unusual behavior of the heliospheric characteristics and an increased intensity of galactic cosmic rays (GCRs) near the Earth's orbit were observed. The maximum of the current solar cycle 24 is lower than the previous one, and the decline in solar and, therefore, heliospheric activity is expected to continue in the next cycle. In these conditions, it is important, both for an understanding of the process of GCR modulation in the heliosphere and for applied purposes (evaluation of the radiation safety of planned space flights, etc.), to estimate quantitatively the possible GCR characteristics near the Earth in the upcoming solar minimum (2019-2020). Our estimation is based on the prediction of the heliospheric characteristics that are important for cosmic ray modulation, as well as on numerical calculations of GCR intensity. Additionally, we consider the distribution of the intensity and other GCR characteristics in the heliosphere and discuss the intercycle variations in the GCR characteristics that are integral for the whole heliosphere (total energy, mean energy, and charge).

  8. Estimation of tensile force in the hamstring muscles during overground sprinting.

    PubMed

    Ono, T; Higashihara, A; Shinohara, J; Hirose, N; Fukubayashi, T

    2015-02-01

The purpose of this study was to identify the period of the gait cycle during which the hamstring muscles are likely injured, by estimating the magnitude of the tensile force in each muscle during overground sprinting. We conducted three-dimensional motion analysis of 12 male athletes performing overground sprinting at their maximal speed and calculated the hamstring muscle-tendon length and joint angles of the right limb throughout a gait cycle during which the ground reaction force was measured. Electromyographic activity during sprinting was recorded for the biceps femoris long head, semitendinosus, and semimembranosus muscles of the ipsilateral limb. We estimated the magnitude of the tensile force in each muscle by using the length change that occurred in the muscle-tendon unit and the normalized electromyographic activity value. The study found a rapid increase of the estimated tensile force in the biceps femoris long head during the early stance phase of the gait cycle, during which the increased hip flexion angle and ground reaction force occurred at the same time. This study provides quantitative data on tensile force in the hamstring muscles, suggesting that the biceps femoris long head muscle is susceptible to strain injury during the early stance phase of the sprinting gait cycle. © Georg Thieme Verlag KG Stuttgart · New York.

  9. Patient-specific lean body mass can be estimated from limited-coverage computed tomography images.

    PubMed

    Devriese, Joke; Beels, Laurence; Maes, Alex; van de Wiele, Christophe; Pottel, Hans

    2018-06-01

In PET/CT, quantitative evaluation of tumour metabolic activity is possible through standardized uptake values, usually normalized for body weight (BW) or lean body mass (LBM). Patient-specific LBM can be estimated from whole-body (WB) CT images. As most clinical indications only warrant PET/CT examinations covering head to midthigh, the aim of this study was to develop a simple and reliable method to estimate LBM from limited-coverage (LC) CT images and test its validity. Head-to-toe PET/CT examinations were retrospectively retrieved and semiautomatically segmented into tissue types based on thresholding of CT Hounsfield units. LC was obtained by omitting image slices. Image segmentation was validated on the WB CT examinations by comparing CT-estimated BW with actual BW, and LBM estimated from LC images was compared with LBM estimated from WB images. A direct method and an indirect method were developed and validated on an independent data set. Comparing LBM estimated from LC examinations with estimates from WB examinations (LBMWB) showed a significant but limited bias of 1.2 kg (direct method) and a nonsignificant bias of 0.05 kg (indirect method). This study demonstrates that LBM can be estimated from LC CT images with no significant difference from LBMWB.
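
The HU-threshold segmentation step can be sketched as follows; the threshold ranges, voxel volume, and tissue densities are illustrative textbook-style assumptions, not the values used in the study:

```python
import numpy as np

# Toy sketch of HU-threshold tissue segmentation for lean body mass.
# Threshold ranges, voxel volume, and densities are assumptions for
# illustration only.

VOXEL_ML = 0.9          # assumed voxel volume in mL
LEAN_DENSITY = 1.05     # g/mL, assumed lean-tissue density
FAT_DENSITY = 0.95      # g/mL, assumed adipose density

def tissue_masses_kg(hu: np.ndarray) -> tuple:
    """Return (lean_kg, fat_kg) from a flat array of voxel HU values."""
    adipose = (hu >= -190) & (hu <= -30)   # adipose HU range (assumed)
    lean = (hu > -30) & (hu < 150)         # lean-tissue HU range (assumed)
    lean_kg = lean.sum() * VOXEL_ML * LEAN_DENSITY / 1000.0
    fat_kg = adipose.sum() * VOXEL_ML * FAT_DENSITY / 1000.0
    return lean_kg, fat_kg

rng = np.random.default_rng(0)
# Fake "CT volume": air, fat, two lean-range values, and bone-range voxels
hu = rng.choice([-500, -100, 40, 60, 300], size=100_000,
                p=[0.2, 0.2, 0.3, 0.25, 0.05])
lean_kg, fat_kg = tissue_masses_kg(hu)
print(f"lean ≈ {lean_kg:.1f} kg, fat ≈ {fat_kg:.1f} kg")
```

Summing segmented voxel masses over the imaged range is the WB estimate; the study's LC methods then correct for the omitted head and lower-leg slices.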

  10. Quantitative estimation of pesticide-likeness for agrochemical discovery.

    PubMed

    Avram, Sorin; Funar-Timofei, Simona; Borota, Ana; Chennamaneni, Sridhar Rao; Manchala, Anil Kumar; Muresan, Sorel

    2014-12-01

The design of chemical libraries, an early step in agrochemical discovery programs, is frequently addressed by means of qualitative physicochemical and/or topological rule-based methods. The aim of this study is to develop quantitative estimates of herbicide- (QEH), insecticide- (QEI), fungicide- (QEF), and, finally, pesticide-likeness (QEP). In the assessment of these definitions, we relied on the concept of desirability functions. We found a simple function, shared by the three classes of pesticides, parameterized individually for six easy-to-compute, independent, and interpretable molecular properties: molecular weight, logP, number of hydrogen bond acceptors, number of hydrogen bond donors, number of rotatable bonds, and number of aromatic rings. Subsequently, we describe the scoring of each pesticide class by the corresponding quantitative estimate. In a comparative study, we assessed the performance of the scoring functions using extensive datasets of patented pesticides. The hereby-established quantitative assessment has the ability to rank compounds whether or not they fail well-established pesticide-likeness rules, and offers an efficient way to prioritize (class-specific) pesticides. These findings are valuable for the efficient estimation of pesticide-likeness of vast chemical libraries in the field of agrochemical discovery. Graphical abstract: Quantitative models for pesticide-likeness were derived using the concept of desirability functions parameterized for six easy-to-compute, independent, and interpretable molecular properties: molecular weight, logP, number of hydrogen bond acceptors, number of hydrogen bond donors, number of rotatable bonds, and number of aromatic rings.
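
A desirability-based score of this kind can be sketched as a geometric mean of per-property desirability functions (the same pattern used by QED for drug-likeness). The Gaussian desirability centers and widths below are invented placeholders, not the fitted parameters from the paper:

```python
import math

# Sketch of a quantitative pesticide-likeness estimate: map each property
# through a desirability function d(x) in (0, 1], then take the geometric
# mean of the six desirabilities. The (center, width) pairs are illustrative.

PARAMS = {              # property: (ideal value, tolerance), all assumed
    "mw": (330.0, 120.0),
    "logp": (3.0, 1.8),
    "hba": (3.0, 2.0),
    "hbd": (1.0, 1.5),
    "rotb": (4.0, 3.0),
    "arom": (2.0, 1.5),
}

def desirability(x: float, center: float, width: float) -> float:
    """Gaussian-shaped desirability: 1 at the ideal value, decaying outward."""
    return math.exp(-((x - center) / width) ** 2)

def qep(props: dict) -> float:
    """Geometric mean of the six per-property desirabilities."""
    ds = [desirability(props[k], *PARAMS[k]) for k in PARAMS]
    return math.exp(sum(math.log(d) for d in ds) / len(ds))

compound = {"mw": 350.0, "logp": 2.5, "hba": 4, "hbd": 1, "rotb": 5, "arom": 2}
print(f"QEP = {qep(compound):.3f}")
```

The geometric mean penalizes a compound that is extreme in any single property, which is exactly why it ranks compounds smoothly instead of applying hard pass/fail rules.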

  11. Benchmarking on the evaluation of major accident-related risk assessment.

    PubMed

    Fabbri, Luciano; Contini, Sergio

    2009-03-15

    This paper summarises the main results of a European project BEQUAR (Benchmarking Exercise in Quantitative Area Risk Assessment in Central and Eastern European Countries). This project is among the first attempts to explore how independent evaluations of the same risk study associated with a certain chemical establishment could differ from each other and the consequent effects on the resulting area risk estimate. The exercise specifically aimed at exploring the manner and degree to which independent experts may disagree on the interpretation of quantitative risk assessments for the same entity. The project first compared the results of a number of independent expert evaluations of a quantitative risk assessment study for the same reference chemical establishment. This effort was then followed by a study of the impact of the different interpretations on the estimate of the overall risk on the area concerned. In order to improve the inter-comparability of the results, this exercise was conducted using a single tool for area risk assessment based on the ARIPAR methodology. The results of this study are expected to contribute to an improved understanding of the inspection criteria and practices used by the different national authorities responsible for the implementation of the Seveso II Directive in their countries. The activity was funded under the Enlargement and Integration Action of the Joint Research Centre (JRC), that aims at providing scientific and technological support for promoting integration of the New Member States and assisting the Candidate Countries on their way towards accession to the European Union.

  12. Characterization of bauxite residue (red mud) for 235U, 238U, 232Th and 40K using neutron activation analysis and the radiation dose levels as modeled by MCNP.

    PubMed

    Landsberger, S; Sharp, A; Wang, S; Pontikes, Y; Tkaczyk, A H

    2017-07-01

This study employs thermal and epithermal neutron activation analysis (NAA) to quantitatively and specifically determine absorbed dose rates to various body parts from uranium, thorium, and potassium. Specifically, a case study of bauxite residue (red mud) from an industrial facility was used to demonstrate the feasibility of the NAA approach for radiological safety assessment, using small sample sizes to ascertain the activities of 235U, 238U, 232Th and 40K. This proof-of-concept was shown to produce reliable results, and a similar approach could be used for quantitative assessment of other samples with possible radiological significance. 238U and 232Th were determined by epithermal and thermal neutron activation analysis, respectively. 235U was determined based on the known isotopic ratio of 238U/235U. 40K was also determined using epithermal neutron activation analysis, by measuring the total potassium content and then deriving the 40K activity from its natural isotopic abundance. Furthermore, the work demonstrates the application of Monte Carlo N-Particle (MCNP) simulations to estimate the radiation dose from large quantities of red mud, to assure the safety of humans and the surrounding environment. Phantoms were employed to observe the dose distribution throughout the human body, demonstrating radiation effects on each individual organ. Copyright © 2016 Elsevier Ltd. All rights reserved.
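
The step of deriving 235U from a measured 238U activity can be reproduced from standard nuclear data: for natural uranium, the activity ratio follows from the isotopic abundances and half-lives. The measured 238U activity below is hypothetical:

```python
# Estimating 235U activity from a measured 238U activity via the natural
# isotopic ratio. Half-lives and abundances are standard nuclear-data values;
# the measured activity is a hypothetical example.

HALF_LIFE_U238 = 4.468e9   # years
HALF_LIFE_U235 = 7.04e8    # years
ABUND_U238 = 0.992745      # natural atom fraction
ABUND_U235 = 0.0072        # natural atom fraction

# Activity A = lambda * N, so for the same natural-uranium sample:
# A235 / A238 = (N235 / N238) * (T1/2_238 / T1/2_235)
activity_ratio = (ABUND_U235 / ABUND_U238) * (HALF_LIFE_U238 / HALF_LIFE_U235)

a_u238 = 120.0  # Bq, hypothetical measured 238U activity in a red-mud sample
a_u235 = a_u238 * activity_ratio
print(f"A(235U) = {a_u235:.2f} Bq (activity ratio {activity_ratio:.4f})")
```

The ratio of about 0.046 is why 235U contributes only a few percent of the uranium activity in natural-composition materials such as red mud.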

  13. Quantitative studies on structure-DPPH• scavenging activity relationships of food phenolic acids.

    PubMed

    Jing, Pu; Zhao, Shu-Juan; Jian, Wen-Jie; Qian, Bing-Jun; Dong, Ying; Pang, Jie

    2012-11-01

Phenolic acids are potent antioxidants, yet the quantitative structure-activity relationships of phenolic acids remain unclear. The purpose of this study was to establish 3D-QSAR models able to predict phenolic acids with high DPPH• scavenging activity and to understand their structure-activity relationships. The model was established by using a training set of compounds with cross-validated q2 = 0.638/0.855, non-cross-validated r2 = 0.984/0.986, standard error of estimate = 0.236/0.216, and F = 139.126/208.320 for the best CoMFA/CoMSIA models. The predictive ability of the models was validated with the correlation coefficient r2(pred) = 0.971/0.996 (>0.6) for each model. Additionally, the contour map results suggested that structural characteristics of phenolic acids favorable for high DPPH• scavenging activity might include: (1) bulky and/or electron-donating substituent groups on the phenol ring; (2) electron-donating groups at the meta-position and/or hydrophobic groups at the meta-/ortho-position; (3) hydrogen-bond donor/electron-donating groups at the ortho-position. The results were confirmed through structural analyses of phenolic acids and their DPPH• scavenging data from eight recent publications. The findings may provide deeper insight into the antioxidant mechanisms and useful information for selecting phenolic acids for free radical scavenging properties.

  14. Quantitative methods for estimating the anisotropy of the strength properties and the phase composition of Mg-Al alloys

    NASA Astrophysics Data System (ADS)

    Betsofen, S. Ya.; Kolobov, Yu. R.; Volkova, E. F.; Bozhko, S. A.; Voskresenskaya, I. I.

    2015-04-01

Quantitative methods have been developed to estimate the anisotropy of the strength properties and to determine the phase composition of Mg-Al alloys. The efficiency of the methods is confirmed for MA5 alloy subjected to severe plastic deformation. It is shown that the Taylor factors calculated for basal slip, averaged over all orientations of a polycrystalline aggregate with allowance for texture, can be used for a quantitative estimation of the contribution of the texture of semifinished magnesium alloy products to the anisotropy of their strength properties. A technique for determining the composition of the solid solution and the content of the intermetallic phase Al12Mg17 is developed, using measurements of the lattice parameters of the solid solution and the known dependence of these lattice parameters on the composition.

  15. Quantitative Aging Pattern in Mouse Urine Vapor as Measured by Gas-Liquid Chromatography

    NASA Technical Reports Server (NTRS)

    Robinson, Arthur B.; Dirren, Henri; Sheets, Alan; Miquel, Jaime; Lundgren, Paul R.

    1975-01-01

    We have discovered a quantitative aging pattern in mouse urine vapor. The diagnostic power of the pattern has been found to be high. We hope that this pattern will eventually allow quantitative estimates of physiological age and some insight into the biochemistry of aging.

  16. Genomic Quantitative Genetics to Study Evolution in the Wild.

    PubMed

    Gienapp, Phillip; Fior, Simone; Guillaume, Frédéric; Lasky, Jesse R; Sork, Victoria L; Csilléry, Katalin

    2017-12-01

Quantitative genetic theory provides a means of estimating the evolutionary potential of natural populations. However, this approach was previously only feasible in systems where the genetic relatedness between individuals could be inferred from pedigrees or experimental crosses. The genomic revolution opened up the possibility of obtaining the realized proportion of genome shared among individuals in natural populations of virtually any species, which promises (more) accurate estimates of quantitative genetic parameters. Such a 'genomic' quantitative genetics approach relies on fewer assumptions, offers greater methodological flexibility, and is thus expected to greatly enhance our understanding of evolution in natural populations, for example, in the context of adaptation to environmental change, eco-evolutionary dynamics, and biodiversity conservation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data, we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result is a coherent quantitative consumer-phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
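
The distribution-fitting-plus-bootstrap pattern can be sketched briefly; the storage-time data below are invented, and an exponential model (whose maximum-likelihood mean is the sample mean) stands in for whichever distribution the survey analysis actually selected:

```python
import random
import statistics

# Sketch of the survey-analysis idea: fit a parametric distribution to
# reported storage times, then bootstrap the parameter to describe
# uncertainty. The data values are illustrative, not from the survey.

storage_days = [1, 2, 2, 3, 1, 4, 2, 5, 3, 2, 1, 6, 2, 3, 4]

def fit_mean(sample):
    # For an exponential model, the MLE of the mean is the sample mean.
    return statistics.fmean(sample)

random.seed(42)
# Nonparametric bootstrap: resample with replacement, refit each time
boot = [fit_mean(random.choices(storage_days, k=len(storage_days)))
        for _ in range(2000)]
boot.sort()
lo, hi = boot[int(0.025 * len(boot))], boot[int(0.975 * len(boot))]
print(f"mean = {fit_mean(storage_days):.2f} days, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Separating variation over individuals (the fitted distribution) from uncertainty in its parameters (the bootstrap interval) is exactly the distinction the abstract draws.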

  18. Mixture model based joint-MAP reconstruction of attenuation and activity maps in TOF-PET

    NASA Astrophysics Data System (ADS)

    Hemmati, H.; Kamali-Asl, A.; Ghafarian, P.; Ay, M. R.

    2018-06-01

A challenge in obtaining quantitative positron emission tomography (PET) images is providing an accurate and patient-specific photon attenuation correction. In PET/MR scanners, the nature of MR signals and hardware limitations have made attenuation map extraction a real challenge. Except for a constant factor, the activity and attenuation maps on a TOF-PET system can be determined from emission data by the maximum likelihood reconstruction of attenuation and activity (MLAA) approach. The aim of the present study is to constrain the joint estimation of activity and attenuation for PET systems using a mixture-model prior based on the attenuation map histogram. This novel prior enforces non-negativity, and its hyperparameters can be estimated using a mixture decomposition step from the current estimate of the attenuation map. The proposed method can also help solve the scaling problem and is capable of assigning predefined regional attenuation coefficients, with some degree of confidence, to the attenuation map, similar to segmentation-based attenuation correction approaches. The performance of the algorithm is studied with numerical and Monte Carlo simulations and a phantom experiment, and is compared with the MLAA algorithm with and without a smoothing prior. The results demonstrate that the proposed algorithm is capable of producing cross-talk-free activity and attenuation images from emission data. The proposed approach has the potential to be a practical and competitive method for joint reconstruction of activity and attenuation maps from emission data on PET/MR and can be integrated with other methods.

  19. Linearization improves the repeatability of quantitative dynamic contrast-enhanced MRI.

    PubMed

    Jones, Kyle M; Pagel, Mark D; Cárdenas-Rodríguez, Julio

    2018-04-01

The purpose of this study was to compare the repeatability of the linear and nonlinear Tofts and reference region models (RRM) for dynamic contrast-enhanced MRI (DCE-MRI). Simulated and experimental DCE-MRI data from 12 rats with a flank tumor of C6 glioma, acquired over three consecutive days, were analyzed using four quantitative and four semi-quantitative DCE-MRI metrics. The quantitative methods used were: 1) the linear Tofts model (LTM), 2) the nonlinear Tofts model (NTM), 3) the linear RRM (LRRM), and 4) the nonlinear RRM (NRRM). The following semi-quantitative metrics were used: 1) maximum enhancement ratio (MER), 2) time to peak (TTP), 3) initial area under the curve (iauc64), and 4) slope. LTM and NTM were used to estimate Ktrans, while LRRM and NRRM were used to estimate Ktrans relative to muscle (RKtrans). Repeatability was assessed by calculating the within-subject coefficient of variation (wSCV) and the percent intra-subject variation (iSV) determined with the Gage R&R analysis. The iSV for RKtrans using LRRM was two-fold lower than with NRRM under all simulated and experimental conditions. A similar trend was observed for the Tofts model, where LTM was at least 50% more repeatable than NTM under all experimental and simulated conditions. The semi-quantitative metrics iauc64 and MER were as repeatable as Ktrans and RKtrans estimated by LTM and LRRM, respectively. The iSV for iauc64 and MER were significantly lower than the iSV for slope and TTP. In simulations and experimental results, linearization improves the repeatability of quantitative DCE-MRI by at least 30%, making it as repeatable as semi-quantitative metrics. Copyright © 2017 Elsevier Inc. All rights reserved.
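
The linearization that makes LTM more repeatable can be sketched on synthetic data: integrating the Tofts differential equation turns the fit into a linear least-squares problem in (Ktrans, kep). The arterial input function and parameter values below are invented for illustration:

```python
import numpy as np

# Linearized Tofts model: the ODE
#     dCt/dt = Ktrans * Cp(t) - kep * Ct(t)
# integrates to
#     Ct(t) = Ktrans * int(Cp) - kep * int(Ct),
# which is linear in (Ktrans, kep) and solvable by least squares.
# The synthetic curves below are illustrative, not patient data.

t = np.linspace(0, 5, 200)                  # minutes
cp = 5.0 * t * np.exp(-1.5 * t)             # toy arterial input function
ktrans_true, kep_true = 0.25, 0.6           # assumed "true" parameters

# Forward-simulate the tissue curve Ct with simple Euler integration
ct = np.zeros_like(t)
dt = t[1] - t[0]
for i in range(1, len(t)):
    dct = ktrans_true * cp[i - 1] - kep_true * ct[i - 1]
    ct[i] = ct[i - 1] + dct * dt

# Build the linear system from cumulative trapezoidal integrals
int_cp = np.concatenate([[0.0], np.cumsum((cp[1:] + cp[:-1]) / 2 * dt)])
int_ct = np.concatenate([[0.0], np.cumsum((ct[1:] + ct[:-1]) / 2 * dt)])
A = np.column_stack([int_cp, -int_ct])
(ktrans_est, kep_est), *_ = np.linalg.lstsq(A, ct, rcond=None)
print(f"Ktrans = {ktrans_est:.3f} (true {ktrans_true})")
```

Because the linear fit has a closed-form solution, it avoids the starting-value sensitivity and local minima of nonlinear curve fitting, which is one plausible reason for the repeatability gain the study reports.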

  20. Quantitative Estimates of the Social Benefits of Learning, 1: Crime. Wider Benefits of Learning Research Report.

    ERIC Educational Resources Information Center

    Feinstein, Leon

    The cost benefits of lifelong learning in the United Kingdom were estimated, based on quantitative evidence. Between 1975-1996, 43 police force areas in England and Wales were studied to determine the effect of wages on crime. It was found that a 10 percent rise in the average pay of those on low pay reduces the overall area property crime rate by…

  1. Modeling Cape- and Ridge-Associated Marine Sand Deposits; A Focus on the U.S. Atlantic Continental Shelf

    USGS Publications Warehouse

    Bliss, James D.; Williams, S. Jeffress; Bolm, Karen S.

    2009-01-01

    Cape- and ridge-associated marine sand deposits, which accumulate on storm-dominated continental shelves that are undergoing Holocene marine transgression, are particularly notable in a segment of the U.S. Atlantic Continental Shelf that extends southward from the east tip of Long Island, N.Y., and eastward from Cape May at the south end of the New Jersey shoreline. These sand deposits commonly contain sand suitable for shore protection in the form of beach nourishment. Increasing demand for marine sand raises questions about both short- and long-term potential supply and the sustainability of beach nourishment with the prospects of accelerating sea-level rise and increasing storm activity. To address these important issues, quantitative assessments of the volume of marine sand resources are needed. Currently, the U.S. Geological Survey is undertaking these assessments through its national Marine Aggregates and Resources Program (URL http://woodshole.er.usgs.gov/project-pages/aggregates/). In this chapter, we present a hypothetical example of a quantitative assessment of cape-and ridge-associated marine sand deposits in the study area, using proven tools of mineral-resource assessment. Applying these tools requires new models that summarize essential data on the quantity and quality of these deposits. Two representative types of model are descriptive models, which consist of a narrative that allows for a consistent recognition of cape-and ridge-associated marine sand deposits, and quantitative models, which consist of empirical statistical distributions that describe significant deposit characteristics, such as volume and grain-size distribution. 
Variables of the marine sand deposits considered for quantitative modeling in this study include area, thickness, mean grain size, grain sorting, volume, proportion of sand-dominated facies, and spatial density, of which spatial density is particularly helpful in estimating the number of undiscovered deposits within an assessment area. A Monte Carlo simulation that combines the volume of sand-dominated-facies models with estimates of the hypothetical probable number of undiscovered deposits provides a probabilistic approach to estimating marine sand resources within parts of the U.S. Atlantic Continental Shelf and other comparable marine shelves worldwide.
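The Monte Carlo combination described above can be sketched in a few lines: draw a number of undiscovered deposits, draw a volume for each, and accumulate the totals into a probabilistic resource estimate. All distributions and parameters below are illustrative placeholders, not the USGS models' actual values.

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 20_000

# Hypothetical inputs: deposit count ~ Poisson, deposit volume ~ lognormal
mean_deposits = 12                     # expected undiscovered deposits
vol_mu, vol_sigma = np.log(20e6), 0.8  # log-volume parameters (m^3)

totals = np.empty(n_trials)
for i in range(n_trials):
    n = rng.poisson(mean_deposits)
    totals[i] = rng.lognormal(vol_mu, vol_sigma, n).sum()

# Report the distribution of total sand volume across trials
p10, p50, p90 = np.percentile(totals, [10, 50, 90])
print(f"P10={p10:.3g}  P50={p50:.3g}  P90={p90:.3g} m^3")
```

Reporting percentiles rather than a single number is what makes the assessment probabilistic: it conveys both the expected resource and the spread due to uncertainty in deposit count and size.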

  2. Variants in CPT1A, FADS1, and FADS2 are Associated with Higher Levels of Estimated Plasma and Erythrocyte Delta-5 Desaturases in Alaskan Eskimos.

    PubMed

    Voruganti, V Saroja; Higgins, Paul B; Ebbesson, Sven O E; Kennish, John; Göring, Harald H H; Haack, Karin; Laston, Sandra; Drigalenko, Eugene; Wenger, Charlotte R; Harris, William S; Fabsitz, Richard R; Devereux, Richard B; Maccluer, Jean W; Curran, Joanne E; Carless, Melanie A; Johnson, Matthew P; Moses, Eric K; Blangero, John; Umans, Jason G; Howard, Barbara V; Cole, Shelley A; Comuzzie, Anthony Gean

    2012-01-01

    The delta-5 and delta-6 desaturases (D5D and D6D), encoded by the fatty acid desaturase 1 (FADS1) and 2 (FADS2) genes, respectively, are rate-limiting enzymes in the metabolism of ω-3 and ω-6 fatty acids. The objective of this study was to identify genes influencing variation in estimated D5D and D6D activities in plasma and erythrocytes in Alaskan Eskimos (n = 761) participating in the Genetics of Coronary Artery Disease in Alaska Natives (GOCADAN) study. Desaturase activity was estimated by the product:precursor ratio of polyunsaturated fatty acids. We found evidence of linkage for estimated erythrocyte D5D (eD5D) on chromosome 11q12-q13 (logarithm of odds score = 3.5). The confidence interval contains the candidate genes FADS1, FADS2, 7-dehydrocholesterol reductase (DHCR7), and carnitine palmitoyltransferase 1A, liver (CPT1A). Measured genotype analysis found association between CPT1A, FADS1, and FADS2 single-nucleotide polymorphisms (SNPs) and estimated eD5D activity (p-values between 10^-28 and 10^-5). A Bayesian quantitative trait nucleotide analysis showed that rs3019594 in CPT1A, rs174541 in FADS1, and rs174568 in FADS2 had posterior probabilities > 0.8, thereby demonstrating significant statistical support for a functional effect on eD5D activity. Highly significant associations of FADS1, FADS2, and CPT1A transcripts with their respective SNPs (p-values between 10^-75 and 10^-7) in Mexican Americans of the San Antonio Family Heart Study corroborated our results. These findings strongly suggest a functional role for FADS1, FADS2, and CPT1A SNPs in the variation in eD5D activity.

  3. A Test of the Active-Day Fraction Method of Sunspot Group Number Calibration: Dependence on the Level of Solar Activity

    NASA Astrophysics Data System (ADS)

    Willamo, T.; Usoskin, I. G.; Kovaltsov, G. A.

    2018-04-01

    The method of active-day fraction (ADF) was proposed recently to calibrate different solar observers to standard observational conditions. The result of the calibration may depend on the overall level of solar activity during the observational period. This dependence is studied quantitatively using data of the Royal Greenwich Observatory by formally calibrating synthetic pseudo-observers against the full reference dataset. It is shown that the sunspot group number is estimated precisely by the ADF method for periods of moderate activity, may be slightly underestimated by 0.5 - 1.5 groups (≤ 10%) for strong and very strong activity, and is strongly overestimated by up to 2.5 groups (≤ 30%) for weak-to-moderate activity. The ADF method becomes inapplicable for periods of grand minima of activity. In general, the ADF method tends to overestimate the overall level of activity and to reduce the long-term trends.

  4. Repeatability Assessment by ISO 11843-7 in Quantitative HPLC for Herbal Medicines.

    PubMed

    Chen, Liangmian; Kotani, Akira; Hakamata, Hideki; Tsutsumi, Risa; Hayashi, Yuzuru; Wang, Zhimin; Kusu, Fumiyo

    2015-01-01

    We have proposed an assessment method to estimate the measurement relative standard deviation (RSD) of chromatographic peaks in quantitative HPLC for herbal medicines by the methodology of ISO 11843 Part 7 (ISO 11843-7:2012), which provides detection limits stochastically. In quantitative HPLC with UV detection (HPLC-UV) of Scutellaria Radix for the determination of baicalin, the measurement RSD of baicalin obtained stochastically by ISO 11843-7:2012 was within the 95% confidence interval of the RSD obtained statistically by repetitive measurements (n = 6). Thus, our findings show that the method is applicable for estimating the repeatability of HPLC-UV for determining baicalin without repeated measurements. In addition, the allowable limit of the "System repeatability" in "Liquid Chromatography" regulated in a pharmacopoeia can be obtained by the present assessment method. Moreover, the present assessment method was also successfully applied to estimate the measurement RSDs of quantitative three-channel liquid chromatography with electrochemical detection (LC-3ECD) of Chrysanthemi Flos for determining caffeoylquinic acids and flavonoids. By the present repeatability assessment method, reliable measurement RSDs were obtained stochastically, and the experimental time was remarkably reduced.
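The statistical benchmark in this comparison — the RSD from n = 6 replicate injections and its 95% confidence interval — can be computed directly. A minimal sketch with illustrative peak areas (not the paper's data), using the chi-square interval for a standard deviation with the df = 5 quantiles written in as constants:

```python
import numpy as np

# Hypothetical replicate peak areas for one analyte (n = 6)
areas = np.array([10523., 10481., 10562., 10498., 10533., 10510.])

n = len(areas)
mean = areas.mean()
sd = areas.std(ddof=1)
rsd = 100 * sd / mean                  # relative standard deviation, %

# 95% CI for sd via chi-square; quantiles for df = 5:
chi2_lo, chi2_hi = 0.8312, 12.8325    # chi2.ppf(0.025, 5), chi2.ppf(0.975, 5)
ci = (100 * sd * np.sqrt((n - 1) / chi2_hi) / mean,
      100 * sd * np.sqrt((n - 1) / chi2_lo) / mean)
print(f"RSD = {rsd:.2f}%  95% CI = ({ci[0]:.2f}%, {ci[1]:.2f}%)")
```

The paper's point is that the stochastic ISO 11843-7 estimate should fall inside an interval like this one without the six replicate runs ever being performed.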

  5. NEXRAD quantitative precipitation estimates, data acquisition, and processing for the DuPage County, Illinois, streamflow-simulation modeling system

    USGS Publications Warehouse

    Ortel, Terry W.; Spies, Ryan R.

    2015-11-19

    Next-Generation Radar (NEXRAD) has become an integral component in the estimation of precipitation (Kitzmiller and others, 2013). The high spatial and temporal resolution of NEXRAD has revolutionized the ability to estimate precipitation across vast regions, which is especially beneficial in areas without a dense rain-gage network. With the improved precipitation estimates, hydrologic models can produce reliable streamflow forecasts for areas across the United States. NEXRAD data from the National Weather Service (NWS) has been an invaluable tool used by the U.S. Geological Survey (USGS) for numerous projects and studies; NEXRAD data processing techniques similar to those discussed in this Fact Sheet have been developed within the USGS, including the NWS Quantitative Precipitation Estimates archive developed by Blodgett (2013).

  6. The neuroplastic effect of working memory training in healthy volunteers and patients with schizophrenia: Implications for cognitive rehabilitation.

    PubMed

    Li, Xu; Xiao, Ya-hui; Zhao, Qing; Leung, Ada W W; Cheung, Eric F C; Chan, Raymond C K

    2015-08-01

    We conducted an activation likelihood estimation (ALE) meta-analysis to quantitatively review the existing working memory (WM) training studies that investigated neural activation changes both in healthy individuals and patients with schizophrenia. ALE analysis of studies in healthy individuals indicates a widespread distribution of activation changes with WM training in the frontal and parietal regions, especially the dorsolateral prefrontal cortex, the medial frontal cortex and the precuneus, as well as subcortical regions such as the insula and the striatum. WM training is also accompanied by activation changes in patients with schizophrenia, mainly in the dorsolateral prefrontal cortex, the precuneus and the fusiform gyrus. Our results demonstrate that WM training is accompanied by changes in neural activation patterns in healthy individuals, which may provide the basis for understanding neuroplastic changes in patients with schizophrenia. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Rapid Quantitation of Ascorbic and Folic Acids in SRM 3280 Multivitamin/Multielement Tablets using Flow-Injection Tandem Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhandari, Deepak; Kertesz, Vilmos; Van Berkel, Gary J

    RATIONALE: Ascorbic acid (AA) and folic acid (FA) are water-soluble vitamins that are commonly fortified in food and dietary supplements. For the safety of human health, proper intake of these vitamins is recommended. Improvement in the analysis time required for the quantitative determination of these vitamins in food and nutritional formulations is desired. METHODS: A simple and fast (~5 min) in-tube sample preparation was performed, independently for FA and AA, by mixing extraction solvent with a powdered sample aliquot followed by agitation, centrifugation, and filtration to recover an extract for analysis. Quantitative detection was achieved by flow-injection (1 μL injection volume) electrospray ionization tandem mass spectrometry (ESI-MS/MS) in negative ion mode using the method of standard addition. RESULTS: The method of standard addition was employed for the quantitative estimation of each vitamin in a sample extract. At least two spiked and one non-spiked sample extracts were injected in triplicate for each quantitative analysis. Given an injection-to-injection interval of approximately 2 min, about 18 min was required to complete the quantitative estimation of each vitamin. The concentration values obtained for the respective vitamins in standard reference material (SRM) 3280 using this approach were within the statistical range of the certified values provided in the NIST Certificate of Analysis. The estimated limits of detection of FA and AA were 13 and 5.9 ng/g, respectively. CONCLUSIONS: Flow-injection ESI-MS/MS was successfully applied for the rapid quantitation of FA and AA in SRM 3280 multivitamin/multielement tablets.
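In the method of standard addition, the unknown concentration is read from the x-intercept of the response vs. added-concentration line (concentration = intercept/slope). A generic sketch with hypothetical numbers, not the paper's measurements:

```python
import numpy as np

# Added standard concentration (ng/g) and measured ESI-MS/MS response,
# three injections at each of three spike levels (hypothetical values)
added = np.array([0.0, 0.0, 0.0, 50.0, 50.0, 50.0, 100.0, 100.0, 100.0])
signal = np.array([120., 118., 122., 238., 241., 236., 360., 355., 362.])

# Fit signal = slope*added + intercept; the magnitude of the x-intercept
# gives the analyte concentration already present in the extract
slope, intercept = np.polyfit(added, signal, 1)
conc = intercept / slope
print(f"estimated concentration ≈ {conc:.1f} ng/g")
```

Standard addition is used here because it corrects for matrix effects: the calibration is built inside the sample matrix itself rather than in clean solvent.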

  8. Correlates of a single cortical action potential in the epidural EEG

    PubMed Central

    Teleńczuk, Bartosz; Baker, Stuart N; Kempter, Richard; Curio, Gabriel

    2015-01-01

    To identify the correlates of a single cortical action potential in the surface EEG, we recorded epidural EEG and single-unit activity simultaneously in the primary somatosensory cortex of awake macaque monkeys. By averaging over EEG segments coincident with more than a hundred thousand single spikes, we found short-lived (≈ 0.5 ms) triphasic EEG deflections dominated by high-frequency components > 800 Hz. The peak-to-peak amplitude of the grand-averaged spike correlate was 80 nV, which matched theoretical predictions, while single-neuron amplitudes ranged from 12 to 966 nV. Combining these estimates with post-stimulus-time histograms of single-unit responses to median-nerve stimulation allowed us to predict the shape of the evoked epidural EEG response and to estimate the number of contributing neurons. These findings establish the spiking activity of cortical neurons as a primary building block of the high-frequency epidural EEG, which can thus serve as a quantitative macroscopic marker of neuronal spikes. PMID:25554430
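Spike-triggered averaging of this kind — aligning EEG segments on spike times and averaging until a tiny waveform emerges from noise — can be sketched as follows. The data are synthetic, and the amplitudes, rates, and window length are illustrative, not the study's:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 5000                      # sampling rate (Hz)
n = fs * 60                    # one minute of simulated "EEG"
eeg = rng.normal(0, 10.0, n)   # background noise, arbitrary units

# Embed a tiny triphasic waveform at each "spike" time
waveform = 0.5 * np.array([0.0, 1.0, -1.5, 0.8, 0.0])
spikes = rng.integers(10, n - 10, size=20000)
for s in spikes:
    eeg[s:s + len(waveform)] += waveform

# Spike-triggered average over a +/-10-sample window around each spike
win = 10
sta = np.mean([eeg[s - win:s + win] for s in spikes], axis=0)
print(sta.round(2))
```

Averaging over N segments shrinks the noise by a factor of sqrt(N), which is why ~10^5 spikes suffice to resolve a deflection far below the single-trial noise floor.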

  9. Energy dissipation in slipping biological pumps.

    PubMed

    Kjelstrup, Signe; Rubi, J Miguel; Bedeaux, Dick

    2005-12-07

    We describe active transport in slipping biological pumps, using mesoscopic nonequilibrium thermodynamics. The pump operation is characterised by its stochastic nature and energy dissipation. We show how heating as well as cooling effects can be associated with pump operation. We use as an example the well studied active transport of Ca2+ across a biological membrane by means of its ATPase, and use published data to find values for the transport coefficients of the pump under various conditions. Most of the transport coefficients of the pump, including those that relate ATP hydrolysis or synthesis to thermal effects, are estimated. This can give a quantitative description of thermogenesis. We show by calculation that all of these coupling coefficients are significant.

  10. Line-Constrained Camera Location Estimation in Multi-Image Stereomatching.

    PubMed

    Donné, Simon; Goossens, Bart; Philips, Wilfried

    2017-08-23

    Stereomatching is an effective way of acquiring dense depth information from a scene when active measurements are not possible. So-called lightfield methods take a snapshot from many camera locations along a defined trajectory (usually uniformly linear or on a regular grid-we will assume a linear trajectory) and use this information to compute accurate depth estimates. However, they require the locations for each of the snapshots to be known: the disparity of an object between images is related to both the distance of the camera to the object and the distance between the camera positions for both images. Existing solutions use sparse feature matching for camera location estimation. In this paper, we propose a novel method that uses dense correspondences to do the same, leveraging an existing depth estimation framework to also yield the camera locations along the line. We illustrate the effectiveness of the proposed technique for camera location estimation both visually for the rectification of epipolar plane images and quantitatively with its effect on the resulting depth estimation. Our proposed approach yields a valid alternative for sparse techniques, while still being executed in a reasonable time on a graphics card due to its highly parallelizable nature.
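The geometric relation the method relies on is d = f·B/Z, linking disparity d to focal length f, camera spacing B, and depth Z. Given dense depth estimates, the spacing between two snapshots can therefore itself be recovered by least squares. A toy sketch with synthetic correspondences (the values of f, B, and the noise level are assumptions, and this is not the paper's pipeline):

```python
import numpy as np

rng = np.random.default_rng(7)
f = 800.0                  # focal length in pixels (assumed)
B_true = 0.05              # true camera spacing (m)

Z = rng.uniform(2.0, 10.0, 500)                 # depths of matched points (m)
d = f * B_true / Z + rng.normal(0, 0.05, 500)   # noisy disparities (px)

# Least-squares estimate of the baseline from d ≈ (f/Z) * B
x = f / Z
B_est = (x @ d) / (x @ x)
print(B_est)   # ≈ 0.05
```

Using dense correspondences means every pixel with a depth estimate contributes an equation like this, rather than only a handful of sparse feature matches.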

  11. Analysis of ribosomal RNA stability in dead cells of wine yeast by quantitative PCR.

    PubMed

    Sunyer-Figueres, Merce; Wang, Chunxiao; Mas, Albert

    2018-04-02

    During wine production, some yeasts enter a Viable But Not Culturable (VBNC) state, which may influence the quality and stability of the final wine through remnant metabolic activity or by resuscitation. Culture-independent techniques are used for obtaining an accurate estimation of the number of live cells, and quantitative PCR could be the most accurate technique. As a marker of cell viability, rRNA was evaluated by analyzing its stability in dead cells. The species-specific stability of rRNA was tested in Saccharomyces cerevisiae, as well as in three species of non-Saccharomyces yeast (Hanseniaspora uvarum, Torulaspora delbrueckii and Starmerella bacillaris). High temperature and antimicrobial dimethyl dicarbonate (DMDC) treatments were efficient in lysing the yeast cells. rRNA gene and rRNA (as cDNA) were analyzed over 48 h after cell lysis by quantitative PCR. The results confirmed the stability of rRNA for 48 h after the cell lysis treatments. To sum up, rRNA may not be a good marker of cell viability in the wine yeasts that were tested. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Two forms of alpha-amylase in mantle tissue of Mytilus galloprovincialis: purification and molecular properties of form II.

    PubMed

    Lombraña, M; Suárez, P; San Juan, F

    2005-09-01

    alpha-Amylase activity has been shown for the first time in a non-digestive tissue from Mytilus galloprovincialis. alpha-amylase from mussel mantle tissue has been purified by affinity chromatography on insoluble starch, followed by gel-filtration chromatography on Superdex-200. The chromatographic and electrophoretic behaviour of M. galloprovincialis alpha-amylase and stability characteristics suggest two forms of this enzyme: one form forming stable aggregates (form I) and a monomeric form (form II) that is more abundant, active and unstable. Both forms show an inverse quantitative variation. Purified form II was highly unstable and the molecular mass was estimated to be 66 kDa by sodium dodecyl sulphate (SDS)-gel electrophoresis. Maximum activity was noted at pH 6.5 and 35 degrees C.

  13. A Pilot Study on Integrating Videography and Environmental Microbial Sampling to Model Fecal Bacterial Exposures in Peri-Urban Tanzania.

    PubMed

    Julian, Timothy R; Pickering, Amy J

    2015-01-01

    Diarrheal diseases are a leading cause of under-five mortality and morbidity in sub-Saharan Africa. Quantitative exposure modeling provides opportunities to investigate the relative importance of fecal-oral transmission routes (e.g. hands, water, food) responsible for diarrheal disease. Modeling, however, requires accurate descriptions of individuals' interactions with the environment (i.e., activity data). Such activity data are largely lacking for people in low-income settings. In the present study, we collected activity data and microbiological sampling data to develop a quantitative microbial exposure model for two female caretakers in peri-urban Tanzania. Activity data were combined with microbiological data of contacted surfaces and fomites (e.g. broom handle, soil, clothing) to develop example exposure profiles describing second-by-second estimates of fecal indicator bacteria (E. coli and enterococci) concentrations on the caretaker's hands. The study demonstrates the application and utility of video activity data to quantify exposure factors for people in low-income countries and apply these factors to understand fecal contamination exposure pathways. This study provides both a methodological approach for the design and implementation of larger studies, and preliminary data suggesting contacts with dirt and sand may be important mechanisms of hand contamination. Increasing the scale of activity data collection and modeling to investigate individual-level exposure profiles within target populations for specific exposure scenarios would provide opportunities to identify the relative importance of fecal-oral disease transmission routes.

  14. Maximal Predictability Approach for Identifying the Right Descriptors for Electrocatalytic Reactions.

    PubMed

    Krishnamurthy, Dilip; Sumaria, Vaidish; Viswanathan, Venkatasubramanian

    2018-02-01

    Density functional theory (DFT) calculations are routinely used to identify new material candidates that approach fundamental activity limits imposed by thermodynamics or scaling relations. DFT calculations are associated with inherent uncertainty, which limits the ability to delineate materials (distinguishability) that possess high activity. The development of error-estimation capabilities in DFT has enabled uncertainty propagation through activity-prediction models. In this work, we demonstrate an approach to propagating uncertainty through thermodynamic activity models, leading to a probability distribution of the computed activity and thereby its expectation value. A new metric, prediction efficiency, is defined, which provides a quantitative measure of the ability to distinguish the activity of materials and can be used to identify the optimal descriptor(s) ΔG_opt. We demonstrate the framework for four important electrochemical reactions: hydrogen evolution, chlorine evolution, oxygen reduction, and oxygen evolution. Future studies could utilize expected activity and prediction efficiency to significantly improve the prediction accuracy of highly active material candidates.
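The uncertainty-propagation idea can be illustrated with a toy volcano-type activity model: sample each material's descriptor ΔG from a Gaussian reflecting DFT error, push the samples through the activity model, and compare the resulting distributions. The descriptor values, error width, and activity form below are assumptions for illustration, not the paper's numbers or its prediction-efficiency metric:

```python
import numpy as np

rng = np.random.default_rng(3)
dG_opt = 0.0                      # assumed optimal descriptor value (eV)
sigma_dft = 0.1                   # assumed DFT error width (eV)

def activity(dG):
    """Toy volcano: activity falls off linearly away from the optimum."""
    return -np.abs(dG - dG_opt)

# Two candidate materials with nearby descriptor values (hypothetical)
dG_A, dG_B = 0.05, 0.12
samples_A = activity(rng.normal(dG_A, sigma_dft, 100_000))
samples_B = activity(rng.normal(dG_B, sigma_dft, 100_000))

# Expected activities and the probability that A outperforms B
p_A_better = np.mean(samples_A > samples_B)
print(samples_A.mean(), samples_B.mean(), p_A_better)
```

When p_A_better is close to 0.5, the DFT error makes the two candidates effectively indistinguishable — precisely the situation a distinguishability metric is meant to flag.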

  15. Voronovskaja's theorem revisited

    NASA Astrophysics Data System (ADS)

    Tachev, Gancho T.

    2008-07-01

    We present a new quantitative variant of Voronovskaja's theorem for the Bernstein operator. This estimate improves the recent quantitative versions of Voronovskaja's theorem for certain Bernstein-type operators obtained by H. Gonska, P. Pitul and I. Rasa in 2006.

  16. Effects of genes, sex, age, and activity on BMC, bone size, and areal and volumetric BMD.

    PubMed

    Havill, Lorena M; Mahaney, Michael C; L Binkley, Teresa; Specker, Bonny L

    2007-05-01

    Quantitative genetic analyses of bone data for 710 inter-related individuals 8-85 yr of age found high heritability estimates for BMC, bone area, and areal and volumetric BMD that varied across bone sites. Activity levels, especially time in moderate plus vigorous activity, had notable effects on bone. In some cases, these effects were age and sex specific. Genetic and environmental factors play a complex role in determining BMC, bone size, and BMD. This study assessed the heritability of bone measures; characterized the effects of age, sex, and physical activity on bone; and tested for age- and sex-specific bone effects of activity. Measures of bone size and areal and volumetric density (aBMD and vBMD, respectively) were obtained by DXA and pQCT on 710 related individuals (466 women) 8-85 yr of age. Measures of activity included percent time in moderate + vigorous activity (%ModVig), stair flights climbed per day, and miles walked per day. Quantitative genetic analyses were conducted to model the effects of activity and covariates on bone outcomes. Accounting for effects of age, sex, and activity levels, genes explained 40-62% of the residual variation in BMC and BMD and 27-75% in bone size (all p<0.001). Decline in femoral neck (FN), hip, and spine aBMD with advancing age was greater among women than men (age-by-sex interaction; all p

  17. The debt of nations and the distribution of ecological impacts from human activities

    PubMed Central

    Srinivasan, U. Thara; Carey, Susan P.; Hallstein, Eric; Higgins, Paul A. T.; Kerr, Amber C.; Koteen, Laura E.; Smith, Adam B.; Watson, Reg; Harte, John; Norgaard, Richard B.

    2008-01-01

    As human impacts to the environment accelerate, disparities in the distribution of damages between rich and poor nations mount. Globally, environmental change is dramatically affecting the flow of ecosystem services, but the distribution of ecological damages and their driving forces has not been estimated. Here, we conservatively estimate the environmental costs of human activities over 1961–2000 in six major categories (climate change, stratospheric ozone depletion, agricultural intensification and expansion, deforestation, overfishing, and mangrove conversion), quantitatively connecting costs borne by poor, middle-income, and rich nations to specific activities by each of these groups. Adjusting impact valuations for different standards of living across the groups as commonly practiced, we find striking imbalances. Climate change and ozone depletion impacts predicted for low-income nations have been overwhelmingly driven by emissions from the other two groups, a pattern also observed for overfishing damages indirectly driven by the consumption of fishery products. Indeed, through disproportionate emissions of greenhouse gases alone, the rich group may have imposed climate damages on the poor group greater than the latter's current foreign debt. Our analysis provides prima facie evidence for an uneven distribution pattern of damages across income groups. Moreover, our estimates of each group's share in various damaging activities are independent from controversies in environmental valuation methods. In a world increasingly connected ecologically and economically, our analysis is thus an early step toward reframing issues of environmental responsibility, development, and globalization in accordance with ecological costs. PMID:18212119

  18. Relationship between convective precipitation and lightning activity using radar quantitative precipitation estimates and total lightning data

    NASA Astrophysics Data System (ADS)

    Pineda, N.; Rigo, T.; Bech, J.; Argemí, O.

    2009-09-01

    Thunderstorms can be characterized by both rainfall and lightning. The relationship between convective precipitation and lightning activity may be used as an indicator of the rainfall regime. In addition, better knowledge of local thunderstorm phenomenology can be very useful in weather surveillance tasks. Two types of approach can be distinguished in the literature for analyzing rainfall and lightning activity. On the one hand, rain yields (the ratio of rain mass to cloud-to-ground flashes over a common area) have been calculated for long temporal and spatial domains, using rain-gauge records to estimate the amounts of precipitation. On the other hand, a case-by-case approach has been used in many studies to analyze the relationship between convective precipitation and lightning in individual storms, using weather radar data to estimate rainfall volumes. In a local thunderstorm case-study approach, the relation between rainfall and lightning is usually quantified as the rainfall-lightning ratio (RLR), which estimates the convective rainfall volume per lightning flash. Intense storms tend to produce lower RLR values than moderate storms, but the range of RLR found in diverse studies is quite wide. This relationship depends on thunderstorm type, local climatology, convective regime, the type of lightning flashes considered, oceanic versus continental storms, etc. The objective of this paper is to analyze the relationship between convective precipitation and lightning in a case-by-case approach, by means of daily radar-derived quantitative precipitation estimates (QPE) and total lightning data obtained from observations of the Servei Meteorològic de Catalunya remote sensing systems, which cover an area of approximately 50000 km2 in the NE of the Iberian Peninsula. The analyzed dataset is composed of 45 thunderstorm days from April to October 2008.
A good daily correlation has been found between the radar QPE and the CG flash counts (best linear fit with R^2 = 0.74). The daily RLR has a mean value of 86 × 10^3 m^3 of rainfall volume per CG flash. The daily range of variation is quite wide, going from 19 to 222 × 10^3 m^3 per CG flash. This variation has a seasonal component, related to changes in the convective regime. Summer days (July to mid-September) had a mean RLR of 57 × 10^3 m^3 of rainfall volume per CG flash, while from mid-September to the end of October the rainfall volume per CG flash doubled (mean of 125 × 10^3 m^3 per CG flash).
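The daily rainfall-lightning ratio and the linear fit behind the quoted R^2 can be reproduced in a few lines. The daily values below are synthetic stand-ins for the radar-QPE and flash-count dataset:

```python
import numpy as np

rng = np.random.default_rng(5)
n_days = 45
cg_flashes = rng.integers(100, 5000, n_days)            # daily CG flash counts
# Synthetic daily rainfall volume (m^3), roughly proportional to flash count
rain_volume = 86e3 * cg_flashes * rng.lognormal(0, 0.4, n_days)

# Daily rainfall-lightning ratio (m^3 of rain per CG flash)
rlr = rain_volume / cg_flashes
print(f"mean RLR = {rlr.mean():.3g} m^3/flash")

# Linear fit of rainfall volume vs. flash count, and R^2
slope, intercept = np.polyfit(cg_flashes, rain_volume, 1)
pred = slope * cg_flashes + intercept
r2 = 1 - np.sum((rain_volume - pred) ** 2) / np.sum(
    (rain_volume - rain_volume.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```

Splitting the same computation by season (summer vs. autumn days) would reproduce the kind of seasonal RLR contrast reported above.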

  19. Comparison of in silico models for prediction of mutagenicity.

    PubMed

    Bakhtyari, Nazanin G; Raitano, Giuseppa; Benfenati, Emilio; Martin, Todd; Young, Douglas

    2013-01-01

    Using a dataset with more than 6000 compounds, the performance of eight quantitative structure-activity relationship (QSAR) models was evaluated: ACD/Tox Suite; Absorption, Distribution, Metabolism, Elimination, and Toxicity of chemical substances (ADMET) Predictor; Derek; Toxicity Estimation Software Tool (T.E.S.T.); TOxicity Prediction by Komputer Assisted Technology (TOPKAT); Toxtree; CAESAR; and SARpy (SAR in python). In general, the results showed a high level of performance. To obtain a realistic estimate of predictive ability, the results for chemicals inside and outside the training set of each model were considered. The effect of applicability-domain tools (when available) on prediction accuracy was also evaluated. The predictive tools included QSAR models, knowledge-based systems, and combinations of both methods. Models based on statistical QSAR methods gave better results.
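Comparisons like this reduce to standard binary-classification statistics. A minimal sketch of computing accuracy, sensitivity, and specificity for mutagenicity predictions (hypothetical labels, not the benchmark dataset):

```python
# Hypothetical experimental labels and one model's predictions
# (1 = mutagenic, 0 = non-mutagenic)
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))

accuracy = (tp + tn) / len(y_true)
sensitivity = tp / (tp + fn)     # true-positive rate
specificity = tn / (tn + fp)     # true-negative rate
print(accuracy, sensitivity, specificity)  # 0.8 0.8 0.8 for these toy labels
```

Restricting the same computation to compounds inside a model's applicability domain is how the effect of applicability-domain tools on accuracy is quantified.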

  20. Estimation of Metabolism Characteristics for Heat-Injured Bacteria Using Dielectrophoretic Impedance Measurement Method

    NASA Astrophysics Data System (ADS)

    Amako, Eri; Enjoji, Takaharu; Uchida, Satoshi; Tochikubo, Fumiyoshi

    Constant monitoring and immediate control of fermentation processes have been required for advanced quality preservation in food industry. In the present work, simple estimation of metabolic states for heat-injured Escherichia coli (E. coli) in a micro-cell was investigated using dielectrophoretic impedance measurement (DEPIM) method. Temporal change in the conductance between micro-gap (ΔG) was measured for various heat treatment temperatures. In addition, the dependence of enzyme activity, growth capacity and membrane situation for E. coli on heat treatment temperature was also analyzed with conventional biological methods. Consequently, a correlation between ΔG and those biological properties was obtained quantitatively. This result suggests that DEPIM method will be available for an effective monitoring technique for complex change in various biological states of microorganisms.

  1. Direct Estimation of Optical Parameters From Photoacoustic Time Series in Quantitative Photoacoustic Tomography.

    PubMed

    Pulkkinen, Aki; Cox, Ben T; Arridge, Simon R; Goh, Hwan; Kaipio, Jari P; Tarvainen, Tanja

    2016-11-01

    Estimation of optical absorption and scattering of a target is an inverse problem associated with quantitative photoacoustic tomography. Conventionally, the problem is expressed as two folded. First, images of initial pressure distribution created by absorption of a light pulse are formed based on acoustic boundary measurements. Then, the optical properties are determined based on these photoacoustic images. The optical stage of the inverse problem can thus suffer from, for example, artefacts caused by the acoustic stage. These could be caused by imperfections in the acoustic measurement setting, of which an example is a limited view acoustic measurement geometry. In this work, the forward model of quantitative photoacoustic tomography is treated as a coupled acoustic and optical model and the inverse problem is solved by using a Bayesian approach. Spatial distribution of the optical properties of the imaged target are estimated directly from the photoacoustic time series in varying acoustic detection and optical illumination configurations. It is numerically demonstrated, that estimation of optical properties of the imaged target is feasible in limited view acoustic detection setting.

  2. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

    In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a method based on steady-state analysis with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, but slower than the recursive technique. The parameters of the equivalent system are utilized to interpret the storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided in the quantitative estimation of material properties.

  3. Uncertainty Quantification Techniques for Population Density Estimates Derived from Sparse Open Source Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Robert N; White, Devin A; Urban, Marie L

    2013-01-01

    The Population Density Tables (PDT) project at the Oak Ridge National Laboratory (www.ornl.gov) is developing population density estimates for specific human activities under normal patterns of life, based largely on information available in open sources. Currently, activity-based density estimates rest on simple summary statistics such as range and mean. Researchers are interested in improving activity estimation and uncertainty quantification by adopting a Bayesian framework that considers both data and sociocultural knowledge. Under a Bayesian approach, knowledge about population density may be encoded through the process of expert elicitation. Due to the scale of the PDT effort, which considers over 250 countries, spans 40 human activity categories, and includes numerous contributors, an elicitation tool is required that can be operationalized within an enterprise data collection and reporting system. Such a method would ideally require that the contributor have minimal statistical knowledge, require minimal input by a statistician or facilitator, consider human difficulties in expressing qualitative knowledge in a quantitative setting, and provide methods by which the contributor can appraise whether their understanding and associated uncertainty were well captured. This paper introduces an algorithm that transforms answers to simple, non-statistical questions into a bivariate Gaussian distribution as the prior for the Beta distribution. Based on geometric properties of the Beta distribution parameter feasibility space and the bivariate Gaussian distribution, an automated method for encoding is developed that meets these challenging enterprise requirements. Though created within the context of population density, this approach may be applicable to a wide array of problem domains requiring informative priors for the Beta distribution.
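The paper's actual encoding maps non-statistical questions to a bivariate Gaussian over the Beta parameter space; as a much simpler illustration of the underlying idea, and not the paper's algorithm, a Beta prior can be moment-matched to a hypothetical elicited mean and standard deviation:

```python
def beta_from_mean_sd(m, s):
    """Moment-match a Beta(a, b) prior to an elicited mean m and
    standard deviation s; feasible only when s**2 < m * (1 - m)."""
    nu = m * (1 - m) / s ** 2 - 1  # a + b, the prior "sample size"
    if nu <= 0:
        raise ValueError("standard deviation too large for a Beta prior")
    return m * nu, (1 - m) * nu

# e.g. elicited density fraction 0.2 +/- 0.1
a, b = beta_from_mean_sd(0.2, 0.1)
print(round(a, 6), round(b, 6))  # -> 3.0 12.0
```

The feasibility check mirrors the geometric constraint on the Beta parameter space mentioned in the abstract: not every (mean, sd) pair corresponds to a valid Beta distribution.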

  4. The Interrupted Power Law and the Size of Shadow Banking

    PubMed Central

    Fiaschi, Davide; Kondor, Imre; Marsili, Matteo; Volpati, Valerio

    2014-01-01

    Using public data (Forbes Global 2000) we show that the asset sizes of the largest global firms follow a Pareto distribution in an intermediate range that is “interrupted” by a sharp cut-off in its upper tail, where it is totally dominated by financial firms. This flattening of the distribution contrasts with a large body of empirical literature that finds a Pareto distribution for firm sizes both across countries and over time. Pareto distributions are generally traced back to a mechanism of proportional random growth, based on a regime of constant returns to scale. This makes our finding of an “interrupted” Pareto distribution all the more puzzling, because we provide evidence that financial firms in our sample should operate in such a regime. We claim that the missing mass from the upper tail of the asset size distribution is a consequence of shadow banking activity and that it provides an (upper) estimate of the size of the shadow banking system. This estimate, which we propose as a shadow banking index, compares well with estimates of the Financial Stability Board until 2009, but shows a sharper rise in shadow banking activity after 2010. Finally, we propose a proportional random growth model that reproduces the observed distribution, thereby providing a quantitative estimate of the intensity of shadow banking activity. PMID:24728096
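The paper models an interrupted Pareto tail; as a minimal sketch of just the first step, fitting the exponent of an uninterrupted Pareto range, the standard maximum-likelihood (Hill) estimator can be used (the data below are hypothetical, not the Forbes sample):

```python
import math

def pareto_tail_index(sizes, x_min):
    """Maximum-likelihood (Hill) estimate of the Pareto exponent alpha
    for observations at or above x_min: alpha_hat = n / sum(ln(x_i / x_min))."""
    tail = [x for x in sizes if x >= x_min]
    return len(tail) / sum(math.log(x / x_min) for x in tail)

# two tail observations at e*x_min and e^2*x_min give a log-sum of 1 + 2 = 3
print(round(pareto_tail_index([math.e, math.e ** 2, 0.5], 1.0), 4))  # -> 0.6667
```

Comparing the mass predicted by such a fit with the observed upper tail is, in spirit, how a "missing mass" estimate like the shadow banking index can be constructed.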

  5. Comparative assessment of endocrine modulators with oestrogenic activity: I. Definition of a hygiene-based margin of safety (HBMOS) for xeno-oestrogens against the background of European developments.

    PubMed

    Bolt, H M; Janning, P; Michna, H; Degen, G H

    2001-01-01

    A novel concept - the hygiene-based margin of safety (HBMOS) - is suggested for the assessment of the impact of potential endocrine modulators. It integrates exposure scenarios and potency data for industrial chemicals and naturally occurring dietary compounds with oestrogenic activity. An HBMOS is defined as a quotient of estimated daily intakes weighted by the relative in vivo potencies of these compounds. The Existing Chemicals Programme of the European Union provides Human and Environmental Risk Assessments of Existing Chemicals, which include human exposure scenarios. Such exposure scenarios, along with potency estimates for endocrine activities, may provide a basis for a quantitative comparison of the potential endocrine-modulating effects of industrial chemicals with those of endocrine modulators occurring as natural constituents of the human diet. Natural phyto-oestrogens exhibit oestrogenic activity in vitro and in vivo. Important phyto-oestrogens for humans are isoflavones (daidzein, genistein) and lignans, with the highest quantities found in soybeans and flaxseed, respectively. Daily isoflavone exposures calculated for infants on soy-based formulae were in the range of 4.5-8 mg/kg body wt.; estimates for adults range up to 1 mg/kg body wt. The Senate Commission on the Evaluation of Food Safety (SKLM) of the Deutsche Forschungsgemeinschaft has also indicated a wide range of dietary exposures. For matters of risk assessment, the SKLM has based recommendations on dietary exposure scenarios implying a daily intake of phyto-oestrogens in the order of 1 mg/kg body wt. On the basis of information compiled within the Existing Chemicals Programme of the EU, it appears that a daily human exposure to nonylphenol of 2 microg/kg body wt. may be a worst-case assumption, albeit one based on valid scenarios. The intake of octylphenol is much lower, due to a different use pattern and applications, and may be neglected.
Data from migration studies led to estimates of a maximal daily human uptake of bisphenol A of 1 microg/kg body wt. On the basis of comparative data from uterotrophic assays in rats, with three consecutive days of oral application, and taking the natural phyto-oestrogen daidzein as reference (= 1), relative uterotrophic activities in DA/Han rats follow the sequence: daidzein = 1; bisphenol A = 1; p-tert-octylphenol = 2; o,p'-DDT = 4; ethinyl oestradiol = 40,000. The values derived from exposure scenarios, together with these relative potency values and bridging assumptions, led to calculations of the HBMOS as a quantitative comparison of the potential endocrine-modulating effects of industrial chemicals with those of natural constituents of the human diet. HBMOS estimates for nonylphenol ranged between 250 and 500, depending on the bridging assumptions, and around 1000 for bisphenol A. The derivations of the HBMOS fully support the conclusions reached by the SKLM of the Deutsche Forschungsgemeinschaft. The estimated HBMOS values for the industrial chemicals (nonylphenol, bisphenol A) appear sufficiently high to ensure the absence of a practical risk to human health under present exposure conditions.
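The abstract defines the HBMOS as a quotient of potency-weighted daily intakes. As a hedged sketch of that arithmetic (the paper's exact weighting and bridging assumptions may differ), the bisphenol A figure can be reproduced from the numbers quoted above:

```python
def hbmos(dietary_intake, dietary_potency, chem_intake, chem_potency):
    """Hygiene-based margin of safety: potency-weighted dietary intake
    divided by potency-weighted intake of the industrial chemical.
    Both intakes must be in the same units (here microg/kg body wt./day)."""
    return (dietary_intake * dietary_potency) / (chem_intake * chem_potency)

# dietary phyto-oestrogens ~1 mg/kg (1000 microg/kg, relative potency 1)
# vs. bisphenol A ~1 microg/kg (relative uterotrophic potency 1)
print(hbmos(1000.0, 1.0, 1.0, 1.0))  # -> 1000.0
```

This matches the HBMOS of around 1000 reported for bisphenol A; the nonylphenol range of 250-500 reflects the additional bridging assumptions the paper applies.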

  6. A theoretical/experimental program to develop active optical pollution sensors: Quantitative remote Raman lidar measurements of pollutants from stationary sources

    NASA Technical Reports Server (NTRS)

    Poultney, S. K.; Brumfield, M. L.; Siviter, J. S.

    1975-01-01

    Typical pollutant gas concentrations at the stack exits of stationary sources can be estimated to be about 500 ppm under the present emission standards. Raman lidar has a number of advantages which makes it a valuable tool for remote measurements of these stack emissions. Tests of the Langley Research Center Raman lidar at a calibration tank indicate that night measurements of SO2 concentrations and stack opacity are possible. Accuracies of 10 percent are shown to be achievable from a distance of 300 m within 30 min integration times for 500 ppm SO2 at the stack exits. All possible interferences were examined quantitatively (except for the fluorescence of aerosols in actual stack emissions) and found to have negligible effect on the measurements. An early test at an instrumented stack is strongly recommended.

  7. Mechanism on brain information processing: Energy coding

    NASA Astrophysics Data System (ADS)

    Wang, Rubin; Zhang, Zhikang; Jiao, Xianfa

    2006-09-01

    According to the experimental result that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, the authors present a new scientific theory that offers a unique mechanism for brain information processing. They demonstrate that the neural coding produced by the activity of the brain is well described by the theory of energy coding. Owing to the energy coding model's ability to reveal mechanisms of brain information processing based upon known biophysical properties, they can not only reproduce various experimental results of neuroelectrophysiology but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, they anticipate that the theory has very important consequences for quantitative research on cognitive function.

  8. Effect of substituents on prediction of TLC retention of tetra-dentate Schiff bases and their Copper(II) and Nickel(II) complexes.

    PubMed

    Stevanović, Nikola R; Perušković, Danica S; Gašić, Uroš M; Antunović, Vesna R; Lolić, Aleksandar Đ; Baošić, Rada M

    2017-03-01

    The objectives of this study were to gain insight into structure-retention relationships and to propose a model for estimating retention. A chromatographic investigation of a series of 36 Schiff bases and their copper(II) and nickel(II) complexes was performed under both normal- and reverse-phase conditions. The chemical structures of the compounds were characterized by molecular descriptors, calculated from structure and related to the chromatographic retention parameters by multiple linear regression analysis. Effects of chelation on the retention parameters of the investigated compounds, under normal- and reverse-phase chromatographic conditions, were analyzed by principal component analysis. Quantitative structure-retention relationship and quantitative structure-activity relationship models were developed on the basis of theoretical molecular descriptors, calculated exclusively from molecular structure, and of parameters of retention and lipophilicity. Copyright © 2016 John Wiley & Sons, Ltd.

  9. Energy coding in biological neural networks

    PubMed Central

    Zhang, Zhikang

    2007-01-01

    According to the experimental result that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, we present a new scientific theory that offers a unique mechanism for brain information processing. We demonstrate that the neural coding produced by the activity of the brain is well described by our theory of energy coding. Owing to the energy coding model’s ability to reveal mechanisms of brain information processing based upon known biophysical properties, we can not only reproduce various experimental results of neuro-electrophysiology, but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, we anticipate that the theory has very important consequences for quantitative research on cognitive function. PMID:19003513

  10. Dynamic mechanical analysis of storage modulus development in light-activated polymer matrix composites.

    PubMed

    Sakaguchi, Ronald L; Shah, Nilam C; Lim, Bum Soon; Ferracane, Jack L; Borgersen, Svenn E

    2002-05-01

    The goal of this study was to evaluate the potential of dynamic mechanical analysis of a tubular geometry in a three-point flexure fixture for monitoring the storage modulus development of a light-activated polymer matrix composite. Composite samples were inserted into PTFE tubes and tested in a three-point bend fixture in a dynamic mechanical analyzer (DMA) at 200 Hz with 20 microm amplitude. Samples were light activated for 60 s (385 mW/cm(2) at the composite surface) and the storage modulus (E') was measured continuously for the seven light-activated materials studied (one microfill, four hybrids and two unfilled resins). Cores of composite were removed from the PTFE sheath after 13.5 min and evaluated with the same parameters in the DMA. A finite element model of the test configuration was created and used to estimate operating parameters for the DMA. Degree of conversion (DC) was measured using micro-Fourier transform infrared (FTIR) spectroscopy for the microfilled composite samples and one hybrid at 13.5 and 60 min after light activation. E' for a generic hybrid and a microfilled composite was 13,400+/-1100 and 5900+/-200 MPa, respectively, when cured within the tube and then removed and tested in the DMA. DC was 54.6% for the hybrid and 60.6% for the microfill. A linear regression of E' for the sheath and core versus the core alone (r(2)=0.986) indicated a linear scaling between the two, enabling a correction for estimated E' values of the composite core. This method estimates the storage modulus growth during light-activated polymerization of highly filled dimethacrylates. Although the approach is phenomenological, in that quantitative measurements of E' are not made directly from the DMA, estimates of early polymerization kinetics appear to be validated by three different approaches.

  11. Use of plasma creatine kinase pharmacokinetics to estimate the amount of exercise-induced muscle damage in Beagles.

    PubMed

    Chanoit, G P; Lefebvre, H P; Orcel, K; Laroute, V; Toutain, P L; Braun, J P

    2001-09-01

    To assess the effects of moderate exercise on plasma creatine kinase (CK) pharmacokinetics and to estimate exercise-induced muscle damage in dogs, 6 untrained adult Beagles were studied. The study was divided into 3 phases. In phase 1, dogs ran for 1 hour at a speed of 9 km/h, and samples were used to determine the area under the plasma CK activity versus time curve (AUC) induced by exercise. In phases 2 and 3, the pharmacokinetics of CK were calculated in dogs during exercise and at rest, respectively. Values for AUC and plasma clearance (Cl) were used to estimate muscle damage. At rest, values for Cl, steady-state volume of distribution (Vdss), and mean retention time (MRT) were 0.32+/-0.02 ml/kg of body weight/min, 57+/-173 ml/kg, and 3.0+/-0.57 h, respectively. During exercise, Cl decreased significantly (0.26+/-0.03 ml/kg/min), MRT increased significantly (4.4+/-0.97 h), and Vdss remained unchanged. Peak plasma CK activity (151+/-58.8 U/L) was observed 3 hours after completion of exercise. The estimated equivalent amount of muscle corresponding to the quantity of CK released was 41+/-29.3 mg/kg. These results revealed that exercise had a minor effect on CK disposition and that the equivalent amount of muscle damaged by moderate exercise was negligible. This study illustrates the relevance of this minimally invasive, quantitative pharmacokinetic approach for estimating muscle damage.
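The pharmacokinetic principle used here is that the total CK released equals plasma clearance times the exercise-induced AUC, which an assumed tissue CK content then converts to a muscle-mass equivalent. The numbers below are hypothetical placeholders, not the study's data:

```python
def muscle_equivalent_g_per_kg(cl_ml_per_kg_min, auc_u_min_per_ml, muscle_ck_u_per_g):
    """Released CK (U/kg) = plasma clearance x AUC; dividing by an
    assumed CK activity per gram of muscle gives a muscle-mass
    equivalent in g of muscle per kg of body weight."""
    released_u_per_kg = cl_ml_per_kg_min * auc_u_min_per_ml
    return released_u_per_kg / muscle_ck_u_per_g

# hypothetical: Cl = 0.26 ml/kg/min, AUC = 100 U*min/ml, 2000 U CK per g muscle
print(round(muscle_equivalent_g_per_kg(0.26, 100.0, 2000.0), 6))  # -> 0.013
```

The tissue CK content is the key assumption; the study's 41 mg/kg figure depends on the measured AUC and the CK activity per gram of canine muscle, neither of which is given in the abstract.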

  12. Quantitative evaluation of dual-flip-angle T1 mapping on DCE-MRI kinetic parameter estimation in head and neck

    PubMed Central

    Chow, Steven Kwok Keung; Yeung, David Ka Wai; Ahuja, Anil T; King, Ann D

    2012-01-01

    Purpose To quantitatively evaluate kinetic parameter estimation for head and neck (HN) dynamic contrast-enhanced (DCE) MRI with dual-flip-angle (DFA) T1 mapping. Materials and methods Clinical DCE-MRI datasets of 23 patients with HN tumors were included in this study. T1 maps were generated based on the multiple-flip-angle (MFA) method and different DFA combinations. Tofts model parameter maps of kep, Ktrans and vp based on MFA and DFAs were calculated and compared. Fitted parameters by MFA and DFAs were quantitatively evaluated in primary tumor, salivary gland and muscle. Results T1 mapping deviations by DFAs produced remarkable kinetic parameter estimation deviations in head and neck tissues. In particular, the DFA of [2°, 7°] overestimated, while [7°, 12°] and [7°, 15°] underestimated, Ktrans and vp significantly (P<0.01). [2°, 15°] achieved the smallest, but still statistically significant, overestimation of Ktrans and vp in primary tumors, 32.1% and 16.2% respectively. kep fitting results by DFAs were relatively close to the MFA reference compared with Ktrans and vp. Conclusions T1 deviations induced by DFA could result in significant errors in kinetic parameter estimation, particularly of Ktrans and vp, through Tofts model fitting. The MFA method should be more reliable and robust for accurate quantitative pharmacokinetic analysis in head and neck. PMID:23289084
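The dual-flip-angle T1 estimate compared above comes from the standard spoiled gradient-echo signal equation; a minimal round-trip sketch of the linearised two-point solution, with synthetic signals and hypothetical parameter values:

```python
import math

def spgr_signal(m0, t1, tr, alpha_deg):
    """Spoiled gradient-echo signal: S = M0*sin(a)*(1-E1)/(1-E1*cos(a))."""
    a = math.radians(alpha_deg)
    e1 = math.exp(-tr / t1)
    return m0 * math.sin(a) * (1 - e1) / (1 - e1 * math.cos(a))

def t1_from_two_flip_angles(s1, s2, a1_deg, a2_deg, tr):
    """Linearise S/sin(a) = E1 * S/tan(a) + M0*(1-E1); with two flip
    angles the slope E1 follows from the two (x, y) points directly."""
    a1, a2 = math.radians(a1_deg), math.radians(a2_deg)
    x1, y1 = s1 / math.tan(a1), s1 / math.sin(a1)
    x2, y2 = s2 / math.tan(a2), s2 / math.sin(a2)
    e1 = (y1 - y2) / (x1 - x2)
    return -tr / math.log(e1)

# round trip: simulate signals for T1 = 1000 ms, TR = 5 ms at [2 deg, 15 deg]
s_2 = spgr_signal(1.0, 1000.0, 5.0, 2.0)
s_15 = spgr_signal(1.0, 1000.0, 5.0, 15.0)
print(round(t1_from_two_flip_angles(s_2, s_15, 2.0, 15.0, 5.0)))  # -> 1000
```

With noiseless signals any flip-angle pair inverts exactly; the biases the paper reports arise because noise, B1 inhomogeneity and the choice of angle pair affect how errors propagate through this inversion.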

  13. A Calibration-Free Laser-Induced Breakdown Spectroscopy (CF-LIBS) Quantitative Analysis Method Based on the Auto-Selection of an Internal Reference Line and Optimized Estimation of Plasma Temperature.

    PubMed

    Yang, Jianhong; Li, Xiaomeng; Xu, Jinwu; Ma, Xianghong

    2018-01-01

    The quantitative analysis accuracy of calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is severely affected by the self-absorption effect and by the estimation of plasma temperature. Herein, a CF-LIBS quantitative analysis method based on the auto-selection of an internal reference line and an optimized estimation of plasma temperature is proposed. The internal reference line of each species is automatically selected from the analytical lines by a programmable procedure using easily accessible parameters. Furthermore, the self-absorption effect of the internal reference line is considered during the correction procedure. To improve the analysis accuracy of CF-LIBS, the particle swarm optimization (PSO) algorithm is introduced to estimate the plasma temperature based on the calculation results from the Boltzmann plot. Thereafter, the species concentrations of a sample can be calculated according to the classical CF-LIBS method. A total of 15 certified alloy steel standard samples of known compositions and elemental weight percentages were used in the experiment. Using the proposed method, the average relative errors of the calculated Cr, Ni, and Fe concentrations were 4.40%, 6.81%, and 2.29%, respectively. The quantitative results demonstrated an improvement over the classical CF-LIBS method and promising potential for in situ and real-time application.
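The plasma-temperature step starts from the Boltzmann plot, whose slope is -1/(k_B*T). The paper refines this estimate with PSO, which the hedged sketch below omits: it shows only the ordinary least-squares slope on synthetic, noiseless line data (all line parameters hypothetical):

```python
import math

K_B_EV = 8.617333e-5  # Boltzmann constant in eV/K

def boltzmann_temperature(e_upper_ev, intensities, ga_products, wavelengths_nm):
    """Fit y = ln(I * lambda / (g*A)) against the upper-level energy E_k;
    the least-squares slope of the Boltzmann plot equals -1/(k_B * T)."""
    ys = [math.log(i * w / ga)
          for i, w, ga in zip(intensities, wavelengths_nm, ga_products)]
    n = len(e_upper_ev)
    xbar = sum(e_upper_ev) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(e_upper_ev, ys))
             / sum((x - xbar) ** 2 for x in e_upper_ev))
    return -1.0 / (K_B_EV * slope)

# round trip on synthetic lines generated for T = 10000 K
T_true = 10000.0
true_slope = -1.0 / (K_B_EV * T_true)
E = [2.0, 3.0, 4.5]          # upper-level energies, eV (hypothetical)
ga = [1.0e8, 2.0e8, 1.5e8]   # g * A products (hypothetical)
wl = [400.0, 450.0, 500.0]   # wavelengths, nm (hypothetical)
I = [g / w * math.exp(true_slope * e + 5.0) for e, g, w in zip(E, ga, wl)]
print(round(boltzmann_temperature(E, I, ga, wl)))  # -> 10000
```

Self-absorption distorts the measured intensities and hence this slope, which is why the paper corrects the internal reference line before the temperature step.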

  14. Clinical pharmacists in general practice: an initial evaluation of activity in one English primary care organisation.

    PubMed

    Bush, Joseph; Langley, Christopher A; Jenkins, Duncan; Johal, Jaspal; Huckerby, Clair

    2017-12-27

    The aim of this research was to characterise the breadth and volume of activity conducted by clinical pharmacists in general practice in Dudley Clinical Commissioning Group (CCG), and to provide quantitative estimates of both the savings in general practitioner (GP) time and the financial savings attributable to such activity. This descriptive observational study retrospectively analysed quantitative data collected by Dudley CCG concerning the activity of clinical pharmacists in GP practices during 2015. Over the 9-month period for which data were available, the 5.4 whole-time-equivalent clinical pharmacists operating in GP practices within Dudley CCG identified 23 172 interventions. Ninety-five per cent of the interventions identified were completed within the study period, saving the CCG in excess of £1 000 000. During the 4 months for which resource allocation data were available, the clinical pharmacists saved 628 GP appointments plus an additional 647 h that GPs currently devote to medication review and the management of repeat prescribing. This research suggests that clinical pharmacists in general practice in Dudley CCG are able to deliver clinical interventions efficiently and in high volume. In doing so, clinical pharmacists were able to generate considerable financial returns on investment. Further work is recommended to examine the effectiveness and cost-effectiveness of clinical pharmacists in general practice in improving outcomes for patients. © 2017 Royal Pharmaceutical Society.

  15. Quantitative molecular analysis in mantle cell lymphoma.

    PubMed

    Brízová, H; Hilská, I; Mrhalová, M; Kodet, R

    2011-07-01

    A molecular analysis has three major roles in modern oncopathology: as an aid in the differential diagnosis, in molecular monitoring of disease, and in estimation of the potential prognosis. In this report we review the application of molecular analysis in a group of patients with mantle cell lymphoma (MCL). We demonstrate that the cyclin D1 mRNA level is a molecular marker in 98% of patients with MCL. Quantitative monitoring of cyclin D1 is specific and sensitive for the differential diagnosis and for the molecular monitoring of the disease in the bone marrow. Moreover, the dynamics of cyclin D1 in bone marrow reflects the disease development and predicts the clinical course. We employed the molecular analysis for a precise quantitative detection of the proliferation markers Ki-67, topoisomerase IIalpha, and TPX2, which are described as effective prognostic factors. Using the molecular approach it is possible to measure the proliferation rate in a reproducible, standard way, which is an essential prerequisite for using proliferation activity as a routine clinical tool. Compared with immunophenotyping, we may conclude that the quantitative PCR-based analysis is a useful, reliable, rapid, reproducible, sensitive and specific method broadening our diagnostic tools in hematopathology. In comparison to interphase FISH in paraffin sections, quantitative PCR is less technically demanding, less time-consuming, and more sensitive in detecting small changes in the mRNA level. Moreover, quantitative PCR is the only technology that provides precise and reproducible quantitative information about the expression level. Therefore it may be used to demonstrate the decrease or increase of a tumor-specific marker in bone marrow in comparison with a previously aspirated specimen. Thus, it has a powerful potential to monitor the course of the disease in correlation with clinical data.

  16. Quantitative estimation of film forming polymer-plasticizer interactions by the Lorentz-Lorenz Law.

    PubMed

    Dredán, J; Zelkó, R; Dávid, A Z; Antal, I

    2006-03-09

    Molar refraction, like refractive index, has many uses. Beyond confirming the identity and purity of a compound and supporting the determination of molecular structure and molecular weight, molar refraction is used in estimation schemes for critical properties, surface tension, solubility parameter, molecular polarizability, dipole moment, etc. In the present study, molar refraction values of polymer dispersions were determined for the quantitative estimation of film-forming polymer-plasticizer interactions. Information on the extent of interaction between the polymer and the plasticizer can be obtained from the calculated molar refraction values of film-forming polymer dispersions containing plasticizer.
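The Lorentz-Lorenz relation referred to in the title gives the molar refraction from the refractive index, molar mass, and density. A minimal sketch, using water only as a familiar sanity check (the values are textbook approximations, not the paper's data):

```python
def molar_refraction(n, molar_mass, density):
    """Lorentz-Lorenz relation: R = (M/rho) * (n^2 - 1) / (n^2 + 2),
    giving R in cm^3/mol for M in g/mol and rho in g/cm^3."""
    return (molar_mass / density) * (n ** 2 - 1) / (n ** 2 + 2)

# water near 20 C: n ~= 1.333, M = 18.015 g/mol, rho ~= 0.998 g/cm^3
print(round(molar_refraction(1.333, 18.015, 0.998), 2))  # -> 3.71
```

Because molar refraction is approximately additive, a deviation of the measured R of a polymer-plasticizer dispersion from the sum of the component contributions is one way to quantify their interaction.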

  17. A quantitative approach to combine sources in stable isotope mixing models

    EPA Science Inventory

    Stable isotope mixing models, used to estimate source contributions to a mixture, typically yield highly uncertain estimates when there are many sources and relatively few isotope elements. Previously, ecologists have either accepted the uncertain contribution estimates for indiv...
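In the determined two-source, single-isotope case, a stable isotope mixing model reduces to a linear mass balance; a minimal sketch with hypothetical delta values (the EPA work addresses the harder case of many sources and few isotopes, which this does not capture):

```python
def source_fraction(delta_mix, delta_a, delta_b):
    """Two-source, one-isotope mixing model: solve
    delta_mix = f_a * delta_a + (1 - f_a) * delta_b for f_a."""
    return (delta_mix - delta_b) / (delta_a - delta_b)

# hypothetical d13C values: mixture -24, source A -28, source B -12
print(source_fraction(-24.0, -28.0, -12.0))  # -> 0.75
```

With more sources than isotope elements the system becomes underdetermined, which is why combining chemically similar sources, as the abstract describes, is attractive.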

  18. QuantFusion: Novel Unified Methodology for Enhanced Coverage and Precision in Quantifying Global Proteomic Changes in Whole Tissues.

    PubMed

    Gunawardena, Harsha P; O'Brien, Jonathon; Wrobel, John A; Xie, Ling; Davies, Sherri R; Li, Shunqiang; Ellis, Matthew J; Qaqish, Bahjat F; Chen, Xian

    2016-02-01

    Single quantitative platforms such as label-based or label-free quantitation (LFQ) involve compromises in the accuracy, precision, protein sequence coverage, and speed of quantifiable proteomic measurements. To maximize the quantitative precision and the number of quantifiable proteins, or the quantifiable coverage of tissue proteomes, we have developed a unified approach, termed QuantFusion, that combines the quantitative ratios of all peptides measured by both LFQ and label-based methodologies. Here, we demonstrate the use of QuantFusion in determining the proteins differentially expressed in a pair of patient-derived tumor xenografts (PDXs) representing two major breast cancer (BC) subtypes, basal and luminal. Label-based in-spectra quantitative peptides derived from amino acid-coded tagging (AACT, also known as SILAC) of a non-malignant mammary cell line were uniformly added to each xenograft at a constant predefined ratio, from which Ratio-of-Ratio estimates were obtained for the label-free peptides paired with AACT peptides in each PDX tumor. A mixed-model statistical analysis was used to determine global differential protein expression by combining the complementary quantifiable peptide ratios measured by LFQ and Ratio-of-Ratios, respectively. With the minimum number of replicates required to obtain statistically significant ratios, QuantFusion uses distinct mechanisms to "rescue" the missing data inherent to both LFQ and label-based quantitation. Combining quantifiable peptide data from both quantitative schemes increased the overall number of peptide-level measurements and protein-level estimates. In our analysis of the PDX tumor proteomes, QuantFusion increased the number of distinct peptide ratios by 65%, representing differentially expressed proteins between the BC subtypes.
This improvement in quantifiable coverage, in turn, not only increased the number of measurable protein fold-changes by 8% but also increased the average precision of the quantitative estimates by 181%, so that some proteins expressed in a subtype-specific manner were rescued by QuantFusion. Thus, incorporating data from multiple quantitative approaches while accounting for measurement variability at both the peptide and global protein levels makes QuantFusion unique in obtaining increased coverage and quantitative precision for tissue proteomes. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
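The Ratio-of-Ratios idea can be sketched in one line of arithmetic: because the AACT spike is added to each xenograft at a constant predefined ratio, normalising each label-free peptide intensity to its paired spike cancels run-to-run scale differences. The intensity values below are hypothetical:

```python
def ratio_of_ratios(lf_a, spike_a, lf_b, spike_b):
    """Normalise each label-free intensity to the co-measured AACT spike
    in the same run, then compare across runs:
    (LF_A / spike_A) / (LF_B / spike_B)."""
    return (lf_a / spike_a) / (lf_b / spike_b)

# hypothetical: a true 2-fold change, with the spike detected at different
# absolute levels in the two runs (run-to-run scale cancels out)
print(ratio_of_ratios(200.0, 50.0, 400.0, 100.0))  # -> 1.0
```

Here the raw label-free ratio (200/400 = 0.5) would be misleading, while the spike-normalised comparison recovers the fact that both runs measured the peptide at the same level relative to their internal standard.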

  19. The Impact of Acquisition Dose on Quantitative Breast Density Estimation with Digital Mammography: Results from ACRIN PA 4006.

    PubMed

    Chen, Lin; Ray, Shonket; Keller, Brad M; Pertuz, Said; McDonald, Elizabeth S; Conant, Emily F; Kontos, Despina

    2016-09-01

    Purpose To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88-0.95; weighted κ = 0.83-0.90). However, differences in breast percent density (1.04% and 3.84%, P < .05) were observed within high BI-RADS density categories, although they were significantly correlated across the different acquisition dose levels (r = 0.76-0.92, P < .05). 
Conclusion Precision and reproducibility of automated breast density measurements with digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density estimation may be feasible. © RSNA, 2016. Online supplemental material is available for this article.

  20. The Impact of Acquisition Dose on Quantitative Breast Density Estimation with Digital Mammography: Results from ACRIN PA 4006

    PubMed Central

    Chen, Lin; Ray, Shonket; Keller, Brad M.; Pertuz, Said; McDonald, Elizabeth S.; Conant, Emily F.

    2016-01-01

    Purpose To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88–0.95; weighted κ = 0.83–0.90). However, differences in breast percent density (1.04% and 3.84%, P < .05) were observed within high BI-RADS density categories, although they were significantly correlated across the different acquisition dose levels (r = 0.76–0.92, P < .05). 
Conclusion Precision and reproducibility of automated breast density measurements with digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density estimation may be feasible. © RSNA, 2016 Online supplemental material is available for this article. PMID:27002418
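The agreement analysis described above (Bland-Altman limits of agreement between paired density estimates at two dose levels) can be sketched as follows; the percent-density values below are made up for illustration, not trial data:

```python
import numpy as np

def bland_altman_limits(a, b):
    """Mean difference (bias) and 95% limits of agreement between two
    paired measurement series, e.g. percent density at two dose levels."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical percent-density readings at standard and low dose
standard = [30.1, 22.4, 41.0, 18.7, 35.2]
low_dose = [29.5, 23.0, 40.1, 19.2, 34.8]
bias, lower, upper = bland_altman_limits(standard, low_dose)
```

If most paired differences fall inside the limits and the bias is near zero, the two dose protocols agree in the Bland-Altman sense.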

  1. Specification and estimation of sources of bias affecting neurological studies in PET/MR with an anatomical brain phantom

    NASA Astrophysics Data System (ADS)

    Teuho, J.; Johansson, J.; Linden, J.; Saunavaara, V.; Tolvanen, T.; Teräs, M.

    2014-01-01

Selection of reconstruction parameters has an effect on image quantification in PET, with an additional contribution from a scanner-specific attenuation correction method. For achieving comparable results in inter- and intra-center comparisons, any existing quantitative differences should be identified and compensated for. In this study, a comparison between PET, PET/CT and PET/MR is performed by using an anatomical brain phantom, to identify and measure the bias caused by differences in reconstruction and attenuation correction methods, especially in PET/MR. Differences were estimated by using visual, qualitative and quantitative analysis. The qualitative analysis consisted of a line profile analysis for measuring the reproduction of anatomical structures and the contribution of the number of iterations to image contrast. The quantitative analysis consisted of measurement and comparison of 10 anatomical VOIs, where the HRRT was considered as the reference. All scanners reproduced the main anatomical structures of the phantom adequately, although the image contrast on the PET/MR was inferior when using a default clinical brain protocol. Image contrast was improved by increasing the number of iterations from 2 to 5 while using 33 subsets. Furthermore, a PET/MR-specific bias was detected, which resulted in underestimation of the activity values in anatomical structures closest to the skull, due to the MR-derived attenuation map ignoring bone. Thus, further improvements in PET/MR reconstruction and attenuation correction could be achieved by optimization of RAMLA-specific reconstruction parameters and inclusion of bone in the attenuation template.

  2. Connecting Taxon-Specific Microbial Activities to Carbon Cycling in the Rhizosphere

    NASA Astrophysics Data System (ADS)

    Hungate, B. A.; Morrissey, E.; Schwartz, E.; Dijkstra, P.; Blazewicz, S.; Pett-Ridge, J.; Koch, G. W.; Marks, J.; Koch, B.; McHugh, T. A.; Mau, R. L.; Hayer, M.

    2016-12-01

Plant carbon inputs influence microbial growth in the rhizosphere, but the quantitative details of these effects are not well understood, nor are their consequences for carbon cycling in the rhizosphere. With a new pulse of carbon input to soil, which microbial taxa increase their growth rates, and by how much? Do any microbial taxa respond negatively? And how does the extra carbon addition alter the utilization of other resources, including other carbon sources, as well as inorganic nitrogen? This talk will present new research using quantitative stable isotope probing that reveals the distribution of growth responses among microbial taxa, from positive to neutral to negative, and how these growth responses are associated with various substrates. For example, decomposition of soil C in response to added labile carbon occurred as a phylogenetically-diverse majority of taxa shifted toward soil C use for growth. In contrast, bacteria with suppressed growth or that relied directly on glucose for growth clustered strongly by phylogeny. These results suggest that priming is a prototypical response of bacteria to sustained labile C addition, consistent with the widespread occurrence of the priming effect in nature. These results also illustrate the potential power of molecular tools and models that seek to estimate metrics directly relevant to quantitative ecology and biogeochemistry, more so than is currently standard in microbial ecology. Tools that estimate growth rate, mortality rate, and rates of substrate use - all quantified with the taxonomic precision afforded by modern sequencing - provide a foundation for quantifying the biogeochemical significance of microbial biodiversity, and a more complete understanding of the rich ecosystem of the rhizosphere.

  3. Estimated congener specific gas-phase atmospheric behavior and fractionation of perfluoroalkyl compounds: rates of reaction with atmospheric oxidants, air-water partitioning, and wet/dry deposition lifetimes.

    PubMed

    Rayne, Sierra; Forest, Kaya; Friesen, Ken J

    2009-08-01

A quantitative structure-activity model has been validated for estimating congener specific gas-phase hydroxyl radical reaction rates for perfluoroalkyl sulfonic acids (PFSAs), carboxylic acids (PFCAs), aldehydes (PFAls) and dihydrates, fluorotelomer olefins (FTOls), alcohols (FTOHs), aldehydes (FTAls), and acids (FTAcs), and sulfonamides (SAs), sulfonamidoethanols (SEs), and sulfonamido carboxylic acids (SAAs), and their alkylated derivatives based on calculated semi-empirical PM6 method ionization potentials. Corresponding gas-phase reaction rates with nitrate radicals and ozone have also been estimated using the computationally derived ionization potentials. Henry's law constants for these classes of perfluorinated compounds also appear to be reasonably approximated by the SPARC software program, thereby allowing estimation of wet and dry atmospheric deposition rates. Both congener specific gas-phase atmospheric and air-water interface fractionation of these compounds are expected, complicating current source apportionment perspectives and necessitating integration of such differential partitioning influences into future multimedia models. The findings will allow development and refinement of more accurate and detailed local through global scale atmospheric models for the atmospheric fate of perfluoroalkyl compounds.

  4. Dengue prediction by the web: Tweets are a useful tool for estimating and forecasting Dengue at country and city level.

    PubMed

    Marques-Toledo, Cecilia de Almeida; Degener, Carolin Marlen; Vinhal, Livia; Coelho, Giovanini; Meira, Wagner; Codeço, Claudia Torres; Teixeira, Mauro Martins

    2017-07-01

Infectious diseases are a leading threat to public health. Accurate and timely monitoring of disease risk and progress can reduce their impact. Mentioning a disease in social networks is correlated with physician visits by patients, and can be used to estimate disease activity. Dengue is the fastest growing mosquito-borne viral disease, with an estimated annual incidence of 390 million infections, of which 96 million manifest clinically. Dengue burden is likely to increase in the future owing to trends toward increased urbanization, scarce water supplies and, possibly, environmental change. The epidemiological dynamic of Dengue is complex and difficult to predict, partly due to costly and slow surveillance systems. In this study, we aimed to quantitatively assess the usefulness of data acquired by Twitter for the early detection and monitoring of Dengue epidemics, at both country and city level, on a weekly basis. Here, we evaluated and demonstrated the potential of tweet modeling for Dengue estimation and forecast, in comparison with other available web-based data, Google Trends and Wikipedia access logs. Also, we studied the factors that might influence the goodness-of-fit of the model. We built a simple model based on tweets that was able to 'nowcast', i.e. estimate disease numbers in the same week, but also 'forecast' disease in future weeks. At the country level, tweets are strongly associated with Dengue cases, and can estimate present and future Dengue cases up to 8 weeks in advance. At city level, tweets are also useful for estimating Dengue activity. Our model can be applied successfully to small and less developed cities, suggesting a robust construction, even though it may be influenced by the incidence of the disease, the activity of Twitter locally, and social factors, including human development index and internet access. The association of tweets with Dengue cases is valuable for assisting traditional Dengue surveillance in real time and at low cost.
Tweets are able to successfully nowcast, i.e. estimate Dengue in the present week, but also forecast, i.e. predict Dengue up to 8 weeks into the future, both at country and city level with high estimation capacity.
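The nowcast/forecast idea above amounts to regressing weekly case counts on tweet counts from earlier weeks. A minimal sketch with a synthetic series (not the paper's data or model):

```python
import numpy as np

def fit_lagged(tweets, cases, lag):
    """Least-squares fit of weekly case counts on tweet counts `lag`
    weeks earlier: cases[t] ~ a + b * tweets[t - lag]. Returns (a, b)."""
    x = np.asarray(tweets[:len(tweets) - lag], dtype=float)
    y = np.asarray(cases[lag:], dtype=float)
    b, a = np.polyfit(x, y, 1)   # polyfit returns slope first
    return a, b

def forecast_ahead(tweets, a, b):
    """Forecast the case count `lag` weeks after the last observed week."""
    return a + b * tweets[-1]

# Synthetic weekly series where cases follow tweets one week later
tweets = [10, 20, 15, 30, 25, 40]
cases = [50, 105, 205, 155, 305, 255]   # cases[t] = 5 + 10 * tweets[t-1]
a, b = fit_lagged(tweets, cases, lag=1)
pred = forecast_ahead(tweets, a, b)     # expected cases next week
```

A production model would add more lags, seasonality, and uncertainty intervals, but the nowcast/forecast distinction is just the choice of lag.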

  5. Nonparametric modeling of longitudinal covariance structure in functional mapping of quantitative trait loci.

    PubMed

    Yap, John Stephen; Fan, Jianqing; Wu, Rongling

    2009-12-01

    Estimation of the covariance structure of longitudinal processes is a fundamental prerequisite for the practical deployment of functional mapping designed to study the genetic regulation and network of quantitative variation in dynamic complex traits. We present a nonparametric approach for estimating the covariance structure of a quantitative trait measured repeatedly at a series of time points. Specifically, we adopt Huang et al.'s (2006, Biometrika 93, 85-98) approach of invoking the modified Cholesky decomposition and converting the problem into modeling a sequence of regressions of responses. A regularized covariance estimator is obtained using a normal penalized likelihood with an L(2) penalty. This approach, embedded within a mixture likelihood framework, leads to enhanced accuracy, precision, and flexibility of functional mapping while preserving its biological relevance. Simulation studies are performed to reveal the statistical properties and advantages of the proposed method. A real example from a mouse genome project is analyzed to illustrate the utilization of the methodology. The new method will provide a useful tool for genome-wide scanning for the existence and distribution of quantitative trait loci underlying a dynamic trait important to agriculture, biology, and health sciences.
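The covariance-modeling step described above can be sketched as a sequence of ridge-penalized regressions via the modified Cholesky decomposition; this is a simplified stand-in for the paper's penalized-likelihood estimator, not its actual implementation:

```python
import numpy as np

def chol_ridge_cov(Y, lam=1.0):
    """Covariance estimate for longitudinal data Y (subjects x time points)
    via the modified Cholesky decomposition: each centered response is
    regressed on its predecessors with an L2 (ridge) penalty, giving a
    unit lower-triangular matrix L and innovation variances d, so that
    Sigma = inv(L) @ diag(d) @ inv(L).T."""
    n, T = Y.shape
    Yc = Y - Y.mean(axis=0)
    L = np.eye(T)
    d = np.empty(T)
    d[0] = Yc[:, 0].var()
    for t in range(1, T):
        X, y = Yc[:, :t], Yc[:, t]
        phi = np.linalg.solve(X.T @ X + lam * np.eye(t), X.T @ y)
        L[t, :t] = -phi
        d[t] = np.mean((y - X @ phi) ** 2)
    Linv = np.linalg.inv(L)
    return Linv @ np.diag(d) @ Linv.T

rng = np.random.default_rng(0)
Y = rng.standard_normal((200, 5)).cumsum(axis=1)  # random-walk trait curves
Sigma = chol_ridge_cov(Y, lam=0.5)
```

The decomposition guarantees a symmetric positive-definite estimate regardless of the penalty strength, which is what makes it attractive inside a mixture likelihood.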

  6. Simplifying volumes-of-interest (VOIs) definition in quantitative SPECT: Beyond manual definition of 3D whole-organ VOIs.

    PubMed

    Vicente, Esther M; Lodge, Martin A; Rowe, Steven P; Wahl, Richard L; Frey, Eric C

    2017-05-01

We investigated the feasibility of using simpler methods than manual whole-organ volume-of-interest (VOI) definition to estimate the organ activity concentration in single photon emission computed tomography (SPECT) in cases where the activity in the organ can be assumed to be uniformly distributed on the scale of the voxel size. In particular, we investigated an anatomic region-of-interest (ROI) defined in a single transaxial slice, and a single sphere placed inside the organ boundaries. The evaluation was carried out using Monte Carlo simulations based on patient 111In-pentetreotide SPECT and computed tomography (CT) images. We modeled constant activity concentrations in each organ, validating this assumption by comparing the distribution of voxel values inside the organ VOIs of the simulated data with the patient data. We simulated projection data corresponding to 100, 50, and 25% of the clinical count level to study the effects of noise level due to shortened acquisition time. Images were reconstructed using a previously validated quantitative SPECT reconstruction method. The evaluation was performed in terms of the accuracy and precision of the activity concentration estimates. The results demonstrated that the non-uniform image intensity observed in the reconstructed images in the organs with normal uptake was consistent with uniform activity concentration in the organs on the scale of the voxel size; observed non-uniformities in image intensity were due to a combination of partial-volume effects at the boundaries of the organ, artifacts in the reconstructed image due to collimator-detector response compensation, and noise. Using an ROI defined in a single transaxial slice produced similar biases compared to the three-dimensional (3D) whole-organ VOIs, provided that the transaxial slice was near the central plane of the organ and that the pixels from the organ boundaries were not included in the ROI.
Although this slice method was sensitive to noise, biases were less than 10% for all the noise levels studied. The use of spherical VOIs was more sensitive to noise. The method was more accurate for larger spheres and larger organs such as the liver in comparison to the kidneys. Biases lower than 7% were found in the liver when using large enough spheres (radius ≥ 28 mm), regardless of the position of the VOI inside the organ, even with shortened acquisition times. The biases were more position-dependent for smaller organs. Both of the simpler methods provided suitable surrogates in terms of accuracy and precision. The results suggested that a spherical VOI was more appropriate for estimating the activity concentration in larger organs such as the liver, regardless of the position of the sphere inside the organ. Larger spheres resulted in better estimates. A single-slice ROI was more suitable for activity estimation in smaller organs such as the kidneys, provided that the transaxial slice selected was near the central plane of the organ and that voxels from the organ boundaries were excluded. Under those conditions, activity concentrations with biases lower than 5% were observed for all the studied count levels and coefficients of variation were less than 9% and 5% for the 25% and 100% count levels, respectively. © 2017 American Association of Physicists in Medicine.
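The spherical-VOI surrogate amounts to a masked mean over a 3D image; a minimal sketch with an illustrative uniform phantom (not the paper's simulated data):

```python
import numpy as np

def sphere_voi_mean(img, center, radius):
    """Mean voxel value inside a spherical VOI of `radius` voxels centered
    at `center` (z, y, x); a stand-in for the sphere placement described
    in the abstract. Assumes isotropic voxels."""
    z, y, x = np.ogrid[:img.shape[0], :img.shape[1], :img.shape[2]]
    cz, cy, cx = center
    mask = (z - cz) ** 2 + (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2
    return img[mask].mean()

# A uniform "organ" of activity 7.0 embedded in background 1.0
img = np.ones((40, 40, 40))
img[10:30, 10:30, 10:30] = 7.0
estimate = sphere_voi_mean(img, center=(20, 20, 20), radius=6)
```

Keeping the sphere well inside the organ boundary is what avoids the partial-volume bias the abstract warns about.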

  7. Estimated stocks of circumpolar permafrost carbon with quantified uncertainty ranges and identified data gaps

    DOE PAGES

    Hugelius, Gustaf; Strauss, J.; Zubrzycki, S.; ...

    2014-12-01

Soils and other unconsolidated deposits in the northern circumpolar permafrost region store large amounts of soil organic carbon (SOC). This SOC is potentially vulnerable to remobilization following soil warming and permafrost thaw, but SOC stock estimates were poorly constrained and quantitative error estimates were lacking. This study presents revised estimates of permafrost SOC stocks, including quantitative uncertainty estimates, in the 0–3 m depth range in soils as well as for sediments deeper than 3 m in deltaic deposits of major rivers and in the Yedoma region of Siberia and Alaska. Revised estimates are based on significantly larger databases compared to previous studies. Despite this, there is evidence of significant remaining regional data gaps. Estimates remain particularly poorly constrained for soils in the High Arctic region and physiographic regions with thin sedimentary overburden (mountains, highlands and plateaus) as well as for deposits below 3 m depth in deltas and the Yedoma region. While some components of the revised SOC stocks are similar in magnitude to those previously reported for this region, there are substantial differences in other components, including the fraction of perennially frozen SOC. Upscaled based on regional soil maps, estimated permafrost region SOC stocks are 217 ± 12 and 472 ± 27 Pg for the 0–0.3 and 0–1 m soil depths, respectively (±95% confidence intervals). Storage of SOC in 0–3 m of soils is estimated to 1035 ± 150 Pg. Of this, 34 ± 16 Pg C is stored in poorly developed soils of the High Arctic. Based on generalized calculations, storage of SOC below 3 m of surface soils in deltaic alluvium of major Arctic rivers is estimated as 91 ± 52 Pg. In the Yedoma region, estimated SOC stocks below 3 m depth are 181 ± 54 Pg, of which 74 ± 20 Pg is stored in intact Yedoma (late Pleistocene ice- and organic-rich silty sediments) with the remainder in refrozen thermokarst deposits.
Total estimated SOC storage for the permafrost region is ∼1300 Pg with an uncertainty range of ∼1100 to 1500 Pg. Of this, ∼500 Pg is in non-permafrost soils, seasonally thawed in the active layer or in deeper taliks, while ∼800 Pg is perennially frozen. In conclusion, this represents a substantial ∼300 Pg lowering of the estimated perennially frozen SOC stock compared to previous estimates.
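As a rough check on the totals above, the reported component stocks can be combined assuming independent uncertainties; this is a simplification, since the paper's ∼1100–1500 Pg range is wider than simple quadrature gives, reflecting non-independent component errors:

```python
import math

def combine_stocks(components):
    """Total stock and a combined uncertainty for a set of components,
    each given as (mean_Pg, ci95_Pg). Independent errors add in
    quadrature: ci_total = sqrt(sum(ci_i^2))."""
    total = sum(mean for mean, _ in components)
    ci = math.sqrt(sum(ci ** 2 for _, ci in components))
    return total, ci

# Components from the abstract (Pg C): 0-3 m soils, deltaic alluvium, Yedoma region
total, ci = combine_stocks([(1035, 150), (91, 52), (181, 54)])
```

The quadrature total (∼1307 Pg) is consistent with the abstract's ∼1300 Pg headline figure.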

  8. Quantitative microbial risk assessment model for Legionnaires' disease: assessment of human exposures for selected spa outbreaks.

    PubMed

    Armstrong, Thomas W; Haas, Charles N

    2007-08-01

Evaluation of a quantitative microbial risk assessment (QMRA) model for Legionnaires' disease (LD) required Legionella exposure estimates for several well-documented LD outbreaks. Reports for a whirlpool spa and two natural spring spa outbreaks provided data for the exposure assessment, as well as rates of infection and mortality. Exposure estimates for the whirlpool spa outbreak employed aerosol generation, water composition, exposure duration data, and building ventilation parameters with a two-zone model. Estimates for the natural hot springs outbreaks used bacterial water to air partitioning coefficients and exposure duration information. The air concentration and dose calculations used input parameter distributions with Monte Carlo simulations to estimate exposures as probability distributions. The assessment considered two sets of assumptions about the transfer of Legionella from the water phase to the aerosol emitted from the whirlpool spa. The estimated air concentration near the whirlpool spa was 5 to 18 colony forming units per cubic meter (CFU/m(3)) and 50 to 180 CFU/m(3) for each of the alternate assumptions. The estimated 95th percentile ranges of Legionella dose for workers within 15 m of the whirlpool spa were 0.13-3.4 CFU and 1.3-34.5 CFU, respectively. The modeling for hot springs Spas 1 and 2 resulted in estimated arithmetic mean air concentrations of 360 and 17 CFU/m(3), respectively, and 95th percentile ranges for Legionella dose of 28 to 67 CFU and 1.1 to 3.7 CFU, respectively. The Legionella air concentration estimates fall in the range of limited reports on air concentrations of Legionella (0.33 to 190 CFU/m(3)) near showers, aerated faucets, and baths during filling with Legionella-contaminated water. These measurements may provide some indication that the estimates are of a reasonable magnitude, but they do not clarify the accuracy of the exposure estimates, since they were not obtained during LD outbreaks.
Further research to improve the data used for the Legionella exposure assessment would strengthen the results. Several of the primary additional data needs include improved data for bacterial water to air partitioning coefficients, better accounting of time-activity-distance patterns and exposure potential in outbreak reports, and data for Legionella-containing aerosol viability decay instead of loss of capability for growth in culture.
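The Monte Carlo exposure approach described above propagates input distributions into a dose distribution. A minimal sketch with purely illustrative input distributions (not the paper's fitted parameters or its two-zone model):

```python
import math
import random

def simulate_doses(n=50_000, seed=1):
    """Monte Carlo sketch of an inhaled Legionella dose:
    dose (CFU) = air concentration (CFU/m^3) x breathing rate (m^3/h)
    x exposure time (h). All input distributions are illustrative."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n):
        conc = rng.lognormvariate(math.log(10.0), 0.5)  # CFU/m^3
        rate = rng.uniform(0.8, 1.5)                    # m^3/h, light activity
        hours = rng.uniform(0.25, 2.0)                  # time near the source
        doses.append(conc * rate * hours)
    doses.sort()
    return doses[n // 2], doses[int(0.95 * n)]          # median, 95th percentile

median_dose, p95_dose = simulate_doses()
```

Reporting percentiles of the simulated distribution, rather than a point value, is what lets the assessment express exposure as a probability distribution.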

  9. Inter-rater reliability of motor unit number estimates and quantitative motor unit analysis in the tibialis anterior muscle.

    PubMed

    Boe, S G; Dalton, B H; Harwood, B; Doherty, T J; Rice, C L

    2009-05-01

    To establish the inter-rater reliability of decomposition-based quantitative electromyography (DQEMG) derived motor unit number estimates (MUNEs) and quantitative motor unit (MU) analysis. Using DQEMG, two examiners independently obtained a sample of needle and surface-detected motor unit potentials (MUPs) from the tibialis anterior muscle from 10 subjects. Coupled with a maximal M wave, surface-detected MUPs were used to derive a MUNE for each subject and each examiner. Additionally, size-related parameters of the individual MUs were obtained following quantitative MUP analysis. Test-retest MUNE values were similar with high reliability observed between examiners (ICC=0.87). Additionally, MUNE variability from test-retest as quantified by a 95% confidence interval was relatively low (+/-28 MUs). Lastly, quantitative data pertaining to MU size, complexity and firing rate were similar between examiners. MUNEs and quantitative MU data can be obtained with high reliability by two independent examiners using DQEMG. Establishing the inter-rater reliability of MUNEs and quantitative MU analysis using DQEMG is central to the clinical applicability of the technique. In addition to assessing response to treatments over time, multiple clinicians may be involved in the longitudinal assessment of the MU pool of individuals with disorders of the central or peripheral nervous system.
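The MUNE computation underlying DQEMG-style analysis divides the maximal M-wave size by the mean surface-detected MUP size. A minimal sketch with illustrative amplitudes (not data from the study):

```python
def mune(m_wave_amp, smup_amps):
    """Motor unit number estimate: maximal M-wave amplitude divided by
    the mean surface-detected MUP (S-MUP) amplitude; both values must
    be in the same units."""
    mean_smup = sum(smup_amps) / len(smup_amps)
    return round(m_wave_amp / mean_smup)

# Illustrative amplitudes in mV: a 6.0 mV M wave and a sample of S-MUPs
estimate = mune(6.0, [0.030, 0.050, 0.040, 0.045, 0.035])
```

Inter-rater reliability then reduces to how consistently two examiners sample the S-MUP pool, which is why the ICC on the resulting MUNE values is the key statistic.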

  10. Development and validation of a micellar electrokinetic chromatography method for quantitative determination of butenolides in Piper malacophyllum (C. Presl) C. DC.

    PubMed

    de Oliveira, Alberto; Silva, Claudinei A; Silva, Adalberto M; Tavares, Marina F M; Kato, Massuo J

    2010-01-01

A large number of natural and synthetic compounds having butenolides as a core unit have been described and many of them display a wide range of biological activities. Butenolides from P. malacophyllum have presented potential antifungal activities but no specific, fast, and precise method has been developed for their determination. To develop a methodology based on micellar electrokinetic chromatography to determine butenolides in Piper species. The extracts were analysed in uncoated fused-silica capillaries, and for the micellar system 20 mmol/L SDS, 20% (v/v) acetonitrile (ACN) and 10 mmol/L STB aqueous buffer at pH 9.2 were used. The method was validated for precision, linearity, limit of detection (LOD) and limit of quantitation (LOQ), and the standard deviations were determined from the standard errors estimated by the regression line. A micellar electrokinetic chromatography (MEKC) method for determination of butenolides in extracts gave full resolution for 1 and 2. The analytical curve in the range 10.0-50.0 µg/mL (r(2) = 0.999) provided LOD and LOQ for 1 and 2 of 2.1/6.3 and 1.1/3.5 µg/mL, respectively. The RSD values were 0.12% for migration times and 1.0% for peak area ratios, with 100.0 ± 1.4% recovery. A novel high-performance MEKC method developed for the analysis of butenolides 1 and 2 in leaf extracts of P. malacophyllum allowed their quantitative determination within an analysis time shorter than 5 min, and the results indicated CE to be a feasible analytical technique for the quantitative determination of butenolides in Piper extracts. Copyright © 2010 John Wiley & Sons, Ltd.
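LOD/LOQ figures of the kind reported above are commonly derived from a calibration line as 3.3 and 10 times the residual standard error divided by the slope; a sketch with hypothetical calibration data (not the paper's):

```python
import numpy as np

def lod_loq(conc, signal):
    """LOD and LOQ from a calibration line: 3.3x and 10x the residual
    standard error of the regression divided by the slope (an
    ICH-style estimate based on the regression standard deviation)."""
    conc = np.asarray(conc, dtype=float)
    signal = np.asarray(signal, dtype=float)
    slope, intercept = np.polyfit(conc, signal, 1)
    resid = signal - (slope * conc + intercept)
    s = resid.std(ddof=2)          # residual SD, 2 fitted parameters
    return 3.3 * s / slope, 10.0 * s / slope

# Hypothetical calibration over the abstract's 10-50 ug/mL range
conc = [10, 20, 30, 40, 50]
peak = [102, 198, 305, 401, 498]   # peak areas with small noise
lod, loq = lod_loq(conc, peak)
```

By construction LOQ/LOD = 10/3.3, so the two limits always scale together with the noise-to-slope ratio.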

  11. Families of short interspersed elements in the genome of the oomycete plant pathogen, Phytophthora infestans.

    PubMed

    Whisson, Stephen C; Avrova, Anna O; Lavrova, Olga; Pritchard, Leighton

    2005-04-01

    The first known families of tRNA-related short interspersed elements (SINEs) in the oomycetes were identified by exploiting the genomic DNA sequence resources for the potato late blight pathogen, Phytophthora infestans. Fifteen families of tRNA-related SINEs, as well as predicted tRNAs, and other possible RNA polymerase III-transcribed sequences were identified. The size of individual elements ranges from 101 to 392 bp, representing sequences present from low (1) to highly abundant (over 2000) copy number in the P. infestans genome, based on quantitative PCR analysis. Putative short direct repeat sequences (6-14 bp) flanking the elements were also identified for eight of the SINEs. Predicted SINEs were named in a series prefixed infSINE (for infestans-SINE). Two SINEs were apparently present as multimers of tRNA-related units; four copies of a related unit for infSINEr, and two unrelated units for infSINEz. Two SINEs, infSINEh and infSINEi, were typically located within 400 bp of each other. These were also the only two elements identified as being actively transcribed in the mycelial stage of P. infestans by RT-PCR. It is possible that infSINEh and infSINEi represent active retrotransposons in P. infestans. Based on the quantitative PCR estimates of copy number for all of the elements identified, tRNA-related SINEs were estimated to comprise 0.3% of the 250 Mb P. infestans genome. InfSINE-related sequences were found to occur in species throughout the genus Phytophthora. However, seven elements were shown to be exclusive to P. infestans.

  12. Towards cheminformatics-based estimation of drug therapeutic index: Predicting the protective index of anticonvulsants using a new quantitative structure-index relationship approach.

    PubMed

    Chen, Shangying; Zhang, Peng; Liu, Xin; Qin, Chu; Tao, Lin; Zhang, Cheng; Yang, Sheng Yong; Chen, Yu Zong; Chui, Wai Keung

    2016-06-01

    The overall efficacy and safety profile of a new drug is partially evaluated by the therapeutic index in clinical studies and by the protective index (PI) in preclinical studies. In-silico predictive methods may facilitate the assessment of these indicators. Although QSAR and QSTR models can be used for predicting PI, their predictive capability has not been evaluated. To test this capability, we developed QSAR and QSTR models for predicting the activity and toxicity of anticonvulsants at accuracy levels above the literature-reported threshold (LT) of good QSAR models as tested by both the internal 5-fold cross validation and external validation method. These models showed significantly compromised PI predictive capability due to the cumulative errors of the QSAR and QSTR models. Therefore, in this investigation a new quantitative structure-index relationship (QSIR) model was devised and it showed improved PI predictive capability that superseded the LT of good QSAR models. The QSAR, QSTR and QSIR models were developed using support vector regression (SVR) method with the parameters optimized by using the greedy search method. The molecular descriptors relevant to the prediction of anticonvulsant activities, toxicities and PIs were analyzed by a recursive feature elimination method. The selected molecular descriptors are primarily associated with the drug-like, pharmacological and toxicological features and those used in the published anticonvulsant QSAR and QSTR models. This study suggested that QSIR is useful for estimating the therapeutic index of drug candidates. Copyright © 2016. Published by Elsevier Inc.

  13. Patient-specific dosimetry calculations using mathematic models of different anatomic sizes during therapy with 111In-DTPA-D-Phe1-octreotide infusions after catheterization of the hepatic artery.

    PubMed

    Kontogeorgakos, Dimitrios K; Dimitriou, Panagiotis A; Limouris, Georgios S; Vlahos, Lambros J

    2006-09-01

    The aim of the study was to provide dosimetric data on intrahepatic (111)In-diethylenetriaminepentaacetic acid (DTPA)-D-Phe(1)-octreotide therapy for neuroendocrine tumors with overexpression of somatostatin receptors. A dosimetric protocol was designed to estimate the absorbed dose to the tumor and healthy tissue in a course of 48 treatments for 12 patients, who received a mean activity of 5.4 +/- 1.7 GBq per session. The patient-specific dosimetry calculations, based on quantitative biplanar whole-body scintigrams, were performed using a Monte Carlo simulation program for 3 male and 3 female mathematic models of different anatomic sizes. Thirty minutes and 2, 6, 24, and 48 h after the radionuclide infusion, blood-sample data were collected for estimation of the red marrow radiation burden. The mean absorbed doses per administered activity (mGy/MBq) by the critical organs liver, spleen, kidneys, bladder wall, and bone marrow were 0.14 +/- 0.04, 1.4 +/- 0.6, 0.41 +/- 0.08, 0.094 +/- 0.013, and (3.5 +/- 0.8) x 10(-3), respectively; the tumor absorbed dose ranged from 2.2 to 19.6 mGy/MBq, strongly depending on the lesion size and tissue type. The results of the present study quantitatively confirm the therapeutic efficacy of transhepatic administration; the tumor-to-healthy-tissue uptake ratio was enhanced, compared with the results after antecubital infusions. Planning of treatment was also optimized by use of the patient-specific dosimetric protocol.

  14. [Effect of social desirability on dietary intake estimated from a food questionnaire].

    PubMed

    Barros, Renata; Moreira, Pedro; Oliveira, Bruno

    2005-01-01

Self-report of dietary intake could be biased by social desirability, thus affecting risk estimates in epidemiological studies. The objective of this study was to assess the effect of social desirability on dietary intake estimated from a food frequency questionnaire (FFQ). A convenience sample of 483 Portuguese university students was recruited. Subjects were invited to complete a two-part self-administered questionnaire: the first part included the Marlowe-Crowne Social Desirability Scale (M-CSDS), a physical activity questionnaire and self-reported height and weight; the second part included a semi-quantitative FFQ validated for Portuguese adults, to be returned after completion. All subjects completed the first part of the questionnaire and 40.4% returned the FFQ adequately completed. In multiple regression analysis, after adjustment for energy and confounders, social desirability produced a significant positive effect on the estimates of dietary fibre, vitamin C, vitamin E, magnesium and potassium, in both genders. After the same adjustment, social desirability had a significant positive effect on the estimates of vegetable consumption, for both genders, and a negative effect for white bread and beer, for women. Social desirability affected nutrient and food intake estimated from a food frequency questionnaire.

  15. Simultaneous estimation of diet composition and calibration coefficients with fatty acid signature data

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Budge, Suzanne M.; Thiemann, Gregory W.; Rode, Karyn D.

    2017-01-01

    Knowledge of animal diets provides essential insights into their life history and ecology, although diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) has become a popular method of estimating diet composition, especially for marine species. A primary assumption of QFASA is that constants called calibration coefficients, which account for the differential metabolism of individual fatty acids, are known. In practice, however, calibration coefficients are not known, but rather have been estimated in feeding trials with captive animals of a limited number of model species. The impossibility of verifying the accuracy of feeding trial derived calibration coefficients to estimate the diets of wild animals is a foundational problem with QFASA that has generated considerable criticism. We present a new model that allows simultaneous estimation of diet composition and calibration coefficients based only on fatty acid signature samples from wild predators and potential prey. Our model performed almost flawlessly in four tests with constructed examples, estimating both diet proportions and calibration coefficients with essentially no error. We also applied the model to data from Chukchi Sea polar bears, obtaining diet estimates that were more diverse than estimates conditioned on feeding trial calibration coefficients. Our model avoids bias in diet estimates caused by conditioning on inaccurate calibration coefficients, invalidates the primary criticism of QFASA, eliminates the need to conduct feeding trials solely for diet estimation, and consequently expands the utility of fatty acid data to investigate aspects of ecology linked to animal diets.
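The core QFASA idea, choosing diet proportions so a mixture of prey signatures best matches the predator's calibration-adjusted signature, can be sketched for a two-prey case; real QFASA handles many prey types and typically a KL-type distance rather than squared error:

```python
import numpy as np

def qfasa_two_prey(pred_sig, prey_a, prey_b, cal=None):
    """Diet proportion of prey A in a two-prey QFASA-style fit: choose
    p in [0, 1] minimizing the squared distance between the (optionally
    calibration-adjusted) predator signature and p*A + (1-p)*B."""
    pred = np.asarray(pred_sig, dtype=float)
    if cal is not None:                      # divide out calibration coefficients
        pred = pred / np.asarray(cal, dtype=float)
        pred = pred / pred.sum()             # renormalize to a signature
    A = np.asarray(prey_a, dtype=float)
    B = np.asarray(prey_b, dtype=float)
    ps = np.linspace(0.0, 1.0, 1001)
    errs = [np.sum((p * A + (1 - p) * B - pred) ** 2) for p in ps]
    return ps[int(np.argmin(errs))]

# Signatures over four fatty acids; predator is a 70/30 mix of A and B
A = [0.40, 0.30, 0.20, 0.10]
B = [0.10, 0.20, 0.30, 0.40]
pred = [0.7 * a + 0.3 * b for a, b in zip(A, B)]
p_hat = qfasa_two_prey(pred, A, B)
```

The paper's contribution is to treat `cal` as unknown and estimate it jointly with the diet proportions, rather than fixing it from feeding trials as above.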

  16. Coseismic landslides reveal near-surface rock strength in a high-relief tectonically active setting

    USGS Publications Warehouse

    Gallen, Sean F.; Clark, Marin K.; Godt, Jonathan W.

    2014-01-01

We present quantitative estimates of near-surface rock strength relevant to landscape evolution and landslide hazard assessment for 15 geologic map units of the Longmen Shan, China. Strength estimates are derived from a novel method that inverts earthquake peak ground acceleration models and coseismic landslide inventories to obtain material properties and landslide thickness. Aggregate rock strength is determined by prescribing a friction angle of 30° and solving for effective cohesion. Effective cohesion ranges from 70 kPa to 107 kPa for the 15 geologic map units, and is approximately an order of magnitude less than typical laboratory measurements, probably because laboratory tests on hand-sized specimens do not incorporate the effects of heterogeneity and fracturing that likely control near-surface strength at the hillslope scale. We find that strength among the geologic map units studied varies by less than a factor of two. However, increased weakening of units with proximity to the range front, where precipitation and active fault density are the greatest, suggests that climatic and tectonic factors overwhelm lithologic differences in rock strength in this high-relief tectonically active setting.
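The inversion described above can be illustrated with a dry infinite-slope model and the Newmark critical-acceleration relation; all parameter values here are illustrative, not the paper's calibrated ones:

```python
import math

def cohesion_at_failure(pga_g, slope_deg, depth_m=2.5,
                        unit_weight=26000.0, phi_deg=30.0):
    """Back out the effective cohesion (Pa) of a dry infinite slope that
    just fails at a given peak ground acceleration: set the Newmark
    critical acceleration a_c = (FS - 1) g sin(beta) equal to the PGA,
    then solve the static factor of safety
    FS = (c + W cos^2(b) tan(phi)) / (W sin(b) cos(b)), W = unit_weight
    * depth, for c. Parameter defaults are illustrative."""
    b = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    fs = 1.0 + pga_g / math.sin(b)               # PGA expressed in units of g
    w = unit_weight * depth_m
    return w * math.cos(b) * (fs * math.sin(b) - math.cos(b) * math.tan(phi))

# A 35-degree slope whose landslides initiate at ~0.5 g shaking
c_pa = cohesion_at_failure(pga_g=0.5, slope_deg=35.0)
```

With these inputs the back-calculated cohesion is a few tens of kPa, the same order as the abstract's 70-107 kPa range and an order of magnitude below typical laboratory values.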

  17. [Quantitative determination of 7-phenoxyacetamidodesacetoxycephalosporanic acid].

    PubMed

    Dzegilenko, N B; Riabova, N M; Zinchenko, E Ia; Korchagin, V B

    1976-11-01

    7-Phenoxyacetamidodesacetoxycephalosporanic acid, an intermediate product in the synthesis of cephalexin, was prepared by oxidation of phenoxymethylpenicillin into the respective sulphoxide and transformation of the latter. The UV spectra of the reaction products were studied. A quantitative method is proposed for determination of 7-phenoxyacetamidodesacetoxycephalosporanic acid in the finished products, based on estimation of the coefficient of specific extinction of the ethanol solutions at a wavelength of 268 nm in the UV region, in combination with semiquantitative estimation of the admixtures by thin-layer chromatography.

  18. Estimating the Post-Mortem Interval of skeletonized remains: The use of Infrared spectroscopy and Raman spectro-microscopy

    NASA Astrophysics Data System (ADS)

    Creagh, Dudley; Cameron, Alyce

    2017-08-01

    When skeletonized remains are found, it becomes a police task to identify the body and establish the cause of death. It assists investigators if the Post-Mortem Interval (PMI) can be established. Hitherto no reliable qualitative method of estimating the PMI has been found, and a quantitative method has yet to be developed. This paper shows that IR spectroscopy and Raman microscopy have the potential to form the basis of a quantitative method.

  19. Transcript copy number estimation using a mouse whole-genome oligonucleotide microarray

    PubMed Central

    Carter, Mark G; Sharov, Alexei A; VanBuren, Vincent; Dudekula, Dawood B; Carmack, Condie E; Nelson, Charlie; Ko, Minoru SH

    2005-01-01

    The ability to quantitatively measure the expression of all genes in a given tissue or cell with a single assay is an exciting promise of gene-expression profiling technology. An in situ-synthesized 60-mer oligonucleotide microarray designed to detect transcripts from all mouse genes was validated, as well as a set of exogenous RNA controls derived from the yeast genome (made freely available without restriction), which allow quantitative estimation of absolute endogenous transcript abundance. PMID:15998450
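    The use of exogenous spike-in controls to estimate absolute transcript abundance can be sketched as a calibration-curve fit: regress known control amounts on measured intensity in log-log space, then invert the fit for endogenous probes. All numbers below are invented for illustration and do not come from the study.

```python
import math

# Spike-in calibration sketch: fit log10(copies) vs log10(intensity) on
# controls of known abundance, then convert an endogenous probe's intensity
# into an absolute copy-number estimate.

def fit_line(xs, ys):
    # Ordinary least-squares slope and intercept of ys on xs.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

copies    = [10, 100, 1000, 10000]   # known spike-in copies (hypothetical)
intensity = [50, 480, 5100, 49500]   # measured intensities (hypothetical)
logc = [math.log10(c) for c in copies]
logi = [math.log10(i) for i in intensity]
slope, intercept = fit_line(logi, logc)

def copies_from_intensity(sig):
    return 10 ** (slope * math.log10(sig) + intercept)

print(round(copies_from_intensity(5000)))  # roughly 1000 copies
```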

  20. Doctor-patient communication: some quantitative estimates of the role of cognitive factors in non-compliance.

    PubMed

    Ley, P

    1985-04-01

    Patients frequently fail to understand what they are told. Further, they frequently forget the information given to them. These factors have effects on patients' satisfaction with the consultation. All three of these factors--understanding, memory and satisfaction--have effects on the probability that a patient will comply with advice. The levels of failure to understand and remember and levels of dissatisfaction are described. Quantitative estimates of the effects of these factors on non-compliance are presented.

  1. Estimation of genetic parameters and their sampling variances of quantitative traits in the type 2 modified augmented design

    USDA-ARS?s Scientific Manuscript database

    We proposed a method to estimate the error variance among non-replicated genotypes, thus to estimate the genetic parameters by using replicated controls. We derived formulas to estimate sampling variances of the genetic parameters. Computer simulation indicated that the proposed methods of estimatin...

  2. A subagging regression method for estimating the qualitative and quantitative state of groundwater

    NASA Astrophysics Data System (ADS)

    Jeong, J.; Park, E.; Choi, J.; Han, W. S.; Yun, S. T.

    2016-12-01

    A subagging regression (SBR) method for the analysis of groundwater data pertaining to the estimation of trend and the associated uncertainty is proposed. The SBR method is validated against synthetic data competitively with other conventional robust and non-robust methods. From the results, it is verified that the estimation accuracies of the SBR method are consistent and superior to those of the other methods and the uncertainties are reasonably estimated where the others have no uncertainty analysis option. To validate further, real quantitative and qualitative data are employed and analyzed comparatively with Gaussian process regression (GPR). For all cases, the trend and the associated uncertainties are reasonably estimated by SBR, whereas the GPR has limitations in representing the variability of non-Gaussian skewed data. From the implementations, it is determined that the SBR method has potential to be further developed as an effective tool of anomaly detection or outlier identification in groundwater state data.
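    The subagging (subsample aggregating) idea the abstract relies on can be sketched on synthetic data: fit a simple regression on many random subsamples, aggregate the fitted trends, and use the spread of the subsample estimates as the uncertainty measure. The subsample fraction, noise level, and series length below are illustrative assumptions, not the paper's settings.

```python
import random
import statistics

def ols_slope(points):
    # Ordinary least-squares slope for a list of (t, value) pairs.
    xs, ys = zip(*points)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

def subagging_slope(data, n_sub=200, frac=0.5, seed=0):
    # Fit on many random half-size subsamples; aggregate mean and spread.
    rng = random.Random(seed)
    k = max(2, int(frac * len(data)))
    slopes = [ols_slope(rng.sample(data, k)) for _ in range(n_sub)]
    return statistics.mean(slopes), statistics.stdev(slopes)

# Noisy groundwater-level-like series with a true trend of 0.5 per step.
rng = random.Random(1)
data = [(t, 0.5 * t + rng.gauss(0, 1.0)) for t in range(40)]
trend, spread = subagging_slope(data)
print(round(trend, 2))
```

    The aggregated trend lands near the true 0.5, and `spread` provides the uncertainty estimate that, as the abstract notes, plain point estimators lack.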

  3. Comparison of Dynamic Contrast Enhanced MRI and Quantitative SPECT in a Rat Glioma Model

    PubMed Central

    Skinner, Jack T.; Yankeelov, Thomas E.; Peterson, Todd E.; Does, Mark D.

    2012-01-01

    Pharmacokinetic modeling of dynamic contrast enhanced (DCE)-MRI data provides measures of the extracellular volume fraction (ve) and the volume transfer constant (Ktrans) in a given tissue. These parameter estimates may be biased, however, by confounding issues such as contrast agent and tissue water dynamics, or assumptions of vascularization and perfusion made by the commonly used model. In contrast to MRI, radiotracer imaging with SPECT is insensitive to water dynamics. A quantitative dual-isotope SPECT technique was developed to obtain an estimate of ve in a rat glioma model for comparison to the corresponding estimates obtained using DCE-MRI with a vascular input function (VIF) and reference region model (RR). Both DCE-MRI methods produced consistently larger estimates of ve in comparison to the SPECT estimates, and several experimental sources were postulated to contribute to these differences. PMID:22991315

  4. Implicit timing activates the left inferior parietal cortex.

    PubMed

    Wiener, Martin; Turkeltaub, Peter E; Coslett, H Branch

    2010-11-01

    Coull and Nobre (2008) suggested that tasks that employ temporal cues might be divided on the basis of whether these cues are explicitly or implicitly processed. Furthermore, they suggested that implicit timing preferentially engages the left cerebral hemisphere. We tested this hypothesis by conducting a quantitative meta-analysis of eleven neuroimaging studies of implicit timing using the activation-likelihood estimation (ALE) algorithm (Turkeltaub, Eden, Jones, & Zeffiro, 2002). Our analysis revealed a single but robust cluster of activation-likelihood in the left inferior parietal cortex (supramarginal gyrus). This result is in accord with the hypothesis that the left hemisphere subserves implicit timing mechanisms. Furthermore, in conjunction with a previously reported meta-analysis of explicit timing tasks, our data support the claim that implicit and explicit timing are supported by at least partially distinct neural structures. Copyright © 2010 Elsevier Ltd. All rights reserved.
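    The activation-likelihood-estimation (ALE) computation can be sketched in one dimension: each study's reported foci are blurred with a Gaussian into a "modeled activation" (MA) map, and the per-voxel ALE score combines studies as a union of independent probabilities, ale = 1 − ∏(1 − MA_i). The grid, foci, kernel width, and the 0.5 peak amplitude below are all illustrative assumptions.

```python
import math

def ma_map(foci, grid, sigma=2.0, amp=0.5):
    # Modeled activation: P(point active given this study), union over foci.
    return [1 - math.prod(1 - amp * math.exp(-((g - f) ** 2) / (2 * sigma ** 2))
                          for f in foci)
            for g in grid]

def ale_map(studies, grid):
    # ALE score per point: 1 - prod_i (1 - MA_i) across studies.
    mas = [ma_map(foci, grid) for foci in studies]
    return [1 - math.prod(1 - ma[i] for ma in mas) for i in range(len(grid))]

grid = list(range(21))
studies = [[10], [9, 15], [11]]   # foci from three hypothetical studies
ale = ale_map(studies, grid)
peak = max(range(len(grid)), key=lambda i: ale[i])
print(grid[peak])  # → 10
```

    Concurrence across studies (foci near 10) produces the ALE peak; the isolated focus at 15 does not.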

  5. Stochastic time series analysis of fetal heart-rate variability

    NASA Astrophysics Data System (ADS)

    Shariati, M. A.; Dripps, J. H.

    1990-06-01

    Fetal Heart Rate (FHR) is one of the important features of fetal biophysical activity, and its long-term monitoring is used for the antepartum (period of pregnancy before labour) assessment of fetal well-being. As yet, however, no successful method has been proposed to quantitatively represent the variety of random non-white patterns seen in FHR. The objective of this paper is to address this issue. In this study the Box-Jenkins method of model identification and diagnostic checking was used on phonocardiographically derived (averaged) FHR time series. Models remained exclusively autoregressive (AR). Kalman filtering in conjunction with a maximum likelihood estimation technique forms the parametric estimator. Diagnostics performed on the residuals indicated that a second-order model may be adequate in capturing the type of variability observed in 1 to 2 min data windows of FHR. The scheme may be viewed as a means of data reduction of a highly redundant information source, allowing much more efficient transmission of FHR information from remote locations to places with the facilities and expertise for closer analysis. The extracted parameters are intended to reflect numerically the important FHR features that are normally picked up visually by experts for their assessments. As a result, long-term FHR recorded during the antepartum period could then be screened quantitatively for detection of patterns considered normal or abnormal.
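    Fitting a second-order AR model to such a series can be sketched with the Yule-Walker equations on synthetic data (the paper itself used Kalman filtering with maximum-likelihood estimation; the coefficients and noise below are invented):

```python
import random

# AR(2) fit via Yule-Walker: solve
#   r1 = phi1*r0 + phi2*r1,   r2 = phi1*r1 + phi2*r0
# for phi1, phi2 from the sample autocovariances r0, r1, r2.

def autocov(x, lag):
    n = len(x)
    m = sum(x) / n
    return sum((x[t] - m) * (x[t - lag] - m) for t in range(lag, n)) / n

def yule_walker_ar2(x):
    r0, r1, r2 = autocov(x, 0), autocov(x, 1), autocov(x, 2)
    det = r0 * r0 - r1 * r1
    phi1 = (r1 * r0 - r1 * r2) / det
    phi2 = (r0 * r2 - r1 * r1) / det
    return phi1, phi2

# Simulate an AR(2) process: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + noise.
rng = random.Random(0)
x = [0.0, 0.0]
for _ in range(5000):
    x.append(0.6 * x[-1] - 0.3 * x[-2] + rng.gauss(0, 1))
phi1, phi2 = yule_walker_ar2(x)
print(round(phi1, 2), round(phi2, 2))
```

    The two recovered coefficients are exactly the kind of compact parametric summary the abstract proposes transmitting instead of the raw trace.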

  6. Structure/activity relationships for biodegradability and their role in environmental assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boethling, R.S.

    1994-12-31

    Assessment of biodegradability is an important part of the review process for both new and existing chemicals under the Toxic Substances Control Act. It is often necessary to estimate biodegradability because experimental data are unavailable. Structure/biodegradability relationships (SBR) are a means to this end. Quantitative SBR have been developed, but this approach has not been very useful because they apply only to a few narrowly defined classes of chemicals. In response to the need for more widely applicable methods, multivariate analysis has been used to develop biodegradability classification models. For example, recent efforts have produced four new models. Two calculate the probability of rapid biodegradation and can be used for classification; the other two models allow semi-quantitative estimation of primary and ultimate biodegradation rates. All are based on multiple regressions against 36 preselected substructures plus molecular weight. Such efforts have been fairly successful by statistical criteria, but in general are hampered by a lack of large and consistent datasets. Knowledge-based expert systems may represent the next step in the evolution of SBR. In principle such systems need not be as severely limited by imperfect datasets. However, the codification of expert knowledge and reasoning is a critical prerequisite. Results of knowledge acquisition exercises and modeling based on them will also be described.

  7. A new quantitative model of ecological compensation based on ecosystem capital in Zhejiang Province, China*

    PubMed Central

    Jin, Yan; Huang, Jing-feng; Peng, Dai-liang

    2009-01-01

    Ecological compensation is becoming one of the key multidisciplinary issues in the field of resources and environmental management. Considering the changing relation between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value so as to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences in ecological compensation were significant among all the counties and districts. This model fills a gap in the field of quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile conflicts among benefits in the economic, social, and ecological sectors. PMID:19353749

  8. A new quantitative model of ecological compensation based on ecosystem capital in Zhejiang Province, China.

    PubMed

    Jin, Yan; Huang, Jing-feng; Peng, Dai-liang

    2009-04-01

    Ecological compensation is becoming one of the key multidisciplinary issues in the field of resources and environmental management. Considering the changing relation between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value so as to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences in ecological compensation were significant among all the counties and districts. This model fills a gap in the field of quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile conflicts among benefits in the economic, social, and ecological sectors.

  9. Connecting the Dots: Linking Environmental Justice Indicators to Daily Dose Model Estimates

    EPA Science Inventory

    Many different quantitative techniques have been developed to either assess Environmental Justice (EJ) issues or estimate exposure and dose for risk assessment. However, very few approaches have been applied to link EJ factors to exposure dose estimate and identify potential impa...

  10. Communication among neurons.

    PubMed

    Marner, Lisbeth

    2012-04-01

    The communication among neurons is the prerequisite for the working brain. To understand the cellular, neurochemical, and structural basis of this communication, and the impacts of aging and disease on brain function, quantitative measures are necessary. This thesis evaluates several quantitative neurobiological methods with respect to possible bias and methodological issues. Stereological methods are suited for the unbiased estimation of number, length, and volumes of components of the nervous system. Stereological estimates of the total length of myelinated nerve fibers were made in white matter of post mortem brains, and the impact of aging and of diseases such as schizophrenia and Alzheimer's disease was evaluated. Although stereological methods are in principle unbiased, shrinkage artifacts are difficult to account for. Positron emission tomography (PET) recordings, in conjunction with kinetic modeling, permit the quantitation of radioligand binding in brain. The novel serotonin 5-HT4 antagonist [11C]SB207145 was used as an example of the validation process for quantitative PET receptor imaging. Methods based on reference tissue as well as methods based on an arterial plasma input function were evaluated with respect to precision and accuracy. It was shown that [11C]SB207145 binding had high sensitivity to occupancy by unlabeled ligand, necessitating high specific activity in the radiosynthesis to avoid bias. The established serotonin 5-HT2A ligand [18F]altanserin was evaluated in a two-year follow-up study in elderly subjects. Application of partial volume correction of the PET data diminished the reliability of the measures, but allowed for the correct distinction between changes due to brain atrophy and receptor availability. Furthermore, a PET study of patients with Alzheimer's disease with the serotonin transporter ligand [11C]DASB showed relatively preserved serotonergic projections, despite a marked decrease in 5-HT2A receptor binding. Possible confounders are considered and the relation to the prevailing beta-amyloid hypothesis is discussed.

  11. A Probabilistic Method for Estimation of Bowel Wall Thickness in MR Colonography

    PubMed Central

    Menys, Alex; Jaffer, Asif; Bhatnagar, Gauraang; Punwani, Shonit; Atkinson, David; Halligan, Steve; Hawkes, David J.; Taylor, Stuart A.

    2017-01-01

    MRI has recently been applied as a tool to quantitatively evaluate the response to therapy in patients with Crohn’s disease, and is the preferred choice for repeated imaging. Bowel wall thickness on MRI is an important biomarker of underlying inflammatory activity, being abnormally increased in the acute phase and reducing in response to successful therapy; however, a poor level of interobserver agreement of measured thickness is reported and therefore a system for accurate, robust and reproducible measurements is desirable. We propose a novel method for estimating bowel wall-thickness to improve the poor interobserver agreement of the manual procedure. We show that the variability of wall thickness measurement between the algorithm and observer measurements (0.25mm ± 0.81mm) has differences which are similar to observer variability (0.16mm ± 0.64mm). PMID:28072831
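    The "mean difference ± standard deviation" agreement summary quoted above can be reproduced in a few lines. The measurement values below are made up solely to show the computation; they are not the study's data.

```python
import statistics

# Agreement sketch: summarize paired wall-thickness measurements (mm) as
# the mean of the pairwise differences (bias) and their standard deviation.

def agreement(a, b):
    d = [x - y for x, y in zip(a, b)]
    return statistics.mean(d), statistics.stdev(d)

algorithm = [3.1, 4.0, 2.8, 5.2, 3.6, 4.4]   # hypothetical algorithm readings
observer  = [3.0, 3.7, 2.9, 4.8, 3.4, 4.1]   # hypothetical observer readings
bias, sd = agreement(algorithm, observer)
print(round(bias, 2), round(sd, 2))  # → 0.2 0.18
```

    Comparing the algorithm-observer spread with the observer-observer spread, as the abstract does, shows whether the automated method falls within inter-rater variability.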

  12. Time-of-flight PET time calibration using data consistency

    NASA Astrophysics Data System (ADS)

    Defrise, Michel; Rezaei, Ahmadreza; Nuyts, Johan

    2018-05-01

    This paper presents new data driven methods for the time of flight (TOF) calibration of positron emission tomography (PET) scanners. These methods are derived from the consistency condition for TOF PET; they can be applied to data measured with an arbitrary tracer distribution and are numerically efficient because they do not require a preliminary image reconstruction from the non-TOF data. Two-dimensional simulations are presented for one of the methods, which only involves the first two moments of the data with respect to the TOF variable. The numerical results show that this method estimates the detector timing offsets with errors that are larger than those obtained via an initial non-TOF reconstruction, but that remain smaller than the TOF resolution and thereby have a limited impact on the quantitative accuracy of the activity image estimated with standard maximum likelihood reconstruction algorithms.

  13. 17 CFR Appendix A to Part 255 - Reporting and Recordkeeping Requirements for Covered Trading Activities

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... regarding a variety of quantitative measurements of their covered trading activities, which vary depending... entity's covered trading activities. c. The quantitative measurements that must be furnished pursuant to... prior to September 30, 2015. e. In addition to the quantitative measurements required in this appendix...

  14. Clinical Investigation of the Dopaminergic System with PET and FLUORINE-18-FLUORO-L-DOPA.

    NASA Astrophysics Data System (ADS)

    Oakes, Terrence Rayford

    1995-01-01

    Positron Emission Tomography (PET) is a tool that provides quantitative physiological information. It is valuable both in a clinical environment, where information is sought for an individual, and in a research environment, to answer more fundamental questions about physiology and disease states. PET is particularly attractive compared to other nuclear medicine imaging techniques in cases where the anatomical regions of interest are small or when true metabolic rate constants are required. One example with both of these requirements is the investigation of Parkinson's Disease, which is characterized as a presynaptic motor function deficit affecting the striatum. As dopaminergic neurons die, the ability of the striatum to affect motor function decreases. The extent of functional neuronal damage in the small sub-structures may be ascertained by measuring the ability of the caudate and putamen to trap and store dopamine, a neurotransmitter. PET is able to utilize a tracer of dopamine activity, 18F-L-DOPA, to quantitate the viability of the striatum. This thesis work deals with implementing and optimizing the many different elements that compose a PET study of the dopaminergic system, including: radioisotope production; conversion of aqueous [18F]fluoride into [18F]F2; synthesis of 18F-L-DOPA; details of the PET scan itself; measurements to estimate the radiation dosimetry; accurate measurement of a plasma input function; and the quantitation of dopaminergic activity in normal human subjects as well as in Parkinson's Disease patients.

  15. Common biology of craving across legal and illegal drugs - a quantitative meta-analysis of cue-reactivity brain response.

    PubMed

    Kühn, Simone; Gallinat, Jürgen

    2011-04-01

    The present quantitative meta-analysis set out to test whether cue-reactivity responses in humans differ across drugs of abuse and whether these responses constitute the biological basis of drug craving as a core psychopathology of addiction. By means of activation likelihood estimation, we investigated the concurrence of brain regions activated by cue-induced craving paradigms across studies on nicotine, alcohol and cocaine addicts. Furthermore, we analysed the concurrence of brain regions positively correlated with self-reported craving in nicotine and alcohol studies. We found direct overlap between nicotine, alcohol and cocaine cue reactivity in the ventral striatum. In addition, regions of close proximity were observed in the anterior cingulate cortex (ACC; nicotine and cocaine) and amygdala (alcohol, nicotine and cocaine). Brain regions of concurrence in drug cue-reactivity paradigms that overlapped with brain regions of concurrence in self-reported craving correlations were found in the ACC, ventral striatum and right pallidum (for alcohol). This first quantitative meta-analysis on drug cue reactivity identifies brain regions underlying nicotine, alcohol and cocaine dependency, i.e. the ventral striatum. The ACC, right pallidum and ventral striatum were related to drug cue reactivity as well as self-reported craving, suggesting that this set of brain regions constitutes the core circuit of drug craving in nicotine and alcohol addiction. © 2011 The Authors. European Journal of Neuroscience © 2011 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.

  16. Towards a Quantitative Framework for Evaluating Vulnerability of Drinking Water Wells to Contamination from Unconventional Oil & Gas Development

    NASA Astrophysics Data System (ADS)

    Soriano, M., Jr.; Deziel, N. C.; Saiers, J. E.

    2017-12-01

    The rapid expansion of unconventional oil and gas (UO&G) production, made possible by advances in hydraulic fracturing (fracking), has triggered concerns over risks this extraction poses to water resources and public health. Concerns are particularly acute within communities that host UO&G development and rely heavily on shallow aquifers as sources of drinking water. This research aims to develop a quantitative framework to evaluate the vulnerability of drinking water wells to contamination from UO&G activities. The concept of well vulnerability is explored through application of backwards travel time probability modeling to estimate the likelihood that capture zones of drinking water wells circumscribe source locations of UO&G contamination. Sources of UO&G contamination considered in this analysis include gas well pads and documented sites of UO&G wastewater and chemical spills. The modeling approach is illustrated for a portion of Susquehanna County, Pennsylvania, where more than one thousand shale gas wells have been completed since 2005. Data from a network of eight multi-level groundwater monitoring wells installed in the study site in 2015 are used to evaluate the model. The well vulnerability concept is proposed as a physically based quantitative tool for policy-makers dealing with the management of contamination risks of drinking water wells. In particular, the model can be used to identify adequate setback distances of UO&G activities from drinking water wells and other critical receptors.

  17. Repeat 24-hour recalls and locally developed food composition databases: a feasible method to estimate dietary adequacy in a multi-site preconception maternal nutrition RCT.

    PubMed

    Lander, Rebecca L; Hambidge, K Michael; Krebs, Nancy F; Westcott, Jamie E; Garces, Ana; Figueroa, Lester; Tejeda, Gabriela; Lokangaka, Adrien; Diba, Tshilenge S; Somannavar, Manjunath S; Honnayya, Ranjitha; Ali, Sumera A; Khan, Umber S; McClure, Elizabeth M; Thorsten, Vanessa R; Stolka, Kristen B

    2017-01-01

    Background: Our aim was to utilize a feasible quantitative methodology to estimate the dietary adequacy of >900 first-trimester pregnant women in poor rural areas of the Democratic Republic of the Congo, Guatemala, India and Pakistan. This paper outlines the dietary methods used. Methods: Local nutritionists were trained at the sites by the lead study nutritionist and received ongoing mentoring throughout the study. Training topics focused on the standardized conduct of repeat multiple-pass 24-hr dietary recalls, including interview techniques, estimation of portion sizes, and construction of a unique site-specific food composition database (FCDB). Each FCDB was based on 13 food groups and included values for moisture, energy, 20 nutrients (i.e. macro- and micronutrients), and phytate (an anti-nutrient). Nutrient values for individual foods or beverages were taken from recently developed FAO-supported regional food composition tables or the USDA national nutrient database. Appropriate adjustments for differences in moisture and application of nutrient retention and yield factors after cooking were applied, as needed. Generic recipes for mixed dishes consumed by the study population were compiled at each site, followed by calculation of a median recipe per 100 g. Each recipe's nutrient values were included in the FCDB. Final site FCDB checks were planned according to FAO/INFOODS guidelines. Discussion: This dietary strategy provides the opportunity to assess estimated mean group usual energy and nutrient intakes and the estimated prevalence of the population 'at risk' of inadequate intakes in first-trimester pregnant women living in four low- and middle-income countries. While challenges and limitations exist, this methodology demonstrates the practical application of a quantitative dietary strategy for a large international multi-site nutrition trial, providing within- and between-site comparisons. Moreover, it provides an excellent opportunity for local capacity building, and each site FCDB can be easily modified for additional research activities conducted in other populations living in the same area.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stegen, James C.; Lin, Xueju; Fredrickson, Jim K.

    Across a set of ecological communities connected to each other through organismal dispersal (a ‘meta-community’), turnover in composition is governed by (ecological) Drift, Selection, and Dispersal Limitation. Quantitative estimates of these processes remain elusive, but would represent a common currency needed to unify community ecology. Using a novel analytical framework we quantitatively estimate the relative influences of Drift, Selection, and Dispersal Limitation on subsurface, sediment-associated microbial meta-communities. The communities we study are distributed across two geologic formations encompassing ~12,500 m3 of uranium-contaminated sediments within the Hanford Site in eastern Washington State. We find that Drift consistently governs ~25% of spatial turnover in community composition; Selection dominates (governing ~60% of turnover) across spatially-structured habitats associated with fine-grained, low permeability sediments; and Dispersal Limitation is most influential (governing ~40% of turnover) across spatially-unstructured habitats associated with coarse-grained, highly-permeable sediments. Quantitative influences of Selection and Dispersal Limitation may therefore be predictable from knowledge of environmental structure. To develop a system-level conceptual model we extend our analytical framework to compare process estimates across formations, characterize measured and unmeasured environmental variables that impose Selection, and identify abiotic features that limit dispersal. Insights gained here suggest that community ecology can benefit from a shift in perspective; the quantitative approach developed here goes beyond the ‘niche vs. neutral’ dichotomy by moving towards a style of natural history in which estimates of Selection, Dispersal Limitation and Drift can be described, mapped and compared across ecological systems.

  19. Bayesian parameter estimation in spectral quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Pulkkinen, Aki; Cox, Ben T.; Arridge, Simon R.; Kaipio, Jari P.; Tarvainen, Tanja

    2016-03-01

    Photoacoustic tomography (PAT) is an imaging technique combining the strong contrast of optical imaging with the high spatial resolution of ultrasound imaging. These strengths are achieved via the photoacoustic effect, where the spatial absorption of a light pulse is converted into a measurable propagating ultrasound wave. The method is seen as a potential tool for small animal imaging, pre-clinical investigations, study of blood vessels and vasculature, as well as for cancer imaging. The goal in PAT is to form an image of the absorbed optical energy density field via acoustic inverse problem approaches from the measured ultrasound data. Quantitative PAT (QPAT) proceeds from these images and forms quantitative estimates of the optical properties of the target. This optical inverse problem of QPAT is ill-posed. To alleviate the issue, spectral QPAT (SQPAT) utilizes PAT data formed at multiple optical wavelengths simultaneously with optical parameter models of tissue to form quantitative estimates of the parameters of interest. In this work, the inverse problem of SQPAT is investigated. Light propagation is modelled using the diffusion equation. Optical absorption is described with a chromophore-concentration-weighted sum of known chromophore absorption spectra. Scattering is described by Mie scattering theory with an exponential power law. In the inverse problem, the spatially varying unknown parameters of interest are the chromophore concentrations, the Mie scattering parameters (power law factor and exponent), and the Grüneisen parameter. The inverse problem is approached with a Bayesian method. It is numerically demonstrated that estimation of all parameters of interest is possible with this approach.
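    The spectral absorption model used here, mu_a(λ) = Σ_k c_k ε_k(λ), can be illustrated with a small least-squares recovery of two chromophore concentrations from a sampled absorption spectrum. This is a deliberately simplified sketch with invented spectra; the paper solves the full ill-posed problem (including scattering and the Grüneisen parameter) with a Bayesian method.

```python
# Recover chromophore concentrations c1, c2 from
#   mu_a(λ) = c1*e1(λ) + c2*e2(λ)
# sampled at a few wavelengths, via the 2x2 normal equations.

def solve_2x2(a11, a12, a21, a22, b1, b2):
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

def fit_concentrations(e1, e2, mua):
    # Normal equations for min ||c1*e1 + c2*e2 - mua||^2.
    a11 = sum(x * x for x in e1)
    a12 = sum(x * y for x, y in zip(e1, e2))
    a22 = sum(y * y for y in e2)
    b1 = sum(x * m for x, m in zip(e1, mua))
    b2 = sum(y * m for y, m in zip(e2, mua))
    return solve_2x2(a11, a12, a12, a22, b1, b2)

e1 = [0.9, 0.6, 0.3, 0.1]   # chromophore 1 spectrum at 4 wavelengths (made up)
e2 = [0.2, 0.4, 0.7, 0.8]   # chromophore 2 spectrum (made up)
true = (2.0, 0.5)
mua = [true[0] * x + true[1] * y for x, y in zip(e1, e2)]
c1, c2 = fit_concentrations(e1, e2, mua)
print(round(c1, 2), round(c2, 2))  # → 2.0 0.5
```

    Because the two spectra differ in shape across wavelengths, the multi-wavelength data determine both concentrations; with a single wavelength the problem would be underdetermined, which is the motivation for the spectral approach.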

  20. An advanced application of the quantitative structure-activity relationship concept in electrokinetic chromatography of metal complexes.

    PubMed

    Oszwałdowski, Sławomir; Timerbaev, Andrei R

    2008-02-01

    The relevance of the quantitative structure-activity relationship (QSAR) principle in MEKC and microemulsion EKC (MEEKC) of metal-ligand complexes was evaluated for a better understanding of analyte migration mechanism. A series of gallium chelates were applied as test solutes with available experimental migration data in order to reveal the molecular properties that govern the separation. The QSAR models operating with n-octanol-water partition coefficients or van der Waals volumes were found to be valid for estimation of the retention factors (log k') of neutral compounds when using only an aqueous MEEKC electrolyte. On the other hand, consistent approximations of log k' for both uncharged and charged complexes in either EKC mode (and also with hydro-organic BGEs) were achievable with two-parametric QSARs in which the dipole moment is additionally incorporated as a structural descriptor, reflecting the electrostatic solute-pseudostationary phase interaction. The theoretical analysis of significant molecular parameters in MEKC systems, in which the micellar BGE is modified with an organic solvent, confirmed that concomitant consideration of hydrophobic, electrostatic, and solvation factors is essential for explaining the migration behavior of neutral metal complexes.

  1. Methods for Derivation of Inhalation Reference Concentrations and Application of Inhalation Dosimetry

    EPA Pesticide Factsheets

    EPA's methodology for estimation of inhalation reference concentrations (RfCs) as benchmark estimates of the quantitative dose-response assessment of chronic noncancer toxicity for individual inhaled chemicals.

  2. 50 CFR 600.340 - National Standard 7-Costs and Benefits.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... regulation are real and substantial relative to the added research, administrative, and enforcement costs, as..., including the status quo, is adequate. If quantitative estimates are not possible, qualitative estimates...

  3. 50 CFR 600.340 - National Standard 7-Costs and Benefits.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... regulation are real and substantial relative to the added research, administrative, and enforcement costs, as..., including the status quo, is adequate. If quantitative estimates are not possible, qualitative estimates...

  4. 50 CFR 600.340 - National Standard 7-Costs and Benefits.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... regulation are real and substantial relative to the added research, administrative, and enforcement costs, as..., including the status quo, is adequate. If quantitative estimates are not possible, qualitative estimates...

  5. 50 CFR 600.340 - National Standard 7-Costs and Benefits.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... regulation are real and substantial relative to the added research, administrative, and enforcement costs, as..., including the status quo, is adequate. If quantitative estimates are not possible, qualitative estimates...

  6. 50 CFR 600.340 - National Standard 7-Costs and Benefits.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... regulation are real and substantial relative to the added research, administrative, and enforcement costs, as..., including the status quo, is adequate. If quantitative estimates are not possible, qualitative estimates...

  7. Measuring Aircraft Capability for Military and Political Analysis

    DTIC Science & Technology

    1976-03-01

    challenged in 1932 when a panel of distinguished British scientists discussed the feasibility of quantitatively estimating sensory events... Quantitative Analysis of Social Problems , E.R. Tufte (ed.), p. 407, Addison-Wesley, 1970. 17 "artificial" boundaries are imposed on the data. Less...of arms transfers in various parts of the world as well. Quantitative research (and hence measurement) contributes to theoretical development by

  8. Development of retrospective quantitative and qualitative job-exposure matrices for exposures at a beryllium processing facility.

    PubMed

    Couch, James R; Petersen, Martin; Rice, Carol; Schubauer-Berigan, Mary K

    2011-05-01

    To construct a job-exposure matrix (JEM) for an Ohio beryllium processing facility between 1953 and 2006 and to evaluate temporal changes in airborne beryllium exposures. Quantitative area- and breathing-zone-based exposure measurements of airborne beryllium were made between 1953 and 2006 and used by plant personnel to estimate daily weighted average (DWA) exposure concentrations for sampled departments and operations. These DWA measurements were used to create a JEM with 18 exposure metrics, which was linked to the plant cohort consisting of 18,568 unique job, department and year combinations. The exposure metrics ranged from quantitative metrics (annual arithmetic/geometric average DWA exposures, maximum DWA and peak exposures) to descriptive qualitative metrics (chemical beryllium species and physical form) to qualitative assignment of exposure to other risk factors (yes/no). Twelve collapsed job titles with long-term consistent industrial hygiene samples were evaluated using regression analysis for time trends in DWA estimates. Annual arithmetic mean DWA estimates (overall plant-wide exposures including administration, non-production, and production estimates) for the data by decade ranged from a high of 1.39 μg/m³ in the 1950s to a low of 0.33 μg/m³ in the 2000s. Of the 12 jobs evaluated for temporal trend, the average arithmetic DWA mean was 2.46 μg/m³ and the average geometric mean DWA was 1.53 μg/m³. After the DWA calculations were log-transformed, 11 of the 12 had a statistically significant (p < 0.05) decrease in reported exposure over time. The constructed JEM successfully differentiated beryllium exposures across jobs and over time. This is the only quantitative JEM containing exposure estimates (average and peak) for the entire plant history.
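
The temporal-trend test described (regression of log-transformed DWA estimates against time) can be sketched as follows; the exposure series is invented, loosely echoing the decade averages quoted:

```python
import math

def log_trend_slope(years, dwa):
    """Least-squares slope of ln(DWA) versus calendar year; a negative
    slope indicates declining exposure, as found for 11 of the 12 jobs."""
    logs = [math.log(v) for v in dwa]
    n = len(years)
    my, ml = sum(years) / n, sum(logs) / n
    sxx = sum((t - my) ** 2 for t in years)
    sxy = sum((t - my) * (l - ml) for t, l in zip(years, logs))
    return sxy / sxx

# Invented DWA series (ug/m^3), loosely echoing the decade averages
years = [1955, 1965, 1975, 1985, 1995, 2005]
dwa = [1.39, 1.10, 0.80, 0.60, 0.45, 0.33]
print(log_trend_slope(years, dwa) < 0)  # True: exposure declines over time
```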

  9. Temporal Data Set Reduction Based on D-Optimality for Quantitative FLIM-FRET Imaging.

    PubMed

    Omer, Travis; Intes, Xavier; Hahn, Juergen

    2015-01-01

    Fluorescence lifetime imaging (FLIM) when paired with Förster resonance energy transfer (FLIM-FRET) enables the monitoring of nanoscale interactions in living biological samples. FLIM-FRET model-based estimation methods allow the quantitative retrieval of parameters such as the quenched (interacting) and unquenched (non-interacting) fractional populations of the donor fluorophore and/or the distance of the interactions. The quantitative accuracy of such model-based approaches is dependent on multiple factors such as signal-to-noise ratio and number of temporal points acquired when sampling the fluorescence decays. For high-throughput or in vivo applications of FLIM-FRET, it is desirable to acquire a limited number of temporal points for fast acquisition times. Yet, it is critical to acquire temporal data sets with sufficient information content to allow for accurate FLIM-FRET parameter estimation. Herein, an optimal experimental design approach based upon sensitivity analysis is presented in order to identify the time points that provide the best quantitative estimates of the parameters for a determined number of temporal sampling points. More specifically, the D-optimality criterion is employed to identify, within a sparse temporal data set, the set of time points leading to optimal estimations of the quenched fractional population of the donor fluorophore. Overall, a reduced set of 10 time points (compared to a typical complete set of 90 time points) was identified to have minimal impact on parameter estimation accuracy (≈5%), with in silico and in vivo experiment validations. This reduction of the number of needed time points by almost an order of magnitude allows the use of FLIM-FRET for certain high-throughput applications which would be infeasible if the entire number of time sampling points were used.
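
For a single parameter, the D-optimality criterion reduces to maximizing the scalar Fisher information, i.e. the sum of squared parameter sensitivities over the selected time gates. A minimal sketch for a bi-exponential decay (the lifetimes tq, tu and gate spacing are invented, not those of the study):

```python
import math
from itertools import combinations

# Bi-exponential FLIM-FRET decay f(t) = a*exp(-t/tq) + (1-a)*exp(-t/tu);
# sensitivity of the signal to the quenched fraction a:
def sens(t, tq=0.5, tu=2.5):
    return math.exp(-t / tq) - math.exp(-t / tu)

# D-optimality for one parameter: maximize the (1x1) Fisher information,
# i.e. the sum of squared sensitivities over the chosen gates.
def d_optimal_subset(times, k):
    return max(combinations(times, k),
               key=lambda sub: sum(sens(t) ** 2 for t in sub))

gates = [0.1 * i for i in range(1, 91)]   # 90 time gates (ns), illustrative
best = d_optimal_subset(gates, 3)
print(sorted(best))  # gates cluster where |sensitivity| peaks (near t = 1)
```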

  10. Quantitative diet reconstruction of a Neolithic population using a Bayesian mixing model (FRUITS): The case study of Ostorf (Germany).

    PubMed

    Fernandes, Ricardo; Grootes, Pieter; Nadeau, Marie-Josée; Nehlich, Olaf

    2015-07-14

    The island cemetery site of Ostorf (Germany) consists of individual human graves containing Funnel Beaker ceramics dating to the Early or Middle Neolithic. However, previous isotope and radiocarbon analysis demonstrated that the Ostorf individuals had a diet rich in freshwater fish. The present study was undertaken to quantitatively reconstruct the diet of the Ostorf population and establish if dietary habits are consistent with the traditional characterization of a Neolithic diet. Quantitative diet reconstruction was achieved through a novel approach consisting of the use of the Bayesian mixing model Food Reconstruction Using Isotopic Transferred Signals (FRUITS) to model isotope measurements from multiple dietary proxies (δ¹³C collagen, δ¹⁵N collagen, δ¹³C bioapatite, δ³⁴S methionine, ¹⁴C collagen). The accuracy of model estimates was verified by comparing the agreement between observed and estimated human dietary radiocarbon reservoir effects. Quantitative diet reconstruction estimates confirm that the Ostorf individuals had a high protein intake due to the consumption of fish and terrestrial animal products. However, FRUITS estimates also show that plant foods represented a significant source of calories. Observed and estimated human dietary radiocarbon reservoir effects are in good agreement provided that the aquatic reservoir effect at Lake Ostorf is taken as reference. The Ostorf population apparently adopted elements associated with a Neolithic culture but adapted to available local food resources and implemented a subsistence strategy that involved a large proportion of fish and terrestrial meat consumption. This case study exemplifies the diversity of subsistence strategies followed during the Neolithic. Am J Phys Anthropol, 2015. © 2015 Wiley Periodicals, Inc.

  11. Dynamic Granger-Geweke causality modeling with application to interictal spike propagation

    PubMed Central

    Lin, Fa-Hsuan; Hara, Keiko; Solo, Victor; Vangel, Mark; Belliveau, John W.; Stufflebeam, Steven M.; Hamalainen, Matti S.

    2010-01-01

    A persistent problem in developing plausible neurophysiological models of perception, cognition, and action is the difficulty of characterizing the interactions between different neural systems. Previous studies have approached this problem by estimating causal influences across brain areas activated during cognitive processing using Structural Equation Modeling and, more recently, with Granger-Geweke causality. While SEM is complicated by the need for a priori directional connectivity information, the temporal resolution of dynamic Granger-Geweke estimates is limited because the underlying autoregressive (AR) models assume stationarity over the period of analysis. We have developed a novel optimal method for obtaining data-driven directional causality estimates with high temporal resolution in both time and frequency domains. This is achieved by simultaneously optimizing the length of the analysis window and the chosen AR model order using the SURE criterion. Dynamic Granger-Geweke causality in time and frequency domains is subsequently calculated within a moving analysis window. We tested our algorithm by calculating the Granger-Geweke causality of epileptic spike propagation from the right frontal lobe to the left frontal lobe. The results quantitatively suggested the epileptic activity at the left frontal lobe was propagated from the right frontal lobe, in agreement with the clinical diagnosis. Our novel computational tool can be used to help elucidate complex directional interactions in the human brain. PMID:19378280
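
A Geweke-style Granger causality measure can be sketched as the log ratio of residual variances between restricted and full AR models; the fixed model order and simulated data below ignore the paper's SURE-based order and window selection:

```python
import numpy as np

def granger(x, y, p=2):
    """Geweke-style Granger causality x -> y: log ratio of residual
    variances of an AR(p) model of y without vs. with lagged-x terms."""
    def resid_var(X, z):
        beta, *_ = np.linalg.lstsq(X, z, rcond=None)
        return np.var(z - X @ beta)
    Xr = np.array([y[t - p:t] for t in range(p, len(y))])
    Xf = np.array([np.r_[y[t - p:t], x[t - p:t]] for t in range(p, len(y))])
    z = y[p:]
    return np.log(resid_var(Xr, z) / resid_var(Xf, z))

rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):          # y is driven by x with a one-step lag
    y[t] = 0.8 * x[t - 1] + 0.5 * rng.normal()
print(granger(x, y), granger(y, x))  # forward coupling dominates
```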

  12. The mechanics and energetics of soil bioturbation by earthworms and plant roots - Impacts on soil structure generation and maintenance

    NASA Astrophysics Data System (ADS)

    Or, Dani; Ruiz, Siul; Schymanski, Stanislaus

    2015-04-01

    Soil structure is the delicate arrangement of solids and voids that facilitate numerous hydrological and ecological soil functions ranging from water infiltration and retention to gaseous exchange and mechanical anchoring of plant roots. Many anthropogenic activities affect soil structure, e.g. via tillage and compaction, and by promotion or suppression of biological activity and soil carbon pools. Soil biological activity is critical to the generation and maintenance of favorable soil structure, primarily through bioturbation by earthworms and root proliferation. The study aims to quantify the mechanisms, rates, and energetics associated with soil bioturbation, using a new biomechanical model to estimate stresses required to penetrate and expand a cylindrical cavity in a soil under different hydration and mechanical conditions. The stresses and soil displacement involved are placed in their ecological context (typical sizes, population densities, burrowing rates and behavior) enabling estimation of mechanical energy requirements and impacts on soil organic carbon pool (in the case of earthworms). We consider steady state plastic cavity expansion to determine burrowing pressures of earthworms and plant roots, akin to models of cone penetration representing initial burrowing into soil volumes. Results show that with increasing water content the strain energy decreases and suggest trade-offs between cavity expansion pressures and energy investment for different root and earthworm geometries and soil hydration. The study provides a quantitative framework for estimating energy costs of bioturbation in terms of soil organic carbon or the mechanical costs of soil exploration by plant roots as well as mechanical and hydration limits to such activities.

  13. Mountain Heavy Rainfall Measurement Experiments in a Subtropical Monsoon Environment

    NASA Astrophysics Data System (ADS)

    Jong-Dao Jou, Ben; Chi-June Jung, Ultimate; Lai, Hsiao-Wei; Feng, Lei

    2014-05-01

    Quantitative rainfall measurement experiments have been conducted in the Taiwan area for the past 5 years (since 2008), especially over regions of complex terrain. In this paper, results from these experiments are analyzed and discussed, especially those associated with heavy rain events in the summer monsoon season. Observations from the S-band polarimetric radar (SPOL of NCAR) and an X-band vertically pointing radar are analyzed to reveal the high-resolution temporal and spatial variation of precipitation structure. May and June, the Meiyu season in the area, are months with subtropical frontal rainfall events. Mesoscale convective systems, i.e., pre-frontal squall lines and frontal convective rainbands, are very active and frequently produce heavy rain events over mountain areas. Accurate quantitative precipitation measurements are needed to meet the requirements of landslide and flood early warning. Using ground-based disdrometers and the vertically pointing radar, we have been working to improve quantitative precipitation estimation in the mountain region obtained from the coastal operational radar. In this paper, the methodology applied is presented and the potential of its application is discussed. *corresponding author: Ben Jong-Dao Jou, jouben43@gmail.com

  14. The linearized multistage model and the future of quantitative risk assessment.

    PubMed

    Crump, K S

    1996-10-01

    The linearized multistage (LMS) model has for over 15 years been the default dose-response model used by the U.S. Environmental Protection Agency (USEPA) and other federal and state regulatory agencies in the United States for calculating quantitative estimates of low-dose carcinogenic risks from animal data. The LMS model is in essence a flexible statistical model that can describe both linear and non-linear dose-response patterns, and that produces an upper confidence bound on the linear low-dose slope of the dose-response curve. Unlike its namesake, the Armitage-Doll multistage model, the parameters of the LMS do not correspond to actual physiological phenomena. Thus the LMS is 'biological' only to the extent that the true biological dose response is linear at low dose and that low-dose slope is reflected in the experimental data. If the true dose response is non-linear the LMS upper bound may overestimate the true risk by many orders of magnitude. However, competing low-dose extrapolation models, including those derived from 'biologically-based models' that are capable of incorporating additional biological information, have not shown evidence to date of being able to produce quantitative estimates of low-dose risks that are any more accurate than those obtained from the LMS model. Further, even if these attempts were successful, the extent to which more accurate estimates of low-dose risks in a test animal species would translate into improved estimates of human risk is questionable. Thus, it does not appear possible at present to develop a quantitative approach that would be generally applicable and that would offer significant improvements upon the crude bounding estimates of the type provided by the LMS model. Draft USEPA guidelines for cancer risk assessment incorporate an approach similar to the LMS for carcinogens having a linear mode of action. However, under these guidelines quantitative estimates of low-dose risks would not be developed for carcinogens having a non-linear mode of action; instead dose-response modelling would be used in the experimental range to calculate an LED10* (a statistical lower bound on the dose corresponding to a 10% increase in risk), and safety factors would be applied to the LED10* to determine acceptable exposure levels for humans. This approach is very similar to the one presently used by USEPA for non-carcinogens. Rather than using one approach for carcinogens believed to have a linear mode of action and a different approach for all other health effects, it is suggested herein that it would be more appropriate to use an approach conceptually similar to the 'LED10*-safety factor' approach for all health effects, and not to routinely develop quantitative risk estimates from animal data.
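
The LMS functional form and its low-dose linearity can be written out directly; the coefficients below are invented for illustration:

```python
import math

# LMS dose-response: P(d) = 1 - exp(-(q0 + q1*d + q2*d^2)), all qi >= 0.
# The extra risk over background does not depend on q0:
def extra_risk(d, q1, q2):
    return 1 - math.exp(-(q1 * d + q2 * d * d))

# At low dose the extra risk is approximately q1*d, which is why the
# upper confidence bound on q1 drives the low-dose risk estimate.
q1, q2, d = 0.05, 0.2, 1e-4
print(extra_risk(d, q1, q2))   # essentially q1*d at this dose
```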

  15. Comparing Bayesian estimates of genetic differentiation of molecular markers and quantitative traits: an application to Pinus sylvestris.

    PubMed

    Waldmann, P; García-Gil, M R; Sillanpää, M J

    2005-06-01

    Comparison of the level of differentiation at neutral molecular markers (estimated as F(ST) or G(ST)) with the level of differentiation at quantitative traits (estimated as Q(ST)) has become a standard tool for inferring that there is differential selection between populations. We estimated Q(ST) of timing of bud set from a latitudinal cline of Pinus sylvestris with a Bayesian hierarchical variance component method utilizing the information on the pre-estimated population structure from neutral molecular markers. Unfortunately, the between-family variances differed substantially between populations that resulted in a bimodal posterior of Q(ST) that could not be compared in any sensible way with the unimodal posterior of the microsatellite F(ST). In order to avoid publishing studies with flawed Q(ST) estimates, we recommend that future studies should present heritability estimates for each trait and population. Moreover, to detect variance heterogeneity in frequentist methods (ANOVA and REML), it is of essential importance to check also that the residuals are normally distributed and do not follow any systematically deviating trends.
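
The Q(ST) statistic compared against F(ST) above has a simple variance-component form; a sketch with hypothetical variance components:

```python
# Q_ST from additive genetic variance components; the factor 2 in the
# denominator reflects the diploid within-population term.
def qst(var_between, var_within_additive):
    return var_between / (var_between + 2 * var_within_additive)

# Hypothetical variance components for timing of bud set
q = qst(0.4, 0.3)
print(q)  # 0.4; Q_ST exceeding the neutral F_ST suggests divergent selection
```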

  16. Robust estimation of adaptive tensors of curvature by tensor voting.

    PubMed

    Tong, Wai-Shun; Tang, Chi-Keung

    2005-03-01

    Although curvature estimation from a given mesh or regularly sampled point set is a well-studied problem, it is still challenging when the input consists of a cloud of unstructured points corrupted by misalignment error and outlier noise. Such input is ubiquitous in computer vision. In this paper, we propose a three-pass tensor voting algorithm to robustly estimate curvature tensors, from which accurate principal curvatures and directions can be calculated. Our quantitative estimation is an improvement over the previous two-pass algorithm, where only qualitative curvature estimation (sign of Gaussian curvature) is performed. To overcome misalignment errors, our improved method automatically corrects input point locations at subvoxel precision, which also rejects outliers that are uncorrectable. To adapt to different scales locally, we define the RadiusHit of a curvature tensor to quantify estimation accuracy and applicability. Our curvature estimation algorithm has been validated in detailed quantitative experiments, performing better on a variety of standard error metrics (percentage error in curvature magnitude, absolute angle difference in curvature direction) in the presence of a large amount of misalignment noise.

  17. Centennial increase in geomagnetic activity: Latitudinal differences and global estimates

    NASA Astrophysics Data System (ADS)

    Mursula, K.; Martini, D.

    2006-08-01

    We study here the centennial change in geomagnetic activity using the newly proposed Inter-Hour Variability (IHV) index. We correct the earlier estimates of the centennial increase by taking into account the effect of the change of the sampling of the magnetic field from one sample per hour to hourly means in the first years of the previous century. Since the IHV index is a variability index, the larger variability in the case of hourly sampling leads, without due correction, to excessively large values in the beginning of the century and an underestimated centennial increase. We discuss two ways to extract the necessary sampling calibration factors and show that they agree very well with each other. The effect of calibration is especially large at the midlatitude Cheltenham/Fredricksburg (CLH/FRD) station where the centennial increase changes from only 6% to 24% caused by calibration. Sampling calibration also leads to a larger centennial increase of global geomagnetic activity based on the IHV index. The results verify a significant centennial increase in global geomagnetic activity, in a qualitative agreement with the aa index, although a quantitative comparison is not warranted. We also find that the centennial increase has a rather strong and curious latitudinal dependence. It is largest at high latitudes. Quite unexpectedly, it is larger at low latitudes than at midlatitudes. These new findings indicate interesting long-term changes in near-Earth space. We also discuss possible internal and external causes for these observed differences. The centennial change of geomagnetic activity may be partly affected by changes in external conditions, partly by the secular decrease of the Earth's magnetic moment whose effect in near-Earth space may be larger than estimated so far.
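
The sampling effect being corrected can be reproduced with synthetic data: hourly spot samples retain high-frequency variance that hourly means average away, so a variability index computed from spot values is inflated (all numbers below are synthetic):

```python
import random

def ihv(series):
    """IHV-like index: mean absolute difference of successive hourly values."""
    return sum(abs(b - a) for a, b in zip(series, series[1:])) / (len(series) - 1)

random.seed(1)
minutes = [random.gauss(0.0, 5.0) for _ in range(24 * 60)]   # synthetic field, nT
hourly_mean = [sum(minutes[h * 60:(h + 1) * 60]) / 60 for h in range(24)]
hourly_spot = [minutes[h * 60] for h in range(24)]           # one sample per hour

# Spot sampling inflates the variability index; dividing early-century
# values by such a factor is the kind of calibration the study applies.
factor = ihv(hourly_spot) / ihv(hourly_mean)
print(factor > 1)  # True
```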

  18. Evaluation of polyurethane foam passive air sampler (PUF) as a tool for occupational PAH measurements.

    PubMed

    Strandberg, Bo; Julander, Anneli; Sjöström, Mattias; Lewné, Marie; Koca Akdeva, Hatice; Bigert, Carolina

    2018-01-01

    Routine monitoring of workplace exposure to polycyclic aromatic hydrocarbons (PAHs) is performed mainly via active sampling. However, active samplers have several drawbacks and, in some cases, may even be unusable. Polyurethane foam (PUF) personal passive air samplers are a good alternative for PAH monitoring in occupational air (8 h). However, PUFs must be further tested to reliably yield detectable levels of PAHs in short exposure times (1-3 h) and under extreme occupational conditions. Therefore, we compared the personal exposure monitoring performance of a passive PUF sampler with that of an active air sampler and determined the corresponding uptake rates (Rs). These rates were then used to estimate the occupational exposure of firefighters and police forensic specialists to 32 PAHs. The work environments studied were heavily contaminated by PAHs with (for example) benzo(a)pyrene ranging from 0.2 to 56 ng m⁻³, as measured via active sampling. We show that, even after short exposure times, PUF can reliably accumulate both gaseous and particle-bound PAHs. The Rs-values are almost independent of variables such as the concentration and the wind speed. Therefore, by using the Rs-values (2.0-20 m³ day⁻¹), the air concentrations can be estimated within a factor of two for gaseous PAHs and a factor of 10 for particulate PAHs. With very short sampling times (1 h), our method can serve as a (i) simple and user-friendly semi-quantitative screening tool for estimating and tracking point sources of PAH in micro-environments and (ii) complement to the traditional active pumping methods. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
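
The passive-sampler concentration estimate implied by an uptake rate Rs follows from C = m / (Rs · t); a sketch with hypothetical values:

```python
def air_concentration(mass_ng, rs_m3_per_day, hours):
    """Time-weighted air concentration (ng/m^3) from the analyte mass
    collected on the PUF, the uptake rate Rs, and the exposure time."""
    sampled_volume_m3 = rs_m3_per_day * hours / 24.0
    return mass_ng / sampled_volume_m3

# Hypothetical: 1.2 ng collected on the PUF, Rs = 4 m^3/day, 2 h of sampling
print(air_concentration(1.2, 4.0, 2.0))  # ~3.6 ng/m^3
```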

  19. Modulation of Active Site Electronic Structure by the Protein Matrix to Control [NiFe] Hydrogenase Reactivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Dayle MA; Raugei, Simone; Squier, Thomas C.

    2014-09-30

    Control of the reactivity of the nickel center of the [NiFe] hydrogenase and other metalloproteins commonly involves outer coordination sphere ligands that act to modify the geometry and physical properties of the active site metal centers. We carried out a combined set of classical molecular dynamics and quantum/classical mechanics calculations to provide quantitative estimates of how dynamic fluctuations of the active site within the protein matrix modulate the electronic structure at the catalytic center. Specifically we focused on the dynamics of the inner and outer coordination spheres of the cysteinate-bound Ni–Fe cluster in the catalytically active Ni-C state. There are correlated movements of the cysteinate ligands and the surrounding hydrogen-bonding network, which modulate the electron affinity at the active site and the proton affinity of a terminal cysteinate. On the basis of these findings, we hypothesize a coupling between protein dynamics and electron and proton transfer reactions critical to dihydrogen production.

  20. Modulation of active site electronic structure by the protein matrix to control [NiFe] hydrogenase reactivity.

    PubMed

    Smith, Dayle M A; Raugei, Simone; Squier, Thomas C

    2014-11-21

    Control of the reactivity of the nickel center of the [NiFe] hydrogenase and other metalloproteins commonly involves outer coordination sphere ligands that act to modify the geometry and physical properties of the active site metal centers. We carried out a combined set of classical molecular dynamics and quantum/classical mechanics calculations to provide quantitative estimates of how dynamic fluctuations of the active site within the protein matrix modulate the electronic structure at the catalytic center. Specifically we focused on the dynamics of the inner and outer coordination spheres of the cysteinate-bound Ni-Fe cluster in the catalytically active Ni-C state. There are correlated movements of the cysteinate ligands and the surrounding hydrogen-bonding network, which modulate the electron affinity at the active site and the proton affinity of a terminal cysteinate. On the basis of these findings, we hypothesize a coupling between protein dynamics and electron and proton transfer reactions critical to dihydrogen production.

  1. Estimation of Qualitative and Quantitative Parameters of Air Cleaning by a Pulsed Corona Discharge Using Multicomponent Standard Mixtures

    NASA Astrophysics Data System (ADS)

    Filatov, I. E.; Uvarin, V. V.; Kuznetsov, D. L.

    2018-05-01

    The efficiency of removal of volatile organic impurities in air by a pulsed corona discharge is investigated using model mixtures. Based on the method of competing reactions, an approach to estimating the qualitative and quantitative parameters of the employed electrophysical technique is proposed. The concept of the "toluene coefficient," characterizing the relative reactivity of a component as compared to toluene, is introduced. It is proposed that the energy efficiency of the electrophysical method be estimated using the concept of diversified yield of the removal process. Such an approach makes it possible to substantially speed up the determination of the energy parameters of impurity removal and can also serve as a criterion for estimating the effectiveness of various methods in which a nonequilibrium plasma is used for air cleaning from volatile impurities.
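
Under competing first-order removal kinetics, a relative-reactivity coefficient like the "toluene coefficient" can be computed from paired removal fractions; the concentrations below are invented:

```python
import math

def toluene_coefficient(c0_x, c_x, c0_tol, c_tol):
    """Relative reactivity of component X versus toluene under competing
    first-order removal: ratio of the log removal fractions."""
    return math.log(c_x / c0_x) / math.log(c_tol / c0_tol)

# Invented mixture run: X falls 100 -> 40 ppm while toluene falls 100 -> 60
k_rel = toluene_coefficient(100, 40, 100, 60)
print(k_rel)  # ~1.79: X is removed about 1.8x faster than toluene
```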

  2. Assessing the performance of the generalized propensity score for estimating the effect of quantitative or continuous exposures on survival or time-to-event outcomes.

    PubMed

    Austin, Peter C

    2018-01-01

    Propensity score methods are frequently used to estimate the effects of interventions using observational data. The propensity score was originally developed for use with binary exposures. The generalized propensity score (GPS) is an extension of the propensity score for use with quantitative or continuous exposures (e.g. pack-years of cigarettes smoked, dose of medication, or years of education). We describe how the GPS can be used to estimate the effect of continuous exposures on survival or time-to-event outcomes. To do so, we modified the concept of the dose-response function for use with time-to-event outcomes. We used Monte Carlo simulations to examine the performance of different methods of using the GPS to estimate the effect of quantitative exposures on survival or time-to-event outcomes. We examined covariate adjustment using the GPS and weighting using weights based on the inverse of the GPS. The use of methods based on the GPS was compared with the use of conventional G-computation and weighted G-computation. Conventional G-computation resulted in estimates of the dose-response function that displayed the lowest bias and the lowest variability. Of the two GPS-based methods, covariate adjustment using the GPS tended to have the better performance. We illustrate the application of these methods by estimating the effect of average neighbourhood income on the probability of survival following hospitalization for an acute myocardial infarction.
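
One of the GPS-based estimators examined (the weighting variant, sketched here with stabilized inverse-GPS weights under a linear-normal exposure model on fully simulated data, with a continuous rather than time-to-event outcome for simplicity) can be written with plain least squares:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)                       # confounder
a = 0.5 * x + rng.normal(size=n)             # continuous exposure
y = a + 2.0 * x + rng.normal(size=n)         # outcome; true exposure effect = 1

def normal_pdf(z, mu, sd):
    return np.exp(-0.5 * ((z - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# GPS: density of the observed exposure given covariates (linear-normal model)
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, a, rcond=None)
gps = normal_pdf(a, X @ beta, np.std(a - X @ beta))

# Stabilized weights: marginal exposure density over the GPS
w = normal_pdf(a, a.mean(), a.std()) / gps

def slope(a, y, w):
    """Weighted least-squares slope of y on a."""
    A = np.column_stack([np.ones(len(a)), a]) * np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(A, y * np.sqrt(w), rcond=None)
    return coef[1]

naive = slope(a, y, np.ones(n))  # confounded estimate, biased upward
ipw = slope(a, y, w)             # weighted estimate, close to the true 1.0
print(naive, ipw)
```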

  3. Hydrogeochemical Investigation of Recharge Pathways to Intermediate and Regional Groundwater in Canon de Valle and Technical Area 16, Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brady, Brendan W.

    In aquifers consisting of fractured or porous igneous rocks, as well as conglomerate and sandstone products of volcanic formations, silicate minerals actively dissolve and precipitate (Eby, 2004; Eriksson, 1985; Drever, 1982). Dissolution of hydrated volcanic glass is also known to influence the character of groundwater to which it is exposed (White et al., 1980). Hydrochemical evolution, within saturated zones of volcanic formations, is modeled here as a means to resolve the sources feeding a perched groundwater zone. By observation of solute mass balances in groundwater, together with rock chemistry, this study characterizes the chemical weathering processes active along recharge pathways in a mountain front system. Inverse mass balance modeling, which accounts for mass fluxes between solid phases and solution, is used to contrive sets of quantitative reactions that explain chemical variability of water between sampling points. Model results are used, together with chloride mass balance estimation, to evaluate subsurface mixing scenarios generated by further modeling. Final model simulations estimate contributions of mountain block and local recharge to various contaminated zones.
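
The chloride mass balance estimation mentioned alongside the inverse modeling reduces to a one-line ratio; the values below are hypothetical, not site data:

```python
def cmb_recharge(precip_mm_yr, cl_precip_mg_l, cl_gw_mg_l):
    """Chloride mass balance: recharge = precipitation * (Cl in
    precipitation / Cl in groundwater), assuming chloride is conservative."""
    return precip_mm_yr * cl_precip_mg_l / cl_gw_mg_l

# Hypothetical mountain-front values (not site data)
print(cmb_recharge(480.0, 0.3, 12.0))  # 12.0 mm/yr of recharge
```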

  4. Are numbers grounded in a general magnitude processing system? A functional neuroimaging meta-analysis.

    PubMed

    Sokolowski, H Moriah; Fias, Wim; Bosah Ononye, Chuka; Ansari, Daniel

    2017-10-01

    It is currently debated whether numbers are processed using a number-specific system or a general magnitude processing system, also used for non-numerical magnitudes such as physical size, duration, or luminance. Activation likelihood estimation (ALE) was used to conduct the first quantitative meta-analysis of 93 empirical neuroimaging papers examining neural activation during numerical and non-numerical magnitude processing. Foci were compiled to generate probabilistic maps of activation for non-numerical magnitudes (e.g. physical size), symbolic numerical magnitudes (e.g. Arabic digits), and nonsymbolic numerical magnitudes (e.g. dot arrays). Conjunction analyses revealed overlapping activation for symbolic, nonsymbolic and non-numerical magnitudes in frontal and parietal lobes. Contrast analyses revealed specific activation in the left superior parietal lobule for symbolic numerical magnitudes. In contrast, small regions in the bilateral precuneus were specifically activated for nonsymbolic numerical magnitudes. No regions in the parietal lobes were activated for non-numerical magnitudes that were not also activated for numerical magnitudes. Therefore, numbers are processed using both a generalized magnitude system and format specific number regions. Copyright © 2017 Elsevier Ltd. All rights reserved.
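
ALE unions per-focus modeled-activation maps: at each voxel it is the probability that at least one focus truly activates there. A one-dimensional toy sketch (the Gaussian width and peak amplitude are arbitrary choices, not the algorithm's empirically derived kernels):

```python
import math

def ale(voxels, foci, sigma=2.0, peak=0.5):
    """ALE union over foci (1-D toy): each focus contributes a Gaussian
    modeled-activation map; ALE = P(at least one focus activates the voxel)."""
    maps = [[peak * math.exp(-0.5 * ((v - f) / sigma) ** 2) for v in voxels]
            for f in foci]
    return [1 - math.prod(1 - m[i] for m in maps) for i in range(len(voxels))]

vox = list(range(21))
scores = ale(vox, foci=[8, 10, 12])
print(scores.index(max(scores)))  # 10: the peak sits where the foci converge
```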

  5. Semi-quantitative analysis of salivary gland scintigraphy in Sjögren's syndrome diagnosis: a first-line tool.

    PubMed

    Angusti, Tiziana; Pilati, Emanuela; Parente, Antonella; Carignola, Renato; Manfredi, Matteo; Cauda, Simona; Pizzigati, Elena; Dubreuil, Julien; Giammarile, Francesco; Podio, Valerio; Skanjeti, Andrea

    2017-09-01

    The aim of this study was the assessment of semi-quantitative salivary gland dynamic scintigraphy (SGdS) parameters, independently and in an integrated way, in order to predict primary Sjögren's syndrome (pSS). Forty-six consecutive patients (41 females; age 61 ± 11 years) with sicca syndrome were studied by SGdS after injection of 200 MBq of pertechnetate. In sixteen patients, pSS was diagnosed according to American-European Consensus Group criteria (AECGc). Semi-quantitative parameters (uptake (UP) and excretion fraction (EF)) were obtained for each gland. ROC curves were used to determine the best cut-off value. The area under the curve (AUC) was used to estimate the accuracy of each semi-quantitative analysis. To assess the correlation between scintigraphic results and disease severity, semi-quantitative parameters were plotted versus the Sjögren's syndrome disease activity index (ESSDAI). A nomogram was built to perform an integrated evaluation of all the scintigraphic semi-quantitative data. Both UP and EF of salivary glands were significantly lower in pSS patients compared to those in non-pSS (p < 0.001). ROC curves showed significantly large AUCs for both parameters (p < 0.05). Parotid UP and submandibular EF, assessed by univariate and multivariate logistic regression, showed a significant and independent correlation with pSS diagnosis (p < 0.05). No correlation was found between SGdS semi-quantitative parameters and ESSDAI. The proposed nomogram accuracy was 87%. SGdS is an accurate and reproducible tool for the diagnosis of pSS. ESSDAI was not shown to be correlated with SGdS data. SGdS should be the first-line imaging technique in patients with suspected pSS.

  6. Disease quantification on PET/CT images without object delineation

    NASA Astrophysics Data System (ADS)

    Tong, Yubing; Udupa, Jayaram K.; Odhner, Dewey; Wu, Caiyun; Fitzpatrick, Danielle; Winchell, Nicole; Schuster, Stephen J.; Torigian, Drew A.

    2017-03-01

    The derivation of quantitative information from images to make quantitative radiology (QR) clinically practical continues to face a major image analysis hurdle because of image segmentation challenges. This paper presents a novel approach to disease quantification (DQ) via positron emission tomography/computed tomography (PET/CT) images that explores how to decouple DQ methods from explicit dependence on object segmentation, using only object recognition results to quantify disease burden. The concept of an object-dependent disease map is introduced to express disease severity without performing explicit delineation and partial volume correction of either objects or lesions. The parameters of the disease map are estimated from a set of training image data sets. The idea is illustrated on 20 lung lesions and 20 liver lesions derived from 18F-2-fluoro-2-deoxy-D-glucose (FDG)-PET/CT scans of patients with various types of cancers and also on 20 NEMA PET/CT phantom data sets. Our preliminary results show that, on phantom data sets, "disease burden" can be estimated to within 2% of known absolute true activity. Notwithstanding the difficulty in establishing true quantification on patient PET images, our results achieve 8% deviation from "true" estimates, with slightly larger deviations for small and diffuse lesions, where establishing ground truth is questionable, and smaller deviations for larger lesions, where ground-truth setup is more reliable. We are currently exploring extensions of the approach to include fully automated body-wide DQ, extensions to CT or magnetic resonance imaging (MRI) alone, to PET/CT performed with radiotracers other than FDG, and other functional forms of disease maps.

  7. Concordance of transcriptional and apical benchmark dose levels for conazole-induced liver effects in mice.

    PubMed

    Bhat, Virunya S; Hester, Susan D; Nesnow, Stephen; Eastmond, David A

    2013-11-01

    The ability to anchor chemical class-based gene expression changes to phenotypic lesions and to describe these changes as a function of dose and time informs mode-of-action determinations and improves quantitative risk assessments. Previous global expression profiling identified a 330-probe cluster differentially expressed and commonly responsive to 3 hepatotumorigenic conazoles (cyproconazole, epoxiconazole, and propiconazole) at 30 days. Extended to 2 more conazoles (triadimefon and myclobutanil), the present assessment encompasses 4 tumorigenic and 1 nontumorigenic conazole. Transcriptional benchmark dose levels (BMDL(T)) were estimated for a subset of the cluster with dose-responsive behavior and a ≥ 5-fold increase or decrease in signal intensity at the highest dose. These genes primarily encompassed CAR/RXR activation, P450 metabolism, liver hypertrophy, glutathione depletion, LPS/IL-1-mediated inhibition of RXR, and NRF2-mediated oxidative stress pathways. Median BMDL(T) estimates from the subset were concordant (within a factor of 2.4) with apical benchmark doses (BMDL(A)) for increased liver weight at 30 days for the 5 conazoles. The 30-day median BMDL(T) estimates were within one-half order of magnitude of the chronic BMDL(A) for hepatocellular tumors. Potency differences seen in the dose-responsive transcription of certain phase II metabolism, bile acid detoxification, and lipid oxidation genes mirrored each conazole's tumorigenic potency. The 30-day BMDL(T) corresponded to tumorigenic potency on a milligram per kilogram per day basis with cyproconazole > epoxiconazole > propiconazole > triadimefon > myclobutanil (nontumorigenic). These results support the utility of measuring short-term gene expression changes to inform quantitative risk assessments for long-term exposures.
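
    The core of a benchmark-dose estimate is locating the dose at which the response crosses a fixed benchmark response (BMR). The sketch below uses simple linear interpolation on hypothetical fold-change data; real BMD work (e.g. with dedicated BMD software) fits parametric dose-response models and reports a lower confidence bound (BMDL):

```python
# Sketch: a minimal benchmark-dose (BMD) estimate by linear interpolation of a
# dose-response curve to the dose giving a fixed benchmark response (BMR).
# Doses and responses are hypothetical.

def bmd_interpolate(doses, responses, control, bmr_frac=0.10):
    """Dose at which response first reaches control*(1+bmr_frac)."""
    target = control * (1.0 + bmr_frac)
    pairs = list(zip(doses, responses))
    for (d0, r0), (d1, r1) in zip(pairs, pairs[1:]):
        if r0 <= target <= r1:
            # linear interpolation between the bracketing dose groups
            return d0 + (target - r0) * (d1 - d0) / (r1 - r0)
    raise ValueError("benchmark response not bracketed by the data")

doses = [0, 10, 30, 100, 300]             # mg/kg-day, hypothetical
responses = [1.0, 1.02, 1.08, 1.35, 1.9]  # fold-change vs control, hypothetical
bmd = bmd_interpolate(doses, responses, control=1.0)
print(round(bmd, 1))
```

    A monotonic response is assumed; non-monotonic data would need a fitted model rather than interpolation.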

  8. Estimation of tissue stiffness, reflex activity, optimal muscle length and slack length in stroke patients using an electromyography driven antagonistic wrist model.

    PubMed

    de Gooijer-van de Groep, Karin L; de Vlugt, Erwin; van der Krogt, Hanneke J; Helgadóttir, Áróra; Arendzen, J Hans; Meskers, Carel G M; de Groot, Jurriaan H

    2016-06-01

    About half of all chronic stroke patients experience loss of arm function coinciding with increased stiffness, reduced range of motion and a flexed wrist due to a change in neural and/or structural tissue properties. Quantitative assessment of these changes is of clinical importance, yet not trivial. The goal of this study was to quantify the neural and structural properties contributing to wrist joint stiffness and to compare these properties between healthy subjects and stroke patients. Stroke patients (n=32) and healthy volunteers (n=14) were measured using ramp-and-hold rotations applied to the wrist joint by a haptic manipulator. Neural (reflexive torque) and structural (connective tissue stiffness and slack lengths, and (contractile) optimal muscle lengths) parameters were estimated using an electromyography-driven antagonistic wrist model. Kruskal-Wallis analysis with multiple comparisons was used to compare results between healthy subjects, stroke patients with a modified Ashworth score of zero, and stroke patients with a modified Ashworth score of one or more. Stroke patients with a modified Ashworth score of one or more differed from healthy controls (P<0.05) by increased tissue stiffness, increased reflexive torque, decreased optimal muscle length and decreased slack length of connective tissue of the flexor muscles. Non-invasive quantitative analysis, including estimation of optimal muscle lengths, enables identification of neural and non-neural changes in chronic stroke patients. Monitoring these changes over time is important to understand the recovery process and to optimize treatment. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Estrogens in seminal plasma of human and animal species: identification and quantitative estimation by gas chromatography-mass spectrometry associated with stable isotope dilution.

    PubMed

    Reiffsteck, A; Dehennin, L; Scholler, R

    1982-11-01

    Estrone, 2-methoxyestrone and estradiol-17 beta have been definitively identified in seminal plasma of man, bull, boar and stallion by high-resolution gas chromatography associated with selective monitoring of characteristic ions of suitable derivatives. Quantitative estimations were performed by isotope dilution with deuterated analogues and by monitoring molecular ions of trimethylsilyl ethers of labelled and unlabelled compounds. Concentrations of unconjugated and total estrogens are reported together with the statistical evaluation of accuracy and precision.

  10. Sex-specific genetic effects in physical activity: results from a quantitative genetic analysis.

    PubMed

    Diego, Vincent P; de Chaves, Raquel Nichele; Blangero, John; de Souza, Michele Caroline; Santos, Daniel; Gomes, Thayse Natacha; dos Santos, Fernanda Karina; Garganta, Rui; Katzmarzyk, Peter T; Maia, José A R

    2015-08-01

    The objective of this study is to present a model to estimate sex-specific genetic effects on physical activity (PA) levels and sedentary behaviour (SB) using three-generation families. The sample consisted of 100 families covering three generations from Portugal. PA and SB were assessed via the International Physical Activity Questionnaire short form (IPAQ-SF). Sex-specific effects were assessed by genotype-by-sex interaction (GSI) models and sex-specific heritabilities. GSI effects and heterogeneity were tested in the residual environmental variance. SPSS 17 and SOLAR v. 4.1 were used in all computations. The genetic component for PA and SB domains varied from low to moderate (11% to 46%) when analyzing both genders combined. We found GSI effects for vigorous PA (VPA) (p = 0.02) and time spent watching television (WT) (p < 0.001), which showed significantly higher additive genetic variance estimates in males. The heterogeneity in the residual environmental variance was significant for moderate PA (p = 0.02), vigorous PA (p = 0.006) and total PA (p = 0.001). Sex-specific heritability estimates were significantly higher in males only for WT, with a male-to-female difference in heritability of 42.5 (95% confidence interval: 6.4, 70.4). Low to moderate genetic effects on PA and SB traits were found. Results from the GSI model show that there are sex-specific effects in two phenotypes, VPA and WT, with a stronger genetic influence in males.
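
    The sex-specific heritabilities reported above reduce to a simple ratio of variance components. The sketch below uses hypothetical variance components (not the study's SOLAR estimates), chosen only to echo a large male-female gap like the one reported for WT:

```python
# Sketch: narrow-sense heritability h2 = VA / (VA + VE), computed per sex as
# in a genotype-by-sex interaction model. Variance components are hypothetical.

def heritability(va, ve):
    """Share of phenotypic variance that is additive-genetic."""
    return va / (va + ve)

h2_male = heritability(va=0.60, ve=0.40)
h2_female = heritability(va=0.18, ve=0.82)
diff = h2_male - h2_female   # male-to-female difference in heritability
print(round(h2_male, 2), round(h2_female, 2), round(diff, 2))
```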

  11. Quantitative structure-cytotoxicity relationship of phenylpropanoid amides.

    PubMed

    Shimada, Chiyako; Uesawa, Yoshihiro; Ishihara, Mariko; Kagaya, Hajime; Kanamoto, Taisei; Terakubo, Shigemi; Nakashima, Hideki; Takao, Koichi; Saito, Takayuki; Sugita, Yoshiaki; Sakagami, Hiroshi

    2014-07-01

    A total of 12 phenylpropanoid amides were subjected to quantitative structure-activity relationship (QSAR) analysis, based on their cytotoxicity, tumor selectivity and anti-HIV activity, in order to investigate their biological activities. Cytotoxicity against four human oral squamous cell carcinoma (OSCC) cell lines and three types of normal human oral cells was determined by the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) method. Tumor selectivity was evaluated by the ratio of the mean CC50 (50% cytotoxic concentration) against normal oral cells to that against OSCC cell lines. Anti-HIV activity was evaluated by the ratio of CC50 to EC50 (50% cytoprotective concentration from HIV infection). Physicochemical, structural, and quantum-chemical parameters were calculated based on the conformations optimized by the LowModeMD method followed by the density functional theory (DFT) method. The 12 phenylpropanoid amides showed moderate cytotoxicity against both normal and OSCC cell lines. N-Caffeoyl derivatives coupled with vanillylamine and tyramine exhibited relatively higher tumor selectivity. Cytotoxicity against normal cells was correlated with descriptors related to electrostatic interaction, such as polar surface area and chemical hardness, whereas cytotoxicity against tumor cells correlated with free energy, surface area and ellipticity. The tumor-selective cytotoxicity correlated with molecular size (surface area) and electrostatic interaction (the maximum electrostatic potential). The molecular size, shape and ability for electrostatic interaction are useful parameters for estimating the tumor selectivity of phenylpropanoid amides. Copyright© 2014 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  12. Plastics in the Ocean: Engaging Students in Core Competencies Through Issues-Based Activities in the Science Classroom.

    NASA Astrophysics Data System (ADS)

    Fergusson-Kolmes, L. A.

    2016-02-01

    Plastic pollution in the ocean is a critical issue. The high profile of this issue in the popular media makes it an opportune vehicle for promoting deeper understanding of the topic while also advancing student learning in the core competency areas identified in the NSF's Vision and Change document: integration of the process of science, quantitative reasoning, modeling and simulation, and an understanding of the relationship between science and society. This is a challenging task in an introductory non-majors class where the students may have very limited math skills and no prior science background. Here, activities are described that ask students to use an understanding of density to make predictions and test them as they consider the fate of different kinds of plastics in the marine environment. A comparison of the results from different sampling regimes introduces students to the difficulties of carrying out scientific investigations in the complex marine environment as well as building quantitative literacy skills. Activities that call on students to make connections between the global issue of plastic pollution and personal actions include extraction of microplastic from personal care products, inventories of local plastic-recycling options and estimations of contributions to the waste stream on an individual level. This combination of hands-on activities in an accessible context helps students appreciate the immediacy of the threat of plastic pollution and calls on them to reflect on possible solutions.
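
    The density prediction at the heart of the classroom activity is a one-line comparison. The sketch below uses approximate textbook polymer densities; it is an illustration of the exercise, not material from the paper:

```python
# Sketch of the classroom density exercise: predict whether a plastic floats
# or sinks in seawater by comparing typical polymer densities (g/cm^3,
# approximate textbook values) with surface seawater at ~1.025 g/cm^3.

SEAWATER = 1.025  # g/cm^3, typical surface value

polymers = {
    "polypropylene (PP)":  0.91,
    "polyethylene (HDPE)": 0.95,
    "polystyrene (PS)":    1.05,
    "PET (drink bottles)": 1.38,
    "PVC":                 1.40,
}

def fate(density, fluid=SEAWATER):
    """A plastic less dense than the fluid floats; otherwise it sinks."""
    return "floats" if density < fluid else "sinks"

for name, rho in polymers.items():
    print(f"{name}: {fate(rho)}")
```

    Students can then check predictions empirically, noting that biofouling and additives shift real-world densities.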

  13. Assessing the Economic Cost of Landslide Damage in Low-Relief Regions: Case Study Evidence from the Flemish Ardennes (Belgium)

    NASA Astrophysics Data System (ADS)

    Vranken, L.; Van Turnhout, P.; Van Den Eeckhaut, M.; Vandekerckhove, L.; Vantilt, G.; Poesen, J.

    2012-04-01

    Several regions around the globe are at risk of incurring damage from landslides. These landslides cause significant structural and functional damage to public and private buildings and infrastructure. Numerous studies have investigated how natural factors and human activities control the (re-)activation of landslides. However, few studies have concentrated on a quantitative estimate of the overall damage caused by landslides at a regional scale. This study therefore starts with a quantitative economic assessment of the direct and indirect damage caused by landslides in the Flemish Ardennes (Belgium), a low-relief region (area = ca. 700 km2) susceptible to landslides. Based on focus interviews as well as on semi-structured interviews with homeowners, civil servants (e.g. from the technical services of the various towns), and the owners and providers of lifelines such as electricity and sewage, we have quantitatively estimated the direct and indirect damage induced by landsliding over a 10- to 30-year period (depending on the type of infrastructure or buildings). Economic damage to public infrastructure and buildings was estimated for the entire region, while for private damage 10 cases with severe to small damage were quantified. For example, in the last 10 years, road repair costs amounted to 814 560 €. Costs to repair damaged roads that have not yet been repaired were estimated at 669 318 €. In the past 30 years, the cost of measures to prevent road damage amounted to at least 14 872 380 €. More than 90% of this budget for preventive measures was spent 30 years ago, when an important freeway was damaged and had to be repaired. These preventive measures (building a grout wall and improving the drainage system) were effective, as no further damage has been reported to the present. Expenditures to repair and prevent damage to waterworks and sewage systems amounted to 551 044 € over the last 30 years. In the past 10 years, a new railway line connecting two important Belgian cities has been built, and within that one project the cost of preventing damage to railroads already amounted to at least 4 567 822 €. The value of real estate located in regions affected by landslides decreased by 15% to 35%. All these damage costs were then used to make potential damage maps. Based on the inventory of landslides, the frequency of landslide re-activation and land use, we categorized regions affected by landslides according to their temporal probability of landslide re-activation. This allowed us to produce a (semi-)qualitative risk map for regions that were affected by landslides in the past. This paper shows that, though generally not spectacular, landsliding in low-relief regions is a slow but continuously operating process causing considerable damage, allowing one to identify several medium- to high-risk landslide zones. As such, this study provides important information for government officials, especially those in charge of spatial planning and of town and environmental planning, as it clearly documents the costs associated with certain land use types in landslide-prone areas. This information can be particularly useful for regions in which increasing demand for building land pressures government officials and (local) political leaders to expand the built environment.

  14. Prediction of Environmental Impact of High-Energy Materials with Atomistic Computer Simulations

    DTIC Science & Technology

    2010-11-01

    from a training set of compounds. Other methods include Quantitative Structure-Activity Relationship (QSAR) and Quantitative Structure-Property... the development of QSPR/QSAR models, in contrast to boiling points and critical parameters derived from empirical correlations, to improve... Quadratic Configuration Interaction Singles Doubles; QSAR, Quantitative Structure-Activity Relationship; QSPR, Quantitative Structure-Property

  15. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    USDA-ARS?s Scientific Manuscript database

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  16. 76 FR 50904 - Thiamethoxam; Pesticide Tolerances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ... exposure and risk. A separate assessment was done for clothianidin. i. Acute exposure. Quantitative acute... not expected to pose a cancer risk, a quantitative dietary exposure assessment for the purposes of...-dietary sources of post application exposure to obtain an estimate of potential combined exposure. These...

  17. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density can be developed. Accurate measurements of heading distribution, using a rotating-polarization radar to enhance the wingbeat-frequency method of identification, are also presented.
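
    Converting counts of individually resolved targets into an aerial density amounts to dividing by the sampled beam volume. The conical small-angle geometry and all numbers below are illustrative assumptions, not the paper's radar parameters:

```python
# Sketch: aerial density from radar counts, density = N / V, with V the
# volume of atmosphere sampled by the beam. Hypothetical geometry and counts.

import math

def beam_sample_volume(beamwidth_rad, r_min, r_max):
    """Volume (m^3) of a conical beam shell between ranges r_min and r_max,
    using the small-angle solid-angle approximation pi*(beamwidth/2)^2."""
    solid_angle = math.pi * (beamwidth_rad / 2) ** 2
    return solid_angle * (r_max ** 3 - r_min ** 3) / 3

n_targets = 42                                          # hypothetical count
vol = beam_sample_volume(math.radians(1.5), 300.0, 900.0)
density = n_targets / vol                               # insects per m^3
print(density)
```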

  18. Bound Pool Fractions Complement Diffusion Measures to Describe White Matter Micro and Macrostructure

    PubMed Central

    Stikov, Nikola; Perry, Lee M.; Mezer, Aviv; Rykhlevskaia, Elena; Wandell, Brian A.; Pauly, John M.; Dougherty, Robert F.

    2010-01-01

    Diffusion imaging and bound pool fraction (BPF) mapping are two quantitative magnetic resonance imaging techniques that measure microstructural features of the white matter of the brain. Diffusion imaging provides a quantitative measure of the diffusivity of water in tissue. BPF mapping is a quantitative magnetization transfer (qMT) technique that estimates the proportion of exchanging protons bound to macromolecules, such as those found in myelin, and is thus a more direct measure of myelin content than diffusion. In this work, we combine BPF estimates of macromolecular content with measurements of diffusivity within human white matter tracts. Within the white matter, the correlation between BPFs and diffusivity measures such as fractional anisotropy and radial diffusivity was modest, suggesting that diffusion tensor imaging and bound pool fractions are complementary techniques. We found that several major tracts have high BPF, suggesting a higher density of myelin in these tracts. We interpret these results in the context of a quantitative tissue model. PMID:20828622

  19. Validation of the quantitative point-of-care CareStart biosensor for assessment of G6PD activity in venous blood.

    PubMed

    Bancone, Germana; Gornsawun, Gornpan; Chu, Cindy S; Porn, Pen; Pal, Sampa; Bansil, Pooja; Domingo, Gonzalo J; Nosten, Francois

    2018-01-01

    Glucose-6-phosphate dehydrogenase (G6PD) deficiency is the most common enzymopathy in the human population, affecting an estimated 8% of the world population, especially those living in areas of past and present malaria endemicity. Decreased G6PD enzymatic activity is associated with drug-induced hemolysis and increased risk of severe neonatal hyperbilirubinemia leading to brain damage. The G6PD gene is on the X chromosome; therefore, mutations cause enzymatic deficiency in hemizygous males and homozygous females, while the majority of heterozygous females have an intermediate activity (between 30-80% of normal) with a large distribution into the ranges of deficiency and normality. Current G6PD qualitative tests are unable to diagnose G6PD intermediate activities, which could hinder wide use of 8-aminoquinolines for Plasmodium vivax elimination. The aim of the study was to assess the diagnostic performance of the new CareStart G6PD quantitative biosensor. A total of 150 samples of venous blood with G6PD deficient, intermediate and normal phenotypes were collected among healthy volunteers living along the north-western Thailand-Myanmar border. Samples were analyzed by complete blood count, by the gold standard spectrophotometric assay using Trinity kits and by the latest model of the CareStart G6PD biosensor, which analyzes both G6PD and hemoglobin. Bland-Altman comparison of the CareStart normalized G6PD values to those of the gold standard assay showed a strong bias in values, resulting in poor area-under-the-curve values for both the 30% and 80% thresholds. A receiver operating characteristic analysis identified threshold values for the CareStart product equivalent to the 30% and 80% gold standard values, with good sensitivity and specificity: 100% and 92% (for 30% G6PD activity) and 92% and 94% (for 80% activity), respectively. The CareStart G6PD biosensor represents a significant improvement for quantitative diagnosis of G6PD deficiency over previous versions.
Further improvements and validation studies are required to assess its utility for informing radical cure decisions in malaria endemic settings.
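
    The Bland-Altman agreement statistics used in this validation (mean bias and 95% limits of agreement) can be sketched as follows; the paired measurements are hypothetical, not the study's data:

```python
# Sketch: Bland-Altman agreement statistics (mean bias and 95% limits of
# agreement), the kind of comparison used between CareStart readings and the
# reference spectrophotometric assay. Paired values are hypothetical.

from statistics import mean, stdev

def bland_altman(a, b):
    """Return mean difference (bias) and the 95% limits of agreement."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

biosensor = [7.2, 5.1, 9.8, 3.0, 6.4, 8.1]   # U/g Hb, hypothetical
reference = [6.0, 4.5, 8.1, 2.2, 5.0, 7.0]   # U/g Hb, hypothetical
bias, (lo, hi) = bland_altman(biosensor, reference)
print(round(bias, 2), round(lo, 2), round(hi, 2))
```

    A consistently positive bias like this one is what motivates re-deriving device-specific thresholds by ROC analysis instead of using the nominal 30%/80% cut-offs.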

  20. Validation of the quantitative point-of-care CareStart biosensor for assessment of G6PD activity in venous blood

    PubMed Central

    Gornsawun, Gornpan; Chu, Cindy S.; Porn, Pen; Pal, Sampa; Bansil, Pooja

    2018-01-01

    Introduction Glucose-6-phosphate dehydrogenase (G6PD) deficiency is the most common enzymopathy in the human population, affecting an estimated 8% of the world population, especially those living in areas of past and present malaria endemicity. Decreased G6PD enzymatic activity is associated with drug-induced hemolysis and increased risk of severe neonatal hyperbilirubinemia leading to brain damage. The G6PD gene is on the X chromosome; therefore, mutations cause enzymatic deficiency in hemizygous males and homozygous females, while the majority of heterozygous females have an intermediate activity (between 30–80% of normal) with a large distribution into the ranges of deficiency and normality. Current G6PD qualitative tests are unable to diagnose G6PD intermediate activities, which could hinder wide use of 8-aminoquinolines for Plasmodium vivax elimination. The aim of the study was to assess the diagnostic performance of the new CareStart G6PD quantitative biosensor. Methods A total of 150 samples of venous blood with G6PD deficient, intermediate and normal phenotypes were collected among healthy volunteers living along the north-western Thailand-Myanmar border. Samples were analyzed by complete blood count, by the gold standard spectrophotometric assay using Trinity kits and by the latest model of the CareStart G6PD biosensor, which analyzes both G6PD and hemoglobin. Results Bland-Altman comparison of the CareStart normalized G6PD values to those of the gold standard assay showed a strong bias in values, resulting in poor area-under-the-curve values for both the 30% and 80% thresholds. A receiver operating characteristic analysis identified threshold values for the CareStart product equivalent to the 30% and 80% gold standard values, with good sensitivity and specificity: 100% and 92% (for 30% G6PD activity) and 92% and 94% (for 80% activity), respectively. Conclusion The CareStart G6PD biosensor represents a significant improvement for quantitative diagnosis of G6PD deficiency over previous versions. Further improvements and validation studies are required to assess its utility for informing radical cure decisions in malaria endemic settings. PMID:29738562

  1. Quantitative Structure-Cytotoxicity Relationship of Bioactive Heterocycles by the Semi-empirical Molecular Orbital Method with the Concept of Absolute Hardness

    NASA Astrophysics Data System (ADS)

    Ishihara, Mariko; Sakagami, Hiroshi; Kawase, Masami; Motohashi, Noboru

    The relationship between the cytotoxicity of N-heterocycles (thirteen 4-trifluoromethylimidazole, fifteen phenoxazine and twelve 5-trifluoromethyloxazole derivatives), O-heterocycles (eleven 3-formylchromone and twenty coumarin derivatives) and seven vitamin K2 derivatives against eight tumor cell lines (HSC-2, HSC-3, HSC-4, T98G, HSG, HepG2, HL-60, MT-4) and a maximum of 15 chemical descriptors was investigated using the CAChe Worksystem 4.9 project reader. After determination of the conformation of these compounds and approximation to the molecular form present in vivo (biomimetic) by CONFLEX5, the most stable structure was determined by CAChe Worksystem 4.9 MOPAC (PM3). The present study demonstrates the strongest relationship between cytotoxic activity and the molecular shape or molecular weight of these compounds. Their biological activities can be estimated by hardness and softness, and by using η-χ activity diagrams.

  2. A new adaptive L1-norm for optimal descriptor selection of high-dimensional QSAR classification model for anti-hepatitis C virus activity of thiourea derivatives.

    PubMed

    Algamal, Z Y; Lee, M H

    2017-01-01

    A high-dimensional quantitative structure-activity relationship (QSAR) classification model typically contains a large number of irrelevant and redundant descriptors. In this paper, a new descriptor selection method for QSAR classification model estimation is proposed by adding a new weight inside the L1-norm. The experimental results of classifying the anti-hepatitis C virus activity of thiourea derivatives demonstrate that the proposed descriptor selection method performs effectively and competitively compared with other existing penalized methods in terms of classification performance on both the training and the testing datasets. Moreover, the stability test and applicability-domain results indicate a robust QSAR classification model. It is evident from the results that the developed QSAR classification model could conceivably be employed for further high-dimensional QSAR classification studies.
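
    The mechanism behind L1 (lasso-type) descriptor selection is the soft-thresholding operator, which drives small coefficients exactly to zero; an adaptive L1 simply gives each descriptor its own threshold weight. The coefficients and weights below are illustrative, not the paper's fitted model:

```python
# Sketch: soft-thresholding, the shrinkage operator behind L1 (lasso-type)
# penalties used for descriptor selection. An adaptive L1 assigns each
# descriptor its own penalty weight. All values are illustrative.

def soft_threshold(beta, lam):
    """Shrink a coefficient toward zero; exactly zero inside [-lam, lam]."""
    if beta > lam:
        return beta - lam
    if beta < -lam:
        return beta + lam
    return 0.0

coefs = [0.9, -0.05, 0.4, -1.2]   # hypothetical descriptor coefficients
weights = [1.0, 1.0, 2.0, 0.5]    # hypothetical adaptive per-descriptor weights
lam = 0.3
shrunk = [round(soft_threshold(b, lam * w), 2) for b, w in zip(coefs, weights)]
print(shrunk)
```

    Descriptors whose coefficients are shrunk to zero drop out of the model, which is how the penalty performs selection.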

  3. Heritability and genetic correlations of personality traits in a wild population of yellow-bellied marmots (Marmota flaviventris).

    PubMed

    Petelle, M B; Martin, J G A; Blumstein, D T

    2015-10-01

    Describing and quantifying animal personality is now an integral part of behavioural studies because individually distinctive behaviours have ecological and evolutionary consequences. Yet, to fully understand how personality traits may respond to selection, one must understand the underlying heritability and genetic correlations between traits. Previous studies have reported a moderate degree of heritability of personality traits, but few of these studies have either been conducted in the wild or estimated the genetic correlations between personality traits. Estimating the additive genetic variance and covariance in the wild is crucial to understand the evolutionary potential of behavioural traits. Enhanced environmental variation could reduce heritability and genetic correlations, thus leading to different evolutionary predictions. We estimated the additive genetic variance and covariance of docility in the trap, sociability (mirror image stimulation, MIS), and exploration and activity in two different contexts (open-field and mirror image stimulation experiments) in a wild population of yellow-bellied marmots (Marmota flaviventris). We estimated the heritability of both behaviours and personality traits and found nonzero additive genetic variance in these traits. We also found nonzero maternal, permanent environment and year effects. Finally, we found four phenotypic correlations between traits, and one positive genetic correlation between activity in the open-field test and sociability. We also found permanent environment correlations between activity in both tests and docility and exploration in the MIS test. This is one of a handful of studies to adopt a quantitative genetic approach to explain variation in personality traits in the wild and, thus, provides important insights into the potential variance available for selection. © 2015 European Society For Evolutionary Biology.

  4. Dose and Effect Thresholds for Early Key Events in a Mode of ...

    EPA Pesticide Factsheets

    ABSTRACT Strategies for predicting adverse health outcomes of environmental chemicals are centered on early key events in toxicity pathways. However, quantitative relationships between early molecular changes in a given pathway and later health effects are often poorly defined. The goal of this study was to evaluate short-term key event indicators using qualitative and quantitative methods in an established pathway of mouse liver tumorigenesis mediated by peroxisome proliferator-activated receptor-alpha (PPARα). Male B6C3F1 mice were exposed for 7 days to di(2-ethylhexyl) phthalate (DEHP), di-n-octyl phthalate (DNOP), and n-butyl benzyl phthalate (BBP), which vary in PPARα activity and liver tumorigenicity. Each phthalate increased expression of select PPARα target genes at 7 days, while only DEHP significantly increased liver cell proliferation labeling index (LI). Transcriptional benchmark dose (BMDT) estimates for dose-related genomic markers stratified phthalates according to hypothetical tumorigenic potencies, unlike BMDs for non-genomic endpoints (liver weights or proliferation). The 7-day BMDT values for Acot1 as a surrogate measure for PPARα activation were 29, 370, and 676 mg/kg-d for DEHP, DNOP, and BBP, respectively, distinguishing DEHP (liver tumor BMD of 35 mg/kg-d) from non-tumorigenic DNOP and BBP. Effect thresholds were generated using linear regression of DEHP effects at 7 days and 2-year tumor incidence values to anchor early response molec

  5. Quantitative EEG patterns of differential in-flight workload

    NASA Technical Reports Server (NTRS)

    Sterman, M. B.; Mann, C. A.; Kaiser, D. A.

    1993-01-01

    Four test pilots were instrumented for in-flight EEG recordings using a custom portable recording system. Each flew six two-minute tracking tasks in the Calspan NT-33 experimental trainer at Edwards AFB. With the canopy blacked out, pilots used a HUD display to chase a simulated aircraft through a random flight course. Three configurations of flight controls altered the flight characteristics to achieve low, moderate, and high workload, as determined by normative Cooper-Harper ratings. The test protocol was administered by a command pilot in the back seat. Corresponding EEG and tracking data were compared off-line. Tracking performance was measured as deviation from the target aircraft and combined with control difficulty to achieve an estimate of 'cognitive workload'. Trended patterns of parietal EEG activity at 8-12 Hz were sorted according to this classification. In all cases, high workload produced a significantly greater suppression of 8-12 Hz activity than low workload. Further, a clear differentiation of EEG trend patterns was obtained in 80 percent of the cases. High workload produced a sustained suppression of 8-12 Hz activity, while moderate workload resulted in an initial suppression followed by a gradual increment. Low workload was associated with a modulated pattern lacking any periods of marked or sustained suppression. These findings suggest that quantitative analysis of appropriate EEG measures may provide an objective and reliable in-flight index of cognitive effort that could facilitate workload assessment.
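
    The 8-12 Hz measure whose suppression the study tracks is a relative band power. The sketch below computes it with a plain DFT on a synthetic one-second signal; the sampling rate, signal, and processing are illustrative assumptions, not the study's analysis chain:

```python
# Sketch: fraction of (non-DC) spectral power in the 8-12 Hz band, the kind
# of quantity tracked for workload-related alpha suppression. Hypothetical
# synthetic signal; plain O(n^2) DFT for clarity, not efficiency.

import math

def band_power_fraction(signal, fs, lo=8.0, hi=12.0):
    n = len(signal)
    band = total = 0.0
    for k in range(1, n // 2):
        f = k * fs / n                       # frequency of DFT bin k
        re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(signal))
        im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(signal))
        p = re * re + im * im
        total += p
        if lo <= f <= hi:
            band += p
    return band / total

fs = 128                                     # Hz, hypothetical sampling rate
t = [i / fs for i in range(fs)]              # one second of samples
# 10 Hz "alpha" component plus a weaker 25 Hz component
sig = [math.sin(2 * math.pi * 10 * x) + 0.3 * math.sin(2 * math.pi * 25 * x) for x in t]
alpha_frac = band_power_fraction(sig, fs)
print(round(alpha_frac, 2))
```

    Alpha suppression would then be quantified as the drop in this fraction relative to a low-workload baseline.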

  6. A prioritization of generic safety issues. Supplement 19, Revision insertion instructions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1995-11-01

    The report presents the safety priority ranking for generic safety issues related to nuclear power plants. The purpose of these rankings is to assist in the timely and efficient allocation of NRC resources for the resolution of those safety issues that have a significant potential for reducing risk. The safety priority rankings are HIGH, MEDIUM, LOW, and DROP, and have been assigned on the basis of risk significance estimates, the ratio of risk to costs and other impacts estimated to result if resolution of the safety issues were implemented, and the consideration of uncertainties and other quantitative or qualitative factors. To the extent practical, estimates are quantitative. This document provides revisions and amendments to the report.

  7. Estimating weak ratiometric signals in imaging data. II. Meta-analysis with multiple, dual-channel datasets.

    PubMed

    Sornborger, Andrew; Broder, Josef; Majumder, Anirban; Srinivasamoorthy, Ganesh; Porter, Erika; Reagin, Sean S; Keith, Charles; Lauderdale, James D

    2008-09-01

    Ratiometric fluorescent indicators are used for making quantitative measurements of a variety of physiological variables. Their utility is often limited by noise. This is the second in a series of papers describing statistical methods for denoising ratiometric data with the aim of obtaining improved quantitative estimates of variables of interest. Here, we outline a statistical optimization method that is designed for the analysis of ratiometric imaging data in which multiple measurements have been taken of systems responding to the same stimulation protocol. This method takes advantage of correlated information across multiple datasets for objectively detecting and estimating ratiometric signals. We demonstrate our method by showing results of its application on multiple, ratiometric calcium imaging experiments.
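
    One simple stand-in for the paper's multi-dataset optimization is a shared low-rank decomposition: stack repeated recordings of the same stimulation protocol and keep the rank-1 SVD component that is correlated across trials. The shared Gaussian response and the noise level below are assumed for illustration, and this is not the authors' exact estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
n_time, n_trials = 200, 6
t = np.linspace(0.0, 1.0, n_time)
true_signal = np.exp(-((t - 0.5) ** 2) / 0.01)    # shared stimulus response

# Several noisy measurements of the same protocol (noise level assumed).
X = np.stack([true_signal + 0.5 * rng.standard_normal(n_time)
              for _ in range(n_trials)])          # trials x time

# Keep only the rank-1 component shared across trials; the SVD sign
# ambiguity cancels in the outer product.
mean = X.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
denoised = np.outer(U[:, 0] * s[0], Vt[0]) + mean

err_raw = np.mean((X - true_signal) ** 2)
err_denoised = np.mean((denoised - true_signal) ** 2)
```

Because the response is shared across trials while the noise is not, the truncated reconstruction lies much closer to the true signal than the raw data.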

  8. Developing Geoscience Students' Quantitative Skills

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2005-12-01

    Sophisticated quantitative skills are an essential tool for the professional geoscientist. While students learn many of these sophisticated skills in graduate school, it is increasingly important that they have a strong grounding in quantitative geoscience as undergraduates. Faculty have developed many strong approaches to teaching these skills in a wide variety of geoscience courses. A workshop in June 2005 brought together eight faculty teaching surface processes and climate change to discuss and refine activities they use and to publish them on the Teaching Quantitative Skills in the Geosciences website (serc.Carleton.edu/quantskills) for broader use. Workshop participants, in consultation with two mathematics faculty who have expertise in math education, developed six review criteria to guide discussion: 1) Are the quantitative and geologic goals central and important? (e.g. problem solving, mastery of important skill, modeling, relating theory to observation); 2) Does the activity lead to better problem solving? 3) Are the quantitative skills integrated with geoscience concepts in a way that makes sense for the learning environment and supports learning both quantitative skills and geoscience? 4) Does the methodology support learning? (e.g. motivate and engage students; use multiple representations; incorporate reflection, discussion and synthesis) 5) Are the materials complete and helpful to students? 6) How well has the activity worked when used? Workshop participants found that reviewing each other's activities was very productive because they thought about new ways to teach, and the experience of reviewing helped them think about their own activity from a different point of view. The review criteria focused their thinking about the activity and would be equally helpful in the design of a new activity.
We invite a broad international discussion of the criteria (serc.Carleton.edu/quantskills/workshop05/review.html). The teaching activities can be found on the Teaching Quantitative Skills in the Geosciences website (serc.Carleton.edu/quantskills/). In addition to the teaching activity collection (85 activities), this site contains a variety of resources to assist faculty with the methods they use to teach quantitative skills at both the introductory and advanced levels; information about broader efforts in quantitative literacy involving other science disciplines; and a special section of resources for students who are struggling with their quantitative skills. The site is part of the Digital Library for Earth Science Education and has been developed by geoscience faculty in collaboration with mathematicians and mathematics educators with funding from the National Science Foundation.

  9. Contextual and perceptual brain processes underlying moral cognition: a quantitative meta-analysis of moral reasoning and moral emotions.

    PubMed

    Sevinc, Gunes; Spreng, R Nathan

    2014-01-01

    Human morality has been investigated using a variety of tasks ranging from judgments of hypothetical dilemmas to viewing morally salient stimuli. These experiments have provided insight into neural correlates of moral judgments and emotions, yet these approaches reveal important differences in moral cognition. Moral reasoning tasks require active deliberation while moral emotion tasks involve the perception of stimuli with moral implications. We examined convergent and divergent brain activity associated with these experimental paradigms taking a quantitative meta-analytic approach. A systematic search of the literature yielded 40 studies. Studies involving explicit decisions in a moral situation were categorized as active (n = 22); studies evoking moral emotions were categorized as passive (n = 18). We conducted a coordinate-based meta-analysis using the Activation Likelihood Estimation to determine reliable patterns of brain activity. Results revealed a convergent pattern of reliable brain activity for both task categories in regions of the default network, consistent with the social and contextual information processes supported by this brain network. Active tasks revealed more reliable activity in the temporoparietal junction, angular gyrus and temporal pole. Active tasks demand deliberative reasoning and may disproportionately involve the retrieval of social knowledge from memory, mental state attribution, and construction of the context through associative processes. In contrast, passive tasks reliably engaged regions associated with visual and emotional information processing, including lingual gyrus and the amygdala. A laterality effect was observed in dorsomedial prefrontal cortex, with active tasks engaging the left, and passive tasks engaging the right. While overlapping activity patterns suggest a shared neural network for both tasks, differential activity suggests that processing of moral input is affected by task demands. 
The results provide novel insight into distinct features of moral cognition, including the generation of moral context through associative processes and the perceptual detection of moral salience.

  10. Development of a Multi-Biomarker Disease Activity Test for Rheumatoid Arthritis

    PubMed Central

    Shen, Yijing; Ramanujan, Saroja; Knowlton, Nicholas; Swan, Kathryn A.; Turner, Mary; Sutton, Chris; Smith, Dustin R.; Haney, Douglas J.; Chernoff, David; Hesterberg, Lyndal K.; Carulli, John P.; Taylor, Peter C.; Shadick, Nancy A.; Weinblatt, Michael E.; Curtis, Jeffrey R.

    2013-01-01

    Background Disease activity measurement is a key component of rheumatoid arthritis (RA) management. Biomarkers that capture the complex and heterogeneous biology of RA have the potential to complement clinical disease activity assessment. Objectives To develop a multi-biomarker disease activity (MBDA) test for rheumatoid arthritis. Methods Candidate serum protein biomarkers were selected from extensive literature screens, bioinformatics databases, mRNA expression and protein microarray data. Quantitative assays were identified and optimized for measuring candidate biomarkers in RA patient sera. Biomarkers with qualifying assays were prioritized in a series of studies based on their correlations to RA clinical disease activity (e.g. the Disease Activity Score 28-C-Reactive Protein [DAS28-CRP], a validated metric commonly used in clinical trials) and their contributions to multivariate models. Prioritized biomarkers were used to train an algorithm to measure disease activity, assessed by correlation to DAS and area under the receiver operating characteristic curve for classification of low vs. moderate/high disease activity. The effect of comorbidities on the MBDA score was evaluated using linear models with adjustment for multiple hypothesis testing. Results 130 candidate biomarkers were tested in feasibility studies and 25 were selected for algorithm training. Multi-biomarker statistical models outperformed individual biomarkers at estimating disease activity. Biomarker-based scores were significantly correlated with DAS28-CRP and could discriminate patients with low vs. moderate/high clinical disease activity. Such scores were also able to track changes in DAS28-CRP and were significantly associated with both joint inflammation measured by ultrasound and damage progression measured by radiography. The final MBDA algorithm uses 12 biomarkers to generate an MBDA score between 1 and 100. 
No significant effects on the MBDA score were found for common comorbidities. Conclusion We followed a stepwise approach to develop a quantitative serum-based measure of RA disease activity, based on 12 biomarkers, which was consistently associated with clinical disease activity levels. PMID:23585841

  11. Contextual and Perceptual Brain Processes Underlying Moral Cognition: A Quantitative Meta-Analysis of Moral Reasoning and Moral Emotions

    PubMed Central

    Sevinc, Gunes; Spreng, R. Nathan

    2014-01-01

    Background and Objectives Human morality has been investigated using a variety of tasks ranging from judgments of hypothetical dilemmas to viewing morally salient stimuli. These experiments have provided insight into neural correlates of moral judgments and emotions, yet these approaches reveal important differences in moral cognition. Moral reasoning tasks require active deliberation while moral emotion tasks involve the perception of stimuli with moral implications. We examined convergent and divergent brain activity associated with these experimental paradigms taking a quantitative meta-analytic approach. Data Source A systematic search of the literature yielded 40 studies. Studies involving explicit decisions in a moral situation were categorized as active (n = 22); studies evoking moral emotions were categorized as passive (n = 18). We conducted a coordinate-based meta-analysis using the Activation Likelihood Estimation to determine reliable patterns of brain activity. Results & Conclusions Results revealed a convergent pattern of reliable brain activity for both task categories in regions of the default network, consistent with the social and contextual information processes supported by this brain network. Active tasks revealed more reliable activity in the temporoparietal junction, angular gyrus and temporal pole. Active tasks demand deliberative reasoning and may disproportionately involve the retrieval of social knowledge from memory, mental state attribution, and construction of the context through associative processes. In contrast, passive tasks reliably engaged regions associated with visual and emotional information processing, including lingual gyrus and the amygdala. A laterality effect was observed in dorsomedial prefrontal cortex, with active tasks engaging the left, and passive tasks engaging the right. 
While overlapping activity patterns suggest a shared neural network for both tasks, differential activity suggests that processing of moral input is affected by task demands. The results provide novel insight into distinct features of moral cognition, including the generation of moral context through associative processes and the perceptual detection of moral salience. PMID:24503959

  12. Development of a multi-biomarker disease activity test for rheumatoid arthritis.

    PubMed

    Centola, Michael; Cavet, Guy; Shen, Yijing; Ramanujan, Saroja; Knowlton, Nicholas; Swan, Kathryn A; Turner, Mary; Sutton, Chris; Smith, Dustin R; Haney, Douglas J; Chernoff, David; Hesterberg, Lyndal K; Carulli, John P; Taylor, Peter C; Shadick, Nancy A; Weinblatt, Michael E; Curtis, Jeffrey R

    2013-01-01

    Disease activity measurement is a key component of rheumatoid arthritis (RA) management. Biomarkers that capture the complex and heterogeneous biology of RA have the potential to complement clinical disease activity assessment. To develop a multi-biomarker disease activity (MBDA) test for rheumatoid arthritis. Candidate serum protein biomarkers were selected from extensive literature screens, bioinformatics databases, mRNA expression and protein microarray data. Quantitative assays were identified and optimized for measuring candidate biomarkers in RA patient sera. Biomarkers with qualifying assays were prioritized in a series of studies based on their correlations to RA clinical disease activity (e.g. the Disease Activity Score 28-C-Reactive Protein [DAS28-CRP], a validated metric commonly used in clinical trials) and their contributions to multivariate models. Prioritized biomarkers were used to train an algorithm to measure disease activity, assessed by correlation to DAS and area under the receiver operating characteristic curve for classification of low vs. moderate/high disease activity. The effect of comorbidities on the MBDA score was evaluated using linear models with adjustment for multiple hypothesis testing. 130 candidate biomarkers were tested in feasibility studies and 25 were selected for algorithm training. Multi-biomarker statistical models outperformed individual biomarkers at estimating disease activity. Biomarker-based scores were significantly correlated with DAS28-CRP and could discriminate patients with low vs. moderate/high clinical disease activity. Such scores were also able to track changes in DAS28-CRP and were significantly associated with both joint inflammation measured by ultrasound and damage progression measured by radiography. The final MBDA algorithm uses 12 biomarkers to generate an MBDA score between 1 and 100. No significant effects on the MBDA score were found for common comorbidities. 
We followed a stepwise approach to develop a quantitative serum-based measure of RA disease activity, based on 12 biomarkers, which was consistently associated with clinical disease activity levels.
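
    The published MBDA algorithm itself is proprietary, but the general shape of the approach (a weighted biomarker combination trained against a clinical score and mapped onto a 1-100 scale) can be sketched as follows. All data, weights, and the rescaling factor are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_patients, n_biomarkers = 50, 12

# Hypothetical serum biomarker concentrations and a clinical disease
# activity score loosely playing the role of DAS28-CRP.
biomarkers = rng.lognormal(size=(n_patients, n_biomarkers))
das28 = biomarkers @ rng.uniform(0.1, 0.5, n_biomarkers) \
        + rng.normal(0.0, 0.3, n_patients)

# Train linear weights against the clinical score (the real algorithm
# is more elaborate; least squares is a stand-in).
design = np.column_stack([biomarkers, np.ones(n_patients)])
coef, *_ = np.linalg.lstsq(design, das28, rcond=None)

def mbda_like_score(x):
    """Weighted biomarker combination clipped to a 1-100 integer scale."""
    raw = x @ coef[:-1] + coef[-1]
    return np.clip(np.round(raw * 10), 1, 100)   # arbitrary rescaling

scores = mbda_like_score(biomarkers)
```

The test of such a score is exactly what the abstract describes: correlation with the clinical measure and discrimination of low vs. moderate/high disease activity.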

  13. Neural Networks Involved in Adolescent Reward Processing: An Activation Likelihood Estimation Meta-Analysis of Functional Neuroimaging Studies

    PubMed Central

    Silverman, Merav H.; Jedd, Kelly; Luciana, Monica

    2015-01-01

    Behavioral responses to, and the neural processing of, rewards change dramatically during adolescence and may contribute to observed increases in risk-taking during this developmental period. Functional MRI (fMRI) studies suggest differences between adolescents and adults in neural activation during reward processing, but findings are contradictory, and effects have been found in non-predicted directions. The current study uses an activation likelihood estimation (ALE) approach for quantitative meta-analysis of functional neuroimaging studies to: 1) confirm the network of brain regions involved in adolescents’ reward processing, 2) identify regions involved in specific stages (anticipation, outcome) and valence (positive, negative) of reward processing, and 3) identify differences in activation likelihood between adolescent and adult reward-related brain activation. Results reveal a subcortical network of brain regions involved in adolescent reward processing similar to that found in adults with major hubs including the ventral and dorsal striatum, insula, and posterior cingulate cortex (PCC). Contrast analyses find that adolescents exhibit greater likelihood of activation in the insula while processing anticipation relative to outcome and greater likelihood of activation in the putamen and amygdala during outcome relative to anticipation. While processing positive compared to negative valence, adolescents show increased likelihood for activation in the posterior cingulate cortex (PCC) and ventral striatum. Contrasting adolescent reward processing with the existing ALE of adult reward processing (Liu et al., 2011) reveals increased likelihood for activation in limbic, frontolimbic, and striatal regions in adolescents compared with adults. Unlike adolescents, adults also activate executive control regions of the frontal and parietal lobes. These findings support hypothesized elevations in motivated activity during adolescence. PMID:26254587
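
    The ALE computation underlying such meta-analyses can be sketched on a coarse 1-D grid: each reported focus is blurred into a modeled activation map, and the ALE value at each voxel is the probability that at least one study activates it. Real ALE works in 3-D MNI space with sample-size-dependent smoothing kernels; the coordinates and widths here are invented.

```python
import numpy as np

grid = np.arange(0, 100.0)   # toy 1-D "brain" coordinate axis

def modeled_activation(focus, sigma=5.0):
    """Gaussian 'modeled activation' map for one reported focus."""
    p = np.exp(-0.5 * ((grid - focus) / sigma) ** 2)
    return p / p.max()

# Foci reported by three hypothetical studies near coordinate 50.
study_maps = [modeled_activation(f) for f in (48.0, 50.0, 53.0)]

# ALE: probability that at least one study activates each voxel,
# i.e. the complement of the product of per-study non-activation.
ale = 1.0 - np.prod([1.0 - m for m in study_maps], axis=0)
peak = grid[np.argmax(ale)]
```

Voxels where foci from several studies overlap get ALE values near 1; in the real method these values are then thresholded against a null distribution of randomly placed foci.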

  14. Disdrometer-based C-Band Radar Quantitative Precipitation Estimation (QPE) in a highly complex terrain region in tropical Colombia.

    NASA Astrophysics Data System (ADS)

    Sepúlveda, J.; Hoyos Ortiz, C. D.

    2017-12-01

    An adequate quantification of precipitation over land is critical for many societal applications including agriculture, hydroelectricity generation, water supply, and risk management associated with extreme events. The use of rain gauges, a traditional method for precipitation estimation, and an excellent one, to estimate the volume of liquid water during a particular precipitation event, does not allow to fully capture the highly spatial variability of the phenomena which is a requirement for almost all practical applications. On the other hand, the weather radar, an active remote sensing sensor, provides a proxy for rainfall with fine spatial resolution and adequate temporary sampling, however, it does not measure surface precipitation. In order to fully exploit the capabilities of the weather radar, it is necessary to develop quantitative precipitation estimation (QPE) techniques combining radar information with in-situ measurements. Different QPE methodologies are explored and adapted to local observations in a highly complex terrain region in tropical Colombia using a C-Band radar and a relatively dense network of rain gauges and disdrometers. One important result is that the expressions reported in the literature for extratropical locations are not representative of the conditions found in the tropical region studied. In addition to reproducing the state-of-the-art techniques, a new multi-stage methodology based on radar-derived variables and disdrometer data is proposed in order to achieve the best QPE possible. The main motivation for this new methodology is based on the fact that most traditional QPE methods do not directly take into account the different uncertainty sources involved in the process. The main advantage of the multi-stage model compared to traditional models is that it allows assessing and quantifying the uncertainty in the surface rain rate estimation. 
The sub-hourly rainfall estimations using the multi-stage methodology are realistic compared to observed data in spite of the many sources of uncertainty including the sampling volume, the different physical principles of the sensors, the incomplete understanding of the microphysics of precipitation and, the most important, the rapidly varying droplet size distribution.
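
    A baseline radar QPE step that multi-stage methods build on is inversion of a Z-R power law. The sketch below uses the classical Marshall-Palmer coefficients (a=200, b=1.6) as defaults; a disdrometer-tuned local fit, as in the study, would replace them.

```python
import numpy as np

def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Invert a Z = a * R**b power law to get rain rate in mm/h.

    Defaults are the Marshall-Palmer coefficients; for a tropical,
    complex-terrain site they would be refit from disdrometer data.
    """
    z_linear = 10.0 ** (np.asarray(dbz) / 10.0)   # dBZ -> mm^6 m^-3
    return (z_linear / a) ** (1.0 / b)            # mm h^-1

r = rain_rate_from_dbz([20.0, 40.0, 50.0])
```

For Marshall-Palmer, 40 dBZ corresponds to roughly 11-12 mm/h; the point of disdrometer-based fitting is that a single fixed (a, b) pair like this hides large regime-dependent uncertainty.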

  15. Quantitative body DW-MRI biomarkers uncertainty estimation using unscented wild-bootstrap.

    PubMed

    Freiman, M; Voss, S D; Mulkern, R V; Perez-Rossello, J M; Warfield, S K

    2011-01-01

    We present a new method for the uncertainty estimation of diffusion parameters for quantitative body DW-MRI assessment. Diffusion parameters uncertainty estimation from DW-MRI is necessary for clinical applications that use these parameters to assess pathology. However, uncertainty estimation using traditional techniques requires repeated acquisitions, which is undesirable in routine clinical use. Model-based bootstrap techniques, for example, assume an underlying linear model for residuals rescaling and cannot be utilized directly for body diffusion parameters uncertainty estimation due to the non-linearity of the body diffusion model. To offset this limitation, our method uses the unscented transform to compute the residuals rescaling parameters from the non-linear body diffusion model, and then applies the wild-bootstrap method to infer the body diffusion parameters uncertainty. Validation through phantom and human subject experiments shows that our method correctly identifies the regions with higher uncertainty in body DW-MRI model parameters, with a relative error of -36% in the uncertainty values.
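
    The wild-bootstrap idea can be sketched for a mono-exponential diffusion model: refit after flipping the fitted-model residuals with random signs. The paper's unscented-transform residual rescaling is omitted here, and the b-values, true ADC, and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
bvals = np.array([0.0, 200.0, 400.0, 600.0, 800.0])   # s/mm^2 (assumed)

def fit_adc(signal, bvals):
    """Log-linear fit of the mono-exponential model S = S0 * exp(-b*ADC)."""
    slope, _ = np.polyfit(bvals, np.log(signal), 1)
    return -slope

# Synthetic voxel signal (true ADC and noise level are assumptions).
true_adc = 1.5e-3   # mm^2/s
signal = np.exp(-bvals * true_adc) * (1 + 0.02 * rng.standard_normal(bvals.size))

# Wild bootstrap: resample by flipping residuals with Rademacher signs,
# then refit; the spread of refitted ADCs estimates the uncertainty.
coefs = np.polyfit(bvals, np.log(signal), 1)
fitted = np.exp(np.polyval(coefs, bvals))
resid = signal - fitted
boot_adcs = [fit_adc(fitted + rng.choice([-1.0, 1.0], size=bvals.size) * resid,
                     bvals)
             for _ in range(500)]
adc_sd = float(np.std(boot_adcs))
```

The appeal, as the abstract notes, is that this uncertainty estimate comes from a single acquisition rather than repeated scans.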

  16. A fast signal subspace approach for the determination of absolute levels from phased microphone array measurements

    NASA Astrophysics Data System (ADS)

    Sarradj, Ennes

    2010-04-01

    Phased microphone arrays are used in a variety of applications for the estimation of acoustic source location and spectra. The popular conventional delay-and-sum beamforming methods used with such arrays suffer from inaccurate estimations of absolute source levels and in some cases also from low resolution. Deconvolution approaches such as DAMAS have better performance, but require high computational effort. A fast beamforming method is proposed that can be used in conjunction with a phased microphone array in applications focused on the correct quantitative estimation of acoustic source spectra. This method is based on an eigenvalue decomposition of the cross-spectral matrix of microphone signals and uses the eigenvalues from the signal subspace to estimate absolute source levels. The theoretical basis of the method is discussed together with an assessment of the quality of the estimation. Experimental tests using a loudspeaker setup and an airfoil trailing edge noise setup in an aeroacoustic wind tunnel show that the proposed method is robust and leads to reliable quantitative results.
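
    The core signal-subspace idea can be sketched as follows: eigendecompose the cross-spectral matrix, take the dominant (signal-subspace) eigenvalue, subtract the noise floor estimated from the remaining eigenvalues, and normalize by the array gain. The single-source scenario, random steering phases, and noise level are invented; this is not the paper's full estimator.

```python
import numpy as np

rng = np.random.default_rng(4)
n_mics, n_snapshots = 16, 2000

# One incoherent point source with unit power; a unit-modulus steering
# vector with random phases stands in for the actual array geometry.
a = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, n_mics))
s = np.sqrt(0.5) * (rng.standard_normal(n_snapshots)
                    + 1j * rng.standard_normal(n_snapshots))
noise = 0.1 * np.sqrt(0.5) * (rng.standard_normal((n_mics, n_snapshots))
                              + 1j * rng.standard_normal((n_mics, n_snapshots)))
X = np.outer(a, s) + noise

# Cross-spectral matrix and its eigendecomposition (ascending order).
csm = X @ X.conj().T / n_snapshots
eigvals = np.linalg.eigvalsh(csm)

# Signal subspace: the dominant eigenvalue carries the source power
# summed over microphones; the rest estimate the noise floor.
noise_floor = eigvals[:-1].mean()
est_power = (eigvals[-1] - noise_floor) / n_mics
```

Unlike a delay-and-sum map, the eigenvalue-based estimate is an absolute source power, which is the quantitative property the method targets.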

  17. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  18. 40 CFR 125.95 - As an owner or operator of a Phase II existing facility, what must I collect and submit when I...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...

  19. 40 CFR 125.95 - As an owner or operator of a Phase II existing facility, what must I collect and submit when I...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...

  20. 40 CFR 125.95 - As an owner or operator of a Phase II existing facility, what must I collect and submit when I...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...

  1. 40 CFR 125.95 - As an owner or operator of a Phase II existing facility, what must I collect and submit when I...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...

  2. 40 CFR 125.95 - As an owner or operator of a Phase II existing facility, what must I collect and submit when I...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...

  3. A simple bias correction in linear regression for quantitative trait association under two-tail extreme selection.

    PubMed

    Kwan, Johnny S H; Kung, Annie W C; Sham, Pak C

    2011-09-01

    Selective genotyping can increase power in quantitative trait association. One example of selective genotyping is two-tail extreme selection, but simple linear regression analysis gives a biased genetic effect estimate. Here, we present a simple correction for the bias.
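
    The bias the authors correct can be reproduced in a small simulation: under two-tail extreme selection on the trait, ordinary least squares overestimates the genetic effect. The effect size, allele frequency, and selection fractions below are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n, beta = 20000, 0.3                               # true additive effect
genotype = rng.binomial(2, 0.5, n).astype(float)   # additive SNP coding 0/1/2
trait = beta * genotype + rng.standard_normal(n)

# Two-tail extreme selection: genotype only the bottom and top 10%.
lo, hi = np.quantile(trait, [0.1, 0.9])
selected = (trait < lo) | (trait > hi)

def ols_slope(x, y):
    """Simple linear regression slope of y on x."""
    xc = x - x.mean()
    return float(xc @ (y - y.mean()) / (xc @ xc))

full_estimate = ols_slope(genotype, trait)            # unbiased
selected_estimate = ols_slope(genotype[selected],     # inflated by
                              trait[selected])        # selection on y
```

Because individuals are retained on the basis of extreme trait values, the selected-sample slope is substantially inflated relative to the true effect, which is the bias the paper's correction removes.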

  4. Extrapolating cetacean densities to quantitatively assess human impacts on populations in the high seas.

    PubMed

    Mannocci, Laura; Roberts, Jason J; Miller, David L; Halpin, Patrick N

    2017-06-01

    As human activities expand beyond national jurisdictions to the high seas, there is an increasing need to consider anthropogenic impacts to species inhabiting these waters. The current scarcity of scientific observations of cetaceans in the high seas impedes the assessment of population-level impacts of these activities. We developed plausible density estimates to facilitate a quantitative assessment of anthropogenic impacts on cetacean populations in these waters. Our study region extended from a well-surveyed region within the U.S. Exclusive Economic Zone into a large region of the western North Atlantic sparsely surveyed for cetaceans. We modeled densities of 15 cetacean taxa with available line transect survey data and habitat covariates and extrapolated predictions to sparsely surveyed regions. We formulated models to reduce the extent of extrapolation beyond covariate ranges, and constrained them to model simple and generalizable relationships. To evaluate confidence in the predictions, we mapped where predictions were made outside sampled covariate ranges, examined alternate models, and compared predicted densities with maps of sightings from sources that could not be integrated into our models. Confidence levels in model results depended on the taxon and geographic area and highlighted the need for additional surveying in environmentally distinct areas. With application of necessary caution, our density estimates can inform management needs in the high seas, such as the quantification of potential cetacean interactions with military training exercises, shipping, fisheries, and deep-sea mining and be used to delineate areas of special biological significance in international waters. Our approach is generally applicable to other marine taxa and geographic regions for which management will be implemented but data are sparse. © 2016 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.

  5. Using Active Learning to Teach Concepts and Methods in Quantitative Biology.

    PubMed

    Waldrop, Lindsay D; Adolph, Stephen C; Diniz Behn, Cecilia G; Braley, Emily; Drew, Joshua A; Full, Robert J; Gross, Louis J; Jungck, John A; Kohler, Brynja; Prairie, Jennifer C; Shtylla, Blerta; Miller, Laura A

    2015-11-01

    This article provides a summary of the ideas discussed at the 2015 Annual Meeting of the Society for Integrative and Comparative Biology society-wide symposium on Leading Students and Faculty to Quantitative Biology through Active Learning. It also includes a brief review of the recent advancements in incorporating active learning approaches into quantitative biology classrooms. We begin with an overview of recent literature showing that active learning can improve students' outcomes in science, technology, engineering, and mathematics (STEM) disciplines. We then discuss how this approach can be particularly useful when teaching topics in quantitative biology. Next, we describe some of the recent initiatives to develop hands-on activities in quantitative biology at both the graduate and the undergraduate levels. Throughout the article we provide resources for educators who wish to integrate active learning and technology into their classrooms. © The Author 2015. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.

  6. Semi-quantitative estimation by IR of framework, extraframework and defect Al species of HBEA zeolites.

    PubMed

    Marques, João P; Gener, Isabelle; Ayrault, Philippe; Lopes, José M; Ribeiro, F Ramôa; Guisnet, Michel

    2004-10-21

    A simple method based on the characterization (composition, Bronsted and Lewis acidities) of acid treated HBEA zeolites was developed for estimating the concentrations of framework, extraframework and defect Al species.

  7. Quantitative Compactness Estimates for Hamilton-Jacobi Equations

    NASA Astrophysics Data System (ADS)

    Ancona, Fabio; Cannarsa, Piermarco; Nguyen, Khai T.

    2016-02-01

    We study quantitative compactness estimates in $W^{1,1}_{loc}$ for the map $S_t$, $t > 0$, that associates with given initial data $u_0 \in \mathrm{Lip}(\mathbb{R}^N)$ the corresponding solution $S_t u_0$ of a Hamilton-Jacobi equation $u_t + H(\nabla_x u) = 0$, $t \geq 0$, $x \in \mathbb{R}^N$, with a uniformly convex Hamiltonian $H = H(p)$. We provide upper and lower estimates of order $1/\varepsilon^N$ on the Kolmogorov $\varepsilon$-entropy in $W^{1,1}$ of the image through the map $S_t$ of sets of bounded, compactly supported initial data. Estimates of this type are inspired by a question posed by Lax (Course on Hyperbolic Systems of Conservation Laws. XXVII Scuola Estiva di Fisica Matematica, Ravello, 2002) within the context of conservation laws, and could provide a measure of the order of "resolution" of a numerical method implemented for this equation.
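
    For a uniformly convex Hamiltonian $H = H(p)$, the semigroup $S_t$ studied here admits the classical Hopf-Lax representation (a standard fact stated for context, not a result of the paper):

```latex
S_t u_0(x) \;=\; \min_{y \in \mathbb{R}^N} \left\{ u_0(y) + t\, H^*\!\left(\frac{x-y}{t}\right) \right\},
\qquad
H^*(q) \;=\; \sup_{p \in \mathbb{R}^N} \big( p \cdot q - H(p) \big),
```

where $H^*$ is the Legendre transform of $H$; uniform convexity of $H$ makes $H^*$ finite and smooth, which is what gives $S_t$ its regularizing and compactifying effect on Lipschitz initial data.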

  8. Low-Cost Evaluation of EO-1 Hyperion and ALI for Detection and Biophysical Characterization of Forest Logging in Amazonia (NCC5-481)

    NASA Technical Reports Server (NTRS)

    Asner, Gregory P.; Keller, Michael M.; Silva, Jose Natalino; Zweede, Johan C.; Pereira, Rodrigo, Jr.

    2002-01-01

    Major uncertainties exist regarding the rate and intensity of logging in tropical forests worldwide: these uncertainties severely limit economic, ecological, and biogeochemical analyses of these regions. Recent sawmill surveys in the Amazon region of Brazil show that the area logged is nearly equal to total area deforested annually, but conversion of survey data to forest area, forest structural damage, and biomass estimates requires multiple assumptions about logging practices. Remote sensing could provide an independent means to monitor logging activity and to estimate the biophysical consequences of this land use. Previous studies have demonstrated that the detection of logging in Amazon forests is difficult and no studies have developed either the quantitative physical basis or remote sensing approaches needed to estimate the effects of various logging regimes on forest structure. A major reason for these limitations has been a lack of sufficient, well-calibrated optical satellite data, which in turn, has impeded the development and use of physically-based, quantitative approaches for detection and structural characterization of forest logging regimes. We propose to use data from the EO-1 Hyperion imaging spectrometer to greatly increase our ability to estimate the presence and structural attributes of selective logging in the Amazon Basin. Our approach is based on four "biogeophysical indicators" not yet derived simultaneously from any satellite sensor: 1) green canopy leaf area index; 2) degree of shadowing; 3) presence of exposed soil and; 4) non-photosynthetic vegetation material. Airborne, field and modeling studies have shown that the optical reflectance continuum (400-2500 nm) contains sufficient information to derive estimates of each of these indicators. Our ongoing studies in the eastern Amazon basin also suggest that these four indicators are sensitive to logging intensity. 
Satellite-based estimates of these indicators should provide a means to quantify both the presence and degree of structural disturbance caused by various logging regimes. Our quantitative assessment of Hyperion hyperspectral and ALI multi-spectral data for the detection and structural characterization of selective logging in Amazonia will benefit from data collected through an ongoing project run by the Tropical Forest Foundation, within which we have developed a study of the canopy and landscape biophysics of conventional and reduced-impact logging. We will add to our base of forest structural information in concert with an EO-1 overpass. Using a photon transport model inversion technique that accounts for non-linear mixing of the four biogeophysical indicators, we will estimate these parameters across a gradient of selective logging intensity provided by conventional and reduced-impact logging sites. We will also compare our physically-based approach to both conventional (e.g., NDVI) and novel (e.g., SWIR-channel) vegetation indices as well as to linear mixture modeling methods. We will cross-compare these approaches using the Hyperion and ALI imagers to determine the strengths and limitations of these two sensors for applications of forest biophysics. This effort will yield the first physically-based, quantitative analysis of the detection and intensity of selective logging in Amazonia, comparing hyperspectral and improved multi-spectral approaches as well as inverse modeling, linear mixture modeling, and vegetation index techniques.

  9. Reliability and precision of pellet-group counts for estimating landscape-level deer density

    Treesearch

    David S. deCalesta

    2013-01-01

    This study provides hitherto unavailable methodology for reliably and precisely estimating deer density within forested landscapes, enabling quantitative rather than qualitative deer management. Reliability and precision of the deer pellet-group technique were evaluated in 1 small and 2 large forested landscapes. Density estimates, adjusted to reflect deer harvest and...

  10. REVIEW OF DRAFT REVISED BLUE BOOK ON ESTIMATING CANCER RISKS FROM EXPOSURE TO IONIZING RADIATION

    EPA Science Inventory

    In 1994, EPA published a report, referred to as the “Blue Book,” which lays out EPA’s current methodology for quantitatively estimating radiogenic cancer risks. A follow-on report made minor adjustments to the previous estimates and presented a partial analysis of the uncertainti...

  11. A Method for Semi-quantitative Assessment of Exposure to Pesticides of Applicators and Re-entry Workers: An Application in Three Farming Systems in Ethiopia.

    PubMed

    Negatu, Beyene; Vermeulen, Roel; Mekonnen, Yalemtshay; Kromhout, Hans

    2016-07-01

    To develop an inexpensive and easily adaptable semi-quantitative exposure assessment method to characterize exposure to pesticides among applicators and re-entry farmers and farm workers in Ethiopia. Two specific semi-quantitative exposure algorithms for pesticide applicators and re-entry workers were developed and applied to 601 farm workers employed in 3 distinctly different farming systems [small-scale irrigated, large-scale greenhouses (LSGH), and large-scale open (LSO)] in Ethiopia. The algorithm for applicators was based on exposure-modifying factors including application methods, farm layout (open or closed), pesticide mixing conditions, cleaning of spraying equipment, intensity of pesticide application per day, utilization of personal protective equipment (PPE), personal hygienic behavior, annual frequency of application, and duration of employment at the farm. The algorithm for re-entry work was based on an expert-based re-entry exposure intensity score, utilization of PPE, personal hygienic behavior, annual frequency of re-entry work, and duration of employment at the farm. The algorithms allowed estimation of daily, annual, and cumulative lifetime exposure for applicators and re-entry workers by farming system, gender, and age group. For all metrics, the highest exposures occurred in LSGH for both applicators and female re-entry workers. For male re-entry workers, the highest cumulative exposure occurred in LSO farms. Female re-entry workers appeared to be more highly exposed on a daily or annual basis than male re-entry workers, but their cumulative exposures were similar because males had, on average, longer tenure. Factors related to intensity of exposure (such as application method and farm layout) were the main drivers of estimated potential exposure. Use of personal protection, hygienic behavior, and duration of employment in the surveyed farm workers contributed less to the contrast in exposure estimates.
This study indicated that farmers' and farm workers' exposure to pesticides can be inexpensively characterized, ranked, and classified. Our method could be extended to assess exposure to specific active ingredients provided that detailed information on pesticides used is available. The resulting exposure estimates will consequently be used in occupational epidemiology studies in Ethiopia and other similar countries with few resources. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  12. Quantifying light exposure patterns in young adult students

    NASA Astrophysics Data System (ADS)

    Alvarez, Amanda A.; Wildsoet, Christine F.

    2013-08-01

    Exposure to bright light appears to be protective against myopia in both animals (chicks, monkeys) and children, but quantitative data on human light exposure are limited. In this study, we report on a technique for quantifying light exposure using wearable sensors. Twenty-seven young adult subjects wore a light sensor continuously for two weeks during one of three seasons, and also completed questionnaires about their visual activities. Light data were analyzed with respect to refractive error and season, and the objective sensor data were compared with subjects' estimates of time spent indoors and outdoors. Subjects' estimates of time spent indoors and outdoors were in poor agreement with durations reported by the sensor data. The results of questionnaire-based studies of light exposure should thus be interpreted with caution. The role of light in refractive error development should be investigated using multiple methods such as sensors to complement questionnaires.

  13. An algorithm for the estimation of the signal-to-noise ratio in surface myoelectric signals generated during cyclic movements.

    PubMed

    Agostini, Valentina; Knaflitz, Marco

    2012-01-01

    In many applications requiring the study of the surface myoelectric signal (SMES) acquired in dynamic conditions, it is essential to have a quantitative evaluation of the quality of the collected signals. When the activation pattern of a muscle has to be obtained by means of single- or double-threshold statistical detectors, the background noise level e_noise of the signal is a necessary input parameter. Moreover, the detection strategy of double-threshold detectors may be properly tuned when the SNR and the duty cycle (DC) of the signal are known. The aim of this paper is to present an algorithm for the estimation of e_noise, SNR, and DC of an SMES collected during cyclic movements. The algorithm is validated on synthetic signals with statistical properties similar to those of SMES, as well as on more than 100 real signals. © 2011 IEEE
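
    The three quantities the abstract names can be illustrated with a toy estimator: compute a short-frame RMS envelope, take the quietest frames as the noise floor e_noise, and count frames well above that floor as muscle activity to get SNR and duty-cycle estimates. This is a minimal sketch of the general idea, not the authors' algorithm; the frame length, quiet fraction, and 3x activity threshold are arbitrary illustrative choices.

```python
import numpy as np

def smes_quality(signal, fs, frame_ms=50.0, quiet_fraction=0.2):
    """Estimate background noise level, SNR (dB), and duty cycle of a
    cyclic surface myoelectric signal (illustrative sketch only)."""
    frame = max(1, int(fs * frame_ms / 1000.0))
    n_frames = len(signal) // frame
    # Short-frame RMS envelope
    rms = np.array([
        np.sqrt(np.mean(signal[i * frame:(i + 1) * frame] ** 2))
        for i in range(n_frames)
    ])
    # Noise level e_noise: mean RMS of the quietest frames
    quiet = np.sort(rms)[: max(1, int(quiet_fraction * n_frames))]
    e_noise = quiet.mean()
    # Frames well above the noise floor count as muscle activity
    active = rms > 3.0 * e_noise
    duty_cycle = active.mean()
    signal_power = np.mean(rms[active] ** 2) if active.any() else e_noise ** 2
    snr_db = 10.0 * np.log10(signal_power / e_noise ** 2)
    return e_noise, snr_db, duty_cycle
```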

  14. Quantifying light exposure patterns in young adult students

    PubMed Central

    Alvarez, Amanda A.; Wildsoet, Christine F.

    2014-01-01

    Exposure to bright light appears to be protective against myopia in both animals (chicks, monkeys) and children, but quantitative data on human light exposure are limited. In this study, we report on a technique for quantifying light exposure using wearable sensors. Twenty-seven young adult subjects wore a light sensor continuously for two weeks during one of three seasons, and also completed questionnaires about their visual activities. Light data were analyzed with respect to refractive error and season, and the objective sensor data were compared with subjects’ estimates of time spent indoors and outdoors. Subjects’ estimates of time spent indoors and outdoors were in poor agreement with durations reported by the sensor data. The results of questionnaire-based studies of light exposure should thus be interpreted with caution. The role of light in refractive error development should be investigated using multiple methods such as sensors to complement questionnaires. PMID:25342873

  15. Estimation of sonodynamic treatment region with sonochemiluminescence in gel phantom

    NASA Astrophysics Data System (ADS)

    Mashiko, Daisaku; Nishitaka, Shinya; Iwasaki, Ryosuke; Lafond, Maxime; Yoshizawa, Shin; Umemura, Shin-ichiro

    2018-07-01

    Sonodynamic treatment is a non-invasive cancer treatment using ultrasound through the generation of reactive oxygen species (ROS) by acoustic cavitation. High-intensity focused ultrasound (HIFU) can generate cavitation bubbles using highly negative pressure in its focal region. When cavitation bubbles are forced to collapse, they generate ROS, which can attack cancer cells, typically assisted by a sonodynamically active antitumor agent. For sonodynamic treatment, both localization and efficiency of generating ROS are important. To improve them, the region of ROS generation was quantitatively estimated in this study using a polyacrylamide gel containing luminol as the target exposed to “Trigger HIFU”, consisting of a highly intense short “trigger pulse” to generate a cavitation cloud followed by a moderate-intensity long “sustaining burst” to keep the cavitation bubbles oscillating. It was found to be important for efficient ROS generation that the focal region of the trigger pulse should be immediately exposed to the sustaining burst.

  16. Monitoring and modeling for investigating driver/pressure-state/impact relationships in coastal ecosystems: Examples from the Lagoon of Venice

    NASA Astrophysics Data System (ADS)

    Pastres, Roberto; Solidoro, Cosimo

    2012-01-01

    In this paper, we show how the integration of monitoring data and mathematical models can generate valuable information, using a few examples taken from a well-studied but complex ecosystem, the Lagoon of Venice. We focus on three key issues, which are also of concern for many other coastal ecosystems: (1) nitrogen and phosphorus annual budgets; (2) estimation of net ecosystem metabolism and early warnings for anoxic events; (3) assessment of ecosystem status. The results highlight the importance of framing monitoring activities within the "DPSIR" conceptual model, thus going far beyond the monitoring of major biogeochemical variables and including: (1) the estimation of the fluxes of the main constituents at the boundaries; (2) the use of appropriate mathematical models. These tools can provide quantitative links between Pressures and State/Impacts, thus enabling decision makers and stakeholders to evaluate the effects of alternative management scenarios.

  17. Random Interchange of Magnetic Connectivity

    NASA Astrophysics Data System (ADS)

    Matthaeus, W. H.; Ruffolo, D. J.; Servidio, S.; Wan, M.; Rappazzo, A. F.

    2015-12-01

    Magnetic connectivity, the connection between two points along a magnetic field line, has a stochastic character associated with field lines random-walking in space due to magnetic fluctuations, but connectivity can also change in time due to dynamical activity [1]. For fluctuations transverse to a strong mean field, this connectivity change can be caused by stochastic interchange due to component reconnection. The process may be understood approximately by formulating a diffusion-like Fokker-Planck coefficient [2] that is asymptotically related to standard field line random walk. Quantitative estimates are provided for transverse magnetic field models and anisotropic models such as reduced magnetohydrodynamics. In heliospheric applications, these estimates may be useful for understanding mixing between open and closed field line regions near coronal hole boundaries, and large latitude excursions of connectivity associated with turbulence. [1] A. F. Rappazzo, W. H. Matthaeus, D. Ruffolo, S. Servidio & M. Velli, ApJL, 758, L14 (2012) [2] D. Ruffolo & W. Matthaeus, ApJ, 806, 233 (2015)
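
    The diffusive description invoked here can be made concrete. In a minimal sketch (a standard quasilinear form, not necessarily the exact expressions of [2]), the probability P(x, z) of a transverse field-line displacement x after a distance z along the mean field B_0 obeys

```latex
\frac{\partial P}{\partial z} = D \, \frac{\partial^2 P}{\partial x^2},
\qquad
D \sim \frac{\langle (\Delta x)^2 \rangle}{2\,\Delta z}
\;\propto\; \lambda_c \, \frac{\langle b_x^2 \rangle}{B_0^2},
```

    where b_x is the transverse fluctuation amplitude and λ_c a correlation length of the turbulence; the connectivity-change process described in the abstract adds an analogous diffusion coefficient in time rather than in distance along the field.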

  18. NIRS-SPM: statistical parametric mapping for near infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Tak, Sungho; Jang, Kwang Eun; Jung, Jinwook; Jang, Jaeduck; Jeong, Yong; Ye, Jong Chul

    2008-02-01

    Even though there exists a powerful statistical parametric mapping (SPM) tool for fMRI, similar public-domain tools are not available for near infrared spectroscopy (NIRS). In this paper, we describe a new public-domain statistical toolbox called NIRS-SPM for quantitative analysis of NIRS signals. Specifically, NIRS-SPM statistically analyzes the NIRS data using the GLM and makes inferences based on the excursion probability of a random field interpolated from the sparse measurements. In order to obtain correct inference, NIRS-SPM offers pre-coloring and pre-whitening methods for temporal correlation estimation. For NIRS signals recorded simultaneously with fMRI, the spatial mapping between the fMRI image and real coordinates from a 3-D digitizer is estimated using Horn's algorithm. These tools allow super-resolution localization of brain activation, which is not possible using conventional NIRS analysis tools.
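
    The GLM step described here is generic and easy to sketch: regress the measured time series on a design matrix and form a t-statistic for a contrast of the fitted coefficients. The sketch below uses ordinary least squares and assumes white noise; NIRS-SPM's pre-coloring/pre-whitening corrections for temporal correlation, and the random-field inference, are not shown, and the design matrix in the usage is purely illustrative.

```python
import numpy as np

def glm_tstat(y, X, contrast):
    """Fit y = X @ beta + noise by ordinary least squares and return the
    t-statistic for the contrast vector c (a generic SPM-style GLM step)."""
    X = np.asarray(X, float)
    c = np.asarray(contrast, float)
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = X.shape[0] - np.linalg.matrix_rank(X)
    sigma2 = resid @ resid / dof          # residual variance estimate
    var_c = sigma2 * c @ np.linalg.pinv(X.T @ X) @ c
    return (c @ beta) / np.sqrt(var_c)
```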

  19. Use of statistical and neural net approaches in predicting toxicity of chemicals.

    PubMed

    Basak, S C; Grunwald, G D; Gute, B D; Balasubramanian, K; Opitz, D

    2000-01-01

    Hierarchical quantitative structure-activity relationships (H-QSAR) have been developed as a new approach in constructing models for estimating physicochemical, biomedicinal, and toxicological properties of interest. This approach uses increasingly more complex molecular descriptors in a graduated approach to model building. In this study, statistical and neural network methods have been applied to the development of H-QSAR models for estimating the acute aquatic toxicity (LC50) of 69 benzene derivatives to Pimephales promelas (fathead minnow). Topostructural, topochemical, geometrical, and quantum chemical indices were used as the four levels of the hierarchical method. It is clear from both the statistical and neural network models that topostructural indices alone cannot adequately model this set of congeneric chemicals. Not surprisingly, topochemical indices greatly increase the predictive power of both statistical and neural network models. Quantum chemical indices also add significantly to the modeling of this set of acute aquatic toxicity data.
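
    The hierarchical idea, adding descriptor tiers of increasing complexity and asking how much each tier improves the fit, can be sketched with ordinary least squares. The descriptor matrices and toxicity values below are synthetic stand-ins (not the paper's indices or fathead minnow data); only the tier names follow the abstract.

```python
import numpy as np

def adjusted_r2(y, X):
    """R^2 adjusted for the number of predictors (OLS with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, _, _, _ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    n, p = X1.shape
    return 1.0 - (ss_res / (n - p)) / (ss_tot / (n - 1))

# Hierarchical model building: add descriptor tiers one at a time and
# track the gain in adjusted R^2. All descriptor values are synthetic.
rng = np.random.default_rng(2)
n = 69  # matches the number of benzene derivatives in the abstract
tiers = {
    "topostructural": rng.standard_normal((n, 3)),
    "topochemical":   rng.standard_normal((n, 3)),
    "geometrical":    rng.standard_normal((n, 2)),
    "quantum":        rng.standard_normal((n, 2)),
}
# Hypothetical toxicity driven mostly by the topochemical tier
y = tiers["topochemical"] @ np.array([1.0, 0.8, -0.5]) + 0.3 * rng.standard_normal(n)

X = np.empty((n, 0))
scores = {}
for name, block in tiers.items():
    X = np.column_stack([X, block])  # cumulative descriptor set
    scores[name] = adjusted_r2(y, X)
```

    With these synthetic data the fit jumps once the (informative) topochemical tier is added and barely changes afterwards, mirroring the qualitative pattern the abstract reports.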

  20. A robust approach for ECG-based analysis of cardiopulmonary coupling.

    PubMed

    Zheng, Jiewen; Wang, Weidong; Zhang, Zhengbo; Wu, Dalei; Wu, Hao; Peng, Chung-Kang

    2016-07-01

    Deriving a respiratory signal from a surface electrocardiogram (ECG) measurement has the advantage of simultaneously monitoring cardiac and respiratory activities. ECG-based cardiopulmonary coupling (CPC) analysis, estimated from heart period variability and ECG-derived respiration (EDR), shows promising applications in the medical field. The aim of this paper is to provide a quantitative analysis of ECG-based CPC, and further improve its performance. Two conventional strategies were tested to obtain the EDR signal: R-S wave amplitude and area of the QRS complex. An adaptive filter was utilized to extract the common component of the inter-beat interval (RRI) and EDR, generating enhanced versions of the EDR signal. CPC is assessed by probing the nonlinear phase interactions between the RRI series and the respiratory signal. Respiratory oscillations present in both the RRI series and respiratory signals were extracted by ensemble empirical mode decomposition for coupling analysis via a phase synchronization index. The results demonstrated that CPC estimated from conventional EDR series exhibits constant and proportional biases, while that estimated from enhanced EDR series is more reliable. Adaptive filtering can improve the accuracy of ECG-based CPC estimation significantly and achieve robust CPC analysis. The improved ECG-based CPC estimation may provide additional prognostic information for both sleep medicine and autonomic function analysis. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
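
    Of the two conventional EDR strategies mentioned (R-S amplitude and QRS area), the amplitude-based one is simple to illustrate: sample the ECG at the R-peaks and treat the slow beat-to-beat amplitude modulation as a respiration surrogate. The sketch below assumes the R-peak indices are already known and skips the paper's adaptive-filter enhancement step.

```python
import numpy as np

def edr_from_rpeaks(ecg, r_idx):
    """Derive a respiration surrogate (EDR) from beat-to-beat R-wave
    amplitudes, one of the two conventional strategies in the abstract."""
    amp = ecg[np.asarray(r_idx)]
    # Remove the mean cardiac amplitude; the remainder is dominated by
    # the respiratory modulation of the ECG amplitude.
    return amp - amp.mean()
```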

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berryman, J. G.

    While the well-known Voigt and Reuss (VR) bounds, and the Voigt-Reuss-Hill (VRH) elastic constant estimators for random polycrystals are all straightforwardly calculated once the elastic constants of anisotropic crystals are known, the Hashin-Shtrikman (HS) bounds and related self-consistent (SC) estimators for the same constants are, by comparison, more difficult to compute. Recent work has shown how to simplify (to some extent) these harder-to-compute HS bounds and SC estimators. An overview and analysis of a subsampling of these results is presented here, with the main point being to show whether or not this extra work (i.e., in calculating both the HS bounds and the SC estimates) provides added value since, in particular, the VRH estimators often do not fall within the HS bounds, while the SC estimators (for good reasons) have always been found to do so. The quantitative differences between the SC and the VRH estimators in the eight cases considered are often quite small, however, being on the order of ±1%. These quantitative results hold true even though these polycrystal Voigt-Reuss-Hill estimators more typically (but not always) fall outside the Hashin-Shtrikman bounds, while the self-consistent estimators always fall inside (or on the boundaries of) these same bounds.
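
    For the simplest case, a cubic crystal, the Voigt and Reuss averages this abstract starts from have standard closed forms, and the Hill estimator is just their mean. The HS bounds and SC estimators it compares against require more machinery and are not sketched; the copper constants in the test are approximate literature values used only as a sanity check.

```python
def vrh_cubic(c11, c12, c44):
    """Voigt, Reuss, and Hill shear-modulus estimates for a random
    polycrystal of a cubic crystal (standard closed forms)."""
    g_voigt = (c11 - c12 + 3.0 * c44) / 5.0
    g_reuss = 5.0 * (c11 - c12) * c44 / (4.0 * c44 + 3.0 * (c11 - c12))
    g_hill = 0.5 * (g_voigt + g_reuss)
    k = (c11 + 2.0 * c12) / 3.0  # cubic bulk modulus (same in Voigt and Reuss)
    return k, g_voigt, g_reuss, g_hill
```

    For an elastically isotropic "crystal" (C11 - C12 = 2 C44) the Voigt and Reuss averages coincide, which is a quick consistency check on the formulas.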

  2. Oxidative DNA damage background estimated by a system model of base excision repair

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sokhansanj, B A; Wilson, III, D M

    Human DNA can be damaged by natural metabolism through free radical production. It has been suggested that the equilibrium between innate damage and cellular DNA repair results in an oxidative DNA damage background that potentially contributes to disease and aging. Efforts to quantitatively characterize the human oxidative DNA damage background level based on measuring 8-oxoguanine lesions as a biomarker have led to estimates varying over 3-4 orders of magnitude, depending on the method of measurement. We applied a previously developed and validated quantitative pathway model of human DNA base excision repair, integrating experimentally determined endogenous damage rates and model parameters from multiple sources. Our estimates of at most 100 8-oxoguanine lesions per cell are consistent with the low end of data from biochemical and cell biology experiments, a result robust to model limitations and parameter variation. Our results show the power of quantitative system modeling to interpret composite experimental data and make biologically and physiologically relevant predictions for complex human DNA repair pathway mechanisms and capacity.
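
    The damage/repair equilibrium can be caricatured with a single rate equation, dL/dt = k_d - k_r * L, whose steady state is L* = k_d / k_r. The rates below are invented round numbers chosen only to land on the 100-lesion scale quoted above; the actual pathway model tracks many coupled repair steps.

```python
def steady_state_lesions(damage_rate, repair_rate):
    """Equilibrium lesion count for the one-equation caricature
    dL/dt = k_d - k_r * L of a damage/repair balance."""
    return damage_rate / repair_rate

def simulate(damage_rate, repair_rate, dt=0.01, steps=20000):
    """Forward-Euler integration; converges to the analytic steady state."""
    L = 0.0
    for _ in range(steps):
        L += dt * (damage_rate - repair_rate * L)
    return L
```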

  3. Electrophysiological and neuromuscular stability of persons with chronic inflammatory demyelinating polyneuropathy.

    PubMed

    Gilmore, Kevin J; Allen, Matti D; Doherty, Timothy J; Kimpinski, Kurt; Rice, Charles L

    2017-09-01

    We assessed motor unit (MU) properties and neuromuscular stability in the tibialis anterior (TA) of chronic inflammatory demyelinating polyneuropathy (CIDP) patients using decomposition-based quantitative electromyography. Dorsiflexion strength was assessed, and surface and concentric needle electromyography were sampled from the TA. Estimates of MU numbers were derived using decomposition-based quantitative electromyography and spike-triggered averaging. Neuromuscular transmission stability was assessed from concentric needle-detected MU potentials. CIDP patients had 43% lower compound muscle action potential amplitude than controls and, despite near-maximum voluntary activation, were 37% weaker. CIDP patients had 27% fewer functioning MUs in the TA, and 90% and 44% higher jiggle and jitter values, respectively, compared with controls. In summary, CIDP patients had lower strength and compound muscle action potential values, moderately fewer MUs, and significant neuromuscular instability compared with controls. Thus, in addition to muscle atrophy, voluntary weakness is also due to limitations of peripheral neural transmission consistent with demyelination. Muscle Nerve 56: 413-420, 2017. © 2016 Wiley Periodicals, Inc.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montanini, R.; Freni, F.; Rossi, G. L.

    This paper reports one of the first experimental results on the application of ultrasound-activated lock-in vibrothermography for quantitative assessment of buried flaws in complex cast parts. The use of amplitude-modulated ultrasonic heat generation allowed a selective response of defective areas within the part, as the defect itself is turned into a local thermal wave emitter. Quantitative evaluation of hidden damage was accomplished by independently estimating both the area and the depth extension of the buried flaws, while x-ray 3D computed tomography was used as the reference for sizing accuracy assessment. To retrieve the flaw's area, a simple yet effective histogram-based phase image segmentation algorithm with automatic pixel classification has been developed. A clear correlation was found between the thermal (phase) signature measured by the infrared camera on the target surface and the actual mean cross-section area of the flaw. Due to the very fast cycle time (<30 s/part), the method could potentially be applied for 100% quality control of casting components.
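
    A common way to do automatic two-class histogram segmentation of this kind is Otsu's method, which picks the threshold maximizing between-class variance. The sketch below is a generic stand-in, not the authors' exact algorithm, which the abstract does not spell out.

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Automatic two-class histogram thresholding (Otsu's method):
    return the bin center that maximizes between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    hist = hist.astype(float)
    centers = 0.5 * (edges[:-1] + edges[1:])
    total = hist.sum()
    sum_all = (hist * centers).sum()
    best_t, best_var = centers[0], -1.0
    w0 = 0.0   # weight of the lower class
    sum0 = 0.0 # weighted sum of the lower class
    for i in range(bins - 1):
        w0 += hist[i]
        sum0 += hist[i] * centers[i]
        w1 = total - w0
        if w0 == 0.0 or w1 == 0.0:
            continue
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        between = w0 * w1 * (m0 - m1) ** 2
        if between > best_var:
            best_var, best_t = between, centers[i]
    return best_t
```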

  5. Comparison of air pollution exposures in active vs. passive travel modes in European cities: A quantitative review.

    PubMed

    de Nazelle, Audrey; Bode, Olivier; Orjuela, Juan Pablo

    2017-02-01

    Transport microenvironments tend to have higher air pollutant concentrations than other settings most people encounter in their daily lives. The choice of travel mode may significantly affect individuals' exposures; however, such considerations are typically not accounted for in the exposure assessments used in environmental health studies. In particular, with increasing interest in the promotion of active travel, health impact studies have emerged that attempt to estimate the potential adverse consequences of increased pollutant inhalation during walking or cycling. Such studies require a quantification of relative exposures in travel modes. The literature on air pollution exposures in travel microenvironments in Europe was reviewed. Studies which measured various travel modes, including at least walking or cycling, in a simultaneous or quasi-simultaneous design were selected. Data from these studies were harmonized to allow for a quantitative synthesis of the estimates. Ranges of ratios and 95% confidence intervals (CI) of air pollution exposure between modes and between background and transportation modes were estimated. Ten studies measuring fine particulate matter (PM2.5), black carbon (BC), ultrafine particles (UFP), and/or carbon monoxide (CO) in the walk, bicycle, car, and/or bus modes were included in the analysis. Only three reported on CO and BC, and those results should be interpreted with caution. Pedestrians were shown to be consistently the least exposed across studies, with the bus, bicycle, and car modes on average 1.3 to 1.5 times higher for PM2.5; 1.1 to 1.7 times higher for UFP; and 1.3 to 2.9 times higher for CO; however, the 95% CI included 1 for the UFP walk-to-bus ratio. Only for BC were pedestrians more exposed than bus users on average (bus-to-walk ratio 0.8), but they remained less exposed than those on bicycles or in cars.
Car users tended to be the most exposed (from 2.9 times higher than pedestrians for BC down to exposures similar to cyclists for UFP on average). Bus exposures tended to be similar to those of cyclists (95% CI including 1 for PM2.5, CO, and BC), except for UFP, where they were lower (ratio 0.7). A quantitative method that synthesizes the literature on air pollution exposure in travel microenvironments was developed for use in health impact assessments, or potentially for epidemiology. Results relevant to the European context are presented, showing generally the greatest exposures in car riders and the lowest in pedestrians. Copyright © 2017 Elsevier Ltd. All rights reserved.
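
    Between-mode ratios with 95% CIs of the kind reported here can be mimicked on raw trip-level measurements with a bootstrap: resample each mode's exposure values, take the ratio of means, and read off percentiles. This is a simplified stand-in for the review's harmonization procedure, and the values in the test are simulated concentrations, not data from the ten studies.

```python
import numpy as np

def ratio_ci(mode_exposure, walk_exposure, n_boot=2000, seed=0):
    """Mean exposure ratio (mode / walking) with a bootstrap 95% CI."""
    rng = np.random.default_rng(seed)
    mode_exposure = np.asarray(mode_exposure, float)
    walk_exposure = np.asarray(walk_exposure, float)
    point = mode_exposure.mean() / walk_exposure.mean()
    boots = []
    for _ in range(n_boot):
        # Resample each mode independently, with replacement
        m = rng.choice(mode_exposure, size=len(mode_exposure))
        w = rng.choice(walk_exposure, size=len(walk_exposure))
        boots.append(m.mean() / w.mean())
    lo, hi = np.percentile(boots, [2.5, 97.5])
    return point, lo, hi
```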

  6. Effect of traditional medicine brahmi vati and bacoside A-rich fraction of Bacopa monnieri on acute pentylenetetrzole-induced seizures, amphetamine-induced model of schizophrenia, and scopolamine-induced memory loss in laboratory animals.

    PubMed

    Mishra, Amrita; Mishra, Arun K; Jha, Shivesh

    2018-03-01

    Brahmi vati (BV) is an Ayurvedic polyherbal formulation used since ancient times and has been prescribed for seizures associated with schizophrenia and related memory loss by Ayurvedic practitioners in India. The aim of the study was to investigate these claims by evaluation of anticonvulsant, antischizophrenic, and memory-enhancing activities. The antioxidant status of the brain was determined by estimation of malondialdehyde (MDA) and reduced glutathione (GSH) levels. Acetylcholinesterase (AChE) was quantitatively estimated in the brain tissue. Brahmi vati was prepared in-house by strictly following the traditional Ayurvedic formula. A bacoside A-rich fraction (BA) of Bacopa monnieri was prepared by extraction and fractionation. It was then standardized by high-performance liquid chromatography (HPLC) and given at a dose of 32.5 mg/kg body weight to the different groups of animals for 7 days. On the seventh day, the activities were evaluated using standard procedures. Brahmi vati showed significant anticonvulsant, memory-enhancing, and antischizophrenia activities when compared with the control groups and BA. It caused significantly higher brain glutathione levels. Acetylcholinesterase activity was found to be significantly low in the BV-treated group. The findings of the present study suggest that BV may be used to treat seizures associated with schizophrenia and related memory loss. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Intraoperative perception and estimates on extent of resection during awake glioma surgery: overcoming the learning curve.

    PubMed

    Lau, Darryl; Hervey-Jumper, Shawn L; Han, Seunggu J; Berger, Mitchel S

    2018-05-01

    OBJECTIVE There is ample evidence that extent of resection (EOR) is associated with improved outcomes for glioma surgery. However, it is often difficult to accurately estimate EOR intraoperatively, and surgeon accuracy has yet to be reviewed. In this study, the authors quantitatively assessed the accuracy of intraoperative perception of EOR during awake craniotomy for tumor resection. METHODS A single-surgeon experience of performing awake craniotomies for tumor resection over a 17-year period was examined. Quantitative estimates of EOR were recorded from a retrospective review of operative reports. Definitive EOR was based on postoperative MRI. Accuracy of EOR estimation was examined both as a general outcome (gross-total resection [GTR] or subtotal resection [STR]) and quantitatively (within 5% of the EOR on postoperative MRI). Patient demographics, tumor characteristics, and surgeon experience were examined. The effects of accuracy on motor and language outcomes were assessed. RESULTS A total of 451 patients were included in the study. Overall accuracy of intraoperative perception of whether GTR or STR was achieved was 79.6%, and overall accuracy of quantitative perception of resection (within 5% of postoperative MRI) was 81.4%. There was a significant difference (p = 0.049) in accuracy for gross perception over the 17-year period, with improvement over the later years: 1997-2000 (72.6%), 2001-2004 (78.5%), 2005-2008 (80.7%), and 2009-2013 (84.4%). Similarly, there was a significant improvement (p = 0.015) in accuracy of quantitative perception of EOR over the 17-year period: 1997-2000 (72.2%), 2001-2004 (69.8%), 2005-2008 (84.8%), and 2009-2013 (93.4%). This improvement in accuracy is demonstrated by the significantly higher odds of correctly estimating quantitative EOR in the later years of the series on multivariate logistic regression.
Insular tumors were associated with the highest accuracy of gross perception (89.3%; p = 0.034), but lowest accuracy of quantitative perception (61.1% correct; p < 0.001) compared with tumors in other locations. Even after adjusting for surgeon experience, this particular trend for insular tumors remained true. The absence of 1p19q co-deletion was associated with higher quantitative perception accuracy (96.9% vs 81.5%; p = 0.051). Tumor grade, recurrence, diagnosis, and isocitrate dehydrogenase-1 (IDH-1) status were not associated with accurate perception of EOR. Overall, new neurological deficits occurred in 8.4% of cases, and 42.1% of those new neurological deficits persisted after the 3-month follow-up. Correct quantitative perception was associated with lower postoperative motor deficits (2.4%) compared with incorrect perceptions (8.0%; p = 0.029). There were no detectable differences in language outcomes based on perception of EOR. CONCLUSIONS The findings from this study suggest that there is a learning curve associated with the ability to accurately assess intraoperative EOR during glioma surgery, and it may take more than a decade to be truly proficient. Understanding the factors associated with this ability to accurately assess EOR will provide safer surgeries while maximizing tumor resection.

  8. Visually estimated ejection fraction by two dimensional and triplane echocardiography is closely correlated with quantitative ejection fraction by real-time three dimensional echocardiography.

    PubMed

    Shahgaldi, Kambiz; Gudmundsson, Petri; Manouras, Aristomenis; Brodin, Lars-Ake; Winter, Reidar

    2009-08-25

    Visual assessment of left ventricular ejection fraction (LVEF) is often used in clinical routine despite general recommendations to use quantitative biplane Simpson's (BPS) measurements. Even though quantitative methods are well validated and for many reasons preferable, the feasibility of visual assessment (eyeballing) is superior. There are to date only sparse data comparing visual EF assessment with quantitative methods. The aim of this study was to compare visual EF assessment by two-dimensional echocardiography (2DE) and triplane echocardiography (TPE), using quantitative real-time three-dimensional echocardiography (RT3DE) as the reference method. Thirty patients were enrolled in the study. Eyeballing EF was assessed using apical 4- and 2-chamber views and the TP mode by two experienced readers blinded to all clinical data. The measurements were compared to quantitative RT3DE. There was an excellent correlation between eyeballing EF by 2D and TP vs 3DE (r = 0.91 and 0.95, respectively) without any significant bias (-0.5 +/- 3.7% and -0.2 +/- 2.9%, respectively). Intraobserver variability was 3.8% for eyeballing 2DE, 3.2% for eyeballing TP, and 2.3% for quantitative 3D-EF. Interobserver variability was 7.5% for eyeballing 2D and 8.4% for eyeballing TP. Visual estimation of LVEF both by 2D and TP by an experienced reader correlates well with quantitative EF determined by RT3DE. There is an apparent trend towards smaller variability using TP in comparison to 2D; this was, however, not statistically significant.
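
    The quantitative reference methods here rest on the method of discs: slice the ventricle into discs along its long axis, sum disc volumes at end-diastole and end-systole, and take EF = (EDV - ESV) / EDV. A single-plane sketch follows (clinical biplane Simpson combines diameters from two orthogonal apical views, which is not shown):

```python
import numpy as np

def simpson_ef(diameters_ed, diameters_es, length_ed, length_es):
    """Ejection fraction (%) by single-plane method of discs."""
    def volume(diams, length):
        diams = np.asarray(diams, float)
        h = length / len(diams)  # disc height along the long axis
        return np.sum(np.pi * (diams / 2.0) ** 2 * h)
    edv = volume(diameters_ed, length_ed)  # end-diastolic volume
    esv = volume(diameters_es, length_es)  # end-systolic volume
    return 100.0 * (edv - esv) / edv
```

    As a check: if every systolic diameter is 0.7 times its diastolic value at equal lengths, EF = 100 * (1 - 0.7^2) = 51%.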

  9. Dengue prediction by the web: Tweets are a useful tool for estimating and forecasting Dengue at country and city level

    PubMed Central

    Degener, Carolin Marlen; Vinhal, Livia; Coelho, Giovanini; Meira, Wagner; Codeço, Claudia Torres; Teixeira, Mauro Martins

    2017-01-01

Background Infectious diseases are a leading threat to public health. Accurate and timely monitoring of disease risk and progress can reduce their impact. Mentioning a disease in social networks is correlated with physician visits by patients, and can be used to estimate disease activity. Dengue is the fastest growing mosquito-borne viral disease, with an estimated annual incidence of 390 million infections, of which 96 million manifest clinically. Dengue burden is likely to increase in the future owing to trends toward increased urbanization, scarce water supplies and, possibly, environmental change. The epidemiological dynamic of Dengue is complex and difficult to predict, partly because surveillance systems are costly and slow. Methodology / Principal findings In this study, we aimed to quantitatively assess the usefulness of data acquired by Twitter for the early detection and monitoring of Dengue epidemics, at both country and city level on a weekly basis. We evaluated and demonstrated the potential of tweet modeling for Dengue estimation and forecasting, in comparison with other available web-based data, Google Trends and Wikipedia access logs. We also studied the factors that might influence the goodness-of-fit of the model. We built a simple model based on tweets that was able to ‘nowcast’, i.e. estimate disease numbers in the same week, but also ‘forecast’ disease in future weeks. At the country level, tweets are strongly associated with Dengue cases, and can estimate present and future Dengue cases up to 8 weeks in advance. At the city level, tweets are also useful for estimating Dengue activity. Our model can be applied successfully to small and less developed cities, suggesting a robust construction, even though it may be influenced by the incidence of the disease, local Twitter activity, and social factors, including the human development index and internet access.
Conclusions The association of tweets with Dengue cases is valuable for assisting traditional Dengue surveillance in real time and at low cost. Tweets can successfully nowcast, i.e. estimate Dengue in the present week, and also forecast, i.e. predict Dengue up to 8 weeks into the future, at both country and city level with high estimation capacity. PMID:28719659
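
The nowcast/forecast setup described above can be sketched as a lagged regression of weekly case counts on weekly tweet counts: tweets observed this week estimate cases k weeks ahead. This is a minimal illustration on synthetic seasonal data, not the paper's model; all numbers below are invented.

```python
import numpy as np

# Synthetic weekly series: a seasonal Dengue signal, with tweet counts
# tracking disease activity (assumed relationship, for illustration only).
rng = np.random.default_rng(0)
weeks = 120
season = 100 + 80 * np.sin(2 * np.pi * np.arange(weeks) / 52)
cases = season + rng.normal(0, 5, weeks)
tweets = 0.5 * season + rng.normal(0, 5, weeks)

def lagged_fit(tweets, cases, k):
    """Least-squares fit of cases[t + k] = a + b * tweets[t]."""
    x, y = tweets[:len(tweets) - k] if k else tweets, cases[k:]
    A = np.column_stack([np.ones_like(x), x])
    (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
    r = np.corrcoef(a + b * x, y)[0, 1]        # in-sample fit quality
    return a, b, r

for k in (0, 4, 8):                            # nowcast, 4- and 8-week forecast
    _, _, r = lagged_fit(tweets, cases, k)
    print(f"lag {k} weeks: r = {r:.2f}")
```

As expected, the fit is strongest for the nowcast (lag 0) and degrades as the forecast horizon grows.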

  10. Analysis of spatiotemporal variability of C-factor derived from remote sensing data

    NASA Astrophysics Data System (ADS)

    Pechanec, Vilém; Mráz, Alexander; Benc, Antonín; Cudlín, Pavel

    2018-01-01

Soil erosion is an important phenomenon that contributes to the degradation of agricultural land. Even though it is a natural process, human activities can significantly increase its impact on land degradation and present a serious limitation on sustainable agricultural land use. Nowadays, the risk of soil erosion is assessed either qualitatively by expert assessment or quantitatively using a model-based approach. One of the primary factors affecting the soil erosion assessment is the cover-management factor (C-factor). In the Czech Republic, several models are used to assess the C-factor on a long-term basis, based on data collected using traditional tabular methods. This paper presents work to investigate the estimation of both long-term and short-term cover-management factors using remote sensing data. The results demonstrate a successful development of C-factor maps for each month of 2014, the growing season average, and the annual average for the Czech Republic. C-factor values calculated from remote sensing data confirmed the expected trend in their temporal variability for selected crops. The results presented in this paper can be used for enhancing existing methods for estimating the C-factor, planning future agricultural activities, and designing technical remediations and improvement activities of land use in the Czech Republic, which are also financially supported by the European Union funds.

  11. A BAYESIAN METHOD FOR CALCULATING REAL-TIME QUANTITATIVE PCR CALIBRATION CURVES USING ABSOLUTE PLASMID DNA STANDARDS

    EPA Science Inventory

    In real-time quantitative PCR studies using absolute plasmid DNA standards, a calibration curve is developed to estimate an unknown DNA concentration. However, potential differences in the amplification performance of plasmid DNA compared to genomic DNA standards are often ignore...

  12. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  13. Employment from Solar Energy: A Bright but Partly Cloudy Future.

    ERIC Educational Resources Information Center

    Smeltzer, K. K.; Santini, D. J.

    A comparison of quantitative and qualitative employment effects of solar and conventional systems can prove the increased employment postulated as one of the significant secondary benefits of a shift from conventional to solar energy use. Current quantitative employment estimates show solar technology-induced employment to be generally greater…

  14. 78 FR 53336 - List of Fisheries for 2013

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-29

    ... provided on the LOF are solely used for descriptive purposes and will not be used in determining future... this information to determine whether the fishery can be classified on the LOF based on quantitative... does not have a quantitative estimate of the number of mortalities and serious injuries of pantropical...

  15. The calibration of video cameras for quantitative measurements

    NASA Technical Reports Server (NTRS)

    Snow, Walter L.; Childers, Brooks A.; Shortis, Mark R.

    1993-01-01

    Several different recent applications of velocimetry at Langley Research Center are described in order to show the need for video camera calibration for quantitative measurements. Problems peculiar to video sensing are discussed, including synchronization and timing, targeting, and lighting. The extension of the measurements to include radiometric estimates is addressed.

  16. 40 CFR 260.11 - References.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...). (7) ASTM E 168-88, “Standard Practices for General Techniques of Infrared Quantitative Analysis,” IBR...-Visible Quantitative Analysis,” IBR approved for § 264.1063. (9) ASTM E 260-85, “Standard Practice for..., Research Triangle Park, NC. (1) “Screening Procedures for Estimating the Air Quality Impact of Stationary...

  17. 40 CFR 260.11 - References.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...). (7) ASTM E 168-88, “Standard Practices for General Techniques of Infrared Quantitative Analysis,” IBR...-Visible Quantitative Analysis,” IBR approved for § 264.1063. (9) ASTM E 260-85, “Standard Practice for..., Research Triangle Park, NC. (1) “Screening Procedures for Estimating the Air Quality Impact of Stationary...

  18. 40 CFR 260.11 - References.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...). (7) ASTM E 168-88, “Standard Practices for General Techniques of Infrared Quantitative Analysis,” IBR...-Visible Quantitative Analysis,” IBR approved for § 264.1063. (9) ASTM E 260-85, “Standard Practice for..., Research Triangle Park, NC. (1) “Screening Procedures for Estimating the Air Quality Impact of Stationary...

  19. QUANTITATIVE EVALUATION OF BROMODICHLOROMETHANE METABOLISM BY RECOMBINANT RAT AND HUMAN CYTOCHROME P450S

    EPA Science Inventory

    ABSTRACT
    We report quantitative estimates of the parameters for metabolism of bromodichloromethane (BDCM) by recombinant preparations of hepatic cytochrome P450s (CYPs) from rat and human. BDCM is a drinking water disinfectant byproduct that has been implicated in liver, kidn...

  20. Quantification of Cyclic Ground Reaction Force Histories During Daily Activity in Humans

    NASA Technical Reports Server (NTRS)

    Breit, G. A.; Whalen, R. T.; Wade, Charles E. (Technical Monitor)

    1994-01-01

    Theoretical models and experimental studies of bone remodeling suggest that bone density and structure are influenced by local cyclic skeletal tissue stress and strain histories. Estimation of long-term loading histories in humans is usually achieved by assessment of physical activity level by questionnaires, logbooks, and pedometers, since the majority of lower limb cyclic loading occurs during walking and running. These methods provide some indication of the mechanical loading history, but fail to consider the true magnitude of the lower limb skeletal forces generated by various daily activities. These techniques cannot account for individual gait characteristics, gait speed, and unpredictable high loading events that may influence bone mass significantly. We have developed portable instrumentation to measure and record the vertical component of the ground reaction force (GRFz) during normal daily activity. This equipment allows long-term quantitative monitoring of musculoskeletal loads, which in conjunction with bone mineral density assessments, promises to elucidate the relationship between skeletal stresses and bone remodeling.
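
A long-term GRFz recording of the kind described here is typically reduced to a cyclic loading history: peak load per cycle, cycle counts, and a magnitude-weighted daily stimulus. The sketch below assumes a stimulus of the form S = (sum_i n_i * F_i^m)^(1/m) with exponent m = 4; both the functional form and the exponent are illustrative assumptions, not the authors' exact model.

```python
import numpy as np

def peak_loads(grfz, threshold=0.5):
    """Peak force of each loading cycle: local maxima above a threshold (in BW)."""
    g = np.asarray(grfz, dtype=float)
    interior = (g[1:-1] > g[:-2]) & (g[1:-1] >= g[2:]) & (g[1:-1] > threshold)
    return g[1:-1][interior]

def daily_stimulus(grfz, m=4.0):
    """Magnitude-weighted loading stimulus (assumed form, exponent m assumed)."""
    peaks = peak_loads(grfz)
    return float(np.sum(peaks ** m) ** (1.0 / m))

# Synthetic trace: 20 walking cycles peaking near 1.2 body weight (BW).
t = np.linspace(0, 20, 2000)
grfz = 1.2 * np.abs(np.sin(np.pi * t))
print(f"cycles = {len(peak_loads(grfz))}, stimulus = {daily_stimulus(grfz):.2f} BW")
```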

  1. Antibodies against toluene diisocyanate protein conjugates. Three methods of measurement.

    PubMed

    Patterson, R; Harris, K E; Zeiss, C R

    1983-12-01

    With the use of canine antisera against toluene diisocyanate (TDI)-dog serum albumin (DSA), techniques for measuring antibody against TDI-DSA were evaluated. The use of an ammonium sulfate precipitation assay showed suggestive evidence of antibody binding but high levels of TDI-DSA precipitation in the absence of antibody limit any usefulness of this technique. Double-antibody co-precipitation techniques will measure total antibody or Ig class antibody against 125I-TDI-DSA. These techniques are quantitative. The polystyrene tube radioimmunoassay is a highly sensitive method of detecting and quantitatively estimating IgG antibody. The enzyme linked immunosorbent assay is a rapidly adaptable method for the quantitative estimation of IgG, IgA, and IgM against TDI-homologous proteins. All these techniques were compared and results are demonstrated by using the same serum sample for analysis.

  2. Effects of a 20 year rain event: a quantitative microbial risk assessment of a case of contaminated bathing water in Copenhagen, Denmark.

    PubMed

    Andersen, S T; Erichsen, A C; Mark, O; Albrechtsen, H-J

    2013-12-01

Quantitative microbial risk assessments (QMRAs) often lack data on water quality, which leads to great uncertainty because of the many assumptions required. The quantity of waste water contamination was estimated and included in a QMRA of an extreme rain event that led to a combined sewer overflow (CSO) into bathing water where an ironman competition later took place. Two dynamic models, (1) a drainage model and (2) a 3D hydrodynamic model, estimated the dilution of waste water from source to recipient. The drainage model estimated that 2.6% of the waste water was left in the system before the CSO, and the hydrodynamic model estimated that 4.8% of the recipient bathing water came from the CSO, so on average there was 0.13% waste water in the bathing water during the ironman competition. The total estimated incidence rate from a conservative estimate of the pathogenic load of five reference pathogens was 42%, comparable to the 55% found in an epidemiological study of the case. The combination of applying dynamic models and exposure data led to an improved QMRA that included an estimate of the dilution factor. This approach has not been described previously.
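
Reading the two model outputs as successive dilution factors, the reported waste water fraction can be checked with two lines of arithmetic:

```python
# Dilution chain using the two percentages reported in the abstract:
# 2.6% waste water in the CSO discharge (drainage model), and 4.8% of the
# bathing water originating from the CSO (3D hydrodynamic model).
frac_wastewater_in_cso = 0.026
frac_cso_in_bathing    = 0.048

frac_wastewater_in_bathing = frac_wastewater_in_cso * frac_cso_in_bathing
# ~0.125%; the abstract reports ~0.13%, presumably from unrounded model outputs.
print(f"{frac_wastewater_in_bathing:.3%}")
```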

  3. Benefits of dynamic mobility applications : preliminary estimates from the literature.

    DOT National Transportation Integrated Search

    2012-12-01

    This white paper examines the available quantitative information on the potential mobility benefits of the connected vehicle Dynamic Mobility Applications (DMA). This work will be refined as more and better estimates of benefits from mobility applica...

  4. Automated detection of arterial input function in DSC perfusion MRI in a stroke rat model

    NASA Astrophysics Data System (ADS)

    Yeh, M.-Y.; Lee, T.-H.; Yang, S.-T.; Kuo, H.-H.; Chyi, T.-K.; Liu, H.-L.

    2009-05-01

Quantitative cerebral blood flow (CBF) estimation requires deconvolution of the tissue concentration time curves with an arterial input function (AIF). However, image-based determination of the AIF in rodents is challenging due to limited spatial resolution. We evaluated the feasibility of quantitative analysis using automated AIF detection and compared the results with commonly applied semi-quantitative analysis. Permanent occlusion of the bilateral or unilateral common carotid artery was used to induce cerebral ischemia in rats. Imaging with the dynamic susceptibility contrast method was performed on a 3-T magnetic resonance scanner with a spin-echo echo-planar-image sequence (TR/TE = 700/80 ms, FOV = 41 mm, matrix = 64, 3 slices, SW = 2 mm), starting from 7 s prior to contrast injection (1.2 ml/kg), at four different time points. For quantitative analysis, CBF was calculated using the AIF obtained from the 10 voxels with the greatest contrast enhancement after deconvolution. For semi-quantitative analysis, relative CBF was estimated as the integral divided by the first moment of the relaxivity time curves. We observed that if the AIFs obtained in three different ROIs (whole brain, hemisphere without lesion, and hemisphere with lesion) were similar, the CBF ratios (lesion/normal) from the quantitative and semi-quantitative analyses showed a similar trend at the different operative time points; if the AIFs differed, the CBF ratios could differ as well. We concluded that using the local maximum, one can define a proper AIF without knowing the anatomical location of the arteries in a stroke rat model.
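
The semi-quantitative index described above, the integral of the relaxivity-time curve divided by its (normalized) first moment, can be sketched on a synthetic bolus curve. The gamma-variate-like shape and timing below are assumptions for illustration only.

```python
import numpy as np

def trapz(y, t):
    """Trapezoidal integral of samples y over time points t."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t)))

def relative_cbf(t, delta_r2s):
    """Semi-quantitative relative CBF: curve area divided by its normalized
    first moment (the curve's center of mass, a mean-transit-time proxy)."""
    area = trapz(delta_r2s, t)
    mean_time = trapz(t * delta_r2s, t) / area
    return area / mean_time

# Synthetic gamma-variate-like bolus, sampled at TR = 0.7 s as in the protocol.
t = np.arange(0, 60, 0.7)
bolus = np.where(t > 7, (t - 7) ** 2 * np.exp(-(t - 7) / 3.0), 0.0)
print(f"relative CBF index: {relative_cbf(t, bolus):.2f}")
```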

  5. Prediction of acute mammalian toxicity using QSAR methods: a case study of sulfur mustard and its breakdown products.

    PubMed

    Ruiz, Patricia; Begluitti, Gino; Tincher, Terry; Wheeler, John; Mumtaz, Moiz

    2012-07-27

Predicting toxicity quantitatively using Quantitative Structure Activity Relationships (QSAR) has matured over recent years to the point that the predictions can be used to help identify missing comparison values in a substance's database. In this manuscript we investigate using the lethal dose that kills fifty percent of a test population (LD₅₀) to determine the relative toxicity of a number of substances. In general, the smaller the LD₅₀ value, the more toxic the chemical, and the larger the LD₅₀ value, the lower the toxicity. When systemic toxicity and other specific toxicity data are unavailable for the chemical(s) of interest during emergency responses, LD₅₀ values may be employed to determine the relative toxicity of a series of chemicals. In the present study, a group of chemical warfare agents and their breakdown products have been evaluated using four available rat oral QSAR LD₅₀ models. The QSAR analysis shows that the breakdown products of Sulfur Mustard (HD) are predicted to be less toxic than the parent compound as well as other known breakdown products that have known toxicities. The QSAR-estimated LD₅₀ values of the breakdown products ranged from 299 mg/kg to 5,764 mg/kg. This evaluation allows for the ranking and toxicity estimation of compounds for which little toxicity information existed, thus leading to better risk decision making in the field.

  6. 3D-quantitative structure-activity relationship study for the design of novel enterovirus A71 3C protease inhibitors.

    PubMed

    Nie, Quandeng; Xu, Xiaoyi; Zhang, Qi; Ma, Yuying; Yin, Zheng; Shang, Luqing

    2018-06-07

A three-dimensional quantitative structure-activity relationship (QSAR) model of enterovirus A71 3C protease inhibitors was constructed in this study. The protein-ligand interaction fingerprint was analyzed to generate a pharmacophore model. A predictive and reliable three-dimensional QSAR model was built based on the Flexible Alignment of AutoGPA. Moreover, three novel compounds (I-III) were designed and evaluated for their biochemical activity against 3C protease and anti-enterovirus A71 activity in vitro. III exhibited excellent inhibitory activity (IC₅₀ = 0.031 ± 0.005 μM, EC₅₀ = 0.036 ± 0.007 μM). Thus, this study provides a useful quantitative structure-activity relationship model for developing potent inhibitors of the enterovirus A71 3C protease. This article is protected by copyright. All rights reserved.

  7. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation.

    PubMed

    Ribba, B; Grimm, H P; Agoram, B; Davies, M R; Gadkar, K; Niederer, S; van Riel, N; Timmis, J; van der Graaf, P H

    2017-08-01

With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicines research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development to focus discussions on two critical methodological aspects of QSP model development: optimal structural granularity and parameter estimation. Here we report, in a perspective article, a summary of the presentations and discussions. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  8. Detection, monitoring, and quantitative analysis of wildfires with the BIRD satellite

    NASA Astrophysics Data System (ADS)

    Oertel, Dieter A.; Briess, Klaus; Lorenz, Eckehard; Skrbek, Wolfgang; Zhukov, Boris

    2004-02-01

Increasing concern about the environment and interest in avoiding losses have led to growing demands on space-borne fire detection, monitoring, and quantitative parameter estimation of wildfires. The global change research community intends to quantify the amount of gaseous and particulate matter emitted from vegetation fires, peat fires, and coal seam fires. The DLR Institute of Space Sensor Technology and Planetary Exploration (Berlin-Adlershof) developed a small satellite called BIRD (Bi-spectral Infrared Detection) which carries a sensor package specially designed for fire detection. BIRD was launched as a piggy-back satellite on October 22, 2001 with ISRO's Polar Satellite Launch Vehicle (PSLV). It circles the Earth in a polar, sun-synchronous orbit at an altitude of 572 km, providing unique data for detailed analysis of high-temperature events on the Earth's surface. The BIRD sensor package is dedicated to high-resolution and reliable fire recognition; active fire analysis is possible in the sub-pixel domain. The leading channel for fire detection and monitoring is the MIR channel at 3.8 μm. The rejection of false alarms is based on procedures using MIR/NIR (Middle Infra Red/Near Infra Red) and MIR/TIR (Middle Infra Red/Thermal Infra Red) radiance ratio thresholds. Unique results of BIRD wildfire detection and analysis over fire-prone regions in Australia and Asia are presented. BIRD successfully demonstrates innovative fire recognition technology for small satellites, which permits the retrieval of quantitative characteristics of actively burning wildfires, such as the equivalent fire temperature, fire area, radiative energy release, fire front length, and fire front strength.
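
The sub-pixel retrieval of equivalent fire temperature and fire area mentioned at the end of this abstract is classically a Dozier-style bi-spectral problem: model the pixel radiance in the MIR and TIR channels as a mix of a small burning fraction at fire temperature and the rest at background temperature, then solve the two equations. The wavelengths, background temperature, and bisection solver below are illustrative assumptions, not BIRD's operational algorithm.

```python
import numpy as np

H, C, K = 6.626e-34, 2.998e8, 1.381e-23

def planck(lam, T):
    """Spectral radiance B(lambda, T) of a blackbody, W m^-3 sr^-1."""
    return 2 * H * C**2 / lam**5 / (np.exp(H * C / (lam * K * T)) - 1.0)

LAM_MIR, LAM_TIR = 3.8e-6, 8.9e-6   # assumed channel wavelengths

def retrieve_fire(L_mir, L_tir, Tb, lo=400.0, hi=1500.0):
    """Bisect on fire temperature Tf; burning fraction p follows from the MIR equation."""
    def resid(Tf):
        p = (L_mir - planck(LAM_MIR, Tb)) / (planck(LAM_MIR, Tf) - planck(LAM_MIR, Tb))
        return p * planck(LAM_TIR, Tf) + (1 - p) * planck(LAM_TIR, Tb) - L_tir
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if resid(lo) * resid(mid) <= 0:
            hi = mid
        else:
            lo = mid
    Tf = 0.5 * (lo + hi)
    p = (L_mir - planck(LAM_MIR, Tb)) / (planck(LAM_MIR, Tf) - planck(LAM_MIR, Tb))
    return Tf, p

# Synthetic pixel: 0.1% of the pixel burning at 800 K over a 300 K background.
Tb, Tf_true, p_true = 300.0, 800.0, 1e-3
L_mir = p_true * planck(LAM_MIR, Tf_true) + (1 - p_true) * planck(LAM_MIR, Tb)
L_tir = p_true * planck(LAM_TIR, Tf_true) + (1 - p_true) * planck(LAM_TIR, Tb)
Tf, p = retrieve_fire(L_mir, L_tir, Tb)
print(f"retrieved Tf = {Tf:.0f} K, burning fraction = {p:.1e}")
```

The MIR channel dominates the retrieval because Planck radiance at 3.8 μm is far more sensitive to fire-range temperatures than at TIR wavelengths, which is why MIR is the leading detection channel.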

  9. Time-dependent 31P saturation transfer in the phosphoglucomutase reaction. Characterization of the spin system for the Cd(II) enzyme and evaluation of rate constants for the transfer process.

    PubMed

    Post, C B; Ray, W J; Gorenstein, D G

    1989-01-24

    Time-dependent 31P saturation-transfer studies were conducted with the Cd2+-activated form of muscle phosphoglucomutase to probe the origin of the 100-fold difference between its catalytic efficiency (in terms of kcat) and that of the more efficient Mg2+-activated enzyme. The present paper describes the equilibrium mixture of phosphoglucomutase and its substrate/product pair when the concentration of the Cd2+ enzyme approaches that of the substrate and how the nine-spin 31P NMR system provided by this mixture was treated. It shows that the presence of abortive complexes is not a significant factor in the reduced activity of the Cd2+ enzyme since the complex of the dephosphoenzyme and glucose 1,6-bisphosphate, which accounts for a large majority of the enzyme present at equilibrium, is catalytically competent. It also shows that rate constants for saturation transfer obtained at three different ratios of enzyme to free substrate are mutually compatible. These constants, which were measured at chemical equilibrium, can be used to provide a quantitative kinetic rationale for the reduced steady-state activity elicited by Cd2+ relative to Mg2+ [cf. Ray, W.J., Post, C.B., & Puvathingal, J.M. (1989) Biochemistry (following paper in this issue)]. They also provide minimal estimates of 350 and 150 s-1 for the rate constants describing (PO3-) transfer from the Cd2+ phosphoenzyme to the 6-position of bound glucose 1-phosphate and to the 1-position of bound glucose 6-phosphate, respectively. These minimal estimates are compared with analogous estimates for the Mg2+ and Li+ forms of the enzyme in the accompanying paper.

  10. Quantitative elasticity measurement of urinary bladder wall using laser-induced surface acoustic waves.

    PubMed

    Li, Chunhui; Guan, Guangying; Zhang, Fan; Song, Shaozhen; Wang, Ruikang K; Huang, Zhihong; Nabi, Ghulam

    2014-12-01

The maintenance of urinary bladder elasticity is essential to its functions, including the storage and voiding phases of the micturition cycle. Bladder stiffness can be changed by various pathophysiological conditions. Quantitative measurement of bladder elasticity is an essential step toward understanding various urinary bladder disease processes and improving patient care. As a nondestructive, noncontact method, laser-induced surface acoustic waves (SAWs) can accurately characterize the elastic properties of different layers of organs such as the urinary bladder. This initial investigation evaluates the feasibility of a noncontact, all-optical method of generating and measuring the elasticity of the urinary bladder. Quantitative elasticity measurements of ex vivo porcine urinary bladder were made using the laser-induced SAW technique. A pulsed laser was used to excite SAWs that propagated on the bladder wall surface. A dedicated phase-sensitive optical coherence tomography (PhS-OCT) system remotely recorded the SAWs, from which the elastic properties of different layers of the bladder were estimated. During the experiments, a series of measurements was performed at five precisely controlled bladder volumes (filled with water) to estimate changes in elasticity in relation to varying bladder contents. The results, validated by optical coherence elastography, show that the laser-induced SAW technique combined with PhS-OCT can be a feasible method for quantitative estimation of biomechanical properties.
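
The step from a measured SAW speed to an elasticity value can be sketched with the standard Rayleigh-wave (Viktorov) approximation, c_R ≈ (0.87 + 1.12ν)/(1 + ν) · sqrt(E / (2ρ(1 + ν))), inverted for Young's modulus E. The tissue density, Poisson ratio, and wave speed below are illustrative assumptions, not values from the study.

```python
# Invert the Viktorov approximation for Rayleigh (surface acoustic) wave speed
# to recover Young's modulus E from a measured SAW speed c_R.
def youngs_modulus_from_saw(c_r, rho, nu):
    factor = (0.87 + 1.12 * nu) / (1.0 + nu)
    shear_speed = c_r / factor          # c_s = sqrt(mu / rho)
    mu = rho * shear_speed ** 2         # shear modulus
    return 2.0 * mu * (1.0 + nu)        # E = 2 * mu * (1 + nu)

# Soft-tissue-like example (assumed): nearly incompressible, density ~ water.
c_r, rho, nu = 3.0, 1000.0, 0.49        # m/s, kg/m^3, dimensionless
E = youngs_modulus_from_saw(c_r, rho, nu)
print(f"E ~ {E / 1e3:.1f} kPa")
```

Because E scales with the square of the wave speed, a stiffening bladder wall shows up directly as a faster SAW.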

  11. Distance measures and optimization spaces in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Rode, Karyn D.; Budge, Suzanne M.; Thiemann, Gregory W.

    2015-01-01

    Quantitative fatty acid signature analysis has become an important method of diet estimation in ecology, especially marine ecology. Controlled feeding trials to validate the method and estimate the calibration coefficients necessary to account for differential metabolism of individual fatty acids have been conducted with several species from diverse taxa. However, research into potential refinements of the estimation method has been limited. We compared the performance of the original method of estimating diet composition with that of five variants based on different combinations of distance measures and calibration-coefficient transformations between prey and predator fatty acid signature spaces. Fatty acid signatures of pseudopredators were constructed using known diet mixtures of two prey data sets previously used to estimate the diets of polar bears Ursus maritimus and gray seals Halichoerus grypus, and their diets were then estimated using all six variants. In addition, previously published diets of Chukchi Sea polar bears were re-estimated using all six methods. Our findings reveal that the selection of an estimation method can meaningfully influence estimates of diet composition. Among the pseudopredator results, which allowed evaluation of bias and precision, differences in estimator performance were rarely large, and no one estimator was universally preferred, although estimators based on the Aitchison distance measure tended to have modestly superior properties compared to estimators based on the Kullback-Leibler distance measure. However, greater differences were observed among estimated polar bear diets, most likely due to differential estimator sensitivity to assumption violations. Our results, particularly the polar bear example, suggest that additional research into estimator performance and model diagnostics is warranted.
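
The two distance measures compared above can be sketched directly. The Aitchison distance is the Euclidean distance between centered log-ratio (clr) transforms of the compositional signatures; the Kullback-Leibler distance, as usually formulated in quantitative fatty acid signature analysis, is the symmetrized form Σ(xᵢ − yᵢ)·log(xᵢ/yᵢ). The signatures below are made-up four-component examples (real signatures have dozens of fatty acids).

```python
import numpy as np

def clr(x):
    """Centered log-ratio transform of a strictly positive composition."""
    logx = np.log(np.asarray(x, dtype=float))
    return logx - logx.mean()

def aitchison(x, y):
    """Aitchison distance: Euclidean distance in clr space."""
    return float(np.linalg.norm(clr(x) - clr(y)))

def kullback_leibler(x, y):
    """Symmetrized Kullback-Leibler distance between two compositions."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(np.sum((x - y) * np.log(x / y)))

sig_a = np.array([0.40, 0.30, 0.20, 0.10])   # made-up predator signature
sig_b = np.array([0.35, 0.30, 0.25, 0.10])   # made-up prey-mixture signature
print(aitchison(sig_a, sig_b), kullback_leibler(sig_a, sig_b))
```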

  12. A comparison of individual and population-derived vascular input functions for quantitative DCE-MRI in rats.

    PubMed

    Hormuth, David A; Skinner, Jack T; Does, Mark D; Yankeelov, Thomas E

    2014-05-01

    Dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) can quantitatively and qualitatively assess physiological characteristics of tissue. Quantitative DCE-MRI requires an estimate of the time rate of change of the concentration of the contrast agent in the blood plasma, the vascular input function (VIF). Measuring the VIF in small animals is notoriously difficult as it requires high temporal resolution images limiting the achievable number of slices, field-of-view, spatial resolution, and signal-to-noise. Alternatively, a population-averaged VIF could be used to mitigate the acquisition demands in studies aimed to investigate, for example, tumor vascular characteristics. Thus, the overall goal of this manuscript is to determine how the kinetic parameters estimated by a population based VIF differ from those estimated by an individual VIF. Eight rats bearing gliomas were imaged before, during, and after an injection of Gd-DTPA. K(trans), ve, and vp were extracted from signal-time curves of tumor tissue using both individual and population-averaged VIFs. Extended model voxel estimates of K(trans) and ve in all animals had concordance correlation coefficients (CCC) ranging from 0.69 to 0.98 and Pearson correlation coefficients (PCC) ranging from 0.70 to 0.99. Additionally, standard model estimates resulted in CCCs ranging from 0.81 to 0.99 and PCCs ranging from 0.98 to 1.00, supporting the use of a population based VIF if an individual VIF is not available. Copyright © 2014 Elsevier Inc. All rights reserved.
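
The agreement statistic used above, Lin's concordance correlation coefficient (CCC), differs from Pearson's correlation in that it also penalizes systematic offset and scale differences between the two sets of estimates. A minimal sketch with made-up Ktrans values:

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between paired estimates."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()                   # population variances
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical Ktrans estimates (1/min): individual VIF vs population VIF.
k_individual = np.array([0.10, 0.25, 0.18, 0.30, 0.22])
k_population = np.array([0.12, 0.24, 0.20, 0.28, 0.23])
print(f"CCC = {concordance_ccc(k_individual, k_population):.3f}")
```

A CCC near 1 (as in the abstract's 0.81-0.99 range) indicates the population-based VIF reproduces the individual-VIF parameter estimates nearly one-to-one.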

  13. On the reconciliation of missing heritability for genome-wide association studies

    PubMed Central

    Chen, Guo-Bo

    2016-01-01

The definition of heritability has been unique and clear, but its estimation and estimates vary across studies. Linear mixed model (LMM) and Haseman–Elston (HE) regression analyses are commonly used for estimating heritability from genome-wide association data. This study provides an analytical resolution that can be used to reconcile the differences between LMM and HE in the estimation of heritability given the genetic architecture, which is responsible for these differences. The genetic architecture was classified into three forms via thought experiments: (i) a coupling genetic architecture, in which the quantitative trait loci (QTLs) in linkage disequilibrium (LD) have a positive covariance; (ii) a repulsion genetic architecture, in which the QTLs in LD have a negative covariance; and (iii) a neutral genetic architecture, in which the covariances of the QTLs in LD sum to zero. The neutral genetic architecture is so far the most commonly assumed, whereas the coupling and repulsion genetic architectures have not been well investigated. For a quantitative trait under the coupling genetic architecture, HE overestimated the heritability and LMM underestimated it; under the repulsion genetic architecture, HE underestimated and LMM overestimated the heritability. The two methods gave identical results under the neutral genetic architecture. A general analytical result for the statistic estimated under HE is given regardless of genetic architecture. In contrast, the performance of LMM remained more elusive: it further depended on the ratio between the sample size and the number of markers, but LMM converged to HE with increasing sample size. PMID:27436266
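
The HE estimator discussed above can be sketched on simulated data: regress the products of standardized phenotypes for each pair of individuals on the corresponding off-diagonal entry of the genomic relationship matrix. The simulation below assumes a neutral architecture (independent SNP effects, no LD); the sample sizes and the no-intercept least-squares fit are illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, h2_true = 600, 800, 0.5                    # individuals, SNPs, heritability

# Simulate unlinked genotypes and standardize them.
freq = rng.uniform(0.1, 0.9, m)
geno = rng.binomial(2, freq, size=(n, m)).astype(float)
Z = (geno - geno.mean(0)) / geno.std(0)
grm = Z @ Z.T / m                                # genomic relationship matrix

# Phenotype: additive genetic part plus environmental noise, then standardized.
beta = rng.normal(0, np.sqrt(h2_true / m), m)    # independent SNP effects
y = Z @ beta + rng.normal(0, np.sqrt(1 - h2_true), n)
y = (y - y.mean()) / y.std()

# HE-style regression: E[y_i * y_j] ~ h2 * G_ij for i != j, so the
# no-intercept least-squares slope estimates h2 (intercept ~ 0 here).
iu = np.triu_indices(n, k=1)
x, prod = grm[iu], np.outer(y, y)[iu]
h2_he = float(np.sum(x * prod) / np.sum(x * x))
print(f"HE heritability estimate: {h2_he:.2f} (true {h2_true})")
```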

  14. Detecting structural heat losses with mobile infrared thermography. Part IV. Estimating quantitative heat loss at Dartmouth College, Hanover, New Hampshire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munis, R.H.; Marshall, S.J.; Bush, M.A.

    1976-09-01

During the winter of 1973-74 a mobile infrared thermography system was used to survey campus buildings at Dartmouth College, Hanover, New Hampshire. Both qualitative and quantitative data are presented regarding heat flow through a small area of a wall of one brick dormitory building before and after installation of aluminum reflectors between radiators and the wall. These data were used to estimate annual cost savings for 22 buildings of similar construction having aluminum reflectors installed behind 1100 radiators. The data were then compared with the actual savings which were calculated from condensate meter data. The discrepancy between estimated and actual annual cost savings is explained in detail along with all assumptions required for these calculations.

  15. 75 FR 81665 - Notice of Intent to Seek Approval to Reinstate an Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-28

    ... are both quantitative and descriptive. Quantitative information from the most recently completed... activities with respect to industrial collaboration [cir] Conducting a survey of all center participants to probe the participant satisfaction with center activities [cir] Compiling a set of quantitative...

  16. Estimation of maximum tolerated dose for long-term bioassays from acute lethal dose and structure by QSAR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gombar, V.K.; Enslein, K.; Hart, J.B.

    1991-09-01

A quantitative structure-activity relationship (QSAR) model has been developed to estimate maximum tolerated doses (MTD) from structural features of chemicals and the corresponding oral acute lethal doses (LD50) as determined in male rats. The model is based on a set of 269 diverse chemicals which have been tested under the National Cancer Institute/National Toxicology Program (NCI/NTP) protocols. The rat oral LD50 value was the strongest predictor. Additionally, 22 structural descriptors comprising nine substructural MOLSTAC(c) keys, three molecular connectivity indices, and sigma charges on 10 molecular fragments were identified as endpoint predictors. The model explains 76% of the variance and is significant (F = 35.7) at p less than 0.0001 with a standard error of the estimate of 0.40 in the log (1/mol) units used in Hansch-type equations. Cross-validation showed that the difference between the average deleted residual square (0.179) and the model residual square (0.160) was not significant (t = 0.98).

  17. Asymmetric responsiveness of physician prescription behavior to drug promotion of competitive brands within an established therapeutic drug class.

    PubMed

    Pedan, Alex; Wu, Hongsheng

    2011-04-01

    This article examines the impact of direct-to-physician, direct-to-consumer, and other marketing activities by pharmaceutical companies on a mature drug category which is in the later stage of its life cycle and in which generics have accrued a significant market share. The main objective of this article is to quantitatively estimate the impact of pharmaceutical promotions on physician prescribing behavior for three different statin brands, after controlling for factors such as patient, physician, and physician practice characteristics, generic pressure, et cetera. Using unique panel data of physicians, combined with patient pharmacy prescription records, the authors developed a physician-level generalized linear regression model. The generalized estimating equations method was used to account for within-physician serial correlations and estimate physician population averaged effects. The findings reveal that even though on average the marketing efforts affect the brand share positively, the magnitude of the effects is very brand specific. Generally, each statin brand has its own trend and, because of this, the best choice of predictors for one brand could be suboptimal for another.

  18. Information-Driven Active Audio-Visual Source Localization

    PubMed Central

    Schult, Niclas; Reineking, Thomas; Kluss, Thorsten; Zetzsche, Christoph

    2015-01-01

    We present a system for sensorimotor audio-visual source localization on a mobile robot. We utilize a particle filter for the combination of audio-visual information and for the temporal integration of consecutive measurements. Although the system only measures the current direction of the source, the position of the source can be estimated because the robot is able to move and can therefore obtain measurements from different directions. These actions by the robot successively reduce uncertainty about the source’s position. An information gain mechanism is used for selecting the most informative actions in order to minimize the number of actions required to achieve accurate and precise position estimates in azimuth and distance. We show that this mechanism is an efficient solution to the action selection problem for source localization, and that it is able to produce precise position estimates despite simplified unisensory preprocessing. Because of the robot’s mobility, this approach is suitable for use in complex and cluttered environments. We present qualitative and quantitative results of the system’s performance and discuss possible areas of application. PMID:26327619
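
    The measurement-update and uncertainty-reduction loop described above can be sketched with a minimal bearing-only particle filter. The source position, noise level, grid discretization, and the two robot viewpoints below are all assumptions for illustration, and resampling is omitted for brevity.

```python
import math, random

random.seed(0)

# Hypothetical setup: the source sits at an unknown 2D position and is
# observed only through noisy bearing measurements taken from wherever
# the robot currently is.
SOURCE = (4.0, 3.0)   # ground truth, unknown to the filter
SIGMA = 0.2           # bearing noise in radians (assumed)
N = 2000

def bearing(robot, target):
    return math.atan2(target[1] - robot[1], target[0] - robot[0])

def measure(robot):
    return bearing(robot, SOURCE) + random.gauss(0.0, SIGMA)

# Particle filter: hypotheses about the source position, with weights.
particles = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(N)]
weights = [1.0 / N] * N

def update(robot, z):
    global weights
    for i, p in enumerate(particles):
        d = z - bearing(robot, p)
        err = math.atan2(math.sin(d), math.cos(d))  # wrap angle to [-pi, pi]
        weights[i] *= math.exp(-0.5 * (err / SIGMA) ** 2)
    s = sum(weights)
    weights = [w / s for w in weights]

def entropy():
    # Discretize the weighted particle cloud onto a unit grid and
    # compute the Shannon entropy (bits) of the posterior.
    hist = {}
    for p, w in zip(particles, weights):
        cell = (int(p[0]), int(p[1]))
        hist[cell] = hist.get(cell, 0.0) + w
    return -sum(w * math.log2(w) for w in hist.values() if w > 0)

# Each measurement from a new, well-separated viewpoint cuts uncertainty:
h0 = entropy()
update((0.0, 0.0), measure((0.0, 0.0)))
h1 = entropy()
update((8.0, 0.0), measure((8.0, 0.0)))
h2 = entropy()
print(round(h0, 2), round(h1, 2), round(h2, 2))
```

    The information-gain action selection in the paper amounts to choosing the next robot pose whose expected measurement minimizes this posterior entropy, rather than the fixed viewpoints used here.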

  19. Uniform gradient estimates on manifolds with a boundary and applications

    NASA Astrophysics Data System (ADS)

    Cheng, Li-Juan; Thalmaier, Anton; Thompson, James

    2018-04-01

    We revisit the problem of obtaining uniform gradient estimates for Dirichlet and Neumann heat semigroups on Riemannian manifolds with boundary. As applications, we obtain isoperimetric inequalities, using Ledoux's argument, and uniform quantitative gradient estimates, firstly for C^2_b functions with boundary conditions and then for the unit spectral projection operators of Dirichlet and Neumann Laplacians.

  20. Comparison of estimated and measured sediment yield in the Gualala River

    Treesearch

    Matthew O’Connor; Jack Lewis; Robert Pennington

    2012-01-01

    This study compares quantitative erosion rate estimates developed at different spatial and temporal scales. It is motivated by the need to assess potential water quality impacts of a proposed vineyard development project in the Gualala River watershed. Previous erosion rate estimates were developed using sediment source assessment techniques by the North Coast Regional...

  1. Neuroergonomics: Quantitative Modeling of Individual, Shared, and Team Neurodynamic Information.

    PubMed

    Stevens, Ronald H; Galloway, Trysha L; Willemsen-Dunlap, Ann

    2018-06-01

    The aim of this study was to use the same quantitative measure and scale to directly compare the neurodynamic information/organizations of individual team members with those of the team. Team processes are difficult to separate from those of individual team members due to the lack of quantitative measures that can be applied to both process sets. Second-by-second symbolic representations were created of each team member's electroencephalographic power, and quantitative estimates of their neurodynamic organizations were calculated from the Shannon entropy of the symbolic data streams. The information in the neurodynamic data streams of health care (n = 24), submarine navigation (n = 12), and high school problem-solving (n = 13) dyads was separated into the information of each team member, the information shared by team members, and the overall team information. Most of the team information was the sum of each individual's neurodynamic information. The remaining team information was shared among the team members. This shared information averaged ~15% of the individual information, with momentary levels of 1% to 80%. Continuous quantitative estimates can be made from the shared, individual, and team neurodynamic information about the contributions of different team members to the overall neurodynamic organization of a team and the neurodynamic interdependencies among the team members. Information models provide a generalizable quantitative method for separating a team's neurodynamic organization into that of individual team members and that shared among team members.
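
    The individual/shared decomposition described above can be illustrated with Shannon entropy on two symbol streams: shared information is the mutual information H(A) + H(B) - H(A,B). The streams below are synthetic stand-ins for the EEG-derived symbols, with member B partially echoing member A.

```python
import math, random
random.seed(2)

# Synthetic stand-ins for two team members' second-by-second symbolic
# EEG streams; B copies A's symbol 30% of the time, otherwise acts alone.
T = 5000
a = [random.randint(0, 3) for _ in range(T)]
b = [x if random.random() < 0.3 else random.randint(0, 3) for x in a]

def entropy(xs):
    """Shannon entropy (bits) of the empirical symbol distribution."""
    n = len(xs)
    counts = {}
    for x in xs:
        counts[x] = counts.get(x, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

H_a, H_b = entropy(a), entropy(b)
H_ab = entropy(list(zip(a, b)))   # joint ("team") information
shared = H_a + H_b - H_ab         # mutual information I(A;B)
print(round(shared / H_a, 2))     # shared info as a fraction of A's info
```

    The team information is the joint entropy, most of which is the sum of the individual entropies; the remainder is the shared term, analogous to the ~15% average reported above.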

  2. Developing High-Frequency Quantitative Ultrasound Techniques to Characterize Three-Dimensional Engineered Tissues

    NASA Astrophysics Data System (ADS)

    Mercado, Karla Patricia E.

    Tissue engineering holds great promise for the repair or replacement of native tissues and organs. Further advancements in the fabrication of functional engineered tissues are partly dependent on developing new and improved technologies to monitor the properties of engineered tissues volumetrically, quantitatively, noninvasively, and nondestructively over time. Currently, engineered tissues are evaluated during fabrication using histology, biochemical assays, and direct mechanical tests. However, these techniques destroy tissue samples and, therefore, lack the capability for real-time, longitudinal monitoring. The research reported in this thesis developed nondestructive, noninvasive approaches to characterize the structural, biological, and mechanical properties of 3-D engineered tissues using high-frequency quantitative ultrasound and elastography technologies. A quantitative ultrasound technique, using a system-independent parameter known as the integrated backscatter coefficient (IBC), was employed to visualize and quantify structural properties of engineered tissues. Specifically, the IBC was demonstrated to estimate cell concentration and quantitatively detect differences in the microstructure of 3-D collagen hydrogels. Additionally, the feasibility of an ultrasound elastography technique called Single Tracking Location Acoustic Radiation Force Impulse (STL-ARFI) imaging was demonstrated for estimating the shear moduli of 3-D engineered tissues. High-frequency ultrasound techniques can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, these high-frequency quantitative ultrasound techniques can enable noninvasive, volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation.

  3. Assessing soil erosion using USLE model and MODIS data in the Guangdong, China

    NASA Astrophysics Data System (ADS)

    Gao, Feng; Wang, Yunpeng; Yang, Jingxue

    2017-07-01

    In this study, soil erosion in Guangdong, China during 2012 was quantitatively assessed using the Universal Soil Loss Equation (USLE). The parameters of the model were calculated using GIS and MODIS data, and the spatial distribution of the average annual soil loss was mapped on a grid basis. The estimated average annual soil erosion in Guangdong in 2012 is about 2294.47 t/(km2.a). Four highly sensitive areas of soil erosion were identified; the key factors in these areas were land cover type, rainfall, economic development, and human activity.
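
    On a raster, the USLE is a per-cell product A = R * K * LS * C * P. The tiny grid below uses illustrative placeholder factor values, not Guangdong's MODIS-derived inputs.

```python
# Per-cell USLE computation A = R*K*LS*C*P on a 2x2 grid. Factor values
# are illustrative placeholders, not the study's actual inputs.
R  = [[5000.0, 5200.0], [4800.0, 5100.0]]   # rainfall erosivity
K  = [[0.03, 0.025], [0.035, 0.03]]         # soil erodibility
LS = [[1.2, 2.5], [0.8, 3.0]]               # slope length-steepness
C  = [[0.1, 0.3], [0.05, 0.4]]              # cover management
P  = [[1.0, 0.8], [1.0, 0.7]]               # support practice

A = [[R[i][j] * K[i][j] * LS[i][j] * C[i][j] * P[i][j] for j in range(2)]
     for i in range(2)]
mean_loss = sum(A[i][j] for i in range(2) for j in range(2)) / 4
print(A[0][0], round(mean_loss, 1))  # per-cell and area-averaged annual loss
```

    Averaging the per-cell losses over the whole grid gives the regional figure analogous to the 2294.47 t/(km2.a) estimate above.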

  4. Etalon (standard) for surface potential distribution produced by electric activity of the heart.

    PubMed

    Szathmáry, V; Ruttkay-Nedecký, I

    1981-01-01

    The authors submit etalon (standard) equipotential maps as an aid in the evaluation of maps of surface potential distributions in living subjects. They were obtained by measuring potentials on the surface of an electrolytic tank shaped like the thorax. The individual etalon maps were determined in such a way that the parameters of the physical dipole forming the source of the electric field in the tank corresponded to the mean vectorcardiographic parameters measured in a healthy population sample. The technique also allows a quantitative estimate of the degree of non-dipolarity of the heart as the source of the electric field.

  5. PARAMETER ESTIMATION OF TWO-FLUID CAPILLARY PRESSURE-SATURATION AND PERMEABILITY FUNCTIONS

    EPA Science Inventory

    Capillary pressure and permeability functions are crucial to the quantitative description of subsurface flow and transport. Earlier work has demonstrated the feasibility of using the inverse parameter estimation approach in determining these functions if both capillary pressure ...

  6. Toward an Assessment of the Global Inventory of Present-Day Mercury Releases to Freshwater Environments.

    PubMed

    Kocman, David; Wilson, Simon J; Amos, Helen M; Telmer, Kevin H; Steenhuisen, Frits; Sunderland, Elsie M; Mason, Robert P; Outridge, Peter; Horvat, Milena

    2017-02-01

    Aquatic ecosystems are an essential component of the biogeochemical cycle of mercury (Hg), as inorganic Hg can be converted to toxic methylmercury (MeHg) in these environments and reemissions of elemental Hg rival anthropogenic Hg releases on a global scale. Quantification of effluent Hg releases to aquatic systems globally has focused on discharges to the global oceans, rather than contributions to freshwater systems that affect local exposures and risks associated with MeHg. Here we produce a first estimate of sector-specific, spatially resolved global aquatic Hg discharges to freshwater systems. We compare our release estimates to atmospheric sources that have been quantified elsewhere. By analyzing available quantitative and qualitative information, we estimate that present-day global Hg releases to freshwater environments (rivers and lakes) associated with anthropogenic activities have a lower bound of ~1000 Mg·a-1. Artisanal and small-scale gold mining (ASGM) represents the single largest source, followed by disposal of mercury-containing products and domestic waste water, metal production, and releases from industrial installations such as chlor-alkali plants and oil refineries. In addition to these direct anthropogenic inputs, diffuse inputs from land management activities and remobilization of Hg previously accumulated in terrestrial ecosystems are likely comparable in magnitude. Aquatic discharges of Hg are greatly understudied and further constraining associated data gaps is crucial for reducing the uncertainties in the global biogeochemical Hg budget.

  8. Controlling the non-linear intracavity dynamics of large He-Ne laser gyroscopes

    NASA Astrophysics Data System (ADS)

    Cuccato, D.; Beghi, A.; Belfi, J.; Beverini, N.; Ortolan, A.; Di Virgilio, A.

    2014-02-01

    A model based on Lamb's theory of gas lasers is applied to a He-Ne ring laser (RL) gyroscope to estimate and remove the laser dynamics contribution from the rotation measurements. The intensities of the counter-propagating laser beams exiting one cavity mirror are continuously observed together with a monitor of the laser population inversion. These observables, once properly calibrated with a dedicated procedure, allow us to estimate cold cavity and active medium parameters driving the main part of the non-linearities of the system. The quantitative estimation of intrinsic non-reciprocal effects due to cavity and active medium non-linear coupling plays a key role in testing fundamental symmetries of space-time with RLs. The parameter identification and noise subtraction procedure has been verified by means of a Monte Carlo study of the system, and experimentally tested on the G-PISA RL oriented with the normal to the ring plane almost parallel to the Earth's rotation axis. In this configuration the Earth's rotation rate provides the maximum Sagnac effect while the contribution of the orientation error is reduced to a minimum. After the subtraction of laser dynamics by a Kalman filter, the relative systematic errors of G-PISA reduce from 50 to 5 parts in 10³ and can be attributed to the residual uncertainties on geometrical scale factor and orientation of the ring.
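
    The Kalman-filter subtraction step can be sketched in its simplest scalar form: track a constant rotation-rate signal observed through additive noise. The signal value, noise variances, and sample count below are illustrative, not G-PISA's actual laser-dynamics model.

```python
import random
random.seed(4)

# Minimal 1D Kalman filter sketch: estimate a constant rotation rate
# from noisy readouts. All numerical values are assumed for illustration.
TRUE_RATE = 7.3          # arbitrary units
Q, R = 1e-6, 0.25        # process and measurement noise variances (assumed)

x, p = 0.0, 1.0          # state estimate and its variance
for _ in range(500):
    z = TRUE_RATE + random.gauss(0.0, R ** 0.5)   # noisy readout
    p += Q                                        # predict: variance grows
    k = p / (p + R)                               # Kalman gain
    x += k * (z - x)                              # update estimate
    p *= (1.0 - k)                                # update variance
print(round(x, 2))
```

    In the actual system the state vector also carries the estimated laser-dynamics disturbance, which is then subtracted from the Sagnac signal.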

  9. 17 CFR Appendix A to Part 75 - Reporting and Recordkeeping Requirements for Covered Trading Activities

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... reports to the Commission regarding a variety of quantitative measurements of their covered trading... the risks associated with the banking entity's covered trading activities. c. The quantitative... of the data collected prior to September 30, 2015. e. In addition to the quantitative measurements...

  10. Quantitative risk assessment for a glass fiber insulation product.

    PubMed

    Fayerweather, W E; Bender, J R; Hadley, J G; Eastes, W

    1997-04-01

    California Proposition 65 (Prop65) provides a mechanism by which the manufacturer may perform a quantitative risk assessment to be used in determining the need for cancer warning labels. This paper presents a risk assessment under this regulation for professional and do-it-yourself insulation installers. It determines the level of insulation glass fiber exposure (specifically Owens Corning's R-25 PinkPlus with Miraflex) that, assuming a working lifetime exposure, poses no significant cancer risk under Prop65's regulations. "No significant risk" is defined under Prop65 as a lifetime risk of no more than one additional cancer case per 100,000 exposed persons, and nonsignificant exposure is defined as a working lifetime exposure associated with "no significant risk." This determination can be carried out despite the fact that the relevant underlying studies (i.e., chronic inhalation bioassays) of comparable glass wool fibers do not show tumorigenic activity. Nonsignificant exposures are estimated from (1) the most recent RCC chronic inhalation bioassay of nondurable fiberglass in rats; (2) intraperitoneal fiberglass injection studies in rats; (3) a distributional, decision analysis approach applied to four chronic inhalation rat bioassays of conventional fiberglass; (4) an extrapolation from the RCC chronic rat inhalation bioassay of durable refractory ceramic fibers; and (5) an extrapolation from the IOM chronic rat inhalation bioassay of durable E glass microfibers. When the EPA linear nonthreshold model is used, central estimates of nonsignificant exposure range from 0.36 fibers/cc (for the RCC chronic inhalation bioassay of fiberglass) through 21 fibers/cc (for the i.p. fiberglass injection studies). Lower 95% confidence bounds on these estimates vary from 0.17 fibers/cc through 13 fibers/cc. 
Estimates derived from the distributional approach or from applying the EPA linear nonthreshold model to chronic bioassays of durable fibers such as refractory ceramic fiber or E glass microfibers are intermediate to the other approaches. Estimates based on the Weibull 1.5-hit nonthreshold and 2-hit threshold models exceed by at least a factor of 10 the corresponding EPA linear nonthreshold estimates. The lowest nonsignificant exposures derived in this assessment are at least a factor of two higher than field exposures measured for professionals installing the R-25 fiberglass insulation product and are orders of magnitude higher than the estimated lifetime exposures for do-it-yourselfers.
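
    The arithmetic behind these exposure limits is direct under the EPA linear nonthreshold model: lifetime risk = unit risk x exposure, so the nonsignificant exposure is the dose yielding one extra cancer per 100,000. The unit-risk value below is hypothetical, back-calculated so the result reproduces the 0.36 fibers/cc central estimate quoted above.

```python
# Prop65 "no significant risk" arithmetic under the EPA linear
# nonthreshold model: lifetime_risk = unit_risk * exposure.
# unit_risk is a hypothetical figure chosen for illustration.
RISK_LIMIT = 1e-5      # one additional cancer case per 100,000 persons
unit_risk = 2.8e-5     # assumed lifetime risk per (fiber/cc) of exposure

nonsignificant_exposure = RISK_LIMIT / unit_risk
print(round(nonsignificant_exposure, 2))  # -> 0.36 fibers/cc
```

    The spread of estimates in the abstract (0.36 to 21 fibers/cc) corresponds to the different unit-risk values implied by the various bioassays and models.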

  11. Designing a Quantitative Structure-Activity Relationship for the ...

    EPA Pesticide Factsheets

    Toxicokinetic models serve a vital role in risk assessment by bridging the gap between chemical exposure and potentially toxic endpoints. While intrinsic metabolic clearance rates have a strong impact on toxicokinetics, limited data is available for environmentally relevant chemicals including nearly 8000 chemicals tested for in vitro bioactivity in the Tox21 program. To address this gap, a quantitative structure-activity relationship (QSAR) for intrinsic metabolic clearance rate was developed to offer reliable in silico predictions for a diverse array of chemicals. Models were constructed with curated in vitro assay data for both pharmaceutical-like chemicals (ChEMBL database) and environmentally relevant chemicals (ToxCast screening) from human liver microsomes (2176 from ChEMBL) and human hepatocytes (757 from ChEMBL and 332 from ToxCast). Due to variability in the experimental data, a binned approach was utilized to classify metabolic rates. Machine learning algorithms, such as random forest and k-nearest neighbor, were coupled with open source molecular descriptors and fingerprints to provide reasonable estimates of intrinsic metabolic clearance rates. Applicability domains defined the optimal chemical space for predictions, which covered environmental chemicals well. A reduced set of informative descriptors (including relative charge and lipophilicity) and a mixed training set of pharmaceuticals and environmentally relevant chemicals provided the best intr

  12. New perspectives on quantitative characterization of biomass burning (Invited)

    NASA Astrophysics Data System (ADS)

    Ichoku, C. M.

    2010-12-01

    Biomass burning (BB) occurs seasonally in different vegetated landscapes across the world, consuming large amounts of biomass, generating intense heat energy, and emitting corresponding amounts of smoke plumes that comprise aerosols and trace gases, which include carbon monoxide (CO), carbon dioxide (CO2), methane (CH4), non-methane hydrocarbons, and numerous other trace compounds, many of which have adverse effects on human health, air quality, and environmental processes. Accurate estimates of these emissions are required as model inputs to evaluate and forecast smoke plume transport and impacts on air quality, human health, clouds, weather, radiation, and climate. The goal of this presentation is to highlight results of research activities that are aimed at advancing the quantitative characterization of various aspects of biomass burning (energetics, intensity, burn areas, burn severity, emissions, and fire weather) from aircraft and satellite measurements that can help advance our understanding of biomass burning and its overall effects. We will show recent results of analysis of fire radiative power (FRP), burned areas, fuel consumption, smoke emission rates, and plume heights from satellite measurements, as well as related aircraft calibration/validation activities. We will also briefly examine potential future plans and strategies for effective monitoring of biomass burning characteristics and emissions from aircraft and satellite.

  13. Elemental analysis of scorpion venoms.

    PubMed

    Al-Asmari, AbdulRahman K; Kunnathodi, Faisal; Al Saadon, Khalid; Idris, Mohammed M

    2016-01-01

    Scorpion venom is a rich source of biomolecules, which can perturb the physiological activity of the host on envenomation and may also have therapeutic potential. Scorpion venoms, produced by the columnar cells of the venom gland, are complex mixtures of mucopolysaccharides, neurotoxic peptides and other components. This study was aimed at cataloguing the elemental composition of venoms obtained from medically important scorpions found in the Arabian peninsula. The global elemental composition of the crude venom obtained from Androctonus bicolor, Androctonus crassicauda and Leiurus quinquestriatus scorpions was estimated using an ICP-MS analyzer. The study catalogued several chemical elements present in the scorpion venom using ICP-MS total quant analysis, with nine elements quantified exclusively using appropriate standards. Fifteen chemical elements including sodium, potassium and calcium were found abundantly in the scorpion venom at ppm concentrations. Thirty-six chemical elements of different mass ranges were detected in the venom at ppb level. Quantitative analysis of the venoms revealed copper to be the most abundant element in Androctonus sp. venom but present at a lower level in Leiurus quinquestriatus venom, whereas zinc and manganese were found at higher levels in Leiurus sp. venom and at lower levels in Androctonus sp. venom. These data and the concentrations of other elements present in the various venoms are likely to increase our understanding of the mechanisms of venom activity and their pharmacological potentials.

  14. A Quantitative Structure Activity Relationship for acute oral toxicity of pesticides on rats: Validation, domain of application and prediction.

    PubMed

    Hamadache, Mabrouk; Benkortbi, Othmane; Hanini, Salah; Amrane, Abdeltif; Khaouane, Latifa; Si Moussa, Cherif

    2016-02-13

    Quantitative Structure Activity Relationship (QSAR) models are expected to play an important role in the risk assessment of chemicals on humans and the environment. In this study, we developed a validated QSAR model to predict the acute oral toxicity of 329 pesticides to rats, because few QSAR models have been devoted to predicting the Lethal Dose 50 (LD50) of pesticides in rats. This QSAR model is based on 17 molecular descriptors, and is robust, externally predictive and characterized by a good applicability domain. The best results were obtained with a 17/9/1 Artificial Neural Network model trained with the quasi-Newton back propagation (BFGS) algorithm. The prediction accuracy for the external validation set was estimated by the Q(2)ext and the root mean square error (RMS), which are equal to 0.948 and 0.201, respectively. 98.6% of the external validation set is correctly predicted, and the present model proved to be superior to models previously published. Accordingly, the model developed in this study provides excellent predictions and can be used to predict the acute oral toxicity of pesticides, particularly for those that have not been tested as well as new pesticides. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Predicting human skin absorption of chemicals: development of a novel quantitative structure activity relationship.

    PubMed

    Luo, Wen; Medrek, Sarah; Misra, Jatin; Nohynek, Gerhard J

    2007-02-01

    The objective of this study was to construct and validate a quantitative structure-activity relationship model for skin absorption. Such models are valuable tools for screening and prioritization in safety and efficacy evaluation, and risk assessment of drugs and chemicals. A database of 340 chemicals with percutaneous absorption data was assembled. Two models were derived from the training set consisting of 306 chemicals (90/10 random split). In addition to the experimental K(ow) values, over 300 2D and 3D atomic and molecular descriptors were analyzed using MDL's QsarIS computer program. Subsequently, the models were validated using both internal (leave-one-out) and external validation (test set) procedures. Using stepwise regression analysis, three molecular descriptors were determined to have significant statistical correlation with K(p) (R2 = 0.8225): logK(ow), X0 (quantification of both molecular size and the degree of skeletal branching), and SsssCH (count of aromatic carbon groups). In conclusion, two models to estimate skin absorption were developed. When compared to other skin absorption QSAR models in the literature, our model incorporated more chemicals and explored a larger number of descriptors. Additionally, our models are reasonably predictive and have met both internal and external statistical validations.
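
    The abstract does not list the fitted coefficients of its three-descriptor model, so as a named stand-in the sketch below evaluates one commonly quoted form of the classic Potts-Guy skin-permeability QSAR, which uses only logKow and molecular weight; the example inputs are illustrative.

```python
# One commonly quoted form of the Potts-Guy skin-permeability QSAR
# (a stand-in for the paper's own three-descriptor model):
#   log10 Kp [cm/h] = -2.7 + 0.71*logKow - 0.0061*MW
def log_kp_potts_guy(log_kow, mw):
    return -2.7 + 0.71 * log_kow - 0.0061 * mw

# Example: a small, mildly hydrophilic molecule (values illustrative).
lkp = log_kp_potts_guy(-0.07, 194.2)
print(round(lkp, 2))
```

    The paper's model replaces the molecular-weight term with the X0 and SsssCH descriptors, but the functional form, a linear combination of a lipophilicity term and size/structure terms, is the same.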

  16. Characterizing health risks associated with recreational swimming at Taiwanese beaches by using quantitative microbial risk assessment.

    PubMed

    Jang, Cheng-Shin; Liang, Ching-Ping

    2018-01-01

    Taiwan is surrounded by oceans, and therefore numerous pleasure beaches attract millions of tourists annually to participate in recreational swimming activities. However, impaired water quality because of fecal pollution poses a potential threat to the tourists' health. This study probabilistically characterized the health risks associated with recreational swimming engendered by waterborne enterococci at 13 Taiwanese beaches by using quantitative microbial risk assessment. First, data on enterococci concentrations at coastal beaches monitored by the Taiwan Environmental Protection Administration were reproduced using nonparametric Monte Carlo simulation (MCS). The ingestion volumes for recreational swimming, based on uniform and gamma distributions, were subsequently determined using MCS. Finally, after combining the distributions of the two parameters, the beta-Poisson dose-response function was employed to quantitatively estimate health risks to recreational swimmers. Moreover, various levels of risk to recreational swimmers were classified and spatially mapped to explore feasible recreational and environmental management strategies at the beaches. The study results revealed that although the health risks associated with recreational swimming did not exceed an acceptable benchmark of 0.019 illnesses daily at any beach, they approached this benchmark at certain beaches. Beaches with relatively high risks are located in Northwestern Taiwan owing to current movements.
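
    The Monte Carlo pipeline described above can be sketched in a few lines: sample a concentration and an ingestion volume, form the dose, and apply the beta-Poisson dose-response P(ill) = 1 - (1 + dose/beta)^(-alpha). All distributions and parameter values below are assumptions for illustration, not the study's fitted Taiwanese-beach values.

```python
import random
random.seed(3)

# Monte Carlo QMRA sketch with a beta-Poisson dose-response model.
# ALPHA/BETA and both sampling distributions are assumed, not fitted.
ALPHA, BETA = 0.145, 7.59
N = 20000
risks = []
for _ in range(N):
    conc = random.lognormvariate(2.0, 1.0)   # CFU per 100 mL (synthetic)
    vol = random.uniform(10.0, 50.0)         # mL of water ingested per swim
    dose = conc * vol / 100.0                # CFU ingested
    risks.append(1.0 - (1.0 + dose / BETA) ** (-ALPHA))

mean_risk = sum(risks) / N
print(round(mean_risk, 3))  # per-event illness probability, given assumptions
```

    Comparing the resulting risk distribution against a benchmark such as the 0.019 illnesses/day figure is then a simple quantile check.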

  17. The Interrelationship between Promoter Strength, Gene Expression, and Growth Rate

    PubMed Central

    Klesmith, Justin R.; Detwiler, Emily E.; Tomek, Kyle J.; Whitehead, Timothy A.

    2014-01-01

    In exponentially growing bacteria, expression of heterologous protein impedes cellular growth rates. Quantitative understanding of the relationship between expression and growth rate will advance our ability to forward engineer bacteria, important for metabolic engineering and synthetic biology applications. Recently, a work described a scaling model based on optimal allocation of ribosomes for protein translation. This model quantitatively predicts a linear relationship between microbial growth rate and heterologous protein expression with no free parameters. With the aim of validating this model, we have rigorously quantified the fitness cost of gene expression by using a library of synthetic constitutive promoters to drive expression of two separate proteins (eGFP and amiE) in E. coli in different strains and growth media. In all cases, we demonstrate that the fitness cost is consistent with the previous findings. We expand upon the previous theory by introducing a simple promoter activity model to quantitatively predict how basal promoter strength relates to growth rate and protein expression. We then estimate the amount of protein expression needed to support high flux through a heterologous metabolic pathway and predict the sizable fitness cost associated with enzyme production. This work has broad implications across applied biological sciences because it allows for prediction of the interplay between promoter strength, protein expression, and the resulting cost to microbial growth rates. PMID:25286161
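
    The linear ribosome-allocation relation described above can be written as a one-line model: growth rate falls linearly with the heterologous proteome fraction. MU0 and PHI_MAX below are illustrative values, not the paper's fitted parameters.

```python
# Linear fitness-cost model sketched from the abstract: growth rate
# declines linearly with heterologous proteome fraction phi.
MU0 = 1.0        # growth rate at zero heterologous expression (1/h), assumed
PHI_MAX = 0.48   # assumed proteome fraction at which growth extrapolates to zero

def growth_rate(phi):
    """Predicted growth rate at heterologous proteome fraction phi."""
    return MU0 * (1.0 - phi / PHI_MAX)

for phi in (0.0, 0.1, 0.2):
    print(phi, round(growth_rate(phi), 3))
```

    Estimating the expression needed for a metabolic pathway then amounts to reading the fitness cost MU0 * phi / PHI_MAX off this line.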

  18. Validation of Passive Sampling Devices for Monitoring of Munitions Constituents in Underwater Environments

    DTIC Science & Technology

    2017-06-30

    ... Research and Development Program [SERDP] project #ER-2542) into the canister would provide enhancement of the quantitative estimation of the TWA... Advantages and limitations compared to other sampling techniques...

  19. AN INVESTIGATION OF THE CHEMICAL STABILITY OF ARSENOSUGARS IN SIMULATED GASTRIC JUICE AND ACIDIC ENVIRONMENTS USING IC-ICP-MS AND IC-ESI-MS/MS

    EPA Science Inventory

    A more quantitative extraction of arsenic-containing compounds from seafood matrices is essential in developing better dietary exposure estimates. More quantitative extraction often implies a more chemically aggressive set of extraction conditions. However, these conditions may...

  20. A Quantitative Needs Assessment Technique for Cross-Cultural Work Adjustment Training.

    ERIC Educational Resources Information Center

    Selmer, Lyn

    2000-01-01

    A study of 67 Swedish expatriate bosses and 104 local Hong Kong middle managers tested a quantitative needs assessment technique measuring work values. Two-thirds of middle managers' work values were not correctly estimated by their bosses, especially instrumental values (pay, benefits, security, working hours and conditions), indicating a need…

  1. Quantitative Genetics in the Era of Molecular Genetics: Learning Abilities and Disabilities as an Example

    ERIC Educational Resources Information Center

    Haworth, Claire M. A.; Plomin, Robert

    2010-01-01

    Objective: To consider recent findings from quantitative genetic research in the context of molecular genetic research, especially genome-wide association studies. We focus on findings that go beyond merely estimating heritability. We use learning abilities and disabilities as examples. Method: Recent twin research in the area of learning…

  2. Mapping of quantitative trait loci controlling adaptive traits in coastal Douglas-fir. III

    Treesearch

    Kathleen D. Jermstad; Daniel L. Bassoni; Keith S. Jech; Gary A. Ritchie; Nicholas C. Wheeler; David B. Neale

    2003-01-01

    Quantitative trait loci (QTL) were mapped in the woody perennial Douglas fir (Pseudotsuga menziesii var. menziesii [Mirb.] Franco) for complex traits controlling the timing of growth initiation and growth cessation. QTL were estimated under controlled environmental conditions to identify QTL interactions with photoperiod, moisture stress, winter chilling, and spring...

  3. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... Evaluation and Research (CBER) and suggestions for further development. The public workshop will include... Evaluation and Research (HFM-210), Food and Drug Administration, 1401 Rockville Pike, suite 200N, Rockville... models to generate quantitative estimates of the benefits and risks of influenza vaccination. The public...

  4. Quantitative test for concave aspheric surfaces using a Babinet compensator.

    PubMed

    Saxena, A K

    1979-08-15

A quantitative test for the evaluation of surface figures of concave aspheric surfaces using a Babinet compensator is reported. A theoretical estimate of the sensitivity is 0.002λ for a minimum detectable phase change of 2π × 10⁻³ rad over a segment length of 1.0 cm.

  5. Characterization of Dynamics in Complex Lyophilized Formulations: I. Comparison of Relaxation Times Measured by Isothermal Calorimetry with Data Estimated from the Width of the Glass Transition Temperature Region

    PubMed Central

    Chieng, Norman; Mizuno, Masayasu; Pikal, Michael

    2013-01-01

The purposes of this study are to characterize the relaxation dynamics in complex freeze-dried formulations and to investigate the quantitative relationship between the structural relaxation time as measured by thermal activity monitor (TAM) and that estimated from the width of the glass transition temperature region (ΔTg). The latter method has advantages over TAM because it is simple and quick. As part of this objective, we evaluate the accuracy of estimating relaxation time data at higher temperatures (50°C and 60°C) from TAM data at lower temperature (40°C) and glass transition region width (ΔTg) data obtained by differential scanning calorimetry. Formulations studied here were hydroxyethyl starch (HES)-disaccharide, HES-polyol and HES-disaccharide-polyol at various ratios. We also re-examine, using TAM-derived relaxation times, the correlation between protein stability (human growth hormone, hGH) and relaxation times explored in a previous report, which employed relaxation time data obtained from ΔTg. Results show that most of the freeze-dried formulations exist in a single amorphous phase, and structural relaxation times were successfully measured for these systems. We find a reasonably good correlation between TAM-measured relaxation times and the corresponding estimates based on ΔTg, but the agreement is only qualitative. The comparison plot showed that the TAM data are directly proportional to the 1/3 power of the ΔTg data, after correcting for an offset. Nevertheless, the correlation between hGH stability and relaxation time remained qualitatively the same as found using ΔTg-derived relaxation data, and it was found that the modest extrapolation of TAM data to higher temperatures using the ΔTg method and TAM data at 40°C resulted in quantitative agreement with TAM measurements made at 50°C and 60°C, provided the TAM experiment temperature is well below the Tg of the sample. PMID:23608636

  6. Correlations between quantitative fat–water magnetic resonance imaging and computed tomography in human subcutaneous white adipose tissue

    PubMed Central

    Gifford, Aliya; Walker, Ronald C.; Towse, Theodore F.; Brian Welch, E.

    2015-01-01

    Abstract. Beyond estimation of depot volumes, quantitative analysis of adipose tissue properties could improve understanding of how adipose tissue correlates with metabolic risk factors. We investigated whether the fat signal fraction (FSF) derived from quantitative fat–water magnetic resonance imaging (MRI) scans at 3.0 T correlates to CT Hounsfield units (HU) of the same tissue. These measures were acquired in the subcutaneous white adipose tissue (WAT) at the umbilical level of 21 healthy adult subjects. A moderate correlation exists between MRI- and CT-derived WAT values for all subjects, R2=0.54, p<0.0001, with a slope of −2.6, (95% CI [−3.3,−1.8]), indicating that a decrease of 1 HU equals a mean increase of 0.38% FSF. We demonstrate that FSF estimates obtained using quantitative fat–water MRI techniques correlate with CT HU values in subcutaneous WAT, and therefore, MRI-based FSF could be used as an alternative to CT HU for assessing metabolic risk factors. PMID:26702407
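
The reported regression fixes only a slope (−2.6 HU per percentage point of FSF); no intercept is given, so only changes in one quantity can be converted into changes in the other. A minimal sketch of that incremental conversion (the function name is ours, not the authors'):

```python
# Incremental conversion implied by the reported MRI-CT regression:
# a slope of -2.6 HU per percentage point of fat signal fraction (FSF).
SLOPE_HU_PER_FSF_PCT = -2.6

def fsf_change_from_hu(delta_hu: float) -> float:
    """Estimated change in FSF (percentage points) for a given change
    in CT Hounsfield units, using the abstract's fitted slope."""
    return delta_hu / SLOPE_HU_PER_FSF_PCT

# A decrease of 1 HU corresponds to about +0.38 percentage points FSF,
# matching the figure quoted in the abstract.
print(round(fsf_change_from_hu(-1.0), 2))  # → 0.38
```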

  7. New semi-quantitative 123I-MIBG estimation method compared with scoring system in follow-up of advanced neuroblastoma: utility of total MIBG retention ratio versus scoring method.

    PubMed

    Sano, Yuko; Okuyama, Chio; Iehara, Tomoko; Matsushima, Shigenori; Yamada, Kei; Hosoi, Hajime; Nishimura, Tsunehiko

    2012-07-01

The purpose of this study is to evaluate a new semi-quantitative estimation method using the (123)I-MIBG retention ratio to assess response to chemotherapy for advanced neuroblastoma. Thirteen children with advanced neuroblastoma (International Neuroblastoma Risk Group Staging System: stage M) were examined in a total of 51 studies with (123)I-MIBG scintigraphy (before and during chemotherapy). We proposed a new semi-quantitative method using the MIBG retention ratio (count obtained with the delayed image/count obtained with the early image, with decay correction) to estimate MIBG accumulation. We analyzed the total (123)I-MIBG retention ratio (TMRR: total body count obtained with the delayed image/total body count obtained with the early image, with decay correction) and compared it with a scoring method in terms of correlation with tumor markers. TMRR showed significantly higher correlations with urinary catecholamine metabolites before chemotherapy (VMA: r(2) = 0.45, P < 0.05; HVA: r(2) = 0.627, P < 0.01) than the MIBG score (VMA: r(2) = 0.19, P = 0.082; HVA: r(2) = 0.25, P = 0.137). There were relatively good correlations between serial changes of TMRR and those of urinary catecholamine metabolites (VMA: r(2) = 0.274, P < 0.001; HVA: r(2) = 0.448, P < 0.0001) compared with serial changes of the MIBG score and those of tumor markers (VMA: r(2) = 0.01, P = 0.537; HVA: r(2) = 0.084, P = 0.697) during chemotherapy for advanced neuroblastoma. TMRR could be a useful semi-quantitative method for estimating early response to chemotherapy of advanced neuroblastoma because of its high correlation with urinary catecholamine metabolites.
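
The decay correction behind a ratio like TMRR can be made concrete: counts on the delayed image are scaled back to the early time point using the physical half-life of 123I (about 13.2 h) before dividing. A minimal sketch; the acquisition interval and count values below are hypothetical, not from the study:

```python
I123_HALF_LIFE_H = 13.22  # approximate physical half-life of 123I, in hours

def total_mibg_retention_ratio(early_counts: float, delayed_counts: float,
                               hours_between: float) -> float:
    """Decay-corrected retention ratio: total body counts on the delayed
    image divided by counts on the early image, after scaling the delayed
    counts back to the early time point to undo physical decay."""
    decay_correction = 2.0 ** (hours_between / I123_HALF_LIFE_H)
    return (delayed_counts * decay_correction) / early_counts

# Hypothetical whole-body counts acquired 18 h apart
print(round(total_mibg_retention_ratio(1.0e6, 2.5e5, 18.0), 3))  # → 0.642
```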

  8. Assessing the influence of land use land cover pattern, socio economic factors and air quality status to predict morbidity on the basis of logistic based regression model

    NASA Astrophysics Data System (ADS)

    Dixit, A.; Singh, V. K.

    2017-12-01

Recent studies conducted by the World Health Organisation (WHO) estimated that 92% of the world's population lives in places where air pollution levels exceed the WHO air quality limits. This is due to changes in the Land Use Land Cover (LULC) pattern, socio economic drivers and anthropogenic heat emission caused by man-made activity. Thereby, many prevalent human respiratory diseases such as lung cancer, chronic obstructive pulmonary disease and emphysema have increased in recent times. In this study, a quantitative relationship is developed between land use (built-up land, water bodies, and vegetation), socio economic drivers and air quality parameters using a logistic based regression model over 7 different cities of India for the winter seasons of 2012 to 2016. Different LULC, socio economic, industrial emission source, meteorological condition and air quality data from the monitoring stations are taken to estimate the influence on morbidity in each city. Correlations between land use variables and monthly pollutant concentrations range from 0.63 to 0.76; similarly, correlations between land use variables, socio economic factors and morbidity range from 0.57 to 0.73. The performance of the model in estimating morbidity improved from 67% to 79% for the years 2015 and 2016 due to the better availability of observed data. The study highlights the growing importance of incorporating socio-economic drivers with air quality data when evaluating the morbidity rate of each city, rather than relying on quantitative analysis of air quality alone.
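
The abstract does not report its fitted coefficients, so the sketch below only illustrates the general shape of such a logistic morbidity model; every predictor name and coefficient value is a hypothetical placeholder:

```python
import math

def morbidity_probability(features: dict, coef: dict, intercept: float) -> float:
    """Logistic-regression probability of elevated morbidity given
    standardized land-use, socio-economic and air-quality predictors."""
    z = intercept + sum(coef[name] * x for name, x in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical standardized inputs and coefficients for one city
coef = {"built_up_fraction": 1.1, "pm10_zscore": 0.9, "green_fraction": -0.7}
x = {"built_up_fraction": 0.6, "pm10_zscore": 1.2, "green_fraction": 0.2}
print(round(morbidity_probability(x, coef, intercept=-1.0), 2))  # → 0.65
```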

  9. Vascular perfusion kinetics by contrast-enhanced ultrasound are related to synovial microvascularity in the joints of psoriatic arthritis.

    PubMed

    Fiocco, Ugo; Stramare, Roberto; Coran, Alessandro; Grisan, Enrico; Scagliori, Elena; Caso, Francesco; Costa, Luisa; Lunardi, Francesca; Oliviero, Francesca; Bianchi, Fulvia Chieco; Scanu, Anna; Martini, Veronica; Boso, Daniele; Beltrame, Valeria; Vezzù, Maristella; Cozzi, Luisella; Scarpa, Raffaele; Sacerdoti, David; Punzi, Leonardo; Doria, Andrea; Calabrese, Fiorella; Rubaltelli, Leopoldo

    2015-11-01

    The purpose of the study was to assess the relationship of the continuous mode contrast-enhanced harmonic ultrasound (CEUS) imaging with the histopathological and immunohistochemical (IHC) quantitative estimation of microvascular proliferation on synovial samples of patients affected by sustained psoriatic arthritis (PsA). A dedicated linear transducer was used in conjunction with a specific continuous mode contrast enhanced harmonic imaging technology with a second-generation sulfur hexafluoride-filled microbubbles C-agent. The examination was carried out within 1 week before arthroscopic biopsies in 32 active joints. Perfusional parameters were analyzed including regional blood flow (RBF); peak (PEAK) of the C-signal intensity, proportional to the regional blood volume (RBV); beta (β) perfusion frequency; slope (S), representing the inclination of the tangent in the origin; and the refilling time (RT), the reverse of beta. Arthroscopic synovial biopsies were targeted in the hypervascularity areas, as in the same knee recesses assessed by CEUS; the synovial cell infiltrate and vascularity (vessel density) was evaluated by IHC staining of CD45 (mononuclear cell) and CD31, CD105 (endothelial cell) markers, measured by computer-assisted morphometric analysis. In the CEUS area examined, the corresponding time-intensity curves demonstrated a slow rise time. Synovial histology showed slight increased layer lining thickness, perivascular lymphomonocyte cell infiltration, and microvascular remodeling, with marked vessel wall thickening with reduction of the vascular lumen. A significant correlation was found between RT and CD31+ as PEAK and CD105+ vessel density; RT was inversely correlated to RBF, PEAK, S, and β. 
The study demonstrated the association of CEUS perfusion kinetics with the histopathological quantitative and morphologic estimation of synovial microvascular proliferation, suggesting that CEUS imaging represents a reliable tool for estimating synovial hypervascularity in PsA.

  10. Intracranial microprobe for evaluating neuro-hemodynamic coupling in unanesthetized human neocortex

    PubMed Central

    Keller, Corey J.; Cash, Sydney S.; Narayanan, Suresh; Wang, Chunmao; Kuzniecky, Ruben; Carlson, Chad; Devinsky, Orrin; Thesen, Thomas; Doyle, Werner; Sassaroli, Angelo; Boas, David A.; Ulbert, Istvan; Halgren, Eric

    2009-01-01

    Measurement of the blood-oxygen-level dependent (BOLD) response with fMRI has revolutionized cognitive neuroscience and is increasingly important in clinical care. The BOLD response reflects changes in deoxy-hemoglobin concentration, blood volume, and blood flow. These hemodynamic changes ultimately result from neuronal firing and synaptic activity, but the linkage between these domains is complex, poorly understood, and may differ across species, cortical areas, diseases, and cognitive states. We describe here a technique that can measure neural and hemodynamic changes simultaneously from cortical microdomains in waking humans. We utilize a “laminar optode,” a linear array of microelectrodes for electrophysiological measures paired with a micro-optical device for hemodynamic measurements. Optical measurements include laser Doppler to estimate cerebral blood flow as well as point spectroscopy to estimate oxy- and deoxy-hemoglobin concentrations. The microelectrode array records local field potential gradients (PG) and multi-unit activity (MUA) at 24 locations spanning the cortical depth, permitting estimation of population trans-membrane current flows (Current Source Density, CSD) and population cell firing in each cortical lamina. Comparison of the laminar CSD/MUA profile with the origins and terminations of cortical circuits allows activity in specific neuronal circuits to be inferred and then directly compared to hemodynamics. Access is obtained in epileptic patients during diagnostic evaluation for surgical therapy. Validation tests with relatively well-understood manipulations (EKG, breath-holding, cortical electrical stimulation) demonstrate the expected responses. This device can provide a new and robust means for obtaining detailed, quantitative data for defining neurovascular coupling in awake humans. PMID:19428529

  11. Intracranial microprobe for evaluating neuro-hemodynamic coupling in unanesthetized human neocortex.

    PubMed

    Keller, Corey J; Cash, Sydney S; Narayanan, Suresh; Wang, Chunmao; Kuzniecky, Ruben; Carlson, Chad; Devinsky, Orrin; Thesen, Thomas; Doyle, Werner; Sassaroli, Angelo; Boas, David A; Ulbert, Istvan; Halgren, Eric

    2009-05-15

    Measurement of the blood-oxygen-level dependent (BOLD) response with fMRI has revolutionized cognitive neuroscience and is increasingly important in clinical care. The BOLD response reflects changes in deoxy-hemoglobin concentration, blood volume, and blood flow. These hemodynamic changes ultimately result from neuronal firing and synaptic activity, but the linkage between these domains is complex, poorly understood, and may differ across species, cortical areas, diseases, and cognitive states. We describe here a technique that can measure neural and hemodynamic changes simultaneously from cortical microdomains in waking humans. We utilize a "laminar optode," a linear array of microelectrodes for electrophysiological measures paired with a micro-optical device for hemodynamic measurements. Optical measurements include laser Doppler to estimate cerebral blood flow as well as point spectroscopy to estimate oxy- and deoxy-hemoglobin concentrations. The microelectrode array records local field potential gradients (PG) and multi-unit activity (MUA) at 24 locations spanning the cortical depth, permitting estimation of population trans-membrane current flows (Current Source Density, CSD) and population cell firing in each cortical lamina. Comparison of the laminar CSD/MUA profile with the origins and terminations of cortical circuits allows activity in specific neuronal circuits to be inferred and then directly compared to hemodynamics. Access is obtained in epileptic patients during diagnostic evaluation for surgical therapy. Validation tests with relatively well-understood manipulations (EKG, breath-holding, cortical electrical stimulation) demonstrate the expected responses. This device can provide a new and robust means for obtaining detailed, quantitative data for defining neurovascular coupling in awake humans.

  12. Estimating the number of animals in wildlife populations

    USGS Publications Warehouse

    Lancia, R.A.; Kendall, W.L.; Pollock, K.H.; Nichols, J.D.; Braun, Clait E.

    2005-01-01

    INTRODUCTION In 1938, Howard M. Wight devoted 9 pages, which was an entire chapter in the first wildlife management techniques manual, to what he termed 'census' methods. As books and chapters such as this attest, the volume of literature on this subject has grown tremendously. Abundance estimation remains an active area of biometrical research, as reflected in the many differences between this chapter and the similar contribution in the previous manual. Our intent in this chapter is to present an overview of the basic and most widely used population estimation techniques and to provide an entree to the relevant literature. Several possible approaches could be taken in writing a chapter dealing with population estimation. For example, we could provide a detailed treatment focusing on statistical models and on derivation of estimators based on these models. Although a chapter using this approach might provide a valuable reference for quantitative biologists and biometricians, it would be of limited use to many field biologists and wildlife managers. Another approach would be to focus on details of actually applying different population estimation techniques. This approach would include both field application (e.g., how to set out a trapping grid or conduct an aerial survey) and detailed instructions on how to use the resulting data with appropriate estimation equations. We are reluctant to attempt such an approach, however, because of the tremendous diversity of real-world field situations defined by factors such as the animal being studied, habitat, available resources, and because of our resultant inability to provide detailed instructions for all possible cases. We believe it is more useful to provide the reader with the conceptual basis underlying estimation methods. Thus, we have tried to provide intuitive explanations for how basic methods work. 
In doing so, we present relevant estimation equations for many methods and provide citations of more detailed treatments covering both statistical considerations and field applications. We have chosen to present methods that are representative of classes of estimators, rather than address every available method. Our hope is that this chapter will provide the reader with enough background to make an informed decision about what general method(s) will likely perform well in any particular field situation. Readers with a more quantitative background may then be able to consult detailed references and tailor the selected method to suit their particular needs. Less quantitative readers should consult a biometrician, preferably one with experience in wildlife studies, for this 'tailoring,' with the hope they will be able to do so with a basic understanding of the general method, thereby permitting useful interaction and discussion with the biometrician. SUMMARY Estimating the abundance or density of animals in wild populations is not a trivial matter. Virtually all techniques involve the basic problem of estimating the probability of seeing, capturing, or otherwise detecting animals during some type of survey and, in many cases, sampling concerns as well. In the case of indices, the detection probability is assumed to be constant (but unknown). We caution against use of indices unless this assumption can be verified for the comparison(s) of interest. In the case of population estimation, many methods have been developed over the years to estimate the probability of detection associated with various kinds of count statistics. Techniques range from complete counts, where sampling concerns often dominate, to incomplete counts where detection probabilities are also important. Some examples of the latter are multiple observers, removal methods, and capture-recapture. 
Before embarking on a survey to estimate the size of a population, one must understand clearly what information is needed and for what purpose the information will be used. The key to derivin
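
Capture-recapture, mentioned above as an incomplete-count method, reduces in its simplest two-sample form to the Lincoln-Petersen estimator; the sketch below uses Chapman's bias-corrected version (a standard variant, not taken verbatim from this chapter):

```python
def chapman_estimate(n1: int, n2: int, m2: int) -> float:
    """Chapman's bias-corrected Lincoln-Petersen abundance estimate:
    n1 animals marked and released, n2 captured in a second sample,
    m2 of those already carrying a mark."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# 100 marked, 80 caught later, 20 of them recaptures: roughly 389 animals
print(round(chapman_estimate(100, 80, 20)))  # → 389
```

The estimator works because m2/n2 estimates the detection (marking) probability n1/N; the +1 terms remove small-sample bias.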

  13. Quantitative Precipitation Nowcasting: A Lagrangian Pixel-Based Approach

    DTIC Science & Technology

    2012-01-01

Sorooshian, T. Bellerby, and G. Huffman, 2010: REFAME: Rain Estimation Using Forward-Adjusted Advection of Microwave Estimates. J. of Hydromet., 11...precipitation forecasting using information from radar and Numerical Weather Prediction models. J. of Hydromet., 4(6):1168-1180. Germann, U., and I

  14. Reference-based source separation method for identification of brain regions involved in a reference state from intracerebral EEG

    PubMed Central

    Samadi, Samareh; Amini, Ladan; Cosandier-Rimélé, Delphine; Soltanian-Zadeh, Hamid; Jutten, Christian

    2013-01-01

    In this paper, we present a fast method to extract the sources related to interictal epileptiform state. The method is based on general eigenvalue decomposition using two correlation matrices during: 1) periods including interictal epileptiform discharges (IED) as a reference activation model and 2) periods excluding IEDs or abnormal physiological signals as background activity. After extracting the most similar sources to the reference or IED state, IED regions are estimated by using multiobjective optimization. The method is evaluated using both realistic simulated data and actual intracerebral electroencephalography recordings of patients suffering from focal epilepsy. These patients are seizure-free after the resective surgery. Quantitative comparisons of the proposed IED regions with the visually inspected ictal onset zones by the epileptologist and another method of identification of IED regions reveal good performance. PMID:23428609

  15. The effect and importance of physical activity on behavioural and psychological symptoms in people with dementia: A systematic mixed studies review.

    PubMed

    Junge, Tina; Ahler, Jonas; Knudsen, Hans K; Kristensen, Hanne K

    2018-01-01

Background People with dementia may benefit from the effect of physical activity on behavioural and psychological symptoms of dementia. Qualitative synthesis of the importance of physical activity might complement and help clarify quantitative findings on this topic. The purpose of this systematic mixed studies review was to evaluate findings from both quantitative and qualitative methods about the effect and importance of physical activity on behavioural and psychological symptoms of dementia in people with dementia. Methods The systematic literature search was conducted in EMBASE, CINAHL, PubMed, PEDro and PsycINFO. Inclusion criteria were: people with a light to moderate degree of dementia, interventions including physical activity and outcomes focusing on behavioural and psychological symptoms of dementia or quality of life. To assess the methodological quality of the studies, the AMSTAR and GRADE checklists were applied for the quantitative studies and the CASP qualitative checklist for the qualitative studies. Results A small reduction in depression level and improved mood were seen in some quantitative studies of multi-component physical activity interventions, including walking. Due to high heterogeneity in the quantitative studies, a single summary of the effect of physical activity on behavioural and psychological symptoms of dementia should be interpreted with some caution. Across the qualitative studies, the common themes about the importance of physical activity were its 'socially rewarding' nature, the 'benefits of walking outdoors' and its contribution to 'maintaining self-hood'. Conclusion For people with dementia, there was a small, quantitative effect of multi-component physical activity, including walking, on depression level and mood. People with dementia reported the importance of walking outdoors, experiencing the social rewards of physical activity in groups, and described physical activity as a means of maintaining self-hood.

  16. Quantitative estimation of infarct size by simultaneous dual radionuclide single photon emission computed tomography: comparison with peak serum creatine kinase activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawaguchi, K.; Sone, T.; Tsuboi, H.

    1991-05-01

To test the hypothesis that simultaneous dual energy single photon emission computed tomography (SPECT) with technetium-99m (99mTc) pyrophosphate and thallium-201 (201Tl) can provide an accurate estimate of the size of myocardial infarction and to assess the correlation between infarct size and peak serum creatine kinase activity, 165 patients with acute myocardial infarction underwent SPECT 3.2 +/- 1.3 (SD) days after the onset of acute myocardial infarction. In the present study, the difference in the intensity of 99mTc-pyrophosphate accumulation was assumed to be attributable to differences in the volume of infarcted myocardium, and the infarct volume was corrected by the ratio of the myocardial activity to the osseous activity to quantify the intensity of 99mTc-pyrophosphate accumulation. The correlation of measured infarct volume with peak serum creatine kinase activity was significant (r = 0.60, p less than 0.01). There was also a significant linear correlation between the corrected infarct volume and peak serum creatine kinase activity (r = 0.71, p less than 0.01). Subgroup analysis showed a high correlation between corrected volume and peak creatine kinase activity in patients with anterior infarctions (r = 0.75, p less than 0.01) but a poor correlation in patients with inferior or posterior infarctions (r = 0.50, p less than 0.01). In both the early reperfusion and the no reperfusion groups, a good correlation was found between corrected infarct volume and peak serum creatine kinase activity (r = 0.76 and r = 0.76, respectively; p less than 0.01).
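
The volume correction described in the abstract amounts to scaling the measured infarct volume by the myocardium-to-bone uptake ratio; a minimal sketch (the function name and count values are hypothetical):

```python
def corrected_infarct_volume(volume_ml: float, myocardial_counts: float,
                             osseous_counts: float) -> float:
    """Scale a SPECT-derived infarct volume by the ratio of myocardial
    to osseous 99mTc-pyrophosphate activity, so that intense uptake
    weighs more than faint uptake of the same apparent size."""
    return volume_ml * (myocardial_counts / osseous_counts)

# Hypothetical 40 mL infarct whose uptake is 80% of bone activity
print(corrected_infarct_volume(40.0, 1.2e5, 1.5e5))  # → 32.0
```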

  17. Research Review: Neural response to threat in children, adolescents, and adults after child maltreatment - a quantitative meta-analysis.

    PubMed

    Hein, Tyler C; Monk, Christopher S

    2017-03-01

    Child maltreatment is common and has long-term consequences for affective function. Investigations of neural consequences of maltreatment have focused on the amygdala. However, developmental neuroscience indicates that other brain regions are also likely to be affected by child maltreatment, particularly in the social information processing network (SIPN). We conducted a quantitative meta-analysis to: confirm that maltreatment is related to greater bilateral amygdala activation in a large sample that was pooled across studies; investigate other SIPN structures that are likely candidates for altered function; and conduct a data-driven examination to identify additional regions that show altered activation in maltreated children, teens, and adults. We conducted an activation likelihood estimation analysis with 1,733 participants across 20 studies of emotion processing in maltreated individuals. Maltreatment is associated with increased bilateral amygdala activation to emotional faces. One SIPN structure is altered: superior temporal gyrus, of the detection node, is hyperactive in maltreated individuals. The results of the whole-brain corrected analysis also show hyperactivation of the parahippocampal gyrus and insula in maltreated individuals. The meta-analysis confirms that maltreatment is related to increased bilateral amygdala reactivity and also shows that maltreatment affects multiple additional structures in the brain that have received little attention in the literature. Thus, although the majority of studies examining maltreatment and brain function have focused on the amygdala, these findings indicate that the neural consequences of child maltreatment involve a broader network of structures. © 2016 Association for Child and Adolescent Mental Health.

  18. Quantitative study of digestive enzyme secretion and gastrointestinal lipolysis in chronic pancreatitis.

    PubMed

    Carrière, Frédéric; Grandval, Philippe; Renou, Christophe; Palomba, Aurélie; Priéri, Florence; Giallo, Jacqueline; Henniges, Friederike; Sander-Struckmeier, Suntje; Laugier, René

    2005-01-01

    The contribution of human gastric lipase (HGL) to the overall lipolysis process in chronic pancreatitis (CP), as well as the relative pancreatic enzyme levels, rarely are addressed. This study was designed to quantify pancreatic and extrapancreatic enzyme output, activity, and stability in CP patients vs. healthy volunteers. Healthy volunteers (n = 6), mild CP patients (n = 5), and severe (n = 7) CP patients were intubated with gastric and duodenal tubes before the administration of a test meal. HGL, human pancreatic lipase (HPL), chymotrypsin, and amylase concentrations were assessed in gastric and duodenal samples by measuring the respective enzymatic activities. Intragastric and overall lipolysis levels at the angle of Treitz were estimated based on quantitative analysis of lipolysis products. Similar analyses were performed on duodenal contents incubated ex vivo for studying enzyme stability and evolution of lipolysis. Although HPL, chymotrypsin, and amylase outputs all were extremely low, HGL outputs in patients with severe CP (46.8 +/- 31.0 mg) were 3-4-fold higher than in healthy controls (13.3 +/- 13.8 mg). Intragastric lipolysis did not increase, however, in patients with severe CP, probably because of the rapid decrease in the pH level of the gastric contents caused by a higher gastric acid secretion. HGL remains active and highly stable in the acidic duodenal contents of CP patients, and, overall, can achieve a significant lipolysis of the dietary triglycerides (30% of the control values) in the absence of HPL. Although all pancreatic enzyme secretions are simultaneously reduced in severe CP, gastric lipase can compensate partly for the loss of pancreatic lipase but not normalize overall lipolytic activity.

  19. A statistical framework for protein quantitation in bottom-up MS-based proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

ABSTRACT Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al. (2008)) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu

  20. qF-SSOP: real-time optical property corrected fluorescence imaging

    PubMed Central

    Valdes, Pablo A.; Angelo, Joseph P.; Choi, Hak Soo; Gioux, Sylvain

    2017-01-01

    Fluorescence imaging is well suited to provide image guidance during resections in oncologic and vascular surgery. However, the distorting effects of tissue optical properties on the emitted fluorescence are poorly compensated for on even the most advanced fluorescence image guidance systems, leading to subjective and inaccurate estimates of tissue fluorophore concentrations. Here we present a novel fluorescence imaging technique that performs real-time (i.e., video rate) optical property corrected fluorescence imaging. We perform full field of view simultaneous imaging of tissue optical properties using Single Snapshot of Optical Properties (SSOP) and fluorescence detection. The estimated optical properties are used to correct the emitted fluorescence with a quantitative fluorescence model to provide quantitative fluorescence-Single Snapshot of Optical Properties (qF-SSOP) images with less than 5% error. The technique is rigorous, fast, and quantitative, enabling ease of integration into the surgical workflow with the potential to improve molecular guidance intraoperatively. PMID:28856038
