Science.gov

Sample records for quantitative spect assessment

  1. Quantitative SPECT techniques.

    PubMed

    Watson, D D

    1999-07-01

    Quantitative imaging involves, first, a set of measurements that characterize an image. There are several variations of technique, but the basic measurements that are used for single photon emission computed tomography (SPECT) perfusion images are reasonably standardized. Quantification currently provides only relative tracer activity within the myocardial regions defined by an individual SPECT acquisition. Absolute quantification is still a work in progress. Quantitative comparison of absolute changes in tracer uptake comparing a stress and rest study or preintervention and postintervention study would be useful and could be done, but most commercial systems do not maintain the data normalization that is necessary for this. Measurements of regional and global function are now possible with electrocardiography (ECG) gating, and this provides clinically useful adjunctive data. Techniques for measuring ventricular function are evolving and promise to provide clinically useful accuracy. The computer can classify images as normal or abnormal by comparison with a normal database. The criteria for this classification involve more than just checking the normal limits. The images should be analyzed to measure how far they deviate from normal, and this information can be used in conjunction with pretest likelihood to indicate the level of statistical certainty that an individual patient has a true positive or true negative test. The interface between the computer and the clinician interpreter is an important part of the process. Especially when both perfusion and function are being determined, the ability of the interpreter to correctly assimilate the data is essential to the use of the quantitative process. As we become more facile with performing and recording objective measurements, the significance of the measurements in terms of risk evaluation, viability assessment, and outcome should be continually enhanced. PMID:10433336
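
    The closing point about combining the deviation-from-normal measurement with pretest likelihood is, at bottom, an application of Bayes' rule. A minimal sketch (the sensitivity and specificity figures below are illustrative assumptions, not values from the article):

```python
def post_test_probability(pretest, sensitivity, specificity):
    """Bayes' rule: probability of disease given a positive quantitative
    classification, starting from the pretest likelihood."""
    p_pos_disease = sensitivity                  # P(positive | disease)
    p_pos_healthy = 1.0 - specificity            # P(positive | no disease)
    num = p_pos_disease * pretest
    return num / (num + p_pos_healthy * (1.0 - pretest))

# 30% pretest likelihood, a test with 90% sensitivity and 85% specificity
p = post_test_probability(0.30, 0.90, 0.85)  # -> 0.72
```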

  2. Assessment of the sources of error affecting the quantitative accuracy of SPECT imaging in small animals

    SciTech Connect

    Hwang, Andrew B.; Franc, Benjamin L.; Gullberg, Grant T.; Hasegawa, Bruce H.

    2008-02-15

    Small animal SPECT imaging systems have multiple potential applications in biomedical research. Whereas SPECT data are commonly interpreted qualitatively in a clinical setting, the ability to accurately quantify measurements will increase the utility of the SPECT data for laboratory measurements involving small animals. In this work, we assess the effect of photon attenuation, scatter and partial volume errors on the quantitative accuracy of small animal SPECT measurements, first with Monte Carlo simulation and then confirmed with experimental measurements. The simulations modeled the imaging geometry of a commercially available small animal SPECT system. We simulated the imaging of a radioactive source within a cylinder of water, and reconstructed the projection data using iterative reconstruction algorithms. The size of the source and the size of the surrounding cylinder were varied to evaluate the effects of photon attenuation and scatter on quantitative accuracy. We found that photon attenuation can reduce the measured concentration of radioactivity in a volume of interest in the center of a rat-sized cylinder of water by up to 50% when imaging with iodine-125, and up to 25% when imaging with technetium-99m. When imaging with iodine-125, the scatter-to-primary ratio can reach up to approximately 30%, and can cause overestimation of the radioactivity concentration when reconstructing data with attenuation correction. We varied the size of the source to evaluate partial volume errors, which we found to be a strong function of the size of the volume of interest and the spatial resolution. These errors can result in large (>50%) changes in the measured amount of radioactivity. The simulation results were found to agree with experimental measurements. The inclusion of attenuation correction in the reconstruction algorithm improved quantitative accuracy. We also found that an improvement of the spatial resolution through the
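
    The attenuation losses described above follow from the Beer-Lambert law. A rough sketch using approximate textbook attenuation coefficients for water (assumed values; the abstract's figures refer to reconstructed concentrations, so they differ somewhat from raw photon survival along a single ray):

```python
import math

def transmitted_fraction(mu_cm, depth_cm):
    """Beer-Lambert law: fraction of photons surviving a uniform water path."""
    return math.exp(-mu_cm * depth_cm)

# Approximate linear attenuation coefficients of water (assumed values, cm^-1)
MU_I125 = 0.38   # ~30 keV photons from iodine-125
MU_TC99M = 0.15  # 140 keV photons from technetium-99m

# Photon loss for a source at the centre of a rat-sized (~2.5 cm radius) cylinder
loss_i125 = 1.0 - transmitted_fraction(MU_I125, 2.5)
loss_tc99m = 1.0 - transmitted_fraction(MU_TC99M, 2.5)
```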

  3. Brain SPECT quantitation in clinical diagnosis

    SciTech Connect

    Hellman, R.S.

    1991-12-31

    Methods to quantitate SPECT data for clinical diagnosis should be chosen so that they take advantage of the lessons learned from PET data. This is particularly important because current high-resolution SPECT brain imaging systems now produce images that are similar in resolution to those generated by last-generation PET equipment (9 mm FWHM). These high-resolution SPECT systems make quantitation of SPECT more problematic than before. Methodology validated on low-resolution SPECT systems may no longer be valid for data obtained with the newer SPECT systems. For example, in patients with dementia, the ratio of parietal to cerebellar activity was often studied. However, with new instruments, the cerebellum appears very different: discrete regions are more apparent. The large cerebellar regions usually used with older instrumentation are of an inappropriate size for the new equipment. The normal range for any method of quantitation determined using older equipment probably changes for data obtained with new equipment. It is not surprising that Kim et al., in their simulations, demonstrated that because of the finite resolution of imaging systems, the ability to measure pure function is limited, with "anatomy" and "function" coupled in a "complex nonlinear way". 11 refs.

  4. Impact of reconstruction parameters on quantitative I-131 SPECT

    NASA Astrophysics Data System (ADS)

    van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.

    2016-07-01

    Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high-energy photons, however, render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various correction methods on these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed using the following parameters: (1) without scatter correction, (2) with triple energy window (TEW) scatter correction and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), both (a) geometric Gaussian CDRs and (b) Monte Carlo simulated CDRs were compared. Quantitative accuracy, contrast-to-noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density compared with TEW or no scatter correction. The quantification error relative to a dose calibrator derived measurement was found to be <1%, -26% and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo simulated CDR correction compared with geometric Gaussian or no CDR modelling. Scatter correction showed a small effect on quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since this factor may be patient dependent. Monte Carlo based scatter correction including accurately simulated CDR
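
    The triple energy window (TEW) correction mentioned above estimates the scatter inside the photopeak window as a trapezoid spanned by the count densities in two narrow flanking windows. A sketch with illustrative counts and window widths (not values from the study):

```python
def tew_primary_counts(c_main, c_lower, c_upper, w_main, w_lower, w_upper):
    """Triple-energy-window scatter correction: scatter in the main window is
    approximated by a trapezoid whose sides are the count densities (counts
    per keV) in the two narrow flanking windows."""
    scatter = (c_lower / w_lower + c_upper / w_upper) * w_main / 2.0
    return max(c_main - scatter, 0.0), scatter

# Illustrative: a 28 keV photopeak window flanked by two 3 keV windows
primary, scatter = tew_primary_counts(10000.0, 600.0, 200.0, 28.0, 3.0, 3.0)
```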

  6. Quantitative Monte Carlo-based holmium-166 SPECT reconstruction

    SciTech Connect

    Elschot, Mattijs; Smits, Maarten L. J.; Nijsen, Johannes F. W.; Lam, Marnix G. E. H.; Zonnenberg, Bernard A.; Bosch, Maurice A. A. J. van den; Jong, Hugo W. A. M. de; Viergever, Max A.

    2013-11-15

    Purpose: Quantitative imaging of the radionuclide distribution is of increasing interest for microsphere radioembolization (RE) of liver malignancies, to aid treatment planning and dosimetry. For this purpose, holmium-166 ((166)Ho) microspheres have been developed, which can be visualized with a gamma camera. The objective of this work is to develop and evaluate a new reconstruction method for quantitative (166)Ho SPECT, including Monte Carlo-based modeling of photon contributions from the full energy spectrum. Methods: A fast Monte Carlo (MC) simulator was developed for simulation of (166)Ho projection images and incorporated in a statistical reconstruction algorithm (SPECT-fMC). Photon scatter and attenuation for all photons sampled from the full (166)Ho energy spectrum were modeled during reconstruction by Monte Carlo simulations. The energy- and distance-dependent collimator-detector response was modeled using precalculated convolution kernels. Phantom experiments were performed to quantitatively evaluate image contrast, image noise, count errors, and activity recovery coefficients (ARCs) of SPECT-fMC in comparison with those of an energy window-based method for correction of down-scattered high-energy photons (SPECT-DSW) and a previously presented hybrid method that combines MC simulation of photopeak scatter with energy window-based estimation of down-scattered high-energy contributions (SPECT-ppMC+DSW). Additionally, the impact of SPECT-fMC on whole-body recovered activities (A_est) and estimated radiation absorbed doses was evaluated using clinical SPECT data of six (166)Ho RE patients. Results: At the same noise level, SPECT-fMC images showed substantially higher contrast than SPECT-DSW and SPECT-ppMC+DSW in spheres ≥17 mm in diameter. The count error was reduced from 29% (SPECT-DSW) and 25% (SPECT-ppMC+DSW) to 12% (SPECT-fMC). ARCs in five spherical volumes of 1.96–106.21 ml were improved from 32%–63% (SPECT-DSW) and 50%–80

  7. Accuracy of quantitative reconstructions in SPECT/CT imaging

    NASA Astrophysics Data System (ADS)

    Shcherbinin, S.; Celler, A.; Belhocine, T.; van der Werf, R.; Driedger, A.

    2008-09-01

    The goal of this study was to determine the quantitative accuracy of our OSEM-APDI reconstruction method based on SPECT/CT imaging for Tc-99m, In-111, I-123, and I-131 isotopes. Phantom studies were performed on a SPECT/low-dose multislice CT system (Infinia-Hawkeye-4 slice, GE Healthcare) using clinical acquisition protocols. Two radioactive sources were centrally and peripherally placed inside an anthropomorphic thorax phantom filled with non-radioactive water. Corrections for attenuation, scatter, collimator blurring and collimator septal penetration were applied and their contribution to the overall accuracy of the reconstruction was evaluated. Reconstruction with the most comprehensive set of corrections resulted in activity estimation with error levels of 3–5% for all the isotopes.

  8. Design and assessment of cardiac SPECT systems

    NASA Astrophysics Data System (ADS)

    Lee, Chih-Jie

    Single-photon emission computed tomography (SPECT) is a modality widely used to detect myocardial ischemia and myocardial infarction. Objectively assessing and comparing different SPECT systems is important so that the best detectability of cardiac defects can be achieved. Whitaker, Clarkson, and Barrett's study on the scanning linear observer (SLO) shows that the SLO can be used to estimate the location and size of signals. One major advantage of the SLO is that it can be used with projection data rather than reconstructed data. Thus, this observer model assesses overall hardware performance independently of any reconstruction algorithm. In addition, we will show that the run time of image-quality studies is significantly reduced. Several systems derived from the GE CZT-based dedicated cardiac SPECT camera Discovery 530c design, which is officially named the Alcyone Technology: Discovery NM 530c, were assessed using the performance of the SLO for the task of detecting cardiac defects and estimating the properties of the defects. Clinically, hearts can be virtually segmented into three coronary artery territories: left anterior descending artery (LAD), left circumflex artery (LCX), and right coronary artery (RCA). One of the most important functions of a cardiac SPECT system is to produce images from which a radiologist can correctly predict in which territory the defect exists. A good estimation of the defect extent from the images is also very helpful for determining the seriousness of the myocardial ischemia. In this dissertation, both locations and extents of defects were estimated by the SLO, and system performance was assessed using localization receiver operating characteristic (LROC) / estimation receiver operating characteristic (EROC) curves. The area under the LROC curve (AULC) / area under the EROC curve (AUEC) and the true positive fraction (TPF) at a specific false positive fraction (FPF) can be treated as the figures of merit (FOMs). As the results will show, a
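
    The scanning linear observer can be pictured as sliding a signal template over the data and taking the location that maximizes a linear test statistic. The sketch below is a heavily simplified matched-filter version of that idea (the actual SLO applies statistically weighted templates to projection data):

```python
import numpy as np

def scanning_linear_observer(image, template, background=0.0):
    """Simplified scanning linear observer: evaluate a linear test statistic
    (template correlation) at every location; the maximum is the detection
    statistic and its argmax the estimated signal location."""
    h, w = template.shape
    H, W = image.shape
    best, best_loc = -np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            stat = float(np.sum(template * (image[r:r + h, c:c + w] - background)))
            if stat > best:
                best, best_loc = stat, (r, c)
    return best, best_loc

img = np.zeros((8, 8))
img[2:5, 3:6] = 1.0                                          # a 3x3 "defect"
stat, loc = scanning_linear_observer(img, np.ones((3, 3)))   # loc == (2, 3)
```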

  9. Sci—Thur PM: Imaging — 05: Calibration of a SPECT/CT camera for quantitative SPECT with (99m)Tc

    SciTech Connect

    Gaudin, Émilie; Montégiani, Jean-François; Després, Philippe; Beauregard, Jean-Mathieu

    2014-08-15

    While quantitation is the norm in PET, it is not yet widely available in SPECT. This work's aim was to calibrate a commercially available SPECT/CT system to perform quantitative SPECT. Counting sensitivity, dead-time (DT) constant and partial volume effect (PVE) of the system were assessed. A dual-head Siemens Symbia T6 SPECT/CT camera equipped with low-energy high-resolution collimators was studied. (99m)Tc was the radioisotope of interest because of its wide usage in nuclear medicine. First, point source acquisitions were performed (activity: 30–990 MBq). Further acquisitions were then performed with a uniform Jaszczak phantom filled with water at high activity (25–5000 MBq). PVE was studied using 6 hot spheres (diameters: 9.9–31.2 mm) filled with (99m)Tc (2.8 MBq/cc) in the Jaszczak phantom, which was: (1) empty, (2) water-filled and (3) water-filled with low activity (0.1 MBq/cc). The data were reconstructed with the Siemens Flash3D iterative algorithm with 4 subsets and 8 iterations, attenuation correction (AC) and scatter correction (SC). DT modelling was based on the total spectrum counting rate. Sensitivity was assessed using AC-SC reconstructed SPECT data. Sensitivity and DT for the point sources were 99.51 ± 1.46 cps/MBq and 0.60 ± 0.04 µs. For the phantom, sensitivity and DT were 109.9 ± 2.3 cps/MBq and 0.62 ± 0.13 µs. The recovery coefficient varied from 5% for the 9.9 mm spheres to 80% for the 31.2 mm spheres. With our calibration methods, both the sensitivity and the DT constant of the SPECT camera had little dependence on object geometry and attenuation. For small objects of known size, recovery coefficients can be applied to correct PVE. Clinical quantitative SPECT appears to be possible and has many potential applications.
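
    A sketch of how such a calibration is applied, assuming a non-paralyzable dead-time model (the abstract bases its DT modelling on the total-spectrum counting rate; the exact model form here is an assumption). The constants echo the phantom values reported above:

```python
def deadtime_corrected_rate(observed_cps, tau_s):
    """Non-paralyzable dead-time model: recover the true counting rate from
    the observed one, given the dead-time constant tau (seconds)."""
    return observed_cps / (1.0 - observed_cps * tau_s)

def activity_mbq(observed_cps, tau_s, sensitivity_cps_per_mbq):
    """Observed counting rate -> activity, via dead-time correction and the
    phantom-derived sensitivity calibration (cps/MBq)."""
    return deadtime_corrected_rate(observed_cps, tau_s) / sensitivity_cps_per_mbq

# Phantom calibration: ~109.9 cps/MBq sensitivity, ~0.62 us dead-time constant
a = activity_mbq(observed_cps=50000.0, tau_s=0.62e-6,
                 sensitivity_cps_per_mbq=109.9)   # roughly 470 MBq
```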

  10. Frequency-domain approaches to quantitative brain SPECT

    NASA Astrophysics Data System (ADS)

    Cheng, Jui-Hsi

    1997-12-01

    Quantitative SPECT (Single Photon Emission Computed Tomography) has been limited mainly by (1) inadequate numbers of detected photons which are contaminated by Poisson noise in projections, (2) photon attenuation in the body, (3) inclusion of scattered photons in the projections, and (4) depth-dependent blurring due to the finite size of collimator holes. Various methods to compensate for the above effects via either spatial-domain approaches or frequency-domain approaches have been proposed. However, most of the proposed methods focus only on individual effects. A reconstruction method which can compensate for all of the effects simultaneously is necessary. We have developed an algorithm to compensate for all of the effects simultaneously using frequency-domain approaches. For noise suppression, a method to convert the signal-dependent Poisson noise into signal-independent Gaussian white noise was first applied. Then the Wiener filter with a designed butterfly window was used. For scatter-photon removal, a subtraction technique based on multiple energy-window acquisitions was employed. For collimation deblurring, a direct inverse filter was designed via the stationary phase condition (depth-frequency relationship). For attenuation compensation, an exact analytical solution was derived by Fourier analysis. Our algorithm was executed on an HP/730 workstation. An improvement was shown in computing time, noise suppression, recognition of phantom features, and quantification of concentrations of regions of interest (ROI). In addition to simulation, we performed a series of experiments to verify our models and test our algorithm. These include (1) investigation of the characteristics of Poisson noise of SPECT, (2) comparison of the performance of dual-energy window and triple-energy window methods for scatter correction, (3) measurement of a depth-dependent PSF, (4) reconstruction of the attenuation map using scatter-window data, and (5) implementation of a simultaneous
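
    The abstract does not name its Poisson-to-Gaussian conversion; the standard choice for turning signal-dependent Poisson noise into approximately signal-independent Gaussian noise is the Anscombe variance-stabilizing transform, sketched here:

```python
import math

def anscombe(counts):
    """Anscombe transform: maps Poisson counts to values with approximately
    unit Gaussian variance, so signal-independent filters (e.g. a Wiener
    filter) can be applied afterwards."""
    return 2.0 * math.sqrt(counts + 3.0 / 8.0)

def inverse_anscombe(v):
    """Plain algebraic inverse (biased at very low counts)."""
    return (v / 2.0) ** 2 - 3.0 / 8.0

v = anscombe(100)             # ~20.04
back = inverse_anscombe(v)    # recovers 100 (algebraic round trip)
```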

  11. SPECT imaging evaluation in movement disorders: far beyond visual assessment.

    PubMed

    Badiavas, Kosmas; Molyvda, Elisavet; Iakovou, Ioannis; Tsolaki, Magdalini; Psarrakos, Kyriakos; Karatzas, Nikolaos

    2011-04-01

    Single photon emission computed tomography (SPECT) imaging with (123)I-FP-CIT is of great value in differentiating patients suffering from Parkinson's disease (PD) from those suffering from essential tremor (ET). Moreover, SPECT with (123)I-IBZM can differentiate PD from Parkinson's "plus" syndromes. Diagnosis is still based mainly on experienced observers' visual assessment of the resulting images, while many quantitative methods have been developed since the early days of neuroimaging to assist diagnosis. The aim of this work is to attempt to categorize, briefly present and comment on a number of semi-quantification methods used in nuclear medicine neuroimaging. Various arithmetic indices have been introduced, with region of interest (ROI) manual drawing methods giving their place to automated procedures, while advancing computer technology has allowed automated image registration, fusion and segmentation to bring quantification closer to the final diagnosis based on the whole of the patient's examination results, clinical condition and response to therapy. The search for absolute quantification has passed through neuroreceptor quantification models, which are invasive methods that involve tracer kinetic modelling and arterial blood sampling, a practice that is not commonly used in a clinical environment. On the other hand, semi-quantification methods relying on computers and dedicated software try to elicit numerical information out of SPECT images. The application of semi-quantification methods aims at separating the different patient categories, solving the main problem of finding the uptake in the structures of interest. The semi-quantification methods which were studied fall roughly into three categories, which are described as classic methods, advanced automated methods and pixel-based statistical analysis methods. All these methods can be further divided into various subcategories. The plethora of the existing semi-quantitative methods reinforces
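
    A representative "classic" index of the kind surveyed here is the specific binding ratio used for (123)I-FP-CIT studies, where a reference region (e.g. occipital cortex) approximates non-specific uptake. A sketch with illustrative ROI counts (not values from the article):

```python
def specific_binding_ratio(striatal_mean, reference_mean):
    """Classic semi-quantitative index for dopamine-transporter SPECT:
    (striatal - reference) / reference mean counts per voxel."""
    return (striatal_mean - reference_mean) / reference_mean

# Illustrative mean counts per voxel from manually or automatically drawn ROIs
sbr = specific_binding_ratio(striatal_mean=300.0, reference_mean=100.0)  # 2.0
```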

  12. Quantitation of renal uptake of technetium-99m DMSA using SPECT

    SciTech Connect

    Groshar, D.; Frankel, A.; Iosilevsky, G.; Israel, O.; Moskovitz, B.; Levin, D.R.; Front, D.

    1989-02-01

    Quantitative single photon emission computed tomography (SPECT) methodology based on calibration with kidney phantoms has been applied for the assessment of renal uptake of (99m)Tc-DMSA in 25 normals; 16 patients with a single normal kidney; 30 patients with unilateral nephropathy; and 17 patients with bilateral nephropathy. An excellent correlation (r = 0.99, s.e.e. = 152) was found between SPECT-measured concentration and actual concentration in kidney phantoms. Kidney uptake at 6 hr after injection in normals was 20.0% +/- 4.6% for the left and 20.8% +/- 4.4% for the right. Patients with unilateral nephropathy had a statistically significant (p < 0.001) low uptake in the diseased kidney (7.0% +/- 4.7%), but the contralateral kidney uptake did not differ from the normal group (20.0% +/- 7.0%). The method was especially useful in patients with bilateral nephropathy. Significantly (p < 0.001) decreased uptake was found in both kidneys (5.1% +/- 3.4% for the left and 6.7% +/- 4.2% for the right). The total kidney uptake (right and left) in this group was inversely correlated (r = 0.83) with serum creatinine. The uptake of (99m)Tc-DMSA in a single normal kidney was higher (p < 0.001) than in a normal kidney (34.7% +/- 11.9%); however, it was lower than the total absolute uptake (RT + LT = 41.5% +/- 8.8%) in the normal group. The results indicate that SPECT is a reliable and reproducible technique to quantitate absolute kidney uptake of (99m)Tc-DMSA.
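
    The uptake computation behind such results can be sketched as follows, assuming a phantom-derived counts-to-activity calibration and decay correction of the injected dose to the imaging time (the decay-correction step is an assumption here, not detailed in the abstract):

```python
def percent_uptake(kidney_cps, calib_cps_per_mbq, injected_mbq,
                   hours_post_injection, half_life_h=6.01):
    """Percent injected dose in a kidney from calibrated SPECT counts.
    99mTc half-life ~6.01 h; calibration from kidney phantom measurements."""
    kidney_mbq = kidney_cps / calib_cps_per_mbq
    remaining_mbq = injected_mbq * 0.5 ** (hours_post_injection / half_life_h)
    return 100.0 * kidney_mbq / remaining_mbq

# Illustrative: 2000 cps over the kidney VOI, 100 cps/MBq, 200 MBq injected
u = percent_uptake(2000.0, 100.0, 200.0, hours_post_injection=6.0)  # ~20%
```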

  13. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of a gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  14. Quantitative analysis of L-SPECT system for small animal brain imaging

    NASA Astrophysics Data System (ADS)

    Rahman, Tasneem; Tahtali, Murat; Pickering, Mark R.

    2016-03-01

    This paper aims to investigate the performance of a newly proposed L-SPECT system for small animal brain imaging. The L-SPECT system consists of an array of 100 × 100 pinholes with diameters in the micrometre range. The proposed detector module has a 48 mm by 48 mm active area and the system is based on a pixelated array of NaI crystals (10 × 10 × 10 mm elements) coupled with an array of position sensitive photomultiplier tubes (PSPMTs). The performance of this system was evaluated with pinhole radii of 50 μm, 60 μm and 100 μm. Monte Carlo simulation studies using the Geant4 Application for Tomographic Emission (GATE) software package validate the performance of this novel dual-head L-SPECT system, where a geometric mouse phantom is used to investigate its performance. All SPECT data were obtained using 120 projection views from 0° to 360° with a 3° step. Slices were reconstructed using a conventional filtered back projection (FBP) algorithm. We have evaluated the quality of the images in terms of spatial resolution (FWHM) based on the line spread function, the system sensitivity, the point source response function and the image quality. The sensitivity of our newly proposed L-SPECT system was about 4500 cps/μCi at 6 cm, along with excellent full width at half-maximum (FWHM) using the 50 μm pinhole aperture at several radii of rotation. The analysis results show the combination of excellent spatial resolution and high detection efficiency over an energy range between 20 and 160 keV. The results demonstrate that SPECT imaging using a pixelated L-SPECT detector module is applicable to quantitative studies of mouse brain imaging.
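
    Spatial resolution quoted as FWHM is read off the line spread function. A sketch of the measurement (a sampled-profile interpolation plus the Gaussian shortcut):

```python
import math

def gaussian_fwhm(sigma):
    """FWHM of a Gaussian line spread function with standard deviation sigma."""
    return 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma   # ~2.355 * sigma

def fwhm_from_profile(xs, ys):
    """FWHM of a sampled LSF profile via linear interpolation at the two
    half-maximum crossings."""
    half = max(ys) / 2.0
    def crossing(indices):
        for i in indices:
            y0, y1 = ys[i], ys[i + 1]
            if (y0 - half) * (y1 - half) <= 0 and y0 != y1:
                t = (half - y0) / (y1 - y0)
                return xs[i] + t * (xs[i + 1] - xs[i])
        raise ValueError("no half-maximum crossing found")
    left = crossing(range(len(ys) - 1))                 # scan from the left
    right = crossing(range(len(ys) - 2, -1, -1))        # scan from the right
    return right - left
```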

  15. Applicability of a set of tomographic reconstruction algorithms for quantitative SPECT on irradiated nuclear fuel assemblies

    NASA Astrophysics Data System (ADS)

    Jacobsson Svärd, Staffan; Holcombe, Scott; Grape, Sophie

    2015-05-01

    A fuel assembly operated in a nuclear power plant typically contains 100-300 fuel rods, depending on fuel type, which become strongly radioactive during irradiation in the reactor core. For operational and security reasons, it is of interest to experimentally deduce rod-wise information from the fuel, preferably by means of non-destructive measurements. The tomographic SPECT technique offers such possibilities through its two-step application: (1) recording the gamma-ray flux distribution around the fuel assembly, and (2) reconstructing the assembly's internal source distribution, based on the recorded radiation field. In this paper, algorithms for performing the latter step and extracting quantitative relative rod-by-rod data are accounted for. As compared to the application of SPECT in nuclear medicine, nuclear fuel assemblies present a much more heterogeneous distribution of internal attenuation to gamma radiation than the human body, typically with rods containing pellets of heavy uranium dioxide surrounded by cladding of a zirconium alloy placed in water or air. This inhomogeneity severely complicates the tomographic quantification of the rod-wise relative source content, and the deduction of conclusive data requires detailed modelling of the attenuation to be introduced in the reconstructions. However, as shown in this paper, simplified models may still produce valuable information about the fuel. Here, a set of reconstruction algorithms for SPECT on nuclear fuel assemblies are described and discussed in terms of their quantitative performance for two applications: verification of fuel assemblies' completeness in nuclear safeguards, and rod-wise fuel characterization. It is argued that a requirement not to base the former assessment on any a priori information constrains which reconstruction methods may be used in that case, whereas the use of a priori information on geometry and material content enables highly accurate quantitative assessment, which
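
    A common starting point for such reconstructions is the maximum-likelihood EM algorithm, with the heterogeneous attenuation folded into the system matrix. This is a generic sketch, not the authors' algorithms; the toy system below is illustrative:

```python
import numpy as np

def mlem(A, y, n_iter=500):
    """Maximum-likelihood EM: x <- x * A^T(y / Ax) / A^T 1. Attenuation is
    modelled inside A, whose entry A[i, j] is the probability that an
    emission in pixel j is detected in projection bin i (attenuation along
    the ray included)."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                      # A^T 1, the sensitivity image
    for _ in range(n_iter):
        proj = A @ x                          # stays positive for positive A, x
        x *= (A.T @ (y / proj)) / sens
    return x

# Toy 2-pixel / 2-bin system with asymmetric (attenuated) detection weights
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
x_true = np.array([4.0, 2.0])
x_hat = mlem(A, A @ x_true)                   # noise-free data
```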

  16. A direct measurement of skull attenuation for quantitative SPECT

    SciTech Connect

    Turkington, T.G.; Gilland, D.R.; Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.; Smith, M.F.

    1993-08-01

    The attenuation of 140 keV photons was measured in three empty skulls by placing a (99m)Tc line source inside each one and acquiring projection data. These projections were compared to projections of the line source alone to determine the transmission through each point in the skull surrounding the line source. The effective skull thickness was calculated for each point using an assumed dense bone attenuation coefficient. The relative attenuation for this thickness of bone was compared to that of an equivalent amount of soft tissue to evaluate the increased attenuation of photons in brain SPECT relative to a uniform soft tissue approximation. For the skull regions surrounding most of the brain, the effective bone thickness varied considerably but was generally less than 6 mm, resulting in relative attenuation increases of less than 6%.
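
    The two conversions in this measurement, transmission to effective bone thickness and bone-versus-tissue excess attenuation, can be sketched as below. The attenuation coefficients are rough assumed values, chosen only to reproduce the order of magnitude reported above:

```python
import math

MU_BONE = 0.25    # cm^-1 at 140 keV, assumed effective bone value
MU_TISSUE = 0.15  # cm^-1 at 140 keV, assumed soft-tissue value

def effective_thickness_cm(transmission):
    """Effective bone thickness from a measured transmission fraction,
    using the assumed bone attenuation coefficient."""
    return -math.log(transmission) / MU_BONE

def extra_attenuation(thickness_cm):
    """Extra attenuation of a bone layer relative to the same thickness of
    soft tissue."""
    return 1.0 - math.exp(-(MU_BONE - MU_TISSUE) * thickness_cm)

extra = extra_attenuation(0.6)   # ~6% for 6 mm of bone
```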

  17. A new dynamic myocardial phantom for evaluation of SPECT and PET quantitation in systolic and diastolic conditions

    SciTech Connect

    Dreuille, O. de; Bendriem, B.; Riddell, C.

    1996-12-31

    We present a new dynamic myocardial phantom designed to evaluate SPECT and PET imaging in systolic and diastolic conditions. The phantom includes a thoracic attenuating medium, and scans can be performed with the myocardial wall thickness varying. In this study the phantom was used with three different wall thicknesses, characteristic of systolic, end-diastolic and pathologic end-diastolic conditions. The myocardium was filled with (99m)Tc, (18)F and Gd and imaged by SPECT, PET and MRI. SPECT attenuation correction was performed using a modified PET transmission. A bull's-eye image was obtained for all data and wall ROIs were then drawn for analysis. Using MRI as a reference, errors from PET, SPECT and attenuation-corrected SPECT were calculated. Systolic PET performance agreed with MRI. Quantitation loss was observed in diastole, due to wall thickness reduction compared with systole. Attenuation correction in SPECT led to a significant decrease of the error both in systole (from 29% to 14%) and in diastole (from 35% to 22%). This was particularly pronounced for the septum and inferior walls. SPECT residual errors (14% in systole and 22% in pathologic end-diastole) are likely caused by scatter, noise and depth-dependent resolution effects. The results obtained with this dynamic phantom demonstrate the quantitation improvement achieved in SPECT with attenuation correction and also reinforce the need for variable resolution correction in addition to attenuation correction.

  18. 99mTc-MDP SPECT/CT for assessment of condylar hyperplasia.

    PubMed

    Derlin, Thorsten; Busch, Jasmin D; Habermann, Christian R

    2013-01-01

    We report a case of condylar hyperplasia diagnosed with 99mTc-MDP SPECT/CT. A 21-year-old woman with facial asymmetry was referred for assessment of condylar growth activity. SPECT/CT confirmed condylar hyperactivity, and simultaneous low-dose CT contributed to the diagnosis of hemimandibular hyperplasia. SPECT/CT may become a valuable tool for the diagnosis and comprehensive assessment of condylar hyperplasia, providing both functional and morphological information. PMID:23242067

  19. Quantitative simultaneous 111In/99mTc SPECT-CT of osteomyelitis

    PubMed Central

    Cervo, Morgan; Gerbaudo, Victor H.; Park, Mi-Ae; Moore, Stephen C.

    2013-01-01

    projections to the sum of 99mTc and 111In contributions, using the known half-lives. Uncontaminated data were scaled and recombined into six datasets with different activity ratios; ten Poisson noise realizations were then generated for each ratio. VOIs in each of the compartments were used to evaluate the bias and precision of each method with respect to reconstructions of uncontaminated datasets. In addition to the simulated and acquired phantom images, the authors reconstructed patient images with MC-JOSEM and TEW-OSEM. Patient reconstructions were assessed qualitatively for lesion contrast, spatial definition, and scatter. Results: For all simulated and acquired infection phantoms, the root-mean-squared error of measured 99mTc activity was significantly improved with MC-JOSEM and TEW-OSEM in comparison to NC-OSEM reconstructions. While MC-JOSEM trended toward outperforming TEW-OSEM, the improvement was only found to be significant (p < 0.001) for the acquired bone phantom, in which a wide range of 111In/99mTc concentration ratios was tested. In all cases, scatter correction did not significantly improve 111In quantitation. Conclusions: Compensation for scatter and crosstalk is useful for improving quality, bias, and precision of 99mTc activity estimates in simultaneous dual-radionuclide imaging of OM. The use of the more rigorous MC-based estimates provided marginal improvements over TEW. While the phantom results were encouraging, more subjects are needed to evaluate the usefulness of quantitative 111In/99mTc SPECT-CT in the clinic. PMID:23927346
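The triple-energy-window (TEW) scatter estimate that TEW-OSEM builds on is a trapezoidal interpolation under the photopeak using two narrow flanking windows. A generic sketch (the window widths and counts below are illustrative, not those of the study):

```python
import numpy as np

def tew_scatter_estimate(lower, upper, w_lower, w_upper, w_peak):
    """Trapezoidal TEW estimate of scatter counts inside the photopeak window.

    lower/upper: counts in the narrow sub- and super-peak windows (scalars or arrays)
    w_lower/w_upper/w_peak: window widths in keV
    """
    return (lower / w_lower + upper / w_upper) * w_peak / 2.0

# Illustrative 99mTc photopeak window (28 keV wide) with 6 keV side windows
peak = np.array([200.0, 180.0])
scatter = tew_scatter_estimate(np.array([30.0, 24.0]),
                               np.array([6.0, 6.0]),
                               6.0, 6.0, 28.0)
primary = np.clip(peak - scatter, 0.0, None)  # scatter-corrected photopeak counts
```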

  20. Quantitative cardiac SPECT in three dimensions: validation by experimental phantom studies

    NASA Astrophysics Data System (ADS)

    Liang, Z.; Ye, J.; Cheng, J.; Li, J.; Harrington, D.

    1998-04-01

    A mathematical framework for quantitative SPECT (single photon emission computed tomography) reconstruction of the heart is presented. An efficient simultaneous compensation approach to the reconstruction task is described, and its implementation on a digital computer is delineated. The approach was validated using experimental data acquired from chest phantoms. The phantoms consisted of an elliptical cylindrical tank of Plexiglass, a cardiac insert made of Plexiglass, a spine insert of packed bone meal, and lung inserts made of Styrofoam beads. Water bags were added to simulate different body characteristics. The quantitative reconstruction was compared with the conventional FBP (filtered backprojection) method. The FBP reconstruction had poor quantitative accuracy that varied with body configuration. The quantitative approach demonstrated significant improvement in reconstruction accuracy with a moderate computing time on a currently available desktop computer. Furthermore, the quantitative reconstruction was robust across different body characteristics. The quantitative approach therefore has potential for clinical use.

  1. Quantitative evaluation study of four-dimensional gated cardiac SPECT reconstruction †

    PubMed Central

    Jin, Mingwu; Yang, Yongyi; Niu, Xiaofeng; Marin, Thibault; Brankov, Jovan G.; Feng, Bing; Pretorius, P. Hendrik; King, Michael A.; Wernick, Miles N.

    2013-01-01

    In practice, gated cardiac SPECT images suffer from a number of degrading factors, including distance-dependent blur, attenuation, scatter, and increased noise due to gating. Recently, we proposed a motion-compensated approach to four-dimensional (4D) reconstruction for gated cardiac SPECT and demonstrated that motion-compensated temporal smoothing can effectively suppress the increased noise due to lowered counts in individual gates. In this work we further develop this motion-compensated 4D approach by also taking into account attenuation and scatter in the reconstruction process, which are two major degrading factors in SPECT data. In our experiments we conducted a thorough quantitative evaluation of the proposed 4D method using Monte Carlo simulated SPECT imaging based on the 4D NURBS-based cardiac-torso (NCAT) phantom. In particular, we evaluated the accuracy of the reconstructed left ventricular myocardium using a number of quantitative measures including regional bias-variance analyses and wall intensity uniformity. The quantitative results demonstrate that use of motion-compensated 4D reconstruction can improve the accuracy of the reconstructed myocardium, which in turn can improve the detectability of perfusion defects. Moreover, our results reveal that while traditional spatial smoothing can be beneficial, its merit diminishes with the use of motion-compensated temporal regularization. As a preliminary demonstration, we also tested our 4D approach on patient data. The reconstructed images from both simulated and patient data demonstrate that our 4D method can improve the definition of the left ventricular (LV) wall. PMID:19724094

  2. Three modality image registration of brain SPECT/CT and MR images for quantitative analysis of dopamine transporter imaging

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Yuzuho; Takeda, Yuta; Hara, Takeshi; Zhou, Xiangrong; Matsusako, Masaki; Tanaka, Yuki; Hosoya, Kazuhiko; Nihei, Tsutomu; Katafuchi, Tetsuro; Fujita, Hiroshi

    2016-03-01

    Important features of Parkinson's disease (PD) are the degeneration and loss of dopamine neurons in the corpus striatum. 123I-FP-CIT can visualize the activity of these dopamine neurons, and the activity ratio of background to corpus striatum is used for the diagnosis of PD and dementia with Lewy bodies (DLB). The specific activity can be observed in the corpus striatum on SPECT images, but the location and shape of the corpus striatum are often lost on SPECT images alone because of the low uptake. In contrast, MR images can visualize the location of the corpus striatum. The purpose of this study was to realize a quantitative image analysis of SPECT images by using an image registration technique with brain MR images that can determine the region of the corpus striatum. The SPECT and MR images were fused via the CT image acquired by the SPECT/CT scanner, using mutual information (MI) between the CT and MR images as the registration criterion. Six SPECT/CT and four MR scans of phantom materials were acquired at varying orientations. Of the resulting 24 registration combinations, 16 were registered to within 1.3 mm. When the approach was applied to 32 clinical SPECT/CT and MR cases, all cases were registered to within 0.86 mm. In conclusion, our registration method has potential for superimposing MR images on SPECT images.
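The mutual-information criterion used for CT-MR registration can be computed from a joint intensity histogram. A generic numpy sketch (bin count and test images are illustrative), not the authors' implementation:

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information (in nats) between two equally shaped images,
    estimated from their joint intensity histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                 # joint probability
    px = pxy.sum(axis=1, keepdims=True)       # marginal of a
    py = pxy.sum(axis=0, keepdims=True)       # marginal of b
    nz = pxy > 0                              # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
img = rng.random((64, 64))
mi_self = mutual_information(img, img)                    # high: image vs itself
mi_rand = mutual_information(img, rng.random((64, 64)))   # near zero: independent images
```

A registration loop would shift/rotate one image and keep the transform that maximizes this score.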

  3. SPECT-OPT multimodal imaging enables accurate evaluation of radiotracers for β-cell mass assessments

    PubMed Central

    Eter, Wael A.; Parween, Saba; Joosten, Lieke; Frielink, Cathelijne; Eriksson, Maria; Brom, Maarten; Ahlgren, Ulf; Gotthardt, Martin

    2016-01-01

    Single Photon Emission Computed Tomography (SPECT) has become a promising experimental approach to monitor changes in β-cell mass (BCM) during diabetes progression. SPECT imaging of pancreatic islets is most commonly cross-validated by stereological analysis of histological pancreatic sections after insulin staining. Typically, stereological methods do not accurately determine the total β-cell volume, which is inconvenient when correlating total pancreatic tracer uptake with BCM. Alternative methods are therefore warranted to cross-validate β-cell imaging using radiotracers. In this study, we introduce multimodal SPECT - optical projection tomography (OPT) imaging as an accurate approach to cross-validate radionuclide-based imaging of β-cells. Uptake of a promising radiotracer for β-cell imaging by SPECT, 111In-exendin-3, was measured by ex vivo SPECT and cross-evaluated by 3D quantitative OPT imaging as well as by histology in healthy and alloxan-treated Brown Norway rat pancreata. The SPECT signal showed an excellent linear correlation with the OPT data, better than with histology. While histological determination of islet spatial distribution was challenging, SPECT and OPT revealed similar distribution patterns of 111In-exendin-3 and insulin-positive β-cell volumes between different pancreatic lobes, both visually and quantitatively. We propose ex vivo SPECT-OPT multimodal imaging as a highly accurate strategy for validating the performance of β-cell radiotracers. PMID:27080529

  4. Quantitative I-123-IMP brain SPECT and neuropsychological testing in AIDS dementia

    SciTech Connect

    Kuni, C.C.; Rhame, F.S.; Meier, M.J.; Foehse, M.C.; Loewenson, R.B.; Lee, B.C.; Boudreau, R.J.; duCret, R.P. )

    1991-03-01

    We performed I-123-IMP SPECT brain imaging on seven mildly demented AIDS patients and seven normal subjects. In an attempt to detect and quantitate regions of decreased I-123-IMP uptake, pixel intensity histograms of normalized SPECT images at the basal ganglia level were analyzed for the fraction of pixels in the lowest quartile of the intensity range. This fraction (F) averaged 17.5% (S.D. = 4.6) in the AIDS group and 12.6% (S.D. = 5.1) in the normal group (p less than .05). Six of the AIDS patients underwent neuropsychological testing (NPT). NPT showed the patients to have a variety of mild abnormalities. Regression analysis of NPT scores versus F yielded a correlation coefficient of .80 (p less than .05). We conclude that analysis of I-123-IMP SPECT image pixel intensity distribution is potentially sensitive in detecting abnormalities associated with AIDS dementia and may correlate with the severity of dementia as measured by NPT.
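The fraction F defined above (pixels in the lowest quartile of the intensity range) reduces to a one-liner; the toy pixel values below are illustrative:

```python
import numpy as np

def low_quartile_fraction(pixels):
    """Fraction of pixels whose intensity lies in the lowest quarter
    of the intensity range (min..max) of the slice."""
    lo, hi = pixels.min(), pixels.max()
    threshold = lo + 0.25 * (hi - lo)
    return float(np.mean(pixels <= threshold))

# Toy normalized slice values; 3 of 6 fall at or below 0.25 of the range
slice_pixels = np.array([0.0, 0.1, 0.2, 0.5, 0.8, 1.0])
F = low_quartile_fraction(slice_pixels)
```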

  5. Patient-specific dosimetry based on quantitative SPECT imaging and 3D-DFT convolution

    SciTech Connect

    Akabani, G.; Hawkins, W.G.; Eckblade, M.B.; Leichner, P.K.

    1999-01-01

    The objective of this study was to validate the use of a 3D discrete Fourier transform (3D-DFT) convolution method to carry out I-131 dosimetry in soft tissue for radioimmunotherapy procedures. To validate this convolution method, mathematical and physical phantoms were used as a basis of comparison with Monte Carlo transport (MCT) calculations, which were carried out using the EGS4 system code. The mathematical phantom consisted of a sphere containing uniform and nonuniform activity distributions. The physical phantom consisted of a cylinder containing uniform and nonuniform activity distributions. Quantitative SPECT reconstruction was carried out using the circular harmonic transform (CHT) algorithm.
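The 3D-DFT convolution amounts to multiplying the Fourier transforms of the activity map and a voxel dose kernel. A minimal sketch with a toy kernel (actual I-131 voxel S values would come from published tables):

```python
import numpy as np

def dose_by_dft(activity, kernel):
    """Absorbed-dose map as the circular convolution of an activity map
    with a voxel dose kernel, via the 3D DFT."""
    return np.real(np.fft.ifftn(np.fft.fftn(activity) * np.fft.fftn(kernel)))

shape = (16, 16, 16)
activity = np.zeros(shape)
activity[8, 8, 8] = 1.0        # unit point source
kernel = np.zeros(shape)
kernel[0, 0, 0] = 0.9          # toy kernel: most energy deposited locally...
kernel[1, 0, 0] = 0.1          # ...some deposited one voxel away
dose = dose_by_dft(activity, kernel)
```

Note the DFT implies periodic boundary conditions, so in practice the arrays are zero-padded to avoid wrap-around.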

  6. Alzheimer disease: Quantitative analysis of I-123-iodoamphetamine SPECT brain imaging

    SciTech Connect

    Hellman, R.S.; Tikofsky, R.S.; Collier, B.D.; Hoffmann, R.G.; Palmer, D.W.; Glatt, S.L.; Antuono, P.G.; Isitman, A.T.; Papke, R.A.

    1989-07-01

    To enable a more quantitative diagnosis of senile dementia of the Alzheimer type (SDAT), the authors developed and tested a semiautomated method to define regions of interest (ROIs) to be used in quantitating results from single photon emission computed tomography (SPECT) of regional cerebral blood flow performed with N-isopropyl iodine-123-iodoamphetamine. SPECT/IMP imaging was performed in ten patients with probable SDAT and seven healthy subjects. Multiple ROIs were manually and semiautomatically generated, and uptake was quantitated for each ROI. Mean cortical activity was estimated as the average of the mean activity in 24 semiautomatically generated ROIs; mean cerebellar activity was determined from the mean activity in separate ROIs. A ratio of parietal to cerebellar activity less than 0.60 and a ratio of parietal to mean cortical activity less than 0.90 allowed correct categorization of nine of ten and eight of ten patients, respectively, with SDAT and all control subjects. The degree of diminished mental status observed in patients with SDAT correlated with both global and regional changes in IMP uptake.
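The two uptake-ratio criteria reduce to simple threshold rules (cutoffs from the abstract; the example activities are invented):

```python
def classify_sdat(parietal, cerebellar, mean_cortical):
    """Flag a scan as consistent with SDAT using the two uptake-ratio
    criteria above (cutoffs 0.60 and 0.90 from the study)."""
    return {
        "parietal_to_cerebellar": parietal / cerebellar < 0.60,
        "parietal_to_mean_cortical": parietal / mean_cortical < 0.90,
    }

# Hypothetical mean ROI activities (arbitrary units)
flags = classify_sdat(parietal=55.0, cerebellar=100.0, mean_cortical=70.0)
```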

  7. Assessing the Reliability of Quantitative Imaging of Sm-153

    NASA Astrophysics Data System (ADS)

    Poh, Zijie; Dagan, Maáyan; Veldman, Jeanette; Trees, Brad

    2013-03-01

    Samarium-153 is used for palliation of, and has recently been investigated as a therapy for, bone metastases. Patient-specific dosing of Sm-153 is based on quantitative single-photon emission computed tomography (SPECT) and requires knowledge of the accuracy and precision of image-based estimates of the in vivo activity distribution. Physical phantom studies are useful for estimating these in simple objects, but do not model realistic activity distributions. We are using realistic Monte Carlo simulations combined with a realistic digital phantom modeling human anatomy to assess the accuracy and precision of Sm-153 SPECT. Preliminary data indicate that we can simulate projection images and reconstruct them with compensation for various physical image-degrading factors, such as attenuation and scatter in the body as well as non-idealities in the imaging system, to provide realistic SPECT images.

  8. Evaluation of quantitative 90Y SPECT based on experimental phantom studies

    NASA Astrophysics Data System (ADS)

    Minarik, D.; Sjögreen Gleisner, K.; Ljungberg, M.

    2008-10-01

    In SPECT imaging of pure beta emitters, such as 90Y, the acquired spectrum is very complex, which increases the demands on the imaging protocol and the reconstruction. In this work, we have evaluated the quantitative accuracy of bremsstrahlung SPECT with focus on the reconstruction algorithm including model-based attenuation, scatter and collimator-detector response (CDR) compensations. The scatter and CDR compensation methods require pre-calculated point-spread functions, which were generated with the SIMIND MC program. The SIMIND program is dedicated to the simulation of scintillation camera imaging and handles only photons. The aim of this work was therefore twofold. The first aim was to implement simulation of bremsstrahlung imaging in the SIMIND code and to validate the simulations against experimental measurements. The second was to investigate the quality of bremsstrahlung SPECT imaging and to evaluate the possibility of quantifying the activity in differently shaped sources. In addition, a feasibility test was performed on a patient who underwent treatment with 90Y-Ibritumomab tiuxetan (Zevalin®). The MCNPX MC program was used to generate bremsstrahlung photon spectra, which were used as source input to the SIMIND program. The obtained bremsstrahlung spectra were separately validated by experimental measurement using a HPGe detector. Validation of the SIMIND-generated images was done by comparison to gamma camera measurements of a syringe containing 90Y. Results showed a slight deviation between simulations and measurements in image regions outside the source, but the agreement was sufficient for the purpose of generating scatter and CDR kernels. For the bremsstrahlung SPECT experiment, the RSD torso phantom with 90Y in the liver insert was measured with and without background activities. Projection data were obtained using a GE VH/Hawkeye system. Image reconstruction was performed by using the OSEM algorithm with and without different combinations of model

  9. Quantitative capabilities of four state-of-the-art SPECT-CT cameras

    PubMed Central

    2012-01-01

    Background Four state-of-the-art single-photon emission computed tomography-computed tomography (SPECT-CT) systems, namely Philips Brightview, General Electric Discovery NM/CT 670 and Infinia Hawkeye 4, and Siemens Symbia T6, were investigated in terms of accuracy of attenuation and scatter correction, contrast recovery for small hot and cold structures, and quantitative capabilities when using their dedicated three-dimensional iterative reconstruction with attenuation and scatter corrections and resolution recovery. Methods The National Electrical Manufacturers Association (NEMA) NU-2 1994 phantom with cold air, water, and Teflon inserts, and a homemade contrast phantom with hot and cold rods were filled with 99mTc and scanned. The acquisition parameters were chosen to provide adequate linear and angular sampling and high count statistics. The data were reconstructed using Philips Astonish, General Electric Evolution for Bone, or Siemens Flash3D, eight subsets, and a varying number of iterations. A procedure similar to the one used in positron emission tomography (PET) allowed us to obtain the factor to convert counts per pixel into activity per unit volume. Results Edge and oscillation artifacts were observed with all phantoms and all systems. At 30 iterations, the residual fraction in the inserts of the NEMA phantom fell below 3.5%. Contrast recovery increased with the number of iterations but became almost saturated at 24 iterations onwards. In the uniform part of the NEMA and contrast phantoms, a quantification error below 10% was achieved. Conclusions In objects whose dimensions exceeded the SPECT spatial resolution by several times, quantification seemed to be feasible within 10% error limits. A partial volume effect correction strategy remains necessary for the smallest structures. The reconstruction artifacts nevertheless remain a handicap on the road towards accurate quantification in SPECT and should be the focus of further work in reconstruction
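The PET-style counts-to-activity conversion mentioned above comes down to a single scale factor measured on a uniform phantom of known concentration; the variable names and values below are illustrative:

```python
import numpy as np

def calibration_factor(mean_counts_per_voxel, known_conc_bq_per_ml):
    """Bq/ml per reconstructed count, from a uniform region of a
    calibration phantom with known activity concentration."""
    return known_conc_bq_per_ml / mean_counts_per_voxel

def counts_to_activity(image_counts, factor):
    """Apply the calibration factor to a reconstructed count image."""
    return image_counts * factor

phantom_roi = np.full((8, 8), 250.0)                 # mean counts in a uniform ROI
f = calibration_factor(phantom_roi.mean(), 5000.0)   # phantom filled at 5 kBq/ml
patient_conc = counts_to_activity(np.array([125.0, 500.0]), f)
```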

  10. Simulation of realistic abnormal SPECT brain perfusion images: application in semi-quantitative analysis

    NASA Astrophysics Data System (ADS)

    Ward, T.; Fleming, J. S.; Hoffmann, S. M. A.; Kemp, P. M.

    2005-11-01

    Simulation is useful in the validation of functional image analysis methods, particularly when considering the number of analysis techniques currently available lacking thorough validation. Problems exist with current simulation methods due to long run times or unrealistic results making it problematic to generate complete datasets. A method is presented for simulating known abnormalities within normal brain SPECT images using a measured point spread function (PSF), and incorporating a stereotactic atlas of the brain for anatomical positioning. This allows for the simulation of realistic images through the use of prior information regarding disease progression. SPECT images of cerebral perfusion have been generated consisting of a control database and a group of simulated abnormal subjects that are to be used in a UK audit of analysis methods. The abnormality is defined in the stereotactic space, then transformed to the individual subject space, convolved with a measured PSF and removed from the normal subject image. The dataset was analysed using SPM99 (Wellcome Department of Imaging Neuroscience, University College, London) and the MarsBaR volume of interest (VOI) analysis toolbox. The results were evaluated by comparison with the known ground truth. The analysis showed improvement when using a smoothing kernel equal to system resolution over the slightly larger kernel used routinely. Significant correlation was found between effective volume of a simulated abnormality and the detected size using SPM99. Improvements in VOI analysis sensitivity were found when using the region median over the region mean. The method and dataset provide an efficient methodology for use in the comparison and cross validation of semi-quantitative analysis methods in brain SPECT, and allow the optimization of analysis parameters.
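The core simulation step (define an abnormality, blur it with the system PSF, remove it from a normal image) can be sketched as follows; a separable Gaussian stands in for the measured PSF, and all parameters are illustrative:

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable 2D Gaussian blur, a stand-in for the measured system PSF."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    tmp = np.apply_along_axis(np.convolve, 0, img, k, mode='same')
    return np.apply_along_axis(np.convolve, 1, tmp, k, mode='same')

def add_perfusion_deficit(normal_img, mask, severity, psf_sigma):
    """Remove a PSF-blurred fraction of counts inside `mask` from a normal image.

    severity: fractional count reduction (0-1) inside the abnormality.
    """
    deficit = gaussian_blur(mask.astype(float) * severity, psf_sigma)
    return normal_img * (1.0 - deficit)

normal = np.full((32, 32), 100.0)           # toy "normal" perfusion slice
lesion = np.zeros((32, 32), dtype=bool)
lesion[12:20, 12:20] = True                 # abnormality region (atlas-defined in the study)
abnormal = add_perfusion_deficit(normal, lesion, severity=0.3, psf_sigma=1.5)
```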

  11. Regularized reconstruction in quantitative SPECT using CT side information from hybrid imaging

    NASA Astrophysics Data System (ADS)

    Dewaraja, Yuni K.; Koral, Kenneth F.; Fessler, Jeffrey A.

    2010-05-01

    A penalized-likelihood (PL) SPECT reconstruction method using a modified regularizer that accounts for anatomical boundary side information was implemented to achieve accurate estimates of both the total target activity and the activity distribution within targets. In both simulations and experimental I-131 phantom studies, reconstructions from (1) penalized likelihood employing CT-side information-based regularization (PL-CT), (2) penalized likelihood with edge preserving regularization (no CT) and (3) penalized likelihood with conventional spatially invariant quadratic regularization (no CT) were compared with (4) ordered subset expectation maximization (OSEM), which is the iterative algorithm conventionally used in clinics for quantitative SPECT. Evaluations included phantom studies with perfect and imperfect side information and studies with uniform and non-uniform activity distributions in the target. For targets with uniform activity, the PL-CT images and profiles were closest to the 'truth', avoided the edge offshoots evident with OSEM and minimized the blurring across boundaries evident with regularization without CT information. Apart from visual comparison, reconstruction accuracy was evaluated using the bias and standard deviation (STD) of the total target activity estimate and the root mean square error (RMSE) of the activity distribution within the target. PL-CT reconstruction reduced both bias and RMSE compared with regularization without side information. When compared with unregularized OSEM, PL-CT reduced RMSE and STD while bias was comparable. For targets with non-uniform activity, these improvements with PL-CT were observed only when the change in activity was matched by a change in the anatomical image and the corresponding inner boundary was also used to control the regularization. In summary, the present work demonstrates the potential of using CT side information to obtain improved estimates of the activity distribution in targets without

  12. An analytical approach to quantitative reconstruction of non-uniform attenuated brain SPECT.

    PubMed

    Liang, Z; Ye, J; Harrington, D P

    1994-11-01

    An analytical approach to quantitative brain SPECT (single-photon-emission computed tomography) with non-uniform attenuation is developed. The approach formulates accurately the projection-transform equation as a summation of primary- and scatter-photon contributions. The scatter contribution can be estimated using the multiple-energy-window samples and removed from the primary-energy-window data by subtraction. The approach models the primary contribution as a convolution of the attenuated source and the detector-response kernel at a constant depth from the detector with the central-ray approximation. The attenuated Radon transform of the source can be efficiently deconvolved using the depth-frequency relation. The approach inverts exactly the attenuated Radon transform by Fourier transforms and series expansions. The performance of the analytical approach was studied for both uniform- and non-uniform-attenuation cases, and compared to the conventional FBP (filtered-backprojection) method by computer simulations. A patient brain X-ray image was acquired by a CT (computed-tomography) scanner and converted to the object-specific attenuation map for 140 keV energy. The mathematical Hoffman brain phantom was used to simulate the emission source and was resized such that it was completely surrounded by the skull of the CT attenuation map. The detector-response kernel was obtained from measurements of a point source at several depths in air from a parallel-hole collimator of a SPECT camera. The projection data were simulated from the object-specific attenuating source including the depth-dependent detector response. Quantitative improvement (>5%) in reconstructing the data was demonstrated with the nonuniform attenuation compensation, as compared to the uniform attenuation correction and the conventional FBP reconstruction. The computing time was less than 5 min on an HP/730 desktop computer for an image array of 128 x 128 x 32 from 128 projections of 128 x 32 size. PMID

  13. Quantitative High-Efficiency Cadmium-Zinc-Telluride SPECT with Dedicated Parallel-Hole Collimation System in Obese Patients: Results of a Multi-Center Study

    PubMed Central

    Nakazato, Ryo; Slomka, Piotr J.; Fish, Mathews; Schwartz, Ronald G.; Hayes, Sean W.; Thomson, Louise E.J.; Friedman, John D.; Lemley, Mark; Mackin, Maria L.; Peterson, Benjamin; Schwartz, Arielle M.; Doran, Jesse A.; Germano, Guido; Berman, Daniel S.

    2014-01-01

    Background Obesity is a common source of artifact on conventional SPECT myocardial perfusion imaging (MPI). We evaluated image quality and diagnostic performance of high-efficiency (HE) cadmium-zinc-telluride (CZT) parallel-hole SPECT-MPI for coronary artery disease (CAD) in obese patients. Methods and Results 118 consecutive obese patients at 3 centers (BMI 43.6±8.9 kg/m2, range 35–79.7 kg/m2) had upright/supine HE-SPECT and ICA >6 months (n=67) or low-likelihood of CAD (n=51). Stress quantitative total perfusion deficit (TPD) for upright (U-TPD), supine (S-TPD) and combined acquisitions (C-TPD) was assessed. Image quality (IQ; 5=excellent; <3 nondiagnostic) was compared among BMI 35–39.9 (n=58), 40–44.9 (n=24) and ≥45 (n=36) groups. ROC-curve area for CAD detection (≥50% stenosis) for U-TPD, S-TPD, and C-TPD were 0.80, 0.80, and 0.87, respectively. Sensitivity/specificity was 82%/57% for U-TPD, 74%/71% for S-TPD, and 80%/82% for C-TPD. C-TPD had highest specificity (P=.02). C-TPD normalcy rate was higher than U-TPD (88% vs. 75%, P=.02). Mean IQ was similar among BMI 35–39.9, 40–44.9 and ≥45 groups [4.6 vs. 4.4 vs. 4.5, respectively (P=.6)]. No patient had a non-diagnostic stress scan. Conclusions In obese patients, HE-SPECT MPI with dedicated parallel-hole collimation demonstrated high image quality, normalcy rate, and diagnostic accuracy for CAD by quantitative analysis of combined upright/supine acquisitions. PMID:25388380

  14. Post-reconstruction non-local means filtering methods using CT side information for quantitative SPECT

    NASA Astrophysics Data System (ADS)

    Chun, Se Young; Fessler, Jeffrey A.; Dewaraja, Yuni K.

    2013-09-01

    Quantitative SPECT techniques are important for many applications including internal emitter therapy dosimetry where accurate estimation of total target activity and activity distribution within targets are both potentially important for dose-response evaluations. We investigated non-local means (NLM) post-reconstruction filtering for accurate I-131 SPECT estimation of both total target activity and the 3D activity distribution. We first investigated activity estimation versus number of ordered-subsets expectation-maximization (OSEM) iterations. We performed simulations using the XCAT phantom with tumors containing a uniform and a non-uniform activity distribution, and measured the recovery coefficient (RC) and the root mean squared error (RMSE) to quantify total target activity and activity distribution, respectively. We observed that using more OSEM iterations is essential for accurate estimation of RC, but may or may not improve RMSE. We then investigated various post-reconstruction filtering methods to suppress noise at high iteration while preserving image details so that both RC and RMSE can be improved. Recently, NLM filtering methods have shown promising results for noise reduction. Moreover, NLM methods using high-quality side information can improve image quality further. We investigated several NLM methods with and without CT side information for I-131 SPECT imaging and compared them to conventional Gaussian filtering and to unfiltered methods. We studied four different ways of incorporating CT information in the NLM methods: two known (NLM CT-B and NLM CT-M) and two newly considered (NLM CT-S and NLM CT-H). We also evaluated the robustness of NLM filtering using CT information to erroneous CT. NLM CT-S and NLM CT-H yielded comparable RC values to unfiltered images while substantially reducing RMSE. NLM CT-S achieved -2.7 to 2.6% increase of RC compared to no filtering and NLM CT-H yielded up to 6% decrease in RC while other methods yielded lower RCs
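A minimal non-local means (NLM) filter, without CT side information, illustrates the principle behind these methods: each voxel is replaced by a weighted average of voxels whose surrounding patches look similar. Parameters here are illustrative; the CT-guided variants named above additionally weight by anatomical similarity.

```python
import numpy as np

def nlm_filter(img, patch=1, search=3, h=0.1):
    """Naive 2D non-local means: weights decay with the mean squared
    difference between patches; h controls the filtering strength."""
    pad = patch + search
    padded = np.pad(img, pad, mode='reflect')
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + pad, j + pad
            ref = padded[ci-patch:ci+patch+1, cj-patch:cj+patch+1]
            weights, values = [], []
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni-patch:ni+patch+1, nj-patch:nj+patch+1]
                    d2 = np.mean((ref - cand) ** 2)   # patch dissimilarity
                    weights.append(np.exp(-d2 / h**2))
                    values.append(padded[ni, nj])
            w = np.array(weights)
            out[i, j] = np.dot(w, values) / w.sum()
    return out

rng = np.random.default_rng(1)
noisy = 1.0 + 0.05 * rng.standard_normal((16, 16))   # flat region plus noise
smoothed = nlm_filter(noisy)
```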

  15. SPECT brain perfusion imaging with Tc-99m ECD: Semi-quantitative regional analysis and database mapping

    SciTech Connect

    Schiepers, C.; Hegge, J.; De Roo, M.

    1994-05-01

    Brain SPECT is a well-accepted method for the assessment of brain perfusion in various disorders such as epilepsy, stroke, and dementia. A program for handling the tomographic data was developed, using a commercial spreadsheet (Microsoft EXCEL) with a set of macros for analysis, graphic display, and database management of the final results.

  16. Improved dose-volume histogram estimates for radiopharmaceutical therapy by optimizing quantitative SPECT reconstruction parameters

    NASA Astrophysics Data System (ADS)

    Cheng, Lishui; Hobbs, Robert F.; Segars, Paul W.; Sgouros, George; Frey, Eric C.

    2013-06-01

    In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose-volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVHs estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator-detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. Smaller objects with fine details, such as the kidneys, required more iterations and less
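A cumulative DVH is straightforward to compute from a dose map and an organ mask: for each dose level, the fraction of the structure receiving at least that dose. A sketch with a toy dose distribution:

```python
import numpy as np

def cumulative_dvh(dose, volume_mask, n_bins=100):
    """Cumulative DVH: fraction of the masked structure receiving
    at least each dose level."""
    d = dose[volume_mask]
    levels = np.linspace(0.0, d.max(), n_bins)
    frac = np.array([(d >= lv).mean() for lv in levels])
    return levels, frac

dose = np.zeros((8, 8, 8))
mask = np.zeros(dose.shape, dtype=bool)
mask[2:6, 2:6, 2:6] = True                       # toy organ (64 voxels)
dose[mask] = np.linspace(1.0, 10.0, mask.sum())  # nonuniform dose inside it
levels, frac = cumulative_dvh(dose, mask)
```

Noise and partial volume effects in the activity image propagate directly into `dose` and hence distort this curve, which is the degradation the study quantifies.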

  17. Assessment of demented patients by dynamic SPECT of inhaled xenon-133

    SciTech Connect

    Komatani, A.; Yamaguchi, K.; Sugai, Y.; Takanashi, T.; Kera, M.; Shinohara, M.; Kawakatsu, S.

    1988-10-01

    We studied the potential for using dynamic single photon emission computed tomography of inhaled xenon-133 (133Xe) gas in the assessment of demented patients. An advanced ring-type single photon emission computed tomography (SPECT) system, HEADTOME, with improved spatial resolution (15 mm full width at half maximum (FWHM)) was used for tomographic measurement of regional cerebral blood flow (rCBF). All 34 patients underwent a detailed psychiatric examination and an x-ray computed tomography scan, and matched research criteria for Alzheimer's disease (n = 13), senile dementia of the Alzheimer type (n = 9), or multi-infarct dementia (n = 12). In comparison with a senile control group (n = 7), mean CBF of both the whole brain and the temporo-parietal region was significantly lower in the Alzheimer's disease and senile dementia of the Alzheimer type groups, but no significant difference was seen between the senile control group and the multi-infarct dementia group. The correlation was 0.72 (p less than 0.004) between the mean CBF of the whole brain and the score on Hasegawa's Dementia Scale, and 0.94 (p less than 0.0001) between rCBF of the temporo-parietal region and the scale in Alzheimer's disease. In the senile dementia of the Alzheimer type group, the correlations were 0.77 (p less than 0.01) and 0.83 (p less than 0.004), respectively. No significant correlations were found in the multi-infarct dementia group. A temporo-parietal reduction in rCBF distribution, characteristic of the Alzheimer's disease group, and a patchy whole-brain reduction, characteristic of the multi-infarct dementia group, were detected. The ability of our improved SPECT to provide both quantitative measurement of rCBF and characteristic rCBF distribution patterns makes it a promising tool for research or routine examination of demented patients.

  18. CT-based quantitative SPECT for the radionuclide ²⁰¹Tl: experimental validation and a standardized uptake value for brain tumour patients.

    PubMed

    Willowson, Kathy; Bailey, Dale; Schembri, Geoff; Baldock, Clive

    2012-01-01

    We have previously reported on a method for reconstructing quantitative data from 99mTc single photon emission computed tomography (SPECT) images based on corrections derived from X-ray computed tomography, producing accurate results in both experimental and clinical studies. This has been extended for use with the radionuclide ²⁰¹Tl. Accuracy was evaluated with experimental phantom studies, including corrections for partial volume effects where necessary. The quantitative technique was used to derive standardized uptake values (SUVs) for ²⁰¹Tl evaluation of brain tumours. A preliminary study was performed on 26 patients using ²⁰¹Tl SPECT scans to assess residual tumor after surgery and then to monitor response to treatment, with a follow-up time of 18 months. Measures of SUVmax were made following quantitative processing of the data and using a threshold grown volume of interest around the tumour. Phantom studies resulted in the calculation of concentration values consistently within 4% of true values. No continuous relation was found between SUVmax (post-resection) and patient survival. Choosing an SUVmax cut-off of 1.5 demonstrated a difference in survival between the 2 groups of patients after surgery. Patients with an SUVmax<1.5 had a 70% survival rate over the first 10 months, compared with a 47% survival rate for those with SUVmax>1.5. This difference did not achieve significance, most likely due to the small study numbers. By 18 months follow-up this difference had reduced, with corresponding survival rates of 40% and 27%, respectively. Although this study involves only a small cohort, it has succeeded in demonstrating the possibility of an SUV measure for SPECT to help monitor response to treatment of brain tumours and predict survival. PMID:22375306
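    The SUVmax reported above is the maximum of voxelwise standardized uptake values inside the threshold-grown volume of interest. The underlying body-weight SUV definition can be sketched as follows; all numbers are illustrative and assume a tissue density of about 1 g/mL, not values from the study:

    ```python
    def suv(act_conc_kbq_per_ml, injected_kbq, body_weight_g):
        """Body-weight standardized uptake value: measured activity
        concentration normalized to injected activity per gram of tissue
        (assumes ~1 g/mL tissue density)."""
        return act_conc_kbq_per_ml / (injected_kbq / body_weight_g)

    # Illustrative: 5 kBq/mL in a tumour voxel, 74 MBq injected, 70 kg patient
    print(suv(5.0, 74000.0, 70000.0))  # ≈ 4.73
    ```

    SUVmax over a VOI is then simply the maximum of this quantity across the VOI voxels, which is what the 1.5 cut-off above is applied to.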

  20. An automated voxelized dosimetry tool for radionuclide therapy based on serial quantitative SPECT/CT imaging

    SciTech Connect

    Jackson, Price A.; Kron, Tomas; Beauregard, Jean-Mathieu; Hofman, Michael S.; Hogg, Annette; Hicks, Rodney J.

    2013-11-15

Purpose: To create an accurate map of the distribution of radiation dose deposition in healthy and target tissues during radionuclide therapy. Methods: Serial quantitative SPECT/CT images were acquired at 4, 24, and 72 h for 28 ¹⁷⁷Lu-octreotate peptide receptor radionuclide therapy (PRRT) administrations in 17 patients with advanced neuroendocrine tumors. Deformable image registration was combined with an in-house programming algorithm to interpolate pharmacokinetic uptake and clearance at the voxel level. The resultant cumulated activity image series comprise values representing the total number of decays within each voxel's volume. For PRRT, cumulated activity was translated to absorbed dose based on Monte Carlo-determined voxel S-values at a combination of long and short ranges. These dosimetric image sets were compared for mean radiation absorbed dose to at-risk organs using a conventional MIRD protocol (OLINDA 1.1). Results: Absorbed dose values to solid organs (liver, kidneys, and spleen) were within 10% using both techniques. Dose estimates to marrow were greater using the voxelized protocol, attributed to the software incorporating crossfire effect from nearby tumor volumes. Conclusions: The technique presented offers an efficient, automated tool for PRRT dosimetry based on serial post-therapy imaging. Following retrospective analysis, this method of high-resolution dosimetry may allow physicians to prescribe activity based on required dose to tumor volume or radiation limits to healthy tissue in individual patients.
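    The per-voxel cumulated activity (total decays) described above is, in essence, a time integral over the serial scans plus a tail after the last time point. A minimal sketch, assuming trapezoidal integration between the 4, 24 and 72 h scans and physical decay only after the last scan (function names and sample values are illustrative, not the authors' implementation):

    ```python
    import numpy as np

    T_HALF_LU177_H = 6.647 * 24.0             # 177Lu physical half-life, hours
    DECAY_CONST = np.log(2) / T_HALF_LU177_H  # decay constant, 1/h

    def cumulated_activity(times_h, activity_bq):
        """Cumulated activity per voxel in Bq*h: trapezoid over the measured
        time points, plus an analytic tail assuming physical decay only
        after the last scan (integral of A_last * exp(-lambda*t))."""
        times_h = np.asarray(times_h, float)
        activity_bq = np.asarray(activity_bq, float)
        measured = np.sum((activity_bq[1:] + activity_bq[:-1]) / 2.0
                          * np.diff(times_h))
        tail = activity_bq[-1] / DECAY_CONST
        return measured + tail  # multiply by 3600 to get total decays

    # Illustrative uptake/clearance samples at the three scan times
    print(cumulated_activity([4.0, 24.0, 72.0], [1.0e6, 8.0e5, 4.0e5]))
    ```

    Multiplying the resulting decay count by a voxel S-value kernel then yields absorbed dose, as in the paper.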

  1. Comparison of the scanning linear estimator (SLE) and ROI methods for quantitative SPECT imaging

    NASA Astrophysics Data System (ADS)

    Könik, Arda; Kupinski, Meredith; Hendrik Pretorius, P.; King, Michael A.; Barrett, Harrison H.

    2015-08-01

In quantitative emission tomography, tumor activity is typically estimated from calculations on a region of interest (ROI) identified in the reconstructed slices. In these calculations, unpredictable bias arising from the null functions of the imaging system affects ROI estimates. The magnitude of this bias depends upon the tumor size and location. In prior work it has been shown that the scanning linear estimator (SLE), which operates on the raw projection data, is an unbiased estimator of activity when the size and location of the tumor are known. In this work, we performed analytic simulation of SPECT imaging with a parallel-hole medium-energy collimator. Distance-dependent system spatial resolution and non-uniform attenuation were included in the imaging simulation. We compared the task of activity estimation by the ROI and SLE methods for a range of tumor sizes (diameter: 1-3 cm) and activities (contrast ratio: 1-10) added to uniform and non-uniform liver backgrounds. Using the correct value for the tumor shape and location is an idealized approximation to how task estimation would occur clinically. Thus we determined how perturbing this idealized prior knowledge impacted the performance of both techniques. To implement the SLE for the non-uniform background, we used a novel iterative algorithm for pre-whitening stationary noise within a compact region. Estimation task performance was compared using the ensemble mean-squared error (EMSE) as the criterion. The SLE method performed substantially better than the ROI method (i.e., EMSE(SLE) was 23-174 times lower) when the background was uniform and the tumor location and size were known accurately. The variance of the SLE increased when a non-uniform liver texture was introduced, but the EMSE(SLE) remained 5-20 times lower than that of the ROI method. In summary, the SLE outperformed the ROI method under almost all conditions that we tested.
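    The EMSE criterion used above is the mean squared deviation of the estimates from the true activity over an ensemble of noise realizations, which decomposes into variance plus squared bias. A hedged illustration with synthetic estimators (not the paper's SPECT simulation) showing how a biased but low-variance estimator can still lose on EMSE:

    ```python
    import numpy as np

    def emse(estimates, truth):
        """Ensemble mean-squared error over noise realizations:
        EMSE = variance + bias**2."""
        estimates = np.asarray(estimates, float)
        return float(np.mean((estimates - truth) ** 2))

    rng = np.random.default_rng(0)
    truth = 10.0
    # "ROI-like": 20% negative bias (e.g. null-function loss), moderate noise
    roi = 0.8 * truth + rng.normal(0.0, 0.5, 10000)
    # "SLE-like": unbiased, slightly noisier
    sle = truth + rng.normal(0.0, 0.6, 10000)
    print(emse(roi, truth) > emse(sle, truth))  # True: the bias term dominates
    ```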

  2. MIRD Pamphlet No. 26: Joint EANM/MIRD Guidelines for Quantitative 177Lu SPECT Applied for Dosimetry of Radiopharmaceutical Therapy.

    PubMed

    Ljungberg, Michael; Celler, Anna; Konijnenberg, Mark W; Eckerman, Keith F; Dewaraja, Yuni K; Sjögreen-Gleisner, Katarina; Bolch, Wesley E; Brill, A Bertrand; Fahey, Frederic; Fisher, Darrell R; Hobbs, Robert; Howell, Roger W; Meredith, Ruby F; Sgouros, George; Zanzonico, Pat; Bacher, Klaus; Chiesa, Carlo; Flux, Glenn; Lassmann, Michael; Strigari, Lidia; Walrand, Stephan

    2016-01-01

    The accuracy of absorbed dose calculations in personalized internal radionuclide therapy is directly related to the accuracy of the activity (or activity concentration) estimates obtained at each of the imaging time points. MIRD Pamphlet no. 23 presented a general overview of methods that are required for quantitative SPECT imaging. The present document is next in a series of isotope-specific guidelines and recommendations that follow the general information that was provided in MIRD 23. This paper focuses on (177)Lu (lutetium) and its application in radiopharmaceutical therapy. PMID:26471692

  4. A 3-Dimensional Absorbed Dose Calculation Method Based on Quantitative SPECT for Radionuclide Therapy: Evaluation for 131I Using Monte Carlo Simulation

    PubMed Central

    Ljungberg, Michael; Sjögreen, Katarina; Liu, Xiaowei; Frey, Eric; Dewaraja, Yuni; Strand, Sven-Erik

    2009-01-01

A general method is presented for patient-specific 3-dimensional absorbed dose calculations based on quantitative SPECT activity measurements. Methods: The computational scheme includes a method for registration of the CT image to the SPECT image and position-dependent compensation for attenuation, scatter, and collimator detector response performed as part of an iterative reconstruction method. A method for conversion of the measured activity distribution to a 3-dimensional absorbed dose distribution, based on the EGS4 (electron-gamma shower, version 4) Monte Carlo code, is also included. The accuracy of the activity quantification and the absorbed dose calculation is evaluated on the basis of realistic Monte Carlo-simulated SPECT data, using the SIMIND (simulation of imaging nuclear detectors) program and a voxel-based computer phantom. CT images are obtained from the computer phantom, and realistic patient movements are added relative to the SPECT image. The SPECT-based activity concentration and absorbed dose distributions are compared with the true ones. Results: Correction could be made for object scatter, photon attenuation, and scatter penetration in the collimator. However, inaccuracies were imposed by the limited spatial resolution of the SPECT system, for which the collimator response correction did not fully compensate. Conclusion: The presented method includes compensation for most parameters degrading the quantitative image information. The compensation methods are based on physical models and therefore are generally applicable to other radionuclides. The proposed evaluation methodology may be used as a basis for future intercomparison of different methods. PMID:12163637
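    The activity-to-dose step described above is, at its core, a 3-D convolution of the cumulated-activity (decay) map with a dose point kernel. A minimal sketch; the 3×3×3 kernel values below are placeholders for illustration, not the paper's EGS4-derived ¹³¹I kernel:

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    # Placeholder dose point kernel (Gy per decay): self-dose at the centre
    # voxel, a small cross-dose to neighbours. A real kernel comes from
    # Monte Carlo transport (EGS4 in the paper).
    kernel = np.full((3, 3, 3), 1e-12)
    kernel[1, 1, 1] = 1e-10

    cumulated = np.zeros((16, 16, 16))  # total decays per voxel
    cumulated[8, 8, 8] = 1e9            # one hot voxel for the demo

    # Absorbed dose map (Gy per voxel) by kernel convolution
    dose = fftconvolve(cumulated, kernel, mode="same")
    print(round(dose[8, 8, 8], 4), round(dose[8, 8, 9], 6))
    ```

    In practice the kernel is radionuclide-specific and the convolution runs over the full registered SPECT/CT volume.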

  5. System design and development of a pinhole SPECT system for quantitative functional imaging of small animals.

    PubMed

    Aoi, Toshiyuki; Zeniya, Tsutomu; Watabe, Hiroshi; Deloar, Hossain M; Matsuda, Tetsuya; Iida, Hidehiro

    2006-04-01

Recently, small-animal imaging by pinhole SPECT has been widely investigated by several researchers. We developed a pinhole SPECT system specially designed for small-animal imaging. The system consists of a rotation unit for the animal and a SPECT camera fitted with a pinhole collimator. To acquire complete projection data, the system uses two orbits, at angles of 90 degrees and 45 degrees with respect to the object. The position of the SPECT camera is kept fixed and the animal is rotated, avoiding misalignment of the center of rotation (COR). We implemented a three-dimensional OSEM algorithm for reconstructing the data acquired from both orbits. A point-source experiment revealed no significant COR misalignment with the proposed system, and experiments with a line phantom clearly indicated that our system minimized COR misalignment. We performed a study of a rat with 99mTc-HMDP, a bone-scan agent, and demonstrated a dramatic improvement in spatial resolution and uniformity achieved by our system in comparison with the conventional Feldkamp algorithm using one set of orbital data.

  6. In vivo Tumor Grading of Prostate Cancer using Quantitative 111In-Capromab Pendetide SPECT/CT

    PubMed Central

    Seo, Youngho; Aparici, Carina Mari; Cooperberg, Matthew R.; Konety, Badrinath R.; Hawkins, Randall A.

    2010-01-01

-based PVE correction could recover true tracer concentrations in volumes as small as 7.77 ml up to 90% in phantom measurements. From patient studies, there was a statistically significant correlation (ρ = 0.71, P = 0.033) between higher AUVs (from either left or right lobe) and higher components of pathologic Gleason scores. Conclusion: Our results strongly indicate noninvasive prostate tumor grading potential using quantitative 111In-capromab pendetide SPECT/CT for prostate cancer evaluation. PMID:20008977

  7. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  8. Regional cerebral blood flow imaging: A quantitative comparison of technetium-99m-HMPAO SPECT with C15O2 PET

    SciTech Connect

Gemmell, H.G.; Evans, N.T.; Besson, J.A.; Roeda, D.; Davidson, J.; Dodd, M.G.; Sharp, P.F.; Smith, F.W.; Crawford, J.R.; Newton, R.H.

    1990-10-01

The aim of this study was to compare technetium-99m-hexamethylpropyleneamineoxime (99mTc-HMPAO) single-photon emission computed tomography (SPECT) with regional cerebral blood flow (rCBF) imaging using positron emission tomography (PET). As investigation of dementia is likely to be one of the main uses of routine rCBF imaging, 18 demented patients were imaged with both techniques. The PET data were compared quantitatively with three versions of the SPECT data. These were, first, data normalized to the SPECT cerebellar uptake, second, data linearly corrected using the PET cerebellar value and, finally, data Lassen corrected for washout from the high-flow areas. Both the linearly-corrected (r = 0.81) and the Lassen-corrected (r = 0.79) HMPAO SPECT data showed good correlation with the PET rCBF data. The relationship between the normalized HMPAO SPECT data and the PET data was nonlinear. It is not yet possible to obtain rCBF values in absolute units from HMPAO SPECT without knowledge of the true rCBF in one reference region for each patient.
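    The Lassen correction mentioned above linearizes the HMPAO uptake-flow relationship, which saturates at high flow. A minimal sketch of the standard linearization F/F_ref = α·x / (1 + α − x), where x is the measured uptake ratio (region/reference) and α ≈ 1.5 is a commonly quoted value assumed here, not taken from this study:

    ```python
    def lassen_correct(uptake_ratio, alpha=1.5):
        """Lassen linearization: map a measured HMPAO uptake ratio
        (region/reference) to a flow ratio, compensating tracer washout /
        back-diffusion in high-flow regions. alpha ~ 1.5 is an assumed,
        commonly quoted value for HMPAO."""
        return alpha * uptake_ratio / (1.0 + alpha - uptake_ratio)

    print(lassen_correct(1.0))  # 1.0: the reference region is unchanged
    print(lassen_correct(1.3))  # 1.625: high measured uptake is expanded
    ```

    Note that the correction only rescales relative values; as the abstract states, absolute rCBF still requires a known true flow in one reference region.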

  9. Assessment of SPM in perfusion brain SPECT studies. A numerical simulation study using bootstrap resampling methods.

    PubMed

    Pareto, Deborah; Aguiar, Pablo; Pavía, Javier; Gispert, Juan Domingo; Cot, Albert; Falcón, Carles; Benabarre, Antoni; Lomeña, Francisco; Vieta, Eduard; Ros, Domènec

    2008-07-01

    Statistical parametric mapping (SPM) has become the technique of choice to statistically evaluate positron emission tomography (PET), functional magnetic resonance imaging (fMRI), and single photon emission computed tomography (SPECT) functional brain studies. Nevertheless, only a few methodological studies have been carried out to assess the performance of SPM in SPECT. The aim of this paper was to study the performance of SPM in detecting changes in regional cerebral blood flow (rCBF) in hypo- and hyperperfused areas in brain SPECT studies. The paper seeks to determine the relationship between the group size and the rCBF changes, and the influence of the correction for degradations. The assessment was carried out using simulated brain SPECT studies. Projections were obtained with Monte Carlo techniques, and a fan-beam collimator was considered in the simulation process. Reconstruction was performed by using the ordered subsets expectation maximization (OSEM) algorithm with and without compensation for attenuation, scattering, and spatial variant collimator response. Significance probability maps were obtained with SPM2 by using a one-tailed two-sample t-test. A bootstrap resampling approach was used to determine the sample size for SPM to detect the between-group differences. Our findings show that the correction for degradations results in a diminution of the sample size, which is more significant for small regions and low-activation factors. Differences in sample size were found between hypo- and hyperperfusion. These differences were larger for small regions and low-activation factors, and when no corrections were included in the reconstruction algorithm.
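    The bootstrap sample-size idea above can be sketched as follows: resample each group at a candidate group size and count how often a two-sample t-test reaches significance. All data here are synthetic and the function is a simplified stand-in, not the authors' SPM/Monte Carlo pipeline:

    ```python
    import numpy as np
    from scipy.stats import ttest_ind

    def detection_power(group_a, group_b, n, n_boot=1000, alpha=0.05, seed=0):
        """Bootstrap estimate of the probability that a two-sample t-test
        detects the between-group difference with n subjects per group."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(n_boot):
            a = rng.choice(group_a, size=n, replace=True)
            b = rng.choice(group_b, size=n, replace=True)
            if ttest_ind(a, b).pvalue < alpha:
                hits += 1
        return hits / n_boot

    rng = np.random.default_rng(1)
    control = rng.normal(100.0, 10.0, 50)  # synthetic regional values, controls
    hypo = rng.normal(90.0, 10.0, 50)      # synthetic 10% hypoperfusion
    print(detection_power(control, hypo, n=8),
          detection_power(control, hypo, n=30))
    ```

    Sweeping n until the estimated power crosses a target (e.g. 0.8) gives the required sample size, mirroring how the degradation corrections in the paper shift that crossing point.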

  10. Towards quantitative assessment of calciphylaxis

    NASA Astrophysics Data System (ADS)

Deserno, Thomas M.; Sárándi, István; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent

    2014-03-01

Calciphylaxis is a rare disease with devastating complications and high morbidity and mortality. It is characterized by systemic medial calcification of the arteries yielding necrotic skin ulcerations. In this paper, we aim at supporting the establishment of multi-center registries for calciphylaxis, which include photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions, using different equipment and photographers, cannot be compared quantitatively. For normalization, we use a simple color pad that is placed into the field of view, segmented from the image, and analyzed by its color fields. In total, 24 colors are printed on the pad. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images from two sets of repeated captures of the same necrosis. The variability of quantitative measurements based on free-hand photography is assessed with regard to geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced. The coefficients of variation are 5-20% and 2-10% for geometry and color, respectively. Hence, quantitative assessment of calciphylaxis becomes practicable and will support a better understanding of this rare but fatal disease.
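    The least-squares affine colour transform over the 24 pad patches reduces to an ordinary linear solve. A sketch with synthetic patch values (the real pad colours and the authors' segmentation step are not reproduced here):

    ```python
    import numpy as np

    def fit_affine_color_transform(measured, reference):
        """Fit an affine RGB map (3x3 matrix plus offset) from measured to
        reference colours by least squares over the pad patches."""
        measured = np.asarray(measured, float)
        ones = np.ones((measured.shape[0], 1))
        A = np.hstack([measured, ones])  # (N, 4) design matrix with offset term
        coeffs, *_ = np.linalg.lstsq(A, np.asarray(reference, float), rcond=None)
        return coeffs  # (4, 3)

    def apply_transform(coeffs, rgb):
        rgb = np.asarray(rgb, float)
        return np.hstack([rgb, np.ones((rgb.shape[0], 1))]) @ coeffs

    # Synthetic check: recover a known colour cast (channel gains + offset)
    rng = np.random.default_rng(1)
    ref = rng.uniform(0, 255, (24, 3))                 # 24 reference patch colours
    meas = ref * [0.9, 1.1, 0.8] + [5.0, -3.0, 10.0]   # simulated camera cast
    c = fit_affine_color_transform(meas, ref)
    print(np.allclose(apply_transform(c, meas), ref))  # True
    ```

    Applying the fitted transform to the whole photograph then normalizes the necrosis colours across centers and cameras.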

  11. SU-C-201-02: Quantitative Small-Animal SPECT Without Scatter Correction Using High-Purity Germanium Detectors

    SciTech Connect

    Gearhart, A; Peterson, T; Johnson, L

    2015-06-15

Purpose: To evaluate the impact of the exceptional energy resolution of germanium detectors for preclinical SPECT in comparison with conventional detectors. Methods: A cylindrical water phantom was created in GATE with a spherical Tc-99m source in the center. Sixty-four projections over 360 degrees using a pinhole collimator were simulated. The same phantom was simulated using air instead of water to establish the true reconstructed voxel intensity without attenuation. Attenuation correction based on the Chang method was performed on MLEM reconstructed images from the water phantom to determine a quantitative measure of the effectiveness of the attenuation correction. Similarly, a NEMA phantom was simulated, and the effectiveness of the attenuation correction was evaluated. Both simulations were carried out using both NaI detectors with an energy resolution of 10% FWHM and Ge detectors with an energy resolution of 1%. Results: Analysis shows that attenuation correction without scatter correction using germanium detectors can reconstruct a small spherical source to within 3.5%. Scatter analysis showed that for standard-sized objects in a preclinical scanner, a NaI detector has a scatter-to-primary ratio between 7% and 12.5%, compared with between 0.8% and 1.5% for a Ge detector. Preliminary results from line profiles through the NEMA phantom suggest that applying attenuation correction without scatter correction provides acceptable results for the Ge detectors but overestimates the phantom activity using NaI detectors. Due to the decreased scatter, we believe that the spillover ratio for the air and water cylinders in the NEMA phantom will be lower using germanium detectors than NaI detectors. Conclusion: This work indicates that the superior energy resolution of germanium detectors allows fewer scattered photons to be included within the energy window compared with traditional SPECT detectors. This may allow for quantitative SPECT without implementing scatter correction.
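    First-order Chang attenuation correction, as referenced above, divides each reconstructed voxel by the attenuation factor averaged over all projection angles through that voxel. A minimal sketch for the centre of a uniform water cylinder, where every ray path to the surface has the same length (the μ value is an approximate figure for 140 keV in water, assumed here):

    ```python
    import numpy as np

    MU_WATER_140KEV = 0.15  # approx. linear attenuation coeff. (1/cm), Tc-99m

    def chang_factor(path_lengths_cm, mu=MU_WATER_140KEV):
        """First-order Chang correction for one voxel: reciprocal of the
        attenuation factor exp(-mu*l) averaged over projection angles."""
        att = np.exp(-mu * np.asarray(path_lengths_cm, float))
        return 1.0 / att.mean()

    # Centre of a 10 cm radius uniform water cylinder: every one of the
    # 64 projection rays traverses 10 cm of water
    print(chang_factor(np.full(64, 10.0)))  # exp(0.15*10) ≈ 4.48
    ```

    Off-centre voxels get angle-dependent path lengths, so the averaging matters; the method is exact only for a uniform attenuator and a point-like distribution, hence "first-order".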

  12. Gamma camera calibration and validation for quantitative SPECT imaging with (177)Lu.

    PubMed

    D'Arienzo, M; Cazzato, M; Cozzella, M L; Cox, M; D'Andrea, M; Fazio, A; Fenwick, A; Iaccarino, G; Johansson, L; Strigari, L; Ungania, S; De Felice, P

    2016-06-01

In recent years (177)Lu has received considerable attention from the clinical nuclear medicine community thanks to its wide range of applications in molecular radiotherapy, especially in peptide-receptor radionuclide therapy (PRRT). In addition to short-range beta particles, (177)Lu emits low-energy gamma radiation at 113 keV and 208 keV that allows gamma camera quantitative imaging. Although quantitative cancer imaging in molecular radiotherapy has been proven a key instrument for the assessment of therapeutic response, at present no generally accepted clinical quantitative imaging protocol exists, and absolute quantification studies are usually based on individual initiatives. The aim of this work was to develop and evaluate an approach to gamma camera calibration for absolute quantification in tomographic imaging with (177)Lu. We assessed the gamma camera calibration factors for a Philips IRIX and a Philips AXIS gamma camera system using various reference geometries, both in air and in water. Images were corrected for the major effects that contribute to image degradation, i.e. attenuation, scatter and dead-time. We validated our method in non-reference geometry using an anthropomorphic torso phantom with the liver cavity uniformly filled with (177)LuCl3. Our results showed that calibration factors depend on the particular reference condition. In general, acquisitions performed with the IRIX gamma camera provided good results at 208 keV, with agreement within 5% for all geometries. The use of a Jaszczak 16 mL hollow sphere in water provided calibration factors capable of recovering the activity in anthropomorphic geometry within 1% for the 208 keV peak, for both gamma cameras. The point source provided the poorest results, most likely because scatter and attenuation correction are not incorporated in the calibration factor. However, for both gamma cameras all geometries provided calibration factors capable of recovering the activity in
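    The calibration described above reduces to a sensitivity factor measured in a known reference geometry and then applied in reverse to attenuation-, scatter- and dead-time-corrected patient counts. A minimal sketch with illustrative numbers only (not the paper's measured factors):

    ```python
    def calibration_factor(count_rate_cps, known_activity_mbq):
        """Camera sensitivity (cps/MBq) from a reference-geometry acquisition
        of a known (177)Lu activity, after image-degradation corrections."""
        return count_rate_cps / known_activity_mbq

    def quantify(count_rate_cps, cf_cps_per_mbq):
        """Recover absolute activity (MBq) from corrected image count rate."""
        return count_rate_cps / cf_cps_per_mbq

    # Illustrative: 1200 cps measured from a 100 MBq reference at 208 keV
    cf = calibration_factor(1200.0, 100.0)  # 12 cps/MBq
    print(quantify(600.0, cf))              # 50.0 MBq
    ```

    The paper's point is that cf depends on the reference geometry chosen, which is why the in-water hollow sphere transferred best to the anthropomorphic phantom.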

  14. Quantitative MRI Assessment of Leukoencephalopathy

    PubMed Central

    Reddick, Wilburn E.; Glass, John O.; Langston, James W.; Helton, Kathleen J.

    2008-01-01

Quantitative MRI assessment of leukoencephalopathy is difficult because the MRI properties of leukoencephalopathy significantly overlap those of normal tissue. This report describes the use of an automated procedure for longitudinal measurement of tissue volume and relaxation times to quantify leukoencephalopathy. Images derived by using this procedure in patients undergoing therapy for acute lymphoblastic leukemia (ALL) are presented. Five examinations from each of five volunteers (25 examinations) were used to test the reproducibility of the quantified baseline and subsequent, normal-appearing images; the coefficients of variation were less than 2% for gray and white matter. Regions of leukoencephalopathy in patients were assessed by comparison with manual segmentation. Two radiologists manually segmented images from 15 randomly chosen MRI examinations that exhibited leukoencephalopathy. Kappa analyses showed that the two radiologists' interpretations were concordant (κ = 0.70) and that each radiologist's interpretations agreed with the results of the automated procedure (κ = 0.57 and 0.55). The clinical application of this method was illustrated by analysis of images from sequential MR examinations of two patients who developed leukoencephalopathy during treatment for ALL. The ultimate goal is to use these quantitative MR imaging measures to better understand therapy-induced neurotoxicity, which can be limited or even reversed with some combination of therapy adjustments and pharmacological and neurobehavioral interventions. PMID:11979570

  15. Three-Dimensional Dosimetric Analysis and Quantitative Bremsstrahlung SPECT Imaging for Treatment of Non-Resectable Pancreatic Cancer Using Colloidal Phosphorus-32.

    NASA Astrophysics Data System (ADS)

    Parsai, E. Ishmael

    1995-01-01

    Current methods of calculating absorbed dose in tissue from beta-emitting radiopharmaceuticals yield only estimates of the average dose and cannot be used for dose mapping of bremsstrahlung SPECT images. The present work describes a clinically applicable methodology that can be used to determine the 3-D absorbed dose distribution from bremsstrahlung SPECT images for patients undergoing infusional brachytherapy. The radiopharmaceutical used in this study was colloidal P-32; however, other beta emitters can be used with this method. Calibration curves were generated from phantom studies to determine the activity per voxel from the attenuation-corrected measured counts per voxel. The cumulative activity at each voxel position was converted to dose (Gy) using a Monte Carlo based P-32 point dose kernel calculation in water. Two-dimensional isodose distributions were then generated and projected on the reconstructed SPECT slices. This technique was further extended to calculate the quantitative dose for the entire volume, and iso-surface dose distributions were generated in 3-D from bremsstrahlung SPECT data. In addition, to calculate the dose rate or accumulated dose at any depth from a given activity, a computer program based on the modified Loevinger point function was developed. This program calculates the dose in two ways: (1) through a closed solution for the spherical geometry by integration of the function over small spherical volumes, or (2) by applying the revised parameters of the modified Loevinger function. A practical and clinically feasible technique was developed for 3-D image co-registration between CT and SPECT for direct anatomic confirmation of the correlation between the region of the P-32 activity distribution and the anatomic site of injection. The method provides the correlation of the body contours obtained from bremsstrahlung SPECT data with corresponding contours from CT. A 3-D surface was first generated by mapping the iso-counts in the SPECT

  16. Systolic and diastolic assessment by 3D-ASM segmentation of gated-SPECT Studies: a comparison with MRI

    NASA Astrophysics Data System (ADS)

    Tobon-Gomez, C.; Bijnens, B. H.; Huguet, M.; Sukno, F.; Moragas, G.; Frangi, A. F.

    2009-02-01

    Gated single photon emission tomography (gSPECT) is a well-established technique used routinely in clinical practice. It can be employed to evaluate global left ventricular (LV) function of a patient. The purpose of this study is to assess LV systolic and diastolic function from gSPECT datasets in comparison with cardiac magnetic resonance imaging (CMR) measurements. This is achieved by applying our recently implemented 3D active shape model (3D-ASM) segmentation approach for gSPECT studies. This methodology allows for generation of 3D LV meshes for all cardiac phases, providing volume time curves and filling rate curves. Both systolic and diastolic functional parameters can be derived from these curves for an assessment of patient condition even at early stages of LV dysfunction. Agreement of functional parameters, with respect to CMR measurements, were analyzed by means of Bland-Altman plots. The analysis included subjects presenting either LV hypertrophy, dilation or myocardial infarction.

  17. Spatially resolved assessment of hepatic function using 99mTc-IDA SPECT

    SciTech Connect

    Wang, Hesheng; Cao, Yue

    2013-09-15

    Purpose: 99mTc-iminodiacetic acid (IDA) hepatobiliary imaging is usually quantified for hepatic function on the entire liver or regions of interest (ROIs) in the liver. The authors presented a method to estimate the hepatic extraction fraction (HEF) voxel-by-voxel from single-photon emission computed tomography (SPECT)/CT with a 99mTc-labeled IDA agent of mebrofenin and evaluated the spatially resolved HEF measurements with an independent physiological measurement. Methods: Fourteen patients with intrahepatic cancers were treated with radiation therapy (RT) and imaged by 99mTc-mebrofenin SPECT before and 1 month after RT. The dynamic SPECT volumes had a voxel size of 3.9 × 3.9 × 2.5 mm³. Throughout the whole liver of approximately 50 000 voxels, voxelwise HEF quantifications were estimated and compared between using an arterial input function (AIF) from the heart and a vascular input function (VIF) from the spleen. The correlation between the mean of the HEFs over the nontumor liver tissue and the overall liver function measured by indocyanine green clearance half-time (T1/2) was assessed. Variation of the voxelwise estimation was evaluated in ROIs drawn in relatively homogeneous regions of the livers. The authors also examined effects of the time range parameter on the voxelwise HEF quantification. Results: The mean of the HEFs over the liver estimated using the AIF significantly correlated with the physiological measurement T1/2 (r = 0.52, p = 0.0004), and the correlation was greatly improved by using the VIF (r = 0.79, p < 0.0001). The parameter of time range for the retention phase did not lead to a significant difference in the means of the HEFs in the ROIs. Using the VIF and a retention phase time range of 7–30 min, the relative variation of the voxelwise HEF in the ROIs was 10% ± 6% of the respective mean HEF. Conclusions: The voxelwise HEF derived from 99mTc-IDA SPECT by the deconvolution analysis is feasible to assess the spatial distribution of hepatic function in the
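The deconvolution at the heart of the voxelwise HEF can be illustrated on synthetic curves: recover the tissue retention function from a tissue time-activity curve and an input function by solving the discrete convolution system. This is a hedged sketch, not the authors' implementation; in this toy setup the retention function's initial height plays the role of the extraction fraction.

```python
import numpy as np

def retention_by_deconvolution(tissue_tac, input_tac, dt=1.0):
    """Recover the tissue retention function h from tissue(t) = (input * h)(t)
    by solving the lower-triangular (Toeplitz) convolution system."""
    n = len(input_tac)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, :i + 1] = input_tac[i::-1]    # A[i, j] = input[i - j]
    h, *_ = np.linalg.lstsq(A * dt, tissue_tac, rcond=None)
    return h

# synthetic check: exponential input, known mono-exponential retention
t = np.arange(30.0)
inp = np.exp(-0.3 * t)
h_true = 0.8 * np.exp(-0.05 * t)           # initial height 0.8
liver = np.convolve(inp, h_true)[:t.size]  # forward model
hef = retention_by_deconvolution(liver, inp)[0]
```

Real dynamic SPECT data are noisy, so clinical deconvolution typically adds regularization or curve fitting rather than solving the raw least-squares system.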

  18. Dopamine D2 receptor status assessed by IBZM SPECT - A sensitive indicator for cerebral hypoxia

    SciTech Connect

    Tatsch, K.; Schwarz, J.; Welz, A.

    1995-05-01

    The striatum is highly sensitive to tissue hypoxia. Thus, it may be suggested that cerebral hypoxia could affect the integrity of the striatal receptor system. The purpose of the current SPECT investigations with IBZM was to evaluate whether hypoxic conditions cause detectable changes in the D2 receptor status. 25 controls and 30 pts with a history of cerebral hypoxia (resuscitation after cardiac arrest: n=19, CABG surgery under cardiopulmonary bypass: n=11) were investigated with SPECT 2 h p.i. of 185 MBq I-123 IBZM. For semiquantitative evaluation, transverse slices corrected for attenuation were used to calculate striatal to frontal cortex (S/FC) ratios. In 13/19 pts with cerebral hypoxia due to cardiac arrest, IBZM binding was severely reduced after successful resuscitation. 7 died, 5 were in a vegetative state, 1 remained severely disabled. In 6/19, S/FC ratios were normal/mildly reduced; 2 of them had a good outcome, 4 were moderately disabled. In pts with CABG, IBZM binding was preoperatively normal. After hypoxia due to cardiac surgery, striatal S/FC ratios decreased slightly, persisting at this level even 6 months after surgery. Neuropsychological/psychiatric testing showed only minor or transient changes in this group of patients. The striatal D2 receptor status seems to be a sensitive indicator for cerebral hypoxia. After hypoxia due to cardiac arrest, IBZM results correlate well (in contrast to morphological or SEP findings) with the clinical outcome and thus may serve as an early predictor of the individual prognosis. The moderate decline in IBZM binding following CABG surgery suggests mild cerebral hypoxia despite protective hypothermia. By sensitively indicating cerebral hypoxia, changes in the D2 receptor status assessed by IBZM SPECT may serve as a valuable diagnostic tool for testing neuroprotective drugs or modified surgical techniques.
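The semiquantitative index described above, the striatal-to-frontal-cortex ratio, is a simple ROI computation; the count values below are invented for illustration:

```python
import numpy as np

def s_fc_ratio(striatal_roi, frontal_roi):
    """Striatal-to-frontal-cortex count-density ratio from
    attenuation-corrected ROI values (the semiquantitative IBZM index)."""
    return float(np.mean(striatal_roi) / np.mean(frontal_roi))

ratio = s_fc_ratio([152, 148, 150], [98, 102, 100])   # illustrative counts
```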

  19. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
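The scenario-level quantification a tool like QRAS performs can be sketched by enumerating an event tree. This is a deliberately simplified illustration assuming independent binary pivotal events with made-up branch probabilities, not the QRAS model, which also handles multiplicities, dependencies, and redundancies.

```python
from itertools import product

def end_state_probabilities(branch_probs):
    """Enumerate end states of an event tree with independent binary pivotal
    events; branch_probs[i] is P(event i succeeds). Returns a dict mapping
    outcome tuples (True = success) to their probability."""
    states = {}
    for outcome in product([True, False], repeat=len(branch_probs)):
        p = 1.0
        for ok, ps in zip(outcome, branch_probs):
            p *= ps if ok else (1.0 - ps)
        states[outcome] = p
    return states

states = end_state_probabilities([0.99, 0.95])        # illustrative values
p_any_failure = sum(p for o, p in states.items() if not all(o))
```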

  20. Dobutamine stress-redistribution-reinjection versus rest-redistribution thallium-201 SPECT in the assessment of myocardial viability.

    PubMed

    Cornel, J H; Bax, J J; Elhendy, A; Reijs, A E; Fioretti, P M

    1997-02-01

    The aim of this study was to evaluate the value of thallium-201 chloride (201Tl) reinjection imaging following dobutamine stress (DRi) to identify viable myocardium in comparison with a rest-redistribution 201Tl protocol (RR). The identification of viable myocardium bears important consequences for adequate selection of patients with poor left ventricular function, often unable to exercise, who are considered for revascularization. Twenty-six patients with chronic coronary artery disease and depressed left ventricular function (ejection fraction 36 +/- 10%) were studied by both DRi and RR single photon emission computed tomography (SPECT). Semi-quantitative analysis of regional 201Tl activity (5-point score) and wall motion by echocardiography using a 16-segment model was performed. Regions were classified as viable (normal/reversible/fixed moderate defects) or non-viable (fixed severe defects) and related to regional wall motion. Target heart rate was reached in 25 patients. Myocardial viability was demonstrated in 353/416 segments (85%) by DRi SPECT and in 346/416 segments (83%) by RR SPECT. The agreement between the 2 protocols was 98% with a kappa value of 0.94; similar results were obtained when the analysis was limited to dyscontractile segments. In conclusion, this study demonstrates the feasibility and diagnostic value of DRi SPECT to identify viable myocardium.
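An agreement statistic of the kind reported above can be computed from a 2x2 table of the two protocols' viability calls; the table below is illustrative, not the study's raw data:

```python
def cohens_kappa(both_viable, only_a, only_b, both_nonviable):
    """Cohen's kappa for two binary classification protocols from a
    2x2 agreement table."""
    n = both_viable + only_a + only_b + both_nonviable
    po = (both_viable + both_nonviable) / n                # observed agreement
    pa_yes = (both_viable + only_a) / n                    # protocol A "viable" rate
    pb_yes = (both_viable + only_b) / n                    # protocol B "viable" rate
    pe = pa_yes * pb_yes + (1 - pa_yes) * (1 - pb_yes)     # chance agreement
    return (po - pe) / (1 - pe)

kappa = cohens_kappa(40, 5, 10, 45)    # illustrative 2x2 table
```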

  1. Quantitative reconstruction for myocardial perfusion SPECT: an efficient approach by depth-dependent deconvolution and matrix rotation.

    PubMed

    Ye, J; Liang, Z; Harrington, D P

    1994-08-01

    An efficient reconstruction method for myocardial perfusion single-photon emission computed tomography (SPECT) has been developed which compensates simultaneously for attenuation, scatter, and resolution variation. The scattered photons in the primary-energy-window measurements are approximately removed by subtracting the weighted scatter-energy-window samples. The resolution variation is corrected by deconvolving the subtracted data with the detector-response kernel in frequency space using the depth-dependent frequency relation. The attenuated photons are compensated by recursively tracing the attenuation factors through the object-specific attenuation map. An experimental chest phantom with defects inside myocardium was used to test the method. The attenuation map of the phantom was reconstructed from transmission scans using a flat external source and a high-resolution parallel-hole collimator of a single-detector system. The detector-response kernel was approximated from measurements of a point source in air at several depths from the collimator surface. The emission data were acquired by the same detector setting. A computer simulation using similar protocols as in the experiment was performed. Both the simulation and experiment showed significant improvement in quantification with the proposed method, as compared to the conventional filtered-backprojection technique. The quantitative gain by the additional deconvolution was demonstrated. The computation time was less than 20 min on a HP/730 desktop computer for reconstruction of a 128² x 64 array from 128 projections of 128 x 64 samples. PMID:15551566
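The scatter-removal step described above, subtracting weighted scatter-energy-window samples from the photopeak window, can be sketched in a few lines; the weighting factor k and the pixel values are illustrative:

```python
import numpy as np

def dual_window_scatter_correction(photopeak, scatter_window, k=0.5):
    """Approximate scatter removal: subtract the weighted scatter-energy-window
    image from the photopeak image and clip negatives to zero (k ~ 0.5 is a
    commonly quoted weighting factor, but it is system-dependent)."""
    return np.clip(photopeak - k * scatter_window, 0.0, None)

peak = np.array([[100.0, 80.0], [60.0, 10.0]])
scat = np.array([[40.0, 30.0], [20.0, 40.0]])
primary = dual_window_scatter_correction(peak, scat)
```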

  2. SU-C-201-06: Utility of Quantitative 3D SPECT/CT Imaging in Patient Specific Internal Dosimetry of 153-Samarium with GATE Monte Carlo Package

    SciTech Connect

    Fallahpoor, M; Abbasi, M; Sen, A; Parach, A; Kalantari, F

    2015-06-15

    Purpose: Patient-specific 3-dimensional (3D) internal dosimetry in targeted radionuclide therapy is essential for efficient treatment. Two major steps to achieve reliable results are: 1) generating quantitative 3D images of radionuclide distribution and attenuation coefficients and 2) using a reliable method for dose calculation based on the activity and attenuation maps. In this research, internal dosimetry for 153-samarium (153-Sm) was performed using SPECT/CT images coupled with the GATE Monte Carlo package. Methods: A 50-year-old woman with bone metastases from breast cancer was prescribed 153-Sm treatment (gamma: 103 keV; beta: 0.81 MeV). A SPECT/CT scan was performed with the Siemens Symbia T scanner. SPECT and CT images were registered using default registration software. SPECT quantification was achieved by compensating for all image-degrading factors, including body attenuation, Compton scattering and collimator-detector response (CDR). The triple energy window method was used to estimate and eliminate the scattered photons. Iterative ordered-subsets expectation maximization (OSEM) with correction for attenuation and distance-dependent CDR was used for image reconstruction. Bilinear energy mapping was used to convert Hounsfield units in the CT image to an attenuation map. Organ borders were defined by itk-SNAP toolkit segmentation on the CT image. GATE was then used for internal dose calculation. The Specific Absorbed Fractions (SAFs) and S-values were reported following the MIRD schema. Results: The results showed that the largest SAFs and S-values are in osseous organs, as expected. The S-value for lung is the highest after spine, which can be important in 153-Sm therapy. Conclusion: We presented the utility of SPECT/CT images and Monte Carlo for patient-specific dosimetry as a reliable and accurate method. It has several advantages over template-based methods or simplified dose estimation methods. With the advent of high-speed computers, Monte Carlo can be used for treatment planning
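The bilinear Hounsfield-unit-to-attenuation mapping mentioned above can be sketched as follows; the coefficient values are rough illustrative numbers (approximately the 100 keV range), not calibrated for any particular scanner, kVp, or emission energy:

```python
def hu_to_mu(hu, mu_water=0.153, mu_bone=0.236, hu_bone=1000.0):
    """Bilinear Hounsfield-unit to linear-attenuation-coefficient (cm^-1)
    mapping: below 0 HU, scale linearly between air and water; above 0 HU,
    interpolate from water toward a bone reference point."""
    if hu <= 0:                                            # air-to-water branch
        return max(0.0, mu_water * (1.0 + hu / 1000.0))
    return mu_water + (mu_bone - mu_water) * hu / hu_bone  # water-to-bone branch

mu_map = [hu_to_mu(h) for h in (-1000.0, 0.0, 1000.0)]     # air, water, bone
```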

  3. Examining a hypothetical quantitative model for better approximation of culprit coronary artery and site of stenosis on 99mTc-sestamibi gated myocardial perfusion SPECT.

    PubMed

    Pal, Sushanta; Sen, Srabani; Das, Debasis; Basu, Sandip

    2016-10-01

    A hypothetical quantitative model for analyzing gated myocardial perfusion SPECT is proposed and examined for the feasibility of its use as a predictor of the diseased coronary artery and for approximating the site of stenosis, to determine whether it could serve as a useful noninvasive complement to coronary angiography. The extent and severity of perfusion defects on rest gated myocardial perfusion SPECT images were assessed on a five-point scale in a standard 17-segment model, and total perfusion deficit was quantified by automated software. The first step was to locate the diseased coronary artery using a quantitative method: for this, the score of each segment belonging to a particular coronary artery was determined using a systematic presumptive approach. After determination of specific coronary artery segments, the scores of the contiguous segments in three short axis slices (apical, middle, and basal) were summed for six subdivisions (anterior, anterolateral, inferolateral, inferior, anteroseptal, and inferoseptal). The site of stenosis was determined from (a) the initial approximation of the involved segments with a defect score of 2-4 and (b) subsequent calculation of the defect score of each of the six subdivisions and allocating the site through a preassigned number for each coronary artery. For each coronary artery, only the subdivision with the highest defect score was considered. Proximal, middle, and distal segments of the left anterior descending artery (LAD) were considered to be involved when the summed value of a subdivision within the LAD territory was more than or equal to 7, between 5 and 7, and between 3 and 5, respectively. For the left circumflex and right coronary artery, summed scores (of the respective subdivisions) of more than or equal to 5 and between 3 and 5 were preassigned to proximal and distal stenosis, respectively. The results were then correlated with the coronary angiographic data.
On coronary angiography, proximal LAD occlusion
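The segment-to-territory summing at the core of the proposed model can be sketched as follows. The segment-to-artery assignment and the example scores are illustrative simplifications of one common AHA 17-segment convention, not the paper's exact mapping:

```python
# A common (simplified) assignment of the AHA 17-segment model to coronary
# territories; exact mappings vary between laboratories.
TERRITORY = {
    "LAD": [1, 2, 7, 8, 13, 14, 17],
    "RCA": [3, 4, 9, 10, 15],
    "LCX": [5, 6, 11, 12, 16],
}

def summed_scores(segment_scores):
    """Sum per-segment perfusion defect scores (0-4) over each territory."""
    return {artery: sum(segment_scores[s] for s in segs)
            for artery, segs in TERRITORY.items()}

scores = {s: 0 for s in range(1, 18)}
scores.update({7: 3, 8: 4, 13: 2})         # illustrative anterior defect
by_artery = summed_scores(scores)
culprit = max(by_artery, key=by_artery.get)
```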

  4. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  5. MIRD Pamphlet No. 23: Quantitative SPECT for Patient-Specific 3-Dimensional Dosimetry in Internal Radionuclide Therapy

    PubMed Central

    Dewaraja, Yuni K.; Frey, Eric C.; Sgouros, George; Brill, A. Bertrand; Roberson, Peter; Zanzonico, Pat B.; Ljungberg, Michael

    2012-01-01

    In internal radionuclide therapy, a growing interest in voxel-level estimates of tissue-absorbed dose has been driven by the desire to report radiobiologic quantities that account for the biologic consequences of both spatial and temporal nonuniformities in these dose estimates. This report presents an overview of 3-dimensional SPECT methods and requirements for internal dosimetry at both regional and voxel levels. Combined SPECT/CT image-based methods are emphasized, because the CT-derived anatomic information allows one to address multiple technical factors that affect SPECT quantification while facilitating the patient-specific voxel-level dosimetry calculation itself. SPECT imaging and reconstruction techniques for quantification in radionuclide therapy are not necessarily the same as those designed to optimize diagnostic imaging quality. The current overview is intended as an introduction to an upcoming series of MIRD pamphlets with detailed radionuclide-specific recommendations intended to provide best-practice SPECT quantification–based guidance for radionuclide dosimetry. PMID:22743252

  6. Quantitative study of lung perfusion SPECT scanning and pulmonary function testing for early radiation-induced lung injury in patients with locally advanced non-small cell lung cancer

    PubMed Central

    ZHANG, WEI; WANG, JIEZHONG; TANG, MINGDENG; PAN, JIANJI; BAI, PENGGANG; LIN, DUANYU; QIAN, FEIYU; LIN, FENGJIE; YANG, XUEQIN; ZHANG, SHENGLI

    2012-01-01

    Radiation lung injury is a common side-effect of pulmonary radiotherapy. The aim of this study was to quantitatively assess early changes in lung perfusion single photon emission computed tomography (SPECT) scanning and pulmonary function testing (PFT) prior to and after intensity modulated radiotherapy (IMRT) for patients suffering from locally advanced non-small cell lung cancer (LANSCLC). Twenty patients with LANSCLC received lung perfusion SPECT scanning and PFT prior to IMRT and immediately after IMRT. The lung perfusion index (LPI) was calculated after quantification of the perfusion SPECT images. The LPI of the two groups was analyzed by matched t-test. The radioactive count of each layer of a single lung was added to obtain the sum of the irradiated area, and the percentage of the irradiated area of the single lung was calculated. Linear correlation analysis was carried out between the percentage of the irradiated area and the LPI in order to verify the validity of the LPI. In this study, the LPI and the percentage of the irradiated area of a single lung exhibited an excellent correlation both prior to and after IMRT (r=0.820 and r=0.823, respectively; p<0.001). There was no statistically significant difference between pre-IMRT LPI and post-IMRT LPI (p=0.135). The LPI in the group receiving a radical dose showed no statistically significant difference (p=0.993); however, it showed a statistically significant difference in the group receiving a non-radical dose (p=0.025). In the non-radical dose group, the post-IMRT LPI was larger than the pre-IMRT LPI. None of the parameters of PFT exhibited a statistically significant difference prior to and after IMRT (p>0.05). The quantitative method of lung perfusion SPECT scanning can be used to evaluate early changes in perfusion in patients receiving a non-radical dose (BED ≤126,500 cGy) of IMRT. Evaluating early changes in global lung function using the current method of PFT is difficult, since time can be a contributing factor for radiation
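The linear correlation check between the LPI and the percentage irradiated area amounts to a Pearson coefficient; a minimal sketch, with invented data pairs rather than the study's measurements:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired measurement sets."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    return float(x @ y / np.sqrt((x @ x) * (y @ y)))

# illustrative LPI vs. percentage-irradiated-area pairs
r = pearson_r([0.10, 0.20, 0.30, 0.40], [1.0, 2.1, 2.9, 4.2])
```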

  7. Does percutaneous nephrolithotomy and its outcomes have an impact on renal function? Quantitative analysis using SPECT-CT DMSA.

    PubMed

    Pérez-Fentes, Daniel; Cortés, Julia; Gude, Francisco; García, Camilo; Ruibal, Alvaro; Aguiar, Pablo

    2014-10-01

    To assess the functional effects of percutaneous nephrolithotomy (PCNL) and its outcomes in the operated kidney, we prospectively studied 30 consecutive cases undergoing PCNL. Kidney function was evaluated preoperatively and 3 months after surgery with serum creatinine, glomerular filtration rate (GFR), and with (99m)Tc-DMSA SPECT-CT scans to determine the differential renal function (DRF). PCNL effects on the operated kidney's DRF were considered globally (DRF_PLANAR, DRF_SPECT) and in the region of percutaneous access (DRF_ACCESS). PCNL functional impact was also assessed depending on its outcomes, namely success (stone-free status) and the development of perioperative complications. PCNL rendered 73% of the cases completely stone free, with a 33% complication rate. After PCNL, serum creatinine and GFR did not change significantly, whereas DRF_PLANAR and DRF_SPECT dropped 1.2% (p = 0.014) and 1.0% (p = 0.041), respectively. The highest decrease was observed in DRF_ACCESS (1.8%, p = 0.012). Stone-free status after PCNL did not show any impact on kidney function. Conversely, cases that suffered from a complication showed impairment in serum creatinine (0.1 mg/dL, p = 0.028), in GFR (11.1 mL/min, p = 0.036) as well as in DRF_PLANAR (2.7%, p = 0.018), DRF_SPECT (2.2%, p = 0.023) and DRF_ACCESS (2.7%, p = 0.049). We conclude that PCNL has a minimal impact on global kidney function, which is mainly located in the region of percutaneous access. The advent of perioperative complications increased PCNL functional damage, whereas the stone-free status did not show any meaningful effect.
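The differential renal function used above is a simple count-sharing computation between the two kidneys; the counts below are illustrative:

```python
def differential_renal_function(left_counts, right_counts):
    """Differential renal function (%) of each kidney from
    background-corrected DMSA uptake counts."""
    total = left_counts + right_counts
    return 100.0 * left_counts / total, 100.0 * right_counts / total

left_drf, right_drf = differential_renal_function(48000.0, 52000.0)
```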

  8. Quantitative assessment of scientific quality

    NASA Astrophysics Data System (ADS)

    Heinzl, Harald; Bloching, Philipp

    2012-09-01

    Scientific publications, authors, and journals are commonly evaluated with quantitative bibliometric measures. Frequently-used measures will be reviewed and their strengths and weaknesses will be highlighted. Reflections about conditions for a new, research paper-specific measure will be presented.

  9. Abdominal SPECT imaging

    SciTech Connect

    Van Heertum, R.L.; Brunetti, J.C.; Yudd, A.P.

    1987-07-01

    Over the past several years, abdominal single photon emission computed tomography (SPECT) imaging has evolved from a research tool to an important clinical imaging modality that is helpful in the diagnostic assessment of a wide variety of disorders involving the abdominal viscera. Although liver-spleen imaging is the most popular of the abdominal SPECT procedures, blood pool imaging is becoming much more widely utilized for the evaluation of cavernous hemangiomas of the liver as well as other vascular abnormalities in the abdomen. Adjunctive indium leukocyte and gallium SPECT studies are also proving to be of value in the assessment of a variety of infectious and neoplastic diseases. As more experience is acquired in this area, SPECT should become the primary imaging modality for both gallium and indium white blood cells in many institutions. Renal SPECT, on the other hand, has only recently been used as a clinical imaging modality for the assessment of such parameters as renal depth and volume. The exact role of renal SPECT as a clinical tool is, therefore, yet to be determined. 79 references.

  10. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer.

  11. The impact of 3D volume of interest definition on accuracy and precision of activity estimation in quantitative SPECT and planar processing methods

    NASA Astrophysics Data System (ADS)

    He, Bin; Frey, Eric C.

    2010-06-01

    Accurate and precise estimation of organ activities is essential for treatment planning in targeted radionuclide therapy. We have previously evaluated the impact of processing methodology, statistical noise and variability in activity distribution and anatomy on the accuracy and precision of organ activity estimates obtained with quantitative SPECT (QSPECT) and planar (QPlanar) processing. Another important factor impacting the accuracy and precision of organ activity estimates is accuracy of and variability in the definition of organ regions of interest (ROI) or volumes of interest (VOI). The goal of this work was thus to systematically study the effects of VOI definition on the reliability of activity estimates. To this end, we performed Monte Carlo simulation studies using randomly perturbed and shifted VOIs to assess the impact on organ activity estimates. The 3D NCAT phantom was used with activities that modeled clinically observed 111In ibritumomab tiuxetan distributions. In order to study the errors resulting from misdefinitions due to manual segmentation errors, VOIs of the liver and left kidney were first manually defined. Each control point was then randomly perturbed to one of the nearest or next-nearest voxels in three ways: with no, inward or outward directional bias, resulting in random perturbation, erosion or dilation, respectively, of the VOIs. In order to study the errors resulting from the misregistration of VOIs, as would happen, e.g. in the case where the VOIs were defined using a misregistered anatomical image, the reconstructed SPECT images or projections were shifted by amounts ranging from -1 to 1 voxels in increments of 0.1 voxels in both the transaxial and axial directions. The activity estimates from the shifted reconstructions or projections were compared to those from the originals, and average errors were computed for the QSPECT and QPlanar methods, respectively. For misregistration, errors in organ activity estimations were
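The misregistration experiment described above, shifting images by sub-voxel amounts and re-evaluating VOI totals, can be illustrated with a half-voxel shift on a toy phantom. This is a sketch, not the authors' code; a general implementation would interpolate arbitrary fractional shifts.

```python
import numpy as np

def half_voxel_shift(image, axis):
    """Shift an image by +0.5 voxel along an axis via linear interpolation,
    zero-padding at the entering edge."""
    rolled = np.roll(image, 1, axis=axis)
    edge = [slice(None)] * image.ndim
    edge[axis] = slice(0, 1)
    rolled[tuple(edge)] = 0.0              # undo the wrap-around
    return 0.5 * (image + rolled)

def voi_activity_error(image, voi_mask, axis=1):
    """Relative error (%) in a VOI activity estimate caused by a half-voxel
    misregistration of the image."""
    ref = image[voi_mask].sum()
    shifted = half_voxel_shift(image, axis)
    return 100.0 * (shifted[voi_mask].sum() - ref) / ref

img = np.zeros((1, 32, 32))
img[0, 10:20, 10:20] = 1.0                 # uniform toy "organ"
err = voi_activity_error(img, img > 0)
```

The half-voxel shift smears one edge row of the organ out of the fixed VOI, so the toy phantom loses 5% of its apparent activity, illustrating why sub-voxel registration accuracy matters for organ activity estimates.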

  12. The impact of 3D volume of interest definition on accuracy and precision of activity estimation in quantitative SPECT and planar processing methods.

    PubMed

    He, Bin; Frey, Eric C

    2010-06-21

    Accurate and precise estimation of organ activities is essential for treatment planning in targeted radionuclide therapy. We have previously evaluated the impact of processing methodology, statistical noise and variability in activity distribution and anatomy on the accuracy and precision of organ activity estimates obtained with quantitative SPECT (QSPECT) and planar (QPlanar) processing. Another important factor impacting the accuracy and precision of organ activity estimates is accuracy of and variability in the definition of organ regions of interest (ROI) or volumes of interest (VOI). The goal of this work was thus to systematically study the effects of VOI definition on the reliability of activity estimates. To this end, we performed Monte Carlo simulation studies using randomly perturbed and shifted VOIs to assess the impact on organ activity estimates. The 3D NCAT phantom was used with activities that modeled clinically observed (111)In ibritumomab tiuxetan distributions. In order to study the errors resulting from misdefinitions due to manual segmentation errors, VOIs of the liver and left kidney were first manually defined. Each control point was then randomly perturbed to one of the nearest or next-nearest voxels in three ways: with no, inward or outward directional bias, resulting in random perturbation, erosion or dilation, respectively, of the VOIs. In order to study the errors resulting from the misregistration of VOIs, as would happen, e.g. in the case where the VOIs were defined using a misregistered anatomical image, the reconstructed SPECT images or projections were shifted by amounts ranging from -1 to 1 voxels in increments of 0.1 voxels in both the transaxial and axial directions. The activity estimates from the shifted reconstructions or projections were compared to those from the originals, and average errors were computed for the QSPECT and QPlanar methods, respectively. For misregistration, errors in organ activity estimations were

  13. Comparison of simultaneous and sequential SPECT imaging for discrimination tasks in assessment of cardiac defects

    NASA Astrophysics Data System (ADS)

    Trott, C. M.; Ouyang, J.; El Fakhri, G.

    2010-11-01

    Simultaneous rest perfusion/fatty-acid metabolism studies have the potential to replace sequential rest/stress perfusion studies for the assessment of cardiac function. Simultaneous acquisition has the benefits of increased signal and lack of need for patient stress, but is complicated by cross-talk between the two radionuclide signals. We consider a simultaneous rest 99mTc-sestamibi/123I-BMIPP imaging protocol in place of the commonly used sequential rest/stress 99mTc-sestamibi protocol. The theoretical precision with which the severity of a cardiac defect and the transmural extent of infarct can be measured is computed for simultaneous and sequential SPECT imaging, and their performance is compared for discriminating (1) degrees of defect severity and (2) sub-endocardial from transmural defects. We consider cardiac infarcts for which reduced perfusion and metabolism are observed. From an information perspective, simultaneous imaging is found to yield comparable or improved performance compared with sequential imaging for discriminating both severity of defect and transmural extent of infarct, for three defects of differing location and size.

  14. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  15. Optimal left ventricular lead position assessed with phase analysis on gated myocardial perfusion SPECT

    PubMed Central

    Boogers, Mark J.; Chen, Ji; van Bommel, Rutger J.; Borleffs, C. Jan Willem; Dibbets-Schneider, Petra; van der Hiel, Bernies; Al Younis, Imad; Schalij, Martin J.; van der Wall, Ernst E.; Garcia, Ernest V.

    2010-01-01

    Purpose The aim of the current study was to evaluate the relationship between the site of latest mechanical activation as assessed with gated myocardial perfusion SPECT (GMPS), left ventricular (LV) lead position and response to cardiac resynchronization therapy (CRT). Methods The patient population consisted of consecutive patients with advanced heart failure in whom CRT was currently indicated. Before implantation, 2-D echocardiography and GMPS were performed. The echocardiography was performed to assess LV end-systolic volume (LVESV), LV end-diastolic volume (LVEDV) and LV ejection fraction (LVEF). The site of latest mechanical activation was assessed by phase analysis of GMPS studies and related to LV lead position on fluoroscopy. Echocardiography was repeated after 6 months of CRT. CRT response was defined as a decrease of ≥15% in LVESV. Results Enrolled in the study were 90 patients (72% men, 67±10 years) with advanced heart failure. In 52 patients (58%), the LV lead was positioned at the site of latest mechanical activation (concordant), and in 38 patients (42%) the LV lead was positioned outside the site of latest mechanical activation (discordant). CRT response was significantly more often documented in patients with a concordant LV lead position than in patients with a discordant LV lead position (79% vs. 26%, p<0.01). After 6 months, patients with a concordant LV lead position showed significant improvement in LVEF, LVESV and LVEDV (p<0.05), whereas patients with a discordant LV lead position showed no significant improvement in these variables. Conclusion Patients with a concordant LV lead position showed significant improvement in LV volumes and LV systolic function, whereas patients with a discordant LV lead position showed no significant improvements. PMID:20953608

  16. Quantitative assessment of fluorescent proteins.

    PubMed

    Cranfill, Paula J; Sell, Brittney R; Baird, Michelle A; Allen, John R; Lavagnino, Zeno; de Gruiter, H Martijn; Kremers, Gert-Jan; Davidson, Michael W; Ustione, Alessandro; Piston, David W

    2016-07-01

    The advent of fluorescent proteins (FPs) for genetic labeling of molecules and cells has revolutionized fluorescence microscopy. Genetic manipulations have created a vast array of bright and stable FPs spanning blue to red spectral regions. Common to autofluorescent FPs is their tight β-barrel structure, which provides the rigidity and chemical environment needed for effectual fluorescence. Despite the common structure, each FP has unique properties. Thus, there is no single 'best' FP for every circumstance, and each FP has advantages and disadvantages. To guide decisions about which FP is right for a given application, we have quantitatively characterized the brightness, photostability, pH stability and monomeric properties of more than 40 FPs to enable straightforward and direct comparison between them. We focus on popular and/or top-performing FPs in each spectral region. PMID:27240257
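The brightness characterization mentioned above is conventionally expressed as extinction coefficient times quantum yield, often normalized to EGFP; the EGFP reference values below are commonly cited approximations, not figures from this paper:

```python
def relative_brightness(extinction, quantum_yield,
                        ref_ec=55900.0, ref_qy=0.60):
    """Intrinsic fluorophore brightness (molar extinction coefficient x
    fluorescence quantum yield) expressed relative to EGFP."""
    return (extinction * quantum_yield) / (ref_ec * ref_qy)

egfp_rel = relative_brightness(55900.0, 0.60)   # EGFP against itself
```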

  17. Quantitative assessment of DNA condensation.

    PubMed

    Trubetskoy, V S; Slattum, P M; Hagstrom, J E; Wolff, J A; Budker, V G

    1999-02-15

A fluorescent method is proposed for assessing DNA condensation in aqueous solutions with a variety of condensing agents. The technique is based on the effect of concentration-dependent self-quenching of covalently bound fluorophores upon DNA collapse. The method allows a more precise determination of charge equivalency in titration experiments with various polycations. The technique's ability to determine the number of DNA molecules that are condensed together in close proximity is under further investigation.
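The self-quenching readout described above can be sketched as a fractional drop in fluorescence relative to free labeled DNA. The functional form and the titration data below are illustrative assumptions, not values from the paper.

```python
def quenching_fraction(f_observed: float, f_free_dna: float) -> float:
    """Fractional self-quenching of covalently bound fluorophores.

    0.0 means no quenching (no condensation signal); values approaching
    1.0 indicate strong fluorophore crowding upon DNA collapse.
    """
    return 1.0 - f_observed / f_free_dna

# Titration readout: the charge-equivalence point can be taken where
# quenching stops increasing (illustrative data, not from the paper).
charge_ratios = [0.25, 0.5, 0.75, 1.0, 1.25]
signal = [980.0, 700.0, 420.0, 150.0, 148.0]
q = [quenching_fraction(f, 1000.0) for f in signal]
print([round(v, 2) for v in q])  # [0.02, 0.3, 0.58, 0.85, 0.85]
```

The plateau at a charge ratio of ~1.0 is the kind of endpoint the abstract's charge-equivalency titration would identify.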

  18. Computational tools and methods for objective assessment of image quality in x-ray CT and SPECT

    NASA Astrophysics Data System (ADS)

    Palit, Robin

Computational tools of use in the objective assessment of image quality for tomography systems were developed for central processing units (CPU) and graphics processing units (GPU) in the image quality lab at the University of Arizona. Fast analytic x-ray projection code called IQCT was created to compute the mean projection image for cone beam multi-slice helical computed tomography (CT) scanners. IQCT was optimized to take advantage of the massively parallel architecture of GPUs. CPU code for computing single photon emission computed tomography (SPECT) projection images was written calling upon previous research in the image quality lab. IQCT and the SPECT modeling code were used to simulate data for multi-modality SPECT/CT observer studies. The purpose of these observer studies was to assess the benefit in image quality of using attenuation information from a CT measurement in myocardial SPECT imaging. The observer chosen for these studies was the scanning linear observer. The tasks for the observer were localization of a signal and estimation of the signal radius. For the localization study, area under the localization receiver operating characteristic curve (A_LROC) was computed as A_LROC(Meas) = 0.89332 ± 0.00474 and A_LROC(No) = 0.89408 ± 0.00475, where "Meas" implies the use of attenuation information from the CT measurement, and "No" indicates the absence of attenuation information. For the estimation study, area under the estimation receiver operating characteristic curve (A_EROC) was quantified as A_EROC(Meas) = 0.55926 ± 0.00731 and A_EROC(No) = 0.56167 ± 0.00731. Based on these results, it was concluded that the use of CT information did not improve the scanning linear observer's ability to perform the stated myocardial SPECT tasks. The risk to the patient of the CT measurement was quantified in terms of excess effective dose as 2.37 mSv for males and 3.38 mSv for females. Another image quality tool generated within this body of work was a singular value
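Given the reported areas under the LROC curve and their standard errors, a simple two-sample z-statistic shows why no benefit was concluded. Treating the two AUC estimates as independent is an approximation (they likely share data), so this is a sketch, not the study's actual analysis.

```python
import math

def auc_z(a1: float, se1: float, a2: float, se2: float) -> float:
    """z-statistic for the difference of two AUC estimates,
    assuming (approximately) independent standard errors."""
    return (a1 - a2) / math.sqrt(se1**2 + se2**2)

# A_LROC with vs. without CT attenuation information (values from the abstract)
z = auc_z(0.89408, 0.00475, 0.89332, 0.00474)
print(round(z, 2))  # ~0.11, far below 1.96: no significant difference
```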

  19. Biologically based, quantitative risk assessment of neurotoxicants.

    PubMed

    Slikker, W; Crump, K S; Andersen, M E; Bellinger, D

    1996-01-01

    The need for biologically based, quantitative risk assessment procedures for noncancer endpoints such as neurotoxicity has been discussed in reports by the United States Congress (Office of Technology Assessment, OTA), National Research Council (NRC), and a federal coordinating council. According to OTA, current attention and resources allocated to health risk assessment research are inadequate and not commensurate with its impact on public health and the economy. Methods to include continuous rather than dichotomous data for neurotoxicity endpoints, biomarkers of exposure and effects, and pharmacokinetic and mechanistic data have been proposed for neurotoxicity risk assessment but require further review and validation before acceptance. The purpose of this symposium was to examine procedures to enhance the risk assessment process for neurotoxicants and to discuss techniques to make the process more quantitative. Accordingly, a review of the currently used safety factor risk assessment approach for neurotoxicants is provided along with specific examples of how this process may be enhanced with the use of the benchmark dose approach. The importance of including physiologically based pharmacokinetic data in the risk assessment process and specific examples of this approach is presented for neurotoxicants. The role of biomarkers of exposure and effect and mechanistic information in the risk assessment process are also addressed. Finally, quantitative approaches with the use of continuous neurotoxicity data are demonstrated and the outcomes compared to those generated by currently used risk assessment procedures. PMID:8838636

  20. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. VSA is often regarded as subjective, so there is a need to verify it. Also, many VSAs have not been fine-tuned for contrasting soil types. This could lead to misinterpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA, while taking into account soil type. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible, when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. Quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, of which half were significant.
For the reproducibility study, a group of 9 soil scientists and 7

  1. Integrated regional assessment: qualitative and quantitative issues

    SciTech Connect

    Malone, Elizabeth L.

    2009-11-19

    Qualitative and quantitative issues are particularly significant in integrated regional assessment. This chapter examines the terms “qualitative” and “quantitative” separately and in relation to one another, along with a discussion of the degree of interdependence or overlap between the two. Strategies for integrating the two general approaches often produce uneasy compromises. However, integrated regional assessment provides opportunities for strong collaborations in addressing specific problems in specific places.

  2. A method for energy window optimization for quantitative tasks that includes the effects of model-mismatch on bias: application to Y-90 bremsstrahlung SPECT imaging.

    PubMed

    Rong, Xing; Du, Yong; Frey, Eric C

    2012-06-21

Quantitative Yttrium-90 ((90)Y) bremsstrahlung single photon emission computed tomography (SPECT) imaging has shown great potential to provide reliable estimates of (90)Y activity distribution for targeted radionuclide therapy dosimetry applications. One factor that potentially affects the reliability of the activity estimates is the choice of the acquisition energy window. In contrast to imaging conventional gamma photon emitters, where the acquisition energy windows are usually placed around photopeaks, there has been great variation in the choice of the acquisition energy window for (90)Y imaging due to the continuous and broad energy distribution of the bremsstrahlung photons. In quantitative imaging of conventional gamma photon emitters, previous methods for optimizing the acquisition energy window assumed unbiased estimators and used the variance in the estimates as a figure of merit (FOM). However, in situations such as (90)Y imaging, where there are errors in the modeling of the image formation process used in the reconstruction, there will be bias in the activity estimates. In (90)Y bremsstrahlung imaging this is especially important due to the high levels of scatter, multiple scatter, and collimator septal penetration and scatter. Variance alone is therefore not a complete measure of the reliability of the estimates, and hence not a complete FOM. To address this, we first aimed to develop a new method to optimize the energy window that accounts for both the bias due to model-mismatch and the variance of the activity estimates. We applied this method to optimize the acquisition energy window for quantitative (90)Y bremsstrahlung SPECT imaging in microsphere brachytherapy. Since absorbed dose is defined as the energy absorbed from the radiation per unit mass of tissue, in this new method we proposed a mass-weighted root mean squared error of the volume of interest (VOI) activity estimates as the FOM. To calculate this FOM, two analytical expressions were
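The mass-weighted root-mean-squared-error figure of merit described above can be sketched as follows. The exact weighting form is an assumption based on the abstract's wording, and the VOI names, activities and masses are illustrative.

```python
import math

def mass_weighted_rmse(estimates, truths, masses):
    """Mass-weighted RMSE over VOI activity estimates.

    Each VOI's squared error is weighted by its tissue mass, reflecting
    that absorbed dose is energy deposited per unit mass. This captures
    both bias (model-mismatch) and variance in a single figure of merit.
    """
    num = sum(m * (a_hat - a) ** 2 for a_hat, a, m in zip(estimates, truths, masses))
    return math.sqrt(num / sum(masses))

# Illustrative VOIs: liver, tumour, lung (activities in MBq, masses in kg)
est = [1020.0, 95.0, 4.0]
tru = [1000.0, 100.0, 5.0]
mass = [1.8, 0.05, 1.0]
print(round(mass_weighted_rmse(est, tru, mass), 2))  # ≈ 15.92
```

A window choice minimizing this FOM trades off bias and variance, rather than variance alone.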

  3. 123I-MIBG SPECT for Evaluation of Patients with Heart Failure.

    PubMed

    Dimitriu-Leen, Aukelien C; Scholte, Arthur J H A; Jacobson, Arnold F

    2015-06-01

    Heart failure (HF) is characterized by activation of the sympathetic cardiac nerves. The condition of cardiac sympathetic nerves can be evaluated by (123)I-metaiodobenzylguanidine ((123)I-MIBG) imaging. Most cardiac (123)I-MIBG studies have relied on measurements from anterior planar images of the chest. However, it has become progressively more common to include SPECT imaging in clinical and research protocols. This review examines recent trends in (123)I-MIBG SPECT imaging and evidence that provides the basis for the increased use of the procedure in the clinical management of patients with HF. (123)I-MIBG SPECT has been shown to be complementary to planar imaging in patients with HF in studies of coronary artery disease after an acute myocardial infarction. Moreover, (123)I-MIBG SPECT has been used in numerous studies to document regional denervation for arrhythmic event risk assessment. For better quantification of the size and severity of innervation abnormalities in (123)I-MIBG SPECT, programs and protocols specifically for (123)I have been developed. Also, the introduction of new solid-state cameras has created the potential for more rapid SPECT acquisitions or a reduction in radiopharmaceutical activity. Although PET imaging has superior quantitative capabilities, (123)I-MIBG SPECT is, for the foreseeable future, the only widely available nuclear imaging method for assessing regional myocardial sympathetic innervation.

  4. Physiologic basis for understanding quantitative dehydration assessment.

    PubMed

    Cheuvront, Samuel N; Kenefick, Robert W; Charkoudian, Nisha; Sawka, Michael N

    2013-03-01

    Dehydration (body water deficit) is a physiologic state that can have profound implications for human health and performance. Unfortunately, dehydration can be difficult to assess, and there is no single, universal gold standard for decision making. In this article, we review the physiologic basis for understanding quantitative dehydration assessment. We highlight how phenomenologic interpretations of dehydration depend critically on the type (dehydration compared with volume depletion) and magnitude (moderate compared with severe) of dehydration, which in turn influence the osmotic (plasma osmolality) and blood volume-dependent compensatory thresholds for antidiuretic and thirst responses. In particular, we review new findings regarding the biological variation in osmotic responses to dehydration and discuss how this variation can help provide a quantitative and clinically relevant link between the physiology and phenomenology of dehydration. Practical measures with empirical thresholds are provided as a starting point for improving the practice of dehydration assessment.

  5. Comparison of 4D-microSPECT and microCT for murine cardiac function

    PubMed Central

    Befera, Nicholas T.; Badea, Cristian T.; Johnson, G. Allan

    2014-01-01

Purpose The objective of this study was to compare a new generation of four-dimensional (4D) microSPECT with microCT for quantitative in vivo assessment of murine cardiac function. Procedures 4D isotropic cardiac images were acquired from normal C57BL/6 mice with either microSPECT at 350-micron resolution (n=6) or microCT at 88-micron resolution (n=6). One additional mouse with myocardial infarction (MI) was scanned with both modalities. Prior to imaging, mice were injected with either 99mTc-tetrofosmin for microSPECT, or a liposomal blood pool contrast agent for microCT. Segmentation of the left ventricle (LV) was performed using Vitrea (Vital Images) software, to derive global and regional function. Results Measures of global LV function between microSPECT and microCT groups were comparable (e.g. ejection fraction: 71±6% for microSPECT vs. 68±4% for microCT). Regional functional indices (wall motion, wall thickening, regional ejection fraction) were also similar for the two modalities. In the mouse with MI, microSPECT identified a large perfusion defect that was not evident with microCT. Conclusions Despite lower spatial resolution, microSPECT was comparable to microCT in the quantitative evaluation of cardiac function. MicroSPECT offers an advantage over microCT in the ability to evaluate myocardial perfusion radiotracer distribution and function simultaneously. MicroSPECT should be considered as an alternative to microCT and MR for preclinical cardiac imaging in the mouse. PMID:24037175
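The global-function comparison above rests on LV volumes segmented from the 4D images; ejection fraction follows directly from end-diastolic and end-systolic volume. The volumes below are illustrative values in the murine range, not the study's data.

```python
def ejection_fraction(edv_ul: float, esv_ul: float) -> float:
    """LV ejection fraction (%) from end-diastolic and end-systolic volumes."""
    return 100.0 * (edv_ul - esv_ul) / edv_ul

# Illustrative murine volumes in microlitres (not the study's measurements)
edv, esv = 50.0, 15.0
print(ejection_fraction(edv, esv))  # 70.0, within the range reported above
```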

  6. Validation of semi-quantitative methods for DAT SPECT: influence of anatomical variability and partial volume effect

    NASA Astrophysics Data System (ADS)

    Gallego, J.; Niñerola-Baizán, A.; Cot, A.; Aguiar, P.; Crespo, C.; Falcón, C.; Lomeña, F.; Sempau, J.; Pavía, J.; Ros, D.

    2015-08-01

The aim of this work was to evaluate the influence of anatomical variability between subjects and of the partial volume effect (PVE) on the standardized Specific Uptake Ratio (SUR) in [123I]FP-CIT SPECT studies. To this end, magnetic resonance (MR) images of 23 subjects with differences in the striatal volume of up to 44% were segmented and used to generate a database of 138 Monte Carlo simulated SPECT studies. Data included normal uptakes and pathological cases. Studies were reconstructed by filtered back projection (FBP) and the ordered-subset expectation-maximization algorithm. Quantification was carried out by applying a reference method based on regions of interest (ROIs) derived from the MR images and ROIs derived from the Automated Anatomical Labelling map. Our results showed that, regardless of anatomical variability, the relationship between calculated and true SUR values for caudate and putamen could be described by a multiple linear model which took into account the spill-over phenomenon caused by PVE (R² ≥ 0.963 for caudate and ≥ 0.980 for putamen) and also by a simple linear model (R² ≥ 0.952 for caudate and ≥ 0.973 for putamen). Calculated values were standardized by inverting both linear systems. Differences between standardized and true values showed that, although the multiple linear model was the best approach in terms of variability (χ² ≤ 11.79 for caudate and ≤ 7.36 for putamen), standardization based on a simple linear model was also suitable (χ² ≤ 12.44 for caudate and ≤ 12.57 for putamen).
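The standardization step (inverting a fitted linear relation between calculated and true SUR) can be sketched with NumPy. The matrix coefficients below are illustrative, not the paper's fitted values; the off-diagonal terms mimic the caudate-putamen spill-over attributed to PVE above.

```python
import numpy as np

# Multiple linear model: SUR_calc = M @ SUR_true + b, where the off-diagonal
# terms of M represent spill-over between caudate and putamen (PVE).
# Coefficients are illustrative, not the study's fitted values.
M = np.array([[0.70, 0.10],
              [0.08, 0.75]])
b = np.array([0.05, 0.04])

def standardize(sur_calc: np.ndarray) -> np.ndarray:
    """Recover standardized (true-scale) SUR by inverting the linear system."""
    return np.linalg.solve(M, sur_calc - b)

true_sur = np.array([3.0, 4.0])          # caudate, putamen
calc_sur = M @ true_sur + b              # what quantification would yield
print(np.allclose(standardize(calc_sur), true_sur))  # True
```

The simple linear model is the special case where M is diagonal, so each structure is standardized independently.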

  7. Assessment of right and left ventricular function in healthy mice by blood-pool pinhole gated SPECT.

    PubMed

    Goetz, Christian; Monassier, Laurent; Choquet, Philippe; Constantinesco, André

    2008-09-01

The feasibility of blood-pool pinhole ECG-gated SPECT was investigated in healthy mice to assess right and left ventricular function. Anaesthetized (isoflurane 1-1.5%) adult CD1 mice (n=11) were analyzed after intravenous administration of 0.2 ml (550 MBq) of (99m)Tc-labelled human albumin. For blood-pool gated SPECT imaging, 48 ventral step-and-shoot projections with eight time bins per RR interval over 180 degrees with 64 x 64 word images were acquired with a small-animal gamma camera equipped with a pinhole collimator (focal length 12 cm, aperture diameter 1.5 mm). For appropriate segmentation of right and left ventricular volumes, a 4D Fourier analysis was performed after reconstruction and reorientation of blood-pool images with a voxel size of 0.55 x 0.55 x 0.55 mm(3). Average right and left ejection fractions were 52±4.7% and 65±5.2%, respectively. Right end-diastolic and end-systolic volumes were significantly higher than the corresponding left ventricular volumes (P<0.0001 each). A linear correlation between right and left stroke volumes (r=0.9, P<0.0001) was obtained, and right and left cardiac outputs were not significantly different (14.2±1.9 and 14.1±2.0 ml/min, respectively).

  8. A 3D image analysis tool for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.

    2005-04-01

    We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.
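The intensity-based thresholding and volume measurement steps described above can be sketched as follows. The fraction-of-maximum threshold, voxel size, and toy image are illustrative assumptions, not the tool's actual parameters.

```python
import numpy as np

def threshold_volume(image: np.ndarray, voxel_ml: float, frac: float = 0.5):
    """Segment voxels at or above `frac` of the image maximum and return
    the binary mask and the segmented volume in millilitres."""
    mask = image >= frac * image.max()
    return mask, mask.sum() * voxel_ml

# Toy 3D activity map: a bright 4x4x4 region in a dim background,
# standing in for a gastric region of interest.
img = np.full((16, 16, 16), 10.0)
img[6:10, 6:10, 6:10] = 100.0
mask, vol = threshold_volume(img, voxel_ml=0.02)
print(vol)  # 64 voxels * 0.02 ml/voxel = 1.28 ml
```

Fuzzy-connectedness segmentation, mentioned above for full automation, would replace the global threshold with seeded region growing under an affinity measure.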

  9. Photon count and contrast-detail detection potential comparison between parallel and fan beam brain SPECT

    SciTech Connect

    Kim, K.I.; Lim, C.B.

    1986-02-01

Current brain SPECT based on parallel-beam projection with round Anger cameras suffers from both low sensitivity and poor resolution due to shoulder interference. SPECT using fan-beam projection with a wide rectangular camera would significantly improve image quality through increased sensitivity and close brain access. For experimental verification, a rectangular camera with a 16'' x 8.7'' FOV and a 3'' shoulder edge has been developed. For this geometry, sensitivity and resolution improvements have been measured. Fan-beam imaging tests verified the analysis results by showing a 60% sensitivity increase and a resolution improvement from 13-14 mm to 10 mm at the image center. In order to assess the level of imaging improvement quantitatively, an analytical comparison of SPECT contrast-detail detectability has been made. An experimental contrast-detail detectability comparison between parallel- and fan-beam brain SPECT is presented together with the predicted model result.

  10. Adaptive SPECT

    PubMed Central

    Barrett, Harrison H.; Furenlid, Lars R.; Freed, Melanie; Hesterman, Jacob Y.; Kupinski, Matthew A.; Clarkson, Eric; Whitaker, Meredith K.

    2008-01-01

    Adaptive imaging systems alter their data-acquisition configuration or protocol in response to the image information received. An adaptive pinhole single-photon emission computed tomography (SPECT) system might acquire an initial scout image to obtain preliminary information about the radiotracer distribution and then adjust the configuration or sizes of the pinholes, the magnifications, or the projection angles in order to improve performance. This paper briefly describes two small-animal SPECT systems that allow this flexibility and then presents a framework for evaluating adaptive systems in general, and adaptive SPECT systems in particular. The evaluation is in terms of the performance of linear observers on detection or estimation tasks. Expressions are derived for the ideal linear (Hotelling) observer and the ideal linear (Wiener) estimator with adaptive imaging. Detailed expressions for the performance figures of merit are given, and possible adaptation rules are discussed. PMID:18541485

  11. Assessment of hybrid rotation-translation scan schemes for in vivo animal SPECT imaging

    NASA Astrophysics Data System (ADS)

    Xia, Yan; Yao, Rutao; Deng, Xiao; Liu, Yaqiang; Wang, Shi; Ma, Tianyu

    2013-02-01

    To perform in vivo animal single photon emission computed tomography imaging on a stationary detector gantry, we introduced a hybrid rotation-translation (HRT) tomographic scan, a combination of translational and limited angle rotational movements of the image object, to minimize gravity-induced animal motion. To quantitatively assess the performance of ten HRT scan schemes and the conventional rotation-only scan scheme, two simulated phantoms were first scanned with each scheme to derive the corresponding image resolution (IR) in the image field of view. The IR results of all the scan schemes were visually assessed and compared with corresponding outputs of four scan scheme evaluation indices, i.e. sampling completeness (SC), sensitivity (S), conventional system resolution (SR), and a newly devised directional spatial resolution (DR) that measures the resolution in any specified orientation. A representative HRT scheme was tested with an experimental phantom study. Eight of the ten HRT scan schemes evaluated achieved a superior performance compared to two other HRT schemes and the rotation-only scheme in terms of phantom image resolution. The same eight HRT scan schemes also achieved equivalent or better performance in terms of the four quantitative indices than the conventional rotation-only scheme. As compared to the conventional index SR, the new index DR appears to be a more relevant indicator of system resolution performance. The experimental phantom image obtained from the selected HRT scheme was satisfactory. We conclude that it is feasible to perform in vivo animal imaging with a HRT scan scheme and SC and DR are useful predictors for quantitatively assessing the performance of a scan scheme.

  12. Intravenous adenosine (adenoscan) versus exercise in the noninvasive assessment of coronary artery disease by SPECT

    SciTech Connect

    LaManna, M.M.; Mohama, R.; Slavich, I.L. 3d.; Lumia, F.J.; Cha, S.D.; Rambaran, N.; Maranhao, V. )

    1990-11-01

Fifteen patients (mean age 58 years) underwent adenosine and maximal-exercise thallium SPECT imaging. All scans were performed 1 week apart and within 4 weeks of cardiac catheterization. SPECT imaging was performed after the infusion of 140 micrograms/kg/min of adenosine for 6 minutes. Mean heart rate increased during adenosine administration from 67 ± 3.7 to 77 ± 4.1 beats/min. Mean blood pressure changed from 136 ± 7.2 to 135 ± 6.2 mmHg systolic and from 78 ± 1.8 to 68 ± 2.6 mmHg diastolic. No adverse hemodynamic effects were observed. There were no changes in PR or QRS intervals. Five stress ECGs were ischemic. No ST changes were observed with adenosine. Although 68% of the patients had symptoms of flushing, light-headedness, and dizziness during adenosine infusion, symptoms resolved within 1 minute of dosage adjustment or termination of the infusion in all but one patient, who required theophylline. Sensitivity for coronary artery disease detection was 77% and specificity 100%. Concordance between adenosine scans and exercise thallium scintigraphy was high (13/15 = 87%). In two patients, there were minor scintigraphic differences. The authors conclude that adenosine is a sensitive, specific, and safe alternative to exercise testing in patients referred for thallium imaging and may be preferable to dipyridamole.

  13. Bone Scintigraphy SPECT/CT Evaluation of Mandibular Condylar Hyperplasia.

    PubMed

    Yang, Zhiyun; Reed, Tameron; Longino, Becky H

    2016-03-01

    Mandibular condylar hyperplasia (CH) is a complex developmental deformity resulting in asymmetries of the hyperplastic condyle. Bone scan SPECT is a sensitive and accurate method of detecting the growth activity of this disorder. This method can be used to quantitate the radionuclide uptake differences between the left and right condyles. Uptake differences of 10% or more between the left and right condyles, with increased uptake ipsilateral to the CH, are considered to be evidence of active growing CH. Quantitative assessment of CH is important to select an appropriate treatment course. Degenerative arthropathies of the temporomandibular joints may result in altered uptake, but this is mostly associated with the side contralateral to the CH. The CT portion of SPECT/CT is useful to assess the condylar dimensions and underlying bony changes. PMID:26111714
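The ≥10% uptake-difference criterion described above can be expressed as a simple calculation. Note that the exact normalization (relative to the mean of both condyles, as here, versus relative to the contralateral side) varies between protocols; this form is one common convention, not necessarily the one used in the article.

```python
def condylar_uptake_difference(left_counts: float, right_counts: float) -> float:
    """Percent uptake difference between condyles, relative to their mean.

    The normalization convention is an assumption; protocols differ on
    whether the denominator is the mean or the contralateral condyle.
    """
    mean = (left_counts + right_counts) / 2.0
    return 100.0 * abs(left_counts - right_counts) / mean

# A difference of >= 10%, with increased uptake ipsilateral to the CH,
# is the evidence of actively growing CH described above.
diff = condylar_uptake_difference(1150.0, 1000.0)
print(round(diff, 1), diff >= 10.0)  # 14.0 True
```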

  14. Implementation and assessment of an animal management system for small-animal micro-CT / micro-SPECT imaging

    NASA Astrophysics Data System (ADS)

    Holdsworth, David W.; Detombe, Sarah A.; Chiodo, Chris; Fricke, Stanley T.; Drangova, Maria

    2011-03-01

    Advances in laboratory imaging systems for CT, SPECT, MRI, and PET facilitate routine micro-imaging during pre-clinical investigations. Challenges still arise when dealing with immune-compromised animals, biohazardous agents, and multi-modality imaging. These challenges can be overcome with an appropriate animal management system (AMS), with the capability for supporting and monitoring a rat or mouse during micro-imaging. We report the implementation and assessment of a new AMS system for mice (PRA-3000 / AHS-2750, ASI Instruments, Warren MI), designed to be compatible with a commercial micro-CT / micro-SPECT imaging system (eXplore speCZT, GE Healthcare, London ON). The AMS was assessed under the following criteria: 1) compatibility with the imaging system (i.e. artifact generation, geometric dimensions); 2) compatibility with live animals (i.e. positioning, temperature regulation, anesthetic supply); 3) monitoring capabilities (i.e. rectal temperature, respiratory and cardiac monitoring); 4) stability of co-registration; and 5) containment. Micro-CT scans performed using a standardized live-animal protocol (90 kVp, 40 mA, 900 views, 16 ms per view) exhibited low noise (+/-19 HU) and acceptable artifact from high-density components within the AMS (e.g. ECG pad contacts). Live mice were imaged repeatedly (with removal and replacement of the AMS) and spatial registration was found to be stable to within +/-0.07 mm. All animals tolerated enclosure within the AMS for extended periods (i.e. > one hour) without distress, based on continuous recordings of rectal temperature, ECG waveform and respiratory rate. A sealed AMS system extends the capability of a conventional micro-imaging system to include immune-compromised and biosafety level 2 mouse-imaging protocols.

  15. Trends in quantitative cancer risk assessment.

    PubMed Central

    Morris, S C

    1991-01-01

    Quantitative cancer risk assessment is a dynamic field, more closely coupled to rapidly advancing biomedical research than ever before. Six areas of change and growth are identified: expansion from models of cancer initiation to a more complete picture of the total carcinogenic process; trend from curve-fitting to biologically based models; movement from upperbound estimates to best estimates, with a more complete treatment of uncertainty; increased consideration of the role of susceptibility; growing development of expert systems and decision support systems; and emerging importance of risk communication. PMID:2050076

  16. A multiresolution restoration method for cardiac SPECT

    NASA Astrophysics Data System (ADS)

    Franquiz, Juan Manuel

    Single-photon emission computed tomography (SPECT) is affected by photon attenuation and image blurring due to Compton scatter and geometric detector response. Attenuation correction is important to increase diagnostic accuracy of cardiac SPECT. However, in attenuation-corrected scans, scattered photons from radioactivity in the liver could produce a spillover of counts into the inferior myocardial wall. In the clinical setting, blurring effects could be compensated by restoration with Wiener and Metz filters. Inconveniences of these procedures are that the Wiener filter depends upon the power spectra of the object image and noise, which are unknown, while Metz parameters have to be optimized by trial and error. This research develops an alternative restoration procedure based on a multiresolution denoising and regularization algorithm. It was hypothesized that this representation leads to a more straightforward and automatic restoration than conventional filters. The main objective of the research was the development and assessment of the multiresolution algorithm for compensating the liver spillover artifact. The multiresolution algorithm decomposes original SPECT projections into a set of sub-band frequency images. This allows a simple denoising and regularization procedure by discarding high frequency channels and performing inversion only in low and intermediate frequencies. The method was assessed in bull's eye polar maps and short- axis attenuation-corrected reconstructions of a realistic cardiac-chest phantom with a custom-made liver insert and different 99mTc liver-to-heart activity ratios. Inferior myocardial defects were simulated in some experiments. The cardiac phantom in free air was considered as the gold standard reference. Quantitative analysis was performed by calculating contrast of short- axis slices and the normalized chi-square measure, defect size and mean and standard deviation of polar map counts. The performance of the multiresolution
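The sub-band decomposition and denoising idea described above (keep low and intermediate frequency channels, discard high-frequency ones before inversion) can be illustrated with a toy Haar-style scheme. This is a sketch of the general multiresolution idea only, not the dissertation's actual transform or regularization.

```python
import numpy as np

def subband_denoise(signal: np.ndarray, levels: int = 1) -> np.ndarray:
    """Toy multiresolution denoising: Haar-style analysis into approximation
    (low-frequency) bands, discard all detail (high-frequency) bands, then
    synthesize by upsampling the coarsest approximation.

    Illustrative of 'discard high-frequency channels', not the actual method.
    Signal length must be divisible by 2**levels.
    """
    x = signal.astype(float)
    for _ in range(levels):
        x = 0.5 * (x[0::2] + x[1::2])   # low-pass (approximation) band
    # Synthesis with detail bands zeroed reduces to nearest-neighbour upsampling.
    for _ in range(levels):
        x = np.repeat(x, 2)
    return x

noisy = np.array([1.0, 1.2, 0.9, 1.1, 5.0, 5.2, 4.9, 5.1])
print(subband_denoise(noisy, levels=1))
```

Pixel-to-pixel noise is suppressed while the large step (the coarse structure) survives, which is the behaviour wanted when regularizing only the retained bands.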

  17. Radiopharmaceuticals for SPECT Cancer Detection

    NASA Astrophysics Data System (ADS)

    Chernov, V. I.; Medvedeva, A. A.; Zelchan, R. V.; Sinilkin, I. G.; Stasyuk, E. S.; Larionova, L. A.; Slonimskaya, E. M.; Choynzonov, E. L.

    2016-06-01

The purpose of the study was to assess the efficacy of single photon emission computed tomography (SPECT) with 199Tl and 99mTc-MIBI in the detection of breast, laryngeal and hypopharyngeal cancers. Materials and Methods: A total of 220 patients were included into the study. Of them, there were 120 patients with breast lesions (100 patients with breast cancer and 20 patients with benign breast tumors) and 100 patients with laryngeal/hypopharyngeal diseases (80 patients with laryngeal/hypopharyngeal cancer and 20 patients with benign laryngeal/hypopharyngeal lesions). Results: No abnormal 199Tl uptake was seen in any of the patients with benign breast and laryngeal lesions, indicating a 100% specificity of 199Tl SPECT. In breast cancer patients, increased 199Tl uptake in the breast was visualized in 94.8% of patients, and increased 99mTc-MIBI uptake in 93.4%. Increased 199Tl uptake in axillary lymph nodes was detected in 60% of patients, and 99mTc-MIBI uptake in 93.1%. In patients with laryngeal/hypopharyngeal cancer, the sensitivity of SPECT with 199Tl and 99mTc-MIBI was 95%. The 199Tl SPECT sensitivity in the identification of regional lymph node metastases in patients with laryngeal/hypopharyngeal cancer was 75%, and the 99mTc-MIBI SPECT sensitivity was 17%. Conclusion: The data obtained show that SPECT with 199Tl and 99mTc-MIBI can be used as an additional imaging method in the detection of tumors.

  18. Radiopharmaceuticals for SPECT cancer detection

    NASA Astrophysics Data System (ADS)

    Chernov, V. I.; Medvedeva, A. A.; Zelchan, R. V.; Sinilkin, I. G.; Stasyuk, E. S.; Larionova, L. A.; Slonimskaya, E. M.; Choynzonov, E. L.

    2016-08-01

    The purpose of the study was to assess the efficacy of single photon emission computed tomography (SPECT) with 199Tl and 99mTc-MIBI in the detection of breast, laryngeal and hypopharyngeal cancers. A total of 220 patients were included into the study: 120 patients with breast lesions (100 patients with breast cancer and 20 patients with benign breast tumors) and 100 patients with laryngeal/hypopharyngeal diseases (80 patients with laryngeal/hypopharyngeal cancer and 20 patients with benign laryngeal/hypopharyngeal lesions). No abnormal 199Tl uptake was seen in all patients with benign breast and laryngeal lesions, indicating a 100% specificity of 199Tl SPECT. In the breast cancer patients, the increased 199Tl uptake in the breast was visualized in 94.8% patients, 99mTc-MIBI—in 93.4% patients. The increased 199Tl uptake in axillary lymph nodes was detected in 60% patients, and 99mTc-MIBI—in 93.1% patients. In patients with laryngeal/hypopharyngeal cancer, the sensitivity of SPECT with 199Tl and 99mTc-MIBI was 95%. The 199Tl SPECT sensitivity in identification of regional lymph node metastases in the patients with laryngeal/hypopharyngeal cancer was 75% and the 99mTc-MIBI SPECT sensitivity was 17%. The data obtained showed that SPECT with 199Tl and 99mTc-MIBI can be used as one of the additional imaging methods in detection of tumors.

  19. Observer assessment of multi-pinhole SPECT geometries for prostate cancer imaging: a simulation study

    NASA Astrophysics Data System (ADS)

    Kalantari, Faraz; Sen, Anando; Gifford, Howard C.

    2014-03-01

    SPECT imaging using In-111 ProstaScint is an FDA-approved method for diagnosing prostate cancer metastases within the pelvis. However, conventional medium-energy parallel-hole (MEPAR) collimators produce poor image quality, and we are investigating the use of multipinhole (MPH) imaging as an alternative. This paper presents a method for evaluating MPH designs that makes use of sampling-sensitive (SS) mathematical model observers for tumor detection-localization tasks. Key to our approach is the redefinition of a normal (or background) reference image that is used with scanning model observers. We used this approach to compare different MPH configurations for the task of small-tumor detection in the prostate and surrounding lymph nodes. Four configurations used 10, 20, 30, and 60 pinholes evenly spaced over a complete circular orbit. A fixed-count acquisition protocol was assumed. Spherical tumors were placed within a digital anthropomorphic phantom having a realistic ProstaScint biodistribution. Imaging data sets were generated with an analytical projector and reconstructed volumes were obtained with the OSEM algorithm. The MPH configurations were compared in a localization ROC (LROC) study with 2D pelvic images and both human and model observers. Regular and SS versions of the scanning channelized nonprewhitening (CNPW) and visual-search (VS) model observers were applied. The SS models demonstrated the highest correlations with the average human-observer results.

  20. PET and SPECT Radiotracers to Assess Function and Expression of ABC Transporters in Vivo

    PubMed Central

    Mairinger, Severin; Erker, Thomas; Müller, Markus; Langer, Oliver

    2013-01-01

    Adenosine triphosphate-binding cassette (ABC) transporters, such as P-glycoprotein (Pgp, ABCB1), breast cancer resistance protein (BCRP, ABCG2) and multidrug resistance-associated proteins (MRPs) are expressed in high concentrations at various physiological barriers (e.g. blood-brain barrier, blood-testis barrier, blood-tumor barrier), where they impede the tissue accumulation of various drugs by active efflux transport. Changes in ABC transporter expression and function are thought to be implicated in various diseases, such as cancer, epilepsy, Alzheimer’s and Parkinson’s disease. The availability of a non-invasive imaging method which allows for measuring ABC transporter function or expression in vivo would be of great clinical use in that it could facilitate the identification of those patients that would benefit from treatment with ABC transporter modulating drugs. To date three different kinds of imaging probes have been described to measure ABC transporters in vivo: i) radiolabelled transporter substrates ii) radiolabelled transporter inhibitors and iii) radiolabelled prodrugs which are enzymatically converted into transporter substrates in the organ of interest (e.g. brain). The design of new imaging probes to visualize efflux transporters is inter alia complicated by the overlapping substrate recognition pattern of different ABC transporter types. The present article will describe currently available ABC transporter radiotracers for positron emission tomography (PET) and single-photon emission computed tomography (SPECT) and critically discuss strengths and limitations of individual probes and their potential clinical applications. PMID:21434859

  1. Hydrogen quantitative risk assessment workshop proceedings.

    SciTech Connect

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG], and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen Specific QRA Toolkit.

  2. A toolbox for rockfall Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Agliardi, F.; Mavrouli, O.; Schubert, M.; Corominas, J.; Crosta, G. B.; Faber, M. H.; Frattini, P.; Narasimhan, H.

    2012-04-01

    Rockfall Quantitative Risk Analysis for mitigation design and implementation requires evaluating the probability of rockfall events, the probability and intensity of impacts on structures (elements at risk and countermeasures), their vulnerability, and the related expected costs for different scenarios. A sound theoretical framework has been developed during the last years both for spatially-distributed and local (i.e. single element at risk) analyses. Nevertheless, the practical application of existing methodologies remains challenging, due to difficulties in the collection of required data and to the lack of simple, dedicated analysis tools. In order to fill this gap, specific tools have been developed in the form of Excel spreadsheets, in the framework of the SafeLand EU project. These tools can be used by stakeholders, practitioners and other interested parties for the quantitative calculation of rockfall risk through its key components (probabilities, vulnerability, loss), using combinations of deterministic and probabilistic approaches. Three tools have been developed, namely: QuRAR (by UNIMIB), VulBlock (by UPC), and RiskNow-Falling Rocks (by ETH Zurich). QuRAR implements a spatially distributed, quantitative assessment methodology of rockfall risk for individual buildings or structures in a multi-building context (urban area). Risk is calculated in terms of expected annual cost, through the evaluation of rockfall event probability, propagation and impact probability (by 3D numerical modelling of rockfall trajectories), and empirical vulnerability for different risk protection scenarios. VulBlock allows a detailed, analytical calculation of the vulnerability of reinforced concrete frame buildings to rockfalls and related fragility curves, both as functions of block velocity and block size. The calculated vulnerability can be integrated in other methodologies/procedures based on the risk equation, by incorporating the uncertainty of the impact location of the rock

  3. LROC assessment of non-linear filtering methods in Ga-67 SPECT imaging

    NASA Astrophysics Data System (ADS)

    De Clercq, Stijn; Staelens, Steven; De Beenhouwer, Jan; D'Asseler, Yves; Lemahieu, Ignace

    2006-03-01

    In emission tomography, iterative reconstruction is usually followed by a linear smoothing filter to make such images more appropriate for visual inspection and diagnosis by a physician. This will result in a global blurring of the images, smoothing across edges and possibly discarding valuable image information for detection tasks. The purpose of this study is to investigate what possible advantages a non-linear, edge-preserving postfilter could have on lesion detection in Ga-67 SPECT imaging. Image quality can be defined based on the task that has to be performed on the image. This study used LROC observer studies based on a dataset created by CPU-intensive GATE Monte Carlo simulations of a voxelized digital phantom. The filters considered in this study were a linear Gaussian filter, a bilateral filter, the Perona-Malik anisotropic diffusion filter and the Catté filtering scheme. The 3D MCAT software phantom was used to simulate the distribution of Ga-67 citrate in the abdomen. Tumor-present cases had a 1-cm diameter tumor randomly placed near the edges of the anatomical boundaries of the kidneys, bone, liver and spleen. Our data set was generated out of a single noisy background simulation using the bootstrap method, to significantly reduce the simulation time and to allow for a larger observer data set. Lesions were simulated separately and added to the background afterwards. These were then reconstructed with an iterative approach, using a sufficiently large number of MLEM iterations to establish convergence. The output of a numerical observer was used in a simplex optimization method to estimate an optimal set of parameters for each postfilter. No significant improvement was found for using edge-preserving filtering techniques over standard linear Gaussian filtering.
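
    As a concrete illustration of the edge-preserving filtering compared above, a minimal one-dimensional Perona-Malik diffusion step can be sketched as follows. The function name and parameter values are illustrative, not those optimized in the study:

```python
import math

def perona_malik_1d(u, iterations=10, kappa=0.1, dt=0.2):
    """One-dimensional Perona-Malik anisotropic diffusion sketch.

    The diffusion coefficient c = exp(-(grad/kappa)**2) shrinks where
    gradients are large, so flat regions are smoothed while edges
    (large gradients) are largely preserved.
    """
    u = list(u)
    for _ in range(iterations):
        new = u[:]
        for i in range(1, len(u) - 1):
            grad_e = u[i + 1] - u[i]   # eastward gradient
            grad_w = u[i - 1] - u[i]   # westward gradient
            c_e = math.exp(-(grad_e / kappa) ** 2)
            c_w = math.exp(-(grad_w / kappa) ** 2)
            new[i] = u[i] + dt * (c_e * grad_e + c_w * grad_w)
        u = new
    return u
```

    A linear Gaussian filter, by contrast, applies the same smoothing weight everywhere, which is what blurs across edges.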

  4. Quantitative Risk Assessment for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Lowry, T. S.; McKenna, S. A.; Hadgu, T.; Kalinina, E.

    2011-12-01

    This study uses a quantitative risk-assessment approach to place the uncertainty associated with enhanced geothermal systems (EGS) development into meaningful context and to identify points of attack that can reduce risk the most. Using the integrated geothermal assessment tool, GT-Mod, we calculate the complementary cumulative distribution function of the levelized cost of electricity (LCOE) that results from uncertainty in a variety of geologic and economic input parameter values. EGS is a developing technology that taps deep (2-10km) geologic heat sources for energy production by "enhancing" non-permeable hot rock through hydraulic stimulation. Despite the promise of EGS, uncertainties in predicting the physical and economic performance of a site have hindered its development. To address this, we apply a quantitative risk-assessment approach that calculates risk as the sum of the consequence, C, multiplied by the range of the probability, ΔP, over all estimations of a given exceedance probability, n, over time, t. The consequence here is defined as the deviation from the best estimate LCOE, which is calculated using the 'best-guess' input parameter values. The analysis assumes a realistic but fictitious EGS site with uncertainties in the exploration success rate, the sub-surface thermal gradient, the reservoir fracture pattern, and the power plant performance. Uncertainty in the exploration, construction, O&M, and drilling costs are also included. The depth to the resource is calculated from the thermal gradient and a target resource temperature of 225 °C. Thermal performance is simulated using the Gringarten analytical solution. The mass flow rate is set to produce 30 MWe of power for the given conditions and is adjusted over time to maintain that rate over the plant lifetime of 30 years. Simulations are conducted using GT-Mod, which dynamically links the physical systems of a geothermal site to simulate, as an integrated, multi-system component, the
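
    The risk measure described above (consequence times probability mass, summed over the outcome distribution) can be sketched as below. The function `qra_risk` and the equal-weight Monte Carlo sampling assumption are illustrative, not GT-Mod's actual implementation:

```python
def qra_risk(lcoe_samples, lcoe_best):
    """Risk = sum of C * dP over sampled outcomes, where the consequence C
    is the absolute deviation of each simulated LCOE from the best-estimate
    LCOE and dP is the probability mass of each equally weighted sample."""
    dp = 1.0 / len(lcoe_samples)
    return sum(abs(lcoe - lcoe_best) * dp for lcoe in lcoe_samples)
```

    For example, three equally likely LCOE outcomes of 10, 12, and 14 around a best estimate of 12 give a risk of 4/3 in the same cost units.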

  5. Activity concentration measurements using a conjugate gradient (Siemens xSPECT) reconstruction algorithm in SPECT/CT.

    PubMed

    Armstrong, Ian S; Hoffmann, Sandra A

    2016-11-01

    Quantitative single photon emission computed tomography (SPECT) shows potential in a number of clinical applications, and several vendors now provide software and hardware solutions to allow 'SUV-SPECT' to mirror metrics used in PET imaging. This brief technical report assesses the accuracy of activity concentration measurements using a new algorithm 'xSPECT' from Siemens Healthcare. SPECT/CT data were acquired from a uniform cylinder with 5, 10, 15 and 20 s/projection and a NEMA image quality phantom with 25 s/projection. The NEMA phantom had hot spheres filled with an 8:1 activity concentration relative to the background compartment. Reconstructions were performed using parameters defined by manufacturer presets available with the algorithm. The accuracy of activity concentration measurements was assessed. A dose calibrator-camera cross-calibration factor (CCF) was derived from the uniform phantom data. In uniform phantom images, a positive bias was observed, ranging from ∼6% in the lower-count images to ∼4% in the higher-count images. On the basis of the higher-count data, a CCF of 0.96 was derived. As expected, considerable negative bias was measured in the NEMA spheres using region mean values, whereas positive bias was measured in the four largest NEMA spheres. Nonmonotonically increasing recovery curves for the hot spheres suggested the presence of Gibbs edge enhancement from resolution modelling. Sufficiently accurate activity concentration measurements can easily be obtained from images reconstructed with the xSPECT algorithm without a CCF. However, the use of a CCF is likely to improve accuracy further. A manual conversion of voxel values into SUV should be possible, provided that the patient weight, injected activity and time between injection and imaging are all known accurately.
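
    The manual conversion of voxel values into SUV mentioned in the conclusion can be sketched as follows. The helper name is hypothetical; the Tc-99m physical half-life (~6.01 h) is standard, and the body-weight SUV definition assumes a tissue density of 1 g/mL:

```python
import math

TC99M_HALF_LIFE_MIN = 360.6  # Tc-99m physical half-life, ~6.01 h

def suv_from_voxel(voxel_bq_per_ml, injected_mbq, weight_kg,
                   minutes_since_injection):
    """Convert a calibrated voxel value (Bq/mL) into body-weight SUV.

    SUV = tissue concentration / (decay-corrected injected activity /
    body weight), approximating tissue density as 1 g/mL.
    """
    # Decay-correct the injected activity to the time of imaging
    decayed_mbq = injected_mbq * math.exp(
        -math.log(2.0) * minutes_since_injection / TC99M_HALF_LIFE_MIN)
    return voxel_bq_per_ml / (decayed_mbq * 1e6 / (weight_kg * 1000.0))
```

    By construction, a tracer distributed uniformly through the whole body would yield SUV = 1 in every voxel.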

  6. Activity concentration measurements using a conjugate gradient (Siemens xSPECT) reconstruction algorithm in SPECT/CT.

    PubMed

    Armstrong, Ian S; Hoffmann, Sandra A

    2016-11-01

    Quantitative single photon emission computed tomography (SPECT) shows potential in a number of clinical applications, and several vendors now provide software and hardware solutions to allow 'SUV-SPECT' to mirror metrics used in PET imaging. This brief technical report assesses the accuracy of activity concentration measurements using a new algorithm 'xSPECT' from Siemens Healthcare. SPECT/CT data were acquired from a uniform cylinder with 5, 10, 15 and 20 s/projection and a NEMA image quality phantom with 25 s/projection. The NEMA phantom had hot spheres filled with an 8:1 activity concentration relative to the background compartment. Reconstructions were performed using parameters defined by manufacturer presets available with the algorithm. The accuracy of activity concentration measurements was assessed. A dose calibrator-camera cross-calibration factor (CCF) was derived from the uniform phantom data. In uniform phantom images, a positive bias was observed, ranging from ∼6% in the lower-count images to ∼4% in the higher-count images. On the basis of the higher-count data, a CCF of 0.96 was derived. As expected, considerable negative bias was measured in the NEMA spheres using region mean values, whereas positive bias was measured in the four largest NEMA spheres. Nonmonotonically increasing recovery curves for the hot spheres suggested the presence of Gibbs edge enhancement from resolution modelling. Sufficiently accurate activity concentration measurements can easily be obtained from images reconstructed with the xSPECT algorithm without a CCF. However, the use of a CCF is likely to improve accuracy further. A manual conversion of voxel values into SUV should be possible, provided that the patient weight, injected activity and time between injection and imaging are all known accurately. PMID:27501436

  7. Quantitative ultrasound assessment of cervical microstructure.

    PubMed

    Feltovich, Helen; Nam, Kibo; Hall, Timothy J

    2010-07-01

    The objective of this preliminary study was to determine whether quantitative ultrasound (QUS) can provide insight into, and characterization of, uterine cervical microstructure. Throughout pregnancy, cervical collagen reorganizes (from aligned and anisotropic to disorganized and isotropic) as the cervix changes in preparation for delivery. Premature changes in collagen are associated with premature birth in mammals. Because QUS is able to detect structural anisotropy/isotropy, we hypothesized that it may provide a means of noninvasively assessing cervical microstructure. Thorough study of cervical microstructure has been limited by lack of technology to detect small changes in collagen organization, which has in turn limited our ability to detect abnormal and/or premature changes in collagen that may lead to preterm birth. In order to determine whether QUS may be useful for detection of cervical microstructure, radiofrequency (rf) echo data were acquired from the cervices of human hysterectomy specimens (n = 10). The angle between the acoustic beam and tissue was used to assess anisotropic acoustic propagation by control of transmit/receive angles from -20 degrees to +20 degrees. The power spectrum of the echo signals from within a region of interest was computed in order to investigate the microstructure of the tissue. An identical analysis was performed on a homogeneous phantom with spherical scatterers for system calibration. Power spectra of backscattered rf from the cervix were 6 dB higher for normal (0 degree) than steered (+/- 20 degrees) beams. The spectral power for steered beams decreased monotonically (0.4 dB at +5 degrees to 3.6 dB at +20 degrees). The excess difference (compared to similar analysis for the phantom) in normally-incident (0 degree) versus steered beams is consistent with scattering from an aligned component of the cervical microstructure. Therefore, QUS appears to reliably identify an aligned component of cervical microstructure

  8. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  9. Assessment of a Monte-Carlo simulation of SPECT recordings from a new-generation heart-centric semiconductor camera: from point sources to human images.

    PubMed

    Imbert, Laetitia; Galbrun, Ernest; Odille, Freddy; Poussier, Sylvain; Noel, Alain; Wolf, Didier; Karcher, Gilles; Marie, Pierre-Yves

    2015-02-01

    Geant4 application for tomographic emission (GATE), a Monte-Carlo simulation platform, has previously been used for optimizing tomoscintigraphic images recorded with scintillation Anger cameras but not with the new-generation heart-centric cadmium-zinc-telluride (CZT) cameras. Using the GATE platform, this study aimed at simulating the SPECT recordings from one of these new CZT cameras and at assessing this simulation by direct comparison between simulated and actual recorded data, ranging from point sources to human images. Geometry and movement of detectors, as well as their respective energy responses, were modeled for the CZT 'D.SPECT' camera in the GATE platform. Both simulated and actual recorded data were obtained from: (1) point and linear sources of (99m)Tc for compared assessments of detection sensitivity and spatial resolution, (2) a cardiac insert filled with a (99m)Tc solution for compared assessments of contrast-to-noise ratio and sharpness of myocardial borders and (3) a patient with myocardial infarction, using segmented cardiac magnetic resonance images. Most of the data from the simulated images exhibited high concordance with the results of actual images with relative differences of only: (1) 0.5% for detection sensitivity, (2) 6.7% for spatial resolution, (3) 2.6% for contrast-to-noise ratio and 5.0% for sharpness index on the cardiac insert placed in a diffusing environment. There was also good concordance between actual and simulated gated-SPECT patient images for the delineation of the myocardial infarction area, although the quality of the simulated images was clearly superior with increases around 50% for both contrast-to-noise ratio and sharpness index. SPECT recordings from a new heart-centric CZT camera can be simulated with the GATE software with high concordance relative to the actual physical properties of this camera.
These simulations may be conducted up to the stage of human SPECT images even if further refinement is needed

  10. Comparison of SPECT using technetium-99m agents and thallium-201 and PET for the assessment of myocardial perfusion and viability

    SciTech Connect

    Berman, D.S.; Kiat, H.; Van Train, K.F.; Friedman, J.; Garcia, E.V.; Maddahi, J.

    1990-10-16

    This report reviews the applications of tomographic imaging with current and new tracers in assessing myocardial perfusion and viability. Multiple studies with thallium-201 (Tl-201) single photon emission computed tomography (SPECT) imaging for the detection of coronary artery disease (CAD) have demonstrated high sensitivity, high rates of normalcy and high reproducibility. In assessing viability, fixed defects are frequently detected in viable zones in 4-hour studies with Tl-201 imaging. Redistribution imaging performed 18 to 72 hours after injection or reinjection of Tl-201 before 4-hour redistribution imaging has been shown to improve accuracy of viability assessment. Tl-201 SPECT studies are limited by the suboptimal physical properties of Tl-201, which result in variable image quality. The 2 new technetium-99m (Tc-99m)-labeled myocardial perfusion tracers offer the ability to inject much higher amounts of radioactivity, making it possible to assess ventricular function as well as myocardial perfusion from the same injection of radiotracer. Tc-99m sestamibi has very slow myocardial clearance, which allows for prolonged imaging time and results in image quality superior to that obtained with Tl-201 and Tc-99m teboroxime. The combination of minimal redistribution of Tc-99m sestamibi and high count rates makes gated SPECT imaging feasible, and also permits assessment of patients with acute ischemic syndromes by uncoupling the time of injection from the time of imaging. The combination of high image quality and first-pass exercise capabilities may lead to a choice of this agent over Tl-201 for assessment of chronic CAD.(ABSTRACT TRUNCATED AT 250 WORDS)

  11. [Studies of biologic activation associated with molecular receptor increase and tumor response in ChL6/L6 protocol patients; Studies in phantoms; Quantitative SPECT; Preclinical studies; and Clinical studies]. DOE annual report, 1994--95

    SciTech Connect

    DeNardo, S.J.

    1995-12-31

    The authors describe results which have not yet been published from their associated studies listed in the title. For the first, they discuss Lym-1 single chain genetically engineered molecules, analysis of molecular genetic coded messages to enhance tumor response, and human dosimetry and therapeutic human use radiopharmaceuticals. Studies in phantoms include a discussion of planar image quantitation, counts coincidence correction, organ studies, tumor studies, and {sup 90}Y quantitation with Bremsstrahlung imaging. The study on SPECT discusses attenuation correction and scatter correction. Preclinical studies investigated uptake of {sup 90}Y-BrE-3 in mice using autoradiography. Clinical studies discuss image quantitation versus counts from biopsy samples, S factors for radiation dose calculation, {sup 67}Cu imaging studies for lymphoma cancer, and {sup 111}In MoAb imaging studies for breast cancer to predict {sup 90}Y MoAb therapy.

  12. Altered myocardial perfusion in patients with angina pectoris or silent ischemia during exercise as assessed by quantitative thallium-201 single-photon emission computed tomography

    SciTech Connect

    Mahmarian, J.J.; Pratt, C.M.; Cocanougher, M.K.; Verani, M.S.

    1990-10-01

    The extent of abnormally perfused myocardium was compared in patients with and without chest pain during treadmill exercise from a large, relatively low-risk consecutive patient population (n = 356) referred for quantitative thallium-201 single-photon emission computed tomography (SPECT). All patients had concurrent coronary angiography. Patients were excluded if they had prior coronary angioplasty or bypass surgery. Tomographic images were assessed visually and from computer-generated polar maps. Chest pain during exercise was as frequent in patients with normal coronary arteries (12%) as in those with significant (greater than 50% stenosis) coronary artery disease (CAD) (14%). In the 219 patients with significant CAD, silent ischemia was fivefold more common than symptomatic ischemia (83% versus 17%, p = 0.0001). However, there were no differences in the extent, severity, or distribution of coronary stenoses in patients with silent or symptomatic ischemia. Our major observation was that the extent of quantified SPECT perfusion defects was nearly identical in patients with (20.9 +/- 15.9%) and without (20.5 +/- 15.6%) exertional chest pain. The sensitivity for detecting the presence of CAD was significantly improved with quantitative SPECT compared with stress electrocardiography (87% versus 65%, p = 0.0001). Although scintigraphic and electrocardiographic evidence of exercise-induced ischemia were comparable in patients with chest pain (67% versus 73%, respectively; p = NS), SPECT was superior to stress electrocardiography for detecting silent myocardial ischemia. The majority of patients in this study with CAD who developed ischemia during exercise testing were asymptomatic, although they exhibited an angiographic profile and extent of abnormally perfused myocardium similar to those of patients with symptomatic ischemia.

  13. Bayes' theorem and quantitative risk assessment

    SciTech Connect

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, it is necessary that the QRA results be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, it argues that one should strive to make their QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
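
    The Bayesian update the paper advocates can be sketched for a discrete set of hypotheses, for example candidate failure-rate values for a component. The function below is an illustrative sketch, not from the paper:

```python
def bayes_update(priors, likelihoods):
    """Bayes' theorem over discrete hypotheses:
    posterior_i is proportional to likelihood_i * prior_i,
    normalized so the posteriors sum to 1."""
    unnorm = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(unnorm)
    return [u / total for u in unnorm]
```

    With equal priors over two hypotheses and evidence four times as likely under the first, the posterior shifts from 0.5/0.5 to 0.8/0.2; the result depends only on the evidence and the stated priors, which is the "evidence dependent" quality the paper calls for.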

  14. Assessing Quantitative Reasoning in Young Children

    ERIC Educational Resources Information Center

    Nunes, Terezinha; Bryant, Peter; Evans, Deborah; Barros, Rossana

    2015-01-01

    Before starting school, many children reason logically about concepts that are basic to their later mathematical learning. We describe a measure of quantitative reasoning that was administered to children at school entry (mean age 5.8 years) and accounted for more variance in a mathematical attainment test than general cognitive ability 16 months…

  15. SPECT assay of radiolabeled monoclonal antibodies

    SciTech Connect

    Jaszczak, R.J.

    1992-02-01

    The accurate determination of the biodistribution of radiolabeled monoclonal antibodies (MoAbs) is important for calculation of dosimetry and evaluation of pharmacokinetic variables such as antibody dose and route of administration. The hypothesis of this application is that the biodistribution of radiolabeled monoclonal antibodies (MoAbs) can be quantitatively determined using single photon emission computed tomography (SPECT). The major thrusts during the third year include the continued development and evaluation of improved 3D SPECT acquisition and reconstruction approaches to improve quantitative imaging of radiolabeled monoclonal antibodies (MoAbs), and the implementation and evaluation of algorithms to register serial SPECT image data sets, or to register 3D SPECT images with 3D image data sets acquired from positron emission tomography (PET) and magnetic resonance imaging (MRI). The research has involved the investigation of statistical models and iterative reconstruction algorithms that accurately account for the physical characteristics of the SPECT acquisition system. It is our belief that SPECT quantification can be improved by accurately modeling the physical processes such as attenuation, scatter, geometric collimator response, and other factors that affect the measured projection data.

  16. SPECT assay of radiolabeled monoclonal antibodies

    SciTech Connect

    Jaszczak, R.J.

    1992-02-01

    The long-term goal of this research project is to develop methods to improve the utility of single photon emission computed tomography (SPECT) to quantify the biodistribution of monoclonal antibodies (MoAbs) labeled with clinically relevant radionuclides ({sup 123}I, {sup 131}I, and {sup 111}In) and with another radionuclide, {sup 211}At, recently used in therapy. We describe here our progress in developing quantitative SPECT methodology for {sup 111}In and {sup 123}I. We have focused our recent research thrusts on the following aspects of SPECT: (1) the development of improved SPECT hardware, such as improved acquisition geometries; (2) the development of better reconstruction methods that provide accurate compensation for the physical factors that affect SPECT quantification; and (3) the application of carefully designed simulations and experiments to validate our hardware and software approaches.

  17. Comparison of image quality, myocardial perfusion, and LV function between standard imaging and single-injection ultra-low-dose imaging using a high-efficiency SPECT camera: the MILLISIEVERT study

    PubMed Central

    Einstein, Andrew J.; Blankstein, Ron; Andrews, Howard; Fish, Mathews; Padgett, Richard; Hayes, Sean W.; Friedman, John D.; Qureshi, Mehreen; Rakotoarivelo, Harivony; Slomka, Piotr; Nakazato, Ryo; Bokhari, Sabahat; Di Carli, Marcello; Berman, Daniel S.

    2015-01-01

    SPECT myocardial perfusion imaging (MPI) plays a central role in coronary artery disease diagnosis, but concerns exist regarding its radiation burden. Compared to standard Anger-SPECT (A-SPECT) cameras, new high-efficiency (HE) cameras with specialized collimators and solid-state cadmium-zinc-telluride detectors offer potential to maintain image quality (IQ) while reducing administered activity and thus radiation dose to patients. No previous study has compared IQ, interpretation, total perfusion deficit (TPD), or ejection fraction (EF) in patients receiving both ultra-low-dose (ULD) imaging on a HE-SPECT camera and standard low-dose (SLD) A-SPECT imaging. Methods: We compared ULD-HE-SPECT to SLD-A-SPECT imaging by dividing the rest dose in 101 patients at 3 sites scheduled to undergo clinical A-SPECT MPI using a same-day rest/stress Tc-99m protocol. Patients received HE-SPECT imaging following an initial ~130 MBq (3.5 mCi) dose, and SLD-A-SPECT imaging following the remainder of the planned dose. Images were scored visually by 2 blinded readers for IQ and summed rest score (SRS). TPD and EF were assessed quantitatively. Results: Mean activity was 134 MBq (3.62 mCi) for ULD-HE-SPECT (effective dose 1.15 mSv) and 278 MBq (7.50 mCi, 2.39 mSv) for SLD-A-SPECT. Overall IQ was superior for ULD-HE-SPECT (p<0.0001), with twice as many studies graded excellent quality. Extracardiac activity and overall perfusion assessment were similar. Between-method correlations were high for SRS (r=0.87), TPD (r=0.91), and EF (r=0.88). Conclusion: ULD-HE-SPECT rest imaging correlates highly with SLD-A-SPECT. It has improved image quality, comparable extracardiac activity, and achieves radiation dose reduction to 1 mSv for a single injection. PMID:24982439

  18. Poster — Thur Eve — 06: Dose assessment of cone beam CT imaging protocols as part of SPECT/CT examinations

    SciTech Connect

    Tonkopi, E; Ross, AA

    2014-08-15

    Purpose: To assess radiation dose from the cone beam CT (CBCT) component of SPECT/CT studies and to compare it with other CT examinations performed in our institution. Methods: We used an anthropomorphic chest phantom and a 6 cc ion chamber to measure entrance breast dose for several CBCT and diagnostic CT acquisition protocols. The CBCT effective dose was calculated with ImPACT software; the CT effective dose was evaluated from the DLP value and a conversion factor dependent on the anatomic region. The RADAR medical procedure radiation dose calculator was used to assess the nuclear medicine component of the exam dose. Results: The entrance dose to the breast measured with the anthropomorphic phantom was 0.48 mGy and 9.41 mGy for cardiac and chest CBCT scans, respectively, and 4.59 mGy for diagnostic thoracic CT. The effective doses were 0.2 mSv, 3.2 mSv, and 2.8 mSv, respectively. For a small patient represented by the anthropomorphic phantom, the dose from the diagnostic CT was lower than from the CBCT scan, as a result of the exposure reduction options available on modern CT scanners. The CBCT protocols used the same fixed scanning techniques. The diagnostic CT dose based on the patient data was 35% higher than the phantom dose. For most SPECT/CT studies the dose from the CBCT component was comparable with the dose from the radiopharmaceutical. Conclusions: The patient radiation dose from the cone beam CT scan can be higher than that from a diagnostic CT and should be taken into consideration in evaluating total SPECT/CT patient dose.
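The DLP-based effective-dose evaluation mentioned above uses the standard conversion E = DLP × k. A minimal sketch; the region-specific k factors below are commonly tabulated adult values, and the DLP figure is illustrative, not taken from this study:

```python
# Effective dose from CT dose-length product: E [mSv] = DLP [mGy*cm] * k [mSv/(mGy*cm)]
# Region-specific k factors (approximate published adult values).
K_FACTORS = {
    "head": 0.0021,
    "chest": 0.014,
    "abdomen": 0.015,
}

def effective_dose(dlp_mgy_cm: float, region: str) -> float:
    """Estimate effective dose in mSv from a scanner-reported DLP."""
    return dlp_mgy_cm * K_FACTORS[region]

# Illustrative chest scan with a DLP of 200 mGy*cm -> 2.8 mSv
print(round(effective_dose(200.0, "chest"), 2))
```

Scanner consoles report DLP directly, so this conversion is often the quickest route to an effective-dose estimate, at the cost of ignoring patient size.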

  19. Assessing Cardiac Injury in Mice With Dual Energy-MicroCT, 4D-MicroCT, and MicroSPECT Imaging After Partial Heart Irradiation

    SciTech Connect

    Lee, Chang-Lung; Min, Hooney; Befera, Nicholas; Clark, Darin; Qi, Yi; Das, Shiva; Johnson, G. Allan; Badea, Cristian T.; Kirsch, David G.

    2014-03-01

    Purpose: To develop a mouse model of cardiac injury after partial heart irradiation (PHI) and to test whether dual energy (DE)-microCT and 4-dimensional (4D)-microCT can be used to assess cardiac injury after PHI to complement myocardial perfusion imaging using micro-single photon emission computed tomography (SPECT). Methods and Materials: To study cardiac injury from tangent field irradiation in mice, we used a small-field biological irradiator to deliver a single dose of 12 Gy x-rays to approximately one-third of the left ventricle (LV) of Tie2Cre; p53FL/+ and Tie2Cre; p53FL/− mice, where 1 or both alleles of p53 are deleted in endothelial cells. Four and 8 weeks after irradiation, mice were injected with gold and iodinated nanoparticle-based contrast agents, and imaged with DE-microCT and 4D-microCT to evaluate myocardial vascular permeability and cardiac function, respectively. Additionally, the same mice were imaged with microSPECT to assess myocardial perfusion. Results: After PHI with tangent fields, DE-microCT scans showed a time-dependent increase in accumulation of gold nanoparticles (AuNp) in the myocardium of Tie2Cre; p53FL/− mice. In Tie2Cre; p53FL/− mice, extravasation of AuNp was observed within the irradiated LV, whereas in the myocardium of Tie2Cre; p53FL/+ mice, AuNp were restricted to blood vessels. In addition, data from DE-microCT and microSPECT showed a linear correlation (R² = 0.97) between the fraction of the LV that accumulated AuNp and the fraction of LV with a perfusion defect. Furthermore, 4D-microCT scans demonstrated that PHI caused a markedly decreased ejection fraction, and higher end-diastolic and end-systolic volumes, to develop in Tie2Cre; p53FL/− mice, which were associated with compensatory cardiac hypertrophy of the heart that was not irradiated. Conclusions: Our results show that DE-microCT and 4D-microCT with nanoparticle-based contrast agents are novel imaging approaches

  20. Assessment of a Monte-Carlo simulation of SPECT recordings from a new-generation heart-centric semiconductor camera: from point sources to human images

    NASA Astrophysics Data System (ADS)

    Imbert, Laetitia; Galbrun, Ernest; Odille, Freddy; Poussier, Sylvain; Noel, Alain; Wolf, Didier; Karcher, Gilles; Marie, Pierre-Yves

    2015-02-01

    Geant4 application for tomographic emission (GATE), a Monte-Carlo simulation platform, has previously been used for optimizing tomoscintigraphic images recorded with scintillation Anger cameras but not with the new-generation heart-centric cadmium-zinc-telluride (CZT) cameras. Using the GATE platform, this study aimed to simulate the SPECT recordings from one of these new CZT cameras and to assess the simulation by direct comparison between simulated and actual recorded data, ranging from point sources to human images. Geometry and movement of detectors, as well as their respective energy responses, were modeled for the CZT ‘D.SPECT’ camera in the GATE platform. Both simulated and actual recorded data were obtained from: (1) point and linear sources of 99mTc for compared assessments of detection sensitivity and spatial resolution, (2) a cardiac insert filled with a 99mTc solution for compared assessments of contrast-to-noise ratio and sharpness of myocardial borders and (3) a patient with myocardial infarction, using segmented cardiac magnetic resonance images. Most of the data from the simulated images exhibited high concordance with the results of actual images, with relative differences of only: (1) 0.5% for detection sensitivity, (2) 6.7% for spatial resolution, (3) 2.6% for contrast-to-noise ratio and 5.0% for sharpness index on the cardiac insert placed in a diffusing environment. There was also good concordance between actual and simulated gated-SPECT patient images for the delineation of the myocardial infarction area, although the quality of the simulated images was clearly superior, with increases around 50% for both contrast-to-noise ratio and sharpness index. SPECT recordings from a new heart-centric CZT camera can be simulated with the GATE software with high concordance relative to the actual physical properties of this camera. These simulations may be conducted up to the stage of human SPECT-images even if further refinement is needed

  1. Technological Development and Advances in SPECT/CT

    PubMed Central

    Seo, Youngho; Aparici, Carina Mari; Hasegawa, Bruce H

    2010-01-01

    SPECT/CT has emerged over the past decade as a means of correlating anatomical information from CT with functional information from SPECT. The integration of SPECT and CT in a single imaging device facilitates anatomical localization of the radiopharmaceutical to differentiate physiological uptake from that associated with disease and patient-specific attenuation correction to improve the visual quality and quantitative accuracy of the SPECT image. The first clinically available SPECT/CT systems performed emission-transmission imaging using a dual-headed SPECT camera and a low-power x-ray CT sub-system. Newer SPECT/CT systems are available with high-power CT sub-systems suitable for detailed anatomical diagnosis, including CT coronary angiography and coronary calcification that can be correlated with myocardial perfusion measurements. The high-performance CT capabilities also offer the potential to improve compensation of partial volume errors for more accurate quantitation of radionuclide measurement of myocardial blood flow and other physiological processes and for radiation dosimetry for radionuclide therapy. In addition, new SPECT technologies are being developed that significantly improve the detection efficiency and spatial resolution for radionuclide imaging of small organs including the heart, brain, and breast, and therefore may provide new capabilities for SPECT/CT imaging in these important clinical applications. PMID:18396178

  2. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  3. An Assessment of a Low-Cost Visual Tracking System (VTS) to Detect and Compensate for Patient Motion during SPECT

    PubMed Central

    McNamara, Joseph E.; Bruyant, Philippe; Johnson, Karen; Feng, Bing; Lehovich, Andre; Gu, Songxiang; Gennert, Michael A.; King, Michael A.

    2008-01-01

    Patient motion is inevitable in SPECT and PET because of the lengthy periods over which patients are imaged, and such motion can degrade diagnostic accuracy. The goal of our studies is to perfect a methodology for tracking and correcting patient motion when it occurs. In this paper we report on enhancements to the calibration, camera stability, accuracy of motion tracking, and temporal synchronization of a low-cost visual tracking system (VTS) we are developing. The purpose of the VTS is to track the motion of retro-reflective markers on stretchy bands wrapped about the chest and abdomen of patients. We have improved the accuracy of 3D spatial calibration by using a MATLAB optical camera calibration package with a planar calibration pattern. This allowed us to determine the intrinsic and extrinsic parameters for stereo-imaging with our CCD cameras. Locations in the VTS coordinate system are transformed to the SPECT coordinate system by a VTS/SPECT mapping using a phantom of 7 retro-reflective spheres, each filled with a drop of Tc-99m. We switched from pan, tilt and zoom (PTZ) network cameras to fixed network cameras to reduce the amount of camera drift. The improved stability was verified by tracking the positions of fixed retro-reflective markers on a wall. The ability of our VTS to track movement, on average, with sub-millimeter and sub-degree accuracy was established with the 7-sphere phantom for 1 cm vertical and axial steps as well as for an arbitrary rotation and translation. The difference in the time of optical image acquisition as decoded from the image headers relative to synchronization signals sent to the SPECT system was used to establish temporal synchrony between optical and list-mode SPECT acquisition. Two experiments showed better than 100 ms agreement between VTS and SPECT observed motion for three axial translations.
We were able to track 3 reflective markers on an anthropomorphic phantom with a precision that allowed us to correct motion such that no
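The VTS/SPECT mapping described above amounts to estimating a rigid (rotation plus translation) transform from paired marker positions. A minimal least-squares sketch using the Kabsch/SVD method, with synthetic points standing in for the 7-sphere phantom data; the actual mapping procedure used in the study may differ:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst_i ~= R @ src_i + t (Kabsch)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic "phantom": 7 points, mapped by a known 30-degree rotation about z plus a shift.
rng = np.random.default_rng(0)
pts = rng.uniform(-50, 50, size=(7, 3))
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([10.0, -5.0, 2.0])
mapped = pts @ R_true.T + t_true

R, t = fit_rigid_transform(pts, mapped)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

With noiseless data the transform is recovered exactly; with real marker measurements the same fit gives the least-squares rigid transform, and the residuals indicate calibration quality.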

  4. Perfusion patterns in postictal 99mTc-HMPAO SPECT after coregistration with MRI in patients with mesial temporal lobe epilepsy

    PubMed Central

    Hogan, R; Cook, M.; Binns, D.; Desmond, P.; Kilpatrick, C.; Murrie, V.; Morris, K.

    1997-01-01

    OBJECTIVES—To assess patterns of postictal cerebral blood flow in the mesial temporal lobe by coregistration of postictal 99mTc-HMPAO SPECT with MRI in patients with confirmed mesial temporal lobe epilepsy.
METHODS—Ten postictal and interictal 99mTc-HMPAO SPECT scans were coregistered with MRI in 10 patients with confirmed mesial temporal lobe epilepsy. Volumetric tracings of the hippocampus and amygdala from the MRI were superimposed on the postictal and interictal SPECT. Asymmetries in hippocampal and amygdala SPECT signal were then calculated using the equation:
 % Asymmetry = 100 × (right − left) / ((right + left) / 2).
RESULTS—In the postictal studies, quantitative measurements of amygdala SPECT intensities were greatest on the side of seizure onset in all cases, with an average % asymmetry of 11.1, range 5.2-21.9. Hippocampal intensities were greatest on the side of seizure onset in six studies, with an average % asymmetry of 9.6, range 4.7-12.0. In four scans the hippocampal intensities were less on the side of seizure onset, with an average % asymmetry of 10.2, range 5.7-15.5. There was no localising quantitative pattern in interictal studies.
CONCLUSIONS—Postictal SPECT shows distinctive perfusion patterns when coregistered with MRI, which assist in lateralisation of temporal lobe seizures. Hyperperfusion in the region of the amygdala is more consistently lateralising than hyperperfusion in the region of the hippocampus in postictal studies.

 PMID:9285464
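The asymmetry index used in the study above can be written down directly; a minimal sketch with illustrative intensity values, not data from the study:

```python
def percent_asymmetry(right: float, left: float) -> float:
    """% asymmetry = 100 * (right - left) / ((right + left) / 2)."""
    return 100.0 * (right - left) / ((right + left) / 2.0)

# Illustrative amygdala counts: right 111, left 100 -> ~10.4% asymmetry
print(round(percent_asymmetry(111.0, 100.0), 1))
```

Dividing by the mean of the two sides keeps the index symmetric: swapping right and left only flips the sign.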

  5. Monte Carlo simulations to assess differentiation between defects in cardiac SPECT

    NASA Astrophysics Data System (ADS)

    Chrysanthou-Baustert, I.; Parpottas, Y.; Demetriadou, O.; Christofides, S.; Yiannakkaras, Ch; Kaolis, D.; Wasilewska-Radwanska, M.

    2011-09-01

    Differentiating between various types of lesions in nuclear cardiology is a challenge. This work assesses the level of differentiation achievable between various low-contrast lesions, as encountered in nuclear cardiology. The parameters investigated are defect extent, defect thickness and perfusion reduction of the defect. The images were obtained through Monte Carlo simulations with the program SIMIND. Results show that acceptable size resolution is obtained for defects with an extent over 25 × 25 mm. When thickness and perfusion reduction are both unknown, the imaging results are confounding. In this work, thickness and perfusion reduction cannot be differentiated. If one of the variables is known (thickness or perfusion reduction), imaging results can resolve the other, unknown variable.

  6. SPECT in the diagnosis of hepatic hemangioma

    SciTech Connect

    Brunetti, J.C.; Van Heertum, R.L.; Yudd, A.P.

    1985-05-01

    Tc-99m labeled red blood cell blood flow and delayed static blood pool imaging is widely accepted as a reliable, accurate method for the diagnosis of hepatic hemangiomata. The purpose of this study is to assess the relative value of SPECT blood pool imaging in the evaluation of hepatic hemangiomata. A total of 68 patients, including 21 patients with proven hepatic cavernous hemangiomas, were studied using both planar and SPECT imaging techniques. All patients underwent multi-phase evaluation which included a hepatic flow study and immediate planar images of the liver, followed by a 360° tomographic (SPECT) study and subsequent 60 minute delayed static planar hepatic blood pool images. All 21 patients with proven hepatic hemangiomas had a positive SPECT exam and 17 of the 21 (81%) patients had a positive planar exam. In the 21 patients, there were a total of 36 hemangiomas ranging in size from 0.7 cm to 13 cm. The SPECT imaging technique correctly identified all 36 lesions (100%), whereas planar imaging detected 25 of the 36 lesions (69.4%). In all the remaining patients (10 normal, 17 metastatic disease, 12 hepatocellular disease, 6 hepatoma, 2 liver cysts), both the planar and SPECT imaging techniques were interpreted as showing no evidence of focal sequestration of red blood cells. SPECT hepatic blood pool imaging represents an improvement in the evaluation of hepatic hemangioma as a result of a reduction in imaging time (less than thirty minutes), improved spatial resolution and greater overall accuracy.

  7. Comparison of rubidium-82 positron emission tomography and thallium-201 SPECT imaging for detection of coronary artery disease

    SciTech Connect

    Stewart, R.E.; Schwaiger, M.; Molina, E.; Popma, J.; Gacioch, G.M.; Kalus, M.; Squicciarini, S.; al-Aouar, Z.R.; Schork, A.; Kuhl, D.E. )

    1991-06-15

    The diagnostic performance of rubidium-82 (Rb-82) positron emission tomography (PET) and thallium-201 (Tl-201) single-photon emission-computed tomography (SPECT) for detecting coronary artery disease was investigated in 81 patients (52 men, 29 women). PET studies using 60 mCi of Rb-82 were performed at baseline and after intravenous infusion of 0.56 mg/kg dipyridamole in conjunction with handgrip stress. Tl-201 SPECT was performed after dipyridamole-handgrip stress and, in a subset of patients, after treadmill exercise. Sensitivity, specificity and overall diagnostic accuracy were assessed using both visually and quantitatively interpreted coronary angiograms. The overall sensitivity, specificity and accuracy of PET for detection of coronary artery disease (greater than 50% diameter stenosis) were 84, 88 and 85%, respectively. In comparison, the performance of SPECT revealed a sensitivity of 84%, specificity of 53% (p less than 0.05 vs PET) and accuracy of 79%. Similar results were obtained using either visual or quantitative angiographic criteria for severity of coronary artery disease. In 43 patients without prior myocardial infarction, the sensitivity for detection of disease was 71 and 73%, respectively, similar for both PET and SPECT. There was no significant difference in diagnostic performance between imaging modalities when 2 different modes of stress (exercise treadmill vs intravenous dipyridamole plus handgrip) were used with SPECT imaging. Thus, Rb-82 PET provides improved specificity compared with Tl-201 SPECT for identifying coronary artery disease, most likely due to the higher photon energy of Rb-82 and attenuation correction provided by PET. However, post-test referral cannot be entirely excluded as a potential explanation for the lower specificity of Tl-201 SPECT.
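The sensitivity, specificity, and accuracy figures reported above are simple functions of the confusion-matrix counts. A minimal sketch; the counts below are illustrative, not the study's actual data:

```python
def diagnostic_performance(tp: int, fn: int, tn: int, fp: int):
    """Return (sensitivity, specificity, accuracy) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)                  # true positives among diseased
    specificity = tn / (tn + fp)                  # true negatives among disease-free
    accuracy = (tp + tn) / (tp + fn + tn + fp)    # correct calls overall
    return sensitivity, specificity, accuracy

# Illustrative counts: 54 TP, 10 FN, 15 TN, 2 FP
sens, spec, acc = diagnostic_performance(54, 10, 15, 2)
print(round(sens, 2), round(spec, 2), round(acc, 2))  # 0.84 0.88 0.85
```

Note that accuracy blends the two error types according to disease prevalence in the study group, which is why sensitivity and specificity are usually reported separately.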

  8. CUMULATIVE RISK ASSESSMENT FOR QUANTITATIVE RESPONSE DATA

    EPA Science Inventory

    The Relative Potency Factor approach (RPF) is used to normalize and combine different toxic potencies among a group of chemicals selected for cumulative risk assessment. The RPF method assumes that the slopes of the dose-response functions are all equal; but this method depends o...

  9. QUANTITATIVE RISK ASSESSMENT FOR MICROBIAL AGENTS

    EPA Science Inventory

    Compared to chemical risk assessment, the process for microbial agents and infectious disease is more complex because of host factors and the variety of settings in which disease transmission can occur. While the National Academy of Science has established a paradigm for performi...

  10. Asbestos exposure--quantitative assessment of risk

    SciTech Connect

    Hughes, J.M.; Weill, H.

    1986-01-01

    Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past, relatively high asbestos concentration levels down to the usually much lower concentration levels of interest today, in some cases orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber types (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison; e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.
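In its simplest form, risk extrapolation of the kind reviewed above is linear in cumulative exposure (concentration × duration). A deliberately simplified sketch; the unit-risk value is hypothetical, and real asbestos risk models also account for age at exposure, latency, disease type, and fiber type, so this is illustrative only:

```python
def excess_cancer_risk(concentration_f_ml: float, years: float,
                       unit_risk_per_fiber_yr: float) -> float:
    """Lifetime excess cancer risk under a simple linear, no-threshold model:
    risk = unit risk * cumulative exposure (fiber-years per ml)."""
    return unit_risk_per_fiber_yr * concentration_f_ml * years

# Hypothetical unit risk of 4e-4 per (f/ml)-year, chosen only for illustration.
worker_risk = excess_cancer_risk(0.5, 40, 4e-4)
print(worker_risk)  # 0.008, i.e. 80 excess cancers per 10,000 exposed
```

Because the published models are not purely linear in duration, the worker and student estimates quoted in the abstract do not scale in simple proportion to cumulative exposure.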

  11. Quantitative performance assessments for neuromagnetic imaging systems.

    PubMed

    Koga, Ryo; Hiyama, Ei; Matsumoto, Takuya; Sekihara, Kensuke

    2013-01-01

    We have developed a Monte-Carlo simulation method to assess the performance of neuromagnetic imaging systems using two kinds of performance metrics: A-prime metric and spatial resolution. We compute these performance metrics for virtual sensor systems having 80, 160, 320, and 640 sensors, and discuss how the system performance is improved, depending on the number of sensors. We also compute these metrics for existing whole-head MEG systems, MEGvision™ (Yokogawa Electric Corporation, Tokyo, Japan) that uses axial-gradiometer sensors, and TRIUX™ (Elekta Corporate, Stockholm, Sweden) that uses planar-gradiometer and magnetometer sensors. We discuss performance comparisons between these significantly different systems. PMID:24110711

  13. Quantitative assessment of protein function prediction programs.

    PubMed

    Rodrigues, B N; Steffens, M B R; Raittz, R T; Santos-Weiss, I C R; Marchaukoski, J N

    2015-12-21

    Fast prediction of protein function is essential for high-throughput sequencing analysis. Bioinformatic resources provide cheaper and faster techniques for function prediction and have helped to accelerate the process of protein sequence characterization. In this study, we assessed protein function prediction programs that accept amino acid sequences as input. We analyzed the classification, equality, and similarity between programs, and, additionally, compared program performance. The following programs were selected for our assessment: Blast2GO, InterProScan, PANTHER, Pfam, and ScanProsite. This selection was based on the high number of citations (over 500), fully automatic analysis, and the possibility of returning a single best classification per sequence. We tested these programs using 12 gold standard datasets from four different sources. The gold standard classification of the databases was based on expert analysis, the Protein Data Bank, or the Structure-Function Linkage Database. We found that the miss rate among the programs is globally over 50%. Furthermore, we observed little overlap in the correct predictions from each program. Therefore, a combination of multiple types of sources and methods, including experimental data, protein-protein interaction, and data mining, may be the best way to generate more reliable predictions and decrease the miss rate. PMID:26782400
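The per-program miss rate and the overlap between programs' correct predictions reduce to set operations against the gold standard. An illustrative sketch with made-up protein identifiers and prediction sets, not the study's data:

```python
# Hypothetical gold standard and per-program sets of correctly classified proteins.
gold = {"p1", "p2", "p3", "p4", "p5", "p6"}
correct = {
    "ProgA": {"p1", "p2"},
    "ProgB": {"p2", "p3"},
}

def miss_rate(program: str) -> float:
    """Fraction of gold-standard proteins the program failed to classify correctly."""
    return 1.0 - len(correct[program] & gold) / len(gold)

def overlap(a: str, b: str) -> float:
    """Jaccard overlap of two programs' correct predictions."""
    union = correct[a] | correct[b]
    return len(correct[a] & correct[b]) / len(union) if union else 0.0

print(miss_rate("ProgA"))          # ~0.67: over half missed
print(overlap("ProgA", "ProgB"))   # ~0.33: little overlap
```

Low pairwise overlap is what motivates the paper's suggestion to combine predictors: programs that miss different proteins are complementary.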

  15. Quantitative estimation in Health Impact Assessment: Opportunities and challenges

    SciTech Connect

    Bhatia, Rajiv; Seto, Edmund

    2011-04-15

    Health Impact Assessment (HIA) considers multiple effects on health of policies, programs, plans and projects, and thus requires the use of diverse analytic tools and sources of evidence. Quantitative estimation has desirable properties for the purpose of HIA, but adequate tools for quantification currently exist for a limited number of health impacts and decision settings; furthermore, quantitative estimation generates thorny questions about the precision of estimates and the validity of methodological assumptions. In the United States, HIA has only recently emerged as an independent practice apart from integrated EIA, and this article aims to synthesize the experience with quantitative health effects estimation within that practice. We use examples identified through a scan of instances of quantitative estimation in the U.S. practice experience to illustrate methods applied in different policy settings, along with their strengths and limitations. We then discuss opportunity areas and practical considerations for the use of quantitative estimation in HIA.

  16. Sensitive Quantitative Assessment of Balance Disorders

    NASA Technical Reports Server (NTRS)

    Paloski, William H.

    2007-01-01

    Computerized dynamic posturography (CDP) has become a standard technique for objectively quantifying balance control performance, diagnosing the nature of functional impairments underlying balance disorders, and monitoring clinical treatment outcomes. We have long used CDP protocols to assess recovery of sensory-motor function in astronauts following space flight. The most reliable indicators of post-flight crew performance are the sensory organization tests (SOTs), particularly SOTs 5 and 6, which are sensitive to changes in availability and/or utilization of vestibular cues. We have noted, however, that some astronauts exhibiting obvious signs of balance impairment after flight are able to score within clinical norms on these tests, perhaps as a result of adopting competitive strategies or by their natural skills at substituting alternate sensory information sources. This insensitivity of the CDP protocol could lead to underestimation of the degree of impairment and, perhaps, to premature release of those crewmembers to normal duties. To improve the sensitivity of the CDP protocol we have introduced static and dynamic head-tilt SOT trials into our protocol. The pattern of post-flight recovery quantified by the enhanced CDP protocol appears to more aptly track the re-integration of sensory-motor function, with recovery time increasing as the complexity of the sensory-motor/biomechanical task increases. The new CDP protocol therefore seems more suitable for monitoring post-flight sensory-motor recovery and for indicating to crewmembers and flight surgeons fitness for return to duty and/or activities of daily living. There may be classes of patients (e.g., athletes, pilots) having motivation and/or performance characteristics similar to astronauts whose sensory-motor treatment outcomes would also be more accurately monitored using the enhanced CDP protocol. Furthermore, the enhanced protocol may be useful in early detection of age-related balance disorders.

  17. A comparison of risk assessment techniques from qualitative to quantitative

    SciTech Connect

    Altenbach, T.J.

    1995-02-13

    Risk assessment techniques vary from purely qualitative approaches, through a regime of semi-qualitative approaches, to the more traditional quantitative ones. Constraints such as time, money, manpower, skills, management perceptions, communication of risk results to the public, and political pressures all affect the manner in which risk assessments are carried out. This paper surveys some risk matrix techniques, examining the uses and applicability of each. Limitations and problems for each technique are presented and compared to the others. Risk matrix approaches vary from purely qualitative axis descriptions of accident frequency vs consequences, to fully quantitative axis definitions using multi-attribute utility theory to equate different types of risk from the same operation.
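A qualitative risk matrix of the kind surveyed above is essentially a lookup table from frequency and consequence categories to a risk ranking. A minimal sketch; the categories and rankings here are hypothetical, not taken from the paper:

```python
# Hypothetical 3x3 risk matrix: rows are frequency, columns are consequence.
FREQUENCY = ["unlikely", "possible", "frequent"]
CONSEQUENCE = ["minor", "serious", "severe"]
MATRIX = [
    ["low",    "low",    "medium"],   # unlikely
    ["low",    "medium", "high"],     # possible
    ["medium", "high",   "high"],     # frequent
]

def risk_rank(frequency: str, consequence: str) -> str:
    """Look up the qualitative risk ranking for a frequency/consequence pair."""
    return MATRIX[FREQUENCY.index(frequency)][CONSEQUENCE.index(consequence)]

print(risk_rank("possible", "severe"))   # high
print(risk_rank("unlikely", "minor"))    # low
```

A fully quantitative variant would replace the category labels with numeric frequency and consequence scales and rank cells by, for example, expected loss.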

  18. Organ volume estimation using SPECT

    SciTech Connect

    Zaidi, H.

    1996-06-01

    Knowledge of in vivo thyroid volume has both diagnostic and therapeutic importance and could lead to a more precise quantification of absolute activity contained in the thyroid gland. In order to improve single-photon emission computed tomography (SPECT) quantitation, attenuation correction was performed according to Chang's algorithm. The dual window method was used for scatter subtraction. The author used a Monte Carlo simulation of the SPECT system to accurately determine the scatter multiplier factor k. Volume estimation using SPECT was performed by summing up the volume elements (voxels) lying within the contour of the object, determined by a fixed threshold and the gray level histogram (GLH) method. Thyroid phantom and patient studies were performed and the influence of (1) fixed thresholding, (2) automatic thresholding, (3) attenuation, (4) scatter, and (5) reconstruction filter were investigated. This study shows that accurate volume estimation of the thyroid gland is feasible when accurate corrections are performed. The relative error is within 7% for the GLH method combined with attenuation and scatter corrections.
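The voxel-summation step described above reduces to counting voxels above a threshold and multiplying by the voxel volume. A minimal sketch of the fixed-threshold variant on a synthetic image (the GLH method would instead derive the threshold from the image histogram); the array size, threshold fraction, and voxel volume are illustrative:

```python
import numpy as np

def volume_from_spect(image, threshold_fraction: float, voxel_volume_ml: float) -> float:
    """Sum voxels above a fixed fraction of the image maximum, times the voxel volume."""
    threshold = threshold_fraction * image.max()
    return int((image > threshold).sum()) * voxel_volume_ml

# Synthetic "thyroid": a bright 10x10x10 block inside a zero background.
img = np.zeros((32, 32, 32))
img[10:20, 10:20, 10:20] = 100.0
vol = volume_from_spect(img, threshold_fraction=0.4, voxel_volume_ml=0.02)
print(vol)  # 1000 voxels * 0.02 ml = 20.0 ml
```

On real reconstructions the result is sensitive to the threshold choice, which is why the abstract compares fixed thresholding against the histogram-based method and applies attenuation and scatter corrections first.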

  19. Reproducibility of area at risk assessment in acute myocardial infarction by T1- and T2-mapping sequences in cardiac magnetic resonance imaging in comparison to Tc99m-sestamibi SPECT.

    PubMed

    Langhans, Birgit; Nadjiri, Jonathan; Jähnichen, Christin; Kastrati, Adnan; Martinoff, Stefan; Hadamitzky, Martin

    2014-10-01

    Area at risk (AAR) is an important parameter for the assessment of the salvage area after revascularization in acute myocardial infarction (AMI). By combining AAR assessment by T2-weighted imaging with scar quantification by late gadolinium enhancement imaging, cardiovascular magnetic resonance (CMR) offers a promising alternative to the "classical" modality of Tc99m-sestamibi single-photon emission computed tomography (SPECT). Current T2-weighted sequences for edema imaging in CMR are limited by low contrast-to-noise ratios and motion artifacts. During the last years, novel CMR imaging techniques for quantification of acute myocardial injury, particularly T1-mapping and T2-mapping, have attracted increasing attention, but no direct comparison between the different sequences in the setting of AMI, or a validation against SPECT, has been reported so far. We analyzed 14 patients undergoing primary coronary revascularization in AMI in whom both a pre-intervention Tc99m-sestamibi-SPECT and CMR imaging at a median of 3.4 (interquartile range 3.3-3.6) days after the acute event were performed. Size of AAR was measured by three different non-contrast CMR techniques on corresponding short-axis slices: T2-weighted, fat-suppressed turbo spin echo sequence (TSE), T2-mapping from T2-prepared balanced steady-state free precession sequences (T2-MAP), and T1-mapping from modified Look-Locker inversion recovery (MOLLI) sequences. For each CMR sequence, the AAR was quantified by appropriate methods (absolute values for mapping sequences, comparison with remote myocardium for other sequences) and correlated with Tc99m-sestamibi-SPECT. All measurements were performed on a 1.5 Tesla scanner. The size of the AAR assessed by CMR was 28.7 ± 20.9 % of left ventricular myocardial volume (%LV) for TSE, 45.8 ± 16.6 %LV for T2-MAP, and 40.1 ± 14.4 %LV for MOLLI. AAR assessed by SPECT measured 41.6 ± 20.7 %LV. Correlation analysis revealed best correlation with SPECT for T2-MAP at a T2-threshold of 60 ms

  20. Quantitative wearable sensors for objective assessment of Parkinson's disease.

    PubMed

    Maetzler, Walter; Domingos, Josefa; Srulijes, Karin; Ferreira, Joaquim J; Bloem, Bastiaan R

    2013-10-01

    There is a rapidly growing interest in the quantitative assessment of Parkinson's disease (PD)-associated signs and disability using wearable technology. Both persons with PD and their clinicians see advantages in such developments. Specifically, quantitative assessments using wearable technology may allow for continuous, unobtrusive, objective, and ecologically valid data collection. Also, this approach may improve patient-doctor interaction, influence therapeutic decisions, and ultimately ameliorate patients' global health status. In addition, such measures have the potential to be used as outcome parameters in clinical trials, allowing for frequent assessments; e.g., in the home setting. This review discusses promising wearable technology, addresses which parameters should be prioritized in such assessment strategies, and reports on studies that have already investigated daily life issues in PD using this new technology.

  1. Assessing digital and quantitative EEG in clinical settings.

    PubMed

    Nuwer, M R

    1998-11-01

    Assessment of clinical utility involves a series of steps based primarily on published peer-reviewed medical literature. Relevant publications usually use the scientific method, appropriate control groups, blinded reading, prospective design, and other study elements. Assessments are more credible when conducted by those who do not have a conflict of interest in the technique. A detailed assessment of digital and quantitative EEG was conducted recently by the American Academy of Neurology, with the American Clinical Neurophysiology Society as a joint sponsor. This assessment concluded that digital EEG is an excellent substitute for paper EEG. It also found quantitative techniques helpful in epilepsy monitoring, seizure detection, and operating room/intensive care unit trend monitoring. Several other applications were considered promising, whereas some applications were considered not ready for clinical use. Substantial problems still plague the field, predisposing it to false-positive results.

  2. Effects of CT-based attenuation correction of rat microSPECT images on relative myocardial perfusion and quantitative tracer uptake

    SciTech Connect

    Strydhorst, Jared H. Ruddy, Terrence D.; Wells, R. Glenn

    2015-04-15

    Purpose: Our goal in this work was to investigate the impact of CT-based attenuation correction on measurements of rat myocardial perfusion with {sup 99m}Tc and {sup 201}Tl single photon emission computed tomography (SPECT). Methods: Eight male Sprague-Dawley rats were injected with {sup 99m}Tc-tetrofosmin and scanned in a small animal pinhole SPECT/CT scanner. Scans were repeated weekly over a period of 5 weeks. Eight additional rats were injected with {sup 201}Tl and also scanned following a similar protocol. The images were reconstructed with and without attenuation correction, and the relative perfusion was analyzed with the commercial cardiac analysis software. The absolute uptake of {sup 99m}Tc in the heart was also quantified with and without attenuation correction. Results: For {sup 99m}Tc imaging, relative segmental perfusion changed by up to +2.1%/−1.8% as a result of attenuation correction. Relative changes of +3.6%/−1.0% were observed for the {sup 201}Tl images. Interscan and inter-rat reproducibilities of relative segmental perfusion were 2.7% and 3.9%, respectively, for the uncorrected {sup 99m}Tc scans, and 3.6% and 4.3%, respectively, for the {sup 201}Tl scans, and were not significantly affected by attenuation correction for either tracer. Attenuation correction also significantly increased the measured absolute uptake of tetrofosmin and significantly altered the relationship between the rat weight and tracer uptake. Conclusions: Our results show that attenuation correction has a small but statistically significant impact on the relative perfusion measurements in some segments of the heart and does not adversely affect reproducibility. Attenuation correction had a small but statistically significant impact on measured absolute tracer uptake.

  3. Quantitative Assessment of a Senge Learning Organization Intervention

    ERIC Educational Resources Information Center

    Kiedrowski, P. Jay

    2006-01-01

    Purpose: To quantitatively assess a Senge learning organization (LO) intervention to determine if it would result in improved employee satisfaction. Design/methodology/approach: A Senge LO intervention in Division 123 of Company ABC was undertaken in 2000. Three employee surveys using Likert-scale questions over five years and correlation analysis…

  4. [The method of quantitative assessment of dentition aesthetic parameters].

    PubMed

    Ryakhovsky, A N; Kalacheva, Ya A

    2016-01-01

    This article describes a formula for calculating an aesthetic index of treatment outcome. The formula was derived from regression equations relating visual assessment scores to the magnitude of aesthetic impairment. It can be used for objective quantitative evaluation of dental aesthetics when smiling, before and after dental treatment.

  6. Some suggested future directions of quantitative resource assessments

    USGS Publications Warehouse

    Singer, D.A.

    2001-01-01

    Future quantitative assessments will be expected to estimate quantities, values, and locations of undiscovered mineral resources in a form that conveys both economic viability and the uncertainty associated with the resources. Historically, declining metal prices point to the need for larger deposits over time. Sensitivity analysis demonstrates that the greatest opportunity for reducing uncertainty in assessments lies in lowering the uncertainty associated with tonnage estimates. Of all errors possible in assessments, those affecting tonnage estimates are by far the most important. Selecting the correct deposit model is the most important way of controlling these errors, because deposit models are the best known predictors of tonnage. In many large regions, much of the surface is covered with apparently barren rocks and sediments. Because many exposed mineral deposits are believed to have been found, a prime concern is the presence of possible mineralized rock under cover. Assessments of areas with resources under cover must rely on extrapolation from surrounding areas, new geologic maps of rocks under cover, or analogy with other well-explored areas that can serve as training tracts. Cover has a profound effect on uncertainty and on assessment methods and procedures, because the geology is seldom known and geophysical methods typically have attenuated responses. Many earlier assessment methods were based on relationships of geochemical and geophysical variables to deposits exposed at the surface; these relationships will need to be relearned for covered deposits. Mineral-deposit models are important in quantitative resource assessments for two reasons: (1) grades and tonnages of most deposit types are significantly different, and (2) deposit types occur in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral

  7. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    To address the lack of an effective quantitative system for stress vulnerability assessment within groundwater pollution risk assessment, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived from an analysis of groundwater pollution sources. A quantitative method for the representative contaminants within this system was established by analyzing three properties of the representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazard rankings of the representative contaminants depended strongly on the chosen research emphasis, and that the ranking of the three representative contaminants' hazards differed from the rankings of their individual properties. This suggests that the subjective choice of research emphasis has a decisive impact on the calculation results. In addition, rank-based normalization of the three properties, used to unify the quantified results, can amplify or attenuate the relative property characteristics of different representative contaminants.
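    The analytic hierarchy process step mentioned above can be sketched as follows: priority weights are taken from the principal eigenvector of a reciprocal pairwise-comparison matrix, with Saaty's consistency ratio as a sanity check. The example matrix and the three property names are hypothetical illustrations, not taken from the study:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from the principal eigenvector of a reciprocal
    pairwise-comparison matrix, plus Saaty's consistency ratio (CR)."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    # Saaty's random-index values for n = 1..5
    RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}
    ci = (eigvals[k].real - n) / (n - 1)          # consistency index
    cr = ci / RI[n] if RI[n] else 0.0             # consistency ratio
    return w, cr

# Hypothetical 3x3 comparison of contaminant properties
# (e.g., toxicity vs. mobility vs. persistence)
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)
print("weights:", np.round(w, 3), " CR:", round(cr, 4))
```

    A CR below 0.1 is conventionally taken to mean the pairwise judgments are acceptably consistent.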

  8. Quantitative risk assessment methods for cancer and noncancer effects.

    PubMed

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from more qualitative approaches to more quantitative approaches over the past decade. This has been facilitated by improvements in computer hardware and software and by novel computational approaches that are slowly gaining recognition by regulatory agencies. These developments have helped reduce reliance on experimental animals and have allowed better use of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches found in the guidance documents of several regulatory agencies as they pertain to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments.

  9. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during the design, development, production, and operational phases are assessed, to help determine what useful quantitative and qualitative structural data may be provided from raw materials through vehicle refurbishment. This assessment considers metal alloy systems and bonded composites presently applied in active NASA programs or strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information, and are presented along with a description of the structures or standards from which the information was obtained. Examples of NDT technique capabilities and limitations are provided in tabular form. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic methods. Quantitative data are sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end-item structure, and refurbishment operations.

  10. Quantitative computed tomography for spinal mineral assessment: current status

    NASA Technical Reports Server (NTRS)

    Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U.; Arnaud, C. D.

    1985-01-01

    Quantitative CT (QCT) is an established method for the noninvasive assessment of bone mineral content in the vertebral spongiosum and other anatomic locations. The potential strengths of QCT relative to dual photon absorptiometry (DPA) are its capability for precise three-dimensional anatomic localization providing a direct density measurement and its capability for spatial separation of highly responsive cancellous bone from less responsive cortical bone. The extraction of this quantitative information from the CT image, however, requires sophisticated calibration and positioning techniques and careful technical monitoring.

  11. Status and future of Quantitative Microbiological Risk Assessment in China

    PubMed Central

    Dong, Q.L.; Barker, G.C.; Gorris, L.G.M.; Tian, M.S.; Song, X.Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives. PMID:26089594

  12. SPECT bone scintigraphy in the diagnosis and management of mandibular condylar hyperplasia.

    PubMed

    Hodder, S C; Rees, J I; Oliver, T B; Facey, P E; Sugar, A W

    2000-04-01

    Isotope bone scans have been used for a number of years to assess growth activity in the mandibular condyle in patients who present with facial asymmetry. The aim is to distinguish normal bone growth within the condyle from increased activity that may be the cause of the asymmetry. Previous studies have, however, relied only on planar images. SPECT (single photon emission computed tomography) has been used with quantitative assessments of one mandibular condyle to clivus or lumbar spine, but we have compared one condyle with the other, which is more sensitive and accurate in detecting abnormal activity. A relative percentage uptake of 55% or more in the affected mandibular condyle is considered to be abnormal, and this has been validated by comparison with an age-matched control group. We have used SPECT as an aid to diagnosis and treatment in 18 patients with asymmetrical growth and have constructed a therapeutic algorithm to aid the treatment of these patients. PMID:10864700
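    The condyle-to-condyle comparison above reduces to a simple percentage of the paired counts. A minimal sketch; the 55% cut-off is the study's criterion, but the count values below are invented for illustration:

```python
def relative_condylar_uptake(counts_affected, counts_contralateral):
    """Uptake of one mandibular condyle as a percentage of the pair;
    a value of 55% or more in the affected condyle is taken as
    abnormal growth activity here."""
    total = counts_affected + counts_contralateral
    return 100.0 * counts_affected / total

# Hypothetical SPECT region-of-interest counts for the two condyles
pct = relative_condylar_uptake(620, 430)
abnormal = pct >= 55.0
print(f"{pct:.1f}% -> {'abnormal' if abnormal else 'within normal range'}")
```

    A perfectly symmetric pair gives 50%, so the 55% threshold allows for normal side-to-side variation before flagging excess condylar activity.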

  13. The quantitative assessment of normal canine small intestinal mucosa.

    PubMed

    Hart, I R; Kidder, D E

    1978-09-01

    Quantitative methods of assessing the architecture of small intestinal mucosa have been applied to biopsy material from normal dogs. Mucosal samples taken from four predetermined sites show that there are significant quantitative differences between the various levels of the small bowel. Animals of one year of age and older show no correlation between age or weight and mucosal dimensions. The significance of these findings, in relation to the examination of biopsy material from cases of clinical small intestinal disease, is discussed. PMID:364574

  14. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve as an initial assessment of dermal exposure, resulting, among other things, in a ranking of tasks and subsequently of jobs. DREAM consists of an inventory part and an evaluation part. Two examples of dermal exposure of workers at a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outer clothing layer as well as on the skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine whom, where, and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and determine the most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative dermal exposure assessment. PMID:12505908

  15. Quantitative study designs used in quality improvement and assessment.

    PubMed

    Ormes, W S; Brim, M B; Coggan, P

    2001-01-01

    This article describes common quantitative design techniques that can be used to collect and analyze quality data. An understanding of the differences between these design techniques can help healthcare quality professionals make the most efficient use of their time, energies, and resources. To evaluate the advantages and disadvantages of these study designs, it is necessary to assess factors that threaten the degree to which quality professionals may infer a cause-and-effect relationship from the data collected. Processes, the conduits of organizational function, can often be assessed by methods that do not take into account confounding and compromising circumstances that affect the outcomes of their analyses. An assumption that the implementation of process improvements caused real change is incomplete without a consideration of other factors that might also have caused the same result. It is only through the identification, assessment, and exclusion of these alternative factors that administrators and healthcare quality professionals can assess the degree to which true process improvement or compliance has occurred. This article describes the advantages and disadvantages of common quantitative design techniques and reviews the corresponding threats to the interpretability of data obtained from their use. PMID:11378972

  16. Binary Imaging Analysis for Comprehensive Quantitative Assessment of Peripheral Nerve

    PubMed Central

    Hunter, Daniel A.; Moradzadeh, Arash; Whitlock, Elizabeth L.; Brenner, Michael J.; Myckatyn, Terence M.; Wei, Cindy H.; Tung, Thomas H.H.; Mackinnon, Susan E.

    2007-01-01

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques. PMID:17675163
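    The core of any automated morphometry pipeline of this kind is thresholding followed by connected-component labeling, from which counts and densities follow. The sketch below shows only that generic step (plain 4-connected labeling on a toy binary image), not the authors' multi-bitplane method:

```python
import numpy as np

def count_components(binary):
    """Count 4-connected foreground components in a binary image --
    a stand-in for counting myelinated axon profiles after thresholding."""
    visited = np.zeros_like(binary, dtype=bool)
    h, w = binary.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if binary[i, j] and not visited[i, j]:
                count += 1                      # new component found
                stack = [(i, j)]
                visited[i, j] = True
                while stack:                    # flood-fill its pixels
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
    return count

# Toy "nerve cross-section" with three separate axon profiles
img = np.array([[1, 1, 0, 0, 1],
                [1, 1, 0, 0, 1],
                [0, 0, 0, 0, 0],
                [0, 1, 1, 0, 0]], dtype=bool)
print(count_components(img))
```

    Fiber density would then be the component count divided by the imaged area, and per-component pixel counts give the fiber-size distribution.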

  17. Quantitative assessment of regional right ventricular function with color kinesis.

    PubMed

    Vignon, P; Weinert, L; Mor-Avi, V; Spencer, K T; Bednarz, J; Lang, R M

    1999-06-01

    We used color kinesis, a recent echocardiographic technique that provides regional information on the magnitude and timing of endocardial wall motion, to quantitatively assess regional right ventricular (RV) systolic and diastolic properties in 76 subjects who were divided into five groups, as follows: normal (n = 20), heart failure (n = 15), pressure/volume overload (n = 14), pressure overload (n = 12), and RV hypertrophy (n = 15). Quantitative segmental analysis of color kinesis images was used to obtain regional fractional area change (RFAC), which was displayed in the form of stacked histograms to determine patterns of endocardial wall motion. Time curves of integrated RFAC were used to objectively identify asynchrony of diastolic endocardial motion. When compared with normal subjects, patients with pressure overload or heart failure exhibited significantly decreased endocardial motion along the RV free wall. In the presence of mixed pressure/volume overload, the markedly increased ventricular septal motion compensated for decreased RV free wall motion. Diastolic endocardial wall motion was delayed in 17 of 72 segments (24%) in patients with RV pressure overload, and in 31 of 90 segments (34%) in patients with RV hypertrophy. Asynchrony of diastolic endocardial wall motion was greater in the latter group than in normal subjects (16% versus 10%: p < 0.01). Segmental analysis of color kinesis images allows quantitative assessment of regional RV systolic and diastolic properties.

  18. Simulation evaluation of quantitative myocardial perfusion assessment from cardiac CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-03-01

    Contrast enhancement on cardiac CT provides valuable information about myocardial perfusion, and methods have been proposed to assess perfusion with static and dynamic acquisitions. There is a lack of knowledge and consensus on the appropriate approach to ensure 1) sufficient diagnostic accuracy for clinical decisions and 2) low radiation doses for patient safety. This work developed a thorough dynamic CT simulation and several accepted blood flow estimation techniques to evaluate the performance of perfusion assessment across a range of acquisition and estimation scenarios. Cardiac CT acquisitions were simulated for a range of flow states (flow = 0.5, 1, 2, 3 ml/g/min; cardiac output = 3, 5, 8 L/min) with a validated CT simulator incorporating polyenergetic data acquisition and realistic x-ray flux levels, for dynamic acquisitions covering a range of scenarios including 1, 2, and 3 s sampling for 30 s at 25, 70, and 140 mAs. Images were generated using conventional image reconstruction with additional image-based beam hardening correction to account for iodine content. Time attenuation curves were extracted for multiple regions around the myocardium and used to estimate flow. In total, 2,700 independent realizations of dynamic sequences were generated, and multiple MBF estimation methods were applied to each. Evaluation of quantitative kinetic modeling yielded blood flow estimates with a root mean square error (RMSE) of ~0.6 ml/g/min averaged across multiple scenarios. Semi-quantitative modeling and qualitative static imaging resulted in significantly more error (RMSE = ~1.2 ml/g/min for both). For quantitative methods, dose reduction through reduced temporal sampling or reduced tube current had a comparable impact on MBF estimate fidelity. On average, half-dose acquisitions increased the RMSE of estimates by only 18%, suggesting that substantial dose reductions can be employed in the context of quantitative myocardial
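    The RMSE figure of merit used above is straightforward to reproduce. The flow estimates below are invented illustrative values at the simulated flow states, chosen only to mirror the reported ~0.6 versus ~1.2 ml/g/min gap between quantitative and static methods:

```python
import math

def rmse(estimates, truths):
    """Root-mean-square error between MBF estimates and true flows (ml/g/min)."""
    return math.sqrt(sum((e - t) ** 2 for e, t in zip(estimates, truths))
                     / len(truths))

# True simulated flow states from the study design
truth = [0.5, 1.0, 2.0, 3.0]
# Hypothetical estimates (not study data) for two method classes
quantitative = [0.8, 1.4, 2.5, 3.9]   # kinetic modelling
static       = [1.5, 2.0, 3.2, 4.5]   # qualitative static imaging

print(f"kinetic RMSE: {rmse(quantitative, truth):.2f} ml/g/min")
print(f"static  RMSE: {rmse(static, truth):.2f} ml/g/min")
```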

  19. Performance assessment of the single photon emission microscope: high spatial resolution SPECT imaging of small animal organs.

    PubMed

    Mejia, J; Reis, M A; Miranda, A C C; Batista, I R; Barboza, M R F; Shih, M C; Fu, G; Chen, C T; Meng, L J; Bressan, R A; Amaro, E

    2013-11-01

    The single photon emission microscope (SPEM) is an instrument developed to obtain high spatial resolution single photon emission computed tomography (SPECT) images of small structures inside the mouse brain. SPEM consists of two independent imaging devices, each combining a multipinhole collimator, a high-resolution, thallium-doped cesium iodide [CsI(Tl)] columnar scintillator, a demagnifying/intensifier tube, and an electron-multiplying charge-coupled device (CCD). Collimators have 300- and 450-µm diameter pinholes on tungsten slabs, in hexagonal arrays of 19 and 7 holes. Projection data are acquired in a photon-counting strategy, in which CCD frames are stored at 50 frames per second, with a radius of rotation of 35 mm and a magnification factor of one. The image reconstruction software tool is based on the maximum likelihood algorithm. Our aim was to evaluate the spatial resolution and sensitivity attainable with the seven-pinhole imaging device, together with the linearity for quantification on the tomographic images, and to test the instrument by obtaining tomographic images of different mouse organs. A spatial resolution better than 500 µm and a sensitivity of 21.6 counts·s⁻¹·MBq⁻¹ were reached, as well as a correlation coefficient between activity and intensity better than 0.99, when imaging 99mTc sources. Images of the thyroid, heart, lungs, and bones of mice were registered using 99mTc-labeled radiopharmaceuticals in times appropriate for routine preclinical experimentation (<1 h per projection data set). Detailed experimental protocols and images of the aforementioned organs are shown. We plan to extend the instrument's field of view to fit larger animals and to combine data from both detectors to reduce the acquisition time or applied activity. PMID:24270908

  20. Quantitative measurements of cerebral blood flow using SPECT and [99mTc]-d,l-HM-PAO compared to xenon-133.

    PubMed

    Andersen, A R; Friberg, H H; Schmidt, J F; Hasselbalch, S G

    1988-12-01

    The uptake and retention in a 2 cm thick brain section was recorded serially by SPECT after i.v. injection of [99mTc]-d,l-HM-PAO (HM-PAO). In 16 patients, the fraction of the administered dose retained by the brain was 5.2 +/- 1%, showing a peak after 40-50s, then decreasing by 10% within the first 10 min and then by only 0.4% per hour. The image contrast was measured in each patient as the regional hemispheric asymmetry difference in percent of the highest value of the two regions. It decreased from 31% at 30-40 s to 25% at 10 min. At 24 h, a value of 19% was reached. Using the images obtained at 10 min after injection, a region to region comparison of the original and corrected HM-PAO images to the xenon-133 regional cerebral blood flow (rCBF) images was performed. Forty-four patients with stroke, epilepsy, dementia, basal ganglia disease, and tumors and control subjects were included in this comparison. The algorithm proposed by Lassen et al. was used to correct the original images for back diffusion of tracer (brain to blood); a good correlation very close to the line of identity between the corrected HM-PAO and xenon-133 data was obtained when using a conversion/clearance ratio of 1.5 and when the noninvolved hemisphere was used as a reference region (r = 0.86, p less than 0.0001). Serial arterial and cerebral venous blood sampling was performed over 10 min following i.v. injection of HM-PAO in six patients. An overall brain retention fraction of 0.37 +/- 0.03 (mean +/- SEM) was calculated from the data. An average CBF of 0.62 +/- 0.12 ml/g/min was determined on the basis of the Fick principle; this compared to a value of 0.59 +/- 0.09 ml/g/min (mean +/- SEM) measured by the xenon-133 inhalation method. The two sets of CBF values correlated linearly with a correlation coefficient of 0.97 (p less than 0.01). Inserting the average CBF value for the hemisphere as measured by the Fick principle into the algorithm described by Lassen et al. yields absolute r
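    The Lassen back-diffusion correction referred to above is commonly written as F_c = αR / (1 + α − R), where R is the HM-PAO uptake normalized so the reference region equals 1.0 and α is the conversion/clearance ratio (1.5 in this study). A minimal sketch, assuming that common formulation:

```python
def lassen_correct(ratio, alpha=1.5):
    """Lassen linearization (a common formulation): corrects HM-PAO
    uptake ratios, normalized so the reference region = 1.0, for
    flow-dependent back-diffusion of tracer. alpha is the
    conversion/clearance ratio (1.5 here)."""
    return alpha * ratio / (1.0 + alpha - ratio)

# The correction expands deviations from the reference value of 1.0,
# undoing the flattening of high-flow regions in the raw images.
for r in (0.6, 0.8, 1.0, 1.2):
    print(f"measured {r:.2f} -> corrected {lassen_correct(r):.2f}")
```

    Note that the reference value itself is a fixed point: a measured ratio of 1.0 maps to exactly 1.0, while ratios above and below it are pushed further from unity.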

  1. Quantitative measurements of cerebral blood flow using SPECT and (/sup 99m/Tc)-d,l-HM-PAO compared to xenon-133

    SciTech Connect

    Andersen, A.R.; Friberg, H.H.; Schmidt, J.F.; Hasselbalch, S.G.

    1988-12-01

    The uptake and retention in a 2 cm thick brain section was recorded serially by SPECT after i.v. injection of (99mTc)-d,l-HM-PAO (HM-PAO). In 16 patients, the fraction of the administered dose retained by the brain was 5.2 +/- 1%, showing a peak after 40-50s, then decreasing by 10% within the first 10 min and then by only 0.4% per hour. The image contrast was measured in each patient as the regional hemispheric asymmetry difference in percent of the highest value of the two regions. It decreased from 31% at 30-40 s to 25% at 10 min. At 24 h, a value of 19% was reached. Using the images obtained at 10 min after injection, a region to region comparison of the original and corrected HM-PAO images to the xenon-133 regional cerebral blood flow (rCBF) images was performed. Forty-four patients with stroke, epilepsy, dementia, basal ganglia disease, and tumors and control subjects were included in this comparison. The algorithm proposed by Lassen et al. was used to correct the original images for back diffusion of tracer (brain to blood); a good correlation very close to the line of identity between the corrected HM-PAO and xenon-133 data was obtained when using a conversion/clearance ratio of 1.5 and when the noninvolved hemisphere was used as a reference region (r = 0.86, p less than 0.0001). Serial arterial and cerebral venous blood sampling was performed over 10 min following i.v. injection of HM-PAO in six patients. An overall brain retention fraction of 0.37 +/- 0.03 (mean +/- SEM) was calculated from the data. An average CBF of 0.62 +/- 0.12 ml/g/min was determined on the basis of the Fick principle; this compared to a value of 0.59 +/- 0.09 ml/g/min (mean +/- SEM) measured by the xenon-133 inhalation method. The two sets of CBF values correlated linearly with a correlation coefficient of 0.97 (p less than 0.01).

  2. Quantitative objective assessment of peripheral nociceptive C fibre function.

    PubMed Central

    Parkhouse, N; Le Quesne, P M

    1988-01-01

    A technique is described for the quantitative assessment of peripheral nociceptive C fibre function by measurement of the axon reflex flare. Acetylcholine, introduced by electrophoresis, is used to stimulate a ring of nociceptive C fibre endings, at the centre of which the increase in blood flow is measured with a laser Doppler flowmeter. This flare (neurogenic vasodilatation) has been compared with mechanically or chemically stimulated non-neurogenic cutaneous vasodilatation. The flare is abolished by local anaesthetic and is absent in denervated skin. The flare has been measured on the sole of the foot of 96 healthy subjects; its size decreases with age in males, but not in females. PMID:3351528

  3. Short Course Introduction to Quantitative Mineral Resource Assessments

    USGS Publications Warehouse

    Singer, Donald A.

    2007-01-01

    This is an abbreviated text supplementing the content of three sets of slides used in a short course that has been presented by the author at several workshops. The slides should be viewed in the order of (1) Introduction and models, (2) Delineation and estimation, and (3) Combining estimates and summary. References cited in the slides are listed at the end of this text. The purpose of the three-part form of mineral resource assessments discussed in the accompanying slides is to make unbiased quantitative assessments in a format needed in decision-support systems, so that the consequences of alternative courses of action can be examined. The three-part form of mineral resource assessments was developed to assist policy makers in evaluating the consequences of alternative courses of action with respect to land use and mineral-resource development. The audience for three-part assessments is a governmental or industrial policy maker, a manager of exploration, a planner of regional development, or a similar decision-maker. Some of the tools and models presented here will be useful for the selection of exploration sites, but that is a side benefit, not the goal. To provide unbiased information, we recommend the three-part form of mineral resource assessments, in which general locations of undiscovered deposits are delineated from a deposit type's geologic setting, frequency distributions of tonnages and grades of well-explored deposits serve as models of grades and tonnages of undiscovered deposits, and numbers of undiscovered deposits are estimated probabilistically by type. The internally consistent descriptive, grade and tonnage, deposit density, and economic models used in the design of the three-part form of assessments reduce the chances of biased estimates of the undiscovered resources. What and why quantitative resource assessments: The kind of assessment recommended here is founded in decision analysis in order to provide a framework for making decisions concerning mineral

  4. A new reconstruction strategy for image improvement in pinhole SPECT.

    PubMed

    Zeniya, Tsutomu; Watabe, Hiroshi; Aoi, Toshiyuki; Kim, Kyeong Min; Teramoto, Noboru; Hayashi, Takuya; Sohlberg, Antti; Kudo, Hiroyuki; Iida, Hidehiro

    2004-08-01

    Pinhole single-photon emission computed tomography (SPECT) is able to provide information on the biodistribution of several radioligands in small laboratory animals, but has limitations associated with non-uniform spatial resolution or axial blurring. We have hypothesised that this blurring is due to incompleteness of the projection data acquired by a single circular pinhole orbit, and have evaluated a new strategy for accurate image reconstruction with better spatial resolution uniformity. A pinhole SPECT system using two circular orbits and a dedicated three-dimensional ordered subsets expectation maximisation (3D-OSEM) reconstruction method were developed. In this system, not the camera but the object rotates, and the two orbits are at 90 degrees and 45 degrees relative to the object's axis. This system satisfies Tuy's condition, and is thus able to provide complete data for 3D pinhole SPECT reconstruction within the whole field of view (FOV). To evaluate this system, a series of experiments was carried out using a multiple-disk phantom filled with 99mTc solution. The feasibility of the proposed method for small animal imaging was tested with a mouse bone study using 99mTc-hydroxymethylene diphosphonate. Feldkamp's filtered back-projection (FBP) method and the 3D-OSEM method were applied to these data sets, and the visual and statistical properties were examined. Axial blurring, which was still visible at the edge of the FOV even after applying the conventional 3D-OSEM instead of FBP for single-orbit data, was not visible after application of 3D-OSEM using two-orbit data. 3D-OSEM using two-orbit data dramatically reduced the resolution non-uniformity and statistical noise, and also demonstrated considerably better image quality in the mouse scan. This system may be of use in quantitative assessment of bio-physiological functions in small animals.
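
    The multiplicative EM update at the heart of the 3D-OSEM reconstruction described above can be sketched in a few lines. Everything below is illustrative: the system matrix, subset layout, and problem sizes are invented toy values, not the paper's pinhole geometry.

```python
import numpy as np

# Toy ordered-subsets EM (OSEM) reconstruction on an invented system.
rng = np.random.default_rng(0)
H = rng.random((8, 4))                   # system matrix: 8 projection bins, 4 voxels
x_true = np.array([1.0, 2.0, 0.5, 1.5])  # "true" activity per voxel
y = H @ x_true                            # noiseless projection data

# Two ordered subsets of the projection bins (interleaved here for simplicity)
subsets = [np.arange(0, 8, 2), np.arange(1, 8, 2)]

x = np.ones(4)                            # uniform initial estimate
for _ in range(50):                       # full iterations over all subsets
    for s in subsets:
        Hs, ys = H[s], y[s]
        ratio = ys / (Hs @ x)             # measured / forward-projected
        x *= (Hs.T @ ratio) / Hs.sum(axis=0)  # multiplicative EM update

print(np.round(x, 2))                     # estimate approaches x_true
```

    The same update generalizes to the paper's setting by stacking the projections from both circular orbits into one system matrix, which is what makes the combined data complete in the sense of Tuy's condition.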

  5. [Quantitative assessment of facial palsy by Moiré topography].

    PubMed

    Inokuchi, I

    1992-05-01

    It is essential to establish an objective and quantitative method for evaluating facial palsy and measuring the extent of paralysis in order to evaluate therapeutic efficacy, determine prognosis, select appropriate treatment and observe the process of recovery. This study utilized Moiré topography, which displays three-dimensional facial symmetry with high precision and is based on light interference theory, to determine the extent of facial palsy in 38 patients (20 men and 18 women) aged 5 months to 73 years. A stereoscopic lattice-type Moiré camera (FM3013) was connected to a CCD camera and to a monitoring device for confirming the Moiré stripes. Moiré photographs were taken with a thermal imager (FTI-200). The photos were visually and objectively evaluated on the basis of the Moiré pattern and were then input into a personal computer with a digitizer for data processing and analysis. To view the functions of the facial nerve branches, five Moiré photographs were taken: at rest, wrinkling the forehead, closing the eyes lightly, blowing out the cheeks and grinning. Results indicated that the number of stripes and their polarization adequately reflected the function of individual facial nerve branches. Thus, a well-defined Moiré pattern could clarify the characteristics of the site and degree of facial palsy and of recovery from paralysis. It is an analytical method that can be quickly applied and seems especially useful in infants and young children, in whom point-based assessment is difficult. It is possible to quantitatively evaluate facial palsy in terms of the Asymmetry Index (AI), which is 20-25% for severe paralysis, 12-19% for partial paralysis, and 5-10% for an essentially normal condition. However, the numerical values of the AI overlap in all three paralysis categories, indicating that quantitative assessment of paralysis would be difficult. Moiré topography is an excellent method of determining the extent of facial palsy, compensating for the
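
    The reported Asymmetry Index bands can be turned into a trivial classifier. This is only a sketch of the quoted ranges; how to treat values falling between bands (e.g. 10-12%) is our assumption, not the paper's.

```python
# Band the Asymmetry Index (AI) using the ranges quoted in the abstract.
# Gap handling (values between the quoted bands) is an assumption.
def classify_ai(ai_percent: float) -> str:
    if ai_percent >= 20:
        return "severe paralysis"      # quoted range: 20-25%
    if ai_percent >= 12:
        return "partial paralysis"     # quoted range: 12-19%
    return "essentially normal"        # quoted range: 5-10%

print(classify_ai(22))   # severe paralysis
print(classify_ai(15))   # partial paralysis
print(classify_ai(8))    # essentially normal
```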

  6. SPECT assay of radiolabeled monoclonal antibodies. Final performance report, March 1992--November 1995

    SciTech Connect

    Jaszczak, R.J.

    1995-12-01

    Research is described in the following areas: development and quantitative evaluation of reconstruction algorithms with improved compensation for attenuation, scatter, and geometric collimator response; evaluation of single photon emission computed tomography (SPECT) quantification of iodine 123 and astatine 211; and the development and evaluation of SPECT pinhole imaging for low- and medium-energy photons.

  7. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, the environment, the economy and society. This paper deals with drought risk assessment, which is the first step, designed to find out what the problems are, and comprises three distinct steps, namely risk identification, risk estimation and risk evaluation; risk management is not covered in this paper, and there should be a fourth step to address the need for feedback and to take post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, which is a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are as follows: 1. Risk identification: this step involves drought quantification and monitoring based on remotely sensed RDI and extraction of several features, such as severity, duration, areal extent, and onset and end time. Moreover, it involves a drought early warning system based on the above parameters. 2. Risk estimation: this step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: this step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of these three-step drought assessment processes are considered quite satisfactory in a drought-prone region such as Thessaly in central

  8. Simplified quantification method for in vivo SPECT/CT imaging of asialoglycoprotein receptor with 99mTc-p(VLA-co-VNI) to assess and stage hepatic fibrosis in mice

    PubMed Central

    Zhang, Deliang; Guo, Zhide; Zhang, Pu; Li, Yesen; Su, Xinhui; You, Linyi; Gao, Mengna; Liu, Chang; Wu, Hua; Zhang, Xianzhong

    2016-01-01

    The goal of this study is to develop a noninvasive SPECT imaging method to quantify and stage liver fibrosis with an asialoglycoprotein receptor (ASGP-R)-targeting tracer, 99mTc-p(VLA-co-VNI). ASGP-Rs are well known to be expressed specifically in the mammalian liver. Here, we demonstrated that ASGP-R expression decreased in a carbon tetrachloride (CCl4)-induced mouse model and correlated with liver fibrosis progression, indicating that ASGP-R could be a useful marker for staging liver fibrosis. The liver uptake value (LUV) derived from SPECT imaging was used to assess liver fibrosis in the CCl4-induced mouse model: LUV = [radioactivity (liver uptake)/radioactivity (injected)] × 100/liver volume. The LUV decreased with disease progression. The relationships between LUV and liver hydroxyproline (i.e. collagen), as well as Sirius Red staining, were established and verified. A strong negative linear correlation was found between LUV and hydroxyproline levels (r = −0.83), as well as between LUV and Sirius Red quantification (r = −0.83). In conclusion, SPECT imaging with 99mTc-p(VLA-co-VNI) is useful for evaluating and staging liver fibrosis in vivo. PMID:27150943
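
    The LUV formula quoted above is simple enough to transcribe directly; the count and volume figures below are invented for illustration.

```python
# Direct transcription of the liver uptake value (LUV) formula from the
# abstract: LUV = [radioactivity(liver)/radioactivity(injected)] * 100 / volume.
# The numbers are made up for illustration, not from the study.
def luv(liver_counts: float, injected_counts: float, liver_volume_ml: float) -> float:
    return (liver_counts / injected_counts) * 100 / liver_volume_ml

# e.g. 3.7e6 counts in the liver out of 9.2e6 injected, 1.2 mL liver volume
print(round(luv(3.7e6, 9.2e6, 1.2), 2))  # → 33.51 (% injected dose per mL)
```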

  9. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated for all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  10. Extending the quantitative assessment of industrial risks to earthquake effects.

    PubMed

    Campedel, Michela; Cozzani, Valerio; Garcia-Agreda, Anita; Salzano, Ernesto

    2008-10-01

    In the general framework of quantitative methods for natural-technological (NaTech) risk analysis, a specific methodology was developed for assessing risks caused by hazardous substances released due to earthquakes. The contribution of accidental scenarios initiated by seismic events to the overall industrial risk was assessed in three case studies derived from the actual plant layouts of existing oil refineries. Several specific vulnerability models for different equipment classes were compared and assessed. The effect of differing structural resistances of process equipment on the final risk results was also investigated. The main factors influencing the final risk values were the models for equipment vulnerability and the assumptions about the reference damage states of the process equipment. The analysis of the case studies showed that in seismic zones the additional risk deriving from damage caused by earthquakes may be more than one order of magnitude higher than that associated with internal failure causes. The critical equipment was determined to be mainly pressurized tanks, even though atmospheric tanks were more vulnerable to loss of containment. Failure of minor process equipment with a limited hold-up of hazardous substances (such as pumps) was shown to have limited influence on the final values of the risk increase caused by earthquakes.

  11. Quantitative Elastography for Cervical Stiffness Assessment during Pregnancy

    PubMed Central

    Fruscalzo, A.; Londero, A. P.; Fröhlich, C.; Möllmann, U.; Schmitz, R.

    2014-01-01

    Aim. Feasibility and reliability of tissue Doppler imaging (TDI)-based elastography for quantitative cervical stiffness assessment during all three trimesters of pregnancy were evaluated. Materials and Methods. Prospective case-control study including seventy-four patients collected between the 12th and 42nd weeks of gestation. The tissue strain (TS) was measured by two independent operators as natural strain. Intra- and interoperator intraclass correlation coefficient (ICC) agreements were evaluated. Results. TS measurement was always feasible and exhibited high reliability (intraoperator ICC agreement = 0.93; interoperator ICC agreement = 0.89 and 0.93 for a single measurement and for the average of two measurements, resp.). Cervical TS also showed a significant correlation with gestational age, cervical length, and parity. Conclusions. TS measurement during pregnancy demonstrated high feasibility and reliability. Furthermore, TS significantly correlated with gestational age, cervical length, and parity. PMID:24734246

  12. Quantitative Security Risk Assessment and Management for Railway Transportation Infrastructures

    NASA Astrophysics Data System (ADS)

    Flammini, Francesco; Gaglione, Andrea; Mazzocca, Nicola; Pragliola, Concetta

    Scientists have long been investigating procedures, models and tools for risk analysis in several domains, from economics to computer networks. This paper presents a quantitative method and a tool for security risk assessment and management specifically tailored to the context of railway transportation systems, which are exposed to threats ranging from vandalism to terrorism. The method is based on a reference mathematical model and is supported by a specifically developed tool. The tool allows for the management of data, including attributes of attack scenarios and the effectiveness of protection mechanisms, and the computation of results, including risk and cost/benefit indices. The main focus is on the design of physical protection systems, but the analysis can be extended to logical threats as well. The cost/benefit analysis allows for the evaluation of the return on investment, an increasingly important issue for risk analysts.
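
    The kind of risk and cost/benefit indices such a tool computes can be sketched under a common assumed definition (risk = event frequency × success probability × consequence); the scenario data, mitigation factor, and cost below are invented, not the paper's model.

```python
# Invented scenario data: (name, expected attacks/year, P(success | attack),
# consequence in EUR). Not taken from the paper's railway case study.
scenarios = [
    ("vandalism", 2.0, 0.8, 10_000),
    ("sabotage",  0.1, 0.5, 2_000_000),
]

def total_risk(scns, mitigation=0.0):
    """Annualized risk; mitigation = fractional reduction in attack success
    probability provided by the physical protection system (assumption)."""
    return sum(f * p * (1 - mitigation) * c for _, f, p, c in scns)

baseline = total_risk(scenarios)                    # no protection
protected = total_risk(scenarios, mitigation=0.6)   # with protection system
annual_cost = 30_000.0                              # protection system cost
benefit_cost = (baseline - protected) / annual_cost # return on investment
print(round(baseline), round(protected), round(benefit_cost, 2))
```

    A benefit/cost ratio above 1 indicates the protection investment averts more expected loss per year than it costs, which is the sense in which the tool supports return-on-investment evaluation.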

  13. Preschool Temperament Assessment: A Quantitative Assessment of the Validity of Behavioral Style Questionnaire Data

    ERIC Educational Resources Information Center

    Huelsman, Timothy J.; Gagnon, Sandra Glover; Kidder-Ashley, Pamela; Griggs, Marissa Swaim

    2014-01-01

    Research Findings: Child temperament is an important construct, but its measurement has been marked by a number of weaknesses that have diminished the frequency with which it is assessed in practice. We address this problem by presenting the results of a quantitative construct validation study. We calculated validity indices by hypothesizing the…

  14. A Framework for General Education Assessment: Assessing Information Literacy and Quantitative Literacy with ePortfolios

    ERIC Educational Resources Information Center

    Hubert, David A.; Lewis, Kati J.

    2014-01-01

    This essay presents the findings of an authentic and holistic assessment, using a random sample of one hundred student General Education ePortfolios, of two of Salt Lake Community College's (SLCC) college-wide learning outcomes: quantitative literacy (QL) and information literacy (IL). Performed by four faculty from biology, humanities, and…

  15. Singular Value Decomposition of Pinhole SPECT Systems.

    PubMed

    Palit, Robin; Kupinski, Matthew A; Barrett, Harrison H; Clarkson, Eric W; Aarsvold, John N; Volokh, Lana; Grobshtein, Yariv

    2009-03-12

    A single photon emission computed tomography (SPECT) imaging system can be modeled by a linear operator H that maps from object space to detector pixels in image space. The singular vectors and singular-value spectra of H provide useful tools for assessing system performance. The number of voxels used to discretize object space and the number of collection angles and pixels used to measure image space make the dimensions of H large. As a result, H must be stored sparsely, which renders several conventional singular value decomposition (SVD) methods impractical. We used an iterative power-method SVD algorithm (Lanczos), designed to operate on very large, sparsely stored matrices, to calculate the singular vectors and singular-value spectra of two small-animal pinhole SPECT imaging systems: FastSPECT II and M(3)R. The FastSPECT II system consisted of two rings of eight scintillation cameras each; the resulting dimensions of H were 68921 voxels by 97344 detector pixels. The M(3)R system is a four-camera system that was reconfigured to measure image space using a single scintillation camera; the resulting dimensions of H were 50864 voxels by 6241 detector pixels. In this paper we present the results of the SVD of each system and discuss calculation of the measurement and null spaces for each system.
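
    A minimal stand-in for the Lanczos approach: plain power iteration on H^T H recovers the leading singular triplet without forming a dense factorization. The matrix here is tiny and random; a real H (e.g. 97344 × 68921) would be stored sparsely and handled with a true Lanczos solver such as the one the authors used.

```python
import numpy as np

# Tiny invented system matrix standing in for the (huge, sparse) SPECT H.
rng = np.random.default_rng(1)
H = rng.random((30, 12))   # 30 detector pixels, 12 voxels

# Power iteration on H^T H converges to the dominant right singular vector.
v = rng.random(12)
for _ in range(200):
    v = H.T @ (H @ v)
    v /= np.linalg.norm(v)

sigma = np.linalg.norm(H @ v)   # leading singular value
u = (H @ v) / sigma             # leading left singular vector

# On this small example the result agrees with a dense SVD.
print(np.isclose(sigma, np.linalg.svd(H, compute_uv=False)[0]))
```

    Lanczos methods refine this idea by building an orthogonal Krylov basis, which is what makes whole spectra (not just the leading triplet) computable for sparsely stored operators.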

  16. Application of three-class ROC analysis to task-based image quality assessment of simultaneous dual-isotope myocardial perfusion SPECT (MPS).

    PubMed

    He, Xin; Song, Xiyun; Frey, Eric C

    2008-11-01

    The diagnosis of cardiac disease using dual-isotope myocardial perfusion SPECT (MPS) is based on the defect status in both stress and rest images, and can be modeled as a three-class task of classifying patients as having no, reversible, or fixed perfusion defects. Simultaneous acquisition protocols for dual-isotope MPS imaging have gained much interest due to their advantages including perfect registration of the (201)Tl and (99m)Tc images in space and time, increased patient comfort, and higher clinical throughput. As a result of simultaneous acquisition, however, crosstalk contamination, where photons emitted by one isotope contribute to the image of the other isotope, degrades image quality. Minimizing the crosstalk is important in obtaining the best possible image quality. One way to minimize the crosstalk is to optimize the injected activity of the two isotopes by considering the three-class nature of the diagnostic problem. To effectively do so, we have previously developed a three-class receiver operating characteristic (ROC) analysis methodology that extends and unifies the decision theoretic, linear discriminant analysis, and psychophysical foundations of binary ROC analysis in a three-class paradigm. In this work, we applied the proposed three-class ROC methodology to the assessment of the image quality of simultaneous dual-isotope MPS imaging techniques and the determination of the optimal injected activity combination. In addition to this application, the rapid development of diagnostic imaging techniques has produced an increasing number of clinical diagnostic tasks that involve not only disease detection, but also disease characterization and are thus multiclass tasks. This paper provides a practical example of the application of the proposed three-class ROC analysis methodology to medical problems.

  17. Quantitative risk assessment modeling for nonhomogeneous urban road tunnels.

    PubMed

    Meng, Qiang; Qu, Xiaobo; Wang, Xinchang; Yuanita, Vivi; Wong, Siew Chee

    2011-03-01

    Urban road tunnels provide an increasingly cost-effective engineering solution, especially in compact cities like Singapore. For some urban road tunnels, characteristics such as tunnel configuration, geometry, provision of electrical and mechanical systems, traffic volume, etc. may vary from one section to another. Urban road tunnels characterized by such nonuniform parameters are referred to as nonhomogeneous urban road tunnels. In this study, a novel quantitative risk assessment (QRA) model is proposed for nonhomogeneous urban road tunnels, because the existing QRA models for road tunnels cannot be applied to assess the risks in them. This model uses a tunnel segmentation principle whereby a nonhomogeneous urban road tunnel is divided into various homogeneous sections. Individual risk for road tunnel sections as well as integrated risk indices for the entire road tunnel are defined. The article then proceeds to develop a new QRA model for each of the homogeneous sections. Compared to the existing QRA models for road tunnels, this section-based model incorporates one additional top event (toxic gases due to traffic congestion) and employs the Poisson regression method to estimate the vehicle accident frequencies of tunnel sections. The article further illustrates an aggregated QRA model for nonhomogeneous urban tunnels by integrating the section-based QRA models. Finally, a case study in Singapore is carried out.
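
    As a stand-in for the Poisson regression step (the paper fits a model with covariates; the sketch below is the intercept-only case), the maximum-likelihood accident rate is simply total accidents over total exposure. The section data are invented.

```python
# Invented per-section data: (observed accidents over the study period,
# traffic exposure in 10^6 vehicle-km). Not from the Singapore case study.
sections = [
    (4, 12.0),
    (7, 20.5),
    (2, 6.3),
]

accidents = sum(a for a, _ in sections)
exposure = sum(e for _, e in sections)
rate = accidents / exposure          # Poisson MLE: accidents per 10^6 veh-km
print(round(rate, 3))

# Expected accident count per homogeneous section under this rate
for a, e in sections:
    print(round(rate * e, 2))
```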

  18. Dermal sensitization quantitative risk assessment (QRA) for fragrance ingredients.

    PubMed

    Api, Anne Marie; Basketter, David A; Cadby, Peter A; Cano, Marie-France; Ellis, Graham; Gerberick, G Frank; Griem, Peter; McNamee, Pauline M; Ryan, Cindy A; Safford, Robert

    2008-10-01

    Based on chemical, cellular, and molecular understanding of dermal sensitization, an exposure-based quantitative risk assessment (QRA) can be conducted to determine safe use levels of fragrance ingredients in different consumer product types. The key steps are: (1) determination of benchmarks (no expected sensitization induction level (NESIL)); (2) application of sensitization assessment factors (SAF); and (3) consumer exposure (CEL) calculation through product use. Using these parameters, an acceptable exposure level (AEL) can be calculated and compared with the CEL. The ratio of AEL to CEL must be favorable to support safe use of the potential skin sensitizer. This ratio must be calculated for the fragrance ingredient in each product type. Based on the Research Institute for Fragrance Materials, Inc. (RIFM) Expert Panel's recommendation, RIFM and the International Fragrance Association (IFRA) have adopted the dermal sensitization QRA approach described in this review for fragrance ingredients identified as potential dermal sensitizers. This now forms the fragrance industry's core strategy for primary prevention of dermal sensitization to these materials in consumer products. This methodology is used to determine global fragrance industry product management practices (IFRA Standards) for fragrance ingredients that are potential dermal sensitizers. This paper describes the principles of the recommended approach, provides detailed review of all the information used in the dermal sensitization QRA approach for fragrance ingredients and presents key conclusions for its use now and refinement in the future.
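
    The AEL/CEL arithmetic of steps (1)-(3) reduces to two divisions; the NESIL, SAF, and exposure numbers below are invented placeholders, not values from the RIFM methodology.

```python
# Sketch of the dermal sensitization QRA comparison described in the review.
# All numbers are invented placeholders.
nesil = 900.0        # no expected sensitization induction level, µg/cm²
saf = 100.0          # combined sensitization assessment factor
ael = nesil / saf    # acceptable exposure level, µg/cm²

cel = 5.0            # consumer exposure level for one product type, µg/cm²
print(ael, ael / cel, ael >= cel)   # an AEL/CEL ratio >= 1 supports safe use
```

    Because the CEL differs by product type, this comparison must be repeated for the same fragrance ingredient in every product category, which is what drives the category-specific IFRA Standards.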

  19. The potential optical coherence tomography in tooth bleaching quantitative assessment

    NASA Astrophysics Data System (ADS)

    Ni, Y. R.; Guo, Z. Y.; Shu, S. Y.; Zeng, C. C.; Zhong, H. Q.; Chen, B. L.; Liu, Z. M.; Bao, Y.

    2011-12-01

    In this paper, we report the outcomes of a pilot study on using an OCT functional imaging method to evaluate and quantify color alteration in human teeth in vitro. Images of the dental tissues before and after treatment with 35% hydrogen peroxide were obtained with an OCT system at a 1310 nm central wavelength. One parameter for the quantification of optical properties from OCT measurements is introduced in our study: the attenuation coefficient (μ). A significant decrease (p < 0.001) in the attenuation coefficient of dentine and a significant increase (p < 0.001) in that of enamel were observed during the tooth bleaching process. From the experimental results, it is found that the attenuation coefficient could be useful for assessing color alteration of human tooth samples. OCT has the potential to become an effective tool for the assessment of tooth bleaching, and our experiment offers a new method to evaluate color change in the visible region by quantitative analysis of infrared-region information from OCT.
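
    Assuming the usual single-exponential model I(z) = I0·exp(−μz) for OCT depth decay (the abstract does not give its fitting procedure), μ can be estimated by a linear fit on the log-intensity. The A-scan below is synthetic.

```python
import numpy as np

# Synthetic, noiseless A-scan standing in for a real OCT depth profile.
mu_true, i0 = 2.5, 1.0                 # attenuation (mm^-1) and surface intensity
z = np.linspace(0.05, 1.0, 40)         # depth, mm
intensity = i0 * np.exp(-mu_true * z)  # assumed single-exponential decay

# Linear least-squares fit on log(intensity): slope = -mu
slope, log_i0 = np.polyfit(z, np.log(intensity), 1)
mu_fit = -slope
print(round(mu_fit, 3))  # → 2.5 on noiseless data
```

    With real A-scans the fit would be restricted to the tissue layer of interest (enamel or dentine) and noise would make μ an estimate rather than an exact recovery.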

  20. A quantitative model for assessing community dynamics of Pleistocene mammals.

    PubMed

    Lyons, S Kathleen

    2005-06-01

    Previous studies have suggested that species responded individualistically to the climate change of the last glaciation, expanding and contracting their ranges independently. Consequently, many researchers have concluded that community composition is plastic over time. Here I quantitatively assess changes in community composition over broad timescales and assess the effect of range shifts on community composition. Data on Pleistocene mammal assemblages from the FAUNMAP database were divided into four time periods (preglacial, full glacial, postglacial, and modern). Simulation analyses were designed to determine whether the degree of change in community composition is consistent with independent range shifts, given the distribution of range shifts observed. Results indicate that many of the communities examined in the United States were more similar through time than expected if individual range shifts were completely independent. However, in each time transition examined, there were areas of nonanalogue communities. I conducted sensitivity analyses to explore how the results were affected by the assumptions of the null model. Conclusions about changes in mammalian distributions and community composition are robust with respect to the assumptions of the model. Thus, whether because of biotic interactions or because of common environmental requirements, community structure through time is more complex than previously thought.

  1. An Assessment of the Quantitative Literacy of Undergraduate Students

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.

    2016-01-01

    Quantitative literacy (QLT) represents an underlying higher-order construct that accounts for a person's willingness to engage in quantitative situations in everyday life. The purpose of this study is to retest the construct validity of a model of quantitative literacy (Wilkins, 2010). In this model, QLT represents a second-order factor that…

  2. Cerebral Microbleeds: Burden Assessment by Using Quantitative Susceptibility Mapping

    PubMed Central

    Liu, Tian; Surapaneni, Krishna; Lou, Min; Cheng, Liuquan; Spincemaille, Pascal

    2012-01-01

    Purpose: To assess quantitative susceptibility mapping (QSM) for reducing the inconsistency of standard magnetic resonance (MR) imaging sequences in measurements of cerebral microbleed burden. Materials and Methods: This retrospective study was HIPAA compliant and institutional review board approved. Ten patients (5.6%) were selected from among 178 consecutive patients suspected of having experienced a stroke who were imaged with a multiecho gradient-echo sequence at 3.0 T and who had cerebral microbleeds on T2*-weighted images. QSM was performed for various ranges of echo time by using both the magnitude and phase components in the morphology-enabled dipole inversion method. Cerebral microbleed size was measured by two neuroradiologists on QSM images, T2*-weighted images, susceptibility-weighted (SW) images, and R2* maps calculated by using different echo times. The sum of susceptibility over a region containing a cerebral microbleed was also estimated on QSM images as its total susceptibility. Measurement differences were assessed by using the Student t test and the F test; P < .05 was considered to indicate a statistically significant difference. Results: When echo time was increased from approximately 20 to 40 msec, the measured cerebral microbleed volume increased by mean factors of 1.49 ± 0.86 (standard deviation), 1.64 ± 0.84, 2.30 ± 1.20, and 2.30 ± 1.19 for QSM, R2*, T2*-weighted, and SW images, respectively (P < .01). However, the measured total susceptibility with QSM did not show significant change over echo time (P = .31), and the variation was significantly smaller than any of the volume increases (P < .01 for each). Conclusion: The total susceptibility of a cerebral microbleed measured by using QSM is a physical property that is independent of echo time. © RSNA, 2011 PMID:22056688

  3. Quantitative assessment of computational models for retinotopic map formation

    PubMed Central

    Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2014-01-01

    ABSTRACT Molecular and activity‐based cues acting together are thought to guide retinal axons to their terminal sites in the vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of the mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity‐based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2‐EphA3 ki/ki, Isl2‐EphA3 ki/+, ephrin‐A2,A3,A5 triple knock‐out (TKO), and Math5 −/− (Atoh7). Two models successfully reproduced the extent of the Math5 −/− anteromedial projection, but only one of those could account for the collapse point in Isl2‐EphA3 ki/+. The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin‐A2,A3,A5 TKO phenotype, suggesting either an incomplete knock‐out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available the MATLAB software we wrote to facilitate this process. © 2014 Wiley Periodicals, Inc. Develop Neurobiol 75: 641–666, 2015 PMID:25367067

  4. Quantitative assessment of neutrophil phagocytosis using flow cytometry.

    PubMed

    Nordenfelt, Pontus

    2014-01-01

    Neutrophils have an incredible ability to find and eradicate intruders such as bacteria and fungi. They do this largely through the process of phagocytosis, where the target is internalized into a phagosome, and eventually destroyed by the hostile phagosomal environment. It is important to study phagocytosis in order to understand how neutrophils interact with various pathogens and how they respond to different stimuli. Here, I describe a method to study neutrophil phagocytosis of bacteria using flow cytometry. The bacteria are fluorescently labeled before being introduced to neutrophils. After phagocytosis, both any remaining extracellular bacteria and neutrophils are labeled using one-step staining before three-color analysis. To assess phagocytosis, first the average time it takes for the neutrophils to internalize all bound bacteria is determined. Experiments are then performed using that time point while varying the bacteria-to-neutrophil ratio for full control of the analysis. Due to the ease with which multiple samples can be analyzed, and the quantitative nature of flow cytometry, this approach is both reproducible and sensitive.

  6. Quantitative assessment of healthy and reconstructed cleft lip using ultrasonography

    PubMed Central

    Devadiga, Sumana; Desai, Anil Kumar; Joshi, Shamsunder; Gopalakrishnan, K.

    2016-01-01

    Purpose: This study was conducted to investigate the feasibility of echographic imaging of the tissue thickness of healthy and reconstructed cleft lips. Design: Prospective study. Materials and Methods: The study was conducted in the SDM Craniofacial Unit, Dharwad, and was approved by the Local Institutional Review Board. A total of 30 patients aged 4 to 25 years were enrolled; 15 with postoperative unilateral cleft lip constituted the test group, and the remaining 15, with no cleft deformities and no gross facial asymmetry, constituted the control group. The thicknesses of the mucosa, submucosa and muscle, and the full thickness of the upper lip, were measured on transverse ultrasonographic images at the midpoint of the philtrum, the right and left philtral ridges and the vermilion border, at 1-, 3- and 6-month intervals. Results: There was an increase in muscle thickness at the vermilion border (mean = 6.9 mm) and philtral ridge (5.9 mm). Equal muscle thickness was found between the normal and test groups at the 6-month follow-up in a relaxed position, which was statistically significant (P = 0.0404). Conclusion: Quantitative assessment of the thickness and echo levels of various lip tissues can be performed with proper echographic calibration, and this study demonstrated the diagnostic potential of the method for noninvasive evaluation of cleft lip reconstructions. PMID:27134448

  7. Quantitative risk assessment of foods containing peanut advisory labeling.

    PubMed

    Remington, Benjamin C; Baumert, Joseph L; Marx, David B; Taylor, Steve L

    2013-12-01

    Foods with advisory labeling (i.e. "may contain") continue to be prevalent and the warning may be increasingly ignored by allergic consumers. We sought to determine the residual levels of peanut in various packaged foods bearing advisory labeling, compare similar data from 2005 and 2009, and determine any potential risk for peanut-allergic consumers. Of food products bearing advisory statements regarding peanut or products that had peanut listed as a minor ingredient, 8.6% and 37.5% contained detectable levels of peanut (>2.5 ppm whole peanut), respectively. Peanut-allergic individuals should be advised to avoid such products regardless of the wording of the advisory statement. Peanut was detected at similar rates and levels in products tested in both 2005 and 2009. Advisory labeled nutrition bars contained the highest levels of peanut and an additional market survey of 399 products was conducted. Probabilistic risk assessment showed the risk of a reaction to peanut-allergic consumers from advisory labeled nutrition bars was significant but brand-dependent. Peanut advisory labeling may be overused on some nutrition bars but prudently used on others. The probabilistic approach could provide the food industry with a quantitative method to assist with determining when advisory labeling is most appropriate. PMID:23994086
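
    The probabilistic approach mentioned above can be illustrated with a minimal Monte Carlo sketch: sample a serving, a contamination level, and an individual reaction threshold, then count simulated reactions. The distributions and parameters below (serving size, residual-peanut and threshold lognormals) are invented placeholders, not the study's fitted inputs; only the 8.6% contamination rate comes from the abstract.

```python
import random

# Minimal Monte Carlo sketch of probabilistic allergen risk assessment.
# All distributions are illustrative placeholders, not the study's inputs.

def reaction_risk(n=100_000, contamination_rate=0.086, seed=1):
    rng = random.Random(seed)
    reactions = 0
    for _ in range(n):
        if rng.random() > contamination_rate:
            continue                        # product contains no detectable peanut
        serving_g = rng.uniform(30, 60)     # nutrition-bar serving size, grams
        ppm = rng.lognormvariate(1.0, 1.0)  # residual whole peanut, ppm
        dose_mg = serving_g * ppm / 1000.0  # mg whole peanut ingested
        threshold_mg = rng.lognormvariate(4.0, 1.5)  # individual eliciting dose
        if dose_mg > threshold_mg:
            reactions += 1
    return reactions / n

print(f"simulated reaction risk per eating occasion: {reaction_risk():.4%}")
```

    Brand dependence, as reported in the study, would enter through per-brand contamination rates and concentration distributions.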

  8. Dual-energy micro-CT imaging of pulmonary airway obstruction: correlation with micro-SPECT

    NASA Astrophysics Data System (ADS)

    Badea, C. T.; Befera, N.; Clark, D.; Qi, Y.; Johnson, G. A.

    2014-03-01

    To match recent clinical dual energy (DE) CT studies focusing on the lung, similar developments for DE micro-CT of the rodent lung are required. Our group has been actively engaged in designing pulmonary gating techniques for micro-CT, and has also introduced the first DE micro-CT imaging method for the rodent lung. The aim of this study was to assess the feasibility of DE micro-CT imaging for the evaluation of airway obstruction in mice, and to compare the method with micro single photon emission computed tomography (micro-SPECT) using technetium-99m labeled macroaggregated albumin (99mTc-MAA). The results suggest that the induced pulmonary airway obstruction causes either atelectasis or air trapping similar to that seen in asthma or chronic bronchitis. Atelectasis could only be detected at early time points in DE micro-CT images, and is associated with a large increase in blood fraction and a decrease in air fraction. Air trapping had the opposite effect, with a larger air fraction and a decreased blood fraction shown by DE micro-CT. The decrease in perfusion to the hypoventilated lung (hypoxic vasoconstriction) is also seen in micro-SPECT. The proposed DE micro-CT technique for imaging localized airway obstruction performed well in our evaluation, and provides higher resolution compared with micro-SPECT. Both DE micro-CT and micro-SPECT provide critical, quantitative lung biomarkers for image-based anatomical and functional information in the small animal. The methods are readily linked to clinical methods, allowing direct comparison of preclinical and clinical results.
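
    The blood and air fractions mentioned above come from a material decomposition: attenuation measured at two energies is modelled as a linear mixture of basis materials whose volume fractions sum to one. A minimal sketch, with all attenuation values as illustrative placeholders rather than calibrated constants from the paper:

```python
# Sketch of dual-energy material decomposition into blood (enhanced),
# air and soft-tissue fractions. Substituting tissue = 1 - blood - air
# leaves a 2x2 linear system per voxel. HU values are illustrative.

def decompose(hu_low, hu_high,
              blood=(900.0, 400.0), air=(-1000.0, -1000.0),
              tissue=(40.0, 40.0)):
    # Solve M @ [blood_frac, air_frac] = b, where the columns of M are
    # (material - tissue) at the two energies.
    a11, a12 = blood[0] - tissue[0], air[0] - tissue[0]
    a21, a22 = blood[1] - tissue[1], air[1] - tissue[1]
    b1, b2 = hu_low - tissue[0], hu_high - tissue[1]
    det = a11 * a22 - a12 * a21
    blood_frac = (b1 * a22 - a12 * b2) / det
    air_frac = (a11 * b2 - b1 * a21) / det
    return blood_frac, air_frac, 1.0 - blood_frac - air_frac

# An atelectatic voxel: high blood fraction, little air.
bf, af, tf = decompose(hu_low=500.0, hu_high=220.0)
print(f"blood={bf:.2f}, air={af:.2f}, tissue={tf:.2f}")
```

    Air trapping would show the reverse pattern: a large air fraction and a small blood fraction.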

  9. Quantitative assessment of the effectiveness of a rockfall warning system

    NASA Astrophysics Data System (ADS)

    Bründl, Michael; Sättele, Martina; Krautblatter, Michael; Straub, Daniel

    2016-04-01

    Rockslides and rockfalls can pose a high risk to human settlements and traffic infrastructure. In addition to structural mitigation measures like rockfall nets, warning systems are increasingly installed to reduce rockfall risks. Whereas a structured evaluation method exists for structural mitigation measures, which reduce the spatial extent of events, few approaches for assessing the effectiveness of warning systems are known. Especially for higher-magnitude rockfalls, structural mitigation measures are not effective, and reliable early warning systems will be essential in the future. In response, we developed a classification and a framework to assess the reliability and effectiveness of early warning systems (Sättele et al., 2015a; 2016). Here, we demonstrate an application for the rockfall warning system installed in Preonzo prior to a major rockfall in May 2012 (Sättele et al., 2015b). We show that it is necessary to design such a warning system as a fail-safe construction, which has to incorporate components with low failure probabilities, high redundancy, low warning thresholds, and additional control systems. With a hypothetical probabilistic analysis, we investigate the effect of the risk attitude of decision makers and of the number of sensors on the probability of detecting an event and initiating a timely evacuation, as well as on the related intervention cost. We conclude that it is possible to quantitatively assess the effectiveness of warning systems, which helps to optimize mitigation strategies against rockfall events. References: Sättele, M., Bründl, M., and Straub, D.: Reliability and effectiveness of warning systems for natural hazards: concept and application to debris flow warning, Rel. Eng. Syst. Safety, 142, 192-202, 2015a. Sättele, M., Krautblatter, M., Bründl, M., and Straub, D.: Forecasting rock slope failure: How reliable and effective are warning systems?, Landslides, 605, 1-14, 2015b. Sättele, M., Bründl, M., and
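
    The effect of sensor redundancy on detection probability can be sketched with elementary probability. Assuming n independent sensors, each detecting an imminent event with probability p_detect and failing with probability p_fail, and an alarm raised if any working sensor triggers (all numbers are illustrative, not the Preonzo system's values):

```python
# Sketch of the redundancy argument: more independent sensors raise
# the probability that at least one detects an imminent rockfall.
# p_detect and p_fail are illustrative placeholders.

def p_event_detected(n_sensors, p_detect, p_fail=0.01):
    p_sensor = (1.0 - p_fail) * p_detect  # sensor is working AND detects
    return 1.0 - (1.0 - p_sensor) ** n_sensors

for n in (1, 3, 5):
    print(n, round(p_event_detected(n, p_detect=0.8), 4))
```

    The framework in the abstract weighs this detection gain against false alarms and intervention cost, which redundancy alone does not capture.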

  10. Assessment of cardiac function using myocardial perfusion imaging technique on SPECT with 99mTc sestamibi

    NASA Astrophysics Data System (ADS)

    Gani, M. R. A.; Nazir, F.; Pawiro, S. A.; Soejoko, D. S.

    2016-03-01

    Suspected coronary heart disease can be confirmed by observing the function of the left ventricular cardiac muscle with myocardial perfusion imaging (MPI) techniques. Perfusion function itself is indicated by the uptake of a radiopharmaceutical tracer. Thirty-one patients undergoing MPI examination at Gatot Soebroto Hospital were studied using the 99mTc-sestamibi radiopharmaceutical under stress and rest conditions. Stress was induced by physical exercise or a pharmacological agent; the rest study was performed two hours later on the same day. The difference in uptake percentage between the stress and rest conditions was used to determine malfunction of perfusion due to ischemia or infarction. Degradation of cardiac function was determined from an image-based assessment of five segments of the left ventricle. As a result, 8 (25.8%) patients had normal myocardial perfusion and 11 (35.5%) patients were suspected of having partial ischemia. Total ischemia, comprising reversible and irreversible ischemia, occurred in 8 (25.8%) patients, and the remaining 4 (12.9%) patients had partial infarct, characterized by a perfusion percentage ≤50%. It is concluded that the MPI technique of image-based assessment of the uptake percentage difference between stress and rest conditions can be employed to predict abnormal perfusion as complementary information in diagnosing cardiac function.
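
    The stress/rest comparison described above can be sketched as a simple per-segment classifier. The thresholds below (50% uptake for a fixed defect, a 10-point stress-to-rest recovery for reversibility) are hypothetical choices for illustration, not the hospital's clinical criteria.

```python
# Illustrative per-segment classification from stress and rest uptake
# percentages (relative to peak myocardial counts). Thresholds are
# hypothetical, chosen only to demonstrate the decision logic.

def classify_segment(stress_pct, rest_pct,
                     infarct_cutoff=50.0, reversibility=10.0):
    if stress_pct >= infarct_cutoff and rest_pct >= infarct_cutoff:
        return "normal"
    if rest_pct - stress_pct >= reversibility:
        return "reversible ischemia"   # perfusion recovers at rest
    if rest_pct <= infarct_cutoff:
        return "infarct"               # fixed defect, uptake <= 50%
    return "irreversible ischemia"

print(classify_segment(75, 78))  # normal
print(classify_segment(40, 65))  # reversible ischemia
print(classify_segment(35, 38))  # infarct
```

    In practice each of the five left-ventricular segments would be classified this way and the results combined into a per-patient finding.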

  11. Is there a place for quantitative risk assessment?

    PubMed Central

    Hall, Eric J

    2013-01-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12 fold in the UK and more than 20 fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk–benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. 
It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  12. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12 fold in the UK and more than 20 fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. 
It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  14. Quantitative ultrasound (QUS) assessment of tissue properties for Achilles tendons

    NASA Astrophysics Data System (ADS)

    Du, Yi-Chun; Chen, Yung-Fu; Chen, Pei-Jarn; Lin, Yu-Ching; Chen, Tainsong; Lin, Chii-Jeng

    2007-09-01

    Quantitative ultrasound (QUS) techniques have recently been widely applied for the characterization of tissues. For example, they can be used for the quantification of Achilles tendon properties based on the broadband ultrasound attenuation (BUA) and the speed of sound (SOS) when the ultrasound wave passes through the tissues. The aim of this study was to develop an integrated system to investigate the properties of Achilles tendons using QUS images from UBIS 5000 (DMS, Montpellier, France) and B-mode ultrasound images from HDI 5000 (ATL, Ultramark, USA). Subjects in a young group (32 females and 17 males; mean age: 23.7 ± 2.0 years) and a middle-aged group (8 females and 8 males; mean age: 47.3 ± 8.5 years) were recruited and tested for this study. Only subjects who did not exercise regularly and had no record of tendon injury were studied. The results show that the BUA is significantly higher for the young group (45.2 ± 1.6 dB MHz-1) than for the middle-aged group (40.5 ± 1.9 dB MHz-1), while the SOS is significantly lower for the young (1601.9 ± 11.2 m s-1) compared with the middle-aged (1624.1 ± 8.7 m s-1). On the other hand, the thicknesses of the Achilles tendons for both groups (young: 4.31 ± 0.23 mm; middle-aged: 4.24 ± 0.23 mm) are very similar. For one patient who had Achilles tendon lengthening (ATL) surgery, the thickness of the Achilles tendon increased from 4 mm to 4.33 mm after the surgery. In addition, the BUA increased by about 7.2% while the SOS decreased by about 0.6%. In conclusion, noninvasive ultrasonic assessment of Achilles tendons is useful for assisting clinical diagnosis and for the evaluation of a therapeutic regimen.
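
    The two QUS quantities are simple to compute: BUA is the slope of a straight-line fit of attenuation (dB) against frequency (MHz) over the broadband range, and SOS is tissue thickness divided by transit time. A self-contained sketch with fabricated sample data:

```python
# BUA as the least-squares slope of attenuation vs. frequency, and SOS
# as thickness over transit time. The data points are fabricated for
# illustration and chosen to lie exactly on a 45 dB/MHz line.

def bua_slope(freqs_mhz, atten_db):
    n = len(freqs_mhz)
    mf = sum(freqs_mhz) / n
    ma = sum(atten_db) / n
    num = sum((f - mf) * (a - ma) for f, a in zip(freqs_mhz, atten_db))
    den = sum((f - mf) ** 2 for f in freqs_mhz)
    return num / den  # dB/MHz

def speed_of_sound(thickness_m, transit_time_s):
    return thickness_m / transit_time_s  # m/s

freqs = [0.2, 0.3, 0.4, 0.5, 0.6]
atten = [9.0, 13.5, 18.0, 22.5, 27.0]  # exactly linear, 45 dB/MHz
print(bua_slope(freqs, atten))
print(speed_of_sound(0.00431, 0.00431 / 1602))
```

    Real measurements would be noisy, so the fit's goodness would also be checked before reporting BUA.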

  15. Quantitative risk assessment of Cryptosporidium species infection in dairy calves.

    PubMed

    Nydam, D V; Mohammed, H O

    2005-11-01

    Cryptosporidium parvum is a zoonotic protozoan that infects many different mammals including cattle and humans. Cryptosporidiosis has become a concern for dairy producers because of the direct losses due to calves not performing well and the potential for environmental contamination with C. parvum. Identifying modifiable control points in the dynamics of infection in dairy herds will help identify management strategies that mitigate its risk. The quantitative risk assessment approach provides estimates of the risk associated with these factors so that cost-effective strategies can be implemented. Using published data from epidemiologic studies and a stochastic approach, we modeled the risk that C. parvum presents to dairy calves in 2 geographic areas: 1) the New York City Watershed (NYCW) in southeastern New York, and 2) the entire United States. The approach focused on 2 possible areas of exposure--the rearing environment and the maternity environment. In addition, we evaluated the contribution of many risk factors (e.g., age, housing, flies) to the end-state (i.e., total) risk to identify areas of intervention to decrease the risk to dairy calves. Expected risks from C. parvum in US dairy herds in rearing and maternity environments were 41.7 and 33.9%, respectively. In the NYCW, the expected risks from C. parvum in the rearing and maternity environments were 0.36 and 0.33%, respectively. In the US scenarios, the immediate environment contributed most of the risk to calves, whereas in the NYCW scenario, it was new calf infection. Therefore, within the NYCW, risk management activities may be focused on preventing new calf infections, whereas in the general US population, cleaning of calf housing would be a good choice for resource allocation. Despite the many assumptions inherent in modeling techniques, their usefulness in quantifying the likelihood of risk and identifying risk management areas is illustrated.

  16. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boq...

  17. Molecular SPECT Imaging: An Overview

    PubMed Central

    Khalil, Magdy M.; Tremoleda, Jordi L.; Bayomy, Tamer B.; Gsell, Willy

    2011-01-01

    Molecular imaging has witnessed a tremendous change over the last decade. Growing interest and emphasis are placed on this specialized technology represented by developing new scanners, pharmaceutical drugs, diagnostic agents, new therapeutic regimens, and ultimately, significant improvement of patient health care. Single photon emission computed tomography (SPECT) and positron emission tomography (PET) have their signature on paving the way to molecular diagnostics and personalized medicine. The former will be the topic of the current paper where the authors address the current position of the molecular SPECT imaging among other imaging techniques, describing strengths and weaknesses, differences between SPECT and PET, and focusing on different SPECT designs and detection systems. Radiopharmaceutical compounds of clinical as well-preclinical interest have also been reviewed. Moreover, the last section covers several application, of μSPECT imaging in many areas of disease detection and diagnosis. PMID:21603240

  18. Qualitative and quantitative procedures for health risk assessment.

    PubMed

    Lohman, P H

    1999-07-16

    Numerous reactive mutagenic electrophiles are present in the environment or are formed in the human body through metabolic processes. These electrophiles can react directly with DNA and are considered to be ultimate carcinogens. In the past decades, more than 200 in vitro and in vivo genotoxicity tests have been described to identify, monitor and characterize human exposure to such agents. When the responses of such genotoxicity tests are quantified by a weight-of-evidence analysis, it is found that the intrinsic mutagenic potency of electrophiles does not differ much for the majority of the agents studied. Considering that under normal environmental circumstances humans are exposed to low concentrations of about a million electrophiles, the relation between exposure to such agents and adverse health effects (e.g., cancer) becomes a 'Pandora's box'. For quantitative risk assessment it is necessary not only to detect whether an agent is genotoxic, but also to understand the mechanism of interaction of the agent with the DNA in target cells. Examples are given for a limited group of important environmental and carcinogenic agents for which such an approach is feasible. The groups identified are agents that form cross-links with DNA and mono-alkylating agents that react with base moieties in the DNA strands. Quantitative hazard ranking of the mutagenic potency of these groups of chemicals can be performed, and there is ample evidence that such a ranking corresponds with the individual carcinogenic potency of those agents in rodents. Still, in practice, with the exception of certain occupational or accidental exposure situations, these approaches have not been successful in preventing cancer deaths in the human population. However, this is not only due to the described 'Pandora's box' situation. At least three other factors are described. Firstly, in the industrial world the medical treatment of cancer in patients

  19. Multipinhole SPECT helical scan parameters and imaging volume

    SciTech Connect

    Yao, Rutao Deng, Xiao; Wei, Qingyang; Dai, Tiantian; Ma, Tianyu; Lecomte, Roger

    2015-11-15

    Purpose: The authors developed SPECT imaging capability on an animal PET scanner using a multiple-pinhole collimator and step-and-shoot helical data acquisition protocols. The objective of this work was to determine the preferred helical scan parameters, i.e., the angular and axial step sizes, and the imaging volume, that provide optimal imaging performance. Methods: The authors studied nine helical scan protocols formed by permuting three rotational and three axial step sizes. These step sizes were chosen around the reference values analytically calculated from the estimated spatial resolution of the SPECT system and the Nyquist sampling theorem. The nine helical protocols were evaluated by two figures-of-merit: the sampling completeness percentage (SCP) and the root-mean-square (RMS) resolution. SCP was an analytically calculated numerical index based on projection sampling. RMS resolution was derived from the reconstructed images of a sphere-grid phantom. Results: The RMS resolution results show that (1) the start and end pinhole planes of the helical scheme determine the axial extent of the effective field of view (EFOV), and (2) the diameter of the transverse EFOV is adequately calculated from the geometry of the pinhole opening, since the peripheral region beyond the EFOV would introduce projection multiplexing and consequent effects. The RMS resolution results of the nine helical scan schemes show that optimal resolution is achieved when the axial step size is half, and the angular step size about twice, the corresponding values derived from the Nyquist theorem. The SCP results agree in general with those of RMS resolution but are less critical in assessing the effects of helical parameters and EFOV. Conclusions: The authors quantitatively validated the effective FOV of multiple-pinhole helical scan protocols and proposed a simple method to calculate optimal helical scan parameters.
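
    A back-of-envelope sketch of Nyquist-derived step sizes, with the study's empirical adjustment (axial step halved, angular step roughly doubled) applied on top. The sampling formula, resolution, and FOV values here are illustrative assumptions, not the authors' actual calculation.

```python
import math

# Rough sketch: axial step from Nyquist (half the spatial resolution),
# angular step chosen so the arc length at the FOV edge matches the
# axial sampling interval, then the paper's empirical adjustment.
# Resolution and FOV radius are illustrative placeholders.

def nyquist_steps(resolution_mm, fov_radius_mm):
    axial_step = resolution_mm / 2.0  # sample at half the resolution
    n_angles = math.ceil(2.0 * math.pi * fov_radius_mm / axial_step)
    return axial_step, 360.0 / n_angles

def reported_optimum(resolution_mm, fov_radius_mm):
    # Empirical finding from the study: halve the axial step and
    # roughly double the angular step relative to the Nyquist values.
    ax, ang = nyquist_steps(resolution_mm, fov_radius_mm)
    return ax / 2.0, ang * 2.0

ax, ang = reported_optimum(resolution_mm=1.5, fov_radius_mm=30.0)
print(f"axial step {ax:.3f} mm, angular step {ang:.2f} deg")
```
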

  20. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton1 and Carey N. Pope2
    1US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    2Department of...

  1. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  2. Accelerated GPU based SPECT Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-01

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99mTc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator, respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between the GATE and GGEMS platforms derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Moreover, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. 
The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency

  3. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Caffo, Brian; Frey, Eric C.

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  4. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
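
    Once the NGS technique has estimated, for each method, the slope and noise standard deviation of the linear model relating measured to true values, ranking by NSR is straightforward. The method names and parameter values below are fabricated for illustration, not results from the study.

```python
# Ranking methods by noise-to-slope ratio (NSR), the figure of merit
# described above, given NGS-estimated parameters of the model
# "measured = a * true + b + noise". All values are fabricated.

def nsr(sigma, slope):
    return sigma / slope  # lower NSR = better precision

methods = {
    "OSEM":     {"slope": 0.95, "sigma": 0.08},
    "OSEM+CDR": {"slope": 0.98, "sigma": 0.05},
    "FBP":      {"slope": 0.90, "sigma": 0.12},
}
ranking = sorted(methods, key=lambda m: nsr(methods[m]["sigma"],
                                            methods[m]["slope"]))
print(ranking)  # most precise method first
```

    The hard part, which this sketch omits, is the NGS estimation of slope and noise without ground truth; the ranking step itself is trivial.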

  5. Optimization of the filter parameters in (99m)Tc myocardial perfusion SPECT studies: the formulation of flowchart.

    PubMed

    Shibutani, Takayuki; Onoguchi, Masahisa; Yamada, Tomoki; Kamida, Hiroki; Kunishita, Kohei; Hayashi, Yuuki; Nakajima, Tadashi; Kinuya, Seigo

    2016-06-01

    Myocardial perfusion single photon emission computed tomography (SPECT) is typically subject to variation in image quality due to the use of different acquisition protocols, image reconstruction parameters and image display settings at each institution. One of the principal image reconstruction parameters is the Butterworth filter cut-off frequency, a parameter that strongly affects the quality of myocardial images. The objective of this study was to formulate a flowchart for determining the optimal Butterworth filter parameters for the filtered back projection (FBP), ordered subset expectation maximization (OSEM) and collimator-detector response compensation OSEM (CDR-OSEM) methods, using a phantom-based evaluation system for myocardial images. SPECT studies were acquired for seven simulated defects in which the average counts of the normal myocardial components of 45° left anterior oblique projections were approximately 10-120 counts/pixel. These SPECT images were then reconstructed by the FBP, OSEM and CDR-OSEM methods. Visual and quantitative assessments of short-axis images were performed for the defect and normal parts. Finally, we formulated a flowchart indicating the optimal image processing procedure for SPECT images. The correlation between normal myocardial counts and the optimal cut-off frequency could be represented as a regression expression with a high or moderate coefficient of determination. We formulated the flowchart to optimize the image reconstruction parameters on the basis of a comprehensive assessment, enabling objective processing. Furthermore, the usefulness of image reconstruction using the flowchart was demonstrated in a clinical case.
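The Butterworth low-pass filter whose cut-off frequency this study optimizes has a simple closed form, H(f) = 1 / (1 + (f/f_c)^(2n)) with cut-off f_c and order n. Below is a minimal sketch (not the authors' code) of applying it to a 2-D projection image in the spatial-frequency domain; expressing the cut-off in cycles/pixel is an assumption for this example.

```python
# Minimal 2-D Butterworth low-pass filter applied in the frequency domain.
import numpy as np

def butterworth_lowpass(image: np.ndarray, cutoff: float, order: int = 8) -> np.ndarray:
    """Filter a 2-D image; cutoff is in cycles/pixel (0 < cutoff <= 0.5)."""
    fy = np.fft.fftfreq(image.shape[0])[:, None]
    fx = np.fft.fftfreq(image.shape[1])[None, :]
    f = np.hypot(fy, fx)                               # radial spatial frequency
    h = 1.0 / (1.0 + (f / cutoff) ** (2 * order))      # Butterworth response
    return np.real(np.fft.ifft2(np.fft.fft2(image) * h))

# e.g. smooth a noisy 64x64 projection with a cut-off of 0.4 cycles/pixel:
noisy = np.random.default_rng(0).poisson(50.0, (64, 64)).astype(float)
smoothed = butterworth_lowpass(noisy, cutoff=0.4)
```

Lowering the cut-off suppresses more noise but also blurs the defect boundaries, which is exactly the trade-off the flowchart in the study is designed to balance against the count level.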

  6. Quantitative assessment of hemadsorption by myxoviruses: virus hemadsorption assay.

    PubMed

    Hahon, N; Booth, J A; Eckert, H L

    1973-04-01

    The standardization and quantitative evaluation of an assay for myxoviruses, based on the enumeration of individual infected clone 1-5C-4 cells manifesting hemadsorption within 24 h of infection, are described. Hemadsorption was detectable earlier than immunofluorescence in infected cells or hemagglutinins in culture medium. The relationship between virus concentration and cells exhibiting hemadsorption was linear. The assay was highly precise, sensitive, and reproducible. PMID:4349248

  7. Detection and assessment of unstable angina using myocardial perfusion imaging: Comparison between technetium-99m sestamibi SPECT and 12-lead electrocardiogram

    SciTech Connect

    Gregoire, J.; Theroux, P.

    1990-10-16

    Forty-five studies using technetium-99m (Tc-99m) sestamibi single photon emission computed tomography (SPECT) were performed on patients hospitalized for spontaneous chest pain suggestive of myocardial ischemia. The studies were done after an injection during an episode of chest pain and a repeated injection when the patients were free of pain. All patients were hospitalized with a presumed diagnosis of unstable angina, and none had evidence of a previous myocardial infarction. The presence of a perfusion defect observed with Tc-99m sestamibi injected during chest pain had a 96% sensitivity and a 79% specificity for the detection of significant coronary artery disease (stenosis greater than or equal to 50%) on subsequent angiography. When the criterion of a larger perfusion defect during pain compared to absence of pain was used, the sensitivity was 81% and the specificity was 84%. In contrast, transient electrocardiographic ischemic changes during pain had a sensitivity of 35% and a specificity of 68%; electrocardiographic changes during or outside episodes of chest pain had a sensitivity of 65% and a specificity of 63% for the diagnosis. Tc-99m sestamibi SPECT represents a reliable noninvasive diagnostic tool that could aid in the diagnosis of myocardial ischemia in patients with spontaneous chest pain and provide additional information to that provided by the electrocardiogram.
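The diagnostic figures quoted above reduce to simple count ratios. As a minimal sketch (the patient counts below are illustrative round numbers, not the study's data):

```python
# Sensitivity and specificity as count ratios.

def sensitivity(tp: int, fn: int) -> float:
    """Fraction of diseased patients correctly detected."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of disease-free patients correctly cleared."""
    return tn / (tn + fp)

# Hypothetical counts chosen to reproduce figures of the same order:
print(round(sensitivity(27, 1), 2))   # 0.96
print(round(specificity(11, 3), 2))   # 0.79
```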

  8. Cardiac AAV9 Gene Delivery Strategies in Adult Canines: Assessment by Long-term Serial SPECT Imaging of Sodium Iodide Symporter Expression

    PubMed Central

    Moulay, Gilles; Ohtani, Tomohito; Ogut, Ozgur; Guenzel, Adam; Behfar, Atta; Zakeri, Rosita; Haines, Philip; Storlie, Jimmy; Bowen, Lorna; Pham, Linh; Kaye, David; Sandhu, Gurpreet; O'Connor, Michael; Russell, Stephen; Redfield, Margaret

    2015-01-01

    Heart failure is a leading cause of morbidity and mortality, and cardiac gene delivery has the potential to provide novel therapeutic approaches. Adeno-associated virus serotype 9 (AAV9) transduces the rodent heart efficiently, but cardiotropism, immune tolerance, and optimal delivery strategies in large animals are unclear. In this study, an AAV9 vector encoding canine sodium iodide symporter (NIS) was administered to adult immunocompetent dogs via epicardial injection, coronary infusion without and with cardiac recirculation, or endocardial injection via a novel catheter with curved needle and both end- and side-holes. As NIS mediates cellular uptake of clinical radioisotopes, expression was tracked by single-photon emission computerized tomography (SPECT) imaging in addition to Western blot and immunohistochemistry. Direct epicardial or endocardial injection resulted in strong cardiac expression, whereas expression after intracoronary infusion or cardiac recirculation was undetectable. A threshold myocardial injection dose that provides robust nonimmunogenic expression was identified. The extent of transmural myocardial expression was greater with the novel catheter versus straight end-hole needle delivery. Furthermore, the authors demonstrate that cardiac NIS reporter gene expression and duration can be quantified using serial noninvasive SPECT imaging up to 1 year after vector administration. These data are relevant to efforts to develop cardiac gene delivery as heart failure therapy. PMID:25915925

  9. Assessment of the severity of partial volume effects and the performance of two template-based correction methods in a SPECT/CT phantom experiment.

    PubMed

    Shcherbinin, S; Celler, A

    2011-08-21

    We investigated the severity of partial volume effects (PVE), which may occur in SPECT/CT studies, and the performance of two template-based correction techniques. A hybrid SPECT/CT system was used to scan a thorax phantom that included lungs, a heart insert and six cylindrical containers of different sizes and activity concentrations. This phantom configuration allowed us to have non-uniform background activity and a combination of spill-in and spill-out effects for several compartments. The reconstruction with corrections for attenuation, scatter and resolution loss, but without PVE correction, accurately recovered absolute activities in large organs. However, the activities inside segmented 17-120 mL containers were underestimated by 20%-40%. After applying our PVE correction to the data pertaining to the six small containers, the accuracy of the recovered total activity improved, with errors ranging between 3% and 22% (non-iterative method) and between 5% and 15% (method with an iteratively updated background activity). While the non-iterative template-based algorithm demonstrated slightly better accuracy than the iterative algorithm for cases with less severe PVE, it underperformed in situations with considerable spill-out and/or a mixture of spill-in and spill-out effects.
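Template-based partial-volume correction can be pictured as a small linear system: the observed mean activity in each segmented region is a mixture of the true regional activities, weighted by cross-talk coefficients obtained by projecting region templates through the system resolution model. The sketch below illustrates that idea in the style of a geometric-transfer-matrix correction; it is a toy example, not the specific algorithms evaluated in the paper, and the coefficients are invented.

```python
# Toy GTM-style partial-volume correction: invert the cross-talk mixing.
import numpy as np

def correct_pve(observed: np.ndarray, crosstalk: np.ndarray) -> np.ndarray:
    """Solve observed = crosstalk @ true for the true regional activities."""
    return np.linalg.solve(crosstalk, observed)

# Two regions: each keeps 80% of its counts and spills 20% into the other.
A = np.array([[0.8, 0.2],
              [0.2, 0.8]])
obs = A @ np.array([10.0, 2.0])    # what the scanner would report
print(correct_pve(obs, A))         # recovers [10., 2.]
```

The spill-in/spill-out language of the abstract maps onto the off-diagonal entries: spill-out drains a region's own counts, spill-in adds counts from its neighbours.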

  10. Cardiac AAV9 Gene Delivery Strategies in Adult Canines: Assessment by Long-term Serial SPECT Imaging of Sodium Iodide Symporter Expression.

    PubMed

    Moulay, Gilles; Ohtani, Tomohito; Ogut, Ozgur; Guenzel, Adam; Behfar, Atta; Zakeri, Rosita; Haines, Philip; Storlie, Jimmy; Bowen, Lorna; Pham, Linh; Kaye, David; Sandhu, Gurpreet; O'Connor, Michael; Russell, Stephen; Redfield, Margaret

    2015-07-01

    Heart failure is a leading cause of morbidity and mortality, and cardiac gene delivery has the potential to provide novel therapeutic approaches. Adeno-associated virus serotype 9 (AAV9) transduces the rodent heart efficiently, but cardiotropism, immune tolerance, and optimal delivery strategies in large animals are unclear. In this study, an AAV9 vector encoding canine sodium iodide symporter (NIS) was administered to adult immunocompetent dogs via epicardial injection, coronary infusion without and with cardiac recirculation, or endocardial injection via a novel catheter with curved needle and both end- and side-holes. As NIS mediates cellular uptake of clinical radioisotopes, expression was tracked by single-photon emission computerized tomography (SPECT) imaging in addition to Western blot and immunohistochemistry. Direct epicardial or endocardial injection resulted in strong cardiac expression, whereas expression after intracoronary infusion or cardiac recirculation was undetectable. A threshold myocardial injection dose that provides robust nonimmunogenic expression was identified. The extent of transmural myocardial expression was greater with the novel catheter versus straight end-hole needle delivery. Furthermore, the authors demonstrate that cardiac NIS reporter gene expression and duration can be quantified using serial noninvasive SPECT imaging up to 1 year after vector administration. These data are relevant to efforts to develop cardiac gene delivery as heart failure therapy. PMID:25915925

  11. Quantitative Assessment of Countermeasure Efficacy for Long-Term Space Missions

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.

    2000-01-01

    This slide presentation reviews the development of quantitative assessments of the effectiveness of countermeasures (CM) for the effects of space travel on humans for long term space missions. An example of bone mineral density (BMD) is examined to show specific quantitative measures for failure and success.

  12. In Vivo Quantitative Ultrasound Imaging and Scatter Assessments.

    NASA Astrophysics Data System (ADS)

    Lu, Zheng Feng

    There is evidence that "instrument independent" measurements of ultrasonic scattering properties would provide useful diagnostic information that is not available with conventional ultrasound imaging. This dissertation is a continuing effort to test the above hypothesis and to incorporate quantitative ultrasound methods into clinical examinations for early detection of diffuse liver disease. A well-established reference phantom method was employed to construct quantitative ultrasound images of tissue in vivo. The method was verified by extensive phantom tests. A new method was developed to measure the effective attenuation coefficient of the body wall. The method relates the slope of the difference between the echo signal power spectrum from a uniform region distal to the body wall and the echo signal power spectrum from a reference phantom to the body wall attenuation. The accuracy obtained from phantom tests suggests further studies with animal experiments. Clinically, thirty-five healthy subjects and sixteen patients with diffuse liver disease were studied by these quantitative ultrasound methods. The average attenuation coefficient in normals agreed with previous investigators' results; in vivo backscatter coefficients agreed with the results from normals measured by O'Donnell. Strong discriminating power (p < 0.001) was found for both attenuation and backscatter coefficients between fatty livers and normals; a significant difference (p < 0.01) was observed in the backscatter coefficient but not in the attenuation coefficient between cirrhotic livers and normals. An in vivo animal model of steroid hepatopathy was used to investigate the system sensitivity in detecting early changes in canine liver resulting from corticosteroid administration. The average attenuation coefficient slope increased from 0.7 dB/cm/MHz in controls to 0.82 dB/cm/MHz (at 6 MHz) in treated animals on day 14 of treatment, and the backscatter coefficient was 26 × 10⁻⁴ cm⁻¹ sr⁻¹
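The spectral-difference method described in this abstract hinges on a straight-line fit: the tissue power spectrum minus the reference-phantom spectrum, in dB, is fit against frequency, and the slope is converted to an attenuation coefficient in dB/cm/MHz. The sketch below illustrates only that slope-fitting step with synthetic spectra; the sign convention, the round-trip factor of 2, and all the numbers are assumptions of this example, not the dissertation's implementation.

```python
# Slope-fitting step of a reference-phantom attenuation estimate (toy data).
import numpy as np

def attenuation_slope(freq_mhz, tissue_db, ref_db, depth_cm):
    """Excess attenuation of tissue over the reference, in dB/cm/MHz."""
    diff = np.asarray(tissue_db) - np.asarray(ref_db)
    slope, _ = np.polyfit(freq_mhz, diff, 1)   # dB per MHz
    return -slope / (2.0 * depth_cm)           # divide by round-trip path

f = np.linspace(2, 6, 9)                       # analysis band, MHz
ref = -0.5 * f * 2 * 4.0                       # phantom: 0.5 dB/cm/MHz, 4 cm deep
tis = -0.7 * f * 2 * 4.0                       # tissue: 0.7 dB/cm/MHz
print(attenuation_slope(f, tis, ref, 4.0))     # ≈ 0.2 dB/cm/MHz excess
```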

  13. Quantitative phylogenetic assessment of microbial communities in diverse environments

    SciTech Connect

    von Mering, C.; Hugenholtz, P.; Raes, J.; Tringe, S.G.; Doerks, T.; Jensen, L.J.; Ward, N.; Bork, P.

    2007-01-01

    The taxonomic composition of environmental communities is an important indicator of their ecology and function. Here, we use a set of protein-coding marker genes, extracted from large-scale environmental shotgun sequencing data, to provide a more direct, quantitative and accurate picture of community composition than traditional rRNA-based approaches using polymerase chain reaction (PCR). By mapping marker genes from four diverse environmental data sets onto a reference species phylogeny, we show that certain communities evolve faster than others, determine preferred habitats for entire microbial clades, and provide evidence that such habitat preferences are often remarkably stable over time.

  14. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  15. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • SDMProjectBuilder (which includes the Microbial Source Module as part...

  16. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  17. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  18. Assessment of metabolic bone diseases by quantitative computed tomography

    SciTech Connect

    Richardson, M.L.; Genant, H.K.; Cann, C.E.; Ettinger, B.; Gordan, G.S.; Kolb, F.O.; Reiser, U.J.

    1985-05-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated for all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements.

  19. New approach to quantitative angiographic assessment after stent implantation.

    PubMed

    Reimers, B; Di Mario, C; Di Francesco, L; Moussa, I; Blengino, S; Martini, G; Reiber, J H; Colombo, A

    1997-04-01

    New-generation quantitative angiographic systems apply the interpolated technique to calculate the reference diameter at the site of the stenosis by integrating measurements of the segments proximal and distal to the stenosis. After stent implantation these measurements can be misleading because the treated segment, which is frequently larger than the adjacent non-stented segments, is included in the measurements. The consequence is an overestimation of the reference diameter and the residual diameter stenosis. The present study was performed to compare this conventional technique of measurement with a new method that excludes the stented segment from the calculation of the reference diameter. Fifty-two lesions treated with poorly radiopaque stents (56% Palmaz-Schatz, 28% NIR, 10% Gianturco-Roubin, 6% Wallstent) expanded at high pressure (≥16 atm) were analyzed according to the conventional and stent-excluded methods. After stent implantation the reference diameter was 3.39 +/- 0.48 mm with conventional measurements and 3.02 +/- 0.45 mm with the stent-excluded method (P < 0.05). The corresponding percent diameter stenosis was 13 +/- 9 for the conventional technique and 1 +/- 13 for the stent-excluded analysis (P < 0.05). The new approach to quantitative coronary analysis after stenting provides higher accuracy in reference diameter calculations and allows more appropriate matching of stented segments with adjacent normal segments.
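The percent diameter stenosis reported above follows directly from the minimal lumen diameter (MLD) and the reference diameter (RefD): %DS = (1 − MLD/RefD) × 100, so an overestimated RefD inflates the apparent residual stenosis. A minimal sketch; the MLD value below is illustrative, chosen only to show the effect of the two reference diameters from the study:

```python
# Percent diameter stenosis from MLD and reference diameter.

def percent_diameter_stenosis(mld_mm: float, ref_mm: float) -> float:
    return (1.0 - mld_mm / ref_mm) * 100.0

# Same hypothetical MLD of 2.95 mm against the study's two mean references:
print(round(percent_diameter_stenosis(2.95, 3.39)))  # 13 (conventional RefD)
print(round(percent_diameter_stenosis(2.95, 3.02)))  # 2  (stent-excluded RefD)
```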

  20. Recent developments and future prospects of SPECT myocardial perfusion imaging.

    PubMed

    Zaman, Maseeh Uz; Hashmi, Ibrahim; Fatima, Nosheen

    2010-10-01

    Myocardial perfusion SPECT imaging is the most commonly performed functional imaging study for the assessment of coronary artery disease. High diagnostic accuracy and incremental prognostic value are its major benefits, while suboptimal spatial resolution and significant radiation exposure are its main limitations. Its ability to detect the hemodynamic significance of lesions seen on multidetector CT angiography (MDCTA) has paved the path for a successful marriage between anatomical and functional imaging modalities in the form of hybrid SPECT/MDCTA systems. In recent years, there have been enormous efforts by industry and academia to develop new SPECT imaging systems with better sensitivity, resolution and compact design, and new reconstruction algorithms able to improve image quality and resolution. Furthermore, the expected arrival of Tc-99m-labeled deoxyglucose in the next few years would further strengthen the role of SPECT in imaging hibernating myocardium. In view of these developments, it seems that SPECT will retain its pivotal role despite the threat of being replaced by fluorine-18-labeled positron emission tomography perfusion and glucose metabolism imaging agents. PMID:20652774

  1. Quantitative Assessment of Neurite Outgrowth in PC12 Cells

    EPA Science Inventory

    In vitro test methods can provide a rapid approach for the screening of large numbers of chemicals for their potential to produce toxicity. In order to identify potential developmental neurotoxicants, assessment of critical neurodevelopmental processes such as neuronal differenti...

  2. Quantitative risk assessment: an emerging tool for emerging foodborne pathogens.

    PubMed Central

    Lammerding, A. M.; Paoli, G. M.

    1997-01-01

    New challenges to the safety of the food supply require new strategies for evaluating and managing food safety risks. Changes in pathogens, food preparation, distribution, and consumption, and population immunity have the potential to adversely affect human health. Risk assessment offers a framework for predicting the impact of changes and trends on the provision of safe food. Risk assessment models facilitate the evaluation of active or passive changes in how foods are produced, processed, distributed, and consumed. PMID:9366601

  3. Cardiac ⁹⁹mTc sestamibi SPECT and ¹⁸F FDG PET as viability markers in Takotsubo cardiomyopathy.

    PubMed

    Christensen, Thomas Emil; Bang, Lia Evi; Holmvang, Lene; Ghotbi, Adam Ali; Lassen, Martin Lyngby; Andersen, Flemming; Ihlemann, Nikolaj; Andersson, Hedvig; Grande, Peer; Kjaer, Andreas; Hasbak, Philip

    2014-10-01

    In patients with heart failure (HF) due to coronary disease, a combined evaluation of perfusion and glucose metabolism by cardiac single photon emission computed tomography (SPECT)/positron emission tomography (PET) can be used to distinguish viable from non-viable myocardium, and current guidelines recommend cardiac SPECT and fluorodeoxyglucose (FDG) PET for viability assessment. Takotsubo cardiomyopathy (TTC) is a disease characterized by acute but reversible HF that leaves no scarring. Our aim was to explore how well the semi-quantitative viability criteria used in cardiac SPECT and FDG PET hold up in a population with TTC. From 1 September 2009 to 1 October 2012, 24 patients suspected of TTC were enrolled in a multimodality cardiac imaging research project. Echocardiography, (99m)Tc SPECT and (18)F FDG PET were performed during the acute admission and at follow-up 4 months later. Nineteen patients had a final diagnosis of TTC consistent with the Mayo Clinic Diagnostic Criteria. Three of these patients were excluded from further analysis, since wall motion abnormalities were not persistent at the time of nuclear imaging. The remaining sixteen patients exhibited a distinct pattern with HF, "apical ballooning" and a perfusion-metabolism defect in the midventricular/apical region. When the viability criteria were applied, they identified significant scarring/limited hibernation in the akinetic part of the left ventricle. However, full recovery was found in all TTC patients on follow-up. Using the current guideline-endorsed viability criteria for semiquantitative cardiac SPECT and FDG PET, these modalities failed to demonstrate the presence of viability in the acute state of TTC.

  4. Quantitative criteria for assessment of gamma-ray imager performance

    NASA Astrophysics Data System (ADS)

    Gottesman, Steve; Keller, Kristi; Malik, Hans

    2015-08-01

    In recent years, gamma-ray imagers such as the GammaCam™ and Polaris have demonstrated good imaging performance in the field. Imager performance is often summarized as "resolution", either angular or spatial at some distance from the imager; however, the definition of resolution is not always related to the ability to image an object. It is difficult to compare imagers quantitatively without a common definition of image quality. This paper examines three categories of definition: point source, line source and area source. It discusses the details of those definitions and which ones are more relevant for different situations. Metrics such as full width at half maximum (FWHM), variations on the Rayleigh criterion, and metrics analogous to the National Imagery Interpretability Rating Scale (NIIRS) are discussed. Performance against these metrics is evaluated for a high-resolution coded-aperture imager modeled using Monte Carlo N-Particle (MCNP), and for a medium-resolution imager measured in the lab.
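Of the metrics named above, FWHM is the simplest to compute from a measured point-source profile. A minimal illustrative helper (not from the paper) that finds the width between the two half-maximum crossings of a sampled 1-D profile by linear interpolation; it assumes a single peak interior to the sampled window:

```python
# FWHM of a sampled 1-D point-source profile via half-max crossings.
import numpy as np

def fwhm(x: np.ndarray, y: np.ndarray) -> float:
    half = y.max() / 2.0
    above = np.where(y >= half)[0]          # samples at or above half max
    i, j = above[0], above[-1]
    # linearly interpolate the left and right half-max crossings
    xl = np.interp(half, [y[i - 1], y[i]], [x[i - 1], x[i]])
    xr = np.interp(half, [y[j + 1], y[j]], [x[j + 1], x[j]])
    return xr - xl

x = np.linspace(-5, 5, 1001)
y = np.exp(-x**2 / 2.0)                     # unit-sigma Gaussian profile
print(round(fwhm(x, y), 3))                 # ≈ 2.355 = 2*sqrt(2*ln 2)*sigma
```

As the paper notes, a single number like this says little about imaging extended objects, which is why line-source and area-source definitions are considered separately.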

  5. A quantitative assessment of Arctic shipping in 2010–2014

    NASA Astrophysics Data System (ADS)

    Eguíluz, Victor M.; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M.

    2016-08-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011–2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far.

  6. Peri-infarct ischaemia assessed by cardiovascular MRI: comparison with quantitative perfusion single photon emission CT imaging

    PubMed Central

    Cochet, H; Bullier, E; Ragot, C; Gilbert, S H; Pucheu, Y; Laurent, F; Coste, P; Bordenave, L; Montaudon, M

    2014-01-01

    Objective: To develop a new method for the cardiac MR (CMR) quantification of peri-infarct ischaemia using fused perfusion and delayed-enhanced images and to evaluate this method using quantitative single photon emission CT (SPECT) imaging as a reference. Methods: 40 patients presenting with peri-infarct ischaemia on routine stress 99mTc-SPECT imaging were recruited. Within 8 days of the SPECT study, myocardial perfusion was evaluated using stress adenosine CMR. Using fused perfusion and delayed-enhanced images, peri-infarct ischaemia was quantified as the percentage of myocardium with a stress-induced perfusion defect that was adjacent to and larger than a scar. This parameter was compared with both the percent myocardium ischaemia (SD%) and the ischaemic total perfusion deficit (TPD). The diagnostic performance of CMR in the detection of significant coronary artery stenosis (≥70%) was also determined. Results: On SPECT imaging, in addition to peri-infarct ischaemia, reversible perfusion abnormalities were detected in a remote zone in seven patients. In the 33 patients presenting with only peri-infarct ischaemia, the agreement between CMR peri-infarct ischaemia and both SD% and ischaemic TPD was excellent [intraclass coefficient of correlation (ICC) = 0.969 and ICC = 0.877, respectively]. CMR-defined peri-infarct ischaemia for the detection of a significant coronary artery stenosis showed an area under the receiver operating characteristic curve of 0.856 (95% confidence interval, 0.680-0.939). The best cut-off value was 8.1%, yielding 72% sensitivity, 96% specificity, 60% negative predictive value and 97% positive predictive value. Conclusion: This proof-of-concept study shows that CMR imaging has potential as a test for quantification of peri-infarct ischaemia. Advances in knowledge: This study demonstrates the proof of concept of a commonly known intuitive idea, that is, evaluating the peri-infarct ischaemic burden by subtracting delayed

  7. Quantitative Assessment of Faculty Workloads. ASHE 1984 Annual Meeting Paper.

    ERIC Educational Resources Information Center

    Shull, H. Eugene

    A system of measuring faculty workloads consistently and objectively has been devised and successfully applied at Pennsylvania State University's Behrend College. Its value is greatest in assessing and balancing the diverse faculty assignments within interdisciplinary and heterogeneous administrative units. It permits a legitimate comparison of…

  8. INCORPORATION OF MOLECULAR ENDPOINTS INTO QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    The U.S. Environmental Protection Agency has recently released its Guidelines for Carcinogen Risk Assessment. These new guidelines benefit from the significant progress that has been made in understanding the cancer process and also from the more than 20 years experience that EPA...

  9. Quantitative Assessments of Sensitivity to Reinforcement Contingencies in Mental Retardation.

    ERIC Educational Resources Information Center

    Dube, William V.; McIlvane, William J.

    2002-01-01

    Sensitivity to reinforcement contingencies was examined in six individuals with mental retardation using a concurrent operants procedure in the context of a computer game. Results included individual differences in sensitivity and differential sensitivity to rate and magnitude variation. Results suggest that comprehensive assessments of potential…

  10. Developing a Quantitative Tool for Sustainability Assessment of HEIs

    ERIC Educational Resources Information Center

    Waheed, Bushra; Khan, Faisal I.; Veitch, Brian

    2011-01-01

    Purpose: Implementation of a sustainability paradigm demands new choices and innovative ways of thinking. The main objective of this paper is to provide a meaningful sustainability assessment tool for making informed decisions, applied here to higher education institutions (HEIs). Design/methodology/approach: The objective is achieved by…

  11. A quantitative assessment of Arctic shipping in 2010-2014.

    PubMed

    Eguíluz, Victor M; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M

    2016-01-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011-2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far. PMID:27477878

  12. Quantitative assessment of rabbit alveolar macrophage function by chemiluminescence

    SciTech Connect

    Brennan, P.C.; Kirchner, F.R.

    1985-08-01

    Rabbit alveolar macrophages (RAM) were cultured for 24 hr with concentrations ranging from 3 to 12 µg/ml of vanadium oxide (V₂O₅), a known cytotoxic agent, or with high-molecular-weight organic by-products from coal gasification processes. After culture, the cells were harvested and tested for functional capacity using three types of indicators: (1) luminol-amplified chemiluminescence (CL), which quantitatively detects photon emission due to respiratory burst activity, measured in a newly designed instrument with standardized reagents; (2) the reduction of nitro blue tetrazolium-saturated polyacrylamide beads, a semiquantitative measure of respiratory burst activity; and (3) phagocytic efficiency, defined as the percentage of cells incorporating immunoglobulin-coated polyacrylamide beads. Chemiluminescence declined linearly with increasing concentrations of V₂O₅ over the dose range tested. Dye reduction and phagocytic efficiency similarly decreased with increasing V₂O₅ concentration, but were less sensitive indicators of functional impairment than CL, as measured by the amount required to reduce the response to 50% of that of untreated cells. The effect of coal gasification condensates on RAM function varied, but in general these tests also indicated that the CL response was the most sensitive indicator.
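The sensitivity comparison above rests on the dose that halves the response of untreated cells. For a linearly declining dose-response like the CL data described, that dose can be read off a least-squares line; the numbers below are invented for illustration, not the study's measurements:

```python
# Dose at which a linearly declining response falls to 50% of baseline.
import numpy as np

def dose_for_half_response(doses, responses, baseline):
    slope, intercept = np.polyfit(doses, responses, 1)
    return (0.5 * baseline - intercept) / slope

doses = np.array([3.0, 6.0, 9.0, 12.0])      # µg/ml (illustrative)
resp  = np.array([80.0, 60.0, 40.0, 20.0])   # % of untreated CL (illustrative)
print(dose_for_half_response(doses, resp, 100.0))   # 7.5 µg/ml
```

A more sensitive indicator reaches this 50% dose at a lower concentration, which is the sense in which CL outperformed dye reduction and phagocytic efficiency.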

  13. A quantitative assessment of Arctic shipping in 2010–2014

    PubMed Central

    Eguíluz, Victor M.; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M.

    2016-01-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050, when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011–2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, this coverage reaches its minimum every September, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far. PMID:27477878

  14. Quantitative Assessment of Parametric Uncertainty in Northern Hemisphere PAH Concentrations.

    PubMed

    Thackray, Colin P; Friedman, Carey L; Zhang, Yanxu; Selin, Noelle E

    2015-08-01

    We quantitatively examine the relative importance of uncertainty in emissions and physicochemical properties (including reaction rate constants) to Northern Hemisphere (NH) and Arctic polycyclic aromatic hydrocarbon (PAH) concentrations, using a computationally efficient numerical uncertainty technique applied to the global-scale chemical transport model GEOS-Chem. Using polynomial chaos (PC) methods, we propagate uncertainties in physicochemical properties and emissions for the PAHs benzo[a]pyrene, pyrene and phenanthrene to simulated spatially resolved concentration uncertainties. We find that the leading contributors to parametric uncertainty in simulated concentrations are the black carbon-air partition coefficient and oxidation rate constant for benzo[a]pyrene, and the oxidation rate constants for phenanthrene and pyrene. NH geometric average concentrations are more sensitive to uncertainty in the atmospheric lifetime than to emissions rate. We use the PC expansions and measurement data to constrain parameter uncertainty distributions to observations. This narrows a priori parameter uncertainty distributions for phenanthrene and pyrene, and leads to higher values for OH oxidation rate constants and lower values for European PHE emission rates.
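    Polynomial chaos replaces brute-force Monte Carlo sampling with a spectral expansion of the model output in orthogonal polynomials of the uncertain inputs, from which the output mean and variance follow directly from the coefficients. A minimal one-dimensional sketch on a toy model (not GEOS-Chem), using probabilists' Hermite polynomials and Gauss-Hermite quadrature:

```python
import numpy as np
from math import factorial, sqrt, pi
from numpy.polynomial import hermite_e as H

def pce_mean_var(model, order=4, quad_pts=8):
    """Mean and variance of model(X), X ~ N(0,1), from a 1-D polynomial
    chaos expansion in probabilists' Hermite polynomials He_k."""
    x, w = H.hermegauss(quad_pts)   # Gauss-Hermite nodes/weights, weight e^(-x^2/2)
    w = w / sqrt(2 * pi)            # renormalize to the N(0,1) density
    fx = model(x)
    # Projection: c_k = E[f(X) He_k(X)] / k!   (He_k has squared norm k!)
    coeffs = [np.sum(w * fx * H.hermeval(x, [0] * k + [1])) / factorial(k)
              for k in range(order + 1)]
    mean = coeffs[0]
    var = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k > 0)
    return mean, var

# Toy "model": output depends quadratically on one uncertain parameter.
mean, var = pce_mean_var(lambda x: x**2 + 0.5 * x)
print(mean, var)   # analytically: mean = 1, variance = 2 + 0.25 = 2.25
```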

  15. Monte Carlo simulation of PET and SPECT imaging of {sup 90}Y

    SciTech Connect

    Takahashi, Akihiko; Sasaki, Masayuki; Himuro, Kazuhiko; Yamashita, Yasuo; Komiya, Isao; Baba, Shingo

    2015-04-15

    Purpose: Yttrium-90 ({sup 90}Y) is traditionally thought of as a pure beta emitter, and is used in targeted radionuclide therapy, with imaging performed using bremsstrahlung single-photon emission computed tomography (SPECT). However, because {sup 90}Y also emits positrons through internal pair production with a very small branching ratio, positron emission tomography (PET) imaging is also available. Because of the insufficient image quality of {sup 90}Y bremsstrahlung SPECT, PET imaging has been suggested as an alternative. In this paper, the authors present a Monte Carlo-based simulation–reconstruction framework for {sup 90}Y to comprehensively analyze the PET and SPECT imaging techniques and to quantitatively consider the disadvantages associated with them. Methods: Our PET and SPECT simulation modules were developed using Monte Carlo simulation of Electrons and Photons (MCEP), developed by Dr. S. Uehara. PET code (MCEP-PET) generates a sinogram, and reconstructs the tomography image using a time-of-flight ordered subset expectation maximization (TOF-OSEM) algorithm with attenuation compensation. To evaluate MCEP-PET, simulated results of {sup 18}F PET imaging were compared with the experimental results. The results confirmed that MCEP-PET can simulate the experimental results very well. The SPECT code (MCEP-SPECT) models the collimator and NaI detector system, and generates the projection images and projection data. To save computational time, the authors adopt the prerecorded {sup 90}Y bremsstrahlung photon data calculated by MCEP. The projection data are also reconstructed using the OSEM algorithm. The authors simulated PET and SPECT images of a water phantom containing six hot spheres filled with different concentrations of {sup 90}Y without background activity. The amount of activity was 163 MBq, with an acquisition time of 40 min. Results: The simulated {sup 90}Y-PET image accurately simulated the experimental results. PET image is visually

  16. Measuring astatine-211 distributions with SPECT.

    PubMed

    Turkington, T G; Zalutsky, M R; Jaszczak, R J; Garg, P K; Vaidyanathan, G; Coleman, R E

    1993-08-01

    We have investigated standard SPECT techniques (rotating gamma cameras, multi-hole collimators, and filtered backprojection reconstruction) for imaging astatine-211 distributions. Since 211At emits alpha particles, this nuclide has potential for use in radiotherapy. The capability of imaging this nuclide would allow in vivo evaluation of the distribution and stability of potential 211At-labelled radiotherapeutic agents. 211At decay yields x-rays in the 77-92 keV range in addition to 500-900 keV gamma rays. This study evaluates the feasibility of SPECT imaging using the x-ray emissions of 211At. We have evaluated several collimators, with the determination that the medium-energy collimators we used are suitable, with 7% penetration (uncollimated counts versus collimated counts). Several phantoms were imaged and attenuation coefficients were measured (narrow-beam μ = 0.182 cm⁻¹ for 77-80 keV x-rays in water). Reconstructed images demonstrate qualitative capabilities and a simple quantitative study demonstrates good correction for attenuation and scatter (approximately 10% error), at low count densities, at least for the phantom geometries used in this study.
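    The attenuation correction validated above rests on the narrow-beam law I/I0 = exp(-μd). A small sketch using the water coefficient reported in the abstract; the 5 cm depth is an arbitrary illustrative choice.

```python
from math import exp

MU_WATER = 0.182  # cm^-1: the study's narrow-beam value for 77-80 keV x-rays in water

def transmitted_fraction(depth_cm, mu=MU_WATER):
    """Narrow-beam attenuation law: I / I0 = exp(-mu * d)."""
    return exp(-mu * depth_cm)

# Photons from a source 5 cm deep in water: roughly 60% are attenuated away,
# which is why quantitative SPECT needs an attenuation correction.
print(round(transmitted_fraction(5.0), 3))  # → 0.403
```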

  17. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
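    The HU-based tissue composition described above amounts to binning voxel values into class windows and reporting class fractions. A minimal sketch; the cutoffs below are hypothetical placeholders, not the values used in the reviewed studies.

```python
import numpy as np

# Hypothetical HU windows for three tissue classes (placeholder cutoffs).
BINS = {"fat": (-200, -10), "connective": (-9, 40), "muscle": (41, 200)}

def composition(hu_values):
    """Fraction of voxels in each tissue class among all classified voxels."""
    hu = np.asarray(hu_values)
    counts = {name: int(((hu >= lo) & (hu <= hi)).sum())
              for name, (lo, hi) in BINS.items()}
    total = sum(counts.values())
    return {name: c / total for name, c in counts.items()}

# Six toy voxels: two in the fat window, one connective, three muscle.
comp = composition([-80, -50, 10, 55, 60, 70])
print(comp)  # fat 1/3, connective 1/6, muscle 1/2
```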

  18. Quantitative statistical assessment of conditional models for synthetic aperture radar.

    PubMed

    DeVore, Michael D; O'Sullivan, Joseph A

    2004-02-01

    Many applications of object recognition in the presence of pose uncertainty rely on statistical models-conditioned on pose-for observations. The image statistics of three-dimensional (3-D) objects are often assumed to belong to a family of distributions with unknown model parameters that vary with one or more continuous-valued pose parameters. Many methods for statistical model assessment, for example the tests of Kolmogorov-Smirnov and K. Pearson, require that all model parameters be fully specified or that sample sizes be large. Assessing pose-dependent models from a finite number of observations over a variety of poses can violate these requirements. However, a large number of small samples, corresponding to unique combinations of object, pose, and pixel location, are often available. We develop methods for model testing which assume a large number of small samples and apply them to the comparison of three models for synthetic aperture radar images of 3-D objects with varying pose. Each model is directly related to the Gaussian distribution and is assessed both in terms of goodness-of-fit and underlying model assumptions, such as independence, known mean, and homoscedasticity. Test results are presented in terms of the functional relationship between a given significance level and the percentage of samples that would fail a test at that level. PMID:15376934
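    The closing idea, the fraction of small samples failing a test as a function of significance level, can be sketched with a hand-rolled one-sample Kolmogorov-Smirnov statistic and its asymptotic critical value. This is not the paper's method, and the asymptotic critical value is only a rough (conservative) approximation at these sample sizes.

```python
import numpy as np
from math import erf, sqrt, log

def ks_stat(sample):
    """One-sample Kolmogorov-Smirnov statistic against the N(0,1) CDF."""
    x = np.sort(sample)
    n = len(x)
    cdf = np.array([0.5 * (1 + erf(v / sqrt(2))) for v in x])
    d_plus = np.max(np.arange(1, n + 1) / n - cdf)
    d_minus = np.max(cdf - np.arange(0, n) / n)
    return max(d_plus, d_minus)

def fraction_failing(samples, alpha):
    """Fraction of samples rejected at level alpha, using the asymptotic
    critical value sqrt(ln(2/alpha) / (2n)) -- rough for small n."""
    n = samples.shape[1]
    crit = sqrt(log(2 / alpha) / (2 * n))
    return float(np.mean([ks_stat(s) > crit for s in samples]))

# Many small samples (n = 10 each) actually drawn from the null model.
rng = np.random.default_rng(0)
samples = rng.standard_normal((2000, 10))
for alpha in (0.01, 0.05, 0.10):
    print(alpha, fraction_failing(samples, alpha))
```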

  19. Myocardial perfusion imaging with technetium-99m sestamibi SPECT in the evaluation of coronary artery disease

    SciTech Connect

    Maddahi, J.; Kiat, H.; Van Train, K.F.; Prigent, F.; Friedman, J.; Garcia, E.V.; Alazraki, N.; DePuey, E.G.; Nichols, K.; Berman, D.S.

    1990-10-16

    Technetium-99m (Tc-99m) sestamibi is a new myocardial perfusion imaging agent that offers significant advantages over thallium-201 (Tl-201) for myocardial perfusion imaging. The results of the current clinical trials using acquisition and processing parameters similar to those for Tl-201 and a separate (2-day) injection protocol suggest that Tc-99m sestamibi and Tl-201 single photon emission computed tomography (SPECT) provide similar information with respect to detection of myocardial perfusion defects, assessment of the pattern of defect reversibility, overall detection of coronary artery disease (CAD) and detection of disease in individual coronary arteries. Tc-99m sestamibi SPECT appears to be superior to Tc-99m sestamibi planar imaging because the former provides a higher defect contrast and is more accurate for detection of disease in individual coronary arteries. Research is currently under way addressing optimization of acquisition and processing of Tc-99m sestamibi studies and development of quantitative algorithms for detection and localization of CAD and sizing of transmural and nontransmural myocardial perfusion defects. It is expected that with the implementation of the final results of these new developments, further significant improvement in image quality will be attained, which in turn will further increase the confidence in image interpretation. Development of algorithms for analysis of end-diastolic myocardial images may allow better evaluation of small and nontransmural myocardial defects. Furthermore, gated studies may provide valuable information with respect to regional myocardial wall motion and wall thickening. With the implementation of algorithms for attenuation and scatter correction, the overall specificity of Tc-99m sestamibi SPECT should improve significantly. 32 references.

  20. A quantitative assessment of results with the Angelchik prosthesis.

    PubMed Central

    Wyllie, J. H.; Edwards, D. A.

    1985-01-01

    The Angelchik antireflux prosthesis was assessed in 15 unpromising patients, 12 of whom had peptic strictures of the oesophagus. Radiological techniques were used to show the effect of the device on gastro-oesophageal reflux, and on the bore and length of strictures. Twelve months later (range 6-24) most patients were well satisfied with the operation, and all considered it had been worthwhile; there was radiological evidence of reduction in reflux and remission of strictures. The device never surrounded the oesophageal sphincter; in all but 1 case it encircled a tube of stomach. PMID:4037629

  1. Aliasing as noise - A quantitative and qualitative assessment

    NASA Technical Reports Server (NTRS)

    Park, Stephen K.; Hazra, Rajeeb

    1993-01-01

    We present a model-based argument that, for the purposes of system design and digital image processing, aliasing should be treated as signal-dependent additive noise. By using a computational simulation based on this model, we process (high resolution images of) natural scenes in a way which enables the 'aliased component' of the reconstructed image to be isolated unambiguously. We demonstrate that our model-based argument leads naturally to system design metrics which quantify the extent of aliasing. And, by illustrating several aliased component images, we provide a qualitative assessment of aliasing as noise.
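    The signal-dependent nature of aliasing is easiest to see in one dimension: a tone above the Nyquist frequency folds back into the baseband, indistinguishable from in-band content. A minimal sketch of this folding (not the authors' image-based simulation):

```python
import numpy as np

# A 70 Hz tone sampled at fs = 100 Hz (Nyquist 50 Hz) folds to 100 - 70 = 30 Hz:
# the samples are indistinguishable from a 30 Hz tone, so the aliased energy
# acts as additive "noise" that depends entirely on the scene content.
fs, f_true, n = 100, 70, 1000
t = np.arange(n) / fs
samples = np.sin(2 * np.pi * f_true * t)

spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(n, d=1 / fs)
f_apparent = freqs[np.argmax(spectrum)]
print(f_apparent)  # → 30.0
```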

  2. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  3. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  4. Quantitative Assessment of a Field-Based Course on Integrative Geology, Ecology and Cultural History

    ERIC Educational Resources Information Center

    Sheppard, Paul R.; Donaldson, Brad A.; Huckleberry, Gary

    2010-01-01

    A field-based course at the University of Arizona called Sense of Place (SOP) covers the geology, ecology and cultural history of the Tucson area. SOP was quantitatively assessed for pedagogical effectiveness. Students of the Spring 2008 course were given pre- and post-course word association surveys in order to assess awareness and comprehension…

  5. New Trends in Quantitative Assessment of the Corneal Barrier Function

    PubMed Central

    Guimerà, Anton; Illa, Xavi; Traver, Estefania; Herrero, Carmen; Maldonado, Miguel J.; Villa, Rosa

    2014-01-01

    The cornea is a very particular tissue due to its transparency and its barrier function, as it has to resist the daily insults of the external environment. In addition, maintenance of this barrier function is of crucial importance to ensure correct corneal homeostasis. Here, the corneal epithelial permeability has been assessed in vivo by means of non-invasive tetrapolar impedance measurements, taking advantage of the strong influence of ion fluxes on the passive electrical properties of living tissues. This has been possible by using a flexible sensor based on SU-8 photoresist. In this work, a further analysis focused on the validation of the presented sensor is performed by monitoring the healing process of corneas that were previously wounded. The obtained impedance measurements have been compared with the damaged area observed in corneal fluorescein staining images. The successful results confirm the feasibility of this novel method, as it represents a more sensitive in vivo, non-invasive test for assessing subtle alterations of the epithelial permeability. It could thus be used as an excellent complement to fluorescein staining image evaluation. PMID:24841249

  6. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed as detailed and as accurately as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as a 3D curve along front, back and the thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness that will fit most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).
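    The paper's 3D elliptic Fourier analysis has a simpler 2-D relative, complex Fourier descriptors of a closed contour, which conveys the same idea of encoding shape in a few harmonic coefficients. A sketch under that substitution (not the Kuhl-Giardina elliptic formulation the paper extends):

```python
import numpy as np

def fourier_descriptors(x, y, n_harmonics=8):
    """Complex Fourier descriptors of a closed 2-D contour:
    the FFT of z = x + iy, normalized by the number of points."""
    z = np.asarray(x) + 1j * np.asarray(y)
    return (np.fft.fft(z) / len(z))[:n_harmonics + 1]

# A circle of radius 2 centred at (3, 1): harmonic 0 holds the centroid,
# harmonic 1 the radius, and all higher harmonics vanish.
theta = np.linspace(0, 2 * np.pi, 256, endpoint=False)
c = fourier_descriptors(3 + 2 * np.cos(theta), 1 + 2 * np.sin(theta))
print(np.round(np.abs(c[:3]), 4))
```

Shape comparison or regression (as in the paper's allometry analysis) then operates on these coefficient vectors rather than raw landmark coordinates.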

  7. SPECT assay of radiolabeled monoclonal antibodies. Third yearly progress report, September 1991--February 1992

    SciTech Connect

    Jaszczak, R.J.

    1992-02-01

    The accurate determination of the biodistribution of radiolabeled monoclonal antibodies (MoAbs) is important for calculation of dosimetry and evaluation of pharmacokinetic variables such as antibody dose and route of administration. The hypothesis of this application is that the biodistribution of radiolabeled monoclonal antibodies (MoAbs) can be quantitatively determined using single photon emission computed tomography (SPECT). The major thrusts during the third year include the continued development and evaluation of improved 3D SPECT acquisition and reconstruction approaches to improve quantitative imaging of radiolabeled monoclonal antibodies (MoAbs), and the implementation and evaluation of algorithms to register serial SPECT image data sets, or to register 3D SPECT images with 3D image data sets acquired from positron emission tomography (PET) and magnetic resonance images (MRI). The research has involved the investigation of statistical models and iterative reconstruction algorithms that accurately account for the physical characteristics of the SPECT acquisition system. It is our belief that SPECT quantification can be improved by accurately modeling the physical processes such as attenuation, scatter, geometric collimator response, and other factors that affect the measured projection data.

  8. Quantitative cancer risk assessment for dioxins using an occupational cohort.

    PubMed Central

    Becher, H; Steindorf, K; Flesch-Janys, D

    1998-01-01

    We consider a cohort of 1189 male German factory workers (production period 1952-1984) who produced phenoxy herbicides and were exposed to dioxins. Follow-up until the end of 1992 yielded a significantly increased standardized mortality ratio (SMR) for total cancer (SMR 141; 95% confidence interval 117-168). 2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD) concentrations up to 2252 ng/kg body fat were measured in 275 cohort members. Other higher chlorinated dioxins and furans also occurred in high concentrations. For quantitative analysis, the integrated TCDD concentration over time was used as an exposure variable, which was calculated using results from half-life estimation for TCDD and workplace history data. The other congeners were expressed as toxic equivalency (TEQ) and compared to TCDD using international toxic equivalency factors. Poisson and Cox regressions were used to investigate dose-response relationships. Various covariables (e.g., exposure to beta-hexachlorocyclohexane, employment characteristics) were considered. In all analyses, TCDD and TEQ exposures were related to total cancer mortality. The power model yielded a relative risk (RR) function RR(x) = (1 + 0.17x)^0.326 for TCDD (in µg/kg blood fat × years)--only a slightly better fit than a linear RR function--and RR(x) = (1 + 0.023x)^0.795 for TEQ. Investigations on latency did not show strong effects. Different methods were applied to investigate the robustness of the results and yielded almost identical results. The results were used for unit risk estimation. Taking into account different sources of variation, an interval of 10⁻³ to 10⁻² for the additional lifetime cancer risk under an intake of 1 pg TCDD/kg body weight/day was estimated from the dose-response models considered. Uncertainties regarding the dose-response function remain. These data did not indicate the existence of a threshold value; however, such a value cannot be excluded with any certainty. PMID:9599714
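    The fitted power model is easy to evaluate directly; the sketch below reproduces the reported RR function for TCDD, with parameters taken from the abstract and the evaluation points chosen arbitrarily for illustration.

```python
def rr_tcdd(x, theta=0.17, beta=0.326):
    """Power-model relative risk RR(x) = (1 + theta*x)^beta fitted for TCDD,
    with x the integrated concentration in (ug/kg blood fat) x years."""
    return (1.0 + theta * x) ** beta

# Relative risk of total cancer mortality at a few cumulative exposures.
for x in (1, 10, 100):
    print(x, round(rr_tcdd(x), 2))  # 1.05, 1.38, 2.57
```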

  9. SPECT assay of radiolabeled monoclonal antibodies. Comprehensive progress report, September 1989--February 1992

    SciTech Connect

    Jaszczak, R.J.

    1992-02-01

    The long-term goal of this research project is to develop methods to improve the utility of single photon emission computed tomography (SPECT) to quantify the biodistribution of monoclonal antibodies (MoAbs) labeled with clinically relevant radionuclides ({sup 123}I, {sup 131}I, and {sup 111}In) and with another radionuclide, {sup 211}At, recently used in therapy. We describe here our progress in developing quantitative SPECT methodology for {sup 111}In and {sup 123}I. We have focused our recent research thrusts on the following aspects of SPECT: (1) The development of improved SPECT hardware, such as improved acquisition geometries. (2) The development of better reconstruction methods that provide accurate compensation for the physical factors that affect SPECT quantification. (3) The application of carefully designed simulations and experiments to validate our hardware and software approaches.

  10. Quantitative Assessment of Workload and Stressors in Clinical Radiation Oncology

    SciTech Connect

    Mazur, Lukasz M.; Mosaly, Prithima R.; Jackson, Marianne; Chang, Sha X.; Burkhardt, Katharin Deschesne; Adams, Robert D.; Jones, Ellen L.; Hoyle, Lesley; Xu, Jing; Rockwell, John; Marks, Lawrence B.

    2012-08-01

    Purpose: Workload level and sources of stressors have been implicated as sources of error in multiple settings. We assessed workload levels and sources of stressors among radiation oncology professionals. Furthermore, we explored the potential association between workload and the frequency of radiotherapy incidents reported by the World Health Organization (WHO). Methods and Materials: Data collection was aimed at various tasks performed by 21 study participants from different radiation oncology professional subgroups (simulation therapists, radiation therapists, physicists, dosimetrists, and physicians). Workload was assessed using the National Aeronautics and Space Administration Task Load Index (NASA TLX). Sources of stressors were quantified using observational methods and segregated using a standard taxonomy. Comparisons between professional subgroups and tasks were made using analysis of variance (ANOVA), multivariate ANOVA, and Duncan's test. An association between workload levels (NASA TLX) and the frequency of radiotherapy incidents (WHO incidents) was explored (Pearson correlation test). Results: A total of 173 workload assessments were obtained. Overall, simulation therapists had relatively low workloads (NASA TLX range, 30-36), and physicists had relatively high workloads (NASA TLX range, 51-63). NASA TLX scores for physicians, radiation therapists, and dosimetrists ranged from 40-52. There was marked intertask/professional subgroup variation (P<.0001). Mental demand (P<.001), physical demand (P=.001), and effort (P=.006) significantly differed among professional subgroups. Typically, there were 3-5 stressors per cycle of analyzed tasks with the following distribution: interruptions (41.4%), time factors (17%), technical factors (13.6%), teamwork issues (11.6%), patient factors (9.0%), and environmental factors (7.4%). A positive association between workload and the frequency of radiotherapy incidents reported by the WHO was found (r = 0.87, P value = .045).
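    The NASA TLX scores reported above combine six subscale ratings with weights derived from 15 pairwise comparisons. A minimal sketch of the standard weighted scoring scheme; the ratings and weights below are invented, not the study's data.

```python
def nasa_tlx(ratings, weights):
    """Overall NASA TLX workload: six subscale ratings (0-100), each weighted
    by how often it was chosen in the 15 pairwise comparisons."""
    assert sum(weights.values()) == 15, "pairwise weights must sum to 15"
    return sum(ratings[s] * weights[s] for s in ratings) / 15.0

# Hypothetical ratings and pairwise weights for a single assessed task.
ratings = {"mental": 80, "physical": 20, "temporal": 60,
           "performance": 40, "effort": 70, "frustration": 50}
weights = {"mental": 5, "physical": 0, "temporal": 3,
           "performance": 2, "effort": 4, "frustration": 1}
print(nasa_tlx(ratings, weights))  # → 66.0
```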

  11. [The concept of amnesia and quantitative assessment of amnesic disorders].

    PubMed

    Metzler, P; Rudolph, M; Voshage, J; Nickel, B

    1991-06-01

    This article first presents a short historical overview of the different viewpoints concerning psychiatric approaches to defining the concept "amnesia" (Ribot, Korsakow, K. Schneider, Bleuler, Bonhoeffer et al.). A generally accepted result is the differentiation between retrograde and anterograde amnesia. Research work of the last two decades has focussed on the experimental investigation of anterograde amnesia, the so-called amnesic syndrome. In this context four main factors responsible for memory performance are distinguished: encoding, retrieval, forgetting and interference. One of the main results of neuropsychological research in amnesia consists in having discovered a set of symptoms or features common to most if not all forms of amnesia. These features appear regardless of etiology and locus of lesion. This set of features is described in detail in the paper. On the basis of these amnesic features a clinical test was developed, the Berliner Amnesie Test (BAT). This standardized test can be used for the assessment of mild to severe memory disorders.

  12. Compressed natural gas bus safety: a quantitative risk assessment.

    PubMed

    Chamberlain, Samuel; Modarres, Mohammad

    2005-04-01

    This study assesses the fire safety risks associated with compressed natural gas (CNG) vehicle systems, comprising primarily a typical school bus and supporting fuel infrastructure. The study determines the sensitivity of the results to variations in component failure rates and consequences of fire events. The components and subsystems that contribute most to fire safety risk are determined. Finally, the results are compared to fire risks of the present generation of diesel-fueled school buses. Direct computation of the safety risks associated with diesel-powered vehicles is possible because these are mature technologies for which historical performance data are available. Because of limited experience, fatal accident data for CNG bus fleets are minimal. Therefore, this study uses the probabilistic risk assessment (PRA) approach to model and predict the fire safety risk of CNG buses. Generic failure data, engineering judgments, and assumptions are used in this study. This study predicts the mean fire fatality risk for typical CNG buses as approximately 0.23 fatalities per 100-million miles for all people involved, including bus passengers. The study estimates mean values of 0.16 fatalities per 100-million miles for bus passengers only. Based on historical data, diesel school bus mean fire fatality risk is 0.091 and 0.0007 per 100-million miles for all people and bus passengers, respectively. One can therefore conclude that CNG buses carry roughly 2.5 times the fire fatality risk of diesel buses, with bus passengers at over two orders of magnitude greater risk. The study estimates a mean fire risk frequency of 2.2 × 10⁻⁵ fatalities/bus per year. The 5% and 95% uncertainty bounds are 9.1 × 10⁻⁶ and 4.0 × 10⁻⁵, respectively. The risk result was found to be affected most by failure rates of pressure relief valves, CNG cylinders, and fuel piping.
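    The headline comparisons follow directly from the reported rates; a quick arithmetic check of the "2.5 times" and "two orders of magnitude" claims:

```python
# Mean fire-fatality rates per 100-million miles, as reported in the abstract.
cng_all, cng_pax = 0.23, 0.16
diesel_all, diesel_pax = 0.091, 0.0007

ratio_all = cng_all / diesel_all    # ≈ 2.5: "2.5 times that of diesel"
ratio_pax = cng_pax / diesel_pax    # ≈ 229: "over two orders of magnitude"
print(round(ratio_all, 1), round(ratio_pax))  # → 2.5 229
```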

  13. SPECT imaging with resolution recovery

    SciTech Connect

    Bronnikov, A. V.

    2011-07-01

    Single-photon emission computed tomography (SPECT) is a method of choice for imaging spatial distributions of radioisotopes. Many applications of this method are found in nuclear industry, medicine, and biomedical research. We study mathematical modeling of a micro-SPECT system by using a point-spread function (PSF) and implement an OSEM-based iterative algorithm for image reconstruction with resolution recovery. Unlike other known implementations of the OSEM algorithm, we apply an efficient computation scheme based on a useful approximation of the PSF, which ensures relatively fast computations. The proposed approach can be applied to data acquired with any type of collimator, including parallel-beam, fan-beam, cone-beam, and pinhole collimators. Experimental results obtained with a micro-SPECT system demonstrate high efficiency of resolution recovery. (authors)
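    With a single subset, the OSEM iteration mentioned above reduces to the classic ML-EM multiplicative update. A toy sketch on a 3x2 system matrix with noiseless data (not the authors' PSF-modelling implementation; the matrix and activities are invented):

```python
import numpy as np

def mlem(A, y, n_iter=500):
    """ML-EM reconstruction (the OSEM update with a single subset):
    x <- x * A^T(y / Ax) / (A^T 1)."""
    x = np.ones(A.shape[1])                  # uniform initial estimate
    sens = A.T @ np.ones(A.shape[0])         # sensitivity image A^T 1
    for _ in range(n_iter):
        x = x * (A.T @ (y / (A @ x))) / sens
    return x

# Toy system: 3 detector bins viewing 2 voxels, noiseless projections.
A = np.array([[0.8, 0.2],
              [0.5, 0.5],
              [0.1, 0.9]])
x_true = np.array([4.0, 2.0])
x_hat = mlem(A, A @ x_true)
print(np.round(x_hat, 3))   # converges back to the true activities
```

Resolution recovery amounts to building the collimator/detector PSF into `A`, so the same update deconvolves the blur as it reconstructs.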

  14. Quantitative assessment of the retinal microvasculature using optical coherence tomography angiography

    NASA Astrophysics Data System (ADS)

    Chu, Zhongdi; Lin, Jason; Gao, Chen; Xin, Chen; Zhang, Qinqin; Chen, Chieh-Li; Roisman, Luis; Gregori, Giovanni; Rosenfeld, Philip J.; Wang, Ruikang K.

    2016-06-01

    Optical coherence tomography angiography (OCTA) is clinically useful for the qualitative assessment of the macular microvasculature. However, there is a need for comprehensive quantitative tools to help objectively analyze the OCT angiograms. Few studies have reported the use of a single quantitative index to describe vessel density in OCT angiograms. In this study, we introduce a five-index quantitative analysis of OCT angiograms in an attempt to detect and assess vascular abnormalities from multiple perspectives. The indices include vessel area density, vessel skeleton density, vessel diameter index, vessel perimeter index, and vessel complexity index. We show the usefulness of the proposed indices with five illustrative cases. Repeatability is tested on both a healthy case and a stable diseased case, giving interclass coefficients smaller than 0.031. The results demonstrate that our proposed quantitative analysis may be useful as a complement to conventional OCTA for the diagnosis of disease and monitoring of treatment.
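    Two of the five indices are straightforward to compute from a binary vessel mask. The sketch below uses a crude 4-neighbour transition count as a stand-in for the perimeter-based index; the skeleton- and complexity-based indices need a skeletonization step omitted here, and the toy mask is invented.

```python
import numpy as np

def vessel_area_density(mask):
    """Vessel area density: fraction of pixels flagged as perfused."""
    return mask.mean()

def vessel_perimeter_index(mask):
    """Rough perimeter index: 4-neighbour transition count per unit area
    (wraps at the image border, so it is only a crude stand-in)."""
    edges = (mask != np.roll(mask, 1, axis=0)) | (mask != np.roll(mask, 1, axis=1))
    return edges.mean()

# Toy binary angiogram: one horizontal vessel, 2 pixels thick.
mask = np.zeros((8, 8), dtype=bool)
mask[3:5, :] = True
print(vessel_area_density(mask), vessel_perimeter_index(mask))
```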

  15. Methodological issues in the quantitative assessment of quality of life.

    PubMed

    Panagiotakos, Demosthenes B; Yfantopoulos, John N

    2011-10-01

    The term quality of life can be identified in Aristotle's classical writings of 330 BC. In his Nicomachean Ethics he recognises the multiple relationships between happiness, well-being, "eudemonia" and quality of life. Historically the concept of quality of life has undergone various interpretations. It involves personal experience, perceptions and beliefs, attitudes concerning philosophical, cultural, spiritual, psychological, political, and financial aspects of everyday living. Quality of life has been extensively used both as an outcome and an explanatory factor in relation to human health, in various clinical trials, epidemiologic studies and health interview surveys. Because of the variations in the definition of quality of life, both in theory and in practice, there are also a wide range of procedures that are used to assess quality of life. In this paper several methodological issues regarding the tools used to evaluate quality of life are discussed. In summary, the use of components consisting of a large number of classes, the use of specific weights for each scale component, and the low-to-moderate level of inter-correlation between the components are evident from simulated and empirical studies.

  16. Validation of a quantitative phosphorus loss assessment tool.

    PubMed

    White, Michael J; Storm, Daniel E; Smolen, Michael D; Busteed, Philip R; Zhang, Hailin; Fox, Garey A

    2014-01-01

    Pasture Phosphorus Management Plus (PPM Plus) is a tool that allows nutrient management and conservation planners to evaluate phosphorus (P) loss from agricultural fields. This tool uses a modified version of the widely used Soil and Water Assessment Tool model with a vastly simplified interface. The development of PPM Plus has been fully described in previous publications; in this article we evaluate the accuracy of PPM Plus using 286 field-years of runoff, sediment, and P validation data from runoff studies at various locations in Oklahoma, Texas, Arkansas, and Georgia. Land uses include pasture, small grains, and row crops with rainfall ranging from 630 to 1390 mm/yr, with and without animal manure application. PPM Plus explained 68% of the variability in total P loss, 56% of runoff, and 73% of the variability of sediment yield. An empirical model developed from these data using soil test P, total applied P, slope, and precipitation only accounted for 15% of the variability in total P loss, which implies that a process-based model is required to account for the diversity present in these data. PPM Plus is an easy-to-use conservation planning tool for P loss prediction, which, with modification, could be applicable at the regional and national scales. PMID:25602555
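    "Variability explained" in validation studies of this kind is typically a 1 − SS_residual/SS_total statistic. A sketch with invented observed/predicted pairs (not the study's 286 field-years of data):

```python
import numpy as np

def variability_explained(observed, predicted):
    """'Percent of variability explained': 1 - SS_residual / SS_total
    (the Nash-Sutcliffe efficiency commonly used in hydrologic validation)."""
    observed, predicted = np.asarray(observed), np.asarray(predicted)
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical field-year P-loss data (kg/ha): observed vs. model predictions.
obs = [1.2, 0.4, 2.5, 0.9, 3.1]
pred = [1.0, 0.6, 2.2, 1.1, 2.7]
print(round(variability_explained(obs, pred), 2))  # → 0.93
```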

  17. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To display the framework progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). A gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  19. A Quantitative Measure of Handwriting Dysfluency for Assessing Tardive Dyskinesia

    PubMed Central

    Caligiuri, Michael P.; Teulings, Hans-Leo; Dean, Charles E.; Lohr, James B.

    2015-01-01

    Tardive dyskinesia (TD) is a movement disorder commonly associated with chronic exposure to antidopaminergic medications, which may in some cases be disfiguring and socially disabling. The consensus from a growing body of research on the incidence and prevalence of TD in the modern era of antipsychotics indicates that this disorder has not disappeared and continues to challenge the effective management of psychotic symptoms in patients with schizophrenia. A fundamental component of an effective strategy for managing TD is its reliable and accurate assessment. In the present study, we examined the clinical utility of a brief handwriting dysfluency measure for quantifying TD. Digitized samples of handwritten circles and loops were obtained from 62 psychosis patients with or without TD and from 50 healthy subjects. Two measures of dysfluent pen movements were extracted from each vertical pen stroke, including normalized jerk and the number of acceleration peaks. TD patients exhibited significantly higher dysfluency scores than non-TD patients and controls. Severity of handwriting movement dysfluency was correlated with AIMS severity ratings for some tasks. The procedure yielded high degrees of test-retest reliability. These results suggest that measures of handwriting movement dysfluency may be particularly useful for objectively evaluating the efficacy of pharmacotherapeutic strategies for treating TD. PMID:25679121

  20. Quantitative assessments of ecological impact/recovery in freshwater systems

    SciTech Connect

    Birge, W.J.; Keogh, D.P.; Zuiderveen, J.A.; Robison, W.A.

    1994-12-31

    Long-term studies were undertaken to evaluate the fidelity of multi-metric scoring systems and other means of quantifying the effects of chemical stresses on aquatic ecosystems. The integrity of macroinvertebrate communities was assessed using the Rapid Bioassessment Protocol III; Brillouin and Shannon-Weaver diversity indices; judgment based on the traditional parameters of species richness, abundance, and trophic group assemblages; and cluster analysis. In addition, chemical and toxicological monitoring data and periphyton studies were used in these evaluations. Surveys were performed at 8 or more stations selected for comparable conditions, including upstream reference and downstream recovery areas. In two streams, ecological impact varied from severe to extreme near point-source outfalls and decreased progressively with downstream distance. Station-to-station scoring with Protocol III and diversity indices correlated well with independent chemical and toxicological evaluations. However, in metal-stressed streams affected by slight to moderate impact, or which were in early recovery, Protocol III scoring and other family-level metrics did not consistently reflect losses in species richness and mean abundance of up to 32% and 75%, respectively. Observations on deformities (e.g., eyespots, gills) and selected subfamily and species-level metrics, including ratios of metal-sensitive to metal-tolerant chironomids, gave greater accuracy in characterizing marginal to moderate perturbations. Observations on fugitive and opportunistic species were also useful.

  1. Qualitative and Quantitative Assessment of Four Marketed Formulations of Brahmi

    PubMed Central

    Saini, Neeti; Mathur, Rajani; Agrawal, S. S.

    2012-01-01

    This study was conducted with the aim of comparing two batches each of four popular commercial formulations of Bacopa monnieri (Brahmi) and reporting inter-batch variations, if any. The formulations were procured from the local market and analyzed for label specifications, uniformity of capsule weight, and identity, purity and strength parameters (total ash content, acid-insoluble ash content, water-soluble extractive, alcohol-soluble extractive, loss on drying). Bacoside A, one of the pharmacologically active saponins present in B. monnieri, was quantified in all the formulations using a UV spectrophotometer. In addition, each formulation was assessed and compared for variation in biological activity using an in vitro test for hemolytic activity with human erythrocytes. The results of the study show that there is wide variation in the quality and content of herbal drugs marketed by different manufacturers. More importantly, this study demonstrates that there exists the bigger challenge of batch-to-batch variation in the quality and content of herbal formulations from the same manufacturer. This challenge of providing standardized formulations is faced not by any one manufacturing house but by all, and may be attributed firstly to a lack of stringent regulations and secondly to high variability in raw material quality. PMID:23204618

  2. Qualitative and quantitative assessment of four marketed formulations of brahmi.

    PubMed

    Saini, Neeti; Mathur, Rajani; Agrawal, S S

    2012-01-01

    This study was conducted with the aim of comparing two batches each of four popular commercial formulations of Bacopa monnieri (Brahmi) and reporting inter-batch variations, if any. The formulations were procured from the local market and analyzed for label specifications, uniformity of capsule weight, and identity, purity and strength parameters (total ash content, acid-insoluble ash content, water-soluble extractive, alcohol-soluble extractive, loss on drying). Bacoside A, one of the pharmacologically active saponins present in B. monnieri, was quantified in all the formulations using a UV spectrophotometer. In addition, each formulation was assessed and compared for variation in biological activity using an in vitro test for hemolytic activity with human erythrocytes. The results of the study show that there is wide variation in the quality and content of herbal drugs marketed by different manufacturers. More importantly, this study demonstrates that there exists the bigger challenge of batch-to-batch variation in the quality and content of herbal formulations from the same manufacturer. This challenge of providing standardized formulations is faced not by any one manufacturing house but by all, and may be attributed firstly to a lack of stringent regulations and secondly to high variability in raw material quality.
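    UV-spectrophotometric quantification of a marker compound such as bacoside A conventionally rests on a linear calibration curve in the Beer-Lambert region. A generic sketch of that workflow, not tied to the study's actual standards or wavelengths:

```python
def fit_line(concs, absorbances):
    """Least-squares slope/intercept for a calibration line A = m*c + b,
    built from standard solutions of known concentration."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(absorbances) / n
    m = sum((x - mx) * (y - my) for x, y in zip(concs, absorbances)) / \
        sum((x - mx) ** 2 for x in concs)
    return m, my - m * mx

def concentration(absorbance, slope, intercept):
    """Invert the calibration line to read a sample concentration
    off its measured absorbance."""
    return (absorbance - intercept) / slope
```

    Each formulation's extract would then be read against the same curve, making batch-to-batch content differences directly comparable.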

  3. Quantitative use of photography in orthognathic outcome assessment.

    PubMed

    Edler, R J; Wertheim, D; Greenhill, D; Jaisinghani, A

    2011-03-01

    This study reports an independent audit of two aspects of orthognathic surgery, namely control of inter-alar width and mandibular outline asymmetry. Measurements were taken from standardized photographs of a consecutive series of 27 patients, using an on-screen digitizing program (IPTool). All patients had undergone bimaxillary osteotomies involving maxillary impaction and/or advancement, by one surgeon, using a cinch suture for nasal width control. Nine to twelve months after surgery, inter-alar width had increased by a mean of just 0.08 cm (SD 0.3 cm). Four patients showed an increase of just over 2 mm, whilst six showed a small reduction. Based on ratios of size (area) and shape (compactness) of the right and left mandibular segments, there was a small overall improvement in mandibular symmetry (0.019 and 0.005, respectively). Whilst in most of the patients the need for surgery was primarily the correction of antero-posterior and vertical discrepancies, five patients with demonstrable asymmetry showed a clear improvement. In three patients whose asymmetry scores were very mild pre-treatment, there was a small measured increase in asymmetry, but not to a degree that would be clinically noticeable. At a time when 3D imaging is still unavailable to many clinicians, the results of this study suggest that appropriate measurements taken from carefully standardized conventional photographs can provide a valid and objective means of assessing treatment outcome.
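    Compactness as a shape measure is conventionally defined as 4π·area/perimeter² (1.0 for a circle, smaller for irregular outlines). Assuming the digitized mandibular outlines are available as vertex lists, the size and shape ratios could be computed along these lines (an illustrative sketch, not the IPTool implementation):

```python
import math

def polygon_area(pts):
    """Shoelace area of a closed polygon given as (x, y) vertices."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

def compactness(pts):
    """4*pi*area / perimeter**2: 1.0 for a circle, lower for irregular shapes."""
    n = len(pts)
    perim = sum(math.dist(pts[i], pts[(i + 1) % n]) for i in range(n))
    return 4 * math.pi * polygon_area(pts) / perim ** 2

def symmetry_ratios(left_pts, right_pts):
    """Size (area) and shape (compactness) ratios between mirrored
    mandibular segments; values near 1.0 indicate symmetry."""
    return (polygon_area(left_pts) / polygon_area(right_pts),
            compactness(left_pts) / compactness(right_pts))
```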

  4. A quantitative measure of handwriting dysfluency for assessing tardive dyskinesia.

    PubMed

    Caligiuri, Michael P; Teulings, Hans-Leo; Dean, Charles E; Lohr, James B

    2015-04-01

    Tardive dyskinesia (TD) is a movement disorder commonly associated with chronic exposure to antidopaminergic medications, which may in some cases be disfiguring and socially disabling. The consensus from a growing body of research on the incidence and prevalence of TD in the modern era of antipsychotics indicates that this disorder has not disappeared and continues to challenge the effective management of psychotic symptoms in patients with schizophrenia. A fundamental component of an effective strategy for managing TD is its reliable and accurate assessment. In the present study, we examined the clinical utility of a brief handwriting dysfluency measure for quantifying TD. Digitized samples of handwritten circles and loops were obtained from 62 psychosis patients with or without TD and from 50 healthy subjects. Two measures of dysfluent pen movements were extracted from each vertical pen stroke, including normalized jerk and the number of acceleration peaks. Tardive dyskinesia patients exhibited significantly higher dysfluency scores than non-TD patients and controls. Severity of handwriting movement dysfluency was correlated with Abnormal Involuntary Movement Scale severity ratings for some tasks. The procedure yielded high degrees of test-retest reliability. These results suggest that measures of handwriting movement dysfluency may be particularly useful for objectively evaluating the efficacy of pharmacotherapeutic strategies for treating TD.
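    Both dysfluency measures can be approximated from a digitized pen trace. The sketch below uses the commonly cited dimensionless-jerk normalization, sqrt(0.5 · ∫j²dt · T⁵ / L²), which makes the score independent of stroke duration and size; this formula and the simple peak counter are assumptions for illustration, not the paper's exact implementation:

```python
import numpy as np

def normalized_jerk(y, dt):
    """Dimensionless jerk of a 1-D pen trajectory sampled at interval dt."""
    v = np.gradient(y, dt)          # velocity
    a = np.gradient(v, dt)          # acceleration
    j = np.gradient(a, dt)          # jerk (3rd derivative of position)
    T = dt * (len(y) - 1)           # stroke duration
    L = np.sum(np.abs(np.diff(y)))  # path length
    return np.sqrt(0.5 * np.sum(j ** 2) * dt * T ** 5 / L ** 2)

def acceleration_peaks(y, dt):
    """Count interior local maxima of |acceleration|; fluent strokes
    show few peaks, dysfluent strokes many."""
    a = np.gradient(np.gradient(y, dt), dt)
    mag = np.abs(a)
    return int(np.sum((mag[1:-1] > mag[:-2]) & (mag[1:-1] > mag[2:])))
```

    A smooth sinusoidal stroke scores far lower on both measures than the same stroke with small tremor-like noise superimposed.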

  5. Sci—Thur PM: Imaging — 04: An iterative triple energy window (TEW) approach to cross talk correction in quantitative small animal Tc99m and In111 SPECT

    SciTech Connect

    Prior, P; Timmins, R; Wells, R G

    2014-08-15

    Dual isotope SPECT allows simultaneous measurement of two different tracers in vivo. With In111 (emission energies of 171keV and 245keV) and Tc99m (140keV), quantification of Tc99m is degraded by cross talk from In111 photons that scatter and are detected at an energy corresponding to Tc99m. TEW uses counts recorded in two narrow windows surrounding the Tc99m primary window to estimate scatter. Iterative TEW corrects for the bias introduced into the TEW estimate by unscattered counts detected in the scatter windows. The contamination in the scatter windows is iteratively estimated and subtracted as a fraction of the scatter-corrected primary window counts. The iterative TEW approach was validated with a small-animal SPECT/CT camera using a 2.5 mL plastic container holding thoroughly mixed Tc99m/In111 activity fractions of 0.15, 0.28, 0.52, 0.99, 2.47 and 6.90. Dose calibrator measurements were the gold standard. Uncorrected for scatter, the Tc99m activity was over-estimated by as much as 80%. Unmodified TEW underestimated the Tc99m activity by 13%. With iterative TEW corrections applied in projection space, the Tc99m activity was estimated within 5% of truth across all activity fractions above 0.15. This is an improvement over the non-iterative TEW, which could not sufficiently correct for scatter in the 0.15 and 0.28 phantoms.
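    The iterative correction described above can be sketched as follows. The window widths and the spill-over fractions (the fraction of true primary counts landing in each narrow window) are illustrative assumptions, not values from the study; in practice they would be measured for the camera:

```python
import numpy as np

def tew_scatter(c_left, c_right, w_left, w_right, w_peak):
    """Trapezoidal triple-energy-window scatter estimate for the peak window."""
    return (c_left / w_left + c_right / w_right) * w_peak / 2.0

def iterative_tew(c_peak, c_left, c_right, w_left, w_right, w_peak,
                  f_left=0.05, f_right=0.05, n_iter=10):
    """Iteratively remove primary-photon contamination from the narrow
    scatter windows before forming the TEW estimate.

    f_left / f_right: assumed fractions of true primary counts that
    spill into each narrow window (illustrative values)."""
    primary = c_peak - tew_scatter(c_left, c_right, w_left, w_right, w_peak)
    for _ in range(n_iter):
        # subtract estimated primary spill-over from the scatter windows
        s_left = np.clip(c_left - f_left * primary, 0, None)
        s_right = np.clip(c_right - f_right * primary, 0, None)
        scatter = tew_scatter(s_left, s_right, w_left, w_right, w_peak)
        primary = np.clip(c_peak - scatter, 0, None)
    return primary
```

    With no counts in the narrow windows the estimate reduces to the raw peak-window counts; with contamination present, the estimate converges to a fixed point between the uncorrected and plain-TEW values.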

  6. A SPECT Camera for Combined MRI and SPECT for Small Animals.

    PubMed

    Meier, D; Wagenaar, D J; Chen, S; Xu, J; Yu, J; Tsui, B M W

    2011-10-01

    We describe an MR-compatible SPECT camera for small animals. The SPECT camera system can be inserted into the bore of a state-of-the-art MRI system and allows researchers to acquire tomographic images from a mouse in vivo with the MRI and the SPECT acquiring simultaneously. The SPECT system provides functional information, while MRI provides anatomical information. Until now it has been impossible to operate a conventional SPECT system inside an MRI scanner because of mutual interference. The new SPECT technology is based on semiconductor radiation sensors (CZT, ASICs), and it fits into conventional high-field MRI systems with a minimum 12-cm bore size. The SPECT camera has an MR-compatible multi-pinhole collimator for mice with a 25-mm-diameter field of view. For the work reported here we assembled a prototype SPECT camera system and acquired SPECT and MRI data from radioactive sources and resolution phantoms using the camera outside and inside the MRI.

  7. High Concordance Between Mental Stress–Induced and Adenosine-Induced Myocardial Ischemia Assessed Using SPECT in Heart Failure Patients: Hemodynamic and Biomarker Correlates

    PubMed Central

    Wawrzyniak, Andrew J.; Dilsizian, Vasken; Krantz, David S.; Harris, Kristie M.; Smith, Mark F.; Shankovich, Anthony; Whittaker, Kerry S.; Rodriguez, Gabriel A.; Gottdiener, John; Li, Shuying; Kop, Willem; Gottlieb, Stephen S.

    2016-01-01

    Mental stress can trigger myocardial ischemia, but the prevalence of mental stress–induced ischemia in congestive heart failure (CHF) patients is unknown. We characterized mental stress–induced and adenosine-induced changes in myocardial perfusion and neurohormonal activation in CHF patients with reduced left-ventricular function using SPECT to precisely quantify segment-level myocardial perfusion. Methods Thirty-four coronary artery disease patients (mean age ± SD, 62 ± 10 y) with CHF longer than 3 mo and ejection fraction less than 40% underwent both adenosine and mental stress myocardial perfusion SPECT on consecutive days. Mental stress consisted of anger recall (anger-provoking speech) followed by subtraction of serial sevens. The presence and extent of myocardial ischemia was quantified using the conventional 17-segment model. Results Sixty-eight percent of patients had 1 ischemic segment or more during mental stress and 81% during adenosine. On segment-by-segment analysis, perfusion with mental stress and adenosine were highly correlated. No significant differences were found between any 2 time points for B-type natriuretic peptide, tumor necrosis factor-α, IL-1b, troponin, vascular endothelial growth factor, IL-17a, matrix metallopeptidase-9, or C-reactive protein. However, endothelin-1 and IL-6 increased, and IL-10 decreased, between the stressor and 30 min after stress. Left-ventricular end-diastolic volume was 179 ± 65 mL at rest and increased to 217 ± 71 mL after mental stress and 229 ± 86 mL after adenosine (P < 0.01 for both). End-systolic volume was 129 ± 60 mL at rest and increased to 158 ± 66 mL after mental stress (P < 0.05) and 171 ± 87 mL after adenosine (P < 0.07), with no significant differences between adenosine and mental stress. Ejection fraction was 30 ± 12 at baseline, 29 ± 11 with mental stress, and 28 ± 10 with adenosine (P = not significant). Conclusion There was high concordance between ischemic perfusion defects induced

  8. Quantitative assessments of ecological impact/recovery in freshwater systems

    SciTech Connect

    Birge, W.J.; Keogh, D.P.; Zuiderveen, J.A.

    1995-12-31

    Long-term studies were undertaken to evaluate the fidelity of multi-metric scoring systems and other means of quantifying the effects of chemical stresses on aquatic biota. The integrity of macroinvertebrate communities was assessed using the Rapid Bioassessment Protocol III; trophic group analysis; diversity indices; and various individual parameters, including species richness, abundance and indicator species. In addition, chemical and toxicological monitoring data and periphyton studies were used in the evaluations. Surveys were performed at monitoring stations selected for comparable conditions, and included upstream reference and downstream recovery areas. In two streams, ecological impact varied from severe to extreme near point-source outfalls and decreased progressively with distance downstream. Station-to-station scoring with Protocol III and diversity indices correlated well with independent chemical and toxicological evaluations. However, in metal-stressed streams affected by slight to moderate impact, or which were in early recovery, Protocol III scoring and other family-level metrics did not consistently reflect losses in species richness and mean abundance of up to 32% and 75%, respectively. Observations on morphological deformities (e.g., eyespots, gills) and selected subfamily and species-level metrics, including ratios of metal-sensitive to metal-tolerant chironomids, gave greater accuracy in characterizing low to moderate perturbations. However, in conclusion, it appeared that marginal losses in biodiversity over time may not be detectable with current procedures. Major factors affecting precision included the normal range of seasonal and annual fluctuations in ecological parameters within and among stream systems, inadequate historical data, as well as drought and high-water events.

  9. Quantitative assessment of corpus callosum morphology in periventricular nodular heterotopia.

    PubMed

    Pardoe, Heath R; Mandelstam, Simone A; Hiess, Rebecca Kucharsky; Kuzniecky, Ruben I; Jackson, Graeme D

    2015-01-01

    We investigated systematic differences in corpus callosum morphology in periventricular nodular heterotopia (PVNH). Differences in corpus callosum mid-sagittal area and subregional area changes were measured using an automated software-based method. Heterotopic gray matter deposits were automatically labeled and compared with corpus callosum changes. The spatial pattern of corpus callosum changes was interpreted in the context of the characteristic anterior-posterior development of the corpus callosum in healthy individuals. Individuals with periventricular nodular heterotopia were imaged at the Melbourne Brain Center or as part of the multi-site Epilepsy Phenome Genome project. Whole-brain T1-weighted MRI was acquired in cases (n=48) and controls (n=663). The corpus callosum was segmented on the mid-sagittal plane using the software "yuki". Heterotopic gray matter and intracranial brain volume were measured using Freesurfer. Differences in corpus callosum area and subregional areas were assessed, as well as the relationship between corpus callosum area and heterotopic GM volume. The anterior-posterior distribution of corpus callosum changes and heterotopic GM nodules were quantified using a novel metric and compared with each other. Corpus callosum area was reduced by 14% in PVNH (p=1.59×10(-9)). The magnitude of the effect was least in the genu (7% reduction) and greatest in the isthmus and splenium (26% reduction). Individuals with higher heterotopic GM volume had a smaller corpus callosum. Heterotopic GM volume was highest in posterior brain regions; however, there was no linear relationship between the anterior-posterior position of corpus callosum changes and PVNH nodules. Reduced corpus callosum area is strongly associated with PVNH, and is probably associated with abnormal brain development in this neurological disorder. The primarily posterior corpus callosum changes may inform our understanding of the etiology of PVNH. Our results suggest that

  10. Iterative restoration of SPECT projection images

    SciTech Connect

    Glick, S.J.; Xia, W.

    1997-04-01

    Photon attenuation and the limited nonstationary spatial resolution of the detector can reduce both qualitative and quantitative image quality in single photon emission computed tomography (SPECT). In this paper, a reconstruction approach is described which can compensate for both of these degradations. The approach involves processing the projection data with Bellini's method for attenuation compensation, followed by an iterative deconvolution technique which uses the frequency distance principle (FDP) to model the distance-dependent camera blur. Modeling of the camera blur with the FDP allows an efficient implementation using fast Fourier transform (FFT) methods. After processing of the projection data, reconstruction is performed using filtered backprojection. Simulation studies using two different brain phantoms show that this approach gives reconstructions with a favorable bias versus noise tradeoff, produces no visually undesirable noise artifacts, and requires a low computational load.
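    The FDP-based filter itself is involved, but the flavor of FFT-based iterative deconvolution can be shown with a generic Van Cittert scheme, where the estimate is repeatedly corrected by the residual between the measured data and the re-blurred estimate. This is an illustrative stand-in, not the authors' distance-dependent filter:

```python
import numpy as np

def van_cittert_deconv(blurred, psf, n_iter=20, relax=1.0):
    """Van Cittert iterative deconvolution in the Fourier domain:
    F_{k+1} = F_k + relax * (G - H * F_k), with all convolutions
    carried out as multiplications of FFTs."""
    G = np.fft.fft(blurred)
    # shift PSF so its center sits at sample 0 (circular convolution)
    H = np.fft.fft(np.fft.ifftshift(psf))
    F = G.copy()
    for _ in range(n_iter):
        F = F + relax * (G - H * F)
    return np.real(np.fft.ifft(F))
```

    Note that Van Cittert iteration amplifies noise at frequencies where the blur response is small, which is why practical SPECT restoration schemes stop after few iterations or add regularization.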

  11. Progress in SPECT/CT imaging of prostate cancer.

    PubMed

    Seo, Youngho; Franc, Benjamin L; Hawkins, Randall A; Wong, Kenneth H; Hasegawa, Bruce H

    2006-08-01

    Prostate cancer is the most common type of cancer (other than skin cancer) among men in the United States. Although prostate cancer is one of the few cancers that grow so slowly that it may never threaten the lives of some patients, it can be lethal once metastasized. Indium-111 capromab pendetide (ProstaScint, Cytogen Corporation, Princeton, NJ) imaging is indicated for staging and recurrence detection of the disease, and is particularly useful to determine whether or not the disease has spread to distant metastatic sites. However, the interpretation of 111In-capromab pendetide is challenging without correlated structural information, mostly because the radiopharmaceutical demonstrates nonspecific uptake in the normal vasculature, bowel, bone marrow, and the prostate gland. We developed an improved method of imaging and localizing 111In-capromab pendetide using a SPECT/CT imaging system. The specific goals included: i) development and application of a novel iterative SPECT reconstruction algorithm that utilizes a priori information from coregistered CT; and ii) assessment of the clinical impact of adding SPECT/CT for prostate cancer imaging with capromab pendetide utilizing the standard and novel reconstruction techniques. Patient imaging studies with capromab pendetide were performed from 1999 to 2004 using two different SPECT/CT scanners, a prototype SPECT/CT system and a commercial SPECT/CT system (Discovery VH, GE Healthcare, Waukesha, WI). SPECT projection data from both systems were reconstructed using an experimental iterative algorithm that compensates for both photon attenuation and collimator blurring. In addition, the data obtained from the commercial system were reconstructed with attenuation correction using an OSEM reconstruction supplied by the camera manufacturer for routine clinical interpretation. For 12 sets of patient data, SPECT images reconstructed using the experimental algorithm were interpreted separately and compared with interpretation of

  13. Real Time Quantitative Radiological Monitoring Equipment for Environmental Assessment

    SciTech Connect

    John R. Giles; Lyle G. Roybal; Michael V. Carpenter

    2006-03-01

    The Idaho National Laboratory (INL) has developed a suite of systems that rapidly scan, analyze, and characterize radiological contamination in soil. These systems have been successfully deployed at several Department of Energy (DOE) laboratories and Cold War Legacy closure sites. Traditionally, these systems have been used during the characterization and remediation of radiologically contaminated soils and surfaces; however, subsequent to the terrorist attacks of September 11, 2001, the applications of these systems have expanded to include homeland security operations for first response, continuing assessment and verification of cleanup activities in the event of the detonation of a radiological dispersal device. The core system components are a detector, a spectral analyzer, and a global positioning system (GPS). The system is computer controlled by menu-driven, user-friendly custom software designed for a technician-level operator. A wide variety of detectors have been used, including several configurations of sodium iodide (NaI) and high-purity germanium (HPGe) detectors, and a large-area proportional counter designed for the detection of x-rays from actinides such as Am-241 and Pu-238. Systems have been deployed from several platforms including a small all-terrain vehicle (ATV), hand-pushed carts, a backpack-mounted unit, and an excavator-mounted unit used where personnel safety considerations are paramount. The INL has advanced this concept, and expanded the system functionality to create an integrated, field-deployed analytical system through the use of tailored analysis and operations software. Customized, site-specific software is assembled from a supporting toolbox of algorithms that streamline the data acquisition, analysis and reporting process. These algorithms include region-specific spectral stripping, automated energy calibration, background subtraction, activity calculations based on measured detector efficiencies, and on-line data quality checks.

  14. Towards quantitative ecological risk assessment of elevated carbon dioxide levels in the marine environment.

    PubMed

    de Vries, Pepijn; Tamis, Jacqueline E; Foekema, Edwin M; Klok, Chris; Murk, Albertinka J

    2013-08-30

    The environmental impact of elevated carbon dioxide (CO2) levels has become of more interest in recent years, in relation to globally rising CO2 levels and to the related consideration of geological CO2 storage as a mitigating measure. In the present study, effect data from the literature were collected in order to conduct a marine ecological risk assessment of elevated CO2 levels using a Species Sensitivity Distribution (SSD). It became evident that the information currently available in the literature is mostly insufficient for such a quantitative approach. Most studies focus on the effects of expected future CO2 levels, testing only one or two elevated concentrations. A full dose-response relationship, a uniform measure of exposure, and standardized test protocols are essential for conducting a proper quantitative risk assessment of elevated CO2 levels. Improvements are proposed to make future tests more valuable and usable for quantitative risk assessment.
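    An SSD fits a distribution, commonly log-normal, to per-species effect concentrations and reads off quantities such as the HC5 (the concentration expected to affect 5% of species) or the potentially affected fraction at a given exposure. A stdlib-only sketch under that log-normal assumption, with made-up data; it is not code from the study:

```python
import math
from statistics import NormalDist

def fit_ssd(effect_concs):
    """Fit a log-normal SSD to species effect concentrations (e.g. EC50s)
    by fitting a normal distribution to their log10 values."""
    logs = [math.log10(c) for c in effect_concs]
    mu = sum(logs) / len(logs)
    sd = math.sqrt(sum((x - mu) ** 2 for x in logs) / (len(logs) - 1))
    return NormalDist(mu, sd)

def hazardous_conc(ssd, p=0.05):
    """Concentration expected to affect a fraction p of species (HC5 default)."""
    return 10 ** ssd.inv_cdf(p)

def potentially_affected_fraction(ssd, conc):
    """Fraction of species expected to be affected at a given concentration."""
    return ssd.cdf(math.log10(conc))
```

    The abstract's point is precisely that this machinery needs full dose-response data per species; one or two tested concentrations do not pin down the per-species effect concentrations the fit requires.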

  15. Detection of Sentinel Lymph Nodes in Gynecologic Tumours by Planar Scintigraphy and SPECT/CT

    PubMed Central

    Kraft, Otakar; Havel, Martin

    2012-01-01

    Objective: To assess the role of planar lymphoscintigraphy and fusion SPECT/CT imaging in sentinel lymph node (SLN) detection in patients with gynecologic tumours. Material and Methods: Planar scintigraphy and the hybrid modality SPECT/CT were performed in 64 consecutive women with gynecologic tumours (mean age 53.6 years; range, 30-77 years): 36 patients with cervical cancer (Group A), 21 patients with endometrial cancer (Group B), and 7 patients with vulvar carcinoma (Group C). Planar and SPECT/CT images were interpreted separately by two nuclear medicine physicians. The efficacy of these two techniques in imaging SLNs was compared. Results: Planar scintigraphy did not image SLNs in 7 patients (10.9%); SPECT/CT was negative in 4 patients (6.3%). In 35 (54.7%) patients the number of SLNs captured on SPECT/CT was higher than on planar imaging. Differences in SLN detection between planar and SPECT/CT imaging in the group of all 64 patients are statistically significant (p<0.05). Three foci of uptake (1.7% of the 177 foci visible in total on planar images) in 2 patients, interpreted on planar images as hot LNs, were found to be false-positive non-nodal sites of uptake when further assessed on SPECT/CT. SPECT/CT showed the exact anatomical location of all visualised sentinel nodes. Conclusion: In some patients with gynecologic cancers SPECT/CT improves the detection of sentinel lymph nodes. It can image nodes not visible on planar scintigrams, exclude false-positive uptake, and exactly localise pelvic and paraaortic SLNs. It improves anatomic localization of SLNs. Conflict of interest: None declared. PMID:23486989
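    A planar-versus-SPECT/CT detection comparison in the same patients is a paired design, for which significance is often tested with McNemar's exact test on the discordant pairs. The abstract does not name its test, so this is an illustrative assumption:

```python
from math import comb

def mcnemar_exact(b, c):
    """Exact two-sided McNemar p-value from discordant pair counts:
    b = cases detected only by one modality (e.g. SPECT/CT),
    c = cases detected only by the other (e.g. planar imaging).
    Uses the binomial distribution with p = 0.5 on b + c trials."""
    n = b + c
    k = min(b, c)
    one_sided = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * one_sided)
```

    With heavily one-sided discordance (many patients where only SPECT/CT finds extra nodes and few the reverse), the p-value drops below conventional significance thresholds.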

  16. Performance of Myocardial Perfusion Imaging Using Multi-focus Fan Beam Collimator with Resolution Recovery Reconstruction in a Comparison with Conventional SPECT

    PubMed Central

    Matsutomo, Norikazu; Nagaki, Akio; Sasaki, Masayuki

    2014-01-01

    Objective(s): IQ-SPECT is an advanced high-speed SPECT modality for myocardial perfusion imaging (MPI), which uses a multi-focus fan beam collimator with resolution recovery reconstruction. The aim of this study was to compare IQ-SPECT with conventional SPECT in terms of performance, based on standard clinical protocols. In addition, we examined the concordance between conventional SPECT and IQ-SPECT in patients with coronary artery disease (CAD). Methods: Fifty-three patients, undergoing rest-gated MPI for the evaluation of known or suspected CAD, were enrolled in this study. In each patient, conventional SPECT (99mTc-tetrofosmin, 9.6 min and 201Tl, 12.9 min) was performed, immediately followed by IQ-SPECT, using a short acquisition time (4.3 min for 99mTc-tetrofosmin and 6.2 min for 201Tl). A quantitative analysis was performed on an MPI polar map, using a 20-segment model of the left ventricle. An automated analysis by gated SPECT was carried out to determine the left ventricular volume and function, including end-diastolic volume (EDV), end-systolic volume (ESV), and left ventricular ejection fraction (LVEF). The degree of concordance between conventional SPECT and IQ-SPECT images was evaluated according to linear regression and Bland-Altman analyses. Results: The segmental percent uptake exhibited a significant correlation between IQ-SPECT and conventional SPECT (P<0.05). The mean differences in 99mTc-tetrofosmin studies were 1.1±6.6% (apex), 2.8±5.7% (anterior wall), 2.9±6.2% (septal wall), 4.9±6.7% (lateral wall), and 1.8±5.6% (inferior wall). Meanwhile, regarding the 201Tl-SPECT studies, these values were 1.6±6.9%, 2.0±6.6%, 2.1±5.9%, 3.3±7.2%, and 2.4±5.8%, respectively. Although the mean LVEF in IQ-SPECT tended to be higher than that observed in conventional SPECT (conventional SPECT=64.8±11.8% and IQ-SPECT=68.3±12.1% for 99mTc-tetrofosmin; conventional SPECT= 56.0±11.7% and IQ-SPECT=61.5±12.2% for 201Tl), quantitative parameters were not
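The Bland-Altman agreement analysis named in this abstract can be sketched in a few lines: compute the paired differences, their mean (the bias), and the 95% limits of agreement. The paired LVEF values below are fabricated for illustration only.

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Return the mean difference (bias) and 95% limits of agreement."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    half_width = 1.96 * stdev(diffs)
    return bias, (bias - half_width, bias + half_width)

conventional = [62.0, 58.5, 71.2, 64.8, 55.0]  # LVEF (%), conventional SPECT (fabricated)
iq_spect = [65.1, 61.0, 73.9, 68.2, 58.4]      # LVEF (%), IQ-SPECT (fabricated)
bias, (lower, upper) = bland_altman(conventional, iq_spect)
```

A negative bias here would mirror the study's observation that IQ-SPECT LVEF tends to run higher than conventional SPECT.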

  17. Reliability of Quantitative Ultrasonic Assessment of Normal-Tissue Toxicity in Breast Cancer Radiotherapy

    SciTech Connect

    Yoshida, Emi J.; Chen Hao; Torres, Mylin; Andic, Fundagul; Liu Haoyang; Chen Zhengjia; Sun, Xiaoyan; Curran, Walter J.; Liu Tian

    2012-02-01

    Purpose: We have recently reported that ultrasound imaging, together with ultrasound tissue characterization (UTC), can provide quantitative assessment of radiation-induced normal-tissue toxicity. This study's purpose is to evaluate the reliability of our quantitative ultrasound technology in assessing acute and late normal-tissue toxicity in breast cancer radiotherapy. Methods and Materials: Our ultrasound technique analyzes radiofrequency echo signals and provides quantitative measures of dermal, hypodermal, and glandular tissue toxicities. To facilitate easy clinical implementation, we further refined this technique by developing a semiautomatic ultrasound-based toxicity assessment tool (UBTAT). Seventy-two ultrasound studies of 26 patients (720 images) were analyzed. Images of 8 patients were evaluated for acute toxicity (<6 months post-radiotherapy) and those of 18 patients were evaluated for late toxicity (≥6 months post-radiotherapy). All patients were treated according to a standard radiotherapy protocol. To assess intraobserver reliability, one observer analyzed 720 images in UBTAT and then repeated the analysis 3 months later. To assess interobserver reliability, three observers (two radiation oncologists and one ultrasound expert) each analyzed 720 images in UBTAT. An intraclass correlation coefficient (ICC) was used to evaluate intra- and interobserver reliability. Ultrasound assessment and clinical evaluation were also compared. Results: Intraobserver ICC was 0.89 for dermal toxicity, 0.74 for hypodermal toxicity, and 0.96 for glandular tissue toxicity. Interobserver ICC was 0.78 for dermal toxicity, 0.74 for hypodermal toxicity, and 0.94 for glandular tissue toxicity. Statistical analysis found significant changes in dermal (p < 0.0001), hypodermal (p = 0.0027), and glandular tissue (p < 0.0001) assessments in the acute toxicity group. Ultrasound measurements correlated with clinical Radiation Therapy Oncology Group (RTOG) toxicity scores of patients
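The intraclass correlation coefficient used above to quantify reliability can be sketched with a simple one-way variant, ICC(1,1), derived from the between- and within-subject mean squares. The abstract does not state which ICC form was used, so this is an illustrative choice; the ratings below are fabricated (rows = images, columns = repeated readings).

```python
from statistics import mean

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) from a subjects x raters table."""
    n = len(ratings)     # number of subjects (images)
    k = len(ratings[0])  # number of raters / repeated readings
    grand = mean(x for row in ratings for x in row)
    row_means = [mean(row) for row in ratings]
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)   # between-subject MS
    msw = sum((x - m) ** 2 for row, m in zip(ratings, row_means)
              for x in row) / (n * (k - 1))                        # within-subject MS
    return (msb - msw) / (msb + (k - 1) * msw)
```

Perfectly repeatable readings give an ICC of 1.0; disagreement between readings pulls the value toward 0.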

  18. The benefit of personalized hybrid SPECT/CT pulmonary imaging.

    PubMed

    Simanek, Milan; Koranda, Pavel

    2016-01-01

    Present-day hybrid pulmonary imaging fuses various uses of CT, including angiography (CTAG), diagnostic CT, and low-dose CT (LDCT), with perfusion or ventilation scintigraphy in tomographic or planar imaging. Determining the most effective individualized test for the complete diagnostic work-up of patients with pulmonary symptoms, across various patient groups, is a major issue. The aim of the present study was to assess the effectiveness of implementing hybrid imaging within current nuclear medicine methods for the differential diagnostics of pulmonary embolism (PE). 326 patients were examined for symptoms of PE. Patients were initially examined with SPECT perfusion scintigraphy. A SPECT finding without sub-segmental or segmental defects was considered to indicate no evidence of PE, while defects in multiple segments or sub-segments in different parts of the lungs were considered near-certain PE. In the case of unclear findings, LDCT was added, and in the case of a higher suspicion of PE, a ventilation examination was applied. PE could be confirmed or excluded in 83% of patients solely on the basis of the perfusion SPECT examination and an X-ray or LDCT. LDCT was performed in 26% of the patients; in 41% of these, LDCT yielded an alternative diagnosis explaining the perfusion abnormalities. The study showed that use of SPECT/LDCT for the differential diagnosis of lung symptoms improves the diagnosis of pulmonary embolism and the identification of other lung diseases when lung perfusion abnormalities are recorded. PMID:27648373

  19. Molecular Imaging of Conscious, Unrestrained Mice with AwakeSPECT

    SciTech Connect

    Baba, Justin S.; Endres, Christopher J.; Foss, Catherine A.; Nimmagadda, Sridhar; Jung, Hyeyun; Goddard, James S.; Lee, Seung Joon; McKisson, John; Smith, Mark F.; Stolin, Alexander V.; Weisenberger, Andrew G.; Pomper, Martin G.

    2013-06-01

    We have developed a SPECT imaging system, AwakeSPECT, to enable molecular brain imaging of untrained mice that are conscious, unanesthetized, and unrestrained. We accomplished this with head tracking and motion correction techniques. Methods: The capability of the system for motion-corrected imaging was demonstrated with a 99mTc-pertechnetate phantom, 99mTc-methylene diphosphonate bone imaging, and measurement of the binding potential of the dopamine transporter radioligand 123I-ioflupane in mouse brain in the awake and anesthetized (isoflurane) states. Stress induced by imaging in the awake state was assessed through measurement of plasma corticosterone levels. Results: AwakeSPECT provided high-resolution bone images reminiscent of those obtained from CT. The binding potential of 123I-ioflupane in the awake state was on the order of 50% of that obtained with the animal under anesthesia, consistent with previous studies in nonhuman primates. Levels of stress induced were on the order of those seen in other behavioral tasks and imaging studies of awake animals. Conclusion: These results demonstrate the feasibility of SPECT molecular brain imaging of mice in the conscious, unrestrained state and demonstrate the effects of isoflurane anesthesia on radiotracer uptake.
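The binding potential reported above is conventionally estimated with a reference-region ratio, BP_ND = (target / reference) - 1, using a receptor-rich target region (striatum for 123I-ioflupane) and a receptor-poor reference (cerebellum). The abstract does not give its exact quantification method, so this is a generic sketch with fabricated activity values.

```python
def binding_potential(target_activity, reference_activity):
    """Reference-region binding potential: BP_ND = target/reference - 1."""
    return target_activity / reference_activity - 1.0

# Fabricated regional activities (arbitrary units, cerebellum normalized to 1).
bp_anesthetized = binding_potential(3.0, 1.0)  # BP_ND = 2.0
bp_awake = binding_potential(2.0, 1.0)         # BP_ND = 1.0, ~50% of anesthetized
```

With these made-up numbers, the awake BP_ND is half the anesthetized value, mirroring the ~50% effect described in the abstract.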

  3. Onboard functional and molecular imaging: A design investigation for robotic multipinhole SPECT

    SciTech Connect

    Bowsher, James; Giles, William; Yin, Fang-Fang; Yan, Susu; Roper, Justin

    2014-01-15

    Purpose: Onboard imaging—currently performed primarily by x-ray transmission modalities—is essential in modern radiation therapy. As radiation therapy moves toward personalized medicine, molecular imaging, which views individual gene expression, may also be important onboard. Nuclear medicine methods, such as single photon emission computed tomography (SPECT), are premier modalities for molecular imaging. The purpose of this study is to investigate a robotic multipinhole approach to onboard SPECT. Methods: Computer-aided design (CAD) studies were performed to assess the feasibility of maneuvering a robotic SPECT system about a patient in position for radiation therapy. In order to obtain fast, high-quality SPECT images, a 49-pinhole SPECT camera was designed which provides high sensitivity to photons emitted from an imaging region of interest. This multipinhole system was investigated by computer-simulation studies. Seventeen hot spots 10 and 7 mm in diameter were placed in the breast region of a supine female phantom. Hot spot activity concentration was six times that of background. For the 49-pinhole camera and a reference, more conventional, broad field-of-view (FOV) SPECT system, projection data were computer simulated for 4-min scans and SPECT images were reconstructed. Hot-spot localization was evaluated using a nonprewhitening forced-choice numerical observer. Results: The CAD simulation studies found that robots could maneuver SPECT cameras about patients in position for radiation therapy. In the imaging studies, most hot spots were apparent in the 49-pinhole images. Average localization errors for 10-mm- and 7-mm-diameter hot spots were 0.4 and 1.7 mm, respectively, for the 49-pinhole system, and 3.1 and 5.7 mm, respectively, for the reference broad-FOV system. Conclusions: A robot could maneuver a multipinhole SPECT system about a patient in position for radiation therapy. The system could provide onboard functional and molecular imaging with 4-min

  5. High-throughput automated image analysis of neuroinflammation and neurodegeneration enables quantitative assessment of virus neurovirulence

    PubMed Central

    Maximova, Olga A.; Murphy, Brian R.; Pletnev, Alexander G.

    2010-01-01

    Historically, the safety of live attenuated vaccine candidates against neurotropic viruses was assessed by semi-quantitative analysis of virus-induced histopathology in the central nervous system of monkeys. We have developed a high-throughput automated image analysis (AIA) for the quantitative assessment of virus-induced neuroinflammation and neurodegeneration. Evaluation of the results generated by AIA showed that quantitative estimates of lymphocytic infiltration, microglial activation, and neurodegeneration strongly and significantly correlated with results of traditional histopathological scoring. In addition, we show that AIA is a targeted, objective, accurate, and time-efficient approach that provides reliable differentiation of virus neurovirulence. As such, it may become a useful tool in establishing consistent analytical standards across research and development laboratories and regulatory agencies, and may improve the safety evaluation of live virus vaccines. The implementation of this high-throughput AIA will markedly advance many fields of research including virology, neuroinflammation, neuroscience, and vaccinology. PMID:20688036

  6. The quantitative assessment of domino effects caused by overpressure. Part I. Probit models.

    PubMed

    Cozzani, Valerio; Salzano, Ernesto

    2004-03-19

    Accidents caused by domino effects are among the most severe that have occurred in the chemical and process industries. However, a well-established and widely accepted methodology for the quantitative assessment of the contribution of domino accidents to industrial risk is still missing. Hence, available data on damage to process equipment caused by blast waves were reviewed in the framework of quantitative risk analysis, aiming at the quantitative assessment of domino effects caused by overpressure. Specific probit models were derived for several categories of process equipment and were compared with other literature approaches for predicting the probability of damage to equipment loaded by overpressure. The results highlight the importance of using equipment-specific models for the probability of damage and equipment-specific damage threshold values, rather than general equipment correlations, which may lead to errors of up to 500%. PMID:15072815
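A probit damage model of the kind derived in this work can be sketched as follows: a probit value Y = k1 + k2·ln(P) is computed from the peak overpressure P and converted to a damage probability via the standard normal CDF (probit units conventionally carry a +5 offset). The coefficients below are placeholders of plausible magnitude, not values from the paper.

```python
from math import log
from statistics import NormalDist

def damage_probability(overpressure_pa, k1=-18.96, k2=2.44):
    """Probit damage model: Y = k1 + k2*ln(P), probability = Phi(Y - 5)."""
    y = k1 + k2 * log(overpressure_pa)
    return NormalDist().cdf(y - 5.0)
```

As the abstract argues, k1 and k2 should be fitted per equipment category; a single generic pair can misestimate damage probability badly.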

  7. Variable Activation of the DNA Damage Response Pathways in Patients Undergoing SPECT Myocardial Perfusion Imaging

    PubMed Central

    Hu, Shijun; Liang, Grace; Ong, Sang-Ging; Han, Leng; Sanchez-Freire, Veronica; Lee, Andrew S.; Vasanawala, Minal; Segall, George; Wu, Joseph C.

    2015-01-01

    Background Although single photon emission computed tomography myocardial perfusion imaging (SPECT MPI) has improved the diagnosis and risk stratification of patients with suspected coronary artery disease, it remains a primary source of low dose radiation exposure for cardiac patients. To determine the biological effects of low dose radiation from SPECT MPI, we measured the activation of the DNA damage response pathways using quantitative flow cytometry and single cell gene expression profiling. Methods and Results Blood samples were collected from patients before and after SPECT MPI (n=63). Overall, analysis of all recruited patients showed no marked differences in the phosphorylation of proteins (H2AX, p53, and ATM) following SPECT. The majority of patients also had either down-regulated or unchanged expression in DNA damage response genes at both 24 and 48 hours post-SPECT. Interestingly, a small subset of patients with increased phosphorylation also had significant up-regulation of genes associated with DNA damage, whereas those with no changes in phosphorylation had significant down-regulation or no difference, suggesting that some patients may potentially be more sensitive to low dose radiation exposure. Conclusions Our findings showed that SPECT MPI resulted in a variable activation of the DNA damage response pathways. Although only a small subset of patients had increased protein phosphorylation and elevated gene expression post-imaging, continued care should be taken to reduce radiation exposure to both patients and operators. PMID:25609688

  8. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) point-of-departure metrics for quantitative dose-response analysis in genetic toxicology; (2) measurement and estimation of exposures for better extrapolation to humans; and (3) the use of quantitative approaches in genetic toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe, and discuss in silico, in vitro, in vivo, and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  9. Quantitative Approach to Collaborative Learning: Performance Prediction, Individual Assessment, and Group Composition

    ERIC Educational Resources Information Center

    Cen, Ling; Ruta, Dymitr; Powell, Leigh; Hirsch, Benjamin; Ng, Jason

    2016-01-01

    The benefits of collaborative learning, although widely reported, lack the quantitative rigor and detailed insight into the dynamics of interactions within the group, while individual contributions and their impacts on group members and their collaborative work remain hidden behind joint group assessment. To bridge this gap we intend to address…

  10. QUANTITATIVE ASSESSMENT OF CORAL DISEASES IN THE FLORIDA KEYS: STRATEGY AND METHODOLOGY

    EPA Science Inventory

    Most studies of coral disease have focused on the incidence of a single disease within a single location. Our overall objective is to use quantitative assessments to characterize annual patterns in the distribution and frequency of scleractinian and gorgonian coral diseases over ...

  11. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  12. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  13. Effect of Teacher Specialized Training on Limited English Speaking Students' Assessment Outcomes: Quantitative Study

    ERIC Educational Resources Information Center

    Palaroan, Michelle A.

    2009-01-01

    The quantitative study was a comparison of Limited English Proficient (LEP) students' assessment outcomes when taught by a teacher with specialized training and when taught by teachers with no specialized training. The comparison of 2007-2008 Northern Nevada LEP third grade student scores in the content areas of English language arts and…

  14. Climate Change Education: Quantitatively Assessing the Impact of a Botanical Garden as an Informal Learning Environment

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Bogner, Franz X.

    2013-01-01

    Although informal learning environments have been studied extensively, ours is one of the first studies to quantitatively assess the impact of learning in botanical gardens on students' cognitive achievement. We observed a group of 10th graders participating in a one-day educational intervention on climate change implemented in a botanical…

  15. A Quantitative Synthesis of Developmental Disability Research: The Impact of Functional Assessment Methodology on Treatment Effectiveness

    ERIC Educational Resources Information Center

    Delfs, Caitlin H.; Campbell, Jonathan M.

    2010-01-01

    Methods and outcomes from functional behavioral assessment have been researched widely over the past twenty-five years. However, several important research questions have yet to be examined sufficiently. This quantitative review of developmental disability research aims to make comparisons of different functional behavioral assessment…

  16. U-SPECT-BioFluo: an integrated radionuclide, bioluminescence, and fluorescence imaging platform

    PubMed Central

    2014-01-01

    Background: In vivo bioluminescence, fluorescence, and single-photon emission computed tomography (SPECT) imaging provide complementary information about biological processes. However, to date these signatures have been evaluated separately, on individual preclinical systems. In this paper, we introduce a fully integrated bioluminescence-fluorescence-SPECT platform. In addition to streamlining logistics and image fusion, this integration can help improve understanding of the optical imaging (OI) results. Methods: An OI module was developed for a preclinical SPECT system (U-SPECT, MILabs, Utrecht, the Netherlands). The applicability of the module for bioluminescence and fluorescence imaging was evaluated in both a phantom and an in vivo setting using mice implanted with a 4T1-luc+ tumor. A combination of a fluorescent dye and a radioactive moiety was used to directly relate the optical images of the module to the SPECT findings. Bioluminescence imaging (BLI) was compared to the localization of the fluorescence signal in the tumors. Results: Both the phantom and in vivo mouse studies showed that superficial fluorescence signals could be imaged accurately. The SPECT and bioluminescence images could be used to place the fluorescence findings in perspective, e.g. by showing tracer accumulation in non-target organs such as the liver and kidneys (SPECT) and giving a semi-quantitative read-out for tumor spread (bioluminescence). Conclusions: We developed a fully integrated multimodal platform that provides complementary registered imaging of bioluminescent, fluorescent, and SPECT signatures in a single scanning session with a single dose of anesthesia. In our view, integration of these modalities helps to improve data interpretation of optical findings in relation to radionuclide images. PMID:25386389

  17. Spatial correspondence of 4D CT ventilation and SPECT pulmonary perfusion defects in patients with malignant airway stenosis

    NASA Astrophysics Data System (ADS)

    Castillo, Richard; Castillo, Edward; McCurdy, Matthew; Gomez, Daniel R.; Block, Alec M.; Bergsma, Derek; Joy, Sarah; Guerrero, Thomas

    2012-04-01

    To determine the spatial overlap agreement between four-dimensional computed tomography (4D CT) ventilation and single photon emission computed tomography (SPECT) perfusion hypo-functioning pulmonary defect regions in a patient population with malignant airway stenosis. Treatment planning 4D CT images were obtained retrospectively for ten lung cancer patients with radiographically demonstrated airway obstruction due to gross tumor volume. Each patient also received a SPECT perfusion study within one week of the planning 4D CT, and prior to the initiation of treatment. Deformable image registration was used to map corresponding lung tissue elements between the extreme component phase images, from which quantitative three-dimensional (3D) images representing the local pulmonary specific ventilation were constructed. Semi-automated segmentation of the percentile perfusion distribution was performed to identify regional defects distal to the known obstructing lesion. Semi-automated segmentation was similarly performed by multiple observers to delineate corresponding defect regions depicted on 4D CT ventilation. Normalized Dice similarity coefficient (NDSC) indices were determined for each observer between SPECT perfusion and 4D CT ventilation defect regions to assess spatial overlap agreement. Tidal volumes determined from 4D CT ventilation were evaluated against measurements obtained from lung parenchyma segmentation. Linear regression resulted in a linear fit with slope = 1.01 (R2 = 0.99). Respective values for the average DSC, NDSC(1 mm), and NDSC(2 mm) for all cases and multiple observers were 0.78, 0.88, and 0.99, indicating that, on average, spatial overlap agreement between ventilation and perfusion defect regions was comparable to the threshold for agreement within 1-2 mm uncertainty. Corresponding coefficients of variation for all metrics were similarly in the range of 0.10%-19%. This study is the first to quantitatively assess 3D spatial overlap agreement between
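The Dice similarity coefficient underlying the overlap metrics above is straightforward to compute: twice the intersection of the two defect masks divided by the sum of their sizes. The masks below are fabricated flattened binary voxel arrays for illustration.

```python
def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks (flattened)."""
    intersection = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    total = sum(mask_a) + sum(mask_b)
    return 2.0 * intersection / total if total else 1.0

# Fabricated 8-voxel defect masks (SPECT perfusion vs 4D CT ventilation).
a = [1, 1, 1, 0, 0, 1, 0, 0]
b = [1, 1, 0, 0, 1, 1, 0, 0]
overlap = dice(a, b)  # 0.75 for these masks
```

The normalized variant (NDSC) used in the study additionally rescales DSC by the value expected for a given boundary uncertainty (e.g. 1-2 mm); that normalization is not shown here.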

  18. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    PubMed Central

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. PMID:27146161
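
    One conventional way to compare pre/post gains across groups with different starting points, relevant to the transfer versus nontransfer comparison above, is Hake's normalized gain. This is a standard education-research metric, not necessarily the authors' analysis, and the scores below are invented for illustration.

```python
# Normalized gain g = (post - pre) / (100 - pre): the fraction of the possible
# improvement that was actually achieved, which makes groups with similar
# pretest scores directly comparable.

def normalized_gain(pre_pct, post_pct):
    """Gain achieved as a fraction of the maximum possible gain (scores in percent)."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Similar pretest scores, different improvement (as reported for transfer students):
g_nontransfer = normalized_gain(55.0, 70.0)   # 15 of 45 possible points
g_transfer = normalized_gain(55.0, 60.0)      # 5 of 45 possible points
```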

  20. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis method, and the metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult, if not impossible, to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America and the Quantitative Imaging Biomarkers Alliance, together with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.

  2. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index, and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis, and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, while the outcomes of the quantitative method are individual risk and societal risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion, and unconfined vapor cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. It is indicated that both methods can be applied in practice, and the choice between them depends on the basic data actually available for the gas pipelines and the precision requirements of the risk assessment.
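
    The qualitative index system described above reduces, at its core, to a weighted combination of sub-index scores. A minimal sketch, with weights and scores invented purely for illustration:

```python
# Qualitative risk value as a weighted sum of sub-indices (causation, inherent
# risk, consequence). The weights and 0-10 scores below are hypothetical; the
# paper derives its own index definitions and weights.

weights = {"causation": 0.4, "inherent": 0.35, "consequence": 0.25}
scores = {"causation": 6.0, "inherent": 4.0, "consequence": 7.0}

# Weighted combination into a single qualitative risk value
risk_value = sum(weights[k] * scores[k] for k in weights)
```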

  3. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
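
    The distribution-fitting-plus-bootstrap workflow that such a survey supports can be sketched as follows. The variable (consumer storage time), the lognormal model, and all numbers are illustrative stand-ins, not the survey's actual data or chosen distributions.

```python
import numpy as np

# Fit a parametric distribution to a quantitative food-handling variable and
# bootstrap the fit to express parameter uncertainty, as a QMRA input would need.
rng = np.random.default_rng(7)
storage_days = rng.lognormal(mean=1.0, sigma=0.5, size=300)  # stand-in survey answers

def fit_lognormal(x):
    """Maximum-likelihood-style lognormal fit: mean and sd on the log scale."""
    logs = np.log(x)
    return logs.mean(), logs.std(ddof=1)

mu_hat, sigma_hat = fit_lognormal(storage_days)

# Nonparametric bootstrap: refit on resampled respondents to get uncertainty
boot = np.array([
    fit_lognormal(rng.choice(storage_days, size=storage_days.size, replace=True))
    for _ in range(1000)
])
mu_lo, mu_hi = np.percentile(boot[:, 0], [2.5, 97.5])  # 95% interval for mu
```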

  4. Repeatability Assessment by ISO 11843-7 in Quantitative HPLC for Herbal Medicines.

    PubMed

    Chen, Liangmian; Kotani, Akira; Hakamata, Hideki; Tsutsumi, Risa; Hayashi, Yuzuru; Wang, Zhimin; Kusu, Fumiyo

    2015-01-01

    We have proposed an assessment method to estimate the measurement relative standard deviation (RSD) of chromatographic peaks in quantitative HPLC for herbal medicines by the methodology of ISO 11843 Part 7 (ISO 11843-7:2012), which provides detection limits stochastically. In quantitative HPLC with UV detection (HPLC-UV) of Scutellaria Radix for the determination of baicalin, the measurement RSD of baicalin estimated stochastically by ISO 11843-7:2012 was within the 95% confidence interval of the RSD obtained statistically by repetitive measurements (n = 6). Thus, our findings show that the method is applicable for estimating the repeatability of HPLC-UV for determining baicalin without repeated measurements. In addition, the allowable limit of the "System repeatability" in "Liquid Chromatography" regulated in a pharmacopoeia can be obtained by the present assessment method. Moreover, the present assessment method was also successfully applied to estimate the measurement RSDs of quantitative three-channel liquid chromatography with electrochemical detection (LC-3ECD) of Chrysanthemi Flos for determining caffeoylquinic acids and flavonoids. By the present repeatability assessment method, a reliable measurement RSD was obtained stochastically, and the experimental time was remarkably reduced. PMID:26353956

  5. The edaphic quantitative protargol stain: a sampling protocol for assessing soil ciliate abundance and diversity.

    PubMed

    Acosta-Mercado, Dimaris; Lynn, Denis H

    2003-06-01

    It has been suggested that species loss from microbial groups low in diversity that occupy trophic positions close to the base of the detrital food web could be critical for terrestrial ecosystem functioning. Among the protozoans within the soil microbial loop, ciliates are presumably the least abundant and of low diversity. However, the lack of a standardized method to quantitatively enumerate and identify them has hampered our knowledge about the magnitude of their active and potential diversity, and about the interactions in which they are involved. Thus, the Edaphic Quantitative Protargol Staining (EQPS) method is provided to simultaneously account for ciliate species richness and abundance in a quantitative and qualitative way. This direct method allows this rapid and simultaneous assessment by merging the Non-flooded Petri Dish (NFPD) method [Prog. Protistol. 2 (1987) 69] and the Quantitative Protargol Stain (QPS) method [Montagnes, D.J.S., Lynn, D.H., 1993. A quantitative protargol stain (QPS) for ciliates and other protists. In: Kemp, P.F., Sherr, B.F., Sherr, E.B., Cole, J.J. (Eds.), Handbook of Methods in Aquatic Microbial Ecology. Lewis Publishers, Boca Raton, FL, pp. 229-240]. The abovementioned protocols were refined by experiments examining the spatial distribution of ciliates under natural field conditions, sampling intensity, the effect of storage, and the use of cytological preparations versus live observations. The EQPS could be useful in ecological studies since it provides both a "snapshot" of the active and effective diversity and a robust estimate of the potential diversity.

  6. Dynamic heart-in-thorax phantom for functional SPECT

    SciTech Connect

    Celler, A.; Lyster, D.; Farncombe, T.

    1996-12-31

    We have designed and built a dynamic heart-in-thorax phantom to be used as a primary tool during the experimental verification of the performance of the quantitative dynamic functional imaging method we are developing for standard rotating single photon emission computed tomography (SPECT) cameras. The phantom consists of two independent parts: (i) a dynamic heart model with the possibility of mounting "defects" inside it, and (ii) a non-uniform thorax model with lungs and spinal cord. Its design uses the fact that the washout of a tracer by dilution is governed by a linear first-order equation, the same type of equation used to model the time-activity distribution in myocardial viability studies. Tests of the dynamic performance of the phantom in planar scanning mode have confirmed the validity of these assumptions. The preliminary results obtained in SPECT mode also show that the values of characteristic times could be experimentally determined and that these values agreed well with the values preset on the phantom. We consider the phantom ready for extensive use in studies on the development of the dynamic SPECT method.
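
    The phantom's governing relation, first-order washout by dilution, makes the time-activity curve a single exponential, so a characteristic time preset on the phantom can be recovered by a straight-line fit to the log of the measured activity. A minimal sketch with simulated noise-free data (the preset value is invented):

```python
import numpy as np

# First-order washout: dA/dt = -A/tau  =>  A(t) = A0 * exp(-t/tau).
# The characteristic time tau follows from a linear fit to log(A) versus t.

tau_true = 8.0                      # minutes (hypothetical preset on the phantom)
t = np.linspace(0.0, 30.0, 16)      # sampling times
activity = 100.0 * np.exp(-t / tau_true)

slope, intercept = np.polyfit(t, np.log(activity), 1)
tau_est = -1.0 / slope              # recovered characteristic time
```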

  7. Quantitative assessment of binding affinities for nanoparticles targeted to vulnerable plaque.

    PubMed

    Tang, Tang; Tu, Chuqiao; Chow, Sarah Y; Leung, Kevin H; Du, Siyi; Louie, Angelique Y

    2015-06-17

    Recent successes in targeted immune and cell-based therapies have driven new directions for pharmaceutical research. With the rise of these new therapies there is an unfilled need for companion diagnostics to assess patients' potential for therapeutic response. Targeted nanomaterials have been widely investigated to fill this niche; however, in contrast to small molecule or peptide-based targeted agents, binding affinities are not reported for nanomaterials, and to date there has been no standard, quantitative measure for the interaction of targeted nanoparticle agents with their targets. Without a standard measure, accurate comparisons between systems and optimization of targeting behavior are challenging. Here, we demonstrate a method for quantitative assessment of the binding affinity for targeted nanoparticles to cell surface receptors in living systems and apply it to optimize the development of a novel targeted nanoprobe for imaging vulnerable atherosclerotic plaques. In this work, we developed sulfated dextran-coated iron oxide nanoparticles with specific targeting to macrophages, a cell type whose density strongly correlates with plaque vulnerability. Detailed quantitative, in vitro characterizations of (111)In(3+) radiolabeled probes show high-affinity binding to the macrophage scavenger receptor A (SR-A). Cell uptake studies illustrate that higher surface sulfation levels result in much higher uptake efficiency by macrophages. We use a modified Scatchard analysis to quantitatively describe nanoparticle binding to targeted receptors. This characterization represents a potential new standard metric for targeted nanomaterials. PMID:25970303
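
    A Scatchard analysis of the kind mentioned linearizes single-site binding data: plotting bound/free against bound gives a line with slope -1/Kd and x-intercept Bmax. A sketch with idealized synthetic binding data (all values invented, not the paper's measurements):

```python
import numpy as np

# Single-site binding (Langmuir isotherm): bound = Bmax * free / (Kd + free),
# which rearranges to the Scatchard form bound/free = (Bmax - bound) / Kd.
Kd_true, Bmax_true = 2.0, 50.0                 # hypothetical affinity and site density

free = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])   # free ligand concentrations
bound = Bmax_true * free / (Kd_true + free)         # simulated equilibrium binding

# Linear fit of the Scatchard plot: slope = -1/Kd, x-intercept = Bmax
slope, intercept = np.polyfit(bound, bound / free, 1)
Kd_est = -1.0 / slope
Bmax_est = -intercept / slope
```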

  8. Reliability of quantitative ultrasonic assessment of normal-tissue toxicity in breast cancer radiotherapy

    PubMed Central

    Yoshida, Emi J.; Chen, Hao; Torres, Mylin; Andic, Fundagul; Liu, Hao-Yang; Chen, Zhengjia; Sun, Xiaoyan; Curran, Walter J; Liu, Tian

    2011-01-01

    Purpose We have recently reported that ultrasound imaging, together with ultrasound tissue characterization (UTC), can provide quantitative assessment of radiation-induced normal-tissue toxicity. This study’s purpose is to evaluate the reliability of our quantitative ultrasound technology in assessing acute and late normal-tissue toxicity in breast cancer radiotherapy. Method and Materials Our ultrasound technique analyzes radio-frequency echo signals and provides quantitative measures of dermal, hypodermal, and glandular-tissue toxicities. To facilitate easy clinical implementation, we further refined this technique by developing a semi-automatic ultrasound-based toxicity assessment tool (UBTAT). Seventy-two ultrasound studies of 26 patients (720 images) were analyzed. Images of 8 patients were evaluated for acute toxicity (<6 months post radiotherapy) and those of 18 patients were evaluated for late toxicity (≥6 months post radiotherapy). All patients were treated according to a standard radiotherapy protocol. To assess intra-observer reliability, one observer analyzed 720 images in UBTAT and then repeated the analysis 3 months later. To assess inter-observer reliability, three observers (two radiation oncologists and one ultrasound expert) each analyzed 720 images in UBTAT. An intraclass correlation coefficient (ICC) was used to evaluate intra- and inter-observer reliability. Ultrasound assessment and clinical evaluation were also compared. Results Intra-observer ICC was 0.89 for dermal toxicity, 0.74 for hypodermal toxicity, and 0.96 for glandular-tissue toxicity. Inter-observer ICC was 0.78 for dermal toxicity, 0.74 for hypodermal toxicity, and 0.94 for glandular-tissue toxicity. Statistical analysis found significant changes in dermal (p < 0.0001), hypodermal (p = 0.0027), and glandular-tissue (p < 0.0001) assessments in the acute toxicity group. Ultrasound measurements correlated with clinical RTOG toxicity scores of patients in the late toxicity group.
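
    The intraclass correlation coefficient used here can be computed from an images-by-observers score matrix. The sketch below implements the one-way random-effects form ICC(1,1) on invented ratings; the exact ICC model the authors chose may differ.

```python
import numpy as np

def icc_oneway(scores):
    """One-way random-effects ICC(1,1); scores is an (n_targets, k_raters) array."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)
    msb = k * ((row_means - grand) ** 2).sum() / (n - 1)              # between targets
    msw = ((scores - row_means[:, None]) ** 2).sum() / (n * (k - 1))  # within targets
    return (msb - msw) / (msb + (k - 1) * msw)

# Three observers scoring five images (toy data with strong agreement)
ratings = np.array([
    [1.0, 1.1, 0.9],
    [2.0, 2.2, 1.9],
    [3.1, 3.0, 3.2],
    [4.0, 3.9, 4.1],
    [5.0, 5.1, 4.8],
])
icc = icc_oneway(ratings)
```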

  9. Quantitative assessment of radiation force effect at the dielectric air-liquid interface

    PubMed Central

    Capeloto, Otávio Augusto; Zanuto, Vitor Santaella; Malacarne, Luis Carlos; Baesso, Mauro Luciano; Lukasievicz, Gustavo Vinicius Bassi; Bialkowski, Stephen Edward; Astrath, Nelson Guilherme Castelli

    2016-01-01

    We induce nanometer-scale surface deformation by exploiting momentum conservation of the interaction between laser light and dielectric liquids. The effect of radiation force at the air-liquid interface is quantitatively assessed for fluids with different density, viscosity and surface tension. The imparted pressure on the liquids by continuous or pulsed laser light excitation is fully described by the Helmholtz electromagnetic force density. PMID:26856622

  10. Postoperative Quantitative Assessment of Reconstructive Tissue Status in Cutaneous Flap Model using Spatial Frequency Domain Imaging

    PubMed Central

    Yafi, Amr; Vetter, Thomas S; Scholz, Thomas; Patel, Sarin; Saager, Rolf B; Cuccia, David J; Evans, Gregory R; Durkin, Anthony J

    2010-01-01

    Background The purpose of this study is to investigate the capabilities of a novel optical wide-field imaging technology known as Spatial Frequency Domain Imaging (SFDI) to quantitatively assess reconstructive tissue status. Methods Twenty-two cutaneous pedicle flaps were created on eleven rats based on the inferior epigastric vessels. After baseline measurement, all flaps underwent vascular ischemia, induced by clamping the supporting vessels for two hours (either arterio-venous or selective venous occlusion). Normal saline was injected into the control flap, and hypertonic hyperoncotic saline solution into the experimental flap. Flaps were monitored for two hours after reperfusion. The SFDI system was used for quantitative assessment of flap status over the duration of the experiment. Results All flaps demonstrated a significant decline in oxy-hemoglobin and tissue oxygen saturation in response to occlusion. Total hemoglobin and deoxy-hemoglobin were markedly increased in the selective venous occlusion group. After reperfusion and administration of the solutions, oxy-hemoglobin and tissue oxygen saturation in the flaps that survived gradually returned to baseline levels. However, flaps for which oxy-hemoglobin and tissue oxygen saturation did not show any signs of recovery appeared to be compromised and eventually became necrotic within 24–48 hours in both occlusion groups. Conclusion SFDI technology provides a quantitative, objective method to assess tissue status. This study demonstrates the potential of this optical technology to assess tissue perfusion in a very precise and quantitative way, enabling wide-field visualization of physiological parameters. The results of this study suggest that SFDI may provide a means for prospectively identifying dysfunctional flaps well in advance of failure. PMID:21200206

  11. A method of quantitative risk assessment for transmission pipeline carrying natural gas.

    PubMed

    Jo, Young-Do; Ahn, Bum Jong

    2005-08-31

    Regulatory authorities in many countries are moving away from prescriptive approaches for keeping natural gas pipelines safe. As an alternative, risk management based on a quantitative assessment is being considered to improve the level of safety. This paper focuses on the development of a simplified method for the quantitative risk assessment for natural gas pipelines and introduces parameters of fatal length and cumulative fatal length. The fatal length is defined as the integrated fatality along the pipeline associated with hypothetical accidents. The cumulative fatal length is defined as the section of pipeline in which an accident leads to N or more fatalities. These parameters can be estimated easily by using the information of pipeline geometry and population density of a Geographic Information Systems (GIS). To demonstrate the proposed method, individual and societal risks for a sample pipeline have been estimated from the historical data of European Gas Pipeline Incident Data Group and BG Transco. With currently acceptable criteria taken into account for individual risk, the minimum proximity of the pipeline to occupied buildings is approximately proportional to the square root of the operating pressure of the pipeline. The proposed method of quantitative risk assessment may be useful for risk management during the planning and building stages of a new pipeline, and modification of a buried pipeline.
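
    The fatal length can be sketched numerically as the fatality probability for a fixed receptor, integrated over accident positions along the pipeline axis. The step hazard model and all distances below are illustrative, not the paper's consequence models:

```python
import numpy as np

# "Fatal length" sketch: integrate the fatality probability for one receptor
# over all hypothetical accident positions along the pipeline. A step hazard
# model is assumed: certain fatality inside a hazard radius, none outside.

def fatal_length(receptor_offset, hazard_radius, pipe_from=-50.0, pipe_to=50.0, n=100001):
    l = np.linspace(pipe_from, pipe_to, n)            # accident positions (m)
    dist = np.hypot(l, receptor_offset)               # accident-to-receptor distance
    p_fatal = (dist <= hazard_radius).astype(float)   # step fatality probability
    dl = l[1] - l[0]
    return float(p_fatal.sum() * dl)                  # rectangle-rule integral

# Receptor 30 m from the line, 50 m hazard radius: accidents within
# sqrt(50^2 - 30^2) = 40 m on either side are fatal, so FL is about 80 m.
fl = fatal_length(receptor_offset=30.0, hazard_radius=50.0)
```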

  12. The Quantitative Reasoning for College Science (QuaRCS) Assessment in non-Astro 101 Courses

    NASA Astrophysics Data System (ADS)

    Kirkman, Thomas W.; Jensen, Ellen

    2016-06-01

    The innumeracy of American students and adults is a much lamented educational problem. The quantitative reasoning skills of college students may be particularly addressed and improved in "general education" science courses like Astro 101. Demonstrating improvement requires a standardized instrument. Among the non-proprietary instruments the Quantitative Literacy and Reasoning Assessment[1] (QRLA) and the Quantitative Reasoning for College Science (QuaRCS) Assessment[2] stand out.Follette et al. developed the QuaRCS in the context of Astro 101 at University of Arizona. We report on QuaRCS results in different contexts: pre-med physics and pre-nursing microbiology at a liberal arts college. We report on the mismatch between students' contemporaneous report of a question's difficulty and the actual probability of success. We report correlations between QuaRCS and other assessments of overall student performance in the class. We report differences in attitude towards mathematics in these two different but health-related student populations .[1] QLRA, Gaze et al., 2014, DOI: http://dx.doi.org/10.5038/1936-4660.7.2.4[2] QuaRCS, Follette, et al., 2015, DOI: http://dx.doi.org/10.5038/1936-4660.8.2.2

  13. Summary of the workshop on issues in risk assessment: quantitative methods for developmental toxicology.

    PubMed

    Mattison, D R; Sandler, J D

    1994-08-01

    This report summarizes the proceedings of a conference on quantitative methods for assessing the risks of developmental toxicants. The conference was planned by a subcommittee of the National Research Council's Committee on Risk Assessment Methodology in conjunction with staff from several federal agencies, including the U.S. Environmental Protection Agency, U.S. Food and Drug Administration, U.S. Consumer Products Safety Commission, and Health and Welfare Canada. Issues discussed at the workshop included computerized techniques for hazard identification, use of human and animal data for defining risks in a clinical setting, relationships between end points in developmental toxicity testing, reference dose calculations for developmental toxicology, analysis of quantitative dose-response data, mechanisms of developmental toxicity, physiologically based pharmacokinetic models, and structure-activity relationships. Although a formal consensus was not sought, many participants favored the evolution of quantitative techniques for developmental toxicology risk assessment, including the replacement of lowest observed adverse effect levels (LOAELs) and no observed adverse effect levels (NOAELs) with the benchmark dose methodology.

  14. Combining qualitative and quantitative imaging evaluation for the assessment of genomic DNA integrity: The SPIDIA experience.

    PubMed

    Ciniselli, Chiara Maura; Pizzamiglio, Sara; Malentacchi, Francesca; Gelmini, Stefania; Pazzagli, Mario; Hartmann, Christina C; Ibrahim-Gawel, Hady; Verderio, Paolo

    2015-06-15

    In this note, we present an ad hoc procedure that combines qualitative (visual evaluation) and quantitative (ImageJ software) evaluations of Pulsed-Field Gel Electrophoresis (PFGE) images to assess the genomic DNA (gDNA) integrity of analyzed samples. This procedure could be suitable for the analysis of a large number of images by taking into consideration both the expertise of researchers and the objectiveness of the software. We applied this procedure on the first SPIDIA DNA External Quality Assessment (EQA) samples. Results show that the classification obtained by this ad hoc procedure allows a more accurate evaluation of gDNA integrity with respect to a single approach.

  15. A framework for quantitative assessment of impacts related to energy and mineral resource development

    USGS Publications Warehouse

    Haines, Seth S.; Diffendorfer, James; Balistrieri, Laurie S.; Berger, Byron R.; Cook, Troy A.; Gautier, Donald L.; Gallegos, Tanya J.; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen; Macknick, Jordan; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katie

    2013-01-01

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.
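
    The probabilistic combination the framework describes can be sketched with Monte Carlo sampling: draw an uncertain resource volume and uncertain development-intensity factors, and propagate them to a distribution of impact. All distributions and parameters below are invented for illustration, not USGS assessment inputs.

```python
import numpy as np

# Monte Carlo propagation: uncertain gas volume x wells per volume x footprint
# per well -> distribution of disturbed acreage, reported as percentiles.
rng = np.random.default_rng(42)
n = 100_000

gas_tcf = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=n)   # recoverable gas (tcf)
wells_per_tcf = rng.triangular(200.0, 300.0, 450.0, size=n)    # development intensity
acres_per_well = rng.uniform(4.0, 8.0, size=n)                 # pad + road footprint

impact_acres = gas_tcf * wells_per_tcf * acres_per_well
p5, p50, p95 = np.percentile(impact_acres, [5, 50, 95])        # uncertainty range
```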

  16. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    PubMed

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness in the risk assessment process has been qualitatively remarked on by some authorities, absent is direction on how to refine the process with quantitative metrics of attractiveness. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; plant and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use sugar nectar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry. PMID:27197566
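
    The proposed refinement amounts to scaling the exposure term of a tier I risk quotient by a crop attractiveness factor before dividing by the toxicity endpoint. A minimal sketch; the factor and all exposure/toxicity numbers are invented, not values from the Perspective.

```python
# Tier I risk quotient with a crop attractiveness factor (AF) applied to the
# exposure estimate. An RQ above a level of concern triggers refinement.

def risk_quotient(exposure, toxicity, attractiveness_factor=1.0):
    """RQ = (exposure * AF) / toxicity."""
    return exposure * attractiveness_factor / toxicity

eec = 0.8    # estimated exposure, e.g. ug a.i. per bee per day (hypothetical)
ld50 = 4.0   # toxicity endpoint, e.g. ug a.i. per bee (hypothetical)

rq_default = risk_quotient(eec, ld50)                                   # unadjusted
rq_low_attract = risk_quotient(eec, ld50, attractiveness_factor=0.25)   # low-sugar crop
```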

  17. A methodology for the quantitative risk assessment of major accidents triggered by seismic events.

    PubMed

    Antonioni, Giacomo; Spadoni, Gigliola; Cozzani, Valerio

    2007-08-17

    A procedure for the quantitative risk assessment of accidents triggered by seismic events in industrial facilities was developed. The starting point of the procedure was the use of available historical data to assess the expected frequencies and the severity of seismic events. Available equipment-dependent failure probability models (vulnerability or fragility curves) were used to assess the damage probability of equipment items due to a seismic event. An analytic procedure was subsequently developed to identify, evaluate the credibility of, and finally assess the expected consequences of all the possible scenarios that may follow the seismic events. The procedure was implemented in a GIS-based software tool in order to manage the high number of event sequences that are likely to be generated in large industrial facilities. The developed methodology requires a limited amount of additional data with respect to those used in a conventional QRA, and yields, with limited effort, a preliminary quantitative assessment of the contribution of the scenarios triggered by earthquakes to the individual and societal risk indexes. The application of the methodology to several case studies showed that the scenarios initiated by seismic events may have a relevant influence on industrial risk, both raising the overall expected frequency of single scenarios and causing specific severe scenarios simultaneously involving several plant units.
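
    The quantitative core of such a procedure is an event-tree product: seismic event frequency, times equipment damage probability from the fragility curve, times the conditional probability of the final outcome. A minimal sketch; the frequencies and probabilities below are invented, not the paper's data.

```python
# Expected frequency of each seismic-triggered final scenario:
#   f_scenario = f_seismic * P(damage | event) * P(ignition | damage)
# All numbers are illustrative placeholders.

seismic_freq = 1e-3   # events per year exceeding the damage threshold
units = {
    "tank_A": {"p_damage": 0.05, "p_ignition": 0.3},
    "tank_B": {"p_damage": 0.02, "p_ignition": 0.3},
}

scenario_freq = {
    name: seismic_freq * p["p_damage"] * p["p_ignition"]
    for name, p in units.items()
}
# Crude aggregate (ignores simultaneous multi-unit scenarios the paper handles)
total_fire_freq = sum(scenario_freq.values())
```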

  18. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    PubMed

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness has been qualitatively remarked on by some authorities in the risk assessment process, direction on how to refine the process with quantitative metrics of attractiveness is absent. At a high level, the attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual, and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; and plant and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use nectar sugar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes into the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry. PMID:27197566
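
The proposed adjustment can be illustrated with a toy calculation. The sketch below is a hypothetical reading of the proposal, not the authors' formula: a tier I risk quotient in which the exposure term is scaled by a crop attractiveness factor derived from nectar sugar concentration.

```python
def risk_quotient(application_rate, toxicity_endpoint, attractiveness=1.0):
    """Tier I risk quotient: estimated exposure over a toxicity endpoint,
    with exposure scaled by a crop attractiveness factor in [0, 1].
    All parameter names and units here are illustrative."""
    return (application_rate * attractiveness) / toxicity_endpoint

# Hypothetical values: halving attractiveness halves the risk quotient.
rq_full = risk_quotient(2.0, 0.5, attractiveness=1.0)
rq_half = risk_quotient(2.0, 0.5, attractiveness=0.5)
```

Under this reading, an unattractive crop with a low nectar sugar concentration would yield a proportionally lower, crop-specific risk quotient at tier I.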

  19. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    PubMed Central

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C.

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson’s disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020
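
As a minimal illustration of sensor-based tremor quantification (not the paper's least-square-estimation models), the sketch below computes the RMS amplitude of a detrended accelerometer trace; the synthetic 5 Hz signal stands in for parkinsonian tremor, which typically falls in the 4-6 Hz band.

```python
import math

def tremor_rms(accel):
    """RMS amplitude of a detrended acceleration trace, a simple
    severity proxy for tremor quantification."""
    mean = sum(accel) / len(accel)
    return math.sqrt(sum((a - mean) ** 2 for a in accel) / len(accel))

# Synthetic 5 Hz unit-amplitude tremor sampled at 100 Hz for 2 s.
fs = 100.0
signal = [math.sin(2.0 * math.pi * 5.0 * n / fs) for n in range(200)]
rms = tremor_rms(signal)  # ~0.707 for a unit-amplitude sine
```

A practical system would additionally band-pass the signal around the tremor frequencies and, as in the paper, apply time-frequency analysis to detect tremor state before scoring severity.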

  20. Vertebrobasilar insufficiency revealed by xenon-133 inhalation SPECT

    SciTech Connect

    Delecluse, F.; Voordecker, P.; Raftopoulos, C.

    1989-07-01

    A study of cerebral and cerebellar blood flow reactivity to acetazolamide by xenon-133-inhalation single photon emission computed tomography (¹³³Xe SPECT) was carried out in a patient with bouts of transient basilar ischemia, whose neurological examination, computed tomographic scan, and auditory evoked potentials were normal. Though the patient was symptom-free at the time of the study, ¹³³Xe SPECT demonstrated vertebrobasilar insufficiency by showing an impaired vasodilatory response in both the occipital lobes and the right cerebellar hemisphere. Three weeks later, the patient suffered an extensive stroke in these same areas. We therefore suggest that this method could be of great value in the assessment of vertebrobasilar insufficiency.

  1. Quantitative assessment of microvasculopathy in arcAβ mice with USPIO-enhanced gradient echo MRI

    PubMed Central

    Deistung, Andreas; Ielacqua, Giovanna D; Seuwen, Aline; Kindler, Diana; Schweser, Ferdinand; Vaas, Markus; Kipar, Anja; Reichenbach, Jürgen R; Rudin, Markus

    2015-01-01

    Magnetic resonance imaging employing administration of iron oxide-based contrast agents is widely used to visualize cellular and molecular processes in vivo. In this study, we investigated the ability of R2* and quantitative susceptibility mapping to quantitatively assess the accumulation of ultrasmall superparamagnetic iron oxide (USPIO) particles in the arcAβ mouse model of cerebral amyloidosis. Gradient-echo data of mouse brains were acquired at 9.4 T after injection of USPIO. Focal areas with increased magnetic susceptibility and R2* values were discernible across several brain regions in 12-month-old arcAβ compared to 6-month-old arcAβ mice and to non-transgenic littermates, indicating accumulation of particles after USPIO injection. This was concomitant with higher R2* and increased magnetic susceptibility differences relative to cerebrospinal fluid measured in USPIO-injected compared to non-USPIO-injected 12-month-old arcAβ mice. No differences in R2* and magnetic susceptibility were detected in USPIO-injected compared to non-injected 12-month-old non-transgenic littermates. Histological analysis confirmed focal uptake of USPIO particles in perivascular macrophages adjacent to small caliber cerebral vessels with radii of 2–8 µm that showed no cerebral amyloid angiopathy. USPIO-enhanced R2* and quantitative susceptibility mapping constitute quantitative tools to monitor such functional microvasculopathies. PMID:26661253
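
The R2* values above come from fitting a monoexponential decay S(TE) = S0 · exp(-R2* · TE) to multi-echo gradient-echo magnitudes. The paper does not spell out its fitting routine; a common log-linear least-squares approach, shown on synthetic noiseless data, might look like:

```python
import math

def fit_r2star(echo_times, signals):
    """Least-squares line fit of ln(S) against TE for the monoexponential
    decay S(TE) = S0 * exp(-R2* * TE); returns R2* in 1/s."""
    n = len(echo_times)
    ys = [math.log(s) for s in signals]
    mt = sum(echo_times) / n
    my = sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(echo_times, ys))
             / sum((t - mt) ** 2 for t in echo_times))
    return -slope  # R2* is the negative slope of ln(S) versus TE

# Synthetic noiseless decay: S0 = 100, R2* = 50 1/s, TEs in seconds.
tes = [0.002, 0.005, 0.010, 0.020]
sig = [100.0 * math.exp(-50.0 * te) for te in tes]
r2s = fit_r2star(tes, sig)
```

USPIO accumulation shortens T2*, so voxels with particle uptake show elevated R2* relative to non-injected controls.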

  2. Quantitative assessment of microvasculopathy in arcAβ mice with USPIO-enhanced gradient echo MRI.

    PubMed

    Klohs, Jan; Deistung, Andreas; Ielacqua, Giovanna D; Seuwen, Aline; Kindler, Diana; Schweser, Ferdinand; Vaas, Markus; Kipar, Anja; Reichenbach, Jürgen R; Rudin, Markus

    2016-09-01

    Magnetic resonance imaging employing administration of iron oxide-based contrast agents is widely used to visualize cellular and molecular processes in vivo. In this study, we investigated the ability of R2* and quantitative susceptibility mapping to quantitatively assess the accumulation of ultrasmall superparamagnetic iron oxide (USPIO) particles in the arcAβ mouse model of cerebral amyloidosis. Gradient-echo data of mouse brains were acquired at 9.4 T after injection of USPIO. Focal areas with increased magnetic susceptibility and R2* values were discernible across several brain regions in 12-month-old arcAβ compared to 6-month-old arcAβ mice and to non-transgenic littermates, indicating accumulation of particles after USPIO injection. This was concomitant with higher R2* and increased magnetic susceptibility differences relative to cerebrospinal fluid measured in USPIO-injected compared to non-USPIO-injected 12-month-old arcAβ mice. No differences in R2* and magnetic susceptibility were detected in USPIO-injected compared to non-injected 12-month-old non-transgenic littermates. Histological analysis confirmed focal uptake of USPIO particles in perivascular macrophages adjacent to small caliber cerebral vessels with radii of 2-8 µm that showed no cerebral amyloid angiopathy. USPIO-enhanced R2* and quantitative susceptibility mapping constitute quantitative tools to monitor such functional microvasculopathies. PMID:26661253

  3. C-SPECT - a Clinical Cardiac SPECT/TCT Platform: Design Concepts and Performance Potential

    PubMed Central

    Chang, Wei; Ordonez, Caesar E.; Liang, Haoning; Li, Yusheng; Liu, Jingai

    2013-01-01

    Because of the scarcity of photons emitted from the heart, clinical cardiac SPECT imaging is mainly limited by photon statistics. The sub-optimal detection efficiency of current SPECT systems not only limits the quality of clinical cardiac SPECT imaging but also makes more advanced potential applications difficult to realize. We propose a high-performance system platform, C-SPECT, whose sampling geometry is optimized for both the quality and the quantity of detected photon emissions. C-SPECT has a stationary C-shaped gantry that surrounds the left-front side of a patient's thorax. The stationary C-shaped collimator and detector systems in the gantry provide effective and efficient detection and sampling of photon emission. For cardiac imaging, the C-SPECT platform could achieve 2 to 4 times the system geometric efficiency of conventional SPECT systems at the same sampling resolution. The platform also includes an integrated transmission CT for attenuation correction. The ability of C-SPECT systems to perform sequential high-quality emission and transmission imaging could bring cost-effective, high-performance imaging to the clinic. In addition, a C-SPECT system could provide detection efficiency high enough to support the fast acquisition rates needed for gated and dynamic cardiac imaging. This paper describes the design concepts and performance potential of C-SPECT, and illustrates how these concepts can be implemented in a basic system. PMID:23885129

  4. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients.

    PubMed

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-02-05

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have two drawbacks: (1) they are susceptible to subjective factors, and (2) they have only a few rating levels and suffer from a ceiling effect, making it impossible to detect further improvement in movement precisely. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems, since they are often battery-operated. Traditionally, wearable sensor network systems that follow the Shannon/Nyquist sampling theorem must sample and transmit a large amount of data. This paper proposes a novel wearable sensor network system, based on compressed sensing technology, to monitor and quantitatively assess upper limb motion function. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder touch exercises can be compressed, with the compressed signal length less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. The results also indicate that the proposed system not only reduces the amount of data sampled and transmitted but also yields reconstructed accelerometer signals that can be used for quantitative assessment without any loss of useful information.
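
A full compressed-sensing chain needs a measurement matrix and a sparse-recovery solver; as a self-contained sketch of just the sparse-representation idea the system relies on, the code below builds a signal that is sparse in the DCT domain and shows it is fully described by fewer than a third of its coefficients. All signals are synthetic; this is not the authors' pipeline.

```python
import math

def dct(x):
    """DCT-II of x (scaled by 2/N so that idct() below inverts it)."""
    N = len(x)
    return [(2.0 / N) * sum(x[n] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                            for n in range(N)) for k in range(N)]

def idct(c):
    """Inverse transform (DCT-III) matching the dct() scaling above."""
    N = len(c)
    return [c[0] / 2.0 + sum(c[k] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                             for k in range(1, N)) for n in range(N)]

# Build a synthetic signal that is 3-sparse in the DCT domain.
N = 32
c_true = [0.0] * N
c_true[2], c_true[5], c_true[9] = 1.0, 0.5, 0.25
sig = idct(c_true)

# Keep only the k largest coefficients: k < N/3 values describe sig exactly.
k = 8
coeffs = dct(sig)
keep = set(sorted(range(N), key=lambda i: -abs(coeffs[i]))[:k])
sparse = [c if i in keep else 0.0 for i, c in enumerate(coeffs)]
rec = idct(sparse)
err = max(abs(a - b) for a, b in zip(sig, rec))
```

Compressed sensing exploits exactly this kind of transform-domain sparsity: because the signal has few significant coefficients, far fewer incoherent measurements than Nyquist samples suffice for reconstruction.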

  5. Evaluating quantitative formulas for dose-response assessment of chemical mixtures.

    PubMed Central

    Hertzberg, Richard C; Teuschler, Linda K

    2002-01-01

    Risk assessment formulas are often distinguished from dose-response models by being rough but necessary. The evaluation of these rough formulas is described here, using the example of mixture risk assessment. Two conditions make the dose-response part of mixture risk assessment difficult: lack of data on mixture dose-response relationships, and the need to address risk from combinations of chemicals because of public demands and statutory requirements. Consequently, the U.S. Environmental Protection Agency has developed methods for carrying out quantitative dose-response assessment for chemical mixtures that require information only on the toxicity of single chemicals and of chemical pair interactions. These formulas are based on plausible ideas and default parameters but minimal supporting data on whole mixtures. Because of this lack of mixture data, the usual evaluation of accuracy (predicted vs. observed) cannot be performed. Two approaches to the evaluation of such formulas are to consider fundamental biological concepts that support the quantitative formulas (e.g., toxicologic similarity) and to determine how well the proposed method performs under simplifying constraints (e.g., as the toxicologic interactions disappear). These ideas are illustrated using dose addition and two weight-of-evidence formulas for incorporating toxicologic interactions. PMID:12634126
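
Dose addition, one of the formulas discussed, is commonly operationalized as a hazard index: the sum of each component's dose divided by its reference dose. A minimal sketch with hypothetical numbers follows; the weight-of-evidence formulas mentioned above add interaction-based correction terms on top of this baseline.

```python
def hazard_index(doses, reference_doses):
    """Dose addition for a chemical mixture: sum of dose/reference-dose
    ratios. HI <= 1 is conventionally read as no appreciable concern."""
    return sum(d / rfd for d, rfd in zip(doses, reference_doses))

# Hypothetical 3-chemical mixture (doses and RfDs in mg/kg-day).
hi = hazard_index([0.1, 0.02, 0.5], [1.0, 0.1, 2.0])  # 0.1 + 0.2 + 0.25
```

Note how the formula needs only single-chemical toxicity values, which is precisely why it can be evaluated only through the indirect approaches (biological plausibility, behavior under simplifying constraints) the abstract describes.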

  6. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients.

    PubMed

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-01-01

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have two drawbacks: (1) they are susceptible to subjective factors, and (2) they have only a few rating levels and suffer from a ceiling effect, making it impossible to detect further improvement in movement precisely. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems, since they are often battery-operated. Traditionally, wearable sensor network systems that follow the Shannon/Nyquist sampling theorem must sample and transmit a large amount of data. This paper proposes a novel wearable sensor network system, based on compressed sensing technology, to monitor and quantitatively assess upper limb motion function. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder touch exercises can be compressed, with the compressed signal length less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. The results also indicate that the proposed system not only reduces the amount of data sampled and transmitted but also yields reconstructed accelerometer signals that can be used for quantitative assessment without any loss of useful information. PMID:26861337

  7. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients

    PubMed Central

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-01-01

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have two drawbacks: (1) they are susceptible to subjective factors, and (2) they have only a few rating levels and suffer from a ceiling effect, making it impossible to detect further improvement in movement precisely. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems, since they are often battery-operated. Traditionally, wearable sensor network systems that follow the Shannon/Nyquist sampling theorem must sample and transmit a large amount of data. This paper proposes a novel wearable sensor network system, based on compressed sensing technology, to monitor and quantitatively assess upper limb motion function. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder touch exercises can be compressed, with the compressed signal length less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. The results also indicate that the proposed system not only reduces the amount of data sampled and transmitted but also yields reconstructed accelerometer signals that can be used for quantitative assessment without any loss of useful information. PMID:26861337

  8. Fibrosis assessment: impact on current management of chronic liver disease and application of quantitative invasive tools.

    PubMed

    Wang, Yan; Hou, Jin-Lin

    2016-05-01

    Fibrosis, a common pathogenic pathway of chronic liver disease (CLD), has long been recognized as strongly associated with poor prognosis. Nowadays, with remarkable advances in the understanding and treatment of major CLDs such as hepatitis C, hepatitis B, and nonalcoholic fatty liver disease, there is an unprecedented need for the diagnosis and assessment of liver fibrosis or cirrhosis in various clinical settings. Among the available approaches, liver biopsy remains the one that provides the most direct and reliable information regarding fibrosis patterns and parenchymal changes at different clinical stages and with different etiologies. Thus, many endeavors have been undertaken to develop methodologies based on a quantitation strategy for this invasive assessment. Here, we analyze the impact of fibrosis assessment on CLD patient care based on data from recent clinical studies. We discuss and update the current invasive tools regarding their technological features and potential for particular clinical applications. Furthermore, we propose potential resolutions, applying quantitative invasive tools, for some major issues in fibrosis assessment that remain obstacles to the rapid progress now occurring in CLD medicine.

  9. Bayesian reconstruction strategy of fluorescence-mediated tomography using an integrated SPECT-CT-OT system

    NASA Astrophysics Data System (ADS)

    Cao, Liji; Peter, Jörg

    2010-05-01

    Building on a triple-modality SPECT-CT-OT small-animal imaging system that provides intrinsically co-registered projection data from all three submodalities, and assuming dual-labeled probes consisting of both fluorophores and radionuclides, this paper presents a novel multi-modal reconstruction strategy aimed at improving fluorescence-mediated tomography (FMT). The following reconstruction procedure is proposed: firstly, standard x-ray CT image reconstruction is performed employing the FDK algorithm. Secondly, standard SPECT image reconstruction is performed using OSEM. Thirdly, the surface boundary of the imaged object is extracted from the reconstructed CT volume for finite element definition. Finally, the reconstructed SPECT data are used as a priori information within a Bayesian reconstruction framework for optical (FMT) reconstruction. We provide results of this multi-modal approach using phantom experimental data and illustrate that this strategy suppresses artifacts and facilitates quantitative analysis for optical imaging studies.

  10. Implementation of OSEM mesh-domain SPECT reconstruction with explicit prior information

    NASA Astrophysics Data System (ADS)

    Krol, Andrzej; Vogelsang, Levon; Lu, Yao; Xu, Yuesheng; Hu, Xiaofei; Shen, Lixin; Feiglin, David; Lipson, Edward

    2009-02-01

    In order to improve reconstructed image quality, we investigated the performance of OSEM mesh-domain SPECT reconstruction with explicit prior anatomical and physiological information, which was used to perform accurate attenuation compensation. This was accomplished in the following steps: (i) obtain an anatomical and physiological atlas of the region of interest; (ii) generate a mesh that encodes the properties of the atlas; (iii) perform an initial pixel-based reconstruction on the projection dataset; (iv) register the expected emission atlas to the initial pixel-based reconstruction and apply the resulting transformation to the meshed atlas; (v) perform reconstruction in the mesh domain using the deformed mesh of the atlas. This approach was tested on synthetic noise-free and noisy SPECT data. Comparative quantitative analysis demonstrated that this method outperformed pixel-based OSEM with uniform AC and is a promising approach that might lead to improved SPECT reconstruction quality.
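
Step (iii), pixel-based OSEM reconstruction, reduces to the classic MLEM multiplicative update when a single subset is used. The sketch below runs that update on a tiny synthetic system (a hypothetical 3x2 projector with noiseless data); a real implementation operates on mesh elements and folds attenuation factors into the system matrix A.

```python
def mlem(x, A, y, n_iter=100):
    """MLEM (OSEM with one subset): multiplicative update
    x_j <- (x_j / s_j) * sum_i A_ij * y_i / (A x)_i, with s_j = sum_i A_ij."""
    m, n = len(A), len(A[0])
    sens = [sum(A[i][j] for i in range(m)) for j in range(n)]  # sensitivity
    for _ in range(n_iter):
        proj = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        back = [sum(A[i][j] * y[i] / proj[i] for i in range(m))
                for j in range(n)]
        x = [x[j] * back[j] / sens[j] for j in range(n)]
    return x

# Tiny noiseless system: 3 detector bins, 2 pixels, hypothetical projector.
A = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
x_true = [2.0, 3.0]
y = [sum(a * xt for a, xt in zip(row, x_true)) for row in A]
x_rec = mlem([1.0, 1.0], A, y, n_iter=200)  # converges toward x_true
```

OSEM accelerates this by cycling the update over subsets of projection angles; the mesh-domain variant replaces the pixel basis with finite elements derived from the deformed atlas mesh.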

  11. A remote quantitative Fugl-Meyer assessment framework for stroke patients based on wearable sensor networks.

    PubMed

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-05-01

    To extend the use of wearable sensor networks for stroke patient training and assessment in non-clinical settings, this paper proposes a novel remote quantitative Fugl-Meyer assessment (FMA) framework, in which two accelerometers and seven flex sensors were used to monitor the movement function of the upper limb, wrist, and fingers. An extreme learning machine based ensemble regression model was established to map the sensor data to clinical FMA scores, while the RRelief algorithm was applied to find the optimal feature subset. Because the FMA scale is time-consuming and complicated, seven training exercises were designed to replace the 33 upper-limb-related items in the FMA scale. Twenty-four stroke inpatients participated in the experiments in clinical settings, and 5 of them continued the experiments in home settings after leaving the hospital. The experimental results in both clinical and home settings showed that the proposed quantitative FMA model can precisely predict FMA scores from wearable sensor data; the coefficient of determination reached 0.917. They also indicate that the proposed framework can provide a potential approach to remote quantitative rehabilitation training and evaluation.
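
As a stand-in for the extreme-learning-machine ensemble (which maps many sensor features to FMA scores), the sketch below fits a single-feature ordinary-least-squares model and reports the coefficient of determination, the metric quoted above. The data are synthetic and the model deliberately simplified.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y ~ a*x + b; a simplified stand-in for
    a regression model mapping sensor features to FMA scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def r_squared(xs, ys, a, b):
    """Coefficient of determination of the fit (the paper reports 0.917)."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# Synthetic feature/score pairs lying exactly on a line.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.5 + 1.5 * x for x in xs]
a, b = fit_line(xs, ys)
r2 = r_squared(xs, ys, a, b)
```

The real framework additionally uses RRelief-based feature selection and an ensemble of randomly initialized hidden layers, but the evaluation criterion is this same R².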

  12. A review of small animal imaging planar and pinhole SPECT gamma camera imaging.

    PubMed

    Peremans, Kathelijne; Cornelissen, Bart; Van Den Bossche, Bieke; Audenaert, Kurt; Van de Wiele, Christophe

    2005-01-01

    Scintigraphy (positron emission tomography (PET) or single photon emission computed tomography (SPECT) techniques) allows qualitative and quantitative measurement of physiologic processes as well as alterations secondary to various disease states. With the use of specific radioligands, molecular pathways and pharmacokinetic processes can be investigated. Radioligand delivery can be (semi)quantified in the region of interest in cross-sectional and longitudinal examinations, which can be performed under the same conditions or after physiologic or pharmacologic interventions. Most preclinical pharmacokinetic studies on physiological and experimentally altered physiological processes are performed in laboratory animals using high-resolution imaging systems. Single photon emission imaging has the disadvantage of decreased spatial and temporal resolution compared with PET. The advantages of SPECT are that the equipment is generally more accessible and that commonly used radionuclides have a longer physical half-life, allowing investigations over a longer time interval. This review will focus on single photon emission scintigraphy. An overview of contemporary techniques to measure the biodistribution and kinetics of radiopharmaceuticals in small animals in vivo is presented. Theoretical as well as practical aspects of planar gamma camera and pinhole (PH) SPECT imaging are discussed. Because current research is focusing on refining PH SPECT methodology, specific technical aspects and applications of PH SPECT will be reviewed.

  13. An investigation of inconsistent projections and artefacts in multi-pinhole SPECT with axially aligned pinholes

    NASA Astrophysics Data System (ADS)

    Kench, P. L.; Lin, J.; Gregoire, M. C.; Meikle, S. R.

    2011-12-01

    Multiple pinholes are advantageous for maximizing the use of the available field of view (FOV) of compact small animal single photon emission computed tomography (SPECT) detectors. However, when the pinholes are aligned axially to optimize imaging of extended objects, such as rodents, multiplexing of the pinhole projections can give rise to inconsistent data which leads to 'ghost point' artefacts in the reconstructed volume. A novel four pinhole collimator with a baffle was designed and implemented to eliminate these inconsistent projections. Simulation and physical phantom studies were performed to investigate artefacts from axially aligned pinholes and the efficacy of the baffle in removing inconsistent data and, thus, reducing reconstruction artefacts. SPECT was performed using a Defrise phantom to investigate the impact of collimator design on FOV utilization and axial blurring effects. Multiple pinhole SPECT acquired with a baffle had fewer artefacts and improved quantitative accuracy when compared to SPECT acquired without a baffle. The use of four pinholes positioned in a square maximized the available FOV, increased acquisition sensitivity and reduced axial blurring effects. These findings support the use of a baffle to eliminate inconsistent projection data arising from axially aligned pinholes and improve small animal SPECT reconstructions.

  14. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    PubMed

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical
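
The multinomial multiple-strain step can be sketched as a Monte Carlo allocation of the bacterial load across strains, only some of which produce the toxin. The strain fractions, counts, and threshold below are hypothetical, and this is a drastic simplification of the full QMRA model (which also includes growth, enterotoxin production, and the consumer phase module).

```python
import random

def p_toxigenic_load(total_cfu, strain_fracs, toxigenic, threshold,
                     n_sims=2000, seed=7):
    """Monte Carlo multiple-strain step: allocate total_cfu cells across
    strains (multinomial with probabilities strain_fracs), then estimate
    the probability that cells from toxigenic strains reach a threshold."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        tox = 0
        for _ in range(total_cfu):
            u, acc = rng.random(), 0.0
            for frac, is_tox in zip(strain_fracs, toxigenic):
                acc += frac
                if u < acc:
                    tox += is_tox
                    break
        hits += tox >= threshold
    return hits / n_sims

# Hypothetical profile: 3 strains, only the first produces enterotoxin A.
p = p_toxigenic_load(total_cfu=50, strain_fracs=[0.2, 0.5, 0.3],
                     toxigenic=[1, 0, 0], threshold=5)
```

Treating strain membership as a random draw is what lets the model carry strain-level variability and uncertainty in pathogenicity through to the final exposure estimate.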

  15. Automated Tracking of Quantitative Assessments of Tumor Burden in Clinical Trials

    PubMed Central

    Rubin, Daniel L; Willrett, Debra; O'Connor, Martin J; Hage, Cleber; Kurtz, Camille; Moreira, Dilvan A

    2014-01-01

    There are two key challenges hindering effective use of quantitative assessment of imaging in cancer response assessment: 1) Radiologists usually describe the cancer lesions in imaging studies subjectively and sometimes ambiguously, and 2) it is difficult to repurpose imaging data, because lesion measurements are not recorded in a format that permits machine interpretation and interoperability. We have developed a freely available software platform on the basis of open standards, the electronic Physician Annotation Device (ePAD), to tackle these challenges in two ways. First, ePAD facilitates the radiologist in carrying out cancer lesion measurements as part of routine clinical trial image interpretation workflow. Second, ePAD records all image measurements and annotations in a data format that permits repurposing image data for analyses of alternative imaging biomarkers of treatment response. To determine the impact of ePAD on radiologist efficiency in quantitative assessment of imaging studies, a radiologist evaluated computed tomography (CT) imaging studies from 20 subjects having one baseline and three consecutive follow-up imaging studies with and without ePAD. The radiologist made measurements of target lesions in each imaging study using Response Evaluation Criteria in Solid Tumors 1.1 criteria, initially with the aid of ePAD, and then after a 30-day washout period, the exams were reread without ePAD. The mean total time required to review the images and summarize measurements of target lesions was 15% (P < .039) shorter using ePAD than without using this tool. In addition, it was possible to rapidly reanalyze the images to explore lesion cross-sectional area as an alternative imaging biomarker to linear measure. We conclude that ePAD appears promising to potentially improve reader efficiency for quantitative assessment of CT examinations, and it may enable discovery of future novel image-based biomarkers of cancer treatment response. PMID:24772204
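
The RECIST 1.1 target-lesion logic that ePAD supports reduces, in simplified form, to thresholds on the sum of longest diameters (SLD): partial response at a >=30% decrease from baseline, progression at a >=20% and >=5 mm increase from the nadir. A minimal sketch (omitting the nodal short-axis and non-target-lesion rules) follows.

```python
def recist_response(baseline_sld, current_sld, nadir_sld):
    """Simplified RECIST 1.1 target-lesion response from sums of longest
    diameters (SLD, mm). Omits nodal short-axis and non-target rules."""
    if current_sld == 0:
        return "CR"  # complete response: all target lesions gone
    if current_sld - nadir_sld >= 5 and current_sld >= 1.2 * nadir_sld:
        return "PD"  # progression: >=20% and >=5 mm above nadir
    if current_sld <= 0.7 * baseline_sld:
        return "PR"  # partial response: >=30% decrease from baseline
    return "SD"     # otherwise: stable disease

resp = recist_response(baseline_sld=100.0, current_sld=65.0, nadir_sld=100.0)
```

Because ePAD records each lesion measurement in a machine-readable format, categorizations like this can be recomputed automatically, or replaced wholesale by alternative biomarkers such as the cross-sectional-area measure the study explored.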

  16. Quantitative assessment of olfactory receptors activity in immobilized nanosomes: a novel concept for bioelectronic nose.

    PubMed

    Vidic, Jasmina Minic; Grosclaude, Jeanne; Persuy, Marie-Annick; Aioun, Josiane; Salesse, Roland; Pajot-Augy, Edith

    2006-08-01

    We describe how mammalian olfactory receptors (ORs) could be used as sensing elements of highly specific and sensitive bioelectronic noses. An OR and an appropriate Gα protein were co-expressed in Saccharomyces cerevisiae cells from which membrane nanosomes were prepared, and immobilized on a sensor chip. By Surface Plasmon Resonance, we were able to quantitatively evaluate OR stimulation by an odorant, and G protein activation. We demonstrate that ORs in nanosomes discriminate between odorant ligands and unrelated odorants, as in whole cells. This assay also provides the possibility for quantitative assessment of the coupling efficiency of the OR with different Gα subunits, without the interference of the cellular transduction pathway. Our findings will be useful to develop a new generation of electronic noses for detection and discrimination of volatile compounds, particularly amenable to micro- and nano-sensor formats.

  17. Experimental assessment of bone mineral density using quantitative computed tomography in Holstein dairy cows

    PubMed Central

    MAETANI, Ayami; ITOH, Megumi; NISHIHARA, Kahori; AOKI, Takahiro; OHTANI, Masayuki; SHIBANO, Kenichi; KAYANO, Mitsunori; YAMADA, Kazutaka

    2016-01-01

    The aim of this study was to assess the measurement of bone mineral density (BMD) by quantitative computed tomography (QCT), comparing the relationships of BMD between QCT and dual-energy X-ray absorptiometry (DXA) and between QCT and radiographic absorptiometry (RA) in the metacarpal bone of Holstein dairy cows (n=27). A significant positive correlation was found between QCT and DXA measurements (r=0.70, P<0.01), and a significant correlation was found between QCT and RA measurements (r=0.50, P<0.01). We conclude that QCT provides quantitative evaluation of BMD in dairy cows, because BMD measured by QCT showed positive correlations with BMD measured by the two conventional methods: DXA and RA. PMID:27075115

  18. Novel method for quantitative assessment of physical workload of healthcare workers by a tetherless ergonomics workstation.

    PubMed

    Smith, Warren D; Alharbi, Kamal A; Dixon, Jeremy B; Reggad, Hind

    2012-01-01

    Healthcare workers are at risk of physical injury. Our laboratory has developed a tetherless ergonomics workstation that is suitable for studying physicians' and nurses' physical workloads in clinical settings. The workstation uses wearable sensors to record multiple channels of body orientation and muscle activity and wirelessly transmits them to a base station laptop computer for display, storage, and analysis. The ergonomics workstation generates long records of multi-channel data, so it is desired that the workstation automatically process these records and provide graphical and quantitative summaries of the physical workloads experienced by the healthcare workers. This paper describes a novel method of automated quantitative assessment of physical workload, termed joint cumulative amplitude-duration (JCAD) analysis, that has advantages over previous methods and illustrates its use in a comparison of the physical workloads of robotically-assisted surgery versus manual video-endoscopic surgery.
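    The abstract does not spell out the JCAD computation, but the general idea of summarising physical workload by cumulating time spent at or above different muscle-activity amplitude levels can be sketched as follows. This is an illustrative amplitude-duration summary, not the authors' JCAD algorithm; the sampling rate, amplitude units (%MVC), and thresholds are all assumptions:

```python
import numpy as np

def amplitude_duration_summary(emg, fs, amp_bins):
    """Cumulative time (s) spent at or above each amplitude threshold.

    emg      : 1-D array of rectified EMG amplitude samples (%MVC)
    fs       : sampling rate in Hz
    amp_bins : increasing amplitude thresholds (%MVC)
    """
    dt = 1.0 / fs
    return np.array([(emg >= a).sum() * dt for a in amp_bins])

# Hypothetical 10-s recording sampled at 100 Hz.
rng = np.random.default_rng(0)
emg = np.abs(rng.normal(10.0, 5.0, 1000))            # rectified amplitude, %MVC
summary = amplitude_duration_summary(emg, 100.0, [0.0, 5.0, 10.0, 20.0])
print(summary)    # cumulative seconds above 0, 5, 10 and 20 %MVC
```

A joint analysis over amplitude and duration would further bin these exposures by how long each excursion above a threshold lasts.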

  19. Valuation of ecotoxicological impacts from tributyltin based on a quantitative environmental assessment framework.

    PubMed

    Noring, Maria; Håkansson, Cecilia; Dahlgren, Elin

    2016-02-01

    In the scientific literature, few valuations of biodiversity and ecosystem services following the impacts of toxicity are available, hampered by the lack of ecotoxicological documentation. Here, tributyltin is used to conduct a contingent valuation study as well as a cost-benefit analysis (CBA) of measures for improving the environmental status in Swedish coastal waters of the Baltic Sea. Benefits along different dimensions of environmental status are highlighted, and a quantitative environmental assessment framework based on available technology, ecological conditions, and economic valuation methodology is developed. Two scenarios are used in the valuation study: (a) achieving good environmental status by 2020 in accordance with EU legislation (USD 119 household(-1) year(-1)) and (b) achieving visible improvements by 2100 due to natural degradation (USD 108 household(-1) year(-1)) during 8 years. The latter scenario was used to illustrate an application of the assessment framework. The CBA results indicate that both scenarios might generate a welfare improvement.

  20. Qualitative and quantitative approaches in the dose-response assessment of genotoxic carcinogens.

    PubMed

    Fukushima, Shoji; Gi, Min; Kakehashi, Anna; Wanibuchi, Hideki; Matsumoto, Michiharu

    2016-05-01

    Qualitative and quantitative approaches are important issues in the field of carcinogenic risk assessment of genotoxic carcinogens. Herein, we provide quantitative data on low-dose hepatocarcinogenicity studies for three genotoxic hepatocarcinogens: 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx), 2-amino-3-methylimidazo[4,5-f]quinoline (IQ) and N-nitrosodiethylamine (DEN). Hepatocarcinogenicity was examined by quantitative analysis of glutathione S-transferase placental form (GST-P) positive foci, which are the preneoplastic lesions in rat hepatocarcinogenesis and the endpoint carcinogenic marker in the rat liver medium-term carcinogenicity bioassay. We also examined DNA damage and gene mutations which occurred through the initiation stage of carcinogenesis. For the establishment of points of departure (PoD) from which the cancer-related risk can be estimated, we analyzed the above events by quantitative no-observed-effect level and benchmark dose approaches. MeIQx at low doses induced formation of DNA-MeIQx adducts; somewhat higher doses caused elevation of 8-hydroxy-2'-deoxyguanosine levels; at still higher doses gene mutations occurred; and the highest dose induced formation of GST-P positive foci. These data indicate that early genotoxic events in the pathway to carcinogenesis showed the expected trend of lower PoDs for earlier events in the carcinogenic process. Similarly, only the highest dose of IQ caused an increase in the number of GST-P positive foci in the liver, while IQ-DNA adduct formation was observed at low doses. Moreover, treatment with DEN at low doses had no effect on development of GST-P positive foci in the liver. These data on PoDs for the markers contribute to understanding whether genotoxic carcinogens have a threshold for their carcinogenicity. The most appropriate approach to low-dose-response assessment must be determined on the basis of scientific judgment.

  1. Differential impact of multi-focus fan beam collimation with L-mode and conventional systems on the accuracy of myocardial perfusion imaging: Quantitative evaluation using phantoms

    PubMed Central

    Onishi, Hideo; Matsutomo, Norikazu; Kangai, Yoshiharu; Saho, Tatsunori; Amijima, Hizuru

    2013-01-01

    Objective(s): A novel IQ-SPECT™ method has become widely used in clinical studies. The present study compares the quality of myocardial perfusion images (MPI) acquired using the IQ-SPECT™ (IQ-mode), conventional (180° apart: C-mode) and L-mode (90° apart: L-mode) systems. We assessed spatial resolution, image reproducibility and quantifiability using various physical phantoms. Methods: SPECT images were acquired using a dual-headed gamma camera with C-mode, L-mode, and IQ-mode acquisition systems from line source, pie and cardiac phantoms containing solutions of 99mTc. The line source phantom was placed in the center of the orbit and at ±4.0, ±8.0, ±12.0, ±16.0 and ±20.0 cm off center. We examined quantifiability using the pie phantom, comprising six chambers containing 0.0, 0.016, 0.03, 0.045, 0.062, and 0.074 MBq/mL of 99mTc, and cross-calibrating the SPECT counts. Image resolution and reproducibility were quantified as myocardial wall thickness (MWT) and %uptake using polar maps. Results: The full width at half maximum (FWHM) of the IQ-mode in the center was increased by 11% compared with C-mode, and FWHM in the periphery was increased 41% compared with FWHM at the center. Calibrated SPECT counts were essentially the same when quantified using IQ- and C-modes. MWT on IQ-SPECT images was significantly improved (P<0.001) over L-mode and C-mode SPECT, while IQ-mode images became increasingly inhomogeneous, both visually and quantitatively (C-mode vs. L-mode, ns; C-mode vs. IQ-mode, P<0.05). Conclusion: Myocardial perfusion images acquired by IQ-SPECT were comparable to those acquired by conventional and L-mode SPECT, but with significantly improved resolution and quality. Our results suggest that IQ-SPECT is the optimal technology for myocardial perfusion SPECT imaging. PMID:27408847
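    The FWHM resolution metric used above can be computed from a sampled line-spread profile by locating the half-maximum crossings with linear interpolation. A minimal sketch; the sample spacing and Gaussian test profile are illustrative, not the study's data:

```python
import numpy as np

def fwhm(x, profile):
    """Full width at half maximum of a sampled line-spread function,
    interpolating the half-maximum crossings on each side of the peak."""
    p = np.asarray(profile, dtype=float)
    half = p.max() / 2.0
    above = np.where(p >= half)[0]
    i0, i1 = above[0], above[-1]
    # linear interpolation on the rising and falling edges
    left = np.interp(half, [p[i0 - 1], p[i0]], [x[i0 - 1], x[i0]])
    right = np.interp(half, [p[i1 + 1], p[i1]], [x[i1 + 1], x[i1]])
    return right - left

# Gaussian test profile: analytic FWHM = 2*sqrt(2*ln 2)*sigma ≈ 2.355*sigma.
x = np.linspace(-20.0, 20.0, 4001)          # position in mm
sigma = 4.0
profile = np.exp(-x ** 2 / (2 * sigma ** 2))
print(fwhm(x, profile))                      # ≈ 9.42 mm for sigma = 4 mm
```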

  2. Quantitative microbial risk assessment for Staphylococcus aureus and Staphylococcus enterotoxin A in raw milk.

    PubMed

    Heidinger, Joelle C; Winter, Carl K; Cullor, James S

    2009-08-01

    A quantitative microbial risk assessment was constructed to determine consumer risk from Staphylococcus aureus and staphylococcal enterotoxin in raw milk. A Monte Carlo simulation model was developed to assess the risk from raw milk consumption using data on levels of S. aureus in milk collected by the University of California-Davis Dairy Food Safety Laboratory from 2,336 California dairies from 2005 to 2008 and using U.S. milk consumption data from the National Health and Nutrition Examination Survey of 2003 and 2004. Four modules were constructed to simulate pathogen growth and staphylococcal enterotoxin A production scenarios to quantify consumer risk levels under various time and temperature storage conditions. The three growth modules predicted that S. aureus levels could surpass the 10(5) CFU/ml level of concern at the 99.9th or 99.99th percentile of servings and therefore may represent a potential consumer risk. Results obtained from the staphylococcal enterotoxin A production module predicted that exposure at the 99.99th percentile could represent a dose capable of eliciting staphylococcal enterotoxin intoxication in all consumer age groups. This study illustrates the utility of quantitative microbial risk assessments for identifying potential food safety issues. PMID:19722395
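    The growth-module logic of simulating many servings and reading off upper percentiles against the 10(5) CFU/ml level of concern can be sketched in a few lines. The distributions and parameters below are purely illustrative assumptions, not the study's inputs:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                                    # simulated servings

# Hypothetical input distributions (illustrative only):
log10_c0 = rng.normal(1.0, 1.0, n)             # initial log10 CFU/ml in raw milk
growth = rng.uniform(0.0, 3.0, n)              # log10 growth during storage
log10_c = log10_c0 + growth                    # log10 CFU/ml at consumption

level_of_concern = 5.0                         # 10^5 CFU/ml
frac_exceed = np.mean(log10_c >= level_of_concern)
p999 = np.percentile(log10_c, 99.9)            # 99.9th-percentile serving
print(frac_exceed, p999)
```

In a full assessment each input would be a fitted distribution and the growth term a function of storage time and temperature.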

  3. Quantitative risk assessment for human salmonellosis through the consumption of pork sausage in Porto Alegre, Brazil.

    PubMed

    Mürmann, Lisandra; Corbellini, Luis Gustavo; Collor, Alexandre Ávila; Cardoso, Marisa

    2011-04-01

    A quantitative microbiology risk assessment was conducted to evaluate the risk of Salmonella infection to consumers of fresh pork sausages prepared at barbecues in Porto Alegre, Brazil. For the analysis, a prevalence of 24.4% positive pork sausages with a level of contamination between 0.03 and 460 CFU g(-1) was assumed. Data related to frequency and habits of consumption were obtained by a questionnaire survey given to 424 people. A second-order Monte Carlo simulation separating the uncertain parameter of cooking time from the variable parameters was run. Of the people interviewed, 87.5% consumed pork sausage, and 85.4% ate it at barbecues. The average risk of salmonellosis per barbecue at a minimum cooking time of 15.6 min (worst-case scenario) was 6.24 × 10(-4), and the risk assessed per month was 1.61 × 10(-3). Cooking for 19 min would fully inactivate Salmonella in 99.9% of the cases. At this cooking time, the sausage reached a mean internal temperature of 75.7°C. The results of the quantitative microbiology risk assessment revealed that the consumption of fresh pork sausage is safe when cooking time is approximately 19 min, whereas undercooked pork sausage may represent a nonnegligible health risk for consumers.
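    The cooking-time result rests on thermal inactivation kinetics; under the classical log-linear model, log10(N/N0) = -t/D. A sketch with a hypothetical D-value (the study's actual inactivation model and parameters are not given in the abstract):

```python
def log_reductions(minutes_at_temp, d_value_min):
    """Log10 reductions under the log-linear inactivation model:
    log10(N/N0) = -t / D."""
    return minutes_at_temp / d_value_min

# Hypothetical D-value of 0.6 min for Salmonella at cooking temperature
# (illustrative only).
d = 0.6
for t in (10.0, 15.6, 19.0):
    n0 = 460.0                     # worst-case CFU/g from the abstract
    n = n0 * 10 ** (-log_reductions(t, d))
    print(f"{t:5.1f} min -> {log_reductions(t, d):5.1f} log10 reductions, "
          f"{n:.2e} CFU/g remaining")
```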

  4. Assessment of liver tumor response to therapy: role of quantitative imaging.

    PubMed

    Gonzalez-Guindalini, Fernanda D; Botelho, Marcos P F; Harmath, Carla B; Sandrasegaran, Kumaresan; Miller, Frank H; Salem, Riad; Yaghmai, Vahid

    2013-10-01

    Quantitative imaging is the analysis of retrieved numeric data from images with the goal of reducing subjective assessment. It is an increasingly important radiologic tool to assess treatment response in oncology patients. Quantification of response to therapy depends on the tumor type and method of treatment. Anatomic imaging biomarkers that quantify liver tumor response to cytotoxic therapy are based on temporal change in the size of the tumors. Anatomic biomarkers have been incorporated into the World Health Organization criteria and the Response Evaluation Criteria in Solid Tumors (RECIST) versions 1.0 and 1.1. However, the development of novel therapies with different mechanisms of action, such as antiangiogenesis or radioembolization, has required new methods for measuring response to therapy. This need has led to development of tumor- or therapy-specific guidelines such as the Modified CT Response Evaluation (Choi) Criteria for gastrointestinal stromal tumors, the European Association for Study of the Liver (EASL) criteria, and modified RECIST for hepatocellular carcinoma, among many others. The authors review the current quantification criteria used in the evaluation of treatment response in liver tumors, summarizing their indications, advantages, and disadvantages, and discuss future directions with newer methods that have the potential for assessment of treatment response. Knowledge of these quantitative methods is important to facilitate pivotal communication between oncologists and radiologists about cancer treatment, with benefit ultimately accruing to the patient. PMID:24108562
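    The RECIST 1.1 size-based categories mentioned above reduce to a few thresholds on the sum of target-lesion diameters. A minimal sketch covering the target-lesion rules only (it ignores non-target lesions, new lesions, and confirmation requirements):

```python
def recist_response(baseline_sum_mm, nadir_sum_mm, current_sum_mm):
    """Target-lesion response per RECIST 1.1 size rules:
    CR: all target lesions gone; PR: >=30% decrease vs baseline;
    PD: >=20% increase vs nadir AND >=5 mm absolute increase; else SD."""
    if current_sum_mm == 0:
        return "CR"
    if current_sum_mm <= 0.7 * baseline_sum_mm:
        return "PR"
    if (current_sum_mm >= 1.2 * nadir_sum_mm
            and current_sum_mm - nadir_sum_mm >= 5.0):
        return "PD"
    return "SD"

print(recist_response(100.0, 100.0, 65.0))   # PR: 35% decrease from baseline
print(recist_response(100.0, 60.0, 75.0))    # PD: 25% and 15 mm above nadir
print(recist_response(100.0, 90.0, 92.0))    # SD
```

Modified criteria such as mRECIST for hepatocellular carcinoma apply the same thresholds to the enhancing (viable) portion of the tumor rather than the whole lesion.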

  5. A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.

    2015-03-01

    The purpose of this study was to quantitatively assess myocardial perfusion by the first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods have been proposed as able to assess haemodynamically significant coronary artery stenosis, they have recognized limitations, and new techniques are still needed. Experiments were performed on five closed-chest swine. Balloon catheters were placed in the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the colored microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated from the acquired CT images. Our study shows that HUDRCT correlates well with MBF and FFR (y=0.07245+0.09963x, r2=0.898). In receiver operating characteristic (ROC) curve analyses, HUDRCT provides excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, indicated by an Area Under the Curve (AUC) value of 0.927. HUDRCT has the potential to be developed as a useful indicator for quantitative assessment of myocardial perfusion.
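    The reported AUC of 0.927 comes from ROC analysis of HUDRCT against the FFR-defined ischemia label; AUC can be computed directly from scores and labels via the rank-sum (Mann-Whitney U) identity. The scores and labels below are hypothetical:

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum identity, with tie handling."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, int)
    order = np.argsort(scores)
    ranks = np.empty(len(scores), dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    for s in np.unique(scores):                 # average ranks over ties
        tie = scores == s
        ranks[tie] = ranks[tie].mean()
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Hypothetical HUDRCT-like scores for ischemic (1) vs non-ischemic (0) territories.
scores = [0.9, 0.8, 0.75, 0.6, 0.55, 0.4, 0.3, 0.2]
labels = [1,   1,   1,    0,   1,    0,   0,   0]
print(roc_auc(scores, labels))   # 0.9375
```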

  6. Quantitative assessment of emphysema from whole lung CT scans: comparison with visual grading

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Apanosovich, Tatiyana V.; Wang, Jianwei; Yankelevitz, David F.; Henschke, Claudia I.

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and for visual assessment by radiologists of the extent present in the lungs. Several measures have been introduced for quantifying the extent of disease directly from CT data, to complement the qualitative assessments made by radiologists. In this paper we compare emphysema index, mean lung density, histogram percentiles, and fractal dimension against visual grade, in order to evaluate how well radiologists' visual scoring of emphysema on low-dose CT scans can be predicted from quantitative scores and to determine which measures can serve as surrogates for visual assessment. All measures were computed over nine divisions of the lung field (whole lung, individual lungs, and upper/middle/lower thirds of each lung) for each of 148 low-dose, whole-lung scans. In addition, a visual grade of each section was given by an expert radiologist. One-way ANOVA and multinomial logistic regression were used to determine the ability of the measures to predict visual grade from quantitative score. We found that all measures were able to distinguish between normal and severe grades (p<0.01), and between mild/moderate and all other grades (p<0.05). However, no measure was able to distinguish between mild and moderate cases. Approximately 65% prediction accuracy was achieved using quantitative score to predict visual grade, rising to 73% if mild and moderate cases are considered as a single class.
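    The densitometric measures compared above are simple functions of the lung-voxel HU histogram; a sketch of the emphysema index (relative area below -950 HU), mean lung density, and a histogram percentile, evaluated on a hypothetical voxel histogram:

```python
import numpy as np

def emphysema_measures(lung_hu, threshold=-950, percentile=15):
    """Standard densitometric measures from segmented lung-voxel HU values:
    emphysema index (RA-950), mean lung density, and Perc15."""
    lung_hu = np.asarray(lung_hu, float)
    index = 100.0 * np.mean(lung_hu < threshold)    # % voxels below -950 HU
    mld = lung_hu.mean()                            # mean lung density (HU)
    perc = np.percentile(lung_hu, percentile)       # 15th-percentile HU
    return index, mld, perc

# Hypothetical histogram: mostly normal parenchyma plus an emphysematous tail.
rng = np.random.default_rng(1)
hu = np.concatenate([rng.normal(-860, 40, 9000), rng.normal(-970, 15, 1000)])
idx, mld, p15 = emphysema_measures(hu)
print(idx, mld, p15)
```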

  7. Adaptive Autoregressive Model for Reduction of Noise in SPECT.

    PubMed

    Takalo, Reijo; Hytti, Heli; Ihalainen, Heimo; Sohlberg, Antti

    2015-01-01

    This paper presents improved autoregressive (AR) modelling to reduce noise in SPECT images. An AR filter was applied to prefilter projection images and to postfilter ordered subset expectation maximisation (OSEM) reconstruction images (AR-OSEM-AR method). The performance of this method was compared with filtered back projection (FBP) preceded by Butterworth filtering (BW-FBP method) and with OSEM reconstruction followed by Butterworth filtering (OSEM-BW method). A mathematical cylinder phantom consisting of hot and cold objects was used for the study. The tests were performed using three simulated SPECT datasets. Image quality was assessed by means of the percentage contrast resolution (CR%) and the full width at half maximum (FWHM) of the line spread functions of the cylinders. The BW-FBP method showed the highest CR% values and the AR-OSEM-AR method gave the lowest CR% values for cold stacks. In the analysis of hot stacks, the BW-FBP method had higher CR% values than the OSEM-BW method. The BW-FBP method exhibited the lowest FWHM values for cold stacks and the AR-OSEM-AR method for hot stacks. In conclusion, the AR-OSEM-AR method is a feasible way to remove noise from SPECT images, with good spatial resolution for hot objects.
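    An AR filter of the kind described can be built from Yule-Walker coefficient estimates, with the one-step prediction serving as the noise-reduced signal. A minimal 1-D sketch under those assumptions (the paper's projection-domain filtering details are not reproduced here):

```python
import numpy as np

def ar_coefficients(x, order):
    """Yule-Walker estimate of AR(order) coefficients from a 1-D signal."""
    z = np.asarray(x, float) - np.mean(x)
    n = len(z)
    r = np.array([np.dot(z[:n - k], z[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

def ar_smooth(x, order=4):
    """One-step AR prediction used as a noise-reduced version of the signal;
    the first `order` samples are passed through unchanged."""
    x = np.asarray(x, float)
    mu = x.mean()
    z = x - mu
    a = ar_coefficients(x, order)
    y = z.copy()
    for n in range(order, len(z)):
        y[n] = np.dot(a, z[n - order:n][::-1])   # predict from the last `order` lags
    return y + mu

# Demo on a synthetic AR(1) process x[n] = 0.9 x[n-1] + e[n]:
rng = np.random.default_rng(3)
e = rng.normal(0.0, 1.0, 5000)
x = np.zeros(5000)
for i in range(1, 5000):
    x[i] = 0.9 * x[i - 1] + e[i]
print(ar_coefficients(x, 1))          # close to [0.9]
```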

  8. SPECT reconstruction on the GPU

    NASA Astrophysics Data System (ADS)

    Vetter, Christoph; Westermann, Rüdiger

    2008-03-01

    As doctors rely increasingly on imaging procedures, not only does visualization need to be optimized; reconstruction of volumes from scanner output is another bottleneck. Accelerating the computationally intensive reconstruction process improves the medical workflow, matches reconstruction speed to acquisition speed, and allows fast batch processing and interactive or near-interactive parameter tuning. Recently, much effort has been focused on using the computational power of graphics processing units (GPUs) for general-purpose computations. This paper presents a GPU-accelerated implementation of single photon emission computed tomography (SPECT) reconstruction based on an ordered-subset expectation maximization algorithm. The algorithm uses models of the point-spread function (PSF) to improve spatial resolution in the reconstructed images. Instead of computing the PSF directly, it is modeled as efficient blurring of slabs on the GPU in order to accelerate the process. The algorithm for calculating accumulated attenuation factors, which allows correcting the generated volume according to the attenuation properties of the volume, is optimized for processing on the GPU. Since these factors can be reused between iterations, a cache is used that adapts to different sizes of video memory, so that only those factors that do not fit into graphics memory have to be recomputed. These improvements make the reconstruction of typical SPECT volumes near-interactive.
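    The ordered-subset EM algorithm at the core of this reconstruction is a multiplicative update; with all projections in a single subset it reduces to MLEM, which a toy system matrix makes concrete. This sketch omits the paper's PSF and attenuation modelling, and the system matrix is invented for illustration:

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Maximum-likelihood EM reconstruction. OSEM applies the same update
    to ordered subsets of the projection rows instead of all rows at once."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                    # A^T 1, the sensitivity image
    for _ in range(n_iter):
        proj = A @ x                        # forward projection
        ratio = np.where(proj > 0, y / proj, 0.0)
        x *= (A.T @ ratio) / sens           # back-project and update
    return x

# Tiny toy system: 4 detector bins viewing 3 voxels (rows of A = projections).
A = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, 0.5],
              [0.5, 0.0, 1.0],
              [0.3, 0.3, 0.3]])
x_true = np.array([2.0, 1.0, 3.0])
y = A @ x_true                              # noise-free measurements
print(mlem(A, y, 200))                      # should approach [2, 1, 3]
```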

  9. Integration of AdaptiSPECT, a small-animal adaptive SPECT imaging system

    PubMed Central

    Chaix, Cécile; Kovalsky, Stephen; Kosmider, Matthew; Barrett, Harrison H.; Furenlid, Lars R.

    2015-01-01

    AdaptiSPECT is a pre-clinical adaptive SPECT imaging system under final development at the Center for Gamma-ray Imaging. The system incorporates multiple adaptive features: an adaptive aperture, 16 detectors mounted on translational stages, and the ability to switch between a non-multiplexed and a multiplexed imaging configuration. In this paper, we review the design of AdaptiSPECT and its adaptive features. We then describe the on-going integration of the imaging system. PMID:26347197

  10. The economics of drug abuse: a quantitative assessment of drug demand.

    PubMed

    Hursh, Steven R; Galuska, Chad M; Winger, Gail; Woods, James H

    2005-02-01

    Behavioral economic concepts have proven useful for an overall understanding of the regulation of behavior by environmental commodities and complement a pharmacological perspective on drug abuse in several ways. First, a quantitative assessment of drug demand, equated in terms of drug potency, allows meaningful comparisons to be made among drug reinforcers within and across pharmacological classes. Second, behavioral economics provides a conceptual framework for understanding key factors, both pharmacological and environmental, that contribute to reductions in consumption of illicit drugs. Finally, behavioral economics provides a basis for generalization from laboratory and clinical studies to the development of novel behavioral and pharmacological therapies.
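    For concreteness, demand-curve analysis of this kind is often formalised with Hursh and Silberberg's exponential demand equation (published after this 2005 paper but now standard in the field): log10 Q = log10 Q0 + k(e^(-alpha*Q0*C) - 1). The parameter values below are hypothetical:

```python
import numpy as np

def demand(price, q0, alpha, k=2.0):
    """Exponential demand: log10 Q = log10 Q0 + k * (exp(-alpha * Q0 * price) - 1).

    q0    : consumption at zero price (demand intensity)
    alpha : rate of decline in consumption with price (inverse essential value)
    k     : span of the function in log10 units
    """
    return 10 ** (np.log10(q0) + k * (np.exp(-alpha * q0 * price) - 1.0))

prices = np.array([0.1, 1.0, 3.0, 10.0])
# Hypothetical parameters for two drugs normalised to equipotent unit doses:
print(demand(prices, q0=100.0, alpha=0.001))   # shallower decline (less elastic)
print(demand(prices, q0=100.0, alpha=0.01))    # steeper decline (more elastic)
```

Because alpha is scaled by Q0 and price is expressed per potency-equated dose, the fitted alpha supports exactly the cross-drug comparisons the abstract describes.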

  11. Quantitative Assessment of the Effects of Oxidants on Antigen-Antibody Binding In Vitro

    PubMed Central

    Han, Shuang; Wang, Guanyu; Xu, Naijin; Liu, Hui

    2016-01-01

    Objective. We quantitatively assessed the influence of oxidants on antigen-antibody binding activity. Methods. We used several immunological detection methods, including precipitation reactions, agglutination reactions, and enzyme immunoassays, to determine antibody activity. The oxidation-reduction potential was measured in order to determine total serum antioxidant capacity. Results. Certain concentrations of oxidants resulted in significant inhibition of antibody activity but had little influence on total serum antioxidant capacity. Conclusions. Oxidants had a significant influence on antigen-antibody interactions but minimal effect on the antibody peptide itself. PMID:27313823

  12. Quantitative assessment of synovitis in Legg-Calvé-Perthes disease using gadolinium-enhanced MRI.

    PubMed

    Neal, David C; O'Brien, Jack C; Burgess, Jamie; Jo, Chanhee; Kim, Harry K W

    2015-03-01

    A quantitative method to assess hip synovitis in Legg-Calvé-Perthes disease (LCPD) is not currently available. To develop this method, the areas of synovial enhancement on gadolinium-enhanced MRI (Gd-MRI) were measured by two independent observers. The volume of synovial enhancement was significantly increased in the initial and the fragmentation stages of LCPD (Waldenström stages I and II), with a persistence of synovitis into the reossification stage (stage III). The Gd-MRI method had high interobserver and intraobserver agreements and may serve as a useful method to monitor the effect of various treatments on hip synovitis in LCPD. PMID:25305048

  13. Semi-quantitative exposure assessment of occupational exposure to wood dust and nasopharyngeal cancer risk.

    PubMed

    Ekpanyaskul, Chatchai; Sangrajrang, Suleeporn; Ekburanawat, Wiwat; Brennan, Paul; Mannetje, Andrea; Thetkathuek, Anamai; Saejiw, Nutjaree; Ruangsuwan, Tassanu; Boffetta, Paolo

    2015-01-01

    Occupational exposure to wood dust is one cause of nasopharyngeal cancer (NPC); however, assessing this exposure remains problematic. Therefore, the objective of this study was to develop a semi-quantitative exposure assessment method and then utilize it to evaluate the association between occupational exposure to wood dust and the development of NPC. In addition, variations in risk by histology were examined. A case-control study was conducted with 327 newly diagnosed cases of NPC at the National Cancer Institute and regional cancer centers in Thailand, with 1:1 controls matched for age, gender and geographical residence. Occupational information was obtained through personal interviews. The potential probability, frequency and intensity of exposure to wood dust were assessed on a job-by-job basis by experienced experts. Analysis was performed by conditional logistic regression, with results presented as odds ratio (OR) estimates and 95% confidence intervals (CI). Overall, a nonsignificant relationship between occupational wood dust exposure and NPC risk was observed for all subjects (OR=1.61, 95%CI 0.99-2.59); however, the risk became significant when analyses focused on types 2 and 3 NPC (OR=1.62, 95%CI 1.03-2.74). The association was stronger for those exposed to wood dust for >10 years (OR=2.26, 95%CI 1.10-4.63), for those with first-time exposure at age >25 years (OR=2.07, 95%CI 1.08-3.94), and for those with a high cumulative exposure (OR=2.17, 95%CI 1.03-4.58), compared with those considered unexposed. In conclusion, wood dust is likely to be associated with an increased risk of type 2 or 3 NPC in the Thai population. The results of this study show that semi-quantitative exposure assessment is suitable for occupational exposure assessment in a case-control study and complements information from self-reporting.

  14. Quantitative assessment of intrinsic groundwater vulnerability to contamination using numerical simulations.

    PubMed

    Neukum, Christoph; Azzam, Rafig

    2009-12-20

    Intrinsic vulnerability assessment to groundwater contamination is part of groundwater management in many areas of the world. However, popular assessment methods estimate vulnerability only qualitatively. To enhance vulnerability assessment, this work presents an approach for quantitative vulnerability assessment using numerical simulation of water flow and solute transport with transient boundary conditions, together with new vulnerability indicators. Based on a conceptual model of the unsaturated underground with distinct hydrogeological layers and site-specific hydrological characteristics, the numerical simulations of water flow and solute transport are applied to each hydrogeological layer separately under standardized conditions. Analysis of the simulation results reveals functional relationships between layer thickness, groundwater recharge and transit time. Based on the first, second and third quartiles of solute mass breakthrough at the lower boundary of the unsaturated zone, and on the solute dilution, four vulnerability indicators are extracted. The indicator transit time t(50) is the time at which 50% of the solute mass breakthrough has passed the groundwater table. Dilution is expressed as the maximum solute concentration C(max) in the percolation water when entering the groundwater table, relative to the injected mass or solute concentration C(0) at the ground surface. Duration of solute breakthrough is defined as the time period between 25% and 75% (t(25%)-t(75%)) of total solute mass breakthrough at the groundwater table. The temporal shape of the breakthrough curve is expressed by the quotient (t(25%)-t(50%))/(t(25%)-t(75%)). Results from an application of this new quantitative vulnerability assessment approach, its advantages and disadvantages, and potential benefits for future groundwater management strategies are discussed.
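    The four indicators can be extracted from a simulated breakthrough curve at the water table by interpolating the cumulative mass fraction. A sketch on a synthetic Gaussian pulse; the shape quotient is taken here in the equivalent positive form (t50-t25)/(t75-t25):

```python
import numpy as np

def breakthrough_indicators(t, c):
    """Indicators from a solute breakthrough curve c(t) at the water table,
    sampled on a uniform time grid: transit time t50, breakthrough duration
    t75 - t25, temporal shape (t50 - t25)/(t75 - t25), and peak concentration
    (dilution, when c is expressed relative to the input concentration C0)."""
    frac = np.cumsum(c) / np.sum(c)             # cumulative mass fraction
    t25, t50, t75 = (np.interp(q, frac, t) for q in (0.25, 0.50, 0.75))
    return t50, t75 - t25, (t50 - t25) / (t75 - t25), c.max()

# Synthetic Gaussian pulse, concentration expressed as a fraction of C0 = 1.
t = np.linspace(0.0, 100.0, 1001)
c = np.exp(-(t - 40.0) ** 2 / (2.0 * 10.0 ** 2))
t50, duration, shape, cmax_over_c0 = breakthrough_indicators(t, c)
print(t50, duration, shape, cmax_over_c0)
```

For a symmetric pulse the shape quotient is 0.5; skewed, tailing breakthrough curves shift it away from 0.5.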

  15. SPECT Imaging: Basics and New Trends

    NASA Astrophysics Data System (ADS)

    Hutton, Brian F.

    Single Photon Emission Computed Tomography (SPECT) is widely used as a means of imaging the distribution of administered radiotracers that have single-photon emission. The most widely used SPECT systems are based on the Anger gamma camera, usually involving dual detectors that rotate around the patient. Several factors affect the quality of SPECT images (e.g., resolution and noise) and the ability to perform absolute quantification (e.g., attenuation, scatter, motion, and resolution). There is a trend to introduce dual-modality systems and organ-specific systems, both developments that enhance diagnostic capability.

  16. [Acceptance check and quality control of SPECT].

    PubMed

    Sun, L M; Liu, C B

    2001-05-01

    This paper explains acceptance testing of SPECT systems, especially the new SPECT with dual digital detectors and spiral scanning gantry recently introduced to China, proceeding from the physical functions of the system to its mechanical functions, to the NEMA standard functions, and then to the computer hardware specified in the contract. A brief introduction is also given to the quality control of SPECT in terms of its spatial resolution, energy resolution, spatial linearity, sensitivity, and center of rotation. PMID:12583289

  17. Automated quantitative assessment of three-dimensional bioprinted hydrogel scaffolds using optical coherence tomography

    PubMed Central

    Wang, Ling; Xu, Mingen; Zhang, LieLie; Zhou, QingQing; Luo, Li

    2016-01-01

    Reconstructing and quantitatively assessing the internal architecture of opaque three-dimensional (3D) bioprinted hydrogel scaffolds is difficult but vital to the improvement of 3D bioprinting techniques and to the fabrication of functional engineered tissues. In this study, swept-source optical coherence tomography was applied to acquire high-resolution images of hydrogel scaffolds. Novel 3D gelatin/alginate hydrogel scaffolds with six different representative architectures were fabricated using our 3D bioprinting system. Both the scaffold material networks and the interconnected flow channel networks were reconstructed through volume rendering and binarisation processing to provide a 3D volumetric view. An image analysis algorithm was developed based on automatic selection of the spatially-isolated region-of-interest. Via this algorithm, the spatially-resolved morphological parameters, including pore size, pore shape, strut size, surface area, porosity, and interconnectivity, were quantified precisely. Fabrication defects and differences between the designed and as-produced scaffolds were clearly identified in both 2D and 3D, and the locations and dimensions of each fabrication defect were defined. We conclude that this method will be a key tool for non-destructive and quantitative characterization, design optimisation and fabrication refinement of 3D bioprinted hydrogel scaffolds. Furthermore, this method enables investigation into the quantitative relationship between scaffold structure and biological outcome. PMID:27231597
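    Two of the quantified parameters, porosity and interconnectivity, follow directly from the binarised volume; a sketch using 6-connected flood fill on a toy scaffold (the paper's region-of-interest selection and remaining morphological parameters are not reproduced, and the scaffold geometry below is invented):

```python
import numpy as np
from collections import deque

def porosity_and_interconnectivity(solid):
    """solid: 3-D boolean array, True = hydrogel strut, False = pore space.
    Porosity is the pore-voxel fraction; interconnectivity is the fraction
    of pore voxels belonging to the largest 6-connected pore component."""
    pore = ~np.asarray(solid, dtype=bool)
    visited = np.zeros_like(pore)
    largest = 0
    steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    for start in map(tuple, np.argwhere(pore)):
        if visited[start]:
            continue
        visited[start] = True
        size, queue = 0, deque([start])
        while queue:                           # breadth-first flood fill
            z, y, x = queue.popleft()
            size += 1
            for dz, dy, dx in steps:
                n = (z + dz, y + dy, x + dx)
                if all(0 <= n[i] < pore.shape[i] for i in range(3)) \
                        and pore[n] and not visited[n]:
                    visited[n] = True
                    queue.append(n)
        largest = max(largest, size)
    return pore.mean(), largest / pore.sum()

# Toy scaffold: a solid sheet at y = 2 splits the pore space into two components.
solid = np.zeros((6, 6, 6), dtype=bool)
solid[:, 2, :] = True
porosity, interconnectivity = porosity_and_interconnectivity(solid)
print(porosity, interconnectivity)     # 5/6 ≈ 0.833 and 108/180 = 0.6
```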

  18. Quantitative breast MRI radiomics for cancer risk assessment and the monitoring of high-risk populations

    NASA Astrophysics Data System (ADS)

    Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.

    2016-03-01

    Breast density is routinely assessed qualitatively in screening mammography. However, it is challenging to quantitatively determine a 3D density from a 2D image such as a mammogram. Furthermore, dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is used more frequently in the screening of high-risk populations. The purpose of our study was to segment parenchyma and to quantitatively determine volumetric breast density on pre-contrast axial DCE-MRI images (i.e., non-contrast) using a semi-automated quantitative approach. In this study, we retrospectively examined 3D DCE-MRI images taken for breast cancer screening of a high-risk population. We analyzed 66 cases with ages between 28 and 76 (mean 48.8, standard deviation 10.8). DCE-MRIs were obtained on a Philips 3.0 T scanner. Our semi-automated DCE-MRI algorithm includes: (a) segmentation of breast tissue from non-breast tissue using fuzzy c-means clustering, (b) separation of dense and fatty tissues using Otsu's method, and (c) calculation of volumetric density as the ratio of dense voxels to total breast voxels. We examined the relationship between pre-contrast DCE-MRI density and clinical BI-RADS density obtained from radiology reports, and found a statistically significant correlation [Spearman ρ-value of 0.66 (p < 0.0001)]. Within precision medicine, our method may be useful for monitoring high-risk populations.
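    Step (b), Otsu's method, chooses the intensity threshold that maximises the between-class variance of the voxel histogram. A self-contained sketch with a hypothetical bimodal voxel distribution; on non-fat-suppressed T1-weighted images fat is bright and fibroglandular (dense) tissue darker, so density is taken as the fraction of voxels below the threshold:

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Otsu's method: return the threshold maximising between-class variance."""
    hist, edges = np.histogram(values, bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    w = hist / hist.sum()
    best_t, best_var = centers[0], -1.0
    for i in range(1, nbins):
        w0, w1 = w[:i].sum(), w[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (w[:i] * centers[:i]).sum() / w0      # class means
        mu1 = (w[i:] * centers[i:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t

# Hypothetical breast-voxel intensities: bright fatty + darker dense tissue.
rng = np.random.default_rng(7)
voxels = np.concatenate([rng.normal(200, 20, 8000),    # fatty
                         rng.normal(80, 15, 2000)])    # fibroglandular (dense)
t = otsu_threshold(voxels)
density = np.mean(voxels < t)     # volumetric density: dense / total voxels
print(t, density)
```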

  19. Quantitative sonoelastography for the in vivo assessment of skeletal muscle viscoelasticity

    PubMed Central

    Hoyt, Kenneth; Kneezel, Timothy; Castaneda, Benjamin; Parker, Kevin J

    2015-01-01

    A novel quantitative sonoelastography technique for assessing the viscoelastic properties of skeletal muscle tissue was developed. Slowly propagating shear wave interference patterns (termed crawling waves) were generated using a two-source configuration vibrating normal to the surface. Theoretical models predict crawling wave displacement fields, which were validated through phantom studies. In experiments, a viscoelastic model was fit to dispersive shear wave speed sonoelastographic data using nonlinear least-squares techniques to determine frequency-independent shear modulus and viscosity estimates. Shear modulus estimates derived using the viscoelastic model were in agreement with that obtained by mechanical testing on phantom samples. Preliminary sonoelastographic data acquired in healthy human skeletal muscles confirm that high-quality quantitative elasticity data can be acquired in vivo. Studies on relaxed muscle indicate discernible differences in both shear modulus and viscosity estimates between different skeletal muscle groups. Investigations into the dynamic viscoelastic properties of (healthy) human skeletal muscles revealed that voluntarily contracted muscles exhibit considerable increases in both shear modulus and viscosity estimates as compared to the relaxed state. Overall, preliminary results are encouraging and quantitative sonoelastography may prove clinically feasible for in vivo characterization of the dynamic viscoelastic properties of human skeletal muscle. PMID:18612176
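    The dispersion fit can be illustrated with the Voigt-model shear-wave speed, c(omega) = sqrt(2(mu^2 + omega^2 eta^2) / (rho (mu + sqrt(mu^2 + omega^2 eta^2)))). A coarse grid search stands in here for the paper's nonlinear least-squares routine, and all parameter values are illustrative:

```python
import numpy as np

def voigt_speed(freq_hz, mu, eta, rho=1000.0):
    """Shear-wave speed dispersion of a Voigt material
    (mu: shear modulus in Pa, eta: shear viscosity in Pa*s, rho: kg/m^3)."""
    w = 2.0 * np.pi * np.asarray(freq_hz, float)
    m = np.sqrt(mu ** 2 + (w * eta) ** 2)
    return np.sqrt(2.0 * (mu ** 2 + (w * eta) ** 2) / (rho * (mu + m)))

def fit_voigt(freq_hz, speeds):
    """Least-squares fit of (mu, eta) by coarse grid search, a simple
    stand-in for an iterative nonlinear least-squares solver."""
    best = (0.0, 0.0, np.inf)
    for mu in np.linspace(1e3, 30e3, 120):
        for eta in np.linspace(0.1, 20.0, 120):
            err = np.sum((voigt_speed(freq_hz, mu, eta) - speeds) ** 2)
            if err < best[2]:
                best = (mu, eta, err)
    return best[0], best[1]

freqs = np.array([100.0, 150.0, 200.0, 250.0, 300.0])   # vibration frequencies
c_meas = voigt_speed(freqs, 12e3, 5.0)                  # noise-free synthetic data
print(fit_voigt(freqs, c_meas))                         # close to (12e3, 5.0)
```

With eta = 0 the model reduces to the non-dispersive elastic case c = sqrt(mu/rho); the frequency dependence of the measured speeds is what makes the viscosity identifiable.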

  20. The Quantitative Ideas and Methods in Assessment of Four Properties of Chinese Medicinal Herbs.

    PubMed

    Fu, Jialei; Pang, Jingxiang; Zhao, Xiaolei; Han, Jinxiang

    2015-04-01

    The purpose of this review is to summarize and reflect on the current status and problems of research on the properties of Chinese medicinal herbs. Hot, warm, cold, and cool are the four properties/natures of Chinese medicinal herbs, defined on the basis of the interaction between the herbs and the human body. How to quantitatively assess the therapeutic effect of Chinese medicinal herbs within the theoretical system of Traditional Chinese Medicine (TCM) remains a challenge. Previous studies approaching the topic from several perspectives are presented, and their results and problems discussed. New ideas based on the technology of biophoton radiation detection are proposed. With the development of biophoton detection technology, detection and characterization of human biophoton emission has led to its potential applications in TCM. The possibility of using a biophoton analysis system to study the interaction of Chinese medicinal herbs with the human body and to quantitatively determine their effects is entirely consistent with the holistic concept of TCM theory. The statistical entropy of electromagnetic radiation from biological systems may characterize the four properties of Chinese medicinal herbs, and its spectrum may characterize their meridian tropism. We therefore hypothesize that, using a biophoton analysis system, the four properties and meridian tropism of Chinese medicinal herbs can be quantitatively expressed.

  1. Monitoring and quantitative assessment of tumor burden using in vivo bioluminescence imaging

    NASA Astrophysics Data System (ADS)

    Chen, Chia-Chi; Hwang, Jeng-Jong; Ting, Gann; Tseng, Yun-Long; Wang, Shyh-Jen; Whang-Peng, Jaqueline

    2007-02-01

    In vivo bioluminescence imaging (BLI) is a sensitive imaging modality that is rapid and accessible, and may comprise an ideal tool for evaluating tumor growth. In this study, the kinetics of tumor growth was assessed in a C26 colon carcinoma-bearing BALB/c mouse model. BLI was used to noninvasively quantitate the growth of subcutaneous tumors transplanted with C26 cells genetically engineered to stably express firefly luciferase and herpes simplex virus type-1 thymidine kinase (C26/tk-luc). A good correlation (R² = 0.998) of photon emission to cell number was found in vitro. Tumor burden and tumor volume were monitored in vivo over time by quantitation of photon emission using a Xenogen IVIS 50 and by standard external caliper measurement, respectively. At various time intervals, tumor-bearing mice were imaged to determine the correlation of in vivo BLI to tumor volume; a correlation was observed when tumor volume was smaller than 1000 mm³ (R² = 0.907). Gamma scintigraphy combined with [131I]FIAU was used as a second imaging modality to verify these results. In conclusion, this study showed that bioluminescence imaging is a powerful and quantitative tool for directly monitoring tumor growth in vivo. The dual-reporter-gene-transfected tumor-bearing animal model can be applied in evaluating the efficacy of newly developed anti-cancer drugs.
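
    The in-vitro calibration step, a linear regression of photon emission against cell number with an R² goodness-of-fit, can be illustrated as follows; the counts are synthetic, not the study's data.

```python
import numpy as np

# Illustration of the calibration step: linear regression of photon emission
# against plated cell number, reporting R^2. Synthetic data only.

cells = np.array([1e3, 5e3, 1e4, 5e4, 1e5, 5e5])          # cells per well
photons = 120.0 * cells * (1 + np.array([0.02, -0.03, 0.01, 0.02, -0.01, 0.015]))

slope, intercept = np.polyfit(cells, photons, 1)
pred = slope * cells + intercept
ss_res = np.sum((photons - pred) ** 2)
ss_tot = np.sum((photons - photons.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")                                   # near 1 for a linear assay
```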

  2. Advances in SPECT for Optimizing the Liver Tumors Radioembolization Using Yttrium-90 Microspheres

    PubMed Central

    Roshan, Hoda Rezaei; Azarm, Ahmadreza; Mahmoudian, Babak; Islamian, Jalil Pirayesh

    2015-01-01

    Radioembolization (RE) with Yttrium-90 (90Y) microspheres is an effective treatment for unresectable liver tumors. The activity of the microspheres to be administered should be calculated based on the type of microspheres. Technetium-99m macroaggregated albumin (99mTc-MAA) single photon emission computed tomography/computed tomography (SPECT/CT) is a reliable assessment before RE to ensure the safe delivery of microspheres into the target. 90Y bremsstrahlung SPECT imaging as a posttherapeutic assessment approach enables the reliable determination of absorbed dose, which is indispensable for the verification of treatment efficacy. This article intends to provide a review of the methods of optimizing 90Y bremsstrahlung SPECT imaging to improve the treatment efficacy of liver tumor RE using 90Y microspheres. PMID:26097416

  3. Safety evaluation of disposable baby diapers using principles of quantitative risk assessment.

    PubMed

    Rai, Prashant; Lee, Byung-Mu; Liu, Tsung-Yun; Yuhui, Qin; Krause, Edburga; Marsman, Daniel S; Felter, Susan

    2009-01-01

    Baby diapers are complex products consisting of multiple layers of materials, most of which are not in direct contact with the skin. The safety profile of a diaper is determined by the biological properties of individual components and the extent to which the baby is exposed to each component during use. Rigorous evaluation of the toxicological profile and realistic exposure conditions of each material is important to ensure the overall safety of the diaper under normal and foreseeable use conditions. Quantitative risk assessment (QRA) principles may be applied to the safety assessment of diapers and similar products. Exposure to component materials is determined by (1) the conditions of product use, (2) the degree to which individual layers of the product are in contact with the skin during use, and (3) the extent to which some components may be extracted by urine and delivered to the skin. This assessment of potential exposure is then combined with data from standard safety assessments of components to determine the margin of safety (MOS). This study examined the application of QRA to the safety evaluation of baby diapers, including risk assessments for some diaper ingredient chemicals for which the establishment of acceptable and safe exposure levels was demonstrated.
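
    The MOS arithmetic at the core of this QRA can be shown in a few lines. All numbers are hypothetical placeholders, not values from the study.

```python
# Minimal margin-of-safety (MOS) arithmetic as used in quantitative risk
# assessment. All values are hypothetical: a component's no-observed-
# adverse-effect level (NOAEL) is compared with the estimated skin exposure
# during diaper use.

noael_mg_per_kg_day = 50.0          # hypothetical NOAEL from toxicology data
amount_in_contact_mg = 0.6          # hypothetical extractable amount reaching skin
body_weight_kg = 8.0                # typical infant body weight (assumption)

exposure_mg_per_kg_day = amount_in_contact_mg / body_weight_kg
mos = noael_mg_per_kg_day / exposure_mg_per_kg_day
print(f"exposure = {exposure_mg_per_kg_day:.3f} mg/kg/day, MOS = {mos:.0f}")
# An MOS comfortably above the customary 100-fold safety factor would
# support acceptability of this hypothetical component.
```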

  4. Quantitative, fluorescence-based in-situ assessment of protein expression.

    PubMed

    Moeder, Christopher B; Giltnane, Jennifer M; Moulis, Sharon Pozner; Rimm, David L

    2009-01-01

    As companion diagnostics grow in prevalence and importance, the need for accurate assessment of in situ protein concentrations has increased. Traditional immunohistochemistry (IHC), while valuable for assessment of context of expression, is less valuable for quantification. The lack of rigorous quantitative potential of traditional IHC led to our development of an immunofluorescence-based method now commercialized as the AQUA technology. Immunostaining of tissue samples, image acquisition, and use of AQUA software allow investigators to quickly, efficiently, and accurately measure levels of expression within user-defined subcellular or architectural compartments. IHC analyzed by AQUA shows high reproducibility and demonstrates protein measurement accuracy similar to ELISA assays. The process is largely automated, eliminating potential error, and the resultant scores are exported on a continuous scale. There are now numerous published examples where observations made with this technology are not seen by traditional methods.

  5. Benchmark dose profiles for joint-action continuous data in quantitative risk assessment.

    PubMed

    Deutsch, Roland C; Piegorsch, Walter W

    2013-09-01

    Benchmark analysis is a widely used tool in biomedical and environmental risk assessment. Therein, estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a prespecified benchmark response (BMR) is well understood for the case of an adverse response to a single stimulus. For cases where two agents are studied in tandem, however, the benchmark approach is far less developed. This paper demonstrates how the benchmark modeling paradigm can be expanded from the single-agent setting to joint-action, two-agent studies. Focus is on continuous response outcomes. Extending the single-exposure setting, representations of risk are based on a joint-action dose-response model involving both agents. Based on such a model, the concept of a benchmark profile, a two-dimensional analog of the single-dose BMD along which both agents jointly achieve the specified BMR, is defined for use in quantitative risk characterization and assessment.
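
    As background to the two-agent profile, single-agent BMD estimation for a continuous endpoint can be sketched as follows: given a fitted dose-response curve f(d) and a BMR defined as a relative change in mean response, the BMD solves f(d) = f(0)(1 + BMR). The exponential model and its parameters are illustrative assumptions, not the paper's.

```python
import numpy as np

# Single-agent benchmark-dose sketch for a continuous endpoint: solve
# f(d) = f(0)*(1 + BMR) for the dose d by bisection. Model is illustrative.

def f(d, a=100.0, b=0.05):
    """Hypothetical exponential mean-response model f(d) = a*exp(b*d)."""
    return a * np.exp(b * d)

def bmd(bmr_rel, lo=0.0, hi=100.0, tol=1e-8):
    """Bisection solve of f(d) = f(0)*(1 + bmr_rel); f must be increasing."""
    target = f(0.0) * (1.0 + bmr_rel)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

d10 = bmd(0.10)   # dose giving a 10% change in mean response
print(f"BMD(10%) = {d10:.4f}")
# Analytic check for this model: BMD = ln(1 + BMR)/b = ln(1.1)/0.05
```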

  6. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    Accidents attributable to in-flight loss of control are the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to identify the quantitative standards for assessing upset recovery performance. This review contains current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, and recovery time, and whether that first input was correct or incorrect. Other metrics include the state of the autopilot and autothrottle; control wheel/sidestick movement resulting in pitch and roll; and inputs to the throttle and rudder. In addition, airplane-state measures such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum airspeed, maximum bank angle and maximum g loading are reviewed.

  7. Quantitative photoacoustic assessment of ex-vivo lymph nodes of colorectal cancer patients

    NASA Astrophysics Data System (ADS)

    Sampathkumar, Ashwin; Mamou, Jonathan; Saegusa-Beercroft, Emi; Chitnis, Parag V.; Machi, Junji; Feleppa, Ernest J.

    2015-03-01

    Staging of cancers and selection of appropriate treatment require histological examination of multiple dissected lymph nodes (LNs) per patient, so that a staggering number of nodes require histopathological examination; the finite resources of pathology facilities thus create a severe processing bottleneck. Histologically examining the entire 3D volume of every dissected node is not feasible, and therefore only the central region of each node is examined histologically, which results in severe sampling limitations. In this work, we assess the feasibility of using quantitative photoacoustics (QPA) to overcome the limitations imposed by current procedures and eliminate the resulting undersampling in node assessments. QPA is emerging as a new hybrid modality that assesses tissue properties and classifies tissue type based on multiple estimates derived from spectrum analysis of photoacoustic (PA) radiofrequency (RF) data and from statistical analysis of envelope-signal data derived from the RF signals. Our study seeks to use QPA to distinguish cancerous from non-cancerous regions of dissected LNs and hence serve as a reliable means of imaging and detecting small but clinically significant cancerous foci that would be missed by current methods. Dissected lymph nodes were placed in a water bath, and PA signals were generated using a wavelength-tunable (680-950 nm) laser. A 26-MHz, f-2 transducer was used to sense the PA signals. We present an overview of our experimental setup; provide a statistical analysis of multi-wavelength classification parameters (midband fit, slope, intercept) obtained from the PA signal spectrum generated in the LNs; and compare QPA performance with our established quantitative ultrasound (QUS) techniques in distinguishing metastatic from non-cancerous tissue in dissected LNs. QPA-QUS methods offer a novel general means of tissue typing and evaluation in a broad range of disease-assessment applications, e.g., cardiac, intravascular
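
    The three linear spectral-fit parameters named in the abstract (midband fit, slope, intercept) can be computed as follows from a calibrated power spectrum in dB; the spectrum here is synthetic.

```python
import numpy as np

# Illustrative computation of linear spectral-fit parameters: a calibrated
# power spectrum in dB is fit with a straight line over the analysis band,
# and the midband fit is the fitted value at the center frequency.

freqs_mhz = np.linspace(16.0, 36.0, 101)        # analysis band around 26 MHz
rng = np.random.default_rng(2)
spectrum_db = -40.0 + 0.8 * freqs_mhz + rng.normal(0, 0.5, freqs_mhz.size)

slope, intercept = np.polyfit(freqs_mhz, spectrum_db, 1)   # dB/MHz, dB
midband_fit = slope * 26.0 + intercept                     # fit value at 26 MHz
print(f"slope={slope:.2f} dB/MHz, intercept={intercept:.1f} dB, MBF={midband_fit:.1f} dB")
```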

  8. Quantitative microbial risk assessment for Staphylococcus aureus in natural and processed cheese in Korea.

    PubMed

    Lee, Heeyoung; Kim, Kyunga; Choi, Kyoung-Hee; Yoon, Yohan

    2015-09-01

    This study quantitatively assessed the microbial risk of Staphylococcus aureus in cheese in Korea. The quantitative microbial risk assessment was carried out for natural and processed cheese from factory to consumption. Hazards for S. aureus in cheese were identified through the literature. For exposure assessment, the levels of S. aureus contamination in cheeses were evaluated, and the growth of S. aureus was predicted by predictive models at the surveyed temperatures and at the times of cheese processing and distribution. For hazard characterization, a dose-response model for S. aureus was found, and the model was used to estimate the risk of illness. With these data, simulation models were prepared with @RISK (Palisade Corp., Ithaca, NY) to estimate the risk of illness per person per day in risk characterization. Staphylococcus aureus cell counts on cheese samples from factories and markets were below detection limits (0.30-0.45 log cfu/g), and a pert distribution showed that the mean temperature at markets was 6.63°C. An exponential model [P = 1 − exp(−7.64×10(-8) × N), where N = dose] was deemed appropriate for hazard characterization. The mean temperature of home storage was 4.02°C (log-logistic distribution). The results of risk characterization for S. aureus in natural and processed cheese showed that the mean probability of illness per person per day was higher for processed cheese (mean: 2.24×10(-9); maximum: 7.97×10(-6)) than for natural cheese (mean: 7.84×10(-10); maximum: 2.32×10(-6)). These results indicate that the risk of S. aureus-related foodborne illness due to cheese consumption can be considered low under present conditions in Korea. In addition, the stochastic risk assessment model developed in this study can be useful in establishing microbial criteria for S. aureus in cheese.
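
    The exponential dose-response model reported in the abstract, P = 1 − exp(−rN) with r = 7.64×10(-8), can be evaluated directly; the ingested doses below are hypothetical placeholders.

```python
import numpy as np

# Deterministic illustration of the exponential dose-response model
# P(ill) = 1 - exp(-r * N), with r from the abstract, applied to a few
# hypothetical ingested doses N (cfu).

r = 7.64e-8
doses = np.array([1e3, 1e5, 1e7])               # hypothetical S. aureus doses (cfu)
p_ill = 1.0 - np.exp(-r * doses)
for n, p in zip(doses, p_ill):
    print(f"N = {n:.0e} cfu -> P(illness) = {p:.2e}")
# For small r*N the risk is approximately r*N, so low contamination levels
# translate into very low per-serving risk, consistent with the study's
# conclusion of low risk under current conditions.
```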

  9. Quantitative assessments of burn degree by high-frequency ultrasonic backscattering and statistical model

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Hsun; Huang, Chih-Chung; Wang, Shyh-Hau

    2011-02-01

    An accurate and quantitative modality for assessing burn degree is crucial for determining the further treatments to be applied to burn injury patients. Ultrasound at frequencies higher than 20 MHz has been applied to dermatological diagnosis due to its high resolution and noninvasive capability. Yet a substantial means of sensitively and quantitatively correlating burn degree with ultrasonic measurements is still lacking. Thus, a 50 MHz ultrasound system was developed and implemented to measure ultrasonic signals backscattered from burned skin tissues. Various burn degrees were achieved by placing a 100 °C brass plate onto the dorsal skins of anesthetized rats for durations ranging from 5 to 20 s. The burn degrees were correlated with ultrasonic parameters, including integrated backscatter (IB) and the Nakagami parameter (m), calculated from ultrasonic signals acquired from the burned tissues over a 5 × 1.4 mm (width × depth) area. Results demonstrated that both IB and m decreased exponentially with increasing burn degree. Specifically, an IB of -79.0 ± 2.4 (mean ± standard deviation) dB for normal skin tissues tended to decrease to -94.0 ± 1.3 dB for those burned for 20 s, while the corresponding Nakagami parameters tended to decrease from 0.76 ± 0.08 to 0.45 ± 0.04. The variation of both IB and m was partially associated with changes in the properties of collagen fibers in the burned tissues, as verified by histological sections. In particular, the m parameter may be more sensitive for differentiating burned skin because it has a greater rate of change with respect to different burn durations. These ultrasonic parameters, in conjunction with high-frequency B-mode and Nakagami images, could have the potential to assess burn degree quantitatively.
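
    The Nakagami parameter m can be estimated from envelope samples with a simple moment estimator, as sketched below. The samples here are synthetic, drawn so that the squared envelope is gamma-distributed, and the value m = 0.76 is taken from the abstract's normal-skin result.

```python
import numpy as np

# Moment-based estimate of the Nakagami m parameter from envelope samples.
# For Nakagami(m, Omega), the squared envelope is gamma-distributed with
# shape m and scale Omega/m, which gives an easy synthetic data generator.

def nakagami_m(envelope):
    """Inverse-normalized-variance estimator: m = E[R^2]^2 / Var(R^2)."""
    r2 = envelope ** 2
    return r2.mean() ** 2 / r2.var()

rng = np.random.default_rng(3)
m_true, omega = 0.76, 1.0                       # normal-skin value from the study
env = np.sqrt(rng.gamma(shape=m_true, scale=omega / m_true, size=200_000))
print(f"estimated m = {nakagami_m(env):.2f}")   # should be close to 0.76
```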

  10. Changes in transmural distribution of myocardial perfusion assessed by quantitative intravenous myocardial contrast echocardiography in humans

    PubMed Central

    Fukuda, S; Muro, T; Hozumi, T; Watanabe, H; Shimada, K; Yoshiyama, M; Takeuchi, K; Yoshikawa, J

    2002-01-01

    Objective: To clarify whether changes in transmural distribution of myocardial perfusion under significant coronary artery stenosis can be assessed by quantitative intravenous myocardial contrast echocardiography (MCE) in humans. Methods: 31 patients underwent dipyridamole stress MCE and quantitative coronary angiography. Intravenous MCE was performed by continuous infusion of Levovist. Images were obtained from the apical four chamber view with alternating pulsing intervals both at rest and after dipyridamole infusion. Images were analysed offline by placing regions of interest over both endocardial and epicardial sides of the mid-septum. The background-subtracted intensity versus pulsing interval plots were fitted to an exponential function, y = A(1 − e^(−βt)), where A is the plateau level and β is the rate of rise. Results: Of the 31 patients, 16 had significant stenosis (> 70%) in the left anterior descending artery (group A) and 15 did not (group B). At rest, there were no differences in the A endocardial to epicardial ratio (A-EER) and β-EER between the two groups (mean (SD) 1.2 (0.6) v 1.2 (0.8) and 1.2 (0.7) v 1.1 (0.6), respectively, NS). During hyperaemia, β-EER in group A was significantly lower than that in group B (1.0 (0.5) v 1.4 (0.5), p < 0.05) and A-EER did not differ between the two groups (1.0 (0.5) v 1.2 (0.4), NS). Conclusions: Changes in transmural distribution of myocardial perfusion under significant coronary artery stenosis can be assessed by quantitative intravenous MCE in humans. PMID:12231594
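
    The curve fit described above, y = A(1 − e^(−βt)), can be sketched with numpy alone: for a fixed β the model is linear in A, so A has a closed form and a one-dimensional search over β suffices. The data below are synthetic, not the study's.

```python
import numpy as np

# Fit the replenishment model y = A*(1 - exp(-beta*t)), where A is the
# plateau level and beta the rate of rise. For each trial beta, the
# least-squares A is computed in closed form; synthetic data only.

rng = np.random.default_rng(4)
t = np.array([1, 2, 3, 4, 6, 8, 10], dtype=float)      # pulsing intervals (a.u.)
A_true, beta_true = 20.0, 0.5
y = A_true * (1 - np.exp(-beta_true * t)) + rng.normal(0, 0.2, t.size)

betas = np.linspace(0.05, 2.0, 400)
best = (np.inf, None, None)
for b in betas:
    g = 1 - np.exp(-b * t)
    A = (y @ g) / (g @ g)                   # least-squares plateau for this beta
    sse = np.sum((y - A * g) ** 2)
    if sse < best[0]:
        best = (sse, A, b)
_, A_hat, beta_hat = best
print(f"A ~ {A_hat:.1f}, beta ~ {beta_hat:.2f}")
```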

  11. Quantitative Assessment of Amino Acid Damage upon keV Ion Beam Irradiation Through FTIR Spectroscopy

    NASA Astrophysics Data System (ADS)

    Huang, Qing; Ke, Zhigang; Su, Xi; Yuan, Hang; Zhang, Shuqing; Yu, Zengliang

    2010-06-01

    Ion beam irradiation induces important biological effects, and it is a long-standing task to acquire both qualitative and quantitative assessments of these effects. One effective way to investigate them is to utilize Fourier transform infrared (FTIR) spectroscopy, because it offers sensitive and non-invasive measurements. In this paper a novel protocol was employed to prepare biomolecular samples in the form of thin and transversely uniform solid films suitable for both infrared and low-energy ion beam irradiation experiments. Under irradiation with N+ and Ar+ ion beams of 25 keV at fluences ranging from 5×10(15) ions/cm2 to 2.5×10 ions/cm2, the ion radio-sensitivity of four amino acids, namely glycine, tyrosine, methionine and phenylalanine, was evaluated and compared. The ion beam irradiation caused biomolecular decomposition accompanied by molecular desorption of volatile species, and the damage was dependent on ion type, fluence, energy and type of amino acid. The effectiveness of applying FTIR spectroscopy to the quantitative assessment of the dose dependence of biomolecular damage induced by low-energy ion radiation was thus demonstrated.

  12. Using Non-Invasive Multi-Spectral Imaging to Quantitatively Assess Tissue Vasculature

    SciTech Connect

    Vogel, A; Chernomordik, V; Riley, J; Hassan, M; Amyot, F; Dasgeb, B; Demos, S G; Pursley, R; Little, R; Yarchoan, R; Tao, Y; Gandjbakhche, A H

    2007-10-04

    This research describes a non-invasive, non-contact method used to quantitatively analyze the functional characteristics of tissue. Multi-spectral images collected at several near-infrared wavelengths are input into a mathematical optical skin model that considers the contributions from different analytes in the epidermis and dermis skin layers. Through a reconstruction algorithm, we can quantify the percentage of blood in a given area of tissue and the fraction of that blood that is oxygenated. Imaging normal tissue confirms previously reported values for blood volume fraction and blood oxygenation in tissue and surrounding vasculature, both in the normal state and when ischemia is induced. This methodology has been applied to assess vascular Kaposi's sarcoma lesions and the surrounding tissue before and during experimental therapies. The multi-spectral imaging technique has been combined with laser Doppler imaging to gain additional information. Results indicate that these techniques can provide quantitative and functional information about tissue changes during experimental drug therapy and can investigate progression of disease before changes are visibly apparent, suggesting their potential as complementary imaging techniques to clinical assessment.

  13. Second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygenation

    NASA Astrophysics Data System (ADS)

    Huang, Jiwei; Zhang, Shiwu; Gnyawali, Surya; Sen, Chandan K.; Xu, Ronald X.

    2015-03-01

    We report a second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygen saturation (StO2). The algorithm is based on a forward model of light transport in multilayered skin tissue and an inverse algorithm for StO2 reconstruction. Based on the forward simulation results, a parameter of a second derivative ratio (SDR) is derived as a function of cutaneous tissue StO2. The SDR function is optimized at a wavelength set of 544, 552, 568, 576, 592, and 600 nm so that cutaneous tissue StO2 can be derived with minimal artifacts by blood concentration, tissue scattering, and melanin concentration. The proposed multispectral StO2 imaging algorithm is verified in both benchtop and in vivo experiments. The experimental results show that the proposed multispectral imaging algorithm is able to map cutaneous tissue StO2 in high temporal resolution with reduced measurement artifacts induced by different skin conditions in comparison with other three commercial tissue oxygen measurement systems. These results indicate that the multispectral StO2 imaging technique has the potential for noninvasive and quantitative assessment of skin tissue oxygenation with a high temporal resolution.
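
    The core spectral operation, taking second derivatives to suppress slowly varying baseline contributions before forming a ratio at two wavelengths, can be illustrated as follows. The spectrum, band shapes, and wavelength pair are synthetic placeholders, not the paper's optimized set.

```python
import numpy as np

# Illustration of a second-derivative ratio (SDR): second derivatives of a
# spectrum suppress slowly varying (baseline) contributions such as
# scattering and melanin, so a ratio of second derivatives at two
# wavelengths depends mainly on the band shapes. Synthetic spectrum only.

wl = np.arange(540.0, 604.0, 4.0)                      # nm, covers 544...600
baseline = 0.002 * (wl - 540.0)                        # slowly varying component
bands = 0.05 * np.exp(-((wl - 560.0) / 6.0) ** 2) \
      + 0.04 * np.exp(-((wl - 578.0) / 6.0) ** 2)      # hemoglobin-like bands
spectrum = baseline + bands

d2 = np.gradient(np.gradient(spectrum, wl), wl)        # numerical 2nd derivative
i1, i2 = np.argmin(np.abs(wl - 552.0)), np.argmin(np.abs(wl - 568.0))
sdr = d2[i1] / d2[i2]
print(f"SDR(552/568) = {sdr:.3f}")
# The linear baseline contributes nothing to d2, so sdr depends only on the
# band shapes -- the property a second-derivative algorithm exploits.
```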

  14. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    PubMed

    Valcárcel, M; Lucena, R

    2014-01-01

    The awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the 1950s, and its implementation/assessment is nowadays supported by international standards. There is a tendency to extend its scope of application to other areas of human activity, such as Research, Development and Innovation (R + D + I). In this paper, a model for the quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. This model is based on well-established written standards such as the EFQM Excellence model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to Environmental Research Centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article. PMID:23892022

  16. Quantitative assessment on soil enzyme activities of heavy metal contaminated soils with various soil properties.

    PubMed

    Xian, Yu; Wang, Meie; Chen, Weiping

    2015-11-01

    Soil enzyme activities are greatly influenced by soil properties and could be significant indicators of heavy metal toxicity in soil for bioavailability assessment. Two groups of experiments were conducted to determine the joint effects of heavy metals and soil properties on soil enzyme activities. Results showed that arylsulfatase was the most sensitive soil enzyme and could be used as an indicator to study the enzymatic toxicity of heavy metals under various soil properties. Soil organic matter (SOM) was the dominant factor affecting the activity of arylsulfatase in soil. A quantitative model was derived to predict the changes of arylsulfatase activity with SOM content. When the SOM content was less than the critical point A (1.05% in our study), the arylsulfatase activity dropped rapidly. When the SOM content was greater than the critical point A, the arylsulfatase activity gradually rose to higher levels, showing that the soil microbial activities were enhanced rather than harmed. The SOM content needs to exceed the critical point B (2.42% in our study) to protect the microbial community from harm under severe Pb pollution (500 mg kg(-1) in our study). The quantitative model revealed the pattern of variation of enzymatic toxicity due to heavy metals under various SOM contents. The applicability of the model over a wider range of soil properties needs to be tested. The model, however, may provide a methodological basis for ecological risk assessment of heavy metals in soil.

  17. Second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygenation

    PubMed Central

    Huang, Jiwei; Zhang, Shiwu; Gnyawali, Surya; Sen, Chandan K.; Xu, Ronald X.

    2015-01-01

    We report a second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygen saturation (StO2). The algorithm is based on a forward model of light transport in multilayered skin tissue and an inverse algorithm for StO2 reconstruction. Based on the forward simulation results, a parameter of a second derivative ratio (SDR) is derived as a function of cutaneous tissue StO2. The SDR function is optimized at a wavelength set of 544, 552, 568, 576, 592, and 600 nm so that cutaneous tissue StO2 can be derived with minimal artifacts by blood concentration, tissue scattering, and melanin concentration. The proposed multispectral StO2 imaging algorithm is verified in both benchtop and in vivo experiments. The experimental results show that the proposed multispectral imaging algorithm is able to map cutaneous tissue StO2 in high temporal resolution with reduced measurement artifacts induced by different skin conditions in comparison with other three commercial tissue oxygen measurement systems. These results indicate that the multispectral StO2 imaging technique has the potential for noninvasive and quantitative assessment of skin tissue oxygenation with a high temporal resolution. PMID:25734405

  18. Quantitative safety assessment of computer based I and C systems via modular Markov analysis

    SciTech Connect

    Elks, C. R.; Yu, Y.; Johnson, B. W.

    2006-07-01

    This paper gives a brief overview of a methodology, based on quantitative metrics, for evaluating digital I and C systems that has been under development at the Univ. of Virginia for a number of years. Our quantitative assessment methodology is based on three well understood and extensively practiced disciplines in the dependability assessment field: (1) system-level fault modeling and fault injection, (2) safety and coverage based dependability modeling methods, and (3) statistical estimation of model parameters used for safety prediction. This paper makes two contributions. The first relates to incorporating design-flaw information into homogeneous Markov models when such data are available. The second is to introduce a Markov modeling method for managing the modeling complexity of large distributed I and C systems for the prediction of safety and reliability. The method is called Modular Markov Chain analysis, and it allows Markov models of the system to be composed in a modular manner. In doing so, it addresses two important issues: (1) the models are more visually representative of the functional structure of the system, and (2) important failure dependencies that naturally occur in complex systems are modeled accurately with our approach. (authors)
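
    A toy version of modular composition can be sketched as follows: two independent two-state module chains are combined via a Kronecker sum of their generators, and the transient state probabilities are computed by uniformization. This illustrates the general idea only, not the paper's tool; all rates and times are made up.

```python
import numpy as np

# Toy modular Markov composition: two independent two-state modules
# (Up -> Failed, with failure rates lam1 and lam2) are combined via a
# Kronecker sum of their CTMC generators, and the series-system
# unreliability at mission time t is computed by uniformization.

lam1, lam2, t = 1e-3, 5e-4, 1000.0        # failure rates (1/h), mission time (h)

def gen(lam):
    """Generator matrix over states [Up, Failed]; Failed is absorbing."""
    return np.array([[-lam, lam], [0.0, 0.0]])

Q1, Q2 = gen(lam1), gen(lam2)
I2 = np.eye(2)
Q = np.kron(Q1, I2) + np.kron(I2, Q2)     # modular (Kronecker-sum) composition

# p(t) = p0 @ expm(Q t), evaluated by uniformization.
rate = 1.01 * np.max(-np.diag(Q))
P = np.eye(4) + Q / rate                  # uniformized DTMC, entries >= 0
vec = np.array([1.0, 0.0, 0.0, 0.0])      # start with both modules Up
weight = np.exp(-rate * t)
p = weight * vec
for k in range(1, 200):
    vec = vec @ P
    weight *= rate * t / k
    p = p + weight * vec

unreliability = 1.0 - p[0]                # series system: up only in (Up, Up)
analytic = 1.0 - np.exp(-(lam1 + lam2) * t)
print(f"P(system failed by t) = {unreliability:.4f} (analytic {analytic:.4f})")
```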

  19. Semi-quantitative assessment of pulmonary perfusion in children using dynamic contrast-enhanced MRI

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Thong, William E.; Ou, Phalla

    2013-03-01

    This paper addresses the semi-quantitative assessment of pulmonary perfusion from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in a study population mainly composed of children with pulmonary malformations. The automatic analysis approach proposed is based on the indicator-dilution theory introduced in 1954. First, a robust method is developed to segment the pulmonary artery and the lungs from anatomical MRI data, exploiting 2D and 3D mathematical morphology operators. Second, the time-dependent contrast signal of the lung regions is deconvolved by the arterial input function to assess the local hemodynamic system parameters, i.e., mean transit time, pulmonary blood volume and pulmonary blood flow. The discrete deconvolution method implemented here is a truncated singular value decomposition (tSVD) method. Parametric images for the entire lungs are generated as additional elements for diagnosis and quantitative follow-up. The preliminary results attest to the feasibility of perfusion quantification in pulmonary DCE-MRI and open an interesting alternative to scintigraphy for this type of evaluation, given the wide availability of the technique and its non-invasive nature.
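
    The tSVD deconvolution step can be sketched as follows on synthetic curves: build the causal convolution matrix from the arterial input function, zero out small singular values, and apply the regularized inverse to recover the residue function. All signals and the truncation level are assumptions.

```python
import numpy as np

# Sketch of truncated-SVD (tSVD) deconvolution for perfusion: the tissue
# curve c(t) is the convolution of the arterial input function (AIF) with
# the residue function R(t); inverting the AIF convolution matrix with
# small singular values zeroed gives a regularized estimate of R(t).

n, dt = 60, 1.0                                     # samples, interval (s)
t = np.arange(n) * dt
aif = (t / 4.0) ** 2 * np.exp(-t / 4.0)             # gamma-variate-like AIF
r_true = np.exp(-t / 8.0)                           # residue, mean transit time 8 s

# Causal (lower-triangular) convolution matrix, with c = A @ r and
# A[i, j] = dt * aif[i - j] for i >= j.
A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                   for i in range(n)])
rng = np.random.default_rng(5)
c = A @ r_true + rng.normal(0, 1e-3, n)             # noisy tissue curve

# tSVD inverse: discard singular values below 1% of the largest.
U, s, Vt = np.linalg.svd(A)
s_inv = np.where(s > 0.01 * s[0], 1.0 / s, 0.0)
r_est = Vt.T @ (s_inv * (U.T @ c))
print(f"recovered R(0) ~ {r_est.max():.2f} (true 1.0)")
```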

  20. Single Photon Emission Computed Tomography (SPECT)

    MedlinePlus

    Single Photon Emission Computed Tomography (SPECT) Updated: Sep 11, 2015 ... Persantine) or dobutamine. The tests may take between 2 and 2 1/2 hours. What happens after ...

  1. Cerebral SPECT imaging: Impact on clinical management

    SciTech Connect

    Bloom, M.; Jacobs, S.; Pozniakof, T.

    1994-05-01

    Although cerebral SPECT has been reported to be of value in a variety of neurologic disorders, limited data are available on the value of SPECT relative to clinical management decisions. The purpose of this study was to determine the effect of cerebral SPECT imaging on patient management. A total of 94 consecutive patients referred for clinical evaluation with brain SPECT were included in this study. Patients were assigned to one of nine groups depending on the clinical indication for the study: transient ischemia (16), stroke (20), dementia (18), seizures (5), hemorrhage (13), head trauma (6), arteriovenous malformations (6), encephalopathy (6) and a miscellaneous group (4). All patients were injected with 99mTc HMPAO in doses ranging from 15 mCi to 22 mCi (555 MBq to 814 MBq) and scanned on a triple-headed SPECT gamma camera. Two weeks after completion of the study, a standardized interview was conducted between the nuclear and referring physicians to determine whether the SPECT findings contributed to an alteration in patient management. Overall, patient management was significantly altered in 47% of the cases referred. The greatest impact occurred in the group evaluated for transient ischemia, where 13/16 (81%) of patients had their clinical management altered as a result of the cerebral SPECT findings. Clinical management was altered in 61% of patients referred for evaluation of dementia, 67% of patients evaluated for arteriovenous malformations, and 50% of patients with head trauma. In the remaining groups, alteration in clinical management ranged from 17% to 50% of patients. This study demonstrates the clinical utility of cerebral SPECT imaging, since clinical management was altered as a result of the examination in a significant number of cases. Long-term follow-up will be necessary to determine patient outcome.

  2. A qualitative and quantitative needs assessment of pain management for hospitalized orthopedic patients.

    PubMed

    Cordts, Grace A; Grant, Marian S; Brandt, Lynsey E; Mears, Simon C

    2011-08-08

    Despite advances in pain management, little formal teaching is given to practitioners and nurses in its use for postoperative orthopedic patients. The goal of our study was to determine the educational needs for orthopedic pain management of our residents, nurses, and physical therapists using a quantitative and qualitative assessment. The needs analysis was conducted in a 10-bed orthopedic unit at a teaching hospital and included a survey given to 20 orthopedic residents, 9 nurses, and 6 physical therapists, followed by focus groups addressing barriers to pain control and knowledge of pain management. Key challenges for nurses included not always having breakthrough pain medication orders and the gap in pain management between cessation of patient-controlled analgesia and ordering and administering oral medications. Key challenges for orthopedic residents included treating pain in patients with a history of substance abuse, assessing pain, and determining when to use long-acting vs short-acting opioids. Focus group assessments revealed a lack of training in pain management and the need for better coordination of care between nurses and practitioners and improved education about special needs groups (the elderly and those with substance abuse issues). This needs assessment showed that orthopedic residents and nurses receive little formal education on pain management, despite having to address pain on a daily basis. This information will be used to develop an educational program to improve pain management for postoperative orthopedic patients. An integrated educational program with orthopedic residents, nurses, and physical therapists would promote understanding of issues for each discipline.

  3. Quantitative Integration of Ndt with Probabilistic Fracture Mechanics for the Assessment of Fracture Risk in Pipelines

    NASA Astrophysics Data System (ADS)

    Kurz, J. H.; Cioclov, D.; Dobmann, G.; Boller, C.

    2010-02-01

    In the context of the probabilistic paradigm of fracture risk assessment in structural components, a computer simulation rationale is presented that is based on the integration of quantitative non-destructive inspection and probabilistic fracture mechanics. In this study, failure under static loading is assessed in the format known as the Failure Assessment Diagram (FAD). The fracture risk is evaluated in probabilistic terms. The probabilistic pattern superposed over the deterministic one is implemented via Monte Carlo sampling. The probabilistic fracture simulation yields a more informative analysis in terms of probability of failure. An important feature of this approach is the ability to simulate the influence of the quality and reliability of non-destructive inspection (NDI). This is achieved by algorithmically integrating probabilistic FAD analysis with the probability of detection (POD). The POD information can only be applied in a probabilistic analysis and leads to a refinement of the assessment. By this means, the decrease in probability of failure when POD-characterized NDI is applied can be quantified, so the procedure can be used as a tool for inspection-based lifetime concepts. In this paper, results of sensitivity analyses are presented with the aim of outlining, in terms of non-failure probabilities, the benefits of applying NDI of various qualities compared with the situation when NDI is lacking. This enables better substantiation of both component reliability management and the cost-effectiveness of NDI timing.
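
The integration described above can be sketched as a toy Monte Carlo simulation: sampled flaw sizes and stresses are checked against a simplified R6 Option-1 FAD curve, and a log-logistic POD curve removes (repairs) detected flaws. All distributions, material constants and POD parameters below are illustrative assumptions, not values from the paper.

```python
import math
import random

def pod(a, a50=2.0, slope=4.0):
    """Hypothetical log-logistic probability of detection for flaw depth a (mm)."""
    return 1.0 / (1.0 + (a50 / max(a, 1e-9)) ** slope)

def fails(a, stress, K_IC=60.0, sigma_y=400.0):
    """Simplified FAD check: the point (Lr, Kr) lying outside the curve fails."""
    Lr = stress / sigma_y                                 # load ratio
    Kr = stress * math.sqrt(math.pi * a * 1e-3) / K_IC    # crude K_I for a shallow flaw
    fad = (1 - 0.14 * Lr**2) * (0.3 + 0.7 * math.exp(-0.65 * Lr**6))  # R6 Option 1
    return Kr > fad or Lr > 1.0

def p_failure(n=100_000, inspect=False, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        a = rng.lognormvariate(0.5, 0.6)     # flaw depth, mm (assumed distribution)
        stress = rng.gauss(300.0, 50.0)      # applied stress, MPa (assumed)
        if inspect and rng.random() < pod(a):
            continue                         # detected flaw is repaired before service
        if fails(a, stress):
            failures += 1
    return failures / n

p_no_ndi = p_failure(inspect=False)
p_ndi = p_failure(inspect=True)
```

With NDI enabled, the estimated failure probability drops, mirroring the point that POD-characterized inspection measurably reduces fracture risk.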

  4. Comprehensive, Quantitative Risk Assessment of CO₂ Geologic Sequestration

    SciTech Connect

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO₂ capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis by a cross-functional team of experts to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model.
Comprehensive, quantitative risk assessments
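
The ranking step can be illustrated with a classic FMEA-style risk priority number extended, as the abstract describes, with a fatality-potential factor. The failure modes and scores below are hypothetical, not taken from the QFMEA model itself.

```python
# Hypothetical QFMEA-style ranking: each failure mode is scored on probability,
# severity, detection difficulty and fatality potential (1 = low, 10 = high),
# and ranked by the product, in the spirit of a classic FMEA risk priority number.
risks = [
    {"mode": "wellbore casing leak",  "prob": 3, "sev": 8, "det": 6, "fat": 2},
    {"mode": "pipeline rupture",      "prob": 2, "sev": 9, "det": 3, "fat": 7},
    {"mode": "caprock CO2 migration", "prob": 4, "sev": 7, "det": 8, "fat": 1},
]

for r in risks:
    r["rpn"] = r["prob"] * r["sev"] * r["det"] * r["fat"]

# Highest-priority risks first.
ranked = sorted(risks, key=lambda r: r["rpn"], reverse=True)
```

A real QFMEA adds the cost and insurance terms the abstract lists; this sketch shows only the prioritization logic.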

  5. Quantitative assessment of properties of make-up products by video imaging: application to lipsticks.

    PubMed

    Korichi, Rodolphe; Provost, Robin; Heusèle, Catherine; Schnebert, Sylvianne

    2000-11-01

    BACKGROUND/AIMS: The different properties and visual effects of lipstick have been studied by image analysis directly on volunteers. METHODS: After controlling the volunteer's position mechanically using an ophthalmic table and visually using an acquisition mask, which provides luminance indicators and guide marks, we acquired video colour images of the make-up area. From these images, we quantified the colour, gloss, covering power, long-lasting effect and streakiness using computer programs. RESULTS/CONCLUSION: Quantitative colorimetric assessment requires the transformation of the RGB components obtained by a video colour camera into the CIELAB colorimetric space. Each coordinate of the L*a*b* space was expressed as a function of R, G, B by a statistical method of polynomial approximation. A study using 24 colour images extracted from a Pantone® palette showed a very good correlation with a Minolta Colorimeter® CR 300. Colour assessment on volunteers required a segmentation method based on entropy maximization, the aim being to separate the colour information returned by the skin from that of the make-up area. This was very useful for precisely delimiting the contour between the skin and the product when the colours were almost identical, and for evaluating streakiness. From this colour segmentation, an algorithm was developed to search for the shades most represented in the overall colour of the make-up area. The capacity to replicate what the consumer perceives of the make-up product, the possibility of carrying out studies without any contact with the skin surface, and the constant improvement of software and video acquisition systems all make video imaging a very useful tool in the quantitative assessment of the properties and visual effects of a make-up product. PMID:11428961
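
The RGB-to-CIELAB step can be sketched as a least-squares fit of polynomial terms in R, G, B. The paper's exact polynomial order is not stated, so second order is assumed here, and the "calibration chart" is synthetic: Lab values are generated from an assumed quadratic ground truth standing in for colorimeter measurements.

```python
import numpy as np

def poly_features(rgb):
    """Second-order polynomial terms in R, G, B (assumed order for the fit)."""
    r, g, b = rgb
    return np.array([1.0, r, g, b, r*r, g*g, b*b, r*g, r*b, g*b])

def fit_rgb_to_lab(rgb_patches, lab_patches):
    """Least-squares fit of L*, a*, b* as polynomials of R, G, B."""
    X = np.array([poly_features(p) for p in rgb_patches])
    Y = np.array(lab_patches)
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return coef

def predict_lab(coef, rgb):
    return poly_features(rgb) @ coef

# Synthetic 24-patch calibration set from an assumed quadratic ground truth.
rng = np.random.default_rng(0)
true_coef = rng.normal(size=(10, 3))
patches = rng.uniform(0.0, 1.0, size=(24, 3))
lab_ref = np.array([poly_features(p) @ true_coef for p in patches])

coef = fit_rgb_to_lab(patches, lab_ref)
err = max(np.abs(predict_lab(coef, p) - l).max() for p, l in zip(patches, lab_ref))
```

With 24 patches and 10 polynomial terms the system is overdetermined, which is why a statistical least-squares fit (rather than exact interpolation) is the natural choice.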

  7. Zebrafish Caudal Fin Angiogenesis Assay—Advanced Quantitative Assessment Including 3-Way Correlative Microscopy

    PubMed Central

    Correa Shokiche, Carlos; Schaad, Laura; Triet, Ramona; Jazwinska, Anna; Tschanz, Stefan A.; Djonov, Valentin

    2016-01-01

    Background Researchers evaluating angiomodulating compounds as part of scientific projects or pre-clinical studies are often confronted with the limitations of the applied animal models. Rough, insufficient early-stage compound assessment without reliable quantification of the vascular response accounts, at least partially, for the low rate of transition to the clinic. Objective To establish an advanced, rapid and cost-effective angiogenesis assay for the precise and sensitive assessment of angiomodulating compounds using zebrafish caudal fin regeneration. It should provide information regarding the angiogenic mechanisms involved and should include qualitative and quantitative data on drug effects in a non-biased and time-efficient way. Approach & Results Basic vascular parameters (total regenerated area, vascular projection area, contour length, vessel area density) were extracted from in vivo fluorescence microscopy images using a stereological approach. Skeletonization of the vasculature by our custom-made software Skelios provided additional parameters, including "graph energy" and "distance to farthest node". The latter gave important insights into the complexity, connectivity and maturation status of the regenerating vascular network. The use of a reference point (vascular parameters prior to amputation) is unique to the model and crucial for a proper assessment. Additionally, the assay provides exceptional possibilities for correlative microscopy by combining in vivo imaging and morphological investigation of the area of interest. The 3-way correlative microscopy links the dynamic changes in vivo with their structural substrate at the subcellular level. Conclusions The improved zebrafish fin regeneration model with advanced quantitative analysis and optional 3-way correlative morphology is a promising in vivo angiogenesis assay, well suited for basic research and preclinical investigations. PMID:26950851
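
A "distance to farthest node" parameter can be computed on a skeletonized vessel network with a plain breadth-first search over the node graph. The toy adjacency list below is a stand-in for a Skelios-style skeleton graph, not output from that software.

```python
from collections import deque

def distance_to_farthest_node(adj, start):
    """BFS over a skeletonized vessel graph; returns the hop distance from
    `start` (e.g. a node at the amputation plane) to the farthest reachable node."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nbr in adj[node]:
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return max(dist.values())

# Toy skeleton: a main vessel (0-1-2-3) with a side branch (1-4-5).
skeleton = {0: [1], 1: [0, 2, 4], 2: [1, 3], 3: [2], 4: [1, 5], 5: [4]}
```

On real data the hop count would be replaced by edge lengths in micrometres, but the traversal is the same.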

  8. Basic concepts in three-part quantitative assessments of undiscovered mineral resources

    USGS Publications Warehouse

    Singer, D.A.

    1993-01-01

    Since 1975, mineral resource assessments have been made for over 27 areas covering 5×10⁶ km² at various scales using what is now called the three-part form of quantitative assessment. In these assessments, (1) areas are delineated according to the types of deposits permitted by the geology, (2) the amount of metal and some ore characteristics are estimated using grade and tonnage models, and (3) the number of undiscovered deposits of each type is estimated. Permissive boundaries are drawn for one or more deposit types such that the probability of a deposit lying outside the boundary is negligible, that is, less than 1 in 100,000 to 1,000,000. Grade and tonnage models combined with estimates of the number of deposits are the fundamental means of translating geologists' resource assessments into a language that economists can use. Estimates of the number of deposits explicitly represent the probability (or degree of belief) that some fixed but unknown number of undiscovered deposits exist in the delineated tracts. Estimates are by deposit type and must be consistent with the grade and tonnage model. Other guidelines for these estimates include (1) frequency of deposits from well-explored areas, (2) local deposit extrapolations, (3) counting and assigning probabilities to anomalies and occurrences, (4) process constraints, (5) relative frequencies of related deposit types, and (6) area spatial limits. In most cases, estimates are made subjectively, as they are in meteorology, gambling, and geologic interpretations. In three-part assessments, the estimates are internally consistent because delineated tracts are consistent with descriptive models, grade and tonnage models are consistent with descriptive models, as well as with known deposits in the area, and estimates of number of deposits are consistent with grade and tonnage models. All available information is used in the assessment, and uncertainty is explicitly represented. © 1993 Oxford University Press.
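
Combining a subjective number-of-deposits estimate with grade and tonnage models is naturally expressed as a Monte Carlo simulation. A minimal sketch, with every distribution invented for illustration (a real assessment would use the elicited deposit-number probabilities and published grade-tonnage models for the deposit type):

```python
import random

def simulate_contained_metal(n_trials=50_000, seed=7):
    """Monte Carlo combination of (number of deposits) x (tonnage x grade)."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        # Subjective deposit-number estimate encoded as a discrete distribution.
        n_deposits = rng.choices([0, 1, 2, 3, 5], weights=[30, 35, 20, 10, 5])[0]
        metal = 0.0
        for _ in range(n_deposits):
            tonnage = rng.lognormvariate(15.0, 1.0)   # ore tonnes (assumed model)
            grade = rng.lognormvariate(-5.0, 0.5)     # metal fraction (assumed)
            metal += tonnage * grade
        totals.append(metal)
    totals.sort()
    return {"P90": totals[int(0.10 * n_trials)],      # exceeded with 90% probability
            "P50": totals[int(0.50 * n_trials)],
            "P10": totals[int(0.90 * n_trials)]}

result = simulate_contained_metal()
```

Because the deposit count can be zero, the conservative P90 estimate can legitimately be zero metal, which is exactly how uncertainty is "explicitly represented" in such assessments.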

  9. Assessment of Nutritional Status of Nepalese Hemodialysis Patients by Anthropometric Examinations and Modified Quantitative Subjective Global Assessment

    PubMed Central

    Sedhain, Arun; Hada, Rajani; Agrawal, Rajendra Kumar; Bhattarai, Gandhi R; Baral, Anil

    2015-01-01

    OBJECTIVE To assess the nutritional status of patients on maintenance hemodialysis by using modified quantitative subjective global assessment (MQSGA) and anthropometric measurements. METHOD We conducted a cross-sectional descriptive analytical study to assess the nutritional status of fifty-four patients with chronic kidney disease undergoing maintenance hemodialysis by using MQSGA and different anthropometric and laboratory measurements, including body mass index (BMI), mid-arm circumference (MAC), mid-arm muscle circumference (MAMC), triceps skin fold (TSF) and biceps skin fold (BSF), serum albumin, C-reactive protein (CRP) and lipid profile, in a government tertiary hospital at Kathmandu, Nepal. RESULTS Based on MQSGA criteria, 66.7% of the patients suffered from mild to moderate malnutrition and 33.3% were well nourished. None of the patients were severely malnourished. CRP was positive in 56.3% of patients. Serum albumin, MAC and BMI were (mean ± SD) 4.0 ± 0.3 mg/dl, 22 ± 2.6 cm and 19.6 ± 3.2 kg/m² respectively. MQSGA showed negative correlation with MAC (r = −0.563; P < 0.001), BMI (r = −0.448; P < 0.001), MAMC (r = −0.506; P < 0.0001), TSF (r = −0.483; P < 0.0002), and BSF (r = −0.508; P < 0.0001). Negative correlation of MQSGA was also found with total cholesterol, triglyceride, LDL cholesterol and HDL cholesterol, without statistical significance. CONCLUSION Mild to moderate malnutrition was present in two thirds of the patients undergoing hemodialysis. Anthropometric measurements such as BMI, MAC, MAMC, BSF and TSF were negatively correlated with MQSGA. Anthropometric and laboratory assessment tools could be used for nutritional assessment as they are relatively easy, cheap and practical markers of nutritional status. PMID:26327781

  10. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature, or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depth to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature, and one curve derived specifically for our case study area, were used to determine the damage for the different material and building types of the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with building area and number of floors
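
The intensity-vulnerability-value chain described above can be sketched for a single building: annual risk is the sum over flow-depth classes of occurrence probability times vulnerability times building value. The curve shape, class probabilities and building value below are illustrative, not the study's data.

```python
def vulnerability(depth_m):
    """Illustrative vulnerability curve: damage fraction rising with flow depth
    and saturating at 1.0 (real curves are empirical and building-type specific)."""
    return min(1.0, 0.25 * depth_m)

def building_risk(class_probs, vuln, value_eur):
    """Expected annual loss: sum over depth classes of P(class) x V(depth) x value."""
    return sum(p * vuln(depth) for depth, p in class_probs.items()) * value_eur

# Annual occurrence probabilities for three of the ten depth classes (assumed).
class_probs = {0.5: 0.010, 1.5: 0.004, 3.0: 0.001}
risk = building_risk(class_probs, vulnerability, value_eur=300_000.0)
```

Running this over every building and repeating it per time slice (per DEM epoch) is what turns the static computation into the multi-temporal risk change the study targets.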

  11. Quantitative Assessment of Upstream Source Influences on TGM Observations at Three CAMNet Sites

    NASA Astrophysics Data System (ADS)

    Wen, D.; Lin, J. C.; Meng, F.; Gbor, P. K.; He, Z.; Sloan, J. J.

    2009-05-01

    Mercury is a persistent and toxic substance in the environment. Exposure to high levels of mercury can cause a range of adverse health effects, including damage to the nervous system, the reproductive system and childhood development. Proper recognition and prediction of atmospheric levels of mercury can help avoid these adverse effects; however, they cannot be achieved without accurate, quantitative identification of source influences, which is a great challenge due to the complexity of Hg behaviour in the air. The objective of this study is to present a new method to simulate Hg concentrations at the location of a monitoring site and quantitatively assess its upstream source influences. Hourly total gaseous mercury (TGM) concentrations at three CAMNet monitoring sites (receptors) in Ontario were predicted for four selected periods using the Stochastic Time-Inverted Lagrangian Transport (STILT) model, which is capable of representing near-field influences that are not resolved by typical grid sizes in transport models. The model was modified to deal with Hg deposition and point-source Hg emissions. The model-predicted Hg concentrations were compared with observations, as well as with the results from a CMAQ-Hg simulation in which the same emission and meteorology inputs were used. The comparisons show that STILT-predicted Hg concentrations agree well with observations, and are generally closer to the observations than those predicted by CMAQ-Hg. The better performance of the STILT simulation can be attributed to its ability to account for near-field influences. STILT was also applied to assess quantitatively the relative importance of different upstream source regions for the selected episodes. The assessment was made based on emission fluxes and STILT footprints, i.e., sensitivities of atmospheric concentrations to upstream surface fluxes. The results indicated that the main source regions of observed low Hg concentrations were in Northeastern Ontario, whereas
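
The footprint-based attribution amounts to multiplying each upstream grid cell's footprint (the receptor's sensitivity to surface flux in that cell) by the cell's emission flux, then summing. A minimal sketch with made-up 3×3 grids and unspecified (consistent) units:

```python
# Receptor concentration contribution = sum over cells of footprint x flux.
# Footprint values express concentration change per unit surface flux; all
# numbers here are invented for illustration.
footprint = [
    [0.0, 0.1, 0.3],
    [0.2, 0.5, 0.1],
    [0.0, 0.2, 0.0],
]
flux = [
    [0.0, 0.0, 2.0],
    [1.0, 4.0, 0.0],
    [0.0, 0.5, 0.0],
]

def contribution(footprint, flux):
    return sum(f * e for frow, erow in zip(footprint, flux)
                     for f, e in zip(frow, erow))

total = contribution(footprint, flux)
```

Summing cell-by-cell products over sub-regions instead of the whole grid is what yields the relative importance of each upstream source region.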

  12. Preclinical properties and human in vivo assessment of 123I-ABC577 as a novel SPECT agent for imaging amyloid-β.

    PubMed

    Maya, Yoshifumi; Okumura, Yuki; Kobayashi, Ryohei; Onishi, Takako; Shoyama, Yoshinari; Barret, Olivier; Alagille, David; Jennings, Danna; Marek, Kenneth; Seibyl, John; Tamagnan, Gilles; Tanaka, Akihiro; Shirakami, Yoshifumi

    2016-01-01

    Non-invasive imaging of amyloid-β in the brain, a hallmark of Alzheimer's disease, may support earlier and more accurate diagnosis of the disease. In this study, we assessed the novel single photon emission computed tomography tracer 123I-ABC577 as a potential imaging biomarker for amyloid-β in the brain. The radio-iodinated imidazopyridine derivative 123I-ABC577 was designed as a candidate for a novel amyloid-β imaging agent. The binding affinity of 123I-ABC577 for amyloid-β was evaluated by saturation binding assay and in vitro autoradiography using post-mortem Alzheimer's disease brain tissue. Biodistribution experiments using normal rats were performed to evaluate the biokinetics of 123I-ABC577. Furthermore, to validate 123I-ABC577 as a biomarker for Alzheimer's disease, we performed a clinical study to compare the brain uptake of 123I-ABC577 in three patients with Alzheimer's disease and three healthy control subjects. 123I-ABC577 binding was quantified by use of the standardized uptake value ratio, which was calculated for the cortex using the cerebellum as a reference region. Standardized uptake value ratio images were visually scored as positive or negative. As a result, 123I-ABC577 showed high binding affinity for amyloid-β and desirable pharmacokinetics in the preclinical studies. In the clinical study, 123I-ABC577 was an effective marker for discriminating patients with Alzheimer's disease from healthy control subjects based on visual images or the ratio of cortical-to-cerebellar binding. In patients with Alzheimer's disease, 123I-ABC577 demonstrated clear retention in cortical regions known to accumulate amyloid, such as the frontal cortex, temporal cortex, and posterior cingulate. In contrast, less, more diffuse, and non-specific uptake without localization to these key regions was observed in healthy controls. At 150 min after injection, the cortical standardized uptake value ratio increased by ∼ 60% in patients with
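
The cortical-to-cerebellar quantification can be sketched directly: the standardized uptake value ratio (SUVR) is the mean cortical uptake divided by the mean cerebellar uptake. The regional counts and the 1.4 positivity cutoff below are illustrative assumptions, not values from the study.

```python
def suvr(cortex_mean, cerebellum_mean):
    """Standardized uptake value ratio: target cortical uptake normalized to
    the cerebellum, the reference region used in the study."""
    return cortex_mean / cerebellum_mean

# Illustrative regional mean counts (not data from the paper).
regions = {"frontal": 14200.0, "temporal": 13800.0, "posterior_cingulate": 15100.0}
cerebellum = 10050.0

ratios = {name: suvr(counts, cerebellum) for name, counts in regions.items()}
positive = any(r > 1.4 for r in ratios.values())   # assumed positivity cutoff
```

Using a reference region cancels global factors (injected dose, body weight, scanner sensitivity), which is why the ratio, not the raw uptake, discriminates patients from controls.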

  13. The feasibility of using CT-guided ROI for semiquantifying striatal dopamine transporter availability in a hybrid SPECT/CT system.

    PubMed

    Hsu, Chien-Chin; Chang, Yen-Hsiang; Lin, Wei-Che; Tang, Shu-Wen; Wang, Pei-Wen; Huang, Yung-Cheng; Chiu, Nan-Tsing

    2014-01-01

    A hybrid SPECT/CT system provides accurate coregistration of functional and morphological images. CT-guided region of interest (ROI) delineation for semiquantifying striatal dopamine transporter (DAT) availability may therefore be a feasible method. We assessed the intra- and interobserver reproducibility of manual SPECT and CT-guided ROI methods and compared their semiquantitative data with data from MRI-guided ROIs. We enrolled twenty-eight patients who underwent Tc-99m TRODAT-1 brain SPECT/CT and brain MRI. ROIs of the striatum, caudate, putamen, and occipital cortex were manually delineated on the SPECT, CT, and MRI. ROIs from CT and MRI were transferred to the coregistered SPECT for semiquantification. The striatal, caudate, and putamen nondisplaceable binding potential (BPND) were calculated. Using CT-guided ROIs had higher intra- and interobserver concordance correlation coefficients, closer Bland-Altman biases to zero, and narrower limits of agreement than using manual SPECT ROIs. The correlation coefficients of striatal, caudate, and putamen BPND were good between manual SPECT and MRI-guided ROI methods, and even better between CT-guided and MRI-guided ROI methods. In conclusion, CT-guided ROI delineation for semiquantifying striatal DAT availability in a hybrid SPECT/CT system is highly reproducible, and the semiquantitative data correlate well with data from MRI-guided ROIs.
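
Once ROI mean counts are extracted, the BPND semiquantification is a simple tissue-ratio computation, (target − reference) / reference, with the occipital cortex as the nondisplaceable reference. The counts below are illustrative, not data from the study.

```python
def bp_nd(roi_mean, reference_mean):
    """Nondisplaceable binding potential from mean counts in a target ROI and
    the occipital reference region: (target - reference) / reference."""
    return (roi_mean - reference_mean) / reference_mean

# Illustrative mean counts per voxel (not values from the paper).
occipital = 820.0
targets = {"striatum": 1650.0, "caudate": 1710.0, "putamen": 1600.0}
bp = {roi: bp_nd(counts, occipital) for roi, counts in targets.items()}
```

The reproducibility question in the study is then entirely about how the ROI boundaries (manual SPECT vs CT-guided vs MRI-guided) change these mean counts.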

  14. Design of multipinhole collimators for small animal SPECT

    SciTech Connect

    Mark Smith; Steve Meikle; Stanislaw Majewski; Andrew Weisenberger

    2003-10-01

    High resolution single photon emission computed tomography (SPECT) of small animals with pinhole collimators is hampered by poor sensitivity. For the target application of imaging the brain of an unanesthetized, unrestrained mouse, the design of multipinhole collimators for small, compact gamma cameras was studied using numerical simulations. The collimators were designed to image the mouse at different axial positions inside a tube aligned with the axis of rotation (AOR) of the SPECT system. The proposed system consisted of a 10 cm × 20 cm pixelated detector with 2 cm from the AOR to the collimator mask and a focal length of 7 cm for a magnification of 3.5 on the AOR. A digital mouse phantom was created with a brain:background activity concentration ratio of 8:1. The brain region contained small disks and hot and cool cylinders for contrast and signal-to-noise ratio (SNR) quantitation. SPECT projection data were simulated at 120 angles for four pinhole masks having 1, 5 and two arrangements of 13 pinholes. Projections were scaled to model a 30 min study with doses of 0.3, 1 and 3 mCi in the mouse. Images were reconstructed with an iterative ordered subset expectation maximization (OSEM) algorithm. The peak SNR improved with multipinhole collimation compared with single pinhole imaging when there was no increased activity in the liver or bladder. With such activity, peak SNR was about the same for all collimators. For all cases peak SNR occurred at higher contrast values with multipinhole collimation. The extended axial range of multipinhole arrays enabled the mouse brain to be imaged with decreased axial blurring when the mouse was translated axially. These results suggest that mouse brains can be successfully imaged at realistic activity levels with multipinhole arrays and projection data multiplexing.
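
Two pieces of the stated geometry are easy to reproduce: the magnification follows from focal length over object distance (7 cm / 2 cm = 3.5 on the AOR), and the sensitivity motivation for multiple pinholes follows from the standard single-pinhole efficiency approximation, g ≈ d²·cos³θ / (16·h²). The 1 mm pinhole diameter below is an assumption, and the N-fold gain ignores multiplexing losses.

```python
import math

def magnification(focal_length_cm, source_to_pinhole_cm):
    """Pinhole magnification m = f / b (detector distance over object distance)."""
    return focal_length_cm / source_to_pinhole_cm

def pinhole_sensitivity(d_eff_cm, h_cm, theta_rad=0.0):
    """Geometric efficiency of a single pinhole for a source at distance h and
    off-axis angle theta: g ~ d_e^2 cos^3(theta) / (16 h^2)."""
    return d_eff_cm**2 * math.cos(theta_rad)**3 / (16.0 * h_cm**2)

m = magnification(7.0, 2.0)            # geometry from the abstract
g1 = pinhole_sensitivity(0.1, 2.0)     # single 1 mm pinhole (assumed diameter)
g13 = 13 * g1                          # ideal 13-pinhole gain, ignoring multiplexing
```

The trade-off the simulations explore is exactly this: each added pinhole buys sensitivity at the cost of projection overlap (multiplexing), which the reconstruction must disentangle.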

  15. Malignant gliomas: current perspectives in diagnosis, treatment, and early response assessment using advanced quantitative imaging methods.

    PubMed

    Ahmed, Rafay; Oborski, Matthew J; Hwang, Misun; Lieberman, Frank S; Mountz, James M

    2014-01-01

    Malignant gliomas consist of glioblastomas, anaplastic astrocytomas, anaplastic oligodendrogliomas and anaplastic oligoastrocytomas, and some less common tumors such as anaplastic ependymomas and anaplastic gangliogliomas. Malignant gliomas have high morbidity and mortality. Even with optimal treatment, median survival is only 12-15 months for glioblastomas and 2-5 years for anaplastic gliomas. However, recent advances in imaging and quantitative analysis of image data have led to earlier diagnosis of tumors and of tumor response to therapy, providing oncologists with a greater time window for therapy management. In addition, improved understanding of tumor biology, genetics, and resistance mechanisms has enhanced surgical techniques, chemotherapy methods, and radiotherapy administration. After proper diagnosis and institution of appropriate therapy, there is now a vital need for quantitative methods that can sensitively detect malignant glioma response to therapy at early follow-up times, when changes in management of nonresponders can have their greatest effect. Currently, response is largely evaluated by measuring magnetic resonance contrast and size change, but this approach does not take into account the key biologic steps that precede tumor size reduction. Molecular imaging is ideally suited to measuring early response by quantifying cellular metabolism, proliferation, and apoptosis, activities altered early in treatment. We expect that successful integration of quantitative imaging biomarker assessment into the early phase of clinical trials could provide a novel approach for testing new therapies and, importantly, for facilitating patient management, sparing patients from weeks or months of toxicity and ineffective treatment. This review will present an overview of epidemiology, molecular pathogenesis, and current advances in the diagnosis and management of malignant gliomas.

  16. Influences of reconstruction and attenuation correction in brain SPECT images obtained by the hybrid SPECT/CT device: evaluation with a 3-dimensional brain phantom

    PubMed Central

    Akamatsu, Mana; Yamashita, Yasuo; Akamatsu, Go; Tsutsui, Yuji; Ohya, Nobuyoshi; Nakamura, Yasuhiko; Sasaki, Masayuki

    2014-01-01

    Objective(s): The aim of this study was to evaluate the influences of reconstruction and attenuation correction on the differences in the radioactivity distributions in 123I brain SPECT obtained by the hybrid SPECT/CT device. Methods: We used the 3-dimensional (3D) brain phantom, which imitates the precise structure of gray matter, white matter and bone regions. It was filled with 123I solution (20.1 kBq/mL) in the gray matter region and with K2HPO4 in the bone region. The SPECT/CT data were acquired by the hybrid SPECT/CT device. SPECT images were reconstructed by using filtered back projection with uniform attenuation correction (FBP-uAC), 3D ordered-subsets expectation-maximization with uniform AC (3D-OSEM-uAC) and 3D OSEM with CT-based non-uniform AC (3D-OSEM-CTAC). We evaluated the differences in the radioactivity distributions among these reconstruction methods using a 3D digital phantom, which was developed from CT images of the 3D brain phantom, as a reference. The normalized mean square error (NMSE) and regional radioactivity were calculated to evaluate the similarity of SPECT images to the 3D digital phantom. Results: The NMSE values were 0.0811 in FBP-uAC, 0.0914 in 3D-OSEM-uAC and 0.0766 in 3D-OSEM-CTAC. The regional radioactivity of FBP-uAC was 11.5% lower in the middle cerebral artery territory, and that of 3D-OSEM-uAC was 5.8% higher in the anterior cerebral artery territory, compared with the digital phantom. On the other hand, that of 3D-OSEM-CTAC was 1.8% lower in all brain areas. Conclusion: By using the hybrid SPECT/CT device, the brain SPECT reconstructed by 3D-OSEM with CT attenuation correction can provide an accurate assessment of the distribution of brain radioactivity. PMID:27408856
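
The NMSE similarity metric can be sketched as follows; the abstract does not give the exact normalization, so the sum of squared differences divided by the sum of squared reference values is assumed, with a toy 1-D activity profile standing in for the 3-D phantom.

```python
def nmse(image, reference):
    """Normalized mean square error between a reconstructed image and the
    digital phantom: sum((I - R)^2) / sum(R^2) (assumed normalization)."""
    num = sum((i - r) ** 2 for i, r in zip(image, reference))
    den = sum(r ** 2 for r in reference)
    return num / den

phantom = [0.0, 1.0, 1.0, 0.5, 0.0]      # toy 1-D activity profile (reference)
recon   = [0.1, 0.9, 1.1, 0.4, 0.05]     # toy reconstruction
error = nmse(recon, phantom)
```

A lower NMSE means the reconstruction is globally closer to the phantom, which is how the three reconstruction/attenuation-correction variants are ranked in the study.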

  17. The application of quantitative risk assessment to microbial food safety risks.

    PubMed

    Jaykus, L A

    1996-01-01

    Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been done with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), which is formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and numbers may be affected by numerous factors within the food chain, with all of these factors representing significant stages in food production, handling, and consumption, in a farm-to-table type of approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm. 
Each of the sequential steps in QRA are discussed in detail, providing information on current applications, tools for conducting the analyses, and methodological and/or data
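
The four QRA phases can be sketched as a toy farm-to-table simulation: exposure assessment propagates pathogen numbers through production stages, and a dose-response model converts ingested dose into infection risk. All distributions and the parameter `r` below are illustrative assumptions, not values from the paper:

```python
import math
import random

random.seed(1)

def infection_prob(dose, r=0.01):
    # Dose-response assessment: exponential model P = 1 - exp(-r * dose);
    # r is a pathogen-specific fit parameter (value here is illustrative).
    return 1.0 - math.exp(-r * dose)

def simulate_serving():
    # Exposure assessment: pathogen level changes at each farm-to-table
    # stage (all distributions are made-up illustrative assumptions).
    log_cfu = random.gauss(2.0, 0.5)   # initial contamination (log10 CFU)
    log_cfu += random.gauss(1.0, 0.3)  # growth during distribution/storage
    log_cfu -= random.gauss(3.0, 0.5)  # reduction from cooking
    return 10.0 ** log_cfu             # ingested dose, CFU

# Risk characterization: average infection probability per serving.
n = 10_000
mean_risk = sum(infection_prob(simulate_serving()) for _ in range(n)) / n
print(f"mean per-serving infection risk: {mean_risk:.4f}")
```

Monte Carlo sampling of this kind is one standard way the exposure and dose-response phases are combined in microbial QRA.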

  18. Quantitative assessment of cutaneous sensory function in subjects with neurologic disease.

    PubMed

    Conomy, J P; Barnes, K L

    1976-12-01

    Based upon techniques devised for the behavioral study of cutaneous sensation in monkeys, a method has been developed which studies quantitatively cutaneous sensation in man. The technique is analogous to the von Békésy method of audiometry and employs a subject-operated stimulus and signalling device. In tests utilizing electrical stimulation of the skin surfaces the subject serves as his own control for comparison of one cutaneous zone with another and from one trial session to another. A permanent, written record of stimulus and nonverbal perceptual response is produced in this instrumental method which permits statistical analysis of responses. The analysis includes determination of cutaneous sensory thresholds, limits of stimulus intensity during detection, duration of perception, detection cycle rates, and persistence indices. This instrumental method of cutaneous sensory assessment is quantifiable, free of verbal bias, and repeatable in terms of defined stimulus strengths. In applied clinical studies, patients with peripheral nerve lesions show elevations of perceptual thresholds, reduced numbers of detection-disappearance cycles per unit time, prolonged, contorted decay slopes, and occasionally persistence of perception in the absence of stimulation. Patients with central lesions have variable threshold abnormalities, but little slowing of cycle rate or perceptual persistence. These quantitative sensation parameters can be evaluated longitudinally during the course of an illness and its treatment. The method has potential use in the investigation of basic aspects of sensation and its interactions with behavior.

  19. Residual Isocyanates in Medical Devices and Products: A Qualitative and Quantitative Assessment

    PubMed Central

    Franklin, Gillian; Harari, Homero; Ahsan, Samavi; Bello, Dhimiter; Sterling, David A.; Nedrelow, Jonathan; Raynaud, Scott; Biswas, Swati; Liu, Youcheng

    2016-01-01

    We conducted a pilot qualitative and quantitative assessment of residual isocyanates and their potential initial exposures in neonates, as little is known about their contact effect. After a neonatal intensive care unit (NICU) stockroom inventory, polyurethane (PU) and PU foam (PUF) devices and products were qualitatively evaluated for residual isocyanates using Surface SWYPE™. Those containing isocyanates were quantitatively tested for methylene diphenyl diisocyanate (MDI) species, using a UPLC-UV-MS/MS method. Ten of 37 products and devices tested indicated both free and bound residual surface isocyanates; PU/PUF pieces contained aromatic isocyanates; one product contained aliphatic isocyanates. Overall, quantified mean MDI concentrations were low (4,4′-MDI: 0.52 to 140.1 pg/mg; 2,4′-MDI: 0.01 to 4.48 pg/mg). The 4,4′-MDI species had the highest measured concentration (280 pg/mg). Commonly used medical devices/products contain low, but measurable concentrations of residual isocyanates. Quantifying other isocyanate species and neonatal skin exposure to isocyanates from these devices and products requires further investigation. PMID:27773989

  20. Quantitative assessment of mis-registration issues of diffusion tensor imaging (DTI)

    NASA Astrophysics Data System (ADS)

    Li, Yue; Jiang, Hangyi; Mori, Susumu

    2012-02-01

    Image distortions caused by eddy current and patient motion have been two major sources of the mis-registration issues in diffusion tensor imaging (DTI). Numerous registration methods have been proposed to correct them. However, quality control of DTI remains an important issue, because it is rarely reported how much mis-registration existed and how well it was corrected. In this paper, we propose a method for quantitative reporting of DTI data quality. This registration method minimizes a cost function based on mean square tensor fitting errors. Registration with twelve-parameter full affine transformation is used. From the registration result, distortion and motion parameters are estimated. Because the translation parameters involve both eddy-current-induced image translation and the patient motion, by analyzing the transformation model, we separate them by removing the contributions that are linearly correlated with diffusion gradients. We define the metrics measuring the amounts of distortion, rotation, and translation. We tested our method on a database with 64 subjects and found the statistics of each of the metrics. Finally, we demonstrate how these statistics can be used to assess data quality quantitatively in several examples.
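
The separation step described above — removing the part of the translations that is linearly correlated with the diffusion gradients — amounts to a least-squares regression. A sketch with simulated data, where `A_true` is a hypothetical eddy-current coupling matrix, not a value from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated per-volume data: diffusion gradient directions g (n x 3) and
# measured image translations t (n x 3), modeled as an eddy-current part
# that is linear in g plus gradient-independent patient motion.
n = 64
g = rng.standard_normal((n, 3))
A_true = np.array([[0.3, 0.0, 0.1],
                   [0.0, 0.2, 0.0],
                   [0.1, 0.0, 0.4]])    # hypothetical coupling matrix
motion = rng.normal(0.0, 0.05, (n, 3))  # patient-motion component

t = g @ A_true.T + motion

# Regress translations on gradients: the fitted, gradient-correlated part
# is attributed to eddy currents; the residual estimates patient motion.
A_hat, *_ = np.linalg.lstsq(g, t, rcond=None)
eddy_part = g @ A_hat
motion_hat = t - eddy_part
print("estimated coupling matrix:\n", A_hat.T.round(2))
```

The recovered matrix closely matches `A_true`, illustrating how the two contributions can be disentangled when motion is uncorrelated with the gradient scheme.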

  1. Quantitative assessment of developmental levels in overarm throwing using wearable inertial sensing technology.

    PubMed

    Grimpampi, Eleni; Masci, Ilaria; Pesce, Caterina; Vannozzi, Giuseppe

    2016-09-01

    Motor proficiency in childhood has been recently recognised as a public health determinant, having a potential impact on the physical activity level and possible sedentary behaviour of the child later in life. Among fundamental motor skills, ballistic skills assessment based on in-field quantitative observations is increasingly needed in the motor development community. The aim of this study was to propose an in-field quantitative approach to identify different developmental levels in overarm throwing. Fifty-eight children aged 5-10 years performed an overarm throwing task while wearing three inertial sensors located at the wrist, trunk and pelvis level and were then categorised using a developmental sequence of overarm throwing. A set of biomechanical parameters were defined and analysed using multivariate statistics to evaluate whether they can be used as developmental indicators. Trunk and pelvis angular velocities and time durations before the ball release showed increasing/decreasing trends with increasing developmental level. Significant differences between developmental level pairs were observed for selected biomechanical parameters. The results support the suitability and feasibility of objective developmental measures in ecological learning contexts, suggesting their potential to support motor learning in educational and youth sports training settings. PMID:26818205

  2. Quantitative gallium 67 lung scan to assess the inflammatory activity in the pneumoconioses

    SciTech Connect

    Bisson, G.; Lamoureux, G.; Begin, R.

    1987-01-01

    Gallium 67 lung scan has recently become increasingly used to evaluate the biological activity of alveolitis of interstitial lung diseases and to stage the disease process. In order to have a more precise and objective indicator of the inflammatory activity in the lung, we and others have developed computer-based quantitative techniques to process the ⁶⁷Ga scan. In this report, we compare the results of three such computer-based methods of analysis of the scans of 38 normal humans and 60 patients suspected to have pneumoconiosis. Results of previous investigations on the mechanisms of ⁶⁷Ga uptake in interstitial lung disease are reviewed. These data strengthen the view that quantitative ⁶⁷Ga lung scan has become a standard technique to assess inflammatory activity in the interstitial lung diseases and that computer-based method of analysis of the scan provides an index of inflammatory activity of the lung disease that correlates with lung lavage and biopsy indices of inflammation in the lung tissue. 51 references.

  3. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences among the caries lesions were concentrated in the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of wavebands at 565-750 nm to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems.
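
The reported cut-offs make the final grading step trivial to implement. A sketch using the thresholds from the abstract (how ties at exactly 0.66, 1.06, and 1.62 are handled is an assumption, since the abstract gives only open ranges):

```python
def caries_grade(r, g, b):
    """Grade caries severity from the image component ratio R/(G+B),
    using the cut-offs reported for the fluorescence imaging system
    (treatment of the exact boundary values is an assumption)."""
    ratio = r / (g + b)
    if ratio < 0.66:
        return "sound"
    if ratio < 1.06:
        return "early decay"
    if ratio <= 1.62:
        return "established decay"
    return "severe decay"

print(caries_grade(100, 120, 80))  # ratio 0.50 -> sound
print(caries_grade(120, 60, 40))   # ratio 1.20 -> established decay
print(caries_grade(200, 60, 40))   # ratio 2.00 -> severe decay
```

In practice the ratio would be computed per pixel or per region of the fluorescence image before grading.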

  4. Specific and quantitative assessment of naphthalene and salicylate bioavailability by using a bioluminescent catabolic reporter bacterium

    SciTech Connect

    Heitzer, A.; Thonnard, J.E.; Sayler, G.S.; Webb, O.F.

    1992-06-01

    A bioassay was developed and standardized for the rapid, specific, and quantitative assessment of naphthalene and salicylate bioavailability by use of bioluminescence monitoring of catabolic gene expression. The bioluminescent reporter strain Pseudomonas fluorescens HK44, which carries a transcriptional nahG-luxCDABE fusion for naphthalene and salicylate catabolism, was used. The physiological state of the reporter cultures as well as the intrinsic regulatory properties of the naphthalene degradation operon must be taken into account to obtain a high specificity at low target substrate concentrations. Experiments have shown that the use of exponentially growing reporter cultures has advantages over the use of carbon-starved, resting cultures. In aqueous solutions for both substrates, naphthalene and salicylate, linear relationships between initial substrate concentration and bioluminescence response were found over concentration ranges of 1 to 2 orders of magnitude. Naphthalene could be detected at a concentration of 45 ppb. Studies conducted under defined conditions with extracts and slurries of experimentally contaminated sterile soils and identical uncontaminated soil controls demonstrated that this method can be used for specific and quantitative estimations of target pollutant presence and bioavailability in soil extracts and for specific and qualitative estimations of naphthalene in soil slurries.
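
The linear concentration-bioluminescence relationship reported above is what makes the reporter usable as a quantitative assay: fit a calibration line, then invert it for unknown samples. A sketch with made-up calibration points (the assay's actual response values are not given in the abstract):

```python
import numpy as np

# Hypothetical calibration: naphthalene concentration (ppb) vs
# bioluminescence signal (arbitrary units), assuming the linear
# response reported in the abstract; all numbers are illustrative.
conc = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
signal = 0.04 * conc + 1.0 + np.random.default_rng(3).normal(0.0, 0.2, conc.size)

slope, intercept = np.polyfit(conc, signal, 1)

def estimate_concentration(measured_signal):
    # Invert the calibration line to estimate bioavailable naphthalene.
    return (measured_signal - intercept) / slope

print(round(estimate_concentration(9.0)))  # roughly 200 ppb
```

The same inversion would only be valid within the 1-2 orders of magnitude over which the response stays linear, and above the ~45 ppb detection limit.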

  5. Dynamic and quantitative assessment of blood coagulation using optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping

    2016-04-01

    Reliable clot diagnostic systems are needed for directing treatment in a broad spectrum of cardiovascular diseases and coagulopathy. Here, we report on non-contact measurement of elastic modulus for dynamic and quantitative assessment of whole blood coagulation using acoustic radiation force orthogonal excitation optical coherence elastography (ARFOE-OCE). In this system, acoustic radiation force (ARF) is produced by a remote ultrasonic transducer, and a shear wave induced by ARF excitation is detected by the optical coherence tomography (OCT) system. During porcine whole blood coagulation, changes in the elastic property of the clots increase the shear modulus of the sample, altering the propagating velocity of the shear wave. Consequently, dynamic blood coagulation status can be measured quantitatively by relating the velocity of the shear wave with clinically relevant coagulation metrics, including reaction time, clot formation kinetics and maximum shear modulus. The results show that the ARFOE-OCE is sensitive to the clot formation kinetics and can differentiate the elastic properties of the recalcified porcine whole blood, blood added with kaolin as an activator, and blood spiked with fibrinogen.
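
The link between shear-wave speed and clot stiffness is the standard elastography relation G = ρv². A minimal sketch, assuming a whole-blood density of roughly 1060 kg/m³ (the abstract does not state the value used):

```python
def shear_modulus(velocity_m_s, density_kg_m3=1060.0):
    # Shear modulus G = rho * v^2 from the shear-wave propagation speed,
    # the basic relation behind shear-wave elastography. The blood
    # density default is an assumed, typical value.
    return density_kg_m3 * velocity_m_s ** 2

# A clot whose shear-wave speed rises from 0.5 to 2.0 m/s during
# coagulation stiffens 16-fold:
g_early = shear_modulus(0.5)  # 265.0 Pa
g_late = shear_modulus(2.0)   # 4240.0 Pa
print(g_early, g_late)
```

Tracking G over time in this way yields the clinically relevant metrics mentioned above (reaction time, clot formation kinetics, maximum shear modulus).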

  6. Quantitative assessment of apatite formation via a biomimetic method using quartz crystal microbalance.

    PubMed

    Tanahashi, M; Kokubo, T; Matsuda, T

    1996-06-01

    Quantitative assessment of hydroxyapatite formation on a gold surface via the biomimetic method, composed of a nucleation step in a simulated body fluid (SBF) containing glass powders and a subsequent apatite growth step in glass powder-free SBF, was made using a quartz crystal microbalance (QCM) technique. The frequency change of the QCM linearly increased with increasing soaking time, and largely depended on the nucleation period. The growth rates, defined as daily increase in thickness, increased monotonically with an increasing nucleation period of up to 96 h, thereafter being constant at 2.0 microns/day. The growth rate of the apatite layer increased with increasing temperature of the SBF: 0.9, 2.0, and 3.8 microns/day at 25, 37, and 50 degrees C, respectively. The Arrhenius-type activation energy for the growth of apatite was 47.3 kJ/mol. The QCM method was found to be a very powerful tool for quantitative, in situ measurement of precipitation and growth of apatite in real time.
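
Converting a QCM frequency shift into deposited apatite mass is conventionally done with the Sauerbrey equation (the abstract reports frequency changes directly; the crystal frequency and sensing area below are assumed values, not from the paper):

```python
import math

def sauerbrey_mass(delta_f_hz, f0_hz=5e6, area_cm2=1.0):
    """Deposited mass (micrograms) from a QCM frequency shift via the
    Sauerbrey equation, delta_f = -2 f0^2 dm / (A sqrt(rho_q mu_q)).
    Crystal constants are for AT-cut quartz; f0 and area are assumed."""
    rho_q = 2648.0   # quartz density, kg/m^3
    mu_q = 2.947e10  # quartz shear modulus, Pa
    c = 2.0 * f0_hz ** 2 / math.sqrt(rho_q * mu_q)  # Hz per (kg/m^2)
    dm_kg_m2 = -delta_f_hz / c
    return dm_kg_m2 * 1e9 / 1e4 * area_cm2  # kg/m^2 -> ug/cm^2 -> ug

# A -56.6 Hz shift on a 5 MHz crystal corresponds to ~1 ug/cm^2:
print(round(sauerbrey_mass(-56.6), 2))
```

The Sauerbrey relation assumes a thin, rigid film; for thicker or viscoelastic apatite layers it only approximates the true mass.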

  7. Noninvasive Qualitative and Quantitative Assessment of Spoilage Attributes of Chilled Pork Using Hyperspectral Scattering Technique.

    PubMed

    Zhang, Leilei; Peng, Yankun

    2016-08-01

    The objective of this research was to develop a rapid noninvasive method for quantitative and qualitative determination of chilled pork spoilage. Microbiological, physicochemical, and organoleptic characteristics such as the total viable count (TVC), Pseudomonas spp., total volatile basic-nitrogen (TVB-N), pH value, and color parameter L* were determined to appraise pork quality. The hyperspectral scattering characteristics from 54 meat samples were fitted by four-parameter modified Gompertz function accurately. Support vector machines (SVM) was applied to establish quantitative prediction model between scattering fitting parameters and reference values. In addition, partial least squares discriminant analysis (PLS-DA) and Bayesian analysis were utilized as supervised and unsupervised techniques for the qualitative identification of meat spoilage. All stored chilled meat samples were classified into three grades: "fresh," "semi-fresh," and "spoiled." Bayesian classification model was superior to PLS-DA with overall classification accuracy of 92.86%. The results demonstrated that hyperspectral scattering technique combined with SVM and Bayesian analysis possessed a powerful capability for rapid and noninvasive meat spoilage assessment.
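
The feature-extraction step — fitting a four-parameter modified Gompertz function to each scattering profile — can be sketched as follows. The authors' exact parameterization is not given in the abstract, so one common four-parameter form is assumed, with synthetic data:

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(x, a, b, eps, delta):
    # One common four-parameter modified Gompertz form; the paper's
    # exact parameterization is an assumption here.
    return a - b * np.exp(-np.exp(-eps * (x - delta)))

# Synthetic scattering profile: intensity vs scattering distance
# (values are illustrative, not measured data).
x = np.linspace(0.0, 10.0, 50)
y = gompertz(x, 1.0, 0.9, 1.2, 4.0)
y += np.random.default_rng(0).normal(0.0, 0.01, x.size)

popt, _ = curve_fit(gompertz, x, y, p0=[1.0, 1.0, 1.0, 5.0])
print(np.round(popt, 2))
```

The four fitted parameters per sample then form the feature vector passed to the SVM regression and PLS-DA/Bayesian classification models.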

  9. Provisional guidance for quantitative risk assessment of polycyclic aromatic hydrocarbons. Final report

    SciTech Connect

    Schoeny, R.; Poirier, K.

    1993-07-01

    PAHs are products of incomplete combustion of organic materials; sources are thus widespread, including cigarette smoke, municipal waste incineration, wood stove emissions, coal conversion, energy production from fossil fuels, and automobile and diesel exhaust. As PAHs are common environmental contaminants, it is important that EPA have a scientifically justified, consistent approach to the evaluation of human health risk from exposure to these compounds. For the majority of PAHs classified as B2, probable human carcinogen, data are insufficient for calculation of an inhalation or drinking water unit risk. Benzo(a)pyrene (BAP) is the most completely studied of the PAHs, and data, while problematic, are sufficient for calculation of quantitative estimates of carcinogenic potency. Toxicity Equivalency Factors (TEF) have been used by U.S. EPA on an interim basis for risk assessment of chlorinated dibenzodioxins and dibenzofurans. Data for PAHs do not meet all criteria for use of TEF. The document presents a somewhat different approach to quantitative estimation for PAHs using weighted potential potencies.

  10. Drug Development in Alzheimer’s Disease: The Contribution of PET and SPECT

    PubMed Central

    Declercq, Lieven D.; Vandenberghe, Rik; Van Laere, Koen; Verbruggen, Alfons; Bormans, Guy

    2016-01-01

    Clinical trials aiming to develop disease-altering drugs for Alzheimer’s disease (AD), a neurodegenerative disorder with devastating consequences, are failing at an alarming rate. Poorly defined inclusion and outcome criteria, due to a limited number of objective biomarkers, are among the major concerns. Non-invasive molecular imaging techniques, positron emission tomography and single photon emission (computed) tomography (PET and SPE(C)T), allow visualization and quantification of a wide variety of (patho)physiological processes and allow early (differential) diagnosis in many disorders. PET and SPECT have the ability to provide biomarkers that permit spatial assessment of pathophysiological molecular changes and therefore objectively evaluate and follow up therapeutic response, especially in the brain. A number of specific PET/SPECT biomarkers used in support of emerging clinical therapies in AD are discussed in this review. PMID:27065872

  11. Quantitative assessment of resilience of a water supply system under rainfall reduction due to climate change

    NASA Astrophysics Data System (ADS)

    Amarasinghe, Pradeep; Liu, An; Egodawatta, Prasanna; Barnes, Paul; McGree, James; Goonetilleke, Ashantha

    2016-09-01

    A water supply system can be impacted by rainfall reduction due to climate change, thereby reducing its supply potential. This highlights the need to understand the system resilience, which refers to the ability to maintain service under various pressures (or disruptions). Currently, the concept of resilience has not yet been widely applied in managing water supply systems. This paper proposed three technical resilience indicators to assess the resilience of a water supply system. A case study analysis was undertaken of the Water Grid system of Queensland State, Australia, to showcase how the proposed indicators can be applied to assess resilience. The research outcomes confirmed that the use of resilience indicators is capable of identifying critical conditions in relation to the water supply system operation, such as the maximum allowable rainfall reduction for the system to maintain its operation without failure. Additionally, resilience indicators also provided useful insight regarding the sensitivity of the water supply system to a changing rainfall pattern in the context of climate change, which represents the system's stability when experiencing pressure. The study outcomes will help in the quantitative assessment of resilience and provide improved guidance to system operators to enhance the efficiency and reliability of a water supply system.

  12. Quantitative Gait Measurement With Pulse-Doppler Radar for Passive In-Home Gait Assessment

    PubMed Central

    Skubic, Marjorie; Rantz, Marilyn; Cuddihy, Paul E.

    2014-01-01

    In this paper, we propose a pulse-Doppler radar system for in-home gait assessment of older adults. A methodology has been developed to extract gait parameters including walking speed and step time using Doppler radar. The gait parameters have been validated with a Vicon motion capture system in the lab with 13 participants and 158 test runs. The study revealed that for optimal step recognition and walking speed estimation, a dual radar setup with one radar placed at foot level and the other at torso level is necessary. An excellent absolute agreement with intraclass correlation coefficients of 0.97 was found for step time estimation with the foot level radar. For walking speed, although both radars show excellent consistency, they both have a systematic offset compared to the ground truth due to walking direction with respect to the radar beam. The torso level radar has a better performance (9% offset on average) in the speed estimation compared to the foot level radar (13%–18% offset). Quantitative analysis has been performed to compute the angles causing the systematic error. These lab results demonstrate the capability of the system to be used as a daily gait assessment tool in home environments, useful for fall risk assessment and other health care applications. The system is currently being tested in an unstructured home environment. PMID:24771566
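
The speed estimate and the angle-dependent offset discussed above follow from the Doppler relation v = f_d·c/(2·f_c) and the cos(θ) projection of the walking velocity onto the radar beam. A sketch, with an assumed 5.8 GHz carrier (the radar frequency is not stated in the abstract):

```python
import math

C = 3.0e8  # speed of light, m/s

def radial_speed(doppler_hz, carrier_hz=5.8e9):
    # Monostatic Doppler relation: v_radial = f_d * c / (2 * f_c).
    # The 5.8 GHz carrier default is an assumed value.
    return doppler_hz * C / (2.0 * carrier_hz)

def walking_speed(doppler_hz, beam_angle_deg, carrier_hz=5.8e9):
    # The radar only measures the velocity component along its beam,
    # which produces the systematic offset described above; dividing
    # by cos(angle) recovers the true walking speed.
    return radial_speed(doppler_hz, carrier_hz) / math.cos(math.radians(beam_angle_deg))

v_radial = radial_speed(38.7)       # ~1.0 m/s along the beam
v_true = walking_speed(38.7, 25.0)  # ~10% higher at a 25 degree beam angle
print(round(v_radial, 2), round(v_true, 2))
```

A 25° beam angle gives cos(25°) ≈ 0.906, i.e. roughly the 9-13% underestimate range reported for the two radar placements.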

  13. Quantitative gait measurement with pulse-Doppler radar for passive in-home gait assessment.

    PubMed

    Wang, Fang; Skubic, Marjorie; Rantz, Marilyn; Cuddihy, Paul E

    2014-09-01

    In this paper, we propose a pulse-Doppler radar system for in-home gait assessment of older adults. A methodology has been developed to extract gait parameters including walking speed and step time using Doppler radar. The gait parameters have been validated with a Vicon motion capture system in the lab with 13 participants and 158 test runs. The study revealed that for optimal step recognition and walking speed estimation, a dual radar setup with one radar placed at foot level and the other at torso level is necessary. An excellent absolute agreement with intraclass correlation coefficients of 0.97 was found for step time estimation with the foot level radar. For walking speed, although both radars show excellent consistency, they both have a systematic offset compared to the ground truth due to walking direction with respect to the radar beam. The torso level radar has a better performance (9% offset on average) in the speed estimation compared to the foot level radar (13%-18% offset). Quantitative analysis has been performed to compute the angles causing the systematic error. These lab results demonstrate the capability of the system to be used as a daily gait assessment tool in home environments, useful for fall risk assessment and other health care applications. The system is currently being tested in an unstructured home environment.

  14. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection

    PubMed Central

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658
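
A minimal version of the inhalation QMRA chain described above: dose = airborne concentration × breathing rate × exposure time, followed by an exponential dose-response model. The concentrations, exposure times, and dose-response parameter below are illustrative assumptions, not the study's data:

```python
import math

def inhaled_dose(conc_per_m3, breathing_rate_m3_per_h, hours):
    # Exposure assessment: dose = airborne virus concentration x
    # inhalation rate x exposure time.
    return conc_per_m3 * breathing_rate_m3_per_h * hours

def p_infection(dose, r=0.4172):
    # Exponential dose-response model, P = 1 - exp(-r * dose). The r
    # value is one used for adenovirus in the QMRA literature, taken
    # here as an assumption rather than from this study.
    return 1.0 - math.exp(-r * dose)

# A short visit to a high-concentration toilet vs a full shift at a
# lower-concentration plant (numbers chosen only for illustration):
p_toilet = p_infection(inhaled_dose(5.0, 1.2, 0.1))
p_wwtp = p_infection(inhaled_dose(0.05, 1.2, 8.0))
print(f"toilet: {p_toilet:.3f}  wastewater plant: {p_wwtp:.3f}")
```

With these inputs the short, high-concentration exposure dominates, mirroring the ranking (toilets highest) reported in the abstract.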

  15. Quantitative risk assessment for skin sensitisation: consideration of a simplified approach for hair dye ingredients.

    PubMed

    Goebel, Carsten; Diepgen, Thomas L; Krasteva, Maya; Schlatter, Harald; Nicolas, Jean-Francois; Blömeke, Brunhilde; Coenraads, Pieter Jan; Schnuch, Axel; Taylor, James S; Pungier, Jacquemine; Fautz, Rolf; Fuchs, Anne; Schuh, Werner; Gerberick, G Frank; Kimber, Ian

    2012-12-01

    With the availability of the local lymph node assay, and the ability to evaluate effectively the relative skin sensitizing potency of contact allergens, a model for quantitative risk assessment (QRA) has been developed. This QRA process comprises: (a) determination of a no-expected-sensitisation-induction-level (NESIL), (b) incorporation of sensitization-assessment-factors (SAFs) reflecting variations between subjects, product use patterns and matrices, and (c) estimation of consumer-exposure-level (CEL). Based on these elements an acceptable-exposure-level (AEL) can be calculated by dividing the NESIL of the product by individual SAFs. Finally, the AEL is compared with the CEL to judge the risk to human health. We propose a simplified approach to risk assessment of hair dye ingredients by making use of precise experimental product exposure data. This data set provides firmly established dose/unit area concentrations under relevant consumer use conditions referred to as the measured-exposure-level (MEL). For that reason a direct comparison is possible between the NESIL with the MEL as a proof-of-concept quantification of the risk of skin sensitization. This is illustrated here by reference to two specific hair dye ingredients p-phenylenediamine and resorcinol. Comparison of these robust and toxicologically relevant values is therefore considered an improvement versus a hazard-based classification of hair dye ingredients. PMID:23069142
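
The arithmetic of the QRA step is simple: divide the NESIL by the product of the SAFs to obtain the AEL, then compare it with the exposure level (CEL, or MEL in the simplified approach). A sketch with hypothetical numbers — the SAF values and the NESIL below are illustrative, not values from the paper:

```python
def acceptable_exposure_level(nesil, saf_interindividual=10.0,
                              saf_matrix=1.0, saf_use=1.0):
    # QRA steps (a)-(b): AEL = NESIL / (product of the SAFs).
    # The default SAF values here are illustrative assumptions.
    return nesil / (saf_interindividual * saf_matrix * saf_use)

def exposure_acceptable(nesil, exposure, **safs):
    # Final comparison: risk is judged acceptable when the consumer
    # (or measured) exposure level does not exceed the AEL.
    return exposure <= acceptable_exposure_level(nesil, **safs)

# Hypothetical ingredient: NESIL of 250 ug/cm^2 and a measured
# exposure level (MEL) of 2 ug/cm^2 under use conditions.
print(acceptable_exposure_level(250.0))  # 25.0 ug/cm^2
print(exposure_acceptable(250.0, 2.0))   # True
```

In the simplified approach proposed for hair dyes, the measured MEL replaces the estimated CEL in this final comparison.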

  17. Quantitative Assessment of Eye Phenotypes for Functional Genetic Studies Using Drosophila melanogaster

    PubMed Central

    Iyer, Janani; Wang, Qingyu; Le, Thanh; Pizzo, Lucilla; Grönke, Sebastian; Ambegaokar, Surendra S.; Imai, Yuzuru; Srivastava, Ashutosh; Troisí, Beatriz Llamusí; Mardon, Graeme; Artero, Ruben; Jackson, George R.; Isaacs, Adrian M.; Partridge, Linda; Lu, Bingwei; Kumar, Justin P.; Girirajan, Santhosh

    2016-01-01

    About two-thirds of the vital genes in the Drosophila genome are involved in eye development, making the fly eye an excellent genetic system to study cellular function and development, neurodevelopment/degeneration, and complex diseases such as cancer and diabetes. We developed a novel computational method, implemented as Flynotyper software (http://flynotyper.sourceforge.net), to quantitatively assess the morphological defects in the Drosophila eye resulting from genetic alterations affecting basic cellular and developmental processes. Flynotyper utilizes a series of image processing operations to automatically detect the fly eye and the individual ommatidium, and calculates a phenotypic score as a measure of the disorderliness of ommatidial arrangement in the fly eye. As a proof of principle, we tested our method by analyzing the defects due to eye-specific knockdown of Drosophila orthologs of 12 neurodevelopmental genes to accurately document differential sensitivities of these genes to dosage alteration. We also evaluated eye images from six independent studies assessing the effect of overexpression of repeats, candidates from peptide library screens, and modifiers of neurotoxicity and developmental processes on eye morphology, and show strong concordance with the original assessment. We further demonstrate the utility of this method by analyzing 16 modifiers of sine oculis obtained from two genome-wide deficiency screens of Drosophila and accurately quantifying the effect of its enhancers and suppressors during eye development. Our method will complement existing assays for eye phenotypes, and increase the accuracy of studies that use fly eyes for functional evaluation of genes and genetic interactions. PMID:26994292

  19. Assessing the toxic effects of ethylene glycol ethers using Quantitative Structure Toxicity Relationship models

    SciTech Connect

    Ruiz, Patricia; Mumtaz, Moiz; Gombar, Vijay

    2011-07-15

    Experimental determination of toxicity profiles consumes a great deal of time, money, and other resources. Consequently, businesses, societies, and regulators strive for reliable alternatives such as Quantitative Structure Toxicity Relationship (QSTR) models to fill gaps in toxicity profiles of compounds of concern to human health. The use of glycol ethers and their health effects have recently attracted the attention of international organizations such as the World Health Organization (WHO). The board members of Concise International Chemical Assessment Documents (CICAD) recently identified inadequate testing as well as gaps in toxicity profiles of ethylene glycol mono-n-alkyl ethers (EGEs). The CICAD board requested the ATSDR Computational Toxicology and Methods Development Laboratory to conduct QSTR assessments of certain specific toxicity endpoints for these chemicals. In order to evaluate the potential health effects of EGEs, CICAD proposed a critical QSTR analysis of the mutagenicity, carcinogenicity, and developmental effects of EGEs and other selected chemicals. We report here results of the application of QSTRs to assess rodent carcinogenicity, mutagenicity, and developmental toxicity of four EGEs: 2-methoxyethanol, 2-ethoxyethanol, 2-propoxyethanol, and 2-butoxyethanol and their metabolites. Neither mutagenicity nor carcinogenicity is indicated for the parent compounds, but these compounds are predicted to be developmental toxicants. The predicted toxicity effects were subjected to reverse QSTR (rQSTR) analysis to identify structural attributes that may be the main drivers of the developmental toxicity potential of these compounds.

  20. A quantitative health assessment index for rapid evaluation of fish condition in the field

    SciTech Connect

    Adams, S.M.; Brown, A.M.; Goede, R.W.

    1993-01-01

    The health assessment index (HAI) is an extension and refinement of a previously published field necropsy system. The HAI is a quantitative index that allows statistical comparisons of fish health among data sets. Index variables are assigned numerical values based on the degree of severity or damage incurred by an organ or tissue from environmental stressors. This approach has been used to evaluate the general health status of fish populations in a wide range of reservoir types in the Tennessee River basin (North Carolina, Tennessee, Alabama, Kentucky), in Hartwell Reservoir (Georgia, South Carolina), which is contaminated by polychlorinated biphenyls, and in the Pigeon River (Tennessee, North Carolina), which receives effluents from a bleached kraft mill. The ability of the HAI to accurately characterize the health of fish in these systems was evaluated by comparing this index to other types of fish health measures (contaminant, bioindicator, and reproductive analysis) made at the same time as the HAI. In all cases, the HAI demonstrated the same pattern of fish health status between sites as did each of the other more sophisticated health assessment methods. The HAI has proven to be a simple and inexpensive means of rapidly assessing general fish health in field situations. 29 refs., 5 tabs.
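A minimal sketch of how an HAI-style score could be tallied is shown below; the category names and point values (0/10/20/30) follow the general pattern of field necropsy indices but are illustrative, not the published table:

```python
# Hypothetical severity categories and point values, per organ/tissue
HAI_VALUES = {"normal": 0, "mild": 10, "moderate": 20, "severe": 30}

def hai_score(organ_ratings):
    """Sum the assigned severity values across all rated organs/tissues
    for one fish; higher totals indicate poorer health."""
    return sum(HAI_VALUES[rating] for rating in organ_ratings.values())

fish = {"liver": "mild", "gills": "normal", "spleen": "moderate", "kidney": "normal"}
score = hai_score(fish)  # 10 + 0 + 20 + 0 = 30
```

Because each variable maps to a fixed numeric value, scores from different data sets can be compared statistically, which is the index's main advantage over a purely descriptive necropsy.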

  1. Quantitative assessment of reactive hyperemia using laser speckle contrast imaging at multiple wavelengths

    NASA Astrophysics Data System (ADS)

    Young, Anthony; Vishwanath, Karthik

    2016-03-01

    Reactive hyperemia refers to an increase of blood flow in tissue after release of an occlusion in the local vasculature. Measuring the temporal response of reactive hyperemia post-occlusion in patients has the potential to shed information about microvascular diseases such as systemic sclerosis and diabetes. Laser speckle contrast imaging (LSCI) is an imaging technique capable of sensing superficial blood flow in tissue, which can be used to quantitatively assess reactive hyperemia. Here, we employ LSCI using coherent sources at blue, green, and red wavelengths to evaluate reactive hyperemia in healthy human volunteers. Blood flow in the forearms of subjects was measured using LSCI to assess the time-course of reactive hyperemia triggered by a pressure cuff applied to the biceps of the subjects. Raw speckle images were acquired and processed to yield blood-flow parameters from a region of interest before, during, and after application of occlusion. Reactive hyperemia was quantified via two measures: (1) the difference between the peak LSCI flow during hyperemia and baseline flow, and (2) the time that elapsed between the release of the occlusion and peak flow. These measurements were acquired in three healthy human participants under the three laser wavelengths employed. The studies shed light on the utility of in vivo LSCI-based flow sensing for non-invasive assessment of reactive hyperemia responses and on how the choice of source wavelength influences the measured parameters.
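The core LSCI computation can be sketched as below: local speckle contrast K = sigma/mu over a sliding window, where blurring from moving scatterers lowers K (flow is often indexed as 1/K^2). This is a generic illustration of the technique, not the authors' processing pipeline, and the window size is an assumption:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast_map(raw, win=7):
    """Local speckle contrast K = std/mean over a win x win sliding window.
    Lower K means more motion blur and hence faster flow."""
    patches = sliding_window_view(raw, (win, win))
    mu = patches.mean(axis=(-2, -1))
    sd = patches.std(axis=(-2, -1))
    return np.where(mu > 0, sd / mu, 0.0)

# Synthetic raw speckle frame (Poisson photon noise stands in for real data)
rng = np.random.default_rng(0)
img = rng.poisson(100, size=(64, 64)).astype(float)
K = speckle_contrast_map(img)
```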

  2. Coherent and consistent decision making for mixed hazardous waste management: The application of quantitative assessment techniques

    SciTech Connect

    Smith, G.M.; Little, R.H.; Torres, C.

    1994-12-31

    This paper focuses on predictive modelling capacity for post-disposal safety assessments of land-based disposal facilities, illustrated by presentation of the development and application of a comprehensive, yet practicable, assessment framework. The issues addressed include: (1) land-based disposal practice; (2) the conceptual and mathematical representation of processes leading to release, migration, and accumulation of contaminants; (3) the identification and evaluation of relevant assessment end-points, including human health, health of non-human biota and ecosystems, and property and resource effects; (4) the gap between data requirements and data availability; and (5) the application of results in decision making, given the uncertainties in assessment results and the difficulty of comparing qualitatively different impacts arising on different temporal and spatial scales. The paper illustrates these issues with examples based on disposal of metals and radionuclides to shallow facilities. The types of disposal facility considered include features consistent with facilities for radioactive wastes as well as other designs more typical of hazardous wastes. The intention is to raise the question of whether radioactive and other hazardous wastes are being consistently managed, and to show that assessment methods are being developed which can provide quantitative information on the levels of environmental impact as well as a consistent approach for different types of waste. Such methods can then be applied to mixed hazardous wastes containing radionuclides as well as other contaminants. The remaining question is whether the will exists to employ them. The discussion and worked illustrations are based on a methodology developed, and being extended, within the current European Atomic Energy Community's cost-sharing research program on radioactive waste management and disposal, with co-funding support from Empresa Nacional de Residuos Radiactivos SA, Spain.

  3. Quantitative Assessment of Participant Knowledge and Evaluation of Participant Satisfaction in the CARES Training Program

    PubMed Central

    Goodman, Melody S.; Si, Xuemei; Stafford, Jewel D.; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2016-01-01

    Background The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members on evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). Objectives We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure learning objectives were met through mutually beneficial CBPR approaches. Methods A baseline assessment was administered before the first training session, and a follow-up assessment and evaluation were administered after the final training session. At each training session a pretest was administered before the session, and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyzed results from quantitative questions on the assessments, pre- and post-tests, and evaluations. Results CARES fellows' knowledge increased at follow-up (75% of questions answered correctly on average) compared with the baseline assessment (38% answered correctly on average); post-test scores were higher than pre-test scores in 9 out of 11 sessions. Fellows enjoyed the training and rated all sessions well on the evaluations. Conclusions The CARES fellows training program was successful in participant satisfaction and in increasing community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community–academic research partnerships. PMID:22982849

  4. Evaluation of dental enamel caries assessment using Quantitative Light Induced Fluorescence and Optical Coherence Tomography.

    PubMed

    Maia, Ana Marly Araújo; de Freitas, Anderson Zanardi; de L Campello, Sergio; Gomes, Anderson Stevens Leônidas; Karlsson, Lena

    2016-06-01

    An in vitro study of morphological alterations between sound dental structure and artificially induced white spot lesions in human teeth was performed through the loss of fluorescence measured by Quantitative Light-Induced Fluorescence (QLF) and the change in the light attenuation coefficient measured by Optical Coherence Tomography (OCT). To analyze the OCT images from a commercially available system, a special algorithm was applied, whereas the QLF images were analyzed using the software available in the commercial system employed. When comparing the sound region against the white spot lesion region, QLF showed a reduction in fluorescence intensity, whilst OCT showed an increase in light attenuation. Comparison of the percentage of alteration between the optical properties of sound and artificial enamel caries regions showed that OCT images processed through the attenuation of light enhanced the tooth's optical alterations more than the fluorescence detected by the QLF system.

  5. Quantitative assessment of the surface crack density in thermal barrier coatings

    NASA Astrophysics Data System (ADS)

    Yang, Li; Zhong, Zhi-Chun; Zhou, Yi-Chun; Lu, Chun-Sheng

    2014-04-01

    In this paper, a modified shear-lag model is developed to calculate the surface crack density in thermal barrier coatings (TBCs). The mechanical properties of TBCs are also measured to quantitatively assess their surface crack density. Acoustic emission (AE) and digital image correlation methods are applied to monitor the surface cracking in TBCs under tensile loading. The results show that the calculated surface crack density from the modified model is in agreement with that obtained from experiments. The surface cracking process of TBCs can be discriminated by their AE characteristics and strain evolution. Based on the correlation of energy released from cracking and its corresponding AE signals, a linear relationship is built up between the surface crack density and AE parameters, with the slope being dependent on the mechanical properties of TBCs.

  6. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasiexperimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  7. Estimation methods for monthly humidity from dynamical downscaling data for quantitative assessments of climate change impacts

    NASA Astrophysics Data System (ADS)

    Ueyama, Hideki

    2012-07-01

    Methods are proposed to estimate monthly relative humidity and wet bulb temperature from dynamical downscaling data, produced by a general circulation model coupled with a regional climate model (RCM), for quantitative assessment of climate change impacts. The water vapor pressure estimation model developed was a regression model on monthly saturated water vapor pressure, using minimum air temperature as a variable. The monthly minimum air temperature correction model for RCM bias was developed by stepwise multiple regression analysis, using the difference in monthly minimum air temperatures between observations and RCM output as the dependent variable and geographic factors as independent variables. The wet bulb temperature was estimated using the estimated water vapor pressure, air temperature, and atmospheric pressure at ground level, the latter two corrected for RCM bias. Root mean square errors of the estimates decreased considerably in August.
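The saturated vapor pressure term that such a regression relies on is commonly computed with the Magnus approximation; the sketch below uses standard textbook coefficients and is an illustration rather than the paper's fitted model:

```python
import math

def sat_vapor_pressure_hpa(t_celsius):
    """Magnus approximation for saturation vapor pressure over water (hPa)."""
    return 6.1094 * math.exp(17.625 * t_celsius / (t_celsius + 243.04))

def relative_humidity(e_actual_hpa, t_celsius):
    """Relative humidity (%) from actual vapor pressure and air temperature."""
    return 100.0 * e_actual_hpa / sat_vapor_pressure_hpa(t_celsius)
```

For example, at 20 °C the saturation pressure is about 23.3 hPa, so an actual vapor pressure of 11.7 hPa corresponds to roughly 50% relative humidity.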

  8. Quantitative assessment of motion correction for high angular resolution diffusion imaging.

    PubMed

    Sakaie, Ken E; Lowe, Mark J

    2010-02-01

    Several methods have been proposed for motion correction of high angular resolution diffusion imaging (HARDI) data. There have been few comparisons of these methods, partly due to a lack of quantitative metrics of performance. We compare two motion correction strategies using two figures of merit: displacement introduced by the motion correction and the 95% confidence interval of the cone of uncertainty of voxels with prolate tensors. What follows is a general approach for assessing motion correction of HARDI data that may have broad application for quality assurance and optimization of postprocessing protocols. Our analysis demonstrates two important issues related to motion correction of HARDI data: (1) although neither method we tested was dramatically superior in performance, both were dramatically better than performing no motion correction, and (2) iteration of motion correction can improve the final results. Based on the results demonstrated here, iterative motion correction is strongly recommended for HARDI acquisitions. PMID:19695824

  9. Quantitative risk assessment & leak detection criteria for a subsea oil export pipeline

    NASA Astrophysics Data System (ADS)

    Zhang, Fang-Yuan; Bai, Yong; Badaruddin, Mohd Fauzi; Tuty, Suhartodjo

    2009-06-01

    A quantitative risk assessment (QRA) based on leak detection criteria (LDC) for the design of a proposed subsea oil export pipeline is presented in this paper. The objective of this QRA/LDC study was to determine if current leak detection methodologies were sufficient, based on QRA results, while excluding the use of statistical leak detection; if not, an appropriate LDC for the leak detection system would need to be established. The famous UK PARLOC database was used for the calculation of pipeline failure rates, and the software POSVCM from MMS was used for oil spill simulations. QRA results revealed that the installation of a statistically based leak detection system (LDS) can significantly reduce time to leak detection, thereby mitigating the consequences of leakage. A sound LDC has been defined based on QRA study results and comments from various LDS vendors to assist the emergency response team (ERT) to quickly identify and locate leakage and employ the most effective measures to contain damage.

  10. Attribution of human VTEC O157 infection from meat products: a quantitative risk assessment approach.

    PubMed

    Kosmider, Rowena D; Nally, Pádraig; Simons, Robin R L; Brouwer, Adam; Cheung, Susan; Snary, Emma L; Wooldridge, Marion

    2010-05-01

    To address the risk posed to human health by the consumption of VTEC O157 within contaminated pork, lamb, and beef products within Great Britain, a quantitative risk assessment model has been developed. This model aims to simulate the prevalence and amount of VTEC O157 in different meat products at consumption within a single model framework by adapting previously developed models. The model is stochastic in nature, enabling both variability (natural variation between animals, carcasses, products) and uncertainty (lack of knowledge) about the input parameters to be modeled. Based on the model assumptions and data, it is concluded that the prevalence of VTEC O157 in meat products (joints and mince) at consumption is low (i.e., <0.04%). Beef products, particularly beef burgers, present the highest estimated risk with an estimated eight out of 100,000 servings on average resulting in human infection with VTEC O157.
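A stochastic risk-per-serving calculation of this general shape can be sketched as follows; the prevalence, dose distribution, and dose-response values are placeholders, not the parameters of the Great Britain model:

```python
import math
import random

def infected_servings(n_servings, prevalence, mean_cfu, r):
    """Monte Carlo over servings: a serving is contaminated with probability
    `prevalence`; if so, draw a variable dose and apply an exponential
    dose-response. All parameter values used below are illustrative."""
    count = 0
    for _ in range(n_servings):
        if random.random() < prevalence:
            dose = random.expovariate(1.0 / mean_cfu)        # CFU per serving
            if random.random() < 1.0 - math.exp(-r * dose):  # P(illness | dose)
                count += 1
    return count

random.seed(42)
n = infected_servings(100_000, prevalence=0.0004, mean_cfu=100.0, r=0.01)
```

Repeating such simulations across product types (burgers, joints, mince) with their own prevalence and dose inputs is what allows the model to rank beef burgers as the highest-risk product.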

  11. A quantitative structure-activity relationship approach for assessing toxicity of mixture of organic compounds.

    PubMed

    Chang, C M; Ou, Y H; Liu, T-C; Lu, S-Y; Wang, M-K

    2016-06-01

    Four types of reactivity indices were employed to construct quantitative structure-activity relationships for the assessment of toxicity of organic chemical mixtures. Results of analysis indicated that the maximum positive charge of the hydrogen atom and the inverse of the apolar surface area are the most important descriptors for the toxicity of mixture of benzene and its derivatives to Vibrio fischeri. The toxicity of mixture of aromatic compounds to green alga Scenedesmus obliquus is mainly affected by the electron flow and electrostatic interactions. The electron-acceptance chemical potential and the maximum positive charge of the hydrogen atom are found to be the most important descriptors for the joint toxicity of aromatic compounds.

  12. A quantitative assessment of using the Kinect for Xbox 360 for respiratory surface motion tracking

    NASA Astrophysics Data System (ADS)

    Alnowami, M.; Alnwaimi, B.; Tahavori, F.; Copland, M.; Wells, K.

    2012-02-01

    This paper describes a quantitative assessment of the Microsoft Kinect for Xbox 360™ for potential application in tracking respiratory and body motion in diagnostic imaging and external beam radiotherapy; the results can also be used in many other biomedical applications. We consider the performance of the Kinect under controlled conditions and find mm precision at depths of 0.8-1.5 m. We also demonstrate the use of the Kinect for monitoring respiratory motion of the anterior surface. To improve the performance of respiratory monitoring, we fit a spline model of the chest surface through the depth data as a marker-less method of monitoring respiratory motion. In addition, a comparison between the Kinect camera, with and without a zoom lens, and a marker-based system was used to evaluate the accuracy of using the Kinect camera as a respiratory tracking system.
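A marker-less respiratory trace of the kind described can be sketched by averaging depth over a chest region of interest per frame; the ROI coordinates and the synthetic oscillating frames below are illustrative assumptions, not the authors' spline-based pipeline:

```python
import numpy as np

def respiratory_trace(depth_frames, roi):
    """Mean depth (mm) inside a chest ROI for each frame, giving a 1-D
    marker-less respiratory signal."""
    r0, r1, c0, c1 = roi
    return np.array([f[r0:r1, c0:c1].mean() for f in depth_frames])

# Synthetic 240x320 depth frames: chest ~1 m away, oscillating +/-5 mm at 0.25 Hz
t = np.linspace(0.0, 10.0, 200)
frames = [np.full((240, 320), 1000.0 + 5.0 * np.sin(2 * np.pi * 0.25 * ti))
          for ti in t]
trace = respiratory_trace(frames, roi=(100, 140, 140, 180))
```

With real Kinect data the per-frame ROI average (or a smoothing spline over it, as the paper does) suppresses the mm-level depth noise of individual pixels.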

  13. Quantitative Framework for Retrospective Assessment of Interim Decisions in Clinical Trials

    PubMed Central

    Stanev, Roger

    2016-01-01

    This article presents a quantitative way of modeling the interim decisions of clinical trials. While statistical approaches tend to focus on the epistemic aspects of statistical monitoring rules, often overlooking ethical considerations, ethical approaches tend to neglect the key epistemic dimension. The proposal is a second-order decision-analytic framework. The framework provides means for retrospective assessment of interim decisions based on a clear and consistent set of criteria that combines both ethical and epistemic considerations. The framework is broadly Bayesian and addresses a fundamental question behind many concerns about clinical trials: What does it take for an interim decision (e.g., whether to stop the trial or continue) to be a good decision? Simulations illustrating the modeling of interim decisions counterfactually are provided. PMID:27353825

  14. Quantitative assessment of the stent/scaffold strut embedment analysis by optical coherence tomography.

    PubMed

    Sotomi, Yohei; Tateishi, Hiroki; Suwannasom, Pannipa; Dijkstra, Jouke; Eggermont, Jeroen; Liu, Shengnan; Tenekecioglu, Erhan; Zheng, Yaping; Abdelghani, Mohammad; Cavalcante, Rafael; de Winter, Robbert J; Wykrzykowska, Joanna J; Onuma, Yoshinobu; Serruys, Patrick W; Kimura, Takeshi

    2016-06-01

    The degree of stent/scaffold embedment could be a surrogate parameter of the vessel wall-stent/scaffold interaction and could have biological implications in the vascular response. We have developed dedicated software for the quantitative evaluation of embedment of struts by optical coherence tomography (OCT). In the present study, we described the algorithm of the embedment analysis and its reproducibility. The degree of embedment was evaluated as the ratio of the embedded part versus the whole strut height and subdivided into quartiles. The agreement and the inter- and intra-observer reproducibility were evaluated using the kappa and the intraclass correlation coefficient (ICC). A total of 4 pullbacks of OCT images in 4 randomly selected coronary lesions with 3.0 × 18 mm devices [2 lesions with Absorb BVS and 2 lesions with XIENCE (both from Abbott Vascular, Santa Clara, CA, USA)] from the Absorb Japan trial were evaluated by two investigators with QCU-CMS software version 4.69 (Leiden University Medical Center, Leiden, The Netherlands). Finally, 1481 polymeric struts in 174 cross-sections and 1415 metallic struts in 161 cross-sections were analyzed. Inter- and intra-observer reproducibility of quantitative measurements of embedment ratio and categorical assessment of embedment in Absorb BVS and XIENCE had excellent agreement, with ICC ranging from 0.958 to 0.999 and kappa ranging from 0.850 to 0.980. The newly developed embedment software showed excellent reproducibility. Computer-assisted embedment analysis could be a feasible tool to assess strut penetration into the vessel wall, which could be a surrogate of acute injury caused by implantation of devices. PMID:26898315
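The embedment ratio and its quartile subdivision, as described, can be sketched as follows (a generic illustration, not the QCU-CMS implementation; the strut heights in the example are hypothetical):

```python
def embedment_ratio(embedded_height_um, strut_height_um):
    """Embedded fraction of a strut: embedded part / whole strut height (0-1)."""
    return embedded_height_um / strut_height_um

def embedment_quartile(ratio):
    """Subdivide the embedment ratio into quartiles Q1 (0-25%) ... Q4 (75-100%)."""
    if not 0.0 <= ratio <= 1.0:
        raise ValueError("embedment ratio must lie in [0, 1]")
    return min(int(ratio * 4) + 1, 4)

# e.g. a 157 um strut embedded to a depth of 60 um falls in Q2 (25-50%)
q = embedment_quartile(embedment_ratio(60.0, 157.0))
```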

  15. MO-G-12A-01: Quantitative Imaging Metrology: What Should Be Assessed and How?

    SciTech Connect

    Giger, M; Petrick, N; Obuchowski, N; Kinahan, P

    2014-06-15

    The first two symposia in the Quantitative Imaging Track focused on (1) the introduction of quantitative imaging (QI) challenges and opportunities, and QI efforts of agencies and organizations such as the RSNA, NCI, FDA, and NIST, and (2) the techniques, applications, and challenges of QI, with specific examples from CT, PET/CT, and MR. This third symposium in the QI Track will focus on metrology and its importance in successfully advancing the QI field. While the specific focus will be on QI, many of the concepts presented are more broadly applicable to many areas of medical physics research and applications. As such, the topics discussed should be of interest to medical physicists involved in imaging as well as therapy. The first talk of the session will introduce metrology and explain why it is critically important in QI. The second talk will focus on appropriate methods for technical performance assessment. The third talk will address statistically valid methods for algorithm comparison, a common problem not only in QI but also in other areas of medical physics. The final talk in the session will address strategies for publication of results that allow statistically valid meta-analyses, which are critical for combining results of individual studies with typically small sample sizes in a manner that can best inform decisions and advance the field. Learning Objectives: (1) understand the importance of metrology in QI efforts; (2) understand appropriate methods for technical performance assessment; (3) understand methods for comparing algorithms with or without reference data (i.e., “ground truth”); and (4) understand the challenges and importance of reporting results in a manner that allows for statistically valid meta-analyses.

  16. Robin Hood caught in Wonderland: brain SPECT findings.

    PubMed

    Morland, David; Wolff, Valérie; Dietemann, Jean-Louis; Marescaux, Christian; Namer, Izzie Jacques

    2013-12-01

    We present the case of a 53-year-old woman presenting with several episodes of body image distortions, ground deformation illusions, and problems assessing distance in the orthostatic position, corresponding to the Alice in Wonderland syndrome. No symptoms were reported when sitting or lying down. She had uncontrolled hypertension, hyperglycemia, hypercholesterolemia, and a history of head trauma. She had been diagnosed with a left internal carotid artery dissection 2 years earlier. Brain SPECT with 99mTc-ECD, performed after i.v. injection of the radiotracer in supine and standing positions, showed hypoperfusion in the healthy contralateral frontoparietal operculum (Robin Hood syndrome), deteriorating when standing up. PMID:24152648

  17. A guide to SPECT equipment for brain imaging

    SciTech Connect

    Hoffer, P.B.; Zubal, G.

    1991-12-31

    Single photon emission computed tomography (SPECT) was started by Kuhl and Edwards about 30 years ago. Their original instrument consisted of four focused NaI probes mounted on a moving gantry. During the 1980s, clinical SPECT imaging was most frequently performed using single-headed Anger-type cameras which were modified for rotational as well as static imaging. Such instruments are still available and may be useful in settings where there are few patients and SPECT is used only occasionally. More frequently, however, dedicated SPECT devices are purchased which optimize equipment potential while being user-friendly. Modern SPECT instrumentation incorporates improvements in detectors, computers, mathematical formulations, electronics, and display systems. A comprehensive discussion of all aspects of SPECT is beyond the scope of this article. The authors, however, discuss general concepts of SPECT, the current state of the art in clinical SPECT instrumentation, and areas of common misunderstanding. 9 refs.

  18. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using (1) qualitative assessment (visual scoring) and (2) quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy-dispersive X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using the Friedman test followed by the Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded results similar to those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating results similar to those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results to both the SEM analysis and elemental mapping.

  19. Assessment and Mission Planning Capability For Quantitative Aerothermodynamic Flight Measurements Using Remote Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas; Splinter, Scott; Daryabeigi, Kamran; Wood, William; Schwartz, Richard; Ross, Martin

    2008-01-01

    assessment study focused on increasing the probability of returning spatially resolved scientific/engineering thermal imagery. This paper provides an overview of the assessment task and the systematic approach designed to establish confidence in the ability of existing assets to reliably acquire, track, and return global quantitative surface temperatures of the Shuttle during entry. A discussion of capability demonstration in support of a potential Shuttle boundary layer transition flight test is presented. Successful demonstration of a quantitative, spatially resolved, global temperature measurement on the proposed Shuttle boundary layer transition flight test could lead to potential future applications with hypersonic flight test programs within the USAF and DARPA, along with flight test opportunities supporting NASA's Project Constellation.

  20. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    SciTech Connect

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. 
More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  1. Quantitative Assessment of Regional Wall Motion Abnormalities Using Dual-Energy Digital Subtraction Intravenous Ventriculography

    NASA Astrophysics Data System (ADS)

    McCollough, Cynthia H.

    Healthy portions of the left ventricle (LV) can often compensate for regional dysfunction, thereby masking regional disease when global indices of LV function are employed. Thus, quantitation of regional function provides a more useful method of assessing LV function, especially in diseases that have regional effects such as coronary artery disease. This dissertation studied the ability of a phase-matched dual-energy digital subtraction angiography (DE-DSA) technique to quantitate changes in regional LV systolic volume. The potential benefits and a theoretical description of the DE imaging technique are detailed. A correlated noise reduction algorithm is also presented which raises the signal-to-noise ratio of DE images by a factor of 2-4. Ten open-chest dogs were instrumented with transmural ultrasonic crystals to assess regional LV function in terms of systolic normalized-wall-thickening rate (NWTR) and percent-systolic-thickening (PST). A pneumatic occluder was placed on the left-anterior-descending (LAD) coronary artery to temporarily reduce myocardial blood flow, thereby changing regional LV function in the LAD bed. DE-DSA intravenous left ventriculograms were obtained at control and four levels of graded myocardial ischemia, as determined by reductions in PST. Phase-matched images displaying changes in systolic contractile function were created by subtracting an end-systolic (ES) control image from ES images acquired at each level of myocardial ischemia. The resulting wall-motion difference signal (WMD), which represents a change in regional systolic volume between the control and ischemic states, was quantitated by videodensitometry and compared with changes in NWTR and PST. Regression analysis of 56 data points from 10 animals shows a linear relationship between WMD and both NWTR and PST: WMD = -2.46 NWTR + 13.9, r = 0.64, p < 0.001; WMD = -2.11 PST + 18.4, r = 0.54, p < 0.001.
Thus, changes in regional ES LV volume between rest and ischemic states, as

  2. End-expiration Respiratory Gating for a High Resolution Stationary Cardiac SPECT system

    PubMed Central

    Chan, Chung; Harris, Mark; Le, Max; Biondi, James; Grobshtein, Yariv; Liu, Yi-Hwa; Sinusas, Albert J.; Liu, Chi

    2014-01-01

    Respiratory and cardiac motions can degrade myocardial perfusion SPECT (MPS) image quality and reduce defect detection and quantitative accuracy. In this study, we developed a dual-respiratory and cardiac gating system for a high resolution fully stationary cardiac SPECT scanner in order to improve the image quality and defect detection. Respiratory motion was monitored using a compressive sensor pillow connected to a dual respiratory-cardiac gating box, which sends cardiac triggers only during end-expiration phases to the single cardiac trigger input on the SPECT scanners. The listmode data were rebinned retrospectively into end-expiration frames for respiratory motion reduction or 8 cardiac gates only during end-expiration phases to compensate for both respiratory and cardiac motions. The proposed method was first validated on a motion phantom in the presence and absence of multiple perfusion defects, and then applied on 11 patient studies with and without perfusion defects. In the normal phantom studies, the end-expiration gated SPECT (EXG-SPECT) reduced respiratory motion blur and increased myocardium to blood pool contrast by 51.2% as compared to the ungated images. The proposed method also yielded an average of 11.2% increase in myocardium to defect contrast as compared to the ungated images in the phantom studies with perfusion defects. In the patient studies, EXG-SPECT significantly improved the myocardium to blood pool contrast (p<0.005) by 24% on average as compared to the ungated images, and led to improved perfusion uniformity across segments on polar maps for normal patients. For a patient with defect, EXG-SPECT improved the defect contrast and definition. The dual respiratory-cardiac gating further reduced the blurring effect, increased the myocardium to blood pool contrast significantly by 36% (p<0.05) compared to EXG SPECT, and further improved defect characteristics and visualization of fine structures at the expense of increased noise on the

  3. End-expiration respiratory gating for a high-resolution stationary cardiac SPECT system

    NASA Astrophysics Data System (ADS)

    Chan, Chung; Harris, Mark; Le, Max; Biondi, James; Grobshtein, Yariv; Liu, Yi-Hwa; Sinusas, Albert J.; Liu, Chi

    2014-10-01

    Respiratory and cardiac motions can degrade myocardial perfusion SPECT (MPS) image quality and reduce defect detection and quantitative accuracy. In this study, we developed a dual respiratory and cardiac gating system for a high-resolution fully stationary cardiac SPECT scanner in order to improve the image quality and defect detection. Respiratory motion was monitored using a compressive sensor pillow connected to a dual respiratory-cardiac gating box, which sends cardiac triggers only during end-expiration phases to the single cardiac trigger input on the SPECT scanners. The listmode data were rebinned retrospectively into end-expiration frames for respiratory motion reduction or eight cardiac gates only during end-expiration phases to compensate for both respiratory and cardiac motions. The proposed method was first validated on a motion phantom in the presence and absence of multiple perfusion defects, and then applied on 11 patient studies with and without perfusion defects. In the normal phantom studies, the end-expiration gated SPECT (EXG-SPECT) reduced respiratory motion blur and increased myocardium to blood pool contrast by 51.2% as compared to the ungated images. The proposed method also yielded an average of 11.2% increase in myocardium to defect contrast as compared to the ungated images in the phantom studies with perfusion defects. In the patient studies, EXG-SPECT significantly improved the myocardium to blood pool contrast (p < 0.005) by 24% on average as compared to the ungated images, and led to improved perfusion uniformity across segments on polar maps for normal patients. For a patient with defect, EXG-SPECT improved the defect contrast and definition. 
The dual respiratory-cardiac gating further reduced the blurring effect, increased the myocardium to blood pool contrast significantly by 36% (p < 0.05) compared to EXG-SPECT, and further improved defect characteristics and visualization of fine structures at the expense of increased noise on

  4. End-expiration respiratory gating for a high-resolution stationary cardiac SPECT system.

    PubMed

    Chan, Chung; Harris, Mark; Le, Max; Biondi, James; Grobshtein, Yariv; Liu, Yi-Hwa; Sinusas, Albert J; Liu, Chi

    2014-10-21

    Respiratory and cardiac motions can degrade myocardial perfusion SPECT (MPS) image quality and reduce defect detection and quantitative accuracy. In this study, we developed a dual respiratory and cardiac gating system for a high-resolution fully stationary cardiac SPECT scanner in order to improve the image quality and defect detection. Respiratory motion was monitored using a compressive sensor pillow connected to a dual respiratory-cardiac gating box, which sends cardiac triggers only during end-expiration phases to the single cardiac trigger input on the SPECT scanners. The listmode data were rebinned retrospectively into end-expiration frames for respiratory motion reduction or eight cardiac gates only during end-expiration phases to compensate for both respiratory and cardiac motions. The proposed method was first validated on a motion phantom in the presence and absence of multiple perfusion defects, and then applied on 11 patient studies with and without perfusion defects. In the normal phantom studies, the end-expiration gated SPECT (EXG-SPECT) reduced respiratory motion blur and increased myocardium to blood pool contrast by 51.2% as compared to the ungated images. The proposed method also yielded an average of 11.2% increase in myocardium to defect contrast as compared to the ungated images in the phantom studies with perfusion defects. In the patient studies, EXG-SPECT significantly improved the myocardium to blood pool contrast (p < 0.005) by 24% on average as compared to the ungated images, and led to improved perfusion uniformity across segments on polar maps for normal patients. For a patient with defect, EXG-SPECT improved the defect contrast and definition. The dual respiratory-cardiac gating further reduced the blurring effect, increased the myocardium to blood pool contrast significantly by 36% (p < 0.05) compared to EXG-SPECT, and further improved defect characteristics and visualization of fine structures at the expense of increased noise
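
    The end-expiration rebinning described in these three records can be sketched in a few lines. This is a hypothetical illustration, not the scanner's actual listmode format: event and trigger timestamps are assumed to be plain arrays, and the gating window is a free parameter.

```python
import numpy as np

def rebin_end_expiration(event_times, trigger_times, gate_window):
    """Keep only listmode events falling within `gate_window` seconds
    after an end-expiration cardiac trigger (hypothetical data layout)."""
    trigger_times = np.sort(trigger_times)
    # Index of the most recent trigger preceding each event
    idx = np.searchsorted(trigger_times, event_times, side="right") - 1
    valid = idx >= 0
    elapsed = np.full(event_times.shape, np.inf)
    elapsed[valid] = event_times[valid] - trigger_times[idx[valid]]
    return event_times[elapsed < gate_window]

# Toy example: one trigger per second, keep events within 0.4 s of a trigger
events = np.array([0.1, 0.5, 1.2, 1.9, 2.3])
triggers = np.array([0.0, 1.0, 2.0])
print(rebin_end_expiration(events, triggers, 0.4))  # [0.1 1.2 2.3]
```

    Splitting the retained events further into cardiac gates would amount to binning `elapsed` into eight sub-intervals instead of applying a single threshold.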

  5. NPEC Sourcebook on Assessment: Definitions and Assessment Methods for Communication, Leadership, Information Literacy, Quantitative Reasoning, and Quantitative Skills. NPEC 2005-0832

    ERIC Educational Resources Information Center

    Jones, Elizabeth A.; RiCharde, Stephen

    2005-01-01

    Faculty, instructional staff, and assessment professionals are interested in student outcomes assessment processes and tools that can be used to improve learning experiences and academic programs. How can students' skills be assessed effectively? What assessments measure skills in communication? Leadership? Information literacy? Quantitative…

  6. Quantitative assessment of scatter correction techniques incorporated in next generation dual-source computed tomography

    NASA Astrophysics Data System (ADS)

    Mobberley, Sean David

    Accurate, cross-scanner assessment of in-vivo air density used to quantitatively assess amount and distribution of emphysema in COPD subjects has remained elusive. Hounsfield units (HU) within tracheal air can be considerably more positive than -1000 HU. With the advent of new dual-source scanners which employ dedicated scatter correction techniques, it is of interest to evaluate how the quantitative measures of lung density compare between dual-source and single-source scan modes. This study has sought to characterize in-vivo and phantom-based air metrics using dual-energy computed tomography technology where the nature of the technology has required adjustments to scatter correction. Anesthetized ovine (N=6), swine (N=13: more human-like rib cage shape), lung phantom and a thoracic phantom were studied using a dual-source MDCT scanner (Siemens Definition Flash. Multiple dual-source dual-energy (DSDE) and single-source (SS) scans taken at different energy levels and scan settings were acquired for direct quantitative comparison. Density histograms were evaluated for the lung, tracheal, water and blood segments. Image data were obtained at 80, 100, 120, and 140 kVp in the SS mode (B35f kernel) and at 80, 100, 140, and 140-Sn (tin filtered) kVp in the DSDE mode (B35f and D30f kernels), in addition to variations in dose, rotation time, and pitch. To minimize the effect of cross-scatter, the phantom scans in the DSDE mode was obtained by reducing the tube current of one of the tubes to its minimum (near zero) value. When using image data obtained in the DSDE mode, the median HU values in the tracheal regions of all animals and the phantom were consistently closer to -1000 HU regardless of reconstruction kernel (chapters 3 and 4). Similarly, HU values of water and blood were consistently closer to their nominal values of 0 HU and 55 HU respectively. 
When using image data obtained in the SS mode the air CT numbers demonstrated a consistent positive shift of up to 35 HU

  7. Comparison of SPECT/CT and MRI in Diagnosing Symptomatic Lesions in Ankle and Foot Pain Patients: Diagnostic Performance and Relation to Lesion Type

    PubMed Central

    Ha, Seunggyun; Hong, Sung Hwan; Paeng, Jin Chul; Lee, Dong Yeon; Cheon, Gi Jeong; Arya, Amitabh; Chung, June-Key; Lee, Dong Soo; Kang, Keon Wook

    2015-01-01

    Purpose: The purpose of this study was to compare the diagnostic performance of SPECT/CT and MRI in patients with ankle and foot pain, with regard to the lesion types. Materials and Methods: Fifty consecutive patients with ankle and foot pain, who underwent 99mTc-MDP SPECT/CT and MRI, were retrospectively enrolled in this study. Symptomatic lesions were determined based on clinical examination and response to treatment. On MRI and SPECT/CT, detected lesions were classified as bone, ligament/tendon, and joint lesions. Uptake on SPECT/CT was assessed using a 4-grade system. Sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of SPECT/CT and MRI were evaluated in all detected lesions and each lesion type. Diagnostic value of uptake grade was analyzed using receiver-operating characteristics (ROC) curve analysis, and diagnostic performance was compared using Chi-square or McNemar tests. Results: In overall lesions, the sensitivity, PPV and NPV of SPECT/CT for symptomatic lesions were 93%, 56%, 91%, and they were 98%, 48%, 95% for MRI. There was no significant difference between SPECT/CT and MRI. However, the specificity of SPECT/CT was significantly higher than that of MRI (48% versus 24%, P = 0.016). Uptake grade on SPECT/CT was significantly higher in symptomatic lesions (P < 0.001), and its area under curve on ROC analysis was 0.787. In the analysis of each lesion type, the specificity of SPECT/CT was poor in joint lesions compared with other lesion types and MRI (P < 0.001, respectively). MRI exhibited lower specificity than SPECT/CT in bone lesions (P = 0.004) and ligament/tendon lesions (P < 0.001). Conclusions: SPECT/CT has MRI-comparable diagnostic performance for symptomatic lesions in ankle and foot pain patients. SPECT/CT and MRI exhibit different diagnostic specificity in different lesion types. SPECT/CT may be used as a complementary imaging method to MRI for enhancing diagnostic specificity. PMID:25668182
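
    The performance figures quoted in this record follow from a standard 2x2 contingency table. A minimal sketch, with hypothetical counts chosen only to land near the reported SPECT/CT values (the study's actual table is not given in the abstract):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic performance measures."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts, not the study's data: yields roughly
# sensitivity 93%, specificity 48%, PPV 57%, NPV 90%
m = diagnostic_metrics(tp=40, fp=30, fn=3, tn=28)
print({k: round(v, 2) for k, v in m.items()})
```

    The low PPV despite high sensitivity illustrates why the abstract emphasizes specificity differences: with many asymptomatic lesions detected, false positives dominate the positive calls.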

  8. SPECT Analysis of Cardiac Perfusion Changes After Whole-Breast/Chest Wall Radiation Therapy With or Without Active Breathing Coordinator: Results of a Randomized Phase 3 Trial

    SciTech Connect

    Zellars, Richard; Bravo, Paco E.; Tryggestad, Erik; Hopfer, Kari; Myers, Lee; Tahari, Abdel; Asrari, Fariba; Ziessman, Harvey; Garrett-Mayer, Elizabeth

    2014-03-15

    Purpose: Cardiac muscle perfusion, as determined by single-photon emission computed tomography (SPECT), decreases after breast and/or chest wall (BCW) irradiation. The active breathing coordinator (ABC) enables radiation delivery when the BCW is farther from the heart, thereby decreasing cardiac exposure. We hypothesized that ABC would prevent radiation-induced cardiac toxicity and conducted a randomized controlled trial evaluating myocardial perfusion changes after radiation for left-sided breast cancer with or without ABC. Methods and Materials: Stages I to III left breast cancer patients requiring adjuvant radiation therapy (XRT) were randomized to ABC or No-ABC. Myocardial perfusion was evaluated by SPECT scans (before and 6 months after BCW radiation) using 2 methods: (1) fully automated quantitative polar mapping; and (2) semiquantitative visual assessment. The left ventricle was divided into 20 segments for the polar map and 17 segments for the visual method. Segments were grouped by anatomical rings (apical, mid, basal) or by coronary artery distribution. For the visual method, 2 nuclear medicine physicians, blinded to treatment groups, scored each segment's perfusion. Scores were analyzed with nonparametric tests and linear regression. Results: Between 2006 and 2010, 57 patients were enrolled and 43 were available for analysis. The cohorts were well matched. The apical and left anterior descending coronary artery segments had significant decreases in perfusion on SPECT scans in both ABC and No-ABC cohorts. In unadjusted and adjusted analyses, controlling for pretreatment perfusion score, age, and chemotherapy, ABC was not significantly associated with prevention of perfusion deficits. Conclusions: In this randomized controlled trial, ABC does not appear to prevent radiation-induced cardiac perfusion deficits.

  9. Non-destructive assessment of human ribs mechanical properties using quantitative ultrasound.

    PubMed

    Mitton, David; Minonzio, Jean-Gabriel; Talmant, Maryline; Ellouz, Rafaa; Rongieras, Frédéric; Laugier, Pascal; Bruyère-Garnier, Karine

    2014-04-11

    Advanced finite element models of the thorax have been developed to study, for example, the effects of car crashes. While there is a need for material properties to parameterize such models, specific properties are largely missing. Non-destructive techniques applicable in vivo would, therefore, be of interest to support further development of thorax models. The only non-destructive technique available today to derive rib bone properties would be based on quantitative computed tomography that measures bone mineral density. However, this approach is limited by the radiation dose. Bidirectional ultrasound axial transmission was developed on long bones ex vivo and used to assess in vivo health status of the radius. However, it is currently unknown if the ribs are good candidates for such a measurement. Therefore, the goal of this study is to evaluate the relationship between ex vivo ultrasonic measurements (axial transmission) and the mechanical properties of human ribs to determine if the mechanical properties of the ribs can be quantified non-destructively. The results show statistically significant relationships between the ultrasonic measurements and mechanical properties of the ribs. These results are promising with respect to a non-destructive and non-ionizing assessment of rib mechanical properties. This ex vivo study is a first step toward in vivo studies to derive subject-specific rib properties.

  10. Assessment and application of quantitative schlieren methods: Calibrated color schlieren and background oriented schlieren

    NASA Astrophysics Data System (ADS)

    Elsinga, G. E.; van Oudheusden, B. W.; Scarano, F.; Watt, D. W.

    Two quantitative schlieren methods are assessed and compared: calibrated color schlieren (CCS) and background oriented schlieren (BOS). Both methods are capable of measuring the light deflection angle in two spatial directions, and hence the projected density gradient vector field. Spatial integration using the conjugate gradient method returns the projected density field. To assess the performance of CCS and BOS, density measurements of a two-dimensional benchmark flow (a Prandtl-Meyer expansion fan) are compared with the theoretical density field and with the density inferred from PIV velocity measurements. The methods' performance is also evaluated a priori using a ray-tracing simulation of the experiment. The density measurements show good agreement with theory. Moreover, CCS and BOS return comparable results with respect to each other and with respect to the PIV measurements. BOS proves to be very sensitive to displacements of the wind tunnel during the experiment and requires a correction for it, making it necessary to apply extra boundary conditions in the integration procedure. Furthermore, spatial resolution can be a limiting factor for accurate measurements using BOS. CCS suffers from relatively high noise in the density gradient measurement due to camera noise and has a smaller dynamic range when compared to BOS. Finally, the application of the two schlieren methods to a separated wake flow is demonstrated. Flow features such as shear layers and expansion and recompression waves are measured with both methods.
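
    The spatial integration step mentioned above (recovering the projected density field from a measured gradient field) can be sketched as a linear least-squares problem solved with conjugate gradients. This is a generic illustration, not the authors' implementation; the grid spacing, the staggered-point averaging, and the pinning of the free additive constant are my assumptions:

```python
import numpy as np
from scipy.sparse import diags, identity, kron
from scipy.sparse.linalg import cg

def integrate_gradient(gx, gy, h=1.0):
    """Least-squares reconstruction of a scalar field from its measured
    gradient components, via conjugate gradients on the normal equations."""
    ny, nx = gx.shape
    def diff(n):  # forward-difference operator, shape (n-1, n)
        return diags([-np.ones(n - 1), np.ones(n - 1)], [0, 1],
                     shape=(n - 1, n)) / h
    Dx = kron(identity(ny), diff(nx))   # d/dx on the flattened field
    Dy = kron(diff(ny), identity(nx))   # d/dy on the flattened field
    # Forward differences live on staggered points: average the data there
    bx = 0.5 * (gx[:, :-1] + gx[:, 1:]).ravel()
    by = 0.5 * (gy[:-1, :] + gy[1:, :]).ravel()
    A = (Dx.T @ Dx + Dy.T @ Dy).tocsr()
    b = Dx.T @ bx + Dy.T @ by
    # Gradient data fix the field only up to a constant; pin one node
    A[0, 0] += 1.0
    rho, _ = cg(A, b, maxiter=5000)
    return rho.reshape(ny, nx)

# Example: constant gradient field, rho = x + 2y up to an additive constant
gx = np.ones((5, 6))
gy = 2.0 * np.ones((5, 6))
rho = integrate_gradient(gx, gy)
print(rho.round(3)[0])   # first row increases by ~1 per pixel
```

    In a real BOS/CCS setup the same normal equations would carry the extra boundary conditions the abstract mentions, imposed as additional rows or fixed nodes.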

  11. Disability adjusted life year (DALY): a useful tool for quantitative assessment of environmental pollution.

    PubMed

    Gao, Tingting; Wang, Xiaochang C; Chen, Rong; Ngo, Huu Hao; Guo, Wenshan

    2015-04-01

    Disability adjusted life year (DALY) has been widely used since the 1990s for evaluating global and/or regional burden of diseases. As many environmental pollutants are hazardous to human health, DALY is also recognized as an indicator to quantify the health impact of environmental pollution related to disease burden. Based on literature reviews, this article aims to give an overview of the applicable methodologies and research directions for using DALY as a tool for quantitative assessment of environmental pollution. With an introduction of the methodological framework of DALY, the requirements on data collection and manipulation for quantifying disease burdens are summarized. Regarding environmental pollutants hazardous to human beings, health effect/risk evaluation is indispensable for transforming pollution data into disease data through exposure and dose-response analyses, which need careful selection of models and determination of parameters. Following the methodological discussions, real cases are analyzed with attention paid to chemical pollutants and pathogens usually encountered in environmental pollution. It can be seen from existing studies that DALY is advantageous over conventional environmental impact assessment for quantification and comparison of the risks resulting from environmental pollution. However, further studies are still required to standardize the methods of health effect evaluation regarding varied pollutants under varied circumstances before DALY calculation.
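
    The core DALY arithmetic referred to above is simple; the hard part, as the review notes, is the exposure and dose-response modelling that produces the inputs. A minimal sketch of the simplest form (no age weighting or discounting), with made-up numbers:

```python
def daly(deaths, life_expectancy_lost, cases, disability_weight, duration):
    """DALY = YLL + YLD in its simplest form:
    years of life lost plus years lived with disability."""
    yll = deaths * life_expectancy_lost          # mortality component
    yld = cases * disability_weight * duration   # morbidity component
    return yll + yld

# Hypothetical pollutant-related disease burden in an exposed population
print(daly(deaths=10, life_expectancy_lost=30, cases=2000,
           disability_weight=0.2, duration=0.5))  # 500.0
```

    Age weighting and time discounting, where used, replace the simple products with integrals over the affected years; disability weights come from published burden-of-disease tables.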

  12. Quantitative microbial risk assessment of human illness from exposure to marine beach sand.

    PubMed

    Shibata, Tomoyuki; Solo-Gabriele, Helena M

    2012-03-01

    Currently no U.S. federal guideline is available for assessing risk of illness from sand at recreational sites. The objectives of this study were to compute a reference level guideline for pathogens in beach sand and to compare these reference levels with measurements from a beach impacted by nonpoint sources of contamination. Reference levels were computed using quantitative microbial risk assessment (QMRA) coupled with Monte Carlo simulations. In order to reach an equivalent level of risk of illness as set by the U.S. EPA for marine water exposure (1.9 × 10^-2), levels would need to be at least about 10 oocysts/g (about 1 oocyst/g for a pica child) for Cryptosporidium, about 5 MPN/g (about 1 MPN/g for pica) for enterovirus, and less than 10^6 CFU/g for S. aureus. Pathogen levels measured in sand at a nonpoint source recreational beach were lower than the reference levels. More research is needed in evaluating risk from yeast and helminth exposures as well as in identifying acceptable levels of risk for skin infections associated with sand exposures.
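
    The QMRA-plus-Monte-Carlo calculation described above can be sketched as follows. All parameter values here are illustrative assumptions (an exponential dose-response form and a lognormal sand-ingestion distribution), not the values used in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters: exponential dose-response P(inf) = 1 - exp(-r * dose),
# with uncertain sand ingestion per beach visit (all values assumed)
r = 0.004                       # pathogen-specific infectivity
n = 100_000                     # Monte Carlo iterations
ingestion_g = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n)  # grams of sand
conc = 10.0                     # oocysts per gram (candidate reference level)

risk = 1.0 - np.exp(-r * conc * ingestion_g)
print(f"median risk {np.median(risk):.2e}, 95th pct {np.percentile(risk, 95):.2e}")
```

    Computing a reference level then amounts to inverting this: searching for the concentration whose simulated risk distribution stays at or below the target (here, the EPA's marine-water benchmark).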

  13. Quantitative assessment of the probability of bluetongue virus overwintering by horizontal transmission: application to Germany

    PubMed Central

    2011-01-01

    Even though bluetongue virus (BTV) transmission is apparently interrupted during winter, bluetongue outbreaks often reappear in the next season (overwintering). Several mechanisms for BTV overwintering have been proposed, but to date, their relative importance remains unclear. In order to assess the probability of BTV overwintering by persistence in adult vectors, ruminants (through prolonged viraemia) or a combination of both, a quantitative risk assessment model was developed. Furthermore, the model allowed the role played by the residual number of vectors present during winter to be examined, and the effect of a proportion of Culicoides living inside buildings (endophilic behaviour) to be explored. The model was then applied to a real scenario: overwintering in Germany between 2006 and 2007. The results showed that the limited number of vectors active during winter seemed to allow the transmission of BTV during this period, and that while transmission was favoured by the endophilic behaviour of some Culicoides, its effect was limited. Even though transmission was possible, the likelihood of BTV overwintering by the mechanisms studied seemed too low to explain the observed re-emergence of the disease. Therefore, other overwintering mechanisms not considered in the model are likely to have played a significant role in BTV overwintering in Germany between 2006 and 2007. PMID:21314966

  14. Development of a new quantitative gas permeability method for dental implant-abutment connection tightness assessment

    PubMed Central

    2011-01-01

    Background Most dental implant systems are presently made of two pieces: the implant itself and the abutment. The connection tightness between those two pieces is a key point to prevent bacterial proliferation, tissue inflammation and bone loss. The leak has been previously estimated by microbial, color tracer and endotoxin percolation. Methods A new nitrogen flow technique was developed for implant-abutment connection leakage measurement, adapted from a recent, sensitive, reproducible and quantitative method used to assess endodontic sealing. Results The results show very significant differences between various sealing and screwing conditions. The remaining flow was lower after key screwing compared to hand screwing (p = 0.03) and remained different from the negative test (p = 0.0004). The method reproducibility was very good, with a coefficient of variation of 1.29%. Conclusions Therefore, the presented new gas flow method appears to be a simple and robust method to compare different implant systems. It allows successive measures without disconnecting the abutment from the implant and should in particular be used to assess the behavior of the connection before and after mechanical stress. PMID:21492459

  15. Skeletal status assessed by quantitative ultrasound at the hand phalanges in karate training males.

    PubMed

    Drozdzowska, Bogna; Münzer, Ulrich; Adamczyk, Piotr; Pluskiewicz, Wojciech

    2011-02-01

    The aim of the study was to assess the influence of regularly exercised karate on the skeletal status. The study comprised a group of 226 males (mean age: 25.64 ± 12.3 years, range 7-61 years), exercising for 61.9 ± 68.4 months with a mean frequency of 3.12 ± 1.4 times per week, and 502 controls matched for age and body size. The skeletal status was assessed by quantitative ultrasound, using a DBM Sonic 1200 (IGEA, Italy) sonographic device, which measures amplitude-dependent speed of sound (Ad-SoS [m/s]) at the hand phalanges. Ad-SoS, T-score and Z-score were significantly higher in the examined karatekas than in controls. Up to age 18 there was no difference between the study subjects and controls; from age 18 to 35 the difference increased, and it stabilized after age 35. Longer duration, higher frequency and earlier start of physical training positively influenced the skeletal status. In conclusion, karate is a sport with a positive influence on the skeletal status, with the most significant benefits occurring in adults. PMID:21208731

  16. Large-Scale Quantitative Assessment of Binding Preferences in Protein-Nucleic Acid Complexes.

    PubMed

    Jakubec, Dávid; Hostas, Jirí; Laskowski, Roman A; Hobza, Pavel; Vondrásek, Jirí

    2015-04-14

    The growing number of high-quality experimental (X-ray, NMR) structures of protein–DNA complexes now provides sufficient information to assess whether universal rules governing the DNA sequence recognition process apply. While previous studies have investigated the relative abundance of various modes of amino acid–base contacts (van der Waals contacts, hydrogen bonds), relatively little is known about the energetics of these noncovalent interactions. In the present study, we have performed the first large-scale quantitative assessment of binding preferences in protein–DNA complexes by calculating the interaction energies in all 80 possible amino acid–DNA base combinations. We found that several mutual amino acid–base orientations featuring bidentate hydrogen bonds capable of unambiguous one-to-one recognition correspond to unique minima in the potential energy space of the amino acid–base pairs. A clustering algorithm revealed that these contacts form a spatially well-defined group offering relatively little conformational freedom. Various molecular mechanics force field and DFT-D ab initio calculations were performed, yielding similar results. PMID:26894243

  17. Quantitative assessment of the differential impacts of arbuscular and ectomycorrhiza on soil carbon cycling.

    PubMed

    Soudzilovskaia, Nadejda A; van der Heijden, Marcel G A; Cornelissen, Johannes H C; Makarov, Mikhail I; Onipchenko, Vladimir G; Maslov, Mikhail N; Akhmetzhanova, Asem A; van Bodegom, Peter M

    2015-10-01

    A significant fraction of carbon stored in the Earth's soil moves through arbuscular mycorrhiza (AM) and ectomycorrhiza (EM). The impacts of AM and EM on the soil carbon budget are poorly understood. We propose a method to quantify the mycorrhizal contribution to carbon cycling, explicitly accounting for the abundance of plant-associated and extraradical mycorrhizal mycelium. We discuss the need to acquire additional data to use our method, and present our new global database holding information on plant species-by-site intensity of root colonization by mycorrhizas. We demonstrate that the degree of mycorrhizal fungal colonization has globally consistent patterns across plant species. This suggests that the level of plant species-specific root colonization can be used as a plant trait. To exemplify our method, we assessed the differential impacts of AM : EM ratio and EM shrub encroachment on carbon stocks in sub-arctic tundra. AM and EM affect tundra carbon stocks at different magnitudes, and via partly distinct dominant pathways: via extraradical mycelium (both EM and AM) and via mycorrhizal impacts on above- and belowground biomass carbon (mostly AM). Our method provides a powerful tool for the quantitative assessment of mycorrhizal impact on local and global carbon cycling processes, paving the way towards an improved understanding of the role of mycorrhizas in the Earth's carbon cycle.

  18. Quantitative microbial risk assessment of distributed drinking water using faecal indicator incidence and concentrations.

    PubMed

    van Lieverloo, J Hein M; Blokker, E J Mirjam; Medema, Gertjan

    2007-01-01

    Quantitative Microbial Risk Assessments (QMRA) have focused on drinking water system components upstream of distribution to customers, for nominal and event conditions. Yet some 15-33% of waterborne outbreaks are reported to be caused by contamination events in distribution systems. In the majority of these cases and probably in all non-outbreak contamination events, no pathogen concentration data was available. Faecal contamination events are usually detected or confirmed by the presence of E. coli or other faecal indicators, although the absence of this indicator is no guarantee of the absence of faecal pathogens. In this paper, the incidence and concentrations of various coliforms and sources of faecal contamination were used to estimate the possible concentrations of faecal pathogens and consequently the infection risks to consumers in event-affected areas. The results indicate that the infection risks may be very high, especially from Campylobacter and enteroviruses, but also that the uncertainties are very high. The high variability of pathogen to thermotolerant coliform ratios estimated in environmental samples severely limits the applicability of the approach described. Importantly, the highest ratios of enteroviruses to thermotolerant coliform were suggested from soil and shallow groundwaters, the most likely sources of faecal contamination that are detected in distribution systems. Epidemiological evaluations of non-outbreak faecal contamination of drinking water distribution systems and thorough tracking and characterisation of the contamination sources are necessary to assess the actual risks of these events.
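
    The ratio-based risk estimation described above can be sketched with an approximate beta-Poisson dose-response model. The parameter values below are commonly cited for Campylobacter but should be treated as assumptions here, as should the indicator count and the pathogen:indicator ratio:

```python
def beta_poisson_risk(dose, alpha=0.145, beta=7.59):
    """Approximate beta-Poisson dose-response:
    P(infection) = 1 - (1 + dose/beta)^(-alpha).
    Parameter values are illustrative assumptions."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Hypothetical contamination event: thermotolerant coliforms at 100 CFU/L,
# assumed pathogen:indicator ratio of 0.01, consumption of 1 L unboiled water
coliforms_per_l = 100.0
ratio = 0.01
dose = coliforms_per_l * ratio * 1.0
print(f"estimated infection risk per exposure: {beta_poisson_risk(dose):.3f}")
```

    The paper's central caveat shows up directly in this sketch: the result scales linearly with the assumed pathogen-to-indicator ratio, which the authors found to be highly variable across environmental sources.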

  19. Use of coefficient of variation in assessing variability of quantitative assays.

    PubMed

    Reed, George F; Lynn, Freyja; Meade, Bruce D

    2002-11-01

    We have derived the mathematical relationship between the coefficient of variation associated with repeated measurements from quantitative assays and the expected fraction of pairs of those measurements that differ by at least some given factor, i.e., the expected frequency of disparate results that are due to assay variability rather than true differences. Knowledge of this frequency helps determine what magnitudes of differences can be expected by chance alone when the particular coefficient of variation is in effect. This frequency is an operational index of variability in the sense that it indicates the probability of observing a particular disparity between two measurements under the assumption that they measure the same quantity. Thus the frequency or probability becomes the basis for assessing if an assay is sufficiently precise. This assessment also provides a standard for determining if two assay results for the same subject, separated by an intervention such as vaccination or infection, differ by more than expected from the variation of the assay, thus indicating an intervention effect. Data from an international collaborative study are used to illustrate the application of this proposed interpretation of the coefficient of variation, and they also provide support for the assumptions used in the mathematical derivation.
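    The relationship the authors derive can be checked numerically. The sketch below assumes log-normal assay noise (one common modelling choice, with the log-scale SD obtained from the CV) and compares a Monte Carlo estimate of the pair-disparity frequency against the closed form under the same assumption; it is an illustration, not the paper's exact derivation.

```python
import math
import random

def frac_pairs_differing(cv, k, n=200_000, seed=1):
    """Monte Carlo estimate of the chance that two independent measurements
    of the same quantity differ by at least a factor k, assuming log-normal
    assay noise with the given coefficient of variation."""
    sigma = math.sqrt(math.log(1.0 + cv * cv))  # log-scale SD implied by CV
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        a = rng.lognormvariate(0.0, sigma)
        b = rng.lognormvariate(0.0, sigma)
        if max(a, b) / min(a, b) >= k:
            hits += 1
    return hits / n

def frac_pairs_closed_form(cv, k):
    """Closed form under the same assumption: the log-ratio of two
    measurements is Normal(0, 2*sigma^2), so P(ratio >= k or <= 1/k)
    = 2*(1 - Phi(ln k / (sigma*sqrt(2)))) = erfc(z / sqrt(2))."""
    sigma = math.sqrt(math.log(1.0 + cv * cv))
    z = math.log(k) / (sigma * math.sqrt(2.0))
    return math.erfc(z / math.sqrt(2.0))
```

    For a CV of 30%, roughly 10% of repeated-measurement pairs differ by a factor of 2 or more by chance alone, which is the kind of operational statement the paper advocates.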

  20. Use of Coefficient of Variation in Assessing Variability of Quantitative Assays

    PubMed Central

    Reed, George F.; Lynn, Freyja; Meade, Bruce D.

    2002-01-01

    We have derived the mathematical relationship between the coefficient of variation associated with repeated measurements from quantitative assays and the expected fraction of pairs of those measurements that differ by at least some given factor, i.e., the expected frequency of disparate results that are due to assay variability rather than true differences. Knowledge of this frequency helps determine what magnitudes of differences can be expected by chance alone when the particular coefficient of variation is in effect. This frequency is an operational index of variability in the sense that it indicates the probability of observing a particular disparity between two measurements under the assumption that they measure the same quantity. Thus the frequency or probability becomes the basis for assessing if an assay is sufficiently precise. This assessment also provides a standard for determining if two assay results for the same subject, separated by an intervention such as vaccination or infection, differ by more than expected from the variation of the assay, thus indicating an intervention effect. Data from an international collaborative study are used to illustrate the application of this proposed interpretation of the coefficient of variation, and they also provide support for the assumptions used in the mathematical derivation. PMID:12414755

  1. Quantitative structure-activity relationships and ecological risk assessment: an overview of predictive aquatic toxicology research.

    PubMed

    Bradbury, S P

    1995-09-01

    In the field of aquatic toxicology, quantitative structure-activity relationships (QSARs) have developed as scientifically credible tools for predicting the toxicity of chemicals when little or no empirical data are available. A fundamental understanding of toxicological principles has been considered an important component to the acceptance and application of QSAR approaches as biologically relevant in ecological risk assessments. As a consequence, there has been an evolution of QSAR development and application from that of a chemical-class perspective to one that is more consistent with assumptions regarding modes of toxic action. In this review, techniques to assess modes of toxic action from chemical structure are discussed, with consideration that toxicodynamic knowledge bases must be clearly defined with regard to exposure regimes, biological models/endpoints and compounds that adequately span the diversity of chemicals anticipated for future applications. With such knowledge bases, classification systems, including rule-based expert systems, have been established for use in predictive aquatic toxicology applications. The establishment of QSAR techniques that are based on an understanding of toxic mechanisms is needed to provide a link to physiologically based toxicokinetic and toxicodynamic models, which can provide the means to extrapolate adverse effects across species and exposure regimes. PMID:7570660

  2. Application of quantitative uncertainty analysis for human health risk assessment at Rocky Flats

    SciTech Connect

    Duncan, F.L.W.; Gordon, J.W.; Smith, D.; Singh, S.P.

    1993-01-01

    The characterization of uncertainty is an important component of the risk assessment process. According to the U.S. Environmental Protection Agency's (EPA's) "Guidance on Risk Characterization for Risk Managers and Risk Assessors," point estimates of risk "do not fully convey the range of information considered and used in developing the assessment." Furthermore, the guidance states that Monte Carlo simulation may be used to estimate descriptive risk percentiles. To provide information about the uncertainties associated with the reasonable maximum exposure (RME) estimate and the relation of the RME to other percentiles of the risk distribution for Operable Unit 1 (OU-1) at Rocky Flats, uncertainties were identified and quantitatively evaluated. Monte Carlo simulation is a technique that can be used to produce a probability distribution of estimated risk using random values of exposure factors and toxicity values in an exposure scenario. It involves assigning a joint probability distribution to the input variables (i.e., exposure factors) of an exposure scenario. Next, a large number of independent samples from the assigned joint distribution are taken and the corresponding outputs calculated. Methods of statistical inference are then used to estimate, from the output sample, parameters of the output distribution such as percentiles and the expected value.
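    The workflow described above (assign distributions to exposure factors, sample them jointly, compute the risk output, read off percentiles) can be sketched minimally as follows. All distributions, parameter values, and the slope factor are invented placeholders for illustration, not Rocky Flats OU-1 values.

```python
import math
import random

def simulate_risk_percentiles(n=50_000, seed=7):
    """Monte Carlo sketch of a soil-ingestion cancer-risk scenario with
    illustrative (made-up) exposure-factor distributions."""
    rng = random.Random(seed)
    # exposure frequency 350 d/yr, duration 25 yr, averaging time 70 yr
    ef_ed = (350.0 * 25.0) / (365.0 * 70.0)
    slope = 1.5e-3                                     # (mg/kg-day)^-1
    risks = []
    for _ in range(n):
        conc = rng.lognormvariate(math.log(0.5), 0.8)  # soil conc., mg/kg
        intake = rng.triangular(25.0, 200.0, 100.0)    # soil ingestion, mg/day
        body_wt = rng.normalvariate(70.0, 10.0)        # body weight, kg
        dose = conc * intake * 1e-6 * ef_ed / body_wt  # mg/kg-day
        risks.append(dose * slope)
    risks.sort()
    return {p: risks[int(p / 100.0 * n) - 1] for p in (50, 90, 95, 99)}
```

    Comparing a deterministic RME point estimate against these simulated percentiles shows where on the output distribution the RME actually sits, which is the purpose the abstract describes.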

  3. Quantitative ultrasound criteria for risk stratification in clinical practice: a comparative assessment.

    PubMed

    Noale, Marianna; Maggi, Stefania; Gonnelli, Stefano; Limongi, Federica; Zanoni, Silvia; Zambon, Sabina; Rozzini, Renzo; Crepaldi, Gaetano

    2012-07-01

    This study aimed to compare two different classifications of the risk of fracture/osteoporosis (OP) based on quantitative ultrasound (QUS). Analyses were based on data from the Epidemiological Study on the Prevalence of Osteoporosis, a cross-sectional study conducted in 2000 to assess the risk of OP in a representative sample of the Italian population. Subjects were classified into 5 groups according to the cross-classification found in previous studies; logistic regression models were defined separately for women and men to study the fracture risk attributable to the groups defined by the cross-classification, adjusting for traditional risk factors. A total of 8681 subjects were considered in the analyses. The logistic regression models revealed that the two classifications are able to identify a common core of individuals at low and at high risk of fracture, and underscored the importance, in older patients, of a multidimensional assessment that evaluates clinical risk factors together with a simple, inexpensive, radiation-free device such as QUS.

  4. Swept source optical coherence tomography for quantitative and qualitative assessment of dental composite restorations

    NASA Astrophysics Data System (ADS)

    Sadr, Alireza; Shimada, Yasushi; Mayoral, Juan Ricardo; Hariri, Ilnaz; Bakhsh, Turki A.; Sumi, Yasunori; Tagami, Junji

    2011-03-01

    The aim of this work was to explore the utility of swept-source optical coherence tomography (SS-OCT) for quantitative evaluation of dental composite restorations. The system (Santec, Japan) with a center wavelength of around 1300 nm and axial resolution of 12 μm was used to record data during and after placement of light-cured composites. The Fresnel phenomenon at the interfacial defects resulted in brighter areas indicating gaps as small as a few micrometers. The gap extension at the interface was quantified and compared to the observation by confocal laser scanning microscope after trimming the specimen to the same cross-section. Also, video imaging of the composite during polymerization could provide information about real-time kinetics of contraction stress and resulting gaps, distinguishing them from those gaps resulting from poor adaptation of composite to the cavity prior to polymerization. Some samples were also subjected to a high resolution microfocus X-ray computed tomography (μCT) assessment; it was found that differentiation of smaller gaps from the radiolucent bonding layer was difficult with 3D μCT. Finally, a clinical imaging example using a newly developed dental SS-OCT system with an intra-oral scanning probe (Panasonic Healthcare, Japan) is presented. SS-OCT is a unique tool for clinical assessment and laboratory research on resin-based dental restorations. Supported by GCOE at TMDU and NCGG.

  5. Electroencephalographic Data Analysis With Visibility Graph Technique for Quantitative Assessment of Brain Dysfunction.

    PubMed

    Bhaduri, Susmita; Ghosh, Dipak

    2015-07-01

    Usual techniques for electroencephalographic (EEG) data analysis lack some of the properties essential for quantitative assessment of the progression of human brain dysfunction. EEG data are essentially nonlinear, and this nonlinear time series has been identified as multi-fractal in nature; rigorous techniques are needed for such analysis. In this article, we present the visibility graph as a recent, rigorous technique that can assess the degree of multifractality accurately and reliably. It has also been found that this technique gives reliable results with test data of comparatively short length. In this work, the visibility graph algorithm has been used to map a time series (EEG signals) to a graph in order to study the complexity and fractality of the time series. The power of the scale-freeness of the visibility graph has been used as an effective measure of fractality in the EEG signal. The scale-freeness of the visibility graph has also been observed after averaging statistically independent samples of the signal. Scale-freeness of the visibility graph has been calculated for 5 sets of EEG data patterns ranging from normal eyes-closed to epileptic. The change in the values has been analyzed further, and it has been observed that the measure decreases uniformly from normal eyes-closed to epileptic.
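    The visibility graph construction itself is compact: each sample becomes a node, and two samples are linked if the straight line between them passes above every intermediate sample. A brute-force O(n^3) sketch of this natural-visibility criterion (the degree distribution of the resulting graph is what the scale-freeness analysis operates on):

```python
def visibility_graph(series):
    """Natural visibility graph: nodes are samples (i, y_i); an edge (i, j)
    exists iff every intermediate sample lies strictly below the line
    connecting (i, y_i) and (j, y_j)."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[j]
                + (series[i] - series[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges

def degrees(edges, n):
    """Node degrees, whose distribution is examined for scale-freeness."""
    deg = [0] * n
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return deg
```

    Adjacent samples are always mutually visible, so the graph is connected; peaks acquire high degree, which is how fractal structure in the signal shows up in the degree distribution.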

  6. Multi-observation PET image analysis for patient follow-up quantitation and therapy assessment

    PubMed Central

    David, Simon; Visvikis, Dimitris; Roux, Christian; Hatt, Mathieu

    2011-01-01

    In Positron Emission Tomography (PET) imaging, an early therapeutic response is usually characterized by variations of semi-quantitative parameters restricted to the maximum SUV measured in PET scans during the treatment. Such measurements do not reflect overall tumour volume and radiotracer uptake variations. The proposed approach is based on multi-observation image analysis, merging several PET acquisitions to assess tumour metabolic volume and uptake variations. The fusion algorithm is based on iterative estimation using a stochastic expectation maximization (SEM) algorithm. The proposed method was applied to simulated and clinical follow-up PET images. We compared the performance of the multi-observation fusion to threshold-based methods proposed for the assessment of the therapeutic response based on functional volumes. On the simulated datasets, the adaptive threshold applied independently to both images led to higher errors than the ASEM fusion; on the clinical datasets, it failed to provide coherent measurements for four patients out of seven due to aberrant delineations. The ASEM method demonstrated improved and more robust estimation, leading to more pertinent measurements. Future work will consist in extending the methodology and applying it to clinical multi-tracer datasets in order to evaluate its potential impact on biological tumour volume definition for radiotherapy applications. PMID:21846937
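    To make the SEM idea concrete, here is a toy stochastic EM for a two-component 1-D Gaussian mixture: the E-step computes posteriors, the stochastic S-step samples a hard label per point, and the M-step re-estimates parameters from the sampled partition. This is a stand-in illustration only; the paper's ASEM fusion operates on co-registered multi-observation PET data, not scalars.

```python
import math
import random
import statistics

def gauss_pdf(x, m, s):
    """Normal density, used for the E-step posteriors."""
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def sem_two_gaussians(data, iters=200, seed=3):
    """Toy stochastic EM for a 2-component Gaussian mixture."""
    rng = random.Random(seed)
    mu = [min(data), max(data)]                  # crude initialisation
    sd = [statistics.stdev(data)] * 2
    w = [0.5, 0.5]
    for _ in range(iters):
        # S-step: sample a hard label for each point from its posterior
        groups = ([], [])
        for x in data:
            p = [w[k] * gauss_pdf(x, mu[k], sd[k]) for k in (0, 1)]
            k = 0 if rng.random() < p[0] / (p[0] + p[1]) else 1
            groups[k].append(x)
        # M-step: re-estimate parameters from the sampled partition
        for k in (0, 1):
            if len(groups[k]) > 1:
                mu[k] = statistics.fmean(groups[k])
                sd[k] = max(statistics.stdev(groups[k]), 1e-3)
                w[k] = len(groups[k]) / len(data)
    return mu, sd, w
```

    Unlike plain EM, the sampled assignments inject randomness that helps the iteration escape poor local optima, which is one motivation for using SEM in fusion settings.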

  7. Estimation of undiscovered deposits in quantitative mineral resource assessments-examples from Venezuela and Puerto Rico

    USGS Publications Warehouse

    Cox, D.P.

    1993-01-01

    Quantitative mineral resource assessments used by the United States Geological Survey are based on deposit models. These assessments consist of three parts: (1) selecting appropriate deposit models and delineating on maps areas permissive for each type of deposit; (2) constructing a grade-tonnage model for each deposit model; and (3) estimating the number of undiscovered deposits of each type. In this article, I focus on the estimation of undiscovered deposits using two methods: the deposit density method and the target counting method. In the deposit density method, estimates are made by analogy with well-explored areas that are geologically similar to the study area and that contain a known density of deposits per unit area. The deposit density method is useful for regions where there is little or no data. This method was used to estimate undiscovered low-sulfide gold-quartz vein deposits in Venezuela. Estimates can also be made by counting targets such as mineral occurrences, geophysical or geochemical anomalies, or exploration "plays" and by assigning to each target a probability that it represents an undiscovered deposit that is a member of the grade-tonnage distribution. This method is useful in areas where detailed geological, geophysical, geochemical, and mineral occurrence data exist. Using this method, porphyry copper-gold deposits were estimated in Puerto Rico. © 1993 Oxford University Press.
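    The deposit density method reduces to simple scaling: take the deposits-per-unit-area of a well-explored geologic analog, apply it to the permissive tract, and subtract deposits already discovered. A sketch with made-up numbers:

```python
def expected_undiscovered(analog_deposits, analog_area_km2,
                          study_area_km2, known_in_study=0):
    """Deposit-density estimate of undiscovered deposits: scale the analog
    region's density to the study tract, net of known discoveries."""
    density = analog_deposits / analog_area_km2   # deposits per km^2
    return max(density * study_area_km2 - known_in_study, 0.0)

# Hypothetical example: 20 deposits known in a 10,000 km^2 analog terrane,
# applied to a 2,500 km^2 permissive tract with 1 deposit already found.
estimate = expected_undiscovered(20, 10_000.0, 2_500.0, known_in_study=1)
```

    The target counting method would instead sum per-target probabilities; both yield an expected number that feeds the grade-tonnage simulation.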

  8. Skeletal status assessed by quantitative ultrasound at the hand phalanges in karate training males.

    PubMed

    Drozdzowska, Bogna; Münzer, Ulrich; Adamczyk, Piotr; Pluskiewicz, Wojciech

    2011-02-01

    The aim of the study was to assess the influence of regular karate training on skeletal status. The study comprised a group of 226 males (mean age: 25.64 ± 12.3 years, range 7-61 years), exercising for 61.9 ± 68.4 months with a mean frequency of 3.12 ± 1.4 times per week, and 502 controls matched for age and body size. Skeletal status was assessed by quantitative ultrasound, using a DBM Sonic 1200 (IGEA, Italy) sonographic device, which measures amplitude-dependent speed of sound (Ad-SoS [m/s]) at the hand phalanges. Ad-SoS, T-score and Z-score were significantly higher in the examined karatekas than in controls. Up to age 18 there was no difference between the study subjects and controls; afterwards, up to age 35, the difference increased, stabilizing after age 35. Longer duration, higher frequency and an earlier start of physical training positively influenced skeletal status. In conclusion, karate is a sport with a positive influence on skeletal status, with the most significant benefits occurring in adults.
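    The T- and Z-scores reported above are standardizations of the measured Ad-SoS against reference populations: T against the young-adult peak, Z against age-matched peers. A minimal sketch (the reference means and SDs in the example are invented, not DBM Sonic normative data):

```python
def t_score(value, young_adult_mean, young_adult_sd):
    """SD units relative to the young-adult (peak) reference population."""
    return (value - young_adult_mean) / young_adult_sd

def z_score(value, age_matched_mean, age_matched_sd):
    """SD units relative to the age-matched reference population."""
    return (value - age_matched_mean) / age_matched_sd

# Hypothetical Ad-SoS of 2120 m/s against invented reference values:
t = t_score(2120.0, 2100.0, 40.0)   # vs. young-adult mean 2100, SD 40
z = z_score(2120.0, 2080.0, 40.0)   # vs. age-matched mean 2080, SD 40
```

    A higher mean T-score in karatekas than in controls, as the study reports, is therefore a difference expressed directly in reference-population SD units.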

  9. Quantitative risk assessment for the induction of allergic contact dermatitis: uncertainty factors for mucosal exposures.

    PubMed

    Farage, Miranda A; Bjerke, Donald L; Mahony, Catherine; Blackburn, Karen L; Gerberick, G Frank

    2003-09-01

    The quantitative risk assessment (QRA) paradigm has been extended to evaluating the risk of induction of allergic contact dermatitis from consumer products. Sensitization QRA compares product-related, topical exposures to a safe benchmark, the sensitization reference dose. The latter is based on an experimentally or clinically determined 'no observable adverse effect level' (NOAEL) and further refined by incorporating 'sensitization uncertainty factors' (SUFs) that address variables not adequately reflected in the data from which the threshold NOAEL was derived. A critical area of uncertainty for the risk assessment of oral care or feminine hygiene products is the extrapolation from skin to mucosal exposures. Most sensitization data are derived from skin contact, but the permeability of vulvovaginal and oral mucosae is greater than that of keratinized skin. Consequently, the QRA for some personal products that are exposed to mucosal tissue may require the use of more conservative SUFs. This article reviews the scientific basis for SUFs applied to topical exposure to vulvovaginal and oral mucosae. We propose a 20-fold range in the default uncertainty factor used in the contact sensitization QRA when extrapolating from data derived from the skin to situations involving exposure to non-keratinized mucosal tissue.
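    The QRA arithmetic described above is a division: the sensitization reference dose is the NOAEL divided by the product of the SUFs, and a product exposure is acceptable if it does not exceed that reference dose. The NOAEL and factor values below are illustrative only; the article's specific proposal is that non-keratinized mucosal exposure may warrant up to a 20-fold additional factor.

```python
def sensitization_reference_dose(noael_ug_cm2, interindividual=10.0,
                                 matrix=3.0, use_pattern=3.0, mucosal=1.0):
    """Reference dose = NOAEL / product of sensitization uncertainty
    factors (SUFs).  Defaults are illustrative placeholder factors."""
    return noael_ug_cm2 / (interindividual * matrix * use_pattern * mucosal)

def exposure_is_acceptable(exposure_ug_cm2, noael_ug_cm2, **sufs):
    """Compare a product-related topical exposure against the benchmark."""
    return exposure_ug_cm2 <= sensitization_reference_dose(noael_ug_cm2, **sufs)
```

    With these placeholder numbers, an exposure acceptable for keratinized skin can fail the benchmark once the extra mucosal factor is applied, which is the practical consequence the article argues for.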

  10. The quantitative assessment of the pre- and postoperative craniosynostosis using the methods of image analysis.

    PubMed

    Fabijańska, Anna; Węgliński, Tomasz

    2015-12-01

    This paper considers the problem of CT-based quantitative assessment of craniosynostosis before and after surgery. First, a fast and efficient brain segmentation approach is proposed. The algorithm is robust to discontinuities in the skull and can therefore be applied in both pre- and postoperative cases. Additionally, image processing and analysis algorithms are proposed for describing the disease based on CT scans. The proposed algorithms automate the determination of the standard linear indices used for assessment of craniosynostosis (i.e., the cephalic index CI and head circumference HC) and allow planar and volumetric analyses which have not previously been reported. Results of applying the introduced methods to sample craniosynostotic cases before and after surgery are presented and discussed. The results show that the proposed brain segmentation algorithm achieves high accuracy in both pre- and postoperative craniosynostosis, while the introduced planar and volumetric indices may be helpful in distinguishing between types of the disease.
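    The cephalic index the pipeline automates is simple arithmetic once the skull's maximum breadth and length are measured from the CT data. A sketch, with conventional classification cut-offs (these bands vary by source and are not taken from the paper):

```python
def cephalic_index(width_mm, length_mm):
    """CI = maximum skull breadth / maximum skull length * 100."""
    return 100.0 * width_mm / length_mm

def classify_ci(ci):
    """Common convention; exact cut-offs differ between references."""
    if ci < 75.0:
        return "dolichocephalic"   # long, narrow head
    if ci <= 80.0:
        return "mesocephalic"      # average proportions
    return "brachycephalic"        # short, broad head
```

    The paper's planar and volumetric indices go beyond this linear measure, but CI remains the standard baseline that the automated measurements are compared against.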

  11. [Multi-component quantitative analysis combined with chromatographic fingerprint for quality assessment of Onosma hookeri].

    PubMed

    Aga, Er-bu; Nie, Li-juan; Dongzhi, Zhuo-ma; Wang, Ju-le

    2015-11-01

    A method for simultaneous determination of shikonin, acetylshikonin and β,β'-dimethylpropene shikonin in Onosma hookeri, together with a chromatographic fingerprint, was established by HPLC-DAD on an Agilent Zorbax SB column with gradient elution of acetonitrile and water at 0.8 mL x min(-1) and 30 °C. Quality assessment was conducted by comparing the content differences of the three naphthoquinone constituents among 7 batches of radix O. hookeri, in combination with chromatographic fingerprint analysis and hierarchical cluster analysis. The content of the three naphthoquinone constituents varied widely across the 7 batches. The similarity value of the fingerprints of samples 5, 6 and 7 was above 0.99, that of samples 2 and 3 above 0.97, that of samples 3 and 4 above 0.90, and that of the other samples larger than 0.8, consistent with the contents of the three naphthoquinone constituents. The 7 samples were roughly divided into 4 categories. These results indicate that the quality of this medicine is inconsistent across sources. The established HPLC fingerprints and the quantitative analysis method can be used efficiently for quality assessment of O. hookeri.
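    Fingerprint similarity values of the kind reported above are typically cosine (congruence) coefficients between chromatograms sampled on a common retention-time grid. A minimal sketch with toy intensity vectors (the actual software and vectors used by the study are not specified here):

```python
import math

def congruence(a, b):
    """Cosine similarity between two chromatographic fingerprints given as
    equal-length intensity vectors on the same retention-time grid."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy example: two near-identical batch profiles score close to 1.
similar = congruence([1.0, 5.0, 2.0], [1.1, 4.8, 2.2])
```

    Scores above 0.99, as for samples 5-7, indicate nearly proportional peak profiles; clustering these pairwise similarities yields the 4 batch categories described.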

  12. Three-Dimensional Quantitative Validation of Breast Magnetic Resonance Imaging Background Parenchymal Enhancement Assessments.

    PubMed

    Ha, Richard; Mema, Eralda; Guo, Xiaotao; Mango, Victoria; Desperito, Elise; Ha, Jason; Wynn, Ralph; Zhao, Binsheng

    2016-01-01

    The magnetic resonance imaging (MRI) background parenchymal enhancement (BPE) and its clinical significance as a biomarker of breast cancer risk has been proposed based on qualitative studies. Previous BPE quantification studies lack appropriate